Abstract
The intellectual task of text modelling is an essential part of migrating language into a digital condition. The act of modelling raises our understanding of texts to a remarkable level of self-consciousness and awareness. The challenge of rendering explicit much of what is left implicit in habitual reading practices or interpretive acts produces frameworks of understanding in the form of these models – schematic abstractions of content, relations, readings and other aspects of textuality. Intellectual insights spring from an encounter with the tasks of creating metadata and mark-up schemes, style sheets and file structures, and from creating the schema that relate access, storage, display and other features of manipulation. These insights are also useful outside the technical arena, both for the cultural import of the decisions that modelling puts into place and for the purchase they offer on the process of reading and understanding. This article discusses text modelling in a digital environment as a form of interpretative activity and uses particular case studies that suggest general implications for an understanding of textuality.
Acknowledgements
My debts to Jerome McGann are everywhere in evidence in this piece, as our conversations in the last seven years have shaped my thinking and work in digital textuality. The community in digital humanities at the University of Virginia, vibrant and intellectually charged in the first years after my arrival here in 1999, provided an unprecedented forum and opportunity to engage with these issues. The DRIS (Digital Research and Instructional Services) group in the Library at the University of Virginia continues to serve as an engaged professional community and support for this work. All of these have contributed to my understanding and use of metatexts, metadata and mark-up.
Notes
1 Hockey, op. cit., is an extremely useful, objective introduction to the field and its history. For a multi-faceted examination of digital humanities see Unsworth (2004), Sutherland (1997), McCarty (2005) and Loizeaux (2002).
2 Jerome McGann, ‘Texts in N-dimensions and Interpretation in a New Key’, expands this compact list through a broader discussion that summarises work at SpecLab, co-founded by McGann and myself at the University of Virginia in 2001. <www.humanities.mcmaster.ca/~texttech/pdf/vol12_2_02.pdf>.
3 See McGann (1983, 1993), McCaffery (1992), Drucker (1994) and essays by Nick Frankel, Randall McLeod, Peter Robinson, Manuel Portela et al. in Bray (2000).
4 For a similar problem at the level of keywording, see Julia Thomas's article, below [Editor's note].
5 Aspirations for the project exceeded the available funding and time, and the graphic language of Temporal Modelling did not evolve as fully as it might have.
6 The DTD can be downloaded, along with XML files and templates.
7 Not the Edition, which was an instantiation of the larger category of the Work; the Work included the conception of the project as a whole, within which the book came into being.
8 NINES – A Networked Infrastructure for Nineteenth-Century Electronic Scholarship. URL: http://www.nines.org/.
9 See http://www.speculativecomputing.org for SpecLab activities and documentation. All URLs accessed 27 July 2006.