Saturday, 4 February 2012
Now that data analysis is a task relegated to the everyday, the processes that serve the particular functions related to this task must be simple despite their complexity. Because of the large quantities of data generated for any given project, techniques must be developed that allow researchers to “[interpret data] holistically, and to expose meaningful patterns and structure, trends and exceptions, and more”.
The 2010 Horizon Report examines the blending of visualization, data mining, and statistics that has produced the new field of visual data analysis. Essentially, visual data analysis allows data to be rendered as charts, maps, tag clouds, or other graphical forms, combining automated pattern-matching with human interaction to make meaning from various sets of information. The report points to tools such as self-organizing maps, which “create a grid of ‘neuronal units’ such that neighboring units recognize similar data, reinforcing important patterns so that they can be seen”, to demonstrate the usefulness of such systems for the common user.
Although visual data analysis is useful for everything from flow charts to word collages, it is perhaps most valuable for its ability to seek out patterns within a given text. Another boon is the ease with which data may be drawn from various sites. Interactive visualizations of data may someday be widely available, adding to the effectiveness of electronic books and journals by allowing users to draw on and visually map the most up-to-date data available. In this way, “graphical representations, in whatever form they take, will be expected to clarify the narrative in an environment that combines increasingly sophisticated multimedia presentation with ever-increasing amounts and types of data” (http://net.educause.edu/ir/library/pdf/ELI7052.pdf).
For more information, the link to the 2010 Horizon Report is here: http://wp.nmc.org/horizon2010/chapters/visual-data-analysis/. Here also is an example of visual data analysis at work from Gapminder World:
Thursday, 2 February 2012
We see a similar emphasis on the facsimile in this incunable period of digital editing, but now the exemplar is commonly matched with a transcription, sometimes, but not always, marked up in XML using the TEI schema. This combination of image and text forms the basis of most Web-delivered digital archives. (See, for example, the Donne Variorum facsimiles or The Rossetti Archive.) Less commonly are these materials the basis for original textual scholarship and a corresponding apparatus of the type one sees in the mature print facsimile, exemplified in Jeanne Shami’s edition of a manuscript of John Donne’s 1622 Gunpowder Plot sermon (with authorial corrections). This edition presents its material in a form similar to that of the digital archive. Taking advantage of the full, two-page opening, it presents on the left a page facsimile of the exemplar and, on the right, the corresponding transcription. Interestingly, the lines in the transcription are numbered, just as they are in XML transcriptions, to enable correlation between the facsimile and the transcription. Along the foot of the page are the major variant readings of the only other witness, the printed text found in Fifty Sermons (1649). So then, while the page arrangements are similar to those of the interface to a digital archive, the content presents more than the simple primary materials. It includes, in addition to an extensive introduction to the manuscript, a textual apparatus comprising a record of variants and, at the back, a summary table of Donne’s corrections and a section of paleographic commentary titled “Transcription Details.”
|Fig. 2. An opening from Shami's edition of John Donne's 1622 Gunpowder Plot Sermon.|
Ernie Sullivan’s edition of The First and Second Dalhousie Manuscripts places the facsimile of the exemplar and the matching transcription side by side, but on the same page rather than on facing pages. Although the book is folio-sized, the page is not large enough, or not laid out well enough, to give the reader a legible facsimile image; there is also a great deal of wasted white space. In contrast to the Kent State edition of Sidney's Arcadia, and other facsimiles common in the 1960s and 1970s, this one provides much of the supporting documentation that one would expect of a newly edited text. It has a substantial introduction and a note on the transcription at the front, and, at the back, a section of explanatory notes, a section on "Manuscript and Print Locations of the Poems," a "Textual Apparatus," and an "Index" to the contents. That is, this edition, like Shami’s, is the work of original textual scholarship. The difference between these and other, similar editions is the presence of page facsimiles of the original artifact (illegible though they may be).
|Fig. 3. A page from Ernie Sullivan’s edition of The First and Second Dalhousie Manuscripts|
McGann, Jerome, ed. The Complete Writings and Pictures of Dante Gabriel Rossetti: A Hypermedia Archive. http://www.rossettiarchive.org/index.html.
McLeod, Randall. “UN Editing Shak-speare.” Sub-Stance no. 33/4 (1982): 37.
Shami, Jeanne, ed. John Donne’s 1622 Gunpowder Plot Sermon: A Parallel-Text Edition. Language & literature series volume 22. Pittsburgh, PA: Duquesne University Press, 1996.
Sidney, Philip. The Covntesse of Pembrokes Arcadia. Ed. Carl Dennis. [Kent, Ohio]: Kent State University Press, 1970.
Stringer, Gary, ed. DigitalDonne: the Online Variorum. http://digitaldonne.tamu.edu/.
Sullivan, Ernest W., II, ed. The First and Second Dalhousie Manuscripts: Poems and Prose. Facs. ed. Columbia: University of Missouri Press, 1988.
Wilson, F. P. Shakespeare and the New Bibliography. Rev. and ed. Helen Gardner. Oxford: Clarendon Press, 1970.