
Domenico Fiormonte – Jonathan Usher (eds.), New Media and the Humanities: Research and Applications


Author: Domenico Fiormonte – Jonathan Usher (eds.)

Title: New Media and the Humanities: Research and Applications. Proceedings of the First Seminar “Computers, Literature and Philology”, Edinburgh, 7-9 September 1998.

Publisher: Oxford University Computing Unit, University of Oxford

Place of publication: Oxford

Year: 2001

Pages: XI + 176

Price: £18.99

ISBN: 0-9523301-6-4

Available from: http://www.amazon.co.uk/, http://www.hcu.ox.ac.uk/publications/clip.html

Most students of Italian literature probably judge books that feature the word “computer” in the title not worth exploring. What could be further from the emotional and imaginative wellsprings of a novel or a poem, after all, than the sterile logic of disk drives and bits and bytes? But philology — that humanistic discipline which provides a foundation for higher-level literary criticism — employs its own rigorous methods of comparative analysis that are indeed amenable to, and in some cases can be greatly aided by, computational processing, in other words, computers. The term “humanities computing” has been with us for a few decades already and is not unfamiliar to many ears, though its scope and significance are still not widely appreciated. To literary-minded scholars, New Media and the Humanities: Research and Applications, the newly published proceedings of the first international “Computers, Literature and Philology” (CLiP) seminar, presents an invitation to become acquainted with some of the philological tools and theoretical approaches that practitioners of humanities computing are currently developing, many with direct relevance to Italian Studies.

Italy was in fact the scene of one of the earliest humanities computing projects, the Index Thomisticus, a concordance with critical edition of the works of Thomas Aquinas that resulted from research begun in the late 1940s by the Jesuit Roberto Busa. Yet, as Dante Marianacci, Director of the Italian Cultural Institute in Edinburgh, points out in his foreword to New Media and the Humanities, the CLiP seminar held at Edinburgh University on 7-9 September 1998 “connected for the first time Italian Studies to mainstream research in Humanities Computing, establishing robust relationships between Italian research centres and pioneering institutions in USA and Europe” (vi). Since this initial meeting, CLiP has convened each year at a different European university, attracting a stable group of humanities computing professionals to participate in informal presentations and discussion of their methodologies and projects. New Media and the Humanities thus preserves the initial parleys of an ongoing, evolving conversation.

Unfortunately, however, the volume includes only the series of edited papers delivered at the Edinburgh seminar without any record of the undoubtedly lively and pointed exchanges among participants. Nevertheless, the collection of fourteen essays gives one a sense of the different directions from which the contributors approached the major themes of textual encoding, editing, and analysis. Editors Domenico Fiormonte and Jonathan Usher distill and recombine these themes in their introduction, but here we simply provide a précis of each of the essays, which fall roughly into three groupings, although the table of contents suggests no such organization: theoretical and practical constructs for understanding the significance of electronic philology, historical and critical perspectives on the development of computational linguistics and humanities computing, and case studies.

In the opening essay, humanities computing veteran Willard McCarty offers high-level perspectives on the discipline. “Humanities computing,” McCarty claims, “is centred on the mediation of thought by the machine and the implications and consequences of this mediation for scholarship” (3). Alluding to Busa, he argues that the computer must not be regarded as “just a tool,” but rather as “an agent of perception and instrument of thought,” a kind of “mental prosthesis” (3). As such, it shapes what we see and how we see it. When texts are encoded according to a rigorous scheme and then analyzed computationally, the result is a kind of translation of original ambiguities into a brutally simple logic that can at once reveal significant underlying patterns that might otherwise escape detection while throwing into relief the very richness of what cannot be adequately marked up. Francisco Marcos Marín agrees with McCarty that electronic philology is not just a tool limited merely to speeding up certain analytical processes and compiling statistics more accurately. Yet unlike McCarty, he does not propose theoretical models to explain or justify its existence. Instead he directly describes and classifies its processes and results, arguing for its place in intellectual and institutional landscapes based on the new questions and insights it makes possible.

In perhaps the most provocative piece in the volume, Allen Renear, the lone American among the otherwise European cohort, suggests that not enough thought or theoretical attention has been given to “non-critical editing,” a term he uses to encompass all but the most formal types of textual editing, and hence also the most common. Especially with the vast and ever-increasing publication of non-critically edited texts on the Web, the cultural consequences are enormous in Renear’s view. While marking up texts for presentation on the Web may seem a simple and straightforward task, he points out that literal transcription, or the “literal representation of the linguistic text of a particular document” (29), inevitably involves a complex set of problems. In order to achieve the appropriate balance between “getting the whole text” and “nothing but the text” (28), one must first understand what a text is, to which Renear gives the answer: an “Ordered Hierarchy of Content Objects” or OHCO (27) — a textual ontology that has been used to support strategies such as the Text Encoding Initiative (TEI).

Lou Burnard approaches the problem of encoding “just the text” from a somewhat different angle than Renear. Rather than departing from ontological concerns, Burnard works from a hermeneutical stance that may be specifically characterized as semiotic. Transcription inevitably involves interpretation because of the complex nature of textuality. A text, he points out, is simultaneously an image, a linguistic construct, and an information structure. A hermeneutical point of departure thus comes round to ontological categories in making its circle, so Burnard’s theses are not incompatible with Renear’s.

Fabio Ciotti approaches the problem of transcription from yet a third angle, namely epistemology. Departing from the premise that transcription is essentially concerned with reproducing the sequence of graphic symbols, Ciotti emphasizes the role played by the transcriber’s knowledge and competence in recognizing linguistic codes. From this standpoint, the act of transcribing, or re-encoding a code, appears as a primarily epistemic endeavor that can be isolated in practice from metaphysical or ontological concerns. Textual encoding, according to Ciotti, is a theoretical language used to provide access to one set of codes through another.

Shifting to a reader-centered perspective, Claire Warwick asks whether there remains a role for the editor in the hypertext environment, where readers have more control than ever over how to read a text. She answers affirmatively, but argues that this role is necessarily changed by the medium. The main difference she sees is that the discrete tasks involved in textual editing are less likely to be performed by an isolated individual than by several working in collaboration. Complementing Warwick’s views with a final theoretical perspective, Federico Pellizzi suggests that theories of discourse elaborated by Foucault and Bakhtin relate well to theories of hypertext since the latter, like the former, place emphasis on the different positions or distances inherent in communicative interactions.

Moving away from the theoretical aspects of textuality and textual encoding, Antonio Zampolli examines in more practical terms the current situation and opportunities for cooperation between computational linguistics and humanities computing, first on a global scale and then particularly in Italy. This is a task which he is uniquely qualified to do, given his continuous involvement in the field of computational linguistics since 1967 and his directorship of the CNR Institute of Computational Linguistics at the University of Pisa. Zampolli traces the parallel but often divergent histories of the two disciplines, pointing out avenues for future cooperation, urging practitioners of computational linguistics and humanities computing to collaborate in the creation of large linguistic repositories and of tools to analyze them. Likewise critical of the isolation of computational linguistics from the concerns of the humanities, Elizabeth Burr advocates setting aside a theory-driven approach to linguistics, which she argues has suffered from a dualistic conception of language based in the analysis of binary opposites, in favor of a data-driven approach, which assumes a functional attitude toward language and focuses on the analysis of corpora of naturally-occurring linguistic productions. A functional stance furthermore introduces an historical perspective that recognizes the evolutionary development of dialects and sociolects, as well as a linguistic knowledge perspective that considers the limits of a speaker’s competence. In his essay on “Researching and teaching literature in the digital era,” Giuseppe Gigliozzi outlines various initiatives of the Centro Ricerche Informatica e Letteratura (CRILet) at the University of Rome, which he directs, before turning to some theoretical reflections on the methodologies he has employed. 
In the final section, Gigliozzi offers a series of observations on the situation of humanities computing in Italy, correlating its emergence with the need to define alternative professional profiles within glutted humanities faculties.

Readers less interested in theoretical or professional concerns may wish to turn directly to the last four essays, which offer a variety of case studies. David Robey discusses computerized techniques he has used to analyze the sound features of Dante’s Divine Comedy and investigate whether patterns of accented syllables and assonance can be correlated with literary devices operating in the poem. In the only non-English contribution to the volume, Massimo Guerrieri discusses a project to present and compare statistically the variant editions of Eugenio Montale’s Mottetti. Guerrieri’s paper will help those less familiar with SGML (Standard Generalized Markup Language) and the TEI understand how they can be used to prepare textual corpora for analysis using tools such as TACT (Text Analysis Computing Tools), developed at the University of Toronto. Staffan Björk and Lars Erik Holmquist, two younger researchers from the Interactive Institute in Gothenburg, Sweden, sketch their development of software to facilitate simultaneous browsing of textual variants. Their work on interface design and screen display issues highlights the importance of devoting attention to these practical aspects of humanities computing in tandem with refining programs to perform computational and lexical analysis. Finally, Licia Calvi describes a hypertext project built around the teaching of postmodern Italian literature at Brown University.

The regrettable absence of a centralized site for CLiP on the Web makes following the evolution of the annual seminar difficult, though remnants of earlier calls for papers do contain some useful links to further information about participants and their projects. In addition, New Media and the Humanities includes a comprehensive list of works cited. More current sources on electronic philology can be found by browsing issues of Computers and the Humanities or the archives of the Humanist electronic discussion group (both sponsored by the Association for Computers and the Humanities), as well as the journals Literary and Linguistic Computing and Computational Linguistics.

Christian Dupont
Director, Special Collections Research Center
600 E.S. Bird Library
Syracuse University, NY
(Review originally published in: Annali d’Italianistica, n. 20 (2001), pp. 581-84).

