“No Text […] is Ever One Thing”: Utilizing Digital Technology in Literary Research

Sonia Howell1

National University of Ireland

Abstract: Driven by the debate between “specialist” and “generalist” methods of analysis in the study of world literature, we embarked upon a case study consisting of a textual analysis of the dialogical relationship between patient and therapist in “factional” Irish and English novels. By appropriately encoding text passages and using suitable visualization techniques, we aspired to provide a methodology that could be utilized at both national and international levels of study. This paper will argue that both the findings generated by the software developed and those discovered through the development process serve to visibly enhance both traditional textual scholarship and work in the field of Digital Humanities.

Keywords: World literature; Digital humanities; Textual analysis; Specialist and generalist methods; Computer science; Traditional textual scholarship; National literature

Sonia Howell is a Doctoral Fellow with An Foras Feasa and the School of English, Media, and Theatre Studies at the National University of Ireland, Maynooth, Co. Kildare, Ireland. Email: sonia.howell@nuim.ie .

The INKE Research Group comprises over 35 researchers (and their research assistants and postdoctoral fellows) at more than 20 universities in Canada, England, the United States, and Ireland, and across 20 partners in the public and private sectors. INKE is a large-scale, long-term, interdisciplinary project to study the future of books and reading, supported by the Social Sciences and Humanities Research Council of Canada, as well as contributions from participating universities and partners, and bringing together activities associated with book history and textual scholarship, user experience studies, interface design, and prototyping of digital reading environments.


Introduction

This article provides an individual reflection on the interdisciplinary team research conducted by a group of two literary scholars and computer scientists for a project presented at DH2010 (Keating, Howell, & Kelleher, 2010).2 Driven by the debate between “specialist” and “generalist” methods of analysis in the study of world literature, we embarked upon a case study consisting of a textual analysis of the dialogical relationship between patient and therapist in “factional” Irish and English novels. By appropriately encoding text passages and using suitable visualization techniques, we aspired to provide a methodology that could be utilized at both national and international levels of study. Although driven by a specific research question, in embarking upon our project our concern was as much with the processes, successes, and challenges of this cross-disciplinary collaboration as with the results yielded.3 This paper will argue that both the findings generated by the software developed and those discovered through the development process serve to visibly enhance both traditional textual scholarship and work in the field of Digital Humanities.

Overview

Research has revealed that digital technology is fundamentally altering the way we relate to writing, reading, and the human record itself. Given that the concern of the humanities, and more specifically, linguistic and literary departments, is precisely the relationship between core social and cultural practices and reading and writing environments, the rise of digital technology is presenting new questions, new challenges, and new opportunities to those in the aforementioned fields. However, if humanities scholars are to fully engage with and capitalize on the changes being brought about by new digital media, interdisciplinary work between the codex-based disciplines and computer science is vital.

By bringing scholars from both the Humanities and ICT together into one domain, the rapidly evolving field of Digital Humanities, or Humanities Computing, has sought to foster and encourage such interdisciplinary work by:

using information technology to illuminate the human record, and bringing an understanding of the human record to bear on the development and use of information technology. (Schreibman, Siemens, & Unsworth, 2004, p. xxiii).

While the definition of the Digital Humanities provided by Schreibman et al. (2004) suggests a relationship of mutual dependence between the two disciplines, work in the field has revealed that this idealistic understanding of what the Digital Humanities should be has not manifested in practice. Lou Burnard, for one, laments the fact that, since its emergence in the UK in the 1970s, this new discipline “has defined itself largely by technology rather than by theory” (Burnard, 2000, para. 12). For Burnard, it has been the case that the “digital” in Digital Humanities has been the dominant force in work being carried out in the field. Consequently, in a large number of cases, the output from the research being conducted therein has proved to be of minimal benefit to the traditional scholarly practices of humanities researchers.

The perceived dominance of technology over theory in the Digital Humanities has served to deter many humanities researchers from actively engaging with new media in their traditional scholarly practices, and this lack of engagement is perhaps most readily evident among literary scholars.

Literary studies and computer-assisted textual analysis

Despite the enormous potential offered by computer technology, Thomas Rommel rightly notes that the majority of literary critics seem reluctant to embrace electronic media as a means of scholarly analysis (Rommel, 2004, p. 93). The fascinating and valuable work being done by scholars such as Willard McCarty, Jerome McGann, and Franco Moretti, utilizing computer-assisted textual analysis, still remains on the periphery of the field of traditional literary studies. Stephen Ramsay has noted that those who are in favour of computer-assisted textual analysis, such as John Burrows and Paul Fortier, are beset by an “awful anxiety that without some kind of objective, quantifiable, methodologically consistent framework … we are left with a ‘sort of anything goes’ hermeneutics” (Ramsay, 2003, p. 168). Hence their desire to move toward a more concrete form of criticism, one which relies on the quantitative facts yielded by computer technology.

However, as Ramsay has argued elsewhere, literary critical interpretation is not merely a quantitative matter but “an insistently subjective manner of engagement” (Ramsay, 2007, p. 482). That is to say, even the data yielded by computer-assisted textual analysis is only useful when interpreted by the literary scholar. Evidently, this has not proven reason enough for literary scholars to utilize digital technology in their scholarly practice, as few, if any, literary departments actively employ computer technology in either their teaching or their research.4 It would appear that the current perception of digital technology among literary scholars is either that the new medium requires a change in the nature of their scholarly activity, or that it merely replicates their traditional practices, only in digital form, with neither yielding any perceived additional benefit. It is evident, therefore, that if digital technology is to participate dynamically in literary critical activity, it must not only enable critics to do what they traditionally do but must visibly enhance their scholarly work.5

Meaningful collaboration

For the purpose of a paper presented at DH2010, a collaborative project between computer science researchers and literary scholars in An Foras Feasa6 was undertaken.7 Responding to the debate between “specialist” and “generalist” methods of analysis in the study of world literature, we conducted a case study consisting of a textual analysis of the dialogical relationship between patient and therapist in “factional” Irish and English novels. By appropriately encoding text passages and using suitable visualization techniques, we aimed to develop a methodology that could be utilized at both national and international levels of study. As our case study was embedded within the doctoral research of a literary scholar, it was driven as much by theory as by technology. Hence it demanded a relationship of mutual dependence between the textual scholars and the computer scientists.

A literary “problem”: World literature

As has been well noted, in the past century the norm in literary studies has been to study literatures along national lines (Damrosch, 2003; Dimock, 2006). In recent years, however, the field of literary studies has witnessed a renewed interest in the concept of “world literature,” first coined by Goethe in 1827, which has called into question the validity of taking an “adjective derived from a territorial jurisdiction,” the nation, and “turning it into a mode of literary causality” (Dimock, 2006, p. 3). Yet the trend of studying literatures along national lines continues today, as many scholars claim that it is paramount that we examine works within their social, cultural, and historical contexts. But the possibility of recognizing the ongoing, vital presence of the national within the life of world literature poses enormous problems for the study of this dynamic field (Damrosch, 2003, p. 514). Hence, this new field tends to be divided into “specialists,” those who are concerned with national literatures, and “generalists,” those who are interested in studying global patterns.

A new critical method

Responding to the challenges posed by this new literary concern, Franco Moretti makes the useful suggestion that we consider world literature not as an object but as a “problem.” Given the complexities of this new field, he maintains that the study of world literature cannot merely be “literature bigger,” that it has to be “different” from what literary critics have previously done. Consequently, Moretti sees this problem as one “that asks for a new critical method” (Moretti, 2004, p. 149). This new method, Moretti argues, must enable the literary scholar to “combine the individual who reads a single work with great collective efforts and vision” (Sutherland, 2006, para. 16). Hence, what Moretti calls for is a method that affords new reading and, more specifically, new critical abilities on the part of the literary scholar.

In our collaborative project, we sought to investigate whether electronic media could expand the literary scholar’s “interpretational procedures” by affording a new critical method with which to address the problem of world literature.8

Case study

Based on the doctoral research of a literary scholar, our case study was driven by a specific and quite “traditional” literary research question, which sought to investigate one of the driving concerns of world literature: that of cultural specificity. We conducted a comparative study of two different narratological treatments of trauma and cultural context by analyzing passages of dialogue between patient and therapist in Sebastian Barry’s The Secret Scripture (2008) and Pat Barker’s Regeneration (1992). In so doing, we sought to establish the degree to which the traumas are culturally specific and how this is supported or weakened by narrative technique. For the purpose of this study, we chose one novel by an Irish writer and one by an English writer. Given that this methodology is in its infancy, we deemed it best to begin with two texts written in the same language and originating from countries with similar cultural systems.

The Secret Scripture is set in present-day Ireland and consists of a double narrative: the personal recollections of Roseanne Clear, who was incarcerated in a mental institution during the mid-twentieth century, and the account by the psychiatrist Dr. Grene of his own investigation into Roseanne’s admittance to the hospital. Regeneration is based on the real-life experiences of British army officers treated for shell shock during World War I at Craiglockhart War Hospital in Edinburgh. Its narrative relays the treatment of soldiers suffering mental breakdown and is shaped predominantly around the discussions that the psychiatrist Dr. Rivers has with a number of patients in the asylum in which he works. The novel utilizes the narrative technique of free indirect discourse, shifting among various characters’ perspectives throughout. Both novels are centred on events which have caused psychological distress to the individual characters, but which have also caused what is known as “cultural trauma” to the nations in which they are set.

Our case study tests a methodology which is concerned with analyzing a common denominator (the patient-therapist relationship) between texts. This opens up the two novels under examination to a useful comparative reading as works of world literature, while also yielding useful results for the study of the texts within their respective national literatures. In so doing, the study: (i) introduces a new methodology to the academic community; and (ii) demonstrates how the results produced by that methodology can be applied to answering a specific literary question. The ultimate aim of the model was to provide a tool that supports individual and collaborative projects in the study of world literature, thus addressing the needs of both “specialists” and “generalists.” It is hoped that the software produced will assist in establishing a possible solution to one of the problems present in the study of world literature while simultaneously advancing the use of Digital Humanities in literary studies.

Building methodologies, retaining the ladders: Reflections on collaborative research and implications for literary scholarship

Although our case study set out to address a particular research question, in embarking upon the project our concern was as much with the process involved in interdisciplinary collaboration as with the results yielded. Before successful collaboration could occur between the respective disciplines, it was essential that the parties on either side of the disciplinary divide find a common language with which to communicate. In order to do so, however, it was first necessary that all the researchers involved in the project develop an understanding of what those in the opposite discipline “do.” To return to Ramsay’s observation, since literary critical interpretation is “an insistently subjective manner of engagement” (Ramsay, 2007, p. 482), relaying how exactly one goes about “doing” literary studies presented one of the greatest challenges we encountered in the interdisciplinary collaborative research project discussed here.

However, despite an initial “mutual incomprehension” (Sculley & Pasanek, 2008), the course of this project bore out the premise put forth by a number of leading critics in the field of Digital Humanities (McGann, 1996; Ramsay, 2003; McCarty, 2004): that what literary scholars do is in fact not dissimilar to the practices in which computer scientists themselves engage. It became evident that both disciplines are concerned with “marking up” texts in some form. The primary difference noted between digital and textual markup was that in digital markup the steps taken in getting from problem to product are available for critique; in literary studies, they are not. Unlike in digital markup, no other reader or scholar gets to examine what goes on between the literary scholar’s “encoding” of a novel and the journal article, the conference paper, or the dissertation that is subsequently produced.

Following Wittgenstein, Ramsay has argued that:

Throwing away the ladder … has … been the consistent method of literary criticism, which, as a rhetorical practice, is indeed often concerned with finding ways to conceal these steps by making it seem as if the author went from the open possibilities of signification in Lear to the hidden significance of the Fool in a single bound. (2003, p. 171)

That is to say that the subjective interpretations of literary scholars are often presented as final products; the steps taken in producing them are discarded. We begin at A and end up at Z: what happens in between is not of concern. Yet, as the renowned literary critic Raymond Williams observed as far back as 1973, “the relationship between the making of a work of art and its reception is always active, and subject to conventions, which in themselves are forms of (changing) social organization” (Williams, 1973, p. 47). Essentially, literary criticism is a sophisticated form of reception. As such, Williams argues that for a correct and useful approach to literary studies, we must “discover the nature of its practice and then its conditions” (p. 47) as opposed to focusing solely on the text itself.

Through collaborative research, literary scholars are afforded an opportunity to examine the nature of the practice involved in literary criticism since, as Sculley and Pasanek have usefully noted, in collaboration, “mutual ignorance becomes an opportunity for self-reflection, clarification and the speaking of what is usually unspoken” (Sculley & Pasanek, 2008). Through the cross-disciplinary dialogue required in work in Digital Humanities, “hidden and tacit assumptions” (Sculley & Pasanek, 2008) on the part of the literary scholar are brought into plain view and the ladder utilized in literary criticism is not only exposed, but becomes available for critique.

Given the import of this revelation for literary studies, it became one of the driving concerns in the development of our software. In addition to permitting the visualization of an individual scholar’s markup, the inclusion of an “insert comment” element in our markup of the passages from the two novels under examination made the reasoning behind a given scholar’s subjective interpretation of a text available for questioning. By visualizing the subjectivity of the encoder, the new text produced through an individual scholar’s reception of the original novels itself became an object available for analysis. Hence, through the use of this software, the “practice” involved in the literary criticism driving the project became available for scrutiny, something that could not be achieved by traditional literary research methods alone.
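The article does not specify the encoding schema the project used, so the following is a minimal sketch only: the element and attribute names (passage, utterance, interp, comment, encoder) are invented for illustration, and the dialogue text is a placeholder rather than a quotation from either novel. It shows the general principle described above: each interpretive tag carries the encoder’s stated reasoning, so the “ladder” of interpretation remains visible and open to critique.

```python
# Illustrative sketch only: element/attribute names are assumptions,
# not the project's actual schema, and the dialogue is placeholder text.
import xml.etree.ElementTree as ET

ENCODED_PASSAGE = """
<passage novel="The Secret Scripture" encoder="encoder-1">
  <utterance speaker="therapist">
    <interp type="cultural-context">[therapist's question]
      <comment>Reasoning: I read this question as invoking a culturally
      specific frame for the patient's trauma.</comment>
    </interp>
  </utterance>
  <utterance speaker="patient">
    <interp type="trauma">[patient's reply]
      <comment>Reasoning: the evasive reply marks the trauma
      without naming it.</comment>
    </interp>
  </utterance>
</passage>
"""

def list_interpretations(xml_string):
    """Print each interpretive tag alongside the encoder's stated reasoning,
    keeping the 'ladder' of interpretation visible for critique."""
    root = ET.fromstring(xml_string)
    encoder = root.get("encoder")
    for utterance in root.findall("utterance"):
        for interp in utterance.findall("interp"):
            comment = interp.find("comment")
            reasoning = " ".join(comment.text.split()) if comment is not None else ""
            print(f"{encoder} | {utterance.get('speaker'):<9} | "
                  f"{interp.get('type'):<16} | {reasoning}")

list_interpretations(ENCODED_PASSAGE)
```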

Benefits of software for the study of world literature

Given that a driving concern among literary scholars in embracing the study of world literature (and indeed, Digital Humanities) is the issue of subjectivity, the software we developed has significant benefits. In a world where all aspects of human life are considered to be becoming increasingly “globalized,” or perhaps more correctly “Americanized,” it is feared that world literature threatens to do away with the subjectivity of both the author and the critic and to become merely another homogenizing force, a “‘master construct’ designed to produce the Same from the Different” (Saussy, 2006, p. ix). By treating the individual subjectivity of the reader as a resource to be encoded rather than as a problem, our software ensures that the plurality of meaning is not only retained but itself becomes available for examination. Hence, in allowing the multiplicity of meanings (all of which are reflective of cultural and social contexts) produced from a single text to be captured and visualized, this software offers a new interface affording new reading and interpretative abilities, which serve to enhance the study of both texts in our case study as works of world literature.

Moreover, by enabling the visualization of markup by individual or multiple users,9 our software provides a tool that supports both individual and collaborative projects in the study of world literature, depending on the particular needs of the user and their specific Use Cases.10 Hence, it offers a critical method, which: (i) enhances traditional scholarly research practices; and (ii) assists the critic in the “unfolding of interpretive possibilities” (Ramsay, 2007, p. 484).
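As an illustration of the kind of cross-encoder comparison such a tool supports, the short sketch below assumes an invented record format, a list of (encoder, passage identifier, tag) tuples, and tallies how many encoders applied each tag to each passage; it is not the project’s actual data model or output format, but it shows how convergence and divergence between readings can be made visible at a glance.

```python
# Hedged sketch only: the (encoder, passage id, tag) tuples below are an
# invented illustration of markup events, not the project's data model.
from collections import defaultdict

markup_events = [
    ("encoder-1", "SS-passage-12", "cultural-context"),
    ("encoder-1", "SS-passage-12", "trauma"),
    ("encoder-2", "SS-passage-12", "trauma"),
    ("encoder-1", "R-passage-47", "free-indirect-discourse"),
    ("encoder-2", "R-passage-47", "trauma"),
]

def tag_agreement(events):
    """Report how many distinct encoders applied each tag to each passage,
    making agreement and divergence between readings easy to see."""
    by_passage = defaultdict(lambda: defaultdict(set))
    for encoder, passage, tag in events:
        by_passage[passage][tag].add(encoder)
    for passage, tags in sorted(by_passage.items()):
        print(passage)
        for tag, encoders in sorted(tags.items(), key=lambda kv: -len(kv[1])):
            print(f"  {tag:<24} applied by {len(encoders)} encoder(s)")

tag_agreement(markup_events)
```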

Pedagogical benefits

It is important to note that, in addition to advancing research regarding the specific area of literary studies in question, this software also has significant pedagogical benefits. In particular, it allows readers and learners to sharpen their awareness of narratological technique. By visualizing the various narrative structures within the two novels, the complexities involved therein, such as the use of third-person narration or free indirect discourse, become much more readily identifiable for the user. As a result, students are better able to identify and examine how technique and theme work together. This is particularly useful in teaching undergraduate students, for whom developing a nuanced understanding of narrative technique is often one of the greatest difficulties encountered in their initial years of literary study.

Conclusion

In 2001, Dino Buzzetti and Jerome McGann declared that “the general field of humanities education and scholarship will not take the use of digital technology seriously until one demonstrates how its tools … expand our interpretational procedures” (Buzzetti & McGann, 2001, p. 70). By engaging in a collaborative project with colleagues in the computer sciences, we found that the use of digital technology in traditional literary research did, as Buzzetti and McGann suggested it should, “expand our interpretational procedures,” by helping to address current literary problems while simultaneously raising new ones. If Lou Burnard’s lament that the field of Digital Humanities is driven by “technology rather than by theory” (Burnard, 2000) has been the case thus far, the success achieved by our interdisciplinary collaboration and its rich implications for the doctoral project upon which it was based signal the advent of a new Digital Humanities, where both terms hold equal weight, where both disciplines are mutually dependent, and where both fields, in my opinion, are visibly enhanced by collaborative research.


Notes

  1. This research was funded by the HEA under the PRTLI4 Humanities, Technology and Innovation award to An Foras Feasa (NUI Maynooth).
  2. I also wish to acknowledge the support of software engineer Damien Gallagher and the An Foras Feasa Technical Officer, Aja Teehan.
  3. The focus of this paper echoes Sculley and Pasanek’s research concerning meaning and mining in the humanities (Sculley & Pasanek, 2008, p. 409), which is also informed by the real experience of collaboration between those in the field of literary studies and those in computer science.
  4. Franco Moretti’s “Literary Lab” at Stanford is the only exception to this of which I am aware.
  5. This issue is raised repeatedly throughout the work of a number of the leading literary critics involved in digital humanities (McGann, 1996; Ramsay, 2003; Rommel, 2004).
  6. The Institute for Research in Irish Historical and Cultural Traditions.
  7. Specifically, this involved a joint project with Dr. John Keating, Associate Director of An Foras Feasa, with input by Ms. Aja Teehan (Technology Officer at AFF), software engineer Damien Gallagher, and my supervisor, Professor Margaret Kelleher.
  8. In a forthcoming paper co-authored by four members of the research team involved in the case study, we provide a fuller discussion of the implications of our case study for the literary models of “generalist” and “specialist.”
  9. For the purpose of a paper given at the IASIL Conference 2010, we asked ten literary scholars to “mark up” what they considered to be indicators of cultural context and trauma in The Secret Scripture and Regeneration by using the highlighting function in Microsoft Word. We also asked them to comment on their reasoning for doing so through Word’s “insert comment” function. The data yielded from the various responses were digitally marked up and visualized in the same way as the markup of the original researcher with whom the project began.
  10. For a useful commentary on Use Case modelling for humanities research, see Teehan & Keating (2010).

References

Barker, Pat. (1992). Regeneration. New York, NY: Dutton.

Barry, Sebastian. (2008). The secret scripture. New York, NY: Viking.

Burnard, Lou. (2000). From two cultures to digital culture: The rise of the digital demotic. URL: http://users.ox.ac.uk/~lou/wip/twocults.html [October, 2011].

Buzzetti, Dino, & McGann, Jerome. (2001). Critical editing in a digital horizon. In L. Burnard, K. O’Brien O’Keeffe, & J. Unsworth (Eds.), Electronic textual editing (pp. 51-71). New York, NY: The Modern Language Association of America.

Damrosch, David. (2003). What is world literature? Princeton, NJ: Princeton University Press.

Dimock, Wai Chee. (2006). Through other continents: American literature across deep time. Princeton, NJ: Princeton University Press.

Keating, John G., Howell, Sonia, & Kelleher, Margaret. (2010). A new digital method for a new literary problem: A digital humanities approach to the problem of world literature. URL: http://dh2010.cch.kcl.ac.uk/academic-programme/abstracts/papers/html/ab-846.html .

McCarty, Willard. (2004). Modeling: A study in words and meanings. In S. Schreibman, R. Siemens, & J. Unsworth (Eds.), A companion to digital humanities (pp. 254-270). Oxford, UK: Blackwell Publishing.

McGann, Jerome. (1996). Radiant textuality. Victorian Studies, 39(3), 379-390. URL: http://www.jstor.org/stable/3829451 [November 5, 2010].

Moretti, Franco. (2004). Conjectures on world literature. In C. Prendergast (Ed.), Debating world literature (pp. 148-162). New York, NY: Verso.

Ramsay, Stephen. (2003). Toward an algorithmic criticism. Literary and Linguistic Computing, 18, 167-174.

Ramsay, Stephen. (2007). Algorithmic criticism. In R. Siemens & S. Schreibman (Eds.), A companion to digital literary studies (pp. 477-492). Oxford, UK: Blackwell.

Rommel, Thomas. (2004). Literary studies. In S. Schreibman, R. Siemens, & J. Unsworth (Eds.), A companion to digital humanities (pp. 88-97). Oxford, UK: Blackwell Publishing.

Saussy, Haun. (Ed.). (2006). Comparative literature in an age of globalization. Baltimore, MD: Johns Hopkins University Press.

Schreibman, Susan, Siemens, Ray, & Unsworth, John. (Eds.). (2004). A companion to digital humanities. Oxford, UK: Blackwell Publishing.

Sculley, D., & Pasanek, Bradley M. (2008). Meaning and mining: The impact of implicit assumptions in data mining for the humanities. Literary and Linguistic Computing, 23(4), 409-424.

Sutherland, John. (2006, January 9). The ideas interview: Franco Moretti. The Guardian. URL: http://www.guardian.co.uk/books/2006/jan/09/highereducation.academicexperts [December, 2008].

Teehan, Aja, & Keating, John G. (2010). Appropriate use case modeling for humanities documents. Literary and Linguistic Computing, 25(4), 381-391.

Williams, Raymond. (1973). Base and superstructure in Marxist cultural theory. In R. Williams, Culture and materialism: Selected essays. London, UK: Verso.


CCSP Press
Scholarly and Research Communication
Volume 3, Issue 3, Article ID 030133, 9 pages
Journal URL: www.src-online.ca
Received August 17, 2011, Accepted November 15, 2011, Published December 21, 2012

Howell, Sonia. (2012). “No Text […] is Ever One Thing”: Utilizing Digital Technology in Literary Research. Scholarly and Research Communication, 3(3): 030133, 9 pp.

© 2012 Sonia Howell. This Open Access article is distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc-nd/2.5/ca), which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.