
Here’s a new one for you – what if we were to argue that literary scholarship and the general study of literature no longer require you to read any books at all? Instead, the same results could be achieved by using computers to crunch “big data” – vast stores of literary information – to provide new insights into the way we think about books, literature, and stories.
This obviously flies in the face of the standard understanding of literary study that for centuries has insisted upon the close reading of texts. Yet it is not a unique argument.
We’ve previously considered whether, with the rise of apps and digital programming influencing the way we publish stories, the future of literature may be electric. A growing number of groups and individuals now believe a similar approach could be taken towards academic literary theory. They term this “computational criticism” – that is, the statistical analysis of literature using computational models and digital programming.
Why now? Simply, because modern digital technology permits it. Since Google developed an electronic scanner capable of digitising books in 2004, the written word can be turned into data on a huge scale – and computers can scan and process this information to pick out trends and identify new areas of insight. They can create graphs, tables, and visual representations of this data that are – arguably – more engaging to consider than a 100,000-word treatise on the relationship between Kafka’s shoes and modern anti-establishment sentiment (please note: this may not in fact be an actual PhD thesis title – but there are some great ones out there, see for yourselves).
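For a concrete sense of what “turning words into data” can mean, here is a minimal sketch in Python. It is only an illustration of the general idea, not any particular project’s method – the file name and chapter marker are hypothetical placeholders – and it simply counts how often a chosen word appears in each chapter of a digitised novel, then prints a crude trend:

```python
# A minimal sketch of "words as data": count occurrences of a term per
# chapter in a digitised novel. The file name and chapter marker below
# are hypothetical placeholders, not part of any specific project.
import re
from collections import Counter

def term_frequency_by_chapter(path, term="london", marker="chapter"):
    """Count occurrences of `term` in each chapter of a plain-text novel."""
    with open(path, encoding="utf-8") as f:
        text = f.read().lower()
    # Split on the chapter heading word; drop any front matter before chapter 1.
    chapters = re.split(rf"\b{marker}\b", text)[1:]
    counts = []
    for number, chapter in enumerate(chapters, start=1):
        words = re.findall(r"[a-z']+", chapter)
        counts.append((number, Counter(words)[term]))
    return counts

# Print a crude text "graph" of the trend across chapters.
for chapter, n in term_frequency_by_chapter("bleak_house.txt"):
    print(f"Chapter {chapter:3d} | " + "#" * n)
```

Scale that up from one novel to millions, and you have the raw material the computational critics are talking about.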
Of course, the idea of visually representing literature as data is not new. One of the great masters of the written word, Kurt Vonnegut, proposed mapping the plots of stories, as well as character development arcs, onto graphs. In 1952, the satirist’s novel Player Piano predicted a dystopia in which giant computers take over the work of the human brain – and in his later lectures on the shapes of stories he opined, “there’s no reason why the shapes of stories can’t be fed into computers.”
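In that spirit, a toy version of “feeding the shape of a story into a computer” might look something like the sketch below, which tracks a running good-fortune score through a text. The word lists and scoring here are our own illustrative assumptions – not Vonnegut’s method or any published sentiment lexicon:

```python
# A toy "story shape": a running good-fortune score built from two tiny,
# hand-made word lists. The lists and scoring are illustrative assumptions only.
POSITIVE = {"love", "joy", "fortune", "hope", "triumph", "happy"}
NEGATIVE = {"death", "grief", "loss", "despair", "ruin", "sorrow"}

def story_shape(text, window=500):
    """Return one good-fortune score per block of `window` words."""
    words = [w.strip(".,;:!?\"'()").lower() for w in text.split()]
    shape = []
    for start in range(0, len(words), window):
        block = words[start:start + window]
        score = sum(w in POSITIVE for w in block) - sum(w in NEGATIVE for w in block)
        shape.append(score)
    return shape
```

Plot those scores against narrative time and you get something very like the rising-and-falling curves Vonnegut used to chalk on a blackboard.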
Needless to say, this topic has drawn some controversy among the literary establishment. Harold Bloom, one of the best-known literary critics and Sterling Professor of the Humanities at Yale University, has described the idea of digital literary theory as “absurd […] I am interested in reading; that’s all I’m interested in.”
Others are, however, more receptive to these ideas. Jonathan Franzen, for example, says: “The canon is necessarily restrictive. So what you get is generation after generation of scholarship struggling to say anything new. There are only so many ways you can keep saying Proust is great.”
“It can be dismaying to see Kafka or Conrad or Brontë read not for pleasure but as cultural artefacts,” Franzen continues. “To use new technology to look at literature as a whole, which has never really been done before, rather than focusing on complex and singular works, is a good direction for cultural criticism to move in. Paradoxically, it may even liberate the canonical works to be read more in the spirit in which they were written.”
We’ll let you decide for yourselves what you think of this new world of literary study. Below, you’ll find five of our “picks” of digital projects in the humanities. Let us know what you think in the comments section at the end!
- Mapping Emotions in Victorian London draws on readers’ annotations of passages from Victorian novels to generate an “emotional map” of London. You can navigate the map online, exploring the emotions of the readers, as well as the underlying fictional passages, to discover how London was constructed, navigated and represented emotionally in its fiction.
- BookLamp recognises how similar one book is to others in the same genre. Simply type one of your favourite novels into BookLamp’s search bar and it will return a data-driven list of 20 more titles that you’ll like (a rough sketch of this kind of text-similarity measure follows after this list).
- VisualEyes, developed by the University of Virginia, is a web-based tool that uses data to digitally map, graph and chart important historical events, searching through vast online databases to pinpoint where concepts first appeared and how they spread across the world.
- ‘A View of the World Through Wikipedia’ is a time-lapse video made by Kalev Leetaru, a researcher at the University of Illinois, charting how writers have expressed generally positive or negative sentiments towards the places they have written about. Leetaru has carried out similar analyses of books, social media and online news in a project entitled Culturomics 2.0.
- The Circumstance art collective in Bristol offers an interactive model of its own: a combination of a print book and an urban-walking app that overlays an imaginary world onto the physical one.
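As promised above, here is a rough sketch of the kind of text-similarity measure a recommendation service like BookLamp might build on. BookLamp’s actual algorithm is not public, so this is purely a generic illustration: bag-of-words vectors compared by cosine similarity, with a hypothetical in-memory “library” standing in for a real catalogue of digitised books.

```python
# A generic text-similarity sketch: cosine similarity between bag-of-words
# vectors. BookLamp's actual method is not public; the "library" argument
# here is a hypothetical stand-in for a real catalogue of digitised books.
import math
from collections import Counter

def cosine_similarity(a, b):
    """Cosine similarity between two word-count dictionaries."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def most_similar(favourite_text, library, top_n=20):
    """Rank the titles in `library` (title -> full text) by similarity to a favourite book."""
    query = Counter(favourite_text.lower().split())
    scores = {title: cosine_similarity(query, Counter(text.lower().split()))
              for title, text in library.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

Real systems would of course use far richer features than raw word counts, but the basic move – representing books as numbers and comparing the numbers – is the same.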