Reading Machines Reading Machines: 100 Trillion Poems and the Scaling of Possibility

In Reading Machines: Toward an Algorithmic Criticism, Stephen Ramsay alludes to Raymond Queneau’s Cent Mille Milliards de Poèmes (100,000,000,000,000 Poems), which consists of 10 sonnets (14 lines each), where each line is interchangeable with the same-numbered line from any of the other sonnets. Ramsay explains:

“If there are ten possibilities for the first line, and any one of those lines may be followed by one of ten additional lines, it follows that the number of possible combinations is 10 × 10 (or 10² = 100). Since that number will increase by a factor of ten with each additional line, a fourteen-line poem becomes, in its potential state, the Cartesian product of its sets: i.e., 10¹⁴ (100 trillion) possibilities. Queneau determined that a person reading the book twenty-four hours a day would need 190,258,751 years to finish it…” (26)
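
The arithmetic in Ramsay's quote is easy to check. A minimal sketch (assuming, as Queneau's figure implies, a pace of one sonnet per minute and a 365-day year):

```python
# Ramsay's combinatorics: 10 interchangeable choices for each of 14 line
# positions, i.e. the Cartesian product of the sets of lines.
total_poems = 10 ** 14  # 100 trillion sonnets

# Queneau's figure of 190,258,751 years falls out of a pace of one
# sonnet per minute, read twenty-four hours a day (365-day years).
minutes_per_year = 60 * 24 * 365
years_to_finish = total_poems // minutes_per_year
print(total_poems)      # 100000000000000
print(years_to_finish)  # 190258751
```

That the division lands exactly on Queneau's published number suggests the one-sonnet-per-minute pace is the assumption behind it.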

With its “generative qualities” (Ramsay 28), Queneau’s book seems the perfect monument for Ramsay’s proposal for a textual criticism that blurs boundaries with artistic reinvention and opens texts up to imaginative possibilities and potentialities. He sees algorithms as a way of doing this, and argues that all textual criticism, as a process of rewriting and transformation, has always already done this rhetorically and methodologically; it simply hasn’t been transparent about it, and has thereby restricted the possibilities for reading.

But for me Queneau’s book equally stands in for Ramsay’s haziness in defining “possibilities” and “potentialities” beyond idealistic hurrahs. I’m on board with Ramsay’s algorithmic criticism, but I’m not convinced by its unbridled veneration of the possible or potential.

Compare the description of Queneau’s book given in Christian Bök’s dissertation:

“a single reader, reading one [sonnet] a second, must survive for more than a thousand millennia in order to read every poem. Such a book remains inscrutable not because of its illegibility but because of its potentiality. Such a book…deals with the as if of what is possible in virtuality but impossible in actuality” (78)
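
Bök's pace of one sonnet per second can be checked the same way (again assuming a 365-day year): “more than a thousand millennia” holds, by a factor of about three.

```python
# Bök's pace: one sonnet per second, around the clock.
seconds_per_year = 60 * 60 * 24 * 365
years = 10 ** 14 / seconds_per_year  # ~3.17 million years
millennia = years / 1000
print(round(millennia))  # 3171
```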

Both Ramsay and Bök praise Queneau’s book for exceeding the reading capacities of a human life, as a measure of its awesome potential. And yet at the same time, both writers measure the human’s failure to match that potential by a standard of completeness: “read every poem” (Bök) or to “finish it” (Ramsay). This is the human impossibility by which they measure a text’s adequate degree of critical possibility. But this measure of possibility collapses in the face of Ramsay’s own proposed reading method, not to mention Bök’s comparison of Queneau’s book to a “literary computer” (78). For, while the human cannot read this book one hundred trillion times, couldn’t a computer?! “A single [human] reader…must survive for more than a thousand millennia in order to read every poem” but a computer could read the book’s one hundred trillion poems in a matter of minutes or seconds. In fact, it could do even better: it could read the book’s one hundred trillion poems one hundred trillion times. Or more.
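
The claim that a machine can traverse what a human cannot is easy to make concrete. A minimal sketch of such a machine reader: `itertools.product` enumerates every combination of same-numbered lines. The "sonnets" below are toy stand-ins (two sonnets of three lines, not Queneau's text), so the full space here is only 2³ = 8 poems.

```python
from itertools import product

def all_poems(sonnets):
    """Yield every poem built by choosing, for each line position,
    that line from any one of the source sonnets."""
    positions = zip(*sonnets)  # group same-numbered lines together
    for combo in product(*positions):
        yield combo

# Toy space: 2 "sonnets" of 3 lines each -> 2**3 = 8 poems.
toy = [["a1", "a2", "a3"], ["b1", "b2", "b3"]]
poems = list(all_poems(toy))
print(len(poems))  # 8
```

Swapping in ten 14-line sonnets makes the generator range over all 10¹⁴ combinations; whether any machine has time to exhaust even that is a separate question, but nothing in the method changes.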

This fact quintessentially illustrates reading’s scalability: there can be no distinction between “close” and “distant” in reading Queneau’s book. Even a close reading requires the methods of distant reading. For what hermeneutic claims about “meaning” can we make for a book of which we have time to read maybe only 10 out of 100,000,000,000,000 pages, and only once? We need more readings of more pages, for which we need a computer.

Of course, from a poststructuralist perspective, every book is more than the sum of its visible pages – we are always already reading only a small percentage of “possibilities.” But Queneau’s book explicitly makes this a problem – a problem of quantification. It suggests that maybe we must turn to statistics, taking a sample to represent a larger population. The poststructuralist in me says that this would make the mistake (“scientistic,” Ramsay would call it) of treating the work as a potential whole, when in fact every text is inexhaustible. But perhaps that should be revised to something like: every text is inexhaustible, on an infinite timeline. Reading is always going to be incomplete, but to read at all requires some limited sample space of assumed stability.

Therefore, as the computer’s speed demonstrates, Queneau’s book is only relatively massive. Our computer’s reading of it would be no different from reading “one” sonnet, for example Shakespeare’s Sonnet 1, one hundred trillion times, or from conducting an aggregate study of a hundred trillion academic articles on Sonnet 1 (if such a collection existed). But it would also be only marginally different from reading Sonnet 1 only once. For who is to define the upper limit on Sonnet 1s, if the limit is always going to be relative to our allotted sample space?

The New Critics would define one reading of Sonnet 1 as “complete” – a ratio of 1 reading to 1 sonnet (just as, by Bök and Ramsay’s definition of “completeness,” one sonnet of Queneau’s one hundred trillion would be a complete reading of that one sonnet). Poststructuralists would say no reading is ever complete: 1:n, where n approaches infinity (n possible versions of Sonnet 1). But what’s more difficult are the scales in between, and the serious questions Queneau’s book raises about how we read and what we read for.
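
The statistical turn suggested above can be sketched: rather than enumerating the space, draw random sonnets (one uniform line choice per position) and compute some statistic over the sample. The statistic used here, mean total line length, is a placeholder assumption, not a claim about what we should actually measure; the toy sonnets again stand in for Queneau's.

```python
import random

def sample_poem(sonnets, rng):
    """Draw one poem uniformly from the combinatorial space:
    for each line position, pick that line from a random source sonnet."""
    return [rng.choice(column) for column in zip(*sonnets)]

def sample_statistic(sonnets, n, stat, seed=0):
    """Estimate a statistic over the whole space from n random poems."""
    rng = random.Random(seed)
    return sum(stat(sample_poem(sonnets, rng)) for _ in range(n)) / n

# Toy stand-ins for Queneau's ten sonnets (3 lines here, not 14).
toy = [["a1", "a2", "a3"], ["b1", "b2", "b3"]]
mean_len = sample_statistic(toy, 1000, lambda poem: sum(len(l) for l in poem))
print(mean_len)  # 6.0 -- every toy line is 2 characters long
```

The sample size n is exactly the “limited sample space of assumed stability” the paragraph above describes: the estimate stabilizes long before the space is exhausted.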

To illustrate, consider a simple deformance of Queneau’s text: what if we read each line as interchangeable with every other line, instead of with only every other line in that position (e.g. line 1)? This would result in even more possible sonnets – Bök and Ramsay could praise generative, life-exceeding, hyper-actual, hypothetical possibilities to exponentially greater degrees! But the marginal shift in rhetoric such a deformance enacts suggests that these extra possibilities wouldn’t tell us much more about anything. And the computer could still read them.
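
Counting this deformance, under the simple modeling assumption that any of the 140 printed lines can occupy any of the 14 positions (repeats allowed), gives 140¹⁴ possible sonnets, which is 14¹⁴ times the original space:

```python
original = 10 ** 14   # one of 10 same-numbered lines per position
deformed = 140 ** 14  # any of the 140 printed lines in any position,
                      # repeats allowed (a modeling assumption)
print(deformed // original)  # 14**14, roughly 1.1e16 times larger
```

Sixteen extra orders of magnitude, and yet, as the paragraph above argues, the rhetorical gain is marginal.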

Queneau’s book is, as Ramsay suggests, generative. In this sense, the content is less important than the mechanism: 10^14 times less important, perhaps. But then the possibilities, potentialities, and virtualities with which Ramsay and Bök characterize it maybe aren’t that important either. Not as important as the particularities and actualities of one or more readings through it. Reading it poem by poem, as Ramsay and Bök grandiosely metaphorize a hypothetical human reader as doing, is just one algorithm of reading it. Isn’t this precisely the kind of limited, unconscious algorithm Ramsay wants to shift us away from?! Essentially, Queneau’s book is a database, and a database can provide different answers based on different queries. So the question is, given a computer that can read it as much as we want and in many ways, what queries might we ask of this text?
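
One way the database framing might look in practice (a sketch using Python's stdlib sqlite3 rather than the MySQL setup proposed below, with toy lines standing in for Queneau's): store each line keyed by source sonnet and position, and treat a reading as a query that assembles a poem from a choice of sonnet per position.

```python
import sqlite3

# Schema sketch: one row per printed line, keyed by source sonnet
# and line position. The lines themselves are toy stand-ins.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE line (sonnet INTEGER, position INTEGER, text TEXT)")
toy = [["a1", "a2", "a3"], ["b1", "b2", "b3"]]
for s, sonnet in enumerate(toy):
    for p, text in enumerate(sonnet):
        db.execute("INSERT INTO line VALUES (?, ?, ?)", (s, p, text))

# One possible "query as reading": assemble the poem given by a
# choice of source sonnet for each position, here (0, 1, 0).
choice = [0, 1, 0]
poem = [db.execute("SELECT text FROM line WHERE sonnet=? AND position=?",
                   (s, p)).fetchone()[0]
        for p, s in enumerate(choice)]
print(poem)  # ['a1', 'b2', 'a3']
```

Other queries against the same table – all lines sharing a rhyme word, all poems containing a given line, frequency counts across a sample – would each constitute a different algorithm of reading.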

I’m interested in feedback as I’m thinking about elaborating this post into my research paper, which would involve placing Queneau’s book into a MySQL database in order to experiment with this kind of “brute force” reading, as I’ve called it earlier, and to inquire further into how it challenges our assumptions, methods, and goals as readers.

Works Cited

Bök, Christian. ’Pataphysics: The Poetics of an Imaginary Science. Evanston, IL: Northwestern University Press, 2002. Print.

Ramsay, Stephen. Reading Machines: Toward an Algorithmic Criticism. Chicago: University of Illinois Press, 2011. Print.
