“An act of pure attention, if you are capable of it, will bring its own answer. And you choose that object to concentrate upon which will best focus your consciousness. Every real discovery made, every serious and significant decision ever reached, was reached and made by divination. The soul stirs, and makes an act of pure attention, and that is a discovery.” D.H. Lawrence
The notion of having God’s will, ideas, or natural law “engraved” or “inscribed” on the heart or mind has been a constant of Western thought from the Hebrew scriptures through the founders of modernity like Locke and Kant. (There may be nothing inscribed on Locke’s “blank slate,” but where did the notion of a blank slate, prepared to take inscription, come from?) The metaphor obviously depends upon writing, and presupposes a process of inculcating a sense of duties radically at odds with those of an oral culture. In an oral culture one’s primary obligation is to know all the names (of ancestors and divine beings) constitutive of the web of existence, or to know who knows them. Such ostensive knowledge has an imperative component—one tries to find out what the named beings want, and then one does it. With writing comes a transcendent voice that says the same thing to everyone and comes from everywhere or nowhere. Anyone can repeat the Word over and over again, inscribing it internally. While it exists objectively and can be checked when needed, the written word is only effective if memorized—while the prodigious feats of memory of the epic poets of oral cultures are no longer necessary, the book could not stay with one without at least some degree of memorization, of key passages, of general themes, and so on—after all, the written word was not readily available (who could afford to possess books?) and was often accessible only through public readings and sermons, and in educational settings. God’s word is then written on the heart through constant oral repetition, and is embedded in culture through its transformation of the language—in the same way in which our own contemporary English is still, unknown to most English speakers, saturated with phrases from the King James Bible and Shakespeare.
Even as books became readily accessible, this relation between the written word and its “inscription” on our minds and hearts has remained remarkably constant, I suspect. I remember, as a graduate student, even though I could have dozens or hundreds of books, privately owned or borrowed from the university library, constantly trying to inscribe on my mind passages from books I had read and, even more, cross-references from one book to another. If a critic I happened to be reading made a reference to, say, something D.H. Lawrence said about Christianity, I would try to call to mind what I had read by or about Lawrence that might frame that reference, and chastise myself for, inevitably, not having “inscribed” on my mind what I now needed; I would then inscribe what that critic said and draw upon whatever traces of my previous reading of Lawrence might enable me to locate that framing reference. Indeed, even finding a passage that I wanted to quote for a paper required some prior inscription—it was somewhere in chapter 2, or sandwiched in between two other discussions which I had inscribed in broad strokes. Ultimately, what marks one as a worthy scholar is being able, much like the first users of texts, to take a written text, available to all, as a prompt for a dialogue with others or an internal dialogue with oneself that, in the end, would produce a new text—a process that requires some ongoing retention.
Joshua Foer, in “The End of Remembering,” explores one central consequence of the displacement of print by electronic culture—the fact that one needs to memorize less and less. One striking example he gives is phone numbers, even one’s own (I don’t, in fact, remember my cell phone number). For me, and I am sure for many others, a critical rite of maturation was being able to remember my phone number—it meant I could be trusted to go out on my own. Now, a small child can have a cell phone but doesn’t need to remember his home number. I think that electronic culture is having a related effect on scholarly work and education as well—you don’t need to remember where a passage in a text is, or where to go back and find a particular comment by D.H. Lawrence on Christianity, because all you need to do is compose a search term, which I suppose still requires some memory but very little, since you can try out a whole series of search terms (D.H. Lawrence critical Christianity… D.H. Lawrence hate Christianity… D.H. Lawrence Christianity eternity…) in all of 20 seconds. There are lots of accounts of the changes in consciousness now under way as a result of the emergence of electronic or digital culture (inquiries generally modeled on the studies by Eric Havelock, Walter Ong, Jack Goody and others on the transition from oral to literate culture). I think that Foer is right that we might advance such inquiries considerably by focusing on this singular fact of the obsolescence of memory.
Well, you still need to know why you would be looking for that Lawrence passage, and what to do with it once you retrieve it—but we can imagine such search and deploy missions taking on a very different character from traditional scholarship. If I am working with a text—and, as a student, when we first learn to work with texts, I am working with it because I have been told to, or because it’s the text everyone is working with (and, in fact, for the vast majority of working scholars, this changes very little—one reads what is read)—and I encounter a name or word that I don’t know, and I feel I need to know to make sense of that sentence or paragraph, I can do a quick Google search that tells me who the name refers to or gives me a definition of the word, taking, perhaps, the first several items—the answers I get won’t provide me with the kind of context that having inscribed texts in my heart and mind would have done, but how would I know that? If I then have to write about that text, I will use the names and words I have retrieved in what would look to a traditional scholar like semi-literate ways (probably both overly literal and a bit random), but the more people do it that way, constructing hybrid figures and meanings, the more that will be the kind of work done in the academy and elsewhere. One would simply fill in the gaps in one’s knowledge as they appear, and they would be considered “filled” insofar as they enable you to get to the next gap. Better work would be distinguished from worse by the patience taken in constructing links across a range of texts and the consistency with which one has used and cross-referenced those links—and, in exceptional cases, the ingenuity with which one has provided unexpected links for others to follow up on. You really wouldn’t have to remember anything—you would only need to have acquired the habit of searching and articulating links whenever confronted with a text and a task. It is quite possible to imagine a whole new kind of intellectual work emerging out of this process, one which applies across the disciplines, including the sciences, which are probably already closest to this model—after you’ve put together all the links everyone you have “read” has read, there will be certain gaps in knowledge (possible but unmade links)—you just go ahead and fill in one of those gaps.
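To make “possible but unmade links” a bit more concrete, here is a minimal sketch, under the assumption that we treat texts as nodes and existing cross-references as edges; the reading network, titles, and function names below are invented for illustration, not drawn from any actual tool. Pairs of texts that share many neighbors but no direct link are precisely the gaps one would “go ahead and fill in.”

```python
# A minimal sketch of "thinking as linking": treat texts as nodes and
# cross-references as edges, then look for "possible but unmade links":
# pairs of texts that share many neighbors but are not yet linked directly.
# The data and function names here are illustrative, not from any real tool.

from itertools import combinations

# Hypothetical reading network: each text mapped to the texts it already links to.
references = {
    "Lawrence: Apocalypse": {"KJV Bible", "Nietzsche: The Antichrist"},
    "Critic A on Lawrence": {"Lawrence: Apocalypse", "KJV Bible"},
    "Critic B on modernism": {"Nietzsche: The Antichrist", "KJV Bible"},
}

def unmade_links(refs):
    """Rank unlinked pairs of texts by how many neighbors they already share."""
    # Build an undirected adjacency map over every text mentioned anywhere.
    nodes = set(refs) | {t for targets in refs.values() for t in targets}
    neighbors = {n: set() for n in nodes}
    for source, targets in refs.items():
        for target in targets:
            neighbors[source].add(target)
            neighbors[target].add(source)

    gaps = []
    for a, b in combinations(sorted(nodes), 2):
        if b not in neighbors[a]:                      # no direct link yet
            shared = len(neighbors[a] & neighbors[b])  # evidence a link is "possible"
            if shared:
                gaps.append((shared, a, b))
    return sorted(gaps, reverse=True)

for shared, a, b in unmade_links(references):
    print(f"{a} <-> {b}: {shared} shared neighbors, no direct link")
```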
Indeed, college instructors should be avoiding standard topics like “D.H. Lawrence and Religion” precisely because of the ease with which one can patch together a series of passages and critical comments through the internet. Instead it might be better to imagine unprecedented topics, like, for example, selecting a particular word or phrase that recurs in an author’s work or a particular text, gathering up all the instances of that word or phrase, checking the rate of its recurrence across the writer’s work and in comparison with its occurrence in other authors, and using the findings to challenge some established critical views of that writer (one could make such tasks increasingly complex, as necessary—one could form new search terms for the use of the word or phrase in specific contexts, in proximity to other words and phrases, etc.). Culture, in that case, will tend to be experienced as a distribution of probabilities with which commonplaces, and variants of those commonplaces in differing modes and degrees, appear. More conservative interventions would seek to stabilize the most relied-upon commonplaces, while radical ones would seek wider distribution of more “deviant” variants. Entertainment would continue on its current path of arranging in different but not too different ways common scenes, narratives, catchphrases, etc. We would almost literally be going with the flow—the flow of information regarding how distant from the norm our current array of ready-to-go phrases and gestures is at the moment; freedom would involve determining how distant we want it to be.
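As a rough illustration of such an unprecedented topic, one might count how often a word or phrase recurs in one author’s corpus and compare that rate against another corpus. The file names, phrases, and normalization below are assumptions for the sake of the sketch; a serious study would want lemmatization, context windows, and a properly constructed reference corpus.

```python
# A rough sketch of the kind of "unprecedented topic" described above: count how
# often a word or phrase recurs in one author's work and compare its rate against
# another corpus. The file names and phrases are hypothetical placeholders.

import re

def rate_per_10k(phrase, text):
    """Occurrences of `phrase` per 10,000 words of `text` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = len(re.findall(re.escape(phrase.lower()), text.lower()))
    return 10_000 * hits / max(len(words), 1)

# Hypothetical corpus files, assembled however one likes.
lawrence = open("lawrence_corpus.txt", encoding="utf-8").read()
comparison = open("contemporaries_corpus.txt", encoding="utf-8").read()

for phrase in ["blood", "consciousness", "the dark gods"]:
    print(phrase,
          round(rate_per_10k(phrase, lawrence), 2),
          round(rate_per_10k(phrase, comparison), 2))
```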
The features of digitality more commonly discussed, like social networking, seem to have the same effect of rendering memory obsolete. If someone puts photos of himself with his girlfriend on Facebook, he has no need to remember the experiences recorded in the photos—here, the public nature of the exposure is what makes the difference: the photos represent his relationship to her for those who have access to his page, and that is their meaning. If they break up and he changes his status, the pictures can come down and be disappeared. Maybe such an individual, today, has the same regrets, nostalgia, hopes for reconciliation, reconstructed memories and so on as a normal modern person, who has inscribed his feelings upon his heart and mind—but I don’t see any need to assume that this will continue to be the case. Loves and friendships may be more and more reduced to the needs of public display, and more and more people will be their Facebook page (and whatever networking forms emerge in the years to come) and therefore memoryless. Some form of sexual desire can be taken for granted, but romantic love centered upon monogamous long-term relationships, relationships dependent upon both memory and forward-looking narratives, certainly cannot be. Emotional life might take on shapes drastically unfamiliar to us.
What kind of people would these memoryless, or “dis”-membered beings be? It’s easy to assume the worst. With the obsolescence of memory, what would promises be worth? Would anything we (“we” being traditionally humanist thinkers) recognize as “thinking” be possible? If the past is constantly disappeared how would the future be conceptualized? Would people save money? Have children? Be capable of any kind of sustained attention whatsoever?
Marshall McLuhan, of course, raised these kinds of questions half a century ago, and his notion of a “global village,” meaning both instant connection across the globe and a return to the kind of oral culture focused on spectacular events, driven by rumor, gossip, moral contagions and celebrity (a kind of mini-divinity), seems as relevant as ever. As does Jean Baudrillard’s vision of a society of simulacra, in which we are ourselves the models out of which we construct ourselves. The viability of such a society, with minor as well as major powers possessed of nuclear weapons and the rise of a global Islamic fanaticism, not to mention the problems involved in managing a complex global economy, would be dubious.
If signs are not to be inscribed in hearts and minds, what does understanding signs amount to? Nothing more, maybe, than the most originary of understandings—the capacity to iterate the sign, to maintain and extend the joint attention (to follow a line of attention) it has initiated and which has drawn you in. The capacity to iterate the sign involves the knowledge of the difference between a broken scene and an intact one—which is to say, knowledge of what kind of gesture is likely to get the desired response—or, at least, a response one would know how to respond to in turn. I would think about this as a kind of sincere pretending in which individuals try not so much to be like other individuals, as to approximate a kind of projected or imagined norm. But it is not easy to imagine and approximate such a norm, especially since its formation is constantly in flux, and what is normative or average in one site might be on the fringes in another. There will always be cases in which the projected norm is in fact an extreme anomaly or, to put it positively, sheer possibility.
This is the form thinking may take, or already is taking, as we move into the order of electronic communication: the generation of possibilities, the more sheer, the more barely possible, the better. Start with the assumption that anything is possible, anyone is capable of anything, and modify that assumption as the scene takes shape. Quite a few postmodern thinkers have already pointed in this direction. I will put it this way: modernity continues metaphysics, which sought out the ultimate reality in a higher, hierarchically organized, intellectual and spiritual order, by shifting our attention to the ultimate reality to be found in lower, unseen forces: material interests, sexual drives, the unconscious, etc. What comes after modernity is “patatiquity,” with “pata” from pataphysics, the science of the exceptional invented by Alfred Jarry, and “tiquity” a temporal suffix modeled on “antiquity.” Patatiquity is the age in which possibility is more real than reality. Research conducted through Google constructs a possible array of links; social interaction carried out through networking online constructs a possible community. In both cases, the possibility is “real” insofar as others iterate the signs of possibility one puts forth, and these iterations in turn generate new possibilities (like a new hierarchy of Google links).
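The parenthetical about “a new hierarchy of Google links” can be illustrated with a toy, PageRank-style power iteration over an invented link graph; I am not claiming this is Google’s actual algorithm, only sketching how iterated linking by itself produces a ranking, and how new links, once iterated, produce a new one.

```python
# A toy illustration of how iterated links produce a hierarchy: a simplified
# PageRank-style power iteration over a tiny invented link graph. The point is
# only that re-running the iteration after new links appear yields a new
# ranking, i.e. "a new hierarchy of links."

def rank(links, damping=0.85, steps=50):
    """links: page -> list of pages it links to. Returns a score per page."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    score = {p: 1.0 / len(pages) for p in pages}
    for _ in range(steps):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * score[page] / len(targets)
                for t in targets:
                    new[t] += share
            else:  # page with no outgoing links: spread its score evenly
                for t in pages:
                    new[t] += damping * score[page] / len(pages)
        score = new
    return score

links = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
for page, s in sorted(rank(links).items(), key=lambda kv: -kv[1]):
    print(page, round(s, 3))
```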
So, is patatiquity sustainable? On the face of it (and both conservative and postmodern critics, with differing evaluations, agree here), patatiquity seems to herald an era of irresponsibility and carelessness we can ill afford—isn’t the Obama cult exemplary of patatiquity, with its investment in the sheer possibility of hope and change; isn’t endless debt, both personal and national, equally patatiquital (or, perhaps, continuing with the model of antiquity, “patacient”)? Maybe—that’s certainly one possibility. But it might also turn out that the most avid explorers of and investors in possibility will insist on the kind of minimal reality that makes possibility possible: to take just one example, real money. The modern attempts to control the economy and regulate habits through the money supply just inhibit possibility by governing according to the norm extracted by experts. The more we insist on unequivocal laws governing distinctive areas of human action, taken as literally as possible, the more is left over for possibility. In fact, Gertrude Stein’s political conservatism seems to be based on a similar line of thinking: in a series of articles written for The Saturday Evening Post in the 1930s she argued for the necessity that money be “real” (i.e., not fiat) and for the government’s approach to requests for further spending to be that of the stingy patriarchal father (a stock figure Stein otherwise tended to despise); more generally, the intersection of habits that generates infinitely varied human interactions and idioms can only do so if minimal, but strict, rules are taken as given.
Under such conditions, the law can function more as constraint than restraint: restraint seeks to hold back while constraint seeks to channel, like the rules of a game that enable a wide range of moves displaying an equally wide range of intellectual and/or physical capacities. Out of a set of constitutive rules—those rules that make the game a game—emerge all of the regulative rules determining strategies. But patatiquity suggests something more: the regulative rules reveal more constitutive ones. The right to property is a constitutive rule of a free society, but there are many ways of enforcing that right, and each one of them—protecting one’s property oneself through arms and security systems, a public police force, a private security force, etc.—reveals something about the right to property itself (what kinds of ethics and capacities it requires and evokes, where it stands in relation to other rights). Just so does the elevation of possibilities involve an ongoing revelation of a community’s constitutive rules. Agreements would be made explicit and their limits clarified, and norms and assumptions about rights would emerge from those agreements; longer-term institutions, most importantly the family, that transcend individualized, explicit agreements might very well change dramatically, becoming, as is already the case, more contingent and mixed—how to ensure the care of children will be a real problem. On the other hand, there will probably be far fewer of them, but, contra Mark Steyn, that may not be socially fatal—at the very least it will impose some very difficult choices: for one thing, it will become increasingly obvious that we can’t have both commitments to our present-day middle-class entitlement programs and the regulations and tax policies that cripple the kind of productivity required to provide the excess wealth needed to subsidize those ever more bloated programs.
In patatiquity the sheerly possible can reveal constitutive rules that a more normative, declarative culture conceals. Imagine writing according to the following rule: each word aims at being the least predictable, given the surrounding words, for the normal reader. Your writing, then, is first of all a study of your possible readers, in an attempt to give those readers an opportunity to study themselves. Following this rule (which will not be easy) you will produce the sheerest of possibilities, the possibility left after all the others have been exhausted. And to read such a work would be to start exhausting those inexhaustible possibilities—all the clichés, commonplaces, formulas, chunks and constructions in one’s linguistic inventory. If the first word of the sentence is a personal pronoun, the next is most likely a verb, and a verb referring to an action carried out by humans; and then, adding in the context, your own personal proclivities and some guesswork, you anticipate with a 63% probability one “kind” of word and with 37% probability another “kind”; given that next word, the same process starts up for the next one, and so on all the way through; and you could do this backwards and forwards or starting in the middle, and over and over again. This is the way we always use language—someone starts to speak, or to gesture, or we start reading the first line, and each sign plugs into an array of constructions and possible relations between constructions we are familiar with in varying degrees. So, pataphysical writing makes visible the constitutive rules of language use, precisely by loosening those rules as much as is humanly possible. And now you can read anything in those terms, as a certain degree off-center, as containing anomalies; even the most predictable text will, then, embody pure possibility—perhaps especially the most predictable text, if we consider what an odd thing that is.
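A minimal sketch of that writing rule, assuming a crude bigram model built from a tiny invented corpus as a stand-in for the “normal reader’s” expectations (any stronger language model would serve the same role): estimate the next-word distribution after the previous word, then choose the candidate the reader expects least.

```python
# A minimal sketch of the writing rule described above: given the previous word,
# estimate a next-word distribution from a small corpus (a crude bigram model as a
# stand-in for the reader's expectations) and pick the *least* probable candidate.
# The corpus and the model are placeholders.

import re
from collections import Counter, defaultdict

corpus = """he walked to the door and opened the door and walked into the room
she walked to the window and opened the window and looked into the garden"""

# Count bigram frequencies: counts[previous][next] = number of occurrences.
words = re.findall(r"[a-z]+", corpus.lower())
counts = defaultdict(Counter)
for prev, nxt in zip(words, words[1:]):
    counts[prev][nxt] += 1

def expectations(prev):
    """Next-word probabilities after `prev`, most expected first."""
    c = counts[prev]
    total = sum(c.values())
    return sorted(((w, n / total) for w, n in c.items()), key=lambda x: -x[1])

def least_predictable(prev):
    """The rule in the text: choose the word the reader expects least."""
    dist = expectations(prev)
    return dist[-1][0] if dist else None

print(expectations("the"))          # what the "normal reader" anticipates
print(least_predictable("the"))     # what the rule would have you write instead
```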
De-memorization would then leave us with nothing but memory of the constitutive rules, and a desire to rediscover those constitutive rules over and over again through “acts of pure attention,” or “divination.” So, if we return to my first example, of thinking as linking, then the most compelling texts, scholarly, popular, or esthetic, would be those that articulate the most probable links in the most improbable ways, grounding them in sheer possibility. Elaborate, counter-productive rules like those promulgated and incompetently enforced by government bureaucracies would be discarded as requiring too much “self-inscription”—too much remembering of specific rules and their normative “meaning.” Very simple things, like acquiring the most useful skills, and saving as much money, or real wealth, as possible, would be preferred—you could always check the status of those things daily on the market. The future can be divined in signs of the present, while the firm fact that (real) money will always be useful allows for the future to be otherwise completely open, populated only by sheer possibility that one need barely adumbrate.
Once we realize that our selves are possible, not actual, our energies will be devoted to the creation of plausible possibilities and spaces where implausible ones can be safely engaged; even more, our assessment of institutions will turn on our assessment of their ability to enhance our creation of possibilities. One’s own economic possibilities—and more and more professions—will focus on creating possibilities for others—helping others be imagined as they imagine themselves being imagined. PR will become the queen of the sciences. If you want to construct a representation that will have effects on a particular audience in a particular way, you must study the desires and habits of that audience; even more, you must treat those desires and habits as malleable, within limits. You will game it out—someone who says x will be likely to want to hear or see y; someone who does x everyday will be happy to be given a chance to do y; someone who has bought a, b and c will like something like d (note that in each case there is no reason to assume that the audience actually wants, or has ever imagined, the y or d in question—the marketer is filling an imagined gap in their experience, a gap opened by the inquiry itself). Already, more and more selling of products involves selling such simulated images, filling such gaps, and telling the consumer of the gap and that it is being filled. This is objectionable from various enlightenment and romantic perspectives assuming the uniqueness of the individual and the integrity of the thought process, but if we set those objections aside we can see that a mode of “critical” thought and “high” culture is already implicit in this very model: opening up new spaces or gaps between the normalized experiences and those experiences which yet lie immanent in them.
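The marketer’s gap-filling can be sketched in a few lines, assuming invented purchase histories and a bare co-occurrence score; real systems are far more elaborate, but the logic of opening and filling the gap is the same: score the items someone does not yet have by how often they appear alongside the items they do.

```python
# A bare-bones sketch of the marketer's "gap filling" described above: from past
# purchase histories, find the item that most often co-occurs with what someone
# has already bought but which they do not yet have. The purchase data is invented.

from collections import Counter

histories = [
    {"a", "b", "c", "d"},
    {"a", "b", "d"},
    {"b", "c", "d"},
    {"a", "c", "e"},
]

def fill_gap(owned, histories):
    """Score unowned items by how often they appear alongside the owned ones."""
    scores = Counter()
    for basket in histories:
        overlap = len(owned & basket)
        for item in basket - owned:
            scores[item] += overlap          # the more overlap, the stronger the "gap"
    return scores.most_common()

print(fill_gap({"a", "b", "c"}, histories))  # e.g. [('d', 7), ('e', 2)]
```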
Finally, this turn toward the trivial, or a continual lowering of the threshold of significance (more things becoming less significant), would lead to a very strong desire to reduce violence. We already see increasing distaste for sustained confrontations and enmities—maybe they require too much memory. There is a preference for constructing defenses that make confrontation unnecessary. The free and more advanced societies will be able to create sophisticated defenses that make them impregnable vis-à-vis the failed, resentful societies surrounding them, while sparing them the pangs of white guilt involved in retaliation—Israel’s missile defense, which costs far more than an all-out war to destroy its Palestinian enemies would, is an obvious example here. A premium will be put on keeping people out, except under precisely defined circumstances—once someone is in, you need to deal with them, so a lot of intellectual energy will be invested in determining who can be let in. If governments don’t defend their borders, communities and businesses will do the job; and people will shape themselves so as to be acceptable members of the communities they wish to enter (as I suggested earlier, much business will be generated in helping them to do so). These strategies of avoidance might impoverish social interactions by ruling out a wide range of possibilities from the start; but they might enrich the relations that remain by making them more meaningful in the literal sense: all signs exchanged among those who have been properly vetted, and who therefore already give forth much information, would come to contain layers of significance.
And, anyway, no one will remember what they are missing—there will be students of history, but I think the idea that there are lessons to be learned from it will disappear, and rightfully so because history is nothing but the history of struggles to own the asymmetrical gift economy centered on the state, and patatiquity can only come into being by putting all that aside.