Deferral as Media

I’ve been limiting my discussions of media by assuming that it is the sign and sign system that constitute the media, when it is, in fact, the form of deferral created by that particular sign. It is the deferral created by the issuance of the sign that provides for the new form of sense and intellectual activity characteristic of engagement with that medium. Let’s take a simple example: someone insults me. I can take a swing at him, in which case all I really notice is the spot on his face I am aiming at (if I’m collected enough to aim). If I refrain, though, other things come into view. Maybe his insult was a prelude to an attack on me, in which case I might notice a determined, aggressive expression on his face that suggests I would have been better off hitting him (sometimes he who hesitates really is lost—that’s the bet we lay down in gesturing rather than striking). Even in that case, though, I might now better prepare myself for what could be a sustained struggle. Or I might see some hesitation in his expression and posture; I might notice something to suggest that his insult was a response to something I said or did, perhaps inadvertently (maybe I notice a bag of groceries spilled at his feet and realize that I had in fact bumped into him). My deferral opens up a world of observations to me, directs my attention in ways it wouldn’t otherwise have been directed, and suggests other possible uses for my eyes and ears (at least). I might even refrain from returning the verbal insult, returning a good word instead, and see where that leads. It is whatever posture and attitude I have replaced the potential blow with that generates this space of deferral in which looking, listening and thinking are possible.

It would be necessary to show how each new media form, through the different forms of writing up until the alphabetic, through print, film, TV and electronic media, represents a more advanced form of deferral. Each of these formal advances has been denounced as a source of infantilization, starting with writing and most certainly continuing through the most recent forms of electronic communications. Certainly any modification of the sensorium will involve a loss of some capacities, capacities deemed indispensable by those still immersed in the newly marginalized form of media; and maybe capacities that it would really be better to preserve. It wouldn’t be a bad thing if we were all capable of memorizing a few thousand lines of poetry. But I think the more important problem is that each new media form reflexively attempts to model itself on the forms it is displacing. In the example I have spent a great deal of time on lately, writing took upon itself the task of making the written text a simulacrum of a speech situation, one that could be reproduced by the reader. We still use terms drawn from orality to speak about writing—we refer to what a text or author “says,” and don’t even have alternatives drawn from writing should we want to use them. If one individual speaks with another individual, or a few, the listeners or interlocutors know the speaker, can assess him, respond to him, speak amongst themselves, etc. A single text “speaking” to thousands or millions of people is taking up residence in their minds—it becomes their own voice. The experience and consequences are radically different, but it’s still modeled as an “internal dialogue,” even if one of the participants is remotely controlling that dialogue.

If writing were represented and performed more as what it is, a mapping of speaking and listening possibilities, it would not have these hypnotic effects. The focus on a single speaker, who must be made answerable, cross-examined, defeated in rhetorical and logical combat (essentially, expelled from the mind); or, on the other hand, “internalized” and agreed with completely, would be deferred. One would, instead, be prompted to generate hypotheses. I think the same is true for the other allegedly stupefying media. There’s probably no point to talking about TV by now, since it has become such a minor medium, but the internet and online communications are anyway the far better example of how it is the holdovers of previous media that contribute to a kind of mindlessness easily associated with new communication forms. In fact, the recent discovery that the major media companies manipulate their algorithms so as to hide and marginalize thinking considered heretical by state liberalism provides a perfect example. Now, there’s no doubt that Google, Facebook and Twitter have become the media arms of Antifa; that’s not the issue. The issue is the assumption that algorithms can be neutral, constructed without assumptions regarding a hierarchy of importance concerning ideas, events, agents, and so on. When someone searches “Trump” what should they find? The documents that mention his name the most times? The most recent documents? The documents that mention his name and are on sites that are otherwise the most searched in general? The documents that have received the most hits (partisans could hire illegal aliens to sit and click on the preferred sites all day long)? Some combination of all of the above (and a dozen other criteria we could easily devise)? Which combination?
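The combinatorial question at the end of that list can be made concrete. Here is a minimal sketch, with entirely hypothetical documents, field names and hand-picked weights, showing that any ranking is just one choice of weighting among many; there is no weight vector that amounts to “no choice at all”:

```python
# A toy ranking function: every choice of weights encodes a hierarchy
# of importance. All documents and weights below are hypothetical.

def score(doc, weights):
    """Combine several ranking criteria into a single score."""
    return (weights["mentions"] * doc["mentions"]              # times the name appears
            + weights["recency"] * doc["recency"]              # newer = closer to 1.0
            + weights["site_popularity"] * doc["site_popularity"]
            + weights["hits"] * doc["hits"])                   # raw click count

docs = [
    {"id": "news_item", "mentions": 12, "recency": 0.9, "site_popularity": 0.8, "hits": 500},
    {"id": "old_essay", "mentions": 40, "recency": 0.1, "site_popularity": 0.3, "hits": 90},
]

# Two equally "neutral"-looking weightings produce two different front pages.
w_recency  = {"mentions": 0.1, "recency": 10.0, "site_popularity": 1.0, "hits": 0.001}
w_mentions = {"mentions": 1.0, "recency": 0.1,  "site_popularity": 0.1, "hits": 0.001}

top_by_recency = max(docs, key=lambda d: score(d, w_recency))["id"]    # "news_item"
top_by_mentions = max(docs, key=lambda d: score(d, w_mentions))["id"]  # "old_essay"
```

The point is not that either weighting is wrong, but that the question “which combination?” has no answer that is not already an assertion of a hierarchy of importance.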

People who ask for neutrality here are imagining what is in fact one of the precedents of the internet: the archive. They are imagining, however vaguely, a scholar, researcher or investigator, interested in getting at the truth of whatever one imagines oneself to be getting at the truth of, sorting through masses of documents, assessing authenticity and reliability, ascertaining relevance, generating links between documents that could only be discovered once one has seen enough of them. An ideal self for themselves as inquirers, really. But most people can only imagine the results of such work, having done very little or none of it themselves in the course of their lifetime. What they imagine is the popular narrative of the heroic sleuth discovering the truth hidden beneath a pile of lies and revealing it to all just in the nick of time, confounding the falsifiers. Or they are imagining a kind of decentered public square, an agora, where equals exchange ideas, battle it out, and get a bit closer to the truth. These are extremely attractive models. But they are all also essentially sacrificial. The attraction lies in the promise to have a scoundrel, stripped of all his protective covering, served up to all. Each new gradation of deferral saps sacrificial thinking of some of its power, power which powerful forces within the new form will seek to exploit and intensify. The online lynch mob is far more ferocious and consuming than the real thing, and if it seems somewhat less devastating in its effects, I would say that it would not be at all surprising to pass the point at which online lynch mobs start instigating the real thing. Would either George Zimmerman or Darren Wilson be safe in public in most places in the US?

But it’s possible to imagine a far more productive discipline of algorithmic design in a well-ordered society. If computer programmers knew what people needed to fulfill their disciplinary assignments, they could design the algorithms most helpful to them, from the sovereign on down. We would all learn, unevenly and in accord with necessities, to think probabilistically, to project probabilities further and further into the future, albeit with declining degrees of certainty as we go further ahead. That is really the essence of deferral: if we don’t think primarily of how to kill each other right now, we can occupy ourselves with more profitable uses of our time; if we extend that period for ten years, yet further vistas open up; for a hundred, and we can imagine civilization building. Then all our thinking would focus on what builds trust and what minimizes resentment, and our practical activity would focus on deploying resources and energies so as to build that trust and neutralize and redirect that resentment. How would we know we have another hundred years (perhaps to then be renewed indefinitely)? We really wouldn’t, but we’d be able to speak in terms of which activities and which ways of thinking made it either more or less likely. We could be wrong, but then we could study the source of such errors as well, and seek to minimize them.

But there could be no such “we”—overlapping disciplines, concerned with the intersections of scientific development, technological advance and anthropological understanding—without an unchallenged center. If I want one figure at the center and you want another, that incentivizes us to start arguing over different definitions of “trust,” different assessments of this or that resentment, opposing opinions regarding which anthropological understanding best accounts for a particular conflict. Your acceptance of the self-evident belief that we should be building a society to last becomes for me the arrogant assertion that my subordination must be imposed, my complaint ignored, eternally. We then have to argue about how our respective claims can be adjudicated, and we have to argue about rights, procedures, mediators, and so on. The media form enabling a new transcendence is then weaponized by being saturated with corrupt simulations of earlier forms: oral argument in court, tabloid journalism, state propaganda, etc.

The study of the new media counters this development by articulating the increasing delay and projection of consequences beyond the immediate, of events “processed” through the media, on the one hand, with the centering and continuity of power, on the other hand. All our inquiries into the future effects of present decisions presuppose a fundamental stability and continuity of order, and all attempts to project probabilities onto particular “timelines” are also attempts to hold constant the bulwarks of order that, first of all, allow me to hypothesize without having to daily defend myself and construct my own order. Those most devoted to research and scholarly pursuits, to the maintenance and articulation of the archive we are all becoming part of, should be the freest and the most powerless, and therefore the most insistent upon the organization of all institutions around central power. Inquiry and social commitment converge, because the best conditions for anthropological research, which is ultimately the basis of all research, are those in which human possibilities are multiplied and presented in well-formed, public ways. We have nothing more to learn from revolutions and other upheavals; we have a lot to learn from the endless possibilities of dialectical transformations of disputes into agreements, and then those agreements into more disputes that we already know will be aimed at generating new agreements.

But the media still remain, at least the most elementary ones, like body and voice. The new media become more proficient at turning ostensives and imperatives into declaratives—something like “look at that criminal—stop him!” becomes something like “demographic, environmental, urban and architectural studies demonstrate that instances of disruption can be most significantly reduced through the following combination of lighting, surveillance and direction of pedestrian traffic…” Instead of shouting at bystanders to stop the guy running away with a purse, you report the incident to security, which undertakes a review. But at each point along the way, “fleshy” human responses—what people see, hear and feel, what they look like, how they move, how they play off of each other—get fed back into the system. But that just means the more ancient media are resituated within the new space of deferral, and they take on meaning insofar as they serve that deferral. We speak and gesture, but we do so as if we might be recorded or are ourselves recording; we write by hand, or by spray paint, but in doing so we present what we know can only be seen as a scrawl; even print writing has “always already” been chopped up into excerpts and sound bites; we sing, in anticipation of various remixes and electronic voice modifications. All the media can therefore be kept in play, for the most expansive production of meanings, but always as the oscillation between our “speaker’s meaning” and the now unlimited possible “text meanings” that might result. Deferral lies in that oscillation.
