GABlog: Generative Anthropology in the Public Sphere

March 28, 2013

The Loves that Dare Not Speak Their Names

Filed under: GA — adam @ 10:54 am

The United States is a pathetic joke of a country. Our political class (but who put them in office?) is paralyzed when it comes to crafting budgets, controlling debt, defending borders, developing coherent relations with friends and enemies abroad; but for a non-issue like same-sex marriage, we are capable of moving rapidly toward self-righteous unanimity and policy clarity. Only those issues of concern to the victimocracy get addressed expeditiously, but the only issues of concern to the victimocracy are those pseudo-inequalities that allow them to conduct unending simulated Nuremberg-style show trials of stereotyped victimizers. It’s worth saying a few words about the same-sex marriage question, nevertheless—not because it is a serious human or political question, but because one of the desperate (but what kind of resistance to the left isn’t going to be a bit desperate these days?) counter-proposals allows us to follow a thread through the unraveling. That counter-proposal comes from a strain of libertarianism, which says: just get the government out of marriage. Let individuals of any size, shape, number, dimension, mode or degree of intimacy create whatever contracts they like regarding the sharing and disposition of property, reciprocal obligations, the terms of contract termination and post-termination settlement, etc.; let the government remove all reference to marriage from tax codes or anything else; let churches, synagogues, mosques, etc., marry whom they will, and, perhaps, agree to supplant the state as arbiters in cases of divorce (with the consent of the married parties, of course); and let adoption agencies, schools, home sellers and renters, employers, etc., recognize whatever form of marriage suits their own interests and conscience.

This proposal should be put in play, because, obviously, once marriage can be same-sex, it can be anything—perhaps anything the government says but, ultimately, anything anyone says. But there are problems. Who, after all, enforces contracts? The government, which is to say the façade behind which the victimocracy conducts its crusades, would be far more involved in marriage than ever before, with the responsibility to sort out a whole array of confused, complex, and misconceived contractual arrangements, with, undoubtedly, extremely incompetent, unacceptable and easily evaded modes of enforcement, leaving tens of millions in legal limbo regarding crucial issues like child custody. Well, as someone has said, when you can’t solve a problem, enlarge it—that is, let’s radicalize further. Let’s shift over to private courts: we would have to get into the habit of signing our contracts before mutually agreed upon arbiters and, presumably, of conferring upon those arbiters agreed upon powers of enforcement. This new system would put two enormous social divisions in place: one, between those who would continue to rely upon state courts or simply exit any system and create informal, loose, polygamous family structures, as is already the case in many American inner cities and among the under classes generally, on the one hand, and those who would opt into the new system of private courts, on the other; and, two, within the new system of private courts, between those with the discipline to work within a system relying heavily upon a willingness to abide by verdicts that will undoubtedly be difficult to enforce (will a private court really be able to impose a judgment on a husband who takes the kids, in opposition to prior agreements, from Tennessee to Oregon?) and to avoid recourse to the courts in the first place, and those without such discipline. The old system, especially once abandoned by the responsible and self-sufficient, will devolve into some combination of quasi-totalitarian nanny state rule, in which the state regularly steps in and regulates parents and cares for children, on the one hand, and renewed clan or gang systems, on the other, as women will have to rely on their male kin (or any other vehicles of male violence they could enlist, through whatever degradation to themselves) to avenge and rectify their violations and abandonment by promiscuous men.

Would 50% of the population be capable of migrating into the new system? 1%? How many would be necessary to make it sustainable? Whatever the answers to these questions, the point of this thought experiment (I don’t mean to suggest I think it couldn’t happen) is that if such a migration, or exodus, were not possible, then our current system isn’t possible either: the ability to make and fulfill the reciprocal promises such a system would require is exactly what we need to hold on to what we have, because the government no longer supplements and shores up the values and commitments required for a nuclear-family centered society but actively undermines them.

One last question. When a woman has a baby, what makes that baby hers? (For clarity’s sake, let’s leave the father out of it.) Go ahead: justify her “right” to the child. I don’t think you can do it. Childbirth and parenthood are now state-sponsored and regulated activities like any other. Do you not need a birth certificate to authenticate the “provenance” of the child? Are there not myriad laws determining how you must treat and care for and educate the child, along with a full array of government agencies empowered to enforce those laws (would we have it any other way)? In other words, you didn’t birth that: just as the government built the roads, educated the workers, and supplies the police and fire fighters that make it possible for you to do business (and so you didn’t build that), so the government (at least) funded the hospitals, gave loans to the doctors, and had the FDA approve the painkilling drugs, vaccinations, etc. (and you usually need a road to get to the hospital as well) that made your giving birth and then raising your child possible. You will have no argument once some government bureaucracy, armed with a plenitude of studies regarding the needs of children, goes from house to house determining which children are better off where they are and which ones would be better removed to some more approved environment. One more term in office and Mayor Bloomberg will no doubt get to this. The difficulty you will have arguing against this without some presumption of the pre-state naturalness of the mother’s relation to her child is exactly the same difficulty we have arguing that marriage simply is a sanctified union between a man and a woman, the “joining of their flesh into one,” or whatever equivalent liturgical phrase it is sheltered under. The very fact of having to argue it makes us bereft.

In the privatized system I am imagining, what would make the baby the mother’s from another, equally disturbing standpoint—that is, what recourse would there be if mothers started abandoning their children and taking off beyond the reach of whatever jurisdiction they inhabit? (We can no longer assume anything. Why, indeed, care for a child that just happened to pass through your birth canal?) A genuinely private system of civil law would only be possible among people who understood that such questions cannot be answered via an impeccable sequence of declarative sentences: marriage is marriage, children are children, parents are parents, and so on. People do terrible things, like abandoning babies, but only those who know, without necessarily being able to say how they know, what marriage is and what mothers and fathers are, are capable of knowing that such things are terrible and that we must step in to remedy them by taking in children, setting up orphanages and foster parenting institutions, restraining parents who become dangerous to their children, and so on.

Permanent damage to the language is probably harder to inflict than such damage on institutions. You can erase “husband” and “wife,” “mother” and “father,” even “son,” “daughter,” “sister” and “brother” from official documents, but it will be more difficult to uproot them from people’s minds and our overlapping heritages. And the words themselves, which will exist not only in people’s conversations but in books written more than 10 minutes ago that some people will still read, will be found appropriate to experiences, and will serve as a rebuke to those who see them as little more than slurs (if you think I am exaggerating, I don’t think you have worked through the logic of “same-sex marriage”—an examination of the implications for entire vocabularies of love, affection, intimacy, and so on would itself require a lengthy discussion, as would the inclusion of the totality of the effects of the therapeutic state). For quite a while, anyway, there will be ways out for people who want them.

March 8, 2013

Paulmania

Filed under: GA — adam @ 6:42 am

The almost unanimous conservative euphoria over Rand Paul’s filibuster the other day seems to me an odd thing. More precisely, it seems to me delusional, and therefore demanding understanding. Much of the excitement seems to result from the sheer novelty of a real filibuster, requiring the speaker to hold the floor (and hold it in) for hour upon hour; part of it is the hunger for someone with the “balls” to finally “stand up” to the Obama administration; part of it is the entrance into public life of the libertarianism that has been percolating for years around the margins of the conservative movement and Republican party, which takes its rhetorical power from a fetishization of the US constitution (by which I mean not adherence to its terms, but the belief that the defense of the constitution defines our political imperatives and priorities, and that the constitution contains the answers to political questions, even to questions regarding the best mode of public life); part of it is a sense of stealing the left’s issue and their thunder, and even gaining the grudging support of the more honest among them; and part of it is, I assume, genuine concern over the immediate issue at hand—the question of whether the President is constitutionally empowered to assassinate (or is it only to assassinate using “drones”?) American citizens on American soil.

But if we start with that issue, without which the entire event is really nothing more than catharsis, it seems to me there is much less there than meets the eye. I have noticed that on conservative websites, discussions of the general principle get met with indictments of the Obama administration’s duplicity, opportunism, cynicism, and treachery, most of which I share, but now tipping over into the further claim that if we can’t put it past them to demonize Tea Partiers, gun owners, Christians, veterans, etc., as potential terrorists, then we also can’t put it past them to start assassinating them. This slippage into left-wing style paranoia (the indulgence of which I would add to the above list of reasons for the euphoria) both misses the supposed “real” point and unwittingly demonstrates the emptiness of Paul’s entire exercise: such an administration would have no problem acknowledging it has no right to do something and then going ahead and doing it anyway; and whether the government might misuse the powers at its disposal tells us nothing about whether it legitimately possesses those powers. And on that question, the answer seems to me obvious: the President would have to have the power to put down a rebellion organized on American soil; such a rebellion, by definition, would exceed the powers of law enforcement and put us on a war footing; part of putting down a rebellion that, on this assumption, controls part of US territory, might very well involve assassinating its leaders, even those who are involved in political and propaganda rather than strictly military operations, something well within the laws of warfare; since it is conceivable, even likely, that, say, a portion of the Southwest in some combined socialist/Mexican-nationalist, Chiapas-style revolt would include American citizens, those American citizens would be making themselves targets—so, yes, obviously, the President would have the right to kill them, using drones, poison, exploding cigars or any other available lethal technology. (I notice in rereading this that victimary thinking would exclude even the hypothetical construction of such a scenario, since one must in some way “stigmatize” some specific group in order to do so—I could have imagined an Islamist revolt in some part of Michigan, a white supremacist revolt in Idaho, etc.—in any case, names must be named—so denying the very possibility of such an event feeds one’s self-congratulatory White Guilt or self-righteous victimary stance.)

Disposing of that question leads me to conclude that the roots of the euphoria lie even deeper than the causes I have given so far: the assertion that no American President could ever have the right to assassinate an American citizen on American soil silently assumes, and therefore reassures us, that it will never be necessary to do so—that we are as inviolate here, on our own land, as the 9/11 attacks may have led us to believe we no longer were. Any war on American territory would be for causes both left and right are well equipped to diagnose and combat, at least in their own imaginations: the home grown tyranny that our political doctrines have always warned us against. To put it bluntly: Paul’s filibuster allowed conservatives to join in the fantasy, which began 9/12/01 and has grown steadily in strength ever since—the fantasy that 9/11 didn’t really happen, and that there is no enemy out there that we don’t create by violating our own principles in some way, through some original sin of our own. On my reading of the evidence, those who enter this fantasy don’t leave it—indeed, why should they, as its terms are idyllic, combining in equal measure victimary resentments, an orientation towards one’s own, familiar, domestic political opponents, and an inexhaustible justification for romantic and populist posturing against the state. A state that they all, in the end, know will not really take them out with a drone as they sip their latte. This simply confirms what I concluded following the November elections: that Americans, having gotten on the victimocracy train, will not get off until it has sped to its destination, whatever that might be.

January 16, 2013

Notes on Cool (not cool notes)

Filed under: GA — adam @ 11:16 am

Our understanding of victimary thinking cannot be considered complete until we have accounted for the category of “cool,” which has proven to be extraordinarily enduring and generative. I wonder how far back the term goes—there must already be histories of “cool,” but the Wikipedia page, at least, is no help—it traces the attitude of “cool” back to the Renaissance, but cites no actual uses of the term in its current slang sense going back more than a few decades. I assume it entered our vocabulary in the 1950s (although I’d be glad to be corrected by anyone whose personal memory or historical knowledge can date it earlier), which would situate it squarely within the emergence of post-war victimary culture.

Hannah Arendt observes somewhere that the German romantics of the early 19th century referred to their cultural antagonists as “squares,” in the same sense that was pervasive in the 1960s but is by now itself uncool usage. So, we can trace coolness, as an attitude, if not the word itself, back to romanticism—in which case, “cool” would be the synthesis of romanticism and victimary thinking.

This is important because without “cool,” victimary culture is shrill, desperate and ultimately unconvincing; with “cool,” victimary culture can produce iconic figures that offer alternatives to the cultural center. I think that Obama’s coolness and Romney’s squareness played a significant role in our recent election, and that the power of “cultural issues” like abortion and gay rights has nothing to do with the effects of such issues on people’s lives and everything to do with cool.

Cool represents a pole of attraction on the margin, opposed to the center. Cool is not, at least first of all, antagonistic towards the center—it is simply uninterested in it, except as a source of amusement. Coolness embodies an attitude of deferral, which might account for the term—as opposed to those who are “hot,” i.e., worried about social expectations and judgments, always trying to influence or preempt them, the cool position themselves outside of that space of judgment. In distinction from cynicism, or “coldness,” cool separates itself from the center in order to make space for a kind of authenticity disallowed there: the cool are passionate, usually regarding some singular relationship or project. In defense of that space, the cool are ready to confront the center—that defense takes the form of the protection of some victim of the mainstream, an exemplary victim whose plight the cool, from his marginal perch, is qualified to identify.

“Cool,” as a word, has moved to the center—middle-aged women use it to refer to a clothing purchase or new flavor of coffee. It is used as an honorific, often by adults to counter the exclusionary uses of cool among teenagers in their charge. And coolness might be dissociated from the victimary as, for example, with the high schooler who can initiate his fellows into forbidden pharmacological and sexual experiences. Ultimately, though, since cool is always a potential target of the center, its deepest alliances are with all those other potential victims, against which the center is seen to define itself. So, the coolness of jazz and now hip-hop frames the black victimary stance; the coolness of rock the youth victimary stance; while homosexuality has come to be marked as cool in various ways over the past couple of decades, generally as the uninhibited, joyful, stylish and honest amidst a swarm of hypocrites. Interestingly, there doesn’t seem to be any distinctly feminine cool—the cultural commissars have been working overtime for years to lay a patina of cool over Hillary Clinton, but I don’t think it has taken. Among celebrities, perhaps Angelina Jolie, who cultivates the distance and the absence of neediness necessary for coolness, and also consistently plays the lead in action movies, is cool. Jewish humor—say, Lenny Bruce—was cool at one point, but that has dissipated as Jews have lost their victimary credentials. At the same time, it doesn’t seem to me that a form of Muslim cool has been forged—perhaps in Europe? That might mean that women and Muslims must become constituents, so to speak, of other forms of coolness, which speak for them. In Lena Dunham’s online ad for Obama, in which she notoriously (but for whom was it notorious?) compared voting for the first time to losing one’s virginity, it was not the women appealed to or Dunham herself who was cool (on the contrary, they are dependent, insecure and needy)—rather, the ad bears witness to Obama’s coolness, as the kind of guy you would want to be your first. This perhaps leaves women free or, depending on your perspective, obligates or even compels them to be the conscience of the victimary. The Muslim incorporation into coolness still seems to me highly problematic—perhaps that will be a cultural faultline in the coming years.

What cool adds to the victimary so as to complete it is marketability. Cool, of course, is unthinkable without what Eric Gans has called the “constituent hypocrisy” of romanticism—by setting itself apart from the center, the cool becomes a trendsetter, or mimetic model, determining styles across the culture. As I suggested earlier, the relation is symbiotic—without its victimary affiliations, the cool would drift into coldness, i.e., cynicism and cruelty (the territory that David Letterman, for example, often veers into).

Is there a viable alternative to coolness, then? Certainly not goodness—if goodness were an effective counter, a competing mimetic model, to coolness, we would know it by now. (Tim Tebow, alone among conservative and Christian NFL quarterbacks in recent decades—from Roger Staubach to Kurt Warner—has approached a kind of celebrity based on coolness through an explicitly religiously grounded “goodness”—alas, he doesn’t seem to be good enough to put this hypothesis to the test.)

One would assume that conservatism couldn’t be cool, insofar as cool defines itself as conservatism’s other, but one of the interesting phenomena of the 2012 election campaign was the emergence of a movement, largely youthful, around Ron Paul—old, cranky and starchy, obsessed with constitutional rectitude, holding unfashionable opinions on abortion, and carrying a checkered history regarding racial issues—somehow, Paul became cool. Freedom might be cool, then, when linked to an uncompromising rejection of all the corruptions and compromises of freedom wrought by the “establishment.” But Paul never threatened the establishment, and only made trouble for the Republican wing of it, so he was indulged by the traditional media—we didn’t get to see whether his coolness would survive the kind of full-scale assault launched against Sarah Palin (who also had some markers of cool). A libertarian like Paul (maybe we will see this with his son) would need to devise a strategy for turning such attacks into the elements of his cool. I suppose supporting drug legalization helps here.

Beyond such speculations, the problem here is whether positions on the margin can be made into mimetic models without rejectionist gestures toward the center—the historical center, or firstness (initiative, responsibility, representativeness), if not the political or cultural center. In other words, what kind of generative margin (a margin that produces new centers) could run on other than victimary fuel? Coolness, presently, is confronted with the problem of having won the political and cultural centers through a demonization of the historical one (Western culture’s insistence on equality versus the imperial—the very premises, in other words, that make sympathy toward the victim possible). In power, cool figures like Obama become extremely tiresome, not to mention incompetent (we now have a government, part Ponzi scheme, part protection racket, part victimary theater, that is utterly uninterested in what were once considered the defining responsibilities of government, like defending borders, passing budgets and distinguishing friends from enemies). On the other hand, that historical center has been, probably irremediably, sapped by its appropriation by the victimary. The parasite has destroyed the host.

The only alternative, I think, is a kind of originary ‘pataphysics, the science of the exceptional invented by Alfred Jarry, and carried on through a series of avant-garde aesthetic and cultural movements until today. (Jean Baudrillard, apparently, considered himself a ‘pataphysician, something I will have to explore further.) Of course the roots of ‘pataphysics lie in romanticism, and ‘pataphysics itself could plausibly be seen as a precursor of cool. But ‘pataphysics is a program for thinking and learning, activities which interest cool not at all. One way of thinking about ‘pataphysics is via the famous Seinfeld episode in which George “does the opposite,” i.e., the opposite of what he would normally do in that situation; except here, one does not do the opposite (an ultimately incoherent approach, as not everything has an opposite, there may be more than one opposite, etc.) but the least probable, and not as opposed to what one ordinarily does but in relation to the probabilistic frame implicit in the discourse one inhabits.

So, when you address me, you hope for and expect a certain response, based upon social conventions, the present context, and your knowledge of me and our shared past; perhaps you also fear other possible responses, the probability of which you have sought to reduce in your mode of address. As a ‘pataphysician, my interest is in surprising you, but in some recognizable way—I can only undermine your expectations if I display some awareness of them. In this way I create an event, a happening, and make it possible for us to recognize each other on the margin and affirm the signs and tacit agreements we share. Clearly, carrying out such performances across the field of culture is not easy, but, like coolness, it’s not something everybody would have to do—just enough to create viable mimetic models. ‘Pataphysics must be rigorous and disinterested—its only politics must be in defense of its own possibility, which is to say against anyone who wants to remove events and happenings from social life. (I have assumed that with the fall of East Bloc Communism, the work of talented and absurdist [i.e., ‘pataphysical] dissidents like Vaclav Havel had become irrelevant, but maybe we have much to learn from them.) Originary ‘pataphysics, as an overtly marginal position, shares the field with cool, but it is not itself cool because it seeks to find and refound rather than stigmatize the center; maybe the other of cool can just be “firstness.”

Well, one might say, wouldn’t, say, a vicious or violent response to an amiable greeting be “doing the improbable”? Maybe, but only once—nothing is more monotonous (and therefore predictable) than violence (and the means taken to restrain it), once it has upset some space based on trust. Violence, or any kind of violation of already achieved forms of civility, would not, that is, open the field of possibilities, or lower the threshold of significance, which is the point of ‘pataphysics. The most valuable effect of originary ‘pataphysics would be what the left has promised (or, for that matter, what modernity has promised), with unsatisfying results: the recovery of excluded voices and the creation of new ones. If I, say, improbably take you literally when you ask me how I am, unburdening myself of an exhaustive account of my current state, I remind you of several things: the kind of shared beliefs, commitments and experiences that must have once been necessary to put those standard greetings in place; the fact that we no longer share those beliefs, commitments and experiences and yet still need the greetings; that sustaining those greetings and civility, then, might not be guaranteed; that we might need to discover means (not necessarily my current, excessive, gesture) to restore the foundations of civility; and more. I thereby make it more likely (another shift in the field of probabilities) that you will notice further fraying of standardized modes of civility, and be attuned to new refreshments of those modes.

There is no reason why we can’t have forms of art that gently intervene in everyday life, turning us self-reflexively upon our habits, without the implicit or explicit condemnation of middle class lifestyles which makes so much performance art so annoying. I think most people would enjoy losing a couple of seconds here or there with little installations that might play off of the constant surveillance now characterizing our lives. (How often do we now see ourselves entering and leaving places? What if we saw ourselves upside down once in a while? Or, looking up to see ourselves, saw a celebrity walking out instead?) Or that play with our expectations of impeccability in business establishments—like an installation inviting customers to clean up a little mess, with each customer contributing to a new arrangement? We always think of little things that might go wrong, or awry, in carefully organized settings—little bits of art that fulfilled those possibilities, perhaps giving them surprising happy endings, would be appreciated. There might be a place for the victimary here—little bits of feminist or anti-racist theater that show people how it feels to be viewed as “other”—but they would have to reward the viewer/participant/customer.

None of this would be cool (even if those who see such works emit one of those soft, clipped “cool”s which have become so popular and hopefully weaken the power of cool), because these would not be ways of drawing attention to a potentially volatile margin—rather, they would be collaborative ways of remaking the center. Perhaps we can break up and reform the word “perhaps” to give it a name: “per”+”hap,” or through/by chance/event: firstness, then, creates perhaps (the plural), or perhaps (third person singular). Maybe we could set aside the more provocative “firstness” and simply say that after cool comes perhapsness. With text messaging and Twitter, that would get reduced right away to PHPNS, and maybe rebranded as “pappens,” making it only slightly more verbally cumbersome than “cool.” Well, as Proust had his narrator say about a fantasy, the fact that I have just imagined this means that it can’t possibly happen this way. But maybe that itself is an instance of perhapsness.

Cool can overpower goodness because moralities predicated on human equality want the scene without the scene—as if everyone could be arranged before the central object without the disturbance of everyone having to present his position to the others and interpret theirs in turn. Morality can only be thought in very limited ways in terms of abstract rights, obligations, fairness, rules of behavior, thou shalt nots, etc. The most basic morality is entering the language of the community, working with its terms, its tropes, its idioms, even its rhythms, and at least respecting and trying to learn them to the extent one is an outsider; somewhat more demanding is to speak the language of some specific other, the more differentiated the other the more demanding the obligation; more challenging yet is exposing the limits of the community’s or the other’s idiom, opening the possibility to accommodate as yet unrepresented desires and resentments; highest of all is the invention of those new idioms that will indeed represent those desires and resentments. That, in fact, is what the moralities predicated on human equality have done, so I am not dismissing them—it is just that they will serve us better if read as innovations in language to be revised rather than transparent principles to be defended against “illiberal” attacks. Cool exposes the limits of “bourgeois” morality, and can only be replaced by a mode of discourse that does it the same favor in turn.

Another way to think about it: when a civilization collapses, what is happening is that the immense architecture of tacit agreements, everything that was agreed upon and settled long ago so that we could go on and forge more practical and immediate agreements, turns out, after all, or by this point, to be or to have been disagreements merely misunderstood as agreements. Naturally, at this point, those more practical and immediate agreements evaporate as well. We’re human, so we’ll need some kind of agreement, some mode of joint attention, just to get through the days, and those provisional agreements can emerge out of the frayings of the disintegrating ones—for example, in shared irony towards what was once taken for granted. What might become possible in such circumstances is what has not been possible for a long time—foundings, which can be found among the ways we just happen to be together, as a result of the intersecting trajectories that have brought us where we are. If we have agreed to do something together, and the project falls apart, then we are released from the terms of the agreement, and yet there we are—we might as well do something. All of the habits, literacies, and implements we had gathered for that project are still lying around as well. Why not just begin by agreeing to do something, this or that, anything, making use of the now unfamiliar materials in a new way? The more arbitrary the better, because that places the agreement itself at the center, rather than the pretension that we are just doing what reality tells us to do—and because uses and potentialities of those materials which were otherwise hidden now become prominent through new articulations. Arbitrary, Oulipo-style constraints will enable us to find rules for our agreements, and to discover who we are coming to be through those regulated interactions.

I have been troubled by the sense that a cultural project interested in widening the field of possibilities might be taken as an evasion of reality—as fantasy, at best, or totalitarian attempts to remake the human condition at worst—until it occurred to me that reality itself is nothing more than the compilation of present possibilities. Nothing is fixed and set—as soon as anyone makes a move reality has already been adjusted. All originary ‘pataphysics would do is widen the field of possibilities in any present, not obscure the fact that at every moment a wide swath of possibilities is cut down. And that’s all we need in order to be realistic: be willing to accept that, whatever our threshold for acknowledging a possibility, some things, lots of things, maybe most things, at any moment, will still fall beneath it. For originary ‘pataphysics, the rush of new possibilities will be matched by the discarding of old ones, creating “reality,” or conditions under which the consequences of choices can be accounted for.

January 6, 2013

Notes on Equality

Filed under: GA — adam @ 9:01 pm

“The Muslim position is a powerful attraction for the marginal (collectively and individually) and the disaffected. What it lacks, in its obliteration of the anthropological connection between God and humanity, is a way of theorizing the deferred equality inherent in firstness. The Islamist insistence on Sharia is a clear demonstration of the non-reciprocal nature of Islam. Sharia demands submission; ‘Islam’ means submission. We have all heard conservative complaints that feminists in the West find every straw in our own eyes but ignore the beam in that of Islamic societies. But there is a reason beyond political expediency for the difficulty in attacking, for example, Islam’s unequal treatment of women. Whatever the disparities in Islamic society between men and women, or even free persons and slaves, they exist on a base of firstness-free equality. Sharia is ‘the same for everyone,’ as though Islam effectively imposed the ‘veil of ignorance’ that defines John Rawls’ ‘original position.’ Sharia’s defender might well say: ‘Yes, Sharia distinguishes between men and women. But we all obey it equally. I obey Sharia as a man, but if I were a woman, I would submit to its rules in the same manner.’” Eric Gans, “Abraham’s Three Firsts,” Chronicle 435

Effacing the deferred equality inherent in firstness produces equality in the face of Sharia, which is to say, equality in the face of the destruction of deferred equality. With regard to both deferred equality and its demolition, there is a kind of inequality: in the first case, an inequality compared to the equality yet to come; in the second case, an inequality due to the arbitrary nature of the rules needed to abolish deferred equality. The equality, in both cases, though, is not an “objective” one, based upon some (which?) universally shared measure (the impossibility of which cannot, I think, be demonstrated any more effectively than Marx does in his Critique of the Gotha Program)—rather, in each case, equality means equality before God, or, more generally, some sacred center. This equality is reciprocally constitutive with whatever inequality it seeks to mitigate or reframe. Think of how easy it would be, in Gans’s example, to replace “men” and “women” with “king” and “subject,” or “master” and “slave”—given the right sacred center, these could also be seen as instances of equality—“I obey as a master, but if I were a slave I would submit to its rules in the same manner.” Master and slave, monarch and subject, would, indeed, be equal, if the sacred center established so as to defer some more terrible violence decreed the necessity of such positions. Any affirmation of equality singles out that feature which positions each one equidistant from some sacred center (with the measuring rod being a product of the center itself), and that form of equality defers the violence implicit in the remaining inequalities by providing some kind of access to some fundamental social good and meaning—including on the originary scene, where relations of leadership of some kind or another will be generated out of the results of the scene. That is why what often look to us like astonishing inequalities are enshrined in religious doctrines and rituals predicated upon equality before God without any sense of incongruity (even if such always exists somewhere on the margin)—what some mode of “framing” presents as inequities must be reframed as instances of the central equality. Otherwise, what good would that mode of equality be doing?

The first conclusion I see following from this formulation of (in)equality is that the modern notion of equality, which seeks equality outside of and even against any sacred center, is both incoherent and insatiable. It will always be possible to identify some new form of inequality and render it intolerable, and there is no reason to assume that a corresponding and mitigating form of equality will always be imaginable. It might be simpler to say that the modern view of equality is simply insane. The American, not quite modern, understanding of equality in classical Christian and Judaic terms, as all men being created equal by God, is far less so, but the boundary separating equality before God from unbounded equality is not all that thick. All men (and women) can only be equal before the God who displaces a global (at least in principle) imperial center: without some God-man claiming a right to the lives, possessions and devotion of his subjects, the one God before whom we are equal evaporates, and with it our equality. All modernity did was take the anti-imperialism of Judaism and Christianity seriously and direct its attention to overthrowing emperors. But the unanimous anti-imperialism (in the broader sense of anti-state) constitutive of revolutionary modernity requires a new empire, a more terrible empire claiming the right to shape its subjects and eliminate the misfits who unsettle that unanimity. All political talk now, on the left and the right, presupposes such a unanimous rejection of some form of tyranny with which the opposing party is complicit—the notion that liberal democracy has ushered in a new era of decision by open discussion is not only an illusion but conceptually incoherent because, in the end, one is modern or not, democratic or not, free or not. Liberal democracy absorbed the civil wars constitutive of the entrance into modernity and is now dissolving back into them—anyone who listens closely to even the more moderate Democrats and Republicans, even when they are speaking to the center, can see that in the end neither side can really grant the legitimacy of the other. You can’t enter a discussion with those who will not change their views via that discussion, or who believe that the discussion itself is not decisive, and presenting your opponents as those who fit precisely that profile has proven irresistible and, ultimately, reasonable. The fit between democracy and the rule of law was always a rough one at best, but in the end why should the people accept the rule of law if it interferes with their desires?

The only sane alternative is to say that we are equal with those whom we reciprocally treat as equals. Each of us is then equal with many others, in many different senses, and at many different levels—I am equal with the friend with whom I share confidences, his and mine; I am equal with my children insofar as I fulfill the role of “father,” in which I deposit a certain sacrality that binds me to them as “children”; I am equal to my coworkers insofar as we all expect more or less defined fair treatment from our employer, who will not treat any of us as either slaves or cronies; I am equal to the vendor on the street from whom I buy a giant pretzel insofar as we each part with something we desire less for something we desire more and thereby better each other’s condition; and, beyond that, I feel a kind of liminal, potential equality with anyone whom I might someday encounter and try to engage in some way. Anyone can embrace a democratic spirit and seek to expand the circles of equality in which one participates, and the intensity of the equalities one already enjoys, but to pretend to equality beyond those circles, where there is no shared center, is utopian, savage, or both. Sometimes equality emerges out of the clash of incommensurables, and/or the decisive defeat of one by the other, as has happened with the warring parties of WWII, but that doesn’t mean one can elevate that possibility into a rule or method.

Once one form of equality is established, it is likely that others will be forthcoming—economic exchange can lead to political alliance and vice versa. But it is just as likely that one form of equality will reveal barriers to further engagement. At any rate, there is a problem of inequality, but it lies in some violation of the rules articulating all in relation to the shared center. Inequality is essentially a question of cheating—the rules are the “deferred equality inherent in firstness.” In that case, though, the solution is to rework the rules and/or their enforcement, or to accept that that particular form of association has been exhausted; it is also the case, then, that to foreground inequality as such as the problem is to poison the rules because then the rules become nothing more than a means to reduce or eliminate inequality, which is to say that the rules become nothing more than a weapon used by one side against the other, which is by definition outside of the rules.

The only possible politics in the ongoing self-dismantling of modernity is one that seeks to clarify the rules according to which we are playing. If the rules can’t be clarified in a way that satisfies all parties, then there is no “game” and the only reasonable and honorable alternative is to leave. The other side will do what they can to you and you will do what you can to stop them. I would be very curious to hear anyone try to clarify the rules according to which our federal government currently plays—I don’t think it can be done. The government simply rewards its friends and, if they are lucky, ignores its enemies, like any powerful patron or protection racket. Referring to the rules, like the law or constitutional principles, is futile because all of those rules have been weaponized. All that can be done is to avoid drawing the attention of the state, and, more importantly, to maintain the games one plays and the rules they rely upon, while preserving as much of their autonomy from state and society as possible. Study those rules, and divine the tacit agreements in which they are embedded—those tacit agreements, our idioms, will in turn reveal other possible agreements. One’s chosen equalities with others are simply external to the state—seeking to use them as levers to overthrow the state would reinstate the same totalitarian anti-imperialism that has brought us here. The state has not usurped some position of originary justice which it is now up to the people to retrieve and restore—the state is just the largest property owner, like the kings were, and even if it now invites a few citizens to help in the management of that property, that doesn’t change the fact that your property is ultimately on loan from the state and that you are equal only until it’s your property that the state sets its eyes on. Getting rid of the people presently managing the common realm will not solve the problem of how to manage or distribute it afterward. It’s better to prepare for that time when it might be possible to buy up bits and pieces of the state, maybe at bargain prices, when it starts to fall into pieces. Nothing, that is, prevents us from creating alternatives to the state, much less picking up the pieces of the relationships it destroys. And there will be a lot of pieces.

December 17, 2012

After Memory

Filed under: GA — adam @ 6:09 pm

“An act of pure attention, if you are capable of it, will bring its own answer. And you choose that object to concentrate upon which will best focus your consciousness. Every real discovery made, every serious and significant decision ever reached, was reached and made by divination. The soul stirs, and makes an act of pure attention, and that is a discovery.” D.H. Lawrence

The notion of having God’s will, ideas, or natural law “engraved” or “inscribed” on the heart or mind has been a constant of Western thought from the Hebrew scriptures through the founders of modernity like Locke and Kant. (There may be nothing inscribed on Locke’s “blank slate,” but where did the notion of a blank slate, prepared to take inscription, come from?) The metaphor obviously depends upon writing, and presupposes a process of inculcating a sense of duties radically at odds with those of an oral culture. In an oral culture one’s primary obligation is to know all the names (of ancestors and divine beings) constitutive of the web of existence, or to know who knows them. Such ostensive knowledge has an imperative component—one tries to find out what the named beings want, and then one does it. With writing comes a transcendent voice that says the same thing to everyone and comes from everywhere or nowhere. Anyone can repeat the Word over and over again, inscribing it internally. While it exists objectively and can be checked when needed, the written word is only effective if memorized—while the prodigious feats of memory of the epic poets of oral cultures are no longer necessary, the book could not stay with one without at least some degree of memorization, of key passages, of general themes, and so on—after all, the written word was not readily available (who could afford to possess books?) and was often accessible only through public readings and sermons, and in educational settings. God’s word is then written on the heart through constant oral repetition, and is embedded in culture through its transformation of the language—in the same way in which our own contemporary English is still, unknown to most English speakers, saturated with phrases from the King James Bible and Shakespeare.

Even as books became readily accessible this relation between the written word and its “inscription” on our minds and hearts has remained remarkably constant, I suspect. I remember, as a graduate student, even though I could have dozens or hundreds of books, privately owned and borrowed from the university library, constantly trying to inscribe on my mind passages from books I had read and, even more, cross references from one book to another. If a critic I happened to be reading made a reference to, say, something D.H. Lawrence said about Christianity, I would try to call to mind what I had read by or about Lawrence that might frame that reference, and chastise myself for, inevitably, not having “inscribed” on my mind what I now needed; I would then inscribe what that critic said and draw upon whatever traces of my previous reading of Lawrence might enable me to locate that framing reference. Indeed, even finding a passage that I wanted to quote for a paper required some prior inscription—it was somewhere in chapter 2, or sandwiched in between two other discussions which I had inscribed in broad strokes. Ultimately, what marks one as a worthy scholar is being able, much like the first users of texts, to take a written text, available to all, as a prompt for a dialogue with others or an internal dialogue with oneself that, in the end, would produce a new text—a process that requires some ongoing retention.

Joshua Foer, in “The End of Remembering,” explores one central consequence of the displacement of print by electronic culture—the fact that one needs to memorize less and less. One striking example he gives is phone numbers, even one’s own (I don’t, in fact, remember my cell phone number). For me, and I am sure for many others, a critical rite of maturation was being able to remember my phone number—it meant I could be trusted to go out on my own. Now, a small child can have a cell phone but doesn’t need to remember his home number. I think that electronic culture is having a related effect on scholarly work and education as well—you don’t need to remember where a passage in a text is, or where to go back and find a particular comment by D.H. Lawrence on Christianity, because all you need to do is compose a search term, which I suppose still requires some memory but very little, since you can try out a whole series of search terms (D.H. Lawrence critical Christianity… D.H. Lawrence hate Christianity… D.H. Lawrence Christianity eternity…) in all of 20 seconds. There are lots of accounts of the changes of consciousness in process as a result of the emergence of electronic or digital culture (inquiries generally modeled on the studies by Eric Havelock, Walter Ong, Jack Goody and others on the transition from oral to literate culture). I think that Foer is right that we might advance such inquiries considerably by focusing on this singular fact of the obsolescence of memory.

Well, you still need to know why you would be looking for that Lawrence passage, and what to do with it once you retrieve it—but we can imagine such search and deploy missions taking on a very different character from traditional scholarship. If I am working with a text—and, as a student, when we first learn to work with texts, I am working with it because I have been told to, or because it’s the text everyone is working with (and, in fact, for the vast majority of working scholars, this changes very little—one reads what is read)—and I encounter a name or word that I don’t know, and that I feel I need to know to make sense of that sentence or paragraph, I can do a quick Google search that tells me who the name refers to or gives a definition of the word, taking, perhaps, the first several items—the answers I get won’t provide me with the kind of context that having inscribed texts in my heart and mind would have done, but how would I know that? If I then have to write about that text, I will use the names and words I have retrieved in what would look to a traditional scholar like semi-literate ways (probably both overly literal and a bit random), but the more people do it that way, constructing hybrid figures and meanings, the more that will be the kind of work done in the academy and elsewhere. One would simply fill in the gaps in one’s knowledge as they appear, and they would be considered “filled” insofar as they enable you to get to the next gap. Better work would be distinguished from worse by the patience that has been taken to construct links across a range of texts and the consistency with which one has used and cross-referenced those links—and, in exceptional cases, the ingenuity with which one has provided unexpected links for others to follow up on. You really wouldn’t have to remember anything—you would only need to have acquired the habit of searching and articulating links whenever confronted with a text and a task. It is quite possible to imagine a whole new kind of intellectual work emerging out of the process, one which applies across the disciplines, including the sciences, which are probably already closest to this model—after you’ve put together all the links everyone you have “read” has read, there will be certain gaps in knowledge (possible but unmade links)—you just go ahead and fill in one of those gaps.

Indeed, college instructors should be avoiding standard topics like “D.H. Lawrence and Religion” precisely because of the ease with which one can patch together a series of passages and critical comments through the internet. Instead it might be better to imagine unprecedented topics, like, for example, selecting a particular word or phrase that recurs in an author’s work or a particular text, gathering up all the instances of that word or phrase, checking the rate of its recurrence across the writer’s work and in comparison with its occurrence in other authors, and using the findings to challenge some established critical views of that writer (one could make such tasks increasingly complex, as necessary—one could form new search terms for the use of the word or phrase in specific contexts, in proximity to other words and phrases, etc.; a sketch of how such a count might be scripted follows below). Culture, in that case, will tend to be experienced as the distribution of probabilities with which commonplaces, and differing modes and degrees of variants of those commonplaces, appear. More conservative interventions would seek to stabilize the most relied upon commonplaces, while radical ones would seek wider distribution of more “deviant” variants. Entertainment would continue on its current path of arranging in different but not too different ways common scenes, narratives, catchphrases, etc. We would almost literally be going with the flow—the flow of information regarding how distant from the norm our current array of ready to go phrases and gestures is at the moment; freedom would involve determining how distant we want it to be.
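To make the kind of assignment imagined above concrete, here is a minimal sketch in Python of the word-frequency comparison it describes, assuming plain-text files of each author’s works arranged in per-author folders (the folder layout, the sample phrase, and the phrase_rate helper are hypothetical illustrations, not an existing tool):

import re
from pathlib import Path

def phrase_rate(text: str, phrase: str) -> float:
    # Occurrences of the phrase per 10,000 words of the text.
    words = len(text.split())
    hits = len(re.findall(re.escape(phrase), text, flags=re.IGNORECASE))
    return 10_000 * hits / words if words else 0.0

# Hypothetical corpus layout: one .txt file per work, grouped by author.
corpus = {
    "lawrence": Path("corpora/lawrence").glob("*.txt"),
    "hardy": Path("corpora/hardy").glob("*.txt"),
}

phrase = "blood"  # the recurring word or phrase under study
for author, files in corpus.items():
    rates = [phrase_rate(f.read_text(encoding="utf-8"), phrase) for f in files]
    average = sum(rates) / len(rates) if rates else 0.0
    print(f"{author}: {average:.2f} occurrences of {phrase!r} per 10,000 words")

The counting is the easy part; the assignment’s real work would lie in forming the more specific search terms (the phrase in particular contexts, in proximity to other words) and in arguing from the resulting distribution against an established critical view.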

The features of digitality more commonly discussed, like social networking, seem to have the same effect of rendering memory obsolete. If someone puts photos of himself with his girlfriend on Facebook, he has no need to remember the experiences recorded in the photos—here, the public nature of the exposure is what makes the difference: the photos represent his relationship to her for those who have access to his page, and that is their meaning. If they break up and he changes his status, the pictures can come down and be disappeared. Maybe such an individual, today, has the same regrets, nostalgia, hopes for reconciliation, reconstructed memories and so on that a normal modern person, who has inscribed his feelings upon his heart and mind, would have—but I don’t see any need to assume that this will continue to be the case. Loves and friendships may be more and more reduced to the needs of public display, and more and more people will be their Facebook page (and whatever networking forms emerge in the years to come) and therefore memoryless. Some form of sexual desire can be taken for granted, but romantic love centered upon monogamous long-term relationships, relationships dependent upon both memory and forward looking narratives, certainly cannot be. Emotional life might take on shapes drastically unfamiliar to us.

What kind of people would these memoryless, or “dis”-membered beings be? It’s easy to assume the worst. With the obsolescence of memory, what would promises be worth? Would anything we (“we” being traditionally humanist thinkers) recognize as “thinking” be possible? If the past is constantly disappeared how would the future be conceptualized? Would people save money? Have children? Be capable of any kind of sustained attention whatsoever?

Marshall McLuhan, of course, raised these kinds of questions half a century ago, and his notion of a “global village,” meaning both instant connection across the globe and a return to the kind of oral culture focused on spectacular events, driven by rumor, gossip, moral contagions and celebrity (a kind of mini-divinity), seems as relevant as ever. As does Jean Baudrillard’s vision of a society of simulacra, in which we are ourselves the models out of which we construct ourselves. The viability of such a society, with minor as well as major powers possessed of nuclear weapons and the rise of a global Islamic fanaticism, not to mention the problems involved in managing a complex global economy, would be dubious.

If signs are not to be inscribed in hearts and minds, what does understanding signs amount to? Nothing more, maybe, than the most originary of understandings—the capacity to iterate the sign, to maintain and extend the joint attention (to follow a line of attention) it has initiated and which has drawn you in. The capacity to iterate the sign involves the knowledge of the difference between a broken scene and an intact one—which is to say, knowledge of what kind of gesture is likely to get the desired response—or, at least, a response one would know how to respond to in turn. I would think about this as a kind of sincere pretending in which individuals try not so much to be like other individuals, as to approximate a kind of projected or imagined norm. But it is not easy to imagine and approximate such a norm, especially since its formation is constantly in flux, and what is normative or average in one site might be on the fringes in another. There will always be cases in which the projected norm is in fact an extreme anomaly or, to put it positively, sheer possibility.

This is the form thinking may take, or already is taking, as we move into the order of electronic communication: the generation of possibilities, the more sheer, the more barely possible, the better. Start with the assumption that anything is possible, anyone is capable of anything, and modify that assumption as the scene takes shape. Quite a few postmodern thinkers have already pointed in this direction. I will put it this way: modernity continues metaphysics, which sought out the ultimate reality in a higher, hierarchically organized, intellectual and spiritual order, by shifting our attention to the ultimate reality to be found in lower, unseen forces: material interests, sexual drives, the unconscious, etc. What comes after modernity is “patatiquity,” with “pata” from pataphysics, the science of the exceptional invented by Alfred Jarry, and “tiquity” a temporal suffix modeled on “antiquity.” Patatiquity is the age in which possibility is more real than reality. Research conducted through Google constructs a possible array of links; social interaction carried out through networking online constructs a possible community. In both cases, the possibility is “real” insofar as others iterate the signs of possibility one puts forth, and these iterations in turn generate new possibilities (like a new hierarchy of Google links).
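
To make the Google analogy concrete: a “hierarchy of links” is exactly what an iterative link-ranking computation produces, with each round of iteration regenerating the hierarchy from the links everyone has put forth. Here is a minimal sketch in the style of PageRank (the ranking method Google made famous); the toy link graph and the parameter choices are my own illustrative assumptions, not anything Google publishes:

```python
# A toy, PageRank-style link ranking: repeated iteration over a link graph
# produces a hierarchy of pages. The graph below is invented for illustration.
def rank_links(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}         # start with no hierarchy
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outs in links.items():
            if not outs:
                continue
            share = damping * rank[page] / len(outs)  # pass rank along links
            for target in outs:
                new_rank[target] += share
        rank = new_rank                               # the hierarchy, regenerated
    return rank

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
for page, score in sorted(rank_links(links).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))  # "c", the most pointed-to page, ranks highest
```

Each new link added to the graph changes the next iteration’s output, which is one way of picturing how iterated signs generate new possibilities.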

So, is patatiquity sustainable? On the face of it (and both conservative and postmodern critics, with differing evaluations, agree here), patatiquity seems to herald an era of irresponsibility and carelessness we can ill afford—isn’t the Obama cult exemplary of patatiquity, with its investment in the sheer possibility of hope and change; isn’t endless debt, both personal and national, equally patatiquital (or, perhaps, continuing with the model of antiquity, “patacient”)? Maybe—that’s certainly one possibility. But it might also turn out that the most avid explorers of and investors in possibility will insist on the kind of minimal reality that makes possibility possible: to take just one example, real money. The modern attempts to control the economy and regulate habits through the money supply just inhibit possibility by governing according to the norm extracted by experts. The more we insist on unequivocal laws governing distinct areas of human action, taken as literally as possible, the more is left over for possibility. In fact, Gertrude Stein’s political conservatism seems to be based on a similar line of thinking: in a series of articles written for The Saturday Evening Post in the 1930s she argued for the necessity that money be “real” (i.e., not fiat) and for the government’s approach to requests for further spending to be that of the stingy patriarchal father (a stock figure Stein otherwise tended to despise); more generally, the intersection of habits that generates infinitely varied human interactions and idioms can only do so if minimal, but strict, rules are taken as given.

Under such conditions, the law can function more as constraint than restraint: restraint seeks to hold back, while constraint seeks to channel, like the rules of a game that enable a wide range of moves displaying an equally wide range of intellectual and/or physical capacities. Out of a set of constitutive rules—those rules that make the game a game—emerge all of the regulative rules determining strategies. But patatiquity suggests something more: the regulative rules reveal more constitutive ones. The right to property is a constitutive rule of a free society, but there are many ways of enforcing that right, and each one of them—protecting one’s property oneself through arms and security systems, a public police force, a private security force, etc.—reveals something about the right to property itself (what kinds of ethics and capacities it requires and evokes, where it stands in relation to other rights). Just so does the elevation of possibilities involve an ongoing revelation of a community’s constitutive rules. Agreements would be made explicit and their limits clarified, and norms and assumptions about rights would emerge from those agreements; longer-term institutions, most importantly the family, that transcend individualized, explicit agreements might very well change dramatically, becoming, as is already the case, more contingent and mixed—how to ensure the care of children will be a real problem. On the other hand, there will probably be far fewer children, but, contra Mark Steyn, that may not be socially fatal—at the very least it will impose some very difficult choices: for one thing, it will become increasingly obvious that we can’t have both our commitments to present-day middle-class entitlement programs and the regulations and tax policies that cripple the kind of productivity required to provide the excess wealth needed to subsidize those ever more bloated programs.

In patatiquity the sheerly possible can reveal constitutive rules that a more normative, declarative culture conceals. Imagine writing according to the following rule: each word aims at being the least predictable, given the surrounding words, for the normal reader. Your writing, then, is first of all a study of your possible readers, in an attempt to give those readers an opportunity to study themselves. Following this rule (which will not be easy) you will produce the sheerest of possibilities, the possibility left after all the others have been exhausted. And to read such a work would be to start exhausting those inexhaustible possibilities—all the clichés, commonplaces, formulas, chunks and constructions in one’s linguistic inventory. If the first word of the sentence is a personal pronoun, the next is most likely a verb, and a verb referring to an action carried out by humans; and then, adding in the context, your own personal proclivities and some guesswork, you anticipate one “kind” of word with a 63% probability and another “kind” with a 37% probability; given that next word, the same process starts up for the next one, and so on all the way through; and you could do this backwards and forwards or starting in the middle, and over and over again. This is the way we always use language—someone starts to speak, or to gesture, or we start reading the first line, and each sign plugs into an array of constructions and possible relations between constructions we are familiar with in varying degrees. So, pataphysical writing makes visible the constitutive rules of language use, precisely by loosening those rules as much as is humanly possible. And now you can read anything in those terms, as off-center to some degree, as containing anomalies; even the most predictable text will, then, embody pure possibility—perhaps especially the most predictable text, if we consider what an odd thing that is.
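
The writing rule described here can be sketched in a few lines of code. What follows is a minimal illustration, assuming (my assumption, purely for demonstration) that a toy bigram model built from a tiny corpus stands in for the “normal reader’s” expectations; the rule then amounts to always choosing the continuation that model anticipates least:

```python
# A minimal sketch of the "least predictable next word" rule: model the
# reader's expectations as bigram frequencies, then pick the admissible
# continuation with the lowest anticipated probability.
from collections import Counter, defaultdict

def build_bigram_model(words):
    """Count, for each word, how often each other word follows it."""
    follows = defaultdict(Counter)
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1
    return follows

def anticipated(model, word):
    """The reader's anticipated probabilities for the next word."""
    counts = model[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def least_predictable_next(model, word):
    """Choose the continuation the reader anticipates least."""
    probs = anticipated(model, word)
    return min(probs, key=probs.get) if probs else None

corpus = "i saw the dog and i saw the cat and i fed the dog".split()
model = build_bigram_model(corpus)
print(anticipated(model, "the"))             # {'dog': 0.67, 'cat': 0.33}, roughly
print(least_predictable_next(model, "the"))  # 'cat', the less expected word
```

A real “study of your possible readers” would need a far richer model than bigram counts, but even this toy version shows how the rule presupposes, and so makes visible, the probabilistic expectations it violates.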

De-memorization would then leave us with nothing but memory of the constitutive rules, and a desire to rediscover those constitutive rules over and over again through “acts of pure attention,” or “divination.” So, if we return to my first example, of thinking as linking, then the most compelling texts, scholarly, popular, or esthetic, would be those that articulate the most probable links in the most improbable ways, grounding them in sheer possibility. Elaborate, counter-productive rules like those promulgated and incompetently enforced by government bureaucracies would be discarded as requiring too much “self-inscription”—too much remembering of specific rules and their normative “meaning.” Very simple things, like acquiring the most useful skills and saving as much money, or real wealth, as possible, would be preferred—you could always check the status of those things daily on the market. The future can be divined in signs of the present, while the firm fact that (real) money will always be useful allows the future to be otherwise completely open, populated only by sheer possibility that one need barely adumbrate.

Once we realize that our selves are possible, not actual, our energies will be devoted to the creation of plausible possibilities and of spaces where implausible ones can be safely engaged; even more, our assessment of institutions will turn on their ability to enhance our creation of possibilities. One’s own economic possibilities—and more and more professions—will focus on creating possibilities for others: helping others be imagined as they imagine themselves being imagined. PR will become the queen of the sciences. If you want to construct a representation that will affect a particular audience in a particular way, you must study the desires and habits of that audience; even more, you must treat those desires and habits as malleable, within limits. You will game it out—someone who says x will be likely to want to hear or see y; someone who does x every day will be happy to be given a chance to do y; someone who has bought a, b and c will like something like d (note that in each case there is no reason to assume that the audience actually wants, or has ever imagined, the y or d in question—the marketer is filling an imagined gap in their experience, a gap opened by the inquiry itself). Already, more and more selling of products involves selling such simulated images, filling such gaps, and telling the consumer of the gap and that it is being filled. This is objectionable from various Enlightenment and Romantic perspectives assuming the uniqueness of the individual and the integrity of the thought process, but if we set those objections aside we can see that a mode of “critical” thought and “high” culture is already implicit in this very model: opening up new spaces or gaps between normalized experiences and those experiences which yet lie immanent in them.
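
The “bought a, b and c, so offer d” inference is, at bottom, simple co-occurrence counting, and a toy version makes the point about the imagined gap concrete. The purchase histories below are invented for illustration; real marketing models are, of course, far more elaborate:

```python
# A minimal co-occurrence recommender: score items the customer has not
# bought by how often they appear alongside items the customer has bought.
from collections import Counter

histories = [            # other customers' (invented) purchase histories
    {"a", "b", "c", "d"},
    {"a", "b", "d"},
    {"b", "c", "e"},
    {"a", "c", "d"},
]

def fill_the_gap(basket, histories):
    """Rank unseen items by overlap-weighted co-occurrence with the basket."""
    scores = Counter()
    for history in histories:
        overlap = len(basket & history)   # how similar this customer is to ours
        for item in history - basket:     # items our customer lacks
            scores[item] += overlap
    return scores.most_common()

print(fill_the_gap({"a", "b", "c"}, histories))
# [('d', 7), ('e', 2)]: 'd' is the gap the inquiry itself opens
```

Note that nothing in the computation asks whether the customer wants d; the gap is produced by the comparison itself, exactly as described above.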

Finally, this turn toward the trivial, or a continual lowering of the threshold of significance (more things becoming less significant), would lead to a very strong desire to reduce violence. We already see increasing distaste for sustained confrontations and enmities—maybe they require too much memory. There is a preference for constructing defenses that make confrontation unnecessary. The free and more advanced societies will be able to create sophisticated defenses that make them impregnable vis-à-vis the failed, resentful societies surrounding them, while sparing them the pangs of white guilt involved in retaliation—Israel’s missile defense, which costs far more than an all-out war to destroy its Palestinian enemies would, is an obvious example here. A premium will be put on keeping people out, except under precisely defined circumstances—once someone is in, you need to deal with them, so a lot of intellectual energy will be invested in determining who can be let in. If governments don’t defend their borders, communities and businesses will do the job; and people will shape themselves so as to be acceptable members of the communities they wish to enter (as I suggested earlier, much business will be generated in helping them to do so). These strategies of avoidance might impoverish social interactions by ruling out a wide range of possibilities from the start; but they might enrich the relations that remain by making them more meaningful in the literal sense: all signs exchanged among those who have been properly vetted, and who therefore already give forth much information, would come to contain layers of significance.

And, anyway, no one will remember what they are missing—there will be students of history, but I think the idea that there are lessons to be learned from it will disappear, and rightfully so, because history is nothing but the history of struggles to own the asymmetrical gift economy centered on the state, and patatiquity can only come into being by putting all that aside.
