GABlog: Generative Anthropology in the Public Sphere

August 15, 2013

Violent Imaginaries

Filed under: GA — adam @ 9:10 am

Perhaps it has occurred to other members of the GA discipline that the negative pole of the event issuing in the originary sign, that is, the collapse of the proto-human community into universal, chaotic violence, is extremely implausible. In other words, even if we assume the most heightened mimetic fervor, it’s hard to see how the violence consequent upon the breakdown of the animal pecking order could actually lead anywhere near the violent death of all members of the community. How could they all tear each other limb from limb? They would exhaust themselves well before such a result. At most, such a breakdown in the hierarchical order would lead to some serious injuries, maybe a death or two, followed by a perhaps more fragile restoration of the pecking order.

What is extremely plausible, though, is that such an all-consuming storm of violence is what that breakdown would look like to the members of the proto-human group—and that it would need to look that way in order to prompt the invention of the sign. In other words, the intuition or, perhaps, the unconscious constituting the sign is not a reasonable expectation of devastating violence but a violent imaginary. An imaginary gives rise to fantasies, but it is not the same thing. A helpful way to think of the concept “imaginary” is through R.G. Collingwood’s definition of “imagination.” Collingwood gives the following example: suppose you are looking from a window out on a lawn with a wall that you cannot see beyond. You will have no difficulty “imagining” the continuation of that lawn beyond the wall—in that case, “imagination” is the continuation or completion of what one perceives. Even in that case, though, imagination must make perception possible in the first place: I can only see the field of grass as a “lawn” insofar as I can see it as bounded, as a whole in itself and a part of larger wholes. This constitutive capacity of the imagination is an “imaginary,” a supplement to perception that makes meaningful perception possible. Needless to say, the imaginary need not be true—in the case of Collingwood’s example, the lawn may not, in fact, extend beyond the wall.

The violent imaginary on the originary scene works in the same way—as soon as the beginnings of this new form of confrontation become visible to the participants on the scene, those participants must continue and complete it—“must” in the sense that that is what an advanced, highly mimetic animal would do. And this continuation and completion would be, not an attempt at an accurate portrayal, but a sharpening of what is new in this configuration. What is new is the absence of discernible limits on the confrontation, readily imaginable as an uncontrolled, accelerating melee, with no exterior. It is, then, this violent imaginary that is both revealed and concealed in the sign—the symmetry of the shared sign on the scene must match the negative symmetry of the unbounded imagined violence. We can even assume that the form of the sign is shaped by its complementary violent imaginary—the sign would be effective to the extent that it conveys everyone’s awareness of the extreme “thought experiment” implicit in the violent imaginary.

All signs, and all of culture, then, would have to be constituted by some violent imaginary, one we could read negatively off of the signs of culture themselves. No human society, even in the midst of the most brutal war or total social breakdown, ever approaches “chaos”—society can be dissolved into clans, gangs, militias, tribes, but never into a “war of all against all.” Again, though, this doesn’t make the imaginary false—even in those gangs and clans, conflicts raise the specter of a breakdown of that order, and that breakdown, even for the most realistic, can only be viewed as a “breakdown” against the imagined background of “chaos.” And it is from such an imaginary that we derive our intuitions regarding the best way to ward off all-consuming violence.

Perhaps such an originary theory of the unconscious can make psychoanalysis interesting for GA. Psychoanalysis has been swept, often derisively, from the stage of history, replaced by neurobiology, the cognitive sciences and more practical, localized forms of psychotherapy. Freud’s claims regarding the Oedipal Complex, castration fear, penis envy, etc., have been widely ridiculed as arbitrary and (Victorian) culture bound. Maybe. But at least Freud put desire, violence and deferral at the center of his psychology, and saw the central problem of humanity as relations between humans, rather than greater proficiency in the relations between humans and nature, or assuming the problem of reciprocity to already be solved. And Freud was also aware that the ways we represent our experienced traumas to ourselves and others are related to those actual traumas only in very mediated ways—which is to say, he understood that reality is scenic, and the psychoanalytic session was one more scene or event in a long series of them leading us back to the original scene. If I am right to posit a violent imaginary as constitutive of any sign, then GA shares all this with psychoanalysis (including important post-Freudian figures like Lacan, Winnicott and Kristeva)—and not with what might be much more scientific, useful and, in their own domains, accurate representations of the human mind.

Reconstructing the constitutive violent imaginary through the continuing and completing of the scene by a sign would in turn enable the reconstruction of that sign so as to take more account of that violent imaginary. Bringing more of the violent imaginary into representation would not necessarily quell the terror lurking within it, because such further elaboration of the representation would simply shift the terms of the violent imaginary. In other words, the violent imaginary can never be made fully conscious—as Freud realized, the patient who came in familiar with his works and proceeded to spill his guts about his passion for his mother and desire to kill his father had simply rearranged the unconscious material under a new repression. What can, perhaps, be done, though, is to invest the violent imaginary more fully in a dialogue or disciplinary space established so as to examine it—in other words, the violent imaginary cannot be represented as such (we can’t paint a picture of it and look at it together) but it can be made the imaginary of the scene of representation itself, rather than of some represented scene. The value of this is to make our responses to each other direct, ostensive framings of the violent imaginary that can provide suitable, matching signs, rather than explanations and diagnoses that subserve the power of the violent imaginary by keeping it out of our hands, so to speak, and rendering it, paradoxically, “fictional,” insofar as it has been made over into some conventional narrative form.

Cultural analysis, in this case, would involve mapping the features of a representation onto some violent imaginary, the two being, to borrow Saussure’s image for the signifier/signified distinction, like two sides of the same sheet of paper. One violent imaginary would be all in the group converging on the strongest member, and then the second strongest, and then the third, and so on. Another violent imaginary would involve all converging on the weakest, and then the second weakest, and so on. Another would have equally powerful subgroups facing off in an endless and increasingly bitter stalemate. And each of these imaginaries could be further modified by the positions one can occupy within them—seeing oneself as the third weakest in an imaginary in which the weakest is the target would be different than imagining oneself among the strongest in that scenario—one’s fear would be calibrated differently and one’s responsibility and capacity to affect the scene assessed differently. And, then, the sign one puts forth would be correspondingly different, and we could read levels of confidence vs. diffidence, caution vs. recklessness, patience vs. panic, appeals to the entire group vs. appeals to more specialized constituencies, among other features of one’s sign, in these terms, reading them back to a hypothesized violent imaginary. Different violent imaginaries might map better onto particular historical events, and may thus provide us with a way of accounting for changes in historical interpretation. And cultural remediation would involve fleshing out and making more present those violent imaginaries, creating new positions within them, and creating tactics and strategies for the positions available within them, so as to further defer that violent imaginary.

May 24, 2013

Flipping the Conference

Filed under: GA — adam @ 7:05 am

According to a new pedagogical technique, called “flipping the classroom,” instead of using class time to provide the educational content (the lecture), with out-of-class time (homework) used to fulfill some assignment aimed at reinforcing what was heard in class, the lecture is provided online, so that students can listen to it at home, with class time then used to raise questions and probe the students’ understanding. I have seen the same approach proposed for the academic conference, where it seems to me to make even more sense: instead of sitting and listening to a complex paper, which one can hardly process and formulate pertinent questions about on the spot, with a meager 10-15 minutes left for discussion, why not post the papers in advance so that they can be read and the conference time used for more advanced and productive discussions? At any rate, I thought I would give it a try, and I invite others to do the same in this space, or to begin any discussions now, to be continued at the conference. Of course, I don’t preclude the possibility of further revisions, but here it is, in what seems to me a pretty much finished form:

Attentionality and Originary Ethics
Adam Katz
However paradoxical it may seem, I venture to suggest that our age threatens one day to appear in the history of human culture as marked by the most dramatic and difficult trial of all, the discovery of and training in the meaning of the ‘simplest’ acts of existence: seeing, listening, speaking, reading – the acts which relate men to their works, and to those works thrown in their faces, their “absences of works.”
Louis Althusser, Reading Capital, 1968

“In all the years in which I have attempted to explain GA in writing and in speech, I have tended to place the major emphasis on representation, and in particular on “formal representation” or language. One of the points I have insisted on is that human language is qualitatively different from animal “languages”; the researches and insights of such as Terrence Deacon have essentially ended the debate on this point. But it follows from my very “definition” of the human as the species that poses a greater problem to its own survival than the totality of forces outside the human community that the primary transformation of the proto-human into the human was ethical. Language and more broadly, representation emerged, per the originary hypothesis, to defer conflict, not to provide a cognitive or ratiocinative tool. But in the configuration of the originary event, the moral model of the reciprocal exchange of the sign is just as indubitably unique a human creation as language, and indeed more essential to the success of the event—and to the consequent emergence of our species. The urgent need that the event fulfills is to find a model of behavior that can defer violence within a community for which one-on-one animal hierarchy no longer provides an adequate solution.”
Eric Gans, Chronicle 431, “Originary Ethics”
The distinction Gans posits here, between the sign as a formal representation of a transcendent object, on the one hand, and the sign as a result or manifestation of reciprocity, on the other, seems to me one that the originary hypothesis itself transcends. In other words, “formal representation” is itself ethical, is indeed the origin and resource of any ethics, so that ethics cannot be thought outside of it. At the same time, formal representation cannot be thought outside of ethics, since the “formality” of the representation lies in the shared attention it effects, and in this shared attention lies any ethics. In shared, or joint, attention lies the fundamental equality that constitutes the human. All the resources we need for thinking about ethics lie in joint attention, in our ability to point to something, and approaching ethics in this way might enable us to create more minimal, more pared-down, ethical vocabularies.
To start with, if we can fold moral reciprocity into the shared attention constitutive of the scene, couldn’t we say that what is immoral and a denial of reciprocity is whatever interrupts that shared attention? There are two ways shared attention can be interrupted: first, through some kind of distraction; second, through some kind of fixation. Distraction (distracting others; allowing oneself to be distracted) tears us away from the scene of joint attention and thereby demands a renewed, necessarily risky effort to redirect attention to the object—that is, distraction raises the threshold of significance; fixation involves tearing oneself away from the scene and, ultimately, turning the other participants into objects of, rather than participants in, that singular attention. Both distraction and fixation abort the scene, but both are also complementary possibilities of the originary structure of joint attention: the actuality or fear of distraction favors the formation of fixations. If we consider that anyone enters a scene by following a line of attention—by looking at what someone else is looking at and deferring appropriation as the other does in order to continue looking—one has not fully joined the scene until that line of attention has passed through oneself, and has been seen to do so. In other words, attention is not joint until all the participants show, through signs, that they are letting the object be so as to see what it has to show—in which case, each participant must be inspected, so to speak, or credentialized, by having the sign they put forth validated. For one’s joining of the line of attention to become evident and thereby accepted as legitimate, that attention must first land on oneself, making oneself its object—in other words, each new participant on the scene represents a potential interruption of the shared attentional frame. At this crucial point upon which one’s entry into the scene depends, one can only avoid becoming a distraction and potential source of fixation in others by doubling that attention back on oneself by joining it, becoming a sign and hence invisible, insofar as others are redirected back to the scene through you. In that case, you will have shown others that the line of attention passes through your own eyes; unless, of course, your self-referentiality simply intensifies your distractiveness. Whether a distraction has taken place will depend upon whether those attended to or, in Louis Althusser’s term, “interpellated,” as potential objects of resentment or desire can restore the line of attention by incorporating the interruption into the scene’s founding sign. I would call this the “loop” in the line of attention, and undergoing this looping is what I would call “ostentation,” which is where ethical being is located. Whether one can undergo or go through the loop depends upon the group’s ability to see you as restoring the line of attention as well as your ability to do so—ethics involves both ostentation and conferring a completed ostentation upon others, or the conversion of attentionality into intentionality. And this means that whether one has distracted or patched together the continuity of the line of attention, or fixated or proactively identified a break in that line, can only be known in the aftermath on a new, converted scene of joint attention.
We keep the line of attention going by language learning—every loop in the line of attention involves an encounter of idioms. While it would be absurd to say that each of us speaks our own language, I think it makes perfect sense to say that at the margins we all differ in the emergent idioms we speak and that it is at such margins that real ethical questions emerge: when I think I’m following your discourse and taking the next “logical step” but you think I am falsifying your most basic intuitions, then a difference in language has emerged. Michael Tomasello, along with many others, has made the argument that we learn language not as collections of single words with discrete meanings that then get combined in sentences, or as a series of grammatical rules applied to single instances of language use, but as pre-packaged chunks of discourse—phrases, formulas, commonplaces—that we can repeat appropriately insofar as we occupy scenes of joint attention with our elders. (I remember, for example, when I was very young, hearing “next door neighbor” as “neck store neighbor,” without it impairing my understanding of the phrase at all. Why should “neck store” refer to “proximal”? Who knows? How many other phrases couldn’t be made sense of through a strict following of the literal meaning of the words? If asked, perhaps I could have come up with an improvised etymology—I certainly would have believed one told to me.) Over time our language base extends through discovering iterable patterns in and analogies with those chunks, noticing similar contexts, mixing chunks, exchanging elements of the chunks we are familiar with, and so on. This process never ends, continuing, say, for us academics, when we read the sentences of one thinker through the sentences we have assimilated from another. We can identify patterns because we can arrange center-margin relations on scenes and still recognize them as the “same” scene (when I am done speaking and someone else takes “center stage,” it will still be the “same” scene); and we can identify analogies because the materials of one scene can be “plugged into” other scenes. Iterating (repeating differently) chunks, patterns and analogies, that is, is the way we follow the sometimes bumpy line of attention.
The ethical stance is not so much learning the language of the other, or teaching the other one’s own language, because “language” is not a static entity that can stand still long enough for it to be the same language once it has been learned as it was when it began being taught. Rather, ethics involves learning the emergent language that arises at the margins or rough edges of the convergent idioms. Joint attention is always liable to lapse, prey to distraction and fixation, and must always be checked and re-engaged—when we mistake ourselves and each other we realize that we have not been attending to the same thing after all, and our recourse is to attend to what we normally attend from: language. We have to check our use of words and expressions, to inform one another that I meant this word in that sense, or that I meant it figuratively or ironically rather than literally, or that I was alluding to what I thought was a common reference, or even just to pronounce the same word with a slightly different emphasis so as to distinguish it from a homonym, and so on. And from there attention can perhaps be redirected back to some signified. Explaining and justifying our actions to each other—the traditional content of ethics—is itself such an engagement with signs (our actions and bodies along with words) that threaten to fray some shared attention.
A useful model for the mode of ethical thinking I am proposing is the transference relation in therapeutic situations, in which the therapist allows himself to be interpellated by the patient, who projects upon the therapist scenes that have nothing to do with the therapist, transforming the therapist into a screen upon which repressed fantasies can be displayed and made available for analysis. While, as Philip Rieff has argued, the “triumph of the therapeutic” has eroded moral discourse by undermining the balance of interdictions and remissions constitutive of traditional culture, tipping the balance decisively toward remissiveness (the therapist is always trying to help the patient liberate himself from some socially inculcated inhibition), the therapist’s own position comes with a strict set of ethics, one deriving from the ethics of disinterest and transcendence cultivated in the monotheistic and scientific traditions. To the extent that we are all, if not therapists to each other, inevitably objects of transference for each other, then transference provides a way of describing the way through and out of the loop of attention issuing in ostentation.
In that case, the transferential relation restores the fraying joint attention, the center, by adopting the assumption that the interpellative attention paid one is essentially random, indicative of some crisis on the scene rather than revelatory regarding oneself. One first takes on responsibility by rendering oneself interchangeable with anyone else on the scene. This assertion of a very fundamental form of equality is ethical work, rather than a presupposition, involving the neutralization of any naturalized link between the source of attention and its object. The similarity between the scene of transference and the Girardian scapegoating scene is obvious, and the ethical stance I am describing seeks to centralize the same resentments Girard theorizes, with the difference that the transferential relation aims at restoring the center by recuperating the process of interpellation within a revised set of rules, or language games. So, while Girard’s model is complicated by the question of the actual guilt or responsibility of the target, that is not the case here: even if I am guilty as charged, my ethical obligation would be to minimize the attentional space my guilt takes up and to redirect attention toward repairing the joint attention I have myself broken—of course, a precondition of accomplishing that will likely be a full confession, acceptance of the normally imposed sanctions, and laying open my actions to further inspection by the community. In this way, the resentments aroused by the attention I have drawn on myself are less likely to be a distraction, continuing to fray the semiotic texture of the community, than a restoration and enrichment of the shared attentional space. The rule of such a practice of transference is that the more attention is directed towards me the less it is about me.
Jean-François Lyotard introduced the concept of the differend, which he distinguishes from a “litigation” in that the litigants share a common language of negotiation and adjudication, while the differend involves a double injustice done to the “claimant” insofar as the language in which she would put forth her claim is incommensurable with the language of the respondent. Lyotard used as examples, predictably if appropriately, the Holocaust victim in the face of the Holocaust revisionist (who imposes a double-bind by treating the victim’s very survival as proof of the falsity of her claim) and the Aborigine subject to expropriation via the settler’s system of property, which has no way of recognizing the native’s. There is no reason not to generalize the concept, though, to include the more radical transferential relation I am proposing, in which a deliberate incommensurability is introduced at the margins of divergent idioms so as to examine the limits of those idioms in relation to something outside each of them.
Differends are found in sentences that work incommensurably in different idioms. A sentence constructs a reality, immune to imperatives, by deferring other possible realities—realities that, in the judgment of the composer, would less effectively defer those imperatives (perhaps by falsifying or fragilizing reality so as to make it more compliant towards the imperatives; perhaps by constructing a reality so distanced as to not address the imperative at all). But one, or some, of those other possible realities would work just as well if we trace the deferred imperative back further to a longstanding, unfolding one, or up closer, to a more urgent one. A differend emerges when a speaker allows an interlocutor to join him in any of the realities, but only one, and in view of the others. Imagine that the same statement would be decisive proof of the speaker’s insanity, or of his surpassing wisdom, depending upon the frame (or would require urgent action, or infinite patience, depending upon the frame); and then imagine that one has to act on one or the other frame while acknowledging the undecidability between them. And then the other, likewise, keeps both frames in play while acting within one. Both participants would be generating and sustaining the threshold between the two frames—that is what I am proposing as an ethical model.
We create differends by learning the language that is other to everyone involved, which has the paradoxical result of restoring iconism to language. The declarative sentence takes on the iconism of the originary gesture, which means what it does, insofar as the differend constitutes not only the event represented by the sentence but the sentence as event—an event in which, rather than assuming a shared reality, the participants must stipulate to a provisional reality. Under such conditions, reality must be gathered together out of signs shaken loose from normalized reality so as to realign the relation between tacit and explicit. This realignment involves rendering all elements of the speech act—gesture, tone, sound/meaning correspondences, all the scenes trailing along the signs we bear with us, everything “chunked up” in any effort to keep up with the novelty of any speech act, everything that spills out when commonplace meets event, and everything banished by the doctrine of the arbitrariness of the sign—vouchers for the reality one attests and redeems. Language is what we attend from to each other’s attending to; language learning involves attending to what we have been attending from. Attending to the tacit knowing enabling any signification recuperates distractions by using them to break up the fixations that interfere with our attending to the overlapping margins of our idioms that make language learning possible.

March 28, 2013

The Loves that Dare Not Speak Their Names

Filed under: GA — adam @ 10:54 am

The United States is a pathetic joke of a country. Our political class (but who put them in office?) is paralyzed when it comes to crafting budgets, controlling debt, defending borders, developing coherent relations with friends and enemies abroad; but, for a non-issue like same-sex marriage, we are capable of moving rapidly toward self-righteous unanimity and policy clarity. Only those issues of concern to the victimocracy get addressed expeditiously, but the only issues of concern to the victimocracy are those pseudo-inequalities that allow them to conduct unending simulated Nuremberg-style show trials of stereotyped victimizers. It’s worth saying a few words about the same-sex marriage question, nevertheless—not because it is a serious human or political question, but because one of the desperate (but what kind of resistance to the left isn’t going to be a bit desperate these days?) counter-proposals allows us to follow a thread through the unraveling. That counter-proposal comes from a strain of libertarianism, which says: just get the government out of marriage. Let individuals of any size, shape, number, dimension, mode or degree of intimacy create whatever contracts they like regarding the sharing and disposition of property, reciprocal obligations, the terms of contract termination and post-termination settlement, etc.; let the government remove all reference to marriage from tax codes or anything else; let churches, synagogues, mosques, etc., marry whom they will, and, perhaps, agree to supplant the state as arbiters in cases of divorce (with the consent of the married parties, of course); and let adoption agencies, schools, home sellers and renters, employers, etc., recognize whatever form of marriage suits their own interests and conscience.

This proposal should be put in play, because, obviously, once marriage can be same-sex, it can be anything—perhaps anything the government says but, ultimately, anything anyone says. But there are problems. Who, after all, enforces contracts? The government, which is to say the façade behind which the victimocracy conducts its crusades, would be far more involved in marriage than ever before, with the responsibility to sort out a whole array of confused, complex, and misconceived contractual arrangements, with, undoubtedly, extremely incompetent, unacceptable and easily evaded modes of enforcement, leaving tens of millions in legal limbo regarding crucial issues like child custody. Well, as someone has said, when you can’t solve a problem, enlarge it—that is, let’s radicalize further. Let’s shift over to private courts: we would have to get into the habit of signing our contracts before mutually agreed-upon arbiters and, presumably, of conferring upon those arbiters agreed-upon powers of enforcement. This new system would put enormous social divisions in place: first, between those who would continue to rely upon state courts or simply exit any system and create informal, loose, polygamous family structures, as is already the case in many American inner cities and among the underclasses generally, on the one hand, and those who would opt into the new system of private courts, on the other; and, second, within the new system of private courts, between those with the discipline to work within a system relying heavily upon a willingness to abide by verdicts that will undoubtedly be difficult to enforce (will a private court really be able to impose a judgment on a husband who takes the kids, in opposition to prior agreements, from Tennessee to Oregon?) and to avoid recourse to the courts in the first place, and those without that discipline. The old system, especially once abandoned by the responsible and self-sufficient, will devolve into some combination of quasi-totalitarian nanny-state rule, in which the state regularly steps in and regulates parents and cares for children, on the one hand, and renewed clan or gang systems, on the other, as women will have to rely on their male kin (or any other vehicles of male violence they could enlist, through whatever degradation to themselves) to avenge and rectify their violations and abandonment by promiscuous men.

Would 50% of the population be capable of migrating into the new system? 1%? How many would be necessary to make it sustainable? Whatever the answers to these questions, the point of this thought experiment (I don’t mean to suggest I think it couldn’t happen) is that if such a migration, or exodus, were not possible, then our current system isn’t possible either: the ability to make reciprocal promises and to fulfill them, which such a system would require, is exactly what we would need to hold on to what we have, now that the government no longer supplements and shores up the values and commitments required for a nuclear-family-centered society but actively undermines them.

One last question. When a woman has a baby, what makes that baby hers? (For clarity’s sake, let’s leave the father out of it.) Go ahead: justify her “right” to the child. I don’t think you can do it. Childbirth and parenthood are now state-sponsored and regulated activities like any other. Do you not need a birth certificate to authenticate the “provenance” of the child? Are there not myriad laws determining how you must treat and care for and educate the child, along with a full array of government agencies empowered to enforce those laws (would we have it any other way)? In other words, you didn’t birth that: just as the government built the roads, educated the workers, and supplies the police and firefighters that make it possible for you to do business (and so you didn’t build that), so the government (at least) funded the hospitals, gave loans to the doctors, had the FDA approve the pain-killing drugs, vaccinations, etc. (and you usually need a road to get to the hospital as well) that made your giving birth and then raising your child possible. You will have no argument once some government bureaucracy, armed with a plenitude of studies regarding the needs of children, goes from house to house determining which children are better off where they are and which ones would be better removed to some more approved environment. One more term in office and Mayor Bloomberg will no doubt get to this. The difficulty you will have arguing against this without some presumption of the pre-state naturalness of the mother’s relation to her child is exactly the same difficulty we have arguing that marriage simply is a sanctified union between a man and a woman, the “joining of their flesh into one,” or whatever equivalent liturgical phrase it is sheltered under. The very fact of having to argue it makes us bereft.

In the privatized system I am imagining, what would make the baby the mother’s from another, equally disturbing standpoint—that is, what recourse would there be if mothers started abandoning their children and taking off beyond the reach of whatever jurisdiction they inhabit? (We can no longer assume anything. Why, indeed, care for a child that just happened to pass through your birth canal?) A genuinely private system of civil law would only be possible among people who understood that such questions cannot be answered via an impeccable sequence of declarative sentences: marriage is marriage, children are children, parents are parents, and so on. People do terrible things, like abandoning babies, but only those who know, without necessarily being able to say how they know, what marriage is and what mothers and fathers are, are capable of knowing that such things are terrible and that we must step in to remedy them by taking in children, setting up orphanages and foster parenting institutions, restraining parents who become dangerous to their children, and so on.

Permanent damage to the language is probably harder to inflict than such damage on institutions. You can erase “husband” and “wife,” “mother” and “father,” even “son,” “daughter,” “sister” and “brother” from official documents, but it will be more difficult to uproot them from people’s minds and our overlapping heritages. And the words themselves, which will exist not only in people’s conversations but in books written more than 10 minutes ago that some people will still read, will be found appropriate to experiences, and will serve as a rebuke to those who see them as little more than slurs (if you think I am exaggerating, I don’t think you have worked through the logic of “same-sex marriage”—an examination of the implications for entire vocabularies of love, affection, intimacy, and so on would itself require a lengthy discussion, as would the inclusion of the totality of the effects of the therapeutic state). For quite a while, anyway, there will be ways out for people who want them.

March 8, 2013

Paulmania

Filed under: GA — adam @ 6:42 am

The almost unanimous conservative euphoria over Rand Paul’s filibuster the other day seems to me an odd thing. More precisely, it seems to me delusional, and therefore demanding understanding. Much of the excitement seems to result from the sheer novelty of a real filibuster, requiring the speaker to hold the floor (and hold it in) for hour upon hour; part of it is the hunger for someone with the “balls” to finally “stand up” to the Obama administration; part of it is the entrance into public life of the libertarianism that has been percolating for years around the margins of the conservative movement and Republican party, which takes its rhetorical power from a fetishization of the US constitution (by which I mean not adherence to its terms, but the belief that the defense of the constitution defines our political imperatives and priorities, and that the constitution contains the answers to political questions, even to questions regarding the best mode of public life); part of it is a sense of stealing the left’s issue and their thunder, and even gaining the grudging support of the more honest among them; and part of it is, I assume, genuine concern over the immediate issue at hand—the question of whether the President is constitutionally empowered to assassinate (or is it only to assassinate using “drones”?) American citizens on American soil.

But if we start with that issue, without which the entire event is really nothing more than catharsis, it seems to me there is much less there than meets the eye. I have noticed that on conservative websites, discussions of the general principle get met with indictments of the Obama administration’s duplicity, opportunism, cynicism, and treachery, most of which indictment I share, but now tipping over into the further claim that if we can’t put it past them to demonize Tea Partiers, gun owners, Christians, veterans, etc., as potential terrorists, then we also can’t put it past them to start assassinating them. This slippage into left-wing-style paranoia (the indulgence of which I would add to the above list of reasons for the euphoria) both misses the supposed “real” point and unwittingly demonstrates the emptiness of Paul’s entire exercise: such an administration would have no problem acknowledging they have no right to do something and then going ahead and doing it anyway; and whether the government might misuse the powers at its disposal tells us nothing about whether it legitimately possesses those powers. And on that question, the answer seems to me obvious: the President would have to have the power to put down a rebellion organized on American soil; such a rebellion, by definition, would exceed the powers of law enforcement and put us on a war footing; part of putting down a rebellion that, on this assumption, controls part of US territory, might very well involve assassinating its leaders, even those who are involved in political and propaganda rather than strictly military operations, something well within the laws of warfare; since it is conceivable, even likely, that, say, a portion of the Southwest in some combination socialist/Mexican-nationalist, Chiapas-style revolt would include American citizens, those American citizens would be making themselves targets—so, yes, obviously, the President would have the right to kill them, using drones, poison, exploding cigars or any other available lethal technology. (I notice in rereading this that victimary thinking would exclude even the hypothetical construction of such a scenario, since one must in some way “stigmatize” some specific group in order to do so—I could have imagined an Islamist revolt in some part of Michigan, a white supremacist revolt in Idaho, etc.—in any case, names must be named—so denying the very possibility of such an event feeds one’s self-congratulatory White Guilt or self-righteous victimary stance.)

Disposing of that question leads me to conclude that the roots of the euphoria lie even deeper than the causes I have given so far: the assertion that no American President could ever have the right to assassinate an American citizen on American soil silently assumes and therefore reassures us that it will never be necessary to do so—that we are as inviolate here, on our own land, as the 9/11 attacks may have led us to believe we no longer were. Any war on American territory would be for causes both left and right are well equipped to diagnose and combat, at least in their own imaginations: the home-grown tyranny that our political doctrines have always warned us against. To put it bluntly: Paul’s filibuster allowed conservatives to join in the fantasy, which began 9/12/01 and has grown steadily in strength ever since—the fantasy that 9/11 didn’t really happen, and that there is no enemy out there that we don’t create by violating our own principles in some way, through some original sin of our own. On my reading of the evidence, those who enter this fantasy don’t leave it—indeed, why should they, as its terms are idyllic, combining in equal measure victimary resentments, an orientation towards one’s own, familiar, domestic political opponents, and an inexhaustible justification for romantic and populist posturing against the state. A state that they all, in the end, know will not really take them out with a drone as they sip their latte. This simply confirms what I concluded following the November elections: that Americans, having gotten on the victimocracy train, will not get off until it has sped to its destination, whatever that might be.

January 16, 2013

Notes on Cool (not cool notes)

Filed under: GA — adam @ 11:16 am

Our understanding of victimary thinking cannot be considered complete until we have accounted for the category of “cool,” which has proven to be extraordinarily enduring and generative. I wonder how far back the term goes—there must already be histories of “cool,” but the Wikipedia page, at least, is no help—it traces the attitude of “cool” back to the Renaissance, but records no actual uses of the term in its current slang sense more than a few decades back. I assume it entered our vocabulary in the 1950s (although I’d be glad to be corrected by anyone whose personal memory or historical knowledge can date it earlier), which would situate it squarely within the emergence of post-war victimary culture.

Hannah Arendt observes somewhere that the German romantics of the early 19th century referred to their cultural antagonists as “squares,” in the same sense of the term that is by now itself uncool but was pervasive in the 1960s. So, we can trace coolness, as an attitude, if not the word itself, back to romanticism—in which case, “cool” would be the synthesis of romanticism and victimary thinking.

This is important because without “cool,” victimary culture is shrill, desperate and ultimately unconvincing; with “cool,” victimary culture can produce iconic figures that offer alternatives to the cultural center. I think that Obama’s coolness and Romney’s squareness played a significant role in our recent election, and that the power of “cultural issues” like abortion and gay rights has nothing to do with the effects of such issues on people’s lives and everything to do with cool.

Cool represents a pole of attraction on the margin, opposed to the center. Cool is not, at least first of all, antagonistic towards the center—it is simply uninterested in it, except as a source of amusement. Coolness embodies an attitude of deferral, which might account for the term—as opposed to those who are “hot,” i.e., worried about social expectations and judgments, always trying to influence or preempt them, the cool position themselves outside of that space of judgment. In distinction from cynicism, or “coldness,” cool separates itself from the center in order to make space for a kind of authenticity disallowed there: the cool are passionate, usually regarding some singular relationship or project. In defense of that space, the cool are ready to confront the center—that defense takes the form of the protection of some victim of the mainstream, an exemplary victim whose plight the cool, from his marginal perch, is qualified to identify.

“Cool,” as a word, has moved to the center—middle-aged women use it to refer to a clothing purchase or new flavor of coffee. It is used as an honorific, often by adults to counter the exclusionary uses of cool among teenagers in their charge. And coolness might be dissociated from the victimary, as, for example, with the high schooler who can initiate his fellows into forbidden pharmacological and sexual experiences. Ultimately, though, since cool is always a potential target of the center, its deepest alliances are with all those other potential victims, against which the center is seen to define itself. So, the coolness of jazz and now hip-hop frames the black victimary stance; the coolness of rock the youth victimary stance; while homosexuality has come to be marked as cool in various ways over the past couple of decades, generally as the uninhibited, joyful, stylish and honest amidst a swarm of hypocrites. Interestingly, there doesn’t seem to be any distinctly feminine cool—the cultural commissars have been working overtime for years to lay a patina of cool over Hillary Clinton but I don’t think it has taken. Among celebrities, perhaps Angelina Jolie, who cultivates the distance and the absence of neediness necessary for coolness, and also consistently plays the lead in action movies, is cool. Jewish humor—say, Lenny Bruce—was cool at one point, but that has dissipated as Jews have lost their victimary credentials. At the same time, it doesn’t seem to me that a form of Muslim cool has been forged—perhaps in Europe? That might mean that women and Muslims must become constituents, so to speak, of other forms of coolness, which speak for them. In Lena Dunham’s online ad for Obama, in which she notoriously (but for whom was it notorious?) compared voting for the first time to losing one’s virginity, it was not the women appealed to or Dunham herself who was cool (on the contrary, they are dependent, insecure and needy)—rather, the ad bears witness to Obama’s coolness, as the kind of guy you would want to be your first. This perhaps leaves women free or, depending on your perspective, obligates or even compels them to be the conscience of the victimary. The Muslim incorporation into coolness still seems to me highly problematic—perhaps that will be a cultural faultline in the coming years.

What cool adds to the victimary so as to complete it is marketability. Cool, of course, is unthinkable without what Eric Gans has called the “constituent hypocrisy” of romanticism—by setting itself apart from the center, the cool becomes a trendsetter, or mimetic model, determining styles across the culture. As I suggested earlier, the relation is symbiotic—without its victimary affiliations, the cool would drift into coldness, i.e., cynicism and cruelty (the territory that David Letterman, for example, often veers into).

Is there a viable alternative to coolness, then? Certainly not goodness—if goodness were an effective counter, a competing mimetic model, to coolness, we would know it by now. (Tim Tebow, alone among conservative and Christian NFL quarterbacks in recent decades—from Roger Staubach to Kurt Warner—has approached a kind of celebrity based on coolness through an explicitly religiously grounded “goodness”—alas, he doesn’t seem to be good enough to put this hypothesis to the test.)

One would assume that conservatism couldn’t be cool, insofar as cool defines itself as conservatism’s other, but one of the interesting phenomena of the 2012 election campaign was the emergence of a movement, largely youthful, around Ron Paul—old, cranky and starchy, obsessed with constitutional rectitude, holding unfashionable opinions on abortion, and with a checkered history regarding racial issues—somehow, Paul became cool. Freedom might be cool, then, when linked to an uncompromising rejection of all the corruptions and compromises of freedom wrought by the “establishment.” But Paul never threatened the establishment, and only made trouble for the Republican wing of it, so he was indulged by the traditional media—we didn’t get to see whether his coolness would survive the kind of full-scale assault launched against Sarah Palin (who also had some markers of cool). A libertarian like Paul (maybe we will see this with his son) would need to devise a strategy for turning such attacks into the elements of his cool. I suppose supporting drug legalization helps here.

Beyond such speculations, the problem here is whether positions on the margin can be made into mimetic models without rejectionist gestures toward the center—the historical center, or firstness (initiative, responsibility, representativeness), if not the political or cultural center. In other words, what kind of generative margin (a margin that produces new centers) could run on other than victimary fuel? Coolness, presently, is confronted with the problem of having won the political and cultural centers through a demonization of the historical one (Western culture’s insistence on equality versus the imperial—the very premises, in other words, that make sympathy toward the victim possible). In power, cool figures like Obama become extremely tiresome, not to mention incompetent (we now have a government, part Ponzi scheme, part protection racket, part victimary theater, that is utterly uninterested in what were once considered the defining responsibilities of government, like defending borders, passing budgets and distinguishing friends from enemies). On the other hand, that historical center has been, probably irremediably, sapped by its appropriation by the victimary. The parasite has destroyed the host.

The only alternative, I think, is a kind of originary ‘pataphysics, the science of the exceptional invented by Alfred Jarry, and carried on through a series of avant-garde aesthetic and cultural movements until today. (Jean Baudrillard, apparently, considered himself a ‘pataphysician, something I will have to explore further.) Of course the roots of ‘pataphysics lie in romanticism, and ‘pataphysics itself could plausibly be seen as a precursor of cool. But ‘pataphysics is a program for thinking and learning, activities which interest cool not at all. One way of thinking about ‘pataphysics is via the famous Seinfeld episode in which George “does the opposite,” i.e., the opposite of what he would normally do in that situation; except here, one does, not the opposite (an ultimately incoherent approach, as not everything has an opposite, there may be more than one opposite, etc.), but the least probable, and not as opposed to what one ordinarily does but in relation to the probabilistic frame implicit in the discourse one inhabits.

So, when you address me, you hope for and expect a certain response, based upon social conventions, the present context, and your knowledge of me and our shared past; perhaps you also fear other possible responses, the probability of which you have sought to reduce in your mode of address. As a ‘pataphysician, my interest is in surprising you, but in some recognizable way—I can only undermine your expectations if I display some awareness of them. In this way I create an event, a happening, and make it possible for us to recognize each other on the margin and affirm the signs and tacit agreements we share. Clearly, carrying out such performances across the field of culture is not easy, but, like coolness, it’s not something everybody would have to do—just enough to create viable mimetic models. ‘Pataphysics must be rigorous and disinterested—its only politics must be in defense of its own possibility, which is to say against anyone who wants to remove events and happenings from social life. (I have assumed that with the fall of East Bloc Communism, the work of talented and absurdist [i.e., ‘pataphysical] dissidents like Václav Havel had become irrelevant, but maybe we have much to learn from them.) Originary ‘pataphysics, as an overtly marginal position, shares the field with cool, but it is not itself cool because it seeks to find and refound rather than stigmatize the center; maybe the other of cool can just be “firstness.”

Well, one might say, wouldn’t, say, a vicious or violent response to an amiable greeting be “doing the improbable”? Maybe, but only once—nothing is more monotonous (and therefore predictable) than violence (and the means taken to restrain it), once it has upset some space based on trust. Violence, or any kind of violation of already achieved forms of civility, would not, that is, open the field of possibilities, or lower the threshold of significance, which is the point of ‘pataphysics. The most valuable effect of originary ‘pataphysics would be what the left has promised (or, for that matter, what modernity has promised), with unsatisfying results: the recovery of excluded voices and the creation of new ones. If I, say, improbably take you literally when you ask me how I am, unburdening myself of an exhaustive account of my current state, I remind you of several things: the kind of shared beliefs, commitments and experiences that must have once been necessary to put those standard greetings in place; the fact that we no longer share those beliefs, commitments and experiences and yet still need the greetings; that sustaining those greetings and civility, then, might not be guaranteed; that we might need to discover means (not necessarily my current, excessive, gesture) to restore the foundations of civility; and more. I thereby make it more likely (another shift in the field of probabilities) that you will notice further fraying of standardized modes of civility, and be attuned to new refreshments of those modes.

There is no reason why we can’t have forms of art that gently intervene in everyday life, turning us self-reflexively upon our habits, without the implicit or explicit condemnation of middle-class lifestyles which makes so much performance art so annoying. I think most people would enjoy losing a couple of seconds here or there with little installations that might play off of the constant surveillance now characterizing our lives. (How often do we now see ourselves entering and leaving places? What if we saw ourselves upside down once in a while? Or, looking up to see ourselves, saw a celebrity walking out instead?) Or that play with our expectations of impeccability in business establishments—like an installation inviting customers to clean up a little mess, with each customer contributing to a new arrangement? We always think of little things that might go wrong, or awry, in carefully organized settings—little bits of art that fulfilled those possibilities, perhaps giving them surprising happy endings, would be appreciated. There might be a place for the victimary here—little bits of feminist or anti-racist theater that show people how it feels to be viewed as “other”—but they would have to reward the viewer/participant/customer.

None of this would be cool (even if those who see such works emit one of those soft, clipped “cool”s which have become so popular and hopefully weaken the power of cool), because these would not be ways of drawing attention to a potentially volatile margin—rather, they would be collaborative ways of remaking the center. Perhaps we can break up and reform the word “perhaps” to give it a name: “per”+“hap,” or through/by chance/event: firstness, then, creates perhaps (the plural), or perhaps (third person singular). Maybe we could set aside the more provocative “firstness” and simply say that after cool comes perhapsness. With text messaging and Twitter, that would get reduced right away to PHPNS, and maybe rebranded as “pappens,” making it only slightly more verbally cumbersome than “cool.” Well, as Proust had his narrator say about a fantasy, the fact that I have just imagined this means that it can’t possibly happen this way. But maybe that itself is an instance of perhapsness.

Cool can overpower goodness because moralities predicated on human equality want the scene without the scene—as if everyone could be arranged before the central object without the disturbance of everyone having to present his position to the others and interpret theirs in turn. Morality can only be thought in very limited ways in terms of abstract rights, obligations, fairness, rules of behavior, thou shalt nots, etc. The most basic morality is entering the language of the community, working with its terms, its tropes, its idioms, even its rhythms, and at least respecting and trying to learn them to the extent one is an outsider; somewhat more demanding is to speak the language of some specific other, the more differentiated the other the more demanding the obligation; more challenging yet is exposing the limits of the community’s or the other’s idiom, opening the possibility to accommodate as yet unrepresented desires and resentments; highest of all is the invention of those new idioms that will indeed represent those desires and resentments. That, in fact, is what the moralities predicated on human equality have done, so I am not dismissing them—it is just that they will serve us better if read as innovations in language to be revised rather than transparent principles to be defended against “illiberal” attacks. Cool exposes the limits of “bourgeois” morality, and can only be replaced by a mode of discourse that does it the same favor in turn.

Another way to think about it: when a civilization collapses, what is happening is that the immense architecture of tacit agreements, everything that was agreed upon and settled long ago so that we could go on and forge more practical and immediate agreements, turns out, after all, or by this point, to be, or to have been, disagreements merely misunderstood as agreements. Naturally, at this point, those more practical and immediate agreements evaporate as well. We’re human, so we’ll need some kind of agreement, some mode of joint attention, just to get through the days, and those provisional agreements can emerge out of the frayings of the disintegrating ones—for example, in shared irony towards what was once taken for granted. What might become possible in such circumstances is what has not been possible for a long time—foundings, which can be found among the ways we just happen to be together, as a result of the intersecting trajectories that have brought us where we are. If we have agreed to do something together, and the project falls apart, then we are released from the terms of the agreement, and yet there we are—we might as well do something. All of the habits, literacies, and implements we had gathered for that project are still lying around as well. Why not just begin by agreeing to do something, this or that, anything, making use of the now unfamiliar materials in a new way? The more arbitrary the better, because that places the agreement itself at the center, rather than the pretension that we are just doing what reality tells us to do—and because uses and potentialities of those materials which were otherwise hidden now become prominent through new articulations. Arbitrary, Oulipo-style constraints will enable us to find rules for our agreements, and to discover who we are coming to be through those regulated interactions.

I have been troubled by the sense that a cultural project interested in widening the field of possibilities might be taken as an evasion of reality—as fantasy, at best, or a totalitarian attempt to remake the human condition at worst—until it occurred to me that reality itself is nothing more than the compilation of present possibilities. Nothing is fixed and set—as soon as anyone makes a move reality has already been adjusted. All originary ‘pataphysics would do is widen the field of possibilities in any present, not obscure the fact that at every moment a wide swath of possibilities is cut down. And that’s all we need in order to be realistic: be willing to accept that, whatever our threshold for acknowledging a possibility, some things, lots of things, maybe most things, at any moment, will still fall beneath it. For originary ‘pataphysics, the rush of new possibilities will be matched by the discarding of old ones, creating “reality,” or conditions under which the consequences of choices can be accounted for.
