GABlog Generative Anthropology in the Public Sphere

June 2, 2009

Futurity and Presence

Filed under: GA — adam @ 7:36 am

For years I have been convinced, and I remain convinced, that there is a simple and infallible way of breaking the victimary spell:  for the “dominant” to use their power to defend those who are the victims of those who claim to be our victims.  This should involve not merely charity or altruism, but a genuine alliance, however asymmetrical the contributions of each party, against a common enemy, leading to lasting covenants and institutions.  This principle can be applied in infinitely varied ways–conservatives who have tried to liberate students from the public school monopoly, or poor inner-city blacks from their “civil rights” leadership through enterprise zones and other initiatives, have intuited that this is the way forward.  It demystifies victimary claims as just another mode of resentment, without the potential for generating much intellectual content beyond a few rapidly aging maxims (regarding the virtues, of which there are certainly some, of seeing things from “below”).  It acknowledges the inevitability of asymmetry in human affairs, and that the establishment of symmetrical relationships is meant not to eliminate such asymmetries but to establish arenas with a shared sacrality and prescribed objects of desire that are open to all regardless of those asymmetries–or, to put it differently, arenas where resentments are directed at attempts to introduce the asymmetries into that bounded space.  The asymmetries that will then arise within that space (we are all free to own property, but we won’t all get equally rich; we are all free to speak, but we don’t all become equally influential, etc.) reveal new possibilities within that field of human activity, that is, new objects of desire and means of appropriating them (inequalities in property increase productivity and wealth; differentials in influence provide models for refining our persuasive capacities). 

So, for me the obvious question is, why hasn’t this happened?  To be more precise, why has the one attempt (the “Bush Doctrine”), despite having been launched in response to the most propitious of events (the reductio ad absurdum of the victimary that was 9/11), turned out to be so feeble?  What desires and resentments have been more compelling, and why?  It’s very hard to get a sense of this from Leftists themselves, who (still!) answer questions about what they believe, or how they conceive of the results of their actions, primarily with diatribes about Bush–even Obama seems incapable of presenting any policy without framing it as a “new beginning” leaving behind a period of medieval darkness.  Everything, then, can be described as “cleaning up the mess,” “turning a corner,” “restoring our image in the world,” etc.–i.e., phrases devoid of information, even the involuntary kind one provides when stating any view, or communicating any sense of where one really sees things going. 

Here’s another, more conventionally political, way of thinking about where we are.  From 1932 to 1968, “Progressives” ruled America almost unchallenged, and seemed unlikely to be challenged.  They had seen us through–if not actually extricated us from–a depression, and had won the largest and most important war in history.  They had managed, in the post-War world, to meld a relatively mild welfare state with a revised version of traditional American middle-class values, and to produce leaders like Truman and Kennedy who could plausibly articulate those values.  They ushered in, under quite a bit of pressure, it’s true, a new era of racial equality.  They even, after some problematic entanglements, managed to get the question of Communism mostly right. 

The progressive alliance, including the media, universities and most of the political class, then stumbled quite a bit over the next 40 years.  The first blow to liberal hegemony came from the Left,  in the form of resistance to the Vietnam War and cultural assaults on bourgeois morality.  In getting the question of Communism right, the liberal elites ultimately alienated a large chunk of the next generation, which tapped into the tradition of anti-imperialism that had been exorcised during the McCarthy period.  And the cultural split alienated an important chunk of the middle class.  The first political result of this was the election and re-election of Richard Nixon, who completely accepted the welfare state, but represented the resistance of the “silent majority” to attacks on middle class values and patriotism. 

A series of blows followed:  the election of Reagan, on grounds similar to Nixon’s, but with the important addition of a rejection of Carterite weakness in foreign policy and a much more coherent, counter-Keynesian economic agenda.  And then, in the 90s, figures like Newt Gingrich, on the one hand, whose Republican majority actually promised to roll back important elements of the welfare state, and Rudy Giuliani, who restored the hope of decent urban governance, which had almost been lost.  And, finally, Bush’s cooptation of liberal themes of human rights and democratization (following up on Reagan here), tying them to an assertive foreign policy that involved the first serious (and ultimately successful) use of American military force since Vietnam. 

For a while the Democrats incorporated these themes, moderated their views on things like welfare, the market and regulation, and kept their pacifism and tendency to blame America for its enemies in check–while still demonizing their opponents, usually in more coded terms like “competence.”  Perhaps most important, cultural transformations in the areas of sexuality, family life and popular culture continued unimpeded–enough common ground here with libertarians, and the general desire of most people to stay out of others’ business, made any counter-revolution other than a verbal one unlikely.  But now, perhaps in part because of some of the successes resulting from these counter-revolutions against progressivism–the reduction in crime, the enormous generation of wealth over the past 30 years, the fall of Communism, the prevention of further attacks after 9/11–it seems, like a rubber band that has been stretched to its limits and then released, we have snapped back pretty much to where liberalism was in the 1960s.  If you think about what they had in mind, had the New Left and the debacle in Vietnam not derailed them–even taking into account our specific historical moment, and our economic crisis in particular–wouldn’t any good liberal circa 1965 or so see, at least at first glance, today’s government as essentially picking up where they left off?  Indeed, all the complaints that have accumulated about the “Right” over the past 30 years, all the griping in publications like The Nation and Mother Jones, among aging graduate students and community activists–none of it seems to have been wasted.  The Obama Administration’s rhetoric and plans are all formulated in an idiom intimately familiar to anyone hanging around the (mostly hopeless) Left between 1980 and 2008.  
In other words, however catastrophic (in my view, of course) the path Obama and the Democratic Congress have put us on, in some sense it looks like the “natural” condition of post-traditional (post-WWI, really) America.  In the end, the Left conceded nothing, and the link between Bill Ayers and President Obama represents the return of the New Left cultural and anti-American radicals back into the fold.  This is where we have been heading–the Republican revolts were simply detours.  

Auschwitz theology has proven so powerful because its roots lie deep in victimary modernity–in the compulsive self-liberation from obscurantist tyranny.  If you can’t imagine freedom in any other terms, you will keep imagining yourself enslaved in new ways with each new liberation from previous enslavement and you will keep seeking out previously invisible modes of victimization to abhor.  There is a covenantal modernity which displaces the victimary brand, but the US was really the only strong representative of covenantal modernity and, ultimately, the lasting influence of slavery gave victimary modernity a foothold here which it has prodigiously expanded.

But victimary modernity is impossible as a way of life–its triumph must lead to catastrophe.  The best thing to do, as far as I can see, is to stay out of the way as the catastrophe unfolds–predict nothing, don’t gloat, quietly offer alternatives which will be contemptuously rejected.  Unobtrusively abstain from the narrative of victimary modern liberation–that would include “tea parties,” references to Jefferson and Paine, etc. (although, of course, we need not criticize any of that, either, nor exclude the possibility that some movement will emerge from it).  The new narratives will have to emerge out of disciplinary spaces, and they will coalesce around the discovery and invention of modes of symmetry which leave pre-existing asymmetries alone.  And symmetry seen not as equalizing liberty but as esthetic freedom–see all the beautiful ways in which we can exist on the same plane with each other!  Create spaces that people will want to join once the victimary state starts to go bankrupt, and that will ultimately be able to find public representation by applying their idioms and habits to devising novel compromises.  I would think of this as a continuous presence, as opposed to transcendence–transcendence sees some idea embodied in reality, while presence is awareness that only one’s activity sustains reality, with ever-renewed signs rather than ideas.  Presence will involve a recovery of imperatives and ostensives for public use–of course, they could never fall out of use in private life–and a proliferation of models which are adhered to tenaciously but in very restricted circumstances.  I feel like going on to talk about auxiliary verbs as a model for this kind of activity, but that seems to be a discussion for a discipline that doesn’t quite exist yet.  To get it started, though, why shouldn’t language serve us as a source of models for reality–now that we have finally dispensed with the notion that reality must be the model for language?

May 29, 2009

Competence

Filed under: GA — adam @ 1:15 pm

For a while, “competence” has been a weapon used by the Left against Republican Presidents.  It began with the Dukakis campaign, I think, most immediately as a way of distracting attention from the candidate’s liberalism, and while it failed for a while, it has finally yielded fruit–certainly, the Bush Administration was effectively labeled “incompetent,” and the Democrats can present themselves, with lots of Ivy League technocrats who really want to run everything they can get their hands on, as competent.  It turned out to be a savvy strategy for a couple of simple reasons:  first, any modern administration is doing so many things that one will, at any point along the way, be able to point to dozens of “mistakes,” many of them egregious and harmful; and, second, the mass media, still the liberal, mainstream media even in these, its dying days, is much more interested in recording mistakes made by Republicans than Democrats.  After all, what is the measure of such things–according to what “objective,” competently administered criterion of competence could one “rank” the Obama, Bush, Clinton, Bush and Reagan administrations?  All one can do in making a case is list a series of mistakes–by the time you get to 15 or 20, it looks pretty bad, even out of thousands of decisions, so it really becomes a question of who wants to make such lists and what you believe belongs on them (not to mention the problem of ranking more and less serious mistakes, mistakes which are repaired to some degree or another, mistakes made out of carelessness as opposed to tough choices that could have gone either way, etc.).  Along with the reasons I just mentioned, Leftists prefer these lists because the Progressive philosophy of governing insists upon expert administration as the test of legitimacy–if you see yourself, as an elected or appointed official, as akin to an engineer or doctor, then the number of serious mistakes becomes an important measure of your performance.  
Conservatives rarely think to make such lists, because they are more interested in having the government do less rather than do it better–indeed, if the government does things, or can be presented as doing them, better, that provides a ground for having it take on more.  You would think this would make Democratic administrations vulnerable to charges of incompetence, but since no one really knows what the term means anyway, having lots of plans and being staffed by the type of people the media likes is good enough.

We could usefully trace at least one central strand of Progressivism to John Dewey’s argument that the scientific method should be applied to public and social life.  Rather than being driven by tradition and prejudice and constant shifts in public opinion, let’s explicitly identify “problems,” study the “causes” of those problems, try out “solutions,” and then measure the results of those solutions–exactly the way in which we would test a hypothesis in the laboratory.  Democracy, in that case, would depend upon the scientific method coming to replace traditional common sense in the public as a whole.  There are quite a few rather obvious problems here–first of all, the inevitable split, which must persist even as the population becomes more educated, between experts and non-experts when it comes to social problem solving; second, the fact that “failed” experiments in the sphere of social life have lasting effects and can’t just be “scrapped” as in the laboratory; and, third, the law of unintended consequences–or, perhaps, Heisenberg’s indeterminacy principle–which dictates that the experiment itself will transform, in all kinds of unpredictable ways, the conditions that were to be tested in the first place. 

This is not ancient history–what else does one imagine Barack Obama meant by “restoring science to its proper place” in his Inaugural Address?  On the one hand, it is a gesture to environmentalists, the abortion lobby and others; more generally, though, it is an assertion of the Progressivist philosophy of governance–why shouldn’t science dictate the way we organize health care, education, gun control, etc.?  There seem to be limits, though–has anyone proposed a “scientific” foreign policy?  Has gay marriage been formulated as a “scientific question”–would supporters accept as conclusive a study showing that children raised in gay marriages are less “well adjusted” than those raised in traditional marriages?  Of course not, and rightly so–even if one could scientifically determine the meaning of a term like “well-adjusted” one couldn’t scientifically determine what portion of one’s “adjustment” is determined within the family and what portion determined by what is outside, and in the interaction between the two.  To put it simply, no one accepts a scientific accounting of their values.

So, progressivism is meaningless as an actual practice of government, but as an ethos of those who govern it is extremely powerful–what it refers to is not so much scientific practice as the rule of those of us who define ourselves as pro-science.  For the secular, for those who don’t identify themselves in terms of their obligations to any community, for those who can comfortably present themselves as victims of religious, ethnic or bourgeois “prejudice”–for all of these, “science” is the default position, because in the mythology of modernity the anti-science position of the church and monarchy stands in for all the forces of reaction holding back economic, ethical and social progress.  Which brings us back to “competence” as a purely political term, similar to a more recently invented one like “reality-based.”

But a claim like “science is what scientists say it is” is not mere tautology.  The real meaning of competence is in performing the practices of some specialized community in a way recognizable by other members of that community.  An astrologer who stumbled upon the theory of relativity in 1895 would still have been “wrong,” or, more precisely, “not even wrong,” because no one in the scientific community could have done anything with that claim–it didn’t emerge out of some problem recognized by the community, some unanswered question or unresolved anomaly.  To be competent in such a community–and, I am saying, this is the only real meaning “competence” has–is to be able to recognize the relation between problems, questions and anomalies and the ongoing revision of the practices of the community or, as I would prefer, the discipline (a community which focuses on addressing a specific region of reality, a specific set of phenomena). 

In this sense, competence is extremely important, politically.  The hijacking of disciplinary authority for short-term advantage is scandalous because we rely heavily upon those who set aside immediate questions for the sake of what, in the words of Charles Sanders Peirce, “will prove true in the long run.”  But it will always be an ongoing temptation, because there can’t be any extrinsic authority governing the discipline–only those within it are competent to judge its workings–even while the results of work within many disciplines become increasingly valuable to the world.  Real conservative political thinking, at this point, would best direct its attention to finding ways to ensure that everyone stays within their sphere of competence–a concern that would mirror that evinced by the American Constitution for a separation and interaction of powers.  Those within one discipline ask questions of those within another discipline; consumers, voters and elected officials don’t interfere with disciplinary activity but choose the results of such activity that they prefer–that is the proper relationship.

But disciplines change and overlap with each other–new domains of “competence” emerge all the time, and can take advantage of the time-lag between their “discoveries” and the progress of other disciplines to proclaim arbitrarily upon all manner of things–academic disciplines like cultural studies are perfect examples:  they have a fairly sophisticated vocabulary that draws upon serious trends within modern thought, so they are capable of repelling criticism and attracting supporters–very few people are in a position to point out that they are essentially frauds.  At their best, disciplines sit between the sheer love of inquiry and conversation without bounds characteristic of the “amateur” and the rigor and accountability of the “professional”–indeed, one might say that disciplines start off amateurishly, pursuing some anomaly or taboo subject within an existing field, or separating an interesting question or problem from some craft or cult that has hitherto monopolized it; and then establish a vocabulary and idiom of inquiry that might some day freeze into jargon but will hopefully generate enough anomalies, paradoxes and antinomies to prevent that from happening. 

But we can open a disciplinary space any time, any place–whenever there is something not immediately visible that we feel could be seen if we had the right instruments or found the right “angle,” and we set aside differing interests and opinions in order to, jointly, see if we can find a way to see it–we have a disciplinary space.  In this case, a disciplinary space is an iteration of the originary scene–an iteration in relative safety, but somewhere in the back of the disciplinary foundation there is the sense of danger, the sense that order might give way to violence if we don’t find ways to see the same things.  And this lurking danger shows up in error–whenever we try to see something new our old habits keep getting in the way, even if it was a kind of interruption in those old habits that led us to seek something new in the first place.  When we point at something together, there is no guarantee that we “see” the same thing, and the only way to check on that is by pointing to something else, which repeats the same problem, etc.  There is no guarantee that after several “sightings” in common, having assured ourselves that we see together, some “monstrous” divergence won’t disabuse us of that assumption (in such cases, how do we know which is the error?).  We revert back from “seeing” to the idiom that enables us to talk about convergences and divergences, and even here there are no guarantees.  We simply gamble that the generative is better than the self-enclosed–whatever can produce more of itself, and in varied forms, seems preferable to anything hermetic and repulsive.

In this case, there can be another discourse of “competence,” other than the “progressive” one which takes the “administration” of “society” as its disciplinary object.  We can speak about habits; signs, ostensive and imperative; idioms; norms and error; and overlappings.  Any of us can be sufficiently self-reflexive to note where our extant habits are taking on new material; any of us can identify others whom we consider competent to judge our practices, and others competent to judge the results of our practices, because those results fall into the region covered by their habits; we can position ourselves at the limits of others’ habits and point out–set up a disciplinary space aimed at pointing out–where they exceed their competence; and we can test, at the margins of practices, where norms get fuzzy and error and innovation get entangled.

The whole idea of a “mainstream” is un-American–far more normative for us are the conditions of the 19th century, when we were flush with con men, cults and debunkers, and it must have been hard at times to keep them apart.  The “mainstream” is an invention of progressives, a way of holding together the welfare state and Cold War belligerency.  Let FOX News cultivate some crazies; let the creationists have their conferences and their densely argued and meticulously documented pseudo-academic treatises; and let the debunkers have at them.  And maybe I was too hard on cultural studies a moment ago–once we see it as a specifically academic cult, with an affinity for other cults (UFO hunters, gay subcultures, conspiracy theorists), we can find a place for it as well.  But not in state-supported academies–a major project over the next few decades, probably far more important than any strictly political activity, will be circumventing and ultimately undermining the University as a source of authority and credentialing.  Employers should decide what they want their employees to be able to do, and then they should train them in those skills specific to the job, while relying upon academies focused on requested skill sets to offer credentials that testify to the student’s ability to do x, y and z.  Lots of vocational schools, and lots of on-line education, then–but the Humanities need not suffer, since there is no doubt that advanced interpretive and communication capacities will have an important place in the economies of the future.  But there is a huge gap in that [employers] “should”–no one is competent to issue imperatives here.  Only the proliferation of disciplinary spaces on the margins of and outside of the University will fill in that gap.  For now, though, we can hammer away at tenure, in all its forms in all institutions–there is no more pernicious habit than that one. 

This is probably not the way in which most participants in the discipline of Generative Anthropology see it, but I would like to practice the originary hypothesis as a source of idioms of inquiry–a habit of prying loose new vocabularies and grammars from the anomalies within existing, especially decaying, disciplines.  It is the difference between iterating the gesture on the originary scene and assessing the results of that gesture.  Perhaps these are different competencies. 

May 22, 2009

Below the Threshold

Filed under: GA — adam @ 10:48 am

Desiring to educate myself in economics, so as to speak (and think) intelligently about current events, I have subscribed to the Mises email list, receiving a daily column.  I’ve been reading Hayek for a while, and I have been moving in an increasingly radical free market direction for some time, and so Mises’ thought is extremely helpful.  I’m taking it in small doses now, waiting until I have some time to continue reading Mises’ Human Action.  I haven’t been disappointed, as I find the economic analyses very enlightening and congenial.  One lesson I have drawn so far, and which I will probably write about soon, is that one source of our present troubles, which is to say one manifestation of our victimary regime, is the taboo on wages going down.  We–most of us as individuals, and in our policy preferences as a society–would rather see high paying jobs disappear because the companies in question go out of business rather than take pay cuts, leaving jobs intact at what would still be much higher wages than those for the jobs that will come in place of those lost.  This is very interesting.

My interests here are different, though.  Along with the economic analyses, I also receive political ones from the Mises list.  “Analyses” is a generous term here–they are mostly diatribes, as virulent and maniacal as any coming from the farthest reaches of the Left–the state (not any particular government), the American one in particular, is a parasite, a bloodsucker, a mass murderer, all of its enemies justified in their resentments.  In one recent animadversion on “imperialist” American foreign policy, rather sensible statements by American statesmen about the need to keep such policy buffered from (not completely immune to) democratic sentiment were treated as criminal–and this comes from a tendency within social thought that sees even the most democratically determined encroachment on private property as illegitimate.  Here, in fact, seems to be the link to the Left:  if the democracy is not on your side, argue fundamental rights; if constitutionally determined decision making is at times unpopular, argue for the absolute claims of the democracy. 

This dichotomy between reasonableness in some areas and craziness in others seems to me increasingly common now, and it leads to a kind of schizophrenia, at least for those of us who see the cultic Presidency of Barack Obama as a perhaps irreversible disaster.  For example, in my view, debates over the constitutionality and rationality of the Federal Reserve Board are legitimate and highly interesting–just as I don’t believe that anyone knows what the temperature will be in 2040, I don’t believe any economist can possibly know what the interest rate “should” be.  But state such views publicly, and who will you find standing with you?  Some solid conservatives, of course, but also lots of conspiracy theorists, racists and anti-Semites.  Speak forcefully against the weakness of the West in the face of adverse demographics and the appeasement of Islamic supremacism, and (at least in Europe) you find yourself ranged alongside neo-Nazis.  Such positions are, it would seem, dead in the water:  even the rapidly disintegrating “mainstream” media finds easy pickings in such a target-rich environment. 

It may be that the standard modern Western political narrative, the people liberating themselves from benighted, tyrannical rule, has been monopolized by the Left.  For the narrative to work at this late date, you need a monomaniacal victimary focus–specific, universally recognizable victim-types you can keep in the public eye constantly.  Defenders of free markets and individual rights and liberty can’t compete, because you can’t represent the entrepreneur who never was because of hostile regulation, and even if you could, how sympathetic would he or she be?  The market doesn’t lend itself to scenic representation–the only people on the right who can make the effort to compete with the Left, in formal terms, are the ones who construct a symmetrical relation with the privileged groups on the Left–i.e., white supremacists, claiming to be oppressed by the Zionist Occupied Government. 

Another example.  The popular blog Little Green Footballs has taken an interesting turn since the election.  Little Green Footballs, run by Charles Johnson, was, as far as I knew, a blog interested in focusing attention on denial of the rise of Jihad throughout the world.  Technologically skilled, Johnson excels in exposing frauds–most famously, the forged documents used in 60 Minutes’ story on George W. Bush’s National Guard service, but also some photoshopped photos of the Israeli war in Lebanon in 2006.  Since the election, Johnson has shifted his attention dramatically–while still posting on the depredations and denials of what he used to refer to ironically as the “Religion of Peace”–to exposing creationists and other “extremists” in the Republican party, as well as links between self-declared “anti-jihadists” and fascism, neo-Nazism, racism and anti-semitism.  I think Johnson should be seen as sincerely wanting the Republicans to become a party of what he would consider the “center”–that elusive fiscally conservative, socially liberal, national security hawk that would-be conservatives embarrassed by the pro-life, “fundamentalist” crowd always dream of.  What is interesting here is that no one seems to fit the bill–the creationists seem to be the only ones exhibiting principled opposition to Obama’s plans, interested in what the Constitution actually says, or able to find a center anywhere other than in the latest polls–and, so, Johnson ends up gesturing towards rather pathetic “moderates” like Arlen Specter and Olympia Snowe as the salvation of the Republican party (or, at least, pointing to the marginalization of such figures as the self-destruction of the party).  

In other words, there is no “center”–that is, no set of “actionable” opinions on central questions regarding the continuance of American civilization firmly shared by a solid majority of Americans.  For example, Obama can continue the war in Afghanistan and continue Bush policies on prisoners, but what he can’t do is cease scapegoating Bush, which deprives these policies of any principled basis, because the implication is that the policies need only continue until Bush’s “mess” has been “cleaned up.”  In other words, he can do what needs to be done only on the condition that we fantasize that it never really needed to be done in the first place.  But leave that aside:  everyone knows that Iran is a significant threat; and everyone knows that we will not do anything to prevent the Iranian regime from getting nuclear weapons.  We are allowing a fourth-rate military power, one deep in various economic and political crises, which could be severely disabled and perhaps tipped over at a fairly low cost, to elevate itself into an arbiter of affairs in the Middle East and beyond–what more proof is necessary that we are incapable of acting on basic observations and common sense?

So, I propose that we operate under the following assumptions:  first, that the modern Western political narrative has exhausted itself, and its husks can be left to the Left, who will run it out through their increasingly baroque parodies; and, second, that there is no available center at present.  We might as well connect the two claims–there is no center because the only narrative we know has been exhausted, and nothing has been put in its place.  Why else would the cultic Obama presidency be almost pathologically determined not to question a single shibboleth of Keynesian economics, or of a Carteresque foreign policy predicated upon the de-centering of America?  Without that narrative of liberation from an imperial, free-market, Babbitized America, the world simply wouldn’t make sense to these people.

But the narrative of liberation from obscurantist authorities is the only available narrative, and the one upon which all public discourse is predicated.  Try to imagine a political discourse focused solely on strengthening the center:  upon an auditing of all institutions based upon their conformity with Constitutionally authorized purposes and the degree to which they remain within the area of their competence.  Such a politics would involve arguments and conflict, but the victim card would not determine the victor in these disputes–the idea would be to try out a model of the institution one believes to have drifted from its originary minimality; others would construct models out of the paradoxes inherent in your model, and these visible paradoxes would center conversation upon the proper version of minimality.  The problem with such a politics is that if you don’t point to a victim produced by the distortions in question, why should anyone care?  And if you do produce victims, not only will you be unable to compete with the more lurid victimary tales already circulating, but you will have already lost the argument, because you won’t be able to claim that your approach necessarily offers a shorter path toward the elimination of that mode of victimization, or to the most appropriate recognition of the victim in question.

So, I must be arguing for a politics that is largely below the threshold of visibility and publicity, and yet can cross that threshold on occasion–more precisely, on those occasions when the threshold itself is lowered so that private activities normally hidden from view become political.  A useful way to think about this is in terms of those moments when you see someone you know through carefully framed events–at work, or school, or some recreational activity–in a new context, one in which their behavior can’t help but deviate from the normal script.  This new context can be one other than the normal one–say, seeing your boss at a sporting event; or, it can be an interruption of the normal one–say, an ordinarily stolid co-worker breaks down in tears at work.  What one is seeing in such cases is the eruption of an extrinsic habit into an established frame of reference, and this habit, in its own terms perfectly normal (there’s no reason the disciplinarian, by-the-book boss can’t come to the Red Sox game decked out in colorful and comical team paraphernalia, or the quiet, reliable co-worker shouldn’t be a deeply sentimental individual in his own time; and the proof of that is that we revise our habits so as to incorporate this new way of seeing that person), “scandalizes” the scene into which it erupts.  The result is error, that combination of embarrassment and revelation, from which we can’t turn away even as we can barely stand to look on.  I would like to suggest that this errancy–getting into the habit of interrupting others’ habits by having our own interrupted–is the best model for the present moment in our society and politics, when almost everyone is straying outside of their area of competence and exposing their unthinking habits on a regular basis.

Not only is errancy the best model for contemporary culture and politics, but it is originary, having a critical place on the originary scene.  If we accept that there must have been a first “signifier” on the originary scene, then we must also accept that the sign as put forth by this first signifier was merely “potential,” and therefore both sign (it is already iterable, and being iterated) and not-sign (it has not consolidated the scene, and therefore has not yet distinguished itself from the mimetic crisis it is “destined” to interrupt).  The most interesting part of the scene to think about, for me at least, is the process–or the wide range of possible processes–by which we could imagine the series of imitations (through fits and starts, through automatic mimetic instinct and through a shared vision of the imminent crisis, through glances at the responses of others present, aborted movements back to the center, etc.) through which the sign would concatenate through the scene, very likely coming out of it looking very different from how it started, and with all memory of that origin (even on the part of the first signifier himself) lost, since only now is there a sign and event which makes memory possible.  I consider this concatenation an originary grammar of norm and error–each signifier in turn modifies (“distorts”) the sign as it has come to him while at the same time “packaging” (“correcting”) it for the next in turn.

I am obviously now going to argue for a politics situated within this instant.  An auxiliary politics within an indicative culture.  I’ll first make a grammatical observation:  what makes a sentence meaningful is the presence of a “commanding name.”  The noun, ultimately a name or whatever stands in the place of the name, generates a sentence by commanding us to suspend or withdraw some command or demand (imperative) of our own pertaining to the space covered by the name.  In other words, adding the predicate to the noun situates the referent of the noun in “reality” and renders it inaccessible.  The noun, then, tells us to cease our demands and align ourselves with that reality.

If the sentence lacks a commanding name, it doesn’t make sense.  Think about how many sentences with no nouns, only pronouns and other deictics, you would need before no two people could agree on what is being said.  I would suspect no more than 3 or 4 in most cases.  Now, while there can, in some languages (I personally don’t know how many), be sentences without verbs, and the first sentence might have been such, I’m going to (without argument, for here and now at least) insist that we don’t have real sentences until we have verbs.  And that verbs are, ultimately, imperatives.  So, the verb is the commanding name being commanded to stay in place, hold reality together, and command.

Without the commanding name, all we have is a chaos of imperatives, interrogatives and ostensives–again, consider the example of a series of sentences without a noun.  What would keep it going, what would sustain the presence that we ultimately need to hold the world together–There it is.  What?  That–look!  Which one?  The one right there, between them.  Show me!  Look in between the two that are in between those four…  This is that instant where the sign has been put forth but not yet publicly “authenticated”–the “dialogue” can be held together by the shared presumption that there is something to look at, that there might and must be a commanding name, even if it is presently unavailable or withheld.  In fact, the more commanding the name, the more immense the reality it brings into being, the more we struggle to point to some part of it that will be verifiable as a part of it.

So, our culture of errancy is one in which the possible commanding names are presumed to be there but beyond our ken.  So, a politics modeled on this condition would involve intense adherence to something floating around as a possible commanding name, along with the attempt to bring others into the process of commanding the name to command, to mistake and norm that commanding name together.  This involves the creation of habits of finding “pregnant” names, obeying them, and issuing commands to others to mis-take those names.  This would create an indicative culture, a habit of composing sentences that remain very close and sensitive to the world of imperatives while nevertheless just barely transcending them, staying close to the boundary between pointing at something together and not making sense.  This would include a “chiasmatic” relation to the public discourses, generating maxims through the reversal of existing maxims.  For example, I have heard Barack Obama’s favorite phrase, “we must reject the false choice between…” so many times that I would like to create a new habit and guiding maxim out of inversions–say, “we must choose to falsify the rejection of the between” because, in fact, the “between” is precisely what Obama systematically rejects, his “false choices” always being completely false themselves.  So, in choosing to falsify the rejection of the between we open up the between as the arena of choosing.  It could use work, but that’s the beginning of a political maxim and habit.  Keep promoting and mistaking models as the falsification of the rejection of the between.

And I might as well have myself conclude by stumbling into one more thicket of mistakes, and argue for an “auxiliary” politics based upon the contemplation of the magnificent so-called auxiliary verbs.  Have, might, will, do, etc.–there are a lot of grammatical arguments here, but at the very least these are the verbs that can be followed by an infinitive without the “to.”  I find them intriguing because they are very difficult, if not impossible, to use as imperatives, which to me suggests their origins lie in the interrogative and answers to questions–from expressions like “think you to come?” to “do you think you will come?” and from “fears and worries assail me” to “I am afraid and have been worrying”… The auxiliary opens up a space of freedom–rather than thoughts, fears and worries operating directly upon one, one entertains, considers, distributes those thoughts, fears and worries.  The auxiliary makes reality somewhat less imperative.  And we can create whole chains of them without quite tumbling over into senselessness in some splendid ways–I will have finished considering whether I might still have had something more that could have been said, I might hope, before having done with this sentence.  The auxiliaries command us to mistake the space covered by the name, generating a present with ample references to possible pasts and futures.  So, I’m not quite saying that we should use a lot more auxiliaries; rather, that the possibilities for vagueness and hence freedom, along with the capacity to sustain a series of switches between tenses, actions and persons embodied in liberal use of the auxiliaries be our model for remaining just below the threshold.

Of course, one would be justified in requesting some examples.  I’ll get back to you on it.

April 29, 2009

Syntactic entanglements

Filed under: GA — adam @ 8:28 pm

My reading of contemporary history places the events of 9/11 as the pivotal event in the postmodern world governed by Auschwitz theology.  9/11 had, broadly speaking, two possible outcomes:  an overturning of Auschwitz theology, White guilt, and the capitulation to victimary blackmail it compels; or a resurgence and intensification of that theology and guilt, as its adherents fight, as we all do, to preserve what is sacred to them.  I will maintain this reading of history until I see overwhelming evidence of some fallacy disabling it–from that standpoint, it is impossible to deny that the second outcome has, in fact, attained decisive ascendancy over the first one.  Ultimately, the overturning of Auschwitz theology required the dismantling of too much that is sacred, everything tied to the general reading of social reality in victimary terms.  The radical restructuring of our modes of pooling risk required for civilizational survival is simply unthinkable–no political figure would now suggest even something as moderate as Bush’s proposal for partial privatization of Social Security.  And yet the cultic Presidency of Barack Obama can’t solve any problems–if there is a meaningful politics now, it is in holding on to forms of understanding, to narratives, to habits and maxims, that can survive the coming wreck.  My own attempts to think of such a politics, in my essay on “Marginalist Politics,” in some recent posts, and in my posts on the JCRT Live blog, in terms of originary grammar, of the originary entwinement of norm and error that I find to be embodied in habits, comprise the focus of my own work now.  How could I recommend it to others, though?  I have been recommending the courage of our habits, which is to say idiosyncrasy and eccentricity–where error, innovation and freedom overlap.

 

Perhaps a trivial example:  Miss California, Carrie Prejean’s answer to a question about gay marriage at the Miss USA contest:

Well, I think it’s great that Americans are able to choose one or the other. We live in a land where you can choose same-sex marriage or opposite marriage. And you know what, in my country, in my family, I think that I believe that a marriage should be between a man and a woman. No offense to anyone out there, but that’s how I was raised, and that’s how I think it should be between a man and a woman.

For someone who teaches writing, this kind of thing is of the greatest interest (there was a bit of talk about some of Sarah Palin’s syntactical anomalies in impromptu speech during the campaign–I may go back and look through some of that, but I suspect I would find phenomena similar to those I will point out here).  “Well, I think it’s great that Americans are able to choose one or the other.  We live in a land where you can choose same-sex marriage or opposite marriage.”  Perhaps Americans as a people, governed democratically, can choose one or the other–this would be an axiomatic reference to the terms of self-government.  Maybe it is a reference to states’ rights–the people of each state can choose one or the other.  This would be more accurate in terms of the progress of gay marriage through the political system; but it would also have a different resonance, more sinister for the cultural elite by which Prejean is being questioned and monitored here, and therefore also a more overtly political claim.  Or maybe it is a reference to the choice of each individual American–this would be an inaccurate claim, but, perhaps drawing upon the hopeful naivete granted to the beauty pageant contestant, it would position her more sympathetically.  And the very odd reference to heterosexual marriage as “opposite” marriage would then be either a very canny or completely serendipitous gesture towards the deconstruction of the cultural norms she is presumably resisting.  The very grammar here resists being nailed down, keeps tailing off into near incoherence–and yet we kind of know what she is saying.  “And you know what, in my country, in my family, I think that I believe that a marriage should be between a man and a woman.”  If you are going to ask, and we’re just expressing our own personal, non-binding opinions–“And you know what”–in my country (an assertion about American “values”?  the imagining of her own, private, America?), in my family (defending the family as the ultimate source of values, a family values supporter; but, at the same time, an implicit recognition that there are many families, many different kinds of families, from each and every one of which would issue a different set of beliefs, perhaps even a different “country”), “I think that I believe” (this is probably just “stuttering,” a nervousness about finally getting to the point here, making sure that a couple of layers of subjectivity buffer her from her interrogators), “that a marriage should be between a man and a woman” (at this point, is her support for heterosexual marriage as the norm anything more than her assertion of her own intention to marry a man?–and yet it still manages to be “controversial”!).  No offense to anyone out there (precisely her attempts to buffer and defer her expression of her very personal and almost inescapable belief–it’s her family and country, after all–might generate resentment, so the more explicit neutralizing of resentment is perhaps even more necessary), but that’s how I was raised (there are root causes), and then the positively poetic “but that’s the way I feel it should be between a man and a woman.”  Probably, “that’s how I feel it [i.e., marriage] should be:  between a man and a woman,” but why not take her to be evoking some way of being, some transcendence of these degrading arguments, “between” a man and a woman (what is “between” them, connecting them, separating them?).

This is an idiosyncratic, even idiomatic “grammar,” produced by the intersecting pressures of the traditional woman in the modernized version of the traditional worship of femininity, beauty and fertility; the hyped, sensationalized, and yet by now strangely antiquated “beauty pageant”; and the virulent, punk, self-ironizing but no less Puritan political correctness of the “celebrity blogger” whose position as a judge is meant as a kind of revenge upon the beauty pageant from within; and/or, perhaps, an attempt to maintain its legitimacy by bringing it into accord with the very norms that make the pageant a kind of mini-scandal.

Perhaps it is in such cultural/syntactical anomalies that the possibilities of resistance and change will emerge–perhaps Ms. (Miss?) Prejean here is giving us an exemplary model of deferral by defending the traditional through the singular and ambiguous to the point of resisting hostile analysis, and therefore welcoming a sympathetic one.

April 27, 2009

Habit and Errors and Composition

Filed under: GA — adam @ 6:28 am

http://jcrt.typepad.com/jcrt_live/2009/04/habit-and-errors-and-composition.html
