GABlog Generative Anthropology in the Public Sphere

May 29, 2009

Competence

Filed under: GA — adam @ 1:15 pm

For a while, “competence” has been a weapon used by the Left against Republican Presidents.  It began with the Dukakis campaign, I think, most immediately as a way of distracting attention from the candidate’s liberalism, and while it failed for a while, it has finally yielded fruit–certainly, the Bush Administration was effectively labeled “incompetent,” and the Democrats can present themselves, with lots of Ivy League technocrats who really want to run everything they can get their hands on, as competent.  It turned out to be a savvy strategy for a couple of simple reasons:  first, any modern administration is doing so many things that one will, at any point along the way, be able to point to dozens of “mistakes,” many of them egregious and harmful; and, second, the mass media, still the liberal, mainstream media even in these, its dying days, is much more interested in recording mistakes made by Republicans than by Democrats.  After all, what is the measure of such things–according to what “objective,” competently administered criterion of competence could one “rank” the Obama, Bush, Clinton, Bush and Reagan administrations?  All one can do in making a case is list a series of mistakes–by the time you get to 15 or 20, it looks pretty bad, even out of thousands of decisions, so it really becomes a question of who wants to make such lists and what you believe belongs on them (not to mention the problem of ranking more and less serious mistakes, mistakes which are repaired to some degree or another, mistakes made out of carelessness as opposed to tough choices that could have gone either way, etc.).  Along with the reasons I just mentioned, Leftists prefer these lists because the Progressive philosophy of governing insists upon expert administration as the test of legitimacy–if you see yourself, as an elected or appointed official, as akin to an engineer or doctor, then the number of serious mistakes becomes an important measure of your performance.  Conservatives rarely think to make such lists, because they are more interested in having the government do less rather than do it better–indeed, if the government does things, or can be presented as doing them, better, that provides a ground for having it take on more.   You would think this would make Democratic administrations vulnerable to charges of incompetence, but since no one really knows what the term means anyway, having lots of plans and being staffed by the type of people the media likes is good enough.

We could usefully trace at least one central strand of Progressivism to John Dewey’s argument that the scientific method should be applied to public and social life.  Rather than being driven by tradition and prejudice and constant shifts in public opinion, let’s explicitly identify “problems,” study the “causes” of those problems, try out “solutions,” and then measure the results of those solutions–exactly the way in which we would test a hypothesis in the laboratory.  Democracy, in that case, would depend upon the scientific method coming to replace traditional common sense in the public as a whole.  There are quite a few rather obvious problems here:  first of all, the inevitable split, which must persist even as the population becomes more educated, between experts and non-experts when it comes to social problem solving; second, the fact that “failed” experiments in the sphere of social life have lasting effects and can’t just be “scrapped” as in the laboratory; and, finally, the law of unintended consequences or, perhaps, Heisenberg’s indeterminacy principle, which dictates that the experiment itself will transform, in all kinds of unpredictable ways, the conditions that were to be tested in the first place. 

This is not ancient history–what else does one imagine Barack Obama meant by “restoring science to its rightful place” in his Inaugural Address?  On the one hand, it is a gesture to environmentalists, the abortion lobby and others; more generally, though, it is an assertion of the Progressivist philosophy of governance–why shouldn’t science dictate the way we organize health care, education, gun control, etc.?  There seem to be limits, though–has anyone proposed a “scientific” foreign policy?  Has gay marriage been formulated as a “scientific” question–would supporters accept as conclusive a study showing that children raised in gay marriages are less “well adjusted” than those raised in traditional marriages?  Of course not, and rightly so–even if one could scientifically determine the meaning of a term like “well-adjusted,” one couldn’t scientifically determine what portion of one’s “adjustment” is determined within the family, what portion by what is outside it, and what portion by the interaction between the two.  To put it simply, no one accepts a scientific accounting of their values.

So, progressivism is meaningless as an actual practice of government, but as an ethos of those who govern it is extremely powerful–what it refers to is not so much scientific practice as the rule of those of us who define ourselves as pro-science.  The secular, those who don’t identify themselves in terms of their obligations to any community, those who can comfortably present themselves as victims of religious, ethnic or bourgeois “prejudice”–for all of these, “science” is the default position, because in the mythology of modernity the anti-science position of the church and monarchy stands in for all the forces of reaction holding back economic, ethical and social progress.  Which brings us back to “competence” as a purely political term, similar to a more recently invented one like “reality-based.”

But a claim like “science is what scientists say it is” is not mere tautology.  The real meaning of competence is in performing the practices of some specialized community in a way recognizable by other members of that community.  An astrologer who stumbled upon the theory of relativity in 1895 would still have been “wrong,” or, more precisely, “not even wrong,” because no one in the scientific community could have done anything with that claim–it didn’t emerge out of some problem recognized by the community, some unanswered question or unresolved anomaly.  To be competent in such a community–and, I am saying, this is the only real meaning “competence” has–is to be able to recognize the relation between problems, questions and anomalies and the ongoing revision of the practices of the community or, as I would prefer, the discipline (a community which focuses on addressing a specific region of reality, a specific set of phenomena). 

In this sense, competence is extremely important, politically.  The hijacking of disciplinary authority for short-term advantage is scandalous because we rely heavily upon those who set aside immediate questions for the sake of what, in the words of Charles Sanders Peirce, “will prove true in the long run.”  But it will always be an ongoing temptation, because there can’t be any extrinsic authority governing the discipline–only those within it are competent to judge its workings, even while the results of work within many disciplines become increasingly valuable to the world.  Real conservative political thinking, at this point, would best direct its attention to finding ways to ensure that everyone stays within their sphere of competence–a concern that would mirror that evinced by the American Constitution for a separation and interaction of powers.  Those within one discipline ask questions of those within another discipline; consumers, voters and elected officials don’t interfere with disciplinary activity but choose the results of such activity that they prefer–that is the proper relationship.

But disciplines change and overlap with each other–new domains of “competence” emerge all the time, and can take advantage of the time-lag between their “discoveries” and the progress of other disciplines to pronounce arbitrarily upon all manner of things–academic disciplines like cultural studies are perfect examples:  they have a fairly sophisticated vocabulary that draws upon serious trends within modern thought, so they are capable of repelling criticism and attracting supporters–very few people are in a position to point out that they are essentially frauds.   At their best, disciplines are in between the sheer love of inquiry and conversation without bounds characteristic of the “amateur” and the rigor and accountability of the “professional”–indeed, one might say that disciplines start off amateurishly, pursuing some anomaly or taboo subject within an existing field, or separating an interesting question or problem from some craft or cult that has hitherto monopolized it, and then establish a vocabulary and idiom of inquiry that might some day freeze into jargon but will hopefully generate enough anomalies, paradoxes and antinomies to prevent that from happening. 

But we can open a disciplinary space any time, any place–whenever there is something not immediately visible that we feel could be seen if we had the right instruments or found the right “angle,” and we set aside differing interests and opinions in order to, jointly, see if we can find a way to see it–we have a disciplinary space.  In this case, a disciplinary space is an iteration of the originary scene–an iteration in relative safety, but somewhere in the back of the disciplinary foundation there is the sense of danger, the sense that order might give way to violence if we don’t find ways to see the same things.  And this lurking danger shows up in error–whenever we try to see something new, our old habits keep getting in the way, even if it was a kind of interruption in those old habits that led us to seek something new in the first place.  When we point at something together, there is no guarantee that we “see” the same thing, and the only way to check on that is by pointing to something else, which repeats the same problem, etc.  There is no guarantee that after several “sightings” in common, having assured ourselves that we see together, some “monstrous” divergence won’t disabuse us of that assumption (and in such cases, how do we know which is the error?).  We revert back from “seeing” to the idiom that enables us to talk about convergences and divergences, and even here there are no guarantees.  We simply gamble that the generative is better than the self-enclosed–whatever can produce more of itself, and in varied forms, seems preferable to anything hermetic and repulsive.

In this case, there can be a discourse of “competence” other than the “progressive” one, which takes the “administration” of “society” as its disciplinary object.  We can speak about habits, signs–ostensive and imperative–idioms, norms and error, and overlappings.  Any of us can be sufficiently self-reflexive to note where our extant habits are taking on new material; any of us can identify others whom we consider competent to judge our practices, and those who are competent to judge the results of our practices because those results fall into the region covered by their habits; we can position ourselves at the limits of others’ habits and point out–set up a disciplinary space aimed at pointing out–where they exceed their competence; and we can test, at the margins of practices, where norms get fuzzy and error and innovation get entangled.

The whole idea of a “mainstream” is un-American–far more normative for us are the conditions of the 19th century, when we were flush with con men, cults and debunkers, and it must have been hard at times to tell them apart.  The “mainstream” is an invention of progressives, a way of holding together the welfare state and Cold War belligerency.  Let FOX News cultivate some crazies; let the creationists have their conferences and densely argued and meticulously documented pseudo-academic treatises; and let the debunkers have at them. And maybe I was too hard on cultural studies a moment ago–once we see it as a specifically academic cult, with an affinity for other cults (UFO hunters, gay subcultures, conspiracy theorists), we can find a place for it as well.  But not in state-supported academies–a major project over the next few decades, probably far more important than any strictly political activity, will be circumventing and ultimately undermining the University as a source of authority and credentialing.  Employers should decide what they want their employees to be able to do, and then train them in the skills specific to that job, while relying upon academies that focus on requested skill sets to offer credentials testifying to the student’s ability to do x, y and z.  Lots of vocational schools, and lots of on-line education, then–but the Humanities need not suffer, since there is no doubt that advanced interpretive and communication capacities will have an important place in the economies of the future.  But there is a huge gap in that [employers] “should”–no one is competent to issue imperatives here.  Only the proliferation of disciplinary spaces on the margins of and outside of the University will fill in that gap.  For now, though, we can hammer away at tenure, in all its forms in all institutions–there is no more pernicious habit than that one. 

This is probably not the way in which most participants in the discipline of Generative Anthropology see it, but I would like to practice the originary hypothesis as a source of idioms of inquiry–a habit of prying loose new vocabularies and grammars from the anomalies within existing, especially decaying, disciplines.  It is the difference between iterating the gesture on the originary scene and assessing the results of that gesture.  Perhaps these are different competencies. 

May 22, 2009

Below the Threshold

Filed under: GA — adam @ 10:48 am

Desiring to educate myself in economics, so as to speak (and think) intelligently about current events, I have subscribed to the Mises email list, receiving a daily column.  I’ve been reading Hayek for a while, and I have been moving in an increasingly radical free market direction for some time, so Mises’ thought is extremely helpful.  I’m taking it in small doses now, waiting until I have some time to continue reading Mises’ Human Action.  I haven’t been disappointed, as I find the economic analyses very enlightening and congenial.  One lesson I have drawn so far, and which I will probably write about soon, is that one source of our present troubles, which is to say one manifestation of our victimary regime, is the taboo on wages going down.  We–most of us as individuals, and in our policy preferences as a society–would rather see high-paying jobs disappear because the companies in question go out of business than take pay cuts that would leave jobs intact at what would still be much higher wages than those of the jobs that will come in place of those lost.  This is very interesting.

My interests here are different, though.  Along with the economic analyses, I also receive political ones from the Mises list.  “Analyses” is a generous term here–they are mostly diatribes, as virulent and maniacal as any coming from the farthest reaches of the Left–the state (not any particular government), America in particular, is a parasite, a bloodsucker, a mass murderer, all of its enemies justified in their resentments.  In one recent animadversion on “imperialist” American foreign policy, rather sensible statements by American statesmen about the need to keep such policy buffered from (not completely immune to) democratic sentiment were treated as criminal–and this comes from a tendency within social thought that sees even the most democratically determined encroachment on private property as illegitimate.  Here, in fact, seems to be the link to the Left:  if the democracy is not on your side, argue fundamental rights; if constitutionally determined decision making is at times unpopular, argue for the absolute claims of the democracy. 

This dichotomy between reasonableness in some areas and craziness in others seems to me increasingly common now, and it leads to a kind of schizophrenia, at least for those of us who see the cultic Presidency of Barack Obama as a perhaps irreversible disaster.  For example, in my view, debates over the constitutionality and rationality of the Federal Reserve Board are legitimate and highly interesting–just as I don’t believe that anyone knows what the temperature will be in 2040, I don’t believe any economist can possibly know what the interest rate “should” be.  But state such views publicly, and whom will you find standing with you?  Some solid conservatives, of course, but also lots of conspiracy theorists, racists and anti-Semites.  Speak forcefully against the weakness of the West in the face of adverse demographics and the appeasement of Islamic supremacism, and (at least in Europe) you find yourself ranged alongside neo-Nazis.  Such positions are, it would seem, dead in the water:  even the rapidly disintegrating “mainstream” media finds easy pickings in such a target-rich environment. 

It may be that the standard modern Western political narrative, the people liberating themselves from benighted, tyrannical rule, has been monopolized by the Left.  For the narrative to work at this late date, you need a monomaniacal victimary focus–specific, universally recognizable victim-types you can keep in the public eye constantly.  Defenders of free markets and individual rights and liberty can’t compete, because you can’t represent the entrepreneur who never was because of hostile regulation, and even if you could, how sympathetic would he/she be?  The market doesn’t lend itself to scenic representation–the only people on the right who can make the effort to compete with the Left, in formal terms, are the ones who construct a symmetrical relation with the privileged groups on the Left–i.e., white supremacists claiming to be oppressed by the Zionist Occupied Government. 

Another example.  The popular blog Little Green Footballs has taken an interesting turn since the election.  Little Green Footballs, run by Charles Johnson, was, as far as I knew, a blog interested in focusing attention on denial of the rise of Jihad throughout the world.  Technologically skilled, Johnson excels in exposing frauds–most famously, the forged documents used in 60 Minutes’ story on George W. Bush’s National Guard service, but also some photoshopped photos of the Israeli war in Lebanon in 2006.  Since the election, Johnson has shifted his attention dramatically, while still posting on the depredations and denials of what he used to refer to ironically as the “Religion of Peace,” to exposing creationists and other “extremists” in the Republican party as well as links between self-declared “anti-jihadists” and fascism, neo-Nazism, racism and anti-Semitism.  I think Johnson should be seen as sincerely wanting the Republicans to become a party of what he would consider the “center”–that elusive fiscally conservative, socially liberal, national security hawk that would-be conservatives embarrassed by the pro-life, “fundamentalist” crowd always dream of.  What is interesting here is that no one seems to fit the bill–the creationists seem to be the only ones exhibiting principled opposition to Obama’s plans, taking an interest in what the Constitution actually says, or finding a center anywhere other than in the latest polls–and so Johnson ends up gesturing towards rather pathetic “moderates” like Arlen Specter and Olympia Snowe as the salvation of the Republican party (or, at least, pointing to the marginalization of such figures as the self-destruction of the party).  

In other words, there is no “center”–that is, no set of “actionable” opinions on central questions regarding the continuance of American civilization firmly shared by a solid majority of Americans.  For example, Obama can continue the war in Afghanistan and continue Bush policies on prisoners, but what he can’t do is cease scapegoating Bush, which deprives these policies of any principled basis, because the implication is that the policies need only continue until Bush’s “mess” has been “cleaned up.”  In other words, he can do what needs to be done only under the condition that we fantasize that it never really needed to be done in the first place.  But leave that aside:  everyone knows that Iran is a significant threat; and everyone knows that we will not do anything to prevent the Iranian regime from getting nuclear weapons.  We are allowing a fourth-rate military power, one deep in various economic and political crises, which could be severely disabled and perhaps tipped over at a fairly low cost, to elevate itself to an arbiter of affairs in the Middle East and beyond–what more proof is necessary that we are incapable of acting on basic observations and common sense?

So, I propose that we operate under the following assumptions:  first, that the modern Western political narrative has exhausted itself, and its husk can be left to the Left, who will run it out through their increasingly baroque parodies; and, second, that there is no available center at present.  We might as well connect the two claims–there is no center because the only narrative we know has been exhausted, and nothing has been put in its place.  Why else would the cultic Obama presidency be almost pathologically determined not to question a single shibboleth of Keynesian economics or of a Carteresque foreign policy predicated upon the de-centering of America?  Without that version of liberation from an imperial, free market, Babbittized America, the world simply wouldn’t make sense to these people.

But the narrative of liberation from obscurantist authorities is the only available narrative, the one upon which all public discourse is predicated.  Try to imagine a political discourse focused solely on strengthening the center:  upon an auditing of all institutions based upon their conformity with Constitutionally authorized purposes and the degree to which they remain within the area of their competence.  Such a politics would involve arguments and conflict, but the victim card would not determine the victor in these disputes–the idea would be to try out a model of the institution one believes to have drifted from its originary minimality, others would construct models out of the paradoxes inherent in your model, and these visible paradoxes would center conversation upon the proper version of minimality.  The problem with such a politics is that if you don’t point to a victim produced by the distortions in question, why should anyone care?  And if you do produce victims, not only will you not be able to compete with the more lurid victimary tales already circulating, but you have already lost the argument, because you won’t be able to claim that your approach necessarily offers a shorter path toward the elimination of that mode of victimization, or toward the most appropriate recognition of the victim in question.

So, I must be arguing for a politics that is largely below the threshold of visibility and publicity, and yet can cross that threshold on occasion–more precisely, on those occasions when the threshold itself is lowered so that private activities normally hidden from view become political.  A useful way to think about this is in terms of those moments when you see someone you know through carefully framed events–at work, or school, or some recreational activity–in a new context, one in which their behavior can’t help but deviate from the normal script.  This new context can be one other than the normal one–say, seeing your boss at a sporting event; or it can be an interruption of the normal one–say, an ordinarily stolid co-worker breaks down in tears at work.  What one is seeing in such cases is the eruption of an extrinsic habit into an established frame of reference, and this habit, in its own terms perfectly normal (there’s no reason the disciplinarian, by-the-book boss can’t come to the Red Sox game decked out in colorful and comical team paraphernalia, or the quiet, reliable co-worker shouldn’t be a deeply sentimental individual in his own time; and the proof of that is that we revise our habits so as to incorporate this new way of seeing that person), “scandalizes” the scene into which it erupts.  The result is error, that combination of embarrassment and revelation, from which we can’t turn away even as we can barely stand to look on.  I would like to suggest that this errancy–getting into the habit of interrupting others’ habits by having our own interrupted–is the best model for the present moment in our society and politics, when almost everyone is straying outside of their area of competence and exposing their unthinking habits on a regular basis.

Not only is errancy the best model for contemporary culture and politics, but it is originary, having a critical place on the originary scene.  If we accept that there must have been a first “signifier” on the originary scene, then we must also accept that the sign as put forth by this first signifier was merely “potential,” and therefore both sign (it is already iterable, and being iterated) and not-sign (it has not consolidated the scene, and therefore has not yet distinguished itself from the mimetic crisis it is “destined” to interrupt).  The most interesting part of the scene to think about, for me at least, is the process–or the wide range of possible processes–by which we could imagine the sign concatenating through the scene in a series of imitations (through fits and starts, through automatic mimetic instinct and through a shared vision of the imminent crisis, through glances at the responses of others present, aborted movements back to the center, etc.), very likely coming out of it looking very different from how it started, and with all memory of that origin (even on the part of the first signifier himself) lost, since only now is there a sign and event which makes memory possible.   I consider this concatenation an originary grammar of norm and error–each signifier in turn modifies (“distorts”) the sign as it has come to him while at the same time “packaging” (“correcting”) it for the next in turn.  

I am obviously now going to argue for a politics situated within this instant.  An auxiliary politics within an indicative culture.  I’ll first make a grammatical observation:  what makes a sentence meaningful is the presence of a “commanding name.”  The noun, ultimately a name or whatever stands in the place of the name, generates a sentence by commanding us to suspend or withdraw some command or demand (imperative) of our own pertaining to the space covered by the name.  In other words, adding the predicate to the noun situates the referent of the noun in “reality” and renders it inaccessible.  The noun, then, tells us to cease our demands and align ourselves with that reality. 

If the sentence lacks a commanding name, it doesn’t make sense.  Think about how many sentences with no nouns, only pronouns and other deictics, you would need before no two people could agree on what is being said.  I would suspect no more than 3 or 4 in most cases.  Now, while there can, in some languages (I personally don’t know how many), be sentences without verbs, and the first sentence might have been such, I’m going to (without argument, for here and now at least) insist that we don’t have real sentences until we have verbs.  And that verbs are, ultimately, imperatives.  So, the verb is the commanding name being commanded to stay in place, hold reality together, and command. 

Without the commanding name, all we have is a chaos of imperatives, interrogatives and ostensives–again, consider the example of a series of sentences without a noun.  What would keep it going, what would sustain the presence that we ultimately need to hold the world together–There it is.  What?  That–look!  Which one?  The one right there, between them.  Show me! Look in between the two that are in between those four… This is that instant where the sign has been put forth but not yet publicly “authenticated”–the “dialogue” can be held together by the shared presumption that there is something to look at, that there might and must be a commanding name, even if it is presently unavailable or withheld.  In fact, the more commanding the name, the more immense the reality it brings into being, the more we struggle to point to some part of it that will be verifiable as a part of it.

So, our culture of errancy is one in which the possible commanding names are presumed to be there but beyond our ken.  A politics modeled on this condition would involve intense adherence to something floating around as a possible commanding name, along with the attempt to bring others into the process of commanding the name to command, to mistake and norm that commanding name together.  This involves the creation of habits of finding “pregnant” names, obeying them, and issuing commands to others to mis-take those names.  This would create an indicative culture, a habit of composing sentences that remain very close and sensitive to the world of imperatives while nevertheless just barely transcending them, staying close to the boundary between pointing at something together and not making sense.  This would include a “chiasmatic” relation to public discourses, generating maxims through the reversal of existing maxims.  For example, I have heard Barack Obama’s favorite phrase, “we must reject the false choice between…,” so many times that I would like to create a new habit and guiding maxim out of inversions–say, “we must choose to falsify the rejection of the between,” because, in fact, the “between” is precisely what Obama systematically rejects, his “false choices” always being completely false themselves.  So, in choosing to falsify the rejection of the between, we open up the between as the arena of choosing.  It could use work, but that’s the beginning of a political maxim and habit.  Keep promoting and mistaking models as the falsification of the rejection of the between.

And I might as well have myself conclude by stumbling into one more thicket of mistakes, and argue for an “auxiliary” politics based upon the contemplation of the magnificent so-called auxiliary verbs.  Have, might, will, do, etc.–there are a lot of grammatical arguments here, but at the very least these are the verbs that can be followed by an infinitive without the “to.”  I find them intriguing because they are very difficult, if not impossible, to use as imperatives, which to me suggests their origins lie in the interrogative and in answers to questions–from expressions like “think you to come?” to “do you think you will come?” and from “fears and worries assail me” to “I am afraid and have been worrying”… The auxiliary opens up a space of freedom–rather than thoughts, fears and worries operating directly upon one, one entertains, considers, distributes those thoughts, fears and worries.  The auxiliary makes reality somewhat less imperative.  And we can create whole chains of them without quite tumbling over into senselessness in some splendid ways–I will have finished considering whether I might still have had something more that could have been said, I might hope, before having done with this sentence.  The auxiliaries command us to mistake the space covered by the name, generating a present with ample references to possible pasts and futures.  So, I’m not quite saying that we should use a lot more auxiliaries; rather, that the possibilities for vagueness and hence freedom, along with the capacity to sustain a series of switches between tenses, actions and persons, embodied in liberal use of the auxiliaries, should be our model for remaining just below the threshold.

Of course, one would be justified in requesting some examples.  I’ll get back to you on it.
