Desiring to educate myself in economics, so as to speak (and think) intelligently about current events, I have subscribed to the Mises email list, receiving a daily column. I’ve been reading Hayek for a while, and I have been moving in an increasingly radical free market direction for some time, so Mises’ thought is extremely helpful. I’m taking it in small doses now, waiting until I have some time to continue reading Mises’ Human Action. I haven’t been disappointed, as I find the economic analyses very enlightening and congenial. One lesson I have drawn so far, and which I will probably write about soon, is that one source of our present troubles, which is to say one manifestation of our victimary regime, is the taboo on wages going down. We–most of us as individuals, and in our policy preferences as a society–would rather see high-paying jobs disappear because the companies in question go out of business than accept pay cuts that would leave those jobs intact, at wages still much higher than those of the jobs that will come in place of the ones lost. This is very interesting.
My interests here are different, though. Along with the economic analyses, I also receive political ones from the Mises list. “Analyses” is a generous term here–they are mostly diatribes, as virulent and maniacal as any coming from the farthest reaches of the Left: the state (not any particular government), America in particular, is a parasite, a bloodsucker, a mass murderer, all of its enemies justified in their resentments. In one recent animadversion on “imperialist” American foreign policy, rather sensible statements by American statesmen about the need to keep such policy buffered from (not completely immune to) democratic sentiment were treated as criminal–and this comes from a tendency within social thought that sees even the most democratically determined encroachment on private property as illegitimate. Here, in fact, seems to be the link to the Left: if the democracy is not on your side, argue fundamental rights; if constitutionally determined decision making is at times unpopular, argue for the absolute claims of the democracy.
This dichotomy between reasonableness in some areas and craziness in others seems to me increasingly common now, and it leads to a kind of schizophrenia, at least for those of us who see the cultic Presidency of Barack Obama as a perhaps irreversible disaster. For example, in my view, debates over the constitutionality and rationality of the Federal Reserve Board are legitimate and highly interesting–just as I don’t believe that anyone knows what the temperature will be in 2040, I don’t believe any economist can possibly know what the interest rate “should” be. But state such views publicly, and who will you find standing with you? Some solid conservatives, of course, but also lots of conspiracy theorists, racists and anti-Semites. Speak forcefully against the weakness of the West in the face of adverse demographics and appeasement of Islamic supremacism, and (at least in Europe) you find yourself ranged alongside neo-Nazis. Such positions are, it would seem, dead in the water: even the rapidly disintegrating “mainstream” media finds easy pickings in such a target-rich environment.
It may be that the standard modern Western political narrative, the people liberating themselves from benighted, tyrannical rule, has been monopolized by the Left. For the narrative to work at this late date, you need a monomaniacal victimary focus–specific, universally recognizable victim-types you can keep in the public eye constantly. Defenders of free markets and individual rights and liberty can’t compete, because you can’t represent the entrepreneur who never was because of hostile regulation, and even if you could, how sympathetic would he/she be? The market doesn’t lend itself to scenic representation–the only people on the right who can make the effort to compete with the Left, in formal terms, are the ones who construct a symmetrical relation with the privileged groups on the Left–i.e., white supremacists, claiming to be oppressed by the Zionist Occupied Government.
Another example. The popular blog Little Green Footballs has taken an interesting turn since the election. Little Green Footballs, run by Charles Johnson, was, as far as I knew, a blog devoted to focusing attention on the rise of Jihad throughout the world, and on the denial of that rise. Technologically skilled, Johnson excels in exposing frauds–most famously, the forged documents used in 60 Minutes’ story on George W. Bush’s National Guard service, but also some photoshopped photos from the Israeli war in Lebanon in 2006. Since the election, Johnson has shifted his attention dramatically: while still posting on the depredations and denials of what he used to refer to ironically as the “Religion of Peace,” he now devotes himself to exposing creationists and other “extremists” in the Republican party, as well as links between self-declared “anti-jihadists” and fascism, neo-Nazism, racism and anti-Semitism. I think Johnson should be seen as sincerely wanting the Republicans to become a party of what he would consider the “center”–that elusive fiscally conservative, socially liberal national security hawk that would-be conservatives embarrassed by the pro-life, “fundamentalist” crowd always dream of. What is interesting here is that no one seems to fit the bill–the creationists seem to be the only ones exhibiting principled opposition to Obama’s plans, taking an interest in what the Constitution actually says, or finding a center anywhere other than in the latest polls–and so Johnson ends up gesturing towards rather pathetic “moderates” like Arlen Specter and Olympia Snowe as the salvation of the Republican party (or, at least, pointing to the marginalization of such figures as the self-destruction of the party).
In other words, there is no “center”–that is, no set of “actionable” opinions on central questions regarding the continuance of American civilization firmly shared by a solid majority of Americans. For example, Obama can continue the war in Afghanistan and maintain Bush’s policies on prisoners, but what he can’t do is cease scapegoating Bush, which deprives these policies of any principled basis, because the implication is that the policies need only continue until Bush’s “mess” has been “cleaned up.” In other words, he can do what needs to be done only on the condition that we fantasize that it never really needed to be done in the first place. But leave that aside: everyone knows that Iran is a significant threat; and everyone knows that we will not do anything to prevent the Iranian regime from getting nuclear weapons. We are allowing a fourth-rate military power, one deep in various economic and political crises, one that could be severely disabled and perhaps tipped over at fairly low cost, to elevate itself into an arbiter of affairs in the Middle East and beyond–what more proof is necessary that we are incapable of acting on basic observations and common sense?
So, I propose that we operate under the following assumptions: first, that the modern Western political narrative has exhausted itself, and can be left, as husks, to the Left, who will run it out through their increasingly baroque parodies; and, second, that there is no available center at present. We might as well connect the two claims: there is no center because the only narrative we know has been exhausted, and nothing has been put in its place. Why else would the cultic Obama presidency be almost pathologically determined not to question a single shibboleth of Keynesian economics or of a Carteresque foreign policy predicated upon the de-centering of America? Without that narrative of liberation from an imperial, free market, Babbittized America, the world simply wouldn’t make sense to these people.
But the narrative of liberation from obscurantist authorities is the only one available, and the one upon which all public discourse is predicated. Try to imagine a political discourse focused solely on strengthening the center: upon an auditing of all institutions based upon their conformity with Constitutionally authorized purposes and the degree to which they remain within the area of their competence. Such a politics would involve arguments and conflict, but the victim card would not determine the victor in these disputes–the idea would be to try out a model of the institution one believes to have drifted from its originary minimality; others would construct models out of the paradoxes inherent in your model, and these visible paradoxes would center conversation upon the proper version of minimality. The problem with such a politics is that if you don’t point to a victim produced by the distortions in question, why should anyone care? And if you do produce victims, not only will you be unable to compete with the more lurid victimary tales already circulating, but you will have already lost the argument, because you won’t be able to claim that your approach necessarily offers a shorter path toward the elimination of that mode of victimization, or toward the most appropriate recognition of the victim in question.
So, I must be arguing for a politics that is largely below the threshold of visibility and publicity, and yet can cross that threshold on occasion–more precisely, on those occasions when the threshold itself is lowered so that private activities normally hidden from view become political. A useful way to think about this is in terms of those moments when you see someone you know through carefully framed events–at work, or school, or some recreational activity–in a new context, one in which their behavior can’t help but deviate from the normal script. This new context can be one other than the normal one–say, seeing your boss at a sporting event; or it can be an interruption of the normal one–say, an ordinarily stolid co-worker breaking down in tears at work. What one is seeing in such cases is the eruption of an extrinsic habit into an established frame of reference, and this habit, in its own terms perfectly normal (there’s no reason the disciplinarian, by-the-book boss can’t come to the Red Sox game decked out in colorful and comical team paraphernalia, or the quiet, reliable co-worker can’t be a deeply sentimental individual in his own time; and the proof of that is that we revise our habits so as to incorporate this new way of seeing that person), “scandalizes” the scene into which it erupts. The result is error, that combination of embarrassment and revelation from which we can’t turn away even as we can barely stand to look on. I would like to suggest that this errancy–getting into the habit of interrupting others’ habits by having our own interrupted–is the best model for the present moment in our society and politics, when almost everyone is straying outside their areas of competence and exposing their unthinking habits on a regular basis.
Not only is errancy the best model for contemporary culture and politics, but it is originary, having a critical place on the originary scene. If we accept that there must have been a first “signifier” on the originary scene, then we must also accept that the sign as put forth by this first signifier was merely “potential,” and therefore both sign (it is already iterable, and being iterated) and not-sign (it has not consolidated the scene, and therefore has not yet distinguished itself from the mimetic crisis it is “destined” to interrupt). The most interesting part of the scene to think about, for me at least, is the process–or the wide range of possible processes–by which we could imagine the series of imitations (through fits and starts, through automatic mimetic instinct and through a shared vision of the imminent crisis, through glances at the responses of others present, aborted movements back to the center, etc.) through which the sign would concatenate across the scene, very likely coming out of it looking very different from how it started, with all memory of that origin (even on the part of the first signifier himself) lost, since only now is there a sign and event which make memory possible. I consider this concatenation an originary grammar of norm and error–each signifier in turn modifies (“distorts”) the sign as it has come to him while at the same time “packaging” (“correcting”) it for the next in turn.
I am obviously now going to argue for a politics situated within this instant. An auxiliary politics within an indicative culture. I’ll first make a grammatical observation: what makes a sentence meaningful is the presence of a “commanding name.” The noun, ultimately a name or whatever stands in the place of the name, generates a sentence by commanding us to suspend or withdraw some command or demand (imperative) of our own pertaining to the space covered by the name. In other words, adding the predicate to the noun situates the referent of the noun in “reality” and renders it inaccessible. The noun, then, tells us to cease our demands and align ourselves with that reality.
If the sentence lacks a commanding name, it doesn’t make sense. Think about how many sentences with no nouns, only pronouns and other deictics, you would need before no two people could agree on what is being said. I would suspect no more than three or four in most cases. Now, while there can, in some languages (I personally don’t know how many), be sentences without verbs, and the first sentence might have been such, I’m going to insist (without argument, for here and now at least) that we don’t have real sentences until we have verbs. And that verbs are, ultimately, imperatives. So, the verb is the commanding name being commanded to stay in place, hold reality together, and command.
Without the commanding name, all we have is a chaos of imperatives, interrogatives and ostensives–again, consider the example of a series of sentences without a noun. What would keep it going, what would sustain the presence that we ultimately need to hold the world together? “There it is.” “What?” “That–look!” “Which one?” “The one right there, between them.” “Show me!” “Look in between the two that are in between those four…” This is that instant where the sign has been put forth but not yet publicly “authenticated”–the “dialogue” can be held together by the shared presumption that there is something to look at, that there might and must be a commanding name, even if it is presently unavailable or withheld. In fact, the more commanding the name, the more immense the reality it brings into being, the more we struggle to point to some part of it that will be verifiable as a part of it.
So, our culture of errancy is one in which the possible commanding names are presumed to be there but beyond our ken. A politics modeled on this condition, then, would involve intense adherence to something floating around as a possible commanding name, along with the attempt to bring others into the process of commanding the name to command, to mistake and norm that commanding name together. This involves the creation of habits of finding “pregnant” names, obeying them, and issuing commands to others to mis-take those names. This would create an indicative culture, a habit of composing sentences that remain very close and sensitive to the world of imperatives while nevertheless just barely transcending it, staying close to the boundary between pointing at something together and not making sense. This would include a “chiasmatic” relation to public discourses, generating maxims through the reversal of existing maxims. For example, I have heard Barack Obama’s favorite phrase, “we must reject the false choice between…,” so many times that I would like to create a new habit and guiding maxim out of inversions–say, “we must choose to falsify the rejection of the between,” because, in fact, the “between” is precisely what Obama systematically rejects, his “false choices” always being completely false themselves. So, in choosing to falsify the rejection of the between, we open up the between as the arena of choosing. It could use work, but that’s the beginning of a political maxim and habit. Keep promoting and mistaking models as the falsification of the rejection of the between.
And I might as well have myself conclude by stumbling into one more thicket of mistakes, and argue for an “auxiliary” politics based upon the contemplation of the magnificent so-called auxiliary verbs. Have, might, will, do, etc.–there are plenty of grammatical arguments to be had here, but at the very least these are the verbs that can be followed by an infinitive without the “to.” I find them intriguing because they are very difficult, if not impossible, to use as imperatives, which to me suggests that their origins lie in the interrogative and in answers to questions–from expressions like “think you to come?” to “do you think you will come?,” and from “fears and worries assail me” to “I am afraid and have been worrying”… The auxiliary opens up a space of freedom–rather than thoughts, fears and worries operating directly upon one, one entertains, considers, distributes those thoughts, fears and worries. The auxiliary makes reality somewhat less imperative. And we can create whole chains of them, in some splendid ways, without quite tumbling over into senselessness–I will have finished considering whether I might still have had something more that could have been said, I might hope, before having done with this sentence. The auxiliaries command us to mistake the space covered by the name, generating a present with ample references to possible pasts and futures. So, I’m not quite saying that we should use a lot more auxiliaries; rather, that the possibilities for vagueness and hence freedom, along with the capacity to sustain a series of switches between tenses, actions and persons embodied in liberal use of the auxiliaries, should be our model for remaining just below the threshold.
Of course, one would be justified in requesting some examples. I’ll get back to you on it.