GABlog: Generative Anthropology in the Public Sphere

August 31, 2017

The Modernity of Absolutism

Filed under: GA — adam @ 9:35 am

The notion of sovereignty reaches back, in various forms, to distant antiquity, as does the assumption that the monarch exercises complete power, unlimited by law; but the absolute right of kings, in the Western tradition, is only explicitly stated and defended in early modern Europe, by apologists for the absolutist sovereigns then emergent. Kingship was, through the Middle Ages, bound up with a whole network of rights and responsibilities that served to limit its power both explicitly and implicitly. At the same time, the king was the king, to whom all owed loyalty and obedience, so we could say there was some confusion there. In that respect, the establishment of absolute monarchies, along with theorists defending them, in particular Robert Filmer, served as a genuine clarification of sovereignty. You can define sovereignty in such a way as to subtract everything personal from it, and that may be how it looks from the outside, but rule, at any rate, must be personal. A decision can be disguised as a corporate affair, but ultimately someone has made it, and all human activities and institutions are the results of decisions. We could see all of the social sciences that replace decision with “process,” “structure,” “interaction,” and so on as evasions. And with good reason: to say that something happened as a result of a “process” means it’s out of our hands and we don’t have to fight about it. We could, then, see something moral in this evasion, insofar as it is a mode of deferral; but it is a marker of moral immaturity, like telling children the tooth fairy will come to make them stop crying—moral maturity would involve examining the ways we might best make our own decisions so as to preserve or reverse the decisions of the past.

But while Filmer’s argument for the absolute power of kings is exemplary, a genuine model of reasoning, contemporary absolutists find the modern absolutist monarchs to be highly problematic. They centralized power by using the “people” as a battering ram against the middle orders: the nobility that exercised countervailing power (they could withhold funds and soldiers needed for war) and the Church that considered itself entitled to determine the legitimacy of the king. In doing so they demolished the entire traditional moral order that situated individuals within institutions, with well-defined roles, and set us on the path where there is no public morality other than screaming for a larger and more intrusive state to grant more equality by punishing anyone who seems to believe that anything is more important than more equality. Was there another way the “clarification” of absolutism could have been accomplished, though? This is obviously a relevant question for those interested in a similar clarification today. Perhaps that’s the wrong question—after all, we can’t rewrite history. Maybe there was no other path then, but there are paths now. That would still mean we should learn from history, if for no other reason than to help us identify the preferred paths. What, exactly, do we think a more overt absolutist order would accomplish? If we could identify lots of things—ideas, institutions, practices—that are “in the way” of establishing absolutism, surely they are not all in the way in the same way, much less to be gotten out of the way in the same way. Absolutism implies some kind of centralization—what kind of centralization, then, does not require that all on the margins have exactly the same relation to the center? What kind of absolutism would preserve and even enhance differentiation and embeddedness?

Originary thinking provides us with a model for moral development. At the origin of humanity lies representation as the deferral of violence. There’s an object that everyone in the group wants; the fact that everyone wants it, and everyone knows that everyone wants it, makes everyone want it even more. They want it so much that the normal pecking order of the higher animal group breaks down—the alpha animal can fight off any single contender but is helpless against the simultaneous convergence of all upon the center. Some new means of restoring order is needed: that new means is the sign, in this case the gesture by which all members of the group come to “communicate” to one another that they will defer appropriation of the central object. We now have a configuration: we all pay attention to something at the center. We pay attention to it rather than trying to appropriate it, and language is our way of letting each other know that is what we are doing. We can imagine that the first, foundational, instance of deferral was very short—as Eric Gans suggests, maybe no more than a brief hesitation preceding a more orderly, or at least “framed,” shared consumption of the object. In that case, moral and human development would involve stretching out that moment of deferral: a group that could defer appropriation for a couple of minutes would be more “competent” than one that couldn’t defer for more than a few seconds. And then the group that could defer for an hour would be even more competent—and would find it easy to conquer the less continent groups.

This greater competence comes, in the first instance, from a greater control over reactions and the development of a greater range of responses to the actions of others: think about who would win a confrontation between someone who feels compelled to respond directly and completely to every insult, every slight, and someone capable of seeing those insults and slights as baits to which one is free to reserve one’s response. It also comes, though, from the greater differentiation of signs that results from sustaining, shifting and manipulating attention. Language is essentially us getting each other to pay attention to things. The group that can defer appropriation for an hour will use that time to talk about a lot of things—they will notice things about the object, about how it came into their possession, about one another’s relation, or mode of approach, to the object, about the difference between this scene and previous ones. The human vocation is to continue extending the act of deferral, ultimately to infinity. Remember the Greek proverb: call no man happy until he is dead. That proverb itself memorializes a history of deferral, through which, rather than seeing human life as bound up with the immediate, mimetically generated fears of rivals and ancestors and the constantly shifting “scorecard” in one’s struggles with them, it becomes possible to see a life as an ethical whole. But we could just as easily say “call no man happy until all the possible ways of understanding happiness have been exhausted,” which is to say, never. It’s very funny to watch some online disputes, for example in the comments sections of blogs, where commenters harangue, ridicule and sometimes even threaten each other, in a style that has its roots in oral communication, where one side bests the other right now to the acclaim of an audience. I can’t say for sure what works best for what purposes here and now (and I like a good meme as much as the next man), but eventually people will start thinking in terms of using these very extended lines of communication to intervene in long-term ways in broader communications and institutional networks. Some people are surely doing this already, and seemingly short-term strategies (like memes meant to humiliate) may very well be part of longer-term strategies. But that would mean you have trained yourself not really to believe in the meme you are deploying, except in the sense that you “believe” in the arsenal you are maintaining.

So, we can say, in a preliminary way, that the centralizing imperative of absolutism is better directed against the entrenchment of lesser modes of deferral and in favor of more extended forms. We can see evidence of the degree of deferral attained in the ways communities assign responsibility. A community that attributes a plague to a microbe that can be isolated in a pool of water used as a drinking source has attained a higher degree of deferral than a community that blames the plague on a priest’s failure to perform the prescribed ritual properly. This is not just a question of knowing that science provides us with truth and that rituals don’t really have any effect on the natural world. It’s a question of whether the communities involved have suspended their desire to assign responsibility so as to consider a range of possible “causes.” A community that blamed the plague on its own failure to maintain justice in its courts would be just as wrong as the community that blamed the priest, but it would be exhibiting a higher level of deferral because, rather than directing attention in the least resistant and most “satisfying” way, it would have thought in terms of distributing blame, and of finding a cure not in murder but in institutional repair. (Such a community would probably be more likely to find its way to some notion of “public health.”) Now, this approach doesn’t necessarily make for easy decision making and the determination of moral distinctions (we could imagine a very—but not infinitely!—patient and very merciless predator, for example), but these are the terms on which a serious moral discussion can be had, and we could say that in all uncertainty over decisions we could sort out the imperatives for extending deferral from those for collapsing it.

Liberalism and progressivism also claim to enable improvements in human behavior and social arrangements, but they don’t purport to do so by extending human deferral capacities. Both ideologies assert the possibility of downloading human morality into institutions—so, the “checks and balances” of liberal government will themselves restrict the violent tendencies endemic to human beings, or the “market,” given sufficient prosperity, will have the same effect. But the implication in both cases is really that advanced civilization is compatible with a reversal of previous tendencies and a decline in the capacity to defer. If one’s desires can be rerouted to objects readily available on the market, then domestication probably would be fairly easy—and a lot of study can be put into this rerouting. If you want to render human being a desert and call it peace, this is fine. The logical extension, as the social and medical sciences advance and intertwine, is to develop the optimal social and pharmaceutical “cocktails” to make the potentially problematic manageable. This process is obviously well under way. But this is also a kind of centralization, and it brings to bear social and medical developments that might have better uses. What would make those uses “better”? The only real argument is that since all actions, all scenes, involve someone occupying, albeit temporarily, the center, and others aligning themselves on the margins, which on closer inspection are little centers themselves, the absolutist wants everyone to be able to man their positions. Whatever enhances the ability of the individual to adopt a further increment of deferral—not take the quickest route to pleasure, not act out the most immediate resentment—is therefore to be preferred. Only in the course of making decisions within the fullest scope of your responsibility can you acknowledge the decisions made in the same way further up the chain of command.

The modernity of absolutism lies in the imperative to make delegation increasingly precise. Responsibility can always be more closely aligned with power. This involves the continual refinement of attention, the mark of a further increment in deferral. So, expecting the priest to stop the plague by carrying out a pre-determined ritual invites no refinement of attention. If the priest fails, that proves he is no longer worthy of being priest, and he should be replaced. Assuming the plague has a point of origin, and appointing someone to determine that point of origin, with that person in turn selecting those he wants to search the various sectors of the city, according to criteria best identified by those trained by those who know, having them report periodically, and reassigning them accordingly, etc.—here we see attention continually refined. Let’s say the guy charged with searching for the origin of the contagion is required to place ads in the media, which restrict the job to those bearing specific credentials, state the equal opportunity character of the hiring process, and especially encourage minorities and women to apply; must meet environmental and labor safety standards in carrying out his charge; must respect the property rights of those who might refuse him entry. He is clearly no longer sovereign, or a bearer of sovereignty, but that doesn’t mean that to be sovereign he must only hire his friends and relatives, that he should trample all over the accumulated culture of the city in the course of his search, that he should behave obnoxiously and imperiously toward those with interests in the city that will still be there once the contagion is over. He’s sovereign because he knows the best people for the job, and they know him and understand how important it is; because they all care about the city and are not just a bunch of hired hands who will get their paycheck and be gone tomorrow; because the people with property know their property best and want to help eliminate the contagion as much as anyone, and so cooperate with the guy who has the job in hand.

The progressives want the equal opportunity employment requirement, the environmental standard, the labor law (and the media that can interview anyone inconvenienced by the search, the special prosecutor who can look into laws violated during the emergency, the licensing board that can take away the guy’s right to lead such a search, etc.) because they don’t want him to be sovereign, and they don’t want anyone to be sovereign over him—they want to be in the fight for power themselves, spreading it and regathering it. They deaden attention, because you have to pay more attention to their rules, and therefore their power games, than to the task at hand—ultimately, it’s like dealing with the priest, as you have to figure out what kind of ritual performance will enable you to get to the next move. Insofar as that’s modernity, the absolutist is reactionary—the absolutist is ready even to see what the priest can contribute. I’ve been straw-manning the priest a bit—there was always more to divination than carrying out a prescribed, mechanical ritual. The priest undoubtedly “read” the community, and not just the innards: his practices were in fact a form of deferral, a way of delaying panic and providing for solidarity. He may turn out to be an intractable obstacle; he may interfere with efforts to solve the problem, since their success would discredit him; but why determine that in advance, just because he lacks credentials? He may know a lot more than he’s letting on. If you’re centralizing power, you should always start with and try to incorporate the existing chains of command. And you should always resist anyone clamoring for the removal of anyone from a position of power and authority for reasons other than their demonstrated inability to use that power to meet their responsibility (if the problem is that they need more power or less responsibility, you can see to that). But what all this means is that absolutism, as a political project, depends upon enough people working consistently to align power and responsibility, for themselves and others. For those with more power than you, read back to them their responsibilities by further refining the attention their delegation to you requires; for those with less power than you, dole out more power with each advance in adopted responsibility; for yourself, show a concentration of your powers dedicated to everything within your sphere of responsibility along with an absolute respect for other spheres. How many is “enough”? That’s unknown, but fortunately far fewer than a majority, at least to start turning things around.
