Self-Evidency

When we speak about the “arbitrariness” of the sign, someone usually hastens to add that what is meant by that is, of course, its conventionality. “Arbitrary” is the right word, though, for what is assumed: that the sounds we make in speaking the languages we speak could just as easily be any other sounds, with the evidence of this being the obvious fact that words for the same things are different in all the languages, not to mention the enormous differences in grammar. The more you think about it, though, the more problematic the claim is—how in the world could we imagine everyone in a community agreeing to confer meaning upon a particular sound that in itself has nothing to do with the meaning it bears? The political implications of “arbitrariness,” which we rightly associate with tyranny, are therefore relevant here: if the sign is indeed arbitrary, it could only be because it was imposed upon everyone by some oppressor. In this assumption about the sign, then, we can see the trajectory from Lockean social contract theory (Locke was a firm believer in the arbitrariness of the sign) to the contemporary Left—the arbitrariness of the sign, starting with medieval nominalism, and, indeed, social contract theory itself, were weapons against the assumptions about natural social order and natural law constitutive of Western Christendom. The arbitrariness of the sign is liberalism in linguistics and, in the end, liberalism (in the classical sense) has shown itself to share enough genetic material with the Leftism that succeeded it so as to leave it almost devoid of antibodies to fight the Leftist infection.

There is another liberalism, another Enlightenment, and another way of thinking about social agreement, though, which has been severely marginalized by the line leading from Locke through Hume and then Kant and Hegel (even the individualism of John Stuart Mill is ultimately derived from the German romanticism he imbibed through Coleridge). This other liberalism starts from the common sense philosophy of Thomas Reid, and can be followed through the American pragmatism of, at least, Charles Sanders Peirce, and is then strongly represented in the 20th century (in very different ways) by Hannah Arendt, Friedrich Hayek, Ludwig Wittgenstein and Michael Polanyi. The basic assumption shared by all of these thinkers is that we know far more than we know we know; that is, our knowledge is to a great extent, to use Polanyi’s term, tacit—and not merely because we haven’t yet brought it up to consciousness, but constitutively so. As the novelist Ronald Sukenick once wrote, “the more we know the less we know”—not only because knowledge continually opens up new vistas of possible knowledge, but more importantly because the ways we know what we know cannot be made part of the knowledge we make present to ourselves. Any “language game,” disciplinary space, or idiom takes a great deal for granted in addressing itself to a particular, emergent corner of reality; if it tries to bring that taken for granted bedrock into sight (and we do this all the time) it can only do so in terms of everything else that is still taken for granted, including some new things that enabled us to turn toward this new corner. Do you know for sure that you are at this moment present on the planet earth, that you are surrounded by buildings, streets, other people, etc.? “Know” is a very strange word to use here, which is not to say that we can’t really be sure—rather, what would be taken as “proof” that we are on the planet earth, surrounded by all those things? 
What would be better proof of this reality than the reality itself, as Wittgenstein liked to say? The question of how we know we are here, that we are ourselves, that we have bodies, that our senses integrate us into our surroundings, etc., is a very artificial one, but it’s that kind of question that modernity (and the dominant strand of liberalism) started with—most explicitly in Descartes, but Locke’s empiricism is ultimately no less corrosive of such self-evidency, as Hume revealed and Reid so forcefully demonstrated.

I have been writing much lately of mistakenness as constitutive of our linguistic and therefore social being, but it is equally true that there can be no mistakenness without certainty. I can only be mistaken in my articulation of an English sentence because I am certain that I am speaking English, not Chinese. If I’m completely out of place or out of line, it can only be because there is indubitably a place or line to be out of. Mistakes disrupt a scene because there is a scene to be disrupted, and we are certain that it has been disrupted; and while we can’t be certain that it will be restored, we can be certain about scenicity, without which there would be no mistakes. My argument has been that rather than evidence of the fragility of our worlds, mistakenness can be treated as evidence of their solidity. Assuming the arbitrariness of the sign intensifies the sense of fragility—if our common use of signs has just been imposed through some kind of force, human or natural, and, therefore, must continually be re-imposed, then of course deviation is dangerous. (For leftists, meanwhile, the consequence is that the arbitrariness of the sign encourages one to see reality as “constructed,” and hence infinitely malleable, in particular by those best at managing signs.) If signs, though, have an irreducibly iconic dimension, an iconic dimension that pervades every level of language, including semantics and grammar, then we just need to uncover the iconic meaning of a given mistake so as to bring it back within a reformed linguistic fold.

Isaiah Berlin, in his study of the determinist theories of history that undergirded socialist and communist politics in particular, made the point that you simply can’t remove the terms referring to human intentionality and, therefore, responsibility, from social and political discourse without making it impossible to refer to anything intelligibly at all. “He killed them” can’t be the same kind of statement as, nor can it be assimilated to, “e=mc²” or “historical development is determined by the forces of production.” It’s not just that such ways of talking are immoral or unjust; rather, it’s that they are not really “ways of talking” at all, and therefore can’t sustain themselves without inventing all kinds of crazy agents (like “history” and “society”) which perform “actions” which no one has ever seen or would recognize if they saw them. As originary thinkers, we can now say that this is because declarative, propositional meaning is rooted in the ostensive and imperative domains. We notice mistakes, in fact, because we can notice that our attention has been misdirected, which in turn reminds us that our attention is always being directed by everything we experience in reality.

The iconicity of meaning can be traced back to the gesture. The originary sign had to be gesture—it couldn’t be imagined as a sound, or a line drawn on the ground. Gesture is embedded in what we call “context,” that catch-all phrase we use when we reach the limits of our capacity to describe why something means how and when it does. A joke’s funniness might depend upon one of the listeners being where he is, and not a couple of steps to the left—that’s the kind of “contextual” effect we sum up with the phrase “you had to be there.” Gestures are also self-evident, in the sense that unlike propositional discourse, they cannot be replaced by their definition or explanation because they require the entirety of their “context.” The self-evidency of gestures also means that any normal human, initiated into any linguistic system whatsoever, would be able to make sense, on a gestural level, of the actions of any other human, from any other linguistic system—at least insofar as the gestures of the other are directed toward herself. On the most basic level, even though the meaning of gestures of course varies widely across cultures, we could recognize signs of aggression or good will directed towards ourselves, even if those signs could also be used to deceive us.

Self-evidency, though, provides no support for Enlightenment optimism regarding universal communicability and amity. Indeed, self-evidency is also what radically divides us. The members of another culture who deceive me by exploiting my awareness of the meaning of their signs of peace are able to do so because within their own gestural system, inaccessible to me, they can signify that this naïve hick is ripe for the plucking. That is, to act in concert against me they need no dark conspiracy, no secret agreement—they know each other, and they know when one of them is welcoming an other with an excessiveness that communicates irony to them but not to me; they know how to follow each other’s lead in ways that I won’t figure out until it is too late, they know that anyone who might object to their scheme is far away at the moment, and so on. Of course, once they are through with me, I will be able to understand what they have done, if I am still around to do so. All self-evidency “proves” is that any attempt to impose a common idiom will generate idiomatic sub-systems resistant to control, understanding, or even detection.

What we can do is enhance and elaborate upon overlapping idioms and habits so as to create broader spaces of attenuated self-evidency—the fact that we can do so is what makes human equality self-evident, even while the attenuation of iconicity is what introduces what is called (by Michael Tomasello, among others) the drift toward the arbitrariness of the sign. The self-evidence of human equality lies in our ability to complement the inclusive drift toward arbitrariness with new modes of iconicity, within language and in our social relations. It is such a process that has brought us from the egalitarian distribution of the most primitive communities to the more expansive gift economy and ultimately to the market economy where the need for a single measure for value leads us to the relatively arbitrary universal equivalent of the precious metals—and, yet, what could be more iconic than gold, signifying wealth? (The arbitrariness of fiat currency, meanwhile, is arbitrary in the bad sense—it measures nothing but the will of the central banker.) It is also such a process through which we can try to move conflicts from the category of exterminationist opposition to war with rules and some notion of honor; from war to arbitration—or from criminality to civil law, and from civil law to friendly disagreements settled informally. And we can engage in such civilizing processes without succumbing to the delusion that any of these categories will ever disappear once and for all.

People only support icons, not arbitrary signs—an argument in favor of human equality in general is meaningless; what can be meaningful is a particular example of human equality at stake. (Which is why we will never get past the “distractions” and focus on the “real issues”; but, there’s no need to worry because the real issues get addressed, always imperfectly and so as to produce new, and equally real, issues, through the distractions.) And icons can be incommensurable with each other, which is why there will always be conflict. Successful icons are those that provide a new ground for the struggle between icons, and those icons will have the character of rules in relation to the lower level ones; or, more precisely, they will embody the kind of deferral and intellectual flexibility associated with rule-following behavior, while still being exemplified by individuals acting alone and together. How can we support egalitarian distribution in sites like the family or other institutions devoted to close bonding and comradeship, while ensuring that any individual within that compact group is free to enter the market society? How can we sustain the norms of honor and shame needed to produce individuals ready to protect market societies from the enemies they will always produce in abundance, without nurturing fatal resistance to market society within its very bosom? The answer to such questions will always come, if it does come, in the form of some representative of a provisional, partial solution.

But let’s come back to the obvious: “dog” is “perro” in Spanish and “kelev” in Hebrew; ergo, the word can’t have any intrinsic relation to the referent—the sign is arbitrary, case closed. Things must look this way for the linguist, with single systems of language, and the amazing diversity of the world’s languages, laid out in front of them; and to the naïve language user, compelled by such examples to take the linguistic perspective. The fact that when a speaker of English says “dog” it rather self-evidently refers to the animal in question, that “doggy” seems to “fit” the specific animal we feel affectionately towards, seems to be a pretty slim counter-argument. But there could never have been a point at which the word “dog” was imposed upon an acquiescent community of language users; the word was always firmly embedded with all the other words in the English language, and the languages English in turn evolved from, and if there was a first time the word’s ancestor was used (there must have been, right?—we are committed to at least that assumption), then it was used in such a way as to best ensure its referential capacity and memorability; or, if the choice was random, if it worked, it was remembered in such a way as to do so. And there never could have been a time when it exited that orbit of self-evidency. 
The systematicity of language—the fact that words don’t stand alone, but take on their “value” from all their interrelationships with other words (so, “dog” takes on its meanings from its distinction from “cat” and “mouse” on the one hand, from “wolf” and “fox” on the other, and from more specific terms like “poodle” and “German Shepherd” on yet another)—makes the point even stronger—at no time was any word or “lexical unit” outside of the linguistic world experienced as a whole, a linguistic world itself always in direct contact with the real one, via the ostensives and imperatives which embed us in that world—and, anyway, the sound symbolism of language can be every bit as complex as the semantic and grammatical systems: we can assume here as well, not a one-to-one correspondence between single sounds and dictionary-style meanings, but overlapping and interconnected connotations, which in turn interact with semantics and grammar in various ways. To address the argument for arbitrariness head on, the claim that linguistic signs would imitate, in their formal character (the articulation of sounds comprising them) those things they refer to or those events they aim at generating doesn’t imply that there should be only one language—why wouldn’t there be as many ways of “interpreting” what “sounds like” “dog” as there are ways of interpreting any complex text? It would be better to speak of a drift towards abstraction, rather than toward arbitrariness: the sign is abstract, even the first one, which had to be normed in such a way as to supplement its self-evidence from the very beginning precisely because there was no single way of conveying the intent to cease and desist. 
But even in this case the abstractions we speak of are marked by the drift, by the disciplinary spaces that have constructed them: in other words, abstraction involves accentuation and abbreviation, which is necessitated by the entrance of outsiders for whom the particular version of the sign currently in use is not self-evident, while at the same time making the sign even more difficult for the next outsider to grasp. The abstraction of the sign, then, represents the disciplinary space (the shared inquiry into how to modify the sign so as to fit it for its new purpose) iconically, creating privileged and typical (unmarked) users, enabling the sign to attain self-evidency throughout the community.

I feel a strong need for a name for the politics of this marginalized liberal tradition, and the word “liberal” is not worth fighting over any more—especially since you’d have to fight the leftists who still use the term, the rightists who won’t give up on using the term to describe the leftists, and the libertarians who are very interesting but ineffectual semi-anarchists. The term I have been using on and off, “marginalism,” isn’t bad, but it sounds vaguely “oppositional,” and suggests a reactive rather than comprehensive politics. I would like to derive a name from the rereading of “arbitrariness” I am proposing here, which sees the arbitrariness of the sign as a kind of secondary iconicity, a commitment to the iconicity of the sign that realizes that we can only rely upon the icons generated through the scenes we constitute. Icons lose their primary self-evidency when outsiders arrive who don’t use the sign properly, because it isn’t self-evident to them, having their own self-evident semiotic system, and because ensuring the self-evidency of the sign to the primary community has made it idiosyncratic, or idiomatic. It is precisely this idiosyncrasy or idiomaticity that is, simultaneously and paradoxically, the ground of self-evidence: the shaped, complexly marked nature of the idiomatic sign is what makes it learnable through immersion in the scene. Common sense is, in turn, the meeting ground of these idioms, the discovery of overlappings.

I have thought about “plurality,” not in the sense of a diversity of ideas and lifestyles (pluralism) but in the sense of fundamental incommensurabilities in any community which tempt toward violence but can facilitate rather than interfere with living together. I want the sense of “sampling” that Charles Sanders Peirce associates with inquiry (any knowledge is knowledge of the relation between the proportions in a sample and the proportions in a whole)—the notion one can derive from the icon (not necessarily Peirce’s) of a continuous sampling of possibilities in any event (when you try something the first time what’s the proportion of visible supporters and opponents; and then the second, third and fourth times?) can ultimately lead to the conclusion that the generation of samples is itself the event. Politics in this case is about thinking and knowledge, but not knowledge which then guides politics—instead, the politics generates knowledge which can only be used within political action, as the provisional articulation of our tacit knowing. Alongside “sampling,” I considered “a politics of proportion,” which shares with “sampling” the relation between parts and the whole, while including the word “portion,” which reminds us of politics’ relation to dividing and sharing in some “equal” manner, and suggests a notion of politics as balancing and inclusive while still being interested, inevitably, in one’s “portion.” But “plurality” seems like a way of describing politics from the outside, from within thinking, and sampling is too “experimental,” by itself, suggesting the progressive sense of a “scientific” politics; “proportion” has the same problems, while another idea, “partiality” or “particularity,” evokes partisanship and identity politics rather than the notion of a whole that not only exceeds but can only be grasped through the parts which we are.

What I have for now, and will try out, is the neologistic (according to Merriam-Webster, neologism is either “a new word, expression or usage” or “a meaningless word coined by a psychotic”) “anyownness”—any, or “one-y,” evokes (for me, via Gertrude Stein) singularity but also plurality, since anyone is as any as anyone else; “own” replaces “one” (which is redundant here anyway), and can suggest one’s property, one’s ownership of oneself prior to and as a basis for property, the opacity of any’s “ownness” to others; I hope it can suggest that one’s ownness, one’s singularity and property, is (“constitutively”) bound up with that of others, hence maintaining the notions of proportionality, sampling—and marginality, in the specifically economic sense, i.e., that infinitesimal point at which one’s (or anyone’s) “weight” on a particular “scale” tips that scale in the opposing direction. A politics of anyownness, or of the anyown, then, is a politics of motivatedness: nothing is arbitrary, nothing is simply imposed, everything is exemplary and abstract, anyone can be the marginal representative of idiomatic common sense.

So, Next: The right of the anyown
