GABlog Generative Anthropology in the Public Sphere

September 24, 2009

The Human Condition: A Commentary on Originary Signification

Filed under: GA — adam @ 9:46 am

Any functional sign must involve the following:


1)  The possibility of being a lie (I borrow this from Umberto Eco’s A Theory of Semiotics).  There are better ways of putting this, as “lie” presupposes a declarative, an assertion about something in the world independent of the person making the claim.  So, when I shake someone’s hand, I am not exactly telling the truth or lying; the affirmation or gesture precedes the proposition.  But in a sense I am—my handshake can be sincere, or I could be proffering my hand so as to disguise my irreconcilable enmity towards you.  The originary scene itself is, indeed, beyond truth and falsehood—that some central object is indicated is simply constitutive of the scene; to put it another way, no object, no convergence of attention, no scene, and therefore no lie.  But that being beyond truth and falsehood will never happen again, precisely because of the scene—any future gesture could be a deception.  And the deception could only work because of the absolute trust that must have prevailed on the originary scene because there, in the phrase I consider prior to the truth/lie binary, one and then each stood as surety for the presence of the object.  Every sign, to be meaningful, or to work, must have its audience presuppose someone to stand in surety for some material or immaterial object of the sign.  Not necessarily a referent, or even a signified, but the possibility of a gathering of attention around some “thereness” to attend to.


2)  A prayer to the central presence or intelligence.  A prayer is an imperative, however politely or supinely put, to the central intelligence—most elementarily, to save or protect the supplicant.  But this demand implies the duty to obey the center; so, the subsequent prayer or, really, continuation of this same one, is a demand that the central presence provide guidance in fulfilling a divine command.  This more articulate prayer recognizes a dominion under divine sovereignty, wherein the divine command must be shared, applied and interpreted.  In claiming the invocation as a condition of intelligibility, I am pointing to the regular, or grammatical element of semiotics.  Whatever the rules in any language or idiom, I must follow them; but what are rules other than the way a particular interplay of imperatives back and forth from the center has unwound?  If I am on the verge, say, of coveting something of my neighbor’s, and I hear God telling me not to, and I beseech God to give me guidance, and I discover a way of redirecting my attention so that I covet no more, a habit and therefore a preliminary grammar is in place.  If someone then trusts me enough to ask me to help them find the way in a similar circumstance, I can present my discovery, and they will have to implore God to help them find their own way, analogous to mine—and my grammar has been transmitted, which is what really makes it a grammar in the first place.  I don’t think it’s any different with things like word order, conjugation, inflection, etc., in words and sentences—they are all habits by which imperatives have been moderated and woven into a transactional fabric where they intersect with other, often contrary, imperatives.  The equivalent on the originary scene is each of us looking at all the rest of us and ascertaining that a rule of interaction supplanting the uncontrolled surge toward the center has emerged.  
To put it simply:  conscientiously following the rules, including the construction of ideal or model modes of rule following, is a form of prayer and faith that the right or needful thing to do when the rules fall short will be made present to me.  And it is a reasonable faith, because when the rules fall short, the tacit rules undergirding the overt ones, which are the imperatives we have so thoroughly embedded as to have forgotten, and which have been  preserved in the overt ones, are there as back-up.


3)  A hypothesis regarding how my audience or interlocutor will respond to my sign.  This is my misreading of C.S. Peirce, whom I take to be claiming that the meaning of a sign is all of those consequences you can imagine following your issuance of the sign.  This hypothesis must be internal to the sign itself, it must emerge with the sign.  That is, I don’t hypothesize and then issue the sign, or issue the sign and then hypothesize (one could only hypothesize with signs, after all).  The hypothesis is the sign:  whatever presence needed to be filled (signs wouldn’t be issued if some presence did not need to be filled, because sheer absence can only mean terror and extinction, whether experienced on a personal or collective level), I first put forth my sign with an inchoate sense of attempting to fill it, and as the sign is composed, and I get glimpses of its reverberations and possible mistaking, it seems to be more or less likely to provide that space with presence, to indicate that the need was in fact other than I took it to be and so my sign must be redirected or to the extent possible withdrawn, or that the sign will require supplementation which it must somehow be composed so as to solicit, and so on.  The “proof” of this hypothetical element of the sign is that when I “understand” a sign, what someone tells me, I am aware that I have been inscribed within it, that it has anticipated me and that it requires something of me.  It is up to me to render it meaningful or meaningless—it has predicted, or bet, that I would make it real.


If we have no “human nature,” then, we can have, as Hannah Arendt (who contended that for us to claim to know our own nature would be like trying to leap over our own shadow) asserted, a human condition—a set of possibilities and limits, always distributed differentially across individuals and history.  We must guarantee, and demand guarantees of, reality; we must follow (and be followed by) rules, more or less “religiously,” and insist that others do so as well; and we must anticipate, speculate, project and hope, while interfering with such on the part of others.  At our best, we preserve, within our signs, these diverse vocations, and occasionally even repair the damage that is constantly done them through resentment of our humanness; at our worst, we arbitrarily assign one priority over the other, or even betray any or all of them.

September 16, 2009

Hunters and Craftsmen

Filed under: GA — adam @ 6:22 am

I’ve just finished reading Thorstein Veblen’s The Theory of the Leisure Class. Obviously, I can’t claim that this puts me in the vanguard of anything, but I found his organization of economic analysis around the categories of, on one side, “invidious distinction,” and, on the other side, the “instinct of workmanship,” very provocative.  Economic life is organized around “invidious” distinctions when human life is predatory:  based on hunting, war and conquest.  Under such conditions, some men gain possessions and reputations that place them in a superior position to other men, and the way they manifest this superiority is through conspicuous leisure:  doing lots of things that serve no utilitarian purpose and, indeed, flaunt their contempt for utilitarian purpose.  For me, the analysis gets interesting when Veblen associates players on the market, or those driven by “pecuniary” interests, with the class of “predators,” and hence “archaic” by the standards of a “modern industrial” society.  He thereby places the entrepreneur, banker, broker, etc., at odds with those driven by the instinct of workmanship, who are interested in working out and applying causal relations: scientists, engineers, etc.  The economic figures driven by pecuniary interests are, then, simply hunters and warriors in a new, quasi-peaceful guise.  As, of course, are “administrators,” i.e., bureaucrats and the government.  


I suspect that a quick look at a transcript of some casual conversation among Wall Street brokers would confirm the plausibility of this classification, as does our use of terms like “robber barons” to describe the great corporate founders of the nineteenth century, idioms like “make a killing,” etc.  For Veblen, exchanges on the market are indistinguishable from fraud, an essentially predatory relation to others—all lines separating fraudulent from legitimate exchange are essentially contingent and pragmatic.  Are we so sure we could say he is wrong about that?  He also associates gambling with the predatory disposition, and it is easy to wonder how much of our current economic crisis, especially that part attributable to the mysterious “derivatives,” is a result of nothing more than very high stakes gambling (with other people’s money, of course).  What is further interesting in Veblen’s account is his classification of Christianity as a religion grounded in the leisure class:  in this case, God is king/conqueror, and worship of Him, with its incessant emphasis on His infinite power, is the vicarious leisure of the servant class.  Veblen has quite a bit of fun with the clothing worn by priests, the architecture and decoration of Churches, and so on, in the process of establishing this claim.  Charity and philanthropy, further, fit into this characterization:  they are more conspicuous leisure, dedicated to promoting the honor and value of the benefactor warrior/king. 


Of course, those familiar with GA will notice several things here, which might be invisible to others.  First of all, we know that all human existence is based on “invidious distinction,” which enables us to reverse Veblen’s hierarchy of the two economic types.  Veblen argues that the “instinct for workmanship” is the more originary trait, characterizing human existence at a more primitive and peaceful stage, while the predatory element in human existence comes much later, and is ultimately a mere variant of the former.  For us, social relations based on invidious distinction are also based on the shaping of such distinctions into such forms as mitigate the inter-communal violence they would otherwise incite—if there were nothing but invidiousness, there would be no community at all.  The instinct of workmanship, meanwhile, we can easily locate in the esthetic element of the originary gesture, which is there from the beginning, as the gesture needs to “propose” some symmetry or harmonization of the group in order to take hold, but is nevertheless secondary to the felt need to interrupt the imminent violence itself.  This also means that the instinct for workmanship involves, first of all, a social relation between the maker and his/her fellows, rather than the direct relation to his/her materials and the manipulation of the causal relations articulating them, as Veblen would have it.


We could, further, identify Veblen’s account of the predatory/pecuniary interest with what we can call the “Big Man” stage of history—a stage of history which we have by no means exited (indeed, modern constitutionalist politics and the free market aim at harnessing Big Men more than at eliminating them), as Veblen, along with so many others, fervently hoped.  His discussion of Christianity and monotheism more generally is illuminating in this connection, since both Judaism and Christianity are invented as responses to the unimpeded rule of Big Men and the imperial moralities such rule generated.  If, as Eric Gans has argued, the centrality of scapegoating to social order only holds true for communities thus organized, then faiths predicated upon a repudiation of the scapegoating morality of the Big Man presuppose his continued existence (and periodic chastisement).  If the total replacement of the Big Man as a social phenomenon by esthetic, conciliatory gestures and the reality revealed by norms of scientific inquiry (the instinct for workmanship) were to occur, then it would make perfect sense to assume that the monotheistic faiths would fade into oblivion. 


I’m not going to argue for the impossibility of such a development here—I’ll just say that the invention of the Big Man as an occasionally necessary medium of social deferral (albeit elected and subject to recall and liable to criticism and disobedience, or subject to the discipline of the market and the threat of bankruptcy) can no more be revoked than the invention of nuclear weapons.  I’m more interested in the implications of Veblen’s classification of entrepreneurial and financial activity for the mode of economic theory I’m most interested in now, the Austrian theory of Mises and Hayek.  I assume that these thinkers, and those of their “school,” would vigorously repudiate Veblen’s claim:  for these free market thinkers and advocates, there is nothing more peaceful than the activity of exchange:  indeed, exchange is the antithesis of violence; it is what we do once we have successfully suppressed violence as a factor in human relations.


I want to explore the possibility that Veblen is right, and they are wrong—and the plausibility of this hypothesis lies not only in the very structure of competition, in which you can win just as easily by disabling your opponent as by improving yourself, and not only in the enormous destruction which can be deliberately wrought in the financial arena, but also in the very evident attitude of those who operate there, which seems to be one of obligatory triumphalism, machismo, threat, bluff, swagger, etc.  (Here, we would have to distinguish between those entrepreneurs who are closer to the workmanlike aspects of the job and those closer to the financial dimension—but no entrepreneur could indefinitely avoid the latter aspect.)  (In a similar vein, the Austrians like to believe that private property rights derive from occupancy and/or use of territory or object—but doesn’t it make more sense to say the property was first of all what one could take, defend, and persuade others to accept as a fait accompli—and that rights then emerged to mediate between property owners?) I also reject Veblen’s assumption that this position is obsolete.  So, if the pecuniary/predatory is here to stay, and is inseparable from a proper understanding of freedom, how do we incorporate that into our economic, ethical and cultural analyses?


Every commercial community must come to terms with the distinction between fraud and fair exchange—it is inevitable that such a distinction be made simply because even dealing among merchants would become impossible otherwise.  Even for purposes of fraud, reputation as a fair dealer is essential (Veblen associates “honor” with predatory/pecuniary fields; indeed, of what relevance is “honor” to an engineer, architect, dentist or plumber, except insofar as we confront them as merchants—we can see their work for ourselves); and you can only gain such a reputation if “fair dealer” has some shared meaning.  Such a distinction is inevitably rough and relative—there are a lot of things that could interfere with the fulfillment of a contract that couldn’t have been anticipated, whereas the parameters of expectations for the “workman” I just mentioned parenthetically can be much more tightly drawn.  The levels of required trust and acceptable risk will be drawn differently under different conditions—again, most unlike the standards of good workmanship:  the good dentist or carpenter is good in Boston or in Moscow, and their clients will be able to distinguish their work from more shoddy varieties.


This line will be drawn, like all lines, by events:  in the midst of a commercial culture given over, or in danger of falling into, general fraudulence and corruption, someone and/or some group will come to exemplify fair practices.  The establishment of fair practices would first of all be negative—we don’t do all the things our competitors do.  But it would eventually become subject to verifiable norms, and embedded in relatively transparent practices, and advertised as an intrinsic part of your experience, as a customer, with that business.  The fair dealers would seek each other out and, I think, would be genuinely “authenticated” by the business community and circle of customers once they had weathered some storm—once they had, for example, refused the compromise involved in obtaining some government sponsored monopoly, or abstained from participating in some boom or panic that wiped out other businesses, and ended up intact, perhaps even stronger, precisely due to the values implicit in their “fairness.”  Again unlike workmanship, though, where skills may deteriorate, but in fairly predictable ways, the “capital” of “fairness” can erode rapidly, and often as a result of what seemed at the time to be inconsequential decisions (cutting a corner here, lobbying the government there, when things got a little rough…). 


It further seems to me that the creation of such a capital fund of fairness will be in inverse proportion to government involvement in establishing and enforcing norms.  The government’s secondary function, indeed (second only to preventing violent assaults on citizens’ rights), is the prevention of fraud, which violates the sanctity of contracts.  But such a task would prove impossible to perform, or at least perform adequately, if standards of fairness had not already evolved within the commercial community itself, so that the government is essentially policing the margins of the community in accord with the norms of the community itself—it’s very hard to see on what basis the government (government lawyers, to be more precise—yet another set of predators who would need to establish a set of internal norms) could generate such norms in a non-arbitrary way.  But, of course, the government’s role will also be established through events—for example, through its protection of some “fair dealer” in danger of being scapegoated within the commercial community.  The relationship between business norms and the legal system, then, is an index of the moral health of the economy; and the moral health of the economy is itself an economic “factor”:  certainly, much wealth is lost to corruption and fraud, and gained by fair dealing.


In this way, the Christian morality that has emerged and sustained itself as a check on predatory Big Men (think of how focused both Judaism and Christianity are on the “haughty”), could become an economic value in its own right—perhaps one we could even learn to calculate.  Surely some economist could (or, for all I know, already has) invent a formula for determining the value of the moral economy (of course, we would need to be anthropologists to devise measures for the moral economy).  What is x number of people willing to leave cutthroat firms when those firms cross the line and become, not “community organizers,” but more honest versions of the business they have “exodused” from, “worth”?  Or x number of individuals willing to form companies in which their own money is at stake, instead of playing only with others’?  These are challenging questions, because below a particular threshold above which there would be enough of such firms to survive and impact the economy, their worth would be zero.  Even more challenging is determining which other, only indirectly economic elements of the culture would comprise a moral economy making such thresholds attainable.  We’re not just talking about honesty or altruism here—rather, “fair dealing” involves the ability to create, revise, and continually re-interpret, on the ground, in conjunction with others, sets of rules that are largely tacit.  Distinguishing between those who preserve and adhere to the rules so as to skew them in their direction and those whose actions always preserve a residue aimed at enhancing and refining the rules is a skill acquired, like any skill, through practice. 


The grammar of rules might be sought in a seemingly strange location.  Rules are difficult to describe—even the ones we follow flawlessly and thoughtlessly.  Indeed, the thoughtlessness is the problem—analytically, not necessarily morally. Rules always have a tacit dimension—if you ask someone (or yourself) how you follow the myriad rules you do follow to mediate all your daily interactions, you must either simply “point” to what you do and rely upon your interrogator’s own intuitions as a rule follower to understand; or, find a way to point to another set of (meta) rules which tell you how to follow the rules in question—but, then, how do you follow those rules?


A few posts back, I defined imitation as the derivation of imperatives from a model.  Iteration, meanwhile, derives from the response you get when you issue an imperative to your model in return, demanding that he/she show or tell you how to obey the previous imperative, subsequent to an inevitably failed attempt.  That initial attempt must fail because you will still be insufficiently like the model, hence indicating some portion of the imperative left unfulfilled.  To demand of the model another imperative, now part of a series (his implicit one to you, yours in return, and now his again), is to now treat the model as him/herself subject to imperatives, which he/she could convey intelligibly.  In that case, the two of you share the same source of imperatives; but this further means that part of the imperatives this newly revealed shared center issues involves articulating the imperatives each of you receives with those the other receives.  Hence, the birth of rules, which call upon one to act in such a way as to coordinate unknown acts along with everyone else. 


One is always within rules, but one becomes aware of the rules when they become problematic, and they become problematic when one must narrow them down, in a single case, to an unambiguous imperative—what must I do right here and now?  It is then that the origin of rules in a center issuing imperatives that must be shared becomes evident because one must then ask the center for guidance.  This, it seems to me, is the structure of prayer, which would mean that learning how to follow the “spirit” of rules means learning how to pray.  (“God, give me the wisdom to understand your will…”) (For Veblen, this is the kind of situation the “instinct for workmanship” could never lead us into.)  And in the monotheistic or, perhaps (I’m not sure where Islam is on this), anti-“haughty” faiths, such prayers would take on the greatest urgency in situations where one’s desire is to abuse the position of the “Big Man,” usurp that position, elevate oneself by discrediting the existing one, fantasize oneself as Big Man, or create a negative Big Man who will serve as the “cause” of some present crisis; but, also, where one’s desire is intertwined with the emptiness of the Big Man space, or the inadequacy of its current occupant—where one may need to help prop it up, in other words, but where such a need edges imperceptibly into these more sinful desires. 


Humbly demanding that the center, the iterable source of rules, or the “central intelligence,” come through with a clear imperative at such moments is the heart of the proper creed of our commercial civilization.  If we recognize that our entrepreneurial class is composed, not of pacific servants of others unreasonably harassed by the predatory state but, with all the good they do, of actual and budding Big Men (who, of course, seek commerce with Big Men in other realms), thereby adding a political component to the economy, then we can find the economic value in the prayerful state that seeks a middle between haughtiness and debasement.  This middle would also turn out to exist between other poles inevitable in an increasingly sophisticated rule-based culture:  between the “letter” and “spirit” of the law; between the rules’ tacit and explicit dimensions; between preservation and innovation, and so on.  Such prayer is itself a kind of thinking, and I’m even thinking of considering prayer as the origin of the declarative sentence.  In another post.

September 1, 2009

Popular Culture

Filed under: GA — adam @ 7:37 pm

I take Eric Gans’ distinction between popular and high culture as axiomatic:  in popular culture, the audience identifies with the lynch mob, while in high culture they identify with the victim.  It seems to me, further, that this distinction manifests itself as one between two modes of reading, or “appropriation” or “consumption” of cultural materials:  in engaging high culture, one is enjoined to preserve the text or artifact as a whole—this means examining the parts and the text/artifact as a whole “in context,” with an eye towards its unity and purposefulness, as well as the accumulated historical labor expended on its production.  This also implies a hierarchy of interpreters and commentators and the institutionalization of the materials (museums, literature departments, etc.).  With popular texts and artifacts, meanwhile, elements of the cultural product can freely be iterated in contexts chosen by the user, without regard to the “intentions” of the producer.  We have no compunction about repeating catch phrases from a sitcom or movie in ways that show no respect at all to the way that phrase functioned in its “original” context.


Now, of course high cultural texts get treated in this way as well, but this just testifies to the dominance of popular culture in the contemporary world—that is, we are talking about ways of treating texts and artifacts as much as (or more than?—that’s part of the issue) about the texts and artifacts themselves; and, of course, putting it that way further testifies to the decline of high culture and the ascendancy of the popular.  We can also take as given the convergence of popular culture with the rise of the victimary:  the high cultural texts are themselves viewed as oppressors, and by “appropriating” them “violently” we take our justified revenge upon them for their presumption of centrality.  And we can also stipulate that the mass market and the “age of mechanical reproduction” have been central to this process. So far, nothing I have said takes us much beyond discussions of postmodernism going back to the 70s and 80s, which also highlighted the collapse of the high/popular boundary as well as the intensified “citationality” and “cannibalistic” nature of contemporary culture. 


But we can go quite a bit beyond those discussions, I believe, in particular in trying to figure out the consequences of these developments.  Left cultural theorists have tied themselves up in knots trying to convince themselves of the potentially “progressive” character of the rise of the popular, with results that have been brilliantly lampooned in a couple of essays on cultural studies by John O’Carroll and Chris Fleming.  Somewhat more serious, or at least earnest, approaches, like that of Gerald Graff, make another, in my view, equally flawed attempt to find something hopeful in our students’ attraction to popular culture.  For Graff, instead of trying to get students to engage thoughtfully with the products of high culture that we professors value, in order to develop and put to work their interpretive faculties, their ability to see things from different points of view and in “depth,” etc., we should recognize that students are really doing all these things already when they argue about their favorite sports teams, or the movie they saw last night, or the latest music video by their favorite artist.  Get students speaking about what they already know, already interpret, already canonize, already debate in more sophisticated ways than outsiders realize, and they will come to realize that they are already something like “scholars” or “academics” (or “critical thinkers,” or “interpretive agents,” or whatever you like).  What happens then seems to me less clear—if they are already engaged in serious discussions over esthetic and moral values, why do they need our high cultural texts, or the means of interpretation that have evolved in the history of responses to them?  On the other hand, if those discussions are not genuinely about such values, and the means of interpretation at work in them not comparable to institutionalized ones, then, in fact, they are not really doing what “academics” supposedly (or hopefully) do.  
Nor do we have any reason to assume that having them attend to what they already do will get them one step closer to that goal. 


Defining popular culture as the free iteration of bits of models helps us to account for why these attempts to “redeem” popular culture can’t accomplish what the redeemers would like.  High culture is intrinsically totalizing, centralistic or holistic, whether it be the Marxist theory of history or the New Critical sacralization of the literary text—the idea from the start is to resist the fragmentation so celebrated by apologists for the popular.  The assumption is that some transcendent reality, embodied, albeit partially, in the most accomplished products of culture, is what militates against the scapegoating of those figures who stand out against ritual, tribal culture, figures ultimately modeled on Socrates or Jesus.  No coherent political ethic can emerge from immersion in soap operas, Madonna videos or comic books; nor can any consistent and arguable esthetic stance be elaborated out of one’s baseball card collection, pornography addiction, or experimentation with shocked hair and body rings, because the entire notion of coherent and consistent ethics and stances derives from a different set of assumptions and practices.  At the same time, though, I don’t think there is any way of returning to the notion of high culture that presided up until, say, the Second World War—not only has that notion of transcendence been displaced irrevocably, but it was flawed in important ways from the beginning, however great its service to the advance of humanity and however many the staggering accomplishments we owe it.  In that case, the problem with the cultural studies people (of whom Graff is one, of course, even if one of the moderate political center), is that they aren’t radical enough.


After all, the originary hypothesis confirms the central claim made by avatars of the 20th century’s “linguistic turn”:  human reality, at the very least, is indeed constituted by the way signs reveal relations between us through the things we move to appropriate, and not by the referential relation between language and a higher reality.  This must also mean that when we account for the human condition, we must do so in language and are therefore always further and newly constituting it—this “Heisenbergian” reflection irremediably undercuts any pretensions to knowledge of a permanent “human nature.”  Mimetic desire, rivalry and crisis will always be with us, and the bet made on traditional high culture is that that permanence renders different modes of deferral  secondary, so many “epiphenomena,” if you will—but if we reverse that claim, as I believe we must do as we become more conscious that we ourselves, everyday, are responsible for inventing such modes of deferral, then even those enduring traits of human reality are relativized by ever changing sign systems which not only resolve them in limited ways but shape their terms of emergence as well. 


And yet the paragraph I just wrote was, or so I would like to believe, composed on the terms of high culture—I am certainly aiming for the kind of “density” or “depth” in my discussion here that would mark this argument as one that would interrupt the prevailing modes of scapegoating.  And, of course, the theoretical and esthetic rebellions that have provided a vocabulary for the privileging of the free iteration of bits of models took place completely within high culture as well.  Indeed, notions of “depth,” “density,” “textual autonomy” and so on refer to our willingness, or our felt compulsion, to take the object on “its own terms,” to assume, as Leo Strauss put it, that its author knew more than us and was providing us with knowledge or an experience that was both valuable and one we couldn’t have procured or even thought to pursue on our own.  If we approach cultural objects with such an attitude, they become inexhaustible, but we will only do so as long as we believe the inexhaustibility lies in the object, not in our attitude towards it—once we assume there is no “text in this class,” to refer to Stanley Fish’s famous phrase, the sheer proliferation and ingenuity of interpretative strategies that have been accumulated over the past couple of millennia will not be able to sustain our interest for long.  The initial burst of enthusiasm deriving from the sudden sense that “hey, we’re really the ones who ‘made’ these texts!” will quickly dwindle into a deflated “you mean, it was just us all along?” 


The initial result of “unregulated iteration,” in both popular and high culture, was the creation of the celebrity—from the modernist writers and painters in the 1920s to the postmodern theorists of the 70s and 80s in the world of high culture, and from newly famed athletes, singers, actors, along with seemingly randomly elevated members of the idle rich, the scandalous, etc., also starting in the 20s, through the movie stars and rock stars, also into the 1980s.  Perhaps this age, if the title of Eric Gans’ recent Chronicle on Michael Jackson is correct, will in retrospect be known as the “Age of Celebrity” as we move on to something else.  Maybe “celebrity” filled the space of sacrality previously filled by the Platonism of both the guardians of culture and the people, and now vacated, most immediately due to the historical catastrophe of the First World War; maybe it also fit an early stage in technological reproduction and the market, where such processes were far more centralized and monopolized than they are likely to be from here on in.  It seems to me that the precipitous decline in the power of celebrity which we are witnessing (and which is perhaps best testified to by the openly staged, publicly “participatory” auditioning for celebrity in shows like “American Idol”—the aura essential to celebrity cannot survive the public’s freedom to elect and depose celebrities at will, and with such naked explicitness) is more in accord with the logic of unregulated iteration, as well as healthier.
(It is noteworthy that while there may very well be something cultic in the devotion millions of people express towards political leaders like Obama and Palin, the nomination of these figures as “celebrities” was premature, as celebrity cannot survive the harsh criticism on inevitably divisive matters of public substance any political figure must endure—if an author touted by Oprah turns out to be a fraud, she apologizes publicly and has him come on the show and do the same; there is no analogous mode of “redemption” if, say, Obama’s leftist agenda crashes or Palin runs for President in 2012 and is thrashed in the Republican primaries.)  At any rate, though, one could imitate Babe Ruth’s swing or swagger in the playground, or Jordan’s moves in the gym; one could sing a Beatles tune or mimic some of Michael Jackson’s moves without having to have a “reading” of the “text as a whole”—while the celebrity of these figures, one might say, helped guarantee a unity and hierarchy of focus that could be shared nationally and sometimes globally, sustaining the type of community previously preserved through more transcendent means.  If celebrity is on its way out, we will have overlapping and often mutually uninterested, even repellent communities, sometimes aggregating into something larger but not in any predictable way.


If the generation of models in a period that is both post-transcendent and post-celebrity does not require a focus on “complete,” or “fleshed out” figures (about whom a story could be told, through whom a meaningful sacrifice could be performed), if they don’t have to conform to existing narratives so precisely (in part because the media, or means of establishing celebrity, are themselves increasingly decentralized and evanescent), it may be that the eccentric and idiosyncratic will come to the fore—not just any idiosyncrasy or eccentricity (and not necessarily the depraved or cartoonish) but, I would hypothesize, those that make the figure in question just as plausible a figure of ridicule as of emulation.  Those who organize a space around a particular figure would do so with an awareness of this two-sidedness, which would in turn provide a basis for dialogue, friendly and hostile, with other groups—that is, “we” would organize ourselves around emulating a particular somebody and therefore knowingly organize ourselves against those dedicated to his ridicule; and vice versa.  (It seems to me that something like this is already happening with Sarah Palin who, despite what I said before, might, if she avoids putting herself in situations where her power of presence must be directly repudiated or ratified, become an example of this new kind of…well, what would it be?)  What looks to one group like an accomplishment looks to the other like a botched job, what looks to one beautiful is grotesque to the other, a pathetic mistake to one is an innovation to another and so on—and, in the best of cases, each side will be able to see what the other is seeing.


In this case (to continue hypothesizing), popular culture will be performing what high culture might become increasingly interested in—that boundary between error and innovation, where rules get followed in ways that create “exceptions,” where the strictest literalism produces the wildest metaphors, where models get both emulated and mocked and it can be hard to tell which is which, where we find ourselves in the position of figuring and trying out ways of seeing others and objects as beautiful or repulsive, instead of simply being “struck” one way or another, where no one has proprietary rights in the line between “mainstream” and “extreme,” etc., but where one still has to come down on one side or another, at least at a particular moment.  High culture, whether carried out in the theoretical or artistic realms, would increasingly become so many branches of semiotic anthropology, interested in the way in which avatars of the “human” keep coming to bifurcating paths (do nothing but keep coming before such bifurcations), going one direction or another for reasons we could guess at but with consequences we can identify and judge according to their irenic effects.  It’s not too difficult to imagine texts and performances being composed with this problem in mind, and critical and appreciative canons emerging to meet those texts and performances.  (Just think of the intellectual challenges imposed by the determination to write a text in which every phrase is a “taking” [an iteration or appropriation] as well as a “mistaking”—and think of how revelatory such an effort might be regarding idiomatic usage.)  (I suspect one could already construct a “genealogy” of such texts that have been classified as “modernist” or “postmodernist” while nevertheless sticking out as anomalies.)  I think high and popular culture would thereby become less hostile to each other, and both might become less sacrificial.
