I have argued previously for the priority of “attentionality” over “intentionality”—attention must precede intention, and “intention” individualizes what is “joint” in attention, making it more of a declarative than an ostensive concept. We can trace the emergence of intentionality from attentionality, whether by “intentionality” we mean the more philosophical notion of constituting an object or the more everyday use of the term as meaning to do something. On the originary scene, all participants attend to the central object, and attend to each other attending; the sign, as the gesture of aborted appropriation, is really nothing more than the demonstration of this reciprocal attending to their joint attention. Self-referentiality, then, is built into the originary scene. Even more, what is action if not a prolongation of attention? I see the other attending to me, which becomes a kind of self-attending, as I can single out whatever in my gesture might be articulated in the other’s attention, and in that way move myself so as to fit the shifting attentional structure of the other. My movements, and therefore my actions, enter into and are supported by the attentional space I have co-created with others. In all of our actions, then, we are tacitly referring to this attentional space, of which we are mostly unaware at any moment. As Michael Polanyi says, we know more than we can say. But we can say more and more of what we know, in the process producing more knowledge we can’t yet say—becoming a representation of this state of affairs is what ethical action entails.
For originary thinking, the human being has a telos: to speak and act along with the center; to enter the history of deferral in such a way as to construct the world as the effect of and continuation of that history. We assume everyone else is trying to do that as well, which is why we know every utterance includes a sovereign imaginary eliciting commands from the center. Traditional ethical thinking will start to speak in terms of will, judgment, capacities, desire and its education, and so on, and all of that is fine, but we can just speak of the center one becomes as soon as one is amongst people, a center both actual and possible, which each of us constructs as the ways we want attention drawn to or deflected from us. You can compete with other centers within the economy of attention, or you can redirect attention from yourself to the center enabling you to so redirect attention. Sometimes the very competition with other centers can be turned towards that end.
Performing the paradox of self-reference is the highest good for originary thinking. Turn every reference to something else into a reference to you and every reference to you into a reference to something else. You can never run out of things to do this with because everything is marked by the history of such reciprocal reference, and so keeps becoming something new. In this way you keep turning the world into a completely internalized self-referential system. This would seem to be a completely closed, and therefore dead, system, but in relation to the center this self-referentializing system is itself just a thing composed of references to the center. You point to something, enabling others to see it, which enables it to be, but its being in turn enables you to see it and to point to yourself seeing it along with others—the center makes its appearance in this layering of the scene and the impossibility of determining whether new things are coming into view or we are sharing attention so thoroughly that we’re not sure where your seeing begins and mine ends. The center tells us to sustain that, by constructing institutions out of sites where the articulation of shared reference and self-reference (where we find a way of saying to each other, “here’s how we’re making sense of each other”) can become a model of deferral.
We don’t need to invent clever ways of enacting the paradox of self-reference, like saying “I am lying.” “I see that” is quite paradoxical enough, because “I” can only see that because “you” and “others” are at least potentially able to do so (and have therefore “always already” done so) as well; “that” is that only because I am seeing it; and I “see” that because our deferral, our holding back from appropriation, lets that object, like all objects since the first object, set itself off against a background—seeing is always a refrained touching and tasting. The disciplined forms of literacy try to suppress the paradoxicality of the declarative by supplementing sentences within imaginary scenes whose parameters are set by those defining the abstractions used to perform the supplementation. To define “perception” in terms of physiological structures and learned Gestalts is to try to abolish the paradoxicality of “I see that.” But, of course, we have to say things like that, so it’s best to say them in the manner of little satires on these suppressive supplementations, reintroducing the paradoxes they hope to avoid. Eventually, these running satiric digressions become indistinguishable from the primary discourse itself. If you can find ways of iterating this digression-within-the-discourse in new variations within emergent events so as to have each variant naming the previous ones, you enable others to join in self-referential centering.
One way of breaking with Western metaphysics is by acknowledging the traditional character of all thought. The concepts you are working with have been worked with in other contexts, and are conversions of earlier concepts, which solved problems within a now extinct paradigm which has nevertheless bequeathed to us some of its problems and some of its materials for solutions. But this means that the more we shape these concepts to our own purposes the more we are participating in an ongoing inquiry with those who did so earlier, and had no idea we were coming along. But since the most fundamental and universal tradition is language itself, it seemed to me that the self-aware participation in traditions of thought could more simply be understood as a form of language learning. When you learn a new language, or when children learn language, the process involves imitating chunks of discourse in ways that are inevitably mistaken because you must intuit their uses in unanticipated contexts—how else could anyone learn? In the process, you generate new idioms, and this is how language changes—enough people take the mistake, or even a shift in emphasis, as “correct.” We never stop learning, so we’re always students, but we also have to step outside of the flow of learning in order to teach people who we see falling into what we fear (but we could be wrong) are less productive patterns of error. Here, we have, broadly, two choices: one, we situate ourselves within a more or less institutionally protected orthodoxy, and correct those whose language usage doesn’t conform. The advantage here is that you guarantee you’ll always be right and smarter than anyone who comes along. Or, two, you re-use the misused idiom with some of the weight of inherited uses which the newcomer might be less aware of, and thereby incorporate the mistakes into a regenerated tradition of discourse. Here, authority has to prove itself by showing itself capable of allowing digressions to flow back into a larger current. You keep emulative mimesis in play by allowing that play to construct the very space in which the implications of language usages can be explicitly hypothesized.
Many years ago I started working on what I called “originary grammar” because I felt that GA needed to be more than just another “theory,” one that offered its own “readings” of texts and “explanations” of social structures and historical events. I thought it needed to generate its own comprehensive vocabulary—a language others would have to and want to learn—rather than just saying something like, “here’s how we think it all began” and then proceeding to talk about ideas and interpretations and principles and beliefs and arguments and proving things like everyone else. And the way to do that was out of the dialectic of linguistic forms Gans worked through in the first work in GA, The Origin of Language (the new edition of which is of course available, and the Amazon page for which is still sadly bereft of comments). I was encouraged in this by the fact that Gans used a kind of grammatical approach to defining the two key intellectual and cultural transformations constitutive of the West: he defined “metaphysics” as taking the declarative sentence as the primary speech act; and he defined Judaic (I think “Israelite” is better) monotheism as “the name of God as the declarative sentence.” In both cases, the post-sacral or imminently modern world is constructed in terms of some tension between the declarative, on the one hand, and the imperative, or, more broadly, the entire ostensive-imperative network, on the other hand. Wouldn’t anything we would want to talk about be included in this field of tension?
Originary grammar should supersede scientism while preserving all the intellectual advances of science. Instead of “facts,” we have what is known ostensively: what could become an object of shared attention. Something could only become an object of shared attention on a scene, which cannot itself be prepared ostensively: we are driven to create new scenes by the breakdown of a previous scene, whose central object eventually generated new desires it could no longer defer. (Of course, the new scene could feature the “same” central object in a different way.) If the scene is not simply to break down, if a transition to a new scene is to be achieved, asymmetry must enter the arena in the form of an imperative: someone issuing an “inappropriate” ostensive regarding a new or old/new object. Here, the preservation of presence on the scene can be united with maximum innovation on the scene: we allow a space for inappropriate ostensives, to see which might work as imperatives. Finally, we can bind declaratives to the scene by allowing the declarative field maximum freedom to explore all the complexities of declarative possibilities (to cross over time and space, to organize all of reality around one center or another) on the condition that it represent actual and possible ostensive-imperative articulations. The declarative sentence constructs a linguistic present, the present in which you can utter the sentence, one that, unlike the ostensive and imperative, can be separated from any particular scenic present—but that means that the “vocation” of the declarative sentence is to keep restoring the continuity and extension of the trillions of human scenes, each of which threatens in a new way to break that continuity. The declarative would be most interested in suggesting ways of preparing us, or issuing imperatives, to share new ostensives.
In this way we would have a completely self-contained and completely open system in which we would always be talking about what we’re doing in the language through which we are doing it. The content of our declarative sentences would be the way other declarative sentences have commanded us to draw lines connecting objects around a centerized one. So, discussions would take something like the following form: “you say I’ve been looking at things in such a way that others see what I don’t and this is because of where and how I stand and in saying this you are telling me to be led by the configuration which I have not yet identified as a configuration and thereby to see and lean toward something that would compel others to join me in reconfiguring it…” The specific details of any particular scene at the center of an array of scenes would be inserted.
We would be more precise than this sample indicates because each sentence modifies in some way inherited chunks of language and meaning is thus generated by the modification itself—in a language user’s noticing that you have eschewed the expression that 87.8% of listeners would have expected to come at that point in your discourse in favor of a rarely or never before used one because you want that point in the discourse to operate as a center that has you reworking language along with perception, intention and intuition. And the next declarative in the discussion could point that out or, even better, iterate it in a new modification that the language learners around you would be able to iterate in turn so as to open new fields of objects. So, we’d be talking about things in the world while talking about how we talk about things in the world while talking about how we can rework the way we and others talk about things in the world and it’s all really one “talking.” This still seems to me to be the imperative.