Extra-Sensory


Progress and decline are spatial metaphors. They suggest a curve headed upward or downward over time. Now, the more points through which a curve is plotted, the better defined it is. So the farther backward and forward we can plausibly-- without floating away in fantasy-- extend the temporal axis, the better we’ll understand cultural tendencies like the decline of verbal literacy, about which Sven Birkerts wrote perceptively in “Into the Electronic Millennium,” the first essay in this series.

Backward, then, into the prehistoric mists. What was it like before there was writing? Whatever other categories may be useful for imagining the differences between then and now, surely immediacy is. Between perception and reaction, between stimulus and response, there lay no shadow, no complex processing, no translation of cipher into referent into meaning (to adapt Birkerts’ handy terms). The large parts of our neurophysiology needed to decode, store, and retrieve written information went instead to speed and intensify the reflexes of pre-literate man. No writing meant fewer options to search out and compare before any decision (and fewer decisions, naturally); fewer competing perspectives or frameworks to choose among. Instinctual conflict, sometimes; but no pale cast of thought. I would guess that Homeric, or at any rate Neanderthal, heroes really did “leap” into battle, really did “embrace” death. Imagine their orgasms.

But this was not an entirely benign, nobly primitive condition. Humankind has not evolved biologically very much since the invention of writing, so pre-literate people had roughly the same neurophysiological capacity, the same quantity of imagination, as us. But since they were forced to deploy it within very much narrower dimensions, the results were exotic, even bizarre. They didn’t just charmingly endow snakes, trees, and waterfalls with personality, and sometimes divinity. They often heard them speak, and sometimes died of fright. Nearly every oral culture seems to have been a theocracy, and an amazing number of them (on the evidence of Julian Jaynes’ The Origin of Consciousness in the Breakdown of the Bicameral Mind) were based on hearing the voices of gods, i.e., on mass delusion. Imagine their terrors. Loss and gain, then. The things that matter most to us, the terms in which we tell our life stories-- loves, beliefs, tastes, ambitions-- presuppose a degree of vicarious experience, an extent of information, inconceivable 50,000 years ago; while our ancestors’ significant life-experiences-- to have lived among intimately familiar and subtly discriminated flora and fauna; to have enjoyed or endured sensations and enacted impulses with a vividness, spontaneity, and intensity unattainable now-- involved a radically different balance of direct and vicarious experience, of intensive and extensive information. Our existence is immeasurably more mediated, less immediate, than theirs.

Sven Birkerts’ and Todd Gitlin’s essays plausibly describe a transition to an era in which most experience will be still more vicarious and less direct, its information more extensive and less intensive. This may seem an oddly neutral way to characterize the chilling prospects these writers hold out. I do indeed share Birkerts’ unease and Gitlin’s indignation about the near-to-medium term, and will wail and gnash my teeth presently. But something at the margins of their vision, and in particular Birkerts’ allusion to the eclipse of individuality, calls for some more nebulous, less responsible comment as well.

Let me try for a moment to disconnect the what from the how: the evolutionary process from its political context; or what Birkerts calls “electronic collectivization” from the fact of its design and exploitation by business, the media, the entertainment industry, and the state. Let’s disregard, in imagination, these extrinsic, distorting influences on cultural development and suppose that we the people freely, democratically, and wisely controlled our cultural evolution. What difference would this make to the fate of writing?

Every text, we know, has a context; and the more artful the text-- whether poem, tale, picture, argument, or equation-- the larger the relevant context. Texts of sufficient richness we call ineffable: the body of direct and vicarious experience, of extensive and intensive information, needed to register their whole force and depth is unattainable for beings with our capacities.

Depth is not the only dimension in which our aesthetic/intellectual reach exceeds our grasp. An aspiration to breadth or universality-- to “all-sidedness”, to assimilate the best that has been thought and said and be one of those on whom nothing is lost-- only became a cultural ideal in modern times; that is, just as its realization began to be impossible. The impulse to master the still (barely) masterable corpus of mid-18th-century knowledge produced the Encyclopédie, which is, in respect of this ideal, the high tide of modernity. After the confidence of the philosophes comes the titanism (and ultimate resignation) of Goethe, the exquisite melancholy of Matthew Arnold and Henry James, the delirium of Pound and the High Modernists, and the white noise of postmodernism.

Along with the marketing requirements of late 20th-century capitalism and the (related) spread of a narcissistic or pre-Oedipal character structure, one contributing cause of postmodernism may be despair over the impossibility of assimilating more than a fraction of the best that has been thought and said “on all the matters which most concern us” (Arnold, Culture and Anarchy); of achieving “a harmonious perfection, developing all sides of our humanity” (ibid.). To know even a single branch of culture both intimately and exhaustively will soon exceed the capacity of just about anyone. In the arts as in science and politics, the division of labor has made available an abundance and variety of experience and information that are no longer merely stimulating but arguably overstimulating, even overwhelming. We can try, as Richard Rorty urges, “to admire both Blake and Arnold, both Marx and Baudelaire, both Nietzsche and Mill, both Trotsky and Eliot, both Nabokov and Orwell”; we can hope to understand “how these men’s books can be put together to form a beautiful mosaic.” But it’s a stretch. Add to this list Wittgenstein, Bartók, Rilke, Balanchine, and Lévi-Strauss, and we begin to stagger. Add further-- and who could bear to omit?-- Duke Ellington, Robert Bresson, Jasper Johns, Frank Lloyd Wright, Martha Graham, Michel Tournier, and we have long since passed a limit. Though we may know enough to admire, we cannot really comprehend, cannot possibly devote to all these masters and masterpieces the patient, deeply informed attention they require.

And if per impossibile we could, we would scarcely have begun to do justice to “all the matters which most concern us.” I’m helpless to evoke, can’t even properly name, the beauties of science and mathematics. But no one, I suppose, believes they’re inferior to those mentioned in the preceding paragraph? Look steadily and whole at the misery for which, as an American citizen, one bears one’s mite of moral responsibility, and an interior voice sounds: you must change your life. But where to find the time, the energy, the spare imagination?

It’s too much. “Harmonious perfection” is out of the question. We must either accept cultural overload, partial vision, mutual incomprehension, or else find some way to extend our range, augment our capacities, enhance our neurophysiology. Actually, there’s a good deal to be said for the first alternative. Why does there need to be anybody who can “put together” all of culture? If print remains our principal medium of expression and communication, we can hold on, at least for a while, to the present rhythms and grain of our mental life, the architecture of our selves. “Privacy” and “autonomy” may be only names for our current balance of direct and vicarious experience, of intensive and extensive information. But it is our balance; it is us. No doubt our way of life will continue to change. I can no more imagine the cultural primacy of books lasting another 50,000 years than, say, theism or meat-eating or the nuclear family or private ownership of the means of production. But (for reasons I’ll explain in a moment) I’m more than ambivalent, I’m positively alarmed, about beginning the transition now.

Still, the transition will begin someday, and should. Though I don’t fully understand why-- here I can only appeal to intuition, shared or unshared-- there does need to be someone (or something) that can put together all of culture. Birkerts’ figure/ground analogy for human identity is apt. But in the limit case, when the ground-- the sheer scope of cultural possibilities, even considering only those available in traditional forms-- alters drastically, qualitatively, then the implications of the analogy cease to be conservative. The figure must change dimension, perhaps radically, in order to maintain differentiation.

If this requires a new neural network, perhaps one extending outside our skin, then sooner or later, evolved or constructed, we will have one. Pace Birkerts, networks need not be exclusively “constitutive of the immediate present.” Networks can embed hierarchies, temporal as well as logical: memory, tradition, culture itself are such networks. Organic rather than electronic ones, to be sure; but then, it’s synergy rather than substitution that I look forward to.

Of course memory can be constricted and history flattened, whether by commissars, spin-doctors, or ideologically innocent, profit-maximizing advertising executives and media managers. The design of a culture, the shape of a species’ “collective sensibility,” is a political question. Right now that question is being begged, whence my (and Birkerts’ and Gitlin’s) alarm. Ideally, verbal literacy would be subsumed or transcended in the course of cultural evolution, not simply eroded. The attrition of civic memory and craft knowledge, a reduced attention span and loss of discrimination, the attenuation of nuance and the homogenization of vocabulary-- in all these ways the decay of literacy serves both the manufacture of consent and the accumulation of capital. A populace that cannot recognize rhetorical devices, make moderately subtle verbal distinctions, or remember back beyond the last election or ad campaign is defenseless against official propaganda and commercial hype. Only rootedness makes sustained resistance to the modern Leviathan-- state, corporations, and media-- possible. And an important form of rootedness is our internalization of the Word in one form or another: sacred scripture or poetic tradition or civic mythology or family lore. Benign cultural evolution, genuine emancipation, would lead us to work through such traditions, preserving even while going beyond them. As it is, we are merely being distracted from them.

The deepest and bitterest of all current disagreements is about whether modernity itself is an example of benign cultural evolution. In the creation of modern cultural and economic individualism, premodern communal traditions were similarly undermined without being worked through. For the most part, the people of Europe did not make their own painful way beyond village, kin network, handicraft, and local religion into a brave new world of mobility and rationality, city and factory. By and large, they were bulldozed. In that case as in this, the transition was shaped and paced-- though not entirely motivated-- by the needs of elites. True, a democratic transition to modernity in Europe would have taken centuries longer, and might not even now be consummated. But it would not have given rise to the twin spectres of antimodernist fundamentalism and postmodernist nihilism.

Marx and Freud made parallel and profoundly true observations, one about social practices and the other about individual beliefs. If a practice or belief is overthrown prematurely, is repressed rather than outgrown, the result is pathology. To suggest that humankind is now ready to leave behind verbal literacy, when only a tiny, fortunate fraction have savored its pleasurable possibilities to the full, is not hubris. It is fatuity; worse, cruelty. At this stage of our political and cultural development, electronic collectivization would produce not new, marvelously complex and efficient forms of cognition and communication, but historical amnesia and mass manipulation: the “societal totalism” Birkerts rightly fears.

If I may hijack Birkerts’ concluding metaphor: someday we will no longer need an ozone layer. Of course we must immediately stop depleting atmospheric (and linguistic) ozone or else face catastrophe. But eventually we will decipher the genetic code and redesign our skin, our immunological system, and probably much more. I hope, though, that it takes a few millennia. To think what the “free” market or the authoritarian state would do with genetic engineering is awful, just as it’s awful to see the transformative possibilities of electronics squandered on weapons production, law enforcement, advertising, the credit industry, and the entertainment industry.

That our organic senses, including memory, will someday be joined, in a way we cannot now conceive, to electronic ones is something I certainly can’t prove yet don’t really doubt. Our perennial desire to integrate and master all knowledge can no longer be satisfied by our present sensorium. But we will not get there by continuing to dissipate our linguistic heritage. We are not transcending verbal literacy; we are merely forgetting it. Contemporary postmodernism is a false dawn because the finest possibilities of modernity have not begun to be realized. For the same reasons, the electronic millennium is now a threat rather than-- what it may yet prove to be, in the farther reaches of cultural evolution-- a promise.


