The image projected on the screen behind me, the promo poster for this conference (see pages 6 and 8, “Announcement”), shows the trappings of cartography: a topographical map of some non-existent or unnamed place, miscellaneous technical features, a bunch of arrows suggesting different kinds of activity... For various reasons, this kind of graphic imagery is often associated with strategy; that’s unfortunate, but the problem is hardly limited to design. “Mapping” is all the rage, and has been for a while now — a few decades that I can think of. It’s a bad habit, like the widespread adoption of other pseudo-military affectations (in language, organization, media, leisure time, and so on), and we need to stop it. We need to quit thinking in terms of space, particularly in terms of imaginary “spaces,” and think instead in terms of time — or, better, in terms of times in the plural.
The concern driving this event today is a dissatisfaction with art and activism, a sense that both of these fields have reached a sort of impasse or, possibly, an end. The last thing I want to do is try to debate a thesis like that pro or con; instead, I’m interested in this moment of judgment. Declaring things dead, gone, over, finished, broken or some such is all the rage. We could dismiss it as a trend, but, trendy or not, let’s dignify it a bit more by calling it what it is — a peculiar feature of our time. After all, to usher in the new, you need to sweep out the old. But the new, as in the “new strategies” of this event’s title, often carries a whiff of underlying anxiety. In this case, it’s a self-consciously historical anxiety, the vague but pressing sense that we’ve lost our way, our orientation. And so our poster’s cartographic affectations have a second aspect: not just the big arrows of strategy boldly showing that we’re going somewhere, but also the befuddling spaghetti of topographical lines insinuating that we’re completely lost.
If there’s a crisis in art and/or activism, it must be a historical phenomenon — something that didn’t exist fifty or twenty years ago and probably won’t exist twenty or fifty years from now. For what it’s worth, I do think there is a crisis, and I agree with much of what Konrad and Jim said in their opening remarks. But, still, I’m curious: Why is it happening now? And (or?) why is it a crisis? There’s a standard kind of explanation: that crises like this are symptoms of late capitalism, or of neoliberalism, or what have you. I don’t doubt that there’s some truth to the details of these kinds of arguments, but as answers they’re red herrings. We have no idea if this is late capitalism — it could be early for all we know, in which case what now seems to be a “crisis” could turn out, in decades or centuries, to have been child’s play compared to what’s in store. That’s glaringly obvious, but not enough to prevent us from indulging in judgments based on an almost occult belief that we know when we are.
Time is an unwieldy subject, and we’re much less fluent when we talk about it than when we talk about, for example, space. We’re conversational, absolutely — but, if anything, that’s the problem. In terms of our own lives, there’s the intimacy of the moment, anecdotes and stories (and you had to be there, never you had to be then), the obvious denominations (days, nights, weeks, months, years), seasons, phases, periods, and so on. At certain points and in different ways — this discussion is an odd example — these personal understandings of time merge with less personal, more aggregate denominations: again, the obvious structures (weeks, months, years), but also categories that sit in an uneasy relationship with biography — decades, “generations,” periods marked by events (the “postwar period,” “1968,” etc.), and eras. And, on an entirely different order, there are the uneasy transitions: the fin de siècle, the watershed, the “revolution,” and of course the crisis. And then there are categories that really should be imponderable, but over the last few decades we’ve spent quite a bit of time, if not pondering them, then being beaten over the head with them: centuries, the new millennium (and therefore the old one), and — worst of all — “ages.” These used to be empty foils (e.g., the Iron Age), but with the advent of the “Information Age” they’re everywhere, and feel pretty real.
I could go on like this for a long time, but even those few words and phrases should be enough to make it clear how often we rely on an ad-hoc (or nowadays just-in-time), contradictory jumble of temporal matrices to describe and make sense of anything and everything. But, by way of trying to shed some light on this crisis in art and activism, let me — as they say in movieland — get all medieval on your ass for a few minutes. The West’s current calendrical system, Anno Domini or “the year of our lord,” was instituted by Charlemagne at the urging of his advisor, Alcuin, in what we now call the late 700s. Before that, Europe got by on a crazy quilt of calendars — mostly local, practical stuff like “in the nth year of So-and-So’s reign” — and, more broadly, a series of calendrical systems that claimed to tally up the age of the world since creation. Charlemagne was famously crowned Holy Roman Emperor on Christmas day of 800 A.D.; less famously, his coronation — according to the previous Anno Mundi calendar he’d replaced with Anno Domini over the preceding decades — took place on Christmas of the year 5999. That’s the kind of number that gets more attention, for no better reason than it’s 1 less than 6000. As it happened, the number 6000 was pretty significant at the time.
For several centuries, there had been a long-standing and widespread strain of biblical interpretation — which might indicate a widespread popular belief — that the world would last for 6000 years, followed by 1000 years of rest, a sabbatical millennium. This interpretation was based in part on the Book of Genesis, in which the Lord is said to have created the world in six days and rested on the seventh, and in part on the pseudepigraphic Second Letter of Peter, which says that “a thousand years is a day in the eye of the Lord.” Charlemagne, then, was crowned on the eve of what some believed would be the end of the world. And, indeed, the chronicles of the period suggest this indirectly: they document a rising tide of signs and wonders (battles in the sky between the forces of good and evil, the bodies of deceased virgins remaining uncorrupted, etc.), outlandish allegations of libertinage and antisocial activities, and a growing number of charismatic preachers — in short, activists. These kinds of things often indicate apocalyptic beliefs.
The word “medieval” often serves as a shorthand for the hopelessly irrational; so the idea of hordes of filthy, ignorant peasants acting out some pathetic symbolic struggle is an easy pill to swallow for many people. But what about a subtle, technocratic exercise in liberal social engineering to undermine millenarian activity by instituting a new calendar decades in advance? That’s a pretty serious strategy, and not at all what you’d expect from the medieval period. The corollary: either they were much more modern than we think (they were), or we’re much more medieval than we fancy (we are). So, did Charlemagne’s (or, more likely, Alcuin’s) plan work? We can’t know: basic methodological rules — about not making arguments from silence, about control groups, and the like — make it pretty much impossible. But we do know that the adoption of the Anno Domini calendar laid the basis for another apocalyptic crisis two centuries later, around the year 1000. (Biblical interpretation is nothing if not flexible: denied the year 6000, millenarians settled for the next big number.)
Again, there’s lots of debate about how widespread millenarian beliefs were then; but there’s no doubt that in the decades preceding 1000 there was a striking rise in signs and wonders, in outrageous behavior, as well as in preaching — this time around about how, for example, “it is easier for a camel to go through the eye of a needle than for a rich man to enter into the kingdom of God”; and that the warring princelings of Europe transferred unprecedented amounts of wealth — treasure, land, and peasants — to religious institutions, in particular to monastic orders; and that widespread and popular movements sprang up quickly by the standards of the day to curtail warlike activity on an ever-growing list of holy days; and that the resulting peace led to a more deliberate and effective stewardship of land and to economic expansion; and to a widespread rise in literacy; and that this “renaissance of literacy” correlates with a “renaissance of heresy”; and that all of these phenomena contributed, over a few centuries, to what we later came to call the Renaissance.
These are gross simplifications, of course, and my point isn’t that centuries can be summarized in a few sentences — they can’t. But even those who disagree with how I’ve recounted these events would likely agree with my broader point, that historical expectations can drive immense, complex, and above all arbitrary social phenomena.
Now, that’s an awfully long way to go to make the simple point that people behave in particular ways because they think the end is near — but bear with me. And bear with me even further when I point out something even more painfully obvious — that people behave in other ways when they think there’s no end in sight. What an inane thing to say! That most people trust that things will continue, and act accordingly! Of course they do. OK, so let’s put these contrasting “styles” of thought into a current context — say, climate change. At one extreme, we have people who believe that humanity, faced with its abysmally errant ways, is teetering on the edge of destruction; at the other, we have people who think we can carry on as we have forever, and that through our intellect and industry we can solve any problem. As much as we might like to think that “those” people (medievals, rightists, polluters, etc.) embody and act out irrational beliefs and spare us the trouble of doing so, the truth — to the extent that there is one — is more elusive and, above all, much more mixed. It’s not as though the traces of apocalypticism have vanished or are relegated to the fanatical fringes.
Take our most recent millennial outburst, Y2K, which was said to threaten the very systems that embody our greatest advances. Nothing happened, right? It was just empty hype — profit-seeking consultants driving a 24-hour cable-news moral panic. The proof? That we didn’t see, say, the instantaneous collapse of global finance, the desolation of entire cities and regions due in large part to socio-technical meltdowns, nihilistic border wars between self-proclaimed forces of light and forces of darkness, and so on. Those things — the dotcom collapse, 9/11, Iraq and Afghanistan, Hurricane Katrina, the current financial fiasco — took years of benighted and negligent despotism, much of it deeply informed by Christian fundamentalist belief in the end of the world... Then again, maybe Y2K remediation really did work. Maybe we postponed a bunch of punctually synchronized disasters only to find that a cabal of resentful geezers hell-bent on reimposing the affirming clarity of black and white were quite capable of serving up a bunch of slightly desynchronized disasters anyway. And unless you genuinely believe in a Second Coming, the only alternative explanation for apocalypses that never came but nevertheless transformed the world is that a bunch of people were motivated enough to make it happen.
Most of the time it makes sense to insist on strict causality, but sometimes we do well to be a bit more flexible. Cultural analysis — and in particular cultural analysis of time — requires a little flexibility for a few reasons: first, because “strict causality” is itself a cultural construct that presupposes very specific models of time; and, second, because one thing that culture does very well is obfuscate causality. As Marx showed us, time is anything but constant — it’s a human creation and, as such, its structure, form and meaning vary over (yes) time. And as Foucault showed, the morphology of human creations isn’t smooth at all. Now, soon enough, you’ll have a stage full of geezers hell-bent not on reimposing the affirming clarity of black and white but just the opposite — thinking about how art and activism just don’t have the liberating potential they used to.
Allegedly, the occasion that brings us together is an urgent crisis in art and activism, even though that crisis has been going on for years now. Apparently, we were brought together by this immense historical process, even if the form that took was an email from Konrad rather than a tap on the shoulder from the Zeitgeist. None of us are all that young, but we are separated by more than a few decades and “generations.” We have lots of latitude to assert our autonomy from a certain level of historical determination by disagreeing with each other — though the same words mean very different things depending on who says them and why. And where. And, above all, when. If I were to ask “Where are we?”, it’d probably seem like an irritatingly rhetorical question. However, if I were to ask “When are we?”, it’d just sound wrong, like I don’t speak English. Why is that? Are we really so certain that we know when we are that we can relegate such a simple question to idiomatic oblivion? Or is where we are so much more interesting than when we are?
If I have an argument today, it’s this: an uncertainty about when we are is at least as much a cause of whatever cultural crises you care to diagnose as is, say, “late capitalism,” “neoliberalism,” or some other premature judgment. And if I have a second argument — and I do — it’s that this crisis has been happening now because we’re still living in the shadow of the year 2000. And that it’s a crisis because, historically speaking, we have no idea whether we’re coming or going, whether we’re at an end or a beginning. When we look out of one eye, we’re gobsmacked as we watch cultural edifices collapse on a weekly basis; but when we look out of the other eye, we see endless expanses of continuity, inertia, habit, repetition — business as usual. It’ll take a long time to understand these staggering contradictions, and my hunch is that the most sensitive explanations will emphasize just how desynchronized various cultural phenomena became in our time — how things that were held together by broad social narratives fell apart, or “unwound” as they say in finance these days, at different times and at different speeds. We know how brutally these social forces can bear down on and tear apart individuals, how precarity kills not with a single strike but through a thousand cuts.
The irony is that, if I’m right, the impact of the year 2000 isn’t some immensely intertwingled, global, synchronized countdown — that would be a typically self-serving modern cartoon of medievalism. Rather, it remains exactly what they told us it was, the cusp between the present and the future. But right when we desperately need to think reflexively and historically, we’re unable to do so because our most basic temporal categories have fallen apart. The present won’t go away, but the future isn’t here yet — and we’re getting the sense that it never will be.
William Gibson got it backwards when he said that the future is here, but it’s unevenly distributed; my retort is that the past is here but it’s unevenly distributed. Most of the generations heavily represented here today grew up on a steady diet of stories about dinners in a pill, life in undersea bubbles, personal helicopters, and so on. It was pop nonsense, of course. Nevertheless, the idea of a future structured around the rise of technology and the freedoms it granted was and remains extremely powerful. And, indeed, we have some very remarkable new things; but instead of dinner in a pill we got E. coli and salmonella in prefab food, instead of the personal helicopter we got the TSA, and so on. The future we got — our present — has turned out to be much more precarious, much more uncertain, and much more frightening.
You’d think that the reason we’re having this discussion in public is to communicate it to others. And you’d hope, further, that the audience would be more varied than a bunch of geezers nodding in agreement about how everything sucks these days. As artists, activists, and their ilk, you might even hope that this discussion would interest lots of younger people. I certainly hope so. But even if the conceit of today’s event is that we’ll come up with some clever new ways of thinking about the fields cultivated by “art” and “activism,” there’s a real danger in diagnosing the current situation as a crisis vis-à-vis some “golden” (or maybe just “silver”) age of art and activism. The none-too-subtle message to you whippersnappers out there is that, through no fault of your own — you didn’t even have a choice or a chance! — you missed the boat, and all you’re left with are empty, sunken hulls. That may turn out to be true, but it’s up to you. Then again, if it weren’t for our crisis, you’d be stuck with all our stuff.