If you want to make contact about any non-fiction or poetry item, you can contact me directly on peter at peterjukes dot com. Otherwise, when it comes to anything concerning film, TV, stage or radio drama, the best first port of call is probably my UK agent, Howard Gooding at Judy Daish Associates (how
In light of the recently announced public inquiry into the murder of Daniel Morgan, I've copied below the MS portion of the chapter from my book The Fall of the House of Murdoch to aid the resource page set up by Jack of Kent. It could be a useful summation of events up to July 2012.
The Murder of Daniel Morgan
It's hard to believe, but at 4pm BST today it will be exactly a year since Nick Davies and Amelia Hill published online a leak from Operation Weeting, the newly created (third) investigation into phone hacking, and revealed that the News of the World had hacked the phone of a missing 13-year-old s
The critically acclaimed US television drama could not be made here. We have writing talent in abundance, but its output is controlled by a stifling monopoly—the BBC. Plus, an interview with The Wire's creator David Simon.
No, there is no major news about the three major investigations into multiple phone and computer hacking, bribing police officials, or perverting the course of justice by News International in the UK. Nor is there any major development in the DOJ investigation into the parent company
Today in the High Court, News Group Newspapers, the News Corp subsidiary responsible for the defunct News of the World and The Sun, is settling dozens of hacking and surveillance claims in an attempt to avoid a High Court case on Feb 13th which could result in punitive damages.
There are over 60 hack
Neither a philosopher, critic nor scholar, somehow Walter Benjamin (born 15 July 1892) succeeded in being all three at once. His friend Bertolt Brecht called his suicide in 1940
For anyone following the #hackgate FOTHOM diaries, you'll know that the slow motion crash of Murdoch's UK empire is still developing. But it wasn't until Rush Limbaugh's recent implosion that I began to think this isn't just about News Corp, even though it is the world's 3rd biggest media group
Exclusive to Prospect online, the full transcript of Peter Jukes's interview with historian and author Tony Judt
WHEN Peter Jukes let it be known last year that he was writing a book called The Fall of the House of Murdoch, a senior Sun editor emailed him to say: "Is this a joke?"
But with Rebekah Brooks and Andy Coulson both now facing charges over phone-hacking, and Rupert Murdoch slowly stepping back from
By Peter Jukes, June 18th 2013, 10:46 am. Charles Saatchi built up the world’s largest advertising firm and became the face of the swinging ’80s in London—only to be ousted from his own company. Peter Jukes on the reclusive man who now has been accused of choking his
Until a few years ago, you could be climbing any chalk down in Southern England. Trails lead up from a council estate, past a recreation ground. On the slopes above, young men with tattooed arms walk their dogs. The grass is like an old rug, woven with wild flowers, cabbage whites and meadow browns.
English version of the article that first appeared in the Polish Magazine Krytyka Polityczna
Though it claims to be one of the world’s fastest growing religions, and now holds over $1 billion in liquid assets, last year wasn’t great for the Church of Scientology. The news that its most famous pub
Today in Parliament
As expected, the appearance of James Murdoch, the Chief Executive of News International (and related to some other famous people) before the DCMS Committee today failed to produce any huge bombshells. Let's remind ourselves that the Parliamentary Committee has no real power
I’ve spent much of the last year on the front line of one of the most contentious presidential nomination contests in memory—without moving from my London desk. I have been part of something historic: the first great political battle to take place in cyberspace.
For many in Britain, blogging
As my colonial cousins recover from an overdose of turkey and tryptophan, let me prod you into consciousness with the Frank Miller problem - which also allows me to post some awesome pics.
No, the Frank Miller problem isn't as simple as you think. From his slapdash rant about the OWS movement on hi
It was a long time coming, but inevitable six months ago. James Murdoch has stepped down as chair of News International, signalling the Fall of the House of Murdoch as the dynastic succession to Rupert's News Corp empire is finished. The official statement - which is probably worth no more than a ho
Having seen the excellent TV series, I'm disappointed by the novelisation of Middlemarch. George Eliot's book lacks the rigour and economy of Andrew Davies' original. Long authorial interventions ruin the immediacy and the balance between the characters of Lydgate and Dorothea has been lost. A brave attempt, but the adaptation begs the question posed by To Play the King: why reduce a good television series to a lesser book?
If this inversion of the usual literary bias doesn't seem entirely absurd, it's a tribute to the quality of British television drama. We often overestimate the international appeal of our output (it doesn't sell so well abroad) but there is little doubt of the medium's importance to our national culture. In Luton and Latin, from Breakfast Time to the Late Show, television is a central part of the dialogue we have with ourselves. And if one dramatist had to be thanked for the high esteem of the medium, it must be Dennis Potter.
The works Potter is hurrying to finish before his death, Karaoke and Cold Lazarus, will be his farewell to the form he shaped so much and to which he devoted his life. They could also be a long goodbye to the writer's eminence in the industry. Potter is recognised in the street, vilified in the tabloids. His plangent voice, mesmerizing in his last interview with Melvyn Bragg, has amused and accused on the broadest platforms. Potter has put television writers up there with the best of them and will be considered, without doubt, along with Pinter, Betjeman, or Amis as a key figure in postwar British writing. Less certain of survival, however, is the peculiarly British tradition of the TV author he helped invent.
Script writers abound on television, but an author, one who originates form and content is a rare breed. Even the most successful writers today such as Andrew Davies or Lynda La Plante tend towards the adaptation of books or work within fixed genres such as the police or detective serial. For all their merits, To Play the King or Prime Suspect have not shaken the boundaries of what television drama can be. Anyone watching last week's BAFTA ceremonies would have heard writers constantly praised. They would have been hard-pressed, however, to spot one between the anonymous executives and famous actors gathered in Drury Lane. Appropriately, the awards for Best Drama Series and Best Single Drama, won by Between the Lines and Safe, were collected by the programme producers.
Here I must declare an interest: a lurking desire to fluff a BAFTA acceptance speech. Having written for stage, radio and print, I know that TV writing is one of the least respected forms. I also believe that, in technical terms, it is one of the most demanding. The closeness of the camera shows up any falsity of dialogue or character. Meaning has to be conveyed in action and, though it remains unseen, a good script should read like a good novel, the visual storytelling and mise-en-scène setting the rhythm for everything the actors and directors do. Despite this, in financial terms, the writer is near the bottom of the heap. TV drama costs about half a million pounds an hour and the script is usually a small fraction of the budget. Writers can be jettisoned at relatively little expense, while an aberrant director or star can lose millions.
This leads to a strange twilight existence for TV writers: long bouts of waiting-around interspersed with moments of panic. Over the last five years I've 'developed' six original screenplays or pilots. These I've had months to polish. They represent my best work, calling cards for new commissions. All remain, largely due to external reasons, unproduced. In contrast, last minute production decisions create the opposite problem when writing for existing serials. In just over a year I've completed seven scripts - most in less than a month.
Many blame the decline in the writer's profile on the demise of the single play. ITV has almost entirely dispensed with one-off dramas, and the BBC has transmuted them into high cost, director-led, producer-controlled 'screenplays'. When Potter started out, a dramatist could hone his or her skills in Armchair Theatre or Play for Today at relatively little cost. Today's aspirants either have to survive years of high casualty 'development' or cut their teeth in genre formats. Whatever the merits of The Bill, or Brookside or Casualty they are not the ideal grounds for experiment and, increasingly, storylines are provided by script editors, storyliners, producers or executive producers. In such an environment it is easy to become just a writer of dialogue and fail to take responsibility for the entire shape of a piece.
It would be wrong, however, to suspect a managerial conspiracy. The single play has been largely abandoned because the audience abandoned it. If the demand was there, ITV would fill its schedules with single plays (just as, if it made films the public wanted to see, the British film industry would not be in such a poor state). Besides, even in its heyday the studio drama was a strange hybrid between stage and film. Series such as Z-Cars nurtured as much writing talent, and the most original television writing has been in serial form. In Potter's case, his two greatest achievements to date are both serials, Pennies from Heaven and - still unsurpassed for narrative brilliance - The Singing Detective.
Television's great dramatic innovation has been the series or the serial (who cares which - most of the audience don't). The recurrent slot, the sense of development and repetition, is unique to the form. Trevor Preston has called the series the 'television novel' and to this extent Andrew Davies is the best contemporary representative of George Eliot. The popularity of the Victorian novel, with queues forming for the latest weekly instalment of Dickens in Household Words, is much closer to Shepherd's Bush than Bloomsbury.
But the novel is a solitary form and drama, by definition, is collaborative. For every new Alan Bennett, Bleasdale, or Debbie Horsfield, there are ten dramas whose authorship is unknown or dispersed among many. This needn't necessarily dilute the quality. Both Morse and Cracker owe their origin to a number of hands and in America, where authorship is even more nebulous, team writing is the norm. The result is not always writing by committee. On the contrary, in recent years, imports such as LAPD, Twin Peaks, thirtysomething, or Hill Street Blues have shown more narrative innovation than home-baked material. One has to go back to Boys from the Blackstuff or Edge of Darkness to find a British series at the cutting edge, changing the way we tell our stories as well as the stories we tell.
When people complain about the decline in television authorship, therefore, they are actually bemoaning a general lack of originality and invention. Budget and production values may have increased, but so too have the number of formulaic lego-built dramas. This sense of sameness and caution is not a mystery. It has a simple explanation: the massive concentration of commissioning power in a few hands.
Despite the flourishing of independent companies in the 1980s, the apparent diversity conceals a sharp and disturbing centralisation. You may now devise a project with any number of small companies, but it can only be given the go-ahead by a decreasing circle of people. The base has widened, but so have the decision-making layers, each layer trying to second-guess what the one above will say. At the top, the pyramid has become almost perfect. The decision what millions will watch every week is effectively taken by only four men.
In any other industry this structure would come in for some kind of investigation. No matter how brilliant these four men are, how broad their minds, diverse their tastes, such a funnelling of cultural power must be inhibiting. Ironically, it was the Office of Fair Trading that imposed the central commissioning and scheduling Network Centre on ITV. The BBC has no such excuse. Ever since El Dorado, drama has been make or break for BBC controllers, so both Yentob and Jackson patrol the commissioning process ferociously. Like cold war enemies locked in combat, BBC and ITV have begun to resemble each other, with the same mad monolithic structures.
In this climate, authorship, like authority, is the prerogative of a few. How can writers feel responsible for their work when producers, their executive producers, even their heads of department can't feel responsible? Like so many other apparent 'reforms' in health and education, deregulation in television has disguised an increase in control from the centre. Perhaps it's a perverse tribute to the cultural importance a writer like Dennis Potter gave to the medium that the government has done so much to stifle it. But it's a tribute we can do without.
It's been 26 years since I last remember police cars speeding up my road to the riots at the Broadwater Farm estate in North London. Then - just like last night - a policing incident had been the spark that ignited latent social and political tensions that had been building for years. The previous Tottenham riot wasn't an isolated incident: prior to that there had been riots in Brixton in South London, Toxteth in Liverpool, and Handsworth in Birmingham. And last night the same scenes returned to my beloved city.
As is often noted, we all have a tendency to fight the previous war. Just as the 'quagmire' of Vietnam led to reluctance to intervene in Bosnia (at the cost of hundreds of thousands of lives), so too the successes of Kosovo led to the peremptory and ill-planned interventionism of Iraq. But Libya is not Iraq. As jubilant crowds fill Green Square, the fall of Tripoli to the rebels is a victory on many counts
If there’s any shred of comfort that can come from the horrors of ten days ago, the bomb attacks in Oslo and massacre of dozens of teenagers in Utøya, it is scant consolation for bereft families or a nation in mourning. The biggest atrocity on Norwegian soil since World War II, and one of the biggest terrorist incidents in Europe in decades, is no occasion for political point scoring. But some good may yet come out of it: the full glare of public scrutiny (and one hopes police attention) has now been turned on the largely ignored growth of extreme right-wing Islamophobia in Europe.
Peter Jukes (right) with Judt (middle), 2007
Tony Judt died, surrounded by his family, on the evening of August 6th, 2010. The New York Times obituary can be read here. This is the full transcript of Peter Jukes’s interview with Tony Judt—conducted earlier this year via email, due to the progress of Judt’s motor neurone disease. The full text of Jukes’s portrait of Tony Judt is featured in the August issue of Prospect, and can be read online here.
Peter Jukes: I’ll start with a confession. Before we first met 12 years ago at the Remarque Forum [a conference Judt sponsored as professor of history at New York University] a joint friend of ours sent me an example of your work—a chapter I think from your 1979 book Socialism in Provence 1871-1914. I must admit my heart sank. I’m sure it was compelling and well documented, but it gave no indication of the liveliness and relevance of the discussion at that Forum, nor of the range of your writing. I think you must have been half way through your compendious history of the whole of post-war Europe at the time.
So my question is: how did you move from the micro-analysis of the French left between two wars to the often global historical issues you address today?
Peter Jukes discusses history, life and justice with the late Tony Judt—a master of morally charged rhetoric
Tony Judt and friend in Israel in 1967: being an ex-Zionist has helped him tackle the controversial issue of America’s relationship with Israel.
Tony Judt died, surrounded by his family, on the evening of August 6th, 2010. The New York Times obituary can be read here. A full transcript of Peter Jukes’s interview—the last in-depth interview Judt undertook before his death—can be read on our website here.
Though not one to run shy of controversy, Tony Judt—historian, thinker, professor; commentator on the French left, American identity politics, Israel and much more besides—has never been one of those controversialists whose opposition can be predicted. In the many times I’ve heard him speak, I have never been able to guess in advance what he would say next.
Part of this unexpectedness is no doubt due to his career spent dealing with the exigencies of history rather than the sweeping formulations of philosophy or cultural theory. Born in London in 1948, Judt took a doctorate in history at Cambridge before moving to Paris to study at the École Normale Supérieure. His first book, Socialism in Provence 1871-1914, appeared in 1979 and explored a small slice of time in forensic depth. Next came a series of essays on the French left, followed by a book on postwar French intellectuals; it was only gradually that Judt moved on to larger canvases. As he put it to me: “My first non-academic publication—a review in the Times Literary Supplement—did not come until the late 1980s. And it was not until 1993 that I published my first piece in the New York Review of Books. So that’s a 25-year learning curve.”
I interviewed Tony by email earlier this year (a full transcript of this exchange is available here). The motor neurone disease he was diagnosed with in 2008 has rendered him quadriplegic and he dictated his replies to an assistant. Tony and I have known each other for 12 years, having met at the 1998 Remarque Forum—a conference he sponsored as professor of history at New York University and the first of what would become almost an annual institution, devised to keep the transatlantic dialogue alive after the cold war. The event itself, located at a remote retreat on the border between Florida and Georgia, was no typical academic conference but comprised an eclectic mix of writers, entrepreneurs and other European and American professionals on top of the historians and political scientists—from a former ballet-dancing Swedish cultural attache to a neuroscientist who would go on to be the CNN doctor.
The subject matter was “cultural policy,” something Tony confessed he knew little about. But listening to him was like watching an experienced jazz musician find a new riff, as he discoursed in sentences of considerable syntactical complexity and nuance and arrived, unbidden, at new conclusions. Since few of us then knew who Tony Judt was, we argued back in equal measure. The forum remains my first and best experience of the power of public discourse: diverse, dissenting voices, encouraged to be provocative and exploratory.
Given the controversy that Tony’s writings have generated in recent years, it’s easy to forget just how instrumental the Forum, and the Remarque Institute he founded in 1995, have been in encouraging others to take intellectual courage. Tony admits he is something of an “awkward customer”—a trait that may derive from his childhood in a left-wing Jewish household in London, or his youth amid the adversarial dialectics of Oxbridge. The need to go marching towards the sound of cannon fire has taken him on various journeys—most notably to Israel in June 1967 during the six-day war, when he worked as a translator and driver for the Israel Defence Force. He was in Paris in 1968, and visited Prague, Warsaw and California in the 1980s.
By the time I met Tony he was already writing the book on European history that would consolidate his international reputation: Postwar. Largely a work of assimilation and historical consensus, it was in the appendix—entitled “The Unconscious of History”—that Tony set about unpicking one of the era’s darker and more disturbing trends: the way memories of the Holocaust have been used and abused over the last 50 years. This background paved the way for him to enter one of the most contested issues in US intellectual life: America’s relationship with Israel and Zionism.
A 2003 essay in the New York Review of Books, “Israel: The Alternative,” triggered the first great furore. In it, Tony made the case for “a single, integrated, binational state of Jews and Arabs, Israelis and Palestinians” rather than the “doomed” two-state solution. This argument has been heard in Israel since its foundation, but time and place are everything and Tony’s gift for saying “unpopular things in large public spaces” (his own words) brought widespread condemnation in the US.
Previously reticent about his background, Tony is honest about how he used his personal history to leverage this debate: “Being Jewish is not enough. Being an ex-Zionist is not enough. But being an ex-Zionist who wore the Israeli army uniform, and has a pic of himself complete with cutie and sub-machine gun [see opposite]: that helped. And in this case, the end justified the means. No one can shut me up on this subject, so they are forced to resort to cliches about self-hating Jews and the like: evidence of failure.” These revelations came at some personal cost. His op-eds disappeared for a while from the New York Times and he ceased writing for the New Republic. It’s still too early to tell whether Tony’s intervention helped or hindered productive dialogue, but he certainly opened a space to talk about middle-eastern politics which, in America at least, did not exist before.
And he did it with an acute sense of timing. The larger dramatic context was 9/11 and the subsequent invasions of Afghanistan and Iraq. Being a member of the Remarque Forum email list in 2001-03 was a memorable lesson in how the transatlantic dialogue was drifting apart, especially when one of the New York contributors wrote, on the morning of 12th September: “We’re all Israelis now.” This was one of the moments when not quite knowing what Tony would say was nerve-wracking.
The Forum had often discussed Nato military action in Bosnia and Kosovo and—to the shock of some—Tony had fairly unequivocally supported it. One would thus have assumed he was in the camp of the “liberal interventionists,” especially when he also spoke out in favour of the initial policing actions in Afghanistan. But then along came Iraq, and in the September 2006 issue of the London Review of Books Tony produced a devastating critique of the co-option of liberals by the neocons—titled, after Lenin, “Bush’s Useful Idiots.”
Four years later, its conclusion still resonates: “Intellectuals should not be smugly theorising endless war, much less confidently promoting and excusing it. They should be engaged in disturbing the peace—their own above all.” As he explained it to me, “My objection to all my liberal friends who ran with the Iraq hawks is that they were not making the case for liberal interventionism but for exemplary war… I don’t believe that one should have one-size-fits-all moral rules for international political action. That’s what misled [Adam] Michnik and [Michael] Ignatieff and others: because they believed in rights for Czechs and Poles they had to believe in them for Iraqis too and so had to back plans to liberate the latter.”
For all this, Tony remains ambivalent about his role as a public figure. “It does irritate me when I am described as a controversialist and commentator on Israel. I see myself as first and above all a teacher of history; next a writer of European history; next a commentator on European affairs; next a public intellectual voice within the American left; and only then an occasional, opportunistic participant in the pained American discussion of the Jewish matter.”
There’s little doubt, though, that his restless drive to pursue an idea to its proper conclusion has lost Tony friends and allies over the years. It’s not just the authority of knowledge that makes him hard to ignore—there’s also the talent for the provocative comparison or phrase. Tony admits that even during his undergraduate days at King’s College, Cambridge, in the late 1960s, “I was—and knew I was—among the best speakers and writers of my age cohort… John Dunn, my favourite King’s supervisor, once described me as ‘the silver-tongued orator’: a barbed compliment, since it suggested that I spoke before I thought and seduced rather than convinced. But I like it all the same.”
One of the things that entranced me at that first meeting with Tony was this ability to “speak before thinking”—or, rather, to “think while speaking.” Even before the onset of motor neurone disease in 2008, which forced him to conflate speaking and writing (his subsequent works have all been composed in his head and then dictated), Tony both talked in joined-up literary sentences and wrote with an aphoristic freshness. You can see this stylistic mastery let loose in the pieces he has been publishing this year in the NYRB: 1,700-word slices of memory and reflection about, as he put it in the first of them, “events, people, or narratives that I can employ to divert my mind from the body in which it is encased.”
Apart from these, the major work written under the extraordinary conditions of his paralysing illness is also Tony’s least historical and most overtly political. Published this March, Ill Fares the Land began life as a public lecture undertaken “to prove that what I had been saying about this disease—that it doesn’t affect your mind—was externally verifiable… I suppose the book would have been a little tighter and maybe more methodologically consequential if I had done it the old way. But it would surely have lacked the energy and anger.”
The result is an indictment of the Anglo-Saxon model of free-market economics since 1979. If there’s an element of improvisation to the book’s origin, there’s nothing ad hoc about the host of references it unlocks. The historical range underpinning Postwar is all here, but deployed to explain that government wasn’t always bad and that collective action wasn’t always tainted by the “socialist” authoritarian smear.
Dedicated to Tony’s two American-born sons, the text tries to address both a transatlantic and generational divide. The result is an occasionally awkward attempt to reach very different audiences, but from its opening sentence it rarely ceases to compel: “Something is profoundly wrong with the way we live today. For 30 years we have made a virtue out of the pursuit of material self-interest: indeed, this very pursuit now constitutes whatever remains of our sense of collective purpose.” Ill Fares the Land has attracted some negative reviews from both right and left for skipping over the historical failures of Keynesianism and the electoral success of the “third way.” But the financial collapse of 2008 and the way the state had to step in and “save capitalism” are pretty unassailable arguments against the status quo ante.
What Ill Fares the Land does not do is describe a grand project for a brand new hegemonic future—it’s much more modest than that. When I ask him about what some might see as the book’s inherently tragic vision, Tony rejects the idea: “You can’t have a tragic vision in politics: not if you wish to intervene and convince… One of the very few things that I know I believe strongly is that we must learn how to make a better world out of usable pasts rather than dreaming of infinite futures. It’s a very late-Enlightenment view that says that the only way to make a better future is to believe that the future will be better.”
Judt has studied, combated and even taken part in many of the radical new beginnings of the last 60 years: Zionism, the new left, deconstructionism, neoconservatism, neo-liberalism, new Labour. Yet from a deep-rooted left-wing liberal perspective, Ill Fares the Land ultimately offers a conservative conclusion. The left has lost too much in its obsession with newness and creative destruction, it argues. The “usable pasts,” with all their limitations and possibilities, have the virtue of being known.
One of these pasts is the lost role of the public intellectual: well-informed but willing to range beyond the ghetto of expertise—who doesn’t just observe but also tries to intervene or provoke. The economic background to Ill Fares the Land might be incomplete but the reappraising anger is just. Tony’s willingness to take on this topic—to use his fast-depleting energies on this particular stage at this particular time—is yet another dramatic intervention, combining a personal voice with a knowledge of history and sense of occasion in a way that is both responsive and responsible, timely and moral.
Click here to read the full transcript of Peter Jukes’ interview with Tony Judt
Thank God. Finally. Having come to office with a promise of healing divisions, and being faced with an obstructionist opposition veering increasingly towards the zero-tax anarchy of the Libertarian Right, your President finally came clean. No, he isn't a corporate tool. (Hell, if he was, wouldn't he have taken those Wall St job offers rather than a meagre community organiser's job?) Yes, he went there, and picked on that 1% which Joseph Stiglitz has so graphically depicted as enriching themselves in the last two decades. For those who haven't read it, here's his opening premise:
"It’s no use pretending that what has obviously happened has not in fact happened. The upper 1 percent of Americans are now taking in nearly a quarter of the nation’s income every year. In terms of wealth rather than income, the top 1 percent control 40 percent. Their lot in life has improved considerably. Twenty-five years ago, the corresponding figures were 12 percent and 33 percent. One response might be to celebrate the ingenuity and drive that brought good fortune to these people, and to contend that a rising tide lifts all boats. That response would be misguided. While the top 1 percent have seen their incomes rise 18 percent over the past decade, those in the middle have actually seen their incomes fall. For men with only high-school degrees, the decline has been precipitous—12 percent in the last quarter-century alone. All the growth in recent decades—and more—has gone to those at the top. In terms of income equality, America lags behind any country in the old, ossified Europe that President George W. Bush used to deride. Among our closest counterparts are Russia with its oligarchs and Iran. While many of the old centers of inequality in Latin America, such as Brazil, have been striving in recent years, rather successfully, to improve the plight of the poor and reduce gaps in income, America has allowed inequality to grow."
Does Obama listen? Does he dismiss as sanctimonious those who have criticised him for not pushing back? For being missing? Not any more. Here's the money quote from his speech today.
"Think about it. In the last decade, the average income of the bottom 90% of all working Americans actually declined. The top 1% saw their income rise by an average of more than a quarter of a million dollars each. And that’s who needs to pay less taxes? They want to give people like me a two hundred thousand dollar tax cut that’s paid for by asking thirty three seniors to each pay six thousand dollars more in health costs? That’s not right, and it’s not going to happen as long as I’m President. The fact is, their vision is less about reducing the deficit than it is about changing the basic social compact in America."
(Full speech here) Full disclosure: though I'm thought of as an Obamabot, I'm actually well to the left on economic issues, especially income inequality, financial monopolies, and the importance of a mixed economy. Having sought consensus, and having been handed a financial crisis not covered by his electoral mandate, I'm glad there are signs he is finally pushing back on this issue. His argument is fact-based, personal, anecdotal. It's not about ideology but reality. It's one of the few ways to convince what, in the 30 years I have known it, has become an increasingly right-wing, anti-government country. What do my fellow Kossacks think?
See comments and replies to original diary on Dkos
THE DARK WAVE
“Looking out to sea, I noticed a dark black object travelling toward the shore. At first sight it seemed like a low range of hills rising out of the water…. A second glance – and a very hurried one at that – convinced me that it was a lofty ridge of water many feet high.”
That’s how a Dutchman saw the advance of a massive wave on the small Indonesian town of Anjer. He was also one of the few to survive and describe its retreat.
“The sight of those receding waters haunts me still… As I clung to the palm tree…. There floated past the dead bodies of many friend and neighbour. Only a mere handful of the population escaped... scarcely a trace remains of where the once busy, thriving town originally stood.”
This description, horribly familiar after the Indian Ocean earthquake of late 2004, is in fact an eyewitness account of the massive eruption of Krakatoa in August 1883. Though separated by over a century, these two cataclysms are closely connected. Both took place along the same plate boundary in one of the most tectonically active parts of the world. On both occasions it was the subsequent sea surges, rather than the original eruption or earthquake, that caused the most devastation and loss of life. From a geological perspective, neither event is exceptional, just the regular release of energy from a subduction zone where the Indo-Australian and Eurasian continental plates collide. But Krakatoa marks a significant moment in human history. As Simon Winchester explains in his timely book, the development of the electric telegraph, deep-sea cabling, and the burgeoning news organisation founded by Julius Reuter meant that news of the 1883 eruption travelled round the world within 24 hours. Krakatoa was probably the first time a natural disaster became a global news event.
Over a hundred years later, thanks to advances in scientific knowledge, we understand the mechanisms of plate tectonics and can – to a limited extent – predict and mitigate their impact. But though science provides a rational explanation of natural disasters, modern mass communication can have the opposite effect. We now witness the calamity almost instantaneously. Images of terror and death come crashing through our TVs into our living rooms. With the wide availability of hand-held video cameras we also experience the scenes from a vivid personal perspective: parents screaming at their kids to back away from the hotel balcony as a second wave comes; holiday-makers trying to outrun the tsunami through forests and paddy fields; and, most haunting of all, a wedding video from Banda Aceh which ended with shots of a town swept away in a turbid mass of bodies, vehicles and debris.
Modernity may have eliminated much of the mystery of many disasters, but it has done nothing to diminish their terror. In fact, media images seem to affect our perception of the events themselves, with many eyewitnesses describing the experience as being ‘like a dream’ or ‘something out of a movie’. A feeling of unreality is common to many survivors of such traumas, particularly children, who talk of the disaster as if it had been foretold in stories or prefigured in dreams. Indeed, these nightmarish visions may need no real event to provoke them.
"I saw this image in my sleep, how many great waters poured from heaven, drowning the whole land… The deluge fell with such frightening swiftness, wind, and roaring that when I awoke, my whole body trembled; for a long while I could not come to myself. So when I arose in the morning, I painted what I had seen." 
This description was scrawled in 1525 by Albrecht Dürer beneath his sketch ‘The Deluge: a Vision’. Years earlier, as a young man, Dürer had made his fame and fortune as a purveyor of apocalyptic imagery. The years leading up to 1500 were filled with political and religious fervour, and Dürer’s woodcuts of the Book of Revelation were among the first bestsellers of the age of mechanical reproduction. But this impromptu watercolour is a private nightmare, more personal and disturbing. Stripped of religious imagery and symbolism, it shows a large dark blob of water looming over the landscape, looking prophetically like the base of a mushroom cloud.
It seems, with or without direct experience, we have a deep psychological need to imagine the worst before it happens. These cataclysmic visions haunt our dreams and pervade our culture. But where does this fascination with catastrophe come from? To a certain extent it’s obvious: our evolutionary success as a species is linked with the sudden leap in brain power which allowed us to extrapolate, imagine the future, and plan ahead. Humans are highly successful ‘scenario-building machines’, and projecting calamities – floods, earthquakes – may be a mental early-warning system to help us prepare for the worst, act accordingly, and survive. But another dream suggests that our fascination with destruction may not always help us avoid it.
“I dreamed I saw a great wave climbing over green lands and above the hills. I stood upon the brink. It was utterly dark and hideous before my feet…. I could only stand there, waiting…”
This was the childhood nightmare of another master of apocalyptic narrative, J.R.R. Tolkien, and shows how the dream of destruction can lead to paralysis rather than action. We are frozen, rooted to the spot, surrendering to the awe-inspiring force of nature. This feeling of acquiescence explains the almost mystical feelings people have about catastrophes: they are our contact with a power greater than ourselves – through their terror we touch the face of God.
About two weeks after the Indian Ocean disaster, driving through the well-heeled suburbs of Atlanta, I saw an electronic billboard advertising a service for a neighbourhood church: TSUNAMI: ARE THESE THE END TIMES? Here, amid the paraphernalia of modernity – the drive-by banks, Starbucks cafes, wireless networks and SUVs – was a reversion to an older, mythical way of thought. Whatever the advances of science and technology, however widely dispersed the knowledge of continental drift and geophysics, it seems many of us are still addicted to signs and portents, to seeing in random or natural events some hidden plan or divine narrative.
This is what I call apocalyptic thinking. Sometimes this archaic but powerful narrative is as obvious as a billboard in Georgia. But the myth also persists, I believe, at an unconscious level in secular ideologies, popular culture, and contemporary politics. Not surprisingly, we can find the most complete account of it in religion.
RELIGION AND TERROR: DIVINE VIOLENCE
If religion is anything, it is a totalising influence, making claims over all of our lives, from our personal morals and social customs to explanations of how the cosmos originated and humanity began. It is in the area of origins that religion still regularly clashes with secularism and science – whether in the debates about the moment of conception, abortion and stem cell research, or in the continuing objection to teaching Darwinian evolution in Kansas schools.
One common complaint about the scientific narrative of origins is that it is cold, soulless, and reduces humanity to a series of haphazard evolutionary accidents. However, a quick look at religious accounts of origins is hardly more comforting. In most great myths – the floods in Gilgamesh, Inca and Mayan tales, or the great battles of the Mahabharata – divine creation is accompanied by conflict and cosmic turmoil. In the early Hebrew texts Yahweh shows the same capricious divine violence, culminating in the destructiveness of the biblical flood. But once Abraham has offered his son Isaac for sacrifice, God starts controlling his temper. He intervenes more indirectly and subtly, through the rise and fall of genealogies, kingdoms and nations. As befits an exiled people, God becomes more remote and self-contained. His main promise lies in the future.
It is probably no coincidence that the three great religions that flow from this Abrahamic tradition have themselves had such an impact on world history. Judaism, Christianity and Islam all have strong messianic elements, preaching of a God who remains aloof from human affairs yet interested in them, and who will one day reinstitute His kingdom and holy law. In a sense, the great innovation of these three monotheisms is that they offer an explanation of future ends as much as of historic origins. But these ends are more often than not apocalyptic – as tumultuous as the collision of tectonic plates.
Descriptions of the Last Times vary in detail, but their significant elements remain the same. The usual social order is inverted. Guests are not safe. Father is set against son. Children are disobedient to their parents and show no gratitude. A wider social breakdown occurs. All respect for authority is lost. The pious seem insane while madmen govern. Vice is called virtue, and vice versa. The world falls into promiscuity and enters a dreadful decadent epoch.
In the countryside, the land is divided, and excessive wealth is dug from mines. Gold and iron are used as means of greed and war. Natural resources are exploited. Young and pregnant animals are killed for their skins. Forests are burnt and lakes are drained. Soon the natural order is disrupted: there is darkness at noon and unseasonal weather – snow in summer and drought in winter. Fruit rots on the trees, grapes on the vines. The earth turns barren. The air thickens. In the cities, luxury reigns, then gluttony, then lascivious insolence. Men behave like goats and sheep, fowls and swine. The day of destruction nears. Apostates, false prophets and messiahs appear. The majority turn to idolatry, while the faithful few wait for a sign. A last battle looms, a final struggle or jihad. The tiny band of the faithful looks like it will be easily overwhelmed by the unbelievers. But then there is a sign – a great wind, or a fire, or a plague, or a flood – as God finally intervenes. Mountains are separated like carded wool. Mankind is scattered like moths. The destruction is a purgation, a negation of the negation, sweeping away the vanity of the old world so that a new one can appear.
One can see how this messianic tradition thrived (and continues to thrive) as an underground movement through varied contexts of oppression or exile. The prophet is a social satirist, revolutionary, and Hollywood movie maker rolled into one. By projecting compelling visions of Doomsday with all the twists and reversals of the genre, he is a dreamer of the absolute, providing a critique of the existing social order. For underneath the apparent fatalism and nihilism is a utopian impulse – a desire for good government and social justice. Judgement day in the prophetic tradition is not just God’s judgement on mankind, but our judgement on ourselves. A small amendment to the words of the 19th-century father of anarchism, Mikhail Bakunin, shows how the central premise remained relevant in the Age of Revolutions.
'Revolution [Revelation] requires extensive and widespread destruction, a fecund and renovating destruction, since in this way and only in this way are new worlds born...' 
Since the Book of Revelation was a late and long-disputed addition to the Christian canon, it is sometimes argued that apocalyptic thought is only a marginal strand of Christianity, but this ignores the messianic elements scattered throughout the New and Old Testaments (especially Ezekiel and Zechariah). In the early years of the Christian church an entire branch of theology – eschatology, the study of last things – was devoted to the apocalypse, and many of the great theological schisms centred round the supposed date of the day of judgement. Throughout the middle ages, well into the Reformation, millennial sects emerged to exploit or interpret times of crisis and social upheaval as signs of the second coming.
This radical tradition also took root in North America with the non-conformists who founded the early colonies. Apocalyptic prophecy flourished during the Second Great Awakening in the 19th century and has remained a central tenet of many of the evangelical churches in the US, particularly among the Church of the Latter-day Saints (Mormons), Seventh-day Adventists and Jehovah’s Witnesses. These churches – some of the most rapidly expanding today – have in turn inspired a cultural movement obsessed with apocalyptic interpretations of current events. One of the most popular examples is the ‘Left Behind’ series of novels by Tim LaHaye and Jerry B. Jenkins, which has sold over 40 million copies worldwide.
This brings us to a perverse aspect of religious belief. For many agnostics and atheists, the death toll of natural catastrophes is used as an argument against divine intervention or design. The same questions were raised in the newspapers after the recent tsunami as Voltaire asked after the Lisbon earthquake of 1755: how can any benign God let such things happen? But this presupposes that God’s will is scrutable, and moreover misses one important part of his appeal: he is the God of the Dies Irae as well as the God of Love; his power and glory are displayed in wrath and terror as well as hope. A poll soon after the tsunami confirms this paradox: nearly twenty per cent of respondents in Ireland thought the disaster actually deepened their religious belief.
I would go further and argue there is something in the Abrahamic monotheistic tradition which is inherently apocalyptic, and which therefore lends itself easily to violence and terror. Martyrs are willing to sacrifice this life for the life hereafter, and, as we have seen long before the advent of suicide bombers, they are often willing to take other people with them. It may be a big leap from individual martyrdom to the extinction of the earth, but in apocalyptic thinking the leap is easily made. On a purely practical level, the belief in other worlds makes us much more wanton with the one we’ve got. If the mundane material world is just a shadow play of an ideal everlasting reality, is it not more easily disposable? Apocalypse means ‘tearing the veil’, and maybe that veil has to be ripped to shreds to reveal the true order beneath. In a sense, it’s better to believe that some kind of agency is in control – even if this is manifest in the destruction of the world – than to imagine the universe is random.
UNFORGETTABLE FIRES: THE AESTHETICS OF DISASTER
Max Weber suggested that one of the great achievements of the Enlightenment was that it separated the totalising claims of religion into three distinct secular spheres: knowledge about the universe became the domain of science; codes of personal or political behaviour went to ethics and law; while the emotional solace and sensuous appeal of religion became the province of the arts.
We don’t have to look far to see where the apocalyptic element survives in modern aesthetics. From Birth of a Nation, through the fires of Atlanta in Gone with the Wind, to the closing napalm sequence of Apocalypse Now, the spectacle of destruction has been a key component of the Hollywood film since its inception. Disaster movies like Armageddon, Independence Day and The Day After are typical summer blockbuster fare, and even films in other genres, such as the Terminator or James Bond franchises, inevitably end in a big bang as the nuclear warhead is detonated or the evil mastermind’s lair is blown up. It is easy to dismiss these pyrotechnics as adolescent fantasies, but as Gaston Bachelard points out in The Psychoanalysis of Fire, such images enthral us for a good reason: “fire suggests the desire to change, to speed up the passage of time, to bring all life to its conclusion, to its hereafter... The fascinated individual hears the call of the funeral pyre. For him destruction is more than a change, it is a renewal.”
It is an important insight. Disasters are not the total negation of things, but a rapid process of change from one state to another. Huge explosions are a startling demonstration of matter being turned into energy. To watch buildings demolished, or even people blown up, is a perverse revelation of their inner structure. This aesthetic attraction to potent and unpredictable forces has a long and venerable tradition. In classical theory it is known as the sublime.
Bold, overhanging, and as it were threatening, rocks; clouds piled up in the sky, moving with lightning flashes and thunder peals, volcanoes in all their violence of destruction; hurricanes with their track of devastation; the boundless sea in a state of tumult… The sight of them is more attractive, the more fearful it is, provided only that we are in security; and we readily call these objects sublime because they raise the energies of the soul above their accustomed height… to measure ourselves against the apparent almightiness of nature
Kant’s description of the sublime could easily be applied to most apocalyptic imagery. So too could his caveat ‘The sight of them is more attractive… provided that we are in security’. The real appeal of disaster movies is that, no matter how close the camera gets, we as an audience are actually at a safe distance.
A sense of distance from the spectacle is also something that separates the sublime from the other great genre inherited from classical antiquity – tragedy. For though we speak of disasters as being ‘tragic’, tragedy doesn’t need explosions, or final battles, or lightning. It happens on a smaller, more personal scale, and surprise is secondary, for the outcome of a tragic story is rarely in doubt. The audience know in advance that Oedipus, in his high-minded quest to rid Thebes of its curse, will discover he has inadvertently murdered his father and incestuously married his mother. The same is true of Hamlet’s hesitation in avenging his father’s death, of Uncle Vanya’s inability to declare his true feelings, of Willy Loman’s failure to hit his sales targets. We know what is going to happen, and our objective awareness of the inevitability of the story gives us, in Aristotelian terms, the feeling of terror. But our identification with the central hero, our subjective engagement and sympathy with his flawed intentions, compensates for this distance with a feeling of pity.
Contrast this tragic catharsis with the sublime apocalyptic vision. Events happen in sudden, unexpected shifts, usually in a massive landscape of clashing armies and thundering elements. Where individuals are depicted and some kind of empathy results, it is usually abruptly curtailed by the brutal intrusion of powerful impersonal forces. Instead of the Aristotelian catharsis of pity and terror, our main emotional response to the sublime is shock and awe: shock at the violence; awe at the forces revealed. This violence doesn’t have to be particularly epic in scale either. In Sophocles’ Oedipus Rex, for example, the hero’s self-mutilation takes place behind a screen. We have only Oedipus’ verbal account of his blinding and, in a sense, we see his blindness from the inside. At the polar opposite of this experience is the blinding in Buñuel’s Un Chien Andalou. We see an eye forced open – then a razor slash – then vitreous fluid pouring out of the eyeball. The sight is so graphic it is almost impossible to watch. We have objective clarity of vision, in close-up, but subjectively we are totally blind to the human cost.
It may be an abrupt shift from Greek tragedy to 20th-century surrealism, but it serves the purpose of emphasising how the apocalyptic sublime has survived in much modern art. From the Salon des Refusés, through cubism, Dadaism, fauvism, futurism and constructivism to modern conceptual art, there has been an avowed manifesto of iconoclasm, a consistent desire to shock the audience. If tearing the veil is an apocalyptic impulse, then an apocalyptic aesthetic is at the heart of the avant-garde project, both in its content and in its treatment of forms. Antonin Artaud expressed it in his Theatre of Cruelty, and similar violent ruptures preoccupy film, literature, music, sculpture and poetry. But the problem with shock tactics is that they are subject to a harsh law of diminishing returns: to keep ‘challenging’ audiences, artists require increasingly virtuoso performances, culminating in cows sliced up and put in formaldehyde.
In such extreme artistic images the Weberian separation between science, ethics and the arts begins to break down. Should our judgements be moral, aesthetic or forensic? If this confusion could just remain in the art gallery it might be classed as ‘interesting’, but the uncertainty also affects how real events are mediated. One classic case is from the Gulf War in 1991. When the CNN team reported from the Al Rashid hotel on the first night of the bombing of Baghdad, they described it as ‘the most fantastic fireworks demonstration since the fourth of July’, the flak from the anti-aircraft guns as ‘like a million fireflies’. One of the reporters had to be checked when he kept on describing the beauty. The aesthetic excitement at the spectacle was undermined by moral revulsion at the fact that people were being killed – and all this in what was supposed to be a factual, objective news report. No wonder the conflict over Kuwait became known as the first ‘virtual war’. But worse was yet to come.
What finally shattered the accepted norms of media coverage was the Al Qaeda attack on the World Trade Centre in 2001. Images from 9/11 are still difficult to deal with: too horrific and disturbing to be merely iconic, they also have their own terrible, sublime beauty. The images from Ground Zero are probably the most widely distributed mass media images of the age, and some of the most unrepeatable and taboo. There’s also no doubt that the perpetrators of the attacks chose their targets for their symbolic publicity value as part of an avowed campaign of ‘cultural destabilisation’. The essence of a terror campaign is psychological disturbance as much as physical devastation. In religious terrorism this psychic warfare is all the more important because one is ultimately fighting for souls rather than territory. Osama Bin Laden’s achievement was not just to turn civilian planes into guided missiles, but also to transform the technology of 24-hour media coverage into a global propaganda coup. It was another escalation in the virtual war.
Not long after the terrorist attacks on New York and Washington, and over a decade after CNN was so roundly criticised for their coverage of the Baghdad Bombing, the National Defense University designed a battle plan for the Pentagon. By combining a vast simultaneous attack by stealth aircraft, cruise missiles, and high altitude bombers, the aim was less to destroy military forces than to inflict a massive psychological blow to the enemy's will. The plan was called "Shock and Awe" and, to great media fanfare, was launched on Baghdad on the night of March 21st 2003.
THE CLASH OF ABSOLUTES: A NEW DIVIDE?
In a mediated world, where moral, political and even scientific judgements can’t be separated from the imagery through which we perceive them, it’s not just Weber’s distinctions that are eroded: the whole Enlightenment edifice is under attack. Back in 1990, after the Berlin Wall fell, intellectuals from right and left were already announcing the End of History - at least history in its linear enlightenment version of progress and social advancement. In a reaction against modern 'soullessness', people would turn elsewhere, seek a reunion with the dead and follow their desire for the 'beyond'.  In other words, religion would have its second coming.
To a certain extent these prophecies have come to pass. Religion is regarded as an increasingly powerful political and historical force in the first decade of the new millennium. In her book The Battle for God, Karen Armstrong describes the rise of Christian, Islamic and Jewish fundamentalism as “embattled forms of spirituality, which have emerged as a response to perceived crisis… Fundamentalists do not perceive this battle as a conventional political struggle, but experience this as a cosmic war between the forces of good and evil.” ‘Fundamentalist’ is a problematic adjective in that it implies a backward-looking, conservative force and misses the radical, revolutionary agenda. Perhaps a better term for these militant religious movements is ‘apocalyptic’.
In Europe, with our spreading secularism and lack of church attendance, we tend to see this fundamentalist apocalyptic trend as deeply alien. It belongs to Wahabi madrassahs on the Pakistan border, ultra-Orthodox Jewish settlements on the West Bank or – perhaps more worrying – to born-again Christians on Texas ranches. One of the recurrent themes of the coverage of the 2004 US presidential election was George W Bush’s religious faith: his reported belief that he was divinely ordained to lead the country through the war on terror, and that when consulting on political decisions he answered to a ‘higher father’. The success of his strategist, Karl Rove, in mobilising the evangelical vote is often cited as the key component of his election success. Liberal commentators on both sides of the Atlantic now fear that Bush is beholden to the Christian fundamentalists of the radical right – evangelical prophets like Tim LaHaye, co-author of the ‘Left Behind’ novels, who described 9/11 as a wake-up call to America: “Suddenly, our false sense of security was shaken. Now we realize we’re vulnerable. And that fear can lead many people to Christ… I see many signs of the Lord’s return.”
The only problem with this fear is that recent history does not support it. After all, it was only twenty years ago that a conservative Republican president was in power with the key support of the religious right. As Gore Vidal reflected ominously at the time, this president was also infected by biblical rhetoric, talking of ‘evil empires’ and apocalyptic struggles. During a time of nuclear re-armament and conflict in the Middle East, he listened to theologians like Hal Lindsey, who believed that the Soviet Union was the ‘Gog’ of Old Testament prophecy and reminded him that Armageddon is actually a village 55 miles to the north-east of Tel Aviv.
But for all the millenarian prognostications of the faithful, and the liberal forebodings of the sceptical, President Reagan’s administration led to the end of the Cold War, the fall of the Berlin Wall and a reduction in nuclear stockpiles.
Our mistake is to take the evangelical rhetoric of the American religious right at face value and, from our own history of religious wars, to believe the ideology is more established and totalitarian than is actually the case. At first sight it’s a paradox that the first major nation to institute a separation of church and state should boast so much religiosity in its politics, and such a thriving ‘faith-based’ sector. But without the backing of the state, preachers have always had to sell themselves in the American marketplace. (Hence the attention-grabbing sign ‘Are these the End Times?’ in Atlanta.) We should also remember that nearly half the nation – the powerful coastal and mid-western states and city populations that voted for Kerry – shares a sceptical belief-system about war and conflict similar to that of Europeans. Meanwhile, Christianity in the remaining red states is less like an organised religion and more like a competitive market, with churches and creeds rising and falling, going big and then going bust, like any other commercial US sector. American faith, even in highly successful organisations like the Church of the Latter-day Saints, is highly unorthodox and individualistic, and the allegiance of believers to their creeds is probably more akin to a consumer’s commitment to a brand. For all the talk of the end of the world and the approach of doomsday, most Americans are still optimistic, and much less fatalistic than, say, Germans about their lives being controlled by outside forces.
A sense of history should stop us from being too pessimistic about American apocalyptic thinking. It should also prevent us from being too optimistic about European immunity to it. I hope this brief tour will have indicated how it still permeates many aspects of secular culture and ideology. The messianic fervour exemplified by Bakunin runs through many supposedly secular ideologies of the 19th and 20th centuries. Both Communism and Fascism created their ‘elects’ – a vanguard of true believers – and projected the future as a series of catastrophes, whether of class struggle or some neo-Darwinian racial competition. Even the religious imagery is recognisable: the SS ‘totenkopf’ skull insignia is borrowed from medieval millennial art; the heroes of Marxist-Leninism in Soviet iconography look like bearded patriarchs clutching their sacred books. But the most important difference between the American and European apocalypse is much starker and simpler. Within living memory Europe was an active battleground for the clash of absolutes, where cities were razed and populations destroyed for competing visions of a new world order.
Those living memories are fading fast, and it’s always easier to divide up the world into sheep and goats than to hold on to a complicated, nuanced vision. Though most Europeans won’t go the whole way and describe the US as the ‘Great Satan’, it’s not unusual to hear people now cite the US as the main cause of most of the ills of the time, from ecological collapse, globalisation, unemployment and cultural decay to the forced consumption of Starbucks cappuccinos. I have personally heard well-respected intellectuals and commentators, both in the UK and Germany, suggest that the US brought the 9/11 attacks upon itself, not because of any foreign policy blunder, but because Hollywood had already projected such disasters in movies, and the culture had at times imagined its own demise. In such a way are signs taken for wonders, causes confused with effects. Those who polarise the debate between a secular, tolerant Europe and a hell-bent, born-again America are as guilty of the apocalyptic way of thinking as those they claim to despise.
Peter Jukes March 2005
Quoted in Simon Winchester, Krakatoa, London, 2003.
 One of the acute forms of Post Traumatic Stress Disorder is known as ‘Omen Formation’, in which the victim sees signs and portents of the trauma everywhere.
The Deluge can be seen in the Kunsthistorisches Museum in Vienna. The nuclear resonances of the picture are explored by John Berger in ‘Two Dreams’, The White Bird, New York, 1985.
See Malcolm Gladwell, Blink: The Power of Thinking Without Thinking, New York, 2005; but more importantly Steven Pinker, The Language Instinct, New York, 1994.
These lines are taken from the DVD of The Return of the King, New Line Productions, 2004. In the third volume of The Lord of the Rings they are attributed to Faramir. Various biographies claim this was a childhood dream of Tolkien’s.
In other religions this trend is not so clear. There are apocalyptic elements in some traditions of Hindu fundamentalism, especially around Shiva and his incarnations. Sri Lankan Buddhism is reported to have developed militant strands thanks to the recent civil war, and the Aum Shinrikyo sect, which launched the sarin attack on the Tokyo metro system in 1995, combined Buddhist elements with a quasi-scientific belief in an imminent catastrophe.
 In English the original meaning of ‘doom’ and its connection to ‘deeming’ - i.e. judging – has been lost.
 Mikhail Bakunin, ‘The Reaction in Germany’, 1842
 For examples see Norman Cohn The Pursuit of the Millennium, Oxford, 1957; Christopher Hill, The World Turned Upside Down, London, 1972.
 “Last month, a survey by the market research bureau of Ireland found 87% of the population believe in God. Rather than rocking their faith, 19% said tragedies such as the Asian tsunami, which killed 300,000 people, bolstered their belief.” Ian Sample, ‘Tests of Faith’, The Guardian, February 24th, 2005.
 Translated by Alan C.M. Ross, Beacon paperback, Boston, 1968.
First defined by Dionysius Longinus in Peri Hupsous, On the Sublime (1st century AD).
Immanuel Kant, Critique of Judgement (1790), trans. J. H. Bernard, 2nd ed., London, 1931.
Jean Baudrillard, La Transparence du Mal: Essai sur les phénomènes extrêmes, Paris, 1990; Francis Fukuyama, The End of History and the Last Man, New York, 1992.
John Berger, ‘Keeping a Rendezvous’, reprinted in The Guardian, Thursday, March 22, 1990.
“In 1998 a Harris poll found that 66 percent even of non-Christian Americans believed in miracles and 47 percent of them accredited the Virgin Birth; the figures for all Americans are 86 percent and 83 percent respectively. According to a 1999 Newsweek poll, 40 percent of all Americans (71 percent of Evangelical Protestants) believe that the world will end in a battle at Armageddon between Jesus and the Antichrist.” Tony Judt, ‘Anti-Americans Abroad’, New York Review of Books, Volume 50, Number 7, May 1, 2003.
 Bob Woodward, Plan of Attack, New York, 2004
For an example of the fear see Bill Moyers, ‘Welcome to Doomsday’, New York Review of Books, Volume 52, Number 5, March 24, 2005; Tim LaHaye interviewed by Morley Safer on 60 Minutes II, CBS News, February 8th 2004.
 See Gore Vidal, ‘Armageddon?’ Essays 1983-87, London, 1988
Ronald Asmus, Philip P. Everts, and Pierangelo Isernia, ‘Power, War, and Public Opinion: Looking behind the Transatlantic Divide’, Policy Review, Number 123, February–March 2004, Hoover Institution.
“The percentage of Americans who believe that success is determined by forces outside their control has fallen from 41 percent in 1988 to 32 percent today; by contrast, the percentage of Germans who believe it has risen from 59 percent in 1991 to 68 percent today.” From Right Nation, John Micklethwait and Adrian Wooldridge, New York, 2004.
Neither a philosopher, critic nor scholar, somehow Walter Benjamin (born 15 July 1892) succeeded in being all three at once. His friend Bertolt Brecht called his suicide in 1940 German literature's first great Nazi casualty. Like Brecht, Benjamin was a Marxist, but his writing emerges refreshingly free from dogma. Of all his works, the essay "The Work of Art in the Age of Mechanical Reproduction" (1936) has been the most influential and prescient.
1. It has entered our lives by stealth. Should Walter Benjamin pop over to your house to celebrate his 100th birthday, you might have difficulty furnishing him with an example of new digital technology. Perhaps you’d point to a video recorder or CD player. He wouldn’t find them totally unfamiliar. In form and function, they seem a natural extension of a 1930s film reel or record collection. More compressed, more attuned, more complex perhaps; but mechanical reproduction all the same.
Under closer scrutiny, however, digital technology is not just more of the same. From photography to sound sampling, book publishing to video editing, fed down phone lines or bounced off satellite dishes, the digital domain translates sound, picture and text into the same binary code. It signals a new kind of production, as well as reproduction.
2. Back in 1936, Benjamin’s essay sketched out a critique of the emerging mass media. Today, multimedia is used to describe the results of digitalisation. Because aural, visual and textual images can now share the same format, different systems can intercommunicate. Moreover, because the information has been encoded, it can be interpreted in new and surprising ways.
An example of this was the release last month of the entire 22 volumes of the Oxford English Dictionary on one CD.
Not only is this a remarkable feat of compression: you no longer have to scan the dictionary, serially, by alphabetical head word alone. In a few seconds, a standard desktop computer can submit the data to all manner of searches – by date, name, quotation, etymological origin – that would have taken a researcher years of work.
Interactivity is the key development. A glimpse of what this means was given by the launch of Philips CDI, or Interactive Television, earlier this year. Looking like a CD player, it sits under the TV like a video recorder. In fact, CDI is a powerful microcomputer that can grab and manipulate photographs or sound samples, recolour a cartoon, or play one of the projected (but as yet unrealised) interactive films.
3. However, multimedia does not herald a new era. It is best described as a domain. Everything about it confounds the linear concepts of eras, epochs and revolutions.
The format of a CD, for example, unlike vinyl, allows you to skip between tracks, scan forward and back through different sectors. In a similar fashion, digital technology spreads evenly through our lives, sporadically, by incremental shifts. Already, the microchips in an average household television set could talk to the cooker, the sound system, or control the heating. But that does not mean that your home will be run by a massive brain, like Hal the computer who runs amok in Kubrick's film 2001.
The idea of Hal erroneously projects information technology through the prism of an old economic structure. As the growth of personal computers over the past decade has proved, the industry itself proceeds through wider dispersion rather than agglomeration. Its standard model is not the central brain or robot, but the loose, open architecture of the network.
4. Unlike most of the critics and philosophers of his day, Benjamin refused to revile mass production, and looked for its liberating democratic potential. To him, the new media eroded the possessive fetishism of the art object, and made the work of art, its human significance, more accessible.
The avatars of multimedia would be quick to claim Walter Benjamin for their side. They argue that the digital domain will break the monopolistic stranglehold of the mass media, which separates consumer from producer and makes viewers into passive receivers. Interactivity, they say, reverses the circuit. It requires its audience to participate in choices and makes every receiver potentially a transmitter.
5. But Benjamin would also be met by other siren voices. Many of his professed disciples (Hans Magnus Enzensberger, John Berger, Guy Debord, Jean Baudrillard) saw the new technology as an extension of the screen, the spectacle, the simulacra: a further step in the industrialisation of the mind.
Only a fool would accept a fax of a cheque, but multimedia poses similar dilemmas of value and authenticity. Old, mechanically reproduced images at least bear some trace impression of their original. One trusted the token value of a photo of Dietrich or a recording of Gigli, rather as one believed a paper bank note was equivalent to a pound of sterling silver.
The digital image's relation to its original is more like that of a credit card to precious metal: that is, virtually untenable. We have become accustomed to the fact that a new record release no longer reproduces a musician's performance so much as a producer's mix in a multitrack studio. But can we apply the same acceptance to photography? Digital cameras are already on the market, and most news photos are now processed through computer software before publication, where they can be manipulated, figures "airbrushed" out, filtered, treated.
The prospect might have disturbed Benjamin. The digital domain seems to yield only images of images. It seems to take us a further remove from reality: deceiving shadows dancing on the walls of our crystal caves.
6. The point is, however, that the camera never told the truth. It lied by framing, cropping, choice of caption, by sins of commission or omission. There is no point mourning the decline of the apparent scientific objectivity of the old media. As facts, they were highly factitious.
With the virtue of hindsight, Benjamin would see that mass production did not dispel the bogus religiosity of the work of art. It just replaced the aura of the object with a secondary glamour of the image. Throughout my childhood, film and pop stars attained a god-like status. To see a TV personality in the flesh was as miraculous as meeting an angel.
My two-year-old son, by contrast (thanks to his grandparents' digital camcorder) has already seen himself on TV many times. Perhaps multimedia may yet help to demystify our culture. I doubt the medium will have the hold on his generation it did on mine.
More dramatic still have been the changes in publishing. Not long ago, the honour of being typeset, of "seeing your name in print", conferred real authority. Today, with the relative accessibility of desktop publishing, anyone can produce near-typeset quality. If he could look over my shoulder as I write this on my PC now, Benjamin might not be so worried after all.
7. Clearly, the digital domain represents an important new form of production. But what about the other half of the title? Can multi media really create their own unique works of art?
8. No, they can't! Multimedia is merely the convergence of existing art forms. The digital domain can only reprocess the products of other genres – publishing, music, film – and reissue them at greater cost.
9. Yes, they can! These new art forms may be stirring already in software publishing, in computer games or virtual reality. As Benjamin noted elsewhere, most new art forms develop, unrecognised, on the unofficial margins of culture.
10. According to Niko Paape, a Dutch interactive designer, the new forms will be almost unrecognisable anyway. For the first time, artworks will be actively created by the spectator (within parameters set by the artist). An interactive picture, for example, may change colour or composition, depending on what clothes the spectator wears, how loud he or she talks.
Clearly this takes the notion that "all art is collaboration" through an unexpected twist. If the spectator creates the artwork, who owns the copyright? Such problems of authorship are, of course, only the obsession of an acquisitive individualist culture.
11. Yet why should the new art forms be so radical and progressive? Personal computing promised us a paperless office, but actually increased the amount of paper printed. Paradoxically, given the constant fears about decline of reading (and rise of a three-minute, imagistic culture), multimedia may permit the secret revenge of literacy.
To Benjamin, the great achievement of the era of mechanical reproduction was film, a popular art that spoke in a new demotic language to the masses gathered in the cities.
Compare the gregariousness of the cinema with the solitude of a VDU or interactive TV, and multimedia can seem like a step backwards. A brief look at the metaphors of the dominant software packages – windows, rooms, desktops – suggests romantic dreaminess and solipsism.
The irony is that digital technology, with its hieroglyphics and clerkist conditions, is overwhelmingly writerly.
12. Yet the digital image has one great flaw, which casts a shadow over its artistic potential. In the first years of the compact disc, many people complained that the aural quality was too bright, almost empty. Even today, I rather like my old vinyl Gigli recording, complete with scratches, pops and hisses. It reminds me that this Italian tenor is singing across great gulfs of years, from the other side of the grave.
Celluloid film – particularly black and white – has the same memorial quality. When you see Casablanca now, you feel nostalgic for those absent faces, that vanished world. Indeed, the elegiac quality of "As Time Goes By" was apparent on the day of the film's release. Film, with its shadowy, ghostly imagery, closely mimics the tricks of memory. Video tape, by contrast, makes no accommodation with our perception of time. When an episode of Morecambe and Wise is repeated there is something almost obscene about the fact that Eric seems so bright and alive, or that Glenda Jackson looks so young. No matter how vivid the image, it seems to flicker with unreality.
The banal glare of video recordings leads people to assume they are of poorer quality than film. But the opposite is true. Video images are technically perfect. As with most digital forms, errors and distortions would destroy the signal entirely. By virtue of their obsessive perfection, digital images remove the need for imaginative engagement, and deny the central human experience of mortality.
13. Meanwhile, the digital domain marches on. It has been estimated that some 40 per cent of the US economy is now geared to the "soft" production of information and entertainment. At this rate, the old Marxist distinction between base and superstructure, means of production and culture, will fall apart. Ideas and images are becoming the mainstay of the post-industrial economy. What would Benjamin make of this inversion? His resounding exhortation at the conclusion of "The Work of Art in the Age of Mechanical Reproduction" was not to aestheticise politics, but to politicise aesthetics.
But hasn't this been largely achieved? Semiotically trained image-consultants advise politicians and corporations. Foucault's dictum "Knowledge is Power" is stencilled on delegates' badges at hairdressing conferences.
14. Yet some of the key ideas Benjamin gleaned from Marx may not be entirely obsolete. In the digital domain, the victors are those who can extract significance from the flood of data. To rephrase a famous line from the Communist Manifesto: the struggle of existing society is the struggle of the classes to become historical (i.e. significant). For the unrepresented and unheard, the constant threat is that they will be drowned out by a rising tide of banality.
Truly politicised aesthetics, in the digital domain, will not be a question of bland slogans and political correctness. It will be a struggle, fought with wit, passion, surprise and guile, to grasp deep meanings without being obscure, to create accessible but captivating images. It will be the labour that Walter Benjamin embodied in his work, to make, against all odds, a difference.
Once, the Andalusian ‘Flamenco singer’ Pastora Pavon, La Niña de Los Peines, sombre Spanish genius, equal in power of fancy to Goya or Rafael el Gallo, was singing in a little tavern in Cadiz. She played with her voice of shadows, with her voice of beaten tin, with her mossy voice, she tangled it in her hair, or soaked it in manzanilla or abandoned it to dark distant briars. But, there was nothing there: it was useless. The audience remained silent.
In the room was Ignacio Espeleta, handsome as a Roman tortoise, who was once asked: ‘Why don’t you work?’ and who replied with a smile worthy of Argantonius: ‘How should I work, if I’m from Cadiz?’
In the room was Elvira, fiery aristocrat, whore from Seville, descended in line from Soledad Vargos, who in ’30 didn’t wish to marry with a Rothschild, because he wasn’t her equal in blood. In the room were the Floridas, whom people think are butchers, but who in reality are millennial priests who still sacrifice bulls to Geryon, and in the corner was that formidable breeder of bulls, Don Pablo Murube, with the look of a Cretan mask. Pastora Pavon finished her song in silence. Only, a little man, one of those dancing midgets who leap up suddenly from behind brandy bottles, sarcastically, in a very soft voice, said: ‘Viva, Paris!’ as if to say: ‘Here ability is not important, nor technique, nor skill. What matters here is something other.’
Then La Niña de Los Peines got up like a madwoman, trembling like a medieval mourner, and drank, in one gulp, a huge glass of fiery spirits, and began to sing with a scorched throat, without voice, breath, colour, but…with duende. She managed to tear down the scaffolding of the song, but allow through a furious, burning duende, friend to those winds heavy with sand, that make listeners tear at their clothes with the same rhythm as the Negroes of the Antilles in their rite, huddled before the statue of Santa Bárbara.
La Niña de Los Peines had to tear apart her voice, because she knew experts were listening, who demanded not form but the marrow of form, pure music with a body lean enough to float on air. She had to rob herself of skill and safety: that is to say, banish her Muse, and be helpless, so her duende might come, and deign to struggle with her at close quarters...
Lorca translations on this site
The critically acclaimed US television drama could not be made here. We have writing talent in abundance, but its output is controlled by a stifling monopoly—the BBC. Plus, an interview with The Wire's creator David Simon
It’s been a slow-burning fuse. From its first broadcast on the US pay-TV channel HBO in 2002, it took seven years for The Wire to accumulate widespread critical recognition in Britain. And it has grown into something bigger than an artistic success. Like a great Victorian novel, David Simon’s epic portrait of the policing, crime and politics of post-industrial Baltimore is now cited by politicians and leader writers. But the success of this show and a raft of other imports such as The West Wing and Mad Men raises a question about the state of one of our key cultural industries. How come US television drama has captured the high end of the market and we have abandoned it?
I’ve spent much of the last year on the front line of one of the most contentious presidential nomination contests in memory—without moving from my London desk. I have been part of something historic: the first great political battle to take place in cyberspace.
For many in Britain, blogging, especially political blogging, is a bit of a disappointment. Many of our political sites are tacked on to party websites, or are simply online versions of established media outlets. They tend to be either controlled, conformist and rather dull, or unmoderated rants, the kind of online graffiti rightly parodied by Private Eye.
The US offers a glimpse of something different—how the internet can transform news and opinion. It is ten years since the Drudge Report broke the Lewinsky scandal. These days, American sites like Talking Points Memo, Politico and (as Andrew Keen described in the August issue of Prospect) the Huffington Post regularly scoop the conventional media by hours, or even days.
There’s a nasty bug doing the rounds. Like a computer virus it occupies apparently innocuous spaces, then starts replicating itself at amazing speed, spawning logical contradictions that eventually bring the system shuddering to a halt. Fortunately, the symptoms are easy to spot. If words like ‘seduction’, ‘simulation’, ‘decentred individual’ and ‘posthumanism’ randomly flash across the page, you know you’ve found the bug of postmodernism.
Neither Mark Dery’s Escape Velocity nor Sherry Turkle’s Life on the Screen is free from this virulent force. Both claim to explore the cultural impact of the ‘digital revolution’, and yet rely on the prescriptions of Baudrillard, Jameson and Lacan – theorists whose main contributions were relevant to the mass media of ten or twenty years ago, rather than the multimedia industry today.
It may mark the end of culture as we know it. The sideshows are raucous and sensational. They revolve around puerile fantasies of sex and aggression, horror and sleaze. They are also highly addictive. Something in their flickering imagery mesmerises the young (in some cases inducing fits) resulting in short attention spans, truancy, and an explosion of juvenile crime.
Initial comparisons are promising. Like early film, interactive games have become an unpredicted commercial success. In a few years, Sega and Nintendo have amassed a four to five billion dollar annual turnover, capturing some two-thirds of the recorded music market, a quarter of the entertainments industry as a whole. Just as cinema emerged from the protozoic soup of 19th-century technology – Fantascopes, Zoopraxiscopes, Kinetoscopes, Zoetropes, Vitascopes – today's interactive media betray a similar proliferation of proprietorial formats. In the last year, in the compact disc market alone, Sega has launched the Mega-CD, Amiga the CD-TV, Philips the CD-I. In a market teeming with life, natural selection is already in progress. Some of these mutations will be unviable: others might only last a few years. But among them, crawling out of the swamp, might be the prototype of a new medium destined to colonise the earth.
One proof will be the fear and loathing the new arrival attracts. Writing about film in The Work of Art in the Age of Mechanical Reproduction, Walter Benjamin noted how new art forms initially appear brash and monstrous. They have, by necessity, to shake traditions, offend older sensibilities. Video games certainly fulfil this brief. Witness the uproar that has attended the proposed release of Night Trap for the Sega Mega-CD. The game is billed as an 'interactive movie' in which – among other things – you have to rescue some semi-naked women from a lurking alien threat. Night Trap was one of the first computer games to be referred to the British Board of Film Censors (where it was awarded a 15 certificate). By video standards, the content was mild. But the disproportionate outrage just adds to the medium's credentials. As Malcolm McLaren has pointed out: if you want to rebel against your parents, the last thing you'll play is loud rock and roll. If you really want to get them worried, go and play a computer game.
Something strange has happened to the great microchip revolution. Cheap computer processing was supposed to bring us smart TVs and digital organisers, put the sophistication of a graphics workstation, recording studio and typesetting shop within everyone's reach. Through the home computer we were going to be connected to global networks of data, ushered into the era of information. Instead, rather than information, the bulk of home computer processing is dedicated to the production of disinformation: of simulations and games, Sonic the Hedgehog and Super Mario.
Maybe it's not so strange. Some marketing manager coined 'infotainment' to describe the paradox of information and play and, at heart, cinema also displays the same ambivalence. In the 1870s Étienne Marey and Eadweard Muybridge developed the cine camera to help their investigations into animal locomotion. Louis Lumière thought of cinema as a tool for scientific research, a way of recording and analysing events, not unlike a computer. It took a showman and conjurer to turn it into a vehicle for mass appeal.
The showman was Georges Méliès (an exhibition of whose career is still running at the Museum of the Moving Image). The Frenchman was the first fully to explore the fantastic possibilities of film: his favourite genres were fairytales, burlesque and science fiction. Méliès mastered endless unique special effects for making women turn into mermaids, heads to fall off or inflate, bodies dismember themselves and individual limbs go dancing about the screen. He loved the irrational side of film, the erotic, the comic, the macabre.
Cinema partly derives from photo realism, from the documentary impulse to describe the world, but Méliès represents the other equally important tradition: stage magic, deception, trick photography. Despite his kitsch and playfulness, Méliès probably advanced the genre further than any other single figure. Indeed, by pursuing the lowest possible audience taste, he was testing the form to the highest technical limits. After all, what better proof of the verisimilitude of the moving image than that it convey a sexual charge? And what more graphic demonstration of its capabilities than showing a man removing his head five times, and placing each singing head on a bar of five telegraph wires like so many musical notes?
In the history of cinema, play has often been the mother of invention. For the new digital media, this has proved equally true. In the 1970s, in the Californian town of Palo Alto (where Muybridge studied animal locomotion a century before) Rank Xerox set up a research centre. Its aim was to develop the next generation of computers and, according to Larry Tesler, one of the core engineers, they soon turned to computer games for inspiration. Millions who found information technology complex and intimidating were less circumspect when it came to playing Space Invaders or Missile Command. If it appeared as a pacman or asteroid, most were quite happy manipulating electronic data on a VDU. It was this insight that led to the development of the 'graphical user interface', the use of windows, icons, and desktop metaphors to simplify tasks. This interface, in turn, led directly to the launch of the Apple Macintosh and Microsoft Windows and the beginning of the end of IBM. By imitating games personal computing became, through the 1980s, the fastest growing industry ever.
We are still living through the golden age of computing. Most of the languages, algorithms, and metaphors created today will provide the foundation for the next hundred years. Whether it's a game to save Lemmings from extinction, a beautifully tailored personal organiser, or a screen saver showing flying toasters, current software is a source of much ingenuity, wit and invention. In a few centuries' time the shoot-em-up arcade games that so worry commentators will probably be in display cases at the Design Museum or the V&A. Now that the office market is saturated, corporations are looking to exploit the home market and once again computer games are leading the way.
Yet, as with the early years of film, official culture refuses to recognise its crass, nerdish newcomer. The higher arts look down on their pixelated relatives with dismay and disdain. Both film and computer software come from the boiler room of culture, designed by pioneers with a technical or manufacturing background rather than a training in the high arts. This means that achievements tend to go unrecognised until too late, just as the critics began to praise the era of silent movies the moment the talkies came in.
All of which makes predictions generically difficult. Indeed, this could be the moment the analogy between cinema and the digital domain ends. Unlike the mimetic traditions of photography and film, a new interactive art form might follow an entirely different logic: the logic of simulation. Current computer games such as A-Train or Sim City are not really representations of the world but software models, virtual machines sustained by their own mathematical engines. Alternatively, film and computer technology could be converging. Francis Ford Coppola – who used much of the new technology in his underrated Dracula – has predicted that it will open film-making to ordinary people. He expresses the hope that one day soon a film masterpiece may be made at home in her room by 'some fat girl in Wisconsin'. In this light digital media could fulfil the promise of cinema, making the manipulation of images and the creation of a new visual language more accessible and seamless.
According to a Russian maxim, the Fox knows many little things but the Hedgehog knows one big thing. But what is the big thing that Sonic the Hedgehog knows? What else waits to crawl out of the digital swamp?
For the moment he is silent.
Published in New Statesman, July 1993