Spiked Online

Our Inner Spidey

Sequels are not destined to be rubbish – they aren't subject to the apparently inexorable laws making most remakes terrible. After 2 Fast 2 Furious and all that pap last year, Spider-Man 2 – like Shrek 2 the week before – has wowed many observers. Not only does it appeal to the 'kidult' demographic by packing in parents, their kids and independent teenagers, it also delivers by entertaining them all intelligently, without resorting to grafting adult gags on to the kids' action scenes (anyone for The Cat in the Hat?).

Spider-Man 2's other great marketing trick is to reinvent a version of an existing character while keeping the original fans on board. There are sufficient regular readers ('fanboys') to support several monthly Spider-Man comics, but not necessarily enough to make a multimillion-dollar feature film (or film franchise) profitable. Television shows can limp around in a DVD and convention afterlife by drawing on the fanbase. But once you start cost-cutting on comic-book adaptations they look awful. (Anyone fancy watching Superman IV: The Quest for Peace tonight? Didn't think so.)

The rise of computer-generated imagery (CGI) has meant that many superhero projects discussed in the 1970s and 80s are now more achievable, but bad CGI is no better than cheap special effects. And even without cost issues to consider, nothing can help a chronic script (step forward, Batman and Robin).

Taking the long view, it's apparent that superhero movies, despite the occasional blockbuster, have been essentially a minority sport. The late '70s/early '80s success of the first three Superman films did little to change the fact that, for mainstream audiences, the figure of the superhero remained tainted by the camp Batman TV show and its big-screen spin-off. The Incredible Hulk and, to a lesser extent, Spider-Man could thrive on 1970s TV provided they faced off against everyday criminals (and ninjas) and avoided wheeling on the costumed villains.

But elsewhere, characters from Marvel and DC comics seldom ventured off the four-colour page, except into cartoons. As live action it was all too silly after the age of seven, especially Batman. In his essay "Batman, Deviance and Camp," cultural critic Andy Medhurst recalls "pure pleasure, except for the annoying fact that my parents didn't seem to appreciate the thrills on offer. Worse than that, they actually laughed. How could anyone laugh when the Dynamic Duo were about to be turned into Frostie Freezies?"

If Medhurst later came to "share that once infuriating parental hilarity" (at least until he wrote the essay that partially rehabilitated the character), then so too did mass audiences most of the time. Charting the discussion of comic characters as entertainment in the 15 to 20 years since I was an intern at the Final Frontier comic shop, it's striking just how defensive the industry has been. Contriving the term "graphic novel" as a substitute for "comic," rather than claiming the form itself as art or literature, and distancing a successful movie like Blade from its comic-book origins – it all adds up to a part of the entertainment industry being in denial. Meanwhile, the fanboys themselves acquired a reputation for poor personal hygiene and needing to get out more, embodied in "Comic Book Guy" off The Simpsons.

In defense of the comic, it is worth pointing to its ability – not unlike other forms of entertainment – to connect with wider social changes. Indeed, some overestimate the influence of comics to make a case for a red thread of immigrant radicalism in mass entertainment. While Superman evolved into a patriotic symbol, his fantastic origins owed much to immigrant fantasies of transcendence. Kill Bill Vol 2 sees Bill (David Carradine) remind us that Clark Kent is only Kal-El's disguise, but for creators Jerry Siegel and Joe Shuster, Superman was a steely alternative to being a bespectacled mensch.

DC Comics was in turn overtaken by Marvel Comics in the 1960s, precisely because the Marvel characters had flawed private lives echoing those of their college freshman readers, which later on connected to the nascent counterculture. Marvel stories were set in Manhattan rather than "Gotham City." Apart from a classic run on Green Lantern, DC had no reply to the "realism" of X-Men, Fantastic Four and Peter Parker as "old webhead himself."

The Spider-Man comic appealed to its fanbase almost as consistently as its hyphen was used inconsistently. Spider-Man director Sam Raimi plays to his own fanbase too, hence cameos from Bruce Campbell, his brother Ted Raimi and the car used in his feature debut The Evil Dead. (This has also got my fellow video-nasty buffs gloating, a quarter century after Raimi's "Ultimate Experience in Gruelling Horror" was banned in Britain.)

And sure enough Spider-Man 2 is a treat, with Parker struggling with the decisions made at the end of the last movie, and Alfred Molina putting in a gothic performance as a humane scientist transformed by sinister AI nanobots, rather than simply driven mad. (Occasionally his "Doctor Octopus" arms move like Rod Hull's Emu, but that's still pretty sinister.)

A number of commentators have scrabbled around for answers as to why we like our heroes flawed these days. "When did men in tights discover the courage to bare their soul?" asks Sean Macaulay in The Times (London). One answer is that since comic characters are empty vessels, despite the layers of mythology built up behind them over the years (origins, alternate realities, parallel universes, crossovers – ask a fanboy), they can play more or less any role we want them to.

Marvel's 1940s Nazi, the Red Skull, resurfaced as a communist in the 1960s, while his opponent Captain America spent the late 1980s espousing a moderate form of patriotism. DC's Dark Knight Returns version of Batman – instrumental to Tim Burton's 1989 film adaptation of the character – battled an urban underclass, while Superman became a CIA stooge in Frank Miller's comic, and via a handsome romantic lead on prime-time TV opposite Teri Hatcher, he is now a confused teen on Smallville.

Although he has been troubled since his first appearance in Amazing Fantasy #15 (1962), Peter Parker is also a figure for our times. Thus his superpowers derive from a genetically modified spider as opposed to the comic's radioactive version. And who better to inspire kids in a counselling culture than a hero who himself needs counselling (and gets his "dreams" of being Spider-Man analysed in the course of the movie)?

After all, many of the more conventional fictional heroes have lost their luster. When was the last time you saw a cop or espionage thriller where the protagonist wasn't betrayed by his superiors? Anti-terrorist movies might go in for ethnic stereotypes from time to time, but generally the first set of villains turns out to be a front, wheels within wheels backed up by US agency chiefs (see 24 and Alias for endless double-crosses, betrayals and double double agents).

In contrast, the superhero seems untainted compared with the "real heroes" of 1980s cinema. Throw in enough CGI to make things look decent, build on the "pre-sold" characters under Marvel and DC copyright (including merchandising opportunities), keep the fanboys away from the test screenings, and you have a formula for success. It is fitting that Mary Jane Watson (Kirsten Dunst) ends up with the troubled nerd rather than astronaut John Jameson (Daniel Gillies), given that the former is a hero for our times.

Spiritless Olympics

The Olympic spirit is sorely missing from the debate about Greece's preparations for the 2004 Olympic games, due to be held from 13 to 29 August.

The public aim of the games is to promote individual heroism and solidarity between countries, yet the world seems to be almost willing Athens 2004 to be a disaster. The Greeks are having to field predictions of doom, from fears about terrorist attacks to worries that the facilities won't be ready on time. 'An artificial climate of concern is being created', complained Greece's culture minister, Evangelos Venizelos.

Concerns that the 2004 Olympics could be a terrorist target have been floated for a while. Greece has spent a record £430 million on security, and more than 50,000 security personnel, including 16,000 soldiers, will guard both the country's borders and Athens. Greek and American troops ran a three-week exercise, codenamed Shield of Hercules 2004, responding to 'catastrophic scenarios' such as dirty-bomb attacks and hijackings. According to one report, Piraeus, the port where visitors will be accommodated in cruise liners, will 'resemble a fortress' -- with gun-toting guards and barbed wire barriers. Athens will be a no-fly zone for the games, and US troops will be stationed offshore to respond to emergencies.

Yet the more Greece prepares, the more everybody else frets. Newspaper journalists probe the country, pointing out all its weak spots. One reporter said that she managed to walk into Athens' building sites unobstructed, and noted security experts' concerns that these sites could be 'scouted by terrorists' and 'a device could be planted amid the chaos'. Others detailed all the weaknesses in Greece's 'porous borders', which meet Macedonia and Albania in the north, and form a windy, island-strewn coastline in the south.

Athletes fear for their own safety, with US tennis star Serena Williams saying back in March that the threat of terrorism might keep her at home. 'My security and my safety and my life are a little bit more important than tennis', she said. The US government has warned its athletes not to wear their national colors outside the confines of the Olympic village, because this might make them targets. Meanwhile, UK counterterrorism officials are apparently so concerned about the standards of Greek security that they are considering dispatching their own marksmen to protect British athletes.

All this talk is not helping matters. Given that the aim of terrorism is to create panic, if terrorists know that the whole world will be watching and holding its breath there are more incentives to strike. And bands of international marksmen wandering around Athens with their guns cocked would be a recipe for disaster. 'Any presence of foreign armed security guards for the athletes' protection is strictly prohibited', said a spokesman for the Greek Ministry of Public Order. 'There are 202 countries participating in the Games, so you can imagine what would happen if we had armed security forces for every country.'

Another widespread concern is that the site won't be ready in time. The Times (London) flew out a British structural engineer to examine the Greeks' preparations. 'It is going to be tight', Norman Train said gravely, 'they should have done a third by this stage but less than a quarter is done.... I can't see how they will finish this on time'. There was a bit of tut-tutting at the Greeks' lackadaisical attitude: 'Despite the urgency of the situation, there was little building activity at any of the sites visited', reported the article.

Competitors are concerned that inadequate facilities will affect their performance. Plans to build a roof over the swimming pool complex have been shelved for lack of time, so swimmers will have to cover themselves in sun cream. A spokesman for British Swimming said: 'It is asking a lot of them to perform in these conditions -- it could seriously upset their performance.' The International Olympic Committee (IOC) is now attempting to get insurance to cover the possibility of cancellation, insisting not very convincingly that this didn't indicate a lack of faith in Greece. Understandably, Greece is getting a little tired of all this worrying. A senior official at Athens 2004 said: 'You can tell that we like to do things at the last moment, but we will make it. Foreign countries like England and America should give us a break. We knew from the beginning that we would have to use the whole time.'

The Olympics have traditionally been a focus for both cooperation and conflict. In ancient Greece, wars would be suspended for the Olympics -- the different sides would put down their arms and compete on the field, challenging each other to contests of strength and skill. The modern Olympics, which began in 1896, involved an element of mutual cooperation and respect, but also often reflected the political rivalries and conflicts of the times -- most notably, during the Cold War.

At today's Olympics, we are seeing neither international cooperation nor staunch political rivalries. Instead, we are witnessing a mood that couldn't be less 'Olympian' -- the small-minded, petty worries about safety, with each country demanding their own security guards, coaches moaning that their swimmers will be put off by the sunshine, and one of the top tennis players in the world pondering whether her safety is worth more than her sport. All this fretting threatens to turn the 'greatest show on Earth' into a logistical and public order problem. Instead of admiring the feats of the athletes, it seems we'll just be hoping that the buildings are ready and everybody gets through the games alive. Merely surviving will be seen as a 'success'.

Luckily, there are a few modern-day Olympians who are just getting on with things. UK marathon runner Paula Radcliffe went out early to inspect the course, which wasn't finished. But Radcliffe was unfazed: 'It would be nice to have the new course finished but there is a course there now. Right now, I am just focusing on training and being ready for whatever course I have to run on.' If more people had her attitude, Athens 2004 would stand a far greater chance of success.

Josie Appleton writes for Spiked Online.

Take This Job

In the opening scene of The Office, David Brent interviews an applicant for a forklift driving job while telling porkies to his warehouse manager on the phone. This echoes an incident from Ricky Gervais's own life when, as a school-leaver, he attended a recruitment agency. The interviewer phoned a 'friend' and proceeded to lie his head off in order to get Ricky a job. As he fibbed away, he winked and made 'Pinocchio nose' gestures with his thumb and forefinger.

"This guy was supposed to be his mate!" Gervais recalls. "And I was meant to be impressed by this man lying to a friend! It's the arrogance of people who meet someone and want to take the short cut. They go: 'I'll be honest -- you're going to like me because everyone else does, so let's cut the bullshit and start liking me now.'"

Once upon a time, Ricky Gervais was just a short bloke with a Reading accent. Then, a few years ago, he became the short bloke from The 11 O'Clock Show. Now he's the short bloke with a few million in the bank, a shelf full of Baftas and Golden Globes and a reputation as a comic genius.

But the hype about this cult comedy series tends to obscure the reason why it has touched such a nerve -- the grim reality of office life. I talked to Ricky Gervais back before The Office reached the giddy heights of its current fame -- and his memories of the real world of office work were all too fresh.

As he recounts his tale of the recruitment agency, Gervais switches seamlessly into Brent mode, clicking his fingers, drumming on the desk, picking up the phone and putting it down again -- a bag of futile nervous energy. This effortless transformation poses the obvious question: how much of David Brent is there in Ricky Gervais?

"Well," he replies, snapping out of character, "he's that part of all of us that wants to be liked and thought of as a good bloke, and also wants desperately to win at Trivial Pursuit. Most of us manage to keep it under control, though." Yet, Gervais can't bring himself to condemn his alter ego completely. "David Brent's not a bad bloke, but he's full of contradictions," he says. "Part of him regrets missing out on the eighties and thinks he could have had his own business and owned a Maserati by now. But he also wants this Californian thing -- a big commune. He doesn't know if he wants to be head of the mafia or a guru. He wants to win, but if he was a bit more honest and vulnerable, and occasionally admitted he was having a bad day, people would like him more."

If David Brent had a coat of arms, its legend would probably be: You don't have to be mad to work here, but it helps. Indeed, in his hands, this well-worn slogan becomes a full-blown management philosophy. The result is a tyranny of forced hilarity, and resistance is futile. Brent's partner in comedy crime is Chris Finch, an obnoxious and slightly menacing sales rep from up north. Between the two of them, they keep up a relentless barrage of badinage. No opportunity for zaniness is knowingly passed up, to the exasperation of the long-suffering staff. Needless to say, Brent has no inkling of how others see him: "He's got such a massive blind spot, he just wouldn't believe someone was slagging him off. If you got fed up with him and threw him in the fountain, he'd come back in and say: 'See what the lads did? They're mental! They love me here!'"

One reason the Brent character took off was because he is all too familiar, and the same goes for the setting. Gervais wanted something universally recognizable -- a million miles from the hot-desking, video-conferencing, feng-shuied fantasy office of the media's imagination. The location is telling too: the demographically important but anonymous and largely ignored Thames Valley. "Everyone thinks of England as London, Birmingham, Manchester and so on," Gervais agrees. "But most people live anywhere but. Anyway, it's the only accent I can do...."

The Office, in other words, is a fairly typical picture of working life as it is lived by a very large number of people. "Jobs, not careers," is how Gervais puts it. 'White collar' may once have meant respectability and 'prospects,' but now it just means a desk or a headset. A call centre in Newcastle is not much different to a credit card centre in Brighton: The product is irrelevant and the regional differences are fast disappearing. One striking thing about the paper merchants in The Office is that nobody actually mentions paper, unless the subject is unavoidable. "Nobody cares!" laughs Gervais. "You never talk about the product, or how proud you are of your work, you talk about people: 'You'll never believe what that Pete Gibbons said to me last night -- idiot!' or 'So-and-so's got a pay rise....'"

Although offices are changing, they are still places where people of different types and ages are thrown together -- unlike pubs and other institutions, which are becoming increasingly segmented. And some features of office life are simply modern variations on an old theme. Yesterday's union rep is today's health and safety rep, or serial sponsored marathon-runner. The pipe-smoker with leather elbow-patches who used to do cryptic crosswords has been succeeded by the paunchy Eric Clapton fan who runs the quiz night. The spindly bloke with the sweaty white shirt and bicycle-clips has given way to the lycra-clad Greenpeace activist who smells of mildew. Smokers gossip in the street instead of in the coffee room. Xeroxed joke sheets are now emailed jpegs.

Even the open-plan office itself, that symbol of openness and team-building, has been divided up into personal territories, each customized with pictures, novelty mouse pads, trinkets and, of course, ersatz barriers like the wall of box files that is erected between desks in one episode of The Office. "I'm not an anthropologist or sociologist," admits Gervais, "but I don't think open plan offices are natural. I imagine that the first thing you do when you're thrown together with 30 people that you might not care for, is build a wall."

Gervais was drawn to the subject of offices by precisely this random -- and frankly misanthropic -- way in which they draw together people of different ages who have nothing in common. Yet, for all their personal quirks, there is something instantly recognizable about the dramatis personae of The Office. This is entirely intentional, as they were deliberately typecast. ("It was a case of: 'You know those type of people that....'")

So, David Brent was "the type who tells you he's popular instead of just getting on with being popular, and who you have to laugh with just because he's the boss." Tim is "the normal bloke, stable and genuinely a good laugh -- what most of us like to think we would be like." Gareth, the humorless assistant to the manager who insists on calling himself 'assistant manager,' is the obligatory Territorial Army soldier ("the type who really thinks he could survive after the bomb because he can skin a mouse"). And Finchy is the hard-drinking bore who inspires teeth-gritting hostility within seconds of his arrival.

Mustn't grumble. That's an order, by the way -- the great unwritten rule of office life. Criticize the job too much and, by implication, you criticize your colleagues as well. Complaint may be expressed only in coded form, via the strained jocularity of the novelty sign or the humorous email circular with its jokey complaints about the drink machine that's never fixed or the ever-diminishing lunch 'hour.' Timid, cringing, and, let's face it, pathetic.

There's nothing wrong with having a laugh, but this isn't really a laugh at all. "It's stifling, that feeling of claustrophobia and falseness," muses Gervais. "When they start teaching people how to enjoy life, there's something a little bit odd about that." And official fun invariably has a coercive element, as Brent's laugh-a-minute tyranny shows. It reminds everyone who's in charge. Jokes are barbed with references to lateness, or light-hearted threats of the sack. It can go further, with the management-led pillorying of those deemed not to be pulling their weight through 'Wally of the Week' boards, dunce's hats and 'humorous' forfeits.

Joking can be a serious matter. In one episode of The Office, a full-scale investigation is launched when a pornographic picture featuring David Brent is emailed around. No wonder he is careful to point out that his officially sanctioned material is kosher. ("Does my bum look big in this?" he chuckles, reading a cartoon on the wall: "It's OK, it's not sexist -- it's the bloke saying it.") Politically correct codes of conduct keep everyone walking on eggshells: anyone could take offence at anything, and someone usually does. And when they do, there's no sorting it out between yourselves: it's straight off to personnel for an official warning. Disciplinary sanctions cover everything from fighting to farting.

"I've done nine-to-fives and there's nothing wrong with them," Gervais says, "but you don't live to work: you live for the evenings and weekends, and have other ambitions." Those are the words of a man who can remember the days when people could work for the money and shoot off at five to get on with their lives. Those days are long gone. Today you are expected to buy into the job, and treat it as part of your lifestyle. David Brent, a man of his time, "would never admit that his job's rubbish." Dress-down days, works outings and charity events all help to promote an image of a relaxed regime, but they also help to blur the distinction between work time and your own. Today you must arrive early, leave late and take the job home with you.

People are a company's greatest asset, as Brent would no doubt say. That doesn't mean the company cares about its employees, though. When the bottom line is threatened, all talk of the 'team' goes out of the window. ("It's a very special boss who puts his head on the block for anyone.") People are assets that must be made to work for you. That means using their untapped talents to the max. It means coming up with ideas, making sacrifices and volunteering for extra duties. It means away-days and seminars, flipcharts and brainstorming sessions. "Nobody's got anything to say, but you get emails saying you must attend," says Gervais. "No I won't -- I'm going to the beach!"

In the end, are we all destined to let the bastards grind us down? Nobody starts their working life as an office 'type,' but most people end up as one. Is there a mysterious law at work, through which we grow into our allotted roles and become enmeshed in the petty politics of office life?

"Everyone picks up symptoms," declares Gervais. "It doesn't matter if you work for a paper merchant in Slough or ICI or NASA, if someone who hasn't been there as long as you gets a bigger desk, you're going to mention it." Look no further than Tim, who came for a week and stayed for 10 years. He ends up indulging in a magnificently petty feud with Gareth at the next desk. As Gervais says: "You can be in an office and someone won't lend you their stapler, and within two weeks, when they ask to borrow yours, you'll go: 'No, you wouldn't let me borrow yours.'"

'The Office' is shown Thursday nights at 10pm on BBC America.

Outsourcing the Occupation

"Our men train military personnel, train pilots to land airplanes, escort food and supply convoys, and provide protection for Paul Bremer, head of the Coalition Provisional Authority." Sound like an advertisement for the U.S. military? In fact, these are the words of a spokesman for Blackwater Security Consulting, a private security firm based in North Carolina, describing the role of his employees in the occupation of Iraq. It gives new meaning to the phrase "a private war."

Blackwater hit the headlines in late March 2004, when four Americans were killed, burned and dismembered in the turbulent town of Fallujah, west of Baghdad. Media reports described them as "civilian contractors" who were "protecting food shipments"; it later transpired that they were four of around 450 Blackwater employees on the ground in Iraq, who, along with other private firms, are playing a central role in the coalition's war effort.

The killings in Fallujah brought to the world's attention the mercenary phenomenon in Iraq. The coalition has contracted thousands of private military personnel from companies such as Blackwater, to do everything from feeding and housing coalition troops to maintaining key weapons systems, including M-1 tanks, Apache helicopters and B-2 stealth bombers, to providing armed protection to leading coalition officials. Such personnel are mostly ex-soldiers, from American, British, South African, European and other armies, who now make up to $1800 a day offering their skills and services to the private sector.

An estimated 15,000 to 20,000 of them are currently in Iraq -- and as one report points out, that is a greater number of men than provided by any other American ally, including Britain.

In some parts of Iraq, including the hostile Sunni triangle north and west of Baghdad, private military personnel have been at the forefront of the occupation. According to one report, they have become the most visible part of the occupation, often running high-risk operations that the American and British military would rather not do; and therefore, to some Iraqis, they have become "the most hated and humiliating aspect of the ... occupation."

Private personnel have borne the brunt of much of the Iraqi anger over the past two weeks. One report says "the ubiquity of heavily armed foreigners partly explains why so many people are being kidnapped in Iraq"; many of the "civilians" killed and kidnapped in hostile towns such as Fallujah in the west and Najaf in the south in the past 10 days have been private military personnel.

Why has the coalition contracted a private army of ex-soldiers, in the pay of profit-making security firms, to execute aspects of the occupation? Why is it increasingly relying on private guards, who are not subject to the coalition's chain of command and who can up and leave whenever they please?

Some claim the coalition is intensifying the occupation, creating a fearsome vanguard of toughs from around the globe; others that it is "privatizing the war" in order to make up numbers on the ground -- both the U.S. and British armies have shrunk over the past 10 years and are currently stretched, from Haiti to Afghanistan to the Philippines. Some on the American left argue that the rise of a private military complex shows the "jobs for the boys" mentality of the Bush administration, which is providing security contracts to "politically connected" firms.

These considerations may have impacted on the nature and number of private military personnel in Iraq. But fundamentally, the use of private firms in the war and occupation points to problems within the coalition. By outsourcing the occupation, the coalition is trying to distance itself from the political consequences of its actions, in effect creating a buffer between its war and what happens as a result.

The coalition seems to want to occupy Iraq without physically having to occupy it, concentrating its troops in Baghdad and on the outskirts of hostile towns and cities while pushing hired guns to execute risky tasks, including protecting America's main man on the ground. What the coalition may have gained in political distance, however, it has paid for in terms of logistical coherence on the ground.

According to Peter W Singer, a national security fellow at the Brookings Institution in Washington, the number of private military personnel in Iraq is "unprecedented," in both scale and scope. Singer tracked the development of the private military sphere over the past 10 years for his authoritative book "Corporate Warriors: The Rise of the Privatized Military," published earlier this year. He notes that private firms have played an increasingly central role in war zones around the world since the end of the Cold War, but says that something bigger is happening in Iraq. "We have never seen numbers this high," he tells me. "We're talking somewhere between 15,000 and 20,000 private personnel, and that is expected to rise to 30,000 when the coalition hands over power to Iraqis on 30 June."

Singer says the U.S. and British military have for the past 10 years used private military personnel to "protect military installations, escort convoys, things like that -- usually in war zones that the great powers didn't really care about, like Sierra Leone." But in Iraq, the big, defining global issue of today, private personnel have become central to the war and occupation effort.

"We're talking about people using military training and weapons to carry out military functions within a war zone," says Singer. "Some refer to them as 'security guards,' but they aren't like security guys in the shopping mall. Some of these firms have been given airport security contracts in Iraq, and airport security in Baghdad doesn't mean watching bags go through the x-ray machine -- it means hiring ex-Green Berets to defend the airport against mortar attack."

Singer points out that traditional U.S. military doctrine held that civilians accompanying U.S. forces abroad should not be put into roles where they had to carry or use weapons (although they were allowed to carry small pistols in "extraordinary circumstances"), and that "mission-critical" roles should strictly be kept within the military itself. "But in Iraq, the private guys are heavily armed," he says. "They have guns, helicopters, everything. And they are carrying out mission-critical operations, including military support, military training and advice, and tactical military roles. They are even protecting Bremer, and you can't get more mission-critical than that. So much has been handed over; basically we have a system where it's not civilians accompanying the force, but civilians who are an essential part of the force."

This has become apparent over the past two weeks, as private contractors have clashed with armed Iraqis. Much of the fighting in the north and south and in parts of Baghdad has been between Iraqis and coalition troops, leaving an estimated 500 Iraqis and 30 Americans dead. In some parts of the country, however, private military personnel have been left to fight their own battles -- and the coalition's.

According to the Washington Post, on Apr. 5 eight commandos from Blackwater "repulsed an attack by the militiamen of Shia cleric Moqtada Sadr against the Coalition Provisional Authority (CPA) headquarters in Najaf." Apparently, the Blackwater employees spent hours calling the U.S. military and CPA for back-up, but to no avail. Blackwater had to send in its own helicopters, twice, to deliver ammunition to its employees.

On Apr. 6, the home of five private contractors employed by the London-based Hart Group Ltd came under fire from Iraqis in Kut. The five defended their home for two days -- one was killed, bleeding to death after being shot; the other four were wounded. Apparently, the Hart employees called U.S. and Ukrainian military forces so many times during the two-day siege that "the battery on their mobile phone ran out."

"We were holding out, hoping to get direct military support that never came," said Nick Edmunds, Hart's Iraq coordinator. Other private military personnel have been wounded or killed and some kidnapped, while defending their own posts or coalition buildings and property.

Now, the Washington Post claims that private security firms, "unable to rely on U.S. and coalition troops for intelligence or help under duress...have begun to band together, organising what may be the largest private army in the world, with its own rescue teams and pooled, sensitive intelligence."

Defending coalition buildings and defending themselves with no back-up from coalition forces -- it seems that in some parts of Iraq, private military personnel are not so much accompanying the coalition and providing it with back-up as substituting for it. Coalition forces have abandoned private military personnel in some hotspots, leaving them to maintain some semblance of control.

As we watch private contractors fighting against Iraqis, getting kidnapped and in some cases killed, Singer says we could be forgiven for asking: Whose occupation is this? "Instead of questioning the mission, the public were probably trying to figure out just who exactly was performing the mission in the first place," he says.

Where private military personnel in the past were generally not armed and certainly not used for mission-critical operations, today, in Iraq, there are thousands of them, most with guns, and some apparently fighting the coalition's corner. "Yes," says Singer, "there has been a severe blurring of the lines between civilian and military forces in Iraq."

Why has the coalition privatized so much? Singer believes it is more for political reasons than practical ones. He doesn't buy the argument put forward by some, that the use of private firms is a money-saving technique. "There is no single study that has proven that the use of private military contractors saves on the costs of the occupation." He does believe, however, that the shrinking and stretching of the U.S. and British military over the past decade has been key in forcing a turn towards the private sphere. "The U.S. military is 35 percent smaller than it was during the last Gulf War; it is also deployed in places like Afghanistan, Haiti, Uzbekistan, and so on. The British military is the smallest it has been since the Napoleonic wars. Private forces are filling the space."

But he thinks political considerations have been the main motivation. "There is an attempt to displace the political costs of the operation, and those political costs include everything from having to call up more regular forces, which the American military in particular does not want to do, to avoiding suffering casualties. Between 30 and 50 private contractors have been killed, and a lot more are captured right now. Can you imagine the controversy if actual U.S. marines were being held hostage? There is a desire to offset the controversies associated with war."

These considerations no doubt impact on the coalition's thinking in Iraq -- but there is more to the privatization of the occupation than a desire to avoid casualties. The "handing over" of more responsibilities to the private sector, the unprecedented use of thousands of armed civilians in hostile towns and cities, and the overriding question of who exactly is performing this mission all fit into the coalition's curious style of occupation.

America and Britain's domination of Iraq often looks like an occupation-in-denial; the coalition may have thousands of troops in Iraq, but it has little desire to exercise political control, or even political responsibility, over Iraq's present or future. Its occupation has a hands-off feel, where a huge military presence can coexist with political cautiousness and a desire to stand above events on the ground. In much of Iraq, coalition forces are all but invisible.

While the occupation has proved fatal for many Iraqis over the past year, particularly protesting Iraqis, it has not been an all-out military clampdown on Iraq's towns and cities. Much of the occupation has been conducted from behind high walls or from helicopter gunships. One report describes how hundreds of American troops spend their time in Saddam's old palaces, or guarding the "Green Zone" in the centre of Baghdad -- a cordoned-off, massively guarded section of the city centre reserved for the exclusive use of coalition officials -- only occasionally venturing out on missions.

A recent poll asked Iraqis what they thought of the coalition forces -- 77 percent said they had never had an encounter with a member of those forces. Coalition troops have stayed on the outskirts of some cities, in particular Fallujah and Najaf -- which is one reason why private contractors have been left to defend themselves, and coalition buildings, in those cities over the past two weeks.

The outsourcing of so many responsibilities to private firms looks like being part of this strange occupation -- an attempt to create a distance between the coalition's actions in Iraq and the consequences of its actions, between its physical occupation and the political ramifications of the occupation. This is best summed up in the person of Paul Bremer -- America's "administrator" in Iraq (not its high commissioner, note), who surrounds himself with armed men trained by Blackwater in North Carolina and whose main job is to ensure the nominal handover of power to an Iraqi administration on 30 June.

The end result of this frantic outsourcing of responsibility is logistical and political confusion on the ground, which has come to a head over the past two weeks -- where no one seems to know who controls which cities, or why.

Feeding Our Fear of Flying

Is a terrorist attack on an American city imminent? First the US Department of Homeland Security upped its color-coded terror warning from yellow to orange, claiming there had been a worrisome rise in chatter between terrorists; then six flights from Paris to Los Angeles were cancelled after US officials told French officials that al-Qaeda types might hijack one of the planes to use as a missile against America; then US officials insisted that selected foreign airlines should post armed marshals on flights to America; now British Airways has become embroiled, with BA flights to Washington and Riyadh cancelled on the basis of 'security advice'.

Of course terrorists might be plotting to hijack planes -- but for all the cancelled flights, armed guards and endless column inches about the return of terror, it remains uncertain whether or when an attack will occur. The investigations into the Air France and BA flights have turned up little, except that a five-year-old child on one of the French planes had a similar-sounding name to a Tunisian terrorist. The British Air Line Pilots Association believes that the cancellation of BA flight 223 to Washington last week was a 'shot across the bows' by America, an attempt to get skeptical BA pilots to accept having armed air marshals on their planes.

Whatever the prospect of an attack might be, the recent security scares highlight some big problems with the "war on terror." America and Britain's approach to the alleged terror threat appears less as a measured reaction to specific information than as a panicky response to often indecipherable "chatter"; not so much an attempt to deal with specific threats as a very public fretting about a potential attack occurring somewhere, somehow, some time. By going public with all sorts of intelligence -- reliable or otherwise -- the US and UK elites appear effectively to be projecting their own uncertainty on to the rest of us, and fostering a climate of paranoia in the process.

The US Department of Homeland Security kick-started the current terror alert when it raised its threat warning to orange, in response to an "unprecedented increase in 'chatter'" -- the term used by intelligence officials to describe communication levels between suspected terror groups and individuals. But how reliable is "chatter" as an indicator of terrorists' intentions or imminent action? Unlike human intelligence -- which collects information through human contact with a terror group or enemy state, usually through infiltration -- "chatter," or signal intelligence, is collected by technical means, by using satellites to eavesdrop on phone conversations and email correspondence between suspected terrorists. Not surprisingly, such chatter often proves inadequate for those involved in counter-terrorism.

"Chatter is a descriptive term," says John Hamre, president of the Center for Strategic and International Studies (CSIS) in Washington. "Much like when you walk into a room and there are many conversations underway, you cannot clearly hear any complete conversation, only random pieces." Hamre, who served as deputy secretary of defense under President Bill Clinton in the late 1990s, tells me that "chatter" also refers to background noise: "As it relates to intelligence, chatter reflects a larger, though not necessarily a large, number of messages that have suspicious references, which appear related but with inadequate specific details.... Often the capacity to analyze these references is quite limited, so they are aggregated into general categories."

Hamre doesn't believe that the French or British flights were cancelled on the basis of chatter alone. "I don't think they would have cancelled specific flights based on non-specific information. I suspect there were specific references to flight numbers or take-off times." This may be so -- though some security experts reportedly suspect that BA flight 223 may have been cancelled because intelligence officials "heard" terrorist chatter about United Nations Resolution 223, which criticizes Israeli treatment of Palestinians and is often cited by Arab leaders and activists, or that terrorists allegedly picked on this flight because of its numerical symbolism.

However, Hamre points out that even responses to specific leads can later become bound up in non-specific "chatter." "Once a flight is cancelled, this can become the subject of subsequent 'conversations' which could be collected and might become the basis for subsequent groundings." So chatter provokes a response, the response becomes part of the chatter, which might cause another response....

According to Adam Dolnik of the Terrorism and Political Violence Programme at the Institute of Defense and Strategic Studies in Singapore, human intelligence is always preferable in counter-terrorist situations. "Signal intelligence can be, and historically has been, a reliable source of intelligence," he says. "But its utility is limited when it comes to fighting terrorism, since terrorist groups make decisions in a small circle of people who communicate personally in unknown locations. That is why human intelligence is the most valuable tool in fighting terrorism, as getting a person into this closed circle can provide you with information that you could not attain otherwise."

John Hamre is concerned that responding to non-specific threats could have the unintended effect of helping out the terrorists. He describes a simulated war game, "Silent Vector," conducted by CSIS in late 2002 to test how the US National Security Council might cope with a credible but non-specific warning. "One of the key findings of our exercise was the need to avoid responding to non-specific threats," he says. "It is quite easy for would-be terrorists to create ambiguous but credible warnings in general terms. If we respond every time we have a general warning we would fall prey to cheap terrorism -- doing the work for the terrorists by disrupting our economy and society only at the pointing of rumors."

Instead of responding to everything, Hamre argues that "it would be better to announce that there are general threats, but that they lack the specificity to act in a sensible way." Yet isn't this penchant for going public with all manner of intelligence, whether chatter-derived, specific or non-specific, also part of the problem? The recent terror alerts are not only the result of listening too keenly to "chatter" or misreading and overplaying apparently specific threats -- they are also a consequence of US and UK officials increasingly making public their concerns about the potential for attack. This conflation of private intelligence and public information has contributed to today's climate of fear and suspicion.

Of course, the authorities ought to disseminate and analyze the intelligence they receive, whether that intelligence is recorded "chatter" or information gathered by humans in the field. And there also ought to be public information, specific, targeted information when the need arises, that might assist the public in dealing with an event or crisis. Yet these are two distinct things -- intelligence should be analyzed privately and discreetly, behind closed doors, by experts; and information should become public when it is potentially useful. In the current war on terror, these two things have been thrown together in a dangerous concoction, resulting in the publication of both general and specific intelligence in the name of permanent vigilance.

This "going public" with intelligence plays an important role for the authorities in the war on terror: Government officials and institutions are keen to demonstrate concern for the public, to show that they are actively considering our welfare and safety; and it also becomes a means of avoiding blame -- when, if anything does go wrong, the authorities can say: "We did issue warnings beforehand...." Yet this conflation is detrimental to public trust. Indeed, it seems that we now have the worst of both worlds -- private intelligence that is often inadequate or misjudged, which is made public in the name of blame-avoidance and reassuring the public. The end result is that the authorities continually project their anxiety on to us -- just in case.

Writing in the UK Independent, Kim Sengupta argues that "in the current security climate, overreaction is preferable to under-reaction. The IRA message to Margaret Thatcher after the Brighton bombing -- 'You have got to be lucky all the time, we have got to be lucky just once' -- has never been so relevant, even if the threat comes from another quarter."

Yet the notion that we should organize the skies, or any other part of society, around guarding against one potential (and unpredictable) instance of "luck" on the part of terrorists is highly problematic. The price of this kind of permanent vigilance and suspicion is a disabling state of anxiety -- the like of which any cynical terrorist, whose aim is to spread fear and loathing in the West, could be proud.

Liberation by Camera

"Deaths will bring Iraqis real liberty …" So said a headline in yesterday's U.K. Sun, as political editor Trevor Kavanagh claimed that the "richly deserved deaths" of Saddam Hussein's sons Uday and Qusay had finally given Iraq "the chance to shake off 35 years of repression and claim its place as a major voice in the Arab world."

How could the killing of two former Ba'ath Party henchmen who had been on the run for four months lead Iraq into a bright new future of freedom and Arab leadership? The widespread hope that the deaths -- and the grisly photos to prove them -- will turn Iraq around reveals much about the coalition's hollow campaign.

The coalition and its supporters are staking a lot on the killing of Uday and Qusay. According to one report, "the way is now clear for democracy to flourish." Others hope that the deaths will be an "important morale boost for the increasingly demoralised occupation forces," and a "useful distraction for the White House, which is under intense scrutiny for its misleading prewar claims."

An Iraqi civic leader says the deaths "might turn the tide in favour of the coalition," by finally bringing the sporadic violence against U.S. troops to an end. American military analyst Ralph Peters says the killings "could be more important in the long run than the fall of Baghdad," finally showing the world, and more importantly Iraqis, that Saddam is a "man of the past."

Of course two deaths cannot resolve political crises, deliver democracy or end violence -- even if the dead are, as U.S. officials helpfully remind us, the Ace of Hearts and the Ace of Clubs in America's playing-card guide to evil Iraqis. That much is clear from the fact that, since Uday and Qusay were killed on Tuesday, five Americans have been killed, bringing the total number of U.S. deaths for the past week to 11 -- the highest weekly rate since the war officially ended in mid-April.

Rather, the idea that Uday and Qusay's deaths represent the liberation of Iraq and the vindication of the war indicates that this is still very much a campaign of style over substance. Throughout the Iraqi venture, American and British forces carried out symbolic stunts and searched for "defining images" as a means of "demonstrating" that they had "liberated Iraq."

Americans blew up a 40-foot statue of Saddam in order to "send a powerful message to remnants of the regime"; Brits launched a dawn raid to knock down two statues in Basra "to show the people … we will strike on any representative token of that regime"; even when U.S. forces seized Saddam's New Presidential Palace in early April, it was described as being a more symbolic than strategic target.

In wars of old, it was usually after declaring victory that the winners would put on massive displays of symbolic force and topple symbols of the defeated regime. In Iraq, such gestures took place before the war had ended, as part of the campaign to "liberate Iraq." In a war where the very aim was to "send a message" and project a positive image of the coalition forces, toppling statues was seen as being just as important as winning cities.

By the same token, the deaths of Uday and Qusay are now talked up as "more important than the fall of Baghdad" -- not because there's any evidence yet that the Hussein brothers were organising operations against the coalition forces, but because their deaths apparently "send a message" to Iraqis and the world. As Ralph Peters claims, "The symbolism is almost biblical…. And as our symbolism goes up, theirs goes down. Far from driving us out of Iraq, we go in and kill [Saddam's] sons."

So the battle in Iraq is a battle of symbols -- with America apparently symbolising Good, and Uday and Qusay symbolising Evil. From this view, killing Uday and Qusay was the equivalent of knocking down a statue, or some other "representative token" of the old regime. The shootout in Mosul on Tuesday was less a clash between U.S. forces and threatening hostile elements than yet another image war played out in real time.

This same obsession with the symbolic informed America's decision to release Uday and Qusay's death photos. After 24 hours of internal doubt and debate, the Pentagon decided that making the pictures public was in the best interests of the Iraqi people. Why? Because it will finally convince them that they are free, thus transforming Iraq's fortunes.

According to one report, U.S. forces may even allow broadcasters and journalists (those all-important communicators of "messages") to film and photograph Uday and Qusay's bodies, as part of the mission to "convince Iraqis that they are free of the former repressive regime."

U.S. defence secretary Donald Rumsfeld has justified the publication of the pictures with reference to the killing of the Romanian tinpot dictator Nicolae Ceausescu in 1989: "It was not until the people of that country saw him, saw his body, that they actually believed that the fear and the threat that his regime posed to them was gone." According to Paul Bremer, America's man in Iraq, "The strategic importance of the killings, of their being dead, is to help us persuade the Iraqi people that [we have] liberated the country."

In this psychological view of postwar Iraq, all that Iraqis need is to see a photo or two of their dead former rulers in order to realise that they are free. Iraqis are seen as being either in thrall or in fear of the former regime, and America's role is to snap them out of it with the aid of some snapshots. In short, the barrier to Iraqi liberation is the people's own fearful mindset.

Of course, there's much more to it than that. It is one thing to chase a failing regime out of a failing state, to topple statues and to kill two has-been dictators in a six-hour shoot-out in Mosul. It is quite another to create something of substance to put in their place. As postwar Iraq remains a mess, the attempt to "convince" Iraqis with images and stunts that the war was good and their liberation is imminent looks increasingly like an attempt on the part of the coalition to convince itself of its own role in the world.

Brendan O'Neill is an editor at Spiked Online.

A Roadmap to Nowhere

"There can be no peace for either side ... unless there is freedom for both," declared President George W Bush, as he introduced his roadmap for peace in the Middle East to an expectant world (1).

By "freedom" Bush means the people of the Middle East will be given strict instructions on how to resolve their conflict. The Palestinians will be told what kind of government to install, whom to elect, when to elect them, why to elect them, and what kind of politics to practise. The roadmap for peace lays the ground for relentless intervention by a "Quartet" of powers (the USA, the EU, the UN and Russia) to oversee the Middle East's transition from conflict to peace by no later than 2005. Freedom doesn't get a look-in.

The roadmap sounds less like an historic strategy to negotiate a treaty between warring factions, than a Third Wayish attempt to wish away political conflict. It is a "performance-based and goal-driven roadmap, with clear phases, timelines, target dates and benchmarks aimed at progress through reciprocal steps by the two parties in the political, security, economic, humanitarian and institution-building fields" (2). It sometimes sounds as if the Quartet of powers is trying to get the trains to run on time, rather than resolving a clash of two nations.

Yet behind the new-fangled focus on timelines and targets, the roadmap seeks to impose a solution. It is "goal-driven" in the sense that the goal has already been defined by the Quartet, and there will have to be "clear, unambiguous acceptance by both parties of the goal of a negotiated settlement as described below," because "non-compliance ... will impede progress" (3). It is "performance-based" in the sense that the Quartet "will meet regularly at senior levels to evaluate the parties' performance," and will take an "active and sustained" interest in "monitoring" the emergence of a Palestinian constitution and elections (4).

The roadmap is profoundly anti-democratic. Like the peace process that gave rise to it, the map is premised on the idea that a solution to the Middle Eastern conflict can only come from outside the Middle East. The parties to the conflict are clearly too blinkered and untrustworthy to resolve the issues among themselves, and need the helping hand of a disinterested and rational outside power (or four), who can show them what their best interests are.

This approach has defined the Israeli/Palestinian peace process. From the Madrid Conference of 1991 to kickstart "a Middle Eastern peace strategy," to the "historic handshake" between Yitzhak Rabin and Yasser Arafat on the White House lawn in 1993, to the Oslo Accords of the same year -- the received wisdom has been that the further you are from the Middle East, the better placed you are to determine a sensible and fair outcome to the whole debacle.

In the world of the peace process -- not only in the Middle East, but in South Africa, Ireland and elsewhere -- the most authoritative are those who can rise above messy clashes and conflicts, who have no apparent self-interest in the end result, and who only want peace. This idea of resolution by a disinterested third party (usually the USA or the UN) is now seen as a commonsense approach to conflict resolution around the world -- yet it is the antithesis of democracy.

The roadmap confirms the final emasculation of the Palestinians. The Palestinians have always been party to the peace process from a position of weakness rather than strength. Like other US-sponsored peace processes, the Middle Eastern one came about as a result of shifts in the global political climate. With the collapse of the Soviet Union and the end of the Cold War in the late 1980s, Western powers had a freer hand to impose solutions on conflicts around the world. The Soviet collapse and the international isolation of left-wing movements also put national liberation movements like the Palestine Liberation Organisation (PLO) on the defensive.

Palestinians first got involved in peace talks at the Madrid conference in 1991, when the PLO leadership was increasingly isolated in the Arab world as well as internationally. During the 1991 Gulf War the PLO supported Saddam's Iraq, which alienated many of its wealthy supporters in and around the Gulf. It was from a defensive position of global and local isolation that the PLO entered the peace negotiations, during which it has made numerous concessions on its traditional goals and aims.

Now the new roadmap illustrates that the Palestinians -- as a community with political aims and aspirations -- no longer figure as an independent factor in the Middle East. Palestinian political life exists only inasmuch as it supports and subscribes to the broader requirements of the US-sponsored peace process. The "viable" Palestinian state envisaged by the roadmap will be one designed to suit external requirements, rather than being internally built and sustained by the needs and desires of the Palestinian people.

Consider the Palestinian Authority's new office of prime minister. This was created and filled because America demanded it, rather than as an expression of Palestinian political will. Earlier this year, Bush officials said there could be no progress in the Middle East until Palestinians installed a prime minister. America's primary interest was to sideline PLO leader Yasser Arafat, who is seen by some in the Bush and Blair camps as a barrier to a settlement. So Palestinian leaders created a prime ministerial office and filled it with Mahmoud Abbas, in line with American diktat.

The Palestinian prime minister is in power (if you can call it power) to suit outside demands, rather than being anything like an embodiment of Palestinian will. Likewise, the roadmap describes how the Quartet of powers will keep a close check on the emergence of a Palestinian state, to ensure that it puts "tolerance and liberty" centre stage and follows the roadmap rules (5). In this vision for the Middle East, "Palestine" will be little more than a hollow shell, where the leaders' primary responsibility will be to the peace process and the roadmap, rather than to their own people or politics.

Many have interpreted the roadmap as an old-style US/Israeli assault on Palestinian demands. Yet for all that it represents the defeat of independent Palestinian politics, this roadmap also turns Israel into a pawn of the Quartet powers. The roadmap makes a list of demands of the Israeli leadership relating to its borders, security and political culture. It may not have Palestinian interests at heart, but neither does it represent Israel's interests.

It remains to be seen whether the new Palestinian state due to be created by the roadmap will be "viable" -- the Quartet's favourite word. Durable political structures emerge from real struggles to decide the future and direction of society. The PLO, for all its shaky politics (a kind of unhappy marriage of Stalinism and Islam) and its documented corruption, was created and sustained by a struggle and by mass support from those who saw it as representing their interests. Its aims were internally generated, and it won fierce allegiance from the Palestinian people.

Will the Palestinians swear a similar allegiance to the hollow state envisaged by America's "performance-based and goal-driven roadmap"?

(1) President Discusses Roadmap for Peace in the Middle East, White House, March 2003
(2) A Performance-Based Roadmap to a Permanent Two-State Solution to the Israeli-Palestinian Conflict, US Department of State, 30 April 2003
(3) Ibid.
(4) Ibid.
(5) Ibid.

Brendan O'Neill is coordinating the spiked-conference Panic attack: Interrogating our obsession with risk, on Friday 9 May 2003, at the Royal Institution in London.

Watching the Dark Side

As any film historian or critic will tell you, the relationship between society and science fiction cinema is a complex and often symbiotic one.

Science fiction has always responded to and utilised high-profile or newsworthy events. In a sense, SF films mirror the important issues of the day, although this doesn't necessarily mean that they reflect public opinion. For example, some of the first SF films -- including Wallace McCutcheon's "The X-Ray Mirror" (1899) and Thomas Edison's "The Wonderful Electro-Magnet" (1909) -- focused on electricity and X-Rays, because these were big news at the time.

A flurry of films followed the birth of nuclear energy and the first atomic bomb explosion. There was an endless parade of monsters trotting across cinema screens, mutated by the effects of nuclear power -- the most memorable, and perhaps most understandable, being Japan's "Gojira," or "Godzilla": a hulking green metaphor with attitude.

Ecology and nature were newsworthy in the 1960s and 70s, influencing films like "Silent Running" (Douglas Trumbull, 1971) and "Soylent Green" (Richard Fleischer, 1973). Such films posited what might happen to the Earth and its population if we carried on abusing the planet's natural resources. In "Soylent Green" we all turned into unsuspecting cannibals because of the shortage of food, with Charlton Heston trying to expose the culprits.

It followed that the rise of computer technology and robotics in the late 1970s and 1980s gave rise to movies like "Demon Seed" (Donald Cammell, 1977), "The Terminator" (James Cameron, 1984) and "Robocop" (Paul Verhoeven, 1987). These films suggested that technology might spin out of our control, or that as androids we could lose our individuality, our humanity or our very souls.

All of these examples have one thing in common: They play on our fears of change, of new developments happening so quickly that we hardly have time to take a breath. In effect, they are warnings about what might happen; wake-up calls, or the present extrapolated to the nth degree.

Genetics has been used in science fiction stories for a while now -- John Wyndham's "The Day of the Triffids" (1951) and the movie that followed hinted at botanical tampering long before GM crops appeared. But because the subject of genetics is much more high-profile now, it should come as no surprise that the number of movies on this theme has increased. Recent examples include "Gattaca" (Andrew Niccol, 1997) and "The 6th Day" (Roger Spottiswoode, 2000).

Genetics even invaded the 'Force' in "Star Wars: Episode I" (George Lucas, 1999), where the young Darth Vader was tested for midi-chlorians -- something Old Ben Kenobi never did with Luke Skywalker back in '77.

Genetics is a fascinating topic for writers and directors; I myself have explored it in past stories. But the films mentioned above fulfil a dual role for the audience. They provide spectacle, something you don't see every day -- which is fundamentally what SF and any fantastical genre is all about. And they speculate on the future -- sometimes the not-too-distant future. Such films are one way of dealing with rapid developments in society -- not always a good way (sometimes they provide a knee-jerk response), but one that nevertheless appeals to a wide audience.

So why do they have to paint such a pessimistic picture? Dystopias have always been and always will be more popular than utopias. Films where everything goes wrong are infinitely more attractive to audiences than films in which everything goes right. You only have to consider the popularity of disaster movies for proof of this.

Without dystopias there'd be no "Planet of the Apes," no "Mad Max," no "Metropolis." "Jurassic Park" (Steven Spielberg, 1993) wouldn't have been half as much fun, and wouldn't have made so much money at the box office, if the genetically created dinosaurs had all been well behaved. No chase sequences, nobody getting eaten on the toilet....

The same is true of the tradition of the iconic 'mad scientist' that has built up through SF movies. This began with adaptations of HG Wells' "Invisible Man" and "Dr Moreau," and Mary Shelley's "Frankenstein," and continues today with genetic reworkings of the theme in "Hollow Man" (Verhoeven, 2000), where Kevin Bacon uses genetic research to make himself invisible, "Deep Blue Sea" (Renny Harlin, 1999), in which supersharks are the result of genetic manipulation, and "Resident Evil" (Paul WS Anderson, 2002), where an underground genetic experiment turns people into zombies.

These are considerably more exciting than a documentary film about what really happens inside a lab. The simple truth is Hollywood needs to make a profit, and if the fictional scientific experiments were harmless, there'd be no entertainment value for the punters.

However, there is also a serious side to the movies. SF has always commented indirectly on the state of the world. Cloning may have been used to great effect in "Multiplicity" (Harold Ramis, 1996) to demonstrate Michael Keaton's comedic capabilities, but it also threw up the question of what would happen if this were real, if a clone of yourself, or three in this case, existed in the world with its own thoughts and actions.

Cloning also allowed filmmakers to bring back Ripley after she was killed off in "Alien 3," but there was a cost: a room full of mutated Sigourney Weaver clones, barely human, that didn't quite make the grade. Would we use these genetically bred people as slaves, as huge corporations do in the future of "Blade Runner" (Ridley Scott, 1982)? And just how much power might these corporations have over us, or even over governments?

Would we put clones to work as policemen or soldiers as they do in "Judge Dredd" (Danny Cannon, 1995) and "Star Wars: Episode II" (George Lucas, 2002)? And what would be the ramifications of a war where you never ran out of fighters on either side? What rights would these 'individuals' have, and would they be discriminated against like the mutants of "X-Men" (Bryan Singer, 2000), with laws passed to try and separate them from the rest of humanity? Or would tinkering with genetics before birth lead to an Aldous Huxley-style "Brave New World," with a hierarchy of genetic classes? And where would that leave the rest of us poor 'In-valids', as they're called in "Gattaca"?

For all its negativity, science fiction cinema tends to hit some nails on the head. Although it rarely offers solutions, it at least raises questions about such issues. The best science fiction should inform as well as entertain -- and I hope it will continue to do so well into the future.

Paul Kane is a film critic, author and lecturer from Derbyshire, who runs a <a href="http://www.shadow-writer.co.uk">personal website</a>. His dark fantasy books include "Alone (In the Dark)" and "Touching the Flame." This is a shortened version of a speech he gave at the Institute of Ideas' Genes and Society Festival in London on 26-27 April 2003.

Friends of the Right Color

Amid the furious speculation in recent weeks about what the Iraqi people want, one British newspaper columnist played a trump card by quoting an "Iraqi friend." This is a shortcut to the moral high ground: "Others claim to speak for the Iraqis, but I actually have an Iraqi friend and he says X."

In this case the Iraqi friend was pro-war, but that's not really important. What disturbed me was the use of the word friend. I don't doubt the guy is on speaking terms with an Iraqi, but friends? They went to school together? They hang out at work? They meet in the pub and talk about football? What is a friend, anyway?

We are rightly suspicious of anyone who says, "Some of my best friends are black/gay/Inuit." 'What's coming next?' we wonder. Because it's perfectly possible to have black/gay/Inuit friends and still to hold outrageous political views about these groups in general. And it is just as possible to feel political solidarity with groups of people you never encounter in your private life.

I don't have a lot of friends, certainly not enough to represent all the ethnic and social groups I don't hate. I gave this some thought, and realised that my five closest friends are all white, male, heterosexual, Protestant atheist and lower-middle class. (Actually one is a Catholic atheist who pretends to be working class and is possibly gay, but I'm constantly ribbing him about all three deviations.)

Of the five, three are called Stuart. Does this mean I'm prejudiced against people who aren't called Stuart? Come to think of it, there is another Stuart outside the five with whom I get along very well, but whom I've always considered a little odd on the grounds that he believes in God. Is that intolerant of me? I am also quite friendly with a lesbian, but that's only because I secretly wish she wasn't.

Friendship is a messy business, and it would be quite wrong to judge anybody by their friends, for better or for worse. The idea that we should each seek to recreate the full diversity of society among our own friends both trivializes the struggle for political equality and makes a mockery of friendship. Contrary to the saying, we don't choose our friends at all. If my experience is anything to go by, we fall in with them against our better judgement.

So next time somebody wheels out an Iraqi, a Chechen dissident or an Amazonian Indian "friend" in support of an argument, give them a slap from me. I'm telling you as a friend.

Flagging Support for Flag-waving

It isn't only US troops in Iraq who are taking down the Stars and Stripes, after their commanders warned that raising the flag gave the "wrong impression" of the coalition's war aims. On the home front, too, wartime flag-waving seems to be flagging.

Across America, newspapers report that the flag has been notable by its absence since the war started. In Michigan, the Detroit News headlined a report "Combat fails to inspire new patriotic movement," claiming that "plenty of flags still sit on store shelves, but people aren't flying them in the days following the invasion of Iraq" (1). Further West, the San Jose Mercury News says: "Look around. The nation is at war. But the flags aren't flying..." (2)

Even in the traditionally pro-war South, where some have hoisted up the Stars and Stripes since the war kicked off, there hasn't been quite the patriotic outpouring many expected. A Texan paper reports that "Old Glory is making a reappearance," but it is "nowhere near the level we witnessed" after the Sept. 11 attacks.

What's with the lack of patriotic symbols? An American shop-owner, disappointed by poor flag sales, reckons people "still have the flags they bought [after 9/11], and they don't need new ones" -- though that raises the question of why so few are dusting off those flags and flying them for the war in Iraq (3). One war supporter thinks patriotism has become privatized, arguing that "there's patriotism here ... but it doesn't have to be worn on your sleeve" (4).

In truth, the lack of flag-waving reveals a deeper ambivalence about the war. Across the USA and the UK, from the war planners themselves down to everyone else, there doesn't seem to be much strong support or passion for the war in Iraq. The majority of people seem to have a shoulder-shrugging approach to it, rather than any especially pro-war feelings. After all, if even US troops have effectively been banned from flying the flag, for fear of coming across as conquerors, then Americans are hardly likely to be fired up to flag-wave.

In Britain, too, there is nothing like the "Falklands Factor" of 1982, when then prime minister Margaret Thatcher stirred up nationalist sentiments over the Falklands War with Argentina to boost her own standing. The UK tabloid The Sun may have printed its usual wartime one-page special saying "Support our boys" (and girls, this time), which it is encouraging its readers to stick in their windows -- but outside of army towns and naval bases, not many seem to have taken up the offer.

Even those who are waving the flag in America don't sound especially gung-ho. "These flags are not about whether you are for or against the war," says one US flag-waver; it's more about "feeling helpless," so you want to "do something to show pride in our American boys and girls" (5). One retailer says that some pro-war types have opted for "patriotic paraphernalia," like teddy bears with Stars and Stripes t-shirts, over flags themselves. And some are wearing red, white and blue "remembrance ribbons" instead of flying flags, to express their support for the troops but not necessarily for the war.

The lack of wartime flags suggests a broader discomfort with war, aggression and gung-ho displays of old-style patriotism. Everyone contrasts the few flags fluttering in the USA today with the masses of flags that flew after the Sept. 11 attacks. But the flying of the flag post-9/11 wasn't a traditionally patriotic outpouring so much as a collective expression of fear and angst in the wake of the terrorist attacks.

After Sept. 11, the Stars and Stripes was flown not as a symbol of American power and defiance but as a "coming together" around a shared sense of grief. As one report put it, the flag became "an instant, universal symbol of mourning" (6). Some Americans who would never have dreamt of flying the flag in the past claimed that it had "lost its stigma" after 9/11, while others flew the flag next to signs saying "Hate-free zone," encouraging people to "respect Muslims". This was hardly war fever.

It seems that while it's okay to fly the American flag as a symbol of victimhood, it's not so cool to fly it in support of a war. We live in an age that feels uncomfortable with aggressors, but comfortable with victims -- uncomfortable with old-style ideals about military heroes, but comfortable with the notion that we're all damaged goods now, who can often only feel united in response to tragedies like the Sept. 11 attacks.

So one way that some at home are expressing their "support" for American and British troops in Iraq is by empathizing with them, rather than by supporting their mission (whatever that may be). In parts of America, communities are apparently holding candlelit vigils for US troops, while in Britain some have laid flowers outside army and naval bases in memory of the troops who have died. According to one report, it is ceremonies like these that most express "support on the home front" for the troops in the Gulf.

As the war drags on -- without much enthusiastic support from those organizing it or from those watching it -- some in America and Britain are supporting their boys (but not necessarily their boys' war) by expressing solidarity with them as potential victims, rather than supporting them as soldiers fighting for a worthwhile cause.

Brendan O'Neill is an editor at Spiked Online.

The Fear Superbomb

'Mother of all bombs', says the UK Sun, next to a picture of America's new 'fearsome superbomb', which will apparently 'help destroy Saddam Hussein's evil regime' (1).

The MOAB (Massive Ordnance Air Blast) is a 30-foot-long bomb, made up of nine-and-a-half tonnes of high explosives, which apparently unleashes a 10,000-foot mushroom cloud when it blows up. 'This is not small', said a deadpan defence secretary Donald Rumsfeld yesterday, after the US Air Force tested the bomb at Eglin Airbase in Florida (2).

The superbomb may be the world's 'most powerful non-nuclear bomb'. But it also shows the gap between America's military might and its deeper political uncertainty. US officials are trying to fill the moral vacuum exposed by the planned attack on Iraq with a very big weapon -- a case of 'when all else fails, get your bombs out'.

The US Air Force made a public spectacle of the MOAB test explosion. Where weapons testing in the past was generally shrouded in secrecy, the explosion of the superbomb in Florida was accompanied by a press release, official statements and even video footage.

One of the aims seems to be to make a virtual impact in Iraq. Apparently, US officials plan to have the video coverage of the test explosion 'beamed to Iraqi troops to terrify them into surrender' (3). The US Air Force has taken to dropping bombs in 'safe military zones' in sunny Florida in an attempt to make an impact in the enemy camp of Iraq. That's enough to make the US military's increasing reliance on unmanned drones to do their bombing in Afghanistan and elsewhere look like the height of military engagement.

Some are comparing MOAB to the atom bombs that were dropped on Hiroshima and Nagasaki in 1945, arguing that this new superbomb might put a stop to the Iraqi crisis just as the atom bombs brought an end to the Second World War. Leaving aside the fact that you can't compare the Second World War to the West's self-induced crisis in the failed state of Iraq, the contrasts between the role of MOAB today and the role played by the A-bomb nearly 60 years ago are striking.

Of course, for those on the receiving end, the choice between being MOABed or A-bombed is no choice at all. Whichever weapon of mass destruction America decides to use in its foreign ventures, the end result is devastation. That the new 'superbomb' is a 'non-nuclear weapon' but with the 'power of a small nuclear explosion' is a detail that will be lost on those who feel its heat. But the current discussion about MOAB and Iraq captures something of America's current cautiousness on the international stage.

The atom bombs were hugely devastating, laying waste to two Japanese cities and killing at least 200,000 people. But the atomic bombing of Japan was about more than reaping blind destruction, even though it did that very well. With the Second World War coming to a close, and a new world order emerging, the devastation visited on Japan was about displaying America's military, economic and political power to the world, embodying the USA's aim to assume dominance over the postwar globe.

As the Japanese Committee for the Compilation of Materials on Damage Caused by the Atomic Bomb in Hiroshima and Nagasaki succinctly described it: 'The A-Bomb attacks were needed not so much against Japan - already on the brink of surrender and no longer capable of mounting an effective counter-offensive - as to establish clearly America's postwar international position and strategic supremacy in the anticipated Cold War setting.'

The new superbomb seems to be all about making a military impact, and little else - a demonstration of America's brute force and ability to instil fear. Military officials talk up the psychological and military impact that the superbomb will have. 'A primary reason to utilise this kind of weapon is psychological', says one Pentagon official. 'The intent is to paralyse and terrorise Iraqi troops, to stop them in their tracks.' (4)

For some US officials, one of the benefits of the superbomb is that, upon impact, it looks like a nuclear attack without actually being one. 'It sends up a mushroom cloud so vast that enemy soldiers who see it from many miles away will think America has done the unthinkable', says a Pentagon official (5). In 1945 America did the unthinkable in order to impose its political power on the postwar world; now America wants to project an image of the unthinkable in order to traumatize Iraqi troops.

There also appears to be a different attitude to civilian casualties within today's uncertain administration. The atom bombs of 1945 were specifically targeted at civilian populations. Hiroshima and Nagasaki were picked because they were cities with huge numbers of civilians, and the bombs were dropped, without warning, at times of the day that would ensure the largest number of civilian fatalities.

Today, officials claim that the superbomb will be kept away from civilian areas. According to a Pentagon official: 'It will be used for shock value alone and dropped well away from cities where it could inflict civilian casualties.' (6) Shock value? Listening to US officials, you could be forgiven for thinking they were launching a piece of performance art in Iraq, rather than a war.

Of course, the fewer civilian casualties there are in Iraq, the better. But the USA's concern about civilian casualties today is not driven by a newfound respect for innocent life. Consider the Afghan campaign, where large numbers of civilians were killed by American bombs, including guests at no fewer than three wedding parties. Indeed, in Afghanistan it was military disengagement and uncertainty - the fact that America fought its war largely from the air, without going in on the ground to sound out allies and gather intelligence - that led to some of the more lethal bombing raids.

Rather, US concern about civilian casualties is more about how such issues play in today's supposedly humanitarian era. At a time when wars are justified in the language of human rights, when foreign interventions have to care as well as kill, civilian casualties don't look good. For US officials, being seen to avoid civilian casualties is more about PR than principle.

None of America's uncertainty means that US forces will not ship their superbomb from Florida to Iraq and reap destruction with it. But the apparent choice of the superbomb reveals much about the planned attack on Iraq. It seems that some US officials would rather drop superbombs on Iraq from on high and hope the crisis just goes away, rather than have to think about it for very much longer.

(1) 'Mother of all bombs…', Sun, 12 March 2003
(2) Air Force tests MOAB monster bomb, United Press International, 11 March 2003
(3) 'Mother of all bombs…', Sun, 12 March 2003
(4) 'Mother of all bombs…', Sun, 12 March 2003
(5) 'Mother of all bombs…', Sun, 12 March 2003
(6) 'Mother of all bombs…', Sun, 12 March 2003

Cheap Thrills Documentary

Jacko is wacko. Thanks to Martin Bashir's intrepid, insightful journalistic skills, we now know that. And there we were thinking that he was a nice, well-adjusted, all-American boy.

But what else do we know? Following the two-hour ITV special Living with Michael Jackson, broadcast in Britain on Monday night to an audience of 14 million, the UK press has worked itself into a revelation-induced frenzy. Apparently, Jacko is everything from a serial pedophile who lures boys into his bed to a shopping-addicted logical conclusion of our consumer society to a fucked-up prototype of an abused childhood.

As if to compensate for the fact that Bashir's tedious documentary, with its combination of obsequiousness and snide asides to camera, got nowhere near any "deeper" issues, psychobabblers and po-faced social commentators have rushed in to theorise, understand and condemn. What has Bashir's two hours done to deserve such attention?

The striking thing about Living with Michael Jackson was not anything that Bashir uncovered, but what Jackson boasted about himself. "I am Peter Pan," he said, escorting the camera crew around his personal fairground, showing off his toys, talking about his love of climbing trees. The idea that this apparently unhinged personality likes having innocent sleepovers with his pre-pubescent friends was quite believable -- or at any rate, more believable than the allegations that he wants to have actual sex with them (or, indeed, with anybody else).

Okay, so it is not normal for a 44-year-old man to aspire to being aged 12. But it's a reasonably interesting story -- and one that Bashir, with his muttered innuendo and moral outrage, skimmed over in his quest to push the pedophile button. Just as he pushed the poor-parenting button instead of getting anywhere interesting on the story of Jacko's kids. It can't, indeed, be much fun for two children to walk around wearing masks or veils -- although Bashir's solution to their inability to lead normal lives ("Get the security guards to take them to the zoo") seemed pretty surreal, too.

And Jacko might have excruciating taste in over-priced furniture -- but did we really need to spend what felt like half an hour with him in a shop, with Bashir playing to that famous British anti-ostentation and muttering things like "a bargain at a quarter of a million dollars, that one!" to the Argos customers on their couches at home? Was this worth eight months of Bashir's life, let alone five minutes of our time?

In fact, when Jackson's PR people slam the documentary for being a "salacious ratings chaser," you can only think that they don't go far enough. It was two hours' worth of tawdry moralising, which substituted rumour and innuendo for journalism, and in which Bashir didn't even criticise Jackson to his face, reserving that for his after-the-event bitching to the TV audience.

Of course, Bashir isn't a journalist. He's a Big Ear with a talent for therapeutic smarm, who made his name with that famous 1995 BBC Panorama interview with Princess Diana. Jackson knows this -- which is why, now that he too has been Bashired, he claims to be "devastated" and "betrayed." "Martin Bashir persuaded me to trust him that his would be an honest and fair portrayal of my life and told me that he was 'the man that turned Diana's life around,'" Jackson whinges now.

Given that Jackson doesn't seem to live in the real world, he may have missed the fact that Diana had never been accused of child abuse, and that while the likes of Martin Bashir might play along with tree-climbing and learning dance steps, he's not going to do anything that his audience might find controversial. Like criticising Princess Diana, or passing up the chance to voice moral outrage behind Jacko's back.

From this viewer's couch, Living with Michael Jackson was a superficial portrait of two equally unappealing characters. At least one of them can dance.

No War of Liberation

You've heard of national liberation, women's liberation and even animal liberation -- but what about accidental liberation?

This is a theory doing the rounds among some liberal commentators feeling guilty about their support for war with Iraq. It holds that, however bloody, barbaric and American the war will be, at least it will have the godsent side-effect of liberating Iraqis from oppression.

According to Johann Hari of the UK Independent, "This war is going to be terrible -- but leaving Saddam in place would be even more terrible.... The difference is the deaths at the hands of Saddam will shore up Ba'athist national socialism, while deaths in war would at least clear the way for a free and democratic Iraq" (1).

Guardian loudmouth Julie Burchill puts it more bluntly: "If you really think it's better for more people to die over decades under a tyrannical regime than for fewer people to die during a brief attack by an outside power, [then] you're really weird…." (2)

The idea that the coming war will accidentally liberate Iraqis betrays a breathtaking naivete about the consequences of Western intervention. Outside interference in Iraq has already exacerbated local tensions, and military intervention can only further unravel the fragile Iraqi state. The internationalisation of Iraq's local conflicts threatens to divide Iraqis further and store up conflict for the future, rather than herald anything like a new era of freedom.

By turning Iraq into an international issue, America and Britain have paved the way for a carve-up. Local players like Turkey, Iran and Saudi Arabia all want a piece of postwar Iraq, while the big powers -- including the supposedly anti-war French and Germans -- have their own plans for postwar occupation. And if you think such intervention will bring democracy to Iraq, then you're really weird.

On the ground, the divvying up of Iraq between different powers has already started. As part of its deal to allow US forces to use Turkish territory to launch attacks on Iraq, Turkey has been given the green light to double the number of its troops in northern Iraq from 6000 to 12,000 in recent weeks (3). Northern Iraq is territory that the United Nations designated as a "safe haven" for Kurds following the first Gulf War in 1991, taking the area out of Baghdad's control and granting limited self-government to Kurdish groups.

Turkish forces are fortifying a 25-mile buffer zone between Turkey and northern Iraq -- though according to Newsweek magazine, Turkish forces are keen to go even further into Iraqi territory. "Turkey is demanding that it send 60,000 to 80,000 of its own troops into northern Iraq to establish 'strategic positions' across a 'security arc' as much as 140 to 170 miles deep in Iraq", reports Newsweek. "That would take Turkish troops almost halfway to Baghdad." (4)

The Bush administration claims that it is allowing Turkish forces into northern Iraq for "humanitarian reasons only" (5), to assist with the flood of refugees that the war in Iraq will no doubt create. In truth, with America's blessing, Turkey is pursuing nobody's interests but its own in northern Iraq.

Turkey is demanding free rein in northern Iraq. It wants to be in charge of "supervising the armament and disarmament of Kurdish groups" and of "restricting the movement" of Kurdish forces where necessary (6). Under the guise of a humanitarian effort, Turkey's intervention in northern Iraq is about keeping a check on Kurdish demands for independence, to ensure that such demands do not impact on Turkey's own volatile Kurdish population.

Since 1984, Turkey has been at war with the Kurdistan Workers' Party, which fought for Kurdish independence within Turkish territory. Turkey refuses to recognise the "ethnicity" of its Kurdish population and continues to ban the Kurdish language. Now, Turkey sees intervention in northern Iraq as the latest front in its war against the Kurds. As Turkish foreign minister Yasar Yakis said when asked about postwar Iraq: "A Kurdistan should not be set up." (7)

The opening up of northern Iraq to Turkish forces as part of the planned attack on Iraq lays the ground for renewed conflict between Turks and Kurds. According to Hoshyar Zebari, a senior official of the Kurdistan Democratic Party (KDP), which administers the western portion of northern Iraq: "Any Turkish intervention under whatever pretext will lead to clashes." (8) "People in northern Iraqi Kurdistan are more scared of the Turkish military than of Saddam", says Nasreen Sideek, a KDP minister (9).

For Independent columnist Johann Hari and other Western commentators, northern Iraq epitomises the kind of democracy that ought to be extended throughout Iraq. According to Hari, "[U]nder US and British protection, a democracy with freedom of speech and protection of human rights has flourished for the past decade" in northern Iraq (10). Yet, according to a Kurdish newspaper poll taken on 22 February 2003, 83 percent of the residents of northern Iraq are opposed to "any Turkish intrusion" (11). In what kind of "flourishing democracy" can you have foreign intervention against the will of the majority?

Western intervention in Iraq has turned northern Iraq's local problems -- muted conflict between different Kurdish groups, the existence of Islamic terrorist groups -- into an international issue. Whatever stability existed in northern Iraq as a "safe haven" is likely to be undermined by Turkey's US-backed intervention to pursue its own interests.

Elsewhere in northern Iraq, Iran has sent in 5000 Shia troops, complete with "heavy equipment" (12), in an attempt to protect its borders with Iraq during and after the war. The Iraqi Shia troops were originally an Iraq-based Islamic opposition to Saddam's regime, though they have been granted safe haven and training by Iraq's longstanding enemy Iran for the past 20 years.

Iran claims to have sent the troops into northern Iraq as a defensive measure, to protect against a potential attack on Iran by the People's Mujahideen Organisation, an Iranian opposition group based in Iraq that allegedly receives support and funding from Saddam's regime (13). But Iran's real interests seem to be in fortifying its borders by pre-emptively crossing over into Iraqi territory, and staking its interest in any set-up in postwar Iraq.

According to the Financial Times, "Through inserting a proxy force, Iran is underlining that it cannot be ignored in future discussions over Iraq's make-up" (14). One expert on Iraqi/Iranian relations claims that Iran is pursuing "nothing but an Iranian agenda", to ensure its future stability. Some Iranian officials are floating the possibility of extending their influence among Iraq's Shia Muslim population, by encouraging them to stand up to the Sunni Muslims that dominate Saddam's regime -- a move that could only cause further fragmentation and division inside Iraq.

It might seem odd that Iran -- one of America's "axis of evil" states, remember -- can send 5000 heavily armed troops into Iraq without incurring much international condemnation (though the Bush administration is apparently "concerned"). Perhaps Tehran officials have been buoyed to intervene in Iraq by their meetings with UK prime minister Tony Blair earlier this year, who promised that Iran's interests would be "taken into consideration" during and after war with Iraq.

With Turkish troops on one side of northern Iraq and Iranian-backed troops on the other, US officials are said to be ever-more concerned about "the increasingly complicated patchwork of forces in northern Iraq", and the potential for instability that this brings about (15). But who was it, if not Western forces, that made northern Iraq into such free-for-all territory in the first place?

The north was taken out of Baghdad's control after the first Gulf War by Western forces. It was one of the UN "safe havens" that was being demanded by many of those now opposed to military intervention. As a consequence, Iraq's sovereignty and borders were seriously undermined, making northern Iraq a less governed (and generally less governable) place than the rest of Iraq. It was the West's undermining of Iraqi state control over northern Iraq that made it such a borderless and intervention-friendly place.

As Muzaffer Baca, vice-president of a Turkish humanitarian relief organisation, argues: "There [has been] no effective control of the central authorities or international institutions. Northern Iraq is a haven for drug and arms smugglers….The instability creates an atmosphere in which terror and terrorist organisations can flourish." (16)

Far from being an example for the rest of Iraq, northern Iraq shows the dangers of Western intervention, and how undermining a state's sovereignty heightens the potential for instability and conflict. Besides, the "patchwork" of Turkish and Iranian-backed forces in northern Iraq that so concerns Bush and co appears to have come about as a result of at least American and British agreement, if not their full-blown support.

Perhaps in response to the potential for what one newspaper calls "the permanent disintegration of Iraq", the Bush administration unveiled its latest plans for postwar Iraq in late February 2003. The White House plans a total occupation of Iraq following the war, to oversee the "reconstruction of the country's shattered infrastructure" (the infrastructure that US forces will just have shattered?) (17).

According to one report: "The White House will outline plans…for taking complete control of post-Saddam Iraq 'for an indefinite period' and overseeing the reconstruction of the country. General Tommy Franks, the Texan commander of the allied invasion forces, will be named as interim governor until all weapons of mass destruction are found and disabled and wanted members of the regime tracked down and arrested." (18)

And what will happen once the military occupation has disarmed Iraq and destroyed any opposition to its presence? Then the reins will be handed over to an American civilian, or an "American of stature" as one report puts it, who will, again, control Iraq for an "indefinite period" (19).

The French and German alternative to America's occupation plans isn't much better. France and Germany may be heralded by the anti-war movement as forces for peace in the Iraqi crisis, but they too propose that Iraq be occupied -- only by UN rather than American forces. In France and Germany's preferred option for Iraq, the UN Security Council would take control of Iraqi airspace and soil, and Iraq would effectively become a protectorate, like Kosovo. Liberation, accidental or otherwise, would be notable by its absence.

There is something missing in the American, British, French and German proposals for postwar Iraq -- the Iraqis themselves. The people of Iraq may have a starring role in Bush and Blair's rhetoric, but in the plans for postwar Iraq they don't even get a look-in. Bush and Blair talk up the need to "free Iraqis" from "Saddam's grip", but they push ahead with a plan that will divide Iraq up and put American generals in charge.

This is the "free and democratic" Iraq we can expect following further Western intervention -- an Iraq where Iraqis are more divided than ever; where local conflicts are internationalised and exacerbated; where neighbouring powers Turkey and Iran vie for territory and influence; and where the country is occupied by American or UN forces.

The liberals' idea of accidental liberation is a con. It depicts the people of Iraq as hapless saps who should only expect freedom as the by-product of a Western war. And it displays a wilful ignorance of the big power interests that are currently carving up and destabilising Iraq, even before the war has started. I prefer the idea of human liberation for the people of Iraq. And that is something that only the Iraqis themselves -- free from outside interference -- have a vested interest in fighting for.


(1) The case for war: we must fight to end the Iraqis' suffering, Johann Hari, Independent, 15 February 2003

(2) Why we should go to war, Julie Burchill, Guardian, 1 February 2003

(3) Turkey weighs economic, political costs of a Gulf War, Ilene R Prusher, Christian Science Monitor, 10 January 2003

(4) Risking a civil war, Owen Matthews, Sami Kohen and John Barry, Newsweek, 24 February 2003

(5) US to station thousands of troops in self-rule area, Michael Howard, Guardian, 24 February 2003

(6) Kurds brace for Turks, Cameron W Barr, Christian Science Monitor, 24 February 2003

(7) Turkey, US rebound from stalemate over aid package, Ilene R Prusher, Christian Science Monitor, 24 February 2003

(8) Kurds brace for Turks, Cameron W Barr, Christian Science Monitor, 24 February 2003

(9) Kurds brace for Turks, Cameron W Barr, Christian Science Monitor, 24 February 2003

(10) The case for war: we must fight to end the Iraqis' suffering, Johann Hari, Independent, 15 February 2003

(11) Kurds brace for Turks, Cameron W Barr, Christian Science Monitor, 24 February 2003

(12) Iranian-backed forces cross into Iraq, Najmeh Bozorgmehr and Guy Dinmore, Financial Times, 19 February 2003

(13) Iranian-backed forces cross into Iraq, Najmeh Bozorgmehr and Guy Dinmore, Financial Times, 19 February 2003

(14) Iranian-backed forces cross into Iraq, Najmeh Bozorgmehr and Guy Dinmore, Financial Times, 19 February 2003

(15) Iranian-backed forces cross into Iraq, Najmeh Bozorgmehr and Guy Dinmore, Financial Times, 19 February 2003

(16) War would threaten Iraq's Kurds and Shias, Muzaffer Baca, AlertNet, 29 November 2002

(17) General Franks "to run Iraq after war", Ian Bruce, Herald, 24 February 2003

(18) General Franks "to run Iraq after war", Ian Bruce, Herald, 24 February 2003

(19) General Franks "to run Iraq after war", Ian Bruce, Herald, 24 February 2003

Gone to the Blogs

"We don't know exactly how many there are. But they number in the tens of thousands. They are everywhere among us. They intend to tear down the world as we know it...."

Those might sound like the opening lines to a trashy Triffids novel or a Rumsfeld rant about mad mullahs threatening the USA -- but Jonah Goldberg in the Washington Times is in fact writing about blogs.

You know blogs: personal, self-made websites where Anyman (or Anywoman) comments about the news, links to other sites or posts pictures of their pets. Weblogs, to give them their full techie name, have been around since the mid-1990s, and now they're everywhere. According to CNN, "Several sources put the total number of blogs in the range of 200,000 to 500,000". Blogger.com, which provides idiot-proof software for setting up your own weblog, claims that 1000 blogs are created on its site every day.

Most blogs have a dozen or so readers, but a handful have built up audiences in their thousands. There are news blogs, comment blogs, war blogs, diet blogs, disease blogs, cat blogs, dog blogs, blogs about blogs. There's even a "homeless guy" blog, written by a fortysomething man who lives on the streets of Nashville, Tennessee. "All human life is there," said the UK Guardian in a feature about the "blogging phenomenon."

The vast majority of the estimated 200,000 to 500,000 blogs are little more than online diaries, where individuals post musings and write about their daily experiences. But there is a growing number of "big bloggers" -- bloggers who write about news, politics and culture -- who claim to be forging new forms of journalism and correspondence.

Apparently, such blogs threaten the traditional media's hold over the spread of information and ideas. By allowing the man in the street to get his hands on the means of production -- to write, produce and publish his own content without needing an editor or publisher -- blogging has been hailed as a "publishing revolution," which will "transform journalism" and "democratise the media."

Glenn Reynolds, an American law professor who runs the hugely popular weblog InstaPundit, claims 2003 will be the "year of the blog." "For Big Media," says Reynolds, "[blogging] is going to produce an increasing degree of either conscientiousness or paranoia, as it becomes apparent that the megaphone now works both ways..."

For Andrew Sullivan -- British-journo-in-America, Sunday Times columnist and big into blogging -- there has been nothing less than a "blogging revolution." "Blogging is changing the media world and could, I think, foment a revolution in how journalism functions in our culture," says Sullivan. He goes so far as to argue that blogging might represent "a publishing revolution more profound than anything since the printing press." Wow.

So is the "blogosphere" making the crusty publishers of yesteryear obsolete? Is the spread of personal websites on a par with the birth of print? Not quite. Blogging may be fun -- which is why I've been publishing one at brendanoneill.net for the past six months; it may even be a new and exciting way of using the web. But it's not journalism, and it ain't no revolution.

For all the claims that the "big bloggers" are challenging the traditionalists, in fact many blogs simply leech off the old-style media. The political and comment blogs that are seen as being at the forefront of the "blogging revolution" often do little more than write about and react to articles published in traditional media outlets (or "the Big Media" as they call it), rather than generating new journalistic content.

Two of the things that bloggers became famous for in 2002 were "Fisking" and the "fact-checking of asses" (seriously -- as in "Blogs: fact-checking Big Media's ass"). Fisking is named after the Independent's left-leaning foreign correspondent Robert Fisk, who is despised by many right-wing bloggers for what they perceive to be his anti-American attitudes and especially for his critical comments about Israel. According to one "Blogging Glossary," published on a libertarian weblog, to Fisk is to "deconstruct an article on a point by point basis in a highly critical manner."

And they really mean "point by point" -- some bloggers leap on the latest column by Robert Fisk, or Paul Krugman of the New York Times, or George Monbiot of the UK Guardian (whichever writer they like least), and Fisk the content in often laborious detail.

Then there's "Fact-checking asses," which, according to the Blogging Glossary, means "using internet search engines to ascertain the veracity of dubious claims made in the press." According to Glenn Reynolds of InstaPundit, there is something almost subversive about keeping the old media in check like this. He writes of the "underdoggish thrill of hobbyists 'fact-checking the asses' of the pros who chafe at the slightest indication of non-pros intruding on their monopoly turf."

If bloggers want to spend their time fact-checking the traditional media's ass, that's fine -- and some of them even do it entertainingly. But when that becomes a major focus of blogging, it hardly points to a "radical transformation" of the "journalistic culture." Blogs come across less as a revolutionary vanguard remaking journalism into something new and dynamic, and more like traditional journalism's poor cousin -- putting it down, picking holes in its arguments, and generally having a good old moan about the Fisks and Krugmans of the world.

An ironic effect of the "Fisking culture" has been to boost traditional journalism's fortunes on the web. As a result of having his name turned into a verb, Robert Fisk has assumed almost legendary proportions on sections of the internet. Fisk is now a kind of mythical figure, that strange British journalist who dares to say the unthinkable -- a reputation which, it has to be said, is often out of proportion to any biting insight on Fisk's part.

In May 2002, Hollywood star and well-known web-user John Malkovich was asked whom he would most like to fight to the death, and he nominated Robert Fisk -- capturing Fisk's newfound fame (and loathing) as a result of bloggers having spread his name around the web.

Likewise, some claim that the UK Guardian has become a big read around the worldwide web largely as a result of bloggers attacking it. Andrew Sullivan, the right-wing journalist whose personal blog is one of the most popular, has made a point of "Fisking" Guardian articles -- and according to Wyeth Ruthven, who runs a centre-left American blog, "No one here had even heard of the Guardian until Sullivan began his personal jihad" (surely an exaggeration?). Writing in the New Statesman, British blogger James Crabtree claims that "in a country [the USA] with no recognisable left of its own, bloggers have made [the Guardian] the pantomime villain of the right."

The blogosphere's focus on the faults and failings of the traditional media does more than give a shot to print journalism's web presence -- it also makes for blog-writing that is more bitter and bitchy than insightful. By setting themselves up against "Big Media," against the writers they love to hate (or love to love), bloggers often sound like the critics who can't rather than journalists who can. There is a group of right-wing British bloggers who spend their days Fisking the Guardian -- except they don't call it the Guardian, they call it the Wanker. Which is not even funny. And see what I mean about bitter?

This kind of blogging is little more than a subjective spouting match, where bloggers spill forth their views on everything, anything and sometimes nothing. But there is more to journalism than instant reaction and response. Good journalism involves rising above your immediate concerns, weighing up the facts, and attempting to say something more measured and insightful -- sometimes even truthful and profound. Blogging creates a white noise of personal prejudice, more akin to students arguing in a bar than to experts saying anything striking. I haven't got a problem with pub-style debates about the issues of the day -- but journalism it isn't.

On my weblog, for example, I have a recurring item called "What the fuck...?" -- for when something so bizarre happens, or when a public figure says something so ridiculous, that there is little more to say in response than "What the fuck...?" President Bush says Saddam Hussein got al-Qaeda to bomb Bali: what the fuck? Jimmy Carter wins the Nobel Prize for Peace: what the fuck? Et cetera...

But this isn't journalism -- it's a blogger's rant. If I were to write a journalistic piece about Bush's obsession with a link between Saddam Hussein and al-Qaeda, it would have to say more than "What the fuck...?" And if bloggers fancy themselves as cutting-edge "new journalists" giving the old media a run for its money, they'll have to do more than post quickfire comments in response to already published material or breaking news or another blogger's comments about another blogger's comments. Perhaps they could start by generating some new content.

The rise of blogging on the web, and the way it has been hailed as a media revolution not only by bloggers but also by some newspapers, reflects recent shifts within journalism itself. In the traditional media, everywhere from the papers to TV, there has been a shift towards personal opinion and emotionally responsive journalism, and away from objectivity and hard-hitting investigation. Of course, there's nothing wrong with opinion journalism, especially if the journalist has something to say. But too much opinion writing today seems to be driven by feelings and emotions rather than by insight or a distinct argument.

It is unsurprising, then, that similar trends are affecting online journalism. And the thing about the web is that, even more than newspapers and TV, it lends itself to the expression of personal opinion and prejudice. The ease and speed with which anyone can publish their views on the web is no doubt a potentially positive development, but it can lead to an explosion of opinion that ends up saying very little. To describe this free-for-all of expressing, commenting and ranting as a journalistic revolution is disingenuous. As Clint Eastwood once said, "Opinions are like assholes -- everybody has one."

A final claim made by bloggers is that they can offer radically more content than we will find in the word-counted articles subbed and squashed into a newspaper or magazine. "We can publish everything online," says one blogger: "transcripts, research material, every detail. Newspapers publish, what? 800 words?"

To this end, Sheila Lennon, a well-known American blogger, recently published on her blog the entire transcript of an interview that she gave to the New York Times. The NYT interviewed Lennon about the blogging phenomenon, but only used one sentence from her in the published article. Lennon said the Times "asked fine questions, and I didn't wince when I read my answers," but still she felt the urge to post the entire interview on her own blog. "It seemed natural for me to publish the 'rest of the story' online for readers who might be interested."

All of which is fine -- except this reproduction of source material was then presented as some kind of radical act. According to the American Journalism Review: "Not only had Lennon revealed the raw material of a story; she'd empowered herself as a citizen publisher and an interviewee." The Review claimed that by publishing the interview transcript, Lennon had created a "really revolutionary scenario," where "anyone can set up a virtual press in order to contribute to the reporting process, talk back to a journalist or set the record straight."

The "Lennon incident" (as some now refer to it...) showed the benefits of publishing on the web -- and also how such benefits get blown out of proportion. One of the most transformative things about web publishing, as distinct from print publishing, is that you can provide "extra material." Through hyperlinks, further reading suggestions and footnotes, articles on the web can become gateways to a wealth of material. Some blogs do this very well, providing links to articles you might otherwise not have found -- and you only have to browse the BBC News website to see the promise of such publishing.

But to describe this as a new form of journalism, as "putting journalism's house in order," is bizarre. Indeed, Sheila Lennon's self-publication of her interview transcript ended up reminding me what journalists are for. There was some interesting stuff in the transcript, but generally it was long, rambling and boring in parts -- as transcripts tend to be. By contrast, the final New York Times article, which incorporated a tiny part of the interview, was measured, concise and a good read.

It might feel "empowering" to publish transcripts and other "behind the scenes" material -- but a professional journalist's job is to take all that material, consider it, and turn it into something more profound. That's why editors exist -- to ensure that published material is readable and clear. No doubt some writers would like to have their every word published, but editors put economy and clarity before writer overload. Indeed, some modern newspapers and book publishers could do with harder editors.

The blogosphere, by contrast, not only lacks editors, it celebrates their absence. It claims that this lack of quality control gives the blogosphere a special freedom. As a result, the bloggers' "radical act" of providing raw material as a way of challenging traditional journalists' stranglehold over information often shows up just how important traditional journalists, and editors, are.

For all that, I actually like blogs. Really I do. Some are funny, some alert me to interesting articles, some even say original things. But the biggest revolution since the birth of the printing press? Blog off.

Brendan O'Neill is an assistant editor at Spiked Online.

The Rise of French Diplomacy

In the United Nations Security Council debate over the resolution on Iraq, many were surprised to see France re-emerge as a key player.

According to reports, President George W Bush was for the first time in months regularly calling President Jacques Chirac on the telephone. Now that the Security Council has agreed to impose tough weapons inspections on Iraq, France is claiming to have secured an important concession. Meanwhile, British sources are ridiculing the minimal change of wording in the final resolution. While visiting London, U.S. Pentagon advisor Richard Perle told the Guardian newspaper that, from France, "I have seen diplomatic manoeuvre, but not moral fibre".

The return of French diplomacy is something of a surprise -- not least for the British Foreign Office, still smarting at London's recent isolation at the European summit. Though a permanent member of the United Nations Security Council and a nuclear power, France had seen its status dwindle in recent years.

America's support for rebel forces in Rwanda and Zaire in the 1990s reduced French influence in Africa. In 1995, France's nuclear tests in the Pacific upset U.S. attempts to enforce the nuclear Non-Proliferation Treaty. And in 1996, France's candidate for the post of UN secretary general, the incumbent Boutros Boutros-Ghali, was denied a second term after Britain and the U.S. ganged up to nominate their "own African", Kofi Annan, at a time when French-U.S. antagonism was said to be at its peak.

Behind this succession of international incidents stood France's diminished status in the emerging world order. France's chosen role for many years was as the diplomatic and military wing of the Franco-German alliance. While Germany was the European Union's economic locomotive, that country's troubled history limited its capacity to challenge the American order in the realm of diplomacy. But the Franco-German alliance was called into question by German reunification and the consensus-building international diplomacy of the Clinton Presidency in the 1990s.

With America more willing to deal directly with a reunified Germany, France's special position seemed to be over. Instead, Britain, having lost its Empire, at last appeared to have discovered a role. As the Clinton team sought to build international alliances, Britain under prime minister Tony Blair emerged as an intermediary between Europe - and specifically Germany - and America.

By driving the pace of humanitarian intervention in third world trouble spots, Britain was once again "punching above its weight" in the international ring, while France was looking like a non-contender. Lionel Jospin's foreign minister, Hubert Vedrine, did attempt to enhance France's reputation as a diplomatic innovator, but remained in Britain's shadow as Tony Blair stole a French proposal from the 1950s for a European Defence Force.

The changing mood of U.S. diplomacy following the election of Republican George W Bush, however, disturbed the pattern of international relations that had been established by President Bill Clinton and Tony Blair. Convinced that the previous administration had given too much away to the "international community" (read European diplomacy), the Bush administration tore up one treaty after the next in an effort to re-model the world order with its own status as sole superpower at the core.

In the renewed conflict with Iraq, Bush threatened that the United Nations (UN) could find itself irrelevant if it failed to endorse the U.S. war drive. In the event Bush's bluster was more negotiating strategy than isolationism, but it did put the European powers to the test - and that test proved as favourable to France as it proved awkward for Britain.

Everything is in the timing in international diplomacy, and Blair's gamble was that by being first to endorse Bush's position he would secure his standing in Washington -- uncertain since Bush's election -- and steal a march on his European allies. In the event, although Britain claimed to have won America to working through the UN, Blair was ridiculed for his servility.

German reaction to the new turn in U.S. diplomacy had been gathering pace, and eventually spilled out in the recent elections, but in terms that demonstrated the elite's enduring diplomatic gaucheness. A bullish Gerhard Schröder appalled Americans by refusing to pay the bill for any new war -- saying out loud what many Germans believed, that the last Gulf War in 1991 was a scam to wring money out of the Bundesbank. When a junior minister in the outgoing administration got carried away with the anti-American mood and likened Bush to Hitler, the White House had its excuse to freeze out Germany. Only on Nov. 10, during a visit by Germany's new defence minister, Peter Struck, did the White House grudgingly accept that relations with Germany were now "unpoisoned" again.

With a reputation as a crook, and owing his election as president to a scare over the far right, the elderly Jacques Chirac seemed ill-placed to project France's standing in the world. However, Chirac saw that Britain's premature Americanism and Germany's premature anti-Americanism left an opening for France. His formula -- that true friends are prepared to disagree -- steered a course between Britain and Germany.

Understanding that European objections to being railroaded by the U.S. needed to be expressed more diplomatically than the German government was capable of, Chirac made a show of holding out for a distinctive position in the Iraq debate.

In substance, there is no disagreement between France and America over the attractions of making an example of Saddam Hussein's Iraq. But France did persuade America to observe the niceties of consulting with its allies. For that Europeans will be grateful to Chirac, as they were scornful of Blair for being Bush's poodle.

Whether France can sustain its newly enhanced diplomatic standing is open to question. Certainly the Foreign Office will be working overtime to enhance Britain and do down the French.

BBC Plays Judge and Executioner

For some years, critics have commented that the trend in "reality TV" will end with someone being killed for public entertainment. Strangely, when it did happen on British TV, nobody commented.

The three-part BBC2 series The Hunt for Britain's Pedophiles is so ghoulish that even champions of the child-protection industry have avoided comment. The series has presented barely disguised child pornography with such relish that it is difficult to tell whether the emotional charge the filmmakers are trying to provoke is one of outrage or titillation.

But they saved the best till last.

Joining a police raid on a sex offender, the BBC was thrilled that ferret-keeper Mark Hansen allowed them to film him in situ while his overstuffed council flat was turned upside down. To the filmmakers' delight, Hansen spoke frankly about his perverse compulsion and his life of prison terms punctuated by police surveillance.

All parties concerned adopted the cod-psychology of sex-offender treatment, with its central proposition that offenders are not in control of their urges. Hansen admitted what he did was wrong, but claimed that he couldn't help himself. The police reduced the proposition to cliché: "The leopard doesn't change his spots." Only later did one inspector complain: if the offender really can't help himself, are the police supposed to provide a pedophile support group?

But is there really a social type called "pedophile", driven by irresistible psychological urges to commit sexual offences? Surely it is wrong to speak of "a pedophile"; rather, we should speak of someone who commits an act of pedophilia. It is an act of moral depravity, no doubt, but the desire to cordon off the pedophile wing of society, just as we have the "sex offenders" wing of the prison, is a way of reaching for moral absolutes in an age where there are few to be found.

Are certain people hardwired to commit sex offences against young children? UK Home Office research on reconviction rates has indicated that sex offenders in Britain as a whole are less likely to reoffend than other kinds of offenders, and that reconviction of sex offenders aged over 30 is "exceptionally low."

Despite the level of public anxiety over the crime, it remains exceptionally rare in Britain -- with around 900 charges under laws dealing specifically with sexual offences against children, and perhaps 1000 more acts against children dealt with under other sexual offence laws. The majority of these offences are one-offs committed by family members or people the children know -- while, by the Home Office's admission, there are only a "handful" of the kind of predatory pedophiles that the BBC has focused on.

But watching The Hunt for Britain's Pedophiles, there was something unnerving about Hansen's openness. Even in the age of reality TV you might expect a greater sense of self-preservation. The follow-up to the interview came the next day, when it was reported that he had killed himself that night, rather than face another prison term -- and, presumably, the shame attached to the broadcast of the interview.

The BBC took his suicide note as consent to broadcast, while the inspector said that "the streets are safer". What is he suggesting -- death sentences for child pornographers? In the end it was not the gameshows that offered the public humiliation and suicide of a participant as public entertainment, but a "serious" documentary.

More Alienated Than Ever

Since Sept. 11, says Ahtsham Ali, vice-president of the Islamic Society of Great Britain, British Muslim youth have been taking US president George W Bush's with-us-or-against-us rhetoric to heart.

'They say you are either with bin Laden or you are with America', says Ali - and they have little doubt about which side they would choose. 'If you go to Bradford or Oldham, the average youth identifies - well, sympathises - with bin Laden.'

Touring Islamic societies and youth clubs around the UK, Ali has noticed a changed mood in recent months. 'The number of activated Muslims in the UK is increasing. The number of frustrated youth is increasing. Islamic societies are packed out - the largest university societies are Islamic societies.'

Ali wants to show Muslim youth that 'there is an alternative' to Islamic fundamentalism, by taking them through true 'Islamic principles'. He clearly has his own beliefs and prejudices - but he raises questions about the allegiances of Muslim youth that deserve more attention than they have received so far.

These changes in the British Muslim community began long before Sept. 11. 'Prior to Sept. 11, there was an increase of identity-searching among British Asians', says Ali. Many young Muslims have been asking: '"Who are we in Britain - are we this or that?"' There were also feelings of impotence and frustration - a sense of distance from Western institutions. If you disagree with something, says Ali, 'all you can do is march, or go to your MP' - but when 'there is no legal release for the frustrations' of Muslim youth they tend to spill over, and frustrations can lead people 'beyond the boundaries of sanity'.

Sept. 11 has 'speeded up' the search for an identity and the feelings of frustration. According to Ali, the war in Afghanistan 'showed the average British Muslim the hypocrisy of the situation'. Bush was preaching humanitarian values while bombing Afghanistan even further back to the Stone Age. The war on terrorism became a focus for cynicism and anti-American sentiments - and hardened the conviction that Western governments cannot be trusted. 'They see this as "America, Britain - they're all the same"', says Ali. 'Once they have this view, they feel impotent. They can't understand how it is possible that in today's world of equality people can get away with violence and mayhem.'

The recent Israeli incursions into Palestinian territories, as America apparently stood by, compounded the way some young Muslims view the West. There is a sense that 'America will always back Israel', says Ali. And such experiences have clarified people's feelings of alienation from Britain and the West in general: 'Many who were on the fence were pushed over into the Islamicisation of their identity.'

Ali believes that 'this feeling of impotence might generate some to take some kind of drastic action'. More Muslim youth might 'go abroad to do their bit for the war'. There may be problems now, says Ali, but things will get worse: 'In the future there will be much more.'

Ali exaggerates the prospect of a coming storm. But he raises useful questions. Since Sept. 11, the war on terror and fundamentalism has been seen as an issue of 'over there' - of al-Qaeda and the Taliban in the Afghan mountains. Western elites have been less keen to face up to the problem of alienation at home - the question of what makes some Western Muslim youth sympathise with bin Laden more than with the political leaders here.

Sept. 11 and the war on terrorism raised the question of allegiance to Western governments: 'are you with us or against us?' For many, it seems that the answer remains unclear.

World Trade Center Vanishes Again

The upcoming film Spider-Man is an escapist story about a guy who can climb skyscraper walls, spin webs out of his fingertips, and handily morph into an office worker whose girlfriend doesn't suspect a thing. It's a major player in the 2002 summer escapist-blockbuster season, along with Episode Two of Star Wars and the sequel to Men in Black.

Six years and one Sept. 11 after Independence Day blitzed the worldwide box office, on the strength of spectacular special effects that showed the White House and the Empire State Building being blown to pieces by hostile aliens, film companies are extremely cautious about showing the World Trade Centre in films that were completed before the towers' destruction.

The original marketing campaign for Spider-Man featured the twin towers prominently. The trailer had escaping bank robbers becoming snared in a giant spider's web, with the camera zooming out to show that the web was suspended between the twin towers. Audiences who saw this trailer before Sept. 11 whooped and cheered.

Apparently, this was promo material only, and the scene was never intended for the final cut. In any case, the trailer was pulled after Sept. 11. The original publicity poster for the film showed Spider-Man wedged, larger-than-life, between the twin towers - the new poster has no towers, just Spider-Man.

In the wake of Sept. 11, Hollywood showed its skill in keeping up with the times - which is, after all, one of the keys to winning good box office. But it also showed one of its weaknesses: its instinct for the lowest common denominator. It thought, understandably, that audiences might be thrown by the sight of the twin towers, now too infamous and tragic to be slotted into the background of a no-brainer popcorn flick.

So some films released late last year, like Zoolander, had glimpses of the towers digitally removed before their release. Films with close-to-the-bone subject matter were also affected. Arnold Schwarzenegger's new film Collateral Damage - about a firefighter who goes into action after a terrorist attack - was postponed by several months.

Several upcoming films, including Men In Black 2 and Martin Scorsese's Gangs of New York, are being re-edited or digitally modified to exclude visual references to the twin towers. The finale of MIB2 was to feature Will Smith battling a giant alien worm on the rooftop of one of the towers. What audiences will see, thanks to computer effects, is the same battle, fought on the rooftop of a different building. What about the second chapter of The Lord Of The Rings, which is called The Two Towers - will it be retitled?

Similar things happened in September 1997, after the death of Princess Diana. The Australian film Diana and Me featured Toni Collette as an Australian girl called Diana Spencer who travels all the way to London to meet her namesake, and ends up getting involved with a photographer in a dogged pursuit of the princess. The film was due to be released in late summer 1997, but after Diana's death, two new scenes were quickly shot and the film was re-edited to comment anew on the role of the paparazzi.

What does this avoidance of jarring current affairs events say about our relationship to film? We line up in droves to watch films where people die, leaving behind their devastated loved ones (Terms Of Endearment, Titanic), or films that explore serious issues (The Insider or American Beauty). And movies featuring violence, death and destruction have been box office gold for decades.

So why do we, or the film companies that entertain us, feel the need to be shielded from recent shocks? It's as if Hollywood cares for us like a nanny, tucking us up at night and assuring us there are no real monsters out there....

We are still unsure about how to react to Sept. 11. Creative artists, screenwriters and filmmakers will need time to figure it out too. At a recent Hollywood forum, writer-director Peter Hyams said, 'If I had a film that was a comedy and there's a scene of two people walking up the street and in the background is the World Trade Centre, I'd want that out of my film, because that would certainly make people like me start to cry'.

On the one hand, I understand what he means. Recently I was watching an episode of Sex And The City when the towers briefly appeared. Suddenly lost in thought, I missed the next few minutes of dialogue and the thread of the whole episode.

But on the other hand, I can't help thinking of the lame early 1990s gay/AIDS film, Longtime Companion, which explored a friendship group of gay men, many of whom die of AIDS during the film. At the end, all of the dead characters magically reappear in a saccharine dream sequence on a beach, running through the sand in slow motion and hugging their friends with full corporeality. The anguish of the subject matter, in this case AIDS, is avoided - just cancelled out.

The filmmakers were betraying the fact that they had an uncertain reaction to the subject matter of the film. So much for one of the few 'serious' films 'about' AIDS. Likewise, the makers of Diana and Me virtually remade their film, updating its point of view to suit contemporary events (in vain, as it turns out - the film flopped). The same process is happening today, as filmmakers scramble to make their fictional films reality-savvy. And for now, reality is Ground Zero, not the standing twin towers.

It is understandable from a marketing point of view, but a bit dubious from a rational point of view. The twin towers were destroyed, and it was truly horrible. But what is achieved by pretending they were never there in the first place?

Mark Adnum is a Sydney-based writer.

A Very Strange Time Capsule

Joel Meyerowitz -- whose photographs appear in the exhibition "After September 11: Images from Ground Zero" at the Museum of London -- says he took photos of New York in the aftermath of the terrorist attacks because he "wanted to do something useful".

"I had the same wounded feeling as everybody else", he says. He had tried to volunteer, but they turned him away, so he decided to document instead. He was horrified that photographers were being kept away from Ground Zero: "We can't have a blackout on history. What event is not photographed? I was determined to go in and make an archive." His photos -- of workmen taking out the dead, of the remaining North Wall -- are the only visual record of the recovery work at Ground Zero.

After Sept. 11, many New York cultural institutions felt the same urge -- to document and preserve. There is an ongoing attempt to collect all of the memories and material culture related to Sept. 11. From poems left by New Yorkers to aspirins sent for relief workers, from bits of rubble to "Missing" signs and dust masks... Sept. 11 is fast becoming one of the best-documented events in history.

But why are we collecting all this stuff? And what will it tell future historians about us?

Dr Sarah Henry, vice president of the Office of Programs at the Museum of the City of New York, told me in October 2001 that Sept. 11 "becomes an opportunity for a time capsule". By documenting the event and "looking at history from its every angle", there "may be an opportunity for people [in the future] to understand things about social, political and economic history in ways we cannot anticipate now".

A massive time capsule is a good way of describing the current collecting efforts by state and cultural institutions. Almost everybody seems to be collecting something. South Street Seaport Museum is documenting the response of the maritime community to the attacks, collecting oral histories, photographs, videos and other artefacts. The New York Center for Urban Folk Culture (Citylore) has photographed the spontaneous shrines that sprang up in response to the attacks, and is collecting "found" poems. The Museum of Comic and Cartoon Art is collecting prints inspired by Sept. 11.

The New York Historical Society is collecting artists' responses to the attacks, World Trade Centre memorabilia, children's artwork, victims' personal effects, and equipment worn by rescue workers. The Museum of the City of New York has acquired Bellevue Hospital's "Wall of Prayer", a spontaneous bulletin board that sprang up in the immediate aftermath of Sept. 11, containing images of the lost, prayers and poems -- and it is creating a "Virtual Union Square", collecting electronic submissions of people's artistic or poetic responses to Sept. 11. And the Association of Public Historians of New York State is coordinating members' efforts to document their communities' responses to the attacks.

State agencies are also involved. A group led by New York State Archives and the National Archives is assembling evidence of how governments, hospitals, schools, mental health organisations and religious groups responded to the event. According to state archivist Kathleen Roe, "It's probably the most monumental documentation you can think of".

Columbia University Oral History Project is recording post-Sept. 11 "oral histories". Mary Marshall Clark, director of the Columbia University Oral History Research Office, told me that "we have never done anything on this scale. We have done this many interviews before [around 360], but we have never done this many this quickly, or so close to an event". The project will keep in touch with some of the interviewees over time, to see how the event has affected their lives.

Dr Steven Jaffe is curator at the South Street Seaport Museum in Manhattan, which has collected interviews with some of the people involved in the evacuation of around 300,000 people across the Hudson River by tugboat, yacht and speedboat after the attacks. According to Jaffe, "Everybody has felt a deep personal need to do something in response.... This is how I respond -- I can preserve for posterity, not only documents, but also how people felt about it".

The story of how 300,000 people were evacuated will no doubt be remarkable. Indeed, viewed on their own merits, many of the post-Sept. 11 projects and exhibitions seem fair enough. But collectively, they add up to something rather strange.

It is strange that so many institutions seem to be collecting -- and that they are collecting so much. The New York Historical Society says "there is always an increase in the documentary and creative record in response to...seismic occurrences". This might be true in some cases, but not in others. According to Jane Carmichael, director of collections at London's Imperial War Museum, during the Second World War collecting virtually ceased: "Everything was in short supply -- museums had made it their priority to protect exhibits from bombing. Some of our exhibits were wheeled out to take part. It wasn't until the war was over that the collecting effort began."

When the Imperial War Museum did begin to collect, it didn't just ask people to send in submissions. Museums have to make hard choices about what is worth collecting and what isn't. "We are offered a lot of material", says Carmichael, "and we have learned how to say no politely and courteously".

What is also strange is that cultural institutions are documenting people's manufactured responses to the Sept. 11 attacks. A future society looking back on Sept. 11 won't just have physical remains of the attacks, official documents, newspapers, videos, diaries and letters to go on. They will also have personal documents made as an actual response to the attacks.

Historically, this is unusual. Do we have this for Hiroshima or the Blitz? Of course we have letters, diaries, poems and other things that indicate something of what people felt about historic events -- and no doubt historians often wish we had more. But these were things that people wrote or produced for themselves, or for their friends or lovers -- they generally didn't do it for a museum or to put in the street. We don't have this kind of mass production of "responses" to catastrophes of the past.

Today, cultural institutions are actually appealing to people to give their testimonies, or to produce art and write poems. The New York Historical Society has invited New Yorkers to "share their reminiscences of the people and events of 11 September", asking people to "please send your stories, along with your name, phone number, and contact information". The Museum of the City of New York's "Virtual Union Square" "invites you to contribute images of your drawings, sculptures, posters, paintings, memorials, signs, poems or other creations made in response to the events of Sept. 11".

What a society chooses to collect can tell you a lot about it. I often see an object in a museum and think, "Why did they collect that?". In London's Victoria and Albert Museum there are two rooms full of plaster casts of buildings and monuments from all over Europe, largely made and collected in the late nineteenth century. Those two rooms are like a snapshot of the Victorian mindset: a people who wanted to possess the finest in the world, and to learn from it.

In the early twenty-first century, cultural institutions no longer have the conviction of their Victorian forebears. Museums have become less certain about their role as collectors, studiers and presenters of artefacts, and are refocusing themselves around their audience. On both sides of the Atlantic, they increasingly see their role as responding to the needs of the public, as playing a social role.

It is perhaps out of indecision that cultural and state institutions are collecting so much in response to Sept. 11. There is a certain unwillingness to refuse material or to decide that one thing is more important than another -- which looks like a refusal to step outside of the event and look at it in historical perspective, instead just documenting it, over and over, in all its different aspects.

And the collecting of people's responses to Sept. 11 is related to cultural institutions' desire to play an increasingly social role. Dr Sarah Henry of the Museum of the City of New York said that after Sept. 11, "The city needed something from us as a museum; we felt we had to play a role in healing... Promoting civic dialogue is part of our mission." But she was wary of going too far down this road -- "we don't offer therapy". She visited the Brooklyn Museum a week after the attacks, and they "had a packet out about how to deal with tragedy, how to talk to your children after tragedy. This was a step further than what we were doing".

The growth of the therapeutic ethos is important here. Increasingly, institutions see their relationship to the public as one of soothing and healing. UK prime minister Tony Blair and former US president Bill "I feel your pain" Clinton both made caring, feeling and healing into a big part of their role as leaders. After Sept. 11, the therapeutic ethos became even more upfront -- with everybody wanting a piece of the public grieving process.

But the post-11 September scramble for stuff isn't all down to the institutions -- after all, people began to produce material responses to the attacks spontaneously, almost immediately after the attacks occurred. "Union Square became a particular focal point for kinds of public display, all sorts of sentiments", says Sarah Henry, "commemorative, memorial-type sentiments, also political ones. It literally filled up with material that people brought. It began as a gathering place, then became literally blanketed. There was a need to consider collecting that material.

"There was a cry from the public that it would be preserved for posterity -- people were calling us, sending us money. We had calls from city agencies, saying 'we find ourselves in possession of this material -- what do we do with it?'. A lot of places, where material had to be taken down, there was a great concern that it ought not be destroyed."

Some of the memorials I saw when I was in New York in October 2001 struck me as odd. There were intimate reflections on a lost loved one, angry calls for revenge, and more oblique reflections. Essentially, they were very personal, individual expressions of experience and emotion. The messages were all next to each other, but they didn't connect. And there was none of the consideration or control that normally goes into our public encounters. All you have is a disembodied fragment of somebody's emotion -- and because you don't know them, it is difficult to know what they are talking about.

These messages, poems and artworks looked like the products of vulnerable and isolated individuals, of people frightened in the face of terror. It was a graphic illustration of the "lonely crowd" -- people coming together, but ultimately alone.

The memorials emerged within 48 hours of the attacks -- people came together, seeking support and sources of meaning. After the attacks, the U.S. state to all intents and purposes collapsed -- the president disappeared underground, popping up from time to time to make nervous statements. People were faced with fear and confusion, and little means to make sense of it.

As Robert Putnam claims in his book Bowling Alone, over the past few decades there has been a gradual breakdown of civic associations and community networks. People did not have strong support networks to turn to after the attacks -- so they sought solace and belonging in Union Square.

It is questionable how valuable these very personal responses will be to future historians. Aside from the general sentiment, it is difficult to glean much from many of them. The contributions were too fragmented and too self-conscious to give you the insights of a diary, where you would get a proper idea of that person's character and thoughts. They were too unmediated and emotional to serve as a public record.

For museums to actually generate responses to the event is, as Jane Carmichael says, "slightly suspect", as the point of a response is surely that it is natural and undirected. Once you start generating your exhibits, perhaps you have to question their authenticity. According to Carmichael, there may also be problems with using reflections on a historical event: "The point of collecting is to try and be as authentic and close in time to the experiences as possible. What people recollect in retrospect, the details aren't as clear. Memories aren't that reliable. The closer the account is to the event, the more reliable it is about the emotions generated about the event."

However, what we collect cannot help but say interesting things about us. The very fact that responses are being produced, and that institutions feel driven to collect them, will provide interesting historical insights.

It may also reveal a lot about the event itself. After all, Sept. 11 was defined by the reaction to it. The media coverage of the event was often more concerned with giving us experiences and reactions than with analysis. These reactions and experiences, in a sense, are what happened -- and seemed to become even more important than the actual destruction of the World Trade Centre. As Sarah Henry says, "We should try to think about what things people want to understand this event, to understand this moment in history, which is different from picking up bits of daubing -- what does that tell you exactly?".

The attacks and their aftermath, claims Henry, will have "tremendous causative impact. It may reveal things about longer processes, other issues in New York and society, that come to the surface in a moment of crisis". If the collecting after Sept. 11 is making a time capsule of our society, it is a very strange time capsule -- for a very strange time.

Josie Appleton has written articles on museums for the Spectator, BBC History Magazine and Museums Journal.