Haunted House Films Are Really About the Nightmares of Gentrification

Where all are guilty, no one is; confessions of collective guilt are the best possible safeguard against the discovery of culprits, and the very magnitude of the crime the best excuse for doing nothing. --Hannah Arendt

In the course of 15 years as a tenant organizer, my friend and mentor Artemio Guerra has become intimately, disturbingly familiar with the process of gentrification -- the shifting demographics, the clash of old and new tenants, and the monstrous machinations of landlords bent on pushing out rent-controlled tenants. The threats and harassing late-night calls. Whole buildings left without heat. Bombs planted in lobbies. INS called on immigrant tenants who fight back. A nightmare so pervasive it would surely rate broader attention if it wasn't a "normal" consequence of capitalism.

Artemio and I always end up having long discussions about horror films and politics, so he called me up after seeing the haunted house film Cold Creek Manor. "It's all about gentrification!" he said. "It's a piece of crap, but still ..."

He was right on both counts. In the film, an upper-middle-class family from New York City moves into a rural working-class community, and finds itself under assault by a crazy handyman who used to live in the house, as well as by the angry spirits who haunt it.

Rich city folks move out into the country and find themselves up against nasty poor locals and a ghost in another recent vengeful-spirit film, Wendigo. The more I thought about this recurrent motif, the more I realized: the modern haunted house film is fundamentally about gentrification. Again and again we see fictional families move into spaces from which others have been violently displaced, and the new arrivals suffer for that violence even if they themselves have done nothing wrong.

This thriving subgenre depends upon the audience believing, on some level, that what "we" have was attained by violence, and fearing that it will be taken by violence. In the process, because mainstream audiences are seen as white, and because gentrification predominantly impacts communities of color, the racial Other becomes literally monstrous.

The biggest cliche in the modern haunted house film is that of the Indian Burial Ground. In Pet Sematary, The Shining, and The Amityville Horror, the source of the problem is that the real estate parcel in question sits on desecrated sacred ground.

The conquest of North America could be classified as our most extensive gentrification, where thousands of communities of color were violently pushed out by white settlers manifesting racist destiny. The ubiquity of the Indian burial ground points to screenwriter laziness, but it also constructs a movie-going public all too willing to accept that our homes are literally built upon genocide and terrified that those dead Indians will come back -- not to scalp us or to take "our" land through armed force, but to suck our children into the television or make our husbands go insane and try to kill us with an axe.

Guilt over the North American genocide persists, in spite of centuries of racist history that have clouded the general public's grasp of the extremity of the violence perpetrated against Native Americans -- the broken treaties, the Indian Removal Act, the smallpox blankets. With the death of the Western as a film genre and the success of the Civil Rights Movement in challenging the blatancy of racism in mainstream culture, the Indian-as-bloodthirsty-savage was transformed into the Indian-as-murderous-ghost.

That's one of the main ways the horror genre, on its surface so apolitical, connects to the United States' histories of genocide. How far a leap is it from the menacing ex-slaves in Birth of a Nation to the zombies in Night of the Living Dead? Even though its subtext might foreground race, violence, and displacement, the haunted house film participates in the mystification of demographic change by convincing us that we are innocent, and the people we have displaced are monsters.

Displacement creates a paradox: We acknowledge the wrong that has been done but feel powerless to do anything about it. A sort of collective guilt springs up, a sense that we are insignificant cogs in the machinery of economic and social factors that create gentrification. This is particularly true for the middle class, who are often forced by economic necessity to move to gentrifying neighborhoods or to new suburban developments that have demolished pre-existing space.

Regardless of their place on the political spectrum, most people acknowledge that their government does some very bad things, and that they themselves might have to face the consequences. As Malcolm X famously said of the assassination of John F. Kennedy, "the chickens are coming home to roost": by the grim logic of the Golden Rule, a system that maintains itself through violence will engender a violent response. The price of living in the comfort that globalizing imperialism can provide is the chance that we will be the victims of retaliatory violence -- like the Oklahoma City bombing and 9/11.

In the same way, the consequences of gentrification flicker on our radar regardless of whether or not we feel personally culpable. The question is, can we do anything about it? The modern haunted house film tells us that we can't -- that the only way to live in peace is to destroy the monsters we have already replaced.

From its roots in the Gothic tale, the haunted house story has often been about guilt visited upon the innocent for things their ancestors (or husbands, or cousins) did. Somebody did something wrong, and somebody else is paying for it. Think of Jane Eyre, taunted by the madwoman in the attic who turns out to be the wife her lover has locked up. The children in The Turn of the Screw are destroyed by their governess' sexual frustration, manifested in ghost form. In what might be the most influential literary example of the "bad house" story, Shirley Jackson's The Haunting of Hill House, the "evil" has its source in its owner/architect's repressive patriarchal Puritanism.

The assumption has always been that "innocent" beneficiaries of privilege won through violence will be made to pay for that violence. This construction of innocence is disingenuous, since real guilt does exist, even though the complex mechanisms of modern markets fog the issue in ways that play into "our" desire to feel like we have no role or power in the process.

Race is structured out of haunted house films, because the horror film is largely intended to allay guilt -- scary movies invoke it only to exploit and then banish it. Candyman and The People Under the Stairs represent attempts to expose the racial underpinnings of the genre, but even they depend upon the audience (constructed as white) having a pre-existing fear of "black" spaces -- housing projects, tenements, the inner city -- since those spaces are represented in exaggerated forms that exploit middle-class misconceptions.

And even this exploration has come to an end with the current glut of horror films -- witness Dark Water, about an urban renter whose affordable housing is haunted by the ghost of tenants past, and which takes place in a New York where somehow both ghost and victim (and just about everyone else) manage to be white.

"What is a ghost?" Stephen Dedalus wonders in Ulysses. "One who has faded into palpability through death, through absence, through change of manners." The haunted house film mimics the workings of the real estate market, where gentrification and urban renewal push people of color into homelessness, into shelters, into prisons. People of color register as monsters -- homeless boogeymen, gangsta rappers, violent crack addicts waiting outside your house.

Gentrification is itself something of a ghost -- trivialized by the mainstream media, ignored by government, distorted in academia as "impossible to quantify," or obfuscated by policymakers -- as in a report from the Brookings Institution that somehow wonders, "Does Gentrification Harm the Poor?" Because the "audience" for gentrification is always the poor, people of color, immigrants, working-class seniors, and combinations of the above, the realities of gentrification are usually "invisible" to those who shape the public's understanding of the issues.

In my day job, I organize homeless folks -- displaced by the tens of thousands by rising rents -- to fight back against city policies and practices that abet gentrification. There is no question that the poor are harmed by gentrification, and that poor people of color are disproportionately harmed (currently, 90 percent of the 35,000 people in NYC homeless shelters are black or Latino). The other thing that's painfully clear is that everyone wants to do something about it. In spite of the mainstream media's demonization of the homeless as crazy, violent substance abusers, many people acknowledge that the presence of homeless people is the result of systemic problems and that homeless individuals are not "garbage."

Despite the claims of local government and real estate interests (if one can indeed claim them as separate) that "neighborhood improvement" will transform poor, crime-infested communities into bright, green utopias, most people are able to see the realities and are eager to support grassroots efforts to transform blighted neighborhoods in ways that do not negatively impact existing demographics. The survival and success of the haunted house film indicates a considerable (subconscious?) guilt, which in turn indicates acknowledgment of culpability and oppression.

Horror films give us back our sins as monsters. The parents who burned Freddy Krueger alive find their randy teenage offspring butchered. Nuclear testing wakes up Godzilla. In slasher films, sexuality is a capital offense. Dr. Frankenstein's hubris leads to the deaths of everyone he loves. And starting with The Texas Chainsaw Massacre, class antagonism has been at the heart of the horror film.

These days, the two most popular plotlines in the dozens of scary movies that come out each year are: (1) A middle-class family or group of teenagers wanders into the wilderness and the clutches of a depraved, monstrous lumpenproletariat ("The Hills Have Eyes," "Wolf Creek," "The Descent," "Wrong Turn," "Cabin Fever," "Chainsaw Massacre," "Silent Hill"); or (2) A similar configuration of victims menaced on their own luxurious turf by monsters who symbolize "our" paranoid fantasies of the violent, dispossessed working class, even if they do not actually come from it ("When A Stranger Calls," "Cry Wolf," "Cursed," "Scream," all the slasher films that do not fall under the first category).

The spate of slow-moving zombie films that followed in the wake of "Night of the Living Dead" represents a capitalist nightmare of communist revolution: the brain-dead, bloodthirsty working class, desiring nothing but our destruction, rises up to besiege "us" in our comfortable homes, our malls, our military bases.

Would a haunted house film have any resonance in a communist country? Is it possible to imagine The Grudge in an economic structure where housing is guaranteed -- however problematically -- and where people have extremely limited freedom to choose their own housing? Present-day capitalism leads to an inevitable fetishization of home, of "our" space, rooted in our understanding that nothing is guaranteed. The haunted house film expresses the universal human fear that your home is not safe, that it will be taken from you by violence.

House of Sand and Fog is an honest look at the emotional costs of a system where housing is a commodity, and not a right -- the film can be read as a haunted house tale with no ghosts or monsters, just "normal" human beings whose basic needs are in direct opposition and cannot be reconciled.

Haunted-house escapism allows us to evade two fundamental truths: that on some level we participate in the displacement of others, and that we ourselves are vulnerable to displacement and homelessness. At the same time, the stigmatization of the homeless in media and in governmental policy has become so extreme that "we" equate the homeless with monsters. When you lose your home, you lose your membership in the human community. You become something else: a ghost, a monster.

Not all haunted house films end with the ghosts getting brutally exorcised, or the humans packing up and running for their lives. Although the dynamics always play out as a war of Us vs. Them/ Good vs. Evil/ Old vs. New, the battle sometimes ends in a draw. The parody Beetlejuice, also about clueless, rich, urban gentrifiers colonizing a haunted house in the countryside, ends with the dead and the living recognizing that they are fundamentally the same, and learning to co-exist in harmony. The nature of scarcity economics makes this precise solution impossible with real-life gentrification, but active cooperation across the lines of class and race is not only possible, it's essential.

Expecting a mainstream horror film to give us a road map towards fighting gentrification is as absurd as hoping that an anti-war film will tell us how to stop a war. Instead, art -- bad art, good art, corporate art, independent art -- should prompt us to examine our fears and our assumptions, and move us to a deeper inquiry of how they impact our reality.

The haunted house film makes assumptions that are worth questioning: Who are "we" as an audience? To whom do these films address themselves? Who haunts "our" homes? Whose homes do "we" haunt? But it also contains the seeds of a real dialogue concerning the human costs of the housing crisis, and our responsibility and our power to do something about it.

Right Pitches Dubya as Henry V

Shakespeare always has been, and will continue to be, misread and misquoted in support of any and every position. As a playwright who was himself constantly lifting quotations, sometimes verbatim, from classical and contemporary sources, he probably would be vaguely amused by this enduring phenomenon. After all, he penned the line "the devil can cite scripture for his purpose" (Merchant of Venice), and who better than Shakespeare serves as secular scripture in our world today?

Yet even within the general trend of Bardolatry (evidenced by renewed conspiracy theories, best-selling guides such as Harold Bloom's Shakespeare: The Invention of the Human, and the now decade-and-a-half-long resurgence of movies adapting, citing or about Shakespeare), Henry V stands out in the public sphere, long amenable to propagandistic interpretations. In the United States today it enjoys an unchallenged predominance on syllabi for graduate courses in leadership and public policy -- for instance, excerpts from this play (and this play only) appear in at least five courses at The Kennedy School of Government alone.

Last year I was asked by a friend of a friend to serve as a fact-checker for an upcoming profile in Fortune magazine. A consulting company, Movers and Shakespeares, uses the Bard's plays to present "Fun, Team-Building, Executive Training, Leadership Development & Conference Entertainment based on the insights and wisdom of the Bard . . . as relevant in today's world as they were 400 years ago!" (Look closely on the Web site, and you'll find photos of Donald Rumsfeld and Cokie Roberts merrily reciting in costume.) The writer of the Fortune article wanted me to confirm a few of their claims about what Shakespeare's plays teach us.

Most of these claims were largely innocuous, if blandly reductive and politically conservative. The leaders of the workshops, Ken and Carol Adelman, are both Republican politicos and thus tend to read Shakespeare as a kind of proto-free-market capitalist. (Ken "Cakewalk" Adelman occasionally taught Shakespeare at George Washington University's continuing education program, sits on the board of The Shakespeare Theatre, and, in his role as D.C. insider, currently serves as a member of Rumsfeld's Defense Policy Board.) But what struck me the most about their take on Henry V was their loose use of the plot. Or, rather, their fabrication of it.

One of their "lessons" from Henry's victory at Agincourt was that good leaders leverage superior technology to defeat their enemies. Henry's troops used the longbow, which helped them overwhelm a vastly more numerous French force. This fact is, of course, historically true. But this technological "fact" is one that Shakespeare deliberately omitted from his narrative of Henry's victory, in order to play up the valor of the English soldiers and the glory of God -- he never even hints at the longbow's strategic import. Thus the Adelmans actually invert Shakespeare's own lesson from this play in order to justify a particular business strategy.

The Adelmans rely quite heavily on Henry V for their corporate seminars, and understandably so: Henry makes a number of difficult executive decisions, all having to do with the management of people, in anticipation of, during, and after his expedition to reclaim France. He seeks legal and religious justification for his mission; he traps and executes traitors; he inspires his troops; he metes out justice; he consults; he negotiates; he wins. In short, if you want to be a good leader, you need to be able to "persuade like Henry V," as the handbook Say it Like Shakespeare exhorts us to do.

The historical Henry was far less appealing. Shakespeare, according to 19th-century essayist William Hazlitt, "labours hard to apologise for the actions of the king," but there are still traces of Henry's repellent character present in the play, leading many readers to sense a darker undertone to the apparent celebration of "this star of England," as the Epilogue lauds him.

If you think the business world fawns excessively over Henry, just wait until you hear what the Right does with him. Bill Bennett once introduced Margaret Thatcher with lines from the play; Dan Quayle fancied himself an underappreciated Prince Hal; Henry Hyde likened the managers of the impeachment trial to his fellow Henry's "band of brothers." 

The play is, admittedly, eminently quotable, with purple passages ready-at-hand for such men who would be king as Pat Buchanan (who trumpeted the coincidence of his bid for the Reform Party nomination and St. Crispin's Day, the day of Henry's victory at Agincourt) and Phil Gramm (who likewise invoked the holiday during his exit from the 1996 Republican primaries). Does the insistence of these Republican invocations unwittingly reveal some unspeakable fantasy for monarchical government? (Is it merely accidental that every few years someone re-discovers the Bush family's royal lineage, information that has been public since the first Bush administration?)

Henry V didn't always belong exclusively to Republicans. Woodrow Wilson cited the play, with approval, as representing "the spirit of English life [which] has made comrades of us all to be a nation"; Franklin Roosevelt viewed the rousing Olivier film in a private screening; John F. Kennedy called Shakespeare "an American playwright" after a performance of some lines from Henry V at the White House. Yet Democrats haven't been nearly as invested in this drama in recent years. In fact, the closest they come to it is through a pejorative association with the kings preceding or following Henry V -- for instance, Jimmy Carter has been painted (literally and figuratively) as Henry VI. Some conservative commentators relished the opportunity to recite Henry IV's dying advice to his son whenever Clinton launched a missile attack.


War Coverage with a Beat

You wouldn't know it now, but in the early days of the war with Iraq, some of the best and most balanced coverage on television was on MTV.

Yes, that's right, Music Television.

It's true that MTV has dialed down and almost eliminated its war coverage in recent weeks, returning to its party-all-the-time spring break programming. However, in the weeks leading up to the war, MTV's coverage of 11th-hour diplomacy was extensive and detailed. Even through the first week of the war, MTV continued to provide detailed coverage of events at home and around the world.

At first glance, it seems the war couldn't have come at a worse time for MTV, which was getting ready to dive into its legendary spring break coverage from Miami. Scheduled programs with titles like Full Body Search Miami and a "hot or not" contest on Miami's South Beach were scrapped. Those programs are usually sponsor-heavy, drawing some of advertising's heaviest hitters, including Pepsi, Sony, Virgin Mobile and major movie studios.

In place of scheduled programming, MTV stuck to a diet of music videos and educational news packages and specials -- and advertisers hung in there. News programs, which could have been termed "Iraq 101," explained the issues, ideas and personalities making headlines; subjects ranged from a brief bio of Saddam Hussein to a history of the United Nations and a definition of the Geneva Conventions. This war-related programming aired during MTV's afternoon prime-time schedule as well as in the evening hours when cable channels and broadcast networks were also airing news reports about the war.

It was a smart decision. According to an MTV poll conducted in early March, more than 60 percent of MTV viewers favored military action against Iraq. The poll coincided with MTV News reporter Gideon Yago's trip to Kuwait for a documentary series about the Marine Corps and young people living in Kuwait and Iraq. (Never one to shy away from making their reporters the focus, MTV titled the series Gideon's Journeys in Kuwait.) An earlier poll, conducted in January, reported that 67 percent of MTV viewers had at least one family member in the military.

The 25-year-old Yago spent time with the 5th Marines "Grizzly Brigade" in the Kuwaiti desert, discovering that in terms of age, at least, the military is composed largely of MTV's target demographic. In one scene, as he walked into a tent to chat with the Marines, he was shocked to see that most of them were close to his age or younger. As the camera panned across soldiers wearing headphones and nodding along to the music, Yago marveled out loud about their youth.

"Almost all of the Marines we talked to are combat virgins, outside of the scores of kills they may have racked up playing video games," he told viewers. "They are young: 74 percent of the Corps is 22 or younger, many are married, engaged or have children, and they believe strongly in their training and in each other."

The Marines crowded around Yago. He asked who their favorite rappers are, and they asked if they could shout out to friends at home. Sitting in a bunker, surrounded by soldiers, Yago was right at home talking music and sports. His reports helped MTV viewers better understand the experiences of their peers stationed in the Middle East.

Back in New York, long-time MTV news anchor John Norris replaced Carson Daly as MTV's most ubiquitous on-air presence. In the run-up to the war, Norris, at MTV's Times Square studio, interviewed Democratic Sen. Tom Daschle in Washington about the pros and cons of invading Iraq.

Norris also went to the United Nations for a sit-down interview with U.N. weapons inspector Hans Blix. In another segment, Norris spoke with his MTV counterpart in France, and they discussed the frayed relations between the two countries. This hands-across-the-water approach was unseen on any other broadcast or cable network.

But while the network's news teams may have been forging bridges, the music side was forced to take a quieter, no-offense-intended approach. MTV Europe issued a memo March 20 that said, "Obviously, there will be heightened public sensitivity to representations of war, soldiers, bombing, destruction of buildings and public unrest at home. The ITC Programme Code requires us not to broadcast material which offends against good taste or is offensive to public feeling. We therefore recommend that videos featuring the following are not shown at the moment." The memo cited war, soldiers, war planes, bombs, missiles, riots and social unrest, executions and "other obviously sensitive material." The list also included videos with the words bomb, missile, war or "other sensitive words in the artist or song title."

MTV USA may have issued a similar statement, but it hasn't surfaced. However, hip-hop artist Michael Franti said during a March 27 interview with the radio program Democracy Now! that his record company received a mass e-mail from MTV stating that no videos could be shown that mentioned bombing or war.

In the States, a fair amount of self-censorship made news as well. Madonna voluntarily pulled her own video for "American Life," lest she appear unpatriotic. And in the wake of the Dixie Chicks' fiasco in London, artists are being careful to stay on the safe side of public opinion, lest they be dropped from radio playlists across the nation -- as the Chicks were from hundreds of country music stations. Even alternative bands such as Electric Six, whose song "Gay Bar" includes nuclear war references, have voluntarily pulled new videos with war imagery. The song's release date has been pushed back and a new video is being filmed.

Make no mistake, MTV is not going to become a key source of news outside the music world anytime soon. It is still all about defining what's hip and making money from advertisers trying to reach teens and college students with disposable incomes. But a controversial war involving a large portion of its demographic was something MTV News could not ignore. Fortunately, viewers were well served by all the attention.

Frances Katz is a freelance writer who has written about media and technology for The Boston Herald, Cowles Media Daily and the Atlanta Journal Constitution. Her work has also appeared in The New York Times, The New York Post, Wired, Money and TV Guide.

Using War to Sell Country Music

The crimes of American country music are great and many: For every Hank Williams, there's a Conway Twitty; for every Gram Parsons, a Garth Brooks. Of course, Johnny Cash may well have been arrested for stopping to smell the flowers in Starkville, Miss. But his crimes pale in comparison to the current crop of country stars whose rejuvenated musical careers owe as much to Sept. 11 and the current war in Iraq as they do to (negligible) talent.

Consider country's latest heartthrob, Darryl Worley, who recently received an American flag from Lt. Gen. Richard Cody at a concert in Montgomery, Ala. The flag, one of many flown at the Pentagon on the first anniversary of 9/11, went to Worley in recognition of his vocal support of American soldiers and the patriotism of their families.

Now that's a mighty fine accolade, even for a self-confessed good ole boy from Hardin County, Tenn. But Worley earned it: He plumbed the depths of musical distaste by writing a weak song that calls for a war on Iraq. "Have You Forgotten?" (almost certainly not a rhetorical question) is an emotive call to arms: "I hear people saying we don't need this war/ I say there's some things worth fighting for/ What about our freedom and this piece of ground?/ We didn't get to keep 'em by backing down."

Worley then proceeds to conflate the current war in Iraq with the events of 9/11 in a chorus whose blustering rhetoric and fuzzy logic have proven popular with any number of undiscerning country music fans and right-thinking Americans: "Have you forgotten how it felt that day/ To see your homeland under fire/ And her people blown away?/ Have you forgotten when those towers fell?/ We had neighbors still inside/ Going through a living hell/ And you say we shouldn't worry 'bout Bin Laden/ Have you forgotten?"

"Have You Forgotten?" is currently riding high at number one for the second consecutive week in Billboard's Hot Country Singles and Tracks chart. Never, it seems, has the phrase "number one with a bullet" been more apposite.

Worley bristles at charges that the nakedly emotive nature of the song has helped forward Bush's war agenda; he argued two weeks prior to the American-led invasion of Iraq: "I am not a politician. I never have been. It's amazing to me how a lot of people become successful at their particular job in entertainment; whether it be an actor or a dancer or a singer or whatever. And all of a sudden they become this force to be reckoned with on a political level. There is nothing in this world that I want less than that."

Fine words from a man who, at a March 26 concert and rally for families of American soldiers (part of the "Spirit of America" tour) held at Tampa's MacDill Airbase, took George W. Bush's hand and said, "Mr. President, I want you to know that I pray for you every day." Bush, happy to make banal statements on a grand scale, responded: "That is the greatest gift you could ever give a president."

Gen. Michael DeLong, deputy commander of the U.S. Central Command, told the cheering crowd of 4,000 that "[i]f Darryl Worley, Toby Keith and the 'Star-Spangled Banner' can't get your blood boiling, you're at the wrong place." He very probably meant "pumping," but "boiling" will certainly do.

Bush, for his part, missed the show proper, citing special presidential dispensation: "One of the problems with being the president is you always end up being the last guy here," he told the crowd. Then he solemnly thanked Keith and Worley for "providing their talents in support of our efforts to make the world a more peaceful place." Yet daily doses of Bushian genuflection notwithstanding, Worley will doubtless be crushed to hear that the commander in chief is a closet Toby Keith fan. Keith scored a massive hit last year with "Courtesy of the Red, White and Blue (The Angry American)," which asserts: "Justice will be served/ And the battle will rage/ This big dog will fight when you rattle his cage/ And you'll be sorry that you messed with the U.S. of A./ 'Cause we'll put a boot in your ass/ It's the American Way."

Keith's brand of angry Americanism has already wowed the Pentagon. He participated in a USO tour of Bosnia and Kosovo. He penned the song as a tribute to his father, who served in the Army and died in 2001. "It was a song I was inspired to write because I lost my father six months before 9/11," he said on a recent radio show. "Nobody wrote an angry American song, and this was one. It was the way everybody felt when they saw those two buildings fall."

Keith is witheringly disparaging of those who have only let Old Glory back into their hearts post-9/11: "He taught me to be a flag-waving patriot long before it was cool to wave a flag like it is now." A close friend of metaphor, Keith performs live in front of a video backdrop showing a bulldog -- "Toby," natch -- urinating on a newspaper picture of Osama Bin Laden.

Recent live performances have seen the "Big Dog" consider the lot of the average "two-bedroom-cave"-dwelling Afghan, a "middle-aged Middle Eastern camel-herding man" overjoyed at the downfall of the Taliban, in -- keep your eyes out for it! -- the imaginatively titled "The Taliban Song."

In fact, prior to the onset of hostilities in Iraq, Keith sought to distance himself from the song's gung-ho sentiment in a clumsily formulated attempt at clearing the decks:

"Probably the biggest thing that people don't realize about my situation on, that is, I'm as anti-war as the next guy -- I really am. I'm not for ever having to go to war. If you have to go fight... If our president and our people that we've got elected... I have faith that they'll make the right decisions and if we do, then I think you've got to go in gung-ho and protect as many of us as you can."

Bush must surely have been listening.

"I'm angry about a singer in a band called the Dixie Chicks," the Angry American told an appreciative Alabama audience in March. "She felt a need to tell the L.A. Times my song was ignorant and you were ignorant if you listened to it," Keith said, referring to criticisms leveled at him by the Chicks' Natalie Maines. "She was also recently on a European tour where there was an anti-war flavor and said some things about President Bush and the war. So, what do I think about her?" he asked.

Cue "Courtesy of the Red, White and Blue" played against a visually doctored backdrop of Maines and Saddam Hussein together. I guess you had to be there.

But spare a thought for those poor godforsaken Dixie Chicks. The popular country trio saw their latest single, "Travelin' Soldier," tumble down the country charts thanks to very public anti-Dubya comments made by Maines at a recent London concert. Maines, doubtless appalled by the resulting lack of radio airplay and the potentially damaging commercial implications of her comments, later offered not one, but two very public apologies:

"As a concerned American citizen, I apologize to President Bush because my remark was disrespectful. I feel that whoever holds that office should be treated with the utmost respect. We are currently in Europe and witnessing a huge anti-American sentiment as a result of the perceived rush to war. While war may remain a viable option, as a mother, I just want to see every possible alternative exhausted before children and American soldiers' lives are lost. I love my country. I am a proud American."

Recently a group of outraged Dixie Chicks fans started a freedom-of-speech petition in support of Maines following the South Carolina Legislature's adoption of a resolution calling for both a public apology from Maines and a free concert for military families when the popular country trio resumes its U.S. tour in May.

Irate talk radio host Mike Gallagher has proposed an alternative concert to the Dixie Chicks' South Carolina date, with all proceeds from the concert donated to South Carolina military families. Fit to burst, he said, "Obviously, this is designed to send a message that it's not OK to run down our president during this time of war. They insulted their core audience. Country music fans are red-blooded, patriotic Americans who support our military and support our commander in chief."

Country Music Queen Rosanne Cash has claimed that the treatment of Maines resembles something very much like the rise of McCarthyism. "It's the people who scream loudest about America and freedom who seem to be the most intolerant for people with a different point of view," she told Australia's Undercover Music. The current issue of The Onion hits the nail dead on the head in a mock opinion piece by Ellen Dunst entitled, "I Should Not Be Allowed To Say The Following Things About America":

"True patriots know that a price of freedom is periodic submission to the will of our leaders -- especially when the liberties granted us by the Constitution are at stake. What good is our right to free speech if our soldiers are too demoralized to defend that right, thanks to disparaging remarks made about their commander in chief by the Dixie Chicks?"

Unfortunately, truth really is often stranger than fiction. The Dixie Chicks made it onto an online traitor list alongside other such showbiz luminaries as Madonna, Mos Def and Sheryl Crow, to name but a few of the "flaky" celebs who fall foul of the Web site's guiding principle: "If you do not support our president's decisions you are a TRAITOR to our country!"

A patriot list is also provided in the interest of balance, handily complete with a dictionary definition of the "P" word for those not quite certain of the increasingly tainted word's meaning ("one who loves his country, and zealously supports its authority," it reads). José María Aznar, 54, from Madrid, Spain, has signed up, but Darryl and Toby are noticeable by their absence.

Still further bad news followed the hapless Chicks with the announcement that Al Gore had taken up the freedom-of-speech cudgels on their behalf. Speaking recently to an audience of college students in Murfreesboro, Tenn., Gore said, "They were made to feel un-American and risked economic retaliation because of what was said. Our democracy has taken a hit. Our best protection is free and open debate."

Yet where Maines' timid anti-Bush outburst resulted in a very public slapdown from the media and country radio programmers alike, Worley and Toby Keith have been showered with praise and country music-award nominations as a result of their twangin' post-9/11 triumphalism. Not only did Keith record the fastest-selling record of his career to date, he scooped up eight Academy of Country Music Award nominations. Worley, for his part, bagged a Best New Male Vocalist nomination. And according to his record label, Dreamworks, the song is "scaling the charts faster than any single in recent memory. Obviously, Darryl has hit a nerve that strikes to the core of this country's conscious."

The song has certainly hit a very obvious emotive nerve. Whether it strikes to the core of the American conscious is another thing. One can only wonder what the good folks of Basra and Baghdad would think of Worley and Keith's chest-beating invocations to war. That, though, was probably of little concern to Worley when he picked up his USO Merit Award at the Metropolitan Washington USO black-tie dinner event April 9. Previous recipients include Liz Taylor, Steven Spielberg and Bob Hope.

The "Spirit of America" has truly been reawakened. Unfortunately it is the paranoid America of the McCarthy-era House Un-American Activities Committee -- simply substitute Hanns Eisler and Pete Seeger with Natalie Maines or even Pearl Jam's Eddie Vedder (who has recently taken to impaling a mask of President Bush on a microphone stand in concert). In that context, red-blooded patriots Keith and Worley may well be the Elia Kazan Lites of their generation.

Richard Perle, one of the chief architects of the Bush administration's Iraq policy and former chairman of the Defense Policy Board, famously said, "If we let our vision of the world go forth, and we embrace it entirely, and we don't try to piece together clever diplomacy but just wage total war, our children will sing great songs about us years from now." Thankfully, they'll know whom to call.

The likes of Rosanne Cash and Maines aside (it's taken almost as a given that John Cougar and Steve Earle, et al., act as a liberal counterpoint to the worst of country music's political excesses), dissenting country voices have been relatively few and far between to date. ABC World News Tonight anchor Peter Jennings allegedly had Keith dropped from last year's 4th of July TV special, although ABC cited Keith's existing concert engagements as the real reason for his no-show.

"I find it interesting that he's not from the U.S.," Keith said of Jennings at the time (Jennings is Canadian). "I bet Dan Rather'd let me do it on his special" he huffed. That's the spirit, Toby: Play the nationality card.

As is so often the case in these things, the last word must go to Toby "Big Dog" Keith: "Soon as we could see clearly through our big black eye / Man, we lit up your world like the fourth of July."

Repeat to fade.

William MacDougall lives and works in Berlin, Germany. He is a regular contributor to a number of political publications and Web sites, including Counterpunch, Red Pepper and Z Magazine.

TV News Gets Freaky

The “voice” of television news is expected to be more conversational and less formal than newspaper writing. But lately, as cable news organizations try to fix what ain’t broken, that voice is becoming more and more wince-inducing.

Back in October, a CNN Headline News producer sent an internal e-mail to the writing staff that read: "In an effort to be sure we are as cutting-edge as possible with our on-screen persona, please refer to this slang dictionary when looking for just the right phrase." Words such as “freak” (sex), “fly” (sexually attractive), “jimmy cap” (condom), and “ill” (to act inappropriately) followed.

The e-mail instructed the writers to “use this guide to help all you homeys and honeys add a new flava to your tickers and dekos” -- that is, the graphics that appear on the overcrowded, seizure-inducing Headline News screen.

The revelation of this memo gave humorists a nice little shot in the arm. James Earl Jones intoning "This is CNN, beeyotch" -- this stuff writes itself!

But by the time it had come to light, AOL-Time Warner's news war-horse was already wedging terms like "bling bling" into its crawl. This e-mail merely revealed that the man behind the curtain has a glossary.

CNN, with its average viewer age of 62, was no doubt responding to the belief floating around out there that coveted younger audiences, the 18- to 34-year-olds who command the highest ad rates, get their news from hipper television sources, including comedians like David Letterman, Jon Stewart and Tina Fey. Even if that were true, it's not like those three are peppering their jokes with hip-hop lingo; when was the last time you heard Letterman complain about Saddam being all up in our grill?

Slang is an ever-changing organism, which makes it impossible to keep a dictionary current or relevant; wouldn't squeezing obsolete slang into the "dekos" completely undermine the purpose of using it to sound hip? The claim that slang "modernizes" a newscast only makes sense if they are replacing old slang with new, but it's not like Judy Woodruff has been calling Wolf Blitzer “Daddio” all these years.

And it's just as cynical to believe that everyone in a certain age group, or any other group, is hip to certain (or any) slang as it is to believe that a target 20-year-old would flip by, see that "jimmy cap" has been shoehorned into CNN's crawl, not recognize that she's being pandered to or having her culture exploited, and lock in CNN with "favorite channel" status, saying to herself, "Finally, news that speaks to me."

Or maybe that should be "Finally, news, speaking to me," because CNN fronting with the slang is just one change in the language of TV news. Tune in to CNN, Fox News and occasionally even the broadcast networks, and you'll hear elliptical, participle-filled sentence fragments like these, recently uttered by Fox News Channel's Shepard Smith: "Meantime, the Navy, looking for another suitable training location, the Navy secretary saying it will be tough but not impossible. The Navy using Vieques for the past 60 years."

Or this bit of verblessness, from NBC's Andrea Mitchell: "Gary Condit today, the first sighting in weeks."

Or these shards, from CNN's John King: "Those negotiations continuing. Mr. Bush speaking to reporters earlier today. Suddenly optimistic."

This phenomenon was explored by the NewsHour with Jim Lehrer, where it was somewhat mistakenly called "The Vanishing Verb" (the verbs are usually there, they're just tense-less), and, more recently, in The New York Times. Smith of Fox News calls it "people-speak." He told the NewsHour's Terence Smith, "I try to talk like I speak when I'm yakking with my buddies."

But this really isn't the way people talk in conversation, casual or otherwise, unless you go around saying things like "Me, suddenly stuffy. A cold today. Buying cough drops."

This flashy, spliced speech is also supposed to save valuable time (though, of course, "Those negotiations continue" is actually shorter than "Those negotiations continuing"). Even if it does occasionally shave a couple of syllables off a sentence, is that really necessary? Is there really so much more to cram into a newscast than there used to be (when 24-hour news channels didn't exist) that words need to be squeezed out of sentences?

Judging from the numbing repetition, examination of minutiae, and parade of "usual suspect" pundits, the biggest problem the 24-hour cable news channels seem to have isn't making time, but filling up the time.

Hip-hop slang and "people-speak" are two misguided "solutions" to the same imagined "problem": that the “on-screen persona” of TV news has to be changed to accommodate the supposed throngs of young people who aren't interested in, or can't focus on, straightforward information, and who need to be lured in by choppy, faux-chummy incomplete sentences and forced, contrived lingo -- like trying to focus and entertain an infant by shaking a shiny, jangly set of keys in its face. But if the result is the loss of credibility with younger viewers and the repulsion of older viewers, who is left to tune in?

"It's definitely not your mom's Headline News anymore," Headline News chief Rolando Santos told the San Francisco Chronicle about his revamped product (ironically using a stale cliché in the process). But has anyone in this focus-group-crazy business even asked if people thought the news sounded too "old-fashioned" before? What makes the ratings go up on cable news channels is a big juicy story, not how chatty or “down” the middle-aged, overly coiffed anchors sound. When the story goes away, so do the ratings. That proves that people tune in to 24-hour news channels when they are looking for information.

It seems odd, then, that news, just news, is becoming harder to find on cable. Cable channels, underestimating the very viewers they’re trying to attract, are coming off like that desperate high school teacher who tried in vain to be "cool."

Karen Lurie is a writer living in New York City.

One Night in Baghdad

As we are about to wade into the murky waters of Iraq: The Sequel, the HBO film Live From Baghdad comes to us taped from Morocco. Starring Michael Keaton and Helena Bonham Carter, Live From Baghdad chronicles the adventures of a group of CNN journalists reporting from Baghdad in 1990-1991 as Desert Shield morphed into Desert Storm. 

Unlike other movies such as "Salvador," "Cry Freedom" and "All the President’s Men," in which journalists serve as a catalyst to tell The Real Story (CIA involvement in Central America, apartheid in South Africa, and Watergate, respectively), Live From Baghdad barely delves into the roots of the Iraqi-American conflict at all. Here, the journalists are the story.

CNN, in its own review of the film, was kind enough to make a parenthetical note of the fact that both HBO and CNN are owned by the same parent company, AOL Time Warner. It’s hard to critique such a coincidence without dipping into Cold War era terminology (words like “propaganda” come to mind), but it’s worth pointing out that Live From Baghdad is not a story about the Gulf War, or even about covering the Gulf War. It is, specifically, about CNN’s triumph over other news outlets in getting the midnight scoop on the first night of bombing over Baghdad on Jan. 16, 1991.

In the film we see old footage of other network anchors, including Ted Koppel and Peter Jennings, congratulating CNN -- “once known as ‘the little network that could’” -- for its bravado. We see war correspondent Peter Arnett (Bruce McGill) leaning against the wall in a hotel corridor, beer in hand, calling out to other journalists as they scurry to catch the first flight out (“Oh, look at ‘em run. Hurry!”). We see producer Robert Wiener (Keaton) phoning Atlanta after the night is over, where he’s told that he “can rest assured that [he’s] the envy of every journalist in the world.”

That much is probably fair. Prior to the Gulf War, the idea of a round-the-clock news network was considered about as thrilling as the Weather Channel. Desert Shield, and its slightly more dramatic follow-up, Desert Storm, did, in fact, propel CNN into its current Major Player status. Remember when you could only get world news for half an hour each night on one of the three major networks? Yeah, me neither.

Where things get cloudy is in the subsequent insinuation that CNN’s reporting of one chaotic night was synonymous with “getting the story,” however that may be defined. Many reporters have commented on the control, by both the Americans and the Iraqis, over what could and could not be photographed and described on international news during the Gulf War. The implication that one night of bombing coverage amounted to free-wheeling, in-depth reporting on the Iraqi-American conflict in general is, at very best, misleading.

Post-Vietnam censorship of war coverage has been a topic of interest for more than two decades. American reporting from Vietnam is widely held responsible for the popular resistance to the war; allowing reporters unlimited access to battle scenes was a mistake that President Reagan and subsequent administrations weren’t willing to make twice. The plot of "Good Morning Vietnam" centered on just this issue: the conflict between a DJ’s insistence on reporting war casualties to American GIs stationed abroad, and his superiors’ resolve to stop him lest he damage morale.

Peter Arnett himself has talked at length about the Pentagon’s efforts to keep disturbing battle images, particularly of dead Americans, from filtering back home to American TV screens. “The Pentagon basically doesn’t want anyone to see the first brutal elements of war,” he said during a 1991 speech at Stanford University. “In Desert Storm, for example, there were few if any pictures transmitted during the fighting of either dead or wounded GIs and few of the thousands of dead Iraqis. It’s not easy to find a picture of any of those. Accounts of the relatively few combat engagements fought by the Allied side on the ground were delayed by elaborate censorship schemes.”

This was offset, or perhaps enhanced, by Iraqis’ censorship over coverage of the Iraqi resistance movement. “There was no way in Baghdad,” said Arnett, “even if I believed it could I talk about the bravery of the Iraqis. I knew there were real restrictions on the kind of words that I could use.”

What Live From Baghdad did capture -- to the exclusion of almost all else -- is the frenetic pace of American-style television journalism. None of the correspondents are portrayed as having any particular interest in the conflict itself (aside from one shot of CNN producer Ingrid Formanek, played by Helena Bonham Carter, shoving a copy of Republic of Fear in her luggage), much less an interest in the Middle East, save for its Indiana Jones thrill factor. What they do love is war. Any war. “I cried when that goddamn war ended,” says McGill-as-Arnett, speaking about Vietnam.

It’d be funny, if it weren’t so true. American journalists, as author Stephen Hess has noted, are frequently thrown blind into a conflict and expected to pick up the particulars as they go along. This sort of commando-style journalism gives rise to hot spot reporting by “parachutists” who focus on “mayhem, bombings, gun battles, mortar attacks, and civil strife” at the expense of background and analysis.

War, in such an environment, ceases to be a vehicle of foreign policy, ceases to have any relationship with history, international trade, or human rights. What it does become is an event that can make or break a journalist’s career. Rather than gloss over or apologize for this sad fact, Live From Baghdad revels in it. As the heroic Keaton-as-Wiener informs Atlanta, “I’m gonna stay. I’m gonna ride it out. This is my walk on the moon.”

To the real Wiener’s credit, his book (on which the film was based) contains some thoughtful political criticism of American foreign policy, of Iraq, of censorship, and of war itself. The film version apparently wrote all this off as boring and pedantic, unable to compete with the wham-bang image of Keaton being blasted across his hotel room in the middle of a bombing.

The most compelling thing about Live From Baghdad, unfortunately relegated to a subplot, is the relationship between Wiener and Iraq’s Minister of Information, Naji Al-Hadithi (David Suchet). Although both characters jokingly acknowledge that they are using each other, just what, exactly, the Iraqis are getting from the exchange is never fully explored. Viewers are left with the impression that the Iraqi officials’ only job is to censor television footage, as though they had nothing to gain from CNN’s access to Pentagon officials and the American military in Saudi Arabia. Yet anyone who remembers the Gulf War will remember that the notion “Saddam could be watching this” loomed large and colored all discussions, all arguments, over what should and should not be censored during wartime.

But Saddam Hussein wasn’t the only one watching CNN. Following the Gulf War, CNN -- and, later, Al Jazeera -- emerged as potent influences in Arab countries. Satellite television’s disrespect of national borders renders it a powerful tool indeed.

“A dynamic new Arab world has emerged,” wrote Fatema Mernissi in "Islam and Democracy", “in which constant mobility in both mental and physical space, juggling with divergent opinions, and selecting from different cultures have been instinctively adopted by our youth as techniques for survival. The master educators of this new Arab world, which is still classified by the disoriented International Monetary Fund (IMF) and the World Bank as ‘illiteracy-ridden,’ are neither the religious teachers at mosques nor the instructors at state schools and universities but the designers of satellite TV programs.”

As we move into the next phase of the American conflict with Iraq, it’s useful to reflect on the changes that have occurred in the media over the past 10 years. Al Jazeera and satellite dishes have put a dent in Arab regimes’ attempts to control information, and the DIY anarchy of the Internet has served as counterweight to American media monopolies.

Even in Iraq, perhaps the least wired nation on the planet, reporters got around UN sanctions prohibiting computers by feeding questions to Baghdad via satellite phone; news from the 1998 bombing was dictated to a London office using U.S. technology for the benefit of 12,000 online chat participants worldwide. This is just one of many examples in which cobbled-together technology can challenge censorship without depending on major news networks like CNN, who are themselves dependent on remaining in the good graces of both their host country and their advertisers back home.

“Manufacturing one’s identity,” writes Mernissi, “previously the monopoly of ruthless military states which sent a whole generation of Arabs into prisons, is now the privilege of any youth who has access to a cybercafe.”

“You own this war,” Wiener is told in Live From Baghdad, a sentiment that already seems dated. If the medium is to become the message, let it be for its potential to undermine the idea that any one person can own a story, that any one news organization -- or president, or dictator -- can control the flood of information coming in and out of a war zone.

Technology’s ability to override national borders, despite the frustrated efforts of even the most diligent information ministers and the most stringent sanctions policies, provides all kinds of possibilities for networking between activists, policymakers, writers, exiles and dissidents.

CNN’s 1991 coverage of Baghdad is perhaps an interesting place to mark the beginning of this revolution, but we should be assured that the bravery of a handful of journalists is already matched -- if not outdone -- by the efforts of hostages with cell phones, refugees with radios, and students in Internet cafes throughout the world. All wars from here on out will belong to them. And if improved communication can avoid those wars altogether, that, too, will be their victory.

Laura Fokkena‘s essay, “Watching Them Grow Up,” appeared in Expat: Women’s True Tales of Life Abroad (Seal Press, 2002). She currently lives in Boston. Other PopPolitics articles can be found here.

American Style Justice

A lusty sense of vengefulness is hanging over America. Simply put: We're ready to kill.

We're ready to kill accused sniper John Allen Muhammad. And, while we're at it, we'll kill his teenage companion and apparent co-conspirator, John Lee Malvo, too. In fact, we're ready to do much more than kill: We want to make them suffer first. Just tune into talk radio or turn on the TV and you'll hear numerous suggestions for retaliation -- from both the legal "experts" and the public -- that essentially boil down to this: Torture them, then leave the bodies for the wolves.

The sniper shootings were crimes of extraordinary brutality. Not only were the attacks vicious beyond comprehension, but the perpetrators succeeded in terrorizing an entire region. But still the question remains: What's behind our quest for primordial revenge?

Following an embarrassing struggle among the various jurisdictions involved, Attorney General John Ashcroft has now awarded Virginia the bragging rights for the first trial -- despite the fact that Maryland would seem the more logical choice since more of the shootings happened there. But Maryland, like the federal government and many other states, doesn't permit the execution of killers who were minors at the time of their crimes. This means 17-year-old Malvo would not be subject to the death penalty, and Ashcroft wasn't about to let that happen. (Never mind that the only countries in the world, aside from the United States, that have used the death penalty against juveniles since 1985 are Iraq, Iran, Bangladesh, Nigeria, Pakistan, Saudi Arabia and Yemen. At least Saddam Hussein is on our side.)

Besides, Maryland has a troubling history of acting slowly and with deliberation in applying the death penalty -- you know, all that due process stuff. Virginia, on the other hand -- which proudly sports the second highest execution rate in the country, behind only George W. Bush's Texas -- has the process streamlined to a T; its motto could be, "Vengeance delayed is vengeance denied."

Criminal law is supposed to be about justice, not revenge. As Francis Bacon said, "Revenge is a kind of wild justice, which the more a man's nature runs to, the more ought law weed it out." Thus, even when authorizing the use of the death penalty, judges and legislatures traditionally have been careful to justify its use on other grounds, such as the notion that capital punishment deters violent crime (no matter that this has been discredited). For the most part, however, they don't bother with such excuses today. We seem to have reached the point where revenge is considered justification enough to kill -- no additional gloss needed.

Particularly troubling is how exclusively American a phenomenon it is. Among Western democracies, the United States stands alone in its use of the death penalty, though that wasn't always the case. A few hundred years ago, for example, the common law in England authorized the death penalty for more than 200 crimes, many of them quite minor. It was possible for a starving man to be sentenced to death for stealing food. But the law in Great Britain grew up. Over the years, capital punishment was dropped for one crime after another, until it was finally abolished for murder in 1965. (It remained on the books solely for military wartime offenses until 1998, though the last execution there was in 1964.)

For a while, the United States kept pace, and there was every reason to believe it would soon follow suit in abolishing the practice. The execution rate dropped drastically in the 1960s, due in part to various legal challenges and lack of public support, and an unofficial 10-year moratorium on executions began in 1967. In 1972, the Supreme Court, in Furman v. Georgia, voided all existing state death penalty statutes, thus suspending the death penalty.

But in 1976 capital punishment came roaring back to life, which raises the question: Why does America cling so tenaciously to the death penalty when Europe so strongly opposes it? Not only will no member of the European Union extradite a suspect to the United States who could potentially face the death penalty, but Germany, consistent with the mandates of its constitution, has gone so far as to refuse to even provide evidence to U.S. authorities for use in the prosecution of accused terrorist Zacarias Moussaoui, because the death penalty is being sought in his case. Similar policies have been adopted by many non-European nations.

There are many factors that play into America's infatuation with capital vengeance. Commonly mentioned suspects include Hollywood's romanticized depiction of violent retribution as part of our Wild West heritage, the culture of violence that has grown up around our national love affair with firearms, and the highly sensational media coverage given to high-profile crimes. Yet the biggest reason for the success of the death penalty may be good old-fashioned racism. There is little doubt that if this were a more racially homogeneous country, capital punishment would have gone the way of the dodo bird 30 years ago.

No one with knowledge on the subject can deny that race has played a role in the application of capital punishment: Study after study has proven its disproportionate use against minorities. And, yes, many white people have also been executed. But that doesn't change the fact that the death penalty remains predominately something that whites impose on blacks.

The sniper case is the exception. The death penalty, in all probability, would be just as much in play if the suspects were white. But that's a testament to how awful the crimes were and the public outcry that resulted, not to the fairness of the system. In the less sensational murder cases, it's often a different story.

Over the years, support for capital punishment proved to be the perfect stealth technique for exploiting racial fears for political gain. Sometimes the racial elephant in the room is painfully obvious, like when a political advertisement flashes the mugshot of a black offender on the screen at the same moment the candidate pledges to vigorously support the death penalty. At other times the pitch is more subtle. But when capital punishment becomes one of the biggest issues in a campaign, you can take it to the bank that it's race that's really being discussed.

As the death penalty's popularity soared throughout the 1980s and 1990s, many of the poll-hugging "New Democrats" jumped onto the bandwagon, leaving very few political leaders willing to oppose the practice. Given this, it's not surprising that support for capital punishment of convicted murderers remains high in this country (72 percent are in favor, according to Gallup polls) despite the fact that more than 100 death-row inmates have been exonerated since 1973.

The sniper case is, unfortunately, likely to make it even more popular. This is just the sort of case death penalty enthusiasts live for. Never mind that the extreme facts serve to distort the debate, which in turn can distort the law that will apply in future, less egregious cases.

Last month, the Supreme Court, by a vote of 5-4, declined to hear a capital murder case involving a death row inmate who was 17 when he committed the crime. Justice John Paul Stevens, one of the dissenters, wrote that the use of the death penalty for 16- and 17-year-olds "is a relic of the past and is inconsistent with evolving standards of decency in a civilized society."

After the arrest of Malvo, voices against the death penalty have been drowned out by those clamoring to put the snipers to death, age be damned. Clearly the United States is destined to remain a world leader when it comes to capital punishment. Perhaps we should ask ourselves whether that's an honor we really want.

Steven C. Day, an attorney in Wichita, Kansas, is a contributing writer to PopPolitics.com.

Reflecting on Black Men as Snipers

The recent sniper attacks in the Washington, D.C. area made for a lot of TV watching. It wasn't because the story was always riveting or the news was always breaking. In fact, more often than not, the news briefings held by Police Chief Charles A. Moose of Montgomery County, Md., were uninformative. 

And yet, even as the shooters' range expanded to other jurisdictions and even as the timing of attacks was variable, Moose would, every few hours, make his way to the microphone at the Montgomery "command center," surrounded by reporters, camera crews, support staff and law enforcement types, to tell us what we already knew: The killer was still at large. The whole business had become "personal." Someone had seen a white van or a white box truck, somewhere.

As October wore on, and frightened locals were increasingly testy concerning what the cops did or didn't know, how they were handling the case, and what they were saying publicly, or leaking or refusing to leak to the camped-out press, Moose kept on. Now, in hindsight, the reasons for his cryptic tactics seem multiple and complex. We know now that he was communicating more or less directly with the killer ("The person you called could not hear everything you said; the audio was unclear and we want to get it right. Call us back so that we can clearly understand") and responding to notes left at crime scenes ("Our word is our bond"). 

Exactly what Moose knew or when he knew it may never become completely known, but during those difficult 23 days, the emotional toll on him was visible daily. As he now describes the strategy, post-suspects-capture, "I'll talk to the devil himself to keep another person alive."

He wasn't the only one doing a lot of talking, of course. Television news shows trotted out legions of "profilers," professional, retired, amateur and increasingly annoying, lining up on TV and in print to tell everyone how to imagine the killer. Their occupations were varied -- legal correspondents, cable news hosts, professors, reporters, columnists, former lawyers, current lawyers, terrorism experts, former employees of the FBI, former cops and, in one sensational instance, even a former serial killer: David Berkowitz, interviewed by the Fox News Channel's ever intrepid, ever misguided Rita Cosby.

For all the (ratings-bumping) brouhaha that greeted Cosby's ostensible coup, the Son of Sam's insights were much like everyone else's -- the killer was shooting from a distance, he was moving about, he was angry. For all the lack of available information, these authorities and specialists dutifully wore their suits, appeared against book-lined backdrops or in maps-and-graphics-equipped studios, to proclaim what everyone knew, or worse, what they had no substantive grounds for asserting: The killer was white, male, of a certain age and station. He was taunting police, he wanted to get caught. He was ingenious, he was insane. He was a terrorist, an expert sniper, an expert first-person video game player, indicated because he used that odious and supremely unimaginative phrase, "I am god."

A few "news" shows included discussions of the "news" coverage. Was the coverage excessive and sensational? Was it fear-mongering? Ratings-mongering? Was it superseding other news? Was all this profiling bogus? (Recall how many times you heard someone set up his -- and it was mostly "his" -- opinion by saying, "Well, I don't have all the facts, but ...")

In most areas around the United States, the story was awful and upsetting -- and not the only news. Still, the cable news stations took it up as if it were, deploying expensive crosshairs graphics, interrupting themselves to "break" news that wasn't new. For Donahue, Jerry Nachman, Connie Chung, Bill O'Reilly, Dan Abrams, Geraldo, John Walsh, Chris Matthews, et al., it was the story of the minute for almost a month, and that meant that the teams assembled beneath the Montgomery tents came from all over.

However, in and around D.C., where I live, the story was (and remains) local news, intensely. The wall-to-wallness was unavoidable. Morning to night, Sniper TV ruled. Schools closed in Virginia, interviewees explained why they were staying home or going to the mall, traffic stopped for hours following an attack, and the secret military high-tech surveillance plane was flying around, somewhere, sometime, maybe. And no matter where the violence and grief spread, Chief Moose came before the mic, to read his statements and (maybe) take questions. Politely, because, as he instructed one journalist, his parents raised him that way.

And then, shortly after midnight, on Thursday, Oct. 24, the news channels went nuts with "Breaking News" banners, splitting their screens and their experts' opinions across chopper shots of a backyard in Tacoma, Wash., and a liquor store in Montgomery, Ala. 

Suddenly, there was a license plate, then names and descriptions: John Allen Muhammad, a 41-year-old Gulf War Army veteran with a couple of ex-wives and several relatives who were all too ready to talk to Larry and Katie and Greta; John Lee Malvo, variously termed Muhammad's "stepson," his "teenage sidekick," or "a 17-year-old Jamaican immigrant with a sketchy past." He left his fingerprint on a gun-lovers' magazine in Alabama, which led ATF, FBI and local police investigators directly to the pair, as soon as one of them phoned the task force and told them about "Montgomery."

Suddenly, there were arrests at a truck stop in Maryland, a "courageous" truck driver, and ongoing arguments about who gets to prosecute first; as Montgomery (Ala.) Police Chief John Wilson put it, in a much-repeated sound bite, "We're going to make an example of somebody." 

Suddenly, there was a Bushmaster rifle and a blue Caprice with a hole cut in the trunk, a grim "killing machine." There were court reporter sketches, perp walks and home videos of martial arts classes.

Suddenly, there were faces -- black faces.

This last bit came as something of a surprise, especially if you'd been even half-listening to all the experts who were so sure about what to expect and presume of serial killers. But the surprise, disturbing and pause-giving as it might be, hasn't exactly stopped the experts from yapping. 

After all, now there are more stories to put together and tell: unhappy childhoods and military service records to dig up; last visits and ominous conversations to recall (Muhammad's six-months-ago question to an "old Army friend": "Can you imagine the damage you could do if you could shoot with a silencer?"); distressing anecdotes to relate (Malvo's diet of crackers and honey, and his "scared" behavior). And on Saturday, another arrest of another "material witness," Nathaniel Osbourne, 26-year-old co-owner of the Caprice, whose picture appears on TV screens, seemingly cut off from a snapshot that once included someone else.

Presently, all terms and assumptions have changed. Most of the stories emerging have to do with this unexpected turn, this unexpected blackness.

Item: One note from the sniper closed with five stars. Says Howard Kurtz of the Washington Post: "This case hardly lacks for bizarre elements. Who would have thought that the ... suspect's notes would reference a 'duck in the noose' fairy tale or a Jamaican band called the Five Stars?"

Item: Muhammad converted to Islam in 1985. "This is a politically incorrect thing to say," announced "Terrorism Expert" Steve Emerson on The Abrams Report (Oct. 24), "but the bottom line is, [for] the devout Muslim who believes in taking action in jihad, it's a short line for them to go into carrying out violence."

He added, not quite as an afterthought, "Again, we don't know what motivated this particular shooter."

Item: Muhammad may or may not have been sympathetic to the 9/11 hijackers, and reportedly worked "security" at the Million Man March. This last bit has been denied by Louis Farrakhan, who adds that if Muhammad is convicted of murder, he'll be kicked out of the Nation. CNN reports: "Farrakhan noted that Oklahoma City bomber 'Timothy McVeigh confessed that he was a Christian, but nobody blames the church for his misconduct.'"

Item: "I am god" now doesn't designate video game prowess, but, reported Fox News Channel (for a minute), it may refer to a belief held by members of the Five Percent Nation. According to the Associated Press (Oct. 27), Robert Walker, "a consultant based in Columbia, S.C., who helps police identify gangs" observes of this coincidence: "I'm not saying he's a Five Percenter. I don't know that. Only that 'I am God' is something a Five Percenter might say. All black men who are followers and members of the Five Percenters refer to themselves as God and will even refer to someone else who is a Five Percenter as a God also."

Item: Geraldo and others are scrambling to make their tabloid points, loudly. The Oct. 27 edition of At Large With Geraldo Rivera led with the reporter's announcement that this "angry loser" (Geraldo is colorful, as always) is linked with human smuggling. In April 2001, it comes out, Muhammad was detained by immigration inspectors at the Miami International Airport, who suspected him of trying to smuggle two undocumented Jamaican women into the country.

Item: On Saturday, Oct. 26, Malvo, handcuffed in his holding cell, made an awkward effort to climb out through the ceiling. There wasn't a chance he might have made it, but responses from Virginia's (painfully named) Attorney General Jerry Kilgore and still more "experts" were immediate and harsh. Keep him in shackles, in waist chains and leg braces. And, most emphatically, make sure he gets tried in Virginia, where he can be tried as an adult and be eligible for the death penalty. So that someone, somewhere, can "make an example of somebody."

Looking back on the past month and looking forward to the feeding frenzy that the court cases will inspire, journalists persist. Time magazine has called on Vietnam War veteran, essayist, and novelist Tim O'Brien (Going After Cacciato) to ponder "the difficult task facing soldiers returning to society."

His insights are at once predictable and weighty. In war, he notes, "The capacity that you could do terrible things is awakened." It's difficult to come back to the world, to a civilian life. At the same time, O'Brien indicts the U.S. media and its consumers: "I was disgusted to see this country transfixed by a sniper while a war's being planned in Iraq," he writes. People in "the rest of the world ... could be dying by the thousands and we'll go on with our business with no fear or personal stake. I never fail to be stunned by our appetite for atrocity and violence."

As if to illustrate (or feed) same, this past weekend's TV (Oct. 25-27) has been rife with efforts to describe, reframe and sensationalize the sniper story, not to mention to justify the previous attention paid to all those so-wrong experts. In addition to underlining (and amplifying) the obvious drama of the case, these shows are also explaining their own existence on the scene(s). The sniper mounted "an attack on the fabric of life," says reporter Jeanne Meserve, on CNN's Manhunt: Cracking the Case (Oct. 27), while she stands on a street in the "D.C. metropolitan area," the same area where so many D.C.-based reporters live.

Manhunt goes on to trace events: The first day's shooting spree, the ensuing tensions and tactics, the sites of attack, the daily reports by Chief Moose (these images blown up into grainy digital close-ups to heighten the horror-effects, say, when he reads from the postscript, "Your children are not safe, anywhere, at any time"), the amassing media coverage. 

At its peak, reports CNN, the ratings for networks covering the story "more than tripled," helped along, no doubt, by the startling cross-hairs logos and galvanizing theme music. "Marketing murder or serving a public need?" asks the reporter. "In fact, it was both." In fact.

As might be expected, given his reputation, his man-in-action opening credits graphics, and his penchant for reporting on all-things-gigantic-that-will-enhance-himself, Rivera's show Oct. 27 obscures fact in favor of high emotion and low tabloidism (his set for the past couple of days includes a mock-up of the "killing machine," a 1990 Caprice his "expert" has outfitted to resemble that of the killers).

First, Rivera beats down any effort by his guest, John Mills, Muhammad's lawyer during a custody case in 1999, to suggest that Muhammad's (alleged) violence may have been long in the making, a function of building frustrations. Geraldo wonders why Mills wants to probe into the past (even though Geraldo has, again, invited him onto the show). Mills attempts to state his case: "It's important to understand what happened, in order to prevent this from happening in the future." 

Rivera rejects that. Mills tries again, noting that the talking-head-guest-psychologists Robert Butterworth and Cyril Wecht are diagnosing a man they've never met (they're commenting on Muhammad's sympathy for the 9/11 terrorists, while the man who knew him then, Mills, observes, "He wanted to see his kids"). Rivera scoffs. "There are 13 bodies here, 10 of 'em dead. Maybe you're the oddball here, maybe you're the one who's wrong."

Not one to stop when he's (even nominally) ahead, Rivera moves on to the next segment: Former prosecutor Wendy Murphy declares, "The death penalty isn't enough. We want them to suffer more ... How else do we vindicate the interests of the entire region?" At Rivera's (seeming) invitation, attorney Geoffrey Fieger attempts to inject sanity into the proceedings, suggesting that this "thirst for blood ... speaks volumes about this country. It makes us as uncivilized as [the killers]." Yet again, Rivera passes judgment: Muhammad and Malvo, driven by their "diabolical chemistry," deserve to die. So much for due process.

None of this is to say that the murders are not heinous or the murderers not horrific. The process of media story-making, however, is hardly transparent. 

When, also on Oct. 27, Greta Van Susteren interviews Chief Moose and his wife, Sandy Herman-Moose, the focus is not on lust for punishment, sensational violence or perpetual horrors. Moose, humble, gracious and looking rested at long last, focuses instead on his "pride" in the cooperation between departments and individuals, the ongoing mourning of survivors' families, and the role of the media in the investigation (and still, he doesn't blame or make noise, only notes, quietly, his view of what happened). Moose is no longer in charge, since prosecutors in Virginia, Maryland and Alabama have jumped on the self-promotional bandwagon. And the case looks increasingly hysterical and sad.

The prominence in this lengthy "show" of Moose, a decent and now much respected man -- not to mention his enigmatic messages to the sniper ("You have asked us to say, 'We have caught the sniper, like a duck in a noose'") and brusque behavior with reporters -- now appears to have been motivated by some knowledge that he could not disclose at the time.

Pete Hamill speculates in the New York Daily News (Oct. 25), that this knowledge had to do with the killer's identity; that is, his race. Hamill writes that Moose's "anguish seemed to intensify as communications were opened with the killers. Almost certainly this was because he knew they were black. He is clearly a decent, tough, disciplined black man, an American before he is anything else. But he also must have known what my friend knew yesterday: Black people didn't need this. He almost certainly knew one other large truth: Race had nothing to do with it."

But of course, race has everything to do with it. No one has to say it to make it so. 

Still, several major newspapers (among them, the Washington Post, The New York Times, the Baltimore Sun) have run stories on reactions of "black community" members to the news that the suspects are black. Black call-in radio shows (like Tom Joyner's) were inundated with responses, as was BET.com; Tavis Smiley plans to do a show on the subject, and it's a good bet that Ed Gordon will do so as well. (Johnnie Cochran, however, has already begged off the case, telling Donahue that he felt "fear" when in D.C. during the period of the attacks, so he's no longer objective.)

This is a discussion that "white community" members will never need to have. They don't feel that a Timothy McVeigh, an Eric Harris, or a Ted Bundy represents them, that others will perceive them differently because someone of their race commits atrocities. Rather, white folks tend to see these criminals as "evil," deviant or otherwise not like them. 

To be sure, most black folks will not identify with a Muhammad or a Malvo, but fear being identified with them. Amid all the fears available out there, this is one fear that white people in the United States won't need to confront.

Cynthia Fuchs is an associate professor of English, African American studies and film and media studies at George Mason University.

Burning Our Cultural Bridges

Sometimes we’re just dumb.

Consider, for instance, the subject of visas. One of our goals in the war against terrorism is to “win the hearts and minds” of the “Arab street” and Muslims around the world. In other words, try to make them hate us a little less and perhaps even engender something akin to mutual respect. Then, hopefully, fewer of their young people will grow up wanting to achieve martyrdom by killing Americans.

So how do we go about this process of trying to ingratiate ourselves to young Muslims? Why, by insulting their cultural heroes, of course.

Take the shabby way our government treated Iranian film director Abbas Kiarostami, widely viewed as one of the greatest living filmmakers. Kiarostami was unable to attend the premiere of his new movie, Ten, at the New York Film Festival, which began in late September, because he couldn’t get a visa to enter the U.S. in time.

His story is far from unique. Scores of artists and pop performers have fallen into this quagmire.

The difficulty grows out of the U.S. Enhanced Border Security and Visa Reform Act, signed into law by the president May 14. Under the act (and related regulations created by the Bush administration), citizens of nations designated as “state sponsors of terrorism” -- currently Iraq, Iran, Syria, Libya, North Korea, Sudan and Cuba -- are required to go through lengthy FBI and CIA background checks before receiving visas to enter the country. Citizens from 26 other undisclosed (but thought to be mostly Islamic) countries are subjected to a shorter mandatory waiting period.

Add to this the intensified scrutiny all visa applications are receiving in the aftermath of Sept. 11, and you have an obvious recipe for delay. The end result is that visas that once were issued in a few weeks can now take up to six months.

The idea, of course, is to prevent terrorists from entering the country, plainly a laudable goal. But the 62-year-old Kiarostami would seem an unlikely terrorist threat: An award-winning writer and director of 30 films, he has visited the U.S. seven times in the last 10 years. His 1997 film, Taste of Cherry, was the Palme d'Or winner at Cannes, and his latest film explores the lives of Iranian women living under an oppressive system. There was no indication he ever tried to blow up anything -- except, perhaps, a few cultural stereotypes.

And the snub wasn’t an oversight. Ines Aslan, a spokeswoman for the New York Film Festival, said festival organizers and others tried “very, very hard” to prevail on officials at the U.S. Embassy in Paris to make an exception for Kiarostami. Similar exceptions have been allowed in the past. But they hit a brick wall.

“It wasn’t that they could not make an exception,” she said. “It was that they did not choose to. It is very sad.”

Not surprisingly, this news wasn’t received well abroad. Jack Lang, a former French minister of education and culture, called it “intellectual isolationism and ... contempt for other cultures.” Aki Kaurismaki, a film director from Finland, boycotted the New York festival in protest. “If international cultural exchange is prevented,” he mused, “what is left? The exchange of arms?”

Other cultural figures who have been caught in the U.S. visa squeeze include Iranian pop diva Googoosh, who was forced to cancel a long-scheduled concert, and 22 Cuban musicians prevented from attending the Latin Grammys; one of them, jazz pianist Chucho Valdes, won the Grammy for pop instrumental album.

One suspects that George W. Bush is no great lover of foreign language films, Persian pop music and Latin jazz. It probably doesn’t break his heart that cultural exchange, involving these and other art forms, has been hindered by the war on terrorism. But before he writes the whole thing off as soft-headed intellectual nonsense, he might want to talk to Norman Pattiz.

Pattiz is the creator of Radio Sawa, the U.S. government-sponsored Arabic-language broadcasting service. Broadcasting 24 hours a day, seven days a week since its debut in March 2002, Radio Sawa -- which means “Radio Together” in Arabic -- has been hugely successful in attracting listeners in its under-30 target audience. Ha'aretz reports that it is the most listened-to radio station among young people in Jordan's capital, Amman. While the broadcasts include news reports, its popularity is generally attributed to its multi-cultural musical programming that allows listeners to hear their favorite Arab and Western performers in the same broadcasts.

The hope is that the station will improve America’s image among young Arabs. While testifying before the Senate’s Committee on Foreign Relations in June, Pattiz said, “There’s a media war going on (in the Middle East) and the weapons of that war include disinformation, incitement to violence, hate radio, government censorship and journalistic self-censorship. And the U.S. didn’t have a horse in the race.”

Whether Radio Sawa will be successful in improving America’s standing with young Arabs remains to be seen. But the willingness of the U.S. government to fund it, to the tune of $35 million in fiscal year 2002, demonstrates an awareness of the power of pop culture to help bridge cultural divides.

So why then are we using visa delays to burn down those bridges? Why, when we’re spending good money to promote cultural understanding in the Middle East, are we deliberately undercutting those efforts by belittling their cultural icons? This is not the way to win the hearts of their youth.

Think of it this way: How would Americans respond if another country announced that Steven Spielberg or Bruce Springsteen would have to sit out an awards ceremony so that background checks could be completed to make sure they weren’t terrorists? Would we think that reasonable? Would we assume that no insult was intended against the United States?

What makes this all so sad, of course, is that the problem could be fixed with modest efforts. Developing a system that expedites visa requests from performing artists and similar cultural figures, perhaps combined with some form of grandfather clause for those who have visited here in the past and who have already undergone background checks, would be a snap. And it wouldn’t harm homeland security one iota. But so far, at least, our government has refused to budge.

In a statement released to the press, New York Film Festival director Richard Pena summed it up this way: “It’s a terrible sign of what’s happening in this country today that no one seems to realize or care about the kind of negative signal this sends out to the entire Muslim world (not to mention to everyone else).”

Like I said before, sometimes we’re just dumb.

Steven C. Day is an attorney practicing in Wichita, Kansas.

American Democracy and National Amnesia

“From the time it was passed in 1870 until 1965, no president, no Congress, and no Supreme Court did anything serious to enforce the Fifteenth Amendment ...” -- Howard Zinn

“I’m gonna sing this verse, I ain’t gonna sing no more / Please get together, Break-up this old Jim Crow.” -- Lead Belly, “Jim Crow Blues”

Certain myths remain popular in spite of reality. Americans, for example, rely on a few admired figures and television images to help them understand the impetus behind the civil rights movement, even though the bus boycotts, voting drives and the March on Washington were just the final breach following 50 years of institutionalized repression in the South.

Likewise, many still argue that Reconstruction exacerbated Southern resentment toward blacks, leading to repressive Jim Crow laws. But this is also a simplistic, and inaccurate, reading. It’s not surprising, however, that the Jim Crow era following Reconstruction is a poorly understood period in popular consciousness: Books on the Civil War and civil rights fill up public library shelves; the period in between -- how we got from A to B -- remains a national blank.

In Democracy Betrayed (University of North Carolina Press, 1998), historian H. Leon Prather, Sr. noted when describing the 1898 race riot in Wilmington, N.C., “Most Americans remember nothing of these events despite the enormous impact that they continue to have on racial politics in the United States.”

A new PBS series, The Rise and Fall of Jim Crow, fills this empty space, documenting the period of segregation and the African American experience in the South from Reconstruction to the early 1950s. Co-directors and writers Bill Jersey and Richard Wormser, both veterans of the civil rights movement, clearly believe that the story of Jim Crow is central to understanding our current racial problems. There are portraits of Booker T. Washington and W.E.B. Du Bois, and examinations of the riots in Atlanta and Tulsa. The roles of everyday men and women -- sharecroppers, factory workers and prison laborers -- are also documented in the four-part series, which began Oct. 1 and runs through Oct. 22. These narratives and voices, combined with rare photographs and film, create a living, multi-layered history.

“Jim Crow” originated in the early 19th century as a character in a minstrel song by a white actor named Thomas Dartmouth "Daddy" Rice. Rice also helped establish Crow, like Jim Dandy and Zip Coon, as a stock character in minstrel shows. At one point, Jim Crow served as a racial epithet and, at the end of the 19th century, became the common term for a series of repressive Southern laws that singled out African Americans.

A number of tragic and traumatic events are recalled in The Rise and Fall of Jim Crow, including an in-depth look at the riot that took place in Wilmington, N.C. The segment begins like a black success story in the midst of Southern tyranny. Wilmington was the home of a small, prosperous black middle class made up of businessmen who owned tailor shops, drugstores and restaurants. African Americans also held a number of political offices. Many whites, however, found the combination of black economic and political power threatening. In 1898, North Carolina Democrats began a vicious campaign built on nothing more than malicious speeches and editorials claiming that white women were in danger from black men. Incredibly, women dressed in white would ride on floats in parades carrying signs reading “Protect Us.”

African-American newspaper owner Alex Manly ratcheted up the hostility level several notches when he countered with an editorial suggesting that many of the rapes that resulted in lynching were actually consensual sexual acts between white women and blacks. Although Democrats easily won the election by stuffing ballot boxes, they wanted revenge. They burned Manly’s newspaper office and began to shoot blacks on the street at random. When the violence ended the following day, the dead -- officially 25, though some place the count as high as 300 -- were dumped in the Cape Fear River. Blacks throughout the country wrote the president in protest, but President McKinley and the federal government did absolutely nothing.

There are also stirring stories of courage. Barbara Johns, a 16-year-old girl in Farmville, Va., (near Appomattox, where the Civil War ended), led her fellow students in a school-wide strike in 1951. Disgusted that the 450 black students of Moton High School were crammed into facilities made for 180, she gathered the students in the auditorium where she gave them their marching orders.

“We wanted so much here and had so little,” Johns later recalled. “And we had talents and abilities here that weren’t really being realized. And I thought that was a tragic shame. ... There wasn’t any fear. I just felt, this is your moment. Seize it.”

Together, the students marched off campus to the superintendent’s office where they demanded a modern facility like the one that housed the white students only blocks away. Their demands were met with threats: Their parents would lose their jobs and possibly go to jail. The parents, nonetheless, backed the students, and when the NAACP visited with the striking students, it was decided that separate but equal wasn’t enough: They would demand full integration. The Farmville case would eventually be bundled with four others into Brown vs. Board of Education.

Because The Rise and Fall of Jim Crow relies heavily on oral history, the program occasionally lacks the needed context to understand the events taking place. It is important, for instance, to connect the rise of populism with the acceleration of black disenfranchisement in the 1890s. When the African American vote held the balance of power in Democrat and Populist contests, the black vote was quickly eliminated. This, in fact, was part of the fuel that ignited the Wilmington riot.

By failing to provide context, The Rise and Fall of Jim Crow is sometimes reminiscent of Ken Burns’ Civil War. After a general introduction is offered to a particular episode, a number of stories, biographies and events are poured into the loose mold. But without a narrative thread, a riot in Wilmington, W.E.B. Du Bois’ battle with Fisk University, and a renaissance in Harlem fail to provide a coherent history.

The total impact of the program, however, is overwhelming. As a work of oral history, The Rise and Fall of Jim Crow allows submerged voices to speak out. Historians, civil rights leaders and sharecroppers talk about their lives as well as their parents and grandparents’ lives under Jim Crow rule. As director Wormser has noted in press materials, he wanted to “let African Americans tell the story of their own struggles themselves.” Letters, journals and newspaper columns are likewise utilized, providing a vivid picture of the struggles that individuals and families faced as they sought an education, steady work, the right to vote and the freedom from fear.

One might argue that Americans’ lack of knowledge about the Jim Crow era issues from a desire to forget an unpleasant historical episode in which the culture at large participated by turning its back on African Americans in the South. It’s easy in retrospect to feel proud of the abolition movement and of the Radical Republicans’ attempt to establish black rights following the Civil War -- just as it’s easy to side with Martin Luther King and condemn the South for attacking children with water hoses. But in between those two historical moments, African Americans in the South were pretty much left without a net.

While The Rise and Fall of Jim Crow will help fill this blank in the national memory, the program will also serve as a reminder of the ongoing racism that we live with every day. You don’t need to take a class in black studies to notice Uncle Ben or Aunt Jemima at the grocery store (Kraft preserves Uncle Ben because he represents quality and because Cream of Wheat lovers are fond of the old fellow. As far as Aunt Jemima goes, a man in black-face, dressed as the pancake matron, won first prize in a Milwaukee Halloween contest in 2001); or drive very far on Virginia or Georgia highways to find people who still believe that a Confederate battle flag on a bumper sticker has something do with heritage.

Spike Lee has made the argument that gangsta rap videos serve as a modern minstrel show, and it wouldn’t take too much imagination to see crumbling inner-city schools as a vestige of separate but equal. Jim Crow, born of racial hatred in the South and institutionalized by an America that looked the other way, remains with us.

Ronnie D. Lankford, Jr. writes about music and nonfiction film for Documentary Films.net, Pop Matters.com, Dirty Linen and Sing Out! He lives with his wife and five cats in Appomattox, Va.

The Ultimate TV Candidacy

On Sept. 26, 1960, John F. Kennedy and Richard M. Nixon faced off in the first televised presidential debate. What they said was secondary to how they looked and behaved: Kennedy was charming, tan and wore makeup that pleased the camera; Nixon was underweight, pale and refused powder that could have covered a five o’clock shadow.

As a result, those who watched the debate on TV picked Kennedy the winner by a mile. Radio listeners, in contrast, picked Nixon, believing he had done a better job responding to questions.

“The Great Debates marked television's grand entrance into presidential politics. They afforded the first real opportunity for voters to see their candidates in competition, and the visual contrast was dramatic,” Erika Tyner Allen wrote for the Museum of Broadcast Communications. “Those television viewers focused on what they saw, not what they heard.”

Forty-two years later, it has come to this: Rupert Murdoch and FOX's cable channel, FX, are bringing to television American Candidate, an American Idol-like game/talent show in which 100 political hopefuls will strut their stuff in an attempt to be picked by couch potatoes nationwide to run for president.

It is, perhaps, the ultimate merger of popular culture and politics. Audience members and at-home viewers, voting by telephone and Internet, will reduce the number of "candidates" each week. The public will pick the winner from the final three, and the results will be broadcast live from the Mall in Washington, D.C., in July 2004. Buoyed by the publicity, the winner will presumably be encouraged to run as a third party candidate.

Surprisingly, the United States wasn’t the first country to propose such an idea. A Buenos Aires television channel beat Murdoch by a couple of weeks when, in early September, it announced the launch of The People’s Candidate, a reality TV show that will not only put its winner up as a congressional candidate in 2003, but will also launch a new political party.

According to the BBC, “About 800 people have already auditioned for [The People’s Candidate], including pensioners, transvestites and the unemployed. The search is already underway as judges whittle the hopefuls down to 16 who will appear on the show.”

It's enough to make one laugh for Argentina, much less cry -- but just remember a U.S. network is in on it, too. You could read many things into this: Is it a sure sign of the apocalypse? A natural extension of the Survivor and American Idol-ized global phenomenon? A wonderful, Marxian chance for the proletariat to rise to prominence (although the elite still control the airwaves, so that wouldn’t go too far)? Or is it merely another step in the evolution of the political campaign?

After all, turning the U.S. presidential race into a long, contrived television program would require fewer adjustments than one might think. Campaigns are already so structured that, much like “reality TV,” they are hardly “real” at all. Networks have cut back their coverage of national party conventions because nothing unscripted happens. Only C-SPAN stays tuned.

Remember the 2000 Republican convention, in which organizers paraded members of minority groups across the stage as a sea of white faces looked up from the crowd, all the while keeping far-right, controversial figures such as Texas Rep. Tom DeLay out of sight? That wasn’t reality -- that was sleight of hand, a bag of tricks.

The Democrats pulled their own tricks during Al Gore’s nominating convention. Bring in Bill and Hillary on the first night, then get ’em out the door. Bring in the religious senator Joe Lieberman two nights later to wipe the moral slate clean. Don’t let America be distracted by scandals, real and purported; make it clear that Gore is “his own man.” Still not convinced? Cue up the movie directed by MTV-video-director-turned-film-director Spike Jonze that shows off Gore’s adventurous side and portrays him as a kind, honest family man: the anti-Clinton.

Conventions and other campaign events may be “real,” and they may be “news events,” but they are also manufactured as a commercial product designed to convey a specific message. When things aren’t so neatly controlled, when something such as an unchecked microphone is on, chaos can ensue. Consider George W. Bush’s near-legendary flub at a campaign event when he called a New York Times reporter a “major league asshole,” and everyone heard. Clearly this “backstage” moment was the last thing Bush wanted; it helped temporarily stall his campaign and diverted attention from his message.

Now compare the politicians with the participants on Survivor, who knowingly perform for the camera while trying to “keep it real.” Many events are contrived, and the final product is a carefully constructed, heavily edited, 44-minute packaging of three days’ worth of “news,” designed to typecast players to focus the audience’s attention on “storylines” deemed interesting by the producers. The producers sometimes even re-arrange the order of events “for dramatic purposes” -- to manufacture discord and create a narrative that isn’t really there. And we call this reality? (To his credit, Survivor executive producer Mark Burnett has said from the beginning that “reality TV” is a misnomer for his show -- that “unscripted drama” is more apt. He’s right.)

The presidential debates are not much better. The execution is just as tight, down to the exact room temperature, the height of the lecterns or seats, the overly negotiated rules regarding the questions and answers, and the carefully chosen audience members for the so-called “town meeting” debates. When then-President Bush famously glanced at his watch 10 years ago during a debate with Clinton, it may have been the most spontaneous debate moment recorded on film that year.

There’s a reason why lawyers and campaign advisers argue over format. If the rules of the 2000 debates, for example, had been looser -- had the events been more “real” -- there could have been at least two doozies. In the first debate, Gore erroneously stated that he had traveled to Texas in 1998 with the head of the federal disaster agency to view flood damage. Bush was fairly sure at the time that this was incorrect. In a more free-flowing format, he might have called Gore on it. Now that would have been interesting.

Similarly, in the third debate, Bush was asked a question about affirmative action and gave a meandering reply that revealed no depth of knowledge on his part as to the meaning of the concept. Gore asked Bush if he agreed with what the Supreme Court has said about the legality of affirmative action, and Bush was trapped. He looked to moderator Jim Lehrer for help, and since the rules stated that candidates could not ask questions of each other, Lehrer moved on to another question. But the immediate impression was that Bush had no idea what the Supreme Court had said about affirmative action and needed Lehrer to bail him out. Had Bush been forced to answer the question, the confrontation might have kicked up a few more Nielsen points.

Political parties don’t want us to think about structure (though we do nonetheless). They want us to see it as pure reality, to pay no attention to the man behind the curtain. But this simply illuminates that politics and campaigning are already a show, no more perfectly “real” than an episode of, say, The Mole. American Candidate, then, may simply be an inevitable step in the process.

And it might not be such a bad idea. Imagine if the 2004 pool of Democratic contenders appeared on ABC each week (the network that most needs a hit) with Clinton-adviser-turned-journalist George Stephanopoulos. They could debate the issues and take part in challenges: Persuade a foreign head of state to side with the United States on invading Iraq! Find Cheney’s hidden lair! Get the country out of an economic slump and deliver a tax cut -- all without destroying Social Security! An impartial group like The League of Women Voters could serve as “judges” and Democratic voters nationwide as the “jury,” voting off one candidate each week.

In theory it could make the whole process far more democratic, but this type of “reality” show would sap too much power from the political establishment. (In a 2000 Republican version of such a show, John McCain might have actually bested Bush.)

True, the whole concept seems more suited for a Saturday Night Live skit than the real world, but Murdoch is likely to see it through in one form or another. And you can bet that on American Candidate, no presidential wannabe will be caught on camera with an unattractive five o’clock shadow.

Chris Wright, an editor and writer in the Washington, D.C., area, is pursuing a master's degree in communication, culture and technology at Georgetown University. He previously wrote about Gary Condit and the news narrative and has reported extensively on Survivor.

All-Star Militia

KABUL -- Master Sgt. Brett Favre, leading a Special Forces platoon in a ground assault in the southwestern mountains of Afghanistan, said troops need to "go deep and long" in the search for Al Qaeda members.

JERUSALEM -- Approximately 1,000 troops surround historic Bethlehem awaiting orders from the U.S. command post in Jerusalem.

"I'd like us to roll in and take care of business as soon as possible," says tank commander Kobe Bryant. "Playoffs are supposed to begin in two weeks."

If the names sound familiar, it's because they belong to the elite corps of professional athletes that America loves but also often resents for their salaries, their behavior and their highly dubious status as role models. So wouldn't there be less of that resentment if these multi-millionaire jocks had to do double duty as America's first-strike military force?

I'm not talking about the occasional athlete-turned-soldier, or the way dozens of athletes were drafted during WWII. My suggestion is to draft all pro athletes into the armed services. Sign them up as soon as they're drafted by any team, and send them to basic before spring training. Then, when there's an uprising in Somalia, or intelligence about new terror training camps cropping up again in Afghanistan, let ESPN's rich and famous be the first reserves we send into battle.

Think of it: Would all of us be a little less angry if the sport stars that we pamper, spoil and often deify earned their stripes literally? If these same gifted men had to risk their lives for us (instead of just their anterior cruciate ligaments), wouldn't we be more likely to forgive them for charging money for autographs? For striking in order to protest their $3 million per year "slave" wages? For spitting at umpires? For not running out ground balls while making $126,000 -- a baseball player's average take per game?

A $40,000-a-year over-the-road driver might feel a heap better about Kurt Warner making $5 million if he knew the St. Louis Rams quarterback had to train eight weeks at Fort Bragg and then serve two years on call as a reservist. I, for one, would be less inclined to boo the Texas Rangers' Alex Rodriguez ($25 million a year) for striking out if I knew he slept in a tent in the desert in Yemen last week, safeguarding me at home. And when Philadelphia Eagles' Antonio Freeman runs on an opposing team's field signaling that he's number one, fans just might stand up and salute him in agreement.

But what's in it for the athletes? Everything they want, actually. They'd keep their salaries and their sport celebrity. But they'd also finally have a legitimate claim to the oft-debated moniker of "hero." And many of them, especially kids right out of high school like basketball players Tyson Chandler and Eddy Curry, would be able to get a normal education -- one consisting of skills and values that we currently watch them flaunt or struggle with in their daily travails, as they are splashed all over the front page of the sports section, like a World War II poster featuring boxer Joe Louis.

In the Army (or Navy or Marines), they'll cultivate maturity, teamwork, discipline (let Latrell Sprewell try choking a drill sergeant), manners, selflessness, loyalty and patriotism. They'll even acquire a skill or trade to fall back on if it turns out that in the big leagues they cannot hit the curve ball or catch a pass in traffic.

And those are just the known perquisites for the stars. Those pros who covet the limelight -- your Neon Deions and Reggie Millers -- can move up from the toy department (sports page) to world news in the off season, photographed alongside President Bush and Colin Powell and interviewed by Ashleigh Banfield.

Better yet, your Ray Lewises and Leroy Butlers, who confess that they "love to hit people," and hockey enforcers like Marty McSorley, who simply "love to fight," will finally get to be in an arena where their preoccupations are not only legal, but are rewarded with medals and ribbons.

It'll never happen, you say? Consider that Pat Tillman, a 25-year-old safety for the Arizona Cardinals, did exactly that last spring, giving up three years of fame and riches in the NFL to serve as an elite Ranger in the U.S. Army. While Tillman has avoided all interviews, his coach, Dave McGinnis, said it was a matter of pride, integrity and patriotism for him, someone who is clearly not the typical materialistic pro athlete.

In the big leagues of baseball, football, basketball, hockey, soccer, golf and tennis, there are about 10,000 young, superbly conditioned athletes like Tillman (more than 20,000 if you count the minor leagues) who benefit more than most people from America's free capitalistic society. Who better to man (and wo-man) our Army, Navy, Air Force and Marine first-strike forces in times of trouble?

But Tillman is an extraordinary exception. Since it is a safe bet that the other pros will not follow his example, in order for this plan to work, it will be necessary for Congress to pass a bill that requires any professional sports contract to include a military service provision. Call it a small tax on their American dream of wealth and fame. My guess is that there'd be ample support for such a bill, especially from a citizenry that is asked time and again to subsidize their city's professional sports franchises, their stadia, and especially their skyboxes.

And there is precedent for this arrangement. In its days of military and Olympian supremacy during the Cold War, the U.S.S.R.'s skaters and weight lifters listed their full-time occupations as "soldier." The allegiance they pledged to Mother Russia apparently spilled over into the sporting quest, as they perennially led all nations in gold medals at the summer and winter Olympics. It makes sense, since soldiers moonlighting as athletes (or vice versa) is among the more compatible pairings of pursuits; both share similar goals, skills and the same kind of preparation and training.

So let's reconsider those news lead possibilities: Gen. Bill Walsh's elite infantry men, all of whom could run the 100-yard-dash in less than 10 seconds, pilot the frontal assault against the enemy; Col. Bobby Knight's legion of snipers protects their flank; Maj. Mike Ditka's tank division brings up the rear, roaring ahead with all the pride, fire, might and abandon of a special team covering a kickoff return.

And our country is not only entertained but also protected by a generously paid, super-conditioned, balletically unified, highly motivated strike force that's trained, programmed and, in fact, born to win.

God bless America. Go team!

David McGrath teaches writing and Native American literature at College of DuPage.

The Spectacular War

Now You Know. So goes the title of a recent anthology of online postings written by viewers in response to Steven Spielberg’s Saving Private Ryan (1998). As the title implies, the postings feature testimonials by people who, after seeing the film, claim a new understanding of the sacrifices made by their parents and grandparents and the horrors of modern combat.

Wow, can we say problematic? Let’s bracket, for a moment, those pesky postmodern crises that raise the question of what it means to know anything in the first place. Still, the very notion that by watching a piece of narrative Hollywood filmmaking we can somehow come to know the reality of war is both prevalent and dubious. It suggests that narrative film, due in part to its staggering verisimilitude, can actually come to be seen as a kind of document that can colonize -- and even replace -- historical memory. And when the context of the narrative is war, the stakes become dangerously high.

The vast majority of cinema invents a fiction; it creates an illusion of narrative reality by stringing together a series of sounds and images in familiar, lifelike ways. The war film represents one among many recurring patterns these fictional narratives take. And more so than most film genres, the American war film has historically made precious little pretense about the political subtexts of its narratives, perhaps because politics and war make such inseparable bedfellows. We’ve gone from the rah-rah of Capra to the cynicism of Kubrick, Coppola, Stone and, well, Kubrick again.

The 1990s saw an altogether new take on the war film, one that began in some ways with Steven Spielberg’s Schindler's List (1993), reached its apotheosis with Saving Private Ryan, and has seen various incarnations ranging from Mel Gibson’s Braveheart (1995) to Ridley Scott’s Black Hawk Down (2001). Suddenly, it seems, the filmmaker had positioned himself more as historian than storyteller, a move that has concealed the political subtext of the films’ narratives more deeply beneath a veneer of authenticity. The new American war film seeks more to document the reality of war than to explore the true nature of war. It has replaced artistic exploration with spectacular re-creation and recreation. Schindler’s List began a trend that has robbed the war film of its real artistic significance, a trend that has been broken only by The Thin Red Line (1998).

Schindler’s List marked a radical turning point in the career of Spielberg and the history of the American war film. Spielberg made a name for himself by crafting hyper-stylized action-adventure pictures. His early films are spectacular, larger than life, too fantastic to be real. When he set out to make his Holocaust picture, the wunderkind-turned-mature-director dumped much of his trademark technique. He dispensed with his cranes and his dollies for a stark realism. Here was a World War II/Holocaust film shot in startling black and white with raw, hand-held camera work. The violence was brutal and direct.

Spielberg made no pretense about why he had chosen this approach. The choice to film in black and white came out of a desire to make the film look like genuine footage from the era, and the cinéma vérité stylings were ripped directly from documentary film technique. The result was a film that sought to bring the audience closer to the Holocaust than any fiction film previously produced.

A great deal has already been written about Schindler's List and the creation of pseudo-historical document (see, for example, Robert S. Leventhal's “Romancing the Holocaust, or Hollywood and Horror: Steven Spielberg's Schindler's List”). Most of it comes down to a very simple problem, though: Here was a filmmaker shooting a “true story” war film in a documentary style, the subject of which was genocide, the story of which was redemption, rescue and survival.

There’s a fundamental tension here. If Schindler’s List has colonized our collective memory of World War II and the Holocaust, it has created a history not about death and destruction but rather about survival. It makes for great drama, but as history, it proves a little disingenuous at best, dangerous at worst. After all, a filmmaker can recreate images of death, but that’s far different from providing the audience with the experience of it, with the palpable, tangible, terrifying experience of war. There is a serious gap between the spectacle of war and the emotional and psychological reality of it.

The Spielberg approach to the war film has become rather commonplace since Schindler’s List, from Braveheart and Gladiator (2000) to The Patriot (2000) and Black Hawk Down. Still, the most noteworthy iteration of the war film as historical document is almost inarguably Spielberg’s own Oscar-winning blockbuster, Saving Private Ryan.

Saving Private Ryan purports to give its audience war through the eyes of soldiers. The now famous Omaha Beach sequence that opens the film stands as one of the most technically memorable sequences in film history. It features graphic violence, hand-held camera-work, blood splattering on the camera and a plethora of subjective point-of-view shots that place the viewer directly in the position of the soldier in battle. It in fact begins with the suggestion that the events portrayed are the direct recollection of a war veteran in the present day (it isn’t until the end of the film that Spielberg reveals that the character doing the remembering wasn’t even at Omaha Beach). Spielberg sutures the audience right into the action; the audience lives the battle and experiences the carnage. By the film’s end, it appears as though Spielberg, too, wants to tell his audience: “Now you know.”

But you don’t know. Narrative film isn’t how we should do history at all. Saving Private Ryan, like Schindler’s List before it, exploits the context of war to tell a story of rescue, survival and redemption. The film follows a small group of soldiers selected for an atypical mission: to find and return a young soldier -- the only one of four brothers still believed to be alive -- safely to his mother. The real narrative subtext here is the restoration of family and the recuperation of a nation and its history. The film eschews large political questions or philosophical meditations on the nature of armed combat. War, here, is a given, a necessary fight to save the world from an unspeakable evil. The only question the film seems to leave is whether we, the beneficiaries of this conflict, have earned the sacrifice.

Fortunately for all involved, it ends on a note of reassurance. “Am I a good man?” asks the aged Private Ryan, by way of asking, “Have I earned the sacrifice?” “Of course you are,” his wife reassures him -- and by proxy, us -- thereby leaving no doubt that we are the rightful heirs to this historical legacy. For a piece of art that poses as an authentic recreation of the experience of war, this pat conclusion offers none of the thoughtfulness or political and ideological complexity necessary from anything aspiring to the status of either historical document or artistic exploration. It takes the path of formal rigor and ideological ease, and, in the process, it sets the standard to which almost every other contemporary war film aspires.

Take, as an example, British director Ridley Scott’s Oscar winner, Gladiator, and his latest opus, Black Hawk Down. Scott, always willing to sacrifice content for style, exceeds the spectacular standard for battle scenes set by Saving Private Ryan. Yet here, too, there’s a curious lack of any real insight into the psychology of the soldier or the nature of warfare. Beyond spectacle, he offers heroism and redemption -- and precious little more. Whither the moral complexity of Kubrick’s Paths of Glory (1957) -- or even the call-to-action of Frank Capra’s propaganda series Why We Fight (1943-45)? Has the increasing technical proficiency come at the expense of the serious grappling with the nature of the beast? In short, has the search for what is real in war killed the exploration of what is true in war?

To answer this question, it’s necessary to consider one last 1990s war film: Terrence Malick’s under-appreciated World War II epic, The Thin Red Line. Malick’s film saw wide release a few short months after Saving Private Ryan swept the nation, and, for both critics and audiences, suffered by juxtaposition and comparison. If Malick sought to rival the Spielberg epic for harrowing realism, he most certainly failed. The Thin Red Line is indeed quite convincing; it’s a high-gloss, expensive Hollywood war epic. Yet Malick does not indulge in the same hyper-realism as Spielberg or Scott. He seems instead to intuit the truth that eludes them: Even though the cinema appears capable of imitating or re-creating reality, it is fundamentally an artistic medium that must grapple with reality within a fictional framework.

In this regard, Malick is the heir to a tradition in American film that includes Paths of Glory and reached its artistic and popular height after Vietnam with films like Coppola’s Apocalypse Now (1979), Stone’s Platoon (1986) and Kubrick’s Full Metal Jacket (1987). Those films, like The Thin Red Line, are less concerned with re-creating combat than with the political, psychological and philosophical implications of war.

Malick’s real achievement stands in stark contrast to Spielberg’s vision of war; Malick’s is a war film that searches for meaning in the conflict. The Thin Red Line lacks a clear narrative structure, instead placing emphasis on the psychological experience of war as seen through the eyes of a host of young men. They ponder the power that comes with the ability to kill, the arbitrary distinctions that turn fellow men into enemies, the legitimacy of following and breaking orders, the ethics of euthanizing a wounded comrade with morphine. At the same time, Malick counterbalances these considerations with a clear sense of the political import of the conflict; ranking officers may be disinterested careerists, but they still make it clear that this battle is integral to the success of the war. Most importantly, Malick uses the conflict to explore the very nature of war itself, how it comes to be in the world.

In The Thin Red Line, war thus becomes a set of contradictions. It is a part of nature even as it destroys it. It is made by man even as it shatters him. It comes from God even as it represents God’s absence. War, here, is staged as the Biblical Fall as well as a part of some divine mystery and order. There are no answers here, just ambiguity, doubt and humanity. Perhaps Malick reaches too far, but among his contemporaries, he towers simply by virtue of his willingness to reach out at all.

Film cannot offer the real experience of war. At the end of the day, the cinematic experience is nothing more than a group of people in a dark room watching tricks of light and sound. Steven Spielberg, Ridley Scott, et al. may be magicians working within this idiom, conjuring re-creations more dazzling than we might have dreamed even 15 years ago, but they cannot offer real history or real experience. Confusing cinematic spectacle with the horrors and necessities of armed conflict signals an end to our ability to discern truth. American filmgoers would be well-served if they stopped looking to cinematic technicians to find out what war is really like and instead turned to cinematic storytellers to find out what it truly is.

Christopher Wisniewski is a freelance film writer living in Brooklyn, N.Y.

The Television Ghetto

In June 2001, the Screen Actors Guild released the African American Television Report, a study commissioned by SAG that provided an analysis of both the quantity and quality of African American representation on network television. Conducted over a five-week period in the fall of 1999, the study was authored by Darnell Hunt, a sociology professor and director of the UCLA Center for African American Studies.

Hunt concluded that African American characters on television are largely "ghettoized" by three contributing factors: network placement (most African American-centered shows were limited to UPN and WB), time slot (shows featuring all-black casts aired on Monday and Friday nights only), and show type (blacks were more likely than any other racial group to appear in sitcoms).

The study's findings were widely published, and the resulting public criticism prompted the major networks to promise reform. So now, a year later, are those public discussions and corporate adjustments reflected in the networks' fall offerings? Comparing some of the conclusions of the SAG study with the fall lineup, it seems that not much has changed.

African American Television Report: African Americans are over-represented in prime time and are concentrated in sitcoms.

According to the study, while African Americans comprise about 13 percent of the U.S. population, black characters accounted for nearly 16 percent of the characters on network shows during prime time in 1999. This would, at first glance, seem to be a good thing. But in this case, quantity definitely does not equal quality.

"We compared the distribution of characters with the time they had onscreen and we saw these really troubling patterns with how African American characters were being used," Hunt said in an interview with PopPolitics.

"Most African American characters with the highest screen time were the ones who appeared in African American-oriented situation comedies, those same six or seven shows that accounted for almost half of all African American characters on TV, and happened to be on two networks, primarily UPN and WB, and on two nights a week, Monday and Friday. If you get rid of those two nights a week, then you cut away half of the characters and the lion's share of the characters who have meaningful roles. That's what we meant by ghettoization," Hunt said.

Looking at the fall network schedule, few adjustments have been made. African American characters are still highly concentrated in sitcoms, and the shows in which they appear are still limited to certain nights and networks, although the configurations have changed, as we'll see below.

AATR: African Americans are underrepresented on FOX and NBC.

Hunt and his research team noted that "less than 10 percent of characters appearing on FOX and about 11 percent of those on NBC were African American," and most of these characters weren't central to the programs' narratives.

A year later, NBC and FOX are doing slightly better when it comes to giving black characters more substantial roles. NBC's programming is still overwhelmingly white, but at least one of the network's fall sitcoms, Hidden Hills, will feature a black couple in its group of suburban characters. (Only the white couple, however, is pictured on the NBC site.)

FOX has kept the critical and popular favorite The Bernie Mac Show as part of its Wednesday night lineup and will add Cedric the Entertainer Presents..., a comedy variety show starring one of the Original Kings of Comedy, in the half-hour directly following Bernie Mac. It is also slated to introduce Wanda At Large, a midseason comedy about an African American TV morning news show correspondent. 

ABC appears to be in stasis, although My Wife and Kids, starring Damon Wayans, is returning. CBS is introducing Robbery Homicide Division, which includes two African American detectives among its main characters, and the heavily advertised new crime drama Hack, which features an African American detective, Marcellus Washington (Andre Braugher), who helps a white ex-cop, Mike Olshansky (David Morse), with his vigilante crimefighting. 

While Braugher's role will allow him a good deal of screen time, the role of a black sidekick is still all too familiar. As BlackVoices.com columnist Janice Rhoshalle Littlejohn observed in a recent column, "[M]ost African-Americans can be found on the sidelines, either as helpmate to the white star ... or co-starring in an ensemble cast."

CBS's reality shows, meanwhile, have generally included token black contestants -- one black male, one black female on both Survivor and Big Brother. The new Survivor: Thailand is nothing if not consistent.

The WB (formerly one of the networks on which African American-oriented sitcoms were concentrated) is doing worse. The network will no longer broadcast The Hughleys, a sitcom featuring an all-black cast. In fact, it won't offer any programs featuring African American characters in lead roles, even though WB was once known for introducing programs like The Steve Harvey Show and The Jamie Foxx Show. 

Aside from ER repeats, the rest of WB's lineup -- from original dramatic series like Smallville and Gilmore Girls to syndicated broadcasts of familiar shows like Friends and The West Wing -- adds little diversity. 

AATR: Prime time scheduling remains largely segregated.

The SAG report noted that shows airing on Monday and Friday nights "accounted for more than half of all African American characters appearing in prime time."

The fall network schedule indicates that things have largely remained the same. UPN, for example, is airing all of its African American-oriented sitcoms (The Parkers, One on One, Girlfriends and Half and Half) in a two-hour block on Monday nights. Those shows account for half of the network's primetime offerings (UPN only needs eight shows to fill three nights; Thursday is devoted to WWE Smackdown and Friday is movie night).

But is this concentration ideal for viewers? Not quite, according to Hunt.

"In a way, this [ghettoization] is creating this segregated television audience that works against this notion of living in an integrated society where everyone's sort of living together with one another and understanding and appreciating the diversity of what we have here," Hunt said.

"Instead, there's this assumption made by programmers and people who are putting these shows together that somehow we have to niche market to an individual group, and we can't come together and enjoy the same things," he added.

And in a move that's drawn criticism from viewers and from the shows' stars, ABC and FOX have placed their African American family sitcoms, My Wife and Kids and The Bernie Mac Show, respectively, in direct competition with each other -- at 8 p.m. on Wednesdays. This programming decision forces viewers to choose between two critically acclaimed black sitcoms, or at least requires them to remember to set their VCR.

Getting out of the TV ghetto

In the SAG study, Hunt describes two types of programming that are relevant to this discussion: "resourceful" programming -- shows that manage to field a diverse cast and do well; and "missed opportunity" -- shows that didn't even try to diversify.

"There are a lot of shows on TV that do well, are in the top 10, and have diverse casts, and we highlight those in our report," Hunt said. "The Practice was an example of a show that was highly rated in 1999 and had a very diverse cast and had the African American characters in important, lead roles in many episodes, where they were the hero or heroine of the episode and got a lot of screen time and the show did extremely well in the ratings."

"Lots of shows could do that, but they don't even try," Hunt continued. "Or they have this unspoken rule-of-thumb that says 'the more people of color we put on the show, the less marketable that show will be to the broader (i.e. white) audience,' which I think is faulty logic, and there are plenty of counter-examples to disprove the assumption."

Indeed, shows like The Practice raise the question of why other dramas and comedies have not integrated African American characters into their own narratives. It's hard to understand why, for example, Joey and Chandler couldn't have at least one close black Friend as a series regular; or why, in cosmopolitan New York City, the ladies of Sex and the City rarely date African American men and have no African American girlfriends or coworkers; or why we don't see more complex African American characters on TV -- characters like Six Feet Under's Keith, a black, gay cop with some anger management issues, whose portrayal is one of the main reasons viewers tune into the show, or any one of several characters from The Wire.

Hunt notes that achieving diversity requires more than just getting African American actors on screen at any given moment -- though that does matter. And it requires more than "creating shows for black people" and then relegating those shows to secondary networks. Achieving programming diversity is about creating characters and scenarios in which black characters are portrayed as more than the villain/gangsta or comic sidekick/helper or incidental neighbor. In other words, it's about creating representations that reflect, in some meaningful way, the culture at large.

Roadside Distraction

His nose must be over a foot wide at its base. True, I did not climb the billboard and measure with a ruler, but I can vouch that the Indian’s nose is broader than each of his arms and legs. This is no 1950s relic staring down at me. The modern billboard towers each day over drivers on Interstate 294 in Alsip, 20 miles outside Chicago: a tri-color image of an Indian aligned to the left of a banner for Arrow Chevrolet.

Get it? Arrow … Indian?

Here’s what drivers see: big nose, little legs, dark skin, headband and single feather. He wears a necklace strung from animal teeth and a loincloth. He displays the same bow-and-arrow combative behavior that Hollywood used to feature.

The same buffoonish character is stenciled on Arrow’s showroom windows in Midlothian, another Chicago suburb, and on Arrow’s Web site, ostensibly to boost recognition of its company name.

What Arrow has done is essentially resurrect the cigar store Indian to shill for Chevrolets. Some Americans may remember the wooden statues of Indians placed in front of drug stores and tobacco emporiums. They were used to advertise cigarettes and cigars, a practice vaguely related to some Indian tribes’ use of pipes and tobacco in sacred ceremonies.

Would objecting to Arrow’s ad campaign be just another hysterical charge of political incorrectness? After all, what’s the harm?

“It’s hurtful, dehumanizing,” says Arieahn Matamonasa, a psychologist and professor of contemporary Native American issues at DePaul University.

Matamonasa made the dealership aware of its display ad’s offensive nature as early as four years ago, when she commissioned her students to write research papers on the dangers of stereotyping. Each year since then, several of her students have tackled the subject of the Chevy Indian and have written to the dealership’s management.

“They’ve ignored all of our letters,” says Matamonasa, who has crow-black hair that hangs well below her shoulders, and the straight-on, relentless gaze that reminds one of Natalie Wood’s character in The Searchers, except that Matamonasa is, indeed, of Lakota and Menominee descent.

She is concerned particularly about the social and potentially violent consequences of stereotyping, recalling the cartoons Julius Streicher published prior to and during WWII that gave public “permission” for dehumanizing and exterminating Jews.

Matamonasa compares the Chevy Indian caricature to Chief Wahoo, the mascot of the Cleveland Indians baseball team, and to the “Frito Bandito,” the cartoon character eventually retired, under pressure, by Frito Lay.

“He has a feather tucked in a headband, and he has no shirt. He looks ridiculous,” she told me. “And we are such a small group that people don’t have real life experiences with Indian people. So for them, he represents the whole culture.

“When people see, on the other hand, something like the leprechaun of the Fighting Irish of Notre Dame, they know enough of the Irish culture that this mythic figure is not representative.”

As a cartoon image, this is not the characteristically touted noble, sacred or serious Indian stereotype, a la the University of Illinois’s Chief Illiniwek, the school's mascot. Neither is it a frolicsome or supposedly innocent depiction, like Disney’s Pocahontas. The Arrow Chevy Indian is, in fact, more akin to the TV series Pow Wow the Indian Boy meets F-Troop -- which may speak to the age as well as to the cynicism of the dealership’s management.

Perhaps because the company would be hard-pressed to claim, as does the U. of I. or the Atlanta Braves, that they are “honoring” Indians with their billboard and Web site art, they have decided not to talk at all: The general manager of Arrow would not return my calls. So I can only speculate about how they would refute Matamonasa’s challenge to stop using the cartoon character.

One of the usual arguments, that “no one seems to mind,” is baseless considering Matamonasa’s protests as well as resolutions passed by state governments condemning the use of Indian logos and mascots. A California state assemblywoman recently proposed a bill banning the use of Indian logos and mascots in public institutions, though it failed to muster enough support and was defeated in May.

More such legislative crusades are imminent, however, as Indian political and economic influence grows as a result of the proliferation and success of casinos. According to a recent survey by Indian Country Today, the national Indian weekly, “81 percent of respondents indicated use of American Indian names, symbols and mascots are predominantly offensive and deeply disparaging to Native Americans.”

Additionally, the logo defenders’ claim to their right of free speech sounds similarly fallacious in light of the recent recommendation issued by the United States Civil Rights Commission that calls for an end to the use of Native American images and team names by non-Native schools. The recommendation, which was released in April 2001, states: “These references, whether mascots … logos, or names, are disrespectful and offensive to American Indians and others who are offended by such stereotyping.”

The call was accompanied by the more telling warning that the mascots and logos “… may violate anti-discrimination laws.” In legal precedent, the point at which discrimination against others begins is where free speech ends.

Finally, though some continue to maintain that Indian iconography is intended to “honor” Indians, what community has ever felt sincerely “honored” by a nation or group that vanquished them? Would present day Jews feel honored by Germans drawing caricatures of yarmulke-wearing Jews to sell cars or watches? Would African Americans feel “honored” if “Spears Stereo” used native African warriors wielding spears to hawk its CD players?

Professor Matamonasa said that following one of her presentations about Native American culture, a child came up to her and proclaimed with enthusiasm, “I believe in Indians,” evoking the famous chant of faith in fairies from the movie Peter Pan -- as if Native Americans were mythical beings, though they in fact make up close to 1 percent of the U.S. population.

“If people knew more about Native Americans, it [the Arrow cartoon character] might not be as damaging,” she says. But our culture does not provide a diversity of representations, nor are students well educated about Native American history and current cultural issues. Images such as the Arrow Chevrolet Indian end up being larger than life -- in this sense literally as well as figuratively.

What’s clear is that Native Americans are offended. What’s also clear is that the logos and mascots will continue to appear until a lot more non-Natives take offense.

David McGrath teaches writing and Native American literature at College of DuPage. His essays and short stories have been published in The Chicago Reader, Education Digest, Chicago Tribune and Artful Dodge. His short story "Broken Wing" was nominated for the Pushcart Award for Fiction, and he recently published his first novel, Siege at Ojibwa.

Cannibal Culture

Hey, Mr. Lemire! Do you remember Alf?

This query, posed to me during my graduate study at the University of Michigan, came from two of my freshmen students. They approached me after class, a section of English comp.

The reference was to the 1980s TV sitcom Alf, which I knew (without having seen an entire episode) was about an American middle-class family who adopt an alien, played by a puppet. Why were my students asking about Alf when we had spent the last hour discussing Lester Bangs’ essay on the death of Elvis Presley? Because the late rock critic (known to my students, if at all, as a name speed-dropped in an R.E.M. song) had asserted that despite Elvis’s decline, no other entertainer existed in 1977 whom people would wait in the rain to see, just so they could say they had seen him or her.

I asked my students: Could they think of a living entertainer now, in 1996, of whom that could be said?

One female student, Michigan-born, proffered the name of native son Bob Seger. Not everyone in class knew Bob Seger (much to the female student’s incredulity), so I asked if people knew the song "Old Time Rock ‘N’ Roll," which got a few more heads nodding. Seger, I explained, sings that song.

So when those two after-class students asked if I remembered Alf and I said yes, they said: "Remember that episode of Alf where he was dancing in his underwear to ‘Old Time Rock ‘N’ Roll’? That was hilarious!"

"I never saw that episode," I said, "but it sounds like a spoof of Tom Cruise in Risky Business."

Both students looked at me blankly.

"You guys never saw Risky Business with Tom Cruise?" I asked. (Neither had I, actually; I had seen only Bob Seger’s MTV video featuring that famous -- or so I thought -- film clip.)

No, the two students hadn’t seen Risky Business. Later, I did the math: These boys were 4 when Risky Business came out in 1983, and considering I had never seen Tom Cruise’s breakout movie replayed on TV, I had no grounds to expect that they, or their classmates, would know it.

My class hadn’t understood, either, Lester Bangs’ essay reference to Donna Summer, because they were all born in the aftermath of the general consensus that Disco Sucked. No one in class even knew the name Donna Summer.

I don’t adduce this anecdote as an example of pop culture ignorance in young people, or to point up what slim pop culture creds I, at 29, had over them. After all, that same year, in one of my graduate English classes, a fellow student employed the phrase "the stuff that dreams are made of," and I, eager to score points, noted aloud the phrase’s original source: The Maltese Falcon.

When a fellow student did me the courtesy of correcting me, naming the original original source -- something called A Midsummer Night’s Dream -- I reflected that despite my being known by my friends as a Shakespeare fan, I had graduated Boston College as an English major without ever having taken a Shakespeare class (or, for that matter, ever having seen more than that one clip from The Maltese Falcon).

We are, all of us, culturally ignorant to some degree and in some respect, whether the ignorance concerns the kind of literacy endorsed by E.D. Hirsch, Jr., or the pop culture kind. Not only are there works we do not know; we often do not know that the works we do know borrow, were inspired by, pay homage to, reference, appropriate or downright steal from works that we may or may not know.

When I was an undergraduate, the push was on in the American culture wars for young people to learn as many "cultural terms" (read: high culture terms) as possible, to correct their shameful cultural illiteracy. But the fact of the matter is that consumers of pop culture need a higher degree of cultural literacy than even the highbrows, since pop culture is the greater cannibal of other cultural products -- and, of course, of itself. 

The problem is that there is no established mechanism by which people -- young as my students or as old as I am -- can see and learn how pop culture borrows, steals, references, pays homage or is inspired by other sources (hence, the above situation with Bob Seger and Alf). And the more profound question -- Is all the ripping and riffing just cheap plagiarism or one-upmanship, or is it a central dynamic of the creative process? -- is rarely discussed. 

It’s hard, admittedly, for adults to resist the urge to berate younger people for their ignorance, even if it’s over a reference to pop culture. The MTV VJ Kennedy, on a retrospective program on that channel, cited Martin Landau as her worst interview: When, at the 1996 Hollywood premiere of Mission: Impossible, she innocently asked what the aged actor was doing there, he replied that, well, of course, he was in the original TV show -- but not only did Kennedy not know Landau was in the TV show Mission: Impossible, she didn’t know Mission: Impossible had been a TV show. She made the mistake of blurting this out to Landau, who responded with an ugly verbal dressing-down right there on the spot.

Maybe the problem is with MTV. Consider the story (probably apocryphal) about MTV correspondent Tabitha Soren, covering President Bill Clinton: When Clinton, an amateur saxophone player, referred in a press conference to Thelonious Monk, Soren allegedly turned to an older reporter next to her and asked, "Who’s the loneliest monk?"

Hilarious, right?

Granted, one can hardly chalk up Kennedy’s ignorance to her youth: Mission: Impossible was revived for network TV in 1988 (when Kennedy was 16), and the show ran for two years. Blame, perhaps, Kennedy’s poor research skills -- a furious Landau certainly did -- but it’s scarcely reflective of intelligence for a young Oregonian woman who grew up wanting to be an orthodontist not to know who starred in a TV espionage drama. Kennedy, like other TV talking heads, relies on a staff to supply her with such knowledge.

The fact is that if no one tells "these kids today" who Thelonious Monk is, or who came first, Shakespeare or Bogart, or even that anything existed before the year they were born, the kids will have no way of knowing and should not be blamed for not knowing -- any more than I, at 34, should be blamed for not having intimate knowledge of the lyrics of Linkin Park.

Apart from reading and catching connections willy-nilly, the best approach is for one generation to educate the other, and the place to start is to help young people understand the creative process, specifically the ways in which an artist takes material from a preexisting source. A spectrum of terms describes this taking: reference, nod, tribute, homage, emulation, appropriation, inspiration, adaptation, rip-off, plagiarism, copyright infringement, theft.

Let young people decide for themselves what term describes any connection between Marilyn Manson and someone named Alice Cooper. (We might also mention to young people, in case they didn’t know, that Manson’s hit song, "Sweet Dreams (Are Made of This)," was a hit for another band, 12 years earlier.)

Rampant sampling in pop and hip-hop music makes for a livelier debate and exchange (as opposed to a flat-out argument): Does it or should it matter to us that "I’ll Be Missing You" by Puff Daddy (now P. Diddy), the 1997 tribute to the late Notorious B.I.G., samples "Every Breath You Take" by The Police? Does knowing the reference exists mean anything? Is it homage or indicative of a dearth of creativity?

These are questions that people of different generations can take up to their mutual benefit, hopefully without the conversation becoming acrimonious, with each generation defending its products of culture as "better" or "more original" than the other’s.

A perhaps less controversial conversation would explore the ways in which artists use their media to talk to each other, to wink or nod, to tell a joke, or just to have fun. This exchange -- whether it be between generations or genres -- reflects much more than artistic thievery or competition; it enhances the creative process as well as the experience of culture.

Consider the one television program that contains more pop culture references per second than any other: The Simpsons. I’ve encountered more than a few devotees who breathlessly extol the show’s "hundreds" of references. (Ask them to name five in under five minutes, and they’re stymied.) A few examples come to mind: Mr. Burns "re-educating" the Simpsons’ dog Santa’s Little Helper by subjecting him, via the Clockwork Orange treatment of forced-open eyeballs, to disturbing films (in this case, Lyndon Johnson holding up a dog by its ears); or Bart, distributing Thai take-out menus as a ninja, mimicking The Matrix with that movie’s soundtrack behind him; or Marge reminding Homer that he loved the movie Rashomon, which elicits in Homer the rejoinder, "That’s not how I remember it."

If Simpsons fans derive pleasure from recognizing these references, it is based, it seems to me, on the knowledge that they themselves "get it" and the assumption that the majority of other viewers don’t. We’re back in class, and we’ve just earned a gold star.

For those of us who don’t get it, it’s a "Ballad of the Thin Man" moment: Something is happening here, but we don’t know what it is. (In case you don’t get that, it’s a Bob Dylan song from his 1965 album Highway 61 Revisited). No matter what our age, suddenly we’re the young person, and we need someone to educate us.

The makers of The Simpsons, surely, aren’t nesting these references to elevate the show to any high cultural level (the show’s humor relies more on characters’ stupidity than on the intelligence of the viewer), or in hopes that viewers will rent Rashomon in order to get the joke. If anything, the jokes to "get" are ones the show’s creators insert for their own pleasure, even if it means suspending disbelief that Homer would ever watch Rashomon.

Movies, especially, are a medium through which filmmakers talk to one another, or at least nod and wink. Clearly, Brian De Palma is not trying to pull a creative fast one in his 1987 film The Untouchables with a train station shoot-out sequence that is inspired by (or lifted from, whatever you prefer) Eisenstein and Aleksandrov’s 1925 film Battleship Potemkin. De Palma cannot be thinking, "No one’s seen Potemkin or will ever get this reference -- I’ll be lauded as a genius." To be sure, those of us who recognize the allusion or imitation are meant to smile.

Think of the opportunities with young people to craft special video rentals: Friday night, it’s Kurosawa’s 1954 film The Seven Samurai; Saturday night, it’s John Sturges’ The Magnificent Seven, 1960; and finally, Sunday night, George Lucas’s 1977 hit, Star Wars.

If you rent the Meg Ryan-Nicolas Cage movie City of Angels and your resident teenager finds it to be cloyingly sentimental claptrap -- or even if they don’t -- it’s easy to say, "Well, if you did/didn’t like that, what do you think of this?" Then cue up Wim Wenders’ Wings of Desire. Or rent La Femme Nikita and compare it to Point of No Return. Or rent Cruel Intentions with Dangerous Liaisons.

The young person may dislike both films, but at least he or she stands to learn something about creativity and culture (and the lesson learned depends on the teacher): that there is nothing original under the sun; that cultural products and their creators need not be burdened by the task of being "original"; that inspiration or referencing imbues even the most serious cultural product with a welcome element of play; or that appreciating a cultural product can be a first step in being led to other, related products.

Once young people begin to see that art is not just a game of connect-the-dots, but one in which motifs, characters, images and symbols recur and are revived, the conversation can turn from entertainment culture to political culture. For example, in October 2000, to bolster George W. Bush’s presidential campaign, the Texas-based nonprofit Aretino Industries paid for a remake of the infamous "Daisy" TV commercial, in which the innocent image of a white girl picking petals off a daisy is followed by that of a nuclear bomb mushroom cloud. Originally broadcast in 1964 by the Lyndon Johnson campaign, the commercial was shown only once and yet lingers as a notorious example of the so-called "attack ad" -- for those who remember it.

The conversation will go on, to larger trends in politics that recur and are revived, to arguments and points of view that come up in different guises and from different voices, and to cultural moments that mirror or echo ones before them. Attention to and inquiry into "first uses" make people bigger and broader consumers of art and cultural products, and, to reference another source, "It’s a good thing."

It’s like Groucho Marx said: An old joke is never old if you’ve never heard it before, and the younger you are, the fewer jokes you know.

Tim Lemire is a Boston-based writer and an MFA graduate in creative writing (fiction) from the University of Michigan.

Consumers and Creators

Recently I've been wondering: When exactly did I become a fangurl?

Was it at the Multiple Alternative Realities Convention last month, when my friends and I found ourselves whispering answers to "Buffy/Angel Jeopardy" questions during a game session, or later that evening, when I was dressed up as Darth!Willow, dancing with a group of vampires for the evening, to a set dj'd by Dr. Demento?

Or was it last fall, when I obsessed about what to wear to see filmmaker Jim Jarmusch speak at our local contemporary arts center? Or last summer, when my drag king friends and I danced onstage at a club in New Orleans, lip-synching and busting boyband-style moves during a homoerotic performance of 'NSync's "Bye Bye Bye"?

I don't know when it happened exactly, just that it did, and now I've found myself, at 26, involved in more fandoms than I care to count.

In a recent New York Times article about potential copyright violation by Star Wars fans who digitally revise George Lucas' films, Jim Ward, Lucasfilm's vice president for marketing, offered his company's take on fandom: "We've been very clear all along on where we draw the line," he said. "We love our fans. We want them to have fun. But if in fact somebody is using our characters to create a story unto itself, that's not in the spirit of what we think fandom is about. Fandom is about celebrating the story the way it is."

Ward was referring to fan edits of Star Wars circulating online, and to the question of which of these the company deems appropriate and which violate Lucasfilm's copyright. The sort of fan behavior Ward supports is the fandom of appreciation and consumption. It's a fandom that's pleasurable for many, one that's accessible if you can afford a movie ticket or CD or a cable hookup.

While this definition makes sense for Lucasfilm -- or for just about any large corporate production unit interested in selling its film, featured celebrity, band or television show and then protecting its interest by controlling use and distribution of the product -- it's a limiting interpretation for most of the popcult-obsessed fans I know.

The definition of fandom is a tricky one. If you regularly watch a particular TV show each week, does that make you a fan? Or is it more than that (taping the show, discussing it with others, re-viewing it, quoting dialogue, taking screencap photos to post on your Web site, which will then be the basis for others' bad fan art)? Is it standing in line for a ridiculous length of time to see a film's opening, or working with digital technology to create a version of the same film other fans may enjoy more?

It's my belief that fandom exists along a broad spectrum -- ranging from fans whose idea of participation is sitting back and enjoying a show's broadcast, to those who read spoilers and speculate about a series' plots in online forums weeks in advance, to those who put their creative energies to work writing fan fiction. These writings, which are based on a show/band/movie/etc. and introduce alternate storylines and/or character relations, are then posted online (or, if you're old-school, distributed via fanzines).

While I don't want to create a hierarchy of fan behavior by suggesting that it's better to be one sort of fan than another, I do believe that those on the further-out end of the fan spectrum are the most interesting, because while they're actively consuming popcult product, they're also creating it. Instead of solely behaving in the appropriate, good-Lucasfilm-fan-way (consuming, collecting, appreciating), these fans are putting their consumption to work, making their preferred cultural product meaningful in different contexts and mediums.

If being a bad, obsessive fan means learning how to use various technologies in new ways for your own ends, such as digitally editing videos and manipulating images in Photoshop, creating and maintaining fan Web sites, building virtual communities around shared interests, or exerting creative agency in any number of other ways, then there are millions of "bad fans," operating online and off -- and they're all the more informed and engaged because of it.

At present, I'm somewhere in the middle of the fandom-spectrum, operating as a purely appreciative consumer in some cases, and demonstrating a more rabid obsession in others. There are TV shows I sit and watch each week like a normal person (watching "Looking for Love: Bachelorettes in Alaska" counts as research for a cultural critic!), but then there's also "Buffy the Vampire Slayer."

Three years ago, I started watching the show, alone in my apartment, and didn't tell friends about my viewing. I was a cultural studies grad student curious about the representation of the show's young lesbian couple, Willow and Tara. I didn't realize I was a BTVS addict until the next fall, when I found myself living in New Orleans without cable TV, begging a Tulane University faculty member who I'd heard was doing scholarly work on the show to lend me her tapes of the episodes I'd missed during the first few weeks of the new season.

Then came the cable subscription I couldn't really afford on my salary. And then, a few months later, I was at Tower Records, scoping out the BTVS official fan magazine and the BTVS lunchboxes and memorabilia. Shortly thereafter, I got together with a college friend whose devotion to BTVS fandom inspired and amazed me: She was co-writing and co-presenting her scholarly work about BTVS's biggest online fan forum, The Bronze, started during the show's WB days and since relaunched at UPN. She was writing her own fanfic. She was a co-editor at a hip, snarky, girlie pop culture site. She and her cohort introduced me to the world of spoilers and online discussion about the show. And she made me understand that what had seemed like crazy-obsessive fan behavior was really OK, because while it is obsessive, it's also intellectually and socially engaging, and a whole lot of fun.

I still have moments of shame. When I've found myself searching online for Spike/Giles "slash," fanfic in which characters are re-written in a romantic or sexual way, usually in same-sex pairs (see cultural critic Constance Penley's book Nasa/Trek for some of the best slash theorizing around), or when I do things like derail my household's Thanksgiving plans so that we can tape the episodes of an FX BTVS marathon, I've had to pause and ask myself at what point fandom becomes extreme. But there have also been moments of pride.

I love that this past Christmas all of my roommates exchanged gifts that were BTVS-related. Some of them we bought (the boardgame, the Sunnydale High Yearbook, several volumes of Buffy-inspired comics), but others we made (bedazzled t-shirts with "Slayerette" and "Spike" ironed and glittered across their fronts, CDs of this season's musical episode, games we've devised to play around our burgeoning vampire obsession). Fandom became a way to express our collective participation and to acknowledge each other's relationship to the show and its characters.

In our house, BTVS is the only show we all watch together; it's the only weekly event guaranteed to bring us all together on the couch to watch, critique, squeal and moan, and then later take what we've seen and interpret it, write about it, co-opt it and appropriate it for our own use. And this is the part of fandom that I think is the most valuable, the part that Ward misses in his definition: In this particular mode, it's more fun to admit our obsession and put it to some creative use than it is to watch passively from our spot on the couch.

Alana Kumbier is the television critic for PopPolitics.com, where this article originally appeared.

News Anchor or Game Show Host?

What's the difference between Tom Brokaw and Tom Bergeron? Not much.

Brian Williams was just anointed as Brokaw's replacement for when the NBC anchor steps down after the 2004 election. As obvious a choice as the clean-cut, straight-arrow Williams would seem to be, there is some hand-wringing about this decision among nostalgic types.

You see, Williams will be the first anchorman of a network news program who does not have gritty reporting experience (covering the Clinton White House doesn’t command the same respect as dodging bullets or bombs). Americans instead know him best as the perpetually tan, behind-the-desk Brokaw substitute with the unnervingly perfect hair.

But maybe the worry warts would feel better if, instead of "anchorman," the new title for the man who comes to us from MSNBC's The News with Brian Williams were "News Host." After all, isn't this what all anchormen have become?

Broadcast news has changed over the years from the days of craggy-faced, hard-bitten, seen-it-all ex-reporters earning their well-deserved semi-retirements while sitting at desks in front of multiple clocks. Suave Peter Jennings still sits at a desk, but Brokaw now stands in front of groovy video screens, and Dan Rather, certainly the most unpredictable of the Big Three, seems to alternate between sitting and standing. But sitting or standing, they are called "anchormen" because they stay in the studio, the nucleus of the broadcast, and the reports and reporters revolve around them like trench-coated electrons.

All three of these men have reporting backgrounds. Is it necessary for the jobs they have now? Not really. But the necessary traits for a good News Host are the same traits you expect to find in a good Game Show Host:

1. Both should be able to read smoothly from teleprompters.

2. Both ask questions that hard-working underpaid writers research and prepare for them.

3. Both should be reasonably attractive in an accessible, non-threatening way.

4. Both should project a well-informed and caring image.

5. Both should inspire trust, and be worthy of being invited into our living rooms on a daily basis.

The comparison to Game Show Host is no exaggeration. Tom Bergeron, the host of Hollywood Squares, was recently mentioned as a possible replacement for the as-yet-unreplaced Bryant Gumbel, former co-anchor of CBS' The Early Show. So was Meredith Vieira, formerly of the news magazine 60 Minutes, and now of that Cosmo-of-the-airwaves, The View. Bergeron elected to stay on his game show, and Vieira decided to add a game show to her resume -- she will soon be host of the syndicated version of Who Wants to Be a Millionaire.

Thirty years ago, "You want Art Fleming to do the NEWS?" would have bellowed through the hallowed halls where dignified, avuncular, well-informed types like Cronkite once roamed. But news is now entertainment, and entertainment needs to be hosted, not anchored.

For proof, look no further than the "branding" of cable news "stars" like MSNBC's Chris Matthews (has the expression "Let's play Hardball!" been trademarked yet?) and Ashleigh Banfield, who, according to her promo featuring the Michelle Branch song “Everywhere,” is, well, everywhere. Hey, it's important to know where to find Banfield -- you wouldn't want your email questions about her hair and eyeglass frames to end up in Fox News' John Gibson's mailbox.

If Bergeron and Vieira can seamlessly switch back and forth between glad-handing game show schmoozer and Trusted Source of News, TV news executives must think it's acceptable for anchormen and game show hosts to be considered interchangeable. The message is that integrity, knowledge of public policy, and a sense of responsibility to inform the public take a back seat to having a pleasant face and being conversational.

So why shouldn't Dan Rather consider taking the helm of Win, Lose, or Draw when he steps down from the CBS Evening News? His homespun ad-libs would fit in perfectly when interviewing a homemaker from Ventura, Calif., who likes to yell, "Big Money! Big Money!"

We seem to be evolving toward one generic Host Figure who will handle the delivery of all information. "The Mets beat the Dodgers last night ... there's a cold front coming in ... 6,000 people were massacred yesterday ... how would you like to win a new car?" Is there any reason to have four separate people to do that? All we need is someone so genial and adaptable, he can be plugged into all of those roles, like an evolutionary Ubermensch.

I cast my vote for Al Roker.

Karen Lurie is a writer living in New York City. She contributes to HoleCity, Modern Humorist and Flak Magazine.

Orientalist Kitsch

Desperate never looks good on a preppie.

Clothing companies like Gap, J. Crew and Abercrombie & Fitch are facing a crisis of cultural and financial capital. Brands that once dominated the landscape of cool have seen their stocks and sales slump to embarrassing lows over the past few years.

But when Ohio-based Abercrombie & Fitch released a line of T-shirts in April depicting Chinese laundry workers and smiling Buddhas, captioned by groan-worthy puns, the company was launched into the headlines, thanks to media-savvy Asian American college students protesting the reproduction of century-old caricatures.

Activists criticized the T-shirts for denigrating Asian Americans and trivializing "an entire religion and philosophy." Even as company spokespersons claimed innocence and regret, activists staged protests outside the retailers' stores, organized boycotts across e-mail lists, and demanded "respect" for Asian Americans as a lucrative market. I have to confess I was hardly shocked by the "Get Your Buddha on the Floor" or "Wok-n-Bowl, Let the Good Times Roll -- Chinese Food and Bowling" T-shirts, which are only the latest splashes in the tidal wave of kitsch merchandising and "orientalia" that's been crowding store shelves for years now.

But what this particular instance does reveal is that the demand for realistic and "positive" images of racial and ethnic communities in popular culture is often an inadequate response -- one that fails to address the other, often more complicated messages involved in the transformation of racial caricatures into products and the meaning of the way they are packaged.

To accuse the company of misrepresenting Chinese or Asian men, culture, etc. or of "misleading [consumers] as to what Asian people are" does not suffice. The company's now infamous "Wong Brothers Laundry Service -- Two Wongs Can Make It White" T-shirt is not meant to function as an "accurate" representation of Chinese masculinity. (Although the correlation between "white" and "right" in the pun is both banal and striking.) The clothiers acknowledge these are not realistic images.

Protestors also claimed that Abercrombie views Asian Americans either as laundry workers or (as one angry editorial writer put it) as a "mass of consumers [so] full of self-hate and self-loathing that they will latch onto any negative stereotype of themselves and parade it around town like a yellow minstrel." But these arguments imply that images can belong to only one of two categories: stereotypical (negative) and realistic (positive); and Asian Americans themselves can only be either authentic (protesting) or assimilated (buying). The criticism that these T-shirts "sell Asian self-hate and shame," or that Asian Americans who might buy them are "whitewashed," ignores the possibilities for other kinds of consumers, images or interpretations.

Rub My Belly Buddha and Art's Auto Body tees are better understood within the context of the rise of kitsch as the hallmark of "cool." They mark the emergence of what could be termed "orientalist kitsch," in which a racist caricature is resurrected and marketed as hip or trendy. But we need to recognize that the use of a racist image a century ago does not have the same meaning as the use of the same image -- or a similar one based on the same racial stereotype -- today. If we understand these images as kitsch, we can understand them as a function of marketing strategies such as parody and irony.

The Abercrombie PR honchos claimed these T-shirts were meant to be funny. Ironic, right? But in this instance, irony is conservative in its operation. It implies that if a long enough view is taken, all history becomes insignificant, including the history of oppression. The Abercrombie & Fitch executives do not mean to reinstate turn-of-the-century Chinese exclusion, legal discrimination or even the emasculation of Chinese men, as much as dismiss these histories as meaningless today.

The same effect is at work in the recycling of revolutionary iconography or heavy metal tour T-shirts. South American guerillas, heavy metal progenitors and Chinese laundry workers are divorced from history and transformed into commodities. Of course, what distinguishes the "Two Wongs" T-shirt from one featuring Che Guevara or Judas Priest is that it is an image of a racist stereotype. The gesture skitters between declaring a "postracist" state and resurrecting old ghosts and bad memories.

Nevertheless, this transformation process (turning a racist caricature into kitsch) is of a different nature than the faithful re-creation of racist images. The T-shirts derive their cultural capital precisely from their assigned "bad taste" -- witness their resale for as much as $250 on eBay as collectors' items. Nor has Abercrombie & Fitch been damaged by the controversy and the reams of bad publicity. Shares in Abercrombie sold at $33.30, a 52-week high, on April 18, the day the company apologized and pulled the T-shirts from store shelves.

In the language of kitsch, bad taste is a valuable quality that sells to the hip, urban consumer of tiki bars, wobbly-headed dashboard dogs, mullet paraphernalia and Buddha T-shirts. And because these items are typed as trashy or low class -- the (sometimes faux) detritus of thrift stores and garage sales -- their purchase as kitsch is accompanied by the necessary wink, which distinguishes the wearers of the T-shirts from those who, say, might really work at Art's Auto Body. In the case of Abercrombie, it's a wink with no concern for memory or history.

Contrary to its claims, Abercrombie & Fitch clothing is hardly funny, daring or even interesting. Have you seen the clothes? Strictly dullsville. I would rather wear plastic garbage bags and orange legwarmers than sport the sartorial remnants of Reagan-era preppie. Its reinvention of the elite classics -- polo shirts, chinos, whatever -- has for years balanced the brand image on the sensibility of a privileged whiteness steeped in hedonism.

The thick quarterly catalogs feature luscious models -- many recruited from college campuses and most of the Anglo-Saxon variety -- frolicking nude or lounging in stately dorm rooms and lush football fields in suggestive (and often homoerotic) poses. This provocative approach has drawn the censure of cultural conservatives. The rightwing fundamentalist Bob Jones University banned the Abercrombie logo from its South Carolina campus two years ago, while Michigan's attorney general pushed Abercrombie to shrink-wrap and slap "adults only" labels on the catalogs.

Unfortunately, recent criticisms of Abercrombie for retailing controversies like the "Wong" T-shirts too easily and quickly articulate a similar conservative leaning toward "good taste." An article in a left-leaning Asian American student newspaper accuses the retailer of "skirting the rules" and claims that the "Abercrombie & Fitch catalog stunts encourage behavior [like underage sex] that flaunts social conventions."

The pairing of an Asian American critique of racism with a social conservatism of sexual propriety and obeisance to "rules" is a dangerous strategy. What makes for good taste? Martha Stewart? High culture? What does a positive image look like? Middle-class? The good girl who doesn't kiss on the first date? And how would this look on a T-shirt? Wow, problematic much?

This complaint about "skirting the rules" is as disturbing as the suggestion that Abercrombie & Fitch "respect" Asian Americans as a target market -- which itself skirts dangerously close to a "model minority" argument.

We know by now that corporate management and market influence can assimilate even the most revolutionary -- or in this case reviled -- sorts of images or themes, and in the process often reproduce and repackage racial and social inequality. But because we already know these things, we can begin to ask other kinds of questions about how this happens.

At the very least, these controversies should remind us that all images and representations are staged -- stereotypical or realistic, negative or positive -- and, as such, we're only as authentic as our kitsch, which is to say, not at all.

Fantasies of Fame

Was I dreaming, or did novelist Ann Beattie whisper in my ear that she had not done her homework?

It was Day Two of the Key West Writers Workshop led by instructor and author Harry Mathews (The Conversions, Cigarettes). Seated around a rectangular table in the historic Village Hall in downtown Key West, Fla., 11 aspiring novelists and poets who had plunked down $300 for eight hours of dicta and philosophy from the enigmatic Mathews listened in awed silence.

Some of the awe was because of Mathews -- a bespectacled, bald, serious writing professor (Columbia and Bennington) who had a habit of resting both elbows on the table and leaning forward, peering intently at each participant as if defying them to doubt his proclamations. And then there was yesterday's surprise discovery that one of the 11 students was the acclaimed novelist and short story writer Ann Beattie (Picturing Will; My Life, Starring Dara Falcon). Seated at the table's end opposite Mathews, the striking, smiling, silken-haired literary phenom seemed as dazed by Mathews as the rest of us.

But the major kick for the amateurs, or why we had traveled thousands of miles for a weekend workshop, had more to do with where we were on this sunny 70-degree morning. Would-be writers, hanging with the pros in the tropical lair of word-gods like Hemingway, Elizabeth Bishop, Tennessee Williams and now Beattie, we were living out a literary fantasy. Really.

In this age of extreme adventure vacations, you could, for the right price, indulge any lifelong fantasy for a week or a day. Ante up for baseball fantasy camp and shag grounders hit by Cub great Billy Williams in Mesa, Ariz. Sign on for the crew of a tall sailing ship, and climb to the crow's nest 40 feet above the Caribbean. Register for the Navajo cultural exchange, and sleep in an authentic hogan, 100 yards down the hill from an authentic outhouse.

And to provide a similar fix for the countless aspiring writers in the United States and England, conferences, conventions, writers' groups, writers' colonies and writing workshops have bloomed. An Internet search turns up hundreds of such offerings each season. Some are affiliated with schools and universities; others are private, for-profit monthly services. Registration fees vary -- from $35 for a single Adventure Travel writing session to $1,200 for a week at the prestigious Stonecoast Conference in Maine, including credits -- and depend on the length of the experience, and, of course, the name value (read celebrity) of the pros who will be in attendance.

Add the expense of flights, car rentals and bed and breakfasts (literary types eschew hotel chains), and it's easy to see how an entire industry is making money off those who think and type and, most importantly, dream. The market pool is limitless; you don't have to be blessed with athletic talent, beauty or wealth to entertain a serious ambition to write. All you need is the conviction, secret or public, that you have a story to tell.

For a total of about $1,500 (including tuition, lodging, too much Key West Sunset lager and not enough key lime pie), I participated in this year's Key West Writers' Workshop (KWWW). Sponsored by Florida Keys Community College, the program schedules weekend writers' gatherings, each proctored by a reputable artist. This year's instructors included novelist Robert Stone, poets Robert Creeley, Sharon Olds and Carolyn Forché, and poet/novelist/essayist Harry Mathews, in whose workshop I was lucky enough to secure a seat (writing samples were required). The KWWW Web site urges those interested to apply early, as the workshops fill fast because of the additional lure of the brilliant Florida Keys sunsets in the middle of February, when the rest of the country is freezer burned and stir crazy.

KWWW Director Irving Weinman has this whole beguiling fantasy thing down pat. At the end of your mundane week, after emerging from some arctic city to land under sunshine in Key West, you follow his careful directions to the Friday night orientation session. Pass by the palatial home of Ernest Hemingway and his old watering hole, Sloppy Joe's, as you walk in short sleeves to the Westwinds Inn for the sunset gathering. Proceed to the torch-lighted courtyard, where, seated under palm trees, you're served white wine and literary anecdotes by Mathews and Weinman, novelists who hobnob with John Ashbery and Raymond Roussel. You keep silent, sipping and smiling and nodding. The evening ensues and you shake hands and say, "'Till tomorrow," as if you always speak that way. You walk in the dark toward Mallory Square, slightly abuzz from the wine, the smell of flowers in mid-winter, the music tinkling from the waterfront, and, most powerfully, from the question inside that you dare not even whisper: Am I one of them?

Harry Mathews turned out to be the ideal tease for this dream. At 73, he stood erect in a black T-shirt and white deck pants, soberly assaying anyone who dared approach him in the courtyard. I say dare because Harry disclosed to us tuition payers that he is repelled by people who flatter him, quoting another writer who compares the act of complimenting his novels to a stranger remarking how beautiful his ex-wife is. Wanting to meet a writer because you like his work is like wanting to meet a goose because you like its droppings, Mathews said.

The tanned, chain-smoking Mathews fixes his listeners with a steady gaze and carries over his legendary literary economy into his social interaction, discouraging meaningless small talk. He will abide someone else's remarks or questions but prefers to pivot to a monologue on Key West's atmosphere (a feel of the continental), on France (where he lives half the year), or on fishing for prized permit in the flats.

On the first day of the workshop (a misnomer for what is essentially a series of lectures punctuated by student sessions of free writing, which Mathews does not read), he set out the rules for the weekend like the boss man in Cool Hand Luke: "You will not be late. You must go to all four sessions. Do not interrupt with questions. Say nothing if you disagree (or take it up with me outside of class). Do not take notes, as note taking is actually a form of not paying attention," he said.

I looked around the table, expecting to see a welling up of protest, or maybe grudging resignation. Instead, what I saw was glee, a welcoming of rules and discipline, as if this answered their craving for order, for punishment, for initiation into the writer's life.

And then Harry Mathews asked for a show of hands: How many of you got into writing as awkward adolescents, spurned by your peers, homely social outcasts who sought in reading and in writing a refuge from the world?

Every hand went up. (Even Ann Beattie's, though I cannot swear to it, since note taking was disallowed.) I don't know if I was embarrassed by the admission or embarrassed that I and these 10 others, and the thousand others who show up at these seminars each month in all 50 states, had gravitated to this life not out of talent or avocation, but because of a social behavioral disorder.

Though Mathews is internationally known, and his work is critically acclaimed for its freshness, intelligence and departure from literary convention, his work is not widely read. As with Faulkner or Joyce, his prose is not reader friendly. A musicologist who graduated from Harvard, Mathews does not find conventional writing challenging, so he experiments with and invents forms that lead to unique modes of expression, giving him a kind of literary liberty.

One of these experiments involved writing without using the letter e. He said he had taped a thumbtack to the e key on his typewriter to compel him to use alternative words and, concomitantly, phrases and clauses that he would not otherwise use.

The results of his inventiveness are challenging novels that are more imaginative than realistic. Even his most conventional novel, Cigarettes, which explores the tentacled body relationships among a battery of male and female characters, is based on a structure that can befuddle casual readers. (His controversial book Singular Pleasures contains 61 one-page descriptions of methods of masturbation.)

So one of the things he must do is teach. And Ann Beattie, who was taking the class because she is, of course, interested in Mathews' thought processes (her missing homework notwithstanding), also indicated that she was auditing the workshop in preparation for her own teaching, which she and her sculptor husband must do to supplement their artists' income.

One would think that the revelation of Beattie's finances would be a sobering dose of reality for the workshop neophytes. Apparently, not even the life of a successful literary artist is worry-free. But it was never about wealth, as Mathews' first lesson on motivation testified.

The problem with some writers, said Mathews, is that they don't know why they are writing. If they think it's for self-expression, or for escape, or to get published and reviewed and win prizes, their writing will fail them. The writer's real mission, he says, is to search for his own story with which to fill the blank page. It may take years of pain, volumes of work, reams of paper; even at life's end, he may not have entirely fulfilled that mission. But the journey is key.

As far as fantasy camps go -- and for the illusion of immersion in the Key West artist's cachet -- Mathews was contrarily well suited. As an intellectual figurehead, he kept the dream safely remote. Granted, we did learn important attitudes about, and approaches to, writing. And while it's hard to expect any improvement in writing skills after a two- or three-day workshop, the value of networking at these venues can't be overestimated (Novelist Lynn Crawford began a collaboration with Mathews as a result of an earlier workshop, which led to her inclusion in his latest anthology, The Oulipo Compendium).

But because Mathews joined us for cigarette breaks on Duval Street and for lunch at a harborside oyster bar during our one weekend in paradise, we flew home sated, tanned and star-dusted, though not necessarily any better at writing.

We Are Family

Everyone’s talking about MTV’s new reality show, The Osbournes. My middle-aged library co-workers are impressed with wife/mother Sharon’s household management skills. The suburban neighborhood kids show off their best Ozzy impersonations. Mechanics at our local Wal-Mart say they haven’t missed an episode. Even George Bush admits he’s a fan.

Sam Donaldson and Cokie Roberts seem to be the only holdouts, having recently opined on their ABC talk show that they find the show incomprehensible, profane and too low-cult for their tastes -- issues that don’t seem to have turned many other fans away.

For a show that initially received little hype, The Osbournes is amazingly successful, reaching approximately 6 million households each week. Recent cover stories (Entertainment Weekly, Rolling Stone) didn’t appear until well into the show’s run, and MTV didn’t do much advertising for the show before its March debut. There are reasons to have doubted its success: The Osbournes arrived at a moment when reality shows seem to be degenerating into more tawdry scenarios; the focus is on a rock star who’s not at the top of the charts (and who is definitely not young, hip or on MTV’s TRL); and it’s broadcast on a cable network not known for family-values programming.

MTV is, however, known for its reality programming. It shouldn’t be that surprising that the network that gave us more than 10 years of The Real World (which seems to have lost touch with reality considering its formulaic casts and conflicts) could do so well with The Osbournes.

None of MTV’s other ventures into reality and reality-esque programming (Road Rules, Cribs, Undressed) have matched The Osbournes’ success -- but somehow the network created a hybrid of its lesser shows that’s greater than the sum of those parts. The Osbournes concept was spun from an episode of Cribs that featured the family moving into its plush new Beverly Hills home and provides the conflict and voyeuristic elements that make The Real World such a guilty pleasure.

The end result is nicely packaged as a wacky sitcom, complete with schmaltzy title song (Ozzy’s “Crazy Train,” interpreted by a crooner at lounge-speed), and credits that list each family member by name as well as by role (e.g. Ozzy Osbourne as “The Dad”). Most significantly, The Osbournes currently has the cleverest comic editing on TV. For each shot of Ozzy embodying his rock star persona in front of concert crowds and hordes of fans, we get many more of the 53-year-old out-of-shape father of two shuffling around the house, bewildered by the complicated TV remote or stepping in the dogs’ water dish. During the debut episode, Sharon Osbourne (“The Mom”) describes her teenage son Jack’s status as a loner and outsider at school while we see footage of Jack decked out in camouflage and a helmet, stalking around the house with a family cat, then poking at an empty cardboard box with a bayonet.

The key to The Osbournes’ success is this: It gives us everything we want from reality programming while fully acknowledging the extraordinary nature of the family’s situation. We are privy to the glamorous, excessive side of celebrity-family life -- underage admission to nightclubs like the Roxy for teenage siblings Jack and Kelly, mother-daughter shopping sprees on Ozzy’s credit card, extravagant birthday parties, house calls by pet therapists, and a cameo appearance by Special Guest house visitor Elijah Wood, who graciously helps to clean the dog pee from a soiled cushion in the den.

The Osbourne kids are spoiled in ways we imagine the offspring of rock stars to be. When Jack sasses back to his nanny, Melinda, or when Kelly mopes around because she doesn’t have a special record label of her very own (like Jack), it seems like this is the role for which they’ve been groomed. It’s also hard to blame their parents: Ozzy and Sharon love each other, and their children, very much. When the excesses allowed by the family’s situation lead to trouble -- like Jack’s all-night partying, drug use and club hopping sprees -- the family sits down and talks it out. Granted, this discussion is peppered with censors’ bleeps and is incomprehensible at times due to Ozzy’s slurred speech, but it’s heartfelt and earnest all the same.

Even though he’s a celebrity, Ozzy is also a very real father: He’s a problem-solver, he’s honest (using himself as an example of why Jack and Kelly might not want to abuse drugs or start addictions at an early age), and he’s up-front with his kids. In another favorite Osbournes moment, as the kids get ready to go out for a night of clubbing, their dad implores, ''Don't drink, don't do drugs, and if you have sex, wear a condom,'' much to Kelly’s embarrassment.

During a serious family talk about Jack and Kelly’s respective bad behaviors (Jack’s marijuana use and late-night parties, Kelly’s possession of a fake ID), Ozzy and Sharon work with the teens to try to resolve these issues. What’s refreshing about their approach is that both parents seem open to letting Jack and Kelly in on the discussion -- Sharon begins by asking both of them if what they’re doing is “right” and suggests that the family work together to create a better structure for the kids.

Jack and Kelly are both allowed to whine and to air their complaints, while Ozzy and Sharon seem ready to offer some solutions and advice. When Kelly complains about feeling alienated at school because she’s Ozzy’s daughter, he suggests maybe she should be home-schooled. When Jack refuses to admit to getting high and playing around on the computer in his bedroom (he claims he spends this solitary time reading), Ozzy warns him about the slippery slope of addiction -- what seems like something Jack “chooses” early on will become something he craves and needs later. Ozzy’s best argument to Jack against habitual drug use? “Look at me!”

While neither child seems immediately affected by the family chat, Sharon notes later while talking to the camera that Kelly had come to her the day after the discussion and said she had reconsidered hanging out with some of her friends, the ones with whom she'd gone clubbing.

I worried when watching the show’s debut episode that Kelly’s whining and Jack’s bad attitude would make the show difficult to watch. This is, surprisingly, not the case. Yes, they can be horrible to each other and to their parents and their nanny at times. But they’re also exemplary teenagers. And there are moments when Kelly’s malaise seems strikingly familiar. The way they fight, the way they broadcast their disputes to their parents, and the way they seem constantly both privileged and vulnerable wins me over in the end.

Like Married with Children or The Simpsons, The Osbournes both exaggerates and reflects the state of the modern family, and it does so with near-universal appeal -- proving that reality TV works best when its subject matter is more real.

Alana Kumbier, television editor for PopPolitics, lives in Columbus, Ohio.

The Newbies of Americana

This article wanted to be nice.

It started out that way; it had the best intentions; it was full of good feelings, full of sunshine and butterflies and fresh country air. It was not a pushover -- you wouldn’t want to get on the bad side of this article, that’s for sure. But if you let it alone it was a huggybear.

But then it went off to the city, and modern life was too much for the article to stand. Now, it’s not as if the article turned altogether mean. But something changed, a note of sorrow entered in, and there’s the pity. There’s so little wit and glamour in this world; it would have been nice if the article could have given us wit and glamour.

It wanted to be sweet. It had planned to talk about the recent "Down from the Mountain" concert at Rupp Arena in Lexington, Ky. That concert was the first stop on a tour that would feature many of the same musicians who had appeared earlier at Ryman Auditorium and Carnegie Hall, holy places both, to sacramentalize traditional acoustic music and, not coincidentally, the success of the Grammy Award-winning O Brother, Where Art Thou? soundtrack, which had "started it all," as many newspaper writers have put it.

The article was going to set the scene, describe the music and performers -- including the Fairfield Four, Norman Blake, Emmylou Harris, Alison Krauss and Union Station, Ralph Stanley, Patty Loveless -- and then take an ironic turn. But the irony was not supposed to be hardhearted or sharp-witted. This article wanted to keep on the sunny side.

The gentle irony would be focused inward -- self-deprecating, self-mocking. The irony would be at the expense of the article’s writer. Perhaps some of his readers would recognize that the writer was using himself as an example of bourgeois intellectualism and thereby understand that his article was speaking not just about himself, but about other bourgeois intellectuals like them.

The writer and his readers share an interest in the brand of authentic music this article considers -- the kind of authenticity you can buy. And the writer and those readers are reflexively, if gently, self-ironic types. Yeah, they think, we’re privileged white Americans, but we’re self-deprecating, so that makes everything OK, and then they smile in that way that privileged white Americans have.

Gentle self-mockery was part of the article’s plan. The article would use the "Down From the Mountain" concert as a pretext for fashionable boomer self-loathing. More than that, though: The article would present the ambiguities of what "authenticity" stands for today, now that mountain music is the height of bourgeois intellectual cool, endorsed by such hotshots as NPR, which earlier endorsed Cajun and Celtic and zydeco, and by the Coen Brothers, producers of the concert and every self-conscious boomer’s favorite self-conscious boomer directors of self-conscious films, including O Brother, Where Art Thou?

Not that the article would seriously interrogate the term "authenticity," which is, as the theoretical cool kids put it, "an arena for contestation." Nor would it do more than take an affectionate swipe at the Coens or NPR; the writer, despite the aforementioned "fashionable boomer self-loathing," adores Fargo and All Things Considered and can even sit through Barton Fink if you bring him snacks and pat his hand to comfort him during the weird parts.

Would the article have to respond to the actual event, the first concert of the "Down from the Mountain" tour? Not at first. Not when the article was in its infancy. The writer could start the article before he’d even clicked onto the Ticketmaster Web site and groaned (as ever, shocked and livid) at the "service [sic!] charge." It would be simplicity itself to make the article explore a clash of cultures: the authentic mountain culture celebrated in virtual obscurity by most of the artists on the night’s roster; and the newbie culture of the middle-class writer and his chardonnay-sipping, running shoe-wearing, PopPolitics-reading cohorts. Attending the concert would only confirm the writer’s suspicion of the cultural complexities in Rupp.

Thus, from the mental notes of the writer in transit up I-75:

* Survey the arena’s parking lot pre-show. Note the number of Volvos parked therein for this "mountain music" concert. Compare with the paltry number of mud-spattered pickups. Offer other cultural ambiguities which appear to sneer at the blue-collar crowd but instead reveal the class anxieties of the newbies. To wit: Work in the sentence "not a gun rack in sight, though the pungency of well-chosen cheeses pervades." Quote bumper stickers, especially those denoting arty or timidly progressive leanings; e.g., from the writer’s very own vehicle: "Art Works for Kentucky" and "Re-elect Gore and Lieberman, 2004."

* Work in a reference to Rupp Arena itself, a holy place among certain college basketball fans. Allude snootily to Fenway Park (since baseball is classier than basketball), also to Cameron Indoor Stadium (since Duke University is classier than the University of Kentucky). Thereby further damn the writer as a snob (but gently, make the readers pity not hate the writer, their brother, their double) while still getting the snootiness into the piece. Note that a colleague, Dr. Hugo Freund, has recently quoted Spiro Agnew as regards the writer and his friends: "impudent core of effete snobs who characterize themselves as intellectuals." Self-referentiality = sincerity (cf. Eco, Postille a 'Il nome della rosa'). Neat!

* Mention that downtown Lexington itself is a very pretty and well-preserved city, with many lovely brick buildings and modern hotels that actually fit in. Mention the Kentucky Theater which shows art movies. Mention the excellent variety of Scotches available at the different locations of the Liquor Barn. Look in the Yellow Pages for the names of fashionable and authentic ethnic restaurants to work in. Use the phrases "funky but chic" and "cheeky but fun." Ignore the fact that the latter phrase is meaningless; pray that it, unlike the earlier coinage fratriotism, gets picked up and bandied about by influential people, thereby ensuring the writer literary immortality.

* Briefly discuss outlying Lexington, a sprawl of windswept Red Square-ish shopping centers and upper-middle-class worker houses, thereby justifying this quotation from former Lexingtonian Richard Hell: "I was sayin’ let me out of here before I was even born." Also Daniel Boone’s stated reason for leaving Kentucky for the wilds of Missouri: "Too many people. Too crowded! Too crowded! I want more elbow-room."

* Write "Man O’ War Boulevard" often.

But eventually, the writer must experience and report. After having been lost for some time (on Man O’ War Boulevard), he and his wife check into the affordable and surprisingly comfortable Comfort Suites, eat dinner at the Applebee’s next door, and go to the actual arena itself. He modifies some of the ideas he formed on the way into town. The idea about the parking lot, for instance. Because he parks at a Kinko’s instead of at the arena, he cannot be certain about the Volvo-to-pickup ratio. Regarding bumper stickers he lacks a clue. He decides to lie to his readers and tell them that he did, after all, park among the other Volvos and survey the stickers. What his readers don’t know, etc.

Inside the arena, which is next to, on top of, surrounded by, engulfed within -- geography is not the writer’s strong suit -- a three-story shopping mall, he commences surveying the scene. He asks his wife, who writes legibly, to take notes. She stares hostilely but complies. Herewith, a transcription of selected items:

* Stage ostentatiously bare. Backed by a (gray?) curtain. Three mike stands. Easy chair and podium stage right. What appears to be a cheap rug, not large enough for this stage (Rupp, though the first venue of this tour, is also the biggest: unwieldy site?). The frugality is artful. "Tasteful."

* A good sign: No more than four cowboy hats visible in the entire arena. Not attracting a "country" crowd (Faith, Garth, etc.) despite the logoed-t-shirt-wearing radio station goons giving away bumper stickers to whoever wants one. (No one wants one.) A bad sign: One guy dressed completely in black, wearing a beret. Why is this a bad sign? Guys in berets are always a bad sign.

* Four men wearing jeans and tweed jackets within spitting distance. Five if you’re not the writer but rather the writer’s wife and count every guy in tweed you yourself could easily spit on. It looks like they just wandered in from a faculty meeting. No, scratch "like"; write "as if."

* Besides the chardonnay-sipping, running shoe-wearing, PopPolitics-reading types sitting nearby, there is a goodly number of real-looking country folk as well. You can tell they are real because their clothing does not combine denim with houndstooth. They seem pleased by the attention their music is getting and arrive in their seats early. Still, for the real deal, they might have driven to Cincinnati, a couple of hours north, where Ricky Skaggs and Del McCoury are playing tonight. But people (i.e., the bigtime press) are watching: you come to show your support. It’s a matter of support.

* The show’s host is Bob Neuwirth. Patti Smith once wrote a poem for him. It’s called "For Bob Neuwirth." He was with Bob Dylan’s Rolling Thunder Revue. In Renaldo and Clara, which Dylan directed, he played the Masked Tortilla. T-Bone Burnett (who produced the movie soundtrack and tonight’s show) played the Inner Voice and Bob Dylan played Renaldo. (Ronnie Hawkins played Bob Dylan.) Emmylou Harris sang on Dylan’s Desire album, released at roughly that same time.

* Another way that a lot of these artists are connected: Many of them -- the Nashville Bluegrass Band, Norman Blake, Emmylou Harris, Ralph Stanley -- have appeared on Prairie Home Companion. Aren’t the Coens, like Dylan and Garrison Keillor, from Minnesota? Does Minnesota make fiddle music okay?

* This is not pedantry.

* As to the music itself: Competent. Professional. Slick. Scripted. Bloodless. Joyless. Sexless. Occasionally tedious. We are, one gathers, given to understand that this is a Special Evening and should therefore stop looking at our watches so often.

* Highlights: (1) The Fairfield Four sing bass parts that could dissolve kidney stones; (2) Alison Krauss, suffering from laryngitis and thus unable to sing the songs she’s been scripted to sing, instead cuts loose on fiddle; (3) several different performers refer to the ascendancy of mountain music on the pop charts and, barely able to contain their glee, note that they are at last earning a living wage; (4) Julie Miller gets miffed at something or other and claws the air in the general direction of a mystified roadie while continuing to sing harmony for Emmylou Harris; (5) Patty Loveless.

* Ralph Stanley calls for the come-on-back-everybody finale before, it appears, everybody is altogether ready to come on back. There is general confusion before the performers wander back onto the stage. Ralph looks down at his lyric sheet. Eventually they perform "Angel Band." This is a genuine relief, because all along the evening has been heading in the direction of, please please no, "Amazing Grace."

* For an encore, the gathered musicians perform the last verse and chorus of "Angel Band." They walk offstage. With no to-do whatsoever, the audience gets up and leaves. They do not stomp or clap rhythmically or otherwise demand more for their money. In the lobby of Rupp and in the streets, there is nothing resembling exhilaration.

* Before the tour has gone through too many more dates, the concert adds another encore. It is "Amazing Grace."

Let’s lose the cute third-personism. It worked to distance me from "the article," but now I’d like to be more forthright.

I live in southeastern Kentucky, in a renovated carriage house. I have lived here, and taught in the college across the street, for nearly two years. Moving to a small town in Appalachia meant leaving behind certain things that I had come to rely on -- the nearest real bookstore is 80 miles away, and let’s not talk about movie theaters. For many years, I had patronized terrific independent record stores, like Papa Jazz in Columbia, and Manifest Records in Greenville, S.C. But no more: Now the nearest outlet for recorded music is Wal-Mart. And the second and third nearest outlets are other Wal-Marts.

But, in another sense, I do not entirely reside in that renovated house. Like my "bourgeois intellectual" readers, I feel like a citizen of some abstract but still viable nation. I have no bookstores or record shops nearby, but I can send off for anything I’d like and have it in two business days. Like anyone else with the proper wiring, I can read the New York Times as soon as I get to the office; I can read The Nation, American Prospect, Village Voice. I mourn the death of Lingua Franca and I’m not even sure that I’ve ever seen a copy in print.

And consider this: The only radio station that broadcasts here in town brags of having "the best mix of the 80s and 90s." Thus I am wistful for NPR and search the Internet for Web casts. Recalling the stations I enjoyed back home, I find good old WNCW, which plays tasteful boomer stuff like Dr. John and Little Milton and therefore could just as well be in New York or Chicago or here in tiny Barbourville (say it so it rhymes with "marvel") as in Spindale, N.C. It gives me World Café, which is bogus, but nice for doing paperwork.

Several years back, WNCW was a "flagship bluegrass station," which is what it misleadingly continues calling itself today. It played real, which is to say "weird" and "scary," mountain music long before the Coen Brothers got an Americana jones. It was the station that I listened to when I wanted to hear the music you’d hear on mountaintops and in hollers without actually going up onto mountains or down into hollers. WNCW was a highway to deep country that didn’t get too near frighteningly authentic rural people.

The "Down From the Mountain" tour is not just a celebration of traditional music, but a final exam for many of us 40-ish smarty-pants who listened to radio stations like WNCW when we were kids -- who continue listening now even though they have grown pretty blah. The bourgeois intellectuals I’m talking about don’t think of music as simple entertainment or even as simple "texts." Instead, we understand pop culture as a stage for intellectual performance. While we’re listening to the latest Dylan, we read Greil Marcus for context. We study the Anthology of American Folk Music. We scuttle to Amazon and order David Johansen and the Harry Smiths, the album we heard about on Fresh Air.

We can tell you this: On Aug. 1, 1927, a signal day in American history, Ralph Peer signed and recorded both Jimmie Rodgers and the Carter Family. The Carters had come to Bristol, Tenn. -- the Birthplace of Country Music -- from Mace (or Maces, or Mace’s) Springs, Va. The mountain they came down from was Clinch Mountain, a holy place in its own right. And Maybelle Carter learned her distinctive guitar style from Leslie Riddle, an African American neighbor.

So, as you see, those of us who showed up in our Volvos were ready both for the concert and for our comprehensive exams. And how did we come up with our wealth of knowledge? Through some field work, but also from -- mostly from? -- secondary sources. As my wife, Sharee, reminds me, she has often urged me to visit Bill’s Music Shop and Pickin’ Parlor in West Columbia, S.C., where I could have seen Ralph Stanley up close, before he got the imprimatur of The New Yorker. She further reminds me that, at Bill’s, we could have seen Ralph and the Clinch Mountain Boys for $10 advance, 12 at the door (we’d have to bring our own chairs). But at Bill’s (she goes on, barely controlling her need to dope-slap me), I would have lacked distance from the topic. I’d have had to enjoy the music, not study it and file away what I learned for finals.

So here is why I did not especially enjoy "Down from the Mountain" and why the article turned mean. The quality of the show is not what soured me. The experience as compared with my expectations did. My mind was made up; I knew how the concert, and the article I’d write, would go -- the concert was going to be an excuse for gently probing boomer longing for The Authentic, for setting me and my faux dithering down among the old timers. That would be a pleasant way to set the concert in its psychic place.

And then the Mallrats showed up. I am using the term "mallrat" broadly, not to mean just the kids you see hanging out at malls. Instead, I mean everyone of any age who thinks that, if God had wanted you to buy it, he’d have put it in one of 300 climate-controlled stores beneath a single roof with acres and acres of convenient parking.

They sat behind us. And next to us. And all around us. So, while we were trying to hear Ralph Stanley’s unaccompanied "O Death," we had to contend with a yee-hah-hollering goober or two. Or 10. And when Emmylou Harris came on, out came all the flash cameras. The old timers must nearly have been blinded.

Cellphone user in crisp new overalls: "Hey, Scott. Guess where I am? Rupp. Rupp Arena. At a concert. An honest-to-God bluegrass concert. Guess how much I paid for the goddam tickets?" Holds cellphone up so Scott can hear Union Station.

Worse, these intruders did not know enough to shut up. During ballads, they talked, loudly, to one another about (1) irrelevant things like who was dating whom and (2) sort of relevant things like the last Five Blind Boys of Alabama album -- but they should have shut up anyway. The woman sitting one row behind and two seats to the right of Sharee: would you like to know what she considers the best place for wings in Louisville? I know: she told me and the whole world during "Red Dirt Girl."

And they not only talked. The Mallrats sang. That is how I know they were Mallrats. The songs they sang identified them. They did not sing along to Emmylou Harris’s "Red Dirt Girl" or Patty Loveless’s "Mountain Soul." (They talked during those songs.) Instead, they sang along to "I Am a Man of Constant Sorrow," "Big Rock Candy Mountain," "Angel Band," and "I’ll Fly Away" -- all the songs on the O Brother, Where Art Thou? soundtrack. They had gone to the mall, bought the album, and learned the songs. Then they came to the concert because it was cool.

Look: People who listen to NPR know better than to sing along at concerts. And they expect others to follow the rules of decorum. For gosh’ sakes: civilizations fall when people talk at concerts. That’s why, after my wife asked the woman behind her to keep it down (her male companion intelligently told Sharee to "chill out"), I went looking for ushers. I’d show these upstarts.

Here’s what I learned. In Rupp Arena, people are allowed to speak as loudly as they like, so long as they don’t cuss or abuse others. It didn’t matter that the show was acoustic and that the miking was inadequate to fill Rupp with anything other than minimal volume. The Mallrats -- interlopers -- could yap as noisily as they pleased.

"You mean you can’t make them be quiet?" I asked the usher.

"They paid to be here. They have the right," she said.

"So I have to put up with their noise?"

"They bought tickets."

"I bought tickets, too," I said. "Don’t my $90 count for anything?"

Silence. So I walked back to sit beside Sharee and seethe. The Mallrats got quieter as they ran out of things to say, but I got angrier. The injustice of it all.

As the evening ended I glared at the Mallrats. They left without comment. Good thing, too. I’d have had words for them.

Like: Why can’t you act like decent people?

Like: How dare you turn this music into Top 40? A fad like Gap t-shirts?

Like: Just because you paid for it doesn’t mean you own it.

Jimmy Dean Smith is an associate professor of English and communications at Union College in Kentucky and a contributing editor to PopPolitics.

The End of a Seasonal Affair

Some women have a history of getting mixed up with loser guys. I get involved with loser TV shows.

Year after year, I flirt with the season’s critically acclaimed new prospects until I fall madly, deeply for one of them. Soon I’m planning my teaching schedule around the show, declining social invitations in order to watch it, feeling giddy two days before the next episode airs. Meanwhile, the viewing public treats my show like it’s a C-SPAN2 airing of “Proposed Patent Office Budget Hearing.” I beg my friends to give my show a chance, to really get to know it. Inevitably, they ignore me, the ratings tank, the program gets cancelled, and I am bereft.

This sad story has been repeating itself for a decade. Remember Brooklyn South? I didn’t think so. I was one of about 12 Americans who regularly tuned into this absorbing 1997 Steven Bochco drama about a sweet, burly desk sergeant and his crew of beat cops. Critics hailed South as the second coming of Hill Street Blues. The public avoided it like the plague.

Or how about NBC’s I’ll Fly Away? This tender 1991 drama, set in a small Georgia town in the 1950s, starred Sam Waterston as an attorney raising his three children with the help of a black nanny, played by Regina Taylor. The show dealt smartly and subtly with racial issues and was touted by one critic as “the finest dramatic series ever to run on network television.” It was cancelled after two seasons and my heart was broken. PBS picked up the show for a two-hour series finale, but watching it felt like having sex with your ex: momentary pleasure, but you still knew it was over.

Remarkably, the shows I love get cancelled prematurely while the shows that no longer capture my heart continue on. Once I was such an Ally McBeal fan that I got a speeding ticket in a mad dash home to watch it. But that show has long since degenerated into derivative dreck. And yet, Ally survives. (Don’t even get me started on The Practice.)

If you’re a long-suffering sports fan, you might think you feel my pain. You do not. Even if the Chicago Cubs were to lose 162 consecutive games, the team would still exist. Heck, even the lowly Montreal Expos -- with the average 2001 attendance record of a Gary Condit fundraiser -- managed to escape cancellation this year. But there is no mercy for loser TV shows.

All of this brings me to the current TV season, in which FOUR of my beloved shows have been dumped. Frankly, I do not know any TV watcher who has suffered as I have.

The first of my shows to get the heave-ho was CBS’ Citizen Baines. This gem starred James Cromwell as a longtime U.S. Senator who loses an election and is forced to face life as an ordinary citizen back home in Seattle, where his three daughters have done fine without him. The show was novel: The star had thinning, gray hair, and not a single character lived in a $3,000-a-month apartment despite no visible means of support.

Of course, since Citizen Baines aired at 9 p.m. on Saturdays, I was the show’s lone viewer outside of the 70-89 demographic. But I was grateful for the time slot. Baines debuted at a time when I had a string of very slow Saturday nights. The only guy I was seeing regularly was Raul, the cashier at the local Mexican takeout joint. I didn’t just love Citizen Baines; I needed it. The show was cancelled after six episodes.

I attempted to recover from the Baines affair with a time-tested strategy: a rebound TV show. I pledged to have a little frivolous fun with a show that wasn’t really my type. I tried ABC’s police comedy The Job. The fling lasted less than one episode. I hadn’t heard such lame sex jokes since I taught a class of fifth-grade boys who pelted me with chalk and repeatedly yelled “penis.” The Job was the dregs.

Soon I moved on, as one does, to something more meaningful -- this time, A&E’s fine, no-frills courthouse drama, 100 Centre St. The show featured characters who were both flawed and endearing, including a conservative black lesbian judge and her liberal Jewish counterpart. The show was neither preachy nor heartwarming nor action-packed. It was sometimes surprising, sometimes dull, always real. It was like life. In February, it was consigned to the scrap heap.

This cancellation hit me particularly hard, and not just because I’d already lost Citizen Baines this season. I was also reeling from the announced demise of the WB’s Felicity, with whom I’d shared a wonderful, if difficult, four years. Once, in a single-handed attempt to boost ratings, I went to a Halloween party dressed as Felicity herself -- wearing her signature Converse high tops and a New York University ID card I’d made at Kinko’s. I stuck with Felicity, even through that wretched second season when she chopped off her hair and lost her way.

Still, much as I’ll miss Felicity, I’ll concede that the show may have run its course. The same cannot be said for ABC’s Once & Again, the messy, delicate family drama that has been lauded by critics and dismissed by viewers. This show is so good, I wish it were on five days a week. Yet O&A has been disposed of, too, despite a grassroots e-mail campaign to persuade the network otherwise. (I demanded that all my friends join the campaign as penance for not watching in the first place. This time they complied, but it was too late.)

All of these cancellations leave me with a fine conundrum: What to watch now? I could, as a defense mechanism, avoid future TV dramas of substance. I could stick with shallow, sexy shows like Law & Order and CSI -- with their ripped-from-the-headlines plots and story lines that are neatly wrapped up in an hour. Watching these shows is like having a series of one-night stands with the same fun guy: You have a good time but wouldn’t be devastated if it all ended. But that isn’t my style. It’s only a matter of time before I latch onto another critically acclaimed, pathetically rated ensemble drama.

Ah, but perhaps my luck is changing. Last season I fell hard for HBO’s Six Feet Under. Here’s how hard: While I was on vacation last summer in Arctic Russia, weather problems left me stranded in a remote town with no restaurants, no hot water and no TV other than a government-owned station that aired Brazilian soap operas dubbed in Russian. Remarkably, my group located a resident who had rigged up Internet access, and I managed to send an e-mail to my family: “Am safe but stuck indefinitely. Pls. record Six Feet Under.”

I’m deeply invested in this show and would be devastated if Six Feet got deep-sixed. But a miracle seems to have occurred: This critic favorite has actually become a ratings sensation. Maybe, finally, things will work out for me.

A contributing editor to Shape magazine, Suzanne Schlosberg is the author of Fitness for Travelers and co-author of Fitness for Dummies and has written for Health magazine, the Los Angeles Times and the travel Web site worldhum.com.

The Disturbing Sound of Silence

Is there ever a time when silence is the music of democracy? Not that I can imagine. In fact, I can't even think of a situation where a gentle lullaby or the sweet harmony of a string quartet could do it justice. Democracy is the stuff of rock 'n' roll -- loud and sometimes obnoxious -- screeching electric guitars, pounding drums and lyrics amplified to ear-splitting decibels. Freedom is about noise -- irreverent and raucous debate. Silence is the trademark of other forms of government, those that work in darkness and struggle to keep the will of the people hidden.

So when folks like Lynne Cheney, William Bennett and Meistersinger John Ashcroft tell us that it's our patriotic duty to stand in reverent silence for the duration of the War on Terrorism, I have to wonder if they've forgotten what country they live in. This isn't Iraq, North Korea or any of those other countries they want to blow the hell out of. This is the United States, where freedom of speech isn't just tolerated, it's considered part of our strength.

I've been at this business of trying to be a good citizen for some 46 years, and it still astonishes me how the very people who scream God Bless America the loudest are often the ones who seem to understand the least about what the country stands for. Didn't anyone explain to them the basic political values underlying our culture? Were they playing hooky when civics was taught? Do they assume that Thomas Jefferson, George Washington, John Adams and James Madison were conformists who never rocked the boat? That Congress and the states were just fooling when they adopted the Bill of Rights? Were they taught that Henry David Thoreau advocated marching to the sound of the same drummer? Or that Robert Frost penned that he took the road most traveled by?

Americans have a history of being cantankerous -- shouting objections, raising hell and generally making life miserable for those in power. And Cheney, Bennett, Ashcroft and the rest of their pack know this well. We know that they know it because that's precisely what they did, often far beyond the bounds of common decency, during Clinton's presidency.

But things are different today, they argue, because now we are at war. But once again their own history betrays their self-righteousness. Only a few years ago, many of these same GOP leaders freely criticized Bill Clinton over his handling of the war in Kosovo. One assumes they didn't consider themselves traitors for doing so. Then there's Michael Tomasky's priceless article in Salon, which establishes that less than two weeks after the attack on Pearl Harbor, Robert A. Taft, Mr. Republican himself, gave a speech that offered a stirring defense of wartime criticism as an act of patriotism.

Contrary to what Ari Fleischer said shortly after the terrorist attacks, this would be the worst possible time for Americans to decide they need to "watch what they say." There are too many important issues that need to be debated freely and without fear of retribution. Here are just a few:

* Eighty-four years after the "war to end all wars," George W. Bush seems hell-bent on starting a "war to end all peace" and our government is pursuing a nuclear arms policy that makes Dr. Strangelove look like a sissy.

* We have a president who believes that the people's business is none of the people's business.

* We have an Attorney General who considers civil liberties to be an annoyance (and calico cats to be a sign of the devil).

* Our president is pushing for the adoption of a permanent wartime budget, which will neglect domestic needs while substantially increasing the already huge national debt we are so generously passing on to our children.

* Our national parks and wilderness areas are now under the stewardship of people who have never seen a tree they didn't think would look better as a stack of lumber, or a wilderness area that wouldn't look lovelier with a nice assortment of oil derricks.

* We are being led by people who honestly believe that the principal economic goal of the federal government should be to help rich people get richer.

* We have a president who also believes that Congress should stop bothering him while he's busy running the country.

* And by the way, all of this is being done by a president who didn't actually win the election.

How can anyone seriously argue that silence is the proper response for someone troubled by this state of affairs? Entering into a war, to name the most pressing issue -- especially a war as poorly defined as this one -- is the most serious step a society can take. It puts at risk the lives of our children and commits our nation's resources to the art of destruction, while more creative endeavors shrivel up and die from lack of nourishment. War can fundamentally change the character of a nation, often in ways no one even imagined going in. If freedom of speech doesn't apply here, when all the chips are on the table, then what good is it?

This is no time for silence. It's time to rock 'n' roll.

Celebrity Boxing for the C-SPAN Crowd

OK, let's fess up: All of us watched last week's televised fisticuffs festival featuring celebrity oddities Tonya Harding, Paula Jones, Todd Bridges and Vanilla Ice. Fox's Celebrity Boxing scored a knockout in the ratings ring, undoubtedly ensuring a long string of rematches, grudge matches and return bouts. So intoxicating was the buzz generated by the telecast, in fact, that the Foxies are rerunning the show (March 21, 8 p.m.) in prime-time TV's equivalent of the holiest of holies, the Friends time slot.

Rerunning a sporting event -- even one as loosely defined as celebrity boxing -- is almost unheard of in network television. Nevertheless, it seems likely that Fox will score another Nielsen coup, adding those who somehow missed out on the hype the first time around to a sizable repeat audience.

However, it would be fair to say that some Americans watched Celebrity Boxing a little more avidly than others. On line at the Wal-Mart, over the communal clothesline at the trailer park, membership in the "CB" viewing fraternity was no doubt considered a badge of honor, like owning a Shoney's coupon. But where I live, in blue-state America, admitting to having watched a wrathful Danny Bonaduce pound the living bejesus out of a bewigged Barry Williams is a bit, well, problematic. Sure, we like trash as much as anyone else, but you don't want to be rhapsodizing Paula Jones' demolition over the water cooler with Henri from marketing. You want to be discussing the latest insouciant New Yorker cartoon or bemoaning the dumbing down of NPR.

Fox seems to realize the cultural divide at play here. Recent reports indicate the network is exploring ways to shed its exploitative brand identity and attract a more upscale demographic. Normally that would mean throwing geek show proprietors like Celebrity Boxing producer Mike Darnell over the side of the boat, but we all know that will never happen. Perhaps there's a way the suits at One Murdoch Plaza could keep pugilism in their programming and expand their audience to include us out here in PBS Nation.

Herewith a few suggestions:

Battle of the Plagiarists: A main event pitting historians Stephen Ambrose and Doris Kearns Goodwin against each other would finally settle the question: Which of these blowhards can take the most blows? Mike Barnicle, Alex Haley and Joseph Ellis make up the three-judge panel. And on the undercard, disgraced New Republic staffers Stephen Glass and Ruth Shalit could duke it out to see which former flavor of the month melts faster under pressure. Martin Peretz is the guest referee.

Rumble on the Right: In this corner, former Wall Street Journal columnist John Fund. In the other corner, Morgan Pillsbury, the conservative gadfly's one-time live-in lover, whom he reportedly badgered into having an abortion after learning she was pregnant with his child. And in yet another corner, socialite Melinda Pillsbury-Foster, Morgan Pillsbury's mother, with whom Fund had an intimate affair in the 1980s. The fourth corner has been reserved for Jerry Springer's bouncer. Former Clinton adviser Sidney Blumenthal, whom Fund once falsely accused of beating his wife, serves as cut man for the Pillsbury-Fosters.

When Apostates Attack!: Or, "When Good Conservatives Go Bad." David Brock, one-time American Spectator enfant terrible turned liberal truth-teller, trades low blows with Arianna Huffington, former rightist trophy wife turned Bush-bashing blog proprietrix. An audience of Republican National Committee members will ritually stone whoever is left standing at the end of the bout. Senator Jim Jeffords provides color commentary from inside a fortified bunker outside of the MCI Center in Washington, D.C.

WWF Blogwar: World Wrestling Federation CEO Vince McMahon has a problem: Ratings for his televised blood feuds are down precipitously over the past year. The Internet has a solution: Scores of logorrheic pundits itching to scrap with one another. This steel cage free-for-all places Andrew Sullivan, Mickey Kaus, Virginia Postrel and Joshua Micah Marshall in a barbed wire-enclosed ring with only their laptops and their idiosyncratic points of view to defend themselves. Watch the epithets fly as a pumped-up Sullivan accuses Marshall of undermining the war effort in Afghanistan. See InstaPundit Glenn Reynolds bum rush the cage to rescue a prostrate and bleeding Kaus from a savage pummeling by Postrel, who's enraged over a Kausfiles item on the steel tariff decision and packing brass knuckles in her trunks. Guest referee Matt Drudge calls out the topics from a set of index cards.

It's the stuff of fantasy, of course, but matches like these would send a powerful message that celebrity infamy isn't confined to trailer park floozies and has-been TV stars. It's just as rightfully the province of corrupt fourth-estaters and chowderbrained talking heads. And hey, isn't bringing us all together what celebrity boxing is supposed to be about anyway?

Robert E. Schnakenberg is a writer living in Brooklyn. He is the author of The Encyclopedia Shatnerica and a major contributor to The St. James Encyclopedia of Popular Culture.

Sex and What City?

Let’s forgive and forget the fashions that would invite an ass kicking -- a newsboy cap at a jaunty angle? With suspenders? A short girl with a parasol? I guarantee any woman who tries that in Manhattan is friendless and will be mugged.

We’ll ignore, too, that no one on Sex and the City ever seems to go to the movies, a bookstore, work, any kind of live performance, a park, a diner, a bodega or a Starbucks, though in my world it’s the rare day when at least two or three of these quintessential New York amenities aren’t visited. (There was one day when I somehow made it to EACH thing on this list, though it was a bad day, to be sure, and not one you’d like to see fictionalized for television.)

In these ways I can accept that my New York is different from the New York of our Sunday night gals. After all, I haven’t worn anything strapless since my high school prom, and these women seem to exist in a world where every night, and most days, is the prom, glitter and all. Clearly, our experiences, probably of both sex and the city, would be different. And in a way, I'm quite grateful for the differences between our lives: While my life is certainly less photogenic than theirs and would hardly shape up into gripping half-hour episodes, I have the sense that it would be tiresome to have to work so hard to keep my hair looking nice.

But if the details of their Manhattan lives and mine are understandably divergent, surely on that little 14-mile island we all call home we would at least bump into some of the same things on the street. For example, though that foursome’s set moves a lot faster than mine, if we happened to be passing through, say, 14th St. and 9th Ave., wouldn’t we all see the hotdog cart on the corner? If we all found ourselves on 32nd Street (they going east, I heading west), would we not all be in Little Korea, where one sometimes finds Koreans? Or, if we all passed a synagogue and went inside at the same time by mistake, would it not be filled with the people of the book?

The answer to these and other similar questions is No, of course, as anyone who is as devoted as I to Sex and the City will know. It is a show without blacks, without Hispanics, without Asians (except the occasional appearance by Margaret Cho), without foreigners and, most significantly for this little Semite, it’s a show about New York City taking place in New York City and it’s got no Jews.

Well, not no Jews. There was that episode in Season 2 where the annoying childhood friend (whose husband got to be too much of a disaster) stayed with Carrie for a bit and irritated the hell out of her. Though it was never explicitly mentioned, the woman’s larger-than-life personality, jewelry and ass seemed designed to present her as a --- (I can’t spell it out because my mother finally convinced me that it’s a somewhat anti-Semitic phrase and I don’t want to contribute to any anti-Semitic sentiment in the world, so let’s just say that it is an acronym that used to be used derogatorily against a certain Asian-Pacific population around WWII and it rhymes with nap).

There was another episode in Season 2 where Samantha went to see a psychiatrist to deal with her boyfriend’s small penis, and the psychiatrist, if I had to guess, was also Jewish. I think I remember one episode from this past season where a salesperson seemed like she might be Jewish, too. That’s three Jews in three seasons (I missed Season 1, where perhaps there was more flourishing of the tribe).

Let’s compare that to my life: If it’s a weekend, I get up in the morning and want to get brunch, so I call a friend. Probably a Jewish friend, because I’d say around half my friends, especially the ones who might want to get a nice weekend brunch, are Jewish.

Before I leave, my roommate wakes up and saunters into the kitchen. Bam, another Jew. After brunch, perhaps my friend and I go see a movie. We stand in line for the movie, maybe at the Film Forum, where we’ll run into probably around 10 or 15 more Jews. And I don’t mean people who reach out their hand and say, “Good day. I’m Saul, a Jew. I work at Barnes & Noble, but really I’m a painter.” I mean people who you can see are Jews. Jews can pick out other Jews; it’s a gift we have. It’s how I know that there are no Jews on SATC -- I can SEE there aren’t, the way that kid from The Sixth Sense sees dead people, except in his case there were dead people to see.

But let’s get back to my day, and we’ll take it somewhere where those sassy fashionistas might also hang out so we can see that their world is actually quite shockingly devoid of chosen ones. We’ll go to … a bar. A classy one, in the 20s on the East Side. One with a lot of people in suits, probably investment bankers or lawyers whose firms haven’t gone business casual (or, more likely, unemployed lawyers coming to drink after bad interviews with firms who aren’t hiring; this takes care of my work-week crowd, too, since I am a lawyer with lots of unemployed friends). Bankers and lawyers? In real life, Jews and Jews. In SATC, not Jews and Jews.

I don’t know how many Jews there are in New York City (the census report is too damn confusing), but I know it’s more than there are on the show. A lot more. Like, it’s a well-documented fact that Jews have played a vital role in shaping New York City in every way, be it culturally, gastronomically or educationally. But on this weirdly idealized version of New York, diversity is represented by gay bars and that one episode where Sam dates a black man whose sister makes him break up with her for being white. Somehow, on this show, man after man is afflicted with some unusual penile disorder, but not one of them had his foreskin snipped in a blessed ceremony.

Which shouldn’t, on reflection, be any surprise. After all, though Jews appear all the time as characters on sitcoms, I cannot recall any soap opera that features a Jewish character, except for Beverly Hills 90210, which had its bookish Andrea who eventually got knocked up and had to drop out of college. And I can’t even think of one commercial on TV that has a recognizable Jew in it, either. (That said, it seems like swarthy men in heavy black glasses are becoming staples in cell phone ads, so maybe people who were bar or bat mitzvahed are soon to be seen endorsing consumer goods.)

Since Sex and the City is essentially a mix between a soap opera and a commercial, I guess it should be less surprising, not more, that there isn’t even one Jewish character on the show -- except that the show is made out to be a love letter to Manhattan, and doesn't true love embrace all of the lover, even his (or her) slightly neurotic and nebbish sides?

The show-girls have seen maybe five Jews in five years. Most New Yorkers come across more Jews by breakfast, or by lunch at the latest. So, I won’t be the first to point out that it’s an unrealistic portrayal of life in New York, but I do find this particular omission to be somewhat shocking. After all, isn’t New York without Jews like a bagel without cream cheese? In a word: dry. Which I guess helps to explain all the drinking.

Arin Greenwood is spending the year clerking for the Northern Mariana Islands Supreme Court. She is one of three Jews on the island. Her stories have been published in numerous publications, including Ironminds, American Window Cleaner Magazine, Travelmag, The Providence Journal, and Phony Lid Pickpockets.

Making Cancer Sexy

Attention, teenage boys: The cover of the Feb. 18, 2002 issue of Time magazine features a naked, airbrushed, very thin woman with blond hair, shown from the waist up, standing sideways, covering her breasts with one arm while the other is awkwardly bent upward. She is staring off into space with a completely disengaged expression, like a mannequin, or a blow-up doll.

Page three, the table of contents page, shows her again, this time facing forward, arms bent in front of her, head thrown back in ecstasy, eyes closed. She is back on page 51, in the same shot, but this time we see more of her torso, right down to her pubic bone.

Oh. She's there to tell you about The New Thinking on Breast Cancer.

Breast cancer is a terrible disease. It is the most frequently diagnosed non-skin cancer among women in the United States, and second only to lung cancer in fatalities. It has those of us on the distaff side nervously kneading ourselves in the shower once a month, having our breasts flattened and crushed in mammogram machinery while our fingers are crossed, and wearing pink ribbons on our lapels. Early detection has done much to lower the mortality rate of this disease, and we owe that to public awareness.

But you might not know that more women die of lung cancer than breast cancer (although a woman can't fondle her naked lung on the cover of Time). And, more women -- more people -- die of heart disease than of all forms of cancer combined. In fact, the top three leading causes of death in 1999, according to the National Center for Health Statistics, were:

1. Heart disease -- 30.3%
2. All forms of cancer combined -- 23%
3. Stroke -- 7%

Much of the press coverage of heart disease focuses on diet tips. But then, heart disease isn't "sexy."

Breast cancer coverage, whether in the press or in well-meaning TV movies, is obviously aimed at women. Lifetime touts itself as "Television for Women"; you won't find a newscast, but Designing Women and The Golden Girls air in perpetuity. It's a land where Judith Light, Valerie Bertinelli and Nancy McKeon get battered by Gregory Harrison, starve themselves, fall fatally ill, fight for their children and wear sweatshirts and no makeup. Breast cancer is a popular Lifetime movie disease. Movies about this disease always touch on the "Will I still be attractive to my husband?" aspect. And it is a sad fact that a woman who has to lose this part of her body does indeed have to confront issues of sexuality and body image -- not the case after a woman loses a lung or has a heart attack.

Is this why we'll never see a Lifetime movie about a woman with heart disease? Is it why we'll never see Jill Eikenberry struggle with a low cholesterol diet ("Damn it! I can't stop eating bacon. It's who I AM!") while Michael Tucker promises to stay with her even if she loses interest in sex as a side effect of her blood pressure medication? Because the struggle with heart disease doesn't seem noble or feminine enough? Because it can't be trivialized and reduced to sexual terms? Because it doesn't require actresses to be filmed examining their naked selves in the mirror?

Then there's the age of the woman on the cover of Time. She's clearly in her 20s. And yes, women in their 20s can get breast cancer. But according to the National Cancer Institute, a woman's chance of being diagnosed with breast cancer is one out of 257 from age 30 to 40; one out of 67 from age 40 to 50; one out of 36 from age 50 to 60; one out of 28 from age 60 to 70; and one out of 24 from age 70 to 80. But you don't see a 70-year-old woman fondling herself with her head thrown back on the cover of Time.

And let's not forget gender. Prostate cancer has the highest incidence rate among men, and it gets a lot of attention from the media too. Would Time feature a young, buff, naked guy cupping his "family jewels" as he looks thoughtfully into the distance on the cover for that story? No, the editors would no doubt go with the hangdog-yet-resolute face of either Joe Torre or Rudy Giuliani.

Young naked women are used to sell things to men and women alike. And the argument could be made that if a naked woman on the cover of Time gets you to read about breast cancer, then it's done its job. Maybe any coverage is good coverage, if it can save lives. But as the mainstream media keeps the focus on young healthy naked breasts and how they identify and feminize women, one can't help but wonder if breast cancer gets so much coverage because of the first word in the disease, not the second.

Karen Lurie is a writer living in New York City. She contributes to HoleCity, Modern Humorist and Flak Magazine.

Black Music and the Presidential Experience

"Do you remember lying in bed with the covers pulled up over your head? Radio playin' so no one can see?"

When he died this April, Joey Ramone may have looked like a kid ready to slouch against the wall of a laundromat, but really he was nearly the same age as the Early Boomers who then ruled the world: Clinton, Bush, Gore. (The dates of boomer birth, 1946 to 1964, may be sociologically useful in some respects, but not for accurate popular culture critique. Hence, Early Boomers, who can remember getting their first television sets; Mid Boomers, who to the best of their recollection have had TV in the family room their entire lives; and Late Boomers, who've always had TV in their bedrooms.)

In "Do You Remember Rock 'n' Roll Radio," he was recalling one of his generation's mythic type-scenes. Most boomers can share a story like his of sneaking off to listen to forbidden music. Most would agree with the language: "so no one can see," not "so no one can hear." You didn't pull the covers up because the noise would upset your parents. Instead, it was the concept of rock 'n' roll that shook them to their Victorian cores. Listening to "Jerry Lee, John Lennon, T. Rex and Ol' Moulty" thus outstripped teenagers' other most popular solitary bedtime activity for parent-shocking potential.

More than television, in its infancy a socializing machine that involved the whole family, the transistor radio helped define boomers as a separate (even secessionist) generation. Boomers with transistors may have started out using them to listen surreptitiously to the World Series during history class (a mid-’50s caricature scene), but eventually found them more useful for sneaking through the back alleys of mass culture. In bed, late at night, you could listen in while strange people sang about things you'd never even think of if you lived in the suburbs or other dominant culture compounds. With your secret transistor radio, you experienced life with hillbillies, with Latinos, with blacks from the Delta and from the big city.

This generational epiphany, this class-and-race-vexing instant of revelation, is as much a sign of Early Boomerdom as CNN coverage of the Gulf War is to Gen X and CNN coverage of school shooters is to Gen Now. That is why a recent White House ceremony is interesting. Although the President is generationally a boomer -- indeed, given his birth soon after his father's return from service, he is chronologically a boomer of near-laboratory quality -- George W. Bush lacks certain important connections to his contemporaries. At the ceremony he revealed a startling cluelessness about 20th century music, especially popular songs, that suggests that, growing up, he may have lacked benefit of a transistor radio. Certainly he appears not to have twiddled the knobs in adolescent secrecy, tuning in interesting strangers from far away, while Poppy and Bar had cocktails downstairs.

On June 29, Bush proclaimed June Black Music Month. The dating of this proclamation -- and the "backdating" of the celebration -- caught the attention of Slate's Timothy Noah: "Two measly days to celebrate Black Music Month? Particularly given the central role blacks played in creating most genres of contemporary music, including jazz and rock 'n' roll, that hardly seems enough."

And certainly this does seem a miscalculation, though it is not raising much ire. For instance, BuzzFlash.com, which is sensitive to every Bush misstep, had not even linked to Noah five days after his posting and was in every other respect ignoring Bush on Black Music. That this sensitive barometer has thus far failed to register the president's apparently shoddy treatment of Black Music Month suggests that this story has no legs just yet.

But there is more to the story than simply the suspicious tardiness of the June 29 proclamation. There is also a speech which allows textual analysis of the president's (or his speechwriters') failure to understand how people with his generational and class backgrounds use black music. At the signing ceremony for his proclamation, the president celebrated "People who brought a lot of joy and heart and energy to the American scene." While the musicians he celebrated included some who created jazz, the president seemed unwilling to extend his welcome far beyond the entertainers whose r&b, soul, and rock 'n' roll music was his generation's soundtrack -- a soundtrack that, it turns out, he hardly listened to. And he went on to say that black music "is always easy to enjoy, yet impossible to imitate."

Easy to enjoy, impossible to imitate: The description is so neatly balanced, so rhetorically tidy, that you have to stop, go back, and reflect on it before realizing how wrong it is. Of course, it is possible to imitate black music. Elvis did it; Creedence did it; Madonna did (and still sort of does) it. You can come up with your own list: just close your eyes and point at any list of Top 40 artists from the '50s on; you're sure to tap an imitator. (Earlier: Stephen Foster imitated it; the guy who wrote "Dixie"; etc.) In fact, any history of American popular culture has to account for white Americans' insatiable desire to experience blackness vicariously -- that is, through imitation. Carl Wilson ached to be Chuck Berry just as Fred Durst aches to be Chuck D.

And "easy to enjoy" is just as dumbfounding. How can an American George W. Bush's age -- he turned 55 on July 6 -- say that black music is "always easy to enjoy"? What black music has he been listening to? (Performers at the White House celebration included James Brown, who regularly performs at Strom Thurmond's birthday parties, and the Four Tops.) In short, how can he call black music -- which includes not only the Tops' "Baby I Need Your Loving" and JB's "Hot Pants (Part 1)," but also such works as AmeriKKKa's Most Wanted, There's a Riot Goin' On, and Fear of a Black Planet -- "easy to enjoy"?

He has famously claimed to prefer country music over other pop genres, and numerous cultural critics have pointed out his blissful obliviousness toward the arts. So there are perhaps those pretexts for his cluelessness. But he has also claimed to be a representative of his generation. In his Republican National Convention speech, he used the word generation 11 times, and his point seemed to be that he would represent boomers better than his predecessor by reining in Bill Clinton's excess. So how can he, who campaigned as a kind of super-boomer, say that black music is "easy"? How can his speechwriters give him such shallow words? How can his handlers imply that he missed out on his generation's acquaintance with black and other minority musics, arguably the most significant agent of cultural change in the United States over the last half-century?

Did he not have a transistor radio?

Now that Little Richard has become a fixture on the Disney Channel it is hard to remember just what he meant to the first boomers listening to him shriek on their transistor radios. But what he implied was the deliriously violent overthrow of Western civilization. The name of his band -- the Upsetters -- is a hilarious (and somewhat chilling) reminder of the contrarian extremes early rock 'n' roll operated in. A white middle-class kid listening to Little Richard on the sly was widening his experience to include apocalyptic culture-hopping as a lifestyle.

One such kid went to the trouble of plagiarizing the name of Richard's band for his own high school group: Upsetter's Revue. He would grow up to become perhaps the first real Republican super-boomer. When he was 8 years old, Lee Atwater experienced the generational type-scene. John Brady writes about it in Bad Boy: The Life and Politics of Lee Atwater.

"One afternoon, ... Lee sat on the automobile's floor fiddling with the radio dial. A song came on with a cadence, a beat, a spirit that seemed to transfix the youngster. It was James Brown singing 'Please, Please, Please.' [His father] turned the dial, but that night Lee found the same station on the radio in his room -- WLAC in Nashville, Tennessee -- and he was hooked on rhythm and blues."

Apparently Lee Atwater shared his generation's transistorized epiphany; like Joey Ramone, he remembered rock 'n' roll radio.

John Brady offers many more details of Lee Atwater's appropriation of black music. His appreciation of r&b usually seems sincere, though there is almost always a hint of willfulness, as if performing with his band on a flatbed truck in a Columbia, S.C., shopping center parking lot in the late ’60s is simultaneously a true celebration of black music and an underhanded bad-boy assault on good taste. Thus in the life of this most honored Republican strategist, there is a markedly complex lifelong interest in black music. (An article at Soul-Patrol.com even blames Atwater, with some rationale, for "the Destruction of Black Music.")

His musical appearance on David Letterman's program -- he was one of those sidemen who sit in with Paul and the band -- predated Bill Clinton's Soul Man turn with Arsenio by several years and was not as embarrassing as it could have been. His fratboy vanity CD, Red Hot & Blue, may not stay in your personal rotation, but, for celebrity musicianship, it beats, say, Keanu Reeves and Russell Crowe. Along with Atwater's political acumen, his endorsement of Sun Tzu, his deathbed renunciation of assault politics, and his continuing influence via such disciples as Karl Rove and Mary Matalin, that interest in black music forms a major part of the Atwater mythos.

So, too, does the 1988 Bush, Sr., campaign, which Atwater successfully managed (not creating the Willie Horton strategy, as his apologists are quick to point out, but benefiting from it nevertheless). During the '88 campaign, the younger George Bush "observed" Atwater, ostensibly because Atwater, a Reagan functionary in 1980 and hence perhaps not viscerally loyal to Bush, bore watching. George W. Bush and Atwater eventually came to be close friends, the Butch and Sundance of the Right. (And last year's election provided many opportunities to link the candidate with Atwater -- a way to establish his gravitas.) According to a Washington Post article, one "mutual friend" even called it "a giggling, laughing Beavis and Butt-head [sic] relationship."

Of course, Beavis and Butthead shared an interest in music, a connection the mutual friend perhaps did not intend to draw. But the comparison leads once again to the question of how George W. Bush, who had in his background both four years of swinging frat parties and a Damon and Pythias relationship with a white bluesman-Machiavelli manqué, could have made it to 2001 without having noticed that black music is more than "easy to enjoy" and "impossible to imitate."

Of his youthful experience with popular music, we know that George W. Bush participated in a rock 'n' roll band of sorts when he was at Andover, but that it appears to have been a parody display, an excuse for schoolboys to stand onstage and make fun of things they didn't understand. At college he gave up trying to seem interested in his generation's music. Calvin Hill, a fellow Deke Yalie, noted in an interview with the Washington Post, "George was a fraternity guy, but he wasn't Belushi in Animal House."

So our commander-in-chief did not gator while Otis Day sang "Shout," but in the end that is about all we can say of George W. Bush's response to the music of the ’60s. (He told Oprah that his all-time favorite song was the Everly Brothers' "Wake Up, Little Susie," an estimable song, but a sign that his musical interest extended only to 1957.) History records that the Yalie Bush swiped a Christmas wreath (charges dropped) and went to Princeton to pull down its goalposts, but not what he thought of "Mr. Pitiful" or "Are You Experienced?"

The liner notes for that first Jimi Hendrix album reflect the generational changes that apparently escaped George W. Bush's attention. "Used to be an Experience made you a bit older," you read. "This one makes you wider." In the well-advertised generation wars of the '60s, you could gain experience, meaning wisdom or cachet, without suffering the indignity of growing older. (At the Republican National Convention, George W. Bush spoke about the age anxieties of boomers: "Our generation has a chance to . . . show we have grown up before we grow old." He did not name-drop Pete Townshend.)

But there is more to the notes' lexicographical quibbling than mere age-focused marketing savvy. Whatever sexual and pharmacological roads of excess the Jimi Hendrix Experience led dominant culture along, and whatever message was implied by his band's racial integration (and shortly thereafter by Sly and the Family Stone's racial and sexual integration), a broader definition of "widening" is worth noting, especially since those liner notes seem to look backward rather than to the future.

To a major extent the groundwork for boomer appropriation of black music had already been done several years before "Are You Experienced?", when kids were listening to music on radios by themselves because they knew their parents wouldn't allow it. An entire generation ought to have come to some sort of understanding about how black music, among other then-outsider arts, worked within dominant culture. Popular music was no longer the simple entertainment of your parents' generation. Sometimes it was "easy to listen to," but often it was hard.

You do not have to be an especially adept critic of popular music or have a terrific memory for pop songs to understand that. Nor ought you to be a particularly astute politician to know that almost every boomer knows these things. All you need is a sense of the secret history of your generation, a memory of the strange and wonderful things that came through the night sky many years back, a marvelous experience if only you had been listening.

Are You PMDD-ing?

There is a lot of talk these days about the danger of male emotions, or rather, the danger of suppressing male emotions. Our culture’s persistent message that “boys don’t cry” is blamed for everything from road rage to school violence. But if we are going to argue that this characteristic of masculinity is unhealthy, then it is interesting to note that the converse -- girls do cry -- is often regarded as equally unhealthy. In fact, women’s tears -- or any other overt manifestation of their emotions or desires -- have often been viewed as symptoms of illness.

Historically, doctors and others have blamed a woman’s reproductive system for any number of negative emotions. For much of the 19th century, women were warned away from formal education because it would divert energy from reproduction and place too much stress on delicate systems not built for intellectual pursuits. In the first decades of the 20th century, the most radical women activists -- the ones who dared to demand the right to vote or share then-illegal information about birth control -- were described as unnatural or sick.

At other times in American history, women who were active in politics or professional life were labeled as “lesbians” or “communists” by those who considered such terms slurs and code words for mental abnormalities. (These days, the words “liberal” or “feminist” often seem to serve a similar purpose). When I was in high school, this same attitude was communicated in the question most likely to silence a girl’s temper or freeze her tears: “Are you PMS-ing?”

In each case, women -- primarily middle-class white women -- were advised of the symptoms that signaled they had strayed too far from the natural demands of their bodies: irritability, exhaustion, frustration, inattention to grooming, or even lack of interest in sex. If any of these symptoms manifested, women were advised to return to a healthier lifestyle -- one that included a husband, children, and the soothing routines of home.

Nevertheless, women continued to resist the assertion that they ventured into politics, higher education, and professional careers at the price of their sexual health. They gradually changed the conventional wisdom of medical science in the process. In the past, men’s bodies were assumed to represent the norm while women’s reproductive systems were considered a variation to be controlled. But over the past 30 years, female doctors have drawn national attention to gender bias in medical education, drug testing, and disease studies, making slow but steady progress at removing this presumption from medical research.

During her tenure as the first woman to head the National Institutes of Health, from 1991-1993, Dr. Bernadine Healy oversaw a $625 million Women’s Health Initiative to study diseases that affect women. Similar initiatives have allowed medical professionals to collect information about the ways in which women respond to non-gender specific illnesses, such as heart disease, as well as to a host of medications. Studies of genetics and hormones also hold promise for enabling doctors to understand meaningful differences in the physiology of men and women.

But despite this progress, the basic assumption persists: Women’s emotions are threatening and unnatural.

This message still permeates American popular culture, as reflected in a recent trend in marital pop psychology: the surrendered wife. The basic premise of this movement, as espoused by Laura Doyle -- a former journalist who writes books and leads seminars to help women achieve “surrender” -- seems to be that women destroy their relationships through constant efforts to communicate their opinions. But rather than advising women to seek healthy avenues of emotional expression, “surrendered wife” proponents encourage women to give up any attempt to control their relationships.

In fact, Doyle advises women to adopt the phrase, “Whatever you think, dear,” and to turn household accounts over to husbands in exchange for a weekly allowance. Her philosophy suggests that healthy relationships -- and therefore healthy women -- are made through surrender to male authority. Somehow I can’t imagine the suffragists of the 1900s -- not to mention the Christine Todd Whitmans and Sandra Day O’Connors of today -- would agree.

Popular culture also suggests that professional, middle-class women contaminate society as well as their personal relationships. For example, the media still often cite working mothers as a cause of the afflictions of violence and drug use among adolescents. And while Americans may be more likely to feel sympathetic toward women who must work to ensure their children’s basic needs, we remain suspicious of those who work for personal satisfaction. How do we tell the difference between these two types of working mothers? Just tune into the Oprah Winfrey Show: If you’re exhausted, irritable, and fear you have no time to nurture your children or your marriage, you may be creating a sick household.

What then, can we make of those women who forgo professional careers and yet still experience exhaustion, short tempers, and crying jags? Might they be frustrated by the routine of life as stay-at-home moms? Of course not. Like the women of yesteryear, they are simply victimized by the demands of their reproductive systems. No worries, though -- a new diagnosis and cure are available, according to advertisements on daytime TV. These commercials tell us that women’s monthly hormonal fluctuations affect levels of serotonin in their brains, resulting in a serious-sounding condition called Pre-Menstrual Dysphoric Disorder. Any day now, I expect to hear the updated question on high school campuses: “Are you PMDD-ing?”

I’m not a medical professional. Perhaps PMDD is a legitimate diagnosis of legitimate symptoms. But as a cultural historian, I have my doubts. Despite enormous changes in acceptable roles for women, old prejudices and old dichotomies persist. The normal functioning of women’s reproductive systems is always assumed to be a source of stress for them, while men’s reproductive systems are only a source of stress when they do not live up to men’s expectations.

Just compare the way drug companies market the cure for PMDD to the way they market a cure for ED: erectile dysfunction. PMDD ads feature a visibly agitated woman ordering her sheepish husband to leave her alone; a distressed size 8 woman trying to squeeze into size 6 jeans; and a clearly overwhelmed and exhausted woman with her head in her hands.

Across the gender divide, meanwhile, former sufferers of ED sit calmly at the dining room table, holding hands with their supportive (surrendered?) wives, describing the pleasure of being “normal” again. Pfizer, the drug company that markets Viagra, even paid Bob Dole, the even-tempered former senator and presidential candidate, to appear in an ad urging those afflicted by ED to see their doctor. Not surprisingly, he was chosen because he represents “courage,” Pfizer spokeswoman Pam Gemmel said when the ads debuted in 1999. Dole is, apparently, doing well, since he was last seen ogling Britney Spears.

I’d like to see a depiction of the emotional turmoil caused by men’s temporary, abnormal affliction: anger? crying? frustration? an inability to deal with the demands of a relationship?

Hey -- are they PMS-ing?

Denise D. Meringolo does not suffer from PMDD or PMS, but from Ph.D. candidacy. She is also Sicilian, which some say is an alternative explanation for her occasional emotional outbursts, but that’s another article ...

Selling the White Wedding Fantasy

You have dreamed of this day your entire life, and in your dreams it has looked like this:

You in a creamy white organza gown, your hair done in ringlets bunched around a shimmering tiara. He in a crisp, black tuxedo, his rugged jaw quivering slightly as he watches you walk down the aisle. Together you stand beneath a crystal blue sky on a cliff overlooking the Pacific Ocean, the ocean breeze gently blowing your hair. This is where you make your promises. There is a tasteful yet gleefully romantic kiss. And then the deal is sealed.

Afterwards, you and your beloved and 200 of your nearest and dearest retire to a large tent where fine china and centerpieces of fully bloomed, peach-colored roses adorn each table. Ribbons upon ribbons of tulle and organza drape the walls. You dine on a sumptuous feast, sip champagne from engraved glasses, and then come together, fingers interlocked and eyes brimming with joy, for your first dance - the dance you dreamed of since 10th grade - as husband and wife.

If you are a woman about to be married, this is what you want. Or, this is what all the magazines, television programs and movies about weddings tell you that you want. And, based on the increasing success of America's wedding culture, women appear to be buying into this fantasy with every bridal magazine we skim, every episode of A Wedding Story we watch, and every ticket we purchase to see Jennifer Lopez play a wedding planner who falls in love.

In the last decade, wedding-related products and entertainment have become increasingly popular. Bridal magazines, like Modern Bride and Martha Stewart Weddings, fly off the grocery store shelves. Internet sites like The Knot have turned into mini-media empires, selling books and other print publications in addition to their Web services. The news media covers Madonna and Guy Ritchie's wedding as though it were a G7 Summit.

On television, it seems impossible to escape the "I do" syndrome. Proposals and weddings have long been used as a device for ratings-boosting season finales, from Joanie and Chachi's marriage on Happy Days to Monica and Chandler's engagement on Friends. These days, in addition to sweeps gimmicks, networks are broadcasting a seemingly endless parade of reality and infotainment-style shows about getting hitched: TLC's A Wedding Story; Lifetime's Weddings of a Lifetime; the In Style Celebrity Weddings special; VH1's All Access: Rock & Roll Weddings. The list is longer than a full-length bridal train.

And, of course, there are the movies: My Best Friend's Wedding, Runaway Bride and The Wedding Planner -- which, despite universally awful reviews, was number one at the box office for two consecutive weeks and grossed $47 million-plus in four weeks. So how can we explain this seemingly insatiable appetite for all things bridal? Just as there are many ways to pop the question, there are many theories that may shed some light upon the wedding culture boom. Take a walk down the aisle and let's explore them.

Theory Number One: It's All About the Benjamins

Thanks to a booming economy that peaked in the late '90s, savvy investors and dot-com dabblers found they had a lot of cash to spare. And when you've got money to burn, there's no easier way to turn it to ash than by planning a wedding. According to recent statistics, roughly $50 billion is spent annually in the U.S. on weddings, which averages out to $20,000 per wedding. One can assume that the stable economic times allowed well-off Baby Boomers to feel more comfortable exceeding wedding budgets for their children. In many cases, the couple themselves -- already well established and successful in their careers -- found they could contribute to the proceedings as well, if not pay for them outright.

In recent years, stories about lavish weddings have spread in the media, making it seem somewhat normal to treat a nuptial event as a big-ticket item rather than an intimate gathering. Destination weddings -- over-the-top parties in exotic locales like Italy and the Caribbean -- became a major trend. And celebrity weddings, with all their fanfare and ludicrously expensive pomp and circumstance, were hot topics. (When you hear that Brad Pitt and Jennifer Aniston spent $1 million on their wedding, it somehow seems reasonable to spend a mere $200,000 on your own, doesn't it?)

Given the fear about a possible recession, will wedding culture go out of style? Probably not, because regardless of Puff Daddy's assertion, it's not solely about the Benjamins. It's also about...

Theory Number Two: Supply and Demand

It's so basic that it's almost embarrassing, but here it is: Wedding-related products continue to proliferate because they make money.

Weddings are special events but they're also big, big business. Magazine magnates, television producers and movie studio executives wouldn't continue to use bridal themes as a blueprint if they didn't snare consumers. Ten years ago, the magazine Martha Stewart Weddings did not exist. Now it's one of the most popular, if not the most popular, bridal magazines around. And that's because people are willing to pay $7 to thumb through all those glossy pages and gaze at the pretty pictures. (Sorry, guys; the industry has yet to convince men that they need 500-page guides to figure out what to wear and where to honeymoon.)

If you've ever been a bride, then you know that one bridal magazine isn't enough. After becoming engaged, buying a truckload of those advertising-rich periodicals is practically a rite of passage, a way of proclaiming to everyone in your check-out line and to society-at-large that you have entered the nuptial sorority. If your engagement will be a long one, then you'll buy even more bridal magazines. And even if you don't, it's OK -- there's a steady stream of brides-to-be and an assembly line of rose-colored-glasses-wearing wedding planners who will buy these publications from now until the end of time.

The same can be said of cinema. The past decade has been chock full of mostly mediocre yet highly profitable marriage movies: Father of the Bride, Four Weddings and a Funeral, the Julia Roberts two-fer My Best Friend's Wedding and Runaway Bride, The Wedding Singer, and the most recent Lopez vehicle, The Wedding Planner.

Consider The Wedding Planner for a moment. One could argue that it was successful largely because of Lopez's drawing power. But do you think the film would have done as well if she and Matthew McConaughey fell in love against the backdrop of a homeowner's association battle? Probably not. The wedding theme has worked in the past and when new ideas are scarce, any movie studio or other revenue-hungry business will go back to what's tried, true, borrowed and blue.

Theory Number Three: We Are All Joan Rivers

I don't mean that we've all had plastic surgery and enjoy using the phrase "Can we talk?" Deep down inside there's a little bit of Joan (and Melissa) in many of us because we enjoy analyzing fashion. And fashion increasingly has become a part of our collective culture.

At least one reason so many people watch major award shows like the Golden Globes and the Oscars is to see what the stars are wearing. When someone sports a dress that's particularly outrageous, it becomes fodder not only for gossip columns, but for mainstream news media as well. Look no further than Ms. Lopez again for proof.

As much as we enjoy the Academy Awards, few of us ever actually get to go, much less explain to an Entertainment Tonight reporter that tonight we're wearing a dress Carolina Herrera specially designed for us. The closest most of us get to high fashion is our wedding. For once, the red carpet is yours and you get to dress as close to the Hollywood dolls as you choose. You can hop from bridal boutique to bridal boutique trying on $4,000 gowns and pooh-poohing them because they're "just not what you're looking for."

It's a game of dress-up made reality, inspired by the media's fascination with celebrities. The recent In Style: Celebrity Weddings program broadcast on NBC put the spotlight on several celebrity weddings and, during one segment, on singer Toni Braxton and her search for a bridal gown. Viewers won't find out what she chose until next year -- and since the first In Style wedding show merited a second, you can rest assured there will be a third. As fluffy, shallow and silly as such segments might be, they pique viewer interest, particularly among women. Before you're married, you dream about how you'll look. After you're married, you dream about how your friends will look (or how you could have looked). And if you have a daughter, you dream about how she will look.

So you eat up these TV specials like pints of Ben & Jerry's chocolate chip cookie dough ice cream. After all, you're Joan Rivers. And you want your little Melissa to look like a princess. Anything less and you're likely to feel marginalized by a society that continues to press the white wedding fantasy on women.

Theory Number Four: The Circle of Life

In case you haven't guessed, I am engaged to be married. My wedding is in May and for the past 15 months (long engagement, many bridal magazines), I've been knee-deep in planning, but without a Lopez-esque wedding planner to help me out -- good thing since she has a tendency to fall for other people's fiancés. As level-headed and practical as I think I am, I have found myself engaging in heated arguments and internal debates about subjects to which I never thought I'd be reduced. I've analyzed the pluses and minuses of dyeable bridesmaid shoes, the virtues of a detachable veil and the benefits of buttercream frosting. On several occasions, after hearing myself talk, I have paused and asked, "What are you even saying? What have you BECOME?"

It's so easy to get wrapped up in the money, the style, the etiquette and all the assorted wedding accoutrements that we forget what weddings are really about: commitment, family, love and tradition. When you reduce everything down to its essence, that's why weddings have always fascinated people, from the time the Song of Solomon was written to today. Today's wedding culture is just a modern, money-driven response to an old obsession.

It's hard not to be smitten by weddings; they reaffirm that despite all the negativity observed in society, love and hope endure. For many, they are formal rituals that solidify a commitment to family and God. For me, as for others, the wedding is a final goodbye to childhood. I'm 28, so technically I've been an adult for 10 years. But becoming someone's wife, making a lifetime commitment to another person, will be the most adult thing I've ever done, even more adult than paying my own rent for the first time.

There are certain moments that define the cycle of life: baptisms, bar mitzvahs, graduations. But weddings, with their drama, their tears and their toasts, may be the most joyous, the most emotional and the most beloved. Like few other events, they remind us of what we once were and what we hope to be. And the wedding industry and the media know a good thing when they see it. After all, when you break down the four most consistent expectations in life, what have you got? Birth, marriage, death and taxes. And no movie studio will ever greenlight a film called Four Tax Audits and a Funeral.

Jen Chaney is a Washington, D.C.-based columnist and feature writer.

Naming Our Destiny

Choosing a name for our first son required a value synthesis. Between my wife and me, one of us wanted an African name to link him to our cultural heritage. The other preferred a more culturally neutral name, to shield our son's resume or school applications from prejudice. The crossroads of class-oriented and cultural values made me seriously question whether we had truly become "bourgeois," or whether we had become what some African Americans call "bougie" (pronounced with a soft "g"). Finding the answer would not only clarify my professional mission, it would help us chart the course we want our growing family to follow.

There is a difference between the terms "bourgeois" and "bougie." Bourgeois describes a true commitment to frugality, the accumulation of significant material wealth, and the preservation of the aristocracy and similar capitalist values. Bougie is a commentary characterizing certain African Americans as mostly concerned with the appearance of wealth. It also suggests a mask covering one's true culture.

In some ways, the name we chose for our son was a mask since his ethnic identity would not be readily apparent. But our deliberations involved more than appearance. When we pored over Web sites and books containing more than 20,000 names and genealogies, we talked about the symbolism each potential name would carry. Old Testament names were strong candidates because they were derived from virtues like faith, dedication and hard work, and giving him my middle name, which reaches back several generations, tied him to family tradition. Admittedly, we did give up some of our heritage by not choosing an African name. Yet bougie did not fully explain the reasons for our choice.

So how about bourgeois? Neither of us has an aristocratic background, nor are we conservative on fiscal policy. We both attended reputable universities but we're still working on accumulating material wealth. Moreover, our views on social policy seem much too progressive to mesh with the stodginess evoked by the bourgeois adjective. So bourgeois didn't seem applicable either.

Still pondering the answer, I was listening to National Public Radio when David Brooks was discussing his book Bobos in Paradise: The New Upper Class and How They Got There (Simon and Schuster, 2000). Brooks explained that "bobos" are those among the educated elite who attempt to reconcile bourgeois ideals with bohemian ones (i.e., promote artistic or cultural expression, question the status quo and serve the disenfranchised). Bobos might appear contradictory at first, but by using phrases like "compassionate conservatism," "smart growth," and "natural hair color" without irony, they can articulate the link between two (seemingly) opposing concepts.

Brooks' analysis of the bobo lifestyle can be broken into three categories: environment, personal history and attitude. A bobo environment includes expensive stores that provide shoppers the opportunity to pay above-market prices for items made to look old, simple or "earthy." High-end furniture stores like Ethan Allen or Pottery Barn fit this category with their rustic and Shaker styles. The personal history of bobos is marked by high professional achievement based not on family wealth, but on education, ambition and hard work.

The bobo attitude or world view shapes their environment and guides their personal histories. It moderates their bourgeois taste for expensive specialty markets with "organic" production or "Old World" cooking. It balances the hubris that might accompany an elite college degree with community service or understatement ("I went to a small university in New Haven"). The richer bobos become, the harder they try to find bohemian ways to express their financial success.

As I listened more closely, I thought perhaps Brooks had captured the disparate ideas I was trying to reconcile. Then a female caller asked him whether he had attempted to apply his thesis to African Americans; he confessed he hadn't. I've decided to try to answer the question, believing it might clarify my perspective on our family's values in the process.

By examining his book, I discovered that upwardly mobile African Americans and bobos share the same spirit. For example, as members from each group move up the social ladder, many legitimize their success by channeling it into social causes. My own values seem to be a composite of the interconnections between the two cultures, but it's worth first taking a look at the rise of the black bourgeoisie.

Merging Africa and America

Africans who crossed the Atlantic in the 1700s did not bring luggage. However, memories of their old lives were stowed away in their African names. But when they arrived in the New World, those names were replaced with European ones, triggering an ongoing process of African redefinition in America.

Olaudah Equiano exhibited an early attempt at redefinition. He was a slave captured in what is now southeastern Nigeria and brought to America around 1755. His first name, which means "favored," "having a loud voice" and "well spoken," turned out to be prophetic. It was illegal for slaves to read, but he found a way to master English and write a narrative in 1789 that recounted his Atlantic crossing and his introduction to slavery. In one passage he wrote, "My [new] captain and master called me Gustavus Vassa. I at that time began to understand him a little, and refused to be called so ... and when I refused to answer to my new name, it gained me many a cuff; so at length I submitted, and was obliged to bear the present name, by which I have been known ever since."

Equiano forged a new identity out of his African memory and European present; the result was an assimilation formula many African Americans after him followed to varying degrees. He wrote that he "imbibed their spirit, and imitated their manners," but he then directed his growing literary talent toward the Abolition cause.

Perhaps more difficult to reconcile were dual racial identities. A great number of biracial (or less favorably, mulatto) children were born during slavery, often the result of rape. Their lighter skin generally meant they would be exempt from grueling fieldwork, and would instead perform domestic work. It also meant they would receive a second-hand education of bourgeois values. By working in the "big house" they were exposed to the conversations and activities concerning the business of slavery. Hence, they were white and black, privileged and enslaved.

Some biracial progeny simply accepted the privilege that came with their lighter skin and conferred its benefits upon their "darker" family members. Others created a separate society, from which sprung a set of values and traditions still within the spectrum of black culture. For others still, there was no true reconciliation. They either muted African memories and "passed" for white, or were rejected by their extended family. This last group lived in a type of cultural exile. In his novel Autobiography of an Ex-Colored Man (Sherman, French & Co., 1912), James Weldon Johnson described such a dilemma: "Sometimes it seems to me that I have never really been a Negro, that I have been only a privileged spectator of their inner life; at other times I feel that I have been a coward, a deserter, and I am possessed by a strange longing for my mother's people." (The character's mother was black; his father, white.)

Questioning Education

Unshackled but not unfettered, free black men and women seeking education found incongruity between it and their self-image. On one hand, education could mean liberation of the mind and preparation for new opportunities. On the other hand, the education they received often advanced European ideals that conflicted with black realities. For example, Western Civilization courses recorded Africa as the "dark continent" or periods of glory for African nations as the "dark ages." African American contributions to history were overlooked or downplayed. Psychological studies of blacks depicted them as sociopaths, genetically prone to commit crime.

The question posed to would-be scholars then was: "Does academic excellence in the context of a society that holds African Americans in low regard mean acceptance of anti-black ideals?" Author Carter G. Woodson believed the segregated educational system prepared black students for second-class living. In his landmark book, The Mis-Education of the Negro (Africa World Press, Inc., 1933), he wrote: "When you control a man's thinking you do not have to worry about his actions. ...You do not need to send him to the back door. He will go without being told. In fact, if there is no back door, he will cut one for his special benefit. His education makes it necessary."

Conversely, pioneer educators such as Mary McLeod Bethune, W.E.B. DuBois, Anna Julia Cooper and Booker T. Washington had a more utilitarian outlook. "Education has the irresistible power to dissolve the shackles of slavery," Bethune said. Based on this belief, she went on to build Florida's Bethune-Cookman College. Washington founded Tuskegee University in Alabama. But whether education for blacks was a means to an end or a way to challenge European thought, the quest for education engendered intense introspection.

As many doctors, lawyers, educators and other professionals emerged from historically black institutions, they legitimized their education by discussing how it would help their communities. Returning to their native communities to begin private practice, black professionals took pride in providing services previously inaccessible to blacks.

To be Gifted, Black and Bourgeois

As the black bourgeoisie emerged in the mid-20th century, its members started living in enclaves that grew into exclusive communities: Highland Beach, Md., Baldwin Park, Calif., or Oak Bluffs in Martha's Vineyard. These communities also practiced their own form of discrimination by excluding blacks who did not have the proper pedigree. While membership generally guaranteed a financially rewarding professional life, it still could not protect wealthy blacks from the fallout of racial discrimination.

In his trenchant, if not caustic, book Black Bourgeoisie (Free Press, 1957), former Howard University Prof. E. Franklin Frazier analyzed the lifestyle of African Americans who had "made it" when school desegregation was occurring. He found that financial or professional success, coupled with their rejection of (and by) white society, led to destructive attempts at personal reconciliation. Instead of using social activism to balance their success, blacks purchased yachts, mansions and cars to mitigate their rejection. Instead of looking at their "blackness" as a context to celebrate the obstacles they had overcome, they saw it as the last obstacle preventing them from true success. The adoption of these "white manners" created other problems, as some were left feeling empty and filled with self-hatred.

The difficulty in finding equilibrium between bourgeois success and bohemian connections persisted into the late 1980s. When Lois Benjamin interviewed a hundred prominent African Americans for her book The Black Elite: Facing the Color Line in the Twilight of the Twentieth Century (Nelson Hall, Inc., 1991), she found that successful corporate executives worried about being labeled "Uncle Toms."

Though a misnomer, the term stung when used against blacks accused of trading their ethnic and racial identity for acceptance within a white business environment. Still, they knew they had to stifle progressive ideals if they wanted to be considered for the same positions as whites. Despite their knotted existence, some black corporate executives claimed they could successfully combine their influence with social consciousness. Some tutored children in distressed communities. Others made significant financial or professional contributions to organizations such as the National Urban League and the NAACP.

Building upon both Frazier's and Benjamin's work, Lawrence Otis Graham has painted the most current picture of the black bourgeoisie in Our Kind of People: Inside America's Black Upper Class (HarperCollins, 1999). In observations and interviews, Graham found that elitist attitudes and materialism, which Frazier noted in the 1950s, were still prevalent among the black elite. School reputation, house size and, to a large extent, the shade of their skin, determined which blacks qualified as one of "our kind." And yet Graham, who grew up as one of "our kind" and who also admitted getting a "de-negroidizing" nose job, fervently defended the social conscience of his colleagues. He recounted numerous examples of how exclusive organizations to which he belonged contributed to the well being of their communities.

For many of the blacks described within Graham's book, acceptance into the invitation-only inner sanctum of blacks was not only a means to success, but success itself. Those who belonged to exclusive black social organizations like Jack and Jill (for young adults), Boulé (for professional men) or Girlfriends (professional women) defended their membership, saying it positioned them for educational and professional opportunities. While that pursuit may in fact lead to material benefits, it is clear that acceptance into such groups is also an end in itself. Graham notes that negative opinion followed those who mistakenly attended the wrong colleges, joined the wrong Greek organizations, or moved to the wrong neighborhood.

Resolving Incongruities

Reviewing centuries of African American identity and redefinition made it clear that the questions my wife and I asked ourselves about our identity had been raised long ago. Phrases in our lexicon like "Give back to the community," "Remember where you came from" and, more recently, "Keep it real," or pejoratives like "Uncle Tom" and "oreo" (synonymous with bougie) indicated that many other African Americans were seeking answers to those questions.

This enhanced perspective, however, now provided answers for me: First, none of the terms -- bobo, bougie, or black bourgeoisie -- fully described us. Our high educational and professional standards might appear bougie to those who don't know our true motivations, but we will not change those standards because of appearances. We share the bobo attitude (financial success must blend with social conscience), but we have not yet achieved the financial success typical bobos have enjoyed. Likewise, our community, Prince George's County in Maryland, is still trying to make itself suitable for bobos. Though we have a few interesting bookstores and restaurants, we lack Old World bakeries, Ethan Allen stores and other high-end retail.

Our lives intersect with some of the social structures upon which the black bourgeoisie draw their strength. For instance, my wife went to a prominent historically black college. Also, the church where we were married is known for its network of black professionals. We know the success of the people with whom we interact in these places inspires other young blacks, but we don't share the elitism that genealogically separates one "kind" of successful black from another. Our friendships cross economic boundaries and are based on mutual feelings of respect, not who will help us gain entry into the inner sanctum.

To be a successful black couple in America is to accept some bourgeois values. We maintain thriftiness in order to save money to buy a home. And we help preserve the aristocracy by working at or buying products from Fortune 500 companies instead of starting our own. So we'll just keep on making sure each morning we recognize the couple we see in the mirror.

With these answers we could look back on our son's name and feel at peace with our decision. We chose his first name, Spencer, because it seemed to connote dignity and individuality. We chose his second name, Madison, because his father, grandfather and great-grandfather share it. And despite the fact his name is not identifiably African, our attitudes, experiences and extended family will cause it to take on African American meaning. The clarity these answers provide for our family's mission comes at a great time because Spencer is now taking his first steps.

Arnold M. Kee is a writer whose day job involves monitoring access and equity in the nation's community colleges.