Dan Kennedy

Faces of Death

It was a little more than two weeks ago that NBC News broadcast a piece of video from Fallujah that was both startling and sickening. U.S. Marines are seen walking into a mosque where several injured, unarmed Iraqi insurgents are lying on the floor. Although NBC censored the audio, we now know that one of the Marines excitedly said, "He's fucking faking he's dead. He's faking he's fucking dead." The Marine aims his rifle – and shoots the insurgent in the head.

For a few days, at least, the video clip – taken by freelance journalist Kevin Sites, a veteran war correspondent – seemed certain to become one of the signature images of the war in Iraq. And perhaps it will. An investigation is under way, and if and when the young Marine who pulled the trigger is publicly identified, the image may take its place in the pantheon of wartime horror. To this point, though, something odd has happened, or rather hasn't happened. So far, it seems, the clip is already fading from memory, and it has not joined such terrible images as the torture photos from Abu Ghraib, or those of the American contractors who were butchered and mutilated in Fallujah last spring, or the heart-stopping photos of casualties, many of them civilian, on display at falluja.blogspot.com, or even the unseen but easily imagined execution of Margaret Hassan, killed in cold blood after a lifetime of helping the Iraqi people.

Why this should be is hard to say. But here's a guess: we know too much to let the clip stand alone, without context. In her 2003 book on photography and war, "Regarding the Pain of Others," Susan Sontag writes that "to photograph is to frame, and to frame is to exclude. ... A photograph – or a filmed document available on television or the Internet – is judged a fake when it turns out to be deceiving the viewer about the scene it purports to depict." There was a time, perhaps, when Sontag's insight would have been regarded as a revelation. Today, though, it's commonplace. In introducing Sites's report on the Nov. 15 NBC Nightly News, anchorman Brian Williams said, "It illustrates how complex and confusing life can be on the front lines of this war," thus setting the stage for an ambiguous interpretation of video that appeared, on the surface, to be pretty unambiguous. Sites provided more context, reporting that American forces had been killed or injured by the booby-trapped bodies of dead insurgents, and that the Marine who shot the injured Iraqi had himself been shot in the face the day before.

Thus, rather than being cast as a symbol of all that's gone wrong in Iraq, the Marine has been treated almost as an object of pity. To be sure, that has not been universally the case. Amnesty International and Human Rights Watch have both called for an investigation into whether the Marine may have committed a war crime. Human Rights Watch executive director Kenneth Roth, on Fox News's The O'Reilly Factor, went so far as to say that "there is a prima facie war crime here that deserves court-martial." And yes, Arab news services such as Al-Jazeera and Al-Arabiya, not to mention Web sites that spew flat-out propaganda on behalf of the Iraqi insurgency, have reportedly been showing the video constantly. But among the American media, even staunch anti-war critics have been subdued.

The reason, I suspect, is that Sontag's lesson was internalized a long time ago by a generation of Americans who grew up learning about atrocities committed by U.S. troops in Vietnam, who blamed young American soldiers for a failed and immoral policy, and who later realized they were pointing the finger in the wrong direction. What that young Marine did in Fallujah was horrifying. But it didn't take place in a vacuum. Rather, it took place in the midst of days upon days of street-to-street fighting, of exhaustion, of fear, of split-second decisions that could mean the difference between life and death. What happened in that mosque was a tragedy, but who among us can say that we wouldn't have done the same thing? The deeper tragedy is that a scared young man made a mistake, while there are no consequences for the far more serious mistakes committed by the likes of George W. Bush, Dick Cheney, et al.

This is not to say that the Marine should be prosecuted. What he did may not have been a war crime, and a court-martial seems pretty drastic for his spur-of-the-moment reaction to a potentially dangerous situation. (Indeed, according to the current U.S. News & World Report, unnamed Pentagon officials expect that the Marine will be cleared of all charges.) But to argue that he should not be locked up in Leavenworth is not the same as saying that he did the right thing. Sites wrote a long, impassioned entry on his weblog, www.kevinsites.net, on Nov. 21, eight days after the shooting. Numerous reports have made clear Sites's empathy for the troops with whom he has been embedded. Yet what's been overlooked to some extent is the degree to which Sites, and those around him, understood that something had gone drastically wrong inside the mosque.

Sites wrote about the Marine coming up to him after the shooting and saying, "I didn't know, sir – I didn't know." Sites added, "The anger that seemed present just moments before turned to fear and dread." And he wrote that "observing all of this as an experienced war reporter who always bore in mind the dark perils of this conflict, even knowing the possibilities of mitigating circumstances – it appeared to me very plainly that something was not right. According to Lt. Col. Bob Miller, the rules of engagement in Fallujah required soldiers or Marines to determine hostile intent before using deadly force. I was not watching from a hundred feet away. I was in the same room. Aside from breathing, I did not observe any movement at all." And for those who would argue that Sites should have, well, lost the video, he had this to say: "Hiding this wouldn't make it go away. There were other people in that room. What happened in that mosque would eventually come out. I would be faced with the fact that I had betrayed the truth as well as a life supposedly spent in pursuit of it."

These are the words of an honorable man pursuing an honorable course. Yet so twisted with rage are some of our so-called patriots that they have all but accused Sites of treason for telling the truth – the whole truth, complicated and contextual, explaining not just what the Marine did, but what he had been through before he did it. Sites makes it clear that the U.S. Marine Corps itself is anxious to find out what happened, to learn whether a breakdown in discipline and training has occurred that could place other Marines in danger. To some here at home, though, things look a lot simpler.

Take, for example, "Frank from Malden," who called The Howie Carr Show on WRKO Radio (AM 680) the day after Sites's report aired on NBC. Calling himself "a former Marine and very proud of it," Frank said, "I think this young gentleman should have got a medal for what he did." Then there was this: "I would think that the reporters in country should kind of be looking over their shoulder. Because if the reporters are going to put these kids in that situation, they may have some friendly fire there, you know?" An incredulous Carr asked, "What, do you think they're going to frag this guy the next time he goes out, this Kevin Sites?" Frank replied, "It could happen. That's the way it was. It could be again, you know?" Frank laughed and hung up. Carr seemed momentarily flustered; he then recovered and said that Sites "seems to be a real pro," and that he presumably could not have gotten his video out of Fallujah without military approval.

Yet Frank from Malden's point of view isn't all that unusual. In 1965, Morley Safer, then a young reporter for CBS News, accompanied some Marines to a group of Vietnamese villages known as Cam Ne. What Safer observed was the first televised American atrocity of the war. The Marines set fire to thatched huts, and threw hand grenades and fired flamethrowers down holes, killing the civilians who were cowering inside. It would have been even worse if Safer's South Vietnamese cameraman hadn't intervened. As described by David Halberstam in "The Powers That Be," CBS executives were deeply unhappy when they saw Safer's story, knowing – then as now – that they would be accused of being unpatriotic, of undermining the war effort, by putting the truth on the air. "They knew they had to go with it," Halberstam wrote. "It was not so much that they wanted to as that they simply could not fail to use it." And so they did. And so CBS president Frank Stanton was awakened the next day by a phone call from Lyndon Johnson, who told him, "Frank, this is your president, and yesterday your boys shat on the American flag."

To this day you can find reasonably bright people who believe that it was the media that lost the war in Vietnam – that the United States never lost a battle, but lost the larger fight because the media undermined morale and dissipated support for the effort. Well, it's true enough that we never lost a battle in Vietnam; nor are we likely ever to lose a battle in Iraq. But the dilemma then – and, one fears, the dilemma now – is that no matter how many battles we win, the war can't be won because it's based on false premises. How can we win a war that we're fighting on behalf of people who hate us and want us to leave their country? In any case, covering up the truth is hardly the solution, then or now.

"I feel very strongly that everything should be shown," says Jules Crittenden, a Boston Herald reporter who was embedded with the Army's Third Infantry Division in the spring of 2003. "In this particular case," Crittenden says of Sites, "he had a job which is very unambiguous. His job is to record what's going on. The military invited him there with full awareness – and I know, because I've spoken to many of the people involved with designing the program – that the embed will produce good, bad, and ugly. The military, in establishing this program, understood that there are going to be some bad days. I think they have always expressed a great deal of faith in the professionalism and fundamental goodness of American soldiers. You don't have a bunch of loose cannons running around out there. And they can trust their people to deal with this. I don't think that sense of trust that the military has on their own part or the trust of the American people has been violated by this incident. There's an investigation under way. The majority of people out there seem to understand the context in which this situation happened."

Context is vital, but it can also change, and it's never complete because everything can't be included. Another moment from Vietnam: in 1968, Eddie Adams photographed South Vietnamese general Nguyen Ngoc Loan at the very instant that he executed a Viet Cong fighter with one bullet to the head. Adams won a Pulitzer Prize for the photo, which seemed to encompass all the insanity and immorality of that war. Yet Adams, who died earlier this year, later came to see it quite differently. He got to know General Loan, and concluded that the execution had been justified; and he regretted that Loan's life was made much more difficult because of that infamous image.

"Photographs, you know, they're half-truths, you know, that's only one side," Adams told National Public Radio in 1998, shortly after Loan's death. "It's just a sad statement, you know, I think of America. He was fighting our war, not their war, our war, and ... all the blame is on this guy. I got to know him pretty well. I talked to him the last time about six months ago. He was very sick, you know, he had cancer for a while. And I talked to him on the phone, and I wanted to try to do something, explaining everything and how the photograph destroyed his life, and he just wanted to try to forget it. He said let it go. And I just didn't want him to go out this way."

I asked Dirck Halstead, himself a former war photographer and an acquaintance of Adams, whether he could draw an analogy between Adams's experience and Sites's. Halstead, now the editor and publisher of a magazine called the Digital Journalist, responded by e-mail. "In general, photojournalists are like cops. They have pledged themselves to always do the right, ethical thing. However, we all have heard of countless police officers who have become traumatized as a result of having to shoot someone in the line of duty. Unfortunately, this comes with the turf," Halstead told me. "Kevin Sites was covering a battle, as a pool embed. His job was to record what was going on. He was as surprised as Adams was by what happened. He also, obviously, was conflicted and confused by what he had just shot. ... He clearly has bonded with the men he has been covering. This happened with most of the pool reporters and photojournalists who have covered the war. This makes it even more difficult, since he obviously feels he let his comrades down. But he has to keep in mind why he was there, and what his job was. I feel for him and want to express to him my respect for a job well done."

Adams only learned of the broader context of Nguyen Ngoc Loan's life later, after his photo had been seen around the world. Sites tried to offer what context he could in his original report – the exhaustion, the fear, the booby-trapped bodies, the death that lurked around every corner. But a photographer can, at best, help tell the story of what's happening just outside the range of the viewfinder. The broader context – the broadest context – remains elusive. On Nov. 17, NPR's Melissa Block interviewed an Al-Jazeera spokesman, Jihad Ali Ballout. The subject: why Al-Jazeera was running Sites's video on an almost-continuous loop, whereas it refused to show the execution of Margaret Hassan, a video that network officials have admitted is in their possession. Ballout told Block that "these atrocities of killing innocent people, especially people such as the late Mrs. Hassan, was really an outrage. There is a difference between that and when there is a whole army of 20,000 military people converging on an area in Fallujah." Block responded by asking whether Al-Jazeera was using a "double standard" in showing the Sites video but not the Hassan execution. Ballout didn't really have an answer.

Now, of course, the Hassan execution does not offset the Fallujah mosque incident in any way, and the moral equation is complex. On the one hand, what happened to Hassan does not somehow justify the misbegotten war in which we are now embroiled. On the other hand, it is useful to remind ourselves – and it is obviously useful for the Arab world to remind itself – that what the Sites video documents is not the moral equivalent of shooting Margaret Hassan in the head. One was a split-second reaction to a confusing, possibly deadly situation. The other was an act of terror in the most literal sense – that is, it was the taking of an innocent life solely for the purpose of spreading terror. One was a tragic mistake. The other was pure evil. But though we should surely see both – as well as the bodies of the civilians who have died or been maimed by our arrogant act of liberation, as well as the beheadings and the Abu Ghraib images and everything else – we travel down a dangerous road when we use these images to try to justify one horror by invoking another. At best, they help us to understand, however imperfectly.

"The meaning of these pictures is not embedded in the video itself. What people think about this video is going to depend on what they think about the war," says Tom Rosenstiel, director of the Project for Excellence in Journalism.

Bob Zelnick, who chairs Boston University's journalism department and is a former war correspondent for ABC News, praises in-depth reportage, such as Dexter Filkins's Nov. 21 New York Times article about accompanying U.S. troops in Fallujah, for educating the public about the terrible consequences of urban warfare. "There has been a realistic picture presented of what these guys are up against," he says. "You read that stuff and you can understand what's going on over there, why anybody would pull the trigger first and ask questions later. Human beings have the blessed ability to make distinctions. We can distinguish between Abu Ghraib and Fallujah. The reason we can do that is because of good reporting in each case."

The problem – the tragedy, really – is that though the images tell us much about the way the war is being conducted, they tell us little about the wisdom of the war, or even its ultimate cost. It says much about this war that we can see pictures of a Marine killing a wounded insurgent, of Iraqi inmates being tortured, and of atrocities committed against Americans and other Westerners by terrorists, yet we cannot see the flag-draped coffins arriving at Dover Air Force Base. That – as well as the additional suffering we've inflicted on the already-long-suffering people of Iraq – is the ultimate context.

Five on the Floor

When the new Senate storms Capitol Hill early next year, the narrow Republican majority of the past two years will disappear, to be replaced by a much wider Republican majority. Currently, the Senate comprises 51 Republicans, 48 Democrats, and an independent – Jim Jeffords, of Vermont, a former Republican who usually votes with the Democrats. Because of last week's election, the Senate will soon seat 55 Republicans, 44 Democrats, and Jeffords.

Who are these people? Unlike the House, where Republican members lead lives of near-anonymous fealty dictated by Speaker Dennis Hastert and majority leader Tom DeLay, senators matter as individuals – not just as a voting bloc. There are moderate Republican senators, such as Olympia Snowe and Susan Collins, of Maine; Lincoln Chafee, of Rhode Island; and Arlen Specter, of Pennsylvania – who nearly got his head handed to him last week for daring to suggest that anti-choice judges might not pass muster. There are religious conservatives, such as Sam Brownback, of Kansas, and Orrin Hatch, of Utah. And there is Jim Bunning, of Kentucky, who's in a class by himself: last week he was re-elected despite widespread speculation that he may be suffering from Alzheimer's disease, and even though two of his supporters had sneeringly suggested that his Democratic opponent was gay.

Seven new Republican senators were elected last week. Two are unremarkable. Mel Martinez, of Florida, was George W. Bush's first secretary of Housing and Urban Development. Despite a poor record on the environment, Martinez deserves some thanks from Democrats: he and the White House intimidated Congresswoman Katherine Harris (yes, that Katherine Harris) into not running for the Senate this year. In Georgia, Republican congressman Johnny Isakson will succeed Democratic senator Zell Miller, who's retiring. Isakson – a moderate who's pro-choice (except when he isn't) – may well be more a voice of reason than Miller has been. That said, Isakson's outburst earlier this year that Bush is "the best president the United States has ever had" was certainly embarrassing, if not nearly as embarrassing as Miller's red-faced rant at the Republican National Convention.

What remain are five genuine specimens of right-wing Republicanism. Keep an eye on these guys. They're dangerous.

1) Tom Coburn: Keeping us safe from condoms and the 'gay agenda'

Fresh from helping to save Oklahoma from the scourge of teenage lesbianism, Tom Coburn arrives in Washington with perhaps the most bizarre set of right-wing credentials of anyone in the Republican Class of 2004. A former three-term congressman who was swept into office 10 years ago on the coattails of Newt Gingrich's Contract with America, Coburn – who succeeds retiring Republican senator Don Nickles – is an obstetrician possessed of an obsessive fascination with other people's sexuality.

In 2003, George W. Bush named Coburn to co-chair the Presidential Advisory Council on HIV and AIDS. Coburn's very first act was to speak out against the one preventative behavior (other than abstinence) that actually works. "I will challenge the national focus on condom use to prevent the spread of HIV," he said upon his appointment. Earlier, as a congressman, he had sought to force condom manufacturers to label their products as "ineffective" in slowing the spread of sexually transmitted diseases.

But that doesn't begin to plumb the depths of Coburn's so-called thinking. In his successful Senate campaign against Democratic congressman Brad Carson, Coburn called for the death penalty for doctors who perform abortions. That certainly gives new meaning to the term "pro-life." As a physician, Coburn himself performed abortions, although he says it was always to save the life of the woman. Tell it to the judge, Doc. Nor is that the only dissonant note from his career in medicine: Coburn was once accused of having sterilized a young woman without her permission. He says she had asked him to perform the surgery, though he conceded that he had lacked the written authorization that the law required.

In the 1990s Coburn criticized NBC for broadcasting Schindler's List, the Oscar-winning film about the Holocaust, charging that it would encourage "irresponsible sexual behavior." That particular outburst was so odd that even one of his ostensible allies, self-appointed morals czar Bill Bennett, felt compelled to label Coburn's remarks as "unfortunate and foolish." Coburn is also an outspoken opponent of the "gay agenda" in general and same-sex marriage in particular; as a member of Congress, he refused to allow the city of Washington to fund its program for domestic-partnership benefits.

Earlier this year, Coburn said that lesbianism is "so rampant in some of the schools in southeast Oklahoma that they'll only let one girl [at a time] go to the bathroom." Coburn's source: a campaign worker. He later said his remarks had been taken "out of context," whatever that was supposed to mean. His spokesman gamely insisted that Coburn was worried that "our kids are getting mixed messages about sexuality." Mixed-up, rather, if they've been listening to Coburn.

Sources: Salon, September 13, 2004; AlterNet, March 28 and October 13, 2004; the Associated Press, October 12, 2004.

2) Jim DeMint: 'The Family' values, homophobia, and tax chicanery

If Tom Coburn is #1 on our list of exotic senatorial specimens, South Carolina's Jim DeMint might qualify as #1A rather than #2. Congressman DeMint, who defeated Democrat Inez Tenenbaum in the campaign to succeed another retiring senator, Democrat Ernest Hollings, belongs to a secretive religious organization with anti-Semitic leanings, and is a tax hypocrite and an outspoken homophobe to boot.

The decades-old religious group, best known for sponsoring the annual National Prayer Breakfast, is generally known as "The Family," "The Foundation," or "The Fellowship." A magnet for high-ranking conservative Washingtonians, it is said to have supported some vicious Third World right-wing dictatorships over the years – as well as to have performed the occasional good deed, such as helping to foster the relationship between Menachem Begin and Anwar el-Sadat. Members also reportedly believe that God's covenant with the Jews is broken, and that they are "the new chosen." DeMint is close enough to the inner circle to have lived, along with five other congressmen, in a million-dollar Capitol Hill apartment subsidized by "The Family."

During his campaign against Tenenbaum, though, DeMint's membership in this little-known group was far less of an issue than his mouth was. At a debate in October, DeMint said, "If a person wants to be publicly gay, they should not be teaching in the public schools." Even a local Christian Coalition official and DeMint supporter named Bette Cox said, "I wouldn't have said that. It's a civil rights issue with me. You can't cut off someone's civil rights." DeMint refused to apologize – although he did apologize for saying that unwed, pregnant women should not be allowed to teach either. And he declined to fire an aide who'd sent out an e-mail referring to "fags" and "dykes" (or, to be more precise, "dikes").

One of DeMint's key issues during the campaign was getting rid of the federal income tax and replacing it with a 23 percent flat national sales tax. It's an idea that President Bush himself has been cozying up to in recent weeks. The simplicity of such a system is undeniably appealing, but, unless carefully designed, it would be the mother of all regressive taxes, biting deeply into the poor and the middle class for everything they buy. Do the math: a family that spends every dollar it earns would pay the full 23 percent of its income, while a wealthy family that spends only half its income and banks the rest would pay an effective rate of just 11.5 percent. So it's pretty amusing to learn that DeMint is a serial tax scofflaw, repeatedly making late payments on his federal, state, and local taxes between 1987 and 2001.

If nothing else, a flat federal sales tax would prevent well-connected people from gaming the system. People such as Jim DeMint.

Sources: Harper's magazine, March 2003; the Associated Press, April 20, 2003; the Columbia State, October 4, 2004; 365Gay.com; Salon, October 7, 2004.

3) David Vitter: Putting young men and women in harm's way

The election of Louisiana congressman David Vitter to the Senate is an ominous sign of the problems facing the Democratic Party, especially in the South. Vitter won more than 50 percent in a multi-candidate election last Tuesday, thus avoiding a runoff next month. The retiring incumbent, John Breaux, is a Democrat who's conservative enough to inspire teeth-gnashing among liberals. But unlike Zell Miller, who these days sounds more Republican than Dick Cheney does, Breaux is a Democratic loyalist capable of pulling off the occasional bipartisan compromise. Vitter, though, is a straight-down-the-line ultraconservative.

According to rankings published by the National Journal, a nonpartisan political magazine, Vitter is the most conservative congressman elected to the Senate this year – more conservative than 87 percent of his peers. He has a 100 percent ranking from the National Right to Life Committee; a zero percent ranking from Human Rights Campaign, a leading gay and lesbian civil rights organization; a zero percent ranking from the League of Conservation Voters and the Sierra Club; and an "A" from the National Rifle Association.

Vitter's opposition to reproductive choice is so unwavering that he has co-sponsored legislation to require doctors who prescribe RU-486 – a drug that, if used properly, can induce a safe, nonsurgical abortion – to have both the ability and the necessary equipment to perform a surgical abortion should one become necessary. As James Ridgeway observed in the Village Voice, "That's a little like asking a doctor who prescribes heart medicine to be able to do open-heart surgery, right there in the clinic."

Vitter was also responsible for inserting a provision into the No Child Left Behind Act that requires public high schools to supply the names and phone numbers of all juniors and seniors to military recruiters – an invasion of privacy that could have tragic consequences for impressionable, economically stressed young men and women. (To be fair, generous opt-out provisions are included.) When asked to explain his reasoning, Vitter said the previous nondisclosure policy "demonstrated an anti-military attitude that I thought was offensive."

Somehow, no right-wing success story is complete without an example of grotesque hypocrisy. So let the record show that, for several years now, Vitter's supporters have been denying the claims of a Louisiana prostitute that she'd had an 11-month affair with Vitter when he was a state legislator. For the record, we don't care whether the story is true or not. But you'd think the Christian Coalition, which gives him a 100 percent rating, and the Family Research Council, which grades him at 92 percent, would care quite a bit.

Sources: AlterNet, September 29, 2003; the Village Voice, March 27, 2001; Louisiana Weekly, December 29, 2003; National Journal, February 27, 2004. Interest-group rankings from Project Vote Smart.

4) Richard Burr: Corporate errand boy scoops up PAC money

North Carolina has come a long way since the days of Jesse Helms. Its Research Triangle is as sophisticated and well-educated as – well, as anywhere in any blue state. So it's only appropriate that John Edwards's successor in the Senate stand out as being somewhat different from his fellow Republican freshmen. To be sure, Congressman Richard Burr is as anti-choice, anti-gay, and pro-gun as the rest of them. But he comes from that strain of Republicanism more interested in sucking up to corporate interests than in joining hands with the godly.

How in the tank is Burr? With $2.4 million in donations, this distant relative of Aaron Burr received more money from political action committees than did any other Senate candidate this year. "The main people he looks out for and answers to are the large corporations. That is the most troubling thing about Richard Burr to me," says Berni Gaither, a North Carolina Democratic Party official. Democratic activist Hayes McNeil puts it more succinctly: "Burr's record in Congress looks like a whore's bed sheet."

The good life, Burr-style, can be awfully good indeed. In April 2002, the National Association of Broadcasters – the fine folks who brought you corporate media consolidation – flew Burr, first-class, to Las Vegas for its annual convention. The amenities included poolside drinks and a massage, although Burr reportedly reimbursed the association for his spa stay. "It's extremely valuable for members to get that overall snapshot of their particular industry," said Burr, who at the time was vice-chair of the Energy and Commerce Committee. "If not, we rely on everyone to come up here and tell us how things have changed."

North Carolina remains a place apart. Burr and his unsuccessful Democratic opponent, Clinton White House chief of staff Erskine Bowles, were falling over each other to take credit for a federal buyout of the state's struggling tobacco farmers. But there is an area where Burr stands out: his contempt for the environment. The League of Conservation Voters has named Burr one of its "Dirty Dozen" (along with fellow freshman senators-elect John Thune and Mel Martinez). The particulars: he supported President Bush on an energy bill provision protecting manufacturers of the gasoline additive MTBE from lawsuits over groundwater contamination; he voted six times against a ban on drilling for oil off North Carolina's Outer Banks; and he has opposed efforts to reduce mercury contamination and greenhouse-gas emissions.

"He has one of the worst environmental records on clean air and clean water in the U.S. Congress," says Mark Longabaugh, the league's political director. "That's one. Two, throughout his entire career he has shown a bias toward special interests, oil and gas or other polluters."

Sources: the Raleigh News & Observer, October 27, 2004; the Durham Independent Weekly, July 7, 2004; the Washington Post, March 11, 2003; Grist magazine, October 26, 2004; National Review Online, September 22, 2004.

5) John Thune: A simple-minded campaign of flag-waving and heterosexuality

Of all the freshman Republican senators-elect, there is one celebrity – John Thune, of South Dakota, who knocked off Senate minority leader Tom Daschle. But though Thune, a former congressman, is an ultraconservative with ties to the religious right, he doesn't stand out for any particular policy outrage. Rather, Thune is a master of the sort of political cheap shot that excites the imaginations of those who like their symbolism both simple and stupid.

Take, for instance, a debate between Thune and Daschle on NBC's Meet the Press. Thune was agitated over something Daschle had said in March 2003, just before the war in Iraq began – that is, that "this president failed so miserably at diplomacy that we're now forced to war." Never mind that a) Daschle was speaking the truth, b) he had voted in favor of the war resolution and later backed the $87 billion in reconstruction money for Iraq and Afghanistan, and c) he was a veteran and Thune was not. Thune took the opportunity to accuse Daschle of something close to treason, saying, "What it does is emboldens our enemies and undermines the morale of our troops."

Or take a proposed constitutional amendment against flag-burning – a cause that you might have thought had gone out of style with George H.W. Bush way back in the 1980s. Not, apparently, in South Dakota. "Unfortunately, Senator Daschle has consistently voted against this amendment. My record on this is very clear," Thune said at an event in Rapid City featuring some three dozen veterans, the recitation of the Pledge of Allegiance, and the singing of "The Star-Spangled Banner." Don't you wish you'd been there?

Or, finally, take a radio ad that the Thune campaign broadcast this past summer that attempted to lump together Washington, Massachusetts, gay marriage, and Daschle in one unsavory stew. "The institution of marriage is under fire from extremist groups in Washington, politicians, even judges who have made it clear that they are willing to run over any state law defining marriage," Thune intoned. "They have done it in Massachusetts, and they can do it here."

This is just ugly, nasty stuff. The intellectual dishonesty of it all is matched only by its sheer brazenness. By appealing to voters' fears and by demonizing anyone who would get in his way, Thune, unfortunately, demonstrated that he is well-qualified to join the Republican majority.

Sources: the Washington Post, September 20, 2004; the Rapid City Journal, South Dakota, September 22, 2004; Salon, September 30, 2004; the Advocate, July 16, 2004.

Mad as Hell at the FCC

If you've followed the media-consolidation story for lo these many years, you might discern a certain resemblance to the movie "Groundhog Day" -- with one notable difference. Bill Murray was destined merely to live out the same day over and over again. With media concentration, it gets a little worse each time, as more television channels, radio stations, and newspapers fall into the hands of ever-larger, ever-fewer corporate owners.

Thus it was this past Monday, when the Federal Communications Commission -- chaired by Michael Powell, son of the secretary of state -- voted by a three-to-two margin to loosen the few restraints that were still in place. Daily newspapers will now be able to buy television and radio stations in the same communities in which they publish, a heretofore illegal arrangement known as "cross-ownership." Conglomerates will be allowed to own television stations reaching 45 percent of the national audience, up from 35 percent. A company will be allowed to own two -- and in some larger cities three -- TV stations in the same market.

"It violates every tenet of a free democratic society to let a handful of powerful companies control our media," said FCC commissioner Jonathan Adelstein in a blistering dissent. "The public has a right to be informed by a diversity of viewpoints so they can make up their own minds. Without a diverse, independent media, citizen access to information crumbles, along with political and social participation. For the sake of democracy, we should encourage the widest possible dissemination of free expression through the public airwaves."

And so it goes.

But wait. This time it might be different. This time there are signs that the public, as well as opinion leaders of various and diverse ideological stripes, are finally so outraged by this ongoing power grab that they will demand action.

The public snoozed during the 1980s, when the Reagan White House eased the public-interest and equal-time provisions to the point of irrelevance. It looked the other way when the Telecommunications Act of 1996 set off a gold rush, especially in the radio sector, which was taken over almost in its entirety by a tiny handful of owners. And that somnolence was encouraged by the news media, which, whether by design or indifference, served their corporate masters by failing to cover what was happening as anything other than a routine business story.

By contrast, the run-up to Monday's vote was distinctly unquiet. Opposition to the FCC's latest deregulatory moves came from an unusually broad cross section, from liberal and reformist groups such as Common Cause, the Center for Digital Democracy, and the Consumer Federation of America to conservative organizations such as the National Rifle Association and the Parents Television Council.

And it wasn't just special-interest groups that got in on the action. More than a half-million people reportedly submitted comments to the FCC, nearly all of them opposed to deregulation. MoveOn.org, a progressive organization founded to fight Bill Clinton's impeachment, forwarded some 180,000 electronic comments to the FCC. The group, which emerged earlier this year as a leader in the anti-war movement, also took out television commercials that raised the specter of international media baron (and Fox News Channel founder) Rupert Murdoch's extending his global reach even further.

"This really is just the beginning," Eli Pariser, the Maine native who is international campaigns director of MoveOn.org, told me by e-mail. "When we decided to engage on this issue, we knew that Commissioner Powell was probably committed to the approach of railroading the rule change through. We wanted to highlight his contempt for the democratic process and raise the noise level to the point where Congress paid attention."

Jeff Chester, executive director of the Center for Digital Democracy, believes progressives were finally mobilized by their disgust at the rah-rah, unquestioning tone of much of the war coverage. "The most important thing post-June 2 is to take the momentum and broaden this from the coalition that we've been able to create and go after some very serious victories on legislative action," says Chester. "You have the potential for a left-right coalition here to go back to Congress and try to be serious." As an example of legislation that could ameliorate the effects of the FCC's ruling, Chester says media reformers might push for a rule requiring companies that own a newspaper and a television station in the same market to employ separate editorial managers.

Besides that, Chester adds, he and other activists intend to protest every single merger. "We will file petitions to delay. We will slow down the process," he vows.

In a country whose Constitution guarantees freedom of the press, the very idea of media regulation has an un-American ring to it. And in fact, strictly defined, "the press" -- that is, newspapers, magazines, and other print outlets -- is essentially unregulated. By contrast, the rules that govern television and radio stations are grounded in the laws of physics: there are only so many broadcast frequencies available, and they must be divvied up in such a way that one station isn't trampling on space reserved for another.

If you want to start a daily newspaper, you can. It will cost many millions of dollars and you will probably fail, but no one will tell you that you aren't allowed to try. But if you want to own a TV or radio station, you'll need to buy one: nearly all the available frequencies are already being used by other broadcasters. It is this situation that gave rise to the current regulatory regime, which dates back to the 1930s. The airwaves, according to this doctrine, are finite and publicly owned. Licenses to use these airwaves are granted for a limited period of time, and must be exercised in the public interest.

That was the theory, anyway. Starting in the 1980s, though, the system began to crumble. During the Reagan presidency, public-interest programming started to disappear, as regulators made it clear that they no longer considered it a condition for license renewal. The Fairness Doctrine and equal-time provisions gave way to today's reality, in which nationally syndicated talk-show hosts such as Rush Limbaugh and Sean Hannity can openly call for listeners to vote Republican, with virtually no one on the air to take the other side.

Commercial radio was so thoroughly destroyed by the Telecom Act of 1996 that FCC member Michael Copps -- a Democrat who has emerged as Michael Powell's most outspoken opponent -- cited it as an object lesson in his 22-plus-page dissent to Monday's ruling. "Diversity of programming suffered. Homogenized music and standardized programming crowded out local and regional talent. Creative local artists found it evermore difficult to obtain play time on the air," Copps wrote, adding: "Competition in many towns became non-existent as a few companies -- in some cases a single company -- bought up virtually every station in the market. This experience should terrify us as we consider visiting upon television and newspapers what we have inflicted upon radio. 'Clear Channelization' of the rest of the American media will harm our country."

That last bit is a reference to Clear Channel, a conglomerate that has expanded its empire to some 1200 radio stations since 1996. Companies today can own as many as eight stations in a given market; Monday's ruling did not change that. Perhaps the most infamous case of radio gone bad is the town of Minot, North Dakota, where Clear Channel owns all six commercial stations. According to a New York Times report, when a 1 a.m. train derailment caused toxic gas to leak into the air in January 2002, police attempted to alert the Clear Channel station that was the town's designated emergency broadcaster -- and couldn't get their calls answered, forcing them to rouse station employees at home. Police complained that because the station was programmed by remote control, no employees were actually at the station, a charge that Clear Channel denied. Still, the story has become a cautionary tale regarding the evils of deregulation, and has transformed Senator Byron Dorgan (D-North Dakota) into an anti-monopoly crusader.

Incredibly, Clear Channel criticized Monday's ruling because the FCC announced that it intended to take some mild oversight steps with regard to radio. Then, too, Clear Channel went to great lengths to make sure its stations were on board with the war in Iraq, even going so far as to organize pro-war rallies. Perhaps the company's executives believe that the Bush administration is showing insufficient gratitude.

It's not just news that gets short shrift -- culture suffers as well. Donna Halper, a radio consultant and Emerson College journalism professor, saw what happened to radio firsthand. In 1974, she was music director at WMMS, in Cleveland, when she received a record in a plain brown envelope. It was by an unknown band called Rush. She put the record on the air -- and thus gave a huge boost to a band that has proved to be both highly popular and long-lived.

"If I were a music director doing my job today, I couldn't do what I did in 1974," Halper told me. "I couldn't run down to the disc jockey on the air and say, 'Hey, I just heard this great new band, we've got to put them on the air.'" Increasingly, she observes, radio stations in small and medium-size markets -- the kinds of places where music promoters used to be able to break new acts -- are programmed out of headquarters many hundreds of miles away.

This kind of gigantism can take a more malignant turn as well. Last week, ABC's Nightline -- in a segment for which Halper was interviewed -- reported that Cumulus, the second-largest radio chain (after Clear Channel), banned the Dixie Chicks from all its 42 country stations after lead singer Natalie Maines dissed George W. Bush. Now, you could argue that that was Cumulus's right, or that it might even have been a good business decision. But one company taking a group off 42 stations is considerably different from 42 separate radio-station owners making the same decision.

After all, it is nearly unimaginable that 42 different individuals would all make the same decision. Which is precisely the point about what's wrong with media consolidation.

As Michael Powell accurately notes, consolidation is not taking place in a vacuum. For one thing, in recent years the increasingly conservative federal court system has been knocking down various media regulations, meaning that some loosening of the rules was inevitable. For another, there are many more media outlets than ever before -- hundreds of thousands of Web sites, satellite television, and a cable universe that won't stop expanding until it's up against the current technological limit of 500 channels.

The problem with Powell's first argument, though, is that he appears far too eager to do the media conglomerates' bidding rather than test the limits of the courts' forbearance. Powell has encouraged a shockingly cozy relationship with the industry. According to a recent study by the Center for Public Integrity, "FCC officials have taken 2,500 trips costing nearly $2.8 million over the past eight years, most of it from the telecommunications and broadcast industries the agency regulates." Top destinations: Las Vegas, New Orleans, and New York.

The problem with the second argument is that, though there are more broadcast and cable outlets than ever before, the majority of them are owned by five huge companies: AOL Time Warner, Viacom/CBS, General Electric/NBC, Disney/ABC, and Murdoch's News Corporation. Studies show, in fact, that those companies control some 75 percent of the prime-time audience.

To be sure, the notion that big is bad is too simple to be correct in all cases. Local media activist Steve Provizer, the founder of Allston-Brighton Free Radio and the Citizens Media Corps, notes that a study earlier this year by the Project for Excellence in Journalism found that TV stations owned by newspaper companies (under "grandfathering" arrangements) tend to produce higher-quality newscasts than other stations. He also observes that WHDH-TV (Channel 7), an independently owned operation (there is a sister station in Miami), introduced the sort of whiz-bang graphics and quick hits that dragged down local news throughout Greater Boston, with the chain-owned stations emulating at least some of Channel 7's style in a desperate bid to retain audience share.

"You can throw around the big terms: democracy, civic discourse. But I'm at the point where I've been doing this a long time, and people's real choices have got to change," Provizer says. "Will people choose their media on the basis of the fact that it's an independent rather than a group owner? They never have." He adds: "In my own mind, I feel like community and alternative media have got to do some real significant work if they're going to have any impact on people's media habits."

These days, Provizer is involved in a project called the Commonwealth Broadband Collaborative, which is aimed at producing high-quality, community-based programming for public-access cable TV and the Internet.

Provizer is right: independent owners aren't always better than corporate chains, and community-based media do no good if people aren't paying attention. Still, there are limits to the amount of damage a bad independent owner can do. The FCC's stated goals of localism, diversity, and competition are far better served by a multiplicity of owners than they are by a handful of corporate titans controlling most of what we see, hear, and read.

So what's next? The new FCC rules won't actually take effect until fall at the earliest. There's been considerable speculation about what might happen in Boston, but the truth is that no one knows. The New York Times Company, which owns the Boston Globe and the Worcester Telegram & Gazette, also owns TV stations, and editorialized in favor of dropping the cross-ownership ban. Tribune Company, which owns WLVI-TV (Channel 56), has made it clear that it would like to gobble up TV stations in cities where it already does business. Boston Herald publisher Pat Purcell has publicly stated that he would like to buy a radio station. He might also find a way to do business with his old mentor, Rupert Murdoch, who owns WFXT-TV (Channel 25).

Then, too, there is a possibility -- slight though it may be -- that Congress will seek to prevent some or all of the FCC changes from taking effect. David Moulton, a spokesman for Representative Ed Markey (D-Malden), a key opponent of deregulation, says the congressman might file legislation to retain the 35 percent national-audience cap. Markey himself said in a statement that "all segments of the population will enjoy fewer and fewer options for independent news and information" as a result of the FCC decision.

Most of the Democratic presidential candidates, including Senator John Kerry, denounced the FCC decision. Senator Olympia Snowe (R-Maine), a national leader in the fight against deregulation, said in a written statement that Monday's action would "limit freedom of expression and curtail discourse, which are the very tenets of freedom and democracy our nation is built on." Senator Russ Feingold (D-Wisconsin), an outspoken critic of the way Clear Channel muscles musicians with its monopolies in radio and concert promotion, announced that he was considering various legislative options as well. And these critics have been joined by such conservative Republicans as Senators Trent Lott of Mississippi and Ted Stevens of Alaska.

Like Eli Pariser and Jeff Chester, University of Illinois media scholar Robert McChesney believes the coalition that formed around the FCC's latest deregulatory steps may represent a real opportunity for those who believe in a decentralized, democratic media. McChesney is involved in a new organization, Free Press (www.mediareform.net), based in Northampton, that is attempting to organize around issues related to media concentration and ownership.

"What we've seen this time is an extraordinary amount of activism and interest in this issue. We've had nothing like this in the last 50 or 100 years that's even remotely comparable," says McChesney. "This is not the last battle in a war that is now lost. This is the first battle in a war that is just beginning."

Dan Kennedy can be reached at dkennedy@phx.com. Read his daily Media Log at BostonPhoenix.com.

Monopoly Money

A.J. Liebling would not be pleased. Nearly 40 years ago, the legendary press critic lamented the rise of one-newspaper cities, a phenomenon considerably less common then than today. Where there is no competition, Liebling wrote, "news becomes increasingly nonessential to the newspaper. In the mind of the average publisher, it is a costly and uneconomic frill, like the free lunch that saloons used to furnish to induce customers to buy beer. If the quality of the free lunch fell off, the customers would go next door."

Since then, things have gotten only worse.

When the first edition of Ben Bagdikian's The Media Monopoly (Beacon Press) was published, in 1983, it identified some 50 corporations as controlling most of our newspapers, magazines, books, television networks, radio stations, and movie and music studios. Nearly 20 years later, in the current "Big Media" issue of the Nation, that list is down to 10 international conglomerates, their vast holdings detailed in a fold-out color chart.

But though media consolidation is hardly a new story, there is a disturbing sense that the pace of monopolization is accelerating, and that the end game, or something like it, is at hand. Particularly distressing is the rapid consolidation of the cable industry, which threatens to turn the wide-open, decentralized, but slow Internet of the 1990s into a corporate-owned, profit-oriented, high-speed network with no room for independent voices. The Net is the last, best hope for a truly democratic media. Yet if we don't act, it may soon be too late to save it.

The most significant recent development took place just a month ago, when AT&T Broadband, the country's largest cable-television provider, was acquired by Comcast, the number-three company. AT&T Comcast, as the new company will be known, will control some 22 million subscribers — more than a third of the nation's 60 million cable households. And if that weren't chilling enough, analysts are already predicting that the most humongous media conglomerate of them all, AOL Time Warner, whose 13 million cable subscribers make it the number-two company, will work out some sort of a partnership with AT&T Comcast.

The AT&T Broadband–Comcast deal did not take place in isolation. Earlier last year, the Federal Communications Commission (FCC), whose alleged job is to make sure that media giants do not trample upon the public interest, dumped a half-century-old rule that had prohibited one network from owning another. The result: Viacom, which owns CBS, was allowed to acquire UPN. That's why, in Boston, you can now watch Channel 4's news on Channel 38 (see "Big Media Stalk Hub," sidebar).

At about the same time that the cable giants were consolidating, the French media conglomerate Vivendi Universal announced that it would buy USA Networks for about $10.3 billion. Vivendi owns the Universal movie studios; USA's holdings include a television-production operation and the USA and Sci-Fi cable channels. Earlier in the year, Vivendi acquired Houghton Mifflin, the last of the big, independent, publicly traded book publishers — and the holder of the suddenly lucrative Lord of the Rings franchise.

Moreover, all of this is taking place at a time when a series of pro-industry court rulings and changes at the FCC threaten to sweep away what few restrictions remain in place following passage of the Telecommunications Act of 1996, which greatly relaxed ownership rules. The FCC appears poised to junk such old standbys as the prohibition against a newspaper's owning a television or radio station in the same market, as well as a passel of local and national restrictions on the number of radio stations, television channels, and cable systems any one company is allowed to own.

"The problem is that a lot of this stuff is happening behind the scenes," says Danny Schechter, executive editor of MediaChannel.org, a media-watchdog Web site with an international and progressive orientation. "The FCC may make any concerns about this completely irrelevant when it chooses to lift all remaining regulations, which is certainly possible. I think there really is kind of a tipping point. It's hard to get it back to the way it was, not that the way it was was so great. But what you did have was more of an ethos, at least a lip-service ethos, to public service. And now even that has gone out the window."

At the center of all this is President Bush's handpicked FCC chairman, Michael Powell, who, like his father, Secretary of State Colin Powell, is bright, smooth, and articulate -- but who, unlike his father, espouses the kind of doctrinaire free-market conservatism that Bush favors in his domestic-policy appointees.

Michael Powell has a penchant for saying provocative things, and sometimes the nuances get lost. For instance, when he was asked last year about the "digital divide" -- the technology gap that exists between rich and poor -- Powell memorably replied, "I think there's a Mercedes divide. I'd like one, but I can't afford it." The Washington Post later demonstrated that Powell's remarks immediately before and after showed considerably more thoughtfulness than the dismissive sound bite suggested.

Yet there's little question that when it comes to deregulation, Powell intends to outdo even his deregulation-minded, Clinton-appointed predecessors, Reed Hundt and William Kennard. In a little-noticed interview with the Wall Street Journal published last September 10, Powell spoke disdainfully about "what I call the 'Big Fish Problem,' which is this inherent anxiety about bigness in a capitalist economy." He also made it clear that his view of the public interest was not necessarily the same as that of those whose business it is to act as the public's eyes and ears.

"Every decision I make, I will argue to the last day I am here, I am taking in the name of the public -- not in the name of some company and not in the name of some consumer-interest group," Powell said.

Says Andrew Jay Schwartzman, president and CEO of one of those consumer-interest groups, the Washington-based Media Access Project: "He's very bright, very, very shrewd. And although it's a very appealing package, he is in fact a good deal more conservative than his father, and he's hell-bent on lifting ownership rules. I'm always the optimist, and we won't stop working on him. But he's intent on where he's going, he's come in with preordained objectives, and he's pushing very hard to obtain them."


TO BE SURE, not all media bigness is necessarily bad, and even when it is, not all of it can be regulated or outlawed. The most significant obstacle: the US Constitution. After all, the First Amendment says, "Congress shall make no law ... abridging the freedom of speech, or of the press." As my Phoenix colleague Harvey Silverglate, a noted civil-liberties lawyer, likes to say, "What part of 'no law' don't you understand?"

That doesn't mean media companies can engage in illegal predatory practices aimed at putting their competitors out of business. But it does mean that government can't break up media conglomerates based merely on a sense that such conglomerates are somehow not in the public interest.

Besides, there is at least an argument to be made that only big media have the power and influence to cover the large institutions that dominate modern life. In January 2000, Jack Shafer wrote a piece for the online magazine Slate (owned by the extremely big Microsoft Corporation and thus part of a media alliance that includes NBC, MSNBC, General Electric, the Washington Post, and Newsweek) arguing exactly that.

"Small, independently owned papers routinely pull punches when covering local car dealers, real estate, and industry," Shafer wrote, asserting a nasty little truth known by every reporter and editor who has ever worked for a locally owned community newspaper. "Whatever its shortcomings -- and they are many -- only big media possesses the means to consistently hold big business and big government accountable."

And though Shafer doesn't say it, the whole notion of government officials' regulating the size and scope of media companies sounds suspiciously like what's going on in Russia, where the government of President Vladimir Putin has shut down nearly all of that country's big independent media -- in the public interest, of course. To quote Liebling again: "Men of politics cannot be trusted to regulate the press, because the press deals with politics. Pravda is even duller than the Times."

Moreover, despite the dominance of just a handful of huge conglomerates, it's hard to argue that we have fewer choices today than we did, say, a generation ago. US Representative Edward Markey, a Malden Democrat poised to take over the chairmanship of the Subcommittee on Telecommunications and the Internet if his party can recapture the House this fall, is worried about media concentration — and says he plans to order a "top-to-bottom review of the ownership rules aimed at restoring diversity and localism as cornerstones of telecommunications policy." Yet Markey is quick to add that, in some respects, consumers have never had more options than they do today.

In the 1970s, Markey recalls, there were just three major commercial television stations in Greater Boston. Now there are five stations with daily newscasts, New England Cable News, dozens of channels on cable, and the Internet. "I don't think there's any question that people are better off today than they were then in terms of total diversity," Markey says. And, because of the increasing ubiquity, speed, and capacity of the Net, Markey sees the situation only getting better -- if, he adds by way of warning, the Internet remains as free and open as it is today.

That brings me back to cable television, which may, in turn, pose the most important media-regulation question of all.

The entire rationale for media regulation is the notion of scarcity. The reason that the government may regulate the number of radio or TV stations a company owns is that those stations make use of the airwaves -- a finite, public resource. The Internet, at least theoretically, is infinite. Seen in that light, there's no more rationale for regulating the Internet than there would be for regulating the number of newspapers Gannett can own on the basis that its papers are made of ground-up trees, which are, after all, a finite, public resource. And since just about all media -- radio, TV, newspapers, what-have-you -- will one day be delivered over the Internet or something like it, government regulation will, of constitutional necessity, go the way of the dinosaurs.

Except it's not that simple.

Last summer, a small advertising firm in Wakefield called Prime Communications filed a $20 million lawsuit against AT&T Broadband. According to accounts in both the Boston Globe and the Boston Herald, Prime accused AT&T of refusing to sell it advertising time after Prime turned down AT&T's offer to buy an Internet-based business it had developed. Prime president Neil Bocian told the Herald, "I have to have access to all the media. Now I can't buy cable, and I don't have an alternative because they own all the cable systems."

AT&T, of course, denied Bocian's charges, and it remains to be seen how this will play out. But it's a perfect illustration of a much larger problem: cable companies typically control both programming (or some of it, anyway) and the pipeline over which that programming travels. Cable companies such as AT&T claim a First Amendment right to run their businesses as they see fit. The problem is that one aspect of their business -- the pipelines -- is a monopoly, usually granted by local elected officials. That gives them enormous leverage over what content will be allowed to travel through those pipelines. It's as if state highway officials let you drive on the Mass Pike only in cars you rented from them. Neil Bocian may be right or he may be wrong, but this much is certain: he can't take his business to a competing cable company, because there isn't one. And with cable companies emerging as the preferred providers of high-speed Internet access, corporate control of the pipeline is becoming a threat to the open flow of information itself.

As Stanford Law School professor Lawrence Lessig argues in Code and Other Laws of Cyberspace (Basic Books, 2000) and his new The Future of Ideas (Random House), the reason that anyone can be a content-provider on the Internet is that the Net was specifically designed to be wide-open, democratic, and neutral. The flip side, Lessig warns, is that it could just as easily have been designed another way -- and big media, having missed out on the first wave of the Net, could take advantage of the dot-com meltdown and the rise of broadband to rewrite the rules to their advantage this time around. In an interview with Newsweek's Steven Levy this week, Lessig said that "every major change that's going on right now around the Internet is a change to undermine that neutrality, so those who control the legal system or control the physical network are able to veto innovations they don't like. So you get the right to innovate depending on whether AOL or AT&T or the music industry likes your innovation."

Without government regulation, in other words, there's nothing to stop the cable companies from excluding Internet content just as surely as AT&T Broadband may be excluding Prime Communications. This private Internet could be engineered in such a way that only content approved by the cable company can be accessed. Or only content for which the cable company is receiving money can be easily found. Or certain types of content that the cable company doesn't want to compete with, such as streaming video from independent media, can't be transmitted at all.
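
To see how little engineering Lessig's nightmare would require, consider a toy sketch in Python -- the site names, content categories, and policy lists here are entirely hypothetical, not any actual cable company's system -- of a gateway that enforces the three tiers just described:

    # Hypothetical gateway policy -- an illustration of the idea, nothing more.
    APPROVED = {"bigmedia.example.com", "partner-shop.example.com"}
    PAID_PLACEMENT = {"partner-shop.example.com"}    # pays the cable company to be findable
    BLOCKED_TYPES = {"independent-streaming-video"}  # competes with the owner's programming

    def route(host: str, content_type: str) -> str:
        """Decide at the gateway how a subscriber's request is handled."""
        if content_type in BLOCKED_TYPES:
            return "not transmitted at all"
        if host in PAID_PLACEMENT:
            return "delivered and easily found"
        if host in APPROVED:
            return "delivered"
        return "reachable in theory, buried in practice"

    for host, ctype in [("bigmedia.example.com", "web-page"),
                        ("indy-news.example.org", "web-page"),
                        ("indy-news.example.org", "independent-streaming-video")]:
        print(f"{host} ({ctype}): {route(host, ctype)}")

The point, per Lessig, is that none of this takes new wires or towers; it takes only a change in the code the network owner runs at the gateway.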

It's not that the old, wide-open Internet will go away, says Jeff Chester, executive director of the Washington-based Center for Digital Democracy. It's that the high-speed Internet is going to become Fun City, and few people will bother with the traditional Net, where nonprofit and independent voices will cry out to be heard. It's at least theoretically possible that the full range of content will remain available only to those who keep a slow dial-up connection -- something most people just aren't going to do.

"While the Internet posed a truly competitive threat in the early 1990s of a much more open and democratic communications system, that promise is now truly threatened," Chester says. "It is not visible, it is not apparent, it is an iceberg sitting in the water. It's not like somebody's going to take away your Internet, but the fact is that the Internet is going to change in subtle ways. Clearly the network owners are going to have the ability to banish certain Web sites if they wish."

Chester fears that when Big Media perfect the high-speed, privatized Internet, with full video, music delivery, personalization, and other features, "people are going to love this stuff. That's the other problem." Independent voices, he says, "will just fade into the digital twilight."

CALL IT THE GREATEST story never told. According to a report by the Center for Public Integrity, which keeps an eye on the unappetizing stew of politics and money, media corporations and their employees contributed $75 million to candidates for federal office and the two major political parties between 1993 and mid-2000. From 1996 to 2000, the report continues, the 50 largest media companies and four of their trade associations lobbied Congress and the executive branch to the tune of $111.3 million.

Among the goodies these media moguls sought were more-corporate-friendly copyright laws, the elimination of the estate tax, fewer restrictions on tobacco and alcohol advertising, a halt to proposals that would mandate free air time for political candidates, and, most important, the elimination of FCC rules aimed at restricting ownership.

Usually the media love such a story of greed and influence -- especially when it's spoon-fed to them in the form of a respected interest group's report, complete with a predigested three-page summary. But chances are you didn't read, hear, or see anything about this one, titled, fittingly, Off the Record. "This is major news about the influence of an extremely powerful industry and its relationship to government and its favors from government," says Charles Lewis, executive director of the center. "And it was basically nonexistent in terms of news coverage. I don't think that's completely coincidental. Of course, what makes the media industry so powerful is not just the amount they spend, but the fact that they control access to the airwaves and newspaper pages."

It was the power of the media lobby -- especially in the form of the National Association of Broadcasters -- that croaked a plan by the previous FCC chairman, William Kennard, to license low-wattage, nonprofit, community-oriented radio stations whose reach is measured in city blocks rather than square miles. The NAB -- joined, believe it or not, by National Public Radio -- argued, against compelling technical evidence, that these small stations would interfere with its members' own signals, even if care were taken to locate the low-powered stations on unused portions of the FM dial. Kennard's vision, limited though it was, got nixed by Congress in the closing days of the Clinton administration, with no prospects of revival any time soon.

"It's been shut down completely in any urban area," says Steve Provizer, who heads a tiny, grassroots outfit called Allston-Brighton Free Radio, which transmits a barely detectable signal at AM 1670. (Some of its programming is rebroadcast on WJIB, AM 740.) "It's really a service that will only be useful in rural areas or exurban areas. God bless it for having that much usefulness, but it's largely been undermined by congressional action as instigated by NPR and the NAB." Previously, Provizer ran Radio Free Allston, shut down by the FCC several years ago for broadcasting without a license. Illegal? Well, yes. But also vital -- so much so that the station had received a commendation from the Boston City Council for broadcasting local political debates and otherwise serving the community in ways that bottom-line-obsessed commercial stations just don't care about.

The media lobby's next target: ownership rules that prevent a company from owning more than eight radio stations in a given market, that prohibit one company from owning a cable system and a TV station in the same market, and that prevent one company from owning a TV or radio station and a major daily newspaper in the same market.

That last regulation -- known as the cross-ownership rule -- had a major role in shaping the Boston media landscape. The Boston Herald Traveler, a predecessor to today's Herald, survived for years on the strength of its ownership of a radio station and a TV station through a waiver it had dubiously obtained from the FCC. The Globe fought back -- and in the early '70s, the Herald Traveler lost its broadcast properties. The paper fell into the hands of the Hearst Corporation, and it appeared to be dying a slow, lingering death until international media magnate Rupert Murdoch acquired it in the early 1980s. (So close did the Herald come to shutting down that work crews started ripping vending machines out of the cafeteria.) Murdoch himself ran afoul of the cross-ownership rule when he bought WFXT-TV (Channel 25) in the late '80s, and Senator Ted Kennedy, a frequent target of the Herald, blocked Murdoch's attempts to obtain an FCC waiver. Murdoch sold Channel 25 only to repurchase it after selling the Herald in 1994 to his longtime protégé Pat Purcell.

To bring the story full circle, Purcell -- who a year ago bought about 100 community papers in Greater Boston and on Cape Cod -- would now like nothing better than to go into the broadcasting business in order to compete more aggressively with the Globe, whose corporate owner, the New York Times Company, also owns the Worcester Telegram & Gazette and, unless the sale is derailed, will soon own a chunk of the New England Sports Network and the Boston Red Sox as well.

"If the rule didn't exist anymore, who knows what would happen?" asks Purcell. "It's a little early to speculate, but a whole lot of options would open up for us." Clearly, it's a subject close to his heart. His newspaper has come out against the cross-ownership rule on both its editorial page and in its business columns. And Herald reporters are already featured on Channel 25 — just as Globe reporters are featured on New England Cable News and on Channel 4's The Boston Globe/WBZ News Conference.

The cross-ownership rule, in fact, may need some rethinking. Allowing a media executive such as Purcell, who's rooted in the community, to extend his franchise and spread out his costs could benefit not just him but also those who like the Herald's brand of journalism. But simply repealing the rule could be dangerous. Who, after all, would be better positioned to buy a TV or radio station (or both) than the mighty Times Company, thus giving the Globe even more of an advantage in a market that it already dominates?

The Center for Digital Democracy's Jeff Chester says he would have no problem with allowing, say, the number-two newspaper in a market to acquire the number-three or -four TV station. But he adds that Boston -- one of the few competitive newspaper towns left in the country -- "is a unique case." Preventing one media company from amassing too much power in a given community, Chester says, is still a worthwhile goal.

BACK WHEN A.J. LIEBLING was writing about the death of newspapers, he was mainly concerned about the cuts in news coverage that monopoly publishers inevitably ordered. "Money is not made by competition among newspapers, but by avoiding it," he wrote. That's still true today, even when competition at least theoretically exists. Witness the foreign bureaus that were closed and the reporting positions that were eliminated during the 1990s as the Big Three networks fell into the hands of conglomerate owners -- cuts that made it difficult (although not, thankfully, impossible) to cover the war against terrorism following the September 11 attacks.

Just as important as competition or the lack thereof is the dominance of corporate over community values.

Huge radio companies compete fiercely, but they do so by offering lowest-common-denominator syndicated programming in city after city, such as Howard Stern and Opie and Anthony, and right-wing talk shows, such as Rush Limbaugh's. The crude-but-intelligent Imus in the Morning is a notable exception, but even Imus is syndicated from city to city -- a far cry from the localism that once made radio a unique medium.

A conglomerate such as AOL Time Warner produces the movie Harry Potter and the Sorcerer's Stone, and then promotes it in its magazines (Time, People, Sports Illustrated), on CNN, and on the AOL Internet service.

NBC News and ABC News have to think two or three times before running any negative reports on their corporate owners, General Electric and Disney, respectively.

Newsweek is owned by the Washington Post Company, which has a content-sharing relationship with MSNBC and MSNBC.com. That will prevent Newsweek from ever again getting beaten on its own exclusive, as it was with Michael Isikoff's revelation that Bill Clinton had had sex with that woman, Monica Lewinsky. But there are weeks when the magazine looks like nothing so much as a print version of MSNBC, flogging the MSNBC.com Web site on every page and, recently, publishing a piece of media criticism by that noted journalistic thinker Chris Matthews. (His verdict: the media are doing a pretty damned good job, thank you very much!)

"Who owns these companies makes all the difference," says Tom Rosenstiel, director of the Project for Excellence in Journalism. "Ownership matters profoundly. It's not just the system of ownership, it's the human values of the people who do the owning." He adds, though, that he is concerned whenever news organizations are acquired by conglomerates whose primary businesses are not news. He notes, for example, that ABC News represents just two percent of profits at Disney.

"There's a lot of reason to worry about the fact that journalism is being subsumed as a minority presence inside conglomerates," Rosenstiel says. "One dark cloud of conglomeration is if you have owners who don't care about journalism. The second dark cloud is if they see their properties as an opportunity for synergy."

Michael Powell told the Wall Street Journal last September, "I think I'm a little misunderstood on the whole area of media consolidation." He added: "The public interest is not always served by strict liability and slavish commitment to a linear judgment made 30 years ago."

Big isn't always bad, and, in some respects, it makes as much sense to rail against media conglomerates as it does to boycott Starbucks, where the coffee is better than it was at the mom-and-pop shop it replaced and where the employee benefits include health insurance and stock options. Nostalgia based on blind allegiance to the past is just stupid.

But Powell needs to understand that the public interest doesn't consist merely of getting the coolest technological advances into the public's hands as quickly as possible. A diversity of voices and a place for independent media are just as much a part of the public interest.

There's a reason that the First Amendment protects the media from government regulation: the framers believed that free and independent media were absolutely essential for the same public interest that Powell claims is his primary guide.

The danger is that Powell will release the media from the last vestiges of government regulation — and then stand back and watch as the media's corporate masters use their power and influence to silence any voices that threaten their economic interests.

Dan Kennedy is a media critic for the Boston Phoenix.

Beef Stew

Last fall, The Wall Street Journal published a horrifying article on its front page. Under the headline THE U.S. MAY FACE MAD-COW EXPOSURE DESPITE ASSURANCES FROM GOVERNMENT, staff writer Steve Stecklow reported that the domestic cattle herd is far from safe, and that the government is doing little to test either cattle or people for signs of illness.

Yet despite Stecklow's meticulously detailed findings and the story's prominent placement in one of our most respected newspapers, it pretty much disappeared without a trace. To the extent that any attention has been paid to mad-cow disease during the past month, it was to plug a reassuring report by the Center for Risk Analysis, at the Harvard School of Public Health, that there is vanishingly little likelihood here of a British-style outbreak of mad cow.

What a difference a year makes. In late 2000 and early 2001, network television newscasts and national newsmagazines were filled with terrifying stories about what had happened in Europe, especially in Britain. Cattle in increasing numbers were coming down with bovine spongiform encephalopathy (BSE), a little-understood disease that kills by punching the brain full of tiny holes.

Worse -- much worse -- was the likelihood that a similar fatal brain disease affecting humans was spreading through the consumption of contaminated beef. The illness was called a "new variant" of a rare condition named Creutzfeldt-Jakob disease, and thus became known, for short, as nvCJD. More than 100 people, nearly all of them from Britain, have died of nvCJD over the past five years.

For the US media, the story was made to order, featuring as it did video of wild-eyed, staggering cows, heaps of burning animal carcasses, distraught farmers, and — in a few cases — footage of twentysomething nvCJD victims trembling through the final stages of their awful disease. "I hate to be blunt, but there was a strong visual to go with it," says Boston University communications professor Tobe Berkovitz. "A typical science story doesn't get much play, because you need a visual to be aired ad infinitum or ad nauseam to make it a television news story. And video of shaking, crumbling cows gives you a visual."

There was, though, a problem with sustaining interest in mad-cow disease. First, there was the inconvenient fact that not a single case of BSE or nvCJD had ever been found in the US. Second, federal officials assured the public that steps taken several years earlier -- banning the importation of beef from Britain, and outlawing the use of beef byproducts in animal feed, thought to be the principal means by which BSE is spread -- would prevent an outbreak from ever occurring here. By spring, few mad-cow stories were making their way onto the front pages or the network newscasts, as the media turned their attention to more characteristic obsessions. No, it hasn't disappeared completely -- witness a recent episode of The West Wing in which the Bartlet administration debates how best to spin an outbreak of mad cow. But in terms of public consciousness, this is one potential crisis that has faded far into the background.

"There was a period when mad-cow disease was a very telegenic story in an ugly and disturbing sort of way," says Robert Thompson, director of Syracuse University's Center for the Study of Popular Television. But then, he notes, "the summer of Gary occurred, and all of a sudden we had all of that time being spent in the cable and broadcast media on Gary Condit and Chandra Levy." Finally, Thompson observes, "what Gary Condit did to mad cow and some other stories, September 11 did to Gary Condit."

But if mad-cow disease is, understandably, not as pressing an issue as the hunt for Osama bin Laden, it remains, as the Wall Street Journal article suggests, an important, ongoing story. If mad cow -- and, more crucially, nvCJD -- breaks out into the US population at some point during the next several years, the media's chronically short attention span in covering this complicated scientific and medical story will surely stand out as one of their principal failures of 2001.

MAD-COW DISEASE, "classic" (that is, non-nv) CJD, and a similar illness in sheep called scrapie are all known as transmissible spongiform encephalopathies, or TSEs. All of them occur naturally, and scientists believe that mammals, humans included, contract TSEs at the rate of one per million in population. Although the exact cause of TSEs is poorly understood, it is thought by many scientists to be related to the presence of "prions" -- proteins that somehow take on a different and deadly shape, and that force other proteins to follow their lead. This process has been compared to "ice-nine," the substance in Kurt Vonnegut's 1963 novel Cat's Cradle that destroys the earth by changing all the water so that it turns solid at room temperature.

Among the best and most thorough treatments of mad-cow disease was an article written for the Atlantic Monthly in 1998 by science journalist Ellen Ruppel Shell. She argues that if BSE arises in cattle naturally at the rate of one in a million, then there would be about 100 cattle with BSE among the nation's 100 million head at any given time. And if any of those cattle somehow entered the food chain -- say, in high-protein animal feed that is later fed back to cattle -- then BSE could spread far beyond those 100 head. Humans are exposed by eating contaminated beef -- a danger heightened by such practices as slaughtering cattle with pressure guns, which blast highly infectious brain and spinal tissue into the edible parts of the animal carcass.

Shell pays particular attention to animal-rendering plants, "a series of altogether unsavory places where dead cats and dogs, road kill, the occasional circus animal, and the diseased carcasses of farm animals are mixed into a ghastly, belching stew." Yum. Among the products made by these plants is the aforementioned high-protein animal feed, which turns cows into cannibals by feeding them byproducts of other cows -- including, potentially, cows with BSE. Complicating this considerably is the fact that other animals dumped into the stew may also have TSEs -- especially road kill such as elk and deer, which, in the Western United States, are experiencing an epidemic of a TSE known as chronic wasting disease. Finally, to save on energy costs, rendering plants in recent decades have perfected a system of low-temperature cooking. The problem is that sustained exposure to high temperatures is absolutely essential for destroying the infectious agents behind TSEs.

But if feeding cows to cows is now illegal, well, why should we worry? As it turns out, it's not nearly that simple. It is still perfectly legal to feed cow byproducts to pigs and chickens, which are not thought to harbor TSEs. And it's perfectly legal to toss those same pigs and chickens into the rendering vats to manufacture feed that can then be fed back to cows. Also, as the Journal article reports, 13 percent of rendering plants do not comply with the new regulations against putting beef byproducts into feed intended for cows -- and the federal government itself admits that "scores of shipments of animal byproducts for use in animal feed came into the U.S. in recent years from countries that now have mad-cow disease in their cattle herds, a potentially serious source of contamination."

There is still, though, the simple fact that no cases of BSE or nvCJD have ever been diagnosed in the US. Right? Well, maybe. Some mad-cow specialists say the problem is that the United States has not been inspecting cattle in anywhere near the numbers or with the rigor that British and European authorities now do, meaning that cases of BSE could be slipping by. As for the lack of any human cases, there is at least some reason to believe that there are, in fact, tens of thousands of cases -- many of them sitting right in front of us when we visit the nursing home.

In the 1990s, researchers at Yale University and the University of Pittsburgh studied autopsy results of people who had died of Alzheimer's disease. Although their sample sizes were small, the results were chilling: somewhere between eight and 13 percent were found to have actually had CJD rather than Alzheimer's. That's far more than the 250 or so cases of "classic," or naturally occurring, CJD that would be statistically expected -- the natural one-per-million rate applied to the US population -- meaning that a more likely explanation would be the consumption of contaminated meat.

That would also fit with the decades-long incubation period for CJD and nvCJD. According to the Web site of the Alzheimer's Association, approximately four million Americans have Alzheimer's -- about 10 percent of those who are 65 or older, and nearly half of those who are 85 or older. As Sheldon Rampton and John Stauber wrote in their book Mad Cow U.S.A.: Could the Nightmare Happen Here? (Common Courage, 1997), "If the true number of CJD cases in the United States turns out to be 40,000 instead of 250, the implications for human health would be severe. It could mean that a deadly infectious dementia akin to Britain's problem has already entered the U.S. population. And since CJD has an invisible latency period of up to 40 years in humans, 40,000 cases could be just the beginning of something much larger." (A downloadable version of Mad Cow U.S.A. is available for free on the Web site of the Center for Media and Democracy.)
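
A back-of-the-envelope calculation makes plain why those numbers alarm researchers. The sketch below is an illustration only, using the figures cited in this article plus a rough 2001 US population figure, which is my assumption, not the researchers':

    # Back-of-the-envelope arithmetic using the figures cited above.
    US_POPULATION = 280_000_000    # rough 2001 figure (an assumption)
    NATURAL_RATE = 1 / 1_000_000   # one TSE case per million people, per the scientists cited
    ALZHEIMERS_CASES = 4_000_000   # Alzheimer's Association estimate cited above

    baseline = US_POPULATION * NATURAL_RATE
    print(f"Expected 'classic' CJD cases nationwide: about {baseline:.0f}")

    # The Yale and Pittsburgh autopsies found 8 to 13 percent of presumed
    # Alzheimer's deaths were actually CJD. Extrapolating those small samples
    # to four million patients is statistically shaky, but it shows the scale
    # of what may be hidden:
    low, high = 0.08 * ALZHEIMERS_CASES, 0.13 * ALZHEIMERS_CASES
    print(f"Implied hidden CJD cases: {low:,.0f} to {high:,.0f}")

    # Even Rampton and Stauber's far more conservative hypothetical of 40,000
    # cases would exceed the natural baseline more than a hundredfold:
    print(f"40,000 cases vs. baseline: roughly {40_000 / baseline:.0f} times as many")

Either way, the gap between a few hundred expected cases and even the most conservative hidden-case estimate is what makes the misdiagnosis studies so unsettling.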

Ronnie Cummins, national director of the Organic Consumers Association, whose Web site contains an extensive archive of mad-cow information, says that at a minimum the federal government should launch a program of quick, inexpensive tests of both cattle and Alzheimer's-disease patients to determine whether we may have a hidden mad-cow crisis that warrants further study and action.

"It's not a question of 'Do we have mad cow in this country?' Of course we do," says Cummins. "Every livestock-grazing country in the world has always had it at low levels. The question is 'How much do we have, and how quickly is it magnifying?'"

The conventional wisdom is that we have little to fear from mad-cow disease, and of course the conventional wisdom may be right. Lloyd deMause, editor of the Journal of Psychohistory, says it's not unusual for societies to develop a cultural fear of poisoning at the end of a long period of prosperity -- it's a natural reaction to feelings of guilt over having experienced such good fortune. Seen in this light, last year's media obsession with mad-cow disease -- an illness never detected in this country -- is similar to panicky news stories over West Nile virus, a rare, rather mild, flu-like illness, and this fall's outburst of fear over anthrax, which, after all, killed just a tiny handful of people.

"Why poison? This is pre-verbal," deMause says. "It goes way back to when you were still drinking milk from the mommy's breast or from the bottle." (It's this same dark, guilt-ridden fear of the forces around us, deMause says, that explains what he considers to be our overreaction to the terrorist attacks of September 11. "I'd be glad to shoot bin Laden in the cross hairs myself. I'm not a pacifist," he says. "But it seems to me that we're going to go back and finish the job in Iraq and do all sorts of other horrible things that we should not do.")

But even if fear of mad-cow disease somehow taps into our more primal cultural obsessions, the disease itself must still be contended with. And the fact is that the seeming dearth of mad-cow cases in this country may be the entirely predictable result of our failure to look.

Michael Greger, a Jamaica Plain physician who is a nationally recognized expert on mad-cow disease (he's listed in the acknowledgments of Mad Cow U.S.A.), blasts the US Department of Agriculture for what he calls a "'don't look, don't find' program of surveillance," adding: "Every week in Europe they test 10 times as many cattle as we have tested in a decade. Europe has tested five million at this point. If the US had as high an incidence as Europe, the current USDA testing program would not detect it. It is irresponsible to assert that we have no mad-cow disease in the United States when we simply haven't looked hard enough to tell."

As for what the future holds, Greger replies that "no one knows what the risk of eating American beef is. And by the time we know for sure, it may be too late. I counsel my patients to err on the side of caution and stop eating beef. Better safe than sorry."

For obvious reasons, mad-cow disease remains a big story in Britain. Last September, London's Guardian newspaper published a harrowing two-part series on the small village of Queniborough, where five young adults had died of nvCJD over a period of several years. The reporter, Kevin Toolis, noted that all the victims may have gotten sick because of such antiquated butchering practices as mixing brains and meat. His description of the long, agonizing death of Stacey Robinson was particularly horrifying.

"The howling went on for five months, night and day, from the autumn of 1997 to the spring of 1998, a low, growling, demonic yowl that escaped her lips as if it came from deep within the earth; the cry of the damned," Toolis wrote. "It could be heard halfway along the ward in Leicester's Royal Infirmary as Stacey plunged into madness. She soon lost the power to walk, to eat, to clean herself, to use the bathroom. She turned aggressive, kicking, swearing and assaulting her nurses. She battered her forearm against the bed until it was black and blue. She held her hand under a scalding tap and felt no pain. In the end, the doctors turned her ordinary city hospital room into a padded cell."

Mad-cow disease could be much ado about nothing; it could also turn out to be a scourge for the ages. To date, only 100 or so people in Britain (and just a handful in other countries) have died of nvCJD, even though some 60 million may have been exposed to BSE-contaminated beef. But because of the decades-long latency period, those who have died so far may merely represent the bleeding edge. According to some estimates, the worst-case scenario is that some 100,000 Britons could die the way Stacey Robinson did.

Last winter, when mad-cow disease was all the rage, both Time and Newsweek ran big stories on it. CAN IT HAPPEN HERE? asked Time. CANNIBALS TO COWS: THE PATH OF A DEADLY DISEASE was Newsweek's lurid take. A search of the New York Times' Web site turns up 114 references to mad-cow disease during the first three months of 2001 -- but just 43 during the slow-news months of June, July, and August.

Three and a half months after September 11, the media are gradually returning to normal, however you want to define normal in a country scarred by terrorism and war. The New York Observer last week predicted that the Times may cease publication of "A Nation Challenged," its special section on the war against terrorism, sometime after the New Year. How long will it be before Geraldo, back from misrepresenting his whereabouts in Afghanistan, treats his new viewers on the Fox News Channel to a special on JonBenét Ramsey?

The media have distinguished themselves this year, proving that -- despite a decade of corporate downsizing and a growing obsession with celebrity and scandal -- they can still provide sustained, in-depth coverage of vitally important news. If the reinvigorated media are looking for other important stories to cover in 2002, they should take another, longer look at mad-cow disease. They had the chance in 2001, and they walked away. But discerning the extent of this threat to our food supply is surely as important as describing the threat from Al Qaeda.

Dan Kennedy is Media Editor for the Boston Phoenix and can be reached at dkennedy@phx.com.

The Journalism Money Pit

Until September 11, the megacorporations that own most of the nation's news organizations were squeezing, slashing, and downsizing. Since the attacks on the World Trade Center and the Pentagon, these same corporations have, to their credit, pursued a virtual open-checkbook policy, allowing their news divisions to spend whatever it takes to cover the war on terrorism.

Now it's gut-check time. Will media executives invest in the journalistic resources needed to cover a complex, worldwide story that is likely to drag on for years? Or will they, once things have returned to normal (whatever that now means), resume their cost-cutting ways?

Despite a drop-off in advertising revenues that, even before September 11, was reportedly the most severe since the Depression years of the 1930s, the financial pressures facing the media are largely artificial. With Wall Street addicted to media-company profits of 20 percent and above, corporate executives have been cutting not to stave off losses but, rather, to maintain profits.

John Morton, a financial analyst who specializes in the newspaper business, puts it in perspective when he notes that newspaper companies rang up an average profit margin of 19 percent for the first six months of 2001. "It's all relative," he says. "It's not like newspaper companies were losing money. It's just that they weren't making as much as they were accustomed to."

Of course, the squeeze has unquestionably gotten worse since September 11. According to the New York Times, the three major networks lost about $500 million in ad revenue during the first few days after the attacks, when they broadcast commercial-free news 24 hours a day. The news media as a whole may have already spent $100 million beyond their budgets.

Advertising Age reports that analysts are revising their already-pessimistic forecasts downward: after having previously predicted a modest rise in ad spending in 2002, most now believe the decline will continue. Media corporations such as the New York Times Company (whose holdings include the Boston Globe), Knight Ridder, Gannett, and Reuters have reported miserable third-quarter earnings in recent weeks, with downward trends accelerating since the terrorist attacks.

But media companies, over the long term, remain highly lucrative enterprises. More important, they hold a public trust, considered so vital by the drafters of the Constitution that the press was given explicit protection in the form of the First Amendment. That public trust has never been more crucial than it is now. This is no time for media executives to obsess over quarter-to-quarter results.

Yet, given the need to keep their stock prices up, it's hard to see how they can do otherwise. It's a vicious circle, and it will require foresight and courage to break it.

Perhaps the most optimistic media observer is Tom Rosenstiel, director of the Project for Excellence in Journalism, who thinks September 11 may have signaled a shift similar to the transition from the frivolous 1920s to the serious '30s and '40s. In both the '20s and the '90s, the media were obsessed with celebrity and scandal. Now, just as they were 70 years ago, they are being forced to turn their attention to the unhappy realities of a changed world. Rosenstiel believes the media can rise to the occasion -- although he cautions that there's no guarantee that will happen.

Consider what has happened to CNN. Before September 11, the pioneering all-news channel was sucking wind. Its corporate master, AOL Time Warner, was cutting costs, and the channel was loading up on boneheaded talk shows to compete with the Fox News Channel, its upstart conservative rival. Since the attacks, the ratings of all three news channels -- CNN, MSNBC, and Fox -- have soared, with CNN, which still has the deepest reporting corps, leading the way. The network has reportedly been able to raise its advertising rates, which will pay for more reporting. And its chief executive, Walter Isaacson, has crowed publicly that his network has rediscovered its sense of mission.

Rosenstiel thinks that this is no fluke -- that bigger audiences are here to stay, and that those audiences will pull in the advertising revenues needed to pay for better news coverage. And though he cautions that he doesn't want to sound "Pollyanna-ish," he also thinks it might be possible to convince Wall Street that news organizations are better off putting more of their revenues into newsgathering and less into shareholder value. A pipe dream? Perhaps, Rosenstiel admits. But he notes that Wall Street, more than any other place in the country, was devastated by the terrorist attacks, and "they are going to have a different psychology now."

Besides, Rosenstiel asks, if supermarket profits, to cite one example, generally run five percent, why is it accepted as received wisdom that media companies must earn at least 20 percent? Those profit expectations, he says, are based on nothing other than "consensus." And the consensus can change.

Geneva Overholser, a Washington-based professor of public-affairs reporting for the Missouri School of Journalism and a leading critic of Wall Street's unceasing demand for profits, says she hopes that the "impossibility" of reconciling high profits and in-depth news coverage "is so clear now that we will finally come face to face with the need to change the way we operate." She adds: "I don't think the Street is just going to decide it will be different. We can't keep up with the old profit pressures, and we've got to tell Wall Street a different story."

(Overholser, by the way, will be among the participants in a conference at Harvard on October 28 and 29 on "Paying for the Next News," sponsored by the Nieman Foundation and New Directions for News. The purpose, says Nieman curator Robert Giles, is to develop strategies to balance profit considerations with the public interest. For more information, go to www.newdirectionsfornews.com.)

Earlier this year, Jay Harris walked away from his job as the publisher of the San Jose Mercury News rather than comply with Knight Ridder directives to keep cutting costs. Observers were shocked. Harris, one of the highest-ranking African-American news executives in the country, was an industry star. But he said he could not in good conscience implement cuts that would hurt his already-profitable paper just to make it still more profitable. And he issued a public call for national action -- perhaps in the form of a high-profile commission -- to study how Wall Street's unrealistic profit demands were hurting the news business.

"I wish I could be as hopeful as Tom [Rosenstiel]," Harris told the Phoenix by e-mail. "On the positive side, this has been a giant reminder to the corporate managers that news is important, that it sells newspapers and attracts viewers, and that audiences expect them to do a good job. If and when they pull back they will probably catch some well-deserved flak. The question going forward is whether the CEOs and other corporate managers will pay more attention to the needs of their readers and journalism's responsibility to the nation in a time of profound national challenge, or remain beholden to large institutional investors and Wall Street analysts who likely will want the companies to do all they can for the bottom line and year-to-year growth -- the crisis facing the nation notwithstanding."

As Harris understands all too well, the price of what became business-as-usual is now coming due. For the past decade the media have focused on celebrity and scandal, sex and sleaze, providing wall-to-wall coverage of O.J. Simpson, Monica Lewinsky, and Gary Condit. The attention given to Osama bin Laden and other Islamic extremists was, by comparison, minuscule -- despite the earlier World Trade Center bombing, attacks on American embassies in Kenya and Tanzania, and the Clinton White House's misbegotten raids on purported bin Laden strongholds in Sudan and Afghanistan. Los Angeles Times media reporter David Shaw recently cited studies showing that "newspaper editors and television news executives have reduced the space and time devoted to foreign coverage by 70 percent to 80 percent during the past 15 to 20 years." It's no wonder that we now find ourselves adrift in a dangerous world that we don't understand.

Media analyst John Morton thinks it won't be long before the media revert to their downsizing ways. "I expect there will be considerable pressure on cutting costs," he says. "Even a crisis, if it goes on forever, is no longer a crisis, and in time people will revert to their former habits."

Former Boston Globe editor Matt Storin warns that, even if the hunger for news remains strong, it won't necessarily follow that the advertising revenues will be there to support it. "At the very time when they might want more, they might get less," Storin says of news consumers. "As good as circulation and viewership is, it's the advertising that pays the bills, by and large."

Yet such a development would not only be bad for democracy -- it would, ultimately, be bad for the bottom line as well. Indeed, the war on terrorism may represent an opportunity for media that are willing to invest in the journalistic resources necessary to cover it. Business consultant (and presidential cousin) John Ellis, a columnist for Fast Company, believes that "the appetite for great reporting has never been higher," and predicts that news organizations that rise to the occasion will also, in the long run, enjoy the greatest economic success.

As a comparison, Ellis invokes the well-known story of the New York Times and the New York Herald Tribune, and the way each responded to the severe paper shortages of World War II. The Herald Tribune cut its news hole; the Times loaded up on as much reporting as it could, and cut advertising. The Times, of course, emerged from the war as New York's -- and the nation's -- leading newspaper. The Herald Tribune went out of business two decades later.

Trouble is, during World War II the Times' owners, the Sulzberger family, could do anything they pleased with their newspaper. Today, even the Sulzbergers have ceded some of their control to Wall Street.

In the current media environment, in which nearly all of our most important news organizations are owned by publicly traded corporations, the issue isn't whether individuals want to do the right thing, but whether the stock market will allow it.

Dan Kennedy is Media Editor for the Boston Phoenix and can be reached at dkennedy@phx.com.

How the Terrorist Crisis Threatens Our Personal Liberties

They that can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.
-- Benjamin Franklin, Historical Review of Pennsylvania, 1759

At some level of danger to life and property, even people strongly committed to civil liberties -- and not everyone is -- are prepared to sacrifice some privacy, freedom of movement, and convenience to greater security.
-- Harvard Law School professor Philip B. Heymann, Boston Globe, 9/15/01


On Tuesday, September 11, not long after the terrorist attacks on the World Trade Center and the Pentagon, John Perry Barlow sent a message to the nearly 1000 people who regularly receive his "BarlowFriendz" e-mails. The contents were what might be called typical Barlow at an atypical moment. A well-known Internet libertarian and former lyricist for the Grateful Dead, Barlow was not about to temper himself, even during those first awful, horrifying hours.

"This morning's events are roughly equivalent to the Reichstag fire that provided the social opportunity for the Nazi take-over of Germany," Barlow wrote. "I am *not* suggesting that, like the Nazis, the authoritarian forces in America actually had a direct role in perpetrating this mind-blistering tragedy.... Nevertheless, nothing could serve those who believe that American 'safety' is more important than American liberty better than something like this. Control freaks will dine on this day for the rest of our lives." He closed with this: "And, please, let us try to forgive those who have committed these appalling crimes. If we hate them, we will become them."

Barlow's message was many things: overwrought and outrageous, to be sure, but also important -- even prescient, given the fearful, censorious atmosphere that has wrapped around us like a thick blanket during the past two weeks, both comforting and suffocating. As Barlow quickly learned, it also made him a prime target for precisely the kind of authoritarianism he'd warned of. "Merely by calling for forgiveness for the bombers," he said, "I've received a death threat. I've received some extremely ugly e-mail." People who e-mailed him to say they agreed with him, he adds, were quick to interject that they didn't want to be identified.

"This is what totalitarianism is really about," Barlow says. "It's not the imposition of dictatorial will on a population by a dictator, it's the imposition of dictatorial will on a population by the population itself. Not that it's going to stop me, but in much of America right now, it takes a lot of courage to speak your mind if you're not willing to go to war against whoever the enemy might be."

The purpose of terrorism is to terrorize. To that extent, at least, the hijackers have won a temporary victory. Surveys taken in the immediate aftermath of the attacks on New York and Washington show that though Benjamin Franklin may have eloquently stated the American ideal, Philip Heymann -- and John Barlow -- have a better sense of the public mood. In the current environment, people are all too willing to give up their "essential liberty."

Take, for instance, a poll conducted by ABC News and the Washington Post on September 13, in which 92 percent of respondents said they would support "new laws that would make it easier for the FBI and other authorities to investigate people they suspect of involvement in terrorism." More ominously, support dropped only slightly, to 71 percent, when people were asked whether they were prepared to give up "some of Americans' personal liberties and privacy."

Another poll, conducted by CBS News and the New York Times on September 13 and 14, found much the same thing. By a margin of 74 percent to 21 percent, respondents agreed that "Americans will have to give up some of their personal freedoms in order to make the country safe from terrorist attacks." Even when asked whether they would be "willing or not willing to allow government agencies to monitor the telephone calls and e-mail of ordinary Americans on a regular basis," an eye-catching 39 percent were willing, and 53 percent were not -- hardly a ringing affirmation of the right to be left alone. And it gets worse: this week, a poll by the Siena College Research Institute found that one-third of New Yorkers would favor internment camps for "individuals who authorities identify as being sympathetic to terrorist causes."

Civil liberties are in grave danger today, perhaps more than at any other time in our history. The "war" metaphor President Bush has chosen to describe the events of September is apt, given the seriousness and deadliness of the attack against us. But, as many others have observed, it is a war of a very different kind, with no clearly defined enemy, no hard-and-fast objective, no obvious endpoint -- a miasma pervaded by paranoia and suspicion. In that kind of atmosphere, the spirit of freedom -- which always takes a beating during wartime -- is fragile and vulnerable.

Though government officials from the president and Attorney General John Ashcroft on down have paid lip service to civil liberties, and though the media have focused on the fate of freedom with unusual vigor and perseverance, the news coming out of Washington, and from across the country, is chilling. Repressive new immigration and wiretapping laws are under consideration in Congress. Arab-Americans are being harassed and attacked. Music is being censored. Television personalities are apologizing for speaking their minds.

No sane person would object to more-stringent security at airports -- although such common-sense steps as installing better doors to protect pilots from terrorists would do more than screening passengers who look vaguely Arab, or banning plastic knives. Trampling on the Constitution is another thing altogether.

In a mind-boggling column for the New York Post last week in which he labeled foreign-born Middle Easterners a potential "fifth column," John Podhoretz wrote: "Leftist civil libertarians and right-wing anti-government types can do their part ... to protect Muslims and Arab-Americans -- with a generous display of silence and understanding when it comes to the new surveillance techniques being adopted by law enforcement. Their standard-issue complaints ring hollow at a time of war, when civil liberties must necessarily be curtailed to some degree."

Obviously, fighting terrorism worldwide will not be sufficient to save the United States as we know it. If Podhoretz's sneering dismissal of constitutional protections is any indication, we're going to have to fight repression at home as well.

The war on terrorism declared by our government is part of a profound struggle between rationalism, human liberty, and modernity on the one hand, and fundamentalist religion and traditionalism on the other. It is, in short, a war between the Enlightenment and the medieval world. Though US policy in Arab and Muslim societies is hardly above reproach, it should be obvious to all Americans that the West cannot afford to lose the war against terrorism -- any more than we could afford to lose to Nazism or global communism.

In fighting this war, though, we would be well advised not to trample too heedlessly on the Bill of Rights -- emblematic, as it is, of the Enlightenment itself, with its emphasis on human liberty and autonomy. During virtually every dangerous moment in our history we have taken actions that we later came to regret: the Alien and Sedition Acts, used by President John Adams in the 1790s to imprison domestic critics and expel foreigners; the suspension of habeas corpus in the 1860s, during the Civil War; the Palmer raids against leftist groups during and after World War I, around 1920; the incarceration of Japanese-Americans during World War II, in the 1940s; the anti-Communist witch hunts of the McCarthy era, in the 1950s; and the abuses of J. Edgar Hoover's FBI, which waged a secret, unconstitutional campaign of surveillance and disruption against the civil-rights and antiwar movements of the 1960s.

Then there was Richard Nixon, who occupied a category all his own. The break-in at Democratic Party headquarters in the Watergate complex, the pilfering of antiwar activist (and Pentagon Papers leaker) Daniel Ellsberg's psychiatric records, the use of the Internal Revenue Service to harass political enemies -- Nixon's presidency was a constant, ongoing war against liberty. It was a war that ultimately resulted in his near-impeachment and, in 1974, his resignation.

Fortunately, to judge from the outcry by civil libertarians during the past few weeks, it appears that we have learned from the past. Unfortunately, we may be doomed to repeat it anyway. Support for freedom diminishes in the face of danger. What takes its place is a desire for safe, non-threatening conformity. It's understandable, given the current crisis, that President Bush is enjoying an unprecedented 90 percent approval rating (according to a CNN/USA Today/Gallup poll). But it's chilling that the six percent who continue to disapprove of Bush's performance are being singled out as unpatriotic or worse.

Consider what happened to Bill Maher. Last week, on his ABC program, Politically Incorrect, Maher said something, well, politically incorrect -- namely, that in some past military campaigns, US forces "have been cowards lobbing cruise missiles from 2000 miles away." He continued: "That's cowardly. Staying in the airplane when it hits the building, say what you want about it, it's not cowardly." The outcry was so fierce that Maher apologized, saying his views "should have been expressed differently" and that he should have aimed the "coward" charge at politicians rather than American troops. But that didn't stop Sears, Roebuck and FedEx from canceling their advertising, thus threatening the future of his show. Obviously, advertisers have a First Amendment right to withdraw their patronage; but in this case, it's the power of corporate money, not ideas, that is at issue. And that was just the beginning.

Clear Channel Communications, which owns the country's largest chain of radio stations, reportedly sent out a memo urging its stations not to play as many as 150 songs that could be considered offensive under the circumstances -- including John Lennon's "Imagine," which Neil Young managed to perform movingly and without controversy during last Friday's national telethon.

Massachusetts congressmen Marty Meehan and Richard Neal offered some mild criticism of Bush's performance in the hours immediately after the bombing -- and, according to Boston Globe columnist Scot Lehigh, Meehan received threats serious enough to warrant police protection.

In Texas, the FBI shut down Arabic Web sites, prompting, according to Reuters, charges of conducting an "anti-Muslim witch hunt."

In Baltimore, the Sun reported that anchors and even a weather forecaster at one TV station were required "to read messages conveying full support for the Bush administration's efforts against terrorism." When staffers objected, the message was changed to indicate that it came from "station management."

A caller to NPR's The Connection last week said he's flying the American flag not just to demonstrate his patriotism, but to ward off the animus of those who might think that he, an olive-skinned Italian-American, was an Arab.

Televised comments by those geriatric poster boys for religious intolerance, the Reverends Jerry Falwell and Pat Robertson, resulted in a double-reverse back flip that was almost funny. Falwell attributed the terrorist attacks to God's wrath, blaming the ACLU, feminists, abortion-rights supporters, and lesbians and gay men, and saying, "God continues to lift the curtain and allow the enemies of America to give us probably what we deserve." Responded Robertson: "Jerry, that's my feeling." The once-influential hatemongers' comments touched off a firestorm of criticism, prompting Falwell to apologize, layering irony upon insult. Not only did his and Robertson's rhetoric reveal an anti-modernist mindset similar to the terrorists' (minus the violence), but by being forced to say "I'm sorry," Falwell himself became a victim of the same cultural clampdown on free speech that had ensnared Bill Maher and Marty Meehan.

Indeed, the threat posed to the First Amendment right now is not so much official censorship -- that is, bans enacted by the government -- as self-censorship, a phenomenon that is far more dangerous in an age of media conglomerates than it would have been in an earlier time. Maher can't speak his mind if advertisers are going to boycott his show, which must turn a profit for ABC in order to stay on the air. A list of songs banned by one radio station is of little consequence. But when Clear Channel suggests that its nearly 1200 radio stations consider not playing certain songs, that's downright chilling.

"We're in the very murky realm of self-censorship," says Marjorie Heins, director of the Free Expression Policy Project at the National Coalition Against Censorship. Institutions such as ABC and Clear Channel "have their own First Amendment rights to decide what to produce," she says. "This only gets worrisome if this gets pervasive and widespread and goes on for a long period of time. Hopefully they'll come to their senses."

Congress has recent experience in how not to react to a terrorist attack. A year after the Oklahoma City bombing of 1995, Congress passed the Antiterrorism and Effective Death Penalty Act, a grotesque piece of legislation that accomplished two things. It severely curtailed the writ of habeas corpus, making it far more difficult for convicted criminals -- even those awaiting the death penalty -- to present new evidence that they'd been wrongly convicted. And it allowed the use of secret evidence in deportation cases against immigrants, which writer and civil libertarian Nat Hentoff has rightly called a "denial of fundamental due process."

It's notable how little any of it had to do with what actually happened in Oklahoma City. Keep that in mind. Over the past few years, a number of proposals to curtail fundamental freedoms in the name of security have festered in back offices in Washington and elsewhere, waiting for the right time to be pulled out of a drawer and sprung upon an unsuspecting public.

One of those times came on the evening of September 13, just two days after the attacks, when Senators Orrin Hatch (R-Utah) and Dianne Feinstein (D-California) brought an amendment to the floor that would make it far easier for government investigators to snoop on computer users. Senator Patrick Leahy (D-Vermont) protested that the bill -- which few had had time even to read -- was overly broad and fuzzy. For instance, the bill would add "terrorism" to the list of crimes for which a person could be investigated. That would seem to be a no-brainer, except, as Leahy pointed out, the bill provided no definition of terrorism. "I guess some kid who is scaring you with his computer could be a terrorist and you could go through the kid's house, his parents' business, or anything else under this language; it is that broad," Leahy observed. He added: "Maybe the Senate wants to just go ahead and adopt new abilities to wiretap our citizens. Maybe they want to adopt new abilities to go into people's computers. Maybe that will make us feel safer. Maybe. And maybe what the terrorists have done made us a little bit less safe. Maybe they have increased Big Brother in this country."

Despite Leahy's opposition, the amendment passed on a voice vote. The appropriations bill to which it was attached was approved unanimously.

The situation in Washington right now is in flux. Though few doubt that the Hatch-Feinstein amendment would pass the House, it has been largely superseded by a bill filed last week by Attorney General John Ashcroft -- the so-called Anti-Terrorism Act of 2001, or ATA. The bill is being considered this week -- by Senator Leahy's Judiciary Committee, fortunately -- and could change considerably before it emerges from the legislative sausage-maker. Nevertheless, it has been the subject of wide-ranging opposition from groups that normally don't get along: like-minded organizations such as the ACLU, the Electronic Frontier Foundation (EFF), and the Electronic Privacy Information Center (EPIC) have been joined by conservative groups such as Phyllis Schlafly's Eagle Forum and the Gun Owners of America.

Among the more controversial provisions of the Ashcroft legislation are those aimed at bringing wiretapping laws into the digital age. Theoretically, this makes sense: investigators' ability to track crime should not be diminished by technological advances. Yet the legislation could give investigators a great deal more power than they have today. Under current law, for instance, officials can obtain the phone numbers called by a suspect, even without probable cause. Under ATA, investigators would be able to obtain e-mail addresses and even the Web locations a suspect has visited -- information that is considerably more revealing. The role of judges in approving such wiretaps would be diminished, thus weakening an important safeguard. "Roving" wiretaps, which follow a suspect from phone to phone rather than being placed on just one phone, would be permitted -- probably a sensible move, but open to abuse. The legislation could also make it easier for federal investigators to use a controversial piece of software known as "Carnivore," which would allow them to intercept enormous quantities of e-mail and other information from Internet service providers, even from innocent customers not suspected of any wrongdoing. Customers would have no choice but to trust the feds not to exceed the scope of their warrants. By contrast, traditional wiretapping targets just those customers covered by a warrant granted by a judge. It also requires the intervention of the phone company, which, at least in theory, provides an extra layer of protection.

Because ATA, the Hatch-Feinstein measure, and other proposals have been drafted so hastily, groups such as the ACLU, EFF, and EPIC have been forced to react more quickly than they normally would, issuing broad statements of principle with details still to come. On Monday, for example, EFF issued a statement that said in part, "We fully support legitimate government efforts to bring the perpetrators of these attacks to justice. Yet as a watchdog for civil liberties, we are skeptical of claims that the only way we can increase our security is giving up our freedoms."

If only ATA were the worst of it. Time magazine reports that the Bush administration "is considering the establishment of special military tribunals" so that suspected terrorists "could be tried without the ordinary legal constraints of American justice." This is in addition to a policy change Ashcroft has already announced that expands the government's power to detain immigrants suspected of crimes. A recent article in the Wall Street Journal on how Europe deals with terrorism raised a cafeteria full of repressive possibilities: issuing national identification cards, placing closed-circuit television in public places, and holding suspects without charge for days on end. "Biometrics" technology could be used to identify people through the unique characteristics of their eyes or other facial features.

Other proposals are lurking in the bushes. Just last month, Senator Richard Shelby (R-Alabama) withdrew what critics have called the "Official Secrets Act" -- a bill that would make it a felony for government officials to leak virtually any classified information. (Never mind that it's already a crime to leak information that would compromise national security.) Former senator Daniel Patrick Moynihan has spoken often of government's excessive zeal in classifying information, as much to cover up official bungling as to protect the public. Last year Moynihan testified that the government has "enough classified material to stack up as high as 441 Washington Monuments." The Shelby legislation would only worsen this situation. As the New York Times editorialized, the bill would make it difficult to debate such important issues as the US-backed drug war in Colombia; it might even have hindered efforts to expose the Iran-contra scandal of the 1980s. Perhaps the most important example of a righteous leak concerns the aforementioned Pentagon Papers, which revealed the secret bureaucratic history of the Vietnam War. Unfortunately, Shelby has promised to introduce his miserable bill again when the time is right. And he is nothing if not persistent, having persuaded Congress to pass it last year, when it died only as a result of Bill Clinton's veto.

Certain elements in Washington have been trying for years to ban the use of encryption technology unless the government could be guaranteed a way to crack the code. Never mind that there is no evidence the New York and Washington terrorists used encryption, and that freedom fighters in other parts of the world have used it to safeguard their communications from tyrants such as Slobodan Milosevic.

Now Senator Judd Gregg (R-New Hampshire) is going to try again, even though nearly unbreakable encryption technology is freely available on the Internet. Gregg, in other words, is proposing to act after the horse has gotten away and the barn has long since burned to the ground.

When encryption is outlawed, only outlaws will use encryption. Maybe making common cause with gun owners makes sense after all.
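The joke barely exaggerates. As a minimal sketch of what "nearly unbreakable encryption technology, freely available on the Internet" amounts to in practice, here is strong encryption in a half-dozen lines of Python -- using the freely downloadable cryptography package as a present-day stand-in for the PGP-style tools of 2001 (the package and its calls are an illustration, not anything Gregg's bill or the original reporting names):

    # Strong symmetric encryption with a freely available library -- a sketch,
    # assuming only "pip install cryptography" on any ordinary computer.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # a fresh random key, generated locally
    token = Fernet(key).encrypt(b"meet at the usual place")
    print(Fernet(key).decrypt(token))    # b'meet at the usual place'

    # Without the key, the ciphertext is AES-encrypted gibberish: no back door,
    # no escrow, nothing for a subpoena to retrieve.

No license, no registration, no government-approved key escrow -- which is precisely why a ban would inconvenience everyone except the outlaws.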

At the root of these proposals to take away some of our liberties is this terrible thing that happened to us, and the very real threat that it -- or something like it -- is going to happen again. "We're sort of in this desperate search for security, and we want everybody to be on the same page. And that is a scary thing," says Paul McMasters, First Amendment ombudsman for the Freedom Forum. "We have the right to private speech, to engage in public discussion, and to do so anonymously if we wish. Those are very important First Amendment freedoms. The danger is that we're possibly devolving into a society that is run by the tyranny of conformity."

If this is war, we have to preserve our right to express our opinions about it. Some will demonstrate in favor of peace; some already are doing so. Others -- probably most of us -- will favor military action, but will understand that, as a self-governing people, we need to do our utmost to understand what is going on and to criticize as well as support our government.

Brock Meeks brings an unusual perspective to the table. He covered the Soviets' war in Afghanistan for the San Francisco Chronicle. His pioneering online journal, CyberWire Dispatch, railed against (among other things) government attempts to curtail free speech on the Internet. Now, as chief Washington correspondent for MSNBC.com, Meeks is worrying about a war that may well affect two of his sons, both of whom serve in the Army.

"I face the prospect of going back to Afghanistan and covering my own sons' deaths," Meeks says. "I think people should not be afraid to question the motives and operations of this government in anything that puts people in harm's way. People have to not go quietly into the night. The lessons of Vietnam aren't that far removed. People more than ever are going to have to dust off those 'Question Authority' buttons and bumper stickers and put them on."

Voluntary, heartfelt unity is one thing, and it's encouraging to see after the devastation of September 11. Conformity built on social stigmatization or even threats, combined with repressive new laws, is something else entirely.

Part of the job we all have to do in order to win this war is to keep the barbarians who seek to exterminate the Western notion of individual liberty from goading us into doing a good part of that job for them. In American constitutional law, virtually no liberties are absolute. After all, as Supreme Court justice Arthur Goldberg once said, "The Constitution is not a suicide pact."

Nevertheless, freedom would be drastically diminished if the Bill of Rights, targeted as the result of a siege mentality, were substantially weakened. The big danger that lies ahead -- as big a danger, at least in a larger historical context, as terrorism itself -- is that we'll turn our backs on the Enlightenment.

At this time of national crisis, many Americans take some comfort in waving the flag, and rightly so. But the flag is not the only symbol of our culture of liberty.

So, too, is the Constitution.

Dan Kennedy is a media critic for the Boston Phoenix. Harvey Silverglate is the co-author of The Shadow University: The Betrayal of Liberty on America's Campuses (HarperPerennial) and a partner in the law firm of Silverglate & Good.

Covering the Horror

TUESDAY, SEPTEMBER 11, 2 P.M. -- Three observations as the horror of the worst terrorist attack in American history continues to unfold.

1) The local affiliates are a hindrance. CBS, NBC, and ABC were almost unwatchable at times, because the local affiliates insisted on taking control of their own newscasts. With several Manhattan blocks in flames and the Pentagon smoldering in Arlington, Virginia, local viewers were subjected to such trivia as updates on the Mass Pike and the MBTA, and the release of "nonessential" workers from the State House. To be sure, the origin of several of the terrorist flights -- Logan Airport -- makes this a huge local story, and it will be fascinating to see what Boston news organizations do with this in the days and weeks to come. But localization hit bottom when the Boston stations carried Governor Jane Swift's well-meaning but news-free press conference live, even as CNN was broadcasting a crucial news conference called by the Taliban's foreign minister.

2) The media are acting responsibly. With elected officials such as Senator Chuck Hagel justifiably calling this "another Pearl Harbor," it must be difficult for the always-jumpy media to refrain from escalating tensions even further. Yet, even though commentators have properly noted that international terrorist Osama bin Laden must be considered a prime suspect, care is being taken not to get ahead of the story. "Nobody knows who's responsible for this," said CBS's Dan Rather shortly before 1 p.m. And when his colleague Bob Schieffer talked about the "rage" that was being felt in the streets of Washington, Rather replied, "It's one thing to have all that rage; it's another thing to know where to direct that rage." This is a considerable improvement over the first hours after the Oklahoma City bombing in 1995, when various so-called experts were quick to point the finger at Middle Eastern terrorists, only to have to swallow their words when the conspiracy turned out to be homegrown.

3) The media are reacting rather than reporting. For every reporter, anchor, and commentator who's working today, this is the biggest story he or she has ever covered -- and, it is to be hoped, ever will cover. Scanning CNN, MSNBC, and the major networks, one finds the media doing an admirable job of remaining calm and trying to offer facts rather than unfounded speculation. But the atmosphere is too emotional and charged for anyone to step back and simply offer a 10- or 15-minute newscast pulling together everything that has happened. As a result, viewers are subjected to visions of horror but not much context, making it difficult to grasp the whole awful story. To be sure, this is a story that is best followed live. But a recap at the top of every hour would make it easier to follow.

Dan Kennedy is the media columnist for the Boston Phoenix.

Must-see GOP TV

PHILADELPHIA, JULY 31 -- There's an excellent reason that the number of people watching the conventions plummeted between 1976 and 1996. Rather than being news events where important things actually happened, the quadrennial gatherings degenerated into lame made-for-TV spectacles, with predigested pap and feel-good videos substituting for the fractiousness and drama of years past. Seen in this light, the public's increasingly itchy clicker-finger should be read not so much as a sign of disengagement as evidence of good taste.

Now some of the best minds in media and politics have a solution. They want to force you to watch.

Okay, let's be clear. We're not talking Big Brother, as in Orwell's 1984. Instead, we're talking about canceling Big Brother, as in the so-called reality show, at least when the conventions are in session. At a noontime panel discussion at City Hall on Sunday, as thousands of delegates were beginning to arrive for the Republican National Convention, a whole lot of smart people who ought to know better insisted that what the process really needs is a return to gavel-to-gavel coverage by the three major broadcast networks.

"I think the networks' decision is flat-out unconscionable," said Democratic National Committee general chairman Ed Rendell, referring to the cutback in hours that are being devoted to convention coverage this year. Rendell -- a former Philadelphia mayor who was instrumental in bringing the Republicans here -- went so far as to insist that all six networks (CBS, ABC, NBC, Fox, UPN, and WB) broadcast all four hours during each of the conventions' four nights. "And if they lose money," he added, "they should take their lumps."

Rendell's was a popular position at the event, sponsored by the Joan Shorenstein Center on the Press, Politics, and Public Policy, part of Harvard's Kennedy School. His remarks were met with loud applause, as were similar calls to arms by former Shorenstein Center director Marvin Kalb, now co-director of the center's Vanishing Voter Project ("There is a public-interest obligation"), and CNN anchor Judy Woodruff ("The stakes are higher than usual"). To be fair, none argued for conveyor-belt journalism; all, Rendell included, said the networks should use the conventions as an opportunity to do tough reporting, although all were unclear on exactly how to make that happen at an event so prefabricated that -- as Boston Globe editor Matt Storin pointed out -- not even the vice-presidential nomination is left for convention week anymore. By contrast, the crowd sat on its hands when CBS News president Andrew Heyward said the Big Three networks should get credit for continuing to cover such non-newsworthy events as thoroughly as they do, noting that they will all broadcast the presidential nominees' speeches live.

Of course, the conventions are being covered gavel to gavel on cable, via CNN, MSNBC, the Fox News Channel, and C-SPAN, as well as on the Internet and in every major and most minor newspapers in America. And there is one broadcast network that's going gavel to gavel: PBS. Not everyone gets cable or has an Internet connection; but everyone with a television set does get PBS, and newspapers are not exactly difficult to come by. With some 15,000 journalists in town, there will be no shortage of political reporting this week. So why, you might ask, is it necessary to broadcast the conventions on every network?

The answer is that the public won't watch unless it's forced to. Proponents of coverage on all the networks don't put it quite that crudely. But Kennedy School professor Tom Patterson, who is also the co-director of the Vanishing Voter Project, reported in a background paper that 74 percent of people who responded to a recent survey didn't know when the Republican convention would be held, and only 34 percent planned to watch any of it. The case for multi-outlet coverage, then, rests on the notion that viewers who turn on the tube looking for Survivor and find the convention instead will stick around and maybe learn something about the presidential campaign. Perhaps the theory is that one faux reality show is as good as another.

It was that unlikely populist Tom Brokaw, the NBC anchor, who pointed out the flaws in Patterson's argument, comparing the unavoidable coverage Patterson and Rendell favor to "state television."

Afterwards, when I asked Brokaw to elaborate, he did so with relish, calling the conventions "summer camp for grown-ups," and suggesting that viewers are turned off by politics not because they don't care, but because they sense a powerful disconnect between politics and their lives. "This is all about raising money and having a good time," Brokaw told me. "The public today is separated from what's going on in Washington -- not every four years, but every day."

Forcing people to watch isn't going to change that grim truth.

Why We Love to Hate Microsoft

It was April 28, a Friday, late in the afternoon. I was standing in the service department of one of the few Apple Computer dealers left on the planet, waiting to pick up an iMac I'd brought in earlier for repairs. The technician -- a soft-looking guy in his 20s or 30s, an anti-static strap around one wrist and junk-food detritus spread out on his workbench -- was so excited you'd think he'd just discovered a lost episode of Star Trek.

Earlier that day, the Justice Department had announced it would seek to split the software giant Microsoft into two parts, with one company getting Windows, the other getting everything else: the Office suite (Word, Excel, and the like), Internet Explorer, MSNBC, Slate, and maybe even a few remaindered copies of Bill Gates's hilariously awful 1995 bestseller, The Road Ahead.

Six weeks would pass before federal judge Thomas Penfield Jackson would give the break-up his imprimatur. It could be years before Microsoft exhausts its appeals. The next president, whether it's George W. Bush or Al Gore, may flinch at the prospect of destroying our most successful company and instead order his Justice Department to settle on the cheap. Yet the Apple technician was having none of that. To him, the Microsoft split was a done deal, and it had come not a moment too soon. Chortling with nasal alacrity, he prattled on and on about how it was all over for "Bill" -- that "Bill" had dictated what the computing landscape looked like for far too long, and now it was time for "Bill" to toe someone else's line. I wish I'd been taking notes, but you get the idea.

Later, I was struck by the unreality of our exchange -- or, to be more accurate, his monologue. In the first place, there we were, two people for whom computers are an essential part of our daily lives, and neither one of us was the least bit dependent on Microsoft. I use precisely one Microsoft product -- the Macintosh version of Internet Explorer -- and certainly could get by almost as well with Netscape Navigator were I afflicted with the same purist tendencies I'm sure my technician was. "Bill" has surely done plenty of dictating over the years, but he hadn't done any to us.

More important, though, was the level of fascination that moment revealed. Yes, Bill Gates is an asshole -- an arrogant, screaming, humorless workaholic who, despite his carefully nurtured reputation as some sort of uber-geek programming genius, is actually a mediocre software developer who built his monopoly by stealing others' work when he could, buying it when he had to, and threatening to destroy companies that tried to do business with anyone other than Microsoft.

But so what? I mean, go watch Erin Brockovich. There you'll learn about a utility company, Pacific Gas & Electric, whose toxic dumping killed some people and sickened many more. Closer to home, air pollution from PG&E's two power plants in Massachusetts may be directly responsible for about 150 deaths a year, according to a recent study by the Harvard School of Public Health. Yet nobody knows the name of the guy who runs PG&E.

Of course, Bill Gates is the richest man in the world, and in a culture obsessed with wealth, that counts for a lot. Robert D. Glynn Jr. -- who is, in fact, the chairman, CEO, and president of PG&E -- is well compensated indeed, with a reported 1999 income of $2.3 million. But consider that Gates's net worth is an estimated $80 billion. Then, too, the computer industry is sexy, hot, celebrity-driven. Slate editor Michael Kinsley, whose paychecks are signed by Gates (ever the literalist, Kinsley notes that he actually is paid via direct deposit), defines the difference between Microsoft and PG&E this way: "Technology is glamorous. The electricity grid is not glamorous." Bill Gates and Microsoft are the most visible symbols of that glamour, regardless of how unglamorous Gates may be in real life.

We love celebrity and we love wealth. But we love it even more when someone who has it all loses it because of hubris, or looks as though he's going to lose it, or loses it and gets it back and begs forgiveness on Oprah. In the real world, Bill Gates runs a software company. In the pop-culture world, he's the smartest kid in class, the nerd who reminds the teacher she forgot to assign homework, the nudgy little prick who got beaten up on the playground all the time, exacted his revenge by running roughshod over his former tormentors, and is now about to have his ass handed to him once again, like some endlessly looped fifth-grade psychodrama. We love celebrities, and we love to hate them too.

Gates is both the most widely recognized symbol of the computer age and the monopoly-wielding intimidator who has stifled innovation and made mediocrity the near-universal standard. His billions inspire blind worship and bitter envy. He is among the most admired of Americans, yet a small but dedicated minority hates him with such intensity that you'd think he was personally forcing them to use his products. Politicians fawn over him, yet the government wants to destroy his company. It's a love-hate relationship that reflects our own bifurcated attitudes toward technology, celebrity, and wealth. We're prospering, many of us, but we're doing so in a world we don't understand, working too many hours, both master of and slave to the wondrous machines that made prosperity possible.

All of which, in short, is why we love to hate Bill Gates.

Granted, a majority of Americans don't spend that much time thinking about Bill Gates. To the extent that they do, polls show that two-thirds hold a favorable view. And why shouldn't they? Tens of millions of people use Microsoft products without complaint. Though rarely the best in a given category, Microsoft's programs work reasonably well most of the time, they're ubiquitous, and they're cheap. Even an old Macintosh diehard like me has to admit that you can save a lot of money and compatibility hassles by picking up whatever Wintel machine is on sale at Best Buy.

But, yes, we hate Bill Gates too. Go to any search engine and enter the phrase "Microsoft sucks." You will be rewarded with a luxuriously long list of sites. Some let you punch Gates right in the mouth. Some show that infamous video of Gates getting hit with a cream pie in Brussels in 1998. (Incoherent aside: the pie attack was reportedly led by a Belgian author and artist named Noel Godin, who claims to belong to a "gang of bad hellions that have declared the pie war on all the unpleasant celebrities in every kind of domain." Godin's slogan, according to the Netly News: "Let's pie! Let's pie! Nincompoop guys!") You can stare at Web sites that tote up Gates's wealth. You can watch Windows 98 crash on Gates himself, at a corporate unveiling. You can read an anti-Gates comic book at a site called Frankengates.

The anti-Microsoft movement obviously goes well beyond such digital trivia, though. There are the legions of folks urging consumers to boycott Microsoft, ranging from geeky Macintosh and Linux aficionados to geeky consumer advocate Ralph Nader. There are Windows users themselves, such as the New Republic's John Judis, who wrote last year, half in jest, that Microsoft should be broken up because its products, well, suck. There are Gates's competitors, such as Sun's Scott McNealy, who reportedly once called a Wall Street Journal reporter to complain, in a juvenile whine, that said reporter was going way too easy on "Little Billy Big Bucks," or Oracle's Larry Ellison, a reptilian thug who has said of Microsoft's rapacious ways, "It makes me want to puke." And, of course, let's not forget Janet Reno, Joel Klein, David Boies, and the rest of the Justice Department antitrust gang.

There's no question that Bill Gates has become a cultural obsession. Gates is one of our best-known celebrities, as omnipresent as Rosie O'Donnell or Michael Jordan. The legal case that threatens to destroy his company, US v. Microsoft, is the subject of endless, repetitive analysis in newspapers, magazines, and television shows. A Lexis-Nexis search of just one paper, the New York Times, for "Microsoft" together with "monopoly" or "antitrust" yields 1676 hits between January 1, 1995, and June 13, 2000; 324 of those hits are for this year alone. Gates is a frequent cover boy for national magazines such as Time and Newsweek -- and all over the business and technology press. He is a disembodied presence in Douglas Coupland's 1995 novel Microserfs, whose opening lines include, "Bill is wise. Bill is kind. Bill, Be My Friend ... Please!" Ulterior Motive, a 1998 thriller by Daniel Oran -- a former Microsoft programmer who is said to have "invented" the Win95/98 "Start" button (nothing but a rip-off of the Macintosh Apple menu, snort I) -- stars a Gates-like character who runs for president even while preparing secretly to unveil an operating-system upgrade that will spy on every computer user in the country. How's that for an invasive cookie? And not to give away the ending, but Oran couldn't resist the urge to kill off Gates -- er, Jack Malcolm -- in the closing pages.

What, precisely, is the fascination with this supremely uninteresting man, who is universally described by those who know him as a wonk and a grind, consumed by business and awkward in his social dealings? Much of it involves our ambivalence toward wealth -- other people's and our own. Gates is -- even after the recent drop in the price of Microsoft stock lopped tens of billions of dollars off his net worth -- the richest man in the world. At a time of unprecedented stock-market wealth, Gates is the ultimate symbol, having done much to fatten the portfolios of small investors and large institutions alike. And he accomplished all this when he was young. More than two decades into the Microsoft saga, he is still only in his mid-40s. And when he's not obsessing over Judge Jackson or sleep-deprived or just generally agitated, and the light is just right and the photographer is kind, he very much resembles the gangly, dirty-haired, scuzzy-toothed kid who dropped out of Harvard with a vow to make his first million before he was 25.

Thus, Gates is a symbol for our time. As Gary Rivlin put it in his 1999 book, The Plot To Get Bill Gates (Times Books), "Every age, it is said, gets the icons it deserves. The wide-open Wild West of the nineteenth century gave us the robber barons. The greed of the over-consuming 1980s gave us the rapacious Michael Milken and Ivan Boesky. The money-drenched, harried 1990s, then, demanded a workaholic, unrepentant overachiever worth in the tens of billions of dollars."

Seen in this light, US v. Microsoft is actually two trials. One is the trial in Judge Jackson's courtroom, with its endless testimony, Gates's disingenuous videotaped deposition, and the disastrous Microsoft demo -- aimed at "proving" that Internet Explorer couldn't be separated from Windows -- that was at best sloppily put together, and at worst doctored. The other trial exists entirely in the realm of pop culture. This is hardly unprecedented; indeed, celebrity legal struggles are how we work through, or at least attain a greater understanding of, some of the more difficult issues that afflict our society. In the real O.J. Simpson trial, a man got away with murder. In the pop-culture O.J. Simpson trial, we learned some valuable lessons about race, celebrity, and the shortcomings of the legal system. The real Clinton-Lewinsky drama defies rational analysis. The pop-culture Clinton-Lewinsky drama shed light on issues ranging from workplace sexual harassment and a powerful boss's abuse of his position to the frightening ability of a self-righteous prosecutor-run-amok to destroy people's lives.

Defining the pop-culture elements of US v. Microsoft is somewhat more difficult. This is not, after all, a case about anything as serious as murder or as tawdry as blowjobs. If it's a morality play, then it's too early to say what the moral might be; Gates may, after all, emerge triumphant. Perhaps it's about our ambivalent attitude toward money and success. Gates attained both despite a notable lack of creativity and innovation when compared to many of his digital-age peers, such as Apple's Steve Jobs or Netscape's Marc Andreessen. Gates's legal woes may be a form of rough justice, a cosmic evening-out.

More than anything, though, the Bill Gates saga is about us -- a referendum on the workaholic '90s, when more Americans made more money than at any time in history (while leaving unprecedented numbers of working-class and poor people behind), when personal computers landed on every desk and in every home, when everyone (well, half of us, according to surveys) got into the stock market, and when technology was held up as the ultimate good.

The way Lloyd deMause sees it, what's happening to Bill Gates and to Microsoft is quite simple. We all got rich in the '90s. (Moi?) We feel guilty. And Gates must pay for our sins. "I see Gates as a sacrificial victim," says deMause, editor of the Journal of Psychohistory. "He's our richest man and started from very little. He's the American dream. We're now enjoying the longest period of prosperity in history. We feel guilty about that, and we have to find somebody to pay for that. We're all going along with this. There's no great outcry that we're pinning this guy's balls to the ground. And it's totally ludicrous."

But what about the polls showing that a majority of citizens like and respect Gates? "We like our sacrificial victims," deMause replies. He draws an analogy to the way the Aztecs used to treat their human sacrifices: "They'd wine him and dine him, they'd say he was the greatest person in the world, and they'd pull his heart out. It's the ones we love that we sacrifice. It's our first-born, it's our pride. And Bill Gates is our pride." But -- but -- didn't the antitrust trial show that Microsoft had abused its monopoly position, using its Windows dominance to crush Netscape and to damage Sun? DeMause will have none of that. He argues that the Sherman Antitrust Act itself, passed in 1890, sprang from the same moral fervor that would fuel the Progressive Era -- a "purity crusade," he says, rooted in a sort of cultural self-flagellation, when society banned alcohol and cracked down on brothels. The Sherman Act, he says, is "puritanical, anti-success -- it's an irrational law to start with." (It should be noted that deMause's pro-Microsoft stance may be motivated by a more practical concern. "I have a Mac, and nothing works together," he complains.)

Unusual though deMause's analysis may be, perhaps there's something to it. I may not have gotten rich in the '90s, but maybe, in other ways, I embody deMause's theory. Like millions of other people, I own some Microsoft stock, being perfectly content to let Gates earn me some money so long as I don't have to use his cruddy products. I also happen to believe that breaking up Microsoft would be bad for people who use computers and, thus, for the economy; the Windows/Office standard may be mediocre, but it is a standard, and that's what fueled much of our technological growth in recent years. Yet I'm thoroughly enjoying watching Gates get his comeuppance for decades of sleazy behavior, from putting the screws to the original developers of what became MS-DOS, to messing with DOS so that Lotus 1-2-3 wouldn't run, to pretending that Explorer was an integral part of Windows in order to destroy upstart Netscape. Gates deserves what's happening to him, and if it's costing me money, well, that's my price of admission.

Then, too, it's possible that deMause is studying the wrong primates. To judge from what Judge Jackson has said, and from some of the commentary, the behavior we're looking at here may be not human but, rather, ape. Government remains the alpha male. If Gates had merely acknowledged that by figuratively flashing his butt to the judge as a sign of simian deference, the antitrust case might have gone away. Instead, Gates flashed his butt in a more human sense -- that is, as if to say, "You can kiss my ass." And Jackson is making him suffer. Look at what the judge did. First he refused to hold any hearings on the proposed remedy of breaking Microsoft into two parts. Then he gave interviews in which he defended his decision to deny Microsoft its due-process rights by saying, essentially, that he was sick of the company's attitude, and they're all a bunch of liars anyway. "Untrustworthy" is the word that he used.

And guess what? Opinionmakers applauded, displaying their own butts to the alpha male. The New York Times' Tom Friedman, on June 9, dropped trou before the end of his lead paragraph, writing that the judge's decision was "an indictment of the attitude of the high-tech community in general toward government" and "an indictment of the particular attitude and arrogance of Microsoft." ("Bless Judge Jackson's heart for that," he added.) Even the Wall Street Journal's Holman Jenkins, in blasting Jackson on June 14 for basing his decision on his "aggravation with Microsoft for engaging in standard courtroom practice," wrote that Gates's real mistake was in failing to play along with the "show trial" aspects of the case and make a better public-relations effort in the courtroom. In other words, you should have shown those cheeks, Bill.

Let's see, now. Bill Gates as human sacrifice. Bill Gates as ape. Did I leave anything out? Well, how about Gates as Bill Clinton? Salon's Scott Rosenberg, whose coverage of the Microsoft case has been consistently excellent, noted the similarities between Gates's and Clinton's courtroom prevarications way back on December 15, 1998. "If you study the transcripts of the Microsoft chairman's testimony, you find a man resolutely unwilling to grant words a common meaning -- to the extent that he questions whether the 'we' in internal Microsoft e-mails actually refers to Microsoft," Rosenberg wrote. "In one hilarious passage, Gates digs in his heels and says he has no idea what a fellow executive meant in writing that 'we're going to be pissing on [Java] at every opportunity.' Bill Clinton and Bill Gates -- inveterate hairsplitters, separated at birth!"

In fact, Rosenberg didn't push the Clinton analogy far enough. Gates shares not just a proclivity for questioning the meaning of simple words (for Clinton, "is" and "sex"; for Gates, "we" and "piss"); he also shares a roughly equivalent position in pop-cultural terms. Both men have absolutely exasperated the majority of elite opinion -- i.e., the Washington-based political-media class that seems to care more about their behavior than anyone else. Both are pursued by crazed conspiracy theorists who believe the two Bills embody evil on this earth. And both are nevertheless broadly supported by the public, which understands their flaws but thinks the good outweighs the bad.

I asked Wendy Goldman Rohm, author of The Microsoft File: The Secret Case Against Bill Gates (Times Books, 1998), an anti-Gates screed, why there appears to be such a split between elite and popular opinion. She replied like an anti-Clinton zealot, arguing, in essence, that if the public knew what we know, it would demand that he be impeached and removed from office, damn it! "There may be some people who think they're untouched by what's going on, but it's untrue," she says. "They [Microsoft] have abused their monopoly power over and over again. Most people don't know anything about the facts of how Microsoft conducted business."

No doubt that's true. But it's probably equally true that if they did know, they wouldn't care. After all, look at the generally favorable view most people hold of Clinton. Likewise, the public has been told that Bill Gates is a greedy bastard who's tried to use his control of Windows to destroy would-be competitors. But they also know they can buy a computer for less than $1000 and it will be loaded with all the software they'll ever need, direct from Microsoft. And as far as they're concerned, it was free: Microsoft already got its money from the computer manufacturer. To top it off, their stock portfolio or mutual fund is probably a little richer because it contains some shares of Microsoft.

Fast Company columnist John Ellis, who uses words such as "horrible" and "unbelievable" to describe Microsoft's anti-competitive behavior, nevertheless has no problem understanding why Gates and his company retain broad popular support. "In the great value equation of the consumer, which is both money and time, Bill is on our side," Ellis says. "Microsoft is a company that a) works and b) makes you wealthier. So what's not to like?"

There's a joke about Microsoft that goes like this: "Q: How many Microsoft engineers does it take to change a light bulb? A: None. Bill Gates will just redefine Darkness(TM) as the new industry standard." Like many jokes, this one is based on more than a little truth.

Go back to early May. Millions of computer users around the world found messages in their e-mail boxes titled "I Love You." They opened them, ran the attached file -- a Visual Basic script, not the love letter it appeared to be -- and did enormous damage to the files on their hard drives.

As it turns out, not everyone who received the Love Bug was hurt. You had to be using two products from Microsoft in order to be infected: the e-mail program Outlook, running under Windows. And it wasn't just that the Filipino hacker accused of writing the Love Bug wanted to hurt Windows/Outlook users. It was that Microsoft made it easy for him. Outlook allows users to write "scripts" that can automate any number of routine functions. The Love Bug included a script that sent out messages to every person in a user's address book, with copies of the virus attached.
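For the curious, here is a deliberately harmless sketch of the scripting power at issue -- written in Python with the pywin32 package rather than the VBScript the worm actually used, and assuming a Windows machine with Outlook installed (the address shown is a placeholder). It touches the same Outlook object model the Love Bug abused, counting the address-book entries and sending one message to an address you supply; the worm simply looped over every entry and attached a copy of itself.

    # A benign sketch of Outlook automation (Python + pywin32) -- not the worm.
    import win32com.client

    outlook = win32com.client.Dispatch("Outlook.Application")

    # The same address book the virus harvested; here we only count its entries.
    namespace = outlook.GetNamespace("MAPI")
    entries = namespace.AddressLists.Item(1).AddressEntries
    print(f"Outlook will hand any script {entries.Count} addresses.")

    # And the same scripted sending capability -- aimed safely at yourself.
    message = outlook.CreateItem(0)       # 0 = olMailItem
    message.To = "me@example.com"         # placeholder; substitute your own address
    message.Subject = "Outlook automation test"
    message.Body = "Sent by a script through the Outlook object model."
    message.Send()

In May 2000, nothing in Outlook interrupted any of this with so much as a warning (today's versions may prompt before a script sends mail) -- which is the approach critics wanted Microsoft to rethink.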

On May 5, Boston Globe technology reporter Hiawatha Bray wrote a piece in which he quoted computer-security expert Richard Smith as saying that Microsoft should rethink its approach. But Scott Culp, a program manager at Microsoft, struck a defiant tone, telling Bray, "This is not due to a flaw in a Microsoft product" and "The technologies are there because the customers have asked us to put it there." Incredibly, Culp even went on to say that users need to be taught not to open e-mail attachments unless they're from people they trust -- ignoring the fact that the Love Bug, by using Outlook's address book, guaranteed that people would get the virus from people they trusted. Just call it Automated Virus Replication(TM), the new industry standard.

Which means that Wendy Goldman Rohm is right when she suggests that the public needs to pay more attention to Microsoft the company and maybe a little less attention to Bill Gates the icon. Pop-culture references aside, we have vested an enormous amount of power in the personal computer and the Internet; and Gates, in turn, has managed to grab far more than his share of control. Forget about Netscape, the proximate cause of the antitrust case; that's the past. The future is in the way Gates -- even in the midst of his legal travails -- continues to try to enhance his monopoly by any means necessary. The Microsoft mantra when encountering new technologies that it didn't develop is "embrace and extend." What does it mean? Marvelous Marvin Hagler said it best many years ago when he told boxing writers that the only two things on his mind were "destruct and destroy."

Look at Sun's Java programming language, intended as an Internet lingua franca that could make Windows obsolete. Microsoft cut a deal with Sun, then unveiled a Windows-specific version of Java that was incompatible with non-Windows machines. Or look at the latest brouhaha, over Kerberos, a network-authentication standard used to secure servers -- the high-capacity computers on which much of the Internet actually resides. Kerberos, developed at MIT and freely available in source form, was altered by Microsoft so that the version used in Windows 2000 is incompatible with the rest of the world. And according to Declan McCullagh, writing in Wired News on May 11, when users of the hard-core tech site Slashdot.org exposed Microsoft's perfidy, Microsoft threatened to sue them for copyright violations.

Keep in mind, this was Microsoft engaging in exactly the kind of behavior that has brought it to the brink of a break-up, and doing it while waiting for Judge Thomas Penfield Jackson's final ruling. Incredible.

"Anybody who has been watching the business for a couple of years has seen countless generations of superior products plowed under," says Atlantic Monthly staff writer and Industry Standard columnist James Fallows, who actually worked for six months as a Microsoft consultant last year, hoping to improve Word.

It's what Microsoft may do next that worries Harvard Law School professor Charles Nesson, director of the Berkman Center for Internet & Society. The incredible growth of the Internet has been driven by principles antithetical to Microsoft, such as open standards and open access for all. The Berkman Center is already fighting the cable companies, which seek to restrict high-speed Internet-by-cable to service providers of their own choosing, and to ban or limit content -- such as streaming video -- that may compete with their own cable-TV offerings. Nesson sees Microsoft in much the same light as the cable companies, and he's concerned that Gates, whose Windows monopoly is threatened by the Internet, will find ways to bring the Internet into his domain.

Noting that the late Oliver Wendell Holmes Jr. believed that "law should be judged by how it affects the 'bad man,' " Nesson says, "Gates has always struck me as Holmes's 'bad man.' I don't think he's an immoral figure. He's not out there bloodletting. But he is out there at the limit of what the law allows, doing whatever he can get away with. To me, the big issue is the conflict between the public domain and the proprietary domain. Gates has become the lord of the proprietary domain."

In the end, then, there are some important connections between the Gates of pop culture and the Gates of real life. The Gates of pop culture is the richest, best-known figure of the computer age. The Gates of real life got that way by stomping anyone who got in his way, ethical and legal implications be damned. The Gates of pop culture is a benevolent geek, fattening 401(k) accounts across the land by building a phenomenally successful company. The Gates of real life is able to shower wealth upon us because of our tacit acquiescence -- through our willingness to put up with inferior products in return for standardized computers and a few crumbs under the stock-market table.

Toward the end of Pirates of Silicon Valley, a third-rate made-for-TV movie about the war between Apple and Microsoft, Steve Jobs (Noah Wyle) and Bill Gates (Anthony Michael Hall) confront each other backstage, shortly after the 1984 unveiling of the first Macintosh. Jobs realizes that Microsoft has been using the prototypes he sent over not just to write Mac software, but to study the operating system with an eye toward designing its own, similar interface: Windows.

"We're better than you! We've got better stuff!" screams Jobs.

"You don't get it, Steve," Gates replies, just before slinking away. "It doesn't matter."

Dumb-ass dialogue aside, it really never has mattered to Gates. He became king of the world by selling inferior products copied from someone else. Maybe, when we look at him, we see not just the celebrity and the wealth, but the willingness to cut corners, to be seen as the best without really being the best, to do anything to win except fight fairly. In this most narcissistic of times, we look into his bland, dorky features and see ourselves looking back. We love him. We hate him. He is the best of us. And the worst of us, too.

Dan Kennedy can be reached at dkennedy@phx.com. This article originally appeared in the Boston Phoenix.
