IF you wanted to pick the moment when the American news business went on suicide watch, it was almost exactly three years ago. That’s when Stephen Colbert, appearing at the annual White House Correspondents’ Association dinner, delivered a monologue accusing his hosts of being stenographers who had, in essence, let the Bush White House get away with murder (or at least the war in Iraq). To prove the point, the partying journalists in the Washington Hilton ballroom could be seen (courtesy of C-Span) fawning over government potentates — in some cases the very “sources” who had fed all those fictional sightings of Saddam Hussein’s W.M.D.
Colbert’s routine did not kill. The Washington Post reported that it “fell flat.” The Times initially did not even mention it. But to the Beltway’s bafflement, Colbert’s riff went viral overnight, ultimately to have a marathon run as the most popular video on iTunes. The cultural disconnect between the journalism establishment and the public it aspires to serve could not have been more vividly dramatized.
The bad news about the news business has accelerated ever since. Newspaper circulations and revenues are in free fall. Legendary brands from The Los Angeles Times to The Philadelphia Inquirer are teetering. The New York Times Company threatened to close The Boston Globe if its employees didn’t make substantial sacrifices in salaries and benefits. Other papers have died. The reporting ranks on network and local news alike are shriveling. You know it’s bad when the Senate is moved, as it was last week, to weigh in with hearings on “The Future of Journalism.”
Not all is bleak on the Titanic, however. The White House correspondents’ bacchanal was on tap for this weekend. And this time no one could accuse the revelers of failing to get down with the Colbert-iTunes-Facebook young folk: hip big-time journalists now stroke their fans with 140-character messages on Twitter. Or did. No sooner did boldface Washington media personalities ostentatiously embrace Twitter than Nielsen reported that more than 60 percent of Twitter users abandon it after a single month.
The causes of journalism’s downfall — some self-inflicted, some beyond anyone’s control (a worldwide economic meltdown) — are well known. To time-travel back to the dawn of the technological strand of the disaster, search YouTube for “1981 primitive Internet report on KRON.” What you’ll find is a 28-year-old local television news piece from San Francisco about a “far-fetched,” pre-Web experiment by the city’s two papers, The Chronicle and The Examiner, to distribute their wares to readers with home computers via primitive phone modems. Though there were at most 3,000 people in the Bay Area with PCs then, some 500 mailed in coupons for the service to The Chronicle alone. But, as the anchorwoman assures us at the end, with a two-hour download time (at $5 an hour), “the new telepaper won’t be much competition for the 20-cent street edition.”
The rest is irreversible history. This far-fetched newspaper experiment soon faded, even in San Francisco, the gateway to Silicon Valley. Today The Examiner, once the flagship of William Randolph Hearst’s grand journalistic empire, exists in name only, as a flimsy giveaway. The Chronicle is under threat of closure.
But this self-destructive retreat from innovation is hardly novel in the history of American communications. In the last transformative tech revolution before the Internet — television’s emergence in the late 1940s — the pattern was remarkably similar. The entertainment industry referred to TV as “the monster,” and by 1951, the editor of the industry’s trade paper, Variety, was fearful that the monster would “eventually swallow up practically all of show business.” Movies had killed vaudeville a generation earlier. This new household appliance threatened to strangle radio, movies, the Broadway theater, nightclubs and the circus. And newspapers too: “NBC’s New ‘Today’ Attacked by Papers as Competition” screamed a front-page Variety headline in 1952.
The vulnerable establishments in all these fields went nuts. Most movie studios pushed back against the future by refusing to sell their old movies to television or allow their stars to appear on it. Few seized the opportunity to produce programs for the new medium. Instead, some moguls tried to compete by exhibiting sports events by closed-circuit in networks of movie houses. In 1952-53, Cinerama, 3-D and Cinemascope were all heavily promoted to try to retain movie audiences. None of these desperate rear-guard actions could slow the video revolution. Movie newsreels, movie palaces, radio comedy and drama, and afternoon newspapers, among other staples of the American cultural diet, were all doomed.
And yet in 2009, Hollywood movie studios, radio and the Broadway theater, though smaller and much changed, are not dead. They learned to adapt and to collaborate with the monster.
In the Internet era, many sectors of American media have been re-enacting their at first complacent and finally panicked behavior of 60 years ago. Few in the entertainment business saw the digital cancer spreading through their old business models until well after file-sharing, via Napster, had started decimating the music industry. It’s not only journalism that is now struggling to plot a path to survival. But, with all due respect to show business, it’s only journalism that’s essential to a functioning democracy. And it’s not just because — as we keep being tediously reminded — Thomas Jefferson said so.
Yes, journalists have made tons of mistakes and always will. But without their enterprise, to take a few representative recent examples, we would not have known about the wretched conditions for our veterans at Walter Reed, the government’s warrantless wiretapping, the scams at Enron or steroids in baseball.
Such news gathering is not to be confused with opinion writing or bloviating — including that practiced here. Opinions can be stimulating and, for the audiences at Fox News and MSNBC, cathartic. We can spend hours surfing the posts of bloggers we like or despise, some of them gems, even as we might be moved to write our own blogs about local restaurants or the government documents we obsessively study online.
But opinions, however insightful or provocative and whether expressed online or in print or in prime time, are cheap. Reporting the news can be expensive. Some of it — monitoring the local school board, say — can be and is being done by voluntary “citizen journalists” with time on their hands, integrity and a Web site. But we can’t have serious opinions about America’s role in combating the Taliban in Pakistan unless brave and knowledgeable correspondents (with security to protect them) tell us in real time what is actually going on there. We can’t know what is happening behind closed doors at corrupt, hard-to-penetrate institutions in Washington or on Wall Street unless teams of reporters armed with the appropriate technical expertise and assiduously developed contacts are digging night and day. Those reporters have to eat and pay rent, whether they work for print, a TV network, a Web operation or some new bottom-up news organism we can’t yet imagine.
It’s immaterial whether we find the fruits of their labors on paper, a laptop screen, a BlackBerry, a Kindle or a podcast. But someone — and certainly not the government, with all its conflicted interests — must pay for this content and make every effort to police its fairness and accuracy. If we lose the last major news-gathering operations still standing, there will be no news on Google News unless Google shells out to replace them. It won’t.
One of the freshest commentators on Internet culture, Clay Shirky, has written, correctly, that nobody really knows what form journalism will take in the evolving post-newspaper era. Looking back to the unpredictable social and cultural upheavals brought about by Gutenberg’s invention of movable type, he writes, “We’re collectively living through 1500, when it’s easier to see what’s broken than what will replace it.” So who will do the heavy journalistic lifting? “Whatever works.” Every experiment must be tried, professional and amateur, whether by institutions like The Times or “some 19-year-old kid few of us have heard of.”
What can’t be reinvented is the wheel of commerce. Just because information wants to be free on the Internet doesn’t mean it can always be free. Web advertising will never be profitable enough to support ambitious news gathering. If a public that thinks nothing of spending money on texting or pornography doesn’t foot the bill for such reportage, it won’t happen.
That’s why the debate among journalists about possible forms of payment (subscriptions, NPR-style donations, iTunes-style micropayments, foundation grants) is inside baseball. So is the acrimonious sniping between old media and new. The real question is for the public, not journalists: Does it want to pony up for news, whatever the media that prevail?
It’s all a matter of priorities. Not long ago, we laughed at the idea of pay TV. Free television was considered an inalienable American right (as long as it was paid for by advertisers). Then cable and satellite became the national standard.
By all means let’s mock the old mainstream media as they preen and party on in a Washington ballroom. Let’s deplore the tabloid journalism that, like the cockroach, will always be with us. But if a comprehensive array of real news is to be part of the picture as well, the time will soon arrive for us to put up or shut up. Whatever shape journalism ultimately takes in America, make no mistake that in the end we will get what we pay for.