Daniel Lazare

Gun Rights and Our National Identity Crisis

Like the Third Amendment against the peacetime quartering of soldiers in private homes, the Second Amendment used to be one of those obscure constitutional provisions that Americans could safely ignore. Legal opinion was agreed: this relic of the late eighteenth century did not confer an individual right "to keep and bear arms," only a collective right on the part of the states to maintain well-regulated militias in the form of local units of the National Guard. While a few gun nuts insisted on their Second Amendment right to turn their homes into mini-arsenals, everyone else knew they were deluded. Everyone knew this because the Supreme Court had supposedly settled the matter by unanimously dismissing any suggestion of an individual right in 1939.

But now everyone knows something else. Ever since a University of Texas law professor named Sanford Levinson published a seminal article, "The Embarrassing Second Amendment," in the Yale Law Journal in 1989, the legal academy has had to take another look at a provision that Laurence Tribe, the doyen of liberal constitutionalists, described eleven years earlier as having no effect on gun control and as "merely ancillary to other constitutional guarantees of state sovereignty." Now such comfy notions are out the window as the National Rifle Association's view that the Second Amendment confers an individual right to own guns gains ground. While some scholars, such as Mark Tushnet, author of the new study Out of Range, argue that an individual-rights reading still allows for extensive gun control, others are frank enough to admit they're not sure what this oddly constructed amendment does and does not allow (although they'll still hazard a guess). As a prominent constitutional scholar named William Van Alstyne once remarked, "No provision in the Constitution causes one to stumble quite so much on a first reading, or second, or third reading." It is as if the legal academy, shaking its head over the First Amendment, suddenly could not make up its mind as to whether that hallowed text protected free speech or prohibited it.

What's going on here? Surely a mere twenty-seven words, loosely tethered together by three commas and one period, can't be that impenetrable. But they are; and if ever there was a Churchillian "riddle wrapped in a mystery inside an enigma," the Second Amendment is it.

Perhaps the best way to begin unraveling this puzzle is to think of the amendment not as a law but, with apologies to Tom Peyer and Hart Seely, as a bit of blank verse:

A well regulated Militia,

being necessary to the security of a free State,

the right of the people to keep and bear Arms,

shall not be infringed.

It's rhythmic and also somewhat strange, as proper modern verse should be. As to what it actually means, the questions begin with "well regulated" in line one. The phrase is confusing because when Americans hear the word "regulation" or any of its cognates, they usually think of government restrictions on individual liberty. But if a government-regulated militia is necessary for a free society (the meaning, presumably, of "a free State"), then how can the amendment mandate an individual right that the same government must not infringe? It is as if the amendment were telling government to intervene and not intervene at the same time.

This is certainly a head-scratcher. Yet the questions go on. Another concerns line two, which, while asserting that the militia is "necessary to the security of a free State," does not pause to explain why. Perhaps the connection was self-evident in the eighteenth century, but it is certainly not in the twenty-first. Today, we can think of a lot of things that are important to the survival of a free society: democratic expression, honest and fair elections, a good educational system, and a sound and equitable economy. So why does the amendment "privilege" a well-regulated militia above all others?

Finally, there are the questions posed by lines three and four. Why "keep and bear" rather than just "bear"? What does "Arms" mean -- muskets, pistols, assault rifles, grenade launchers, nukes? And concerning the phrase "the right of the people to keep and bear Arms," the individualist interpretation holds that eighteenth-century Anglo-Americans viewed this as part of a natural right of armed resistance against tyranny. But if this was the case, why put it in writing? After all, Americans were well armed in the aftermath of the Revolutionary War and hair-trigger sensitive to any new tyrannical threat. Why, then, approve an amendment acknowledging the obvious? After asking Americans to ratify a new plan of government, why did the founders then request that they assert their right to overthrow it?

Questions like these are the subject of Tushnet's Out of Range. With the Supreme Court poised to issue its first gun-rights decision in nearly seven decades in District of Columbia v. Heller, a case the Court heard on March 18 and that involves one of the most sweeping citywide gun bans in the country, Tushnet's brief but dense primer on the Second Amendment and its relationship to the gun-control battles of the last quarter-century could not be more timely. Unfortunately, it also could not be more frustrating. Tushnet, a professor at Harvard Law School, suffers from an excess of caution. Understandably, he is determined not to be one of those overconfident types who, as he puts it, are just "blowing smoke" in claiming to know precisely what the Second Amendment means. As a consequence, he advances a couple of possibilities as to what it might mean, explains why one interpretation may have an edge over the other and then announces that the whole question may be beside the point, since it has little to do with reducing gun-related crime. In fact, he argues that the long-running debate over the Second Amendment may be
really about something else -- not about what the Second Amendment means, or about how to reduce violence, but ... about how we understand ourselves as Americans. Get that straight, and the fights over the Second Amendment would go away. But, of course, we can't get our national self-understanding straight, because we are always trying to figure out who we are, and revising our self-understandings. And so the battle over the Second Amendment will continue.

Like a patient on a psychiatrist's couch, Americans thus talk about the Second Amendment to avoid talking about peskier matters, in this case highly sensitive topics having to do with national identity and purpose. So we keep talking because we don't know how to stop.

But this is unfair, since Americans have no choice but to talk about a law they can neither change (thanks to the highly restrictive amending clause in Article V of the Constitution) nor even fully understand (thanks to the pervasive ambiguity of its twenty-seven words). Still, Out of Range attempts to explain the inexplicable by approaching the Second Amendment from two angles: its original meaning at the time of its adoption as part of the Bill of Rights in 1791 and the meaning it has acquired through judicial interpretation and political practice in the centuries since.

In purely historical terms, Tushnet says, the answer on balance seems more or less clear. Lines one and two, which compose something of a preamble, are plainly the product of an eighteenth-century ideology known as civic republicanism, a school of thought almost paranoid in its tendency to see tyranny forever lurking around the corner. Tushnet's discussion of this school is somewhat cursory (as he admits), but a host of historians, from Bernard Bailyn to Isaac Kramnick, have described it as consisting of a series of polarities between political power, on the one hand, and the people, on the other. If the people are soft, lazy and corrupt, they will easily succumb to a tyrant's rule. Conversely, if they are proud, brave and alert, then would-be oppressors, sensing that the people are keen to defend their liberties, will back off. While a popular militia is important in this respect, no less important are the values, habits and attitudes that accompany it -- vigor, courage, a martial spirit ("keep and bear" turns out to be a military term), plus a steely determination born of the knowledge that "those who expect to reap the blessings of freedom," to quote Tom Paine, "must, like men, undergo the fatigue of supporting it."

All of which suggests a broad reading of the Second Amendment, one that holds that a well-regulated militia is not the only social benefit that arises from widespread gun ownership but merely one of many. As Tushnet puts it, "Once each of us has the right to keep and bear arms, we can use the right however we want -- but always preserving the possibility that we will use it to defend against government oppression." Guns are good in their own right because guns, military training and liberty are all inextricably linked.

But what about "well regulated" -- surely that phrase suggests a government-controlled militia along the lines of today's National Guard? Not quite. In eighteenth-century parlance, regulation could take the form of a militia either spontaneously created by individuals or decreed by the state. Tushnet points out that the financially distressed farmers who participated in Daniel Shays's agrarian uprising in western Massachusetts in 1786 called themselves "regulators." Even though Tushnet doesn't mention it, the North Carolina frontiersmen who rose up against unfair colonial tax policies some twenty years earlier did so as well. Hence, there was nothing strange or inconsistent about regulation mustered from below by individuals rather than enforced from above by the state. Indeed, the Virginia Ratifying Convention in 1788 implied as much when it declared "that the people have the right to keep and bear arms [and] that a well regulated militia composed of the body of the people trained to arms is the proper, natural, and safe defense of a free state." Such a militia would be well regulated to the degree it was composed of the people as a whole.

This is no doubt the sort of thinking one would expect of a postrevolutionary society in which the people had just used their weapons to overthrow one government and were leery of laying them aside as another was taking shape. But the notion that "we the people" would reserve the right to take up arms against a government that "we" had just created seems contradictory. After all, if it's a people's government, who would the people revolt against -- themselves? Still, Americans clearly believed in a natural right of revolution in the event that power was misused or usurped, and they further believed that their Constitution should acknowledge as much. In a document festooned with checks and balances, this was to be the ultimate check, one directed against government tout court.

While one can quarrel about the details, it would thus appear that the NRA has been correct all along concerning the Second Amendment's original intent to guarantee an individual right to bear arms. But Tushnet adds that there is also the question of how the amendment has come to be understood in the years since. Once the Constitution was ratified and the new Republic began taking its first wobbly steps, three things happened: militias fell by the wayside as Americans discovered they had better ways to spend their time than drilling on the village green; politicians and the police took fright when guns began showing up in the hands of people they didn't like, such as newly freed blacks or left-wing radicals; and public safety became more and more of a concern as urbanization rose.

Thus, Supreme Court Justice Joseph Story complained in 1833 about "a growing indifference to any system of militia discipline, and a strong disposition, from a sense of its burthens, to be rid of all regulations." In 1879 the Illinois state legislature outlawed private militias after 400 or so German socialists paraded through Chicago with swords and guns, while the Supreme Court's 1939 decision United States v. Miller upheld a ban on sawed-off shotguns on the grounds that such weapons had nothing to do with maintaining a well-regulated militia. However scattershot, various gun-control measures have proliferated since the 1930s, prohibiting certain types of firearms (Tommy guns), forbidding certain people from owning them (felons and fugitives), establishing "gun-free school zones" and so on, all based on a collective-right reading holding that government has free rein to do what it wishes to maintain public safety. By the time Tribe published his famous textbook, American Constitutional Law, in 1978, any concept of an individual right to bear arms had effectively disappeared. The Second Amendment, American Constitutional Law announced, was irrelevant when it came to "purely private conduct" in the form of gun ownership. Gun control could therefore go forward unimpeded.

In retrospect, Tribe's textbook was plainly the high-water mark for the "collective right" interpretation. Eleven years later, Levinson published his Yale Law Journal article, complaining that "for too long, most members of the legal academy have treated the Second Amendment as the equivalent of an embarrassing relative, whose mention brings a quick change of subject to other, more respectable, family members." In fact, intellectual honesty dictated that they recognize there was more to the amendment than they had been willing to admit.

While Tribe has since come around to the individual-rights point of view (the 2000 edition of American Constitutional Law contains a ten-page reconsideration of the subject), he is now among those arguing that, notwithstanding such a right, firearms are still "subject to reasonable regulation in the interest of public safety" and that "laws that ban certain types of weapons, that require safety devices on others and that otherwise impose strict controls on guns can pass Constitutional scrutiny." Tushnet agrees, noting that the Pennsylvania State Constitutional Convention declared in 1788 that "no law shall be passed for disarming the people…unless for crimes committed, or real danger of public injury from individuals," a clear indication that public safety was a concern even in the Republic's earliest days. He observes that, in what has been called "America's first gun control movement," state legislatures followed up in the 1810s with laws against concealed weapons, bowie knives and the like for the same reason. Today, there is no shortage of gun-control laws even in states that recognize a constitutional right to bear arms, yet the courts have not seen a conflict. "Indeed," says Tushnet, "it's hard to identify a gun-control policy that has not been upheld against challenges based on state constitutional guarantees of an individual right to keep and bear arms."

In other words, we can all relax. Given its current conservative lineup, the Supreme Court will almost certainly uphold an individual right to bear arms in District of Columbia v. Heller. But while gun prohibition or equally sweeping licensing laws will probably not be permissible, lesser forms of gun control are still acceptable. Thus, things will continue pretty much unchanged. Lawyers will go back to arguing whether banning assault weapons passes constitutional muster, while the NRA will go back to complaining that we are all on a slippery slope to tyranny. Moms will march for gun control, hunters will campaign against it and "cold dead hands" bumper stickers will continue to proliferate on pickups. Plus ça change, plus c'est la même Scheiss.

Or so Tushnet suggests, although some of us may not be so sanguine. The problem may be an excessively narrow reading of the Second Amendment and the broader Constitution of which it is a part. Law professors, not surprisingly, tend to think of the Constitution as the law. But it is also a plan of government and a blueprint shaping American thought on such topics as democracy, civil liberties and popular sovereignty. Hence, while an individualist reading of the Second Amendment will certainly affect gun control in some fashion, that is not all it will affect. It will also send a powerfully coded message about the proper relationship between the people and their government and the nature of political authority. It is this aspect of the Second Amendment as opposed to its strictly legal dimension that seems most important.

Indeed, the closer one looks at the Second Amendment, the more significant its political ramifications seem. Its structure, for example, is oddly parallel to that of the larger Constitution, with a preamble (lines one and two) advancing a rationale of sorts and then a body, or gist (lines three and four), stating what is to be done. Other than the famous one beginning with "We the people," this is arguably the only such preamble in the entire document and certainly the only one in the Bill of Rights. The logical parallels are also curious. The larger Constitution opens by declaring that the people have unlimited power to alter their political circumstances so as to "promote the general welfare and secure the blessings of liberty to ourselves and our posterity." It seems that "we the people" can do whatever we want to improve our situation, including tossing out one constitution (the Articles of Confederation) and ordaining a new one. But the body of the Constitution goes on to say something completely different by declaring in Article V that a supposedly sovereign people is decidedly unsovereign when it comes to modifying the plan of government made in its name. (With just thirteen states representing as little as 5 percent of the US population able to veto any amendment, the US Constitution is among the hardest to change on earth.) By the same token, the mini-constitution that is the Second Amendment opens by declaring a people in arms to be the ultimate guarantor of freedom, but then it goes on to say that the people's government lacks the freedom to alter individual gun rights. Since the 1930s liberals have succeeded in circumventing the first restriction via the miracle of judicial interpretation, a modern form of transubstantiation that allows them to alter the essence of the Constitution without changing so much as a comma. But a return to an individual-rights reading of the Second Amendment would mean a rollback of free-form judicial review. 
By returning the amendment to its original meaning, such a reading couldn't help but strengthen the old civic-republican view of an expansive state as a threat to liberty.

This is profoundly reactionary and profoundly confusing. Are the people sovereign or not? Are they the protectors of liberty or a threat? The answer, according to the Big-C and little-c constitution, is both. Although legal academics like to think of the Constitution as a model of reason and balance, the Second Amendment puts us in touch with the document's inner schizophrenic -- and, consequently, our own. Thanks to it, we the people know that the people are dangerous. Therefore, we must take up arms against our own authority. We are perennially at war with ourselves and are never more alarmed than when confronted by our own power. The people are tyrannized by the fear of popular tyranny.

Richard Feldman's Ricochet, an insider account of how conservatives have used the Second Amendment to clobber liberals on gun control and more, is evidence of what this schizophrenia means on the most down-to-earth political level. Ricochet is long -- too long -- on details about the inner workings of the NRA, the purges, the plots and the Machiavellian maneuvers of executive vice president Wayne LaPierre. Still, it has its moments. The most relevant concerns a campaign Feldman helped engineer in New Jersey in July 1990 to punish then-Governor Jim Florio, a Democrat, for pushing through a ban on assault weapons a few months earlier. When Florio announced a major tax increase to plug a deficit in the state budget, a Pat Buchanan-style "pitchfork rebellion" -- led by a letter carrier named John Budzash and a title searcher named Pat Ralston -- erupted across the middle of the state. Feldman, eager for revenge and experienced as a field operative when it came to populist campaigns of this type, sprang into action. Working behind the scenes, he established contact with Hands Across New Jersey, as the tax protesters called themselves, funneling them money, advising on strategy and grooming their press releases. "But unlike what I'd helped produce for the NRA, we had to give the Hands documents a rough edge," he recalls. "I always made sure to misspell at least one word, 'frivilous' or 'wastefull.'" The group's biggest PR coup was distributing thousands of rolls of "Flush Florio" toilet paper to protest the governor's proposal to slap a 7 percent sales tax on such items, a tactic that frightened the state's Democratic establishment to the core. Senator Bill Bradley did his best to duck the controversy but barely squeaked through to re-election, while Florio lost to Republican Christine Todd Whitman three years later. 
It was an example of the sort of right-wing populism that would continue to build throughout the 1990s, crippling the Clinton Administration and paving the way for the Bush/Cheney coup d'état in December 2000.

Hands Across New Jersey could not have done it without the NRA, and the NRA could not have done it without the Second Amendment. On the surface, tax hikes and gun control would seem to have as little to do with each other as horticulture and professional wrestling. But eighteenth-century civic republicanism, the ideology bound up with an individualist reading of the Second Amendment, provided the necessary link by portraying both as the products of overweening government. In the face of such "tyranny," the message to protesters was plain: take down those muskets, so to speak, and sally forth to meet the redcoats. Don't think, don't analyze, don't engage in any of the other sober measures needed to sort out the fiscal mess. Just turn the clock back to the eighteenth century, pack your musket with "Flush Florio" wadding and fire away! Needless to say, atavistic protests like these were sadly irrelevant in terms of the financial pressures that, in a politically fragmented, traffic-bound state like New Jersey, were growing ever more acute. Yes, the protesters succeeded in throwing out some bums (and ushering in even worse ones). But with the current governor, Democrat Jon Corzine, now struggling to resolve a 10 percent budget gap, the crisis has only deepened.

After the disaster of the Bush years, it would seem that the right-wing populism embodied by Hands Across New Jersey has burned itself out. But given the collapse of the liberal-collectivist reading of the Second Amendment and the Supreme Court's likely embrace of an individual right, it could conceivably gain a new lease on life -- just as the country is grappling with a major recession, the worst housing crisis since the 1930s and a wave of municipal bankruptcies, all problems that call for a collective government response. But an incoherent Constitution dating from the days of the French monarchy, the Venetian republic and the Holy Roman Empire is now sending an increasingly strong message that firm and concerted action of this sort is the very definition of tyranny and must be resisted to the hilt. Once again, Americans must take aim -- at themselves! Tushnet's question concerning "how we understand ourselves as Americans" thus becomes somewhat easier to answer: Americans are people at the mercy of eighteenth-century attitudes they don't know how to escape.

Will the Clash of Faiths Go On Forever?


Divided by Faith: Religious Conflict and the Practice of Toleration in Early Modern Europe by Benjamin J. Kaplan

God's Crucible: Islam and the Making of Europe, 570-1215 by David Levering Lewis

Other than outright jihadis like Osama bin Laden and hard-core Zionist settlers in the West Bank, most people would agree that religious zealotry is out of control and ought to be reined in. The question is how to do it. On one side of the debate are the hards, those militant atheists who argue that the problem is not so much religious discord as religion itself, an idea that has given rise to repeated horrors not because it is misapplied or misunderstood but because it is false and therefore a poor guide to reality. Bad theories lead to bad outcomes, which is why the best way to deal with theism is to do to it what Copernicus did to Ptolemy, or Darwin to Lamarck -- finish it off as quickly as possible so the world can move on.

On the other side are the softs, those nice ecumenicists who contend that since it's unlikely that the world's believers will endorse the writings of Richard Dawkins, Sam Harris and Daniel Dennett anytime soon, we had all better learn to live together in our present state. Religion is therefore tolerable as long as it's not used as a justification to harass thy neighbor or condemn him to hell over minute theological differences. Call it the Kumbaya coalition, if you will.

Although it is uncertain how David Levering Lewis fits into this debate based on his seriously misconceived new book, God's Crucible, there is no doubt as to Benjamin Kaplan. Judging from Divided by Faith, his account of the elaborate measures that small groups of Catholics and Protestants took to keep the peace during the wars of religion of the sixteenth and seventeenth centuries, he is what might be called a hard-core softie, a fanatical believer in religious compromise as the key to preventing conflict. Whereas other historians of the era chronicle all the horrible things that the religious combatants did to one another -- Germany suffered more mass devastation in the Thirty Years War of 1618-48 than it did during World War II -- he describes the ingenious mechanisms Europeans employed to avoid killing one another in the name of a peace-loving Christ. Since such measures were mainly modest and small-scale, the result is history with the big stuff like wars, treaties and affairs of state left out and the minor adjustments and adaptations left in. Exciting it's not. But since life is often unexciting (especially when it's peaceful), Kaplan's version of how people got along in between the era's great battles and confrontations is not unimportant.

We learn from Kaplan that because lords and knights in sixteenth-century Austria enjoyed the right to hold Protestant services in their castles, houses and estates, Protestants in nominally Catholic Vienna would parade through the streets every Sunday morning on their way to some nearby Protestant nobleman's estate, where they would worship freely before heading back home. No one killed them as a consequence of their Auslauf ("walking out"), no one arrested them and no one drove them into exile, no small thing in the fraught climate of the 1570s, when Dutch Protestants and Catholic Habsburgs were battling in the Netherlands and English Puritans were clamoring for the head of Mary, Queen of Scots. In Strasbourg and the southern German towns of Ulm and Biberach, all dominated by Protestants, it was the Catholics who marched every Sunday so they could pray outside the city walls. In Danzig (now Gdansk, Poland), Socinians (otherwise known as Unitarians) marched to the nearby villages of Busków and Straszyn, while in Hamburg Mennonites marched to Altona, now a nearby suburb. All did so unmolested, even though elsewhere in Europe such displays would have been explosive.

Under a policy known as "simultaneum," Catholics and Protestants in biconfessional (dual religious) cities even learned to share the same church. If this sounds unremarkable, consider what would happen today if some rabidly Zionist rabbi and a firebrand imam were required to share the same synagogue or mosque. In liberal Holland -- about which an English diplomat once remarked, "Religion may possibly do more good in other places, but it does less harm here" -- the problem was how to square the freedom of conscience guaranteed by the Union of Utrecht, the 1579 treaty that gave rise to the Dutch republic, with the religious monopoly of the Dutch Reformed Church. The treaty allowed the Dutch to believe in any religion they liked but to practice only one. What to do? With their usual pragmatism, the Dutch settled on a policy of official conformity and unofficial laxity, a policy exemplified by the tiny schuilkerken (literally, "hidden churches") that members of Holland's substantial Catholic minority were permitted to build in attics, backrooms and courtyards. Cozy and gemütlich, these were the antithesis of the grandiose Baroque structures springing up in Catholic territories. Lacking such outward displays as crosses, bells or towers, they were nonetheless richly outfitted with altars, galleries, organs and vaulted roofs. Since keeping a low profile was essential, one such schuilkerk entered into an elaborate agreement with the Amsterdam town fathers not to park sleds out front, not to allow crowds to congregate or parade through the streets and not to schedule services so that parishioners would interfere with crowds of Protestant worshipers heading off to their own churches.

By seventeenth-century standards, such restrictions were so mild as to be positively disorienting. In 1660 a Dutch Mennonite named Thieleman van Braght waxed nostalgic for the good old days when his group was the most harshly treated sect in the country. Mennonites had stood bravely by their faith during the years of persecution, but with liberalization, he complained, had come a "pernicious worldly-mindedness," a decline in morals and a falling away of religious ardor. Just as the worst way to torture a masochist is to treat him nicely, the worst way to treat a would-be religious martyr is to bombard him with tolerance.

Clearly, then, minor adjustments to religious practice did for a time succeed in preventing religious strife in the early modern era, which is one reason Kaplan celebrates them. But there is another reason: they are all pre-Enlightenment measures instituted at a time when secularism was still in its infancy. After a detailed discussion of schuilkerken, Sunday parades and the like, Divided by Faith concludes by arguing that the age of secularization that the 1648 Treaty of Westphalia supposedly ushered in with the conclusion of the Thirty Years War may not have been as deeply rooted as is usually thought. Religious persecution was ostensibly on the wane, yet Louis XIV revoked the Edict of Nantes in 1685, depriving Protestant Huguenots of their civil rights and sending some 300,000 of them into exile. Shortly after, the Duke of Savoy succumbed to intense French pressure by summarily ordering descendants of medieval heretics known as the Waldensians to convert to Catholicism. When they revolted instead, he imprisoned some 9,000 of them for months, stood by as two-thirds of them died from their confinement and then sent the rest on a forced march through the snow-covered Alps to Switzerland. From 1702 to 1705, Protestants and Catholics traded tit-for-tat atrocities in the South of France in a particularly brutal conflict known as the War of the Cévennes. In 1731 Catholic authorities expelled some 19,000 Lutherans from the archbishopric of Salzburg, Austria. As late as 1780, rioting and mass destruction erupted in London in response to a modest bill in Parliament aimed at removing a few of the anti-Catholic legal disabilities left over from the previous century. Enlightened sectors of society had assumed that religious hatred was a thing of the past. But the Gordon Riots -- named for the flamboyant Lord George Gordon, leader of the Protestant Association -- showed that antipapism was still a force to be reckoned with.

All these episodes of religion-fueled strife lead Kaplan to a bold and simple conclusion: the Enlightenment has been oversold. The story of a new spirit of secularism chasing away the medieval fog is, he writes,


What Makes an Atheist Get out of Bed in the Morning?

Imagine it's Paris in the spring of 1789 and you have just announced that you are an inveterate foe of tyrants and kings. Obviously, your message is not going to fall on deaf ears. But now that you've made it clear what you're against, what are you for? Do you favor an aristocratic constitution in which power devolves to the provincial nobility? Would you prefer a British-style constitutional monarchy? Or do you believe in all power to the sans-culottes? How you answer will shape both your analysis of the situation and the political tactics you employ in changing it. It may also determine whether you wind up on the chopping block in the next half-decade or so.

This is the problem, more or less, confronting today's reinvigorated atheist movement. For a long time, religion had been doing quite nicely as a kind of minor entertainment. Christmas and Easter were quite unthinkable without it, not to mention Hanukkah and Passover. But then certain enthusiasts took things too far by crashing airliners into office towers in the name of Allah, launching a global crusade to rid the world of evil and declaring the jury still out on Darwinian evolution. As a consequence, religion now looks nearly as bad as royalism did in the late eighteenth century. But while united in their resolve to throw the bum out -- God, that is -- the antireligious forces appear to have given little thought to what to replace Him with should He go. They may not face the guillotine as a consequence. But they could end up making even bigger fools of themselves than the theologians they criticize.

Richard Dawkins is a case in point. It is no surprise that, along with Sam Harris, author of The End of Faith and Letter to a Christian Nation, and Daniel Dennett, author of Breaking the Spell: Religion As a Natural Phenomenon, he has emerged at the head of a growing intellectual movement aimed at relegating religion to the proverbial scrapheap of history (which by this point must be filled to overflowing). He's bright, obviously, a lively writer -- his 1976 book The Selfish Gene is regarded as a pop science classic -- and as an evolutionary biologist, he's particularly well equipped to defend Darwin against neofundamentalist hordes for whom Darwin is the Antichrist. But Dawkins is something else as well: fiercely combative. Other scientists have tried to calm things down by making nice-nice noises concerning the supposedly complementary nature of the two pursuits. Einstein famously said that "science without religion is lame, religion without science is blind," while the late paleontologist Stephen Jay Gould once characterized the two fields as "non-overlapping magisteria" that address different questions and have no reason to get in each other's way. But Dawkins, to his great credit, is having none of it. Although he does not quite come out and say so, he seems to have the good sense to realize that no two fields are ever truly separate but that, in a unified body of human knowledge, or episteme, all overlap. Conflict is inevitable when different fields employ different principles and say different things, which is why an evolutionary biologist can't simply ignore it when some blow-dried TV evangelist declares that God created the world in six days, and why he'll become positively unhinged should the same televangelist begin pressuring textbook publishers to adopt his views.

Consequently, he's got to go on the warpath -- not only against the fundamentalists but against the sloppy logic and wishful thinking on which they batten. This is Dawkins's forte, and it is what makes The God Delusion such an entertaining read. Not one for politeness, he is the sort of fierce logic-chopper who chuckles nastily when coming across what he regards as some particularly choice bit of inanity. When he discusses Arius of Alexandria, for example, infamous in certain fourth-century theological circles for maintaining that God and Jesus were not "consubstantial," i.e., not composed of the same substance or essence, you can almost hear him snicker: "What on earth could that possibly mean, you are probably asking? Substance? What 'substance'? What exactly do you mean by 'essence'? 'Very little' seems the only reasonable reply." Quoting a third-century theologian known as St. Gregory the Miracle Worker on the mystery of the Holy Trinity -- "There is therefore nothing created, nothing subject to another in the Trinity: nor is there anything that has been added as though it once had not existed, but had entered afterwards: therefore the Father has never been without the Son" -- he can't help sneering that "whatever miracles may have earned St. Gregory his nickname, they were not miracles of honest lucidity." Noting that the Catholic Church divides angels into nine categories, or orders -- seraphim, cherubim, thrones, dominions, virtues, powers, principalities, archangels and ordinary members of the angelic rank-and-file -- he lets slip that "what impresses me about Catholic mythology is partly its tasteless kitsch but mostly the airy nonchalance with which these people make up the details as they go along."

This is not entirely fair. The Catholic Church does not just make such things up but has thought long and hard about angelic orders and other matters of equal importance. But Dawkins's outrage at the persistence of medieval ideas in the modern era is warranted. In fact, it's overdue. Also warranted is the sheer pleasure he takes in recounting a double-blind experiment funded by a whopping-rich outfit known as the Templeton Foundation to test the efficacy of prayer. Headed by a Boston cardiologist, Dawkins informs us, the study involved 1,802 patients in six hospitals who had just undergone coronary bypass surgery. Researchers divided the subjects into three groups: those who were not informed that church congregations as far away as Missouri were praying for their speedy recovery, those who were informed and a control group consisting of patients for whom no prayers were said and who were unaware that an experiment was under way. Church members were provided with each patient's first name and last initial and, in the interest of standardization, were asked to pray "for a successful surgery with a quick, healthy recovery and no complications" in just those words.

The results, announced in April 2006, were a hoot. The first group of patients, those who had no idea that others were praying for them, did no better than the control group, while the second, those who knew they were the object of others' prayers, actually did worse. "Performance anxiety," the experimenters theorized. "It may have made them uncertain, wondering am I so sick they had to call in their prayer team?" one speculated. Instead of accepting the results gracefully and conceding that the theory of intercessory prayer had been disproved, an Oxford theologian named Richard Swinburne complained that the whole exercise was meaningless because what matters to God is not prayer so much as the reasons behind it. But if the experiment had gone the other way and the patients being prayed over had outperformed the control group, we can well imagine what the reaction would have been. People like Swinburne would have shouted from the rooftops that God's existence had been proved and that we had all better beg his forgiveness double-quick.

But it didn't, and it is now clear that praying for a quick recovery is on par with crossing one's fingers and wishing for a Mercedes. Science is predicated on the assumption that belief is unwarranted without evidence and reason to back it up. But religion is based on the opposite: that belief in the absence of evidence is a virtue and that "the more your beliefs defy the evidence, the more virtuous you are," as Dawkins puts it. "Virtuoso believers who can manage to believe something really weird, unsupported and insupportable, in the teeth of evidence and reason, are especially highly rewarded." That last line is classic Dawkins -- provocative, pugnacious, even a bit over the top, but true.

As Dawkins admits, there is something distinctly nineteenth century about the new rationalism that he and others are promoting. It smacks of prairie populism and freethinkers like the wonderful Robert Ingersoll, who, in the post-Civil War period, used to crisscross the country, drawing thousands eager to hear him denounce the churches, poke fun at the Bible and sing the praises of Darwin: "Can we affect the nature and qualities of substance by prayer? Can we hasten or delay the tides by worship? Can we change winds by sacrifice? Will kneelings give us wealth?... Has man obtained any help from heaven?" These were questions that made Ingersoll one of the most popular lecturers of his day. Now, after the mushy ecumenism of the late twentieth century and the religious terrorism of the early twenty-first, a growing number of Americans plainly long for something more bracing.

But we are still in the position of the French revolutionary who has not moved beyond antiroyalism. Atheism is a purely negative ideology, which is its problem. If one does not believe in God, what should one believe in instead? Dawkins thinks he has an answer -- science -- but his understanding of the term is embarrassingly crude and empirical.

This comes through when he tries to figure out how "the God delusion" arose in the first place. Why did people latch onto an idea that we now know to be incorrect? Why didn't the ancient Israelites conduct their own double-blind experiment to determine whether sacrificing all those bulls, rams and occasionally children to Yahweh was really worth the trouble? Dawkins gropes for an explanation at one point in his book. He speculates that religious visions may be a form of temporal lobe epilepsy (which implies that there must have been quite an epidemic in Palestine when people like Elijah, Hosea and Jeremiah were raising a ruckus) but then lets the idea drop. He suggests that religion caught on because it confers certain evolutionary advantages but concedes that this is exceedingly hard to prove. He speculates that faith may be the result of a self-replicating "meme," the cultural equivalent of a gene. But after a murky discussion of "memeplexes" and genetic cartels, the reader is left with the uncomfortable feeling that Dawkins is lost in a tautological fog in which religion is self-replicating because it satisfies certain human needs and is therefore... self-replicating. Finally, he suggests that religion survives because it is comforting -- this, some 200 pages after conceding that religion is as likely to exacerbate stress as to alleviate it. (The last thing Old Testament prophets wanted to do was soothe troubled souls.)

Dawkins's sense of history is so minimal that it approaches the vanishing point. He is a classic example of the kind of shallow rationalist who thinks that all you have to know about history is that everything was cloudy and dark until the scientific revolution of the sixteenth and seventeenth centuries, at which point the sun began poking through. To quote Alexander Pope: "Nature and Nature's laws lay hid in night:/God said, Let Newton be! and all was light." Religion took hold at a certain point because people were stupid and benighted, but now that this is no longer the case, it should not hang around a moment longer. Yet it never occurs to Dawkins that monotheism is a theory like any other and that certain Jewish scribes and priests adopted it in the sixth century BC because it seemed to confer certain advantages. These were not survival advantages, since the Jews went on to rack up an unparalleled record of military defeats. Rather, they were intellectual advantages in that the theory of a single all-powerful, all-knowing deity seemed to explain the world better than what had come before.

Since Dawkins sees all religion as merely dumb, he can't imagine how this might be. Hence he can't see how the idea of an all-powerful, all-knowing creator might cause worshipers to see the world as a single integrated whole and then launch them on a long intellectual journey to figure out how the various parts fit together. Roughly 2,500 years separate the Book of Isaiah, in which Yahweh first declares, "I am the first and I am the last; apart from me there is no god [44:6]," and Einstein's quest for a unified field theory explaining everything from subatomic structure to the Big Bang. Everything else has changed, but the universalism behind such an endeavor has remained remarkably constant. Dawkins blames religion for stifling human curiosity. But were he a bit more curious about the phenomenon he is supposedly investigating, he would realize that it has done as much over the long haul to stimulate it. For a world-famous intellectual, he is oddly provincial.

Christopher Hitchens's new book, God Is Not Great, is another example of atheism as an empty vessel, one he manages to fill with an intellectual justification for George W. Bush's "war on terror." Hitchens, of course, is the former left-wing journalist who astounded friends, colleagues and readers alike by coming out in support of the invasion of Iraq in 2003. Since then, with everyone from Richard Perle to Peter Beinart busily backpedaling as the dimensions of the disaster have grown more and more glaring, Hitchens has dug in his heels. Like John McCain strolling through the Baghdad markets, he is more defiant of reality than ever, more insistent, as he put it in a March 26 article in the Australian, that the occupation has made the world a better and safer place. In God Is Not Great, he has something unpleasant to say about nearly every believer under the sun -- except one. He trots out John Ashcroft's infamous remark that America has "no king but Jesus" and reminds us that Jerry Falwell and Pat Robertson both welcomed 9/11 as payback for America's tolerance of homosexuality and abortion. He informs us that Hamas has talked about imposing the old jizya tax on Christians and Jews in the West Bank, while in Gaza in April 2005 Muslim militants shot and killed a young woman named Yusra al-Azami merely because she was sitting unchaperoned in a car with her fiancé. For those inclined to think of the late Saddam Hussein as a Third World dictator in the secular-nationalist mold, Hitchens points out that Saddam found religion after the 1979 Iranian Revolution, inscribing the words "Allahu Akbar" (God is great) on the Iraqi flag, building a huge mosque as a showcase for his new piety and producing a handwritten version of the Koran allegedly with his own blood.

Yet one person is conspicuously absent from Hitchens's list of religious evil-doers: George W. Bush. Yes, the man who said Jesus is his favorite philosopher "because he changed my heart" and, as governor of Texas, proclaimed June 10 as "Jesus Day," goes unmentioned. How can this be? The explanation has to do with Hitchens's subtitle. If "religion poisons everything," then it must be responsible for most of the evil in the world, since belief of this sort is currently so widespread and pervasive. If a political leader is religious, he or she must be bad, and if he or she is bad, he or she must be religious. This is why Saddam gets slammed for his cynical exploitation of Islam and why Bush, author of the Global War on Terror and the war on Iraq, both of which Hitchens supports, gets a free pass. If he is to be believed, our faith-based President is defending rationalism against religious intolerance. Despite Hitchens's anti-Stalinist credentials, arguments like these are so unscrupulous as to call to mind the Comintern of the late '30s and early '40s. Somewhere, Andrei Vyshinsky is smiling.

Hitchens's historical sense in God Is Not Great is perhaps even more stunted than that of Dawkins. Here he is, for example, attacking the Jewish holiday of Hanukkah, which celebrates the Maccabean revolt in 168 BC against the Seleucid effort to de-Judaize the Jerusalem temple and consecrate it to Zeus:

When the father of Judah Maccabeus saw a Jew about to make a Hellenic offering on the old altar, he lost no time in murdering him. Over the next few years of the Maccabean "revolt," many more assimilated Jews were slain, or forcibly circumcised, or both, and the women who had flirted with the new Hellenic dispensation suffered even worse. Since the Romans eventually preferred the violent and dogmatic Maccabees to the less militarized and fanatical Jews who had shone in their togas in the Mediterranean light, the scene was set for the uneasy collusion between the old-garb ultra-Orthodox Sanhedrin and the imperial governorate. This lugubrious relationship was eventually to lead to Christianity (yet another Jewish heresy) and thus ineluctably to the birth of Islam. We could have been spared the whole thing.

If only the Maccabees had stood by as Antiochus IV Epiphanes looted the temple treasury, the world could have skipped 2,000 years or so of religious fanaticism and proceeded directly to the founding of the Council for Secular Humanism. Needless to say, there is no sense here of historical progress as necessarily convoluted and complex, with lots of back eddies, side currents and extended periods of stagnation. But just as a child takes a long time to mature, so does a society.

It would be nice to believe that anachronistic thinking like this halted at Calais, but Michel Onfray's Traité d'athéologie, which has been given the hotter title of Atheist Manifesto for the American market, is not reassuring. Onfray is more philosophically sophisticated than Dawkins and Hitchens, and he is entirely commendable in his determination to hold Judaism, Christianity and Islam to the same rigorous standard. Whereas Sam Harris singles out Islam as "a religion of conquest," for instance, Onfray points out that it was the Israelites who invented holy war, that the Israelite god Yahweh "sanctioned crimes, murders, assassination...kill[ing] animals like men and men like animals," and that the Vatican has distinguished itself more recently as "a fellow traveler with every brand of twentieth-century fascism -- Mussolini, Pétain, Franco, Hitler, Pinochet, the Greek colonels, South American dictators, etc." Islam's division of the world into a land of Islam and a land of infidels is "not too distant from Hitler's," Onfray adds. But Harris should know better than to call it "unique."

This may seem fairly obvious. But with everyone from atheists to neocons jumping on board the anti-Islamic crusade, it bears repeating. Still, Onfray goes astray in the left-Nietzschean twist that he gives to his antireligious critique. Nietzsche's influence is evident throughout Atheist Manifesto -- in its high-wattage prose style, in its tendency toward aphorism, but mostly in its treatment of things like Judaism and Christianity as intellectual categories removed from their historical contexts. Onfray, to cite just one example, is extremely hard on St. Paul, whom he describes as a hysteric "driven by a host of psychological problems," a loser who "converted his self-loathing into hatred of the world" and someone whose "impotence and resentment took the form of revenge: the revenge of the weakling." None of this is surprising, given Paul's views on such subjects as celibacy (strongly in favor), marriage (only for those unable to forgo sex), slavery (accepting) and women (condescending, to say the least). But anyone who reads Paul in the context of the entire Bible -- which Onfray says elsewhere is the only way the Bible can be properly understood -- will likely come away with a different impression. His hysteria, such as it is, doesn't begin to compare with that of Hosea, Jeremiah and other Hebrew prophets, whose rages were truly volcanic. His political quietism is more explicable if one bears in mind that he believed that an impending apocalypse would soon put an end to all forms of injustice. His views on gender are more benign than is commonly realized, which may be why even pagans reported that women were among the first to convert.

Indeed, Paul was something new as far as the biblical tradition was concerned, a thinker, polemicist and organizer who was sober, practical and all but tireless. This is undoubtedly why Engels was so notably friendly toward him in one of his last essays. Not only did he describe Pauline Christianity as the socialism of its day but, referring to an epistle in which Paul reminds parishioners of the need to provide the new movement with financial support (which he describes as the "grace of giving"), he even commiserated with him across the centuries over the difficulty of squeezing party dues out of local members. ("So it was like that with you too!") Context for Engels was all. It was obvious from his perspective that someone like Paul could not be held exclusively to a modern standard but had to be judged on the basis of his historical role. So what has happened in the century or so since Engels wrote that essay that has caused otherwise admirable leftists like Onfray to lose their historical bearings? Could the baleful influence of Nietzsche, the favorite philosopher of overwrought 16-year-olds, have something to do with it?

Terry Eagleton shows a firmer grasp of the issues in The Meaning of Life -- far firmer, in fact, than he did in the verbal hurricane that he unleashed on Dawkins in the London Review of Books last October. That article, which earned Eagleton a warm note of congratulations from Peter Steinfels in his "Beliefs" column in the New York Times -- an indication of just how bad it actually was -- was filled with ex cathedra comments and unsupported assertions that Eagleton, a left-wing Catholic back in the 1960s, somehow thought he could intimidate his readers into accepting. Thus: Dawkins "does not see that Christianity, like most religious faiths, values human life deeply, which is why the martyr differs from the suicide." Or: "Because the universe is God's, it shares in his life, which is the life of freedom. This is why it works all by itself, and why science and Richard Dawkins are therefore both possible." Dawkins is a boor, in other words, because he is unable to grasp such ineffable truths. Yet both statements are simply silly. Judaism concerns itself not with the life of the individual but with the life of the nation, while Christianity saw the life we know as merely a prelude to the real life that will occur after the Resurrection. If the universe worked all by itself, similarly, God would have no need to intervene in it miraculously from time to time, as He does in both the Old Testament and the New.

With The Meaning of Life, however, Eagleton, the author of such works as Criticism and Ideology: A Study in Marxist Literary Theory and The Illusions of Postmodernism, goes back to channeling his inner materialist. When he mentions God, it is in the sense of an abstract principle that he identifies by the Greek term agape, or love. Needless to say, this is not love in the erotic sense of the word but as a cosmic force that is an expression of the deity's free choice in creating a material universe in which human beings can exist. Since Eagleton is coy as to whether he is speaking literally or figuratively, most readers will assume the latter. As a rhetorical device, it therefore allows him to make the point that the alternative to divine creation is not an empty and meaningless universe, as some moderns would have it. Rather, we can still see the universe as an intelligible whole, one whose "underlying laws," he writes, "reveal a beauty, symmetry, and economy which are capable of moving scientists to tears" (a rare point of agreement with Dawkins). If believers, according to Bishop Berkeley, believe that God invested the universe with meaning through the act of creating it, then nonbelievers can believe that people can invest life with meaning through a similar act of creating a mode of living that allows people to realize their full potential.

This supposes that meaning is not something that one discovers "out there," by, say, sitting on a lonely mountaintop and contemplating the heavens. Rather, it supposes that one discovers it "in here," that is, in society and through it. In The God Delusion, Dawkins notes that people might fill the gap left by religious belief in any number of ways but adds that "my way includes a good dose of science, the honest and systematic endeavor to find out the truth about the real world." The words "my way" are a giveaway, since they suggest that meaning is something we arrive at individually. Eagleton, by contrast, contends that individual meaning is a solipsism, because any statement about oneself -- such as "I am handsomer than Adonis" or "I am the greatest composer since Beethoven" -- is meaningful only to the degree it is recognized by others. Hence, "my life is meaningful" is itself meaningful only to the degree that other people view it as such and see their own lives the same way. Hence, meaning can be achieved only via a collective act of self-creation in which humanity creates new conditions for itself so that humanity as a whole can flourish. As a corollary, Eagleton adds that "since there can be no true reciprocity except among equals, oppression and inequality are in the long run self-thwarting as well." Freedom and equality are necessary for humanity to create a meaningful existence for itself.

In short, humanity creates meaning for itself by liberating itself so that it can fulfill itself. This is also a solipsism, but one as big as all existence. Odd, isn't it, that atheists can be right about God but wrong about religion and much else about the modern condition, while a believer can be wrong about God but at least on the right track concerning the current spiritual malaise?

My Beef With Vegetarianism

There are many horrifying moments in Anatoly Kuznetsov's great Soviet novel Babi Yar, but one of the most horrifying concerns, of all things, the death of a newborn kitten. The kitten has been born deformed, so the hero, a small boy living in Nazi-occupied Ukraine, has to kill it. But instead of doing it the usual way by drowning it in a bucket, he decides it would be somehow kinder to pound the animal to death with a brick. "It was a moist, warm blob of life," Kuznetsov writes, "utterly devoid of sense and as insignificant as a worm. It seemed nothing could be easier than to dispose of it with one blow." But when he lets the brick fall,

A strange thing happened -- the little body seemed to be resilient, the brick fell to one side, and the kitten continued its miaowing. With shaking hands I picked up the brick again and proceeded to crush the little ball of living matter until the very entrails came out, and at last it was silent, and I scraped up the remains of the kitten with a shovel and took them off to the rubbish heap, and as I did it my head swam and I felt sick.

Somehow, amid the myriad slaughters of World War II, it takes a frail and worthless kitten -- "as insignificant as a worm" -- to teach us something about the tenacity of life and the awfulness of taking it away.

I'm not sure where I was when I came across this passage some thirty-odd years ago, but I'm pretty sure it was in close proximity to settling down to a steak or chicken dinner. If I made any connection between the kitten and the dead animal I was about to consume, it has been erased from my memory. But if I had made such a connection, what exactly would it have been? Certainly, pulverizing a poor defenseless creature is bad. But does that mean that dispatching it quickly and efficiently in a modern abattoir with the good utilitarian purpose of feeding the hungry is good? If "defenseless" is the operative word here, does that mean it is morally permissible to take the life of a fellow creature as long as we give it a sporting chance to fight back or escape -- in a bullring, perhaps, or out in the wild? Or maybe sentience is the relevant issue (as utilitarian philosopher Peter Singer, author of the 1975 manifesto Animal Liberation, maintains), in which case it may be bad to kill a kitten, but it's OK to kill an animal further down the evolutionary scale, such as a frog, a fish or a bug.

On the other hand, if life is the highest value and taking it is never, ever permissible, then what are we to do in the case of a poisonous snake that is about to strike a sleeping infant? Kill one to save the other, or stand back and let nature take its course? If all lives are equally precious, how can we choose between the two?

These are the kinds of conundrums that Tristram Stuart chews on in The Bloodless Revolution, his intelligent, readable, if ultimately unsatisfying, account of Western vegetarianism from the Elizabethan Age to the present. Many people no doubt regard vegetarianism as inherently frivolous and hence an unsuitable topic for serious intellectual history. But if The Bloodless Revolution does anything, it is to prove such skeptics wrong. One way or another, it shows that vegetarians have been in the forefront of some of the most important controversies of the modern era. The reason is not hard to fathom. Like everything else in life, food is multidimensional, which is why the question of whether to order fruit salad or a BLT is never solely a matter of taste but touches on everything from morality and aesthetics to agricultural policy, humanity's place in the natural world and even constitutional affairs. In the eighteenth century, to cite just one example, beef was as central to the English self-image as cheap gasoline currently is to that of the United States. Just as the ability to cruise down a highway in an SUV or pickup is what distinguishes an American from a Frenchman paying $7 a gallon to tool around in some mini-subcompact, the ability to consume great slabs of cow flesh was what distinguished John Bull from "Frogs" dining on onions and snails. Scruffy vegetarians seeking to take all that red meat away were barely distinguishable from Jacobin sympathizers wishing to guillotine the House of Lords.

If we are what we eat, in other words, then modifying the national diet was seen as the quickest route to changing the political structure, while resisting such demands was part and parcel of defending the status quo. Their analysis may have been naïve, but vegetarians' ambitions were immense and their critique was nothing if not sweeping.

Stuart begins his tale with Sir Francis Bacon, appropriately enough since Bacon was both a key figure in the Scientific Revolution that gave us modernity and a man keenly interested in questions of diet, health and longevity. This was a big issue in the seventeenth century for primarily scriptural reasons. The opening pages of the Bible are filled with people who live eight or nine centuries. But then, following Noah and his ark comes Genesis 9:3, in which God specifically gives permission to eat meat ("Everything that lives and moves will be food for you"). With that, longevity plummets. Since few people questioned the truth of such tales, the issue, as Bacon saw it, was what one had to do with the other -- whether not eating meat was the reason Methuselah lived 969 years or whether it was merely coincidental. Bacon never advanced beyond the speculative stage, but Thomas Bushell, one of his acolytes (and, it was widely reported, one of his lovers), put his master's theory to the test by retiring to the Calf of Man, a one-square-mile islet in the Irish Sea, following a period of riotous debauchery in the gaming houses, theaters and brothels of London. Bushell was hardly the first person to adopt a hermitic lifestyle, but he may have been the first to eschew meat and alcohol with the express purpose of improving his health. Although falling short of Methuselah's record, he lived to the ripe old age of 80 and died a wealthy man after developing the silver and lead mines of nearby Wales.

A surge of vegetarianism followed during the revolutionary period of the 1640s and '50s, when England was torn by civil war between parliamentary Roundheads and royalist Cavaliers. Rather than scientific exploration, the goal this time was more overtly political. The similarity between the mistreatment of animals and the common folks' ill treatment at the hands of the old ruling class was too obvious to ignore. Eliminating one surely entailed putting a stop to, or at least limiting, the other. Thus, a radical preacher named John Robins declared himself the new Adam and demanded that his followers, known as the Ranters, give up meat and alcohol so as to "reduce the world to its former condition, as it was before the fall of the first Adam," in the words of one of his disciples. A bricklayer-preacher in the town of Hackney told an excited crowd that "it is unlawfull to kill any creature that hath life because it came from God." An ex-soldier named Roger Crab accused his oppressors of "thirsting after flesh and blood" and asserted that "humour that lusteth after flesh and blood is made strong in us by feeding of it." Not feeding of it was the surest way to eliminate such aggressive tendencies.

A growing number of travelogues from India, the world capital of vegetarianism, gave such arguments an inestimable boost. Europeans were astonished by stories of Brahmans who lived on fruits and vegetables and of Jains who regarded life as so valuable that they swept the streets to avoid stepping on insects. A Dutchman named John Huygen van Linschoten reported in the 1590s that Indians "kill nothing in the world that has life, however small and useless it may be." An Englishman named Ralph Fitch wrote that they even had "hospitals for sheepe, goates, dogs, cats, birds, and for all other living creatures," adding that "when they be old and lame, they keepe them until they die." This was an eye-opener for Europeans who automatically killed their animals when they were past the productive age. Although Westerners assumed that meat and alcohol were what made them more manly, the French traveler François Bernier noted in the 1660s that Indian armies traveled more quickly on rice and dried lentils than European armies weighed down by their barrels of salted beef and tankards of wine. Indian ways were not only different but might actually be superior.

Unfortunately for vegetarianism, however, it was also during the Enlightenment that the ideology's shortcomings grew more obvious. The most difficult had to do with ethics. Vegetarianism is most fundamentally about the importance of not taking life other than under the most extreme circumstances. But cruel as it is to kill an ox or a pig, nature is even crueler. A tiger or wolf does not knock its prey senseless with a single blow to the forehead and then painlessly slit its jugular; rather, it tears it to pieces with its teeth. Freeing an animal so that it could return to its natural habitat meant subjecting it to a life of greater pain rather than less. This was disconcerting because it suggested that animals might be better off on a farm even if they were to be slaughtered in the end. There was also the fact that human agriculture created life that would not otherwise exist. If people stopped eating meat, the population of pigs, cattle and sheep would plummet, which meant that the sum total of happiness, human or otherwise, would diminish. This was enough to persuade the Comte de Buffon, a freethinker and naturalist, to declare in 1753 that man "seems to have acquired a right to sacrifice" animals by breeding and feeding them in the first place.

Vegetarians were unsure how to respond. Benjamin Franklin turned anti-meat at one point and for a time regarded "the taking of every Fish as a kind of unprovok'd Murder." But he had a change of heart when he noticed the many small fish inside the stomach of a freshly caught cod: "Then thought I, if you eat one another, I don't see why we mayn't eat you." Franklin's contemporary, the radical English vegetarian Joseph Ritson, wrestled with the same problem only to reach the opposite conclusion. He railed against "sanguinary and ferocious" felines, and when his nephew killed a neighbor's cat on the grounds that it had just murdered a mouse, he sent the boy a note of congratulations: "Far from desiring to reprove you for what I learn you actually did, you receive my warmest approbation of your humanity." Vegetarians wanted to knock Homo sapiens off its pedestal and bring the species down to the level of the other animals. Simultaneously, they wanted to turn human beings into supercops patrolling nature's furthest recesses in order to rein in predators and impose a more "humane" regime.

Some of the Western world's most exemplary intellectuals immersed themselves in debates of this sort. Leonardo da Vinci ranted against cruelty to animals, worried that eating eggs deprived future beings of life and reportedly purchased caged birds for the sole purpose of setting them free. Isaac Newton admired vegetarians and believed in the humane treatment of animals, although reports that he was a vegetarian himself proved exaggerated when a bill from a butcher, poulterer and fishmonger turned up among his personal effects after his death. Except for the occasional egg, Descartes limited himself to a fruit-and-vegetable diet in hopes that it would give him long life (he died at age 53). Shelley, who adopted vegetarianism at around age 20, believed not only that eating meat made people violent but that it fed the desire for luxury goods, a prime factor in the growth of human inequality. The hope was that the rich might not lord their wealth so much over the poor if they ate a humble diet.

Somewhat less exemplary -- or exemplary in a different way -- was Adolf Hitler, who gave up meat for a time in 1911 to treat a stomach ailment and then again in 1924 to shed some weight. Thereafter, he became a dedicated vegetarian, believing, according to Stuart, that a meat-free diet was the only thing that "alleviated his chronic flatulence, constipation, sweating, nervous tension, trembling of muscles, and the stomach cramps that convinced him he was dying of cancer." (The recent movie Downfall shows him consuming a meatless last supper of ravioli in tomato sauce before committing suicide in his Berlin bunker.) Urged on by their Führer, Nazi officials nagged Germans to abandon sausage and potatoes for "more natural diets based on wholesome roots, fruits and cereals," according to The Bloodless Revolution, and "legally obliged bakers to sell wholemeal bread -- the patriotic food of the great German peasant." German theosophists applauded, as did George Bernard Shaw and meat-eschewing Seventh-day Adventists. The latter group declared in 1933 that Germany had at long last gained a leader "who has his office from the hand of God, and who knows himself to be responsible to Him. As an anti-alcoholic, non-smoker, and vegetarian, he is closer to our own view of health reform than anybody else." The same regime that sent millions to the death camps, Stuart adds, promulgated rules for the humane slaughter of fish and crustaceans.

What drew Hitler to vegetarianism were most likely its antihumanist and authoritarian elements. "The monkeys, our ancestors of prehistoric times, are strictly vegetarian," he pointed out on one occasion. "If I offer a child the choice between a pear and piece of meat," he said on another, "he'll quickly choose the pear. That's his atavistic instinct speaking." Atavism was a virtue, of course, because it put the good Nazi in touch with his inner beast. "Man, alone amongst the living creatures," Hitler added, "tries to deny the laws of nature" -- laws that Nazism was out to reimpose. Since 1945 this nihilist strain has been carried forward by such figures as the Fascist vegetarian Maximiani Portas (a k a Savitri-Devi), who argued that "you cannot 'de-nazify' Nature" and, says Stuart, "laments that ancient forests have been destroyed to build roads, cities and to grow food for 'more and more people who might as well never have been born.'"

This is fascinating stuff. But it is at this point that The Bloodless Revolution loses narrative steam, which is odd considering vegetarianism's dramatic resurgence in recent years among pierced and tattooed twentysomethings. Instead of Trotskyists, Maoists and Social Democrats going at one another hammer and tongs, progressive circles are now witness to arguments over the merits of soy versus dairy, while "dumpster-divers" demonstrate their contempt for capitalist waste by subsisting on the discards from restaurants. Stuart says nothing about such developments; instead he winds up with a rather cursory two-page summary in which he criticizes "the old anthropocentric speciesism which attributes moral worth to entities according to how similar they are to 'us'" and acknowledges that "human self-interest" will always be a factor in determining agricultural policy; but he never explains how we can have one without the other. While not embracing vegetarianism entirely, he is clearly sympathetic and, where meat is concerned, comes out squarely in favor of a policy of less is more. "The equation," he says, "is simple: if we ate less unsustainably produced meat we would destroy fewer forests, use less water, emit fewer greenhouse gases and conserve the world's resources for future generations."

But calling something simple does not make it so. No sane person favors unsustainably produced meat. But, tellingly, Stuart does not consider the possibility of meat that is sustainably produced in accordance with the strictest environmental standards. Should we eat less of that also? Or more? Perhaps the issue should not be quantity but quality -- not whether we should eat more or less but whether we should eat better, which is to say chicken that tastes like something other than cardboard, turkey that tastes like something other than Styrofoam and so on. Maybe the solution is to reject bland industrial products and demand meat with character, the kind that comes from animals that have not spent their lives in industrial feedlots but have had an opportunity to walk around and develop their muscles.

On a more fundamental level, perhaps the problem has to do with the awful word "speciesism." In arguing for a balance between animal welfare, ecology and human self-interest, Stuart is advocating what some political theorists call the "parcellization" of human sovereignty, its division and subordination amid an array of competing interests. The idea is that instead of reigning supreme over nature, humanity should take its place within nature alongside its fellow animals. Instead of domination, this implies sharing, harmony and other New Age virtues. But the trouble with sovereignty is that it cannot be fragmented or reduced; either it's supreme and indivisible or it's not, in which case it's no longer sovereignty. Although vegetarians may think that surrendering human supremacy will reduce the harm that people do to the environment, any such effort is invariably counterproductive. Denying humans their supreme power means denying them their supreme responsibility to improve society, to safeguard the environment on which it depends and even -- dare we say it -- to improve nature as well.

Besides, humans are already sovereign -- the trouble is that most of them don't realize it or, for political reasons, refuse to acknowledge it, maintaining instead that real sovereignty lies with God, nature or the free market. But real-life experience tells us otherwise. Since vegetarians began warning in the eighteenth century that the earth would run out of food unless everyone immediately shifted to potatoes and grain, the global population has more than sextupled, global per capita income has increased nearly tenfold even when inflation is taken into account, and consumption of meat, poultry and seafood has risen as well, up 37 percent in the United States since 1909 and even more strongly in less developed portions of the world. More people are living better and eating more richly than anyone in the 1700s would have thought possible. Whether or not people are consuming more meat and poultry than is good for them, the record is yet another reminder, as if any more were needed, of how thoroughly Malthusian myths about limits to human productivity have been shattered.

Scarcity no longer serves as an argument for vegetarianism, and neither, for that matter, does health, since we know from studies of Okinawan centenarians and others that small amounts of meat and dark-fleshed fish are good for you; that moderate amounts of alcohol (which vegetarians for some reason appear to avoid) are good for you as well; and that plenty of exercise, a sense of well-being that comes from a strong social structure and, of course, universal healthcare are equally essential.

So the next time you tuck into a plate of tagliatelle Bolognese, a leg of lamb or a proper coq au vin made from some rangy old rooster that's had more lovers than most of us can dream of, you should see it not just as a chance to fill your stomach but, rather, as an occasion to celebrate humanity's ongoing struggle to create abundance out of scarcity. Venceremos! It's a lot better than wallowing in the silly defeatism of a diet of tofu and sprouts.

The New Yorker Goes to War

In its first issue after the fall of the World Trade Center, The New Yorker published a handful of short reaction pieces by John Updike, Jonathan Franzen and others about the horror that had just occurred in lower Manhattan. Only one really stood out, however: an angry and eloquent blast by Susan Sontag at "a robotic president who assures us that America still stands tall" and robotic politicians who "apparently feel free to say nothing more than that they stand united behind President Bush."

In the wake of the Twin Towers attack, Sontag wrote, Americans had much to ponder "about the ineptitude of American intelligence and counter-intelligence, about options available to American foreign policy, particularly in the Middle East, and about what constitutes a smart program of military defense." Yet rather than thinking, politicians and the press were engaging in "confidence-building and grief management." Where Americans had once been contemptuous of Soviet yes-men, their own representatives were proving no less acquiescent in the crunch as the Bush Administration geared up for war. "The unanimity of the sanctimonious, reality-concealing rhetoric spouted by American officials and media commentators in recent days," she declared, "seems, well, unworthy of a mature democracy."

The essay, less than 500 words long, unleashed a torrent of right-wing abuse, most of it zeroing in on Sontag's parenthetical point that, by themselves, courage and cowardice are morally neutral -- their moral quality depends entirely on the ends they serve. Hence: "Whatever may be said of the perpetrators of Tuesday's slaughter, they were not cowards." Andrew Sullivan called it "deranged" and Charles Krauthammer said Sontag was morally obtuse, while Rod Dreher, a columnist for the New York Post, expressed a desire "to walk barefoot on broken glass across the Brooklyn Bridge, up to that despicable woman's apartment, grab her by the neck, drag her down to ground zero and force her to say that to the firefighters."

"I didn't agree at all with Susan Sontag's now famous four paragraphs," editor David Remnick subsequently confessed in an interview with a Japanese newspaper, and the magazine's coverage showed it. Never again would The New Yorker publish anything remotely as outspoken about Bush's War on Terrorism, by Sontag or by anyone else. While criticism of the White House did not exactly vanish, it unquestionably wound down, growing more tempered and balanced as the editors struggled to find something nice to say about Administration policies.

House liberal Hendrik Hertzberg continued to turn out editorials that were skeptical and irreverent (although, this being The New Yorker, never very angry). But he found himself regularly checked by Remnick, who weighed in at crucial moments with "Talk of the Town" comments that, after the usual hemming and hawing, inevitably concluded that the White House was on the right track after all. Rather than challenge the hawks, the magazine either confined itself to criticisms of the way the war was being conducted or, in a few instances, sought to one-up the boys on the Defense Policy Board by running terrorist scare stories more lurid than even they could dream up. In the end, the magazine wound up endorsing just the sort of war policies it had warned against back in September 2001. Rather than opposing robotic yes-man politics, it ended up practicing them.

The New Yorker has not been the only publication to fall into line behind the Bush Administration's war drive, but for a number of reasons its performance seems especially disappointing. One reason has to do with the magazine's track record. One doesn't have to be a William Shawn devotee to agree that the magazine has published some astonishing journalism over the years -- Hannah Arendt's "Eichmann in Jerusalem," James Baldwin's "Letter from a Region of My Mind," Rachel Carson's "Silent Spring," Jonathan Schell's pieces on Vietnam and Pauline Kael's wonderful demolition job on Claude Lanzmann's Shoah, to name just a few. During the Vietnam War, it was one of the few mainstream publications to try to unmask the sordid reality behind the brass's regular 5 o'clock press briefings. And if it published too many long and hyperfactual stories in the 1980s about wheat or geology, at least it preferred being trivial and obscure to the glories of being a team player in Washington, which is a moral stance of a sort.

Though its style may have been genteel, The New Yorker succeeded in challenging middle-class sensibilities more often than any number of scruffier publications. Another reason to mourn the magazine's lack of resistance is that it represents an opportunity lost. Just as the magazine helped middle-class opinion to coalesce against US intervention in Vietnam, it might well have served a similar function today by clarifying what is at stake in the Middle East. Rather than unveil the reality behind a spurious War on Terrorism, though, The New Yorker helped obscure it by painting Bush's crusade as a natural and inevitable response to the World Trade Center/Pentagon attack and, as a consequence, useless to oppose. Instead of encouraging opposition, it helped defuse it. From shocking the bourgeoisie, it has moved on to placating it at a time when it has rarely been more dangerous and bellicose.

How does a magazine bring itself to such a pass? The process probably began when Tina Brown took over in 1992. Politically, Brown wasn't left wing or right wing so much as no wing. She fawned over Ronald and Nancy Reagan in Vanity Fair and then, a dozen years later, fawned over Bill Clinton in The New Yorker ("his height, his sleekness, his newly cropped, iron-filing hair, and the intensity of his blue eyes..."). While publishing the occasional exposé, such as Mark Danner's memorable "Massacre at El Mozote," she was more concerned with putting the magazine in the swim. David Remnick, who succeeded her in 1998, is a different case. Where Brown is catty and mischievous, his style is earnest and respectable. Although a talented reporter and a graceful writer, he lacks Brown's irreverent streak. (One can hardly imagine him writing a first-person account of dancing topless in New Jersey, or whatever the male equivalent might be, as Brown famously did at the beginning of her career.) Remnick's 1993 book, "Lenin's Tomb: The Last Days of the Soviet Empire," dutifully followed the Washington line in reducing a complex historical event to a simple-minded melodrama about noble dissidents versus evil Communist apparatchiki. Under his leadership, The New Yorker has never seemed more like a tame, middle-of-the-road news magazine with cartoons, which may explain why its political writers, people like Nicholas Lemann, Jeffrey Goldberg and Remnick himself, have never enjoyed more airtime on shows like Charlie Rose. In traveling from irreverence to reverence, it helps to have someone in charge with a heat-seeking missile's ability to home in on the proper establishment position at any given moment. But it also helps to have someone who knows when to ask the tough questions and when to turn them off.

A magazine like The New Yorker, of course, is a diverse and sprawling thing. It speaks with many voices, not all of which completely agree. If Remnick leans to the right, Hertzberg still leans to the left, leaving it to the reader to triangulate between the two. Reporters like Jane Mayer and Jon Lee Anderson, for their part, have raised uncomfortable questions about the War on Terrorism. Still, an overall conservative turn has been apparent from nearly the moment the Twin Towers collapsed.

In its initial post-9/11 issue, for instance, the same one in which Sontag's mini-blast appeared, Hertzberg noted in "The Talk of the Town" that the Bush Administration was describing the assault on the World Trade Center and Pentagon "with growing ferocity...as acts of war." Instead of jumping on board, Hertzberg was duly cautious in his response: "Unless a foreign government turns out to have directed the operation (or, at least, to have known and approved its scope in detail and in advance), that is a category mistake." Pace Bush, harboring a terrorist did not make one as guilty as a terrorist; rather, you had to know what he was up to and be a party to his plans to be placed on the same moral footing.

Sobriety like that was a rare commodity in those superheated days. But then, sensing perhaps that too much sobriety might be a dangerous thing, Remnick enlisted Hertzberg a week later in co-writing an editorial stating that war was justified after all. "The trope of war has been omnipresent since the day of the attack," the two men wrote, and "President Bush did not, and could not, retreat from it." Still, "even as the President spoke of war his remarks reflected an awareness of the snares that word carries. Justice, more than war, was the speech's theme." This was the September 20, 2001, televised address in which Bush told the world, "Either you are with us or you are with the terrorists." Instead of targeting terrorists, he was now targeting those who failed to show sufficient alacrity for the new American jihad. Yet somehow Remnick and Hertzberg persuaded themselves that the President's remarks "went some distance in reassuring those who feared that his response to this crisis would be a primitive spasm of vengeful violence." Once The New Yorker convinced itself that Bush was well intentioned even if his performance was sometimes maladroit, it became one more courtier straining to get the king's ear.

A month later, the magazine published an investigative report by Seymour Hersh blaming a Pentagon culture of "political correctness" for the failure to assassinate Mullah Muhammad Omar. According to Hersh, military personnel could have taken the Taliban leader out with a Hellfire missile strike once a Predator drone got him in its sights. But they were blocked by overscrupulous higher-ups who "want you to kill the guy, but not the guy next to him," as one of his sources in the military put it. It was a form of legal squeamishness, apparently, that the Pentagon would soon overcome. Two months later Hersh explored the growing neocon push for an invasion of Iraq. But rather than ask why the United States was targeting Saddam despite a lack of evidence tying him to 9/11, Hersh confined himself to the purely practical question of whether the hawks could pull it off. "The issue is not how nice it would be to get rid of Saddam," he quoted one former Defense Department official as saying. "The problem is feasibility." If the United States could do it, in other words, it should.

To be fair, Hersh's reporting has grown more critical of late. The New Yorker's overwhelmingly antiwar readership no doubt cheered when his March 17 report about superhawk Richard Perle's business dealings with Saudi-born arms merchant Adnan Khashoggi led to Perle's resignation as chairman of the Pentagon's Defense Policy Board. Ripping into the work of special Pentagon intelligence analysts assigned to find proof of Saddam's ties to Al Qaeda and the arsenal of forbidden weapons he had supposedly assembled, Hersh quoted an angry ex-CIA official as saying, "They were so crazed and so far out and difficult to reason with -- to the point of being bizarre. Dogmatic, as if on a mission from God." This was strong stuff. But in discussing how they may have allowed themselves to be duped by their own propaganda, Hersh couldn't bring himself to mention the likelier possibility -- that "intelligence" like this is a smokescreen and that, from the analysts on up, the Bush Administration has simply lied. The New Yorker may criticize aspects of the War on Terrorism, but to suggest that Bush's jihad is fraudulent remains beyond the pale.

Indeed, the closest the magazine came to such heresy was in an October 2002 article by Nicholas Lemann. As the author of "The Promised Land: The Great Black Migration and How It Changed America" and, more recently, "The Big Test," a study of standardized exams, Lemann has done some important work in the past. But as The New Yorker's Washington correspondent, he seems to have reinvented himself as the sort of star-struck journalist who daydreams about fly-fishing with Dick Cheney and gushes over Condoleezza Rice: "To spend time with Rice is to see right away how she has such an effect on people. 'Condi is just an A-plus-plus-plus presence,' one former Administration official who knows her says. 'She's the most gifted speaker and conveyer of solid thoughts, in articulate form, of anybody I've ever met.'"

On this occasion, however, Lemann allowed a bit of irony to peek out from behind that deadpan prose. In the year since the assault on the World Trade Center, he noted, Bush had won the foreign policy battle merely by getting everyone to parrot the words "War on Terror." "The phrase," he wrote, "meets the basic test of Presidential rhetoric: It has entered the language so fully, and framed the way people think about how the United States is reacting to the September 11th attacks so completely, that the idea that declaring and waging war on terror was not the sole, inevitable, logical consequence of the attacks just isn't in circulation."

Quite right. But Lemann might have added that not only were contrary ideas not in circulation among the policy elite in Washington, they were not in circulation among the editorial elite in Times Square, where Remnick & Co. had also allowed the White House to set the terms of the debate. Despite occasional reservations about the White House's saber-rattling, The New Yorker remained a prisoner of the emerging Bush doctrine of pre-emptive warfare.

The articles that the magazine has run in growing numbers since 9/11 on the subject of terrorism and the Middle East have been equally skewed. Whenever The New Yorker uses the word "terror" or one of its cognates, for instance, it is almost always in an Arab or Muslim context. While a Nexis search turns up numerous references in the magazine to Palestinian, Egyptian and Pakistani terrorism since the Twin Towers attack, it turns up no references to US or Israeli terrorism or, for that matter, to terrorism on the part of Christians or Jews. A Nexis search over the same period reveals that the word "fundamentalism" appears almost always in an Islamic context as well. In this modern update of Saul Steinberg's "View of the World from Ninth Avenue," religious fanatics are mostly Muslim, occasionally Christian, but -- despite all those Uzi-toting settlers -- never, ever Jewish.

Examples of such one-sidedness range from the subtle to the egregious -- and, as is often the case, it is the former that are most interesting. In an article last September about Gershom Scholem, the famous scholar of Jewish mysticism, the novelist Cynthia Ozick concluded with an impassioned peroration on the subject of Scholem's twin religious and political obsessions: "In Kabbalistic symbolism, with its tragic intuition that the world is broken, that all things are not in their proper places, that God, too, is in exile, Scholem saw both a confirmation of the long travail of Jewish dispersion and its consolation: the hope of redemption. In short, he saw Zionism." Many of us have gotten so used to nationalist rhetoric of this sort that we no longer notice. But can anyone imagine The New Yorker celebrating Islam in such a fashion and winding up with an equally passionate embrace of Arab nationalism?

Two months after the September 11 attacks, The New Yorker published Bernard Lewis's 9,000-word essay, "The Revolt of Islam," which, while certainly timely, was replete with reckless generalizations about a "Muslim world" and "Muslim peoples" turning their backs on modernity. Lewis's description of Islamic fundamentalism as essentially an internal Muslim development was seriously misleading. As Eqbal Ahmad and Richard Barnet argued in The New Yorker -- the old New Yorker -- back in 1988, the United States did as much to fuel Wahhabist militancy as the Saudis themselves, by funding a fundamentalist mujahedeen army to drive the Soviets out of Afghanistan. Rather than solely Muslim in origin, the dramatic spread of Wahhabism since the 1970s is an ideological byproduct of a US-Saudi alliance based equally on anti-Communism and oil. By focusing narrowly on Islam's rejection of modernity, moreover, Lewis implied that the Judeo-Christian West's relationship to modernity is healthy and normal. One would never suspect that the President of the United States is a born-again Christian who believes that "on the issue of evolution, the verdict is still out on how God created the earth"; that the House majority leader (Tom DeLay) believes that "only Christianity offers a comprehensive worldview that covers all areas of life and thought, every aspect of creation"; or that the Attorney General (John Ashcroft) is such an extreme fundamentalist that he had himself biblically anointed before being sworn in as governor of Missouri. While Lewis would like us to believe that the fundamentalist rejection of modernity is the Islamic world's problem and not ours, in fact it is both.

As crude and reductive as this was, however, it was sophisticated stuff compared with the dispatches of staff writer Jeffrey Goldberg, who more than any other writer embodies the magazine's rightward drift vis-à-vis the Middle East. In early 2000, Mary Anne Weaver had argued that the idea of Osama bin Laden as an all-powerful Svengali in charge of a global array of terrorist organizations was overblown -- a judgment that still holds up despite the astonishing feat that he and his minions were able to pull off in September 2001. But, post-9/11, tempered arguments such as these would no longer do. Hence, the terrorism beat was handed over to Goldberg, who, although born in America, had served in the Israeli army during the first intifada and had gone on to write for The New Republic and The New York Times Magazine. With a seemingly unlimited travel budget, Goldberg set about unearthing terrorist conspiracies from South America to northern Iraq.

At its best, Goldberg's journalism (which, remarkably, recently earned him a National Magazine Award) is only slightly more blinkered than the magazine as a whole. In "A Letter From Cairo," which followed the assault on the World Trade Center by just three weeks, he described a growing epidemic of anti-Semitism among Egyptians outraged by America's blanket support for Israel. The report was not inaccurate, as far as it went. But with 32 percent of Americans believing that Arabs living in the United States should be placed under special surveillance, according to one post-9/11 poll, Goldberg might have noted that bigotry is hardly limited to one side. While it is certainly distressing that many Egyptians believe that Israel's security service, the Mossad, was somehow at the bottom of the Twin Towers attack, as Goldberg also reported, it is no less disturbing that, according to a recent New York Times/CBS News Poll, a majority of Americans believe Saddam Hussein was responsible, a fact The New Yorker has scarcely acknowledged.

But if Goldberg's articles are blinkered at best, at worst they are -- to put this as politely as possible -- examples of irresponsible fearmongering whose principal effect has been to fuel the White House war drive. Last spring Goldberg published a massive article about the Kurds of northern Iraq. Its description of the 1988 poison-gas attack on the city of Halabja was vivid and frightening, but it was only one item in a farrago of charges aimed at America's villain du jour. While all eyes in the West have been on the 5,000 deaths in Halabja, Goldberg argued that Saddam had gassed "perhaps" 200 villages, with up to 400,000 people poisoned in all. In the course of some 16,000 words, he went on to accuse Saddam of supplying Al Qaeda with chemical and biological weapons, of controlling the tiny fundamentalist group Ansar al-Islam, of attempting to build nuclear arms and of plotting to unleash chemical, biological and nuclear weapons on other countries.

Indeed, Goldberg suggested that Saddam was responsible for everything in the Kurdish region of northern Iraq from an epidemic of birth defects to an increase in the population of poisonous snakes. The number of villages that he said had been attacked with poison gas was more than triple the figure that a comprehensive study by Human Rights Watch had come up with several years earlier, yet the evidence Goldberg offered to back up his account was remarkably thin: eyewitness accounts more than a dozen years after the fact; the "expert" testimony of a lone English geneticist so alarmist that she carries bleach and an oversized plastic bag whenever she visits a shopping mall, to protect against chemical attack; plus various alarmist comments, cited without qualification, by pro-Administration hawks like Ahmad Chalabi, Kanan Makiya and James Woolsey. Goldberg's chief source for the supposed three-way mutual-aid pact among Baghdad, Al Qaeda and Ansar al-Islam was a 29-year-old accused drug runner languishing in a Kurdish prison, whose captors released him from his cell just long enough for him to fill Goldberg's ear with the latest scoop on the Iraqi leader's nefarious activities.

It was the sort of "reporting" that only hawks could love -- and they did. At a news conference a few days later, Bush told reporters: "Evidently, there's a new article in the New York magazine or New Yorker magazine -- some East Coast magazine -- and it details about his barbaric behavior toward his own people. And not only did he do it to his own people, he did it to people in his neighborhood. And this is a man who refuses to allow us to determine whether or not he still has weapons of mass destruction, which leads me to believe he does." Dick Cheney added shortly afterward on Meet the Press, "It's a devastating article.... It demonstrates conclusively what a lot of us have said: that this is a man who is a great danger to that region of the world -- especially if he's able to acquire nuclear weapons."

The article certainly generated "buzz," as Tina Brown might have put it. Never mind that, as Jason Burke pointed out in the London Observer, the search for a link between Al Qaeda and small but militant groups like Ansar al-Islam was likely to come up dry, since Islamic fundamentalism is a diverse, spontaneous and decentralized movement in which Al Qaeda is "nothing more than primus inter pares." The Bush Administration needed a link in order to keep its theory of a unified terrorist conspiracy going, and The New Yorker was happy to provide it.

In another article, Goldberg offered little more than the word of various unnamed South American "experts," "investigators" and "intelligence sources" for his claim that Hezbollah, Hamas and Al Qaeda have established a fundraising network among Lebanese traders dealing in black-market CDs and pharmaceuticals along the Paraguayan-Brazilian border. The piece provided important ammunition not only for those wishing to promote the idea of a centralized, global conspiracy but also for those neocons pushing for an extension of hostilities to Iran and Syria, Hezbollah's prime backers.

Finally, in early February, just as preparations for an Iraq invasion were entering their last stages and popular opposition was mounting worldwide, The New Yorker ran yet another Goldberg report on the alleged Baghdad/Al Qaeda connection, this time one that dispensed with unidentified sources and instead relied solely on the word of Defense Secretary Donald Rumsfeld, Director of Central Intelligence George Tenet, Under Secretary of Defense Douglas Feith and the ubiquitous James Woolsey. Goldberg dutifully jotted down all that they told him about the ties between Saddam and bin Laden, which, needless to say, were unmistakably close. Woolsey, for one, declared that, as Goldberg put it, "it is now illogical to doubt the notion that Saddam collaborates with Islamist terrorism." The reason: "Islamist terrorists" had been spotted taking instructions in airline hijacking at an Iraqi training camp known as Salman Pak. In fact, as Hersh revealed in The New Yorker in May, Salman Pak was being used to train counterterrorists to fight radical fundamentalists.

Goldberg's piece appeared a week after Remnick argued in "The Talk of the Town" that while there was "no indisputable evidence of [an Iraqi] connection with the Al Qaeda attacks on New York and Washington" and no "irrefutable evidence" concerning Iraqi weapons of mass destruction, such things didn't matter. War was still justified. Since "it is not...difficult to hide centrifuges or gallons of anthrax in a country that is larger than Germany," the mere possibility that Saddam had WMDs squirreled away was all that was needed to launch an invasion. "History will not easily excuse us," Remnick wrote, "if...we defer a reckoning with an aggressive totalitarian leader who intends not only to develop weapons of mass destruction but also to use them." Since Rumsfeld, Feith et al. had assured us that this is what Saddam intended to do, that was all we needed to know. The rest was superfluous. And if US troops now tramping through the rubble of Iraq have yet to turn up a single vial of anthrax or canister of poison gas, that is superfluous as well. Iraqi WMDs are yesterday's news. What concerns us today are the chemical weapons that the Bush Administration tells us are stashed away in Baathist Syria. The Administration says they exist, so they must be there...somewhere.

Should we care that "the voice of this magazine has been quite clearly aggressive in its support of the war," as Remnick told a Scottish newspaper last September? The answer, obviously, is yes. The New Yorker may be just one example of a magazine that has lost its bearings, but, given its journalistic track record, its massive circulation (nearly a million) and the remarkable hold it still has on a major chunk of the reading public, it's an unusually important one. Where once it used its institutional heft to help broaden American politics, now it is helping to narrow them. When The New Yorker runs a clever and amusing profile of a colorful character like the Slovenian social theorist Slavoj Zizek, as it recently did, the main purpose is to give an appearance of openness while assuring readers that such radical critics remain safely marginalized. Meanwhile, it seems highly unlikely that the magazine would publish articles by the likes of Hannah Arendt or Pauline Kael, hard-hitting intellectual warriors whose goal was to challenge conventional wisdom head-on. People like that couldn't have cared less about respectability. The idea that we should put aside all doubts and take people like Rumsfeld or Woolsey at their word would have left them incredulous.

But, then, irreverence, independence, intellectual daring -- such things have been suspended for the foreseeable future. We must swallow our skepticism and fall into line. Criticism must be constructive, which is to say it must not call into question the premises of the War on Terrorism, or the good intentions of those conducting it. One is reminded of the old Dwight Macdonald line about The New Yorker existing to make us "laugh and lie down," except for two things. Rather than passivity and enervation, the goal now is loyalty and mobilization. And as for making us laugh -- well, maybe it's the sour mood we find ourselves in nowadays, but The New Yorker no longer seems quite as funny.
