Tesla, Nissan, Google, and several other companies have declared that they will have commercial self-driving cars on the highways before the end of this decade. Experts at the Institute of Electrical and Electronics Engineers predict that 75 percent of cars will be self-driving by 2040. So far California, Nevada, Florida, Michigan, and the District of Columbia have passed laws explicitly legalizing self-driving vehicles, and many other states are looking to do so.
The coming era of autonomous autos raises concerns about legal liability and safety, but there are good reasons to believe that robot cars may outperform human drivers when it comes to practical and even ethical decision making.
More than 90 percent of all traffic accidents are the result of human error. In 2011, there were 5.3 million automobile crashes in the United States, resulting in more than 2.2 million injuries and 32,000 deaths. Americans spend $230 billion annually to cover the costs of accidents, accounting for approximately 2 to 3 percent of GDP.
Proponents of autonomous cars argue that they will be much safer than vehicles driven by distracted and error-prone humans. The longest-running safety tests have been conducted by Google, whose autonomous vehicles have traveled more than 700,000 miles so far with only one accident (when a human driver rear-ended the car). So far, so good.
Stanford University law professor Bryant Walker Smith, however, correctly observes that there are no engineered systems that are perfectly safe. Smith has roughly calculated that "Google's cars would need to drive themselves more than 725,000 representative miles without incident for us to say with 99 percent confidence that they crash less frequently than conventional cars." Given expected improvements in sensor technologies, algorithms, and computation, it seems likely that this safety benchmark will soon be met.
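One way to see roughly where a benchmark like that comes from is the standard zero-failure confidence bound. What follows is a back-of-the-envelope sketch assuming a Poisson crash model with a constant baseline rate; Smith's own calculation may differ in its details. If conventional cars crash at a rate of $\lambda$ crashes per mile, the probability that a fleet would log $m$ crash-free miles by luck alone is

$$P(\text{zero crashes in } m \text{ miles}) = e^{-\lambda m},$$

and demanding 99 percent confidence means requiring

$$e^{-\lambda m} \le 0.01 \;\Longrightarrow\; m \ge \frac{\ln 100}{\lambda} \approx \frac{4.6}{\lambda}.$$

The required mileage thus scales inversely with whatever baseline crash rate one assumes; under this simple model, a 725,000-mile benchmark corresponds to roughly one conventional crash per 157,000 miles (725,000 divided by 4.6).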
Still, all systems fail eventually. So who will be liable when a robot car, however rarely, crashes into someone?
An April 2014 report from the Brookings Institution, a good-government think tank, argues that the current liability system can handle the vast majority of claims that might arise from damages caused by self-driving cars. A similar April 2014 report from the free market Competitive Enterprise Institute (CEI) largely agrees: "Products liability is an area that may be able to sufficiently evolve through common law without statutory or administrative intervention."
A January 2014 RAND Corporation study suggests that one way to handle legal responsibility for accidents might be to extend a no-fault liability system, in which victims recover damages from their own auto insurers after a crash. Another RAND idea would be to legally establish an irrebuttable presumption of owner control over the autonomous vehicle. Legislation could require that "a single person be responsible for the control of the vehicle. This person could delegate that responsibility to the car, but would still be presumed to be in control of the vehicle in the case of a crash."
This would essentially leave the current liability system in place. And in cases where liability must be determined, the sensors embedded in self-driving cars, including cameras and radar, will provide a pretty comprehensive record of what happened during a crash.
Should we expect robot cars to be more ethical than human drivers? In a fascinating March 2014 Transportation Research Record study, Virginia Tech researcher Noah Goodall wonders about "Ethical Decision Making During Automated Vehicle Crashes." Goodall observes that engineers will necessarily install software in automated vehicles enabling them to "predict various crash trajectory alternatives and select a path with the lowest damage or likelihood of collision."
To illustrate the challenge, Stanford's Smith considers a case in which you are driving on a narrow mountain road between two big trucks. "Suddenly, the brakes on the truck behind you fail, and it rapidly gains speed," he imagines. "If you stay in your lane, you will be crushed between the trucks. If you veer to the right, you will go off a cliff. If you veer to the left, you will strike a motorcyclist. What do you do? In short, who dies?"
Fortunately such fraught situations are rare. Although it may not be the moral thing to do, most drivers will react in ways that they hope will protect themselves and their passengers. So as a first approximation, autonomous vehicles should be programmed to choose actions that aim to protect their occupants.
Once the superior safety of driverless cars is established, they will dramatically change the shape of cities and the ways in which people live and work.
Roadway engineers estimate that typical highways now accommodate a maximum throughput of 2,200 human-driven vehicles per lane per hour, utilizing only about 5 percent of roadway capacity. Because self-driving cars would be safer and could thus drive closer and faster, switching to mostly self-driving cars would dramatically increase roadway throughput. One estimate by the University of South Florida's Center for Urban Transportation Research in November 2013 predicts that a 50 percent autonomous road fleet would boost highway capacity by 22 percent; an 80 percent robot fleet would goose capacity by 50 percent; and a fully automated highway would see its throughput zoom by 80 percent.
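To get a feel for why tighter spacing means more throughput, consider a rough headway calculation (an illustration of my own, not the Center's model). If a lane's capacity is limited mainly by the time gap $h$ between successive vehicles, then throughput is about

$$q \approx \frac{3600 \text{ seconds per hour}}{h} \text{ vehicles per lane per hour.}$$

A human-typical headway of about 1.6 seconds gives roughly 2,250 vehicles per hour, close to the 2,200 figure above, while robot cars that could safely follow at, say, 0.8 seconds would double that. Mixed fleets gain less, presumably because a platoon can be packed tightly only between consecutive automated vehicles.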
Autonomous vehicles would also likely shift the way people think about car ownership. Currently most automobiles sit idle most of the day in driveways or parking lots as their owners go about their lives. Truly autonomous vehicles could be on the road much more of the time, essentially providing taxi service to users who summon them to their locations via mobile devices. Once riders are done with the cars, the vehicles can be dismissed to serve other patrons. Self-driving cars will also increase the mobility of the disabled, the elderly, and those too young to drive.
Researchers at the University of Texas, devising a realistic simulation of vehicle usage in cities that takes into account issues such as congestion and rush hour patterns, found that if all cars were driverless, each shared autonomous vehicle could replace 11 conventional cars. In their simulations, riders waited an average of 18 seconds for a driverless vehicle to show up, and each vehicle served 31 to 41 travelers per day. Less than one half of one percent of travelers waited more than five minutes for a ride.
By one estimate in a 2013 study from Columbia University's Earth Institute, shared autonomous vehicles would cut an individual's average cost of travel by as much as 75 percent compared to now. There are some 600 million parking spaces in American cities, occupying about 10 percent of urban land. In addition, 30 percent of city congestion originates from drivers seeking parking spaces close to their destinations. A fleet of shared driverless cars would free up lots of valuable urban land while at the same time reducing congestion on city streets. During low demand periods, vehicles would go to central locations for refueling and cleaning.
Since driving will be cheaper and more convenient, demand for travel will surely increase. People who can work while they commute might be willing to live even farther out from city centers. But more vehicle miles traveled would not necessarily translate into more fuel burned. For example, safer autonomous vehicles could be built much lighter than conventional vehicles and thus consume less fuel. Smoother acceleration and deceleration would reduce fuel consumption by up to 10 percent. Optimized autonomous vehicles could cut both the fuel used and pollutants emitted per mile. And poor countries could "leapfrog" to autonomous vehicles instead of embracing the personal ownership model of the 20th century West.
If driverless cars are in fact safer, every day of delay imposes a huge cost. People a generation hence will marvel at the carnage we inflicted as we hurtled down highways relying on just our own reflexes to keep us safe.
Do Some Circus and Zoo Animals Dream of Freedom and Revenge Against Their Masters? One Author Says Yes
Reviewed: Fear of the Animal Planet: The Hidden History of Animal Resistance, by Jason Hribal, CounterPunch/AK Press, 153 pages, $15.95
Jason Hribal’s Fear of the Animal Planet: The Hidden History of Animal Resistance will be ignored, dismissed, and mocked. Published by the tiny and idiosyncratic AK Press and written by an obscure semi-academic, it proposes an argument that will make anyone other than the fiercest PETA activists smirk. Yet it should be required reading for all social scientists and political activists, because it perfectly demonstrates a central and enduring problem of modern left-wing political discourse: the tendency to speak on behalf of those who have not spoken.
Hribal argues that for more than two centuries, animals in zoos, circuses, and marine amusement parks not only have been “oppressed” and “exploited” but have been conscious of their oppression and exploitation, waging an intentional “struggle” for “control of production,” “autonomy,” “revenge,” and the “dream of freedom.”
It’s not difficult to dismiss Hribal as a Marxist Doctor Dolittle or his social science as cartoonish. After all, he ascribes political consciousness to creatures whose thoughts cannot be known. But his claims of knowing the thoughts of animals are no more arrogant or absurd than the claims countless academics and activists continue to make about the consciousness of people whose ideas are also inaccessible.
Most of Fear of the Animal Planet is an impressively thorough catalog of animals refusing to perform tricks, escaping their cages and enclosures, and attacking handlers and audiences. Unlike his descriptions of animals’ minds, which of course are impossible to substantiate, Hribal’s accounts of their behavior are supported with verifiable evidence (usually multiple eyewitness accounts). They certainly show that many animals have not done what they were trained to do.
We learn about the unruly behavior of Jumbo, a 19th-century African bush elephant who was the first animal celebrity. At the London Zoo in Regent’s Park, Jumbo frequently rammed the iron doors of his exhibition cage and slammed his trainer to the floor. After he was sold to P.T. Barnum’s circus in the United States, for several weeks Jumbo refused to enter the shipping container, despite numerous proddings and stabbings by trainers. These events are well-documented and difficult to dispute.
But in determining the meaning of events, Hribal, like most of the Marxist scholars who inspired him, becomes a ventriloquist. Jumbo “did not see himself as a machine,” Hribal writes, and “resistance was his new thought.” Another unruly circus elephant, named Janet, “hated” her trainers. Mary and Tory were not just pachyderms that walked out of a circus ring; they were “two disgruntled employees.” Writing as if he had read Tyke the elephant’s manifesto, Hribal claims this most infamous of circus animals crushed her trainer to death during a performance in Honolulu because she “was tired of being leased out to circuses and carnivals,” “sick of the dismal and dangerous working conditions,” and “through with the untreated injuries and wounds and the lack of basic healthcare.”
Not only does Hribal ascribe specific ideas to his elephant “rebels” but, like communitarians who speak on behalf of masses of people, he also casts them as a part of a collective, global, trans-historical consciousness. The behaviors of Jumbo in London in 1882, Janet in Florida in 1992, Tyke in Hawaii in 1994, and Mary and Tory in Wisconsin in 2002 were all “part of a larger struggle against oppression and exploitation.”
According to Hribal, the “larger struggle” crosses not just time and space but also species. Like their pachyderm comrades, the monkeys and apes that escape from zoos “know what freedom is and they want it.” Likewise, the sea lions, dolphins, and orcas in marine amusement parks share this “dream of freedom.” Their occasional refusals to obey commands from trainers are “strikes” that are part of “the battle over the control of production.” The Sea World orca named Tilikum who in 2010 dragged his trainer to the bottom of a tank and held her there until she drowned was actually presenting “a clear, pronounced demonstration of his dislike of captivity and all that it entails: from the absence of autonomy to the exploitative relations to the ever-increasing work-load.” Marine biologists do believe that orcas communicate with one another, but I doubt that any scientist has heard revolutionary jargon in their clicks and whistles.
Hribal does not merely make his finned and four-legged insurrectionists speak his political language. He has them stand in for all of their caged brethren, the vast majority of whom never tried to escape their confines or stomp a keeper. In this way, again, he is no different from many historians of human beings.
In 1988 the literary theorist Gayatri Spivak published an essay criticizing a new movement among scholars of South Asia known as “subaltern studies.” The movement was an attempt to replace colonialist histories of the subcontinent with “history from below,” an enterprise that had been under way among left-wing historians in Great Britain and the United States since the 1960s. What Spivak found in this new approach to South Asian history I find in the “new social history” of the United States: a widespread but largely unconscious effort to place explicitly political and collectivist ideas in the minds of historical subjects who left no record of their thoughts. Spivak argued that the objective of university academics to “establish true knowledge of the subaltern and its consciousness” was essentially a new form of imperialism —an attempt to remake the world in the image of oneself.
Try this for an exercise: Open any book written in the last 40 years on African-American history, women’s history, or labor history and count the number of times Hribal’s terms describing the consciousness of animals are used to describe the consciousness of people. Then look for evidence that the people themselves used those terms. Most often, you will find the self-appointed leaders of the “oppressed” and “exploited”—abolitionists, feminists, union leaders, civil rights leaders, and political radicals—standing in for their constituents and speaking the language that left-wing historians want to hear.
It is not a defense of slavery, segregation, the denial of rights to women, or poverty to acknowledge the fact that, according to the available evidence, only a tiny portion of their alleged victims clearly thought of themselves that way. Few historians mention that a majority of the ex-slaves who were interviewed held positive views of their days on the plantation (including those who were interviewed by African Americans) or, more important, that more than 99 percent of American slaves left not a single record of their thoughts. The implication of Spivak’s argument, which was applied to similar treatments of Indian peasants, is that to claim a status for all slaves as “victimized” or “oppressed” is to homogenize the attitudes, behaviors, and cultures of millions of people and to make them one’s sock puppet. Similarly, the total African-American participation in the organized civil rights movement of the 1950s and ’60s equaled roughly 1 percent of the total African-American population of the time. We also know that many African Americans, most notably black nationalists, attacked civil rights leaders for being sell-out “Uncle Toms” and cultural assimilationists. Yet in our textbooks Martin Luther King Jr. is presented as the voice of all 20 million black people alive during his lifetime.
The most egregious political ventriloquism can be found in U.S. labor history, where the socialists and social democrats who took control of some unions are used by historians to present the American working class as having a long tradition of collectivist aspirations. My 2001 book on Jimmy Hoffa, Out of the Jungle, was the first to note that anti-socialist, strictly bread-and-butter unions like the Teamsters dwarfed the combined membership of the socialist-led unions beloved by New Left labor historians.
And the views of how many women have been represented by feminist discourse since its origins in the 19th century? Jason Hribal does to dolphins what Hillary Clinton is doing to the women of Afghanistan, but with a far more consequential intention than the razing of Sea World. Clinton and a large swath of feminists are justifying the military occupation of Afghanistan by claiming that Afghan women are current or potential “victims” of Sharia law and the Taliban. Yet only a small fraction of Afghan women have been asked in polls whether they agree with this assessment of their own lives; a majority who have been asked endorse Sharia law, and a significant percentage even endorse the return to power of the Taliban. If we liberate the women of Afghanistan, we will do so against the wishes of many of the liberated.
Speaking for the subaltern is not exclusively a practice of the left. Recently two fetuses testified against reproductive rights during a hearing of the Ohio state legislature. Lying on gurneys in the hearing room, two pregnant women were scanned by an ultrasound machine as a video monitor broadcast the images and sounds of their fetuses’ beating hearts. The fetuses were there to contribute their unwitting support to the “heartbeat bill,” which would ban abortions in Ohio as soon as a heartbeat could be detected, except in medical emergencies.
So let us use the apparent absurdities of this book to question our own equally absurd but also imperialistic claims about the beliefs and aspirations of those we do not know.
Of the 27 amendments to the U.S. Constitution, the 18th is the only one explicitly aimed at restricting people’s freedom. It is also the only one that has ever been repealed. Maybe that’s encouraging, especially for those of us who recognize the parallels between that amendment, which ushered in the nationwide prohibition of alcohol, and current bans on other drugs.
But given the manifest failure and unpleasant side effects of Prohibition, its elimination after 14 years is not terribly surprising, despite the arduous process required to undo a constitutional amendment. The real puzzle, as the journalist Daniel Okrent argues in his masterful new history of the period, is how a nation that never had a teetotaling majority, let alone one committed to forcibly imposing its lifestyle on others, embarked upon such a doomed experiment to begin with. How did a country consisting mostly of drinkers agree to forbid drinking?
The short answer is that it didn’t. As a reveler accurately protests during a Treasury Department raid on a private banquet in the HBO series Boardwalk Empire, neither the 18th Amendment nor the Volstead Act, which implemented it, prohibited mere possession or consumption of alcohol. The amendment took effect a full year after ratification, and those who could afford it were free in the meantime to stock up on wine and liquor, which they were permitted to consume until the supplies ran out. The law also included exceptions that were important for those without well-stocked wine cellars or the means to buy the entire inventory of a liquor store (as the actress Mary Pickford did). Home production of cider, beer, and wine was permitted, as was commercial production of alcohol for religious, medicinal, and industrial use (three loopholes that were widely abused). In these respects Prohibition was much less onerous than our current drug laws. Indeed, the legal situation was akin to what today would be called “decriminalization” or even a form of “legalization.”
After Prohibition took effect, Okrent shows, attempts to punish bootleggers with anything more than a slap on the wrist provoked public outrage and invited jury nullification. One can imagine what would have happened if the Anti-Saloon League and the Woman’s Christian Temperance Union had demanded a legal regime in which possessing, say, five milliliters of whiskey triggered a mandatory five-year prison sentence (as possessing five grams of crack cocaine did until recently). The lack of penalties for consumption helped reassure drinkers who voted for Prohibition as legislators and supported it (or did not vigorously resist it) as citizens. Some of these “dry wets” sincerely believed that the barriers to drinking erected by Prohibition, while unnecessary for moderate imbibers like themselves, would save working-class saloon patrons from their own excesses. Pauline Morton Sabin, the well-heeled, martini-drinking Republican activist who went from supporting the 18th Amendment to heading the Women’s Organization for National Prohibition Reform, one of the most influential pro-repeal groups, apparently had such an attitude.
In addition to paternalism, the longstanding American ambivalence toward pleasure in general and alcohol-fueled pleasure in particular helped pave the way to Prohibition. The Puritans were not dour teetotalers, but they were anxious about excess, and a similar discomfort may have discouraged drinkers from actively resisting dry demands. But by far the most important factor, Okrent persuasively argues, was the political maneuvering of the Anti-Saloon League (ASL) and its master strategist, Wayne Wheeler, who turned a minority position into the supreme law of the land by mobilizing a highly motivated bloc of swing voters.
Defining itself as “the Church in Action Against the Saloon,” the clergy-led ASL reached dry sympathizers through churches (mostly Methodist and Baptist) across the country. Okrent says the group typically could deliver something like 10 percent of voters to whichever candidate sounded driest (regardless of his private behavior). This power was enough to change the outcome of elections, putting the fear of the ASL, which Okrent calls “the mightiest pressure group in the nation’s history,” into the state and federal legislators who would vote to approve the 18th Amendment. That doesn’t mean none of the legislators who voted dry were sincere; many of them—including Richmond Hobson of Alabama and Morris Sheppard of Texas, the 18th Amendment’s chief sponsors in the House and Senate, respectively—were deadly serious about reforming their fellow citizens by regulating their liquid diets. But even the most ardent drys depended on ASL-energized supporters for their political survival.
The ASL strategy worked because wet voters did not have the same passion and unity, while the affected business interests feuded among themselves until the day their industry was abolished. Americans who objected to Prohibition generally did not feel strongly enough to make that issue decisive in their choice of candidates, although they did make themselves heard when the issue itself was put to a vote. Californians, for example, defeated four successive ballot measures that would have established statewide prohibition before their legislature approved the 18th Amendment in 1919.
As Prohibition wore on, its unintended consequences provided the fire that wets had lacked before it was enacted. They were appalled by rampant corruption, black market violence, newly empowered criminals, invasions of privacy, and deaths linked to alcohol poisoned under government order to discourage diversion (a policy that Sen. Edward Edwards of New Jersey denounced as “legalized murder”). These burdens seemed all the more intolerable because Prohibition was so conspicuously ineffective. As a common saying of the time put it, the drys had their law and the wets had their liquor, thanks to myriad quasi-legal and illicit businesses that Okrent colorfully describes.
Entrepreneurs taking advantage of legal loopholes included operators of “booze cruises” to international waters, travel agents selling trips to Cuba (which became a popular tourist destination on the strength of its proximity and wetness), “medicinal” alcohol distributors whose brochures (“for physician permittees only”) resembled bar menus, priests and rabbis who obtained allegedly sacramental wine for their congregations (which grew dramatically after Prohibition was enacted), breweries that turned to selling “malt syrup” for home beer production, vintners who delivered fermentable juice directly into San Francisco cellars through chutes connected to grape-crushing trucks, and the marketers of the Vino-Sano Grape Brick, which “came in a printed wrapper instructing the purchaser to add water to make grape juice, but to be sure not to add yeast or sugar, or leave it in a dark place, or let it sit too long before drinking it because ‘it might ferment and become wine.’” The outright lawbreakers included speakeasy proprietors such as the Stork Club’s Sherman Billingsley, gangsters such as Al Capone, rum runners such as Bill McCoy, and big-time bootleggers such as Sam Bronfman, the Canadian distiller who made a fortune shipping illicit liquor to thirsty Americans under the cover of false paperwork. Their stories, as related by Okrent, are illuminating as well as engaging, vividly showing how prohibition warps everything it touches, transforming ordinary business transactions into tales of intrigue.
The plain fact that the government could not stop the flow of booze, but merely divert it into new channels at great cost, led disillusioned drys to join angry wets in a coalition that achieved an unprecedented and never-repeated feat. As late as 1930, just three years before repeal, Morris Sheppard confidently asserted, “There is as much chance of repealing the Eighteenth Amendment as there is for a hummingbird to fly to the planet Mars with the Washington Monument tied to its tail.”
That hummingbird was lifted partly by a rising tide of wet immigrants and urbanites. During the first few decades of the 20th century, the country became steadily less rural and less WASPy, a trend that ultimately made Prohibition democratically unsustainable. Understanding this demographic reality, dry members of Congress desperately delayed the constitutionally required reapportionment of legislative districts for nearly a decade after the 1920 census. “The dry refusal to allow Congress to recalculate state-by-state representation in the House during the 1920s is one of those political maneuvers in American history so audacious it’s hard to believe it happened,” Okrent writes. “The episode is all the more remarkable for never having established itself in the national consciousness.”
Other Prohibition-driven assaults on the Constitution are likewise little remembered today. In 1922 the Supreme Court reinforced a dangerous exception to the Fifth Amendment’s Double Jeopardy Clause by declaring that the “dual sovereignty” doctrine allowed prosecution of Prohibition violators in both state and federal courts for the same offense. In 1927 the Court ruled that requiring a bootlegger to declare his illegal earnings for tax purposes did not violate the Fifth Amendment’s guarantee against compelled self-incrimination. And “in twenty separate cases between 1920 and 1933,” Okrent notes, the Court carried out “a broad-strokes rewriting” of the case law concerning the Fourth Amendment’s prohibition of “unreasonable searches and seizures.” Among other things, the Court declared that a warrant was not needed to search a car suspected of carrying contraband liquor or to eavesdrop on telephone conversations between bootleggers (a precedent that was not overturned until 1967). Because of Prohibition’s demands, Okrent writes, “long-honored restraints on police authority soon gave way.”
That tendency has a familiar ring to anyone who follows Supreme Court cases growing out of the war on drugs, which have steadily whittled away at the Fourth Amendment during the last few decades. But unlike today, the incursions required to enforce Prohibition elicited widespread dismay. Here is how The New York Times summarized the Anti-Saloon League’s response to the wiretap decision: “It is feared by the dry forces that Prohibition will fall into ‘disrepute’ and suffer ‘irreparable harm’ if the American public concludes that ‘universal snooping’ is favored for enforcing the Eighteenth Amendment.”
The fear of a popular backlash was well-founded. From the beginning, Prohibition was resisted in the wetter provinces of America, where the authorities often declined to enforce it. Maryland never passed its own version of the Volstead Act, while New York repealed its alcohol prohibition law in 1923. Eleven other states eliminated their statutes by referendum in November 1932, months before Congress presented the 21st Amendment (which repealed the 18th) and more than a year before it was ratified.
This history of noncooperation is instructive in considering an argument that was often made by opponents of Proposition 19, the marijuana legalization initiative that California voters rejected in November. The measure’s detractors claimed legalizing marijuana at the state level would run afoul of the Supremacy Clause, which says “this Constitution, and the laws of the United States which shall be made in pursuance thereof…shall be the supreme law of the land.” Yet even under a prohibition system that, unlike the current one, was explicitly authorized by the Constitution, states had no obligation to ban what Congress banned or punish what Congress punished. In fact, state and local resistance to alcohol prohibition led the way to national repeal.
That precedent, while encouraging to antiprohibitionists who hope that federalism can help end the war on drugs, should be viewed with caution. For one thing, federalism isn’t what it used to be. Alcohol prohibition was enacted and repealed before the Supreme Court transformed the Commerce Clause into an all-purpose license to meddle, when it was taken for granted that the federal government could not ban an intoxicant unless the Constitution was amended to provide such a power. While the feds may not have the resources to wage the war on drugs without state assistance, under existing precedents they clearly have the legal authority to try.
Another barrier to emulating the antiprohibitionists of the 1920s is that none of the currently banned drugs is (or ever was) as widely consumed in this country as alcohol. That fact is crucial in understanding the contrast between the outrage that led to the repeal of alcohol prohibition and Americans’ general indifference to the damage done by the war on drugs today. The illegal drug that comes closest to alcohol in popularity is marijuana, which survey data indicate most Americans born after World War II have at least tried. That experience is reflected in rising public support for legalizing marijuana, which hit a record 46 percent in a nationwide Gallup poll conducted the week before Proposition 19 was defeated.
A third problem for today’s antiprohibitionists is the deep roots of the status quo. Alcohol prohibition came and went in 14 years, which made it easy to distinguish between the bad effects of drinking and the bad effects of trying to stop it. By contrast, the government has been waging war on cocaine and opiates since 1914 and on marijuana since 1937 (initially under the guise of enforcing revenue measures). Few people living today have clear memories of a different legal regime. That is one reason why histories like Okrent’s, which bring to life a period when booze was banned but pot was not, are so valuable.
Reflecting on the long-term impact of the vain attempt to get between Americans and their liquor, Okrent writes: “In 1920 could anyone have believed that the Eighteenth Amendment, ostensibly addressing the single subject of intoxicating beverages, would set off an avalanche of change in areas as diverse as international trade, speedboat design, tourism practices, soft-drink marketing, and the English language itself? Or that it would provoke the establishment of the first nationwide criminal syndicate, the idea of home dinner parties, the deep engagement of women in political issues other than suffrage, and the creation of Las Vegas?” Nearly a century after the war on other drugs was launched, Americans are only beginning to recognize its far-reaching consequences, most of which are considerably less fun than a dinner party or a trip to Vegas.
All along Hurricane Katrina's Evacuation Belt, in cities from Houston to Baton Rouge to Leesville, Louisiana, the exact same rumors are spreading faster than red ants at a picnic. The refugees from the United States' worst-ever natural disaster, it is repeatedly said, are bringing with them the worst of New Orleans' now-notorious lawlessness: looting, armed carjacking, and even the rape of children.
"By Thursday," the Chicago Tribune's Howard Witt reported, "local TV and radio stations in Baton Rouge...were breezily passing along reports of cars being hijacked at gunpoint by New Orleans refugees, riots breaking out in the shelters set up in Baton Rouge to house the displaced, and guns and knives being seized."
The only problem--none of the reports were true.
"The police, for example, confiscated a single knife from a refugee in one Baton Rouge shelter," Witt reported. "There were no riots in Baton Rouge. There were no armed hordes." Yet the panic was enough for Baton Rouge Mayor-President Kip Holden to impose a curfew on the city's largest shelter, and to warn darkly about "New Orleans thugs."
Even before evacuees could get comfy in Houston's Astrodome, rumors were flying that the refugees had already raped their first victim, just like that 7-year-old in the Superdome, or the babies in the Convention Center who got their throats slit. Not only was the Astrodome rape invented out of whole cloth; so, perhaps, was the case reported 'round the globe of at least one prepubescent being raped and murdered in New Orleans' iconic sports arena.
"We don't have any substantiated rapes," New Orleans Police superintendent Edwin Compass said Monday, according to the Guardian. "We will investigate if the individuals come forward." The British paper further pointed out that, "While many claim they happened, no witnesses, survivors or survivors' relatives have come forward. Nor has the source for the story of the murdered babies, or indeed their bodies, been found. And while the floor of the convention center toilets were indeed covered in excrement, the Guardian found no corpses."
As Katrina wiped out New Orleans' communications infrastructure, and while key federal officials repeatedly expressed less knowledge than cable television reporters, panicky rumors quickly rushed in to fill the void. Many of them have shared the exact same theme--unspeakable urban ultra-violence, perpetrated by the overwhelmingly black population.
St. Tammany Parish President Kevin Davis issued a statement Monday that "Rumors are flying and being repeated occasionally in the media that describe supposed criminal actions in St. Tammany Parish. These rumors are NOT true." Police superintendent Compass had to fend off accusations that his beleaguered force "stood by while women were raped and people were beaten."
The truth, whatever it may be, is clearly horrific enough, with just about every eyewitness account from New Orleans mentioning the palpable menace from crazed gangs of looters and ne'er-do-wells, especially after nightfall. Compass himself told reporters on Thursday that 88 of his cops were beaten back into a retreat by angry Convention Center refugees, forcing Mayor Ray Nagin to suspend rescue operations in favor of restoring a semblance of order.
But the lies matter too. If federal government officials can't even get their ass-covering justifications straight, let alone such non-trivial, easy-to-discern matters as whether there are indeed thousands of water-deprived refugees massed at a Convention Center, those stranded near the epicenter will likely be starved for information that could literally save their lives.
"Complaints are still rampant in New Orleans about a lack of information," NBC Anchor Brian Williams wrote on his weblog, echoing one of the most familiar complaints from the city.
"It's one of many running themes of the past week: There were no announcements in the Superdome during the storm, none to direct people after the storm, no official word (via bullhorn, leaflets or any other means) during the week-long, on-foot migration (and eventual stagnation) that defined life in the downtown section of the city for those first few days. One can't help but think that a single-engine plane towing a banner over the city would have been immeasurably helpful in both crowd and rumor control."And it's entirely possible that, like the chimeric Baton Rouge hordes, exaggerations about New Orleans' criminality affected policy, mostly by delaying rescue operations and the provision of aid. Relief efforts ground to a halt last week after reports circulated of looters shooting at helicopters, yet none of the hundreds of articles I read on the subject contained a single first-hand confirmation from a pilot or eyewitness. The suspension-triggering attack--on a military Chinook attempting to evacuate refugees from the Superdome--was contested by Federal Aviation Administration spokeswoman Laura Brown, who told ABC News, "We're controlling every single aircraft in that airspace and none of them reported being fired on." What's more, when asked about the attacks, Department of Homeland Security Secretary Michael Chertoff replied: "I haven't actually received a confirmed report of someone firing on a helicopter."
I don't begrudge any helicopter pilot erring on the side of caution; the vehicle is dangerous enough without a razor-thin margin for error. But a razor-thin margin is precisely what the wretched residents of New Orleans have had for nearly 10 days now, and too many of them have already succumbed. Incoming National Guard troops, steeled for urban warfare, have been surprised to instead encounter mostly docile and relieved stragglers.
Try as we might, it's almost impossible to avoid seeing any major event through the lens of our own prejudices and worldview. France-bashers were ready to slam Paris for being stingy about hurricane aid even before, you know, actually checking to see whether it was true (it wasn't). My prior antipathy toward the Department of Homeland Security has now hardened into something approaching activism. As we cast about for blame to lay, and lessons to learn for the next catastrophe, it's worth asking whether our haste to confirm our suspicions by believing the worst prevented us from doing our best.
I recently went out for a bout of activist carousing with Ban the Ban, a group opposed to the District of Columbia's proposed ban on smoking in bars and restaurants. I had expected to see plenty of heated arguments about the merits of the ban between smokers and non-smokers, and I did. I had not expected to see non-smokers who attacked the ban on principle locked in debate with smokers who, between languorous puffs and grey exhalations, welcomed it as a means of reducing their own smoking.
If the argument--one I heard more than once from D.C. barflies--sounds strange, it is not, at any rate, rare. When New York City was mulling its own smoking ban, one young "man on the street" interviewee told the Village Voice: "I'd actually be all for it, which is odd since I am a smoker myself. I think it might make me smoke less. The increase in the cost of a pack of cigarettes hasn't stopped me from smoking. I just have friends who come up to visit from Florida bring cartons for me."
If we ignore for a moment the morality of endorsing a public restriction as a means to a personal self-help project, this is in one sense a perfectly ordinary thought. We are all, sometimes, afflicted with akrasia, those attacks of weak will that lead us to satisfy fleeting desires at the expense of our own acknowledged long-term interests.
Like Ulysses lashed to the mast, we empty the pantry of sweets, hire pricey personal trainers, join rehab groups, or loudly announce an intention to start working on that novel, knowing how embarrassed we'll feel if there's no progress to report when a friend asks how it's coming. Markets duly respond to our demand for self-restraint: Virgin Mobile recently introduced an anti-drunk dialing feature that allows users embarking on a pub crawl to block themselves from calling up that ex until the following morning.
There may even be ways for government to help us combat akrasia without overly restricting our freedoms. In his recent book The Ethics of Identity, philosopher Kwame Anthony Appiah offers (as a thought experiment more than a serious policy proposal) the example of the "self management card." When we go shopping for smokes or fatty foods or alcohol or a dose of heroin, Appiah imagines, the store is required to swipe our cards to ensure we haven't gone over a self-imposed limit, set by logging on to a special website set up for that purpose. An actual card of that sort would, of course, be a privacy nightmare, but it shows that attempts to help people make sound decisions need not be paternalistic.
Normal and necessary as these akrasia-countering mechanisms may be, though, they may also be symptoms of what Nobel laureate economist James Buchanan has dubbed "parentalism." Buchanan's term is not to be confused with paternalism, the familiar idea that sometimes people--other people--need to be restrained from making poor choices for their own protection. (In some cases, as with children or the severely mentally handicapped, this may well be right.) Parentalism is in a sense more insidious: It emerges when we begin to suspect that we ourselves are not competent to make our own choices, to yearn for someone to relieve us of the burden of choice. As Buchanan puts it:
[Economists and political theorists] have assumed that, other things being equal, persons want to be at liberty to make their own choices, to be free from coercion by others, including indirect coercion through means of persuasion. They have failed to emphasize sufficiently, and to examine the implications of, the fact that liberty carries with it responsibility. And it seems evident that many persons do not want to shoulder the final responsibility for their own actions... [They] want to be told what to do and when to do it; they seek order rather than uncertainty, and order comes at an opportunity cost they seem willing to bear.

The thought is not novel to Buchanan. Jean-Paul Sartre described the "anguish" that comes with our realization that we are "condemned to be free." Marxist psychologist Erich Fromm diagnosed the totalitarian movements of the 20th century as symptoms of an urge to "escape from freedom," from the displacement of a feudal world in which identities were given--a place for everyone, and everyone in his place--with a capitalist order that made who we were and what we were to become seem dizzyingly contingent.
How much more true is that when the lodestones by which we navigated that sea of choices--religious communities, or localities with their own longstanding mores--are themselves objects of choice on the market, in an increasingly interconnected and mobile world that arrays communities and faiths before us like so many cans of soup on a Whole Foods shelf.
Contemporary theorists of choice paralysis sometimes talk as though the problem with abundant freedom of choice were merely that the cognitive demands of navigating modern markets' plenitude are uncomfortably high. Yet if that were so, then adaptive mechanisms to filter our choices--and, as described above, winnow out some of the tempting but destructive ones--would be the simple solution.
For the true parentalist, though, this will be unsatisfying, for the true parentalist wants to escape not just the burdens of the act of choosing, but the responsibility for making a poor choice. Voluntary market mechanisms for filtering or restraining choice will always, ultimately, have an escape clause: We can fire the personal trainer or tell our friends we've changed our minds about that diet or quitting smoking after all. And, in the final analysis, they allow us only to defer responsibility, not avoid it. The expert I consulted may have given me bad advice, yet I may still blame myself for a poor choice of experts.
There are plenty of practical problems with the parentalist impulse. As economist Glen Whitman notes in a forthcoming Cato Institute paper, we cannot assume we always help people by giving preference to their "long term" over their "short term" interests. Imagine an aging man in ill health lamenting his sybaritic youth. We are tempted to say that his younger self, seeing the pleasures immediately available to him and giving short shrift to their long term consequences, exhibited a foolish bias toward the present. But surely it's also possible that his older self, faced with the proximate pains and inconveniences of poor health, discounts the pleasures past he'd have forsaken had he been more health-conscious. If we're prone to the first form of cognitive bias, why not the second?
Whitman also argues that, just as simple Pigovian taxes on pollution may be less efficient than allowing market negotiation to determine how much pollution will be produced in what location, sin taxes, smoking bans, and other parentalist attempts to spare our future selves the costs of our present choices may displace a rich variety of mechanisms for self-restraint that would match the rich variety of risk profiles and time-discount rates we find among members of a pluralistic society.
And as the young man interviewed by the Village Voice demonstrated, we can be ingenious at outwitting imposed restraints--even those we welcome in principle. We may find ourselves running up bigger credit card bills to buy more sin-taxed Twinkies and cigarettes, or traveling inconvenient distances to find a smoke-friendly bar.
But perhaps a more important problem with parentalism is that it licenses what Sartre called "bad faith," the attempt to avoid the burdens of responsibility by denying our own freedom. Classical liberals may even inadvertently encourage this by speaking of responsibility as "the other side" of freedom, as though it were the spinach that had to be cleared away before getting to dessert. But is that really so?
When we make trivial choices--what to have for dinner, what movie to see, which CD to buy--what we most value is the freedom to select without constraint from many options. Yet when it comes to our most central choices--what kind of person am I to be, what work will I find rewarding?--we may take at least as much satisfaction in the feeling of responsibility for our choices, in knowing that we have shaped a life that is ours even when we have chosen badly.
Classical liberals have become good at explaining how the market order they favor promotes freedom and happiness. They have been less adept at explaining why--at least past a certain point--people ought to want that freedom, which when genuine is always at least a little frightening. In the face of the parentalist impulse, we may need to develop the case that our bad choices, the choices that make us unhappy, are as vital and precious as the ones that bring us joy.
Never mind the vomiting. For members of O Centro Espirita Beneficente Uniao do Vegetal, drinking ayahuasca, a foul-tasting psychedelic tea brewed from two Amazonian plants, involves four hours of recitation, chanting, questions and answers, and religious instruction.
That may help explain why the church has only 130 or so followers in the U.S., despite the drug trips at the center of its rituals. But the federal government does not want to take the chance that Uniao do Vegetal, a synthesis of Christianity and indigenous South American beliefs that originated in Brazil, will do for ayahuasca what Timothy Leary did for LSD.
So in 1999, after intercepting a shipment of ayahuasca extract bound for Uniao do Vegetal's U.S. headquarters in Santa Fe, customs agents searched the home of the group's president, Jeffrey Bronfman, and seized 30 gallons of the tea. In a case the U.S. Supreme Court recently agreed to hear, the group's members are demanding that the government stop harassing them and start respecting their religious practices.
The Customs Service and the Drug Enforcement Administration say ayahuasca is illegal because it contains dimethyltryptamine (DMT), which is banned by the Controlled Substances Act. Uniao do Vegetal members say their use of ayahuasca is protected by the Religious Freedom Restoration Act (RFRA), which prohibits the government from imposing a "substantial burden" on the free exercise of religion unless it is "the least restrictive means of furthering [a] compelling governmental interest."
In 2002 a federal judge, concluding that Uniao do Vegetal was likely to win this argument, issued a preliminary injunction barring the government from interfering with the church's rites. A three-judge panel of the U.S. Court of Appeals for the 10th Circuit upheld the injunction in 2003, and last year the full appeals court concurred.
For the Bush administration, which is big on religion but down on drugs, this case ought to pose a dilemma. RFRA, passed in 1993 with strong support from religious conservatives, was aimed at maximizing religious liberty by requiring the government to meet a stringent test when it prevents people of faith from acting on their beliefs.
The law was a response to a 1990 decision in which the Supreme Court ruled that the First Amendment's guarantee of religious freedom does not require the government to tolerate the peyote rituals of the Native American Church. While the First Amendment bars the government from deliberately targeting a specific religion, the Court said, it does not require exemptions from "neutral laws of general applicability" that happen to interfere with religious practices.
"To make an individual's obligation to obey such a law contingent upon the law's coincidence with his religious beliefs, except where the State's interest is 'compelling'...contradicts both constitutional tradition and common sense," wrote Justice Antonin Scalia for the majority. "Any society adopting such a decision would be courting anarchy."
Notwithstanding Scalia's warning, Congress passed RFRA with the intent of restoring the "compelling interest" test the Court had applied before the peyote case. Although the Court ruled in 1997 that RFRA was unconstitutional as applied to the states, it still binds the federal government.
In a 2002 case that foreshadowed Uniao do Vegetal's fight for the right to drink ayahuasca, the U.S. Court of Appeals for the 9th Circuit suggested that RFRA might protect possession (but not distribution) of marijuana by Rastafarians. No doubt that possibility gives drug warriors nightmares in which everyone arrested on marijuana charges claims to consider the plant a sacrament.
Yet it's not as if the idea of exempting religious groups from drug bans is unthinkable. The Volstead Act allowed Jews and Catholics to continue drinking wine as part of their rituals, and the federal government (like many states) lets members of the Native American Church eat peyote, the very practice that gave rise to the Supreme Court's abandonment of the "compelling interest" test. It's hard to see why ayahuasca rituals, which are officially permitted in Brazil, are less tolerable.
Still, Scalia had a point: Religious beliefs cannot be a license to break the law. The government would never allow a religious group to commit murder because its god demanded human sacrifices. Then again, preventing murder is a pretty compelling interest, part of government's central mission to protect people from aggression.
A good rule of thumb might be that when a religious group can reasonably demand an exemption from a law, it's the law rather than the group that deserves scrutiny.
If you're a cultural historian, a movie geek, or just looking for an excuse to spend three hours watching TV, here's a video double feature you should try. First watch the premier pot-smuggling flick of the 1970s, Cheech and Chong's Up in Smoke. Then pop in the decade's most famous film about Coors smuggling, Smokey and the Bandit.
When you're done, try to figure out just how the good ol' boys and the hippies, two American tribes who were supposed to be sworn enemies, wound up flocking to such similar movies. The stories aren't twins--the heroes of Up in Smoke are too stoned to realize they're ferrying illegal cargo or that a smokey is on their trail--but if you catch them in the right light, they look like brothers.
These days it's widely recognized that it was the 1970s, not the '60s, that marked the real cultural revolution in the United States. The earlier decade might have seen America's traditionally tiny bohemia become a mass phenomenon, but it was in the '70s that the wave crashed, breaking down the boundaries between the rebels and the mainstream. One sign of this was a burst of creativity in Hollywood, where figures who spent the '60s soaking up the counterculture and making low-budget exploitation features--Francis Ford Coppola, Martin Scorsese, Jack Nicholson--used their new freedoms and their unorthodox training to transform the face of American film.
Meanwhile, other hands kept turning out those exploitation movies. In the new book Hick Flicks: The Rise and Fall of Redneck Cinema (McFarland), Scott Von Doviak gives us an entertaining and illuminating look at their world.
"While blaxploitation pictures ruled the urban grindhouses, providing heroes and myths for those trapped in the inner cities," he writes, "hick flicks dominated the drive-in circuit, bringing their own set of archetypal figures to flyover country." Von Doviak, who covers film for the Fort Worth Star-Telegram, has cast a wide net; he ends up discussing everything from early B movies to 21st-century fare, from backwoods creature features to arthouse documentaries. But the heart of his book is the 1970s, and the soul is movies about outlaws driving cars or trucks, ideally with a load of illicit spirits.
I can't endorse every opinion Von Doviak espouses. Notably, he fails to appreciate the peculiar charms of Sam Peckinpah's Convoy, surely the only film that is simultaneously a Christian allegory, a vaguely anarchist political fable, and a feature-length adaptation of a novelty song about CB radios. (It isn't a good movie, but it's much better than any picture starring Kris Kristofferson and Ali MacGraw has a right to be.) But Von Doviak is a witty and astute student of these films, entertainments that could simultaneously reflect the values of both the American counterculture and its alleged opposite.
I don't want to overstate this point. Hollywood has always celebrated individualist rebels, and the Southern backcountry has a longstanding anti-authoritarian tradition that, as the historian David Hackett Fischer put it in Albion's Seed, was "more radically libertarian, more strenuously hostile to ordering institutions than were the other cultures of British America."
Smokey and the Bandit was not a story that could be imagined only after 1969. It was a classic bandit narrative in the tradition of Robin Hood and Jesse James, with an invulnerable hero who defies unjust laws (in this case, speed limits and alcohol regulations), battles an oppressive sheriff (in this case, Jackie Gleason), and can move almost invisibly among the common folk who admire his heroic deeds (in this case, other drivers).
But this Robin Hood was rebelling at a time when the word rebellion invariably suggested the word freak. This Little John was played by Jerry Reed, a guy who used to jam with Elvis. This Sheriff of Nottingham was a fat racist cop, a cultural archetype that took hold during the civil rights movement--and was most evocative among those who sided with the protesters. The genre that begat them reached its peak after the country relaxed its attitudes toward on-screen sex, violence, and sympathy for lawbreakers, a change largely driven by the cultural revolution.
And there was something else. Once the ideals and fashions of Haight-Ashbury had leaked into the rest of the country, there was no predicting the ways they'd be adapted to local circumstances. By this point, those rednecks weren't just jeering the same sheriff as the hippies. Some of them were growing their hair, smoking weed, and listening to trippy music.
Such behavior swept the South and West in the '70s, but its headquarters was Austin, the city at the heart of Jan Reid's The Improbable Rise of Redneck Rock (University of Texas Press). Originally published in 1974, Reid's compulsively readable book was revised and reissued last year in substantially expanded form. It tells how a group of Texas-based musicians, most famously Willie Nelson, created a new style of music, usually called outlaw country, and a new cultural archetype, dubbed the cosmic cowboy. Larry Yurdin, who spent a chunk of the '70s running radio stations in Austin and Houston, once described the cosmic-cowboy scene to me as "the Texas version, in 1972, of what happened in San Francisco in '67. In a good ol' boy, Wild West context, it was the Summer of Love. With guns."
Once such a tremendous cultural collision has happened, it starts to look natural, even inevitable, in retrospect. By 1979 Hank Williams Jr. could sing, "If I get stoned and sing all night long/It's a family tradition"--and sure enough, the tradition was there, and not just in the Williams family. It just had to be discovered first.
It's during that period of discovery, when cultural identities are being reinvented and reshuffled, that things look more ambiguous. There's a scene in White Lightning, one of the better hixploitation flicks, where two moonshine runners walk past a hippie van that has the slogan "Legalize marijuana!" scrawled on its side.
"Legalize that shit, it's gonna ruin moonshine liquor forever," spits one of the rednecks. I like to imagine the van was carrying Cheech and Chong. Ã‚Â
In December, after a federal jury convicted McLean, Va., pain doctor William Hurwitz of running a drug-trafficking operation, the foreman told The Washington Post "he wasn't running a criminal enterprise." Don't bother reading that sentence again; it's not going to make any more sense the second time around.
Hurwitz, who is scheduled to be sentenced on April 14 and will go to prison for life if U.S. District Judge Leonard Wexler follows the prosecutors' recommendation, was charged with drug trafficking because a small minority of his patients abused or sold narcotic painkillers he prescribed for them. Prosecutors argued his practice amounted to a "criminal enterprise" based on a "conspiracy of silence"--i.e., a conspiracy in which Hurwitz did not actually conspire with anyone--because he charged for his services and should have known some of his patients were faking or exaggerating their pain.
Judging from the comments of the jury foreman, Ralph Craft, the jurors did not really buy this theory. Perhaps they still harbored the legally unsophisticated notion that drug traffickers are people who engage in drug trafficking. But they convicted Hurwitz anyway, because they didn't like the way he practiced medicine.
"I'm not an expert," Craft conceded, while expressing the opinion that Hurwitz was "a little bit cavalier" in prescribing opioids. "He ramped up and ramped up the prescriptions very quickly," he said. "This is stuff that can kill people. He should have been extra careful."
Craft and his fellow jurors were appalled by the sheer number of pills Hurwitz prescribed. "The dosages were just astounding," he said, calling them "beyond the bounds of reason."
As an example, Craft cited a prescription for 1,600 pills a day. As Hurwitz explained during the trial, this particular prescription, which was never filled, resulted from a nurse's calculation error that was discovered at the pharmacy. But it's true that many of his patients were taking very high doses of painkillers, doses that would kill someone unaccustomed to narcotics.
Although the jurors apparently considered such doses inherently suspicious, high doses are necessary for treating severe chronic pain because patients develop tolerance to the analgesic effects of narcotics. They are also safe, because patients likewise develop tolerance to the potentially fatal respiration-depressing effects of these drugs. Responses to pain medication vary from person to person, and there is no a priori limit to how high doses can be "ramped up."
The prosecution deliberately obscured these points during Hurwitz's trial, relying on the jurors' ignorance of pain treatment principles to convict him. The government's main medical expert, Michael Ashburn, testified that consumption of high narcotic doses by patients with chronic pain who do not have cancer is a sign of drug abuse.
In a letter they wrote before the verdict, six past presidents of the American Pain Society rebuked Ashburn for this statement, along with several other misrepresentations of pain treatment standards. "We are stunned by his testimony," they said. "Use of 'high dose' opioid therapy for chronic pain is clearly in the scope of medicine."
As these pain experts recognized, Hurwitz was not the only person on trial at the federal courthouse in Alexandria. So was every doctor who has the courage to risk investigation by treating people who suffer from severe chronic pain with the high doses of opioids they need to make their lives livable.
In poignant letters to Judge Wexler, who has fairly wide latitude in punishing Hurwitz now that the U.S. Supreme Court has made federal sentencing guidelines merely advisory, dozens of his former patients recount how he saved them from constant agony caused by migraines, back injuries, reflex sympathetic dystrophy and other painful conditions that left them disabled, homebound, despondent, and in some cases, suicidal. They outline the difficulties they had in getting adequate treatment before they found Hurwitz and the trouble they've been having since the government put him out of business.
"Good pain doctors are hard to find," writes one. "I am saddened that Dr. Hurwitz is branded a criminal for helping me and helping people like me." Another argues that Hurwitz's "crime"--trusting his patients--was one of his greatest virtues. "It is to Dr. Hurwitz's credit," he says, "that he chose to trust that his patients were genuinely seeking relief from pain that cannot be objectively measured. This trust is, in my experience, all too rare."
Threatening doctors with prison for viewing their patients with inadequate suspicion will make it even rarer.
Does your urine belong to Congress? Should private citizens not suspected of any wrongdoing be hauled up to Capitol Hill and grilled under oath, on live TV, about what substances they've put in their bodies?
Congressman Henry Waxman sure thinks so. The Los Angeles Democrat is convening hearings Thursday, March 17 on the pressing national security issue of ballplayers using performance-enhancing steroids. Last Wednesday, subpoenas were sent out to seven current and former Major League Baseball players to testify about their hormones in front of the oxymoronic House Committee on Government Reform.
Only one player, recent retiree Jose Canseco, has enthusiastically accepted the committee's invitation, though he's lobbying hard for immunity. By crazy coincidence, the former Bash Brother has a new, factually challenged and universally panned bestseller on the market, titled Juiced: Wild Times, Rampant 'Roids, Smash Hits, and How Baseball Got Big.
"Canseco's allegations about steroid use by Mark McGwire and other baseball players have received enormous media attention," an apparently envious Waxman wrote in his Feb. 24 letter requesting the hearings. "Many of the individuals have denied the accusations. Mr. Canseco insists his information is accurate. ... There is a simple way to find the truth in this matter. ... [H]ave them testify under oath."
Using the enormous power of the federal government to arbitrate literary disputes seems a little much. We wouldn't dream of forcing George W. Bush to swear on the Holy Bible just because Kitty Kelley reported that he snorted coke at Camp David, yet a private citizen's alleged use of a substance that's actually legal (with a prescription) is enough for Washington to set the wheels of publicity-masquerading-as-justice in motion.
And this isn't just a case of arrogant athletes getting their comeuppance – it potentially affects half the national labor force. Besides dragging Sammy Sosa and Jason Giambi on camera to recite the Fifth Amendment, the committee has issued a subpoena to Major League Baseball that, according to the L.A. Times, requests "results of drug testing since 2003," and "the names, disciplinary action taken and reason for suspension for all drug-related violations since 1990."
In other words, Congress is asserting its right to your drug tests, even if they were conducted based on a private agreement between employer and union, and even if the results – including disciplinary action – were understood at the time to be secret. About half of all employers test for drugs, and an estimated 50 million tests are performed each year. Should the federal government have the right to subpoena your private medical records?
That's hardly the only power grab in this show trial. Waxman's committee (which is chaired by the equally distasteful Virginia Republican Tom Davis) literally believes it can investigate anything and everything it wants to. "Under the rules of the House," Davis and Waxman wrote Major League Baseball on Thursday, "the Committee on Government Reform may at any time conduct investigations of any matter."
Interestingly, baseball may end up mounting the first sustained attack on the committee's license to conduct fishing expeditions. Historically at each other's throats, team owners and the players union have joined forces under the same lawyer, Stanley Brand, who has vowed to fight the subpoenas on jurisdictional and constitutional grounds, all the way up to the Supreme Court.
"That would be limitless jurisdiction," Brand told told reporters after receiving the Davis/Waxman letter. "There would be nothing they couldn't look into ... . If that is the case, they don't have to have rules on jurisdiction because these guys can do whatever they want."
Cracks have already appeared in baseball's tenuous solidarity. Boston Red Sox pitcher Curt Schilling (who has no idea why he was subpoenaed) and White Sox slugger Frank Thomas have said they'll testify. But Brand is at least talking a tough game about drawing a line in the sand.
For once, the urinalysis enthusiasts in the nation's sports pages are not joining as one to cheer on the feds. Epithets like "witch hunt" and "grandstanding politicians" are being tossed around, and for the first time in my memory, sportswriters are expressing concern about privacy rights and the long reach of Uncle Sam.
"I think they feel empowered to do whatever they want," Philadelphia Phillies pitcher Randy Wolf said last week, while emphasizing that he opposes steroid use. "You look at what they did with the 'confidential' drug tests that we had ... they said, 'Eh, we don't care if it was confidential or not. We're going to do what we want with it.'
"It's kind of a 1984 deal where basically, they want to know everything you're doing at all times, and because we're in the public spotlight our civil liberties are flushed down the toilet. It's chemical McCarthyism."
A Republican effort to stamp out needle-exchange programs abroad incensed editorial boards at The Washington Post and The New York Times last weekend, and both pages slammed the latest congressionally mandated gag rule to hit the United Nations.
That conservatives are trying to stamp out harm reduction abroad is no small story, but both pages missed the fact that this is only the latest installment in a long saga of strings-attached giving that has been reshaping U.S. foreign aid policy for years. From AIDS-prevention measures that stigmatize sex, to anti-trafficking efforts that target prostitution, to drug policies purged of pragmatism, foreign aid has become an American adventure in social engineering.
Global AIDS conferences have become as much a matter of America-bashing as AIDS-fighting. Last July, U.S. AIDS coordinator Randall Tobias was heckled mercilessly at the International AIDS conference in Bangkok. The discord stems from U.S. gestures toward a comprehensive approach to AIDS that never quite panned out. The Global Fund to Fight AIDS, Tuberculosis and Malaria was founded three years ago as a multilateral effort to help funnel vast sums of money to developing countries in need of public health funding. It took a localized, hands-off approach, scrutinizing applications but generally leaving questions of implementation and disbursement to local agencies and governments. The U.S. offered the first grant of $200 million and was expected to be a major supporter.
But during his 2003 State of the Union address, President Bush busted out with something called the "President's Emergency Plan for AIDS Relief," or PEPFAR, and a competing bureaucracy was born.
PEPFAR was smaller in its ambitions – only 15 countries were targeted – but much better funded, with $15 billion promised over five years. The Global Fund's biggest donor had proven itself capable of promising huge sums of money, but those sums would not be going into the Global Fund. Tobias tried to stem the ensuing wave of criticism by claiming that the organizations could work together. But it soon became clear that they had fundamentally different missions. PEPFAR enthusiastically endorses the so-called "ABC" approach – Abstinence, Be Faithful, and Condoms. The program gave President Bush an opportunity to scale up from his $10 million abstinence crusade in Texas (where there is no evidence it worked) to a billion-dollar version in Africa (where there is new evidence it's not helping).
PEPFAR promises that 33 percent of its prevention funds will be spent on abstinence promotion and that faith-based organizations can receive funding even if they refuse to talk about or provide contraception. Condoms, the most statistically proven and economically sound method of prevention, are a last resort to be distributed to "high-risk" groups. The program also forces any organization receiving funds to explicitly oppose the legalization of prostitution.
The anti-prostitution demand is major, and it affects anti-trafficking funds as well as AIDS funding. NGOs fighting trafficking are likely to be working against the stigmatized, underground nature of illicit prostitution, but they can't accept U.S. funds unless they condemn the practice. In the U.S., the push for action on human trafficking has come from the Christian right, and the lines between victim and sex worker have all but disappeared. U.S. funds go to outfits like the International Justice Mission, which has been accused of conducting "brothel raids" in which its representatives "rescue" Asian sex workers against their will. Human trafficking is a much bigger issue than sexual slavery, but U.S. efforts thus far have focused almost exclusively on women, children, and sex work.
With sexually active Africans and Southeast Asian prostitutes on the hit list, intravenous drug users couldn't be far behind. Representatives Mark Souder (R-Ind.) and Tom Davis (R-Va.) are now trying to keep American aid money out of the hands of any organization that promotes clean needle exchanges. Assistant Secretary of State Robert Charles has already succeeded in scaring the United Nations Office on Drugs and Crime (UNODC) out of mentioning harm reduction in its literature, and UNODC projects are being threatened. These actions have sent a ripple of terror through the network of international organizations.
Martin Jelsma, a program coordinator at the Transnational Institute, has been following the tension between the U.S. and UNODC for years. He worries that the current pressure "threatens the very heart of the few proven methods that are effective to stem the spread of HIV/AIDS."
As with condom distribution programs, there is no real question that needle exchange programs are effective in reducing AIDS transmission. In a hearing he held last month (tellingly titled "Is there such a thing as safe drug abuse?"), Souder didn't take long to reveal the root of his antipathy to harm reduction, and it had nothing to do with his alleged doubts about efficacy.
"These lifestyles," he said, "are the result of addiction, mental illness, or other conditions that should and can be treated rather than accepted as normal, healthy behaviors."
Souder's objections have to do with determining what is "normal" and "healthy" for other people; preventing AIDS transmission isn't on his agenda. As with the rest of U.S. AIDS assistance, his policies are more concerned with shaping a certain kind of global citizenry than with stopping a virus. The New York Times editorial calls this a "triumph of ideology over science," but Souder and his coterie simply have different goals in mind. If keeping condoms and clean needles out of foreign hands is a worthwhile goal in itself, the science doesn't matter. Stopping the spread of AIDS would be a bonus, but it's not the priority, and it hasn't been for a long time.
Souder and others will talk of halting funding, but funding will only be redirected elsewhere. The UNODC has already buckled. The U.S. contributes far more than any other government to the fight against AIDS, and NGOs that depend on USAID will change their policies to survive.
"Compassionate conservatism" used to be a punch line, but it's in full swing these days, and it appears to involve throwing huge sums of our money into programs that don't work. If compassionate conservatism turns out to be neither, it will indeed be U.S. policy that finally brings wealthy Americans taxpayers and impoverished AIDS victims together. Both will be paying the price.