Dissent Magazine

How Gloria Steinem Renewed an Old Debate About Socialism and Feminism

In 1905, Eugene V. Debs, the popular labor activist and Socialist Party leader, had a speaking engagement in Rochester, New York, and went to visit the aging women’s rights pioneer Susan B. Anthony at her home there. They exchanged memories of their previous meeting; then Anthony took Debs’s hand and, with good humor, said, “Give us suffrage and we’ll give you socialism.” Debs’s good-natured reply was: “Give us socialism and we’ll give you the vote.”

"Holy S**t, What a Nightmare This Turned Into": How Austerity Destroyed Our Small Towns

This story is part of Dissent magazine's 2012 special issue on Workers in the Age of Austerity. For more great coverage from Dissent, check out their website. 

Is a College Degree the Sure Bet It Once Was?

The U.S. higher education crisis has been well documented. College is overpriced, over-valued, and ripe for disruption (preferably, for some critics, by the outcome-driven private sector). At the same time, many Americans are flailing in the post-recession economy. With rising income inequality, persistent long-term unemployment, and declining real wages, Americans are searching for purchase on shifting ground. Not so long ago, the social contract between workers, government, and employers made college a calculable bet. But when the social contract was broken and policymakers didn’t step in, the only prescription for insecurity was the product that had been built on the assumption of security. We built a university system for the way we worked. What happens to college when we work not just differently but for less? And what if the crisis in higher education is related to the broader failures that have left so many workers struggling?

James K. Galbraith Takes on Thomas Piketty's "Capital in the Twenty-First Century"

What is “capital”? To Karl Marx, it was a social, political, and legal category—the means of control of the means of production by the dominant class. Capital could be money, it could be machines; it could be fixed and it could be variable. But the essence of capital was neither physical nor financial. It was the power that capital gave to capitalists, namely the authority to make decisions and to extract surplus from the worker.

'They' Shall Overcome

This article originally appeared on the Dissent Magazine blog.

The Users of the University

Here’s a trick you can try at home. Next time you hear a pundit say that to preserve America’s competitiveness or dynamism, we must replace the liberal arts with something more “practical,” take a second to check what they studied. Thomas Friedman, who asserts that students should study engineering and science because “average is over”? Mediterranean Studies, Brandeis. Charles Murray, who advocates shifting huge numbers of students into vocational training? History, Harvard. Dori Jones Yang, an accomplished writer and journalist who nonetheless told parents to funnel their children into “practical” disciplines? European history, Princeton.

What Happened to Public Education on Election Night?

Barack Obama’s K-12 “reform” policies have brought misery to public schools across the country: more standardized testing, faulty evaluations for teachers based on student test scores, more public schools shut down rather than improved, more privately managed and for-profit charter schools soaking up tax dollars but providing little improvement, more money wasted on unproven computer-based instruction, and more opportunities for private foundations to steer public policy. Obama’s agenda has also fortified a crazy-quilt political coalition on education that stretches from centrist ed-reform functionaries to conservatives aiming to undermine unions and privatize public schools to right-wingers seeking tax dollars for religious charters. Mitt Romney’s education program was worse in only one significant way: Romney also supported vouchers that allow parents to take their per-child public-education funding to private schools, including religious schools.

Wall Street, Coming to Your Town! (and Destroying It)

This article originally appeared in Dissent Magazine.

Can Graduate Student Teaching Survive?

Here’s a familiar story from about a decade ago: a pair of political antagonists conduct an increasingly bitter election campaign, ending in frustrating indecision. Disputes pile up over the legitimacy of the balloting process, and government officials are called in to intervene. Finally, a federal panel, boasting a majority of Republican appointees, shuts down the vote-counting outright, yielding a conservative victory.

Student Debt Crisis: It's Time for a Jubilee

Organizations that usually demand cancellation of the crippling debts owed by impoverished countries in the global South are now calling for debt forgiveness for a different group of borrowers: U.S. students.

Come Back Woody Guthrie, We Need You: America's Great Folk Singer Would Have Turned 100

I suspect that I was not the only teenager in the late 1960s engaging in sex, drugs, rock and roll, and Vietnam War protests to whom Woody’s body of work had an antique feeling. A perfunctory one-time listening to the Alan Lomax tapes of the man said to be one of Bob Dylan’s heroes was enough, I thought, to punch my hipness card. Arguments about unions and the venality of millionaires and the New Deal seemed like passé accounts of battles my parents’ generation had fought and won. Yet through the mysterious alchemy that the greatest works of art possess, and the bizarre devolution of American politics, many of Guthrie’s songs are paradoxically more relevant today than at the time of his passing in 1967.

The Rise and Fall of Cesar Chavez and the United Farm Workers

This story is part of Dissent magazine's special issue on Workers in the Age of Austerity. Look out for more in the coming week. For more great coverage from Dissent, check out their website.  

Should Democrats Use the Tea Party to Split the Right?

Are you a Democratic congressional candidate in a tight electoral contest? Here’s an idea: Help to recruit a Tea Party candidate to enter the general election and siphon off voters from your Republican opponent. Sure, you might be forced to debate a reactionary nut job. But this only makes you look more reasonable. More importantly, the new entrant splits the right-wing vote. You waltz to victory.

Are Students the New Indentured Servants?

When we think of the founding of the early colonies, we usually think of the journey to freedom, in particular of the Puritans fleeing religious persecution to settle the Massachusetts Bay Colony. But it was not so for a majority of the first Europeans who emigrated to these shores. “Between one-half and two-thirds of all white immigrants to the British colonies arrived under indenture,” according to the economic historian David W. Galenson, a total of three hundred thousand to four hundred thousand people. Indenture was not an isolated practice but a dominant aspect of labor and life in early America.

Who Passed the Civil Rights Act of 1964?

Like so many of my generation who did voter registration work in the South during the 1960s, I have been saddened by the debate that Hillary Clinton and Barack Obama sparked over whether Martin Luther King or President Lyndon Johnson was responsible for the landmark 1964 Civil Rights Act that outlawed discrimination in hiring and public accommodations. Instead of providing voters with a thoughtful view of the recent past, Clinton and Obama combined to offer a crude, "great man" theory of history in which King's vision and Johnson's pragmatism were portrayed as antithetical forces.

The debate has quieted down. But it should not be allowed to fade from the headlines without a reminder of the lesson this controversy threatened to obscure -- blacks and whites across America relied on one another to make the Civil Rights Act of 1964 a reality.

The act had its legislative origins in a June 11, 1963 speech that President John Kennedy delivered on national television after Justice Department officials, aided by federal marshals, forced Alabama Governor George Wallace to stand aside while two black students were admitted to the previously segregated University of Alabama. "If an American, because his skin is dark … cannot enjoy the full and free life which all of us want, then who among us would be content to have the color of his skin changed and stand in his place?" Kennedy asked the country.

But Kennedy's speech, which was followed hours later by the murder of Mississippi civil rights leader Medgar Evers in Jackson, did not guarantee a speedy passage of civil rights legislation. A coalition of southern Democrats and conservative Republicans stood in the way and the best that Kennedy could do before his November 22 assassination was to get his civil rights bill voted out of committee.

It fell to President Lyndon Johnson to get Kennedy's civil rights legislation enacted. Soon after taking office, Johnson made his intentions clear. "We have talked long enough in this country about equal rights," he told a joint session of Congress on November 27. "It is time now to write the next chapter and to write it in books of law." At this same time, Martin Luther King was playing a crucial role in shaping public opinion. His April 16 "Letter from Birmingham Jail" and his August 28 speech "I Have a Dream" galvanized millions of Americans who in the past had remained passive when support for civil rights was needed.

Still, it was not until 1964 that Kennedy's civil rights bill got through Congress. On February 10, the House passed the bill by a vote of 290 to 130 and on June 19, in the wake of a record-breaking 75-day filibuster, which took up 534 hours, the Senate passed its version of the civil rights bill by a 73 to 27 margin. Now Lyndon Johnson began pressuring Congress to reach agreement on a bill that he could sign by July 4.

At this moment, Johnson benefited not only from the civil rights coalition led by Martin Luther King but from the grassroots work of Bob Moses, then a young organizer for the Student Nonviolent Coordinating Committee (SNCC) who had been active in Mississippi since 1961. At a November 1963 SNCC meeting, Moses had proposed a 1964 "Summer Project" in Mississippi that would make extensive use of college students, getting them to teach in freedom schools and carry out voter registration drives. A black-white coalition, Moses believed, would engage the whole country. But no sooner had the Summer Project begun than three of its participants -- Michael Schwerner, James Chaney, and Andrew Goodman -- disappeared on June 21 near Philadelphia, Mississippi.

Their disappearance (their bodies would later be found buried in an earthen dam) could not be ignored by America. Television cameras and the print media descended on Mississippi while state officials acted as if nothing of importance had happened. "They could be in Cuba," joked Mississippi Governor Paul Johnson.

It was the worst response that the diehard segregationists of the Deep South could have made. The influence of Martin Luther King, Lyndon Johnson, and John Kennedy, along with years of demonstrations and sit-ins, had created a political tide that reached its peak with the disappearance of the three men. On July 2, two days ahead of schedule, Congress, under heavy public pressure, agreed to the civil rights bill that Johnson wanted. Five hours later in a White House signing ceremony timed to coincide with the evening news, the president addressed the nation.

"One hundred and eighty-eight years ago this week a small band of valiant men began a long struggle for freedom," Johnson told the nation. "Now our generation of Americans has been called on to continue the unending search for justice within our own borders." The analogy was unmistakable. The president was comparing the work of the Founding Fathers with that of the civil rights movement.

Martin Luther King, who was present at the White House signing ceremony, also had no doubts about the significance of the day or about Lyndon Johnson's role in making the civil rights bill law. "It was a great moment," King declared, "something like the signing of the Emancipation Proclamation by Abraham Lincoln."

Today, we cannot know exactly what Johnson and King, two coalition builders, would say about the efforts to portray them as civil rights rivals. But it is hard to imagine that either would not have seen comparisons that pit them against each other as inimical to the civil rights movement they believed in. As King observed of the struggle for racial justice in his "Letter from Birmingham Jail": "We are caught in an inescapable network of mutuality, tied in a single garment of destiny."

Through The Documentary Looking Glass

Though full-length documentary films go back to the work of Robert Flaherty in the 1920s (Nanook of the North, Man of Aran), they have always been the uncommercial stepchild of movies that tell made-up stories. In their marginal way, documentaries have flourished in times of social crisis, and they enjoyed a special vogue in the 1930s, when independent filmmakers, inspired by politically committed Soviet artists, banded together in such organizations as the Film and Photo League.

Prone to advocacy, documentaries have often been criticized for manipulating their material. In Germany, Leni Riefenstahl perfected propaganda into an art form. The U.S. government itself sponsored Pare Lorentz's Great Depression documentaries, The Plow That Broke the Plains and The River, as well as Frank Capra's morale-building "Why We Fight" series during World War II.

But documentary films all but disappeared from theaters after 1945, although their aura of authenticity influenced narrative films, including depression movies like The Grapes of Wrath; postwar films noirs in the United States and neorealist films in Italy; politically engaged films about the 1960s like Robert Kramer's Ice and Haskell Wexler's Medium Cool; as well as docudramas about real events -- half history, half thriller -- such as Gillo Pontecorvo's The Battle of Algiers and Costa-Gavras's Z, State of Siege, and Missing.

As television grew more sophisticated, it began, timidly at first, to pursue public issues behind the news headlines. The work of Edward R. Murrow and his CBS producer, Fred Friendly -- their measured but brave exposé of McCarthy, for example -- initially had no successors, especially at the networks.

But the turmoil of the 1960s created a new generation of cinéma vérité artists who used lighter, more mobile equipment, beginning with Frederick Wiseman, who immersed himself in social institutions such as an asylum for the "criminally insane" in Massachusetts and a high school in Philadelphia; D.A. Pennebaker, who pioneered rock concert movies such as Don't Look Back (on Bob Dylan) and Monterey Pop; and Albert and David Maysles, who made Salesman and Grey Gardens as well as the darkest of rock movies, Gimme Shelter, about a Rolling Stones concert at Altamont where a fan was knifed to death. The decade culminated in one very long documentary, Woodstock, in which the music, the counterculture, and the promotional hype converged, a film that gave a whole generation the vicarious feeling of having been there.

But the real story of documentary filmmaking picked up a year or two later with a movie made for French television although not shown there until many years later, Marcel Ophuls's The Sorrow and the Pity. Unlike cinéma vérité, with its deliberately raw technique and sense of total immersion in the moment, it was a somber, reflective investigation of a taboo subject -- French collaboration under German occupation, the myths and facts of the French resistance -- and it transformed not only how nonfiction movies were made but how the world, and especially the French, understood those soiled pages of history. The Sorrow and the Pity coincided with a new wave of interest in the Holocaust, an outpouring of memory, shame, and moral witness that had been dammed up for a quarter of a century.

Without its example, Claude Lanzmann's Shoah might not have been made fifteen years later. Along with the war itself, the hidden history of the Holocaust and the memories of survivors would provide an inexhaustible flow of material for filmmakers for decades to come. It's hard to imagine another subject on which the film medium has so amplified the historical record. But our present lives are often as shrouded from each other as the past, and this has repeatedly challenged filmmakers to excavate the buried dramas of family relationships.

The great breakthrough for television documentary in 1973 was the 12-part PBS series, An American Family, about Bill and Pat Loud and their five children, an upper-middle-class family in Santa Barbara. It reflected many of the tensions in the post-sixties American family, from the homosexuality of one of the sons to the conflicts that led to the divorce of the parents. But above all, because the film crew essentially lived with the family for seven months, the movie tapped a surprising vein of eager self-exposure that has since become endemic to Americans. In the wake of the sixties, barriers between public and private life were falling everywhere; suddenly, very little seemed off limits. Documentary authenticity became an elusive thing once everyone began performing for the camera.

In An American Family the uneasy self-dramatization of the subjects answered to the voyeuristic curiosity of the audience, which showed an unexpected interest in other people's daily lives. The pop sociology of the 1950s had migrated into television as viewers looked for a glass that would mirror their own experience.

To keep the form fresh, the best historical documentaries go to great lengths to avoid stereotyped archival material and imagery deadened by repetition. Lanzmann's Shoah, in a spirit of purism, eliminated newsreel footage entirely, substituting marathon interviews with survivors; conversations with a historian, Raul Hilberg; and subtly ironic images of antiquated railway cars and tranquil killing sites as they appear today.

In Andrew Jarecki's Capturing the Friedmans, home movies serve as the archive of family life at its most carefree and most troubled. These are punctuated by David's anguished assertions of his father's and brother's innocence and by the level-headed comments of a journalist named Debbie Nathan, who sees evidence of a classic case of recovered memory, with all its distortions, and a tabloid-fed mass hysteria. Without taking a stand, the film leaves us with the impression of a painful miscarriage of justice.

If any film could serve as an antidote to this acrid sense of shattered lives and a dysfunctional family, it might be another of this year's widely seen documentaries, Jeff Blitz's Spellbound. Its unlikely subject was the 1999 National Spelling Bee, sponsored by the Scripps Howard newspaper chain since 1925. Like most other movies, documentaries are essentially about people, the look on their faces, the inflection of their voices, the wrinkles of mind and character implied by their body language and their interplay with others.

Movies about children always run the risk of cuteness, because kids often play to the camera with little self-consciousness. Spellbound escapes this pitfall by embedding these nerdy, striving kids solidly in their family settings, which range from that of a Mexican American ranch hand in Texas or an extended black family in Washington, D.C., to the middle-class homes of Indian Americans and Northeastern Jews. Some of the families are puzzled but proud to have raised a mutant, a child with a phenomenal memory, unshakeable determination, and the focus and discipline of a champion athlete. Other parents, acting out their own ambitions, coach the kids so relentlessly that it borders on child abuse. One Indian American father drills his son on seven to eight thousand words a day and calls in specialists to work with him on words from foreign languages. Like so many immigrants, he believes that there's "no way you can fail in this country if you work hard."

As they converge on the finals in Washington, the film interweaves eight separate family histories that demonstrate the diversity of America's population at its most attractive. These families' attitudes range from relentless joint effort ("We all had to pitch in to help," says an Indian American mother) to detached but beaming pride or wonder that such a prodigy could appear in their midst.

The most moving is the taciturn reaction of the Mexican father, who still speaks no English after twenty years on a ranch in rural Texas. His fully Americanized daughter, Angela, becomes a local celebrity, yet has somehow, like most of the other kids in the film, remained "normal" and unaffected. His grown-up son, so Mexican yet already American, tells us the older man's story with simple but piercing eloquence: his flight from Mexico, his years of hard work, his hesitation about venturing as far as Washington to see his daughter compete. But the spelling bee has already taken his family beyond his wildest dreams. Even when Angela loses, his son tells us that if his father "were to die today, he would die a happy person."

If families have become a key subject in the current surge of documentaries, politics and history remain central to the genre. There appears to be no limit to our fascination with the political turbulence of the 1960s, a period not only colorful in itself but the bedrock of so many of our political divisions today.

Two recent documentaries probe the still-open wound of the Vietnam War and the radical movements that sprang up against it: Errol Morris's The Fog of War, essentially a long interview with former secretary of defense and Pentagon whiz kid Robert McNamara, and The Weather Underground, directed by Sam Green and Bill Siegel, which chronicles the political journey and second thoughts of antiwar radicals who took the Students for a Democratic Society from protest to violence and low-grade terrorism. In a sense it's unfair to compare these films because Morris, in works like The Thin Blue Line, A Brief History of Time, Fast, Cheap and Out of Control, and, most recently, Mr. Death, has developed into one of the most quirky, creative, and deeply respected documentarians.

Whereas the directors of The Weather Underground are talented novices, Morris has a restless, probing mind, a true cinematic imagination, and an unpredictable affinity for offbeat subjects. Yet these two movies, both grounded in interviews, share a common problem. Their larger subject is history, a vexed and contested history, yet they keep us too much enclosed within the point of view of their protagonists: their memories and reflections, their insights and self-deceptions, their cunning or unconscious efforts to vindicate themselves even as they fault their own past thinking. Invariably, we respond to the quality of their performances as much as we judge their political choices. When the subjects in a movie are as articulate as these people, their way of explaining and defending themselves, if it goes unchallenged, can badly skew our understanding of the issues.

Both the Weatherpeople and the former secretary of defense have a great deal to answer for, though the cabinet secretary, of course, wielded incomparably greater power. McNamara convinces us that plans were afoot for a gradual withdrawal from Vietnam just before John Kennedy's assassination, but he never explains why he went along with Lyndon Johnson's escalation and kept silent even after he had been forced out as secretary. He is much more frank about his role in Curtis LeMay's firebombing of sixty-seven Japanese cities in 1945, which might have left the two of them open to trial as war criminals had the United States lost, than he is about Vietnam, where he simply argues that "the fog of war" clouded everyone's judgment.

This is curiously similar to the repeated assertion by the aging veterans of the Weather Underground that "the Vietnam War made us crazy," to which one of them adds, "When you feel you have right on your side, you can do some horrific things." Green and Siegel offer some sharp and angry criticism by Todd Gitlin as a counterweight to these superficial rationalizations, and they give the last word to the more self-critical speakers.

But like most documentary filmmakers they essentially identify with their subjects, giving them a forum for reflections that are often engrossing but clarify very little. Despite their initial idealism, the Weather people were a self-destructive splinter group caught up in fantasies of violent revolution. Above all, the film gives little sense of other options open to the antiwar left during the Vietnam era. Instead we get lines like this excerpt from an unpublished memoir by Mark Rudd: "I was overwhelmed by hate. I cherished my hate as a badge of moral superiority." This is a kind of self-criticism, but it is far from illuminating.

The film adds up to some interesting although unrevealing character studies of mostly invisible lives, lives that travestied the deep convictions and moral anguish of so many Americans during the 1960s. The characters come across as attractive figures who have shielded themselves from any real insight into their own past. It has always been a paradox that many young sixties radicals, including some of my students at Columbia, were immensely appealing as individuals -- morally engaged, funny, drunk on ideas -- but stopped making sense when they disappeared into the group, especially after the summer of 1968 and the election of Richard Nixon.

Both The Weather Underground and The Fog of War seem to be political films, but they actually peel us back from politics to personality, which is the subject of most movies anyway. Though The Weather Underground brings back a sense of the times, and both movies find fresh, unhackneyed historical footage, The Fog of War stands out as the full-length portrait of a brilliant advocate locked in a morally dubious position, using confession as a form of self-exculpation. McNamara and the former radical terrorists are people who made terrible mistakes when they were young. Unlike some of their contemporaries, they lived long enough to reconsider, but they are not especially introspective. The ex-Weather people are so masked that in some ways they still seem to be living underground, adrift, cut loose from their historical moment.

Discussing his film after a press screening at the New York Film Festival, Morris anticipated that it would be criticized for not "contextualizing." Plenty of books about the Vietnam era offer context, he said, but he had always wanted to make a film about one person, history from the inside out.

"This movie is deliberately unbalanced, with only one side, one point of view," he admits, disarming criticism as effectively as McNamara. In the course of his interviews he had come to sympathize with the man, though he understood that, like the rest of us, he was vain, ambitious, and self-justifying. "People reveal themselves through language," he argued.

Clearly, he counts on viewers to understand more about McNamara than the man himself was willing to expose. This was precisely the bet he placed in his previous film, Mr. Death, a profile of an American eccentric, Fred Leuchter, whose specialty was the technical side of capital punishment, the mechanics of execution, but who got caught up in the twilight zone of Holocaust denial and ruined his life.

With McNamara, Morris's wager, really, was on the power of the film medium to reveal character, on his own uncanny gifts as an interviewer, which rival those of Ophuls and Lanzmann, and on the ingenious ways he finds of counterpointing his subject's story -- with Philip Glass's eerily effective music, for example, or with mesmerizing images of long lines of gigantic dominoes falling across a map of Southeast Asia. Working with mood, metaphor, animation, or re-creation, he expands the terrain on which documentary operates. More formalist than historian, he extracts eleven "lessons" from McNamara's story, but uses them mainly to serve as chapter breaks.

Whatever its limits as a political investigation, The Fog of War is a singular specimen of the documentary film as a personal portrait of a key historical figure and as a gripping work of art.

This article has been edited for length. For the longer original version, visit Dissent Magazine.

Morris Dickstein teaches English and film at the CUNY Graduate Center. His most recent book is "Leopards in the Temple."

Citizenship and Disability

In the six years since I published a book about my son Jamie, Life As We Know It, a great deal has changed in Jamie's life -- starting with his realization that there is a book about him. When I completed the book Jamie was only four, and had not yet entered the public K-12 system. But I did not stop serving as Jamie's recorder and public representative when I finished that book: I still represent him all the time, to school officials, camp counselors, babysitters and friends, to academic audiences, and to Down Syndrome Associations. I take it as one of my tasks to watch for important things he's never done before, as a way of charting and understanding the irreplaceable and irreducible little person he is, especially as he gets less and less little, and more and more capable of representing himself.

Jamie is now in his sixth year of school, having entered kindergarten in 1997-1998. In the intervening years he has not continued to perform at grade level (he is repeating fourth grade, at age eleven), and he has occasionally presented his schoolmates with some eccentric behavior. On the other hand, he has learned to read, to do two- and three-digit addition and subtraction, to multiply two-digit numbers, and most recently to do division by single numbers, with and without remainders.

He is a stubborn ignatz, as people find whenever they try to get him to do something he has no interest in, or whenever his teachers or aides try to make him move from one task to another. For a while he tried to put off unpleasant tasks by telling his teachers or therapists, "Let's do that tomorrow"; before long he realized that this didn't work, and began saying instead, "We did that yesterday" -- a ruse with which he has had some success.

His conversational skills are steadily improving, but unless you're talking to him about one of the movies he's seen or one of the routines he's developed at school or at home, you'll find that his sense of the world is sometimes unintelligible, sometimes merely a bit awry. He recently received an invitation to a classmate's birthday party (his third such invitation since we moved to central Pennsylvania sixteen months ago: we count and cherish each one), and Janet asked him what the birthday boy looked like. "He's a small boy," said Jamie, holding his hand at about shoulder level.

"What color is his hair?" she asked.
"Black," Jamie replied.
"What color are his eyes?"
"Blue."
"Does he wear glasses?" (Jamie has worn glasses for about five years.) "No," Jamie said, "just eyes."

Over eleven years, then, we've come to expect that Jamie will defeat or exceed our expectations when we least expect him to. And from this I draw two points. One, he's a child. Two, and this is a somewhat more elaborate conclusion, although it can be derived from point one: it might be a good idea for all of us to treat other humans as if we do not know their potential, as if they just might in fact surprise us, as if they might defeat or exceed our expectations. It might be a good idea for us to check the history of the past two centuries whenever we think we know what "normal" human standards of behavior and achievement might be. And it might be a very good idea for us to expand the possibilities of democracy precisely because democracy offers us unfinished and infinitely revisable forms of political organization that stand the best chance, in the long run, of responding adequately to the human rights of the unpredictable creatures we humans are. That might be one way of recognizing and respecting something you might want to call our human dignity.

Jamie is, of course, one reason why I am drawn to the question of disability rights and their relation to democracy: every morning I take him to school, I know how very fortunate he is to be living under a social dispensation that entitles him to a public education alongside his nondisabled peers.

But beyond my immediate interest in forwarding Jamie's interests, I want to argue that disability issues are -- or should be -- central to theories of social justice in a much broader sense. Nancy Fraser's account of the "politics of recognition" and the "politics of redistribution," for example, offers a theory that tries to accommodate what were the two major strands of American progressive-left thought in the 1990s, multiculturalism and democratic socialism (in all their varieties). Fraser has shown convincingly that the politics of recognition and redistribution offer a productive way to think about feminism: cultural politics with regard to body images or sexual harassment, for example, are not to be understood as distractions from "real" politics that address comparable worth or the minimum wage.

Rather, recognition politics have consequences for the redistribution of social goods and resources even though they cannot be reduced to their redistributive effects. And since many left intellectuals in the 1990s were all too willing to think of politics as a zero-sum game in which any attention paid to multiculturalism had to come at the expense of democratic socialism and vice versa, Fraser's work seems to offer a way for the left to champion a progressive tax code and an end to racial profiling at the same time.

It is striking, nonetheless, that so few leftists have understood disability in these terms. Disability is not the only area of social life in which the politics of recognition are inseparable from the politics of redistribution; other matters central to citizenship, such as immigration, reproductive rights, and criminal justice, are every bit as complex. Nonetheless, our society's representations of disability are intricately tied to, and sometimes the very basis for, our public policies for "administering" disability. And when we contemplate, in these terms, the history of people with cognitive and developmental disabilities, we find a history in which "representation" takes on a double valence: first, in that people who were deemed incapable of representing themselves were therefore represented by a socio-medical apparatus that defined -- or, in a social-constructionist sense, created -- the category of "feeblemindedness"; and second, in the sense that the visual and rhetorical representations of "feebleminded" persons then set the terms for public policy. One cannot plausibly narrate a comprehensive history of ideas and practices of national citizenship in the post-Civil War United States without examining public policy regarding disability, especially mental disability, all the more especially when mental disability was then mapped onto certain immigrant populations who scored poorly on intelligence tests and were thereby pseudo-scientifically linked to criminality. And what of reproductive rights? By 1927, the spurious but powerful linkages among disability, immigration, poverty, and criminality provided the Supreme Court with sufficient justification for declaring involuntary sterilization legal under the Constitution.

There is an obvious reason why disability rights are so rarely thought of in terms of civil rights: disability was not covered in the Civil Rights Act of 1964. And as Anita Silvers points out, over the next twenty-five years, groups covered by civil rights law sometimes saw disability rights as a dilution of civil rights, on the grounds that people with disabilities were constitutively incompetent, whereas women and minorities faced discrimination merely on the basis of social prejudice. Silvers writes, "[t]o make disability a category that activates a heightened legal shield against exclusion, it was objected, would alter the purpose of legal protection for civil rights by transforming the goal from protecting opportunity for socially exploited people to providing assistance for naturally unfit people." The passage of the Americans with Disabilities Act (ADA) in 1990 did add disability to the list of stigmatized identities covered by antidiscrimination law, but thus far the ADA has been interpreted so narrowly, and by such a business-friendly judiciary, that employers have won over 95 percent of the suits brought under the act.

Jamie Berube currently has a right to an inclusive public education, but that right is neither intrinsic nor innate. Rather, Jamie's rights were invented, and implemented slowly and with great difficulty. The recognition of his human dignity, enshrined in those rights, was invented. And by the same token, those rights, and that recognition, can be taken away. While I live, I promise myself that I will not let that happen, but I live with the knowledge that it may: to live any other way, to live as if Jamie's rights were somehow intrinsic, would be irresponsible.

Of course, many of us would prefer to believe that our children have intrinsic human rights and human dignity no matter what; irrespective of any form of human social organization; regardless of whether they were born in twentieth-century Illinois or second-century Rome or seventh-century central Asia. But this is just a parent's -- or a philosophical foundationalist's -- wishful thinking. For what would it mean for Jamie to "possess" rights that no one on earth recognized? A fat lot of good it would do him. My argument may sound either monstrous or all too obvious: if, in fact, no one on earth recognized Jamie's human dignity, then there would in fact be no human perspective from which he would be understood to possess "intrinsic" human dignity. And then he wouldn't have it, and so much the worse for the human race.

In one respect, the promise of the IDEA (the Individuals with Disabilities Education Act), like the promise of the ADA, is clear: greater inclusion of people with disabilities in the social worlds of school and work. But in another sense the promise is unspecifiable; its content is something we actually cannot know in advance. For the IDEA does not merely guarantee all children with disabilities a free appropriate public education in the least restrictive environment. Even more than this, it grants the right to education in order that persons with disabilities might make the greatest possible use of their other rights -- the ones having to do with voting, or employment discrimination, or with life, liberty, and the pursuit of happiness.

IDEA is thus designed to enhance the capabilities of all American children with disabilities regardless of their actual abilities -- and this is why it is so profound a democratic idea. Here again I'm drawing on Nancy Fraser, whose theory of democracy involves the idea of "participatory parity," and the imperative that a democratic state should actively foster the abilities of its citizens to participate in the life of the polity as equals. Fraser's work to date has not addressed disability, but as I noted above, it should be easy to see how disability is relevant to Fraser's account of the politics of recognition and the politics of redistribution. This time, however, I want to press the point a bit harder. Fraser writes as if the promise of democracy entails the promise to enhance participatory parity among citizens, which it does, and she writes as if we knew what "participatory parity" itself means, which we don't. (This is why the promise of disability rights is unspecifiable.)

Imagine a building in which political philosophers are debating, in the wake of the attacks of September 11, 2001, the value and the purpose of participatory parity over against forms of authoritarianism or theocracy. Now imagine that this building has no access ramps, no Braille or large-print publications, no American Sign Language interpreters, no elevators, no special-needs paraprofessionals, no in-class aides. Contradictory as such a state of affairs may sound, it's a reasonably accurate picture of what contemporary debate over the meaning of democracy actually looks like. How can we remedy this? Only when we have fostered equal participation in debates over the ends and means of democracy can we have a truly participatory debate over what "participatory parity" itself means. That debate will be interminable in principle, since our understandings of democracy and parity are infinitely revisable, but lest we think of deliberative democracy as a forensic society dedicated to empyreal reaches of abstraction, we should remember that debates over the meaning of participatory parity set the terms for more specific debates about the varieties of human embodiment. These include debates about prenatal screening, genetic discrimination, stem-cell research, euthanasia, and, with regard to physical access, ramps, curb cuts, kneeling buses, and buildings employing what is now known as universal design.

Leftists and liberals, particularly those associated with university humanities departments, are commonly charged with being moral relativists, unable or unwilling to say (even after September 11) why one society might be "better" than another. So let me be especially clear on this final point. I think there's a very good reason to extend the franchise, to widen the conversation, to democratize our debates, and to make disability central to our theories of egalitarian social justice. The reason is this: a capacious and supple sense of what it is to be human is better than a narrow and partial sense of what it is to be human, and the more participants we as a society can incorporate into the deliberation of what it means to be human, the greater the chances that that deliberation will in fact be transformative in such a way as to enhance our collective capacities to recognize each other as humans entitled to human dignity. As Jamie reminds me daily, both deliberately and unwittingly, most Americans had no idea what people with Down syndrome could achieve until we'd passed and implemented and interpreted and reinterpreted a law entitling them all to a free appropriate public education in the least restrictive environment. I can say all this without appealing to any innate justification for human dignity and human rights, and I can also say this: Without a sufficient theoretical and practical account of disability, we can have no account of democracy worthy of the name.

Perhaps some of our fellow citizens with developmental disabilities would not put the argument quite this way; even though Jamie has led me to think this way, he doesn't talk the way I do. But those of us who do participate in political debates, whether about school funding in a specific district or about the theory and practice of democracy at its most abstract, have the obligation to enhance the abilities of our children and our fellow citizens with disabilities to participate in the life of the United States as political and moral equals with their nondisabled peers-both for their own good, and for the good of democracy, which is to say, for the good of all of us.

Michael Bérubé is the Paterno Family Professor in Literature at Pennsylvania State University. This article is adapted from a talk given at the 2002 convention of the Arc of the United States (formerly the Association for Retarded Citizens of the United States).

In Defense of Multilateralism

The argument seems easy: the International Criminal Court represents a significant step toward a global rule of law, and the United States should be part of it. That's ultimately right, but the argument is more complicated, and there are political as well as legal reasons for supporting American participation.

The Court will, of course, be used for political purposes. George W. Bush's administration is right on that point, and it seems silly to deny it. Even in well-ordered domestic societies, the judicial process can be politically exploited: think of the role of the Supreme Court in our last election. In international society, where the rule of law is still a distant dream, politics is almost certain to determine the early course of the ICC. And it is entirely certain that there will be efforts to focus prosecutorial energy on Americans abroad. In many parts of the world, the Court will be viewed primarily as an instrument designed to set limits on (our) hegemonic power. How one feels about that depends, I suppose, on one's position in the world. Taking a long view, Americans may one day be very happy to see the next hegemon (China, say) constrained by judicial authority. But the debate about the ICC offers liberals and leftists a chance to argue for something more than this: not a constrained but a shared hegemony.

The current American position goes something like this: when war is just and necessary, as in the Gulf in 1991 or in Kosovo in 1999, it is the United States that bears the brunt of the fighting. Our European allies oppose American unilateralism only this far: they want a role in deciding when war is just and necessary, but they are content, once the decision is made, to leave most of the fighting to American soldiers. Americans are supposed to accept the risks of war (and are criticized, sometimes rightly, for fighting at long range so as to reduce those risks), and we are also supposed to accept the legal liabilities. It is American soldiers, and hardly anyone else, who will be accused of war crimes, and they will be accused both when there are legal reasons to think that crimes have been committed and when there are political reasons to pretend that crimes have been committed. Why should we expose our soldiers to this liability when, in fact if not in principle, no other country's soldiers are similarly exposed?

But the administration's argument assumes the permanent unilateralism of American warmaking. As in Afghanistan, the Bush people prefer to be in full charge, acting alone or virtually alone, consulting no one. They are happy to bring in allied forces, but only after the fighting is over, to help in peacekeeping. And America's allies are not in fact unhappy about this role, whatever they say about it, because it frees them from the expense of raising and equipping a modern army. I learned recently that the German soldiers sent to Afghanistan as peacekeepers flew there on rented Russian planes. The Germans have created a rapid deployment force but have not invested in the means to deploy it. The case is similar across Western Europe, in virtually every area of military endeavor. This means that the Europeans are in the morally ambiguous position of claiming a role in decision making while remaining unready to share in the risks of decisions made. The Bush administration exploits this moral ambiguity to argue against multilateral decision making. But liberals and leftists both here and in Europe, it seems to me, should be arguing the other way around: against unilateral warmaking. We should seize the occasion of the ICC debate to insist that the best way to avoid exposing American soldiers to political prosecutions is to make sure that European (and other) soldiers are similarly exposed. The eagerness to invent war crimes will be greatly diminished if war, when it is just and necessary, is a genuinely multilateral engagement.

This requires that European countries (and others too, but right now the NATO states are the relevant ones for this discussion) be ready to send soldiers into battle. Americans and Europeans will have to argue about when to do that, but the readiness to do it is a moral necessity. If there is to be equal liability in the courtroom, there has to be equal risk on the battlefield. I don't understand how Europeans can be so outraged at America's refusal to join the Court when they are so unwilling to join the battle. On the other hand, the Bush administration invites the outrage by its refusal to join even in peacekeeping efforts (in Bosnia, as I write) where multilateral engagement and multilateral liability are already in place. Wherever the risks of engagement are shared, there is no excuse for refusing to share the legal liabilities.

In fact, we should join the ICC now, even when the risks are not yet (fully) shared. Joining would be a signal of what is hardly apparent today: American support for multilateralism down the road. And meanwhile, in addition to constraining American hegemony, the court will also set up mechanisms useful to everyone, Americans included, for conciliation and reparation in cases of injury and criminal action. Bush's lawyers apparently see these mechanisms only as potential problems. In fact, however, we will greatly enhance the legitimacy of our military operations in foreign countries if we acknowledge the possibility of external judicial review. Consider the bombing of an Afghan wedding party this summer, now under investigation by the U.S. Air Force. This is an example of warfare from a great distance, with very low risk for the American pilots involved but high risk, apparently, for civilians on the ground. How can we claim exclusive jurisdiction? The ICC could bring a just closure to cases like this (if anything like criminal negligence is involved), in a way that no U.S. military or civil court could possibly do. We have submitted American soldiers to the jurisdiction of foreign courts before this -- in Japan, Germany, and Italy, for example -- so why not to the jurisdiction of an international court that we will have, or could have, a hand in creating?

And from this beginning, some future U.S. administration could pursue a fuller multilateralism, insisting that other countries involve themselves seriously and substantially, along with us, in the risks of humanitarian interventions, just wars, and international peacekeeping.
