Rebecca Gordon

Exceptions to American exceptionalism and what 2023 could bring

Rebecca Gordon: Another Exceptional Year?

Originally, he was an America Firster, a phrase from the pre-World War II era deeply associated with anti-Semitism. (Yep, he arrived there years before Ye ever made it on the scene!) As an America Firster, who undoubtedly snitched the phrase from Pat Buchanan’s 2000 presidential run, he even used it in his 2017 inaugural address. (“We assembled here today are issuing a new decree to be heard in every city, in every foreign capital and in every hall of power. From this day forward, a new vision will govern our land. From this day forward, it’s going to be only America first — America first.”) Initially, however, he rejected the allied idea of American exceptionalism. As he put it at an event in Texas in 2015, “I don’t like the term. I never liked it. When I see these politicians get up [and say], ‘the American exceptionalism’ — we’re dying. We owe $18 trillion in debt. I’d like to make us exceptional. And I’d like to talk later instead of now.”

And when you think about it, that was rather exceptional in its own way. After all, he was claiming — as no other politician would have dared to do at the time — that this country was anything but exceptional; that, in fact, it had to be brought back from the graveyard of history and that he was going to do so. He was going to “make America great again” (MAGA), a phrase that, in 2015-2016, no other politician, Democrat or Republican, would have dared use. Of course, his truest position was always Trump First and, in 2023, it’s undoubtedly Make Trump Great Again. However, explain it as you will, he did later adopt American exceptionalism when, assumedly, he had made his country and his presidency the exception to all rules. And of course, in January 2021, he gave genuine new meaning both to “Trump first” and “American exceptionalism” when he called on his followers to attend a “big protest in D.C.” that would “be wild” and support that most exceptional of all presidents, him.

In the Biden era, with the January 6th moment more or less behind us, we’re back to American exceptionalism, not that most of our politicians ever truly left it in the dust. And as TomDispatch regular Rebecca Gordon, author of American Nuremberg, reminds us today, lest you think otherwise, this country does remain all too exceptional, even if in ways that couldn’t be more unnerving. Let her explain. Tom

American Exceptionalism on Full Display: Why This Country Might Want to Lower Its Expectations

Let me start with a confession: I no longer read all the way through newspaper stories about the war in Ukraine. After years of writing about war and torture, I’ve reached my limit. These days, I just can’t pore over the details of the ongoing nightmare there. It’s shameful, but I don’t want to know the names of the dead or examine images caught by brave photographers of half-exploded buildings, exposing details — a shoe, a chair, a doll, some half-destroyed possessions — of lives lost, while I remain safe and warm in San Francisco. Increasingly, I find that I just can’t bear it.

And so I scan the headlines and the opening paragraphs, picking up just enough to grasp the shape of Vladimir Putin’s horrific military strategy: the bombing of civilian targets like markets and apartment buildings, the attacks on the civilian power grid, and the outright murder of the residents of cities and towns occupied by Russian troops. And these aren’t aberrations in an otherwise lawfully conducted war. No, they represent an intentional strategy of terror, designed to demoralize civilians rather than defeat an enemy military. This means, of course, that they’re also war crimes: violations of the laws and customs of war as summarized in 2005 by the International Committee of the Red Cross (ICRC).

The first rule of war, as laid out by the ICRC, requires combatant countries to distinguish between (permitted) military and (prohibited) civilian targets. The second states that “acts or threats of violence the primary purpose of which is to spread terror among the civilian population” — an all-too-on-target summary of Russia’s war-making these last 10 months — “are prohibited.” Violating that prohibition is a crime.

The Great Exceptions

How should war criminals be held accountable for their actions? At the end of World War II, the victorious Allies answered this question with trials of major German and Japanese officials. The most famous of these were held in the German city of Nuremberg, where the first 22 defendants included former high government officials, military commanders, and propagandists of the Nazi regime, as well as the banker who built its war machine. All but three were convicted and 12 were sentenced to death.

The architects of those Nuremberg trials — representatives of the United States, the Soviet Union, the United Kingdom, and France — intended them as a model of accountability for future wars. The best of those men (and most of them were men) recognized their debt to the future and knew they were establishing a precedent that might someday be held against their own nations. The chief prosecutor for the United States, Robert H. Jackson, put it this way: “We must not forget that the record on which we judge the defendants today is the record on which we will be judged tomorrow.”

Indeed, the Nuremberg jurists fully expected that the new United Nations would establish a permanent court where war criminals who couldn’t be tried in their home countries might be brought to justice. In the end, it took more than half a century to establish the International Criminal Court (ICC). Its founding document, the Rome Statute, was adopted only in 1998 and entered into force in 2002, once 60 nations had ratified it. Today, 123 countries are parties to it.

Russia is a major exception. Because it never accepted the court’s jurisdiction, its leaders can’t be tried at the ICC for the crime the Nuremberg tribunal identified as the source of all the rest of the war crimes the Nazis committed: launching an aggressive, unprovoked war.

Guess what other superpower has never joined the ICC? Here are a few hints:

  • Its 2021 military budget dwarfed that of the next nine countries combined and was 1.5 times the size of what the world’s other 144 countries with such budgets spent on defense that year.
  • Its president has just signed a $1.7 trillion spending bill for 2023, more than half of which is devoted to “defense” (and that, in turn, is only part of that country’s full national security budget).
  • It operates roughly 750 publicly acknowledged military bases in at least 80 countries.
  • In 2003, it began an aggressive, unprovoked (and disastrous) war by invading a country 6,900 miles away.

War Crimes? No, Thank You

Yes, the United States is that other Great Exception to the rules of war. While, in 2000, during the waning days of his presidency, Bill Clinton did sign the Rome Statute, the Senate never ratified it. Then, in 2002, as the Bush administration was ramping up its “global war on terror,” including its disastrous occupation of Afghanistan and an illegal CIA global torture program, the United States simply withdrew its signature entirely. Secretary of Defense Donald Rumsfeld then explained why this way:

[T]he ICC provisions claim the authority to detain and try American citizens — U.S. soldiers, sailors, airmen and Marines, as well as current and future officials — even though the United States has not given its consent to be bound by the treaty. When the ICC treaty enters into force this summer, U.S. citizens will be exposed to the risk of prosecution by a court that is unaccountable to the American people, and that has no obligation to respect the Constitutional rights of our citizens.

That August, in case the U.S. stance remained unclear to anyone, Congress passed, and President George W. Bush signed, the American Servicemembers Protection Act of 2002. As Human Rights Watch reported at the time, “The new law authorizes the use of military force to liberate any American or citizen of a U.S.-allied country being held by the [International Criminal] Court, which is located in The Hague.” Hence, its nickname: the “Hague Invasion Act.” A lesser-known provision also permitted the United States to withdraw military support from any nation that participates in the ICC.

The assumption built into Rumsfeld’s explanation was that there was something special — even exceptional — about U.S. citizens. Unlike the rest of the world, we have “Constitutional rights,” which apparently include the right to commit war crimes with impunity. Even if a citizen is convicted of such a crime in a U.S. court, he or she has a good chance of receiving a presidential pardon. And were such a person to turn out to be one of the “current and future officials” Rumsfeld mentioned, his or her chance of being hauled into court would be about the same as mine of someday being appointed secretary of defense.

The United States is not a member of the ICC, but, as it happens, Afghanistan is. In 2018, the court’s chief prosecutor, Fatou Bensouda, formally requested that a case be opened for war crimes committed in that country. The New York Times reported that Bensouda’s “inquiry would mostly focus on large-scale crimes against civilians attributed to the Taliban and Afghan government forces.” However, it would also examine “alleged C.I.A. and American military abuse in detention centers in Afghanistan in 2003 and 2004, and at sites in Poland, Lithuania, and Romania, putting the court directly at odds with the United States.”

Bensouda planned an evidence-gathering trip to the United States, but in April 2019, the Trump administration revoked her visa, preventing her from interviewing any witnesses here. It then followed up with financial sanctions on Bensouda and another ICC prosecutor, Phakiso Mochochoko.

Republicans like Bush and Trump are not, however, the only presidents to resist cooperating with the ICC. Objection to its jurisdiction has become remarkably bipartisan. It’s true that, in April 2021, President Joe Biden rescinded the strictures on Bensouda and Mochochoko, but not without emphasizing this exceptional nation’s opposition to the ICC as an appropriate venue for trying Americans. The preamble to his executive order notes that:

The United States continues to object to the International Criminal Court’s assertions of jurisdiction over personnel of such non-State Parties as the United States and its allies absent their consent or referral by the United Nations Security Council and will vigorously protect current and former United States personnel from any attempts to exercise such jurisdiction.

Neither Donald Rumsfeld nor Donald Trump could have said it more clearly.

So where do those potential Afghan cases stand today? A new prosecutor, Karim Khan, took over in 2021. He announced that the investigation would indeed go forward, but that acts of the U.S. and allies like the United Kingdom would not be examined. He would instead focus on actions of the Taliban and the Afghan offshoot of the Islamic State. When it comes to potential war crimes, the United States remains the Great Exception.

In other words, although this country isn’t a member of the court, it wields more influence than many countries that are — all of which means that, in 2023, the United States is not in the best position when it comes to accusing Russia of horrifying war crimes in Ukraine.

What the Dickens?

I blame my seven decades of life for the way my mind can now meander. For me, “great exceptions” brings to mind Charles Dickens’s classic story Great Expectations. His novels exposed the cruel reality of life among the poor in an industrializing Great Britain, with special attention to the pain felt by children. Even folks whose only brush with Dickens was reading Oliver Twist or watching The Muppet Christmas Carol know what’s meant by the expression “Dickensian poverty.” It’s poverty with that extra twist of cruelty — the kind the American version of capitalism has so effectively perpetuated.

When it comes to poverty among children, the United States is indeed exceptional, even among the 38 largely high-income nations of the Organization for Economic Cooperation and Development (OECD). As of 2018, the average rate of child poverty in OECD countries was 12.8%. (In Finland and Denmark, it was only 4%!) For the United States, with the world’s highest gross domestic product, however, it was 21%.

Then, something remarkable happened. In year two of the Covid pandemic, Congress passed the American Rescue Plan, which (among other measures) expanded the child tax credit from $2,000 up to as much as $3,600 per child. The payments came in monthly installments and, unlike the Earned Income Tax Credit, a family didn’t need to have any income to qualify. The result? An almost immediate 40% drop in child poverty. Imagine that!

Given such success, you might think that keeping an expanded child tax credit in place would be an obvious move. Saving little children from poverty! But if so, you’ve failed to take into account the Republican Party’s remarkable commitment to maintaining its version of American exceptionalism. One of the items that the party’s congressional representatives managed to get expunged from the $1.7 trillion 2023 appropriations bill was that very expanded child tax credit. It seems that cruelty to children was the Republican Party’s price for funding government operations.

Charles Dickens would have recognized that exceptional — and gratuitous — piece of meanness.

The same bill, by the way, again thanks to Republican negotiators, ended universal federal public-school-lunch funding, put in place during the pandemic’s worst years. And lest you think the Republican concern with (extending) poverty ended with starving children, the bill will also allow states to resume kicking people off Medicaid (federally subsidized health care for low-income people) starting in April 2023. The Kaiser Family Foundation estimates that one in five Americans will lose access to medical care as a result.

Great expectations for 2023, indeed.

We’re the Exception!

There are, in fact, quite a number of other ways in which this country is also exceptional. Here are just a few of them:

  • Children killed by guns each year. In the U.S. it’s 5.6 per 100,000. That’s seven times as high as the next highest country, Canada, at 0.8 per 100,000.
  • Number of required paid days off per year. This country is exceptional here as well: it mandates zero paid vacation days, and even its 10 federal holidays aren’t guaranteed paid time off for private-sector workers. Even Mexico mandates six paid vacation days and seven holidays, for a total of 13. At the other end of the scale, Chile, France, Germany, South Korea, Spain, and the United Kingdom all require a combined total of more than 30 paid days off per year.
  • Life expectancy. According to 2019 data, the latest available from the World Health Organization for 183 countries, U.S. average life expectancy at birth for both sexes is 78.5 years. Not too shabby, right? Until you realize that there are 40 countries with higher life expectancy than ours, including Japan at number one with 84.26 years, not to mention Chile, Greece, Peru, and Turkey, among many others.
  • Economic inequality. The World Bank calculates a Gini coefficient of 41.5 for the United States in 2019. The Gini is a 0-to-100-point measure of inequality, with 0 being perfect equality. The World Bank lists the U.S. economy as more unequal than those of 142 other countries, including places as poor as Haiti and Niger. Incomes are certainly lower in those countries, but unlike the United States, the misery is spread around far more evenly.
  • Women’s rights. The United States signed the United Nations Convention on the Elimination of All Forms of Discrimination against Women in 1980, but the Senate has never ratified it (thank you again, Republicans!), so it doesn’t carry the force of law here. Last year, the right-wing Supreme Court gave the Senate a helping hand with its decision in Dobbs v. Jackson Women’s Health Organization to overturn Roe v. Wade. Since then, several state legislatures have rushed to join the handful of nations that outlaw all abortions. The good news is that voters in states from Kansas to Kentucky have affirmed women’s bodily autonomy by rejecting anti-abortion ballot propositions.
  • Greenhouse gas emissions. Well, hooray! We’re no longer number one in this category. China surpassed us in 2006. Still, give us full credit; we’re a strong second and remain historically the greatest greenhouse gas emitter of all time.

Make 2023 a (Less) Exceptional Year

Wouldn’t it be wonderful if we were just a little less exceptional? If, for instance, in this new year, we were to transfer some of those hundreds of billions of dollars Congress and the Biden administration have just committed to enriching corporate weapons makers, while propping up an ultimately unsustainable military apparatus, to the actual needs of Americans? Wouldn’t it be wonderful if just a little of that money were put into a new child tax credit?

Sadly, it doesn’t look very likely this year, given a Congress in which, however minimally and madly, the Republicans control the House of Representatives. Still, whatever the disappointments, I don’t hate this country of mine. I love it — or at least I love what it could be. I’ve just spent four months on the front lines of American politics in Nevada, watching some of us at our very best risk guns, dogs, and constant racial invective to get out the vote for a Democratic senator.

I’m reminded of poet Lloyd Stone’s words that I sang as a teenager to the tune of Sibelius’s Finlandia hymn:

My country’s skies are bluer than the ocean
And sunlight beams on cloverleaf and pine
But other lands have sunlight, too, and clover,
And skies are everywhere as blue as mine.
Oh, hear my prayer, O gods of all the nations
A song of peace for their lands and for mine

So, no great expectations in 2023, but we can still hope for a few exceptions, can’t we?

Living for politics or 'just living'?

Rebecca Gordon: Three Conversations about Politics

Since I turned 18, I doubt I’ve ever missed a vote. Certainly, though, I never missed a presidential election. In 1968, at age 24, for instance, already swept away by the anti-Vietnam War movement, I voted for antiwar Democrat Eugene McCarthy in the New York primary. Even though McCarthy would win the popular vote nationally in the Democratic primaries, he lost the nomination, in a distinctly controversial fashion, at the Democratic convention to former Vice President Hubert Humphrey, hardly an antiwar sort of guy. Still, in the election to come, I voted for him, only to see Republican Richard Nixon (of the notorious “Southern strategy” and later Watergate infamy) beat him nationally, become president, and later expand that American war in North Vietnam, Laos, and Cambodia. (Note that Alabama segregationist governor George Wallace won more than 5% of New York State’s vote that year, a reminder with Nixon that there has long been a Trumpian quality to American politics.) And then, four years later, I would vote for George McGovern, again to end that war, only to watch Nixon win for the second time in a landslide (even in New York!). Sigh.

Still, to this day, I do go out and vote, although, on my way to the polls, I sometimes have to ask my wife whom I should vote for farther down the ticket. So, in my modest, haphazard fashion, I’ve participated in American politics, but never, like TomDispatch regular Rebecca Gordon, just back from the front lines of the recent midterm elections, in actual campaign work. Not since, as a child on Halloween, I took a donation container door to door in my apartment building for UNICEF, have I ever, as Gordon describes so vividly today, tried to directly convince anyone to do anything political in a campaign of any sort. (And given the recent midterms, as you’ll see when you read her piece today, thank heavens she, and so many other political activists like her, did so in a big-time way!) She has what she calls a “political vocation” and, given our present American world, the 2022 election season, and the 2024 version to come, thank goodness she, like so many others, does.

Still, I wouldn’t claim that I had no political vocation whatsoever. In my own fashion, here at TomDispatch, I’ve labored week after week, month after month, trying to put crucial information about how our world actually works and who is (and isn’t) responsible for that in front of anyone willing to read such pieces. And that, in its own fashion, has, I suppose, been my vocation, my version, you might say, of going out on the campaign trail — though what the reader does with anything I publish at this website is, of course, up to him or her. Now, if you want to think a little about what your own vocation in life might be, political or otherwise, check out Gordon. Tom

Living for Politics or "Just Living"?

“Welcome back!” read my friend Allan’s email. “So happy to have you back and seeing that hard work paid off. Thank you for all that you do. Please don’t cook this evening. I am bringing you a Honduran dinner — tacos hondureños and baleadas, plus a bottle of wine.” The tacos were tasty indeed, but even more pleasing was my friend’s evident admiration for my recent political activities.

My partner and I had just returned from four months in Reno, working with UNITE-HERE, the hospitality industry union, on their 2022 midterm electoral campaign. It’s no exaggeration to say that, with the votes in Nevada’s mostly right-wing rural counties cancelling out those of Democratic-leaning Las Vegas, that union campaign in Reno saved the Senate from falling to the Republicans. Catherine Cortez Masto, the nation’s first Latina senator, won reelection by a mere 7,928 votes, out of a total of more than a million cast. It was her winning margin of 8,615 in Washoe County, home to Reno, that put her over the top.

Our friend was full of admiration for the two of us, but the people who truly deserved the credit were the hotel housekeepers, cooks, caterers, and casino workers who, for months, walked the Washoe County streets six days a week, knocking on doors in 105-degree heat and even stumping through an Election Day snowstorm. They endured having guns pulled on them, dogs sicced on them, and racist insults thrown at them, and still went out the next day to convince working-class voters in communities of color to mark their ballots for a candidate many had never heard of. My partner and I only played back-up roles in all of this; she, managing the logistics of housing, feeding, and supplying the canvassers, and I, working with maps and spreadsheets to figure out where to send the teams each day. It was, admittedly, necessary, if not exactly heroic, work.

“I’m not like the two of you,” Allan said when he stopped by with the promised dinner. “You do important work. I’m just living my life.”

“Not everybody,” I responded, “has a calling to politics.” And I think that’s true. I also wonder whether having politics as a vocation is entirely admirable.

Learning to Surf

That exchange with Allan got me thinking about the place of politics in my own life. I’ve been fortunate enough to be involved in activism of one sort or another for most of my 70 years, but it’s been just good fortune or luck that I happened to stumble into a life with a calling, even one as peculiar as politics.

There are historical moments when large numbers of people “just living” perfectly good lives find themselves swept up in the breaking wave of a political movement. I’ve seen quite a few of those moments, starting with the struggle of Black people for civil rights when I was a teenager, and the movement to stop the Vietnam War in that same era. Much more recently, I’ve watched thousands of volunteers in Kansas angrily reject the Supreme Court’s decision in Dobbs v. Jackson Women’s Health Organization, which overturned a 50-year precedent protecting a woman’s right to end a pregnancy. Going door to door in a classic political field campaign, they defeated a proposed anti-abortion amendment to the Kansas constitution, while almost doubling the expected turnout for a midterm primary.

To some observers, in a red and landlocked state like Kansas, that wave of resistance seemed to come out of nowhere. It certainly surprised a lot of professionals, but the capacity to ride it didn’t, in fact, come out of nowhere. When given a choice, it turns out that a substantial majority of people in the middle of this country will vote in favor of women’s bodily autonomy. But many of them won’t do it without a push. To build such a successful electoral campaign required people who’d spent years honing the necessary skills in times when the political seas appeared almost unendurably flat.

Some of those skills, learned through repeated practice, were technical: crafting effective messages; targeting the right voters; navigating coalitions of organizations with sometimes overlapping, sometimes competing priorities. And some might be called “moral skills,” the cultivation of internal characteristics — patience, say, or hope — until they become second nature. The Greek philosopher Aristotle called those moral skills “virtues” and believed we acquire them just like any other skill — by practicing them until they become habits.

You could compare some of us with a political vocation to a surfer sitting on her board constantly scanning the sea ahead, hoping to discern the best waves as they form, hoping she’d practiced well enough to ride them. Like so many surfers of this sort, I’ve probably wiped out more often than I’ve successfully ridden the waves.

Character Flaws for Justice

“This is the year,” I told a different friend long ago, “that I want to develop character flaws.” She was understandably startled, not least because my character has never been what you might call spotless.

“Why would you want to do that?” she asked.

“Because I’m getting ready to work on a political campaign.” I was only half-joking. In fact, doing politics effectively requires habits that don’t come naturally to me — like keeping information close to my vest rather than sharing everything I know with all comers.

There’s a fine line, too, between sitting on information and telling lies. In fact, to do politics effectively, you must be willing to lie. This truth is often taken for granted by those involved. A recent New York Times article about a man who can’t stop lying referred to a study of people’s self-reported truthfulness. Writing about those who admit to lying frequently, reporter Ellen Barry says,

This ‘small group of prolific liars,’ as the researchers termed it, constituted around 5.3 percent of the population but told half the reported lies, an average of 15 per day. Some were in professions, like retail or politics, that compelled them to lie. But others lied in a way that had no clear rationale. [My emphasis added, of course.]

As Barry sees it, politics is self-evidently a profession that compels its practitioners to lie. And I tend to agree with her, though I’m less interested in the lies candidates tell voters to get elected than the ones organizers like me tell people to get them to join, stick with, or fund a campaign.

Often, we lie about whether we can win. As I’ve written previously, I worked on campaigns I was sure we were going to lose, but that I thought were worth fighting anyway. In 1995 and 1996, for instance, I helped build a field campaign to defeat California Proposition 209, which succeeded in outlawing affirmative action at every level of government. We didn’t have much of a chance, but we still built an army of volunteers statewide, in part by telling them that, though our opponents had the money, we had the people capable of engaging voters no one expected to participate.

So, we said we could win because we were thinking ahead. Proposition 209 represented a cynical effort (indeed, its authors called it the California Civil Rights Initiative) to harness white anxiety about what would soon be a nonwhite majority in California. We hoped that building a multi-racial coalition to fight this initiative, even if we lost, would prepare people for the struggles to come.

But did I really know we couldn’t win? At some point, I suppose I traded in one virtue — truthfulness — for another — hope. And then, to project confidence and encourage others to hope as well, I had to start believing my own lies (at least a bit).

The funny thing about hope, though, is that sometimes the lies you force yourself to believe turn out to be true. That’s what happened this year with the campaign in Nevada. You never have enough canvassers to talk to every voter, so you have to choose your main target groups. UNITE-HERE chose to target people of color in working-class neighborhoods who rarely or never participate in elections.

Voters in Nevada are unusual in that more than a third of them (37%) are registered to vote with a small party or have no party affiliation at all. This is the largest single group of voters in the state, and it included many of our targets. Registered Democrats have a 6% edge over Republicans in Nevada, but the question always is: Which way will the people in the mysterious middle vote — for us or them? During two weeks of early voting, I downloaded the statistics on the party affiliations of the voters in Washoe County, where I was working. Democrats were winning the mail-in ballots, but when it came to in-person voting, the Republicans were creaming us. It didn’t look good at all — except that the numbers of small-party or no-party voters dwarfed the consistent edge the Republicans held. Which way would they jump?

I typically kept those statistics to myself, since it wasn’t part of my job to look at them in the first place. In the upbeat daily briefing for our canvassing team leaders, I concentrated instead on reporting the crucial everyday numbers for us: How many doors did we knock on yesterday? How many conversations did we have with voters? How many supporters did we identify? Those numbers I could present with honest enthusiasm, pointing to improvements made, for instance, by working with individual canvassers on how to keep doors open and voters talking.

But the funny thing was this: the hope I was projecting turned out to be warranted. The strategy that failed in California in 1996 — bringing out unlikely voters in communities of workers and people of color — succeeded in Nevada in 2022. When we opened the mystery box, it turned out to contain voters for us.

One More Conversation

I once had a friend, Lauren, who, for years, had been a member of one of the political organizations that grew out of the 1960s radical group Students for a Democratic Society. She’d gone to meetings and demonstrations, collated newsletters, handed out flyers, and participated in a well-functioning system of collective childcare. One day, I asked her how the work was going.

“Oh,” she said. “I dropped out. I still spend every Wednesday night with Emma [the child whose care she had shared in that group], but I’m not doing political work anymore.”

“But why not?”

“I realized that everything about politics involves making people do things they don’t want to do and that’s not how I want to spend my life.”

Even now, years later, I can see her point. Whether it’s asking my fellow part-time university teachers to come to a union meeting, trying to get a stranger to accept a leaflet on the street, or convincing a potential voter to listen to me about why this election matters and should matter to them, my strange vocation often does involve attempting to get people to do things they don’t particularly want to do.

Of course, it’s because I do believe in whatever I’m trying to move them toward that I’m involved in such politics in the first place. Usually, it’s because I believe that my goal should be their goal, too, whether it’s racial or economic justice, women’s liberation, or just keeping the planet from burning up.

But that leads me to another character flaw politics requires. You could call it pride, or even arrogance; it’s the confidence that I know better than you what’s good for you. Oddly enough, it may turn out that it’s when I’m pushing the most selfish goals — when I’m working for something I myself need like a living wage or the right to control my own body — that my motives stand up best to my own scrutiny.

It’s then that I’m asking someone to participate in collective action for my own benefit, and what could be more honest than that?

Politics as a Vocation

“Politics as a Vocation” was the title of a well-known lecture by German sociologist Max Weber. In it, he famously defined the state as “a human community that (successfully) claims the monopoly of the legitimate use of physical force within a given territory.” Even when the use of force is delegated to some other institution — say, the police — Weber argued that citizens accept the “right” of the police to use violence because it comes from the state. That source of legitimacy is the only thing that separates a police force (so to speak) from any other violent gang.

For Weber, politics meant either leading a state or influencing its leaders. So if a state controls the legitimate use of force, then politics involves deciding how that force is to be deployed — under what conditions and for what purposes. It’s a heavy responsibility that, he claimed, people take on for one of only two reasons: either as a means to an end (which could be anything from personal wealth to ending poverty) or for its own sake — for the pleasure and feeling of prestige that power bestows.

“The decisive means for politics,” Weber wrote, “is violence.” If he was right, then my friend’s intuition that politics is about making people do things they don’t want to do may not have been so off the mark. Even the form of politics that appears to challenge Weber’s premise — the tradition of nonviolent action — involves a form of coercion. Those who willingly expose themselves to political violence are also trying to make people do something they don’t want to do by invoking empathy (and possibly feelings of guilt).

If, in some fashion, all politics really does involve coercion, can a political life possibly be a morally good one? I still think so, but it requires tempering a commitment to a cause with what Weber called the “ethic of responsibility” — a willingness not only to honestly examine our motives but to genuinely consider the likely results when we choose to act on them. It’s not enough to have good intentions. It’s crucial to strive as well for good — if imperfect — outcomes.

“Politics,” Weber said, “is a strong and slow boring of hard boards. It takes both passion and perspective.” But there’s another kind of life he also recommended, even if with a bit of a sneer, to those who don’t measure up to the demands of politics as a vocation. Such people “would have done better,” he observed, “in simply cultivating plain brotherliness in personal relations.”

And therein lies the greatest moral danger for those of us who feel that our vocation is indeed politics: a contempt for that plain “brotherliness” (or sisterliness) that makes ordinary human life bearable. There’s a saying attributed to Carlos Fonseca, one of the founders of Nicaragua’s revolutionary party, the Sandinistas: “A man [of course, it’s a man!] who is tired has a right to rest. But a man who rests does not have the right to be in the vanguard.”

And there it is, a fundamental disrespect for ordinary human life, including the need for rest, that tempts the activist to feel her calling makes her better than the people she’s called to serve.

In the end, if we do politics at all, it should be precisely so that people can have ordinary lives, ones not constrained and distorted by the kinds of injustice political activists try to end.

“I’m just living my life,” my friend Allan told me. In truth, his life is far more admirable than he believes. I’d say that he has a vocation for kindness every bit as heroic as any political calling. We’re not the only folks he feeds. The day before he visited us, he’d delivered dinner to another friend after her shoulder surgery. He spends little on himself, so he can send most of the money he earns to his family in Central America. During the worst of the pandemic shutdown, he regularly checked in on all the old folks he knows, startling my partner and me into realizing that we’ve lived long enough to fall into the category of elders to be looked after.

At the end of this long political season, back home from Nevada, I find that I’m full of admiration for the life my friend Allan is “just living.” As I wait for the next Trumpist wave to rise, may I remember that “just living” is the whole point of doing politics.

Returning to Reno in the shadow of Roe's undoing

Rebecca Gordon, Back to the Future Again

How far are we? Who really knows? Let’s just say that we’re somewhere significantly down the road to extremity, all-American style. With a hung (and wrung-out) Congress and lame (and aged) president, our tripartite government is looking ever less “tri” and ever more “part.” And it increasingly seems that the part being emphasized is a Supreme Court that should perhaps be renamed the Extreme Court. Only recently, it issued a series of Trumpist rulings that, from green-lighting the carrying of concealed weaponry to suppressing abortion to keeping climate change on track, rivaled in their extremity the 1857 Dred Scott ruling’s endorsement of slavery that helped launch the Civil War.

And that may just be the beginning. In their next term, for instance, the six justices of the Extreme Court could turn directly to that “tri” and try to whittle it down further. In particular, they may endorse what’s called the independent state legislature doctrine, an extremist theory that, according to the New York Times, “would give state legislatures independent power, not subject to review by state courts, to set election rules at odds with state constitutions, and to draw congressional maps warped by partisan gerrymandering.” And since, at this point, a significant majority of state legislatures are controlled by Republicans, the possibility of gerrymandering the political map into a forever-winning extremist government seems all too imaginable.

So hold onto your hats (and guns), folks — we’ve already passed through the diciest post-election season in memory. (Sedition, you bet!) And we could be heading toward an all-American, all-Trumpist Extreme Court and Republican Party version of something akin to fascism.

With that in mind, could there be anything more important than getting out the vote in the elections of 2022 and 2024? I doubt it. So, thank you, TomDispatch regular Rebecca Gordon, for heading back to Nevada to lend a hand. We should all, whatever our doubts, take her as an example of what has to be done to prevent the Extremes from taking this land from so many of the rest of us. Tom

Returning to Reno In the Shadow of Roe's Undoing

Recently, I told my friend Mimi that, only weeks from now, I was returning to Reno to help UNITE-HERE, the hospitality industry union, in the potentially nightmarish 2022 election. “Even though,” I added, “I hate electoral politics.”

She just laughed.

“What’s so funny?” I asked.

“You’ve been saying that as long as I’ve known you,” she replied with a grin.

How right she was. And “as long as I’ve known you” has been a pretty long time. We met more than a quarter of a century ago when my partner and I hired her as the first organizer in a field campaign to defeat Proposition 209. That ballot initiative was one of a series pandering to the racial anxieties of white Californians that swept through the state in the 1990s. The first of them was Prop 187, outlawing the provision of government services, including health care and education, to undocumented immigrants. In 1994, Californians approved that initiative by a 59% to 41% vote. A federal court, however, found most of its provisions unconstitutional and it never went into effect.

We weren’t so lucky with Proposition 209, which, in 1996, outlawed affirmative-action programs statewide at any level of government or public service. Its effects reverberate to this day, not least at the prestigious University of California’s many campuses.

A study commissioned 25 years later by its Office of the President revealed that “Prop 209 caused a decline in systemwide URG enrollment by at least twelve percent.” URGs are the report’s shorthand for “underrepresented groups” — in other words, Latinos, Blacks, and Native Americans. Unfortunately, Proposition 209’s impact on the racial makeup of the university system’s students has persisted for decades and, as that report observed, “led URG applicants to cascade out of UC into measurably less-advantageous universities.” Because of UC’s importance in California’s labor market, “this caused a decline in the total number of high-earning ($100,000) early-30s African American and Hispanic/Latinx Californians by at least three percent.”

Yes, we lost the Prop 209 election, but the organization we helped start back in 1995, Californians for Justice, still flourishes. Led by people of color, it’s become a powerful statewide advocate for racial justice in public education with a number of electoral and legislative victories to its name.

Shortcomings and the Short Run

How do I hate thee, electoral organizing? Let me count the ways. First, such work requires that political activists like me go wide, but almost never deep. It forces us to treat voters like so many items to be checked off a list, not as political actors in their own right. Under intense time pressure, your job is to try to reach as many people as possible, immediately discarding those who clearly aren’t on your side and, in some cases, even actively discouraging them from voting. In the long run, treating elections this way can weaken the connection between citizens and their government by reducing all the forms of democratic participation to a single action, a vote. Such political work rarely builds organized power that lasts beyond Election Day.

In addition, electoral campaigns sometimes involve lying not just to voters, but even to your own canvassers (not to speak of yourself) about whether you can win or not. In bad campaigns — and I’ve seen a couple of them — everyone lies about the numbers: canvassers about how many doors they’ve knocked on; local field directors about what their canvassers have actually done; and so on up the chain of command to the campaign director. In good campaigns, this doesn’t happen, but those may not, I suspect, be in the majority. And lying, of course, can become a terrible habit for anyone hoping to construct a strong organization, not to mention a better world.

Lying, as the philosopher Immanuel Kant argued, is a way of treating people as if they were merely things to be used. Electoral campaigns can often tempt organizers to take just such an instrumental approach to others, assuming voters and campaign workers have value only to the extent that they can help you win. Such an approach, however efficient in the short run, doesn’t build solidarity or democratic power for the long haul. Sometimes, of course, the threat is so great — as was true when it came to the possible reelection of Donald Trump in 2020 — that the short-run simply matters more.

Another problem with elections? Campaigns so often involve convincing people to do something they’ve come to think of as a waste of time, namely, going to the polls. A 2018 senatorial race I worked on, for example, focused on our candidate’s belief in the importance of raising the minimum wage. And yes, we won that election, but four years later, the federal minimum wage is still stubbornly stuck at $7.25 an hour, though not, of course, through any fault of our candidate. Still, the voters who didn’t think electing Nevada Senator Jacky Rosen would improve their pay weren’t wrong.

On the other hand, the governor we helped elect that same year (and for whose reelection I’ll be working again soon) did come through for working Nevadans by, for example, signing legislation that guarantees a worker’s right to be recalled before anyone new is hired when a workplace reopens after a Covid shutdown.

You’ll hear some left-wing intellectuals and many working people who are, in the words of the old saying, “too broke to pay attention,” claim that elections don’t change anything. But such a view grows ever harder to countenance in a world where a Supreme Court disastrously reshaped by Donald Trump and Mitch McConnell is hell-bent on reshaping nearly the last century of American political life. It’s true that overturning Roe v. Wade doesn’t affect my body directly. I’m too old to need another abortion. Still, I’m just as angry as I was in 2016 at people who couldn’t bring themselves to vote for Hillary Clinton because she wasn’t Bernie Sanders. As I told such acquaintances at the time, “Yes, we’ll hate her and we’ll have to spend the next four years fighting her, but on the other hand, SUPREME COURT, SUPREME COURT, SUPREME COURT!”

Okay, maybe that wasn’t exactly the most elegant of arguments, but it was accurate, as anyone will tell you who’d like to avoid getting shot by a random heat-packing pedestrian, buried under the collapsing wall between church and state, or burned out in yet another climate-change-induced conflagration.

If Voting Changed Anything…

Back in 1996, as Election Day approached, Californians for Justice had expanded from two offices — in Oakland and Long Beach — to 11 around the state. We were paying a staff of 45 and expanding (while my partner and I lay awake many nights wondering how we’d make payroll at the end of the week). We were ready for our get-out-the-vote push.

Just before the election, one of the three organizations that had given us seed money published its monthly newsletter. The cover featured a photo of a brick wall spray-painted with the slogan: “If voting changed anything, they’d make it illegal.” Great, just what we needed!

It’s not as if I didn’t agree, at least in part, with the sentiment. Certainly, when it comes to foreign policy and the projection of military force globally, there has been little difference between the two mainstream political parties. Since the end of World War II, Democrats and Republicans have cooperated in a remarkably congenial way when it comes to this country’s disastrous empire-building project, while financially rewarding the military-industrial complex, year after year, in a grandiose fashion.

Even in the Proposition 209 campaign, my interest lay more in building long-term political power for California communities of color than in a vote I already knew we would lose. Still, I felt then and feel today that there’s something deeply wrong with the flippant response of some progressives that elections aren’t worth bothering about. I’d grown up in a time when, in the Jim Crow South, voting was still largely illegal for Blacks and people had actually died fighting for their right to vote. Decades earlier, some of my feminist forebears had been tortured while campaigning for votes for women.

Making Voting Illegal Again

In 1965, President Lyndon Johnson signed the Voting Rights Act, explicitly outlawing any law or regulation that “results in the denial or abridgment of the right of any citizen to vote on account of race or color.” Its specific provisions required states or counties with a history of voter suppression to receive “pre-clearance” from the attorney general or the District Court for the District of Columbia for any further changes in election laws or practices. Many experts considered this provision the heart of that Act.

Then, in 2013, in Shelby County v. Holder, a Supreme Court largely shaped by Republican presidents tore that heart right out. Essentially, the court ruled that, because those once excluded from voting could now do so, such jurisdictions no longer needed preclearance to change their voting laws and regulations. In other words, because it was working, it should be set aside.

Not surprisingly, some states moved immediately to restrict access to voting rights. According to the Brennan Center for Justice, “within 24 hours of the ruling, Texas announced that it would implement a strict photo ID law. Two other states, Mississippi and Alabama, also began to enforce photo ID laws that had previously been barred because of federal preclearance.” Within two months, North Carolina passed what that center called “a far-reaching and pernicious voting bill” which:

instituted a strict photo ID requirement; curtailed early voting; eliminated same day registration; restricted preregistration; ended annual voter registration drives; and eliminated the authority of county boards of elections to keep polls open for an additional hour.

Fortunately, the Fourth Circuit Court of Appeals struck down the North Carolina law in 2016, and, surprisingly, the Supreme Court let that ruling stand.

But as it turned out, the Supremes weren’t done with the Voting Rights Act. In 2021, the present Trumpian version of the court issued a ruling in Brnovich v. Democratic National Committee upholding Arizona’s right to pass laws requiring people to vote only in precincts where they live, while prohibiting anyone who wasn’t a relative of the voter from hand-delivering mail-in ballots to the polls. The court held that, even though in practice such measures would have a disproportionate effect on non-white voters, as long as a law was technically the same for all voters, it didn’t matter that, in practice, it would become harder for some groups to vote.

Writing for the majority, Justice Samuel Alito declared that states have a different and more important interest in such voting restrictions: preventing voter fraud. In other words — at least in the minds of two-thirds of the present Supreme Court — some version of Donald Trump’s big lie about rigged elections and voter fraud has successfully replaced racist voter suppression as the primary future danger to free and fair elections.

Maybe elections do change something. Otherwise, why, in the wake of the 2020 elections, would “they” (including Republican-controlled state legislatures across significant parts of the country) be so intent on making it ever harder for certain people to vote? And if you think that’s bad, wait until the Supremes rule next year on the fringe legal theory of an “independent state legislature.” We may well see the court decide that a state’s legislature can legally overrule the popular vote in a federal election — just in time for the 2024 presidential race.

The Future Awaits Us

A couple of times a week I talk by phone with another friend. We began doing this at the height of George W. Bush’s and Dick Cheney’s vicious “war on terror.” We’d console each other when it came to the horrors of that conflict, including the illegal invasion of Iraq, the deaths and torture of Iraqi and Afghan civilians, and the seemingly endless expansion of American imperial meddling. We’re still doing it. Somehow, every time we talk, it seems as if the world has traveled one more mile on its way to hell in a handbasket.

Both of us have spent our lives trying, in our own modest fashion, to gum up the works of capitalism, militarism, and authoritarian government. To say that we’ve been less than successful would certainly be understating things. Still, we do keep at it, while discussing what in the world we can still do.

At this point in my life and my country’s slide into authoritarian misery, I often find it hard even to imagine what would be useful. Faced with such political disorientation, I fall back on a core conviction that, when the way forward is unclear, the best thing we can do is give people the experience of achieving in concert what they could never achieve by themselves. Sometimes, the product of an organizing drive is indeed victory. Even when it isn’t, though, helping create a group capable of reading a political situation and getting things done, while having one another’s backs, is also a kind of victory.

That’s why, this election season, my partner and I are returning to Reno to join hotel housekeepers, cooks, and casino workers trying to ensure the reelection of two Democrats, Senator Catherine Cortez Masto and Governor Steve Sisolak, in a state where the margin of Democratic Party victories hasn’t grown since 2012.

From our previous experience, we know one thing: we’ll be working on a well-run campaign that won’t waste anyone’s time and has its eye on the future. As I wrote about the union’s 2020 presidential campaign for Joe Biden, more than winning a difficult election is at stake. What’s also important is building organized power for working people. In other words, providing the kind of training and leadership development that will send “back to every hotel, restaurant, casino, and airport catering service leaders who can continue to organize and advocate for their working-class sisters and brothers.”

I still hate electoral politics, but you don’t always get to choose the terrain you’re fighting on. Through its machinations at the federal, state, and county level, the Republican Party has been all but screaming its plans to steal the next presidential election. It’s no exaggeration to say that preserving some form of democratic government two years from now depends in part on keeping Republicans from taking over Congress, especially the Senate, this year.

So, it’s back to Reno, where the future awaits us. Let’s hope it’s one we can live with.

Women now dominate higher education. What does that mean for its future?

Rebecca Gordon, Where the Boys Aren't

Sixty years later, I would still like a do-over. Yes, I went to a school where, to fiddle with the title of Rebecca Gordon’s article, the boys were (and only them). I’m talking about Yale College in the 1960s when it was all-male and the hunting (or do I mean haunting?) grounds for George W. Bush, like his father and grandfather before him, and John Kerry among others. Sigh. I fought my parents hard over the decision to go there and lost big time. For them, Yale meant that I would be headed for the stratosphere, just like George and John. I was to be a triumph for a family in which my dad was just emerging from what had, for him, been the tough years of the supposedly golden 1950s.

And yes, I did get a good education. I mean, what else was there for me to do — a Jewish kid at a university that had just removed its informal quota on Jews, and without a girl in sight? It was the rest of the experience, all the fraternities that didn’t rush me, the famed secret society, Skull and Bones, that didn’t give me a second thought, and all those hotels where you had to put up the girl you invited to New Haven for a partying weekend that cost more than I could afford. (Forget the fact that I didn’t exactly have a lot of people to invite.) Well, you know the story. Or maybe, I hope, you don’t.

And then, what did I do but wander into the study of Chinese history — not exactly my parents’ idea of how to prepare myself for future glory — and never look back? And so it went in a college world that, as you’ll discover today, seems all too impossible even to imagine anymore (and thank god for that!).

All I could think as I read TomDispatch regular Rebecca Gordon’s look at higher education six decades later is that I missed my moment, even if it also seems, as she explains, that the women who now make up the majority of college students may be missing theirs, through no fault of their own. Tom

Fewer Big (or Any Size) Men on Campus: What Does It Mean that Women Now Dominate Higher Education?

In the last week of her life, my mother extracted a promise from me. “Make sure,” she said, “that Orion goes to college.”

I swore that I would, although I wasn’t at all sure how I’d make it happen. Even in the year 2000, average tuitions were almost 10 times what my own undergraduate school had charged 30 years earlier. I knew that sending my nephew to college would cost more money than I’d have when the time came. If he was going to college, like his aunt before him, he’d need financial help. The difference was that his “help” was likely to come not as a grant, but as life-defining loans.

“Orion,” by the way, is a pseudonym for my brother’s son, my parents’ only grandchild. To the extent that any of us placed family hopes in a next generation, he’s borne them all. Orion was only five years old when I made that promise and he lived 3,000 miles away in a depressed and depressing de-industrialized town in New York’s Hudson River Valley. We’d only met in person once at that point. Over the years, however, we kept in touch by phone, later by text message, and twice he even visited my partner and me in San Francisco.

A little more than a decade after I made that promise, Orion graduated from high school. I thought that with a scholarship, loans, and financial help from his father and us, we might indeed figure out how to pay the staggering costs of a college education, which now averages $35,000 a year, having doubled in this century alone.

It turned out, however, that money wasn’t the only obstacle to making good on my promise. There was another catch as well. Orion didn’t want to go to college. Certainly, the one guidance counselor at his 1,000-student public high school had made no attempt to encourage either him or, as far as I could tell, many of his classmates to plan for a post-high-school education. But would better academic counseling have made a difference? I doubt it.

A bright boy who had once been an avid reader, Orion was done with schooling by the time he’d turned 18. He made that clear when I visited him for a talk about his future. He had a few ideas about what he might do: join the military or the New York state police. In reality, though, it turned out that he had no serious interest in either of those careers.

He might have been a disaffected student, but he was — and is — a hard worker. Over the next few years, despite sky-high unemployment in the Hudson River Valley, he always had a job. He made and delivered pizzas. He cleaned rooms at a high-end hotel for wealthy equestrians. He did pick-up carpentry. And then he met an older tradesman who gave him an informal apprenticeship in laying floors and setting tile. Orion learned how to piece together hardwood and install carpeting. He proudly showed me photos of the floors he’d laid and the kitchens he’d tiled.

Eventually, he had to part ways with his mentor, who also happened to be a dangerous drunk. We had another talk and I reminded him of my promise to my mother. I’d recently gotten an unexpected windfall — an advance on a book I was writing, American Nuremberg — which put me in a position to help set him up in business. He bought a van, completed his tool set, and paid for a year’s insurance. Now, 10 years after graduating from high school, he’s making decent money as a respected tradesman and is thinking about marrying his girlfriend. He’s made himself a life without ever going to college.

I worry about him, though. Laying floors is a young person’s trade. A few years on your knees, swinging a hammer all day, will tear your joints apart. He can’t do this forever.

The Rising of the Women

Still, it turns out that my nephew isn’t the only young man to opt out of more schooling. I’ve seen this in my own classrooms and the data confirms it as a national and international trend.

I started teaching ethics at the University of San Francisco in 2005. It soon struck me that there were invariably more women in my classes than men. Nor was the subject matter responsible, since everyone had to pass a semester of ethics to graduate from that Jesuit university. No, as it turned out, my always-full classes represented the school’s overall gender balance. For a few years, I wondered whether such an overrepresentation of women could be attributed to parents who felt safer sending their daughters to a Catholic school, especially in a city with San Francisco’s reputation for sex, drugs, and rock ‘n’ roll.

Recently, though, I came to realize that my classes were simply part of a much larger phenomenon already beginning to worry some observers. Until about 1990, men invariably outnumbered women at every level of post-secondary education and more of them graduated, too. Whether at four-year colleges, in post-graduate programs, or in community colleges (once those became more prevalent), more men earned two-year, four-year, master’s, and doctorate-level degrees.

It was during the 1970s that the ratio began to shift. In 1970, among recent high-school graduates, 32% of the men and just 20% of the women enrolled in post-secondary institutions. By 1990, equal percentages — around 32% — were going to college. In the years that followed, college attendance continued to increase for both sexes, but significantly faster for women who, in 1994, surpassed men. Since the end of the 1990s, men’s college attendance has stayed relatively stable at about 37% of high-school graduates.

Women’s campus presence, however, has only continued to climb with 44% of recent female high-school graduates enrolled in post-secondary schools by 2019.

So, the problem, if there is one, isn’t that men have stopped going to college. A larger proportion of them, in fact, attend today than at any time in our history. It’s just that an even larger proportion of women are doing so.

As a result, if you visit a college campus, you should see roughly three women — now about 60% of all college students — for every two men. And that gap has been growing ever wider, even during the disruption of the Covid pandemic.

Not only do more women now attend college than men, but they’re more likely to graduate and receive degrees. According to the National Center for Educational Statistics, in 1970, men received 57% of both two- and four-year degrees, 61% of master’s degrees, and 90% of doctorates. By 2019, women were earning the majority of degrees at all levels.

One unexpected effect of this growing college gender gap is that it’s becoming harder for individual women to get into selective schools. The Hechinger Report, a non-profit institution focused on education, lists a number of well-known ones where male applicants have a better chance of being accepted, including:

Boston, Bowdoin and Swarthmore colleges; Brown, Denison, Pepperdine, Pomona, Vanderbilt and Wesleyan universities; and the University of Miami. At each school, men were at least 2 percentage points more likely than women to be accepted in both 2019 and 2020. Pitzer College admitted 20% of men last year compared to 15% of women, and Vassar College accepted 28% of men compared to 23% of women. Both had more than twice as many female applicants as male applicants.

Even for Vassar, once a women’s college, having too many women is now apparently a problem.

In addition, in recent years, despite those lower acceptance rates for women at elite schools, colleges have generally had to deal with declining enrollments, a trend only accelerated by the Covid pandemic. As Americans postpone having children and have fewer when they do, the number of people reaching college age is actually shrinking. Two-year colleges have been especially hard hit.

And there’s the debt factor. Like my nephew Orion, more potential students, especially men, are now weighing the problem of deferring their earnings, while acquiring a staggering debt load from their years at college. Some of them are opting instead to try to make a living without a degree. Certain observers think this shift has been partially caused by a pandemic-fueled rise in wages in the lower tiers of the American workforce.

A Mystery

Why are there fewer men than women in college today? On this, theories abound, but genuine answers are few. Conservatives offer a number of explanations that echo their culture-war slogans, including that “the atmosphere on the nation’s campuses has become increasingly hostile to masculinity.”

A Wall Street Journal op-ed ascribed it in part to “labor-saving innovations in household management and child care — automatic washing machines, disposable diapers, inexpensive takeout restaurants — as well as new forms of birth control [that] helped women pursue college degrees and achieve new vocational ambitions.” But the biggest problem, write the op-ed’s authors, may be that girls simply do better in elementary and secondary school, which discourages boys from going on to college. This problem, they argue, is attributable not only to the advent of washing machines, but ultimately to the implementation of the Great Society’s liberal social policies. Citing Charles Murray, the reactionary co-author of the 1994 book The Bell Curve, they blame women’s takeover of higher education on the progressive social policies of the 1960s, the rise of the “custodial” (or welfare) state, and the existence of a vast pool of jailed men. They write:

[T]here are about 1.24 million more men who are incarcerated than women, largely preventing them from attending traditional college. Scholars such as Charles Murray have long demonstrated that expanded government entitlements following the Great Society era have reduced traditional family formation, reduced incentives to excel both in school and on the job, and increased crime.

Critics to the left have also cited male incarceration as a factor in the college gender divide, although they’re more likely to blame racist police and policies. Sadly, the devastation caused by jailing so many Black, Latino, and Native American men has only begun to be understood, but given the existing racial divide in college attendance, I seriously doubt that many of those men would be in college even if they weren’t in prison.

Some observers have also suggested that, given the staggering rise in college tuitions, young men, especially from the working and middle classes, often make a sound if instinctive decision that a college education will not repay their time, effort, and the debt load it entails. Like my nephew, they may indeed be better off entering a well-paying trade and getting an early start on building their savings.

Do Women Need College More Than Men?

If some young men now believe that college won’t reward them sufficiently to warrant the investment, many young women have rightly judged that they will need a college education to have any hope of earning a decent living. It’s no accident that their college enrollment skyrocketed in the 1970s. After a long post-World War II economic expansion, that was the moment when wages in this country first began stagnating, a trend that continued in the 1980s when President Ronald Reagan launched his attacks on unions, while the federal minimum wage barely rose. In fact, it has remained stuck at $7.25 per hour since 2009.

First established in 1938, the minimum wage was intended to allow a single adult (then assumed to be a man) to support a non-earning adult (assumed to be his wife) and several children. It was called a “breadwinner” wage. The feminism that made work outside the home possible for women, saving the lives and sanity of so many of us, provided a useful distraction from those stagnant real wages, rising inequality, and the increased immiseration of millions (not to speak of the multiplication of billionaires).

In the last few decades of the twentieth century, many women came to believe that working for money was their personal choice. In truth, I suspect that they were also responding to new economic realities and the end of that “breadwinner” wage. I think the college gender gap, which grew ever wider as wages fell, is at least in part a consequence of those changes. Few of my women students believe that they have a choice when it comes to supporting themselves, even if they haven’t necessarily accepted how limited the kind of work they’re likely to find will be. Whether they form partnered households or not, they take it for granted that they’ll have to support themselves financially.

This makes a college degree even more important, since having it has a bigger impact on women’s earnings than on men’s. A study by the Federal Reserve Bank of St. Louis confirmed this. Reviewing 2015 census data, it showed that the average wage for a man with only a high-school diploma was around $12 per hour. Women earned 24.4% less than that, or about $9 hourly. On the other hand, women got a somewhat greater boost (28%) from earning a two-year degree than men (22%). For a four-year degree, it was 68% for women and 62% for men.
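For readers who want to check that arithmetic, here’s a quick back-of-the-envelope sketch in Python. It uses only the figures quoted above (the 2015 averages and percentage boosts) — no new data:

```python
# Back-of-the-envelope check of the wage figures quoted above
# (2015 census data via the St. Louis Fed). All percentages come
# from the text; nothing here is new data.

base_men = 12.00                     # avg hourly wage, men with only a HS diploma
base_women = base_men * (1 - 0.244)  # women earned 24.4% less
print(round(base_women, 2))          # 9.07 -- the "about $9 hourly" above

# Boost from a four-year degree, as quoted: 68% for women, 62% for men.
women_4yr = base_women * 1.68
men_4yr = base_men * 1.62
print(round(women_4yr, 2), round(men_4yr, 2))  # 15.24 19.44
```

The degree raises women’s wages by a larger percentage, but from a lower base, so the dollar gap persists.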

In other words, although a college education improves income prospects for both genders, it does more for women — even if not enough to raise their income to the level of men with the same education. The income gender gap remains stubbornly fixed in men’s favor. Like Alice in Through the Looking-Glass, it seems women still have to run faster just to avoid losing ground. This means that for us, earning a decent living requires at least some college, which is less true for men.

What Does the Future Hold?

Sadly, as college becomes ever more the preserve of women, I suspect it will also lose at least some of its social and economic value. They let us in and we turned out to be too good at it. My prediction? Someday, college will be dismissed as something women do and therefore not an important marker of social or economic worth.

As with other realms that became devalued when women entered them (secretarial work, for example, or family medicine), I expect that companies will soon begin dropping the college-degree requirement for applicants.

In fact, it already seems to be happening. Corporations like IBM, Accenture, and Bank of America have begun opting for “skills-based” rather than college-based hiring. According to a CNBC report, a recent Harvard Business School study examined job postings for software quality-assurance engineers and “found that only 26% of Accenture’s postings for the job contained a degree requirement. At IBM, just 29% did.” Even the government is dropping some college-degree requirements. According to the same report, in January 2021, the White House issued an executive order on “Limiting [the] Use of Educational Requirements in Federal Service Contracts.” When hiring for IT positions, the order says, considering only those with college degrees “excludes capable candidates and undermines labor market efficiencies.” And recently, Maryland announced that it’s dropping the college graduation requirement for thousands of state positions.

Of course, this entire economic argument assumes that the value of a college education is purely extrinsic and can be fully measured in dollars. As a long-time college teacher, I still believe that education has an intrinsic value, beyond preparing “job-ready” workers or increasing their earning potential. At its best, college offers a unique opportunity to encounter new ideas in expansive ways, learn how to weigh evidence and arguments, and contemplate what it means to be a human being and a citizen of the world. It can make democracy possible in a time of creeping authoritarianism.

What kind of future do we face in a world where such an experience could be reduced, like knitting (which was once an honorable way for both sexes to earn a living), to a mere hobby for women?

Protesting 'this country’s military death machine': confessions of an American tax resister

Rebecca Gordon, War, Death, and Taxes

I don’t normally do this, but in the context of TomDispatch regular Rebecca Gordon’s latest all-too-well-timed piece on paying (or rather not paying) one’s taxes, let me quote a couple of paragraphs I once wrote for this site about my own distant past and then briefly explain why:

And here’s a little story from the Neolithic age we now call ‘the Sixties’ about that moment when the U.S. military was still a citizen’s army with a draft (even if plenty of people figured out how to get exemptions). At a large demonstration, I turned in my draft card to protest the war. Not long after, my draft board summoned me. I knew when I got there that I had a right to look at my draft file, so I asked to see it. I have no idea what I thought I would find in it, but at 25, despite my antiwar activism, I still retained a curiously deep and abiding faith in my government. When I opened that file and found various documents from the FBI, I was deeply shocked. The Bureau, it turned out, had its eyes on me. Anxious about the confrontation to come — the members of my draft board would, in fact, soon quite literally be shouting at me and threatening to call me up essentially instantaneously — I remember touching one of those FBI documents and it was as if an electric current had run directly through my body. I couldn’t shake the Big Brotherness of it all, though undoubtedly my draft card had gone more or less directly from that demonstration to the Bureau.
As it happened, my draft board’s threats put me among the delinquent 1-A files to be called up next. Not long after, in July 1970 — I would read about it on the front page of the New York Times — a group of five antiwar activists, calling themselves Women Against Daddy Warbucks, broke into that very draft board, located in Rockefeller Center in New York City, took the 1-A files, shredded them, and tossed them like confetti around that tourist spot. And I never heard from my draft board again. Lucky me at that time. Of course, so many young, draftable American men had no such luck. They were indeed sent to Vietnam to fight and suffer, sometimes to be wounded or killed, or (as surprising numbers of them did) join the antiwar movement of that moment.

Those paragraphs came to mind because of the story Rebecca Gordon tells today about her own urge to resist America’s wars and the moment when she could personally go no further. It made me realize that, in some sense, thanks to those five women long ago, I was relieved of a decision I have no idea how I would have dealt with in the end. At that time, many young men like me were going to Canada rather than be drafted into the military and risk deployment to Vietnam. But I actually visited Canada soon after I turned in my draft card and, much as I liked the neighborhoods in Toronto where I spent time, I found I simply couldn’t imagine leaving my country, no matter what. It just wasn’t me. And that meant, when I was called up again, choosing either jail or the military. It was my luck, I suppose, that I never had to make that decision, which undoubtedly would have led to a very different life than the one I’ve had.

Other people, Gordon included, then and since, weren’t so lucky. No group called Five Women Against Uncle Sam destroyed her tax records in 1992 and so, today, she can tell you her antiwar story and remind us that we all have limits when it comes to our moments of decision. Tom

“Too Distraught”

Confessions of a Failed Tax Resister

Every April, as income-tax returns come due, I think about the day 30 years ago when I opened my rented mailbox and saw a business card resting inside. Its first line read, innocently enough, “United States Treasury.” It was the second line — “Internal Revenue Service” — that took my breath away. That card belonged to an IRS revenue agent and scrawled across it in blue ink was the message: “Call me.”

I’d used that mailbox as my address on the last tax return I’d filed, eight years earlier. Presumably, the agent thought she’d be visiting my home when she appeared at the place where I rented a mailbox, which, as I would discover, was the agency’s usual first step in running down errant taxpayers. Hands shaking, I put a quarter in a payphone and called my partner. “What’s going to happen to us?” I asked her.

Resisting War Taxes

I knew that the IRS wasn’t visiting me as part of an audit of my returns, since I hadn’t filed any for eight years. My partner and I were both informal tax resisters — she, ever since joining the pacifist Catholic Worker organization; and I, ever since I’d returned from Nicaragua in 1984. I’d spent six months traveling that country’s war zones as a volunteer with Witness for Peace. My work involved recording the testimony of people who had survived attacks by the “Contras,” the counterrevolutionary forces opposing the leftist Sandinista government then in power (after a popular uprising deposed the U.S.-backed dictator, Anastasio Somoza). At the time, the Contras were being illegally supported by the administration of President Ronald Reagan.

With training and guidance from the CIA, they were using a military strategy based on terrorizing civilians in the Nicaraguan countryside. Their targets included newly built schools, clinics, roads, and phone lines — anything the revolutionary government had, in fact, achieved — along with the campesinos (the families of subsistence farmers) who used such things. Contra attacks very often involved torture: flaying people alive, severing body parts, cutting open the wombs of pregnant women. Nor were such acts mere aberrations. They were strategic choices made by a force backed and directed by the United States.

When I got back to the United States, I simply couldn’t imagine paying taxes to subsidize the murder of people in another country, some of whom I knew personally. I continued working, first as a bookkeeper, then at a feminist bookstore, and eventually at a foundation. But with each new employer, on my W-4 form I would claim that I expected to owe no taxes that year, so the IRS wouldn’t take money out of my paycheck. And I stopped filing tax returns.

Not paying taxes for unjust wars has a long history in this country. It goes back at least to Henry David Thoreau’s refusal to pay them to support the Mexican-American War (1846-1848). His act of resistance landed him in jail for a night and led him to write On the Duty of Civil Disobedience, dooming generations of high-school students to reading the ruminations of a somewhat self-satisfied tax resister. Almost a century later, labor leader and pacifist A.J. Muste revived Thoreau’s tradition, once even filing a copy of the Duty of Civil Disobedience in place of his Form 1040. After supporting textile factory workers in their famous 1919 strike in Lawrence, Massachusetts, and some 20 years later helping form and run the Amalgamated Textile Workers of America (where my mother once worked as a labor organizer), Muste eventually came to serve on the board of the War Resisters League (WRL).

For almost a century now, the WRL, along with the even older Fellowship of Reconciliation and other peace groups, has promoted antiwar tax resistance as a nonviolent means of confronting this country’s militarism. In recent years, both organizations have expanded their work beyond opposing imperial adventures overseas to stand against racist, militarized policing at home as well.

Your Tax Dollars at Work

Each year, the WRL publishes a “pie chart” poster that explains “where your income tax money really goes.” In most years, more than half of it is allocated to what’s euphemistically called “defense.” This year’s poster, distinctly an outlier, indicates that pandemic-related spending boosted the non-military portion of the budget above the 50% mark for the first time in decades. Still, at $768 billion, we now have the largest Pentagon budget in history (and it’s soon to grow larger yet). That’s a nice reward for a military whose main achievements in this century are losing major wars in Iraq and Afghanistan.

But doesn’t the war in Ukraine justify all those billions? Not if you consider that none of the billions spent in previous years stopped Russia from invading. As Lindsay Koshgarian argues at Newsweek, “Colossal military spending didn’t prevent the Russian invasion, and more money won’t stop it. The U.S. alone already spends 12 times more on its military than Russia. When combined with Europe’s biggest military spenders, the U.S. and its allies on the continent outspend Russia by at least 15 to 1. If more military spending were the answer, we wouldn’t be in this situation.”

“Defense” spending could, however, just as accurately be described as welfare for military contractors, because that’s where so much of the money eventually ends up. The top five weapons-making companies in 2021 were Lockheed Martin, Raytheon Technology, Boeing, Northrop Grumman, and General Dynamics. Together, they reaped $198 billion in taxpayer funds last year alone. In 2020, the top 100 contractors took in $551 billion. Of course, we undoubtedly got some lovely toys for our money, but I’ve always found it difficult to live in — or eat — a drone. They’re certainly useful, however, for murdering a U.S. citizen in Yemen or so many civilians elsewhere in the Greater Middle East and Africa.

The Pentagon threatens the world with more than the direct violence of war. It’s also a significant factor driving climate change. The U.S. military is the world’s largest institutional consumer of oil. If it were a country, the Pentagon would rank 55th among the world’s carbon emitters.

While the military budget increases yearly, federal spending that actually promotes human welfare has fallen over the last decade. In fact, such spending for the program most Americans think of when they hear the word “welfare” — Temporary Assistance for Needy Families, or TANF — hasn’t changed much since 1996, the year the Personal Responsibility and Work Opportunity Reconciliation Act (so-called welfare reform) took effect. In 1997, federal expenditures for TANF totaled about $16.6 billion. That figure has remained largely unchanged. However, according to the Congressional Research Service, since the program began, such expenditures have actually dropped 40% in value, thanks to inflation.

Unlike military outlays, spending for the actual welfare of Americans doesn’t increase over time. In fact, as a result of the austerity imposed by the 2011 Budget Control Act, the Center for Budget and Policy Priorities reports that “by 2021 non-defense funding (excluding veterans’ health care) was about 9% lower than it had been 11 years earlier after adjusting for inflation and population growth.” Note that Congress passed that austerity measure a mere three years after the subprime lending crisis exploded, initiating the Great Recession, whose reverberations still ring in our ears.

This isn’t necessarily how taxpayers want their money spent. In one recent poll, a majority of them, given the choice, said they would prioritize education, social insurance, and health care. A third would rather that their money not be spent on war at all. And almost 40% believed that the federal government simply doesn’t spend enough on social-welfare programs.

Death May Be Coming for Us All, But Taxes Are for the Little People

Pollsters don’t include corporations like Amazon, FedEx, and Nike in their surveys of taxpayers. Perhaps the reason is that those corporate behemoths often don’t pay a dollar in income tax. In 2020, in fact, 55 top U.S. companies paid no corporate income taxes whatsoever. Nor would the survey takers have polled billionaires like Jeff Bezos, Elon Musk, or Carl Icahn, all of whom also manage the neat trick of not paying any income tax at all some years.

In 2021, using “a vast trove of Internal Revenue Service data on the tax returns of thousands of the nation’s wealthiest people, covering more than 15 years,” ProPublica published a report on how much the rich really pay in taxes. The data show that, between 2014 and 2018, the richest Americans paid a measly “true tax” rate of 3.4% on the growth of their wealth over that period. The average American — you — typically pays 14% of his or her income each year in federal income tax. As ProPublica explains:

America’s billionaires avail themselves of tax-avoidance strategies beyond the reach of ordinary people. Their wealth derives from the skyrocketing value of their assets, like stock and property. Those gains are not defined by U.S. laws as taxable income unless and until the billionaires sell.

So, if the rich avoid paying taxes by holding onto their assets instead of selling them, where do they get the money to live like the billionaires they are? The answer isn’t complicated: they borrow it. Using their wealth as collateral, they typically borrow millions of dollars to live on, using the interest on those loans to offset any income they might actually receive in a given year and so reducing their taxes even more.
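The mechanics can be made concrete with a schematic sketch. The round numbers below are invented for illustration — they are not ProPublica’s figures — and the flat capital-gains rate and the assumption that all sale proceeds are gain are simplifications:

```python
# Schematic sketch of the borrow-instead-of-sell strategy described above.
# All numbers are invented round figures for illustration; they are NOT
# ProPublica's data, and the tax treatment is deliberately simplified.

wealth_growth = 100_000_000   # hypothetical unrealized gain on stock
cash_needed = 5_000_000       # hypothetical living costs for the period

# Option A: sell stock to raise cash (the gain becomes taxable once realized).
cap_gains_rate = 0.20                       # rough long-term federal rate
tax_if_sold = cash_needed * cap_gains_rate  # simplification: all proceeds are gain

# Option B: borrow against the stock instead (loan proceeds aren't income).
tax_if_borrowed = 0

# ProPublica-style "true tax rate": tax actually paid / growth in wealth.
print(tax_if_sold, tax_if_sold / wealth_growth)   # 1000000.0 0.01
print(tax_if_borrowed / wealth_growth)            # 0.0
```

The point is structural: tax attaches to realized gains, so cash raised by borrowing rather than selling creates no taxable event, keeping the “true tax rate” on wealth growth near zero.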

While they do avoid paying taxes, I’m pretty sure those plutocrats aren’t tax resisters. They’re not using such dodges to avoid paying for U.S. military interventions around the world, which was why I stopped paying taxes for almost a decade. Through the Reagan administration and the first Bush presidency, with the Savings and Loan debacle and the first Gulf War, there was little the U.S. government was doing that I wanted to support.

These days, however, having lived through the “greed is good” decade, having watched a particularly bizarre version of American individualism reach its pinnacle in the presidency of billionaire Donald Trump, I think about taxes a bit differently. I still don’t want to pay for the organized global version of murder that is war, American-style, but I’ve also come to see that taxes are an important form of communal solidarity. Our taxes allow us, through the government, to do things together we can’t do as individuals — like generating electricity or making sure our food is clean and safe. In a more truly democratic society, people like me might feel better about paying taxes, since we’d be deciding more collectively how to spend our common wealth for the common good. We might even buy fewer drones.

Until that day comes, there are still many ways, as the War Resisters League makes clear, to resist paying war taxes, should you choose to do so. I eventually started filing my returns again and paid off eight years of taxes, penalties, and interest. It wasn’t the life decision I’m proudest of, but here’s what happened.

“Too Distraught”

The method I chose was, as I’ve said, not to file my tax returns, which, if your employer doesn’t withhold any taxes and send them to the feds, denies the federal government tax revenue from you. Mind you, for most of those years I wasn’t making much money. We’re talking about hundreds of dollars, not hundreds of thousands of dollars in lost tax revenue. Over those years, I got just the occasional plaintive query from the IRS about whether I’d forgotten my taxes. But during the mid-1980s, the IRS upgraded its computers, improving its ability to capture income reported by employers and so enabling it to recreate the returns a taxpayer should have filed, but didn’t. And so, in 1992 an IRS agent visited my mailbox.

Only a month earlier, a friend, my partner, and I had bought a house together. So, when I saw that “Call me” on the agent’s business card, I was terrified that my act of conscience was going to lose us our life savings. Trembling, I called the revenue agent and set up an appointment at the San Francisco federal building, a place I only knew as the site of many antiwar demonstrations I’d attended.

I remember the agent meeting us at the entrance to a room filled with work cubicles. I took a look at her and my gaydar went off. “Oh, my goodness,” I thought, “she’s a lesbian!” Maybe that would help somehow — not that I imagined for a second that my partner and I were going to get the “family discount” we sometimes received from LGBT cashiers.

The three of us settled into her cubicle. She told me that I would have to file returns from 1986 to 1991 (the IRS computers, it turned out, couldn’t reach back further than that) and also pay the missing taxes, penalties, and interest on all of it. With an only partially feigned quaver in my voice, I asked, “Are you going to take our house away?”

She raised herself from her chair just enough to scan the roomful of cubicles around us, then sat down again. Silently, she shook her head. Well, it may not have been the family discount, but it was good enough for me.

Then she asked why I hadn’t filed my taxes and, having already decided I was going to pay up, I didn’t explain anything about those Nicaraguan families our government had maimed or murdered. I didn’t say why I’d been unwilling or what I thought it meant to pay for this country’s wars in Central America or preparations for more wars to come. “I just kept putting it off,” I said, which was true enough, if not the whole truth.

Somehow, she bought that and asked me one final question, “By the way, what do you do for a living?”

“I’m an accountant,” I replied.

Her eyebrows flew up and she shook her head, but that was that.

Why did I give up so easily? There were a few reasons. The Contra war in Nicaragua had ended after the Sandinistas lost the national elections in 1990. Nicaraguans weren’t stupid. They grasped that, as long as the Sandinistas were in power, the U.S. would continue to embargo their exports and arm and train the Contras. And I’d made some changes in my own life. After decades of using part-time paid work to support my full-time activism, I’d taken a “grown-up” job to help pay my ailing and impoverished mother’s rent, once I convinced her to move from subsidized housing in Cambridge, Massachusetts, to San Francisco. And, of course, I’d just made a fundamental investment of my own in the status quo. I’d bought a house. Even had I been willing to lose it, I couldn’t ask my co-owners to suffer for my conscience.

But in the end, I also found I just didn’t have the courage to defy the government of the world’s most powerful country.

As it happened, I wasn’t the only person in the Bay Area to get a visit from a revenue agent that year. The IRS, it turned out, was running a pilot program to see whether they could capture more unpaid taxes by diverting funds from auditing to directly pursuing non-filers like me. Several resisters I knew were caught in their net, including my friend T.J.

An agent came to T.J.’s house and sat at his kitchen table. Unlike “my” agent, T.J.’s not only asked him why he hadn’t filed his returns, but read from a list of possible reasons: “Did you have a drug or alcohol problem? Were you ill? Did you have trouble filling out the forms?”

“Don’t you have any political reasons on your list?” T.J. asked.

The agent looked doubtful. “Political? Well, there’s ‘too distraught.’”

“That’s it,” said T.J. “Put down ‘too distraught.’”

T.J. died years ago, but I remember him every tax season when I again have to reckon with just how deeply implicated all of us are in this country’s military death machine, whether we pay income taxes or not. Still, so many of us keep on keeping on, knowing we must never become too distraught to find new ways to oppose military aggression anywhere in the world, including, of course, Ukraine, while affirming life as best we can.

Automated killer robots aren't science fiction anymore — and the world isn't ready

Here’s a scenario to consider: a military force has purchased a million cheap, disposable flying drones, each the size of a deck of cards and capable of carrying three grams of explosives — enough to kill a single person or, in a “shaped charge,” pierce a steel wall. They’ve been programmed to seek out and “engage” (kill) certain human beings, based on specific “signature” characteristics like carrying a weapon, say, or having a particular skin color. They fit in a single shipping container and can be deployed remotely. Once launched, they will fly and kill autonomously without any further human action.

Science fiction? Not really. It could happen tomorrow. The technology already exists.

In fact, lethal autonomous weapons systems (LAWS) have a long history. During the spring of 1972, I spent a few days occupying the physics building at Columbia University in New York City. With a hundred other students, I slept on the floor, ate donated takeout food, and listened to Allen Ginsberg when he showed up to honor us with some of his extemporaneous poetry. I wrote leaflets then, commandeering a Xerox machine to print them out.

And why, of all campus buildings, did we choose the one housing the physics department? The answer: to convince five Columbia faculty physicists to sever their connections with the Pentagon’s Jason Defense Advisory Group, a program offering money and lab space to support basic scientific research that might prove useful for U.S. war-making efforts. Our specific objection: the involvement of Jason’s scientists in designing parts of what was then known as the “automated battlefield” for deployment in Vietnam. That system would indeed prove a forerunner of the lethal autonomous weapons systems that are poised to become a potentially significant part of this country’s — and the world’s — armory.

Early (Semi-)Autonomous Weapons

Washington faced quite a few strategic problems in prosecuting its war in Indochina, including the general corruption and unpopularity of the South Vietnamese regime it was propping up. Its biggest military challenge, however, was probably North Vietnam’s continual infiltration of personnel and supplies on what was called the Ho Chi Minh Trail, which ran from north to south along the Cambodian and Laotian borders. The Trail was, in fact, a network of easily repaired dirt roads and footpaths, streams and rivers, lying under a thick jungle canopy that made it almost impossible to detect movement from the air.

The U.S. response, developed by Jason in 1966 and deployed the following year, was an attempt to interdict that infiltration by creating an automated battlefield composed of four parts, analogous to a human body’s eyes, nerves, brain, and limbs. The eyes were a broad variety of sensors — acoustic, seismic, even chemical (for sensing human urine) — most dropped by air into the jungle. The nerve equivalents transmitted signals to the “brain.” However, since the sensors had a maximum transmission range of only about 20 miles, the U.S. military had to constantly fly aircraft above the foliage to catch any signal that might be tripped by passing North Vietnamese troops or transports. The planes would then relay the news to the brain. (Originally intended to be remote controlled, those aircraft performed so poorly that human pilots were usually necessary.)

And that brain, a magnificent military installation secretly built in Thailand’s Nakhon Phanom, housed two state-of-the-art IBM mainframe computers. A small army of programmers wrote and rewrote the code to keep them ticking, as they attempted to make sense of the stream of data transmitted by those planes. The target coordinates they came up with were then transmitted to attack aircraft, which were the limb equivalents. The group running that automated battlefield was designated Task Force Alpha and the whole project went under the code name Igloo White.

As it turned out, Igloo White was largely an expensive failure, costing about a billion dollars a year for five years (almost $40 billion total in today’s dollars). The time lag between a sensor tripping and munitions dropping made the system ineffective. As a result, at times Task Force Alpha simply carpet-bombed areas where a single sensor might have gone off. The North Vietnamese quickly realized how those sensors worked and developed methods of fooling them, from playing truck-ignition recordings to planting buckets of urine.

Given the history of semi-automated weapons systems like drones and “smart bombs” in the intervening years, you probably won’t be surprised to learn that this first automated battlefield couldn’t discriminate between soldiers and civilians. In this, it merely continued a trend that’s existed since at least the eighteenth century in which wars routinely kill more civilians than combatants.

None of these shortcomings kept Defense Department officials from regarding the automated battlefield with awe. Andrew Cockburn described this worshipful posture in his book Kill Chain: The Rise of the High-Tech Assassins, quoting Leonard Sullivan, a high-ranking Pentagon official who visited Vietnam in 1968: “Just as it is almost impossible to be an agnostic in the Cathedral of Notre Dame, so it is difficult to keep from being swept up in the beauty and majesty of the Task Force Alpha temple.”

Who or what, you well might wonder, was to be worshipped in such a temple?

Most aspects of that Vietnam-era “automated” battlefield actually required human intervention. Human beings were planting the sensors, programming the computers, piloting the airplanes, and releasing the bombs. In what sense, then, was that battlefield “automated”? As a harbinger of what was to come, the system had eliminated human intervention at a single crucial point in the process: the decision to kill. On that automated battlefield, the computers decided where and when to drop the bombs.

In 1969, Army Chief of Staff William Westmoreland expressed his enthusiasm for this removal of the messy human element from war-making. Addressing a luncheon for the Association of the U.S. Army, a lobbying group, he declared:

“On the battlefield of the future enemy forces will be located, tracked, and targeted almost instantaneously through the use of data links, computer-assisted intelligence evaluation, and automated fire control. With first round kill probabilities approaching certainty, and with surveillance devices that can continually track the enemy, the need for large forces to fix the opposition will be less important.”

What Westmoreland meant by “fix the opposition” was kill the enemy. A more common twenty-first-century military euphemism is “engage.” In either case, the meaning is the same: the role of lethal autonomous weapons systems is to find and kill human beings automatically, without human intervention.

New LAWS for a New Age — Lethal Autonomous Weapons Systems

Every autumn, the British Broadcasting Corporation sponsors a series of four lectures given by an expert in some important field of study. In 2021, the BBC invited Stuart Russell, professor of computer science and founder of the Center for Human-Compatible Artificial Intelligence at the University of California, Berkeley, to deliver those “Reith Lectures.” His general subject was the future of artificial intelligence (AI), and the second lecture was entitled “The Future Role of AI in Warfare.” In it, he addressed the issue of lethal autonomous weapons systems, or LAWS, which the United Nations defines as “weapons that locate, select, and engage human targets without human supervision.”

Russell’s main point, eloquently made, was that, although many people believe lethal autonomous weapons are a potential future nightmare, residing in the realm of science fiction, “They are not. You can buy them today. They are advertised on the web.”

I’ve never seen any of the movies in the Terminator franchise, but apparently military planners and their PR flacks assume most people derive their understanding of such LAWS from this fictional dystopian world. Pentagon officials are frequently at pains to explain why the weapons they are developing are not, in fact, real-life equivalents of SkyNet — the worldwide communications network that, in those films, becomes self-conscious and decides to eliminate humankind. Not to worry, as a deputy secretary of defense told Russell, “We have listened carefully to these arguments and my experts have assured me that there is no risk of accidentally creating SkyNet.”

Russell’s point, however, was that a weapons system doesn’t need self-awareness to act autonomously or to present a threat to innocent human beings. What it does need is:

  • A mobile platform (anything that can move, from a tiny quadcopter to a fixed-wing aircraft)
  • Sensory capacity (the ability to detect visual or sound information)
  • The ability to make tactical decisions (the same kind of capacity already found in computer programs that play chess)
  • The ability to “engage,” i.e. kill (which can be as complicated as firing a missile or dropping a bomb, or as rudimentary as committing robot suicide by slamming into a target and exploding)

The reality is that such systems already exist. Indeed, a government-owned weapons company in Turkey recently advertised its Kargu drone — a quadcopter “the size of a dinner plate,” as Russell described it, which can carry a kilogram of explosives and is capable of making “anti-personnel autonomous hits” with “targets selected on images and face recognition.” The company’s site has since been altered to emphasize its adherence to a supposed “man-in-the-loop” principle. However, the U.N. has reported that a fully autonomous Kargu-2 was, in fact, deployed in Libya in 2020.

You can buy your own quadcopter right now on Amazon, although you’ll still have to apply some DIY computer skills if you want to get it to operate autonomously.

The truth is that lethal autonomous weapons systems are less likely to look like something from the Terminator movies than like swarms of tiny killer bots. Computer miniaturization means that the technology already exists to create effective LAWS. If your smartphone could fly, it could be an autonomous weapon. Newer phones use facial-recognition software to “decide” whether to allow access. It’s not a leap to create flying weapons the size of phones, programmed to “decide” to attack specific individuals, or individuals with specific features. Indeed, it’s likely such weapons already exist.

Can We Outlaw LAWS?

So, what’s wrong with LAWS, and is there any point in trying to outlaw them? Some opponents argue that the problem is they eliminate human responsibility for making lethal decisions. Such critics suggest that, unlike a human being aiming and pulling the trigger of a rifle, a LAWS can choose its own targets and fire at them. Therein, they argue, lies the special danger of these systems, which will inevitably make mistakes, as anyone whose iPhone has refused to recognize his or her face will acknowledge.

In my view, the issue isn’t that autonomous systems remove human beings from lethal decisions. To the extent that weapons of this sort make mistakes, human beings will still bear moral responsibility for deploying such imperfect lethal systems. LAWS are designed and deployed by human beings, who therefore remain responsible for their effects. Like the semi-autonomous drones of the present moment (often piloted from half a world away), lethal autonomous weapons systems don’t remove human moral responsibility. They just increase the distance between killer and target.

Furthermore, like already outlawed arms, including chemical and biological weapons, these systems have the capacity to kill indiscriminately. While they may not obviate human responsibility, once activated, they will certainly elude human control, just like poison gas or a weaponized virus.

And as with chemical, biological, and nuclear weapons, their use could effectively be prevented by international law and treaties. True, rogue actors, like the Assad regime in Syria or the U.S. military in the Iraqi city of Fallujah, may occasionally violate such strictures, but for the most part, prohibitions on the use of certain kinds of potentially devastating weaponry have held, in some cases for over a century.

Some American defense experts argue that, since adversaries will inevitably develop LAWS, common sense requires this country to do the same, implying that the best defense against a given weapons system is an identical one. That makes as much sense as fighting fire with fire when, in most cases, using water is much the better option.

The Convention on Certain Conventional Weapons

The area of international law that governs the treatment of human beings in war is, for historical reasons, called international humanitarian law (IHL). In 1995, the United States ratified an addition to IHL: the 1980 U.N. Convention on Certain Conventional Weapons. (Its full title is much longer, but its name is generally abbreviated as CCW.) It governs the use, for example, of incendiary weapons like napalm, as well as landmines and blinding lasers.

The signatories to CCW meet periodically to discuss what other weaponry might fall under its jurisdiction and prohibitions, including LAWS. The most recent conference took place in December 2021. Although transcripts of the proceedings exist, only a draft final document — produced before the conference opened — has been issued. This may be because no consensus was even reached on how to define such systems, let alone on whether they should be prohibited. The European Union, the U.N., at least 50 signatory nations, and (according to polls) most of the world’s population believe that autonomous weapons systems should be outlawed. The U.S., Israel, the United Kingdom, and Russia disagree, along with a few other outliers.

Prior to such CCW meetings, a Group of Governmental Experts (GGE) convenes, ostensibly to provide technical guidance for the decisions to be made by the Convention’s “high contracting parties.” In 2021, the GGE was unable to reach a consensus about whether such weaponry should be outlawed. The United States held that even defining a lethal autonomous weapon was unnecessary (perhaps because if they could be defined, they could be outlawed). The U.S. delegation put it this way:

“The United States has explained our perspective that a working definition should not be drafted with a view toward describing weapons that should be banned. This would be — as some colleagues have already noted — very difficult to reach consensus on, and counterproductive. Because there is nothing intrinsic in autonomous capabilities that would make a weapon prohibited under IHL, we are not convinced that prohibiting weapons based on degrees of autonomy, as our French colleagues have suggested, is a useful approach.”

The U.S. delegation was similarly keen to eliminate any language that might require “human control” of such weapons systems:

“[In] our view IHL does not establish a requirement for ‘human control’ as such… Introducing new and vague requirements like that of human control could, we believe, confuse, rather than clarify, especially if these proposals are inconsistent with long-standing, accepted practice in using many common weapons systems with autonomous functions.”

In the same meeting, that delegation repeatedly insisted that lethal autonomous weapons would actually be good for us, because they would surely prove better than human beings at distinguishing between civilians and combatants.

Oh, and if you believe that protecting civilians is the reason the arms industry is investing billions of dollars in developing autonomous weapons, I’ve got a patch of land to sell you on Mars that’s going cheap.

The Campaign to Stop Killer Robots

The Group of Governmental Experts also has about 35 non-state members, including non-governmental organizations and universities. The Campaign to Stop Killer Robots, a coalition of 180 organizations, among them Amnesty International, Human Rights Watch, and the World Council of Churches, is one of these. Launched in 2013, this vibrant group provides important commentary on the technical, legal, and ethical issues presented by LAWS and offers other organizations and individuals a way to become involved in the fight to outlaw such potentially devastating weapons systems.

The continued construction and deployment of killer robots is not inevitable. Indeed, a majority of the world would like to see them prohibited, including U.N. Secretary-General António Guterres. Let’s give him the last word: “Machines with the power and discretion to take human lives without human involvement are politically unacceptable, morally repugnant, and should be prohibited by international law.”

I couldn’t agree more.

Copyright 2022 Rebecca Gordon

Featured image: Killer Robots by Global Panorama is licensed under CC BY-SA 2.0 / Flickr

Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Books, John Feffer’s new dystopian novel, Songlands(the final one in his Splinterlands series), Beverly Gologorsky’s novel Every Body Has a Story, and Tom Engelhardt’s A Nation Unmade by War, as well as Alfred McCoy’s In the Shadows of the American Century: The Rise and Decline of U.S. Global Power and John Dower’s The Violent American Century: War and Terror Since World War II.

Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of Mainstreaming Torture, American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.

Graveyard shift: The dark reality of the modern economy reveals itself under pandemic-era demands

In mid-October, President Biden announced that the Port of Los Angeles would begin operating 24 hours a day, seven days a week, joining the nearby Port of Long Beach, which had been doing so since September. The move followed weeks of White House negotiations with the International Longshore and Warehouse Union, as well as shippers like UPS and FedEx, and major retailers like Walmart and Target.

The purpose of expanding port hours, according to the New York Times, was “to relieve growing backlogs in the global supply chains that deliver critical goods to the United States.” Reading this, you might be forgiven for imagining that an array of crucial items like medicines or their ingredients or face masks and other personal protective equipment had been languishing in shipping containers anchored off the West Coast. You might also be forgiven for imagining that workers, too lazy for the moment at hand, had chosen a good night’s sleep over the vital business of unloading such goods from boats lined up in their dozens offshore onto trucks, and getting them into the hands of the Americans desperately in need of them. Reading further, however, you’d learn that those “critical goods” are actually things like “exercise bikes, laptops, toys, [and] patio furniture.”

Fair enough. After all, as my city, San Francisco, enters what’s likely to be yet another almost rainless winter on a planet in ever more trouble, I can imagine my desire for patio furniture rising to a critical level. So, I’m relieved to know that dock workers will now be laboring through the night at the command of the president of the United States to guarantee that my needs are met. To be sure, shortages of at least somewhat more important items are indeed rising, including disposable diapers and the aluminum necessary for packaging some pharmaceuticals. Still, a major focus in the media has been on the specter of “slim pickings this Christmas and Hanukkah.”

Providing “critical” yard furnishings is not the only reason the administration needs to unkink the supply chain. It’s also considered an anti-inflation measure (if an ineffective one). By the end of October, the Consumer Price Index had risen 6.2% over the previous 12 months, the highest inflation rate in three decades. Such a rise is often described as the result of too much money chasing too few goods. One explanation for the current rise in prices is that, during the worst months of the pandemic, many Americans actually saved money, which they’re now eager to spend. When the things people want to buy are in short supply — perhaps even stuck on container ships off Long Beach and Los Angeles — the price of those that are available naturally rises.

Republicans have christened the current jump in the Consumer Price Index “Bidenflation,” although the administration actually bears little responsibility for the situation. But Joe Biden and the rest of the Democrats know one thing: if it looks like they’re doing nothing to bring prices down, there will be hell to pay at the polls in 2022, and so it’s the night shift for dock workers and others in Los Angeles, Long Beach, and possibly other American ports.

However, running West Coast ports 24/7 won’t solve the supply-chain problem, not when there aren’t enough truckers to carry that critical patio furniture to Home Depot. The shortage of such drivers arises because there’s more demand than ever before, and because many truckers have simply quit the industry. As the New York Times reports, “Long hours and uncomfortable working conditions are leading to a shortage of truck drivers, which has compounded shipping delays in the United States.”

Rethinking (Shift) Work

Truckers aren’t the only workers who have been rethinking their occupations since the coronavirus pandemic pressed the global pause button. The number of employees quitting their jobs hit 4.4 million this September, about 3% of the U.S. workforce. Resignations were highest in industries like hospitality and medicine, where employees are most at risk of Covid-19 exposure.

For the first time in many decades, workers are in the driver’s seat. They can command higher wages and demand better working conditions. And that’s exactly what they’re doing at workplaces ranging from agricultural equipment manufacturer John Deere to breakfast-cereal makers Kellogg and Nabisco. I’ve even been witnessing it in my personal labor niche, part-time university faculty members (of which I’m one). So allow me to pause here for a shout-out to the 6,500 part-time professors in the University of California system: Thank you! Your threat of a two-day strike won a new contract with a 30% pay raise over the next five years!

This brings me to Biden’s October announcement about those ports going 24/7. In addition to demanding higher pay, better conditions, and an end to two-tier compensation systems (in which laborers hired later don’t get the pay and benefits available to those already on the job), workers are now in a position to reexamine and, in many cases, reject the shift-work system itself. And they have good reason to do so.

So, what is shift work? It’s a system that allows a business to run continuously, ceaselessly turning out and/or transporting widgets year after year. Workers typically labor in eight-hour shifts: 8:00 a.m. to 4:00 p.m., 4:00 p.m. to midnight, and midnight to 8:00 a.m., or the like. In times of labor shortages, they can even be forced to work double shifts, 16 hours in total. Businesses love shift work because it reduces time (and money) lost to powering machinery up and down. And if time is money, then more time worked means more profit for corporations. In many industries, shift work is good for business. But for workers, it’s often another story.

The Graveyard Shift

Each shift in a 24-hour schedule has its own name. The day shift is the obvious one. The swing shift takes you from the day shift to the all-night, or graveyard, shift. According to folk etymology, that shift got its name because, once upon a time, cemetery workers were supposed to stay up all night listening for bells rung by unfortunates who awakened to discover they’d been buried alive. While it’s true that some coffins in England were once fitted with such bells, the term was more likely a reference to the eerie quiet of the world outside the workplace during the hours when most people are asleep.

I can personally attest to the strangeness of life on the graveyard shift. I once worked in an ice cream cone factory. Day and night, noisy, smoky machines resembling small Ferris wheels carried metal molds around and around, while jets of flame cooked the cones inside them. After a rotation, each mold would tip, releasing four cones onto a conveyor belt, rows of which would then approach my station relentlessly. I’d scoop up a stack of 25, twirl them around in a quick check for holes, and place them in a tall box.

Almost simultaneously, I’d make cardboard dividers, scoop up three more of those stacks and seal them, well-divided, in that box, which I then inserted in an even larger cardboard carton and rushed to a giant mechanical stapler. There, I pressed it against a switch, and — boom-ba-da-boom — six large staples would seal it shut, leaving me just enough time to put that carton atop a pallet of them before racing back to my machine, as new columns of just-baked cones piled up, threatening to overwhelm my worktable.

The only time you stopped scooping and boxing was when a relief worker arrived, so you could have a brief break or gobble down your lunch. You rarely talked to your fellow-workers, because there was only one “relief” packer, so only one person at a time could be on break. Health regulations made it illegal to drink water on the line and management was too cheap to buy screens for the windows, which remained shut, even when it was more than 100 degrees outside.

They didn’t like me very much at the Maryland Pacific Cone Company, maybe because I wanted to know why the high school boys who swept the floors made more than the women who, since the end of World War II, had been climbing three rickety flights of stairs to stand by those machines. In any case, management there started messing with my shifts, assigning me to all three in the same week. As you might imagine, I wasn’t sleeping a whole lot and would occasionally resort to those “little white pills” immortalized in the truckers’ song “Six Days on the Road.”

But I’ll never forget one graveyard shift when an angel named Rosie saved my job and my sanity. It was probably three in the morning. I’d been standing under fluorescent lights, scooping, twirling, and boxing for hours when the universe suddenly stood still. I realized at that moment that I’d never done anything else since the beginning of time but put ice cream cones in boxes and would never stop doing so until the end of time.

If time lost its meaning then, dimensions still turned out to matter a lot, because the cones I was working on that night were bigger than I was used to. Soon I was falling behind, while a huge mound of 40-ounce Eat-It-Alls covered my table and began to spill onto the floor. I stared at them, frozen, until I suddenly became aware that someone was standing at my elbow, gently pushing me out of the way.

Rosie, who had been in that plant since the end of World War II, said quietly, “Let me do this. You take my line.” In less than a minute, she had it all under control, while I spent the rest of the night at her machine, with cones of a size I could handle.

I have never been so glad to see the dawn.

The Deadly Reality of the Graveyard Shift

So, when the president of the United States negotiated to get dock workers in Los Angeles to work all night, I felt a twinge of horror. There’s another all-too-literal reason to call it the “graveyard” shift. It turns out that working when you should be in bed is dangerous. Not only do more accidents occur when the human body expects to be asleep, but the long-term effects of night work can be devastating. As the Centers for Disease Control and Prevention’s National Institute of Occupational Safety and Health (NIOSH) reports, the many adverse effects of night work include:

“type 2 diabetes, heart disease, stroke, metabolic disorders, and sleep disorders. Night shift workers might also have an increased risk for reproductive issues, such as irregular menstrual cycles, miscarriage, and preterm birth. Digestive problems and some psychological issues, such as stress and depression, are more common among night shift workers. The fatigue associated with nightshift can lead to injuries, vehicle crashes, and industrial disasters.”

Some studies have shown that such shift work can also lead to decreased bone-mineral density and so to osteoporosis. There is, in fact, a catchall term for all these problems: shift-work disorder.

In addition, studies directly link the graveyard shift to an increased incidence of several kinds of cancer, including breast and prostate cancer. Why would disrupted sleep rhythms cause cancer? Because such disruptions affect the release of the hormone melatonin. Most of the body’s cells contain little “molecular clocks” that respond to daily alternations of light and darkness. When the light dims at night, the pineal gland releases melatonin, which promotes sleep. In fact, many people take it in pill form as a “natural” sleep aid. Under normal circumstances, such a melatonin release continues until the body encounters light again in the morning.

When this daily (circadian) rhythm is disrupted, however, so is the regular production of melatonin, which turns out to have another important biological function. According to NIOSH, it “can also stop tumor growth and protect against the spread of cancer cells.” Unfortunately, if your job requires you to stay up all night, it won’t do this as effectively.

There’s a section on the NIOSH website that asks, “What can night shift workers do to stay healthy?” The answers are not particularly satisfying. They include regular checkups and seeing your doctor if you have any of a variety of symptoms, including “severe fatigue or sleepiness when you need to be awake, trouble with sleep, stomach or intestinal disturbances, irritability or bad mood, poor performance (frequent mistakes, injuries, vehicle crashes, near misses, etc.), unexplained weight gain or loss.”

Unfortunately, even if you have access to healthcare, your doctor can’t write you a prescription to cure shift-work disorder. The cure is to stop working when your body should be asleep.

An End to Shift Work?

Your doctor can’t solve your shift work issue because, ultimately, it’s not an individual problem. It’s an economic and an ethical one.

There will always be some work that must be performed while most people are sleeping, including healthcare, security, and emergency services, among others. But most shift work gets done not because life depends upon it, but because we’ve been taught to expect our patio furniture on demand. As long as advertising and the grow-or-die logic of capitalism keep stoking the desire for objects we don’t really need, may not even really want, and will sooner or later toss on a garbage pile in this or some other country, truckers and warehouse workers will keep damaging their health.

Perhaps the pandemic, with its kinky supply chain, has given us an opportunity to rethink which goods are so “critical” that we’re willing to let other people risk their lives to provide them for us. Unfortunately, such a global rethink hasn’t yet touched Joe Biden and his administration as they confront an ongoing pandemic, supply-chain problems, a rise in inflation, and — oh yes! — an existential climate crisis that gets worse with every plastic widget produced, packed, and shipped.

It’s time for Biden — and the rest of us — to take a breath and think this through. There are good reasons that so many people are walking away from underpaid, life-threatening work. Many of them are reconsidering the nature of work itself and its place in their lives, no matter what the president or anyone else might wish.

And that’s a paradigm shift we all could learn to live with.

Copyright 2021 Rebecca Gordon

Featured image: Port of Los Angeles sunrise by pete is licensed under CC BY 2.0 / Flickr



The curse of the ignored modern prophets

For decades, I kept a poster on my wall that I'd saved from the year I turned 16. In its upper left-hand corner was a black-and-white photo of a white man in a grey suit. Before him spread a cobblestone plaza. All you could see were the man and the stones. Its caption read, "He stood up alone and something happened."

It was 1968. "He" was Minnesota Senator Eugene McCarthy. As that campaign slogan suggested, his strong second-place showing in the New Hampshire primary was proof that opposition to the Vietnam War had finally become a viable platform for a Democratic candidate for president. I volunteered in McCarthy's campaign office that year. My memory of my duties is now vague, but they mainly involved alphabetizing and filing index cards containing information about the senator's supporters. (Remember, this was the age before there was a computer in every pocket, let alone social media and micro-targeting.)

Running against the Vietnam War, McCarthy was challenging then-President Lyndon Johnson in the Democratic primaries. After McCarthy's strong second-place showing in New Hampshire, New York Senator Robert F. Kennedy entered the race, too, running against the very war his brother, President John F. Kennedy, had bequeathed to Johnson when he was assassinated. Soon, Johnson would withdraw from the campaign, announcing in a televised national address that he wouldn't run for another term.

With his good looks and family name, Bobby Kennedy appeared to have a real chance for the nomination when, on June 5, 1968, during a campaign event in Los Angeles, he, like his brother, was assassinated. That left the war's opponents without a viable candidate for the nomination. Outside the Democratic Party convention in Chicago that August, tens of thousands of angry, mostly young Americans demonstrated their frustration with the war and the party's refusal to take a stand against it. In what was generally recognized as a police riot, the Chicago PD beat protesters and journalists bloody on national TV, as participants chanted, "The whole world is watching." And indeed, it was.

In the end, the nomination went to Johnson's vice president and war supporter Hubert Humphrey, who would face Republican hawk Richard Nixon that November. The war's opponents watched in frustration as the two major parties closed ranks, cementing their post-World-War-II bipartisan agreement to use military power to enforce U.S. global dominance.

Cassandra Foresees the Future

Of course, the McCarthy campaign's slogan was wrong on two counts. He didn't stand up alone. Millions of us around the world were then working to end the war in Vietnam. Sadly, nothing conclusive happened as a result of his campaign. Nixon went on to win the 1968 general election and the Vietnam War dragged on to an ignominious U.S. defeat seven years later.

Nineteen sixty-eight was also the year my high school put on Tiger at the Gates, French playwright Jean Giraudoux's antiwar drama about the run-up to the Trojan War. Giraudoux chronicled that ancient conflict's painful inevitability, despite the fervent desire of Troy's rulers and its people to prevent it. The play opens as Andromache, wife of the doomed Trojan warrior Hector, tells her sister-in-law Cassandra, "There's not going to be a Trojan war."

Cassandra, you may remember, bore a double curse from the gods: yes, she could see into the future, but no one would believe her predictions. She informs Andromache that she's wrong; that, like a tiger pacing outside the city's walls, war with all its bloody pain is preparing to spring. And, of course, she's right. Part of the play's message is that Cassandra doesn't need her supernatural gift to predict the future. She can guess what will happen simply because she understands the relentless forces driving her city to war: the poets who need tragedies to chronicle; the would-be heroes who desire glory; the rulers caught in the inertia of tradition.

Although Tiger was written in the 1930s, between the two world wars, it could just as easily have appeared in 1968. Substitute the mass media for the poets; the military-industrial complex for the Greek and Trojan warriors; and administration after administration for the city's rulers, and you have a striking representation of the quicksand war that dragged 58,000 U.S. soldiers and millions of Vietnamese, Laotians, and Cambodians to their deaths. And in some sense, we — the antiwar forces in this country — foresaw it all (in broad outline, if not specific detail): the assassinations, carpet bombings, tiger cages, and the CIA's first mass assassination and torture scheme, the Phoenix Program. Of course we couldn't predict the specifics. Indeed, some turned out worse than we'd feared. In any case, our foresight did us no more good than Cassandra's did her.

Rehabilitations and Revisions

It's just over a month since the 20th anniversary of the 9/11 attacks and the start of the "Global War on Terror." The press has been full of recollections and rehabilitations. George W. Bush used the occasion to warn the nation (as if we needed it at that point) about the dangers of what CNN referred to as "domestic violent extremists." He called them "children of the same foul spirit" as the one that engenders international terrorism. He also inveighed against the January 6th Capitol invasion:

"'This is how election results are disputed in a banana republic — not our democratic republic,' he said in a statement at the time, adding that he was 'appalled by the reckless behavior of some political leaders since the election.'"

You might almost think he'd forgotten that neither should elections in a democracy be "disputed" by three-piece-suited thugs shutting down a ballot count — as happened in Florida during his own first election in 2000. Future Trump operative Roger Stone has claimed credit for orchestrating that so-called Brooks Brothers Rebellion, which stopped the Florida vote count and threw the election to the Supreme Court and, in the end, to George W. Bush.

You might also think that, with plenty of shoving from his vice president Dick Cheney and a cabal of leftover neocons from the Project for a New American Century, Bush had never led this country into two devastating, murderous, profoundly wasteful wars. You might think we'd never seen the resumption of institutionalized CIA- and military-run state torture on a massive scale under his rule, or his administration's refusal to join the International Criminal Court.

And finally, you might think that nobody saw all this coming, that there were no Cassandras in this country in 2001. But there you would be wrong. All too many of us sensed just what was coming as soon as the bombing and invasion of Afghanistan began. I knew, for example, as early as November 2001, when the first mainstream article extolling the utility of torture appeared, that whatever else the U.S. response to the 9/11 attacks would entail, organized torture would be part of it. As early as December 2002, we all could have known that. That's when the first articles began appearing in the Washington Post about the "stress and duress" techniques the CIA was already beginning to use at Bagram Air Base in Afghanistan. Some of the hapless victims would later turn out to have been sold to U.S. forces for bounties by local strongmen.

It takes very little courage for a superannuated graduate student (as I was in 2001) to write academic papers about U.S. torture practices (as I did) and the stupidity and illegality of our invasion of Afghanistan. It's another thing, however, when a real Cassandra stands up — all alone — and tries to stop something from happening.

I'm talking, of course, about Representative Barbara Lee, the only member of Congress to vote against granting the president the power to "use all necessary and appropriate force against those nations, organizations, or persons he determines planned, authorized, committed, or aided the terrorist attacks that occurred on September 11, 2001, or harbored such organizations or persons." It was this Authorization for Use of Military Force, or AUMF, that provided the legal grounds for the U.S. invasion of Afghanistan in October 2001. Lee was right when, after agonizing about her vote, she decided to follow the counsel of the dean of the National Cathedral, the Reverend Nathan Baxter. That very morning, she had heard him pray that, in response to the terrible crimes of 9/11, we not "become the evil we deplore."

How right she was when she said on the House floor:

"However difficult this vote may be, some of us must urge the use of restraint. Our country is in a state of mourning. Some of us must say, 'Let's step back for a moment, let's just pause, just for a minute, and think through the implications of our actions today, so that this does not spiral out of control.'"

The legislation she opposed that day would indeed allow "this" to spiral out of control. That same AUMF has since been used to justify an ever-metastasizing series of wars and conflicts that spread from Afghanistan in central Asia through the Middle East, south to Yemen, and leapt to Libya, Somalia, and other lands in Africa. Despite multiple attempts to repeal it, that same minimalist AUMF remains in effect today, ready for use by the next president with aspirations to military adventures. In June 2021, the House of Representatives did pass a bill, sponsored by Barbara Lee herself, repealing the companion 2002 Iraq war authorization. At present, however, that bill languishes in the Senate's Committee on Foreign Relations, while the 2001 AUMF remains on the books.

In the days after 9/11, Lee was roundly excoriated for her vote. The Wall Street Journal called her a "clueless liberal," while the Washington Times wrote that she was "a long-practicing supporter of America's enemies." Curiously, both those editorials were headlined with the question, "Who Is Barbara Lee?" (Those of us in the San Francisco Bay Area could have answered that. Lee was — and remains — an African American congressional representative from Oakland, California, the inheritor of the seat and mantle of another great black congressional representative, Ron Dellums.) She received mountains of hate mail then and enough death threats to force her to seek police protection.

Like George W. Bush, Lee received some media rehabilitation in various 20th anniversary retrospectives of 9/11. In her case, however, it was well-deserved. The Washington Post, for instance, praised her for her courage, noting that no one — not Bernie Sanders, not Joe Biden — shared her vision, or, I would add, shared Cassandra's curse with her. Like the character in Tiger at the Gates, Lee didn't need a divine gift to foresee that the U.S. "war on terror" would spin disastrously out of control. A little historical memory might have served the rest of the country well, reminding us of what happened the last time the United States fought an ever-escalating war.

Cassandras and Their Mirror Images

It was clear from the start that Vice President Dick Cheney and Secretary of Defense Donald Rumsfeld were never that interested in Afghanistan (although that was no solace to the many thousands of Afghans who were bombed, beaten, and tortured). Those officials had another target in mind — Iraq — almost literally from the moment al-Qaeda's hijacked planes struck New York and Washington.

In 2002, after months of lies about Iraqi leader Saddam Hussein's possession of (nonexistent) weapons of mass destruction (WMD) and his supposed pursuit of a nuclear bomb, the Bush administration got its second AUMF, authorizing "the President to use the U.S. armed forces to: …defend U.S. national security against the continuing threat posed by Iraq," in effect green-lighting the U.S. invasion of his country. This time, Barbara Lee was not alone in her opposition. In the House, she was joined by 132 Democrats, 6 Republicans, and one independent (Bernie Sanders). Only 23 senators, however, voted "nay," including Rhode Island Republican Lincoln Chafee and Vermont independent Jim Jeffords.

In the run-up to the March 2003 invasion, figures who might be thought of as "anti-Cassandras" took center stage. Unlike the Greek seer, these unfortunates were apparently doomed to tell falsehoods — and be believed. Among them was Condoleezza Rice, President Bush's national security advisor, who, when pressed for evidence that Saddam Hussein actually possessed WMD, told CNN's Wolf Blitzer that "we don't want the smoking gun to be a mushroom cloud," implying Iraq represented a nuclear threat to this country.

Then there was Secretary of State Colin Powell, who put the case for war to the United Nations Security Council in February 2003, emphasizing the supposedly factual basis of everything he presented:

"My colleagues, every statement I make today is backed up by sources, solid sources. These are not assertions. What we're giving you are facts and conclusions based on solid intelligence."

It wasn't true, of course, but around the world, many believed him.

And let's not leave out the mainstream press here. There's plenty of blame to go around, but perhaps the anti-Cassandra crown should go to the New York Times for its promotion of Bush administration war propaganda, especially by its reporter Judith Miller. In 2004, the Times published an extraordinary mea culpa, an apologetic note "from the editors" that said,

"[W]e have found a number of instances of coverage that was not as rigorous as it should have been. In some cases, information that was controversial then, and seems questionable now, was insufficiently qualified or allowed to stand unchallenged. Looking back, we wish we had been more aggressive in re-examining the claims as new evidence emerged — or failed to emerge."

I suspect the people of Iraq might share the Times's wish.

There was, of course, one other group of prophets who accurately foresaw the horrors that a U.S. invasion would bring with it: the millions who filled the streets of their cities here and around the world, demanding that the United States stay its hand. So powerful was their witness that they were briefly dubbed "the other superpower." Writing in the Nation, Jonathan Schell extolled their strength, saying that this country's "shock and awe" assault on Iraq "has found its riposte in courage and wonder." Alas, that mass witness in those streets was not enough to forestall one more murderous assault by what would, in the long run, prove to be a dying empire.

Cassandra at the Gates (of Glasgow)

And now, the world is finally waking up to an even greater disaster: the climate emergency that's burning up my part of the world, the American West, and drowning others. This crisis has had its Cassandras, too. One of these was 89-year-old John Rogalsky, who worked for 35 years as a meteorologist in the federal government. As early as 1963, he became aware of the problem of climate change and began trying to warn us. In 2017, he told the Canadian Broadcasting Corporation:

"[B]y the time the end of the 60s had arrived, I was absolutely convinced that it was real, it was just a question of how rapidly it would happen and how difficult it would become for the world at large, and how soon before people, or governments would even listen to the science. People I talked to about this, I was letting them know, this is happening, get ready."

This November, the 197 nations that have signed up to the United Nations Framework Convention on Climate Change will meet in Glasgow, Scotland, at the 2021 United Nations Climate Change Conference (COP26). We must hope that this follow-up to the 2015 Paris agreement will produce concrete steps to reverse the overheating of this planet and mitigate its effects, especially in those nations that have contributed the least to the problem and are already suffering disproportionately. Italy and the United Kingdom will serve as co-hosts.

I hope it's a good sign that at a pre-Glasgow summit in Milan, Italy's Prime Minister Mario Draghi met with three young "Cassandras" — climate activists Greta Thunberg (Sweden), Vanessa Nakate (Uganda), and Martina Comparelli (Italy) — after Thunberg's now famous "blah, blah, blah" speech, accusing world leaders of empty talk. "Your pressure, frankly, is very welcome," Draghi told them. "We need to be whipped into action. Your mobilization has been powerful, and rest assured, we are listening."

For the sake of the world, let us hope that this time Cassandra will be believed.

Copyright 2021 Rebecca Gordon

Featured image: Climate protest by Victoria Pickering is licensed under CC BY-NC-ND 2.0 / Flickr

Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Books, John Feffer's new dystopian novel, Songlands (the final one in his Splinterlands series), Beverly Gologorsky's novel Every Body Has a Story, and Tom Engelhardt's A Nation Unmade by War, as well as Alfred McCoy's In the Shadows of the American Century: The Rise and Decline of U.S. Global Power and John Dower's The Violent American Century: War and Terror Since World War II.

Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.

Debt and disillusionment: Is higher education a giant pyramid scheme?

For the last decade and a half, I've been teaching ethics to undergraduates. Now — admittedly, a little late to the party — I've started seriously questioning my own ethics. I've begun to wonder just what it means to be a participant, however minor, in the pyramid scheme that higher education has become in the years since I went to college.

Airplane Games

Sometime in the late 1980s, the Airplane Game roared through the San Francisco Bay Area lesbian community. It was a classic pyramid scheme, even if cleverly dressed up in language about women's natural ability to generate abundance, just as we gestate children in our miraculous wombs. If the connection between feminism and airplanes was a little murky — well, we could always think of ourselves as modern-day Amelia Earharts. (As long as we didn't think too hard about how she ended up.)

A few women made a lot of money from it — enough, in the case of one friend of mine, for a down payment on a house. Inevitably, a lot more of us lost money, even as some like me stood on the sidelines sadly shaking our heads.

There were four tiers on that "airplane": a captain, two co-pilots, four crew, and eight passengers — 15 in all to start. You paid $3,000 to get on at the back of the plane as a passenger, so the first captain (the original scammer) got out with $24,000 — $3,000 from each passenger. The co-pilots and crew, who were in on the fix, paid nothing to join. When the first captain "parachuted out," the game split in two, and each co-pilot became the captain of a new plane. They then pressured their four remaining passengers to recruit enough new women to fill each plane, so they could get their payday, and the two new co-pilots could each captain their own planes.

Unless new people continued to get on at the back of each plane, there would be no payday for the earlier passengers, so the pressure to recruit ever more women into the game only grew. The original scammers ran through the game a couple of times, but inevitably the supply of gullible women willing to invest their savings ran out. By the time the game collapsed, hundreds of women had lost significant amounts of money.
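The arithmetic behind that collapse is worth making explicit. Here's a minimal sketch in Python; the dollar figures and plane structure come from the description above, while the six-round horizon is arbitrary, chosen only to show how fast the required recruiting doubles:

```python
# Sketch of the Airplane Game's arithmetic: every cashed-out captain's
# plane splits in two, so the number of fresh $3,000 buy-ins needed
# doubles each round -- exponential growth that must eventually outrun
# the supply of willing recruits.

BUY_IN = 3_000   # what each new passenger paid
SEATS = 8        # passengers per plane

def airplane_rounds(n_rounds):
    """For each round, return (round, new passengers, dollars paid to
    that round's captains, cumulative passengers who have bought in)."""
    results = []
    planes = 1
    cumulative = 0
    for r in range(1, n_rounds + 1):
        new_passengers = planes * SEATS
        cumulative += new_passengers
        payout = new_passengers * BUY_IN  # flows to this round's captains
        results.append((r, new_passengers, payout, cumulative))
        planes *= 2  # each plane splits when its captain parachutes out
    return results

for r, joined, paid, total in airplane_rounds(6):
    print(f"round {r}: {joined} new buy-ins, ${paid:,} to captains, {total} players so far")
```

By round six, 256 new women must buy in just to pay that round's captains, and more than 500 have handed over their $3,000. A community of any finite size runs out of recruits within a handful of rounds.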

No one seemed to know the women who'd brought the game and all those "planes" to the Bay Area, but they had spun a winning story about endless abundance and the glories of women's energy. After the game collapsed, they took off for another women's community with their "earnings," leaving behind a lot of sadder, poorer, and perhaps wiser San Francisco lesbians.

Feasting at the Tenure Trough or Starving in the Ivory Tower?

So, you may be wondering, what could that long-ago scam have to do with my ethical qualms about working as a college instructor? More than you might think.

Let's start with PhD programs. In 2019, the most recent year for which statistics are available, U.S. colleges and universities churned out about 55,700 doctorates — and such numbers continue to increase by about 1% a year. The average number of doctorates earned over the last decade is almost 53,000 annually. In other words, we're talking about nearly 530,000 PhDs produced by American higher education in those 10 years alone. Many of them have ended up competing for a far smaller number of jobs in the academic world.

It's true that most PhDs in science or engineering end up with post-doctoral positions (earning roughly $40,000 a year) or with tenure-track or tenured jobs in colleges and universities (averaging $60,000 annually to start). Better yet, most of them leave their graduate programs with little or no debt.

The situation is far different if your degree wasn't in STEM (science, technology, engineering, or mathematics) but, for example, in education or the humanities. As a start, far more of those degree-holders graduate owing money, often significant sums, and ever fewer end up teaching in tenure-track positions — in jobs, that is, with security, decent pay, and benefits.

Many of the non-STEM PhDs who stay in academia end up joining an exploited, contingent workforce of part-time, or "adjunct," professors. That reserve army of the underemployed is higher education's dirty little secret. After all, we — and yes, I'm one of them — actually teach the majority of the classes in many schools, while earning as little as $1,500 a semester for each of them.

I hate to bring up transportation again, but there's a reason teachers like us are called "freeway flyers." A 2014 Congressional report revealed that 89% of us work at more than one institution and 27% at three different schools, just to cobble together the most meager of livings.

Many of us, in fact, rely on public antipoverty programs to keep going. Inside Higher Ed, reflecting on a 2020 report from the American Federation of Teachers, describes our situation this way:

"Nearly 25% of adjunct faculty members rely on public assistance, and 40% struggle to cover basic household expenses, according to a new report from the American Federation of Teachers. Nearly a third of the 3,000 adjuncts surveyed for the report earn less than $25,000 a year. That puts them below the federal poverty guideline for a family of four."

I'm luckier than most adjuncts. I have a union, and over the years we've fought for better pay, healthcare, a pension plan, and a pathway (however limited) to advancement. Now, however, my school's administration is using the pandemic as an excuse to try to claw back the tiny cost-of-living adjustments we won in 2019.

The Oxford Dictionary of English defines an adjunct as "a thing added to something else as a supplementary rather than an essential part." Once upon a time, in the middle of the previous century, that's just what adjunct faculty were — occasional additions to the full-time faculty. Often, they were retired professionals who supplemented a department's offerings by teaching a single course in their area of expertise, while their salaries were more honoraria than true payments for work performed. Later, as more women entered academia, it became common for a male professor's wife to teach a course or two, often as part of his employment arrangement with the university. Since her salary was a mere adjunct to his, she was paid accordingly.

Now, the situation has changed radically. In many colleges and universities, adjunct faculty are no longer supplements, but the most "essential part" of the teaching staff. Classes simply couldn't go on without us; nor, if you believe college administrations, could their budgets be balanced without us. After all, why pay a full-time professor $10,000 to teach a class (since he or she will be earning, on average, $60,000 a year to cover three classes a semester, or six a year) when you can give a part-timer like me $1,500 for the very same work?

And adjuncts have little choice. The competition for full-time positions is fierce, since every year another 53,000 or more new PhDs climb into the back row of the academic airplane, hoping to make it to the pilot's seat and secure a tenure-track position.

And here's another problem with that. These days the people in the pilots' seats often aren't parachuting out. They're staying right where they are. That, in turn, means new PhDs find themselves competing for an ever-shrinking prize, as Laura McKenna has written in the Atlantic, "not only with their own cohort but also with the unemployed PhDs who graduated in previous years." Many of those now clinging to pilots' seats are members of my own boomer generation, who still benefit from a 1986 law (signed by then-75-year-old President Ronald Reagan) that outlawed mandatory retirement.

Grade Inflation v. Degree Inflation?

People in the world of education often bemoan the problem of "grade inflation" — the tendency of average grades to creep up over time. Ironically, this problem is exacerbated by the adjunctification of teaching, since adjuncts tend to award higher grades than professors with secure positions. The reason is simple enough: colleges use student evaluations as a major metric for rehiring adjuncts and higher grades translate directly into better evaluations. Grade inflation at the college level is, in my view, a non-issue, at least for students. Employers don't look at your transcript when they're hiring you and even graduate schools care more about recommendations and GRE scores.

The real problem faced by today's young people isn't grade inflation. It's degree inflation.

Once upon a time in another America, a high-school diploma was enough to snag you a good job, with a chance to move up as time went on (especially if you were white and male, as the majority of workers were in those days). And you paid no tuition whatsoever for that diploma. In fact, public education through 12th grade is still free, though its quality varies profoundly depending on who you are and where you live.

But all that changed as increasing numbers of employers began requiring a college degree for jobs that don't by any stretch of the imagination require a college education to perform. The Washington Post reports:

"Among the positions never requiring a college degree in the past that are quickly adding that to the list of desired requirements: dental hygienists, photographers, claims adjusters, freight agents, and chemical equipment operators."

In 2017, Manjari Raman of the Harvard Business School wrote that

"the degree gap — the discrepancy between the demand for a college degree in job postings and the employees who are currently in that job who have a college degree — is significant. For example, in 2015, 67% of production supervisor job postings asked for a college degree, while only 16% of employed production supervisors had one."

In other words, even though most people already doing such jobs don't have a bachelor's degree, companies are only hiring new people who do. Part of the reason: that requirement automatically eliminates a lot of applicants, reducing the time and effort involved in making hiring decisions. Rather than sifting through résumés for specific skills (like the ability to use certain computer programs or write fluently), employers let a college degree serve as a proxy. The result is not only that they'll hire people who don't have the skills they actually need, but that they're eliminating people who do have the skills but not the degree. You won't be surprised to learn that those rejected applicants are more likely to be people of color, who are underrepresented among the holders of college degrees.

Similarly, some fields that used to accept a BA now require a graduate degree to perform the same work. For example, the Bureau of Labor Statistics reports that "in 2015–16, about 39% of all occupational therapists ages 25 and older had a bachelor's degree as their highest level of educational attainment." Now, however, employers are commonly insisting that new applicants hold at least a master's degree — and so up the pyramid we continually go (at ever greater cost to those students).

The Biggest Pyramid of All

In a sense, you could say that the whole capitalist economy is the biggest pyramid of them all. For every one of the fascinating, fulfilling, autonomous, and well-paying jobs out there, there are thousands of boring, mind- and body-crushing ones like pulling items for shipment in an Amazon warehouse or folding clothes at Forever 21.

We know, in other words, that there are only a relatively small number of spaces in the cockpit of today's economic plane. Nonetheless, we tell our young people that the guaranteed way to get one of those rare gigs at the top of the pyramid is a college education.

Now, just stop for a second and consider what it costs to join the 2021 all-American Airplane Game of education. In 1970, when I went to Reed, a small, private, liberal arts college, tuition was $3,000 a year. I was lucky. I had a scholarship (known in modern university jargon as a "tuition discount") that covered most of my costs. This year, annual tuition at that same school is a mind-boggling $62,420, more than 20 times as high. If college costs had simply risen with inflation, tuition would be about $21,000 a year today; the actual price is nearly triple even that inflation-adjusted figure.
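Those numbers can be checked with a back-of-the-envelope calculation. The roughly sevenfold rise in consumer prices between 1970 and 2021 used below is an approximation for illustration, not an official statistic:

```python
# Back-of-the-envelope comparison of Reed's 1970 and 2021 tuition,
# assuming a rough 7x cumulative inflation factor over that half-century.

TUITION_1970 = 3_000
TUITION_2021 = 62_420
INFLATION_FACTOR = 7.0  # approximate 1970 -> 2021 CPI multiplier

inflation_only = TUITION_1970 * INFLATION_FACTOR  # what inflation alone predicts
nominal_multiple = TUITION_2021 / TUITION_1970    # growth in the sticker price
real_multiple = TUITION_2021 / inflation_only     # growth beyond inflation

print(f"Inflation alone would predict: ${inflation_only:,.0f}")
print(f"Actual tuition is {nominal_multiple:.1f}x the 1970 price,")
print(f"or {real_multiple:.1f}x what inflation alone would explain")
```

In other words, even after stripping out a half-century of inflation, the real price of a year at that college has roughly tripled.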

If I'd attended Federal City College (now the University of D.C.), my equivalent of a state school then, tuition would have been free. Now, even state schools cost too much for many students. Annually, tuition at the University of California at Berkeley, the flagship school of that state's system, is $14,253 for in-state students, and $44,007 for out-of-staters.

I left school owing $800, or about $4,400 in today's dollars. These days, most financial "aid" resembles foreign "aid" to developing countries — that is, it generally takes the form of loans whose interest piles up so fast that it's hard to keep up with it, let alone begin to pay off the principal in your post-college life. Some numbers to contemplate: 62% of those graduating with a BA in 2019 did so owing money — owing, in fact, an average of almost $29,000. The average debt of those earning a graduate degree was an even more staggering $71,000. That, of course, is on top of whatever the former students had already shelled out while in school. And that, in turn, is before the "miracle" of compound interest takes hold and that debt starts to grow like a rogue zucchini.

It's enough to make me wonder whether a seat in the Great American College and University Airplane Game is worth the price, and whether it's ethical for me to continue serving as an adjunct flight attendant along the way. Whatever we tell students about education being the path to a good job, the truth is that there are remarkably few seats at the front of the plane.

Of course, on the positive side, I do still believe that time spent at college offers students something beyond any price — the opportunity to learn to think deeply and critically, while encountering people very different from themselves. The luckiest students graduate with a lifelong curiosity about the world and some tools to help them satisfy it. That is truly a ticket to a good life — and no one should have to buy a seat in an Airplane Game to get one.

Copyright 2021 Rebecca Gordon


Teetering on the existential edge, we have one more last, best chance for survival

In San Francisco, we're finally starting to put away our masks. With 74% of the city's residents over 12 fully vaccinated, for the first time in more than a year we're enjoying walking, shopping, and eating out, our faces naked. So I was startled when my partner reminded me that we need to buy masks again very soon: N95 masks, that is. The California wildfire season has already begun, earlier than ever, and we'll need to protect our lungs during the months to come from the fine particulates carried in the wildfire smoke that's been engulfing this city in recent years.

I was in Reno last September, so I missed the morning when San Franciscans awoke to apocalyptic orange skies, the air freighted with smoke from burning forests elsewhere in the state. The air then was bad enough even in the high mountain valley of Reno. At that point, we'd already experienced "very unhealthy" purple-zone air quality for days. Still, it was nothing like the Mars-like photos then emerging from the Bay Area. I have a bad feeling that I may get my chance to experience the same phenomenon in 2021 — and, as the fires across California have started so much earlier, probably sooner than September.

The situation is pretty dire: this state — along with our neighbors to the north and southeast — is now living through an epic drought. After a dry winter and spring, the fuel-moisture content in our forests (the amount of water in vegetation, living and dead) is way below average. This April, the month when it is usually at its highest, San Jose State University scientists recorded levels a staggering 40% below average in the Santa Cruz Mountains, well below the lowest level ever before observed. In other words, we have never been this dry.

Under the Heat Dome

When it's hot in most of California, it's often cold and foggy in San Francisco. Today is no exception. Despite the raging news about heat records, it's not likely to reach 65 degrees here. So it's a little surreal to consider what friends and family are going through in the Pacific Northwest under the once-in-thousands-of-years heat dome that's settled over the region. A heat dome is an area of high pressure surrounded by upper-atmosphere winds that essentially pin it in place. If you remember your high-school physics, you'll recall that when a gas (for example, the air over the Pacific Northwest) is confined at a fixed volume, the ratio between its pressure and temperature remains constant. If the temperature goes up, the pressure goes up.

The converse is also true; as the pressure rises, so does the temperature. And that's what's been happening over Oregon, Washington, and British Columbia in normally chilly Canada. Mix in the fact that climate change has driven average temperatures in those areas up by three to four degrees since the industrial revolution, and you have a recipe for the disaster that struck the region recently.
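The relationship invoked here is Gay-Lussac's law: for a fixed amount of gas at constant volume, P/T stays constant, with temperature measured in kelvin. A small illustrative calculation (the numbers are generic, not measurements from the 2021 heat dome):

```python
# Gay-Lussac's law: P1/T1 == P2/T2 for a confined gas at constant volume.
# Temperatures must be absolute (kelvin) for the ratio to hold.

def pressure_after_heating(p1_kpa, t1_celsius, t2_celsius):
    """Pressure (kPa) of a confined gas after heating from t1 to t2."""
    t1_kelvin = t1_celsius + 273.15
    t2_kelvin = t2_celsius + 273.15
    return p1_kpa * (t2_kelvin / t1_kelvin)

# Heating confined air from 25 C to 45 C raises its pressure by about 6.7%:
print(pressure_after_heating(101.325, 25.0, 45.0))
```

The converse reading of the same ratio is what the paragraph above describes: as the pressure over a region rises, so does the temperature of the trapped air, each feeding the other under the dome.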

And it has indeed been a disaster. The temperature in the tiny town of Lytton, British Columbia, for instance, hit 121 degrees on June 29th, breaking the Canadian heat record for the third time in as many days. (The previous record had stood since 1937.) That was Tuesday. On Wednesday night, the whole town was engulfed in the flames of multiple fires. The fires, in turn, generated huge pyrocumulus clouds that penetrated as high as the stratosphere (a rare event in itself), producing lightning strikes that ignited new fires in a vicious cycle that, in the end, simply destroyed the kilometer-long town.

Heat records have been broken all over the Pacific Northwest. Portland topped records for three days running, culminating with a 116-degree day on June 28th; Seattle hit a high of 108, which the Washington Post reported "was 34 degrees above the normal high of 74 and higher than the all-time heat record in Washington, D.C., among many other cities much farther to its south."

With the heat comes a rise in "sudden and unexpected" deaths. Hundreds have died in Oregon and Washington and, according to the British Columbia coroner, at least 300 in that province — almost double the average number for that time period.

Class, Race, and Hot Air

It's hardly a new observation that the people who have benefited least from the causes of climate change — the residents of less industrialized countries and poor people of all nations — are already suffering most from its results. Island nations like the Republic of Palau in the western Pacific are a prime example. Palau faces a number of climate-change challenges, according to the United Nations Development Program, including rising sea levels that threaten to inundate some of its lowest-lying islands, which are just 10 meters above sea level. In addition, encroaching seawater is salinating some of its agricultural land, creating seaside strips that can now grow only salt-tolerant root crops. Meanwhile, despite substantial annual rainfall, saltwater inundation threatens the drinking water supply. And worse yet, Palau is vulnerable to ocean storms that, on our heating planet, are growing ever more frequent and severe.

There are also subtle ways the rising temperatures that go with climate change have differential effects, even on people living in the same city. Take air conditioning. One of the reasons people in the Pacific Northwest suffered so horrendously under the heat dome is that few homes in that region are air conditioned. Until recently, people there had been able to weather the minimal number of very hot days each year without installing expensive cooling machinery.

Obviously, people with more discretionary income will have an easier time investing in air conditioning now that temperatures are rising. What's less obvious, perhaps, is that its widespread use makes a city hotter — a burden that falls disproportionately on people who can't afford to install it in the first place. Air conditioning works on a simple principle; it shifts heat from air inside an enclosed space to the outside world, which, in turn, makes that outside air hotter.

A 2014 study of this effect in Phoenix, Arizona, showed that air conditioning raised ambient temperatures by one to two degrees at night — an important finding, because one of the most dangerous aspects of the present heat waves is their lack of night-time cooling. As a result, each day's heat builds on a higher base, while presenting a greater direct-health threat, since the bodies of those not in air conditioning can't recover from the exhaustion of the day's heat at night. In effect, air conditioning not only heats the atmosphere further but shifts the burden of unhealthy heat from those who can afford it to those who can't.

Just as the coronavirus has disproportionately ravaged black and brown communities (as well as poor nations around the world), climate-change-driven heat waves, according to a recent University of North Carolina study reported by the BBC, mean that "black people living in most U.S. cities are subject to double the level of heat stress as their white counterparts." This is the result not just of poverty, but of residential segregation, which leaves urban BIPOC (black, indigenous, and other people of color) communities in a city's worst "heat islands" — the areas containing the most concrete, the most asphalt, and the least vegetation — which therefore attract and retain the most heat.

"Using satellite temperature data combined with demographic information from the U.S. Census," the researchers "found that the average person of color lives in an area with far higher summer daytime temperatures than non-Hispanic white people." They also discovered that, in all but six of the 175 urban areas they studied in the continental U.S., "people of color endure much greater heat impacts in summer." Furthermore, "for black people this was particularly stark. The researchers say they are exposed to an extra 3.12C [5.6F] of heating, on average, in urban neighborhoods, compared to an extra 1.47C [2.6F] for white people."

That's a big difference.

Food, Drink, and Fires — the View from California

Now, let me return to my own home state, California, where conditions remain all too dry and, apart from the coast right now, all too hot. Northern California gets most of its drinking water from the snowpack that builds each year in the Sierra Nevada mountains. In spring, those snows gradually melt, filling the rivers that fill our reservoirs. In May 2021, however, the Sierra snowpack was a devastating six percent of normal!

Stop a moment and take that in, while you try to imagine the future of much of the state — and the crucial crops it grows.

For my own hometown, San Francisco, things aren't quite that dire. Water levels in Hetch Hetchy, our main reservoir, located in Yosemite National Park, are down from previous years, but not disastrously so. With voluntary water-use reduction, we're likely to have enough to drink this year at least. Things are a lot less promising, however, in rural California where towns tend to rely on groundwater for domestic use.

Shrinking water supplies don't just affect individual consumers here in this state, they affect everyone in the United States who eats, because 13.5% of all our agricultural products, including meat and dairy, as well as fruits and vegetables, come from California. Growing food requires prodigious amounts of water. In fact, farmland irrigation accounts for roughly 80% of all water used by businesses and homes in the state.

So how are California's agricultural water supplies doing this year? The answer, sadly, is not very well. State regulators have already cut distribution to about a quarter of California's irrigated acreage (about two million acres) by a drastic 95%. That's right. A full quarter of the state's farmlands have access to just 5% of what they would ordinarily receive from rivers and aqueducts. As a result, some farmers are turning to groundwater, a more easily exhausted source, which also replenishes itself far more slowly than rivers and streams. Some are even choosing to sell their water to other farmers, rather than use it to grow crops at all, because that makes more economic sense for them. As smaller farms are likely to be the first to fold, the water crisis will only enhance the dominance of major corporations in food production.

Meanwhile, we'll probably be breaking out our N95 masks soon. Wildfire season has already begun — earlier than ever. On July 1st, the then-still-uncontained Salt fire briefly closed a section of Interstate 5 near Redding in northern California. (I-5 is the main north-south interstate along the West Coast.) And that's only one of the more than 4,500 fire incidents already recorded in the state this year.

Last year, almost 10,000 fires burned more than four million acres here, and everything points to a similar or worse season in 2021. Unlike Donald Trump, who famously blamed California's fires on a failure to properly rake our forests, President Biden is taking the threat seriously. On June 30th, he convened western state leaders to discuss the problem, acknowledging that "we have to act and act fast. We're late in the game here." The president promised a number of measures: guaranteeing sufficient, and sufficiently trained, firefighters; raising their minimum pay to $15 per hour; and making grants to California counties under the Federal Emergency Management Agency's BRIC (Building Resilient Infrastructure and Communities) program.

Such measures will help a little in the short term, but none of it will make a damn bit of difference in the longer run if the Biden administration and a politically divided Congress don't begin to truly treat climate change as the immediate and desperately long-term emergency it is.

Justice and Generations

In his famous A Theory of Justice, John Rawls, the great liberal philosopher of the twentieth century, proposed a procedural method for designing reasonable and fair principles and policies in a given society. His idea: that the people determining such basic policies should act as if they had stepped behind a "veil of ignorance" and had lost specific knowledge of their own place in society. They'd be ignorant of their own class status, ethnicity, or even how lucky they'd been when nature was handing out gifts like intelligence, health, and physical strength.

Once behind such a veil of personal ignorance, Rawls argued, people might make rules that would be as fair as possible, because they wouldn't know whether they themselves were rich or poor, black or white, old or young — or even which generation they belonged to. This last category was almost an afterthought, included, he wrote, "in part because questions of social justice arise between generations as well as within them."

His point about justice between generations not only still seems valid to me, but in light of present-day circumstances radically understated. I don't think Rawls ever envisioned a trans-generational injustice as great as the climate-change one we're allowing to happen, not to say actively inducing, at this very moment.

Human beings have a hard time recognizing looming but invisible dangers. In 1990, I spent a few months in South Africa providing some technical assistance to an anti-apartheid newspaper. When local health workers found out that I had worked (as a bookkeeper) for an agency in the U.S. trying to prevent the transmission of AIDS, they desperately wanted to talk to me. How, they hoped to learn, could they get people living in their townships to act now to prevent a highly transmissible illness that would only produce symptoms years after infection? How, in the face of the all-too-present emergencies of everyday apartheid life, could they get people to focus on a vague but potentially horrendous danger barreling down from the future? I had few good answers and, more than 30 years later, South Africa has the largest HIV-positive population in the world.

Of course, there are human beings who've known about the climate crisis for decades — and not just the scientists who wrote about it as early as the 1950s or the ones who gave an American president an all-too-accurate report on it in 1965. The fossil-fuel companies have, of course, known all along — and have focused their scientific efforts not on finding alternative energy sources, but on creating doubt about the reality of human-caused climate change (just as, once upon a time, tobacco companies sowed doubt about the relationship between smoking and cancer). As early as 1979, the Guardian reports, an internal Exxon study concluded that the use of fossil fuels would certainly "cause dramatic environmental effects" in the decades ahead. "The potential problem is great and urgent," the study concluded.

A problem that was "great and urgent" in 1979 is now a full-blown existential crisis for human survival.

Some friends and I were recently talking about how ominous the future must look to the younger people we know. "They are really the first generation to confront an end to humanity in their own, or perhaps their children's lifetimes," I said.

"But we had The Bomb," a friend reminded me. "We grew up in the shadow of nuclear war." And she was right of course. We children of the 1950s and 1960s grew up knowing that someone could "press the button" at any time, but there was a difference. Horrifying as is the present retooling of our nuclear arsenal (going on right now, under President Biden), nuclear war nonetheless remains a question of "if." Climate change is a matter of "when" and that when, as anyone living in the Northwest of the United States and Canada should know after these last weeks, is all too obviously now.

It's impossible to overstate the urgency of the moment. And yet, as a species, we're acting like the children of indulgent parents who provide multiple "last chances" to behave. Now, nature has run out of patience and we're running out of chances. So much must be done globally, especially to control the giant fossil-fuel companies. We can only hope that real action will emerge from November's international climate conference. And here in the U.S., unless congressional Democrats succeed in ramming through major action to stop climate change before the 2022 midterms, we'll have lost one more last, best chance for survival.

Copyright 2021 Rebecca Gordon

Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Books, John Feffer's new dystopian novel, Songlands (the final one in his Splinterlands series), Beverly Gologorsky's novel Every Body Has a Story, and Tom Engelhardt's A Nation Unmade by War, as well as Alfred McCoy's In the Shadows of the American Century: The Rise and Decline of U.S. Global Power and John Dower's The Violent American Century: War and Terror Since World War II.

Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.

Welfare for weapons makers doesn't make us safer

These days my conversations with friends about the new administration go something like this:

"Biden's doing better than I thought he would."

"Yeah. Vaccinations, infrastructure, acknowledging racism in policing. A lot of pieces of the Green New Deal, without calling it that. The child subsidies. It's kind of amazing."

"But on the military–"

"Yeah, same old, same old."

As my friends and I have noticed, President Joe Biden remains super-glued to the same old post-World War II agreement between the two major parties: they can differ vastly on domestic policies, but they remain united when it comes to projecting U.S. military power around the world and to the government spending that sustains it. In other words, the U.S. "national security" budget is still the third rail of politics in this country.

Assaulting the Old New Deal

It was Democratic House Speaker Tip O'Neill who first declared that Social Security is "the third rail" of American politics. In doing so, he metaphorically pointed to the high-voltage rail that runs between the tracks of subways and other light-rail systems. Touch that and you'll electrocute yourself.

O'Neill made that observation back in 1981, early in Ronald Reagan's first presidential term, at a moment when the new guy in Washington was already hell-bent on dismantling Franklin Delano Roosevelt's New Deal legacy.

Reagan would fight his campaign to do so on two key fronts. First, he would attack labor unions, whose power had expanded in the years since the 1935 Wagner Act (officially the National Labor Relations Act) guaranteed workers the right to bargain collectively with their employers over wages and workplace rules. Such organizing rights had been hard-won indeed. Not a few workers died at the hands of the police or domestic mercenaries like Pinkerton agents, especially in the early 1930s. By the mid-1950s, union membership would peak at around 35% of workers, while wages would continue to grow into the late 1970s, when they stagnated and began their long decline.

Reagan's campaign began with an attack on PATCO, a union of well-paid professionals — federally-employed air-traffic controllers — which the Federal Labor Relations Authority eventually decertified. That initial move signaled the Republican Party's willingness, even enthusiasm, for breaking with decades of bipartisan support for organized labor. By the time Donald Trump took office in the next century, it was a given that Republicans would openly support anti-union measures like federal "right-to-work" laws, which, if passed, would make it illegal for employers to agree to a union-only workplace and so effectively destroy the bargaining power of unions. (Fortunately, opponents were able to forestall that move during Trump's presidency, but in February 2021, Republicans reintroduced their National Right To Work Act.)

The Second Front and the Third Rail

There was a second front in Reagan's war on the New Deal. He targeted a group of programs from that era that came to be known collectively as "entitlements." Three of the most important were Aid to Dependent Children, unemployment insurance, and Social Security. In addition, in 1965, a Democratic Congress had added a healthcare entitlement, Medicare, which helps cover medical expenses for those over 65 and younger people with specific chronic conditions, as well as Medicaid, which does the same for poor people who qualify. These, too, would soon be in the Republican gunsights.

The story of Reagan's racially inflected attacks on welfare programs is well-known. His administration's urge to go after unemployment insurance, which provided payments to laid-off workers, was less commonly acknowledged. In language eerily echoed by Republican congressional representatives today, the Reagan administration sought to reduce the length of unemployment benefits, so that workers would be forced to take any job at any wage. A 1981 New York Times report, for instance, quoted Reagan Assistant Secretary of Labor Albert Angrisani as saying:

"'The bottom line… is that we have developed two standards of work, available work and desirable work.' Because of the availability of unemployment insurance and extended benefits, he said, 'there are jobs out there that people don't want to take.'"

Reagan did indeed get his way with unemployment insurance, but when he turned his sights on Social Security, he touched Tip O'Neill's third rail.

Unlike welfare, whose recipients are often framed as lazy moochers, and unemployment benefits, which critics claim keep people from working, Social Security was then and remains today a hugely popular program. Because workers contribute to the fund with every paycheck and usually collect benefits only after retirement, beneficiaries appear deserving in the public eye. Of all the entitlement programs, it's the one most Americans believe that they and their compatriots are genuinely entitled to. They've earned it. They deserve it.

So, when the president moved to reduce Social Security benefits, ostensibly to offset a rising deficit in its fund, he was shocked by the near-unanimous bipartisan resistance he met. His White House put together a plan to cut $80 billion over five years by — among other things — immediately cutting benefits and raising the age at which people could begin fully collecting them. Under that plan, a worker who retired early at 62 and was entitled to $248 a month would suddenly see that payout reduced to $162.

Access to early retirement was, and remains, a justice issue for workers with shorter life expectancies — especially when those lives have been shortened by the hazards of the work they do. As South Carolina Republican Congressman Carroll Campbell complained to the White House at the time: "I've got thousands of sixty-year-old textile workers who think it's the end of the world. What the hell am I supposed to tell them?"

After the Senate voted 96-0 to oppose any plan that would "precipitously and unfairly reduce early retirees' benefits," the Reagan administration regrouped and worked out a compromise with O'Neill and the Democrats. Economist (later Federal Reserve chair) Alan Greenspan would lead a commission that put together a plan, approved in 1983, to gradually raise the full retirement age, increase the premiums paid by self-employed workers, start taxing benefits received by people with high incomes, and delay cost-of-living adjustments. Those changes were rolled out gradually, the country adjusted, and no politicians were electrocuted in the process.

Panic! The System Is Going Broke!

With its monies maintained in a separately sequestered trust fund, Social Security, unlike most government programs, is designed to be self-sustaining. Periodically, as economist and New York Times columnist Paul Krugman might put it, serious politicians claim to be concerned about that fund running out of money. There's a dirty little secret that those right-wing deficit slayers never tell you, though: when the Social Security trust fund runs a surplus, as it did from 1983 to 2009, it's required to invest it in government bonds, indirectly helping to underwrite the federal government's general fund.

They also aren't going to mention that one group who contributes to that surplus will never see a penny in benefits: undocumented immigrant workers who pay into the system but won't ever collect Social Security. Indeed, in 2016, such workers provided an estimated $13 billion out of about $957 billion in Social Security taxes, or roughly 1.4% of total revenues. That may not sound like much, but over the years it adds up. In that way, undocumented workers help subsidize the trust fund and, in surplus years, the entire government.

How, then, is Social Security funded? Each year, employees contribute 6.2% of their wages (up to a cap amount). Employers match that, for a total of 12.4% of wages paid, and both put out another 1.45% each for Medicare. Self-employed people pay both shares or a total of 15.3% of their income, including Medicare. And those contributions add up to about nine-tenths of the fund's annual income (89% in 2019). The rest comes from interest on government bonds.
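The shares described above add up in a few lines of arithmetic. Here's a minimal sketch in Python, using only the percentages quoted in the paragraph (the variable names are illustrative):

```python
# Payroll-tax shares as described above (2021 rates):
# 6.2% Social Security from the employee, matched by the employer,
# plus 1.45% each for Medicare; the self-employed pay both halves.
SS_RATE = 0.062
MEDICARE_RATE = 0.0145

employee_share = SS_RATE + MEDICARE_RATE              # 7.65% withheld from wages
employer_share = SS_RATE + MEDICARE_RATE              # matched by the employer
self_employed_rate = employee_share + employer_share  # 15.3% of net earnings

print(round(self_employed_rate, 3))  # 0.153
```

The 12.4% Social Security portion (6.2% twice) is what feeds the trust fund itself; the Medicare portion is tracked separately.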

So, is the Social Security system finally in trouble? It could be. When the benefits due to a growing number of retirees exceed the fund's income, its administrators will have to dip into its reserves to make up the difference. As people born in the post-World War II baby boom reach retirement, at a moment when the American population is beginning to age rapidly, dire predictions are resounding about the potential bankruptcy of the system. And there is, in fact, a consensus that the fund will begin drawing down its reserves, possibly starting this year, and could exhaust them as soon as 2034. At that point, relying only on the current year's income to pay benefits could reduce Social Security payouts to perhaps 79% of what's promised at present.

You can already hear the cries that the system is going broke!

But it doesn't have to be that way. Employees and employers only pay Social Security tax on income up to a certain cap. This year it's $142,800. In other words, employees who make a million dollars in 2021 will contribute no more to Social Security than those who make $142,800. To rescue Social Security, all it would take is raising that cap — or better yet, removing it altogether.
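To see what the cap means in dollars, here's a minimal sketch using only the figures above — the 6.2% employee rate and the $142,800 cap for 2021. The function name is illustrative:

```python
SS_RATE = 0.062     # employee share of the Social Security tax
WAGE_CAP = 142_800  # 2021 taxable-wage cap

def employee_contribution(wages: float) -> float:
    """Annual employee Social Security contribution: wages are taxed only up to the cap."""
    return min(wages, WAGE_CAP) * SS_RATE

# A worker earning exactly the cap and one earning a million dollars
# contribute the same amount -- about $8,853.60 for the year.
at_cap = employee_contribution(142_800)
millionaire = employee_contribution(1_000_000)
assert at_cap == millionaire
```

Removing the cap would amount to dropping the `min()`, so that every dollar of wage income is taxed at the same rate.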

In fact, the Congressional Budget Office has run the numbers and identified two different methods of raising it to eventually tax all wage income. Either would keep the trust fund solvent.

Naturally, plutocrats and their congressional minions don't want to raise the Social Security cap. They'd rather starve the entitlement beast and blame possible shortfalls on greedy boomers who grew up addicted to government handouts. Under the circumstances, we, and succeeding generations, had better hope that Social Security remains, as it was in 1981, the third rail in American politics.

Welfare for Weapons Makers

Of course, there's a second high-voltage, untouchable rail in American politics and that's funding for the military and weapons manufacturers. It takes a brave politician indeed to suggest even the most minor of reductions in Pentagon spending, which has for years been the single largest item of discretionary spending in the federal budget.

It's notoriously difficult to identify how much money the government actually spends annually on the military. President Trump's last Pentagon budget, for the fiscal year ending on September 30th, offered about $740 billion to the armed services (not including outlays for veteran services and pensions). Or maybe it was only $705.4 billion. Or perhaps, including Department of Energy outlays involving nuclear weapons, $753.5 billion. (And none of those figures even faintly reflected full national-security spending, which is certainly well over a trillion dollars annually.)

Most estimates put President Biden's 2022 military budget at $753 billion — about the same as Trump's for the previous year. As former Senator Everett Dirksen is once supposed to have said, "A billion here, a billion there, and pretty soon you're talking real money."

Indeed, we're talking real money and real entitlements here that can't be touched in Washington without risking political electrocution. Unlike actual citizens, U.S. arms manufacturers seem entitled to ever-increasing government subsidies — welfare for weapons, if you like. Beyond the billions spent to directly fund the development and purchase of various weapons systems, every time the government permits arms sales to other countries, it's filling the coffers of companies like Lockheed Martin, Northrop Grumman, Boeing, and Raytheon Technologies. The real beneficiaries of Donald Trump's so-called Abraham Accords between Israel and the majority Muslim states of Morocco, the United Arab Emirates, Bahrain, and Sudan were the U.S. companies that sell the weaponry that sweetened those deals for Israel's new friends.

When Americans talk about undeserved entitlements, they're usually thinking about welfare for families, not welfare for arms manufacturers. But military entitlements make the annual federal appropriation of $16.5 billion for Temporary Assistance for Needy Families (TANF) look puny by comparison. In fact, during Republican and Democratic administrations alike, the yearly federal outlay for TANF hasn't changed since it was established through the 1996 Personal Responsibility and Work Opportunity Reconciliation Act, known in the Clinton era as "welfare reform." Inflation has, however, eroded its value by about 40% in the intervening years.

And what do Americans get for those billions no one dares to question? National security, right?

But how is it that the country that spends more on "defense" than the next seven, or possibly 10, countries combined is so insecure that every year's Pentagon budget must exceed the last one? Why is it that, despite those billions for military entitlements, our critical infrastructure, including hospitals, gas pipelines, and subways (not to mention Cape Cod steamships), lies exposed to hackers?

And if, thanks to that "defense" budget, we're so secure, why is it that, in my wealthy home city of San Francisco, residents now stand patiently in lines many blocks long to receive boxes of groceries? Why is "national security" more important than food security, or health security, or housing security? Or, to put it another way, which would you rather be entitled to: food, housing, education, and healthcare, or your personal share of a shiny new hypersonic missile?

But wait! Maybe defense spending contributes to our economic security by creating, as Donald Trump boasted in promoting his arms deals with Saudi Arabia, "jobs, jobs, jobs." It's true that spending on weaponry does, in fact, create jobs, just not nearly as many as investing taxpayer dollars in a variety of far less lethal endeavors would. As Brown University's Costs of War project reports:

"Military spending creates fewer jobs than the same amount of money would have, if invested in other sectors. Clean energy and health care spending create 50% more jobs than the equivalent amount of spending on the military. Education spending creates more than twice as many jobs."

It seems that President Joe Biden is ready to shake things up by attacking child poverty, the coronavirus pandemic, and climate change, even if he has to do it without any Republican support. But he's still hewing to the old Cold War bipartisan alliance when it comes to the real third rail of American politics — military spending. Until the power can be cut to that metaphorical conduit, real national security remains an elusive dream.

The reality of work in the Biden-Harris era

A year ago, just a few weeks before San Francisco locked itself down for the pandemic, I fell deeply in love with a 50-year-old. The object of my desire was a wooden floor loom in the window of my local thrift shop. Friends knowledgeable on such matters examined photos I took of it and assured me that all the parts were there, so my partner (who puts up with such occasional infatuations) helped me wrangle it into one of our basement rooms and I set about learning to weave.

These days, all I want to do is weave. The loom that's gripped me, and the pandemic that's gripped us all, have led me to rethink the role of work (and its subset, paid labor) in human lives. During an enforced enclosure, this 68-year-old has spent a lot of time at home musing on what the pandemic has revealed about how this country values work. Why, for example, do the most "essential" workers so often earn so little — or, in the case of those who cook, clean, and care for the people they live with, nothing at all? What does it mean when conservatives preach the immeasurable value of labor, while insisting that its most basic price in the marketplace shouldn't rise above $7.25 per hour?

That, after all, is where the federal minimum wage has been stuck since 2009. And that's where it would probably stay forever, if Republicans like Kansas Senator Roger Marshall had their way. He brags that he put himself through college making $6 an hour and doesn't understand why people can't do the same today for $7.25. One likely explanation: the cost of a year at Kansas State University has risen from $898 when he was at school to $10,000 today. Another? At six bucks an hour, he was already making almost twice the minimum wage of his college years, a princely $3.35 an hour.

It's Definitely Not Art, But Is It Work?

It's hard to explain the pleasure I've gotten from learning the craft of weaving, an activity whose roots extend at least 20,000 years into the past. In truth, I could devote the next (and most likely last) 20 years of my life just to playing with "plain weave," its simplest form — over-under, over-under — and not even scratch the surface of its possibilities. Day after day, I tromp down to our chilly basement and work with remarkable satisfaction at things as simple as getting a straight horizontal edge across my cloth.

But is what I'm doing actually "work"? Certainly, at the end of a day of bending under the loom to tie things up, of working the treadles to raise and lower different sets of threads, my aging joints are sore. My body knows all too well that I've been doing something. But is it work? Heaven knows, I'm not making products crucial to our daily lives or those of others. (We now possess more slightly lopsided cloth napkins than any two-person household could use in a lifetime.) Nor, at my beginner's level, am I producing anything that could pass for "art."

I don't have to weave. I could buy textiles for a lot less than it costs me to make them. But at my age, in pandemic America, I'm lucky. I have the time, money, and freedom from personal responsibilities to be able to immerse myself in making cloth. For me, playing with string is a first-world privilege. It won't help save humanity from a climate disaster or reduce police violence in communities of color. It won't even help a union elect an American president, something I was focused on last fall, while working with the hospitality-industry union. It's not teaching college students to question the world and aspire to living examined lives, something I've done in my official work as a part-time professor for the last 15 years. It doesn't benefit anyone but me.

Nevertheless, what I'm doing certainly does have value for me. It contributes, as philosophers might say, to my human flourishing. When I practice weaving, I'm engaged in something political philosopher Iris Marion Young believed essential to a good life. As she put it, I'm "learning and using satisfying and expansive skills." Young thought that a good society would offer all its members the opportunity to acquire and deploy such complicated skills in "socially recognized settings." In other words, a good society would make it possible for people to do work that was both challenging and respected.

Writing in the late 1980s, she took for granted that the "welfare capitalism" of Europe, and to a far lesser extent the United States, would provide for people's basic material needs. Unfortunately, decades later, it's hard even to teach her critique of such welfare capitalism — a system that sustained lives but didn't necessarily allow them to flourish — because my students here have never experienced an economic system that assumes any real responsibility for sustaining life. Self-expression and an opportunity to do meaningful work? Pipe dreams if you aren't already well-off! They'll settle for jobs that pay the rent, keep the refrigerator stocked, and maybe provide some health benefits as well. That would be heaven enough, they say. And who could blame them when so many jobs on offer fall far short of even such modest goals?

What I'm not doing when I weave is making money. I'm not one of the roughly 18 million workers in this country who do earn their livings in the textile industry. Such "livings" pay a median wage of about $28,000 a year, which likely makes it hard to keep a roof over your head. Nor am I one of the many millions more who do the same around the world, people like Seak Hong who sews garments and bags for an American company in Cambodia. Describing her life, she told a New York Times reporter, "I feel tired, but I have no choice. I have to work." Six days a week,

"Ms. Hong wakes up at 4:35 a.m. to catch the truck to work from her village. Her workday begins at 7 and usually lasts nine hours, with a lunch break. During the peak season, which lasts two to three months, she works until 8:30 p.m."
"Ms. Hong has been in the garment business for 22 years. She earns the equivalent of about $230 a month and supports her father, her sister, her brother (who is on disability) and her 12-year-old son."

Her sister does the unpaid — but no less crucial — work of tending to her father and brother, the oxen, and their subsistence rice plants.

Hong and her sister are definitely working, one with pay, the other without. They have, as she says, no choice.

Catherine Gamet, who makes handbags in France for Louis Vuitton, is also presumably working to support herself. But hers is an entirely different experience from Hong's. She loves what she's been doing for the last 23 years. Interviewed in the same article, she told the Times, "To be able to build bags and all, and to be able to sew behind the machine, to do hand-sewn products, it is my passion." For Gamet, "The time flies by."

Both these women have been paid to make bags for more than 20 years, but they've experienced their jobs very differently, undoubtedly thanks to the circumstances surrounding their work, rather than the work itself: how much they earn; the time they spend traveling to and from their jobs; the extent to which the "decision" to do a certain kind of work is coerced by fear of poverty. We don't learn from Hong's interview how she feels about the work itself. Perhaps she takes pride in what she does. Most people find a way to do that. But we know that making bags is Gamet's passion. Her work is not merely exhausting, but in Young's phrase "satisfying and expansive." The hours she spends on it are lived, not just endured as the price of survival.

Pandemic Relief and Its Discontents

Joe Biden and Kamala Harris arrived at the White House with a commitment to getting a new pandemic relief package through Congress as soon as possible. It appears that they'll succeed, thanks to the Senate's budget reconciliation process — a maneuver that bypasses the possibility of a Republican filibuster. Sadly, because resetting the federal minimum wage to $15 per hour doesn't directly involve taxation or spending, the Senate's parliamentarian ruled that the reconciliation bill can't include it.

Several measures contained in the package have aroused conservative mistrust, from the extension of unemployment benefits to new income supplements for families with children. Such measures provoke a Republican fear that somebody, somewhere, might not be working hard enough to "deserve" the benefits Congress is offering or that those benefits might make some workers think twice about sacrificing their time caring for children to earn $7.25 an hour at a soul-deadening job.

As New York Times columnist Ezra Klein recently observed, Republicans are concerned that such measures might erode respect for the "natural dignity" of work. In an incisive piece, he rebuked Republican senators like Mike Lee and Marco Rubio for responding negatively to proposals to give federal dollars to people raising children. Such a program, they insisted, smacked of — the horror! — "welfare," while in their view, "an essential part of being pro-family is being pro-work." Of course, for Lee and Rubio "work" doesn't include changing diapers, planning and preparing meals, doing laundry, or helping children learn to count, tell time, and tie their shoelaces — unless, of course, the person doing those things is employed by someone else's family and being paid for it. In that case it qualifies as "work." Otherwise, it's merely a form of government-subsidized laziness.

There is, however, one group of people that "pro-family" conservatives have long believed are naturally suited to such activities and who supposedly threaten the well-being of their families if they choose to work for pay instead. I mean, of course, women whose male partners earn enough to guarantee food, clothing, and shelter with a single income. I remember well a 1993 article by Pat Gowens, a founder of Milwaukee's Welfare Warriors, in the magazine Lesbian Contradiction. She wondered why conservative anti-feminists of that time thought it good if a woman with children had a man to provide those things, but an outrage if she turned to "The Man" for the same aid. In the first case, the woman's work is considered dignified, sacred, and in tune with the divine plan. Among conservatives, then or now, the second could hardly be dignified with the term "work."

The distinction they make between private and public paymasters when it comes to domestic labor contains at least a tacit, though sometimes explicit, racial element. When the program that would come to be known as "welfare" was created as part of President Franklin Roosevelt's New Deal in the 1930s, it was originally designed to assist respectable white mothers who, through no fault of their own, had lost their husbands to death or desertion. It wasn't until the 1960s that African American women decided to secure their right to coverage under the same program and built the National Welfare Rights Organization to do so.

The word "welfare" refers, as in the preamble to the Constitution, to human wellbeing. But when Black women started claiming those rights, it suddenly came to signify undeserved handouts. You could say that Ronald Reagan rode into the White House in 1980 in a Cadillac driven by the mythical Black "welfare queen" he continually invoked in his campaign. It would be nice to think that the white resentment harnessed by Reagan culminated (as in "reached its zenith and will now decline") with Trump's 2016 election, but, given recent events, that would be unrealistically optimistic.

Reagan began the movement to undermine the access of poor Americans to welfare programs. Ever since, starving the entitlement beast has been the Republican lodestar. In the same period, of course, the wealthier compatriots of those welfare mothers have continued to receive ever more generous "welfare" from the government. Those would include subsidies to giant agriculture, oil-depletion allowances and other subsidies for fossil-fuel companies, the mortgage-interest tax deduction for people with enough money to buy rather than rent their homes, and the massive tax cuts for billionaires of the Trump era. However, it took a Democratic president, Bill Clinton, to achieve what Reagan couldn't, and, as he put it, "end welfare as we know it."

The Clinton administration used the same Senate reconciliation process in play today for the Biden administration's Covid-19 relief bill to push through the 1996 Personal Responsibility and Work Opportunity Reconciliation Act. It was more commonly known as "welfare reform." That act imposed a 32-hour-per-week work or training requirement on mothers who received what came to be known as Temporary Assistance for Needy Families. It also gave "temporary" its deeper meaning by setting a lifetime benefits cap of five years. Meanwhile, that same act proved a bonanza for non-profits and Private Industry Councils that got contracts to administer "job training" programs and were paid to teach women how to wear skirts and apply makeup to impress future employers. In the process, a significant number of unionized city and county workers nationwide were replaced with welfare recipients "earning" their welfare checks by sweeping streets or staffing county offices, often for less than the minimum wage.

In 1997, I was working with Californians for Justice (CFJ), then a new statewide organization dedicated to building political power in poor communities, especially those of color. Given the high unemployment rates in just such communities, our response to Clinton's welfare reforms was to demand that those affected by them at least be offered state-funded jobs at a living wage. If the government was going to make people work for pay, we reasoned, then it should help provide real well-paying jobs, not bogus "job readiness" programs. We secured sponsors in the state legislature, but I'm sure you won't be shocked to learn that our billion-dollar jobs bill never got out of committee in Sacramento.

CFJ's project led me into an argument with one of my mentors, the founder of the Center for Third World Organizing, Gary Delgado. Why on earth, he asked me, would you campaign to get people jobs? "Jobs are horrible. They're boring: they waste people's lives and destroy their bodies." In other words, Gary was no believer in the inherent dignity of paid work. So, I had to ask myself, why was I?

Among those who have inspired me, Gary wasn't alone in holding such a low opinion of jobs. The Greek philosopher Aristotle, for instance, had been convinced that those whose economic condition forced them to work for a living would have neither the time nor space necessary to live a life of "excellence" (his requirement for human happiness). Economic coercion and a happy life were, in his view, mutually exclusive.

Reevaluating Jobs

One of the lies capitalism tells us is that we should be grateful for our jobs and should think of those who make a profit from our labor not as exploiters but as "job creators." In truth, however, there's no creativity involved in paying people less than the value of their work so that you can skim off the difference and claim that you earned it. Even if we accept that there could be creativity in "management" — the effort to organize and divide up work so it's done efficiently and well — it's not the "job creators" who do that, but their hirelings. All the employers bring to the game is money.

Take the example of the admirable liberal response to the climate emergency, the Green New Deal. In the moral calculus of capitalism, it's not enough that shifting to a green economy could promote the general welfare by rebuilding and extending the infrastructure that makes modern life possible and rewarding. It's not enough that it just might happen in time to save billions of people from fires, floods, hurricanes, or starvation. What matters — the selling point — is that such a conversion would create jobs (along with the factor no one mentions out loud: profits).

Now, I happen to support exactly the kind of work involved in building an economy that could help reverse climate devastation. I agree with Joe Biden's campaign statement that such an undertaking could offer people jobs with "good wages, benefits, and worker protections." More than that, such jobs would indeed contribute to a better life for those who do them. As the philosopher Iris Marion Young puts it, they would provide the chance to learn and use "satisfying and expansive skills in a socially recognized setting." And that would be a very good thing even if no one made a penny of profit in the process.

Now, having finished my paid labor for the day, it's back to the basement and loom for me.

Copyright 2021 Rebecca Gordon

Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.

How 4 years of Trump brought us closer to doomsday

If you live in California, you're likely to be consumed on occasion by thoughts of fire. That's not surprising, given that, last year alone, actual fires consumed over four and a quarter million acres of the state, taking with them 10,488 structures, 33 human lives, and who knows how many animals. By the end of this January, a month never before even considered part of the "fire" season, 10 wildfires had already burned through 2,337 more acres, according to the California Department of Forestry and Fire Protection (CalFire).

With each passing year, the state's fire season arrives earlier and does greater damage. In 2013, a mere eight years ago, fires consumed about 602,000 acres and started significantly later. That January, CalFire reported only a single fire, just two in February, and none in March. Fire season didn't really begin until April and had tapered off before year's end. This past December, however, 10 fires were still burning, consuming at least 10,000 acres. In fact, it almost doesn't make sense to talk about a fire "season" anymore. Whatever the month, wildfires are likely to be burning somewhere in the state.

Clearly, California's fires (along with Oregon's and Washington's) are getting worse. Just as clearly, notwithstanding Donald Trump's exhortations to do a better job of "raking" our forests, climate change is the main cause of this growing disaster.

Fortunately, President Joe Biden seems to take the climate emergency seriously. In just his first two weeks in office, he's canceled the Keystone XL pipeline project, forbidden new drilling for oil or gas on public lands, and announced a plan to convert the entire federal fleet of cars and trucks to electric vehicles. Perhaps most important of all, he's bringing the U.S. back into the Paris climate accords, signaling an understanding that a planetary crisis demands planetwide measures and that the largest carbon-emitting economies should be leading the way. "This isn't [the] time for small measures," Biden has said. "We need to be bold."

Let's just hope that such boldness has arrived in time and that the Biden administration proves unwilling to sacrifice the planet on an altar of elusive congressional unity and illusionary bipartisanship.

Another Kind of Fire

If climate change threatens human life as we know it, so does another potential form of "fire" — the awesome power created when a nuclear reaction converts matter to energy. This is the magic of Einstein's observation that E = mc2, or that the energy contained in a bit of matter is equal to its mass (roughly speaking, its weight) multiplied by the square of the speed of light expressed in meters per second. As we've all known since August 6, 1945, when an atomic bomb was dropped on the Japanese city of Hiroshima, that's an awful lot of energy. When a nuclear reaction is successfully controlled, the energy can be regulated and used to produce electricity without emitting carbon dioxide in the process.
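To get a rough sense of scale, here is a back-of-the-envelope calculation (the figures are rounded approximations, not the author's):

```latex
% Energy released if a single gram of matter were converted entirely:
E = mc^{2} = (10^{-3}\,\mathrm{kg}) \times (3 \times 10^{8}\,\mathrm{m/s})^{2}
           = 9 \times 10^{13}\,\mathrm{J}
% One kiloton of TNT releases about 4.2 \times 10^{12} J, so a single
% gram of converted matter corresponds to roughly 21 kilotons of TNT --
% more than the Hiroshima bomb, which converted well under a gram of
% its uranium into energy.
```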

Unfortunately, while nuclear power plants don't add greenhouse gases to the atmosphere, they do create radioactive waste, some of which remains deadly for thousands of years. Industry advocates who argue for nuclear power as a "green" alternative generally ignore the problem — which has yet to be solved — of disposing of that waste.

In what hopefully is just a holdover from the Trump administration, the Energy Department website still "addresses" this issue by suggesting that all the nuclear waste produced to date "could fit on a football field at a depth of less than 10 yards!" The site neglects to add that, if you shoved that roughly 1.7 million cubic feet of nuclear waste together the wrong way, the resultant chain reaction would be catastrophic.

Remember, too, that "controlled" nuclear reactions don't always remain under human control. Ask anyone who lived near the Three Mile Island nuclear reactor in Pennsylvania, the Chernobyl nuclear power plant in Ukraine, or the Fukushima Daiichi nuclear power plant in Japan.

There is, however, another far more devastating form of "controlled" nuclear reaction, the kind created when a nuclear bomb explodes. Only one country has ever deployed atomic weapons in war, of course: the United States, in its attack on Hiroshima and, three days later, on Nagasaki. The Hiroshima bomb was of the uranium variety, the Nagasaki one plutonium-based, and both were puny by the standards of today's nuclear weapons. Still, the horror of those attacks was sufficient to convince many that such weapons should never be used again.

Treaties and Entreaties

In the decades since 1945, various configurations of nations have agreed to treaties prohibiting the use of, or limiting the proliferation of, nuclear weapons — even as the weaponry spread and nuclear arsenals grew. In the Cold War decades, the most significant of these were the bilateral pacts between the two superpowers of the era, the U.S. and the Soviet Union. When the latter collapsed in 1991, Washington signed treaties instead with the Russian Federation government, the most recent being the New START treaty, which came into effect in 2011 and was just extended by Joe Biden and Vladimir Putin.

In addition to such bilateral agreements, the majority of nations on the planet agreed on various multilateral pacts, including the Nuclear Non-Proliferation Treaty, or NPT, which has been signed by 191 countries and has provided a fairly effective mechanism for limiting the spread of such arms. Today, there are still "only" nine nuclear-armed states. Of these, just five — China, France, Russia, the United Kingdom, and the United States — are parties to the treaty and admit to possessing such weaponry. Israel, which never signed the pact, has likewise never publicly acknowledged its growing nuclear arsenal. Two other nuclear-armed countries — India and Pakistan — have never signed the treaty at all, while North Korea withdrew from it in 2003. Worse yet, in 2005, the George W. Bush administration inked a side-deal with India that gave Washington's blessing to the acceleration of that country's nuclear weapons development program outside the monitoring constraints of the NPT.

The treaty assigns to the International Atomic Energy Agency (IAEA) the authority to monitor compliance. It was this treaty, for example, that gave the IAEA the right to inspect Iraq's nuclear program in the period before the U.S. invaded in 2003. Indeed, the IAEA repeatedly reported that Iraq was, in fact, in compliance with the treaty in the months that preceded the invasion, despite the claims of the Bush administration that Iraqi ruler Saddam Hussein had such weaponry. The United States must act, President Bush insisted then, before the "smoking gun" of proof the world demanded turned out to be a "mushroom cloud" over some American city. As became clear after the first few months of the disastrous U.S. military occupation, there simply were no weapons of mass destruction in Iraq. (At least partly in recognition of the IAEA's attempts to forestall that U.S. invasion, the agency and its director general, Mohamed ElBaradei, would receive the 2005 Nobel Peace Prize.)

Like Iraq, Iran also signed the NPT in 1968, laying the foundation for ongoing IAEA inspections there. In recent years, having devastated Iraq's social, economic, and political infrastructure, the United States shifted its concern about nuclear proliferation to Iran. In 2015, along with China, Russia, France, the United Kingdom, Germany, and the European Union, the Obama administration signed the Joint Comprehensive Plan of Action (JCPOA), informally known as the Iran nuclear deal.

Under the JCPOA, in return for the lifting of onerous economic sanctions that were affecting the whole population, Iran agreed to limit the development of its nuclear capacity to the level needed to produce electricity. Again, IAEA scientists would be responsible for monitoring the country's compliance, which by all accounts was more than satisfactory — at least until 2018. That's when President Donald Trump unilaterally pulled the U.S. out of the agreement and reimposed heavy sanctions. Since then, as its economy began to be crushed, Iran was, understandably enough, reluctant to uphold its end of the bargain.

In the years since 1945, the world has seen treaties signed to limit or ban the testing of nuclear weapons or to cap the size of nuclear arsenals, as well as bilateral treaties to decommission parts of existing ones, but never a treaty aimed at outlawing nuclear weapons altogether. Until now. On January 22, 2021, the United Nations Treaty on the Prohibition of Nuclear Weapons took effect. Signed so far by 86 countries, the treaty represents "a legally binding instrument to prohibit nuclear weapons, leading towards their total elimination," according to the U.N. Sadly, but unsurprisingly, none of the nine nuclear powers are signatories.

"Fire and Fury"

I last wrote about nuclear danger in October 2017 when Donald Trump had been in the White House less than a year and, along with much of the world, I was worried that he might bungle his way into a war with North Korea. Back then, he and Kim Jong-un had yet to fall in love or to suffer their later public breakup. Kim was still "Little Rocket Man" to Trump, who had threatened that North Korea would be met with "fire and fury like the world has never seen."

The world did, in the end, survive four years of a Trump presidency without a nuclear war, but that doesn't mean he left us any safer. On the contrary, he took a whole series of rash steps leading us closer to nuclear disaster:

  • He pulled the U.S. out of the JCPOA, thereby destabilizing the Iran nuclear agreement and reigniting Iran's threats to someday develop nuclear weapons (along with its apparent efforts in that direction).
  • He withdrew from the 1987 Intermediate-Range Nuclear Forces (INF) Treaty between the U.S. and the Soviet Union (later the Russian Federation), which, according to the nonpartisan Arms Control Association,
"required the United States and the Soviet Union to eliminate and permanently forswear all of their nuclear and conventional ground-launched ballistic and cruise missiles with ranges of 500 to 5,500 kilometers. The treaty marked the first time the superpowers had agreed to reduce their nuclear arsenals, eliminate an entire category of nuclear weapons, and employ extensive on-site inspections for verification."
  • He withdrew from the Open Skies Treaty, which gave signatories permission to fly over each other's territories to identify military installations and activities. Allowing this kind of access was meant to contribute to greater trust among nuclear-armed nations.
  • He threatened to allow the New START Treaty to expire, should he be reelected.
  • He presided over a huge increase in spending on the "modernization" of the U.S. nuclear arsenal, including on new submarine- and land-based launching capabilities. A number of these programs are still in their initial stages and could be stopped by the Biden administration.

In January 2021, after four years of Trump, the Bulletin of the Atomic Scientists kept the minute hand of its "Doomsday Clock" at a mere 100 seconds to midnight, the closest it has ever stood. Since 1947, that Clock's annual resetting has reflected how close, in the view of the Bulletin's esteemed scientists and Nobel laureates, humanity has come to ending it all. As the Bulletin's editors note, "The Clock has become a universally recognized indicator of the world's vulnerability to catastrophe from nuclear weapons, climate change, and disruptive technologies in other domains."

Why so close to midnight? The magazine lists a number of reasons, including the increased danger of nuclear war, due in large part to steps taken by the United States in the Trump years, as well as to the development of "hypersonic" missiles, which are supposed to fly at least five times the speed of sound and so evade existing detection systems. (Trump famously referred to these "super-duper" weapons as "hydrosonic," a term that actually describes a kind of toothbrush.) There is disagreement among weapons experts about the extent to which such delivery vehicles will live up to the (hyper) hype about them, but the effort to build them is destabilizing in its own right.

The Bulletin points to a number of other factors that place humanity in ever greater danger. One is, of course, the existential threat of climate change. Another is the widespread dissemination of "false and misleading information." The spread of lies about Covid-19, its editors say, exemplifies the life-threatening nature of a growing "wanton disregard for science and the large-scale embrace of conspiratorial nonsense." This is, they note, "often driven by political figures and partisan media." Such attacks on knowledge itself have "undermined the ability of responsible national and global leaders to protect the security of their citizens."

Passing the (Nuclear) Ball

When Donald Trump announced that he wouldn't attend the inauguration of Joe Biden and Kamala Harris, few people were surprised. After all, he was still insisting that he'd actually won the election, even after that big lie fueled an insurrectionary invasion of the Capitol. But there was another reason for concern: if Trump was going to be at Mar-a-Lago, how would he hand over the "nuclear football" to the new president? That "football" is, in fact, a briefcase containing the nuclear launch codes, which presidents always have with them. Since the dawn of the nuclear age, it's been passed from the outgoing president to the new one on Inauguration Day.

Consternation! The problem was resolved through the use of two briefcases, which were simultaneously deactivated and activated at 11:59:59 a.m. on January 20th, just as Biden was about to be sworn in.

The football conundrum pointed to a far more serious problem, however — that the fate of humanity regularly hangs on the actions of a single individual (whether as unbalanced as Donald Trump or as apparently sensible as Joe Biden) who has the power to begin a war that could end our species.

There's good reason to think that Joe Biden will be more reasonable about the dangers of nuclear warfare than the narcissistic idiot he succeeds. In addition to agreeing to extend the New START treaty, he's also indicated a willingness to rejoin the Iran nuclear deal and criticized Trump's nuclear buildup. Nevertheless, the power to end the world shouldn't lie with one individual. Congress could address this problem, by (as I suggested in 2017) enacting "a law that would require a unanimous decision by a specified group of people (for example, officials like the secretaries of state and defense together with the congressional leadership) for a nuclear first strike."

The Fire Next Time?

"God gave Noah the rainbow sign
No more water but the fire next time"

These words come from the African-American spiritual "I Got a Home in that Rock." The verse refers to God's promise to Noah in Genesis, after the great flood, never again to destroy all life on earth, a promise signified by the rainbow.

Those who composed the hymn may have been a bit less trusting of God — or of human destiny — than the authors of Genesis, since the Bible account says nothing about fire or a next time. Sadly, recent human history suggests that there could indeed be a next time. If we do succeed in destroying ourselves, it seems increasingly likely that it will be by fire, whether the accelerating heating of the globe over decades, or a nuclear conflagration any time we choose. The good news, the flame of hope, is that we still have time — at least 100 seconds — to prevent it.

Copyright 2021 Rebecca Gordon

Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Books, John Feffer's new dystopian novel Frostlands (the second in the Splinterlands series), Beverly Gologorsky's novel Every Body Has a Story, and Tom Engelhardt's A Nation Unmade by War, as well as Alfred McCoy's In the Shadows of the American Century: The Rise and Decline of U.S. Global Power and John Dower's The Violent American Century: War and Terror Since World War II.

Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.

The signs that our American empire is crumbling

How can you tell when your empire is crumbling? Some signs are actually visible from my own front window here in San Francisco.

Directly across the street, I can see a collection of tarps and poles (along with one of my own garbage cans) that were used to construct a makeshift home on the sidewalk. Beside that edifice stands a wooden cross decorated with a string of white Christmas lights and a red ribbon — a memorial to the woman who built that structure and died inside it earlier this week. We don't know — and probably never will — what killed her: the pandemic raging across California? A heart attack? An overdose of heroin or fentanyl?

Behind her home and similar ones is a chain-link fence surrounding the empty playground of the Horace Mann/Buena Vista elementary and middle school. Like that home, the school, too, is now empty, closed because of the pandemic. I don't know where the families of the 20 children who attended that school and lived in one of its gyms as an alternative to the streets have gone. They used to eat breakfast and dinner there every day, served on the same sidewalk by a pair of older Latina women who apparently had a contract from the school district to cook for the families using that school-cum-shelter. I don't know, either, what any of them are now doing for money or food.

Just down the block, I can see the line of people that has formed every weekday since early December. Masked and socially distanced, they wait patiently to cross the street, one at a time, for a Covid test at a center run by the San Francisco Department of Health. My little street seems an odd choice for such a service, since — especially now that the school has closed — it gets little foot traffic. Indeed, a representative of the Latino Task Force, an organization created to inform the city's Latinx population about Covid resources, told our neighborhood paper, Mission Local, that

"Small public health clinics such as this one 'will say they want to do more outreach, but I actually think they don't want to.' He believes they chose a low-trafficked street like Bartlett to stay under the radar. 'They don't want to blow the spot up, because it does not have a large capacity.'"

What do any of these very local sights have to do with a crumbling empire? They're signs that some of the same factors that fractured the Roman empire back in 476 CE (and others since) are distinctly present in this country today — even in California, one of its richest states. I'm talking about phenomena like gross economic inequality; over-spending on military expansion; political corruption; deep cultural and political fissures; and, oh yes, the barbarians at the gates. I'll turn to those factors in a moment, but first let me offer a brief defense of the very suggestion that U.S. imperialism and an American empire actually exist.

Imperialism? What's That Supposed to Mean?

What better source for a definition of imperialism than the Encyclopedia Britannica, that compendium of knowledge first printed in 1768 in the country that became the great empire of the nineteenth and first part of the twentieth centuries? According to the Encyclopedia, "imperialism" denotes "state policy, practice, or advocacy of extending power and dominion, especially by direct territorial acquisition or by gaining political and economic control of other areas." Furthermore, imperialism "always involves the use of power, whether military or economic or some subtler form." In other words, the word indicates a country's attempts to control and reap economic benefit from lands outside its borders.

In that context, "imperialism" is an accurate description of the trajectory of U.S. history, starting with the country's expansion across North America, stealing territory and resources from Indian nations and decimating their populations. The newly independent United States would quickly expand, beginning with the 1803 Louisiana Purchase from France. That deal, which effectively doubled its territory, included most of what would become the state of Louisiana, together with some or all of the present-day states of New Mexico, Texas, Arkansas, Missouri, Oklahoma, Kansas, Colorado, Iowa, Nebraska, Wyoming, Minnesota, North and South Dakota, Montana, and even small parts of what are today the Canadian provinces of Alberta and Saskatchewan.

Eventually, such expansionism escaped even those continental borders, as the country went on to gobble up the Philippines, Hawaii, the Panama Canal Zone, the Virgin Islands, Puerto Rico, Guam, American Samoa, and the Mariana Islands, the last five of which remain U.S. territories to this day. (Inhabitants of the nation's capital, where I grew up, were only partly right when we used to refer to Washington, D.C., as "the last colony.")

Of course, France didn't actually control most of that land, apart from the port city of New Orleans and its immediate environs. What Washington bought was the "right" to take the rest of that vast area from the native peoples who lived there, whether by treaty, population transfers, or wars of conquest and extermination. The first objective of that deal was to settle land on which to expand the already hugely lucrative cotton business, that economic engine of early American history fueled, of course, by slave labor. It then supplied raw materials to the rapidly industrializing textile industry of England, which drove that country's own imperial expansion.

U.S. territorial expansion continued as, in 1819, Florida was acquired from Spain and, in 1845, Texas was forcibly annexed from Mexico (as well as various parts of California a year later). All of those acquisitions accorded with what newspaper editor John O'Sullivan would soon call the country's manifest — that is, clear and obvious — destiny to control the entire continent.

American Doctrines from Monroe to Truman to (G.W.) Bush

U.S. economic, military, and political influence has long extended far beyond those internationally recognized possessions and various presidents have enunciated a series of "doctrines" to legitimate such an imperial reach.

Monroe: The first of these was the Monroe Doctrine, introduced in 1823 in President James Monroe's penultimate State of the Union address. He warned the nations of Europe that, while the United States recognized existing colonial possessions in the Americas, it would not permit the establishment of any new ones.

President Teddy Roosevelt would later add a corollary to Monroe's doctrine by establishing Washington's right to intercede in any country in the Americas that, in the view of its leaders, was not being properly run. "Chronic wrongdoing," he said in a 1904 message to Congress, "may in America, as elsewhere, ultimately require intervention by some civilized nation." The United States, he suggested, might find itself forced, "however reluctantly, in flagrant cases of such wrongdoing or impotence, to the exercise of an international police power." In the first quarter of the twentieth century, that Roosevelt Corollary would be used to justify U.S. occupations of Cuba, the Dominican Republic, Haiti, and Nicaragua.

Truman: Teddy's cousin, President Franklin D. Roosevelt, publicly renounced the Monroe Doctrine and promised a hands-off attitude towards Latin America, which came to be known as the Good Neighbor Policy. It didn't last long, however. In a 1947 address to Congress, the next president, Harry S. Truman, laid out what came to be known as the Truman Doctrine, which would underlie the country's foreign policy at least until the collapse of the Soviet Union in 1991. It held that U.S. national security interests required the "containment" of existing Communist states and the prevention of the further spread of Communism anywhere on Earth.

It almost immediately led to interventions in the internal struggles of Greece and Turkey and would eventually underpin Washington's support for dictators and repressive regimes from El Salvador to Indonesia. It would justify U.S.-backed coups in places like Iran, Guatemala, and Chile. It would lead this country into a futile war in Korea and a disastrous defeat in Vietnam.

That post-World War II turn to anticommunism would be accompanied by a new kind of colonialism. Rather than directly annexing territories to extract cheap labor and cheaper natural resources, under this new "neocolonial" model, the United States — and soon the great multilateral institutions of the post-war era, the World Bank and the International Monetary Fund — would gain control over the economies of poor nations. In return for aid — or loans often pocketed by local elites and repaid by the poor — those nations would accede to demands for the "structural adjustment" of their economic systems: the privatization of public services like water and utilities (usually into the hands of American or multinational corporations) and the defunding of human services like health and education. Such "adjustments," in turn, allowed the recipients to service the loans, extracting scarce hard currency from already deeply impoverished nations.

Bush: You might have thought that the fall of the Soviet empire and the end of the Cold War would have provided Washington with an opportunity to step away from resource extraction and the seemingly endless military and CIA interventions that accompanied it. You might have imagined that the country then being referred to as the "last superpower" would finally consider establishing new and different relationships with the other countries on this little planet of ours. However, just in time to prevent even the faint possibility of any such conversion came the terrorist attacks of 9/11, which gave President George W. Bush the chance to promote his very own doctrine.

In a break from postwar multilateralism, the Bush Doctrine outlined the neoconservative belief that, as the only superpower in a now supposedly "unipolar" world, the United States had the right to take unilateral military action any time it believed it faced external threat of any imaginable sort. The result: almost 20 years of disastrous "forever wars" and a military-industrial complex deeply embedded in our national economy. Although Donald Trump's foreign policy occasionally feinted in the direction of isolationism in its rejection of international treaties, protocols, and organizational responsibilities, it still proved itself a direct descendant of the Bush Doctrine. After all, it was Bush who first took the United States out of the Anti-Ballistic Missile Treaty and rejected the Kyoto Protocol to fight climate change.

His doctrine instantly set the stage for the disastrous invasion and occupation of Afghanistan, the even more disastrous Iraq War, and the present-day over-expansion of the U.S. military presence, overt and covert, in practically every corner of the world. And now, to fulfill Donald Trump's Star Trek fantasies, even in outer space.

An Empire in Decay

If you need proof that the last superpower, our very own empire, is indeed crumbling, consider the year we've just lived through, not to mention the first few weeks of 2021. I mentioned above some of the factors that contributed to the collapse of the famed Roman empire in the fifth century. It's fair to say that some of those same things are now evident in twenty-first-century America. Here are four obvious candidates:

Grotesque Economic Inequality: Ever since President Ronald Reagan began the Republican Party's long war on unions and working people, economic inequality has steadily increased in this country, punctuated by terrible shocks like the Great Recession of 2007-2008 and, of course, by the Covid-19 disaster. We've seen 40 years of tax reductions for the wealthy, stagnant wages for the rest of us (including a federal minimum wage that hasn't changed since 2009), and attacks on programs like TANF (welfare) and SNAP (food stamps) that literally keep poor people alive.

The Romans relied on slave labor for basics like food and clothing. This country relies on super-exploited farm and food-factory workers, many of whom are unlikely to demand more or better because they came here without authorization. Our (extraordinarily cheap) clothes are mostly produced by exploited people in other countries.

The pandemic has only exposed what so many people already knew: that the lives of the millions of working poor in this country are growing ever more precarious and desperate. The gulf between rich and poor widens by the day to unprecedented levels. Indeed, as millions have descended into poverty since the pandemic began, the Guardian reports that this country's 651 billionaires have increased their collective wealth by $1.1 trillion. That's more than the $900 billion Congress appropriated for pandemic aid in the omnibus spending bill it passed at the end of December 2020.

An economy like ours, which depends so heavily on consumer spending, cannot survive the deep impoverishment of so many people. Those 651 billionaires are not going to buy enough toys to dig us out of this hole.

Wild Overspending on the Military: At the end of 2020, Congress overrode Trump's veto of the annual National Defense Authorization Act, which provided a stunning $741 billion to the military this fiscal year. (That veto, by the way, wasn't in response to the vast sums being appropriated in the midst of a devastating pandemic, but to the bill's provisions for renaming military bases currently honoring Confederate generals, among other extraneous things.) A week later, Congress passed that omnibus pandemic spending bill and it contained an additional $696 billion for the Defense Department.

All that money for "security" might be justified, if it actually made our lives more secure. In fact, our federal priorities virtually take food out of the mouths of children to feed the maw of the military-industrial complex and the never-ending wars that go with it. Even before the pandemic, more than 10% of U.S. families regularly experienced food insecurity. Now, it's a quarter of the population.

Corruption So Deep It Undermines the Political System: Suffice it to say that the man who came to Washington promising to "drain the swamp" has presided over one of the most corrupt administrations in U.S. history. Whether it's been blatant self-dealing (like funneling government money to his own businesses); employing government resources to forward his reelection (including using the White House as a staging ground for parts of the Republican National Convention and his acceptance speech); tolerating corrupt subordinates like Secretary of Commerce Wilbur Ross; or contemplating a self-pardon, the Trump administration has set the bar high indeed for any future aspirants to the title of "most corrupt president."

One problem with such corruption is that it undermines the legitimacy of government in the minds of the governed. It makes citizens less willing to obey laws, pay taxes, or act for the common good by, for example, wearing masks and socially distancing during a pandemic. It rips apart social cohesion from top to bottom.

Of course, Trump's most dangerous corrupt behavior — one in which he's been joined by the most prominent elected and appointed members of his government and much of his party — has been his campaign to reject the results of the 2020 general election. The concerted and cynical promotion of the big lie that the Democrats stole that election has so corrupted faith in the legitimacy of government that up to 68% of Republicans now believe the vote was rigged to elect Joe Biden. At "best," Trump has set the stage for increased Republican suppression of the vote in communities of color. At worst, he has so poisoned the electoral process that a substantial minority of Americans will never again accept as free and fair an election in which their candidate loses.

A Country in Ever-Deepening Conflict: White supremacy has infected the entire history of this country, beginning with the near-extermination of its native peoples. The Constitution, while guaranteeing many rights to white men, proceeded to codify the enslavement of Africans and their descendants. In order to maintain that enslavement, the southern states seceded and fought a civil war. After a short-lived period of Reconstruction in which Black men were briefly enfranchised, white supremacy regained direct legal control in the South, and flourished in a de facto fashion in the rest of the country.

In 1858, two years before that civil war began, Abraham Lincoln addressed the Illinois Republican State Convention, reminding those present that

"'A house divided against itself cannot stand.' I believe this government cannot endure, permanently half slave and half free. I do not expect the Union to be dissolved — I do not expect the house to fall — but I do expect it will cease to be divided. It will become all one thing, or all the other."

More than 160 years later, the United States clearly not only remains but has become ever more divided. If you doubt that the Civil War is still being fought today, look no farther than the Confederate battle flags proudly displayed by members of the insurrectionary mob that overran the Capitol on January 6th.

Oh, and the barbarians? They are not just at the gate; they have literally breached it, as we saw in Washington when they burst through the doors and windows of the center of government.

Building a Country From the Rubble of Empire

Human beings have long built new habitations quite literally from the rubble — the fallen stones and timbers — of earlier ones. Perhaps it's time to think about what kind of a country this place — so rich in natural resources and human resourcefulness — might become if we were to take the stones and timbers of empire and construct a nation dedicated to the genuine security of all its people. Suppose we really chose, in the words of the preamble to the Constitution, "to promote the general welfare, and to secure the blessings of liberty to ourselves and our posterity."

Suppose we found a way to convert the desperate hunger for ever more, which is both the fuel of empires and the engine of their eventual destruction, into a new contentment with "enough"? What would a United States whose people have enough look like? It would not be one in which tiny numbers of the staggeringly wealthy made hundreds of billions more dollars and the country's military-industrial complex thrived in a pandemic, while so many others went down in disaster.

This empire will fall sooner or later. They all do. So, this crisis, just at the start of the Biden and Harris years, is a fine time to begin thinking about what might be built in its place. What would any of us like to see from our front windows next year?

Copyright 2021 Rebecca Gordon

Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Books, John Feffer's new dystopian novel Frostlands (the second in the Splinterlands series), Beverly Gologorsky's novel Every Body Has a Story, and Tom Engelhardt's A Nation Unmade by War, as well as Alfred McCoy's In the Shadows of the American Century: The Rise and Decline of U.S. Global Power and John Dower's The Violent American Century: War and Terror Since World War II.

Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.

Trump's broken promise shows how the American empire is rotting from within

It was the end of October 2001. Two friends, Max Elbaum and Bob Wing, had just dropped by. (Yes, children, believe it or not, people used to drop in on each other, maskless, once upon a time.) They had come to hang out with my partner Jan Adams and me. Among other things, Max wanted to get some instructions from fellow-runner Jan about taping his foot to ease the pain of plantar fasciitis. But it soon became clear that he and Bob had a bigger agenda for the evening. They were eager to recruit us for a new project.

And so began War Times/Tiempo de Guerras, a free, bilingual, antiwar tabloid that, at its height, distributed 100,000 copies every six weeks to more than 700 antiwar organizations around the country. It was already clear to the four of us that night -- as it was to millions around the world -- that the terrorist attacks of September 11th would provide the pretext for a major new projection of U.S. military power globally, opening the way to a new era of "all-war-all-the-time." War Times was a project of its moment (although the name would still be apt today, given that those wars have never ended). It would be superseded in a few years by the explosive growth of the Internet and the 24-hour news cycle. Still, it represented an early effort to fill the space where a peace movement would eventually develop.

All-War-All-the-Time -- For Some of Us

We were certainly right that the United States had entered a period of all-war-all-the-time. It's probably hard for people born since 9/11 to imagine how much -- and how little -- things changed after September 2001. By the end of that month, this country had already launched a "war" on an enemy that then-Secretary of Defense Donald Rumsfeld told us was "not just in Afghanistan," but in "50 or 60 countries, and it simply has to be liquidated."

Five years and two never-ending wars later, he characterized what was then called the war on terror as "a generational conflict akin to the Cold War, the kind of struggle that might last decades as allies work to root out terrorists across the globe and battle extremists who want to rule the world." A generation later, it looks like Rumsfeld was right, if not about the desires of the global enemy, then about the duration of the struggle.

Here in the United States, however, we quickly got used to being "at war." In the first few months, interstate bus and train travelers often encountered (and, in airports, still encounter) a new and absurd kind of "security theater." I'm referring to those long, snaking lines in which people first learned to remove their belts and coats, later their hats and shoes, as ever newer articles of clothing were recognized as potential hiding places for explosives. Fortunately, the arrest of the Underwear Bomber never led the Transportation Security Administration to the obvious conclusion about the clothing travelers should have to remove next. We got used to putting our three-ounce containers of liquids (No more!) into quart-sized baggies (No bigger! No smaller!).

It was all-war-all-the-time, but mainly in those airports. Once the shooting wars started dragging on, if you didn't travel by airplane much or weren't deployed to Afghanistan or Iraq, it was hard to remember that we were still in war time at all. There were continuing clues for those who wanted to know, like the revelations of CIA torture practices at "black sites" around the world, the horrors of military prisons like the ones at Bagram Air Force Base in Afghanistan, Abu Ghraib in Baghdad, and the still-functioning prison complex at Guantánamo Bay, Cuba. And soon enough, of course, there were the hundreds and then thousands of veterans of the Iraq and Afghan wars taking their places among the unhoused veterans of earlier wars in cities across the United States, almost unremarked upon, except by service organizations.

So, yes, the wars dragged on at great expense, but with little apparent effect in this country. They even gained new names like "the long war" (as Donald Trump's Secretary of Defense James Mattis put it in 2017) or the "forever wars," a phrase now so common that it appears all over the place. But apart from devouring at least $6.4 trillion through September 2020 that might otherwise have been invested domestically in healthcare, education, infrastructure, or addressing poverty and inequality, apart from creating increasingly militarized domestic police forces armed ever more lethally by the Pentagon, those forever wars had little obvious effect on the lives of most Americans.

Of course, if you happened to live in one of the places where this country has been fighting for the last 19 years, things are a little different. A conservative estimate by Iraq Body Count puts violent deaths among civilians in that country alone at 185,454 to 208,493 and Brown University's Costs of War project points out that even the larger figure is bound to be a significant undercount:

Several times as many Iraqi civilians may have died as an indirect result of the war, due to damage to the systems that provide food, health care, and clean drinking water, and as a result, illness, infectious diseases, and malnutrition that could otherwise have been avoided or treated.

And that's just Iraq. Again, according to the Costs of War Project, "At least 800,000 people have been killed by direct war violence in Iraq, Afghanistan, Syria, Yemen, and Pakistan."

Of course, many more people than that have been injured or disabled. And America's post-9/11 wars have driven an estimated 37 million people from their homes, creating the greatest human displacement since World War II. People in this country are rightly concerned about the negative effects of online schooling on American children amid the ongoing Covid-19 crisis (especially poor children and those in communities of color). Imagine, then, the effects on a child's education of losing her home and her country, as well as one or both parents, and then growing up constantly on the move or in an overcrowded, under-resourced refugee camp. The war on terror has truly become a war of generations.

Every one of the 2,977 lives lost on 9/11 was unique and invaluable. But the U.S. response has been grotesquely disproportionate -- and worse than we War Times founders could have imagined that October night so many years ago.

Those wars of ours have gone on for almost two decades now. Each new metastasis has been justified by George W. Bush's and then Barack Obama's use of the now ancient 2001 Authorization for the Use of Military Force (AUMF), which Congress passed in the days after 9/11. Its language actually limited presidential military action to a direct response to the 9/11 attacks and the prevention of future attacks by the same actors. It stated that the president is authorized "to use all necessary and appropriate force against those nations, organizations, or persons he determines planned, authorized, committed, or aided the terrorist attacks that occurred on September 11, 2001, or harbored such organizations or persons, in order to prevent any future acts of international terrorism against the United States by such nations, organizations or persons."

Despite that AUMF's limited scope, successive presidents have used it to justify military action in at least 18 countries. (To be fair, President Obama realized the absurdity of his situation when he sent U.S. troops to Syria and tried to wring a new authorization out of Congress, only to be stymied by a Republican majority that wouldn't play along.)

In 2002, in the run-up to the Iraq War, Congress passed a second AUMF, which permitted the president to use the armed forces as "necessary and appropriate" to "defend U.S. national security against the continuing threat posed by Iraq." In January 2020, Donald Trump used that second authorization to justify the murder by drone of Qasem Soleimani, an Iranian general, along with nine other people.

Trump Steps In

In 2016, peace activists were preparing to confront a Hillary Clinton administration that we expected would continue Obama's version of the forever wars -- the "surge" in Afghanistan, the drone assassination campaigns, the special ops in Africa. But on Tuesday, November 8, 2016, something went "Trump" in the night and Donald J. Trump took over the presidency with a promise to end this country's forever wars, which he had criticized relentlessly during his campaign. That, of course, didn't mean we should have expected a peace dividend anytime soon. He was also committed to rebuilding a supposedly "depleted" U.S. military. As he said at a 2019 press conference,

When I took over, it was a mess... One of our generals came in to see me and he said, 'Sir, we don't have ammunition.' I said, 'That's a terrible thing you just said.' He said, 'We don't have ammunition.' Now we have more ammunition than we've ever had.

It's highly unlikely that the military couldn't afford to buy enough bullets when Trump entered the Oval Office, given that publicly acknowledged defense funding was then running at $580 billion a year. He did, however, manage to push that figure to $713 billion by fiscal year 2020. That December, he threatened to veto an even larger appropriation for 2021 -- $740 billion -- but only because he wanted the military to continue to honor Confederate generals by keeping their names on military bases. Oh, and because he thought the bill should also change liability rules for social media companies, an issue you don't normally expect to see addressed in a defense appropriations bill. And, in any case, Congress passed the bill with a veto-proof majority.

As Pentagon expert Michael Klare pointed out recently, while it might seem contradictory that Trump would both want to end the forever wars and to increase military spending, his actions actually made a certain sense. The president, suggested Klare, had been persuaded to support the part of the U.S. military command that has favored a sharp pivot away from reigning post-9/11 Pentagon practices. For 19 years, the military high command had hewed fairly closely to the strategy laid out by Secretary of Defense Donald Rumsfeld early in the Bush years: maintaining the capacity to fight ground wars against one or two regional powers (think of that "Axis of Evil" of Iraq, North Korea, and Iran), while deploying agile, technologically advanced forces in low-intensity (and a couple of higher-intensity) counterterrorism conflicts. Nineteen years later, whatever its objectives may have been -- a more-stable Middle East? Fewer and weaker terrorist organizations? -- it's clear that the Rumsfeld-Bush strategy has failed spectacularly.

Klare points out that, after almost two decades without a victory, the Pentagon has largely decided to demote international terrorism from rampaging monster to annoying mosquito cloud. Instead, the U.S. must now prepare to confront the rise of China and Russia, even if China has only one overseas military base and Russia, economically speaking, is a rickety petro-state with imperial aspirations. In other words, the U.S. must prepare to fight short but devastating wars in multiple domains (including space and cyberspace), perhaps even involving the use of tactical nuclear weapons on the Eurasian continent. To this end, the country has indeed begun a major renovation of its nuclear arsenal and announced a new 30-year plan to beef up its naval capacity. And President Trump rarely misses a chance to tout "his" creation of a new Space Force.

Meanwhile, did he actually keep his promise and at least end those forever wars? Not really. He did promise to bring all U.S. troops home from Afghanistan by Christmas, but acting Defense Secretary Christopher Miller only recently said that we'd be leaving about 2,500 troops there and a similar number in Iraq, with the hope that they'd all be out by May 2021. (In other words, he dumped those wars in the lap of the future Biden administration.)

In the meantime in these years of "ending" those wars, the Trump administration actually loosened the rules of engagement for air strikes in Afghanistan, leading to a "massive increase in civilian casualties," according to a new report from the Costs of War Project. "From the last year of the Obama administration to the last full year of recorded data during the Trump administration," writes its author, Neta Crawford, "the number of civilians killed by U.S.-led airstrikes in Afghanistan increased by 330 percent."

In spite of his isolationist "America First" rhetoric, in other words, President Trump has presided over an enormous buildup of an institution, the military-industrial complex, that was hardly in need of major new investment. And in spite of his anti-NATO rhetoric, his reduction by almost a third of U.S. troop strength in Germany, and all the rest, he never really violated the post-World War II foreign policy pact between the Republican and Democratic parties. Regardless of how they might disagree about dividing the wealth domestically, they remain united in their commitment to using diplomacy when possible, but military force when necessary, to maintain and expand the imperial power that they believe to be the guarantor of that wealth.

And Now Comes Joe

On January 20, 2021, Joe Biden will become the president of a country that spends as much on its armed forces, by some counts, as the next 10 countries combined. He'll inherit responsibility for a nation with a military presence in 150 countries and special-operations deployments in 22 African nations alone. He'll be left to oversee the still-unfinished, deeply unsuccessful, never-ending war on terror in Iraq, Syria, Afghanistan, Yemen, and Somalia and, as publicly reported by the Department of Defense, 187,000 troops stationed outside the United States.

Nothing in Joe Biden's history suggests that he or any of the people he's already appointed to his national security team have the slightest inclination to destabilize that Democratic-Republican imperial pact. But empires are not sustained by inclination alone. They don't last forever. They overextend themselves. They rot from within.

If you're old enough, you may remember stories about the long lines for food in the crumbling Soviet Union, that other superpower of the Cold War. You can see the same thing in the United States today. Once a week, my partner delivers food boxes to hungry people in our city, those who have lost their jobs and homes, because the pandemic has only exacerbated this country's already brutal version of economic inequality. Another friend routinely sees a food line stretching over a mile, as people wait hours for a single free bag of groceries.

Perhaps the horrors of 2020 -- the fires and hurricanes, Trump's vicious attacks on democracy, the death, sickness, and economic dislocation caused by Covid-19 -- can force a real conversation about national security in 2021. Maybe this time we can finally ask whether trying to prop up a dying empire actually makes us -- or indeed the world -- any safer. This is the best chance in a generation to start that conversation. The alternative is to keep trudging mindlessly toward disaster.

Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new Dispatch book on the history of torture in the United States.


Copyright 2020 Rebecca Gordon

Trump's through-the-looking-glass presidency has turned the country inside out

In the chaos of this moment, it seems likely that Joe Biden will just squeeze into the presidency and that he'll certainly win the popular vote, Donald Trump's Mussolini-like behavior and election night false claim of victory notwithstanding. Somehow, it all brings another moment in my life to mind.

Back in October 2016, my friends and I frequently discussed the challenges progressives would face if the candidate we expected to win actually entered the Oval Office. There were so many issues to worry about back then. The Democratic candidate was an enthusiastic booster of the U.S. armed forces and believed in projecting American power through its military presence around the world. Then there was that long record of promoting harsh sentencing laws and the disturbing talk about "the kinds of kids that are called superpredators -- no conscience, no empathy."

In 2016, the country was already riven by deep economic inequality. While Hillary Clinton promised "good-paying jobs" for those struggling to stay housed and buy food, we didn't believe it. We'd heard the same promises so many times before, and yet the federal minimum wage was still stuck where it had been ever since 2009, at $7.25 an hour. Would a Clinton presidency really make a difference for working people? Not if we didn't push her -- and hard.

The candidate we were worried about was never Donald Trump, but Hillary Clinton. And the challenge we expected to confront was how to shove that quintessential centrist a few notches to the left. We were strategizing on how we might organize to get a new administration to shift government spending from foreign wars to human needs at home and around the world. We wondered how people in this country might finally secure the "peace dividend" that had been promised to us in the period just after the Cold War, back when her husband Bill became president. In those first (and, as it turned out, only) Clinton years, what we got instead was so-called welfare reform whose consequences are still being felt today, as layoffs drive millions into poverty.

We doubted Hillary Clinton's commitment to addressing most of our other concerns as well: mass incarceration and police violence, structural racism, economic inequality, and most urgent of all (though some of us were just beginning to realize it), the climate emergency. In fact, nationwide, people like us were preparing to spend a day or two celebrating the election of the first woman president and then get down to work opposing many of her anticipated policies. In the peace and justice movements, in organized labor, in community-based organizations, in the two-year-old Black Lives Matter movement, people were ready to roll.

And then the unthinkable happened. The woman we might have loved to hate lost that election and the white-supremacist, woman-hating monster we would grow to detest entered the Oval Office.

For the last four years, progressives have been fighting largely to hold onto what we managed to gain during Barack Obama's presidency: an imperfect healthcare plan that nonetheless insured millions of Americans for the first time; a signature on the Paris climate accord and another on a six-nation agreement to prevent Iran from pursuing nuclear weapons; expanded environmental protections for public lands; the opportunity for recipients of Deferred Action for Childhood Arrivals -- DACA -- status to keep on working and studying in the U.S.

For those same four years, we've been fighting to hold onto our battered capacity for outrage in the face of continual attacks on simple decency and human dignity. There's no need to recite here the catalogue of horrors Donald Trump and his spineless Republican lackeys visited on this country and the world. Suffice it to say that we've been living like Alice in Through the Looking Glass, running as hard as we can just to stand still. That fantasy world's Red Queen observes to a panting Alice that she must come from

"A slow sort of country! Now, here, you see, it takes all the running you can do to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that!"

It wasn't simply the need to run faster than full speed just in order to stay put that made Trump World so much like Looking-Glass Land. It's that, just as in Lewis Carroll's fictional world, reality has been turned inside out in the United States. As new Covid-19 infections reached an all-time high of more than 100,000 in a single day and the cumulative death toll surpassed 230,000, the president in the mirror kept insisting that "we're rounding the corner" (and a surprising number of Americans seemed to believe him). He neglected to mention that, around that very corner, a coronaviral bus is heading straight toward us, accelerating as it comes. In a year when, as NPR reported, "Nearly 1 in 4 households have experienced food insecurity," Trump just kept bragging about the stock market and reminding Americans of how well their 401k's were doing -- as if most people even had such retirement accounts in the first place.

Trump World, Biden Nation, or Something Better?

After four years of running in place, November 2016 seems like a lifetime ago. The United States of 2020 is a very different place, at once more devastated and more hopeful than at least we were a mere four years ago. On the one hand, pandemic unemployment has hit women, especially women of color, much harder than men, driving millions out of the workforce, many permanently. On the other, we've witnessed the birth of the #MeToo movement against sexual harassment and of the Time's Up Legal Defense Fund, which has provided millions of dollars for working-class women to fight harassment on the job. In a few brief years, physical and psychological attacks on women have ceased to be an accepted norm in the workplace. Harassment certainly continues every day, but the country's collective view of it has shifted.

Black and Latino communities still face daily confrontations with police forces that act more like occupying armies than public servants. The role of the police as enforcers of white supremacy hasn't changed in most parts of the country. Nonetheless, the efforts of the Black Lives Matter movement and of the hundreds of thousands of people who demonstrated this summer in cities nationwide have changed the conversation about the police in ways no one anticipated four years ago. Suddenly, the mainstream media are talking about more than body cams and sensitivity training. In June 2020, the New York Times ran an op-ed entitled, "Yes, We Mean Literally Abolish the Police," by Mariame Kaba, an organizer working against the criminalization of people of color. Such a thing was unthinkable four years ago.

In the Trumpian pandemic moment, gun purchases have soared in a country that already topped the world by far in armed citizens. And yet young people -- often led by young women -- have roused themselves to passionate and organized action to get guns off the streets of Trump Land. After a gunman shot up Emma Gonzalez's school in Parkland, Florida, she famously announced, "We call BS" on the claims of adults who insisted that changing the gun laws was unnecessary and impossible. She led the March for Our Lives, which brought millions onto the streets in this country to denounce politicians' inaction on gun violence.

While Donald Trump took the U.S. out of the Paris climate agreement, Greta Thunberg, the 17-year-old Swedish environmental activist, crossed the Atlantic in a carbon-neutral sailing vessel to address the United Nations, demanding of the adult world "How dare you" leave it to your children to save an increasingly warming planet:

"You have stolen my dreams and my childhood with your empty words. And yet I'm one of the lucky ones. People are suffering. People are dying. Entire ecosystems are collapsing. We are in the beginning of a mass extinction, and all you can talk about is money and fairy tales of eternal economic growth. How dare you!"

"How dare you?" is a question I ask myself every time, as a teacher, I face a classroom of college students who, each semester, seem both more anxious about the future and more determined to make it better than the present.

Public attention is a strange beast. Communities of color have known for endless years that the police can kill them with impunity, and it's not as if people haven't been saying so for decades. But when such incidents made it into the largely white mainstream media, they were routinely treated as isolated events -- the actions of a few bad apples -- and never as evidence of a systemic problem. Suddenly, in May 2020, with the release of a hideous video of George Floyd's eight-minute murder in Minneapolis, Minnesota, systematic police violence against Blacks became a legitimate topic of mainstream discussion.

The young have been at the forefront of the response to Floyd's murder and the demands for systemic change that have followed. This June in my city of San Francisco, where police have killed at least five unarmed people of color in the last few years, high school students planned and led tens of thousands of protesters in a peaceful march against police violence.

Now that the election season has reached its drawn-out crescendo, there is so much work ahead of us. With the pandemic spreading out of control, it's time to begin demanding concerted federal action, even from this most malevolent president in history. There's no waiting for Inauguration Day, no matter who takes the oath of office on January 20th. Many thousands more will die before then.

And isn't it time to turn our attention to the millions who have lost their jobs and face the possibility of losing their housing, too, as emergency anti-eviction decrees expire? Isn't it time for a genuine congressional response to hunger, not by shoring up emergency food distribution systems like food pantries, but by putting dollars in the hands of desperate Americans so they can buy their own food? Congress must also act on the housing emergency. The Centers for Disease Control and Prevention's "Temporary Halt in Residential Evictions To Prevent the Further Spread of Covid-19" only lasts until December 31st and it doesn't cover tenants who don't have a lease or written rental agreement. It's crucial, even with Donald Trump still in the White House as the year begins, that it be extended in both time and scope. And now Senate Republican leader Mitch McConnell has said that he won't even entertain a new stimulus bill until January.

Another crucial subject that needs attention is pushing Congress to increase federal funding to state and local governments, which so often are major economic drivers for their regions. The Trump administration and McConnell not only abandoned states and cities, leaving them to confront the pandemic on their own just as a deep recession drastically reduced tax revenues, but -- in true looking-glass fashion -- treated their genuine and desperate calls for help as mere Democratic Party campaign rhetoric.

"In Short, There Is Still Much to Do"

My favorite scene in Gillo Pontecorvo's classic 1966 film The Battle of Algiers takes place at night on a rooftop in the Arab quarter of that city. Ali La Pointe, a passionate recruit to the cause of the National Liberation Front (NLF), which is fighting to throw the French colonizers out of Algeria, is speaking with Ben M'Hidi, a high-ranking NLF official. Ali is unhappy that the movement has called a general strike in order to demonstrate its power and reach to the United Nations. He resents the seven-day restriction on the use of firearms. "Acts of violence don't win wars," Ben M'Hidi tells Ali. "Finally, the people themselves must act."

For the last four years, Donald Trump has made war on the people of this country and indeed on the people of the entire world. He's attacked so many of us, from immigrant children at the U.S. border to anyone who tries to breathe in the fire-choked states of California, Oregon, Washington, and most recently Colorado. He's allowed those 230,000 Americans to die in a pandemic that could have been controlled and thrown millions into poverty, to mention just a few of his "war" crimes. Finally, the people themselves must act.

On that darkened rooftop in an eerie silence, Ben M'Hidi continues his conversation with La Pointe. "You know, Ali," he says. "It's hard enough to start a revolution, even harder to sustain it, and hardest of all to win it." He pauses, then continues, "But it's only afterwards, once we've won, that the real difficulties begin. In short, there is still much to do."

It's hard enough to vote out a looking-glass president. But it's only once we've won, whether that's now or four years from now, that the real work begins. There is, indeed, still much to do.



Copyright 2020 Rebecca Gordon

How to take on Trump on the ground — and convince others to do the same

"Look, folks, the air quality is in the red zone today. The EPA says that means people with lung or heart issues should avoid prolonged activity outdoors."

That was J.R. de Vera, one of two directors of UNITE-HERE!'s independent expenditure campaign to elect Biden and Harris in Reno, Nevada. UNITE-HERE! is a union representing 300,000 workers in the hospitality industry -- that world of hotels and bars, restaurants and caterers. Ninety percent of its members are now laid off because of Trump's bungling of the Covid-19 pandemic and many are glad for the chance to help get him out of the White House.

"So some of you will want to stay in your hotel rooms and make phone calls today," JR continues. Fifty faces fall in the 50 little Zoom boxes on my laptop screen. Canvassers would much rather be talking to voters at their doors than calling them on a phone bank. Still, here in the burning, smoking West, the union is as committed to its own people's health and safety as it is to dragging Donald Trump out of office. So, for many of them, phone calls it will be.

My own job doesn't change much from day to day. Though I live in San Francisco, I've come to Reno to do back-room logistics work in the union campaign's cavernous warehouse of an office: ordering supplies, processing reimbursements, and occasionally helping the data team make maps of the areas our canvassers will walk.

Our field campaign is just one of several the union is running in key states. We're also in Arizona and Florida and, only last week, we began door-to-door canvassing in Philadelphia. Social media, TV ads, bulk mail, and phone calls are all crucial elements in any modern electoral campaign, but none of them is a substitute for face-to-face conversations with voters.

We've been in Reno since early August, building what was, until last week, the only field campaign in the state supporting Joe Biden and Kamala Harris. (Just recently, our success in campaigning safely has encouraged the Democratic Party to start its own ground game here and elsewhere.) We know exactly how many doors we have to knock on, how many Biden voters we have to identify, how many of them we have to convince to make a concrete voting plan, and how many we have to get out to vote during Nevada's two-week early voting period to win here.

We're running a much larger campaign in Clark County, where close to three-quarters of Nevada's population lives (mostly in Las Vegas). Washoe County, home of the twin cities of Reno and Sparks, is the next largest population center with 16% of Nevadans. The remaining 14 counties, collectively known as "the Rurals," account for the rest. Washoe and Clark are barely blue; the Rurals decidedly red.

In 2018, UNITE-HERE!'s ground campaign helped ensure that Jacky Rosen would flip a previously Republican Senate seat, and we helped elect Democrat Steve Sisolak as governor. He's proved a valuable union ally, signing the Adolfo Fernandez Act, a first-in-the-nation law protecting workers and businesses in Nevada from the worst effects of the Covid-19 pandemic.

Defying a threatened Trump campaign lawsuit (later dismissed by a judge), Sisolak also signed an election reform bill that allows every active Nevada voter to receive a mail-in ballot. Largely as a result of the union's work in 2018, this state now boasts an all-female Democratic senatorial delegation, a Democratic governor, and a female and Democratic majority in the state legislature. Elections, as pundits of all stripes have been known to say, have consequences.

Door-to-Door on Planet A

"¿Se puede, o no se puede?"

"¡Sí, se puede!"

("Can we do it?" "Yes, we can!")

Each morning's online canvass dispatch meeting starts with that call-and-response followed by a rousing handclap. Then we talk about where people will be walking that day and often listen to one of the canvassers' personal stories, explaining why he or she is committed to this campaign. Next, we take a look at the day's forecast for heat and air quality as vast parts of the West Coast burn, while smoke and ash travel enormous distances. Temperatures here were in the low 100s in August (often hovering around 115 degrees in Las Vegas). And the air? Let's just say that there have been days when I've wished breathing were optional.

Climate-change activists rightly point out that "there's no Planet B" for the human race, but some days it seems as if our canvassers are already working on a fiery Planet A that is rapidly becoming unlivable. California's wildfires -- including its first-ever "gigafire" -- have consumed more than four million acres in the last two months, sending plumes of ash to record heights, and dumping a staggering amount of smoke into the Reno-Sparks basin. Things are a little better at the moment, but for weeks I couldn't see the desert mountains that surround the area. Some days I couldn't even make out the Grand Sierra Reno casino, a quarter mile from the highway on which I drive to work each morning.

For our canvassers -- almost every one a laid-off waiter, bartender, hotel housekeeper, or casino worker -- the climate emergency and the Covid-19 pandemic are literally in their faces as they don their N95 masks to walk the streets of Reno. It's the same for the voters they meet at their doors. Each evening, canvassers report (on Zoom, of course) what those voters are saying and, for the first time I can remember, they are now talking about the climate. They're angry at a president who pulled the U.S. out of the Paris climate accord and they're scared about what a potentially searing future holds for their children and grandchildren. They may not have read Joe Biden's position on clean energy and environmental justice, but they know that Donald Trump has no such plan.

Braving Guns, Germs, and Smoke

In his classic book Guns, Germs, and Steel, Jared Diamond suggested that the three variables in his title helped in large part to explain how European societies and the United States came to control much of the planet in the twentieth century. As it happens, our door-to-door canvassers confront a similar triad of obstacles right here in Reno, Nevada (if you replace that final "steel" with "smoke").

Guns and Other Threats

Nevada is an open-carry state and gun ownership is common here. It's not unusual to see someone walking around a supermarket with a holstered pistol on his hip. A 2015 state law ended most gun registration requirements and another allows people visiting from elsewhere to buy rifles without a permit. So gun sightings are everyday events.

Still, it can be startling, if you're not used to it, to have a voter answer the door with a pistol all too visible, even if securely holstered. And occasionally, our canvassers have even watched those guns leave their holsters when the person at the door realizes why they're there (which is when the campaign gets the police involved). Canvassers are trained to observe very clear protocols, including immediately leaving an area if they experience any kind of verbal or physical threat.

African American and Latinx canvassers who've campaigned before in Reno say that, in 2020, Trump supporters seem even more emboldened than in the past to shout racist insults at them. More than once, neighbors have called the police on our folks, essentially accusing them of canvassing-while-black-or-brown. Two days before I wrote this piece, the police pulled over one young Latino door-knocker because neighbors had called to complain that he was walking up and down the street waving a gun. (The "gun" in question was undoubtedly the electronic tablet he was carrying to record the results of conversations with voters.) The officer apologized.

Which reminds me of another apology offered recently. A woman approached an African-American canvasser, demanding to know what in the world he was doing in her neighborhood. On learning his mission, she offered an apology as insulting as her original question. "We're not used to seeing people like you around here," she explained.

Germs
Until the pandemic, my partner and I had planned to work together with UNITE-HERE! in Reno during this election, as we did in 2018. But she's five years older than I am, and her history of pneumonia means that catching Covid-19 could be especially devastating for her. So she's stayed in San Francisco, helping out the union's national phone bank effort instead.

In fact, we didn't really expect that there would be a ground campaign this year, given the difficulties presented by the novel coronavirus. But the union was determined to eke out that small but genuine addition to the vote that a field campaign can produce. So they put in place stringent health protocols for all of us: masks and a minimum of six feet of distance between everyone at all times; no visits to bars, restaurants, or casinos, including during off hours; temperature checks for everyone entering the office; and the immediate reporting of any potential Covid-19 symptoms to our health and safety officer. Before the union rented blocks of rooms at two extended-stay hotels, our head of operations checked their mask protocols for employees and guests and examined their ventilation systems to make sure that the air conditioners vented directly outdoors and not into a common air system for the whole building.

To date, not one of our 57 canvassers has tested positive, a record we intend to maintain as we add another 17 full-timers to our team next week.

One other feature of our coronavirus protocol: we don't talk to any voter who won't put on a mask. I was skeptical that canvassers would be able to get voters to mask up, even with the individually wrapped surgical masks we're offering anyone who doesn't have one on or handy. However, it turns out that, in this bizarre election year, people are eager to talk, to vent their feelings and be heard. So many of the people we're canvassing have suffered so much this year that they're surprised and pleased when someone shows up at their door wondering how they're doing.

And the answer to that question for so many potential voters is not well -- with jobs lost, housing threatened, children struggling with online school, and hunger pangs an increasingly everyday part of life. So yes, a surprising number of people, either already masked or quite willing to put one on, want to talk to us about an election that they generally see as the most important of their lifetime.

Smoke
And did I mention that it's been smoky here? It can make your eyes water, your throat burn, and the urge to cough overwhelm you. In fact, the symptoms of smoke exposure are eerily similar to the ones for Covid-19. More than one smoke-affected canvasser has spent at least five days isolated in a hotel room, waiting for negative coronavirus test results.

The White House website proudly quotes the president on his administration's testing record: "We do tremendous testing. We have the best testing in the world." Washoe County health officials are doing what they can, but if this is the best in the world, then the world is in worse shape than we thought.

The Power of a Personal Story

So why, given the genuine risks and obstacles they face, do UNITE-HERE!'s canvassers knock on doors six days a week to elect Joe Biden and Kamala Harris? Their answers are a perfect embodiment of the feminist dictum "the personal is political." Every one of them has a story about why she or he is here. More than one grew up homeless and never wants another child to live that way. One is a DACA recipient who knows that a reelected Donald Trump will continue his crusade to end that amnesty for undocumented people brought to the United States as children. Through their participation in union activism, many have come to understand that workers really can beat the boss when they organize -- and Trump, they say, is the biggest boss of all.

Through years of political campaigning, the union's leaders have learned that voters may think about issues, but they're moved to vote by what they feel about them. The goal of every conversation at those doors right now is to make a brief but profound personal connection with the voter, to get each of them to feel just how important it is to vote this year. Canvassers do this by asking how a voter is doing in these difficult times and listening -- genuinely listening -- and responding to whatever answer they get. And they do it by being vulnerable enough to share the personal stories that lie behind their presence at the voter's front door.

One canvasser lost his home at the age of seven, when his parents separated. He and his mother ended up staying in shelters and camping for months in a garden shed on a friend's property. One day recently he knocked on a door and found a Trump supporter on the other side of it. He noticed a shed near the house, pointed to it, and told the man about living in something similar as a child. That Trumpster started to cry. He began talking about how he'd had just the same experience and the way, as a teenager, he'd had to hold his family together when his heroin-addicted parents couldn't cope. He'd never talked to any of his present-day friends about how he grew up and, in the course of that conversation, came to agree with our canvasser that Donald Trump wasn't likely to improve life for people like them. He was, he said, changing his vote to Biden right then and there. (And that canvasser will be back to make sure he actually votes.)

Harvard University Professor Marshall Ganz pioneered the "public narrative," the practice of organizing by storytelling. It's found at the heart of many organizing efforts these days. The 2008 Obama campaign, for example, trained thousands of volunteers to tell their stories to potential voters. The It Gets Better Project has collected more than 50,000 personal messages from older queer people to LGBTQ youth who might be considering suicide or other kinds of self-harm -- assuring them that their own lives did, indeed, get better.

Being the sort of political junkie who devours the news daily, I was skeptical about the power of this approach, though I probably shouldn't have been. After all, how many times did I ask my mother or father to "tell me a story" when I was a kid? What are our lives but stories? Human beings are narrative animals and, however rational, however versed in the issues we may sometimes be, we still live through stories.

Data can give me information on issues I care about, but it can't tell me what issues I should care about. In the end, I'm concerned about racial and gender justice as well as the climate emergency because of the way each of them affects people and other creatures with whom I feel connected.

A Campaign Within a Campaign

Perhaps the most inspiring aspect of UNITE-HERE!'s electoral campaign is the union's commitment to developing every canvasser's leadership skills. The goal is more than winning what's undoubtedly the most important election of our lifetime. It's also to send back to every hotel, restaurant, casino, and airport catering service leaders who can continue to organize and advocate for their working-class sisters and brothers. This means implementing an individual development plan for each canvasser.

Team leaders work with all of them to hone their stories into tools that can be used in an honest and generous way to create a genuine connection with voters. They help those canvassers think about what else they want to learn to do, while developing opportunities for them to master technical tools like computer spreadsheets and databases.

There's a special emphasis on offering such opportunities to women and people of color who make up the vast majority of the union's membership. Precious hours of campaign time are also devoted to workshops on how to understand and confront systemic racism and combat sexual harassment, subjects President Trump is acquainted with in the most repulsively personal way. The union believes its success depends as much on fostering a culture of respect as on the hard-nosed negotiating it's also famous for.

After months of pandemic lockdown and almost four years of what has objectively been the worst, most corrupt, most incompetent, and possibly even most destructive presidency in the nation's history, it's a relief to be able to do something useful again. And sentimental as it may sound, it's an honor to be able to do it with this particular group of brave and committed people. Sí, se puede. Yes, we can.



Copyright 2020 Rebecca Gordon

How a diseased planet is reshaping the world of work

In two weeks, my partner and I were supposed to leave San Francisco for Reno, Nevada, where we’d be spending the next three months focused on the 2020 presidential election. As we did in 2018, we’d be working with UNITE-HERE!, the hospitality industry union, only this time on the campaign to drive Donald Trump from office.


The US is in free fall — and at risk of becoming a failed state

You know that feeling when you trip on the street and instantly sense that you’re about to crash hard and there’s no way to prevent it? As gravity has its way with you, all you can do is watch yourself going down. Yeah, that feeling.


How the credibility gap became a chasm in the age of Trump

These days, teaching graduating college seniors has me, as the Brits would say in the London Underground, “minding the gap.” In my case, however, it’s not the gap between the platform and the train I’m thinking of, but a couple of others: those between U.S. citizens and our government, as well as between different generations of Americans. The Covid-19 crisis has made some of those gaps far more visible than usual, just as my students leave school with many of their personal expectations in tatters.


Why the chaotic mind of Donald Trump is strangely and terrifyingly fascinating

My partner and I have been fighting about politics since we met in 1965. I was 13. She was 18 and my summer camp counselor. (It was another 14 years before we became a couple.) We spent that first summer arguing about the Vietnam War. I was convinced that the U.S. never should have gotten involved there. Though she agreed, she was more concerned that the growth of an antiwar movement would distract people from what she considered far more crucial to this country: the Civil Rights movement. As it happened, we were both right.


The future may be female — but the pandemic is patriarchal

Before I found myself “sheltering in place,” this article was to be about women’s actions around the world to mark March 8th, International Women’s Day. From Pakistan to Chile, women in their millions filled the streets, demanding that we be able to control our bodies and our lives. Women came out in Iraq and Kyrgyzstan, Turkey and Peru, the Philippines and Malaysia. In some places, they risked beatings by masked men. In others, they demanded an end to femicide -- the millennia-old reality that women in this world are murdered daily simply because they are women.


'The right to do whatever I want as president'

On February 5th, the Senate voted to acquit President Donald J. Trump of abuse of power and obstruction of Congress. In other words, Trump’s pre-election boast that he “could stand in the middle of Fifth Avenue and shoot somebody” and not “lose any voters” proved something more than high-flown hyperbole. (To be fair, he did lose one Republican “voter” in the Senate — Mitt Romney — but it wasn’t enough to matter.)


How the cowardice of the Obama administration foreshadowed the impunity of Trump



Here's what Trump doesn't understand about the 'deep state'

This seems like a strange moment to be writing about "the deep state" with the country entering a new phase of open and obvious aboveground chaos and instability. Just as we had gotten used to the fact that the president is, in effect, under congressional indictment, just as we had settled into a more or less stable stalemate over when (and if) the Senate will hold an impeachment trial, the president shook the snow globe again by ordering the assassination of foreign military officials and threatening the destruction of Iran's cultural sites. Nothing better than the promise of new war crimes to take the world's attention away from a little thing like extorting a U.S. ally to help oneself get reelected.


What's wrong with Trump's Republican Party? It springs from the twin roots of political evil

On the Thursday of the second week of the House Intelligence Committee's impeachment hearings, former U.S. Attorney Preet Bharara had a special guest on his weekly podcast: Carl Bernstein. It was Bernstein, with fellow Washington Post journalist Bob Woodward, whose reporting broke open the story of how the Committee to Re-elect the President burglarized Democratic Party headquarters at the Watergate office building in Washington, D.C. That reporting and the impeachment hearings that followed eventually forced President Richard Nixon to resign in disgrace in 1974. Bharara wanted to hear about what differences Bernstein sees between the Nixon impeachment proceedings and Donald Trump's today.


Trump's high crimes: Why impeachment fever is catching on — and could stop a historically destructive president

Recently a friend who follows the news a bit less obsessively than I do said, “I thought George W. Bush was bad, but it seems like Donald Trump is even worse. What do you think?”


Trump wants to give pardons for war crimes — but the worst offenders don't even need them

Memorial Day has come and gone and President Trump did not issue his pardons after all. There was substantial evidence that he was planning to use the yearly moment honoring the country’s war dead to grant executive clemency to several U.S. soldiers and at least one military contractor. All have been accused, and one already convicted, of crimes in the never-ending war on terror. But apparently Trump received enough resistance from serving and retired senior military officers and former soldiers, including presidential candidate Pete Buttigieg, to change his mind -- for now.


How to make yourself an exception to the rule of law

Events just fly by in the ever-accelerating rush of Trump Time, so it’s easy enough to miss important ones in the chaos. Paul Manafort is sentenced twice and indicted a third time! Whoosh! Gone! The Senate agrees with the House that the United States should stop supporting Saudi Arabia in Yemen (and Mitch McConnell calls this attempt to extricate the country from cooperation in further war crimes “inappropriate and counterproductive”)! Whoosh! Gone! Twelve Republican senators cross party lines to overturn Trump’s declaration of a national emergency on the U.S.-Mexico border, followed by the president’s veto! Whoosh! Gone! Delegates to the March 2019 U.N. Environment Assembly meeting agree to a non-binding but important resolution drastically reducing the production of single-use plastic. The United States delegation, however, succeeds in watering down the final language lest it “endorse the approach being taken in other countries, which is different than our own”! Once again, the rest of the world is briefly reminded of the curse of American exceptionalism and then, whoosh! Gone!


How Trump's ego could weaken the National Guard

A young friend is seriously considering joining her state’s National Guard. She’s a world-class athlete, but also a working-class woman from a rural background competing in a rich person’s sport. Between seasons, she works for a local farm and auctioneer to put together the money for equipment and travel.


A world on fire: Trump's dizzying outrage cycle fuels our anxiety like a drug — but his presidency's dangers are very real

I took my first hit of speed in 1970 during my freshman year in college. That little white pill -- Dexedrine -- was a revelation. It made whatever I was doing absolutely fascinating. Amphetamine sharpened my focus and banished all appetites except a hunger for knowledge. I spent that entire night writing 35 pages of hand-scrawled notes about a 35-page article by the philosopher Ludwig Feuerbach, thereby convincing the professor who would become my advisor and mentor that I was absolutely fascinating.
