Rebecca Gordon

Big Brother Trump is already watching you

Sometime in the late 1980s, I was talking with a friend on my landline (the only kind of telephone we had then). We were discussing logistics for an upcoming demonstration against the Reagan administration’s support for the Contras fighting the elected government of Nicaragua. We agreed that, when our call was done, I’d call another friend, “Mary,” to update her on the plans. I hung up.

But before I could make the call, my phone rang.

“Hi, this is Mary,” my friend said.

“Mary! I was just about to call you.”

“But you did call me,” she said.

“No, I didn’t. My phone just rang, and you were on the other end.”

It was pretty creepy, but that was how surveillance worked in the days of wired telephone systems. Whoever was listening in, most likely someone from the local San Francisco Police Department, had inadvertently caused both lines to ring, while preparing to catch my coming conversation with Mary. Assuming they’d followed the law, arranging such surveillance would have involved a number of legal and technical steps, including securing a wiretapping warrant. They’d have had to create a physical connection between their phones and ours, most likely by plugging into the phone company’s central office.

Government surveillance has come a long way since then, both technically and in terms of what’s legally possible in Donald Trump’s United States and under the John Roberts Supreme Court.

All the President’s Tech

Government agencies have many ways of keeping tabs on us today. The advent of cellular technology has made it so much easier to track where any of us have been, simply by triangulating our locations from the cell towers our phones have pinged along the way.
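
For the technically curious, here is a minimal sketch (purely illustrative, not any agency's actual software) of how a position can be estimated once three towers' coordinates and rough distances, inferred from signal timing or strength, are known. The tower locations and distances below are invented for the example.

```python
import numpy as np

def trilaterate(towers, distances):
    """Estimate (x, y) from three tower positions and approximate distances.

    Subtracting the first circle equation from the other two gives a
    linear system A @ p = b, solved here by least squares.
    """
    (x1, y1), (x2, y2), (x3, y3) = towers
    d1, d2, d3 = distances
    A = np.array([
        [2 * (x2 - x1), 2 * (y2 - y1)],
        [2 * (x3 - x1), 2 * (y3 - y1)],
    ])
    b = np.array([
        d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2,
        d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2,
    ])
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Hypothetical tower coordinates (in km) and distances for illustration only.
towers = [(0.0, 0.0), (6.0, 0.0), (0.0, 8.0)]
distances = [5.0, 5.0, 5.0]
print(trilaterate(towers, distances))  # roughly [3., 4.]
```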

If you watch police procedurals on television (which I admit to doing more than is probably good for me), you’ll see a panoply of surveillance methods on display, in addition to cellular location data. It used to be only on British shows that the police could routinely rely on video recordings as aids in crime solving. For some decades, the Brits were ahead of us in creating a surveillance society. Nowadays, though, even the detectives on U.S. shows like Law & Order: SVU (heading for its 27th season) can usually locate a private video camera with a sightline to the crime and get its owner to turn over the digital data.

Facial recognition is another technology you’ll see on police dramas these days. It’s usually illustrated by a five-second interval during which dozens of faces appear briefly on a computer monitor. The sequence ends with a final triumphant flourish—a single face remaining on screen, behind a single flashing word: “MATCH.”

We should probably live as if everything we do, even in supposedly “secure” places (real and virtual), is visible to the Trump regime.

I have no idea whether the TV version is what real facial recognition software actually looks like. What I do know is that it’s already being used by federal agencies like Immigration and Customs Enforcement (ICE) and the FBI, under the auspices of a company called Clearview, which is presently led by Hal Lambert, a big Trump fundraiser. As Mother Jones magazine reports, Clearview has “compiled a massive biometric database” containing “billions of images the company scraped off the internet and social media without the knowledge of the platforms or their users.” The system is now used by law enforcement agencies around the country, despite its well-documented inability to accurately recognize the faces of people with dark skin.

The old-fashioned art of tailing suspects on foot is rapidly giving way to surveillance by drone, while a multitude of cameras at intersections capture vehicle license plates. Fingerprinting has been around for well over a century, although it doesn’t actually work on everyone. Old people tend to lose the ridges that identify our unique prints, which explains why I can’t reliably use mine to open my phone or wake my computer. Maybe now’s my moment to embark on a life of crime? Probably not, though, as my face is still pretty recognizable, and that’s what the Transportation Security Administration uses to make sure I’m really the person in the photo on my Real ID.

The second Trump administration is deploying all of these surveillance methods and more, as it seeks to extend its authoritarian power. And one key aspect of that project is the consolidation of the personal information of millions of people in a single place.

One Database to Rule Them All

It’s been thoroughly demonstrated that, despite its name, Elon Musk’s Department of Government Efficiency has been anything but efficient in reducing “waste, fraud, and abuse” in federal spending. DOGE, however, has made significantly more progress in achieving a less well publicized but equally important objective: assembling into a single federal database the personal details of hundreds of millions of individuals who have contact with the government. Such a database would combine information from multiple agencies, including the IRS and the Social Security Administration. The process formally began in March 2025 when, as The New York Times reported, President Trump signed an executive order “calling for the federal government to share data across agencies.” Such a move, as Times reporters Sheera Frenkel and Aaron Krolik note, raises “questions over whether he might compile a master list of personal information on Americans that could give him untold surveillance power.”

In keeping with the fiction that DOGE’s work is primarily focused on cost cutting, Trump labeled his order “Stopping Waste, Fraud, and Abuse by Eliminating Information Silos.” That fiction provided the pretext for DOGE’s demands that agency after agency grant its minions free access to the most private data they had on citizens and noncitizens alike. As The Washington Post reported in early May:

The U.S. DOGE Service is racing to build a single centralized database with vast troves of personal information about millions of U.S. citizens and residents, a campaign that often violates or disregards core privacy and security protections meant to keep such information safe, government workers say.

Worse yet, it will probably be impossible to follow DOGE’s trail of technological mayhem. As the Post reporters explain:

The current administration and DOGE are bypassing many normal data-sharing processes, according to staffers across 10 federal agencies, who spoke on the condition of anonymity out of fear of retribution. For instance, many agencies are no longer creating records of who accessed or changed information while granting some individuals broader authority over computer systems. DOGE staffers can add new accounts and disable automated tracking logs at several Cabinet departments, employees said. Officials who objected were fired, placed on leave or sidelined.

My own union, the American Federation of Teachers, joined a suit to prevent DOGE from seizing access to Social Security data and won in a series of lower courts. However, on May 31, in a 6-3 ruling, the Supreme Court (with the three liberal justices dissenting) temporarily lifted the block imposed by the lower courts until the case comes back to the justices for a decision on its merits. In the meantime, DOGE can have what it wants from the Social Security Administration. And even if the Supreme Court were ultimately to rule against DOGE, the damage will already have been done. As the president of El Salvador said in response to an entirely different court ruling, “Oopsie. Too late.”

Musk’s Pal Peter Thiel—and Palantir

Anyone who’s ever worked with a database, even one with only a few thousand records, knows how hard it is to keep it organized and clean. There’s the problem of duplicate records (multiple versions of the same person or other items). And that’s nothing compared to the problem of combining information from multiple sources. Even the names of the places where data goes (“fields”) will differ from one base to another. The very structures of the databases and how records are linked together (“relationships”) will differ, too. All of this makes combining and maintaining databases a messy and confusing business. Now imagine trying to combine dozens of idiosyncratically constructed ones with information stretching back decades into one single, clean, useful repository of information. It’s a daunting project.
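
If you’ve never had to do this yourself, a toy sketch may make the friction concrete. Here, two hypothetical agency extracts describe the same person under different field names and formats; the schemas and records are invented, and real systems would need far more careful matching than this.

```python
# A toy illustration (invented field names and records) of why merging
# agency databases is messy: the "same" person arrives under different
# schemas, spellings, and date formats.
ssa_rows = [
    {"ssn": "123-45-6789", "full_name": "Mary Q. Example", "dob": "1954-03-02"},
]
irs_rows = [
    {"tin": "123456789", "taxpayer": "MARY EXAMPLE", "birth_date": "03/02/1954"},
]

def normalize_ssa(row):
    return {
        "id": row["ssn"].replace("-", ""),
        "name": row["full_name"].upper().replace(".", ""),
        "dob": row["dob"],  # already in ISO 8601 form
    }

def normalize_irs(row):
    month, day, year = row["birth_date"].split("/")
    return {
        "id": row["tin"],
        "name": row["taxpayer"].upper(),
        "dob": f"{year}-{month}-{day}",
    }

merged = {}
for record in [normalize_ssa(r) for r in ssa_rows] + [normalize_irs(r) for r in irs_rows]:
    # Crude de-duplication: the last record with a given id wins. Real systems
    # need fuzzy name matching, conflict resolution, and audit trails.
    merged[record["id"]] = record

print(merged)
```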

And in the case of Trump’s One Big Beautiful Database, that’s where Peter Thiel’s company Palantir comes in. As The New York Times reported recently, at the urging of Elon Musk and DOGE, Trump turned to Palantir to carry out the vision expressed in his March executive order mentioned above. In fact, according to the Times, “at least three DOGE members formerly worked at Palantir, while two others had worked at companies funded by Peter Thiel, an investor and a founder of Palantir.”

Palantir, named for the “seeing stones” described in J.R.R. Tolkien’s Lord of the Rings, is already at work, providing its data platform Foundry to several parts of the government. According to the Times:

The Trump administration has expanded Palantir’s work across the federal government in recent months. The company has received more than $113 million in federal government spending since Mr. Trump took office, according to public records, including additional funds from existing contracts as well as new contracts with the Department of Homeland Security and the Pentagon. (This does not include a $795 million contract that the Department of Defense awarded the company last week, which has not been spent.) Representatives of Palantir are also speaking to at least two other agencies—the Social Security Administration and the Internal Revenue Service—about buying its technology, according to six government officials and Palantir employees with knowledge of the discussions.

Who is Peter Thiel, Palantir’s co-founder? In addition to being a friend of Musk’s, Thiel was an early Trump supporter among the tech elites of Silicon Valley, donating $1.25 million to his 2016 campaign. He is also credited with shaping the political career of Vice President JD Vance, from his campaign to become a senator to his selection as Trump’s running mate. Thiel is part of a rarified brotherhood of tech and cryptocurrency billionaires who share a commitment to a particular project of world domination by a technological elite. (And if that sounds like the raw material for a crazy conspiracy theory, bear with me here.) Thiel was also an early funder of Clearview, the facial recognition company mentioned earlier.

In hiring Palantir and turning our data over to the company, Trump makes himself a useful tool, along with Vance, in the service of Thiel’s vision—just as he has been to the machinations of Project 2025’s principal author Russell Vought, who has different, but no less creepy dreams of domination.

The Dark Enlightenment

Thiel and his elite tech bros, including Musk, Internet pioneer and venture capitalist Marc Andreessen, and Clearview founder Hoan Ton-That, share a particular philosophy. Other believers include figures like fervent Trump supporter Steve Bannon and Vice President Vance. This explicitly anti-democratic worldview goes by various names, including the “neo-reactionary movement” and the “Dark Enlightenment.”

Its founder is a software developer and political blogger named Curtis Yarvin, who has advocated replacing a “failed” democratic system with an absolute monarchy. Describing the Dark Enlightenment in The Nation magazine in October 2022, Chris Lehmann observed that, in his run for Senate, JD Vance had adopted “a key plank of [Yarvin’s] plan for post-democratic overhaul—the strongman plan to ‘retire all government employees,’ which goes by the jaunty mnemonic ‘RAGE.’” (Any similarity to Musk’s DOGE is probably not coincidental.)

So, what is the Dark Enlightenment? It’s the negative image of an important intellectual movement of the 17th and 18th centuries, the Enlightenment, whose principles formed, among other things, the basis for American democracy. These included such ideas as the fundamental equality of all human beings, the view that government derives its authority from the consent of the governed, and the existence of those “certain unalienable rights” mentioned in the U.S. Declaration of Independence.

Our response must be to oppose Trump’s onrushing version of American fascism as boldly and openly as we can.

The Dark Enlightenment explicitly opposes all of those and more. Lehmann put it this way: “As Yarvin envisions it, RAGE is the great purge of the old operating system that clears the path for a more enlightened race of technocrats to seize power and launch the social order on its rational course toward information-driven self-realization.” That purge would necessarily produce “collateral casualties,” which would include “the nexus of pusillanimous yet all-powerful institutions Yarvin has dubbed ‘the Cathedral’—the universities, the elite media, and anything else that’s fallen prey to liberal perfidy.” Of course, we’ve already seen at least a partial realization of just such goals in Trump’s focused attacks on universities, journalists, and that collection of values described as diversity, equity, and inclusion.

On that last point, it should be noted that Yarvin and his followers also tend to be adherents of an “intellectual” current called “human biological diversity,” championed by Steven Sailer, another Yarvin acolyte. That phrase has been appropriated by contemporary proponents of what used to be called eugenics, or scientific racism. It’s Charles Murray’s pseudo-scientific 1994 tract The Bell Curve dressed up in high-flown pseudo-philosophy.

However, there’s more to the Dark Enlightenment than authoritarianism and racism. One stream, populated especially by Thiel and other tech bros, has an eschatology of sorts. This theology of the Earth’s end-times holds that elite humans will eventually (perhaps even surprisingly soon) achieve eternal life through physical communion with machines, greatly augmenting their capacities through artificial intelligence. That’s important to them because they’ve given up on the Earth. This planet, they feel, is already too small and used up to sustain human life for long. Hence, our human destiny is instead to rule the stars. This is the theology underlying Elon Musk’s hunger for Mars. Anything that stands in the way of such a destiny must and shall be swept away on the tide of a tech-bro future. (For an excellent explication of the full worldview shared by such would-be masters of the rest of us—and the rest of the universe as well—take a look at Adam Becker’s new book, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley’s Crusade to Control the Fate of Humanity.)

Surveillance Everywhere?

Back in my own corner of the world, the San Francisco Police Department has come a long way since those ancient days of clumsy phone tapping. Recently, a cryptocurrency billionaire, Chris Larsen, gave the SFPD $9.4 million to upgrade its surveillance tech. They’ll use the money to outfit a new Real Time Investigation Center (RTIC) with all the latest toys. “We’re going to be covering the entire city with drones,” claimed RTIC representative Captain Thomas MacGuire. Imagine my joyful anticipation!

How should defenders of democracy respond to the coming reality of near-constant, real-time government surveillance? We can try to shrink and hide, of course, but that only does their job for them, by driving us into a useless underground. Instead, we should probably live as if everything we do, even in supposedly “secure” places (real and virtual), is visible to the Trump regime. Our response must be to oppose Trump’s onrushing version of American fascism as boldly and openly as we can. Yes, some of us will be harassed, imprisoned, or worse, but ultimately, the only answer to mass surveillance by those who want to be our overlords is open, mass defiance.

Trump and Co are doing their best to make America white again — as if it ever was

On May 5, the New York Metropolitan Museum of Art’s Costume Institute held its annual fundraising gala. The event showcases the extraordinary imaginations of people who design exorbitant clothes and the gutsiness of those who dare (and can afford) to wear them.

I’m dimly aware of this annual extravaganza because of my interest in knitting, spinning, and weaving—the crafts involved in turning fluff into yarn and yarn into cloth. Mind you, I have no flair for fashion myself. I could never carry off wearing the simplest of ballgowns, and I’m way too short to rock a tuxedo. My own personal style runs to 1970s White Dyke. (Think blue jeans and flannel shirts.) But I remain fascinated by what braver people will get themselves up in.

One of my favorite movies is Paris Is Burning, a 1990 documentary about the underground Harlem ballroom scene, where drag queens and transgender folks, mostly Black and Latina, recreated a fierce version of the world of haute couture. It was a testament to people’s ability to take the detritus of what systems of racism and economic deprivation had given them and spin it into defiant art.

So I was excited to learn that the theme of this year’s gala was to be “Superfine: Tailoring Black Style,” an homage to the tradition of Black dandyism, about which Vogue magazine writes:

There is something undeniably magnetic about the sharp creases of a tailored suit, the gleam of polished leather shoes, the swish of a silk pocket square. But for Black dandyism, this isn’t just about looking good—it’s a declaration. A defiant reclaiming of space in a world that has long sought to define and confine Black identity. So, what exactly is Black dandyism? At its core, it’s a fashion revolution, a movement steeped in history, resistance, and pride.

The Met’s gala theme was chosen back in October 2024, when it still seemed possible that, rather than electing a fascist toddler, this country might choose a Black woman as president. In that case, the gala could have served as an extended victory toast. (As it happens, Kamala Harris did in fact attend.)

Instead, this country is today laboring under an increasingly authoritarian regime in Washington, one proudly and explicitly dedicated to reversing decades of victories by various movements for Black liberation.

Resuscitating Employment Discrimination

I wrote “laboring under” quite intentionally, because one of Trump 2.0’s key attacks on African Americans comes in the realm of work. The Heritage Foundation’s Project 2025 made this clear in its ominous preelection document Mandate for Leadership, in a chapter on the Labor Department. The first “needed reform” there, it insisted, would be to uproot DEI (diversity, equity, and inclusion) efforts wherever they might be found in the government and military. Its authors wrote that the new administration must:

Reverse the DEI Revolution in Labor Policy. Under the Obama and Biden administrations, labor policy was yet another target of the Diversity, Equity, and Inclusion (DEI) revolution. Under this managerialist left-wing race and gender ideology, every aspect of labor policy became a vehicle with which to advance race, sex, and other classifications and discriminate against conservative and religious viewpoints on these subjects and others, including pro-life views. The next administration should eliminate every one of these wrongful and burdensome ideological projects.

In case the reader has any doubt about the evils attributed to DEI, that chapter’s next “needed reform” made it clear that the greatest of those horrors involved any effort whatsoever to prevent racial discrimination against people of color. To that end, Project 2025 wanted the federal government to stop collecting racial demographics in employment. It called on the next administration to eliminate altogether the gathering of such data by the Equal Employment Opportunity Commission (EEOC) on the grounds that collecting “employment statistics based on race/ethnicity… can then be used to support a charge of discrimination under a disparate impact theory. This could lead to racial quotas to remedy alleged race discrimination.”

In other words, as I wrote months before Donald Trump returned to power, “If you can’t demonstrate racial discrimination in employment (because you are enjoined from collecting data about race and employment), then there is no racial discrimination to remedy.”

The 1964 Civil Rights Act first established the EEOC’s mandate to collect such employment data by race in its Title VII, the section on employment rights. Title VII remains a major target of the second Trump administration. That’s especially true when it comes to federal employment, where all federal agencies are required “to maintain an affirmative program of equal employment”—an idea abhorred by the Trump administration.

The employment-rights section of the Civil Rights Act covers all employers, including the federal government. And in 1965, President Lyndon Johnson went even further, issuing Executive Order 11246, which applied similar principles to the employment practices of federal contractors. That order established the Office of Federal Contract Compliance Programs (OFCCP), which uses the EEOC’s data to ensure that federal contractors don’t discriminate against what are considered protected classes of workers.

Not surprisingly, Project 2025 called on the next administration to rescind Executive Order 11246, which is precisely what President Donald Trump did on January 21, 2025, his second day in office, in an order entitled (apparently without irony) “Ending Illegal Discrimination and Restoring Merit-Based Opportunity.” (To be clear, by “illegal discrimination,” Trump, of course, meant imagined “discrimination” against white people.) In addition to eliminating that mandate, Trump’s order also rescinded a number of later executive orders meant to ensure racial equity in employment, including:

(i) Executive Order 12898 of February 11, 1994 (Federal Actions to Address Environmental Justice in Minority Populations and Low-Income Populations);
(ii) Executive Order 13583 of August 18, 2011 (Establishing a Coordinated Government-wide Initiative to Promote Diversity and Inclusion in the Federal Workforce);
(iii) Executive Order 13672 of July 21, 2014 (Further Amendments to Executive Order 11478, Equal Employment Opportunity in the Federal Government, and Executive Order 11246, Equal Employment Opportunity); and
(iv) The Presidential Memorandum of October 5, 2016 (Promoting Diversity and Inclusion in the National Security Workforce).

According to Project 2025, preventing “discrimination” against whites requires another move as well: eliminating any law or policy that prohibits discriminatory employment outcomes. In other words, intentional racial discrimination, which is often impossible to prove, would remain the only form of discrimination the law recognizes.

Decimating the Black Middle Class

Why have I made such a detailed excursion into the weeds of federal law and policymaking? Because the real-world effects on African American communities of such arcane maneuvering will likely be staggering.

Federal employment was a crucial factor in building today’s Black middle class, beginning in the decades after emancipation and accelerating significantly under the provisions of that 1964 Civil Rights Act and the various presidential orders that followed. As Danielle Mahones of the Berkeley Labor Center of the University of California points out, “Federal employment has been a pathway to the middle class for African American workers and their families since Reconstruction, including postal work and other occupations.” We can now expect, she adds, “to see Black workers lose their federal jobs.”

The Trump administration’s apparently race-neutral attack on supposed waste, fraud, and abuse in the federal workforce is guaranteed to disproportionately remove Black workers from federal employment.

And with Donald Trump’s victory in November 2024, that is indeed the plan that Russell Vought, one of the key architects of Project 2025 and now head of the Office of Management and Budget, has brought to the White House. Implementation began with the series of executive orders already described, which largely govern the hiring of new employees. But actions affecting federal hiring don’t take effect quickly, especially in periods of government cutbacks like the one we’re seeing today.

Fortunately for Vought and his co-conspirators at the Heritage Foundation, Trump had another option in his anti-Black toolbox: the chainsaw wielded by Elon Musk and his Department of Government Efficiency. Estimates vary, but the best available figure suggests that, thanks to Musk and crew, around 260,000 federal workers have by now “been fired, taken buyouts, or retired early.”

Eliminating federal employees in such a way has indeed had a disproportionate effect on Black workers, who make up almost 19% of the federal workforce (and perhaps closer to 30% at the Post Office), while the country’s total workforce is only 13% Black. If 260,000 federal workers have lost their jobs under Trump and Musk, then almost 50,000 of them may be Black.
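
For what it’s worth, the arithmetic behind those figures (using only the percentages and the 260,000 estimate cited above) looks like this:

```python
# Back-of-the-envelope arithmetic using the figures cited in the text.
total_cut = 260_000          # estimated federal workers gone so far
black_share_federal = 0.19   # ~19% of the federal workforce is Black
black_share_overall = 0.13   # ~13% of the total U.S. workforce is Black

black_workers_cut = total_cut * black_share_federal
expected_if_proportional = total_cut * black_share_overall

print(f"Black workers among the cuts: ~{black_workers_cut:,.0f}")                   # ~49,400
print(f"If cuts mirrored the overall workforce: ~{expected_if_proportional:,.0f}")  # ~33,800
```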

“Negro Removal”

Of course, Donald Trump’s approach to Blacks is hardly new in this country. “Negro removal” has a long history here. When I first moved to San Francisco in the late 1970s, there was a big blank area in the middle of the city. Acres of empty blocks sat in the section of town known as the “Western Addition” or, to the people who had once lived there, “the Fillmore.” The Fillmore had been a racially mixed neighborhood. Populated by Japanese- and Filipino-Americans, it had also housed a significant Black enclave. As a local NPR podcast described the scene, “If you were walking down San Francisco’s Fillmore Street in the 1950s, chances are you might run into Billie Holiday stepping out of a restaurant. Or Ella Fitzgerald trying on hats. Or Thelonious Monk smoking a cigarette.” The neighborhood was often called the “Harlem of the West.”

But “urban renewal” projects, initiated under the federal Housing Act of 1949, would tear down over 14,000 housing units and an unknown number of businesses there in the name of “slum clearance and community redevelopment.” By the time I arrived, however, much of the Fillmore had been rebuilt, including the Japantown business area, though many empty lots remained. Today, they’ve all been filled in, but the 10% of the city’s population that had been African American when “urban renewal” began has been halved. And while Blacks still represent 5% of the city’s population, they also account for 37% of the unhoused.

The writer and activist James Baldwin visited San Francisco in 1963, while the Fillmore’s razing was in full swing. “Urban renewal,” he pointed out, “is Negro removal.” And according to Mindy T. Fullilove, a professor of urban studies and health, San Francisco’s urban renewal experience was duplicated across the country. As she put it back in 2001:

[U]rban renewal affected thousands of communities in hundreds of cities. Urban renewal was to achieve “clearance” of “blight” and “slum” areas so that they could be rebuilt for new uses other than housing the poor… The short-term consequences were dire, including loss of money, loss of social organization, and psychological trauma.

As Fullilove argued, federal policies like urban renewal, involving “community dispossession—and its accompanying psychological trauma, financial loss, and rippling instability—produced a rupture in the historical trajectory of African American urban communities.” She believes that such federal intervention foreclosed the possibility that Black people would follow the route to full participation in U.S. social, commercial, and political life taken by “earlier waves of immigrants to the city.”

Policies that appear to be “race neutral” can have racialized effects. The phrase “urban renewal” says nothing about uprooting Black communities, yet that is what it achieved in practice. Just as earlier federal policies led to the removal of Black communities from the hearts of hundreds of U.S. cities, the Trump administration’s apparently race-neutral attack on supposed waste, fraud, and abuse in the federal workforce is guaranteed to disproportionately remove Black workers from federal employment. Together with the planned ejection of millions of immigrants, and following the Project 2025 playbook, Trump, Elon Musk, and their minions like Stephen Miller are doing their best to Make America White Again. (As if it ever was!)

Text and Subtext

The second time around, Trump’s administration sees race everywhere. It’s the subtext of almost everything its officials say and it’s right there in the “text” of its actions and pronouncements.

Ironically enough, Mindy Fullilove’s article is—for the moment—still available from the National Institutes of Health library website. Given the “Negro removal” that the Trump administration has been eagerly pursuing across its thousands of websites and libraries, though, who knows how long it will remain there. Certainly, you can expect to see further erasures of African Americans from any arena this administration enters. As Washington Post columnist Theodore R. Johnson writes,

Not only does this White House see race; it is also a preoccupation: One of its first executive orders enacted an anti-diversity agenda that purged women, people of color, and programs from federal websites and libraries. Trump directed the firing of multiple generals and admirals who are Black, female, or responsible for the military following the rule of law.

Recent weeks have seen the purging (and in some cases, embarrassed restoration) of any number of Black historical figures, including Jackie Robinson, Harriet Tubman, and the Tuskegee Airmen, from government websites.

Nor are attacks on employment and representation the new administration’s only attempts to constrain the lives of African Americans. On April 28, Trump issued an executive order devoted to “Strengthening and Unleashing America’s Law Enforcement to Pursue Criminals and Protect Innocent Citizens.” In addition to “unleashing” local law enforcement, the order prepares the way for military involvement in local policing. It also seeks to roll back consent decrees governing the behavior of police departments judged discriminatory by previous Justice Departments. In 2025, no one should be confused about the respective races of the “criminals” and “innocent citizens” referred to in Trump’s order.

So yes, along with overlapping groups, including immigrants, transgender and other LGBTQ+ folks, women, and union workers, Black Americans are clear targets for this administration. That’s why even an event as rarified as the Met Gala still inspires me. As Ty Gaskins wrote in Vogue, Black style is a “defiant reclaiming of space in a world that has long sought to define and confine Black identity.”

Isn’t it now time for all of us to reclaim our space—and nation—from Donald Trump?

The Republican plan for US workers

Recently, you may have noticed that the hot weather is getting ever hotter. Every year the United States swelters under warmer temperatures and longer periods of sustained heat. In fact, each of the last nine months — June 2023 through February 2024 — set a world record for heat. As I’m writing this, March still has a couple of days to go, but likely as not, it, too, will set a record.

This story originally appeared on TomDispatch.

Such heat poses increasing health hazards for many groups: the old, the very young, those of us who don’t have access to air conditioning. One group, however, is at particular risk: people whose jobs require lengthy exposure to heat. Numbers from the Bureau of Labor Statistics show that about 40 workers a year died of heat exposure between 2011 and 2021, although, as CNN reports, that’s probably a significant undercount. In February 2024, responding to this growing threat, a coalition of 10 state attorneys general petitioned the federal Occupational Safety and Health Administration (OSHA) to implement “a nationwide extreme heat emergency standard” to protect workers from the kinds of dangers that last year killed, among others, construction workers, farm workers, factory workers, and at least one employee who was laboring in an un-air-conditioned area of a warehouse in Memphis, Tennessee.

Facing the threat of overweening government interference from OSHA or state regulators, two brave Republican-run state governments have stepped in to protect employers from just such dangerous oversight. Florida and Texas have both passed laws prohibiting localities from mandating protections for workers in extreme heat, such as rest breaks or even drinking water. Seriously, Florida and Texas have made it illegal for local cities to protect their workers from the direct effects of climate change. Apparently, being “woke” includes an absurd desire not to see workers die of heat exhaustion.

And those state laws are very much in keeping with the plans that the national right-wing has for workers, should the wholly-owned Trump subsidiary that is today’s Republican Party take control of the federal government this November.

We’ve Got a Plan for That!

It’s not exactly news that conservatives, who present themselves as the friends of working people, often support policies that threaten not only workers’ livelihoods, but their very lives. This fall, as we face the most consequential elections of my lifetime (all 71 years of it), rights that working people once upon a time fought and died for — the eight-hour day, a legal minimum wage, protections against child labor — are, in effect, back on the ballot. The people preparing for a second Trump presidency aren’t hiding their intentions either. Anyone can discover them, for instance, in the Heritage Foundation’s well-publicized Project 2025 Mandate for Leadership, a “presidential transition” plan that any future Trump administration is expected to put into operation.

As I’ve written before, the New York Times’s Carlos Lozada did us a favor by working his way through all 887 pages of that tome of future planning. Lacking his stamina, I opted for a deep dive into a single chapter of it focused on the “Department of Labor and Related Agencies.” Its modest 35 pages offer a plan to thoroughly dismantle more than a century of workers’ achievements in the struggle for both dignity and simple on-the-job survival.

First Up: Stop Discriminating Against Discriminators

I’m sure you won’t be shocked to learn that the opening salvo of that chapter is an attack on federal measures to reduce employment discrimination based on race or sex. Its author, Jonathan Berry of the Federalist Society, served in Donald Trump’s Department of Labor (DOL). He begins his list of “needed reforms” with a call to “Reverse the DEI Revolution in Labor Policy.” “Under the Obama and Biden Administrations,” Berry explains, “labor policy was yet another target of the Diversity, Equity, and Inclusion (DEI) revolution” under which “every aspect of labor policy became a vehicle with which to advance race, sex, and other classifications and discriminate against conservative and religious viewpoints on these subjects and others, including pro-life views.”

You may wonder what it means to advance “classifications” or why that’s even a problem. Berry addresses this question in his second “necessary” reform, a call to “Eliminate Racial Classifications and Critical Race Theory Trainings.” Those two targets for elimination would seem to carry very different weight. After all, “Critical Race Theory,” or CRT, is right-wing code for the view that structural barriers exist preventing African Americans and other people of color from enjoying the full rights of citizens or residents. It’s unclear that such “trainings” even occur at the Labor Department, under CRT or any other label, so their “elimination” would, in fact, have little impact on workers.

On the other hand, the elimination of “racial classifications” would be consequential for many working people, as Berry makes clear. “The Biden Administration,” he complains, “has pushed ‘racial equity’ in every area of our national life, including in employment, and has condoned the use of racial classifications and racial preferences under the guise of DEI and critical race theory, which categorizes individuals as oppressors and victims based on race.” Pushing racial equity in employment? The horror!

Berry’s characterization of CRT is, in fact, the opposite of what critical race theory seeks to achieve. This theoretical approach to the problem of racism does not categorize individuals at all, but instead describes structures — like corporate hiring practices based on friendship networks — that can disadvantage groups of people of a particular race. In fact, CRT describes self-sustaining systems that do not need individual oppressors to continue (mal)functioning.

The solution to the problem of discrimination in employment in Project 2025’s view is to deny the existence of race (or sex, or sexual orientation) as a factor in the lives of people in this country. It’s simple enough: if there’s no race, then there’s no racial discrimination. Problem solved.

And to ensure that it remains solved, Project 2025 would prohibit the Equal Employment Opportunity Commission, or EEOC, from collecting employment data based on race. The mere existence of such “data can then be used to support a charge of discrimination under a disparate impact theory. This could lead to racial quotas to remedy alleged race discrimination.” In other words, if you can’t demonstrate racial discrimination in employment (because you’re enjoined from collecting data on the subject), then there’s no racial discrimination to remedy. Case closed, right?

By outlawing such data collection, a Republican administration guided by Project 2025 would make it almost impossible to demonstrate the existence of racial disparity in the hiring, retention, promotion, or termination of employees.

Right-wingers in my state of California tried something similar in 2003 with Ballot Proposition 54, known as the Racial Privacy Initiative. In addition to employment data, Prop. 54 would have outlawed collecting racial data about public education and, no less crucially, about policing. As a result, Prop. 54 would have made it almost impossible for civil rights organizations to address the danger of “driving while Black” — the disproportionate likelihood that Black people will be the subject of traffic stops with the attendant risk of police violence or even death. Voters soundly defeated Prop. 54 by a vote of 64% to 36% and, yes, racial discrimination still exists in California, but at least we have access to the data to prove it.

There is, however, one group of people Project 2025 would emphatically protect from discrimination: employers who, because of their “conservative and religious viewpoints… including pro-life views,” want the right to discriminate against women and LGBTQ people. “The President,” writes Berry, “should make clear via executive order that religious employers are free to run their businesses according to their religious beliefs, general nondiscrimination laws notwithstanding.” Of course, Congress already made it clear, in the Religious Freedom Restoration Act of 1993, that “religious” employers are free to ignore anti-discrimination laws when it suits them.

But Wait, There’s More

Not content with gutting anti-discrimination protections, Project 2025 would also seek to rescind rights secured under the Fair Labor Standards Act, or FLSA, which workers have enjoyed for many decades. Originally passed in 1938, the FLSA “establishes minimum wage, overtime pay, recordkeeping, and child labor standards affecting full-time and part-time workers in the private sector and in Federal, State, and local governments,” according to the Department of Labor.

Perhaps because the federal minimum hourly wage has remained stuck at $7.25 for a decade and a half, Project 2025 doesn’t launch the typical conservative attack on the very concept of such a wage. It does, however, go after overtime pay (generally time-and-a-half for more than 40 hours of work a week), by proposing that employers be allowed to average time worked over a longer period. This would supposedly be a boon for workers, granting them the “flexibility” to labor fewer than 40 hours one week and more than 40 the next, without an employer having to pay overtime compensation for that second week. What such a change would actually do, of course, is give an employer the power to require overtime work during a crunch period while reducing hours at other times, thereby avoiding paying overtime often or at all.
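
A quick worked example shows what averaging does to a paycheck. The $20 hourly wage is my own assumption for illustration; the 40-hour threshold and time-and-a-half rate come from the FLSA rules described above.

```python
# Illustrative arithmetic with an assumed $20/hour wage: 50 hours one week,
# 30 hours the next, under current weekly overtime vs. two-week averaging.
wage = 20.0
weeks = [50, 30]

# Current FLSA rule: overtime (1.5x) for hours over 40 in each workweek.
weekly_rule = sum(min(h, 40) * wage + max(h - 40, 0) * 1.5 * wage for h in weeks)

# Proposed averaging: 80 hours over two weeks averages to 40/week, so no overtime.
averaged_rule = sum(weeks) * wage

print(weekly_rule)    # 1700.0  (includes a $100 overtime premium)
print(averaged_rule)  # 1600.0  (the premium disappears)
```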

Another supposedly family-friendly proposal would allow workers to choose to take their overtime compensation as paid time off, rather than in dollars and cents. Certainly, any change that would reduce workloads sounds enticing. But as the Pew Research Center reports, more than 40% of workers can’t afford to, and don’t, take all their paid time off now, so this measure could function as yet one more way to reduce the overtime costs of employers.

In contrast to the Heritage Foundation’s scheme, Senator Bernie Sanders has proposed a genuinely family-friendly workload reduction plan: a gradual diminution of the standard work week from 40 to 32 hours at the same pay. Such proposals have been around (and ridiculed) for decades, but this one is finally receiving serious consideration in places like the New York Times.

In deference to the supposedly fierce spirit of “worker independence,” Project 2025 would also like to see many more workers classified not as employees at all but as independent contractors. And what would such workers gain from that “independence”? Well, as a start, freedom from those pesky minimum wage and overtime compensation regulations, not to speak of the loss of protections like disability insurance. And they’d be “free” to pay the whole tab (15.3% of their income) for their Social Security and Medicare taxes, unlike genuine employees, whose employers pick up half the cost.
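
The payroll-tax difference is easy to put in dollars. The 15.3% combined rate comes from the text; the $50,000 in annual earnings is my own assumption, and the sketch ignores the adjustments and deductions that slightly soften the real self-employment tax bill.

```python
# Illustrative arithmetic, assuming $50,000 in annual earnings.
earnings = 50_000
combined_rate = 0.153  # 12.4% Social Security + 2.9% Medicare

employee_share = earnings * combined_rate / 2   # employer pays the other half
contractor_share = earnings * combined_rate     # "independent" worker pays it all

print(f"Employee pays:   ${employee_share:,.0f}")    # $3,825
print(f"Contractor pays: ${contractor_share:,.0f}")  # $7,650
```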

Young people, too, would acquire more “independence” thanks to Project 2025 — at least if what they want to do is work in more dangerous jobs where they are presently banned. As Berry explains:

“Some young adults show an interest in inherently dangerous jobs. Current rules forbid many young people, even if their family is running the business, from working in such jobs. This results in worker shortages in dangerous fields and often discourages otherwise interested young workers from trying the more dangerous job.”

The operative word here is “adults.” In fact, no laws presently exclude adults from hazardous work based on age. What Berry is talking about is allowing adolescents to perform such labor. Duvan Tomás Pérez, for instance, was a 16-year-old who showed just such an “interest” in an inherently dangerous job: working at a poultry plant in Mississippi, where he died in an industrial accident. The middle schooler, a Guatemalan immigrant who had lived in the United States for six years, was employed illegally by the Mar-Jac Poultry company. If there are “worker shortages in dangerous fields,” it’s because adults don’t want to take the risks. The solution is to make the work less dangerous for everyone, not to hire children to do it.

We’re Gonna Roll the Union Over

Mind you, much to the displeasure of Project 2025 types, this country is experiencing a renaissance of union organizing. Companies that long thought they could avoid unionization, from Amazon to Starbucks, are now the subject of such drives. In my own world of higher education, new unions are popping up and established ones are demonstrating renewed vigor in both private and public universities. As the bumper-sticker puts it, unions are “the folks who brought you the weekend.” They’re the reason we have laws on wages and hours, not to speak of on-the-job protections. So, it should be no surprise that Project 2025 wants to reduce the power of unions in a number of ways, including:

  • Amending the National Labor Relations Act to allow “Employee Involvement Organizations” to supplant unions. Such “worker-management councils” are presently forbidden for good reason. They replace real unions that have the power to bargain for wages and working conditions with toothless pseudo-unions.
  • Ending the use of “card-checks” and requiring elections to certify union representation. At the moment, the law still permits a union to present signed union-support cards from employees to the National Labor Relations Board and the employer. If both entities agree, the union wins legal recognition. The proposed change would make it significantly harder for unions to get certified, especially because cards can be collected without the employer’s knowledge, whereas a public election with a long lead time gives the employer ample scope for anti-union organizing activities, both legal and otherwise.
  • Allowing individual states to opt out of labor protections granted under the Fair Labor Standards Act and the National Labor Relations Act.

The measures covered here are, believe it or not, just the highlights of that labor chapter of Project 2025. If put into practice, they would be an historically unprecedented dream come true for employers, and a genuine nightmare for working people.

Meanwhile, at the Trumpified and right-wing-dominated Supreme Court, there are signs that some justices are interested in entertaining a case brought by Elon Musk’s SpaceX that could abolish the National Labor Relations Board (NLRB), the federal entity that adjudicates most labor disputes involving federal law. Without the NLRB, legal protections for workers, especially organizing or organized workers, would lose most of their bite. Despite the court’s claim to pay no attention to public opinion, its justices would certainly take note of a resounding defeat of Donald Trump, the Republicans, and Project 2025 at the polls.

A New “Contract on America?”

The last time the right wing was this organized was probably back in 1994, when Newt Gingrich published his “Contract with America.” Some of us were so appalled by its contents that we referred to it as a plan for a gangster hit, a “Contract on America.”

This year, they’re back with a vengeance. All of which is to say that if you work for a living, or if you know and love people who do, there’s a lot on the line in this year’s election. We can’t sit this one out.

Don’t give Trump a 2nd chance to destroy American democracy

Recently my partner and I had brunch with some old comrades, folks I first met in the 1996 fight to stop the state of California from outlawing affirmative action. Sadly, we lost that one and, almost three decades later, we continue to lose affirmative action programs thanks to a Supreme Court rearranged or, more accurately, deranged by one Donald J. Trump.

It was pure joy to hang out with them and remember that political struggle during which, as my partner and I like to say, we taught a generation of young people to ask, “Can you kick in a dollar to help with the campaign?” For a couple of old white lesbians who, in the words of a beloved Catherine Koetter poster, “forgot to have children,” those still-committed organizers and activists are the closest thing to offspring we’ve got. And their kids, including one now in college, who were willing to hang out with their parents’ old buddies, are the closest we’ll ever have to grandchildren.

As people whose lives have long been tangled up in politics will do, we soon started talking about the state of the world: the wars in Ukraine, Gaza, and Sudan; the pain on this country’s border with Mexico; and of course the looming 2024 election campaign. It was then that the college student told us he wouldn’t be voting for Joe Biden—and that none of his friends would either. The president’s initial support of, and later far too-tepid objections to, the genocidal horror transpiring in Gaza were simply too much for him. That Biden has managed to use his executive powers to cancel $138 billion in student debt didn’t outweigh the repugnance he and his friends feel for the president’s largely unquestioning support of Israel’s destruction of that 25-mile strip of land on the Mediterranean Sea. To vote for Biden would be like taking a knife to his conscience. And I do understand.

Vote Your Conscience?

This year, I wonder whether only people who live in California and other dependably “blue” states can afford that kind of conscience. I’m not objecting to voting “uncommitted” in a Democratic Party primary as so many citizens of Michigan and Minnesota have done. If I lived in one of those states, I’d have done the same. In fact, I didn’t vote for Biden in Super Tuesday’s California primary either and, in truth, I wouldn’t even have to vote for him in November, because in this state my vote isn’t needed to ensure his victory, which is essentially guaranteed. But God save the world if voters in Arizona, Nevada, Pennsylvania, or other swing states follow that example.

I’m less sure, however, what I’d do if, like thousands of Arab-American voters in Michigan, I had friends and family in Gaza, the West Bank, or indeed among the millions of Palestinian refugees living in Lebanon or Jordan. Would I be able to mark my ballot for Joe? And if I wouldn’t, then how could I ask anyone else to do so?

In the end I would have to vote for him because, however terrible for that part of the world another four years of the Biden administration might be, a second Trump presidency would be even worse. (Trump’s recent comment about Gaza aimed at Israeli forces couldn’t have been blunter: “You’ve got to finish the problem.”) At least, unlike Trump, Biden isn’t beholden to the Christian Zionists of the evangelical right, who long to gather all the world’s Jews into the state of Israel, as a precondition for the return to Earth of Jesus Christ. (The fate of those Jews afterward is, of course, of little concern to those “Christians.”)

What We’ve Seen Already

At an educators’ conference I attended last month, a panelist discussing what Trump’s re-election would mean for those of us in the teaching profession inadvertently referred to his “first term.” Another panelist gently reminded her that the period from 2017 to 2021 had, in fact, been Trump’s only term and that we need to keep it that way. In preparation for this article, I looked back at some of my writings during that first (and God willing, only) term of his, to remind myself just how bad it was. I was surprised to find that I’d produced almost 30 pieces then about living in Trumpland.

There’s so much to remember about the first Trump term, and so much I’d forgotten. And that’s hardly surprising, given the speed with which, then as now, one unspeakable and previously unimaginable Trumpian horror follows the next. There’s simply no way to keep up. Here’s what I wrote, for instance, about living in Trumpworld in 2018:

There’s speed and then there’s Trump speed: the dizzying, careening way that the president drives the Formula One car of state. Just when we’ve started to adjust to one outrage—say, the ripping of migrant children from their mothers’ arms (a procedure that continues to this day, despite court prohibition)—here comes another down the track. This time it’s the construction in Texas of a tent city to house immigrant children. No, wait. That was the last lap. Now, it’s the mustering of almost 6,000 troops on the border, authorized to use lethal force ‘if they have to’ against people desperately fleeing lethal conditions in their own countries.

And he’s still at it. Not satisfied with labeling migrants as rapists the moment he came down that infamous escalator to enter the presidential campaign in 2015, he’s now comparing them to Hannibal Lecter, the fictional murderer and cannibal in Silence of the Lambs, that horror film about a serial killer who skins his female victims.

Certain of Trump’s greatest hits do still linger in the collective American consciousness. Who could forget his pronouncement that “some fine people on both sides” attended the 2017 Unite the Right march in Charlottesville, Virginia, where counter-protester Heather Heyer was murdered when a white supremacist drove his car into her? (We’re less likely to remember that other moment a couple of years later when the president doubled down, while hailing Confederate leader Robert E. Lee as “a great general,” by explaining that he still stood by his “very fine people” statement.)

Then there was the suggestion made in one of his daily press briefings during the Covid-19 pandemic that, in addition to taking the anti-malarial drug chloroquine with no proven usefulness for Covid-19, sufferers might want to consider injecting bleach into their bodies since it did such a good job of killing the virus on hard surfaces.

You’ve probably forgotten, as I had, that back in the days when he was still a first-time candidate, he was already advocating the commission of genuine war crimes. As I wrote in 2016:

He declared himself ready to truly hit the Islamic State where it hurts. “The other thing with the terrorists,” he told Fox News, “is you have to take out their families, when you get these terrorists, you have to take out their families. They care about their lives, don’t kid yourself. When they say they don’t care about their lives, you have to take out their families.” Because it’s a well-known fact—in Trumpland at least—that nothing makes people less likely to behave violently than murdering their parents and children. And it certainly doesn’t matter, when Trump advocates it, that murder is a crime.

For me, however, some of Trump’s worst crimes were epistemological ones—crimes, that is, against knowledge. By subjecting us all to a firehose of falsehoods, he undermined people’s belief that we can ever know if anything is true. You don’t like things the way you find them? Well, in the immortal words of Kellyanne Conway, Trump’s former campaign manager and senior counselor as president, just turn to “alternative facts.” The intentional distortion of reality is a classic authoritarian trick, designed to convince masses of people that, as Hannah Arendt wrote back in 1951, nothing is true and everything is possible.

Worse than Déjà Vu

“Déjà vu” is French for “already seen” and it describes that sense of experiencing something all over again. We indeed already saw and heard too much that was unnerving, not to say frightening, during the four years of Trump’s presidency. The only thing that kept him from doing even more harm was his chaotic and lazy way of working. His attention span was notoriously short, and he could be easily distracted by any shiny object. Much of his daily schedule was given over to “executive time,” an apparent euphemism for watching cable TV and responding on Twitter to whatever he saw there.

A second Trump term would be very different if the forces gathering around him have anything to say about it. Carlos Lozada of The New York Times has done us an immense favor by reading and digesting all 887 pages of the plan the Heritage Foundation has produced for the next Republican presidency, Mandate for Leadership. That document details the step-by-step process necessary to transform the presidency into something resembling a monarchy, where vestigial versions of the legislative and judicial branches would serve the agenda of a unitary executive, led by an autocratic president and backed by the U.S. military. Given that someone else has done all the work to make him a king, Trump is very likely to adopt some version of that foundation’s plan. As Lozada explains:

There is plenty here that one would expect from a contemporary conservative agenda: calls for lower corporate taxes and against abortion rights; criticism of diversity, equity, and inclusion initiatives and the “climate fanaticism” of the Biden administration; and plans to militarize the southern border and target the “administrative state,” which is depicted here as a powerful and unmanageable federal bureaucracy bent on left-wing social engineering.

The Mandate calls for infusing all aspects of government, including its scientific functions, with “biblical” values and, from the military to the Environmental Protection Agency, excluding any taint of diversity, equity, or inclusion. More disturbing yet is its commitment to consolidating power in the hands of a single executive, or ruler, if you like. Those planners aren’t small-government conservatives like anti-tax activist Grover Norquist who used to explain, “I don’t want to abolish government. I simply want to reduce it to the size where I can drag it into the bathroom and drown it in the bathtub.”

Yes, the Heritage program includes inevitable tax cuts for the wealthy and the like, but, as Lozada observes, “The main conservative promise here is to wield the state as a tool for concentrating power and entrenching ideology.”

If Mandate for Leadership is the theory, then the Heritage Foundation’s Project 2025 is the practice. As The New York Times reports, it’s “a $22 million presidential transition operation that is preparing policies, personnel lists, and transition plans to recommend to any Republican who may win the 2024 election.” Its success depends in large part on replacing tens of thousands of federal civil servants with political appointees loyal to the president. Donald Trump tried this late in his presidency, when he used an executive order to institute a new “schedule” or list of appointees to the civil service, exempting all “career positions in the federal service of a confidential, policy-determining, policymaking, or policy-advocating character” from competitive hiring. Immediately rescinded by President Biden, this “Schedule F” would be reinstated under Project 2025, allowing Trump to replace up to 50,000 career civil servants with his own faithful minions committed to his—or rather Heritage’s—program. (Trump himself doesn’t actually care about “entrenching ideology,” although he’s definitely a fan of “concentrating power” in his own hands.)

But, But Biden?

The news from Gaza seems to grow more dire by the day. Even so, I’ve concluded that we can’t afford to use a vote for Trump, or a refusal to vote for anyone, as a way to punish Joe Biden. His toleration of genocide is unforgivable; his atavistic American instinct to offer a military response to any challenge is more of the arrogant Cold-War-era stance that was so much a part of his earlier political life. Witness, for example, how his use of missiles to “send a message” to the Houthis in Yemen is only driving them to attack more ships in the Red Sea. (Meanwhile, enemies that can’t be bombed into submission, like climate change and drought, have reduced daily traffic by nearly 40% in an even more important international waterway, the Panama Canal.)

Nor has the United States under Biden stepped back from its general role as the “indispensable” arbiter of events in the Americas, or indeed in any of the 80 or more countries where it continues to have a military presence. I hold no brief for an imperial United States under Biden or anyone else. Nevertheless, I do believe that the world can’t afford another presidency by the man who suggests that he will establish a day-one “dictatorship” in order to “drill, drill, drill.”

Remember, this is the guy who, the last time around, pulled the United States out of the Paris climate accords. Now, the world has just lived through the hottest February on record (something that’s been true of every month since May 2023!), one in which wildfires raged not only in the southern hemisphere, where it is, after all, summer, but in Texas, burning well more than a million acres there.

This is the man who cheered on the government of Jair Bolsonaro in Brazil, as it presided over murderous attacks on the Amazon. This is the man who is still cheering as his Republican Party abandons its support for Ukraine in favor of Vladimir Putin’s Russia. This is the man who called for the assassination of his opponent in 2016, and exacerbated relations with Iran (with reverberations felt to this day) by ordering the drone assassination of Iranian general Qasem Soleimani.

This is the man who, while he fails to understand how NATO actually works, has suggested that the United States will not come to the defense of “delinquent” member nations, but instead “would encourage [the Russians] to do whatever the hell they want” to such countries.

Oh, and lest we forget, this is the man who tried once before to end American democracy. It would be true madness to give him a second chance.

Three ways the United States refuses to play by global rules

In 1963, the summer I turned 11, my mother had a gig evaluating Peace Corps programs in Egypt and Ethiopia. My younger brother and I spent most of that summer in France. We were first in Paris with my mother before she left for North Africa, then with my father and his girlfriend in a tiny town on the Mediterranean. (In the middle of our six-week sojourn there, the girlfriend ran off to marry a Czech she’d met, but that’s another story.)

In Paris, I saw American tourists striding around in their shorts and sandals, cameras slung around their necks, staking out positions in cathedrals and museums. I listened to my mother’s commentary on what she considered their boorishness and insensitivity. In my 11-year-old mind, I tended to agree. I’d already heard the expression “the ugly American” — although I then knew nothing about the prophetic 1958 novel with that title about U.S. diplomatic bumbling in Southeast Asia in the midst of the Cold War — and it seemed to me that those interlopers in France fit the term perfectly.

When I got home, I confided to a friend (whose parents, I learned years later, worked for the CIA) that sometimes, while in Europe, I’d felt ashamed to be an American. “You should never feel that way,” she replied. “This is the best country in the world!”

Indeed, the United States was, then, the leader of what was known as “the free world.” Never mind that, throughout the Cold War, we would actively support dictatorships (in Argentina, Chile, Indonesia, Nicaragua, and El Salvador, among other places) and actually overthrow democratizing governments (in Chile, Guatemala, and Iran, for example). In that era of the G.I. Bill, strong unions, employer-provided healthcare, and general postwar economic dominance, to most of us who were white and within reach of the middle class, the United States probably did look like the best country in the world.

Things do look a bit different today, don’t they? In this century, in many important ways, the United States has become an outlier and, in some cases, even an outlaw. Here are three examples of U.S. behavior that has been literally egregious, three ways in which this country has stood out from the crowd in a sadly malevolent fashion.

Guantánamo, the Forever Prison Camp

In January 2002, the administration of President George W. Bush established an offshore prison camp at the U.S. Naval Base in Guantánamo Bay, Cuba. The idea was to house prisoners taken in what had already been labeled “the Global War on Terror” on a little piece of “U.S.” soil beyond the reach of the American legal system and whatever protections that system might afford anyone inside the country. (If you wonder how the United States had access to a chunk of land on an island nation with which it had the frostiest of relations, including decades of economic sanctions, here’s the story: in 1903, long before Cuba’s 1959 revolution, its government had granted the United States “coaling” rights at Guantánamo, meaning that the U.S. Navy could establish a base there to refuel its ships. The agreement remained in force in 2002, as it does today.)

In the years that followed, Guantánamo became the site of the torture and even murder of individuals the U.S. took prisoner in Afghanistan, Iraq, and other countries ranging from Pakistan to Mauritania. Having written for more than 20 years about such U.S. torture programs that began in October 2001, I find today that I can’t bring myself to chronicle one more time all the horrors that went on at Guantánamo or at CIA “black sites” in countries ranging from Thailand to Poland, or at Bagram Air Base in Afghanistan, or indeed at the Abu Ghraib prison and Camp NAMA (whose motto was: “No blood, no foul”) in Iraq. If you don’t remember, just go ahead and google those places. I’ll wait.

Thirty men remain at Guantánamo today. Some have never been tried. Some have never even been charged with a crime. Their continued detention and torture, including, as recently as 2014, punitive, brutal forced feeding for hunger strikers, confirmed the status of the United States as a global scofflaw. To this day, keeping Guantánamo open displays this country’s contempt for international law, including the Geneva Conventions and the United Nations Convention against Torture. It also displays contempt for our own legal system, including the Constitution’s “supremacy” clause which makes any ratified international treaty like the Convention against Torture “the supreme law of the land.”

In February 2023, Fionnuala Ní Aoláin, the U.N.’s Special Rapporteur on the promotion and protection of human rights and fundamental freedoms while countering terrorism, became the first representative of the United Nations ever permitted to visit Guantánamo. She was horrified by what she found there, telling the Guardian that the U.S. has “a responsibility to redress the harms it inflicted on its Muslim torture victims. Existing medical treatment, both at the prison camp in Cuba and for detainees released to other countries, was inadequate to deal with multiple problems such as traumatic brain injuries, permanent disabilities, sleep disorders, flashbacks, and untreated post-traumatic stress disorder.”

“These men,” she added, “are all survivors of torture, a unique crime under international law, and in urgent need of care. Torture breaks a person, it is intended to render them helpless and powerless so that they cease to function psychologically, and in my conversations both with current and former detainees I observed the harms it caused.”

The lawyer for one tortured prisoner, Ammar al-Baluchi, reports that al-Baluchi “suffers from traumatic brain injury from having been subjected to ‘walling’ where his head was smashed repeatedly against the wall.” He has entered a deepening cognitive decline, whose “symptoms include headaches, dizziness, difficulty thinking and performing simple tasks.” He cannot sleep for more than two hours at a time, “having been sleep-deprived as a torture technique.”

The United States, Ní Aoláin insists, must provide rehabilitative care for the men it has broken. I have my doubts, however, about the curative powers of any treatment administered by Americans, even civilian psychologists. After all, two of them personally designed and implemented the CIA’s torture program.

The United States should indeed foot the bill for treating not only the 30 men who remain in Guantánamo, but others who have been released and continue to suffer the long-term effects of torture. And of course, it goes without saying that the Biden administration should finally close that illegal prison camp — although that’s not likely to happen. Apparently it’s easier to end an entire war than decide what to do with 30 prisoners.

Unlawful Weapons

The United States is an outlier in another arena as well: the production and deployment of arms widely recognized as presenting an immediate or future danger to non-combatants. The U.S. has steadfastly resisted joining conventions outlawing such weaponry, including cluster bombs (or more euphemistically, “cluster munitions”) and landmines.

In fact, the United States deployed cluster bombs in its wars in Iraq and Afghanistan. (In the previous century, it dropped 270 million of them in Laos alone while fighting the Vietnam War.) Ironically — one might even say hypocritically — the U.S. joined 146 other countries in condemning Syrian and Russian use of the same weapons in the Syrian civil war. Indeed, former White House press secretary Jen Psaki told reporters that if Russia were using them in Ukraine (as, in fact, it is), that would constitute a “war crime.”

Now the U.S. has sent cluster bombs to Ukraine, supposedly to fill a crucial gap in the supply of artillery shells. Mind you, it’s not that the United States doesn’t have enough conventional artillery shells to resupply Ukraine. The problem is that sending them there would leave this country unprepared to fight two simultaneous (and hypothetical) major wars as envisioned in what the Pentagon likes to think of as its readiness doctrine.

What are cluster munitions? They are artillery shells packed with many individual bomblets, or “submunitions.” When one is fired, from up to 20 miles away, it spreads as many as 90 separate bomblets over a wide area, making it an excellent way to kill a lot of enemy soldiers with a single shot.

What places these weapons off-limits for most nations is that not all the bomblets explode. Some can stay where they fell for years, even decades, until, as a New York Times editorial put it, “somebody — often, a child spotting a brightly colored, battery-size doodad on the ground — accidentally sets it off.” They can, in other words, lie in wait long after a war is over, sowing farmland and forest with deadly booby traps. That’s why then-Secretary General of the United Nations Ban Ki-moon once spoke of “the world’s collective revulsion at these abhorrent weapons.” That’s why 123 countries have signed the 2008 Convention on Cluster Munitions. Among the holdouts, however, are Russia, Ukraine, and the United States.

According to National Security Advisor Jake Sullivan, the cluster bombs the U.S. has now sent to Ukraine each contain 88 bomblets, with a failure rate the Pentagon puts at under 2.5%. (Other sources, however, suggest that it could be 14% or higher.) This means that for every cluster shell fired, at least two submunitions are likely to be duds. We have no idea how many of these weapons the U.S. is supplying, but a Pentagon spokesman said in a briefing that there are “hundreds of thousands available.” It doesn’t take much mathematical imagination to realize that they present a real future danger to Ukrainian civilians. Nor is it terribly comforting when Sullivan assures the world that the Ukrainian government is “motivated” to minimize risk to civilians as the munitions are deployed, because “these are their citizens that they’re protecting.”
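To make that arithmetic explicit (a rough back-of-the-envelope calculation using only the figures cited above, and assuming every bomblet in a shell disperses):

$$88 \times 0.025 \approx 2.2 \ \text{duds per shell (the Pentagon’s figure)}, \qquad 88 \times 0.14 \approx 12.3 \ \text{duds per shell (the higher outside estimates)}.$$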

I for one am not eager to leave such cost-benefit risk calculations in the hands of any government fighting for its survival. That’s precisely why international laws against indiscriminate weapons exist — to prevent governments from having to make such calculations in the heat of battle.

Cluster bombs are only a subset of the weapons that leave behind “explosive remnants of war.” Landmines are another. Like Russia, the United States is not found among the 164 countries that have signed the 1999 Ottawa Convention, which required signatories to stop producing landmines, destroy their existing stockpiles, and clear their own territories of mines.

Ironically, the U.S. routinely donates money to pay for mine clearance around the world, which is certainly a good thing, given the legacy it left, for example, in Vietnam. According to the New York Times in 2018:

“Since the war there ended in 1975, at least 40,000 Vietnamese are believed to have been killed and another 60,000 wounded by American land mines, artillery shells, cluster bombs and other ordnance that failed to detonate back then. They later exploded when handled by scrap-metal scavengers and unsuspecting children.”

Hot Enough for Ya?

As I write this piece, about one-third of this country’s population is living under heat alerts. That’s 110 million people. A heatwave is baking Europe, where 16 Italian cities are under warnings, and Greece has closed the Acropolis to prevent tourists from dying of heat stroke. This summer looks to be worse in Europe than even last year’s record-breaker when heat killed more than 60,000 people. In the U.S., too, heat is by far the greatest weather-related killer. Makes you wonder why Texas Governor Greg Abbott signed a bill eliminating required water breaks for outside workers, just as the latest heat wave was due to roll in.

Meanwhile, New York’s Hudson Valley and parts of Vermont, including its capital Montpelier, were inundated this past week by a once-in-a-hundred-year storm, while in South Korea, workers raced to rescue people whose cars were trapped inside the completely submerged Cheongju tunnel after a torrential monsoon rainfall. Korea, along with much of Asia, expects such rains during the summer, but this year’s — like so many other weather statistics — have been literally off the charts. Journalists have finally experienced a sea change (not unlike the extraordinary change in surface water temperatures in the Atlantic Ocean). Gone are the tepid suggestions that climate change “may play a part” in causing extreme weather events. Reporters around the world now simply assume that’s our reality.

When it comes to confronting the climate emergency, though, the United States has once again been bringing up the rear. As far back as 1992, at the United Nations Earth Summit in Rio de Janeiro, President George H.W. Bush resisted setting any caps on carbon-dioxide emissions. As the New York Times reported then, “Showing a personal interest on the subject, he singlehandedly forced negotiators to excise from the global warming treaty any reference to deadlines for capping emissions of pollutants.” And even then, Washington was resisting the efforts of poorer countries to wring some money from us to help defray the costs of their own environmental efforts.

Some things don’t change all that much. Although President Biden reversed Donald Trump’s move to pull the U.S. out of the Paris climate accords, his own climate record has been a combination of two steps forward (the green energy transition funding found in the 2022 Inflation Reduction Act, for example) and a big step back (greenlighting the ConocoPhillips Willow oil drilling project on federal land in Alaska’s north slope, not to speak of Senator Joe Manchin’s pride and joy, the $6.6 billion Mountain Valley Pipeline for natural gas).

And when it comes to remediating the damage our emissions have done to poorer countries around the world, this country is still a day late and billions of dollars short. In fact, on July 13th, climate envoy John Kerry told a congressional hearing that “under no circumstances” would the United States pay reparations to developing countries suffering the devastating effects of climate change. Although at the U.N.’s COP 27 conference in November 2022, the U.S. did (at least in principle) support the creation of a fund to help poorer countries ameliorate the effects of climate change, as Reuters reported, “the deal did not spell out who would pay into the fund or how money would be disbursed.”

Welcome to Solastalgia

I learned a new word recently, solastalgia. It actually is a new word, created in 2005 by Australian philosopher Glenn Albrecht to describe “the distress that is produced by environmental change impacting on people while they are directly connected to their home environment.” Albrecht’s focus was on Australian rural indigenous communities with centuries of attachment to their particular places, but I think the concept can be extended, at least metaphorically, to the rest of us whose lives are now being affected by the painful presences (and absences) brought on by environmental and climate change: the presence of unprecedented heat, fire, noise, and light; the presence of deadly rain and flooding; and the growing absence of ice at the Earth’s poles or on its mountains. In my own life, among other things, it’s the loss of fireflies and the almost infinite sadness of rarely seeing more than a few faint stars.

Of course, the “best country in the world” wasn’t the only nation involved in creating the horrors I’ve been describing. And the ordinary people who live in this country are not to blame for them. Still, as beneficiaries of this nation’s bounty — its beauty, its aspirations, its profoundly injured but still breathing democracy — we are, as the philosopher Iris Marion Young insisted, responsible for them. It will take organized, collective political action, but there is still time to bring our outlaw country back into what indeed should be a united community of nations confronting the looming horrors on this planet. Or so I hope and believe.

Home-grown fascists are beginning to say the quiet part out loud

One day when I was about six, I was walking with my dad in New York City. We noticed that someone had stuck little folded squares of paper under the windshield wipers of the cars parked on the street beside us. My father picked one up and read it. I saw his face grow dark with anger.

“What is it, Papa?”

“It’s a message from people who think that all Jews should be killed.”

This would have been in the late 1950s, a time when the Nazi extermination of millions of Jews in Europe was still fresh in the American consciousness. Not, you might have thought, a good season for sowing murderous antisemitism in lower Manhattan. Already aware that, being the daughter of a Jewish father and gentile mother, I was myself a demi-semite, I was worried. I knew that these people wanted to kill my father, but with a typical child-centered focus, I really wanted to know whether the gentile half of my heredity would protect me in the event of a new Holocaust.

“Would they kill me, too?” I asked.

Yes, he told me, they would if they could. But he then reassured me that such people would never actually have the power to do what they wanted to. It couldn’t happen here.

I must admit that I’m grateful my father died before Donald Trump became president, before tiki-torch-bearing Nazi wannabes seeking to “Unite the Right” marched through Charlottesville, Virginia, in 2017, chanting “Jews will not replace us!” before one of them drove his car into a crowd of counterdemonstrators, killing Heather Heyer, and before President Trump responded to the whole event by declaring that “you also had people that were very fine people, on both sides.”

Are Queer People the New Jews?

Maybe the grubby little group behind the tracts my father and I saw that day in New York would have let me live. Maybe not. In those days home-grown fascists were rare and so didn’t have that kind of power.

Now, however, there’s a new extermination campaign stalking this country that would definitely include me among its targets: the right-wing Republican crusade against “sexual predators” and “groomers,” by which they mean LGBTQI+ people. (I’m going to keep things simple here by just writing “LGBT” or “queer” to indicate this varied collection of Americans who are presently a prime target of the right wing in this country.)

You may think “extermination campaign” is an extreme way to describe the set of public pronouncements, laws, and regulations addressing the existence of queer people here. Sadly, I disagree. Ambitious would-be Republican presidential candidates across the country, from Florida Governor Ron DeSantis to the less-known governor of North Dakota, Doug Burgum, are using anti-queer legislation to bolster their primary campaigns. For Florida, it started in July 2022 with DeSantis’s Parental Rights in Education Act (better known as his “Don’t Say Gay” law), which mandated that, in the state’s public schools,

“Classroom instruction by school personnel or third parties on sexual orientation or gender identity may not occur in kindergarten through grade 3 or in a manner that is not age-appropriate or developmentally appropriate for students in accordance with state standards.”

In April 2023, DeSantis doubled down, signing a new law that extended the ban all the way up through high school. Florida teachers at every level now run the very real risk of losing their jobs and credentials if they violate the new law. And queer kids, who are already at elevated risk of depression and suicide, have been deprived of the kind of affirming space that, research shows, greatly reduces those possibilities.

Is Florida an outlier? Not really. Other states have followed its lead in restricting mentions of sexual orientation or gender identity in their public schools. By February of this year, 42 such bills had been introduced in a total of 22 states, creating a wave of LGBT refugees.

But the attacks against queer people go well beyond banning any discussion of gayness in public schools. We’re also witnessing a national campaign against trans and non-binary people that, in effect, aims to eliminate such human beings altogether, whether by denying their very existence or denying them the medical care they need. This campaign began with a focus on trans youth but has since widened to include trans and non-binary people of all ages.

Misgendering: As of 2023, seven states have laws allowing (or requiring) public school teachers to refuse to use a student’s preferred pronouns if those pronouns don’t match the student’s official sex. This behavior is called “misgendering” and it’s more than a violation of common courtesy. It’s a denial of another person’s being, their actual existence, and can have a lethal effect. Such repudiation of trans and non-binary young people significantly increases their chances of committing suicide.

It also increases the chances that their non-queer peers will come to view them with the kind of disrespect and even contempt that could also prove lethal and certainly increases their chances of becoming targets of violence. In 2022, for example, CBS News reported that “the number of trans people who were murdered in the U.S. nearly doubled between 2017 and 2021.” It’s no accident that this increase correlates with an increase in high-profile political and legal attacks on trans people. Sadly, but not surprisingly, race hatred has also played a role in many of these deaths. While Blacks represent about 13% of trans and non-binary people, they accounted for almost three-quarters of those murder victims.

Medical care: Laws allowing or even requiring misgendering in classrooms are, however, only the beginning. Next up? Denying trans kids, and ultimately trans adults, medical care. As of June 1st of this year, according to the national LGBT rights organization Human Rights Campaign, 20 states already ban gender-affirmative medical care for trans youth up to age 18. Another seven states now have such bans under consideration.

What is “gender affirmative” medical care? According to the World Health Organization, it “can include any single or combination of a number of social, psychological, behavioral, or medical (including hormonal treatment or surgery) interventions designed to support and affirm an individual’s gender identity.” In other words, it’s the kind of attention needed by people whose gender identity does not align in some way with the sex they were assigned at birth.

What does it mean to deprive a trans person of such care? It can, in fact, prove to be a death sentence.

It may be difficult to imagine this if you yourself aren’t living with gender dysphoria (a constant disorienting and debilitating alienation from one’s own body). What studies show is that proper healthcare reduces suicidal thoughts and attempts, along with other kinds of psychological distress. Furthermore, people who begin to receive such care in adolescence are less likely to be depressed, suicidal, or involved in harmful drug use later in life. As Dr. Deanna Adkins, director of the Duke Child and Adolescent Gender Care Clinic at Duke University Hospital, notes, young people who receive the gender-affirming care they need “are happier, less depressed, and less anxious. Their schoolwork often improves, their safety often improves.” And, she says, “Saving their lives is a big deal.”

Denial of life-saving care may start with young people. But the real future right-wing agenda is to deny such health care to everyone who needs it, whatever their ages. In April 2023, the New York Times reported that Florida and six other states had already banned Medicaid coverage for gender-affirming care. Missouri has simply banned most such care outright, no matter who’s paying for it.

And the attacks on queer people just keep coming. In May 2023, the Human Rights Campaign listed anti-queer bills introduced and passed in this year alone:

“• Over 520 anti-LGBTQ+ bills have been introduced in state legislatures, a record;
“• Over 220 bills specifically target transgender and non-binary people, also a record; and
“• A record 74 anti-LGBTQ laws have been enacted so far this year, including:
“• Laws banning gender affirming care for transgender youth: 16
“• Laws requiring or allowing misgendering of transgender students: 7
“• Laws targeting drag performances: 2
“• Laws creating a license to discriminate: 3
“• Laws censoring school curricula, including books: 13”

We’re not paranoid. They really do want us to disappear.

Anti-Gay Campaigns in Africa: Made in the USA

Though right-wing Christians are only now starting to say the quiet part out loud in this country, they’ve been far less careful in Africa for decades.

It’s not all that uncommon today for right-wing Christians in the United States to publicly demand that LGBT people be put to death. As recently as Pride month (June) of last year, in a sermon that went viral on TikTok, Pastor Joe Jones of Shield of Faith Baptist Church in Boise, Idaho, called for all gay people to be executed. Local NBC and CBS TV stations, along with some national affiliates, saw fit to amplify Jones’s demand to “put them to death. Put all queers to death” by interviewing him in prime time.

In keeping with right-wing propaganda that treats queer people as child predators, Jones sees killing gays as the key to preventing the sexual abuse of children. “When they die,” he said, “that stops the pedophilia. It’s a very, very simple process.” (The reality is that most sexual abuse of children involves male perpetrators and girl victims and happens inside families.)

Though American “Christians” like Jones may be years away, if ever, from instituting the death penalty for queer people here, they have already been far more successful in Africa. On May 29th, Ugandan president Yoweri Museveni signed perhaps the world’s harshest anti-LGBT law, criminalizing all homosexual activity, providing the death penalty for “serial offenders,” and according to the Reuters news agency, for the “transmission of a terminal illness like HIV/AIDS through gay sex.” It also “decrees a 20-year sentence for ‘promoting’ homosexuality.”

While Uganda’s new anti-gay law may be the most extreme on the continent, more than 30 other African countries already outlaw homosexuality to varying degrees.

It’s a little-known fact that right-wing and Christian nationalist churches from the United States have played a major role in formulating and promoting such laws. Since at least the early 2000s, those churches have poured millions of dollars into anti-gay organizing in Africa. According to Open Democracy, more than 20 U.S. evangelical groups have been involved in efforts to criminalize homosexuality there:

“The Fellowship Foundation, a secretive U.S. religious group whose Ugandan associate, David Bahati, wrote Uganda’s infamous ‘Kill the Gays’ bill, is the biggest spender in Africa. Between 2008 and 2018, this group sent more than $20m to Uganda alone.”

Such groups often employ the language of anticolonialism to advance their cause, treating homosexuality as a “western” import to Africa. Despite such rhetoric, however, quite a few of them are actually motivated by racist as well as anti-gay beliefs. “Of the groups that are active in Africa,” says Open Democracy, “ten are members of the World Congress of Families (WCF), which has been linked to white supremacists in the U.S. and Europe.”

Is MAGA Really Fascism? And Does It Matter?

Back in the late 1980s, I published an article entitled “What Is Fascism — And Why Do Women Need to Know?” in Lesbian Contradiction, a paper I used to edit with three other women. It was at the height of the presidency of Ronald Reagan and I was already worried about dangerous currents in the Republican party, ones that today have swelled into a full-scale riptide to the right. There’s a lot that’s dated in the piece, but the definition I offered for that much-used (and misused) bit of political terminology still stands:

“The term itself was invented by Benito Mussolini, the premier of Italy from 1922 to 1945, and refers to the ‘fasces,’ the bundle of rods which symbolized the power of the Roman emperors. Today, I would define fascism as an ideology, movement, or government with several identifying characteristics:
“• Authoritarianism and a fanatical respect for leaders. Fascism is explicitly anti-democratic. It emerges in times of social flux or instability and of chaotic and worsening economic situations.
“• Subordination of the individual to the state or to the ‘race.’ This subordination often has a spiritual implication: people are offered an opportunity to transcend their own sense of insignificance through participation in a powerful movement of the chosen.
“• Appeal to a mythical imperial glory of the past. That past may be quite ancient, as in Mussolini’s evocations of the Roman Empire. Or it might be as recent as the United States of the 1950s.
“• Biological determinism. Fascism involves a belief in absolute biological differences between the sexes and among different races.
“• Genuine popularity. The scariest thing to me about real fascism is that it has always been a truly popular movement. Even when it is a relatively minor force, fascism can be a mass movement without being a majority movement.”

“Having laid out these basic elements,” I added that one “real strength of fascism lies in its extraordinary ideological elasticity,” which allows it to embrace a wide variety of economic positions from libertarian to socialist and approaches to foreign policy that range from isolationism to imperialism. I think this, too, remains true today.

What I failed to emphasize then — perhaps because I thought it went without saying (but it certainly needs to be said today) — is that fascism is almost by definition deadly. It needs enemies on whom it can focus the steaming rage of its adherents and it is quite content for that rage to lead to literal extermination campaigns.

The creation of such enemies invariably involves a process of rhetorical dehumanization. In fascist propaganda, target groups cease to be actual people, becoming instead vermin, viruses, human garbage, communists, Marxists, terrorists, or in the case of the present attacks on LGBT people, pedophiles and groomers. As fascist movements develop, they bring underground streams of hatred into the light of “legitimate” political discourse.

All those decades ago, I suggested that the Christian fundamentalists represented an incipient fascist force. I think it’s fair to say that today’s Make America Great Again crew has inherited that mantle, successfully incorporating right-wing Christianity into a larger proto-fascist movement. All the elements of classic fascism now lurk there: adulation of the leader, subordination of the individual to the larger movement, an appeal to mythical past glories, a not-so-subtle embrace of white supremacy, and discomfort with anything or anyone threatening the “natural” order of men and women. You have only to watch a video of a Trump rally to see that his is a mass (even if not a majority) movement.

Why should it matter whether Donald Trump’s MAGA movement and the Republican Party he’s largely taken over represent a kind of fascism? The answer: because the logic of fascism leads so inexorably to the politics of extermination. Describing his MAGA movement as fascism makes it easier to recognize the existential threat it truly represents — not only to a democratic society but to specific groups of human beings within it.

I know it may sound alarmist, but I think it’s true: proto-fascist forces in this country have shown that they are increasingly willing to exterminate queer people, if that’s what it takes to gain and hold onto power. If I’m right, that means all Americans, queer or not, now face an existential threat.

For those who don’t happen to fall into one of MAGA’s target groups, let me close by paraphrasing Donald Trump: in the end, they’re coming after you. We’re just standing in the way.

A deal with the Devil: How the 'non-profit industrial complex' muddies the tax system

Rebecca Gordon: Do You Really Want That Tax Deduction?

Today’s remarkable tax-time look at what TomDispatch regular Rebecca Gordon calls “the non-profit industrial complex” reminded me of just how lucky I’ve been running this website all these years. As a start, I’ve had you, the readers of TomDispatch, who have so generously donated to this site to keep it going independently of much else. And reading through Gordon’s piece, my own luck once again came to mind. Let me just quote here from part of a fund-raising letter I wrote in December 2022:

Once upon a time, only a few years after I started TomDispatch, I met a man named Patrick Lannan. I was invited to Santa Fe where the Lannan Foundation was located because Eduardo Galeano, a wonderful Uruguayan author I’d worked with in the 1980s (when I was an editor at Pantheon Books) — I had published his classic Memory of Fire trilogy — had been invited to give a talk there. Until then, I’d never met him, so when the Lannan Foundation invited me, despite the long trip from New York, it was something of a thrill. And in the process, I also met Patrick.
To make a long story short, he took a liking to TomDispatch and, at some point so many years ago, promised to support me in my work for the rest of his life. Patrick was, I have to say, truly a man of his word. I got grant payments every year from his foundation that covered my own salary and so I never had to take a cent of the money you readers contributed to TD. That went instead to the authors writing for the site and the wonderful crew who have worked with me all these years. Unfortunately, as happens to all of us sooner or later, Patrick died in July at 83. Now, his foundation is winding down and, partway into next year, TD will stop receiving those payments.

And that’s where this old guy is today. But think about it as you read Gordon’s piece. In all these years, the crucial foundation I dealt with never once tried to reshape or redirect TomDispatch in any way. Gordon reminds me of what a lucky guy I was, foundationally speaking. Now, check her piece out and, if the mood strikes you, visit the TD donation page and do your damnedest! Tom

A Deal With the Devil: Non-Profit Status and Political Action

We’ve just passed through tax time again. (Unless, like me, you live in one of several states ravaged by recent extreme weather events brought on by climate change. In that case, you can wait until October.) It’s also that moment when the War Resisters League — slogan: “If you work for peace, stop paying for war” — publishes its invaluable annual “Where Your Income Tax Money Really Goes” pie chart and publicizes a series of Tax Day events nationwide.

For many of the rest of us, it’s time to tally up the charitable donations we made to tax-deductible organizations in 2022. Time to pat ourselves on the back for being clever and generous enough to “do well by doing good,” right? Time, perhaps, to wonder why, even when we give to organizations seeking radical change, the IRS still rewards us with a tax deduction. Do the feds really support organized opposition to, for example, the military-industrial complex? Or is there more to the story of what my students sometimes refer to as the “nonprofit-industrial complex”?

What’s That Tax Deduction Really Worth?

For many decades, people who give away money have been schooled to seek a tax deduction in return. We’re encouraged to be suspicious of organizations that don’t qualify under section 501(c)(3) of the U.S. Tax Code, which offers a bonus for our generosity. We’re told that the federal government will bless any organization that’s really doing something useful with the magic wand of that coveted tax-deductible status.

Can’t charitable donations save us thousands of tax dollars? Doesn’t that make insisting on such a tax deduction the grown-up thing to do?

Not, as it turns out, for most people. This belief that charitable write-offs also pay off is based on a misunderstanding about how such tax deductions work. Suppose you give a qualifying charity $100. That will reduce your tax bill by the same amount, right? Alas, no. That $100 gift will reduce the amount of your income that you pay taxes on by half the amount of your gift, or $50. This means that if you, like most working folks, pay federal taxes at an effective rate of less than 15%, the amount you save on taxes by giving $100 is less than 15% of $50, or a whopping $7.50 — the price of a couple of fancy coffee drinks.

But giving to get a tax deduction is even less financially advantageous than that. Suppose you’re part of a married couple that earned $120,000 in 2022. Lucky you. If you were taxed for every penny of that amount, your federal income tax bill would be $17,634, an effective tax rate of 14.7%. (Look here to see how it works.)

What if you decide to tithe, donating 10% of that $120,000, or $12,000, to qualifying charities? Such giving would reduce your taxable income by $6,000, bringing it down to $114,000. Your federal income tax bill would then come down to $16,314, saving you about $1,320. Not bad, eh?

But here’s the kicker: Suppose instead that you decide not to “itemize” — that is, list all your contributions (and other expenditures like that great middle-class welfare plan, the mortgage interest tax deduction)? Instead, you opt for the “standard deduction.” For a married couple filing together, that will reduce your taxable income by $25,900, bringing that same tax bill to a mere $11,936, saving you about $5,700 in taxes without your giving away a penny. To get the same tax write-off by making contributions, you’d have to donate twice the standard deduction, or $51,800.
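For readers who want to check these figures, here is a minimal sketch in Python (not tax advice). It hard-codes the published 2022 federal income tax brackets for a married couple filing jointly and simply reproduces the numbers above; the line that treats the $12,000 tithe as trimming $6,000 from taxable income follows the essay’s own example.

    # A minimal sketch (not tax advice): reproduce the essay's figures using the
    # published 2022 federal income tax brackets for a married couple filing jointly.
    BRACKETS_2022_MFJ = [  # (upper bound of bracket, marginal rate)
        (20_550, 0.10),
        (83_550, 0.12),
        (178_150, 0.22),
        (340_100, 0.24),
        (431_900, 0.32),
        (647_850, 0.35),
        (float("inf"), 0.37),
    ]

    def federal_tax(taxable_income: float) -> float:
        """Tax owed on a given taxable income under the 2022 MFJ brackets."""
        tax, lower = 0.0, 0.0
        for upper, rate in BRACKETS_2022_MFJ:
            if taxable_income <= lower:
                break
            tax += (min(taxable_income, upper) - lower) * rate
            lower = upper
        return tax

    income = 120_000
    print(federal_tax(income))           # ~$17,634: taxed on every penny
    print(federal_tax(income - 6_000))   # ~$16,314: after the $12,000 tithe, per the essay's example
    print(federal_tax(income - 25_900))  # ~$11,936: taking the 2022 standard deduction instead

Run it and the point jumps out: for this couple, the standard deduction alone is worth far more at tax time than even a generous tithe.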

So am I suggesting that we shouldn’t give money to non-profits, because we often don’t really benefit from the tax deduction? Absolutely not. I’m saying that when we donate, we shouldn’t do it because of the tax deduction. We should do it, if we can, because it supports activities crucial to our own long-term survival and even flourishing, and to that of so many other people. This is true, whether you’re helping a relative pay the rent this month or contributing to a candidate for the Wisconsin Supreme Court. Neither of those gifts will garner you a penny in tax deductions, but one will keep someone you love off the street and the other would have helped ensure that women in Wisconsin could still get an abortion when they needed one.

Who loses out when we refuse to give without a tax deduction? To begin with, we do, because we’ve placed an artificial limit on the kinds of campaigns, organizations, and people we allow ourselves to donate to. We’ve automatically excluded, for instance, gifts to political parties and candidates. No organization offering tax deductions can support or oppose any candidate for elected office. Of course, elected officials aren’t the only people in a position to affect our lives, but four years under Donald Trump and a couple under Joe Biden should have reminded us that they can do a whole lot of harm, or substantial good.

Making Sense of the Foundation-Non-Profit Complex

The current system of tax deductions creates other losers as well. The halo that surrounds 501(c)(3) status also constrains recipient non-profit organizations. Those that agree to the IRS rules are making a not-always advantageous deal with the devil. That tax-deductible status comes with some hefty limits:

  • No electioneering (campaigning for or against candidates)
  • No activities outside the “charitable and educational purpose” described in the organization’s original application for such status
  • Organizations may not be created for the purpose of violating laws, meaning that advocating for civil disobedience, for example, or tax resistance can’t be part of an organization’s official purpose.

Failure to comply with the rules can get an organization’s tax-exempt status yanked. One result: organizations begin to censor themselves, sometimes restricting their activities even more than the law requires. For example, while 501(c)(3)s aren’t allowed to get involved in election campaigns for specific candidates, depending on their total annual income, they are allowed to use as much as 20% of their spending each year to influence legislation or government policies.

That means they can work directly for or against ballot initiatives and spend money to lobby government officials. They can, for example, employ paid lobbyists or rent buses to bring people to a state capitol or Congress to talk to their representatives about a particular issue. Many smaller non-profits may not even know this and so may limit their scope of action even more severely than the law requires for fear of losing that precious status.

The third restriction (no illegal purposes) has ramifications for organizations that may want to fund or be involved with actions that are nonviolent but illegal, like certain kinds of civil disobedience. In 1975, for instance, the IRS denied 501(c)(3) status to an antiwar organization whose stated charitable purpose was training people to participate in nonviolent civil disobedience. Does that mean that no 501(c)(3) outfit can engage in civil disobedience? No, but it does mean that your entire stated charitable purpose cannot be law-breaking. Fear of losing their status, whether well-founded or not, keeps many organizations from even considering participation in entire realms of political action.

Given these restrictions, why would organizations seeking radical change to existing racial, gender, or economic structures want to go the 501(c)(3) route? What makes that status so valuable for a non-profit organization? We’ve already seen that individual donors shy away from giving when they don’t get a tax deduction. But for many organizations there’s an even more important source of funds that also requires the 501(c)(3) tag: foundation grants. Although individuals give far more money overall than foundations, many organizations depend on large chunks of money from foundations, which in most cases will only make grants to 501(c)(3) groups.

But accepting foundation funding is another Faustian bargain for several reasons:

  • Foundation funding can deform your program: It can entice you to change your focus and your activities to attract grant money. You’re no longer choosing your own priorities. Instead, you find yourself tailoring what you do to what foundations want to support.

    Say yours is a racial-justice organization working for serious police reform. Funding for your area of work is scarce, but a whole tranche of funding for after-school youth programs suddenly becomes available. Now, there may even be a place for such programs addressing young people in your overall political strategy. But if you need the money to keep the doors open, you may be tempted to refocus large parts of your program, not because you think after-school youth activities are the best vehicle for police reform, but because that’s where the money is.
  • Foundations can be fickle: They are easily distracted by the new and shiny. One moment, they may be all about “sustainable communities,” which might be a perfect fit for your organization’s efforts to address the problems of renters in your city. So you spend three years building the coalitions and doing the public contact work necessary to put a rent-stabilization ordinance on the ballot. At that very moment, however, the money dries up, because funders are now focusing on public transit instead. Just when you most need the support to help win the next election, your benefactors have wandered off in pursuit of something different.
  • Foundations can rearrange an entire non-profit “ecosystem”: They may, for example, look at organizations in various states working on racial justice in public education and decide that such a movement needs a national “collaboration” (a foundational buzzword). So a coordinating council is created and funded, requiring that individual organizations participate as a condition for receiving grant money. That means they’ll have to divert staff time for planning for national meetings, Zoom calls, and conferences. Such a coalition could, of course, supercharge the work of individual organizations, but it might just as easily distract them from crucial local work while involving them in controversies they’d rather avoid.
  • Foundations have no incentive to support fundamental economic change: Where do foundations come from? The majority, even in the liberal and progressive world, are private, often funded by a single family (think “Rockefeller”) or company (think “Ford” or “Kellogg”). Creating such philanthropic vehicles provides real tax advantages to such families and companies. It also gives them a very respectable way of influencing public policy. Ultimately, these foundations are not going to support serious challenges to the capitalist system, because that system guarantees the continued wealth of their founders.

Foundations and “Movement Capture”

I teach at a college and, over the years, many of my justice-minded students have expressed a desire to change the world by founding their own non-profit organizations. In their minds and based on the models they see in their worthy community-engaged learning classes, non-profits are the sine qua non of social change and social movements. For most of them, working for social justice means working for a non-profit.

It wasn’t always this way. Before the 1954 U.S. tax code created the 501(c)(3) designation, charities existed, but mostly to provide direct aid to people in need. They certainly weren’t the main political vehicles for social change. In fact, many people organizing to improve their lives were far more likely to turn to unions, not only to improve wages and working conditions, but to address other issues in their lives like housing. It’s probably no accident that the rise of the non-profit sector in the second half of the twentieth century coincided with the reduced power of unions.

It was during the Civil Rights movement of the 1950s and 1960s that modern non-profits and foundations first played a significant role in attempts to bring about structural change in this country, even while also ensuring that such change, in the end, would be limited in nature. The giant among those philanthropic organizations was then the Ford Foundation, the largest collection of charitable wealth the world had ever seen.

A bit late to the civil rights fight, Ford turned its attention in that direction toward the end of the 1960s, after the passage of the Civil Rights and Voting Rights Acts. Under the leadership of McGeorge Bundy, a Cold War liberal and Vietnam War architect, Ford began to make huge contributions in the field of civil rights, mainly to the venerable National Urban League, but also to the NAACP. At the same time, it sought to reduce more militant activities, refusing, for example, to fund Martin Luther King’s planned Poor People’s March on Washington, scheduled for 1968. At a time when the Black Power movement was growing, Ford used its inaugural funding of Black organizations to moderate the influence of more radical voices.

University of Washington scholar Megan Ming Francis has labeled such attempts to use foundation money to control and channel a powerful social movement as “movement capture.” In a 2019 paper, she describes an early example of this process. During the 1920s and 1930s, she writes, funders, including the white-run Garland Foundation, used “their financial leverage to redirect the NAACP’s agenda away from the issue of racial violence to a focus on education at a critical juncture in the civil rights movement.” A decades-long fight for a federal anti-lynching law never succeeded.

Who knows what might have happened had the most powerful civil rights organization of the time kept its focus on lynching and the institutionalized state torture of Black people in the early twentieth century? Maybe police would not still be killing people of color with such impunity in this century.

An extract from the Ford Foundation’s 1967 annual report indicates that the tradition of squelching popular uprisings and redirecting the focus of activists lived on at Ford:

Staff attention has been turned to the more militant civil rights organizations, specifically the Congress Of Racial Equality (CORE) and the Southern Christian Leadership Conference (SCLC). Our interest in both is in seeing whether we may be of help to them in fashioning rational, goal-oriented programs of constructive action. [Emphasis added.]

In other words, use Ford money to get the militants to pivot to activities their “betters” then considered “constructive action.”

What’s a Donor to Do?

The view of funders and non-profits I’ve offered here doesn’t fit either the right-wing understanding of charity as an alternative to government action (former President George H.W. Bush’s “thousand points of light”) or the liberal belief in the power of foundation-funded organizations to change the world. Unfortunately, I have no prescription for how or where to give your money away this tax season, or indeed during the rest of the year. I can only suggest that you do give. And that whatever any of us give, whether it be money, time, or attention, we do it expecting the only return that really matters: taking part in the larger movements for justice and mercy in all their forms.

Rent is unaffordable for more than half the country

Rebecca Gordon: Singing the "Bourgeois Blues"

On my way home from the doctor's office, I regularly passed the New York apartment building where I grew up. I would invariably stop, stare, and feel an overwhelming desire to visit the place I hadn't seen in perhaps 60 years. The street door hadn't changed a bit.

A few months ago, on a whim, I looked for the buzzer to apartment 6D, pressed it, and a woman's voice answered. I promptly said, "Hi, I'm Tom Engelhardt. I grew up in the apartment you now live in and was wondering whether you'd let me see it again." To my amazement — yes, this is New York City! — she promptly buzzed me in and I found myself riding to the 6th floor on the barely updated gate elevator I used as a kid. Ours was, I must tell you, a remarkable apartment. Even to get to it, you had to step out of the elevator, walk down a short corridor out onto a covered but open catwalk (where you can still see the roofs of New York around you), and then down another corridor.

So many years later, I did just that and, when the present resident of 6D let me in, felt overwhelmed with memories as I saw the staircase to the second floor where my old bedroom was, the living room with the remarkable skylight under which my mother drew her caricatures, and even the little porch beyond it. And yes, it sounds, I know, like quite a place, which it was (and remains). Today, fully renovated, it's undoubtedly a wildly expensive co-op or condo, but, in 1946, when my parents got that duplex apartment, just after my father left the Air Force in the wake of World War II, it was rent-controlled and cheap as hell. (Lucky for them since, in the 1950s when I was a kid, they were eternally short on cash.) But no surprise then either. After all, at the time, all of New York was rent-controlled and veterans stood a reasonable chance of getting a fine apartment they could actually afford.

As in much of the country now, rent control in New York is largely a thing of the past as rents here have all too literally gone through the roof, with even studio apartments soaring toward $4,000 a month. As Bloomberg News reports, there's never been a worse time to rent in the big city. An apartment with three or more bedrooms will cost you an average of $9,592 per month. And yes, that's to rent, not buy! Imagine that! Once upon a time, that apartment of mine was something like $190 per month! And with that in mind, let TomDispatch regular Rebecca Gordon fill you in on rent madness in twenty-first-century America. Tom

Don't Try to Find a Home in Washington, D.C. Or Pretty Much Anywhere Else If You're a Renter

In 1937, the American folklorist Alan Lomax invited Louisiana folksinger Huddie Ledbetter (better known as Lead Belly) to record some of his songs for the Library of Congress in Washington, D.C. Lead Belly and his wife Martha searched in vain for a place to spend a few nights nearby. But they were Black and no hotel would give them shelter, nor would any Black landlord let them in, because they were accompanied by Lomax, who was white. A white friend of Lomax's finally agreed to put them up, although his landlord screamed abuse at him and threatened to call the police.

In response to this encounter with D.C.'s Jim Crow laws, Lead Belly wrote a song, "The Bourgeois Blues," recounting his and Martha's humiliation and warning Blacks to avoid the capital if they were looking for a place to live. The chorus goes,

Lord, in a bourgeois town
It's a bourgeois town
I got the bourgeois blues
Gonna spread the news all around

And one verse adds,

I want to tell all the colored people to listen to me
Don’t ever try to get a home in Washington, D.C.
'Cause it's a bourgeois town

Such affronts, Lead Belly sang, occurred in the "home of the brave, land of the free," where he didn't want "to be mistreated by no bourgeoisie."

There are music scholars who believe that Lead Belly didn't really understand what "bourgeois" meant. They claim Lomax, later accused of being a Communist "fellow traveler," provided him with that addition to his vocabulary and he simply understood it as a synonym for "racist." Personally, I think that, in a few deft verses, Lead Belly managed to show how racism and class stratification merged to make it all but impossible to find a home in Washington, as in so many other places in America.

Still a Bourgeois Town

In the late 1970s, after a period of unemployment, my mother got a job for a year in Washington. We'd lived there while I was growing up, but she hadn't been back for almost a decade. She was a white middle-class professional and it was still hell finding an affordable place to rent. (She'd been without a job for more than a year.) It would be some time before the financial corporation FICO formalized credit ratings by producing a standardized credit-score model for everyone. But her prospective landlords had other ways of checking on her creditworthiness. That she was a divorced woman with no rental history and no recent jobs didn't make things easy.

Still, she had her sense of humor. One day during that search, she mailed me an old 45 rpm recording of Lead Belly's "Bourgeois Blues." It seemed to perfectly catch her frustrated efforts to escape a friend's guest room before she wore out her welcome.

I was reminded of that record recently when I read about the travails of Maxwell Alejandro Frost, a new Democratic congressman from Orlando, Florida. Born in 1996, he's the youngest member of the House of Representatives. He quit his full-time job to campaign for Congress, supporting himself by driving an Uber. When he tried to find a home in Washington, his application for a studio apartment was rejected because of a bad credit score. As Frost tweeted:

Just applied to an apartment in DC where I told the guy that my credit was really bad. He said I'd be fine. Got denied, lost the apartment, and the application fee. This ain't meant for people who don't already have money.

Nor, as Lead Belly might have added, for people like Frost who are Black.

Washington, D.C., it seems, remains a "bourgeois" town.

The True Costs of Renting

Suppose you want to rent a place to live. What will you need to have put aside just to move in? This depends not only on the monthly rent, but on other fees and upfront payments in the place where you plan to live. And, of course, your credit score.

Application fee: One part of Frost's story caught my attention: he had to forfeit his "application fee" for an apartment he didn't get. If, like me, you haven't rented a house or apartment in a while, you might not even know about such fees. They're meant to cover the cost of a background check on the applicant. You might expect them to be rolled into the rent, but in a market that favors landlords, there's no risk to them in charging such fees separately.

Frost's fee was $50 for one application. (These fees tend to top out around $75.) Not so bad, right? Until you grasp that many potential renters find themselves filing multiple applications — 10 isn't unheard of — simply to find one place to rent, so you're potentially talking about hundreds of dollars in fees. California, my own state, is among the few that regulate application fees. The maximum rises to match inflation. In December 2022, that max was $59.67. Some states set a lower maximum, and some don't regulate the fees at all.

Move-in fees: If you haven't rented in a while, this one may take you by surprise. Unlike a security deposit, move-in fees are nonrefundable. They're supposed to cover the costs of preparing a place for a new tenant — everything from installing new locks to replacing appliances and painting. Once subsumed in the monthly rent, today these costs are often passed on directly to renters. Nationally, they average between 30% and 50% of a month's rent.

In June 2022, the median rent for an apartment in the United States crossed the $2,000 threshold for the first time, which means the median move-in fee now ranges from $600 to $1,000.

First and last months' rent: This upfront cost should be familiar to anyone who's ever rented. Landlords almost always require two months' rent upfront and hold on to the last month's rent to ensure that a tenant can't skip out without paying. Because landlords can invest the money they're holding (and tenants can't invest what they've forked over to landlords), in recent years, some states have required landlords to pay interest on the tenant's funds.

Security deposit: Unlike the move-in fee, a security deposit — often a month's rent — is refundable if tenants leave a place in good condition. Its ostensible purpose: to reimburse the landlord for future cleaning and repair costs that exceed normal wear-and-tear. (But wait! Isn't that what the non-refundable move-in fee should do?)

Other fees: If you're renting a condo, you may have to cover the owner's monthly homeowners' association fees. In some cases, you'll also pay for utility hookups like gas or electricity.
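To see how these charges stack up, here's a minimal back-of-the-envelope tally in Python, assuming the figures cited above: roughly the $2,000 median monthly rent, a move-in fee in the middle of the 30% to 50% range, first and last months' rent, a one-month security deposit, and five $50 application fees. The numbers are illustrative, not any particular landlord's terms.

```python
# Back-of-the-envelope tally of typical upfront rental costs, using the
# figures cited in this piece. All numbers are illustrative assumptions,
# not any particular landlord's terms.

monthly_rent = 2000                    # U.S. median rent, mid-2022
application_fees = 5 * 50              # five applications at $50 each
move_in_fee = 0.40 * monthly_rent      # midpoint of the 30-50% range
first_and_last = 2 * monthly_rent      # first and last months' rent upfront
security_deposit = monthly_rent        # refundable later, but due now

total_upfront = (application_fees + move_in_fee
                 + first_and_last + security_deposit)
print(f"Cash needed before move-in: ${total_upfront:,.0f}")
# -> Cash needed before move-in: $7,050
```

In a pricier market, with higher rents and more applications filed, the same arithmetic quickly climbs toward the five-figure total described next.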

So, how much will you have to pay to set foot in that apartment? Well, if you're like Nuala Bishari, a San Francisco Chronicle reporter who recently tried to rent a house in nearby Oakland, California, you'll need to set aside almost $10,000. If you're not sure how you could possibly put that kind of money together, the credit score company Experian has some advice for you:

First, "calculate your odds." Find out how many other people are applying for the unit you're interested in and, if the competition is stiff, "consider looking elsewhere." (As if you haven't done that already!)

Then tighten your belt. "Reducing extraneous expenses," it observes, "is an easy way to save." Stop going out to eat, for instance, and look for free family activities. If that's not enough, it's time to "get serious about cost cutting." Their brilliant suggestions include:

  • "Cut back on utility use. [Wait! I thought I was supposed to cook more at home. Never mind. I'll just sit here in the dark.]
  • Carpool to work instead of driving. [I take the bus, but maybe I should start walking.]
  • Switch to a budget grocery store and look for coupons and sales. [Right! No more Whole Paycheck for me!]
  • Join a buy-nothing group."

Such "advice" to people desperate to find housing would be amusing if it weren't so desperately insulting.

Rent Is Unaffordable for More Than Half the Country

Suppose you've managed to get together your upfront costs. What can you expect to pay each month? The federal Department of Housing and Urban Development considers housing affordable when rent takes no more than 30% of an individual's or family's monthly income. Human Rights Watch (!) reported in December 2022 that the Census Bureau's 2021 American Community Survey revealed that a little over half of all renters are spending more than 30% of their income that way — and in many cases, significantly more.

It tells you something that Human Rights Watch is concerned about housing costs in this country. The National Low Income Housing Coalition (NLIHC) puts such data in perspective through what it calls a "Housing Wage": the hourly rate you'd need to earn, working 40 hours a week, to afford to rent a home in a specific area. For many Americans, housing, the coalition reports, is simply "out of reach."

In 2022, a full-time worker needs to earn an hourly wage of $25.82 on average to afford a modest, two-bedroom rental home in the U.S. This Housing Wage for a two-bedroom home is $18.57 higher than the federal minimum wage of $7.25. In 11 states and the District of Columbia, the two-bedroom Housing Wage is more than $25.00 per hour. A full-time worker needs to earn an hourly wage of $21.25 on average in order to afford a modest one-bedroom rental home in the U.S.
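That Housing Wage is simply HUD's 30% affordability standard run backwards, assuming a 40-hour week worked 52 weeks a year. Here's a minimal sketch of the arithmetic; the $1,342.64 fair-market rent below is back-calculated from the $25.82 figure quoted above for illustration, not taken from the coalition's tables.

```python
# Housing Wage: the hourly pay at which a given monthly rent equals 30% of
# gross income for a full-time, year-round worker. The rent figure is
# back-calculated from the $25.82 wage quoted above, for illustration only.

HOURS_PER_YEAR = 40 * 52      # full-time, year-round
AFFORDABILITY_CAP = 0.30      # HUD's 30%-of-income standard

def housing_wage(monthly_rent: float) -> float:
    """Hourly wage needed so that monthly_rent is 30% of gross income."""
    return monthly_rent * 12 / AFFORDABILITY_CAP / HOURS_PER_YEAR

def affordable_rent(hourly_wage: float) -> float:
    """Highest monthly rent that stays within 30% of gross income."""
    return hourly_wage * HOURS_PER_YEAR * AFFORDABILITY_CAP / 12

print(round(housing_wage(1342.64), 2))   # -> 25.82, the two-bedroom figure
print(round(affordable_rent(7.25), 2))   # -> 377.0, what the federal minimum wage covers
```

Run the other way, the same cap says a worker earning the $7.25 federal minimum wage can afford about $377 a month in rent, which is the gap the coalition's numbers describe.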

Unfortunately, many people don't earn $21.25 an hour, which is why they hold two or three jobs, or add Uber or DoorDash shifts to their other work. It's hardest for minimum-wage workers. As the NLIHC observes, "In no state can a person working full-time at the prevailing federal, state, or county minimum wage afford a two-bedroom apartment at the [fair market rate]." Furthermore, "in only 274 counties out of more than 3,000 nationwide can a full-time worker earning the minimum wage afford a one-bedroom rental home at the [fair market rate]."

For people living at or below the poverty line, the situation is even more dire, which is why so many end up unhoused, whether couch-surfing among friends and family or pitching a tent on the street.

In the coming months, the situation is only expected to worsen now that pandemic-era eviction moratoriums and the $46.5 billion federal Emergency Rental Assistance Program are expiring. According to the Pew Research Center, those programs prevented more than a million people from being evicted.

It Wasn't Always This Way

People have always experienced poverty, but in the United States, the poor have not always gone without housing. Yes, they lived in tenements or, if they were men down on their luck, in single-room occupancy hotels. And yes, the conditions were often horrible, but at least they spent their nights indoors.

Indeed, the routine presence of significant populations of the urban unhoused on this country's city streets goes back only about four decades. When I moved to the San Francisco Bay Area in 1982, there was a community of about 400 people living in or near People’s Park in Berkeley. Known as the Berkeley Beggars, they were considered a complete oddity, a hangover of burnt-out hippies from the 1960s.

During President Ronald Reagan's administration, however, a number of factors combined to create a semi-permanent class of the unhoused in this country: high interest rates, imposed by the Federal Reserve to fight inflation, drove up the cost of mortgages; a corruption scandal destroyed many savings and loan institutions from which middle-income people had long secured home mortgages; labor unions came under sustained attack, even by the federal government; and real (inflation-adjusted) wages plateaued.

Declaring that government was the problem, not the solution, Reagan began a four-decade-long Republican quest to dismantle the New Deal social-safety net implemented under President Franklin Delano Roosevelt and supplemented under President Lyndon Johnson. Reagan savaged poverty-reduction programs like Food Stamps and Medicaid, while throwing more than 300,000 people with disabilities off Social Security. Democrat Bill Clinton followed up, joining with Republicans to weaken Aid to Families with Dependent Children ("welfare").

A decade earlier, scandal-ridden state asylums for the mentally ill had begun to be shut down all over the country. In the late 1960s, Reagan had led that effort in California when he was governor. While hundreds of thousands were freed from a form of incarceration, they also instantly lost their housing. (On a personal note, this is why, in 1990, my mother found herself living in unsupervised subsidized housing for a population of frail elderly and recently de-institutionalized people with mental illnesses. This wasn't a good combination.)

By the turn of the century, a permanent cohort of people without housing had come to seem a natural part of American life.

And It Doesn't Have to Be Like This Forever

There is no single solution to the growing problem of unaffordable housing, but with political will and organized action at the local, state, and federal levels, it could be addressed. In addition to the obvious — building more housing — here are a few modest suggestions:

At the state and local level:

  • Raise minimum wages to reflect the prevailing cost of living.
  • Remove zoning restrictions on the construction of multifamily buildings.
  • Pass rent-control ordinances, so rents rise no faster than the consumer price index.
  • Pass limits on up-front rental and move-in fees.
  • Pass legislation to prevent no-cause evictions.
  • Pass legislation, as California has already done, to allow renters to report their on-time rent payments to credit bureaus, allowing them to boost their credit scores without borrowing money.

At the federal level:

  • Raise the federal minimum wage, which, even in this era of inflation, has been stuck at $7.25 an hour since 2009.
  • Increase funding for SNAP, the present food-stamp program (whose pandemic-era increases have just expired).
  • Increase federal funding for public housing.
  • Provide universal healthcare, ideally in the form of Medicare for all.
  • Increase "Section 8" housing subsidies for low-income renters.
  • Raise taxes on the wealthy to fund such changes.
  • Finally, shift part — say one-third — of the bloated "defense" budget (up $80 billion from last year to $858 billion in 2023) to programs that actually contribute to national security — the daily financial security of the people who live in this nation.

Then maybe the next time we send new people to Congress, all of them will be able to find a home in Washington, D.C.

Rain and heat, fire and snow: Life in a destabilized California

Rebecca Gordon: I Never Thought I’d Miss the Earthquakes

In many ways, it’s hard to absorb (if I can even use that word) the increasing extremity of our weather world. Just when you’ve taken in the latest newsflashes, something even more extreme appears on the radar screen and that news becomes history. It’s already ancient news that the last eight years were the hottest on record and the oceans were never warmer than in 2022. It doesn’t matter whether you’re talking about the recent record flooding of Auckland, New Zealand, or the “bomb cyclone” that hit Sacramento, California; it doesn’t matter whether you’re thinking of last summer’s record heat waves in China or this winter’s record cold there (reaching -63.4 degrees Fahrenheit in its northernmost city of Mohe); it doesn’t matter whether it’s the news that the temperature last year hit heights in South Asia and the Middle East where it became dangerous for a human being even to think about working, no less living outdoors; it doesn’t matter that billion-dollar climate disasters are on a striking path upward in this country and elsewhere in the world; it doesn’t matter whether you’re talking about the seemingly unending megadrought in the American Southwest and West (despite those recent flooding rains in California) or the fact that glaciers and ice sheets are melting in an unprecedented fashion from Greenland to the Swiss Alps, the Himalayas to Antarctica.

What makes all of this so tragic is that none of the above, extreme as it may be, faintly represents an end point. Such news is only going to get worse. Remember in 2015 when so many countries signed onto the Paris climate accords, agreeing that the one thing that shouldn’t happen on Planet Earth was for the global temperature to rise more than 1.5 degrees Celsius above the preindustrial level? Sadly, that was then. It’s now expected that the return of the El Niño climate phenomenon in the Pacific later this year could push the global temperature over that line by 2024.

All of the above should be shocking news, but having just read it, you already know that it’s anything but these days. And increasingly, in our own fashion, ever more of us are beginning to experience climate change up close and personal. In fact, today, TomDispatch regular Rebecca Gordon writes a little memoir of living amid the torrents (and the drought) in northern California. Read it quickly. It’ll soon be ancient history. Tom

Rain and Heat, Fire and Snow: Life in a Destabilized California

It was January 1983 and raining in San Francisco.

The summer before, I’d moved here from Portland, Oregon, a city known for its perpetual gray drizzles and, on the 60-odd days a year when the sun deigns to shine, dazzling displays of greenery. My girlfriend had spent a year convincing me that San Francisco had much more to offer me than Portland did for her.

Every few months, I’d scrape the bottom of my bank account to travel to San Francisco and taste its charms. Once, I even hitched a ride on a private plane. (Those were the days!) In a week’s visit, she’d take me to multiple women’s music concerts — events you’d wait a year for in Portland. We’d visit feminist and leftist bookstores, eat real Mexican food, and walk through Golden Gate Park in brilliant sunshine. The sky would be clear, the city would be sparkling, and she convinced me that San Francisco would indeed be paradise. Or at least drier than Portland.

So, I moved, but I wuz robbed! I knew it that first winter when, from December through March, the rain seemed to come down in rivers — atmospheric rivers, in fact — though none of us knew the term back then. That would be my initial encounter with, as a Mexican-American friend used to call it, “el pinche niño.” El Niño is the term meteorologists give to one-half of an oscillating cyclical weather phenomenon originating in the Pacific Ocean. El Niño usually brings drought to the southern parts of North America, as well as Central America, while deluging northern California and the Pacific Northwest. La Niña is the other half of that cycle, its effects roughly flipping those of El Niño geographically. (As for the meaning of “pinche,” go ahead and Google it.)

San Francisco sits in the sweet spot where, at least until the end of the last century, we would get winter rains at both ends of the cycle. And boy, did it rain that winter! I soon began to wonder whether any amount of love or any number of concerts could make up for the cold and mud. Eventually, I realized that I couldn’t really blame the girlfriend. The only other time I’d lived in San Francisco was during the then-unusual drought year of 1976. Of course, I came to believe then that it never rained here. So, really, if there was a bait-and-switch going on, I had pulled it on myself.

Still, looking back, as much as the rain annoyed me, I couldn’t have imagined how much I’d miss it two decades into the twenty-first century.

But Is It Climate Change? And Would That Actually Be So Bad?

Along with the rest of the western United States, my city has now been in the grip of a two-decade-long megadrought that has persisted through a number of El Niño/La Niña cycles. Scientists tell us that it’s the worst for the West and Southwest in at least the last 1,200 years. Since 2005, I’ve biked or walked the three miles from my house to the university where I teach. In all those years, there have probably been fewer than 10 days when rain forced me to drive or take the bus. Periodic droughts are not unknown in this part of the country. But climate scientists are convinced that this extended, deadly drought has been caused by climate change.

It wasn’t always that way. Twenty years ago, those of us who even knew about global warming, from laypeople to experts, were wary of attributing any particular weather event to it. Climate-change deniers and believers alike made a point of distinguishing between severe weather events and the long-term effects of changes in the climate. For the deniers, however, as the years went on, it seemed that no accumulation of symptoms — floods, droughts, heat waves, fires, or tornadoes — could legitimately be added together to yield a diagnosis of climate change. Or if climate change was the reason, then human activity didn’t cause it and it was probably a good thing anyway.

Not that long ago, it wasn’t even unusual to encounter “climate-change-is-good-for-you” articles in reasonably mainstream outlets. For example, the conservative British magazine The Spectator ran a Matt Ridley piece in 2013 that began: “Climate change has done more good than harm so far and is likely to continue doing so for most of this century. This is not some barmy, right-wing fantasy; it is the consensus of expert opinion.” It turned out that Ridley’s “consensus of expert opinion” derived from a single economist’s (and not a climate scientist’s) paper summarizing 14 other economists on the subject.

“The chief benefits of global warming,” Ridley wrote then, “include: fewer winter deaths; lower energy costs; better agricultural yields; probably fewer droughts; maybe richer biodiversity.” He added that, were the world’s economy to continue to grow by 3% annually, “the average person will be about nine times as rich in 2080 as she is today. So low-lying Bangladesh will be able to afford the same kind of flood defenses that the Dutch have today.”

There was so much wrong with those last two sentences (beginning with what “average” means), but I’ll content myself with pointing out that, in October 2022, historic floods covered one-third of Pakistan (next door to Bangladesh), including prime farmland the size of the state of Virginia. Thirty-three million people were affected by those floods that, according to the New York Times, “were caused by heavier-than-usual monsoon rains and glacial melt.” And what led to such unusual rain and melt? As the Times reported:

Scientists say that global warming caused by greenhouse-gas emissions is sharply increasing the likelihood of extreme rain in South Asia, home to a quarter of humanity. And they say there is little doubt that it made this year’s monsoon season more destructive.

It seems unlikely those floods will lead to “better agricultural yields.” (If only Pakistan had thought to build dikes, like the Dutch!)

Maybe it’s easy to take potshots at what someone like Ridley wrote almost a decade ago, knowing what we do now. Back then, views like his were not uncommon on the right and, all too sadly, they’re not rare even today. (Ridley is still at it, having recently written a piece twitting the British Conservative Party for supporting something as outré as wind power.) And of course, those climate change denials were supported (then and now) by the companies that stood to lose the most from confronting the dangers of greenhouse gases, not only the fossil-fuel industry (whose scientists knew with stunning accuracy exactly what was already happening on this planet as early as the 1970s), but electric companies as well.

Back in 2000, an ExxonMobil “advertorial” in the New York Times hit the trifecta: climate change isn’t real; or if it is, humans (and especially fossil-fuel companies!) aren’t responsible; and anyway it might be a good thing. Titled “Unsettled Science,” the piece falsely argued that scientists could not agree on whether climate change was happening. (By that time, 90% of climate scientists, including ExxonMobil’s, had reached a consensus that climate change is real.) After all, the ad insisted, there had been other extended periods of unusual weather like the “little ice age” of the medieval era and, in any case, greenhouse gas concentrations vary naturally “for reasons having nothing to do with human activity.”

We shouldn’t be surprised that ExxonMobil tried to keep climate change controversial in the public mind. It had a lot to lose in a transition away from fossil fuels. It’s less common knowledge, however, that the company has long bankrolled climate denial “grassroots” organizations. In fact, its scientists knew about climate change as early as the 1950s and, in a 1977 internal memo, they summarized their research on the subject by predicting a one- to three-degree Celsius average temperature rise by 2050, pretty much the future we’re now staring at.

Water, Water, Anywhere?

California has been “lucky” this fall and winter. We’ve seen a (probably temporary) break in the endless drought. A series of atmospheric rivers have brought desperately needed rain to our valleys and an abundance of snow to the mountains. But not everyone has been celebrating, as floods have swept away homes, cars, and people up and down the state. They’ve shut down highways and rail lines, while forcing thousands to evacuate. After years of thirst, for a few weeks the state has been drowning; and, as is so often the case with natural disasters, the poorest people have been among those hardest hit.

I’ve always enjoyed the delicious smugness of lying in a warm bed listening to wind and water banging at my windows. These days it’s a guilty pleasure, though, because I know how many thousands of unhoused people have suffered in and even died during the recent storms. In Sacramento, rain marooned one tent encampment, as the spit of land it occupied became an island. In the city of Ontario, near Los Angeles, flash floods washed away people’s tents and may have drowned as many as 10 of their inhabitants.

My own city responded to the rains with police sweeps of unhoused people hours before a “bomb cyclone” hit on January 4th. In such a “sweep,” police and sometimes other officials descend suddenly to enforce city ordinances that make it illegal to sit or lie on the sidewalk. They make people “move along,” confiscating any belongings they can’t carry off. Worse yet, shelters in the city were already full. There was nowhere inside for the unhoused to go and many lost the tents that had been their only covering.

The same climate change that’s prolonged the drought has exacerbated the deadly effects of those rainstorms. Over the last few years, record wildfires have consumed entire communities. Twenty years of endless dry days have turned our forests and meadows into tinderboxes, just waiting for a spark. Now, when rain bangs down in such amounts on already burnt, drought-hardened land, houses slide down hills, trees are pulled from the earth, and sinkholes open in roads and highways.

There is one genuine piece of luck here, though. Along with the rain, more than twice as much snow as would accumulate in an average year has covered the Sierra mountains of northern California. This is significant because many cities in the region get their water from the Sierra runoff. San Francisco is typical. Its municipal water supply comes from the Hetch Hetchy Reservoir, near Yosemite National Park, fed from that runoff. For now, it looks as if a number of cities could, for the first time in a while, have extra water available this year. But there’s always the chance that warm weather early in the spring will turn snow to rain, melting away the snowpack and our hopes.

Much of northern California’s water comes from the Sierra mountains, but it’s a different story in the south. The 9.8 million residents of Los Angeles County, along with most of southern California, get their water from the Colorado River. A century-old arrangement governs water use by the seven states of the Colorado River basin, along with 30 tribal nations and parts of northern Mexico — about 40 million people in all. Historically, the “upper basin” states, Wyoming, Utah, Colorado, and New Mexico, have been allocated 7.5 million acre-feet of water a year. Nevada, California, and Arizona have received 8.5 million and Mexico has treaty rights to 1.5 million. The dams that created the river’s two great reservoirs — Lake Mead, on the Nevada-Arizona border, and Lake Powell, upstream on the Utah-Arizona border — provide hydroelectric power to many people in those same states.

The megadrought has drastically reduced the levels of these two artificial lakes that serve as reservoirs for those seven states. The original agreement assumed that 17.5 million acre-feet of water would be available annually (each acre-foot being about what two households might use in a year). For the last three years, however, the flow has fallen below 10 million acre-feet. This year, the states have been unable to agree on how to parcel out those allocations, so the Biden administration may have to step in and impose a settlement.
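For a rough sense of the scale of that gap, here’s a quick sketch using only the allocation figures above and the two-households-per-acre-foot rule of thumb; it’s arithmetic, not hydrology.

```python
# Rough Colorado River arithmetic, using only the allocation figures cited
# above and the two-households-per-acre-foot rule of thumb. A sketch, not
# a hydrological model.

upper_basin = 7.5    # million acre-feet/year: Wyoming, Utah, Colorado, New Mexico
lower_basin = 8.5    # million acre-feet/year: Nevada, California, Arizona
mexico = 1.5         # million acre-feet/year: treaty right

promised = upper_basin + lower_basin + mexico    # 17.5 million acre-feet
recent_flow = 10.0                               # roughly, in recent years
shortfall = promised - recent_flow               # 7.5 million acre-feet

households_short = shortfall * 1_000_000 * 2     # ~2 households per acre-foot
print(f"Promised: {promised} maf; recent flow: {recent_flow} maf; "
      f"shortfall: {shortfall} maf (~{households_short:,.0f} households' worth)")
# -> Promised: 17.5 maf; recent flow: 10.0 maf; shortfall: 7.5 maf (~15,000,000 households' worth)
```

However rough, that gap helps explain why the seven states can’t agree on who should take the cuts.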

Both lakes are at their lowest levels since they were first filled. Several times, while working on a midterm election campaign in Reno, Nevada, last year, I noticed stories in the local press about human remains being uncovered as Lake Mead’s shoreline recedes, some of them apparently victims of mob hits in decades past.

Less water in those giant lakes means less water for agriculture and residential consumption. But the falling water levels threaten a further problem: the potential failure of their dams to provide electric power crucial to millions. Last summer, Lake Mead dropped to within 90 feet of the depth at which its dam can no longer generate power. Some estimates suggest that Lake Powell’s Glen Canyon Dam may stop producing electricity as soon as July.

Earthquakes, Drought, and Disaster

The woman I moved to San Francisco for (whom I’ve known since I was a young teen in the 1960s) spent her college years at the University of California, Berkeley. I remember her telling me, in the summer of 1969, that she and a number of friends had spent the previous spring semester celebrating the coming end of the world as they knew it. Apparently, some scientists had then predicted that a giant earthquake would cause the San Francisco Bay Area to collapse into the Pacific Ocean. Facing such a possible catastrophe, a lot of young folks decided that they might as well have a good party. There was smoking and drinking and dancing to welcome the approaching apocalypse. (When a Big One did hit 20 years later, the city didn’t exactly fall into the ocean, but a big chunk of the San Francisco Bay Bridge did go down.)

Over the last months, we Californians have experienced both historic drought and historic rainfall. The world as we knew it really is ending faster than some of us ever expected. Now that we’re facing an imminent catastrophe, one already killing people around the globe and even in my state, it’s hard to know how to respond. Somehow, I don’t feel like partying though. I think it’s time to fight.

Exceptions to American exceptionalism and what 2023 could bring

Rebecca Gordon: Another Exceptional Year?

Originally, he was an America Firster, a phrase from the pre-World War II era deeply associated with anti-Semitism. (Yep, he arrived there years before Ye ever made it on the scene!) As an America Firster, who undoubtedly snitched the phrase from Pat Buchanan’s 2000 presidential run, he even used it in his 2017 inaugural address. (“We assembled here today are issuing a new decree to be heard in every city, in every foreign capital and in every hall of power. From this day forward, a new vision will govern our land. From this day forward, it’s going to be only America first — America first.”) Initially, however, he rejected the allied idea of American exceptionalism. As he put it at an event in Texas in 2015, “I don’t like the term. I never liked it. When I see these politicians get up [and say], ‘the American exceptionalism’ — we’re dying. We owe $18 trillion in debt. I’d like to make us exceptional. And I’d like to talk later instead of now.”

And when you think about it, that was rather exceptional in its own way. After all, he was claiming — as no other politician would have dared to do at the time — that this country was anything but exceptional; that, in fact, it had to be brought back from the graveyard of history and that he was going to do so. He was going to “make America great again” (MAGA), a phrase that, in 2015-2016, no other politician, Democrat or Republican, would have dared use. Of course, his truest position was always Trump First and, in 2023, it’s undoubtedly Make Trump Great Again. However, explain it as you will, he did later adopt American exceptionalism when, assumedly, he had made his country and his presidency the exception to all rules. And of course, in January 2021, he gave genuine new meaning both to “Trump first” and “American exceptionalism” when he called on his followers to attend a “big protest in D.C.” that would “be wild” and support that most exceptional of all presidents, him.

In the Biden era, with the January 6th moment more or less behind us, we’re back to American exceptionalism, not that most of our politicians ever truly left it in the dust. And as TomDispatch regular Rebecca Gordon, author of American Nuremberg, reminds us today, lest you think otherwise, this country does remain all too exceptional, even if in ways that couldn’t be more unnerving. Let her explain. Tom

American Exceptionalism on Full Display: Why This Country Might Want to Lower Its Expectations

Let me start with a confession: I no longer read all the way through newspaper stories about the war in Ukraine. After years of writing about war and torture, I’ve reached my limit. These days, I just can’t pore through the details of the ongoing nightmare there. It’s shameful, but I don’t want to know the names of the dead or examine images caught by brave photographers of half-exploded buildings, exposing details — a shoe, a chair, a doll, some half-destroyed possessions — of lives lost, while I remain safe and warm in San Francisco. Increasingly, I find that I just can’t bear it.

And so I scan the headlines and the opening paragraphs, picking up just enough to grasp the shape of Vladimir Putin’s horrific military strategy: the bombing of civilian targets like markets and apartment buildings, the attacks on the civilian power grid, and the outright murder of the residents of cities and towns occupied by Russian troops. And these aren’t aberrations in an otherwise lawfully conducted war. No, they represent an intentional strategy of terror, designed to demoralize civilians rather than defeat an enemy military. This means, of course, that they’re also war crimes: violations of the laws and customs of war as summarized in 2005 by the International Committee of the Red Cross (ICRC).

The first rule of war, as laid out by the ICRC, requires combatant countries to distinguish between (permitted) military and (prohibited) civilian targets. The second states that “acts or threats of violence the primary purpose of which is to spread terror among the civilian population” — an all-too-on-target summary of Russia’s war-making these last 10 months — “are prohibited.” Violating that prohibition is a crime.

The Great Exceptions

How should war criminals be held accountable for their actions? At the end of World War II, the victorious Allies answered this question with trials of major German and Japanese officials. The most famous of these were held in the German city of Nuremberg, where the first 22 defendants included former high government officials, military commanders, and propagandists of the Nazi regime, as well as the banker who built its war machine. All but three were convicted and 12 were sentenced to hang.

The architects of those Nuremberg trials — representatives of the United States, the Soviet Union, the United Kingdom, and France — intended them as a model of accountability for future wars. The best of those men (and most of them were men) recognized their debt to the future and knew they were establishing a precedent that might someday be held against their own nations. The chief prosecutor for the United States, Robert H. Jackson, put it this way: “We must not forget that the record on which we judge the defendants today is the record on which we will be judged tomorrow.”

Indeed, the Nuremberg jurists fully expected that the new United Nations would establish a permanent court where war criminals who couldn’t be tried in their home countries might be brought to justice. In the end, it took more than half a century to establish the International Criminal Court (ICC). Only in 1998 was the ICC’s founding document, the Rome Statute, finally adopted, and it did not enter into force until 2002, after 60 nations had ratified it. Today, 123 countries are parties to it.

Russia is a major exception, which means that its nationals can’t be tried at the ICC for war crimes in Ukraine. And that includes the crime the Nuremberg tribunal identified as the source of all the rest of the war crimes the Nazis committed: launching an aggressive, unprovoked war.

Guess what other superpower has never signed the ICC? Here are a few hints:

  • Its 2021 military budget dwarfed that of the next nine countries combined and was 1.5 times the size of what the world’s other 144 countries with such budgets spent on defense that year.
  • Its president has just signed a $1.7 trillion spending bill for 2023, more than half of which is devoted to “defense” (and that, in turn, is only part of that country’s full national security budget).
  • It operates roughly 750 publicly acknowledged military bases in at least 80 countries.
  • In 2003, it began an aggressive, unprovoked (and disastrous) war by invading a country 6,900 miles away.

War Crimes? No, Thank You

Yes, the United States is that other Great Exception to the rules of war. While, in 2000, during the waning days of his presidency, Bill Clinton did sign the Rome Statute, the Senate never ratified it. Then, in 2002, as the Bush administration was ramping up its “global war on terror,” including its disastrous occupation of Afghanistan and an illegal CIA global torture program, the United States simply withdrew its signature entirely. Secretary of Defense Donald Rumsfeld then explained why this way:

[T]he ICC provisions claim the authority to detain and try American citizens — U.S. soldiers, sailors, airmen and Marines, as well as current and future officials — even though the United States has not given its consent to be bound by the treaty. When the ICC treaty enters into force this summer, U.S. citizens will be exposed to the risk of prosecution by a court that is unaccountable to the American people, and that has no obligation to respect the Constitutional rights of our citizens.

That August, in case the U.S. stance remained unclear to anyone, Congress passed, and President George W. Bush signed, the American Servicemembers Protection Act of 2002. As Human Rights Watch reported at the time, “The new law authorizes the use of military force to liberate any American or citizen of a U.S.-allied country being held by the [International Criminal] Court, which is located in The Hague.” Hence, its nickname: the “Hague Invasion Act.” A lesser-known provision also permitted the United States to withdraw military support from any nation that participates in the ICC.

The assumption built into Rumsfeld’s explanation was that there was something special — even exceptional — about U.S. citizens. Unlike the rest of the world, we have “Constitutional rights,” which apparently include the right to commit war crimes with impunity. Even if a citizen is convicted of such a crime in a U.S. court, he or she has a good chance of receiving a presidential pardon. And were such a person to turn out to be one of the “current and future officials” Rumsfeld mentioned, his or her chance of being hauled into court would be about the same as mine of someday being appointed secretary of defense.

The United States is not a member of the ICC, but, as it happens, Afghanistan is. In 2018, the court’s chief prosecutor, Fatou Bensouda, formally requested that a case be opened for war crimes committed in that country. The New York Times reported that Bensouda’s “inquiry would mostly focus on large-scale crimes against civilians attributed to the Taliban and Afghan government forces.” However, it would also examine “alleged C.I.A. and American military abuse in detention centers in Afghanistan in 2003 and 2004, and at sites in Poland, Lithuania, and Romania, putting the court directly at odds with the United States.”

Bensouda planned an evidence-gathering trip to the United States, but in April 2019, the Trump administration revoked her visa, preventing her from interviewing any witnesses here. It then followed up with financial sanctions on Bensouda and another ICC prosecutor, Phakiso Mochochoko.

Republicans like Bush and Trump are not, however, the only presidents to resist cooperating with the ICC. Objection to its jurisdiction has become remarkably bipartisan. It’s true that, in April 2021, President Joe Biden rescinded the strictures on Bensouda and Mochochoko, but not without emphasizing this exceptional nation’s opposition to the ICC as an appropriate venue for trying Americans. The preamble to his executive order notes that:

The United States continues to object to the International Criminal Court’s assertions of jurisdiction over personnel of such non-State Parties as the United States and its allies absent their consent or referral by the United Nations Security Council and will vigorously protect current and former United States personnel from any attempts to exercise such jurisdiction.

Neither Donald Rumsfeld nor Donald Trump could have said it more clearly.

So where do those potential Afghan cases stand today? A new prosecutor, Karim Khan, took over in mid-2021. He announced that the investigation would indeed go forward, but that acts of the U.S. and allies like the United Kingdom would not be examined. He would instead focus on actions of the Taliban and the Afghan offshoot of the Islamic State. When it comes to potential war crimes, the United States remains the Great Exception.

In other words, although this country isn’t a member of the court, it wields more influence there than many countries that are, which means that, in 2023, the United States is not in the best position when it comes to accusing Russia of horrifying war crimes in Ukraine.

What the Dickens?

I blame my seven decades of life for the way my mind can now meander. For me, “great exceptions” brings to mind Charles Dickens’s classic story Great Expectations. His novels exposed the cruel reality of life among the poor in an industrializing Great Britain, with special attention to the pain felt by children. Even folks whose only brush with Dickens was reading Oliver Twist or watching The Muppet Christmas Carol know what’s meant by the expression “Dickensian poverty.” It’s poverty with that extra twist of cruelty — the kind the American version of capitalism has so effectively perpetuated.

When it comes to poverty among children, the United States is indeed exceptional, even among the 38 largely high-income nations of the Organization for Economic Cooperation and Development (OECD). As of 2018, the average rate of child poverty in OECD countries was 12.8%. (In Finland and Denmark, it was only 4%!) For the United States, with the world’s highest gross domestic product, however, it was 21%.

Then, something remarkable happened. In year two of the Covid pandemic, Congress passed the American Rescue Plan, which (among other measures) expanded the child tax credit from $2,000 up to as much as $3,600 per child. The payments came in monthly installments and, unlike the Earned Income Credit, a family didn’t need to have any income to qualify. The result? An almost immediate 40% drop in child poverty. Imagine that!

Given such success, you might think that keeping an expanded child tax credit in place would be an obvious move. Saving little children from poverty! But if so, you’ve failed to take into account the Republican Party’s remarkable commitment to maintaining its version of American exceptionalism. One of the items that the party’s congressional representatives managed to get expunged from the $1.7 trillion 2023 appropriation bill was that very expanded child tax credit. It seems that cruelty to children was the Republican Party’s price for funding government operations.

Charles Dickens would have recognized that exceptional — and gratuitous — piece of meanness.

The same bill, by the way, also thanks to Republican negotiators, ended universal federal public-school-lunch funding, put in place during the pandemic’s worst years. And lest you think the Republican concern with (extending) poverty ended with starving children, the bill also will allow states to resume kicking people off Medicaid (federally subsidized health care for low-income people) starting in April 2023. The Kaiser Family Foundation estimates that one in five Americans will lose access to medical care as a result.

Great expectations for 2023, indeed.

We’re the Exception!

There are, in fact, quite a number of other ways in which this country is also exceptional. Here are just a few of them:

  • Children killed by guns each year. In the U.S. it’s 5.6 per 100,000. That’s seven times as high as the next highest country, Canada, at 0.8 per 100,000.
  • Number of required paid days off per year. This country is exceptional here as well, with zero mandatory days off and 10 federal holidays annually. Even Mexico mandates six paid vacation days and seven holidays, for a total of 13. At the other end of the scale, Chile, France, Germany, South Korea, Spain, and the United Kingdom all require a combined total of more than 30 paid days off per year.
  • Life expectancy. According to 2019 data, the latest available from the World Health Organization for 183 countries, U.S. average life expectancy at birth for both sexes is 78.5 years. Not too shabby, right? Until you realize that there are 40 countries with higher life expectancy than ours, including Japan at number one with 84.26 years, not to mention Chile, Greece, Peru, and Turkey, among many others.
  • Economic inequality. The World Bank calculates a Gini coefficient of 41.5 for the United States in 2019. The Gini is a 0-to-100-point measure of inequality, with 0 being perfect equality. The World Bank lists the U.S. economy as more unequal than those of 142 other countries, including places as poor as Haiti and Niger. Incomes are certainly lower in those countries, but unlike the United States, the misery is spread around far more evenly.
  • Women’s rights. The United States signed the United Nations Convention on the Elimination of All Forms of Discrimination against Women in 1980, but the Senate has never ratified it (thank you again, Republicans!), so it doesn’t carry the force of law here. Last year, the right-wing Supreme Court gave the Senate a helping hand with its decision in Dobbs v. Jackson Women’s Health Organization to overturn Roe v. Wade. Since then, several state legislatures have rushed to join the handful of nations that outlaw all abortions. The good news is that voters in states from Kansas to Kentucky have ratified women’s bodily autonomy by rejecting anti-abortion ballot propositions.
  • Greenhouse gas emissions. Well, hooray! We’re no longer number one in this category. China surpassed us in 2006. Still, give us full credit; we’re a strong second and remain historically the greatest greenhouse gas emitter of all time.

Make 2023 a (Less) Exceptional Year

Wouldn’t it be wonderful if we were just a little less exceptional? If, for instance, in this new year, we were to transfer some of those hundreds of billions of dollars Congress and the Biden administration have just committed to enriching corporate weapons makers, while propping up an ultimately unsustainable military apparatus, to the actual needs of Americans? Wouldn’t it be wonderful if just a little of that money were put into a new child tax credit?

Sadly, it doesn’t look very likely this year, given a Congress in which, however minimally and madly, the Republicans control the House of Representatives. Still, whatever the disappointments, I don’t hate this country of mine. I love it — or at least I love what it could be. I’ve just spent four months on the front lines of American politics in Nevada, watching some of us at our very best risk guns, dogs, and constant racial invective to get out the vote for a Democratic senator.

I’m reminded of poet Lloyd Stone’s words that I sang as a teenager to the tune of Sibelius’s Finlandia hymn:

My country’s skies are bluer than the ocean
And sunlight beams on cloverleaf and pine
But other lands have sunlight, too, and clover,
And skies are somewhere blue as mine.
Oh, hear my prayer, O gods of all the nations
A song of peace for their lands and for mine

So, no great expectations in 2023, but we can still hope for a few exceptions, can’t we?

Living for politics or 'just living'?

Rebecca Gordon: Three Conversations about Politics

Since I turned 18, I doubt I’ve ever missed a vote. Certainly, though, I never missed a presidential election. In 1968, at age 24, for instance, already swept away by the anti-Vietnam War movement, I voted for antiwar Democrat Eugene McCarthy in the New York primary. Even though McCarthy would win the popular vote nationally in the Democratic primaries, he lost the nomination, in a distinctly controversial fashion, at the Democratic convention to former Vice President Hubert Humphrey, hardly an antiwar sort of guy. Still, in the election to come, I voted for him, only to see Republican Richard Nixon (of the notorious “Southern strategy” and later Watergate infamy) beat him nationally, become president, and later expand that American war in North Vietnam, Laos, and Cambodia. (Note that Alabama segregationist governor George Wallace won more than 5% of New York State’s vote that year, a reminder with Nixon that there has long been a Trumpian quality to American politics.) And then, four years later, I would vote for George McGovern, again to end that war, only to watch Nixon win for the second time in a landslide (even in New York!). Sigh.

Still, to this day, I do go out and vote, although, on my way to the polls, I sometimes have to ask my wife whom I should vote for farther down the ticket. So, in my modest, haphazard fashion, I’ve participated in American politics, but never, like TomDispatch regular Rebecca Gordon, just back from the front lines of the recent midterm elections, in actual campaign work. Not since, as a child on Halloween, I took a donation container door to door in my apartment building for UNICEF, have I ever, as Gordon describes so vividly today, tried to directly convince anyone to do anything political in a campaign of any sort. (And given the recent midterms, as you’ll see when you read her piece today, thank heavens she, and so many other political activists like her, did so in a big-time way!) She has what she calls a “political vocation” and, given our present American world, the 2022 election season, and the 2024 version to come, thank goodness she, like so many others, does.

Still, I wouldn’t claim that I had no political vocation whatsoever. In my own fashion, here at TomDispatch, I’ve labored week after week, month after month, trying to put crucial information about how our world actually works and who is (and isn’t) responsible for that in front of anyone willing to read such pieces. And that, in its own fashion, has, I suppose, been my vocation, my version, you might say, of going out on the campaign trail — though what the reader does with anything I publish at this website is, of course, up to him or her. Now, if you want to think a little about what your own vocation in life might be, political or otherwise, check out Gordon. Tom

Living for Politics Or "Just Living"?

“Welcome back!” read my friend Allan’s email. “So happy to have you back and seeing that hard work paid off. Thank you for all that you do. Please don’t cook this evening. I am bringing you a Honduran dinner — tacos hondureños and baleadas, plus a bottle of wine.” The tacos were tasty indeed, but even more pleasing was my friend’s evident admiration for my recent political activities.

My partner and I had just returned from four months in Reno, working with UNITE-HERE, the hospitality industry union, on their 2022 midterm electoral campaign. It’s no exaggeration to say that, with the votes in Nevada’s mostly right-wing rural counties cancelling out those of Democratic-leaning Las Vegas, that union campaign in Reno saved the Senate from falling to the Republicans. Catherine Cortez Masto, the nation’s first Latina senator, won reelection by a mere 7,928 votes, out of a total of more than a million cast. It was her winning margin of 8,615 in Washoe County, home to Reno, that put her over the top.

Our friend was full of admiration for the two of us, but the people who truly deserved the credit were the hotel housekeepers, cooks, caterers, and casino workers who, for months, walked the Washoe County streets six days a week, knocking on doors in 105-degree heat and even stumping through an Election Day snowstorm. They endured having guns pulled on them, dogs sicced on them, and racist insults thrown at them, and still went out the next day to convince working-class voters in communities of color to mark their ballots for a candidate many had never heard of. My partner and I only played back-up roles in all of this; she, managing the logistics of housing, feeding, and supplying the canvassers, and I, working with maps and spreadsheets to figure out where to send the teams each day. It was, admittedly, necessary, if not exactly heroic, work.

“I’m not like the two of you,” Allan said when he stopped by with the promised dinner. “You do important work. I’m just living my life.”

“Not everybody,” I responded, “has a calling to politics.” And I think that’s true. I also wonder whether having politics as a vocation is entirely admirable.

Learning to Surf

That exchange with Allan got me thinking about the place of politics in my own life. I’ve been fortunate enough to be involved in activism of one sort or another for most of my 70 years, but it’s been just good fortune or luck that I happened to stumble into a life with a calling, even one as peculiar as politics.

There are historical moments when large numbers of people “just living” perfectly good lives find themselves swept up in the breaking wave of a political movement. I’ve seen quite a few of those moments, starting with the struggle of Black people for civil rights when I was a teenager, and the movement to stop the Vietnam War in that same era. Much more recently, I’ve watched thousands of volunteers in Kansas angrily reject the Supreme Court’s decision in Dobbs v. Jackson Women’s Health Organization, which overturned a 50-year precedent protecting a woman’s right to end a pregnancy. Going door to door in a classic political field campaign, they defeated a proposed anti-abortion amendment to the Kansas constitution, while almost doubling the expected turnout for a midterm primary.

To some observers, in a red and landlocked state like Kansas, that wave of resistance seemed to come out of nowhere. It certainly surprised a lot of professionals, but the capacity to ride it didn’t, in fact, come out of nowhere. When given a choice, it turns out that a substantial majority of people in the middle of this country will vote in favor of women’s bodily autonomy. But many of them won’t do it without a push. To build such a successful electoral campaign required people who’d spent years honing the necessary skills in times when the political seas appeared almost unendurably flat.

Some of those skills, learned through repeated practice, were technical: crafting effective messages; targeting the right voters; navigating coalitions of organizations with sometimes overlapping, sometimes competing priorities. And some might be called “moral skills,” the cultivation of internal characteristics — patience, say, or hope — until they become second nature. The Greek philosopher Aristotle called those moral skills “virtues” and believed we acquire them just like any other skill — by practicing them until they become habits.

You could compare some of us with a political vocation to a surfer sitting on her board constantly scanning the sea ahead, hoping to discern the best waves as they form, hoping she’d practiced well enough to ride them. Like so many surfers of this sort, I’ve probably wiped out more often than I’ve successfully ridden the waves.

Character Flaws for Justice

“This is the year,” I told a different friend long ago, “that I want to develop character flaws.” She was understandably startled, not least because my character has never been what you might call spotless.

“Why would you want to do that?” she asked.

“Because I’m getting ready to work on a political campaign.” I was only half-joking. In fact, doing politics effectively requires habits that don’t come naturally to me — like keeping information close to my vest rather than sharing everything I know with all comers.

There’s a fine line, too, between sitting on information and telling lies. In fact, to do politics effectively, you must be willing to lie. This truth is often taken for granted by those involved. A recent New York Times article about a man who can’t stop lying referred to a study of people’s self-reported truthfulness. Writing about those who admit to lying frequently, reporter Ellen Barry says,

This ‘small group of prolific liars,’ as the researchers termed it, constituted around 5.3 percent of the population but told half the reported lies, an average of 15 per day. Some were in professions, like retail or politics, that compelled them to lie. But others lied in a way that had no clear rationale. [My emphasis added, of course.]

As Barry sees it, politics is self-evidently a profession that compels its practitioners to lie. And I tend to agree with her, though I’m less interested in the lies candidates tell voters to get elected than the ones organizers like me tell people to get them to join, stick with, or fund a campaign.

Often, we lie about whether we can win. As I’ve written previously, I worked on campaigns I was sure we were going to lose, but that I thought were worth fighting anyway. In 1995 and 1996, for instance, I helped build a field campaign to defeat California Proposition 209, which succeeded in outlawing affirmative action at every level of government. We didn’t have much of a chance, but we still built an army of volunteers statewide, in part by telling them that, though our opponents had the money, we had the people capable of engaging voters no one expected to participate.

So, we said we could win because we were thinking ahead. Proposition 209 represented a cynical effort (indeed, its authors called it the California Civil Rights Initiative) to harness white anxiety about what would soon be a nonwhite majority in California. We hoped that building a multi-racial coalition to fight this initiative, even if we lost, would prepare people for the struggles to come.

But did I really know we couldn’t win? At some point, I suppose I traded in one virtue — truthfulness — for another — hope. And then, to project confidence and encourage others to hope as well, I had to start believing my own lies (at least a bit).

The funny thing about hope, though, is that sometimes the lies you force yourself to believe turn out to be true. That’s what happened this year with the campaign in Nevada. You never have enough canvassers to talk to every voter, so you have to choose your main target groups. UNITE-HERE chose to target people of color in working-class neighborhoods who rarely or never participate in elections.

Voters in Nevada are unusual in that more than a third of them (37%) are registered to vote with a small party or have no party affiliation at all. This is the largest single group of voters in the state, and it included many of our targets. Registered Democrats have a 6% edge over Republicans in Nevada, but the question always is: Which way will the people in the mysterious middle vote — for us or them? During two weeks of early voting, I downloaded the statistics on the party affiliations of the voters in Washoe County, where I was working. Democrats were winning the mail-in ballots, but when it came to in-person voting, the Republicans were creaming us. It didn’t look good at all — except that the numbers of small-party or no-party voters dwarfed the consistent edge the Republicans held. Which way would they jump?

I typically kept those statistics to myself, since it wasn’t part of my job to look at them in the first place. In the upbeat daily briefing for our canvassing team leaders, I concentrated instead on reporting the crucial everyday numbers for us: How many doors did we knock on yesterday? How many conversations did we have with voters? How many supporters did we identify? Those numbers I could present with honest enthusiasm, pointing to improvements made, for instance, by working with individual canvassers on how to keep doors open and voters talking.

But the funny thing was this: the hope I was projecting turned out to be warranted. The strategy that failed in California in 1996 — bringing out unlikely voters in communities of workers and people of color — succeeded in Nevada in 2022. When we opened the mystery box, it turned out to contain voters for us.

One More Conversation

I once had a friend, Lauren, who, for years, had been a member of one of the political organizations that grew out of the 1960s radical group Students for a Democratic Society. She’d gone to meetings and demonstrations, collated newsletters, handed out flyers, and participated in a well-functioning system of collective childcare. One day, I asked her how the work was going.

“Oh,” she said. “I dropped out. I still spend every Wednesday night with Emma [the child whose care she had shared in that group], but I’m not doing political work anymore.”

“But why not?”

“I realized that everything about politics involves making people do things they don’t want to do and that’s not how I want to spend my life.”

Even now, years later, I can see her point. Whether it’s asking my fellow part-time university teachers to come to a union meeting, trying to get a stranger to accept a leaflet on the street, or convincing a potential voter to listen to me about why this election matters and should matter to them, my strange vocation often does involve attempting to get people to do things they don’t particularly want to do.

Of course, it’s because I do believe in whatever I’m trying to move them toward that I’m involved in such politics in the first place. Usually, it’s because I believe that my goal should be their goal, too, whether it’s racial or economic justice, women’s liberation, or just keeping the planet from burning up.

But that leads me to another character flaw politics requires. You could call it pride, or even arrogance; it’s the confidence that I know better than you what’s good for you. Oddly enough, it may turn out that it’s when I’m pushing the most selfish goals — when I’m working for something I myself need like a living wage or the right to control my own body — that my motives stand up best to my own scrutiny.

It’s then that I’m asking someone to participate in collective action for my own benefit, and what could be more honest than that?

Politics as a Vocation

“Politics as a Vocation” was the title of a well-known lecture by German sociologist Max Weber. In it, he famously defined the state as “a human community that (successfully) claims the monopoly of the legitimate use of physical force within a given territory.” Even when the use of force is delegated to some other institution — say, the police — Weber argued that citizens accept the “right” of the police to use violence because it comes from the state. That source of legitimacy is the only thing that separates a police force (so to speak) from any other violent gang.

For Weber, politics meant either leading a state or influencing its leaders. So if a state controls the legitimate use of force, then politics involves deciding how that force is to be deployed — under what conditions and for what purposes. It’s a heavy responsibility that, he claimed, people take on for one of only two reasons: either as a means to an end (which could be anything from personal wealth to ending poverty) or for its own sake — for the pleasure and feeling of prestige that power bestows.

“The decisive means for politics,” Weber wrote, “is violence.” If he was right, then my friend’s intuition that politics is about making people do things they don’t want to do may not have been so off the mark. Even the form of politics that appears to challenge Weber’s premise — the tradition of nonviolent action — involves a form of coercion. Those who willingly expose themselves to political violence are also trying to make people do something they don’t want to do by invoking empathy (and possibly feelings of guilt).

If, in some fashion, all politics really does involve coercion, can a political life possibly be a morally good one? I still think so, but it requires tempering a commitment to a cause with what Weber called the “ethic of responsibility” — a willingness not only to honestly examine our motives but to genuinely consider the likely results when we choose to act on them. It’s not enough to have good intentions. It’s crucial to strive as well for good — if imperfect — outcomes.

“Politics,” Weber said, “is a strong and slow boring of hard boards. It takes both passion and perspective.” But there’s another kind of life he also recommended, even if with a bit of a sneer, to those who don’t measure up to the demands of politics as a vocation. Such people “would have done better,” he observed, “in simply cultivating plain brotherliness in personal relations.”

And therein lies the greatest moral danger for those of us who feel that our vocation is indeed politics: a contempt for that plain “brotherliness” (or sisterliness) that makes ordinary human life bearable. There’s a saying attributed to Carlos Fonseca, one of the founders of Nicaragua’s revolutionary party, the Sandinistas: “A man [of course, it’s a man!] who is tired has a right to rest. But a man who rests does not have the right to be in the vanguard.”

And there it is, a fundamental disrespect for ordinary human life, including the need for rest, that tempts the activist to feel her calling makes her better than the people she’s called to serve.

In the end, if we do politics at all, it should be precisely so that people can have ordinary lives, ones not constrained and distorted by the kinds of injustice political activists try to end.

“I’m just living my life,” my friend Allan told me. In truth, his life is far more admirable than he believes. I’d say that he has a vocation for kindness every bit as heroic as any political calling. We’re not the only folks he feeds. The day before he visited us, he’d delivered dinner to another friend after her shoulder surgery. He spends little on himself, so he can send most of the money he earns to his family in Central America. During the worst of the pandemic shutdown, he regularly checked in on all the old folks he knows, startling my partner and me into realizing that we’ve lived long enough to fall into the category of elders to be looked after.

At the end of this long political season, back home from Nevada, I find that I’m full of admiration for the life my friend Allan is “just living.” As I wait for the next Trumpist wave to rise, may I remember that “just living” is the whole point of doing politics.

Returning to Reno in the shadow of Roe's undoing

Rebecca Gordon, Back to the Future Again

How far are we? Who really knows? Let’s just say that we’re somewhere significantly down the road to extremity, all-American style. With a hung (and wrung-out) Congress and lame (and aged) president, our tripartite government is looking ever less “tri” and ever more “part.” And it increasingly seems that the part being emphasized is a Supreme Court that should perhaps be renamed the Extreme Court. Only recently, it issued a series of Trumpist rulings that, from green-lighting the carrying of concealed weaponry to suppressing abortion to keeping climate change on track, rivaled in their extremity the 1857 Dred Scott ruling’s endorsement of slavery that helped launch the Civil War.

And that may just be the beginning. In their next term, for instance, the six justices of the Extreme Court could turn directly to that “tri” and try to whittle it down further. In particular, they may endorse what’s called the independent state legislature doctrine, an extremist theory that, according to the New York Times, “would give state legislatures independent power, not subject to review by state courts, to set election rules at odds with state constitutions, and to draw congressional maps warped by partisan gerrymandering.” And since, at this point, a significant majority of state legislatures are controlled by Republicans, the possibility of gerrymandering the political map into a forever-winning extremist government seems all too imaginable.

So hold onto your hats (and guns), folks — we’ve already passed through the diciest post-election season in memory. (Sedition, you bet!) And we could be heading toward an all-American, all-Trumpist Extreme Court and Republican Party version of something akin to fascism.

With that in mind, could there be anything more important than getting out the vote in the elections of 2022 and 2024? I doubt it. So, thank you TomDispatch regular Rebecca Gordon for heading back to Nevada to lend a hand. We should all, whatever our doubts, take her as an example of what has to be done to prevent the Extremes from taking this land from so many of the rest of us. Tom

Returning to Reno in the Shadow of Roe's Undoing

Recently, I told my friend Mimi that, only weeks from now, I was returning to Reno to help UNITE-HERE, the hospitality industry union, in the potentially nightmarish 2022 election. “Even though,” I added, “I hate electoral politics.”

She just laughed.

“What’s so funny?” I asked.

“You’ve been saying that as long as I’ve known you,” she replied with a grin.

How right she was. And “as long as I’ve known you” has been a pretty long time. We met more than a quarter of a century ago when my partner and I hired her as the first organizer in a field campaign to defeat Proposition 209. That ballot initiative was one of a series pandering to the racial anxieties of white Californians that swept through the state in the 1990s. The first of them was Prop 187, outlawing the provision of government services, including health care and education, to undocumented immigrants. In 1994, Californians approved that initiative by a 59% to 41% vote. A federal court, however, found most of its provisions unconstitutional and it never went into effect.

We weren’t so lucky with Proposition 209, which, in 1996, outlawed affirmative-action programs statewide at any level of government or public service. Its effects reverberate to this day, not least at the prestigious University of California’s many campuses.

A study commissioned 25 years later by its Office of the President revealed that “Prop 209 caused a decline in systemwide URG enrollment by at least twelve percent.” URGs are the report’s shorthand for “underrepresented groups” — in other words, Latinos, Blacks, and Native Americans. Unfortunately, Proposition 209’s impact on the racial makeup of the university system’s students has persisted for decades and, as that report observed, “led URG applicants to cascade out of UC into measurably less-advantageous universities.” Because of UC’s importance in California’s labor market, “this caused a decline in the total number of high-earning ($100,000) early-30s African American and Hispanic/Latinx Californians by at least three percent.”

Yes, we lost the Prop 209 election, but the organization we helped start back in 1995, Californians for Justice, still flourishes. Led by people of color, it’s become a powerful statewide advocate for racial justice in public education with a number of electoral and legislative victories to its name.

Shortcomings and the Short Run

How do I hate thee, electoral organizing? Let me count the ways. First, such work requires that political activists like me go wide, but almost never deep. It forces us to treat voters like so many items to be checked off a list, not as political actors in their own right. Under intense time pressure, your job is to try to reach as many people as possible, immediately discarding those who clearly aren’t on your side and, in some cases, even actively discouraging them from voting. In the long run, treating elections this way can weaken the connection between citizens and their government by reducing all the forms of democratic participation to a single action, a vote. Such political work rarely builds organized power that lasts beyond Election Day.

In addition, electoral campaigns sometimes involve lying not just to voters, but even to your own canvassers (not to speak of yourself) about whether you can win or not. In bad campaigns — and I’ve seen a couple of them — everyone lies about the numbers: canvassers about how many doors they’ve knocked on; local field directors about what their canvassers have actually done; and so on up the chain of command to the campaign director. In good campaigns, this doesn’t happen, but those may not, I suspect, be in the majority. And lying, of course, can become a terrible habit for anyone hoping to construct a strong organization, not to mention a better world.

Lying, as the philosopher Immanuel Kant argued, is a way of treating people as if they were merely things to be used. Electoral campaigns can often tempt organizers to take just such an instrumental approach to others, assuming voters and campaign workers have value only to the extent that they can help you win. Such an approach, however efficient in the short run, doesn’t build solidarity or democratic power for the long haul. Sometimes, of course, the threat is so great — as was true when it came to the possible reelection of Donald Trump in 2020 — that the short-run simply matters more.

Another problem with elections? Campaigns so often involve convincing people to do something they’ve come to think of as a waste of time, namely, going to the polls. A 2018 senatorial race I worked on, for example, focused on our candidate’s belief in the importance of raising the minimum wage. And yes, we won that election, but four years later, the federal minimum wage is still stubbornly stuck at $7.25 an hour, though not, of course, through any fault of our candidate. Still, the voters who didn’t think electing Nevada Senator Jacky Rosen would improve their pay weren’t wrong.

On the other hand, the governor we helped elect that same year (and for whose reelection I’ll be working again soon) did come through for working Nevadans by, for example, signing legislation that guarantees a worker’s right to be recalled before anyone new is hired when a workplace reopens after a Covid shutdown.

You’ll hear some left-wing intellectuals and many working people who are, in the words of the old saying, “too broke to pay attention,” claim that elections don’t change anything. But such a view grows ever harder to countenance in a world where a Supreme Court disastrously reshaped by Donald Trump and Mitch McConnell is hell-bent on reshaping nearly the last century of American political life. It’s true that overturning Roe v. Wade doesn’t affect my body directly. I’m too old to need another abortion. Still, I’m just as angry as I was in 2016 at people who couldn’t bring themselves to vote for Hillary Clinton because she wasn’t Bernie Sanders. As I told such acquaintances at the time, “Yes, we’ll hate her and we’ll have to spend the next four years fighting her, but on the other hand, SUPREME COURT, SUPREME COURT, SUPREME COURT!”

Okay, maybe that wasn’t exactly the most elegant of arguments, but it was accurate, as anyone will tell you who’d like to avoid getting shot by a random heat-packing pedestrian, buried under the collapsing wall between church and state, or burned out in yet another climate-change-induced conflagration.

If Voting Changed Anything…

Back in 1996, as Election Day approached, Californians for Justice had expanded from two offices — in Oakland and Long Beach — to 11 around the state. We were paying a staff of 45 and expanding (while my partner and I lay awake many nights wondering how we’d make payroll at the end of the week). We were ready for our get-out-the-vote push.

Just before the election, one of the three organizations that had given us seed money published its monthly newsletter. The cover featured a photo of a brick wall spray-painted with the slogan: “If voting changed anything, they’d make it illegal.” Great, just what we needed!

It’s not as if I didn’t agree, at least in part, with the sentiment. Certainly, when it comes to foreign policy and the projection of military force globally, there has been little difference between the two mainstream political parties. Since the end of World War II, Democrats and Republicans have cooperated in a remarkably congenial way when it comes to this country’s disastrous empire-building project, while financially rewarding the military-industrial complex, year after year, in a grandiose fashion.

Even in the Proposition 209 campaign, my interest lay more in building long-term political power for California communities of color than in a vote I already knew we would lose. Still, I felt then and feel today that there’s something deeply wrong with the flippant response of some progressives that elections aren’t worth bothering about. I’d grown up in a time when, in the Jim Crow South, voting was still largely illegal for Blacks and people had actually died fighting for their right to vote. Decades earlier, some of my feminist forebears had been tortured while campaigning for votes for women.

Making Voting Illegal Again

In 1965, President Lyndon Johnson signed the Voting Rights Act, explicitly outlawing any law or regulation that “results in the denial or abridgment of the right of any citizen to vote on account of race or color.” Its specific provisions required states or counties with a history of voter suppression to receive “pre-clearance” from the attorney general or the District Court for the District of Columbia for any further changes in election laws or practices. Many experts considered this provision the heart of that Act.

Then, in 2013, in Shelby County v. Holder, a Supreme Court largely shaped by Republican presidents tore that heart right out. Essentially, the court ruled that, because those once excluded from voting could now do so, such jurisdictions no longer needed preclearance to change their voting laws and regulations. In other words, because it was working, it should be set aside.

Not surprisingly, some states moved immediately to restrict access to voting rights. According to the Brennan Center for Justice, “within 24 hours of the ruling, Texas announced that it would implement a strict photo ID law. Two other states, Mississippi and Alabama, also began to enforce photo ID laws that had previously been barred because of federal preclearance.” Within two months, North Carolina passed what that center called “a far-reaching and pernicious voting bill” which:

instituted a strict photo ID requirement; curtailed early voting; eliminated same day registration; restricted preregistration; ended annual voter registration drives; and eliminated the authority of county boards of elections to keep polls open for an additional hour.

Fortunately, the Fourth Circuit Court of Appeals struck down the North Carolina law in 2016, and surprisingly the Supreme Court let that ruling stand.

But as it turned out, the Supremes weren’t done with the Voting Rights Act. In 2021, the present Trumpian version of the court issued a ruling in Brnovich v. Democratic National Committee upholding Arizona’s right to pass laws requiring people to vote only in precincts where they live, while prohibiting anyone who wasn’t a relative of the voter from hand-delivering mail-in ballots to the polls. The court held that, as long as a law was technically the same for all voters, it didn’t matter that, in practice, such measures would fall disproportionately on non-white voters and make it harder for them to vote.

Writing for the majority, Justice Samuel Alito declared that states have a different and more important interest in such voting restrictions: preventing voter fraud. In other words — at least in the minds of two-thirds of the present Supreme Court — some version of Donald Trump’s big lie about rigged elections and voter fraud has successfully replaced racist voter suppression as the primary future danger to free and fair elections.

Maybe elections do change something. Otherwise, why, in the wake of the 2020 elections, would “they” (including Republican-controlled state legislatures across significant parts of the country) be so intent on making it ever harder for certain people to vote? And if you think that’s bad, wait until the Supremes rule next year on the fringe legal theory of an “independent state legislature.” We may well see the court decide that a state’s legislature can legally overrule the popular vote in a federal election — just in time for the 2024 presidential race.

The Future Awaits Us

A couple of times a week I talk by phone with another friend. We began doing this at the height of George W. Bush’s and Dick Cheney’s vicious “war on terror.” We’d console each other when it came to the horrors of that conflict, including the illegal invasion of Iraq, the deaths and torture of Iraqi and Afghan civilians, and the seemingly endless expansion of American imperial meddling. We’re still doing it. Somehow, every time we talk, it seems as if the world has traveled one more mile on its way to hell in a handbasket.

Both of us have spent our lives trying, in our own modest fashion, to gum up the works of capitalism, militarism, and authoritarian government. To say that we’ve been less than successful would certainly be understating things. Still, we do keep at it, while discussing what in the world we can still do.

At this point in my life and my country’s slide into authoritarian misery, I often find it hard even to imagine what would be useful. Faced with such political disorientation, I fall back on a core conviction that, when the way forward is unclear, the best thing we can do is give people the experience of achieving in concert what they could never achieve by themselves. Sometimes, the product of an organizing drive is indeed victory. Even when it isn’t though, helping create a group capable of reading a political situation and getting things done, while having one another’s backs, is also a kind of victory.

That’s why, this election season, my partner and I are returning to Reno to join hotel housekeepers, cooks, and casino workers trying to ensure the reelection of two Democrats, Senator Catherine Cortez Masto and Governor Steve Sisolak, in a state where the margin of Democratic Party victories hasn’t grown since 2012.

From our previous experience, we know one thing: we’ll be working on a well-run campaign that won’t waste anyone’s time and has its eye on the future. As I wrote about the union’s 2020 presidential campaign for Joe Biden, more than winning a difficult election is at stake. What’s also important is building organized power for working people. In other words, providing the kind of training and leadership development that will send “back to every hotel, restaurant, casino, and airport catering service leaders who can continue to organize and advocate for their working-class sisters and brothers.”

I still hate electoral politics, but you don’t always get to choose the terrain you’re fighting on. Through its machinations at the federal, state, and county level, the Republican Party has been all but screaming its plans to steal the next presidential election. It’s no exaggeration to say that preserving some form of democratic government two years from now depends in part on keeping Republicans from taking over Congress, especially the Senate, this year.

So, it’s back to Reno, where the future awaits us. Let’s hope it’s one we can live with.

Women now dominate higher education. What does that mean for its future?

Rebecca Gordon, Where the Boys Aren't

Sixty years later, I would still like a do-over. Yes, I went to a school where, to fiddle with the title of Rebecca Gordon’s article, the boys were (and only them). I’m talking about Yale College in the 1960s when it was all-male and the hunting (or do I mean haunting?) grounds for George W. Bush, like his father and grandfather before him, and John Kerry among others. Sigh. I fought my parents hard over the decision to go there and lost big time. For them, Yale meant that I would be headed for the stratosphere, just like George and John. I was to be a triumph for a family in which my dad was just emerging from what had, for him, been the tough years of the supposedly golden 1950s.

And yes, I did get a good education. I mean, what else was there for me to do — a Jewish kid at a university that had just removed its informal quota on Jews, and without a girl in sight? It was the rest of the experience, all the fraternities that didn’t rush me, the famed secret society, Skull and Bones, that didn’t give me a second thought, and all those hotels where you had to put up the girl you invited to New Haven for a partying weekend that cost more than I could afford. (Forget the fact that I didn’t exactly have a lot of people to invite.) Well, you know the story. Or maybe, I hope, you don’t.

And then, what did I do but wander into the study of Chinese history — not exactly my parents’ idea of how to prepare myself for future glory — and never look back? And so it went in a college world that, as you’ll discover today, seems all too impossible even to imagine anymore (and thank god for that!).

All I could think as I read TomDispatch regular Rebecca Gordon’s look at higher education six decades later is that I missed my moment, even if it also seems, as she explains, that the women who now make up the majority of college students may be missing theirs, through no fault of their own. Tom

Fewer Big (or Any Size) Men on Campus: What Does It Mean that Women Now Dominate Higher Education?

In the last week of her life, my mother extracted a promise from me. “Make sure,” she said, “that Orion goes to college.”

I swore that I would, although I wasn’t at all sure how I’d make it happen. Even in the year 2000, average tuitions were almost 10 times what my own undergraduate school had charged 30 years earlier. I knew that sending my nephew to college would cost more money than I’d have when the time came. If he was going to college, like his aunt before him, he’d need financial help. The difference was that his “help” was likely to come not as a grant, but as life-defining loans.

“Orion,” by the way, is a pseudonym for my brother’s son, my parents’ only grandchild. To the extent that any of us placed family hopes in a next generation, he’s borne them all. Orion was only five years old when I made that promise and he lived 3,000 miles away in a depressed and depressing de-industrialized town in New York’s Hudson River Valley. We’d only met in person once at that point. Over the years, however, we kept in touch by phone, later by text message, and twice he even visited my partner and me in San Francisco.

A little more than a decade after I made that promise, Orion graduated from high school. I thought that with a scholarship, loans, and financial help from his father and us, we might indeed figure out how to pay the staggering cost of a college education, which now averages $35,000 a year, having doubled in this century alone.

It turned out, however, that money wasn’t the only obstacle to making good on my promise. There was another catch as well. Orion didn’t want to go to college. Certainly, the one guidance counselor at his 1,000-student public high school had made no attempt to encourage either him or, as far as I could tell, many of his classmates to plan for a post-high-school education. But would better academic counseling have made a difference? I doubt it.

A bright boy who had once been an avid reader, Orion was done with schooling by the time he’d turned 18. He made that clear when I visited him for a talk about his future. He had a few ideas about what he might do: join the military or the New York state police. In reality, though, it turned out that he had no serious interest in either of those careers.

He might have been a disaffected student, but he was — and is — a hard worker. Over the next few years, despite sky-high unemployment in the Hudson River Valley, he always had a job. He made and delivered pizzas. He cleaned rooms at a high-end hotel for wealthy equestrians. He did pick-up carpentry. And then he met an older tradesman who gave him an informal apprenticeship in laying floors and setting tile. Orion learned how to piece together hardwood and install carpeting. He proudly showed me photos of the floors he’d laid and the kitchens he’d tiled.

Eventually, he had to part ways with his mentor, who also happened to be a dangerous drunk. We had another talk and I reminded him of my promise to my mother. I’d recently gotten an unexpected windfall — an advance on a book I was writing, American Nuremberg — which put me in a position to help set him up in business. He bought a van, completed his tool set, and paid for a year’s insurance. Now, 10 years after graduating from high school, he’s making decent money as a respected tradesman and is thinking about marrying his girlfriend. He’s made himself a life without ever going to college.

I worry about him, though. Laying floors is a young person’s trade. A few years on your knees, swinging a hammer all day, will tear your joints apart. He can’t do this forever.

The Rising of the Women

Still, it turns out that my nephew isn’t the only young man to opt out of more schooling. I’ve seen this in my own classrooms and the data confirms it as a national and international trend.

I started teaching ethics at the University of San Francisco in 2005. It soon struck me that there were invariably more women in my classes than men. Nor was the subject matter responsible, since everyone had to pass a semester of ethics to graduate from that Jesuit university. No, as it turned out, my always-full classes represented the school’s overall gender balance. For a few years, I wondered whether such an overrepresentation of women could be attributed to parents who felt safer sending their daughters to a Catholic school, especially in a city with San Francisco’s reputation for sex, drugs, and rock ‘n’ roll.

Recently, though, I came to realize that my classes were simply part of a much larger phenomenon already beginning to worry some observers. Until about 1990, men invariably outnumbered women at every level of post-secondary education and more of them graduated, too. At four-year colleges and in post-graduate programs or in community colleges (once they became more prevalent), more men earned two-year, four-year, master’s, and doctorate-level degrees.

It was during the 1970s that the ratio began to shift. In 1970, among recent high-school graduates, 32% of the men and just 20% of the women enrolled in post-secondary institutions. By 1990, equal percentages — around 32% — were going to college. In the years that followed, college attendance continued to increase for both sexes, but significantly faster for women who, in 1994, surpassed men. Since the end of the 1990s, men’s college attendance has stayed relatively stable at about 37% of high-school graduates.

Women’s campus presence, however, has only continued to climb with 44% of recent female high-school graduates enrolled in post-secondary schools by 2019.

So, the problem, if there is one, isn’t that men have stopped going to college. A larger proportion of them, in fact, attend today than at any time in our history. It’s just that an even larger proportion of women are doing so.

As a result, if you visit a college campus, you should see roughly three women — now about 60% of all college students — for every two men. And that gap has been growing ever wider, even during the disruption of the Covid pandemic.

Not only do more women now attend college than men, but they’re more likely to graduate and receive degrees. According to the National Center for Educational Statistics, in 1970, men received 57% of both two- and four-year degrees, 61% of master’s degrees, and 90% of doctorates. By 2019, women were earning the majority of degrees at all levels.

One unexpected effect of this growing college gender gap is that it’s becoming harder for individual women to get into selective schools. The Hechinger Report, a non-profit institution focused on education, lists a number of well-known ones where male applicants have a better chance of being accepted, including:

Boston, Bowdoin and Swarthmore colleges; Brown, Denison, Pepperdine, Pomona, Vanderbilt and Wesleyan universities; and the University of Miami. At each school, men were at least 2 percentage points more likely than women to be accepted in both 2019 and 2020. Pitzer College admitted 20% of men last year compared to 15% of women, and Vassar College accepted 28% of men compared to 23% of women. Both had more than twice as many female applicants as male applicants.

Even for Vassar, once a women’s college, having too many women is now apparently a problem.

In addition, in recent years, despite those lower acceptance rates for women at elite schools, colleges have generally had to deal with declining enrollments, a trend only accelerated by the Covid pandemic. As Americans postpone having children and have fewer when they do, the number of people reaching college age is actually shrinking. Two-year colleges have been especially hard hit.

And there’s the debt factor. Like my nephew Orion, more potential students, especially men, are now weighing the problem of deferring their earnings, while acquiring a staggering debt load from their years at college. Some of them are opting instead to try to make a living without a degree. Certain observers think this shift has been partially caused by a pandemic-fueled rise in wages in the lower tiers of the American workforce.

A Mystery

Why are there fewer men than women in college today? On this, theories abound, but genuine answers are few. Conservatives offer a number of explanations that echo their culture-war slogans, including that “the atmosphere on the nation’s campuses has become increasingly hostile to masculinity.”

A Wall Street Journal op-ed ascribed it in part to “labor-saving innovations in household management and child care — automatic washing machines, disposable diapers, inexpensive takeout restaurants — as well as new forms of birth control [that] helped women pursue college degrees and achieve new vocational ambitions.” But the biggest problem, write the op-ed’s authors, may be that girls simply do better in elementary and secondary school, which discourages boys from going on to college. This problem, they argue, is attributable not only to the advent of washing machines, but ultimately to the implementation of the Great Society’s liberal social policies. Citing Charles Murray, the reactionary co-author of the 1994 book The Bell Curve, they blame women’s takeover of higher education on the progressive social policies of the 1960s, the rise of the “custodial” (or welfare) state, and the existence of a vast pool of jailed men. They write:

[T]here are about 1.24 million more men who are incarcerated than women, largely preventing them from attending traditional college. Scholars such as Charles Murray have long demonstrated that expanded government entitlements following the Great Society era have reduced traditional family formation, reduced incentives to excel both in school and on the job, and increased crime.

Critics to the left have also cited male incarceration as a factor in the college gender divide, although they’re more likely to blame racist police and policies. Sadly, the devastation caused by jailing so many Black, Latino, and Native American men has only begun to be understood, but given the existing racial divide in college attendance, I seriously doubt that many of those men would be in college even if they weren’t in prison.

Some observers have also suggested that, given the staggering rise in college tuitions, young men, especially from the working and middle classes, often make a sound if instinctive decision that a college education will not repay their time, effort, and the debt load it entails. Like my nephew, they may indeed be better off entering a well-paying trade and getting an early start on building their savings.

Do Women Need College More Than Men?

If some young men now believe that college won’t reward them sufficiently to warrant the investment, many young women have rightly judged that they will need a college education to have any hope of earning a decent living. It’s no accident that their college enrollment skyrocketed in the 1970s. After a long post-World-War II economic expansion, that was the moment when wages in this country first began stagnating, a trend that continued in the 1980s when President Ronald Reagan launched his attacks on unions, while the federal minimum wage barely rose. In fact, it has remained stuck at $7.25 per hour since 2009.

First established in 1938, the minimum wage was intended to allow a single adult (then assumed to be a man) to support a non-earning adult (assumed to be his wife), and several children. It was called a “breadwinner” wage. The feminism that made work outside the home possible for women, saving the lives and sanity of so many of us, provided a useful distraction from those stagnant real wages, rising inequality, and the increased immiseration of millions (not to speak of the multiplication of billionaires).

In the last few decades of the twentieth century, many women came to believe that working for money was their personal choice. In truth, I suspect that they were also responding to new economic realities and the end of that “breadwinner” wage. I think the college gender gap, which grew ever wider as wages fell, is at least in part a consequence of those changes. Few of my women students believe that they have a choice when it comes to supporting themselves, even if they haven’t necessarily accepted how limited the kind of work they’re likely to find will be. Whether they form partnered households or not, they take it for granted that they’ll have to support themselves financially.

This makes a college degree even more important, since having it has a bigger impact on women’s earnings than on men’s. A study by the Federal Reserve Bank of St. Louis confirmed this. Reviewing 2015 census data, it showed that the average wage for a man with only a high-school diploma was around $12 per hour. Women earned 24.4% less than that, or about $9 hourly. On the other hand, women got a somewhat greater boost (28%) from earning a two-year degree than men (22%). For a four-year degree, it was 68% for women and 62% for men.

In other words, although a college education improves income prospects for both genders, it does more for women — even if not enough to raise their income to the level of men with the same education. The income gender gap remains stubbornly fixed in men’s favor. Like Alice in Through the Looking Glass, it seems women still have to run faster just to avoid losing ground. This means that for us, earning a decent living requires at least some college, which is less true for men.
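To see the arithmetic behind that claim, here is a minimal back-of-the-envelope sketch in Python. It simply plugs in the rounded figures cited above from the St. Louis Fed study; the resulting dollar amounts are illustrative approximations, not numbers taken from the study itself.

```python
# Rough check of the wage figures cited above (rounded, illustrative only).
men_hs = 12.00                    # approx. hourly wage, men with a high-school diploma only
women_hs = men_hs * (1 - 0.244)   # women earn about 24.4% less: roughly $9/hour

# Approximate boosts from a four-year degree, as cited above
men_ba = men_hs * (1 + 0.62)      # +62% for men
women_ba = women_hs * (1 + 0.68)  # +68% for women

print(f"High school only: men ${men_hs:.2f}/hr, women ${women_hs:.2f}/hr")
print(f"Four-year degree: men ${men_ba:.2f}/hr, women ${women_ba:.2f}/hr")
```

Even with the larger percentage boost, the hypothetical woman with a four-year degree still ends up several dollars an hour behind her male counterpart, which is exactly the stubborn gap described above.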

What Does the Future Hold?

Sadly, as college becomes ever more the preserve of women, I suspect it will also lose at least some of its social and economic value. They let us in and we turned out to be too good at it. My prediction? Someday, college will be dismissed as something women do and therefore not an important marker of social or economic worth.

As with other realms that became devalued when women entered them (secretarial work, for example, or family medicine), I expect that companies will soon begin dropping the college-degree requirement for applicants.

In fact, it already seems to be happening. Corporations like IBM, Accenture, and Bank of America have begun opting for “skills-based” rather than college-based hiring. According to a CNBC report, a recent Harvard Business School study examined job postings for software quality-assurance engineers and “found that only 26% of Accenture’s postings for the job contained a degree requirement. At IBM, just 29% did.” Even the government is dropping some college-degree requirements. According to the same report, in January 2021, the White House issued an executive order on “Limiting [the] Use of Educational Requirements in Federal Service Contracts.” When hiring for IT positions, the order says, considering only those with college degrees “excludes capable candidates and undermines labor market efficiencies.” And recently, Maryland announced that it’s dropping the college graduation requirement for thousands of state positions.

Of course, this entire economic argument assumes that the value of a college education is purely extrinsic and can be fully measured in dollars. As a long-time college teacher, I still believe that education has an intrinsic value, beyond preparing “job-ready” workers or increasing their earning potential. At its best, college offers a unique opportunity to encounter new ideas in expansive ways, learn how to weigh evidence and arguments, and contemplate what it means to be a human being and a citizen of the world. It can make democracy possible in a time of creeping authoritarianism.

What kind of future do we face in a world where such an experience could be reduced, like knitting (which was once an honorable way for both sexes to earn a living), to a mere hobby for women?

Protesting 'this country’s military death machine': confessions of an American tax resister

Rebecca Gordon, War, Death, and Taxes

I don’t normally do this, but in the context of TomDispatch regular Rebecca Gordon’s latest all-too-well-timed piece on paying (or rather not paying) one’s taxes, let me quote a couple of paragraphs I once wrote for this site about my own distant past and then briefly explain why:

And here’s a little story from the Neolithic age we now call ‘the Sixties’ about that moment when the U.S. military was still a citizen’s army with a draft (even if plenty of people figured out how to get exemptions). At a large demonstration, I turned in my draft card to protest the war. Not long after, my draft board summoned me. I knew when I got there that I had a right to look at my draft file, so I asked to see it. I have no idea what I thought I would find in it, but at 25, despite my antiwar activism, I still retained a curiously deep and abiding faith in my government. When I opened that file and found various documents from the FBI, I was deeply shocked. The Bureau, it turned out, had its eyes on me. Anxious about the confrontation to come — the members of my draft board would, in fact, soon quite literally be shouting at me and threatening to call me up essentially instantaneously — I remember touching one of those FBI documents and it was as if an electric current had run directly through my body. I couldn’t shake the Big Brotherness of it all, though undoubtedly my draft card had gone more or less directly from that demonstration to the Bureau.
As it happened, my draft board’s threats put me among the delinquent 1-A files to be called up next. Not long after, in July 1970 — I would read about it on the front page of the New York Times — a group of five antiwar activists, calling themselves Women Against Daddy Warbucks, broke into that very draft board, located in Rockefeller Center in New York City, took the 1-A files, shredded them, and tossed them like confetti around that tourist spot. And I never heard from my draft board again. Lucky me at that time. Of course, so many young, draftable American men had no such luck. They were indeed sent to Vietnam to fight and suffer, sometimes to be wounded or killed, or (as surprising numbers of them did) join the antiwar movement of that moment.

Those paragraphs came to mind because of the story Rebecca Gordon tells today about her own urge to resist America’s wars and the moment when she could personally go no further. It made me realize that, in some sense, thanks to those five women long ago, I was relieved of a decision I have no idea how I would have dealt with in the end. At that time, many young men like me were going to Canada rather than be drafted into the military and risk deployment to Vietnam. But I actually visited Canada soon after I turned in my draft card and, much as I liked the neighborhoods in Toronto where I spent time, I found I simply couldn’t imagine leaving my country, no matter what. It just wasn’t me. And that meant, when I was called up again, choosing either jail or the military. It was my luck, I suppose, that I never had to make that decision, which undoubtedly would have led to a very different life than the one I’ve had.

Other people, Gordon included, then and since, weren’t so lucky. No group called Five Women Against Uncle Sam destroyed her tax records in 1992 and so, today, she can tell you her antiwar story and remind us that we all have limits when it comes to our moments of decision. Tom

“Too Distraught”

Confessions of a Failed Tax Resister

Every April, as income-tax returns come due, I think about the day 30 years ago when I opened my rented mailbox and saw a business card resting inside. Its first line read, innocently enough, “United States Treasury.” It was the second line — “Internal Revenue Service” — that took my breath away. That card belonged to an IRS revenue agent and scrawled across it in blue ink was the message: “Call me.”

I’d used that mailbox as my address on the last tax return I’d filed, eight years earlier. Presumably, the agent thought she’d be visiting my home when she appeared at the place where I rented a mailbox, which, as I would discover, was the agency’s usual first step in running down errant taxpayers. Hands shaking, I put a quarter in a payphone and called my partner. “What’s going to happen to us?” I asked her.

Resisting War Taxes

I knew that the IRS wasn’t visiting me as part of an audit of my returns, since I hadn’t filed any for eight years. My partner and I were both informal tax resisters — she, ever since joining the pacifist Catholic Worker organization; and I, ever since I’d returned from Nicaragua in 1984. I’d spent six months traveling that country’s war zones as a volunteer with Witness for Peace. My work involved recording the testimony of people who had survived attacks by the “Contras,” the counterrevolutionary forces opposing the leftist Sandinista government then in power (after a popular uprising deposed the U.S.-backed dictator, Anastasio Somoza). At the time, the Contras were being illegally supported by the administration of President Ronald Reagan.

With training and guidance from the CIA, they were using a military strategy based on terrorizing civilians in the Nicaraguan countryside. Their targets included newly built schools, clinics, roads, and phone lines — anything the revolutionary government had, in fact, achieved — along with the campesinos (the families of subsistence farmers) who used such things. Contra attacks very often involved torture: flaying people alive, severing body parts, cutting open the wombs of pregnant women. Nor were such acts mere aberrations. They were strategic choices made by a force backed and directed by the United States.

When I got back to the United States, I simply couldn’t imagine paying taxes to subsidize the murder of people in another country, some of whom I knew personally. I continued working, first as a bookkeeper, then at a feminist bookstore, and eventually at a foundation. But with each new employer, on my W-4 form I would claim that I expected to owe no taxes that year, so the IRS wouldn’t take money out of my paycheck. And I stopped filing tax returns.

Not paying taxes for unjust wars has a long history in this country. It goes back at least to Henry David Thoreau’s refusal to pay them to support the Mexican-American War (1846-1848). His act of resistance landed him in jail for a night and led him to write On the Duty of Civil Disobedience, dooming generations of high-school students to reading the ruminations of a somewhat self-satisfied tax resister. Almost a century later, labor leader and pacifist A.J. Muste revived Thoreau’s tradition, once even filing a copy of the Duty of Civil Disobedience in place of his Form 1040. After supporting textile factory workers in their famous 1919 strike in Lawrence, Massachusetts, and some 20 years later helping form and run the Amalgamated Textile Workers of America (where my mother once worked as a labor organizer), Muste eventually came to serve on the board of the War Resisters League (WRL).

For almost a century now, WRL, along with the even older Fellowship of Reconciliation and other peace groups, has promoted antiwar tax resistance as a nonviolent means of confronting this country’s militarism. In recent years, both organizations have expanded their work beyond opposing imperial adventures overseas to stand against racist, militarized policing at home as well.

Your Tax Dollars at Work

Each year, the WRL publishes a “pie chart” poster that explains “where your income tax money really goes.” In most years, more than half of it is allocated to what’s euphemistically called “defense.” This year’s poster, distinctly an outlier, indicates that pandemic-related spending boosted the non-military portion of the budget above the 50% mark for the first time in decades. Still, at $768 billion, we now have the largest Pentagon budget in history (and it’s soon to grow larger yet). That’s a nice reward for a military whose main achievements in this century are losing major wars in Iraq and Afghanistan.

But doesn’t the war in Ukraine justify all those billions? Not if you consider that none of the billions spent in previous years stopped Russia from invading. As Lindsay Koshgarian argues at Newsweek, “Colossal military spending didn’t prevent the Russian invasion, and more money won’t stop it. The U.S. alone already spends 12 times more on its military than Russia. When combined with Europe’s biggest military spenders, the U.S. and its allies on the continent outspend Russia by at least 15 to 1. If more military spending were the answer, we wouldn’t be in this situation.”

“Defense” spending could, however, just as accurately be described as welfare for military contractors, because that’s where so much of the money eventually ends up. The top five weapons-making companies in 2021 were Lockheed Martin, Raytheon Technologies, Boeing, Northrop Grumman, and General Dynamics. Together, they reaped $198 billion in taxpayer funds last year alone. In 2020, the top 100 contractors took in $551 billion. Of course, we undoubtedly got some lovely toys for our money, but I’ve always found it difficult to live in — or eat — a drone. They’re certainly useful, however, for murdering a U.S. citizen in Yemen or so many civilians elsewhere in the Greater Middle East and Africa.

The Pentagon threatens the world with more than the direct violence of war. It’s also a significant factor driving climate change. The U.S. military is the world’s largest institutional consumer of oil. If it were a country, the Pentagon would rank 55th among the world’s carbon emitters.

While the military budget increases yearly, federal spending that actually promotes human welfare has fallen over the last decade. In fact, such spending for the program most Americans think of when they hear the word “welfare” — Temporary Assistance for Needy Families, or TANF — hasn’t changed much since 1996, the year the Personal Responsibility and Work Opportunity Reconciliation Act (so-called welfare reform) took effect. In 1997, federal expenditures for TANF totaled about $16.6 billion. That figure has remained largely unchanged. However, according to the Congressional Research Service, since the program began, such expenditures have actually dropped 40% in value, thanks to inflation.

Unlike military outlays, spending for the actual welfare of Americans doesn’t increase over time. In fact, as a result of the austerity imposed by the 2011 Budget Control Act, the Center on Budget and Policy Priorities reports that “by 2021 non-defense funding (excluding veterans’ health care) was about 9% lower than it had been 11 years earlier after adjusting for inflation and population growth.” Note that Congress passed that austerity measure a mere three years after the subprime lending crisis exploded, initiating the Great Recession, whose reverberations still ring in our ears.

This isn’t necessarily how taxpayers want their money spent. In one recent poll, a majority of them, given the choice, said they would prioritize education, social insurance, and health care. A third would rather that their money not be spent on war at all. And almost 40% believed that the federal government simply doesn’t spend enough on social-welfare programs.

Death May Be Coming for Us All, But Taxes Are for the Little People

Pollsters don’t include corporations like Amazon, FedEx, and Nike in their surveys of taxpayers. Perhaps the reason is that those corporate behemoths often don’t pay a dollar in income tax. In 2020, in fact, 55 top U.S. companies paid no corporate income taxes whatsoever. Nor would the survey takers have polled billionaires like Jeff Bezos, Elon Musk, or Carl Icahn, all of whom also manage the neat trick of not paying any income tax at all some years.

In 2021, using “a vast trove of Internal Revenue Service data on the tax returns of thousands of the nation’s wealthiest people, covering more than 15 years,” ProPublica published a report on how much the rich really pay in taxes. The data show that, between 2014 and 2018, the richest Americans paid a measly “true tax” rate of 3.4% on the growth of their wealth over that period. The average American — you — typically pays 14% of his or her income each year in federal income tax. As ProPublica explains:

America’s billionaires avail themselves of tax-avoidance strategies beyond the reach of ordinary people. Their wealth derives from the skyrocketing value of their assets, like stock and property. Those gains are not defined by U.S. laws as taxable income unless and until the billionaires sell.

So, if the rich avoid paying taxes by holding onto their assets instead of selling them, where do they get the money to live like the billionaires they are? The answer isn’t complicated: they borrow it. Using their wealth as collateral, they typically borrow millions of dollars to live on, using the interest on those loans to offset any income they might actually receive in a given year and so reducing their taxes even more.
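To make that arithmetic concrete, here is a minimal sketch with invented figures (none of them drawn from ProPublica’s data) of how a “true tax” rate like that 3.4% is calculated: the tax actually paid, measured against the growth of wealth rather than against reported income.

    # Hypothetical illustration of the "true tax" arithmetic described above.
    # All figures are invented for this example; none come from ProPublica's data.

    wealth_growth = 5_000_000_000    # hypothetical five-year growth in a billionaire's net worth
    reported_income = 500_000_000    # hypothetical taxable income reported over those years
    federal_tax_paid = 170_000_000   # hypothetical federal income tax actually paid

    conventional_rate = federal_tax_paid / reported_income  # the rate the tax code measures
    true_tax_rate = federal_tax_paid / wealth_growth        # the ProPublica-style "true tax" rate

    print(f"Conventional income-tax rate: {conventional_rate:.1%}")   # 34.0%
    print(f"'True tax' rate on wealth growth: {true_tax_rate:.1%}")   # 3.4%

The gap between those two numbers is the whole game: as long as the growing wealth is never sold, and living expenses are borrowed against it instead, the larger denominator never shows up on a tax return.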

While they do avoid paying taxes, I’m pretty sure those plutocrats aren’t tax resisters. They’re not using such dodges to avoid paying for U.S. military interventions around the world, which was why I stopped paying taxes for almost a decade. Through the Reagan administration and the first Bush presidency, with the Savings and Loan debacle and the first Gulf War, there was little the U.S. government was doing that I wanted to support.

These days, however, having lived through the “greed is good” decade, having watched a particularly bizarre version of American individualism reach its pinnacle in the presidency of billionaire Donald Trump, I think about taxes a bit differently. I still don’t want to pay for the organized global version of murder that is war, American-style, but I’ve also come to see that taxes are an important form of communal solidarity. Our taxes allow us, through the government, to do things together we can’t do as individuals — like generating electricity or making sure our food is clean and safe. In a more truly democratic society, people like me might feel better about paying taxes, since we’d be deciding more collectively how to spend our common wealth for the common good. We might even buy fewer drones.

Until that day comes, there are still many ways, as the War Resisters League makes clear, to resist paying war taxes, should you choose to do so. I eventually started filing my returns again and paid off eight years of taxes, penalties, and interest. It wasn’t the life decision I’m proudest of, but here’s what happened.

“Too Distraught”

The method I chose, as I’ve said, was not to file my tax returns, which, as long as your employer isn’t withholding taxes and sending them to the feds, denies the federal government any tax revenue from you. Mind you, for most of those years I wasn’t making much money. We’re talking about hundreds of dollars, not hundreds of thousands of dollars in lost tax revenue. Over those years, I got just the occasional plaintive query from the IRS about whether I’d forgotten my taxes. But during the mid-1980s, the IRS upgraded its computers, improving its ability to capture income reported by employers and so enabling it to recreate the returns a taxpayer should have filed, but didn’t. And so, in 1992, an IRS agent left a business card in my mailbox.

Only a month earlier, a friend, my partner, and I had bought a house together. So, when I saw the “Call me” on the agent’s business card, I was terrified that my act of conscience was going to lose us our life savings. Trembling, I called the revenue agent and set up an appointment at the San Francisco federal building, a place I only knew as the site of many antiwar demonstrations I’d attended.

I remember the agent meeting us at the entrance to a room filled with work cubicles. I took a look at her and my gaydar went off. “Oh, my goodness,” I thought, “she’s a lesbian!” Maybe that would help somehow — not that I imagined for a second that my partner and I were going to get the “family discount” we sometimes received from LGBT cashiers.

The three of us settled into her cubicle. She told me that I would have to file returns from 1986 to 1991 (the IRS computers, it turned out, couldn’t reach back further than that) and also pay the missing taxes, penalties, and interest on all of it. With an only partially feigned quaver in my voice, I asked, “Are you going to take our house away?”

She raised herself from her chair just enough to scan the roomful of cubicles around us, then sat down again. Silently, she shook her head. Well, it may not have been the family discount, but it was good enough for me.

Then she asked why I hadn’t filed my taxes and, having already decided I was going to pay up, I didn’t explain anything about those Nicaraguan families our government had maimed or murdered. I didn’t say why I’d been unwilling or what I thought it meant to pay for this country’s wars in Central America or preparations for more wars to come. “I just kept putting it off,” I said, which was true enough, if not the whole truth.

Somehow, she bought that and asked me one final question, “By the way, what do you do for a living?”

“I’m an accountant,” I replied.

Her eyebrows flew up and she shook her head, but that was that.

Why did I give up so easily? There were a few reasons. The Contra war in Nicaragua had ended after the Sandinistas lost the national elections in 1990. Nicaraguans weren’t stupid. They grasped that, as long as the Sandinistas were in power, the U.S. would continue to embargo their exports and arm and train the Contras. And I’d made some changes in my own life. After decades of using part-time paid work to support my full-time activism, I’d taken a “grown-up” job to help pay my ailing and impoverished mother’s rent, once I convinced her to move from subsidized housing in Cambridge, Massachusetts, to San Francisco. And, of course, I’d just made a fundamental investment of my own in the status quo. I’d bought a house. Even had I been willing to lose it, I couldn’t ask my co-owners to suffer for my conscience.

But in the end, I also found I just didn’t have the courage to defy the government of the world’s most powerful country.

As it happened, I wasn’t the only person in the Bay Area to get a visit from a revenue agent that year. The IRS, it turned out, was running a pilot program to see whether they could capture more unpaid taxes by diverting funds from auditing to directly pursuing non-filers like me. Several resisters I knew were caught in their net, including my friend T.J.

An agent came to T.J.’s house and sat at his kitchen table. Unlike “my” agent, T.J.’s not only asked him why he hadn’t filed his returns, but read from a list of possible reasons: “Did you have a drug or alcohol problem? Were you ill? Did you have trouble filling out the forms?”

“Don’t you have any political reasons on your list?” T.J. asked.

The agent looked doubtful. “Political? Well, there’s ‘too distraught.’”

“That’s it,” said T.J. “Put down ‘too distraught.’”

T.J. died years ago, but I remember him every tax season when I again have to reckon with just how deeply implicated all of us are in this country’s military death machine, whether we pay income taxes or not. Still, so many of us keep on keeping on, knowing we must never become too distraught to find new ways to oppose military aggression anywhere in the world, including, of course, Ukraine, while affirming life as best we can.

Automated killer robots aren't science fiction anymore — and the world isn't ready

Here’s a scenario to consider: a military force has purchased a million cheap, disposable flying drones, each the size of a deck of cards and each capable of carrying three grams of explosives — enough to kill a single person or, in a “shaped charge,” pierce a steel wall. They’ve been programmed to seek out and “engage” (kill) certain human beings, based on specific “signature” characteristics like carrying a weapon, say, or having a particular skin color. They fit in a single shipping container and can be deployed remotely. Once launched, they will fly and kill autonomously without any further human action.

Science fiction? Not really. It could happen tomorrow. The technology already exists.

In fact, lethal autonomous weapons systems (LAWS) have a long history. During the spring of 1972, I spent a few days occupying the physics building at Columbia University in New York City. With a hundred other students, I slept on the floor, ate donated takeout food, and listened to Allen Ginsberg when he showed up to honor us with some of his extemporaneous poetry. I wrote leaflets then, commandeering a Xerox machine to print them out.

And why, of all campus buildings, did we choose the one housing the Physics department? The answer: to convince five Columbia faculty physicists to sever their connections with the Pentagon’s Jason Defense Advisory Group, a program offering money and lab space to support basic scientific research that might prove useful for U.S. war-making efforts. Our specific objection: the involvement of Jason’s scientists in designing parts of what was then known as the “automated battlefield” for deployment in Vietnam. That system would indeed prove a forerunner of the lethal autonomous weapons systems that are poised to become a potentially significant part of this country’s — and the world’s — armory.

Early (Semi-)Autonomous Weapons

Washington faced quite a few strategic problems in prosecuting its war in Indochina, including the general corruption and unpopularity of the South Vietnamese regime it was propping up. Its biggest military challenge, however, was probably North Vietnam’s continual infiltration of personnel and supplies on what was called the Ho Chi Minh Trail, which ran from north to south along the Cambodian and Laotian borders. The Trail was, in fact, a network of easily repaired dirt roads and footpaths, streams and rivers, lying under a thick jungle canopy that made it almost impossible to detect movement from the air.

The U.S. response, developed by Jason in 1966 and deployed the following year, was an attempt to interdict that infiltration by creating an automated battlefield composed of four parts, analogous to a human body’s eyes, nerves, brain, and limbs. The eyes were a broad variety of sensors — acoustic, seismic, even chemical (for sensing human urine) — most dropped by air into the jungle. The nerve equivalents transmitted signals to the “brain.” However, since the sensors had a maximum transmission range of only about 20 miles, the U.S. military had to constantly fly aircraft above the foliage to catch any signal that might be tripped by passing North Vietnamese troops or transports. The planes would then relay the news to the brain. (Originally intended to be remote controlled, those aircraft performed so poorly that human pilots were usually necessary.)

And that brain, a magnificent military installation secretly built in Thailand’s Nakhon Phanom, housed two state-of-the-art IBM mainframe computers. A small army of programmers wrote and rewrote the code to keep them ticking, as they attempted to make sense of the stream of data transmitted by those planes. The target coordinates they came up with were then transmitted to attack aircraft, which were the limb equivalents. The group running that automated battlefield was designated Task Force Alpha and the whole project went under the code name Igloo White.

As it turned out, Igloo White was largely an expensive failure, costing about a billion dollars a year for five years (almost $40 billion total in today’s dollars). The time lag between a sensor tripping and munitions dropping made the system ineffective. As a result, at times Task Force Alpha simply carpet-bombed areas where a single sensor might have gone off. The North Vietnamese quickly realized how those sensors worked and developed methods of fooling them, from playing truck-ignition recordings to planting buckets of urine.

Given the history of semi-automated weapons systems like drones and “smart bombs” in the intervening years, you probably won’t be surprised to learn that this first automated battlefield couldn’t discriminate between soldiers and civilians. In this, it merely continued a trend, in place since at least the eighteenth century, of wars routinely killing more civilians than combatants.

None of these shortcomings kept Defense Department officials from regarding the automated battlefield with awe. Andrew Cockburn described this worshipful posture in his book Kill Chain: The Rise of the High-Tech Assassins, quoting Leonard Sullivan, a high-ranking Pentagon official who visited Vietnam in 1968: “Just as it is almost impossible to be an agnostic in the Cathedral of Notre Dame, so it is difficult to keep from being swept up in the beauty and majesty of the Task Force Alpha temple.”

Who or what, you well might wonder, was to be worshipped in such a temple?

Most aspects of that Vietnam-era “automated” battlefield actually required human intervention. Human beings were planting the sensors, programming the computers, piloting the airplanes, and releasing the bombs. In what sense, then, was that battlefield “automated”? As a harbinger of what was to come, the system had eliminated human intervention at a single crucial point in the process: the decision to kill. On that automated battlefield, the computers decided where and when to drop the bombs.

In 1969, Army Chief of Staff William Westmoreland expressed his enthusiasm for this removal of the messy human element from war-making. Addressing a luncheon for the Association of the U.S. Army, a lobbying group, he declared:

“On the battlefield of the future enemy forces will be located, tracked, and targeted almost instantaneously through the use of data links, computer-assisted intelligence evaluation, and automated fire control. With first round kill probabilities approaching certainty, and with surveillance devices that can continually track the enemy, the need for large forces to fix the opposition will be less important.”

What Westmoreland meant by “fix the opposition” was kill the enemy. The twenty-first century’s preferred military euphemism is “engage.” In either case, the meaning is the same: the role of lethal autonomous weapons systems is to automatically find and kill human beings, without human intervention.

New LAWS for a New Age — Lethal Autonomous Weapons Systems

Every autumn, the British Broadcasting Corporation sponsors a series of four lectures given by an expert in some important field of study. In 2021, the BBC invited Stuart Russell, professor of computer science and founder of the Center for Human-Compatible Artificial Intelligence at the University of California, Berkeley, to deliver those “Reith Lectures.” His general subject was the future of artificial intelligence (AI), and the second lecture was entitled “The Future Role of AI in Warfare.” In it, he addressed the issue of lethal autonomous weapons systems, or LAWS, which the United Nations defines as “weapons that locate, select, and engage human targets without human supervision.”

Russell’s main point, eloquently made, was that, although many people believe lethal autonomous weapons are a potential future nightmare, residing in the realm of science fiction, “They are not. You can buy them today. They are advertised on the web.”

I’ve never seen any of the movies in the Terminator franchise, but apparently military planners and their PR flacks assume most people derive their understanding of such LAWS from this fictional dystopian world. Pentagon officials are frequently at pains to explain why the weapons they are developing are not, in fact, real-life equivalents of SkyNet — the worldwide communications network that, in those films, becomes self-aware and decides to eliminate humankind. Not to worry, as a deputy secretary of defense told Russell, “We have listened carefully to these arguments and my experts have assured me that there is no risk of accidentally creating SkyNet.”

Russell’s point, however, was that a weapons system doesn’t need self-awareness to act autonomously or to present a threat to innocent human beings. What it does need is:

  • A mobile platform (anything that can move, from a tiny quadcopter to a fixed-wing aircraft)
  • Sensory capacity (the ability to detect visual or sound information)
  • The ability to make tactical decisions (the same kind of capacity already found in computer programs that play chess)
  • The ability to “engage,” i.e. kill (which can be as complicated as firing a missile or dropping a bomb, or as rudimentary as committing robot suicide by slamming into a target and exploding)

The reality is that such systems already exist. Indeed, a government-owned weapons company in Turkey recently advertised its Kargu drone — a quadcopter “the size of a dinner plate,” as Russell described it, which can carry a kilogram of explosives and is capable of making “anti-personnel autonomous hits” with “targets selected on images and face recognition.” The company’s site has since been altered to emphasize its adherence to a supposed “man-in-the-loop” principle. However, the U.N. has reported that a fully autonomous Kargu-2 was, in fact, deployed in Libya in 2020.

You can buy your own quadcopter right now on Amazon, although you’ll still have to apply some DIY computer skills if you want to get it to operate autonomously.

The truth is that lethal autonomous weapons systems are less likely to look like something from the Terminator movies than like swarms of tiny killer bots. Computer miniaturization means that the technology already exists to create effective LAWS. If your smartphone could fly, it could be an autonomous weapon. Newer phones use facial-recognition software to “decide” whether to allow access. It’s not a leap to create flying weapons the size of phones, programmed to “decide” to attack specific individuals, or individuals with specific features. Indeed, it’s likely such weapons already exist.

Can We Outlaw LAWS?

So, what’s wrong with LAWS, and is there any point in trying to outlaw them? Some opponents argue that the problem is they eliminate human responsibility for making lethal decisions. Such critics suggest that, unlike a human being aiming and pulling the trigger of a rifle, a LAWS can choose and fire at its own targets. Therein, they argue, lies the special danger of these systems, which will inevitably make mistakes, as anyone whose iPhone has refused to recognize his or her face will acknowledge.

In my view, the issue isn’t that autonomous systems remove human beings from lethal decisions. To the extent that weapons of this sort make mistakes, human beings will still bear moral responsibility for deploying such imperfect lethal systems. LAWS are designed and deployed by human beings, who therefore remain responsible for their effects. Like the semi-autonomous drones of the present moment (often piloted from half a world away), lethal autonomous weapons systems don’t remove human moral responsibility. They just increase the distance between killer and target.

Furthermore, like already outlawed arms, including chemical and biological weapons, these systems have the capacity to kill indiscriminately. While they may not obviate human responsibility, once activated, they will certainly elude human control, just like poison gas or a weaponized virus.

And as with chemical, biological, and nuclear weapons, their use could effectively be prevented by international law and treaties. True, rogue actors, like the Assad regime in Syria or the U.S. military in the Iraqi city of Fallujah, may occasionally violate such strictures, but for the most part, prohibitions on the use of certain kinds of potentially devastating weaponry have held, in some cases for over a century.

Some American defense experts argue that, since adversaries will inevitably develop LAWS, common sense requires this country to do the same, implying that the best defense against a given weapons system is an identical one. That makes as much sense as fighting fire with fire when, in most cases, using water is much the better option.

The Convention on Certain Conventional Weapons

The area of international law that governs the treatment of human beings in war is, for historical reasons, called international humanitarian law (IHL). In 1995, the United States ratified an addition to IHL: the 1980 U.N. Convention on Certain Conventional Weapons. (Its full title is much longer, but its name is generally abbreviated as CCW.) It governs the use of, for example, incendiary weapons like napalm, as well as booby traps and blinding laser weapons.

The signatories to CCW meet periodically to discuss what other weaponry might fall under its jurisdiction and prohibitions, including LAWS. The most recent conference took place in December 2021. Although transcripts of the proceedings exist, only a draft final document — produced before the conference opened — has been issued. This may be because no consensus was even reached on how to define such systems, let alone on whether they should be prohibited. The European Union, the U.N., at least 50 signatory nations, and (according to polls) most of the world’s population believe that autonomous weapons systems should be outlawed. The U.S., Israel, the United Kingdom, and Russia disagree, along with a few other outliers.

Prior to such CCW meetings, a Group of Governmental Experts (GGE) convenes, ostensibly to provide technical guidance for the decisions to be made by the Convention’s “high contracting parties.” In 2021, the GGE was unable to reach a consensus about whether such weaponry should be outlawed. The United States held that even defining a lethal autonomous weapon was unnecessary (perhaps because weapons that can be defined can also be outlawed). The U.S. delegation put it this way:

“The United States has explained our perspective that a working definition should not be drafted with a view toward describing weapons that should be banned. This would be — as some colleagues have already noted — very difficult to reach consensus on, and counterproductive. Because there is nothing intrinsic in autonomous capabilities that would make a weapon prohibited under IHL, we are not convinced that prohibiting weapons based on degrees of autonomy, as our French colleagues have suggested, is a useful approach.”

The U.S. delegation was similarly keen to eliminate any language that might require “human control” of such weapons systems:

“[In] our view IHL does not establish a requirement for ‘human control’ as such… Introducing new and vague requirements like that of human control could, we believe, confuse, rather than clarify, especially if these proposals are inconsistent with long-standing, accepted practice in using many common weapons systems with autonomous functions.”

In the same meeting, that delegation repeatedly insisted that lethal autonomous weapons would actually be good for us, because they would surely prove better than human beings at distinguishing between civilians and combatants.

Oh, and if you believe that protecting civilians is the reason the arms industry is investing billions of dollars in developing autonomous weapons, I’ve got a patch of land to sell you on Mars that’s going cheap.

The Campaign to Stop Killer Robots

The Group of Governmental Experts also has about 35 non-state members, including non-governmental organizations and universities. The Campaign to Stop Killer Robots, a coalition of 180 organizations, among them Amnesty International, Human Rights Watch, and the World Council of Churches, is one of these. Launched in 2013, this vibrant group provides important commentary on the technical, legal, and ethical issues presented by LAWS and offers other organizations and individuals a way to become involved in the fight to outlaw such potentially devastating weapons systems.

The continued construction and deployment of killer robots is not inevitable. Indeed, a majority of the world would like to see them prohibited, including U.N. Secretary General Antonio Guterres. Let’s give him the last word: “Machines with the power and discretion to take human lives without human involvement are politically unacceptable, morally repugnant, and should be prohibited by international law.”

I couldn’t agree more.

Copyright 2022 Rebecca Gordon

Featured image: Killer Robots by Global Panorama is licensed under CC BY-SA 2.0 / Flickr


Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of Mainstreaming Torture, American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.

Graveyard shift: The dark reality of the modern economy reveals itself under pandemic-era demands

In mid-October, President Biden announced that the Port of Los Angeles would begin operating 24 hours a day, seven days a week, joining the nearby Port of Long Beach, which had been doing so since September. The move followed weeks of White House negotiations with the International Longshore and Warehouse Union, as well as shippers like UPS and FedEx, and major retailers like Walmart and Target.

The purpose of expanding port hours, according to the New York Times, was “to relieve growing backlogs in the global supply chains that deliver critical goods to the United States.” Reading this, you might be forgiven for imagining that an array of crucial items like medicines or their ingredients or face masks and other personal protective equipment had been languishing in shipping containers anchored off the West Coast. You might also be forgiven for imagining that workers, too lazy for the moment at hand, had chosen a good night’s sleep over the vital business of unloading such goods from boats lined up in their dozens offshore onto trucks, and getting them into the hands of the Americans desperately in need of them. Reading further, however, you’d learn that those “critical goods” are actually things like “exercise bikes, laptops, toys, [and] patio furniture.”

Fair enough. After all, as my city, San Francisco, enters what’s likely to be yet another almost rainless winter on a planet in ever more trouble, I can imagine my desire for patio furniture rising to a critical level. So, I’m relieved to know that dock workers will now be laboring through the night at the command of the president of the United States to guarantee that my needs are met. To be sure, shortages of at least somewhat more important items are indeed rising, including disposable diapers and the aluminum necessary for packaging some pharmaceuticals. Still, a major focus in the media has been on the specter of “slim pickings this Christmas and Hanukkah.”

Providing “critical” yard furnishings is not the only reason the administration needs to unkink the supply chain. It’s also considered an anti-inflation measure (if an ineffective one). At the end of October, the Consumer Price Index had jumped 6.2% over the same period in 2020, the highest inflation rate in three decades. Such a rise is often described as the result of too much money chasing too few goods. One explanation for the current rise in prices is that, during the worst months of the pandemic, many Americans actually saved money, which they’re now eager to spend. When the things people want to buy are in short supply — perhaps even stuck on container ships off Long Beach and Los Angeles — the price of those that are available naturally rises.

Republicans have christened the current jump in the Consumer Price Index “Bidenflation,” although the administration actually bears little responsibility for the situation. But Joe Biden and the rest of the Democrats know one thing: if it looks like they’re doing nothing to bring prices down, there will be hell to pay at the polls in 2022, and so it’s the night shift for dock workers and others in Los Angeles, Long Beach, and possibly other American ports.

However, running West Coast ports 24/7 won’t solve the supply-chain problem, not when there aren’t enough truckers to carry that critical patio furniture to Home Depot. The shortage of such drivers arises because there’s more demand than ever before, and because many truckers have simply quit the industry. As the New York Times reports, “Long hours and uncomfortable working conditions are leading to a shortage of truck drivers, which has compounded shipping delays in the United States.”

Rethinking (Shift) Work

Truckers aren’t the only workers who have been rethinking their occupations since the coronavirus pandemic pressed the global pause button. The number of employees quitting their jobs hit 4.4 million this September, about 3% of the U.S. workforce. Resignations were highest in industries like hospitality and medicine, where employees are most at risk of Covid-19 exposure.

For the first time in many decades, workers are in the driver’s seat. They can command higher wages and demand better working conditions. And that’s exactly what they’re doing at workplaces ranging from agricultural equipment manufacturer John Deere to breakfast-cereal makers Kellogg and Nabisco. I’ve even been witnessing it in my personal labor niche, part-time university faculty (of which I’m one). So allow me to pause here for a shout-out to the 6,500 part-time professors in the University of California system: Thank you! Your threat of a two-day strike won a new contract with a 30% pay raise over the next five years!

This brings me to Biden’s October announcement about those ports going 24/7. In addition to demanding higher pay, better conditions, and an end to two-tier compensation systems (in which laborers hired later don’t get the pay and benefits available to those already on the job), workers are now in a position to reexamine and, in many cases, reject the shift-work system itself. And they have good reason to do so.

So, what is shift work? It’s a system that allows a business to run continuously, ceaselessly turning out and/or transporting widgets year after year. Workers typically labor in eight-hour shifts: 8:00 a.m. to 4:00 p.m., 4:00 p.m. to midnight, and midnight to 8:00 a.m., or the like. In times of labor shortages, they can even be forced to work double shifts, 16 hours in total. Businesses love shift work because it reduces time (and money) lost to powering machinery up and down. And if time is money, then more time worked means more profit for corporations. In many industries, shift work is good for business. But for workers, it’s often another story.

The Graveyard Shift

Each shift in a 24-hour schedule has its own name. The day shift is the obvious one. The swing shift takes you from the day shift to the all-night, or graveyard, shift. According to folk etymology, that shift got its name because, once upon a time, cemetery workers were supposed to stay up all night listening for bells rung by unfortunates who awakened to discover they’d been buried alive. While it’s true that some coffins in England were once fitted with such bells, the term was more likely a reference to the eerie quiet of the world outside the workplace during the hours when most people are asleep.

I can personally attest to the strangeness of life on the graveyard shift. I once worked in an ice cream cone factory. Day and night, noisy, smoky machines resembling small Ferris wheels carried metal molds around and around, while jets of flame cooked the cones inside them. After a rotation, each mold would tip, releasing four cones onto a conveyor belt, rows of which would then approach my station relentlessly. I’d scoop up a stack of 25, twirl them around in a quick check for holes, and place them in a tall box.

Almost simultaneously, I’d make cardboard dividers, scoop up three more of those stacks and seal them, well-divided, in that box, which I then inserted in an even larger cardboard carton and rushed to a giant mechanical stapler. There, I pressed it against a switch, and — boom-ba-da-boom — six large staples would seal it shut, leaving me just enough time to put that carton atop a pallet of them before racing back to my machine, as new columns of just-baked cones piled up, threatening to overwhelm my worktable.

The only time you stopped scooping and boxing was when a relief worker arrived, so you could have a brief break or gobble down your lunch. You rarely talked to your fellow-workers, because there was only one “relief” packer, so only one person at a time could be on break. Health regulations made it illegal to drink water on the line and management was too cheap to buy screens for the windows, which remained shut, even when it was more than 100 degrees outside.

They didn’t like me very much at the Maryland Pacific Cone Company, maybe because I wanted to know why the high school boys who swept the floors made more than the women who, since the end of World War II, had been climbing three rickety flights of stairs to stand by those machines. In any case, management there started messing with my shifts, assigning me to all three in the same week. As you might imagine, I wasn’t sleeping a whole lot and would occasionally resort to those “little white pills” immortalized in the truckers’ song “Six Days on the Road.”

But I’ll never forget one graveyard shift when an angel named Rosie saved my job and my sanity. It was probably three in the morning. I’d been standing under fluorescent lights, scooping, twirling, and boxing for hours when the universe suddenly stood still. I realized at that moment that I’d never done anything else since the beginning of time but put ice cream cones in boxes and would never stop doing so until the end of time.

If time lost its meaning then, dimensions still turned out to matter a lot, because the cones I was working on that night were bigger than I was used to. Soon I was falling behind, while a huge mound of 40-ounce Eat-It-Alls covered my table and began to spill onto the floor. I stared at them, frozen, until I suddenly became aware that someone was standing at my elbow, gently pushing me out of the way.

Rosie, who had been in that plant since the end of World War II, said quietly, “Let me do this. You take my line.” In less than a minute, she had it all under control, while I spent the rest of the night at her machine, with cones of a size I could handle.

I have never been so glad to see the dawn.

The Deadly Reality of the Graveyard Shift

So, when the president of the United States negotiated to get dock workers in Los Angeles to work all night, I felt a twinge of horror. There’s another all-too-literal reason to call it the “graveyard” shift. It turns out that working when you should be in bed is dangerous. Not only do more accidents occur when the human body expects to be asleep, but the long-term effects of night work can be devastating. As the Centers for Disease Control and Prevention’s National Institute for Occupational Safety and Health (NIOSH) reports, the many adverse effects of night work include:

“type 2 diabetes, heart disease, stroke, metabolic disorders, and sleep disorders. Night shift workers might also have an increased risk for reproductive issues, such as irregular menstrual cycles, miscarriage, and preterm birth. Digestive problems and some psychological issues, such as stress and depression, are more common among night shift workers. The fatigue associated with nightshift can lead to injuries, vehicle crashes, and industrial disasters.”

Some studies have shown that such shift work can also lead to decreased bone-mineral density and so to osteoporosis. There is, in fact, a catchall term for all these problems: shift-work disorder.

In addition, studies directly link the graveyard shift to an increased incidence of several kinds of cancer, including breast and prostate cancer. Why would disrupted sleep rhythms cause cancer? Because such disruptions affect the release of the hormone melatonin. Most of the body’s cells contain little “molecular clocks” that respond to daily alternations of light and darkness. When the light dims at night, the pineal gland releases melatonin, which promotes sleep. In fact, many people take it in pill form as a “natural” sleep aid. Under normal circumstances, such a melatonin release continues until the body encounters light again in the morning.

When this daily (circadian) rhythm is disrupted, however, so is the regular production of melatonin, which turns out to have another important biological function. According to NIOSH, it “can also stop tumor growth and protect against the spread of cancer cells.” Unfortunately, if your job requires you to stay up all night, it won’t do this as effectively.

There’s a section on the NIOSH website that asks, “What can night shift workers do to stay healthy?” The answers are not particularly satisfying. They include regular checkups and seeing your doctor if you have any of a variety of symptoms, including “severe fatigue or sleepiness when you need to be awake, trouble with sleep, stomach or intestinal disturbances, irritability or bad mood, poor performance (frequent mistakes, injuries, vehicle crashes, near misses, etc.), unexplained weight gain or loss.”

Unfortunately, even if you have access to healthcare, your doctor can’t write you a prescription to cure shift-work disorder. The cure is to stop working when your body should be asleep.

An End to Shift Work?

Your doctor can’t solve your shift work issue because, ultimately, it’s not an individual problem. It’s an economic and an ethical one.

There will always be some work that must be performed while most people are sleeping, including healthcare, security, and emergency services, among others. But most shift work gets done not because life depends upon it, but because we’ve been taught to expect our patio furniture on demand. As long as advertising and the grow-or-die logic of capitalism keep stoking the desire for objects we don’t really need, may not even really want, and will sooner or later toss on a garbage pile in this or some other country, truckers and warehouse workers will keep damaging their health.

Perhaps the pandemic, with its kinky supply chain, has given us an opportunity to rethink which goods are so “critical” that we’re willing to let other people risk their lives to provide them for us. Unfortunately, such a global rethink hasn’t yet touched Joe Biden and his administration as they confront an ongoing pandemic, supply-chain problems, a rise in inflation, and — oh yes! — an existential climate crisis that gets worse with every plastic widget produced, packed, and shipped.

It’s time for Biden — and the rest of us — to take a breath and think this through. There are good reasons that so many people are walking away from underpaid, life-threatening work. Many of them are reconsidering the nature of work itself and its place in their lives, no matter what the president or anyone else might wish.

And that’s a paradigm shift we all could learn to live with.

Copyright 2021 Rebecca Gordon

Featured image: Port of Los Angeles sunrise by pete is licensed under CC BY 2.0 / Flickr


Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.

The curse of the ignored modern prophets

For decades, I kept a poster on my wall that I'd saved from the year I turned 16. In its upper left-hand corner was a black-and-white photo of a white man in a grey suit. Before him spread a cobblestone plaza. All you could see were the man and the stones. Its caption read, "He stood up alone and something happened."

It was 1968. "He" was Minnesota Senator Eugene McCarthy. As that campaign slogan suggested, his strong second-place showing in the New Hampshire primary was proof that opposition to the Vietnam War had finally become a viable platform for a Democratic candidate for president. I volunteered in McCarthy's campaign office that year. My memory of my duties is now vague, but they mainly involved alphabetizing and filing index cards containing information about the senator's supporters. (Remember, this was the age before there was a computer in every pocket, let alone social media and micro-targeting.)

Running against the Vietnam War, McCarthy was challenging then-President Lyndon Johnson in the Democratic primaries. After McCarthy's strong second-place showing in New Hampshire, New York Senator Robert F. Kennedy entered the race, too, running against the very war his brother, President John F. Kennedy, had bequeathed to Johnson when he was assassinated. Soon, Johnson would withdraw from the campaign, announcing in a televised national address that he wouldn't run for another term.

With his good looks and family name, Bobby Kennedy appeared to have a real chance for the nomination when, on June 5, 1968, during a campaign event in Los Angeles, he, like his brother, was assassinated. That left the war's opponents without a viable candidate for the nomination. Outside the Democratic Party convention in Chicago that August, tens of thousands of angry, mostly young Americans demonstrated their frustration with the war and the party's refusal to take a stand against it. In what was generally recognized as a police riot, the Chicago PD beat protesters and journalists bloody on national TV, as participants chanted, "The whole world is watching." And indeed, it was.

In the end, the nomination went to Johnson's vice president and war supporter Hubert Humphrey, who would face Republican hawk Richard Nixon that November. The war's opponents watched in frustration as the two major parties closed ranks, cementing their post-World-War-II bipartisan agreement to use military power to enforce U.S. global dominance.

Cassandra Foresees the Future

Of course, the McCarthy campaign's slogan was wrong on two counts. He didn't stand up alone. Millions of us around the world were then working to end the war in Vietnam. Sadly, nothing conclusive happened as a result of his campaign. Nixon went on to win the 1968 general election and the Vietnam War dragged on to an ignominious U.S. defeat seven years later.

Nineteen sixty-eight was also the year my high school put on Tiger at the Gates, French playwright Jean Giraudoux's antiwar drama about the run-up to the Trojan War. Giraudoux chronicled that ancient conflict's painful inevitability, despite the fervent desire of Troy's rulers and its people to prevent it. The play opens as Andromache, wife of the doomed Trojan warrior Hector, tells her sister-in-law Cassandra, "There's not going to be a Trojan war."

Cassandra, you may remember, bore a double curse from the gods: yes, she could see into the future, but no one would believe her predictions. She informs Andromache that she's wrong; that, like a tiger pacing outside the city's walls, war with all its bloody pain is preparing to spring. And, of course, she's right. Part of the play's message is that Cassandra doesn't need her supernatural gift to predict the future. She can guess what will happen simply because she understands the relentless forces driving her city to war: the poets who need tragedies to chronicle; the would-be heroes who desire glory; the rulers caught in the inertia of tradition.

Although Tiger was written in the 1930s, between the two world wars, it could just as easily have appeared in 1968. Substitute the mass media for the poets; the military-industrial complex for the Greek and Trojan warriors; and administration after administration for the city's rulers, and you have a striking representation of the quicksand war that dragged 58,000 U.S. soldiers and millions of Vietnamese, Laotians, and Cambodians to their deaths. And in some sense, we — the antiwar forces in this country — foresaw it all (in broad outline, if not specific detail): the assassinations, carpet bombings, tiger cages, and the CIA's first mass assassination and torture scheme, the Phoenix Program. Of course we couldn't predict the specifics. Indeed, some turned out worse than we'd feared. In any case, our foresight did us no more good than Cassandra's did her.

Rehabilitations and Revisions

It's just over a month since the 20th anniversary of the 9/11 attacks and the start of the "Global War on Terror." The press has been full of recollections and rehabilitations. George W. Bush used the occasion to warn the nation (as if we needed it at that point) about the dangers of what CNN referred to as "domestic violent extremists." He called them "children of the same foul spirit" as the one that engenders international terrorism. He also inveighed against the January 6th Capitol invasion:

"'This is how election results are disputed in a banana republic — not our democratic republic,' he said in a statement at the time, adding that he was 'appalled by the reckless behavior of some political leaders since the election.'"

You might almost think he'd forgotten that neither should elections in a democracy be "disputed" by three-piece-suited thugs shutting down a ballot count — as happened in Florida during his own first election in 2000. Future Trump operative Roger Stone has claimed credit for orchestrating that so-called Brooks Brothers Rebellion, which stopped the Florida vote count and threw the election to the Supreme Court and, in the end, to George W. Bush.

You might also think that, with plenty of shoving from his vice president Dick Cheney and a cabal of leftover neocons from the Project for a New American Century, Bush had never led this country into two devastating, murderous, profoundly wasteful wars. You might think we'd never seen the resumption of institutionalized CIA- and military-run state torture on a massive scale under his rule, or his administration's refusal to join the International Criminal Court.

And finally, you might think that nobody saw all this coming, that there were no Cassandras in this country in 2001. But there you would be wrong. All too many of us sensed just what was coming as soon as the bombing and invasion of Afghanistan began. I knew, for example, as early as November 2001, when the first mainstream article extolling the utility of torture appeared, that whatever else the U.S. response to the 9/11 attacks would entail, organized torture would be part of it. As early as December 2002, we all could have known that. That's when the first articles began appearing in the Washington Post about the "stress and duress" techniques the CIA was already beginning to use at Bagram Air Base in Afghanistan. Some of the hapless victims would later turn out to have been sold to U.S. forces for bounties by local strongmen.

It takes very little courage for a superannuated graduate student (as I was in 2001) to write academic papers about U.S. torture practices (as I did) and the stupidity and illegality of our invasion of Afghanistan. It's another thing, however, when a real Cassandra stands up — all alone — and tries to stop something from happening.

I'm talking, of course, about Representative Barbara Lee, the only member of Congress to vote against granting the president the power to "use all necessary and appropriate force against those nations, organizations, or persons he determines planned, authorized, committed, or aided the terrorist attacks that occurred on September 11, 2001, or harbored such organizations or persons." It was this Authorization for Use of Military Force, or AUMF, that provided the legal grounds for the U.S. invasion of Afghanistan in October 2001. Lee was right when, after agonizing about her vote, she decided to follow the counsel of the dean of the National Cathedral, the Reverend Nathan Baxter. That very morning, she had heard him pray that, in response to the terrible crimes of 9/11, we not "become the evil we deplore."

How right she was when she said on the House floor:

"However difficult this vote may be, some of us must urge the use of restraint. Our country is in a state of mourning. Some of us must say, 'Let's step back for a moment, let's just pause, just for a minute, and think through the implications of our actions today, so that this does not spiral out of control.'"

The legislation she opposed that day would indeed allow "this" to spiral out of control. That same AUMF has since been used to justify an ever-metastasizing series of wars and conflicts that spread from Afghanistan in central Asia through the Middle East, south to Yemen, and leapt to Libya, Somalia, and other lands in Africa. Despite multiple attempts to repeal it, that same minimalist AUMF remains in effect today, ready for use by the next president with aspirations to military adventures. In June 2021, the House of Representatives did pass a repeal bill sponsored by Barbara Lee herself, though it targeted the companion 2002 authorization for the Iraq War. At present, that bill languishes in the Senate Foreign Relations Committee.

In the days after 9/11, Lee was roundly excoriated for her vote. The Wall Street Journal called her a "clueless liberal," while the Washington Times wrote that she was "a long-practicing supporter of America's enemies." Curiously, both those editorials were headlined with the question, "Who Is Barbara Lee?" (Those of us in the San Francisco Bay Area could have answered that. Lee was — and remains — an African American congressional representative from Oakland, California, the inheritor of the seat and mantle of another great black congressional representative, Ron Dellums.) She received mountains of hate mail then and enough death threats to force her to seek police protection.

Like George W. Bush, Lee received some media rehabilitation in various 20th anniversary retrospectives of 9/11. In her case, however, it was well-deserved. The Washington Post, for instance, praised her for her courage, noting that no one — not Bernie Sanders, not Joe Biden — shared her vision, or, I would add, shared Cassandra's curse with her. Like the character in Tiger at the Gates, Lee didn't need a divine gift to foresee that the U.S. "war on terror" would spin disastrously out of control. A little historical memory might have served the rest of the country well, reminding us of what happened the last time the United States fought an ever-escalating war.

Cassandras and Their Mirror Images

It was clear from the start that Vice President Dick Cheney and Secretary of Defense Donald Rumsfeld were never that interested in Afghanistan (although that was no solace to the many thousands of Afghans who were bombed, beaten, and tortured). Those officials had another target in mind — Iraq — almost literally from the moment al-Qaeda's hijacked planes struck New York and Washington.

In 2002, after months of lies about Iraqi leader Saddam Hussein's possession of (nonexistent) weapons of mass destruction (WMD) and his supposed pursuit of a nuclear bomb, the Bush administration got its second AUMF, authorizing "the President to use the U.S. armed forces to: …defend U.S. national security against the continuing threat posed by Iraq," functionally condoning the U.S. invasion of his country. This time, Barbara Lee was not alone in her opposition. In the House, she was joined by 132 Democrats, 6 Republicans, and one independent (Bernie Sanders). Only 23 senators, however, voted "nay," including Rhode Island Republican Lincoln Chafee and Vermont independent Jim Jeffords.

In the run-up to the March 2003 invasion, figures who might be thought of as "anti-Cassandras" took center stage. Unlike the Greek seer, these unfortunates were apparently doomed to tell falsehoods — and be believed. Among them was Condoleezza Rice, President Bush's national security advisor, who, when pressed for evidence that Saddam Hussein actually possessed WMD, told CNN's Wolf Blitzer that "we don't want the smoking gun to be a mushroom cloud," implying Iraq represented a nuclear threat to this country.

Then there was Secretary of State Colin Powell, who put the case for war to the United Nations Security Council in February 2003, emphasizing the supposedly factual basis of everything he presented:

"My colleagues, every statement I make today is backed up by sources, solid sources. These are not assertions. What we're giving you are facts and conclusions based on solid intelligence."

It wasn't true, of course, but around the world, many believed him.

And let's not leave the mainstream press out of this. There's plenty of blame to go around, but perhaps the anti-Cassandra crown should go to the New York Times for its promotion of Bush administration war propaganda, especially by its reporter Judith Miller. In 2004, the Times published an extraordinary mea culpa, an apologetic note "from the editors" that said,

"[W]e have found a number of instances of coverage that was not as rigorous as it should have been. In some cases, information that was controversial then, and seems questionable now, was insufficiently qualified or allowed to stand unchallenged. Looking back, we wish we had been more aggressive in re-examining the claims as new evidence emerged — or failed to emerge."

I suspect the people of Iraq might share the Times's wish.

There was, of course, one other group of prophets who accurately foresaw the horrors that a U.S. invasion would bring with it: the millions who filled the streets of their cities here and around the world, demanding that the United States stay its hand. So powerful was their witness that they were briefly dubbed "the other superpower." Writing in the Nation, Jonathan Schell extolled their strength, saying that this country's "shock and awe" assault on Iraq "has found its riposte in courage and wonder." Alas, that mass witness in those streets was not enough to forestall one more murderous assault by what would, in the long run, prove to be a dying empire.

Cassandra at the Gates (of Glasgow)

And now, the world is finally waking up to an even greater disaster: the climate emergency that's burning up my part of the world, the American West, and drowning others. This crisis has had its Cassandras, too. One of these was 89-year-old John Rogalsky, who worked for 35 years as a meteorologist in the federal government. As early as 1963, he became aware of the problem of climate change and began trying to warn us. In 2017, he told the Canadian Broadcasting Corporation:

"[B]y the time the end of the 60s had arrived, I was absolutely convinced that it was real, it was just a question of how rapidly it would happen and how difficult it would become for the world at large, and how soon before people, or governments would even listen to the science. People I talked to about this, I was letting them know, this is happening, get ready."

This November, the 197 nations that have signed up to the United Nations Framework Convention on Climate Change will meet in Glasgow, Scotland, at the 2021 United Nations Climate Change Conference. We must hope that this follow-up to the 2015 Paris agreement will produce concrete steps to reverse the overheating of this planet and mitigate its effects, especially in those nations that have contributed the least to the problem and are already suffering disproportionately. Italy and the United Kingdom will serve as co-hosts.

I hope it's a good sign that at a pre-Glasgow summit in Milan, Italy's Prime Minister Mario Draghi met with three young "Cassandras" — climate activists Greta Thunberg (Sweden), Vanessa Nakate (Uganda), and Martina Comparelli (Italy) — after Thunberg's now famous "blah, blah, blah" speech, accusing world leaders of empty talk. "Your pressure, frankly, is very welcome," Draghi told them. "We need to be whipped into action. Your mobilization has been powerful, and rest assured, we are listening."

For the sake of the world, let us hope that this time Cassandra will be believed.

Copyright 2021 Rebecca Gordon

Featured image: Climate protest by Victoria Pickering is licensed under CC BY-NC-ND 2.0 / Flickr

Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Books, John Feffer's new dystopian novel, Songlands (the final one in his Splinterlands series), Beverly Gologorsky's novel Every Body Has a Story, and Tom Engelhardt's A Nation Unmade by War, as well as Alfred McCoy's In the Shadows of the American Century: The Rise and Decline of U.S. Global Power and John Dower's The Violent American Century: War and Terror Since World War II.

Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.

Debt and disillusionment: Is higher education a giant pyramid scheme?

For the last decade and a half, I've been teaching ethics to undergraduates. Now — admittedly, a little late to the party — I've started seriously questioning my own ethics. I've begun to wonder just what it means to be a participant, however minor, in the pyramid scheme that higher education has become in the years since I went to college.

Airplane Games

Sometime in the late 1980s, the Airplane Game roared through the San Francisco Bay Area lesbian community. It was a classic pyramid scheme, even if cleverly dressed up in language about women's natural ability to generate abundance, just as we gestate children in our miraculous wombs. If the connection between feminism and airplanes was a little murky — well, we could always think of ourselves as modern-day Amelia Earharts. (As long as we didn't think too hard about how she ended up.)

A few women made a lot of money from it — enough, in the case of one friend of mine, for a down payment on a house. Inevitably, far more lost money, while some of us, like me, stood on the sidelines sadly shaking our heads.

There were four tiers on that "airplane": a captain, two co-pilots, four crew, and eight passengers — 15 in all to start. You paid $3,000 to get on at the back of the plane as a passenger, so the first captain (the original scammer) got out with $24,000 — $3,000 from each passenger. The co-pilots and crew, who were in on the fix, paid nothing to join. When the first captain "parachuted out," the game split in two, and each co-pilot became the captain of a new plane. They then pressured their four remaining passengers to recruit enough new women to fill each plane, so they could get their payday and the two new co-pilots could each captain their own planes.

Unless new people continued to get on at the back of each plane, there would be no payday for the earlier passengers, so the pressure to recruit ever more women into the game only grew. The original scammers ran through the game a couple of times, but inevitably the supply of gullible women willing to invest their savings ran out. By the time the game collapsed, hundreds of women had lost significant amounts of money.
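To see just how fast that demand for new recruits grows, here is a minimal sketch in Python, offered purely as an illustration. It assumes the structure described above: each plane pays out once its eight passenger seats are filled at $3,000 apiece, and every payout splits the plane into two new ones.

```python
# A minimal sketch of the Airplane Game arithmetic described above.
# Assumes each plane pays out when its eight passenger seats are filled
# at $3,000 apiece, and that every payout splits the plane into two new ones.

BUY_IN = 3_000            # what each new passenger pays to board
SEATS = 8                 # passenger seats per plane
PAYOUT = BUY_IN * SEATS   # $24,000 to each departing captain

def recruits_needed(rounds: int, planes: int = 1) -> None:
    """Print how many fresh passengers must be found in each round of the game."""
    total = 0
    for r in range(1, rounds + 1):
        new_passengers = planes * SEATS
        total += new_passengers
        print(f"round {r}: {planes} plane(s) need {new_passengers} new passengers "
              f"(cumulative recruits: {total})")
        planes *= 2       # each payout spawns two new planes

recruits_needed(6)
# By round 6, 32 planes need 256 fresh recruits in that round alone,
# which is why every such game eventually runs out of willing passengers.
```

Within half a dozen rounds the game needs hundreds of new buyers just to keep the earlier passengers whole, which is the whole point of the exercise for the people who start it.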

No one seemed to know the women who'd brought the game and all those "planes" to the Bay Area, but they had spun a winning story about endless abundance and the glories of women's energy. After the game collapsed, they took off for another women's community with their "earnings," leaving behind a lot of sadder, poorer, and perhaps wiser San Francisco lesbians.

Feasting at the Tenure Trough or Starving in the Ivory Tower?

So, you may be wondering, what could that long-ago scam have to do with my ethical qualms about working as a college instructor? More than you might think.

Let's start with PhD programs. In 2019, the most recent year for which statistics are available, U.S. colleges and universities churned out about 55,700 doctorates — and such numbers continue to increase by about 1% a year. The average number of doctorates earned over the last decade is almost 53,000 annually. In other words, we're talking about nearly 530,000 PhDs produced by American higher education in those 10 years alone. Many of them have ended up competing for a far smaller number of jobs in the academic world.

It's true that most PhDs in science or engineering end up with post-doctoral positions (earning roughly $40,000 a year) or with tenure-track or tenured jobs in colleges and universities (averaging $60,000 annually to start). Better yet, most of them leave their graduate programs with little or no debt.

The situation is far different if your degree wasn't in STEM (science, technology, engineering, or mathematics) but, for example, in education or the humanities. As a start, far more of those degree-holders graduate owing money, often significant sums, and ever fewer end up teaching in tenure-track positions — in jobs, that is, with security, decent pay, and benefits.

Many of the non-STEM PhDs who stay in academia end up joining an exploited, contingent workforce of part-time, or "adjunct," professors. That reserve army of the underemployed is higher education's dirty little secret. After all, adjuncts — and yes, I'm one of them — actually teach the majority of the classes at many schools, while earning as little as $1,500 a semester for each of them.

I hate to bring up transportation again, but there's a reason teachers like us are called "freeway flyers." A 2014 Congressional report revealed that 89% of us work at more than one institution and 27% at three different schools, just to cobble together the most meager of livings.

Many of us, in fact, rely on public antipoverty programs to keep going. Inside Higher Ed, reflecting on a 2020 report from the American Federation of Teachers, describes our situation this way:

"Nearly 25% of adjunct faculty members rely on public assistance, and 40% struggle to cover basic household expenses, according to a new report from the American Federation of Teachers. Nearly a third of the 3,000 adjuncts surveyed for the report earn less than $25,000 a year. That puts them below the federal poverty guideline for a family of four."

I'm luckier than most adjuncts. I have a union, and over the years we've fought for better pay, healthcare, a pension plan, and a pathway (however limited) to advancement. Now, however, my school's administration is using the pandemic as an excuse to try to claw back the tiny cost-of-living adjustments we won in 2019.

The Oxford Dictionary of English defines an adjunct as "a thing added to something else as a supplementary rather than an essential part." Once upon a time, in the middle of the previous century, that's just what adjunct faculty were — occasional additions to the full-time faculty. Often, they were retired professionals who supplemented a department's offerings by teaching a single course in their area of expertise, while their salaries were more honoraria than true payments for work performed. Later, as more women entered academia, it became common for a male professor's wife to teach a course or two, often as part of his employment arrangement with the university. Since her salary was a mere adjunct to his, she was paid accordingly.

Now, the situation has changed radically. In many colleges and universities, adjunct faculty are no longer supplements, but the most "essential part" of the teaching staff. Classes simply couldn't go on without us; nor, if you believe college administrations, could their budgets be balanced without us. After all, why pay a full-time professor $10,000 to teach a class (since he or she will be earning, on average, $60,000 a year and covering three classes a semester) when you can give a part-timer like me $1,500 for the very same work?

And adjuncts have little choice. The competition for full-time positions is fierce, since every year another 53,000 or more new PhDs climb into the back row of the academic airplane, hoping to make it to the pilot's seat and secure a tenure-track position.

And here's another problem with that. These days the people in the pilots' seats often aren't parachuting out. They're staying right where they are. That, in turn, means new PhDs find themselves competing for an ever-shrinking prize, as Laura McKenna has written in the Atlantic, "not only with their own cohort but also with the unemployed PhDs who graduated in previous years." Many of those now clinging to pilots' seats are members of my own boomer generation, who still benefit from a 1986 law (signed by then-75-year-old President Ronald Reagan) that outlawed mandatory retirements.

Grade Inflation v. Degree Inflation?

People in the world of education often bemoan the problem of "grade inflation" — the tendency of average grades to creep up over time. Ironically, that problem is exacerbated by the adjunctification of teaching, since adjuncts tend to award higher grades than professors with secure positions. The reason is simple enough: colleges use student evaluations as a major metric for rehiring adjuncts, and higher grades translate directly into better evaluations. Grade inflation at the college level is, in my view, a non-issue, at least for students. Employers don't look at your transcript when they're hiring you, and even graduate schools care more about recommendations and GRE scores.

The real problem faced by today's young people isn't grade inflation. It's degree inflation.

Once upon a time in another America, a high-school diploma was enough to snag you a good job, with a chance to move up as time went on (especially if you were white and male, as the majority of workers were in those days). And you paid no tuition whatsoever for that diploma. In fact, public education through 12th grade is still free, though its quality varies profoundly depending on who you are and where you live.

But all that changed as increasing numbers of employers began requiring a college degree for jobs that don't by any stretch of the imagination require a college education to perform. The Washington Post reports:

"Among the positions never requiring a college degree in the past that are quickly adding that to the list of desired requirements: dental hygienists, photographers, claims adjusters, freight agents, and chemical equipment operators."

In 2017, Manjari Raman of the Harvard Business School wrote that

"the degree gap — the discrepancy between the demand for a college degree in job postings and the employees who are currently in that job who have a college degree — is significant. For example, in 2015, 67% of production supervisor job postings asked for a college degree, while only 16% of employed production supervisors had one."

In other words, even though most people already doing such jobs don't have a bachelor's degree, companies are only hiring new people who do. Part of the reason: that requirement automatically eliminates a lot of applicants, reducing the time and effort involved in making hiring decisions. Rather than sifting through résumés for specific skills (like the ability to use certain computer programs or write fluently), employers let a college degree serve as a proxy. The result is not only that they'll hire people who don't have the skills they actually need, but that they're eliminating people who do have the skills but not the degree. You won't be surprised to learn that those rejected applicants are more likely to be people of color, who are underrepresented among the holders of college degrees.

Similarly, some fields that used to accept a BA now require a graduate degree to perform the same work. For example, the Bureau of Labor Statistics reports that "in 2015–16, about 39% of all occupational therapists ages 25 and older had a bachelor's degree as their highest level of educational attainment." Now, however, employers are commonly insisting that new applicants hold at least a master's degree — and so up the pyramid we continually go (at ever greater cost to those students).

The Biggest Pyramid of All

In a sense, you could say that the whole capitalist economy is the biggest pyramid of them all. For every one of the fascinating, fulfilling, autonomous, and well-paying jobs out there, there are thousands of boring, mind- and body-crushing ones like pulling items for shipment in an Amazon warehouse or folding clothes at Forever 21.

We know, in other words, that there are only a relatively small number of spaces in the cockpit of today's economic plane. Nonetheless, we tell our young people that the guaranteed way to get one of those rare gigs at the top of the pyramid is a college education.

Now, just stop for a second and consider what it costs to join the 2021 all-American Airplane Game of education. In 1970, when I went to Reed, a small, private, liberal arts college, tuition was $3,000 a year. I was lucky. I had a scholarship (known in modern university jargon as a "tuition discount") that covered most of my costs. This year, annual tuition at that same school is a mind-boggling $62,420, more than 20 times as high. If college costs had simply risen with inflation, the price would be about $21,000 a year — meaning today's actual tuition is nearly triple even that inflation-adjusted figure.
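As a rough check on that arithmetic, here is a small illustrative calculation. The sevenfold inflation multiplier is an approximation I am assuming for cumulative 1970-to-2021 consumer-price inflation, not an official CPI figure.

```python
# Rough check of the tuition arithmetic above. The inflation multiplier is an
# assumed approximation of cumulative 1970-2021 consumer-price inflation.

TUITION_1970 = 3_000
TUITION_2021 = 62_420
INFLATION_MULTIPLIER = 7.0   # assumed: $1 in 1970 is roughly $7 in 2021

inflation_only = TUITION_1970 * INFLATION_MULTIPLIER
print(f"1970 tuition in 2021 dollars: ~${inflation_only:,.0f}")                 # ~$21,000
print(f"Actual 2021 tuition:           ${TUITION_2021:,}")
print(f"Actual vs. inflation-adjusted: {TUITION_2021 / inflation_only:.1f}x")   # ~3x
print(f"Actual vs. 1970 sticker price: {TUITION_2021 / TUITION_1970:.1f}x")     # ~21x
```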

If I'd attended Federal City College (now the University of D.C.), my equivalent of a state school then, tuition would have been free. Now, even state schools cost too much for many students. Annually, tuition at the University of California at Berkeley, the flagship school of that state's system, is $14,253 for in-state students, and $44,007 for out-of-staters.

I left school owing $800, or about $4,400 in today's dollars. These days, most financial "aid" resembles foreign "aid" to developing countries — that is, it generally takes the form of loans whose interest piles up so fast that it's hard to keep up with it, let alone begin to pay off the principal in your post-college life. Some numbers to contemplate: 62% of those graduating with a BA in 2019 did so owing money — owing, in fact, an average of almost $29,000. The average debt of those earning a graduate degree was an even more staggering $71,000. That, of course, is on top of whatever the former students had already shelled out while in school. And that, in turn, is before the "miracle" of compound interest takes hold and that debt starts to grow like a rogue zucchini.
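To make that compounding concrete, here is a minimal sketch using the average 2019 BA debt cited above. The 6% rate, the once-a-year compounding, and the ten payment-free years are assumptions chosen for illustration, not the terms of any particular loan program.

```python
# Illustration of compound interest working against a borrower.
# Principal is the average 2019 BA debt cited above; the rate, annual
# compounding, and payment-free decade are assumptions for illustration only.

principal = 29_000
annual_rate = 0.06
years = 10

balance = principal
for _ in range(years):
    balance *= 1 + annual_rate   # interest compounds once per year here

print(f"Balance after {years} payment-free years: ${balance:,.0f}")
# Roughly $51,900, about 1.8 times the original loan.
```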

It's enough to make me wonder whether a seat in the Great American College and University Airplane Game is worth the price, and whether it's ethical for me to continue serving as an adjunct flight attendant along the way. Whatever we tell students about education being the path to a good job, the truth is that there are remarkably few seats at the front of the plane.

Of course, on the positive side, I do still believe that time spent at college offers students something beyond any price — the opportunity to learn to think deeply and critically, while encountering people very different from themselves. The luckiest students graduate with a lifelong curiosity about the world and some tools to help them satisfy it. That is truly a ticket to a good life — and no one should have to buy a seat in an Airplane Game to get one.

Copyright 2021 Rebecca Gordon



Teetering on the existential edge, we have one more last, best chance for survival

In San Francisco, we're finally starting to put away our masks. With 74% of the city's residents over 12 fully vaccinated, for the first time in more than a year we're enjoying walking, shopping, and eating out, our faces naked. So I was startled when my partner reminded me that we need to buy masks again very soon: N95 masks, that is. The California wildfire season has already begun, earlier than ever, and we'll need to protect our lungs during the months to come from the fine particulates carried in the wildfire smoke that's been engulfing this city in recent years.

I was in Reno last September, so I missed the morning when San Franciscans awoke to apocalyptic orange skies, the air freighted with smoke from burning forests elsewhere in the state. The air then was bad enough even in the high mountain valley of Reno. At that point, we'd already experienced "very unhealthy" purple-zone air quality for days. Still, it was nothing like the photos then emerging from the Bay Area, which might as well have been taken on Mars. I have a bad feeling that I may get my chance to experience the same phenomenon in 2021 — and, as the fires across California have started so much earlier, probably sooner than September.

The situation is pretty dire: this state — along with our neighbors to the north and southeast — is now living through an epic drought. After a dry winter and spring, the fuel-moisture content in our forests (the amount of water in vegetation, living and dead) is way below average. This April, the month when it is usually at its highest, San Jose State University scientists recorded levels a staggering 40% below average in the Santa Cruz Mountains, well below the lowest level ever before observed. In other words, we have never been this dry.

Under the Heat Dome

When it's hot in most of California, it's often cold and foggy in San Francisco. Today is no exception. Despite the raging news about heat records, it's not likely to reach 65 degrees here. So it's a little surreal to consider what friends and family are going through in the Pacific Northwest under the once-in-thousands-of-years heat dome that's settled over the region. A heat dome is an area of high pressure surrounded by upper-atmosphere winds that essentially pin it in place. If you remember your high-school physics, you'll recall that when a gas (for example, the air over the Pacific Northwest) is contained, the ratio between pressure and temperature remains constant. If the temperature goes up, the pressure goes up.

The converse is also true; as the pressure rises, so does the temperature. And that's what's been happening over Oregon, Washington, and British Columbia in normally chilly Canada. Mix in the fact that climate change has driven average temperatures in those areas up by three to four degrees since the industrial revolution, and you have a recipe for the disaster that struck the region recently.
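For anyone who wants the formula behind that high-school recollection, the relation being invoked is Gay-Lussac's law for a fixed quantity of gas held at constant volume. It is a deliberate simplification of real heat-dome dynamics, offered here only to make the pressure-temperature link explicit.

```latex
\frac{P_1}{T_1} = \frac{P_2}{T_2}
\qquad\Longrightarrow\qquad
P_2 = P_1 \cdot \frac{T_2}{T_1}
\quad \text{(temperatures measured in kelvins)}
```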

And it has indeed been a disaster. The temperature in the tiny town of Lytton, British Columbia, for instance, hit 121 degrees on June 29th, breaking the Canadian heat record for the third time in as many days. (The previous record had stood since 1937.) That was Tuesday. On Wednesday night, the whole town was engulfed in the flames of multiple fires. The fires, in turn, generated huge pyrocumulus clouds that penetrated as high as the stratosphere (a rare event in itself), producing lightning strikes that ignited new fires in a vicious cycle that, in the end, simply destroyed the kilometer-long town.

Heat records have been broken all over the Pacific Northwest. Portland topped records for three days running, culminating with a 116-degree day on June 28th; Seattle hit a high of 108, which the Washington Post reported "was 34 degrees above the normal high of 74 and higher than the all-time heat record in Washington, D.C., among many other cities much farther to its south."

With the heat comes a rise in "sudden and unexpected" deaths. Hundreds have died in Oregon and Washington and, according to the British Columbia coroner, at least 300 in that province — almost double the average number for that time period.

Class, Race, and Hot Air

It's hardly a new observation that the people who have benefited least from the causes of climate change — the residents of less industrialized countries and poor people of all nations — are already suffering most from its results. Island nations like the Republic of Palau in the western Pacific are a prime example. Palau faces a number of climate-change challenges, according to the United Nations Development Program, including rising sea levels that threaten to inundate some of its lowest-lying islands, which are just 10 meters above sea level. In addition, encroaching seawater is salinating some of its agricultural land, creating seaside strips that can now grow only salt-tolerant root crops. Meanwhile, despite substantial annual rainfall, saltwater inundation threatens the drinking water supply. And worse yet, Palau is vulnerable to ocean storms that, on our heating planet, are growing ever more frequent and severe.

There are also subtle ways the rising temperatures that go with climate change have differential effects, even on people living in the same city. Take air conditioning. One of the reasons people in the Pacific Northwest suffered so horrendously under the heat dome is that few homes in that region are air conditioned. Until recently, people there had been able to weather the minimal number of very hot days each year without installing expensive cooling machinery.

Obviously, people with more discretionary income will have an easier time investing in air conditioning now that temperatures are rising. What's less obvious, perhaps, is that its widespread use makes a city hotter — a burden that falls disproportionately on people who can't afford to install it in the first place. Air conditioning works on a simple principle: it shifts heat from the air inside an enclosed space to the outside world, which, in turn, makes that outside air hotter.

A 2014 study of this effect in Phoenix, Arizona, showed that air conditioning raised ambient temperatures by one to two degrees at night — an important finding, because one of the most dangerous aspects of the present heat waves is their lack of night-time cooling. As a result, each day's heat builds on a higher base, while presenting a greater direct-health threat, since the bodies of those not in air conditioning can't recover from the exhaustion of the day's heat at night. In effect, air conditioning not only heats the atmosphere further but shifts the burden of unhealthy heat from those who can afford it to those who can't.

Just as the coronavirus has disproportionately ravaged black and brown communities (as well as poor nations around the world), climate-change-driven heat waves, according to a recent University of North Carolina study reported by the BBC, mean that "black people living in most U.S. cities are subject to double the level of heat stress as their white counterparts." This is the result not just of poverty, but of residential segregation, which leaves urban BIPOC (black, indigenous, and other people of color) communities in a city's worst "heat islands" — the areas containing the most concrete, the most asphalt, and the least vegetation, and therefore the ones that attract and retain the most heat.

"Using satellite temperature data combined with demographic information from the U.S. Census," the researchers "found that the average person of color lives in an area with far higher summer daytime temperatures than non-Hispanic white people." They also discovered that, in all but six of the 175 urban areas they studied in the continental U.S., "people of color endure much greater heat impacts in summer." Furthermore, "for black people this was particularly stark. The researchers say they are exposed to an extra 3.12C [5.6F] of heating, on average, in urban neighborhoods, compared to an extra 1.47C [2.6F] for white people."

That's a big difference.

Food, Drink, and Fires — the View from California

Now, let me return to my own home state, California, where conditions remain all too dry and, apart from the coast right now, all too hot. Northern California gets most of its drinking water from the snowpack that builds each year in the Sierra Nevada mountains. In spring, those snows gradually melt, filling the rivers that fill our reservoirs. In May 2021, however, the Sierra snowpack was a devastating six percent of normal!

Stop a moment and take that in, while you try to imagine the future of much of the state — and the crucial crops it grows.

For my own hometown, San Francisco, things aren't quite that dire. Water levels in Hetch Hetchy, our main reservoir, located in Yosemite National Park, are down from previous years, but not disastrously so. With voluntary water-use reduction, we're likely to have enough to drink this year at least. Things are a lot less promising, however, in rural California where towns tend to rely on groundwater for domestic use.

Shrinking water supplies don't just affect individual consumers here in this state; they affect everyone in the United States who eats, because 13.5% of all our agricultural products, including meat and dairy, as well as fruits and vegetables, come from California. Growing food requires prodigious amounts of water. In fact, farmland irrigation accounts for roughly 80% of all the water put to human use in the state.

So how are California's agricultural water supplies doing this year? The answer, sadly, is not very well. State regulators have already cut distribution to about a quarter of California's irrigated acreage (about two million acres) by a drastic 95%. That's right. A full quarter of the state's farmlands have access to just 5% of what they would ordinarily receive from rivers and aqueducts. As a result, some farmers are turning to groundwater, a more easily exhausted source, which also replenishes itself far more slowly than rivers and streams. Some are even choosing to sell their water to other farmers, rather than use it to grow crops at all, because that makes more economic sense for them. As smaller farms are likely to be the first to fold, the water crisis will only enhance the dominance of major corporations in food production.

Meanwhile, we'll probably be breaking out our N95 masks soon. Wildfire season has already begun — earlier than ever. On July 1st, the then-still-uncontained Salt fire briefly closed a section of Interstate 5 near Redding in northern California. (I-5 is the main north-south interstate along the West Coast.) And that's only one of the more than 4,500 fire incidents already recorded in the state this year.

Last year, almost 10,000 fires burned more than four million acres here, and everything points to a similar or worse season in 2021. Unlike Donald Trump, who famously blamed California's fires on a failure to properly rake our forests, President Biden is taking the threat seriously. On June 30th, he convened western state leaders to discuss the problem, acknowledging that "we have to act and act fast. We're late in the game here." The president promised a number of measures: guaranteeing sufficient, and sufficiently trained, firefighters; raising their minimum pay to $15 per hour; and making grants to California counties under the Federal Emergency Management Agency's BRIC (Building Resilient Infrastructure and Communities) program.

Such measures will help a little in the short term, but none of it will make a damn bit of difference in the longer run if the Biden administration and a politically divided Congress don't begin to truly treat climate change as the immediate and desperately long-term emergency it is.

Justice and Generations

In his famous A Theory of Justice, the great liberal philosopher of the twentieth century John Rawls proposed a procedural method for designing reasonable and fair principles and policies in a given society. His idea: that the people determining such basic policies should act as if they had stepped behind a "veil of ignorance" and had lost specific knowledge of their own place in society. They'd be ignorant of their own class status, ethnicity, or even how lucky they'd been when nature was handing out gifts like intelligence, health, and physical strength.

Once behind such a veil of personal ignorance, Rawls argued, people might make rules that would be as fair as possible, because they wouldn't know whether they themselves were rich or poor, black or white, old or young — or even which generation they belonged to. This last category was almost an afterthought, included, he wrote, "in part because questions of social justice arise between generations as well as within them."

His point about justice between generations not only still seems valid to me, but in light of present-day circumstances radically understated. I don't think Rawls ever envisioned a trans-generational injustice as great as the climate-change one we're allowing to happen, not to say actively inducing, at this very moment.

Human beings have a hard time recognizing looming but invisible dangers. In 1990, I spent a few months in South Africa providing some technical assistance to an anti-apartheid newspaper. When local health workers found out that I had worked (as a bookkeeper) for an agency in the U.S. trying to prevent the transmission of AIDS, they desperately wanted to talk to me. How, they hoped to learn, could they get people living in their townships to act now to prevent a highly transmissible illness that would only produce symptoms years after infection? How, in the face of the all-too-present emergencies of everyday apartheid life, could they get people to focus on a vague but potentially horrendous danger barreling down from the future? I had few good answers and, almost 30 years later, South Africa has the largest HIV-positive population in the world.

Of course, there are human beings who've known about the climate crisis for decades — and not just the scientists who wrote about it as early as the 1950s or the ones who gave an American president an all-too-accurate report on it in 1965. The fossil-fuel companies have, of course, known all along — and have focused their scientific efforts not on finding alternative energy sources, but on creating doubt about the reality of human-caused climate change (just as, once upon a time, tobacco companies sowed doubt about the relationship between smoking and cancer). As early as 1979, the Guardian reports, an internal Exxon study concluded that the use of fossil fuels would certainly "cause dramatic environmental effects" in the decades ahead. "The potential problem is great and urgent," the study concluded.

A problem that was "great and urgent" in 1979 is now a full-blown existential crisis for human survival.

Some friends and I were recently talking about how ominous the future must look to the younger people we know. "They are really the first generation to confront an end to humanity in their own, or perhaps their children's lifetimes," I said.

"But we had The Bomb," a friend reminded me. "We grew up in the shadow of nuclear war." And she was right of course. We children of the 1950s and 1960s grew up knowing that someone could "press the button" at any time, but there was a difference. Horrifying as is the present retooling of our nuclear arsenal (going on right now, under President Biden), nuclear war nonetheless remains a question of "if." Climate change is a matter of "when" and that when, as anyone living in the Northwest of the United States and Canada should know after these last weeks, is all too obviously now.

It's impossible to overstate the urgency of the moment. And yet, as a species, we're acting like the children of indulgent parents who provide multiple "last chances" to behave. Now, nature has run out of patience and we're running out of chances. So much must be done globally, especially to control the giant fossil-fuel companies. We can only hope that real action will emerge from November's international climate conference. And here in the U.S., unless congressional Democrats succeed in ramming through major action to stop climate change before the 2022 midterms, we'll have lost one more last, best chance for survival.

Copyright 2021 Rebecca Gordon


Welfare for weapons makers doesn't make us safer

These days my conversations with friends about the new administration go something like this:

"Biden's doing better than I thought he would."

"Yeah. Vaccinations, infrastructure, acknowledging racism in policing. A lot of pieces of the Green New Deal, without calling it that. The child subsidies. It's kind of amazing."

"But on the military–"

"Yeah, same old, same old."

As my friends and I have noticed, President Joe Biden remains super-glued to the same old post-World War II agreement between the two major parties: they can differ vastly on domestic policies, but they remain united when it comes to projecting U.S. military power around the world and to the government spending that sustains it. In other words, the U.S. "national security" budget is still the third rail of politics in this country.

Assaulting the Old New Deal

It was Democratic House Speaker Tip O'Neill who first declared that Social Security is "the third rail" of American politics. In doing so, he metaphorically pointed to the high-voltage rail that runs between the tracks of subways and other light-rail systems. Touch that and you'll electrocute yourself.

O'Neill made that observation back in 1981, early in Ronald Reagan's first presidential term, at a moment when the new guy in Washington was already hell-bent on dismantling Franklin Delano Roosevelt's New Deal legacy.

Reagan would fight his campaign to do so on two key fronts. First, he would attack labor unions, whose power had expanded in the years since the 1935 Wagner Act (officially the National Labor Relations Act) guaranteed workers the right to bargain collectively with their employers over wages and workplace rules. Such organizing rights had been hard-won indeed. Not a few workers died at the hands of the police or domestic mercenaries like Pinkerton agents, especially in the early 1930s. By the mid-1950s, union membership would peak at around 35% of workers, while wages would continue to grow into the late 1970s, when they stagnated and began their long decline.

Reagan's campaign began with an attack on PATCO, a union of well-paid professionals — federally employed air-traffic controllers — which the Federal Labor Relations Authority eventually decertified. That initial move signaled the Republican Party's willingness, even enthusiasm, for breaking with decades of bipartisan support for organized labor. By the time Donald Trump took office in the next century, it was a given that Republicans would openly support anti-union measures like federal "right-to-work" laws, which, if passed, would make it illegal for employers to agree to a union-only workplace and so effectively destroy the bargaining power of unions. (Fortunately, opponents were able to forestall that move during Trump's presidency, but in February 2021, Republicans reintroduced their National Right To Work Act.)

The Second Front and the Third Rail

There was a second front in Reagan's war on the New Deal. He targeted a group of programs from that era that came to be known collectively as "entitlements." Three of the most important were Aid to Dependent Children, unemployment insurance, and Social Security. In addition, in 1965, a Democratic Congress had added a healthcare entitlement, Medicare, which helps cover medical expenses for those over 65 and younger people with specific chronic conditions, as well as Medicaid, which does the same for poor people who qualify. These, too, would soon be in the Republican gunsights.

The story of Reagan's racially inflected attacks on welfare programs is well-known. His administration's urge to go after unemployment insurance, which provided payments to laid-off workers, was less commonly acknowledged. In language eerily echoed by Republican congressional representatives today, the Reagan administration sought to reduce the length of unemployment benefits, so that workers would be forced to take any job at any wage. A 1981 New York Times report, for instance, quoted Reagan Assistant Secretary of Labor Albert Angrisani as saying:

"'The bottom line… is that we have developed two standards of work, available work and desirable work.' Because of the availability of unemployment insurance and extended benefits, he said, 'there are jobs out there that people don't want to take.'"

Reagan did indeed get his way with unemployment insurance, but when he turned his sights on Social Security, he touched Tip O'Neill's third rail.

Unlike welfare, whose recipients are often framed as lazy moochers, and unemployment benefits, which critics claim keep people from working, Social Security was then and remains today a hugely popular program. Because workers contribute to the fund with every paycheck and usually collect benefits only after retirement, beneficiaries appear deserving in the public eye. Of all the entitlement programs, it's the one most Americans believe that they and their compatriots are genuinely entitled to. They've earned it. They deserve it.

So, when the president moved to reduce Social Security benefits, ostensibly to offset a rising deficit in its fund, he was shocked by the near-unanimous bipartisan resistance he met. His White House put together a plan to cut $80 billion over five years by — among other things — immediately cutting benefits and raising the age at which people could begin fully collecting them. Under that plan, a worker who retired early at 62 and was entitled to $248 a month would suddenly see that payout reduced to $162.

Access to early retirement was, and remains, a justice issue for workers with shorter life expectancies — especially when those lives have been shortened by the hazards of the work they do. As South Carolina Republican Congressman Carroll Campbell complained to the White House at the time: "I've got thousands of sixty-year-old textile workers who think it's the end of the world. What the hell am I supposed to tell them?"

After the Senate voted 96-0 to oppose any plan that would "precipitously and unfairly reduce early retirees' benefits," the Reagan administration regrouped and worked out a compromise with O'Neill and the Democrats. Economist (later Federal Reserve chair) Alan Greenspan would lead a commission that put together a plan, approved in 1983, to gradually raise the full retirement age, increase the premiums paid by self-employed workers, start taxing benefits received by people with high incomes, and delay cost-of-living adjustments. Those changes were rolled out gradually, the country adjusted, and no politicians were electrocuted in the process.

Panic! The System Is Going Broke!

With its monies maintained in a separately sequestered trust fund, Social Security, unlike most government programs, is designed to be self-sustaining. Periodically, as economist and New York Times columnist Paul Krugman might put it, serious politicians claim to be concerned about that fund running out of money. There's a dirty little secret that those right-wing deficit slayers never tell you, though: when the Social Security trust fund runs a surplus, as it did from 1983 to 2009, it's required to invest it in government bonds, indirectly helping to underwrite the federal government's general fund.

They also aren't going to mention that one group who contributes to that surplus will never see a penny in benefits: undocumented immigrant workers who pay into the system but won't ever collect Social Security. Indeed, in 2016, such workers provided an estimated $13 billion out of about $957 billion in Social Security revenues, or roughly 1.4% of the total. That may not sound like much, but over the years it adds up. In that way, undocumented workers help subsidize the trust fund and, in surplus years, the entire government.

How, then, is Social Security funded? Each year, employees contribute 6.2% of their wages (up to a cap amount). Employers match that, for a total of 12.4% of wages paid, and both put in another 1.45% each for Medicare. Self-employed people pay both shares, for a total of 15.3% of their income, including Medicare. Those payroll contributions add up to about nine-tenths of the fund's annual income (89% in 2019). The rest comes from interest on the trust fund's government bonds and from the income taxes some beneficiaries pay on their benefits.

So, is the Social Security system finally in trouble? It could be. When the benefits due to a growing number of retirees exceed the fund's income, its administrators will have to dip into its reserves to make up the difference. As people born in the post-World War II baby boom reach retirement, at a moment when the American population is beginning to age rapidly, dire predictions are resounding about the potential bankruptcy of the system. And there is, in fact, a consensus that the fund will begin drawing down its reserves, possibly starting this year, and could exhaust them as soon as 2034. At that point, relying only on the current year's income to pay benefits could reduce Social Security payouts to perhaps 79% of what's promised at present.

You can already hear the cries that the system is going broke!

But it doesn't have to be that way. Employees and employers only pay Social Security tax on income up to a certain cap. This year it's $142,800. In other words, employees who make a million dollars in 2021 will contribute no more to Social Security than those who make $142,800. To rescue Social Security, all it would take is raising that cap — or better yet, removing it altogether.
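Here is a minimal sketch of that payroll-tax arithmetic, using the 2021 cap and the 6.2% employee share described above (Medicare's 1.45%, which has no cap, is left out). It simply verifies that a million-dollar earner and a $142,800 earner make identical Social Security contributions.

```python
# Employee share of Social Security tax under the 2021 wage cap.
# Figures are the ones cited above; Medicare's uncapped 1.45% is omitted.

CAP = 142_800
EMPLOYEE_RATE = 0.062   # matched by the employer, for 12.4% in total

def employee_social_security_tax(wages: float) -> float:
    """Return the employee's annual Social Security contribution."""
    return min(wages, CAP) * EMPLOYEE_RATE

for wages in (50_000, 142_800, 1_000_000):
    print(f"wages ${wages:>9,}: tax ${employee_social_security_tax(wages):,.2f}")

# wages $   50,000: tax $3,100.00
# wages $  142,800: tax $8,853.60
# wages $1,000,000: tax $8,853.60   <- identical to the contribution at the cap
```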

In fact, the Congressional Budget Office has run the numbers and identified two different methods of raising it to eventually tax all wage income. Either would keep the trust fund solvent.

Naturally, plutocrats and their congressional minions don't want to raise the Social Security cap. They'd rather starve the entitlement beast and blame possible shortfalls on greedy boomers who grew up addicted to government handouts. Under the circumstances, we, and succeeding generations, had better hope that Social Security remains, as it was in 1981, the third rail in American politics.

Welfare for Weapons Makers

Of course, there's a second high-voltage, untouchable rail in American politics and that's funding for the military and weapons manufacturers. It takes a brave politician indeed to suggest even the most minor of reductions in Pentagon spending, which has for years been the single largest item of discretionary spending in the federal budget.

It's notoriously difficult to identify how much money the government actually spends annually on the military. President Trump's last Pentagon budget, for the fiscal year ending on September 30th, offered about $740 billion to the armed services (not including outlays for veteran services and pensions). Or maybe it was only $705.4 billion. Or perhaps, including Department of Energy outlays involving nuclear weapons, $753.5 billion. (And none of those figures even faintly reflected full national-security spending, which is certainly well over a trillion dollars annually.)

Most estimates put President Biden's 2022 military budget at $753 billion — about the same as Trump's for the previous year. As former Senator Everett Dirksen is once supposed to have said, "A billion here, a billion there, and pretty soon you're talking real money."

Indeed, we're talking real money and real entitlements here that can't be touched in Washington without risking political electrocution. Unlike actual citizens, U.S. arms manufacturers seem entitled to ever-increasing government subsidies — welfare for weapons, if you like. Beyond the billions spent to directly fund the development and purchase of various weapons systems, every time the government permits arms sales to other countries, it's swelling the coffers of companies like Lockheed Martin, Northrop Grumman, Boeing, and Raytheon Technologies. The real beneficiaries of Donald Trump's so-called Abraham Accords between Israel and the majority-Muslim states of Morocco, the United Arab Emirates, Bahrain, and Sudan were the U.S. companies that sell the weaponry that sweetened those deals for Israel's new friends.

When Americans talk about undeserved entitlements, they're usually thinking about welfare for families, not welfare for arms manufacturers. But military entitlements make the annual federal appropriation of $16.5 billion for Temporary Assistance for Needy Families (TANF) look puny by comparison. In fact, during Republican and Democratic administrations alike, the yearly federal outlay for TANF hasn't changed since it was established through the 1996 Personal Responsibility and Work Opportunity Reconciliation Act, known in the Clinton era as "welfare reform." Inflation has, however, eroded its value by about 40% in the intervening years.

And what do Americans get for those billions no one dares to question? National security, right?

But how is it that the country that spends more on "defense" than the next seven, or possibly 10, countries combined is so insecure that every year's Pentagon budget must exceed the last one? Why is it that, despite those billions for military entitlements, our critical infrastructure, including hospitals, gas pipelines, and subways (not to mention Cape Cod steamships), lies exposed to hackers?

And if, thanks to that "defense" budget, we're so secure, why is it that, in my wealthy home city of San Francisco, residents now stand patiently in lines many blocks long to receive boxes of groceries? Why is "national security" more important than food security, or health security, or housing security? Or, to put it another way, which would you rather be entitled to: food, housing, education, and healthcare, or your personal share of a shiny new hypersonic missile?

But wait! Maybe defense spending contributes to our economic security by creating, as Donald Trump boasted in promoting his arms deals with Saudi Arabia, "jobs, jobs, jobs." It's true that spending on weaponry does, in fact, create jobs, just not nearly as many as investing taxpayer dollars in a variety of far less lethal endeavors would. As Brown University's Costs of War project reports:

"Military spending creates fewer jobs than the same amount of money would have, if invested in other sectors. Clean energy and health care spending create 50% more jobs than the equivalent amount of spending on the military. Education spending creates more than twice as many jobs."

It seems that President Joe Biden is ready to shake things up by attacking child poverty, the coronavirus pandemic, and climate change, even if he has to do it without any Republican support. But he's still hewing to the old Cold War bipartisan alliance when it comes to the real third rail of American politics — military spending. Until the power can be cut to that metaphorical conduit, real national security remains an elusive dream.

The reality of work in the Biden-Harris era

A year ago, just a few weeks before San Francisco locked itself down for the pandemic, I fell deeply in love with a 50-year-old. The object of my desire was a wooden floor loom in the window of my local thrift shop. Friends knowledgeable on such matters examined photos I took of it and assured me that all the parts were there, so my partner (who puts up with such occasional infatuations) helped me wrangle it into one of our basement rooms and I set about learning to weave.

These days, all I want to do is weave. The loom that's gripped me, and the pandemic that's gripped us all, have led me to rethink the role of work (and its subset, paid labor) in human lives. During an enforced enclosure, this 68-year-old has spent a lot of time at home musing on what the pandemic has revealed about how this country values work. Why, for example, do the most "essential" workers so often earn so little — or, in the case of those who cook, clean, and care for the people they live with, nothing at all? What does it mean when conservatives preach the immeasurable value of labor, while insisting that its most basic price in the marketplace shouldn't rise above $7.25 per hour?

That, after all, is where the federal minimum wage has been stuck since 2009. And that's where it would probably stay forever, if Republicans like Kansas Senator Roger Marshall had their way. He brags that he put himself through college making $6 an hour and doesn't understand why people can't do the same today for $7.25. One likely explanation: the cost of a year at Kansas State University has risen from $898 when he was at school to $10,000 today. Another? At six bucks an hour, he was already making almost twice the minimum wage of his college years, a princely $3.35 an hour.
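One way to sharpen that comparison is to count the hours of work a year's tuition actually requires. This is a rough illustration using only the figures cited above (Marshall's $6-an-hour wage then, today's $7.25 federal minimum, and the two tuition figures).

```python
# Hours of work needed to cover a year's tuition at Kansas State,
# using only the figures cited above: Senator Marshall's $6/hour wage
# against $898 tuition then, versus $7.25/hour against ~$10,000 now.

then_hours = 898 / 6.00        # roughly 150 hours
now_hours = 10_000 / 7.25      # roughly 1,380 hours

print(f"Hours for a year's tuition then: {then_hours:,.0f}")
print(f"Hours for a year's tuition now:  {now_hours:,.0f}")
print(f"Ratio: {now_hours / then_hours:.1f}x as many hours today")
```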

It's Definitely Not Art, But Is It Work?

It's hard to explain the pleasure I've gotten from learning the craft of weaving, an activity whose roots extend at least 20,000 years into the past. In truth, I could devote the next (and most likely last) 20 years of my life just to playing with "plain weave," its simplest form — over-under, over-under — and not even scratch the surface of its possibilities. Day after day, I tromp down to our chilly basement and work with remarkable satisfaction at things as simple as getting a straight horizontal edge across my cloth.

But is what I'm doing actually "work"? Certainly, at the end of a day of bending under the loom to tie things up, of working the treadles to raise and lower different sets of threads, my aging joints are sore. My body knows all too well that I've been doing something. But is it work? Heaven knows, I'm not making products crucial to our daily lives or those of others. (We now possess more slightly lopsided cloth napkins than any two-person household could use in a lifetime.) Nor, at my beginner's level, am I producing anything that could pass for "art."

I don't have to weave. I could buy textiles for a lot less than it costs me to make them. But at my age, in pandemic America, I'm lucky. I have the time, money, and freedom from personal responsibilities to be able to immerse myself in making cloth. For me, playing with string is a first-world privilege. It won't help save humanity from a climate disaster or reduce police violence in communities of color. It won't even help a union elect an American president, something I was focused on last fall, while working with the hospitality-industry union. It's not teaching college students to question the world and aspire to living examined lives, something I've done in my official work as a part-time professor for the last 15 years. It doesn't benefit anyone but me.

Nevertheless, what I'm doing certainly does have value for me. It contributes, as philosophers might say, to my human flourishing. When I practice weaving, I'm engaged in something political philosopher Iris Marion Young believed essential to a good life. As she put it, I'm "learning and using satisfying and expansive skills." Young thought that a good society would offer all its members the opportunity to acquire and deploy such complicated skills in "socially recognized settings." In other words, a good society would make it possible for people to do work that was both challenging and respected.

Writing in the late 1980s, she took for granted that the "welfare capitalism" of Europe and, to a far lesser extent, the United States would provide for people's basic material needs. Unfortunately, decades later, it's hard even to teach her critique of such welfare capitalism — a system that sustained lives but didn't necessarily allow them to flourish — because my students here have never experienced an economic system that assumes any real responsibility for sustaining life. Self-expression and an opportunity to do meaningful work? Pipe dreams if you aren't already well-off! They'll settle for jobs that pay the rent, keep the refrigerator stocked, and maybe provide some health benefits as well. That would be heaven enough, they say. And who could blame them when so many jobs on offer will fall far short of even such modest goals?

What I'm not doing when I weave is making money. I'm not one of the roughly 18 million workers in this country who do earn their livings in the textile industry. Such "livings" pay a median wage of about $28,000 a year, which likely makes it hard to keep a roof over your head. Nor am I one of the many millions more who do the same around the world, people like Seak Hong who sews garments and bags for an American company in Cambodia. Describing her life, she told a New York Times reporter, "I feel tired, but I have no choice. I have to work." Six days a week,

"Ms. Hong wakes up at 4:35 a.m. to catch the truck to work from her village. Her workday begins at 7 and usually lasts nine hours, with a lunch break. During the peak season, which lasts two to three months, she works until 8:30 p.m."
"Ms. Hong has been in the garment business for 22 years. She earns the equivalent of about $230 a month and supports her father, her sister, her brother (who is on disability) and her 12-year-old son."

Her sister does the unpaid — but no less crucial — work of tending to her father and brother, the oxen, and their subsistence rice plants.

Hong and her sister are definitely working, one with pay, the other without. They have, as she says, no choice.

Catherine Gamet, who makes handbags in France for Louis Vuitton, is also presumably working to support herself. But hers is an entirely different experience from Hong's. She loves what she's been doing for the last 23 years. Interviewed in the same article, she told the Times, "To be able to build bags and all, and to be able to sew behind the machine, to do hand-sewn products, it is my passion." For Gamet, "The time flies by."

Both these women have been paid to make bags for more than 20 years, but they've experienced their jobs very differently, undoubtedly thanks to the circumstances surrounding their work, rather than the work itself: how much they earn; the time they spend traveling to and from their jobs; the extent to which the "decision" to do a certain kind of work is coerced by fear of poverty. We don't learn from Hong's interview how she feels about the work itself. Perhaps she takes pride in what she does. Most people find a way to do that. But we know that making bags is Gamet's passion. Her work is not merely exhausting, but in Young's phrase "satisfying and expansive." The hours she spends on it are lived, not just endured as the price of survival.

Pandemic Relief and Its Discontents

Joe Biden and Kamala Harris arrived at the White House with a commitment to getting a new pandemic relief package through Congress as soon as possible. It appears that they'll succeed, thanks to the Senate's budget reconciliation process — a maneuver that bypasses the possibility of a Republican filibuster. Sadly, because resetting the federal minimum wage to $15 per hour doesn't directly involve taxation or spending, the Senate's parliamentarian ruled that the reconciliation bill can't include it.

Several measures contained in the package have aroused conservative mistrust, from the extension of unemployment benefits to new income supplements for families with children. Such measures provoke a Republican fear that somebody, somewhere, might not be working hard enough to "deserve" the benefits Congress is offering or that those benefits might make some workers think twice about sacrificing their time caring for children to earn $7.25 an hour at a soul-deadening job.

As New York Times columnist Ezra Klein recently observed, Republicans are concerned that such measures might erode respect for the "natural dignity" of work. In an incisive piece, he rebuked Republican senators like Mike Lee and Marco Rubio for responding negatively to proposals to give federal dollars to people raising children. Such a program, they insisted, smacked of — the horror! — "welfare," while in their view, "an essential part of being pro-family is being pro-work." Of course, for Lee and Rubio "work" doesn't include changing diapers, planning and preparing meals, doing laundry, or helping children learn to count, tell time, and tie their shoelaces — unless, of course, the person doing those things is employed by someone else's family and being paid for it. In that case it qualifies as "work." Otherwise, it's merely a form of government-subsidized laziness.

There is, however, one group of people that "pro-family" conservatives have long believed are naturally suited to such activities and who supposedly threaten the well-being of their families if they choose to work for pay instead. I mean, of course, women whose male partners earn enough to guarantee food, clothing, and shelter with a single income. I remember well a 1993 article by Pat Gowens, a founder of Milwaukee's Welfare Warriors, in the magazine Lesbian Contradiction. She wondered why conservative anti-feminists of that time thought it good if a woman with children had a man to provide those things, but an outrage if she turned to "The Man" for the same aid. In the first case, the woman's work is considered dignified, sacred, and in tune with the divine plan. Among conservatives, then or now, the second could hardly be dignified with the term "work."

The distinction they make between private and public paymasters, when it comes to domestic labor, contains at least a tacit, though sometimes explicit, racial element. When the program that would come to be known as "welfare" was created as part of President Franklin Roosevelt's New Deal in the 1930s, it was originally designed to assist respectable white mothers who, through no fault of their own, had lost their husbands to death or desertion. It wasn't until the 1960s that African American women decided to secure their right to coverage under the same program and built the National Welfare Rights Organization to do so.

The word "welfare" refers, as in the preamble to the Constitution, to human wellbeing. But when Black women started claiming those rights, it suddenly came to signify undeserved handouts. You could say that Ronald Reagan rode into the White House in 1980 in a Cadillac driven by the mythical Black "welfare queen" he continually invoked in his campaign. It would be nice to think that the white resentment harnessed by Reagan culminated (as in "reached its zenith and will now decline") with Trump's 2016 election, but, given recent events, that would be unrealistically optimistic.

Reagan began the movement to undermine the access of poor Americans to welfare programs. Ever since, starving the entitlement beast has been the Republican lodestar. In the same period, of course, the wealthier compatriots of those welfare mothers have continued to receive ever more generous "welfare" from the government. Those would include subsidies to giant agriculture, oil-depletion allowances and other subsidies for fossil-fuel companies, the mortgage-interest tax deduction for people with enough money to buy rather than rent their homes, and the massive tax cuts for billionaires of the Trump era. However, it took a Democratic president, Bill Clinton, to achieve what Reagan couldn't, and, as he put it, "end welfare as we know it."

The Clinton administration used the same Senate reconciliation process in play today for the Biden administration's Covid-19 relief bill to push through the 1996 Personal Responsibility and Work Opportunity Reconciliation Act. It was more commonly known as "welfare reform." That act imposed a 32-hour-per-week work or training requirement on mothers who received what came to be known as Temporary Assistance for Needy Families. It also gave "temporary" its deeper meaning by setting a lifetime benefits cap of five years. Meanwhile, that same act proved a bonanza for non-profits and Private Industry Councils that got contracts to administer "job training" programs and were paid to teach women how to wear skirts and apply makeup to impress future employers. In the process, a significant number of unionized city and county workers nationwide were replaced with welfare recipients "earning" their welfare checks by sweeping streets or staffing county offices, often for less than the minimum wage.

In 1997, I was working with Californians for Justice (CFJ), then a new statewide organization dedicated to building political power in poor communities, especially those of color. Given the high unemployment rates in just such communities, our response to Clinton's welfare reforms was to demand that those affected by them at least be offered state-funded jobs at a living wage. If the government was going to make people work for pay, we reasoned, then it should help provide real well-paying jobs, not bogus "job readiness" programs. We secured sponsors in the state legislature, but I'm sure you won't be shocked to learn that our billion-dollar jobs bill never got out of committee in Sacramento.

CFJ's project led me into an argument with one of my mentors, the founder of the Center for Third World Organizing, Gary Delgado. Why on earth, he asked me, would you campaign to get people jobs? "Jobs are horrible. They're boring: they waste people's lives and destroy their bodies." In other words, Gary was no believer in the inherent dignity of paid work. So, I had to ask myself, why was I?

Among those who have inspired me, Gary wasn't alone in holding such a low opinion of jobs. The Greek philosopher Aristotle, for instance, was convinced that those whose economic condition forced them to work for a living would have neither the time nor the space necessary to live a life of "excellence" (his requirement for human happiness). Economic coercion and a happy life were, in his view, mutually exclusive.

Reevaluating Jobs

One of the lies capitalism tells us is that we should be grateful for our jobs and should think of those who make a profit from our labor not as exploiters but as "job creators." In truth, however, there's no creativity involved in paying people less than the value of their work so that you can skim off the difference and claim that you earned it. Even if we accept that there could be creativity in "management" — the effort to organize and divide up work so it's done efficiently and well — it's not the "job creators" who do that, but their hirelings. All the employers bring to the game is money.

Take the example of the admirable liberal response to the climate emergency, the Green New Deal. In the moral calculus of capitalism, it's not enough that shifting to a green economy could promote the general welfare by rebuilding and extending the infrastructure that makes modern life possible and rewarding. It's not enough that it just might happen in time to save billions of people from fires, floods, hurricanes, or starvation. What matters — the selling point — is that such a conversion would create jobs (along with the factor no one mentions out loud: profits).

Now, I happen to support exactly the kind of work involved in building an economy that could help reverse climate devastation. I agree with Joe Biden's campaign statement that such an undertaking could offer people jobs with "good wages, benefits, and worker protections." More than that, such jobs would indeed contribute to a better life for those who do them. As the philosopher Iris Marion Young puts it, they would provide the chance to learn and use "satisfying and expansive skills in a socially recognized setting." And that would be a very good thing even if no one made a penny of profit in the process.

Now, having finished my paid labor for the day, it's back to the basement and loom for me.

Copyright 2021 Rebecca Gordon

Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.

How 4 years of Trump brought us closer to doomsday

If you live in California, you're likely to be consumed on occasion by thoughts of fire. That's not surprising, given that, in the last year alone, actual fires consumed over four and a quarter million acres of the state, taking with them 10,488 structures, 33 human lives, and who knows how many animals. By the end of this January, a month never before even considered part of the "fire" season, 10 wildfires had already burned through 2,337 more acres, according to the California Department of Forestry and Fire Protection (CalFire).

With each passing year, the state's fire season arrives earlier and does greater damage. In 2013, a mere eight years ago, fires consumed about 602,000 acres and started significantly later. That January, CalFire reported only a single fire, just two in February, and none in March. Fire season didn't really begin until April and had tapered off before year's end. This past December, however, 10 fires still burned at least 10,000 acres. In fact, it almost doesn't make sense to talk about a fire "season" anymore. Whatever the month, wildfires are likely to be burning somewhere in the state.

Clearly, California's fires (along with Oregon's and Washington's) are getting worse. Just as clearly, notwithstanding Donald Trump's exhortations to do a better job of "raking" our forests, climate change is the main cause of this growing disaster.

Fortunately, President Joe Biden seems to take the climate emergency seriously. In just his first two weeks in office, he's canceled the Keystone XL pipeline project, paused new leasing for oil and gas drilling on public lands, and announced a plan to convert the entire federal fleet of cars and trucks to electric vehicles. Perhaps most important of all, he's bringing the U.S. back into the Paris climate accords, signaling an understanding that a planetary crisis demands planetwide measures and that the largest carbon-emitting economies should be leading the way. "This isn't [the] time for small measures," Biden has said. "We need to be bold."

Let's just hope that such boldness has arrived in time and that the Biden administration proves unwilling to sacrifice the planet on an altar of elusive congressional unity and illusionary bipartisanship.

Another Kind of Fire

If climate change threatens human life as we know it, so does another potential form of "fire" — the awesome power created when a nuclear reaction converts matter to energy. This is the magic of Einstein's observation that E = mc², or that the energy contained in a bit of matter is equal to its mass (roughly speaking, its weight) multiplied by the square of the speed of light. As we've all known since August 6, 1945, when an atomic bomb was dropped on the Japanese city of Hiroshima, that's an awful lot of energy. When a nuclear reaction is successfully controlled, the energy can be regulated and used to produce electricity without emitting carbon dioxide in the process.
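
Just how much energy? Here's a bit of back-of-the-envelope arithmetic of my own (a rough illustration, not anything the physicists need vouch for): converting a single gram of matter, roughly the mass of a paperclip, would release

\[
E = mc^{2} = (10^{-3}\,\mathrm{kg}) \times (3 \times 10^{8}\,\mathrm{m/s})^{2} \approx 9 \times 10^{13}\ \mathrm{joules},
\]

which works out to roughly 21 kilotons of TNT, on the order of the yield of the bomb that destroyed Nagasaki.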

Unfortunately, while nuclear power plants don't add greenhouse gasses to the atmosphere, they do create radioactive waste, some of which remains deadly for thousands of years. Industry advocates who argue for nuclear power as a "green" alternative generally ignore the problem, which has yet to be solved, of disposing of that waste.

In what hopefully is just a holdover from the Trump administration, the Energy Department website still "addresses" this issue by suggesting that all the nuclear waste produced to date "could fit on a football field at a depth of less than 10 yards!" The site neglects to add that, if you shoved those roughly 1.7 million cubic feet of nuclear waste together the wrong way, the resultant explosive chain reaction would probably wipe out most life on Earth.
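
For the record, that figure is my own back-of-the-envelope calculation, assuming a standard 360-by-160-foot field (end zones included) filled to the full 10 yards:

\[
360\ \mathrm{ft} \times 160\ \mathrm{ft} \times 30\ \mathrm{ft} = 1{,}728{,}000\ \mathrm{ft}^{3} \approx 1.7\ \text{million cubic feet}.
\]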

Remember, too, that "controlled" nuclear reactions don't always remain under human control. Ask anyone who lived near the Three Mile Island nuclear reactor in Pennsylvania, the Chernobyl nuclear power plant in Ukraine, or the Fukushima Daiichi nuclear power plant in Japan.

There is, however, another far more devastating form of "controlled" nuclear reaction, the kind created when a nuclear bomb explodes. Only one country has ever deployed atomic weapons in war, of course: the United States, in its attack on Hiroshima and, three days later, on Nagasaki. The Hiroshima bomb was of the older uranium-fueled variety, the Nagasaki one used plutonium, and both were puny by the standards of today's nuclear weapons. Still, the horror of those attacks was sufficient to convince many that such weapons should never be used again.

Treaties and Entreaties

In the decades since 1945, various configurations of nations have agreed to treaties prohibiting the use of, or limiting the proliferation of, nuclear weapons — even as the weaponry spread and nuclear arsenals grew. In the Cold War decades, the most significant of these were the bilateral pacts between the two superpowers of the era, the U.S. and the Soviet Union. When the latter collapsed in 1991, Washington signed treaties instead with the Russian Federation government, the most recent being the New START treaty, which came into effect in 2011 and was just extended by Joe Biden and Vladimir Putin.

In addition to such bilateral agreements, the majority of nations on the planet agreed on various multilateral pacts, including the Nuclear Non-Proliferation Treaty, or NPT, which has been signed by 191 countries and has provided a fairly effective mechanism for limiting the spread of such arms. Today, there are still "only" nine nuclear-armed states. Five of them — China, France, Russia, the United Kingdom, and the United States — are parties to the NPT and acknowledge possessing such weaponry. Israel, which never joined the pact, has also never publicly acknowledged its growing nuclear arsenal. India and Pakistan have never signed the treaty at all, and North Korea withdrew from it in 2003. Worse yet, in 2005, the George W. Bush administration inked a side-deal with India that gave Washington's blessing to the acceleration of that country's nuclear weapons development program outside the monitoring constraints of the NPT.

The treaty assigns to the International Atomic Energy Agency (IAEA) the authority to monitor compliance. It was this treaty, for example, that gave the IAEA the right to inspect Iraq's nuclear program in the period before the U.S. invaded in 2003. Indeed, the IAEA repeatedly reported that Iraq was, in fact, in compliance with the treaty in the months that preceded the invasion, despite the claims of the Bush administration that Iraqi ruler Saddam Hussein had such weaponry. The United States must act, President Bush insisted then, before the "smoking gun" of proof the world demanded turned out to be a "mushroom cloud" over some American city. As became clear after the first few months of the disastrous U.S. military occupation, there simply were no weapons of mass destruction in Iraq. (At least partly in recognition of the IAEA's attempts to forestall that U.S. invasion, the agency and its director general, Mohamed El Baradei, would receive the 2005 Nobel Peace Prize.)

Like Iraq, Iran signed the NPT in 1968, laying the foundation for ongoing IAEA inspections there. In recent years, having devastated Iraq's social, economic, and political infrastructure, the United States shifted its concern about nuclear proliferation to Iran. In 2015, along with China, Russia, France, the United Kingdom, Germany, and the European Union, the Obama administration signed the Joint Comprehensive Plan of Action (JCPOA), informally known as the Iran nuclear deal.

Under the JCPOA, in return for the lifting of onerous economic sanctions that were affecting the whole population, Iran agreed to limit the development of its nuclear capacity to the level needed to produce electricity. Again, IAEA scientists would be responsible for monitoring the country's compliance, which by all accounts was more than satisfactory — at least until 2018. That's when President Donald Trump unilaterally pulled the U.S. out of the agreement and reimposed heavy sanctions. Since then, as its economy has again been crushed, Iran has, understandably enough, grown reluctant to uphold its end of the bargain.

In the years since 1945, the world has seen treaties signed to limit or ban the testing of nuclear weapons or to cap the size of nuclear arsenals, as well as bilateral treaties to decommission parts of existing ones, but never a treaty aimed at outlawing nuclear weapons altogether. Until now. On January 22, 2021, the United Nations Treaty on the Prohibition of Nuclear Weapons took effect. Signed so far by 86 countries, the treaty represents "a legally binding instrument to prohibit nuclear weapons, leading towards their total elimination," according to the U.N. Sadly, but unsurprisingly, none of the nine nuclear powers are signatories.

"Fire and Fury"

I last wrote about nuclear danger in October 2017 when Donald Trump had been in the White House less than a year and, along with much of the world, I was worried that he might bungle his way into a war with North Korea. Back then, he and Kim Jong-un had yet to fall in love or to suffer their later public breakup. Kim was still "Little Rocket Man" to Trump, who had threatened to "rain fire and fury like the world has never seen" on North Korea.

The world did, in the end, survive four years of a Trump presidency without a nuclear war, but that doesn't mean he left us any safer. On the contrary, he took a whole series of rash steps leading us closer to nuclear disaster:

  • He pulled the U.S. out of the JCPOA, thereby destabilizing the Iran nuclear agreement and reigniting Iran's threats of (and apparent steps toward) someday developing nuclear weapons.
  • He withdrew from the 1987 Intermediate Range Nuclear Forces Treaty between the U.S. and the Soviet Union (later the Russian Federation), which, according to the nonpartisan Arms Control Association,
"required the United States and the Soviet Union to eliminate and permanently forswear all of their nuclear and conventional ground-launched ballistic and cruise missiles with ranges of 500 to 5,500 kilometers. The treaty marked the first time the superpowers had agreed to reduce their nuclear arsenals, eliminate an entire category of nuclear weapons, and employ extensive on-site inspections for verification."
  • He withdrew from the Open Skies Treaty, which gave signatories permission to fly over each other's territories to identify military installations and activities. Allowing this kind of access was meant to contribute to greater trust among nuclear-armed nations.
  • He threatened to allow the New START Treaty to expire, should he be reelected.
  • He presided over a huge increase in spending on the "modernization" of the U.S. nuclear arsenal, including on new submarine- and land-based launching capabilities. A number of these programs are still in their initial stages and could be stopped by the Biden administration.

In January 2021, after four years of Trump, the Bulletin of the Atomic Scientists kept its "Doomsday Clock" at a mere 100 seconds to midnight, the closest to catastrophe its hands have ever stood. Since 1947, the Clock's periodic resetting has reflected how close, in the view of the Bulletin's esteemed scientists and Nobel laureates, humanity has come to ending it all. As the Bulletin's editors note, "The Clock has become a universally recognized indicator of the world's vulnerability to catastrophe from nuclear weapons, climate change, and disruptive technologies in other domains."

Why so close to midnight? The magazine lists a number of reasons, including the increased danger of nuclear war, due in large part to steps taken by the United States in the Trump years, as well as to the development of "hypersonic" missiles, which are supposed to fly at five times the speed of sound and so evade existing detection systems. (Trump famously referred to these "super-duper" weapons as "hydrosonic," a term that actually describes a kind of toothbrush.) There is disagreement among weapons experts about the extent to which such delivery vehicles will live up to the (hyper) hype about them, but the effort to build them is destabilizing in its own right.

The Bulletin points to a number of other factors that place humanity in ever greater danger. One is, of course, the existential threat of climate change. Another is the widespread dissemination of "false and misleading information." The spread of lies about Covid-19, its editors say, exemplifies the life-threatening nature of a growing "wanton disregard for science and the large-scale embrace of conspiratorial nonsense." This is, they note, "often driven by political figures and partisan media." Such attacks on knowledge itself have "undermined the ability of responsible national and global leaders to protect the security of their citizens."

Passing the (Nuclear) Ball

When Donald Trump announced that he wouldn't attend the inauguration of Joe Biden and Kamala Harris, few people were surprised. After all, he was still insisting that he'd actually won the election, even after that big lie fueled an insurrectionary invasion of the Capitol. But there was another reason for concern: if Trump was going to be at Mar-a-Lago, how would he hand over the "nuclear football" to the new president? That "football" is, in fact, a briefcase containing the nuclear launch codes, which presidents always have with them. Since the dawn of the nuclear age, it's been passed from the outgoing president to the new one on Inauguration Day.

Consternation! The problem was resolved through the use of two briefcases, which were simultaneously deactivated and activated at 11:59:59 a.m. on January 20th, just as Biden was about to be sworn in.

The football conundrum pointed to a far more serious problem, however — that the fate of humanity regularly hangs on the actions of a single individual (whether as unbalanced as Donald Trump or as apparently sensible as Joe Biden) who has the power to begin a war that could end our species.

There's good reason to think that Joe Biden will be more reasonable about the dangers of nuclear warfare than the narcissistic idiot he succeeds. In addition to agreeing to extend the New START treaty, he's also indicated a willingness to rejoin the Iran nuclear deal and criticized Trump's nuclear buildup. Nevertheless, the power to end the world shouldn't lie with one individual. Congress could address this problem, by (as I suggested in 2017) enacting "a law that would require a unanimous decision by a specified group of people (for example, officials like the secretaries of state and defense together with the congressional leadership) for a nuclear first strike."

The Fire Next Time?

"God gave Noah the rainbow sign
No more water but the fire next time"

These words come from the African-American spiritual "I Got a Home in that Rock." The verse refers to God's promise to Noah in Genesis, after the great flood, never again to destroy all life on earth, a promise signified by the rainbow.

Those who composed the hymn may have been a bit less trusting of God — or of human destiny — than the authors of Genesis, since the Bible account says nothing about fire or a next time. Sadly, recent human history suggests that there could indeed be a next time. If we do succeed in destroying ourselves, it seems increasingly likely that it will be by fire, whether the accelerating heating of the globe over decades, or a nuclear conflagration any time we choose. The good news, the flame of hope, is that we still have time — at least 100 seconds — to prevent it.

Copyright 2021 Rebecca Gordon


Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.

The signs that our American empire is crumbling

How can you tell when your empire is crumbling? Some signs are actually visible from my own front window here in San Francisco.

Directly across the street, I can see a collection of tarps and poles (along with one of my own garbage cans) that were used to construct a makeshift home on the sidewalk. Beside that edifice stands a wooden cross decorated with a string of white Christmas lights and a red ribbon — a memorial to the woman who built that structure and died inside it earlier this week. We don't know — and probably never will — what killed her: the pandemic raging across California? A heart attack? An overdose of heroin or fentanyl?

Behind her home and similar ones is a chain-link fence surrounding the empty playground of the Horace Mann/Buena Vista elementary and middle school. Like that home, the school, too, is now empty, closed because of the pandemic. I don't know where the families of the 20 children who attended that school, and who lived in one of its gyms as an alternative to the streets, have gone. They used to eat breakfast and dinner there every day, served on the same sidewalk by a pair of older Latina women who apparently had a contract from the school district to cook for the families using that school-cum-shelter. I don't know, either, what any of them are now doing for money or food.

Just down the block, I can see the line of people that has formed every weekday since early December. Masked and socially distanced, they wait patiently to cross the street, one at a time, for a Covid test at a center run by the San Francisco Department of Health. My little street seems an odd choice for such a service, since — especially now that the school has closed — it gets little foot traffic. Indeed, as our neighborhood paper Mission Local reported, a representative of the Latino Task Force, an organization created to inform the city's Latinx population about Covid resources, suggested the location was no accident:

"Small public health clinics such as this one 'will say they want to do more outreach, but I actually think they don't want to.' He believes they chose a low-trafficked street like Bartlett to stay under the radar. 'They don't want to blow the spot up, because it does not have a large capacity.'"

What do any of these very local sights have to do with a crumbling empire? They're signs that some of the same factors that fractured the Roman empire back in 476 CE (and others since) are distinctly present in this country today — even in California, one of its richest states. I'm talking about phenomena like gross economic inequality; over-spending on military expansion; political corruption; deep cultural and political fissures; and, oh yes, the barbarians at the gates. I'll turn to those factors in a moment, but first let me offer a brief defense of the very suggestion that U.S. imperialism and an American empire actually exist.

Imperialism? What's That Supposed to Mean?

What better source for a definition of imperialism than the Encyclopedia Britannica, that compendium of knowledge first printed in 1768 in the country that became the great empire of the nineteenth and first part of the twentieth centuries? According to the Encyclopedia, "imperialism" denotes "state policy, practice, or advocacy of extending power and dominion, especially by direct territorial acquisition or by gaining political and economic control of other areas." Furthermore, imperialism "always involves the use of power, whether military or economic or some subtler form." In other words, the word indicates a country's attempts to control and reap economic benefit from lands outside its borders.

In that context, "imperialism" is an accurate description of the trajectory of U.S. history, starting with the country's expansion across North America, stealing territory and resources from Indian nations and decimating their populations. The newly independent United States would quickly expand, beginning with the 1803 Louisiana Purchase from France. That deal, which effectively doubled its territory, included most of what would become the state of Louisiana, together with some or all of the present-day states of New Mexico, Texas, Arkansas, Missouri, Oklahoma, Kansas, Colorado, Iowa, Nebraska, Wyoming, Minnesota, North and South Dakota, Montana, and even small parts of what are today the Canadian provinces of Alberta and Saskatchewan.


Of course, France didn't actually control most of that land, apart from the port city of New Orleans and its immediate environs. What Washington bought was the "right" to take the rest of that vast area from the native peoples who lived there, whether by treaty, population transfers, or wars of conquest and extermination. The first objective of that deal was to settle land on which to expand the already hugely lucrative cotton business, that economic engine of early American history fueled, of course, by slave labor. It then supplied raw materials to the rapidly industrializing textile industry of England, which drove that country's own imperial expansion.

U.S. territorial expansion continued as, in 1819, Florida was acquired from Spain and, in 1845, Texas was forcibly annexed from Mexico (as well as various parts of California a year later). All of those acquisitions accorded with what newspaper editor John O'Sullivan would soon call the country's manifest — that is, clear and obvious — destiny to control the entire continent.

Eventually, such expansionism escaped even those continental borders, as the country went on to gobble up the Philippines, Hawaii, the Panama Canal Zone, the Virgin Islands, Puerto Rico, Guam, American Samoa, and the Mariana Islands, the last five of which remain U.S. territories to this day. (Inhabitants of the nation's capital, where I grew up, were only partly right when we used to refer to Washington, D.C., as "the last colony.")

American Doctrines from Monroe to Truman to (G.W.) Bush

U.S. economic, military, and political influence has long extended far beyond those internationally recognized possessions and various presidents have enunciated a series of "doctrines" to legitimate such an imperial reach.

Monroe: The first of these was the Monroe Doctrine, introduced in 1823 in President James Monroe's penultimate State of the Union address. He warned the nations of Europe that, while the United States recognized existing colonial possessions in the Americas, it would not permit the establishment of any new ones.

President Teddy Roosevelt would later add a corollary to Monroe's doctrine by establishing Washington's right to intercede in any country in the Americas that, in the view of its leaders, was not being properly run. "Chronic wrongdoing," he said in a 1904 message to Congress, "may in America, as elsewhere, ultimately require intervention by some civilized nation." The United States, he suggested, might find itself forced, "however reluctantly, in flagrant cases of such wrongdoing or impotence, to the exercise of an international police power." In the first quarter of the twentieth century, that Roosevelt Corollary would be used to justify U.S. occupations of Cuba, the Dominican Republic, Haiti, and Nicaragua.

Truman: Teddy's cousin, President Franklin D. Roosevelt, publicly renounced his cousin's interventionist corollary to the Monroe Doctrine and promised a hands-off attitude toward Latin America, which came to be known as the Good Neighbor Policy. It didn't last long, however. In a 1947 address to Congress, the next president, Harry S. Truman, laid out what came to be known as the Truman Doctrine, which would underlie the country's foreign policy at least until the collapse of the Soviet Union in 1991. It held that U.S. national security interests required the "containment" of existing Communist states and the prevention of the further spread of Communism anywhere on Earth.

It almost immediately led to interventions in the internal struggles of Greece and Turkey and would eventually underpin Washington's support for dictators and repressive regimes from El Salvador to Indonesia. It would justify U.S.-backed coups in places like Iran, Guatemala, and Chile. It would lead this country into a futile war in Korea and a disastrous defeat in Vietnam.

That post-World War II turn to anticommunism would be accompanied by a new kind of colonialism. Rather than directly annexing territories to extract cheap labor and cheaper natural resources, under this new "neocolonial" model, the United States — and soon the great multilateral institutions of the post-war era, the World Bank and the International Monetary Fund — would gain control over the economies of poor nations. In return for aid — or loans often pocketed by local elites and repaid by the poor — those nations would accede to demands for the "structural adjustment" of their economic systems: the privatization of public services like water and utilities (usually taken over by American or multinational corporations) and the defunding of human services like health and education. Such "adjustments," in turn, allowed the recipients to service the loans, extracting scarce hard currency from already deeply impoverished nations.

Bush: You might have thought that the fall of the Soviet empire and the end of the Cold War would have provided Washington with an opportunity to step away from resource extraction and the seemingly endless military and CIA interventions that accompanied it. You might have imagined that the country then being referred to as the "last superpower" would finally consider establishing new and different relationships with the other countries on this little planet of ours. However, just in time to prevent even the faint possibility of any such conversion came the terrorist attacks of 9/11, which gave President George W. Bush the chance to promote his very own doctrine.

In a break from postwar multilateralism, the Bush Doctrine outlined the neoconservative belief that, as the only superpower in a now supposedly "unipolar" world, the United States had the right to take unilateral military action any time it believed it faced external threat of any imaginable sort. The result: almost 20 years of disastrous "forever wars" and a military-industrial complex deeply embedded in our national economy. Although Donald Trump's foreign policy occasionally feinted in the direction of isolationism in its rejection of international treaties, protocols, and organizational responsibilities, it still proved itself a direct descendant of the Bush Doctrine. After all, it was Bush who first took the United States out of the Anti-Ballistic Missile Treaty and rejected the Kyoto Protocol to fight climate change.

His doctrine instantly set the stage for the disastrous invasion and occupation of Afghanistan, the even more disastrous Iraq War, and the present-day over-expansion of the U.S. military presence, overt and covert, in practically every corner of the world. And now, to fulfill Donald Trump's Star Trek fantasies, even in outer space.

An Empire in Decay

If you need proof that the last superpower, our very own empire, is indeed crumbling, consider the year we've just lived through, not to mention the first few weeks of 2021. I mentioned above some of the factors that contributed to the collapse of the famed Roman empire in the fifth century. It's fair to say that some of those same things are now evident in twenty-first-century America. Here are four obvious candidates:

Grotesque Economic Inequality: Ever since President Ronald Reagan began the Republican Party's long war on unions and working people, economic inequality has steadily increased in this country, punctuated by terrible shocks like the Great Recession of 2007-2008 and, of course, by the Covid-19 disaster. We've seen 40 years of tax reductions for the wealthy, stagnant wages for the rest of us (including a federal minimum wage that hasn't changed since 2009), and attacks on programs like TANF (welfare) and SNAP (food stamps) that literally keep poor people alive.

The Romans relied on slave labor for basics like food and clothing. This country relies on super-exploited farm and food-factory workers, many of whom are unlikely to demand more or better because they came here without authorization. Our (extraordinarily cheap) clothes are mostly produced by exploited people in other countries.

The pandemic has only exposed what so many people already knew: that the lives of the millions of working poor in this country are growing ever more precarious and desperate. The gulf between rich and poor widens by the day to unprecedented levels. Indeed, as millions have descended into poverty since the pandemic began, the Guardian reports that this country's 651 billionaires have increased their collective wealth by $1.1 trillion. That's more than the $900 billion Congress appropriated for pandemic aid in the omnibus spending bill it passed at the end of December 2020.

An economy like ours, which depends so heavily on consumer spending, cannot survive the deep impoverishment of so many people. Those 651 billionaires are not going to buy enough toys to dig us out of this hole.

Wild Overspending on the Military: At the end of 2020, Congress overrode Trump's veto of the annual National Defense Authorization Act, which provided a stunning $741 billion to the military this fiscal year. (That veto, by the way, wasn't in response to the vast sums being appropriated in the midst of a devastating pandemic, but to the bill's provisions for renaming military bases currently honoring Confederate generals, among other extraneous things.) In the same period, Congress passed that omnibus pandemic spending bill, which included the accompanying $696 billion appropriation for the Defense Department.

All that money for "security" might be justified, if it actually made our lives more secure. In fact, our federal priorities virtually take food out of the mouths of children to feed the maw of the military-industrial complex and the never-ending wars that go with it. Even before the pandemic, more than 10% of U.S. families regularly experienced food insecurity. Now, it's a quarter of the population.

Corruption So Deep It Undermines the Political System: Suffice it to say that the man who came to Washington promising to "drain the swamp" has presided over one of the most corrupt administrations in U.S. history. Whether it's been blatant self-dealing (like funneling government money to his own businesses); employing government resources to forward his reelection (including using the White House as a staging ground for parts of the Republican National Convention and his acceptance speech); tolerating corrupt subordinates like Secretary of Commerce Wilbur Ross; or contemplating a self-pardon, the Trump administration has set the bar high indeed for any future aspirants to the title of "most corrupt president."

One problem with such corruption is that it undermines the legitimacy of government in the minds of the governed. It makes citizens less willing to obey laws, pay taxes, or act for the common good by, for example, wearing masks and socially distancing during a pandemic. It rips apart social cohesion from top to bottom.

Of course, Trump's most dangerous corrupt behavior — one in which he's been joined by the most prominent elected and appointed members of his government and much of his party — has been his campaign to reject the results of the 2020 general election. The concerted and cynical promotion of the big lie that the Democrats stole that election has so corrupted faith in the legitimacy of government that up to 68% of Republicans now believe the vote was rigged to elect Joe Biden. At "best," Trump has set the stage for increased Republican suppression of the vote in communities of color. At worst, he has so poisoned the electoral process that a substantial minority of Americans will never again accept as free and fair an election in which their candidate loses.

A Country in Ever-Deepening Conflict: White supremacy has infected the entire history of this country, beginning with the near-extermination of its native peoples. The Constitution, while guaranteeing many rights to white men, proceeded to codify the enslavement of Africans and their descendants. In order to maintain that enslavement, the southern states seceded and fought a civil war. After a short-lived period of Reconstruction in which Black men were briefly enfranchised, white supremacy regained direct legal control in the South, and flourished in a de facto fashion in the rest of the country.

In 1858, two years before that civil war began, Abraham Lincoln addressed the Illinois Republican State Convention, reminding those present that

"'A house divided against itself cannot stand.' I believe this government cannot endure, permanently half slave and half free. I do not expect the Union to be dissolved — I do not expect the house to fall – but I do expect it will cease to be divided. It will become all one thing, or all the other."

More than 160 years later, the United States clearly not only remains but has become ever more divided. If you doubt that the Civil War is still being fought today, look no farther than the Confederate battle flags proudly displayed by members of the insurrectionary mob that overran the Capitol on January 6th.

Oh, and the barbarians? They are not just at the gate; they have literally breached it, as we saw in Washington when they burst through the doors and windows of the center of government.

Building a Country From the Rubble of Empire

Human beings have long built new habitations quite literally from the rubble — the fallen stones and timbers — of earlier ones. Perhaps it's time to think about what kind of a country this place — so rich in natural resources and human resourcefulness — might become if we were to take the stones and timbers of empire and construct a nation dedicated to the genuine security of all its people. Suppose we really chose, in the words of the preamble to the Constitution, "to promote the general welfare, and to secure the blessings of liberty to ourselves and our posterity."

Suppose we found a way to convert the desperate hunger for ever more, which is both the fuel of empires and the engine of their eventual destruction, into a new contentment with "enough"? What would a United States whose people have enough look like? It would not be one in which tiny numbers of the staggeringly wealthy made hundreds of billions more dollars and the country's military-industrial complex thrived in a pandemic, while so many others went down in disaster.

This empire will fall sooner or later. They all do. So, this crisis, just at the start of the Biden and Harris years, is a fine time to begin thinking about what might be built in its place. What would any of us like to see from our front windows next year?

Copyright 2021 Rebecca Gordon


Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.

Trump's broken promise shows how the American empire is rotting from within

It was the end of October 2001. Two friends, Max Elbaum and Bob Wing, had just dropped by. (Yes, children, believe it or not, people used to drop in on each other, maskless, once upon a time.) They had come to hang out with my partner Jan Adams and me. Among other things, Max wanted to get some instructions from fellow-runner Jan about taping his foot to ease the pain of plantar fasciitis. But it soon became clear that he and Bob had a bigger agenda for the evening. They were eager to recruit us for a new project.

And so began War Times/Tiempo de Guerras, a free, bilingual, antiwar tabloid that, at its height, distributed 100,000 copies every six weeks to more than 700 antiwar organizations around the country. It was already clear to the four of us that night -- as it was to millions around the world -- that the terrorist attacks of September 11th would provide the pretext for a major new projection of U.S. military power globally, opening the way to a new era of "all-war-all-the-time." War Times was a project of its moment (although the name would still be apt today, given that those wars have never ended). It would be superseded in a few years by the explosive growth of the Internet and the 24-hour news cycle. Still, it represented an early effort to fill the space where a peace movement would eventually develop.

All-War-All-the-Time -- For Some of Us

We were certainly right that the United States had entered a period of all-war-all-the-time. It's probably hard for people born since 9/11 to imagine how much -- and how little -- things changed after September 2001. By the end of that month, this country had already launched a "war" on an enemy that then-Secretary of Defense Donald Rumsfeld told us was "not just in Afghanistan," but in "50 or 60 countries, and it simply has to be liquidated."

Five years and two never-ending wars later, he characterized what was then called the war on terror as "a generational conflict akin to the Cold War, the kind of struggle that might last decades as allies work to root out terrorists across the globe and battle extremists who want to rule the world." A generation later, it looks like Rumsfeld was right, if not about the desires of the global enemy, then about the duration of the struggle.

Here in the United States, however, we quickly got used to being "at war." In the first few months, even interstate bus and train travelers often encountered a new and absurd kind of "security theater" (one that, in airports, we still encounter). I'm referring to those long, snaking lines in which people first learned to remove their belts and coats, later their hats and shoes, as ever newer articles of clothing were recognized as potential hiding places for explosives. Fortunately, the arrest of the Underwear Bomber never led the Transportation Security Administration to the obvious conclusion about the clothing travelers should have to remove next. We got used to putting our three-ounce containers of liquids (No more!) into quart-sized baggies (No bigger! No smaller!).

It was all-war-all-the-time, but mainly in those airports. Once the shooting wars started dragging on, if you didn't travel by airplane much or weren't deployed to Afghanistan or Iraq, it was hard to remember that we were still in war time at all. There were continuing clues for those who wanted to know, like the revelations of CIA torture practices at "black sites" around the world, the horrors of military prisons like the ones at Bagram Air Force Base in Afghanistan, Abu Ghraib in Baghdad, and the still-functioning prison complex at Guantánamo Bay, Cuba. And soon enough, of course, there were the hundreds and then thousands of veterans of the Iraq and Afghan wars taking their places among the unhoused veterans of earlier wars in cities across the United States, almost unremarked upon, except by service organizations.

So, yes, the wars dragged on at great expense, but with little apparent effect in this country. They even gained new names like "the long war" (as Donald Trump's Secretary of Defense James Mattis put it in 2017) or the "forever wars," a phrase now so common that it appears all over the place. But apart from devouring at least $6.4 trillion through September 2020 that might otherwise have been invested domestically in healthcare, education, infrastructure, or addressing poverty and inequality, and apart from creating increasingly militarized domestic police forces armed ever more lethally by the Pentagon, those forever wars had little obvious effect on the lives of most Americans.

Of course, if you happened to live in one of the places where this country has been fighting for the last 19 years, things are a little different. A conservative estimate by Iraq Body Count puts violent deaths among civilians in that country alone at 185,454 to 208,493 and Brown University's Costs of War project points out that even the larger figure is bound to be a significant undercount:

Several times as many Iraqi civilians may have died as an indirect result of the war, due to damage to the systems that provide food, health care, and clean drinking water, and as a result, illness, infectious diseases, and malnutrition that could otherwise have been avoided or treated.

And that's just Iraq. Again, according to the Costs of War Project, "At least 800,000 people have been killed by direct war violence in Iraq, Afghanistan, Syria, Yemen, and Pakistan."

Of course, many more people than that have been injured or disabled. And America's post-9/11 wars have driven an estimated 37 million people from their homes, creating the greatest human displacement since World War II. People in this country are rightly concerned about the negative effects of online schooling on American children amid the ongoing Covid-19 crisis (especially poor children and those in communities of color). Imagine, then, the effects on a child's education of losing her home and her country, as well as one or both parents, and then growing up constantly on the move or in an overcrowded, under-resourced refugee camp. The war on terror has truly become a war of generations.

Every one of the 2,977 lives lost on 9/11 was unique and invaluable. But the U.S. response has been grotesquely disproportionate -- and worse than we War Times founders could have imagined that October night so many years ago.

Those wars of ours have gone on for almost two decades now. Each new metastasis has been justified by George W. Bush's and then Barack Obama's use of the now ancient 2001 Authorization for the Use of Military Force (AUMF), which Congress passed in the days after 9/11. Its language actually limited presidential military action to a direct response to the 9/11 attacks and the prevention of future attacks by the same actors. It stated that the president

...is authorized to use all necessary and appropriate force against those nations, organizations, or persons he determines planned, authorized, committed, or aided the terrorist attacks that occurred on September 11, 2001, or harbored such organizations or persons, in order to prevent any future acts of international terrorism against the United States by such nations, organizations or persons.

Despite that AUMF's limited scope, successive presidents have used it to justify military action in at least 18 countries. (To be fair, President Obama realized the absurdity of his situation when he sent U.S. troops to Syria and tried to wring a new authorization out of Congress, only to be stymied by a Republican majority that wouldn't play along.)

In 2002, in the run-up to the Iraq War, Congress passed a second AUMF, which permitted the president to use the armed forces as "necessary and appropriate" to "defend U.S. national security against the continuing threat posed by Iraq." In January 2020, Donald Trump used that second authorization to justify the murder by drone of Qasem Soleimani, an Iranian general, along with nine other people.

Trump Steps In

In 2016, peace activists were preparing to confront a Hillary Clinton administration that we expected would continue Obama's version of the forever wars -- the "surge" in Afghanistan, the drone assassination campaigns, the special ops in Africa. But on Tuesday, November 8, 2016, something went "Trump" in the night and Donald J. Trump took over the presidency with a promise to end this country's forever wars, which he had criticized relentlessly during his campaign. That, of course, didn't mean we should have expected a peace dividend anytime soon. He was also committed to rebuilding a supposedly "depleted" U.S. military. As he said at a 2019 press conference,

When I took over, it was a mess... One of our generals came in to see me and he said, 'Sir, we don't have ammunition.' I said, 'That's a terrible thing you just said.' He said, 'We don't have ammunition.' Now we have more ammunition than we've ever had.

It's highly unlikely that the military couldn't afford to buy enough bullets when Trump entered the Oval Office, given that publicly acknowledged defense funding was then running at $580 billion a year. He did, however, manage to push that figure to $713 billion by fiscal year 2020. That December, he threatened to veto an even larger appropriation for 2021 -- $740 billion -- but only because he wanted the military to continue to honor Confederate generals by keeping their names on military bases. Oh, and because he thought the bill should also change liability rules for social media companies, an issue you don't normally expect to see addressed in a defense appropriations bill. And, in any case, Congress passed the bill with a veto-proof majority.

As Pentagon expert Michael Klare pointed out recently, while it might seem contradictory that Trump would both want to end the forever wars and to increase military spending, his actions actually made a certain sense. The president, suggested Klare, had been persuaded to support the part of the U.S. military command that has favored a sharp pivot away from reigning post-9/11 Pentagon practices. For 19 years, the military high command had hewed fairly closely to the strategy laid out by Secretary of Defense Donald Rumsfeld early in the Bush years: maintaining the capacity to fight ground wars against one or two regional powers (think of that "Axis of Evil" of Iraq, North Korea, and Iran), while deploying agile, technologically advanced forces in low-intensity (and a couple of higher-intensity) counterterrorism conflicts. Nineteen years later, whatever its objectives may have been -- a more-stable Middle East? Fewer and weaker terrorist organizations? -- it's clear that the Rumsfeld-Bush strategy has failed spectacularly.

Klare points out that, after almost two decades without a victory, the Pentagon has largely decided to demote international terrorism from rampaging monster to annoying mosquito cloud. Instead, the U.S. must now prepare to confront the rise of China and Russia, even if China has only one overseas military base and Russia, economically speaking, is a rickety petro-state with imperial aspirations. In other words, the U.S. must prepare to fight short but devastating wars in multiple domains (including space and cyberspace), perhaps even involving the use of tactical nuclear weapons on the Eurasian continent. To this end, the country has indeed begun a major renovation of its nuclear arsenal and announced a new 30-year plan to beef up its naval capacity. And President Trump rarely misses a chance to tout "his" creation of a new Space Force.

Meanwhile, did he actually keep his promise and at least end those forever wars? Not really. He did promise to bring all U.S. troops home from Afghanistan by Christmas, but acting Defense Secretary Christopher Miller only recently said that we'd be leaving about 2,500 troops there and a similar number in Iraq, with the hope that they'd all be out by May 2021. (In other words, he dumped those wars in the lap of the future Biden administration.)

In the meantime, in these years of "ending" those wars, the Trump administration actually loosened the rules of engagement for air strikes in Afghanistan, leading to a "massive increase in civilian casualties," according to a new report from the Costs of War Project. "From the last year of the Obama administration to the last full year of recorded data during the Trump administration," writes its author, Neta Crawford, "the number of civilians killed by U.S.-led airstrikes in Afghanistan increased by 330 percent."

In spite of his isolationist "America First" rhetoric, in other words, President Trump has presided over an enormous buildup of an institution, the military-industrial complex, that was hardly in need of major new investment. And in spite of his anti-NATO rhetoric, his reduction by almost a third of U.S. troop strength in Germany, and all the rest, he never really violated the post-World War II foreign policy pact between the Republican and Democratic parties. Regardless of how they might disagree about dividing the wealth domestically, they remain united in their commitment to using diplomacy when possible, but military force when necessary, to maintain and expand the imperial power that they believe to be the guarantor of that wealth.

And Now Comes Joe

On January 20, 2021, Joe Biden will become the president of a country that spends as much on its armed forces, by some counts, as the next 10 countries combined. He'll inherit responsibility for a nation with a military presence in 150 countries and special-operations deployments in 22 African nations alone. He'll be left to oversee the still-unfinished, deeply unsuccessful, never-ending war on terror in Iraq, Syria, Afghanistan, Yemen, and Somalia, as well as the 187,000 troops that the Department of Defense publicly reports are stationed outside the United States.

Nothing in Joe Biden's history suggests that he or any of the people he's already appointed to his national security team have the slightest inclination to destabilize that Democratic-Republican imperial pact. But empires are not sustained by inclination alone. They don't last forever. They overextend themselves. They rot from within.

If you're old enough, you may remember stories about the long lines for food in the crumbling Soviet Union, that other superpower of the Cold War. You can see the same thing in the United States today. Once a week, my partner delivers food boxes to hungry people in our city, those who have lost their jobs and homes, because the pandemic has only exacerbated this country's already brutal version of economic inequality. Another friend routinely sees a food line stretching over a mile, as people wait hours for a single free bag of groceries.

Perhaps the horrors of 2020 -- the fires and hurricanes, Trump's vicious attacks on democracy, the death, sickness, and economic dislocation caused by Covid-19 -- can force a real conversation about national security in 2021. Maybe this time we can finally ask whether trying to prop up a dying empire actually makes us -- or indeed the world -- any safer. This is the best chance in a generation to start that conversation. The alternative is to keep trudging mindlessly toward disaster.

Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new Dispatch book on the history of torture in the United States.

Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Books, John Feffer's new dystopian novel (the second in the Splinterlands series) Frostlands, Beverly Gologorsky's novel Every Body Has a Story, and Tom Engelhardt's A Nation Unmade by War, as well as Alfred McCoy's In the Shadows of the American Century: The Rise and Decline of U.S. Global Power and John Dower's The Violent American Century: War and Terror Since World War II.

Copyright 2020 Rebecca Gordon

Trump's through-the-looking-glass presidency has turned the country inside out

In the chaos of this moment, it seems likely that Joe Biden will just squeeze into the presidency and that he'll certainly win the popular vote, Donald Trump's Mussolini-like behavior and election night false claim of victory notwithstanding. Somehow, it all brings another moment in my life to mind.

Back in October 2016, my friends and I frequently discussed the challenges progressives would face if the candidate we expected to win actually entered the Oval Office. There were so many issues to worry about back then. The Democratic candidate was an enthusiastic booster of the U.S. armed forces and believed in projecting American power through its military presence around the world. Then there was that long record of promoting harsh sentencing laws and the disturbing talk about "the kinds of kids that are called superpredators -- no conscience, no empathy."

In 2016, the country was already riven by deep economic inequality. While Hillary Clinton promised "good-paying jobs" for those struggling to stay housed and buy food, we didn't believe it. We'd heard the same promises so many times before, and yet the federal minimum wage was still stuck where it had been ever since 2009, at $7.25 an hour. Would a Clinton presidency really make a difference for working people? Not if we didn't push her -- and hard.

The candidate we were worried about was never Donald Trump, but Hillary Clinton. And the challenge we expected to confront was how to shove that quintessential centrist a few notches to the left. We were strategizing on how we might organize to get a new administration to shift government spending from foreign wars to human needs at home and around the world. We wondered how people in this country might finally secure the "peace dividend" that had been promised to us in the period just after the Cold War, back when her husband Bill became president. In those first (and, as it turned out, only) Clinton years, what we got instead was so-called welfare reform whose consequences are still being felt today, as layoffs drive millions into poverty.

We doubted Hillary Clinton's commitment to addressing most of our other concerns as well: mass incarceration and police violence, structural racism, economic inequality, and most urgent of all (though some of us were just beginning to realize it), the climate emergency. In fact, nationwide, people like us were preparing to spend a day or two celebrating the election of the first woman president and then get down to work opposing many of her anticipated policies. In the peace and justice movements, in organized labor, in community-based organizations, in the two-year-old Black Lives Matter movement, people were ready to roll.

And then the unthinkable happened. The woman we might have loved to hate lost that election and the white-supremacist, woman-hating monster we would grow to detest entered the Oval Office.

For the last four years, progressives have been fighting largely to hold onto what we managed to gain during Barack Obama's presidency: an imperfect healthcare plan that nonetheless insured millions of Americans for the first time; a signature on the Paris climate accord and another on a six-nation agreement to prevent Iran from pursuing nuclear weapons; expanded environmental protections for public lands; the opportunity for recipients of Deferred Action for Childhood Arrivals -- DACA -- status to keep on working and studying in the U.S.

For those same four years, we've been fighting to hold onto our battered capacity for outrage in the face of continual attacks on simple decency and human dignity. There's no need to recite here the catalogue of horrors Donald Trump and his spineless Republican lackeys visited on this country and the world. Suffice it to say that we've been living like Alice in Through the Looking Glass, running as hard as we can just to stand still. That fantasy world's Red Queen observes to a panting Alice that she must come from

"A slow sort of country! Now, here, you see, it takes all the running you can do to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that!"

It wasn't simply the need to run faster than full speed just to stay put that made Trump World so much like Looking-Glass Land. It was also that, just as in Lewis Carroll's fictional world, reality had been turned inside out in the United States. As new Covid-19 infections reached an all-time high of more than 100,000 in a single day and the cumulative death toll surpassed 230,000, the president in the mirror kept insisting that "we're rounding the corner" (and a surprising number of Americans seemed to believe him). He neglected to mention that, around that very corner, a coronaviral bus was heading straight toward us, accelerating as it came. In a year when, as NPR reported, "Nearly 1 in 4 households have experienced food insecurity," Trump just kept bragging about the stock market and reminding Americans of how well their 401(k)s were doing -- as if most people even had such retirement accounts in the first place.

Trump World, Biden Nation, or Something Better?

After four years of running in place, November 2016 seems like a lifetime ago. The United States of 2020 is a very different place, at once more devastated and more hopeful than it was a mere four years ago. On the one hand, pandemic unemployment has hit women, especially women of color, much harder than men, driving millions out of the workforce, many permanently. On the other, we've witnessed the birth of the #MeToo movement against sexual harassment and of the Time's Up Legal Defense Fund, which has provided millions of dollars for working-class women to fight harassment on the job. In a few brief years, physical and psychological attacks on women have ceased to be an accepted norm in the workplace. Harassment certainly continues every day, but the country's collective view of it has shifted.

Black and Latino communities still face daily confrontations with police forces that act more like occupying armies than public servants. The role of the police as enforcers of white supremacy hasn't changed in most parts of the country. Nonetheless, the efforts of the Black Lives Matter movement and of the hundreds of thousands of people who demonstrated this summer in cities nationwide have changed the conversation about the police in ways no one anticipated four years ago. Suddenly, the mainstream media are talking about more than body cams and sensitivity training. In June 2020, the New York Times ran an op-ed entitled, "Yes, We Mean Literally Abolish the Police," by Mariame Kaba, an organizer working against the criminalization of people of color. Such a thing was unthinkable four years ago.

In the Trumpian pandemic moment, gun purchases have soared in a country that already topped the world by far in armed citizens. And yet young people -- often led by young women -- have roused themselves to passionate and organized action to get guns off the streets of Trump Land. After a gunman shot up Emma Gonzalez's school in Parkland, Florida, she famously announced, "We call BS" on the claims of adults who insisted that changing the gun laws was unnecessary and impossible. She led the March for Our Lives, which brought millions onto the streets in this country to denounce politicians' inaction on gun violence.

While Donald Trump took the U.S. out of the Paris climate agreement, Greta Thunberg, the 17-year-old Swedish environmental activist, crossed the Atlantic in a carbon-neutral sailing vessel to address the United Nations, demanding to know how the adult world dared to leave it to its children to save an increasingly warming planet:

"You have stolen my dreams and my childhood with your empty words. And yet I'm one of the lucky ones. People are suffering. People are dying. Entire ecosystems are collapsing. We are in the beginning of a mass extinction, and all you can talk about is money and fairy tales of eternal economic growth. How dare you!"

"How dare you?" is a question I ask myself every time, as a teacher, I face a classroom of college students who, each semester, seem both more anxious about the future and more determined to make it better than the present.

Public attention is a strange beast. Communities of color have known for endless years that the police can kill them with impunity, and it's not as if people haven't been saying so for decades. But when such incidents made it into the largely white mainstream media, they were routinely treated as isolated events -- the actions of a few bad apples -- and never as evidence of a systemic problem. Suddenly, in May 2020, with the release of a hideous video of George Floyd's eight-minute murder in Minneapolis, Minnesota, systemic police violence against Blacks became a legitimate topic of mainstream discussion.

The young have been at the forefront of the response to Floyd's murder and the demands for systemic change that have followed. This June in my city of San Francisco, where police have killed at least five unarmed people of color in the last few years, high school students planned and led tens of thousands of protesters in a peaceful march against police violence.

Now that the election season has reached its drawn-out crescendo, there is so much work ahead of us. With the pandemic spreading out of control, it's time to begin demanding concerted federal action, even from this most malevolent president in history. There's no waiting for Inauguration Day, no matter who takes the oath of office on January 20th. Many thousands more will die before then.

And isn't it time to turn our attention to the millions who have lost their jobs and face the possibility of losing their housing, too, as emergency anti-eviction decrees expire? Isn't it time for a genuine congressional response to hunger, not by shoring up emergency food distribution systems like food pantries, but by putting dollars in the hands of desperate Americans so they can buy their own food? Congress must also act on the housing emergency. The Centers for Disease Control and Prevention's "Temporary Halt in Residential Evictions To Prevent the Further Spread of Covid-19" only lasts until December 31st, and it doesn't cover tenants who don't have a lease or written rental agreement. It's crucial, even with Donald Trump still in the White House as the new year begins, that it be extended in both time and scope. And now Senate Republican leader Mitch McConnell has said that he won't even entertain a new stimulus bill until January.

Another crucial task is pushing Congress to increase federal funding to state and local governments, which are so often major economic drivers for their regions. The Trump administration and McConnell not only abandoned states and cities, leaving them to confront the pandemic on their own just as a deep recession drastically reduced tax revenues, but -- in true looking-glass fashion -- treated their genuine and desperate calls for help as mere Democratic Party campaign rhetoric.

"In Short, There Is Still Much to Do"

My favorite scene in Gillo Pontecorvo's classic 1966 film The Battle of Algiers takes place at night on a rooftop in the Arab quarter of that city. Ali La Pointe, a passionate recruit to the cause of the National Liberation Front (NLF), which is fighting to throw the French colonizers out of Algeria, is speaking with Ben M'Hidi, a high-ranking NLF official. Ali is unhappy that the movement has called a general strike in order to demonstrate its power and reach to the United Nations. He resents the seven-day restriction on the use of firearms. "Acts of violence don't win wars," Ben M'Hidi tells Ali. "Finally, the people themselves must act."

For the last four years, Donald Trump has made war on the people of this country and indeed on the people of the entire world. He's attacked so many of us, from immigrant children at the U.S. border to anyone who tries to breathe in the fire-choked states of California, Oregon, Washington, and most recently Colorado. He's allowed those 230,000 Americans to die in a pandemic that could have been controlled, and he's thrown millions into poverty, to mention just a few of his "war" crimes. Finally, the people themselves must act.

On that darkened rooftop in an eerie silence, Ben M'Hidi continues his conversation with La Pointe. "You know, Ali," he says. "It's hard enough to start a revolution, even harder to sustain it, and hardest of all to win it." He pauses, then continues, "But it's only afterwards, once we've won, that the real difficulties begin. In short, there is still much to do."

It's hard enough to vote out a looking-glass president. But it's only once we've won, whether that's now or four years from now, that the real work begins. There is, indeed, still much to do.

Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new Dispatch book on the history of torture in the United States.

Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Books, John Feffer's new dystopian novel (the second in the Splinterlands series) Frostlands, Beverly Gologorsky's novel Every Body Has a Story, and Tom Engelhardt's A Nation Unmade by War, as well as Alfred McCoy's In the Shadows of the American Century: The Rise and Decline of U.S. Global Power and John Dower's The Violent American Century: War and Terror Since World War II.

Copyright 2020 Rebecca Gordon

How to take on Trump on the ground — and convince others to do the same

"Look, folks, the air quality is in the red zone today. The EPA says that means people with lung or heart issues should avoid prolonged activity outdoors."

That was J.R. de Vera, one of two directors of UNITE-HERE!'s independent expenditure campaign to elect Biden and Harris in Reno, Nevada. UNITE-HERE! is a union representing 300,000 workers in the hospitality industry -- that world of hotels and bars, restaurants and caterers. Ninety percent of its members are now laid off because of Trump's bungling of the Covid-19 pandemic and many are glad for the chance to help get him out of the White House.

"So some of you will want to stay in your hotel rooms and make phone calls today," JR continues. Fifty faces fall in the 50 little Zoom boxes on my laptop screen. Canvassers would much rather be talking to voters at their doors than calling them on a phone bank. Still, here in the burning, smoking West, the union is as committed to its own people's health and safety as it is to dragging Donald Trump out of office. So, for many of them, phone calls it will be.

My own job doesn't change much from day to day. Though I live in San Francisco, I've come to Reno to do back-room logistics work in the union campaign's cavernous warehouse of an office: ordering supplies, processing reimbursements, and occasionally helping the data team make maps of the areas our canvassers will walk.

Our field campaign is just one of several the union is running in key states. We're also in Arizona and Florida and, only last week, we began door-to-door canvassing in Philadelphia. Social media, TV ads, bulk mail, and phone calls are all crucial elements in any modern electoral campaign, but none of them is a substitute for face-to-face conversations with voters.

We've been in Reno since early August, building what was, until last week, the only field campaign in the state supporting Joe Biden and Kamala Harris. (Just recently, our success in campaigning safely has encouraged the Democratic Party to start its own ground game here and elsewhere.) We know exactly how many doors we have to knock on, how many Biden voters we have to identify, how many of them we have to convince to make a concrete voting plan, and how many we have to get out to vote during Nevada's two-week early voting period to win here.

We're running a much larger campaign in Clark County, where close to three-quarters of Nevada's population lives (mostly in Las Vegas). Washoe County, home of the twin cities of Reno and Sparks, is the next largest population center with 16% of Nevadans. The remaining 14 counties, collectively known as "the Rurals," account for the rest. Washoe and Clark are barely blue; the Rurals decidedly red.

In 2018, UNITE-HERE!'s ground campaign helped ensure that Jacky Rosen would flip a previously Republican Senate seat, and we helped elect Democrat Steve Sisolak as governor. He's proved a valuable union ally, signing the Adolfo Fernandez Act, a first-in-the-nation law protecting workers and businesses in Nevada from the worst effects of the Covid-19 pandemic.

Defying a threatened Trump campaign lawsuit (later dismissed by a judge), Sisolak also signed an election reform bill that allows every active Nevada voter to receive a mail-in ballot. Largely as a result of the union's work in 2018, this state now boasts an all-female Democratic senatorial delegation, a Democratic governor, and a female and Democratic majority in the state legislature. Elections, as pundits of all stripes have been known to say, have consequences.

Door-to-Door on Planet A

"¿Se puede, o no se puede?"

"¡Sí, se puede!"

("Can we do it?" "Yes, we can!")

Each morning's online canvass dispatch meeting starts with that call-and-response followed by a rousing handclap. Then we talk about where people will be walking that day and often listen to one of the canvassers' personal stories, explaining why he or she is committed to this campaign. Next, we take a look at the day's forecast for heat and air quality as vast parts of the West Coast burn, while smoke and ash travel enormous distances. Temperatures here were in the low 100s in August (often hovering around 115 degrees in Las Vegas). And the air? Let's just say that there have been days when I've wished breathing were optional.

Climate-change activists rightly point out that "there's no Planet B" for the human race, but some days it seems as if our canvassers are already working on a fiery Planet A that is rapidly becoming unlivable. California's wildfires -- including its first-ever "gigafire" -- have consumed more than four million acres in the last two months, sending plumes of ash to record heights, and dumping a staggering amount of smoke into the Reno-Sparks basin. Things are a little better at the moment, but for weeks I couldn't see the desert mountains that surround the area. Some days I couldn't even make out the Grand Sierra Reno casino, a quarter mile from the highway on which I drive to work each morning.

For our canvassers -- almost every one a laid-off waiter, bartender, hotel housekeeper, or casino worker -- the climate emergency and the Covid-19 pandemic are literally in their faces as they don their N95 masks to walk the streets of Reno. It's the same for the voters they meet at their doors. Each evening, canvassers report (on Zoom, of course) what those voters are saying and, for the first time I can remember, they are now talking about the climate. They're angry at a president who pulled the U.S. out of the Paris climate accord and they're scared about what a potentially searing future holds for their children and grandchildren. They may not have read Joe Biden's position on clean energy and environmental justice, but they know that Donald Trump has no such plan.

Braving Guns, Germs, and Smoke

In his classic book Guns, Germs, and Steel, Jared Diamond suggested that the three variables in his title helped in large part to explain how European societies and the United States came to control much of the planet in the twentieth century. As it happens, our door-to-door canvassers confront a similar triad of obstacles right here in Reno, Nevada (if you replace that final "steel" with "smoke").

Guns and Other Threats

Nevada is an open-carry state and gun ownership is common here. It's not unusual to see someone walking around a supermarket with a holstered pistol on his hip. A 2015 state law ended most gun registration requirements and another allows people visiting from elsewhere to buy rifles without a permit. So gun sightings are everyday events.

Still, it can be startling, if you're not used to it, to have a voter answer the door with a pistol all too visible, even if securely holstered. And occasionally, our canvassers have even watched those guns leave their holsters when the person at the door realizes why they're there (which is when the campaign gets the police involved). Canvassers are trained to observe very clear protocols, including immediately leaving an area if they experience any kind of verbal or physical threat.

African American and Latinx canvassers who've campaigned before in Reno say that, in 2020, Trump supporters seem even more emboldened than in the past to shout racist insults at them. More than once, neighbors have called the police on our folks, essentially accusing them of canvassing-while-black-or-brown. Two days before I wrote this piece, the police pulled over one young Latino door-knocker because neighbors had called to complain that he was walking up and down the street waving a gun. (The "gun" in question was undoubtedly the electronic tablet he was carrying to record the results of conversations with voters.) The officer apologized.

Which reminds me of another apology offered recently. A woman approached an African-American canvasser, demanding to know what in the world he was doing in her neighborhood. On learning his mission, she offered an apology as insulting as her original question. "We're not used to seeing people like you around here," she explained.

Germs

Until the pandemic, my partner and I had planned to work together with UNITE-HERE! in Reno during this election, as we did in 2018. But she's five years older than I am, and her history of pneumonia means that catching Covid-19 could be especially devastating for her. So she's stayed in San Francisco, helping out the union's national phone bank effort instead.

In fact, we didn't really expect that there would be a ground campaign this year, given the difficulties presented by the novel coronavirus. But the union was determined to eke out that small but genuine addition to the vote that a field campaign can produce. So they put in place stringent health protocols for all of us: masks and a minimum of six feet of distance between everyone at all times; no visits to bars, restaurants, or casinos, including during off hours; temperature checks for everyone entering the office; and the immediate reporting of any potential Covid-19 symptoms to our health and safety officer. Before the union rented blocks of rooms at two extended-stay hotels, our head of operations checked their mask protocols for employees and guests and examined their ventilation systems to make sure that the air conditioners vented directly outdoors and not into a common air system for the whole building.

To date, not one of our 57 canvassers has tested positive, a record we intend to maintain as we add another 17 full-timers to our team next week.

One other feature of our coronavirus protocol: we don't talk to any voter who won't put on a mask. I was skeptical that canvassers would be able to get voters to mask up, even with the individually wrapped surgical masks we're offering anyone who doesn't have one on or handy. However, it turns out that, in this bizarre election year, people are eager to talk, to vent their feelings and be heard. So many of the people we're canvassing have suffered so much this year that they're surprised and pleased when someone shows up at their door wondering how they're doing.

And the answer to that question for so many potential voters is not well -- with jobs lost, housing threatened, children struggling with online school, and hunger pangs an increasingly everyday part of life. So yes, a surprising number of people, either already masked or quite willing to put one on, want to talk to us about an election that they generally see as the most important of their lifetime.

Smoke

And did I mention that it's been smoky here? It can make your eyes water, your throat burn, and the urge to cough overwhelm you. In fact, the symptoms of smoke exposure are eerily similar to the ones for Covid-19. More than one smoke-affected canvasser has spent at least five days isolated in a hotel room, waiting for negative coronavirus test results.

The White House website proudly quotes the president on his administration's testing record: "We do tremendous testing. We have the best testing in the world." Washoe County health officials are doing what they can, but if this is the best in the world, then the world is in worse shape than we thought.

The Power of a Personal Story

So why, given the genuine risk and obstacles they face, do UNITE-HERE!'s canvassers knock on doors six days a week to elect Joe Biden and Kamala Harris? Their answers are a perfect embodiment of the feminist dictum "the personal is political." Every one of them has a story about why she or he is here. More than one grew up homeless and never wants another child to live that way. One is a DACA recipient who knows that a reelected Donald Trump will continue his crusade to end that amnesty for undocumented people brought to the United States as children. Through their participation in union activism, many have come to understand that workers really can beat the boss when they organize -- and Trump, they say, is the biggest boss of all.

Through years of political campaigning, the union's leaders have learned that voters may think about issues, but they're moved to vote by what they feel about them. The goal of every conversation at those doors right now is to make a brief but profound personal connection with the voter, to get each of them to feel just how important it is to vote this year. Canvassers do this by asking how a voter is doing in these difficult times and listening -- genuinely listening -- and responding to whatever answer they get. And they do it by being vulnerable enough to share the personal stories that lie behind their presence at the voter's front door.

One canvasser lost his home at the age of seven, when his parents separated. He and his mother ended up staying in shelters and camping for months in a garden shed on a friend's property. One day recently he knocked on a door and found a Trump supporter on the other side of it. He noticed a shed near the house, pointed to it, and told the man about living in something similar as a child. That Trumpster started to cry. He began talking about how he'd had just the same experience and the way, as a teenager, he'd had to hold his family together when his heroin-addicted parents couldn't cope. He'd never talked to any of his present-day friends about how he grew up and, in the course of that conversation, came to agree with our canvasser that Donald Trump wasn't likely to improve life for people like them. He was, he said, changing his vote to Biden right then and there. (And that canvasser will be back to make sure he actually votes.)

Harvard University Professor Marshall Ganz pioneered the "public narrative," the practice of organizing by storytelling. It's found at the heart of many organizing efforts these days. The 2008 Obama campaign, for example, trained thousands of volunteers to tell their stories to potential voters. The It Gets Better Project has collected more than 50,000 personal messages from older queer people to LGBTQ youth who might be considering suicide or other kinds of self-harm -- assuring them that their own lives did, indeed, get better.

Being the sort of political junkie who devours the news daily, I was skeptical about the power of this approach, though I probably shouldn't have been. After all, how many times did I ask my mother or father to "tell me a story" when I was a kid? What are our lives but stories? Human beings are narrative animals and, however rational, however versed in the issues we may sometimes be, we still live through stories.

Data can give me information on issues I care about, but it can't tell me what issues I should care about. In the end, I'm concerned about racial and gender justice as well as the climate emergency because of the way each of them affects people and other creatures with whom I feel connected.

A Campaign Within a Campaign

Perhaps the most inspiring aspect of UNITE-HERE!'s electoral campaign is the union's commitment to developing every canvasser's leadership skills. The goal is more than winning what's undoubtedly the most important election of our lifetime. It's also to send back to every hotel, restaurant, casino, and airport catering service leaders who can continue to organize and advocate for their working-class sisters and brothers. This means implementing an individual development plan for each canvasser.

Team leaders work with all of them to hone their stories into tools that can be used in an honest and generous way to create a genuine connection with voters. They help those canvassers think about what else they want to learn to do, while developing opportunities for them to master technical tools like computer spreadsheets and databases.

There's a special emphasis on offering such opportunities to women and people of color who make up the vast majority of the union's membership. Precious hours of campaign time are also devoted to workshops on how to understand and confront systemic racism and combat sexual harassment, subjects President Trump is acquainted with in the most repulsively personal way. The union believes its success depends as much on fostering a culture of respect as on the hard-nosed negotiating it's also famous for.

After months of pandemic lockdown and almost four years of what has objectively been the worst, most corrupt, most incompetent, and possibly even most destructive presidency in the nation's history, it's a relief to be able to do something useful again. And sentimental as it may sound, it's an honor to be able to do it with this particular group of brave and committed people. Sí, se puede. Yes, we can.

Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new Dispatch book on the history of torture in the United States.

Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Books, John Feffer's new dystopian novel (the second in the Splinterlands series) Frostlands, Beverly Gologorsky's novel Every Body Has a Story, and Tom Engelhardt's A Nation Unmade by War, as well as Alfred McCoy's In the Shadows of the American Century: The Rise and Decline of U.S. Global Power and John Dower's The Violent American Century: War and Terror Since World War II.

Copyright 2020 Rebecca Gordon

How a diseased planet is reshaping the world of work

In two weeks, my partner and I were supposed to leave San Francisco for Reno, Nevada, where we’d be spending the next three months focused on the 2020 presidential election. As we did in 2018, we’d be working with UNITE-HERE, the hospitality industry union, only this time on the campaign to drive Donald Trump from office.


The US is in free fall — and at risk of becoming a failed state

You know that feeling when you trip on the street and instantly sense that you’re about to crash hard and there’s no way to prevent it? As gravity has its way with you, all you can do is watch yourself going down. Yeah, that feeling.


How the credibility gap became a chasm in the age of Trump

These days, teaching graduating college seniors has me, as the Brits would say in the London Underground, “minding the gap.” In my case, however, it’s not the gap between the platform and the train I’m thinking of, but a couple of others: those between U.S. citizens and our government, as well as between different generations of Americans. The Covid-19 crisis has made some of those gaps far more visible than usual, just as my students leave school with many of their personal expectations in tatters.


Why the chaotic mind of Donald Trump is strangely and terrifyingly fascinating

My partner and I have been fighting about politics since we met in 1965. I was 13. She was 18 and my summer camp counselor. (It was another 14 years before we became a couple.) We spent that first summer arguing about the Vietnam War. I was convinced that the U.S. never should have gotten involved there. Though she agreed, she was more concerned that the growth of an antiwar movement would distract people from what she considered far more crucial to this country: the Civil Rights movement. As it happened, we were both right.


The future may be female — but the pandemic is patriarchal

Before I found myself “sheltering in place,” this article was to be about women’s actions around the world to mark March 8th, International Women’s Day. From Pakistan to Chile, women in their millions filled the streets, demanding that we be able to control our bodies and our lives. Women came out in Iraq and Kyrgyzstan, Turkey and Peru, the Philippines and Malaysia. In some places, they risked beatings by masked men. In others, they demanded an end to femicide -- the millennia-old reality that women in this world are murdered daily simply because they are women.
