Rebecca Gordon

Automated killer robots aren't science fiction anymore — and the world isn't ready

Here’s a scenario to consider: a military force has purchased a million cheap, disposable flying drones, each the size of a deck of cards and capable of carrying three grams of explosives — enough to kill a single person or, in a “shaped charge,” to pierce a steel wall. They’ve been programmed to seek out and “engage” (kill) certain human beings, based on specific “signature” characteristics like carrying a weapon, say, or having a particular skin color. They fit in a single shipping container and can be deployed remotely. Once launched, they will fly and kill autonomously without any further human action.

Science fiction? Not really. It could happen tomorrow. The technology already exists.

In fact, lethal autonomous weapons systems (LAWS) have a long history. During the spring of 1972, I spent a few days occupying the physics building at Columbia University in New York City. With a hundred other students, I slept on the floor, ate donated takeout food, and listened to Allen Ginsberg when he showed up to honor us with some of his extemporaneous poetry. I wrote leaflets then, commandeering a Xerox machine to print them out.

And why, of all campus buildings, did we choose the one housing the Physics department? The answer: to convince five Columbia faculty physicists to sever their connections with the Pentagon’s Jason Defense Advisory Group, a program offering money and lab space to support basic scientific research that might prove useful for U.S. war-making efforts. Our specific objection: the involvement of Jason’s scientists in designing parts of what was then known as the “automated battlefield” for deployment in Vietnam. That system would indeed prove a forerunner of the lethal autonomous weapons systems that are poised to become a potentially significant part of this country’s — and the world’s — armory.

Early (Semi-)Autonomous Weapons

Washington faced quite a few strategic problems in prosecuting its war in Indochina, including the general corruption and unpopularity of the South Vietnamese regime it was propping up. Its biggest military challenge, however, was probably North Vietnam’s continual infiltration of personnel and supplies on what was called the Ho Chi Minh Trail, which ran from north to south along the Cambodian and Laotian borders. The Trail was, in fact, a network of easily repaired dirt roads and footpaths, streams and rivers, lying under a thick jungle canopy that made it almost impossible to detect movement from the air.

The U.S. response, developed by Jason in 1966 and deployed the following year, was an attempt to interdict that infiltration by creating an automated battlefield composed of four parts, analogous to a human body’s eyes, nerves, brain, and limbs. The eyes were a broad variety of sensors — acoustic, seismic, even chemical (for sensing human urine) — most dropped by air into the jungle. The nerve equivalents transmitted signals to the “brain.” However, since the sensors had a maximum transmission range of only about 20 miles, the U.S. military had to constantly fly aircraft above the foliage to catch any signal that might be tripped by passing North Vietnamese troops or transports. The planes would then relay the news to the brain. (Originally intended to be remote controlled, those aircraft performed so poorly that human pilots were usually necessary.)

And that brain, a magnificent military installation secretly built at Nakhon Phanom in Thailand, housed two state-of-the-art IBM mainframe computers. A small army of programmers wrote and rewrote the code that kept them ticking as the machines attempted to make sense of the stream of data transmitted by those planes. The target coordinates the computers came up with were then transmitted to the attack aircraft that served as the limb equivalents. The group running that automated battlefield was designated Task Force Alpha, and the whole project went under the code name Igloo White.

As it turned out, Igloo White was largely an expensive failure, costing about a billion dollars a year for five years (almost $40 billion total in today’s dollars). The time lag between a sensor tripping and munitions dropping made the system ineffective. As a result, at times Task Force Alpha simply carpet-bombed areas where a single sensor might have gone off. The North Vietnamese quickly realized how those sensors worked and developed methods of fooling them, from playing truck-ignition recordings to planting buckets of urine.

Given the history of semi-automated weapons systems like drones and “smart bombs” in the intervening years, you probably won’t be surprised to learn that this first automated battlefield couldn’t discriminate between soldiers and civilians. In this, it merely continued a trend that’s existed since at least the eighteenth century in which wars routinely kill more civilians than combatants.

None of these shortcomings kept Defense Department officials from regarding the automated battlefield with awe. Andrew Cockburn described this worshipful posture in his book Kill Chain: The Rise of the High-Tech Assassins, quoting Leonard Sullivan, a high-ranking Pentagon official who visited Vietnam in 1968: “Just as it is almost impossible to be an agnostic in the Cathedral of Notre Dame, so it is difficult to keep from being swept up in the beauty and majesty of the Task Force Alpha temple.”

Who or what, you well might wonder, was to be worshipped in such a temple?

Most aspects of that Vietnam-era “automated” battlefield actually required human intervention. Human beings were planting the sensors, programming the computers, piloting the airplanes, and releasing the bombs. In what sense, then, was that battlefield “automated”? As a harbinger of what was to come, the system had eliminated human intervention at a single crucial point in the process: the decision to kill. On that automated battlefield, the computers decided where and when to drop the bombs.

In 1969, Army Chief of Staff William Westmoreland expressed his enthusiasm for this removal of the messy human element from war-making. Addressing a luncheon for the Association of the U.S. Army, a lobbying group, he declared:

“On the battlefield of the future enemy forces will be located, tracked, and targeted almost instantaneously through the use of data links, computer-assisted intelligence evaluation, and automated fire control. With first round kill probabilities approaching certainty, and with surveillance devices that can continually track the enemy, the need for large forces to fix the opposition will be less important.”

What Westmoreland meant by “fix the opposition” was kill the enemy. The twenty-first-century military euphemism of choice is “engage.” In either case, the meaning is the same: the role of lethal autonomous weapons systems is to automatically find and kill human beings, without human intervention.

New LAWS for a New Age — Lethal Autonomous Weapons Systems

Every autumn, the British Broadcasting Corporation sponsors a series of four lectures given by an expert in some important field of study. In 2021, the BBC invited Stuart Russell, professor of computer science and founder of the Center for Human-Compatible Artificial Intelligence at the University of California, Berkeley, to deliver those “Reith Lectures.” His general subject was the future of artificial intelligence (AI), and the second lecture was entitled “The Future Role of AI in Warfare.” In it, he addressed the issue of lethal autonomous weapons systems, or LAWS, which the United Nations defines as “weapons that locate, select, and engage human targets without human supervision.”

Russell’s main point, eloquently made, was that, although many people believe lethal autonomous weapons are a potential future nightmare, residing in the realm of science fiction, “They are not. You can buy them today. They are advertised on the web.”

I’ve never seen any of the movies in the Terminator franchise, but apparently military planners and their PR flacks assume most people derive their understanding of such LAWS from this fictional dystopian world. Pentagon officials are frequently at pains to explain why the weapons they are developing are not, in fact, real-life equivalents of SkyNet — the worldwide communications network that, in those films, becomes self-aware and decides to eliminate humankind. Not to worry, as a deputy secretary of defense told Russell, “We have listened carefully to these arguments and my experts have assured me that there is no risk of accidentally creating SkyNet.”

Russell’s point, however, was that a weapons system doesn’t need self-awareness to act autonomously or to present a threat to innocent human beings. What it does need is:

  • A mobile platform (anything that can move, from a tiny quadcopter to a fixed-wing aircraft)
  • Sensory capacity (the ability to detect visual or sound information)
  • The ability to make tactical decisions (the same kind of capacity already found in computer programs that play chess)
  • The ability to “engage,” i.e. kill (which can be as complicated as firing a missile or dropping a bomb, or as rudimentary as committing robot suicide by slamming into a target and exploding)

The reality is that such systems already exist. Indeed, a government-owned weapons company in Turkey recently advertised its Kargu drone — a quadcopter “the size of a dinner plate,” as Russell described it, which can carry a kilogram of explosives and is capable of making “anti-personnel autonomous hits” with “targets selected on images and face recognition.” The company’s site has since been altered to emphasize its adherence to a supposed “man-in-the-loop” principle. However, the U.N. has reported that a fully autonomous Kargu-2 was, in fact, deployed in Libya in 2020.

You can buy your own quadcopter right now on Amazon, although you’ll still have to apply some DIY computer skills if you want to get it to operate autonomously.

The truth is that lethal autonomous weapons systems are less likely to look like something from the Terminator movies than like swarms of tiny killer bots. Computer miniaturization means that the technology already exists to create effective LAWS. If your smartphone could fly, it could be an autonomous weapon. Newer phones use facial-recognition software to “decide” whether to allow access. It’s not a leap to create flying weapons the size of phones, programmed to “decide” to attack specific individuals, or individuals with specific features. Indeed, it’s likely such weapons already exist.

Can We Outlaw LAWS?

So, what’s wrong with LAWS, and is there any point in trying to outlaw them? Some opponents argue that the problem is they eliminate human responsibility for making lethal decisions. Such critics suggest that, unlike a human being aiming and pulling the trigger of a rifle, a LAWS can choose and fire at its own targets. Therein, they argue, lies the special danger of these systems, which will inevitably make mistakes, as anyone whose iPhone has refused to recognize his or her face will acknowledge.

In my view, the issue isn’t that autonomous systems remove human beings from lethal decisions. To the extent that weapons of this sort make mistakes, human beings will still bear moral responsibility for deploying such imperfect lethal systems. LAWS are designed and deployed by human beings, who therefore remain responsible for their effects. Like the semi-autonomous drones of the present moment (often piloted from half a world away), lethal autonomous weapons systems don’t remove human moral responsibility. They just increase the distance between killer and target.

Furthermore, like already outlawed arms, including chemical and biological weapons, these systems have the capacity to kill indiscriminately. While they may not obviate human responsibility, once activated, they will certainly elude human control, just like poison gas or a weaponized virus.

And as with chemical, biological, and nuclear weapons, their use could effectively be prevented by international law and treaties. True, rogue actors, like the Assad regime in Syria or the U.S. military in the Iraqi city of Fallujah, may occasionally violate such strictures, but for the most part, prohibitions on the use of certain kinds of potentially devastating weaponry have held, in some cases for over a century.

Some American defense experts argue that, since adversaries will inevitably develop LAWS, common sense requires this country to do the same, implying that the best defense against a given weapons system is an identical one. That makes as much sense as fighting fire with fire when, in most cases, using water is much the better option.

The Convention on Certain Conventional Weapons

The area of international law that governs the treatment of human beings in war is, for historical reasons, called international humanitarian law (IHL). In 1995, the United States ratified an addition to IHL: the 1980 U.N. Convention on Certain Conventional Weapons. (Its full title is much longer, but its name is generally abbreviated as CCW.) It governs the use of, for example, incendiary weapons like napalm, landmines, and blinding laser weapons; chemical and biological agents are covered by separate treaties.

The signatories to CCW meet periodically to discuss what other weaponry might fall under its jurisdiction and prohibitions, including LAWS. The most recent conference took place in December 2021. Although transcripts of the proceedings exist, only a draft final document — produced before the conference opened — has been issued. This may be because no consensus was even reached on how to define such systems, let alone on whether they should be prohibited. The European Union, the U.N., at least 50 signatory nations, and (according to polls) most of the world’s population believe that autonomous weapons systems should be outlawed. The U.S., Israel, the United Kingdom, and Russia disagree, along with a few other outliers.

Prior to such CCW meetings, a Group of Governmental Experts (GGE) convenes, ostensibly to provide technical guidance for the decisions to be made by the Convention’s “high contracting parties.” In 2021, the GGE was unable to reach a consensus about whether such weaponry should be outlawed. The United States held that even defining a lethal autonomous weapon was unnecessary (perhaps because, if such weapons could be defined, they could be outlawed). The U.S. delegation put it this way:

“The United States has explained our perspective that a working definition should not be drafted with a view toward describing weapons that should be banned. This would be — as some colleagues have already noted — very difficult to reach consensus on, and counterproductive. Because there is nothing intrinsic in autonomous capabilities that would make a weapon prohibited under IHL, we are not convinced that prohibiting weapons based on degrees of autonomy, as our French colleagues have suggested, is a useful approach.”

The U.S. delegation was similarly keen to eliminate any language that might require “human control” of such weapons systems:

“[In] our view IHL does not establish a requirement for ‘human control’ as such… Introducing new and vague requirements like that of human control could, we believe, confuse, rather than clarify, especially if these proposals are inconsistent with long-standing, accepted practice in using many common weapons systems with autonomous functions.”

In the same meeting, that delegation repeatedly insisted that lethal autonomous weapons would actually be good for us, because they would surely prove better than human beings at distinguishing between civilians and combatants.

Oh, and if you believe that protecting civilians is the reason the arms industry is investing billions of dollars in developing autonomous weapons, I’ve got a patch of land to sell you on Mars that’s going cheap.

The Campaign to Stop Killer Robots

The Group of Governmental Experts also has about 35 non-state members, including non-governmental organizations and universities. The Campaign to Stop Killer Robots, a coalition of 180 organizations, among them Amnesty International, Human Rights Watch, and the World Council of Churches, is one of these. Launched in 2013, this vibrant group provides important commentary on the technical, legal, and ethical issues presented by LAWS and offers other organizations and individuals a way to become involved in the fight to outlaw such potentially devastating weapons systems.

The continued construction and deployment of killer robots is not inevitable. Indeed, a majority of the world would like to see them prohibited, including U.N. Secretary-General António Guterres. Let’s give him the last word: “Machines with the power and discretion to take human lives without human involvement are politically unacceptable, morally repugnant, and should be prohibited by international law.”

I couldn’t agree more.

Copyright 2022 Rebecca Gordon

Featured image: Killer Robots by Global Panorama is licensed under CC BY-SA 2.0 / Flickr

Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of Mainstreaming Torture and American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes, and is now at work on a new book on the history of torture in the United States.

Graveyard shift: The dark reality of the modern economy reveals itself under pandemic-era demands

In mid-October, President Biden announced that the Port of Los Angeles would begin operating 24 hours a day, seven days a week, joining the nearby Port of Long Beach, which had been doing so since September. The move followed weeks of White House negotiations with the International Longshore and Warehouse Union, as well as shippers like UPS and FedEx, and major retailers like Walmart and Target.

The purpose of expanding port hours, according to the New York Times, was “to relieve growing backlogs in the global supply chains that deliver critical goods to the United States.” Reading this, you might be forgiven for imagining that an array of crucial items like medicines or their ingredients or face masks and other personal protective equipment had been languishing in shipping containers anchored off the West Coast. You might also be forgiven for imagining that workers, too lazy for the moment at hand, had chosen a good night’s sleep over the vital business of unloading such goods from boats lined up in their dozens offshore onto trucks, and getting them into the hands of the Americans desperately in need of them. Reading further, however, you’d learn that those “critical goods” are actually things like “exercise bikes, laptops, toys, [and] patio furniture.”

Fair enough. After all, as my city, San Francisco, enters what’s likely to be yet another almost rainless winter on a planet in ever more trouble, I can imagine my desire for patio furniture rising to a critical level. So, I’m relieved to know that dock workers will now be laboring through the night at the command of the president of the United States to guarantee that my needs are met. To be sure, shortages of at least somewhat more important items are indeed rising, including disposable diapers and the aluminum necessary for packaging some pharmaceuticals. Still, a major focus in the media has been on the specter of “slim pickings this Christmas and Hanukkah.”

Providing “critical” yard furnishings is not the only reason the administration needs to unkink the supply chain. It’s also considered an anti-inflation measure (if an ineffective one). By the end of October, the Consumer Price Index had risen 6.2% over the previous 12 months, the highest inflation rate in three decades. Such a rise is often described as the result of too much money chasing too few goods. One explanation for the current rise in prices is that, during the worst months of the pandemic, many Americans actually saved money, which they’re now eager to spend. When the things people want to buy are in short supply — perhaps even stuck on container ships off Long Beach and Los Angeles — the price of those that are available naturally rises.

Republicans have christened the current jump in consumer prices “Bidenflation,” although the administration actually bears little responsibility for the situation. But Joe Biden and the rest of the Democrats know one thing: if it looks like they’re doing nothing to bring prices down, there will be hell to pay at the polls in 2022, and so it’s the night shift for dock workers and others in Los Angeles, Long Beach, and possibly other American ports.

However, running West Coast ports 24/7 won’t solve the supply-chain problem, not when there aren’t enough truckers to carry that critical patio furniture to Home Depot. The shortage of such drivers arises because there’s more demand than ever before, and because many truckers have simply quit the industry. As the New York Times reports, “Long hours and uncomfortable working conditions are leading to a shortage of truck drivers, which has compounded shipping delays in the United States.”

Rethinking (Shift) Work

Truckers aren’t the only workers who have been rethinking their occupations since the coronavirus pandemic pressed the global pause button. The number of employees quitting their jobs hit 4.4 million this September, about 3% of the U.S. workforce. Resignations were highest in industries like hospitality and medicine, where employees are most at risk of Covid-19 exposure.

For the first time in many decades, workers are in the driver’s seat. They can command higher wages and demand better working conditions. And that’s exactly what they’re doing at workplaces ranging from agricultural equipment manufacturer John Deere to food manufacturers Kellogg and Nabisco. I’ve even been witnessing it in my own labor niche: part-time university faculty (I’m one of them). So allow me to pause here for a shout-out to the 6,500 part-time faculty members in the University of California system: Thank you! Your threat of a two-day strike won a new contract with a 30% pay raise over the next five years!

This brings me to Biden’s October announcement about those ports going 24/7. In addition to demanding higher pay, better conditions, and an end to two-tier compensation systems (in which laborers hired later don’t get the pay and benefits available to those already on the job), workers are now in a position to reexamine and, in many cases, reject the shift-work system itself. And they have good reason to do so.

So, what is shift work? It’s a system that allows a business to run continuously, ceaselessly turning out and/or transporting widgets year after year. Workers typically labor in eight-hour shifts: 8:00 a.m. to 4:00 p.m., 4:00 p.m. to midnight, and midnight to 8:00 a.m., or the like. In times of labor shortages, they can even be forced to work double shifts, 16 hours in total. Businesses love shift work because it reduces time (and money) lost to powering machinery up and down. And if time is money, then more time worked means more profit for corporations. In many industries, shift work is good for business. But for workers, it’s often another story.

The Graveyard Shift

Each shift in a 24-hour schedule has its own name. The day shift is the obvious one. The swing shift takes you from the day shift to the all-night, or graveyard, shift. According to folk etymology, that shift got its name because, once upon a time, cemetery workers were supposed to stay up all night listening for bells rung by unfortunates who awakened to discover they’d been buried alive. While it’s true that some coffins in England were once fitted with such bells, the term was more likely a reference to the eerie quiet of the world outside the workplace during the hours when most people are asleep.

I can personally attest to the strangeness of life on the graveyard shift. I once worked in an ice cream cone factory. Day and night, noisy, smoky machines resembling small Ferris wheels carried metal molds around and around, while jets of flame cooked the cones inside them. After a rotation, each mold would tip, releasing four cones onto a conveyor belt, rows of which would then approach my station relentlessly. I’d scoop up a stack of 25, twirl them around in a quick check for holes, and place them in a tall box.

Almost simultaneously, I’d make cardboard dividers, scoop up three more of those stacks and seal them, well-divided, in that box, which I then inserted in an even larger cardboard carton and rushed to a giant mechanical stapler. There, I pressed it against a switch, and — boom-ba-da-boom — six large staples would seal it shut, leaving me just enough time to put that carton atop a pallet of them before racing back to my machine, as new columns of just-baked cones piled up, threatening to overwhelm my worktable.

The only time you stopped scooping and boxing was when a relief worker arrived, so you could have a brief break or gobble down your lunch. You rarely talked to your fellow-workers, because there was only one “relief” packer, so only one person at a time could be on break. Health regulations made it illegal to drink water on the line and management was too cheap to buy screens for the windows, which remained shut, even when it was more than 100 degrees outside.

They didn’t like me very much at the Maryland Pacific Cone Company, maybe because I wanted to know why the high school boys who swept the floors made more than the women who, since the end of World War II, had been climbing three rickety flights of stairs to stand by those machines. In any case, management there started messing with my shifts, assigning me to all three in the same week. As you might imagine, I wasn’t sleeping a whole lot and would occasionally resort to those “little white pills” immortalized in the truckers’ song “Six Days on the Road.”

But I’ll never forget one graveyard shift when an angel named Rosie saved my job and my sanity. It was probably three in the morning. I’d been standing under fluorescent lights, scooping, twirling, and boxing for hours when the universe suddenly stood still. I realized at that moment that I’d never done anything else since the beginning of time but put ice cream cones in boxes and would never stop doing so until the end of time.

If time lost its meaning then, dimensions still turned out to matter a lot, because the cones I was working on that night were bigger than I was used to. Soon I was falling behind, while a huge mound of 40-ounce Eat-It-Alls covered my table and began to spill onto the floor. I stared at them, frozen, until I suddenly became aware that someone was standing at my elbow, gently pushing me out of the way.

Rosie, who had been in that plant since the end of World War II, said quietly, “Let me do this. You take my line.” In less than a minute, she had it all under control, while I spent the rest of the night at her machine, with cones of a size I could handle.

I have never been so glad to see the dawn.

The Deadly Reality of the Graveyard Shift

So, when the president of the United States negotiated to get dock workers in Los Angeles to work all night, I felt a twinge of horror. There’s another all-too-literal reason to call it the “graveyard” shift. It turns out that working when you should be in bed is dangerous. Not only do more accidents occur when the human body expects to be asleep, but the long-term effects of night work can be devastating. As the Centers for Disease Control and Prevention’s National Institute for Occupational Safety and Health (NIOSH) reports, the many adverse effects of night work include:

“type 2 diabetes, heart disease, stroke, metabolic disorders, and sleep disorders. Night shift workers might also have an increased risk for reproductive issues, such as irregular menstrual cycles, miscarriage, and preterm birth. Digestive problems and some psychological issues, such as stress and depression, are more common among night shift workers. The fatigue associated with nightshift can lead to injuries, vehicle crashes, and industrial disasters.”

Some studies have shown that such shift work can also lead to decreased bone-mineral density and so to osteoporosis. There is, in fact, a catchall term for all these problems: shift-work disorder.

In addition, studies directly link the graveyard shift to an increased incidence of several kinds of cancer, including breast and prostate cancer. Why would disrupted sleep rhythms cause cancer? Because such disruptions affect the release of the hormone melatonin. Most of the body’s cells contain little “molecular clocks” that respond to daily alternations of light and darkness. When the light dims at night, the pineal gland releases melatonin, which promotes sleep. In fact, many people take it in pill form as a “natural” sleep aid. Under normal circumstances, such a melatonin release continues until the body encounters light again in the morning.

When this daily (circadian) rhythm is disrupted, however, so is the regular production of melatonin, which turns out to have another important biological function. According to NIOSH, it “can also stop tumor growth and protect against the spread of cancer cells.” Unfortunately, if your job requires you to stay up all night, it won’t do this as effectively.

There’s a section on the NIOSH website that asks, “What can night shift workers do to stay healthy?” The answers are not particularly satisfying. They include regular checkups and seeing your doctor if you have any of a variety of symptoms, including “severe fatigue or sleepiness when you need to be awake, trouble with sleep, stomach or intestinal disturbances, irritability or bad mood, poor performance (frequent mistakes, injuries, vehicle crashes, near misses, etc.), unexplained weight gain or loss.”

Unfortunately, even if you have access to healthcare, your doctor can’t write you a prescription to cure shift-work disorder. The cure is to stop working when your body should be asleep.

An End to Shift Work?

Your doctor can’t solve your shift work issue because, ultimately, it’s not an individual problem. It’s an economic and an ethical one.

There will always be some work that must be performed while most people are sleeping, including healthcare, security, and emergency services, among others. But most shift work gets done not because life depends upon it, but because we’ve been taught to expect our patio furniture on demand. As long as advertising and the grow-or-die logic of capitalism keep stoking the desire for objects we don’t really need, may not even really want, and will sooner or later toss on a garbage pile in this or some other country, truckers and warehouse workers will keep damaging their health.

Perhaps the pandemic, with its kinky supply chain, has given us an opportunity to rethink which goods are so “critical” that we’re willing to let other people risk their lives to provide them for us. Unfortunately, such a global rethink hasn’t yet touched Joe Biden and his administration as they confront an ongoing pandemic, supply-chain problems, a rise in inflation, and — oh yes! — an existential climate crisis that gets worse with every plastic widget produced, packed, and shipped.

It’s time for Biden — and the rest of us — to take a breath and think this through. There are good reasons that so many people are walking away from underpaid, life-threatening work. Many of them are reconsidering the nature of work itself and its place in their lives, no matter what the president or anyone else might wish.

And that’s a paradigm shift we all could learn to live with.

Copyright 2021 Rebecca Gordon

Featured image: Port of Los Angeles sunrise by pete is licensed under CC BY 2.0 / Flickr

The curse of the ignored modern prophets

For decades, I kept a poster on my wall that I'd saved from the year I turned 16. In its upper left-hand corner was a black-and-white photo of a white man in a grey suit. Before him spread a cobblestone plaza. All you could see were the man and the stones. Its caption read, "He stood up alone and something happened."

It was 1968. "He" was Minnesota Senator Eugene McCarthy. As that campaign slogan suggested, his strong second-place showing in the New Hampshire primary was proof that opposition to the Vietnam War had finally become a viable platform for a Democratic candidate for president. I volunteered in McCarthy's campaign office that year. My memory of my duties is now vague, but they mainly involved alphabetizing and filing index cards containing information about the senator's supporters. (Remember, this was the age before there was a computer in every pocket, let alone social media and micro-targeting.)

Running against the Vietnam War, McCarthy was challenging then-President Lyndon Johnson in the Democratic primaries. After McCarthy's strong showing in New Hampshire, New York Senator Robert F. Kennedy entered the race, too, running against the very war his brother, President John F. Kennedy, had bequeathed to Johnson when he was assassinated. Soon, Johnson would withdraw from the campaign, announcing in a televised national address that he wouldn't run for another term.

With his good looks and family name, Bobby Kennedy appeared to have a real chance for the nomination when, on June 5, 1968, during a campaign event in Los Angeles, he, like his brother, was assassinated. That left the war's opponents without a viable candidate for the nomination. Outside the Democratic Party convention in Chicago that August, tens of thousands of angry, mostly young Americans demonstrated their frustration with the war and the party's refusal to take a stand against it. In what was generally recognized as a police riot, the Chicago PD beat protesters and journalists bloody on national TV, as participants chanted, "The whole world is watching." And indeed, it was.

In the end, the nomination went to Johnson's vice president and war supporter Hubert Humphrey, who would face Republican hawk Richard Nixon that November. The war's opponents watched in frustration as the two major parties closed ranks, cementing their post-World-War-II bipartisan agreement to use military power to enforce U.S. global dominance.

Cassandra Foresees the Future

Of course, the McCarthy campaign's slogan was wrong on two counts. He didn't stand up alone. Millions of us around the world were then working to end the war in Vietnam. Sadly, nothing conclusive happened as a result of his campaign. Nixon went on to win the 1968 general election and the Vietnam War dragged on to an ignominious U.S. defeat seven years later.

Nineteen sixty-eight was also the year my high school put on Tiger at the Gates, French playwright Jean Giraudoux's antiwar drama about the run-up to the Trojan War. Giraudoux chronicled that ancient conflict's painful inevitability, despite the fervent desire of Troy's rulers and its people to prevent it. The play opens as Andromache, wife of the doomed Trojan warrior Hector, tells her sister-in-law Cassandra, "There's not going to be a Trojan war."

Cassandra, you may remember, bore a double curse from the gods: yes, she could see into the future, but no one would believe her predictions. She informs Andromache that she's wrong; that, like a tiger pacing outside the city's walls, war with all its bloody pain is preparing to spring. And, of course, she's right. Part of the play's message is that Cassandra doesn't need her supernatural gift to predict the future. She can guess what will happen simply because she understands the relentless forces driving her city to war: the poets who need tragedies to chronicle; the would-be heroes who desire glory; the rulers caught in the inertia of tradition.

Although Tiger was written in the 1930s, between the two world wars, it could just as easily have appeared in 1968. Substitute the mass media for the poets; the military-industrial complex for the Greek and Trojan warriors; and administration after administration for the city's rulers, and you have a striking representation of the quicksand war that dragged 58,000 U.S. soldiers and millions of Vietnamese, Laotians, and Cambodians to their deaths. And in some sense, we — the antiwar forces in this country — foresaw it all (in broad outline, if not specific detail): the assassinations, carpet bombings, tiger cages, and the CIA's first mass assassination and torture scheme, the Phoenix Program. Of course we couldn't predict the specifics. Indeed, some turned out worse than we'd feared. In any case, our foresight did us no more good than Cassandra's did her.

Rehabilitations and Revisions

It's just over a month since the 20th anniversary of the 9/11 attacks and the start of the "Global War on Terror." The press has been full of recollections and rehabilitations. George W. Bush used the occasion to warn the nation (as if we needed it at that point) about the dangers of what CNN referred to as "domestic violent extremists." He called them "children of the same foul spirit" as the one that engenders international terrorism. He also inveighed against the January 6th Capitol invasion:

"'This is how election results are disputed in a banana republic — not our democratic republic,' he said in a statement at the time, adding that he was 'appalled by the reckless behavior of some political leaders since the election.'"

You might almost think he'd forgotten that neither should elections in a democracy be "disputed" by three-piece-suited thugs shutting down a ballot count — as happened in Florida during his own first election in 2000. Future Trump operative Roger Stone has claimed credit for orchestrating that so-called Brooks Brothers Rebellion, which stopped the Florida vote count and threw the election to the Supreme Court and, in the end, to George W. Bush.

You might also think that, with plenty of shoving from his vice president Dick Cheney and a cabal of leftover neocons from the Project for a New American Century, Bush had never led this country into two devastating, murderous, profoundly wasteful wars. You might think we'd never seen the resumption of institutionalized CIA- and military-run state torture on a massive scale under his rule, or his administration's refusal to join the International Criminal Court.

And finally, you might think that nobody saw all this coming, that there were no Cassandras in this country in 2001. But there you would be wrong. All too many of us sensed just what was coming as soon as the bombing and invasion of Afghanistan began. I knew, for example, as early as November 2001, when the first mainstream article extolling the utility of torture appeared, that whatever else the U.S. response to the 9/11 attacks would entail, organized torture would be part of it. As early as December 2002, we all could have known that. That's when the first articles began appearing in the Washington Post about the "stress and duress" techniques the CIA was already beginning to use at Bagram Air Base in Afghanistan. Some of the hapless victims would later turn out to have been sold to U.S. forces for bounties by local strongmen.

It takes very little courage for a superannuated graduate student (as I was in 2001) to write academic papers about U.S. torture practices (as I did) and the stupidity and illegality of our invasion of Afghanistan. It's another thing, however, when a real Cassandra stands up — all alone — and tries to stop something from happening.

I'm talking, of course, about Representative Barbara Lee, the only member of Congress to vote against granting the president the power to "use all necessary and appropriate force against those nations, organizations, or persons he determines planned, authorized, committed, or aided the terrorist attacks that occurred on September 11, 2001, or harbored such organizations or persons." It was this Authorization for Use of Military Force, or AUMF, that provided the legal grounds for the U.S. invasion of Afghanistan in October 2001. Lee was right when, after agonizing about her vote, she decided to follow the counsel of the dean of the National Cathedral, the Reverend Nathan Baxter. That very morning, she had heard him pray that, in response to the terrible crimes of 9/11, we not "become the evil we deplore."

How right she was when she said on the House floor:

"However difficult this vote may be, some of us must urge the use of restraint. Our country is in a state of mourning. Some of us must say, 'Let's step back for a moment, let's just pause, just for a minute, and think through the implications of our actions today, so that this does not spiral out of control.'"

The legislation she opposed that day would indeed allow "this" to spiral out of control. That same AUMF has since been used to justify an ever-metastasizing series of wars and conflicts that spread from Afghanistan in central Asia through the Middle East, south to Yemen, and leapt to Libya, Somalia, and other lands in Africa. Despite multiple attempts to repeal it, that same minimalist AUMF remains in effect today, ready for use by the next president with aspirations to military adventures. In June 2021, the House of Representatives did pass a bill, sponsored by Barbara Lee herself, repealing the companion 2002 authorization for the war in Iraq. At present, however, it languishes in the Senate's Committee on Foreign Relations.

In the days after 9/11, Lee was roundly excoriated for her vote. The Wall Street Journal called her a "clueless liberal," while the Washington Times wrote that she was "a long-practicing supporter of America's enemies." Curiously, both those editorials were headlined with the question, "Who Is Barbara Lee?" (Those of us in the San Francisco Bay Area could have answered that. Lee was — and remains — an African American congressional representative from Oakland, California, the inheritor of the seat and mantle of another great black congressional representative, Ron Dellums.) She received mountains of hate mail then and enough death threats to force her to seek police protection.

Like George W. Bush, Lee received some media rehabilitation in various 20th anniversary retrospectives of 9/11. In her case, however, it was well-deserved. The Washington Post, for instance, praised her for her courage, noting that no one — not Bernie Sanders, not Joe Biden — shared her vision, or, I would add, shared Cassandra's curse with her. Like the character in Tiger at the Gates, Lee didn't need a divine gift to foresee that the U.S. "war on terror" would spin disastrously out of control. A little historical memory might have served the rest of the country well, reminding us of what happened the last time the United States fought an ever-escalating war.

Cassandras and Their Mirror Images

It was clear from the start that Vice President Dick Cheney and Secretary of Defense Donald Rumsfeld were never that interested in Afghanistan (although that was no solace to the many thousands of Afghans who were bombed, beaten, and tortured). Those officials had another target in mind — Iraq — almost literally from the moment al-Qaeda's hijacked planes struck New York and Washington.

In 2002, after months of lies about Iraqi leader Saddam Hussein's possession of (nonexistent) weapons of mass destruction (WMD) and his supposed pursuit of a nuclear bomb, the Bush administration got its second AUMF, authorizing "the President to use the U.S. armed forces to: …defend U.S. national security against the continuing threat posed by Iraq," functionally condoning the U.S. invasion of his country. This time, Barbara Lee was not alone in her opposition. In the House, she was joined by 132 colleagues: 125 other Democrats, 6 Republicans, and one independent (Bernie Sanders). Only 23 senators, however, voted "nay," including Rhode Island Republican Lincoln Chafee and Vermont independent Jim Jeffords.

In the run-up to the March 2003 invasion, figures who might be thought of as "anti-Cassandras" took center stage. Unlike the Greek seer, these unfortunates were apparently doomed to tell falsehoods — and be believed. Among them was Condoleezza Rice, President Bush's national security advisor, who, when pressed for evidence that Saddam Hussein actually possessed WMD, told CNN's Wolf Blitzer that "we don't want the smoking gun to be a mushroom cloud," implying Iraq represented a nuclear threat to this country.

Then there was Secretary of State Colin Powell, who put the case for war to the United Nations Security Council in February 2003, emphasizing the supposedly factual basis of everything he presented:

"My colleagues, every statement I make today is backed up by sources, solid sources. These are not assertions. What we're giving you are facts and conclusions based on solid intelligence."

It wasn't true, of course, but around the world, many believed him.

And let's not leave the mainstream press out of this. There's plenty of blame to go around, but perhaps the anti-Cassandra crown should go to the New York Times for its promotion of Bush administration war propaganda, especially by its reporter Judith Miller. In 2004, the Times published an extraordinary mea culpa, an apologetic note "from the editors" that said,

"[W]e have found a number of instances of coverage that was not as rigorous as it should have been. In some cases, information that was controversial then, and seems questionable now, was insufficiently qualified or allowed to stand unchallenged. Looking back, we wish we had been more aggressive in re-examining the claims as new evidence emerged — or failed to emerge."

I suspect the people of Iraq might share the Times's wish.

There was, of course, one other group of prophets who accurately foresaw the horrors that a U.S. invasion would bring with it: the millions who filled the streets of their cities here and around the world, demanding that the United States stay its hand. So powerful was their witness that they were briefly dubbed "the other superpower." Writing in the Nation, Jonathan Schell extolled their strength, saying that this country's "shock and awe" assault on Iraq "has found its riposte in courage and wonder." Alas, that mass witness in those streets was not enough to forestall one more murderous assault by what would, in the long run, prove to be a dying empire.

Cassandra at the Gates (of Glasgow)

And now, the world is finally waking up to an even greater disaster: the climate emergency that's burning up my part of the world, the American West, and drowning others. This crisis has had its Cassandras, too. One of these was 89-year-old John Rogalsky, who worked for 35 years as a meteorologist in the federal government. As early as 1963, he became aware of the problem of climate change and began trying to warn us. In 2017, he told the Canadian Broadcasting Company:

"[B]y the time the end of the 60s had arrived, I was absolutely convinced that it was real, it was just a question of how rapidly it would happen and how difficult it would become for the world at large, and how soon before people, or governments would even listen to the science. People I talked to about this, I was letting them know, this is happening, get ready."

This November, the 197 nations that have signed on to the United Nations Framework Convention on Climate Change will meet in Glasgow, Scotland, for the 2021 United Nations Climate Change Conference, with Italy and the United Kingdom serving as co-hosts. We must hope that this follow-up to the 2015 Paris agreement will produce concrete steps to reverse the overheating of this planet and mitigate its effects, especially in those nations that have contributed the least to the problem and are already suffering disproportionately.

I hope it's a good sign that at a pre-Glasgow summit in Milan, Italy's Prime Minister Mario Draghi met with three young "Cassandras" — climate activists Greta Thunberg (Sweden), Vanessa Nakate (Uganda), and Martina Comparelli (Italy) — after Thunberg's now famous "blah, blah, blah" speech, accusing world leaders of empty talk. "Your pressure, frankly, is very welcome," Draghi told them. "We need to be whipped into action. Your mobilization has been powerful, and rest assured, we are listening."

For the sake of the world, let us hope that this time Cassandra will be believed.

Copyright 2021 Rebecca Gordon

Featured image: Climate protest by Victoria Pickering is licensed under CC BY-NC-ND 2.0 / Flickr

Debt and disillusionment: Is higher education a giant pyramid scheme?

For the last decade and a half, I've been teaching ethics to undergraduates. Now — admittedly, a little late to the party — I've started seriously questioning my own ethics. I've begun to wonder just what it means to be a participant, however minor, in the pyramid scheme that higher education has become in the years since I went to college.

Airplane Games

Sometime in the late 1980s, the Airplane Game roared through the San Francisco Bay Area lesbian community. It was a classic pyramid scheme, even if cleverly dressed up in language about women's natural ability to generate abundance, just as we gestate children in our miraculous wombs. If the connection between feminism and airplanes was a little murky — well, we could always think of ourselves as modern-day Amelia Earharts. (As long as we didn't think too hard about how she ended up.)

A few women made a lot of money from it — enough, in the case of one friend of mine, for a down payment on a house. Inevitably, a lot more of us lost money, even as some like me stood on the sidelines sadly shaking our heads.

There were four tiers on that "airplane": a captain, two co-pilots, four crew, and eight passengers — 15 in all to start. You paid $3,000 to get on at the back of the plane as a passenger, so the first captain (the original scammer) got out with $24,000 — $3,000 from each passenger. The co-pilots and crew, who were in on the fix, paid nothing to join. When the first captain "parachuted out," the game split in two, and each co-pilot became the captain of a new plane. They then pressured their four remaining passengers to recruit enough new women to fill each plane, so they could get their payday, and the two new co-pilots could each captain their own planes.

Unless new people continued to get on at the back of each plane, there would be no payday for the earlier passengers, so the pressure to recruit ever more women into the game only grew. The original scammers ran through the game a couple of times, but inevitably the supply of gullible women willing to invest their savings ran out. By the time the game collapsed, hundreds of women had lost significant amounts of money.
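If you want to see why that collapse was inevitable, the arithmetic is easy to sketch. Below is a minimal simulation in Python (my own illustration, not anything the game's promoters circulated) using only the figures above: a $3,000 buy-in, eight paying passengers per plane, and a plane that splits in two every time a captain cashes out.

```python
# A rough sketch of the Airplane Game arithmetic described above.
# Assumptions taken from the essay: each "plane" needs 8 new passengers
# paying $3,000 apiece; the captain then cashes out with $24,000 and the
# plane splits into two new planes, each needing 8 fresh recruits.

BUY_IN = 3_000
PASSENGERS_PER_PLANE = 8
PAYOUT = BUY_IN * PASSENGERS_PER_PLANE  # $24,000 per departing captain

def airplane_game(rounds: int) -> None:
    """Show how many new $3,000 recruits each round of payouts requires."""
    planes = 1
    total_recruits = 0
    for r in range(rounds):
        new_recruits = planes * PASSENGERS_PER_PLANE
        total_recruits += new_recruits
        print(f"round {r}: {planes:3d} plane(s), {new_recruits:4d} new buy-ins, "
              f"{total_recruits:5d} paying recruits so far")
        planes *= 2  # every cashed-out plane becomes two planes

airplane_game(8)
```

By the eighth round of payouts, the game needs more than a thousand new passengers at once and has already absorbed over two thousand buy-ins, which is why the supply of willing recruits always runs out and most of the later passengers lose their $3,000.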

No one seemed to know the women who'd brought the game and all those "planes" to the Bay Area, but they had spun a winning story about endless abundance and the glories of women's energy. After the game collapsed, they took off for another women's community with their "earnings," leaving behind a lot of sadder, poorer, and perhaps wiser San Francisco lesbians.

Feasting at the Tenure Trough or Starving in the Ivory Tower?

So, you may be wondering, what could that long-ago scam have to do with my ethical qualms about working as a college instructor? More than you might think.

Let's start with PhD programs. In 2019, the most recent year for which statistics are available, U.S. colleges and universities churned out about 55,700 doctorates — and such numbers continue to increase by about 1% a year. The average number of doctorates earned over the last decade is almost 53,000 annually. In other words, we're talking about nearly 530,000 PhDs produced by American higher education in those 10 years alone. Many of them have ended up competing for a far smaller number of jobs in the academic world.

It's true that most PhDs in science or engineering end up with post-doctoral positions (earning roughly $40,000 a year) or with tenure-track or tenured jobs in colleges and universities (averaging $60,000 annually to start). Better yet, most of them leave their graduate programs with little or no debt.

The situation is far different if your degree wasn't in STEM (science, technology, engineering, or mathematics) but, for example, in education or the humanities. As a start, far more of those degree-holders graduate owing money, often significant sums, and ever fewer end up teaching in tenure-track positions — in jobs, that is, with security, decent pay, and benefits.

Many of the non-STEM PhDs who stay in academia end up joining an exploited, contingent workforce of part-time, or "adjunct," professors. That reserve army of the underemployed is higher education's dirty little secret. After all, we — and yes, I'm one of them — actually teach the majority of the classes in many schools, while earning as little as $1,500 a semester for each of them.

I hate to bring up transportation again, but there's a reason teachers like us are called "freeway flyers." A 2014 Congressional report revealed that 89% of us work at more than one institution and 27% at three different schools, just to cobble together the most meager of livings.

Many of us, in fact, rely on public antipoverty programs to keep going. Inside Higher Ed, reflecting on a 2020 report from the American Federation of Teachers, describes our situation this way:

"Nearly 25% of adjunct faculty members rely on public assistance, and 40% struggle to cover basic household expenses, according to a new report from the American Federation of Teachers. Nearly a third of the 3,000 adjuncts surveyed for the report earn less than $25,000 a year. That puts them below the federal poverty guideline for a family of four."

I'm luckier than most adjuncts. I have a union, and over the years we've fought for better pay, healthcare, a pension plan, and a pathway (however limited) to advancement. Now, however, my school's administration is using the pandemic as an excuse to try to claw back the tiny cost-of-living adjustments we won in 2019.

The Oxford Dictionary of English defines an adjunct as "a thing added to something else as a supplementary rather than an essential part." Once upon a time, in the middle of the previous century, that's just what adjunct faculty were — occasional additions to the full-time faculty. Often, they were retired professionals who supplemented a department's offerings by teaching a single course in their area of expertise, while their salaries were more honoraria than true payments for work performed. Later, as more women entered academia, it became common for a male professor's wife to teach a course or two, often as part of his employment arrangement with the university. Since her salary was a mere adjunct to his, she was paid accordingly.

Now, the situation has changed radically. In many colleges and universities, adjunct faculty are no longer supplements, but the most "essential part" of the teaching staff. Classes simply couldn't go on without us; nor, if you believe college administrations, could their budgets be balanced without us. After all, why pay a full-time professor $10,000 to teach a class (since he or she will be earning, on average, $60,000 a year and covering three classes a semester) when you can give a part-timer like me $1,500 for the very same work?

And adjuncts have little choice. The competition for full-time positions is fierce, since every year another 53,000 or more new PhDs climb into the back row of the academic airplane, hoping to make it to the pilot's seat and secure a tenure-track position.

And here's another problem with that. These days the people in the pilots' seats often aren't parachuting out. They're staying right where they are. That, in turn, means new PhDs find themselves competing for an ever-shrinking prize, as Laura McKenna has written in the Atlantic, "not only with their own cohort but also with the unemployed PhDs who graduated in previous years." Many of those now clinging to pilots' seats are members of my own boomer generation, who still benefit from a 1986 law (signed by then-75-year-old President Ronald Reagan) that outlawed mandatory retirements.

Grade Inflation v. Degree Inflation?

People in the world of education often bemoan the problem of "grade inflation" — the tendency of average grades to creep up over time. Ironically, this problem is exacerbated by the adjunctification of teaching, since adjuncts tend to award higher grades than professors with secure positions. The reason is simple enough: colleges use student evaluations as a major metric for rehiring adjuncts and higher grades translate directly into better evaluations. Grade inflation at the college level is, in my view, a non-issue, at least for students. Employers don't look at your transcript when they're hiring you and even graduate schools care more about recommendations and GRE scores.

The real problem faced by today's young people isn't grade inflation. It's degree inflation.

Once upon a time in another America, a high-school diploma was enough to snag you a good job, with a chance to move up as time went on (especially if you were white and male, as the majority of workers were in those days). And you paid no tuition whatsoever for that diploma. In fact, public education through 12th grade is still free, though its quality varies profoundly depending on who you are and where you live.

But all that changed as increasing numbers of employers began requiring a college degree for jobs that don't by any stretch of the imagination require a college education to perform. The Washington Post reports:

"Among the positions never requiring a college degree in the past that are quickly adding that to the list of desired requirements: dental hygienists, photographers, claims adjusters, freight agents, and chemical equipment operators."

In 2017, Manjari Raman of the Harvard Business School wrote that

"the degree gap — the discrepancy between the demand for a college degree in job postings and the employees who are currently in that job who have a college degree — is significant. For example, in 2015, 67% of production supervisor job postings asked for a college degree, while only 16% of employed production supervisors had one."

In other words, even though most people already doing such jobs don't have a bachelor's degree, companies are only hiring new people who do. Part of the reason: that requirement automatically eliminates a lot of applicants, reducing the time and effort involved in making hiring decisions. Rather than sifting through résumés for specific skills (like the ability to use certain computer programs or write fluently), employers let a college degree serve as a proxy. The result is not only that they'll hire people who don't have the skills they actually need, but that they're eliminating people who do have the skills but not the degree. You won't be surprised to learn that those rejected applicants are more likely to be people of color, who are underrepresented among the holders of college degrees.

Similarly, some fields that used to accept a BA now require a graduate degree to perform the same work. For example, the Bureau of Labor Statistics reports that "in 2015–16, about 39% of all occupational therapists ages 25 and older had a bachelor's degree as their highest level of educational attainment." Now, however, employers are commonly insisting that new applicants hold at least a master's degree — and so up the pyramid we continually go (at ever greater cost to those students).

The Biggest Pyramid of All

In a sense, you could say that the whole capitalist economy is the biggest pyramid of them all. For every one of the fascinating, fulfilling, autonomous, and well-paying jobs out there, there are thousands of boring, mind- and body-crushing ones like pulling items for shipment in an Amazon warehouse or folding clothes at Forever 21.

We know, in other words, that there are only a relatively small number of spaces in the cockpit of today's economic plane. Nonetheless, we tell our young people that the guaranteed way to get one of those rare gigs at the top of the pyramid is a college education.

Now, just stop for a second and consider what it costs to join the 2021 all-American Airplane Game of education. In 1970, when I went to Reed, a small, private, liberal arts college, tuition was $3,000 a year. I was lucky. I had a scholarship (known in modern university jargon as a "tuition discount") that covered most of my costs. This year, annual tuition at that same school is a mind-boggling $62,420, more than 20 times as high. If college costs had simply risen with inflation, tuition would now be about $21,000 a year; in other words, today's actual price is nearly triple even the inflation-adjusted figure.
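
For readers who want to check that claim, here is the back-of-the-envelope version (the multiplier of roughly 7 is my approximation of cumulative consumer-price inflation from 1970 to 2021, consistent with the $21,000 figure above):

\[
\$3{,}000 \times 7 \approx \$21{,}000,
\qquad
\frac{\$62{,}420}{\$21{,}000} \approx 3.
\]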

If I'd attended Federal City College (now the University of D.C.), my equivalent of a state school then, tuition would have been free. Now, even state schools cost too much for many students. Annually, tuition at the University of California at Berkeley, the flagship school of that state's system, is $14,253 for in-state students, and $44,007 for out-of-staters.

I left school owing $800, or about $4,400 in today's dollars. These days, most financial "aid" resembles foreign "aid" to developing countries — that is, it generally takes the form of loans whose interest piles up so fast that it's hard to keep up with it, let alone begin to pay off the principal in your post-college life. Some numbers to contemplate: 62% of those graduating with a BA in 2019 did so owing money — owing, in fact, an average of almost $29,000. The average debt of those earning a graduate degree was an even more staggering $71,000. That, of course, is on top of whatever the former students had already shelled out while in school. And that, in turn, is before the "miracle" of compound interest takes hold and that debt starts to grow like a rogue zucchini.

It's enough to make me wonder whether a seat in the Great American College and University Airplane Game is worth the price, and whether it's ethical for me to continue serving as an adjunct flight attendant along the way. Whatever we tell students about education being the path to a good job, the truth is that there are remarkably few seats at the front of the plane.

Of course, on the positive side, I do still believe that time spent at college offers students something beyond any price — the opportunity to learn to think deeply and critically, while encountering people very different from themselves. The luckiest students graduate with a lifelong curiosity about the world and some tools to help them satisfy it. That is truly a ticket to a good life — and no one should have to buy a seat in an Airplane Game to get one.

Copyright 2021 Rebecca Gordon


Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.


Teetering on the existential edge, we have one more last, best chance for survival

In San Francisco, we're finally starting to put away our masks. With 74% of the city's residents over 12 fully vaccinated, for the first time in more than a year we're enjoying walking, shopping, and eating out, our faces naked. So I was startled when my partner reminded me that we need to buy masks again very soon: N95 masks, that is. The California wildfire season has already begun, earlier than ever, and we'll need to protect our lungs during the months to come from the fine particulates carried in the wildfire smoke that's been engulfing this city in recent years.

I was in Reno last September, so I missed the morning when San Franciscans awoke to apocalyptic orange skies, the air freighted with smoke from burning forests elsewhere in the state. The air then was bad enough even in the high mountain valley of Reno. At that point, we'd already experienced "very unhealthy" purple-zone air quality for days. Still, it was nothing like the Mars-like scenes in the photos then emerging from the Bay Area. I have a bad feeling that I may get my chance to experience the same phenomenon in 2021 — and, as the fires across California have started so much earlier, probably sooner than September.

The situation is pretty dire: this state — along with our neighbors to the north and southeast — is now living through an epic drought. After a dry winter and spring, the fuel-moisture content in our forests (the amount of water in vegetation, living and dead) is way below average. This April, the month when it is usually at its highest, San Jose State University scientists recorded levels a staggering 40% below average in the Santa Cruz Mountains, well below the lowest level ever before observed. In other words, we have never been this dry.

Under the Heat Dome

When it's hot in most of California, it's often cold and foggy in San Francisco. Today is no exception. Despite the raging news about heat records, it's not likely to reach 65 degrees here. So it's a little surreal to consider what friends and family are going through in the Pacific Northwest under the once-in-thousands-of-years heat dome that's settled over the region. A heat dome is an area of high pressure surrounded by upper-atmosphere winds that essentially pin it in place. If you remember your high-school physics, you'll recall that when a gas (for example, the air over the Pacific Northwest) is held at a fixed volume, the ratio of its pressure to its absolute temperature stays constant. If the temperature goes up, the pressure goes up.

The converse is also true; as the pressure rises, so does the temperature. And that's what's been happening over Oregon, Washington, and British Columbia in normally chilly Canada. Mix in the fact that climate change has driven average temperatures in those areas up by three to four degrees since the industrial revolution, and you have a recipe for the disaster that struck the region recently.
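
As a purely illustrative worked example (the numbers here are mine, not a forecast), that fixed-volume gas law says

\[
\frac{P_1}{T_1} = \frac{P_2}{T_2}
\quad\Longrightarrow\quad
T_2 = T_1 \,\frac{P_2}{P_1}.
\]

So air that starts at 288 K (about 59 degrees Fahrenheit) and sees its pressure rise by 8% while pinned in place ends up near 311 K, or roughly 100 degrees. A real heat dome involves sinking, compressing air rather than a literally sealed container, but the proportionality is the point.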

And it has indeed been a disaster. The temperature in the tiny town of Lytton, British Columbia, for instance, hit 121 degrees on June 29th, breaking the Canadian heat record for the third time in as many days. (The previous record had stood since 1937.) That was Tuesday. On Wednesday night, the whole town was engulfed in the flames of multiple fires. The fires, in turn, generated huge pyrocumulus clouds that penetrated as high as the stratosphere (a rare event in itself), producing lightning strikes that ignited new fires in a vicious cycle that, in the end, simply destroyed the kilometer-long town.

Heat records have been broken all over the Pacific Northwest. Portland topped records for three days running, culminating with a 116-degree day on June 28th; Seattle hit a high of 108, which the Washington Post reported "was 34 degrees above the normal high of 74 and higher than the all-time heat record in Washington, D.C., among many other cities much farther to its south."

With the heat comes a rise in "sudden and unexpected" deaths. Hundreds have died in Oregon and Washington and, according to the British Columbia coroner, at least 300 in that province — almost double the average number for that time period.

Class, Race, and Hot Air

It's hardly a new observation that the people who have benefited least from the causes of climate change — the residents of less industrialized countries and poor people of all nations — are already suffering most from its results. Island nations like the Republic of Palau in the western Pacific are a prime example. Palau faces a number of climate-change challenges, according to the United Nations Development Program, including rising sea levels that threaten to inundate some of its lowest-lying islands, which are just 10 meters above sea level. In addition, encroaching seawater is salinating some of its agricultural land, creating seaside strips that can now grow only salt-tolerant root crops. Meanwhile, despite substantial annual rainfall, saltwater inundation threatens the drinking water supply. And worse yet, Palau is vulnerable to ocean storms that, on our heating planet, are growing ever more frequent and severe.

There are also subtle ways the rising temperatures that go with climate change have differential effects, even on people living in the same city. Take air conditioning. One of the reasons people in the Pacific Northwest suffered so horrendously under the heat dome is that few homes in that region are air conditioned. Until recently, people there had been able to weather the minimal number of very hot days each year without installing expensive cooling machinery.

Obviously, people with more discretionary income will have an easier time investing in air conditioning now that temperatures are rising. What's less obvious, perhaps, is that its widespread use makes a city hotter — a burden that falls disproportionately on people who can't afford to install it in the first place. Air conditioning works on a simple principle: it shifts heat from the air inside an enclosed space to the outside world, which, in turn, makes that outside air hotter.

A 2014 study of this effect in Phoenix, Arizona, showed that air conditioning raised ambient temperatures by one to two degrees at night — an important finding, because one of the most dangerous aspects of the present heat waves is their lack of night-time cooling. As a result, each day's heat builds on a higher base, while presenting a greater direct-health threat, since the bodies of those not in air conditioning can't recover from the exhaustion of the day's heat at night. In effect, air conditioning not only heats the atmosphere further but shifts the burden of unhealthy heat from those who can afford it to those who can't.

Just as the coronavirus has disproportionately ravaged black and brown communities (as well as poor nations around the world), climate-change-driven heat waves, according to a recent University of North Carolina study reported by the BBC, mean that "black people living in most U.S. cities are subject to double the level of heat stress as their white counterparts." This is the result not just of poverty, but of residential segregation, which leaves urban BIPOC (black, indigenous, and other people of color) communities in a city's worst "heat islands" — the areas containing the most concrete, the most asphalt, and the least vegetation — and which therefore attract and retain the most heat.

"Using satellite temperature data combined with demographic information from the U.S. Census," the researchers "found that the average person of color lives in an area with far higher summer daytime temperatures than non-Hispanic white people." They also discovered that, in all but six of the 175 urban areas they studied in the continental U.S., "people of color endure much greater heat impacts in summer." Furthermore, "for black people this was particularly stark. The researchers say they are exposed to an extra 3.12C [5.6F] of heating, on average, in urban neighborhoods, compared to an extra 1.47C [2.6F] for white people."

That's a big difference.

Food, Drink, and Fires — the View from California

Now, let me return to my own home state, California, where conditions remain all too dry and, apart from the coast right now, all too hot. Northern California gets most of its drinking water from the snowpack that builds each year in the Sierra Nevada mountains. In spring, those snows gradually melt, filling the rivers that fill our reservoirs. In May 2021, however, the Sierra snowpack was a devastating six percent of normal!

Stop a moment and take that in, while you try to imagine the future of much of the state — and the crucial crops it grows.

For my own hometown, San Francisco, things aren't quite that dire. Water levels in Hetch Hetchy, our main reservoir, located in Yosemite National Park, are down from previous years, but not disastrously so. With voluntary water-use reduction, we're likely to have enough to drink this year at least. Things are a lot less promising, however, in rural California where towns tend to rely on groundwater for domestic use.

Shrinking water supplies don't just affect individual consumers here in this state; they affect everyone in the United States who eats, because 13.5% of all our agricultural products, including meat and dairy as well as fruits and vegetables, come from California. Growing food requires prodigious amounts of water. In fact, farmland irrigation accounts for roughly 80% of all the water put to human use in the state.

So how are California's agricultural water supplies doing this year? The answer, sadly, is not very well. State regulators have already cut distribution to about a quarter of California's irrigated acreage (about two million acres) by a drastic 95%. That's right. A full quarter of the state's farmlands have access to just 5% of what they would ordinarily receive from rivers and aqueducts. As a result, some farmers are turning to groundwater, a more easily exhausted source, which also replenishes itself far more slowly than rivers and streams. Some are even choosing to sell their water to other farmers, rather than use it to grow crops at all, because that makes more economic sense for them. As smaller farms are likely to be the first to fold, the water crisis will only enhance the dominance of major corporations in food production.

Meanwhile, we'll probably be breaking out our N95 masks soon. Wildfire season has already begun — earlier than ever. On July 1st, the then-still-uncontained Salt fire briefly closed a section of Interstate 5 near Redding in northern California. (I-5 is the main north-south interstate along the West coast.) And that's only one of the more than 4,500 fire incidents already recorded in the state this year.

Last year, almost 10,000 fires burned more than four million acres here, and everything points to a similar or worse season in 2021. Unlike Donald Trump, who famously blamed California's fires on a failure to properly rake our forests, President Biden is taking the threat seriously. On June 30th, he convened western state leaders to discuss the problem, acknowledging that "we have to act and act fast. We're late in the game here." The president promised a number of measures: guaranteeing sufficient, and sufficiently trained, firefighters; raising their minimum pay to $15 per hour; and making grants to California counties under the Federal Emergency Management Agency's BRIC (Building Resilient Infrastructure and Communities) program.

Such measures will help a little in the short term, but none of them will make a damn bit of difference in the longer run if the Biden administration and a politically divided Congress don't begin to treat climate change as the emergency it is, at once immediate and desperately long-term.

Justice and Generations

In his famous A Theory of Justice, the great liberal philosopher of the twentieth century John Rawls proposed a procedural method for designing reasonable and fair principles and policies in a given society. His idea: that the people determining such basic policies should act as if they had stepped behind a "veil of ignorance" and had lost specific knowledge of their own place in society. They'd be ignorant of their own class status, ethnicity, or even how lucky they'd been when nature was handing out gifts like intelligence, health, and physical strength.

Once behind such a veil of personal ignorance, Rawls argued, people might make rules that would be as fair as possible, because they wouldn't know whether they themselves were rich or poor, black or white, old or young — or even which generation they belonged to. This last category was almost an afterthought, included, he wrote, "in part because questions of social justice arise between generations as well as within them."

His point about justice between generations not only still seems valid to me, but in light of present-day circumstances radically understated. I don't think Rawls ever envisioned a trans-generational injustice as great as the climate-change one we're allowing to happen, not to say actively inducing, at this very moment.

Human beings have a hard time recognizing looming but invisible dangers. In 1990, I spent a few months in South Africa providing some technical assistance to an anti-apartheid newspaper. When local health workers found out that I had worked (as a bookkeeper) for an agency in the U.S. trying to prevent the transmission of AIDS, they desperately wanted to talk to me. How, they hoped to learn, could they get people living in their townships to act now to prevent a highly transmissible illness that would only produce symptoms years after infection? How, in the face of the all-too-present emergencies of everyday apartheid life, could they get people to focus on a vague but potentially horrendous danger barreling down from the future? I had few good answers and, more than 30 years later, South Africa has the largest HIV-positive population in the world.

Of course, there are human beings who've known about the climate crisis for decades — and not just the scientists who wrote about it as early as the 1950s or the ones who gave an American president an all-too-accurate report on it in 1965. The fossil-fuel companies have, of course, known all along — and have focused their scientific efforts not on finding alternative energy sources, but on creating doubt about the reality of human-caused climate change (just as, once upon a time, tobacco companies sowed doubt about the relationship between smoking and cancer). As early as 1979, the Guardian reports, an internal Exxon study concluded that the use of fossil fuels would certainly "cause dramatic environmental effects" in the decades ahead. "The potential problem is great and urgent," the study concluded.

A problem that was "great and urgent" in 1979 is now a full-blown existential crisis for human survival.

Some friends and I were recently talking about how ominous the future must look to the younger people we know. "They are really the first generation to confront an end to humanity in their own, or perhaps their children's lifetimes," I said.

"But we had The Bomb," a friend reminded me. "We grew up in the shadow of nuclear war." And she was right of course. We children of the 1950s and 1960s grew up knowing that someone could "press the button" at any time, but there was a difference. Horrifying as is the present retooling of our nuclear arsenal (going on right now, under President Biden), nuclear war nonetheless remains a question of "if." Climate change is a matter of "when" and that when, as anyone living in the Northwest of the United States and Canada should know after these last weeks, is all too obviously now.

It's impossible to overstate the urgency of the moment. And yet, as a species, we're acting like the children of indulgent parents who provide multiple "last chances" to behave. Now, nature has run out of patience and we're running out of chances. So much must be done globally, especially to control the giant fossil-fuel companies. We can only hope that real action will emerge from November's international climate conference. And here in the U.S., unless congressional Democrats succeed in ramming through major action to stop climate change before the 2022 midterms, we'll have lost one more last, best chance for survival.

Copyright 2021 Rebecca Gordon


Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.

Welfare for weapons makers doesn't make us safer

These days my conversations with friends about the new administration go something like this:

"Biden's doing better than I thought he would."

"Yeah. Vaccinations, infrastructure, acknowledging racism in policing. A lot of pieces of the Green New Deal, without calling it that. The child subsidies. It's kind of amazing."

"But on the military–"

"Yeah, same old, same old."

As my friends and I have noticed, President Joe Biden remains super-glued to the same old post-World War II agreement between the two major parties: they can differ vastly on domestic policies, but they remain united when it comes to projecting U.S. military power around the world and to the government spending that sustains it. In other words, the U.S. "national security" budget is still the third rail of politics in this country.

Assaulting the Old New Deal

It was Democratic House Speaker Tip O'Neill who first declared that Social Security is "the third rail" of American politics. In doing so, he metaphorically pointed to the high-voltage rail that runs between the tracks of subways and other light-rail systems. Touch that and you'll electrocute yourself.

O'Neill made that observation back in 1981, early in Ronald Reagan's first presidential term, at a moment when the new guy in Washington was already hell-bent on dismantling Franklin Delano Roosevelt's New Deal legacy.

Reagan would fight his campaign to do so on two key fronts. First, he would attack labor unions, whose power had expanded in the years since the 1935 Wagner Act (officially the National Labor Relations Act) guaranteed workers the right to bargain collectively with their employers over wages and workplace rules. Such organizing rights had been hard-won indeed. Not a few workers died at the hands of the police or domestic mercenaries like Pinkerton agents, especially in the early 1930s. By the mid-1950s, union membership would peak at around 35% of workers, while wages would continue to grow into the late 1970s, when they stagnated and began their long decline.

Reagan's campaign began with an attack on PATCO, a union of well-paid professionals — federally employed air-traffic controllers — which the Federal Labor Relations Authority eventually decertified. That initial move signaled the Republican Party's willingness, even eagerness, to break with decades of bipartisan support for organized labor. By the time Donald Trump took office in the next century, it was a given that Republicans would openly support anti-union measures like federal "right-to-work" laws, which, if passed, would make it illegal for employers to agree to a union-only workplace and so effectively destroy the bargaining power of unions. (Fortunately, opponents were able to forestall that move during Trump's presidency, but in February 2021, Republicans reintroduced their National Right To Work Act.)

The Second Front and the Third Rail

There was a second front in Reagan's war on the New Deal. He targeted a group of programs from that era that came to be known collectively as "entitlements." Three of the most important were Aid to Dependent Children, unemployment insurance, and Social Security. In addition, in 1965, a Democratic Congress had added a healthcare entitlement, Medicare, which helps cover medical expenses for those over 65 and younger people with specific chronic conditions, as well as Medicaid, which does the same for poor people who qualify. These, too, would soon be in the Republican gunsights.

The story of Reagan's racially inflected attacks on welfare programs is well-known. His administration's urge to go after unemployment insurance, which provided payments to laid-off workers, is less commonly acknowledged. In language eerily echoed by Republican congressional representatives today, the Reagan administration sought to reduce the duration of unemployment benefits, so that workers would be forced to take any job at any wage. A 1981 New York Times report, for instance, quoted Reagan Assistant Secretary of Labor Albert Angrisani as saying:

"'The bottom line… is that we have developed two standards of work, available work and desirable work.' Because of the availability of unemployment insurance and extended benefits, he said, 'there are jobs out there that people don't want to take.'"

Reagan did indeed get his way with unemployment insurance, but when he turned his sights on Social Security, he touched Tip O'Neill's third rail.

Unlike welfare, whose recipients are often framed as lazy moochers, and unemployment benefits, which critics claim keep people from working, Social Security was then and remains today a hugely popular program. Because workers contribute to the fund with every paycheck and usually collect benefits only after retirement, beneficiaries appear deserving in the public eye. Of all the entitlement programs, it's the one most Americans believe that they and their compatriots are genuinely entitled to. They've earned it. They deserve it.

So, when the president moved to reduce Social Security benefits, ostensibly to offset a rising deficit in its fund, he was shocked by the near-unanimous bipartisan resistance he met. His White House put together a plan to cut $80 billion over five years by — among other things — immediately cutting benefits and raising the age at which people could begin fully collecting them. Under that plan, a worker who retired early at 62 and was entitled to $248 a month would suddenly see that payout reduced to $162.

Access to early retirement was, and remains, a justice issue for workers with shorter life expectancies — especially when those lives have been shortened by the hazards of the work they do. As South Carolina Republican Congressman Carroll Campbell complained to the White House at the time: "I've got thousands of sixty-year-old textile workers who think it's the end of the world. What the hell am I supposed to tell them?"

After the Senate voted 96-0 to oppose any plan that would "precipitously and unfairly reduce early retirees' benefits," the Reagan administration regrouped and worked out a compromise with O'Neill and the Democrats. Economist (later Federal Reserve chair) Alan Greenspan would lead a commission that put together a plan, approved in 1983, to gradually raise the full retirement age, increase the premiums paid by self-employed workers, start taxing benefits received by people with high incomes, and delay cost-of-living adjustments. Those changes were rolled out gradually, the country adjusted, and no politicians were electrocuted in the process.

Panic! The System Is Going Broke!

With its monies maintained in a separately sequestered trust fund, Social Security, unlike most government programs, is designed to be self-sustaining. Periodically, as economist and New York Times columnist Paul Krugman might put it, serious politicians claim to be concerned about that fund running out of money. There's a dirty little secret that those right-wing deficit slayers never tell you, though: when the Social Security trust fund runs a surplus, as it did from 1983 to 2009, it's required to invest it in government bonds, indirectly helping to underwrite the federal government's general fund.

They also aren't going to mention that one group that contributes to that surplus will never see a penny in benefits: undocumented immigrant workers who pay into the system but won't ever collect Social Security. Indeed, in 2016, such workers provided an estimated $13 billion out of about $957 billion in Social Security taxes, or roughly 1.4% of total revenues. That may not sound like much, but over the years it adds up. In that way, undocumented workers help subsidize the trust fund and, in surplus years, the entire government.

How, then, is Social Security funded? Each year, employees contribute 6.2% of their wages (up to a cap amount). Employers match that, for a total of 12.4% of wages paid, and each side kicks in another 1.45% for Medicare. Self-employed people pay both shares, for a total of 15.3% of their income, including Medicare. Those payroll contributions add up to about nine-tenths of the fund's annual income (89% in 2019). The rest comes mostly from interest on the trust fund's government bonds, along with income taxes collected on some recipients' benefits.

So, is the Social Security system finally in trouble? It could be. When the benefits due to a growing number of retirees exceed the fund's income, its administrators will have to dip into its reserves to make up the difference. As people born in the post-World War II baby boom reach retirement, at a moment when the American population is beginning to age rapidly, dire predictions are resounding about the potential bankruptcy of the system. And there is, in fact, a consensus that the fund will begin drawing down its reserves, possibly starting this year, and could exhaust them as soon as 2034. At that point, relying only on the current year's income to pay benefits could reduce Social Security payouts to perhaps 79% of what's promised at present.

You can already hear the cries that the system is going broke!

But it doesn't have to be that way. Employees and employers only pay Social Security tax on income up to a certain cap. This year it's $142,800. In other words, employees who make a million dollars in 2021 will contribute no more to Social Security than those who make $142,800. To rescue Social Security, all it would take is raising that cap — or better yet, removing it altogether.
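
Here is a minimal sketch of that cap arithmetic in code. The 6.2% employee rate and the $142,800 cap come from the figures above; the function name and structure are mine, for illustration only, not a tax calculator.

```python
# Illustrative sketch only, using the 2021 figures cited above.
OASDI_RATE = 0.062   # employee share of the Social Security payroll tax (the employer matches it)
WAGE_CAP = 142_800   # 2021 taxable maximum

def employee_social_security_tax(wages: float, cap: float = WAGE_CAP) -> float:
    """Employee's annual Social Security contribution: 6.2% of wages, but only up to the cap."""
    return OASDI_RATE * min(wages, cap)

print(employee_social_security_tax(142_800))    # about 8853.6 -- a worker right at the cap
print(employee_social_security_tax(1_000_000))  # about 8853.6 -- a million-dollar earner pays the same
```

Remove the cap and the second call would return $62,000 instead, which is the whole argument for lifting it.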

In fact, the Congressional Budget Office has run the numbers and identified two different methods of raising it to eventually tax all wage income. Either would keep the trust fund solvent.

Naturally, plutocrats and their congressional minions don't want to raise the Social Security cap. They'd rather starve the entitlement beast and blame possible shortfalls on greedy boomers who grew up addicted to government handouts. Under the circumstances, we, and succeeding generations, had better hope that Social Security remains, as it was in 1981, the third rail in American politics.

Welfare for Weapons Makers

Of course, there's a second high-voltage, untouchable rail in American politics and that's funding for the military and weapons manufacturers. It takes a brave politician indeed to suggest even the most minor of reductions in Pentagon spending, which has for years been the single largest item of discretionary spending in the federal budget.

It's notoriously difficult to identify how much money the government actually spends annually on the military. President Trump's last Pentagon budget, for the fiscal year ending on September 30th, offered about $740 billion to the armed services (not including outlays for veteran services and pensions). Or maybe it was only $705.4 billion. Or perhaps, including Department of Energy outlays involving nuclear weapons, $753.5 billion. (And none of those figures even faintly reflected full national-security spending, which is certainly well over a trillion dollars annually.)

Most estimates put President Biden's 2022 military budget at $753 billion — about the same as Trump's for the previous year. As former Senator Everett Dirksen is once supposed to have said, "A billion here, a billion there, and pretty soon you're talking real money."

Indeed, we're talking real money and real entitlements here that can't be touched in Washington without risking political electrocution. Unlike actual citizens, U.S. arms manufacturers seem entitled to ever-increasing government subsidies — welfare for weapons, if you like. Beyond the billions spent to directly fund the development and purchase of various weapons systems, every time the government permits arms sales to other countries, it's swelling the coffers of companies like Lockheed Martin, Northrop Grumman, Boeing, and Raytheon Technologies. The real beneficiaries of Donald Trump's so-called Abraham Accords between Israel and the majority Muslim states of Morocco, the United Arab Emirates, Bahrain, and Sudan were the U.S. companies that sell the weaponry that sweetened those deals for Israel's new friends.

When Americans talk about undeserved entitlements, they're usually thinking about welfare for families, not welfare for arms manufacturers. But military entitlements make the annual federal appropriation of $16.5 billion for Temporary Assistance for Needy Families (TANF) look puny by comparison. In fact, during Republican and Democratic administrations alike, the yearly federal outlay for TANF hasn't changed since it was established through the 1996 Personal Responsibility and Work Opportunity Reconciliation Act, known in the Clinton era as "welfare reform." Inflation has, however, eroded its value by about 40% in the intervening years.

And what do Americans get for those billions no one dares to question? National security, right?

But how is it that the country that spends more on "defense" than the next seven, or possibly 10, countries combined is so insecure that every year's Pentagon budget must exceed the last one? Why is it that, despite those billions for military entitlements, our critical infrastructure, including hospitals, gas pipelines, and subways (not to mention Cape Cod steamships), lies exposed to hackers?

And if, thanks to that "defense" budget, we're so secure, why is it that, in my wealthy home city of San Francisco, residents now stand patiently in lines many blocks long to receive boxes of groceries? Why is "national security" more important than food security, or health security, or housing security? Or, to put it another way, which would you rather be entitled to: food, housing, education, and healthcare, or your personal share of a shiny new hypersonic missile?

But wait! Maybe defense spending contributes to our economic security by creating, as Donald Trump boasted in promoting his arms deals with Saudi Arabia, "jobs, jobs, jobs." It's true that spending on weaponry does, in fact, create jobs, just not nearly as many as investing taxpayer dollars in a variety of far less lethal endeavors would. As Brown University's Costs of War project reports:

"Military spending creates fewer jobs than the same amount of money would have, if invested in other sectors. Clean energy and health care spending create 50% more jobs than the equivalent amount of spending on the military. Education spending creates more than twice as many jobs."

It seems that President Joe Biden is ready to shake things up by attacking child poverty, the coronavirus pandemic, and climate change, even if he has to do it without any Republican support. But he's still hewing to the old Cold War bipartisan alliance when it comes to the real third rail of American politics — military spending. Until the power can be cut to that metaphorical conduit, real national security remains an elusive dream.

The reality of work in the Biden-Harris era

A year ago, just a few weeks before San Francisco locked itself down for the pandemic, I fell deeply in love with a 50-year-old. The object of my desire was a wooden floor loom in the window of my local thrift shop. Friends knowledgeable on such matters examined photos I took of it and assured me that all the parts were there, so my partner (who puts up with such occasional infatuations) helped me wrangle it into one of our basement rooms and I set about learning to weave.

These days, all I want to do is weave. The loom that's gripped me, and the pandemic that's gripped us all, have led me to rethink the role of work (and its subset, paid labor) in human lives. During an enforced enclosure, this 68-year-old has spent a lot of time at home musing on what the pandemic has revealed about how this country values work. Why, for example, do the most "essential" workers so often earn so little — or, in the case of those who cook, clean, and care for the people they live with, nothing at all? What does it mean when conservatives preach the immeasurable value of labor, while insisting that its most basic price in the marketplace shouldn't rise above $7.25 per hour?

That, after all, is where the federal minimum wage has been stuck since 2009. And that's where it would probably stay forever, if Republicans like Kansas Senator Roger Marshall had their way. He brags that he put himself through college making $6 an hour and doesn't understand why people can't do the same today for $7.25. One likely explanation: the cost of a year at Kansas State University has risen from $898 when he was at school to $10,000 today. Another? At six bucks an hour, he was already making almost twice the minimum wage of his college years, a princely $3.35 an hour.
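
To put the senator's comparison in perspective, here is the arithmetic implied by those figures, counted in hours of wage work needed to cover a year's tuition (my framing, ignoring taxes and living costs):

\[
\frac{\$898}{\$6/\text{hour}} \approx 150 \text{ hours then},
\qquad
\frac{\$10{,}000}{\$7.25/\text{hour}} \approx 1{,}380 \text{ hours now}.
\]

That's roughly nine times as many hours behind a counter for the same year of school.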

It's Definitely Not Art, But Is It Work?

It's hard to explain the pleasure I've gotten from learning the craft of weaving, an activity whose roots extend at least 20,000 years into the past. In truth, I could devote the next (and most likely last) 20 years of my life just to playing with "plain weave," its simplest form — over-under, over-under — and not even scratch the surface of its possibilities. Day after day, I tromp down to our chilly basement and work with remarkable satisfaction at things as simple as getting a straight horizontal edge across my cloth.

But is what I'm doing actually "work"? Certainly, at the end of a day of bending under the loom to tie things up, of working the treadles to raise and lower different sets of threads, my aging joints are sore. My body knows all too well that I've been doing something. But is it work? Heaven knows, I'm not making products crucial to our daily lives or those of others. (We now possess more slightly lopsided cloth napkins than any two-person household could use in a lifetime.) Nor, at my beginner's level, am I producing anything that could pass for "art."

I don't have to weave. I could buy textiles for a lot less than it costs me to make them. But at my age, in pandemic America, I'm lucky. I have the time, money, and freedom from personal responsibilities to be able to immerse myself in making cloth. For me, playing with string is a first-world privilege. It won't help save humanity from a climate disaster or reduce police violence in communities of color. It won't even help a union elect an American president, something I was focused on last fall, while working with the hospitality-industry union. It's not teaching college students to question the world and aspire to living examined lives, something I've done in my official work as a part-time professor for the last 15 years. It doesn't benefit anyone but me.

Nevertheless, what I'm doing certainly does have value for me. It contributes, as philosophers might say, to my human flourishing. When I practice weaving, I'm engaged in something political philosopher Iris Marion Young believed essential to a good life. As she put it, I'm "learning and using satisfying and expansive skills." Young thought that a good society would offer all its members the opportunity to acquire and deploy such complicated skills in "socially recognized settings." In other words, a good society would make it possible for people to do work that was both challenging and respected.

Writing in the late 1980s, she took for granted that the "welfare capitalism" of Europe, and to a far lesser extent the United States, would provide for people's basic material needs. Unfortunately, decades later, it's hard even to teach her critique of such welfare capitalism — a system that sustained lives but didn't necessarily allow them to flourish — because my students here have never experienced an economic system that assumes any real responsibility for sustaining life. Self-expression and an opportunity to do meaningful work? Pipe dreams if you aren't already well-off! They'll settle for jobs that pay the rent, keep the refrigerator stocked, and maybe provide some health benefits as well. That would be heaven enough, they say. And who could blame them when so many jobs on offer will fall far short of even such modest goals?

What I'm not doing when I weave is making money. I'm not one of the roughly 18 million workers in this country who do earn their livings in the textile industry. Such "livings" pay a median wage of about $28,000 a year, which likely makes it hard to keep a roof over your head. Nor am I one of the many millions more who do the same around the world, people like Seak Hong, who sews garments and bags for an American company in Cambodia. Describing her life, she told a New York Times reporter, "I feel tired, but I have no choice. I have to work." Six days a week,

"Ms. Hong wakes up at 4:35 a.m. to catch the truck to work from her village. Her workday begins at 7 and usually lasts nine hours, with a lunch break. During the peak season, which lasts two to three months, she works until 8:30 p.m."
"Ms. Hong has been in the garment business for 22 years. She earns the equivalent of about $230 a month and supports her father, her sister, her brother (who is on disability) and her 12-year-old son."

Her sister does the unpaid — but no less crucial — work of tending to her father and brother, the oxen, and their subsistence rice plants.

Hong and her sister are definitely working, one with pay, the other without. They have, as she says, no choice.

Catherine Gamet, who makes handbags in France for Louis Vuitton, is also presumably working to support herself. But hers is an entirely different experience from Hong's. She loves what she's been doing for the last 23 years. Interviewed in the same article, she told the Times, "To be able to build bags and all, and to be able to sew behind the machine, to do hand-sewn products, it is my passion." For Gamet, "The time flies by."

Both these women have been paid to make bags for more than 20 years, but they've experienced their jobs very differently, undoubtedly thanks to the circumstances surrounding their work, rather than the work itself: how much they earn; the time they spend traveling to and from their jobs; the extent to which the "decision" to do a certain kind of work is coerced by fear of poverty. We don't learn from Hong's interview how she feels about the work itself. Perhaps she takes pride in what she does. Most people find a way to do that. But we know that making bags is Gamet's passion. Her work is not merely exhausting, but in Young's phrase "satisfying and expansive." The hours she spends on it are lived, not just endured as the price of survival.

Pandemic Relief and Its Discontents

Joe Biden and Kamala Harris arrived at the White House with a commitment to getting a new pandemic relief package through Congress as soon as possible. It appears that they'll succeed, thanks to the Senate's budget reconciliation process — a maneuver that bypasses the possibility of a Republican filibuster. Sadly, because resetting the federal minimum wage to $15 per hour doesn't directly involve taxation or spending, the Senate's parliamentarian ruled that the reconciliation bill can't include it.

Several measures contained in the package have aroused conservative mistrust, from the extension of unemployment benefits to new income supplements for families with children. Such measures provoke a Republican fear that somebody, somewhere, might not be working hard enough to "deserve" the benefits Congress is offering or that those benefits might make some workers think twice about sacrificing their time caring for children to earn $7.25 an hour at a soul-deadening job.

As New York Times columnist Ezra Klein recently observed, Republicans are concerned that such measures might erode respect for the "natural dignity" of work. In an incisive piece, he rebuked Republican senators like Mike Lee and Marco Rubio for responding negatively to proposals to give federal dollars to people raising children. Such a program, they insisted, smacked of — the horror! — "welfare," while in their view, "an essential part of being pro-family is being pro-work." Of course, for Lee and Rubio "work" doesn't include changing diapers, planning and preparing meals, doing laundry, or helping children learn to count, tell time, and tie their shoelaces — unless, of course, the person doing those things is employed by someone else's family and being paid for it. In that case it qualifies as "work." Otherwise, it's merely a form of government-subsidized laziness.

There is, however, one group of people that "pro-family" conservatives have long believed are naturally suited to such activities and who supposedly threaten the well-being of their families if they choose to work for pay instead. I mean, of course, women whose male partners earn enough to guarantee food, clothing, and shelter with a single income. I remember well a 1993 article by Pat Gowens, a founder of Milwaukee's Welfare Warriors, in the magazine Lesbian Contradiction. She wondered why conservative anti-feminists of that time thought it good if a woman with children had a man to provide those things, but an outrage if she turned to "The Man" for the same aid. In the first case, the woman's work is considered dignified, sacred, and in tune with the divine plan. Among conservatives, then or now, the second could hardly be dignified with the term "work."

The distinction they make between private and public paymasters, when it comes to domestic labor, contains at least a tacit, though sometimes explicit, racial element. When the program that would come to be known as "welfare" was created as part of President Franklin Roosevelt's New Deal in the 1930s, it was originally designed to assist respectable white mothers who, through no fault of their own, had lost their husbands to death or desertion. It wasn't until the 1960s that African American women decided to secure their right to coverage under the same program and built the National Welfare Rights Organization to do so.

The word "welfare" refers, as in the preamble to the Constitution, to human wellbeing. But when Black women started claiming those rights, it suddenly came to signify undeserved handouts. You could say that Ronald Reagan rode into the White House in 1980 in a Cadillac driven by the mythical Black "welfare queen" he continually invoked in his campaign. It would be nice to think that the white resentment harnessed by Reagan culminated (as in "reached its zenith and will now decline") with Trump's 2016 election, but, given recent events, that would be unrealistically optimistic.

Reagan began the movement to undermine the access of poor Americans to welfare programs. Ever since, starving the entitlement beast has been the Republican lodestar. In the same period, of course, the wealthier compatriots of those welfare mothers have continued to receive ever more generous "welfare" from the government. Those would include subsidies to giant agriculture, oil-depletion allowances and other subsidies for fossil-fuel companies, the mortgage-interest tax deduction for people with enough money to buy rather than rent their homes, and the massive tax cuts for billionaires of the Trump era. However, it took a Democratic president, Bill Clinton, to achieve what Reagan couldn't, and, as he put it, "end welfare as we know it."

The Clinton administration used the same Senate reconciliation process in play today for the Biden administration's Covid-19 relief bill to push through the 1996 Personal Responsibility and Work Opportunity Reconciliation Act. It was more commonly known as "welfare reform." That act imposed a 32-hour-per-week work or training requirement on mothers who received what came to be known as Temporary Assistance for Needy Families. It also gave "temporary" its deeper meaning by setting a lifetime benefits cap of five years. Meanwhile, that same act proved a bonanza for non-profits and Private Industry Councils that got contracts to administer "job training" programs and were paid to teach women how to wear skirts and apply makeup to impress future employers. In the process, a significant number of unionized city and county workers nationwide were replaced with welfare recipients "earning" their welfare checks by sweeping streets or staffing county offices, often for less than the minimum wage.

In 1997, I was working with Californians for Justice (CFJ), then a new statewide organization dedicated to building political power in poor communities, especially those of color. Given the high unemployment rates in just such communities, our response to Clinton's welfare reforms was to demand that those affected by them at least be offered state-funded jobs at a living wage. If the government was going to make people work for pay, we reasoned, then it should help provide real well-paying jobs, not bogus "job readiness" programs. We secured sponsors in the state legislature, but I'm sure you won't be shocked to learn that our billion-dollar jobs bill never got out of committee in Sacramento.

CFJ's project led me into an argument with one of my mentors, the founder of the Center for Third World Organizing, Gary Delgado. Why on earth, he asked me, would you campaign to get people jobs? "Jobs are horrible. They're boring: they waste people's lives and destroy their bodies." In other words, Gary was no believer in the inherent dignity of paid work. So, I had to ask myself, why was I?

Among those who have inspired me, Gary wasn't alone in holding such a low opinion of jobs. The Greek philosopher Aristotle, for instance, had been convinced that those whose economic condition forced them to work for a living would have neither the time nor space necessary to live a life of "excellence" (his requirement for human happiness). Economic coercion and a happy life were, in his view, mutually exclusive.

Reevaluating Jobs

One of the lies capitalism tells us is that we should be grateful for our jobs and should think of those who make a profit from our labor not as exploiters but as "job creators." In truth, however, there's no creativity involved in paying people less than the value of their work so that you can skim off the difference and claim that you earned it. Even if we accept that there could be creativity in "management" — the effort to organize and divide up work so it's done efficiently and well — it's not the "job creators" who do that, but their hirelings. All the employers bring to the game is money.

Take the example of the admirable liberal response to the climate emergency, the Green New Deal. In the moral calculus of capitalism, it's not enough that shifting to a green economy could promote the general welfare by rebuilding and extending the infrastructure that makes modern life possible and rewarding. It's not enough that it just might happen in time to save billions of people from fires, floods, hurricanes, or starvation. What matters — the selling point — is that such a conversion would create jobs (along with the factor no one mentions out loud: profits).

Now, I happen to support exactly the kind of work involved in building an economy that could help reverse climate devastation. I agree with Joe Biden's campaign statement that such an undertaking could offer people jobs with "good wages, benefits, and worker protections." More than that, such jobs would indeed contribute to a better life for those who do them. As the philosopher Iris Marion Young puts it, they would provide the chance to learn and use "satisfying and expansive skills in a socially recognized setting." And that would be a very good thing even if no one made a penny of profit in the process.

Now, having finished my paid labor for the day, it's back to the basement and loom for me.

Copyright 2021 Rebecca Gordon

Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.

How 4 years of Trump brought us closer to doomsday

If you live in California, you're likely to be consumed on occasion by thoughts of fire. That's not surprising, given that, last year alone, actual fires consumed over four and a quarter million acres of the state, taking with them 10,488 structures, 33 human lives, and who knows how many animals. By the end of this January, a month never before even considered part of the "fire" season, 10 wildfires had already burned through 2,337 more acres, according to the California Department of Forestry and Fire Protection (CalFire).

With each passing year, the state's fire season arrives earlier and does greater damage. In 2013, a mere eight years ago, fires consumed about 602,000 acres and started significantly later. That January, CalFire reported only a single fire, just two in February, and none in March. Fire season didn't really begin until April and had tapered off before year's end. This past December, however, 10 fires still burned at least 10,000 acres. In fact, it almost doesn't make sense to talk about a fire "season" anymore. Whatever the month, wildfires are likely to be burning somewhere in the state.

Clearly, California's fires (along with Oregon's and Washington's) are getting worse. Just as clearly, notwithstanding Donald Trump's exhortations to do a better job of "raking" our forests, climate change is the main cause of this growing disaster.

Fortunately, President Joe Biden seems to take the climate emergency seriously. In just his first two weeks in office, he's canceled the Keystone XL pipeline project, paused new leasing for oil and gas drilling on public lands, and announced a plan to convert the entire federal fleet of cars and trucks to electric vehicles. Perhaps most important of all, he's bringing the U.S. back into the Paris climate accords, signaling an understanding that a planetary crisis demands planetwide measures and that the largest carbon-emitting economies should be leading the way. "This isn't [the] time for small measures," Biden has said. "We need to be bold."

Let's just hope that such boldness has arrived in time and that the Biden administration proves unwilling to sacrifice the planet on an altar of elusive congressional unity and illusionary bipartisanship.

Another Kind of Fire

If climate change threatens human life as we know it, so does another potential form of "fire" — the awesome power created when a nuclear reaction converts matter to energy. This is the magic of Einstein's observation that E = mc², which says that the energy contained in a bit of matter equals its mass (roughly speaking, its weight) multiplied by the square of the speed of light. As we've all known since August 6, 1945, when an atomic bomb was dropped on the Japanese city of Hiroshima, that's an awful lot of energy. When a nuclear reaction is successfully controlled, the energy can be regulated and used to produce electricity without emitting carbon dioxide in the process.
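Just to put a rough number on that (a back-of-the-envelope illustration of my own, using nothing but the formula above): converting a single gram of matter entirely into energy would release

\[
E = mc^{2} = (10^{-3}\,\text{kg}) \times (3.0 \times 10^{8}\,\text{m/s})^{2} = 9 \times 10^{13}\ \text{joules},
\]

which is on the order of 20 kilotons of TNT equivalent, more energy than the Hiroshima bomb released.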

Unfortunately, while nuclear power plants don't add greenhouse gases to the atmosphere, they do create radioactive waste, some of which remains deadly for thousands of years. Industry advocates who argue for nuclear power as a "green" alternative generally ignore the still-unsolved problem of disposing of that waste.

In what is hopefully just a holdover from the Trump administration, the Energy Department website still "addresses" this issue by suggesting that all the nuclear waste produced to date "could fit on a football field at a depth of less than 10 yards!" The site neglects to add that packing that much highly radioactive material together the wrong way would risk an uncontrolled chain reaction, with catastrophic results.
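For what it's worth, the volume that football-field image implies is easy enough to check (my own arithmetic, assuming a standard field with end zones, roughly 360 by 160 feet, filled to the full 10-yard depth):

\[
V \approx 360\,\text{ft} \times 160\,\text{ft} \times 30\,\text{ft} = 57{,}600\,\text{ft}^{2} \times 30\,\text{ft} \approx 1.7 \times 10^{6}\ \text{cubic feet}.
\]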

Remember, too, that "controlled" nuclear reactions don't always remain under human control. Ask anyone who lived near the Three Mile Island nuclear reactor in Pennsylvania, the Chernobyl nuclear power plant in Ukraine, or the Fukushima Daiichi nuclear power plant in Japan.

There is, however, another far more devastating kind of nuclear reaction, the deliberately uncontrolled one unleashed when a nuclear bomb explodes. Only one country has ever deployed atomic weapons in war, of course: the United States, in its attack on Hiroshima and, three days later, on Nagasaki. Those early fission bombs (uranium-based at Hiroshima, plutonium-based at Nagasaki) were puny by the standards of today's nuclear weapons. Still, the horror of those attacks was sufficient to convince many that such weapons should never be used again.

Treaties and Entreaties

In the decades since 1945, various configurations of nations have agreed to treaties prohibiting the use of, or limiting the proliferation of, nuclear weapons — even as the weaponry spread and nuclear arsenals grew. In the Cold War decades, the most significant of these were the bilateral pacts between the two superpowers of the era, the U.S. and the Soviet Union. When the latter collapsed in 1991, Washington signed treaties instead with the Russian Federation government, the most recent being the New START treaty, which came into effect in 2011 and was just extended by Joe Biden and Vladimir Putin.

In addition to such bilateral agreements, the majority of nations on the planet agreed on various multilateral pacts, including the Nuclear Non-Proliferation Treaty, or NPT, which has been signed by 191 countries and has provided a fairly effective mechanism for limiting the spread of such arms. Today, there are still "only" nine nuclear-armed states. Of these, just five (China, France, Russia, the United Kingdom, and the United States) are parties to the treaty and openly admit to possessing such weaponry. Israel, which never signed the pact, has also never publicly acknowledged its growing nuclear arsenal. India and Pakistan have never signed the treaty either, while North Korea joined it only to withdraw in 2003. Worse yet, in 2005, the George W. Bush administration inked a side-deal with India that gave Washington's blessing to the acceleration of that country's nuclear weapons development program outside the monitoring constraints of the NPT.

The treaty assigns to the International Atomic Energy Agency (IAEA) the authority to monitor compliance. It was this treaty, for example, that gave the IAEA the right to inspect Iraq's nuclear program in the period before the U.S. invaded in 2003. Indeed, the IAEA repeatedly reported that Iraq was, in fact, in compliance with the treaty in the months that preceded the invasion, despite the claims of the Bush administration that Iraqi ruler Saddam Hussein had such weaponry. The United States must act, President Bush insisted then, before the "smoking gun" of proof the world demanded turned out to be a "mushroom cloud" over some American city. As became clear after the first few months of the disastrous U.S. military occupation, there simply were no weapons of mass destruction in Iraq. (At least partly in recognition of the IAEA's attempts to forestall that U.S. invasion, the agency and its director general, Mohamed ElBaradei, would receive the 2005 Nobel Peace Prize.)

Like Iraq, Iran also signed the NPT in 1968, laying the foundation for ongoing IAEA inspections there. In recent years, having devastated Iraq's social, economic, and political infrastructure, the United States shifted its concern about nuclear proliferation to Iran. In 2015, along with China, Russia, France, the United Kingdom, Germany, and the European Union, the Obama administration signed the Joint Comprehensive Plan of Action (JCPOA), informally known as the Iran nuclear deal.

Under the JCPOA, in return for the lifting of onerous economic sanctions that were affecting the whole population, Iran agreed to limit the development of its nuclear capacity to the level needed to produce electricity. Again, IAEA scientists would be responsible for monitoring the country's compliance, which by all accounts was more than satisfactory — at least until 2018. That's when President Donald Trump unilaterally pulled the U.S. out of the agreement and reimposed heavy sanctions. Since then, as its economy has been crushed, Iran has, understandably enough, grown increasingly reluctant to uphold its end of the bargain.

In the years since 1945, the world has seen treaties signed to limit or ban the testing of nuclear weapons or to cap the size of nuclear arsenals, as well as bilateral treaties to decommission parts of existing ones, but never a treaty aimed at outlawing nuclear weapons altogether. Until now. On January 22, 2021, the United Nations Treaty on the Prohibition of Nuclear Weapons took effect. Signed so far by 86 countries, the treaty represents "a legally binding instrument to prohibit nuclear weapons, leading towards their total elimination," according to the U.N. Sadly, but unsurprisingly, none of the nine nuclear powers are signatories.

"Fire and Fury"

I last wrote about nuclear danger in October 2017 when Donald Trump had been in the White House less than a year and, along with much of the world, I was worried that he might bungle his way into a war with North Korea. Back then, he and Kim Jong-un had yet to fall in love or to suffer their later public breakup. Kim was still "Little Rocket Man" to Trump, who had threatened to rain "fire and fury like the world has never seen" on North Korea.

The world did, in the end, survive four years of a Trump presidency without a nuclear war, but that doesn't mean he left us any safer. On the contrary, he took a whole series of rash steps leading us closer to nuclear disaster:

  • He pulled the U.S. out of the JCPOA, thereby destabilizing the Iran nuclear agreement and reigniting Iran's threats of, and apparent steps toward, someday developing nuclear weapons.
  • He withdrew from the 1987 Intermediate-Range Nuclear Forces (INF) Treaty between the U.S. and the Soviet Union (later the Russian Federation), which, according to the nonpartisan Arms Control Association,
"required the United States and the Soviet Union to eliminate and permanently forswear all of their nuclear and conventional ground-launched ballistic and cruise missiles with ranges of 500 to 5,500 kilometers. The treaty marked the first time the superpowers had agreed to reduce their nuclear arsenals, eliminate an entire category of nuclear weapons, and employ extensive on-site inspections for verification."
  • He withdrew from the Open Skies Treaty, which gave signatories permission to fly over each other's territories to identify military installations and activities. Allowing this kind of access was meant to contribute to greater trust among nuclear-armed nations.
  • He threatened to allow the New START Treaty to expire, should he be reelected.
  • He presided over a huge increase in spending on the "modernization" of the U.S. nuclear arsenal, including on new submarine- and land-based launching capabilities. A number of these programs are still in their initial stages and could be stopped by the Biden administration.

In January 2021, after four years of Trump, the Bulletin of the Atomic Scientists reset its "Doomsday Clock," keeping its hands at a mere 100 seconds to midnight, the closest they have ever stood to catastrophe. Since 1947, that Clock's periodic resetting has reflected how close, in the view of the Bulletin's esteemed scientists and Nobel laureates, humanity has come to ending it all. As the Bulletin's editors note, "The Clock has become a universally recognized indicator of the world's vulnerability to catastrophe from nuclear weapons, climate change, and disruptive technologies in other domains."

Why so close to midnight? The magazine lists a number of reasons, including the increased danger of nuclear war, due in large part to steps taken by the United States in the Trump years, as well as to the development of "hypersonic" missiles, which are supposed to fly at five times the speed of sound and so evade existing detection systems. (Trump famously referred to these "super-duper" weapons as "hydrosonic," a term that actually describes a kind of toothbrush.) There is disagreement among weapons experts about the extent to which such delivery vehicles will live up to the (hyper) hype about them, but the effort to build them is destabilizing in its own right.

The Bulletin points to a number of other factors that place humanity in ever greater danger. One is, of course, the existential threat of climate change. Another is the widespread dissemination of "false and misleading information." The spread of lies about Covid-19, its editors say, exemplifies the life-threatening nature of a growing "wanton disregard for science and the large-scale embrace of conspiratorial nonsense." This is, they note, "often driven by political figures and partisan media." Such attacks on knowledge itself have "undermined the ability of responsible national and global leaders to protect the security of their citizens."

Passing the (Nuclear) Ball

When Donald Trump announced that he wouldn't attend the inauguration of Joe Biden and Kamala Harris, few people were surprised. After all, he was still insisting that he'd actually won the election, even after that big lie fueled an insurrectionary invasion of the Capitol. But there was another reason for concern: if Trump was going to be at Mar-a-Lago, how would he hand over the "nuclear football" to the new president? That "football" is, in fact, a briefcase containing the equipment and codes a president would need to order a nuclear launch, and it travels with the president at all times. For decades, it has been passed from the outgoing president to the incoming one on Inauguration Day.

Consternation! The problem was resolved through the use of two briefcases, which were simultaneously deactivated and activated at 11:59:59 a.m. on January 20th, just as Biden was about to be sworn in.

The football conundrum pointed to a far more serious problem, however — that the fate of humanity regularly hangs on the actions of a single individual (whether as unbalanced as Donald Trump or as apparently sensible as Joe Biden) who has the power to begin a war that could end our species.

There's good reason to think that Joe Biden will be more reasonable about the dangers of nuclear warfare than the narcissistic idiot he succeeds. In addition to agreeing to extend the New START treaty, he's also indicated a willingness to rejoin the Iran nuclear deal and criticized Trump's nuclear buildup. Nevertheless, the power to end the world shouldn't lie with one individual. Congress could address this problem, by (as I suggested in 2017) enacting "a law that would require a unanimous decision by a specified group of people (for example, officials like the secretaries of state and defense together with the congressional leadership) for a nuclear first strike."

The Fire Next Time?

"God gave Noah the rainbow sign
No more water but the fire next time"

These words come from the African-American spiritual "I Got a Home in that Rock." The verse refers to God's promise to Noah in Genesis, after the great flood, never again to destroy all life on earth, a promise signified by the rainbow.

Those who composed the hymn may have been a bit less trusting of God — or of human destiny — than the authors of Genesis, since the Bible account says nothing about fire or a next time. Sadly, recent human history suggests that there could indeed be a next time. If we do succeed in destroying ourselves, it seems increasingly likely that it will be by fire, whether the accelerating heating of the globe over decades, or a nuclear conflagration any time we choose. The good news, the flame of hope, is that we still have time — at least 100 seconds — to prevent it.

Copyright 2021 Rebecca Gordon



The signs that our American empire is crumbling

How can you tell when your empire is crumbling? Some signs are actually visible from my own front window here in San Francisco.

Directly across the street, I can see a collection of tarps and poles (along with one of my own garbage cans) that were used to construct a makeshift home on the sidewalk. Beside that edifice stands a wooden cross decorated with a string of white Christmas lights and a red ribbon — a memorial to the woman who built that structure and died inside it earlier this week. We don't know — and probably never will — what killed her: the pandemic raging across California? A heart attack? An overdose of heroin or fentanyl?

Behind her home and similar ones is a chain-link fence surrounding the empty playground of the Horace Mann/Buena Vista elementary and middle school. Like that home, the school, too, is now empty, closed because of the pandemic. I don't know where the families of the 20 children who attended that school and lived in one of its gyms as an alternative to the streets have gone. They used to eat breakfast and dinner there every day, served on the same sidewalk by a pair of older Latina women who apparently had a contract from the school district to cook for the families using that school-cum-shelter. I don't know, either, what any of them are now doing for money or food.

Just down the block, I can see the line of people that has formed every weekday since early December. Masked and socially distanced, they wait patiently to cross the street, one at a time, for a Covid test at a center run by the San Francisco Department of Health. My little street seems an odd choice for such a service, since — especially now that the school has closed — it gets little foot traffic. Indeed, our neighborhood paper Mission Local quoted a representative of the Latino Task Force, an organization created to inform the city's Latinx population about Covid resources:

"Small public health clinics such as this one 'will say they want to do more outreach, but I actually think they don't want to.' He believes they chose a low-trafficked street like Bartlett to stay under the radar. 'They don't want to blow the spot up, because it does not have a large capacity.'"

What do any of these very local sights have to do with a crumbling empire? They're signs that some of the same factors that fractured the Roman empire back in 476 CE (and others since) are distinctly present in this country today — even in California, one of its richest states. I'm talking about phenomena like gross economic inequality; over-spending on military expansion; political corruption; deep cultural and political fissures; and, oh yes, the barbarians at the gates. I'll turn to those factors in a moment, but first let me offer a brief defense of the very suggestion that U.S. imperialism and an American empire actually exist.

Imperialism? What's That Supposed to Mean?

What better source for a definition of imperialism than the Encyclopedia Britannica, that compendium of knowledge first printed in 1768 in the country that became the great empire of the nineteenth and first part of the twentieth centuries? According to the Encyclopedia, "imperialism" denotes "state policy, practice, or advocacy of extending power and dominion, especially by direct territorial acquisition or by gaining political and economic control of other areas." Furthermore, imperialism "always involves the use of power, whether military or economic or some subtler form." In other words, the word indicates a country's attempts to control and reap economic benefit from lands outside its borders.

In that context, "imperialism" is an accurate description of the trajectory of U.S. history, starting with the country's expansion across North America, stealing territory and resources from Indian nations and decimating their populations. The newly independent United States would quickly expand, beginning with the 1803 Louisiana Purchase from France. That deal, which effectively doubled its territory, included most of what would become the state of Louisiana, together with some or all of the present-day states of New Mexico, Texas, Arkansas, Missouri, Oklahoma, Kansas, Colorado, Iowa, Nebraska, Wyoming, Minnesota, North and South Dakota, Montana, and even small parts of what are today the Canadian provinces of Alberta and Saskatchewan.


Eventually, such expansionism escaped even those continental borders, as the country went on to gobble up the Philippines, Hawaii, the Panama Canal Zone, the Virgin Islands, Puerto Rico, Guam, American Samoa, and the Mariana Islands, the last five of which remain U.S. territories to this day. (Inhabitants of the nation's capital, where I grew up, were only partly right when we used to refer to Washington, D.C., as "the last colony.")

Of course, France didn't actually control most of that land, apart from the port city of New Orleans and its immediate environs. What Washington bought was the "right" to take the rest of that vast area from the native peoples who lived there, whether by treaty, population transfers, or wars of conquest and extermination. The first objective of that deal was to settle land on which to expand the already hugely lucrative cotton business, that economic engine of early American history fueled, of course, by slave labor. It then supplied raw materials to the rapidly industrializing textile industry of England, which drove that country's own imperial expansion.

U.S. territorial expansion continued as, in 1819, Florida was acquired from Spain and, in 1845, Texas, still claimed by Mexico, was annexed (with California and much of the rest of the Southwest seized from Mexico in the war that followed). All of those acquisitions accorded with what newspaper editor John O'Sullivan would soon call the country's manifest — that is, clear and obvious — destiny to control the entire continent.

American Doctrines from Monroe to Truman to (G.W.) Bush

U.S. economic, military, and political influence has long extended far beyond those internationally recognized possessions and various presidents have enunciated a series of "doctrines" to legitimate such an imperial reach.

Monroe: The first of these was the Monroe Doctrine, introduced in 1823 in President James Monroe's penultimate State of the Union address. He warned the nations of Europe that, while the United States recognized existing colonial possessions in the Americas, it would not permit the establishment of any new ones.

President Teddy Roosevelt would later add a corollary to Monroe's doctrine by establishing Washington's right to intercede in any country in the Americas that, in the view of its leaders, was not being properly run. "Chronic wrongdoing," he said in a 1904 message to Congress, "may in America, as elsewhere, ultimately require intervention by some civilized nation." The United States, he suggested, might find itself forced, "however reluctantly, in flagrant cases of such wrongdoing or impotence, to the exercise of an international police power." In the first quarter of the twentieth century, that Roosevelt Corollary would be used to justify U.S. occupations of Cuba, the Dominican Republic, Haiti, and Nicaragua.

Truman: Teddy's cousin, President Franklin D. Roosevelt, publicly renounced the Monroe Doctrine and promised a hands-off attitude towards Latin America, which came to be known as the Good Neighbor Policy. It didn't last long, however. In a 1947 address to Congress, the next president, Harry S. Truman, laid out what came to be known as the Truman Doctrine, which would underlie the country's foreign policy at least until the collapse of the Soviet Union in 1991. It held that U.S. national security interests required the "containment" of existing Communist states and the prevention of the further spread of Communism anywhere on Earth.

It almost immediately led to interventions in the internal struggles of Greece and Turkey and would eventually underpin Washington's support for dictators and repressive regimes from El Salvador to Indonesia. It would justify U.S.-backed coups in places like Iran, Guatemala, and Chile. It would lead this country into a futile war in Korea and a disastrous defeat in Vietnam.

That post-World War II turn to anticommunism would be accompanied by a new kind of colonialism. Rather than directly annexing territories to extract cheap labor and cheaper natural resources, under this new "neocolonial" model, the United States — and soon the great multilateral institutions of the post-war era, the World Bank and the International Monetary Fund — would gain control over the economies of poor nations. In return for aid — or loans often pocketed by local elites and repaid by the poor — those nations would accede to demands for the "structural adjustment" of their economic systems: the privatization of public services like water and utilities (usually taken over by American or multinational corporations) and the defunding of human services like health and education. Such "adjustments," in turn, allowed the recipient nations to service the loans, extracting scarce hard currency from already deeply impoverished countries.

Bush: You might have thought that the fall of the Soviet empire and the end of the Cold War would have provided Washington with an opportunity to step away from resource extraction and the seemingly endless military and CIA interventions that accompanied it. You might have imagined that the country then being referred to as the "last superpower" would finally consider establishing new and different relationships with the other countries on this little planet of ours. However, just in time to prevent even the faint possibility of any such conversion came the terrorist attacks of 9/11, which gave President George W. Bush the chance to promote his very own doctrine.

In a break from postwar multilateralism, the Bush Doctrine outlined the neoconservative belief that, as the only superpower in a now supposedly "unipolar" world, the United States had the right to take unilateral military action any time it believed it faced external threat of any imaginable sort. The result: almost 20 years of disastrous "forever wars" and a military-industrial complex deeply embedded in our national economy. Although Donald Trump's foreign policy occasionally feinted in the direction of isolationism in its rejection of international treaties, protocols, and organizational responsibilities, it still proved itself a direct descendant of the Bush Doctrine. After all, it was Bush who first took the United States out of the Anti-Ballistic Missile Treaty and rejected the Kyoto Protocol to fight climate change.

His doctrine instantly set the stage for the disastrous invasion and occupation of Afghanistan, the even more disastrous Iraq War, and the present-day over-expansion of the U.S. military presence, overt and covert, in practically every corner of the world. And now, to fulfill Donald Trump's Star Trek fantasies, even in outer space.

An Empire in Decay

If you need proof that the last superpower, our very own empire, is indeed crumbling, consider the year we've just lived through, not to mention the first few weeks of 2021. I mentioned above some of the factors that contributed to the collapse of the famed Roman empire in the fifth century. It's fair to say that some of those same things are now evident in twenty-first-century America. Here are four obvious candidates:

Grotesque Economic Inequality: Ever since President Ronald Reagan began the Republican Party's long war on unions and working people, economic inequality has steadily increased in this country, punctuated by terrible shocks like the Great Recession of 2007-2008 and, of course, by the Covid-19 disaster. We've seen 40 years of tax reductions for the wealthy, stagnant wages for the rest of us (including a federal minimum wage that hasn't changed since 2009), and attacks on programs like TANF (welfare) and SNAP (food stamps) that literally keep poor people alive.

The Romans relied on slave labor for basics like food and clothing. This country relies on super-exploited farm and food-factory workers, many of whom are unlikely to demand more or better because they came here without authorization. Our (extraordinarily cheap) clothes are mostly produced by exploited people in other countries.

The pandemic has only exposed what so many people already knew: that the lives of the millions of working poor in this country are growing ever more precarious and desperate. The gulf between rich and poor widens by the day to unprecedented levels. Indeed, as millions have descended into poverty since the pandemic began, the Guardian reports that this country's 651 billionaires have increased their collective wealth by $1.1 trillion. That's more than the $900 billion Congress appropriated for pandemic aid in the omnibus spending bill it passed at the end of December 2020.

An economy like ours, which depends so heavily on consumer spending, cannot survive the deep impoverishment of so many people. Those 651 billionaires are not going to buy enough toys to dig us out of this hole.

Wild Overspending on the Military: At the end of 2020, Congress overrode Trump's veto of the annual National Defense Authorization Act, which authorized a stunning $741 billion for the military this fiscal year. (That veto, by the way, wasn't in response to the vast sums being appropriated in the midst of a devastating pandemic, but to the bill's provisions for renaming military bases currently honoring Confederate generals, among other extraneous things.) Around the same time, Congress passed that omnibus pandemic spending bill, which included the $696 billion appropriation that actually funds the Defense Department for the fiscal year.

All that money for "security" might be justified, if it actually made our lives more secure. In fact, our federal priorities virtually take food out of the mouths of children to feed the maw of the military-industrial complex and the never-ending wars that go with it. Even before the pandemic, more than 10% of U.S. families regularly experienced food insecurity. Now, it's a quarter of the population.

Corruption So Deep It Undermines the Political System: Suffice it to say that the man who came to Washington promising to "drain the swamp" has presided over one of the most corrupt administrations in U.S. history. Whether it's been blatant self-dealing (like funneling government money to his own businesses); employing government resources to forward his reelection (including using the White House as a staging ground for parts of the Republican National Convention and his acceptance speech); tolerating corrupt subordinates like Secretary of Commerce Wilbur Ross; or contemplating a self-pardon, the Trump administration has set the bar high indeed for any future aspirants to the title of "most corrupt president."

One problem with such corruption is that it undermines the legitimacy of government in the minds of the governed. It makes citizens less willing to obey laws, pay taxes, or act for the common good by, for example, wearing masks and socially distancing during a pandemic. It rips apart social cohesion from top to bottom.

Of course, Trump's most dangerous corrupt behavior — one in which he's been joined by the most prominent elected and appointed members of his government and much of his party — has been his campaign to reject the results of the 2020 general election. The concerted and cynical promotion of the big lie that the Democrats stole that election has so corrupted faith in the legitimacy of government that up to 68% of Republicans now believe the vote was rigged to elect Joe Biden. At "best," Trump has set the stage for increased Republican suppression of the vote in communities of color. At worst, he has so poisoned the electoral process that a substantial minority of Americans will never again accept as free and fair an election in which their candidate loses.

A Country in Ever-Deepening Conflict: White supremacy has infected the entire history of this country, beginning with the near-extermination of its native peoples. The Constitution, while guaranteeing many rights to white men, proceeded to codify the enslavement of Africans and their descendants. In order to maintain that enslavement, the southern states seceded and fought a civil war. After a short-lived period of Reconstruction in which Black men were briefly enfranchised, white supremacy regained direct legal control in the South, and flourished in a de facto fashion in the rest of the country.

In 1858, two years before that civil war began, Abraham Lincoln addressed the Illinois Republican State Convention, reminding those present that

"'A house divided against itself cannot stand.' I believe this government cannot endure, permanently half slave and half free. I do not expect the Union to be dissolved — I do not expect the house to fall – but I do expect it will cease to be divided. It will become all one thing, or all the other."

More than 160 years later, the United States clearly not only remains but has become ever more divided. If you doubt that the Civil War is still being fought today, look no farther than the Confederate battle flags proudly displayed by members of the insurrectionary mob that overran the Capitol on January 6th.

Oh, and the barbarians? They are not just at the gate; they have literally breached it, as we saw in Washington when they burst through the doors and windows of the center of government.

Building a Country From the Rubble of Empire

Human beings have long built new habitations quite literally from the rubble — the fallen stones and timbers — of earlier ones. Perhaps it's time to think about what kind of a country this place — so rich in natural resources and human resourcefulness — might become if we were to take the stones and timbers of empire and construct a nation dedicated to the genuine security of all its people. Suppose we really chose, in the words of the preamble to the Constitution, "to promote the general welfare, and to secure the blessings of liberty to ourselves and our posterity."

Suppose we found a way to convert the desperate hunger for ever more, which is both the fuel of empires and the engine of their eventual destruction, into a new contentment with "enough"? What would a United States whose people have enough look like? It would not be one in which a tiny number of the staggeringly wealthy made hundreds of billions more dollars and the country's military-industrial complex thrived in a pandemic, while so many others went down in disaster.

This empire will fall sooner or later. They all do. So, this crisis, just at the start of the Biden and Harris years, is a fine time to begin thinking about what might be built in its place. What would any of us like to see from our front windows next year?

Copyright 2021 Rebecca Gordon



Trump's broken promise shows how the American empire is rotting from within

It was the end of October 2001. Two friends, Max Elbaum and Bob Wing, had just dropped by. (Yes, children, believe it or not, people used to drop in on each other, maskless, once upon a time.) They had come to hang out with my partner Jan Adams and me. Among other things, Max wanted to get some instructions from fellow-runner Jan about taping his foot to ease the pain of plantar fasciitis. But it soon became clear that he and Bob had a bigger agenda for the evening. They were eager to recruit us for a new project.

And so began War Times/Tiempo de Guerras, a free, bilingual, antiwar tabloid that, at its height, distributed 100,000 copies every six weeks to more than 700 antiwar organizations around the country. It was already clear to the four of us that night -- as it was to millions around the world -- that the terrorist attacks of September 11th would provide the pretext for a major new projection of U.S. military power globally, opening the way to a new era of "all-war-all-the-time." War Times was a project of its moment (although the name would still be apt today, given that those wars have never ended). It would be superseded in a few years by the explosive growth of the Internet and the 24-hour news cycle. Still, it represented an early effort to fill the space where a peace movement would eventually develop.

All-War-All-the-Time -- For Some of Us

We were certainly right that the United States had entered a period of all-war-all-the-time. It's probably hard for people born since 9/11 to imagine how much -- and how little -- things changed after September 2001. By the end of that month, this country had already launched a "war" on an enemy that then-Secretary of Defense Donald Rumsfeld told us was "not just in Afghanistan," but in "50 or 60 countries, and it simply has to be liquidated."

Five years and two never-ending wars later, he characterized what was then called the war on terror as "a generational conflict akin to the Cold War, the kind of struggle that might last decades as allies work to root out terrorists across the globe and battle extremists who want to rule the world." A generation later, it looks like Rumsfeld was right, if not about the desires of the global enemy, then about the duration of the struggle.

Here in the United States, however, we quickly got used to being "at war." In the first few months, interstate bus and train travelers often encountered (and, in airports, still encounter) a new and absurd kind of "security theater." I'm referring to those long, snaking lines in which people first learned to remove their belts and coats, later their hats and shoes, as ever newer articles of clothing were recognized as potential hiding places for explosives. Fortunately, the arrest of the Underwear Bomber never led the Transportation Security Administration to the obvious conclusion about the clothing travelers should have to remove next. We got used to putting our three-ounce containers of liquids (No more!) into quart-sized baggies (No bigger! No smaller!).

It was all-war-all-the-time, but mainly in those airports. Once the shooting wars started dragging on, if you didn't travel by airplane much or weren't deployed to Afghanistan or Iraq, it was hard to remember that we were still in war time at all. There were continuing clues for those who wanted to know, like the revelations of CIA torture practices at "black sites" around the world, the horrors of military prisons like the ones at Bagram Air Base in Afghanistan, Abu Ghraib in Baghdad, and the still-functioning prison complex at Guantánamo Bay, Cuba. And soon enough, of course, there were the hundreds and then thousands of veterans of the Iraq and Afghan wars taking their places among the unhoused veterans of earlier wars in cities across the United States, almost unremarked upon, except by service organizations.

So, yes, the wars dragged on at great expense, but with little apparent effect in this country. They even gained new names like "the long war" (as Donald Trump's Secretary of Defense James Mattis put it in 2017) or the "forever wars," a phrase now so common that it appears all over the place. But apart from devouring at least $6.4 trillion through September 2020 that might otherwise have been invested domestically in healthcare, education, infrastructure, or addressing poverty and inequality, apart from creating increasingly militarized domestic police forces armed ever more lethally by the Pentagon, those forever wars had little obvious effect on the lives of most Americans.

Of course, if you happened to live in one of the places where this country has been fighting for the last 19 years, things are a little different. A conservative estimate by Iraq Body Count puts violent deaths among civilians in that country alone at 185,454 to 208,493 and Brown University's Costs of War project points out that even the larger figure is bound to be a significant undercount:

Several times as many Iraqi civilians may have died as an indirect result of the war, due to damage to the systems that provide food, health care, and clean drinking water, and as a result, illness, infectious diseases, and malnutrition that could otherwise have been avoided or treated.

And that's just Iraq. Again, according to the Costs of War Project, "At least 800,000 people have been killed by direct war violence in Iraq, Afghanistan, Syria, Yemen, and Pakistan."

Of course, many more people than that have been injured or disabled. And America's post-9/11 wars have driven an estimated 37 million people from their homes, creating the greatest human displacement since World War II. People in this country are rightly concerned about the negative effects of online schooling on American children amid the ongoing Covid-19 crisis (especially poor children and those in communities of color). Imagine, then, the effects on a child's education of losing her home and her country, as well as one or both parents, and then growing up constantly on the move or in an overcrowded, under-resourced refugee camp. The war on terror has truly become a war of generations.

Every one of the 2,977 lives lost on 9/11 was unique and invaluable. But the U.S. response has been grotesquely disproportionate -- and worse than we War Times founders could have imagined that October night so many years ago.

Those wars of ours have gone on for almost two decades now. Each new metastasis has been justified by George W. Bush's and then Barack Obama's use of the now ancient 2001 Authorization for the Use of Military Force (AUMF), which Congress passed in the days after 9/11. Its language actually limited presidential military action to a direct response to the 9/11 attacks and the prevention of future attacks by the same actors. It stated that the president

...is authorized to use all necessary and appropriate force against those nations, organizations, or persons he determines planned, authorized, committed, or aided the terrorist attacks that occurred on September 11, 2001, or harbored such organizations or persons, in order to prevent any future acts of international terrorism against the United States by such nations, organizations or persons.

Despite that AUMF's limited scope, successive presidents have used it to justify military action in at least 18 countries. (To be fair, President Obama realized the absurdity of his situation when he sent U.S. troops to Syria and tried to wring a new authorization out of Congress, only to be stymied by a Republican majority that wouldn't play along.)

In 2002, in the run-up to the Iraq War, Congress passed a second AUMF, which permitted the president to use the armed forces as "necessary and appropriate" to "defend U.S. national security against the continuing threat posed by Iraq." In January 2020, Donald Trump used that second authorization to justify the murder by drone of Qasem Soleimani, an Iranian general, along with nine other people.

Trump Steps In

In 2016, peace activists were preparing to confront a Hillary Clinton administration that we expected would continue Obama's version of the forever wars -- the "surge" in Afghanistan, the drone assassination campaigns, the special ops in Africa. But on Tuesday, November 8, 2016, something went "Trump" in the night and Donald J. Trump took over the presidency with a promise to end this country's forever wars, which he had criticized relentlessly during his campaign. That, of course, didn't mean we should have expected a peace dividend anytime soon. He was also committed to rebuilding a supposedly "depleted" U.S. military. As he said at a 2019 press conference,

When I took over, it was a mess... One of our generals came in to see me and he said, 'Sir, we don't have ammunition.' I said, 'That's a terrible thing you just said.' He said, 'We don't have ammunition.' Now we have more ammunition than we've ever had.

It's highly unlikely that the military couldn't afford to buy enough bullets when Trump entered the Oval Office, given that publicly acknowledged defense funding was then running at $580 billion a year. He did, however, manage to push that figure to $713 billion by fiscal year 2020. That December, he threatened to veto an even larger appropriation for 2021 -- $740 billion -- but only because he wanted the military to continue to honor Confederate generals by keeping their names on military bases. Oh, and because he thought the bill should also change liability rules for social media companies, an issue you don't normally expect to see addressed in a defense appropriations bill. And, in any case, Congress passed the bill with a veto-proof majority.

As Pentagon expert Michael Klare pointed out recently, while it might seem contradictory that Trump would both want to end the forever wars and to increase military spending, his actions actually made a certain sense. The president, suggested Klare, had been persuaded to support the part of the U.S. military command that has favored a sharp pivot away from reigning post-9/11 Pentagon practices. For 19 years, the military high command had hewed fairly closely to the strategy laid out by Secretary of Defense Donald Rumsfeld early in the Bush years: maintaining the capacity to fight ground wars against one or two regional powers (think of that "Axis of Evil" of Iraq, North Korea, and Iran), while deploying agile, technologically advanced forces in low-intensity (and a couple of higher-intensity) counterterrorism conflicts. Nineteen years later, whatever its objectives may have been -- a more-stable Middle East? Fewer and weaker terrorist organizations? -- it's clear that the Rumsfeld-Bush strategy has failed spectacularly.

Klare points out that, after almost two decades without a victory, the Pentagon has largely decided to demote international terrorism from rampaging monster to annoying mosquito cloud. Instead, the U.S. must now prepare to confront the rise of China and Russia, even if China has only one overseas military base and Russia, economically speaking, is a rickety petro-state with imperial aspirations. In other words, the U.S. must prepare to fight short but devastating wars in multiple domains (including space and cyberspace), perhaps even involving the use of tactical nuclear weapons on the Eurasian continent. To this end, the country has indeed begun a major renovation of its nuclear arsenal and announced a new 30-year plan to beef up its naval capacity. And President Trump rarely misses a chance to tout "his" creation of a new Space Force.

Meanwhile, did he actually keep his promise and at least end those forever wars? Not really. He did promise to bring all U.S. troops home from Afghanistan by Christmas, but acting Defense Secretary Christopher Miller only recently said that we'd be leaving about 2,500 troops there and a similar number in Iraq, with the hope that they'd all be out by May 2021. (In other words, he dumped those wars in the lap of the future Biden administration.)

In the meantime, in these years of "ending" those wars, the Trump administration actually loosened the rules of engagement for air strikes in Afghanistan, leading to a "massive increase in civilian casualties," according to a new report from the Costs of War Project. "From the last year of the Obama administration to the last full year of recorded data during the Trump administration," writes its author, Neta Crawford, "the number of civilians killed by U.S.-led airstrikes in Afghanistan increased by 330 percent."

In spite of his isolationist "America First" rhetoric, in other words, President Trump has presided over an enormous buildup of an institution, the military-industrial complex, that was hardly in need of major new investment. And in spite of his anti-NATO rhetoric, his reduction by almost a third of U.S. troop strength in Germany, and all the rest, he never really violated the post-World War II foreign policy pact between the Republican and Democratic parties. Regardless of how they might disagree about dividing the wealth domestically, they remain united in their commitment to using diplomacy when possible, but military force when necessary, to maintain and expand the imperial power that they believed to be the guarantor of that wealth.

And Now Comes Joe

On January 20, 2021, Joe Biden will become the president of a country that spends as much on its armed forces, by some counts, as the next 10 countries combined. He'll inherit responsibility for a nation with a military presence in 150 countries and special-operations deployments in 22 African nations alone. He'll be left to oversee the still-unfinished, deeply unsuccessful, never-ending war on terror in Iraq, Syria, Afghanistan, Yemen, and Somalia and, as publicly reported by the Department of Defense, 187,000 troops stationed outside the United States.

Nothing in Joe Biden's history suggests that he or any of the people he's already appointed to his national security team have the slightest inclination to destabilize that Democratic-Republican imperial pact. But empires are not sustained by inclination alone. They don't last forever. They overextend themselves. They rot from within.

If you're old enough, you may remember stories about the long lines for food in the crumbling Soviet Union, that other superpower of the Cold War. You can see the same thing in the United States today. Once a week, my partner delivers food boxes to hungry people in our city, those who have lost their jobs and homes, because the pandemic has only exacerbated this country's already brutal version of economic inequality. Another friend routinely sees a food line stretching over a mile, as people wait hours for a single free bag of groceries.

Perhaps the horrors of 2020 -- the fires and hurricanes, Trump's vicious attacks on democracy, the death, sickness, and economic dislocation caused by Covid-19 -- can force a real conversation about national security in 2021. Maybe this time we can finally ask whether trying to prop up a dying empire actually makes us -- or indeed the world -- any safer. This is the best chance in a generation to start that conversation. The alternative is to keep trudging mindlessly toward disaster.



Copyright 2020 Rebecca Gordon

Trump's through-the-looking-glass presidency has turned the country inside out

In the chaos of this moment, it seems likely that Joe Biden will just squeeze into the presidency and that he'll certainly win the popular vote, Donald Trump's Mussolini-like behavior and election night false claim of victory notwithstanding. Somehow, it all brings another moment in my life to mind.

Back in October 2016, my friends and I frequently discussed the challenges progressives would face if the candidate we expected to win actually entered the Oval Office. There were so many issues to worry about back then. The Democratic candidate was an enthusiastic booster of the U.S. armed forces and believed in projecting American power through its military presence around the world. Then there was that long record of promoting harsh sentencing laws and the disturbing talk about "the kinds of kids that are called superpredators -- no conscience, no empathy."

In 2016, the country was already riven by deep economic inequality. While Hillary Clinton promised "good-paying jobs" for those struggling to stay housed and buy food, we didn't believe it. We'd heard the same promises so many times before, and yet the federal minimum wage was still stuck where it had been ever since 2009, at $7.25 an hour. Would a Clinton presidency really make a difference for working people? Not if we didn't push her -- and hard.

The candidate we were worried about was never Donald Trump, but Hillary Clinton. And the challenge we expected to confront was how to shove that quintessential centrist a few notches to the left. We were strategizing on how we might organize to get a new administration to shift government spending from foreign wars to human needs at home and around the world. We wondered how people in this country might finally secure the "peace dividend" that had been promised to us in the period just after the Cold War, back when her husband Bill became president. In those first (and, as it turned out, only) Clinton years, what we got instead was so-called welfare reform whose consequences are still being felt today, as layoffs drive millions into poverty.

We doubted Hillary Clinton's commitment to addressing most of our other concerns as well: mass incarceration and police violence, structural racism, economic inequality, and most urgent of all (though some of us were just beginning to realize it), the climate emergency. In fact, nationwide, people like us were preparing to spend a day or two celebrating the election of the first woman president and then get down to work opposing many of her anticipated policies. In the peace and justice movements, in organized labor, in community-based organizations, in the two-year-old Black Lives Matter movement, people were ready to roll.

And then the unthinkable happened. The woman we might have loved to hate lost that election and the white-supremacist, woman-hating monster we would grow to detest entered the Oval Office.

For the last four years, progressives have been fighting largely to hold onto what we managed to gain during Barack Obama's presidency: an imperfect healthcare plan that nonetheless insured millions of Americans for the first time; a signature on the Paris climate accord and another on a six-nation agreement to prevent Iran from pursuing nuclear weapons; expanded environmental protections for public lands; the opportunity for recipients of Deferred Action for Childhood Arrivals -- DACA -- status to keep on working and studying in the U.S.

For those same four years, we've been fighting to hold onto our battered capacity for outrage in the face of continual attacks on simple decency and human dignity. There's no need to recite here the catalogue of horrors Donald Trump and his spineless Republican lackeys visited on this country and the world. Suffice it to say that we've been living like Alice in Through the Looking Glass, running as hard as we can just to stand still. That fantasy world's Red Queen observes to a panting Alice that she must come from

"A slow sort of country! Now, here, you see, it takes all the running you can do to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that!"

It wasn't simply the need to run faster than full speed just in order to stay put that made Trump World so much like Looking-Glass Land. It's that, just as in Lewis Carroll's fictional world, reality has been turned inside out in the United States. As new Covid-19 infections reached an all-time high of more than 100,000 in a single day and the cumulative death toll surpassed 230,000, the president in the mirror kept insisting that "we're rounding the corner" (and a surprising number of Americans seemed to believe him). He neglected to mention that, around that very corner, a coronaviral bus is heading straight toward us, accelerating as it comes. In a year when, as NPR reported, "Nearly 1 in 4 households have experienced food insecurity," Trump just kept bragging about the stock market and reminding Americans of how well their 401k's were doing -- as if most people even had such retirement accounts in the first place.

Trump World, Biden Nation, or Something Better?

After four years of running in place, November 2016 seems like a lifetime ago. The United States of 2020 is a very different place, at once more devastated and more hopeful than it was a mere four years ago. On the one hand, pandemic unemployment has hit women, especially women of color, much harder than men, driving millions out of the workforce, many permanently. On the other, we've witnessed the birth of the #MeToo movement against sexual harassment and of the Time's Up Legal Defense Fund, which has provided millions of dollars for working-class women to fight harassment on the job. In a few brief years, physical and psychological attacks on women have ceased to be an accepted norm in the workplace. Harassment certainly continues every day, but the country's collective view of it has shifted.

Black and Latino communities still face daily confrontations with police forces that act more like occupying armies than public servants. The role of the police as enforcers of white supremacy hasn't changed in most parts of the country. Nonetheless, the efforts of the Black Lives Matter movement and of the hundreds of thousands of people who demonstrated this summer in cities nationwide have changed the conversation about the police in ways no one anticipated four years ago. Suddenly, the mainstream media are talking about more than body cams and sensitivity training. In June 2020, the New York Times ran an op-ed entitled "Yes, We Mean Literally Abolish the Police," by Mariame Kaba, an organizer working against the criminalization of people of color. Such a thing was unthinkable four years ago.

In the Trumpian pandemic moment, gun purchases have soared in a country that already topped the world by far in armed citizens. And yet young people -- often led by young women -- have roused themselves to passionate and organized action to get guns off the streets of Trump Land. After a gunman shot up Emma Gonzalez's school in Parkland, Florida, she famously announced, "We call BS" on the claims of adults who insisted that changing the gun laws was unnecessary and impossible. She helped lead the March for Our Lives, which brought millions onto the streets of this country to denounce politicians' inaction on gun violence.

While Donald Trump took the U.S. out of the Paris climate agreement, Greta Thunberg, the 17-year-old Swedish environmental activist, crossed the Atlantic in a carbon-neutral sailing vessel to address the United Nations, demanding of the adult world "How dare you" leave it to your children to save an increasingly warming planet:

"You have stolen my dreams and my childhood with your empty words. And yet I'm one of the lucky ones. People are suffering. People are dying. Entire ecosystems are collapsing. We are in the beginning of a mass extinction, and all you can talk about is money and fairy tales of eternal economic growth. How dare you!"

"How dare you?" is a question I ask myself every time, as a teacher, I face a classroom of college students who, each semester, seem both more anxious about the future and more determined to make it better than the present.

Public attention is a strange beast. Communities of color have known for endless years that the police can kill them with impunity, and it's not as if people haven't been saying so for decades. But when such incidents made it into the largely white mainstream media, they were routinely treated as isolated events -- the actions of a few bad apples -- and never as evidence of a systemic problem. Suddenly, in May 2020, with the release of a hideous video of George Floyd's eight-minute murder in Minneapolis, Minnesota, systematic police violence against Blacks became a legitimate topic of mainstream discussion.

The young have been at the forefront of the response to Floyd's murder and the demands for systemic change that have followed. This June in my city of San Francisco, where police have killed at least five unarmed people of color in the last few years, high school students planned and led tens of thousands of protesters in a peaceful march against police violence.

Now that the election season has reached its drawn-out crescendo, there is so much work ahead of us. With the pandemic spreading out of control, it's time to begin demanding concerted federal action, even from this most malevolent president in history. There's no waiting for Inauguration Day, no matter who takes the oath of office on January 20th. Many thousands more will die before then.

And isn't it time to turn our attention to the millions who have lost their jobs and face the possibility of losing their housing, too, as emergency anti-eviction decrees expire? Isn't it time for a genuine congressional response to hunger, not by shoring up emergency food distribution systems like food pantries, but by putting dollars in the hands of desperate Americans so they can buy their own food? Congress must also act on the housing emergency. The Centers for Disease Control and Prevention's "Temporary Halt in Residential Evictions To Prevent the Further Spread of Covid-19" only lasts until December 31st and it doesn't cover tenants who don't have a lease or written rental agreement. It's crucial, even with Donald Trump still in the White House as the year begins, that it be extended in both time and scope. And now Senate Republican leader Mitch McConnell has said that he won't even entertain a new stimulus bill until January.

Another crucial task is pushing Congress to increase federal funding to state and local governments, which are so often major economic drivers for their regions. The Trump administration and McConnell not only abandoned states and cities, leaving them to confront the pandemic on their own just as a deep recession drastically reduced tax revenues, but -- in true looking-glass fashion -- treated their genuine and desperate calls for help as mere Democratic Party campaign rhetoric.

"In Short, There Is Still Much to Do"

My favorite scene in Gillo Pontecorvo's classic 1966 film The Battle of Algiers takes place at night on a rooftop in the Arab quarter of that city. Ali La Pointe, a passionate recruit to the cause of the National Liberation Front (FLN), which is fighting to throw the French colonizers out of Algeria, is speaking with Ben M'Hidi, a high-ranking FLN official. Ali is unhappy that the movement has called a general strike in order to demonstrate its power and reach to the United Nations. He resents the seven-day restriction on the use of firearms. "Acts of violence don't win wars," Ben M'Hidi tells Ali. "Finally, the people themselves must act."

For the last four years, Donald Trump has made war on the people of this country and indeed on the people of the entire world. He's attacked so many of us, from immigrant children at the U.S. border to anyone who tries to breathe in the fire-choked states of California, Oregon, Washington, and most recently Colorado. He's allowed those 230,000 Americans to die in a pandemic that could have been controlled and thrown millions into poverty, to mention just a few of his "war" crimes. Finally, the people themselves must act.

On that darkened rooftop in an eerie silence, Ben M'Hidi continues his conversation with La Pointe. "You know, Ali," he says. "It's hard enough to start a revolution, even harder to sustain it, and hardest of all to win it." He pauses, then continues, "But it's only afterwards, once we've won, that the real difficulties begin. In short, there is still much to do."

It's hard enough to vote out a looking-glass president. But it's only once we've won, whether that's now or four years from now, that the real work begins. There is, indeed, still much to do.

Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new Dispatch book on the history of torture in the United States.

Copyright 2020 Rebecca Gordon

How to take on Trump on the ground — and convince others to do the same

"Look, folks, the air quality is in the red zone today. The EPA says that means people with lung or heart issues should avoid prolonged activity outdoors."

That was J.R. de Vera, one of two directors of UNITE-HERE!'s independent expenditure campaign to elect Biden and Harris in Reno, Nevada. UNITE-HERE! is a union representing 300,000 workers in the hospitality industry -- that world of hotels and bars, restaurants and caterers. Ninety percent of its members are now laid off because of Trump's bungling of the Covid-19 pandemic and many are glad for the chance to help get him out of the White House.

"So some of you will want to stay in your hotel rooms and make phone calls today," JR continues. Fifty faces fall in the 50 little Zoom boxes on my laptop screen. Canvassers would much rather be talking to voters at their doors than calling them on a phone bank. Still, here in the burning, smoking West, the union is as committed to its own people's health and safety as it is to dragging Donald Trump out of office. So, for many of them, phone calls it will be.

My own job doesn't change much from day to day. Though I live in San Francisco, I've come to Reno to do back-room logistics work in the union campaign's cavernous warehouse of an office: ordering supplies, processing reimbursements, and occasionally helping the data team make maps of the areas our canvassers will walk.

Our field campaign is just one of several the union is running in key states. We're also in Arizona and Florida and, only last week, we began door-to-door canvassing in Philadelphia. Social media, TV ads, bulk mail, and phone calls are all crucial elements in any modern electoral campaign, but none of them is a substitute for face-to-face conversations with voters.

We've been in Reno since early August, building what was, until last week, the only field campaign in the state supporting Joe Biden and Kamala Harris. (Just recently, our success in campaigning safely has encouraged the Democratic Party to start its own ground game here and elsewhere.) We know exactly how many doors we have to knock on, how many Biden voters we have to identify, how many of them we have to convince to make a concrete voting plan, and how many we have to get out to vote during Nevada's two-week early voting period to win here.

We're running a much larger campaign in Clark County, where close to three-quarters of Nevada's population lives (mostly in Las Vegas). Washoe County, home of the twin cities of Reno and Sparks, is the next largest population center with 16% of Nevadans. The remaining 14 counties, collectively known as "the Rurals," account for the rest. Washoe and Clark are barely blue; the Rurals decidedly red.

In 2018, UNITE-HERE!'s ground campaign helped ensure that Jacky Rosen would flip a previously Republican Senate seat, and we helped elect Democrat Steve Sisolak as governor. He's proved a valuable union ally, signing the Adolfo Fernandez Act, a first-in-the-nation law protecting workers and businesses in Nevada from the worst effects of the Covid-19 pandemic.

Defying a threatened Trump campaign lawsuit (later dismissed by a judge), Sisolak also signed an election reform bill that allows every active Nevada voter to receive a mail-in ballot. Largely as a result of the union's work in 2018, this state now boasts an all-female Democratic senatorial delegation, a Democratic governor, and a female and Democratic majority in the state legislature. Elections, as pundits of all stripes have been known to say, have consequences.

Door-to-Door on Planet A

"¿Se puede, o no se puede?"

"¡Sí, se puede!"

("Can we do it?" "Yes, we can!")

Each morning's online canvass dispatch meeting starts with that call-and-response followed by a rousing handclap. Then we talk about where people will be walking that day and often listen to one of the canvassers' personal stories, explaining why he or she is committed to this campaign. Next, we take a look at the day's forecast for heat and air quality as vast parts of the West Coast burn, while smoke and ash travel enormous distances. Temperatures here were in the low 100s in August (often hovering around 115 degrees in Las Vegas). And the air? Let's just say that there have been days when I've wished breathing were optional.

Climate-change activists rightly point out that "there's no Planet B" for the human race, but some days it seems as if our canvassers are already working on a fiery Planet A that is rapidly becoming unlivable. California's wildfires -- including its first-ever "gigafire" -- have consumed more than four million acres in the last two months, sending plumes of ash to record heights, and dumping a staggering amount of smoke into the Reno-Sparks basin. Things are a little better at the moment, but for weeks I couldn't see the desert mountains that surround the area. Some days I couldn't even make out the Grand Sierra Reno casino, a quarter mile from the highway on which I drive to work each morning.

For our canvassers -- almost every one a laid-off waiter, bartender, hotel housekeeper, or casino worker -- the climate emergency and the Covid-19 pandemic are literally in their faces as they don their N95 masks to walk the streets of Reno. It's the same for the voters they meet at their doors. Each evening, canvassers report (on Zoom, of course) what those voters are saying and, for the first time I can remember, they are now talking about the climate. They're angry at a president who pulled the U.S. out of the Paris climate accord and they're scared about what a potentially searing future holds for their children and grandchildren. They may not have read Joe Biden's position on clean energy and environmental justice, but they know that Donald Trump has no such plan.

Braving Guns, Germs, and Smoke

In his classic book Guns, Germs, and Steel, Jared Diamond suggested that the three variables in his title go a long way toward explaining how European societies and the United States came to control much of the planet by the twentieth century. As it happens, our door-to-door canvassers confront a similar triad of obstacles right here in Reno, Nevada (if you replace that final "steel" with "smoke").

Guns and Other Threats

Nevada is an open-carry state and gun ownership is common here. It's not unusual to see someone walking around a supermarket with a holstered pistol on his hip. A 2015 state law ended most gun registration requirements and another allows people visiting from elsewhere to buy rifles without a permit. So gun sightings are everyday events.

Still, it can be startling, if you're not used to it, to have a voter answer the door with a pistol all too visible, even if securely holstered. And occasionally, our canvassers have even watched those guns leave their holsters when the person at the door realizes why they're there (which is when the campaign gets the police involved). Canvassers are trained to observe very clear protocols, including immediately leaving an area if they experience any kind of verbal or physical threat.

African American and Latinx canvassers who've campaigned before in Reno say that, in 2020, Trump supporters seem even more emboldened than in the past to shout racist insults at them. More than once, neighbors have called the police on our folks, essentially accusing them of canvassing-while-black-or-brown. Two days before I wrote this piece, the police pulled over one young Latino door-knocker because neighbors had called to complain that he was walking up and down the street waving a gun. (The "gun" in question was undoubtedly the electronic tablet he was carrying to record the results of conversations with voters.) The officer apologized.

Which reminds me of another apology offered recently. A woman approached an African-American canvasser, demanding to know what in the world he was doing in her neighborhood. On learning his mission, she offered an apology as insulting as her original question. "We're not used to seeing people like you around here," she explained.

Germs

Until the pandemic, my partner and I had planned to work together with UNITE-HERE! in Reno during this election, as we did in 2018. But she's five years older than I am, and her history of pneumonia means that catching Covid-19 could be especially devastating for her. So she's stayed in San Francisco, helping out the union's national phone bank effort instead.

In fact, we didn't really expect that there would be a ground campaign this year, given the difficulties presented by the novel coronavirus. But the union was determined to eke out that small but genuine addition to the vote that a field campaign can produce. So they put in place stringent health protocols for all of us: masks and a minimum of six feet of distance between everyone at all times; no visits to bars, restaurants, or casinos, including during off hours; temperature checks for everyone entering the office; and the immediate reporting of any potential Covid-19 symptoms to our health and safety officer. Before the union rented blocks of rooms at two extended-stay hotels, our head of operations checked their mask protocols for employees and guests and examined their ventilation systems to make sure that the air conditioners vented directly outdoors and not into a common air system for the whole building.

To date, not one of our 57 canvassers has tested positive, a record we intend to maintain as we add another 17 full-timers to our team next week.

One other feature of our coronavirus protocol: we don't talk to any voter who won't put on a mask. I was skeptical that canvassers would be able to get voters to mask up, even with the individually wrapped surgical masks we're offering anyone who doesn't have one on or handy. However, it turns out that, in this bizarre election year, people are eager to talk, to vent their feelings and be heard. So many of the people we're canvassing have suffered so much this year that they're surprised and pleased when someone shows up at their door wondering how they're doing.

And the answer to that question for so many potential voters is not well -- with jobs lost, housing threatened, children struggling with online school, and hunger pangs an increasingly everyday part of life. So yes, a surprising number of people, either already masked or quite willing to put one on, want to talk to us about an election that they generally see as the most important of their lifetime.

Smoke

And did I mention that it's been smoky here? It can make your eyes water, your throat burn, and the urge to cough overwhelm you. In fact, the symptoms of smoke exposure are eerily similar to the ones for Covid-19. More than one smoke-affected canvasser has spent at least five days isolated in a hotel room, waiting for negative coronavirus test results.

The White House website proudly quotes the president on his administration's testing record: "We do tremendous testing. We have the best testing in the world." Washoe County health officials are doing what they can, but if this is the best in the world, then the world is in worse shape than we thought.

The Power of a Personal Story

So why, given the genuine risk and obstacles they face, do UNITE-HERE!'s canvassers knock on doors six days a week to elect Joe Biden and Kamala Harris? Their answers are a perfect embodiment of the feminist dictum "the personal is political." Every one of them has a story about why she or he is here. More than one grew up homeless and never wants another child to live that way. One is a DACA recipient who knows that a reelected Donald Trump will continue his crusade to end that protection for undocumented people brought to the United States as children. Through their participation in union activism, many have come to understand that workers really can beat the boss when they organize -- and Trump, they say, is the biggest boss of all.

Through years of political campaigning, the union's leaders have learned that voters may think about issues, but they're moved to vote by what they feel about them. The goal of every conversation at those doors right now is to make a brief but profound personal connection with the voter, to get each of them to feel just how important it is to vote this year. Canvassers do this by asking how a voter is doing in these difficult times and listening -- genuinely listening -- and responding to whatever answer they get. And they do it by being vulnerable enough to share the personal stories that lie behind their presence at the voter's front door.

One canvasser lost his home at the age of seven, when his parents separated. He and his mother ended up staying in shelters and camping for months in a garden shed on a friend's property. One day recently he knocked on a door and found a Trump supporter on the other side of it. He noticed a shed near the house, pointed to it, and told the man about living in something similar as a child. That Trumpster started to cry. He began talking about how he'd had just the same experience and the way, as a teenager, he'd had to hold his family together when his heroin-addicted parents couldn't cope. He'd never talked to any of his present-day friends about how he grew up and, in the course of that conversation, came to agree with our canvasser that Donald Trump wasn't likely to improve life for people like them. He was, he said, changing his vote to Biden right then and there. (And that canvasser will be back to make sure he actually votes.)

Harvard University Professor Marshall Ganz pioneered the "public narrative," the practice of organizing by storytelling. It's found at the heart of many organizing efforts these days. The 2008 Obama campaign, for example, trained thousands of volunteers to tell their stories to potential voters. The It Gets Better Project has collected more than 50,000 personal messages from older queer people to LGBTQ youth who might be considering suicide or other kinds of self-harm -- assuring them that their own lives did, indeed, get better.

Being the sort of political junkie who devours the news daily, I was skeptical about the power of this approach, though I probably shouldn't have been. After all, how many times did I ask my mother or father to "tell me a story" when I was a kid? What are our lives but stories? Human beings are narrative animals and, however rational, however versed in the issues we may sometimes be, we still live through stories.

Data can give me information on issues I care about, but it can't tell me what issues I should care about. In the end, I'm concerned about racial and gender justice as well as the climate emergency because of the way each of them affects people and other creatures with whom I feel connected.

A Campaign Within a Campaign

Perhaps the most inspiring aspect of UNITE-HERE!'s electoral campaign is the union's commitment to developing every canvasser's leadership skills. The goal is more than winning what's undoubtedly the most important election of our lifetime. It's also to send back to every hotel, restaurant, casino, and airport catering service leaders who can continue to organize and advocate for their working-class sisters and brothers. This means implementing an individual development plan for each canvasser.

Team leaders work with all of them to hone their stories into tools that can be used in an honest and generous way to create a genuine connection with voters. They help those canvassers think about what else they want to learn to do, while developing opportunities for them to master technical tools like computer spreadsheets and databases.

There's a special emphasis on offering such opportunities to women and people of color who make up the vast majority of the union's membership. Precious hours of campaign time are also devoted to workshops on how to understand and confront systemic racism and combat sexual harassment, subjects President Trump is acquainted with in the most repulsively personal way. The union believes its success depends as much on fostering a culture of respect as on the hard-nosed negotiating it's also famous for.

After months of pandemic lockdown and almost four years of what has objectively been the worst, most corrupt, most incompetent, and possibly even most destructive presidency in the nation's history, it's a relief to be able to do something useful again. And sentimental as it may sound, it's an honor to be able to do it with this particular group of brave and committed people. Sí, se puede. Yes, we can.

Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new Dispatch book on the history of torture in the United States.

Copyright 2020 Rebecca Gordon

How a diseased planet is reshaping the world of work

In two weeks, my partner and I were supposed to leave San Francisco for Reno, Nevada, where we’d be spending the next three months focused on the 2020 presidential election. As we did in 2018, we’d be working with UNITE-HERE, the hospitality industry union, only this time on the campaign to drive Donald Trump from office.

The US is in free fall — and at risk of becoming a failed state

You know that feeling when you trip on the street and instantly sense that you’re about to crash hard and there’s no way to prevent it? As gravity has its way with you, all you can do is watch yourself going down. Yeah, that feeling.

How the credibility gap became a chasm in the age of Trump

These days, teaching graduating college seniors has me, as the Brits would say in the London Underground, “minding the gap.” In my case, however, it’s not the gap between the platform and the train I’m thinking of, but a couple of others: those between U.S. citizens and our government, as well as between different generations of Americans. The Covid-19 crisis has made some of those gaps far more visible than usual, just as my students leave school with many of their personal expectations in tatters.

Why the chaotic mind of Donald Trump is strangely and terrifyingly fascinating

My partner and I have been fighting about politics since we met in 1965. I was 13. She was 18 and my summer camp counselor. (It was another 14 years before we became a couple.) We spent that first summer arguing about the Vietnam War. I was convinced that the U.S. never should have gotten involved there. Though she agreed, she was more concerned that the growth of an antiwar movement would distract people from what she considered far more crucial to this country: the Civil Rights movement. As it happened, we were both right.

The future may be female — but the pandemic is patriarchal

Before I found myself “sheltering in place,” this article was to be about women’s actions around the world to mark March 8th, International Women’s Day. From Pakistan to Chile, women in their millions filled the streets, demanding that we be able to control our bodies and our lives. Women came out in Iraq and Kyrgyzstan, Turkey and Peru, the Philippines and Malaysia. In some places, they risked beatings by masked men. In others, they demanded an end to femicide -- the millennia-old reality that women in this world are murdered daily simply because they are women.

'The right to do whatever I want as president'

On February 5th, the Senate voted to acquit President Donald J. Trump of abuse of power and obstruction of Congress. In other words, Trump’s pre-election boast that he “could stand in the middle of Fifth Avenue and shoot somebody” and not “lose any voters” proved something more than high-flown hyperbole. (To be fair, he did lose one Republican “voter” in the Senate — Mitt Romney — but it wasn’t enough to matter.)

How the cowardice of the Obama administration foreshadowed the impunity of Trump

On February 5th, the Senate voted to acquit President Donald J. Trump of abuse of power and obstruction of Congress. In other words, Trump’s pre-election boast that he “could stand in the middle of Fifth Avenue and shoot somebody” and not “lose any voters” proved something more than high-flown hyperbole. (To be fair, he did lose one Republican “voter” in the Senate -- Mitt Romney -- but it wasn’t enough to matter.)

Here's what Trump doesn't understand about the 'deep state'

This seems like a strange moment to be writing about "the deep state" with the country entering a new phase of open and obvious aboveground chaos and instability. Just as we had gotten used to the fact that the president is, in effect, under congressional indictment, just as we had settled into a more or less stable stalemate over when (and if) the Senate will hold an impeachment trial, the president shook the snow globe again, by ordering the assassination of foreign military officials and threatening the destruction of Iran’s cultural sites. Nothing better than the promise of new war crimes to take the world’s attention away from a little thing like extorting a U.S. ally to help oneself get reelected.

What's wrong with the Trump Republican Party? It springs from the twin roots of political evil

On the Thursday of the second week of the House Intelligence Committee's impeachment hearings, former U.S. Attorney Preet Bharara had a special guest on his weekly podcast: Carl Bernstein. It was Bernstein, with fellow Washington Post journalist Bob Woodward, whose reporting broke open the story of how the Committee to Re-elect the President burglarized Democratic Party headquarters at the Watergate office building in Washington, D.C. That reporting and the impeachment hearings that followed eventually forced President Richard Nixon to resign in disgrace in 1974. Bharara wanted to hear about what differences Bernstein sees between the Nixon impeachment proceedings and Donald Trump's today.

Trump's high crimes: Why impeachment fever is catching on — and could stop a historically destructive president

Recently a friend who follows the news a bit less obsessively than I do said, “I thought George W. Bush was bad, but it seems like Donald Trump is even worse. What do you think?”

Trump wants to give pardons for war crimes — but the worst offenders don't even need them

Memorial Day has come and gone and President Trump did not issue his pardons after all. There was substantial evidence that he was planning to use the yearly moment honoring the country’s war dead to grant executive clemency to several U.S. soldiers and at least one military contractor. All have been accused, and one already convicted, of crimes in the never-ending war on terror. But apparently Trump received enough resistance from serving and retired senior military officers and former soldiers, including presidential candidate Pete Buttigieg, to change his mind -- for now.

How to make yourself an exception to the rule of law

Events just fly by in the ever-accelerating rush of Trump Time, so it’s easy enough to miss important ones in the chaos. Paul Manafort is sentenced twice and indicted a third time! Whoosh! Gone! The Senate agrees with the House that the United States should stop supporting Saudi Arabia in Yemen (and Mitch McConnell calls this attempt to extricate the country from cooperation in further war crimes “inappropriate and counterproductive”)! Whoosh! Gone! Twelve Republican senators cross party lines to overturn Trump’s declaration of a national emergency on the U.S.-Mexico border, followed by the president’s veto! Whoosh! Gone! Delegates to the March 2019 U.N. Environment Assembly meeting agree to a non-binding but important resolution drastically reducing the production of single-use plastic. The United States delegation, however, succeeds in watering down the final language lest it “endorse the approach being taken in other countries, which is different than our own”! Once again, the rest of the world is briefly reminded of the curse of American exceptionalism and then, whoosh! Gone!

How Trump's ego could weaken the National Guard

A young friend is seriously considering joining her state’s National Guard. She’s a world-class athlete, but also a working-class woman from a rural background competing in a rich person’s sport. Between seasons, she works for a local farm and auctioneer to put together the money for equipment and travel.

A world on fire: Trump's dizzying outrage cycle fuels our anxiety like a drug - but his presidency's dangers are very real

I took my first hit of speed in 1970 during my freshman year in college. That little white pill -- Dexedrine -- was a revelation. It made whatever I was doing absolutely fascinating. Amphetamine sharpened my focus and banished all appetites except a hunger for knowledge. I spent that entire night writing 35 pages of hand-scrawled notes about a 35-page article by the philosopher Ludwig Feuerbach, thereby convincing the professor who would become my advisor and mentor that I was absolutely fascinating.

American Psychologists Helped Design Torture Programs - Now They Want to End Them

Sometimes the good guys do win. That’s what happened on August 8th in San Francisco when the Council of Representatives of the American Psychological Association (APA) decided to extend a policy keeping its members out of the U.S. detention center at Guantánamo Bay, Cuba.

Trump's On-Going Drone War Is Far Worse Than You Think

They are like the camel’s nose, lifting a corner of the tent. Don’t be fooled, though. It won’t take long until the whole animal is sitting inside, sipping your tea and eating your sweets. In countries around the world -- in the Middle EastAsia MinorCentral AsiaAfrica, even the Philippines -- the appearance of U.S. drones in the sky (and on the ground) is often Washington’s equivalent of the camel’s nose entering a new theater of operations in this country’s forever war against “terror.” Sometimes, however, the drones are more like the camel's tail, arriving after less visible U.S. military forces have been in an area for a while.

Want to Bring Down Donald Trump? Follow the People Who Follow the Money

They call people like us “bean counters” -- the soulless ones beavering away in some windowless accounting department, the living calculators who don’t care about desperation or aspirations, who just want you to turn in your expense report on time and explain those perfectly legitimate charges on the company credit card. We’re the ones whose demands are mere distractions from any organization’s or government agency’s true mission.

Is Trump Trying to Go to War?

A barely noticed anniversary slid by on March 20th. It’s been 15 years since the United States committed the greatest war crime of the twenty-first century: the unprovoked, aggressive invasion of Iraq. The New York Times, which didn’t exactly cover itself in glory in the run-up to that invasion, recently ran an op-ed by an Iraqi novelist living in the United States entitled “Fifteen Years Ago, America Destroyed My Country,” but that was about it. The Washington Post, another publication that (despite the recent portrayal of its Vietnam-era heroism in the movie The Post) repeatedly editorialized in favor of the invasion, marked the anniversary with a story about the war’s “murky” body count. Its piece concluded that at least 600,000 people died in the decade and a half of war, civil war, and chaos that followed -- roughly the population of Washington, D.C.

America's Endless Wars Are Creating A Generational Struggle in the Classroom

I was teaching the day the airplanes hit the World Trade Center. It was the second meeting of “The Communist Manifesto for Seminarians,” a course for my fellow graduate students. By the time I got to class, both towers had collapsed. A few hours later, Building 7 came down as well. We dispensed with a planned discussion about what Marxists mean by “idealism” and “materialism” and talked instead about the meaning of this particular example of the “propaganda of the deed.”