Rajan Menon

The seedy politics of our drinking water

Think of it this way: what we don't know will hurt us. And water — yes, water — is an example of just that. Even at a time of such angry political disputes, you might imagine that, in a wealthy country like the United States, it would still be possible to agree that clean water should be not just a right, but a given. Well, welcome to America 2021.

When it comes to basic water supplies, that's hardly an outlandish thought. After all, back in 2015, our government, along with other members of the United Nations, embraced the U.N.'s Sustainable Development Goals, the sixth of which is universal access to safe drinking water. Despite modest progress globally since then, nearly 900 million people still don't have it. Of course, the overwhelming majority of them live in the poorest countries on this planet.

The United States, however, has the world's largest economy and the fifth-highest per-capita income, and it is a technological powerhouse. How, then, could the American Society of Civil Engineers (ASCE) have given our water infrastructure (pipes, pumping stations, reservoirs, and purification and recycling facilities) a shocking C- grade in its 2021 "report card"? How to explain why Yale University's Environmental Performance Index ranked the U.S. only 26th globally when it comes to the quality of its drinking water and sanitation?

Worse yet, two million Americans still have no running water and indoor plumbing. Native Americans are 19 times more likely to lack this rudimentary amenity than Whites; Latinos and African Americans, twice as likely. On average, Americans use 82 gallons of water daily; Navajos, seven — or the equivalent of about five flushes of a toilet. Moreover, many Native Americans must drive miles to fetch fresh water, making regular handwashing, a basic precaution during the Covid-19 pandemic, just one more hardship.

"Safe" Water

Washington and Philadelphia are just two of the many American cities whose water-distribution systems, some of them wooden, contain pipes that predate the Civil War. Naturally, time has taken its toll. The Environmental Protection Agency (EPA) reports that water mains, especially such old ones, rupture 240,000 times annually, while "trillions of gallons" of potable water worth $2.6 billion seep from leaky pipes, and "billions of gallons of raw sewage" pollute the surface water that provides 61% of our supply. Fixing busted pipes, which break at the rate of one every two minutes nationally, has cost nearly $70 billion since 2000.

The U.S. has 2.2 million miles of water pipes, which are, on average, 45 years old. The EPA's 2015 estimate for overhauling such an aging system of piping was $473 billion, or $23.7 billion annually over 20 years — in other words, anything but chump change. Still, compared to the way Congress allots money to the U.S. military for its endless losing wars and eternal build-ups of weaponry, it couldn't be more modest. After all, the Pentagon's latest budget request was for $715 billion, to which the House Armed Services Committee added $25.5 billion, unsolicited, as did its Senate counterpart. Self-styled congressional budget hawks never complain about our military spending, even though it exceeds that of the next 11 countries combined. So, $23.7 billion annually to renovate an antediluvian water system? That shouldn't be a problem, right?

It turns out, though, that it is. The federal government's share of total investment in updating water infrastructure plunged from nearly two-thirds in 1977 to less than a tenth of that by 2019. With state and local governments under increasing financial pressure, the funding shortfall for modernizing the water infrastructure could reach a staggering $434 billion by 2029.

Considering where the American water system already falls utterly short, a contrarian could counter that it's not a big deal for a mere two million people in a country of 333 million not to have water directly piped into their homes. But in the wealthiest country on earth? Really? And a lack of easy access to water is hardly the only problem. A substantial number of Americans are drinking (and cooking with) contaminated supplies of it. A 2017 investigation found that 63 million of them, nearly a fifth of the population, had done so at least once during the previous 10 years.

This finding wasn't an outlier. The Natural Resources Defense Council (NRDC) discovered that, "in 2015 alone, there were more than 80,000 reported violations of the Safe Drinking Water Act by community water systems" that served nearly 77 million people. And of the total number of violations, 12,000, traced to water providers serving 27 million people, were health-related (rather than monitoring and reporting infractions). There's more. A study in the Proceedings of the National Academy of Sciences concluded that 21 million consumers received water that didn't meet federal standards; and Time reported that 30 million did in 2019.

The Flint Saga and Beyond

Occasionally, stories about unsafe drinking water do make the headlines, as happened with Flint, Michigan. Once a prosperous city, Flint was slammed by a post-1970s wave of de-industrialization in the Midwest and now has a poverty rate of nearly 39% (and 54% of its population is Black). By 2013, with the city facing a massive budget deficit, a commission appointed by the governor devised a cost-saving measure: the city's water supply would be switched to the Flint River, pending construction of new supply lines from Lake Huron. That river, however, had long been contaminated by waste from factories, paper mills, and meatpacking plants along its banks, as well as by untreated sewage.

Residents began complaining that their water smelled and tasted bad, but were regularly reassured that it was safe. Testing, however, revealed lead levels that far exceeded the EPA-stipulated maximum because the water hadn't been treated with anti-corrosion additives to counter contamination. (There is, in fact, no "safe" level for lead, a toxic metal, but the EPA requires remedial action if more than 10% of tap-water samples show concentrations exceeding 15 ppb, or parts per billion.) Flint's water also contained trihalomethane, a carcinogen, as well as dangerous E. coli and legionella bacteria. A scandal ensued.

Flint, as it turned out, wasn't alone. The NRDC reported this year that "dozens of cities have been found to have dangerous levels of elevated lead" in their water. Another of its studies concluded that the drinking water of 186 million people (56% of Americans) contained more than one part per billion of lead, the maximum recommended by the American Academy of Pediatrics; that the tap water of 61 million Americans exceeded the five-ppb maximum the Food and Drug Administration sets for bottled water; and that lead levels in the water of seven million others exceeded the 15-ppb EPA threshold for mandatory corrective measures.

In 1986, Congress banned the future use of pipes that weren't "lead free," but didn't require the replacement of existing ones. Even today, as many as 12 million lead pipes still serve households in this country and scientists generally regard the EPA's lead limit as far too lax and its testing requirements and reporting standards as too permissive. Perhaps you won't be surprised to learn that local governments and utility companies have regularly opposed tougher regulations for lead-pipe replacement.

Eliminating lead water pipes entirely in this country would cost up to $50 billion. Though that's a lot of money, it's hardly unaffordable. In fact, the American Jobs Plan proposed $45 billion for that task, though the separate bipartisan infrastructure bill cut it to $15 billion — again illustrating that penny pinching applies to threats to Americans' day-to-day well-being, but not to our militarized conception of national security.

Other Contaminants

Lead isn't the sole contaminant in our drinking water.

  • In farming communities in California's Central Valley, particularly the San Joaquin Valley, increasing amounts of uranium — associated with kidney damage and a greater risk of cancer — have turned up in the local drinking water, including in private wells, which aren't regulated by the EPA but are used by migrant workers. A 2015 Associated Press investigation found that a quarter of San Joaquin Valley households were then using drinking water from private wells containing "dangerous amounts of uranium." Moreover, one in 10 of the Valley's community water systems contained uranium levels that exceeded federal and state limits — and there's no reason to believe that has changed in the last six years.
  • Fertilizer use has risen fivefold since the 1950s to boost crop yields, and its runoff has increased nitrate levels in drinking water. High levels of nitrates, which have been linked to various forms of cancer, birth defects, and thyroid disease, have been found in 4,000 public water systems in 10 states supplying 45 million people, especially in the West and Midwest. In more than half of these places, the contamination seems only to be increasing. The EPA's maximum contaminant level for nitrates is 10 milligrams per liter, but studies reveal that the risk of birth defects and cancer increases even when people consume water containing half that amount.
  • Arsenic, a known carcinogen, is another hazard. A 2020 Columbia University study found that, though the average concentration of arsenic in the water supply, nationwide, fell by 10% between 2006 and 2011, concentrations exceeding the EPA's maximum of 0.01 milligrams per liter were far more likely in smaller communities that use groundwater and are disproportionately Hispanic. A U.S. Geological Survey report, which focused on wells providing drinking water, noted that there were "dangerously high levels of arsenic, potentially exposing 2.1 million people" to health risks in more than half of all states.
  • Per- and Polyfluoroalkyl substances (PFAS) are used in numerous products, including non-stick cookware, pizza boxes, firefighting foam, and waterproof apparel. However, they remain unregulated by the EPA despite being associated with a range of health risks. Worse yet, these "forever chemicals" take thousands of years to break down. Scientists estimate that the tap water of 200 million Americans contains PFAS concentrations that put them at risk.

The Bad News for 2021

Since the early nineteenth century, enormous progress has been made toward providing Americans with abundant, clean water. And water-borne diseases like cholera, which still kills close to 100,000 people worldwide every year, and typhoid, which claims as many as 161,000, have essentially been eliminated in this country (though there are still 16 million annual cases of acute gastroenteritis traceable to contaminated water). So, yes, water in the U.S. is generally fit to drink, but given this country's economic and technological resources, it's scandalous that the problems that remain haven't at least been substantially mitigated.

To understand such a failure, just consider our politics, which, in the wake of recent elections, only seem to be growing worse by the day.

Since the 1980s, the public sphere has been dominated by a narrative that portrays just about anything the government does, other than profligate spending on the U.S. military, as financially reckless, intrusive, and counterproductive. Instead of creating a compelling message to persuade Americans that many valued public benefits, ranging from land grant colleges, the Internet, Social Security, and Medicare to the national highway system and medical research breakthroughs, owe much to government policies, too many Democrats continue to run scared, fearful of being labeled "big-government-tax-and-spend liberals."

Add to this the outsized political influence that big money exercises through copious campaign contributions — all but limitless thanks to recent Supreme Court decisions — and pricey lobbyists. (Yes, unions and public interest groups lobby, too, but for each dollar they spend, corporations spend $34.)

Companies that, for instance, produce perchlorate, a chemical found in U.S. water supplies that's used in rocket fuel and munitions and is harmful to iodine-deficient pregnant women and fetuses, have paid lobbyists to fight stricter regulations for years. Not coincidentally, the EPA, which has been monitoring perchlorate since 2001, has yet to set mandatory limits on it for drinking water, though it continues to consider a "roadmap" for doing so. Similarly, the seven largest producers of PFAS spent $61 million in 2019 and 2020 on campaign contributions and lobbying efforts. In 2018, there were only two firms lobbying against tougher PFAS regulations; a year later that number had increased to 14.

The EPA sets maximum drinking water levels for 90 substances, but hasn't (except in a few instances where Congress mandated that it do so) added more since 1996 even though its "Drinking Water Contaminant Candidate List" now contains nearly 100 additional substances. This shouldn't be a surprise. Companies that oppose tougher regulations have political access and clout. Political appointees to important EPA posts often hail from those very industries or the lobbying groups they bankroll. Scientists paid by industries have weighed in, lending an aura of legitimacy to special-interest pleading.

Water policy is rife with scientific complexity, but the legislation and regulations that shape it are hashed out in the political arena. There, the deck is increasingly stacked — and not in favor of the average consumer. If the Republicans take back Congress in 2022 and the presidency in 2024, my small suggestion: have a nice cool glass of ice water and relax. What could possibly go wrong?

Copyright 2021 Rajan Menon


Rajan Menon, a TomDispatch regular, is the Anne and Bernard Spitzer Professor of International Relations at the Powell School, City College of New York, Senior Research Fellow at Columbia University's Saltzman Institute of War and Peace Studies, and a non-resident fellow at the Quincy Institute for Responsible Statecraft. He is the author, most recently, of The Conceit of Humanitarian Intervention.


The war of unintended consequences: 4 key lessons after the withdrawal from Afghanistan

Disagreements over how to assess the American exodus from Afghanistan have kept the pundits busy these last weeks, even though there wasn't much to say that hadn't been said before. For some of them, however, that was irrelevant. Having overseen or promoted the failed Afghan War themselves, all the while brandishing various "metrics" of success, they were engaged in transparent reputation-salvaging.

Not surprisingly, the entire spectacle has been tiresome and unproductive. Better to devote time and energy to distilling the Afghan war's larger lessons.

Here are four worth considering.

Lesson One: When You Make Policy, Give Serious Thought to Possible Unintended Consequences

The architects of American policy toward Afghanistan since the late 1970s bear responsibility for the disasters that occurred there because they couldn't, or wouldn't, look beyond their noses. As a result, their policies backfired with drastic consequences. Some historical scene-setting is required to understand just why and how.

Let's start in another country and another time. Consider the December 1979 decision of the leadership of the Soviet Union to send in the Red Army to save the ruling Marxist and pro-Soviet People's Democratic Party of Afghanistan (PDPA). Having seized control of that country the previous year, the PDPA was soon pleading for help. By centralizing its power in the Afghan capital, Kabul (never a good way to govern that land), and seeking to modernize society at breakneck speed — through, among other things, promoting the education and advancement of women — it had provoked an Islamic insurgency that spread rapidly. Once Soviet troops joined the fray, the United States, assisted by Saudi Arabia, Egypt, Pakistan, and even China, would start funding, arming, and training the mujahedeen, a collection of Islamist groups committed to waging jihad there.

The decision to arm them set the stage for much of what has happened in Afghanistan ever since, especially because Washington gave Pakistan carte blanche to decide which of the jihadist groups would be armed, leaving that country's powerful Inter-Services Intelligence Agency to call the shots. The ISI favored the most radical mujahedeen groups, calculating that an Islamist-ruled Afghanistan would provide Pakistan with "strategic depth" by ending India's influence there.

India did indeed have close ties with the PDPA, as well as the previous government of Mohammed Daoud, who had overthrown King Zahir Shah, his cousin, in 1973. Pakistan's Islamist parties, especially the Jama'at-i-Islami, which had been proselytizing among the millions of Afghan refugees then in Pakistan, along with the most fundamentalist of the exiled Afghan Islamist groups, also helped recruit fighters for the war against the Soviet troops.

From 1980 until 1989, when the defeated Red Army finally departed from Afghanistan, Washington's foreign policy crew focused in a single-minded fashion on expelling them by arming those anti-Soviet insurgents. One rationale for this was a ludicrous theory that the Soviet move into Afghanistan was an initial step toward Moscow's ultimate goal: conquering the oil-rich Persian Gulf. The spinners of this apocalyptic fantasy, notably President Jimmy Carter's hawkish national security adviser, Zbigniew Brzezinski, seemed not to have even bothered to peruse a map of the terrain between Afghanistan and the Gulf. It would have shown that among the obstacles awaiting Russian forces headed there was the 900-mile-long, 14,000-foot-high Zagros mountain range.

Enmeshed in a Cold War-driven frenzy and eager to stick it to the Soviets, Brzezinski and others of like mind gave no thought to a critical question: What would happen if the Soviets were finally expelled and the mujahedeen gained control of Afghanistan? That lapse in judgment and lack of foresight was just the beginning of what proved to be a chain of mistakes.

Though the PDPA government outlasted the Red Army's retreat, the collapse of the Soviet Union in late 1991 proved a death sentence for its Afghan allies. Instead of forming a unity government, however, the mujahedeen promptly turned on one another. There ensued a vicious civil war, pitting Pashtun mujahedeen groups against their Tajik and Uzbek counterparts, with Kabul as the prize. The fighting destroyed large parts of that city's western and southern neighborhoods, killing as many as 25,000 civilians, and forcing 500,000 of them, nearly a third of the population, to flee. So wearied were Afghans by the chaos and bloodletting that many were relieved when the Taliban, themselves former participants in the anti-Soviet jihad, emerged in 1994, established themselves in Kabul in 1996, and pledged to reestablish order.

Some of the Taliban and Taliban-allied leaders who would later make the United States' most-wanted list had, in fact, been bankrolled by the CIA to fight the Red Army, including Jalaluddin Haqqani, founder of the now-infamous Haqqani Network, and the notoriously cruel Gulbuddin Hekmatyar, leader of the Hezb-e-Islami, arguably the most extreme of the mujahedeen groups, who is now negotiating with the Taliban, perhaps angling for a spot in its new government.

Osama Bin Laden's links with Afghanistan can also be traced to the anti-Soviet war. He achieved his fame thanks to his role in that American-backed jihad and, along with other Arabs involved in it, founded al-Qaeda in 1988. Later, he decamped to Sudan, but after American officials demanded his expulsion, moved, in 1996, back to Afghanistan, a natural haven given his renown there.

Though the Taliban, unlike al-Qaeda, never had a transnational Islamist agenda, they couldn't deny him succor — and not just because of his cachet. A main tenet of Pashtunwali, the Pashtun social code they lived by, was the duty to provide refuge (nanawati) to those seeking it. Mullah Mohammad Omar, the Taliban's supreme leader, became increasingly perturbed by Bin Laden's incendiary messages proclaiming it "an individual duty for every Muslim" to kill Americans, including civilians, and personally implored him to stop, but to no avail. The Taliban were stuck with him.

Now, fast forward a couple of decades. American leaders certainly didn't create the Islamic State-Khorasan Province — aka IS-K, an affiliate of the main Islamic State — whose suicide bombers killed 170 people at Kabul airport on August 26th, 13 of them American troops. Yet IS-K and its parent body emerged partly from the ideological evolution of various extremists, including many Taliban commanders, who had fought the Soviets. Later, inspired, especially after the 2003 U.S. invasion of Iraq, to continue the jihad, they yearned for something bolder and more ambitious than the Taliban's version, which was confined to Afghanistan.

It should hardly have required clairvoyance in the 1980s to grasp that funding an anti-Soviet Islamist insurgency might have dangerous long-term consequences. After all, the mujahedeen were hardly secretive about the sort of political system and society they envisaged for their country.

Lesson Two: Beware the Overwhelming Pride Produced by the Possession of Unrivaled Global Power

The idea that the U.S. could topple the Taliban and create a new state and society in Afghanistan was outlandish considering that country's history. But after the Soviet Union started to wobble and eventually collapsed and the Cold War was won, Washington was giddy with optimism. Recall the paeans in those years to "the unipolar moment" and "the end of history." We were Number One, which meant that the possibilities, including remaking entire countries, were limitless.

The response to the 9/11 attacks then wasn't simply a matter of shock and fear. Only one person in Washington urged reflection and humility in that moment. On September 14, 2001, as Congress prepared to authorize a war against al-Qaeda and its allies (the Taliban), Representative Barbara Lee (D-CA) gave a prescient speech. "I know," she said, "this resolution will pass, although we all know that the president can wage a war even without it. However… let's step back for a moment… and think through the implications of our actions today, so that this does not spiral out of control."

In the heat of that moment, in a country that had become a military power beyond compare, no one cared to consider alternative responses to the al-Qaeda attacks. Lee's would be the lone no vote against that Authorization to Use Military Force. Afterward, she would receive hate mail, even death threats.

So confident was Washington that it rejected the Taliban's offer to discuss surrendering Bin Laden to a third country if the U.S. stopped the bombing and provided proof of his responsibility for 9/11. Secretary of Defense Donald Rumsfeld also refused to consider the Taliban leadership's attempts to negotiate a surrender and amnesty. The Bush administration treated the Taliban and al-Qaeda as identical and excluded the former from the December 2001 Bonn talks it had convened to form a new Afghan government. As it happened, the Taliban, never having received the memo from various eminences who pronounced it dead, soon regrouped and revved up its insurgency.

The United States then faced two choices, neither of them good. Its top officials could have decided that the government they had created in Kabul wouldn't survive and simply withdrawn their forces. Or they could have stuck with nation-building and periodically "surged" troops into the country in hopes that a viable government and army would eventually emerge. They chose the latter. No president or senior military commander wanted to be blamed for "losing" Afghanistan or the "war on terror," so the baton was passed from one commander to the next and one administration to another, each claiming to have made significant progress. The result: a 20-year, $2.3-trillion fiasco that ended chaotically at Kabul airport.

Lesson Three: Don't Assume That Opponents Whose Values Don't Fit Yours Won't Be Supported Locally

Reporting on the Taliban's retrograde beliefs and pitiless practices helped foster the belief that such a group, itself supposedly a Pakistani creation, could be routed because Afghans reviled it. Moreover, American officials and senior military leaders dealt mostly with educated, urbane Afghan men and women, which strengthened their view that the Taliban lacked legitimacy while the U.S.-backed government enjoyed growing public confidence.

Had the Taliban truly been a foreign transplant, however, they could never have kept fighting, dying, and recruiting new members for nearly two decades in opposition to a government and army backed by the world's sole superpower. The Taliban certainly inspired fear and committed numerous acts of brutality and horror, but poor rural Pashtuns, their social base, didn't view them as outsiders with strange beliefs and customs, but as part of the local social fabric.

Mullah Omar, the Taliban's first supreme leader, was born in Kandahar Province and raised in Uruzgan Province. His father, Moulavi Ghulam Nabi, had been respected locally for his learning. Omar became the leader of the Hotaki tribe, part of one of the two main Pashtun tribal confederacies, the Ghilzai, which was a Taliban mainstay. He joined the war against the Soviet occupation in 1979, returning to Kandahar once it ended, where he ran a madrassa, or religious school. Even after the Taliban took power in 1996, Omar, though its leader, remained in Kandahar, seldom visiting the capital.

The Kabul government and its American patrons may have inadvertently helped the Taliban's cause. The more that ordinary Afghans experienced the raging corruption of the American-created system and the viciousness of the paramilitary forces, militias, and warlords the U.S. military relied on, the more successful the insurgents were at portraying themselves as the country's true nationalists resisting foreign occupiers and their collaborators. Not for nothing did the Taliban liken Afghanistan's U.S.-supported presidents to Shah Shuja, an exiled Afghan monarch the invading British placed on the throne, triggering an armed uprising that lasted from 1839 to 1842 and ended with British troops suffering a catastrophic defeat.

But who needed history? Certainly not the greatest power ever.

Lesson Four: Beware the Generals, Contractors, Consultants, and Advisers Who Eternally Issue Cheery Reports From War Zones

The managers of wars and economic projects acquire a vested interest in touting their "successes" (even when they know quite well that they're actually failures). Generals worry about their professional reputations, nation-builders about losing lucrative government contracts.

Senior American commanders repeatedly assured the president and Congress that the Afghan army was becoming a thoroughly professional fighting force, even when they knew better. (If you doubt this, just read the scathing analysis of retired Lieutenant Colonel Daniel Davis, who did two tours of duty in Afghanistan.)

Reporter Craig Whitlock's Afghanistan Papers — based on a trove of once-private documents as well as extensive interviews with numerous American officials — contains endless examples of such happy talk. After serving 19 months leading U.S. and allied forces in Afghanistan, General John Allen declared that the Afghan army could hold its own, adding that "this is what winning looks like." General John Campbell, who held the same position during the last quarter of 2015, praised those troops as a "capable" and "modern, professional force." American generals constantly talked about corners being turned.

Torrents of data were cited to tout the social and economic progress produced by American aid. It mattered not at all that reports by the Special Inspector General for Afghan Reconstruction questioned the readiness and capabilities of the Afghan army, while uncovering information about schools and hospitals funded but never built, or built but never used, or "unsafe," "literally crumbling," or saddled with unsustainable operating costs. Staggering sums of American aid were also lost to systemic corruption. U.S.-funded fuel supplies were typically stolen on a "massive scale" and sold on the black market.

Afghan commanders padded payrolls with thousands of "ghost soldiers," pocketing the cash as they often did the salaries of unpaid actual soldiers. The economic aid that American commander General David Petraeus wanted ramped up because he considered it essential to victory fueled bribe-taking by officials managing basic services. That, in turn, only added to the mistrust of the U.S.-backed government by ordinary citizens.

Have policymakers learned any lessons from the Afghan War? President Biden did declare an end to this country's "forever wars" and its nation-building (though not to its anti-terror strategy driven by drone attacks and commando raids). Real change, however, won't happen until the vast national security establishment and military-industrial complex nourished by the post-9/11 commitment to the war on terror, regime change, and nation building reaches a similar conclusion. And only a wild optimist could believe that likely.

Here, then, is the simplest lesson of all: no matter how powerful your country may be, your wishes are not necessarily the world's desires and you probably understand a lot less than you think.

Copyright 2021 Rajan Menon



Power, wealth and justice in the time of COVID-19: The global north returns to 'normal'

Fifteen months ago, the SARS-CoV-2 virus unleashed Covid-19. Since then, it's killed more than 3.8 million people worldwide (and possibly many more). Finally, a return to normalcy seems likely for a distinct minority of the world's people, those living mainly in the United States, Canada, the United Kingdom, the European Union, and China. That's not surprising. The concentration of wealth and power globally has enabled rich countries to all but monopolize available vaccine doses. For the citizens of low-income and poor countries to have long-term pandemic security, especially the 46% of the world's population who survive on less than $5.50 a day, this inequity must end, rapidly — but don't hold your breath.

This article originally appeared on TomDispatch.

The Global North: Normalcy Returns

In the United States, new daily infections, which peaked in early January, had plummeted 96% by June 16th. The daily death toll also dropped — by 92% — and the consequences were apparent. Big-city streets were bustling again, as shops and restaurants became ever busier. Americans were shedding their reluctance to travel by plane or train, as schools and universities prepared to resume "live instruction" in the fall. Zoom catch-ups were yielding to socializing the old-fashioned way.

By that June day, new infections and deaths had fallen substantially below their peaks in other wealthy parts of the world as well. In Canada, cases had dropped by 89% and deaths by 94%; in Europe by 87% and 87%; and in the United Kingdom by 84% and 99%.

Yes, European governments were warier than the U.S. about giving people the green light to resume their pre-pandemic lifestyles and have yet to fully abolish curbs on congregating and traveling. Perhaps recalling Britain's previous winter surge, thanks to the B.1.1.7 mutation (initially discovered there) and the recent appearance of two other virulent strains of Covid-19, B.1.617.1 and B.1.617.2 (both first detected in India), Downing Street has retained restrictions on social gatherings. It has even put off the full reopening previously planned for June 21st. And that couldn't have been more understandable. After all, on June 17th, the new case count had reached 10,809, the highest since late March. Still, new daily infections there are less than a tenth of what they were in early January. So, like the U.S., Britain and the rest of Europe are returning to some semblance of normalcy.

The Global South: A Long Road Ahead

Lately, the place that's been hit the hardest by Covid-19 is the global south, where countries are particularly ill-prepared to cope with the pandemic.

Consider social distancing. People with jobs that can be done by "working from home" constitute a far smaller proportion of the labor force than in wealthy nations with far higher levels of education, mechanization, and automation, along with far greater access to computers and the Internet. An estimated 40% of workers in rich countries can work remotely. In lower- and middle-income lands perhaps 10% can do so and the numbers are even worse in the poorest of them.

During the pandemic, millions of Canadians, Europeans, and Americans lost their jobs and struggled to pay food and housing bills. Still, the economic impact has been far worse in other parts of the world, particularly the poorest African and Asian nations. There, some 100 million people have fallen back into extreme poverty.

Such places lack the basics to prevent infections and care for Covid-19 patients. Running water, soap, and hand sanitizer are often not readily available. In the developing world, 785 million or more people lack "basic water services," as do a quarter of health clinics and hospitals there, which have also faced crippling shortages of standard protective gear, never mind oxygen and ventilators.

Last year, for instance, South Sudan, with 12 million people, had only four ventilators and 24 ICU beds. Burkina Faso had 11 ventilators for its 20 million people; Sierra Leone 13 for its eight million; and the Central African Republic, a mere three for eight million. The problem wasn't confined to Africa either. Virtually all of Venezuela's hospitals have run low on critical supplies and the country had 84 ICU beds for nearly 30 million people.

Yes, wealthy countries like the U.S. faced significant shortages, but they had the cash to buy what they needed (or could ramp up production at home). The global south's poorest countries were and remain at the back of the queue.

India's Disaster

India has provided the most chilling illustration of how spiraling infections can overwhelm healthcare systems in the global south. Things looked surprisingly good there until recently. Infection and death rates were far below what experts had anticipated based on its economy, population density, and the highly uneven quality of its healthcare system. The government's decision to order a phased lifting of a national lockdown seemed vindication indeed. As late as April, India reported fewer new cases per million than Britain, France, Germany, or the U.S.

Never one for modesty, its Hindu nationalist prime minister, Narendra Modi, boasted that India had "saved humanity from a great disaster by containing Corona effectively." He touted its progress in vaccination; bragged that it was now exporting masks, test kits, and safety equipment; and mocked forecasts that Covid-19 would infect 800 million Indians and kill a million of them. Confident that his country had turned the corner, he and his Bharatiya Janata Party held huge, unmasked political rallies, while millions of Indians gathered in vast crowds for the annual Kumbh Mela religious festival.

Then, in early April, the second wave struck with horrific consequences. By May 6th, the daily case count had reached 414,188. On May 19th, it would break the world record for daily Covid-19 deaths, previously a dubious American honor, recording almost 4,500 of them.

Hospitals quickly ran out of beds. The sick were turned away in droves and left to die at home or even in the streets, gasping for breath. Supplies of medical oxygen and ventilators ran out, as did personal protective equipment. Soon, Modi had to appeal for help, which many countries provided.

Indian press reports estimate that fully half of India's 300,000-plus Covid-19 deaths have occurred in this second wave, the vast majority after March. During the worst of it, the air in India's big cities was thick with smoke from crematoria, while, because of the shortage of designated cremation and burial sites, corpses regularly washed up on riverbanks.

We may never know how many Indians have actually died since April. Hospital records, even assuming they were kept fastidiously amid the pandemonium, won't provide the full picture because an unknown number of people died elsewhere.

The Vaccination Divide

Other parts of the global south have also been hit by surging infections, including countries in Asia that had previously contained Covid-19's spread, among them Malaysia, Nepal, the Philippines, Sri Lanka, Thailand, and Vietnam. Latin America has seen devastating surges of the pandemic, above all in Brazil because of President Jair Bolsonaro's stunning combination of fecklessness and callousness, but also in Bolivia, Colombia, Chile, Paraguay, Peru, and Uruguay. In Africa, Angola, Namibia, South Africa, and the Democratic Republic of the Congo are among 14 countries in which infections have spiked.

Meanwhile, the data reveal a gargantuan north-south vaccination gap. By early June, the U.S. had administered doses to nearly half the country's population, in Britain slightly more than half, in Canada just over a third, and in the European Union approximately a third. (Bear in mind that the proportions would be far higher were only adults counted and that vaccination rates are still increasing far faster in these places than in the global south.)

Now consider examples of vaccination coverage in low-income countries.

  • In the Democratic Republic of the Congo, Ethiopia, Nigeria, South Sudan, Sudan, Vietnam, and Zambia it ranged from 0.1% to 0.9% of the population.
  • In Angola, Ghana, Kenya, Pakistan, Senegal, and South Africa, between 1% and 2.4%.
  • In Botswana and Zimbabwe, which have the highest coverage in sub-Saharan Africa, 3% and 3.6% respectively.
  • In Asia (China and Singapore aside), Cambodia at 9.6% was the leader, followed by India at 8.5%. Coverage in all other Asian countries was below 5.4%.

This north-south contrast matters because mutations first detected in the U.K., Brazil, India, and South Africa, which may prove up to 50% more transmissible, are already circulating worldwide. Meanwhile, new ones, perhaps even more virulent, are likely to emerge in largely unvaccinated nations. This, in turn, will endanger anyone who's unvaccinated and so could prove particularly calamitous for the global south.

Why the vaccination gap? Wealthy countries, none more than the United States, could afford to spend billions of dollars to buy vaccines. They're home as well to cutting-edge biotechnology companies like AstraZeneca, BioNTech, Johnson and Johnson, Moderna, and Pfizer. Those two advantages enabled them to preorder enormous quantities of vaccine, indeed almost all of what BioNTech and Moderna anticipated making in 2021, and even before their vaccines had completed clinical trials. As a result, by late March, 86% of all vaccinations had been administered in that part of the world, a mere 0.1% in poor regions.

This wasn't the result of some evil conspiracy. Governments in rich countries weren't sure which vaccine-makers would succeed, so they spread their bets. Nevertheless, their stockpiling gambit locked up most of the global supply.

Equity vs. Power

Tedros Adhanom Ghebreyesus, who leads the World Health Organization (WHO), was among those decrying the inequity of "vaccine nationalism." To counter it, he and others proposed that the deep-pocketed countries that had vacuumed up the supplies, vaccinate only their elderly, individuals with pre-existing medical conditions, and healthcare workers, and then donate their remaining doses so that other countries could do the same. As supplies increased, the rest of the world's population could be vaccinated based on an assessment of the degree to which different categories of people were at risk.

COVAX, the U.N. program involving 190 countries led by the WHO and funded by governments and private philanthropies, would then ensure that getting vaccinated didn't depend on whether or not a person lived in a wealthy country. It would also leverage its large membership to secure low prices from vaccine manufacturers.

That was the idea anyway. The reality, of course, has been altogether different. Though most wealthy countries, including the U.S. following Biden's election, did join COVAX, they also decided to use their own massive buying power to cut deals directly with the pharmaceutical giants and vaccinate as many of their own as they could. And in February, the U.S. government took the additional step of invoking the Defense Production Act to restrict exports of 37 raw materials critical for making vaccines.

COVAX has received support, including $4 billion pledged by President Joe Biden for 2021 and 2022, but nowhere near what's needed to reach its goal of distributing two billion doses by the end of this year. By May, in fact, it had distributed just 3.4% of that amount.

Biden recently announced that the U.S. would donate 500 million doses of vaccines this year and next, chiefly to COVAX; and at their summit this month, the G-7 governments announced plans to provide one billion altogether. That's a large number and a welcome move, but still modest considering that 11 billion doses are needed to vaccinate 70% of the world.

COVAX's problems have been aggravated by the decision of India, which had been counted on to provide half of the two billion doses COVAX ordered for this year, to ban vaccine exports. Aside from supplying vaccines, COVAX's program focuses on helping low-income countries train vaccinators, create distribution networks, and launch public-awareness campaigns, all of which will be many times more expensive for them than vaccine purchases and no less critical.

Another proposal, initiated in late 2020 by India and South Africa and backed by 100 countries, mostly from the global south, calls for the World Trade Organization (WTO) to suspend patents on vaccines so that pharmaceutical companies in the global south can manufacture them without violating intellectual property laws and so launch production near the places that need them the most.

That idea hasn't taken wing either.

The pharmaceutical companies, always zealous about the sanctity of patents, have trotted out familiar arguments (recall the HIV-AIDS crisis): their counterparts in the global south lack the expertise and technology to make complex vaccines quickly enough; efficacy and safety could prove substandard; lifting patent restrictions on this occasion could set a precedent and stifle innovation; and they had made huge investments with no guarantees of success.

Critics challenged these claims, but the bio-tech and pharmaceutical giants have more clout, and they simply don't want to share their knowledge. None of them, for instance, has participated in the WHO's Covid-19 Technology Access Pool (C-TAP), created expressly to promote the voluntary international sharing of intellectual property, technology, and knowhow, through non-restricted licensing.

On the (only faintly) brighter side, Moderna announced last October that it wouldn't enforce its Covid-19 vaccine patents during the pandemic — but didn't offer any technical assistance to pharmaceutical firms in the global south. AstraZeneca gave the Serum Institute of India a license to make its vaccine and also declared that it would forgo profits from vaccine sales until the pandemic ends. The catch: it reserved the right to determine that end date, which it may declare as early as this July.

In May, President Biden surprised many people by supporting the waiving of patents on Covid-19 vaccines. That was a big change given the degree to which the U.S. government has been a dogged defender of intellectual property rights. But his gesture, however commendable, may remain just that. Germany dissented immediately. Others in the European Union seem open to discussion, but that, at best, means protracted WTO negotiations about a welter of legal and technical details in the midst of a global emergency.

And the pharmaceutical companies will hang tough. Never mind that many received billions of dollars from governments in various forms, including equity purchases, subsidies, large preordered vaccine contracts ($18 billion from the Trump administration's Operation Warp Speed program alone), and research-and-development partnerships with government agencies. Contrary to its narrative, Big Pharma never placed huge, risky bets to create Covid-19 vaccines.

How Does This End?

Various mutations of the virus, several highly infectious, are now traveling the world and new ones are expected to arise. This poses an obvious threat to the inhabitants of low-income countries where vaccination rates are already abysmally poor. Given the skewed distribution of vaccines, people there may not be vaccinated, even partially, until 2022, or later. Covid-19 could therefore claim more millions of lives.

But the suffering won't be confined to the global south. The more the virus replicates itself, the greater the probability of new, even more dangerous, mutations — ones that could attack the tens of millions of unvaccinated in the wealthy parts of the world, too. Between a fifth and a quarter of adults in the U.S. and the European Union say that they're unlikely to, or simply won't, get vaccinated. For various reasons, including worry about the safety of vaccines, anti-vax sentiments rooted in religious and political beliefs, and the growing influence of ever wilder conspiracy theories, U.S. vaccination rates slowed starting in mid-April.

As a result, President Biden's goal of having 70% of adults receive at least one shot by July 4th won't be realized. With less than two weeks to go, at least half of the adults in 25 states still remain completely unvaccinated. And what if existing vaccines don't ensure protection against new mutations, something virologists consider a possibility? Booster shots may provide a fix, but not an easy one given this country's size, the logistical complexities of mounting another vaccination campaign, and the inevitable political squabbling it will produce.

Amid the unknowns, this much is clear: for all the talk about global governance and collective action against threats that don't respect borders, the response to this pandemic has been driven by vaccine nationalism. That's indefensible, both ethically and on the grounds of self-interest.

The graveyard of empire: Why American failure in Afghanistan was guaranteed

On May 1st, the date Donald Trump signed onto for the withdrawal of the remaining 3,500 American troops from Afghanistan, the war there, already 19 years old, was still officially a teenager. Think of September 11, 2021 — the 20th anniversary of the 9/11 attacks and the date Joe Biden has chosen for the same — as, in essence, the very moment when its teenage years will be over.

In all that time, Washington has been fighting what, in reality, should have been considered a fantasy war, a mission impossible in that country, however grim and bloody, based on fantasy expectations and fantasy calculations, few of which seem to have been abandoned in Washington even so many years later. Not surprisingly, Biden's decision evoked the predictable reactions in that city. The military high command's never-ending urge to stick with a failed war was complemented by the inside-the-Beltway Blob's doomsday scenarios and tired nostrums.

The latter began the day before the president even went public when, in a major opinion piece, the Washington Post's editorial board distilled the predictable platitudes to come: such a full-scale military exit, they claimed, would deprive Washington of all diplomatic influence and convince the Taliban that it could jettison its talks with President Ashraf Ghani's demoralized U.S.-backed government and fight its way to power. A Taliban triumph would, in turn, eviscerate democracy and civil society, leaving rights gained by women and minorities in these years in the dust, and so destroy everything the U.S. had fought for since October 2001.

By this September, of course, 775,000-plus American soldiers will have served in Afghanistan (a few of them the children of those who had served early in the war). More than a fifth of them would endure at least three tours of duty there! Suffice it to say that most of the armchair generals who tend to adorn establishment think tanks haven't faced such hardships.

In 2010 and 2011, the Obama surge would deploy as many as 100,000 U.S. troops to Afghanistan. The Pentagon states that, as of this month, 2,312 American soldiers have died there (80% killed in action) and 20,666 have been injured. Then there's the toll taken on vets of that never-ending war thanks to PTSD, suicide, and substance abuse. Military families apart, however, much of the American public has been remarkably untouched by the war, since there's no longer a draft and Uncle Sam borrowed money, rather than raising taxes, to foot the $2.26 trillion bill. As a result, the forever war dragged on, consuming blood and treasure without any Vietnam War-style protests.

Not surprisingly, most Americans know even less about the numbers of Afghan civilians killed and wounded in these years. Since 2002, at least 47,000 non-combatants have been killed and another 43,000 injured, whether by airstrikes, artillery fire, shootings, improvised explosive devices, or suicide and car bombings. A 2020 U.N. report on civilian casualties in Afghanistan notes that 2019 was the sixth straight year in which 10,000 civilians were killed or wounded. And this carnage has occurred in one of the world's poorest countries, which ranks 187th in per-capita income, where the death or incapacitation of an adult male (normally the primary breadwinner in a rural Afghan home) can tip already-poor families into destitution.

So how, then, can the calls to persevere make sense? Seek and you won't find a persuasive answer. Consider the most notable recent attempt to provide one, the Afghanistan Study Group report, written by an ensemble of ex-officials, retired generals, and think-tank luminaries, not a few of them tied to big weapons-producing companies. Released with significant fanfare in February, it offered no substantive proposals for attaining goals that have been sought for 19 years, including a stable democracy with fair elections, a free press, an unfettered civil society, and equal rights for all Afghans — all premised on a political settlement between the U.S.-backed government and the Taliban.

Still Standing After All These Years

Now, consider Afghanistan's bedrock reality: the Taliban, which has battled the world's most fearsome military machine for two decades, remains standing, and continues to expand its control in rural areas. The U.S., its NATO allies, and the Afghanistan National Security and Defense Forces have indeed killed some 50,000 Taliban fighters over the years, including, in 2016, its foremost leader Mullah Akhtar Mohammad Mansoor. In 2019-2020 alone, several senior commanders, also members of the Taliban's shadow government, were killed, including the "governors" of Badakshan, Farah, Logar, Samangan, and Wardak provinces. Yet the Taliban, whose roots lie among the Pashtun, the country's historically dominant ethnic group, have managed to replenish their ranks, procure new weapons and ammunition, and raise money, above all through taxes on opium poppy farming.

It helps that the Taliban continues to get covert support from Pakistan's military and intelligence service, which played a pivotal role in creating the movement in the early 1990s after it became clear that the leaders of the Pakistan-backed Pashtun mujahedeen (literally, those who wage jihad) couldn't shoot their way into power because minority nationalities (mainly Uzbeks and Tajiks) resisted ferociously. Yet the Taliban has indigenous roots, too, and its success can't be attributed solely to intimidation and violence. Its political agenda and puritanical version of Islam appeal to many Afghans. Absent that, it would have perished long ago.

Instead, according to the Long War Journal, the Taliban now controls 75 of Afghanistan's roughly 400 districts; the government rules 133 others, with the remaining 187 up for grabs. Although the insurgency isn't on the homestretch to victory, it's never been in a stronger military position since the 2001 American invasion. Nor has the morale of its fighters dissipated, though many are doubtless weary of war. According to a May U.N. report, "the Taliban remain confident they can take power by force," even though their fighters have long been vastly outmatched in numbers, mobility, supplies, transportation, and the caliber of their armaments. Nor do they have the jets, helicopters, and bombers that their adversaries, especially the United States, possess and use with devastating effect. In 2019, 7,423 bombs and other kinds of ordnance were dropped on Afghanistan, eight times as many as in 2015.

Tallying Costs

As 2019 ended, a group of former senior U.S. officials claimed that the Afghan campaign's costs had been overblown. American troops killed there the previous year, they pointed out, amounted to only a fifth of those who died during "non-combat training exercises," while "U.S. direct military expenditures in Afghanistan are approximately three percent of annual U.S. military spending" and were decreasing. It evidently escaped them that even a few fatalities that occur because a country's leaders pursue outlandish objectives like reshaping an entire society in a distant land should matter.

As for the monetary costs, it depends on what you count. Those "direct military expenditures" aren't the only ones incurred year after year from the Afghan War. Brown University's Costs of War Project, for instance, also includes expenses from the Pentagon's "base budget" (the workaday costs of maintaining the armed forces); funds allotted for "Overseas Contingency Operations," the post-9/11 counter-terrorism wars; interest payments on money borrowed to fund the war; the long-term pensions and benefits of its veterans; and economic aid provided to Afghanistan by the State Department and the U.S. Agency for International Development (USAID). Do the math that way and the price tag turns out to be so much larger.

But even if you were to accept that 3% figure, that would still total $22 billion from the $738 billion fiscal year 2020 Pentagon budget, hardly chump change — especially given the resources needed to address festering problems on the home front, including a pandemic, child poverty, hunger, homelessness, and an opioid epidemic.

Nation-Building: Form vs. Substance

Now, consider some examples of the "progress" highlighted by the proponents of pressing on. These would include democratic elections and institutions, less corruption, and inroads against the narcotics trade.

First, the election system, an effective one being a prerequisite for democracy. Of course, given the way Donald Trump and crew dealt with election 2020 here in the U.S., Americans should think twice before blithely casting stones at the Afghan electoral system. In addition, organizing elections in a war-ravaged country is a dangerous task when an insurgency is working overtime to violently disrupt them.

Still, each of Afghanistan's four presidential elections (2004, 2009, 2014, 2019) produced widespread, systematic fraud verified by investigative reporters and noted in U.S. government reports. After the 2014 presidential poll, for instance, candidate Abdullah Abdullah wouldn't concede and threatened to form a parallel government, insisting that his opponent, Ashraf Ghani, had won fraudulently. To avert bloodshed, U.S. Secretary of State John Kerry brokered a power-sharing deal that made Abdullah the "chief executive" — a position unmentioned in the Afghan constitution. (Incidentally, elections to the national legislature have also been plagued by irregularities.) Although USAID has worked feverishly to improve election procedures and turnout, spending $200 million on the 2014 presidential election alone, voting fraud remained pervasive in 2019.

As for key political institutions, which also bear American fingerprints, the respected Afghanistan Analysts Network only recently examined the state of the supreme court, the senate, provincial and district assemblies, and the Independent Commission for Overseeing the Implementation of the Constitution (ICOIC). It concluded that they "lacked even the minimum independence needed to exercise their constitutional mandate to provide accountability" and aggravated the "stagnation of the overall political system."

The senate lacked the third of its membership elected by district assemblies — the remaining senators are appointed by the president or elected by provincial assemblies — for a simple reason. Though constitutionally mandated, district assembly elections have never been held. As for the ICOIC, it had only four out of its seven legally required commissioners, insufficient for a quorum.

When it comes to the narcotics trade, Afghanistan now accounts for 90% of the world's illicit opium, essential for the making of heroin. The hectares of land devoted to opium-poppy planting have increased dramatically from 8,000 in 2001 to 263,000 by 2018. (A slump in world demand led to a rare drop in 2019.) Little wonder, since poppies provide destitute Afghan farmers with income to cover their basic needs. A U.N. study estimates that poppy sales, at $2 billion in 2019, exceeded the country's legal exports, while the opium economy accounted for 7% to 11% of the gross domestic product.

Although the U.S. has spent at least $9 billion attempting to stamp out Afghanistan's narcotics trade, a 2021 report to Congress by the Special Inspector General for Afghanistan Reconstruction (SIGAR) concluded that the investment had next to no effect and that Afghan dominance of the global opium business remained unrivaled. The report didn't, however, mention the emergence of a new, more insidious problem. In recent years, that country has become a major producer of illegal synthetic drugs, especially methamphetamine, which is both cheaper to make and more profitable than cultivating opium. It now houses, according to a European Union study, an estimated 500 meth labs that manufacture 65.5 tons of the stuff daily.

As for the campaign against corruption, a supposed pillar of U.S. nation-building, forget it. From shakedowns by officials and warlords to palatial homes built with ill-gotten gains by the well-connected, corruption permeates the American-installed system in Afghanistan.

Though U.S. officials have regularly fumed about the corruption of senior Afghan officials, including the first post-Taliban president, Hamid Karzai, the CIA funneled "tens of millions" of dollars to him for years (as he himself confirmed). Investigative reporting by the Washington Post's Craig Whitlock revealed that many notorious warlords and senior officials were also blessed by the Agency's beneficence. They included Uzbek strongman and one-time First Vice President Abdul Rashid Dostum, accused of murder, abduction, and rape, and Mohammed Zia Salehi, the head of administration at the National Security Council under President Karzai.

In 2015, a U.S. government investigation revealed that $300 million earmarked to pay the Afghan police never actually reached them and was instead "paid" to "ghost" (non-existent) officers or simply stolen by police officials. A 2012 study traced 3,000 Pentagon contracts totaling $106 billion and concluded that 40% of that sum had ended up in the pockets of crime bosses, government officials, and even insurgents.

According to SIGAR's first 2021 quarterly report to Congress, one U.S. contractor pled guilty to stealing $775,000 in State Department funds. Two others, subcontractors to weapons giant Lockheed Martin, submitted nearly $1.8 million in fraudulent invoices while hiring local employees who lacked contractually required qualifications. (Those employees were told to procure counterfeit college diplomas from an Internet degree mill.)

And lest you think that this deeply embedded culture of corruption in Afghanistan is a "Third World" phenomenon, consider an American official's recollection that "the biggest source of corruption" in that country "was the United States."

Hubris and Nemesis Strike — Yet Again

While I was writing this piece, a memory came back to me. In 1988, I was part of a group that visited Afghanistan just as Soviet troops were starting to withdraw from that country. After a disastrous 10-year war, those demoralized young soldiers were headed for a homeland that itself would soon implode. The Red Army had been sent to Afghanistan in December 1979 by a geriatric Politburo leadership confident that it would save an embattled Afghan socialist regime. That regime had seized power in April 1978 and soon sparked a countrywide Islamist insurgency, backed by the CIA and Saudi dollars, which in turn spawned a small group that called itself al-Qaeda, headed by a rich young Saudi.

Once the guerillas were crushed, so Soviet leaders then imagined, the building of a modern socialist society would proceed amid stability and a shiny new Soviet-allied Afghanistan would emerge. As for those ragtag bands of primitive Islamic warriors, what chance did they stand against well-trained Russian soldiers bearing the latest in modern firepower?

Moscow may even have believed that the Kabul government would hold its own after the Soviet military left what its new young leader, Mikhail Gorbachev, had then taken to calling "the bleeding wound." The Afghan president of that moment certainly did. When our group met him, Mohammed Najibullah Ahmadzai, a burly, fearsome fellow who had previously headed the KHAD, the country's brutal intelligence agency, confidently assured us that his government had strong support and plenty of staying power. His government collapsed barely four years later and, in 1996, when the Taliban seized Kabul, he was castrated, dragged behind a vehicle, and strung up in public.

The Politburo's experiment in social re-engineering in a foreign country — no one said "nation-building" back then — led to more than 13,000 dead Soviet soldiers and perhaps as many as one million dead Afghans. No two wars are alike, of course, but the same vainglory that possessed those Soviet leaders marked the American campaign in Afghanistan in its early years. The white-hot anger that followed the 9/11 attacks and the public's desire for vengeance led the George W. Bush administration to topple the Taliban government. Bush and his successors in the White House, seized by the overweening pride theologian Reinhold Niebuhr had long ago warned his fellow Americans about, also believed that they would build a democratic and modern Afghanistan.

As it happened, they simply started another, even longer cycle of war in that unfortunate country, one guaranteed to rage on and consume yet more lives after American soldiers depart this September — assuming Biden's decision isn't thwarted.

Copyright 2021 Rajan Menon

Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Books, John Feffer's new dystopian novel Frostlands (the second in the Splinterlands series), Beverly Gologorsky's novel Every Body Has a Story, and Tom Engelhardt's A Nation Unmade by War, as well as Alfred McCoy's In the Shadows of the American Century: The Rise and Decline of U.S. Global Power and John Dower's The Violent American Century: War and Terror Since World War II.

Rajan Menon, a TomDispatch regular, is the Anne and Bernard Spitzer Professor of International Relations at the Powell School, City College of New York, and Senior Research Fellow at Columbia University's Saltzman Institute of War and Peace Studies. He is the author, most recently, of The Conceit of Humanitarian Intervention.

Hunger in America: On the frontlines of the COVID-19 nightmare

As autumn fades and winter looms, the dire predictions public-health experts made about Covid-19 have, unfortunately, proven all-too-accurate. On October 27th, 74,379 new infections were recorded in the United States; less than a month and a half later, on December 9th, the daily count had soared to 218,677, while the 2020 total has just surpassed 15 million, a figure no other country, not even India, with a population three times that of the U.S., has reached.

And now, it seems, the third wave of the virus has arrived. As recently as late October, the embattled Dr. Anthony Fauci, the nation's leading infectious disease expert, warned that "we are in for a whole lot of hurt" and that infections could reach 100,000 a day. As it happens, he was wildly optimistic. A little more than a month later, there were more than twice that many. Is it possible, however, that the current surge is due in part to increased testing, as President Trump and others have regularly claimed? Here's the problem. Even if that theory were true, it couldn't account for the spiraling death toll, which is now more than 300,000 and could hit 450,000 by February, according to Robert Redfield, the head of the Centers for Disease Control and Prevention. Nor could it explain the daily Covid-19 hospitalizations, the first round of which peaked at 59,712 on July 23rd, dropped pretty steadily to a low of 28,606 on September 20th, and then started to soar, reaching 106,671 on December 9th.

Though big-picture statistics like these should help us grasp the staggering magnitude of our current public-health crisis, what they don't reveal are the searing effects it's had on the lives of millions of Americans, even those who have managed to evade the virus or haven't seen friends or family fall ill or die from Covid-19. The pandemic has been especially hard for those on the front lines: doctors, nurses, and other hospital workers who experience battle fatigue and despair while besieged by suffering and deaths, visceral reminders of their own vulnerability.

In society at large, precautions -- lockdowns, social distancing, limits on festive gatherings -- necessary to keep Covid-19 at bay have increased loneliness and social isolation. Contrary to early expectations, reports of abuse and violence within families haven't actually spiraled, but experts suggest that may be because the victims, confined to their homes alongside their tormentors, are finding it harder to seek help and fear reporting what's happening to them. As for children, teachers are no longer seeing their pupils in person as regularly and so are less able to spot the typical warning signs of mistreatment.

Thankfully, the pandemic has yet to increase this country's already alarming suicide rate, but the same can't be said for levels of stress and depression, both of which have risen noticeably. School closures and the move to online learning have forced parents, particularly women, many of whom were barely getting by while working full-time, to scramble for childcare and to work less or stop working altogether, often a genuine disaster in poor families.

Not surprisingly, people who have been laid off or had their work hours reduced have fallen behind on their mortgage and rent payments. Although various federal and state moratoriums on such payments, as well as on evictions and foreclosures, were enacted, such protections will eventually end. And the moratoriums don't negate renters' or homeowners' obligations to settle accounts with their bankers and landlords somewhere down the line (which for many Americans may, in the end, prove an impossibility).

Food and the Pandemic

Apart from the illness and death it causes, perhaps the most poignant consequence of Covid-19 has been the way it's increased what's called "food insecurity" across the United States. That ungainly term doesn't refer to the chronic food scarcity and undernourishment that afflict more than 800 million people in poor countries, but rather to the disruption of people's typical food-consumption patterns. The U.S. Department of Agriculture (USDA) distinguishes between what it calls low food security ("reduced quality, variety, or desirability of diet") and the very low version of the same ("multiple indications of disrupted eating patterns and reduced food intake").

Surveys by the USDA and the Census Bureau show that both variants have risen steeply during the pandemic. Just before the coronavirus struck, 35 million Americans, 11 million of them children, experienced food insecurity, the lowest figure in two decades. This year, those numbers are projected to reach 54 million and 18 million respectively. In 2018, 4% of American adults reported that at least some members of their family did not have enough to eat; by July 2020, that figure had hit 11%, according to a study by Northwestern University's Food Research and Action Center, and will only increase as the pandemic worsens.

Income supplements provided by the $2.2 trillion CARES Act that Congress passed in March in response to the economic problems created by Covid-19, increases in the government's Supplemental Nutrition Assistance Program (SNAP), and the Pandemic Electronic Benefit Transfer (P-EBT), which helps parents whose children no longer get free or subsidized school lunches, have made a difference -- but not enough to make up for lost or reduced income, lost homes, and other disasters of this moment. And sadly, any follow-up to the CARES Act, assuming Congress reaches some kind of agreement on its terms before the current legislation expires at the end of December, will almost certainly be far less generous than the original law. The SNAP increases already excluded the poorest seven million households that were then receiving the maximum amount, and the new increases now under discussion in Congress would add less than one dollar to a four-person family's maximum daily benefit. P-EBT expired in most states at the end of September, in some as early as July.

That food insecurity has "skyrocketed," as the Center for Budget and Policy Priorities puts it, during the pandemic despite government assistance shouldn't come as a surprise. Millions of people have lost their jobs. Some have seen their earnings diminished because of furloughs, wage cuts, freezes, or reduced working hours. Others have looked for jobs in vain and finally given up (but aren't included in official unemployment statistics). Millions of adults have children who no longer receive those free or subsidized lunches because of the switch, in whole or part, to online teaching. Worse yet, as pandemic-induced firings, layoffs, and wage cuts have reduced incomes, and so consumer purchasing power, food prices, especially for meat, fish, and eggs, have only risen. Such costs have increased for other reasons as well. The pandemic has disrupted supply networks, national and international. Leery consumers, anticipating shortages or seeking to reduce trips to grocery stores to avoid being infected by Covid-19, have also resorted to panic buying and the stockpiling of food and other necessities.

Who You Are and Where You Live Matters Most

Of course, not everyone has been hit with equal force by rising food prices. Americans high on the income ladder can absorb such extra costs easily enough and, in any case, spend a substantially smaller portion of their income on groceries. According to the USDA, adults with incomes in the top fifth of society spent 8% of their income on food last year; for the bottom fifth, it was 36%. The first group also obviously has a lot more money available to stock up on food than that bottom fifth, so many of whom have also become jobless or seen their paychecks diminish since the pandemic started. In March, for example, 39% of those making less than $40,000 had already lost their jobs or had their paychecks reduced, but only 13% of those who earned $100,000 or more, and that gap continued into the fall.

Not surprisingly, then, the bigger the hit people took from the Covid-19 recession, the more likely they were to experience food insecurity, which is why aggregate statistics on the phenomenon and other societal problems attributable to the pandemic can be misleading. They tend to mask the reality that its effects have been felt primarily by the most vulnerable, while the others have been touched much more lightly, or not at all.

The variations are rooted in ethnicity and location as well as income level (and the three tend to be closely linked). A USDA report classified 19% of Black households and 16% of Hispanic households as food insecure in 2019, compared to 8% of their white counterparts. By this summer, food insecurity had increased significantly across the board, afflicting 36% of Black, 32% of Hispanic, and 18% of white households. While the pandemic has certainly made matters worse, African Americans had the highest rate among those three groups even before it started. This was especially true of counties -- the U.S. has more than 3,000 of them -- in which they were in the majority. In 2016, those particular counties accounted for a mere 3% of the national total, but 96% of them had "high food insecurity," as the Department of Agriculture defines it, as well as a poverty rate more than twice the national average (12.7% that year).

Native Americans have had the worst of it, however, since many of their families lack access to running water and plumbing (58 per 1,000 households compared to three per 1,000 for whites). Nearly 75% of Native Americans must travel more than a mile to reach a supermarket, compared to 40% of the population as a whole, and the disruption of supply chains has only diminished their food security further relative to other ethnic communities. Even prior to the pandemic, counties in which they (or Native Alaskans) constituted a majority were among those with the highest levels of food insecurity. Not coincidentally, in 2016, the poverty rate in nearly 70% of Native American-majority counties averaged a whopping 37%.

In other words, while every group has suffered in this pandemic year, race matters -- a lot -- when it comes to the degree of suffering.

So does income. In coronavirus-stricken America, only 1% of adults with an annual income exceeding $100,000 surveyed by the Census Bureau this summer responded that, during the preceding week, their household "sometimes or often did not have enough to eat." Compare that to 16% of those making $25,000-$35,000 and 28% of those earning less than $25,000.

Finally, food insecurity during the pandemic has varied by location as well. Nine states and the District of Columbia had the highest rates, ranging from Mississippi (33.5%), which stood atop this group, to Alabama (27%), which had the lowest. In between, in descending order, were Washington, D.C., Nevada, Louisiana, New York, New Mexico, Florida, Tennessee, and North Carolina.

Food Banks and Pantries: On The Front Lines

The other day, a close friend described to me the daily scene at a food distribution center in New York City's Harlem neighborhood. Well before trucks laden with food pulled up early in the morning, he said, the lines had already started forming, hundreds of people waiting patiently in a queue that encircled the block. And that's just one of many neighborhoods in New York where this is all too typical these days. In Queens, for instance, one pantry regularly faces a demand so steep that lines can extend for eight blocks. Try to imagine what the waiting time must be. All told, 1.5 million people in the city, unable to buy the groceries they need, rely on food pantries, and New York is anything but unusual. Photographs abound of cars lined up by the hundreds, even thousands, at food pantries in major cities around the country.

Feeding America, a non-profit organization that supports 200 food banks and 60,000 pantries nationwide, reports that the country's food banks have provided the equivalent of more than 4.2 billion meals since March, a 50% increase compared to a year ago, and that 40% of the people who come to such pantries are first-time visitors. A Consumer Reports survey of grocery shoppers found that nearly a fifth of them had turned to a food pantry since the pandemic began (half of whom hadn't sought such help at all in 2019). In March, before the first wave of Covid-19 began to peak, 18 million Americans already used food pantries; by August, that number had climbed to 22 million, even though an additional 6.2 million people had received benefits from SNAP (the food-stamp program in common parlance) between March and May alone. By early July, 37.4 million people had signed up for SNAP compared to 35.7 million for all of last year.

Little wonder, then, that food banks, facing a tsunami of demand, have struggled to stay stocked amid rising prices, shortages, reduced donations from big chain supermarkets, and disrupted supply chains. It's also become even harder for them to raise the money they need to operate. Not a few have buckled under the strain and many have been forced to shut down. Pantries have also had a hard time mustering volunteers, in part because seniors, particularly vulnerable to the virus, made up a significant segment of such helpers. Not surprisingly, then, food banks and pantries have battled to function or simply survive in these months, while also having to implement an array of cumbersome and costly safety measures to keep volunteers, staff, and clients infection-free.

Despite their heroic role, such food banks and pantries are the equivalent of the proverbial finger in the dike. For Covid-induced food insecurity and hunger to decline significantly, the third wave of infections will have to subside and Congress will have to offer more effective aid. The Trump administration's recent proposal, blessed by Senate Majority Leader Mitch McConnell, to provide a one-shot $600 check to all adults (whether they're unemployed or not), certainly doesn't qualify. At the same time, vaccines will have to be produced in sufficient quantities and distributed rapidly. (We are far from ready on that front.) All this in a country where striking numbers of people look askance at vaccination -- in a December survey only 63% of Americans said they would be willing to get vaccinated against Covid-19 -- and are also drawn to conspiracy mongers whose appeal has grown, thanks in part to social media.

Once the virus is vanquished or at least brought under reasonable control, the economy can be reopened. Then, many of the nearly 11 million currently unemployed people will perhaps have a shot at working again, or at seeing their employers restore the hours and wages that were cut.

Here's hoping that these various stars align by summer 2021. We can then revert to pre-pandemic normalcy, even though that state of affairs was marked by substantial poverty -- 34 million people last year -- and rising inequality.

Rajan Menon, a TomDispatch regular, is the Anne and Bernard Spitzer Professor of International Relations at the Powell School, City College of New York, senior research fellow at Columbia University's Saltzman Institute of War and Peace Studies, and a non-resident fellow at the Quincy Institute for Responsible Statecraft. His latest book is The Conceit of Humanitarian Intervention.


Copyright 2020 Rajan Menon

The nightmare Joe Biden could inherit

Donald Trump isn't just inside the heads of his Trumpster base; he's long been a consuming obsession among those yearning for his defeat in November. With barely more than a week to go before the election of our lifetime, those of us given to nail biting as a response to anxiety have by now gnawed ourselves down to the quick. And many have found other ways to manage (or mismanage) their apprehensions through compulsive rituals, which only ratchet up the angst of the moment, among them nonstop poll tracking, endless "what if" doomsday-scenario conversations with friends, and repeated refrigerator raids.

As one of those doomsday types, let me briefly suggest a few of the commonplace dystopian possibilities for November. Trump gets the majority of the votes cast in person on November 3rd. A Pew Research Center survey found that 60% of those supporting the president intend to vote that way on Election Day compared to 23% of Biden supporters; and a Washington Post-University of Maryland poll likewise revealed a sizable difference between Republicans and Democrats, though not as large. He does, however, lose handily after all mail-in and absentee ballots are counted. Once every ballot is finally tabulated, Biden prevails in the popular vote and ekes out a win in the Electoral College. The president, however, having convinced his faithful that voting by mail will result in industrial-scale fraud (unless he wins, of course), proclaims that he -- and "the American people" -- have been robbed by the establishment. On cue, outraged Trumpsters, some of them armed, take to the streets. Chaos, even violence, ensues. The president's army of lawyers frenetically file court briefs contesting the election results and feverishly await a future Supreme Court decision, Mitch McConnell having helpfully rammed through Amy Coney Barrett's nomination to produce a 6-3 conservative majority (including three Trump-appointed Supremes) that will likely favor him in any disputed election case.

Or the vote tally shows that Trump didn't prevail in pivotal states, but Republican-majority legislatures in those states appoint electors from their party anyway, defying the popular will without violating Article II, Section 1, of the Constitution, which doesn't flat-out prohibit such a stratagem. That was one possibility Barton Gellman explored in his bombshell Atlantic piece on the gambits Trump could use to snatch victory (of a sort) from the jaws of a Biden victory. Then there are the sundry wag-the-dog plots, including a desperate Trump trying to generate a pre-election rally-around-the-flag effect by starting a war with Iran -- precisely what, in 2011, he predicted Barack Obama would do to boost his chances for reelection.

And that, of course, is just part of a long list of nightmarish possibilities. Whatever your most dreaded outcome, dwelling on it doesn't make for happiness or even ephemeral relief. Ultimately, it's not under your control. Besides, no one knows what will happen, and some prominent pundits have dismissed such apocalyptic soothsaying with assurances that the system will work the way it's supposed to and foil Trumpian malfeasance. Here's hoping.

In the meantime, let's summon what passes for optimism these days. Imagine that none of the alarmist denouements materializes. Biden wins the popular vote tally and the Electoral College. The GOP's leaders discover that they do, in fact, have backbones (or at least the instinct for political survival), refusing to echo Trump's rants about rigging. The president rages but then does go, unquietly, into the night.

Most of my friends on the left assume that a new dawn would then emerge. In some respects, it indeed will. Biden won't be a serial liar. That's no small matter. By the middle of this year, Trump had made false or misleading pronouncements of one sort or another more than 20,000 times since becoming president. Nor will we have a president who winks and nods at far-right groups or racist "militias," nor one who blasts a governor -- instead of expressing shock and solidarity -- soon after the FBI foils a plot by right-wing extremists to kidnap her for taking steps to suppress the coronavirus. We won't have a president who repeatedly intimates that he will remain in office even if he loses the election. We won't have a president who can't bring himself to appeal to Americans to display their patriotism through the simple act of donning masks to protect others (and themselves) from Covid-19. And we won't have a president who lacks the compassion to express sorrow over the 225,000 Americans (and rising) who have been killed by that disease, or enough respect for science and professional expertise, to say nothing of humility, to refrain from declaring, as his own experts squirm, that warm weather will cause the virus to vanish miraculously or that injections of disinfectant will destroy it.

And these, of course, won't be minor victories. Still, Joe Biden's arrival in the Oval Office won't alter one mega-fact: Donald Trump will hand him a monstrous economic mess. Worse, in the almost three months between November 3rd and January 20th, rest assured that he will dedicate himself to making it even bigger.

The motivation? Sheer spite for having been put in the position -- we know that he will never accept any responsibility for his defeat -- of facing what, for him, may be more unbearable than death itself: losing. The gargantuan challenge of putting the economy back on the rails while also battling the pandemic would be hard enough for any new president without the lame-duck commander-in-chief and Senate Republicans sabotaging his efforts before he even begins. The long stretch between Election Day and Inauguration Day will provide Donald Trump ample time to take his revenge on a people who will have forsaken, in his opinion, the best president ever.

More on Trump's vengeance, but first, let's take stock of what awaits Biden should he win in November.

Our Covid-Ravaged Economy

To say that we are, in some respects, experiencing the biggest economic disaster since the Great Depression of the 1930s is anything but hyperbole. The statistics make that clear. The economy contracted at a staggering annual rate of 31.4% during the second quarter of this pandemic year. During the 2007-2009 Great Recession, unemployment, at its height, was 10%. This year's high point, in April, was 14.7%. Over the spring, 40 million jobs disappeared, eviscerating all gains made during the two pre-pandemic years.

There were, however, some relatively recent signs of a rebound. The Philadelphia Federal Reserve Bank's survey of economic forecasters, released in mid-August, yielded an estimate of a 19.1% expansion for the third quarter of 2020. But that optimism came in the wake of Congress passing the Coronavirus Aid, Relief, and Economic Security (CARES) Act, on March 27th, which pumped about $2.2 trillion into the economy. The slowdown in job growth between July and September suggests that its salutary effects may be petering out. Even with that uptick, the economy remains in far worse shape than before the virus started romping through the landscape.

Useful as they are, however, such aggregate figures obscure stark variations in how the pain produced by a Covid-19 economy has been felt across different parts of American society. No, we aren't all in this together, if by "together" you mean anything remotely resembling equalized distress. A Bureau of Labor Statistics (BLS) release, for instance, reveals that the joblessness behind September's 7.9% nationwide unemployment rate hit some groups far harder than others.

The jobless rate for whites dropped to 7%, but for Hispanics it was 10.3%, for African Americans 12.1%. Furthermore, high-skill, high-wage workers have gotten off far more lightly than those whose jobs can't be done from home, including restaurant servers and cooks, construction workers, meatpackers, housecleaners, agricultural laborers, subway, bus, and taxi drivers, first responders, and retail and hotel staff, among others. For workers like them, essential public health precautions, whether "social distancing" or stay-at-home decrees, haven't just been an inconvenience. They have proven economically devastating. These are the Americans who are struggling hardest to buy food and pay the rent.

More than 25 million of them fall in the lowest 20% of the earnings scale and -- no surprise here -- have, at best, the most meager savings. According to the Fed's calculations, among the bottom 25% of Americans, only 11% have enough saved to cover at least six months of basic expenses and less than 17% enough for at least three. Yes, unemployment insurance helps, but depending on the state, it covers just 30% to 50% of lost wages. Moreover, there's no telling when, or whether, such workers will be rehired or find new jobs that pay at least as much. The data on long-term unemployment isn't encouraging. The BLS reports that, in September, 2.4 million workers had been unemployed for 27 weeks or more, another 4.9 million for 15 to 27 weeks.

These disparities and the steps the Fed has taken, including keeping interest rates low and buying treasury bills, mortgage-backed securities, and corporate bonds, help explain why high stock prices and massive economic suffering have coexisted, however incongruously, during the pandemic. The problem with bull markets, however, is that they don't bring direct gains to the chunk of American society that's been hurt the most.

Nearly half of American households own no stock at all, according to the Federal Reserve, even if you count pension and 401(k) plans or Individual Retirement Accounts -- and for Black and Hispanic families the numbers are 69% and 72%, respectively. Furthermore, the wealthiest 10% of households own 84% of all stock.

Trump preens when the stock market soars, as he did on April 10th, when 16 million Americans had just filed for unemployment. Tweets trumpeting "the biggest Stock Market increase since 1974" were cold comfort for Americans who could no longer count on paychecks.

The Signs of Suffering

Even such numbers don't fully reveal the ways in which prolonged joblessness has upended lives. To get a glimpse of that, consider how low-income workers, contending with extended unemployment, have struggled to pay for two basic necessities: housing and food.

Reuters reported in late July that Americans already owed $21.5 billion in back rent. Worse yet, 17.3 million of the country's 44 million renter households couldn't afford to pay the landlord and faced possible eviction. A fifth of all renters had made only partial payments that month or hadn't paid anything. Again, not surprisingly, some were in more trouble than others. In September, 12% of whites owed back rent compared to 25% of African Americans, 24% of Asians, and 22% of Latinos. A May Census Bureau survey revealed that nearly 45% of African Americans and Hispanics but "only" 20% of whites had little or no confidence in their ability to make their June rent payments. (Households with kids were in an even bigger bind.)

The rent crunch also varied depending on a worker's education, a reliable predictor of earnings. Workers with high school diplomas earned only 60% as much as workers who had graduated from college and only 50% as much as those with a master's degree. And the more education workers had, the less likely they were to be laid off. Between February and August, 2.5% of employees with college degrees lost their jobs compared to nearly 11% of those who hadn't attended college.

Those, then, are the Americans most likely to be at risk of eviction. Yes, the federal government, states, and cities have issued rent moratoriums, but the protections in them varied considerably and, by August, they had ended in 24 of the 43 states that enacted them; nor did they release renters from future obligations to pay what they owe, sometimes with penalties. In addition, eviction stays haven't stopped landlords nationwide from taking thousands of delinquent renters to court and even, depending on state laws, seeking to evict them. The courts are clogged with such cases. Eventually, millions of renters could face what a BBC report called a potential "avalanche" of evictions.

Nor have homeowners been safe. The CARES Act did include provisions to protect some of them, offering those with federally backed mortgages the possibility of six-month payment deferrals, potential six-month extensions of that, and the possibility of negotiating affordable payment plans thereafter. In many cases, however, that "forbearance" initiative hasn't worked as intended. Often, homeowners didn't know about it, weren't aware that they had to file a formal request with their lenders to qualify, or got the runaround when they tried to do so. Still, mortgage forbearance helped millions, but it expires in March 2021, when many homeowners could still be jobless or have new jobs that don't pay as well. Just how desperate such people will be depends, of course, on how strongly Covid-19 resurges, what future shutdowns it produces, and when it will truly subside.

Meanwhile, according to the Mortgage Bankers Association, the residential mortgage delinquency rate hit 8.22% as the second quarter of 2020 ended, the highest since 2014. And between June and July, mortgage payments overdue 90 or more days increased by 20%, to a total unseen since 2010. True, we're not yet headed for defaults and foreclosures on the scale of the Great Recession of 2007-2008, but that's a very high bar.

As for hunger, a September Census Bureau survey reports that 10.5% of adults, or 23 million people, stated that household members weren't getting enough to eat. That's a sharp increase from the 3.7% in a Department of Agriculture survey for 2019. In July, the Wall Street Journal reported, 12% of adults said their families didn't have enough food (compared to 10% in May). A fifth of them lacked the money to feed their kids adequately, a three-percent increase from May. Recent food-insecurity estimates for households with children range from 27.5% to 29.5%.

Meanwhile, enrollments in the Supplemental Nutrition Assistance Program (known until 2008 as the Food Stamp Program) grew by 17% between February and May, forcing the government to increase its funding. Food banks, overwhelmed by demand, are pleading for money and volunteers. In August, a mile-long line of cars formed outside a food bank in Dallas, one of many such poignant scenes in cities across the country since the pandemic struck.

What Happens After the Election?

For those who have lost their jobs, the CARES Act provided $600 a week to supplement unemployment benefits, as well as a one-time payment of $1,200 per adult and $2,400 for married couples. That stipend, though, ended on July 31st when the Republican Senate balked at renewing it. In August, by executive order, the president directed the Federal Emergency Management Agency to step in with three weeks of $300 payments, which were extended for another three. That, however, was half what the unemployed would have received had the CARES supplement been extended and, by October, most states had used up the Trump allotments.

In the ongoing congressional negotiations over prolonging supplemental benefits and other assistance, President Trump engaged, only to disengage. With a September ABC News/IPSOS voter survey showing that just 35% of the public approved of his handling of the pandemic, and Joe Biden having opened a double-digit lead in many polls, the president suddenly offered a $1.8 trillion version of the CARES Act, only to encounter massive blowback from his own party.

And that's where we are as the election looms. If Trump loses (and accepts the loss), he will hand Joe Biden an economic disaster of the first order that he's made infinitely worse by belittling mask-wearing and social distancing, disregarding and undercutting his administration's own medical experts, peddling absurd nostrums, and offering rosy but baseless prognostications. And between November 3rd, Election Day, and January 20th, Inauguration Day, expect -- hard as it might be to imagine -- an angrier, more vengeful Trump.

For now, as his prospects for victory seem to dim, he has good reason to push for, or at least be seen as favoring, additional aid, but here's a guarantee: if he loses in November, he won't just moan about election rigging, he'll also lose all interest in providing more help to millions of Americans at the edge of penury and despair. Vindictiveness, not sympathy, will be his response, even to his base, for whom he clearly has a barely secret disdain. So accept this guarantee, as well: between those two dates, whatever he does will be meant to undermine the incoming Biden administration. That includes working to make the climb as steep as possible for the rival he's depicted as a semi-senile incompetent. He will want only one thing: to see his successor fail.

Once Trump formally hands over the presidency -- assuming his every maneuver to retain power flops -- he'll work to portray any measure the new administration adopts to corral the virus he helped let loose and to aid those in need as profligacy, and as "socialism" and governmental overreach imperiling freedom. Last guarantee: he won't waste a minute getting his wrecking operation underway, while "his" party will posture as the paragon of financial rectitude. It won't matter that Republican administrations have racked up the biggest budget deficits in our history. They, too, will ferociously resist Biden's efforts to help millions of struggling Americans.

And think of all of this, assuming Biden wins, as the "good news."

Rajan Menon, a TomDispatch regular, is the Anne and Bernard Spitzer Professor of International Relations at the Powell School, City College of New York, senior research fellow at Columbia University's Saltzman Institute of War and Peace Studies, and a non-resident fellow at the Quincy Institute for Responsible Statecraft. His latest book is The Conceit of Humanitarian Intervention.


Copyright 2020 Rajan Menon

A homelessness crisis mounts in the COVID-19 era — as Trump's callousness reaches new heights

The novel coronavirus SARS-CoV-2 has roared through the American landscape leaving physical, emotional, and economic devastation in its wake. By early July, known infections in this country exceeded three million, while deaths topped 135,000. Home to just over 4% of the global population, the United States accounts for more than a quarter of all fatalities from Covid-19, the disease produced by the coronavirus. Amid a recent surge of infections, especially across the Sun Belt, which Vice President Mike Pence typically denied was even occurring, the Centers for Disease Control and Prevention (CDC) reported that the daily total of infections had reached a record 60,000. Arizona's seven-day average alone approached that of the European Union, which has 60 times as many people.


The coronavirus is a stress test for our society — and it's exposing fundamental grim truths

The Severe Acute Respiratory Syndrome (SARS-CoV-2) virus, which causes Covid-19, seemed to emerge from deepest history, from the Black Death of the 14th century and the “Spanish Flu” of 1918. In just months, it has infected more than 1.5 million people and claimed more than 88,000 lives. The virus continues to spread almost everywhere. In no time at all, it’s shattered the global economy, sent it tumbling toward a deep recession (possibly even a depression), and left much of a planet locked indoors. Think of it as a gigantic stress test.


The shame of child poverty in the Trump era: Poor kids lose as billionaires cash in

The plight of impoverished children anywhere should evoke sympathy, exemplifying as it does the suffering of the innocent and defenseless. Poverty among children in a wealthy country like the United States, however, should summon shame and outrage as well. Unlike poor countries (sometimes run by leaders more interested in lining their pockets than anything else), the United States has no excuse for its striking levels of child poverty. After all, it has the world's 10th highest per capita income at $62,795 and an unrivaled gross domestic product (GDP) of $21.3 trillion. Despite that, in 2020, an estimated 11.9 million American kids -- 16.2% of the total -- live below the official poverty line, which is a paltry $25,701 for a family of four with two kids. Put another way, according to the Children's Defense Fund, kids now constitute one-third of the 38.1 million Americans classified as poor, and 70% of them have at least one working parent -- so poverty can't be chalked up to parental indolence.


Why arms races never end

Hypersonic weapons close in on their targets at a minimum speed of Mach 5, five times the speed of sound or 3,836.4 miles an hour. They are among the latest entrants in an arms competition that has embroiled the United States for generations, first with the Soviet Union, today with China and Russia. Pentagon officials tout the potential of such weaponry and the largest arms manufacturers are totally gung-ho on the subject. No surprise there. They stand to make staggering sums from building them, especially given the chronic “cost overruns” of such defense contracts -- $163 billion in the far-from-rare case of the F-35 Joint Strike Fighter.
