
Steve Fraser, Return of the Repressed

Yes, as TomDispatch regular and historian Steve Fraser, author of Class Matters: The Strange Career of an American Delusion, points out today, child labor is making a grim comeback in this country — and immigrant children are leading the way. This brings to my mind a tale from my own past. My grandfather, born in what’s now Ukraine, fled home at age 14, spent two years all too literally working his way north to Hamburg, Germany, where he got a job as a “scribe” — he had beautiful handwriting — and then boarded a boat for America. As his daughter, my Aunt Hilda, wrote years ago, “A boy of 16, he arrived in New York from Europe in March 1888. It was during the famous blizzard and after a sea voyage of about 30 days. He had no money. He often said that he had a German 50-cent piece in his pocket when he landed. His trip had to be in the cheapest part of the ship — way down in steerage. Poor boy… and for the first few months in America I imagine he slept behind the stove in somebody’s kitchen.”

She hardly needed to add that, as immigrant labor, he went right to work. His wife, my grandmother Celia, was born into a poverty-stricken family — her parents had been immigrants — two years before he arrived and, as Hilda reported, she completed just one year of high school before she, too, had to go to work “as soon as possible… The last job she had was with the telephone company as an operator. She was with them for a year or two when she left to get married” at age 17.

And that is indeed ancient history, personal and otherwise. But how unbelievably eerie that such a distant tale should once again be so desperately of this moment. Let Fraser explain. Tom

Caution: Children at Work

The Return of Child Labor Is the Latest Sign of American Decline

An aged Native-American chieftain was visiting New York City for the first time in 1906. He was curious about the city and the city was curious about him. A magazine reporter asked the chief what most surprised him in his travels around town. “Little children working,” the visitor replied.

Child labor might have shocked that outsider, but it was all too commonplace then across urban, industrial America (and on farms where it had been customary for centuries). In more recent times, however, it’s become a far rarer sight. Law and custom, most of us assume, drove it to near extinction. And our reaction to seeing it reappear might resemble that chief’s — shock, disbelief.

But we better get used to it, since child labor is making a comeback with a vengeance. A striking number of lawmakers are undertaking concerted efforts to weaken or repeal statutes that have long prevented (or at least seriously inhibited) the possibility of exploiting children.

Take a breath and consider this: the number of kids at work in the U.S. increased by 37% between 2015 and 2022. During the last two years, 14 states have introduced or enacted legislation rolling back regulations that govern the number of hours children can be employed, loosening restrictions on dangerous work, and legalizing subminimum wages for youths.

Iowa now allows those as young as 14 to work in industrial laundries. At age 16, they can take jobs in roofing, construction, excavation, and demolition and can operate power-driven machinery. Fourteen-year-olds can now even work night shifts and once they hit 15 can join assembly lines. All of this was, of course, prohibited not so long ago.

Legislators offer fatuous justifications for such incursions into long-settled practice. Working, they tell us, will get kids off their computers or video games or away from the TV. Or it will strip the government of the power to dictate what children can and can’t do, leaving parents in control — a claim already transformed into fantasy by efforts to strip away protective legislation and permit 14-year-old kids to work without formal parental permission.

In 2014, the Cato Institute, a right-wing think tank, published “A Case Against Child Labor Prohibitions,” arguing that such laws stifled opportunity for poor — and especially Black — children. The Foundation for Government Accountability, a think tank funded by a range of wealthy conservative donors including the DeVos family, has spearheaded efforts to weaken child-labor laws, and Americans for Prosperity, the billionaire Koch brothers’ foundation, has joined in.

Nor are these assaults confined to red states like Iowa or the South. California, Maine, Michigan, Minnesota, and New Hampshire, as well as Georgia and Ohio, have been targeted, too. Even New Jersey passed a law in the pandemic years temporarily raising the permissible work hours for 16- to 18-year-olds.

The blunt truth of the matter is that child labor pays and is fast becoming remarkably ubiquitous. It’s an open secret that fast-food chains have employed underage kids for years and simply treat the occasional fines for doing so as part of the cost of doing business. Children as young as 10 have been found toiling away in such pit stops in Kentucky, while older ones have been working beyond the hourly limits prescribed by law. Roofers in Florida and Tennessee can now be as young as 12.

Recently, the Labor Department found more than 100 children between the ages of 13 and 17 working in meatpacking plants and slaughterhouses in Minnesota and Nebraska. And those were anything but fly-by-night operations. Companies like Tyson Foods and Packer Sanitation Services (owned by BlackRock, the world’s largest asset management firm) were also on the list.

At this point, virtually the entire economy is remarkably open to child labor. Garment factories and auto parts manufacturers (supplying Ford and General Motors) employ immigrant kids, some for 12-hour days. Many are compelled to drop out of school just to keep up. In a similar fashion, Hyundai and Kia supply chains depend on children working in Alabama.

As the New York Times reported last February, helping break the story of the new child labor market, underage kids, especially migrants, are working in cereal-packing plants and food-processing factories. In Vermont, “illegals” (because they’re too young to work) operate milking machines. Some children help make J. Crew shirts in Los Angeles, bake rolls for Walmart, or work producing Fruit of the Loom socks. Danger lurks. America is a notoriously unsafe place to work and the accident rate for child laborers is especially high, including a chilling inventory of shattered spines, amputations, poisonings, and disfiguring burns.

Journalist Hannah Dreier has called it “a new economy of exploitation,” especially when it comes to migrant children. A Grand Rapids, Michigan, schoolteacher, observing the same predicament, remarked: “You’re taking children from another country and putting them almost in industrial servitude.”

The Long Ago Now

Today, we may be as stunned by this deplorable spectacle as that chief was at the turn of the twentieth century. Our ancestors, however, would not have been. For them, child labor was taken for granted.

Hard work, moreover, had long been considered by those in the British upper classes who didn’t have to do it a spiritual tonic that would rein in the unruly impulses of the lower orders. An Elizabethan law of 1575 provided public money to employ children as “a prophylactic against vagabonds and paupers.”

By the end of the seventeenth century, the philosopher John Locke, then a celebrated champion of liberty, was arguing that three-year-olds should be included in the labor force. Daniel Defoe, author of Robinson Crusoe, was happy that “children after four or five years of age could every one earn their own bread.” Later, Jeremy Bentham, the father of utilitarianism, would opt for four, since otherwise, society would suffer the loss of “precious years in which nothing is done! Nothing for Industry! Nothing for improvement, moral or intellectual.”

American “founding father” Alexander Hamilton’s 1791 Report on Manufactures noted that children “who would otherwise be idle” could instead become a source of cheap labor. And such claims that working at an early age warded off the social dangers of “idleness and degeneracy” remained a fixture of elite ideology well into the modern era. Indeed, it evidently remains so today.

When industrialization began in earnest during the first half of the nineteenth century, observers noted that work in the new factories (especially textile mills) was “better done by little girls of 6-12 years old.” By 1820, children accounted for 40% of the mill workers in three New England states. In that same year, children under 15 made up 23% of the manufacturing labor force and as much as 50% of the workforce producing cotton textiles.

And such numbers would only soar after the Civil War. In fact, the children of ex-slaves were effectively re-enslaved through onerous apprenticeship arrangements. Meanwhile, in New York City and other urban centers, Italian padrones expedited the exploitation of immigrant kids while treating them brutally. Even the then-brahmin-minded, anti-immigrant New York Times took offense: “The world has given up stealing men from the African coast, only to kidnap children from Italy.”

Between 1890 and 1910, 18% of all children between the ages of 10 and 15, about two million young people, worked, often 12 hours a day, six days a week.

Their jobs covered the waterfront — all too literally as, under the supervision of padrones, thousands of children shucked oysters and picked shrimp. Kids were also street messengers and newsies. They worked in offices and factories, banks and brothels. They were “breakers” and “trappers” in poorly ventilated coal mines, particularly dangerous and unhealthy jobs. In 1900, out of 100,000 workers in textile mills in the South, 20,000 were under the age of 12.

City orphans were shipped off to labor in the glassworks of the Midwest. Thousands of children stayed home and helped their families turn out clothing for sweatshop manufacturers. Others packed flowers in ill-ventilated tenements. One seven-year-old explained that “I like school better than home. I don’t like home. There are too many flowers.” And down on the farm, the situation was no less grim, as children as young as three worked hulling berries.

All in the Family

Clearly, well into the twentieth century, industrial capitalism depended on the exploitation of children who were cheaper to employ, less able to resist, and until the advent of more sophisticated technologies, well suited to deal with the relatively simple machinery then in place.

Moreover, the authority exercised by the boss was in keeping with that era’s patriarchal assumptions, whether in the family or even in the largest of the overwhelmingly family-owned new industrial firms of that time like Andrew Carnegie’s steelworks. And such family capitalism gave birth to a perverse alliance of boss and underling that transformed children into miniature wage-laborers.

Meanwhile, working-class families were so severely exploited that they desperately needed the income of their children. As a result, in Philadelphia around the turn of the century, the labor of children accounted for between 28% and 33% of the household income of native-born, two-parent families. For Irish and German immigrants, the figures were 46% and 35% respectively. Not surprisingly, then, working-class parents often opposed proposals for child labor laws. As noted by Karl Marx, the worker was no longer able to support himself, so “now he sells his wife and child. He becomes a slave dealer.”

Nonetheless, resistance began to mount. The sociologist and muckraking photographer Lewis Hine scandalized the country with heart-rending pictures of kids slaving away in factories and down in the pits of mines. (He got into such places by pretending to be a Bible salesman.) Mother Jones, the militant defender of labor organizing, led a “children’s crusade” in 1903 on behalf of 46,000 striking textile workers in Philadelphia. Two hundred child-worker delegates showed up at President Teddy Roosevelt’s Oyster Bay, Long Island, residence to protest, but the president simply passed the buck, claiming child labor was a state matter, not a federal one.

Here and there, kids tried running away. In response, owners began surrounding their factories with barbed wire or made the children work at night when their fear of the dark might keep them from fleeing. Some of the 146 workers, most of them young women, who died in the infamous Triangle Shirtwaist Factory fire of 1911 in Manhattan’s Greenwich Village — the owners of that garment factory had locked the doors, forcing the trapped workers to leap to their deaths from upper-floor windows — were as young as 15. That tragedy only added to a growing furor over child labor.

A National Child Labor Committee was formed in 1904. For years, it lobbied states to outlaw, or at least rein in, the use of child labor. Victories, however, were often distinctly pyrrhic, as the laws enacted were invariably weak, riddled with exemptions, and poorly enforced. Finally, in 1916, a federal law, the Keating-Owen Act, barred the products of child labor from interstate commerce. In 1918, however, the Supreme Court declared it unconstitutional.

In fact, only in the 1930s, after the Great Depression hit, did conditions begin improving. Given its economic devastation, you might assume that cheap child labor would have been at a premium. However, with jobs so scarce, adults — males especially — took precedence and began doing work once relegated to children. In those same years, industrial work began incorporating ever more complex machinery that proved too difficult for younger kids. Meanwhile, the age of compulsory schooling was steadily rising, limiting yet more the available pool of child laborers.

Most important of all, the tenor of the times changed. The insurgent labor movement of the 1930s loathed the very idea of child labor. Unionized plants and whole industries were no-go zones for capitalists looking to exploit children. And in 1938, with the support of organized labor, President Franklin Roosevelt’s New Deal administration finally passed the Fair Labor Standards Act which, at least in theory, put an end to child labor (although it exempted the agricultural sector in which such a workforce remained commonplace).

Moreover, Roosevelt’s New Deal transformed the national zeitgeist. A sense of economic egalitarianism, a newfound respect for the working class, and a bottomless suspicion of the corporate caste made child labor seem particularly repulsive. In addition, the New Deal ushered in a long era of prosperity, including rising standards of living for millions of working people who no longer needed the labor of their children to make ends meet.

Back to the Future

It’s all the more astonishing then to discover that a plague, once thought banished, lives again. American capitalism is a global system whose networks extend virtually everywhere. Today, there are an estimated 152 million children at work worldwide. Not all of them, of course, are employed directly or even indirectly by U.S. firms. But they should certainly be a reminder of how deeply retrogressive capitalism has once again become, both here at home and elsewhere across the planet.

Boasts about the power and wealth of the American economy are part of our belief system and elite rhetoric. However, life expectancy in the U.S., a basic measure of social well-being, has been relentlessly declining for years. Health care is not only unaffordable for millions, but its quality has become second-rate at best if you don’t belong to the top 1%. In a similar fashion, the country’s infrastructure has long been in decline, thanks to both its age and decades of neglect.

Think of the United States, then, as a “developed” country now in the throes of underdevelopment and, in that context, the return of child labor is deeply symptomatic. Even before the Great Recession that followed the financial implosion of 2008, standards of living had been falling, especially for millions of working people laid low by a decades-long tsunami of de-industrialization. That recession, which officially ended in 2009 though its effects lingered for years, only further exacerbated the situation. It put added pressure on labor costs, while work became increasingly precarious, ever more stripped of benefits and union protection. Given the circumstances, why not turn to yet another source of cheap labor — children?

The most vulnerable among them come from abroad, migrants from the Global South, escaping failing economies often traceable to American economic exploitation and domination. If this country is now experiencing a border crisis — and it is — its origins lie on this side of the border.

The Covid-19 pandemic of 2020-2022 created a brief labor shortage, which became a pretext for putting kids back to work (even if the return of child labor actually predated the disease). Consider such child workers in the twenty-first century as a distinct sign of social pathology. The United States may still bully parts of the world, while endlessly showing off its military might. At home, however, it is sick.

Liberty or death in the time of 'woke communism'

Steve Fraser: The Republicans Face Off Against... Yes, Communism!

Recently, at a well-publicized dinner, our former president — the one who encouraged a coup d’état to keep himself in power after losing an election — expressed his latest fear. “The problem we have,” he said ominously, “is that we are headed toward communism. We are beyond socialism. A lot of people say, ‘Well, not really.’ Of course we are.” And then he added, “If you do something spectacular, they don’t talk about it. If you do something not so good, they’ll make it into a travesty. If the other side does something bad, really bad, you’ll never find it. There’s never been a period of time like that in our country’s history. And that’s the way communism starts. And… we’re not gonna let it happen.”

OMG! A new red scare in America, as in my childhood when anti-communism was a part of everyday Republican politics? Did someone revive the Soviet Union when none of us were looking? Have Xi Jinping’s confederates infiltrated this country’s business class? Who could even begin to explain it? Hmm, let’s see… how about TomDispatch regular and historian Steve Fraser, author of Mongrel Firebugs and Men of Property: Capitalism and Class Conflict in American History, who (like Donald Trump and me) lived through those red-scare years of the 1950s.

Admittedly, running against Communism was once the norm in America. With that in mind, take a little voyage of — dare I call it nostalgia? — with Fraser and meditate a bit on what a truly strange world we now live in, one of remarkable extremes of weather, income inequality, and politics. After all, one of our two major parties has gone certifiably mad (and, at least in some parts of the country, that very madness has attracted voters in a big-time fashion).

Oh, and I think I can already hear Marjorie Taylor Greene calling me a “liar” for what I just wrote… or is that my imagination? Who can tell anymore? Tom

The Specter of “Woke Communism”: How Corporate America Became the Bogeyman of Today's Anti-Communist Crusaders

Ron DeSantis, governor of Florida and perhaps the next president of the United States, is waging war against something he and many others on the right identify as “woke communism.” DeSantis even persuaded the Florida legislature to pass a Victims of Communism law, mandating that every November 7th (the anniversary of the Bolshevik Revolution in Russia), all public schools in the state must devote 45 minutes of instruction to the evils of the red menace.

You might reasonably ask: What menace? After all, the Soviet Union fell apart more than 30 years ago and, long before that, communist parties around the world had dwindled in numbers and lost their revolutionary zeal. The American Communist Party was buried alive nearly three-quarters of a century ago during the McCarthy hysteria of the 1950s.

How then can there be a muscular rebirth of anti-communism when there’s no communism to face off against? The Claremont Institute, a right-wing think tank, explains the paradox this way: the powers that be of the present moment, including “education, corporate media, entertainment, big business, especially big tech, are to varying degrees aligned with the Democratic Party which is now controlled by Woke Communism.”

All clear now? A “cold civil war” is afoot, so we’re assured by DeSantis and crew, and if we don’t act quickly, “woke communism will replace American justice… the choice is between liberty or death.”

Naturally, Donald Trump has joined the chorus, declaiming that the Democratic Party functions as a cover for “wild-eyed Marxists.” People like Presidents Barack Obama and Joe Biden, formerly considered proud defenders of capitalism, are now censured as socialists. Steve Bannon, right-wing populist organizer and one-time Trump adviser, has attacked the Business Roundtable and financiers like Larry Fink of BlackRock, the largest asset management firm in the world, because he’s determined to defend a “government of laws, not Woke CEOs.”

At the January gathering of the Republican National Committee, angry that Ronna McDaniel had held onto her position as its chairperson, right-wing activist Charlie Kirk put the matter in stark class terms: “The country club won today. So, the grassroots people who can’t afford to buy a steak and are struggling to make ends meet, they just got told by their representatives at an opulent $900/night hotel that ‘We hate you.’”

How surpassingly odd! Somehow, the “spectre” invoked some 175 years ago by Karl Marx in The Communist Manifesto, reflecting his urge to see the exploited and impoverished mobilized to overthrow capitalism, now hangs out at country clubs, corporate boardrooms, and the White House — all the redoubts of capitalism.

Listen to DeSantis. At a rally in Sarasota during the 2022 midterm elections, he got his loudest applause for denouncing corporate America — and not just in his assault on the Walt Disney Company for its criticism of the state’s “don’t say gay” policy. He went after Wall Street, too, noting that the “masters of the universe are using their economic power to impose policies on the country that they could not do at the ballot box” and promising to “fight the Woke in corporate America.” A recent Gallup poll signals that he might be onto something, since the percentage of Republicans unhappy with big business has soared.

Once upon a time, anti-communism was the ideology of a ruling class under siege, warning that its enemies among hard-pressed farmers and industrial workers were intent on destroying the foundations of civilized life: private property, the family, religion, and the nation. Now, of all the unlikely suspects, anti-communism has become part of the ideological arsenal aimed at those very dominating elites.

Bankers and Bolsheviks

What happened? There’s certainly a history here. Start with Henry Ford, a folk hero to millions of Americans during the early decades of the twentieth century. He not only created the Model T but also helped articulate a new version of anti-communism. He was a notorious antisemite, even publishing a compilation in the early 1920s called The International Jew that warned of a Jewish conspiracy to take over the world. (It became a bestseller.)

Antisemitism had long traded on the stereotype of the Jew as Shylock, the usurious, heartless banker. Ford, who also hated bankers, sought something far grander than classic antisemitism. As he came to see it, the International Jew was conspiring not just with the titans of high finance but with their supposedly inveterate enemies, the Bolsheviks, whose tyranny in Russia was but a foretaste of the future. In America, as he saw it, financiers were secretly plotting with the Industrial Workers of the World and the Socialist Party to make war on capitalism, an unholy alliance of Wall Street, Jewish financiers, and the Kremlin, the Rothschilds and Lenin, seeking to unravel the very moral fiber of western civilization.

Together, so this line of thought went, they plotted to saturate a hard-working, family-centered, patriarchal, sexually orthodox, racially homogenous, god-fearing capitalist society with soul-destroying hedonism, allowing the bankers to make money and the Bolsheviks to find their way to power. After all, communists were atheists who held the traditional patriarchal family in contempt, believed in both women’s equality and racial equality, and felt no loyalty to the nation. Similarly, bankers worshipped Mammon and acknowledged no homeland.

According to the automaker, evidence of this conspiracy of moral subversion lay in plain sight. After all, the commies were peddling pornography through their control of the movie business, while — it was the prohibition era — they saturated the country in bootleg gin. Because they were also the masterminds behind the publishing industry, they arranged an endless flow of sex and sensationalism in newspapers, magazines, and pulp novels. And “Jewish jazz,” bankrolled by the same circles, was on its way to becoming the national music, its rhythms an open invitation to the lewd and lascivious.

In the years just preceding Ford’s antisemitic outburst, the Palmer raids of 1919 and 1920, conducted by United States Attorney General A. Mitchell Palmer in the aftermath of World War I, had imprisoned and deported thousands of radical political activists, only heightening the panic about a coming communist revolution in America. But no one before Ford had ever imagined communists combining forces with the ruling class they presumably were out to overthrow. That was the bizarre eye-opener of a disturbed and disturbing moment and, mad as it was, it should once again sound eerily familiar.

From Anti-Communism to Anti-Capitalism

That imaginary league of Bolsheviks and bankers would remain an undercurrent of popular superstition, while anti-communism began to mutate, coming to have ever less to do with communist movements and ever more with a perverse form of anti-capitalism.

As the giant corporation run by faceless functionaries in suits and ties along with vast government bureaucracies supplanted old-style family capitalism, a whole galaxy of moral and social certitudes about self-reliance, frugality, independence, upward mobility, and piety came under assault. The new order, capitalism on steroids, left a beleaguered and angry world of “little men” in its wake, overwhelmed by a sense of material and spiritual dispossession.

In the 1930s, President Franklin Roosevelt’s New Deal response to the Great Depression would only reignite the dark fantasies of Ford’s conspiracy-mongering. In those years, populist demagogue Father Charles Coughlin (known as the “radio priest” thanks to his charismatic weekly sermons listened to by millions) preached about how corporate capitalism was “privately sustaining in some instances the worst elements of Communism.” Coughlin grew increasingly hostile to the New Deal. Its bureaucracies, he claimed, meddled in family life, while its regulatory reforms were a disguised version of “financial socialism.” In 1936, he and fellow demagogues formed the Union Party to try (unsuccessfully) to stop Roosevelt’s reelection.

In the eyes of that radio priest, another antisemite by the way, Roosevelt was nothing less than America’s Lenin and his New Deal “a broken-down Colossus straddling the harbor of Rhodes, its left leg standing on ancient Capitalism and its right mired in the red mud of communism.”

Joseph McCarthy vs. The Establishment

Though Ford’s and Coughlin’s outpourings were infected with a deep strain of antisemitism, the invective of Wisconsin Senator Joseph McCarthy — whose name became synonymous with mid-twentieth-century anti-communism — was not. He was after bigger game, namely the whole WASP establishment.

The Cold War with the Soviet Union provided the ostensible context for McCarthy’s ravings about “a conspiracy so immense and an infamy so black as to dwarf any previous venture in the history of man.” But his focus was far less on the Russians and far more on the upper-crust mandarins running the post-New Deal state.

Figures like Secretary of State Dean Acheson or international banker John McCloy were, he insisted, traitors to their class and closet communists. Yet, gallingly, they also hailed from the most privileged precincts of American society, places like the elite prep school Groton, Harvard, and Wall Street. McCarthy would mock their cosmopolitan associations, their Anglophilia, their gilded careers as international financiers and the heads of major corporations. He would typically portray Averell Harriman, the scion of a railroad and banking family (and a future governor of New York), as “a guy whose admiration for everything Russian is unrivaled outside the confines of the Communist Party.”

The notorious Senate “hearings” he held and the McCarthyism he promoted proved potent enough to ruin the lives of countless teachers, writers, trade unionists, civil-rights activists, performing artists, journalists, even librarians, who lost their jobs and worse. And in those years, much of the Republican Party would mimic his message.

Few were safe from such fulminations and McCarthy was anything but alone in delivering them. For instance, the son of a former president, Senator Robert Taft, known as “Mr. Republican” and a leader from the party’s midwestern heartland, was often hailed as its future presidential nominee. Running in the 1952 primaries, he told his supporters in Ohio that “if we get [Dwight D.] Eisenhower, we will practically have a Republican New Deal Administration with just as much spending and socialism as under [President Harry] Truman.” When he lost the nomination to the former World War II commander, Taft would rage that “every Republican candidate for president since 1936 has been nominated by Chase Bank.”

The imagery of tea-sipping, silk handkerchiefs, and silver spoons that spiced McCarthy’s savage depiction of the supposedly left-wing establishment pointed to a subtle shift in the political center of gravity of the anticommunist crusade. While the economic throw-weight of those capitalists-cum-communists remained in the crosshairs of the McCarthyites, cultural matters tended to take center stage.

Although the moral dangers of a supposedly communist-influenced New Deal-style state still preoccupied the senator and his legions of followers, his archetypical enemy came ever more to resemble Coughlin’s: not just left-leaning intellectuals but Ivy League financiers, bankers with “grouse-hunting estates in Scotland,” whom they saw as an aristocracy of destruction.

Liberty or Death in the Time of “Woke Communism”

Oscillating with the ups and downs of the economy, that version of anti-capitalism regularly masqueraded as anti-communism. And all these years later, “woke capital,” the target of so much Trumpublican fury, is once again being labeled a communist phenomenon. That’s because so many Fortune 500 companies, leading banks, and mass-media outfits have had to come to terms with racial and gender equality, sexual and marital choice, and multicultural diversity — with, that is, the latest version of secularism generally.

Coca-Cola and Delta have even criticized Republican state voter suppression laws. As early as 2015, corporate opposition forced Republicans in Indiana and North Carolina to back off anti-gay and anti-transgender legislation. In 2019, more than 180 CEOs posted a full-page ad in the New York Times announcing that restrictions on abortion were bad for business. A year later, Goldman Sachs established a $10 million fund to promote racial equality to “honor the legacy of George Floyd, Breonna Taylor, and Ahmaud Arbery.” After the January 6th insurrection at the Capitol, more than 50 companies said they would no longer contribute to the eight Republican senators who objected to certifying the presidential election.

Moreover, dominant financial and business interests depend on the various agencies of the state to subsidize profits and stabilize the economy. This feeds a government apparatus that has long been the bête noire of anti-communists. When the Covid-19 pandemic took down the economy, the 1% thrived, enraging many on the right as well as the left.

Refugees from the social revolutions in Venezuela and Cuba, who have gathered in significant numbers in Florida, no doubt resonate with the latest anti-communist dog whistles of Governor DeSantis in a way that echoes the sentiments aroused in the old days by the Bolshevik Revolution. But what really fires up the passions of Republicans is the tendency of modern capitalism to make peace with (and profit from) the social, racial, and more recently environmental upheavals of the last half-century. Resentment about this continues to ferment in right-wing circles.

The Republican Party now claims to be “standing between capitalism and communism.” But the capitalism it promotes — of the self-reliant entrepreneur, the pious family patriarch, the free-booting version of commerce that depended on racial and gender inequalities and brooked no interference from the state — has been on the defensive for a long time.

Yes, the DeSantis-style anti-communists of today do worry about the possible appeal of socialism to young people, but in real life, the “communism” they face off against is modern capitalism, or what one right-wing wit termed “capitalism with Chinese characteristics.” “Woke capital” or, if you’re the governor of Florida, “woke communism,” has indeed seized power.

Ironically, those of the woke-communist persuasion may be loony about “communism,” but they are unintentionally right that capitalism is indeed the problem. Capitalism exploits millions of workers, devours the environment, creates obscene inequalities in income and wealth, depends on war and the machinery of war, poisons the well of democracy, incites resentments, and destroys any instinct for social solidarity. But count on one thing: the wet dreams of today’s anti-communists when it comes to restoring some superannuated, idealized form of small-town capitalism are harmful fantasies of the first order. Indeed, they have already done much damage, and Governor DeSantis and crew will only make things far worse.

Donald Trump's tyrannical Supreme Court is nothing new

Steve Fraser, The Rogue Court, Then and Now

Try to take this in for a moment. Only recently, as ProPublica reported, a Chicago electronics mogul gave the largest political donation in American history, $1.6 billion. (Yes, you read that right!) To put such a sum in perspective — if that’s even conceivable — the New York Times pointed out that it’s “slightly more than the total of $1.5 billion spent in 2020 by 15 of the most politically active nonprofit organizations that generally align with Democrats.” As it happens (don’t be shocked!), that gift didn’t go to the Democrats. It was given to the Marble Freedom Trust, a new nonprofit run by Leonard Leo, a key adviser to Donald Trump. Long involved in trying to create a deeply conservative court system, Leo helped The Donald definitively shape a right-wing super-majority from hell on the Supreme Court, one that’s slated to be in place for years to come and is now moving to transform our country in a staggering fashion, starting, of course, with abortion rights.

Once upon a time, such a donation would have been inconceivable. However, thanks to a series of court decisions, chiefly the Supreme Court’s 5-4 Citizens United ruling of 2010 that wiped out century-old campaign finance restrictions, corporations and other outside — or as they’re known, “dark-money” — groups can now essentially sink unlimited funds into electing those they believe will benefit them most. In this case, that money will assumedly help Leo and his nonprofit further reshape both American politics and our courts.

In such a context, TomDispatch regular and historian Steve Fraser reminds us of something that’s been all too easily forgotten by those (like me) who grew up in the years of a Supreme Court run by liberal justice Earl Warren. Historically speaking, Trump’s Supremes are anything but an extreme aberration. In fact, they’re eerily closer to the norm of American history when it comes to the top court in the land. In the end, you’ll have to decide whether knowing that offers you a kind of grim relief or a sense of even deeper horror. But first take a little trip through time with Fraser and consider the long history of America’s rogue court. Tom

The Trump Supreme Court Is Nothing New: A History of the Tyranny of the Supremes

Has the Trump Supreme Court gone rogue? The evidence mounts. Certainly, its recent judicial blitzkrieg has run roughshod over a century’s worth of settled law.

A woman’s right to get an abortion? Gone (at least as a constitutionally protected civil right). Meanwhile, voting rights are barely hanging on, along with the 1965 Voting Rights Act that gave them life. State legislatures, so the court ruled, may no longer rein in the wanton availability of firearms and so the bloodshed will inevitably follow. Climate catastrophe will only get closer as the Supremes have moved to disarm the Environmental Protection Agency’s efforts to reduce carbon emissions. Religion, excluded from the public arena since the nation’s founding, can now invade the classroom, thanks to the court’s latest pronouncement.

This renegade court is anything but finished doing its mischief. Affirmative action may be next on the chopping block. Gerrymandering, long an ignoble tradition in American political life, could become unconstrained if the Supremes decide to exempt such practices from state court judicial review. And who knows what they are likely to rule when every election not won by the Republican Party may be liable to a lawsuit.

Donald Trump’s three appointments to the court — Neil Gorsuch, Brett Kavanaugh, and Amy Coney Barrett — cemented in place a rightward shift in its center of gravity that had begun decades earlier. Ever since President Ronald Reagan appointed William Rehnquist, a staunch conservative, as chief justice in 1986, the court has only become ever more averse to regulating business, even as it worked to reduce the power of the Federal government.

Don’t forget that it essentially appointed George W. Bush president in 2000 by ruling that Florida couldn’t conduct a recount of the vote, though it seemed likely that Al Gore would prevail and enter the Oval Office. And even after Rehnquist passed away, the court’s 2010 Citizens United decision granted corporations the same free speech rights as people, further eroding democracy by removing limitations on their campaign contributions.

This march to the right was in stark contrast to the earlier deliberations of the court led by Chief Justice Earl Warren. The Warren court was, of course, best known for its landmark 1954 Brown v. Board of Education decision striking down public school segregation. It would also become the judicial centerpiece of a post-World War II liberal order that favored labor unions, civil rights, government oversight of business, and the welfare state.

Historically speaking, however, it was the Warren Court that was the exception, not the court cobbled together by Donald Trump and effectively, if not officially, presided over by Justice Clarence Thomas. The Supremes were born to be bad.

Enshrined in the Constitution

From the beginning, the Supreme Court was conceived as a bulwark against excessive democracy, as indeed was the Constitution itself.

During the years leading up to the 1787 constitutional convention in Philadelphia, the country was in a chronic state of upheaval. Local insurrections against heavy taxation, land and currency speculators, and merchant-bankers had called into question the security and sanctity of private property. Local legislatures proved vulnerable to takeover by the hoi polloi, who felt free to cancel debts, print paper money, stop evictions, and oust elites from their accustomed positions of power.

Various impediments to this kind of “mobocracy” were baked into the Constitution, including the electoral college for presidential votes and the indirect election of senators by state legislatures (until the 17th amendment was ratified in 1913). The Supreme Court was just another such obstacle.

Founding Father James Madison typically saw that court as protection against “factious majorities” at the state and local level that might threaten the rights of property-holders. Fearing “passionate majorities,” he went so far as to propose a joint executive-judicial council with veto power over all legislation.

That idea went nowhere. Still, the principle of “judicial review” — the power of the court to have the last say on the constitutionality of legislation — although not made explicit in the Constitution, was implicit in the way the founding fathers sought to rein in democratic impulses. French author Alexis de Tocqueville in his nineteenth-century classic, Democracy in America, typically recognized the special status accorded to judicial elites, describing them as America’s “high political class.”

At first, the Supreme Court’s services weren’t needed as a guardian of vested interests and its presence was muted indeed. It met in the basement of the Capitol and, between 1803 and 1857, struck down only two federal statutes. (Compare that to the 22 it struck down between 1992 and 2002 alone.)

The court would, however, establish an enduring reputation for conservatism thanks to its infamous 1857 Dred Scott decision. By a 7-2 majority, the justices declared all Black people — free or enslaved — to be non-citizens. They also ruled that, even if a slave made his or her way to a free state, he or she would remain the property of the slave owner and declared that no territory under U.S. jurisdiction could prohibit slavery.

Dred Scott is generally considered to be the most egregious decision in the court’s history. That ruling was, however, in keeping with its basic orientation: to side with propertied interests, not the unpropertied; slave-owners, not slaves; and industrialists and financiers rather than with those who worked for and depended on them.

Gatling-Gun Injunctions and Yellow Dog Contracts

After the Civil War, the court became ever more aggressive in defending the interests of the powerful. There was a need for that as, once again, the powerless threatened the status quo.

Reconstruction — the period immediately after the Civil War when the Federal government imposed martial law on the former Confederate states — empowered ex-slaves to militantly exercise their rights to full civil and political equality under the 14th and 15th amendments. Desperate farmers in the Midwest, on the Great Plains, and in the South were then mobilizing to protect themselves from predatory banks, railroads, and commodity speculators. Industrial workers were engaged in pitched battles with their employers, confrontations that elicited widespread sympathy in cities and towns across the country.

“Passionate majorities” needed chastening and the court met the challenge. It launched an era, much like our own, of “judge-made law” that would last from the late 1880s into the 1920s.

Early on, the Supremes declared the Civil Rights Act of 1875 unconstitutional. Later, in Plessy v. Ferguson, they made segregation constitutionally legitimate via the doctrine of “separate but equal” and so helped restore elite white rule in the South. By ensconcing segregation, they also ended the hopes aroused by the Populist movement for an alliance of black and white rural poor against predatory banks and landlords.

The populist fervor of that era led some state legislatures to adopt laws regulating railroad rates and the fees charged by grain-elevator operators, while challenging corporate monopoly power over the vital necessities of life. Initially, the court trod carefully. Soon enough, however, the justices shed that reticence, using the power of judicial review to wipe such laws off the books. With a distinct touch of irony, they concluded that, in the eyes of the law, corporations were indeed persons and so entitled to the very civil rights guaranteed to ex-slaves by the 14th amendment (“rights” presumably denied them under state regulatory statutes).

Regulating business, the justices suggested, was tantamount to confiscating it. As one railroad lawyer had argued before the court, such regulation was “communism pure and simple.” From that same perspective, the court found a federal law establishing an income tax unconstitutional. (It took the 16th amendment, passed in 1913, to make the income tax national law.)

Industrial capitalism accumulated its wealth by subjecting the lives of millions of workers to abject misery: poverty, overwork, danger, disease, and profound indignity. It would prove a bloody affair, igniting confrontations between workers and their bosses more violent than anywhere else in the western world. As those workers began organizing collectively, their middle-class allies occasionally succeeded in passing relevant laws for minimum wages, outlawing child labor, putting a ceiling on the work hours an employer could enforce, and making the workplace safer or, at least, compensating those injured on the job.

The justices of the Supreme Court, some of whom had once been lawyers for the railroad, iron, and steel industries, knew just what to do in response to such democratic challenges to the prerogatives of capital. While the right to strike might be honored in theory, the court issued injunctions to stop such strikes from happening so often that the era became known (after the early machine gun of that time) for its “Gatling-gun injunctions.” That term was used in part as well because such rulings could be enforced by the Army or its state militia equivalents, not to mention the imprisonment and heavy fines often involved. During one such bloody encounter, William Howard Taft, then an Ohio judge, later president, and finally chief justice of the Supreme Court, complained that federal troops had “killed only six of the mob as yet. This is hardly enough to make an impression.”

To rub yet more salt in the wound, such injunctions were often justified under the Sherman Anti-Trust Act of 1890. Originally designed to break up monopolies, it would be used far more frequently to bust strikes (and sympathy boycotts) on the grounds that they were “conspiracies in restraint of trade.” The court repeatedly enjoined “secondary boycotts”; that is, supportive actions by other unions or groups sympathetic to striking workers. It also struck down a Kansas statute that banned “yellow dog contracts” — agreements, which many workers were forced to sign on being hired, promising that they would never join a union.

Laws that attempted to ameliorate the harshness of working-class life were treated with similar disdain. New York state, for example, passed one banning cigar making in tenement workshops as a danger to workers’ health. The court saw otherwise, treating such tenement dwellers as independent contractors who had freely chosen their way of life.

New York also tried to limit the hours bakers could work to 10 a day and 60 a week. At the time, they were normally compelled to work 75 to 100 hours weekly in ill-ventilated cellars of tenement bakeries where breathing in the flour was a danger to their lungs. The justices begged to differ. In Lochner v. New York — named after the bakery owner who sued the state — they refused to recognize any threat to the well-being of bakers who, in the eyes of the court, had freely contracted to work on those terms. They were after all as free as their employers to strike a bargain or choose not to work.

The freedom of contract was then the reigning judicial orthodoxy, inherited ironically enough from the long struggle against slave labor. Unlike slaves, free laborers allegedly enjoyed an equality of standing in any contractual relationship with an employer. Laws or unions that interfered with that “freedom” were rendered nugatory by the court, no matter how obvious it was that the imputed equality between the owners of capital and the men and women compelled to work for them was illusory.

The only laws of that sort which passed muster were those protecting women and child laborers. The justices considered such workers inferior and dependent, and so, unlike men, unable to freely enter into relations of contractual equality. In the case of women, there was the added danger of jeopardizing their maternal role. Still, consider it an indication of just how reliant businesses had then become on child labor that even a federal law that controlled the ages and hours children could work was, in the end, struck down by the Supreme Court.

The Court v. the People

By the turn of the twentieth century, the outcry against “judge-made law,” the willful manipulation of the Constitution to shore up endangered bastions of wealth and power, had grown ever stronger. Some more recent scholars have found the court’s rulings then not as one-sided as its reputation suggests, but contemporaries certainly didn’t share those doubts.

When the Supreme Court overturned an income tax law, a dissenting justice vividly described its decision as a “surrender to the moneyed classes.”

Similarly, in 1905, Supreme Court Justice Oliver Wendell Holmes broke with his colleagues when they ruled in the Lochner case, noting that “the 14th amendment does not enact Mr. Herbert Spencer’s Social Statics.” (Spencer was then the world’s foremost proponent of social Darwinism and a staunch defender of free-market economics.) A few years later, future Supreme Court Justice Louis Brandeis cuttingly noted that “to destroy a business is illegal. It is not illegal to lower the standard of the working man’s living or to destroy the union which aims to raise or maintain such a standard. A business is property… A man’s standard of living is not property.”

Other voices were also being raised in alarm over the coming of a “judicial oligarchy.” Politicians from former president Theodore Roosevelt to perennial Socialist Party presidential candidate Eugene Debs began denouncing “the rogue court.” When he ran again for president in 1912 as the candidate of the Bull Moose, or Progressive Party, Roosevelt declared that the people are “the ultimate makers of their own Constitution” and swore that Americans would not surrender that prerogative to “any set of men, no matter what their positions or their character.” His rival for the party’s nomination, Wisconsin senator Robert La Follette, typically offered this observation: “Evidence abounds that… the courts pervert justice almost as often as they administer it.” There existed, he concluded, “one law for the rich and another for the poor.”

Calls for reform back then should sound eerily familiar today. Populist presidential candidate James Weaver urged that Supreme Court justices be elected and lifetime terms abolished. A bill introduced in Congress proposed that a majority of both houses should have the power to recall and remove a judge from office. Another demanded a super-majority of justices — seven out of nine — be required to invalidate a law. Roosevelt argued that there should be popular referenda on the court’s decisions. The Socialist Party demanded that the Supreme Court’s power to review the constitutionality of federal laws be done away with and all judges elected for short terms.

Still, the court prevailed until the Great Depression of the 1930s. Under President Franklin Roosevelt, however, new laws were passed regulating business and finance, establishing a national minimum wage and maximum work hours, and legalizing the right to join a union. Together with yet another uprising of beleaguered industrial workers in those years, this would shift the balance of power. Even then, the Supreme Court justices at first succeeded in nullifying key pieces of Roosevelt’s economic recovery legislation, while Democrats at the time (as today) talked about adding new justices to the court.

In the end, however, the national trauma of capitalism seemingly on the verge of collapse, the weight of changing public opinion, and the aging out of some of the justices ended the dominion of the Lochner court.

“The Race Question”

During the long years of opposition to that court, little of the criticism touched on “the race question.” How to account for that? From the Gilded Age of the late nineteenth century to Roosevelt’s New Deal, Americans were preoccupied with “the labor question” (as it was then called) — that is, how to deal with the great social divide between capital and labor opened up by industrialization.

The silence when it came to the no less striking racial bias of the Supreme Court speaks to a ubiquitous national blindness on matters of racial justice then. Of course, segregation was settled law at the time. In the words of a justice deciding the Plessy case, white supremacy was “in the nature of things.” (Sound familiar?) So, too, the relative weakness of mass movements addressing the racial dilemma during the Lochner court years was striking, making the issue easier to ignore.

The Supreme Court’s original responsibility was, as James Madison once put it, to guard against the “tyranny of the majority.” African-Americans were, of course, a long-tyrannized minority.

However, on that subject the Lochner court went AWOL, even by its own standards. If the “minority” in question happened to be a corporation, it, of course, needed the court’s protection. Not so fortunate were millions of ex-slaves and their descendants.

Eventually, a different Supreme Court, the one overseen by Chief Justice Earl Warren, faced the “race question.” Indeed, it expanded civil rights and civil liberties generally by making racial segregation illegal in public schools, increasing the constitutional rights of defendants, outlawing state-sponsored school prayer, and creating the groundwork to legalize abortion.

Times had changed. Civil rights for African-Americans (about which Roosevelt’s New Deal did little) became an increasing concern during and after World War II. Growing civil rights organizations and a then-powerful labor movement began to press the issue ever harder. By the time the Warren Court made its celebrated 1954 Brown v. Board of Education decision, race had become a “question,” just as the “labor question” had in the New Deal era.

Before then, pressure alone, however muscular, had not produced a shift in the high court’s approach as the Lochner court so amply demonstrated. Segregation had, after all, become entrenched as a way of life endorsed by local white legislatures. Southern commercial interests in particular — plantation owners, textile manufacturers, and raw material producers — depended on it.

Beyond those circles, however, segregation had become increasingly repellent in a culture ever more infused with the multi-ethnic sympathies and cosmopolitanism of the New Deal era. In beginning the dismantlement of legal segregation, the Warren court would not, in fact, threaten the country’s central institutions of power and wealth which, if anything, had by then come to find American-style apartheid inimical to their interests.

Justice is supposed to be nonpolitical, but that has never been the case. What was once termed the “counter-majoritarian” mission of the court — to discipline “passionate majorities” — produced great wrongs in the era of the Gatling-gun injunction as had also been true earlier. The Warren court, however, was the exception. It achieved the very opposite results, even as it relied on the same constitutional logic (the civil rights enshrined in the 14th amendment) the Lochner court had in thwarting mass movements for justice and equality.

Today’s Supreme Court is more than Donald Trump’s creation. It’s the result of a long counter-revolution against the political, economic, and cultural reforms of the New Deal, as well as of the labor, civil rights, women’s, and gay liberation movements of the last century.

Sadly, those are the “passionate majorities” the court now seems all too determined to squelch and in that it stands in a long American tradition, though one most of us had forgotten in the Warren years. One thing should be obvious by now: if the country is ever to live up to its democratic and egalitarian promise, the tyranny of the Supreme Court must be ended.

The Rebellion From the Left and the Right: Where Will It Take Us?

Arising from the shadows of the American repressed, Bernie Sanders and Donald Trump have been sending chills through the corridors of establishment power. Who would have thunk it? Two men, both outliers, though in starkly different ways, seem to be leading rebellions against the masters of our fate in both parties; this, after decades in which even imagining such a possibility would have been seen as naïve at best, delusional at worst. Their larger-than-life presence on the national stage may be the most improbable political development of the last American half-century. It suggests that we are entering a new phase in our public life.

One of the Most Successful Right-Wing Myths of All Time: 'The Limousine Liberal'

The following is an excerpt from the new book Limousine Liberal by Steve Fraser (Basic Books, 2016):

How the Economy Favored the 1 Percent Over 100 Years Ago

How Billionaires Like the Koch Brothers, Sheldon Adelson and Sam Walton Are Destroying America

From the San Francisco Earthquake to Superstorm Sandy, How Disaster Capitalism Makes Money off of Tragedy

America Is Turning Into One Big Prison for People in Debt

Getting Paid 93 Cents a Day in America? Corporations Bring Back the 19th Century

The Perfect Storm of Campaign 2008

Will the presidential election of 2008 mark a turning point in American political history? Will it terminate with extreme prejudice the conservative ascendancy that has dominated the country for the last generation? No matter the haplessness of the Democratic opposition, the answer is yes.

With Richard Nixon's victory in the 1968 presidential election, a new political order first triumphed over New Deal liberalism. It was an historic victory that one-time Republican strategist and now political critic Kevin Phillips memorably anointed the "emerging Republican majority." Now, that Republican "majority" finds itself in a systemic crisis from which there is no escape.

Only at moments of profound shock to the old order of things -- the Great Depression of the 1930s or the coming together of imperial war, racial confrontation, and de-industrialization in the late 1960s and 1970s -- does this kind of upheaval become possible in a political universe renowned for its stability, banality, and extraordinary capacity to duck things that matter. The trauma must be real and it must be perceived by people as traumatic. Both conditions now apply.

War, economic collapse, and the political implosion of the Republican Party will make 2008 a year to remember.

The Politics of Fear in Reverse

Iraq is an albatross that, all by itself, could sink the ship of state. At this point, there's no need to rehearse the polling numbers that register the no-looking-back abandonment of this colossal misadventure by most Americans. No cosmetic fix, like the "surge," can, in the end, make a difference -- because large majorities decided long ago that the invasion was a fiasco, and because the geopolitical and geo-economic objectives of the Bush administration leave no room for a genuine Iraqi nationalism which would be the only way out of this mess.

The fatal impact of the President's adventure in Iraq, however, runs far deeper than that. It has undermined the politics of fear which, above all else, had sustained the Bush administration. According to the latest polls, the percentage of Democrats who rate national security a key concern has shrunk to a level bordering on the statistically irrelevant. Independents display a similar "been there, done that" attitude. Republicans do express significantly greater levels of alarm, but far lower than a year or two ago.

In fact, the politics of fear may now be operating in reverse. The chronic belligerence of the Bush administration, especially in the last year with respect to Iran, and the cartoonish saber-rattling of Republican presidential candidates (whether genuine or because they believe themselves captives of the Bush legacy) are scary. Their only promise seems to be endless war for purposes few understand or are ready to salute. To paraphrase Franklin Delano Roosevelt, for many people now, the only thing to fear is the politics of fear itself.

And then there is the war on the Constitution. Randolph Bourne, a public intellectual writing around the time of World War I, is remembered today for one trenchant observation: that war is the health of the state. Mobilizing for war invites the cancerous growth of the bureaucratic state apparatus and its power over everyday life. Like some over-ripe fruit, this kind of war-borne "healthiness" is today visibly morphing into its opposite -- what we might call the "sickness of the state."

The constitutional transgressions of the executive branch and its abrogation of the powers reserved to the other two branches of government are, by now, reasonably well known. Most of this aggressive over-reaching has been encouraged by the imperial hubris exemplified by the invasion of Iraq. It would be short-sighted to think that this only disturbs the equanimity of a small circle of civil libertarians. There is a long-lived and robust tradition in American political life always resentful of this kind of statism. In part, this helps account for wholesale defections from the Republican Party by those who believe it has been kidnapped by political elites masquerading as down-home, "live free or die" conservatives.

Now, add potential economic collapse to this witches' brew. Even the soberest economy watchers, pundits with PhDs -- whose dismal record in predicting anything tempts me not to mention this -- are prophesying dark times ahead. Depression -- or a slump so deep it's not worth quibbling about the difference -- is evidently on the way; indeed, it is already underway. The economics of militarism have been a mainstay of business stability for more than half a century; but now, as in the Vietnam era, deficits incurred to finance invasion only exacerbate a much more embracing dilemma.

Start with the confidence game being run out of Wall Street; after all, the subprime mortgage debacle now occupies newspaper front pages day after outrageous day. Certainly, these tales of greed and financial malfeasance are numbingly familiar. Yet, precisely that sense of déjà vu all over again, of Enron revisited, of an endless cascade of scandalous, irrational behavior affecting the central financial institutions of our world suggests just how dire things have become.

Enronization as Normal Life

Once upon a time, all through the nineteenth century, financial panics -- often precipitating more widespread economic slumps -- were a commonly accepted, if dreaded, part of "normal" economic life. Then the Crash of 1929, followed by the New Deal Keynesian regulatory state called into being to prevent its recurrence, made these cyclical extremes rare.



Beginning with the stock market crash of 1987, however, they have become ever more common again, most notoriously -- until now, that is -- with the dot.com implosion of 2000 and the Enronization that followed. Enron seems like only yesterday because, in fact, it was only yesterday, which strongly suggests that the financial sector is now increasingly out of control. At least three factors lurk behind this new reality.



Thanks to the Reagan counterrevolution, there is precious little left of the regulatory state -- and what remains is effectively run by those who most need to be regulated. (Despite bitter complaints in the business community, the Sarbanes-Oxley bill, passed after the dot.com bubble burst, has proven weak tea indeed when it comes to preventing financial high jinks, as the current financial meltdown indicates.)



More significantly, for at least the last quarter-century, the whole U.S. economic system has lived off the speculations generated by the financial sector (sometimes given the acronym FIRE, for finance, insurance, and real estate). It has grown exponentially while, in the country's industrial heartland in particular, much of the rest of the economy has withered away. FIRE carries enormous weight and the capacity to do great harm. Its growth, moreover, has fed a proliferation of financial activities and assets so complex and arcane that even their designers don't fully understand how they operate.



One might call this the sorcerer's apprentice effect. In such an environment, the likelihood and frequency of financial panics grows, so much so that they become "normal accidents" -- an oxymoron first applied to highly sophisticated technological systems like nuclear power plants by the sociologist Charles Perrow. Such systems are inherently subject to breakdowns for reasons those operating them can't fully anticipate, or correctly respond to, once they're underway. This is so precisely because they never fully understood the labyrinthine intricacies and ramifying effects of the way they worked in the first place.



Likening the current subprime implosion to such a "normal accident" is more than metaphorical. Today's Wall Street fabricators of avant-garde financial instruments are actually called "financial engineers." They got their training in "labs," much like Dr. Frankenstein's, located at Wharton, Princeton, Harvard, and Berkeley. Each time one of their confections goes south, they scratch their heads in bewilderment -- always making sure, of course, that they have financial life-rafts handy, while investors, employees, suppliers, and whole communities go down with the ship.



What makes Wall Street's latest "normal accident" so portentous, however, is the way it is interacting with, and infecting, healthier parts of the economy. When the dot.com bubble burst many innocents were hurt, not just denizens of the Street. Still, its impact turned out to be limited. Now, via the subprime mortgage meltdown, Main Street is under the gun.



It is not only a matter of mass foreclosures. It is not merely a question of collapsing home prices. It is not simply the shutting down of large portions of the construction industry (inspiring some of those doom-and-gloom prognostications). It is not just the born-again skittishness of financial institutions which have, all of a sudden, gotten religion, rediscovered the word "prudence," and won't lend to anybody. It is all of this, taken together, which points ominously to a general collapse of the credit structure that has shored up consumer capitalism for decades.



Campaigning Through a Perfect Storm of Economic Disaster



The equity built up during the long housing boom has been the main resource for ordinary people financing their big-ticket-item expenses -- from college educations to consumer durables, from trading-up on the housing market to vacationing abroad. Much of that equity, that consumer wherewithal, has suddenly vanished, and more of it soon will. So, too, the life-lines of credit that allow all sorts of small and medium-sized businesses to function and hire people are drying up fast. Whole communities, industries, and regional economies are in jeopardy.



All of that might be considered enough, but there's more. Oil, of course. Here, the connection to Iraq is clear; but, arguably, the wild escalation of petroleum prices might have happened anyway. Certainly, the energy price explosion exacerbates the general economic crisis, in part by raising the costs of production all across the economy, and so abetting the forces of economic contraction. In the same way, each increase in the price of oil further contributes to what most now agree is a nearly insupportable U.S. balance of payments deficit. That, in turn, is contributing to the steady withering away of the value of the dollar, a devaluation which then further ratchets up the price of oil (partially to compensate holders of those petrodollars who find themselves in possession of an increasingly worthless currency). As strategic countries in the Middle East and Asia grow increasingly comfortable converting their holdings into euros or other more reliable -- which is to say, more profitable -- currencies, a speculative run on the dollar becomes a real, if scary, possibility for everyone.



Finally, it is vital to recall that this tsunami of bad business is about to wash over an already very sick economy. While the old regime, the Reagan-Bush counterrevolution, has lived off the heady vapors of the FIRE sector, it has left in its wake a de-industrialized nation, full of super-exploited immigrants and millions of families whose earnings have suffered steady erosion. Two wage-earners, working longer hours, are now needed to (barely) sustain a standard of living once earned by one. And that doesn't count the melting away of health insurance, pensions, and other forms of protection against the vicissitudes of the free market or natural calamities. This, too, is the enduring hallmark of a political economy about to go belly-up.



This perfect storm will be upon us just as the election season heats up. It will inevitably hasten the already well-advanced implosion of the Republican Party, which is the definitive reason 2008 will indeed qualify as a turning-point election. Reports of defections from the conservative ascendancy have been emerging from all points on the political compass. The Congressional elections of 2006 registered the first seismic shock of this change. Since then, independents and moderate Republicans continue to indicate, in growing numbers in the polls, that they are leaving the Grand Old Party. The Wall Street Journal reports on a growing loss of faith among important circles of business and finance. Hard core religious right-wingers are airing their doubts in public. Libertarians delight in the apostate candidacy of Ron Paul. Conservative populist resentment of immigration runs head on into corporate elite determination to enlarge a sizeable pool of cheap labor, while Hispanics head back to the Democratic Party in droves. Even the Republican Party's own elected officials are engaged in a mass movement to retire.



All signs are ominous. The credibility and legitimacy of the old order operate now at a steep discount. Most telling and fatal perhaps is the paralysis spreading into the inner councils at the top. Faced with dire predicaments both at home and abroad, they essentially do nothing except rattle those sabers, captives of their own now-bankrupt ideology. Anything, many will decide, is better than this.



Or will they? What if the opposition is vacillating, incoherent, and weak-willed -- labels critics have reasonably pinned on the Democrats? Bad as that undoubtedly is, I don't think it will matter, not in the short run at least.



Take the presidential campaign of 1932 as an instructive example. The crisis of the Great Depression was systemic, but the response of the Democratic Party and its candidate Franklin Delano Roosevelt -- though few remember this now -- was hardly daring. In many ways, it was not very different from that of Republican President Herbert Hoover; nor was there a great deal of militant opposition in the streets, not in 1932 anyway, hardly more than the woeful degree of organized mass resistance we see today despite all the Bush administration's provocations.



Yet the New Deal followed. And not only the New Deal, but an era of social protest, including labor, racial, and farmer insurgencies, without which there would have been no New Deal or Great Society. Might something analogous happen in the years ahead? No one can know. But a door is about to open.