The Conversation

Trump underestimated Iran’s resilience — and now there is only one way out: scholar

For all their claims of military success in their war with Iran, the United States and Israel have yet to clearly define their rationale for starting the conflict, their goals and their exit strategy.

With the Iranian regime having mounted a robust response, the Middle East has been plunged into an unnecessary confrontation with no end in sight.

When US President Donald Trump and Israeli Prime Minister Benjamin Netanyahu started this war a month ago, they didn’t have a clear understanding of the nature of the Iranian regime and its defensive capability.

They didn’t expect Tehran to counter their offensives with an unprecedented level of preparedness, striking US bases across the Persian Gulf and hitting Israel hard.

Nor did they anticipate Tehran would close the Strait of Hormuz, partially or fully, to cause a shortage of oil and gas with severe consequences for the global economy.

Driven by an embrace of military power, they acted on a belief that American and Israeli might from the air and sea would force the Islamic government to quickly capitulate, enabling the Iranian people to instigate a favourable regime change – something that has not transpired.

With a military victory now looking increasingly elusive, Trump will need to pivot to a diplomatic solution – and force Netanyahu to comply.

Why Iran has proven so resilient

Prior to the war, the Islamic government was under enormous domestic pressure and international criticism for its suppression of widespread public protests that left thousands of Iranians dead.

The regime was also struggling to come to terms with Israel’s degradation of its regional affiliates, Hamas and Hezbollah in particular, not to mention the fall of Bashar al-Assad’s dictatorship in Syria.

While distrustful of Trump, it felt compelled to enter into negotiations with the US once more for a viable settlement of its controversial nuclear program. In late February, the chief mediator, the Omani foreign minister, said a deal was within reach.

When the US and Israel attacked instead, it gave the Islamic government a different sort of opportunity: it could demonstrate the resilience it had spent decades building.

Iran’s system of authority, governance and security was structured to withstand the loss of its leaders and commanders. The regime had shown this in the 1980s in the face of stiff internal opposition, the eight-year war with Iraq, US efforts to contain it and regional hostility.

The Islamic government has also managed to survive despite its theocratic impositions, frequent public uprisings and domestic and foreign policy shortfalls. The reasons for this include:

  • the belief of many Shia Muslims in revolutionary Islamism
  • its combination of ideological rigidity and pragmatic flexibility, and
  • a dedicated and entrenched security, intelligence and administrative apparatus whose survival is dependent on the regime’s survival.

While many Iranians have wanted to see the back of the Islamic government, most are still very proud of their cultural and civilisational heritage. They don’t like to see Iran being subjected to outside aggression, destruction and humiliation.

A war of endurance

This explains why many Iranians have rallied around the flag, as they have historically done against outside aggression.

Knowing it cannot match the firepower of the US and Israel, the Islamic government has shown ingenuity in creating a “mosaic defence” strategy of asymmetrical warfare. This entails adapting and responding to US military weaknesses (for instance, by targeting US bases in Persian Gulf countries with drones and missiles) and decentralising its command structure so leaders can quickly be replaced when they are killed.

The regime has been assisted by Russia and China with supplies of dual-use technologies and revenue from its oil exports. Russia has also reportedly been giving Iran intelligence on the location of US assets in the region.

And although Iran’s regional affiliates have been degraded, they are still capable of backing the Islamic Republic in the conflict. Both Hezbollah and the Yemeni Houthis have entered the war by targeting Israel. The Houthis may also attempt to disrupt shipping through the Red Sea.

In short, the Iranian government is resolved to deny the US and Israel a victory at all costs. Given this, the conflict has become a war of endurance.

A deal is the only way out

How long the US, Israel and Iran stay in the fight is a matter of conjecture. As the situation stands, however, the space for a diplomatic resolution has narrowed considerably. Iran has shown no desire to back down, and the US and Israel are not united in their goals.

Trump may eventually settle for a deal on Iran’s nuclear program and a potential reopening of the Strait of Hormuz, given the costs of the war and his falling poll numbers in a year of mid-term elections.

But Netanyahu seems adamant in his pursuit. He wants to destroy the Islamic government and weaken the Iranian state as a regional actor.

What is increasingly clear is the war is unlikely to end by military means. The only way forward is a negotiated settlement. The onus will therefore fall on Trump to pull Netanyahu into line and take the lead on trying to strike a deal.

Some analysts have already concluded that no matter how the war ends, Iran is prevailing.

Amin Saikal, Emeritus Professor of Middle Eastern Studies, Australian National University; The University of Western Australia; Victoria University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

How a 200-year-old Danish folktale perfectly captures MAGA buyers' remorse

In mid-March, an activist group in Rutland County, Vermont, held its usual weekly rally protesting the actions of US president Donald Trump. One protester, Marsha Cassel, led the crowd, dressed as an unclothed Trump wearing a crown and holding a staff. Cassel was followed by another protester holding a sign proclaiming “THE EMPEROR HAS NO CLOTHES!”.

This is not the first time Trump has been compared to Hans Christian Andersen’s bumbling emperor, who marched unclothed through the streets while claiming to be dressed in finery – a fiction many of his subjects willingly indulged.

Who was Andersen, what aspects of his life informed this particular story and why might this be useful to know in the age of Trump?

Andersen was born in Odense, Denmark, in 1805. While his grandfather supposedly claimed noble origins for the family, Andersen’s father was a cobbler and his mother an illiterate washerwoman.

After his father died, Andersen moved to Copenhagen for work, where he found a patron, theatre director Jonas Collin, who paid for his education. Andersen started writing after graduating from university, becoming well known for his fairy tales, which he began publishing in the 1830s.

The Emperor’s New Clothes is in his 1837 work, Fairy Tales Told for Children, which featured other memorable tales such as The Steadfast Tin Soldier and The Little Mermaid.

The story follows a vain and clothes-obsessed emperor who commissions clothing from two travelling conmen. These men, posing as weavers, visit his court to show off a new kind of material, which is supposedly rendered invisible to a man “unfit for the office he held”, or “extraordinarily simple in character”.

Afraid to reveal that he cannot see the material, the emperor sends in several aides to review the process, who all lie about being able to see the clothes being made.

Illustration by Edmund Dulac from Stories from Hans Andersen, published 1938. Universal Images Group via Getty Images

Once the “outfit” is finished, the emperor dons it and parades unclothed through the town. The townsfolk compliment the garments, until a small child bursts the bubble, yelling out that the emperor has no clothes.

Unable to admit this, the emperor continues on his way. But the townsfolk now laugh.

This simple tale powerfully criticises rulers who tell untruths, performing intelligence and leadership, as well as those who uncritically allow this.

An outsider looking in

Like many fairy tales, the origins of this one stretch back centuries. Older versions date to medieval times. All feature people in power being duped by conmen who play on their vanities about their own intelligence. Literary scholar Hollis Robbins suggests Andersen’s version reflects a newly emerging working-class culture where “professional competence” was “quickly overtaking legitimacy and heritage as a source of aristocratic anxiety”.

In his book The Enchanted Screen: The Unknown History of Fairy-Tale Films, fairy tale scholar Jack Zipes claims Andersen was “embarrassed by his proletarian background” and “rarely mingled with the lower classes” once he found success as a writer.

Andersen never married and, more recently, has been understood to have been bisexual. He had infatuations with both men and women, including Edvard Collin (the son of his patron Jonas) and Swedish opera singer Jenny Lind. After a fall in 1872, from which he never recovered, he died in 1875.

Andersen’s lower class background, argues Zipes, meant he was particularly well suited to biting cultural commentary about the difficult path for those escaping poverty.

In one translation of The Emperor’s New Clothes, the child who proclaims the nudity of the emperor is called “the voice of innocence” by his father. This voice spreads through the crowd, leading to the comical image of the unclothed emperor’s aides striving to lift the invisible train of his outfit even higher.

Regardless of one’s position in life, this story suggests you cannot escape “suffering, humiliation, and torture,” writes Zipes.

Indeed, many of Andersen’s tales feature characters (often frail, young women) who suffer immensely before dying nobly. The Emperor’s New Clothes, with its child character as the voice of reason, has an ending that, while not “happily ever after”, is as lighthearted as Andersen gets.

The power of fairy tales

The fairy tale is one of the most recognisable literary genres. We hear them from such a young age it is almost like we were born knowing them. Beginning as oral folktales, many of the tales we know today were first written down in 16th and 17th century France, Italy and Germany as social commentary and educational stories.

It is difficult to identify the “originals” of many tales, given their folkloric origins. Still, while it is almost stereotypical now to note that the “original fairy tales” (before contemporary Disney adaptations) were surprisingly dark, Andersen’s are noticeably, and notably, bleak.

The Emperor’s New Clothes has been retold many times, with print, screen and musical adaptations. As Donald Trump, in the words of one pundit, continues to “construct a narrative, declare it to be true and relentlessly force the world to submit to it”, the story resonates today.

Indeed, literary academic Naomi Wood has argued that in a post-9/11 world, a “terrifying possibility” emerges in readings of the tale.

The truth of the fairy tale is not its glorification of the voice of innocence, free from corruption and untruth. Rather, it is that adults will continue to believe their own lies, even when they are clearly revealed. As a result, we allow the parade to continue, even while knowing it is farcical.

Nicola Welsh-Burke, Sessional Academic in Literary and Cultural Studies, Western Sydney University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Four words will determine Trump case's fate at the Supreme Court

The Supreme Court on April 1, 2026, will hear oral arguments in Trump v. Barbara, the case that challenges the Trump administration’s efforts to bar the children of immigrants without legal status from birthright citizenship by reinterpreting the terms of the 14th Amendment.

In January 2025, President Donald Trump issued an executive order removing the recognition of citizenship for the U.S.-born children of both immigrants here illegally and visitors here only temporarily. The new rule is not retroactive. This change in long-standing U.S. policy sparked a wave of litigation.

When the justices weigh the arguments, they will focus on the meaning of the first sentence of the 14th Amendment, known as the citizenship clause: “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside.”

Both sides agree that to be granted birthright citizenship under the Constitution, a child must be born inside U.S. borders and the parents must be “subject to the jurisdiction” of the United States. However, each side will give a very different interpretation of what the second requirement means. Who falls under “the jurisdiction” of the United States in this context?

As a close observer of the court, I anticipate a divided outcome grounded in strong arguments from each side.

Arguments for automatic citizenship

Simply put, the argument against the Trump administration is that the 14th Amendment’s expansion of citizenship after the eradication of slavery was meant to be broad rather than narrow, encompassing not only formerly enslaved Black people but all persons who arrived on U.S. soil under the protection of the Constitution.

The Civil War amendments – the 13th, 14th and 15th – established inherent equality as a constitutional value, which embraced all persons born in the nation without reference to race, ethnicity or origin.

One of the strongest arguments that automatic citizenship is the meaning of the Constitution is long-standing practice. Citizenship by birth regardless of parental status – with few exceptions – has been the effective rule since the time of America’s founding.

Advocates also point to precedent: the landmark case of United States v. Wong Kim Ark in 1898. When an American-born descendant of resident noncitizens sued after being refused re-entry to San Francisco under the Chinese Exclusion Act, the court recognized his natural-born citizenship.

If we read the Constitution in a living fashion – emphasizing the evolution of American beliefs and values over time – the constitutional commitment to broad citizenship grounded in equality, regardless of ethnicity or economic status, seems even clearer.

However, advocates must try to convince the court’s originalists – Clarence Thomas, Samuel Alito, Neil Gorsuch, Brett Kavanaugh and Amy Coney Barrett – who read the Constitution based on its meaning when it was adopted.

The originalist argument in favor of birthright citizenship is that the phrase “subject to the jurisdiction” was meant to invoke only a small set of exceptions found in traditional British common law. In the Wong Kim Ark ruling, the court relied on this “customary law of England, brought to America by the colonists.”

One exception to birthright citizenship covered by this line of rulings is the child of a foreign diplomat, whose parents represent the interests of another country. Another exception is the children of invading foreign armies. A third exception discussed explicitly by the framers of the 14th Amendment was Native Americans, who at the time were understood to be under the jurisdiction of their tribal government as a separate sovereign. That category of exclusion faded away after Congress recognized the citizenship of Native Americans in 1924.

The advocates of automatic birthright citizenship conclude that whether the 14th Amendment is interpreted in a living or in an original way, its small set of exceptions does not override its broad message of citizenship grounded in human equality.

Opposition to birthright citizenship

The opposing argument begins with a simple intuition: In a society defined by self-government, as America is, there is no such thing as citizenship without consent. In the same way that an American citizen cannot declare himself a French citizen and vote in French elections without consent from the French government, a foreign national cannot declare himself a U.S. citizen without consent.

This argument emphasizes that citizenship in a democracy means holding equal political power over our collective decisions. That is something only existing citizens hold the right to offer to others, something which must be decided through elections and the lawmaking process.

The court’s ruling in Elk v. Wilkins in 1884 – just 16 years after the ratification of the 14th Amendment – endorses “the principle that no one can become a citizen of a nation without its consent.” By making entry into the United States without approval a federal offense, Congress has effectively denied that consent.

Scholars who support this view argue that the 14th Amendment does not provide this consent. Instead it sets a limitation. To the authors of the 14th Amendment, “subject to the jurisdiction thereof” conveyed a limit to natural citizenship grounded in mutual allegiance. That means if people are free to deny their old national allegiance, and an independent nation is free to decide its own membership, the recognition of a new national identity must be mutual.

Immigrants living in the United States illegally have not accepted the sovereignty of the nation’s laws. On the other side of the coin, the government has not officially accepted them as residents under its protection.

If mutual recognition of allegiance is the meaning of the 14th Amendment, the Trump administration has not violated it.

The opponents of birthright citizenship argue that the Wong Kim Ark ruling has been misrepresented. In that case, the court only considered permanent legal residents like Wong Kim Ark’s parents, but not residents here illegally or temporarily. The focus on British common law in that ruling is simply misguided because the findings of Calvin’s Case or any other precedents dealing with British subjects were voided by the American Revolution.

In this view, the Declaration of Independence replaced subjects with citizens. The power to determine national membership was taken away from kings and placed in the hands of democratic majorities.

For opponents of birthright citizenship, the 14th Amendment does not take that power away from citizens but instead codifies the rule that mutual consent is the touchstone of admission. The requirement to be “subject to the jurisdiction” provides the mechanism of that consent.

Congress can determine who is accepted as a member of the national community under its jurisdiction. In this view, Congress – and the American people – have spoken: Current federal laws make entry into U.S. borders without permission a crime rather than a forced acceptance of political membership.

What might happen

The court will likely announce a ruling in summer 2026 before early July, just in time for the 250th anniversary of the Declaration of Independence. The court will ultimately decide whether the Constitution endorses the declaration’s invocation of essential equality or its creation of a sovereign people empowered to determine the boundaries of national membership.

The court’s three Democratic-appointed justices – Ketanji Brown Jackson, Elena Kagan and Sonia Sotomayor – will surely side against the Trump administration. The six Republican-appointed justices seem likely to divide, a symptom of disagreements within the originalist camp.

The liberal justices need at least two of the conservatives to join them to form a majority of five to uphold universal birthright citizenship. This will likely be some combination of Chief Justice John Roberts, Brett Kavanaugh and Amy Coney Barrett.

The Trump administration will prevail only if five out of the six conservatives reject the British common law foundations of the Wong Kim Ark ruling in favor of citizenship by consent alone.

America should know by July Fourth.

This is an updated version of an article originally published on Dec. 5, 2025.

Morgan Marietta, Professor of American Civics, University of Tennessee

This article is republished from The Conversation under a Creative Commons license. Read the original article.

I went to CPAC and found an impotent Trump coalition falling apart at the seams

There is a pall over the Make America Great Again, or MAGA, movement. Donald Trump overpromised. His public support has fallen. Some “America First” die-hards now openly criticize him.

Amid war, economic challenges, democratic backsliding, the Epstein files and Americans shot dead in the street by government agents, Trump’s support is softening and his vow to bring a “golden age of America” is looking more like a political winter for Trump and his MAGA movement.

This is my big takeaway from this year’s annual Conservative Political Action Conference, or CPAC. The event, organized by the American Conservative Union, launched with an international summit on March 25, 2026, and runs through March 28 in Grapevine, Texas.

Don’t get me wrong. The attendees are decked out in red, white and blue MAGA merch: sequined “Trump” purses and jackets, USA flag bags, ties and headbands, and, of course, iconic red MAGA caps. As always, they chant “USA,” even if not as often or as loudly as before.

Starting with the first talk by Rev. Franklin Graham, speakers here are still singing Trump’s praises. They underscore what they regard as major Trump 2.0 accomplishments: combating illegal immigration, cutting taxes, a budding economic boom, deregulation, U.S. gas and oil output surging, administrative state winnowing, pro-Christian policies and pulling the plug on the “woke” agenda.

These issues are foregrounded in sessions with titles like “Walls Work,” “Don’t Let Woke Marxists Raise Your Children,” “MAGA vs. Mullah Madness,” “Commies Go Home” and “Cancelling Satan.” In between, pro-Trump advertisements checklist Trump’s accomplishments.

This rose-tinted view is to be expected. After all, CPAC – a cross between a political rally, networking mixer and MAGA Comic-Con – is all about galvanizing the conservative base. Beneath the surface, however, MAGA is churning.

Major grievances

As an anthropologist of American political culture and author of the book “It Can Happen Here,” I have been studying MAGA for years and attending CPAC since 2023. Attendees at last year’s CPAC, held a month after Trump’s inauguration, were jubilant, with nonstop talk of “the comeback kid” and “the golden age.”

Why is the mood at this year’s CPAC more subdued?

Enthusiasm for Trump is dampened because some of his supporters feel he has betrayed America First principles, failed to fulfill key campaign promises and been unable to supercharge the economy. Here are their major grievances:

‘America First’ vs. ‘Israel First’

“America First” is the guiding principle of MAGA. It encompasses border security, prioritizing the U.S. economy and ensuring rights such as free speech. It also means avoiding unnecessary wars.

This is why Trump’s support of the June 2025 “12-day war” on Iran led Tucker Carlson, Marjorie Taylor Greene and other MAGA influencers, who have tens of millions of followers, to criticize Trump. The conflict, they contend, served Israel’s interest – their phrase is “Israel First” – not those of the U.S.

Their criticisms became even more pronounced after the U.S. again began bombing Iran on Feb. 28, 2026. The criticism is part of a growing MAGA fissure with pro-Israel stalwarts such as conservative activists Mark Levin, Laura Loomer and Ben Shapiro, who support U.S. intervention in the Middle East. Things got so bad that after Levin called his fellow conservative media personality Megyn Kelly “unhinged, lewd and petulant,” she dubbed him “Microp---- Mark.”

But the MAGA unease with the war extends well beyond the “America First” influencers.

It includes figures from the fringe far right such as provocateur Nick Fuentes, center-right “brocaster” Joe Rogan, and even the Trump administration itself – as illustrated by an intelligence officer whose resignation stated, “Iran posed no imminent threat to our nation, and it is clear that we started this war due to pressure from Israel and its powerful American lobby.”

Notably, none of the main Trump critics have been scheduled to speak at this year’s CPAC. Some now call it “TPAC,” or the Trump Political Action Conference.

The Epstein files

MAGA also has a strong populist and anti-elite streak of conspiracy thinking.

Large numbers of Trump supporters, for example, believe there is an elite plot to “replace,” as they put it, the white population with nonwhites through mass immigration. Many also bought into the QAnon conspiracy theory, which centers on the idea that Trump is fighting Satanic, deep state elites who are running a child sex trafficking operation.

On the campaign trail, Trump vowed to take down political, deep state and global elites. He also promised to release the Jeffrey Epstein files, which QAnon conspiracists and others believe prove elite debauchery, including pedophilia.

Trump didn’t deliver. He backtracked and stonewalled on the release of the Epstein files, raising MAGA suspicion that Trump himself is implicated or is protecting elites. Remarkably, one recent poll found that roughly half of Americans, including a quarter of Republicans, believe the Iran war was partly meant to distract from the Epstein files.

Economy and immigration

Trump is also facing headwinds on the bread-and-butter issues of the 2024 election: the economy and immigration.

At CPAC, speakers have repeatedly given him kudos for shutting down the border. Acknowledging the MAGA in-fighting, conservative commentator Benny Johnson said he wanted to “white pill” – or buck up – the audience by reminding them that Trump had stopped an “invasion” and brought “criminal alien border crossings down to zero.”

As a photo of Trump’s bloodied face after the assassination attempt in Butler, Pennsylvania, on July 13, 2024, was displayed, Johnson claimed, “Our God saved President Trump’s life for this moment.”

But fewer Republicans approve of his handling of immigration compared with a year ago. Like many Americans, a growing number have misgivings about the strong-arm tactics used by government immigration enforcement agents in places such as Minnesota.

For many, the economy remains a serious worry. A recent poll, conducted before the Iran war, found that the vast majority of Americans, including large numbers of Republicans, are concerned about inflation, jobs and the cost of living. Health care, including the lost Obamacare subsidies, is also a source of consternation.

Few people believe the economy is “booming” – let alone that a “golden age” has arrived – as Trump and his allies often proclaim. The war with Iran, which has led to stock market declines and gas pump hikes, has only added to the unease.

MAGA ‘shattered’?

Amid the recent MAGA in-fighting about the Iran war, conservative podcaster Tim Pool proclaimed, “The MAGA coalition is shattered.”

Not exactly. Despite the many challenges Trump is facing, the vast majority of his MAGA base voters still support him – including almost 90% backing his war with Iran.

But Trump’s support has eased in several ways. First, even his hardcore supporters worry about the economy, and they want him to declare victory and exit the war. And second, Trump has lost support on the edges. Many people in the key groups with which he made crucial inroads in the last election – such as young men and nonwhite voters – have turned from him. The same is true for independents and other Trump voters who don’t identify as MAGA.

Trumpism isn’t dead, as the MAGA-merched crowds here at CPAC make clear. But Trump is struggling through a political winter that could signal the early stages of his MAGA movement’s decline.

Alex Hinton, Distinguished Professor of Anthropology; Director, Center for the Study of Genocide and Human Rights, Rutgers University - Newark

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Trump’s ‘new’ 15‑point plan is the biggest sign yet that DC fears it's losing this war

The language of power often reveals more than it intends. In a rare moment of candour on March 7, the US president, Donald Trump, described the confrontation with Iran as “a big chess game at a very high level … I’m dealing with very smart players … high-level intellect. High, very high-IQ people.”

If Iran is, by Trump’s own admission, a “high-level” opponent, then the sudden revival of a 15-point plan Iran rejected a year ago suggests a disconnect between how the adversary is understood and how it is being approached. The plan has already been examined in negotiation by Tehran and dismissed as unrealistic and coercive. Despite this, the Trump administration is once again framing the “roadmap” as a pathway to de-escalation. Tehran has once again dismissed the gambit as Washington “negotiating with itself” – reinforcing the perception that the US is attempting to impose terms rather than negotiate them.

The US president is right about one thing – Iran is not an opponent that can be easily dismissed or overwhelmed. Trump’s own description is a tacit acknowledgement that this is a far more capable and complex adversary than those the US has faced in past Middle Eastern wars, such as Iraq. And that is why the odds are increasingly stacked against the United States and Israel.

This conflict reflects a familiar but flawed imperial assumption: that overwhelming military force can compensate for strategic misunderstanding. The US and Israel appear to have misjudged not only Iran’s capabilities, but the political, economic and historical terrain on which this war is being fought.

Unlike Iraq, Iran is a deeply embedded and adaptable regional power. It has resilient institutions, networks of influence, and the capacity to impose asymmetric costs across multiple theatres. It knows how to manage maximum pressure.

The most immediate problem is a lack of legitimacy. This war has authorisation from neither the United Nations nor, in the case of America, the US Congress. Further, US intelligence assessments indicate Iran was not rebuilding its nuclear programme following earlier strikes – contradicting one of Washington’s justifications for war. The resignation of Joe Kent as head of the National Counterterrorism Center on March 17 was even more revealing. In his resignation letter, Kent insisted that Iran posed no imminent threat.

This effectively collapses one of the original narratives underpinning the US decision to start the war – a further blow to legitimacy.

A majority of Americans oppose the war, reflecting deep fatigue after Iraq and Afghanistan – hardly ideal conditions for what increasingly looks like another “forever war” in the Middle East. Current polling shows Trump’s Republicans trailing the Democrats ahead of the all-important midterm elections in November.

The war is both militarily uncertain and politically unsustainable. International allied support is also eroding. The United Kingdom — often trumpeted as Washington’s closest partner — has limited itself to defensive coordination, while Germany and France have distanced themselves from offensive operations. European allies also declined a US request to deploy naval forces to secure the strait of Hormuz. This reflects not just disagreement, but a deeper loss of trust in US leadership and strategic judgement.

US influence has long depended on legitimacy as much as force. That reservoir is now rapidly draining. Global confidence is falling, while images of civilian casualties — including over 160 schoolchildren killed in an airstrike on the first day of the war — have shocked international onlookers. Rather than reinforcing leadership, this war is accelerating its erosion.

Israel faces a parallel crisis of legitimacy – one that began in Gaza and has now deepened. The war in Gaza severely damaged its global standing, with sustained civilian casualties and humanitarian devastation drawing unprecedented criticism, even among traditional allies. This confrontation with Iran compounds that decline.

Striking Iran during active negotiations — for the second time — reinforces the perception that escalation is preferred over diplomacy. The issue is no longer just conduct, but credibility.

Strategic failure, narrative defeat

The conduct of the war compounds the problem. The assassinations of Iranian leaders, framed as tactical victories, are strategic failures. They have unified rather than destabilised Iran. Mass pro-regime demonstrations illustrate how external aggression can consolidate internal legitimacy.

The issue is no longer just the conduct of the war, but the credibility of the conflict itself. However impressive the US and Israeli militaries may be, military power cannot compensate for reputational collapse. When building support for a conflict like this – domestically and internationally – legitimacy is a strategic asset. Once eroded across multiple conflicts, it is extraordinarily difficult to rebuild.

Rather than stabilising the system, US actions are fragmenting it. Allies are distancing themselves, adversaries are adapting, and neutral states are hedging.

The most decisive factor may be economic. The war is already destabilising global markets – driving up oil prices, inflation and volatility to levels that rival the combined effects of the 1970s and Ukraine war oil shocks.

This is a war that cannot be contained either geographically or economically. The deployment of 2,500 US marines to the Middle East (and reports that up to another 3,000 paratroopers will also be sent), reportedly with plans to secure Kharg Island – and with it Iran’s most important oil infrastructure – would be a dangerous escalation.

For Gulf states, the assumption that the US can guarantee security is increasingly questioned. Some states are reportedly now looking to diversify their partnerships and turning toward China and Russia, mirroring post-Iraq shifts, when US failure opened space for alternative powers.

Iran holds the cards

Wars are not won by destroying capabilities alone, but by securing sustainable and legitimate political outcomes. On both counts, the US and Israel are falling short.

Iran, by contrast, does not need military victory. It only needs to endure, impose costs, and outlast its adversaries. This is the logic of asymmetric conflict: the weaker power wins by not losing, while the stronger one loses when the costs of continuing become unsustainable.

This dynamic is already visible. Having escalated rapidly, Trump now appears to be searching for an off-ramp — reviving proposals and signalling openness to negotiation. But he is doing so from a position of diminishing leverage. In contrast, Iran’s ability to threaten energy flows, absorb pressure, and shape the tempo of escalation means it increasingly holds key strategic cards. The longer the war continues, the more that balance tilts.

Empires rarely recognise when they begin to lose. They escalate, double down, and insist victory is near. But by the time the costs become undeniable – economic crisis, political fragmentation, global isolation – it is already too late. The US and Israel may win battles. But they may be losing the war that matters: legitimacy, stability and long-term influence.

And, as history suggests, that loss may not only define the limits of their power, but mark a broader shift in how power itself is judged, constrained, and resisted.

Bamo Nouri, Honorary Research Fellow, Department of International Politics, City St George's, University of London and Inderjeet Parmar, Professor in International Politics, City St George's, University of London

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Trump in trouble as he triggers massive GOP nightmare exodus of voters

In 2024, Donald Trump dramatically improved his performance among nearly all groups of voters from four years earlier. Trump’s growth among Hispanic voters was especially notable, increasing by more than 10 points from 2020 to 2024, at least according to exit polls.

This led to a considerable amount of commentary speculating that Hispanic voters, historically more supportive of Democrats, might continue shifting toward the GOP.

News reports suggesting Latinos were critical to Trump’s 2024 victory were, in our view, overblown. Even if Latinos had not shifted, Trump still would have won in 2024.

Yet there is no question that over the past three election cycles, Latino voters – Latino men under 40, in particular – have shifted right. That change has benefited GOP candidates, even as the majority of Latinos still voted for Democrats.

However, evidence from general elections in 2025 in places such as New Jersey, New York and Virginia, as well as special elections in 2026, suggests an abrupt correction is underway, with some of the Latino voters who backed Trump now swinging back to the Democrats.

As political scientists and pollsters who study Hispanic voting trends, we are concerned with the question of whether these latest movements are real or simply a function of fluctuating Latino Democratic turnout rates. In other words, are Latinos broadly changing their votes back to Democrats, or are Latinos who remained loyal to the Democrats now more angry and fired up?

Survey and election data suggest it’s a bit of both. So what does this portend for the future of American politics?

Latino voting trends

The history of the Latino vote nationwide had for decades been one of long-term stability. Historically, Democrats enjoyed an approximate 65% to 35% advantage over Republicans.

That advantage shrank marginally after Republican President Ronald Reagan signed the Immigration Reform and Control Act in 1986, providing a path to citizenship for millions. But the more familiar two-thirds advantage for the Democratic Party returned following passage of Proposition 187, a 1994 anti-immigrant initiative in California that ultimately mobilized Latinos against Republicans.

Another effort at GOP outreach to Hispanic voters culminated in President George W. Bush taking approximately 40% of the Latino vote in 2004. That growth, however, soon eroded in the wake of anti-immigrant legislation passed by the Republican-controlled House in 2005 and 2006.

The successful campaigns of Democrat Barack Obama in 2008 and 2012, as well as Hillary Clinton’s unsuccessful 2016 campaign against Trump, saw Democrats reaping a relatively high level of Latino support, peaking at a 3-to-1 advantage in 2012.

That made Trump’s improvements among Latinos in 2020 and 2024 feel, for some, particularly unexpected. He made notable breakthroughs in parts of Florida, where he carried Miami-Dade County, and Texas, where he flipped the historically Democratic Rio Grande Valley.

Some Latinos question whether Democrats have delivered

It should not have been such a surprise. There has been a history of sizable shares of Latinos supporting Republican candidates. For instance, both former President George W. Bush and his brother, former Florida Gov. Jeb Bush, performed well with Latinos in Texas and Florida.

For two decades, Democrats have campaigned among Latinos on the promise of comprehensive immigration reform and an economic policy that would level the playing field, including raising the federal minimum wage, providing universal pre-K education and promoting affordable housing.

Many Latinos feel they are still waiting for these Democratic policies to be enacted, let alone improve their lives.

Democratic trifectas in 2009-10 and 2021-22 – when the party held both chambers of Congress, along with the presidency – failed to produce meaningful movement on immigration policy. Many Latinos felt their daily lives had not improved, as they faced high costs of living, expensive housing markets and rising health care costs. While House Democrats did pass numerous bills to address these topics, Senate moderates proved difficult to persuade.

Given these shortcomings, running on the message that “the GOP are bad guys” only gets Democrats so far. In 2024, surveys and focus groups of Hispanic voters made it clear that not everyone was convinced by this characterization. The frustrations of working-class families during the Biden administration were real, whereas fears of mass deportations and other social chaos that a second Trump term might portend were, at that point, conjecture.

The Trump campaign specifically promised widespread action against immigrants, but many of our Latino focus group participants felt this was bluster. They believed that Trump’s actions would be targeted against blatant criminals and that his policies would not affect their families and friends.

They did not believe the worst-case scenarios presented by Vice President Kamala Harris and other Democrats during the campaign. Despite often not liking Trump, his economic promises felt good during the 2024 affordability crisis.

Latinos shifting back left?

Many Latinos are now quite upset with Trump. The 2025 gubernatorial elections in New Jersey and Virginia point to dramatic 25-point changes in the Latino vote in the Democrats’ direction, compared with Trump’s 2024 performance.

In December 2025, the first Democrat was elected mayor of Miami since 1997, with Latino support. A Democrat won a heavily Republican state legislative district in Texas in February 2026 with an estimated 79% of the Latino vote. Most recently, Latino voter turnout surged to record levels in the March Democratic primary in Texas.

Majorities of Latino voters believe that their economic fortunes have declined since Trump returned to the White House. Moreover, they expect the situation to worsen over the next year. In March 2026, The Economist reported that Trump’s support among Latinos had fallen to 22%.

In a bipartisan poll by UnidosUS released in November 2025, only 14% of Latino voters said their lives were better after one year under Trump, while 39% said they had gotten worse. Looking ahead, 50% expected things to get worse still in 2026, while only 20% were optimistic about their economic future. Two-thirds of Latino voters felt that Trump and the Republicans were not focusing enough on improving the economy for people like them.

What’s more, mass deportations have happened under the second Trump administration. The vast majority of those detained for deportation, including those who have died, had no criminal record.

Latinos are overwhelmingly opposed to federal troops in U.S. cities, according to our polling; 41% fear legal residents and U.S. citizens getting caught up in enforcement actions. The No. 1 immigration concern for Latino voters remains a path to citizenship for Dreamers – the undocumented immigrants brought to the U.S. as children – and for immigrants who have worked and paid taxes in the country for more than 20 years but lack formal status.

Among Latinos who actually voted for Trump, many would not do so again. Our poll suggests that 22% of Latinos who voted for Trump in 2024 would not vote for him again. By contrast, Democrats retain support from 93% of their 2024 Latino voters.

The long-term effects of the Trump presidency on the Latino electorate are difficult to predict, but for now party preferences have shifted firmly back toward the Democrats. Among voters in the UnidosUS poll, 55% said they felt the Democrats “care a great deal” about Latinos, compared with 29% saying they felt that way about the GOP. At the same time, 33% of Latino voters see the GOP as “hostile,” compared with just 7% who believe this about the Democrats.

If the recent leftward shift is sustained, or the earlier shift to the right was illusory, the effects on the politics of 2026 could be large, potentially putting control of Congress in the hands of Latino voters. There are 46 House districts where the number of registered voters who are Latino exceeds the total margin of victory for those seats in 2024, with 23 currently held by Republicans and 23 currently held by Democrats.

Latino voters need to believe that politicians truly care about their concerns and will work to implement a plan to create equal opportunities for the nation’s largest minority group to achieve the American dream. We believe the candidates able to make that pitch convincingly will be the most successful.

Matt A. Barreto, Professor of Political Science, UCLA Luskin School of Public Affairs and Gary M. Segura, Professor of Public Policy, UCLA Luskin School of Public Affairs

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Some Christians think Trump will end the world as we know it — and they feel fine

Soldiers in the United States Armed Forces have lodged more than 100 complaints with the Military Religious Freedom Foundation (MRFF) stating that their commanders are using extremist religious rhetoric to describe the U.S.-Israel war against Iran.

According to some complaints, American military commanders have told their troops the attack on Iran is a holy war, and that U.S. President Donald Trump was “anointed by Jesus to light the signal fire in Iran to cause Armageddon and mark his return to Earth.”

In a recent interview with Democracy Now!, the MRFF’s president, Mikey Weinstein, said the foundation was “inundated” with calls from soldiers indicating that commanders across the armed forces “were euphoric” because the war would serve as a way to “bring their version of weaponized Jesus back.”

The comments are part of a broader pattern of violent religious rhetoric from U.S. officials. The U.S. ambassador to Israel, Mike Huckabee, caused a diplomatic row when he suggested Israel had a biblical claim to take over much of the Middle East.

The language also comes as some American officials have sought to characterize the Iranian government as fanatical. Secretary of State Marco Rubio said Iran was run by “religious fanatic lunatics.” Secretary of Defense Pete Hegseth said: “Crazy regimes like Iran, hell-bent on prophetic Islamic delusions, cannot have nuclear weapons.”

Meanwhile, American televangelist John Hagee recently claimed that Russia, Turkey, “what’s left of Iran” and “groups of Islamics” would soon invade Israel and be destroyed by God.

American evangelicalism

During my PhD in Christian theology, I’ve asked why some American evangelical religious movements embrace violent interpretations of what theologians refer to as “eschatology” (a theology of end times). These movements have gained increasing visibility and power through President Donald Trump’s MAGA politics, shaped heavily by white Christian nationalism.

While the term “evangelical Christian” is notoriously difficult to define, historian David Bebbington, who focused on these movements in the United Kingdom, delineated four broad characteristics: a strong belief in the Bible, the death of Jesus for sins, a conversion experience and social activism.

My own research specialization is how modern Protestant Christians, including evangelical Christians, understand the significance of Jesus’s death, also referred to as the atonement, and its relationship to the end times.

Seeking Armageddon

Rhetoric about wars being religious, and Trump being divinely anointed and about to cause Armageddon, is deeply disturbing and has catalyzed condemnation from Christians in the U.S. and beyond advocating non-violent and diplomatic foreign policy.

The violent U.S. religious rhetoric amplified by the U.S.-Israel war against Iran is associated with beliefs that once Israel is restored as a nation and the temple in Jerusalem is rebuilt, Jesus will return and judge humanity.

Christians adhering to these views read the Biblical Book of Revelation, with its vivid symbolic apocalyptic language, as making literal claims about history. They maintain their inspired and authoritative Biblical interpretation allows them to know that conflicts in the Middle East initiate God’s final act in history, with Trump seen as the dominating and aggressive man who can help usher in God’s violent judgment of his enemies.

Interpretations of Jesus’s death and violence

It’s relevant to consider how some Christian beliefs about Jesus’s death correlate with a willingness to support or justify violence.

Protestant Evangelical theologians, such as J. I. Packer and John Stott, argue that Jesus’s death primarily “paid the penalty” for human sin. They emphasize that God’s holiness requires a payment for this sin. In this framework, God orchestrates the violent death of Jesus to satisfy God’s penal justice to forgive humanity.

Non-evangelical Christians, on the other hand, like 19th-century Congregationalist Horace Bushnell and contemporary Mennonite theologian J. Denny Weaver, understand the death of Jesus as an example of God’s love.

In this interpretation, Jesus doesn’t endure violence to pay a debt to God. Instead, the death of Jesus is more akin to that of a martyr’s tragic death. These theologians reject violence as a condition for forgiveness.

A 2012 debate in the Presbyterian Church (U.S.A.) about a hymn demonstrates this tension, with a proposed change of hymn lyrics from “on that cross, as Jesus died, the wrath of God was satisfied” to “the love of God was magnified.” Ultimately, the authors rejected the proposal.

But the conflict demonstrates that Christians are passionate about their different interpretations of Jesus’s death.

Divine violence in atonement

Researchers have shown that penal atonement beliefs are negatively associated with a sense of responsibility for reducing pain and suffering in the world. This is not surprising when violence is incorporated into theological frameworks as redemptive.

I make a connection in my PhD dissertation between accepting divine violence in the atonement and divine violence in eschatology. Of course, this topic is far more complex and nuanced. Nonetheless, Christians are always going through a process of interpretation and negotiation when it comes to sacred texts.

For example, is Jesus the warrior Christ of Revelation 19 riding a warhorse to go into battle against his enemies or the teacher of peace in Matthew 21 who commands his followers to love their enemies just as God perfectly does?

Biblical interpretation and political beliefs

Those who see violence as a tool for redemption are more apt to subvert Jesus’s nonviolent teaching with images found in the Biblical book of Revelation. Those who see violence as incompatible with Christian ethics will instead interpret Revelation allegorically, and with humility, paying attention to signs of God beyond just their own lives and “insider” group. These two approaches inform political beliefs as well.

Consider a recent social media post by Reformed Baptist theologian and pastor John Piper. The post simply quotes Leviticus 19:34:

“You shall treat the stranger who sojourns with you as the native among you, and you shall love him as yourself, for you were strangers in the land of Egypt: I am the Lord your God.”

Piper was quickly labelled “woke” and accused of pushing an “irresponsible” theology by Trump supporters. American theologian Russell Moore noted years ago:

“Multiple pastors tell me, essentially, the same story about quoting the Sermon on the Mount, parenthetically, in their preaching — ‘turn the other cheek’ — to have someone come up after to say, ‘Where did you get those liberal talking points?’”

A more responsible evangelical theology

I argue Christians should not believe in a God of violent death, but in a God of life. Violent atonement and eschatology portray a God who is not above revenge and who leaves most of humanity hopeless.

We are left asking a series of disturbing questions if God is indeed about to end the world with violence. Why does the tone of this theology resemble the tone of empire, which crushes enemies instead of building bridges with them? Why does Jesus, as One Person of the One God, expect his followers to love their enemies — if God the Father ultimately does not?

All Christians in the U.S. and beyond need to reject violent theology as incompatible with the love of God that was magnified on the cross.

Matthew Burkholder, PhD Candidate, Theological Studies, University of Toronto

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Trump is losing the war in America — and he knows it

No US president in living memory has gone to war with less public support than Donald Trump has for the war in Iran. Even Barack Obama’s much-maligned Libyan intervention began with 60% of Americans in support in 2011. There is no poll that shows a majority of Americans supporting the Iran war, and multiple polls showing clear majorities against it. And wars usually lose public support as they go on.

Trump did not make a public case for the war before it began, because he preferred quick, surprising strikes preceded by theatrical suspense. He presented the vast military buildup in the Persian Gulf as a high-pressure negotiating tactic in the short-lived bargaining sessions over Iran’s nuclear enrichment.

Trump was undoubtedly emboldened by the tactical success of his removal of Venezuelan President Nicolas Maduro, though that too was not very popular with Americans.

Wars are not necessarily better when the US government invests a huge effort in justifying them. The justification for the disastrous Iraq War, after all, was based on misperceptions, distortions and falsehoods. But by completely disregarding US public opinion before the war, Trump now finds himself in all kinds of trouble as he tries to fight it.

Americans don’t like seeing themselves as aggressors

Political scientist Bruce Jentleson argued that public support for war in the United States depends not just on how the war is going, but on the public’s understanding of the war’s aims. The US public is much more likely to support wars aimed at imposing restraints on aggressive powers than wars aimed at bringing political change to other countries.

That theory explains why the Bush administration made such an effort to claim Iraq had weapons of mass destruction and was linked to the September 11 terrorist attacks, even though “regime change” was the aim of the Iraq war.

Regime change is also, quite clearly, the aim of the Iran war. Trump has been talking about it for months, and is still talking about it.

It was only after the bombs started falling on Iran that Trump and his administration began to make the case that Iran was an “imminent threat” to the US. It wasn’t very convincing.

After all, Trump had been boasting until recently that he had “completely obliterated” Iran’s nuclear program the year before. In a video released shortly after the attacks, Trump complained about the 1979 Tehran hostage crisis, the 1983 Hezbollah attack on US marines in Beirut, and the 2000 bombing of the USS Cole, which he said Iran was “probably involved in”.

It was left to Secretary of State Marco Rubio to make the convoluted argument that the US was acting in preemptive self-defence, because it knew Israel was going to strike Iran, and that Iran would retaliate against Americans in the Middle East.

That did not play well in a country increasingly wary of Israel. A Gallup poll released just before the war began showed that, for the first time this century, more Americans said their sympathies were with Palestinians than Israelis. Recently, the biggest drop in support for Israel has been among political Independents, whose views have shifted significantly during the Gaza War.

Tucker Carlson, the loudest critic of the Iran war on the right, immediately labelled it “Israel’s war”. Joe Rogan, an influential figure among Trump’s 2024 support base of disillusioned young men, said they felt “betrayed” by the war.

Meanwhile, Secretary of Defense Pete Hegseth has tried to sell the war to Americans by gloating about the death, destruction and fear being inflicted on Iran. Even as investigations show the US military was responsible for the bombing of a school that killed more than a hundred children, he dismisses rules of military engagement as “stupid”. The most recent Quinnipiac Poll showed Hegseth’s approval rating at 37%.

Americans are unprepared for sacrifice

Despite high-profile opponents like Carlson and Marjorie Taylor Greene, Trump still has most of the MAGA base with him for now. They were never really opposed to foreign wars. What they hated was losing foreign wars, and Trump is promising them swift victory in Iran.

But Trump has not prepared them, or anyone else, including his own cabinet, for the costs this war will incur – especially the disruption to global oil markets, which the International Energy Agency is calling the largest in history, and which will elevate the cost of everything from travel to food.

Trump’s rhetoric about the price of war has hardly been Churchillian. One night he posted on social media that a short term increase in oil prices is “a very small price to pay for U.S.A., and World, Safety and Peace. ONLY FOOLS WOULD THINK DIFFERENTLY!”

But the next day he was forced to calm markets by claiming the war was nearly over.

The Iranian regime, whose main goal is survival, is well aware of the political and economic vulnerabilities of the US and its Middle Eastern allies, and these appear to be what it is targeting.

At the beginning of the war, Iran’s seemingly scattered attacks on infrastructure, embassies and hotels in Gulf states were a source of mirth for some American commentators. But these were eventually enough to shut down large swathes of energy production and shipping, and inflict far more pain than Trump or his supporters were expecting.

Trump was already facing the same domestic problem that Joe Biden faced. It doesn’t matter how much you tell Americans about positive GDP, stock market and employment numbers; if they are struggling with the cost of living, their view of both the economy and the President will be bleak.

Trump’s glib dismissals of the price of oil are sounding a lot like his airy reassurances at the beginning of the pandemic.

Few Republicans in Congress have been prepared to stand up to Trump over the war. But as midterm elections approach, many of them will be silently praying he finds an excuse to end it as soon as possible.

David Smith, Associate Professor in American Politics and Foreign Policy, US Studies Centre, University of Sydney

This article is republished from The Conversation under a Creative Commons license. Read the original article.

'Deranged scumbags' and the hidden clues embedded in Trump's war language

US President Donald Trump speaks in a way unlike any of his predecessors. His distinctive and highly recognisable style may even play a role in his appeal to his political base. Since the infamous Access Hollywood tapes, he has got away with saying things none of his predecessors would have ever dreamed of saying in public. This is particularly striking in a country that was shocked to learn in the 1970s that Richard Nixon used dirty words in the Oval Office.

Scholars have described Trump’s rhetorical style as “unbalanced vituperation”, stressing his constant use of demeaning language, false equivalences and exclusion.

Even more strikingly, a recent study found Trump’s use of violent vocabulary, especially language linked to war and crime, represents a radical departure from US political tradition.

Since the beginning of the war with Iran, Trump’s rhetoric has become even more combative and outrageous, marking a still sharper shift from the language used by his predecessors on similar occasions.

What effect does this have and what does it tell us about the commander-in-chief’s state of mind?

Demeaning opponents

Trump announced the death of Iran’s Supreme Leader Ayatollah Ali Khamenei by calling him a “wretched and vile man”. Later, in a Truth Social post, he called him “one of the most evil people in history” and referred to “his gang of bloodthirsty thugs”.

A few days later, he continued denigrating leaders of the Iranian regime, describing them as “deranged scumbags” whose killing was for him a “great honor”. He has also insulted Mojtaba Khamenei, who succeeded his father as Iran’s Supreme Leader, describing him as “unacceptable” and a “lightweight”. He also stated during an interview that he believes Mojtaba is alive but “damaged”.

Americans are no strangers to their presidents using strong language to describe adversaries. Ronald Reagan famously referred to the Soviet Union as an “evil empire”, and George W. Bush warned of an “Axis of Evil”.

Yet such rhetoric rarely extended to personal insults against individual foreign leaders. Leaders generally bring a mood to these speeches that recognises their words will be frightening for many people. It also acknowledges that in a war situation, lives will inevitably be lost.

George W. Bush, for example, simply stated that US forces “captured Saddam Hussein alive”. Barack Obama announced to the nation Osama bin Laden’s killing by addressing the mastermind of the worst terrorist attack on US soil simply as “Osama bin Laden, leader of al Qaeda, and a terrorist”.

Constant threats

Trump has also shown little restraint in issuing threats. At the beginning of the conflict he stated in an interview that they had not even started hitting Iran hard and that the “big wave” was coming soon. He later posted on Truth Social that he was ready to hit Iran “twenty times harder” and threatened to “make it virtually impossible for Iran to ever be built back, as a Nation, again”, adding that “death, fire and fury will reign [sic] upon them”. At one point, he even suggested that he might strike Iran’s Kharg Island oil export hub again “just for fun”.

This language is not only vitriolic. It is also in sharp contrast with the rhetoric of past US presidents, who often emphasised restraint in the use of force and showed willingness to de-escalate military conflicts.

Previous presidents have been very clear about the strength of the US military, but they have also tried to focus on diplomacy and negotiation.

Obama, talking about Syria, famously remarked that “the United States military doesn’t do pinpricks”. Yet, moments later, he asked Congress to postpone a vote authorising the use of force while his administration pursued diplomatic options.

Nixon stated during the Vietnam war that “The peace we seek to win is not victory over any other people, but the peace that comes ‘with healing in its wings’; with compassion for those who have suffered; with understanding for those who have opposed us; with the opportunity for all the peoples of this Earth to choose their own destiny”.

Trump’s threats of escalation also raise concerns about the safety of civilians and the protection of critical infrastructure. He recently stated he “didn’t do anything to do with the energy lines, because having to rebuild that would take years”. This remark suggests some awareness of the consequences of such actions.

Even so, earlier presidents often distinguished explicitly between military targets and civilian populations. George H. W. Bush, during the Gulf War, declared “our quarrel is not with the people of Iraq. We do not wish for them to suffer”.

In 2003, George W. Bush warned Iraqi military and civilian personnel: “do not destroy oil wells, a source of wealth that belongs to the Iraqi people. Do not obey any command to use weapons of mass destruction against anyone, including the Iraqi people”.

Words matter

It is still unclear why Trump’s rhetoric is so violent and so far removed from the language of virtually every US president before him. A 2020 study found Trump’s foreign policy rhetoric often aims to create a sense of crisis to mobilise his domestic base – or distract from political troubles at home.

Some observers argue Trump has used, or even manufactured, national crises as a mechanism to expand executive power through emergency declarations. Whether this is the case in the current war with Iran remains to be seen.

But words certainly matter.

On December 19 1945, US President Harry S. Truman issued a special message to Congress recommending the Department of War and the Department of the Navy be merged into a single “Department of National Defense”. Between 1947 and 1949, Congress and the executive branch implemented this proposal. Many other countries went through a similar process in the postwar period, replacing the language of “war” from the name of their departments and ministries with the more restrained term “defence”.

Seventy-six years later, in 2025, Trump reversed that tradition with an executive order renaming the Department of Defense as the US Department of War.

This same executive order clearly states that the new name demonstrates a willingness to fight wars at a moment’s notice. And the reason is not only to defend, but to “secure what is ours”.

Viewed in light of the current war with Iran, those words provide some insight into the administration’s thinking. They also invite reflection on other words coming out of the administration and its supporters, including the “Gulf of America”, the idea of Canada as the “51st state”, and even the far-fetched “Trump 2028” chant.

Rodrigo Praino, Professor & Director, Jeff Bleich Centre for Democracy and Disruptive Technologies, Flinders University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

The Trump administration has the answers many academics are too afraid to seek

President Donald Trump directed the Pentagon and other federal agencies to begin releasing government files related to UFOs and unidentified anomalous phenomena – called UAP – in February 2026, following years of pressure from Congress, military whistleblowers and the public.

Congress formally mandated UAP investigations through the National Defense Authorization Act in December 2022. The Pentagon’s official UAP investigative body, the All-domain Anomaly Resolution Office, AARO, now carries a caseload exceeding 2,000 reports dating back to 1945. Defense Secretary Pete Hegseth confirmed this figure earlier this year.

The cases were submitted by military personnel, pilots and government employees describing aerial objects that could not be explained as known aircraft, drones or weather phenomena. Governments in Japan, France, Brazil and Canada also have their own formal UAP investigation programs.

Yet modern research universities remain almost entirely absent from this conversation. No major university has established a dedicated UAP research center. No federal science agency offers competitive grants for UAP inquiry. No doctoral programs train researchers in UAP methodology. The gap between what governments openly acknowledge and what universities are willing to study is, at this point, difficult to explain on purely intellectual grounds.

I have navigated this gap while conducting my own UAP research. My work developing the temporal aerospace correlation tool, a standardized framework for correlating civilian UAP sighting reports with documented rocket launch activity from Cape Canaveral, is currently under peer review at Limina: The Journal of UAP Studies.

Designing that framework meant making methodological decisions without community standards, without institutional funding and without the professional infrastructure many researchers in established fields take for granted. What is missing is not interest or data – it is the shared scaffolding that turns isolated curiosity into cumulative science.

Stigma is measurable

The most rigorous evidence for the gap between faculty interest in UAP and faculty willingness to study UAP comes from peer-reviewed studies by Marissa Yingling, Charlton Yingling and Bethany Bell, published in the scholarly journal Humanities and Social Sciences Communications.

Across 14 disciplines at 144 major U.S. research universities, 1,460 faculty responded to their 2023 national survey. Most surveyed believed UAP research was important. Curiosity outweighed skepticism in every discipline that was part of the study. Nearly one-fifth had personally observed something aerial they could not identify. Yet fewer than 1% had ever conducted UAP-related research.

The gap was explained not by intellectual dismissal but, in large part, by fear. Researchers were not primarily deterred by skepticism about the topic’s merits. Instead, they feared they might lose funding, face ridicule from colleagues or find their careers quietly derailed. Faculty reported being told to “be careful.”

A 2024 follow-up study found that roughly 28% said they might vote against a colleague’s tenure case for conducting UAP research, even when they personally believed the topic warranted study.

Historian and philosopher of science Thomas Kuhn argued that scientific communities suppress anomalous questions not because those questions are unanswerable, but because they fall outside the boundaries the community has collectively decided are worth investigating.

Sociologist Thomas Gieryn called this suppression “boundary work,” referring to the active process by which scientists police what counts as legitimate science.

For UAP researchers, the data and tools to study the phenomenon exist. What may not exist is social permission to use them without professional consequence.

Creating an academic discipline

Academic disciplines do not emerge spontaneously. They require dedicated journals, agreed-upon methods, graduate programs and professional societies.

The history of cognitive neuroscience demonstrates how disciplines emerge. Before the 1980s, researchers at the intersection of neuroscience and cognitive psychology faced resistance from both parent disciplines.

These fields achieved mainstream acceptance only after targeted funding from the Alfred P. Sloan Foundation, new brain-imaging tools and the gradual formation of academic programs that created career pathways for researchers. Researchers at the nexus of these fields did not wait for central questions to be resolved. They built infrastructure, and the infrastructure made progress possible.

UAP studies as a discipline is developing some of these elements, but largely outside universities. The Society for UAP Studies, a nonprofit of scholars and researchers, operates Limina as a double-blind, peer-reviewed journal and has convened international symposia drawing researchers from physics, philosophy of science and the social sciences. But a nonprofit scholarly society without tenured faculty does not constitute a discipline.

To turn UAP studies into a recognized academic field would require three things.

First, funding. The Yingling studies found that competitive research grants would do more to unlock faculty participation than any other single factor. Without grants, researchers cannot hire students to assist them, maintain instruments or sustain the multiyear projects that produce meaningful results.

Second, shared methodological standards – agreed-upon procedures for collecting, recording and evaluating UAP reports – so that findings from one research group can be compared and built upon by others.

Third, institutions could publicly affirm that they will evaluate appropriately rigorous UAP scholarship on its scientific merits during tenure reviews. Several universities have already done this for gun violence research and psychedelic-assisted therapy studies.

These are not isolated examples. Research into near-death experiences and adverse childhood experiences followed similar trajectories, moving from being a professional liability to mainstream legitimacy after the removal of institutional barriers.

The international comparison

This gap in UAP scholarship is unique to the United States. France’s GEIPAN, a dedicated investigation unit within its national space agency, has operated since 1977. It has publicly archived approximately 5,300 French UAP cases, of which about 2% to 3% remain unexplained after rigorous analysis.

In 2020, Japan formalized UAP reporting protocols for its Self-Defense Forces, the branch of the Japanese military responsible for national defense. By June 2024, more than 80 lawmakers had formed a parliamentary UAP investigation group that by May 2025 had formally proposed a dedicated UAP research office to the defense minister. Canada launched its own multiagency UAP investigation survey in 2023.

None of these actions has produced a corresponding response from American research universities. Universities provide independent, peer-reviewed analyses that government programs structurally cannot.

The University of Würzburg in Germany became the first Western university to officially recognize UAP as a legitimate object of academic research in 2022, when it formally added UAP investigation to its research canon. Researchers at Stockholm University and the Nordic Institute for Theoretical Physics in Sweden have been actively publishing peer-reviewed UAP research since 2017, most recently in Scientific Reports in October 2025.

Congress has passed legislation, the Pentagon is reporting on its investigations, and the president has directed federal agencies to begin releasing records. So the question no longer is whether governments take UAP seriously – it is whether universities will follow, and which ones will get there first.

Darrell Evans, Professor of Environmental Science and Sustainability, Purdue University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

The next escalation in the Iran conflict could be between the US and a vital ally

In the two weeks since the US and Israeli strikes on Iran began, Donald Trump’s war aims have fluctuated between crippling Iranian military capabilities and toppling the regime that has ruled there since 1979. But despite the success of the initial strikes, which killed the supreme leader, Ali Khamenei, many analysts believe that air power alone will not be sufficient to bring about regime change.

They say this objective would be impossible to achieve without combat troops on the ground, a move that most US military and political leaders have long opposed. Instead, one idea that seems to be circulating in Washington is to support an invasion by armed Kurdish groups in Iraq and western Iran to destabilise the Islamic Republic from within.

Trump publicly backed away from this idea on March 6, telling reporters: “I don’t want the Kurds to go into Iran … The war is complicated enough as it is.” But, given Trump’s trademark inconsistency and the unpredictable nature of this conflict, an armed Kurdish uprising remains a distinct possibility. Such a scenario could have consequences that extend far beyond Iran.

The Kurds are an ethnic group with their own language and culture who have lived in a mountainous area of the Middle East for centuries. Nowadays, they number around 30 million and live in a region that spans parts of Turkey, Iran, Iraq and Syria. The Kurds are widely considered to be the world’s largest stateless people because they do not have a country of their own.

This situation dates to the end of the first world war, when the Ottoman empire collapsed. Kurdish leaders at that time hoped to establish their own state, having lived for 400 years under Ottoman rule. But instead their homeland was divided between several new countries that emerged from the defeated Ottoman state. This left Kurdish communities split across international borders.

Around 10% of Iran’s population is Kurdish and many live in the country’s north-west near the borders of Iraq and Turkey. The Kurdish region of Iran has long been the least economically developed part of the country and Kurdish political parties are outlawed. Armed Kurdish groups have periodically clashed with the Iranian state, demanding greater autonomy or independence.

The Kurdish question is even more sensitive in Turkey, which is home to the largest population of Kurds in the world. Since 1984, the Turkish state has been locked in conflict with the Kurdistan Workers’ party (PKK), an armed group that has fought to establish an independent Kurdish state. This conflict has killed more than 40,000 people in the past four decades.

For the Turkish government, the possibility that the US may support Kurdish fighters in neighbouring Iran is therefore not just a foreign policy issue. Turkish leaders worry that strengthening Kurdish armed groups elsewhere in the region could embolden similar movements inside Turkey itself.

In the recent past, Turkey has launched military incursions into the Kurdish regions of Iraq and Syria. It has also fought a brutal counterinsurgency against PKK fighters inside its own borders. These actions show how strongly Turkish leaders oppose any notion of Kurdish independence anywhere in the region.

American support for Kurdish fighters has caused tension between the US and Turkey in the past. Turkey strongly opposed the partnership between Washington and Syrian Kurdish forces during the fight against the Islamic State militant group in Syria in the late 2010s. It argued that some of these Kurdish groups were linked to the PKK.

Turkey’s relations with Israel have also been strained by the Kurdish question. The Turkish president, Recep Tayyip Erdoğan, has accused the Israeli prime minister, Benjamin Netanyahu, of undermining the transitional Syrian government by aiding Kurdish groups there. The Kurdish issue has clearly become a major source of tension between Turkey, a key member of the Nato alliance, and the west.

So far, Turkey has largely remained neutral in the Iran war. Despite their regional rivalry, Turkish and Iranian leaders share concerns about Kurdish separatist movements and have sometimes cooperated to contain them. In the past, security forces from both countries have coordinated efforts against Kurdish militant groups operating along their shared border.

Turkish and Iranian officials have also exchanged intelligence and carried out military operations against Kurdish fighters moving between the two countries. And both governments strongly opposed the 2017 referendum on independence that was held by the Kurds in northern Iraq. Over 92% of votes were cast in favour of independence.

Iranian regime change

For Turkey, the collapse or fragmentation of the Iranian state would be deeply worrying. It could create exactly the conditions Turkish leaders fear most: armed Kurdish groups operating across a much longer and more unstable border.

Another concern is the possibility of a new refugee crisis. Turkey already hosts nearly 4 million Syrians following the civil war that began there in 2011 – the largest refugee population in the world. This has become a major political issue inside Turkey.

If conflict or state collapse in Iran – a larger and even more politically complex state than Syria – triggers large-scale displacement, many more refugees could head west towards Turkey. Such a scenario would place considerable political and economic pressure on the government.

Washington may see the Kurds as a useful way to confront the Iranian regime without deploying American troops. But such a strategy could create new tensions elsewhere in the region. For Turkey, Kurdish militancy is not simply a foreign policy issue but a core national security concern.

If the Iran war ends up empowering Kurdish armed groups or destabilising Turkey’s border, Erdoğan may yet feel compelled to respond. This could open up another front in an already expanding regional conflict.

Ben Seymour, PhD Candidate in International Relations, Nottingham Trent University and Eszter Simon, Senior Lecturer Politics and International Relations, Nottingham Trent University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Secrets, sexism and hypocrisy: Inside the Murdochs' real succession drama

Does the world need another biography of Rupert Murdoch? It depends what it has to say and who has written it.

Bonfire of the Murdochs, by journalist Gabriel Sherman, looks promising. He made his name with an exhaustively researched biography of longtime Fox News head and serial sexual harasser Roger Ailes. The Loudest Voice in the Room (2014) has 98 pages of endnotes and a team of three fact-checkers. It was made into a series starring Russell Crowe as Ailes. Sherman was also the screenwriter of the Donald Trump biopic, The Apprentice, which Trump fought hard to prevent being screened.

Promising credentials, yes, but what does Sherman add to the eight Murdoch biographies already published?

The first was Simon Regan’s business-oriented biography published in 1976. It has been forgotten, but not so George Munster’s A Paper Prince (1985), which laid out Murdoch’s deal-making modus operandi, nor William Shawcross’ 1992 semi-authorised work, which charted Murdoch’s creation of the first global media empire.

Michael Wolff’s The Man Who Owns the News (2008) painted the most vivid portrait of the Australian-born media mogul. Flushed with the success of buying The Wall Street Journal, Murdoch agreed to more than 50 hours of interviews with Wolff and opened the doors of his notoriously secretive media empire to the Vanity Fair media columnist.

Wolff did report the Wall Street Journal takeover in detail, but he also retailed a breathtaking amount of industry and family gossip.

One example among many. He writes that Prudence, Murdoch’s daughter from his first marriage, gave him exasperated grooming advice after Murdoch botched a DIY makeover as he tried keeping up with Wendi Deng, his third wife who was the same age as his children.

“Dad, I understand about dyeing the hair and the age thing. Just go somewhere proper. What you need is very light highlights.” But he insists on doing it over the sink because he doesn’t want anybody to know. Well, hello! Look in the mirror. Look at the pictures in the paper. It’s such a hatchet job.

Murdoch’s response? He told her she needed a face lift.

Murdoch’s response to Wolff’s biography was that it needed more than a face lift – it should not have been published with the errors it had. He did not sue for defamation, however. Wolff has since become an even more controversial figure: he is embroiled in suit and counter-suit with Donald and Melania Trump over Wolff’s claims about Trump’s relationship with convicted sex offender Jeffrey Epstein.

The long-running struggle for succession in the Murdoch family famously inspired the brilliantly coruscating fictional television series Succession (2018–2023). Sherman’s is the first biography to deal with its resolution, which happened only last September, when Rupert Murdoch and his eldest son, Lachlan, succeeded in changing the terms of an apparently irrevocable family trust.

The trust had been created when Rupert and his second wife, Anna, separated in 1998. (She died on February 17 this year.) It was her attempt to put a brake on Murdoch’s continual pitting of his children, especially his sons, against each other in the quest to succeed him as head of News Corporation.

It didn’t work. Rupert’s plan for Lachlan to lead the company, continuing its hard right position led by Fox News, eventually succeeded. To a greater or lesser degree, the other children from his first two marriages – Prudence, Elisabeth and James – loathed what Fox News had become and, reportedly led by James, were prepared to use their votes in the family trust to oust Lachlan after Rupert died.

In the end, though, they agreed to sell their shares in the family trust for US$1.1 billion each. Grace and Chloe, the two children from Murdoch’s third marriage, are part of a newly drawn family trust with their own shares in News.

The machinations behind this episode were reported last year in two extraordinary pieces of journalism, by Jonathan Mahler and Jim Rutenberg of The New York Times, who were leaked 3,000 pages of court documents about the case, and by McKay Coppins in The Atlantic magazine. He secured a long, revealing interview with James Murdoch, who was labelled in Rupert and Lachlan’s legal materials the “troublesome beneficiary”.

For those without subscriptions to these publications, my colleague, Andrew Dodd, and I discussed the case in The Conversation here and here.

An outstanding journalist

Sherman, another outstanding journalist, has been reporting on the Murdochs since 2008. Ailes threatened him with legal action and engineered a smear campaign over The Loudest Voice in the Room, as Sherman calmly detailed in “A Note on Sources” at the end of the book. It was Sherman who in 2016 broke the news about Fox News presenter Gretchen Carlson’s sexual harassment suit against Ailes that led to his ousting from the network.

In 2018, he revealed Murdoch came close to death after a fall on Lachlan’s maxi-yacht while sailing in the Caribbean.

Sherman also had the inside scoop on the end of Murdoch’s fourth marriage in 2022. The then 91-year-old mogul not only broke up by text with his wife, supermodel and actor Jerry Hall, but included in the divorce terms a demand she not give story ideas to the scriptwriters of Succession!

Hall later realised the marriage had ended, in Murdoch’s eyes, some time before, when he met Ann Lesley Smith, a 65-year-old former dental hygienist turned conservative radio host and follower of QAnon-style conspiracy theories. At a dinner at Murdoch’s ranch in Carmel, Smith gushed that Murdoch and Fox News were the saviours of democracy, and offered to clean his teeth for him.

Murdoch proposed to Smith in early 2023, but he soon called off the wedding after another dinner, where she told then Fox News host Tucker Carlson he was a messenger from God. Hall felt humiliated by Murdoch’s treatment of her but told friends she took satisfaction in making an effigy of him, tying dental floss around its neck and burning it on the barbecue.

All these disclosures, and gossip, are included in Bonfire of the Murdochs. Indeed, Sherman’s reporting, for New York and Vanity Fair magazines, forms a good deal of the book. If you have already read his lengthy articles, there is not much new here. But if you haven’t, or if you are confused by the countless deals and complex financial/political transactions of Murdoch’s seven-decades-plus career in media, this biography is well worth reading.

‘Destroyed everything he loved’

At 241 pages, it has the virtue, as well as the shortcoming, of being the shortest of the Murdoch biographies. Sherman has a gift for succinctly summarising key themes.

The first is that more than most, Murdoch’s media empire is secretive. Remember, his plan to change the family trust was supposed to be heard behind closed doors. We only know about it because The New York Times was leaked the court records, which revealed Murdoch’s testimony. As Sherman puts it: “Rupert crafted narratives in the shadows, but the courtroom would require him to do it in the open.”

Initially, it did not go well for Murdoch. Under cross-examination, his determination to get his way no matter what and his sexism towards his daughters were revealed.

The second theme is the extent to which Murdoch will ignore the stated mission of his media outlets – report what is happening accurately – if doing so aligns with his commercial goals. During the global pandemic, while Fox News hosts fulminated about lockdowns and advocated dubious treatments like hydroxychloroquine, Murdoch followed the science and, Sherman reports, was one of the first in the world to be vaccinated, in December 2020.

“He was scared for himself and was very careful,” a person who spoke to Murdoch at the time recalled for Sherman. Questioned about the disconnect between his network’s coverage and his own behaviour, Murdoch would deflect responsibility for the presenters’ commentary, even though this seeming passivity contrasted sharply with his history of editorial interference.

As Sherman comments: “The hypocrisy revealed something essential about Rupert’s worldview: he had always been able to separate his personal beliefs from his business interests.” He adds that Murdoch thought then president, Donald Trump, grievously mishandled the pandemic but refused to use his position as head of Fox to pressure the president to treat it seriously.

Nor did Murdoch take any responsibility when a friend told him the channel was killing its elderly audience. According to one of Sherman’s sources, he replied: “They’re dying from old age and other illnesses, but COVID was being blamed.”

The biographer quotes other sources who say the quid pro quo was that Murdoch had successfully lobbied Trump in his first term to take action against Facebook and Google, who were winning advertising revenue from News (along with other legacy media companies) and to open up land for fracking, which was to boost the value of Murdoch’s fossil fuel investments.

The third theme is that Murdoch built the world’s first global media empire but has always run his companies as a family business, with him as the first and ultimate decision-maker. Nimbleness is the advantage of this approach. As with any autocratically run organisation, though, there are disadvantages. Among them is that no one has a perfect strike rate for success.

Along the way, talented executives such as Barry Diller, former chief executive at Twentieth Century Fox, or Chase Carey, former top executive at 21st Century Fox, knew – or found out – that their path to the top was blocked not only by the company’s head, but by Murdoch’s desire to advance or protect family members. Murdoch once told shareholders complaining about nepotism: “If you don’t like it, sell your shares.”

From the 1950s, when Murdoch was the “boy publisher” of the afternoon newspaper he inherited from his father, the Adelaide News, he behaved, Sherman writes, as though “promises were like inconvenient facts: fungible when they got in the way of profit.” The newspaper’s editor, Rohan Rivett, was the first among several, alongside numerous politicians, who learnt this to their cost.

The fourth theme is that Murdoch has always wanted his children involved in his business, but only on his terms. “Growing up,” Sherman writes, “the children’s relationship to their father was expressed through the business, making them equate paternal love with corporate advancement.”

Where earlier writers have drawn parallels with Shakespeare’s King Lear, Sherman thinks King Midas is a more appropriate comparison.

Like the mythical monarch whose touch turned everything to gold, Rupert built a $17 billion fortune but destroyed everything he loved in the process. His media outlets stoked hatred and division on an industrial scale, and amassing that wealth required him to damage virtually anything he touched: the environment, women’s rights, the Republican Party, truth, decency – even his own family.

The weakest part

These are potent themes that resonate with those of us living in the country of Murdoch’s origin, which brings us to the book’s shortcoming. Australia features early on, but this is the weakest part of the book. Murdoch’s early years are well covered in Munster and Shawcross’s biographies and more recently have been given detailed attention in Walter Marsh’s Young Rupert (2023).

There are basic errors: The Daily Mirror in Sydney, which Murdoch bought in 1960, is misnamed The Mirror, while the Herald and Weekly Times Ltd., which he bought in 1987, becomes the Herald Times Group. Nor does it help that on the book’s final page, Sherman writes “Rupert was with his fourth wife while his children were scattered across the globe” – when Murdoch had discarded Jerry Hall in 2022 and was now married a fifth time, to Elena Zhukova.

Fourth, fifth? It’s easy to lose count. More seriously, in buying the HWT, Murdoch became the dominant newspaper owner in Australia, but his control did not account for 75% of the market, as Sherman writes. It is more like 60% to 65%, depending on whether you use circulation or number of newspapers as a measure.

Murdoch’s early years in Australia are briskly dealt with in chapter one, before he moves on in his relentless quest to acquire more media properties in the United Kingdom and the US. This is true as far as it goes, but once Murdoch does head north, his biographer loses almost all interest in how Australia is faring – even, or perhaps especially, after Murdoch acquires the HWT.

The same is true, to a lesser extent, of Sherman’s treatment of the UK. The phone hacking scandal is covered, of course, but not much else once Murdoch arrives in New York in the mid-1970s.

What is lost, then, in Sherman’s compression, is context for events. Such as: where did the phone hacking culture come from? What lengths did News go to in denying the practice went beyond two “rogue reporters” or in obstructing official inquiries? Why have they since paid so much money settling with phone hacking victims, rather than going to court?

Missing, too, is any sense of the connections between Murdoch’s media outlets in the three main countries in which News operates. Has the hostile coverage of trans people been imported from Fox News to Sky News Australia? What effect has his media outlets’ campaigning against action on climate change had across these three countries?

These, and others, are relevant questions to ask about a global media empire. Rupert Murdoch may have handed over the company to Lachlan in 2023, but he led it for 70 years, he created its culture and he still wields influence. In case it passed you by, it was Rupert Murdoch – not Lachlan, according to the reports – who in February had a private dinner at the White House with US president Donald Trump.

Matthew Ricketson, Professor of Communication, Deakin University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

A behind-the-scenes power player emerges as an unexpected threat to Trump

It’s well known that Donald Trump consumes television broadcasts and often makes policy based more on Fox News punditry than advice from political or government advisors. So it’s unsurprising that one of his most influential advisers, Tucker Carlson, has never held a political or government appointment.

Of course, Carlson, an early sceptic about the Iraq War, last week called the attack on Iran “absolutely disgusting and evil”. Trump responded by saying “Tucker has lost his way” and “he’s not MAGA”.

While this may signal the end of his hold over Trump, they have weathered disagreements before – as when Carlson attacked last year’s strikes on Iran and consistently pressed Trump over the Epstein files.

But if Carlson’s ruptures with Trump widen, some observers told the author of a new book, “he could then portray himself to a disillusioned MAGA base as the true leader of their movement – and run for president himself in 2028”.

The great mystery of Tucker Carlson is how a once-serious journalist, whose writing for the likes of New York magazine and Esquire was admired, wandered into the crazy world of the American far right and came to dominate it.

In his book, Hated By All the Right People, Jason Zengerle (a contributing writer for the New York Times magazine) traces Carlson’s evolution over the past 30 years. It is, he writes, the story of what has happened to the United States in that period.

Origin stories

Carlson was born in 1969 to a prominent conservative father and a bohemian heiress mother: they divorced before his eighth birthday and Carlson’s father got sole custody. His mother lived mostly abroad. “I don’t know this person,” Carlson reported feeling as she was dying. She left him a dollar in her will.

He failed to graduate from college, where, Zengerle writes, he was an “abysmal student”, but charmed his way into a succession of small conservative media outlets, and a few national magazines. By the turn of the century, he discovered the lure of television and went through a series of attempts to break into mainstream broadcasting.

First CNN, where Jon Stewart essentially ended Carlson’s contract and his show by savaging it, at length, while appearing as a guest. Then PBS, and MSNBC – where Carlson picked liberal self-described “butch lesbian” talk radio host Rachel Maddow to be his sparring partner. (Maddow is now one of the most high-profile media defenders of progressive politics in the US.)

At his lowest point, he became a political analyst at the only cable-news network he’d yet to work at, Fox News – or, as he’d once described it, “a mean, sick group of people”.

His rise (and increased air time) was tied to Donald Trump’s: he was the rare conservative or Fox News pundit who didn’t initially dismiss him. Fox gave him his own show days before Trump was elected in 2016.

For seven years, Carlson was a mainstay of Fox right-wing cheerleading, until he was unceremoniously dumped in 2023. Just why he was removed is not clear. Carlson came to believe it was part of Fox’s settlement in the Dominion lawsuit. Zengerle speculates Rupert Murdoch finally lost patience with Carlson (despite his closeness to Lachlan Murdoch), as he had on several occasions with Trump too.

Considered for Trump’s ‘veep’

Carlson bounced back, creating his own successful network, on which he hosted interviews with Andrew Tate, Nazi apologist historian Darryl Cooper and Trump himself (including an interview aired on X at the same time as Fox’s first presidential primary debate, in which Trump refused to participate).

In 2024, he campaigned vigorously for Trump’s second term. Trump even told reporters, Zengerle writes, that he “was entertaining the idea of tapping Carlson as his veep”.

Carlson had endeared himself further by presenting a three-part series, Patriot Purge, which portrayed the riots at the Capitol on January 6 2021 as “a false flag operation, instigated by undercover FBI operatives in the crowd, so that the Biden administration could then persecute Americans for the crime of being conservative”.

During the Biden years, a bizarre crowd of conspiracy seekers and racist right-wingers paid court to Trump. Carlson was among the most important, possibly even more influential than Elon Musk. As Zengerle writes, he was active behind the scenes in the vice-presidential selection of JD Vance, whom he had helped mentor into politics, and in the appointment of at least two cabinet members: Robert F. Kennedy Jr and Tulsi Gabbard.

Vance’s “remarkable dressing-down” of Ukrainian president Volodymyr Zelensky was “a direct echo” of Carlson’s criticisms on his shows for the previous three years. Carlson’s criticisms of Zelensky drew on antisemitic tropes, calling him “ratlike” and “a persecutor of Christians”.

Zengerle credits Carlson with providing much of the mishmash of policies that have marked Trump’s second term (as well as the border wall with Mexico, which Carlson argued for as far back as 2005).

Trump has consistently expressed hostility to immigrants, with the notable exception of white South Africans – whose cause Carlson seems to have pioneered – and promoted Viktor Orban’s Hungarian authoritarian regime, which Carlson called a “lesson” for America after he visited to interview Orban, before anyone in the US had paid him much attention.

Unsurprisingly, Carlson has expressed sympathy for Vladimir Putin. He became the first American journalist to obtain a one-on-one interview with Putin after the invasion of Ukraine.

It was widely believed Putin played him, avoiding any difficult questions about respect for Ukrainian sovereignty: just as he had played Trump in his infamous meeting in Helsinki in 2018. Zengerle does not explore whether there is any connection between the two men’s remarkable sympathy for the Russian dictator.

Since Trump’s re-election, Carlson has become less sycophantic, particularly on Iran and the Epstein files. At one point, he claimed Epstein was working at the behest of Israel’s government: part of the increasingly antisemitic and anti-Israeli rants that characterise the contemporary Carlson.

Carlson and the Republican journey

Carlson, like Vance before he became vice president, has become a strident America Firster, opposed to involvement in foreign wars and to campaigns for regime change.

Given the uncertain outcome of the current war on Iran, it is impossible to predict whether Carlson’s position as perhaps the most significant right-wing ideologue in the American media is doomed to burn out, or to become yet more influential.

Either way, Zengerle is right to point to Carlson’s career as a symbol of the way the Republican Party has been captured by a set of beliefs and principles previous Republican leaders would have denounced as racist and undemocratic. The two Republican candidates for president before Trump, John McCain and Mitt Romney, would no longer find a home in their party.

But of course, they both lost to Barack Obama. Trump’s 2016 victory caused a major reversal in American politics and many of the people who originally abhorred him are now part of his inner circle. Both Vance and secretary of state Marco Rubio had declared him totally unfit for office. Zengerle reminds us that while a senator, Rubio supported immigration reforms he has now disavowed in fealty to the president.

Carlson shared these doubts about Trump in 2016, though he was one of the first to recognise the strange charisma that would propel Trump to the top.

As the Republican Party has moved increasingly into territory that used to be regarded as frankly conspiratorial and crazed, so too has Carlson. But while Zengerle does an excellent job of charting this transformation, he does little to explain why it happened.

He writes well, as befits a veteran of the best US print media, but there is a surplus of information and a lack of real analysis. Take the example of Carlson’s increasingly virulent antisemitism. Early in his career, he worked with and for many prominent Jewish intellectuals, like neoconservative writers Bill Kristol and John Podhoretz. Zengerle demonstrates that Carlson is giving increasing airtime to extreme antisemites, but makes no real attempt to explain why.

Calculation or genuine belief?

But his drift towards the fringes of overt racism seems to date back to his founding of the briefly successful website The Daily Caller in 2010.

While it began with some claim to journalistic integrity, The Daily Caller soon found space for that particularly virulent antisemitism that ties together ancient tropes about Jews with fear and hatred of African Americans and Muslims. Carlson’s willingness to host antisemites on his program has meant his criticism of Israel’s behaviour in Gaza is too easily dismissed by the powerful Israeli lobby in the US.

Reading Carlson’s increasing attraction to fringe irrationality, I wondered how far this is political calculation and how far it represents genuinely held beliefs. Does Carlson ever wake in the night and ask himself if he bears any responsibility for Trump’s cruelty to alleged illegal aliens – or Republican attempts to disenfranchise electors?

Hated By All the Right People is a revealing title, akin to Hillary Clinton’s comment about the “basket of deplorables” who voted for Trump. But I would have liked to see Zengerle explore the reasons for Carlson’s appeal. As he concludes, Carlson now speaks to millions. Maybe he should have spoken to some of these millions, to better understand why they listen to him.

Dennis Altman, Vice Chancellor's Fellow and Professorial Fellow, Institute for Human Security and Social Change, La Trobe University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Iran's regime was built for survival — and a long war is now likely

The joint US–Israel strikes on Iran, which killed the Iranian supreme leader, Ayatollah Ali Khamenei, and Tehran’s retaliatory strikes on Israel and neighbouring Arab countries have again plunged the Middle East into war.

US President Donald Trump and Israeli Prime Minister Benjamin Netanyahu said their aim is to bring about a favourable regime change in Iran. The implications of this for Iran, the region and beyond should not be underestimated.

Although Khamenei’s killing is a significant blow to the Islamic regime, it is not insurmountable. Many Iranian leaders have been killed in the past, including Qassem Soleimani, Tehran’s regional security architect, who was assassinated by the US in January 2020.

But they have been replaced relatively smoothly, and the Islamic regime has endured.

Khamenei’s departure is unlikely to mean the end of the Islamic regime in the short run. He anticipated this eventuality, and reportedly last week arranged a line of succession for his leadership and that of senior military, security and political leaders if they were “martyred”.

However, Khamenei was both a political and spiritual leader. He commanded followers not only among devout Shias in Iran, but also among many Muslims across the wider region. His assassination will spur some of them to seek revenge, potentially sparking a wave of violent extremism in the region and beyond.

A regime built for survival

Under a constitutional provision of the Islamic Republic, the Assembly of Experts – the body responsible for appointing and dismissing a supreme leader – will now meet and appoint an interim or long-term leader, either from among their own ranks or outside.

There are three likely candidates to be his successor:

  • Gholam-Hossein Mohseni-Eje’i, the head of the judiciary
  • Ali Asghar Hejazi, Khamenei’s chief-of-staff
  • Hassan Khomeini, the grandson of the founder of the Islamic Republic, Ayatollah Ruhollah Khomeini.

The regime has every incentive to do what it must to ensure its survival. There are many regime enforcers and defenders, led by the Islamic Revolutionary Guard Corps (IRGC) and its subordinate paramilitary Basij group, across the country to suppress any domestic uprisings and fight for the endurance of the regime.

Their fortunes are intimately tied to the regime. So are a range of administrators and bureaucrats in the Iranian government, as well as regime sympathisers among ordinary Iranians. They are motivated by a blend of Shi’ism and fierce nationalism to remain loyal to the regime.

Trump and Netanyahu have called on the Iranian people – some 60% of whom are below the age of 30 – to topple the regime once the US-Israeli operations have crippled it.

Many are deeply aggrieved by the regime’s theocratic impositions and dire economic situation and took to the streets in protests in late 2025 and early 2026. The regime cracked down harshly then, killing thousands.

Could a public uprising happen now? So far, the coercive and administrative state apparatus seems to be solidly backing the regime. Without serious cracks appearing among these figures – particularly the IRGC – the regime can be expected to survive this crisis.

Global economic pain

The regime has also been able to respond very quickly to outside aggression. It has already hit back at Israel and US military bases across the Persian Gulf, using short-range and long-range advanced ballistic missiles and drones.

While many of the projectiles have been repelled, some have hit their targets, causing serious damage.

The IRGC has also set out to choke the Strait of Hormuz – the narrow strategic waterway that connects the Persian Gulf to the Gulf of Oman and Indian Ocean. Some 20% of the world’s oil and 25% of its liquefied gas flows through the strait every day.

The United States has vowed to keep the strait open, but the IRGC is potentially well-placed to block traffic from going through. There could be serious implications for the global energy supply and broader economy.

Both sides in this conflict have now crossed all previous red lines. They are in open warfare, which is engulfing the entire region.

A prolonged war looks likely

If there was any pretence on the part of Washington and Jerusalem that their attacks would not lead to a regional war, they were wrong. This is already happening.

Many countries that have close cooperation agreements with Iran, including China and Russia, have condemned the US-Israeli actions. The United Nations secretary-general António Guterres has also urgently called for de-escalation and a return to diplomatic negotiations, as have many others.

But the chances for this look very slim. The US and Iran were in the middle of a second round of talks over Tehran’s nuclear program when the attacks happened. The Omani foreign minister, who mediated between the two sides, publicly said just days ago that “peace was within reach”.

But this was not enough to convince Trump and Netanyahu to let the negotiations continue. They sensed now was the best time to strike the Islamic Republic to destroy not just its nuclear program but also its military capability after Israel degraded some of Tehran’s regional affiliates, such as Hamas and Hezbollah, and expanded its footprint in Lebanon and Syria over the last two and a half years.

While it is difficult to be definitive about where the war is likely to lead, the scene is set for a long conflict. It may last not days but weeks, or longer. The US and Israel will accept nothing short of regime change, and the regime is determined to survive.

With this war, the Trump leadership is also signalling to its adversaries – China, in particular – that the US remains the preeminent global power, while Netanyahu is seeking to cement Israel’s position as the dominant regional actor.

Pity the Iranian people, the region and the world that have to endure the consequences of another war of choice in the Middle East for geopolitical gains in an already deeply troubled world.

Amin Saikal, Emeritus Professor of Middle Eastern Studies, Australian National University; The University of Western Australia; Victoria University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Stephen Colbert was right

Talk show host Stephen Colbert made headlines on Feb. 17, 2026, when he wrapped a network statement in a dog-waste bag and tossed it in the trash.

He did it live, while on air.

The move came after CBS lawyers reportedly told him he could not broadcast a scheduled interview with Democratic Texas Senate candidate James Talarico on his show, The Late Show with Stephen Colbert. According to Colbert, the network warned him that broadcasting the interview could trigger the Federal Communications Commission’s equal time rule, which requires broadcasters to allow political candidates equal access to the nation’s airwaves.

CBS said it gave Colbert “legal guidance” that airing the segment could raise equal time concerns and suggested other options.

Colbert countered that in decades of late-night television, he could not find a single example of the rule being enforced against a talk show interview. He ultimately posted his Talarico interview on YouTube instead, where broadcasting rules don’t apply.

As a media scholar, I believe Colbert is right about the law. Congress has deliberately protected editorial discretion to prevent equal time rules from chilling political speech. And the FCC has extended this privilege to shows like his.

To understand why, you have to go back to 1959 and to a forgotten fight over the role of broadcasting in a democratic society.

Amending ‘equal time’

Because the airwaves have been viewed as a scarce public resource, radio and television broadcasting have been regulated to balance the First Amendment rights of the press with public interest obligations. That includes the need to provide reasonable access to the airwaves for candidates for office – so citizens can hear what they have to say, whether in the form of paid advertising or unpaid news coverage.

After first appearing in the Radio Act of 1927, the equal time provision was codified in Section 315 of the Communications Act of 1934.

That law created the FCC and still governs the use of the nation’s airwaves today. It requires broadcast licensees to provide “equal opportunities” to legally qualified candidates in a given election if they allow one candidate to “use” their facilities. The requirement was intended to prevent broadcasters from favoring one candidate over another and to foster robust political debate that would serve the public interest.

But the statute did not clearly define what counted as a “use.”

That ambiguity was a known issue, but it came to a head in 1959, when Lar Daly, a fringe Chicago mayoral candidate, filed a complaint with the FCC. He argued that if stations aired news clips of his opponents – including the incumbent mayor – as part of their routine coverage, he was entitled to equal time on air.

The FCC agreed, ruling that even routine news coverage of a candidate could trigger equal time obligations.

Broadcasters immediately warned that the decision would make political journalism nearly impossible. If every news interview or campaign clip required providing comparable time to every rival – including minor or fringe candidates – stations would either have to book everyone or drastically scale back political coverage.

NBC president Robert Sarnoff issued a thinly veiled threat in a message that was not lost on politicians who would be affected by the change: “Unless the gag is lifted during the current session of the Congress, a major curtailment of television and radio political coverage in 1960 is inevitable.”

Later that year, Congress stepped in and amended Section 315 to create explicit exemptions for “bona fide” newscasts, news interviews, news documentaries and on-the-spot coverage of news events. As my colleague Tim P. Vos and I note in our research on the history of the amendment, Congress rejected calls to repeal equal time altogether.

Instead, lawmakers preserved the rule for candidate-sponsored advertising while shielding news programming. Persuaded by broadcasters, lawmakers determined that professional journalism, guided by norms of balance and fairness, would best serve democratic discourse.

In signing the 1959 legislation, President Dwight D. Eisenhower highlighted the “continuing obligation of broadcasters to operate in the public interest and to afford reasonable opportunity for the discussion of conflicting views on important public issues.”

Eisenhower concluded by appealing to the good intentions of the nation’s broadcasters: “There is no doubt in my mind that the American radio and television stations can be relied upon to carry out fairly and honestly the provisions of this Act without abuse or partiality to any individual, group, or party.”

The talk show exemption

Over the decades, the FCC has interpreted the 1959 exemptions broadly.

Programs ranging from Meet the Press to The Jerry Springer Show to The Tonight Show and other interview-based broadcasts have been treated as “bona fide news interviews,” even when hosted by comedians. That’s why Colbert’s claim that there is no enforcement history against late-night talk shows is accurate.

It’s important to remember that equal time still applies in other contexts. If a candidate purchases or receives airtime for an advertisement, opponents are entitled to comparable access.

Equal time also applies to non-exempt entertainment programming, such as Saturday Night Live. Donald Trump’s hosting gig on SNL in November 2015 triggered an equal time request from four opposing primary candidates. And NBC obliged by providing a comparable amount of airtime for their campaign messages.

FCC Chairman Brendan Carr recently signaled he was considering eliminating the talk-show exemption, arguing that some programs are “motivated by partisan purposes.”

As of now, no legal change has occurred. And it seems to me that CBS has acted out of caution, responding to political and regulatory pressure rather than to an actual rule change. That makes this episode unusual: The equal time rule was perhaps applied indirectly, through corporate self-censorship, not through direct FCC enforcement.

Why this moment matters

Either way, the Colbert incident highlights the growing restrictions on editorial independence during the second Trump administration, whether imposed by government threat or corporate fear.

Whether through direct regulatory intervention or indirect corporate influence, this incident and others like it show an increased willingness to interfere with the editorial independence of media producers.

The dispute is part of what some critics view as an ongoing effort by the Trump administration to silence criticism. Trump is no fan of Colbert and has targeted comedians before.

CBS announced in 2025 that Colbert’s show would be canceled in May 2026, leading many to suggest CBS was trying to appease Trump and his FCC, particularly ahead of a then-pending merger that required FCC approval.

The 1959 amendment that created the equal time exemption aimed to preserve editorial independence and protect free expression by limiting equal time claims and ensuring vibrant political discourse. The decision reflected a judgment that professional editorial discretion, not mandatory equivalence, best served citizens.

If the FCC alters the exemption, it would represent a major shift in U.S. media policy and would almost certainly face legal challenges. The government has an important role to play in promoting free expression and protecting free speech, but this is a good time to be wary of efforts to alter regulations to control content.

Seth Ashley, Professor of Communication, Boise State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

How Britain's right wing is benefiting from the Epstein scandal

The arrest of Andrew Mountbatten-Windsor on suspicion of misconduct in public office will heap yet more pressure on the beleaguered government of Prime Minister Keir Starmer.

Mountbatten-Windsor’s arrest over allegations he passed government documents to sex offender Jeffrey Epstein comes directly on the heels of the resignation of Peter Mandelson, Starmer’s ambassador to the United States, due to his own alleged associations with Epstein.

The fallout from the scandal is hugely damaging to public trust in both the political establishment and institutions in the United Kingdom, including the royal family.

Trust in the royals already declining

It’s hard to separate the fate and popularity of the royal family from the institutions of British governance because they’re very much part of it.

The monarchy, specifically the Crown, is part of the British constitution. The monarch gives assent to all legislation that’s passed by parliament (in other words, he or she has to sign it for it to pass). While that might seem like a rubber-stamping exercise and that the monarch is a mere symbol in British politics, King Charles and, in slightly different ways, Queen Elizabeth II certainly have had their political preferences.

And despite the impression you get during royal occasions like weddings, funerals and coronations, the royals don’t enjoy unanimous support in Britain. In fact, public support has been declining in recent years, especially among the young.

In an Ipsos survey released this week, just 47% of Britons said they had a favourable opinion of the royal family on the whole (a seven-point decline from November). And just 28% of Britons believe the royal family has handled the allegations against Mountbatten-Windsor well, compared to 37% in November.

Importantly, there’s been a long-term trend of steady decline in support for the monarchy since 1983, when the British Social Attitudes survey first asked about this.

More broadly, and in common with many other liberal democracies, there is a pervasive sense that the Epstein scandal is further evidence of a self-serving, corrupt elite enriching itself and harming others, while many people in the “left behind” and “squeezed middle” of society are struggling.

Politically, this perception adds further fuel to the notion that the inequality between the rulers and the ruled has become unjustifiable. Something has to change.

Pressure mounting on Labour

Starmer’s Labour government was already deeply unpopular before Mandelson’s alleged ties to Epstein were revealed. Now, it has entered some sort of permanent crisis mode.

Mandelson was one of the key figures behind the so-called “New Labour” project associated with the leadership of Prime Minister Tony Blair from 1997–2007.

New Labour has a dual legacy in British politics. On one level, it was the most electorally successful Labour government ever. But that electoral success seemed to come at the expense of a clearly defined sense of what the Labour Party stood for. Key players like Mandelson courted wealthy backers and moved Labour to the centre of British politics to, not unreasonably, win elections.

As such, many Labour supporters started to drift away from the party and towards other, at times diametrically opposed, political parties. In Scotland, this benefited the pro-independence parties. In England, it benefited the radical-right Reform UK.

Reform has precious little governing experience, but that is its appeal. Its radical messages are finding traction with a large number of voters, many of whom formerly supported Conservative or Labour.

So in this context, when Mandelson, an already divisive figure, was named ambassador to the US in the belief he could help manage President Donald Trump, Starmer’s political gamble to reinstate him to a public role backfired.

Reform could ultimately benefit

The British government’s travails represent another gilt-edged opportunity for Reform UK to capitalise on the unpopularity of Starmer, Labour and politics more broadly. But there is a risk for Reform, too.

Radical-right parties tend to place a great emphasis on the figure of the leader. For Reform UK, this is Nigel Farage.

Farage has had an incredible impact on British politics, especially since Brexit. But Farage, a former merchant banker, is also part of this global elite, despite pitching his politics at the “left behinds”. He has spent years courting Trump’s friendship. So, while there are no allegations against him related to Epstein, the public anger towards elites in general may eventually rebound on Farage, too.

Reform UK, however, is positioning itself successfully as an alternative to the two major parties in the UK, and could form a minority government at the next UK-wide elections in 2029.

The Conservative Party has shot its bolt as a result of its 14 years in government. And Labour came to power more as a rejection of the Conservatives than an endorsement of its own policies. It has thus far excelled in failing to meet even those low expectations, to Reform’s benefit.

Excluding a by-election in February, the first major political test will be local government elections in England, and elections to the Scottish Parliament and Welsh Senedd in May. A poor Labour showing will quite possibly lead to a leadership challenge against Starmer, whose government seems incapable of stemming the rise of support for an emboldened Reform.

A boost to republicanism

“Unprecedented” is an over-worn term. However, the arrest of a member of the royal family is the first in England since 1647 (it didn’t end well).

Prince William is still very popular. But there could still be very serious consequences for support for the monarchy in the various nations of the United Kingdom.

There isn’t the same sort of support for republicanism in England as there is in Australia, where republicans can delegitimise the king as a “foreign” monarch. Although this argument is made by republicans in Northern Ireland, English republicanism needs to be driven by some other sentiment.

And the Epstein crisis could be it, given it is drawing attention to gross inequality and damaging entitlement. It’s hard to see where exactly all this will end up, but it is quite possible this will give the greatest boost to anti-monarchical sentiment in England for some centuries.

It is important not to forget the women and girls who were victims of this rich man’s cabal. Yet, one great harm of the Epstein scandal in Britain is the further damage done to trust in institutions of governance and the boost it provides for the illiberal critics of what seems like a decaying order.

Ben Wellings, Associate Professor in Politics and International Relations, Monash University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Clementine Barnabet: The Black woman blamed for serial murders in the Jim Crow South

In April 1912, a young Black woman named Clementine Barnabet confessed to murdering four families in and around Lafayette, Louisiana. The widespread news coverage at the time effectively branded her a serial killer.

Her confession, however, did not align with the timeline of crimes that had gripped America’s rice belt region with fear. Even today, her guilt is debated.

From November 1909 until August 1912, an unknown assailant – or assailants – zigzagged across southwestern Louisiana and southeastern Texas. Many Black families were slaughtered in their homes under the cover of darkness. An ax – the telltale weapon – was almost always found in the bloody aftermath.

All but one of the scenes were located within a mile of the Southern Pacific Railroad’s Sunset Route. In each case, a mother and child were always among the victims. Evidence of additional weapons was often found nearby, suggesting a deliberate cruelty to the carnage.

Dubbed the “axman”, the unknown assailant eluded the authorities and terrified local Black communities.

Today, when scholars and laypeople alike discuss Clementine Barnabet, they oscillate between two extremes: portraying her as a fear-inducing, cult-leading Black female serial killer, or as an innocent young Black woman caught in circumstances beyond her control.

In more than a decade of researching Clementine Barnabet, I’ve been struck by how print media created overtly sensationalized accounts of the mythology of the axman and, by extension, the axwoman. Whether Barnabet committed the crimes she said she did – or any of the axman murders, for that matter – is irrelevant to the primary motive the media constructed for her fatal violence: religion.

Diverse faith traditions

In Jim Crow Louisiana, various expressions of faith were possible. The state’s history as a French colony – one that also practiced slavery – meant it was home to the largest percentage of Black Catholics in the United States.

At the same time, religions like Voodoo, which originated in West Africa, reached the region on slave ships. Voodoo was not necessarily at odds with Catholicism; enslaved practitioners creatively adapted their ancestral faith to that of their enslavers.

Some displays of faith were not organized religions at all, but folkways. Hoodoo, for example, has West African origins, though it also draws upon European and Native American elements. Hoodoo practitioners – sometimes called doctors – and their clients often practice a religion, yet they also seek comfort in the supernatural possibilities of their craft.

This craft involves the physical manipulation of earthly elements such as graveyard dirt or plants like John the Conqueror root to achieve magical ends, often resulting in conjures – or ritual objects – needed to bring about desired goals. Conjures are believed to help people protect themselves, harm one’s adversaries, alter one’s circumstances, intervene in one’s relationships and more.

In their most powerful form, believers contend that conjures can bring about a person’s death.

For some believers, elements of Catholicism, Voodoo, Protestantism and hoodoo combine into syncretic faith practices. Incorporating multiple systems of beliefs has been an aspect of many Louisianans’ identities for generations. Most of the time, this blending of practices, ideologies and communities is depicted as a quirky – even “backward” – way to make sense of the world.

Yet during the axman’s reign in the early 1900s, a Black woman’s confession to murder was interpreted through the lens of religious deviance rather than diversity.

A timeline of events

When Barnabet confessed in April 1912, it was technically the second time she had done so. The first time was in November 1911 in the aftermath of the Randall family murder. Five members of the Randall family and their overnight guest had been brutally slaughtered in Lafayette, Louisiana, at the end of the month.

According to regional newspapers, Barnabet was in the crowd that had gathered near the Randall family’s home after the murders were discovered. Reportedly, she caught the attention of the local sheriff. Not only did she live near the slain, but, according to a New Orleans daily, the authorities found “her room saturated with blood and covered with human brains.”

Barnabet was given a “third degree” examination – meaning she was tortured – by the New Orleans Police Department, and then supposedly confessed that she had killed the Randalls because, according to a Midwestern newspaper, they “disobeyed the orders of the church.” That church would become a topic of scrutiny and sensationalism by regional lawmen and news outlets alike throughout much of 1912.

At that time, Barnabet is also said to have confessed to killing another family in Lafayette.

Thus, Barnabet had already been in jail for over four months before her springtime confession. Between January and March 1912, four more families had been axed to death between Crowley, Louisiana, and Glidden, Texas. In April, when Barnabet re-confessed, she added two more families to her victim roster.

In aggregate, the four families Barnabet confessed to killing had been slain between November 1909 and November 1911. Four more families had been murdered between her arrest and second confession, meaning she was in jail when they occurred. After her second confession and while she was still in custody, another three families were attacked with an ax, though for the first time, people survived the axman.

This convoluted timeline, in which more than half of the axman murders occurred after Barnabet had been apprehended, presented a challenge for investigators. They generally believed the crimes were related. Yet Barnabet could not have physically carried out the attacks in 1912.

To explain the continuation of the killings despite Barnabet’s incarceration, local lawmen leveraged the young woman’s own statements that had landed her in jail in the first place: that religion compelled her to murder.

It was this November 1911 confession that gave investigators the motive of religious fanaticism to attach to the axman crimes. Then, in January 1912, when the Broussards – another Black family – were murdered with an ax in Lake Charles, Louisiana, the local police found a Bible verse scrawled on their front door. This overtly religious symbol appeared roughly two months after Barnabet’s first confession and seemed to confirm her claims.

By April 1912, the idea of religiously motivated serial murder had been circulating in the rice belt region for months.

Hoodoo, conjures, and sensationalism

Barnabet’s confession was transcribed by R. H. Broussard (no relation to the victims), a newspaper reporter for the “New Orleans Item,” in April 1912.

According to the report, Barnabet claimed that she and four friends purchased conjures from a local hoodoo doctor one evening while socializing. They paid the practitioner for his services. Supposedly, the group then used the charms to move about undetected while committing murder.

In both her November 1911 and April 1912 confessions, Barnabet offered faith-based motives, albeit different ones. In the first case, it was the victims who reportedly erred in their religious duties. In the second, it was Barnabet’s own belief in hoodoo that facilitated such carnage. White media outlets did not interpret either of these statements as evidence of the region’s deep history of diverse faith expressions.

Instead, they labeled Barnabet “a black borgia,” “the directing head of a fanatical cult,” and the “Priestess of [a] Colored Human Sacrifice Cult.”

Moreover, sensationalized news coverage labeled the church Barnabet mentioned as the “Sacrifice Church.” Not surprisingly, the press depicted it as a cult-like organization, portraying Barnabet as either a low-level member or the “high priestess.” Sometimes, news reports also conflated the Sacrifice Church with Voodoo, thereby criminalizing a legitimate West African-derived religion as a cult.

According to unsubstantiated media accounts, the so-called Sacrifice Church promoted human sacrifice to gain immortality. Simultaneously, newspapers treated the conjure Barnabet possessed as proof of her fanaticism, reporting her claim that the only reason she confessed was because she had lost her charm.

Combined, these selective – and sensational – interpretations of Barnabet’s supposed religious beliefs ignored the possibility of diverse spiritual practices that enriched life in the rice belt region.

Jim Crow and Black faith

I have yet to find evidence the Sacrifice Church existed. My research suggests the white press conflated the word “sacrifice” with the word “sanctified.” This might have been due, in part, to both sensationalism and ignorance.

Pentecostalism, a branch of evangelical Christianity that emphasizes baptism by the Holy Spirit and direct communication from God, started growing in popularity in the U.S. in the early 1900s. Many Pentecostal denominations call their adherents saints and their churches sanctified. Since sanctified churches were relatively new to Louisiana and some Pentecostal teachings – like speaking in tongues – challenged more mainstream Protestant doctrine, Pentecostalism might have contributed to the media’s reporting.

Although the Sacrifice Church may have simply been a linguistic error in reference to any number of sanctified churches in the rice belt, it is possible that Barnabet did indeed possess a conjure. The hoodoo doctor she accused of selling her and her comrades their charms was arrested and questioned by the Lafayette authorities. The statements he gave to the police aligned with hoodoo practices even as he denied knowing Barnabet or being involved in such folkways.

Given the variety of faith practices in Jim Crow Louisiana, it is possible both that Barnabet believed in her conjure and that sanctified churches were growing in popularity in the region. Whether she ever attended one is hard to know, just as the legitimacy of either confession is difficult to determine.

What is clear is that faith anchored the statements Barnabet made to the authorities. The other anchor, however, was murder. The consequences of how these events aligned reverberate in how Barnabet has been depicted.

Barnabet was front-page news in 1912. People knew her name, even as they debated her guilt. When she was convicted of murder, she was sentenced to life at the Louisiana State Penitentiary. A little over a decade later, she was released and disappeared from public view.

Today, however, no Black female serial killer occupies a similar place in America’s collective memory.

In recent years, there have been calls for a more serious acceptance of Black women’s experiences, knowledge and beliefs within the dominant culture. This shift also invites, I believe, a fresh look at Barnabet’s confessions and the crimes that were attributed to her.

Lauren Nicole Henley, Assistant Professor of Leadership Studies, University of Richmond

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Why ‘The West Wing’ went from a bipartisan hit to a polarized streaming comfort watch

When the early 2000s hit series “The West Wing” returned on Netflix in December 2025, it spurred conversation about how the idealistic political drama would play in Donald Trump’s second term.

The series features a Democratic presidential administration led by President Josiah “Jed” Bartlet, played by Martin Sheen, and his loyal White House staff negotiating political challenges with character, competence and a fair bit of humor.

It sparked cultural commentary long after its original run ended in 2005.

In 2016, The Guardian’s Brian Moylan asserted that “The West Wing” was appealing because it portrayed “a world where the political system works. It reminds us of a time, not too long ago, when people in political office took their jobs very seriously and wanted to actually govern this country rather than settle scores and appeal to their respective bases.”

In 2025, Vanity Fair’s Savannah Walsh mused that “The West Wing” might be dismissed by younger audiences as a “form of science fiction” or lauded by the demographic currently watching “Jed Bartlet fancams scored to Taylor Swift’s ‘Father Figure’” on TikTok.

Audiences have been comfort-streaming “The West Wing” since Trump’s first term. Interest in the series spiked after Trump’s election in 2016, and it served as an escape from the contentious 2020 campaign.

When the cast reunited at the 2024 Emmy awards, the Daily Beast’s Catherine L. Hensley remarked that the series’ “sense of optimism about how American government actually functions … rang hollow, almost like watching a show from another planet.”

Nonetheless, Collider’s Rachel LaBonte hailed its Netflix return in late 2025 as a “balm for these confusing times.”

“The West Wing’s” transition from broadcast television behemoth to “bittersweet comfort watch” in today’s streaming era reveals a lot about how much our media and political landscapes have changed in the past 25 years.

As professors of media studies and political communication, we study the fracturing of our media and political environments.

The shifting appeal of “The West Wing” during the past quarter century raises a sobering question: Are political competence and an idealized respect for democratic norms losing popularity in 2026? Or does the new political reality demand engagement with the seamier side of politics?

‘The West Wing’s’ optimistic big tent

“The West Wing” premiered on NBC in the fall of 1999, blending political intrigue with workplace drama in a formula audiences found irresistible. The show surged in viewership in its second and third seasons, as it imagined responses from a Democratic administration to the values and ideology of the newly installed Republican President George W. Bush.

But the series was undergirded by an ethic of political cooperation, reinforcing the idea that, according to Walsh, “we’re all a lot more aligned than we realize.” In 2020, Sheen observed in an interview that writer “Aaron Sorkin never trashed the opposition,” choosing instead to depict “people with differences of opinion trying to serve.”

In 2019, The New York Times observed that “The West Wing” presented “opposition Republicans, for the most part, as equally honorable,” and noted that the show earned fan mail from viewers across the political spectrum.

At the height of its popularity, episodes of “The West Wing” garnered 25 million viewers. Such numbers are reserved today only for live, mass culture events like Sunday Night Football.

Of course, “The West Wing” aired in a radically different television environment from today.

Despite competition from cable, that era’s free, over-the-air broadcasters like NBC accounted for roughly half of all television viewing in the 2001-02 season. Currently, they account for only about 20%.

Gone are the days of television’s ability to create the “big tents” of diverse audiences. Instead, since “The West Wing’s” original airing, television gathers smaller segments of viewers based on political ideology and ultraspecific demographic markers.

Darker, more polarized media environment

The fracturing of the television audience parallels the schisms in America’s political culture, with viewers and voters increasingly sheltering in partisan echo chambers. Taylor Sheridan has replaced Sorkin as this decade’s showrunner, pumping out conservatively aligned hits such as “Yellowstone” and “Landman.”

Liberals, conversely, now see “West Wing” alumni recast in dystopian critiques of contemporary conservatism. Bradley Whitford morphed from President Bartlet’s political strategist to a calculating racist in Jordan Peele’s “Get Out,” and a commander in “The Handmaid’s Tale’s” misogynist army.

Allison Janney, who played “The West Wing’s” earnest and scrupulous press secretary, is now a duplicitous and potentially treasonous U.S. president in “The Diplomat,” whose creator in fact got her start on “The West Wing.”

Even Sheen has been demoted from serving as America’s favorite fictional president to playing J. Edgar Hoover in the film “Judas and the Black Messiah,” whom Sheen described as “a wretched man” and “one of the worst villains imaginable.”

Television as equipment for living

Philosopher Kenneth Burke argued that stories function as “equipment for living.” Novels, films, songs, video games and television series are important because they not only reveal our cultural predilections, they shape them, providing us with strategies for navigating the world around us.

Films and series like “Get Out,” “The Handmaid’s Tale,” “The Diplomat” and “Judas and the Black Messiah” urge audiences to confront the racism and sexism ever-present in media and politics. That includes, as some scholars and viewers have noted, the often casual misogyny and second-string roles for some women and Black men in “The West Wing.”

As U.S. citizens protest authoritarianism in the streets from Portland, Oregon, to Portland, Maine, a comfort binge of a series in which the White House press secretary, as Vanity Fair said, “dorkily performs ‘The Jackal’ and doesn’t dream of restricting West Wing access – even on the administration’s worst press days” is appealing.

But indulging an appetite for what one critic has called “junk-food nostalgia for a time that maybe never even existed” may leave audience members less equipped to build the healthy democracy for which the characters on “The West Wing” always strived. Or it may invigorate them.

Karrin Vasby Anderson, Professor of Communication Studies, Colorado State University and Nick Marx, Professor of Film and Media Studies, Colorado State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Pharaohs in Dixieland: How 19th-century America reimagined Egypt to justify slavery

When Napoleon embarked upon a military expedition into Egypt in 1798, he brought with him a team of scholars, scientists and artists. Together, they produced the monumental “Description de l’Égypte,” a massive, multivolume work about Egyptian geography, history and culture.

At the time, the United States was a young nation with big aspirations, and Americans often viewed their country as an heir to the great civilizations of the past. The tales of ancient Egypt that emerged from Napoleon’s travels became a source of fascination to Americans, though in different ways.

In the slaveholding South, ancient Egypt and its pharaohs became a way to justify slavery. For abolitionists and African Americans, biblical Egypt served as a symbol of bondage and liberation.

As a historian, I study how 19th-century Americans – from Southern intellectuals to Black abolitionists – used ancient Egypt to debate questions of race, civilization and national identity. My research traces how a distorted image of ancient Egypt shaped competing visions of freedom and hierarchy in a deeply divided nation.

Egypt inspires the pro-slavery South

In 1819, when lawyer John Overton, military officer James Winchester and future president Andrew Jackson founded a city in Tennessee along the Mississippi River, they christened it Memphis, after the ancient Egyptian capital.

While promoting the new city, Overton declared of the Mississippi River that ran alongside it: “This noble river may, with propriety, be denominated the American Nile.”

“Who can tell that she may not, in time, rival … her ancient namesake, of Egypt in classic elegance and art?” The Arkansas Banner excitedly reported.

In the region’s fertile soil, Chancellor William Harper, a jurist and pro-slavery theorist from South Carolina, saw the promise of an agricultural empire built on slavery, one “capable of being made a far greater Egypt.”

There was a reason pro-slavery businessmen and thinkers were energized by the prospect of an American Egypt: Many Southern planters imagined themselves as guardians of a hierarchical and aristocratic system, one grounded in landownership, tradition and honor. As Alabama newspaper editor William Falconer put it, he and his fellow white Southerners belonged to a race that “had established law, order and government over the earth.”

To them, Egypt represented the archetype of a great hierarchical civilization. Older than Athens or Rome, Egypt conferred a special legitimacy. And just like the pharaohs, the white elites of the South saw themselves as the stewards of a prosperous society sustained by enslaved labor.

Leading pro-slavery thinkers like Virginia social theorist George Fitzhugh, South Carolina lawyer and U.S. Senator Robert Barnwell Rhett and Georgia lawyer and politician Thomas R.R. Cobb all invoked Egypt as an example to follow.

“These [Egyptian] monuments show negro slaves in Egypt at least 1,600 years before Christ,” Cobb wrote in 1858. “That they were the same happy negroes of this day is proven by their being represented in a dance 1,300 years before Christ.”

A distorted view of history

But their view of history didn’t exactly square with reality. Slavery did exist in ancient Egypt, but most slaves had been originally captured as prisoners of war.

The country never developed a system of slavery comparable to that of Greece or Rome, and servitude was neither race-based nor tied to a plantation economy. The mistaken notion that Egypt’s great monuments were built by slaves largely stems from ancient authors and the biblical account of the Hebrews. Later, popular culture – especially Hollywood epics – would continue to advance this misconception.

Nonetheless, 19th-century Southern intellectuals drew on this imagined Egypt to legitimize slavery as an ancient and divinely sanctioned institution.

Even after the Civil War, which ended in 1865, nostalgia for these myths of ancient Egypt endured. In the 1870s, former Confederate officer Edward Fontaine noted how “Veritable specimens of black, woolyheaded negroes are represented by the old Egyptian artists in chains, as slaves, and even singing and dancing, as we have seen them on Southern plantations in the present century.”

Turning Egypt white

But to claim their place among the world’s great civilizations, Southerners had to reconcile a troubling fact: Egypt was located in Africa, the ancestral land of those enslaved in the U.S.

In response, an intellectual movement called the American School of Ethnology – which promoted the idea that races had separate, unequal origins to justify Black inferiority and slavery – set out to “whiten” Egypt.

In a series of texts and lectures, they portrayed Egypt as a slaveholding civilization dominated by whites. They pointed to Egyptian monuments as proof of the greatness that a slave society could achieve. And they also promoted a scientifically discredited theory called “polygenesis,” which argued that Black people did not descend from the Bible’s Adam, but from some other source.

Richard Colfax, the author of the 1833 pamphlet “Evidence Against the Views of the Abolitionists,” insisted that “the Egyptians were decidedly of the Caucasian variety of men.” Most mummies, he added, “bear not the most distant resemblance to the negro race.”

Physician Samuel George Morton’s “Crania Aegyptiaca,” an 1844 study of Egyptian skulls, reinforced this view. He argued that the skulls mirrored those of Europeans in size and shape. In doing so, noted the Charleston Medical Journal in 1851, Morton established “the Negro his true position as an inferior race.”

Physician Josiah C. Nott, Egyptologist George Gliddon and physician and propagandist John H. Van Evrie formed an effective triumvirate: Through press releases and public lectures featuring the skulls of mummies, they turned Egyptology into a tool of pro-slavery propaganda.

“The Negro question was the one I wished to bring out,” Nott wrote, adding that he “embalmed it in Egyptian ethnography.”

Nott and Gliddon’s 1854 bestseller “Types of Mankind” fused pseudoscience with Egyptology to both “prove” Black inferiority and advance the idea that their beloved African civilization was populated by a white Egyptian elite.

“Negroes were numerous in Egypt,” they wrote, “but their social position in ancient times was the same that it now is, that of servants and slaves.”

Denouncing America’s pharaohs

This distorted vision of Egypt, however, wasn’t the only one to take hold in the U.S., and abolitionists saw this history through a decidedly different lens.

In the Bible, Egypt occupies a central place, mentioned repeatedly as a land of refuge – notably for Joseph – but also as a nation of idolatry and as the cradle of slavery.

The episode of the Exodus is perhaps the most famous reference. The Hebrews, enslaved under an oppressive pharaoh, are freed by Moses, who leads them to the Promised Land, Canaan. This biblical image of Egypt as a land of bondage deeply shaped 19th-century moral and political debates: For many abolitionists, it represented the ultimate symbol of tyranny and human oppression.

When the Emancipation Proclamation went into effect on Jan. 1, 1863, Black people could be heard singing in front of the White House, “Go down Moses, way down in Egypt Land … Tell Jeff Davis to let my people go.”

Black Americans seized upon this biblical parallel. Confederate President Jefferson Davis was a contemporary pharaoh, with Moses still the prophet of liberation.

African American writers and activists like Phillis Wheatley and Sojourner Truth also invoked Egypt as a tool of emancipation.

“In every human breast, God has implanted a principle, which we call love of freedom,” Wheatley wrote in a 1774 letter. “It is impatient of oppression and pants for deliverance; and by the leave of our modern Egyptians, I will assert that the same principle lives in us.”

Yet the South’s infatuation with Egypt shows how antiquity can always be recast to serve the powerful. And it’s a reminder that the past is far from neutral terrain – that there is rarely, if ever, a ceasefire in wars over history and memory.

This article has been updated to correctly attribute Samuel George Morton as the author of “Crania Aegyptiaca,” not as the author of the Charleston Medical Journal article. Quoted texts from Phillis Wheatley and William Falconer have also been slightly amended for accuracy.

Charles Vanthournout, Ph.D. Student in Ancient History, Université de Lorraine

This article is republished from The Conversation under a Creative Commons license. Read the original article.

‘Which Side Are You On?’ American protest songs have emboldened social movements for generations

The presence of Department of Homeland Security agents in Minnesota compelled many people there to use songs, drawn from secular as well as religious traditions, as a means of protest.

On Jan. 8, 2026, the day after Immigration and Customs Enforcement agent Jonathan Ross killed Minneapolis resident Renée Good on Portland Avenue, an anonymous post appeared on Reddit that featured an uncredited text clearly adapted from the lyrics of a Depression-era protest song from Appalachia, “Which Side Are You On?” The Reddit text criticized the recent federal presence in Minnesota and implored Minnesotans to take a stand.

In our town of Minneapolis,
There’s no neutrals here at home.
You’re either marching in the streets
or you kill for Kristi Noem
Which side are you on,
Oh which side are you on?
Which side are you on,
Oh which side are you on?
ICE is a bunch of killers
who hide behind a mask.
How do they get away with this?
That’s what you have to ask.
Which side are you on …

For centuries, songs have served as vehicles for expressing community responses to sociopolitical crises, whether government repression or corporate exploitation. “Which Side Are You On?” resonated with Minnesotans, in part because it has been recorded by numerous artists over the decades.

The song dates back to another societal struggle that occurred in another part of the United States during another crisis moment in American history. “Which Side Are You On?” has consoled and empowered countless people for generations during struggles in red as well as blue states. It has also inspired people to write new protest songs in the face of new crises.

Birth of a protest anthem

“Which Side Are You On?” was composed in 1931 as one woman’s spontaneous response to a coal company’s effort to prevent miners in Harlan County, Kentucky, from joining the United Mine Workers of America. Those miners hoped the labor union would improve their working conditions and overturn imposed reductions to their wages.

In support of the coal company, sheriff J. H. Blair and armed deputies broke into the house of union organizer Sam Reece to apprehend him and locate evidence of union activity. Reece was in hiding elsewhere, but his wife, Florence, and their children were present. After ransacking the house, the sheriff and deputies left.

Florence tore a page out of a calendar and jotted down lyrics for an impromptu song, which she recalled setting to the melody of a Baptist hymn “I’m gonna land on the shore.” Others have observed that the melody in Florence’s song was similar to that of the traditional British ballad “Jack Monroe,” which features the haunting refrain “Lay the Lily Low.”

Woody Guthrie, one of America’s most celebrated folk singers of the 20th century, sang many protest songs. Al Aumuller, via the Library of Congress


“Which Side Are You On?” channeled Florence’s reaction to that traumatic experience. Throughout the 1930s, she and others sang the song during labor strikes in the Appalachian coalfields, and the lyrics were included in union songbooks. Then, in 1941, the Almanac Singers, a folk supergroup featuring Woody Guthrie and Pete Seeger, recorded the song, and it reached many people beyond Appalachia.

Since then, a range of musicians – including Charlie Byrd; Peter, Paul and Mary; the Dropkick Murphys; Natalie Merchant; Ani DiFranco; and the Kronos Quartet – have performed “Which Side Are You On?” in concert settings and on recordings. A solo live performance with a concert audience joining the chorus was a focal point of Seeger’s “Greatest Hits” album in 1967.

The Academy Award-winning documentary film “Harlan County U.S.A.” (1976) included a clip of Florence Reece singing her song during a 1973 strike. “Which Side Are You On?” was translated into other languages – a testament to its universal theme of encouraging solidarity among people confronting authoritarian power.

Florence Reece sings ‘Which side are you on?’ four decades after she wrote the song.


Protest songs of the modern era

While the American protest song tradition can be traced back to the origins of the nation, “Which Side Are You On?” served as a prototype for the modern-era protest song because of its lyrical directness. Many memorable, risk-taking protest songs were composed in the wake of, and in the spirit of, “Which Side Are You On?”

Noteworthy are numerous protest classics in the folk vein, epitomized by a sizable part of Guthrie’s repertoire, by early Bob Dylan songs like “Masters of War” (1963), “The Times They Are a-Changin’” (1964) and “Only A Pawn in Their Game” (1964), and by Phil Ochs’ mid-1960s songs of political critique, such as “Here’s to the State of Mississippi” (1965).

But protest songs have hailed from all music genres. Rock and rhythm and blues, for instance, have spawned many iconic recordings of protest music: Sam Cooke’s “A Change Is Gonna Come” (1964), Buffalo Springfield’s “For What It’s Worth” (1966), Creedence Clearwater Revival’s “Fortunate Son” (1969), Edwin Starr’s “War” (1970) and Crosby, Stills, Nash and Young’s “Ohio” (1970) among many others.

Blues, country, reggae and hip-hop have spawned broadly inspirational protest songs, and jazz too has yielded classic protest recordings, such as Abel Meeropol’s “Strange Fruit” (1939), popularized by Billie Holiday, and Gil Scott-Heron’s 1971 recording of the jazz-poem “The Revolution Will Not Be Televised.”

Indeed, there are so many enduring contributions to the American protest song canon that a list like Rolling Stone’s recent “100 Best Protest Songs of All Time” is only the tip of the iceberg. Regardless of the genre, effective protest songs retain their power to move and motivate people today despite having been composed in response to past situations or circumstances. And protest songs from the past are often adapted to help people more effectively respond to the crisis of the moment.

Songs for this moment

“Which Side Are You On?” was sung – and its theme invoked – in Minnesota throughout January 2026. On Jan. 24, shortly after Border Patrol agents killed Alex Pretti on Nicollet Avenue, Minneapolis Mayor Jacob Frey referred to the song’s title during a public address to his constituents: “Stand up for America. Recognize that your children will ask you what side you were on.” That same day, the grassroots organization 50501: Minnesota posted online an appeal to those in power: “[E]very politician and person in uniform must ask themselves one question – which side are you on?”

The next day, Minnesota Gov. Tim Walz acknowledged divisions in the U.S. during a televised briefing, urging citizens in his state and across the nation to consider the choice before them: “I’ve got a question for all of you. What side do you want to be on?”

People protesting ICE and Customs and Border Protection actions in Minnesota and elsewhere have been singing “Which Side Are You On?” and other well-known protest songs, but musicians have also been writing new protest songs about the crisis. On Jan. 8, the Dropkick Murphys posted on social media a clip of “Citizen I.C.E.,” a revamped version of the group’s 2005 song “Citizen C.I.A.,” augmented by video of the Jan. 7 fatal shooting of Renée Good. On Jan. 27, British musician Billy Bragg released “City of Heroes,” which he composed in tribute to the Minneapolis protesters.

Following suit was Bruce Springsteen, a longtime champion of the protest song legacy. On Jan. 28, Springsteen released online his newly composed and recorded “Streets of Minneapolis.” Millions of people around the world heard the song and saw its accompanying video.

On Jan. 30, Springsteen made a surprise appearance at the Minneapolis club First Avenue, performing his new song at the “Defend Minnesota” benefit concert, organized by musician Tom Morello to raise funds for the families of Good and Pretti.

Bruce Springsteen’s ‘Streets of Minneapolis’ rages against the killings of Renée Good and Alex Pretti.


Making a difference

On the day Pretti was shot dead, hundreds of Minneapolis protesters attended a special service at Minneapolis’ Hennepin Avenue United Methodist Church. Pastor Elizabeth MacAuley, in a televised interview with CNN’s Anderson Cooper, reflected on the role of song in helping people cope: “It’s been a time when it is pretty tempting to feel so disempowered. … [T]he singing resistance movement … brought out the hope and the grief and the rage and the beauty.”

Cooper asked: “Do you think song makes a difference?” MacAuley replied: “I know song makes a difference.”

Ted Olson, Professor of Appalachian Studies and Bluegrass, Old-Time and Roots Music Studies, East Tennessee State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

This right-wing social network isn't just biased – it's actively radicalizing users

A new study published today in Nature has found that X’s algorithm – the hidden system or “recipe” that governs which posts appear in your feed and in which order – shifts users’ political opinions in a more conservative direction.

Led by Germain Gauthier from Bocconi University in Italy, it is a rare, real-world randomised experimental study on a major social media platform. And it builds on a growing body of research that shows how these platforms can shape people’s political attitudes.

Two different algorithms

The researchers randomly assigned 4,965 active US-based X users to one of two groups.

The first group used X’s default “For You” feed. This features an algorithm that selects and ranks posts it thinks users will be more likely to engage with, including posts from accounts that they don’t necessarily follow.

The second group used a chronological feed. This only shows posts from accounts users follow, displayed in the order they were posted. The experiment ran for seven weeks during 2023.

Users who switched from the chronological feed to the “For You” feed were 4.7 percentage points more likely to prioritise policy issues favoured by US Republicans (for example, crime, inflation and immigration). They were also more likely to view the criminal investigation into US President Donald Trump as unacceptable.

They also shifted in a more pro-Russia direction with regard to the war in Ukraine. For example, these users became 7.4 percentage points less likely to view Ukrainian President Volodymyr Zelenskyy positively, and scored slightly higher on a pro-Russian attitude index overall.

The researchers also examined how the algorithm produced these effects.

They found evidence that the algorithm increased the share of right-leaning content by 2.9 percentage points overall (and 2.5 points among political posts), compared with the chronological feed.

It also significantly reduced the share of posts from traditional news organisations’ accounts while boosting posts from political activists.

One of the most concerning findings of the study is the longer-term effects of X’s algorithmic feed. The study showed the algorithm nudged users towards following more right-leaning accounts, and that the new following patterns endured even after switching back to the chronological feed.

In other words, turning the algorithm off didn’t simply “reset” what people see. It had a longer-lasting impact beyond its day-to-day effects.

One piece of a much bigger picture

This new study supports findings of similar studies.

For example, a study in 2022, before Elon Musk had bought Twitter and rebranded it as X, found the platform’s algorithmic systems amplified content from the mainstream political right more than the left in six of the seven countries studied.

An experimental study from 2025 re-ranked X feeds to reduce exposure to content expressing antidemocratic attitudes and partisan animosity. The researchers found this shifted participants’ feelings towards their political opponents by more than two points on a 0–100 “feeling thermometer” – a shift the authors argued would normally take about three years to occur organically in the general population.

My own research offers another piece of evidence to this picture of algorithmic bias on X. Along with my colleague Mark Andrejevic, I analysed engagement data (such as likes and reposts) from prominent political accounts during the final stages of the 2024 US election.

Our findings unearthed a sudden and unusual spike in engagement with Musk’s account after his endorsement of Trump on July 13 – the day of the assassination attempt on Trump. Views on Musk’s posts surged by 138%, retweets by 238%, and likes by 186%. This far outstripped increases on other accounts.

After July 13, right-leaning accounts on X gained significantly greater visibility than progressive ones. The “playing field” for attention and engagement on the platform was tilted thereafter towards right-leaning accounts – a trend that continued for the remainder of the time period we analysed in that study.

Not a niche product

This matters because we are not talking about a niche product.

X has more than 400 million users globally. It has become embedded as infrastructure – a key source of political and social communication. And once technical systems become infrastructure, they can become invisible – like background objects that we barely think about, but which shape society at its foundations and can be exploited under our noses.

Think of the overpass bridges Robert Moses designed in New York in the 1930s. These seemed like inert objects. But they were designed to be very low, to exclude people of colour from taking buses to recreation areas in Long Island.

Similarly, the design and governance of social media platforms have real consequences.

The point is that X’s algorithms are not neutral tools. They are an editorial force, shaping what people know, whom they pay attention to, who the outgroup is and what “we” should do about or to them – and, as this new study shows, what people come to believe.

The age of taking platform companies at their word about the design and effects of their own algorithms must come to an end. Governments around the world – including in Australia where the eSafety Commissioner has powers to drive “algorithmic transparency and accountability” and require that platforms report on how their algorithms contribute to or reduce harms – need to mandate genuine transparency over how these systems work.

When infrastructure becomes harmful or unsafe, nobody bats an eye when governments step in to protect us. The same needs to happen urgently for social media infrastructures.

Timothy Graham, Associate Professor in Digital Media, Queensland University of Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Astrologers think Donald Trump's destiny is tied to the eclipse

The Moon crossed the Sun’s path on February 17, causing what is known as an annular solar eclipse. The Sun was not covered completely, but the Moon blocked enough of its light to leave a fiery ring. Unless you’re deep in the southern hemisphere, you won’t have noticed.

However, astrologically speaking, eclipses have effects regardless of who is watching. In astrology, an ancient tradition that lacks scientific grounding, eclipses are regarded as being powerful and politically significant celestial events. They are traditionally associated with the destiny of rulers – and some astrologers think Donald Trump is no exception.

Astrologers interpret the meaning of eclipses through horoscopes, celestial maps that locate the Sun, Moon and planets within the 12 signs of the Zodiac that encircle our solar system. During the eclipse, the Sun and Moon were at the edges of the sign Aquarius, a position astrologers associate with endings and shakeups.

This, alongside various other factors including Trump being born during a lunar eclipse in 1946, has led some astrologers to suggest that the eclipse could mark the start of a severe crisis for the US president – even his death.

Predictions like this come around fairly often, and Trump has outlasted many of them before. But these extreme forecasts follow a very old script. For thousands of years, eclipses have been treated as political events, read as omens about kingdoms and their rulers.

Bad omens

Eclipses have been connected with the fate of rulers since at least ancient Mesopotamia, around 4,000 years ago. Keen observers there, in what is now modern-day Iraq, kept lists of phenomena they believed were linked to specific outcomes.

“If a lizard gives birth in the walkway of a house, the household will fall” and “if a white partridge is seen in the city, commercial activity will diminish” are two examples. But one omen has long outlived the others: “if there is an eclipse, the king will die”.

With such high stakes, ancient astronomers invested in systematic observation, record-keeping and calculation to predict eclipses with ever-greater accuracy. This enabled the so-called “substitute king” ritual, where royals tried to avoid their fate by temporarily making someone else king until an eclipse passed.

The link between eclipses and the death of kings spread widely in the ancient world. Egyptian papyri show evidence of this belief, and Greek and Roman history is full of stories connecting eclipses with prominent deaths.

Roman historian Cassius Dio recorded a solar eclipse around the death of the first Roman emperor, Augustus, in AD14, during which “most of the sky seemed to be on fire”. In the gospels of Matthew, Mark and Luke, the death of Jesus is also marked by a darkened Sun.

In the medieval period, when Arabic chroniclers recorded eclipses, they usually noted concurrent deaths of rulers. And in Europe, a solar eclipse in 1133 was so closely associated with the 1135 death of King Henry I of England that it became known as “King Henry’s Eclipse”.

Premodern rulers often hired astrologers to interpret their birth charts – the horoscope cast for the moment they were born. Ideally, the astrologer would pick out an aspect of the chart they could say justified the ruler’s leadership and foretold a long and prosperous reign. This was useful astrological propaganda.

But rulers were less happy when astrologers did this without authorisation – especially if they forecast illness or death. Astrologers were expelled from ancient Rome on numerous occasions for doing just that.

In his book, Lives of the Caesars, Roman historian Suetonius recounted the fate of an astrologer called Ascletarion (or Ascletario). Ascletarion’s predictions of the Emperor Domitian’s imminent downfall in the first century AD prompted the angry emperor to order his execution.

More than 1,400 years later, an astrologer in Oxford was executed for predicting the death of the reigning English monarch, Edward IV. And in 1581, Queen Elizabeth I of England made it a felony to use horoscopes to predict her death or her successor.

Similarly in France, royal pronouncements in 1560, 1579 and 1628 prohibited astrological predictions about princes, states and public affairs. Around the same time, astrologers in Italy got into serious trouble for predicting the deaths of popes.

This was not just a matter of anxiety on the part of rulers. It was also a question of maintaining public order and political stability. State powers were concerned with the ability of astrological predictions to cause general chaos and even prompt protests and rebellions.

They were right to worry. In a time when astrology was taken very seriously, predictions could cause collective panic. During the so-called wars of the three kingdoms, a series of conflicts fought between 1639 and 1653 in England, Scotland and Ireland, astrologers’ radical political predictions about the fate of the English monarchy fed revolutionary sentiment.

One of these astrologers, Nicholas Culpeper, published predictions of the downfall of all European monarchies on the basis of a solar eclipse in 1652.

Astrology left the world of universities and political courts in the 17th century, but astrologers did not stop making political predictions. In 1790s London, an astrologer called William Gilbert predicted the death of King Gustav III of Sweden. His prophecy was fulfilled a few months later.

And after his attempted assassination in 1981, the then-US president, Ronald Reagan, asked astrologer Joan Quigley whether she could have predicted it. She said yes. Quigley worked for the Reagans for many years, and claimed that she provided advice not just on personal affairs but also on matters of the state, including the best timing to make political announcements.

Although astrology is no longer counted as a science, it remains a player in contemporary politics. Whether or not eclipse predictions come to pass is almost beside the point. Historically, what made eclipses politically dangerous was the speculation often attached to them.

Michelle Pfeffer, Research Fellow in Early Modern History, University of Oxford

This article is republished from The Conversation under a Creative Commons license. Read the original article.

How Jesse Jackson embodied Southern politics − and changed American elections

Editor's Note: Rev. Jesse Jackson, the legendary civil rights activist and two-time presidential candidate who fundamentally reshaped American politics and inspired generations of African Americans to seek elected office, has died. He was 83.

Jackson's passing marks the end of an era in American political and social history. From his emergence as a leader in the civil rights movement in the early 1960s to his groundbreaking presidential campaigns in 1984 and 1988, Jackson's life was defined by an unwavering commitment to social and economic justice.

The article that follows, originally published by The Conversation last year, examines how Jackson's Southern identity shaped his life's work and his enduring influence on American politics. It is reprinted here as a tribute to his legacy.

Holding hands with other prominent Black leaders, the Rev. Jesse Jackson crossed the Edmund Pettus bridge in Selma, Alabama, on March 9, 2025, to commemorate the 60th anniversary of “Bloody Sunday.” Like several survivors of that violent day in 1965, when police brutally attacked civil rights protesters, Jackson crossed the bridge in a wheelchair.

Jesse Louis Jackson was born Oct. 8, 1941, in Greenville, South Carolina, a town firmly entrenched in the racially segregated Deep South. This time and place aren’t footnotes to Jackson’s life, but rather key facts that shaped his civil rights activism and historic runs for the U.S. presidency.

Growing up in the segregated South shaped Jackson’s attitudes, opinions and outlook in ways that remain apparent today. While he lived in Chicago for most of his adult life, he remained a Southerner. And other Southerners viewed him as such.

Jackson biographer David Masciotra said the South gave Jackson “a sense of the oppression and the persecution that he wanted to fight.”

As scholars of Southern politics, we see Jackson’s Southern identity as essential to understanding his life. Southerners often identify with the region, even after leaving the geographic South. As sociologist John Shelton Reed once wrote, Southernness has more to do with attitude than latitude.

A segregated childhood

In the South Carolina of Jackson’s youth, water fountains, bathrooms, swimming pools and lunch counters were all segregated. While white people his age attended Greenville High School, Jackson attended the all-Black Sterling High School, where he was a star quarterback and class president.

His experience of segregation shaped how Jackson views his life.

“I keep thinking about the odds,” Jackson told his biographer and fellow South Carolinian Marshall Frady in 1988, marveling at the “responsibility I have now against what I was expected then to be doing at this stage of life.”

“Even mean ole segregation couldn’t break in on me and steal my soul,” he later told Frady.

If Jackson had been white, a star student like him might have enrolled at Clemson University or the University of South Carolina. Or he might have said yes when he was offered a contract to play professional baseball.

Instead, Jackson rejected the contract because the pay would have been roughly one-sixth of a white player’s, and went North, to the University of Illinois.

He did not find a more welcoming atmosphere in Champaign, Illinois. According to biographer Barbara Reynolds, the segregation that he thought he had left behind “cropped up in Illinois to convince him that was not the place to be.”

In the fall of 1960, Jackson transferred to North Carolina Agricultural and Technical State University, a historically Black college in Greensboro, North Carolina, to complete his sociology degree.

His return to the South marked Jackson’s emergence as a leader in the growing Civil Rights Movement.

Greensboro was a center of this struggle, with large, regular demonstrations, often led by local students of color. Six months prior to his arrival in Greensboro, four Black students from North Carolina A&T refused to leave the whites-only Woolworth lunch counter, launching a sit-in movement that soon drew national attention.

Jackson himself led protests to integrate Greensboro businesses. After one pivotal student march on City Hall, he was arrested and charged with inciting a riot. In jail, Jackson wrote a “Letter From a Greensboro Jail,” a rhetorical tip of the hat to Martin Luther King Jr.’s “Letter from a Birmingham Jail.”

A move north

Jackson’s second move north, in 1964, stuck.

Like so many other Black Southerners who participated in what later became known as the “second great migration,” Jackson went to Chicago. He attended Chicago Theological Seminary, inspired not by a deep love of scripture but by what Jackson perceived as the church’s ability to do good on this earth.

As North Carolina A&T’s president, Dr. Sam Proctor, advised Jackson, “You don’t have to enter the ministry because you want to save people from a burning hell. It may be because you want to see his kingdom come on earth as it is in heaven.”

Jackson thought his time in Chicago “would be quiet and peaceful and I could reflect.”

It was anything but. Following the path of King and other religiously inspired civil rights activists, Jackson continued his civil rights organizing, leading Operation Breadbasket, an initiative of King’s to boycott businesses that did not employ Black workers.

Presidential aspirations

Over the next few years, Jackson took on ever more high-profile organizing, patterned after the life and work of King – another Southerner. As the former King aide Bernard Lafayette once said, “I mean, he cloned himself out of Martin Luther King.”

In 1984, Jackson turned to politics. He became the second African American to run for the nation’s highest office, following in the footsteps of Shirley Chisholm and her 1972 candidacy.

Announcing his bid, Jackson pledged to “help restore a moral tone, a redemptive spirit, and a sensitivity to the poor and dispossessed of this nation.”

But the campaign always represented more than a policy platform. Jackson wanted to mobilize more Americans to vote and to run for office, especially the “voiceless and the downtrodden.”

Jackson finished third in the 1984 Democratic primary but with a remarkably strong showing, taking 18% of all primary votes. He performed especially well south of the Mason-Dixon Line, winning both Louisiana and the District of Columbia. He also performed well in the Mississippi and South Carolina Democratic caucuses.

This surprising success inspired Jackson to run for president again. In 1988, he did even better, winning nearly 7 million votes and 11 contests, and sweeping the South during the primary season.

He won the South Carolina caucuses and the Super Tuesday states of Alabama, Georgia, Louisiana, Mississippi and Virginia. In his second run, Jackson more than doubled his share of the white vote, from 5% in 1984 to 12% in 1988.

Jackson finished second in the Democratic primary to Massachusetts Gov. Michael Dukakis, who would go on to lose the 1988 presidential election to George H.W. Bush. But Jackson’s strong results solidified his position as a major figure in American politics and a power broker in the Democratic Party.

A towering figure in American politics

Jesse Jackson’s two presidential runs fundamentally altered the U.S. political landscape.

Beyond being the first Black candidate to win a state primary contest, Jackson also helped end the primary system by which the winner of a state would receive all the state’s delegates. Jackson claimed the system hurt Black and minority candidates and advocated to implement reforms that had been first recommended following the 1968 Democratic primary.

Back then, the party had pushed for a system in which delegates could be allocated based on the proportion of the vote won by each candidate, but it wasn’t adopted in every state.

Starting in 1992, following Jackson’s intervention, candidates receiving at least 15% of the vote officially received a proportion of the delegates. These reforms opened up the possibility that a minority candidate could secure the Democratic nomination through a more proportional allocation of delegates.

Jackson’s background also reinforced the importance of the Black church in Black political mobilization.

Perhaps most importantly, Jackson expanded the size and diversity of the electorate and inspired a generation of African Americans to seek office.

“It is because people like Jesse ran that I have this opportunity to run for president today,” said Barack Obama in 2007.

The long Southern strategy

Jackson’s political rise coincided with and likely encouraged the exodus of racially conservative white voters out of the Democratic Party.

The Republican Party’s Long Southern Strategy – an opportunistic plan to cultivate Southern white voters by capitalizing on “white racial angst” and conservative social values – had been underway before Jackson’s presidential bids. But his focus on social and economic justice undoubtedly helped drive conservative Southern whites to the GOP.

Today, some political thinkers question whether a distinct “Southern politics” continues to exist.

The life and career of Jesse Jackson reflect that place still matters – even for people who have left the region for colder pastures.

Gibbs Knotts, Professor of Political Science, Coastal Carolina University and Christopher A. Cooper, Professor of Political Science & Public Affairs, Western Carolina University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Multiple warning signs suggest trouble ahead for Trump — and the GOP

On Feb. 7, 2026, Chasity Verret Martinez won a special election to fill a vacant seat in the Louisiana House. That’s an outcome that might not mean very much to people outside of the state or even outside her Baton Rouge-area district.

But Martinez is a Democrat who took 62% of the vote in a district that had given Donald Trump a 13-percentage-point victory in the 2024 presidential race. And her win came a week after Democrats seized a Texas Senate district that had supported Trump even more strongly – a result that immediately triggered concern in Republican circles.

Because fewer people turn out for special elections, they’re considered an early predictor of partisan enthusiasm heading into regularly scheduled elections. And with the 2026 midterm elections less than nine months away, analysts are already scrambling for indications of the likely outcome.

As a political scientist who studies congressional elections, I’m interested in the question of whether special elections can really tell us which way the political winds are currently blowing.

Democrats, of course, are hoping for a “blue wave” like they rode in 2018, when they picked up 40 House seats and won a majority in that chamber, while Republicans want to hang on to the very slim margins they have in both the House and Senate.

In the 2026 election cycle, as in previous ones, prognosticators and political professionals are looking to the outcomes of these intermittent races at various levels of government as a gauge of how voters are feeling about the two parties. And the results from the first 15 months of the second Trump administration appear to spell very bad news for the Republicans.

Setting a baseline

Since Election Day 2024, 88 special elections featuring candidates from both major parties have taken place for institutions including state legislatures and the U.S. House.

When analyzing the results of these races, it’s important to have figures to compare them to. After all, a Democrat just barely squeaking by in a state legislative race may not look very impressive on its face – but if that race took place in the rural heart of a red state, it could raise hackles among Republicans.

Most political analysts agree that the best available comparison point for a special election is the result of the most recent presidential election in that same district. There are a few reasons for this.

First, the nationalization of party politics means there are few members of Congress representing states or districts that voted for the other party for president. So the best comparison is to the only truly national election in the U.S.

Second, using presidential results creates the same baseline for all races: every special election gets measured against a common standard.

Finally, and perhaps most importantly, recent midterm elections have typically served as a referendum on the party in power, particularly the president. In trying to measure how voters are reacting to Trump’s second term, it makes sense to measure their behavior against the last time Trump was on the ballot.

Are special elections predictive?

With this baseline in mind, it’s easy to compare the results of special elections in particular districts to the results of the last presidential election in that same district.

In the 2022 cycle, for example, Democrats running in special elections underperformed President Joe Biden’s 2020 results in their districts by about 4 percentage points on average, which translated into a 3-percentage-point loss nationwide in U.S. House races in the November 2022 midterms and the loss of their majority in the chamber.

Conversely, in 2018 – like this year, a midterm following a Trump election – Democrats bested Republicans by 8 percentage points in November, after overperforming Hillary Clinton’s 2016 margins in special elections throughout the previous two years by 9 percentage points on average.

The 2024 cycle is a clear exception to this pattern of regular elections closely following special election results: Prior to the presidential election, Democrats outperformed in special elections by an average of 4 percentage points but ended up losing nationally by 3 percentage points in November.
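The overperformance figures quoted in these comparisons are simple arithmetic: for each district, subtract the party’s presidential-election margin from its special-election margin, then average across races. A minimal sketch, with invented district numbers:

```python
# Hypothetical (invented) district results, in percentage points:
# (Democratic margin in the special election,
#  Democratic margin in the prior presidential race in that district).
districts = [
    (12.0, -13.0),  # Dem +12 in a district the GOP carried by 13
    (-2.0, -20.0),  # a narrow loss in a deep-red district
    (30.0, 22.0),   # a blue district, won by even more
]

# Overperformance = special margin minus presidential margin, averaged.
overperformance = sum(s - p for s, p in districts) / len(districts)
print(f"Average overperformance: {overperformance:+.1f} points")
```

Note that even the narrow loss in the second invented district counts as a large overperformance against the presidential baseline – which is why a Democrat “just barely squeaking by” in red territory can still alarm Republicans.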

Like special elections, midterm contests tend to turn out fewer but more engaged voters than presidential years. Therefore, it may be that special elections are more predictive of midterm results than presidential cycles. At any rate, if previous midterm outcomes are any guide, the numbers being posted by Democrats in special elections so far in the 2026 cycle are impossible to ignore.

On average, they’re running ahead of Kamala Harris’s 2024 margins by a whopping 13 percentage points. That’s better than their performance in 2018, when they ultimately picked up 40 seats in the House and seven governorships across the country.

What’s different about specials?

Democrats, however, may not want to pop the champagne corks just yet. Many roadblocks remain in their quest to take back control of Congress. For one thing, the U.S. Senate map remains a difficult one for Democrats. Even if they end up creating a 2018-like election environment with an unpopular president, many Senate contests are taking place in solidly red states.

It’s also always worth bearing in mind that there’s no telling how the events of the next nine months might reshape public opinion.

And special elections, while useful metrics, are far from perfect barometers of public opinion. They take place at different times, and could be just as reflective of hyperlocal factors, such as flawed candidates, as they are of nationalized partisan conditions.

Special elections tend to have far lower turnout than regular midterm or presidential contests. It’s also difficult to tell whether overperformance is due to highly motivated partisans or persuasion of independents and voters from the other party.

Using all the tools available

Still, special elections do have key advantages over traditional polling. Although polls do their best to approximate voters’ political attitudes, elections reveal these attitudes through voters’ actual, observed behavior – exactly the type of behavior that analysts are trying to predict in November.

Generally, this is preferable to asking a hypothetical in opinion polls, which are getting more difficult than ever to do well.

In the end, special elections are just one piece of the prediction puzzle. But the other puzzle pieces are also spelling out potential bad news for the GOP.

The generic ballot, a standard polling question that asks voters’ intent to vote for one party or the other in November without naming specific candidates, has the GOP about 6 percentage points behind the Democrats. Trump’s approval rating, meanwhile, continues to hover below 40%.

There’s no telling for sure whether these indicators will turn out to be truly predictive until November. But all of them should be sounding alarm bells for Republicans.

Charlie Hunt, Associate Professor of Political Science, Boise State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

How a Trump campaign contractor learned how to read your mind

The dealings that have been revealed between Cambridge Analytica and Facebook have all the trappings of a Hollywood thriller: a Bond villain-style CEO, a reclusive billionaire, a naïve and conflicted whistle-blower, a hipster data scientist turned politico, an academic with seemingly questionable ethics, and of course a triumphant president and his influential family.

Much of the discussion has been on how Cambridge Analytica was able to obtain data on more than 50m Facebook users – and how it allegedly failed to delete this data when told to do so. But there is also the matter of what Cambridge Analytica actually did with the data. In fact, the data-crunching company’s approach represents a step change in how analytics can today be used as a tool to generate insights – and to exert influence.

For example, pollsters have long used segmentation to target particular groups of voters, such as through categorising audiences by gender, age, income, education and family size. Segments can also be created around political affiliation or purchase preferences. The data analytics machine that presidential candidate Hillary Clinton used in her 2016 campaign – named Ada after the 19th-century mathematician and early computing pioneer – used state-of-the-art segmentation techniques to target groups of eligible voters in the same way that Barack Obama had done four years previously.

Cambridge Analytica was contracted to the Trump campaign and provided an entirely new weapon for the election machine. While it also used demographic segments to identify groups of voters, as Clinton’s campaign had, Cambridge Analytica also segmented using psychographics. As definitions of class, education, employment, age and so on, demographics are informational. Psychographics are behavioural – a means to segment by personality.

This makes a lot of sense. It’s obvious that two people with the same demographic profile (for example, white, middle-aged, employed, married men) can have markedly different personalities and opinions. We also know that adapting a message to a person’s personality – whether they are open, introverted, argumentative, and so on – goes a long way to help getting that message across.

Understanding people better

There have traditionally been two routes to ascertaining someone’s personality. You can either get to know them really well – usually over an extended time. Or you can get them to take a personality test and ask them to share it with you. Neither of these methods is realistically open to pollsters. Cambridge Analytica found a third way with the assistance of University of Cambridge academic Aleksandr Kogan.

Kogan sold Cambridge Analytica access to 270,000 personality tests completed by Facebook users through an online app he had created for research purposes. Providing the data to Cambridge Analytica was, it seems, against Facebook’s internal code of conduct, but only now in March 2018 has Kogan been banned by Facebook from the platform. In addition, Kogan’s data also came with a bonus: he had reportedly collected Facebook data from the test-takers’ friends – and, at an average of 200 friends per person, that added up to some 50m people.

While not all of these people had provided personality test responses, it is possible to reverse-engineer a personality profile from Facebook activity. Decades of psychological research have coalesced around the lexical hypothesis – that personality traits can be inferred by studying the subject’s use of language. Facebook patented a process to do just this in 2012, as part of its commercial aims to provide more targeted advertising, by mapping the contents of posts and likes against the “Big Five” model of psychological traits, sometimes known as OCEAN (openness, conscientiousness, extroversion, agreeableness, neuroticism). Whether you choose to like pictures of sunsets, puppies or people apparently says a lot about your personality: a 2015 study by other academics from the Cambridge psychology lab found that a model using Facebook data could, given just 300 likes, predict a person’s personality with the same accuracy as their spouse.

Kogan developed his own model along the same lines and cut a deal with Cambridge Analytica. Armed with this bounty – and combined with additional data gleaned from elsewhere – Cambridge Analytica built personality profiles for more than 100m registered US voters. It’s claimed the company then used these profiles for targeted advertising.

Imagine for example that you could identify a segment of voters that is high in conscientiousness and neuroticism, and another segment that is high in extroversion but low in openness. Clearly, people in each segment would respond differently to the same political ad. But on Facebook they do not need to see the same ad at all – each will see an individually tailored ad designed to elicit the desired response, whether that is voting for a candidate, not voting for a candidate, or donating funds.

Cambridge Analytica worked hard to develop dozens of ad variations on different political themes such as immigration, the economy and gun rights, all tailored to different personality profiles. There is no evidence at all that Clinton’s election machine had the same ability.

Behavioural analytics and psychographic profiling are here to stay, no matter what becomes of Cambridge Analytica – which has robustly criticised what it calls “false allegations in the media”. In a way it industrialises what good salespeople have always done, by adjusting their message and delivery to the personality of their customers. This approach to electioneering – and indeed to marketing – will be Cambridge Analytica’s ultimate legacy.

Updated: This piece was amended on 13 Feb 2026 to make clear that while Michal Kosinski and David Stillwell’s research had demonstrated the effectiveness of using Facebook data to generate personality profiles, they were not involved with Cambridge Analytica and their work was not used by Cambridge Analytica.

Michael Wade, Professor of Innovation and Strategy, Cisco Chair in Digital Business Transformation, International Institute for Management Development (IMD)

This article is republished from The Conversation under a Creative Commons license. Read the original article.

A forgotten Supreme Court case can help prevent Trump's takeover of elections

The recent FBI search of the Fulton County, Georgia, elections facility and the seizure of election-related materials pursuant to a warrant has attracted concern for what it might mean for future elections.

What if a determined executive branch used federal law enforcement to seize election materials to sow distrust in the results of the 2026 midterm congressional elections?

Courts and states should be wary when an investigation risks commandeering the evidence needed to ascertain election results. That is where a largely forgotten Supreme Court case from the 1970s matters, a case about an Indiana recount that sets important guardrails to prevent post-election chaos in federal elections.

Congress’s constitutionally delegated role

The case known as Roudebush v. Hartke arose from a razor-thin U.S. Senate race in Indiana in 1970. The ballots were cast on Election Day, and the state counted and verified the results, a process known as the “canvass.” The state certified R. Vance Hartke as the winner. Typically, the certified winner presents himself to Congress, which accepts his certificate of election and seats the member to Congress.

The losing candidate, Richard L. Roudebush, invoked Indiana’s recount procedures. Hartke then sued to stop the recount. He argued that a state recount would intrude on the power of each chamber, the Senate or the House of Representatives, to judge its own elections under Article I, Section 5 of the U.S. Constitution. That clause gives each chamber the sole right to judge elections. No one else can interfere with that power.

Hartke worried that during a recount ballots could be altered or destroyed, which would diminish the ability of the Senate to engage in a meaningful examination of the ballots if an election contest arose.

But the Supreme Court rejected that argument.

It held that a state recount does not “usurp” the Senate’s authority because the Senate remains free to make the ultimate judgment of who won the election. The recount can be understood as producing new information – in this case, an additional set of tabulated results – without stripping the Senate of its final say.

Furthermore, there was no evidence that a recount board would be “less honest or conscientious in the performance of its duties” than the original precinct boards that tabulated the election results the first time around, the court said.

A state recount, then, is perfectly acceptable, as long as it does not impair the power of Congress.

In the Roudebush decision, the court recognized that states run the mechanics of congressional elections as part of their power under Article I, Section 4 of the U.S. Constitution to set the “Times, Places and Manner of holding Elections for Senators and Representatives,” subject to Congress’s own regulation.

At the same time, each chamber of Congress judges its own elections, and courts and states should not casually interfere with that core constitutional function. They cannot engage in behaviors that usurp Congress’s constitutionally delegated role in elections.

Evidence can be power

The Fulton County episode is legally and politically fraught not because federal agents executed a warrant – courts authorize warrants all the time – but because of what was seized: ballots, voting machines, tabulation equipment and related records.

Those items are not just evidence. They are also the raw materials for the canvassing of votes and certification of winners. They provide the foundation for audits and recounts. And, importantly, they are necessary for any later inquiry by Congress if a House or Senate race becomes contested.

That overlap creates a structural problem: If a federal investigation seizes, damages, or destroys election materials, it can affect who has the power to assess the election. It can also inject uncertainty into the chain of custody: As ballots are removed from absentee envelopes or transferred from Election Day precincts to county election storage facilities, states must ensure that the ballots cast on Election Day are the only ones tabulated, and that ballots are not lost or destroyed in the process.

Disrupting this chain of custody by seizing ballots, however, can increase, rather than decrease, doubts about the reliability of election results.

That is the modern version of “usurpation.”

From my perspective as an election law scholar, Roudebush is a reminder that courts should be skeptical of executive actions that shift decisive control over election proof away from the institutions the Constitution expects to do the judging.

There is another institutional reason courts should be cautious about federal actions that seize or compromise election materials: The House already has a long-running capacity to observe state election administration in close congressional races.

The Committee on House Administration maintains an Election Observer Program. That program deploys credentialed House staff to be on-site at local election facilities in “close or difficult” House elections. That staff observes casting, processing, tabulating and canvassing procedures.

The program exists for a straightforward reason: If the House may be called upon to judge a contested election under Article I, Section 5, it has an institutional interest in understanding how the election was administered and how records were handled.

That observation function is not hypothetical. The committee has publicly announced deployments of congressional observers to watch recount processes in tight House races throughout the country.

I saw it take place first-hand in 2020. The House deployed election observers in Iowa’s 2nd Congressional District to oversee a recount of a congressional election that was ultimately certified by a margin of just six votes.

Democratic and Republican observers from the House politely observed, asked questions, and kept records – but never interfered with the state election apparatus or attempted to lay hands on election equipment or ballots.

Congress has not rejected a state’s election results since 1984, and for good reason. States now have meticulous recordkeeping, robust chain-of-custody procedures for ballots, and multiple avenues of verifying the accuracy of results. And with Congress watching, state results are even more trustworthy.

When federal investigations collide with election materials

Evidence seizures can adversely affect election administration. So courts and states ought to be vigilant, enforcing guardrails that help respect institutional boundaries.

To start, any executive branch effort to unilaterally inject itself into a state election apparatus should face meaningful scrutiny. Unlike the Fulton County warrant, which targeted an election nearly six years old, warrants that interrupt ongoing state processes in an election threaten to usurp the constitutional role of Congress. And executive action cannot proceed if it impinges upon the ultimate ability of Congress to judge the election of its members.

Even when a court issues a warrant, it should not permit seizure of election equipment and ballots during a state’s ordinary post-election canvass. Instead, inspection of items, provision of copies of election materials, or orders to preserve evidence are more tailored means to accomplish the same objectives. And courts should establish clear chain-of-custody procedures in the event that evidence must be preserved for a future federal investigation.

The fear driving much public commentary about the danger to midterm elections is not merely that election officials will be investigated or that evidence would be seized. It is that investigations could be used as a pretense to manage or, worse, disrupt elections – chilling administrators, disorganizing record keeping or manufacturing doubt by disrupting custody of ballots and systems.

Roudebush provides a constitutional posture that courts should adopt, a recognition that some acts can usurp the power of Congress to judge elections. That will provide a meaningful constraint on the executive ahead of the 2026 election and reduce the risk of intervention in an ongoing election.

Derek T. Muller, Professor of Law, University of Notre Dame

This article is republished from The Conversation under a Creative Commons license. Read the original article.

'Unprecedented': Expert condemns Trump admin's credibility crisis with judges

The word “unprecedented” is getting a workout after a grand jury in Washington on Feb. 10, 2026, rebuffed an attempt by federal prosecutors to get an indictment against perceived enemies of President Donald Trump.

It began with an unprecedented video in November 2025 featuring six Democratic lawmakers alerting military and intelligence community members that they had the duty to disobey illegal orders. That enraged Trump, who in an unprecedented move said the lawmakers were guilty of sedition, which is punishable by death. The U.S. attorney for the District of Columbia, Jeanine Pirro, made the unprecedented attempt to indict the lawmakers. The final element in this drama – the federal grand jury’s rejection of Pirro’s request – wasn’t itself unprecedented. That’s because it’s only the latest in an unprecedented string of losses for the Trump administration before grand juries.

Dickinson College President John E. Jones III, a former federal judge, spoke with The Conversation politics editor Naomi Schalit about the role of grand juries, why a grand jury would not indict someone – and how all of this is a reflection of the administration’s remarkable loss of credibility with judges and the citizens who make up grand juries.

How does the grand jury process work?

The grand jury really dates back to before the Bill of Rights, but for our purposes it’s memorialized in the Fifth Amendment within the Bill of Rights. It is meant to be a mechanism that screens cases brought by prosecutors.

Ordinary citizens, not fewer than 16 or more than 23, have the facts presented to them by a United States attorney or assistant United States attorney. They must make a determination as to whether or not there is probable cause to believe that a crime has been committed. It is not the purview of grand jurors to determine guilt or innocence, but merely to determine whether there is probable cause sufficient to indict.

So that means that a prosecutor will come to a grand jury and present them with the facts that they have chosen to present them with. There’s no defense at that point, and the grand jury then, relatively routinely, says, “OK, indict that person,” or “Indict those people”?

That’s correct. It’s a very one-sided process. There are no defense attorneys present. There’s a court reporter, the grand jury, the United States attorney, and such witnesses as the United States attorney decides to call. While the target of a grand jury can endeavor to present witnesses, including themselves, that generally never happens because of the danger of self-incrimination. The grand jurors can ask questions of the witnesses, but the United States attorney can choose the evidence that they want to present to the grand jury, and typically they present only such evidence as is necessary in order to establish probable cause that a crime has been committed.

Does the public know what is presented in a grand jury room by the prosecutor?

The grand jury proceedings are absolutely secret and they remain that way, unless a federal judge authorizes that they be unsealed. So in the case involving the six lawmakers, we don’t know what the prosecutor presented to the grand jury. We just know that the grand jury refused to return an indictment. As far as I know, we don’t even know what crimes were put before the grand jury, let alone what testimony was presented. What we do know is that in all six cases, the grand jury refused to vote in favor of the indictment that was requested by the United States attorney.

Why would a grand jury refuse to give the prosecutor what they want?

It’s unprecedented, although we now see a wave of grand juries pushing back against the government. I don’t recall a single instance, during the almost 20 years I served as a U.S. District judge, when a grand jury refused to return a true bill, an indictment. It just is completely aberrational. The grand jury would have to totally reject the whole premise of the case that’s being presented to them by the United States attorney because, remember, there are typically no witnesses appearing before the grand jury to dispute the facts. The grand jury is clearly saying, “Even accepting the facts you’re putting before us as true, we don’t think under these circumstances this case is worthy of a federal indictment.”

Can a prosecutor just try again?

They can return to the well, so to speak, and they did that in Virginia in the case of Letitia James. But it’s pretty perilous because, bluntly, it’s a way that a prosecutor can get their head handed to them twice.

Originally, as set out in the Fifth Amendment to the Constitution, the grand jury was supposed to be a vigorous and robust check against prosecutors simply charging people with crimes. But over time, it’s become far less than that. And there is the famous quote by Judge Sol Wachtler in New York that a grand jury can be made to “indict a ham sandwich.”

So to see a grand jury fail to return true bills multiple times over the past couple of months is remarkable and unprecedented. It occurs to me that what is happening here is kind of parallel to what’s taking place with the administration and federal judges. I think we now have entered a world where the Department of Justice has lost its credibility with the judiciary.

We’re seeing that time and again in appearances in court where judges simply don’t believe what U.S. attorneys are telling them, based on past demonstrable falsehoods that have been stated in open court. And now we see grand juries that are also doubting the credibility of federal prosecutors. And these grand jurors are not blind to what is taking place in the world around them.

I think that this is further polluted by the fact that the president of the United States, for example, in the case of the six members of the House and Senate, said that they had committed seditious acts – which is punishable by death.

Obviously, this tilts the scales and is fundamentally unfair because it is destroying the concept of due process of law. People notice what the president says, and I am happy to see that the average citizen serving on a grand jury has retained what I think is a fundamental sense of fairness, even in the face of a pretty stacked deck.

What does it mean if you have a court system, judges and the grand juries who do not have faith in the administration and its legal claims?

It’s a complete drag on our system of justice. For all of the time that I sat on the federal bench, I had great respect for the Department of Justice, and the department had tremendous credibility. They were straight shooters. The prosecutors who appeared in front of me were professionals. I didn’t always agree with their arguments, of course, nor did I agree with a few of their charging decisions, but I can tell you that not once did I see a federal prosecution in front of me that I felt strongly should never have been brought at its inception.

But we now have a system where, because of the whims of the president, the Department of Justice has become utterly weaponized against his perceived enemies, and that’s a gross misuse of our prosecutorial power at the federal level.

Also, if, for example, these members of Congress had been indicted, they’d have to lawyer up, they’d have to fight their way out. That would take a lot of resources.

So, yes, the judiciary can be a bulwark against improvident prosecutions. But that comes at a cost to the defendant, and it’s been said that the process itself is the punishment. I suspect that’s what the president wants; it’s the trauma that you put somebody through that can be almost as bad as being convicted. And, of course, there’s the reputational harm as well.

John E. Jones III, President, Dickinson College

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Newly released Epstein files suggest secret intel operations involving wealthy elites

For obvious reasons, the secretive world of intelligence agencies and the people who move in its orbit remains opaque. So much so that some of those people may not even be aware of any involvement in the secret world.

The Epstein papers have thrown up speculation about whether the late financier and sex offender might have performed services for one or another of the big intelligence agencies. And in the wake of that speculation, it has been noted that the father of Epstein’s one-time girlfriend, Ghislaine Maxwell, was the late Robert Maxwell, well-known as a larger than life publisher and newspaper proprietor in the UK from the 1950s to the early 90s. He, too, was the subject of much speculation that he might have been involved in intelligence work.

Epstein is now better known for his sex trafficking network and Maxwell for stealing from his employees’ pension funds. But their examples point to how intelligence, high finance and influence work.

Generally speaking, there are three main classes of people involved in state intelligence gathering. “Officers” are full-time employees of state intelligence agencies such as MI6. They run their groups of “agents”, who are not formally employed by the state but who deliberately and knowingly gather intelligence and perform tasks for intelligence officers. And there are what are known as “intelligencers” (or sometimes assets), who may not even know they are providing information to a spy agency.

The currency of human intelligence is access, knowledge and often the ability to compromise officials and influential people.

We often think that intelligence agencies and their agent runners seek to directly recruit people with the access and motivation to pass on state secrets. While this is undeniably the case – and the examples of the American Aldrich Ames and the Briton Melita Norwood provide good evidence of this – intelligence agencies are equally interested in recruiting what’s known as “access agents”.

Access agents

The value of an access agent is not the secrets they have access to, but the social and professional access they provide to people who do. People in high-end society, scientific research, banking, politics and culture make excellent targets for access agents. And from an agency’s point of view, the best thing is that these agents are deniable and under the radar.

Intelligence officers and their operatives require funding, mobility and a credible back story (known as a legend). Businessmen like Robert Maxwell and Jeffrey Epstein had plenty of all three, making them excellent candidates to theoretically serve the needs of intelligence agencies.

But rather than indulging in speculation about Epstein and Maxwell, which is unlikely ever to be conclusively confirmed or denied, it’s more instructive to look at what we know about access agents. They are often business people, sometimes academics or journalists with a reason to travel and the opportunity to meet people in influential circles in the course of their legitimate business.

It’s worth remembering that Kim Philby, the most notorious of the Cambridge spy ring, cut his teeth as a reporter in Spain during the civil war, before embarking on a career as an MI6 officer (and Soviet double agent). The Australian journalist Richard Hughes – who appeared lightly disguised in novels by Ian Fleming and John le Carré – was believed by many to be an agent for British intelligence, working in southeast Asia during the upheavals of the 1960s and 1970s.

Perhaps the most famous businessman-agent was Cyril Bertram Mills who combined being the director of the Bertram Mills Circus with a four-decade career spanning the years before and after the second world war with British intelligence. Travelling widely in Europe, ostensibly to seek out circus acts, he provided his spymasters with evidence of German rearmament in the 1930s. He also recruited Garbo, one of the most successful double agents, who was instrumental in convincing Germany that the D-Day landings would be in Calais, not Normandy.

An access agent is trained “to be the friend the informant doesn’t have”. They can provide what their contact needs and cannot get hold of: whether that’s useful inside information of some kind, an introduction to someone important, a sexual partner or finance for one of their ventures.

MI5 is quite open about this on its website: “Agents operate by exploiting trusted relationships and positions to obtain sensitive information. They may also look for vulnerabilities among those handling secrets.”

Secrets and lies

Determining truth in intelligence is complicated. Very rarely do we see a single piece of incontrovertible evidence that proves someone’s intelligence status or the ethics or efficacy of their actions. But then as we know, all of this is shrouded in secrecy and supposition.

In Maxwell’s case, historical scholarship and TV documentaries have provided unverified hints. In Epstein’s we have indicators such as the claim by former US attorney Alexander Acosta that he was told Epstein “belonged to intelligence” when he negotiated his plea deal. But it’s unlikely we’ll ever know the truth about either.

Robert Dover, Professor of Intelligence and National Security & Dean of Faculty, University of Hull

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Americans are asking too much of their dogs

Americans love dogs.

Nearly half of U.S. households have one, and practically all owners see pets as part of the family – 51% say pets belong “as much as a human member.” The pet industry keeps generating more and more jobs, from vets to trainers, to influencers. Schools cannot keep up with the demand for veterinarians.

It all seems part of what Mark Cushing, a lawyer and lobbyist for veterinary issues, calls “the pet revolution”: the more and more privileged place that pets occupy in American society. In his 2020 book “Pet Nation,” he argues that the internet has caused people to become more lonely, and this has made them focus more intensely on their pets – filling in for human relationships.

I would argue that something different is happening, however, particularly since the COVID-19 lockdown: Loving dogs has become an expression not of loneliness but of how unhappy many Americans are with society and other people.

In my own book, “Rescue Me,” I explore how today’s dog culture is more a symptom of our suffering as a society than a cure for it. Dogs aren’t just being used as a substitute for people. As a philosopher who studies the relationships between animals, humans and the environment, I believe Americans are turning to dogs to alleviate the erosion of social life itself. For some owners, dogs simply offer more satisfying relationships than other people do.

And I am no different. I live with three dogs, and my love for them has driven me to research the culture of dog ownership in an effort to understand myself and other humans better. By nature, dogs are masters of social life who can communicate beyond the boundaries of their species. But I believe many Americans are expecting their pets to address problems that they cannot fix.

Dogs over people

During the pandemic, people often struggled with the monotony of spending too much time cooped up with other humans – children, romantic partners, roommates. Meanwhile, relationships with their dogs seemed to flourish.

Rescuing shelter animals grew in popularity, and on social media people celebrated being at home with their pets. Dog content on Instagram and Pinterest now commonly includes hashtags like #DogsAreBetterThanPeople and #IPreferDogsToPeople.

“The more I learn about people, the more I like my dog” appears on merchandise all over e-commerce sites such as Etsy, Amazon and Redbubble.

One 2025 study found that dog owners tend to rate their pets more highly than their human loved ones in several areas, such as companionship and support. They also experienced fewer negative interactions with their dogs than with the closest people in their lives, including children, romantic partners and relatives.

The late primatologist Jane Goodall celebrated her 90th birthday with 90 dogs. She stated in an interview with Stephen Colbert that she preferred dogs to chimps, because chimps were too much like people.

Fraying fabric

This passion for dogs seems to be growing as America’s social fabric unravels – which began long before the pandemic.

In 1972, 46% of Americans said “most people can be trusted.” By 2018, that percentage dropped to 34%. Americans report seeing their friends less than they used to, a phenomenon called the “friendship recession,” and avoid having conversations with strangers because they expect the conversation to go badly. People are spending more time at home.

Today, millennials make up the largest percentage of pet owners. Some cultural commentators argue dogs are especially important for this generation because other traditional markers of stability and adulthood – a mortgage, a child – feel out of reach or simply undesirable. According to the Harris Poll, a marketing research firm, 43% of Americans would prefer a pet to a child.

Amid those pressures, many people turn to the comfort of a pet – but the expectations for what dogs can bring to our lives are becoming increasingly unreasonable.

For some people, dogs are a way to feel loved, to relieve pressures to have kids, to fight the drudgery of their job, to reduce the stress of the rat race and to connect with the outdoors. Some expect pet ownership to improve their physical and mental health.

And it works, to a degree. Studies have found dog people to be “warmer” and happier than cat people. Interacting with pets can improve your health and may even offer some protection against cognitive decline. Dog-training programs in prisons appear to reduce recidivism rates.

Unreasonable expectations

But expecting that dogs will fill the social and emotional gaps in our lives is actually an obstacle to dogs’ flourishing, and human flourishing as well.

In philosophical terms, we could call this an extractive relationship: Humans are using dogs for their emotional labor, extracting things from them that they cannot get elsewhere or simply no longer wish to. Just like natural resource extraction, extractive relationships eventually become unsustainable.

The late cultural theorist Lauren Berlant argued that the present stage of capitalism creates a dynamic called “slow death,” a cycle in which “life building and the attrition of life are indistinguishable.” Keeping up is so exhausting that, in order to maintain that life, we need to do things that result in our slow degradation: Work becomes drudgery under unsustainable workloads, and the experience of dating suffers under the unhealthy pressure to have a partner.

Similarly, today’s dog culture is leading to unhealthy and unsustainable dynamics. Veterinarians are concerned that the rise of the “fur baby” lifestyle, in which people treat pets like human children, can harm animals, as owners seek unnecessary veterinary care, tests and medications. Pets staying at home alone while owners work suffer from boredom, which can cause chronic psychological distress and health problems. And as the number of pets goes up, many people wind up giving up their animal, overcrowding shelters.

So what should be done? Some philosophers and activists advocate for pet abolition, arguing that treating any animals as property is ethically indefensible.

This is a hard case to make – especially with dog lovers. Dogs were the first animal that humans domesticated. They have evolved beside us for as long as 40,000 years, and are a central piece of the human story. Some scientists argue that dogs made us human, not the other way around.

Perhaps we can reconfigure aspects of home, family and society to be better for dogs and humans alike – more accessible health care and higher-quality food, for example. A world more focused on human thriving would be more focused on pets’ thriving, too. But that would make for a very different America than this one.

Margret Grebowicz, Distinguished Professor of the Humanities, Missouri University of Science and Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.

US gets a wake-up call as evidence points to a 'Trump slump'

With an upcoming FIFA World Cup being staged across the nation, 2026 was supposed to be a bumper year for tourism to the United States, driven in part by hordes of arriving soccer fans.

And yet, the U.S. tourism industry is worried. While the rest of the world saw a travel bump in 2025, with global international arrivals up 4%, the U.S. saw a downturn. The number of foreign tourists who came to the United States fell by 5.4% during the year – a sharper decline than the one experienced in 2017-18, the last time, outside the height of the COVID-19 pandemic, that the industry was gripped by fears of a travel slump.

Policy stances from the Trump administration on everything from immigration to tariffs, along with currency swings and stricter border controls, have seemingly proved a turnoff to travelers from other countries, especially Canadians – the single largest source of foreign tourists for the United States. Canadian travel to the U.S. fell by close to 30% in 2025. But it is not just visitors from Canada who are choosing to avoid the United States. Travel from Australia, India and Western Europe, among others, has also shrunk.

We are experts in tourism. And while we don’t possess a crystal ball, we believe that the tourism decline of 2025 could well continue through 2026. The evidence appears clear: Washington’s ongoing policies are putting off would-be travelers. In other words, the tourism industry is in the midst of a “Trump slump.”

Fewer Canadians heading south

The impact of Donald Trump’s policies is perhaps most pronounced when looking north of the U.S. border. According to the U.S. Travel Association, Canadian visitors generated approximately 20.4 million visits and roughly US$20.5 billion in visitor spending in 2024, supporting about 140,000 American jobs.

The economic impact of fewer Canadian visitors in 2025 affects mostly border states that depend heavily on people driving across the border for retail, restaurants, casinos and short-stay hotels.

The sharp drop in return trips by car to Canada is a direct indication that border economies might be facing stress. This has led elected officials and tourism professionals to woo Canadians in recent months, sometimes with “Canadian-only deals.”

And it isn’t just border states. In Las Vegas, some hotels are now offering currency rate parity between Canadian and U.S. dollars for rooms and gambling vouchers in a bid to attract customers.

Winter-sun states, such as Florida, Arizona and California, are facing both fewer short-stay arrivals and an emerging drop-off in Canadian “snowbirds.” Reports indicate a noticeable increase in Canadians listing U.S. properties in Florida and Arizona for sale and canceling seasonal plans, threatening lodging, health care spending and property tax revenue.

Economic and safety concerns

Economic policies pursued by the Trump administration appear to be among the main reasons visitors are staying away from the U.S. Multiple tariff announcements – pushing tariffs to the highest levels since 1935 – along with tougher border-related rhetoric and an aggressive foreign policy have contributed to a negative perception of the U.S. among would-be tourists.

Many foreigners report feeling unwelcome or uncertain about travel to the U.S., and some public leaders from Canada and Europe have urged citizens to spend domestically, instead. This significantly reduced intent to travel to the U.S. in 2025.

Meanwhile, exchange rates and inflation have further affected some aspiring travelers, especially Canadians. The Canadian dollar weakened in 2025, making U.S. trips more expensive. This disproportionately affected day-trip and shopping-driven border crossings.

Travelers are also staying away from the U.S. because of safety concerns. Several countries have posted travel advisories about the risks of traveling to the U.S., with Germany being the latest. Although most worries are related to increased border controls, recent aggressive tactics by immigration agents have added to potential visitors’ decisions to avoid the U.S.

A wake-up call for the US

The current tourism outlook is reason for concern. Julia Simpson, president and CEO of the industry association World Travel and Tourism Council, has described the situation as a “wake-up call” for the U.S. government.

“The world’s biggest travel and tourism economy is heading in the wrong direction,” she said in May 2025. “While other nations are rolling out the welcome mat, the U.S. government is putting up the ‘closed’ sign.”

According to estimates, the U.S. stood to lose about $30 billion in international tourism in 2025 as travelers chose to travel elsewhere.

The disappointing figures for U.S. tourism follow a longer trend. The share of global international travel heading to the U.S. fell from 8.4% in 1996 to 4.9% in 2024 and was expected to drop to 4.8% in 2025. Meanwhile, arrivals to other top tourism destinations, including France, Greece, Mexico and Italy, are set to increase.

The decline is also being felt by the business tourism sector, with every major global region sending fewer people to the U.S. for work.

A World Cup bump?

So what does that mean for the upcoming FIFA World Cup, with 75% of the soccer matches being hosted across the United States? Traditionally, host nations benefit from sports events, although impacts are often overestimated. After a disappointing year, the U.S. tourism sector expects the World Cup to boost visits and revenue.

But Trump’s foreign policy may undermine those expectations.

A new visa integrity fee of $250 and plans for social media screening of some visitors make travel to the U.S. less attractive. And there are growing calls for a boycott of the U.S. following some of Trump’s policies, including his aggressive stance on Greenland.

Former FIFA President Sepp Blatter has suggested that fans avoid going to the U.S. for the World Cup.

It remains to be seen whether fans will follow his call. Bookings for flights and hotels were up after the dates and venues of games were announced in December.

But current political rhetoric is affecting travel decisions, especially given that fans from some specific countries may not be able to get visas. The U.S. government has imposed travel bans on Senegal, Ivory Coast, Iran and Haiti, all of which have qualified for the World Cup.

European soccer leaders have even discussed the possibility of a boycott, although such an action is unlikely to happen, given the revenue at stake for national teams and football associations.

Will the ‘Trump slump’ continue?

White House policies look unlikely to drastically change in the next few months. And this causes concern for tourism professionals, although most have remained silent about the recent immigration crackdown.

To make matters worse, federal funding for Brand USA, the national destination marketing organization, was cut deeply in mid-2025, leading to staff shortages that have reduced the country’s capacity to counter negative sentiment through positive promotion.

Soccer fans tend to be passionate about following their national side. And this could offset some of the impact of the Trump travel slump.

Yet, with sky-high match ticket prices and the international reputation of the U.S. as a tourism destination damaged, we believe it is unlikely that the tourism industry will recover in 2026. It will take a long time and good strategies to repair the serious damage done to the nation’s image among travelers in the rest of the world.

Frédéric Dimanche, Professor and former Director (2015-2025), Ted Rogers School of Hospitality and Tourism Management, Toronto Metropolitan University and Kelley A. McClinchey, Teaching Faculty, Geography and Environmental Studies, Wilfrid Laurier University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Trying to predict what Trump will do next is bad for your brain — according to science

Donald Trump can change the temperature of a room with a sentence. One minute he is certain, the next he is backtracking. One day he is threatening, the next he is hinting at a deal. Even before anything concrete happens, people brace for his next turn.

That reaction is not just political. It is what unpredictability does to any system that requires stability. To act at all, you need some working sense of what is happening and what is likely to happen next.

One influential framework in brain science called predictive processing suggests the mind does not wait passively for events. It constantly guesses what will happen, checks those guesses against reality, and adjusts.

A brain that predicts can prepare, even when what it prepares for is uncertainty.

The gap between what you expect and what actually happens is known as a prediction error. These gaps are not mistakes but the basis of learning. When they resolve, the brain updates its picture of the world and moves on.

This is not about what anyone intends, but about what unpredictability does to systems that need some stability to work. Trouble starts when mismatches do not resolve because the source keeps changing. People are told one thing, then the opposite, then told the evidence was never real.

The brain may struggle to settle on what to trust, so uncertainty stays high. In this view, attention is how the brain weighs up what counts as best evidence, and turns the volume up on some signals and down on others.

Uncertainty can be worse than bad news

When this keeps happening, it’s hard to get closure. Effort is spent checking and second-guessing. That is one reason why uncertainty can feel worse than bad news: bad news closes the question, while uncertainty keeps it open. When expectations will not stabilise, the body stays on standby, prepared for many possible futures at once.

One idea from this theory is that there are two broad ways to deal with persistent mismatch. One is to change your expectations by getting better information and revising your view. The other is to change the situation so that outcomes become more predictable. You either update the model, or you act to make the world easier to deal with.

On the world stage, flattery can be a crude version of the second route, an attempt to make a volatile person briefly easier to predict. Everyday life shows the same pattern in unpredictable workplaces. When priorities change without warning, people cannot anticipate what is required. Extra effort may go into reducing uncertainty rather than doing the job.

Research links this kind of unpredictability to higher daily stress and poorer wellbeing.

The same pattern shows up in close relationships. When someone is unpredictable, people scan tone and try to guess whether today brings warmth or conflict. It can look obsessive, but it is often an attempt to avoid the wrong move.

Studies link unpredictable early environments to poorer emotional control and more strained relationships later in life.

The strain does not stay in thought alone. The brain does a lot more than thinking. A big part of its work is regulating the body, such as the heart rate, energy use and the meaning of bodily sensations.

It does this by anticipating what the body will need next. When those anticipations cannot settle, regulation becomes costly.

Words matter here in a literal sense. Language does not just convey information. It shapes expectations, which changes how the body feels.

Trump can do this at a distance. A few words about a situation can raise or lower the stakes for people, whether in Minneapolis or Iran. The point is that signals from powerful, volatile sources force others to revise their models and prepare their bodies for what might come next.

Communication is a form of regulation. Clarity and consistency help other people settle. Volatility and contradiction keep them on edge.

When a single voice can repeatedly unsettle expectations across millions of people, unpredictability stops being a personal stress and becomes a collective regulatory problem.

How to deal with unpredictability

So what helps when unpredictability keeps pulling your attention? Try checking for new information if it changes your next step or plan, otherwise it just keeps the uncertainty alive.

When a source keeps changing, reduce the effort spent trying to decode it. Switch to action. Set a rule that makes the next step predictable. For example, read the news at 8am, then stop and get on with your day.

Learn where not to look. When messages keep reversing, the problem is not a lack of information; it is an unreliable source.

Biological systems survive by limiting wasted predictions. Sometimes that means changing your expectations; sometimes it means changing the situation. And sometimes it means accepting that when Donald Trump is talking, the safest move is to stop trying to predict what comes next.

Robin Bailey, Assistant Professor in Clinical Psychology, University of Cambridge

This article is republished from The Conversation under a Creative Commons license. Read the original article.
