Matthew Rozsa

Scientists believe that the 'basic ingredients' to support life 'could exist' on Jupiter's icy moon Europa

Human beings have long looked to the stars and hoped that alien life might look back at us. Yet the truth is that the first extraterrestrial life we discover is far more likely to be microbial — a prospect less romantic perhaps than the idea of bipedal aliens shaking hands with humans after landing on Earth.

Such microbial life has been theorized to have existed in the early days of Mars, before its water dried up, though we still don't know for certain. Now, astrobiologists are turning their gaze towards another nearby neighbor, Europa — an icy gray moon of Jupiter — as a suddenly much more alluring candidate for simple life.

Renewed interest in Europa's potential to harbor life stems from a new study about the peculiar moon. The subject of curiosity is the giant ridges that criss-cross the moon's surface like scratches on a cue ball. Underneath those ridges, explain the authors of a new paper in the journal Nature Communications, there may be pools of salty, liquid water. And since those ridges are ubiquitous, the pools could be commonplace as well.

Of course, early microbial life on Earth evolved in the salty liquid water of our oceans — which is what makes the hint of salt water on Europa so tantalizing. Europa's unusual surface also happens to closely resemble a landscape in Northwest Greenland, which is the other half of what the study concerns.

"Here we present the discovery and analysis of a double ridge in Northwest Greenland with the same gravity-scaled geometry as those found on Europa," the authors explained. "Using surface elevation and radar sounding data, we show that this double ridge was formed by successive refreezing, pressurization, and fracture of a shallow water sill within the ice sheet. If the same process is responsible for Europa's double ridges, our results suggest that shallow liquid water is [ubiquitous] across Europa's ice shell."

Europa is not a particularly large world; at roughly 2,000 miles in diameter, it is smaller than Earth's own moon. Yet its surface is unique, festooned with giant double ridges that can tower as high as 1,000 feet.

When a team of scientists at Stanford University learned about Europa's double ridges, they decided to study a much smaller analogue in northwest Greenland. More specifically, they examined a miniature double ridge there and worked out how it formed: Shallow pools of water beneath the surface repeatedly froze, pressurized and broke through the ice, pushing up the twin ridges a little more each time.

If the analogous ridges on Europa were formed the same way, as seems probable, the constant churning could have helped bring about the chemical reactions necessary to create life. It is an intriguing premise, to say the least, and is part of a long history of astrobiological interest in Europa.

"Scientists know from a combination of observations by Earth-based telescopes and spacecraft such as Galileo that the surface of Europa is covered primarily with water ice," Dr. Cynthia B. Phillips, Europa Project Staff Scientist and Science Communications Lead from the NASA Jet Propulsion Laboratory, told Salon by email. Astronomers estimate that Europa's surface has the same density as water ice and is roughly 100 kilometers thick, but the gravity measurements used to obtain that estimate do not answer questions about exact composition. How much of this is solid ice and how much of this is liquid water?

"Gravity measurements also tell us that below this ice/water layer is a layer of rock and then a metallic core at the center," Phillips, who was not involved in the most recent study, added. If you want there to be life in the universe, these are all good signs, as they suggest the basic ingredients could exist on the enigmatic moon.

"There are three things needed for life as we know it," explained Dr. Christopher Chyba, Professor of Astrophysical Sciences and International Affairs at Princeton University, in an email to Salon. In addition to liquid water and a source of useable energy, you need "the so-called biogenic elements" — like carbon — "that our kind of life is based on," plus a source of useable energy. "NASA's strategy for searching for life has long been 'follow the water,' and Europa and Enceladus in our Solar System are the two places, besides Mars, where we have a lot of evidence for liquid water that is probably accessible to exploration," Chyba, who was not involved in the study explained.

Chyba said it would be "bizarre" if Europa did not form with the "usual complement" of biogenic elements one finds on celestial bodies, "but even if Europa somehow formed without them, the late Betty Pierazzo showed that Europa would have accumulated a significant inventory of them over Solar System history from comet impacts." Pierazzo was a researcher at the Planetary Science Institute who specialized in impact craters.

"Gravity measurements also tell us that below this ice/water layer is a layer of rock and then a metallic core at the center," Phillips, who was not involved in the most recent study, added. If you want there to be life in the universe, these are all good signs, as they suggest the basic ingredients could exist on the enigmatic moon.

"There are three things needed for life as we know it," explained Dr. Christopher Chyba, Professor of Astrophysical Sciences and International Affairs at Princeton University, in an email to Salon. In addition to liquid water and a source of useable energy, you need "the so-called biogenic elements" — like carbon — "that our kind of life is based on," plus a source of useable energy. "NASA's strategy for searching for life has long been 'follow the water,' and Europa and Enceladus in our Solar System are the two places, besides Mars, where we have a lot of evidence for liquid water that is probably accessible to exploration," Chyba, who was not involved in the study explained.

Chyba said it would be "bizarre" if Europa did not form with the "usual complement" of biogenic elements one finds on celestial bodies, "but even if Europa somehow formed without them, the late Betty Pierazzo showed that Europa would have accumulated a significant inventory of them over Solar System history from comet impacts." Pierazzo was a researcher at the Planetary Science Institute who specialized in impact craters.

"We don't know if there is life there or not, because we don't have enough of an understanding of the origin of life (on Earth or anywhere else) to say whether Europa's conditions would have favored the origin of life," Chyba observed.

Lessons of the Radical Republicans

Once upon a time in a country somewhat resembling this one, the Republican Party had a radical faction — and not because it believed in bizarre theories about election fraud or wanted to undermine democracy. By modern standards, the Radical Republicans of the 1860s would clearly be regarded as leftists: They fiercely supported racial equality, had no tolerance for insurrectionists and believed government should help the most vulnerable people in society. Their story is important for many reasons: They helped shape modern-day America, and they may even provide clues about how it can be saved.

As with so many great stories in American history, this one begins with Abraham Lincoln.

After Lincoln won the contentious presidential election of 1860 — in several states, his name wasn't even on the ballot — slave-owners across the South, convinced that this meant the end of their "peculiar institution," decided to secede from the Union. That provoked the Civil War, of course, but as you probably know, it didn't immediately lead to the end of slavery. Indeed, for almost two years, many Republicans harshly criticized their own party's president for moving too slowly on that issue. Even after Lincoln issued the Emancipation Proclamation, the so-called Radical Republican wing noticed a loophole: It only applied to enslaved people in the rebellious states; those in states that had not seceded, such as Kentucky, Maryland and Missouri, were still in shackles.

Lincoln was unambiguous, however, in his contempt for rebels. Whether or not it's fair to compare the attempted coup of 2021 with the insurrection that began in 1860, Lincoln viewed the latter as straightforward treason. If citizens in a democratic society are permitted to rebel simply because they dislike the results of an election, he reasoned, then democracy itself cannot endure. One law passed with Lincoln's support banned former Confederate leaders from holding political office of any kind — and even there, many of the Radical Republicans felt he was being too lenient.

With the surrender of the Confederacy and Lincoln's assassination in 1865, everything changed. The new president was Andrew Johnson, a former Tennessee senator and an avowed white supremacist, although he had remained loyal to the Union. Although passage of the 13th Amendment had ended slavery for good, Johnson gave the formerly rebellious states so much leeway that the plight of newly emancipated Black people was not much better than it had been before. The Radical Republican cause was clearly on the back foot — until its champions struck back.

After Johnson vetoed the Civil Rights Act of 1866 — which made it illegal to deny someone equal citizenship based on color — Republicans overrode his veto, the first time that had ever happened with a major piece of legislation. Radical Republican commentators and orators toured the land, presenting Lincoln as a martyred hero (despite their tepid attitude toward him in life) and insisting that Johnson was disgracing his memory. By the time the 1866 midterm elections rolled around, they had created conditions for what we would now call a "wave election."

Once in control of Congress, the Radical Republicans made what must be regarded as a grave error: They impeached Johnson for purely political reasons. Frustrated at the president's intransigence and bigotry, leading Republicans like Sen. Benjamin Wade of Ohio and Rep. Thaddeus Stevens of Pennsylvania set a trap for Johnson. They passed a blatantly unconstitutional law that restricted the president's power to remove certain officeholders without the Senate's approval. Then they waited until Secretary of War Edwin Stanton (a Lincoln holdover and Republican ally) disobeyed Johnson's orders and was dismissed, using that as the pretext to begin impeachment proceedings. Johnson was impeached in the House but avoided Senate conviction by one vote. Today, the legal consensus holds that conviction would have been a dreadful miscarriage of justice: Johnson was a bad president, but the Republicans had no legitimate grounds to remove him from office.

Aside from that, and their later willingness to turn a blind eye to the various scandals of President Ulysses S. Grant's administration, the Radical Republicans were on the right side of history a hell of a lot. In 1871, they pushed for a new Civil Rights Act that allowed Grant to suspend the writ of habeas corpus to fight the Ku Klux Klan and other white supremacist groups. With Grant's support, they pushed through a series of laws in 1870 and 1871 that tried to undo racial discrimination as much as possible. These Enforcement Acts were meant to guarantee that Black men could vote, serve on juries and hold office, and were entitled to equal protection under the law. (Women of any race had few political rights, and that didn't begin to change until the end of the century.)

This was of course the period of progressive reforms and potential racial reconciliation known as Reconstruction. But then came the Depression of 1873. As usually happens during economic downturns, the incumbent party was blamed, but the 1874 midterm elections were no ordinary contest. In addition to the usual economic resentments, many Americans in the North wanted to put the Civil War in the past and end the de facto military occupation of the Southern states. Instead of blaming racial terrorists like the KKK for the continuing conflict, some blamed the Radical Republicans. The result was a massive swing to the Democrats, who defined themselves as a populist party representing the interests of working people both North and South — as long as they were white. In that election, Democrats picked up 94 House seats (out of just 293 at the time) and held a majority for 12 of the next 14 years.

One more shoe had to drop, and that was the tortuous presidential election of 1876, in many ways an eerie precursor to the 2020 contest. As I observed two years ago, the 1876 election had the highest voter turnout rate in American history, at 81.8%, while the 2020 election had the highest turnout (about 66%) in 120 years. At least in 2020 only one side tried to cheat, whereas in 1876 both sides did.

Republicans hoped their candidate, Rutherford B. Hayes, could keep them in power another four years despite the apparent turning of the tide. It's still not clear whether Hayes or Democrat Samuel J. Tilden would have won a free and fair election, but the tainted and deadlocked contest ended in the Compromise of 1877, in which Hayes won the White House at the cost of ending Reconstruction and effectively allowing the South to impose the Jim Crow regime of racist oppression and legal segregation.

What are the lessons of the Radical Republican experiment? That depends on your point of view, of course. One lesson might be that sacrificing one's core political principles in the interest of winning elections never ends well. A second is that principled minorities can shape the future: It would take nearly 100 years for America's political reality to catch up to the Radical Republicans' policies, but they set an important example that has fascinated historians and progressive activists ever since. Third and perhaps most difficult is the lesson that in politics you have to expect the unexpected, such as economic downturns and currents of domestic unrest. We are not quite halfway through Joe Biden's first term as president, and after the Afghanistan withdrawal, two new waves of COVID and the war in Ukraine, that lesson seems to be harshly reinforced every single day.

Why life after nuclear war – even a limited one – would suck

At the time this article is being written, Russian President Vladimir Putin is escalating his invasion of Ukraine with no end in sight. Because Russia has nuclear weapons, experts agree that it is possible they will be used during the war — perhaps on a small scale, perhaps on a much larger one, say, in a direct confrontation with a NATO power like the United States. As Putin becomes increasingly desperate to recreate the Russian empire and destroy the liberal world order, there is no telling what he might do to save face and salvage what remains of his geopolitical ambitions.

If that happened, what would that mean for the rest of the world? The answer is both complicated (it depends to an extent on where you live) and terrifyingly simple — it would be an apocalyptic scenario right out of the most dire Biblical prophecy or dystopian science fiction story.

And the conflict doesn't need to culminate in a literal world war to have an effect on your life.

Even a comparatively small nuclear conflict, such as one that "merely" incinerates a few cities, would instantly plunge the world's economy into chaos. Globalization has resulted in a worldwide web of supply chains that are extremely vulnerable to disruption — this is already being seen with COVID-19 and climate change — and any goods tied to supply chains in affected areas would grind to a halt. That, however, would be the least of humanity's problems. As smoke from the destroyed areas rose into the atmosphere, the entire planet would soon be choked under a blanket of soot. Sunlight would not be able to reach vital crops, leading to dire food shortages, and survivors would be left inhabiting a state of constant winter.

Hence the term "nuclear winter."

For what it's worth, there are scientists who believe that a small-scale nuclear war might not automatically result in a nuclear winter (although it would still be devastating). When it comes to predicting nuclear war, there are a number of variables to consider: the physical composition of the cities being attacked, which determines what they would put into the atmosphere when destroyed; how many weapons would be used; what kinds of clouds would emerge; and so on. Even the scientists who are skeptical about what would happen during a smaller nuclear conflict readily state that a worldwide nuclear conflict would lead to nuclear winter. In the case of a smaller one, questions of scale come into play.

The wind also makes a big difference. In addition to wrecking the climate, a nuclear war would loft radioactive materials as high as 50 miles into the atmosphere. Most of these particles fall to the ground around the area where the bombs detonated, but the lighter ones can linger in the atmosphere for long stretches before descending as fallout. As the wind and other atmospheric currents carry them around the world, they cause illnesses like radiation sickness and cancer in anyone exposed to them. This can continue for years as fallout circles the globe, coming down only through precipitation or by gradually settling on the ground. Even then, the particles can remain dangerous; there are usually hundreds of different radionuclides in nuclear fallout, and some of the more lethal ones (like cesium-137) have long half-lives (in the case of that isotope, more than 30 years).

(For what it is worth, things would likely not be quite as bleak in the Southern Hemisphere as in its northern counterpart. The reason is simple: Planetary wind patterns tend not to cross the equator, and as such scientists believe that most of the fallout from a nuclear conflict would stay in whichever hemisphere the bombs exploded in.)

A nuclear war would also exacerbate climate change. As nuclear detonations tore into the atmosphere, they would destroy massive chunks of what is left of Earth's ozone layer. Organisms would suffer from increased exposure to ultraviolet light, and the trends that already exist because of climate change — wildfires, hurricanes, loss of species diversity — would accelerate.

These thoughts may not bother Putin, who has reportedly toyed with the idea of nuclear war and said it would end in Russians going "to heaven as martyrs," but they weigh heavily on much of the rest of the world. Occasionally there have been successful efforts to protect humanity from the threat of nuclear annihilation through political means. Perhaps the most important example was the campaign to ban above-ground testing of hydrogen bombs. When Democratic presidential candidate Adlai Stevenson kicked off that campaign in 1956, he was widely derided as being soft on Russia (then the Soviet Union). Yet as he informed the international public about how above-ground hydrogen bomb tests were putting fallout into the atmosphere, people realized he was correct. Within a few years, treaties were signed to ensure that nuclear tests could only be performed underground.

Given the threat of nuclear war today, it is easy to see why.

This is what would happen to Earth if a nuclear war broke out between the West and Russia: experts

Suddenly, the threat of nuclear war feels closer than it has in decades. The Bulletin of the Atomic Scientists has set its Doomsday Clock at 100 seconds to midnight, and President Joe Biden has issued increasingly ominous statements reflecting how the looming conflict over Ukraine could ensnare both Russia and the West in a conventional war.

And, some fear, war with nuclear weapons. It is a prospect that has haunted human beings since the dawn of the Cold War. Politicians who were perceived as too open to the idea of nuclear war would pay for their hawkishness at the polls. Motion pictures from "Dr. Strangelove" to "The Day After" have depicted an uninhabitable world, filled with lethal amounts of radiation and short on necessities like food and water. As our electrical infrastructure collapsed around us, people would resort to looting and other violent methods to survive. The seeming deterioration of civilization during the early months of the COVID-19 pandemic would be nothing compared to the anarchy and destruction that would follow nuclear war.

Yet decades of living with nuclear weapons have produced a broad body of knowledge as to what a nuclear war might do to the planet, and to humanity. If even a "small" nuclear war were to break out, tens of millions of people would die after the initial blasts. A blanket of soot would block the rays of the Sun and cause a nuclear winter, destroying crops all over the planet and plunging billions into famine. In the northern hemisphere, there would be such severe ozone depletion from the nuclear smoke that organisms would suffer from increased exposure to damaging ultraviolet light. While things would not be as bad in the southern hemisphere, even well-positioned countries like Australia would face the ripple effects of a small nuclear war in the northern hemisphere by sheer virtue of their interconnectedness with the global community.

"The worst-case scenario is that US and Russian central strategic forces would be launched with the detonation of several thousand warheads," Hans M. Kristensen, Director, Nuclear Information Project and Associate Senior Fellow to SIPRI, Federation of American Scientists, told Salon by email. "A large nuclear exchange would not only kill millions of people and contaminate wast areas with radioactive fallout but potentially also have longer-term climatic effects."

Yet Kristensen said he does not believe the current Ukraine conflict is likely to become a nuclear war. He is not the only nuclear weapons expert who feels that way.

"First, there is little chance of that happening barring some massive miscalculation, accident or escalation of any conflict there," Geoff Wilson, the political director of Council for a Livable World, a non-profit dedicated to eliminating nuclear weapons from America's arsenal, told Salon by email. Ukraine is not part of the North Atlantic Treaty Organization (NATO) and, as such, the United States has not committed to use its military if Ukraine's sovereignty is encroached. While American policymakers can provide material aid and punish Russia through sanctions, it is unlikely that they will risk open warfare.

That said, the world's nuclear powers (which, in addition to the United States and Russia, include China, India, Israel, France, North Korea, Pakistan and the United Kingdom) still have vast arsenals at their disposal. In addition, former President Donald Trump oversaw the development of new weapons like the W76-2 low-yield nuclear warhead. As such, the possibility of nuclear war always remains — not likely in this scenario, perhaps, but never entirely out of the question.

"The fact that the United States has started to develop these weapons again is crazy, and it sends a very poor message to the rest of the world when we have been pushing nations to end nuclear proliferation and reduce the size and scope of nuclear arsenals for so long," Wilson explained. "What's more, it sends a dangerous signal to our adversaries that we think that tactical nuclear weapons are important again, and will likely signal to them that they should follow suit."

Like Kristensen, Wilson made it clear that if a war involving nuclear weapons ever did break out, it would end disastrously.

"Researchers have estimated that a 'regional nuclear war,' say, a couple hundred low-yield weapons exchanged between India and Pakistan, could lead to the deaths of billions people worldwide, due to the effects on global food production," Wilson explained. "So, yeah, it would not be good."

Since the United States dropped an atomic bomb on Hiroshima in 1945, intellectuals from a number of disciplines have advocated for world government as an alternative to a possible nuclear holocaust. Andreas Bummel, co-founder and director of the international campaign for a United Nations Parliamentary Assembly and of Democracy Without Borders, has made that argument as well, telling Salon that there are no national policies which can entirely eliminate the threat.

"The only way is institutional and structural by creating a workable international system of collective security which is not only based on total elimination of WMD but also radical conventional disarmament, setting up UN capacities for rapid intervention and democratic decision-making bodies and procedures," Bummel explained by email. He added that it is "doubtful" whether this can happen in a meaningful way "while major nuclear powers are autocratic and one-party dictatorships."

Kristensen offered some less sweeping alternatives.

"Arms control agreements to reduce the numbers and role of nuclear weapons," Kristensen told Salon. "Crisis management agreements to reduce the chances for and risks of misunderstandings and overreactions. And changes in national policies so countries refrain from taking aggressive action. All of this requires political will to change."

The psychological reason that so many fall for the 'Big Lie'

Contrary to popular belief, there is no evidence that Nazi propaganda chief Joseph Goebbels coined the term "Big Lie." According to the supposed quote, Goebbels said that if you tell "a lie big enough" and regularly repeat it, "people will eventually come to believe it." That said, Adolf Hitler actually did use the phrase "big lie" — but not to describe his own propaganda strategy. In a darkly ironic case of psychological projection, he came up with the expression to defame the Jewish community.

"In the big lie there is always a certain force of credibility," Hitler wrote in "Mein Kampf," his 1925 autobiographical manifesto. He observed that most people are only comfortable telling small lies, and imagined others would be as uncomfortable as themselves perpetuating big ones. "It would never come into their heads to fabricate colossal untruths, and they would not believe that others could have the impudence to distort the truth so infamously," Hitler explained. "Even though the facts which prove this to be so may be brought clearly to their minds, they will still doubt and waver and will continue to think that there may be some other explanation."

Indeed, like many abusers before him, Hitler rationalized his own depraved behavior by falsely accusing his victims of doing the same thing. The story of World War II is, in many ways, a tale of a Big Lie run amok. Germany felt humiliated after its loss in World War I, and the nationalistic pride which had fueled that conflict still burned in the hearts of millions.

This tactic, of a leader hypnotizing vast swathes of the public through the perpetuation of a grandiose falsehood, is a phenomenon that extends well beyond World War II and Adolf Hitler. Recently, the term has been recycled to refer to the falsehood that the 2020 presidential election was "stolen" in some indeterminate way, a lie that is repeated ad infinitum by Trump and a slew of his supporters at all levels, from yard-sign-wielding foot soldiers all the way up to his closest legal counsel.

The term "Big Lie" is believed to have been first popularized in the Anglophone world by Walter Langer, a psychoanalyst who prepared a psychological profile of Adolf Hitler for the U.S. government in 1943. In that report, Langer wrote:

[Hitler's] primary rules were: never allow the public to cool off; never admit a fault or wrong; never concede that there may be some good in your enemy; never leave room for alternatives; never accept blame; concentrate on one enemy at a time and blame him for everything that goes wrong; people will believe a big lie sooner than a little one; and if you repeat it frequently enough people will sooner or later believe it.

Beyond Langer, psychologists and sociologists throughout the twentieth and twenty-first centuries have been intrigued by the success of the Big Lie strategy — meaning a story pushed by a political leader that is clearly a bald-faced falsehood, yet so grandiose as to make it hard to believe that someone would fabricate it. Indeed, it is an intriguing question why this works politically, and why so many millions are so quick to believe Big Lies — be it about voting fraud or Jewish conspiracies. The counterintuitive nature of the Big Lie tactic is perhaps what is most peculiar: Wouldn't a small lie be easier to pass off than a large one?

Not necessarily, psychologists say.

"Repetition is important, because the Big Lie works through indoctrination," Dr. Ramani Durvasula, a licensed clinical psychologist and professor of psychology who is noted as an expert on narcissistic personality disorder and narcissistic abuse, told Salon by email. "The Big Lie then becomes its own evidence base — if it is repeated enough, people believe it, and the very repetition almost tautologically becomes the support for the Lie."

Durvasula added that this is amplified by the numerous media platforms of the modern era, which can trick people into thinking a falsehood has been independently corroborated even when all of their media sources share the same political leanings.

"The banners and hats crucially add an air of silliness to everything. If I can buy a novelty hat about it, can it really be so serious? It's a genius mindf**k."

"Hear something enough it becomes truth," Durvasula explained. "People assume there is an evidence base when the lie is big (it's like a blind spot)."

Indeed, Hitler rose to power through a Big Lie that soothed Germans' wounded egos and targeted already-popular scapegoats: Jews and socialists, who according to the Nazi narrative had betrayed Germany through backroom dealings after the empire had won on the battlefield. All of the "evidence" that Hitler marshaled to support this claim was false (fact-checkers who pointed this out were described as Jews promoting a "big lie"), and for that reason only die-hard Nazis believed the Big Lie — at first. After Hitler gained power, however, he was able to spread both that fabrication and other lies far more effectively, convincing more and more people that a conspiracy of Jews and leftists was working against Europe's supposedly superior races. Dissent was quashed, fascism prevailed and even so-called moderates began to think that there must be at least some truth in the accusations. After all, they were being repeated everywhere.

This omnipresence, apparently, is a big part of what makes it so easy for people to be fooled by a Big Lie.

Logician Miriam Bowers-Abbott, an associate professor at Mount Carmel College of Nursing, stressed the importance of repetition in spreading a Big Lie.

"What's especially helpful is repetition in a variety of contexts," Bowers-Abbott wrote to Salon. "That is, not just the same words over and over — but integration of an idea in lots of ways. It builds its own little web of support."

As a hypothetical example, Bowers-Abbott suggested a scenario where she would want to falsely convince Salon that green grapes are a superfood.

"I need to do more than state, 'Green grapes are a superfood' repetitively, I need to work it into conversations," Bowers-Abbott explained. "'Oh, I see grapes are on sale this week, so much nutrition at such a low price!'; 'My dietician has a great superfood recipe that features kale and grapes!'; 'Yes! Green grapes are green! That's the color of superfoods!'"

Dr. Matt Blanchard, a clinical psychologist at New York University, told Salon by email that this kind of immersion does not have to be merely rhetorical. If the purveyors of a Big Lie are shrewd, they can even incorporate it into a target's physical environment.

"You might think I'm kidding, but.... Nothing sells the Big Lie like novelty t-shirts, hats and banners," Blanchard told Salon. "These items are normally associated with sports teams, not life-and-death political issues. But [former President Donald] Trump and his circle have deftly used these items to generate the kind of unbridled loyalty Americans associate with pro football." Blanchard noted that the mob which attempted a coup on January 6th was "at points indistinguishable from a rowdy tailgate party. The banners and hats crucially add an air of silliness to everything. If I can buy a novelty hat about it, can it really be so serious? Or a flag featuring Trump as Rambo? The use of these sports fan items allows them to both be attacking the Capitol building and at the same time, just having good clean fun."

He added, "It's a genius mindf**k. This goofy paraphernalia has confused our response to the riot ever since."



Bandy Lee, an American psychiatrist who edited the book "The Dangerous Case of Donald Trump: 27 Psychiatrists and Mental Health Experts Assess a President," noted that people embrace outrageous assertions for emotional reasons, and that propagandists play into that as they repeat their narrative.

"Usually, they are trying to find comfort and to avoid pain," Lee wrote to Salon. "This happens in states of lesser health, where one is less inclined to venture into new domains or to seek creative solutions. There is comfort in repetition, and so a people or a nation under duress will gravitate more toward what is repeated to them than what is realistic. Adolf Hitler understood this very well, which is why the American psychologist Walter Langer coined the phrase to describe his method."

Durvasula also speculated that Big Lies benefit from humanity's hierarchical nature, given that "primate groups do tend to organize into tribes with alphas and leaders and hierarchies, and that's us as people." She added that many people are not sufficiently informed about the narcissistic behaviors that are warning signs "that there are people in our midst that lack empathy, have no care for the common good, are grandiose, arrogant, and willing to exploit and manipulate people for solely their own egocentric needs." Instead "a sort of halo effect imbues leaders with presumed expertise and power — when that is not at all the case (most if not all megalomaniacal leaders, despots, tyrants, oligarchs share narcissism/psychopathy as a trait)."

Obviously, most of those who rally around a Big Lie do not do so from a place of deliberate deceptiveness — and they certainly wouldn't call it that. Take the Big Lie being spread by Trump: that the 2020 election was stolen from him. Like Hitler's Big Lie, all of Trump's "evidence" of fraud has been exposed as spurious, from the dozens of lost legal cases (he never once proved fraud in court) to the fact that his own attorney general and vice president admitted the election had not been stolen. To believe that the election was stolen, one would have to envision a conspiracy involving hundreds of Republicans as well as Democrats, with absolutely no "smoking gun" leaks — an absurd concept if one tries to break it down logistically.

"We don't truly 'believe' things, so much as provisionally accept information we find useful."

Yet the lack of substance is precisely the point: Big Lies are structured to turn attention away from it — in most contexts, a person with Trump's personal history and lack of evidence would never be taken seriously — by instead playing on the desires of their targets.

"Everything we know about the human brain suggests it is composed of numerous systems that interact, overlap, excite, inhibit, and often contradict each other, and may even hide information from consciousness," Blanchard told Salon. "So it comes as no surprise that the act of 'believing' is not just one thing that humans do. Instead, this one word represents a wide range of relationships that humans have with information. We don't truly 'believe' things, so much as provisionally accept information we find useful."

As Blanchard put it, people will weigh information that directly impacts their lives differently than information which seems more abstract. The name of the game is proximity.

"For example, a man who suspects his wife is cheating on him (close proximity) may work feverishly to find the truth, plant cameras in the home, hire a private detective, and so on," Blanchard explained. "But if the topic switches to Joe Biden's election (far proximity) the same man probably won't bother to even check a second news source before he decides what to 'believe.'" This can lead to a "pretty careless" relationship with political information because people do not truly appreciate the long-term consequences.

"We tool-using humans look at every object and wonder, 'How can I use this? What is this good for?'" Blanchard told Salon. "Political information is no different. The Big Lie is no different."

He added, "So most people don't whole-heartedly 'believe' the Big Lie, but they are more than happy to provisionally accept it because... why not? It might be entertaining. It might flatter your identity. It might help you bond with other people in your community. Or it might help you vent some rage."

The popularity of social media platforms like Facebook, Twitter and Instagram further exacerbates these trends because they add new elements of social pressure. An individual who has repeatedly embraced a Big Lie in those public settings will feel a level of personal investment that makes dislodging the lie that much more challenging.

"It was easier to dislodge untruths before social media," Bowers-Abbott told Salon. "In social media, people tend to take public positions. When that position turns out to be wrong, it's embarrassing. And backing down is typically seen as weakness. So they double-down on untrue claims to save face and personal credibility."

She added, "We are way too emotionally attached to being right. It would be better for our culture as a whole to value uncertainty and intellectual humility and curiousity. Those values help us ask questions without the expectation of permanent answers."

Durvasula expressed a similar point, arguing that the best antidote to Big Lies is for people to learn more about critical thinking skills.

"Pushback means education in critical thinking (but given that school board heads are facing death threats over teaching critical thinking — that is not likely to happen)," Durvasula wrote. "It means ending algorithms that only provide confirmatory news and instead people seeing stories and information that provide other points of view (again, not likely to happen), creating safe spaces to have these conversations (who will be the referee?), encouraging civil discourse with those who hold different opinions, teaching people to find common ground (e.g. love of family) even when belief systems are not aligned."

"'Belief' is always predicated on usefulness, and useless beliefs do not survive."

Durvasula was skeptical about the idea that you can persuade someone to abandon a Big Lie through evidence — and Blanchard said the same thing. The problem is that, simply put, a lot of people believe the Big Lie because they want to. It helps them. And the only way to stop the Big Lie, in those situations, is to stop the people spreading it.

"Trump's lies have always been about power," Blanchard wrote to Salon. "He demonstrates his power by lying to your face, and when there are no consequences, his power is seen to be confirmed. The actual content of his lies is of secondary importance." As such, Trump and the other spreaders of the Big Lie will only be discredited in the eyes of their supporters if they face their greatest fear — accountability.

"They must be seen to lose at the ballot box, they must be arrested when they break the law, they must be sued for every defamation, they must be pursued with every legal tool available in an open society," Blanchard explained. "Above all else they must be seen as weak. Only then will their lies lose their usefulness for the millions who once saw something to gain -- personally, psychologically, politically, financially -- in choosing to believe."

He added, "As I said above, 'belief' is always predicated on usefulness, and useless beliefs do not survive."

Lee compared disabusing someone of the falsehoods in a Big Lie to treating regular delusions. One rule: Don't put them on the defensive.

"Confronting them, or presenting facts or evidence, never works," Lee told Salon. "You have to fix the underlying emotional vulnerability that led people to believing it in the first place. For populations, it is usually the pain of not having a place in the world, which socioeconomic inequality exacerbates. Deprivation of health care, education, an ability to make a living, and other avenues for dignity can make a population psychologically vulnerable to those who look to exploit them."

How COVID-19 is causing a crisis in faith

In times of crisis, humans often turn to faith for reassurance. Oddly, the COVID-19 pandemic might go down in history as an exception — when, according to some research, many people actually turned away from faith, and became less religious.

According to a study published earlier this month in the Journal of Religion and Health, the two largest religious groups in Germany both saw significant drops in religious faith during the COVID-19 pandemic — particularly during the more stringent lockdowns which followed the second wave.

Examining how Catholics and Protestants measured their own wellness and faith at various points during the pandemic, the researchers found that 15 percent of the surveyed participants during the first wave said they had lost faith in God or a higher power because of COVID-19. During the second wave, 21.5 percent said they had lost faith as a result of the pandemic, and the pattern of decline continued during the first half of 2021. The authors added that things "started to improve slightly" during the fourth wave, which began near the end of 2021.

"In our area, we have noticed a decline of religious interest also before the pandemic," Arndt Büssing, the corresponding author for the study and a professor on the faculty of health at Witten/Herdecke University, told Salon by email. "Now during the pandemic, social restrictions and the lockdowns lead to an increase of perceived loneliness and social isolation with a decline of emotional wellbeing. An argument is that faith could be a resource to cope, to find meaning in times of darkness and insecurity. However, due to the long course of the pandemic with less hope that things will significantly change, even when we now have a vaccination, people may lose their hope."

RELATED: The Christian right didn't used to care about abortion — until they did

Büssing explained that there are a number of ways in which this manifests itself. Some people feel that their prayers are going into "the void," and others simply observe that faith no longer provides them with comfort. There are individuals who respond to the pandemic by asserting it shows God's indifference to humanity. In addition, there is the phenomenon of "hope fatigue," in which "religious people have expected 'responses of hope' and supporting reactions from their church" and missed them. The survey found low levels of satisfaction with the support people felt they had received from their local religious communities, and some of them may have become more privately religious in response.

"This is a problematic situation for the churches as one may assume they focus too much on other 'topics of interest' rather than what several of their parishioners may see as relevant in their concrete life," Büssing observed.

Yet other recent studies found that religious faith was not hampered by the COVID-19 pandemic, at least in certain situations and for certain groups. In the journal European Societies, a group of scholars revealed in October 2020 that Italians who had a COVID-19 infection in their family were more likely to turn to religion, both by attending religious services and by praying privately. Yet this finding primarily held for individuals who had been socialized into religion as children, and who therefore already had those religious tools at their disposal when they needed comfort during a difficult time.

"This reinforces the role of family transmission as a way to shape religious beliefs and behaviours and to provide individuals with religious coping strategies," the authors of the European Societies wrote. They noted that their findings "suggest that under dramatic circumstances a short-term religious revival is possible, even in contexts where the process of secularization is ongoing." Their study also noted as one example that in the United States, one-quarter of adults said their faith had become stronger as a result of the pandemic.

Other studies have further complicated the analysis of the relationship between religion and the COVID-19 pandemic. A March study in the journal Frontiers in Psychology explored whether religion helped or hurt the psychological resilience of healthcare workers at a group of hospitals in southern Taiwan. While the German and Italian studies looked primarily at Christian faiths, this study was able to compare Christian and non-Christian religious groups, allowing the researchers to examine how specific religious values affect various measures of psychological well-being. They found that Buddhists and Taoists were less likely to experience mental distress and, as a result, saw an indirect increase in their level of happiness. Workers from Christian/Catholic backgrounds, by contrast, showed better psychological well-being but had a slower recovery rate over time than people from other religious faiths.

"Christianity values high-arousal positive states (such as excitement), whereas Buddhism values low-arousal positive states (such as calm)," the authors explained. In addition, Buddhists are encouraged to find "a balance between asceticism and self-indulgence" and emphasize values like "tolerance, non-violence, respect for the individual, and a belief in the fundamental spiritual equality of all humans" through collectivist cultural beliefs. Christians, on the other hand, "believe in a personal soul and internal attributes of a person who behaves in the way they do because of their traits or disposition and their personal faith in God."

Büssing also offered another observation, based on the results of his research, about what might separate people who find comfort in their faith from those who do not.

"We have to differentiate two approaches," Büssing wrote, namely one which assumes that God will intervene during hardships and one which places unconditional trust in God's judgment. "The first group may easily be disappointed and lose their faith, while the second remains close to God (even in their struggle) and expect 'dawn' in times of darkness."

Scientists pinpoint a key weakness in the omicron variant linked to its lower fatality rate

Although the omicron variant is spreading like wildfire through the United States, some scientists have cautiously expressed hope that its emergence could still mark the beginning of the pandemic's end. Perhaps it will mark the moment that COVID-19 becomes an endemic (as opposed to pandemic) virus like influenza; or it could be an auspicious sign that infected patients seem to get less sick than they would have from other strains.

As it turns out, that latter line of thinking may have some credibility. According to a new study by researchers at Hong Kong University, the omicron variant of SARS-CoV-2 has a more difficult time replicating in human lung tissue than either the delta strain or the original SARS-CoV-2 virus. Indeed, omicron was less than one-tenth as efficient as the original virus in this regard.

Medical researchers believe that this could explain why so many patients can withstand infections from omicron, and why in certain countries, such as South Africa, hospitalizations have been comparatively lower on a per capita basis. The hypothesis goes that COVID-19 becomes severe when it spreads from the respiratory system to the rest of the body; confining it to the upper airway (i.e., out of the lungs) therefore becomes essential in staving off severe symptoms.

If this finding is backed up in future studies, it could explain some of the mysteries surrounding omicron.

Since emerging as a prominent COVID-19 strain last month, omicron has quickly changed the course of the pandemic. Last week it caused Europe to post record numbers of coronavirus infections every single day, and the United States set new daily case records in what was certainly also an undercount. Its dominance could be seen in local statistics; in New York City, for instance, omicron has fueled a record level of COVID-19 hospitalizations.

Yet despite this bleak news, there have also been some more welcome signs. Although the omicron strain has caused an inevitable surge in COVID-19 cases, the hospitalization rate in the United States linked to the omicron surge has been far lower than in previous variant surges. A British report revealed that patients with omicron are half as likely to require hospitalization, and one-third as likely to need emergency care, as those who carry the delta variant. All of the studies found that patients who had been vaccinated were much less likely to develop serious illness if they became infected.

Perhaps the most revealing study was one conducted in South Africa, near where the omicron variant originated (it likely emerged in neighboring Botswana). In examining omicron cases in Gauteng province, the authors found that the share of people hospitalized during the omicron wave was roughly one-third of the share hospitalized during the delta wave — a drop of about 10 percentage points, down to 4.9 percent. People who were hospitalized stayed for roughly half the time (4 days instead of 7 or 8 days), a statistic no doubt linked to the fact that less than 30 percent of omicron patients met the regional criteria for severe disease — about half the share seen with prior variants.

As the authors wrote in their study: "During the first four weeks of the Omicron-dominated fourth wave, the proportion of patients requiring hospital admission was substantially lower and those admitted had less severe illness, with fewer requiring oxygen, mechanical ventilation and intensive care compared to the first four weeks of the Beta- or Delta-dominated waves in Gauteng Province in South Africa."

The authors of the Hong Kong University study cautioned observers against reading too much into their conclusion. For one thing, the paper has yet to be peer reviewed, and its conclusions need to be further tested for definitive confirmation. In addition, there are other mechanisms for severe COVID-19 infection besides passing through the lungs.

"It is important to note that the severity of disease in humans is not determined only by virus replication but also by the host immune response to the infection," lead author Dr Michael Chan explained in a statement.

Can Democrats break the midterm curse? Consider the example of 1934

Now that Joe Manchin has sounded the death knell — at least for the moment — for Joe Biden's Build Back Better package, Democrats are doomed in the 2022 midterm elections.

Or, wait: Are they? Sometimes the "laws" of politics (or economics) are characterized as immutable, akin to the laws of physics. They're not, of course. Not a single ballot has been cast in the midterms. If Democrats can find a way to turn out their voters at unexpectedly high levels, while Republicans don't, they could turn 2022 into another blue wave. Political trends do not govern us; they are the results of human behavior, which is never entirely predictable.

Of course the apparent collapse of Build Back Better doesn't help. But the pattern that every political analyst and historian understands may be the real problem: Since the modern era of American politics began with Franklin D. Roosevelt's election in 1932, the party that controls the White House has lost congressional seats in 19 of the 22 midterm elections. Two of the three exceptions, in 1998 and 2002, were special cases with little relevance to the Democrats' predicament in 2022. In 1998, Democrats benefited from a booming economy and popular backlash against the Republican effort to impeach Bill Clinton over a sex scandal. The 2002 midterm elections came just a year after the 9/11 terrorist attacks; patriotic sentiment was running high and George W. Bush had successfully defined himself (for the moment) as a "war president."

It's that third exception, way back in 1934 — during the first of Roosevelt's three-plus terms as president — that may offer an instructive example. Those midterms came after FDR and the Democrats had passed a series of ambitious and historic laws, known collectively as the New Deal. While the Great Depression certainly didn't end immediately, the New Deal put millions of people to work and did a great deal to relieve suffering and despair. Despite Roosevelt's reputation (then and now) as a progressive president, the underlying premise of the New Deal was more pragmatic than ideological: Economic insecurity, poverty and hunger were a threat to social stability and indeed to the capitalist system; creating a social safety net was understood as a matter of urgent importance. A few years later, Roosevelt put it this way in his 1944 State of the Union address: "We have come to a clear realization of the fact that true individual freedom cannot exist without economic security and independence. 'Necessitous men are not free men.' People who are hungry and out of a job are the stuff of which dictatorships are made."

Within his first 100 days, Roosevelt had done a great deal to restore confidence in government, providing emergency financial aid to those who were struggling, passing the National Industrial Recovery Act and the Securities Act, and implementing a number of other regulatory and relief measures. Indeed, his first term was among the most productive in presidential history, and resulted in a fundamental restructuring of the federal government, which from that point onward was a more direct presence in ordinary people's lives than ever before. Conservatives characterized this as a dangerous intrusion on personal freedom (and have done so ever since); what we now call "liberalism" coalesced around the idea that government action was sometimes necessary to help the most vulnerable people in society.

Despite these legislative successes, however, political success was not guaranteed to follow. Persistent unemployment was still painfully high, and corporate America had begun to impose wage cuts. In the era before sophisticated polling, pundits could not scientifically assess a president's popularity like they can today. For all the Democrats knew, Republican warnings about creeping socialism had led to widespread panic, and conservatives might turn out in record numbers to halt a supposed red menace. On the other end of the spectrum, some Democratic radicals and socialists were frustrated with the Roosevelt administration's piecemeal approach to issues of social and economic justice. Sen. Huey Long, the legendary Louisiana populist who advocated a massive program of wealth redistribution and federal spending, was planning to run against Roosevelt for the Democratic nomination in 1936 (and probably would have, if he hadn't been assassinated in 1935).

In the event, Democrats did remarkably well in the 1934 midterms, gaining nine more seats in the House — and also an extraordinary nine seats in the Senate, giving them a supermajority in that chamber, with 69 of the 96 seats. Richard Walker, director of the Living New Deal Project and professor emeritus of geography at the University of California, Berkeley, told Salon by email that the Democrats' big win resulted from two principal factors. Most obviously, FDR and his party were seen as taking decisive steps to address the economic crisis.

"The Depression was still bad and the Republicans had no new ideas since 1932," Walker wrote, "so even though FDR had not solved anything definitively yet, no one was about to bring back the utterly failed Hooverites. Does that count as missteps of the GOP? Well, they had lots of time to do badly from late 1929 to early 1933 and people hadn't forgotten yet."

Joe Biden is no doubt aware of that history, and very likely intended Build Back Better as his own legacy-setting achievement, substantial enough to shift the political tides. Whether that package can yet be resuscitated remains unclear, but his fundamental problem persists: The Democrats have razor-thin majorities in Congress and remain unwilling to end the Senate filibuster. Without a major legislative win, the Democrats' last remaining hope is to run a negative campaign and convince voters that a Republican victory would be catastrophic. Pending decisions at the Supreme Court, including the possible or likely overturn of Roe v. Wade, could produce a political backlash that helps the Democrats hold control of Congress.

There are again vague similarities to the 1934 midterm elections, when Democrats successfully depicted Republicans as extremists, although in a different sense than today: They were associated with the wealthy elite, with businessmen who lived in mansions and held to laissez-faire dogmas radically out of touch with the lives of ordinary people. Returning to the disastrous economic policies of Herbert Hoover's administration, Democrats argued, would be a dreadful mistake. In 2022, the threat posed by a recently defeated Republican administration has taken a more literal and even more dangerous form, with Donald Trump and his supporters using fascist language and tactics and overtly seeking to overthrow democratic institutions.

Calling out that extremism wasn't enough for Democrats in the recent Virginia gubernatorial election, but that doesn't prove it wouldn't work on a national scale, if pursued more aggressively and effectively. If there's an applicable lesson for Democrats in the 1934 midterms, it might be this: The incumbent party can win, but only if it makes an overwhelmingly persuasive and urgent case that its opponents are dangerous and that the future will be truly bleak if they prevail. Given the circumstances, that's a highly plausible argument.

The Black Death had its own 'truther' movement

While the COVID-19 pandemic has been an inflection point of modern history, it is nowhere close to being the deadliest pandemic in human history. That dubious distinction belongs to the infamous "Black Death," a bubonic plague that swept through Europe and the Near East in the mid-14th century. Like COVID-19, the bubonic plague was a terrible way to die, but with very different symptoms. The most notorious were the dreaded buboes (hence 'bubonic plague'), severe inflammations of the lymph nodes that oozed pus and broke out over the groin, armpits and neck. A victim's skin and flesh would eventually turn black and rot, although long before this happened they would experience intense nausea, vomiting, fever, headaches and aching joints. Within days — or, at most, a couple weeks — the infected person would die.


One might imagine that a disease this terrible would have been burned into humanity's collective consciousness. Indeed, the Black Death did have a profound impact on our day-to-day lives, influencing everything from the professionalization of medicine and the decline of feudalism to works of art like Giovanni Boccaccio's book "The Decameron" and Ingmar Bergman's movie "The Seventh Seal." Yet the Black Death is not often mentioned in reference to, or in contrast with, the COVID-19 pandemic — even though there are important parallels. Perhaps most tellingly, both diseases fueled scapegoating and mass hysteria because of widespread ignorance.

While the scientific illiteracy of the COVID-19 era is fueled by a mixture of motivated reasoning, political bias and historically based concerns about institutional trustworthiness, inhabitants of the Middle Ages lacked modern humanity's sophisticated knowledge about biology. Louis Pasteur did not develop modern germ theory until the 19th century, half a millennium after the Black Death. Today we know that the Black Death was a disease, and that the microorganism responsible was most likely imported from Asia through the Crimea into Europe and the Near East by way of fleas living on black rats. People who lived in Europe, North Africa and the Middle East in the 1340s and 1350s could not have even imagined what a microorganism was, much less the complex chain of events that would have brought such a deadly one into their homes.

In the absence of knowledge, some ugly alternative theories emerged. Because Jews had been a popular scapegoat in Europe for centuries, a wave of pogroms broke out against Jewish communities during this time as they were blamed for the plague. For years Jews had been collectively blamed for the death of Jesus Christ and accused of sinister blood rituals; around the time of the Crusades, the stereotype of Jewish wealth also emerged, reinforced in anti-Semitic minds by the fact that Jews were barred from owning land and therefore disproportionately concentrated in finance. Attacks on Jewish communities were commonplace before the Black Death, but they now occurred with renewed vigor and ferocity because the attackers had a new pretext. Jews were accused of poisoning wells and of other conspiratorial actions, all somehow connected back to alleged vendettas against Christianity, a desire for money, ominous religious practices or some combination of the three. Victims were tortured into confession and exterminated in large numbers.

There is an obvious parallel between this and the rise of anti-Chinese prejudice during the COVID-19 pandemic. While nowhere near as pervasive as anti-Semitic sentiment during the Black Death, there have been thousands of anti-Asian hate incidents in the United States since COVID-19 reached our shores. These have ranged from taunts and slurs to acts of physical violence. And the rhetoric of certain politicians, who reinforce and encourage the scapegoating of China — or even promote unfounded conspiracy theories that China somehow created the virus — hasn't helped.

"I've been saying this for a year: The rhetoric from Trump has emboldened people to openly speak in an anti-Chinese way, which — being Asian American in the United States, part of the stigma is people can't tell Asians apart, we're forced into a racial group and lumped together," Rosalind Chou, an associate professor of sociology at Georgia State University, told Salon in April. "I'm Taiwanese American, but people walking down the street couldn't differentiate, right? [...] I've been saying for a year that people are going to get hurt if we keep placing blame and calling COVID-19 the 'China virus,' if we have radio talk show hosts and news reports constantly using rhetoric that is anti-Chinese."

Scapegoating during the plague era wasn't confined to Jews. Setting aside lepers and other unfortunate individuals from marginalized groups who were also sometimes blamed for the plague, medieval people had a wide range of theories about who was behind the Black Plague. Some turned to astrology for an explanation, as well as a possible cure. Many religious people believed it was God's wrath or Satan's scourge; flagellants, religious penitents who flogged themselves in public and beseeched the almighty for forgiveness, became a common sight at this time. More educated people subscribed to the idea that miasmas, or "poisoned air," were responsible for causing disease. (This was probably the closest anyone came to the truth without knowing anything about microbiology.)

There is a lesson in humility there: There may be much we don't know about COVID-19 in our era that will become common knowledge within a handful of generations. Likewise, there are parallels between the people who saw deities and devils behind every bubonic sore and blister, and those who insist that analogously sinister conspiracies are at work behind current events today. Some of these ideas are outlandish, such as claiming that Bill Gates or George Soros is somehow behind the whole thing; others have a measure of plausibility, albeit a grossly exaggerated one, such as the idea that the virus may have originated in a Chinese laboratory. Just as the flagellants and anti-Semites of medieval Europe drew on pre-existing religious traditions to color their interpretations of the Black Plague, so too do people who were conspiracy-minded before the pandemic turn to those kinds of explanations during it.

"The people who are believing in those conspiracy theories were likely believing in similar conspiracy theories before the COVID-19 pandemic, and they're just applying that style of thinking to this new thing," Joseph E. Uscinski, a political scientist at the University of Miami, told Salon last year. "Basically what we find is that the people who buy into these sorts of conspiracy theories do so because they have what we call underlying conspiracy thinking, meaning that they see the world through a conspiratorial lens."

Not all of the comparisons between the Black Plague and COVID-19 are foreboding. As briefly mentioned earlier, the Black Plague drew attention to how medieval practitioners of medicine usually had no idea what they were doing. This planted seeds that eventually grew into a systematized, scientific approach to healing the human body — in short, the birth of modern medicine. While humanity was thankfully much further along in biotechnology by the 2020s, the pandemic helped jump-start the development of a new class of vaccine technology: the mRNA vaccines mass-produced by Moderna and Pfizer/BioNTech, which could revolutionize medicine. Everything from cancer vaccines to universal influenza inoculations is within the realm of possibility thanks to this platform, which trains cells to produce proteins that the immune system learns to recognize — in this case, proteins associated with the SARS-CoV-2 virus.

By the time it had finished peaking (1347 to 1351) and ravaged most of the Western world, the Black Death had claimed anywhere from 75 million to 200 million lives. The COVID-19 pandemic's death toll, though no less tragic, is at the time of this writing just shy of 5.4 million, with more than 800,000 of those in the United States. Fortunately, this is a small fraction of the total human population of more than 7.7 billion today; while it is impossible to know for sure how many people were alive in the mid-14th century, most estimates place it around 300 to 400 million. Reflecting on how far humans have come, we can at least be grateful to live at a time when science has brought us so many miracles of medicine — even if it hasn't yet cured the miasma of misinformation.
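For scale, here is a minimal back-of-envelope sketch (in Python, purely illustrative, using only the approximate figures cited above) of what those death tolls mean as shares of the world population at the time:

```python
# Rough comparison of pandemic mortality as a share of world population,
# using only the approximate figures cited in the paragraph above.
def share(deaths, population):
    """Return deaths as a percentage of the population at the time."""
    return 100 * deaths / population

# COVID-19 (as of this writing): ~5.4 million deaths out of ~7.7 billion people
covid_share = share(5.4e6, 7.7e9)        # roughly 0.07 percent

# Black Death (1347-1351): 75-200 million deaths out of ~300-400 million people
black_death_low = share(75e6, 400e6)     # roughly 19 percent
black_death_high = share(200e6, 300e6)   # roughly 67 percent

print(f"COVID-19: ~{covid_share:.2f}% of humanity")
print(f"Black Death: ~{black_death_low:.0f}%-{black_death_high:.0f}% of humanity")
```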

Scientists can't explain the origin of the omicron variant — but all of their theories are troubling

According to the Centers for Disease Control and Prevention (CDC), the omicron variant is now the dominant strain of COVID-19 in the United States. Nearly three out of four new infections are from this mutant virus — a sixfold increase from where omicron infections stood last week, and an even more startling figure considering the first reported case of omicron in the United States was less than a month ago.

Scientists have made remarkable strides in understanding the origin and spread of COVID-19, which is part of what makes the omicron variant so shocking: its origins are perplexing, as it didn't stem from other recent prominent strains like the delta variant. The confusion around its origins creates added hurdles in terms of treating it.

Besides its incredible transmissibility, here's why the omicron variant is so scary: Omicron has 30 mutations located on or near its spike protein — the thorn-like protrusions studding the SARS-CoV-2 virus' central sphere. Because the existing mRNA COVID-19 vaccines are designed to train the immune system to recognize those spines as intruders, mutations on the spike proteins may help the virus evade the body's attempts to defend itself, and perhaps partially evade existing vaccine-based immunity.

So how did omicron rack up so many mutations on its spike proteins, without any intermediate steps of evolution through other variants? Scientists have theories about how that happened, though none are comforting.

First, note that mutations are, to some degree, expected of a virus. As the novel coronavirus began to lose battle after battle to human immune systems and to human ingenuity (vaccines), the "survivor" viruses tended to be the ones that mutated in ways that effectively warded off human efforts at immunity. Those survivors then pass those traits on to the offspring viruses they create through replication. Thanks to genetic technology, researchers have been able to study those mutant strains and learn about SARS-CoV-2's "family tree," so to speak — that is, the relationships between all the variants that stemmed from one another.

Here's where it gets weird. There is a big gap in the omicron variant's timeline.

Sequence characteristics in any virus' genome can be matched in databases with other strains so experts can deduce their origins. Scientists trace these family trees to learn more about a virus' lineage, and in the hope that this information will help them defeat it. Yet the most recent identifiable sequences on the omicron variant's genome originate from over a year ago, all the way back to the middle of 2020. This means that scientists cannot link it to currently circulating strains. Yet they know for sure that this strain is very different from the original SARS-CoV-2 strain that brought the world to its knees at the beginning of 2020.

So what explains that gap? Where did the omicron variant come from?

One hypothesis is that it developed in an immunocompromised COVID-19 patient. While there is no direct evidence that this happened, scientists do know that viruses can become stronger in the body of a person with a weak immune system, because they circulate for longer — continuing to mutate as they evade the patient's weakened immune system. A virus that circulates for months in the body of an immunocompromised patient might be able to develop superior survival skills by developing defenses against human antibodies.

Richard Lessells, an infectious disease specialist at the University of KwaZulu-Natal in South Africa, saw this in action. Lessells observed SARS-CoV-2 samples from the body of a female HIV patient (who had received improper treatment). Over a period of roughly six months, the virus adapted and changed quite a bit in her body.

"Because we had samples from a few different time points over that six-month period, we could show how the virus evolved and variants with some of the same mutations as the variants of concern appeared over time in the samples," Lessells told NPR.

Writing for Forbes, Dr. William Haseltine — a biologist renowned for his work in confronting the HIV/AIDS epidemic and currently the chair and president of the global health think tank Access Health International — observed that there have been a number of cases in which mutant variants incubated in immunocompromised COVID-19 patients who were treated with antiviral drugs and antibodies because they could not fully shake their infections. These cases have been found in Italy, the United Kingdom and American cities like Boston and Pittsburgh.

Of course, these are merely theories — it has not been proven that omicron originated in an immunocompromised patient. These studies and theories merely demonstrate that such a development could have happened.

Haseltine also touched upon the next much-discussed possibility: that the omicron variant emerged from a process known as reverse zoonosis — that is, a situation in which a virus that originated in another animal jumps to humans, then back to animals, and then back to humans again. The COVID-19 pandemic originated from the first step of that process (jumping from an animal, probably a bat or a pangolin, into humans), and the hypothesis is that the virus somehow jumped from a human to an animal and then back to a human.

Indeed, the SARS-CoV-2 virus has proved troublingly adept at infecting animals that regularly come into contact with humans. The mink farming industry has taken a hit (quite possibly a fatal one) because COVID-19 has infected huge numbers of the animals raised for their fur for the fashion industry. Likewise, the virus has infected dogs, cats and American deer. Zoo animals like lions, giraffes and two-toed sloths have also gotten sick. While there is no evidence that the SARS-CoV-2 strains that entered these animals have managed to reinfect humans, that does not mean it would be impossible.

Not everyone buys that the omicron variant could have emerged that way. Trevor Bedford, a computational virologist and professor at the Fred Hutchinson Cancer Research Center in Seattle, told NPR that he doubts the omicron variant started in an animal because he does not see residual genetic material from those animals in its genome, but instead an insertion of human RNA. This "suggests that along [omicron's evolutionary] branch, it was evolving in a human."

Haseltine has also advanced the hypothesis that the omicron variant might have arisen because of human intervention. In his Forbes editorial, he suggested that a COVID-19 patient treated with the Merck drug molnupiravir might have inadvertently incubated the omicron variant. Molnupiravir works by inserting errors into a virus' genetic code, making it harder for the virus to reproduce and therefore easier for the immune system to defeat. Yet Haseltine claims that if molnupiravir is not administered properly (such as by not being taken over the full five-day period), or even if it is used correctly but everyone involved is simply unlucky, it could produce a heavily mutated virus strain.

Haseltine noted that an FDA analysis of the drug's clinical trial results showed that patients who had taken molnupiravir had more viral variation than those who did not, including 72 emergent spike substitutions or changes among 38 patients who took that drug. A Merck spokeswoman told the Financial Times that Haseltine's "unfounded allegation has no scientific basis or merit" and added that "there is no evidence to indicate that any antiviral agent has contributed to the emergence of circulating variants."

While the omicron variant is more transmissible than other SARS-CoV-2 strains, it does not yet appear to be more deadly. However, experts believe it will overtax America's health care system because it will infect so many people, some of whom will inevitably become seriously ill.

"The two vaccinations, typically of any of the vaccines, offer virtually no protection against infection and transmission," Haseltine told Salon earlier this month. "Three vaccinations offer only very temporary protection after three months."

Dr. Monica Gandhi, an infectious disease doctor and professor of medicine at the University of California–San Francisco, told Salon several days ago that the omicron variant is "more transmissible and will cause a wave of new infections," but added that "there is now evidence that omicron is less severe than previous strains." She added that scientists do not yet know "if this is because of increasing cellular immunity in the population in December 2021 versus an inherent property of the strain that makes it less virulent."

In historic address, Biden tells unvaccinated they are playing with 'life or death'

In a historic public address, President Joe Biden spoke on Tuesday about new measures for fighting the omicron variant — a mutant strain of the SARS-CoV-2 virus that now accounts for nearly three-fourths of new COVID-19 infections in the United States — and specifically announced that the government is purchasing 500 million at-home COVID-19 tests for free distribution.

While delivering these remarks, Biden echoed one of the main themes of his September national address, in which he said that the COVID-19 pandemic had become "a pandemic of the unvaccinated." Recalling that speech, Biden stated plainly that those who refuse vaccination are hurting both themselves and others.

"I promised when I got elected that I would always give it you straight from the shoulder," Biden opened, explaining that he felt it was necessary to answer questions about the omicron variant as Americans head into the Christmas weekend. "If you are not fully vaccinated, you have good reason to be concerned," he warned.

The president pointed out that unvaccinated Americans are at a higher risk of both getting seriously sick from the omicron variant and spreading the virus to other people. He also noted that almost everyone who has died from COVID-19 over the past few months has been unvaccinated, and argued that they are at high risk during the holidays while vaccinated Americans are much less so — especially those who received a booster shot. As such, Biden urged unvaccinated Americans to get their shots.

"It's free. It's convenient. I promise you it has saved lived. And I honest-to-God believe it is your patriotic duty," the president told the American people. He later observed that individuals who are unvaccinated are playing with "life or death" not only for themselves, but for the people around them — both by spreading the infection and by overwhelming hospital systems that need to care for them and those they infect. The president characterizing getting vaccinated as "the only responsible thing to do."

Biden also tried to strike an optimistic tone, stating that there are "three big differences" between where America was when the pandemic reached our country in March 2020 and where it stands today. One is that 200 million Americans have received COVID-19 shots, meaning that if they get infected they are unlikely to develop severe symptoms. In addition, Biden pointed out that "we are prepared today for what's coming," which was not the case when the virus first reached the United States. Finally, Biden said that "we know a lot more today than we did in March 2020," which has provided more resources to keep schools open and made it possible to vaccinate children over the age of five.

"We should all be concerned about omicron, but not panicked," Biden explained.

The president also mentioned a COVID-19 action plan that his administration had already developed for the winter months, adding that it is being revised to account for new conditions. This includes sending hundreds of additional vaccinators to sites throughout the United States and ordering FEMA to create vaccination sites in areas where there is high demand. Biden also gave credit to President Donald Trump's administration (while not mentioning the former president by name) for having the government invest in the programs that helped America develop COVID-19 vaccines. (It is unclear if Biden mentioned this because Trump was booed by his own supporters on Sunday for saying that he had received a booster shot.)

Biden also told the American people that, starting next month, the administration will make sure Americans have access to free rapid at-home COVID-19 tests by purchasing half a billion of those tests to be distributed to anyone who requests them online. He also said that, as of this week, the federal government will set up emergency testing sites in areas that desperately need them, such as New York City (where there has been a surge of omicron cases). Biden promised that the government will send PPE (personal protective equipment) to people who need it and dispatch more doctors, nurses and medics to underserved areas. Finally, he mentioned a recent court decision that reinstated his vaccine mandates, which he characterized as necessary for protecting public health and keeping businesses open.

Salon spoke to public health experts and doctors who characterized the president's measures as welcome and "appropriate."

"What the president has described is an appropriate and much needed intensification of vaccination, boosters, masking and at-home testing," Dr. Russell Medford, Chairman of the Center for Global Health Innovation and Global Health Crisis Coordination Center, told Salon by text message. "The commitment of the federal government to 500 million at-home tests, free of charge, delivered directly to peoples homes, is a critically needed first step."

Medford added that it was "notable" that Biden did not call for lockdowns and said he was "somewhat surprised" that Biden "did not re-iterate and emphasize his administration's plans to make Paxlovid [a new anti-COVID-19 pill] available to the American public and globally."

Dr. Monica Gandhi, infectious disease doctor and professor of medicine at the University of California–San Francisco, also critiqued Biden for not mentioning Paxlovid. Gandhi characterized Paxlovid as "a very exciting option to prevent hospitalizations and deaths among the unvaccinated by 88%, [which] still works against variants." She said she wished Biden "would have given a timeline for the approval of this medication."

That said, Gandhi also had praise for Biden's speech, arguing that "the emphasis on increasing testing is a good one in light of the Omicron variant." She also praised his emphasis on the importance of getting boosters.

Dr. Alfred Sommer, dean emeritus and professor of epidemiology at Johns Hopkins Bloomberg School of Public Health, told Salon by text message that he agrees with Biden's message about getting three doses of the vaccine as being "the most important thing we can do." He felt that Biden encouraging people to vacation with their family is "acquiescing with what everyone is going to do anyway and is 'probably' right," although Sommer added that he does not know why the Centers for Disease Control and Prevention (CDC) and other relevant agencies have yet to release preliminary data on the vaccination status of omicron patients.

"The implications one could draw from such data would have wide confidence limits since there are so many other variables to consider," Sommer explained, since there are variables like degree of vaccination and degree of exposure.

Sommer said such data would be helpful in deciding what kinds of gatherings were safe.

"If I knew that 80 percent of people in an area were fully boosted, and none of them were hospitalized, or only 10 percent of the hospitalized had been boosted, there would be some good evidence that family gatherings were indeed safe for the boosted," he added.

While Biden's speech was scientifically sound, it remains to be seen whether it will be rhetorically effective. In other words, will it have the galvanizing effect on unvaccinated Americans that the president seems to hope for?

One political historian was marginally hopeful.

"By telling it straight about omicron, President Biden presented a needed contrast to the gaslighting of the pandemic and the peddling of quack cures that we heard from the previous president," Dr. Allan Lichtman, a political historian at American University, told Salon by email. "He delivered a down-to-earth, practical message. The president told Americans what they need to do to say as safe as possible and explained what his government is doing to help them say safe. He issued a needed warning to the purveyors of misinformation, although it is unlikely that they will heed his words."

Lichtman added, "Ultimately, however, the success of any message is dependent on actual progress on the ground for the American people. As Robert Straus, the campaign chair for President Jimmy Carter, said in explaining Carter's loss in 1980: 'The real world is all around us.'"

Good news: Study says third shot of Pfizer's COVID-19 vaccine protects well against omicron variant

Fears about the omicron variant's ability to evade vaccines may end up being overblown — at least, if enough humans are inoculated with a third dose of Pfizer-BioNTech's COVID-19 vaccine.

According to Pfizer-BioNTech, the third, or "booster," dose of their patented vaccine protects patients against the mutant SARS-CoV-2 strain with an effectiveness comparable to how two doses defend against other common strains.

The company determined this based on laboratory studies that revealed comparable levels of neutralizing antibodies — the proteins produced by the immune system to identify and neutralize pathogens. There was a 25-fold drop in these neutralizing antibodies among patients who had received only two doses of the Pfizer vaccine, suggesting that the omicron variant may more easily evade the protections in people who lack the extra dose. If future research replicates these results, three doses of the Pfizer shot will likely be enough to keep the vast majority of patients from developing serious infections.

The challenge when fighting an infectious disease like COVID-19 is making sure that vaccines can protect patients even after the pathogens have evolved. Vaccines work by training the immune system to identify specific markers within a given pathogen; if that pathogen mutates beyond recognition, it weakens the body's ability to detect and stop invading viruses.

Experts are concerned about B.1.1.529 (the omicron variant's formal name) because it has 50 mutations, 32 of which are in the spike protein. The spike protein is so named because, like spines on a sea urchin, it pokes out from all sides of the SARS-CoV-2 virus; such proteins are intrinsic to the virus' ability to enter and infect human cells. The mRNA vaccines work by training the immune system to identify those proteins, meaning that any alterations to their structure could theoretically help the virus beat the vaccine's protection.

"The 32 mutations across the spike protein doesn't mean that it evades immunity, but it is the most [mutations] we've seen," Dr. Monica Gandhi, infectious disease doctor and professor of medicine at the University of California–San Francisco, told Salon last month. "In one region in South Africa, cases [of omicron] are going up really fast — it's really just dominating the screens there — and that made some people say, 'it looks like this is really transmissible' because we thought that delta was the pinnacle of being transmissible right now."

The news about the latest Pfizer study follows an earlier announcement in which scientists revealed the company's vaccine offered at least partial protection against omicron. The leader of that study team, Dr. Alex Sigal of the Africa Health Research Institute in Durban, noted that humans are lucky the omicron variant still infects cells through the ACE2 receptor, which allows existing vaccines to retain partial effectiveness against the variant.

"Imagine if this virus had found a different receptor to bind to," Sigal commented. "Then all of our vaccines would have been trash."

It remains to be seen if the news about the Pfizer booster is a game changer in the fight against the omicron variant. Before the news was announced, virologist Dr. Jesse Bloom from the Fred Hutchinson Cancer Research Center in Seattle told The New York Times that "given the very large drop in neutralizing antibody titers that are seen here with omicron" it makes sense to move "as fast as possible with making omicron-specific vaccines, as long as it seems like there's a possibility it could spread widely."

The omicron variant has now reached the United States — here's what you should know

The omicron variant of COVID-19 has, according to the Centers for Disease Control and Prevention (CDC), reached the United States.

The virulent coronavirus strain, which was first detected in South Africa, was confirmed to have arrived in California via a person who had recently visited that country. According to President Joe Biden's health adviser Dr. Anthony Fauci, the individual was fully vaccinated but had not yet received a booster shot, meaning that theirs constitutes a breakthrough case. (Fauci added that policymakers may need to redefine what it means to be fully vaccinated.)

This person only displayed mild symptoms, which are now improving, and has self-quarantined since being diagnosed. Everyone who had close contact with that individual has tested negative, although the patient continues to test positive.

Speaking to the American public last week about the omicron variant, Biden closed America's borders to countries where the variant had been identified, urged fully vaccinated Americans to get booster shots so they could protect themselves, and pleaded with those who were not vaccinated to change that. He also suggested that the international community address the problem of vaccine inequality, noting that no one will be safe from outbreaks if only citizens of affluent countries have unfettered access to inoculations.

"The United States has already donated more vaccines to other countries than every other country combined," Biden claimed. "It is time for other countries to match America's speed and generosity."

The omicron variant is alarming to medical experts because B.1.1.529 (omicron's formal name) has 50 mutations, 32 of which are in the spike protein. That protein, so-named because it constitutes the spikes that stick out of the virus' central sphere like spines on a sea urchin, helps SARS-CoV-2 viruses enter your cells and cause infections. Since mRNA vaccines work by helping the immune system identify SARS-CoV-2 spike proteins, any mutations to the spike protein could help coronaviruses evade the vaccines. This is why other strains with these mutations have also been labeled as "variants of concern."

The omicron strain also has new mutations, including one that may make it more effective at infecting cells. Another study (which has yet to be peer reviewed) points out that omicron shares a mutation with alpha and mu that might help the virus replicate more quickly.

"The 32 mutations across the spike protein doesn't mean that it evades immunity, but it is the most [mutations] we've seen," Dr. Monica Gandhi, infectious disease doctor and professor of medicine at the University of California–San Francisco, told Salon yesterday. "In one region in South Africa, cases [of omicron] are going up really fast — it's really just dominating the screens there — and that made some people say, 'it looks like this is really transmissible' because we thought that delta was the pinnacle of being transmissible right now."

In addition to international vaccine inequity, new COVID-19 outbreaks and mutations have also been facilitated by the fact that many people who have access to vaccines refuse to get them. Biden announced a round of vaccine mandates in September as a means of closing that gap.

The Revolution of 2020: How Trump's Big Lie reshaped history after 220 years

There are few words as overused as "revolution," which has many Merriam-Webster definitions and here means "a fundamental change in political organization." While people who discuss politics are prone to dramatic talk of "revolutions," few of the American presidential elections described that way really merit the term. Franklin D. Roosevelt's "revolution" of 1932 changed the nature and role of government in American life, and Ronald Reagan's election in 1980 undid at least some of those changes. But neither election literally altered how our democracy functions.

Those disqualifying details do not apply to the most recent election — the first one ever in which a losing president refused to admit defeat. It's reasonable to describe that as the Revolution of 2020.

There have been two previous elections that could be defined as revolutionary. The more recent was in 1860, which both revealed and reflected a profound rift in the American polity and led directly to the Civil War. Eleven Southern states decided to secede after the Republican victory because they feared Abraham Lincoln's presidency spelled doom for the economic system based on chattel slavery. This story is relevant to the Donald Trump era, but the earlier revolutionary election is our main topic here — and that one is actually known as the Revolution of 1800. As things turned out, it was the rarest kind of revolution: One with a happy ending.

George Washington had served two terms as America's first president, but without facing meaningful opposition or anything resembling a modern election campaign. After he decided not to seek a third term, the 1796 election became the first to feature serious competition between the nation's brand new political parties. Federalist candidate John Adams, who had been Washington's vice president, ultimately prevailed over Thomas Jefferson, former Secretary of State and candidate of the Democratic-Republican Party. But in the momentous election of 1800, Jefferson won the rematch, and Adams — the first incumbent president to be defeated — faced a historic decision: Would he come up with some excuse to cling to power, or simply hand the reins of state to Jefferson and walk away?

If Adams had cried fraud or otherwise claimed the election was illegitimate, he could almost certainly have stayed in office despite electoral defeat. In fact, he wouldn't really have needed much of an excuse. There was almost no precedent for national leaders voluntarily stepping down in the face of popular rebuke, and plenty of examples — from ancient to modern times — that pulled any aspiring dictator in the opposite direction. But Adams was invested in democracy's success, and as such swallowed both his pride and his genuine concerns about Jefferson's political philosophy. He followed the law and surrendered power, and in the process, demonstrated how important the conduct of losing candidates is to democracy. (Like Trump, Adams skipped his successor's inauguration, but he never tried to delegitimize his erstwhile rival's presidency.)

Nearly two decades later, it was Jefferson himself who described Adams' actions as the "Revolution of 1800," comparing it to the better-known revolution that had begun 24 years earlier:

... that was as real a revolution in the principles of our government as that of 76. was in it's form; not effected indeed by the sword, as that, but by the rational and peaceable instrument of reform, the suffrage of the people. the nation declared it's will by dismissing functionaries of one principle, and electing those of another, in the two branches, executive and legislative, submitted to their election...

The ideals of self-government captured in the Declaration of Independence, Jefferson was suggesting, did not become reality until American democracy passed its acid test: The person entrusted with the most powerful office in the land accepted a painful verdict. It had been difficult enough for Washington to leave the presidency, even though he was eager to live out his last years as a civilian. For Adams, it was even worse: He badly wanted to continue as president, and on some level expected to win re-election. In accepting defeat, he proved that democratic government wasn't just an ideal. It was also workable.

Adams' precedent was followed without question for the next 220 years, with 10 incumbent presidents leaving office voluntarily after the voters kicked them out. Most were bitterly disappointed by defeat, it's fair to say, but none of Adams' spurned successors — from his own son in 1828 to George H.W. Bush in 1992 — tried to concoct conspiracy theories in order to claim they hadn't really lost. Then came the Revolution of 2020, when the guy who became famous for telling people "You're fired!" on a reality show refused to accept being canned. In doing so, he broke the precedent set in the Revolution of 1800 by America's first fired president.

Students of history like me, who spent a lifetime before the 2020 election immersed in the story of American politics, instantly recognized the magnitude of what Trump was doing — and understood that he represented a tendency George Washington himself had warned the young nation about. Washington directly identified the fundamental ingredients that made Trump's coup attempt possible, starting with a political party so fanatically determined to win that its motivated reasoning could overpower common sense and basic decency. Once the Republicans filled that role, they just needed a demagogue who was sufficiently unscrupulous to exploit that. (Politics is full of egotists, so it says something that Trump was the first politician narcissistic enough to qualify.)

After pointing out that "the very idea of the power and the right of the people to establish government presupposes the duty of every individual to obey the established government," Washington expresses concern that divisive political sentiments — particularly party zealotry — were "destructive of this fundamental principle, and of fatal tendency." In elaborating on this, Washington almost seemed to be looking into the future with a crystal ball. He could hardly have described the hyper-partisanship of our time more precisely if he had actually known about Donald Trump, with all his odious followers and ludicrous assertions. Political parties, Washington writes,

... serve to organize faction, to give it an artificial and extraordinary force; to put, in the place of the delegated will of the nation the will of a party, often a small but artful and enterprising minority of the community; and, according to the alternate triumphs of different parties, to make the public administration the mirror of the ill-concerted and incongruous projects of faction, rather than the organ of consistent and wholesome plans digested by common counsels and modified by mutual interests.

However combinations or associations of the above description may now and then answer popular ends, they are likely, in the course of time and things, to become potent engines, by which cunning, ambitious, and unprincipled men will be enabled to subvert the power of the people and to usurp for themselves the reins of government, destroying afterwards the very engines which have lifted them to unjust dominion.

Washington was able to foresee all of this because the early American republic was not that different from our own time. Though the issues in the 1800 election may seem remote in 2021, Americans were no less invested in politics. Jefferson was alarmed by the way Washington and Adams had centralized control of economic policy in the federal government, such as by creating a national bank, and was convinced their foreign policies were too friendly toward Britain. He was also appalled by the Alien and Sedition Acts, which brutalized immigrants and violated the First Amendment rights of political dissidents. For his part, Adams viewed Jefferson as a libertine and radical whose ideas might push America into the bloody chaos that had overwhelmed France after the revolution of 1789. And all this vitriol was just from the campaign. After the election was decided, Aaron Burr, Jefferson's running mate, tried to seize the presidency for himself through Machiavellian backroom dealings, prompting a serious constitutional crisis and the swift enactment of the 12th Amendment as a corrective.

None of the hostile rhetoric between Trump supporters and Joe Biden supporters in last year's election can match the sheer bile that Adams, Jefferson and their various partisans flung at each other in 1800. The difference, of course, is that only the Republican Party, after being cannibalized and devoured from within by the Trump faction, has actually failed the ultimate test of democracy. The modern GOP produced the only president who refused to honor the American tradition of accepting defeat with grace and relinquishing power peaceably. Exactly what effect the Revolution of 2020 will have on the overall history of American democracy is not clear — but to this point, the signs are not encouraging.

What scientists learned when they peered into an octopus' brain

Among the smartest animals on Earth, octopuses are unique for being utterly weird in their evolutionary path to developing those smarts. Philosopher Peter Godfrey-Smith has called the octopus the closest thing to an alien that we might encounter on Earth, and their bizarre anatomy speaks to this: An octopus' mind isn't concentrated in its head but spread throughout its body. Their tentacles are packed with neurons that endow each one with a hyperaware sense of touch, as well as the ability to smell and taste. Marine biologists have remarked that each tentacle sometimes seems like it has a mind of its own. Every octopus is a tactile thinker, constantly manipulating its surroundings with a body so soft it almost seems liquid.

All of these things are surprising, at least in theory, because scientists have learned to associate intelligence with vertebrates and a tendency to socialize. Octopuses are either asocial or partially social — and all of them are invertebrates. This raises an obvious question: How did octopuses become so smart?

Scientists know surprisingly little about this subject, as a great deal of the research on octopus neuroanatomy up to this point has focused on one species, the European common octopus (Octopus vulgaris) — which has about as many neurons in its body as a dog. Thanks to the scientists behind a new study in the scientific journal Current Biology, we now know more about the neural wiring of four very different types of octopuses (or, in one case, an octopus-like animal): the vampire squid (Vampyroteuthis infernalis), which dwells in the deep sea and is technically neither an octopus nor a squid; the blue-lined octopus (Hapalochlaena fasciata), a venomous creature that keeps to itself while roaming the ocean at night; and "two diurnal reef dwellers," Abdopus capricornicus and Octopus cyanea (also known as the day octopus).

The scientists also examined data about four other species of coastal octopuses based on material in previously published literature. Using that information and their new research, they concluded that octopus intelligence evolved in ways similar to vertebrate animals — specifically, based on the need to accommodate their surroundings. That implies that they had a convergent evolutionary path towards developing intelligence despite having diverged from vertebrates long ago.

"Our study uncovered new insights to confirm that octopus brain structure indeed evolved as those of many other animals," Dr. Wen-Sung Chung, the lead contact on the paper and a postdoctoral research fellow at the Queensland Brain Institute in Australia, told Salon by email.

Chung analogized octopus evolution to shark evolution, noting that sharks evolved differently based on the ocean depths at which they preferred to swim. "It is probably [unsurprising] as they have a short life span and live in a broad range of ocean (from reef to deep sea, from tropical to temperate waters)," all of which have different conditions in terms of predators to evade and other pressures on survival.

"Octopuses and other cephalopods are very likely more complicated than we expected before," Chung added. "Expanding studies toward various species from different habitats, rather than narrowing down to one/a few iconic species, can be a way to study this amazing and apparently smart creature. I believe we can learn more by embracing the diversity of these creatures."

Among other things, scientists did not expect to find as much folding as they did in the octopus brains. The process by which the brain develops what appear to be wrinkles is known as gyrification, and it is associated with vertebrates whose highly evolved brains can process large quantities of complex information. Yet wrinkles had already been observed in brain sections from roughly 20 octopus species, and the new study revealed unmistakable evidence of structural folding in the octopod central nervous system.

"The brain folding is certainly a big surprise to us," Chung wrote to Salon. "In order to confirm this, we had to catch different-sized individuals (no way to get them from the animal house or pet shop) to eliminate the possibility of structural deformation caused by the handling during capture, fixation and imaging."

The study also provided new information about the vampire squid, a species that is neither octopus nor squid but rather the last surviving member of its own order. By looking at its brain, scientists were able to learn that it has a strange hybrid of both squid-like and octopus-like features. They also found that, for octopus species that live in reef systems, their entire visual system undergoes major changes to accommodate their daytime-dwelling lifestyle.

Does this mean octopuses are as intellectually complex as humans? Not so fast, Chung warned, noting that scientists can only say for sure that octopuses are smart enough to remember landmarks and break out of their housing tanks. ("This is the nightmare for most octopus researchers," he noted.)

"Honestly, this study is just the very first step to investigate the differences/similarities between octopuses/cephalopods, and we know too little about octopuses in many ways," Chung added. "We should be cautious for this and avoid over-interpretation at this stage until more solid evidence available in the future."

How Trump's Big Lie reshaped history after 220 years

There are few words as overused as "revolution," which has many Merriam-Webster definitions and here means "a fundamental change in political organization." While people who discuss politics are prone to dramatic talk of "revolutions," few of the American presidential elections described that way really merit the term. Franklin D. Roosevelt's "revolution" of 1932 changed the nature and role of government in American life, and Ronald Reagan's election in 1980 undid at least some of those changes. But neither election literally altered how our democracy functions.

Those disqualifying details do not apply to the most recent election — the first one ever in which a losing president refused to admit defeat. It's reasonable to describe that as the Revolution of 2020.

There have been two previous elections that could be defined as revolutionary. The more recent was in 1860, which both revealed and reflected a profound rift in the American polity and led directly to the Civil War. Eleven Southern states decided to secede after the Republican victory because they feared Abraham Lincoln's presidency spelled doom for the economic system based on chattel slavery. This story is relevant to the Donald Trump era, but the earlier revolutionary election is our main topic here — and that one is actually known as the Revolution of 1800. As things turned out, it was the rarest kind of revolution: One with a happy ending.

George Washington had served two terms as America's first president, but without facing meaningful opposition or anything resembling a modern election campaign. After he decided not to seek a third term, the 1796 election became the first to feature serious competition between the nation's brand new political parties. Federalist candidate John Adams, who had been Washington's vice president, ultimately prevailed over Thomas Jefferson, former Secretary of State and candidate of the Democratic-Republican Party. But in the momentous election of 1800, Jefferson won the rematch, and Adams — the first incumbent president to be defeated — faced a historic decision: Would he come up with some excuse to cling to power, or simply hand the reins of state to Jefferson and walk away?

If Adams had cried fraud or otherwise claimed the election was illegitimate, he could almost certainly have stayed in office despite electoral defeat. In fact, he wouldn't really have needed much of an excuse. There was almost no precedents for national leaders voluntarily stepping down in the face of popular rebuke, and plenty of examples — from ancient to modern times — that pulled any aspiring dictator in the opposite direction. But Adams was invested in democracy's success, and as such swallowed both his pride and his genuine concerns about Jefferson's political philosophy. He followed the law and surrendered power, and in the process, demonstrated how important the conduct of losing candidates is to democracy. (Like Trump, Adams skipped his successor's inauguration, but he never tried delegitimize his erstwhile rival's presidency.)

Nearly two decades later, it was Jefferson himself who described Adams' actions as the "Revolution of 1800," comparing it to the better-known revolution that had begun 24 years earlier:

... that was as real a revolution in the principles of our government as that of 76. was in it's form; not effected indeed by the sword, as that, but by the rational and peaceable instrument of reform, the suffrage of the people. the nation declared it's will by dismissing functionaries of one principle, and electing those of another, in the two branches, executive and legislative, submitted to their election...

The ideals of self-government captured in the Declaration of Independence, Jefferson was suggesting, did not become reality until American democracy passed its acid test: The person entrusted with the most powerful office in the land accepted a painful verdict. It had been difficult enough for Washington to leave the presidency, even though he was eager to live out his last years as a civilian. For Adams, it was even worse: He badly wanted to continue as president, and on some level expected to win re-election. In accepting defeat, he proved that democratic government wasn't just an ideal. It was also workable.

Adams' precedent was followed without question for the next 220 years, with 10 incumbent presidents leaving office voluntarily after the voters kicked them out. Most were bitterly disappointed by defeat, it's fair to say, but none of Adams' spurned successors — from his own son in 1828 to George H.W. Bush in 1992 — tried to concoct conspiracy theories in order to claim they hadn't really lost. Then came the Revolution of 2020, when the guy who became famous for telling people "You're fired!" on a reality show refused to accept being canned. That catalyst broke the precedent set in the Revolution of 1800 by America's first fired president.

Students of history like me, who spent a lifetime before the 2020 election immersed in the story of American politics, instantly recognized the magnitude of what Trump was doing — and understood that he represented a tendency George Washington himself had warned the young nation about. Washington directly identified the fundamental ingredients that made Trump's coup attempt, starting with a political party so fanatically determined to win that its motivated reasoning could overpower common sense and basic decency. Once the Republicans filled that role, they just needed a demagogue who was sufficiently unscrupulous to exploit that. (Politics is full of egotists, so it says something that Trump was the first politician narcissistic enough to qualify.)

After pointing out that "the very idea of the power and the right of the people to establish government presupposes the duty of every individual to obey the established government," Washington expresses concern that divisive political sentiments — particularly party zealotry — were "destructive of this fundamental principle, and of fatal tendency." In elaborating on this, Washington almost seemed to be looking into the future with a crystal ball. He could hardly have described the hyper-partisanship of our time more precisely if he had actually known about Donald Trump, with all his odious followers and ludicrous assertions. Political parties, Washington writes,

... serve to organize faction, to give it an artificial and extraordinary force; to put, in the place of the delegated will of the nation the will of a party, often a small but artful and enterprising minority of the community; and, according to the alternate triumphs of different parties, to make the public administration the mirror of the ill-concerted and incongruous projects of faction, rather than the organ of consistent and wholesome plans digested by common counsels and modified by mutual interests.

However combinations or associations of the above description may now and then answer popular ends, they are likely, in the course of time and things, to become potent engines, by which cunning, ambitious, and unprincipled men will be enabled to subvert the power of the people and to usurp for themselves the reins of government, destroying afterwards the very engines which have lifted them to unjust dominion.

Washington was able to foresee all of this because the early American republic was not that different from our own time. Though the issues in the 1800 election may seem remote in 2021, Americans were no less invested in politics. Jefferson was alarmed by the way Washington and Adams had centralized control of economic policy in the federal government, such as by creating a national bank, and was convinced their foreign policies were too friendly toward Britain. He was also appalled by the Alien and Sedition Acts, which brutalized immigrants and violated the First Amendment rights of political dissidents. For his part, Adams viewed Jefferson as a libertine and radical whose ideas might push America into the bloody chaos that had overwhelmed France after the revolution of 1789. And all this vitriol was just from the campaign. After the election was decided, Aaron Burr, Jefferson's running mate, tried to seize the presidency for himself through Machiavellian backroom dealings, prompting a serious constitutional crisis and the swift enactment of the 12th Amendment as a corrective.

None of the hostile rhetoric between Trump supporters and Joe Biden supporters in last year's election can match the sheer bile that Adams, Jefferson and their various partisans flung at each other in 1800. The difference, of course, is that only the Republican Party, after being devoured from within by the Trump faction, has actually failed the ultimate test of democracy. The modern GOP produced the only president who refused to honor the American tradition of accepting defeat with grace and relinquishing power peaceably. Exactly what effect the Revolution of 2020 will have on the overall history of American democracy is not clear — but to this point, the signs are not encouraging.

'Let’s go Brandon': Here’s a short history of insulting presidential nicknames

For those of you who have been mercifully spared this information, supporters of Donald Trump have started using the phrase "Let's Go Brandon" as a code for "Fuck Joe Biden." The craze began after an NBC Sports reporter at a NASCAR race in Alabama mistook the profane chant by some fans for an expression of support for driver Brandon Brown. Since "Let's Go Brandon" does indeed sound a bit like a muffled version of the vulgar insult, the phrase quickly caught on as a stand-in attack on the incumbent president.

Now it's everywhere: On Trump campaign merchandise and among Republican politicians, on weapons parts and, of course, as a trending hashtag on Twitter. A Southwest Airlines pilot even came under investigation for uttering the phrase from the cockpit to a planeload of passengers.

"Let's Go Brandon" is an imaginative troll, specific to the age of the internet — but it's unlikely to have the staying power of the immortal presidential nickname "OK," which over the last 180-plus years has become the most frequently used word on the planet. While its origins remain controversial, historians have confirmed that it was widespread during the 1840 election, when incumbent President Martin Van Buren was running against former U.S. Army Gen. William Henry Harrison. Prior to that election, "OK" had been employed for a few years by New Englanders as a comical shorthand for "all correct" — that is, as an acronym for "oll korrect" or "ole kurreck," implying that the speaker was uncultivated or perhaps a non-English speaker.

The next stage in the "OK" story came about because Van Buren's nickname was "Old Kinderhook," a reference to his hometown in upstate New York. Van Buren supporters capitalized on the term's prevalence by forming "O.K. Clubs," urging Democrats to "Vote for O.K." and saying that it showed Van Buren was "all correct." Seizing an opportunity, Harrison's Whig Party tried to flip the script, claiming that Van Buren's political patron, Andrew Jackson, had signed papers "O.K." as president because he was too ignorant to realize that "all correct" was not spelled "Oll Korrect." (It was commonly believed that Jackson was only semi-literate, which wasn't true, although he lacked much formal education.)

Harrison defeated Van Buren, but not because of the mocking usage of "OK." Both campaigns embraced the term and, more importantly, the Whigs developed a number of innovative techniques to push Harrison to victory. Harrison was further boosted by an economic depression that caused widespread hardship; under these conditions, almost anyone could have beaten Van Buren. But perhaps the popular phrase can still be meaningfully linked to that historical event: No prior election had ever had turnout above 60 percent, but in 1840 voter turnout was more than 80 percent — an inconceivably high proportion, then or now. (The 2020 election had the highest turnout in 120 years, and nevertheless only about two-thirds of eligible voters even bothered.) It seems plausible that "OK" became so popular in large part because people heard it constantly that year.

It also didn't help Van Buren's image to be known as "OK." He never commanded the grassroots popularity that war hero Andrew Jackson had, and was widely perceived by the public as distant and stuffy — in contemporary terms, part of the "elite." Harrison was also a military veteran, dubbed "Old Tippecanoe" by the Whigs, in reference to a battle he fought in 1811 against the Native American confederacy under the legendary Shawnee chief Tecumseh. Compared to that colorful history (however it may appear to us today), Van Buren seemed like a nonentity, and being called "OK," a word that already had the connotation of "somewhat all right," clearly didn't help.

That helps us focus on the secret of effective presidential epithets — they zero in on a highly distinctive quality of the person in question and skewer it. Think of the catchy insulting monikers from recent history. "Tricky Dick" Nixon has an appealing rhythmic and percussive quality, but also captures the fact that Nixon was seen as a shifty and unscrupulous character long before the Watergate scandal. "Teflon Ron" Reagan was effective because no amount of scandal ever stuck to the relentlessly upbeat Reagan — partly because the press loved him and the Republican Party protected him, and partly because he literally had no idea what was going on in his own administration. "Slick Willie" Clinton perfectly encapsulated the unctuous salesman-cum-preacher mode so distinctive to the 42nd president — and doesn't it seem even more accurate today? George W. Bush was mocked as "Dubya," partly to differentiate him from his dad and partly to point out that he was a prep-school kid from the uppermost level of society, masquerading as a Texan.

Trump's favorite nickname for his 2016 election opponent, "Crooked Hillary," was idiotic in substance but mercilessly effective. It was grotesquely unfair — Hillary Clinton has been investigated more thoroughly than almost anyone in current public life, and has never faced criminal charges of any kind — but that wasn't necessarily a drawback in the gruesome context of that campaign. The simplistic epithet captured the intense mistrust many on the right felt toward Hillary, going clear back to her husband's first election in 1992. Not coincidentally, it also confirmed the misogynistic stereotype of a conniving, untrustworthy woman.

"Let's Go Brandon" is entirely different. Unlike the other insults reviewed here, there is no deeper meaning that's specific to Joe Biden in any way. Describing him as "China Joe" or "Sleepy Joe" at least conveys specific insults, regardless of their merits. As I suggested earlier, "Let's Go Brandon" is a specific product of this era, but it asserts nothing about Biden rather than overt hostility — at a moment when the president appears embattled amid falling approval ratings.

"Let's Go Brandon" also does not strike me as an especially effective way to "own the libs," although there's some anecdotal evidence that Democrats and their supporters find it troubling. Hardly anyone personally identifies with Joe Biden to such an intense degree that they feel genuine distress when he is attacked. It's Trump supporters who feel that way about their hero, thanks to an unhealthy dose of narcissism by proxy and a profound buy-in to Trump's malignant normality. While millions of people are no doubt invested in Biden's success as president, they don't view him as an untouchable idol.

Finally, the insult fails because it implies that there is some taboo against criticizing Biden, which the last few weeks of plummeting poll numbers and policymaking headaches should have proven is spectacularly untrue. This is another example of Trump supporters' performative subversiveness, in which privileged white people play-act as victims while shilling for fascism. It's the obnoxious, quasi-jokey wish fulfillment that oozes from Trump's pores, boiled down to a single childish slogan.

As always with the Trump movement and Republicans, there's a powerful element of projection to "Let's Go Brandon," which exposes more about the people using it than about its target. In addition to revealing Trump supporters to be childish, vulgar and obsessed with their hero to an unhealthy degree — none of which is a big surprise — it also shows how much they dread being humiliated. "Let's Go Brandon" attempts to taunt Biden with the thing they fear most desperately — being publicly regarded as a joke.

Should Democrats respond with their own demeaning nickname for Trump? That depends on whether you think anything could ever stick to a man who seems impervious to ridicule, and whose innumerable lies, multiple apparent criminal acts and massive incompetence have never affected the intense loyalty of his faithful. It's not as if his opponents lack material: Trump's followers can serve for the rest of world history as the ultimate example of "sore losers," for instance, as their champion is the only president to refuse to gracefully accept being fired by the American people, and had an extensive history of being a sore loser long before his claims about the 2020 election.

It probably won't happen, for the same set of reasons that Republican politicians usually move in lockstep but Democrats don't. The anti-Trump constituency has no coherent shared ideology beyond supporting "democracy" — which means different things to different people — much less a consistent message. It doesn't help that many liberals still believe in the failing creed, "We go high," fearing that playing dirty will both degrade themselves and backfire politically. But if a catchy disparaging epithet for Trump that nettled his followers actually got traction, from the Democrats' point of view that would be OK.

An 'alarming finding': Many Republicans now ready to support violence

New public opinion research from the nonprofit Public Religion Research Institute, part of its 12th annual American Values Survey, has returned alarming findings.

Close to one-third of Republicans in the survey, or 30%, agreed with the statement that "true American patriots may have to resort to violence in order to save our country." That was higher than the combined shares of Democrats (11%) and independents (17%) who said the same thing.

PRRI CEO and founder Robert Jones said the large proportion of Republicans who appear ready to endorse political violence is "a direct result of former President Trump calling into question the election." Jones noted that according to the same survey, more than two-thirds of Republicans (68%) claim that the 2020 presidential election was stolen from Donald Trump, as opposed to only 26% of independents and 6% of Democrats.

The study also found that 39% of those who believed that Trump had won the 2020 election endorsed potential violence, compared to only 10% of those who rejected election misinformation. There were also signs of a split based on media consumption, with 40% of Republicans who trust far-right news sources agreeing that violence could be necessary, compared to 32% of those who trust Fox News and 22% among those who trust mainstream outlets. In addition, respondents who said violence may be necessary are more likely to report feeling like strangers in their country, to say American culture has mostly worsened since the 1950s and to believe that God has granted America a special role in human history.

This study comes out just before Tuesday's "off-off-year" 2021 elections, with the national media focused on the race for governor in the swing state of Virginia. Republican nominee Glenn Youngkin has floated baseless conspiracy theories about the election and allowed surrogates to perpetuate Trump's Big Lie, while maintaining some distance from the most extreme claims. Youngkin has said the disgraced former president's endorsement is an "honor" and Trump has repeatedly urged his supporters to vote for Youngkin. The unexpectedly close race between Youngkin and Democrat Terry McAuliffe in a state that has largely trended Democratic since 2008 could provide an important symbolic victory for Republicans.

The PRRI survey is not the first indicator that the violent assault on the U.S. Capitol on Jan. 6 may represent a trend rather than an anomaly. Ashli Babbitt, a Jan. 6 rioter killed by a Capitol police officer while attempting to force her way into a secure area, has been turned into a martyr by both Trump and many of his followers. At a recent rally in Virginia, Republicans pledged allegiance to a flag that was supposedly at the Capitol during that riot, and speakers called for Trump supporters to "monitor" election workers and officials. One Virginia election official recently described how Republican poll watchers in his state have acted with "a level of energy and sometimes aggression" and said he had received "very personal attacking, trolling emails accusing me, pre-election, of fraud and even making specific allegations of what the fraud would be."

Indeed, the idea that hypothetical voter fraud could justify violence is, in itself, something new on the American political scene. There have been accusations of fraudulent elections throughout American history — some valid, some bogus — but Trump and his supporters are alone in suggesting violence. (Of course, there was one other presidential election that led to violence: The election of 1860, which sparked the Civil War.) Trump's team lost virtually all the dozens of court cases filed over the 2020 election, and their attempt to get the results overturned was unanimously rejected by the Supreme Court. Even former Attorney General Bill Barr and many key Republican legislators rejected Trump's claims of fraud, meaning that anyone who insists Trump was the real winner presumably thinks that the nefarious conspiracy included dozens of high-ranking Republicans.

Jones, the PRRI CEO, did not mention that additional context, but perhaps did not have to. He described the results of the group's new survey as "an alarming finding," adding: "I've been doing this a while, for decades, and it's not the kind of finding that as a sociologist, a public opinion pollster, that you're used to seeing."

Is the Earth hanging by cosmic ropes inside a magnetic tunnel?

It sounds like the premise of an early science fiction novel: What if Earth actually exists inside a giant magnetic tunnel?

According to a preprint study submitted to the Astrophysical Journal, that fanciful concept may be less absurd than it seems. Indeed, the researchers' idea is one that could redraw the map of our corner of the Milky Way.

Scientists have known since the 1960s that there are two seemingly separate radio structures — a "radio structure" being defined in astronomy as any object that emits strong radio waves — that can be definitively detected by Earth's technology. The new study posits that these structures, known as the North Polar Spur and the Fan Region, resemble long ropes, measure approximately 1,000 light-years in length and sit roughly 350 light-years from our planet.

The research, whose authors include scientists at the University of Toronto's Dunlap Institute for Astronomy and Astrophysics, also suggests that, in addition to being near-Earth (relatively speaking), the two structures are connected to each other and, as a result, essentially surround us.

Imagine a giant tube composed of massive, magnetized tendrils that may look a bit like long and slender ropes. These tendrils include a magnetic field and charged particles which manage to link the two radio structures, effectively creating a tunnel-like structure that includes Earth as well as a small section of the Milky Way — that's the idea, at least.

The scientists' findings could help future researchers as they try to create a holistic model of magnetic fields in other galaxies, and understand similar structures uncovered through astronomical observations. They also predict that when scholars are able to observe these radio structures in higher resolution, they will discover additional features, including "a much more complex filamentary structure," among other things. As one of the scientists told Salon, these structures would be quite awe-inspiring if we could detect them with our own eyes. (The North Polar Spur, for instance, appears in one X-ray map as a sort of massive yellowish bubble.)

"If we could see radio light, then we (in the Northern hemisphere) would see several bright patches extending across a very large distance on the sky," Dr. Jennifer L. West, co-author of the paper and astronomer at the Dunlap Institute for Astronomy and Astrophysics at the University of Toronto, told Salon by email. "These patches are fixed on the sky and they would change their position and orientation over a night and over the seasons, just like the stars and constellations." West added that people who ventured outside shortly after sunset in the autumn, as well as in cities at mid-Northern latitudes, would see the Fan Region apparent in one part of the sky.

"The Fan Region would extend from the Northern horizon right up to the point overhead," West explained. "It would pass through the constellations of Cameloparladis, Cassiopeia, and Cepheus. The North Polar Spur would extend up from the Western horizon and also reach nearly overhead. It would pass through the constellations of Bootes, Corona Borealis, and Hercules. Another, somewhat fainter patch would extend up from the South-East."

This new scientific research about the magnificent structures, West explained, "tried to take into account all of the different kinds of observations" from astronomers over the years. It also offers more than aesthetic gratification. As West told Salon, she is fascinated by magnetism in both the universe and our galaxy. Scientists are only beginning to learn more about these magnetic fields, and West is determined to understand as much as possible about why they exist and how they influence star and planet formation.

"One theory of magnetism in galaxies is called Dynamo theory - it's the theory that explains the magnetic field in the Earth and in our Sun, and that they are generated from rotating, charged particles," West said. "We think it is also responsible for generating the magnetic fields in galaxies, but we need more evidence to support this hypothesis."

She added, "In this study we are trying to map the local environment so that when we build models of the whole Galactic magnetic field we can take the local contribution into account. The saying that we can't see the forest for the trees really applies here. We need to understand what we're looking at close-up in order to get a sense of the bigger picture. I hope this is a step towards understanding the magnetic field of our whole Galaxy, and of the Universe."

This might even, West noted hopefully, someday include our own solar system.

The human neck is a mistake of evolution

Critics of evolution often argue that life, rather than gradually changing over the years through natural selection, was actually created by a so-called "intelligent designer." Their position is that the biological machinery which makes up living bodies is so complex, and so perfectly calibrated to support our numerous needs, that it had to have been planned out by a deliberate and thoughtful force of some kind.

Yet if God actually did design human bodies according to a plan, they forgot to make sure that we can breathe while we sleep — a remarkably crucial detail to overlook. While not everyone suffers from the aforementioned anatomical glitch, known to doctors as obstructive sleep apnea, it affects 22 million Americans — and has become an even more hazardous condition amid the spread of a deadly virus that attacks the lungs.

To understand this fault in the human blueprint, imagine your upper airway as a tube that must remain open to do its job. (This is a simplistic reduction for the purpose of analogy.) When you're awake and upright, the tube stays open easily. Yet once you recline — say, to sleep — the muscles around that tube start to relax. The apparatuses around the tube — including your tongue and soft palate — can press down and constrict it, interfering with the smooth passage of air, akin to a kink in a hose. When your breathing is reduced, the condition is known as a hypopnea; if your breathing stops altogether, it is called an apnea.

To the people in proximity to the sufferer, the result is snoring, choking and other highly unpleasant sounds during sleep. The sufferers themselves are usually deprived of restful sleep and adequate blood oxygen levels, and their consequent lot in life can be one of abject misery: Constant daytime fatigue, headaches and living in a mental fog are just three of the most common symptoms. Over the long term, sufferers are at a high risk of heart disease, Alzheimer's disease, strokes, high blood pressure, diabetes and a number of mental health ailments. For a large percentage of the patient's day, their body endures the stress of repeatedly coming close to suffocating, as well as the weariness of never being allowed recuperative sleep.

Why does this happen? In children, the culprit is frequently obstructions from the adenoids or tonsils, and the solution can be as simple as an operation. Obese people may be at a higher risk for sleep apnea, since excess fat deposits around the throat and chest can further restrict nighttime breathing. Aging is a factor, too, since it weakens the throat muscles. Those who make lifestyle choices that weaken the respiratory system, such as smokers, are at higher risk. Finally, some people merely have genetic or anatomical predispositions that, for one reason or another, interfere with the proper working of the structures in the upper airway.

COVID-19 has made being an apnea sufferer a more dire condition. In January, a study in the journal BMJ Open Respiratory Research found that obstructive sleep apnea is an independent risk factor for severe COVID-19. Patients with obstructive sleep apnea were at a 2.93 times higher risk of requiring hospitalization for COVID-19 — independent of other risk factors for either the disease or the sleep disorder. While this could simply mean that having obstructive sleep apnea gives a patient other risk factors that coincidentally make them more vulnerable to COVID-19 (such as a high BMI), it could also be that the sleep disorder exacerbates COVID-19 on its own, "especially during the night, when decreased oxygen saturation levels occur in" obstructive sleep apnea, the researchers say.

There are treatments for sleep apnea, the most notable of which is the CPAP, or continuous positive airway pressure, machine. CPAP machines work by keeping the upper respiratory tract open with a constant level of air pressure greater than atmospheric pressure. A patient attaches a nasal mask, a face mask or nasal prongs to their airway, and a machine, often using water to humidify the air, delivers a steady stream of pressure that persists throughout the patient's sleep. While the apparatus can be difficult to adjust to, those able to make the transition often report significant relief. Many patients say that using a CPAP completely changed their lives, restoring their physical and mental vitality literally overnight. (CPAPs have been in the news lately because a manufacturing issue in the CPAP machines made by Philips Respironics has put certain customers at risk of cancer; the company has issued a recall.)

So how did nature bring us to a point where, for millions of people, the only effective way to breathe while sleeping (aside from major surgery) is to literally force air down their throats? How did evolution let this happen?

The answer, as it turns out, has to do with evolutionary trade-offs. Humans evolved to be highly intelligent, walk upright and communicate through complex vocalizations. Those gifts came with a price.

As Allen J. Moses, Elizabeth T. Kalliath and Gloria Pacini wrote in the dental journal Dental Sleep Practice, lower animals are fortunate to have "evolved structures of nearly perfect design" for tasks like breathing, swallowing, smelling and chewing. Humans, by contrast, need to balance a large cranium (housing a large brain) on a spinal column that remains vertical so they can walk on two legs. They also need equipment in their necks that permits them to produce sounds for talking, and those organs take up more of the already-limited amount of real estate near the throat. The tongue, for instance, descends deeper into a human's neck than it does for any other mammal. Even pioneering biologist Charles Darwin was aware of the absurdity of evolution in allowing food to potentially go down the wrong pipe in the throat; "every particle of food and drink we swallow has to pass over the orifice of the trachea with some risk of falling into the lungs," he wrote.

If the human body were a building, our neck would arguably be the most poorly conceived room in the house, overflowing with functionally mismatched organs stuffed there to accommodate other design priorities. "Significant evolutionary changes to the human head are flat face, smaller chin, shorter oral cavity, changes in jaw function, repositioning of ears behind jaws, ascent of the uvula and descent of the epiglottis, right angle bend in tongue, creation of compliant, combined, flexible airway-footway, and speech," the researchers write in the aforementioned journal.

Perhaps in part because scientists assumed humans could not possibly have such an absurd inherent design flaw, the symptoms of these structural deficiencies — most conspicuously snoring — were for centuries perceived as innocuous or, at worst, merely annoying. It wasn't until the mid-20th century that scientists began to figure out that those periods when sleeping people struggle to breathe actually pose a serious health problem. Even then, a common approach was to perform a tracheotomy, a drastic measure in which a hole is punched into the throat to facilitate breathing. The CPAP was invented after one patient refused to undergo the procedure but was willing to try his doctor's new air-pressure machine. Chronically unable to sleep before using the world's first CPAP, he reported feeling utterly refreshed the following morning. Humanity's architectural flaw had been exposed.

Before long, Japanese scientists were learning how even minor alterations in the size and position of the pharynx drastically altered the likelihood of developing a sleep disorder. Scientists were even figuring out the precise role of obesity in contributing to the disorder. (Obesity enlarges tissues in the already cramped throat.) Within decades, obstructive sleep apnea had become a common diagnosis and one of the main conditions that sleep health professionals look for in their patients.

These problems existed before the COVID-19 era and, despite being worsened by the pandemic, will almost certainly persist after it is over. After all, obstructive sleep apnea has been a literal and figurative pain in the neck for as long as humans have had necks as we currently know them. Aside from the immediate knowledge humans have acquired about our own anatomical deficiencies, the existence of obstructive sleep apnea is a reminder to embrace humility. Millennia after the ancient Greeks laid the foundations of Western medicine, we are still learning surprising new things about the bodies we inhabit every day.

Did COVID-19 secretly unite (rather than divide) us? A new documentary argues just that

Living through history is not always pleasant. Generations from now, historians will study how humanity coped with the pandemic in the 2020s. That's because the COVID-19 pandemic has been an inflection point in history, an event that transforms the world. It is easy to overlook this as we're caught in its throes; yet considering that more global tragedies related to issues like climate change and income inequality are in the offing, that analysis will offer more than merely academic insights.

There are real individual human beings making this history, men and women from all walks of life who view themselves not as tiles in a mosaic but as ordinary people trying to get through the day during unprecedented times. Netflix is sharing their stories in its new documentary, "Convergence: Courage in a Crisis," which showcases parallel experiences from across the globe. There is the heroism of a Syrian refugee and volunteer hospital cleaner, Hassan Akkad (also a co-director), who fights to end a terrible injustice, and of a Miami doctor desperate to protect Florida's homeless community. When a volunteer helps medical workers in Wuhan, the city where the outbreak began, there is a visceral sense of tension for anyone who can imagine the stress of such a trip.

To better understand this film — and its surprising argument that humans converged in this crisis — Salon spoke with Orlando von Einsiedel, an Oscar-winning documentarian (for his short "The White Helmets") who led a group of ten co-directors to bring these stories to the screen. "Convergence: Courage in a Crisis" premieres on Netflix on Tuesday, Oct. 12. This interview has been edited for length, clarity and context.

What broader lessons did you learn from looking at how these different cultures have adapted to the COVID-19 pandemic? What are the guiding principles that humanity can take away from this experience?

When we began, this started off as a film about individuals around the globe responding to the pandemic. I think the film morphed into a story about individuals around the world, responding to the flaws in society that the pandemic has massively exposed and then civil society rising up to plug those holes. I think that's one of the things we've seen around the world in a number of places. What are those flaws — injustices, social inequities, big flaws like racial injustice? I think those are the things that COVID specifically has really shown us. It's been like a magnifier.

Can you give specific examples that really struck you while you were making the film?

In Sao Paulo, Brazil's largest city, there is a community where the water is shut off at eight o'clock, and the protagonist whose story we follow in the film talks about how the government tells everyone they need to wash their hands and has those sorts of health measures in place. And yet the water for that community is turned off. I think that shows enormous inequality in Brazil, for instance.

I suppose one of the major ethical questions that exists right now in terms of COVID-19 discourse is, to what extent should we respect cultural differences versus to what extent should we insist on people following public health precautions? In the United States, as I'm sure you know, there are many people who oppose mask mandates, vaccine mandates, social distancing, and other public health measures. Does the documentary provide any insights in terms of how we should approach those questions?

Actually we decided from the beginning to not focus on the everyday politics of COVID. We tried to tell a human story. Yes, we focused on countries where COVID has been politicized, because I think those have been the places where we've seen the pandemic play out at its worst, and where it's been the hardest to control. But the film doesn't delve into the everyday politics of COVID in that regard.

Where does it draw the line in terms of the kinds of political issues it does explore and the ones that it does not?

Ultimately we follow the stories of the protagonists at the heart of the film. So it's, where do their stories take us? If the story then explores inequities in Miami to do with unhoused individuals, or inequities in a Brazilian favela, or injustices against migrant workers, then we explore those issues.

Now I'm wondering, in terms of the broader lessons from your film, people like me who think about things in historical terms wonder what will the long-term impact be of the COVID-19 pandemic on humanity? Because something that literally affects every human being alive in such a profound way is going to have permanent consequences. Have you given thought to this question? And if so, what kinds of answers do you think your documentary provides?

One of the things I hope the documentary does is highlight the commonalities between us. I'd like to think that the film focuses on some of the things which pull us together. I believe that it's global events like this, that show that we are all living on a small planet, that can help us in the future tackle these big global issues.

Do you think this is a lesson that humanity is learning? Or is it the lesson that we should learn?

Well, it's definitely a lesson that we should learn. I think COVID has highlighted that enormously, whether or not the leaders in place at the moment are the best equipped to learn from those lessons. I don't know. But I think one of the effects was just to show just how connected we all are, and how to solve these global crises we have to work together.

Your documentary is appealing because it covers the experiences of people from all over the world responding to this same event... You said that the lesson is that we need to view ourselves as a global community in terms of practical efforts. What would that look like? Whether in the United States or Brazil or China or Russia, or anywhere else? What would it look like for real world political change to occur that would better serve everyone, whether it's through COVID-19 or climate change or any other crisis?

I guess the very simple answer to that is working together. [WHO Director-General] Dr. Tedros Adhanom in the film very much says that. This is what he's been very much tasked with doing, is trying to put people together to work together in global solidarity. I think that is key. How can you just solve COVID in one country? You can't. This is a global problem. Everybody has to pull together. And to your point, how do you solve climate change? It involves everybody working together. There's no point in just one country working on this in isolation. The world doesn't work like that. We are all interconnected and therefore we have to work together on this stuff. As I said earlier, I think COVID can be a real lesson to all of us, but to solve these problems, we have to work together.

A new California oil spill is a pollution nightmare

On Saturday, a ruptured pipeline off the coast of southern California near Huntington Beach spewed around 144,000 gallons of oil into the Pacific Ocean. While the precise cause of the rupture was unclear, some have speculated that the pipeline may have been struck by a ship's anchor. Amplify Energy, which owns the 17-mile pipeline, notified authorities about the spill more than 12 hours after an oil sheen was first noticed around the site of the accident. The leak has now stopped, but the oil slick stretches from Huntington Beach to Newport Beach, and dead birds and fish have already washed ashore.

According to Orange County Supervisor Katrina Foley, the oil has also entered the local wetlands. She lamented the damage to the wetland ecosystem.

"These are wetlands that we've been working with the Army Corps of Engineers, with the Land Trust, with all the community wildlife partners to make sure to create this beautiful, natural habitat for decades," Foley said. "And now in just a day, it's completely destroyed."

Foley's comments had their parallel in remarks by Newport Beach Mayor Brad Avery, who told Foley that he had spotted dolphins swimming through thick oil plumes. Foley speculated that the damage to local wildlife would be severe, telling reporters that "you can't get wildlife back that are killed in this process, and some of the habitat the plant species, they're going to be impacted for years to come."

The sense of outrage is not limited to Foley and Avery, or to the fate of dolphins and wetlands.

"Big company doesn't invest in infrastructure," tweeted environmental activist Erin Brockovich. "Big company has a blow out. Big company gets sued. Lawyers get rich, people and the planet get screwed. Nothing f**king changes."

U.S. Rep. Jared Huffman of California expressed a similar view, tweeting that "this is terrible but not surprising. Where you drill you spill. It's passed time to end our dirty, deadly, planet-killing addiction to fossil fuel."

One environmental activist expressed dismay at media reports that focused on the cancellation of a scheduled air show.

"People are prioritizing entertainment over the destruction of the natural world!" tweeted Alexandria Villaseñor, who described how being exposed to crude oil can damage your nerves or liver, cause chemical cancer or pneumonia, or result in a number of other serious health ailments. Villaseñor then added that "the next time someone is more worried about their own inconveniences or lack of entertainment because of a massive oil spill, tell them these things, or maybe just tell them they are part of nature too and what happens to the planet affects us all."

There are a number of health problems associated with oil spills for humans, as well as for the animals that suffocate or absorb poisonous chemicals. Those tasked with cleaning them up can experience skin or eye irritation. More broadly, anyone exposed to oil spills is at risk of liver damage, increased cancer risk, immune system issues, respiratory issues, reproductive issues and a higher presence of toxic substances in their bodies. These are only the known health problems: There is a dearth of studies that comprehensively catalogue long-term complications from being exposed to spilled crude oil.

Some reports raise questions about the timeline for public information release about the oil spill. As Sammy Roth of the Los Angeles Times reported, people in Huntington Beach and the surrounding area could smell oil on Friday night, but for most of Saturday were reassured by officials that there was only a small spill. That story did not change until Saturday night, when officials and Amplify Energy acknowledged that the spill was a significant disaster. Initial reports pegged the leak at 126,000 gallons; that was later revised to 144,000.

Oil spills are not just a byproduct of an industry that causes global warming; they are also more likely as a result of climate change. While the origins of the California spill remain unclear, scientists agree that extreme weather events like hurricanes make spills more likely. Oil drilling is a complex process that demands extensive, delicate infrastructure. Natural disasters that break machinery on rigs, platforms and production facilities will become more common as the planet warms, which increases the likelihood of spills taking place.

The oil spill in American coastal waters comes only a month after an oil spill in the Gulf of Mexico that began in the days before Hurricane Ida hit the Gulf Coast. That spill occurred when an underwater pipeline at an offshore drilling site two miles south of Port Fourchon, Louisiana, began oozing oil. As the oil slick spread out on the water, officials would eventually find more than 100 birds pickled in crude oil, with satellite images showing a dark cloud drifting eastward for miles.

Think Biden is a 'failed' president who can't get re-elected? Consider Bill Clinton

It is historically ignorant to describe Joe Biden as a failed president, which hasn't stopped pundits from trying. From MSNBC to New York Magazine, the view has emerged that if Biden's package of infrastructure and spending bills falls flat, it will be his political death sentence. Centrist Democrats like Joe Manchin and Kyrsten Sinema are depicted as holding the president's very career in their hands. By refusing to support filibuster repeal, and then throwing up roadblocks to his proposed $3.5 trillion spending package, they are seemingly making it impossible for Biden to get anything done at all.

That is unquestionably a big political problem, because if Biden can't get big things done before Democrats lose Congress in the 2022 midterm elections — as is extremely likely to happen — he almost certainly won't be able to do so afterward. If he chooses to run again in 2024, the thinking goes, he'll be a sitting duck for Donald Trump or any other Republican who opposes him: a "failed president," like Trump himself.

Except that history suggests it doesn't work that way. Consider the case of Bill Clinton.

At roughly this time in Clinton's first term, he was also widely regarded as a failure. Like Biden, he had a handful of positive achievements: signing the Family and Medical Leave Act, signing the Brady Act (at the time, a historic gun control measure), appointing Ruth Bader Ginsburg to the Supreme Court. (Biden's chief accomplishments are his vaccine policies, COVID-19 stimulus and numerous judicial appointments.)

Clinton also had a number of damaging scandals during this period, from unflattering reports about his business and political fundraising activities to accusations about his handling of a federal raid in Waco, Texas. He also made a number of serious policy blunders. By the 1994 midterm elections, he had signed into law the now-infamous Violent Crime Control and Law Enforcement Act, fueling an era of mass incarceration directed largely at Black men; agreed to the egregious "Don't Ask, Don't Tell" policy in the military, which treated LGBT rights like a bargaining chip; failed to pass an ambitious health care plan spearheaded by Hillary Clinton; overseen a military debacle in Somalia; agreed to the "free trade" deal known as NAFTA; and implemented austerity-driven fiscal policies that drove the Democrats unrecognizably to the right.

Clinton was just getting warmed up. Before Election Day 1996, he would also lose control of Congress in the 1994 midterms to Republicans powered by Newt Gingrich's "Contract with America" and preside over a government shutdown during a budget battle with those same Republicans, which was largely their doing. America's economy was generally strong through those years, but Clinton's approval ratings fluctuated wildly. (Biden, for what it's worth, has been solidly in the 40s and 50s.) It seemed entirely plausible throughout Clinton's first term that he would lose to whomever the Republicans nominated, which turned out to be Sen. Bob Dole, a widely admired World War II veteran.

But American voters didn't (and don't) have long political memories and weren't focused on the granular details. When the 1996 election came around, they knew that the economy was doing well and the nation was at peace both domestically and abroad. In accordance with presidential election precedent, Clinton won easily. As with every election, reams of commentary have been written about the 1996 contest, but the explanation for Clinton's victory is ultimately just that simple.

I'm not claiming that Americans don't care about politics. But the people who follow the details of budget negotiations, legislative deals and foreign policy are usually those who already know what they think. If you care enough about politics to hold a strong view about the Senate filibuster, chances are you're already sure who you will vote for next time around. In terms of determining election outcomes, you probably fall into the category of the people that one party wants to make sure turns out to vote, but the other party would prefer got the flu that week. It's very likely you do not fall into the category of people who sometimes vote for one party and sometimes the other. That group, as relatively small as it is, tends to be decisive.

As I have suggested before, Biden was elected in 2020 largely because of Barack Obama — a highly popular president who created a new political alignment during the 2008 election. Biden almost explicitly ran as Obama's political heir, and he won in part for that reason. He was also running against a president — you know who I mean! — who, unlike Clinton in 1996, was presiding over a crumbling economy and a public health disaster.

Biden's situation is certainly not an exact parallel with Clinton's — in Clinton's case, there was no one like Trump on the horizon, and the incumbent he had defeated, George H.W. Bush, had the decency to go away. But the most important features apply. As Trump's defeat in 2020 makes clear, he didn't fundamentally transform the rules of American political life, even if it sometimes felt that way. People outside his base don't pay attention to the Trumpist right wing's whimsical fixations, any more than they did to the Gingrich talking points of the 1990s. Trump's takeover of the Republican Party has deepened political divides, and entrenched partisans on both sides. That likely inspires more people to turn out to vote, but it doesn't change the breakdown of how they vote. And in general, less partisan swing voters tend to cast their ballot based on the perceived conditions in the country.

This is the problem facing anyone who tries to predict the 2024 election from here: There is no way of knowing where we will be in three years. Economies can unexpectedly boom or go bust; international crises can emerge, with leaders rising to them or coming up short; unexpected catastrophes like hurricanes and pandemics can derail the best laid plans. And that's the short list. If Biden does unexpectedly well or egregiously flops in responding to these externalities, his political fate will improve or decline accordingly. Should nothing in his presidency move the needle all that much, the public will likely default to its standard voting patterns.

If anything, Biden's biggest political concern should be the Republicans' assault on voting. If his current approval ratings remain stable, and conditions remain acceptable at home and abroad, he'll hold a natural advantage by continuing to rely on Obama's 2008 coalition. Yet now that Trump's Big Lie is being used to roll back voting rights and empower partisan election officials, it is within the realm of possibility that Biden could have a victory stolen from him in 2024. If any policy failures right now are likely to hurt him politically three years hence, they will be the ones that limited his voters' ability to keep him in power. That wouldn't just be the end of Biden's career; it might mean the end of democracy, confirming the Republicans' belief that only they have a right to political power.

But if we assume for the moment that people who want to vote will largely be allowed to — admittedly a very big if — the 2024 election will be decided on bedrock political loyalties, not on Donald Trump's bullshit histrionics. Trump would like to believe he has changed the world, but the same political rules that applied before he rode down that golden escalator still apply today.

What Biden can learn from FDR about dealing with Manchin and Sinema

Allan Lichtman has a track record of accurately predicting presidential elections, and is generally an astute observer of the American political scene. So I paid attention when Lichtman, a history professor at American University, told me it would be disastrous for President Biden to go to war against Joe Manchin, Kyrsten Sinema and the other centrist Democrats jamming him up in Congress.

Lichtman was fully aware that progressives were eager to purge obstructionist Democrats, or at least punish them somehow for constricting or defeating Biden's legislative agenda. I had already spoken to a historian — Harvey J. Kaye, the editor of "FDR on Democracy" — who pithily summed up the logic behind that point of view.

"Look, there's two choices," Kaye said. "For the sake of the future, he should literally go after them, period." His "them" clearly referred to Manchin and Sinema. "But for the sake of democracy in the near term, what happens if the Republicans win?" Kaye added that he could not understand "why Biden hasn't called Manchin" and the others and told them that their political survival depended on toeing the line.

In my conversation with Lichtman, he quoted humorist Will Rogers' famous quip: "I belong to no organized party. I am a Democrat." Kaye said basically the same thing. I reached out to both of them for Salon about the most conspicuous example of a president turning against legislators from his own party: Franklin D. Roosevelt's attempt to purge right-wing Democrats in the 1938 midterm elections. My primary question was about what lessons Biden could learn from that moment, given that his own presidency may go down in flames because of intransigent "moderate" Democrats.

FDR certainly wasn't the first Democratic president to turn against members of his own party. In 1918, Woodrow Wilson campaigned against five Southern legislators who opposed his World War I policies, and only one of them actually defeated his Wilson-backed challenger. But that was a different era, when the Democratic Party's chaos led to an ideological vacuum. Instead of trying to fill that vacuum, Wilson weeded out politicians who opposed him on a specific set of policies that were widely supported by both parties. So there's no clear parallel to Biden in 2021.

Roosevelt's situation was at least somewhat similar. He explicitly wanted the 1938 midterm elections to realign the Democratic Party in a more liberal direction. Speaking to the nation in a "Fireside Chat" on June 24, Roosevelt characterized the coming primaries as containing "many clashes between two schools of thought, generally classified as liberal and conservative." Liberals recognized "that the new conditions throughout the world call for new remedies," he said, while conservatives do not "recognize the need for government itself to step in and take action to meet these new problems." Concerned that obstructionist members of Congress might roll back his achievements in creating unemployment insurance, old age pensions, anti-monopoly measures and regulation of the financial industry, Roosevelt accused them of wanting a return "to the kind of government that we had in the 1920s." He didn't need to remind his listeners that those policies had plunged America into the Great Depression. As he saw it, Democrats needed to rid themselves of the conservatives who were hindering his vision before they destroyed his new liberal coalition.

Well: It didn't work. FDR targeted Rep. John J. O'Connor of New York, then chairman of the House Rules Committee, along with 10 Democratic senators, and only O'Connor was defeated in a primary. This was more than an immediate political setback for Roosevelt, although it definitely counted as that. (Democrats lost seven seats in the Senate and 72 in the House, although they started out with such a huge margin they still retained control of Congress.) In a way, his desire to realign American politics along more ideological lines worked. Right-wing Democrats from the South realized they had common cause with conservative Republicans from the Midwest, and their "conservative coalition" controlled Congress for a generation, shaping national policy regardless of which party officially had a majority. If anything, Roosevelt weakened the liberal cause rather than strengthening it. His only consolation was that many of the policies he was worried would get targeted wound up staying intact.

While the parallels between Roosevelt's predicament and Biden's are inexact, they are similar in the big ways that count. Biden's critics on the left want him to wage political war against the likes of Manchin, Sinema, and Rep. Josh Gottheimer of New Jersey, middle-path Democrats who appear willing to sacrifice his entire agenda in the interest of "bipartisanship." It sounds like strength. It almost certainly would not work.

As Kaye pointed out, Biden simply doesn't have the votes that FDR did, either in Congress or the nation at large. Roosevelt was a deeply beloved figure who had won re-election in 1936 in what at the time was the biggest electoral landslide in history. Biden, although he won decisively in 2020, has a narrower mandate. Lichtman noted that attacking moderate Democrats would imperil the Senate, where even one lost seat would swing the 50-50 body to the Republicans. If Democrats wanted a coalition large enough to render the "centrists" irrelevant, they would need to turn out in larger numbers and elect more Democrats to Congress and local offices. That hasn't happened, and at this moment Biden's legislative coalition is not large enough, nor is his popular support deep enough, even to contemplate Roosevelt's strategy — which, again, did not even work out for the most popular president of the 20th century.

The underlying problem, perhaps, is that the Democratic Party, in its current form, is fundamentally incompetent. Salon executive editor Andrew O'Hehir addressed this in a recent article about Democrats' failure to eliminate the filibuster and protect voting rights.

This isn't a nice thing to say about a bunch of mostly sane and approximately reasonable people, but here's the truth: If you set out to design a left-center political party that was fated to surrender, little by little, to authoritarianism — because of circumstances beyond its control, because of internal indecision and ideological fuzziness, because it faced an entrenched and deranged opposition party, because of whatever — you could hardly do better than the current version of the Democratic Party.

This isn't just about Kyrsten Sinema flipping on prescription drug prices right after taking large campaign donations from Big Pharma. Democrats seem incapable of addressing the fundamental problems with our economy, and they lack the internal cohesion to stand up to Republicans who are using Trump's Big Lie about the 2020 presidential election to erode or eliminate democracy. Those issues can't be corrected by defeating Joe Manchin and Kyrsten Sinema — which is also probably impossible and likely undesirable. The Democratic Party's best hope is to make itself relevant and vital again, which is a much larger problem.

Experts explain how to tell if someone is lying to you — without even hearing them talk

In a court hearing that went viral earlier this year, Coby Harris, a Michigan man accused of assault, is caught red-handed at his alleged victim's house. As the presiding judge notes, the bizarre moment would have been inconceivable before COVID-19 forced the law to be carried out via Zoom: Harris had violated the conditions of his bond by not merely contacting his accuser, but secretly being at her side during the hearing itself. It is difficult to overstate the danger; a domestic abuser in such a situation could not only intimidate a witness, but physically harm or even kill them.

Fortunately Deborah Davis, the assistant prosecuting attorney, felt something was off. In the video she informs the judge that she believes Harris and the accuser are in the same residence, citing "the fact that she's looking off to the side and he's moving around." The situation is promptly investigated, with police being called after Harris refuses to show his address on-screen and thereby prove he is not potentially intimidating the accuser. The clip reaches a climax when Harris abruptly pops up while being arrested. Cigarette limply dangling out of his mouth, he utters a half-hearted apology to the court before painting himself as the victim. Davis facepalms, looking incredulous and disgusted as her worst suspicions are confirmed.

"At that very moment, I was dumbfounded that it actually happened the way that it did," Davis recalled to Salon, adding that she also felt significant relief that a possible crisis had been averted and the accuser was now safe.

What struck this author, however, was the fact that Davis had been able to spot Harris' lie at all. As someone on the autism spectrum, I struggle to read social situations; looking at the same footage as Davis, I only observed people staring blankly at their phones. I asked Davis how she was able to detect Harris' dishonesty based on such a seemingly sparse amount of information.

As it turns out, the answer has a lot to do with observational intelligence. To identify a lie as Davis did, the key is to pay attention to little details that are incongruous or simply strike you as off.

"My radar went up when her eyes were shifting and she wasn't answering the questions that we had just spoken about," Davis explained. The accuser shifting her eyes struck Davis as alarming because, while an accuser might look to the side at a defendant when both appear in a courtroom, people usually do not glance to the side during Zoom calls. Davis also described reviewing how the procedure works with the accuser shortly before the hearing, and made a mental note when the accuser suddenly began moving away from what had just been discussed.

"It's those types of non-verbals — where they know what you're asking, they know why you're asking it, and they know the standard that needs to be proven in order to move the case forward, but they have now backtracked on what they were saying," Davis told Salon.

It is notable that Davis' analysis came not just from non-verbals, but non-verbals that she understood within a specific context. Science is pretty definitive about the idea that you can't detect a lie based on non-verbal information alone; indeed, there is just no evidence that it works. Decades of scientific research and literature have failed to yield any consistent information about looks, sounds and any other non-verbal cues that can be indisputably linked to deceit. On many occasions, professionals whose jobs supposedly make them adept lie-detectors (psychiatrists, police officers, job recruiters) were no more adept at spotting fibs during an experiment than laypeople.

This does not mean that you can't figure out when someone is lying. (Just ask Davis.) To do so, however, you need to apply logic to your interpretation of their behaviors and find out where those behaviors would not make sense if the person were being truthful.

"It is hard to be consistent when lying," logician Miriam Bowers-Abbott, an associate professor at Mount Carmel College of Nursing, told Salon by email. "A liar must truly embrace any lie as a complete lifestyle to create consistency — like a delusion. Most people don't make that commitment."

This makes it relatively easy to expose bald-faced liars if you simply know how to grill them on inconsistencies. For someone telling a less brazen falsehood, however, you need to look for more subtle indicators, "a peculiar vagueness that someone might use to create a false impression." For instance, a person might say "a lot of people" feel a certain way or say that "some" individuals have a problem. Those statements may sound alarming, but they aren't specific; "a lot" and "some" could mean any number, and reveal precious little about the details within those numbers. To expose such possible lies, it is important to press the potential liars for the kind of information that they should already have if they are telling the truth.

Yet Bowers-Abbott warned against making assumptions about an individual's truthfulness based purely on physical signs. Like an actual lie, believing strictly in physical cues can lead you down a road of deception.

"I think it's common to look for physical signs, like lack-of-eye-contact, to indicate deception," Bowers-Abbott explained. Yet she added, "our world is more multicultural than it used to be, and there are many cultures where it's more normal to use less eye-contact. So, eye-contact isn't always a great clue."

The same principle applies for other supposed tells.

"How about hesitation?" Bowers-Abbott rhetorically asked. "Well, some people hesitate, when they're trying to be as accurate as possible. So hesitation isn't really a clue either." She prefers to look for signs such as a person who avoids questions, provides vague and ambiguous answers, tries to change the subject or is obviously pretending to misunderstand what their interrogator is saying. This approach involves analyzing the entire content of what a person does, while remaining sufficiently detached from the story they're trying to sell that you don't get suckered in by it.

It is also important to trust your instincts. David Ranalli, a magician, speaker and emcee, wrote to Salon that in his experience "there is no foolproof way to know if a person is lying." Ranalli explained that he gathers information from a number of places at once — whether the person's body language indicates they are nervous, whether their story seems believable, if they stay on the same subject for a long time — and draws his conclusions accordingly.

On one occasion, his honed instincts may have prevented a grisly incident.

"I once spotted someone lying during a very important and dangerous moment in my show," Ranalli recalled. "I play a game of Russian roulette with staple guns in the show, and I have someone mix a loaded gun among three empty ones. At the end of the routine, I staple one of the guns to my neck, betting on my ability to know which gun is loaded."

On one occasion, someone switched out the staples in the gun. "While it didn't matter which gun the staples were in, adding a lie into the fold gave me an eerie feeling that could have resulted in a dangerous outcome," Ranalli explained. When he asked if the staples had been switched and the person said no, Ranalli asked the audience if they had witnessed something sneaky.

"They all shouted yes," Ranalli told Salon. "It still ended with a fun outcome, but could have ended badly if I didn't notice that moment."

Not all scenarios involving spotting liars have a "fun" result. As Davis told Salon, much of her career has been spent fighting for victims of domestic abuse, and she has therefore seen the damage that can occur when liars have their way. She talked about victims who are told by abusers to wear sleeves that will cover the bruises on their arms or to avoid speaking freely to religious leaders, or who are simply given stern looks that intimidate them where words cannot. And even when the liar is exposed for all the world to see, the sight can be horrifying.

This brings us back to Harris. Roughly two weeks after his infamous Zoom hearing, Harris had another court date, this one from jail. When things did not go his way during that hearing, Harris can be seen on video acting hysterically — screaming, wildly gesticulating, menacingly approaching the camera, at one point even storming out of the call room. Because his microphone was malfunctioning, Harris could be seen but not heard. As a result, his facial expressions and body language were a veritable smorgasbord of nonverbal communications — and their message was so explicitly violent that even I could easily decipher them.

"Looking at the behaviors of the defendant, based on the testimony of the victim, to me that says a lot about the truthfulness of the victim," Davis said. This happens in court a lot when abusers see their plans for getting off go awry; "you can usually see some form of a meltdown, or in other cases I've seen in Zoom where the defendant changes their posture and it goes from sitting and looking straight at the camera to standing and with their arms folded with the camera pointed up with more of an intimidation type of stance." While judges will never allow that kind of behavior, in court or Zoom, if they spot it, Davis explained that "for me as a prosecutor, sometimes I want to see that because I want to know whether or not we're on the right track and is justice being served, or is it somebody who's maybe fabricating or exaggerating what has happened."

In that moment, as he angrily flailed about, Harris illustrated a very important point about lying: Sometimes, no matter how hard you try, the uncomfortable truth will be stamped all over your body.

Biden issues a sweeping, unprecedented executive order to increase vaccinations in the US

As mutant coronavirus strains surged across the United States, Dr. Rochelle Walensky was alarmed.

"I'm going to reflect on the recurring feeling I have of impending doom," Walensky, the director of the Centers for Disease Control and Prevention (CDC), told reporters at a press conference in March. "We have so much to look forward to. So much promise and potential of where we are and so much reason for hope. But right now I'm scared." The reason, simply put, was that Americans who did not want to get vaccinated, wear masks or abide by lockdown policies were undermining the COVID-19 recovery effort.

Nearly six months later, landmark policies will be implemented as a result of those Americans defying public health advice. In a nationally televised address on Thursday evening, President Joe Biden announced that he is signing an executive order requiring vaccination for executive branch employees and contractors who do business with the federal government. He also said he is going to require employees of health-care facilities that receive Medicare or Medicaid funding to get their shots, as well as pressure businesses to mandate vaccines for their employees. All told, roughly 100 million Americans will be directly impacted by the new policies.

Introducing his multi-point plan, Biden said that "we have the tools to combat the virus" if we simply "come together" to follow basic public health measures like wearing masks and getting vaccinated. He added that "many of us are frustrated with the nearly 80 million Americans" who have not been vaccinated even though inoculations are free and safe, referring to a "pandemic of the unvaccinated" overcrowding our hospitals.

Placing partial blame at the feet of anti-science elected officials and denouncing "pandemic politics," Biden introduced a six-point plan that includes requiring all employers with more than 100 employees to ensure workers are either vaccinated or tested weekly; forcing all federal workers and all contractors doing business with the federal government to be vaccinated; and requiring employers to provide time off for being vaccinated.

Biden is also funding programs to make sure that vaccinated Americans can receive timely booster shots; doubling the fines on airline and other public transportation travelers who refuse to wear masks; supporting the Food and Drug Administration (FDA) as it evaluates a vaccine for people under twelve; providing more support for small businesses harmed by the COVID-19 pandemic; and requiring educators in the Head Start program to be vaccinated.

As the delta variant has overtaken the country, critics have argued for vaccine and mask mandates that will prevent future lockdowns and protect innocent people (particularly those who are unvaccinated due to lack of access or legitimate fears about racism, rather than because of right-wing ideology). As Salon's Amanda Marcotte wrote last month, Biden could mandate vaccinations for anyone who uses trains and airplanes; local leaders can require vaccines in public places; and schools and businesses can prohibit unvaccinated people from being employed there and/or attending as students. (Recent data reveals that the number of COVID-19 hospitalizations for people under 18 has quintupled since June.)

Although many conservatives are expected to complain about Biden's new policies, those policies are not as sweeping as many historic uses of presidential power.

"This is not the most significant of executive orders," Allan Lichtman, a political scientist at American University, wrote to Salon regarding Biden's decision that all federal employees must be vaccinated. "There is Lincoln's Emancipation Proclamation and his suspension of habeas corpus. There are FDR's orders on private gold holdings, the establishment of the Works Progress Administration, and the internment of the Japanese. There is Reagan's executive order on federal regulations. There are Trump's many orders that rewrote environmental and immigration policy."

He added, "This, however, will be the most significant executive order on public health."

In addition to expanding vaccination in the United States, Biden also promised to address global vaccine inequity through steps he will announce later in the month.

Here's everything we know about the mu variant — the latest coronavirus mutation 'of interest'

A study published last month established that Americans, after years of being pounded with creationist propaganda, had decisively rejected pseudoscience and accepted evolution. While the shift in public opinion had been years in the making, there was a certain poetry to the timing of that study's release. We have watched with bated breath as SARS-CoV-2 — the virus that brought the world to its knees by causing the COVID-19 pandemic — has evolved over and over again into something more effective at spreading through the human population. We have had the delta variant and the lambda variant and the hybrid B.1.429, to name only a few. Whenever a new strain pops up, public health officials try to strike a balance between caution and reassurance.

Now we come to the mu variant, also known as B.1.621.

On Monday the World Health Organization (WHO) officially labeled the mu variant as a "variant of interest," a designation that indicates a need for further study about possible dangers while falling short of the more serious classification, "variant of concern." Variants of concern are regarded as a top priority because they are more immunity-resistant, contagious or deadly than other strains. Currently the WHO considers four strains to meet those criteria: alpha, beta, gamma and delta (the variant most prevalent in the United States).

The WHO reports that the earliest documented samples of the mu variant came from Colombia in January; the strain now makes up 39 percent of all the cases there. It has also been detected in dozens of other countries, most commonly popping up in the United States, Mexico, Spain and Ecuador. A British risk assessment released last month suggests that the mu variant could be at least as resistant to vaccine-based immunity as the beta strain, although more research needs to be done. The mu variant contains several mutations that have been associated with resistance to immunity, such as E484K and K417N, as well as a mutation known as P681H that has been linked to accelerated transmission.

"This variant has a constellation of mutations that suggests that it would evade certain antibodies, not only monoclonal antibodies, but vaccine- and convalescent serum-induced antibodies," President Joe Biden's COVID-19 adviser Dr. Anthony Fauci told reporters on Thursday. "But there isn't a lot of clinical data to suggest that. It is mostly laboratory in-vitro data."

Fauci emphasized that people who are concerned about the mu variant should still get vaccinated. Experts agree that people who are vaccinated are still less likely overall to develop COVID-19. If they do get sick, they are also less likely to become severely ill thanks to the various protections that the vaccines confer. Fauci made this point by talking about how vaccines are still helping people against the delta variant, which is currently surging in the United States.

"Remember, even when you have variants that do diminish somewhat the efficacy of vaccines, the vaccines still are quite effective against variants of that time," Fauci explained. As for the mu variant, he was characteristically candid.

"We're paying attention to it, we take everything like that seriously, but we don't consider it an immediate threat right now," Fauci explained.

The WHO expressed a similar view regarding the mu variant, saying that more studies need to be performed so scientists can precisely understand the nature of the threat it does (or does not) truly pose.

"The epidemiology of the Mu variant in South America, particularly with the co-circulation of the Delta variant, will be monitored for changes," the organization explained.

During the same meeting in which it announced its decision regarding the mu variant, the WHO also announced that it will start naming variants after stars and constellations once it runs out of Greek letters. Mu is the 12th letter in the Greek alphabet, meaning scientists are already halfway through that system. One WHO official wrote to science journalist Kai Kupferschmidt that "they will be less common stars/constellations, easy to pronounce. We are just checking internally with our regional colleagues to ensure none of them cause any offence or are common names in local languages."

As epidemiologist Dr. Eric Feigl-Ding tweeted, "it's sad they need to plan this."

He added that because Andromeda is a constellation, "'Andromeda Strain' is now technically possible as a name," referencing the 1969 Michael Crichton novel that was adapted into a 1971 film.

Why the spotted lanternfly must die

Like many people, I try to avoid killing insects or spiders unless it is absolutely necessary. Yet according to officials, this nonviolent stance towards insects could end up being a threat to the ecosystem of the Northeast. Indeed, the Pennsylvania Department of Agriculture is asking citizens to declare all-out war on one particular insect: the spotted lanternfly. Dispensing with any pretense of bureaucratic detachment, the state's website is admirably blunt:

"Kill it! Squash it, smash it...just get rid of it."

In areas of Pennsylvania like Northampton County (where I live), spotted lanternfly are not hard to find. Despite being only an inch long, the moth-like insect has a beautiful pattern once it spreads its wings. You are greeted with bright red and black spots, similar to a ladybug shell, in sharp contrast to the drab gray, yellow and black-and-white patterns that cover the rest of it. It is hardly the most memorable insect, but it does make an impression.

More to the point, they are everywhere. And if they take over the American northeast, it is going to be a very, very big deal.

"If they go unchecked, they will continue to spread throughout the American northeast as well as to other regions of the US and also potentially to other countries as well," Julie Urban, an associate research professor of entomology at Penn State University, told Salon by email. Urban is the author of "Perspective: Shedding light on spotted lanternfly impacts in the USA," a scholarly article in Pest Management Science that offered projections on what will occur if spotted lanternfly continue to spread through the region. Both in the article and while speaking with Salon, Urban detailed how spotted lanternfly could destroy both lives and the landscape. One reason is that of the two plants that spotted lantern flies have been documented to kill via feeding, one of them — grapes — is a vital crop.

"It has killed grapevines, which of course has direct negative economic impacts for growers," Urban told Salon. "However, it also has caused growers to increase the number of insecticide applications they make to try to control [spotted lanternfly] (which still is not sufficient to allow them to overcome the damage caused by waves of [spotted lanternfly] coming into vineyards, particularly in mid-September), and more insecticide sprays cost more money."

The bugs also harm local tourism economies, especially when they destroy the experience of touring vineyards, hosting events like weddings and holding wine tastings.

All of this is already happening in southeastern Pennsylvania and could occur in areas like Long Island, the Hudson Valley and the Erie regions of both Pennsylvania and New York. The insects have recently been observed in New York City.

Agriculturally speaking, the spotted lanternfly problem is not limited to grapevines. Urban reported that Christmas tree growers and nurseries have also had issues with spotted lanternfly partially damaging their stock. Beyond that, it is expensive to try to keep the insects away from their products, which requires monitoring for every stage of the insect's life cycle, from egg masses through each developmental stage until adulthood. Every occasion when spotted lanternfly are found within a plant — even if they are dead — must be reported, and offending businesses could receive fines and see their reputations harmed. Urban also pointed out that other businesses could be impacted precisely because spotted lanternfly are not indigenous to our ecosystem, and are therefore unpredictable.

Melody Keena, a research entomologist at the United States Forest Service, added that the insects can also be a problem for ordinary people just trying to rest in their homes.

"[Spotted lanternfly] are a nuisance pest of homeowners," Keena explained by email. While scientist are unsure about their long-term impact on landscape trees, they produce "copious quantities of sticky honeydew when they feed and black sooty mold will grow on it." These byproducts can make for slippery surfaces and can be a tripping and slipping hazard.

Some of the gross fluids produced by spotted lanternfly directly contribute to their invasiveness. Their egg masses are covered with a gray waxy material that helps them stay attached to smooth surfaces like tree bark, cinder blocks, stone, shipping pallets, rail cars and automobiles.

Experts believe that the spotted lanternfly entered the United States from its native Asia (particularly China, Vietnam and Bangladesh) after an egg mass attached to a shipment of stone arrived in the Berks area of Pennsylvania. Since then, it has continued to spread as the bugs have dispersed, reproduced, and used their egg-laying habits to hitch rides along human transportation corridors.

"This makes human aided transport likely, and this has contributed to its spread both nationally and internationally," Keena explained.

Going forward, ordinary Americans are being asked to be on the alert for spotted lanternfly in order to halt the invasion.

"There is an active effort underway to reduce [the lanternfly's] populations," Erin Otto, National Policy Manager, in the U.S. Department of Agriculture's (USDA) Animal and Plant Health Inspection Service (APHIS), told Salon by email. "It may not have as many natural predators here as it does in its native habitat, but APHIS, state departments of agriculture, and U.S. residents are working to contain and manage the pest."

Matthew Helmus, an assistant professor of biology at Temple University, told Salon by email that this will include supporting funding for programs that survey for spotted lanternfly, removing trees of heaven from one's property (the other plant the insects are known to kill by feeding on it), and learning how to identify the spotted lanternfly when it appears on one's property.

And if you happen to spot one? Well, the Pennsylvania Department of Agriculture's website tells you exactly what you should do next.

How civilizations thwart extinction in the face of existential crises

Humans set themselves apart from other animals in their ability to recognize and stave off coming threats. A tiger or a dove cannot plan for a flood that might be decades off and build its den defensively; humans, on the other hand, are blessed with the ability to anticipate threats that may be far in the future.

It is odd, then, to think that American civilization is (for the most part) keenly aware of the threat of climate change, yet has so far proved incapable of compensating for and overcoming it. We build our cities to withstand hurricanes and floods, heat and cold; we build dams and windmills, and we re-routed the Colorado River to keep Phoenix and Tucson hydrated at a cost of $4 billion. We understand climate change's potential to disrupt our lives in unimaginable ways, or even bring about the end of human civilization. So why can't our civilization wrap its collective mind around it?

Part of it may be that reality is hard to imagine. For one thing, no one alive today has lived through the collapse of an entire civilization; the actual day-to-day experience of witnessing that process is alien to us.

Still, it is difficult to conceive of why any society, particularly ours, would allow itself to be brought to extinction. We already know climate change is feeding the wildfires that consume the Pacific Coast and causing rising sea levels which will soon flood cities like New York. Large sections of the planet will become too hot or dry to inhabit (goodbye, Phoenix), extreme weather events like hurricanes will be much more common and supply chains for everything from food to microchips will break down.

Yet despite this knowledge, we still seem poised to "walk the plank with our eyes wide open," to quote Gotye's hit 2010 song on the topic. Even after the United Nations climate panel released yet another report warning of an environmental apocalypse unless we eliminate greenhouse gas emissions, humanity is not doing the one thing it must: dropping all other priorities until the existential threat is neutralized.

The strange thing is that, if you look to history, there are other situations in which societies faced, and recognized, existential threats. Some rose to the occasion and overcame them; others saw the writing on the wall but were incapable of changing their ways. The sagas of previous civilizations that recognized these threats — and either bowed to them or planned around them — are key to understanding the plight of our empire as it struggles to address the climate crisis.

* * *

We can start with a lesson in when things went wrong — namely, Ireland, where a completely foreseeable famine occurred despite ample warnings.

The Great Famine of 1845 to 1852 occurred due to a brutal combination of economic and ecological injustices perpetrated by Ireland's English conquerors. On the economic side, English and Anglo-Irish landlords and middlemen (people tasked with collecting rent) controlled the island's land and wealth, leaving the Catholic common folk in abject poverty. Their ability to control land was so limited that they had to rely on potatoes for food, since those could be grown cheaply, abundantly and on lower-quality land. More than 175 government panels of various kinds warned that the economic system in Ireland was unsustainable, as millions were already suffering through starvation and ill-health and the slightest breeze of bad luck could knock down the whole house of cards.

That figurative breeze finally came in 1845 in the form of Phytophthora infestans, a fungus-like organism that demolished potato crops throughout Ireland. The blight ruined up to one-half of the potato crop in 1845 and three-quarters of it over the following seven years; the resulting Great Famine ultimately led to at least 1 million deaths and the emigration of at least another 1 million souls (many of them to America). England implemented some half-measures in a semi-sincere attempt to alleviate the misery — it repealed laws that made staple foods like bread and corn prohibitively expensive, for example — but refused to check the avarice of the elites. The Irish working class was still forced to produce foods like beans, peas, honey and rabbits that were exported to other countries (in some cases, like with butter and livestock, exports may have increased). Economic elites were still allowed to exploit the Irish working class in a number of ways. The mere concept of widespread wealth redistribution, which alone could have created economic justice and responsible land management, was never seriously considered.

Ireland therefore stands mainly as an object lesson in the danger of letting special interests dictate policy, particularly when experts can foresee adverse consequences. The powers controlling Ireland had every reason to know that disaster was afoot but, because they did not value Irish lives, failed to avert a humanitarian catastrophe. Business logic prevailed over plain old logic.

It doesn't help that, unlike the Irish of the mid-19th century, Earthlings in the early 21st don't have the option of emigrating.

* * *

The Netherlands presents us with a more hopeful example of a civilization facing an existential threat and dealing with it. As far back as the days of the Roman Empire, the area now known as the Netherlands was notorious for its intense flooding because of its low elevation. Observers and inhabitants alike knew that if the region was to be populated, the flooding would have to be controlled.

Thus, by the latter half of the first millennium AD, the inhabitants of the Netherlands had already begun to construct rudimentary dikes. Dike construction picked up in earnest around the start of the second millennium, and by the 13th century most of the dikes had been linked together into a continuous sea-defense system. Dutch officials even created local water boards around this time in order to maintain the dikes.

Crucially, Dutch politicians continued to work together over the years to maintain their dikes, with most major political institutions in the country recognizing that their very survival depended on them. In 1953, when the sea level on the Dutch coast rose so high that flooding destroyed millions of dollars in property and caused 1,835 deaths in the Netherlands (as well as 307 in England), the government quickly created a so-called "Delta Commission." The panel came back with a suggestion that the country create an elaborate infrastructure of dikes, dams, storm barriers and sluices so that future catastrophic flooding would not occur. As a result, no one in the Netherlands has been killed by a flood since the 1953 disaster.

There are several valuable lessons to be found in the Dutch example. First, all sides were willing to set aside their differences on matters that literally meant life or death for their civilization. Second, there was a pragmatic approach to problem-solving, gradual when that sufficed and drastic when necessary, emphasizing state-of-the-art knowledge over any particular philosophical approach. Lastly, of course, is the fact that governing authorities listened to scientific experts (or, if we're talking about the medieval era, their equivalents). This not only saved lives, but protected Dutch civilization and allowed it to become a modern empire — which, ironically, is now implicated in the global capitalist order that is driving up carbon emissions.

There is also a contested example of a civilization collapsing which, despite the debate around it, has many lessons for our modern predicament: that of Easter Island. In his 2005 book "Collapse: How Societies Choose to Fail or Succeed" (which built on the work of other scholars tracing back to the 1970s), biologist Jared Diamond posited that the Rapa Nui people on Easter Island (which is also known as Rapa Nui) saw their civilization collapse because their population grew too large and they overused the island's limited resources. In particular, Diamond notes that they over-harvested the wood that was used to construct moai, or large statues of forefathers that owners used to flaunt their wealth and status. The loss of trees supposedly led to soil erosion, making it difficult to grow food in a society that (without wood) people could not leave. The end result, Diamond argues, was cannibalism, warfare and a major population decline.

If true, Diamond's hypothesis would have obvious implications for global warming: a society engaged in deforestation which led to a concomitant collapse in the surrounding ecosystem, causing a food shortage. This happened in a closed system, an island, that was difficult to emigrate from.

Yet not everyone agrees with Diamond's theory. Speaking to Salon by email, scholars Robert DiNapoli, Terry L. Hunt, Carl P. Lipo and Timothy Rieth argue that the archaeological record does not support Diamond's hypothesis. Rather, they believe that the evidence suggests a "population that arrived on the island ca. 1200 AD, grew to a size that was supported by the resources available on the island, and live in communities that drew benefits from monument construction." They argue that these communities were sustainable for more than 500 years.

The "Rapa Nui people were able to adapt to their local environments and live successfully despite the island's inherent constraints (size, remote location, few natural resources, etc.)," the scholars said.

In this narrative, what upset the balance on Rapa Nui was, quite simply, European colonialists. They performed slave raids which took thousands of people off the island, brought diseases with them, and devastated the island's environment. (For 50 years, the island was managed as a sheep ranch through a foreign corporation.) All of these factors spread needless suffering and misery to the Rapa Nui and cost incalculable lives. Yet in spite of this, the Rapa Nui people survived, and today more than 5,000 live on the island, speak their ancestral language and practice their native culture. Those moai, maligned as an unnecessary waste of resources by Diamond, may have even helped, since they brought communities together.

"The very thing that people have said was the 'downfall' of Rapa Nui society is almost certainly a key part of their success," the scholars told Salon.

The lesson of the Rapa Nui is not so simple. Yet their plight speaks to the importance of monuments and other cultural artifacts in civilizations — keystones that help civilizations survive in the face of external threats, whether imperialism, greed or climate change.

Joe McCarthy was never defeated — and Donald Trump now leads the movement he created

A few years ago, I was interviewing Roger Stone when he happened to use the phrase "new McCarthyism," describing those who accused former Trump campaign chair Paul Manafort of being a tool of Russian interests. This was more than a little ironic for abundant reasons, especially given that as a younger man, Donald Trump had been mentored by the infamous Roy Cohn, Joe McCarthy's right-hand man.

Stone tried to defend himself by saying that he'd read M. Stanton Evans' book "Blacklisted by History," and found it "a more balanced review of exactly what McCarthy was talking about and what he did." That didn't make much sense either: Evans' book is a revisionist attempt to defend McCarthy that is widely dismissed by serious historians. It's not surprising that a longtime Republican operative would read it — but then, if Stone was on McCarthy's side, why was he accusing other people of "new McCarthyism"?

Stone tried to salvage that one too, arguing, "Whether I like it or not, people view McCarthyism, as a label, as the hurling of false accusations." Overall, though, there was more truth-telling in that exchange than you normally get from Stone. The Republican Party of 2021 is very much the party that McCarthy envisioned, centered on a supposed strongman's personality, viciously seeking to destroy any outsiders seen as threats and rooted in blatant bigotry. In that context, it's important to clarify what Joe McCarthy did and why his legacy is still dangerous.

McCarthy was elected to the Senate from Wisconsin in 1946, and his early years in office were unmemorable — except for one revealing episode. He denounced the death sentences handed down in U.S.-occupied Germany to a group of Waffen-SS soldiers convicted of murdering American troops in an event known as the Malmedy massacre. This moment in McCarthy's career, though virtually forgotten today, is highly instructive. On the surface, he presented himself as a crusader for justice, arguing that the Army was covering up judicial misconduct and that this called into question the validity of the Germans' confessions. (He never provided any evidence for this.)

In fact, McCarthy was doing something much more sinister. On some level he understood that defending a group of Nazis would appeal to the antisemitic American far right at a moment when expressing public hatred for Jews was unacceptable. At least implicitly, McCarthy was accusing the Jewish Americans who helped investigate the crimes of seeking vengeance and perpetrating injustice. Today we might call this flipping the script: Suddenly Jewish people, in the immediate aftermath of the Holocaust, were persecutors, and "Aryan" Germans — those who had committed mass murder — were their victims.

That wasn't enough to make McCarthy a right-wing superstar, probably because any hint of pro-Nazi sympathies was completely out of bounds in the postwar years. McCarthy needed a different vehicle to achieve political stardom and found it in 1950 when, while delivering a speech in West Virginia, he claimed to have a list of more than 200 known Communists who were allowed to work in the State Department. (No such list existed.)

The speech was a smash hit, and over the course of four years the Wisconsin senator accused countless people of either actually being Communists, being "Communist sympathizers" (whatever that meant) or being "soft on Communism," a hopelessly vague term that could be applied to almost anyone who didn't support open military confrontation with the Soviet Union. With America on edge during the early years of the Cold War, McCarthy inflamed widespread paranoia without once providing evidence that any of his targets had done anything illegal. That didn't much matter: He was saying what unhappy right-wingers wanted to hear, and they supported him with enthusiasm, which gave him tremendous political influence. (Yeah, some of this might sound familiar.)

Many of McCarthy's targets were political opponents, like Sen. Millard Tydings, a Maryland Democrat who had criticized him, and Illinois Gov. Adlai Stevenson, the Democratic presidential nominee in both 1952 and 1956. He also persecuted members of marginalized groups, claiming that they could be vulnerable to Communist influence: Today we would say he was obsessed with the "cultural elite," going after East Coast intellectuals and LGBT people (although that term did not exist), left-wing activists and journalists, members of the Washington political establishment and, of course, Jews. His strident attacks powered Republican victories in the 1950 midterm elections, and plenty of Southern Democrats liked him too.

McCarthy was rewarded with a powerful chairmanship at the Senate Committee on Government Operations, with Cohn as chief counsel and the young Robert F. Kennedy as an assistant counsel. There he targeted the Voice of America, the overseas library program of the International Information Agency (this led to book burnings), several prominent Protestant clergymen and finally the U.S. Army. That last crusade proved to be a bridge too far: Joseph Welch, chief counsel for the Army, called out McCarthy on national television for his cruelty and recklessness, famously demanding, "Have you no sense of decency, sir, at long last?"

The American public, seeing McCarthy exposed as a bully and liar, rapidly turned against him. He died in 1957, likely as a result of alcoholism, but had he lived he would probably have lost his Senate seat the following year. Although the term "McCarthyism" had been coined well before his downfall, that downfall guaranteed it would be an epithet rather than a compliment. From that point on, even Republicans began using the term "McCarthyist" to refer to baseless and malevolent smears.

This brings us to the Trump era. First of all, the accusations that Trump's campaign colluded with Russian agents in 2016 are not "McCarthyist," both because they were highly plausible (and at least partly true) and because they had nothing to do with left-wing or Communist ideology. For a better idea of what McCarthyism actually entails, consider this passage from a 2017 article about Cohn's influence on Trump. It practically lays out, step by step, the ways that Trump's narcissism would later fuel his attempts to overturn the 2020 election:

For author Sam Roberts, the essence of Cohn's influence on Trump was the triad: "Roy was a master of situational immorality. ... He worked with a three-dimensional strategy, which was: 1. Never settle, never surrender. 2. Counter-attack, counter-sue immediately. 3. No matter what happens, no matter how deeply into the muck you get, claim victory and never admit defeat." As columnist Liz Smith once observed, "Donald lost his moral compass when he made an alliance with Roy Cohn."

That attitude is a key element of McCarthyism. The only two ingredients missing from that description are the blatant pandering to bigotry and paranoia and the way supporters are seduced by a narcissist's charismatic allure into a sense of shared omnipotence with them. Without the former, the McCarthyist lacks the fuel necessary to whip up the mob against supposed enemies; without the latter, the demagogue can't convince the mob that his individual desires are also their own.

The bogus and evidence-free claim that Trump really won the 2020 election is quintessentially McCarthyist: Trump refused to settle or admit defeat, proclaiming victory before all the votes had been counted and filing dozens of nonsensical lawsuits. Like McCarthy and Cohn, Trump gaslit America. As with McCarthy's claims that he had lists of Communist agents in the government, Trump's empty allegations force his supporters either to take him at his word or reveal their disloyalty — and nobody who wants a career in Republican politics can afford to be disloyal to Trump at the moment. In both cases, proof was no longer needed, and on some level was viewed with scorn. To doubt Joe McCarthy in the early '50s was to become an accomplice to the Communist conspiracy, just as anyone who rejects Trump's Big Lie today is clearly a socialist antifa liberal.

That is how a lie becomes political dogma, a phenomenon also visible in the current right-wing obsession with "critical race theory." Just as McCarthy defined "Communism" so broadly that it lost all meaning, opposition to "critical race theory" has very little to do with the academic approach that term actually describes — but a great deal to do with maintaining white supremacy. Salon's Chauncey DeVega has described it this way:

For today's Republicans, Trumpists and other members of the white right, "critical race theory" is a form of political ectoplasm: It's both a liquid and a solid, something slimy and sticky which can be shaped into whatever frightening or dangerous thing suits their mood and needs in a given moment.
In this political context, "critical race theory" means both everything and nothing; it is a fetish object used to summon up centuries-old racist nightmares and fears about "scary" Black and brown people who are plotting a rebellion or uprising to undermine the (white) family, indoctrinate (white) children and attack (white) America.
By implication, if "critical race theory" and other Black and brown bogeymen are threats to (white) America, then preemptive violence is both necessary and reasonable. Moreover, multiracial democracy is seen, by definition, as incompatible with white people's safety, security and material interests.

In channeling McCarthyism, whether consciously or otherwise, Trump has been successful to a degree McCarthy himself could only have dreamed about. But the connection is clear. While McCarthy was personally discredited, he made it difficult for any prominent American to express unpopular or radical views without being accused of disloyalty or possessing "Communist sympathies." The McCarthyist current has been with us ever since, and as Trump's career demonstrates, has not yet been defeated. If anything, it appears to be winning. Roger Stone is correct, in an upside-down fashion: There is a new McCarthyism in America today, and his pal Donald Trump and his supporters are the ones practicing it.

How climate change will disrupt supply chains: Experts say you should expect more shortages in the future

Dr. Thomas Goldsby, a professor of supply chain management at the University of Tennessee-Knoxville's Haslam College of Business, assigns his undergraduate students a "routine exercise" that frequently proves revelatory. Its purpose is to illustrate the complexity of the various trade routes that bring products from all over the world to consumers. The assignment is to figure out how far the students can trace the supply chain — if possible, going back to the exact point when the raw materials were extracted.

"When my students have had an opportunity to present their results to the companies and the products that they produce, the company executives learn something every time," Goldsby told Salon. "I just think it's remarkable that my undergraduate students can present news about the business or the products and the senior executives are like, 'Wow, we had no idea that a golf club manufacturer is wondering why they have a hard time getting titanium.' It's because there's not a lot of titanium that goes into a golf club, but there's a whole heck of a lot of it that goes into an aircraft to build the fuselage."

Goldsby uses this exercise to explain the complexity of the supply chains and how unanticipated hiccups — such as another industry wanting a resource you need, and you not knowing it — can have drastic, unexpected consequences.

The interconnected nature of the supply chain means that seemingly unrelated things can have an effect on each other — say, a global pandemic and a microchip shortage. Microchips, the sets of circuits hosted on small flat pieces of silicon, are intrinsic to so much of industrial civilization: they are used in computers, cars, mobile phones, home appliances and virtually all other electronic equipment. We already have a shortage of microchips because of COVID-19. Yet it is going to get a whole lot worse because of global climate change.

Pandemics may not seem to have much to do with the manufacture of microchips; silicon chips, certainly, cannot contract the virus. Yet the supply chain for microchips is fickle: historically, chipmakers were usually able to keep pace with growing demand for chips in products like automobiles and home electronics. But the pandemic interrupted that rhythm by causing consumers to behave in unpredictable ways, with manufacturers struggling to correctly foresee how many chips they would need for everything from Volkswagens to Playstations. Because the supply chains are so complicated, this made it easier for problems to arise that delayed production or transportation.

Worse, the industry has a lot of bottlenecks. There are only a handful of foundries that account for most of the world's chip fabrication, resulting in roughly 91% of the contract chipmaking business being located in Asia. This makes countries like the United States vulnerable to production disruptions either in those distant lands or at any step along the way. Likewise, there are companies in the United States, Japan, the Netherlands and elsewhere that have also found ways to make themselves indispensable to the global manufacturing of microchips. The end result is that this particularly important piece of equipment is especially vulnerable to shortages when there are unexpected alterations to consumer demand, a phenomenon known as the bullwhip effect.

Experts do not believe that the chip shortage is going to end anytime soon, but it is only the beginning of the problem. If you thought COVID-19 caused problems for supply chains, imagine how they'll be blown apart when climate change causes extreme weather events, rising sea levels and massive spikes in temperature all over the world. There will be increasingly frequent and severe wildfires on the Pacific Coast, flooding in our eastern cities and millions of refugees. It is impossible to anticipate the number of new variables this will throw into orderly supply chain management — other than accepting that particularly intricate supply chains are almost certainly going to start coming apart.

"The industry is very clearly dependent on globally-interconnected supply chains and distribution systems," Dr. Michael E. Mann, a distinguished professor of atmospheric science at Penn State University, told Salon by email. "Anything, such as COVID-19, that disrupts transportation is going to disrupt these supply chains and distribution systems and lead to bottlenecks and backlogs."

Mann cited a recent report by the United Nations' Intergovernmental Panel on Climate Change, explaining that this "will clearly lead to delays in the distribution of microchips and will presumably have an adverse impact on the semiconductor and computer industries."

The most obvious solution to this problem, naturally, would be for world leaders to take global warming seriously and do whatever it takes to both reduce greenhouse gas emissions and fix the damage already done to our planet. Frustrating though it may be, however, there are practical geopolitical realities which strongly suggest this may not happen. That means we face a chip shortage in the foreseeable future — among other shortages.

"I think that we need to acknowledge that there are interactions that we don't yet fully understand and appreciate," Goldsby told Salon. He later added, "We've got to go back to raw material extraction and we need to try to map it out and understand where those raw ingredients come from."

This is more challenging than it might seem, because despite their small size, microchips are very intricate.

"It's basically like making a cake," Dr. Ron Olson, Director of Operations at Cornell NanoScale Science and Technology Facility, told Salon. "You start off with this base layer, then you add and subtract metals and oxides in different layers, just like following a recipe on a cake or baking or whatever. Then you end up with your final product."

Dr. Christopher K. Ober, a professor of materials engineering at Cornell University, explained that one way to get around possible shortages is to follow the example of a car manufacturer that proved more resilient to supply chain issues than other organizations.

"Toyota really was a big proponent of lean manufacturing," Ober explained. "Basically they didn't keep anything in warehouses. It was delivered in a truck at the same time they were going to put it into a car. What Toyota learned, I think it was because of [the Fukushima earthquake in 2011], they couldn't entirely depend on instantaneous delivery. They actually had to start storing critical parts that might be hard to access."

While this kind of smart resource allocation can protect companies from immediate issues like chip shortages, one economist argued to Salon that we need to stop assuming lengthy supply chains are an inevitable and necessary part of our economy. According to Dr. Richard D. Wolff, professor emeritus of economics at the University of Massachusetts Amherst, lengthy international supply chains were developed by a handful of powerful corporate elites to maximize their profits — regardless of the obvious fragilities in such arrangements.

"It was a deliberate economic sequence of decisions made by a particular group of people for particular purposes that created the global supply chain," Wolff explained, saying the choices were made by the thousands of people who comprise the boards of directors in American corporations. "They decided, starting in the 1970s, that American capitalism had reached a kind of tipping point. It had grown spectacularly over the previous century. It had made a ton of money, but along the way, it had had to compensate the working class. Not on a scale that the working class deserved, needed or wanted, but because of the unionization, they had to come across with something."

They did, improving working conditions for American employees at the behest of labor, but in the process corporations saw their profits decline. So they turned to nations in Africa, Latin America and Asia. Many of them had only recently shaken off the yoke of European colonialism and could not protect their workers as well as American workers were protected. At the same time, their citizens had taken over their governments and had been able to improve the education, health care and other social systems for their people (systems that had been neglected by the European colonial powers). This meant that their workers could be substituted for American workers at much lower cost.

"Long story short, a massive relocation of business was accomplished by the capitalists of the world, in which they went to China, India, Brazil, and other places far away from the centers of capitalism in Western Europe, North America and Japan," Wolff observed. "That's why we have long supply chains. It was the corporate leadership that made the decision to maximize their profits by moving all that."

Like so many other features of capitalism, the development of long supply chains seems poised to self-immolate because of climate change. Unfortunately, there is no sign that the economic titans are interested in changing the way supply chains work at this juncture. That means we are facing a future where we lose access to the internet, can no longer watch television or movies, have countless useless cars and struggle to find food. Unless the economy is restructured to end both climate change and ridiculously long supply chains, it is hard to see where realistic hope could come from.
