Matthew Rozsa

Experts explain how to tell if someone is lying to you — without even hearing them talk

In a court hearing that went viral earlier this year, Coby Harris, a Michigan man accused of assault, is caught red-handed at his alleged victim's house. As the presiding judge notes, the bizarre moment would have been inconceivable before COVID-19 forced court proceedings to be conducted over Zoom: Harris had violated the conditions of his bond by not merely contacting his accuser, but secretly being at her side during the hearing itself. It is difficult to overstate the danger; a domestic abuser in such a situation could not only intimidate a witness, but physically harm or even kill them.

Fortunately, Deborah Davis, the assistant prosecuting attorney, felt something was off. In the video she informs the judge that she believes Harris and the accuser are in the same residence, citing "the fact that she's looking off to the side and he's moving around." The situation is promptly investigated, with police being called after Harris refuses to show his address on-screen and thereby prove he is not potentially intimidating the accuser. The clip reaches a climax when Harris abruptly reappears on camera while being arrested. Cigarette limply dangling out of his mouth, he utters a half-hearted apology to the court before painting himself as the victim. Davis facepalms, looking incredulous and disgusted as her worst suspicions are confirmed.

"At that very moment, I was dumbfounded that it actually happened the way that it did," Davis recalled to Salon, adding that she also felt significant relief that a possible crisis had been averted and the accuser was now safe.

What struck this author, however, was the fact that Davis had been able to spot Harris' lie at all. As someone on the autism spectrum, I struggle to read social situations; looking at the same footage as Davis, I only observed people staring blankly at their phones. I asked Davis how she was able to detect Harris' dishonesty based on such a seemingly sparse amount of information.

As it turns out, the answer has a lot to do with observational intelligence. To identify a lie as Davis did, the key is to pay attention to little details that are incongruous or simply strike you as off.

"My radar went up when her eyes were shifting and she wasn't answering the questions that we had just spoken about," Davis explained. The accuser shifting her eyes struck Davis as alarming because, while an accuser might look to the side at a defendant when both appear in a courtroom, people usually do not glance to the side during Zoom calls. Davis also described reviewing how the procedure works with the accuser shortly before the hearing, and made a mental note when the accuser suddenly began moving away from what had just been discussed.

"It's those types of non-verbals — where they know what you're asking, they know why you're asking it, and they know the standard that needs to be proven in order to move the case forward, but they have now backtracked on what they were saying," Davis told Salon.

It is notable that Davis' analysis came not just from non-verbals, but non-verbals that she understood within a specific context. Science is pretty definitive about the idea that you can't detect a lie based on non-verbal information alone; indeed, there is just no evidence that it works. Decades of scientific research and literature have failed to yield any consistent information about looks, sounds and other non-verbal cues that can be indisputably linked to deceit. On many occasions, professionals whose jobs supposedly make them adept lie-detectors (psychiatrists, police officers, job recruiters) have proven no better at spotting fibs in experiments than laypeople.

This does not mean that you can't figure out when someone is lying. (Just ask Davis.) To do so, however, you need to apply logic to your interpretation of their behaviors and find out where those behaviors would not make sense if the person were being truthful.

"It is hard to be consistent when lying," logician Miriam Bowers-Abbott, an associate professor at Mount Carmel College of Nursing, told Salon by email. "A liar must truly embrace any lie as a complete lifestyle to create consistency — like a delusion. Most people don't make that commitment."

This makes it relatively easy to expose bald-faced liars if you simply know how to grill them on inconsistencies. For someone telling a less brazen falsehood, however, you need to look for more subtle indicators, "a peculiar vagueness that someone might use to create a false impression." For instance, a person might say "a lot of people" feel a certain way or say that "some" individuals have a problem. Those statements may sound alarming, but they aren't specific; "a lot" and "some" could mean any number, and reveal precious little about the details within those numbers. To expose such possible lies, it is important to press the potential liars for the kind of information that they should already have if they are telling the truth.

Yet Bowers-Abbott warned against making assumptions about an individual's truthfulness based purely on physical signs. Like an actual lie, believing strictly in physical cues can lead you down a road of deception.

"I think it's common to look for physical signs, like lack-of-eye-contact, to indicate deception," Bowers-Abbott explained. Yet she added, "our world is more multicultural than it used to be, and there are many cultures where it's more normal to use less eye-contact. So, eye-contact isn't always a great clue."

The same principle applies to other supposed tells.

"How about hesitation?" Bowers-Abbott rhetorically asked. "Well, some people hesitate, when they're trying to be as accurate as possible. So hesitation isn't really a clue either." She prefers to look for signs such as a person who avoids questions, provides vague and ambiguous answers, tries to change the subject or is obviously pretending to misunderstand what their interrogator is saying. This approach involves analyzing the entire content of what a person does, while remaining sufficiently detached from the story they're trying to sell that you don't get suckered in by it.

It is also important to trust your instincts. David Ranalli, a magician, speaker and emcee, wrote to Salon that in his experience "there is no foolproof way to know if a person is lying." Ranalli explained that he gathers information from a number of places at once — whether the person's body language indicates they are nervous, whether their story seems believable, if they stay on the same subject for a long time — and draws his conclusions accordingly.

On one occasion, his honed instincts may have prevented a grisly incident.

"I once spotted someone lying during a very important and dangerous moment in my show," Ranalli recalled. "I play a game of Russian roulette with staple guns in the show, and I have someone mix a loaded gun among three empty ones. At the end of the routine, I staple one of the guns to my neck, betting on my ability to know which gun is loaded."

During one performance, someone switched out the staples in the gun. "While it didn't matter which gun the staples were in, adding a lie into the fold gave me an eerie feeling that could have resulted in a dangerous outcome," Ranalli explained. When he asked if the staples had been switched and the person said no, Ranalli asked the audience if they had witnessed something sneaky.

"They all shouted yes," Ranalli told Salon. "It still ended with a fun outcome, but could have ended badly if I didn't notice that moment."

Not all scenarios involving spotting liars have a "fun" result. As Davis told Salon, much of her career has been spent fighting for victims of domestic abuse, and she has therefore seen the damage that can occur when liars have their way. She talked about victims who are told by their abusers to wear sleeves that will cover the bruises on their arms or to avoid speaking freely to religious leaders, and who are sometimes intimidated by stern looks where words cannot be used. And even when the liar is exposed for all the world to see, the sight can be horrifying.

This brings us back to Harris. Roughly two weeks after his infamous Zoom hearing, Harris had another court date, this one from jail. When things did not go his way during that hearing, Harris can be seen on video acting hysterically — screaming, wildly gesticulating, menacingly approaching the camera, at one point even storming out of the call room. Because his microphone was malfunctioning, Harris could be seen but not heard. As a result, his facial expressions and body language were a veritable smorgasbord of nonverbal communications — and their message was so explicitly violent that even I could easily decipher them.

"Looking at the behaviors of the defendant, based on the testimony of the victim, to me that says a lot about the truthfulness of the victim," Davis said. This happens in court a lot when abusers see their plans for getting off go awry; "you can usually see some form of a meltdown, or in other cases I've seen in Zoom where the defendant changes their posture and it goes from sitting and looking straight at the camera to standing and with their arms folded with the camera pointed up with more of an intimidation type of stance." While judges will never allow that kind of behavior, in court or Zoom, if they spot it, Davis explained that "for me as a prosecutor, sometimes I want to see that because I want to know whether or not we're on the right track and is justice being served, or is it somebody who's maybe fabricating or exaggerating what has happened."

In that moment, as he angrily flailed about, Harris illustrated a very important point about lying: Sometimes, no matter how hard you try, the uncomfortable truth will be stamped all over your body.

Biden issues a sweeping, unprecedented executive order to increase vaccinations in the US

As mutant coronavirus strains surged across the United States, Dr. Rochelle Walensky was alarmed.

"I'm going to reflect on the recurring feeling I have of impending doom," Walensky, the director of the Centers for Disease Control and Prevention (CDC), told reporters at a press conference in March. "We have so much to look forward to. So much promise and potential of where we are and so much reason for hope. But right now I'm scared." The reason, simply put, was that Americans who did not want to get vaccinated, wear masks or abide by lockdown policies were undermining the COVID-19 recovery effort.

Nearly six months later, landmark policies will be implemented as a result of those Americans defying public health advice. In a nationally televised address on Thursday evening, President Joe Biden announced that he is signing an executive order requiring vaccination for executive branch employees and contractors who do business with the federal government. He also said he is going to require employees of health-care facilities that receive Medicare or Medicaid funding to get their shots, as well as pressure businesses to mandate vaccines for their employees. All told, roughly 100 million Americans will be directly impacted by the new policies.

Introducing his multi-point plan, Biden said that "we have the tools to combat the virus" if we simply "come together" to follow basic public health measures like wearing masks and getting vaccinated. He added that "many of us are frustrated with the nearly 80 million Americans" who have not been vaccinated even though inoculations are free and safe, referring to a "pandemic of the unvaccinated" overcrowding our hospitals.

Placing partial blame at the feet of anti-science elected officials and denouncing "pandemic politics," Biden introduced a six-point plan that includes requiring all employers with more than 100 employees to ensure workers are either vaccinated or tested weekly; forcing all federal workers and all contractors doing business with the federal government to be vaccinated; and requiring employers to provide time off for being vaccinated.

Biden is also funding programs to make sure that vaccinated Americans can receive timely booster shots; doubling the fines on airline and other public transportation travelers who refuse to wear masks; supporting the Food and Drug Administration (FDA) as it evaluates a vaccine for people under twelve; providing more support for small businesses harmed by the COVID-19 pandemic; and requiring educators in the Head Start program to be vaccinated.

As the delta variant has overtaken the country, critics have argued for vaccine and mask mandates that will prevent future lockdowns and protect innocent people (particularly those who are unvaccinated due to lack of access or legitimate fears about racism, rather than because of right-wing ideology). As Salon's Amanda Marcotte wrote last month, Biden could mandate vaccinations for anyone who uses trains and airplanes; local leaders can require vaccines in public places; and schools and businesses can prohibit unvaccinated people from being employed there and/or attending as students. (Recent data reveals that the number of COVID-19 hospitalizations for people under 18 has quintupled since June.)

Although many conservatives are expected to complain about Biden's new policies, they are not as sweeping as many historic uses of presidential power.

"This is not the most significant of executive orders," Allan Lichtman, a political scientist at American University, wrote to Salon regarding Biden's decision that all federal employees must be vaccinated. "There is Lincoln's Emancipation Proclamation and his suspension of habeas corpus. There are FDR's orders on private gold holdings, the establishment of the Works Progress Administration, and the internment of the Japanese. There is Reagan's executive order on federal regulations. There are Trump's many orders that rewrote environmental and immigration policy."

He added, "This, however, will be the most significant executive order on public health."

In addition to expanding vaccination in the United States, Biden also promised to address global vaccine inequity through steps he will announce later in the month.

Here's everything we know about the mu variant — the latest coronavirus mutation 'of interest'

A study published last month established that Americans, after years of being pounded with creationist propaganda, had decisively rejected pseudoscience and accepted evolution. While the shift in public opinion had been years in the making, there was a certain poetry to the timing of that study's release. We have watched with bated breath as SARS-CoV-2 — the virus that brought the world to its knees by causing the COVID-19 pandemic — has evolved over and over again into something more effective at spreading through the human population. We have had the delta variant and the lambda variant and the hybrid B.1.429, to name only a few. Whenever a new strain pops up, public health officials try to strike a balance between caution and reassurance.

Now we come to the mu variant, also known as B.1.621.

On Monday the World Health Organization (WHO) officially labeled the mu variant as a "variant of interest," a designation that indicates a need for further study about possible dangers while falling short of the more serious classification, "variant of concern." Variants of concern are regarded as a top priority because they are more immunity-resistant, contagious or deadly than other strains. Currently the WHO considers four strains to meet those criteria: alpha, beta, gamma and delta (the variant most prevalent in the United States).

The WHO reports that the earliest documented samples of the mu variant came from Colombia in January; the strain now makes up 39 percent of all the cases there. It has also been detected in dozens of other countries, most commonly popping up in the United States, Mexico, Spain and Ecuador. A British risk assessment released last month suggests that the mu variant could be at least as resistant to vaccine-based immunity as the beta strain, although more research needs to be done. The mu variant contains several mutations that have been associated with resistance to immunity, such as E484K and K417N, as well as a mutation known as P681H that has been linked to accelerated transmission.

"This variant has a constellation of mutations that suggests that it would evade certain antibodies, not only monoclonal antibodies, but vaccine- and convalescent serum-induced antibodies," President Joe Biden's COVID-19 adviser Dr. Anthony Fauci told reporters on Thursday. "But there isn't a lot of clinical data to suggest that. It is mostly laboratory in-vitro data."

Fauci emphasized that people who are concerned about the mu variant should still get vaccinated. Experts agree that people who are vaccinated are still less likely overall to develop COVID-19. If they do get sick, they are also less likely to become severely ill thanks to the various protections that the vaccines confer. Fauci made this point by talking about how vaccines are still helping people against the delta variant, which is currently surging in the United States.

"Remember, even when you have variants that do diminish somewhat the efficacy of vaccines, the vaccines still are quite effective against variants of that time," Fauci explained. As for the mu variant, he was characteristically candid.

"We're paying attention to it, we take everything like that seriously, but we don't consider it an immediate threat right now," Fauci explained.

The WHO expressed a similar view regarding the mu variant, saying that more studies need to be performed so scientists can precisely understand the nature of the threat it does (or does not) truly pose.

"The epidemiology of the Mu variant in South America, particularly with the co-circulation of the Delta variant, will be monitored for changes," the organization explained.

During the same meeting in which it announced its decision regarding the mu variant, the WHO also announced that it will start naming variants after stars and constellations once it runs out of Greek letters. Mu is the 12th letter in the Greek alphabet, meaning scientists are already halfway through that system. One WHO official wrote to science journalist Kai Kupferschmidt that "they will be less common stars/constellations, easy to pronounce. We are just checking internally with our regional colleagues to ensure none of them cause any offence or are common names in local languages."

As epidemiologist Dr. Eric Feigl-Ding tweeted, "it's sad they need to plan this."

He added that because Andromeda is a constellation, "'Andromeda Strain' is now technically possible as a name," referencing the 1969 Michael Crichton novel that was adapted into a 1971 film.

Why the spotted lanternfly must die

Like many people, I try to avoid killing insects or spiders unless it is absolutely necessary. Yet according to officials, this nonviolent stance towards insects could end up being a threat to the ecosystem of the Northeast. Indeed, the Pennsylvania Department of Agriculture is asking citizens to declare all-out war on one particular insect: the spotted lanternfly. Dispensing with any pretense of bureaucratic detachment, the state's website is admirably blunt:

"Kill it! Squash it, smash it...just get rid of it."

In areas of Pennsylvania like Northampton County (where I live), spotted lanternfly are not hard to find. Despite being only an inch long, the moth-like insect has a beautiful pattern once it spreads its wings. You are greeted with bright red and black spots, similar to a ladybug shell, in sharp contrast to the drab gray, yellow and black-and-white patterns that cover the rest of it. It is hardly the most memorable insect, but it does make an impression.

More to the point, they are everywhere. And if they take over the American northeast, it is going to be a very, very big deal.

"If they go unchecked, they will continue to spread throughout the American northeast as well as to other regions of the US and also potentially to other countries as well," Julie Urban, an associate research professor of entomology at Penn State University, told Salon by email. Urban is the author of "Perspective: Shedding light on spotted lanternfly impacts in the USA," a scholarly article in Pest Management Science that offered projections on what will occur if spotted lanternfly continue to spread through the region. Both in the article and while speaking with Salon, Urban detailed how spotted lanternfly could destroy both lives and the landscape. One reason is that of the two plants that spotted lantern flies have been documented to kill via feeding, one of them — grapes — is a vital crop.

"It has killed grapevines, which of course has direct negative economic impacts for growers," Urban told Salon. "However, it also has caused growers to increase the number of insecticide applications they make to try to control [spotted lanternfly] (which still is not sufficient to allow them to overcome the damage caused by waves of [spotted lanternfly] coming into vineyards, particularly in mid-September), and more insecticide sprays cost more money."

The bugs also harm local tourism economies, especially when they destroy the experience of touring vineyards, hosting events like weddings and holding wine tastings.

All of this is already happening in southeastern Pennsylvania and could occur in areas like Long Island, the Hudson Valley and the Erie regions of both Pennsylvania and New York. The insects have recently been observed in New York City.

Agriculturally speaking, the spotted lanternfly problem is not limited to grapevines. Urban reported that Christmas tree growers and nurseries have also had issues with spotted lanternfly partially damaging their stock. Beyond that, it is expensive to keep the insects away from their products, which requires monitoring for every stage of the insect's life cycle, from egg masses through adulthood. Every occasion when spotted lanternfly are found within a plant, even dead ones, must be reported, and offending businesses can be fined and see their reputations damaged. Urban also pointed out that other businesses could be impacted precisely because spotted lanternfly are not indigenous to our ecosystem, and are therefore unpredictable.

Melody Keena, a research entomologist at the United States Forest Service, added that the insects can also be a problem for ordinary people just trying to rest in their homes.

"[Spotted lanternfly] are a nuisance pest of homeowners," Keena explained by email. While scientist are unsure about their long-term impact on landscape trees, they produce "copious quantities of sticky honeydew when they feed and black sooty mold will grow on it." These byproducts can make for slippery surfaces and can be a tripping and slipping hazard.

Some of the gross fluids produced by spotted lanternfly directly contribute to their invasiveness. Their egg masses are covered with a gray waxy material that helps them stay attached to smooth surfaces like tree bark, cinder blocks, stone, shipping pallets, rail cars and automobiles.

Experts believe that the spotted lanternfly entered the United States from its native Asia (particularly China, Vietnam and Bangladesh) after an egg mass attached to a shipment of stone arrived in the Berks County area of Pennsylvania. Since then, it has continued to spread as the bugs disperse, reproduce and lay egg masses on vehicles and cargo moving along human transportation corridors.

"This makes human aided transport likely, and this has contributed to its spread both nationally and internationally," Keena explained.

Going forward, ordinary Americans are being asked to be on the alert for spotted lanternfly in order to halt the invasion.

"There is an active effort underway to reduce [the lanternfly's] populations," Erin Otto, National Policy Manager, in the U.S. Department of Agriculture's (USDA) Animal and Plant Health Inspection Service (APHIS), told Salon by email. "It may not have as many natural predators here as it does in its native habitat, but APHIS, state departments of agriculture, and U.S. residents are working to contain and manage the pest."

Matthew Helmus, an assistant professor of biology at Temple University, told Salon by email that this will include supporting funding for programs that survey for spotted lanternfly, removing trees of heaven (the other plant the insects are known to kill in order to eat) from one's property, and learning how to identify the spotted lanternfly when it appears there.

And if you happen to spot one? Well, the Pennsylvania Department of Agriculture's website tells you exactly what you should do next.

How civilizations thwart extinction in the face of existential crises

Humans set themselves apart from other animals with their ability to recognize and stave off coming threats. A tiger or a dove cannot plan for a flood that might be decades off, and then build their dens defensively; humans, on the other hand, are blessed with the ability to anticipate threats that may lie far in the future.

It is odd, then, to think that American civilization is (for the most part) keenly aware of the threat of climate change, yet so far incapable of compensating for and overcoming it. We build our cities to withstand hurricanes and floods, heat and cold; we build dams and windmills; we even re-routed the Colorado River, at a cost of $4 billion, to keep Phoenix and Tucson hydrated. We understand the existential threat of climate change, its potential to disrupt our lives in unimaginable ways or even bring about the end of human civilization. So why can't our civilization wrap its collective mind around it?

Part of it may be that reality is hard to imagine. For one thing, no one alive today has lived through the collapse of an entire civilization; the actual day-to-day experience of witnessing that process is alien to us.

Still, it is difficult to conceive of why any society, particularly ours, would allow itself to be brought to extinction. We already know climate change is feeding the wildfires that consume the Pacific Coast and causing rising sea levels which will soon flood cities like New York. Large sections of the planet will become too hot or dry to inhabit (goodbye, Phoenix), extreme weather events like hurricanes will be much more common and supply chains for everything from food to microchips will break down.

Yet despite this knowledge, we still seem poised to "walk the plank with our eyes wide open," to quote Gotye's hit 2010 song on the topic. Even after the United Nations climate panel released yet another report warning of an environmental apocalypse unless we eliminate greenhouse gas emissions, humanity is not doing the one thing it must: dropping all other priorities until the existential threat is neutralized.

The strange thing is that, if you look to history, there are other situations in which societies faced, and recognized, existential threats. Some rose to the occasion and overcame them; others saw the writing on the wall but were incapable of changing their ways. The sagas of previous civilizations that recognized these threats — and either bowed to them or planned around them — are key to understanding the plight of our empire as it struggles to address the climate crisis.

* * *

We can start with a lesson in when things went wrong — namely, Ireland, where a completely foreseeable famine occurred despite ample warnings.

The Great Famine of 1845 to 1852 occurred due to a brutal combination of economic and ecological injustices perpetrated by Ireland's English conquerors. On the economic side, English and Anglo-Irish landlords and middlemen (people tasked with collecting rent) controlled the island's land and wealth, leaving the Catholic common folk in abject poverty. Their access to land was so limited that they had to rely on potatoes for food, since those could be grown cheaply, abundantly and on lower-quality land. More than 175 government panels of various kinds warned that the economic system in Ireland was unsustainable: millions were already suffering from starvation and ill health, and the slightest breeze of bad luck could knock down the whole house of cards.

That figurative breeze finally came in 1845 in the form of Phytophthora infestans, a fungus-like organism that demolished potato crops throughout Ireland. The blight ruined up to one-half of the potato crop in 1845 and three-quarters of it over the following seven years, and the resulting Great Famine ultimately led to at least 1 million deaths and the emigration of at least another 1 million souls (many of them to America). England implemented some half-measures in a semi-sincere attempt to alleviate the misery — repealing laws that made staple foods like bread and corn prohibitively expensive, for example — but refused to check the avarice of the elites. The Irish working class was still forced to produce foods like beans, peas, honey and rabbits that were exported to other countries (in some cases, as with butter and livestock, exports may have increased). Economic elites were still allowed to exploit the Irish working class in a number of ways. The mere concept of widespread wealth redistribution, which alone could have created economic justice and responsible land management, was never seriously considered.

Ireland therefore stands mainly as an object lesson in the danger of letting special interests dictate policy, particularly when experts can foresee adverse consequences. The powers controlling Ireland had every reason to know that disaster was afoot but, because they did not value Irish lives, failed to avert a humanitarian catastrophe. Business logic prevailed over plain old logic.

It doesn't help that, unlike the Irish of the mid-19th century, Earthlings of the early 21st don't have the option of emigrating.

* * *

The Netherlands presents us with a more hopeful example of a civilization facing an existential threat and dealing with it. As far back as the days of the Roman Empire, the area now known as the Netherlands was notorious for its intense flooding because of its low elevation. Observers and inhabitants alike knew that if the region was to be populated, the flooding would have to be controlled.

Thus, by the latter half of the first millennium AD, the inhabitants of the Netherlands had already begun to construct rudimentary dikes. Dike construction picked up in earnest around the start of the second millennium, and by the 13th century most of the dikes had been linked together into a continuous sea defense system. Dutch officials even created local water boards around this time in order to maintain the dikes.

Crucially, Dutch politicians continued to work together over the years to maintain their dikes, with most major political institutions in the country recognizing that their very survival depended on them. In 1953, when the sea level on the Dutch coast rose so high that flooding destroyed millions of dollars in property and caused 1,835 deaths in the Netherlands (as well as 307 in England), the government quickly created a so-called "Delta Commission." The panel came back with a recommendation that the country build an elaborate infrastructure of dikes, dams, storm barriers and sluices so that catastrophic flooding could not happen again. As a result, no one in the Netherlands has been killed by a flood since the 1953 disaster.

There are several valuable lessons to be found in the Dutch example. First, there is a willingness by all sides to set aside their differences on matters that literally mean life or death for their civilization. Second, there is a pragmatic approach to problem-solving, gradual when possible and drastic when necessary, that emphasizes state-of-the-art knowledge over any particular philosophical approach. Lastly, of course, is the fact that governing authorities listened to scientific experts (or, in the medieval era, their equivalents). This not only saved lives, but protected Dutch civilization and allowed it to become a modern empire — which, ironically, is now implicated in the global capitalist order that keeps driving carbon emissions upward.

There is also a contested example of civilizational collapse that nonetheless holds many lessons for our modern predicament: that of Easter Island. In his 2005 book "Collapse: How Societies Choose to Fail or Succeed" (which built on the work of other scholars tracing back to the 1970s), biologist Jared Diamond posited that the Rapa Nui people of Easter Island (which is also known as Rapa Nui) saw their civilization collapse because their population grew too large and they overused the island's limited resources. In particular, Diamond notes that they over-harvested the trees whose wood was used to transport and erect moai, the large statues of forefathers that owners used to flaunt their wealth and status. The loss of trees supposedly led to soil erosion, making it difficult to grow food in a society that, without wood for boats, people could not leave. The end result, Diamond argues, was cannibalism, warfare and a major population decline.

If true, Diamond's hypothesis would have obvious implications for global warming: a society engaged in deforestation which led to a concomitant collapse in the surrounding ecosystem, causing a food shortage. This happened in a closed system, an island, that was difficult to emigrate from.

Yet not everyone agrees with Diamond's theory. Speaking to Salon by email, scholars Robert DiNapoli, Terry L. Hunt, Carl P. Lipo and Timothy Rieth argue that the archaeological record does not support Diamond's hypothesis. Rather, they believe that the evidence suggests a "population that arrived on the island ca. 1200 AD, grew to a size that was supported by the resources available on the island, and live in communities that drew benefits from monument construction." They argue that these communities were sustainable for more than 500 years.

The "Rapa Nui people were able to adapt to their local environments and live successfully despite the island's inherent constraints (size, remote location, few natural resources, etc.)," the scholars said.

In this narrative, what upset the balance on Rapa Nui was, quite simply, European colonialists. They performed slave raids which took thousands of people off the island, brought diseases with them, and devastated the island's environment. (For 50 years, the island was managed as a sheep ranch through a foreign corporation.) All of these factors spread needless suffering and misery to the Rapa Nui and cost incalculable lives. Yet in spite of this, the Rapa Nui people survived, and today more than 5,000 live on the island, speak their ancestral language and practice their native culture. Those moai, maligned as an unnecessary waste of resources by Diamond, may have even helped, since they brought communities together.

"The very thing that people have said was the 'downfall' of Rapa Nui society is almost certainly a key part of their success," the scholars told Salon.

The lesson of the Rapa Nui is not so simple. Yet their plight speaks to the importance of monuments and other cultural artifacts in civilizations — keystones that help civilizations survive in the face of external threats, whether imperialism, greed or climate change.

Joe McCarthy was never defeated — and Donald Trump now leads the movement he created

A few years ago, I was interviewing Roger Stone when he happened to use the phrase "new McCarthyism," describing those who accused former Trump campaign chair Paul Manafort of being a tool of Russian interests. This was more than a little ironic for abundant reasons, especially given that, as a younger man, Donald Trump had been mentored by the infamous Roy Cohn, Joe McCarthy's right-hand man.

Stone tried to defend himself by saying that he'd read M. Stanton Evans' book "Blacklisted by History," and found it "a more balanced review of exactly what McCarthy was talking about and what he did." That didn't make much sense either: Evans' book is a revisionist attempt to defend McCarthy, which is widely maligned by serious historians. It's not surprising that a longtime Republican operative would read it — but then, if Stone was on McCarthy's side, why was he accusing other people of "new McCarthyism"?

Stone tried to salvage that one too, arguing, "Whether I like it or not, people view McCarthyism, as a label, as the hurling of false accusations." Overall, though, there was more truth-telling in that exchange than you normally get from Stone. The Republican Party of 2021 is very much the party that McCarthy envisioned, centered on a supposed strongman's personality, viciously seeking to destroy any outsiders seen as threats and rooted in blatant bigotry. In that context, it's important to clarify what Joe McCarthy did and why his legacy is still dangerous.

McCarthy was elected to the Senate from Wisconsin in 1946, and his early years in office were unmemorable — except for one revealing episode. He denounced the death sentences handed down in U.S.-occupied Germany to a group of Waffen-SS soldiers convicted of murdering American troops in an event known as the Malmedy massacre. This moment in McCarthy's career, though virtually forgotten today, is highly instructive. On the surface, he presented himself as a crusader for justice, arguing that the Army was covering up judicial misconduct and that this called into question the validity of the Germans' confessions. (He never provided any evidence for this.)

In fact, McCarthy was doing something much more sinister. On some level he understood that defending a group of Nazis would appeal to the antisemitic American far right at a moment when expressing public hatred for Jews was unacceptable. At least implicitly, McCarthy was accusing the Jewish Americans who helped investigate the crimes of seeking vengeance and perpetrating injustice. Today we might call this flipping the script: Suddenly Jewish people, in the immediate aftermath of the Holocaust, were persecutors, and "Aryan" Germans — those who had committed mass murder — were their victims.

That wasn't enough to make McCarthy a right-wing superstar, probably because any hint of pro-Nazi sympathies was completely out of bounds in the postwar years. McCarthy needed a different vehicle to achieve political stardom and found it in 1950 when, while delivering a speech in West Virginia, he claimed to have a list of more than 200 known Communists who were allowed to work in the State Department. (No such list existed.)

The speech was a smash hit, and over the course of four years the Wisconsin senator accused countless people of either actually being Communists, being "Communist sympathizers" (whatever that meant) or being "soft on Communism," a hopelessly vague term that could be applied to almost anyone who didn't support open military confrontation with the Soviet Union. With America on edge during the early years of the Cold War, McCarthy inflamed widespread paranoia without once providing evidence that any of his targets had done anything illegal. That didn't much matter: He was saying what unhappy right-wingers wanted to hear, and they supported him with enthusiasm, which gave him tremendous political influence. (Yeah, some of this might sound familiar.)

Many of McCarthy's targets were political opponents, like Sen. Millard Tydings, a Maryland Democrat who had criticized him, and Illinois Gov. Adlai Stevenson, the Democratic presidential nominee in both 1952 and 1956. He also persecuted members of marginalized groups, claiming that they could be vulnerable to Communist influence: Today we would say he was obsessed with the "cultural elite," going after East Coast intellectuals and LGBT people (although that term did not exist), left-wing activists and journalists, members of the Washington political establishment and, of course, Jews. His strident attacks powered Republican victories in the 1950 midterm elections, and plenty of Southern Democrats liked him too.

McCarthy was rewarded with a powerful chairmanship at the Senate Committee on Government Operations, with Cohn and the young Robert F. Kennedy serving as assistant counsels. There he targeted the Voice of America, the overseas library program of the International Information Agency (this led to book burnings), several prominent Protestant clergymen and finally the U.S. Army. That last crusade proved to be a bridge too far: Joseph Welch, chief counsel for the Army, called out McCarthy on national television for his cruelty and recklessness, famously demanding, "Have you no sense of decency, sir, at long last?"

The American public, seeing McCarthy exposed as a bully and liar, rapidly turned against him. He died in 1957, likely as a result of alcoholism, but had he lived he would probably have lost his Senate seat the following year. Although the term "McCarthyism" had been coined well before his downfall, his disgrace guaranteed that it would be an epithet rather than a compliment. From that point on, even Republicans began using the term "McCarthyist" to refer to baseless and malevolent smears.

This brings us to the Trump era. First of all, the accusations that Trump's campaign colluded with Russian agents in 2016 are not "McCarthyist," both because they were highly plausible (and at least partly true) and because they had nothing to do with left-wing or Communist ideology. For a better idea of what McCarthyism actually entails, consider this passage from a 2017 article about Cohn's influence on Trump. It practically lays out, step by step, the ways that Trump's narcissism would later fuel his attempts to overturn the 2020 election:

For author Sam Roberts, the essence of Cohn's influence on Trump was the triad: "Roy was a master of situational immorality. ... He worked with a three-dimensional strategy, which was: 1. Never settle, never surrender. 2. Counter-attack, counter-sue immediately. 3. No matter what happens, no matter how deeply into the muck you get, claim victory and never admit defeat." As columnist Liz Smith once observed, "Donald lost his moral compass when he made an alliance with Roy Cohn."

That attitude is a key element of McCarthyism. The only two ingredients missing from that description are the blatant pandering to bigotry and paranoia and the way supporters are seduced by a narcissist's charismatic allure into a sense of shared omnipotence with them. Without the former, the McCarthyist lacks the fuel necessary to whip up the mob against supposed enemies; without the latter, the demagogue can't convince the mob that his individual desires are also their own.

The bogus and evidence-free claim that Trump really won the 2020 election is quintessentially McCarthyist: Trump refused to settle or admit defeat, proclaiming victory before all the votes had been counted and filing dozens of nonsensical lawsuits. Like McCarthy and Cohn, Trump gaslit America. As with McCarthy's claims that he had lists of Communist agents in the government, Trump's empty allegations force his supporters either to take him at his word or reveal their disloyalty — and nobody who wants a career in Republican politics can afford to be disloyal to Trump at the moment. In both cases, proof was no longer needed, and on some level was viewed with scorn. To doubt Joe McCarthy in the early '50s was to become an accomplice to the Communist conspiracy, just as anyone who rejects Trump's Big Lie today is clearly a socialist antifa liberal.

That is how a lie becomes political dogma, a phenomenon also visible in the current right-wing obsession with "critical race theory." Just as McCarthy defined "Communism" so broadly that it lost all meaning, opposition to "critical race theory" has very little to do with the academic approach that term actually describes — but a great deal to do with maintaining white supremacy. Salon's Chauncey DeVega has described it this way:

For today's Republicans, Trumpists and other members of the white right, "critical race theory" is a form of political ectoplasm: It's both a liquid and a solid, something slimy and sticky which can be shaped into whatever frightening or dangerous thing suits their mood and needs in a given moment.
In this political context, "critical race theory" means both everything and nothing; it is a fetish object used to summon up centuries-old racist nightmares and fears about "scary" Black and brown people who are plotting a rebellion or uprising to undermine the (white) family, indoctrinate (white) children and attack (white) America.
By implication, if "critical race theory" and other Black and brown bogeymen are threats to (white) America, then preemptive violence is both necessary and reasonable. Moreover, multiracial democracy is seen, by definition, as incompatible with white people's safety, security and material interests.

In channeling McCarthyism, whether consciously or otherwise, Trump has been successful to a degree McCarthy himself could only have dreamed about. But the connection is clear. While McCarthy was personally discredited, he made it difficult for any prominent American to express unpopular or radical views without being accused of disloyalty or possessing "Communist sympathies." The McCarthyist current has been with us ever since, and as Trump's career demonstrates, has not yet been defeated. If anything, it appears to be winning. Roger Stone is correct, in an upside-down fashion: There is a new McCarthyism in America today, and his pal Donald Trump and his supporters are the ones practicing it.

How climate change will disrupt supply chains: Experts say you should expect more shortages in the future

Dr. Thomas Goldsby, a professor of supply chain management at the University of Tennessee-Knoxville's Haslam College of Business, assigns his undergraduate students a "routine exercise" that frequently proves revelatory. Its purpose is to illustrate the complexity of the various trade routes that bring products from all over the world to consumers. The assignment is to figure out how far the students can trace a product's supply chain — if possible, going back to the exact point when the raw materials were extracted.

"When my students have had an opportunity to present their results to the companies and the products that they produce, the company executives learn something every time," Goldsby told Salon. "I just think it's remarkable that my undergraduate students can present news about the business or the products and the senior executives are like, 'Wow, we had no idea that a golf club manufacturer is wondering why they have a hard time getting titanium.' It's because there's not a lot of titanium that goes into a golf club, but there's a whole heck of a lot of it that goes into an aircraft to build the fuselage."

Goldsby uses this exercise to explain the complexity of the supply chains and how unanticipated hiccups — such as another industry wanting a resource you need, and you not knowing it — can have drastic, unexpected consequences.
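As a rough illustration of what such a trace involves, here is a small, hypothetical sketch in Python. The product tree and material names below are invented for illustration and are not taken from Goldsby's actual assignment; the point is simply that following a product's bill of materials backward eventually bottoms out at raw inputs shared with other industries.

```python
# Hypothetical bill-of-materials tree: each item maps to the inputs it is
# made from. Names and structure are invented for illustration only.
product_tree = {
    "golf club": ["shaft", "club head"],
    "shaft": ["graphite"],
    "club head": ["titanium alloy"],
    "titanium alloy": ["titanium ore"],  # the same input aircraft fuselages compete for
    "graphite": ["petroleum coke"],
}

def trace(component: str, depth: int = 0) -> None:
    """Recursively print each input a component depends on."""
    print("  " * depth + component)
    for part in product_tree.get(component, []):
        trace(part, depth + 1)

trace("golf club")
# The trace bottoms out at raw materials such as titanium ore, the point
# where a golf club maker's supply chain can collide with aerospace demand.
```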

The interconnected nature of the supply chain means that seemingly unrelated things can have an effect on each other — say, a global pandemic and a microchip shortage. Microchips, the sets of circuits hosted on small flat pieces of silicon, are intrinsic to so much of industrial civilization: they are used in computers, cars, mobile phones, home appliances and virtually all other electronic equipment. We already have a shortage of microchips because of COVID-19. Yet it is going to get a whole lot worse because of global climate change.

Pandemics may not seem to have much to do with the manufacture of microchips; silicon chips, certainly, cannot contract the virus. Yet the supply chain for microchips is fragile: historically, chipmakers were usually able to keep pace with growing demand for chips in products like automobiles and home electronics. But the pandemic interrupted that rhythm by causing consumers to behave in unpredictable ways, leaving manufacturers struggling to correctly foresee how many chips they would need for everything from Volkswagens to Playstations. Because the supply chains are so complicated, it became easier for problems to arise that delayed production or transportation.

Worse, the industry has a lot of bottlenecks. Only a handful of foundries account for most of the world's chip fabrication, and roughly 91% of the contract chipmaking business is located in Asia. This makes countries like the United States vulnerable to production disruptions either in those distant lands or at any step along the way. Likewise, there are companies in the United States, Japan, the Netherlands and elsewhere that have also found ways to make themselves indispensable to the global manufacturing of microchips. The end result is that this particularly important piece of equipment is especially vulnerable to shortages when small, unexpected shifts in consumer demand are amplified as they travel up the chain, a phenomenon known as the bullwhip effect.
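To make the bullwhip effect concrete, here is a minimal, hypothetical sketch in Python. The tier names, demand figures and forecasting rule are all invented for illustration (they are not data from the article or the chip industry); the sketch simply shows how each tier's cautious over-ordering turns small swings in consumer demand into much larger swings upstream.

```python
# Toy simulation of the bullwhip effect: each tier orders based on a naive
# forecast of the orders it receives, over-ordering whenever demand ticks up.
import random

random.seed(42)

TIERS = ["retailer", "device maker", "chip distributor", "foundry"]
WEEKS = 20

# Consumer demand hovers around 100 units per week with modest noise.
consumer_demand = [100 + random.randint(-10, 10) for _ in range(WEEKS)]

orders = {}
downstream = consumer_demand
for tier in TIERS:
    placed = []
    for week, demand in enumerate(downstream):
        # Naive forecasting: assume next week looks like the recent average,
        # and add a safety buffer whenever demand rises above that forecast.
        recent = downstream[max(0, week - 2): week + 1]
        forecast = sum(recent) / len(recent)
        buffer = max(0.0, demand - forecast) * 2  # overreaction to upticks
        placed.append(round(demand + buffer))
    orders[tier] = placed
    downstream = placed  # this tier's orders become the next tier's "demand"

print(f"{'consumer demand':16s} swing: {max(consumer_demand) - min(consumer_demand)} units")
for tier in TIERS:
    swing = max(orders[tier]) - min(orders[tier])
    print(f"{tier:16s} swing: {swing} units")
```

Run it and the swing between the largest and smallest weekly order grows at every tier, so the foundry at the top of the chain sees far wilder fluctuations than consumers ever produced.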

Experts do not believe that the chip shortage is going to end anytime soon, and it may be only the beginning of the problem. If you thought COVID-19 caused problems for supply chains, imagine how they'll be blown apart when climate change causes extreme weather events, rising sea levels and massive spikes in temperature all over the world. There will be increasingly frequent and severe wildfires on the Pacific Coast, flooding in our eastern cities and millions of refugees. It is impossible to anticipate the number of new variables this will throw into orderly supply chain management — other than accepting that particularly intricate supply chains are almost certainly going to start coming apart.

"The industry is very clearly dependent on globally-interconnected supply chains and distribution systems," Dr. Michael E. Mann, a distinguished professor of atmospheric science at Penn State University, told Salon by email. "Anything, such as COVID-19, that disrupts transportation is going to disrupt these supply chains and distribution systems and lead to bottlenecks and backlogs."

Mann cited a recent report by the United Nations' Intergovernmental Panel on Climate Change, explaining that this "will clearly lead to delays in the distribution of microchips and will presumably have an adverse impact on the semiconductor and computer industries."

The most obvious solution to this problem, naturally, would be for world leaders to take global warming seriously and do whatever it takes to both reduce greenhouse gas emissions and fix the damage already done to our planet. Frustrating though it may be, however, there are practical geopolitical realities which strongly suggest this may not happen. That means we face a chip shortage in the foreseeable future — among other shortages.

"I think that we need to acknowledge that there are interactions that we don't yet fully understand and appreciate," Goldsby told Salon. He later added, "We've got to go back to raw material extraction and we need to try to map it out and understand where those raw ingredients come from."

This is more challenging than it might seem, because despite their small size, microchips are very intricate.

"It's basically like making a cake," Dr. Ron Olson, Director of Operations at Cornell NanoScale Science and Technology Facility, told Salon. "You start off with this base layer, then you add and subtract metals and oxides in different layers, just like following a recipe on a cake or baking or whatever. Then you end up with your final product."

Dr. Christopher K. Ober, a professor of materials engineering at Cornell University, explained that one way to get around possible shortages is to follow the example of a car manufacturer that proved more resilient to supply chain issues than other organizations.

"Toyota really was a big proponent of lean manufacturing," Ober explained. "Basically they didn't keep anything in warehouses. It was delivered in a truck at the same time they were going to put it into a car. What Toyota learned, I think it was because of [the Fukushima earthquake in 2011], they couldn't entirely depend on instantaneous delivery. They actually had to start storing critical parts that might be hard to access."

While this kind of smart resource allocation can protect companies from immediate issues like chip shortages, one economist argued to Salon that we need to stop assuming lengthy supply chains are an inevitable and necessary part of our economy. According to Dr. Richard D. Wolff, professor emeritus of economics at the University of Massachusetts Amherst, lengthy international supply chains were developed by a handful of powerful corporate elites to maximize their profits — regardless of the obvious fragilities in such arrangements.

"It was a deliberate economic sequence of decisions made by a particular group of people for particular purposes that created the global supply chain," Wolff explained, saying the choices were made by the thousands of people who comprise the boards of directors in American corporations. "They decided, starting in the 1970s, that American capitalism had reached a kind of tipping point. It had grown spectacularly over the previous century. It had made a ton of money, but along the way, it had had to compensate the working class. Not on a scale that the working class deserved, needed or wanted, but because of the unionization, they had to come across with something."

They did, improving working conditions for American employees at the behest of labor, but in the process corporations saw their profits decline. So they turned to nations in Africa, Latin America and Asia. Many of these countries had only recently shaken off the yoke of European colonialism and could not protect their workers as well as the United States did. At the same time, their citizens had taken over their governments and improved the education, health care and other social systems that the European colonial powers had neglected. This meant that their workers could be substituted for American workers at much lower cost.

"Long story short, a massive relocation of business was accomplished by the capitalists of the world, in which they went to China, India, Brazil, and other places far away from the centers of capitalism in Western Europe, North America and Japan," Wolff observed. "That's why we have long supply chains. It was the corporate leadership that made the decision to maximize their profits by moving all that."

Like so many other features of capitalism, the development of long supply chains seems poised to self-immolate because of climate change. Unfortunately, there is no sign that the economic titans are interested in changing the way supply chains work at this juncture. That means we are facing a future where we lose access to the internet, can no longer watch television or movies, have countless useless cars and struggle to find food. Hence, unless the economy is restructured to end both climate change and ridiculously long supply chains, it is hard to see grounds for realistic hope.

Vaccine-makers rethink their strategy

When America shut down in March 2020 due to the COVID-19 pandemic, there was a pervading sense that the situation was probably — or hopefully — temporary. After all, efforts were already underway to develop a vaccine. It was just a matter of time until normalcy would return.

But 17 months later, a return to "normal" is nowhere in sight. Frightening new mutant strains like the delta variant and the lambda variant have emerged, more infectious and possibly more dangerous than their antecedents. Early evidence indicates that, while existing vaccines stop patients from getting severely ill if infected, they do not prevent the infected from transmitting the disease. At the very least, it is theoretically possible that mutant variants could create problems for people who want their inoculations to be effective.

In other words, the vaccines weren't enough. Humanity anxiously awaited development of the first COVID-19 vaccines throughout 2020; now that those vaccines aren't enough to permanently halt COVID-19, it would appear that vaccine manufacturers are pivoting their strategy.

But as to what they have planned, pharmaceutical companies aren't being entirely transparent — or perhaps they aren't sure.

"As SARS-CoV-2 continues to evolve, Pfizer and BioNTech are continuing our work to understand long term immunity, the need for booster shots, and any threat from circulating or new variants of concern to vaccine protection," a Pfizer spokesperson told Salon by email. The company said that the existing body of research and evidence suggests that the circulating variants do not escape their COVID-19 vaccine, adding that they continue to perform clinical trials at various stages for a third dose of their currently two-dose BNT162b2 vaccine, with possibly hopeful results. That vaccine, widely known as the "Pfizer COVID-19 vaccine," is effective in preventing COVID-19; two doses of it significantly strengthens the body's ability to avoid severe disease and hospitalizations.

The company also communicated to Salon that, broadly speaking, they plan on keeping tabs on emerging variants and waning immunity so that they can prepare new products if necessary.

"It is, in part, why we chose a vaccine technology with the flexibility that allows us to both provide boosting doses if needed and to address potential changes in the virus," Pfizer explained.

The biotechnology in question is known as an mRNA vaccine, the type of inoculation developed by both Pfizer and Moderna (which did not respond to Salon's request for comment). Traditional vaccines work by introducing a weakened or dead pathogen (an organism that causes disease) into the body. The immune system becomes familiar with the pathogen by being exposed to it and, like a soldier participating in war games, learns how to fight a real enemy through training with a facsimile. More specifically, the immune system learns how to recognize antigens (toxic or foreign substances found on a pathogen) and produce antibodies to destroy the pathogens associated with them.

mRNA vaccines follow this same principle, but with a twist: They use a synthetic version of RNA, one designed to complement one of the DNA strands in a gene, and introduce it into the body so cells will produce antigens like those found in a given virus. This has the same effect as traditional vaccine platforms — it helps the body's immune system recognize and fight the pathogen — but can be manufactured quickly and altered easily as viruses mutate and variants emerge. Despite many advances in biotechnology, it is still complicated and painstaking to manufacture sterile, safe and effective vaccines, so any technology that can shave time off of that process is welcome.

This is why Pfizer identified its use of mRNA vaccines as something that would help them stay on top of future outbreaks.

That said, pharmaceutical giant Johnson & Johnson — which did not develop an mRNA vaccine for COVID-19 but instead offers a single-shot vaccine built on older viral vector technology — is also optimistic.

"Evidence from our Phase 3 ENSEMBLE study demonstrates the efficacy of the J&J single-shot COVID-19 vaccine, including against viral variants that are highly prevalent," Johnson & Johnson told Salon, adding that their results have been consistent across geographic and demographic lines. They also explained that they are aware of emerging variants and monitor them "through our ongoing clinical efficacy trials to determine whether the immune response elicited by our COVID-19 vaccine is capable of having a neutralizing effect."

One major challenge facing vaccine manufacturers is the fact that much of the American public is making their job more difficult. There are millions of anti-vaxxers in the United States, whose motives range from sincere skepticism toward the medical-industrial complex to political spite against Democrats. Biotechnological development alone is not capable of saving us from what has also become a cultural and political crisis.

As the Pfizer spokesperson told Salon, anti-vaxxers made it harder to extricate humanity from the pandemic.

"Every person without immunity provides the virus with an opportunity to spread and continue to mutate and further expose our communities," the spokesperson said. "So, it's important to vaccinate as many people as possible as quickly as possible. Following public health guidance to limit exposure to SARS-CoV-2, such as masking and social distancing, in combination with the continued rollout of immunizations may help us achieve herd immunity and reduce cases of COVID-19."

Sarah E. Cobey, a microbiologist at the University of Chicago who specializes in the coevolution of pathogens and hosts' adaptive immunity, emphasized that public health leaders need to get better at articulating the importance of vaccines to the public.

"I think public health leaders have done a poor job communicating the severe health risks from infection and the near inevitability of infection, including possibly severe infection, in people who do not vaccinate," Cobey told Salon by email. She also described it as "sad" that companies were able to use top-notch technology and up-to-date knowledge to produce a vaccine with remarkable speed, but this was taken as evidence by many that the vaccines were untrustworthy.

"I wish basic science education were strong enough in the United States that the vaccines were not met with such distrust, and SARS-CoV-2 was not met with such romanticism about 'natural' pathogen exposures," Cobey added. "It's a horrible virus."

In the long run, some suspect COVID-19 will be like the flu: a disease that ripples through the population and mutates yearly, periodically taking lives, but one that can be relatively contained by booster shots that allow inoculations to keep up with mutant strains. Indeed, some states and cities already allow certain citizens to get boosters, although these are not new vaccines but rather additional doses of the existing ones.

Either way, the most probable scenario is that boosters will be the anti-COVID wave of the future — that is, how we will cope with the physical toll of the pandemic.

"All of the current vaccines were developed with the original strain from China, the original parent strain that started the outbreak with the emergence of variants," Dr. Jonathan Zenilman, an infectious disease specialist and professor at Johns Hopkins University School of Medicine who was involved in data and safety monitoring for one of the major vaccine projects, told Salon. "Can you develop boosters to do that? And the answer is absolutely yes. The RNA technologies especially are very flexible and agile."

Psychologically, once we move past the phase of constant lockdowns and rising death counts, there will be persistent trauma as our culture attempts to cope with the terrible tragedies and drastic lifestyle changes that were forced on us after March 2020. Fixing those problems will prove to be another matter entirely, and just one more example of how the COVID-19 pandemic has transformed all of our lives.

History shows 2020 wasn't especially close or controversial — except for the loser's massively wounded ego

In his cascade of lies about the 2020 election, Donald Trump preys on the desire of his snowflake supporters to believe that they're special. The thing is, they're not.

I live in Northampton County, Pennsylvania, a swing region in a swing state. Like the rest of the commonwealth, it backed Barack Obama twice before supporting Trump in 2016 — one of 206 counties out of America's 3,141 to make the Obama-Trump pivot — and then switched back to Joe Biden in 2020.

You might think that a county that supported Biden would never even consider electing someone who openly calls for overturning the will of its own voters. Talk to Trump supporters around here, however, and you'll hear a disturbingly blasé attitude toward that idea. Our Republican candidate for county executive, Steve Lynch, actually attended the Jan. 6 rally that preceded the Capitol assault. At one point Lynch can be seen on camera, around 1 p.m., crowing, "It's going down" as the attack began. When approached by Salon about his controversial campaign, Lynch insisted that the protest he attended had been peaceful, that Nancy Pelosi was (somehow) to blame for the violence and that antifa had caused or provoked the unrest. He complained about "the pathetic narrative put out there by this corrupt media," but offered no evidence to back up his assertions. I twice asked him a question he refused to answer:

American history is full of examples of political injustice, yet there was only one other presidential election before 2020 in which a large part of the country refused to accept the official outcome. How do people who claim this election was stolen justify the way Donald Trump and his movement have responded to their loss?

I think that oddly specific question — even if Trump supporters' claims about the 2020 election were true (which they're not), would they justify violence? — is important. Here's why: Whatever Trump fans may believe, the 2020 contest wasn't especially close or controversial in historical terms. Indeed, there have been many other elections when the loser had far more plausible grounds to claim fraud or contest the result.

We can start with Trump's supposed hero, Andrew Jackson, who won the popular vote by a substantial margin in the 1824 election but was not elected because no candidate had won enough electoral votes. Under the terms of the 12th Amendment, the election was decided in the House of Representatives, where Speaker Henry Clay threw his influence behind Secretary of State John Quincy Adams. As president, Adams in turn appointed Clay as his secretary of state, and Jackson, not without plausibility, accused the two of making a "corrupt bargain." This was never proved, and since Clay had followed the Constitution, Jackson had no legal claim that he'd been cheated. There's no doubt that the Adams-Clay deal was ethically questionable, and Jackson used his sense of outrage to help form a new political organization — the Democratic Party.

Jackson was an unappealing historical figure on many levels, but here's something he didn't do: He didn't incite an insurrection against the government over losing that heated election.

The same is true for almost every other controversial presidential election. Both Democrats and Republicans actively suppressed votes and tampered with results during the 1876 election. Arguably, Rutherford B. Hayes and the Republicans were just a little better at it than Samuel Tilden and the Democrats. A second civil war nearly broke out after that contest, averted because both parties agreed to a racist compromise that allowed Hayes to take the White House while Southern Democrats brought an end to Reconstruction and reduced Black people to second-class citizenship.

After Grover Cleveland lost the 1888 election — despite winning the popular vote, like Jackson and Tilden before him — many of his Democratic supporters accused Republicans of fraud, although Cleveland himself dismissed those claims as not legally supportable.

Much closer to our own time, Richard Nixon actually had a plausible case that John F. Kennedy and the Democrats rigged the 1960 election through chicanery in Illinois and Texas, but concluded he could not get the election overturned in court. He later wrote, presciently, that "the mark of the good loser is that he takes his anger out on himself and not on his victorious opponents or on his teammates."

After the 2000 election, Al Gore urged his supporters to peacefully accept the results even though he had won the popular vote and only lost in the Electoral College because he trailed George W. Bush by 537 votes in Florida — and only then because a recount was halted before that margin could disappear. Unlike Cleveland or Nixon, Gore took his case all the way to the Supreme Court, but after it ruled against him in an infamous and overtly partisan decision, Gore accepted the results and urged his supporters to do likewise.

Even the legendary election of 1800, which threatened to tear the young country apart and revealed the flaws in the Constitution's original method for electing a president, did not lead to violence at the time. It's fair to say, however, that its loser, Aaron Burr, has been depicted as a historical villain ever since.

The 1968 election was very close and highly divisive — and may have witnessed one of the sleaziest political dirty tricks in American history. According to reliable accounts, Nixon tried to undermine the ongoing negotiations to end the Vietnam War in order to torpedo the campaign of Vice President Hubert Humphrey, his Democratic opponent. President Lyndon Johnson supposedly knew this was happening but did not go public with it. One of Nixon's advisers at the time was a young man named Roger Stone, more recently a confidant and consultant to Donald Trump. (Nixon was later forced to resign over the Watergate scandal, after trying to spy on Democratic campaign headquarters and then covering it up.)

Not surprisingly, Trump and the Republicans don't bring up 1968 or any of those other elections. For that matter, they also don't mention the one election that did result in violent public rejection — the 1860 election won by Abraham Lincoln. No one seriously suggested that Lincoln had stolen the election, but the leaders of Southern slave states found him unacceptable as president, and plunged the nation into four years of bloody carnage for what can reasonably be described as the worst possible reasons.

There are more reasons why Trumpist claims that the 2020 election was special don't hold up to historical scrutiny. For most of American history, Black people and women could not vote. The above-mentioned 1876 election, in fact, was resolved precisely by stripping away Black men's right to vote in most of the South. Throughout American history, people have been disenfranchised based on race, ethnicity, class, gender or simply the perception that they were likely to vote the "wrong" way. There have been elections decided through backroom deals, voter suppression, tampering with the results and other dubious or illegal tactics. Our democracy is almost always messy, and on many occasions deeply flawed.

If Trump were correct in his assertions about the 2020 election (which, once again, he absolutely is not), that would of course be an injustice. But it probably wouldn't make the list of the top 10 injustices in American political history. To state the obvious, Trump fans don't care about political injustice and absolutely don't care about history. Those are just excuses for a blatant attempt to overthrow democracy. If Trump's supporters want to believe they are special, in one sense they are. They remain loyal to the only president in American history to openly flout George Washington's most important precedent — the peaceful transfer of power.

Horseshoes and widow's peaks: Why do men go bald in different patterns?

In a 1997 episode of "Seinfeld," a character named Kurt learns to his dismay that he is going bald. His dismay doesn't stem from an aversion to being hairless; in fact, he is a swimmer and is used to shaving his cranium to better compete. But when he learns that he may become involuntarily bald, he sinks into a state of impulsivity and depression.

While most bald people do not react as poorly as the "Seinfeld" swimmer, the episode does accurately capture a basic truth about male pattern baldness. It is not simply that men are self-conscious; it is that male pattern baldness leaves them feeling they have lost power over their own appearance.

"Male pattern baldness or Androgenetic alopecia is the most common cause of hair loss, affecting up to 80% of men in their lifetime," Dr. Kristen Lo Sicco, an associate professor of dermatology at NYU Langone Health, told Salon by email. "About 75% of people with Androgenetic alopecia have at least one family member with this type of hair loss."

That sense of disempowerment also stems from the mystery of baldness, and the lack of control over it. It is distressing for any human to feel that one's body is out of one's control, that it is doing things without our consent. Yet science has a sense of why, and how, baldness happens — and why men (and sometimes women) lose power over their hair.

There is, apparently, a connection between a chemical that we associate with masculinity — testosterone — and the looming threat that your hair will start to thin. A man who suffers from male pattern baldness has hair follicles that are sensitive to a testosterone metabolite called dihydrotestosterone.

Like the color of your eyes or shape of your earlobe, the structure of your hairline is actually quite individual and says a great deal about you. Until recently, scientists didn't have a particularly comprehensive knowledge about different types of hairlines. That's because early major classification systems tended to mostly or entirely study only one race (Caucasian men, Japanese men). In 1975, a dermatologist and hair transplant surgeon named Dr. O'Tar Norwood studied male pattern baldness in over 1,000 males and came up with the system most commonly used today.

According to Norwood's system, most hair loss begins at the temples and gradually recedes back, eventually leaving a peninsula of hair in the middle of the scalp. If the baldness continues, that peninsula becomes narrower while the hair on the crown starts to thin out. If your baldness insists on defoliating your cranium even further, the peninsula will start to shrink in the direction of the bald patch on your crown. When the two areas of baldness finally meet, you will be left with nothing but a horseshoe-shaped ring of hair.

There is one variant to this system, which Norwood dubbed "Type A." Its main features are that it deprives the early balding man of that precious peninsula but spares him the initial thinning on the crown. Instead, like a retreating army destined to be court-martialed for abandoning its position, the hair at the front simply gives up the fight and allows a hairless desert of skin to overtake the landscape atop one's skull.


Analogies aside, there are straightforward scientific explanations for why your hair seems to abandon you. The truth is that, with male pattern baldness, hair isn't falling out of your head in unusually large quantities. (Hair is supposed to fall out, after all.) It is rather that the hair doesn't grow back as it normally would.

"A few things happen to cause the appearance of balding," Lo Sicco explained. The first is called "miniaturization. "This means the hair shafts are becoming smaller and smaller in caliber (thickness) over time," Lo Sicco continued. The second is "shortening of the growth phase of hair," known as "the anagen phase."

"This means less hair will be on the scalp and will become shorter," Lo Sicco said. "This is because the length of our hair is determined by the length of the anagen phase."

The type of hairline that you wind up with depends on how your biological chemistry impacts the hair follicles at different locations on your head.

Likewise, it turns out there is a reason that bald men continue growing hair on the back and sides of their heads — the "horseshoe" pattern of hair, as sported by celebrities like Larry David and the late James Gandolfini.

"The hair on the top and front of the scalp is susceptible to androgens, however the hair on the back of it is not (occipital scalp)," Lo Sicco wrote. "This theory is known as 'donor dominance,' pioneered by Dr. Orentreich who developed hair transplantation in the US while at the NYU Skin & Cancer Unit."

While it is easy to joke about male pattern baldness, it is not a harmless medical condition. Depending on the stage and severity, baldness has been linked to an increased risk of heart disease (although the extent of that risk is still debated), insulin resistance, high blood pressure and prostate cancer. The mechanisms behind those links are unknown, however, as much still remains to be learned about how baldness itself works.
