Red pills and radicalization: Here's how conspiracism is overwhelming us
America is drowning in conspiracy theories: QAnon cultists are taking over the Republican Party, the occupant of the Oval Office praises them in press conferences, rational COVID-19 pandemic measures are met with armed protests and raging open defiance, and Proud Boys and militiamen are bringing violence to liberal cities while armed vigilantes riled up about nonexistent “antifa arsonists” harass strangers and journalists in fire-stricken rural areas.
So I wrote a book about it. More importantly, it’s a book intended not just to document and explain the phenomenon, but to also be a guide for finding our way out of the morass. It’s titled Red Pill, Blue Pill: How to Counteract the Conspiracy Theories That Are Killing Us, out this week from Prometheus Books. Here’s an excerpt.
[The following excerpt is from Chapter Seven, “Chaos By Design,” which describes how and why people are effectively radicalized online by conspiracism and its attendant disinformation. It seemed especially appropriate for the current moment.]
It may be the most notorious Google search in history: “black on white crime.” That was the search that sent Dylann Roof down the path that led him to murder nine black parishioners inside a Charleston, South Carolina, church one evening in June 2015.
He described this path in a post on the white supremacist website he had created.
The event that truly awakened me was the Trayvon Martin case. I kept hearing and seeing his name, and eventually I decided to look him up. I read the Wikipedia article and right away I was unable to understand what the big deal was. It was obvious that Zimmerman was in the right. But more importantly this prompted me to type in the words “black on White crime” into Google, and I have never been the same since that day. The first website I came to was the Council of Conservative Citizens. There were pages upon pages of these brutal black on White murders. I was in disbelief. At this moment I realized that something was very wrong. How could the news be blowing up the Trayvon Martin case while hundreds of these black on White murders got ignored?
From this point I researched deeper and found out what was happening in Europe. I saw that the same things were happening in England and France, and in all the other Western European countries. Again I found myself in disbelief. As an American we are taught to accept living in the melting pot, and black and other minorities have just as much right to be here as we do, since we are all immigrants. But Europe is the homeland of White people, and in many ways the situation is even worse there. From here I found out about the Jewish problem and other issues facing our race, and I can say today that I am completely racially aware.
Roof has, in effect, provided a map of online radicalization. As information scientist Michael Caulfield explains, the process is driven by a self-contained data spiral based largely on “curation”—that is, the way we collect web materials into our own spaces and annotate them. That curation feeds data back to the algorithm, which in turn directly affects what we see. Curations, he warns, can warp reality because of the resulting feedback loop: they “don’t protect us from opposing views, but often bring us to more radical views.”
Caulfield observes that “black on white crime” is a data void—that is, it’s not a term used by social scientists or reputable news organizations, “which is why the white nationalist site Council of Conservative Citizens came up in those results. That site has since gone away, but what it was was a running catalog of cases where black men had murdered (usually) white women. In other words, it’s yet another curation, even more radical and toxic than the one that got you there. And then the process begins again.”
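The spiral Caulfield describes can be caricatured in a few lines of code. The sketch below is a deliberately crude toy model, not any real platform’s ranking system: content sits on a single “extremity” axis, the recommender surfaces only items near what the user already consumes (so each step feels normal), and among those it favors the most provocative option, a stand-in for engagement-optimized ranking. Every name and number here is an illustrative assumption.

```python
# Toy model of a curation/recommendation spiral (illustrative only).
# Content lives on a 0-100 "extremity" axis: 0 = mainstream, 100 = extreme.
catalog = list(range(101))

def recommend(current: int, radius: int = 10) -> int:
    """Pick the next item for the user.

    Only content near the user's current position feels plausible enough
    to click, so each individual step looks normal. Among nearby items,
    an engagement-optimized ranker favors the most provocative one
    (assumption: extremity drives engagement).
    """
    nearby = [c for c in catalog if abs(c - current) <= radius]
    return max(nearby)

position = 5          # the user starts with mild, mainstream content
trajectory = [position]
for _ in range(12):
    position = recommend(position)
    trajectory.append(position)

print(trajectory)     # drifts steadily toward the extreme end of the axis
```

No single hop in the trajectory looks alarming on its own, yet the loop ratchets monotonically toward the extreme end, which is the point Caulfield makes about the spiral’s step-by-step character.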
Safiya Umoja Noble, author of Algorithms of Oppression, explains that the framing people bring to their Internet experience shapes what kinds of results they see on a search engine, a video recommendation, or a social media news feed. “In the case of Dylann Roof’s alleged Google searches,” she writes, “his very framing of the problems of race relations in the U.S. through an inquiry such as ‘black on white crime’ reveals how search results belie any ability to intercede in the framing of a question itself. In this case, answers from conservative organizations and cloaked websites that present news from a right-wing, anti-Black, and anti-Jewish perspective are nothing more than propaganda to foment racial hatred.”
The key to this process of radicalization is its incremental nature: people undergoing it don’t recognize what is happening to them, since each step feels normal at first. That is precisely the design of the organizations and ideologues trying to recruit people into their conspiracy theories, which are ultimately belief systems and political movements.
A onetime “red-pilled” conspiracy theorist named Matt described to Kelly Weill of The Daily Beast how he became trapped in a curated spiral like this. It began when he innocently watched a video of Bill Maher and Ben Affleck discussing Islam; when it ended, the algorithm recommended several much more extreme videos attacking Islam, including some produced by Infowars conspiracy theorist Paul Joseph Watson. One video led to the next, and the next.
“Delve into [Watson’s] channel and start finding his anti-immigration stuff which often in turn leads people to become more sympathetic to ethno-nationalist politics,” Matt said. “This sort of indirectly sent me down a path to moving way more to the right politically as it led me to discover other people with similar far-right views.”
Now twenty, Matt has since exited the ideology and built an anonymous Internet presence where he argues with his ex-brethren on the right.
“I think YouTube certainly played a role in my shift to the right because through the recommendations I got,” he said, “it led me to discover other content that was very much right of center, and this only got progressively worse over time, leading me to discover more sinister content.”
“The thing to remember about this algorithmic–human grooming hybrid is that the gradualness of it—the step-by-step nature of it—is a feature for the groomers, not a bug,” says Caulfield. “I imagine if the first page Roof had encountered on the CCC page had sported a Nazi flag and a big banner saying ‘Kill All Jews,’ he’d have hit the back button, and maybe the world might be different. (Maybe.) But the curation/search spiral brings you to that point step by step. In the center of the spiral you probably still have enough good sense to not read stuff by Nazis, at least knowingly. By the time you get to the edges, not so much.”
Peter Neumann, of the International Centre for the Study of Radicalisation at King’s College London, identifies six steps on the ladder of extremist belief. “The first two of these processes deal with the consequences of being exposed to extremist content,” he writes. “No single item of extremist propaganda is guaranteed to transform people into terrorists. Rather, in most cases, online radicalization results from individuals being immersed in extremist content for extended periods of time, the amplified effects of graphic images and video, and the resulting emotional desensitization.”
Beheading videos, photos of corpses, suicides, and mass murders: all are part of these first two steps in the immersion process. Neumann calls this mortality salience—material intended to create an overpowering sense of one’s own vulnerability to death and to heighten the viewer’s moral outrage.
The next two steps are also key to the process—namely, immersion in extremist forums, where deviant and extremist views are normalized, and online disinhibition, wherein people lose their normal inhibitions about violence because of their relative anonymity online. “Some of the participants get so worked up that they declare themselves ready to be terrorists,” notes psychologist Marc Sageman. “Since this process takes place at home, often in the parental home, it facilitates the emergence of homegrown radicalization, worldwide.”
The final stages arrive with online role-playing—the kind in which new recruits more or less practice their ideology in gaming situations, often in the context of modern video games. The participants project themselves into their gaming avatars, giving themselves traits they usually do not possess in real life. After a while, this divide becomes noticeable and drives further radicalization: “[A]fter recognizing the gap between their avatar’s mobilization and their own physical mobilization, many online participants begin taking steps to reconcile the gap,” observe researchers Jarret Brachman and Alix Levine. This is when they take the last step: using the Internet to connect directly to terrorist infrastructures that then begin to mobilize them.
Caulfield believes one of the keys to preventing this kind of radicalization lies in establishing “digital literacy” programs wherein young people new to the Internet can learn how to confront, cope with, and overcome the challenges they will be forced to navigate there. And it all begins with the curation process, how we accumulate the materials for our personal spaces.
“So, the idea here is that you might start in a relatively benign space with some kind of ideological meaning, and then someone uses this term ‘black on white crime,’” Caulfield says. “It’s probably a stretch to call the Google search results a curation, but you can think of it along the same lines. You put in a term, and Google is going to show you the most relevant, not necessarily the best, but the most relevant results for that term. And now you have a set of things that are in front of you. Now, on each of those pages, because you picked ‘black on white crime,’ if you click into that page that has ‘black on white crime,’ there are going to be other phrases on there.”
Even people with normal levels of skepticism can find themselves drawn inside. “So you go, and you do the Google search, and you’re like, ‘You know what? I can’t trust this page. I’m going to be a good info-literacy person, and what I’m going to do is, I’m going to just check that these crimes really happened.’ OK, so what do you do? You pull these crimes and what you find is that these crimes did happen, and the pages they’re going to are more white supremacists talking about how these are actually black on white hate crimes. And now they’re mentioning more things, and they’re mentioning more terms, and they’re mentioning changes in law that now make it easier for black people to kill white people.
“So you’re like ‘Oh, well, I’ve got to Google this change in the law.’ But who’s talking about this thing that’s broadly made up, or it’s a heavy misinterpretation of something? Well, again it’s white ... So you keep going deeper and deeper, and every time you’re pulling out something to investigate on that page, it’s pulling you into another site, and that other site of course is covering a bunch of other events and terms and so forth. And you end up going deeper and deeper into it.”
Caulfield argues that educators need to help their students develop better informational literacy, including learning how to recognize when they are being recruited into a radical belief system or cult or are being manipulated for either financial or political motivations.
“My contention is that the students are practicing info-literacy as they’ve learned it,” he says. “And as a matter of fact, this approach to researching online is what they have learned from a fairly early age in terms of how to approach sources and information on the web.
“I think a lot of academics and teachers would say, ‘but that’s not what we’re teaching,’” he adds. “Let’s just put that whole argument aside because it doesn’t matter. Whatever we’re teaching, these are the lessons they take away from it. So they are practicing info-literacy as learned, and through either chance or through engineering or through fate, whatever it is, these techniques plug really well into radicalization.”
Sometimes the people who fall down the rabbit holes and are recruited into communities organized around conspiracy theories would have ended up in a similar situation regardless. But people are also being actively recruited for a combination of political, ideological, and financial/economic motivations. And they are being actively deceived.
“We are all targets of disinformation, meant to erode our trust in democracy and divide us,” warns University of Washington information scientist Kate Starbird.
She came to this stark conclusion while conducting a study of how the discussion of the Black Lives Matter movement evolved on social media—and arrived at the unexpected realization, supported by both data and a raft of real-world evidence, that the whole discussion was being manipulated, and not for the better. The more the team examined the evidence, the clearer it became that the manipulation was intended to fuel internal social strife among the American public.
The study quickly morphed into a scientific examination of disinformation—that is, information intended to confuse and distort, whether accurate or not—which exists on all sides of the political spectrum. One of the team’s key studies used Twitter to examine how bad information spreads in the immediate aftermath of major crisis events such as mass shootings, how the resulting rumors “muddy the waters” around an event even for people who were physically present, and in particular how such rumors can permanently alter the public’s perception of the event itself and its causes.
Consider exhibit A: the nearly instantaneous claims by Alex Jones and other conspiracy theorists that the Las Vegas mass shooting of October 1, 2017, was a false flag event. The ensuing swirl of confusion permanently obscured the public’s understanding that the man who perpetrated it was unhinged and at least partially motivated by far-right conspiracy theories about guns. Even police investigators shied away from evidence that this was the case.
Whether we perceive stories, real or not, as “true” depends in large part on our unconscious cognitive biases, Starbird says—that is, on whether our preexisting beliefs are confirmed along the way. We’ve seen how these biases can be targeted by technology companies. Well-equipped political organizations can target them with disinformation in much the same way.
“If it makes you feel outraged against the other side, probably someone is manipulating you,” she warns.
The main wellspring of the disinformation Starbird dealt with in her study was Russia, whose “troll farms” pumped industrial-strength data pollution into American discourse via social media during the 2016 election campaign and afterward. But disinformation campaigns, she says, can be, and often are, run by anyone sophisticated enough to understand the essential principles: white nationalists, promoters of vaccine and other health-related conspiracy theories, and, in recent years, QAnon.
The strategy, she says, is not just consistent, but frighteningly sophisticated and nuanced. “One of these goals is to ‘sow division,’ to put pressure on the fault lines in our society,” she explained in her findings. “A divided society that turns against itself, that cannot come together and find common ground, is one that is easily manipulated. . . . Russian agents did not create political division in the United States, but they were working to encourage it.”
These outside actors make full use of a preexisting media ecosystem of “news” outlets that claim to be “fair” and “independent” but are in fact propaganda organizations, nearly all of them right-wing. As Starbird explained in one of her studies:
This alternative media ecosystem has challenged the traditional authority of journalists, both directly and indirectly. . . . Its development has been accompanied by a decreased reliance on and an increased distrust of mainstream media, with the latter partially motivated by a perception of widespread ethical violations and corruption within mainstream media. . . . Indeed, many view these alternative news sites as more authentic and truthful than mainstream media, and these effects are compounding—as research has found that exposure to online media correlates with distrust of mainstream media.
False information renders democratic discourse, which relies on factual accuracy, impossible, and as Starbird notes, “with the loss of commonly-held standards regarding information mediation and the absence of easily decipherable credibility cues, this ecosystem has become vulnerable to the spread of misinformation and propaganda.”
Because it’s actually a fairly closed, self-contained, and narrow ecosystem, it becomes a real echo chamber, with stories being repeated among the various “independent” news sites, even if they seem not to show up on the major networks (Fox being the most common exception). After a while, the repetition acts as a kind of confirmation for the stories—if people keep seeing different versions of the same headlines, they’ll start thinking the information has been confirmed by a “variety” of sources.
“The tactics of disinformation can be used by anyone,” Starbird says. “The Internet seems to facilitate access towards targets of different kinds, and we are definitely seeing disinformation from multiple sets of actors, including from the U.S., including foreign actors and domestic actors as well. There’s a certain flavor of Russian disinformation that is perhaps different from some others, but the tactics are known and they are easily learned and portable.”
Unwinding the relationship between authoritarian governments like Russia—which has been promoting far-right political movements around the world, particularly in Europe—and white nationalists trying to “red-pill” vulnerable young people is complicated. “There are some movements, particularly these far-right movements, whose disinformation aligns neatly with Russian disinformation as well,” Starbird observes.
“It’s a chicken-and-egg problem. The current manifestation of the far right or alt-right or whatever we want to call it, the information systems and some of the mechanisms for information flow all seem to have Russian disinformation integrated into them. It’s hard to know what’s cause and what’s effect, but they seem to be intertwined. In a similar way, we can see far left ecosystems around things like Syria and Venezuela are integrated with Russian disinformation as well. … We don’t know how causal that is versus opportunistic.”