The disturbing and cruel psychology of Marjorie Taylor Greene

Rep. Marjorie Taylor Greene

Indulge me, please, while I dwell another day on Marjorie Taylor Greene, the House Republican and the subject of Friday's Editorial Board. I want to discuss her hostile confrontation with David Hogg, the teenager who saw friends and classmates murdered in the 2018 Parkland massacre. A year later, Greene stalked Hogg while he was walking down a Washington street. She demanded he defend his gun-reform advocacy. She later called him "a coward" in the pay of George Soros. She said the "radical gun control agenda David Hogg was pushing" made him a "little Hitler."

Consider what it took for someone to do this. Consider what a grown woman has to do psychologically to look into the eyes of a teenager who witnessed unimaginable suffering, and call him "a coward." Most of us possess a sense of empathy. Most of us, even if we're champions of the Second Amendment, could not tolerate knowing that we've compounded a young man's pain. Most of us would recoil instantly. "Oh my God!" most of us would say. "I'm so sorry! Go ahead and say whatever you want!"


Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of "gorilla." But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google, an ad company rather than an information company, replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

"These platforms are encoded with racism," says Dr. Safiya Noble, UCLA professor and best-selling author of Algorithms of Oppression. "The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore. … There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination."

On At Liberty this week, Dr. Noble joined us to discuss what she calls "algorithmic oppression," and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.
