From 'Hillbilly Elegy' to 'Fried Green Tomatoes,' Hollywood has a rural perception problem

As soon as Donald Trump was elected to the presidency in 2016, it seemed that all eyes turned to rural America for answers. I was working at a public radio station in Louisville, Kentucky, on election night — a typically blue dot in a state with a complex voting history, one that has backed Jimmy Carter and Bill Clinton as well as Reagan, both Bushes and Trump — when the results began to roll in. As it became clear that Trump was pulling ahead, my phone lit up with texts from friends and fellow journalists in larger cities, most asking the same question: "How could this happen?"

Almost immediately, media members, policymakers, and academics set about trying to contextualize the beliefs of portions of the country that had seemed to fade below notice or were dismissed as "flyover country." America at large sought to understand "rural America," a term that quickly became shorthand for the white, non-college-educated voters who aided in Trump's victory.
