What Causes Cancer: Probably Not You

The perennial temptation to blame disease on sin, or at least on some grave moral failing, just took another hit. A major new study shows that women on a virtuous low-fat diet with an extraordinary abundance of fruits and veggies were no less likely to die of breast cancer than women who grazed more freely. Media around the world have picked up on the finding, cautioning, prudishly, that you still can't beat breast cancer with cheeseburgers and beer.

Another "null result" in cancer studies -- i.e., one showing that a suspected correlation isn't there -- has received a lot less attention. In the May issue of Psychological Bulletin, James Coyne and his colleagues at the University of Pennsylvania reported that "there is no compelling evidence linking psychotherapy or support groups with survival among cancer patients." This flies in the face of the received wisdom that any sufficiently sunny-tempered person can beat cancer simply through a "positive attitude." For example, an e-zine article entitled "Breast Cancer Prevention Tips" advises:
