Professor of history of philosophy explains why adversarial criticism is antithetical to truth


Philosophical discussions, whether in a professional setting or at the bar, frequently consist of calling out mistakes in whatever has been proposed: ‘This is all very well, but …’ This adversarial style is often celebrated as truth-conducive: eliminating false assumptions seems to leave us with truth in the marketplace of ideas. Although this is a fairly pervasive practice (even I am practising it right now), I doubt that it is a particularly good approach to philosophical discussion. The lack of progress in adversarial philosophical exchange might rest on a simple but problematic division of labour: in professional settings such as talks, seminars and papers, we standardly criticise the views of others rather than our own. At the same time, we clearly risk our reputation far more when proposing an idea than when criticising one. This systematically disadvantages proponents of (new) ideas.

By ACLU (Sponsored)

Imagine you've once again forgotten the difference between a gorilla and a chimpanzee, so you do a quick Google image search for “gorilla.” But instead of images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google, an ad company rather than an information company, simply replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

“These platforms are encoded with racism,” says UCLA professor and best-selling author of Algorithms of Oppression, Dr. Safiya Noble. “The logic is racist and sexist because it would allow for these kinds of false, misleading kinds of results to come to the fore… There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination.”

On At Liberty this week, Dr. Noble joined us to discuss what she calls “algorithmic oppression,” and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.
