Riddles 11

EASY:
A sweet geometry
That quickly disappears.
Becomes palatal tinge,
This small, white bit of cheer.

ANSWER: A sugar cube
A cube of sugar dissolves quickly in one's coffee or tea. Once dissolved, the sugar becomes a sweet, cheerful tinge on the tongue.

INTERMEDIATE:
Haiku is structured in syllables,
But here each stage by feet is done.
A stone is thrown. A task is shown.
Then: one, two, one, two, one, two, one.

ANSWER: Hopscotch
Somewhat like the pattern of syllables in a haiku (5-7-5), the steps of a hopscotch game are ordered by the number of feet one uses. After throwing a stone and seeing the task, the hopper jumps (with two legs) and hops (with one leg) to victory.

DIFFICULT:
The field is budding with new day's growth.
The farmer awakes and it starts to rain.
The earth is cleansed. The field is cut.
The field lies smooth, but the farmer feels pain.

ANSWER: A morning shave
The "field" in this riddle is a man's face -- budding each morning with one day's growth of facial hair. When the man wakes up he showers, cleansing the "earth," and then "cuts the field." While his field now lies smooth, the farmer feels the pain of razor burn.

By ACLU · Sponsored

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search for “gorilla.” But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google an ad company, not an information company, that is replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

“These platforms are encoded with racism,” says UCLA professor and best-selling author of Algorithms of Oppression, Dr. Safiya Noble. “The logic is racist and sexist because it would allow for these kinds of false, misleading kinds of results to come to the fore… There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination.”

On At Liberty this week, Dr. Noble joined us to discuss what she calls “algorithmic oppression,” and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.

What you can do:
Take the pledge: Systemic Equality Agenda