Truck Designers Wear Skirts to Build a Better Car For Women

This post, written by Jessica Valenti, originally appeared on Feministing

What better way to find out what women want in a car than to dress in skirts and heels and throw on some fake nails?
"A few times a year we go off-site and try to have a learning exercise that is a lot of fun," said [GM vehicle line director Mary Sipes]. "We took our group to the proving grounds and broke them into teams. One guy on each team had to be Mr. Mom. We dressed him in a garbage bag to simulate a tight skirt. We gave him rubber gloves with press on nails, a purse, a baby and a baby stroller and some chores like loading groceries."
The men were then required to go through what many women do routinely every day. They had to put the baby in a car seat and buckle it in, fold up the stroller, pull up the liftgate and stow the stroller, and put grocery bags in the back. They then had to walk around the vehicle and step into it without using the running board. Wearing the gloves with press-on nails, they had to operate the key fob, adjust the radio, and then figure out what to do with their purses -- all without breaking or losing a nail. Lost or broken fingernails or torn garbage-bag skirts resulted in points against the final score.

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of “gorilla.” But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google, an ad company rather than an information company, replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

“These platforms are encoded with racism,” says UCLA professor and best-selling author of Algorithms of Oppression, Dr. Safiya Noble. “The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore…There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination.”

On At Liberty this week, Dr. Noble joined us to discuss what she calls “algorithmic oppression,” and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.
