Fashion Retailer Zara Has a Dubious Solution to Prevent Racist and Culturally Appropriated Products

This can’t end well. A recent New York Times article on how fast fashion brands are trying to avoid producing insensitive and offensive merchandise revealed that mega-retailer Zara has a new plan for combating racism, design theft and cultural appropriation in its products. To avoid future gaffes, Zara told the Times it will rely on an algorithm to “scan designs for insensitive or offensive features.”


But this strategy for achieving increased sensitivity is faulty and troubling.

In recent years, Zara has been accused of appropriating designs from non-European cultures, like a Somali baati dress and a South Asian lungi skirt, as well as selling outright offensive items, like a skirt featuring Pepe the Frog, a cartoon co-opted as a white supremacist symbol, and a striped top that looked suspiciously like a concentration camp uniform, complete with a yellow Star of David.

[[{"type":"media","view_mode":"media_large","fid":"631279","attributes":{"alt":"","class":"media-image","height":"394","typeof":"foaf:Image","width":"480"}}]]

Credit: Zara

Zara isn’t alone in weathering such scandals, or seeking solutions when they happen. A few weeks ago, H&M experienced a backlash after it dressed a young black model in a sweatshirt with the phrase “coolest monkey in the jungle” printed across the front. In response to criticism, H&M hired a new global leader for diversity and inclusiveness.

Fast fashion is literally fast: stores like Zara can turn designs into in-store products in just a few weeks. But that speed, paired with scant oversight of an enormous volume of product, leaves room for major mistakes. “When you have two hours to approve a line versus two months, things go unnoticed,” Adheer Bahulkar, a retail expert, told the Times.

Zara’s choice to rely on an algorithm to scan for potentially offensive content is a step in the right direction, but algorithms aren’t always reliable: remember when Google Photos labeled African American users “gorillas”? Or when a Nikon camera insisted that Asian subjects of photos were blinking? AI technology does not have the best track record, and it has a pattern of reproducing and amplifying existing biases.

Kate Crawford, an AI researcher at Microsoft, and Meredith Whittaker, a researcher at Google, told the MIT Technology Review that bias may exist in algorithmic products in all sectors. “It’s still early days for understanding algorithmic bias,” they said. “Just this year we’ve seen more systems that have issues, and these are just the ones that have been investigated.”

Cathy O’Neil, a mathematician and the author of Weapons of Math Destruction, similarly told the MIT Technology Review, “Algorithms replace human processes, but they’re not held to the same standards. People trust them too much.”

The decision to rely on algorithms to prevent another PR gaffe seems like an inexpensive way to address a problem that actual human beings should be responsible for. As Blavity pointed out, Zara would be better off promising to increase diversity in its design and production teams. The company did hire a team of diversity officers in 2016 after multiple embarrassing, offensive products were released, and Zara now requires new employees to go through inclusion training (perhaps in response to reports that salespeople discriminated against non-white customers). But to truly respond to accusations of insensitivity, the company must go beyond an algorithm and a mandatory one-time training for workers: it should build an inclusive and diverse upper-level staff and disrupt the all-white, all-European makeup of its board of directors.
