To Have a Facebook Account Is to Beg to Be Manipulated
What do you call it when media try to manipulate your feelings without first asking for informed consent?
Example: The average Facebook user sees only 20 percent of the 1,500 stories per day that could have shown up in their news feed. The posts you receive are determined by algorithms whose bottom line is Facebook’s bottom line. The company is constantly adjusting all kinds of dials, quietly looking for the optimal mix to make us spend more of our time and money on Facebook. Of course the more we’re on Facebook, the more information they have about us to fine-tune their formulas for picking ads to show us. That’s their business model: We create and give Facebook, for free, the content they use and the data they mine to hold our attention, which Facebook in turn sells to advertisers.
Those are the terms of service that everyone, without reading, clicks “I Agree” to – and not just for Facebook. We make comparable mindless contracts all the time with Gmail, Yahoo, Twitter, Amazon, Siri, Yelp, Pandora and tons of other apps, retailers and advertiser-supported news and entertainment. If you’re online, if you use a smartphone, you’re an experimental subject in proprietary research studies of how best to target, engage and monetize you. They’re always testing content, design, headlines, graphics, prices, promotions, profiling tools, you name it, and you’ve opted in whether you realize it or not.
Many of these experiments hinge on our feelings, because much of what makes us come, stay, buy, like, share, comment and come back is emotional, not rational. So it should surprise no one that Facebook wants to know what makes its users happier. But when the company acknowledged last month that it had tested – on 700,000 people, for one week – whether increasing the fraction of upbeat posts in their news feeds made them feel more upbeat (it did), a firestorm broke out.
The charge: People are being treated like guinea pigs without their consent. Unaccountable corporations are secretly manipulating our emotions. This is the slippery slope to “Brave New World.”
So what else is new? Neil Postman first warned us about “Amusing Ourselves to Death” – the name of his book – in 1984, before the Web was spun. But that didn’t stop entertainment, which is exquisitely attuned to the marketplace, from making its long march through our institutions. Today, politics is all about unaccountable corporations manipulating our emotions; they’re constantly testing and targeting their paid messages to voters, none of whom are asked for informed consent. The news industry is all about the audience, and much of its content has long been driven by the primal power of danger, sex and novelty to trap our attention, but there’s no clamor for shows and sites to warn us we’re lab chimps.
John Kenneth Galbraith called advertising “the management of specific demand.” Ads tell us stories, which are all variants of: If you buy this, you’ll be happy. Their words and images were tested on audiences even before Don Draper was a boy, and now digital analytics gives marketers new attention-management techniques to use on us. Today, every tweet, every YouTube or blog post aspires to be viral, and when that happens, no one complains that some cat or cute kid or Kardashian has used Orwellian mind control to manipulate our mood.
I’ll give the Facebook freakout this: University partners did the research using Facebook’s data, and the academic vetting process could have gone the other way and nixed the project. But even if that had happened, Facebook could still have conducted this experiment, just as it and Google and plenty of other companies no doubt continue to adjust algorithms, run randomized trials of content and design (known as A/B tests) and discover the many economic, political and cultural micro-tribes we consumers belong to. Academic committees called Institutional Review Boards rule on what professors can do to research subjects, but informed consent in Silicon Valley is basically whatever someone can get away with – which has been true for commerce, politics and the content industries since at least the 1980s.
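For the curious, here is roughly what one of those A/B tests looks like under the hood – a minimal sketch in Python, with invented numbers, a made-up outcome measure and no connection to any company’s actual code. Users are split at random into a control group and a treatment group, the treatment group’s feed is (hypothetically) tilted toward upbeat posts, and the difference between the groups is checked for statistical significance:

```python
# Hypothetical sketch of an A/B test like the ones described above.
# Every number and name here is invented for illustration; this is
# not Facebook's (or anyone's) actual code or data.
import random
from math import sqrt

random.seed(42)

def simulate_outcome(bucket: str) -> int:
    """Pretend outcome: 1 if the user writes an upbeat post that week.
    We assume, purely for the example, that a sunnier feed nudges the
    rate from 30% to 33%."""
    rate = 0.33 if bucket == "treatment" else 0.30
    return 1 if random.random() < rate else 0

# Randomly assign a simulated population to the two groups and record outcomes.
outcomes = {"control": [], "treatment": []}
for _ in range(100_000):
    bucket = "treatment" if random.random() < 0.5 else "control"
    outcomes[bucket].append(simulate_outcome(bucket))

# Compare the groups with a two-proportion z-test.
n1, n2 = len(outcomes["control"]), len(outcomes["treatment"])
p1 = sum(outcomes["control"]) / n1      # control group's upbeat-post rate
p2 = sum(outcomes["treatment"]) / n2    # treatment group's upbeat-post rate
p_pool = (sum(outcomes["control"]) + sum(outcomes["treatment"])) / (n1 + n2)
z = (p2 - p1) / sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))

print(f"control rate:   {p1:.3f}")
print(f"treatment rate: {p2:.3f}")
print(f"z-statistic:    {z:.2f}")   # |z| > 1.96 is significant at the 5% level
```

At the scale Facebook or Google operates, even a minuscule difference between the two groups will clear that significance bar, which is one reason such trials can run constantly and invisibly.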
In fact, ever since people first gathered around the fire, storytellers have perfected their skills by studying the data in their audiences’ eyes. Today, we may think that our media savvy and B.S. detectors protect us from being played like piccolos, but people have always believed that thinking could reliably prevent their emotions from running away with them, and they’ve always been wrong. Neuroscience now shows what happens: Our emotions are faster than our reason, which we then use to reverse engineer some rationalization for our actions.
Is there any way to protect people from the hidden persuaders, as Vance Packard called an earlier era’s desire wizards? After all, the arts and technologies of manipulation are only going to get more powerful. Consumer protection is only going to grow weaker. Mass education’s ability to turn out critical thinkers is hardly going to spike upward. The best plan Plato could come up with to protect future leaders from being enslaved by their appetites was to exile the most powerful manipulators of his time – the poets, who whipped crowds into frenzies with their artifice and illusions.
But banishment is an authoritarian solution. More speech, not less, is the democratic answer to assaults on freedom and agency. Open-source research, with methods and tools freely available, can serve the public interest. (That’s what we’re up to at the Norman Lear Center’s Media Impact Project.) And the place where countervailing speech most needs to be heard is in the media, whose industrial success, like Facebook’s, depends on monetizing our attention. I’ve seen a lot of stories about Facebook fiddling with the happiness of our feeds. The irony is that I encountered all of them on media whose owners are just as determined to push my buttons as Mark Zuckerberg is.