To combat disinformation, we should treat Facebook like Big Tobacco


The rapid spread of political disinformation and misinformation on social media has become a major problem over the past four years. While it affects people on both sides of the aisle, it has primarily been a force for radicalizing conservatives. From the alt-right in 2016 to QAnon in 2020, a lack of trust in our democratic institutions, Americans' lack of media literacy and the failures of the platforms themselves have caused countless conservatives around the country to adopt more extremist and conspiratorial views.

Facebook recently banned QAnon groups from its platform, but as many have noted, these kinds of bans are only so effective. Users often simply become better at camouflaging what they're doing. If we're going to seriously take on the political disinformation problem, we cannot rely on platforms like Facebook to self-police. We're going to need a solution from Washington rather than Silicon Valley.

Algorithms and consumer protection

David Carroll, an associate professor of media design at the Parsons School of Design and one of the subjects of the documentary on Cambridge Analytica called The Great Hack, tells AlterNet that we need to stop focusing so much on cleaning up disinformation and start focusing on how it's become so powerful.

"There's no strategic or systemic attempt to get at the root of the radicalizing effect of personal data feeding into algorithms, feeding into a business model and juicing engagement at all costs," Carroll says. "Facebook banning QAnon is part of its reckoning. It was a thing on the chan board fringe that was mainstreamed on Facebook through the recommendation engine—algorithms and group recommending—so I think making the companies accountable to product safety and deceptive practices is probably a better strategy."

Any solution to this problem must avoid trampling on free speech. Even when Facebook and Twitter recently took action to limit the spread of a New York Post article on Hunter Biden's emails that contained personal email addresses and phone numbers, and which many have called Russian disinformation, Republicans claimed the platforms were censoring content to help Joe Biden win the election. There are ways we can address this issue without people worrying about tech companies intervening in the free flow of ideas.

Carroll says the goal should be keeping people from being radicalized in the first place and making sure social media companies can be held accountable when their platforms are causing harm to society. If Biden wins in November, the new administration and Congress should come together and pass legislation that will force companies like Facebook to reveal to the public how their algorithms work and what personal data are being fed into them.

"Once you give people rights to their data and protection of their data, that creates the basis to then make algorithms more accountable, which then makes companies more accountable for products that they have liability for. Then they have more incentive to make them safe or take them off of the market if they're unsafe," Carroll says.

As things stand, the algorithms and recommendation engines that power social media platforms like Facebook operate in the shadows. That makes it difficult to identify exactly how Facebook's rabbit holes lead people to conspiracism and extremism. If these companies were forced to be more transparent and allow users to understand how they're being influenced, that could have a major effect on the spread of disinformation and allow us to more specifically point out ways Facebook is failing the public.

"I think it comes down to mandating explainability—meaning you have to be able to explain how the algorithm is working so that accusations of radicalization can be authoritatively sussed out, which will force companies to design the product so they can't be accused of radicalizing," Carroll says. "Right now, there's no mechanism to hold the company accountable."

Carroll believes that when people know why they're being pushed in a certain direction politically, such as why they're receiving particular Facebook group recommendations, they'll be less likely to be sucked into a toxic spiral. Recent reporting from Britain's Channel 4 found that Black voters who were shown that they had been targeted by the Trump campaign in 2016 and pushed not to vote wanted to vote even more. Even Trump supporters were disturbed by how they had been targeted.

Brooke Binkowski, former managing editor of Snopes and the current managing editor of Truth or Fiction, tells AlterNet that she would get rid of the algorithm entirely.

"That's not organic spread," she says. "They are still invisibly manipulating the conversation—or at least trying to."

Treating Facebook like Big Tobacco

Carroll says we need to start thinking about companies like Facebook the way we think about Big Tobacco, alcohol companies or car companies. He says there are limits to how those industries can market their products and what products they can sell. Car companies adhere to strict safety rules and regulations that have dramatically reduced highway and pedestrian fatalities, but those companies are still able to "innovate and thrive," he says.

"There are plenty of examples where we have succeeded at that—letting industries thrive while also regulating them for safety," Carroll says. "Algorithms are just the next frontier of that."

To take the Big Tobacco analogy further, both Carroll and Binkowski believe Facebook should be held responsible for educating the public in the way cigarette companies were forced to educate the public about the dangers of smoking after the Master Settlement Agreement.

"The ad tech industry eroded the profitability of local news, local journalism. Local newspapers have become an endangered species, which harms local communities," Carroll says. "This contributes to the decline in trust in media, so I see these as systemic problems, and systemic solutions need to be considered."

Binkowski says she thinks lawmakers should force Facebook to dedicate funding to newsrooms "in perpetuity." She says this funding should be distributed transparently and overseen by a board of journalists and academics.

"That funding program should be global, not national. They need to be compelled to return to the world what they took from it," Binkowski says. "They need to pay journalists our fair share. They sucked up all the funds from what we made, what we created, the content that we risk our lives regularly putting together, generally for a salary that would shock non-journalists if they knew it."

Carroll says there's no "silver bullet" for fixing the disinformation problem, but creating a system with more transparency, accountability and enforcement is key. It's also important to force these companies to repair the damage they've done, which may include forcing them to fund newsrooms. Disinformation will always find a place on the internet, and it is very difficult to combat effectively. But if we can get to the root of the problem and make it less potent, we can start moving in the right direction and create a less chaotic and harmful information ecosystem.
