Facebook Ignored Internal Warnings Its Algorithms Were Intensifying Divisiveness: Report


"Facebook knows what it's doing, intentionally continues to cause harm to increase engagement and profit, and will never fix these problems themselves."  

Despite internal research showing that Facebook's platform was exploiting and exacerbating divisiveness among its users, top executives ignored findings that the company's algorithms were doing the exact opposite of its stated public mission to bring people together.

That's according to new reporting Tuesday from the Wall Street Journal, which, in a comprehensive dive into the platform's capacity to divide users, found that executives knew in 2018 what the site was doing to users but declined to take action.

"The most persistent myth about Facebook is that it naively bumbles its way into trouble," tweeted New York Times tech columnist Kevin Roose. "It has always known what it is, and what it's doing to society."

"Our algorithms exploit the human brain's attraction to divisiveness," a presentation to the company's leaders in 2018 declared. The research further warned that if the problem was "left unchecked," the platform's system would feed users "more and more divisive content in an effort to gain user attention & increase time on the platform."

As the Daily Beast explained:

The social-media giant, which boasts that its mission is to "connect the world," reportedly launched a research project in 2017, led by Facebook's former Chief Product Officer Chris Cox, to study how its algorithms aggravate divisive and harmful content. The task force, named "Common Ground," assigned employees to "Integrity Teams" throughout the company. The teams reportedly found that while some groups united people from various backgrounds, others only accelerated conflict and misinformation.

Despite the evidence laid out in Cox's team's presentation, however, the company's leadership, particularly vice president of global public policy Joel Kaplan, rejected taking action to change the platform's algorithm and incentives.

According to the Journal, Kaplan has outsized power at Facebook when it comes to determining the direction of the company:

A former deputy chief of staff to George W. Bush, Mr. Kaplan became more involved in content-ranking decisions after 2016 allegations Facebook had suppressed trending news stories from conservative outlets. An internal review didn't substantiate the claims of bias, Facebook's then-general counsel Colin Stretch told Congress, but the damage to Facebook's reputation among conservatives had been done.

[...]

Disapproval from Mr. Kaplan's team or Facebook's communications department could scuttle a project, said people familiar with the effort. Negative policy-team reviews killed efforts to build a classification system for hyperpolarized content.

Journalists took to Twitter to note the timeline of the meeting and their requests for information on the platform's incentives.

"I visited Facebook HQ around this time to ask about the growing instances of mass violence linked to the platform," tweeted New York Times reporter Max Fisher. "One executive after another looked me in the eye and said they had no reason to believe the platform itself drove bad behavior."

The reporting, said tech accountability advocacy group Freedom From Google and Facebook, "proves what we've been saying all along: Facebook knows what it's doing, intentionally continues to cause harm to increase engagement and profit, and will never fix these problems themselves."

"Until Congress, state attorneys general, the FTC, or DOJ breaks up Facebook's monopoly and further regulates them," the group added, "nothing is going to change."

Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.