Despite internal research showing that Facebook's platform was exploiting and exacerbating divisiveness among its users, top executives ignored findings that the algorithms were doing the exact opposite of the company's stated public mission to bring people together.
That's according to new reporting published Tuesday by the Wall Street Journal, which, in a comprehensive dive into the company's handling of its platform's capacity to divide users, found that executives knew in 2018 what the site was doing to users but declined to take action.
"The most persistent myth about Facebook is that it naively bumbles its way into trouble," tweetedNew York Times tech columnist Kevin Roose. "It has always known what it is, and what it's doing to society."
"New from @JeffHorwitz & me: Facebook spent years studying its role in polarization, according to sources and internal documents. One internal slide laid out the issue like so: 'Our algorithms exploit the human brain's attraction to divisiveness.' https://t.co/PZWLUt68rs" — Deepa Seetharaman (@Deepa Seetharaman), May 26, 2020
"Our algorithms exploit the human brain's attraction to divisiveness," a presentation to the company's leaders in 2018 declared. The research further warned that if the problem was "left unchecked," the platform's system would feed users "more and more divisive content in an effort to gain user attention & increase time on the platform."
As the Daily Beast explained:
The social-media giant, which boasts its mission is to "connect the world," reportedly launched a research project in 2017 led by Facebook's former Chief Product Officer Chris Cox to study how its algorithms aggravate divisive and harmful content. The task force named "Common Ground" assigned employees into "Integrity Teams" throughout the company. The team reportedly found that while some groups united people from various backgrounds, others only accelerated conflict and misinformation.
Despite the evidence laid out in Cox's team's presentation, however, the company's leadership, particularly vice president of global public policy Joel Kaplan, rejected taking action to change the platform's algorithms and incentives.
According to the Journal, Kaplan wields outsized power at Facebook when it comes to determining the company's direction:
A former deputy chief of staff to George W. Bush, Mr. Kaplan became more involved in content-ranking decisions after 2016 allegations Facebook had suppressed trending news stories from conservative outlets. An internal review didn't substantiate the claims of bias, Facebook's then-general counsel Colin Stretch told Congress, but the damage to Facebook's reputation among conservatives had been done.
[...]
Disapproval from Mr. Kaplan's team or Facebook's communications department could scuttle a project, said people familiar with the effort. Negative policy-team reviews killed efforts to build a classification system for hyperpolarized content.
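The "classification system for hyperpolarized content" the Journal mentions was never built, and its design is not public. In its simplest form, though, such a system would be an ordinary text classifier. The hypothetical sketch below uses scikit-learn and invented training examples purely to show the shape of the idea; it is not Facebook's method.

```python
# Hypothetical sketch only: a minimal "hyperpolarized content" classifier.
# The labeled examples and pipeline are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented labeled examples: 1 = hyperpolarized, 0 = not.
texts = [
    "They want to destroy our way of life, wake up",
    "Everyone who disagrees is a traitor to this country",
    "City council approves new bike lanes downtown",
    "Recipe: slow-cooker chili for game day",
]
labels = [1, 1, 0, 0]

# TF-IDF features feeding logistic regression: a standard text-classification
# baseline, chosen here only for simplicity.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# Expected: [1], since the tokens overlap with the hyperpolarized examples.
print(model.predict(["They want to destroy this country"]))
```

A production system would need far more data and careful definitions of "hyperpolarized," which is part of why, per the Journal, the effort drew policy-team scrutiny before it was killed.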
Journalists took to Twitter to note the timeline of the internal findings and how Facebook had responded to their earlier questions about the platform's incentives.
"And the people who've been reporting on this for years are met with complete derision and sometimes outright lies by those at Facebook when we bring it up." — Ben Collins (@Ben Collins), May 26, 2020
"I visited Facebook HQ around this time to ask about the growing instances of mass violence linked the platform," tweetedNew York Times reporter Max Fisher. "One executive after another looked me in the eye and said they had no reason to believe the platform itself drove bad behavior."
The reporting, said tech accountability advocacy group Freedom From Google and Facebook, "proves what we've been saying all along: Facebook knows what it's doing, intentionally continues to cause harm to increase engagement and profit, and will never fix these problems themselves."
"Until Congress, state attorneys general, the FTC, or DOJ breaks up Facebook's monopoly and further regulates them," the group added, "nothing is going to change."