If you decided to take a breather this weekend from the relentless stream of Facebook news, you've got a lot of catching up to do.
Since Friday afternoon, several prominent outlets have rushed to press with stories drawn from the trove of internal Facebook documents provided by whistleblower Frances Haugen. These stories offer shocking insights into Facebook's cover-up of its role in the democracy-destabilizing spread of hate and disinformation, especially in the aftermath of the 2020 U.S. elections.
By Monday, another dozen reports were added to the pile, including important reporting by Bloomberg and USA Today on Facebook's inability to curb the amplification of hate speech or prevent conspiracy theorists from gaming its algorithms to spread their toxic messages across the company's networks.
What's going on here?
Since early October, a consortium of 18 news outlets, including the Associated Press, The Atlantic, Bloomberg, CNN, NBC, The New York Times and The Washington Post, has been sifting through tens of thousands of Haugen's documents, with a plan to publish reports in some synchronized fashion.
Profits over people
The weekend crop of stories was part of that push. Nearly all of them reveal Facebook executives' reluctance to fix the problem at the heart of the company's dangerous but profitable business: a revenue model that puts engagement and growth before the health and welfare of a multiracial democracy.
On Friday, the Associated Press reported on a rebellion that "erupted" inside the offices of Facebook on Jan. 6 as employees grew frustrated with the company's unwillingness to address the rise of pro-Trump political extremism across its platforms following the 2020 elections.
At CNN, Donie O'Sullivan, Tara Subramaniam and Clare Duffy reported that Facebook was "fundamentally unprepared" to curtail the Stop the Steal movement, which grew out of the false belief that the election was rigged and that Trump was the "true president."
Spokespeople and organizers for this anti-democratic campaign used Facebook's platforms to turn out people at events that led to the deadly attack on the U.S. Capitol. Even worse, the company provided the basic coordinating infrastructure that mobilized people and incited them to violence. In a damning news clip tied to CNN's report, O'Sullivan asked participants in the insurrection how they heard about or helped organize the attack. Their answer: via Facebook groups and event pages.
NBC, The New York Times and NPR reported on the efforts of a Facebook researcher who created an imaginary user, Carol Smith of North Carolina, to test the platform's engagement algorithms. The researcher offered a few details about Smith, including that she was a Trump supporter and followed the conservative accounts of Fox News and Sinclair Broadcast Group.
"Within one week, Smith's feed was full of groups and pages that had violated Facebook's own rules, including those against hate speech and disinformation," reports NBC's Brandy Zadrozny. These included several recommendations that Smith join groups dedicated to spreading far-right QAnon conspiracy theories or others espousing support for a coming race war.
Too big (and profitable) to fix
By Monday, Bloomberg reported that Facebook executives had long known the company's hate-speech problem was far bigger and more entrenched than they had disclosed.
Last March, Facebook founder Mark Zuckerberg assured Congress that "more than 98 percent of the hate speech that we take down is done by [artificial intelligence] and not by a person." But Facebook employees warned that those numbers were misleading. Neither the company's human reviewers nor its automated systems were that good at flagging most hateful content.
"In practice, the company was only removing 5 percent or less of hate speech, the documents suggest," Bloomberg reported.
Zuckerberg and many of his spokespeople continue to say the violence that resulted from the spread of hate and disinformation is the fault of those who physically harmed others during the insurrection and at other times. But this new series of reports assigns significant blame to Facebook executives.
"Unquestionably, [Facebook] is making hate worse," Haugen told members of the UK Parliament on Monday. "I think there is a view inside the company that safety is a cost center, it's not a growth center, which I think is very short-term in thinking because Facebook's own research has shown that when people have worse integrity experiences on the site, they are less likely to retain [them]."
Facebook isn't capable of such farsightedness. Yes, its constant hunt for short-term growth may ultimately sabotage the social-media giant's long-term survival. But can we really afford to wait for Facebook to fix itself?
As news organizations continue to publish stories exposing Facebook's failures, more and more lawmakers and regulators have called for an investigation of a business model that profits from the spread of the most extreme hate and disinformation.
Fixing the model is the right approach. But it's a repair that Facebook will never do on its own.