The pro-democracy group Global Witness on Monday demanded to know how Facebook is, as it claims, "deeply committed to protecting election integrity" after the organization tested Facebook's ability to detect misinformation about Brazil's upcoming election--and found that none of the fake ads it submitted raised any red flags for the social media platform.
Less than two months before Brazilians head to the polls in the presidential election, Global Witness submitted 10 Brazilian Portuguese-language ads to Facebook--telling users to vote on the wrong day, promoting voting methods that are not in use, and questioning the integrity of the election before it even takes place.
The company "appallingly failed" to flag the false information, the group found.
"Meta must recognize that protecting democracy is not optional: it's part of the cost of doing business."
"Despite Facebook's self-proclaimed efforts to tackle disinformation--particularly in high-stakes elections--we were appalled to see that they accepted every single election disinformation ad we submitted in Brazil," said Jon Lloyd, senior advisor at Global Witness.
The ads clearly violated the misinformation policies and guidelines put forward by Facebook and its parent company, Meta, which state that moderators "remove content that attempts to interfere with voting, such as incorrect voting information."
One ad was initially rejected under the company's policy pertaining to "ads about social issues, elections, or politics," but without any effort by Global Witness, Facebook alerted the group that the ad had ultimately been approved. The content was directed at Indigenous people in Brazil and told users to vote on the wrong date.
"This bizarre sequence of decisions from Facebook seriously calls into question the integrity of its content moderation systems," said Global Witness.
Previously, the group ran similar tests to see if Facebook would flag hate speech in Myanmar, Kenya, and Ethiopia; the group found Facebook's content moderation efforts "seriously lacking" in those investigations as well.
Brazilians are set to vote on October 2, and the election will decide whether President Jair Bolsonaro gets another term. His main opponent is former President Luiz Inácio Lula da Silva, commonly called Lula, who pushed progressive social programs when he led the country from 2003 to 2010.
Bolsonaro has overseen increased deforestation of the Amazon rainforest, a rise in inequality during the Covid-19 pandemic, and has questioned the integrity of Brazil's electronic voting machines, as some of Global Witness's fake Facebook ads did.
The fact that Brazilian users of Facebook--the most popular social media platform in the country--could log on to the site and see the same misinformation Bolsonaro is spreading could push the country towards the kind of unrest the U.S. saw after the 2020 election, said Global Witness.
"The disinformation that Facebook allows on its platform feeds into the 'stop the steal' narrative in Brazil--a growing tactic intended to set the stage for contesting the election and risking similar violence as we saw during the January 6th insurrection attempt in the U.S.," said Lloyd. "Facebook can and must do better."
Global Witness noted that it submitted its fake ads from locations in London and Nairobi, without masking the locations; did not include a disclaimer regarding who paid for the content; and used a non-Brazilian payment method--"all of which raises serious concerns about the potential for foreign election interference and Facebook's inability to pick up on red flags and clear warning signs," the group said.
"When Facebook fails to stop disinformation--as appears to be the case in Brazil's election--[the] most likely explanation is that it does not want to stop it," said Roger McNamee, author of the book Zucked: Waking Up to the Facebook Catastrophe.
"Pro tip: when FB fails to stop disinformation--as appears to be the case in Brazil's election--most likely explanation is that it does not want to stop it. Hate speech, disinformation, and conspiracy theories boost engagement, which increases profits. https://t.co/mjwCmrfXVK" --Roger McNamee on Twitter, August 15, 2022
Facebook did not comment on the specific findings of Global Witness, but told the group in a statement that it "prepared extensively for the 2022 election in Brazil" and described the tools it has launched to fight misinformation.
"It's clear that Facebook's election integrity measures are simply ineffective," said Global Witness. "Meta must recognize that protecting democracy is not optional: it's part of the cost of doing business."
Global Witness called on Facebook to take a number of corrective steps.
"Facebook has identified Brazil as one of its priority countries where it's investing special resources specifically to tackle election-related disinformation," Lloyd told the Associated Press. "So we wanted to really test out their systems with enough time for them to act. And with the U.S. midterms around the corner, Meta simply has to get this right--and right now."