A woman holds a sign reading "Respect the Vote" in Portuguese at a demonstration against Brazilian President Jair Bolsonaro and in defense of democracy and elections at Avenida Paulista in Sao Paulo, Brazil on August 11, 2022. (Photo: Paulo Lopes/Anadolu Agency via Getty Images)

Facebook 'Appallingly Failed' to Detect Election Misinformation in Brazil, Says Democracy Watchdog

"The disinformation that Facebook allows on its platform feeds into the 'stop the steal' narrative in Brazil—a growing tactic intended to set the stage for contesting the election."

The pro-democracy group Global Witness on Monday demanded to know how Facebook can claim to be "deeply committed to protecting election integrity" after the organization tested the social media platform's ability to detect misinformation about Brazil's upcoming election and found that none of the fake ads it submitted raised any red flags.

Less than two months before Brazilians head to the polls to vote in the presidential election, Global Witness submitted 10 Brazilian Portuguese-language ads to Facebook, telling users to vote on the wrong day, promoting voting methods that are not in use, and questioning the integrity of the election before it even takes place.

The company "appallingly failed" to flag the false information, the group found.

"Meta must recognize that protecting democracy is not optional: it's part of the cost of doing business."

"Despite Facebook's self-proclaimed efforts to tackle disinformation--particularly in high stakes elections--we were appalled to see that they accepted every single election disinformation ad we submitted in Brazil," said Jon Lloyd, senior advisor at Global Witness.

The ads clearly violated the misinformation policies and guidelines put forward by Facebook and its parent company, Meta, which state that moderators "remove content that attempts to interfere with voting, such as incorrect voting information."

One ad was initially rejected under the company's policy pertaining to "ads about social issues, elections, or politics," but without Global Witness taking any further action, Facebook later alerted the group that the ad had ultimately been approved. The content was directed at Indigenous people in Brazil and told users to vote on the wrong date.

"This bizarre sequence of decisions from Facebook seriously calls into question the integrity of its content moderation systems," said Global Witness.

Previously, the group ran similar tests to see if Facebook would flag hate speech in Myanmar, Kenya, and Ethiopia; it found Facebook's content moderation efforts "seriously lacking" in those investigations as well.

Brazilians are set to vote on October 2, and the election will decide whether President Jair Bolsonaro gets another term. His main opponent is former President Luiz Inacio Lula da Silva, commonly called Lula, who pushed progressive social programs when he led the country from 2003 to 2010.

Bolsonaro has overseen increased deforestation of the Amazon rainforest and a rise in inequality during the Covid-19 pandemic, and he has questioned the integrity of Brazil's electronic voting machines, just as some of Global Witness's fake Facebook ads did.

Global Witness warned that Brazilian users of Facebook, the most popular social media platform in the country, can log on to the site and see the same misinformation Bolsonaro is spreading, which could push the country toward the kind of unrest the U.S. saw after the 2020 election.

"The disinformation that Facebook allows on its platform feeds into the 'stop the steal' narrative in Brazil--a growing tactic intended to set the stage for contesting the election and risking similar violence as we saw during the January 6th insurrection attempt in the U.S.," said Lloyd. "Facebook can and must do better."

Global Witness noted that it submitted its fake ads from locations in London and Nairobi without masking the locations, did not include a disclaimer regarding who paid for the content, and used a non-Brazilian payment method, "all of which raises serious concerns about the potential for foreign election interference and Facebook's inability to pick up on red flags and clear warning signs," the group said.

"When Facebook fails to stop disinformation--as appears to be the case in Brazil's election--[the] most likely explanation is that it does not want to stop it," said Roger McNamee, author of the book Zucked: Waking Up to the Facebook Catastrophe.

Facebook did not comment on the specific findings of Global Witness, but told the group in a statement that it "prepared extensively for the 2022 election in Brazil" and described the tools it has launched to fight misinformation.

"It's clear that Facebook's election integrity measures are simply ineffective," said Global Witness. "Meta must recognize that protecting democracy is not optional: it's part of the cost of doing business."

Global Witness called on Facebook to take steps including:

  • Urgently increasing its content moderation capabilities to mitigate the risk of misinformation surrounding the Brazilian election;
  • Immediately strengthening its ad account verification process to better identify accounts posting content that undermines election integrity;
  • Publishing information on what steps it has taken in each country and for each language to ensure election integrity; and
  • Allowing verified independent third-party auditing so that Meta can be held accountable for what it says it is doing.

"Facebook has identified Brazil as one of its priority countries where it's investing special resources specifically to tackle election-related disinformation," Lloyd told the Associated Press. "So we wanted to really test out their systems with enough time for them to act. And with the U.S. midterms around the corner, Meta simply has to get this right--and right now."

Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.