Zuckerberg touted the changes as an anti-censorship campaign, saying the company was trying to "get back to our roots around free expression" and arguing that "the recent elections also feel like a cultural tipping point toward, once again, prioritizing speech."
However, Pat de Brún, head of Big Tech Accountability at Amnesty International, and Maung Sawyeddollah, the founder and executive director of the Rohingya Students' Network who himself fled violence from the Myanmar military in 2017, said the change in policies would make it even more likely that Facebook or Instagram posts would inflame violence against marginalized communities around the world. While Zuckerberg's announcement initially only applied to the U.S., the company has suggested it could make similar changes internationally as well.
"Rather than learning from its reckless contributions to mass violence in countries including Myanmar and Ethiopia, Meta is instead stripping away important protections that were aimed at preventing any recurrence of such harms," de Brún and Sawyeddollah wrote on the Amnesty International website. "In enacting these changes, Meta has effectively declared an open season for hate and harassment targeting its most vulnerable and at-risk people, including trans people, migrants, and refugees."
Past research has shown that Facebook's algorithms can promote hateful, false, or racially provocative content in an attempt to increase the amount of time users spend on the site and therefore the company's profits, sometimes with devastating consequences.
One example is what happened to the Rohingya, as de Brún and Sawyeddollah explained:
We have seen the horrific consequences of Meta's recklessness before. In 2017, Myanmar security forces undertook a brutal campaign of ethnic cleansing against Rohingya Muslims. A United Nations Independent Fact-Finding Commission concluded in 2018 that Myanmar had committed genocide. In the years leading up to these attacks, Facebook had become an echo chamber of virulent anti-Rohingya hatred. The mass dissemination of dehumanizing anti-Rohingya content poured fuel on the fire of long-standing discrimination and helped to create an enabling environment for mass violence. In the absence of appropriate safeguards, Facebook's toxic algorithms intensified a storm of hatred against the Rohingya, which contributed to these atrocities. According to a report by the United Nations, Facebook was instrumental in the radicalization of local populations and the incitement of violence against the Rohingya.
In late January, Sawyeddollah—with the support of Amnesty International, the Open Society Justice Initiative, and Victim Advocates International—filed a whistleblower complaint against Meta with the Securities and Exchange Commission (SEC) concerning Facebook's role in the Rohingya genocide.
The complaint argued that the company, then registered as Facebook, had known or at least "recklessly disregarded" since 2013 that its algorithm was encouraging the spread of anti-Rohingya hate speech and that its content moderation policies were not sufficient to address the issue. Despite this, it misrepresented the situation to both the SEC and investors in multiple filings.
Now, Sawyeddollah and de Brún are concerned that history could repeat itself unless shareholders and lawmakers take action to counter the power of the tech companies.
"With Zuckerberg and other tech CEOs lining up (literally, in the case of the recent inauguration) behind the new administration's wide-ranging attacks on human rights, Meta shareholders need to step up and hold the company's leadership to account to prevent Meta from yet again becoming a conduit for mass violence, or even genocide," they wrote. "Similarly, legislators and lawmakers in the U.S. must ensure that the SEC retains its neutrality, properly investigates legitimate complaints, such as the one we recently filed, and ensures those who abuse human rights face justice."
The human rights experts aren't the only ones concerned about Meta's new direction. Even employees are sounding the alarm.
"I really think this is a precursor for genocide," one former employee told Platformer when the new policies were first announced. "We've seen it happen. Real people's lives are actually going to be endangered. I'm just devastated."