Facebook and Free Speech

Facebook should adopt a policy of extreme transparency.

In the weeks since Mark Zuckerberg's testimony to Congress, Facebook has made a series of important policy announcements. The company released a document explaining which posts and accounts it removes on the basis of its internal rules, known as "community standards," and it engaged outside consultants to review the social media platform's impact on various communities. The company also released its first transparency report on the enforcement of those standards.

These are all welcome developments, but they lay bare a fundamental question raised by Zuckerberg himself: What obligations does the public want companies to fulfill when deciding which speech deserves a place on the internet and social media? The Supreme Court recently called the internet and social media platforms "the most important places...for the exchange of views," so the question is not simply an academic exercise.

Facebook's process for deleting posts and accounts based on its community standards has long been opaque, leading to charges from people across the political spectrum that Facebook is effectively shutting down their speech while allowing others to operate unfettered.

From the right, Facebook has been accused of liberal bias, including allegations that the platform manipulated the "Trending Topics" portion of its newsfeed to demote conservative sources in the lead-up to the 2016 election. More recently, conservative critics flamed Facebook for labeling videos from two pro-Trump African-American sisters as "unsafe," though that may have had more to do with the sisters' habit of promoting conspiracy theories and Holocaust deniers than with their support for the president.

In the meantime, progressive voices have demanded that Facebook address concerns about hate speech and harassment on the platform, as well as the censorship of "Black, Arab, Muslim, and other marginalized voices." Anecdotal evidence suggests that posts by people of color, as well as Muslims, are disproportionately targeted for content takedowns, while white nationalist movements are largely left alone. Facebook has removed posts that accuse white people of complicity in racism or that report racial slurs, while leaving up scores of posts and accounts promoting white supremacy and violence against marginalized groups.

Political speech, too, has been suppressed, often under Facebook's rules against broadly defined "terrorist speech." Following a 2016 agreement with Israel to address "incitement," for instance, Facebook suspended the personal accounts of several prominent Palestinian journalists and removed even non-political content from their home publications. The platform also deleted accounts and content from academics, journalists, and local newspapers relating to the conflict in Kashmir, including posts about a separatist killed by the Indian army. Faced with complaints, Facebook characterized the posts as "terrorist content" and said they would be permitted only if they also "condemn[ed] these organisations and/or their violent activities."

These cases demonstrate the complexity of implementation on the ground. Even Facebook's newly published, supposedly bright-line rules raise difficult questions. For example, the platform prohibits posts supporting or praising any individual engaged in "terrorist activity" or "organized hate," even if the support is entirely separate from the activities themselves. Would the company remove a post arguing that Dylann Roof, who went on a deadly shooting spree at an African-American church in Charleston, S.C., did not deserve the death penalty on human rights grounds? What about posts praising the humanitarian efforts carried out by the Holy Land Foundation, a Muslim charity, despite its conviction for support of Hamas? The published policies do not give adequate answers to these questions.

Moreover, the use of artificial intelligence to screen for terrorist content has produced an outsized focus on Muslims, as described above, and experience shows these types of detection tools are likely to encode, perpetuate, and even mask societal bias. Facebook's push towards using algorithms to flag various types of speech for removal risks baking in biases that will disadvantage minorities and underrepresented communities.

As Facebook continues to refine its policies and practices with the help of information generated by the upcoming company audit, it has an opportunity to improve.

First, Facebook should ensure that it is living up to its stated presumption in favor of allowing speech. The platform should ensure removals are carried out in narrow, targeted ways, and any process for removing posts and accounts should take at least equal account of the value of a robust exchange of ideas, including unpopular ideas. The company should clarify that, while ensuring that individuals and groups can function online free from harassment and hate is a critical goal, it does not intend to follow Zuckerberg's suggestion to Congress that the company would remove all "speech that might make people feel just broadly uncomfortable." Even speech that makes people uncomfortable can serve the public good.

Second, Facebook should take concrete steps to implement its presumption in favor of speech by providing top-notch training to content moderators, including on the value of maintaining the platform as a venue for the open exchange of ideas. The company recently added a much-needed appeals mechanism for deletions, and it should be properly resourced. Facebook intends to have 20,000 staff focused on "safety and security" by the end of the year, and has said publicly that 7,500 of those will work on both removals and appeals, but it has not disclosed how resources will be divided between the two functions. It is critical that the appeals process be taken as seriously as the takedown process.

Third, Facebook should adopt a policy of extreme transparency. While it recently joined its peers in reporting on the number of posts and accounts that it takes down based on its terms of service, numbers alone are not enough. To assure the public that it is applying its policies even-handedly, Facebook should also publish details about the types of posts and accounts it takes down. Its recent transparency report is still a high-level view: for instance, it does not appear to separate the number of accounts removed from the number of pieces of content removed, or to explain how the two overlap. To demonstrate that the company is abiding by its assurances that its "Dangerous Organizations and Individuals" policy applies equally to all groups engaged in organized violence, it should report not only takedowns of "ISIS and al-Qaeda content" but also takedowns of other groups that qualify (some of which are covered under the category of "hate speech," which it is beginning to report). The report also lacks case studies showing how some of the most difficult real-life scenarios are likely to play out, which would give users notice as to whether their activity on the platform is likely to lead to suppression of their posts or even their entire accounts.

Finally, Facebook should build on its current initiative and develop a system of regular, publicly available audits, overseen by a multi-stakeholder group, that measure bias and the impact on civil rights and civil liberties.

Facebook is in an unenviable position, with any decision likely to leave some constituency dissatisfied. Regardless of the position it takes on any given issue, as a platform with over a billion users, it must lean far forward in embracing transparency and accountability. The new standards and audits are a welcome first step, but Facebook should expect its users, civil society, and governments around the world to demand more.
