Facebook to Ban Political Ads After Polls Close on Nov. 3, 'Just in Time to Have No Impact Whatsoever'

Facebook CEO Mark Zuckerberg speaks to an audience in 2018. (Photo: Anthony Quintano/Flickr/cc)

Critics noted that the social media giant also recently announced an algorithm change that could "make the site more toxic and less usable while endangering democracy and human rights."

In Facebook's latest attempt to limit the spread of electoral misinformation on its platform, the social media giant announced Wednesday that it will ban political advertisements in the United States after polls close on Election Day. Critics, however, raised concerns about the timing of the policy and other recent moves by the company.

The ad ban was revealed in a new blog post from Guy Rosen, Facebook's VP of integrity, detailing the company's preparations for the U.S. general election.

Rosen wrote that "while ads are an important way to express voice, we plan to temporarily stop running all social issue, electoral, or political ads in the U.S. after the polls close on November 3, to reduce opportunities for confusion or abuse. We will notify advertisers when this policy is lifted."

The advocacy group Public Citizen responded to the move on Twitter by casting doubt on the effectiveness of imposing such a ban only after voting is over.

In a pair of tweets, Sleeping Giants, another advocacy organization, was similarly critical of the timing and called for making the ban permanent.

New York Times technology correspondent Mike Isaac reported that the move came after weeks of Facebook CEO Mark Zuckerberg and his lieutenants watching the race between President Donald Trump and Democratic presidential nominee Joe Biden "with an increasing sense of alarm."

According to Isaac:

Executives have held meetings to discuss President Trump's evasive comments about whether he would accept a peaceful transfer of power if he lost the election. They watched Mr. Trump tell the Proud Boys, a far-right group that has endorsed violence, to "stand back and stand by." And they have had conversations with civil rights groups, who have privately told them that the company needs to do more because Election Day could erupt into chaos, Facebook employees said.

The new ad policy comes after the company introduced "measures to reduce election misinformation and interference on its site just last month," Isaac noted. "At the time, Facebook said it planned to ban new political ads for a contained period, the week before Election Day, and would act swiftly against posts that tried to dissuade people from voting. Mr. Zuckerberg also said Facebook would not make any other changes until there was an official election result."

Fight for the Future suggested in a series of tweets that the ad ban "isn't going to fix the problem at all," pointing to recently announced changes to Facebook's recommendation algorithm which the digital rights advocacy group claims "will make the site even more toxic and ruin one of the last parts of it that are still actually kind of useful: Facebook groups."

"This is just silly. No one is going to need to use paid advertising to make disinformation go wildly viral in the immediate aftermath of November 3rd. They can just ... post it," tweeted Fight for the Future deputy director Evan Greer. "Facebook needs to put an immediate moratorium on algorithmic amplification."

Greer's group has launched a petition urging the social media company to immediately reverse its algorithm decision, warning that "turning on algorithmic amplification for Facebook group posts will make the site more toxic and less usable while endangering democracy and human rights."

Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.