Apr 21, 2016
While the prospect of a Donald Trump presidency is a terrifying one, perhaps this is scarier: Facebook could use its unprecedented powers to tilt the 2016 presidential election away from him - and the social network's employees have apparently openly discussed whether they should do so.
As Gizmodo reported on Friday, "Last month, some Facebook employees used a company poll to ask [Facebook founder Mark] Zuckerberg whether the company should try 'to help prevent President Trump in 2017'."
Facebook employees are probably just expressing the fear that millions of Americans have of the Republican demagogue. But while there's no evidence that the company plans on taking anti-Trump action, the extraordinary ability that the social network has to manipulate millions of people with just a tweak to its algorithm is a serious cause for concern.
The fact that an internet giant like Facebook or Google could turn an election based on hidden changes to its code has been a hypothetical scenario for years (and it's even a plot point in this season's House of Cards). Harvard Law professor Jonathan Zittrain explained how "Facebook could decide an election without anyone ever finding out", after the tech giant secretly conducted a test during the 2010 midterms in which it allegedly increased voter turnout by 340,000 votes around the country on election day simply by showing users a photo of someone they knew saying "I voted".
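To make that kind of estimate concrete, here is a rough, purely illustrative sketch of how a turnout lift from a randomized test gets scaled up to a headline figure. The numbers below are invented for illustration and are not the study's actual data.

```python
# Back-of-the-envelope estimate of how a turnout experiment's effect is scaled up.
# All figures below are invented for illustration; they are not the study's data.

def estimated_extra_voters(treated_users, treated_turnout, control_turnout):
    """Scale the difference in turnout rates up to the treated population."""
    lift = treated_turnout - control_turnout   # absolute lift per treated user
    return treated_users * lift

# Hypothetical figures: 60 million users shown the social "I voted" message,
# whose turnout was 0.4 percentage points higher than a control group's.
extra = estimated_extra_voters(
    treated_users=60_000_000,
    treated_turnout=0.182,   # 18.2% of treated users voted (invented)
    control_turnout=0.178,   # 17.8% of control users voted (invented)
)
print(f"Estimated additional voters: {extra:,.0f}")  # ~240,000 with these made-up numbers
```

The point is not the arithmetic but the scale: a fraction of a percentage point, applied invisibly to tens of millions of users, adds up to hundreds of thousands of votes.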
Facebook repeated this civic engagement experiment on a broader scale during the 2012 election. While the testing did not favor any one candidate, the potential for that power to be used to manipulate voters became such an obvious concern that Facebook's COO, Sheryl Sandberg, said in 2014: "I want to be clear - Facebook can't control emotions and cannot and will not try to control emotions." She added: "Facebook would never try to control elections."
Her comments came right after a controversial study conducted by Facebook became public. It showed that, in fact, the company had secretly manipulated the emotions of nearly 700,000 people.
Some 78% of Americans have a social network profile of some kind. The dominance of Facebook in Americans' daily lives, and the fact that more people get their news from it than any other source, mean the company's influence over elections has never been greater. And with each year that passes, the potential for an internet giant to swing an election only grows.
Earlier this year, the Guardian reported on the treasure trove of data Facebook holds on hundreds of millions of voters and how it is already allowing presidential candidates to exploit it in different ways:
Facebook, which told investors on Wednesday it was 'excited about the targeting', does not let candidates track individual users. But it does now allow presidential campaigns to upload their massive email lists and voter files - which contain political habits, real names, home addresses and phone numbers - to the company's advertising network. The company will then match real-life voters with their Facebook accounts, which follow individuals as they move across congressional districts and are filled with insightful data.
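To illustrate the kind of list matching the excerpt describes, here is a minimal, hypothetical sketch in Python: a campaign's voter file is joined to platform accounts by hashed email address. The field names, the hashing step, and the data are assumptions made for illustration, not Facebook's actual pipeline.

```python
import hashlib

def hashed(email: str) -> str:
    """Normalise and hash an email so raw addresses never need to be compared directly."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Hypothetical campaign voter file: email plus offline political data.
voter_file = [
    {"email": "alice@example.com", "party": "D", "district": "CA-12"},
    {"email": "bob@example.com",   "party": "R", "district": "TX-07"},
]

# Hypothetical platform accounts, indexed by the same email hash.
accounts_by_hash = {
    hashed("alice@example.com"): {"user_id": 101},
    hashed("carol@example.com"): {"user_id": 102},
}

# Match: any voter whose hashed email appears in the platform's index
# can now be targeted with ads informed by the offline voter data.
matched = [
    {**voter, "user_id": accounts_by_hash[hashed(voter["email"])]["user_id"]}
    for voter in voter_file
    if hashed(voter["email"]) in accounts_by_hash
]
print(matched)  # only Alice matches in this toy example
```

Once the join is made, everything in the offline voter file - party, district, home address - can quietly inform which ads a matched user sees.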
And in a Politico Magazine piece entitled "How Google could rig the 2016 election", research psychologist Robert Epstein described how a study he co-authored in Proceedings of the National Academy of Sciences found that "Google's search algorithm can easily shift the voting preferences of undecided voters by 20% or more - up to 80% in some demographic groups - with virtually no one knowing they are being manipulated."
As Epstein says, much of this manipulation is unintentional: search results on Google are influenced by the popularity of other searches, algorithms are changed all the time for various reasons, and some tweaks that affect what people see about politics may not be the result of malicious engineers bent on changing the country's political persuasions. However, the potential for that to happen is there - and the same risks apply to Facebook.
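Epstein's broader point is that a small, invisible change to a ranking function can reshape what voters see without anyone intending it, or noticing. As a purely illustrative toy model - not Google's or Facebook's actual algorithm - here is how a single hidden bias term can reorder otherwise similar results:

```python
# Toy ranking model: a hidden bias term reorders otherwise similar results.
# Scores and the bias mechanism are invented for illustration only.

results = [
    {"title": "Candidate A economic plan",     "candidate": "A", "relevance": 0.82},
    {"title": "Candidate B economic plan",     "candidate": "B", "relevance": 0.81},
    {"title": "Candidate A scandal coverage",  "candidate": "A", "relevance": 0.79},
    {"title": "Candidate B town hall video",   "candidate": "B", "relevance": 0.78},
]

def rank(items, bias_for=None, bias=0.05):
    """Sort by relevance, optionally nudging one candidate's items upward."""
    def score(item):
        boost = bias if item["candidate"] == bias_for else 0.0
        return item["relevance"] + boost
    return sorted(items, key=score, reverse=True)

print([r["title"] for r in rank(results)])                 # neutral ordering
print([r["title"] for r in rank(results, bias_for="B")])   # same results, quietly reordered
```

The two orderings are built from identical content and near-identical scores; only the hidden nudge differs, which is exactly why such a change would be invisible to the people it affects.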
To be sure, many corporations, including broadcasters and media organisations, have used their vast power to influence elections in all sorts of ways in the past: whether it's through money, advertising, editorials, or simply the way they present the news. But at no time has one company held so much influence over a large swath of the population - 40% of all news traffic now originates from Facebook - while also having the ability to make changes invisibly.
As Gizmodo reported, there's no law stopping Facebook from doing so if it desires. "Facebook can promote or block any material that it wants," UCLA law professor Eugene Volokh told Gizmodo. "Facebook has the same First Amendment right as the New York Times. They can completely block Trump if they want. They can block him or promote him."
To those disgusted by Trump's xenophobia and his boorish, erratic behavior, this might seem like a welcome development. But one organisation having the means to tilt elections one way or another is a dangerous innovation. Once started, it would be hard to control. In this specific case, a majority of the public might approve of the results. But do we really want future elections around the world to be decided by the political persuasions of Mark Zuckerberg, or the faceless engineers who control what pops up in your news feed?
Trevor Timm
Trevor Timm is a co-founder and the executive director of the Freedom of the Press Foundation. He is a writer, activist, and legal analyst who specializes in free speech and government transparency issues. He writes a weekly column for The Guardian and has also contributed to The Atlantic, Al Jazeera, Foreign Policy, Harvard Law and Policy Review, PBS MediaShift, and Politico.