The New York Times published a blockbuster story about Facebook that exposed how the company used so-called "smear merchants" to attack organizations critical of the platform. The story was shocking on a number of levels, revealing that Facebook's hired guns stooped to dog-whistling, anti-Semitic attacks aimed at George Soros and writing stories blasting Facebook's competitors on a news site they managed. As Techdirt points out, however, while the particulars are different, the basic slimy tactics are familiar. Any organization that runs public campaigns in opposition to large, moneyed corporate interests has seen some version of this "slime your enemies" playbook.
What is different here is that Facebook, the company seeking to undermine its critics, has a powerful role in shaping whether and how news and information is presented to billions of people around the world. Facebook controls the hidden algorithms and other systems that decide what comes up in Instagram and Facebook experiences. And it does so in a way that is almost completely beyond our view, much less our control.
The fact that Facebook can secretly influence what we see, perhaps including criticism of itself, both through what it promotes and what it allows others to post, is deeply disturbing. Users deserve answers from Facebook to some basic questions about these practices.
The more than 2.6 billion people who use Facebook globally, whom the company unironically calls its "community," should demand much more than self-serving responses or empty apologies delivered the day before a holiday weekend. Facebook employees, who also have tremendous power to pressure their employer, should join users in demanding that Facebook come clean.
The ongoing hidden nature of Facebook's algorithmic decision-making, however, plus the fact that it took a major newspaper exposé to bring this to light, means Facebook probably can't be trusted to provide the answers users require. Facebook must grant neutral, third-party investigators the access needed to determine whether it is misusing its position as our information purveyor to wage its own ugly propaganda war.
Going forward, we must also demand openness from Facebook about how it uses its power to buttress its financial and policy positions. Facebook can have policy positions, and it can even use its own platform to promote them. But it should only do so if it is up front with users about those practices and makes it crystal clear when it uses its power to put a finger on the scales to influence what they see. Only then can users make an informed decision about whether the platform is where they want to be.
Most importantly, this incident confirms that we should double down on pressure on Facebook and Instagram to provide users with more control over their experience on the platforms. We must support and develop competition, including concrete steps to promote data portability and interoperability. Congress can help by removing the legal blocks to a healthier Internet it created through the overbroad Computer Fraud and Abuse Act (CFAA) and Digital Millennium Copyright Act (DMCA). We must also ensure that click-wrap contracts and API restrictions can't be used to block competing and interoperable online services. As we've said before:
If it were more feasible for users to take their data and move elsewhere, Facebook would need to compete on the strength of its product rather than on the difficulty of starting over. And if the platform were more interoperable, smaller companies could work with the infrastructure Facebook has already created to build innovative new experiences and open up new markets. Users are trapped in a stagnant, sick system. Freeing their data and giving them control are the first steps towards a cure.
Facebook's smear campaign should spur policymakers and the rest of us to ask serious questions about Facebook's power as our information supplier. And once those questions are answered, we should take the steps necessary to restore a healthy information ecosystem online.
Common Dreams is powered by optimists who believe in the power of informed and engaged citizens to ignite and enact change to make the world a better place. We're hundreds of thousands strong, but every single supporter makes the difference. Your contribution supports this bold media model—free, independent, and dedicated to reporting the facts every day. Stand with us in the fight for economic equality, social justice, human rights, and a more sustainable future. As a people-powered nonprofit news outlet, we cover the issues the corporate media never will.