With the act now in effect for most platforms, the European Commission and member states "must resist any attempts by Big Tech companies to water down implementation," said one expert.
As the European Union's Digital Services Act expanded to cover nearly all online platforms in the bloc on Saturday, Amnesty International stressed the importance of robust enforcement.
"It's a historic day for tech accountability," said Alia Al Ghussain, researcher and adviser on technology and human rights at Amnesty Tech, in a statement. "Today must mark the end of the era of unregulated Big Tech, and for that to happen, the DSA must be robustly enforced to avoid it becoming a paper tiger."
"Today must mark the end of the era of unregulated Big Tech."
E.U. member states and the European Commission "are primarily responsible for the monitoring and enforcement of the additional obligations that apply to Big Tech companies under the DSA," Al Ghussain added. "They must resist any attempts by Big Tech companies to water down implementation and enforcement efforts, and insist on putting human rights at the forefront of this new digital landscape."
Some of the E.U.'s online rulebook took effect in August for 19 major platforms and search engines: Alibaba AliExpress; Amazon; Bing; Booking.com; Apple's App Store; Google's Play, Maps, Search, Shopping, and YouTube; LinkedIn; Meta-owned Facebook and Instagram; Pinterest; Snapchat; TikTok; Wikipedia; X, formerly called Twitter; and Zalando.
The European Commission took its first formal action under the DSA in December, announcing an investigation into X—which is owned by billionaire Elon Musk—for "suspected breach of obligations to counter illegal content and disinformation, suspected breach of transparency obligations, and suspected deceptive design of user interface."
As of Saturday, the DSA applies to all online platforms, with some exceptions for firms that have fewer than 50 employees and an annual turnover below €10 million ($10.78 million)—though those companies must still designate a point of contact for authorities and users as well as have clear terms and conditions.
The DSA bans targeting minors with advertisements based on personal data and targeting all users with ads based on sensitive data such as religion or sexual preference. The act also requires platforms to provide users with: information about advertising they see; a tool to flag illegal content; explanations for content moderation decisions; and a way to challenge such decisions. Platforms are further required to publish a report about content moderation procedures at least once a year.
While companies that violate the DSA could be fined up to 6% of their global annual turnover or even banned in the E.U., imposing such penalties isn't the ultimate goal. According to Agence France-Presse:
Beyond the prospect of fines, Alexandre de Streel of the think tank Centre on Regulation in Europe said the law ultimately aimed to change the culture of digital firms.
"The DSA is a gradual system, everything is not going to change in one minute and not on February 17," he said. "The goal isn't to impose fines, it's that platforms change their practices."
Still, Thierry Breton, a former French tech CEO now serving as the European commissioner for the internal market, said in a statement that "we encourage all member states to make the most out of our new rulebook."
Like Amnesty's Al Ghussain, he stressed that "effective enforcement is key to protect our citizens from illegal content and to uphold their rights."
Earlier this week, Politico reported that "senior E.U. officials like Breton and Věra Jourová, commission vice president for values and transparency, have butted heads over how to sell the rulebook to both companies and the wider public." Internal battles and industry pushback aren't the only barriers to effectively implementing the DSA.
"At the national level, member countries are expected to nominate local regulators by February 17 to coordinate the pan-E.U. rules via a European Board for Digital Services," Politico noted. "That group will hold its first meeting in Brussels early next week. But as of mid-February, only a third of those agencies were in place, based on the commission's own data, although existing regulators in Brussels, Paris, and Dublin are already cooperating."
Campaigners are also acknowledging the shortcomings of the DSA. European Digital Rights on Saturday recirculated a November 2022 essay in which EDRi policy advisers Sebastian Becker Castellaro and Jan Penfrat argued that "the DSA is a positive step forward" but "no content moderation policy in the world will protect us from harmful online content as long as we do not address the dominant, yet incredibly damaging surveillance business model of most large tech firms."
Meanwhile, Al Ghussain said that "to mitigate the human rights risks posed by social media platforms, the European Commission must tackle the addictive and harmful design of these platforms, including changes to recommender systems so that they are no longer hardwired for engagement at all costs, nor based on user profiling by default."