"Online platforms use sophisticated and opaque techniques of data collection that endanger young people and put their healthy development at risk," said one children's advocate.
Child welfare advocates renewed calls for U.S. lawmakers to pass a pair of controversial bills aimed at protecting youth from Big Tech's "dangerous and unacceptable business practices" after the Federal Trade Commission published a report Thursday detailing how social media and streaming companies endanger children and teens who use their platforms.
The FTC staff report—entitled A Look Behind the Screens: Examining the Data Practices of Social Media and Video Streaming Services—"shows how the tech industry's monetization of personal data has created a market for commercial surveillance, especially via social media and video streaming services, with inadequate guardrails to protect consumers."
The agency staff examined the practices of Meta's platforms, which include Facebook, Instagram, and WhatsApp; YouTube; X, formerly known as Twitter; Snapchat; Reddit; Discord; Amazon, which owns the gaming site Twitch; and ByteDance, the owner of TikTok.
"The report finds that these companies engaged in mass data collection of their users and—in some cases—nonusers," Bureau of Consumer Protection Director Samuel Levine said in the paper. "It reveals that many companies failed to implement adequate safeguards against privacy risks. It sheds light on how companies used our personal data, from serving hypergranular targeted advertisements to powering algorithms that shape the content we see, often with the goal of keeping us hooked on using the service."
The publication "also finds that these practices pose unique risks to children and teens, with the companies having done little to respond effectively to the documented concerns that policymakers, psychologists, and parents have expressed over young people's physical and mental well-being."
FTC Chair Lina Khan said in a statement that "the report lays out how social media and video streaming companies harvest an enormous amount of Americans' personal data and monetize it to the tune of billions of dollars a year."
"While lucrative for the companies, these surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms, from identify theft to stalking," she added.
Researchers at Boston Children's Hospital and Harvard University published an analysis last December that revealed social media companies made nearly $11 billion in 2022 advertising revenue from U.S.-based users younger than 18.
According to the FTC report:
While the use of social media and digital technology can provide many positive opportunities for self-directed learning, forming community, and reducing isolation, it also has been associated with harms to physical and mental health, including through exposure to bullying, online harassment, child sexual exploitation, and exposure to content that may exacerbate mental health issues, such as the promotion of eating disorders, among other things.
The publication also flags "algorithms that may prioritize certain forms of harmful content, such as dangerous online challenges."
The report accuses social media companies of "willful blindness around child users": the companies claim there are no children on their platforms because their sites do not allow children to create accounts, a posture that may be an attempt to avoid legal liability under the Children's Online Privacy Protection Act Rule (COPPA). Last December, Khan proposed sweeping changes to COPPA to address the issue.
Josh Golin, executive director of Fairplay—a nonprofit organization "committed to helping children thrive in an increasingly commercialized, screen-obsessed culture"—said in a statement that "this report from the FTC is yet more proof that Big Tech's business model is harmful to children and teens."
"Online platforms use sophisticated and opaque techniques of data collection that endanger young people and put their healthy development at risk," Golin added. "We thank the FTC for listening to the concerns raised by Fairplay and a coalition of advocacy groups, and we call on Congress to pass COPPA 2.0, the Children and Teens' Online Privacy Protection Act, and KOSA, the Kids Online Safety Act, to better safeguard our children from these companies' dangerous and unacceptable business practices."
On Wednesday, the House Energy and Commerce Committee voted to advance COPPA 2.0 and KOSA, both of which were overwhelmingly passed by the Senate in July.
However, rights groups including the ACLU condemned KOSA, which the civil liberties organization warned "would violate the First Amendment by enabling the federal government to dictate what information people can access online and encourage social media platforms to censor protected speech."
In May 2023, U.S. Surgeon General Dr. Vivek Murthy issued an advisory on "the growing concerns about the effects of social media on youth mental health."
The White House simultaneously announced the creation of a federal task force "to advance the health, safety, and privacy of minors online with particular attention to preventing and mitigating the adverse health effects of online platforms."
Murthy has also called for tobacco-like warning labels on social media to address the platforms' possible harms to children and teens.
According to a study published in January by the corporate power watchdog Ekō, in just one week that month there were more than 33 million posts on TikTok and Meta-owned Instagram "under hashtags housing problematic content directed at young users," including suicide, eating disorders, skin-whitening, and so-called "involuntary celibacy."