"We are suing TikTok to protect young people and help combat the nationwide youth mental health crisis," explained New York Attorney General Letitia James.
Attorneys general from over a dozen states and the District of Columbia on Tuesday announced lawsuits against TikTok, accusing the company behind the popular social media platform of deliberately making the site addictive for children and deceiving the public about its dangers.
"We're suing the social media giant TikTok for exploiting young users and deceiving the public about the dangers the platform poses to our youth," Democratic California Attorney General Rob Bonta explained Tuesday morning in San Francisco. "Together, with my fellow state AGs, we will hold TikTok to account, stop its exploitation of our young people, and end its deceit."
New York Attorney General Letitia James, also a Democrat, said in a statement that "young people are struggling with their mental health because of addictive social media platforms like TikTok."
"TikTok claims that their platform is safe for young people, but that is far from true," she continued. "In New York and across the country, young people have died or gotten injured doing dangerous TikTok challenges and many more are feeling more sad, anxious, and depressed because of TikTok's addictive features."
"Today, we are suing TikTok to protect young people and help combat the nationwide youth mental health crisis," James added. "Kids and families across the country are desperate for help to address this crisis, and we are doing everything in our power to protect them."
James' office said in a statement:
TikTok uses a variety of addictive features to keep users on its platform longer, which leads to poorer mental health outcomes. Multiple studies have found a link between excessive social media use, poor sleep quality, and poor mental health among young people. According to the U.S. surgeon general, young people who spend more than three hours per day on social media face double the risk of experiencing poor mental health outcomes, including symptoms of depression and anxiety.
According to James' office, TikTok deploys a range of addictive design features to keep young users engaged on the platform.
The attorneys general also accuse TikTok of violating the Children's Online Privacy Protection Act, which is meant to shield children's online data; of falsely claiming that its platform is safe for children; and of lying about the effectiveness of its so-called safety tools meant to mitigate harms to youth.
In addition to California and New York, the following states are part of the new lawsuit: Illinois, Kentucky, Louisiana, Massachusetts, Mississippi, North Carolina, New Jersey, Oregon, South Carolina, Vermont, and Washington. So is the District of Columbia.
All told, 23 states have now filed lawsuits targeting TikTok's harms to children.
However, the issue is by no means limited to TikTok. Last October, dozens of U.S. states sued Meta—which owns the social media sites Facebook and Instagram—for allegedly violating consumer protection laws by designing their apps to be addictive, especially to minors.
Twitter, the social platform known as X since shortly after it was purchased by Elon Musk in 2022 for $44 billion, was sued in 2021 by child sex trafficking victims for allowing the publication of sexually explicit images of minors and refusing to remove them as requested by the plaintiffs and their parents.
Last month, the U.S. Federal Trade Commission published a report detailing how social media and streaming companies endanger children and teens who use their platforms. The report's publication sparked renewed calls for Congress to pass legislation including the Children and Teens' Online Privacy Protection Act and the Kids Online Safety Act (KOSA) to better safeguard minors against the companies' predatory practices.
However, rights groups including the ACLU condemned KOSA, which the civil liberties organization warned "would violate the First Amendment by enabling the federal government to dictate what information people can access online and encourage social media platforms to censor protected speech."
The two bills—which were overwhelmingly passed by the U.S. Senate in July—were approved for advancement in the House of Representatives last month.
In May 2023, U.S. Surgeon General Dr. Vivek Murthy issued an advisory on "the growing concerns about the effects of social media on youth mental health."
The White House simultaneously announced the creation of a federal task force "to advance the health, safety, and privacy of minors online with particular attention to preventing and mitigating the adverse health effects of online platforms."
Murthy has also called for tobacco-like warning labels on social media to address the platforms' possible harms to children and teens.
Some critics are wary of singling out TikTok—which is owned by the Chinese company ByteDance—for political or xenophobic purposes.
Earlier this year, U.S. President Joe Biden signed into law a $95 billion foreign aid package containing a possible nationwide TikTok ban. The legislation requires ByteDance to sell TikTok to a non-Chinese company within a year or face a federal ban. TikTok subsequently sued the federal government over the potential ban.
Approximately 170 million Americans use TikTok, which is especially popular among members of Gen Z and small-to-medium-sized businesses, and contributes tens of billions of dollars to the U.S. economy annually.
Evan Greer, who heads the digital rights group Fight for the Future, slammed the law as "one of the stupidest and most authoritarian pieces of tech legislation we've seen in years."
However, children's advocates welcomed the new lawsuits.
"We are pleased to see so many state attorneys general holding TikTok accountable for deliberately causing harms to young people," said Josh Golin, executive director of Fairplay. "Between state and private lawsuits, state legislation, and Federal Trade Commission enforcement actions, the tide is turning against Big Tech, and it's clear the status quo of social media companies harming kids cannot and will not continue."
"Now we need leaders in the House to join their Senate counterparts in passing the Kids Online Safety Act and the Children and Teens' Online Privacy Protection Act so that all platforms, not just those involved in legal settlements, will have to be safe by design for children from day one," Golin added.
As a physician I saw the deadly consequences of government inaction on AIDS. Today, I see a parallel—with big tech.
In the late 1990s, I was working as a physician in Zambia when I pulled my car over at the side of the road. I couldn’t have prepared myself for what I would see at the intersection right in front of me: coffins of all sizes—including tiny ones for children and babies—for sale.
Perhaps I shouldn’t have been shocked. I had been watching many friends and colleagues die of AIDS, and the hospital wards were overflowing with dying patients—two to every bed, with others lying on the hallway floors. People diagnosed with HIV were living without hope, as the medicines that could slow the progress of the disease were largely unavailable to those outside the U.S. and Europe.
Still, the sight of the coffins lined up neatly by the road remains one of the starkest and most disturbing memories from my time in Zambia.
This experience led me to found the Global AIDS Alliance in 2000. Leading the organization, I helped mobilize a movement to pressure lawmakers on both sides of the aisle to fund programs that would get medicines to those diagnosed with HIV, no matter where they lived. At first, it was disheartening to see how little many lawmakers and people in power seemed to care that people were dying preventable deaths.
As our movement built power, we eventually succeeded: the U.S. government launched the President’s Emergency Plan for AIDS Relief, and the global community mobilized to create the Global Fund to Fight AIDS, Tuberculosis and Malaria, two organizations that have saved over 40 million lives in the last 20 years.
Today, I fear I am seeing a similar pattern on another issue: child safety.
As a survivor of familial childhood sexual violence, I have been a longtime advocate for laws to protect children from the trauma that I went through at the hands of my father.
Fortunately, my trauma happened in the 1960s, decades before the internet; children who are violated today are often photographed and videoed by their perpetrators. That content is often shared on the internet and on storage platforms like iCloud, where it can circulate endlessly.
In the last decade it has become increasingly clear that social media and tech companies are irresponsibly allowing sexual violence, harassment, and exploitation to spread widely across their platforms. On a daily basis, children experience abuse, harassment, and cyberbullying, which is driving too many young people into a mental health crisis.
Yet the U.S. government has been tragically slow to take action.
Recently, the Senate grilled the CEOs of five big tech companies on child safety in an unprecedented hearing. Among those attending were the parents and loved ones of young people who have been harmed by social media, including some who died as a result.
Though the hearing was a good first step, much more needs to be done.
Countless young people—including my friend Leah Juliett, a survivor of image-based sexual violence whose underage nude images still circulate on the internet despite their efforts to get them removed—continue to be harmed by social media and technology companies. And companies like Apple, which has refused to detect and remove known images depicting sexual abuse of children, were not required to testify.
The hearing has yet to result in any concrete legislative action—and time is running out. Young people across the country continue to be traumatized every day by this inaction. Were it not for the fact that my abuse took place decades ago and was never documented or uploaded to the internet, I might have been in the same situation as my friend Leah.
I urge President Biden and Congress to take urgent action—starting by requiring companies to enact policies to detect and remove known images depicting child sexual violence and abuse, and to prevent cyberbullying. We need to treat this crisis of neglect of children like the public health emergency that it is.
I hope that the government will listen to survivors like myself and Leah—rather than turn a blind eye like I witnessed during the AIDS crisis. There’s an opportunity to save the lives of many young people, but if we don’t act soon it will be too late. We hear their silence, and we hope that they will hear all of our voices and be brave!
"Anyone who believes that children deserve to explore and play online without being tracked and manipulated should support this update."
To ensure that tech giants will not have "carte blanche with kids' data," as one advocacy group said, the Federal Trade Commission on Wednesday unveiled major proposed changes to the United States' online privacy law for children for the first time in a decade, saying that companies' evolving practices and capabilities require stronger protection for young people.
"Kids must be able to play and learn online without being endlessly tracked by companies looking to hoard and monetize their personal data," said FTC Chair Lina Khan as she announced proposed changes to the Children's Online Privacy Protection Act (COPPA) of 1998.
For more than two decades, the law has restricted companies' online tracking of children through social media apps, video games, and advertising networks by requiring firms to obtain parental consent before gathering or using young users' personal information.
Platforms like Instagram and TikTok have sought to comply with COPPA by prohibiting children under age 13 from having accounts and requiring users to provide their birth dates, but regulators have accused several tech companies of failing to adequately protect children, and firms including Amazon, Google, and Epic Games have been hit with multimillion-dollar penalties for COPPA violations.
Under the proposed changes, companies would face stricter limits on collecting, using, and retaining children's personal data.
Education technology firms would also be permitted to collect and use students' personal information only for school-authorized educational purposes and not for commercial use, and COPPA Safe Harbor programs—industry groups which are permitted to seek FTC approval for self-regulatory guidelines that are the same as or stronger than COPPA's—would be required to publicly disclose their membership lists and report additional information to the FTC.
Haley Hinkle, policy counsel for children's digital advocacy group Fairplay, said the proposed rulemaking builds on the FTC's recent enforcement actions against companies including Epic Games—which charged users, including children, for unwanted purchases—and Meta, which misled parents about controls on its Messenger Kids app and about who could access kids' data.
"With this critical rule update, the FTC has further delineated what companies must do to minimize data collection and retention and ensure they are not profiting off of children's information at the expense of their privacy and well-being," Hinkle said. "Anyone who believes that children deserve to explore and play online without being tracked and manipulated should support this update."
Katharina Kopp, director of policy for the Center for Digital Democracy, said that strengthened online safeguards are "urgently needed" as markets and web companies increasingly pursue children "for their data, attention, and profits."
The rule will "help stem the tidal wave of personal information gathered on kids," Kopp said.
"The commission's plan will limit data uses involving children and help prevent companies from exploiting their information," she added. "These rules will also protect young people from being targeted through the increasing use of AI, which now further fuels data collection efforts. Young people 12 and under deserve a digital environment that is designed to be safer for them and that fosters their health and well-being. With this proposal, we should soon see less online manipulation, purposeful addictive design, and fewer discriminatory marketing practices."
Khan said the updated rules will make clear that safeguarding children's data online is the responsibility of tech firms.
"The proposed changes to COPPA are much-needed, especially in an era where online tools are essential for navigating daily life—and where firms are deploying increasingly sophisticated digital tools to surveil children," she said. "Our proposal places affirmative obligations on service providers and prohibits them from outsourcing their responsibilities to parents."
Zamaan Qureshi, co-chair of the Design It for Us coalition, said the proposed rules, which are subject to a 60-day public comment period before the FTC votes on them, "will make kids and teens much safer online."
"We applaud the FTC's new proposed rules that strengthen COPPA by centering the experience of children online, rather than Big Tech's bottom line," said Qureshi. "The proposed rule directly targets Big Tech's toxic business model by requiring the invasive practice of surveillance advertising to be off by default, limiting harmful nudges that keep young people coming back to the platform even when they don't want to, and including protections against the collection of biometric information."
"The FTC is acting where Congress has failed to, imposing strict rules for Big Tech who have spent years profiting off our personal information and data," Qureshi added. "This proposal should signal to Congress to act quickly in 2024 to advance bipartisan kids' privacy and safety legislation."