"The FTC has rightly recognized Meta simply cannot be trusted with young people's sensitive data and proposed a remedy in line with Meta's long history of abuse of children," Golin added.
Jeff Chester, executive director of the Center for Digital Democracy, similarly said that the FTC's move "is a long-overdue intervention into what has become a huge national crisis for young people."
"Meta and its platforms are at the center of a powerful commercialized social media system that has spiraled out of control, threatening the mental health and well-being of children and adolescents," he asserted. "The company has not done enough to address the problems caused by its unaccountable data-driven commercial platforms."
The FTC said in a statement that the tech giant, which changed its parent company name from Facebook to Meta in 2021, "has failed to fully comply with the order, misled parents about their ability to control with whom their children communicated through its Messenger Kids app, and misrepresented the access it provided some app developers to private user data."
The 2020 order, which the social media company agreed to the previous year, came out of the Cambridge Analytica scandal. It involved a $5 billion fine—which critics condemned as far too low—and followed a 2012 order also related to privacy practices.
"Facebook has repeatedly violated its privacy promises," Samuel Levine, director of the FTC's Bureau of Consumer Protection, declared Wednesday. "The company's recklessness has put young users at risk, and Facebook needs to answer for its failures."
The commission specifically accuses Meta of violating both the 2012 and 2020 orders as well as the FTC Act and the Children's Online Privacy Protection Act (COPPA) Rule. Commissioners are proposing a blanket ban against monetizing the data of minors, pausing the launch of new products and services, extending compliance to merged companies, limiting future uses of facial recognition technology, and strengthening privacy requirements.
The changes would apply to not only Facebook but also other Meta platforms such as Instagram, Oculus, and WhatsApp.
The commission voted 3-0 to issue an order to show cause—though Commissioner Alvaro Bedoya also put out a statement questioning whether the agency has the authority to implement some of the proposals. Meta now has 30 days to respond, after which the FTC will make a final decision on whether to move forward with the changes.
In a statement Wednesday, Meta spokesperson Andy Stone took aim at the commission leader specifically, saying that "FTC Chair Lina Khan's insistence on using any measure—however baseless—to antagonize American business has reached a new low."
Stone also claimed that the FTC's attempt to modify the 2020 order "is a political stunt," accused the commission of trying to "usurp the authority of Congress to set industrywide standards," and vowed to "vigorously fight this action."
While praising the FTC effort and blasting Meta, advocates for children concurred with the company's spokesperson on one point: the need for broader U.S. governmental action to address industry practices.
"Amid a continuing rise in shocking incidents of suicide, self-harm, and online abuse, as well as exposés from industry 'whistleblowers,' Meta is unleashing even more powerful data gathering and targeting tactics fueled by immersive content, virtual reality, and artificial intelligence, while pushing youth further into the metaverse with no meaningful safeguards," said Chester. "Parents and children urgently need the government to institute protections for the 'digital generation' before it is too late."
"Today's action by the FTC limiting how Meta can use the data it gathers will bring critical protections to both children and teens," he continued. "It will require Meta/Facebook to engage in a proper 'due diligence' process when launching new products targeting young people—rather than its current 'release first and address problems later' approach. The FTC deserves the thanks of U.S. parents and others concerned about the privacy and welfare of our 'digital generation.'"
After also applauding the FTC "for its efforts to hold Meta accountable," Golin called on Congress to pass the Children and Teens' Online Privacy Protection Act, or COPPA 2.0, "because all companies should be prohibited from misusing young people's sensitive data, not just those operating under a consent decree."
Public Citizen executive vice president Lisa Gilbert said in a statement that "kids should never have been used as an engine of profit for Meta, and it's great that the FTC is continuing to act aggressively. Until Congress acts on its promise to ensure privacy for kids and adults online, it's critical that the agency boldly enforces the law."
Though backed by some child advocacy groups, a few legislative proposals intended to protect children online—including the Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act and Kids Online Safety Act (KOSA)—have alarmed organizations that warn about endangering digital privacy and free expression, as Common Dreams reported Tuesday.
As Sens. Ed Markey (D-Mass.) and Bill Cassidy (R-La.) on Wednesday reintroduced COPPA 2.0, Fight for the Future director Evan Greer—who has openly criticized the other measures—said that "we think federal data privacy protections should cover EVERYONE, not just kids, but overall this is a bill that would do some good and it does not have the same censorship concerns as bills like KOSA."