The feed has eyes. What you share to stay connected now feeds one of the world’s largest surveillance machines. This isn’t paranoia, it’s policy. You do not need to speak to be seen. Every word you read, every post you linger on, every silence you leave behind is measured and stored. The watchers need no warrant—only your attention.
Each post, like, and photograph you share enters a room you cannot see. The visible audience, friends and followers, is only the front row. Behind them sit analysts, contractors, and automated systems that harvest words at scale. Over the last decade, the federal security apparatus has turned public social media into a continuous stream of open-source intelligence. What began as episodic checks for imminent threats matured into standing watch floors, shared databases, and automated scoring systems that never sleep. The rationale is familiar: national security, fraud prevention, situational awareness. The reality is starker: Everyday conversation now runs through a mesh of government and corporate surveillance that treats public speech, and the behavior around it, as raw material.
You do not need to speak to be seen. The act of being online is enough. Every scroll, pause, and click is recorded, analyzed, and translated into behavioral data. Algorithms study not only what we share but what we read and ignore, and how long our eyes linger. Silence becomes signal, and absence becomes information. The watchers often need no warrant for public content or purchased metadata, only your connection. In this architecture of observation, even passivity is participation.
This did not happen all at once. It arrived through privacy impact assessments, procurement notices, and contracts that layered capability upon capability. The Department of Homeland Security (DHS) built watch centers to monitor incidents. Immigration and Customs Enforcement folded social content into investigative suites that already pull from commercial dossiers. Customs and Border Protection (CBP) linked open posts to location data bought from brokers. The FBI refined its triage flows for threats flagged by platforms. The Department of Defense and the National Security Agency fused foreign collection and information operations with real-time analytics.
Little of this resembles a traditional wiretap, yet the effect is broader because the systems harvest not just speech but the measurable traces of attention. Most of it rests on the claim that publicly available information is fair game. The law has not caught up with the scale or speed of the tools. The culture has not caught up either.
The next turn of the wheel is underway. Immigration and Customs Enforcement plans two round-the-clock social media hubs, one in Vermont and one in California, staffed by private contractors for continuous scanning and rapid referral to Enforcement and Removal Operations. The target turnaround for urgent leads is 30 minutes. That is not investigation after suspicion. That is suspicion manufactured at industrial speed. The new programs remain at the request-for-information stage, yet align with an unmistakable trend. Surveillance shifts from ad hoc to ambient, from a hand search to machine triage, from situational awareness to an enforcement pipeline that links a post to a doorstep.
The line between looking and profiling thins because the input is no longer just what we say but what our attention patterns imply.
Artificial intelligence makes the expansion feel inevitable. Algorithms digest millions of posts per hour. They perform sentiment analysis, entity extraction, facial matching, and network mapping. They learn from the telemetry that follows a user: time on page, scroll depth, replay of a clip, the cadence of a feed. They correlate a pseudonymous handle with a résumé, a family photo, and a travel record. Data brokers fill in addresses, vehicles, and associates. What once took weeks now takes minutes. Scale is the selling point. It is also the danger. Misclassification travels as fast as truth, and error at scale becomes a kind of policy.
George Orwell warned that “to see what is in front of one’s nose needs a constant struggle.” The struggle today is to see how platform design, optimized for engagement, creates the very data that fuels surveillance. Engagement generates signals, signals invite monitoring, and monitoring, once normalized, reshapes speech and behavior. A feed that measures both speech and engagement patterns maps our concerns as readily as our views.
Defenders of the current model say agencies only view public content. That reassurance misses the point. Public is not the same as harmless. Aggregation transforms meaning. When the government buys location histories from data brokers, then overlays them with social content, it tracks lives without ever crossing a courthouse threshold. CBP has done so with products like Venntel and Babel Street, as documented in privacy assessments and Freedom of Information Act releases. A phone that appears at a protest can be matched to a home, a workplace, a network of friends, and an online persona that vents frustration in a late-night post. Add behavioral traces from passive use, where someone lingers and what they never click, and the portrait grows intimate enough to feel like surveillance inside the mind.
The FBI’s posture has evolved as well, particularly after January 6. Government Accountability Office reviews describe changes to how the bureau receives and acts on platform tips, along with persistent questions about the balance between public safety and overreach. The lesson is not that monitoring never helps. The lesson is that systems built for crisis have a way of becoming permanent, especially when they are fed by constant behavioral data that never stops arriving. Permanence demands stronger rules than we currently have.
Meanwhile, the DHS Privacy Office continues to publish assessments for publicly available social media monitoring and situational awareness. These documents describe scope and mitigations, and they reveal how far the concept has stretched. As geospatial, behavioral, and predictive analytics enter the toolkit, awareness becomes analysis, and analysis becomes anticipation. The line between looking and profiling thins because the input is no longer just what we say but what our attention patterns imply.
The First Amendment restrains the state from punishing lawful speech. It does not prevent the state from watching speech at scale, nor does it account for the scoring of attention. That gap produces a chilling effect that is hard to measure yet easy to feel. People who believe they are watched temper their words and their reading. They avoid organizing, and they avoid reading what might be misunderstood. This is not melodrama. It is basic social psychology. Those who already live closer to the line feel the pressure first: immigrants, religious and ethnic minorities, journalists, activists. Because enforcement databases are not neutral, they reproduce historical biases unless aggressively corrected.
Error is not theoretical. Facial recognition has misidentified innocent people. Network analysis has flagged friends and relatives who shared nothing but proximity. A meme or a lyric, stripped of context, can be scored as a threat. Behavioral profiles amplify risk because passivity can be interpreted as intent when reduced to metrics. The human fail-safe does not always work because human judgment is shaped by the authority of data. When an algorithm says possible risk, the cost of ignoring it feels higher than the cost of quietly adding a name to a file. What begins as prudence ends as normalization. What begins as a passive trace ends as a profile.
Fourth Amendment doctrine still leans on the idea that what we expose to the public is unprotected. That formulation collapses when the observer is a system that never forgets and draws inferences from attention as well as expression. Carpenter v. United States recognized a version of this problem for cell-site records, yet the holding has not been extended to the government purchase of similar data from brokers or to the bulk ingestion of content that individuals intend for limited audiences. First Amendment jurisprudence condemns overt retaliation against speakers. It has little to say about surveillance programs that corrode participation, including the act of reading, without ever bringing a case to court. Due process requires notice and an opportunity to contest. There is no notice when the flag is silent and the consequences are dispersed across a dozen small harms, each one deniable. There is no docket for the weight assigned to your pauses.
Wendell Phillips wrote, “Eternal vigilance is the price of liberty.” The line is often used to defend surveillance. It reads differently from the other side of the glass. The public must be vigilant about those who claim vigilance as a mandate without bounds. A republic cannot outsource its conscience to machines and contractors.
You cannot solve a policy failure with personal hygiene, but you can buy time. Treat every post as a public record that might be copied, scraped, and stored. Remove precise locations from images. Turn off facial tagging and minimize connections between accounts. Separate roles. If you organize, separate that work from family and professional identities with different emails, phone numbers, and sign-ins. Use two-factor authentication everywhere. Prefer end-to-end encrypted tools like Signal for sensitive conversations. Scrub photo metadata before upload. Search your own name and handles in a private browser, then request removal from data-broker sites. Build a small circle that helps one another keep settings tight and recognize phishing and social engineering. These habits are not retreat. They are discipline.
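Scrubbing photo metadata, mentioned above, is usually done with everyday tools (a phone's built-in sharing options, or utilities like exiftool). As an illustration only, the sketch below shows the underlying idea in plain Python: JPEG files store EXIF and XMP metadata, including GPS coordinates, in APP1/APP2 marker segments, and a file can be rewritten with those segments dropped. The function name and the exact set of segments removed are this sketch's own assumptions, not a reference implementation.

```python
import struct

def strip_jpeg_metadata(data: bytes) -> bytes:
    """Return a copy of a JPEG byte stream with APP1/APP2 segments removed.

    APP1 typically holds EXIF and XMP metadata (including GPS coordinates);
    APP2 often holds ICC color profiles. Everything else is copied through.
    """
    if data[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG stream")
    out = bytearray(b"\xff\xd8")  # keep the SOI marker
    i = 2
    while i + 1 < len(data):
        if data[i] != 0xFF:
            out += data[i:]  # unexpected byte: copy the remainder verbatim
            break
        marker = data[i + 1]
        if marker == 0xD9 or (0xD0 <= marker <= 0xD7) or marker == 0x01:
            out += data[i:i + 2]  # standalone markers carry no length field
            i += 2
            continue
        (seg_len,) = struct.unpack(">H", data[i + 2:i + 4])
        segment = data[i:i + 2 + seg_len]
        if marker not in (0xE1, 0xE2):  # drop only the metadata segments
            out += segment
        i += 2 + seg_len
        if marker == 0xDA:  # SOS: entropy-coded image data follows
            out += data[i:]
            break
    return bytes(out)
```

A dedicated tool remains the safer choice in practice, since other formats (PNG, HEIC) store metadata differently.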
Adopt the same care for reading as for posting. Log out when you can, block third-party trackers, limit platform time, and assume that dwell time and scroll depth are being recorded. Adjust feed settings to avoid autoplay and personalized tracking where possible. Use privacy-respecting browsers and extensions that reduce passive telemetry. Small frictions slow the flow of behavioral data that feeds automated suspicion.
Push outward as well. Read the transparency reports that platforms publish. They reveal how often governments request data and how often companies comply. Support groups that litigate and legislate for restraint, including the Electronic Frontier Foundation, the Brennan Center for Justice, and the Center for Democracy and Technology. Demand specific reforms: warrant requirements for government purchase of location and browsing data, public inventories of social media monitoring contracts and tools, independent audits of watch centers with accuracy and bias metrics, and accessible avenues for redress when the system gets it wrong. Insist on disclosure of passive telemetry collection and retention, not only subpoenas for content.
The digital commons was built on a promise of connection. Surveillance bends that commons toward control. It does so quietly, through dashboards and metrics that reward extraction of both speech and attention. The remedy begins with naming what has happened, then insisting that the rules match the power of the tools. A healthy public sphere allows risk. It tolerates anger and error. It places human judgment above automated suspicion. It restores the burden of proof to the state. It recognizes that attention is speech by another name, and that freedom requires privacy in attention as well as privacy in voice.
You do not need to disappear to stay free. You need clarity, patience, and a stubborn loyalty to truth in a time that rewards distraction. The watchers will say the threat leaves no choice, that vigilance demands vision turned outward. History says freedom depends on the courage to look inward first. The digital world was built as a commons, a place to connect and create, yet it is becoming a hall of mirrors where every glance becomes a record and every silence a signal. Freedom will not survive by accident. It must be practiced—one mindful post, one untracked thought, one refusal to mistake visibility for worth. The right to be unobserved is not a luxury. It is the quiet foundation of every other liberty. Guard even the silence, for in the end it may be the only voice that still belongs to you.
One ACLU campaigner blasted the justices for "giving the executive branch unprecedented power to silence speech it doesn't like."
The United States Supreme Court on Friday unanimously upheld a federal law banning TikTok if its Chinese parent company does not sell the popular social media app by Sunday.
The justices ruled in TikTok v. Garland, in an unsigned opinion, that "Congress has determined that divestiture is necessary to address its well-supported national security concerns regarding TikTok's data collection practices and relationship with a foreign adversary."
"The problem appears real and the response to it not unconstitutional," the high court wrote. "Speaking with and in favor of a foreign adversary is one thing. Allowing a foreign adversary to spy on Americans is another."
President Joe Biden signed legislation last April forcing ByteDance, which owns TikTok, to sell the app to a non-Chinese company within a year or face a nationwide ban. Proponents of the ban cited national security concerns, while digital rights and free speech defenders condemned the law.
Approximately 170 million Americans use TikTok, which is especially popular with younger people and small-to-medium-sized businesses, and contributes tens of billions of dollars to the U.S. economy annually.
The ACLU—which this week called TikTok v. Garland "one of the most important First Amendment cases of our time"—condemned Friday's decision as "a major blow to freedom of expression online."
"The Supreme Court's ruling is incredibly disappointing, allowing the government to shut down an entire platform and the free speech rights of so many based on fear-mongering and speculation," ACLU National Security Project deputy director Patrick Toomey said in a statement.
"By refusing to block this ban, the Supreme Court is giving the executive branch unprecedented power to silence speech it doesn't like, increasing the danger that sweeping invocations of 'national security' will trump our constitutional rights," Toomey added.
The digital rights group Electronic Frontier Foundation (EFF) said in response to Friday's ruling, "We are deeply disappointed that the court failed to require the strict First Amendment scrutiny required in a case like this, which would've led to the inescapable conclusion that the government's desire to prevent potential future harm had to be rejected as infringing millions of Americans' constitutionally protected free speech."
"We are disappointed to see the court sweep past the undisputed content-based justification for the law—to control what speech Americans see and share with each other—and rule only based on the shaky data privacy concerns," EFF added.
The Biden administration said Friday that it would leave enforcement of any ban up to the incoming Trump administration.
The Washington Post reported Thursday that Republican U.S. President-elect Donald Trump, who is set to take office next week, is weighing an executive order to suspend enforcement of the ban for 60-90 days.
U.S. Sen. Ed Markey (D-Mass.), who earlier this week introduced a bill to delay ByteDance's sale deadline until October, said Friday: "I am deeply disappointed by the Supreme Court's decision to uphold the TikTok ban. I am not done fighting to pass my 270-day extension. We need more time."
"We are suing TikTok to protect young people and help combat the nationwide youth mental health crisis," explained New York Attorney General Letitia James.
Attorneys general from over a dozen states and the District of Columbia on Tuesday announced lawsuits against TikTok, accusing the company behind the popular social media platform of deliberately making the site addictive for children and deceiving the public about its dangers.
"We're suing the social media giant TikTok for exploiting young users and deceiving the public about the dangers the platform poses to our youth," Democratic California Attorney General Rob Bonta explained Tuesday morning in San Francisco. "Together, with my fellow state AGs, we will hold TikTok to account, stop its exploitation of our young people, and end its deceit."
New York Attorney General Letitia James, also a Democrat, said in a statement that "young people are struggling with their mental health because of addictive social media platforms like TikTok."
"TikTok claims that their platform is safe for young people, but that is far from true," she continued. "In New York and across the country, young people have died or gotten injured doing dangerous TikTok challenges and many more are feeling more sad, anxious, and depressed because of TikTok's addictive features."
"Today, we are suing TikTok to protect young people and help combat the nationwide youth mental health crisis," James added. "Kids and families across the country are desperate for help to address this crisis, and we are doing everything in our power to protect them."
James' office said in a statement:
TikTok uses a variety of addictive features to keep users on its platform longer, which leads to poorer mental health outcomes. Multiple studies have found a link between excessive social media use, poor sleep quality, and poor mental health among young people. According to the U.S. surgeon general, young people who spend more than three hours per day on social media face double the risk of experiencing poor mental health outcomes, including symptoms of depression and anxiety.
According to James' office, TikTok keeps young users engaged through a range of deliberately addictive design features.
The attorneys general also accuse TikTok of violating the Children's Online Privacy Protection Act, which is meant to shield children's online data; of falsely claiming that its platform is safe for children; and of lying about the effectiveness of its so-called safety tools meant to mitigate harms to youth.
In addition to California and New York, the following states are part of the new lawsuit: Illinois, Kentucky, Louisiana, Massachusetts, Mississippi, North Carolina, New Jersey, Oregon, South Carolina, Vermont, and Washington. So is the District of Columbia.
All told, 23 states have now filed lawsuits targeting TikTok's harms to children.
However, the issue is by no means limited to TikTok. Last October, dozens of U.S. states sued Meta—which owns the social media sites Facebook and Instagram—for allegedly violating consumer protection laws by designing their apps to be addictive, especially to minors.
Twitter, the social platform known as X since shortly after it was purchased by Elon Musk in 2022 for $44 billion, was sued in 2021 by child sex trafficking victims for allowing the publication of sexually explicit images of minors and refusing to remove them as requested by the plaintiffs and their parents.
Last month, the U.S. Federal Trade Commission published a report detailing how social media and streaming companies endanger children and teens who use their platforms. The report's publication sparked renewed calls for Congress to pass legislation including the Children and Teens' Online Privacy Protection Act and Kids Online Safety Act (KOSA) to better safeguard minors against the companies' predatory practices.
However, rights groups including the ACLU condemned KOSA, which the civil liberties organization warned "would violate the First Amendment by enabling the federal government to dictate what information people can access online and encourage social media platforms to censor protected speech."
The two bills—which were overwhelmingly passed by the U.S. Senate in July—were last month approved for advancement in the House of Representatives.
In May 2023, U.S. Surgeon General Dr. Vivek Murthy issued an advisory on "the growing concerns about the effects of social media on youth mental health."
The White House simultaneously announced the creation of a federal task force "to advance the health, safety, and privacy of minors online with particular attention to preventing and mitigating the adverse health effects of online platforms."
Murthy has also called for tobacco-like warning labels on social media to address the platforms' possible harms to children and teens.
Some critics are wary of singling out TikTok—which is owned by the Chinese company ByteDance—for political or xenophobic purposes.
Earlier this year, U.S. President Joe Biden signed into law a $95 billion foreign aid package containing a possible nationwide TikTok ban. The legislation requires ByteDance to sell TikTok to a non-Chinese company within a year or face a federal ban. TikTok subsequently sued the federal government over the potential ban.
Approximately 170 million Americans use TikTok, which is especially popular among members of Gen-Z and small-to-medium-sized businesses, and contributes tens of billions of dollars to the U.S. economy annually.
Evan Greer, who heads the digital rights group Fight for the Future, slammed the law as "one of the stupidest and most authoritarian pieces of tech legislation we've seen in years."
However, children's advocates welcomed the new lawsuits.
"We are pleased to see so many state attorneys general holding TikTok accountable for deliberately causing harms to young people," said Josh Golin, executive director of Fairplay. "Between state and private lawsuits, state legislation, and Federal Trade Commission enforcement actions, the tide is turning against Big Tech, and it's clear the status quo of social media companies harming kids cannot and will not continue."
"Now we need leaders in the House to join their Senate counterparts in passing the Kids Online Safety Act and the Children and Teens' Online Privacy Protection Act so that all platforms, not just those involved in legal settlements, will have to be safe by design for children from day one," Golin added.