A small-government intervention will clean up the public market and force Threads—and Meta—to build a better, safer sewing machine.
As a kid, I worked in a men’s store tailor shop on the East Side of Cleveland. It was chaos, watching master tailors cut, sew, and press tiny threads into modern fashion. My job was to clean the shop, oil the machines, and keep the steam presses hydrated. Thread was everywhere and constantly needed to be swept up, as each garment was crafted with care and purpose.
Whether Meta founder Mark Zuckerberg realized it or not, the name of his new text-based social media platform, Threads, is the perfect metaphor for the kind of platform we’ve all been craving. Will it be sewn into something beautiful, or will it become just another tangled mess that needs to be swept up?
Elon Musk’s decisions at the helm of Twitter and the longstanding issues surrounding the lack of controls against bullies and bots have disgusted millions of users. But is jumping ship to a new platform—owned by a flawed company that has not cleaned up its own issues—the way we want to engage?
Social media fashions have changed since we first logged on over a decade ago. We are no longer excited by chaos, stunts, or gimmicks, or by learning basic HTML to customize our MySpace backgrounds. Many of us just want an uncluttered, simple social platform that’s bully- and bot-free and isn’t trying to sell us stuff we don’t want or need. Adam Mosseri, the head of Instagram, knows this, and was quoted in The New York Times saying he wants “Threads to be a ‘friendly place’ for public conversation.”
But is that even possible, given that Threads has seemingly already fallen short on protections? After my first day on Threads, I was already facing issues that have plagued Twitter—a blatantly similar type of platform—for years: fake profiles and bots were following my account.
If Threads wants to succeed, it needs a bobbin to keep it running smoothly. Think of it as adding simple guardrails that keep the threads from jamming the machine. Without this basic intervention, we already know the downward spiral that comes next.
We have watched social networks, including Meta, fight to keep and expand archaic protections granted in the 1996 Communications Decency Act. Those protections were designed to let companies like AOL and Prodigy be treated as blind infrastructure, like a telephone line, never liable for the communications carried over their wires.
These laws were created before there were modern-day social networks, let alone billions of dollars in advertising revenue being moved through them.
Unfortunately, as these platforms compete to become the largest network in the free market, without any intervention or protections, they will create more of the same bot-driven cesspools, spreading misinformation and disinformation and promoting false advertising. There is no real incentive for them to do anything different in the United States. Threads has not yet launched in the European Union, where privacy laws are stricter. It also has yet to introduce advertising, but that is just a matter of time.
Now is the time to evolve the Communications Decency Act so that the next generation of social networks is sewn into a more wearable garment. This is not un-American. Think back to that famous Thomas Jefferson quote: “We might as well require a man to wear still the coat which fitted him when a boy as civilized society to remain ever under the regimen of their barbarous ancestors.” Let’s follow this lead and advance our social platforms by evolving Section 230 of the 1996 Communications Decency Act, forcing these powerful companies to take accountability for their actions.
Historically, Twitter took performative action to resolve or remove bots and fake accounts only when its executives were about to testify before Congress or ahead of a major election. The company was well known for putting out self-congratulatory press releases about how it clamped down on and removed tons of bots and bad actors—but let’s be honest, it never implemented long-term fixes to these known problems.
A simple change in liability, the bobbin, will ensure social networks run more smoothly by forcing them to focus on their consumers. It will make these companies spend resources on security measures, monitoring technology, and even staff to review advertising for accuracy, just like every other media outlet in America.
In other words, a small-government intervention will clean up the public market and force Threads—and Meta—to build a better, safer sewing machine. One that does not allow its users to be threatened by hate speech or acts of violence without real consequences.
It’s time for Congress to take out their brooms, evolve the Communications Decency Act, and help clean up these threads.