Hold up
I don’t wholly agree with this ruling or its implications – The Encryption Problem, in particular, is a terrible argument that has to die – but I really have to address this section because it’s not accurate:
The trial judge in the California case bought this argument, ruling that because the claims were about “product design and other non-speech issues,” Section 230 didn’t apply. The New Mexico court reached a similar conclusion. Both cases then went to trial.
This distinction — between “design” and “content” — sounds reasonable for about three seconds. Then you realize it falls apart completely.
Here’s a thought experiment: imagine Instagram, but every single post is a video of paint drying. Same infinite scroll. Same autoplay. Same algorithmic recommendations. Same notification systems. Is anyone addicted? Is anyone harmed? Is anyone suing?
Of course not. Because infinite scroll is not inherently harmful. Autoplay is not inherently harmful. Algorithmic recommendations are not inherently harmful. These features only matter because of the content they deliver. The “addictive design” does nothing without the underlying user-generated content that makes people want to keep scrolling.
Instagram has, I’m sure, thousands of videos of paint drying that, I’m also sure, have very few views. Those videos have very few views because part of Instagram’s algorithmic recommendation system is to not serve videos of paint drying to people, because the design goal of Instagram is maximum addiction and use, which would not happen if their algorithm only recommended videos of paint drying.
The scenario of “same infinite scroll, same autoplay, same algorithmic recommendations, same notification systems” is the scenario we’re in right now – where we do have people addicted, we do have people harmed, and people are suing. Constraining Instagram to “only” videos of paint drying is a straw man, because with nothing but paint-drying content, those design decisions have nothing to work with; you’ve effectively neutered the very design that causes the harm. So, yeah, if you eliminate all the design that causes harm, the harm isn’t caused – but that’s not what anyone’s talking about.
First, however, let’s start with what Section 230 actually says:
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
No provider or user of an interactive computer service shall be held liable on account of—
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).
There’s more that I believe isn’t currently relevant, but by all means look and correct me.
In everyday language, what does 230 say? It’s a narrow carve-out from liability, based only on “providers are not necessarily publishers” and “providers can choose what content appears, or doesn’t.”
Now, what are these lawsuits claiming? They claim (I’m going to speak to just Instagram here, but this applies to all the others as well):
- That Instagram, as a system, has been specifically designed to be addictive
- That Instagram, as a system, has been specifically designed to worsen the mental health of its users
- That Instagram, as a system, has been specifically designed to maximize user engagement at the expense of that user
- That children deserve additional protection from hostile systems – just like children get additional protection from advertising – because their brains are still developing and they’re particularly vulnerable to them
None of those are content arguments, and saying, “But what if the content was paint drying?” is not relevant or helpful. People aren’t addicted to “a single Instagram video” or even “a single Instagram channel” (you can probably tell I’m not on Instagram; I’m sure they’re not called “channels”). People are addicted to the system of Instagram that feeds them content specifically tailored to maximize addiction and use, and feeds them content in a way that maximizes addiction and use. For some people that’s makeup videos, for some people that’s movie clips; the specific content is not the point. Hell, there’s probably one guy in Minnesota who’s hopelessly addicted to paint drying videos.
The problem, as with practically everything we’re dealing with in the world, is not single bad actors or individual responsibility. The problem is the system, and the system has in fact – as documented in court – been specifically designed to be addictive, to ruin people’s mental health, and to cause harm. The only way we’re going to be able to address this is by focusing on the system.
Finally, we’ve got to address this statement as well:
If every editorial decision about how to present third-party content is now a “design choice” subject to product liability, Section 230 protects effectively nothing. Every website makes decisions about how to display user content. Every search engine ranks results. Every email provider filters spam. Every forum has a sorting algorithm, even if it’s just “newest first.” All of those are “design choices” that could, theoretically, be blamed for some downstream harm.
Instagram’s targeted recommendation and addiction algorithm dark patterns are not the same thing as “newest first”. This is a slippery slope argument with no evidence that such a slope exists. If “newest first” was equally addictive and harmful, Meta would not have spent probably billions creating its various “engagement” systems. This is like saying a lawsuit against a restaurant that poisoned someone with puffer fish will lead to lawsuits against restaurants for selling salmon because they’re both fish.
Another example: we didn’t ban normal darts after we banned lawn darts, despite their similarities, because the key differences in their designs produced clear and obvious differences in their harmful outcomes. No one’s going to get sued for “newest first,” precisely because of how it differs from the engagement algorithms.
The people and companies who make products have always been responsible for the designs of their products when those designs cause harm, from the lawn dart to the Pinto. And, we have long recognized that mental harms are harms: “Intentional infliction of emotional distress”, for instance, has been a recognized tort for decades. That we now have products that cause mental harm is new simply because we didn’t used to have the technology to create those products. But, “products have designs that cause harm” is not a new concept, and neither is “mental harms are tortable harms”.
Furthermore, “every editorial decision” is not now a “design choice”; just the design choices are. Providers are – still! – not publishers or speakers of third-party content, and – still! – are not liable for moderation. Nothing in these lawsuits can reasonably be construed to impact decisions to publish – or not – specific content, which is all 230 protects. These lawsuits are, fully, not about the content, any more than California’s ban on Amazon’s dark patterns is a ban on having a web store. These lawsuits are fundamentally not about speech, because the problem is not the speech but the system around the speech.
That some people might benefit from social media doesn’t negate the harm done to other people, nor make the company not liable for the harm it causes. No matter how many people found joy and friendship playing lawn darts with their friends, that doesn’t resurrect the kids who died, or replace the eyes that were lost. “Someone who was not harmed by lawn darts” would never be invited to a lawsuit about someone who was harmed by lawn darts; that just doesn’t make sense.
I’ve come down pretty hard here, like I’m fully in favor of these lawsuits. While I definitely believe these social media sites are specifically designed to be harmful, and we do need a way to address that, ehhhhh, the plaintiffs in these cases made some pretty bad arguments. “Encryption is harmful”? Well, guess what: lack of encryption is more harmful! We absolutely can’t be saying that companies are damned if they do, damned if they don’t, and we definitely don’t want to be restricting encryption. As the author rightly points out, mental harms are complex and multifaceted, and it’s difficult to establish reliable causality; I don’t know enough about the people in question to speak to the analysis that happened here, but it probably wasn’t sufficient. That doesn’t mean such an analysis is impossible, though, and being on social media for 16 hours a day is certainly a compelling starting point.
So, more broadly speaking, what should we do about it? I don’t know! There’s a needle that needs to be threaded, and I’m not the one to thread it. The big algorithmic social media sites are really bad and I love every cut that someone gets in against them, but there were certainly arguments being made on the plaintiffs’ side (encryption? Come on!) that were pure BS and bad for everyone.
All that being said, one thing we absolutely must not do is misrepresent the actual harm and problems caused by the systems these companies created, and we need some kind of law or regulation to end it and make them liable for it. Hell, a basic goddamn privacy law would probably get us most of the way there on its own just by cutting down on the fodder that goes into their algorithms. Good luck to us all on that.
Hi John - sorry if there was any confusion in the checkout process, but you definitely don't need to create a PayPal account! In your cart you should see two checkout buttons; the second one is for PayPal, but the first one (the regular checkout button) will allow you to proceed without creating any kind of account.
Yup, still available!
can't move it I'm afraid - though I can delete it if you like and you can repost! In the meantime, I'll link to where I assume it was supposed to go: https://www.techdirt.com/2025/11/14/nut-huggers-apparel-plans-to-battle-back-against-bullshit-buc-ees-bullying/
Fixed that too, thanks
yeah sorry about that - fixed now!
whoops, there was an error in the link - fixed now! thanks
Yeah we were blown away by the quality of so many entries this year!
whoops, correct, thanks! fixed
I don't think this is because he "can't admit he made a mistake" - I think it is because this is exactly what he wants, and he wants everyone to know that he will do it to anyone he pleases.
Though it's a broad rhetorical stroke and not really comparable to the acute diagnosis of these specific government actions as kidnapping, I don't actually have much problem with calling taxation theft if that's really what you want to do - knock yourself out. But while there are many hopeful visions of a stateless future that I will happily or even eagerly entertain, I strongly suspect that they don't line up very well with yours, Stephan.
The administration's position is that as soon as these men first arrived at CECOT, America washed its hands of the whole thing and it no longer has anything to do with them. The purpose of demanding a statement from someone with personal knowledge of Garcia's current whereabouts is to establish whether and to what degree the DHS has in fact continued any active monitoring of these people, and to find individual officials who can be held responsible for fulfilling the court order to facilitate their return. The purpose of evading that demand is to avoid answering that question, and to avoid giving the court anyone to hold responsible. But we do know that many of these men made it to CECOT (there are photos of several of them being held there), and El Salvador says it is proudly holding all 238 of them, and at the moment there's just no particular reason to believe this isn't the case.
At the moment, there is every reason to believe all 238 of these men are being held in CECOT in El Salvador
It's not uncommon for the court to give the government lawyers leeway - but nothing about this situation is common. I think at the very least she could have done what Garcia's lawyers asked: order an official with personal knowledge of his whereabouts to appear before the court. And if it were up to me, order that to happen today.
And all of this is happening just after they openly defied her order this morning, when they missed the deadline for their response by half an hour.
I don't think this administration is actually hellbent on saving money. I think they are hellbent on allocating money and spending any extra money required to finish their project of transforming America into an outright fascist state.
I am not optimistic at all, but this is a real human's life we're talking about - a human being with a family that is fighting for his return. You don't get to just flatly declare him as-good-as-dead.
Noted. Bye forever! 👋
It's still coming at some point, just been very busy
Traffic has been high, which means both more comments and more people voting on them!
"120 something supporters 24 hours ago"? I just checked the dashboard and we had 829 backers yesterday, and 694 the day before that. Your memory isn't so good, I guess.