Confused NY Court Says That Section 230 Doesn’t Block Ridiculous Lawsuit Blaming Social Media For Buffalo Shooter
from the the-blame-game dept
Can you imagine what kind of world we’d live in if you could blame random media companies for tangential relationships they had with anyone who ever did anything bad? What would happen if we could blame newspapers for inspiring crime? Or television shows for inspiring terrorism? The world would be a much duller place.
We’ve talked a lot about how the entire purpose of Section 230 of the Communications Decency Act is to put liability on the right party. It’s about making sure the correct party gets sued, rather than wasting everyone’s time suing some random party, especially in “Steve Dallas”-type lawsuits, where you sue some random company that is only tangentially connected to a legal violation because it has the deepest pockets.
Unfortunately, a judge in the NY Supreme Court (which, bizarrely, is NY’s lowest level of courts) has allowed just such a lawsuit to move forward. It was filed by the son of a victim of the racist dipshit who went into a Buffalo supermarket and shot and killed a bunch of people a couple years ago. That is, obviously, a very tragic situation. And I can certainly understand the search for someone to blame. But blaming “social media” because someone shot up a supermarket is ridiculous.
It’s exactly the kind of thing that Section 230 was designed to get tossed out of court quickly.
Of course, NY officials spent months passing the blame and pointing at social media companies. NY’s Governor and Attorney General wanted to deflect blame from the state’s massive failings in handling the situation. I mean, the shooter had made previous threats, and law enforcement in NY had been alerted to those threats and failed to stop him. He also used a high-capacity magazine that was illegal in NY, and law enforcement failed to stop him. Also, when people in the store called 911, the dispatcher didn’t believe them and hung up on them.
The government had lots of failings that aren’t being investigated, and lawsuits aren’t being filed over those. But, because the shooter also happened to be a racist piece of shit on social media, people want to believe we should be able to magically sue social media.
And, the court is allowing this based on a very incorrect understanding of Section 230. Specifically, the court has bought into the trendy, but ridiculous, “product liability” theory, that is now allowing frivolous and vexatious plaintiffs across the country to use this “one weird trick” to get around Section 230. Just claim “the product was defective” and, boom, the court will let the case go forward.
That’s what happened here:
The social media/internet defendants may still prove their platforms were mere message boards and/or do not contain sophisticated algorithms thereby providing them with the protections of the CDA and/or First Amendment. In addition, they may yet establish their platforms are not products or that the negligent design features plaintiff has alleged are not part of their platforms. However, at this stage of the litigation the Court must base its ruling on the allegations of the complaint and not “facts” asserted by the defendants in their briefs or during oral argument and those allegations allege viable causes of action under a products liability theory.
Now, some might say “no big deal, the court says they can raise this issue again later,” but nearly all of the benefit of Section 230 is in how it gets these frivolous cases tossed early. Otherwise, the expense of these cases adds up and creates a real mess (along with the pressure to settle).
Also, the judge here seems very confused. Section 230’s protections are not limited to “mere message boards.” It protects any interactive computer service from being held liable for third-party speech. And whether or not a service “contains sophisticated algorithms” should have zero bearing on whether Section 230 applies.
The test for whether Section 230 applies is quite simple: (1) Is this an interactive computer service? (2) Would holding it liable in this scenario mean holding it liable for the speech of someone else? (3) Does the claim fall outside the law’s limited exceptions (i.e., intellectual property law, sex trafficking, or federal criminal law)? That’s it. Whether you’re a message board or whether you use algorithms has nothing to do with it.
Again, this kind of ruling only encourages more such vexatious litigating.
Outside of Section 230, the social media defendants went to the heart of the matter, making clear that there’s no causal link between “dipshit being a dipshit on social media” and “dipshit going on a murderous rampage.”
And, again, the judge doesn’t seem to much care, saying that a jury can figure that out:
As a general proposition the issue of proximate cause between the defendants’ alleged negligence and a plaintiff’s injuries is a question of fact for a jury to determine. Oishei v. Gebura 2023 NY Slip Op 05868, 221 AD3d 1529 (4th Dept 2023). Part of the argument is that the criminal acts of the third party, break any causal connection, and therefore causation can be decided as a matter of law. There are limited situations in which the New York Court of Appeals has found intervening third party acts to break the causal link between parties. These instances are where “only one conclusion may be drawn from the established facts and where the question of legal cause may be decided as a matter of law.” Derdiarian v Felix Contr. Corp., 51 NY2d 308 at 315 (1980). These exceptions involve independent intervening acts that do not flow from the original alleged negligence.
Again, though, getting this kind of case to a jury would be crazy, and a massive waste of everyone’s resources. Anyone looking at this case objectively would recognize that it is not just frivolous and vexatious, but that it creates really terrible incentives all around.
If these kinds of cases are allowed to continue, you will get more such frivolous lawsuits over anything bad that happens. Worse, you will get much less speech online overall, as websites will have incentives to take down or block any speech that isn’t Sesame Street-level in tone. Any expression of anger, complaint, or unhappiness about anything could otherwise be treated as proof that the social media site was “defective in its design” for not magically connecting that expression to future violence.
That would basically be the end of any sort of forum for mental health. It would be the end of review sites. It would be the end of all sorts of useful websites, because the liability that could accrue from just one user on those forums saying something negative would be too much. If just one person in those forums then took action in the real world, people could blame the site and sue it for not magically stopping the real-world violence.
This would be a disaster for the world of online speech.
As Eric Goldman notes, it seems unlikely that this ruling will survive on appeal, but it’s still greatly problematic:
I am skeptical this opinion will survive an appeal. The court disregards multiple legal principles to reach an obviously results-driven decision from a judge based in the emotionally distraught community.
The court doesn’t cite other cases involving similar facts, including Gibson v. Craigslist and Godwin v. Facebook. One of the ways judges can reach the results they want is by selectively ignoring the precedent, but that approach doesn’t comply with the rule of law.
This opinion reinforces how the “negligent design” workaround to Section 230 will functionally eliminate Section 230 if courts allow plaintiffs to sue over third-party content by just relabeling their claims.
Separately, I will note my profound disappointment in seeing a variety of folks cheering on this obviously problematic ruling. Chief among them: Free Press. I don’t always agree with Free Press on policy prescriptions, but generally, their heart is in the right place on core internet freedom issues. Yet, they put out a press release cheering on this ruling.
In the press release, they claim (ridiculously) that letting this case move forward is a form of “accountability” for those killed in the horrific shooting in Buffalo. But anyone should recognize that there are people to hold liable for what happened there: most obviously the shooter himself. Trying to hold random social media sites liable for not somehow stopping future real-world violence is beyond silly. As described above, it’s also extremely harmful to the free speech causes that Free Press claims as part of its mission.
I’m surprised and disappointed to see them take such a silly stance, one that undermines their credibility. But I’m even more disappointed in the court for ruling this way.
Filed Under: blame, buffalo shooting, liability, product liability, section 230, wayne jones
Companies: google, reddit, youtube
Comments on “Confused NY Court Says That Section 230 Doesn’t Block Ridiculous Lawsuit Blaming Social Media For Buffalo Shooter”
Willfully and intentionally “confused.”
This is… good? We could launch a class action suit against the KOSA, EARN IT, STOP CSAM, and RESTRICT bills blaming them for every mass shooting event that has occurred ever since they were first announced.
Wasn’t the Buffalo shooter an active /pol/ user who was inspired by the Christchurch shooter (specifically his manifesto that he distributed on various chan boards)? Why on earth is Facebook being sued? He didn’t even stream on Facebook like the Christchurch shooter did, he did it on Twitch. Am I missing something?
Re:
/pol/ is far right. We already know the far right gets kid gloves. If Ammon and Cliven bundy had been black, they’d have been shot to death at zero-hour.
This comment has been flagged by the community.
Re: Re:
So what?
More Black people should die at the hands of Whites, given the vast majority are killed by other coons.
Re: Re: Re:
You sound like a 12-year old pretending to be a 4channer.
Re: Re: Re:2
I think you’ve accidentally increased his actual age by a factor of two.
Re: Re: Re:2
And I don’t want that insurrectionist scum in 4chan.
Re: Re: Re:
You’re a dork lol.
Re: Re: Re:
You’ll forever be unloved and unhappy, trapped in circles of unproductive and insightful losers who have literally nothing going for them but their skin color.
And that’s why you’re angry. It doesn’t get you as far as it used to. You’re finding you have to actually apply yourself and provide something of value, and you can’t do it.
You’re every bit as redundant as you imagine those of other races to be.
This comment has been flagged by the community.
Re: Re: Re:2
Uninsightful losers. You and your circles of unproductive uninsightful losers.
Re: Re: Re:3
That one hit hard eh bro?
Re: Re: Re:
Much edge
So lord
Very basement
Re:
Whoever has the most money is the target.
It also means they might just pay a settlement to make you go away to avoid even more breathless “reporting” about how they are the source of all bad things in the country.
If you only stick to your principles when you like the result you don't have any
Separately, I will note my profound disappointment in seeing a variety of folks cheering on this obviously problematic ruling. Chief among them: Free Press. I don’t always agree with Free Press on policy prescriptions, but generally, their heart is in the right place on core internet freedom issues. Yet, they put out a press release cheering on this ruling.
The true test of a person/group’s principles is whether they’ll stick to them even when they might not like the result and/or who it’ll require them to side with.
Did they address the Gonzalez and Twitter decisions by SCOTUS? How is this case different (aside from the attacks happening overseas)?
Re:
Technically the issues are different, though, yes, it did strike me how fundamentally similar it feels. But because it’s under a different law (this being product liability and that one under terrorism laws) it’s a different issue.
While the judge doesn’t address the cases directly, I imagine the answer would be that those cases would be addressed later on in the process.
Just Go Away
The problem here is that many people just want social media to go away. They don’t want Facebook, Twitter, Reddit, or pretty much any forum on the internet to exist anymore. It’s like how the media companies acted back in the 90s when people first started sharing MP3s and other content. They wanted the internet to just go away. Because that’s the only solution here. Even with the most draconian moderation, you’re still going to have people who try to get around it. And they will. It’s behavior we see all the time. Tell people they cannot discuss something, and they’ll invent codewords to get around it, or they will talk about peripheral things.
And the sad thing is, even if social media was outlawed, people still wouldn’t be happy. Because they’d wonder why they can no longer share pictures with their family, or discuss how they hate certain people.
Re:
Sad but true fact.
Re:
Precisely this.
The Dumbpublicans gleefully rubbing their hands at the idea of GenZs getting kicked off social media because those platforms get banned don’t have the brainpower to realize that if this happens, Alex Jones gets banned too.
Then again that is precisely the demographic fucked in the head enough to pay Jones for supplements.
Re: Re:
Not to mention the hypocrisy: if this happened, they’d suddenly want a forum, after being against them.
And they wonder why we didn’t find a cure for cancer yet.
🤦
Re: Re:
The Dumbpublicans
I thought NY was a blue state.
This comment has been flagged by the community.
AI judges
can fix this, if we treat the law like software code.
Re:
Have ‘law makers’ use agile?
LOL
Have regular ‘law’ reviews, like review the law before it goes live?
LOL
Run simulations of the ‘law’ with performance testing?
Perform End to End testing?
Huge LOL
Re:
Humans can’t agree on semantic meaning, so it would be impossible to train an AI to understand the nuances in a manner that was fair without significant improvements to the technology. You’d just get AI that was biased towards the data it was trained on, the way chatbots start to get racist when you feed them racist content.
There’s also a certain amount of subjective judgment that goes into justice and we can’t even agree if every violation of law should be considered a crime.
Do you think you should receive a ticket for every time you’ve ever committed a traffic violation? You’d be in jail or debt for the rest of your life.
Re: Re:
Ironically, Jay would, at least until he lands in jail for his revenge porn conviction.
Man, if only human behavior operated in the manner suggested by these lawsuit “theories”, I’d find a way to start a popular platform so as to moderate every violent asshat until they stopped being violent asshats. That’s like freakin’ magic.
Re:
If social media companies are liable for violence committed by users, then gun manufacturers and pro-gun politicians should be ahead of the line for liability.
That’s the problem with bad precedents. To get even with one site, Omegle (which I was never keen on), a dangerous precedent was set, and now they’re trying their luck with the same argument all over the place.
The only thing it creates is a big mess.
This comment has been flagged by the community.
Section 230 enabled Craigslist, which enabled them to destroy newspapers that relied on advertising, and allowed pirates to steal my mailing list.
Re:
Hey John Smith.
Maybe you should move to India, they literally have scam artist as a job opportunity there.
Hell, going by the Kitboga videos, they have even longer mailing lists than you ever got.
This comment has been flagged by the community.
Re: Re:
Leigh, when Masnick gets fucked by the press release I’m going to launch, you’re going to have a front row seat. I’m going to pry open that whore mouth of yours and shit down your fucking throat.
Re: Re: Re:
If you had anything that was newsworthy, it’d be out by now.
Meanwhile, didn’t you admit you were using Amazon to hoover up data to sell to data brokers?
Re: Re: Re:
I remember those threats, Jhon. When you used to go by “horse with no name”. They were as toothless and spineless and pathetic as you in your nursing home ward back then and it’s the exact same thing now.
The best thing you have going for you is shaking your fist and making vague threatening promises just to feel like a top again.
Re: Re: Re:
Hey Jhon boi is this like the last 50 times you threatened us with an imminent (insert bullshit argument) release?
Or is this just the usual IMPOTENT blathering by a false scam artist with a tiny dick?
Cum at me bro
Re: Re: Re:2
John Smith is false, and John Smith is a scam artist. But I don’t think he’s a false scam artist.
Re:
…hallucinated nobody mentally competent, ever.
Re:
No one ever stole your shitty cold call list.
By the way I’m still waiting for the local/state/federal police visit/lawsuit/fist fight you promised me bro.
So what. Who cares?! That is now going to be my new attitude, since that is the attitude I have encountered from everyone to whom I have reported the ten-year-long, premeditated, cold-blooded, sadistic organized crime spree that has been perpetrated against me.
The FBI, police, sheriffs, and government tell people reporting organized crime and terrorism to talk to crisis workers, which is basically telling the victims or witnesses reporting the crime, “So what, we don’t care.” It is just a lazy cop-out to evade all responsibility for actually doing their jobs.
That is what happens when the illegal Orwellian terrorist network took over every sector of government, legal sector, academia, healthcare, financial sector, and every other sector of the economy.
So what who cares.
Re:
I thought this article was about …
“Confused NY Court Says That Section 230 Doesn’t Block Ridiculous Lawsuit Blaming Social Media For Buffalo Shooter”
So What .. Who Cares has become a standard response to off topic complaining in comment sections on the web. Get over it.
Re:
You do.
So very, very, very much.
To the point where you reply to yourself constantly, post after post.