Section 230 needs an update. It needs to be made clear that hosting speech is not a liability.
“Recommending” speech with a black box algorithm that the user can’t control or select IS a liability.

I like that. If you censor or promote speech, you should be responsible for it, to the same extent. You stop being a carrier and start being a curator.
Not quite. These aren’t “free speech” rules.
If a company, organization, or community doesn’t want to host certain kinds of speech, they can remove anything they don’t like. Disney can’t be required to host comments about how Mr. Wheeler was right to drive on the sidewalk, killing people. They couldn’t be sued for it if they did. But that’s already included in 230.
The only important thing that’s changed between the mid '90s and now is that sites actively select and push user-created speech onto people who didn’t choose to see it: speech that wasn’t from a community or user they chose to follow. If that speech leads to harmful behaviour, then sites should be able to be held accountable for the harms.
That’s all I’m saying. Promoting user content is fundamentally different from hosting it. Hosting needs to be protected as it has been. Promoting carries a new level of responsibility. Censorship (when not by the government) is still well within an organization’s rights.
So how does that work with Threadiverse instances, many of which have openly partisan rules? Would that make instances like lemmy.blahaj and dbzer0 liable because they censor specific expressed political and social viewpoints?
Censoring content isn’t the same as recommending content though. The OP was referring to recommendation algorithms specifically.
Hexbear in particular literally stickies posts to their instance and thus, in that sense, ‘recommends’ it. The user replied saying that “If you censor or promote speech, you should be responsible for it, to the same extent.”
Brilliantly articulated. I completely agree with you.
I’m usually a big fan of the EFF, but it’s wrong on this one. If you decentralize to the limit – i.e., such that each user is running their own instance for themselves – it becomes okay for the service to be liable for the user’s speech, because the user and the service owner are one and the same. In reality, (extremely) federated social media is the only kind that can survive without Section 230, and thus repealing it entirely would be a win for the Fediverse.
(You could argue “but users won’t go to the trouble of running their own instance,” but to that I’d say “they will if the law doesn’t give them any other choice, short of not participating at all.”)
The key detail about federated social media is that even self-hosters are still providing content from others. That’s how federation works without* requiring a direct connection from every instance to every other instance. My instance can connect to yours to get your content, but also the content from all other instances that you federate with. And vice-versa.
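That transitive flow of content can be sketched as a toy model. (This is an illustration only, with hypothetical names and structure; it is not real ActivityPub or Lemmy code.)

```python
# Toy model of transitive content delivery in a federated network.
# The class names and methods here are illustrative assumptions,
# not an actual federation protocol implementation.

class Instance:
    def __init__(self, name):
        self.name = name
        self.peers = []   # instances we federate with directly
        self.posts = []   # content authored on this instance

    def federate_with(self, other):
        self.peers.append(other)

    def visible_posts(self):
        # We carry our own posts, our direct peers' posts, and content
        # our peers picked up from *their* peers -- even though we never
        # opened a direct connection to those farther instances.
        seen = list(self.posts)
        for peer in self.peers:
            seen.extend(peer.posts)
            for indirect in peer.peers:
                seen.extend(indirect.posts)
        return seen

a = Instance("a.example")
b = Instance("b.example")
c = Instance("c.example")

a.federate_with(b)   # a connects only to b
b.federate_with(c)   # b connects to c; a never talks to c directly
c.posts.append("post from c.example")

print(a.visible_posts())   # c's post reaches a by way of b
```

The point of the sketch: even a self-hoster’s instance ends up serving third-party content it never directly solicited, which is why the “hosting others’ speech” question doesn’t vanish at the single-user limit.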
So, I understand the EFF’s argument that, without Section 230, I would only federate with extremely small groups that I trust with my full financial life. That would devastate the open social web.
*Thanks for the good-faith typo correction!
That’s how federation works with[out] requiring a direct connection from every instance to every other instance. My instance can connect to yours to get your content, but also the content from all other instances that you federate with. And vice-versa.
So what? That’s like saying ISPs should require Section 230 to avoid liability because they route packets. We’re talking about legality: it’s stuff like intent and responsibility that matters, not the technical details. Each instance owner still gets to decide which other instances they want to federate with; some ‘middle hop’ in that connection is irrelevant.
The fundamental issue that Section 230 is designed to address is the separation between the users posting the content and the platform owners who control who sees it, and the moral hazard that creates. If you eliminate the separation, there’s nowhere left for the moral hazard to exist.
We’re talking about legality: it’s stuff like intent and responsibility that matters, not the technical details.
My point, and my understanding of the EFF article, is that we do need a law that establishes just who can be held responsible, and how so. But maybe you’re imagining a world where that question is moot—in a world where there’s no separation of users and providers*. That would be a world where no one gets rich from internet infrastructure, and I would enjoy that very much.
*Another typo?! Oof.
But maybe you’re imagining a world where that question is moot—in a world where there’s no separation of users and [providers].
Yes, that’s exactly what I’m imagining. (Any tips on how I could’ve made that clearer from my first comment?)
Awesome! No, I don’t think your first comment needs to be different. You explicitly mention taking an extreme limit in the second sentence. I only realized after our first back-and-forth that I was implicitly thinking of a more near/medium-term situation. Like, how do we get from here, now, to the longer-term world we could hope for.
So, that’s how I read the EFF article. But it’s of course OK, and (dare I say!?) possibly even good, that we talk about different views on this stuff! So, thanks :)
I was watching the bipartisan Senate hearing on 230. Even Ted Cruz, and every witness there, said 230 should exist, but it needs to be AMENDED! And they want it to include AI companies. Meaning they’ll be responsible for the content that users create.
I did a reacts video on this for folks that are interested:
https://tubefree.org/w/aKkLkyiz8Spf2csNfgUkdK
Liability or Deniability? Platform Power as Section 230 Turns 30
Upon announcing the hearing, Sen. Cruz said: “Big Tech—the most powerful companies on Earth—can exercise monopoly power to make views they dislike disappear and that should scare everyone. When it comes to viewpoint suppression, however, repealing section 230 might increase censorship. I look forward to hearing from our witnesses and discussing possible reforms to section 230 so online platforms are a free and open marketplace for ideas.”