I feel like you can't really change 230; you need to instead legislate differently. There is room for more criminal liability when things go wrong, I think. But civil suits in the US can be really bogus. Like, someone could likely sue a Mastodon instance for turning their kid trans and win without Section 230.
I'm with you on the legislate differently part.
The background of Section 230(c)(2) is an unfortunate 1995 court ruling (Stratton Oakmont v. Prodigy) that held that if you moderate any content whatsoever, you should be regarded as its publisher (and therefore ought to be legally liable for whatever awful nonsense your users put on your platform). This perversely created an incentive for web forum operators (and a then-fledgling social media industry) not to moderate content at all in order to gain immunity from liability, and that in turn transformed broad swathes of the social internet into an unmoderated cesspool full of Nazis, conspiracy theories, and vaccine disinformation, all targeting people who lack the critical thinking faculties to process it responsibly.
The intent of 230(c)(2) was to encourage platform operators to feel safe moderating harmful content, but it also protects them if they don't. The result is a wild west, if you will, in which it's perfectly legal for social media operators in the USA to look the other way when known-unlawful use of their platforms (like advertising stolen goods, sex trafficking, coordinating a coup attempt, or making porn labeled 'underage' searchable) goes on.
It was probably done in good faith, but in hindsight it was naïve and carved out the American internet as a magical zone of no responsibility.
This is not really what 230 does. Sites still face criminal liability where needed; if I ran a site hosting illegal content, I could still be arrested and have my server seized. Repealing 230 would legit just let Ken Paxton launch a multi-state lawsuit suing a long list of queer Mastodon instances for "transing minors." Without 230 it would be lawsuit land, and sites would censor anything that wasn't cat photos in an effort to avoid getting sued. Lawsuits are expensive even when you win. If you wanna make social media companies deal with something, you gotta set up criminal liability, not repeal 230. 230 just protects sites from civil suits, not criminal ones.