Fantastic. I've been waiting to see these cases.
Start with a normal person, get them all jacked up on far-right propaganda, then they go kill someone. If the website knows people are being radicalized into violent ideologies and does nothing to stop it, that's a viable claim for wrongful death. It's about foreseeability and causation, not about who did the shooting. There are really a lot of people coming into this thread who obviously have no legal experience.
I just don't understand how hosting a platform to allow people to talk would make you liable since you're not the one responsible for the speech itself.
Is that really all they do, though? That's what they've convinced us they do, but everyone on these platforms knows how crucial it is to tweak your content to please the algorithm. They also do everything they can to become monopolies, without which it wouldn't even be possible to start on DIY videos and end on white supremacy or whatever.
I wrote a longer version of this argument here, if you're curious.
This is a good read, I highly suggest people click the link. Although it is short enough that I think you could have just posted it into your comment.
Yes, but then I couldn't harvest all your sweet data.
Kidding! It's a static site on my personal server that doesn't load anything but the content itself. It's mostly just a PITA to reformat it all for mobile.
Which article is it? The link takes me to the website main page.
Huh really? Do you have JS turned off or anything? Here's the full link: https://theluddite.org/#!post/section-230
Hmm, not sure. I use a client called Memmy for browsing Lemmy. Copying and pasting the link into my browser worked. Thanks!
I bet Memmy cuts off the URL at the "#!" for some reason. I'll submit a bug report to their repo.
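For anyone curious what that kind of bug might look like, here's a minimal, purely hypothetical sketch (not Memmy's actual code) of a link detector whose character class omits "#". It would cut the URL off right before the "#!" and explain why the link opened to the site's main page.

```typescript
// Hypothetical sketch only -- not Memmy's real link handling.
// A URL-matching regex whose character class omits "#" silently drops
// the hashbang fragment, so the client opens the site root instead.
const naiveUrlPattern = /https?:\/\/[\w./-]+/g;

const comment = "Here's the full link: https://theluddite.org/#!post/section-230";
console.log(comment.match(naiveUrlPattern));
// -> [ "https://theluddite.org/" ]  (the "#!post/section-230" part is lost)

// A fragment-aware pattern keeps the whole thing:
const fragmentAwarePattern = /https?:\/\/[\w./-]+(?:#[\w!/-]*)?/g;
console.log(comment.match(fragmentAwarePattern));
// -> [ "https://theluddite.org/#!post/section-230" ]
```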
I agree to a point, but think that depending on how things are structured on the platform side they can have some responsibility.
Think of Facebook. They have algorithms which make sure you see what they think you want to see. It doesn't matter if that content is hateful and dangerous; they will push more of it onto a damaged person and stoke the fires simply because they think it will make them more advertising revenue (a toy sketch of that ranking logic is below).
They should be screening that content and making it less likely for anyone to see it, let alone damaged people. And I guarantee you they know which of their users are damaged people just from comment and search histories.
I'm not sure if Reddit works this way; with the upvote and downvote system, it may be more so the users who decide the content you see. But Reddit does have communities it can keep a closer eye on to prevent hateful and dangerous content from being shared.
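To make the mechanism described above concrete, here is a deliberately oversimplified, purely illustrative sketch of a feed ranker that orders items by nothing but predicted engagement. It is not any platform's actual code; the `FeedItem` shape, `predictedEngagement` field, and `rankFeed` function are invented for the example.

```typescript
// Purely illustrative -- not Facebook's or Reddit's actual ranking code.
// The only signal is predicted engagement (clicks, comments, watch time),
// so whatever a given user reacts to most strongly keeps getting surfaced,
// with no check on whether that content is hateful or radicalizing.
interface FeedItem {
  id: string;
  predictedEngagement: number; // model's estimate of how much this user will interact
}

function rankFeed(items: FeedItem[]): FeedItem[] {
  // Sort descending by predicted engagement; harmfulness never enters the score.
  return [...items].sort((a, b) => b.predictedEngagement - a.predictedEngagement);
}

// Example: the most "engaging" item wins regardless of what it is.
const feed = rankFeed([
  { id: "cat-video", predictedEngagement: 0.2 },
  { id: "local-news", predictedEngagement: 0.4 },
  { id: "outrage-bait", predictedEngagement: 0.9 },
]);
console.log(feed.map((item) => item.id)); // -> [ "outrage-bait", "local-news", "cat-video" ]
```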
Because you are responsible for hiring psychologists to tailor a platform to boost negative engagement, and now there will be a court case to determine culpability.
Reddit is going to have to make the argument that it just boosts “what people like” and it just so happens people like negative engagement.
And I mean it’s been known for decades that people like bad news more than good news when it comes to attention and engagement.
They probably will make that argument, but that doesn't instantly absolve them of legal culpability.
They set the culture.
Did Reddit know people were being radicalized toward violence on its site, and did it act sufficiently to protect foreseeable victims of such radicalization?
Tell that to the admins of lemmy.world defederating from communities because they may be held liable for what shows up on their website.
You mean the cowards who are already operating under a safe-harbor provision of the DMCA?
Sure? I mean I think so. 🤔
We should get the thought police in on this also, stop it before it has a chance to spread. For real though, people need to take accountability for their own actions and stop trying to deflect it onto others.
Something tells me you're not a lawyer.
Something tells me you're wrong and not a lawyer.
Does remindmebot exist on Lemmy? I'd be very interested in a friendly wager.
Loser has to post a pic in a silly shirt!
I don't know but I'm 3 for 3 on these.
I bet that the Supreme Court would uphold the ATF's interpretation on the bump stock ban, that appeals courts would find a violation of the 1A where Trump and other political figures blocked constituents on social media, and that Remington was going to be found liable in the Sandy Hook lawsuit on a theory not wholly dissimilar from the one we're talking about here. I'm pretty good at novel theories of liability.
What silly shirt will you wear?
Mine will say "I'm a T-Rex stuck in a woman's body"
I am not, in fact, a woman. It's a hoot.
Mine will say "Novel theories of civil liability are not my bag, baby!"
In fact they are.
It's a date! No remindmebot but I'll bookmark it.
Like you