this post was submitted on 22 Dec 2023
844 points (96.5% liked)

Technology

More than 200 Substack authors asked the platform to explain why it’s “platforming and monetizing Nazis,” and now they have an answer straight from co-founder Hamish McKenzie:

I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.

While McKenzie offers no evidence to back these ideas, this tracks with the company’s previous stance on taking a hands-off approach to moderation. In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions. “We’re not going to get into specific ‘would you or won’t you’ content moderation questions” over the issue of overt racism being published on the platform, Best said. McKenzie followed up later with a similar statement to the one today, saying “we don’t like or condone bigotry in any form.”

[–] Drivebyhaiku@lemmy.world 1 points 9 months ago* (last edited 9 months ago)

Law is a funny beast. Lots of people do things that are illegal all the time and get away with it, because you basically have to assert your right to be protected by law to activate it. For example, someone yelling at me in public that they are going to kill me is technically a form of assault. I can call the authorities to make sure they don't follow through and to get them to stay the hell away from me, but chances are I am not going to seek restitution in court for something that small. I would have to press charges, seek and pay for legal counsel, everything would need to be processed to make sure the law was properly handled at every point of the arrest, and the punishment would likely be fairly trifling for all my trouble.

Private entities already basically have the prerogative to determine what is permissible on their platforms. Freedom of speech is not practiced under the auspices of Substack: they are allowed to kick you out for whatever the heck they want (some exceptions applying) because they own that space. To remove posts as threats, a judge would have to go through each individual one, source it, bring the original commenter into court, and go through due process with every single user to check the post against their local jurisdiction's laws on threats, and the likely outcome would just be small fines and community service... Quite frankly, the juice would not be worth the squeeze.

On the other hand, we are absolutely allowed to hold the opinion that Substack letting Nazis spread hate speech on their platform under their watch is a moral failure on their part.