[-] sharkfinsoup@lemmy.ml -4 points 3 months ago

But they literally cannot moderate their platform. The volume of content Facebook sees every minute would bankrupt any company that actually hired enough people to review it all and decide what's acceptable and what isn't. And that's before considering the mental and emotional damage a person suffers just from seeing all the vile and despicable shit that gets posted. AI moderation isn't advanced enough, and the cost of human moderation is so great, that the giant social media companies will pretty much never be able to self-moderate. Reddit could only moderate itself (to an extent) because it had an endless supply of free mods. Facebook doesn't have that same luxury.

[-] breadsmasher@lemmy.world 18 points 3 months ago

Sounds like they should be bankrupt then

[-] GluWu@lemm.ee 8 points 3 months ago

Why pay $5 million a year for 100 mods at $50k/year when you can just pay a few hundred thousand in fines and let the government move the walls of your garden for you?
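The arithmetic in that comment can be sketched out; all figures below are the comment's own rough numbers (the fine amount in particular is an assumed stand-in for "a few hundred thousand"), not real Meta financials:

```python
# Illustrative sketch of the cost comparison above.
# All figures are the comment's rough numbers, not real data.
mods = 100
salary_per_mod = 50_000                      # $50k/year each
moderation_cost = mods * salary_per_mod      # $5,000,000/year payroll
assumed_fines = 300_000                      # "a few hundred thousand" (assumed)

print(f"Moderation payroll: ${moderation_cost:,}/year")
print(f"Assumed fines:      ${assumed_fines:,}/year")
print(f"Fines are roughly {moderation_cost / assumed_fines:.0f}x cheaper")
```

On those assumptions, eating the fines is more than an order of magnitude cheaper than staffing up.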

[-] BonesOfTheMoon@lemmy.world 12 points 3 months ago

I guess. But they could do a better job with user reported content which they very much don't.

[-] Dudewitbow@lemmy.zip 6 points 3 months ago

IIRC Meta on its own spends billions on content moderation, much more than other companies generally do. The problem with content moderation is that you only see the stuff they miss, not the stuff they already filtered out.

On the topic of weeding out CSAM, a surprising example of a company giving up on it is Nintendo. Flipnote (a 3DS application that let you send post-it notes to others) was used by predators in Japan to lure children. Nintendo deemed it not moderatable and removed it; no chat replacement has functionally taken its place since.

Moderation is super tough, and you can hear some really fucked up stories about what these people go through and how it affected their lives, even from those who have to review extra content (e.g. people who filter content in China due to government surveillance).

[-] BonesOfTheMoon@lemmy.world 4 points 3 months ago

I've reported probably a thousand pictures of swastika tattoos and shit they don't remove, and people calling people homophobic slurs. I don't think anyone reviews those reports.

[-] Dudewitbow@lemmy.zip 2 points 3 months ago

Because on the list of stuff they're filtering out, that's probably a low priority compared to content like CSAM or actual murder, which gets them into legal trouble if that kind of content runs wild.

[-] PrinceWith999Enemies@lemmy.world 4 points 3 months ago

That’s what externalization looks like. In the fossil fuel industry, it’s creating polluting products without having to bear the costs. In chemical companies, it’s physically polluting the environment. Same with mining companies, etc.

In social media, it is a refusal to manage content in a responsible manner, whether it’s CSAM or disinformation campaigns or hate speech. That externalization is what allows them to pay the salaries that they do, and invest in r&d, and increase their stock values to ridiculous levels. Meta is a trillion dollar company and it needs to rebalance its priorities.

[-] ICastFist@programming.dev 3 points 3 months ago

But they literally cannot moderate their platform

They can, but doing so will affect profits. They used to outsource moderation to Kenyans, who got paid in pennies. Sama, the company doing said "moderation", apparently stopped offering that kind of work.

Worth noting: FB and fuckzuck knew that moderation would be a problem for a big platform with millions of daily users. They didn't care back in 2012, and they don't care now. "Not our problem", for all intents and purposes, just like their nonexistent customer support.

In the corporate world, profits are always more important than safety, health, and other civilian nonsense. Last time I checked Instagram via the app, I saw three ads for obvious pyramid schemes, not much different from my previous check in 2023. Hey, scammers are paying for ad space, so why should Zuck care?

this post was submitted on 26 Mar 2024
215 points (97.4% liked)
