Technology
This is the official technology community of Lemmy.ml for all news related to the creation and use of technology, and to facilitate civil, meaningful discussion around it.
Ask in a DM before posting product reviews or ads; otherwise, such posts are subject to removal.
Rules:
1: All Lemmy rules apply
2: Do not make low-effort posts
3: NEVER post naziped*gore stuff
4: Always post article URLs or their archived version URLs as sources, NOT screenshots. Help the blind users.
5: Personal rants about Big Tech CEOs like Elon Musk are unwelcome (this does not include posts about their companies affecting a wide range of people)
6: No advertisement posts unless verified as legitimate and non-exploitative/non-consumerist
7: Crypto-related posts, unless essential, are disallowed
Okay, thanks for the clarification
Everyone except you still very much includes drawn and AI-generated pornographic depictions of children in the basket of problematic content that should get filtered out of federated instances, so thank you very much, but I'm not sure your point changed anything.
They are not saying it shouldn't be defederated; they are saying that reporting this to the authorities is pointless and that treating it as CSAM is harmful.
Everybody understands there's no real kid involved. I still don't see an issue reporting it to authorities and all the definitions of CSAM make a point of including simulated and illustrated forms of child porn.
https://en.m.wikipedia.org/wiki/Child_pornography
What's the point of reporting it to authorities? It's not illegal, nor should it be because there's no victim, so all reporting it does is take up valuable time that could be spent tracking down actual abuse.
Definitions of CSAM definitely do not include illustrated and simulated forms. They do not have a victim and therefore cannot be abuse. I agree that it should not be allowed on public platforms, which is why all instances hosting it should be defederated. Even so, it is not illegal, so reporting it to the authorities is a waste of time for you and for the authorities who are trying to remove and prevent actual CSAM.
CSAM definitions absolutely include illustrated and simulated forms. Just check the sources on the Wikipedia link and climb your way up; you'll see "cartoons, paintings, sculptures, ..." in the wording of the PROTECT Act.
They don't actually need a victim to be defined as such
That Wikipedia article is about CP, a broader topic. Practically zero authorities will include illustrated and simulated forms of CP in their definitions of CSAM.
I assumed it was the same thing, but if you're placing the bar of acceptable content below child porn, I don't know what to tell you.
That's not what I was debating. I was debating whether or not it should be reported to authorities. I made it clear in my previous comment that it is disturbing and should always be defederated.
Ah. It depends on the jurisdiction the instance is in.
Mastodon has a lot of lolicon shit on Japan-hosted instances for that reason.
Lolicon is illegal under the US PROTECT Act of 2003 and in plenty of other countries.
If you don't think images of actual child abuse, against actual children, are infinitely worse than some ink on paper, I don't care about your opinion on anything.
You can be against both. Don't ever pretend they're the same.
Step up the reading comprehension, please.
I understand what you're saying and I'm calling you a liar.