
Not a good look for Mastodon - what can be done to automate the removal of CSAM?

[–] priapus@sh.itjust.works 3 points 1 year ago (1 children)

Definitions of CSAM definitely do not include illustrated and simulated forms. They do not have a victim and therefore cannot be abuse. I agree that it should not be allowed on public platforms, which is why all instances hosting it should be defederated. Despite this, it is not illegal, so reporting it is a waste of time for you and for the authorities who are trying to remove and prevent actual CSAM.

[–] balls_expert@lemmy.blahaj.zone 1 points 1 year ago* (last edited 1 year ago) (1 children)

CSAM definitions absolutely include illustrated and simulated forms. Just check the sources on the Wikipedia link and climb your way up; you'll see "cartoons, paintings, sculptures, ..." in the wording of the PROTECT Act.

They don't actually need a victim to be defined as such

[–] priapus@sh.itjust.works 1 points 1 year ago (1 children)

That Wikipedia article is about CP, a broader topic. Practically zero authorities will include illustrated and simulated forms of CP in their definitions of CSAM.

[–] balls_expert@lemmy.blahaj.zone 1 points 1 year ago* (last edited 1 year ago) (1 children)

I assumed it was the same thing, but if you're placing the bar of acceptable content below child porn, I don't know what to tell you.

[–] priapus@sh.itjust.works 1 points 1 year ago (1 children)

That's not what I was debating. I was debating whether it should be reported to authorities. I made it clear in my previous comment that it is disturbing and should always be defederated.

Ah. It depends on the jurisdiction the instance is in.

Mastodon has a lot of lolicon shit on Japan-hosted instances for that reason.

Lolicon is illegal under the US PROTECT Act of 2003 and in plenty of other countries.