this post was submitted on 25 Oct 2023
81 points (81.9% liked)

[–] fubo@lemmy.world 61 points 10 months ago* (last edited 10 months ago) (2 children)

Deepfakes of an actual child should be considered defamatory use of a person's image; but they aren't evidence of actual abuse the way real CSAM is.

Remember, the original point of the term "child sexual abuse material" was to distinguish images/video made through the actual abuse of a child, from depictions not involving actual abuse -- such as erotic Harry Potter fanfiction, anime characters, drawings from imagination, and the like.

Purely fictional depictions, not involving any actual child being abused, are not evidence of a crime. Even deepfake images depicting a real person, but without their actual involvement, are a different sort of problem from actual child abuse. (And should be considered defamatory, same as deepfakes of an adult.)

But if a picture does not depict a crime of abuse, and does not depict a real person, it is basically an illustration, same as if it was drawn with a pencil.

[–] Uranium3006@kbin.social 10 points 10 months ago (1 children)

Remember, the original point of the term “child sexual abuse material” was to distinguish images/video made through the actual abuse of a child, from depictions not involving actual abuse – such as erotic Harry Potter fanfiction, anime characters, drawings from imagination, and the like.

Although that distinction lasted about a week before the same bad actors who cancel people over incest fanfic started calling all of the latter CSEM too.

[–] fubo@lemmy.world 11 points 10 months ago* (last edited 10 months ago) (1 children)

As a sometime fanfic writer, I do notice when fascists attack sites like AO3 under a pretense of "protecting children", yes.

[–] Uranium3006@kbin.social 6 points 10 months ago

And it's usually fascists, or at least people who may not consider themselves fascists but think and act like them anyway.

[–] Pyro@pawb.social 3 points 10 months ago (3 children)

Add in an extra twist: hopefully, if the sickos are at least satisfied with the AI stuff, they won't seek out the "real" thing.

Sadly, a lot of it does evolve from wanting to "watch" to wanting to do

[–] hoshikarakitaridia@sh.itjust.works 9 points 10 months ago

Sadly, a lot of it does evolve from wanting to "watch" to wanting to do

This is the part where I disagree, and I would love for people to prove me wrong. Whether this is true or false will probably be the deciding factor in allowing or restricting "artificial CSAM".

[–] topinambour_rex@lemmy.world 4 points 10 months ago

Sadly, a lot of it does evolve from wanting to "watch" to wanting to do

Have you got a source for this?

[–] fubo@lemmy.world 2 points 10 months ago (1 children)

Some actually fetishize causing suffering.

[–] JohnEdwa@sopuli.xyz 3 points 10 months ago* (last edited 10 months ago)

Some people are sadists and rapists, yes, regardless of what age group they'd want to do it with.