helpImTrappedOnline@lemmy.world 150 points 5 months ago (last edited 5 months ago)

The headline/title needs to be extended to include the rest of the sentence:

"and then sent them to a minor"

Yes, this sicko needs to be punished. Any attempt to make him the victim of "the big bad government" is manipulative at best.

Edit: made the quote bigger for better visibility.

cley_faye@lemmy.world 49 points 5 months ago

That's a very important distinction. While the first part is, to put it lightly, bad, I don't really care what people do on their own. Getting real people involved, and a minor at that? Big no-no.

DarkThoughts@fedia.io 26 points 5 months ago

All LLM headlines are like this to fuel the ongoing hysteria about the tech. It's really annoying.

helpImTrappedOnline@lemmy.world 8 points 5 months ago (last edited 5 months ago)

Sure is. I report the ones I come across as clickbait or a misleading title, explaining the parts left out... such as this one, where those 7 words change the story completely.

Whoever made that headline should feel ashamed for casting a groomer as the victim.

MeanEYE@lemmy.world 7 points 5 months ago

I'd be torn on the idea of AI-generated CP if it were only that. On one hand, if it helps them calm the urges while no one is getting hurt, all the better. On the other hand, it might cause them not to seek help, though the problem is already stigmatized severely enough that they're most likely not seeking help anyway.

But sending that stuff to a minor. Big problem.

Glass0448 -1 points 5 months ago

Cartoon CSAM is illegal in the United States. Pretty sure the courts will put his images under the same ruling.

https://en.wikipedia.org/wiki/PROTECT_Act_of_2003

https://www.thefederalcriminalattorneys.com/possession-of-lolicon

Madison420@lemmy.world 9 points 5 months ago

It won't. They'll get him for the actual crime, not the thought crime that's been nerfed into oblivion.

ameancow@lemmy.world 3 points 5 months ago

Based on the blacklists one has to fire up before browsing just about any large anime/erotica site, I'm guessing these "laws" are rarely enforced, because they're flimsy laws to begin with. The stipulations for what constitutes a crime read like a recipe for getting an entire case tossed out of court. I doubt any prosecutor would lean hard on possession of art unless it was tied to another crime.