Honestly, I think we need to understand that this is no different from sticking a photo of someone's head on a photo from a porn magazine. It's not real. It's just less janky.
I would categorise it as sexual harassment, not abuse. Still serious, but a different level.
Schools generally mean underage individuals are involved, which makes any content using them CSAM. So in effect, the "AI" companies are generating a ton of CSAM and nobody is doing anything about it.
Disagree. Not CSAM when no abuse has taken place.
That's my point.
There's a practice that happened in the past; I'm not sure whether it still does, given the lack of news about it. It was called "glamour modeling", I think, or was an extension of it.
Basically, official/legal photography studios took pictures of child models in swimsuits and revealing clothing, at times in suggestive positions, and sold them to interested parties.
Nothing untoward directly happened to the children. They weren't physically abused; they were treated as regular fashion models. And yet it's still CSAM. Why? Because of the intention behind making those pictures.
The intention to exploit.