The problem is that it's not just cartoon characters, but also realistic-looking people. That makes it, especially in the coming years as the techniques improve, impossible to tell what is fake and what is not, so the fake ones should be banned as well. And these models are trained on images of actual abused children, which of course is the main problem with this.
This is the first I'm hearing about models trained on real CSAM.
It wouldn't surprise me, tbh. From my superficial visit to the darknet years ago, it seemed like these CSAM consumers have specific "favourites" among the victims whom they want to see more of. At least that's what I remember from clicking a link to such a chan and noping out of it.
It is the first you are hearing about this because it is bs.
That's because it isn't happening
There's just no reason to do so
What isn't happening? Them making fake CSAM? I haven't seen it because I don't want to see it, but I am confident it's occurring. Some kid already got busted feeding images of girls in his class into an image generator and making nudes out of them.
So while it might not be widespread, it's 100 percent happening and will increase.
Honestly, releasing these generators to the general public was a mistake. They thought they could put up safety measures, but they're easily bypassed. I think they should have kept them locked up and only given access to people who are registered and trackable, with reviewers checking what they're generating.
All of these AI generators are getting abused left and right, and anyone who didn't think that would happen is an idiot.
No, I'm saying the models aren't being trained with actual CSAM. The comment I replied to was about training, not generation.
All I was saying is that you don't need to train a model on child abuse images to get it to output child abuse images.
Do you really think the people generating CSAM give a fuck about their training data? They are making the content because they enjoy it. I'd guess they'd use all the training data available (of which they would likely have plenty, considering their interests).
The people generating it are rarely the ones who are training the models. They take pretrained models and prompt them for what they want.
Even if they were training a model for a specific subject, they could train it on ordinary pictures of that subject and combine it with another model that generates the kind of image they want.
There is absolutely no reason they would need abuse images for training. There are far better general NSFW models available right now than anything they could train themselves.