this post was submitted on 04 Nov 2023
Privacy
Didn't watch the video, but I don't care about AI CSAM. Even if it looks completely lifelike, it's not real.
Prove it's fake when some of it depicting your daughter is making its way around school.
You've missed the point. Fake or not, it does damage to people. And eventually it won't be possible to determine whether it's real or not.
When that becomes widespread, photos will be generatable for literally everyone, not just minors but every person with photos online. It will be a societal shift; images will be assumed to be AI-generated, making any guilt or shame about a nude photo existing obsolete.
What a disguising assumption. And the best argument against AI I've ever heard.
I mean, anyone with enough artistic talent can draw whatever they would like right now. With AI image generation, it essentially just gives everyone the ability to draw whatever they want. You can try to fight the tech all you want, but it's a losing battle.
You may not like it, but do you really see another likely scenario?
Disguising or disgusting?
AI generated porn depicting real people seems like a different and much bigger issue
AI-generated CSAM in general, while disgusting, at least doesn't directly harm people; fabricated nudes most definitely do, regardless of the age of the victim.
You just implied children aren't real people.
AI-generated nudes of no one in particular aren't hurting anyone, not directly at least, but AI-generated nudes of a specific person, using that person's likeness and everything, are much worse.
AI can generate faces of people who don't actually exist; that's what I mean.
The post made it seem like it was about AI-generated CSAM in general, which, while disgusting, doesn't directly harm anyone. But then the comments spoke about AI-generated CSAM depicting a real individual, and that's much worse, but also not a problem that's specific to children.
Currently, pedos tend to group up and share real CSAM, and these "communities" probably serve to normalize the activity for their members. Perhaps being able to generate it will keep pedos from clumping together, reducing the degree of normalization so they're more likely to seek help, and as a bonus, real children aren't preyed upon to create said CSAM?
And if, as you say, removing AI tools that can generate CSAM will lead them to "attempt to fuck children in the streets", would you also say we should stop criminalizing the distribution of existing CSAM, because the CSAM shared in paedophile circles is all that's keeping them from going out and raping children?
AI CSAM is incredibly harmful. All CSAM is harmful. It's been shown to increase the chance of pedophilic abuse.
Stop defending CSAM, HOLY SHIT.
Can you link me a source for that, please?
Jeez, calm down
I am not defending CSAM, just saying that CSAM depicting an actual existing child is orders of magnitude worse, as is any other kind of fabricated sexual content of real people.
Take loli porn for example, it's probably bad for society, but if someone makes loli porn based on the appearance of an actual individual, that's much more fucked up, and in addition to the "normal" detrimental effects, that would also harm that victim in a much more direct way.
Eh, if you train an AI with CSAM to make more CSAM, that's a different story. But in general, yes.
What data is it trained on? This isn't meant to be a "gotcha" question, I'm wondering about it.
An image of an "avocado chair" is built on images of avocados, and images of chairs.