This post was submitted on 07 Sep 2023
167 points (96.1% liked)


AI-generated child sex imagery has every US attorney general calling for action: "A race against time to protect the children of our country from the dangers of AI."

top 50 comments
[–] db2@sopuli.xyz 93 points 1 year ago (44 children)

They're not pictures of real people; prosecuting on that basis undermines the point and makes them look like idiots. It should be banned on principle, but ffs, there's got to be a way to do it that doesn't look reactionary and foolish.

[–] ram@lemmy.ca 17 points 1 year ago (3 children)

Except when they are pictures of real people doing a body swap

[–] db2@sopuli.xyz 46 points 1 year ago (1 children)

That isn't at all what an AI-generated image is. People have been doing that for better than 50 years.

[–] BadRS@lemmy.world 29 points 1 year ago (1 children)

That's been possible since before Photoshop, and it's certainly still possible now.

[–] NeoNachtwaechter@lemmy.world 2 points 1 year ago

"That's been possible since before Photoshop"

Possible before, but now it's as easy as one command.

[–] fidodo@lemm.ee 10 points 1 year ago* (last edited 1 year ago)

Shouldn't that already be covered under revenge porn laws? At least the distribution side of it.

[–] newthrowaway20@lemmy.world 7 points 1 year ago (3 children)

But aren't these models built from source material? I imagine if you want CP AI, you need actual CP to train it, no? That definitely would be a problem.

[–] Rivalarrival 20 points 1 year ago

No, you can use a genetic algorithm. Have your audience rate a legal, acceptable work. Present the same work to an AI, ask it to mutate some of its traits, and show the resulting panel of works to your audience. Any derivative the audience rates higher than the original goes back to the AI for further mutation.

Feed all your mutation and rating data to an AI, and it can begin to learn what the audience wants to see.

Have a bunch of pedophiles doing the training, and you end up with "BeyondCP".
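
For the curious, here's a minimal, fully generic sketch of that preference-driven loop: candidates are abstract trait vectors, a human rating stands in for the audience, and the best-rated candidate seeds the next round of mutations. Every name and parameter here is illustrative, not any real system.

```python
import random

# Generic interactive genetic algorithm: fitness comes from a human
# rater (the "audience") instead of a fixed objective function.
# Candidates are abstract vectors of traits in [0, 1].

TRAITS = 8           # traits per candidate (illustrative)
PANEL_SIZE = 6       # candidates shown to the rater per generation
MUTATION_RATE = 0.3  # chance that any given trait gets nudged

def random_candidate():
    return [random.random() for _ in range(TRAITS)]

def mutate(parent):
    # Derivatives stay close to the rated parent: small Gaussian nudges,
    # clamped back into [0, 1].
    return [
        min(1.0, max(0.0, t + random.gauss(0, 0.1)))
        if random.random() < MUTATION_RATE else t
        for t in parent
    ]

def rate(candidate):
    # Stand-in for the audience: ask for a score on stdin.
    print("candidate:", [round(t, 2) for t in candidate])
    return float(input("rate 0-10: "))

def evolve(generations=3):
    panel = [random_candidate() for _ in range(PANEL_SIZE)]
    for _ in range(generations):
        best = max(panel, key=rate)  # the audience picks the winner
        # Next panel: the winner plus mutated derivatives of it.
        panel = [best] + [mutate(best) for _ in range(PANEL_SIZE - 1)]
    return panel[0]

if __name__ == "__main__":
    print("final best:", evolve())
```

Logging every (candidate, rating) pair along the way is the "feed all your mutation and rating data to an AI" step the comment above mentions: that log is exactly the kind of preference dataset a model could learn from.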

[–] NegativeInf@lemmy.world 3 points 1 year ago (2 children)

My question is where they got the training data for a sufficiently advanced CP image generator. Unless it's just AI porn with kids' faces? Which is still creepy, but I guess there are tons of pictures that people post of their own kids?

[–] SinningStromgald@lemmy.world 5 points 1 year ago

Manga, manhwa, CG sets, etc. of shota/loli. Sprinkle in some general child statistics for height, weight, etc. And I'm sure social media helped as well, with people making accounts for their babies, for God's sake.

Plenty of "data," I'm sure, to train up an AI.

[–] bobs_monkey@lemm.ee 4 points 1 year ago

Wouldn't put it past some sick fucks to feed undesirable content into an AI's training data.

[–] DarkSpectrum@lemmy.world 70 points 1 year ago (16 children)

Isn't AI-generated content better than content sourced from real life? It could actually reduce instances of child sexual abuse, with offenders turning to an alternative source.

[–] Valmond@lemmy.mindoki.com 31 points 1 year ago (1 children)

I wonder what it's trained on 🤮

[–] regbin_@lemmy.world 31 points 1 year ago

Based on my understanding of how current diffusion models work, you actually don't need to train it on CP. As long as it knows what humans look like without clothes and what children look like even when fully clothed in abayas and the like, it can make the connection and generate CP when asked to.

Just to be clear, I'm totally against any form of CP and CSAM. Just explaining how the tech works.

[–] doggle@lemmy.dbzer0.com 28 points 1 year ago

I see what you're saying, but AI has yet to offset regular porn production at all. I see no reason to think accepting AI CP would do anything but normalize it and make it more accessible, possibly increasing demand for the real stuff.

Also, the AI models need to be trained on something...

[–] krayj@sh.itjust.works 9 points 1 year ago* (last edited 1 year ago)

One big problem is that it makes enforcement against real abuse impossible. If there's an explosion of that kind of AI-generated content and it gets good enough to be confused with the real thing, real abuse will slip under the radar. It would be impossible to sift through all that content trying to differentiate AI-generated material from real if AI-generated material were ever allowed.

[–] philodendron@lemdro.id 5 points 1 year ago

Alternatively, it could become an indoctrination pipeline.

[–] Asudox@lemmy.world 3 points 1 year ago

This feels like a double-edged sword.

[–] uriel238@lemmy.blahaj.zone 49 points 1 year ago

In the United States, there are significantly greater dangers to kids than AI porn. Hunger, poverty and the climate crisis come to mind.

If we're refusing to address these for ideological reasons (e.g., because it's "socialism"), then the established system itself is a threat to kids.

Priorities.

[–] Sanctus@lemmy.world 39 points 1 year ago

Pandora's digital box has been opened. And I don't think this one ends with everything going back in the box.

[–] autotldr@lemmings.world 11 points 1 year ago

This is the best summary I could come up with:


On Wednesday, American attorneys general from all 50 states and four territories sent a letter to Congress urging lawmakers to establish an expert commission to study how generative AI can be used to exploit children through child sexual abuse material (CSAM).

In particular, open source image synthesis technologies such as Stable Diffusion allow the creation of AI-generated pornography with ease, and a large community has formed around tools and add-ons that enhance this ability.

Since these AI models are openly available and often run locally, there are sometimes no guardrails preventing someone from creating sexualized images of children, and that has rung alarm bells among the nation's top prosecutors.

Establishing a proper balance between the necessity of protecting children from exploitation and not unduly hamstringing a rapidly unfolding tech field (or impinging on individual rights) may be difficult in practice, which is likely why the attorneys general recommend the creation of a commission to study any potential regulation.

In the past, some well-intentioned battles against CSAM in technology have included controversial side effects, opening doors for potential overreach that could affect the privacy and rights of law-abiding people.

Similarly, the letter's authors use a dramatic call to action to convey the depth of their concern: "We are engaged in a race against time to protect the children of our country from the dangers of AI."


The original article contains 960 words; the summary contains 225 words. Saved 77%. I'm a bot and I'm open source!

[–] Blue2a2@sh.itjust.works 3 points 1 year ago

I'm sure it's entirely coincidental that the call to action is to restrict and control free, open-source software, leaving Google and Microsoft safely in control with their curated models.

This is just like the time the US made websites responsible for their users' content, and coincidentally made it much more legally dangerous to start your own social media platform.

But sure, I mean, just think of the (imaginary) children! We need to stop this theoretical abuse of imaginary children by passing laws that make it harder for any AI not created by a tech giant to operate.
