
cross-posted from: https://mbin.grits.dev/m/mews/t/22301

White House calls for legislation to stop Taylor Swift AI fakes

top 50 comments
[-] BetaDoggo_@lemmy.world 146 points 5 months ago

Nobody cares until someone rich is impacted. Revenge porn has been circulating on platforms uninhibited for many years, but the second it happens to a major celebrity suddenly there's a rush to do something about it.

[-] givesomefucks@lemmy.world 85 points 5 months ago

What?

This isn't revenge porn, it's fakes of celebrities.

Something that's been done for decades, and it was one of the biggest parts of early Reddit. So it's not "the second" either.

The only thing that's changed is people are generating it with AI.

The ones made without AI (that have been made for decades) are a lot more realistic and a lot more explicit. It just takes skill and time, which is why people were only doing it for celebrities.

The danger of AI is that any random person could take some pictures off social media and make explicit images. The technology isn't there yet, but it won't take much longer.

[-] PhlubbaDubba@lemm.ee 21 points 5 months ago

I think it's more about the abject danger that unregulated AI replication of noteworthy figures represents to basically everything

Also, revenge porn is illegal in (I think) every state but South Carolina, and even there it might have been banned since I last saw that stat.

[-] Deceptichum@kbin.social 19 points 5 months ago

While I agree with the sentiment that rich people's issues have more influence:

How Many States Have Revenge Porn Laws?

All states, excluding Massachusetts and South Carolina, have separate statutes specifically related to revenge porn. It's important to note, however, that a person may still be prosecuted for revenge porn under other statutes in those two states.

https://www.findlaw.com/criminal/criminal-charges/revenge-porn-laws-by-state.html

[-] Mango@lemmy.world 6 points 5 months ago

You think it wasn't celebrities first? The issue here is specifically Taylor Swift.

[-] helenslunch@feddit.nl 5 points 5 months ago

"the second it happens to a major celebrity suddenly there's a rush to do something about it"

Bruh this been happening to celebrities for decades.

[-] aniki@lemm.ee 51 points 5 months ago* (last edited 5 months ago)

This wasn't a problem until the rich white girl got it. Now we must do... something. Let's try panic!

-The White House, probably.

[-] frickineh@lemmy.world 16 points 5 months ago

Honestly, I kind of don't even care. If that's what it takes to get people to realize that it's a serious problem, cool. I mean, it's aggravating, but at least now something might actually happen that helps protect people who aren't megastars.

[-] PapaStevesy@midwest.social 7 points 5 months ago

You must be new to capitalism, lol

[-] voluble@lemmy.world 6 points 5 months ago

White House used Panic!

It hurt itself in its confusion!

[-] Zozano@lemy.lol 32 points 5 months ago

Do you want more AI gens of nude Taylor Swift? Because that's how you get more AI gens of nude Taylor Swift.

[-] guyrocket@kbin.social 26 points 5 months ago

This will be interesting.

How do you write legislation that stops AI nudes but not photoshopping or art? I'm not at all sure it can be done. And even if it can, will it withstand a courtroom free speech test?

[-] macrocarpa@lemmy.world 12 points 5 months ago

I think it's not feasible to stop or control it, for several reasons -

  1. People are motivated to consume ai porn
  2. There is no barrier to creating it
  3. There is no cost to create it
  4. There are multiple generations of people who have shared the source material needed to create it.

We joke about Rule 34, right: if you can think of it, there is porn of it. It's now pretty straightforward to fulfil the second part of that, irrespective of the thing you thought of. Those pics of your granddad in his 20s in a navy uniform? Your high school yearbook picture? Six shots of your younger sister shared by an aunt on Facebook? Those are just as consumable by AI as Tay Tay is.

[-] badbytes@lemmy.world 23 points 5 months ago

Surely this should be a priority.

[-] remotelove@lemmy.ca 16 points 5 months ago* (last edited 5 months ago)

Well, it's not really just about Swift. There are probably many other people that are going through this. Not every person who generates nudes of someone else is going to make it to the news, after all.

I could see this being a problem in high schools as really mean pranks. That is not good. There are a million other ways I could see fake nudes being used against someone.

If someone spread pictures of me naked: 1. I would be flattered and 2. Really ask why someone wants to see me naked in the first place.

If anything, just an extension of any slander(?) laws would work. It's going to be extremely hard to enforce any law though, so there is that.

However, how long have revenge porn laws been a thing? Were they ever really a thing?

[-] originalucifer@moist.catsweat.com 17 points 5 months ago* (last edited 5 months ago)

I remember a headline from a few weeks back; this is already happening in schools. It's really not about Swift.

[-] cosmicrookie@lemmy.world 18 points 5 months ago

Wait.. They want to stop only Taylor Swift AI fakes? Not every AI fake representing a real person???

[-] AngryCommieKender@lemmy.world 12 points 5 months ago

Only AI fakes of billionaires. They're just admitting that there's a two tiered legal system, and if you're below a certain "value," you will not be protected.

[-] cosmicrookie@lemmy.world 7 points 5 months ago

If the value level is Taylor Swift we're all doomed

[-] ehrik@lemmy.world 12 points 5 months ago

Y'all need to read the article and stop rage baiting. It's literally a click away.

"Legislation needs to be passed to protect people from fake sexual images generated by AI, the White House said this afternoon."

[-] Thorny_Insight@lemm.ee 5 points 5 months ago

Quick! Stock up on Katy Perry fakes before those get banned as well

[-] Bonesy91@lemmy.world 18 points 5 months ago

This is what the White House is concerned about? Fuck them. There is so much worse going on in America, but oh no, one person has AI fake porn images, heaven forbid!

[-] MirthfulAlembic@lemmy.world 17 points 5 months ago

The White House is capable of having a position on more than one issue at a time. There also doesn't seem to be a particular bill they are touting, so this seems to be more of a "This is messed up. Congress should do something about it" situation than "We're dropping everything to deal with this" one.

[-] XeroxCool@lemmy.world 5 points 5 months ago

Nice job reading the article (any one of these articles) to actually get context instead of just reacting to headlines.

People are asking about Swift, but the government isn't buddying up to her specifically. Swift is just the most famous face of an issue that has been growing rapidly.

[-] Caligvla@lemmy.dbzer0.com 17 points 5 months ago* (last edited 5 months ago)

U.S. government be like:

Thousands of deep fakes of poor people: I sleep.

Some deep fakes of some privileged Hollywood elite: R E A L S H I T.

[-] thantik@lemmy.world 17 points 5 months ago

I'd much rather that we do nothing, let it proliferate to the point where nobody trusts nudes at all any more.

[-] iheartneopets@lemm.ee 16 points 5 months ago

Taylor is just trying to distract us from her jet emissions again, just like her new PR relationship with that Kelce guy was almost certainly to distract us from her dating that Matty Healy dude that openly said he enjoys porn that brutalizes black women (and also from her jet emissions).

She's not stupid. She's a billionaire very aware of how news cycles work.

[-] Sagifurius@lemm.ee 11 points 5 months ago

Oh look, we've got this generation's moral panic figured out.

[-] Anticorp@lemmy.world 9 points 5 months ago

Is the law going to explicitly protect her and no one else?

this post was submitted on 27 Jan 2024
249 points (88.8% liked)

Not The Onion
