this post was submitted on 26 Jan 2024
290 points (87.4% liked)

Technology


Taylor Swift is living every woman’s AI porn nightmare — Deepfake nudes of the pop star are appearing all over social media. We all saw this coming.

top 50 comments
[–] books@lemmy.world 104 points 9 months ago (2 children)

I feel like I live on the Internet and I never see this shit. Either it doesn't exist or I exist on a completely different plane of the net.

[–] GBU_28@lemm.ee 19 points 9 months ago (9 children)

You ever somehow get invited to a party you'd usually never be at? With a crowd you never, ever see? This is that.

[–] Plopp@lemmy.world 26 points 9 months ago (2 children)

You ever somehow get invited to a party you'd usually never be at? With a crowd you never, ever see?

No?

[–] trk@aussie.zone 24 points 9 months ago

You ever somehow get invited to a party

Also no 😥

[–] GBU_28@lemm.ee 11 points 9 months ago

Understandable, have a nice day

[–] schnurrito@discuss.tchncs.de 14 points 9 months ago (3 children)

On the Internet, censorship happens not by having too little information, but too much information in which it is difficult to find what you want.

We all have only so much time to spend on the Internet and so necessarily get a filtered experience of everything that happens on the Internet.

[–] Zip2@feddit.uk 71 points 9 months ago (1 children)

I’ll wait for Taylor’s version.

[–] BeefPiano@lemmy.world 68 points 9 months ago (7 children)

I wonder if this winds up with revenge porn no longer being a thing? Like, if someone leaks nudes of me I can just say it’s a deepfake?

Probably a lot of pain for women from mouth breathers before we get there from here.

[–] SnotFlickerman@lemmy.blahaj.zone 55 points 9 months ago* (last edited 9 months ago)

I mean, not much happened to protect women after The Fappening, and that happened to boatloads of famous women with lots of money, too.

Arguably, not any billionaires, so we'll see I guess.

[–] thantik@lemmy.world 18 points 9 months ago (3 children)

This has already been a thing in courts with people saying that audio of them was generated using AI. It's here to stay, and almost nothing is going to be 'real' anymore unless you've seen it directly first-hand.

[–] TropicalDingdong@lemmy.world 54 points 9 months ago (2 children)

first-hand.

the first-hand in question:

[–] TwilightVulpine@lemmy.world 9 points 9 months ago (4 children)

Why would it make revenge porn less of a thing? Why are so many people here convinced that as long as people say it's "fake" it's not going to negatively affect them?

The mouth breathers will never go away. They might even use the excuse the other way around: because someone could claim just about anything is fake, the material might be real and the victim might be lying. Remember that blurry pictures of Bigfoot were enough to fool a lot of people.

Hell, even if others believe it is fake, wouldn't it still be humiliating?

[–] Aethr@lemmy.world 9 points 9 months ago (4 children)

I think you're underestimating the potential effects of an entire society starting to distrust pictures/video. Yeah a blurry Bigfoot fooled an entire generation, but nowadays most people you talk to will say it's doctored. Scale that up to a point where literally anyone can make completely realistic pics/vids of anything in their imagination, and have it be indistinguishable from real life? I think there's a pretty good chance that "nope, that's a fake picture of me" will be a believable, no question response to just about anything. It's a problem

[–] nyan@lemmy.cafe 63 points 9 months ago* (last edited 9 months ago) (2 children)

Fake celebrity porn has existed since before photography, in the form of drawings and impersonators. At this point, if you're even somewhat young and good-looking (and sometimes even if you're not), the fake porn should be expected as part of the price you pay for fame. It isn't as though the sort of person who gets off on this cares whether the pictures are real or not—they just need them to be close enough that they can fool themselves.

Is it right? No, but it's the way the world is, because humans suck.

[–] BreakDecks@lemmy.ml 48 points 9 months ago (2 children)

Honestly, the way I look at it is that the real offense is publishing.

While still creepy, it would be hard to condemn someone for making fakes for personal consumption. Making an AI fake is the high-tech equivalent of gluing a cutout of your crush's face onto a playboy centerfold. It's hard to want to prohibit people from pretending.

But posting those fakes online is the high-tech, scaled-up version of xeroxing the playboy centerfold with your crush's face on it, and taping up copies all over town for everyone to see.

Obviously, there's a clear line people should not cross, but it's clear that without laws to deter it, AI fakes are just going to circulate freely.

[–] EatATaco@lemm.ee 10 points 9 months ago

AI fake is the high-tech equivalent of gluing a cutout of your crush’s face onto a playboy centerfold.

At first I read that as "cousin's face" and I was like "bru, that's oddly specific." Lol

[–] Stanwich@lemmy.world 48 points 9 months ago (6 children)

WHAT?? DISGUSTING! WHERE WOULD THESE JERKS PUT THIS ? WHAT SPECIFIC WEBSITE DO I NEED TO BOYCOTT?

[–] paddirn@lemmy.world 27 points 9 months ago

Google Search didn’t really turn up much, far less than if you were to search for something like ‘Nancy Pelosi nude’ even. It kind of seems overblown, and the only reason it’s gotten any news is because of who it happened to. Just being famous nowadays seems to mean you’re going to see photoshopped or deepfake porn of yourself spread all over the internet.

[–] FenrirIII@lemmy.world 40 points 9 months ago (3 children)

Went to Bing and found "Taylor swift ai pictures" as a top search. LOTS of images of her being railed by Sesame Street characters

[–] SnotFlickerman@lemmy.blahaj.zone 30 points 9 months ago* (last edited 9 months ago)

I'm gonna be real...

When I stumbled upon (didn't go looking to be clear) her and Oscar the Grouch on a pile of trash... It sent me.

I know the whole situation is gross but I couldn't stop laughing due to the sheer absurdity. Like... What???

[–] EatATaco@lemm.ee 37 points 9 months ago (7 children)

God what a garbage article:

On X—which used to be called Twitter before it was bought by billionaire edgelord Elon Musk

I mean, really? The guy makes my skin crawl, but what a hypocritically edgy comment to put into an article.

And then zero comment from Taylor Swift in it at all. The author is basically just speaking for her. Not only that, but she anoints herself spokesperson for all women...while also pretty conspicuously ignoring that men can be victims of this too.

Don't get me wrong, I'm not defending non consensual ai porn in the least, and I assume the author and I are mostly in agreement about the need for something to be done about it.

But it's trashy, politically charged, and biased articles like this that make people take sides on things like this. Imo, the author is contributing to the problems of society she probably wants to fix.

[–] jivandabeast@lemmy.browntown.dev 27 points 9 months ago* (last edited 9 months ago) (1 children)

hypocritically edgy comment to put into an article.

It's Vice; their whole brand is edgy. Calling Elon an edgelord is very on brand for them.

pretty conspicuously ignoring that men can be victims of this too.

Sure, but women are disproportionately affected by this. You're making the "all lives matter" argument of AI porn

make people take sides on things like this

People should be taking sides on this.

Just seems like you wanna get mad for no reason? I read the article, and it doesn't come across nearly as bad as you would lead anyone to believe. This article is about deepfake pornography as a whole, and how it can (and more importantly HAS) affected women, including minors. Sure, it would have been nice to have a comment from Taylor, but I really don't think it was necessary.

[–] Psythik@lemmy.world 23 points 9 months ago (4 children)

On the contrary, I find it more ridiculous when news media pretends like nothing is wrong over at Twitter HQ. I wish more journalists would call Musk out like this every time they're forced to mention Twitter.

[–] ObsidianZed@lemmy.world 34 points 9 months ago* (last edited 9 months ago) (2 children)

People have been doing these for years even before AGI.

Now, it's just faster.

Edit: Sorry, I suppose I should mean LLM AI

[–] GBU_28@lemm.ee 21 points 9 months ago (1 children)
[–] jbloggs777@discuss.tchncs.de 17 points 9 months ago

There is no point waiting for a response...the threat has been neutralized. Now repeat after me: There is no AGI.

[–] expr@programming.dev 10 points 9 months ago (1 children)

AGI continues to exist purely in the realm of science fiction.

[–] Mango@lemmy.world 22 points 9 months ago (13 children)

Nightmare? Doesn't it simply give them the chance to just say any naked pic of them is fake now?

[–] TwilightVulpine@lemmy.world 10 points 9 months ago (32 children)

Oh I'm sure that must be a very nice thing to talk out with your mother or significant other.

"Don't worry they are plastering naked pictures of me everywhere, it's all fake"

[–] DarkMessiah@lemmy.world 22 points 9 months ago (1 children)

And this is why I don’t want to be famous. Being famous exposes your name to the crazies of the world, and leaves you blissfully unaware until the crazies snap.

[–] No_Eponym@lemmy.ca 14 points 9 months ago

"... privacy is something you can sell, but you can’t buy it back." -Bob Dylan

[–] Neil@lemmy.ml 21 points 9 months ago (4 children)

I'm not saying she shouldn't have complained about this. She has every right to, but complaining about it definitely made the problem a lot worse.

[–] jamyang@lemmy.world 12 points 9 months ago (2 children)

Streisand Effect or something.

[–] Dariusmiles2123@sh.itjust.works 14 points 9 months ago (2 children)

At least now, if pictures are real, you can say it’s AI generated.

Still, to be honest, I’ve never understood how some people can let one-night stands film them naked.

If it’s a longtime girlfriend or boyfriend and they betray you, it’s different, but people aren’t acting in a clever way when it comes to sex.

[–] interdimensionalmeme@lemmy.ml 13 points 9 months ago (1 children)

There's nothing wrong with recording your naked body and it being seen online by willing persons.

The people who would disrespect you for it, they're the problem.

[–] Dariusmiles2123@sh.itjust.works 11 points 9 months ago (2 children)

That’s not what I’m talking about.

I’m talking about not being careful who you’re giving these images to if you don’t want them to spread online. And, of course, the person sharing it on the web is the guilty party, not the naked victim.

load more comments (1 replies)
[–] b3an@lemmy.world 13 points 9 months ago

Well, targeting someone famous and going overboard with it likely results in legal responses. Perhaps this gets deepfakes the attention they need to be regulated or made legally punishable, especially when it targets underage children.

[–] Kushia@lemmy.ml 12 points 9 months ago (1 children)

Some of them are really good too, in a realistic sense. You can tell they are AI though.
