Robin Williams' daughter Zelda says AI recreations of her dad are 'personally disturbing': 'The worst bits of everything this industry is'

[–] _number8_@lemmy.world 2 points 1 year ago (3 children)

imaginary scenario:

you love Good Will Hunting, you're going through a tough time, and you use AI to have Robin Williams say something gentle and therapist-y that directly applies to you and your situation -- is this wrong?

[–] Naz@sh.itjust.works 10 points 1 year ago (1 children)

I've asked an extremely high-end AI questions on ethics of this nature, and after thinking for exactly 14.7 seconds it responded with:

• Generating images, sound, or other representations of real people is considered no different from active imagination when done for fun and in private.

• However, spreading those images to others without the original person's consent is considered a form of privacy invasion and impersonation, and is therefore unethical.

Basically, you're fine imagining Robin Williams talking to you, but if you record that and share it with others / disseminate the content, then it becomes unethical.

[–] TwilightVulpine@lemmy.world 2 points 1 year ago (1 children)

• Generating images, sound, or other representations of real people is considered no different from active imagination when done for fun and in private.

That doesn't sound right at all. Copying and processing somebody's works to create a replica is completely different from imagining it yourself. Depending on how it's done, even the claim that it's being done solely for yourself can be wrong: many AI-based services take feedback from what their users do, even if the users don't actively share it.

Just as looking at something, memorizing it, and imitating it is allowed while taking a picture of it may not be, an AI would not necessarily get the same rights to engage with media that people have. It's not an independent actor with personal rights. It's not an extension of the user. It's a tool.

Then again, I shouldn't be surprised that an AI used and trained by AI users describes its own use as basically a natural right.

[–] JackbyDev@programming.dev 3 points 1 year ago (1 children)

Please see the second point. Essentially, you cannot commit a copyright violation if you don't distribute anything. Same concept.

[–] TwilightVulpine@lemmy.world 2 points 1 year ago (1 children)

These AIs are not being produced by the rights holders, so it seems unlikely that they are being built without unauthorized distribution.

[–] JackbyDev@programming.dev 2 points 1 year ago

I get your point, but for the purpose of the thought exercise I think it's better to assume you built the model yourself. That gets at the crux of "I am interested in having a likeness of a dead celebrity say nice things to me," especially since whether it's ethical to build and share models of copyrighted content is a totally different question with its own can of worms.

[–] Empricorn@feddit.nl 4 points 1 year ago

I wouldn't call it a moral issue, but I bet it isn't healthy. I would urge this theoretical person to consult an actual licensed therapist.

[–] BuckyVanBuren@lemmy.world 1 points 1 year ago

Ask Tom Waits...