this post was submitted on 22 Feb 2024
488 points (96.2% liked)

Technology

59422 readers
2824 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 1 year ago
MODERATORS
 

Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis

Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

[–] Kusimulkku@lemm.ee 27 points 9 months ago (1 children)

Someone needs to edit this to feature Kanye

[–] ArmoredThirteen@lemmy.ml 18 points 9 months ago (1 children)

Looks like they scrubbed swastikas out of the training set? I have mixed feelings about this. If they want historical accuracy (or going by my own personal opinions on censorship), that shouldn't be scrubbed. But this is also the perfect tool for churning out endless amounts of pro-Nazi propaganda, so maybe it's safer to keep it removed?

[–] Excrubulent@slrpnk.net 10 points 8 months ago (2 children)

I wonder if it's just a hard shape to get right, like hands.

Isn't there an entire subreddit of humans who can't get it right? I think we're starting to see considerable overlap between the intelligence of the smartest AI and the dumbest humans.

[–] T156@lemmy.world 2 points 8 months ago

Probably. Image generators still have a bit of trouble with signs and iconography. A swastika probably falls into a similar category.