this post was submitted on 22 Feb 2024
488 points (96.2% liked)

Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis

Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

[–] VoterFrog@lemmy.world 2 points 8 months ago

C is just a workaround for B and for the fact that the technology has no way to identify and overcome harmful biases in its data set and model. This kind of behind-the-scenes prompt engineering isn't unique to diversifying image output, either. It's a necessity for creating a product that's usable by the general consumer, at least until the technology evolves enough to incorporate those lessons directly into the model.
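To make the idea concrete, here's a minimal sketch of what that kind of hidden prompt rewriting can look like. Everything here is hypothetical — the function names, the keyword list, and the injected suffix are illustrative stand-ins, not Google's actual pipeline (which would presumably use a classifier model rather than keyword matching):

```python
def mentions_people(prompt: str) -> bool:
    """Naive keyword check standing in for a real prompt classifier."""
    keywords = ("person", "people", "man", "woman", "soldier", "doctor")
    return any(word in prompt.lower() for word in keywords)

def augment_prompt(user_prompt: str) -> str:
    """Silently rewrite the prompt before it reaches the image model.

    The user never sees the appended instruction, which is what makes
    this 'behind the scenes' prompt engineering.
    """
    hidden_suffix = ", depicting people of diverse ethnicities and genders"
    # A production system would also need to detect contexts where the
    # augmentation is inappropriate (e.g. historical figures) and skip it
    # -- the failure to do that is exactly the Gemini problem above.
    if mentions_people(user_prompt):
        return user_prompt + hidden_suffix
    return user_prompt
```

A prompt like `"a doctor at work"` would pick up the hidden suffix, while `"a mountain at sunset"` would pass through unchanged — and because the rewrite is invisible, users only notice it when it misfires.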

And so my point is, there's a boatload of problems that stem from the fact that this is early technology and the solutions to those problems haven't been fully developed yet. But while we are rightfully not upset that the system doesn't understand that lettuce doesn't go on the bottom of a burger, we're for some reason wildly upset that it tries to give our fantasy quasi-historical figures darker skin.