Brin’s “We definitely messed up”, at an AI “hackathon” event on 2 March, followed a slew of social media posts showing Gemini’s image generation tool depicting a variety of historical figures – including popes, founding fathers of the US and, most excruciatingly, German second world war soldiers – as people of colour.

GadgeteerZA@beehaw.org 5 points 8 months ago

Sometimes you do want something specific. I can understand if someone just asks for a person doing x, y, z and then gets a broader selection: men, women, young, old, Black or white. But if one asks for a middle-aged white man, I would not expect it to respond with a young Black woman just to have variety. I'd expect the other, non-stated variables to be varied. It's like asking for a scene of specifically leafy green trees: I would not expect to see a whole lot of leafless ones.

Ephera@lemmy.ml 13 points 8 months ago

Yeah, the problem with that is that there's no logic behind it. To the AI, "white person" is just as white as "banker". It only knows what a white person looks like because it's been shown lots of pictures of white people that were labelled "white person". Similarly, it's been shown lots of pictures of white people that were labelled "banker".

There is a way to fix that, which is to introduce some logic before the query is sent to the AI: detect whether the query contains an explicit reference to skin colour (or similar), and if it does, leave out the diversity prefix that would otherwise be prepended to the query.
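Roughly something like this (just a sketch; the prefix and the term list are made up for illustration, this isn't what Google actually runs):

```python
import re

# Hypothetical diversity instruction that the service would prepend to prompts.
DIVERSITY_PREFIX = "Show a diverse range of people. "

# Illustrative terms that explicitly pin down skin colour or ethnicity.
EXPLICIT_TERMS = re.compile(
    r"\b(white|black|asian|latino|latina|hispanic|caucasian)\b",
    re.IGNORECASE,
)

def build_prompt(user_query: str) -> str:
    # If the user already specified skin colour, leave their query alone;
    # otherwise prepend the diversity instruction.
    if EXPLICIT_TERMS.search(user_query):
        return user_query
    return DIVERSITY_PREFIX + user_query
```

With that, `build_prompt("a middle-aged white man")` stays untouched, while `build_prompt("a banker")` gets the diversity instruction.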

Where it gets wild is that you can ask the AI itself whether your query contains such an explicit reference to skin colour, and it will genuinely do quite well at answering correctly, because text processing is its core competence. But then it will answer "Yes." or "No." or "Potato chips.", and you still have to program the condition that leaves out the query prefix.
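So in practice you end up with something like this (again just a sketch; `ask_llm` stands in for whatever text-model call you'd actually use):

```python
def query_mentions_skin_color(user_query: str, ask_llm) -> bool:
    # Ask the text model to classify the query for us.
    answer = ask_llm(
        "Does the following image request explicitly mention skin colour "
        "or ethnicity? Answer only Yes or No.\n\n" + user_query
    )
    # The model's reply is free text, so we still have to program the
    # condition ourselves -- and decide what to do when it says neither.
    normalized = answer.strip().lower()
    if normalized.startswith("yes"):
        return True
    if normalized.startswith("no"):
        return False
    # "Potato chips." lands here: fall back to assuming no explicit mention.
    return False
```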

GadgeteerZA@beehaw.org 4 points 8 months ago

Yes, it could be that, and it may explain why the Nazi images came out like they did. But it sounded to me more like Google was deliberately forcing diversity into the images. For general requests that makes sense, but sometimes it doesn't. Otherwise they might just as well decide that grass should not always be green or brown, but sometimes blue or purple for variety.