this post was submitted on 12 Mar 2024
Technology

Brin’s “We definitely messed up”, at an AI “hackathon” event on 2 March, followed a slew of social media posts showing Gemini’s image generation tool depicting a variety of historical figures – including popes, founding fathers of the US and, most excruciatingly, German second world war soldiers – as people of colour.

[–] GadgeteerZA@beehaw.org 16 points 6 months ago (17 children)

It's not just historical. I'm a white male, and I prompted Gemini to create images of a middle-aged white man building a Lego set, etc. Only one image was of a white male; two of the others were an Indian man and a Black man. Why, when I asked for a white male? It was an image I wanted to share with my family. Why would Gemini go off the prompt? I did not ask for diversity, nor was it expected for that purpose, and I got no other image options to consider, so it was a fail.

[–] Ephera@lemmy.ml 33 points 6 months ago (7 children)

The problem is that the training data is biased and these AIs pick up on biases extremely well and reinforce them.

For example, people of color tend to post fewer pictures of themselves on the internet, mostly because remaining anonymous is preferable to experiencing racism.
So, if you've then got a journalistic picture, like from the food banks mentioned in the article, suddenly there will be relatively many people of color there, compared to what the AI has seen from its other training data.
As a result, it will learn that one of the defining features of what a food bank looks like is that there are people of color there.
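A toy sketch of how that kind of bias amplification happens (this is not a real image model, just a naive co-occurrence counter; the scene labels and counts are made up to mirror the food-bank example):

```python
from collections import Counter

# Hypothetical training pairs of (scene, person attribute). The skew mirrors
# the comment's point: journalistic food-bank photos over-represent people of
# color relative to self-posted pictures elsewhere on the internet.
training_data = [
    ("food bank", "person of color"),
    ("food bank", "person of color"),
    ("food bank", "white person"),
    ("selfie", "white person"),
    ("selfie", "white person"),
    ("selfie", "white person"),
    ("selfie", "person of color"),
]

def most_associated(scene: str) -> str:
    """A naive 'model' that treats the most frequent co-occurring
    attribute as a defining feature of the scene."""
    counts = Counter(attr for s, attr in training_data if s == scene)
    return counts.most_common(1)[0][0]

# The skew in the data becomes the model's "defining feature" of the scene:
print(most_associated("food bank"))  # person of color
```

Any generator trained on such data will then reproduce the skew every time it renders that scene, which is the reinforcement loop described above.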

To try to combat these biases, the bandaid fix is to prefix your query with instructions to generate diverse pictures. As in, literally prefix. They're simply putting words in your mouth (which is industry standard).
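Roughly, the prefixing works like this (a minimal sketch; Google's actual wording and pipeline are not public, so the prefix text and function names here are assumptions):

```python
# Hypothetical illustration of silent prompt prefixing; the actual
# instructions Gemini prepends are not publicly documented.
DIVERSITY_PREFIX = "Generate images depicting a diverse range of people. "

def build_model_prompt(user_prompt: str) -> str:
    # The service prepends its own instructions, so the model sees more
    # than what the user actually typed.
    return DIVERSITY_PREFIX + user_prompt

final_prompt = build_model_prompt("a middle-aged white man building a Lego set")
print(final_prompt)
```

The downside is exactly what the parent comment ran into: the injected instruction can override an explicit attribute the user asked for.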

[–] frogmint@beehaw.org 6 points 6 months ago (1 children)

For example, people of color tend to post fewer pictures of themselves on the internet, mostly because remaining anonymous is preferable to experiencing racism.

That is quite the bold statement. Source?

[–] Ephera@lemmy.ml 7 points 6 months ago

I don't think I came up with that myself, but yeah, I've got nothing. It would have been multiple years ago that I read about it.
Maybe strike the "mostly", but it seemed logical enough to me that this would be a factor, similar to how some women will avoid revealing their gender (in certain contexts on the internet) to steer clear of sexual harassment.
For that last part, I can refer you to a woman from whom I've heard first-hand that she avoids voice chat in games because of that.
