this post was submitted on 04 Nov 2023
295 points (93.0% liked)

WhatsApp’s AI shows gun-wielding children when prompted with ‘Palestine’::By contrast, prompts for ‘Israeli’ do not generate images of people wielding guns, even in response to a prompt for ‘Israel army’

[–] generalpotato@lemmy.world 86 points 1 year ago (1 children)

Systemic prejudices showing up in datasets causing generative systems to spew biased output? Gasp… say it isn’t so?

I’m not sure why this is surprising anymore. This is literally expected behavior unless we get our shit together and get a grip on these systemic problems. The rest of it all is just patchwork and bandages.

[–] vacuumflower@lemmy.sdf.org 3 points 1 year ago* (last edited 1 year ago)

I'd like to point out that not everything generative is a subset of ML, so prejudices in datasets don't affect every generative system.

That's off topic, but I'm just playing with generative music right now. I started with SuperCollider, but it was too hard (maybe not anymore, TBF; recycling a phrase, for example, would probably be much easier and faster there than in my macaroni shell script), so now I just generate ABC notation, convert it to MIDI with various instruments, and play it through FluidSynth.
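
Not my actual script, but a minimal sketch of the same flow, assuming abc2midi (from the abcMIDI package) and fluidsynth are installed, and that a General MIDI soundfont lives at the usual Linux path (adjust to whatever you have):

```sh
#!/bin/sh
# Sketch of the ABC -> MIDI -> audio pipeline: emit a random ABC tune,
# convert it with abc2midi, render it with FluidSynth.
# The soundfont path is a common Debian/Ubuntu default; yours may differ.

SOUNDFONT=/usr/share/sounds/sf2/FluidR3_GM.sf2

# ABC header plus 16 bars of random eighth notes from the C major scale.
{
  printf 'X:1\nT:Random sketch\nM:4/4\nL:1/8\nK:C\n'
  awk 'BEGIN {
    srand()
    split("C D E F G A B c", scale, " ")
    for (bar = 1; bar <= 16; bar++) {
      line = ""
      for (note = 1; note <= 8; note++)
        line = line scale[int(rand() * 8) + 1]
      print line " |"
    }
  }'
} > tune.abc

abc2midi tune.abc -o tune.mid                                # ABC -> MIDI
fluidsynth -ni "$SOUNDFONT" tune.mid -F tune.wav -r 44100    # render MIDI to WAV offline
```

Different instruments per voice can be set with abc2midi's `%%MIDI program` directive, and if you just want to hear it instead of rendering a file, `fluidsynth "$SOUNDFONT" tune.mid` plays it in real time.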