New York City businesses that use artificial intelligence to help find hires now have to show the process was free from sexism and racism.

[–] CoderKat@lemm.ee 8 points 1 year ago* (last edited 1 year ago)

Good fucking luck. Here's a fascinating article about Amazon's attempt to use AI for hiring, which, to their credit, they realized was a bad idea and scrapped: https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G

In short, it was trained on past hiring data, so it taught itself the sexist hiring preferences that humans had shown. It absolutely was not designed to be sexist, and I'm sure the devs had good intentions, but it learned to be sexist anyway.

In effect, Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word “women’s,” as in “women’s chess club captain.” And it downgraded graduates of two all-women’s colleges, according to people familiar with the matter. They did not specify the names of the schools.
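To make the mechanism concrete, here's a minimal sketch (synthetic data and a generic scikit-learn classifier, nothing to do with Amazon's actual pipeline) of how a model trained on historically biased hire/reject labels ends up with a negative weight on the token "women", even though gender is never an explicit feature:

```python
# Illustrative sketch only: tiny made-up resumes plus labels that mirror
# past biased human decisions. The classifier is never told about gender,
# but it learns a negative weight on the "women" token anyway.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "men's chess club captain, python developer",
    "software engineer, led backend team",
    "women's chess club captain, python developer",
    "women's coding society president, software engineer",
    "backend developer, men's rowing team",
    "women's debate team, data analyst",
]
# Historical outcomes: comparable resumes, but the ones mentioning
# "women's" were rejected by biased human reviewers.
hired = [1, 1, 0, 0, 1, 0]

vec = CountVectorizer()
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

weights = dict(zip(vec.get_feature_names_out(), model.coef_[0]))
print(weights.get("women"))  # negative: the token is penalized
```

The model isn't told anything about gender; it just notices that the token correlates with past rejections, which is exactly the proxy discrimination described in the quote.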

And here's a different but similar AI having some even subtler issues:

[..] The algorithms learned to assign little significance to skills that were common across IT applicants, such as the ability to write various computer codes, the people said.

Instead, the technology favored candidates who described themselves using verbs more commonly found on male engineers’ resumes, such as “executed” and “captured,” one person said.
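That quote also hints at why this happens: a skill that appears on essentially every resume carries no signal for distinguishing candidates, so the model leans on whatever does vary, even if it's just stylistic word choice that correlates with gender. A tiny TF-IDF sketch (made-up resumes, standard scikit-learn, not the actual system) shows the effect:

```python
# Terms present on every resume get the minimum idf weight, while the
# distinguishing verbs get boosted -- so a downstream model ends up
# keying on the verbs rather than the shared skills.
from sklearn.feature_extraction.text import TfidfVectorizer

resumes = [
    "python sql executed captured migration",
    "python sql executed deployment",
    "python sql collaborated supported rollout",
    "python sql assisted supported launch",
]
vec = TfidfVectorizer()
vec.fit(resumes)
for term, idf in zip(vec.get_feature_names_out(), vec.idf_):
    print(f"{term}: idf={idf:.2f}")
# "python" and "sql" (on every resume) get the lowest weight;
# the rarer verbs get the highest.
```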

To be very clear, these issues stem at their root from human biases, so dropping the AI won't save you from bias. In fact, human-only hiring may well be even more biased, because an AI can at least be the work of entire teams doing their best to combat bias. But an AI can end up discriminating in very subtle and unfair ways, like penalizing certain schools, and it can perpetuate past bad behavior and make it harder to improve.

Finally, this article is about Amazon noticing these biases and actively trying to correct them. While still imperfect, Amazon at least played whack-a-mole trying to root out biases (it sounds like they did for a while before giving up). Many companies won't even do that, which is why this law is a good thing: it forces them to at least try. Of course, ideally anti-bias laws would also apply to humans, since we're just as vulnerable to bias.
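For what it's worth, the kind of check these audit laws tend to ask for isn't exotic. Here's a minimal sketch of a selection-rate comparison using the classic "four-fifths" rule of thumb (made-up data and group names, not the NYC law's exact required methodology):

```python
# Compare selection rates across groups and flag any group whose rate
# falls below 80% of the best group's rate. Data is entirely synthetic.
from collections import defaultdict

# (group, was_selected) pairs, e.g. the output of an AI screening tool
outcomes = [
    ("men", True), ("men", True), ("men", False), ("men", True),
    ("women", True), ("women", False), ("women", False), ("women", False),
]

counts = defaultdict(lambda: [0, 0])  # group -> [selected, total]
for group, selected in outcomes:
    counts[group][0] += int(selected)
    counts[group][1] += 1

rates = {g: sel / total for g, (sel, total) in counts.items()}
best = max(rates.values())
for group, rate in rates.items():
    ratio = rate / best
    flag = "FLAG" if ratio < 0.8 else "ok"  # four-fifths threshold
    print(f"{group}: selection rate {rate:.2f}, impact ratio {ratio:.2f} [{flag}]")
```

Simple as it is, even that much measurement is more than many companies do voluntarily.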