According to a recent review, 100% of the people falsely arrested based on facial recognition matches have been black.
The technology needs to be legally banned from law enforcement applications, because law enforcement is not making a good faith effort to use the technology.
We should ban patrol automation software too. It uses historical arrest data to automatically generate patrol routes. Guess which neighborhoods have a history of disproportionate policing.
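To make that feedback loop concrete, here is a toy sketch, not any vendor's actual algorithm, with made-up numbers: if patrols are allocated in proportion to historical arrest counts, the recorded disparity sustains itself even when the underlying crime rates are identical.

```python
# Toy sketch: dispatching patrols in proportion to historical arrest counts
# feeds those patrols back into the same counts. All numbers are invented.
import random

random.seed(0)

# Two neighborhoods with the same true crime rate, but neighborhood A starts
# with more recorded arrests due to past over-policing.
arrests = {"A": 60, "B": 40}
TRUE_CRIME_RATE = 0.1          # identical in both neighborhoods
PATROLS_PER_DAY = 20

for day in range(365):
    total = sum(arrests.values())
    for hood, past in arrests.items():
        # Patrols allocated proportionally to historical arrests.
        patrols = round(PATROLS_PER_DAY * past / total)
        # More patrols -> more chances to observe (and record) a crime.
        arrests[hood] += sum(random.random() < TRUE_CRIME_RATE
                             for _ in range(patrols))

print(arrests)  # the recorded gap persists and grows even though true rates are equal
```

The point isn't the exact numbers; it's that "the data" the software learns from is already a product of where police were sent before.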
The problems with the approaches that tend to get used should cause absolute outrage. They’re the kind that should get anyone laughed off of any college campus.
The problem is that they lend a semblance of scientific justification to confirm the biases of both police departments and many voters. Politicians look to statisticians and scientists to tell them why they’re right, not why they’re wrong.
That’s why it’s so important for these kinds of issues to make the front pages.
It's great how statistics can be used to support basically anything the author wants them to. Identifying initial biases in the data is super important, as is verifying the statistics independently.
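As a concrete illustration of that point, with numbers invented for the example: a single aggregate accuracy figure can look great while the false positive rate, the error that actually gets people wrongly arrested, differs wildly between groups.

```python
# Illustrative only: the counts below are made up to show how one aggregate
# accuracy number can hide very different error rates per group.
results = {
    "group_1": {"tp": 950, "fp": 5,  "tn": 9000, "fn": 45},
    "group_2": {"tp": 800, "fp": 90, "tn": 2000, "fn": 110},
}

overall = {k: sum(g[k] for g in results.values()) for k in ("tp", "fp", "tn", "fn")}
accuracy = (overall["tp"] + overall["tn"]) / sum(overall.values())
print(f"aggregate accuracy: {accuracy:.1%}")        # looks fine on its own (~98%)

for name, g in results.items():
    fpr = g["fp"] / (g["fp"] + g["tn"])             # false positives are what
    print(f"{name} false positive rate: {fpr:.2%}") # lead to wrongful arrests
```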
100% !?
I think that facial recognition software is a bit biased.
Developer here. Working as intended.
*issue resolved
Works on my machine
I do not see a bias here. It did not assume that the criminal is black by default or anything like that; it simply works much worse for black people.
There could be different reasons for that. For example, it could just be bad at recognizing black faces in poor lighting conditions.
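Those two explanations, lighting physics versus a model that is simply worse on dark skin, can be told apart if errors are broken down by both factors at once. A rough sketch of that kind of audit, with a hypothetical file and column names (real evaluations, e.g. NIST's face recognition vendor tests, publish error rates broken down by demographics):

```python
# Sketch of a breakdown that separates "it's just the lighting" from
# "the model is worse on dark skin regardless of lighting".
# The CSV file and its columns (skin_tone, lighting, correct) are hypothetical.
import csv
from collections import defaultdict

counts = defaultdict(lambda: {"errors": 0, "total": 0})

with open("match_results.csv", newline="") as f:
    for row in csv.DictReader(f):
        key = (row["skin_tone"], row["lighting"])
        counts[key]["total"] += 1
        counts[key]["errors"] += row["correct"] == "false"

for (tone, light), c in sorted(counts.items()):
    rate = c["errors"] / c["total"]
    print(f"{tone:>6} / {light:<6}: {rate:.2%} error rate over {c['total']} trials")

# If the gap between skin tones persists under good lighting, "bad light"
# alone doesn't explain it.
```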
This is a systemic bias; in this case, systemic racism.
The outcome is that a product or service disproportionately targets Black people. It wasn’t designed to do that, so it’s not overt racism; it just worked out that way.
Camera systems inherently have a harder time with dark skin. That’s a fact. However, it’s been found time and time again that these systems are predominantly created by and tested on light-skinned individuals, so the bias is built into the flawed creation. You can see this in Hollywood, where lighting has only recently been set up to properly light dark skin, on shows with majority-black casts and showrunners like Atlanta and Insecure.
Could you please point out where it disproportionately targets black people? Does it recognize black people instead of white people? That would be racism.
If it just matches completely wrong faces for black people, that is shoddy quality which, by the way, works in favour of black criminals.
I’m speaking generally, not about this system specifically. It is probably like every other camera-based system that struggles more with dark skin than with light skin. Even things like automatic sink sensors in public bathrooms have failed in this way: https://gizmodo.com/why-cant-this-soap-dispenser-identify-dark-skin-1797931773
Yes, I know. And I agree that when this happens under conditions where a human could do the job well, it is a bias. However, when it happens under conditions where a human couldn't do the job either, it could just be physics, like dark things being harder to see in the dark.
And in the case being discussed, it is not clear what the reason was.
It is a matter of interpretation as well: one could just as easily say "the system helps black criminals avoid being arrested".
For me, this false recognition statistic is an alarming signal that the system works badly and deeper analysis must be done; in the meantime, police must be more careful in dealing with its results.
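For what that deeper analysis could start with: a plain two-proportion z-test on false match counts per group says whether the disparity is larger than chance alone would produce. The counts below are made up for illustration.

```python
# First pass at the "deeper analysis": is the difference in false match rates
# between two groups larger than chance would explain? Counts are invented.
from math import sqrt, erf

def two_proportion_z(errors_a, n_a, errors_b, n_b):
    p_a, p_b = errors_a / n_a, errors_b / n_b
    pooled = (errors_a + errors_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical audit counts: false matches out of total searches per group.
z, p = two_proportion_z(errors_a=48, n_a=1200, errors_b=9, n_b=1100)
print(f"z = {z:.2f}, p = {p:.6f}")
# A tiny p-value means "it works equally badly for everyone" is not a good
# explanation of the observed disparity.
```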
The outcome of the bad technology and policing is disproportionately affecting dark-skinned people. That’s where it becomes systemic racism. No one decided to design a system to arrest more black people; the outcome of various factors just ended up that way. Sometimes it’s simply a consequence of nature, but most of the time there are clear reasons, like a lack of representation in design and testing, that would have caught the problems earlier.
Where did you read about "arresting more black people"? They say it points to the wrong people when a criminal has black skin. You could also describe it as "helping black criminals hide themselves".
I'm absolutely with you on being against racism and other discrimination, but in this exact case racism and bias are not that relevant. Overusing a term like "racism" weakens it; people start to consider it a minor thing, associating "racism" with non-ideal camera settings, which is dangerous.
Arresting more innocent black or darker-skinned people is what I meant.
I’m not overusing the term; you’re conflating two types of racism and need to understand the context in order to understand what others are speaking about. If you just assume everyone is talking about overt racism all the time, you’re going to jump to the wrong conclusions and probably think people are being dramatic or ridiculous half the time.
Anyone in the photography industry will tell you yes
A similar thing has happened here in the Netherlands. Algorithms were used to detect fraud, but they had a discriminatory bias and falsely accused thousands of parents of child benefits fraud. Those parents ran into huge financial problems because they had to pay back the allowances; many even had their children taken away and to this day haven't gotten them back.
The Third Rutte Cabinet did resign over the scandal, but many of those politicians came back in other positions, including Prime Minister Rutte, because that's somehow allowed.
Wikipedia (English): https://en.m.wikipedia.org/wiki/Dutch_childcare_benefits_scandal