this post was submitted on 10 Aug 2023
99 points (100.0% liked)

Technology

Summary

  • A Detroit woman was wrongly arrested for carjacking and robbery due to a facial recognition technology error.
  • Porcha Woodruff, eight months pregnant, was mistakenly identified as the culprit based on an outdated 2015 mug shot.
  • Surveillance footage did not match the identification; the victim then wrongly picked Woodruff out of a lineup built around the outdated 2015 photo.
  • Woodruff was arrested and detained for 11 hours; the charges were later dismissed, and she has filed a lawsuit against the city of Detroit.
  • The case highlights facial recognition technology's documented weakness at identifying women and people with dark skin.
  • Several US cities have banned facial recognition, but debate continues amid lobbying and crime concerns.
  • Law enforcement prioritized the technology's output over contradicting visual evidence, raising questions about how it is integrated into investigations.
  • The ACLU of Michigan is involved; the outcome of the lawsuit, and its impact on law enforcement's use of the technology, remain uncertain.
[–] ConsciousCode@beehaw.org 8 points 1 year ago

Facial recognition should only ever be a clue, never evidence. It should carry the same weight as eyewitness testimony, because the algorithms will always inherit biases from their training data. Otherwise, we risk lawyers saying things like "the algorithm gives a 99% confidence this is you" and the jury taking that as some objective measure. Meanwhile, if the algorithm's training set is only 1% BIPOC, it can confidently label many of those people as being the same person.
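The mechanism behind that last point can be sketched with a toy simulation (all numbers here are invented for illustration, not drawn from any real system): a model trained on few examples of a group separates that group's identities poorly, so different people land close together in embedding space, and comparisons between two *different* people clear the match threshold far more often.

```python
import random

random.seed(42)

THRESHOLD = 0.9  # similarity above this counts as a "match" (assumed value)

def false_match_rate(identity_spread, trials=10_000):
    """Compare toy embeddings of two DIFFERENT people and count how often
    their similarity exceeds the match threshold. `identity_spread` models
    how well the encoder separates identities in a group: a small spread
    means distinct people land near each other in embedding space."""
    false_matches = 0
    for _ in range(trials):
        # one-dimensional stand-in for an embedding vector per person
        a = random.gauss(0, identity_spread)
        b = random.gauss(0, identity_spread)
        # toy similarity: 1.0 when embeddings coincide, decays with distance
        similarity = 1 / (1 + abs(a - b))
        if similarity > THRESHOLD:
            false_matches += 1
    return false_matches / trials

# purely illustrative spreads for a well- vs under-represented group
well_represented = false_match_rate(identity_spread=1.0)
under_represented = false_match_rate(identity_spread=0.1)
print(f"false-match rate, well represented:  {well_represented:.1%}")
print(f"false-match rate, under-represented: {under_represented:.1%}")
```

Same threshold, same matching rule, wildly different false-match rates; a "99% confidence" score from a system like this says as much about the training data as about the suspect.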

Reminds me of the movie Anon, with this jaw-dropping quote at the end: "It's not that I have something to hide. I have nothing I want to show you."