this post was submitted on 28 Mar 2024
622 points (97.1% liked)


Israel has deployed a mass facial recognition program in the Gaza Strip, creating a database of Palestinians without their knowledge or consent, The New York Times reports. The program, which was created after the October 7th attacks, uses technology from Google Photos as well as a custom tool built by the Tel Aviv-based company Corsight to identify people affiliated with Hamas.

irotsoma@lemmy.world 3 points 7 months ago

It doesn't work very well for point 1, though. The tech itself is fine, but it's presented to users as being far more accurate than it actually is; that's a marketing problem rather than a technical one. Second, the tech is not as good at recognizing non-white faces. There are simply more pictures of white people to train on, since white people have historically had more access to photography, among other reasons. And the models behind most of this tech were built to favor facial traits that vary the most among white people.

So for those groups, genuine high-confidence matches are much rarer, which means the highest-scoring match the system returns is far more likely to be wrong: bad matches bubble to the top and get accepted as real. And uses like this take a "better safe than sorry" stance where they aren't sorry about killing the wrong person, only about missing the right one. So they're just as happy killing many people who are possible matches as they are killing the one person who actually is the match.
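To make that concrete, here's a minimal sketch (mine, not anything from the article or the deployed systems) of why a fixed "accept the top-ranked match" policy fails more often for groups the model represents poorly. Every distribution, threshold, and gallery size here is hypothetical:

```python
# Hypothetical illustration: when a model scores genuine matches lower
# for an under-represented group, the best *impostor* in a large gallery
# outranks the true identity far more often at the same threshold.
import numpy as np

rng = np.random.default_rng(0)
GALLERY = 10_000      # hypothetical number of enrolled identities
THRESHOLD = 0.75      # hypothetical similarity cutoff for "accept"

def wrong_top_match_rate(genuine_mean: float, trials: int = 5_000) -> float:
    """Fraction of searches where the top-ranked match clears the
    threshold but is NOT the true identity."""
    errors = 0
    for _ in range(trials):
        # Similarity of the probe face to its true enrolled identity.
        genuine = rng.normal(genuine_mean, 0.08)
        # Similarities of the probe to everyone else in the gallery.
        impostors = rng.normal(0.5, 0.08, GALLERY - 1)
        best_impostor = impostors.max()
        # The wrong person ranks first AND still clears the cutoff.
        if best_impostor > genuine and best_impostor > THRESHOLD:
            errors += 1
    return errors / trials

# Well-represented group: genuine matches score high on average.
print("well-represented:", wrong_top_match_rate(0.90))
# Under-represented group: same threshold, but genuine scores sag,
# so impostors bubble to the top and get accepted as real.
print("under-represented:", wrong_top_match_rate(0.70))
```

With these made-up numbers, the accepted-but-wrong rate goes from around one in ten searches to the large majority of them, and nothing changed except how well the model scores genuine matches for that group; the threshold and the gallery stayed the same.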