this post was submitted on 08 Dec 2024
211 points (96.9% liked)

[–] lepinkainen@lemmy.world 3 points 2 weeks ago (1 children)

Yep, it's a legal "think of the children" requirement. They've been doing CSAM scanning for decades already and nobody cared.

When Apple built a system that required MULTIPLE HUMAN-VERIFIED matches of actual CSAM before even a hint would be sent to the authorities, it was somehow a slippery slope to a surveillance state.

The stupidest ones were the people who went "a-ha! I can create a false match with this utter gibberish image!". Yes, you can do that. All you've done is inconvenience a human checker for 3 seconds, and only after the threshold of locally matched images has already been reached. Nobody would EVER have gotten swatted by your false matches.
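
To make the mechanics concrete, here's a rough sketch of that kind of threshold gating (plain Python, purely illustrative; the names, functions, and the exact threshold value are mine, not Apple's actual NeuralHash/PSI code):

```python
from dataclasses import dataclass, field

# Hypothetical threshold; Apple's announced figure was on the order of 30 matches.
REVIEW_THRESHOLD = 30

@dataclass
class Account:
    matched_hashes: set[str] = field(default_factory=set)

def register_match(account: Account, image_hash: str, known_hashes: set[str]) -> None:
    """Record a perceptual-hash match against the known-CSAM hash list."""
    if image_hash in known_hashes:
        account.matched_hashes.add(image_hash)

def needs_human_review(account: Account) -> bool:
    """Nothing is surfaced to a human until the match count crosses the threshold."""
    return len(account.matched_hashes) >= REVIEW_THRESHOLD

def should_report(account: Account, reviewer_confirms_csam: bool) -> bool:
    """A report goes to the authorities only after a human verifies the matched images."""
    return needs_human_review(account) and reviewer_confirms_csam

# A single forged collision (a gibberish image that happens to match one hash)
# bumps the counter by one and does nothing else: no review, no report.
```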

Can people say the same about Google? People get accounts taken down by "AI" or "machine learning" crap with zero recourse, and that's not a surveillance state?

[–] Petter1@lemm.ee 3 points 2 weeks ago

😅 Why do we get downvoted?

I guess somebody doesn't like reality 💁🏻