this post was submitted on 18 Sep 2023
I know that scanning images for CSAM is kind of dystopian and scary. However, that doesn't mean we need to leave ourselves open to abusive material being sent to us.

What I think we need is some publicly available ML models that can be run on each device voluntarily to block CSAM from being shown or stored.

Publicly available models would help, but implementing them could be a slippery slope. If popular encrypted messaging apps start building this feature in, it's possible it will become illegal to turn it off or to use versions of the app with the scanner removed. That would mean we'd effectively be stuck with a bad egg in our code.
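As a rough sketch of what a voluntary, on-device filter hook might look like in a messaging client: everything here is hypothetical (the names, the `classify_image` function, the threshold), and a stub stands in for a real locally run model so the example executes.

```python
from dataclasses import dataclass


@dataclass
class ScanResult:
    flagged: bool
    confidence: float


def classify_image(image_bytes: bytes) -> ScanResult:
    # Placeholder for a publicly available model run locally on the
    # device; a real implementation would do inference here. The stub
    # never flags anything.
    return ScanResult(flagged=False, confidence=0.0)


def should_display(image_bytes: bytes,
                   scanning_enabled: bool,
                   threshold: float = 0.9) -> bool:
    """Gate an incoming image before it is shown or stored.

    The key property of the proposal is that scanning is voluntary:
    the user can disable it entirely, and nothing leaves the device.
    """
    if not scanning_enabled:
        return True
    result = classify_image(image_bytes)
    return not (result.flagged and result.confidence >= threshold)
```

The point of the sketch is the shape, not the model: the decision happens purely client-side, the user holds the on/off switch, and no scan result is reported anywhere.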

Maybe the best answer is to not give individuals with a questionable history the ability to message you.

Does anyone else have a thought?

[–] ono@lemmy.ca 24 points 1 year ago (2 children)

The point of CSAM scanners is not to protect children, but to circumvent due process by expanding warrantless surveillance. That is antithetical to FOSS.

So, in a word, no.