
Titled “Enhanced Visual Search,” this toggle permits iPhones to transmit photo data to Apple by default, raising concerns about user privacy and data-sharing practices.

TaviRider@reddthat.com 2 points 3 days ago

The 1:1 matching and the porn detection were separate capabilities.

Porn detection is called Communication Safety, and it only warns the user. If the device is set up in Screen Time as a child's device, someone has to enter the parent's Screen Time passcode to bypass the warning. That's it. It's entirely local to the device: the parent isn't notified or shown the image, and Apple never receives it. Because it uses an ML model, it can produce false positives.
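To make the flow concrete, here's a minimal sketch of the decision logic described above. Everything in it (the function names, the 0.8 threshold, the stubs) is an illustrative assumption, not Apple's actual API or implementation; the point is that every step runs on-device and the only gate is the parent's Screen Time passcode.

```python
# Hypothetical sketch of an on-device Communication Safety-style flow.
# All names and the threshold are assumptions for illustration.

def nudity_score(image_bytes: bytes) -> float:
    """Stub for the on-device ML classifier; a real model can false-positive."""
    return 0.0  # placeholder

def parent_passcode_entered() -> bool:
    """Stub for the Screen Time passcode prompt."""
    return False  # placeholder

def handle_image(image_bytes: bytes, child_device: bool) -> str:
    score = nudity_score(image_bytes)
    if score < 0.8:                  # assumed decision threshold
        return "show"
    if not child_device:
        return "warn-then-show"      # the warning is purely local; nothing leaves the device
    # On a child's device, bypassing the warning requires the parent's
    # Screen Time passcode. No notification or image is sent anywhere.
    return "show" if parent_passcode_entered() else "blocked"
```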

CSAM detection was exact 1:1 matching using a privacy-preserving hashing system. It prevented users from uploading known CSAM to iCloud, and that's it. By design, Apple couldn't tell whether any given image matched, nor learn the hashes of the images being evaluated.
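A deliberately simplified sketch of what 1:1 hash matching against a known database looks like is below. Apple's actual design (a perceptual hash called NeuralHash plus a private-set-intersection protocol) was far more involved: the client never saw the database hashes in the clear, and the results of individual comparisons were hidden from both sides. The names here, and the use of sha256 as a stand-in hash, are assumptions just to keep the sketch runnable.

```python
# Simplified 1:1 hash-matching sketch; NOT Apple's real protocol.
import hashlib

# In the real system the database arrives blinded, never in the clear.
KNOWN_CSAM_HASHES: set[bytes] = set()

def perceptual_hash(image_bytes: bytes) -> bytes:
    """Stub standing in for a perceptual hash like NeuralHash.
    (sha256 is NOT perceptual; it's used here only so the sketch runs.)"""
    return hashlib.sha256(image_bytes).digest()

def should_block_upload(image_bytes: bytes) -> bool:
    """Exact 1:1 match against known hashes; near-duplicates of known
    images hash alike under a real perceptual hash, novel images don't."""
    return perceptual_hash(image_bytes) in KNOWN_CSAM_HASHES
```

Note the key property the comment describes: matching is against *known* images only, so nothing resembling general content scanning or classification happens in this path.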

Many people misunderstood and conflated the two capabilities, often claiming, without evidence, that they did things they were designed never to do. Apple eventually abandoned the CSAM detection capability.