this post was submitted on 25 Jul 2023
151 points (100.0% liked)

Fediverse


A community to talk about the Fediverse and all its related services using ActivityPub (Mastodon, Lemmy, KBin, etc.).

If you want help with moderating your own community, head over to !moderators@lemmy.world!

Learn more at these websites: Join The Fediverse Wiki, Fediverse.info, Wikipedia Page, The Federation Info (Stats), FediDB (Stats), Sub Rehab (Reddit Migration), Search Lemmy

founded 1 year ago

Not the best news in this report. We need to find ways to do more.

[–] Aesthesiaphilia@kbin.social 5 points 1 year ago (2 children)

We do know they only found, what, 112 actual images of CP? That's a very small number. I'd say that paints us in a pretty good light, relatively.

[–] dustyData@lemmy.world 6 points 1 year ago

112 images out of 325,000 scanned over two days is about 0.03%. So we are doing pretty well. With more moderation tools we could continue to knock out those sigmas.
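As a quick sanity check of the arithmetic (using the 112 and 325,000 figures as quoted in the thread), the rate works out like this:

```python
# Rate of flagged images among those scanned, per the figures quoted above.
flagged = 112
scanned = 325_000

rate_percent = flagged / scanned * 100
print(f"{rate_percent:.3f}%")  # prints "0.034%", i.e. roughly 0.03%
```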

[–] blazera@kbin.social 2 points 1 year ago (1 children)

It says 112 instances of known CSAM. But that's based on their methodology, right? And their methodology is not actually looking at the content; it's looking at hashtags and whether Google SafeSearch thinks it's explicit. Which I'm pretty sure doesn't differentiate what the subject of the explicitness is. It's just gonna try to detect breasts or genitals, I imagine.

Though they do give a few damning examples of things like actual CP trading, they also note that those have been removed.

[–] Rivalarrival 7 points 1 year ago

How many of those 112 instances are honeypots controlled by the FBI or another law enforcement agency?