this post was submitted on 25 Jul 2023
151 points (100.0% liked)


Not the best news in this report. We need to find ways to do more.

[–] blazera@kbin.social 7 points 1 year ago (2 children)

Basically we don't know what they found, because they just looked up hashtags, and then didn't look at the results for ethics reasons. They don't even say what hashtags they looked through.
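To make concrete what "looked up hashtags without looking at the results" means in practice, here's a minimal sketch. The watchlist tags are placeholders I made up, since the report doesn't publish the hashtags it actually searched:

```python
# Minimal sketch of hashtag-only scanning: a post is flagged purely on
# tag matches, and the attached media is never opened or viewed.
# WATCHLIST is a made-up placeholder; the report does not disclose
# the actual hashtags it searched for.
WATCHLIST = {"#hypothetical_tag_1", "#hypothetical_tag_2"}

def flag_post(post: dict) -> bool:
    """Return True if any hashtag on the post is on the watchlist."""
    tags = {t.lower() for t in post.get("hashtags", [])}
    return bool(tags & WATCHLIST)

posts = [
    {"id": 1, "hashtags": ["#hypothetical_tag_1", "#cats"]},
    {"id": 2, "hashtags": ["#dogs"]},
]
print([p["id"] for p in posts if flag_post(p)])  # [1] -- media itself never inspected
```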

[–] Aesthesiaphilia@kbin.social 5 points 1 year ago (2 children)

We do know they only found, what, 112 actual images of CP? That's a very small number. I'd say that paints us in a pretty good light, relatively.

[–] dustyData@lemmy.world 6 points 1 year ago

112 images out of 325,000 images scanned over two days is about 0.03%, so we are doing pretty well. With more moderation tools we could push that number down even further.
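A quick sanity check on that arithmetic, using only the two figures quoted above:

```python
# Sanity check on the figures quoted above.
matches = 112
scanned = 325_000
print(f"{matches / scanned:.4%}")  # 0.0345% -- roughly the 0.03% cited
```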

[–] blazera@kbin.social 2 points 1 year ago (1 children)

It says 112 instances of known CSAM. But that's based on their methodology, right, and their methodology is not actually looking at the content; it's looking at hashtags and whether Google SafeSearch thinks it's explicit. Which I'm pretty sure doesn't differentiate what the subject of the explicitness is. It's just going to try to detect breasts or genitals, I imagine.
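For context, here is roughly what such a SafeSearch check looks like with the Google Cloud Vision API (a sketch under my own assumptions, not the report's actual pipeline): the response carries per-image likelihoods for categories like "adult" and "racy", and nothing about the age of the person depicted, which is exactly the gap being described.

```python
# Sketch of an explicitness check via Google Cloud Vision SafeSearch
# (pip install google-cloud-vision; requires GCP credentials).
# The annotation exposes category likelihoods only -- there is no
# field indicating the age of anyone in the image.
from google.cloud import vision

def is_explicit(image_bytes: bytes) -> bool:
    client = vision.ImageAnnotatorClient()
    image = vision.Image(content=image_bytes)
    annotation = client.safe_search_detection(image=image).safe_search_annotation
    likely = (vision.Likelihood.LIKELY, vision.Likelihood.VERY_LIKELY)
    return annotation.adult in likely or annotation.racy in likely
```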

Though they do give a few damning examples of things like actual CP trading, they also note that those have since been removed.

[–] Rivalarrival 7 points 1 year ago

How many of those 112 instances are honeypots controlled by the FBI or another law enforcement agency?

[–] bandario@lemmy.dbzer0.com 2 points 1 year ago (3 children)

There are communities on NSFW Lemmy where people intentionally present as children engaged in sexual abuse scenarios. They're adults and this is their kink that everyone is supposed to tolerate and pretend is ok.

See the defederation drama over the last couple of days. What I'm saying is, the hashtags mean nothing.

[–] LexiconDexicon@lemmy.world 10 points 1 year ago

"They're adults"

then what's the problem?

[–] blazera@kbin.social 6 points 1 year ago

No, that admin lied about the community

[–] hightrix@kbin.social 5 points 1 year ago (1 children)

"There are communities on NSFW Lemmy where people intentionally present as children engaged in sexual abuse scenarios."

If you are referring to the community that was cited as the reason for defederation, this is completely false. The community in question is adorableporn, extremely similar to the subreddit of the same name. No one, in any manner, in either community, presents as a child. While yes, the women that post there tend to be on the shorter and thinner side, calling short, thin adults 'children' is not being honest.

To be clear, this community is about petite women. This community is NOT about women with a kink to present as a child.

[–] bandario@lemmy.dbzer0.com 1 points 1 year ago (1 children)

And what of all the other bait communities? Come on. It's not ok.

[–] hightrix@kbin.social 3 points 1 year ago

What other bait communities? We can't just accept "think of the children" as an excuse. That doesn't work.

Yes, no one wants actual CSAM to show up in their feed; we can all completely agree on that. But just because some middle-aged woman can't tell the difference between a 20-year-old and a 15-year-old doesn't make images of the 20-year-old CSAM.