this post was submitted on 23 Jul 2023
601 points (99.5% liked)

Lemmy NSFW

11868 readers
261 users here now

Updates about lemmynsfw.com

founded 1 year ago

One of the admins at lemmy.blahaj.zone asked us to purge a community and all of its users because they thought it was full of child sexual abuse material, aka CSAM, fka kiddy porn. We assured them that we had checked this comm thoroughly and we were satisfied that all of the models on it were of age.

The admin then demanded we purge the comm because they mistook it for CSAM, and claimed that the entire point of the community was to make people think it was CSAM. We vehemently disagreed that that was in fact the point of the community, but they decided to defederate from us anyway. That is of course their choice, but we will not purge our communities or users because someone else makes a mistake of fact, and then lays the responsibility for their mistake at our feet.

If someone made a community intended to fool people into thinking it was kiddy porn, that would be a real problem. If someone of age goes online and pretends -- not roleplays, but pretends with intent to deceive -- to be a child and makes porn, that is a real problem. Nobody here is doing that.

One of the reasons we run our instance the way that we do is that we want it to be inclusive. We don't body shame, and we believe that all adults have a right to sexual expression. That means no adult on our instance is too thin, fat, bald, masculine, old, young, cis, gay, etc., to be sexy, and that includes adults that look younger than some people think they should. Everyone has a right to lust and to be lusted after. There's no way to draw a line that says "you can't like adult people that look like X" without crossing a line that we will not cross.

EDIT: OK, closing this post to new comments. Everything that needs saying has been said. Link to my convo with the blahaj admin here.

[–] KairuByte@lemmy.world 37 points 1 year ago (2 children)

They decided that because they mistook it for CSAM, it should be taken down, and the entire community with it.

Because they assumed one image was CSAM.

It’s kinda nuts.

[–] anaisrim@mastodon.social 32 points 1 year ago (1 children)

@KairuByte @Shit

After reading Ada's post and the comments there (those that weren't removed; many were), it's my opinion that the admins there wanted to defederate from lemmynsfw anyway and this was a convenient excuse.

Regardless, it's their server, and many users there support the decision. That's their right, and if the userbase wants that, they've chosen the right instance for them. Those who don't want that outcome will move.

[–] KairuByte@lemmy.world 23 points 1 year ago (3 children)

I’m inclined to agree with you. Though I’ll argue that most users over there are agreeing based on a colorful interpretation of what happened, assuming that there is indeed a community based around legal porn meant to look like CSAM… which doesn’t appear to be the case at all. Look at the community in question (!adorableporn@lemmynsfw.com) and you’ll notice a lack of anything encouraging people to present as underage.

[–] twelves@lemmynsfw.com 17 points 1 year ago

Adults can be, and often are, adorable.

[–] anaisrim@mastodon.social 8 points 1 year ago (2 children)

@KairuByte

I've seen it. IMO the side panel explicitly says "childlike," and I can see why some might have a problem with that. It suggests the purpose Ada objected to over there, and I think the mods and admins might want to change that text so that it makes no reference to anyone underage in the context of porn.

Clearly, all participants are over 18. Good.

[–] b9999998@lemmynsfw.com 9 points 1 year ago* (last edited 1 year ago)

I'm the new mod of a few hours at !adorableporn@lemmynsfw.com, and I have nothing to hide.

See the pinned post https://lemmynsfw.com/post/419923 and its comments; I was trying to update the community sidebar from a single line to what you see there now and get feedback. I even have an "under construction" disclaimer.

Btw, the reference came from a cut-and-paste from here: https://www.vocabulary.com/dictionary/adorable

[–] twelves@lemmynsfw.com 6 points 1 year ago (2 children)

Did they change it? I don't see what you're saying is there.

[–] Asslaion@lemmynsfw.com 13 points 1 year ago

Just went there; the sticky post has a comment suggesting they remove it, and the moderator said they removed the phrase. That one actually was problematic to leave there. Apparently they are changing the rules and there are new mods or something.

[–] anaisrim@mastodon.social 2 points 1 year ago (1 children)

@twelves

There's a screenshot I included. Posted from Mastodon so maybe the image didn't propagate out to lemmy?

[–] AnaisRim@lemmynsfw.com 6 points 1 year ago

Coming here from Lemmynsfw and I see my screenshot did not propagate. It's ok. I'm sure ActivityPub devs will get the unified timeline down sometime. When will everyone switch to Zot?

[–] GBU_28@lemm.ee 4 points 1 year ago* (last edited 1 year ago) (2 children)

I understand the knee-jerk reaction... Doesn't federation mean they are potentially possessing copies of that content, hosting it, by being federated?

[–] KairuByte@lemmy.world 20 points 1 year ago* (last edited 1 year ago) (2 children)

So, yes. Their instance would have copies of content viewed by their users. That said, they didn’t defederate because of CSAM, which would make perfect sense. They defederated because they made an incorrect assumption, and then wanted an entire community nuked because of that assumption… even after they were corrected.

The moment things were made clear, they should have said “oh okay, our bad.” But instead they doubled down.

[–] GBU_28@lemm.ee 4 points 1 year ago* (last edited 1 year ago) (1 children)

But if you had the anxiety and fear in your heart that boots were about to kick in your door, and, hell, that you were facilitating the consumption of CSAM, would a few DMs really put you at ease?

Empathetically assume they had already accepted the worst was occurring; I believe it would be very hard to adjust course and sleep at night.

[–] KairuByte@lemmy.world 4 points 1 year ago* (last edited 1 year ago) (2 children)

Honestly, I don’t know how the law would handle this kind of situation. But in my mind, the only time you’re in legal hot water is when (a) there is actual CSAM involved, and (b) nothing is done to prevent that association.

In this case, (a) was proven to be false. So there’s no concern. But if it had been the case, then defederation makes sense.

Otherwise, there’s no reason to federate at all. Anyone can post CSAM on any instance at any time. There’s nothing in place to detect it, nothing in place to handle it other than manual moderation. That’s just a hard fact of lemmy instance hosting.
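To make concrete what "nothing in place to detect it" means: large centralized platforms typically compare uploads against lists of known-bad image hashes. Here's a minimal, purely hypothetical sketch of that idea in Python — Lemmy has no such built-in hook, and real systems (e.g. PhotoDNA) use perceptual hashing so re-encoded copies still match, not the exact digests shown here:

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known-bad images, as might
# be supplied by a reporting body. Exact-digest matching is the simplest
# possible version of this idea; any re-encode of an image defeats it,
# which is why production systems use perceptual hashes instead.
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def should_block(image_bytes: bytes) -> bool:
    """Reject an upload whose digest appears on the blocklist."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

# b"test" happens to hash to the sample digest above.
print(should_block(b"test"))   # True: digest is on the list
print(should_block(b"other"))  # False: unknown content passes through
```

The point of the sketch is the gap it highlights: without a hook like this wired into the upload path, everything falls to manual moderation after the fact.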

[–] GBU_28@lemm.ee 2 points 1 year ago (1 children)

I enjoy reading and commenting here, but this is my back-of-mind fear for federated spaces like Lemmy.

Bad actors could spam suspicious or actual csam.

All it takes is one admin/hoster to be "made example of" to really shake the system.

I hope I'm wrong and ignorant of the realities of the law / prosecution.

[–] KairuByte@lemmy.world 10 points 1 year ago

Note: I deleted my comment by mistake. X.x

So I think most of the time we would be in the clear, as long as actual CSAM is handled when it is found/reported.

Just like Reddit doesn’t get hauled to court when CSAM is posted. And mods don’t get arrested for viewing it while they are removing it.

[–] MikeyMongol@lemmynsfw.com 11 points 1 year ago (1 children)

We are extremely aware of this possibility and have taken many active steps against it, and we are scrupulously staying on the right side of US law when it comes to reporting potential CSAM. As stated in our FAQ, preventing CSAM on our instance is our highest priority.

[–] GBU_28@lemm.ee 2 points 1 year ago

Appreciate the verbiage, but I wasn't calling lemmynsfw out; I was commenting on the whole big picture.

[–] assqrw@lemmynsfw.com 5 points 1 year ago

Doesn't federation mean they are potentially possessing copies of that content, hosting it, by being federated?

Media isn't replicated. Lemmy is a link aggregator: posts include links to content, and the links are what get replicated. The fact that instances let you upload images directly makes that a bit confusing, but if you look at a post from one instance on another, the post's link still points to the image on the original instance and is fetched from there. The only local media from federated instances are the thumbnails, which are generated and stored locally. That's still a problem in the case of some illegal content, but less so.
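The distinction can be sketched in a few lines of Python. The field names below are illustrative, loosely modeled on a Lemmy post object rather than the exact schema: the replicated post carries a URL pointing back at the origin instance, and the only media the viewing instance holds itself is its locally generated thumbnail.

```python
from urllib.parse import urlparse

# A federated post as another instance might store it (illustrative
# fields only, not the exact Lemmy schema).
post = {
    "ap_id": "https://lemmynsfw.com/post/419923",         # canonical object on the origin
    "url": "https://lemmynsfw.com/pictrs/image/abc.jpg",  # full-size media stays on the origin
    "thumbnail_url": "https://lemmy.world/pictrs/image/thumb-abc.jpg",  # local thumbnail
}

def media_host(p: dict) -> str:
    """Host that actually serves the full-size media for a post."""
    return urlparse(p["url"]).hostname

def locally_stored(p: dict, local_host: str):
    """Return the only media this instance holds itself: its thumbnail, if any."""
    thumb = p.get("thumbnail_url")
    if thumb and urlparse(thumb).hostname == local_host:
        return thumb
    return None

print(media_host(post))                     # lemmynsfw.com — not the viewer's instance
print(locally_stored(post, "lemmy.world"))  # only the generated thumbnail
```

So a federating instance ends up holding thumbnails, not the full-size media — which is why it's a smaller problem than outright hosting, though not zero.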