Hey folks!
I made a short post last night explaining why image uploads had been disabled. That was the middle of the night for me, so I did not have time to go into much detail, but I'm writing a more detailed post now to clear up where we are and where we plan to go.
What's the problem?
As shared by the lemmy.world team, over the past few days, some people have been spamming one of their communities with CSAM images. Lemmy has been attacked in various ways before, but this is clearly on a whole new level of depravity, as it's first and foremost an attack on actual victims of child abuse, in addition to being an attack on the users and admins on Lemmy.
What's the solution?
I am putting together a plan, both for the short term and for the longer term, to combat such content and prevent it from ever reaching lemm.ee servers.
For the immediate future, I am taking the following steps:
1) Image uploads are completely disabled for all users
This is a drastic measure, and I am aware that it's the opposite of what many of our users have been hoping, but at the moment, we simply don't have the necessary tools to safely handle uploaded images.
2) All images which have federated in from other instances will be deleted from our servers, without any exception
At this point, we have millions of such images, and I am planning to just indiscriminately purge all of them. Posts from other instances will not be broken after the deletion; the deleted images will simply be loaded directly from their origin instances instead.
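Mechanically, the purge will most likely be a small script. The sketch below assumes pict-rs (the image server Lemmy uses) with its internal purge endpoint enabled, plus a pre-generated aliases.txt listing the cached federated images; the endpoint path and token header are from recent pict-rs versions and may differ on other setups:

```rust
// Rough sketch: ask pict-rs to purge each cached federated image.
// Assumes pict-rs's internal API is reachable on localhost:8080 and
// that aliases.txt contains one image alias per line.
use std::fs;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_token = std::env::var("PICTRS_API_TOKEN")?;
    let client = reqwest::blocking::Client::new();

    for alias in fs::read_to_string("aliases.txt")?.lines() {
        let resp = client
            .post(format!("http://127.0.0.1:8080/internal/purge?alias={alias}"))
            .header("X-Api-Token", api_token.as_str())
            .send()?;
        if !resp.status().is_success() {
            eprintln!("failed to purge {alias}: {}", resp.status());
        }
    }
    Ok(())
}
```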
3) I will apply a small patch to the Lemmy backend running on lemm.ee to prevent images from other instances from being downloaded to our servers
Lemmy has always loaded some images directly from other servers, while saving other images locally to serve directly. I am eliminating the second option for the time being, forcing all images uploaded on external instances to always be loaded from those servers. This will somewhat increase the number of servers users fetch images from when opening lemm.ee, which certainly has downsides, but I believe this is preferable to opening up our servers to potentially illegal content.
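Conceptually the patch is tiny: wherever the backend decides between caching a remote image locally and passing the original URL through, it will now always take the pass-through branch. Illustratively (these names are mine, not the actual Lemmy internals):

```rust
// Illustrative only; function names don't match Lemmy's real code.
// Before the patch, some remote images were downloaded into pict-rs
// and served from lemm.ee; after it, we always keep the origin URL.
fn resolve_image_url(remote_url: &str) -> String {
    // Previous behavior, roughly:
    // if should_cache(remote_url) {
    //     return cache_locally(remote_url); // downloads into pict-rs
    // }
    // Patched behavior: never cache, always load from the origin server.
    remote_url.to_string()
}

fn main() {
    println!("{}", resolve_image_url("https://other-instance.example/pictrs/image/abc.jpg"));
}
```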
For the longer term, I have some further ideas:
4) Invite-based registrations
I believe that one of the best ways to effectively combat spam and malicious users is to implement an invite system on Lemmy. I have wanted to work on such a system ever since I first set up this instance, but real life and other things have been getting in the way, so I haven't had a chance. However, with the current situation, I believe this feature is more important than ever, and I'm very hopeful I will be able to make time to work on it very soon.
My idea would be to grant our users a few invites, which would replenish every month if used. After that point, an invite will be required to sign up on lemm.ee. The system will keep track of the invite hierarchy, and in extreme cases (such as spambot sign-ups), inviters may be held responsible for rule-breaking users they have invited.
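To make that concrete, the data I have in mind is roughly this shape (a sketch with provisional field names, not final code):

```rust
// Sketch of the invite data model; all names here are provisional.
use std::time::SystemTime;

struct Invite {
    id: i64,
    code: String,            // token shared with the invitee
    inviter_id: i64,         // who generated this invite
    used_by: Option<i64>,    // set once someone registers with it
    created_at: SystemTime,
}

struct Person {
    id: i64,
    // Preserves the invite hierarchy, so a spam wave can be traced
    // back through the chain of inviters.
    invited_by: Option<i64>,
    invites_remaining: i32,  // replenished monthly if spent
}
```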
While this will certainly create a barrier of entry to signing up on lemm.ee, we are already one of the biggest instances, and I think at this point, such a barrier will do more good than harm.
5) Account requirements for specific activities
This is something that many admins and mods have been discussing for a while now, and I believe it would be an important feature for lemm.ee as well. Essentially, I would like to limit certain activities to users who meet specific requirements (account age, number of comments, etc.). These activities might include things like image uploads, community creation, and perhaps even private messages.
In theory, this would limit the creation of new accounts made just to break rules (or laws).
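As a sketch, with completely made-up thresholds purely for illustration, such a gate could look like this:

```rust
// Illustrative gate for privileged actions; the thresholds below are
// placeholders, not decided policy.
struct AccountStats {
    age_days: u32,
    comment_count: u32,
}

enum Activity {
    ImageUpload,
    CommunityCreation,
    PrivateMessage,
}

fn is_allowed(stats: &AccountStats, activity: &Activity) -> bool {
    match activity {
        Activity::ImageUpload => stats.age_days >= 7 && stats.comment_count >= 10,
        Activity::CommunityCreation => stats.age_days >= 14 && stats.comment_count >= 25,
        Activity::PrivateMessage => stats.age_days >= 1,
    }
}

fn main() {
    let new_account = AccountStats { age_days: 2, comment_count: 0 };
    // A brand-new account can't upload images under these placeholder rules.
    assert!(!is_allowed(&new_account, &Activity::ImageUpload));
}
```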
6) Automated ML based NSFW scanning for all uploaded images
I think it makes sense to apply automatic scanning to all images before we save them on our servers, and to reject any upload that gets flagged as NSFW. While machine learning is not 100% accurate and will produce false positives, I believe this is a trade-off that we simply need to accept at this point. Not only will this help against any potential CSAM, it will also help us better enforce our "no pornography" rule.
This would potentially also allow us to resume caching images from other instances, which will improve both performance and privacy on lemm.ee.
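The flow itself would be straightforward: score first, store only if the score clears the threshold. A rough sketch, with placeholder functions standing in for whatever model or service we end up picking:

```rust
// Upload pipeline sketch; nsfw_score() is a stand-in for an actual
// ML model, and the cutoff below would need real tuning.
const NSFW_THRESHOLD: f32 = 0.5;

#[derive(Debug)]
enum UploadError {
    FlaggedNsfw(f32),
}

// Placeholder: in reality this would call the chosen model/service.
fn nsfw_score(_image: &[u8]) -> f32 {
    0.0 // 0.0 = definitely safe, 1.0 = definitely NSFW
}

// Placeholder for the hand-off to pict-rs in the real pipeline.
fn save_to_pictrs(_image: &[u8]) {}

fn handle_upload(image_bytes: &[u8]) -> Result<(), UploadError> {
    let score = nsfw_score(image_bytes);
    if score >= NSFW_THRESHOLD {
        // Reject before the image ever touches our storage.
        return Err(UploadError::FlaggedNsfw(score));
    }
    save_to_pictrs(image_bytes);
    Ok(())
}

fn main() {
    // A clean image passes the placeholder check.
    assert!(handle_upload(&[0u8; 16]).is_ok());
}
```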
With all of the above in place, I believe we will be able to re-enable image uploads with a much higher degree of safety. Of course, most of these ideas come with some significant downsides, but please keep in mind that users posting CSAM present an existential threat to Lemmy (in addition to just being absolutely morally disgusting and actively harmful to the victims of the abuse). If the choice is between having a Lemmy instance with some restrictions, or not having a Lemmy instance at all, then I think the restrictions are the better option.
I also would appreciate your patience in this matter, as all of the long-term plans require additional development, and while this is currently a high-priority issue for all Lemmy admins, we are all still volunteers and do not have the freedom to dedicate huge amounts of time to working on new features.
As always, your feedback and thoughts are appreciated, so please feel free to leave a comment if you disagree with any of the plans or if you have any suggestions on how to improve them.
A karma system is sounding pretty good right now... /me lifts shield and ducks
Even if it's just a limited tiered system with numbers to obsess about:
Level 1 - browsing rights. Graduate to Level 2 after 5 days and a total of more than 30 minutes of logged-in activity.
Level 2 - commenting rights. Limited to 10 comments daily for 5 days. Graduate after at least 3 comments, a total upvote count > 3, and 5 days.
Level 3 - posting rights. Limited to 3 posts daily for 5 days. Unlimited commenting. Graduate after 5 days and a total upvote count > 50.
Level 4 - image posting rights. 10 images per day max. Graduate after 2 weeks and a total upvote count > 100.
Level 5 - you've made it, everyone is equal here. Entry-level users are still enjoying and growing into the community. No need to be a tool about chasing more karma/points, and the number of bots / temp accounts / total losers should be minimal with this screening in place.
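Roughly, in code terms (my numbers from above, days counted cumulatively from signup, nothing official):

```rust
// Rough encoding of the tiers above; all thresholds are just my
// suggestion, and days are counted from signup.
struct UserStats {
    days: u32,
    minutes_active: u32,
    comments: u32,
    upvotes: i64,
}

fn level(s: &UserStats) -> u8 {
    if s.days >= 29 && s.upvotes > 100 {
        5 // full member
    } else if s.days >= 15 && s.upvotes > 50 {
        4 // image posting, 10 per day
    } else if s.days >= 10 && s.upvotes > 3 && s.comments >= 3 {
        3 // posting, 3 per day
    } else if s.days >= 5 && s.minutes_active >= 30 {
        2 // commenting, 10 per day
    } else {
        1 // browsing only
    }
}

fn main() {
    let newbie = UserStats { days: 1, minutes_active: 10, comments: 0, upvotes: 0 };
    assert_eq!(level(&newbie), 1);
}
```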
Unlike the features mentioned in the OP -- all of which I support, though I regret 4's necessity -- I think this one would actually be harmful to the existing userbase because karma scores encourage pointless attention-seeking behavior, as Reddit demonstrates.
I am going to get downvoted for this, but (circlejerk baiting massively popular Reddit opinion)
I don't see how that would happen if everyone just capped out at karma=5 within a few weeks
Bots would just abuse such a karma system, wouldn't they?
It would attract the karma-farming bots that Reddit has. Any website with a privilege system makes accounts with more privileges worth more to buyers.
Yes, on a full karma point system people will buy accounts with karma in the thousands, but how much is anyone going to pay for a max score of 5 that's just there as an entry-level screening buffer? I don't imagine there would be sufficient value to justify the effort of farming these kinds of accounts.
There aren't many systems they can't abuse, but my outline involves a lot of hoops to jump through and a significant time gate, which limits rapid attacks. There are also enough steps to create opportunities for pattern detection to sniff out bots.
Lemmy's code already does it: person_aggregates keeps track of post_score and comment_score. It just isn't displayed in lemmy-ui. A bot or new code can look at these values.
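For example, something like this against the database (assuming a direct Postgres connection and that the score columns are bigint, as in recent Lemmy schemas):

```rust
// Read a user's aggregate scores straight from lemmy's database.
use tokio_postgres::NoTls;

#[tokio::main]
async fn main() -> Result<(), tokio_postgres::Error> {
    let (client, connection) =
        tokio_postgres::connect("host=localhost user=lemmy dbname=lemmy", NoTls).await?;
    // The connection object drives the socket and must be polled.
    tokio::spawn(connection);

    let row = client
        .query_one(
            "SELECT post_score, comment_score FROM person_aggregates WHERE person_id = $1",
            &[&42_i32], // hypothetical person_id
        )
        .await?;
    let post_score: i64 = row.get("post_score");
    let comment_score: i64 = row.get("comment_score");
    println!("post_score={post_score}, comment_score={comment_score}");
    Ok(())
}
```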
Welllll, my favorite Discord server does have a Recently Joined / New Member role. You need to have posted 100 text messages and been there for 3 days before you can post images and access the spicier, more sarcastic chats.
I think it would fit Lemmy well. Seems reasonable to lurk around a bit at first before dumping a bunch of pics onto servers right when you sign up.