
We have paused all crawling as of Feb 6th, 2025 until we implement robots.txt support. Stats will not update during this period.

[–] Semi_Hemi_Demigod@lemmy.world 57 points 1 month ago (4 children)

Robots.txt is a lot like email in that it was built for a far simpler time.

It would be better if the server could detect bots and send them down a rabbit hole rather than trusting randos to abide by the rules.

It was built for the living, free internet.

For all its dark corners, it was better than what we have now.

[–] swizzlestick@lemmy.zip 27 points 1 month ago (3 children)

It would be better if the server could detect bots and send them down a rabbit hole

Already possible: Nepenthes.
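
Nepenthes is one implementation of that rabbit hole. As a rough sketch of the general technique (not Nepenthes itself; the port, delay, and link scheme here are made up), a tarpit just serves an endless, procedurally generated maze of links and drip-feeds each response:

```python
import hashlib
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

class Tarpit(BaseHTTPRequestHandler):
    def do_GET(self):
        # Child links are derived deterministically from the current path,
        # so the maze is infinite but consistent, with nothing stored server-side.
        seed = hashlib.sha256(self.path.encode()).hexdigest()
        links = "".join(
            f'<a href="{self.path.rstrip("/")}/{seed[i:i + 8]}">{seed[i:i + 8]}</a><br>'
            for i in range(0, 40, 8)
        )
        body = f"<html><body>{links}</body></html>".encode()
        time.sleep(2)  # drip-feed the response to waste the crawler's time
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("", 8080), Tarpit).serve_forever()
```

A crawler that ignores robots.txt and follows every link will wander this forever, while the hashing keeps memory use flat no matter how deep it goes.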

[–] Semi_Hemi_Demigod@lemmy.world 25 points 1 month ago

ANY SITE THIS SOFTWARE IS APPLIED TO WILL LIKELY DISAPPEAR FROM ALL SEARCH RESULTS.

I’m sold

[–] Skepticpunk@lemmy.world 3 points 1 month ago
[–] merthyr1831@lemmy.ml 2 points 1 month ago

That website feels like uncovering a piece of ancient alien weaponry.

[–] poVoq@slrpnk.net 19 points 1 month ago

Because AI bots ignore robots.txt (especially when you don't explicitly name their user-agent and instead rely on a * wildcard), more and more people are implementing exactly that, and I wouldn't be surprised if that's what triggered the need to implement robots.txt support for FediDB.
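
For reference, the two forms being contrasted. GPTBot and CCBot are real AI-crawler user-agents, but as noted above, whether any given bot honors either form is not guaranteed:

```
# Wildcard rule: the form some AI crawlers reportedly ignore
User-agent: *
Disallow: /

# The same rule with crawlers named explicitly
# (the list here is illustrative, not exhaustive)
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```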

[–] jagged_circle@feddit.nl 8 points 1 month ago* (last edited 1 month ago) (2 children)

It is not possible to reliably detect bots. Attempting to do so will invariably produce false positives, and those false positives usually deny access to the most at-risk and marginalized folks.

Just implement a cache and forget about it. If read-only content is causing you too much load, you're doing something terribly wrong.
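
A minimal sketch of that approach with nginx micro-caching in front of the app server (zone name, paths, and TTLs are illustrative, and proxy_cache_path belongs in the http block). Even a few seconds of caching collapses repeated hits on the same page, human or bot, into one upstream request:

```nginx
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=microcache:10m
                 max_size=1g inactive=10m use_temp_path=off;

server {
    listen 80;

    location / {
        proxy_pass http://127.0.0.1:8080;   # your app server
        proxy_cache microcache;
        proxy_cache_valid 200 301 10s;      # cache successful responses briefly
        proxy_cache_use_stale updating error timeout;
        proxy_cache_lock on;                # one request repopulates a stale entry
        add_header X-Cache-Status $upstream_cache_status;
    }
}
```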

[–] bhamlin@lemmy.world 2 points 1 month ago (1 children)

While I agree with you, the number of robots has greatly increased of late. They're still not as numerous as users, but they hit every link and wreck your caches because they don't focus on hotspots the way humans do.

[–] jagged_circle@feddit.nl 2 points 1 month ago (1 children)

You need a bigger cache. If you don't have enough RAM, host it on a CDN.

[–] bhamlin@lemmy.world 5 points 1 month ago

Sure thing! Help me pay for it?

[–] Fredthefishlord@lemmy.blahaj.zone 0 points 1 month ago* (last edited 1 month ago) (1 children)

False positives? Meh, who cares... that's what appeals are for. Real people should realize it before too long.

[–] jagged_circle@feddit.nl 2 points 1 month ago

Every time I tried to appeal, I either got no response or met someone who had no idea what I was talking about.

Occasionally a bank fixes the problem for a week. Then it's back again.