this post was submitted on 10 Jun 2025
351 points (98.6% liked)

Memes

51860 readers
1375 users here now

Rules:

  1. Be civil and nice.
  2. Try not to excessively repost; as a rule of thumb, wait at least 2 months before reposting if you have to.

founded 6 years ago
top 15 comments
[–] AbnormalHumanBeing@lemmy.abnormalbeings.space 77 points 2 months ago (3 children)

Meanwhile, the POV bots should be getting:

(I have to set one up for my Fediverse stuff one of these days as well)

[–] wise_pancake@lemmy.ca 39 points 2 months ago (2 children)

I keep seeing this on serious sites and it makes me happy

[–] bigBananas@feddit.nl 10 points 2 months ago (3 children)

Such a weird thing that it essentially discriminates against Mozilla-based browsers though; I'd expect bots to follow the most-used approach. So yeah, this does not make me happy... although the anime girl kinda does

[–] Anafabula@discuss.tchncs.de 17 points 2 months ago

It doesn't discriminate against Mozilla-based browsers. It checks whether the User-Agent string contains "Mozilla".

For historical reasons, every browser (and every piece of software pretending to be a browser) has "Mozilla" in its User-Agent string.

This is a User-Agent string for Google Chrome on Windows 10:

Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/137.0.0.0 Safari/537.36
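
So the gate is just a string match on the User-Agent header, not browser-engine detection. A minimal sketch of that kind of check (in Go, purely illustrative; the handler, status code, and challenge step below are made up for this comment and are not Anubis's actual code):

```go
// Sketch of a "does this client claim to be a browser?" gate.
// Not Anubis's real implementation; just the idea described above.
package main

import (
	"net/http"
	"strings"
)

func maybeChallenge(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		ua := r.Header.Get("User-Agent")
		// Chrome, Firefox, Safari, Edge (and most bots imitating them)
		// all send "Mozilla/5.0 ..." -- so this matches every browser,
		// not just Firefox/Gecko ones.
		if strings.Contains(ua, "Mozilla") {
			// A real tool would serve its proof-of-work challenge page here
			// and only pass the request through once it is solved.
			w.WriteHeader(http.StatusTeapot) // placeholder for "show challenge"
			w.Write([]byte("solve the challenge first\n"))
			return
		}
		// Clients with honest non-browser User-Agents skip the challenge
		// in this sketch.
		next.ServeHTTP(w, r)
	})
}

func main() {
	backend := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("hello\n"))
	})
	http.ListenAndServe(":8080", maybeChallenge(backend))
}
```

Since every browser carries that "Mozilla/5.0" prefix, the challenge hits Chrome and Firefox users alike; it's browser-agnostic, not anti-Gecko.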
[–] nickwitha_k@lemmy.sdf.org 4 points 2 months ago

Is it blocking you? I pretty much exclusively use Gecko at this point and haven't had an issue yet.

[–] tastemyglaive@lemmy.ml 4 points 2 months ago

What's the problem with Gecko browsers exactly? The only issue I have is disabling JShelter for new domains.

[–] Dreaming_Novaling@lemmy.zip 8 points 2 months ago

At first I was getting it on some proxy services and Fediverse services, and didn't think much of it because I figured it was just something small projects used instead of Cloudflare/Google. But now I've been seeing it on more "official" websites, and after taking the time to read its GitHub page, I'm happy about it.

I especially love it since I don't have to cry over failing 30 "click the sidewalk" captchas in a row for daring to use a VPN + uBlock + LibreWolf to look at a single page of search results. I can sit on my ass for 5 seconds and breeze through, assured that I'm not a robot 🥹

[–] Link@rentadrunk.org 16 points 2 months ago (2 children)
[–] kautau@lemmy.world 28 points 2 months ago (2 children)
[–] brbposting@sh.itjust.works 15 points 2 months ago

It was created by Xe Iaso in response to Amazon's web crawler overloading their Git server, as it did not respect the robots.txt exclusion protocol and would work around restrictions.

Jeff wouldn’t do that!

[–] RVGamer06@sh.itjust.works 9 points 2 months ago

Even a Wikipedia page lmfao

[–] NotProLemmy@lemmy.ml 12 points 2 months ago* (last edited 2 months ago)

If humans can't view the page, neither can bots.

[–] sharkfucker420@lemmy.ml 6 points 2 months ago (1 children)

How do I disguise myself as a bot?

[–] goatbeard@lemm.ee 6 points 2 months ago

Don't use the front end