this post was submitted on 05 Jan 2024
35 points (97.3% liked)
Asklemmy
you are viewing a single comment's thread
No search engine is going to find a long obfuscated URL, and I don't think NC publishes a sitemap for crawlers to use.
In fact, unless you post your domain somewhere online or its registration is available somewhere, it's unlikely anyone will ever visit your server without a direct link provided by you or someone else who knows it.
You might still get discovered by IP crawlers, but even then they aren't going to trial-and-error their way to shared files, for the same reason they can't brute-force any sane SSH password.
If you use HTTPS with a publicly-trusted certificate (such as via Let's Encrypt), the host names in the certificate will be published in certificate transparency logs. So at least the "main" domain will be known, as well as any subdomains you don't hide by using wildcards.
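For the curious, here's a rough sketch of what pulling those logged hostnames can look like, using the public crt.sh search frontend. The JSON endpoint and the `name_value` field are just how crt.sh happens to expose things today, not an official API, so treat this as illustrative:

```python
# Sketch: list hostnames that have appeared in CT-logged certificates for a
# domain, via the crt.sh search frontend (unofficial interface, may change).
import json
import urllib.request

def ct_logged_names(domain: str) -> set[str]:
    # %25 is a URL-encoded '%', so the query matches subdomains of the domain
    url = f"https://crt.sh/?q=%25.{domain}&output=json"
    with urllib.request.urlopen(url, timeout=30) as resp:
        entries = json.load(resp)
    names: set[str] = set()
    for entry in entries:
        # a single certificate entry can list several newline-separated names (SANs)
        names.update(entry.get("name_value", "").splitlines())
    return names

if __name__ == "__main__":
    for name in sorted(ct_logged_names("example.com")):
        print(name)
```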
I'm not sure whether anyone uses those logs as a list of sites to automatically visit, but I certainly would not count on nobody doing so.
That just gives them the domain name though, so URLs with long randomly-generated paths should still be safe.
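To put a number on "long randomly-generated", here's a quick sketch using Python's `secrets` module (the `/s/` path layout is made up for illustration, not anything NC actually does):

```python
# Sketch: generate an unguessable share path. secrets.token_urlsafe(32) draws
# 32 random bytes (256 bits), far beyond anything brute-forceable over HTTP.
import secrets

token = secrets.token_urlsafe(32)                   # ~43 URL-safe characters
share_url = f"https://cloud.example.com/s/{token}"  # hypothetical path layout
print(share_url)

# Keyspace: 2**256 possibilities. Even at a billion guesses per second a
# scanner would need on the order of 10**60 years to enumerate them all.
print(f"{2**256:.3e} possible tokens")
```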
There is also the DNS system itself. I'm not sure whether a reverse lookup is possible in some way without a PTR record, but suffice it to say there are ways, and there are many.
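As a rough illustration of the reverse-lookup part, standard library only; whether it returns anything depends entirely on a PTR record actually being published for that address:

```python
# Sketch: reverse (PTR) lookup. This only yields a hostname if whoever
# controls the address space published a PTR record; otherwise it raises
# socket.herror and we return None.
import socket

def reverse_lookup(ip: str) -> str | None:
    try:
        hostname, _aliases, _addrs = socket.gethostbyaddr(ip)
        return hostname
    except socket.herror:
        return None

print(reverse_lookup("8.8.8.8"))    # typically 'dns.google'
print(reverse_lookup("192.0.2.1"))  # documentation range, usually None
```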
Obscurity is not security, just a reasonable first line of defense. If you run something publicly accessible, lock it down.
Stuff that can't be brute forced in a million years is a good way to do that, even if it's just a string in a URL. It's basically like having to enter a password. You could even fail2ban it by banning IPs that try a bunch of random URLs that aren't valid, or use a simple rate-limit.
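A minimal sketch of that ban-on-invalid-URLs idea, independent of fail2ban itself; the threshold and window are made-up numbers, and you'd hook `record_invalid_path` into whatever serves your 404s:

```python
# Sketch: count requests to non-existent paths per client IP inside a sliding
# window, and refuse further requests once a threshold is crossed.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 600   # look at the last 10 minutes
MAX_MISSES = 20        # invalid-path hits allowed per window

_misses: dict[str, deque[float]] = defaultdict(deque)
_banned: set[str] = set()

def record_invalid_path(client_ip: str) -> None:
    """Call this whenever a request hits a path that does not exist."""
    now = time.monotonic()
    hits = _misses[client_ip]
    hits.append(now)
    # Drop hits that have aged out of the window.
    while hits and now - hits[0] > WINDOW_SECONDS:
        hits.popleft()
    if len(hits) > MAX_MISSES:
        _banned.add(client_ip)

def is_banned(client_ip: str) -> bool:
    return client_ip in _banned
```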
Nah, I have some services running on unpublished domains and I get hit by brute-force attempts at SSH logins all the time. It might not be sane, but botnet gonna botnet.
Oh, same. Though on my current IP it hasn't happened for a couple of years now.
But finding an SSH port with an IP crawler is a lot easier than finding all the services accessible behind different paths/subdomains on port 80. And even then, mapping out a site tree all the way to uncrackable-password-length URLs is never gonna happen by brute force.