Selfhosted

41113 readers
543 users here now

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

Rules:

  1. Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.

  2. No spam posting.

  3. Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.

  4. Don't duplicate the full text of your blog or github here. Just post the link for folks to click.

  5. Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).

  6. No trolling.

Resources:

Any issues on the community? Report it using the report flag.

Questions? DM the mods!

founded 2 years ago
MODERATORS
1
 
 

First, a hardware question. I'm looking for a computer to use as a... router? Louis calls it a router, but it's a computer that sits upstream of my whole network and has two ethernet ports. Any suggestions on this? Ideal amount of RAM? Ideal processor/speed? I have fiber internet, 10 Gbps up and 10 Gbps down, so I'm willing to spend a little more on higher-bandwidth components. I'm assuming I won't need a GPU.

Anyway, has anyone had a chance to look at his guide? It's accompanied by two YouTube videos that are about 7 hours each.

I don't expect to do everything in his guide. I'd like to be able to VPN into my home network and SSH into some of my projects, use Immich, check out Plex or similar, and set up a NAS. Maybe other stuff after that but those are my main interests.

Any advice/links for a beginner are more than welcome.

Edit: thanks for all the info, lots of good stuff here. OpenWrt seems to be the most frequently recommended option, so I'm looking into that now. Unfortunately, my current router/AP (Asus AX6600) is not supported. I was hoping not to have to replace it; it was kinda pricey, and I got it when I upgraded to fiber since it can do 6.6 Gbps. I'm currently looking into devices I can put upstream of my current hardware, but I might have to bite the bullet and replace it.

Edit 2: This is looking pretty good right now.

2
 
 

Hello everyone! Mods here 😊

Tell us, what services do you selfhost? Extra points for selfhosted hardware infrastructure.

Feel free to take it as a chance to present yourself to the community!

🦎

3
4
 
 

For the new year, I’d like to learn the skills necessary to self-host. Specifically, I would like to eventually be able to self-host Nextcloud, Jellyfin, and possibly my email server too.

I have a basic understanding of Python and Kotlin. Now I'm in the process of learning Linux through a virtual machine, because I know Linux is better suited for self-hosting.

Should I stick with Python? Or is JavaScript (or maybe Ruby) better suited for that purpose? I'm more than happy to learn a new language, but I'm unsure which is better suited.

And if you could start again in your self hosting journey, what would you do differently? :)

5
 
 

In my business, I am very often not allowed to use public clouds for data-privacy reasons. This means I need to create my own cloud with all the services needed to actually work in an IT-driven DevOps environment.

Is there a comprehensive list of software that offers on-prem versions? It does not have to be open source or free.

A good example would be some Atlassian software, like Jira, which is available as an on-prem version.

6
 
 

Hey, Threadiverse! I'm looking for informed opinions on database choices.

I can stand up an Internet-facing application and have it use either MySQL or PostgreSQL. Which is the better choice, and why do you think so?

Thanks!

7
 
 

For a while now I've had an Ubuntu server where I self-host Syncthing (automatic backup of the most important files on our devices), Baïkal, MagicMirror, and a few other things for my household via Docker.

I was looking at what I have now (leftovers of a computer of mine: an AMD 2600 with 16 GB of RAM, a 1660 Super, and a 512 GB Western Digital Blue SSD). Storage-wise, at the time I decided to get several fairly cheap SSDs to have enough initial space (I made a logical volume out of three Crucial MX500 1 TB drives, 3 TB in total). Back then I thought I wanted to avoid regular HDDs at all costs (I knew people who had issues with them), but in hindsight, I've never worked with NAS drives, so my fear of these HDDs at such low usage is somewhat uncalled for.

So now I am trying to understand what I can change in this setup so I can expand later if needed, while also having a bit more space now (I have around 1.5 TB of personal data) and adding a bit more resilience in case something happens. Another goal is a 3-2-1 style backup (starting with the setup at home plus an external disk, and later a remote backup location). I will also probably decommission the SSDs for now, since I want to avoid logical volumes (something happens to one drive, and poof, all the data goes away). So my questions regarding this are:

  • For HDDs used as long-term storage, what is the usual rule of thumb? Are there any recommendations on which drives are usually better for this?
  • Considering this is going to store personal documents and photos, is RAID a must in your opinion? And if so, which configuration?
  • And if RAID is required, is Ubuntu Server good enough for this, or is something like Unraid a must?
  • I was thinking of trying to sell the 1660 Super while it still has some market value. However, I was never able to run the server completely headless. Is there a way to make this happen with an MSI Tomahawk B450, or is it only possible with an APU (such as a 5600G)?

Thanks in advance

PS: If you guys find any glaring issues with my setup and know a tip or two, please share so I can also understand this self-hosted landscape better :)

8
 
 

For context, Hoarder is a bookmarking tool, and it was selected by selfh.st as one of its favourites of 2024:

https://selfh.st/2024-favorite-new-apps/

Here is a link to the post, and it has also been copied below (with some extra lines added to fix formatting):


This post could have been about how Hoarder reached 10k stars on GitHub, or about how we spent a day on the front page of Hacker News. But unfortunately, it's about neither of those. Today, I received a cease and desist from someone holding the "Hordr" trademark, claiming that "Hoarder" infringes their trademark. Quoting the content of the letter:

In these circumstances, our client is concerned, and justifiably so, that your use of a near identical name in connection with software having very similar (if not identical) functionality gives the impression that your software originates from, is somehow sponsored by, or is otherwise affiliated with our client.

They're asking me to cease and desist from using the "Hoarder" name, remove all content on websites/app stores/GitHub/etc. that uses the name "Hoarder", and, the cherry on top, "Immediately transfer the hoarder.app domain to our client" or let it expire without renewing it (in Feb 2027). They're expecting a response by the 24th of Jan, or they're threatening to sue.

For context, I started developing Hoarder in Feb 2024 and released it here on Reddit in March 2024. I had never heard of "Hordr" before today, so I did some research (with some screenshots along the way):

  1. They have a trademark for "Hordr" registered in Jan 2023.

  2. They registered the domain hordr dot app in 2021.

  3. Searching Google for their domain shows nothing but their website, their parent company, and an old APK (from Jun 2024). So they have basically zero external references.

  4. They had their 2.0 release on the App Store on the 3rd of Jan 2025 (2 weeks ago), with "AI-powered bookmarking". The release before that is from Feb 2023 and says nothing about the content of the app back then.

    1. Their apps are so new that they aren't even indexed on the Play Store. Google says they have "1+" downloads.
    2. I found an APK on one of the APK hosting sites from Jun 2024, which shows some screenshots of how the app looked back then.
  5. The Wayback Machine for hordr dot info shows a reference from 2023 to some app in the app/play store. The app itself (in the app/play store) is unfortunately not indexed.

So TL;DR: they seem legitimate and not outright trademark trolls. Their earliest app screenshots, from June 2024, suggest their current functionality came after Hoarder’s public release. Despite their claims, I find it hard to see how Hoarder could cause confusion among their customers, given they appear to have almost none. If anything, it feels like they’ve borrowed from Hoarder to increase the similarity before sending the cease and desist.

Hoarder is a side project of mine that I've poured so much time and energy into over the last year. I don't have the mental capacity to deal with this. I'm posting here out of frustration, and I kinda know the most likely outcome. Has anyone dealt with anything similar before?

9
 
 

I’m going to make a backup of a 2 TB SSD today. I will use Clonezilla, mainly because that’s all I know. But do you recommend any other ways, for any reason?

I want to keep the process simple and easy, and I will likely take a backup once a month or so. It doesn’t have to be ready all the time. If you need more clarification, ask away.

10
 
 

Hi all,

It's been a long... year at work, and my brain is fried, with no bandwidth left to properly work out how to migrate a Btrfs array from Unraid over to Proxmox. I can see the array in Proxmox and am able to mount it, but now I cannot for the life of me figure out how to

  1. verify that the data is intact
  2. assign it to a storage pool for use in VMs
  3. view it within Proxmox

I haven't touched Proxmox in years after settling on Unraid a while back, but I'm looking to move back to a non-Unraid config.

Anyone here have experience with Btrfs and Proxmox? Any good links to a tutorial or video?

Thanks!

11
 
 

cross-posted from: https://lemmy.ml/post/24823173

Hi folks, looking for a bit of a steer to get off the ground with self-hosting. My goals to start with are pretty straightforward:

  • I want to set up Home Assistant to move my smart devices off the cloud and fully contained within the walls of my home.
  • I want to set up my own little Pixelfed server for my family's use, along with some other federated socials.

From what I was looking at, I think my easiest route to doing both of these things is a Home Assistant Yellow (a system with built-in Zigbee and Thread) with a Raspberry Pi 4.

I've never done anything like this before but I'm interested in learning. If anyone more experienced has any insight or direction, I'd really appreciate it! Cheers!

12
 
 

Today, lemmy.amxl.com suffered an outage because the rootful Lemmy podman container crashed out, and wouldn't restart.

Fixing it turned out to be more complicated than I expected, so I'm documenting the steps here in case anyone else has a similar issue with a podman container.

I tried restarting it, but got an unexpected error: the internal IP address (which I hand-assign to containers) was already in use, despite the fact that the container wasn't running.

I create my Lemmy services with podman-compose, so I deleted them with podman-compose down and re-created them with podman-compose up - that usually fixes things when they're really broken. But this time, I got a message like:

level=error msg="IPAM error: requested ip address 172.19.10.11 is already allocated to container ID 36e1a622f261862d592b7ceb05db776051003a4422d6502ea483f275b5c390f2"

The only problem was that the referenced container didn't exist at all in the output of podman ps -a - in other words, podman thought the IP address was in use by a container that it didn't know anything about! The IP address had effectively been 'leaked'.

After digging into the internals, and a few false starts trying to track down where the leaked info was kept, I found it in a BoltDB file at /run/containers/networks/ipam.db - that's apparently the IP allocation database. Now, the good thing about /run is that it is wiped on system restart - although I didn't really want to restart all my containers just to fix Lemmy.

BoltDB doesn't come with a lot of tools, but you can install a TUI editor like this: go install github.com/br0xen/boltbrowser@latest.

I made a backup of /run/containers/networks/ipam.db just in case I screwed it up.

Then I ran sudo ~/go/bin/boltbrowser /run/containers/networks/ipam.db to open the DB (this will lock the DB and stop any containers starting or otherwise changing IP statuses until you exit).

I found the networks that were impacted and expanded the bucket for each of them (BoltDB has a hierarchy of buckets, and eventually you get key/value pairs), then the bucket for the CIDR range the leaked IP was in. In that list, I found a record whose value was the container ID that didn't actually exist. I used D to tell boltbrowser to delete that key/value pair. I also cleaned up under ids - where this time the key was the container ID that no longer existed - and repeated this for both networks my container was in.

I then exited out of boltbrowser with q.

After that, I brought my Lemmy containers back up with podman-compose up -d - and everything then worked cleanly.

13
 
 

Hello,

I've attached a diagram of the setup I'm trying to achieve. Hopefully it's clearer than trying to explain it with text...

Basically I'm trying to stream the camera to a selfhosted webpage.

The camera is connected to the VPN server.

The stream is picked up on the media server (MediaMTX).

The stream is available from anywhere on the local network via whatever protocol MediaMTX offers. All good here.

The web server is nginx. Works fine.

A basic Wordpress site is set up and I can access it via a domain name over the internet with HTTPS.

What I'm struggling with is getting the "local stream" (read: local IP) into the website. I have WP plugins that let me embed streams, but I suspect the issue is that the local IP is not available over the internet, so you can't just point them at 192.x.x.x. That said, even on my local network I can't see the stream.

So the questions are,

  1. how can I serve the stream to nginx/ wordpress and
  2. can I somehow have nginx treat the stream as a locally hosted resource that can proxy the stream to remote web browsers?

Ideally I don't want to open up a port on the LAN for direct streaming to the internet which the website then points to, as that seems unsafe... But if that's the only way, then I guess it can't be helped.

Happy to provide more info if needed.

TIA
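For question 2: yes, nginx can sit in front of the stream. Since MediaMTX can serve HLS over plain HTTP (port 8888 by default), one option is a location block that proxies it, so browsers see the stream as a resource on your own domain. A minimal sketch, assuming a hypothetical media-server address of 192.168.1.50 and HLS enabled in MediaMTX:

```nginx
# Inside the existing server { } block that already serves the WordPress site.
# 192.168.1.50:8888 is an assumed LAN address / default MediaMTX HLS port - adjust to yours.
location /stream/ {
    proxy_pass http://192.168.1.50:8888/;
    proxy_http_version 1.1;
    proxy_set_header Host $host;
}
```

The embed plugin would then point at something like https://yourdomain/stream/cam/index.m3u8, and since nginx fetches from the LAN side, the camera itself is never exposed directly.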

Edit: WordPress is for a separate website project outside the scope of this post. Only one page will be for the video player/stream, but there will be other uses for the website, not just streaming.

Edit 2: Seems the general consensus is that I do need to publicise my video stream.

I've just made my website accessible through its local IP and gotten embedded HLS and WebRTC streams working. Putting the domain back, it no longer plays the videos, so it's certainly a network access issue, or even an HTTPS issue, as the streams are currently HTTP.

I didn't realise you could reverse proxy a video stream! (Even though I did, once upon a time, use the nginx RTMP server.)

I've also been made aware of Tailscale + Funnel, which does a similar thing without exposing my own domain.

I'll have a go at reverse proxying it, which should also sort out the HTTPS issue, and hopefully be done 🤞

You guys rock!

14
 
 

Hi, what's your setup?

I often listen to music through YouTube on my phone connected to a Bluetooth speaker. I use NewPipe, which works very well. When I want to save a song or an album, there's the option to download in NewPipe itself, or on Android there's, for example, Seal (works really well for downloading entire playlists and deselecting sponsored videos from the playlist).

The hassle is uploading from the phone to my Jellyfin. I've used File Browser, but it's a bit limited in options.

Then I thought I could use Syncthing to have a folder on my Android phone upload automatically to my Jellyfin server (a PC running DietPi), but it seems Syncthing is now discontinued on Android?

What I was originally looking for was my own hosted yt-dlp with a mobile-friendly UI, but that seemed quite difficult to get running.

15
 
 

Not torrenting, but searching.

I want a way to find similar media to the media I like.

Something similar to Jellyseerr, with a way to browse media.

16
 
 

I’m doing a lot of coding, and what I would ideally like is a long-context model (128k tokens) that I can throw my whole codebase into.

I’ve been experimenting with e.g. Claude, and what usually works well is to attach e.g. the whole architecture of a CRUD app along with the most recent docs of the framework I’m using; it’s okay for menial tasks. But I am very uncomfortable sending any kind of data to these providers.

Unfortunately I don’t have a lot of space, so I can’t build a proper desktop. My options are either renting a VPS or going for something small like a Mac Studio. I know speeds aren’t great, but I was wondering if using e.g. RAG for documentation could help me get decent speeds.

I’ve read that Macs become very slow, especially at larger contexts. I’m not very convinced, but I could probably get a new one at 50% off as a business expense, so the Apple tax isn’t as much of an issue as the concern about speed.

Any ideas? Are there other mini PCs available that might have better architecture? I tried researching but couldn’t find a lot.

Edit: I found some stats on GitHub on different models: https://github.com/ggerganov/llama.cpp/issues/10444

Based on that, I also conclude that you’re going to wait forever if you work with a large codebase.

17
 
 

Let's say I've got Nextcloud selfhosted in my basement and that it is accessible on the world wide web at nextcloud.kickassdomain.org. When someone puts in that URL, we'll have all the fun DNS-lookups trying to find the IP address to get them to my router, and my router forwards ports 80 and 443 to a machine running a reverse-proxy, and the reverse-proxy then sends it to a machine-and-port that Nextcloud is listening to.

When I do this on my phone next to the computer hosting Nextcloud, (I believe) what happens is that the data leaves and re-enters my home network, as my router sends the data to the IP address it is looking for (which is itself). This would mean that instead of getting a couple hundred Mbps from the local wifi (or being wired in over ethernet and getting even more), I'm limited by my ISP's upload speed of ~25 Mbps.

Maybe that just isn't the case and I've got nothing to worry about...

What I want my network to do is know that nothing has to leave the network at all and just use local speeds. What I tried before was a DNS rewrite in AdGuard such that anything going to my kickassdomain would instead go to the local IP address (so nextcloud.kickassdomain.org -> 192.168.0.99). This seemed to cause a lot of problems when I then left the house because, I assume, the DNS info was cached, and my phone, now out in the world, would try to connect to that IP and fail.
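One way to sanity-check which answer a client is actually getting (the rewritten LAN address vs. the public one) is a quick resolution test. A small sketch using only the Python standard library; the hostname and CIDR here are just examples:

```python
import ipaddress
import socket

def resolves_into(hostname: str, cidr: str) -> bool:
    """Return True if `hostname` currently resolves to an address inside `cidr`."""
    addr = ipaddress.ip_address(socket.gethostbyname(hostname))
    return addr in ipaddress.ip_network(cidr)

# Run on the LAN, a working DNS rewrite would give:
#   resolves_into("nextcloud.kickassdomain.org", "192.168.0.0/24") -> True
# while out in the world the same call should come back False.
```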

My final goal here is that I want to upload/download from my selfhosted applications (like nextcloud) without being limited by the relatively slow upload speed of the ISP.

Maybe the computer already figured all this out, though - it does seem like my router should know its own IP and not bother sending things out into the world just for them to come back.

If it matters, my IP address is pretty stable, but more importantly it is unique to me (like every house in the neighborhood has their own IP).

Updates from testing: everything does indeed just work without me needing to change my existing setup, presumably because the router does the hairpin NAT action folks are talking about here.

I tested it by installing iperf3 on the server, then using my phone (with the PingTools Network Utilities Android app, only found on Google Play and not on F-Droid) to connect. Here are the results:

  1. Phone to local IP address (192.168.0.xxx) - ~700 Mbits/second
  2. Phone to speedtest.mykickassdomain.org while still on the wifi - ~700 Mbits/second
  3. Phone on cellular to speedtest.mykickassdomain.org - ~4 Mbits/second
18
 
 

I am currently planning to set up Nextcloud as described in https://help.nextcloud.com/t/nextcloud-docker-compose-setup-with-caddy-2024/204846 and make it available via Tailscale.

I found a Tailscale reverse-proxy example for the AIO version, which also uses Caddy as the reverse proxy: https://github.com/nextcloud/all-in-one/discussions/5439

It might be possible to adjust it to the nextcloud:fpm stack.

But it might also be possible to use the built-in reverse proxy of the Tailscale sidecar via a TS_SERVE_CONFIG file. In this JSON file, multiple paths (/push/* and the / root) can be configured and redirected to the right internal DNS name and port (notify_push:7867 and web:80): https://tailscale.com/blog/docker-tailscale-guide

Has anyone done that? Can someone share a complete example?
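I haven't run this exact stack, but based on the format shown in the linked Tailscale Docker guide, a TS_SERVE_CONFIG for the two paths above might look roughly like this (treat the schema as an assumption and double-check it against the guide; web and notify_push are the compose service names from the fpm stack):

```json
{
  "TCP": {
    "443": { "HTTPS": true }
  },
  "Web": {
    "${TS_CERT_DOMAIN}:443": {
      "Handlers": {
        "/": { "Proxy": "http://web:80" },
        "/push/": { "Proxy": "http://notify_push:7867" }
      }
    }
  }
}
```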

19
25
submitted 3 days ago* (last edited 3 days ago) by ReedReads@lemmy.zip to c/selfhosted@lemmy.world
 
 

I can't seem to find hardware requirements in the spec. Can someone help me out?

Looking to run this in a Docker container with a Postgres DB, not SQLite.

https://github.com/laurent22/joplin/blob/dev/packages/server/README.md
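Not an answer on hardware requirements, but for anyone else setting this up: the README describes running the server against Postgres via environment variables. A compose sketch from memory of that README (variable names and the placeholder domain should be verified against the linked page):

```yaml
services:
  db:
    image: postgres:15
    restart: unless-stopped
    volumes:
      - ./data/postgres:/var/lib/postgresql/data
    environment:
      - POSTGRES_USER=joplin
      - POSTGRES_PASSWORD=joplin   # change me
      - POSTGRES_DB=joplin
  app:
    image: joplin/server:latest
    restart: unless-stopped
    depends_on:
      - db
    ports:
      - "22300:22300"
    environment:
      - APP_PORT=22300
      - APP_BASE_URL=https://joplin.example.com   # placeholder domain
      - DB_CLIENT=pg
      - POSTGRES_HOST=db
      - POSTGRES_PORT=5432
      - POSTGRES_USER=joplin
      - POSTGRES_PASSWORD=joplin
      - POSTGRES_DATABASE=joplin
```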

20
 
 

I've been wanting to get proper storage for my lil server running Nextcloud and a couple of other things, but NC is the main concern. It's currently running on an old SSD I've had lying around, so I'd want a more reliable, longer-term solution.

So I'm thinking of a RAID 1 (mirror) HDD setup with two 5400 rpm 8 TB drives, bringing the choices down to IronWolf or WD Red Plus, which are both in the same price range.

I'm currently biased towards the IronWolfs because they're slightly cheaper and have a cool print on them, but from Reddit threads I've seen that WD drives are generally quieter, which is a concern since the server is in my bedroom.

Does anyone have experience with these two drives, and/or know of better solutions?

Oh, and for the OS: being a simple Linux server, is it generally fine to have that on a separate drive, an SSD in this case?

Thanks! :3

21
 
 

As it stands, both Piped and Invidious are dead. Because of that, I almost completely stopped watching YouTube, but I'd still sometimes like to check what the people I follow have posted (I used to do that via Piped). Are there any new ways of following people without actually using Google? I'm aware of the tools that download new videos as they come out, but I'm more interested in just "subscribing", kind of like RSS?
Ideally it would be on iOS.

Edit: I found it, “Unwatched” on iOS is awesome, thanks to !FundMECFSResearch@lemmy.blahaj.zone

22
 
 

cross-posted from: https://lemmy.ml/post/24722787

I am running Ubuntu with CasaOS. I was previously using an Intel APU (the name has slipped my mind; I will update the post when I can with this info). Recently I got a 1650 that I installed for NVENC transcoding. It seems all the proper drivers are installed, but my Jellyfin container still fails playback any time it's turned on.

I have reinstalled the container with the NVIDIA device variable, and no dice. I have also tried installing the NVIDIA Container Toolkit, but that didn't work either. I am at a loss trying to get NVENC to work.

Any help is appreciated!

EDIT: here is the ffmpeg log file

https://gofile.io/d/9nsBFq

23
 
 

Original Post:

I recently had a Proxmox node I was using as a NAS fail catastrophically. Not surprising, as it was a repurposed 12-year-old desktop. I was able to salvage my data drive, but the boot drive was toast. It looks like the SATA controller went out and fried the SSD I was using as the boot drive. This system was running TurnKey File Server as an LXC with the media storage on a subvol on a ZFS storage pool.

My new system is based on OpenMediaVault and I'm happy with it, but I'm hitting my head against a brick wall trying to get it to mount the ZFS drive from the old system. I tried installing ZFS using the instructions here, as OMV is based on Debian, but I haven't had any luck so far.

Solved:

  1. Download and install OMV Extras
  2. In OMV's web admin panel, go to System -> Plugins and install the Kernel plugin
  3. Go to System -> Kernel, click the blue icon that says Proxmox (it looks like a box with a down arrow as of Jan 2025), and install the latest Proxmox kernel from the drop-down menu
  4. Reboot
  5. Go back to the web panel, System -> Plugins, and install the plugin openmediavault-zfs
  6. Go to Storage -> ZFS -> Pools and click the blue Tools icon -> Import Pool. From here you can import all existing ZFS pools or a single pool.
24
 
 

Hi all!

I have a nice setup with some containers (rootless Podman) and bare-metal services (anything I can install bare metal usually goes bare metal).

I used Monit in the past to keep an eye on my services and automatically restart anything that goes down for any reason. I stopped using Monit because it doesn't scale well on mobile browsers and is frankly clumsy to configure.

I could go back to Monit, I guess, but I'm wondering if there is anything better out there to try.

A few requirements (not necessarily mandatory, but preferable):

  • Open source (ideally true open source, not just commercial solutions with dumbed-down free versions)
  • Not limited to, or focused on, containers (no Watchtower and similar)
  • For containers, it can just support "works" or "restart"
  • For containers, if it goes beyond the minimum "works" and "restart", it must support Podman
  • Must support bare-metal services (status, start, stop)
  • Must send email or other kinds of notifications (IM notifications are OK, but email preferred)
  • Should additionally monitor external machines (e.g. other servers on the LAN) or generic IP addresses
  • Should detect if a web service is alive but blocked
  • No need for fancy GUIs or a web GUI (it's a plus, but not required)
  • No need for data reporting, graphs and such amenities. They are a plus, but 100% not required.

What do you guys use?
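Not a full Monit replacement, but the core loop (liveness probe plus email notification) is small enough to sketch with only the Python standard library. The hostnames and addresses below are hypothetical placeholders, and it assumes an MTA listening on localhost:

```python
import smtplib
import socket
from email.message import EmailMessage

def is_up(host: str, port: int, timeout: float = 3.0) -> bool:
    """TCP-level liveness check: True if a connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def notify(subject: str, body: str) -> None:
    """Email an alert via a local MTA (placeholder addresses)."""
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = "monitor@example.lan"
    msg["To"] = "admin@example.lan"
    msg.set_content(body)
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

# Example check list: (name, host, port) for LAN machines and services.
CHECKS = [("web", "192.168.0.10", 80), ("jellyfin", "192.168.0.11", 8096)]

def run_checks() -> None:
    for name, host, port in CHECKS:
        if not is_up(host, port):
            notify(f"{name} is down", f"No TCP connection to {host}:{port}")
```

A cron entry running this every minute gets you the "restart" half too, by swapping notify for a restart command; detecting "alive but blocked" would need an HTTP-level check on top of the plain TCP probe.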

25
 
 

So I have been self-hosting my calendar and to-do list on a local server for some time now. I use Thunderbird's Tasks on my laptop and jtx Board on my phone.

I see that jtx Board has a journaling feature. It looks like it may be just for notes rather than a place to write self-reflections. Is there something similar to this app in self-hosting, with a mobile and desktop component?

view more: next ›