TL;DR: Setting aside archive formats like tar, zip, ..., how would you theoretically represent a symlink so that it can be stored in the cloud and later restored to the system as a symlink?

Backstory

I heavily use symlinks to organise my media and even wrote an application that helps me do so (it's in Python and being rewritten in Rust). But I also use stuff like home-manager and Nix, which make heavy use of symlinks.

My goal is to back up my media and /home to the cloud at regular intervals. There are services that cost only about 60-100€ per year for unlimited cloud storage. So keeping part of my library purely in the cloud, even if it uses terabytes of space, would cost less than a single 15TB HDD (500+€). For a local backup I'd need at least a second drive, which would put me at >1000€ - the equivalent of at least 10 years of cloud storage.

Options explored

rclone

It's pretty sweet, as it supports mounting a cloud drive as a folder and has transparent encryption! However, there are multiple open issues about uploading symlinks, and I don't know Go. I wouldn't mind trying to learn it if I had an idea of how to upload a symlink without following it (following a symlink breaks it).
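For what it's worth, the "don't follow the link" part isn't Go-specific: it boils down to the lstat()/readlink() syscalls. A minimal Python sketch of just that concept (nothing to do with rclone's internals):

```python
import os

def symlink_target(path: str) -> str:
    """Return the raw target of a symlink without dereferencing it."""
    if not os.path.islink(path):    # islink() uses lstat(), so nothing is followed
        raise ValueError(f"{path} is not a symlink")
    return os.readlink(path)        # raw target string; may be relative or even dangling
```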

git-annex etc.

git-annex, or a bare git repo with a remote worktree, is great, but I don't need diffs or a history of how things moved around; I just need to replace each backup with a current view of what's there. Plus, storing all that history would probably take an enormous amount of space, which is wasteful.

Ideas

store a blob of the stat() call for every file

I'm not sure about this. The stat struct does contain the file type (directory, symlink, ...) in its st_mode field, though hard links aren't a distinct type there, and my knowledge of Linux internals is limited; maybe that's too complicated for this use case.
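If it helps, this is roughly what such a blob could boil down to in Python (the field selection and example path are just illustrative). st_mode distinguishes regular files, directories and symlinks, but hard links would have to be detected separately by comparing st_ino/st_dev, and the link target has to come from readlink(), not stat():

```python
import json
import os
import stat

def stat_record(path: str) -> dict:
    st = os.lstat(path)                 # lstat() examines the link itself, not its target
    if stat.S_ISLNK(st.st_mode):
        kind = "symlink"
    elif stat.S_ISDIR(st.st_mode):
        kind = "dir"
    else:
        kind = "file"
    rec = {"path": path, "kind": kind, "mode": st.st_mode, "mtime": st.st_mtime}
    if kind == "symlink":
        rec["target"] = os.readlink(path)   # stat() alone can't give you the target
    return rec

# example: /etc/localtime is usually a symlink on Linux
print(json.dumps(stat_record("/etc/localtime")))
```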

a DB of links

Instead of storing the links themselves, I store a DB (SQLite? CSV?) of links, upload that DB, and use it to restore the links after pulling it back down. 🤔 Actually this might be the simplest thing to do, but maybe y'all have better ideas.
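A rough sketch of what that could look like in Python with sqlite3 (untested; the schema and naming are just placeholders): dump the links alongside the normal file upload, then replay them after pulling everything back down.

```python
import os
import sqlite3

def dump_links(root: str, db_path: str) -> None:
    """Record every symlink under root as (relative path, raw target)."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS links (path TEXT PRIMARY KEY, target TEXT)")
    for dirpath, dirnames, filenames in os.walk(root):   # followlinks=False by default
        for name in dirnames + filenames:
            full = os.path.join(dirpath, name)
            if os.path.islink(full):
                con.execute("INSERT OR REPLACE INTO links VALUES (?, ?)",
                            (os.path.relpath(full, root), os.readlink(full)))
    con.commit()
    con.close()

def restore_links(root: str, db_path: str) -> None:
    """Recreate the recorded symlinks under root, even dangling ones."""
    con = sqlite3.connect(db_path)
    for rel, target in con.execute("SELECT path, target FROM links"):
        link = os.path.join(root, rel)
        os.makedirs(os.path.dirname(link), exist_ok=True)
        if not os.path.lexists(link):                    # don't clobber an existing entry
            os.symlink(target, link)
    con.close()
```

The DB only ever holds paths and target strings, so it stays tiny compared to the media itself.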

[–] PeeGee@lemm.ee 9 points 9 months ago* (last edited 9 months ago) (1 children)

This is the very reason I dumped Dropbox. Used to be great - I just created symlinks to all my normal data folders in the main Dropbox folder and all of those folders would then be real-time (ish) backed up to the cloud without me ever having to touch anything or move all my stuff into the Dropbox folder. Then they decided to drop support for symlinks…

[–] dataprolet@lemmy.dbzer0.com 3 points 9 months ago (1 children)

If your files are only in the cloud it's not a backup.

[–] PeeGee@lemm.ee 4 points 9 months ago

They weren’t - they were on my file system, and using symlinks in my Dropbox folder pushed them up to my Dropbox account as well.

[–] key@lemmy.keychat.org 4 points 9 months ago* (last edited 9 months ago)

I'd think the simplest option is to replace the symlink with a text file that contains the target path. Then add a special, unique extension so you can easily detect which files are meant to be symlinks.

I haven't used it, but this script looks like it does most of that: https://github.com/nbeaver/toggle-symlink
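The conversion both ways is only a few lines anyway. Roughly, in Python (the .symlink suffix here is just an arbitrary marker; the linked script may use something else):

```python
import os

SUFFIX = ".symlink"   # arbitrary marker extension for converted links

def link_to_file(path: str) -> None:
    """Replace a symlink with a small text file holding its target."""
    target = os.readlink(path)
    os.remove(path)
    with open(path + SUFFIX, "w") as f:
        f.write(target)

def file_to_link(path: str) -> None:
    """Turn a previously converted placeholder file back into a symlink."""
    assert path.endswith(SUFFIX)
    with open(path) as f:
        target = f.read()
    os.remove(path)
    os.symlink(target, path[: -len(SUFFIX)])
```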

Though really, I would question the need to do such a thing. Backup is a well-solved problem, so needing to be creative is a bad sign. I'd default to a pre-built backup tool like borg. If you really need to avoid wrappers, you can always back up to an actual file system via rsync, which handles symlinks normally.

[–] cm0002@lemmy.world 3 points 9 months ago (2 children)

I have no comment on the syncing problem, but could you lmk what the cheap cloud services are? I'm reaching the end of my rope with Google Workspace drive no longer being unlimited and Dropbox ending their "as much as you need" policy.

[–] YoorWeb@lemmy.world 4 points 9 months ago* (last edited 9 months ago)

Nextcloud. It's open source; you can go with one of their providers or host it yourself if you don't mind playing with config.

[–] tinkralge@programming.dev 3 points 9 months ago (1 children)

Jottacloud is what I want to use. Unlimited storage for ~100€/year.

Close behind is 1fichier at 2€/TB/month or 12€/TB/year, but they are in France, and "uptobox" (a similar provider) was shut down by the US on French soil because they allowed sharing links to the files.

You can probably find others in the list of storage systems supported by rclone.

[–] cm0002@lemmy.world 2 points 9 months ago

Oh, yea, I've been in the Google Workspace Unlimited alternative thread on the rclone forums; apparently Jotta will eventually hit a point in its "gradual slowdown" where it's practically worthless (IIRC it was around 10TB, so for Jotta "unlimited" is functionally 10TB).

1fichier was also talked about, but I guess you have to re-upload any given file every 30 days or it will expire.