Perhaps git would work?
Syncthing is also an option.
But rsync can do this fine with --recursive --delete. The mirror will remain an exact replica.
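For reference, a minimal sketch of that approach (the paths here are made up):

```
# Mirror /data into /backup. --delete removes anything on the destination that no
# longer exists in the source, so a moved/renamed file is deleted at its old path
# and re-copied at the new one; the mirror stays an exact replica either way.
rsync --recursive --delete /data/ /backup/
```

Note the trailing slashes: they make rsync copy the contents of /data rather than the directory itself.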
lsyncd does just this. Its intended use is to sync directories between systems over a slow(ish) uplink, but it can work locally as well. It takes some fiddling to set up, but once set up it just does its thing seamlessly in the background. However, if you're just looking for a backup solution I might look for something else, like a plain rsync script.
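If it helps, the simplest invocation looks roughly like this (paths are placeholders; check your version's man page for the exact options):

```
# Watch /data and continuously mirror changes to /backup using rsync as the transport.
# -delay batches filesystem events for ~15 seconds before each sync run.
lsyncd -delay 15 -rsync /data /backup
```

For anything beyond that (multiple source dirs, exclude lists) you'd move to a small Lua config file instead of command-line flags.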
This is the best answer. I use it regularly to keep hundreds of TB in sync across nodes. It works extremely well and is pretty much hands-off once set up.
I found an answer on StackExchange that refers to a tool called rsync-sidekick which looks like it could achieve your aims.
Edit: There is also another answer on that same StackExchange page referring to a script called rsync-prepare which is capable of working with a remote destination.
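I haven't used it myself, but if I'm reading the rsync-sidekick docs right, the workflow is roughly: run it first so moves/renames are replayed on the destination, then run a normal rsync (paths here are hypothetical):

```
# Replay detected moves/renames on the destination so rsync doesn't re-transfer them.
rsync-sidekick /data/ /backup/

# A normal rsync then only has to pick up actual content changes.
rsync --recursive --delete /data/ /backup/
```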
I use RealTimeSync.
It just... Works. I love it. It's free.
There are plenty of features that could effectively let you do what you want.
Mine is set to sync my server with a backup whenever my main PC is idle for 20 mins. Once it determines how to sync, it lets me review before syncing.
Why not exclude the folder you want to move from the initial sync, and sync that folder separately to the final location?
Borg does this.
Why not maintain a soft or hard link?
I am trying to create a crontab to back up multiple directories in which files change their location, but I don't know if links would solve it.
You can also have rsync exclude specific files, so you could sync the main dir first, then the special subdir separately.
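A rough sketch of that idea (directory names are invented, adjust to your layout):

```
# 1) Mirror everything except the directory that gets moved around.
rsync --recursive --delete --exclude='projects/' /data/ /backup/

# 2) Sync the excluded directory straight to its final location on the backup,
#    so moving it on the source later doesn't force a full re-copy.
rsync --recursive --delete /data/projects/ /backup/archive/projects/
```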
As long as the backup system performs deduplication, it should be able to store the backup data efficiently even if the source files are constantly moved around or renamed.
ZFS can do deduplication at the dataset level, and its snapshots are space-efficient via copy-on-write. There is also ZBackup, which has deduplication support.
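Something along these lines, for example (the pool/dataset name is made up, and keep in mind ZFS dedup is RAM-hungry):

```
# Enable block-level deduplication on a dataset (only affects newly written data).
zfs set dedup=on tank/backups

# Snapshots are cheap regardless; unchanged blocks are shared via copy-on-write.
zfs snapshot tank/backups@2024-01-01
```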
rclone sync might be acceptable?
It doesn't move files from one dir to another, but it would delete the old directory and recopy to the new directory.
I think rsync behaves the same way, but I could be wrong.
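A minimal sketch of that (the remote name and paths are placeholders):

```
# Make the destination an exact copy of the source; files missing from the
# source are deleted on the destination.
rclone sync /data remote:backup

# --dry-run is handy for previewing what would be copied or deleted first.
rclone sync /data remote:backup --dry-run
```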
Restic, Kopia, and Borg are all pretty good backup tools with deduplication built in, so they might be a good option if you're doing this for backup purposes?
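With restic, for instance, the basic flow is roughly this (the repository path is hypothetical):

```
# Create a repository once, then back up into it. Data is deduplicated by content,
# so moving or renaming files doesn't store the same data twice.
restic -r /backup/restic-repo init
restic -r /backup/restic-repo backup /data
```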