this post was submitted on 19 May 2024

Linux

Title is TLDR. More info about what I'm trying to do below.

My daily driver computer is Laptop, which has an SSD. No possibility to expand.

So for storage of lots n lots of files, I have an old, low resource Desktop with a bunch of HDDs plugged in (mostly via USB).

I can access Desktop files via SSH/SFTP on the LAN. But it can be quite slow.

And sometimes (not too often; this isn't a main requirement) I take Laptop to use elsewhere. I do not plan to make Desktop available outside the network so I need to have a copy of required files on Laptop.

Therefore, sometimes I like to copy the remote files from Desktop to Laptop to work on them. To make a sort of local cache. This could be individual files or directory trees.

But then I have a mess of duplication. Sometimes I forget to put the files back.

Seems like Laptop could be a lot more clever than I am and help with this. Like could it always fetch a remote file which is being edited and save it locally?

Is there any way to have Laptop fetch files, information about file trees, etc, located on Desktop when needed and smartly put them back after editing?

Or even keep some stuff around. Like lists of files, attributes, thumbnails etc. Even browsing the directory tree on Desktop can be slow sometimes.

I am not sure what this would be called.

Ideas and tools I am already comfortable with:

  • rsync is the most obvious foundation to work from but I am not sure exactly what would be the best configuration and how to manage it.

  • luckybackup is my favorite rsync GUI front end; it lets you save profiles, jobs, etc., which is sweet

  • freeFileSync is another GUI front end I've used, but I prefer lucky/rsync these days

  • I don't think git is a viable solution here because there are already git directories included, there are many non-text files, and some of the directory trees are so large that they would cause git to choke looking at all the files.

  • syncthing might work. I've been having issues with it lately but I may have gotten these ironed out.

Something a little more transparent than the above would be cool but I am not sure if that exists?

Any help appreciated, even just ideas on what to web search for, because I am stumped even on that.

[–] linuxPIPEpower@discuss.tchncs.de 4 points 4 months ago (1 children)
  1. In another comment I ran iperf3 Laptop (wifi) ---> Desktop (ethernet) which was about 80-90MBits/s. Whereas Desktop ---> OtherDesktop was in the 900-950 MBits/s range. So I think I can say the networking is fine enough when it's all ethernet. Is there some other kind of benchmarking to do?

  2. Just posted a more detailed description of the desktops in this comment (4th paragraph). It's not ideal but for now it's what I have. I did actually take the time (gnome-disks benchmarking) to test different cables, ports, etc. to find the best possible configuration. While there is an upper limit, if you are forced to use USB, this makes a big difference.

  3. Other people suggested ZeroTier or VPNs generally. I don't really understand the role this component would be playing? I have a LAN and I really only want local access. Why the VPN?

  4. Ya, I have tried using syncthing for this before and it involves deleting stuff all the time then re-syncing it when you need it again. And you need to be careful not to accidentally delete while synced, which could destroy all files.

  5. Resilio: I used it a long time ago. Didn't realize it was still around! IIRC it was somewhat based on bittorrent, with the idea of peers providing data to one another.

[–] BearOfaTime@lemm.ee 3 points 4 months ago* (last edited 4 months ago)

The VPN is to give you access to your files from anywhere, since you don't have the storage capacity on your laptop for all of them.

If you have an encrypted connection to home, laptop storage isn't a concern.

As a benefit, this also solves the risk of losing files that exist only on the laptop, by keeping them at home.

Yea, Syncthing has its moments (and uses - I keep hundreds of gigs between 5 phones and 5 laptops/desktops in sync with it).

Resilio does use the bittorrent protocol, but uses keys and authorization for shares. Give it a try, it may address your need to access files remotely. I use it to access my media (about 2TB) which clearly can't be sync'd to my laptop (or phone). I can grab any file, at any time.