this post was submitted on 25 Oct 2024
25 points (96.3% liked)

Hi programmers,

I work from two computers: a desktop and a laptop. I often interrupt my work on one machine and continue on the other, where I don't have access to the uncommitted progress from the first one. Frustrating!

Potential solution: use git to auto-save progress.

I'm posting this to get feedback. Maybe I'm missing something and this is overcomplicated?

Here is how it could work:

Creating and managing the separate branch

Alias git commands (such as git checkout) so that I am always on a branch called "[branch]-autosave", where [branch] is the branch I intend to be on, and the autosave branch always branches from it. If the autosave branch doesn't exist, it is created automatically.
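
A minimal sketch of what such a wrapper could look like (the function name is made up, and it assumes the "[branch]-autosave" naming described above):

    # Hypothetical wrapper around `git checkout`: always end up on "<branch>-autosave",
    # creating it from the target branch if it doesn't exist yet.
    gco() {
        local target="$1"
        local autosave="${target}-autosave"

        git checkout "$target" || return 1
        if ! git show-ref --verify --quiet "refs/heads/${autosave}"; then
            git branch "$autosave" "$target"
        fi
        git checkout "$autosave"
    }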

Handling commits

Whenever I commit, the autosave branch would be squash-merged into the underlying branch.
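
Roughly (assuming the naming convention above, and that the underlying branch name is recovered by stripping the "-autosave" suffix), the commit step could be wrapped like this:

    # Hypothetical "real commit": squash the autosave work into the underlying branch,
    # then start a fresh autosave branch on top of it.
    gcommit() {
        local autosave branch
        autosave="$(git rev-parse --abbrev-ref HEAD)"   # e.g. "feature-x-autosave"
        branch="${autosave%-autosave}"                  # e.g. "feature-x"

        git checkout "$branch" &&
        git merge --squash "$autosave" &&
        git commit &&                    # write the real commit message here
        git branch -D "$autosave" &&     # drop the autosave history
        git checkout -b "$autosave"      # begin a clean autosave branch again
    }

Note that after the autosave branch is recreated like this, the next push of it to origin would have to be a force push, since its history no longer matches the remote copy.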

Autosave functionality

I use neovim as my editor, but this could work for other editors.

I will write an editor hook that always pulls the latest from the autosave branch before opening a file.

Another hook will commit and push to origin whenever a file is saved from the editor.

This way, when I get on any of my devices, the changes pushed from the other device will be synced automatically.
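
Concretely, the two hooks could boil down to something like this (just a sketch; the function names are placeholders, and it assumes the autosave branch is already checked out):

    # Hypothetical hook commands, to be triggered by the editor's open/save events.

    # Before opening a file: pull the latest autosave commits from the other machine.
    autosave_pull() {
        git pull --rebase --quiet origin "$(git rev-parse --abbrev-ref HEAD)"
    }

    # After saving a file: snapshot the whole working tree and push it.
    autosave_push() {
        git add --all
        git commit --quiet -m "autosave: $(date -u +%Y-%m-%dT%H:%M:%SZ)" || return 0  # nothing to commit
        git push --quiet origin "$(git rev-parse --abbrev-ref HEAD)"
    }

In neovim these could be wired to the BufReadPre and BufWritePost autocmd events; other editors have equivalent hooks.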

Please share your thoughts.

[–] NegativeLookBehind@lemmy.world 27 points 4 weeks ago (3 children)

Write code on a machine you can remote into from each computer? Fewer commits, possibly fewer reverts, less chance of forgetting to git pull after switching machines… idk.

[–] lucas@startrek.website 5 points 4 weeks ago (2 children)

Don't even need to remote in to anything, just store your working code on a network share

[–] matcha_addict@lemy.lol -4 points 4 weeks ago (1 children)

I mean... That's kinda what git does, in a way... Right?

[–] Kkmou@lemm.ee 5 points 4 weeks ago* (last edited 4 weeks ago)

Don't think of git as sync storage; it's more of a tool for merging work.

If you need to share files between computers, use shared storage.

Always use the right tool for the job. Mount shared storage or use sync tools like rsync, etc.
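
For example (the hostname and paths here are placeholders), a one-way sync could be as simple as:

    # Hypothetical one-way sync of a project directory to the other machine.
    rsync -az --delete ~/projects/myapp/ laptop:~/projects/myapp/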

[–] matcha_addict@lemy.lol 1 points 4 weeks ago (2 children)

I have considered this approach, but there are several things I had issues with.

  • there is still a degree of latency. It's not a deal breaker, but it is annoying
  • clipboard programs don't work. They copy to the remote host's clipboard. I bet there's a solution to this, but I couldn't find one in the limited time I spent looking into it.
  • in the rare case the host is unreachable, I am kinda screwed. Not a deal breaker since it's rare, but the host has to be always on, whereas the git solution only requires it to be up when it syncs.

To address the issues you brought up:

  • fewer commits: this would be resolved by squashing every time I make a commit; the autosave commits will be wiped. If I really hated the extra commits, I could just amend instead of committing, but I'd rather keep the history.
  • forgetting to git pull: the hooks I talked about will take care of that. I won't ever have to worry about forgetting again.
[–] actually@lemmy.world 1 points 4 weeks ago

I once used a virtual desktop in the cloud, and I could access it from anywhere. It was just a regular OS that had all my tools, and it was where all my work and changes lived. Ultimately, that remote desktop went away when I changed jobs, but it's something I would consider again.

There is a danger of things going poof or not being accessible; that can't entirely be helped. But a push to a backup repo during each commit would allow an emergency restore. Taking a snapshot of the machine every few days, for example if it's on AWS or similar, helps lessen the loss if and when it goes poof.

To cover the case of the internet going out, have one of your local computers do a regular pull of the backup repo as a cron job.
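
For instance (the repo path and schedule are just illustrative), a crontab entry like this would keep a local copy current:

    # Hypothetical crontab entry: pull the backup repo every 15 minutes.
    */15 * * * * cd /home/me/backup-repo && git pull --quiet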

[–] Strykker@programming.dev 0 points 3 weeks ago (1 children)

Your git solution still has all of these issues, since you need the git server to be alive. For number 3, if you're concerned about the file share being offline, use something like rsync so you keep a local copy that is backed up.

[–] matcha_addict@lemy.lol 1 points 3 weeks ago* (last edited 3 weeks ago)

I don't need the client computers to be alive, only the central server (which could be github.com for example, so not even a server I manage).

[–] hakunawazo@lemmy.world 1 points 4 weeks ago

Yes, and use something like GNU Screen to seamlessly pick up work on the other machine again.