this post was submitted on 10 Aug 2023
42 points (97.7% liked)
Asklemmy
you are viewing a single comment's thread
I wrote a bash script that runs daily. It dumps the databases to text files and 7z-compresses them (AES-256), along with the web files (mostly WordPress), user files, and all of /etc; it also generates a list of all installed packages. It then copies the archives to a timestamped folder on my Google Drive (I keep the last two nights plus the last three Sundays).
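A script along those lines might look like the sketch below. Everything here is an assumption on my part, not the poster's actual code: MySQL for the databases, p7zip installed as `7z`, and rclone with a remote named `gdrive` for the Google Drive upload.

```shell
#!/usr/bin/env bash
# Nightly backup sketch -- all tool choices and names are placeholders.
set -euo pipefail

# Dated destination folder on the remote, e.g. gdrive:backups/2023-08-10
backup_dest() {
    printf 'gdrive:backups/%s' "$1"
}

main() {
    local stamp work
    stamp=$(date +%F)
    work=$(mktemp -d)

    # Dump the DBs as text, then 7z them with AES-256.
    # -mhe=on also encrypts the archive headers (file names).
    mysqldump --all-databases > "$work/db.sql"
    7z a -p"$BACKUP_PASSPHRASE" -mhe=on "$work/db.7z" "$work/db.sql"

    # Web files, user files, and /etc in a second archive.
    7z a -p"$BACKUP_PASSPHRASE" -mhe=on "$work/files.7z" /var/www /home /etc

    # List of installed packages (Debian-style; adjust for your distro).
    dpkg --get-selections > "$work/packages.txt"

    # Copy everything to the timestamped Google Drive folder.
    rclone copy "$work" "$(backup_dest "$stamp")"

    rm -rf "$work"
}

# Only run the real backup when invoked with "run", so the functions
# can be sourced and inspected safely.
if [ "${1:-}" = run ]; then
    main
fi
```

Scheduling it is then just a cron entry, e.g. `0 3 * * * /usr/local/bin/backup.sh run`.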
TBH, the zipped content is around 1.5GB per backup, so my 17GB of free Google Drive space is more than enough. If I actually had a significant amount of data, I'd look into a more robust long-term solution.
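The "last two nights plus last three Sundays" retention is what keeps that footprint bounded. One way to express the pruning logic (my reading of the policy, not the author's actual code; requires GNU `date` for `-d`):

```shell
#!/usr/bin/env bash
# Retention sketch: given existing YYYY-MM-DD folder names, print the
# ones to KEEP -- the two newest nightlies plus the three newest
# Sundays. Anything not printed is a candidate for deletion.

keep_list() {
    local dates
    dates=$(printf '%s\n' "$@" | sort -r)       # newest first
    {
        printf '%s\n' "$dates" | head -n 2       # last two nights
        printf '%s\n' "$dates" |                 # last three Sundays
            while read -r d; do
                # %u gives day-of-week, 1=Monday .. 7=Sunday
                if [ "$(date -d "$d" +%u)" = 7 ]; then
                    printf '%s\n' "$d"
                fi
            done | head -n 3
    } | sort -u
}

# Example (remote listing via a hypothetical rclone remote):
# keep_list $(rclone lsf gdrive:backups --dirs-only | tr -d /)
```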
If there was a catastrophic failure, it'd take me around six hours to rebuild a new server and test it.
That's a good idea. I was thinking of doing something similar with S3 before deciding to check what other people were doing. Thanks!
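For an S3 variant, only the destination of the upload step really changes; with rclone configured for an S3 remote (bucket name below is a placeholder, not anything from the thread), the copy looks the same:

```shell
#!/usr/bin/env bash
# Hypothetical S3 destination for the same dated-folder scheme.
s3_dest() {
    printf 's3://my-backups/%s' "$1"
}

# With an rclone S3 remote the upload step is unchanged in shape:
# rclone copy "$WORK" "$(s3_dest "$(date +%F)")"
```

S3 also lets you push the retention policy into bucket lifecycle rules instead of handling it in the script.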