I don’t do any backup rotation schemes… I just do a straight rsync archive. Do I run the risk of fouling something up, mostly because of excess files? (I don’t delete when I do a backup - just add or overwrite.) I haven’t run into any problems yet, but I haven’t been in the situation where I’ve had to bring back something onto the local disk that I had updated many times.
You need to understand what you are achieving with your backup approach and what you are protecting against.
A full backup plan with history and rotation, etc. will protect you against all sorts of issues, including file corruption, accidental deletion, drive failures, theft, ransomware, etc.
A simple rsync may protect you against some of these (e.g. drive failure or theft of a laptop) but probably won’t protect you against file corruption, ransomware, etc., because any change to a file - including a corrupted or encrypted version - will overwrite the archived copy on the next sync.
Depending on how your rsync is configured, it may also not protect you against accidental deletion or some types of drive corruption: if you run rsync with the --delete option, files removed from the source are removed from the backup as well.
In general, plain rsync copies are great for providing data redundancy between physical systems (e.g. between drives or separate NAS units), but not great for actual data protection.
Similarly RAID is a redundancy scheme to provide limited protection against hardware failure, NOT a backup.
It’s up to you but I would ask yourself: how valuable is this data to me and what would be the impact of losing it?
If the answer is “very valuable, and the loss would be significant” I would suggest putting in place a full backup scheme of some sort, based on how much data you are prepared to lose. For instance, if you are prepared to deal with up to a week’s data being lost, then build your backup schedule around that.
There are plenty of options for cloud backup, including encryption and compression, that might be worth exploring.
For a more robust backup solution, still based around rsync, you might look at rsnapshot: it preserves multiple versions of your source tree (daily / weekly / monthly), with unchanged files shared using hard links.
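For flavour, a minimal rsnapshot setup might look something like this (the paths and retention counts are purely illustrative; note that fields in rsnapshot.conf must be separated by tabs, not spaces):

```
# /etc/rsnapshot.conf (excerpt) -- fields are TAB-separated
snapshot_root	/mnt/backup/snapshots/
retain	daily	7
retain	weekly	4
retain	monthly	6
backup	/home/me/ardour-sessions/	localhost/
```

You would then drive it from cron with entries like `rsnapshot daily` and `rsnapshot weekly`; because unchanged files are hard-linked between snapshots, seven dailies of a mostly-static session tree cost little more disk than one full copy.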
I recently met band members of https://lorenzosmusic.bandcamp.com/ who just keep their Ardour sessions in a git repository. This way you can roll back and also easily share sessions (merging git branches of the Ardour session file is, however, not possible).
Then you just push to GitHub or GitLab, or … and let them worry about backups.
From a disk-usage point of view it is not the most efficient approach, since it takes up twice the storage on the local disk (audio files are duplicated in the .git/ folder), and you may have to resort to Git LFS (Large File Storage, https://git-lfs.github.com/). But it is kinda neat, especially if you already use git for other purposes.
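A minimal sketch of that roll-back workflow (the session file name and its contents are made up for the demo; a real Ardour session folder would also contain audio and peak files):

```shell
# Throwaway repo; a real setup would be the Ardour session folder itself.
repo=$(mktemp -d)
cd "$repo"
git init -q

echo "<Session version=1>" > session.ardour
git add session.ardour
git -c user.name=demo -c user.email=demo@example.com commit -qm "first mix"

echo "<Session version=2>" > session.ardour
git -c user.name=demo -c user.email=demo@example.com commit -aqm "louder drums"

# Roll the session file back to the previous commit:
git checkout -q HEAD~1 -- session.ardour
cat session.ardour

# With large audio files you would additionally run (requires git-lfs installed):
#   git lfs track "*.wav"
```

Each commit is a restorable snapshot of the session, which is what makes the push-to-GitHub/GitLab approach work as an off-site backup.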
I did that for a long while using bzr (Bazaar) with earlier versions of Ardour. I remember having discussions about this with Paul and a few others at the time, and they cautioned me that it might or might not work, but I can’t say I ever had too many issues, honestly.
Git and Git LFS are great for those who are familiar with them, and I could see them working well for this.
The people in Lorenzo’s Music have figured out a workflow that (more or less) completely sidesteps the merging issues that would arise with using git (or bzr or any similar tool).
I have used pCloud. There is a desktop application for Linux, Windows and Mac that makes backup easy, and sharing files with customers is also easy.
This topic was automatically closed 91 days after the last reply. New replies are no longer allowed.