I’ve been considering paying for a European provider, mounting their service with rclone, and thus being transparent to most anything I host.
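For what it’s worth, a mount like that is roughly a one-liner — this is a sketch, and the remote name `euhost:backups` and mount point are made-up placeholders:

```shell
# Hypothetical remote "euhost" previously configured via `rclone config`.
# --vfs-cache-mode writes makes the mount behave more like a local
# filesystem for apps that write files in place; --daemon backgrounds it.
rclone mount euhost:backups /mnt/backups \
    --vfs-cache-mode writes \
    --daemon
```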
How do y’all backup your data?
Prayer
You don’t have to worry about the backups. It’s the data recovery that will require divine intervention.
Jesus is my copilot.
raid parity
Raid is backup right?
of course /s
It protects against drive failure. That is the threat I am most worried about, so it’s fine for me.
drive failure
Perhaps unintended, but the singular is very much relevant. Unless you’re doing RAID 6 or the like, a simultaneous failure of two drives still means data loss. It’s also worth noting that drives of the same model and batch tend to fail after similar amounts of time.
same model and batch
This is why when you buy hard drives, you should split the order across several stores rather than buying all of them from one store. You’re much more likely to get drives from different batches.
Oh, don’t worry they’re a random mix of old drives I had lying around, they’re most certainly not the same model, let alone batch!
(But yes, fair call if you have a big NAS. I have 2TB in my desktop.)
That’s the thing. I don’t.
Two hard drives of the same size, one on site and one off site.
Where do you keep your off-site one? Like a friend or family member’s house?
I keep one in a bank deposit box. It costs like $10/year, it’s fireproof and climate controlled, and it’s exactly the right size for a 3.5" disk. I rotate every couple of months, because it’s like a 10-15 minute process to get into the vault.
So your backed-up data can be as old as a couple of months and requires manual interaction? I guess that’s better than nothing, but I’m looking for something more automated. I’m not sure what my options are for cloud storage, or whether they are safe from deletion. Or if having it in a closet at a friend’s house is really the best option.
I have a live local backup to guard against hardware/system failure. I figure the only reason I’d have to go to the off-site backup is destruction of my home, and if that ever happens then recreating a couple of months worth of critical data will not be an undue burden.
If I had work or consulting product on my home systems, I’d probably keep a cloud backup by daily rsync, but I’m not going to spend the bandwidth to remote backup the whole system off site. It’s bad enough bringing down a few tens of gigabytes - sending up several terabytes, even in the background, just isn’t practical for me.
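A daily rsync like that can be a single cron entry — a sketch; the host, user, and paths are placeholders:

```shell
# crontab entry: push the work directory to a remote host at 02:30 daily.
# -a preserves permissions/timestamps, -z compresses over the wire,
# --delete mirrors deletions (so this is a mirror, not a versioned backup).
30 2 * * * rsync -az --delete -e ssh /home/me/work/ backup@example.com:/backups/work/
```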
At home and at the shop where I work. At work the drives are actually stored in a Faraday cage.
Either works; if you don’t trust them, encryption is always an option.
I wrote my own thing. I didn’t understand how the standard options worked so I gave up.
Tape is the best medium for archiving data.
I really want to use tape for backups, but holy expensive. Those tape drives are thousands of dollars.
Damn, the last time I thought about this (20 years ago) I was able to buy a tape drive for a PC for like … I wanna say $250-300?? I forget the format, it was very very common though and tapes were dirt cheap, maybe $10-12 a pop. Worked great, if you were willing to sit around and swap tapes out as needed.
I think the problem is that normal consumers wouldn’t ever buy a tape drive, so the only options still being produced are enterprise grade. The tapes are still pretty cheap, but the drives are absurd.
So tape doesn’t make sense for the typical person, unless you don’t have to buy the equipment and store it.
But, if you’re even a small company it becomes cheaper to use tape.
Companies don’t like deleting data. Ever. In fact some industries have laws that say they can’t delete data.
For example, the company I work in is small, but old. Our accounting department alone requires complex automated processes to do things each day that require data to be backed up.
From the beginning of time. I shit you not. There is no compression even.
And at the drop of a hat, the IT dept needs to be able to restore a backup from any time in the past. Although this almost never happens outside of the current pay cycle, they need to have the option available.
The best way they have to facilitate this (I hate it - like I said they’re old) is to simply write everything multiple times a night. And it’s everything since we started using digital storage. Yes, it’s overkill and makes no sense, but that’s the way it is for us. And that’s the way it is for a lot of companies.
So, when we’re talking about that amount of data, and tape having a storage cost advantage of 4:1 over disk, it more than pays for all the overhead for enterprise level backups.
I bought an incredibly overkill tape system a few years ago and then the power supply exploded in it and I never bothered to replace it. Still, definitely worth it
Yes, tape has very steep entry costs and requires maintenance and storage.
Most of the time it doesn’t make sense for a person to use it, but rather for a corporate entity that needs to back up petabytes of data multiple times a day.
I use Kopia
Manually plug in a few disks every once in a while and copy the important stuff. Disks are offline for the most part.
I keep important files on my NAS, and use Borgbackup with Borgmatic for backups. I’ve got a storage VPS with HostHatch that’s $10/month for 10TB of space (it was a special Black Friday deal a few years ago).
Make sure you don’t just have one backup copy. If you discover that a file was corrupted three weeks ago, you should be able to restore the file from a three week old backup. rsync and rclone will only give you a single backup. Borg dedupes files across backups so storing months of daily backups often isn’t a problem, especially if the files rarely change.
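A minimal Borg routine along those lines might look like this — a sketch, with the repo URL and retention numbers purely illustrative:

```shell
# Create one snapshot per run; Borg dedupes unchanged chunks across
# snapshots, so months of daily backups of mostly-static data stay cheap.
borg create --stats 'ssh://backup@example.com/./repo::{hostname}-{now:%Y-%m-%d}' /home/me

# Keep a rolling window instead of a single copy, so a corruption
# noticed three weeks later is still recoverable from an older snapshot.
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 'ssh://backup@example.com/./repo'
```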
Also make sure that ransomware or an attacker can’t mess up your backup. This means it should NOT be mounted as a file system on the client, and ideally the backup system has some way of allowing new backups while disallowing deleting old ones from the client side. Borg’s “append only” mode is perfect for this. Even if an attacker were to get onto your client system and try to delete the backups, append-only mode just marks them as deleted until you run a compact on the server side, so you can easily recover.
Local to Synology. Synology to AWS with Synology’s backup app. It costs me pennies per day.
Same, although AWS is my plan B. For plan A, I have an older Synology that is a full backup target.
On site? I put enterprise drives in my NAS. Always have, and I have never had a drive fail. If one does, RAID is good until the replacement arrives.
RAID is no backup. RAID helps you against drive failure.
Backup helps you if you or some script screwed up your data, or if you need to go back to last month’s version of a file for whatever other reason.
AWS helps if your house burns down and you need to set up again from scratch.
Versioning is a feature completely separate from RAID or dual NAS or whatever else you do. Your example of the house burning down is exactly why I questioned the dual NAS… both NASes will be toast.
So please, tell me again why you need two NASes for versioning? Maybe you’re doing some goofy hack, then OK. That’s still silly. Just do proper versioning. If you’re coding, just use git. Don’t reinvent the wheel.
I’m stunned that you are unfamiliar with the versioning feature of backups. In my bubble this has been best practice since Apple came along with Time Machine, but really we tried it even before with rsync, albeit with only limited success.
This is different from git because it takes care of all files and configurations, and it does so automatically. Furthermore, it also includes rules for when to thin out and discard old versions, because space remains an issue.
Synology’s backup tool is quite similar to Time Machine, and that’s what I’m using the second NAS for. I used to have a USB hard drive for that task, but it crashed, and my old Synology and a few old disks were available. This is better because it also protects against a number of attacks that make all mounted paths unusable.
Git is not a backup tool. It’s a versioning tool, best used for text files.
Your condescension is matched only by your reading comprehension. I do not know what your requirements are. You said coding and alluded versioning, so I tossed out git. Enjoy your tech debt. I hope it serves you well and supports your ego for many years.
Your condescension is matched only by your reading comprehension.
Bruh. Look into a mirror.
I do an automated nightly backup via restic to Backblaze B2. Every month, I manually run a script to copy the latest backup from B2 to two local HDDs that I keep offline. Every half a year I recover the latest backup on my PC to make sure everything works in case I need it. For peace of mind, my automated backup includes a health check through healthchecks.io, so if anything goes wrong, I get a notification.
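That routine can be sketched as a single nightly script — the bucket name, ping URL, and paths below are placeholders, not the commenter’s actual setup:

```shell
#!/bin/sh
# Nightly restic backup to Backblaze B2 with a healthchecks.io dead-man ping.
set -e
export B2_ACCOUNT_ID="..." B2_ACCOUNT_KEY="..."

# restic encrypts client-side; the repo password never leaves the machine.
restic -r b2:my-bucket:backups \
    --password-file /root/.restic-pass \
    backup /home /etc

# Only reached if the backup succeeded; if this ping stops arriving,
# healthchecks.io sends an alert.
curl -fsS -m 10 --retry 3 https://hc-ping.com/your-uuid-here
```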
It’s pretty low-maintenance and gives a high degree of resilience:
- A ransomware attack won’t affect my local HDDs, so at most I’ll lose a month’s worth of data.
- A house fire or server failure won’t affect B2, so at most I’ll lose a day’s worth of data.
restic has been very solid, includes encryption out of the box, and I like the simplicity of it. Easily automated with cron etc. Backblaze B2 is one of the cheapest cloud storage providers I could find, an alternative might be Wasabi if you have >1TB of data.
How much are you backing up? Admittedly Backblaze looks cheap, but at $6/TB it leaves me at $84 per month, or just over $1000 per year.
I’m seriously considering an RPi 3 with a couple of external disks in an outbuilding instead of cloud.
Oh, I think we’re talking different orders of magnitude here. I’m in the <1TB range, probably around 100GB. At that size, the cost is negligible.
Isn’t Backblaze like $6 per TB 🤔🤔🤔
So $216 a year?
$6 × 14TB = $84/month × 12 months = $1008 per year, or did I misread the prices?
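The arithmetic does check out at a $6/TB/month rate — a quick sanity check:

```shell
# 14 TB at $6/TB/month
TB=14
RATE=6
MONTHLY=$((TB * RATE))
YEARLY=$((MONTHLY * 12))
echo "\$${MONTHLY}/month, \$${YEARLY}/year"   # prints: $84/month, $1008/year
```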
Sorry, I thought you or somebody said they store 3TB. Probably I’m mistaken, sorry 🥲
Also, you know it’s possible to set up backups to run when the drive connects; it’s a good idea to turn off networking beforehand 😶🌫️ (You can also do a “timer USB hub”: it’s not very off-site, but a switch can turn on every X days, the machine mounts the drive and does the backup, then the USB hub turns off. Imagine putting it in a fireproof safe with a small hole for a USB cable.)
Also, I’m using ntfy.sh for notifications. And if you’re using RAID, you can set it up to notify you on a drive failure.
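For the RAID-failure notification, mdadm’s monitor mode can hand events to a script that pings ntfy.sh — a sketch; the topic name and script path are made up:

```shell
#!/bin/sh
# /usr/local/bin/mdadm-ntfy.sh
# mdadm --monitor invokes this with the event name and the affected
# md device as arguments; forward them to an ntfy.sh topic.
curl -d "mdadm: $1 on ${2:-unknown device}" https://ntfy.sh/my-nas-alerts

# Run the monitor, e.g. from a systemd unit or mdadm.conf:
#   mdadm --monitor --scan --program=/usr/local/bin/mdadm-ntfy.sh
```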
The only type of data I care about is photos and video I’ve taken. Everything else is replaceable.
My phone —> Immich —> Backblaze B2, and some Google Drive.
Linux isos I can always redownload.
rclone to Dropbox and OpenDrive for things I care about, like photo backups and RAW backups, and an encrypted rclone volume to both for things that need to be backed up but also kept secure, such as scans of my tax returns, mortgage paperwork, etc. I maintain this script for the actual rclone automation via cron.
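An encrypted rclone volume like that is a “crypt” remote layered over the cloud remote — a sketch; the remote names here are placeholders:

```shell
# One-time setup: wrap the existing "dropbox" remote in client-side
# encryption. rclone prompts for (or generates) the passwords.
rclone config create secure crypt remote=dropbox:encrypted

# Plain sync for the photo/RAW backups...
rclone sync ~/Pictures dropbox:photos

# ...and the sensitive documents go through the crypt remote, so the
# provider only ever sees ciphertext (filenames included, by default).
rclone sync ~/Documents/tax secure:tax
```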