I have a Synology NAS with a few TB of data that I need to back up. Originally I was using Amazon Drive with my Unlimited plan, but since Amazon terminated those plans I was without an offsite backup for a while (yes, I know, shame on me).
I recently purchased an external hard drive for my Mac and I want to back up my NAS to it as well, so it is included in my Backblaze plan.
I know Katie and David mentioned they both use a similar setup, but they never went into detail.
Does anyone have an idea how to accomplish this? The Hyper Backup package from Synology only supports WebDAV (which requires macOS Server) or rsync via SSH.
Should I set up a new user for SSH and rsync, or use a tool to clone the network drive?
Does anyone use a similar setup?
I back up my Drobo NAS using an external drive and Backblaze.
I run Carbon Copy Cloner on my iMac, and every week it backs up my Drobo to an 8 TB external hard drive. That external hard drive then gets backed up to Backblaze. The NAS doesn't need to support any special backup features; it just gets mounted as a network drive and CCC takes care of the rest.
I use the very reasonably priced Amazon Glacier service for backup of my Synology. Restore will not be fast if needed, but the drives have good redundancy already. I also keep mostly archive stuff on it and can do without the files for a period.
I think the best approach is to use Carbon Copy Cloner to clone the Synology to the external drive on your Mac. I believe this is the setup that Katie Floyd has described.
There are two other Mac cloning apps that are also well-known and very good: SuperDuper! and ChronoSync. I personally use CCC, but both of the others are well regarded as well.
You will need to set up the cloning software to ensure that the Synology share is mounted for cloning. This is quite easy to do, as CCC at least will do this automatically for you.
I would probably NOT use Hyper Backup to the Mac external drive, because you are layering another, essentially unnecessary data structure on top of your data. To restore your files you would need to download the entire set from Backblaze and then use Synology's Hyper Backup Explorer…too many places for something to go wrong.
I will reiterate my comments from another thread. I personally stopped using Backblaze because they only retain deleted files for 30 days. I have already had one situation in which I discovered missing files many months after the last time I had accessed them. With Backblaze they would have been gone from the online store. With Arq, they were readily retrievable. At the time I did not have a robust in-house Time Machine and clone setup, which was a mistake. Now, my Mac mini server and MacBook Pro both do clones and Time Machine (to an external drive on the mini) as well as the Arq backup to Backblaze B2. The Synology shares are cloned daily, and Arq backs them up to B2 every 6 hours.
As an aside, it is possible to set up a WebDAV share without macOS Server (in fact, this will soon be necessary, as Apple is removing services from Server). The setup is complex, and The Google can help you find instructions, but I'm not sure it is worth the effort just to use Hyper Backup at home.
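For the curious, a WebDAV share without macOS Server typically means Apache with mod_dav. This is only a minimal sketch of the relevant httpd.conf fragment; the paths, the `/nasbackup` alias, and the password file location are all placeholder assumptions, and you would still need to load the `dav` and `dav_fs` modules and create the password file with `htpasswd`.

```apacheconf
# Minimal mod_dav sketch (placeholder paths, not a complete httpd.conf)
DavLockDB "/usr/local/var/DavLock"

Alias /nasbackup "/Volumes/Backup/nasbackup"
<Directory "/Volumes/Backup/nasbackup">
    Dav On
    AuthType Basic
    AuthName "NAS backup"
    AuthUserFile "/usr/local/etc/httpd/webdav.passwd"
    Require valid-user
</Directory>
```

As the post says, this is fiddly enough that a plain rsync or cloning setup is probably the saner route for a home backup.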
I own a QNAP NAS but I think the situation might be similar to some degree. My use case is a little different though.
I have two ways in place to backup the NAS data:
I have a Virtual Machine (Windows 10) installed on my NAS, which does several tasks for my needs. In this VM, Arq is backing up one share to one OneDrive Account and a second share to a different OneDrive account. Both accounts are included in my Office 365 subscription (1TB each). I am aware of the fact that there are different solutions out there, but I like it to be independent of app limitations. Having a dedicated virtual machine on my NAS enables me to have a (slow) full-fledged PC on the NAS. I also use RDP to access the machine from my Mac if I need Windows for minor tasks “on my Mac” (only small things, so it does not matter that the VM is quite slow because of the NAS’ hardware limitations). But that is a different topic.
Once a week, I attach an external USB drive to my QNAP. When the QNAP identifies the USB drive, it starts backing up the NAS contents to it automatically. This is done with QNAP's internal backup solutions. The weekly incremental backup takes about 20 minutes; the first backup needed several hours. The USB drive is formatted as HFS+, so I am able to read from it directly using a Mac if necessary. Synology seems to be able to do the same: https://www.synology.com/en-global/knowledgebase/DSM/help/DSM/Tutorial/backup_backup#t2.1 Synology seems to prefer not using USB Copy, but the "simple" USB copy in combination with HFS+ means that I can access the data on my Mac natively in an emergency. And I like that a lot.
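The "weekly incremental backup" idea above can be sketched with rsync's `--link-dest` option, which hard-links unchanged files to the previous snapshot so each run only stores what changed. This is not what QNAP's internal tool necessarily does under the hood, just an illustration of the technique; the paths are placeholders standing in for the mounted USB drive.

```shell
#!/bin/sh
# Weekly snapshot sketch: week2 reuses unchanged files from week1 via hard links.
SRC=/tmp/usb_demo_src
USB=/tmp/usb_demo_drive   # stands in for the mounted USB drive
rm -rf "$SRC" "$USB"
mkdir -p "$SRC" "$USB"
echo "v1" > "$SRC/doc.txt"

# Week 1: full copy
rsync -a "$SRC/" "$USB/week1/"

# Week 2: a new file appears; unchanged files are hard-linked to week1,
# so the second snapshot costs almost no extra space
echo "new file" > "$SRC/extra.txt"
rsync -a --link-dest="$USB/week1" "$SRC/" "$USB/week2/"
```

Each `weekN` directory looks like a complete backup you can browse directly, which matches the appeal of the HFS+ approach: plug the drive into a Mac and read the files natively.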
Synology actually has a very good alternative, Synology C2 backup, which you can use with Hyper Backup. It works really well, and as their servers are in Germany, it should be very fast for you (looking at your flag).
I have mine running every night, and have never had issues. It’s 59 euros per year for 1TB, and that is enough for me.
I have a couple of Synology NAS units installed at my clients' offices and my own home office.
All of them have dual backups: cloud-based using Backblaze, and a local USB drive.
For the cloud backup I use Synology's cloud backup package, and Hyper Backup for the USB drive.
My smallest Synology NAS has 7 drives. Besides dual-drive failure protection, I also have one drive parked as a hot spare that will automatically be added to the RAID, leaving the dual-disk redundancy in place and giving me ample time to travel to the site and install a new spare.
Do install an APC UPS and connect it to the NAS so it can shut down properly during a power failure and start up again afterward.
The only other thing you could do for data redundancy is to mirror the Synology NAS to one or more other units installed in different locations, etc.
I would have never thought of using a VM to run Arq…now you have my brain thinking!
If the NAS is capable of it (at least a "modern", somewhat powerful Intel Celeron CPU and ideally 8 GB of RAM), you will not regret it. It will be a bit slow, and not useful for editing photos or video. But it really is nice to have a Windows machine for the occasional need without wasting space on my Mac, and it is just great for running server tasks without having a dedicated computer consuming power 24/7.
Yep, I am running a Windows 10 Pro install on my DS1517+ equipped with two M.2 SSDs and 16 GB of memory, using the web interface or Windows Remote Desktop. It works "ok" but is not the fastest machine around the block. One very annoying thing is that accessing the internet (browsing the web) from the VM only runs at a fraction of the speed available on the WAN.
But it works well enough if you don't need it all the time.