I have an old Skull Canyon NUC with 32GB of memory that I run VMware ESXi on.
Currently it's running a pfSense firewall, VMs for Nextcloud, Pi-hole, a GitLab server, a Portainer Docker host, a Ubiquiti controller, a Windows 10 desktop and Windows Server 2019. Oh, and a Minio server for backups via Duplicacy. And a WireGuard server running on the Nextcloud VM.
Sits on a shelf in my basement and just keeps humming along.
I switched from CrashPlan to Duplicacy backing up to a GSuite account. I'm using the command-line Duplicacy client on FreeBSD and Windows 10, and have it set up to launch via cron and Task Scheduler.
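In case it's useful to anyone, my cron entry is roughly this (paths and schedule are placeholders for whatever your setup looks like; the CLI has to run from inside the folder you initialized with `duplicacy init`):

```sh
# Nightly Duplicacy backup at 02:00; log output for troubleshooting.
0 2 * * * cd /home/me/data && /usr/local/bin/duplicacy backup -stats >> /var/log/duplicacy.log 2>&1
```

On the Windows side it's the same command wrapped in a Task Scheduler job.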
> B2 has to use CloudSync instead of HyperBackup?
If you do not want to use Hyper Backup, I'd recommend that you use a 3rd party backup software, instead of a sync software (whether Synology's or other).
The reason is that there are benefits to backup software that sync software generally doesn't have (at least not all of them):
I'm actually making the switch away from Hyper Backup and I'm using Duplicacy-Web in a docker container right now. While the UI may feel a bit rough around the edges at times, it's been rock solid, much faster than HB, and I prefer its retention configuration over HB's. It's a cross-platform tool written in Go; the CLI version is open-source and free for personal use, while the Web version is available for a modest subscription fee. So far I'm very happy with it and can recommend it.
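For anyone wondering what I mean by retention configuration: Duplicacy drives it through the prune command's -keep flags. A sketch of the kind of policy you can express (the numbers are just an example, not a recommendation):

```sh
# -keep n:m = keep one snapshot every n days for snapshots older than m days
# (n = 0 deletes them entirely). Options go from the largest m to the smallest.
duplicacy prune -keep 0:360 -keep 30:180 -keep 7:30 -keep 1:7
```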
> However, with a four month old and no time
that's familiar :)
> I need something simple
I'd recommend Duplicacy's UI front end, which is very simple and rather cheap ($20, then $5/year after the first year). The command line version is free for personal use though, so you could technically set everything up with the trial UI and then just configure a scheduler to run the command line version periodically. This will save you maybe 20 min of reading documentation about how to configure the backup destination in the command line. The massive benefit of this tool is that you can configure multiple computers/accounts to back up to the same destination archive, and while maintaining a separate backup history for each backup set, the tool will deduplicate data across all machines -- which saves a lot on storage costs.
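Roughly what the multi-computer setup looks like with the CLI (the bucket name and snapshot IDs below are placeholders; init will prompt for your B2 credentials):

```sh
# Machine 1 -- run inside the folder you want backed up.
# -e turns on client-side encryption.
duplicacy init -e desktop-id b2://my-backup-bucket
duplicacy backup -stats

# Machine 2 -- same bucket, different snapshot ID. Chunks that
# machine 1 already uploaded are deduplicated, not re-uploaded.
duplicacy init -e laptop-id b2://my-backup-bucket
duplicacy backup -stats
```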
Some people have good experience with Arq backup application - license costs $50 - but I was unable to make it work for me due to weird technical issues. You may want to try the trial and see if you like it.
I had similar issues with Backblaze's consumer product and moved over to their B2 service. I have never liked how Backblaze handles external drives but I do understand why they do it, from a business perspective. Crashplan used to handle external drives with ease but again, it's not sustainable.
Not pushing any particular product but you might want to look into Arq (what I use) or Duplicacy (used to use). There are others out there but I use / have used both of these extensively. Both work with B2 (and other cloud storage services) and handle external drives, NAS, etc., with ease.
Another option that a lot of people on this forum swear by is duplicacy. You should be able to point it to any endpoint (external drive, B2 etc) and get a backup done with almost all the features you mention. Look up posts by /u/ssps in this subreddit. Has the advantage of being open format, so you’re not dependent on Synology software at multiple levels.
“No use for redundancy” is an interesting philosophy for a central repository that you are directly working on :) Essentially if the one disk fails, you are purely reliant on the last backup! Even a RAID1/SHR1 would allow you to lose a single drive and not have a disruption to active work.
Synology’s use of Btrfs gives you filesystem snapshots as well (which is not the same as a backup, think more like Time Machine.. ability to go back to a previous point in time). You should look into that and use a combination of these features to protect yourself from different sources/types of failure.
My recommendation is duplicacy. https://duplicacy.com
No extra space is taken. Backups are incremental and differential. If you back up 1TB of unchanged data 100,000 times it will take 1TB (+ a couple of KB for accounting). Likely much less, however, because most backup tools also support compression.
I use this: https://duplicacy.com and target Backblaze B2. (I also still use HyperBackup to back up Synology-specific stuff only, like system state.)
You can run it in a docker container (https://hub.docker.com/r/saspus/duplicacy-web).
Since it's written in Go and is a self-contained executable without dependencies, there is no real reason to use docker, nor to wrap it into a Synology package (other than distribution convenience in both cases).
Here is how to run it natively https://blog.arrogantrabbit.com/backup/Duplicacy-Web-on-Synology-Diskstation/
When not to use it: if your diskstation is severely memory constrained. HyperBackup was designed to churn through terabytes of data using a minuscule amount of RAM.
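If you do prefer the container route anyway, something like this is a reasonable starting point (the port is duplicacy_web's default; the host paths and mount layout are placeholders based on my reading of the image's docs, so double-check them there):

```sh
# Duplicacy-Web in a container; the web UI ends up on port 3875.
# /backuproot is mounted read-only as the data to back up.
docker run -d --name duplicacy-web \
  -p 3875:3875 \
  -v /volume1/docker/duplicacy/config:/config \
  -v /volume1/docker/duplicacy/logs:/logs \
  -v /volume1/docker/duplicacy/cache:/cache \
  -v /volume1:/backuproot:ro \
  saspus/duplicacy-web
```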
Locked into Arq. Most backup apps are like this though. Duplicacy is similar to Arq, but they have a restore application (command line) that is open source and free to download via GitHub.
Why not Duplicacy?
I'd run it on every machine though, not just the Syncthing Server. It deduplicates between machines, so if one machine uploads a file that another machine has it shouldn't upload much whatsoever.
I'm confused about its licensing. On GitHub it shows "Free for personal use or commercial trial" but then on https://duplicacy.com/buy.html it shows "License Types: There are 3 types of Duplicacy licenses", all costing money.
Can anyone clear this up for me?
I'm thinking of a similar scheme, but using Duplicacy as the backup tool, since I have Linux, Mac and Windows clients that I want to back up to it.
Just looking for something small and quiet to use as an SFTP server now.
EDIT: Decided to build a Socket 1151 Mini-ITX system in a Silverstone Milo Series ML08 case. I think I can probably fit two 3.5" hard drives in there if I don't have a video card. I'll put Proxmox on there for containers and ZFS storage.
> backing up to the nas via SFTP.
If OP has a model that supports Docker, I would highly recommend using Minio instead of SFTP. It's faster, uses fewer resources, and has checksums built in, so if your backup repository is corrupted, Minio will tell you (or your backup software will).
Arq has a guide for setting it up on Synology.
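For the curious, a single-node Minio under Docker is about this much work (credentials, port and path are placeholders; newer Minio releases use MINIO_ROOT_USER/MINIO_ROOT_PASSWORD instead of the keys shown here):

```sh
# Minio serving /srv/minio over the S3 API on port 9000.
docker run -d --name minio \
  -p 9000:9000 \
  -e MINIO_ACCESS_KEY=backupuser \
  -e MINIO_SECRET_KEY=change-me \
  -v /srv/minio:/data \
  minio/minio server /data
```

Then point your backup software at http://nas-ip:9000 as an S3-compatible endpoint.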
> Kopia is free and opensource
Arq is also, as far as I know, the only 3rd party Mac backup that includes all metadata.
It also supports running missed backups when the machine is next able (battery power, no network, etc.), and will not schedule a bunch of backups in a row if one or more is missed.
Kopia is great if you have many machines that have the same files and you want to back them all up. With Arq, deduplication is done per backup job, whereas Kopia deduplicates at the repository level.
As you wrote, Kopia is “unstable” (works for me, but YMMV), but if you want the functionality of Kopia in a stable package, I highly recommend Duplicacy. The CLI version is free, but I also highly recommend that you buy a license just for the GUI. Duplicacy works in mysterious ways, at least if you're used to more traditional backup methods, and the UI nearly abstracts that away.
> use Restic as the client on the PCs.
That’s an odd recommendation for “family PCs”. Using a piece of backup software that has no GUI, means that all restores/recovery falls back on you.
I would probably go for either ArqBackup or Duplicacy. In case it needs to be free, something like Kopia. Kopia is not yet released in a stable version, so it's probably not the best option for now.
> This is one area I would probably go with a commercial solution instead of self-hosting
If it’s the only backup I fully agree. The second your user count increases above 1, you’re no longer entertaining a hobby, you’re a full-blown sysadmin with responsibilities. Backup may not be the most critical thing to keep running compared to Nextcloud, but it is still something people rely on.
But yeah, buy a cloud backup.
I prefer to use 3rd party software rather than Synology's software.
I use sync software (In my case Heatsoft ADCS) to sync my data from my Windows PC to the NAS.
I use Duplicacy-Web in a docker container to back up my NAS data to two different cloud storage providers.
I have BTRFS snapshots enabled on my NAS to have ransomware protection.
You can also use Duplicacy-Web to back up files from your PC to your NAS, if you prefer to use backup software already at that step. But I'd still strongly recommend an online backup of your important data.
Any single HDD is not bit-rot protected. IMHO that limits the use-case for backups in the following ways:
For under 1TB of data, use backup software and back up to a cloud storage provider like Google Drive or Backblaze B2. You can use Synology's Hyper Backup to do the backups, though I don't like it as much and use a 3rd party software (Duplicacy-Web in a docker container).
Have a look at Duplicacy. It’s an open-source multi-platform backup tool. It supports cross-machine, lock-free deduplication. The command line version is free for personal use. The GUI for it is extremely affordable ($20 + $5/year for the first computer and half of that for each subsequent one). The choice of storage backend is yours. (Personally I use the command line version, but bought a few licenses anyway to support the developers and reward great work.)
You can run it on a nas, on your Mac, on your PC targeting the same destination backend storage, with client-side encryption, and the data will be deduplicated across all your devices. It’s awesome. Look at my recent post history for the description on how to run it on Synology natively.
As a destination you can use any storage available to you. For example, if you already have a G Suite subscription you can utilize the unlimited storage to keep the Duplicacy datastore there. Or you can use Backblaze B2 too. It supports quite a large number of storage providers.
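The backend is literally just the URL you hand to init; for example (snapshot IDs and paths are placeholders, and the gcd:// backend will ask you for a Google Drive token file per Duplicacy's guide):

```sh
# Google Drive (G Suite) as the datastore:
duplicacy init -e my-mac gcd://backups/duplicacy

# ...or Backblaze B2 -- same workflow, different URL:
duplicacy init -e my-mac b2://my-backup-bucket
```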
Duplicacy. Open source. The command line version is free for personal use. Supports multiple backends, including local disk. Supports lock-less multi-machine deduplication, encryption, pruning, and is very, very fast. Leaves CrashPlan in the dust.
Have a look at Duplicacy: https://duplicacy.com/. It fits several of your requirements. The command line version is free for personal use, the GUI version is paid, and it supports several cloud backends.
If you're getting class C transactions then it sounds like your software is doing something wrong. As you indicated, those only list things, and all of that should be cached locally.
I have not used Duplicati though, so I can't comment on it. Maybe look into Duplicacy instead? I know that Duplicacy uses HEAD requests instead of LIST for checking the status of 1 item.
>Now I have a new question. What's the best method of copying/mirroring/cloning/whatever-ing my first drive with the second.
Let's say drive 1 sustains some unrecoverable data corruption - do you want to blindly mirror or overwrite that on drive 2? Probably not! Consider making drive 2 a true backup that stores the old copy AND the new copy of files that appear to have changed. This way, you can recover from things like corruption or accidental deletions in addition to a complete drive failure. Duplicacy is a nice option (https://duplicacy.com/).
Duplicacy or its command line version. Search this sub. There are detailed comparisons.
To summarize, I would avoid both arq and duplicati for many reasons, stability and performance leading the list.
Duplicacy is currently my backup program of choice. The algorithm is ingenious, yet so simple and elegant that it amazes me no one has done it before.
It supports all three major OS (including VSS on Windows). I even have it running automated backups on my Android phones and tablets, and my Raspberry Pis. Hell, I even have it backing up my HTPC (Shield TV) that has nothing of value on it. It supports all major cloud providers as backends, and you can even backup to multiple. I personally backup all of my devices to an on-site SFTP backup server, which then copies the backups to GDrive, B2, and Hubic.
The GUI requires a paid license, and frankly, sucks. He's rewriting it atm. The CLI version is free for non-commercial use, and is a lot more powerful/flexible than the GUI.
It has a bit of a learning curve, but once it clicks it's easy to use. To get the most out of it I recommend reading the main GitHub page and all of the wiki pages. The design wiki pages are particularly interesting if you're technical, as he details the algorithm in depth -- to the point where you can restore and decrypt your files by hand if you ever need to.
I think Duplicacy is made for this. It explicitly supports both Backblaze B2 and having multiple computers hit the same repository. The command line version is free for personal use, GUI version costs a little money. Well worth a look: https://duplicacy.com/ and https://github.com/gilbertchen/duplicacy
You seem to be conflating backup and sync - or maybe I'm just tired.
For sync I give another vote for Unison, provides a nice UI for rsync. For backup I'm currently using Duplicacy - and no that's not just a misspelling of Duplicity or Duplicati.
Really wish backup tool authors would stop calling their software dupli(something) :(
I ran into those issues when trying out Duplicati 2.0.1.22 through 2.0.1.31 on both Windows 10 x64 and Ubuntu 16.04. AFAIK 1.x is no longer supported.
I was talking about Duplicacy (not Duplicati) not supporting Amazon Drive. I've stopped playing with Duplicati and am now trying out Duplicacy instead.
With Duplicity, Duplicati and Duplicacy - there's an awful lot of similarly named backup software out there!
Yeah, the UI is a breeze. Just click through a couple of drop downs. A little slow when working off of an S3 bucket, but even that was still faster than what Vorta and straight Borg were giving me.
Probably worth mentioning that there is a License Exception for the UI that actually allows restoring; it's the backing-up portion that they restrict.
I too stopped using Duplicati and changed to Duplicacy.
/u/dn4nm3d Do you find the benefits of buying the license worth the money? I've only used the CLI. Is the only difference a UI? If so, how is the restore process in the UI? I ask because restoring from CLI is cumbersome.
I personally use a Drobo, which is a stand-alone drive enclosure that you can fill with multiple hard drives of the same or different sizes, and which serves as a network storage server. It'll maintain redundancy so that if one disk fails, you just swap it out for another one and your data continues to be available.
For offsite backup, I use Duplicacy and Backblaze B2 cloud storage on the back end.
My Drobo currently has five hard drives with a total raw storage of 12 TB, and after redundancy, that capacity is about 7 TB. As an occasionally serious amateur photographer with a library going back about 13 years, I have 3.28 TB in use and 3.80 TB free, so this solution will probably be great for me for a while yet.
BTW Drobo's a really powerful platform. You can set up remote access to a Drobo NAS (network-attached storage) from mobile devices, run a web server, install a Plex media server, and do plenty of other stuff I'm not even aware of.
I really like Duplicacy (note this is not the same as Duplicati, which /u/ProbablePenguin recommended) for S3 standard class.
Arq is one of the few pieces of software that performs well with S3 Glacier (/Deep Archive) storage classes, though this won't be an issue for you on Sia.
I’m still looking for the holy grail, but for now I use Borg with Borgmatic to backup. I do however backup in the opposite direction of most people as my “NAS” is in the cloud, accessible through rclone + crypt, so my backups run on FUSE mounts to a local drive.
I’ve tried Restic, but that slows down after 2TB or so, as in 23+ hour prune operations.
I’m currently evaluating Duplicacy, but ultimately I will probably switch to Kopia once it reaches 1.1. In its current state, Kopia is alpha software, and while it’s cool, and probably works well, it’s not something you should trust with your data (yet).
For clients I use Arq.
You should stay away from Duplicati at all costs. It mostly works well. The problem is the "mostly": I've seen it corrupt multiple repositories, with nothing but deleting the repository and starting a full backup over to fix it.
Hi, so the idea was to have a ZFS mirror (I mistakenly wrote RAID 1 in my OP, sorry), so that one disk replicates the other, to make use of ZFS self-healing capabilities. Additionally, I will also have a backup system (I was thinking about Duplicacy).
But perhaps this is overkill and I could get away with having just one SSD + backups. I would lose the self-healing functionality, but perhaps that's OK. This is the first time I've had a NAS, so I'm inexperienced, and my original idea could potentially be wrong, or overkill, or too expensive.
I’m happy with using Duplicacy. It’s free for the command line version - which I use.
It supports most platforms and cloud providers including backing up to network drives and local storage.
The performance and reliability is better than Arq and is probably the best for any free online solution.
The developer is also highly involved in the project.
The downside is the GUI version isn’t as polished as Arq and costs a small amount.
The command line version is very powerful but has a steep learning curve if you want to automate it completely. I’ve set mine up to start as a launch agent and use open source scripts to further simplify it.
Well, I'm kind of a noob in the area, but for my setup, which is similar to yours, I've used Duplicacy.
https://duplicacy.com/download.html
~~GUI~~ CLI version is free, and easy to use.
I've tried a bunch of other backup software, but Duplicacy has been the best so far.
It makes incremental backups with versioning, so the first backup is a complete copy of all your data, but each subsequent "backup" is incremental, saving only the changes. So versioning is much faster.
I use Duplicacy since it supports deduplication, meaning you can back up multiple sources to the same cloud storage and use less storage overall if the same files exist. It also has built-in encryption.
Look into Duplicacy - it has a free-for-personal-use CLI version (open source) that works on most platforms and you could set it up with the Pi as an sftp backend.
Does encryption, de-duplication, multiple simultaneous backups. Also supports various cloud providers, so you could script the Pi to make a copy of its storage off-site.
(Even though the CLI version is free, I highly encourage anyone to consider the Web GUI for the backup client, which is very affordable and supports the developer. Uses the CLI as the backup engine under the hood.)
Yeah, their pricing model is ridiculously confusing: https://duplicacy.com/buy.html. But it's there to provide discounts for multiple machines, multiple years, and non-commercial use, so I don't know whether to be upset or happy about that.
I'm pretty sure ghettoVCB would still work.
For home I used to use the personal license ($20/yr) of Vertical Backup. Incremental, block deduplicating, lots of backup targets. I use a minio S3 server running on FreeNAS and a Backblaze B2 bucket for offsite. Hasn't been upgraded to v7 though.
Made by the same people that make Duplicacy. You can even use the duplicacy web gui to restore VMs.
I use Duplicacy (free command line, relatively cheap yearly license for GUI) to backup to Backblaze B2. It's similar to Restic and Borg, I'd say.
It can backup to all kinds of other storage as well.
I personally use this https://duplicacy.com. You can target any storage you like there, including SFTP, B2, and S3. Search this forum and /r/Backup for it — you’ll find many recommendations.
There are many others: qBackup, ArqBackup, Cloudberry, borg, restic, ...
> The test here would be to run the same backup against a local storage and see how CPU utilization is then. If it isn't higher, then perhaps it's a limit of Hyper Backup (HB). The CPU in the DS1817+ is an Intel Atom C2538, which has 4 cores and 4 threads. If HB runs this in single-thread mode, then you've hit a CPU limit and it might be time to consider a different backup software.
That's a good idea, I may try that in the future, thanks.
> It might be faster in the sense that HB doesn't need to keep track of 2 million files, only of 196 files. However, on the flip side this will ruin all efforts at deduplication between versions, and each time a single file changes (and you recreate that ZIP file), it will likely need to upload the entire ZIP file. So if the test against local storage doesn't show an improvement, you know that it is your CPU, and then you might want to discard the ZIP file approach.
The files are just pictures/flash files from a website so I won't make any changes to them. I only ever need to view them from time to time (very rarely) as I just wanted to back up the website. I think the ZIP file idea is okay because of this.
> P.S.: If you'd like to try a different backup software, I can recommend Duplicacy-Web. At times a bit rough around the edges, but the engine is very fast and reliable and I never had a problem in the 4 months that I've been using it. And it's apparently multi-threaded. FWIW I'm running it on my DS1819+ in a docker container. The command line version is open source and free for personal use, while the web version, which I use, has a modest annual subscription fee.
I'll check it out, thank you! Thank you for all of your help. :)
> What do you recommend aside from a USB backup?
I don't even recommend a USB backup in most cases, since it doesn't have bit rot protection. Copying your media files to a USB drive is fine, but copying important data to a USB drive is not very useful. Also using a backup program (that organizes your data into chunks) is not recommended for the same reason: A flipped bit will (silently or obviously) corrupt the important data file/the backup from the backup program.
I am using a 3rd party program (Duplicacy-Web) to back up my most important data (about 50GB) to Backblaze B2 (and also Google One/Drive, but B2 is better). My media files (which are the bulk of my data) are copied/synced to an external USB drive and I regularly (every 1-3 months) check for bit rot by running a file content comparison against the NAS.
> A. How do I just go and reformat my NAS to a non-RAID setup? I didn't see anything immediately obvious.
You delete the volume and the storage pool and create new ones. Be careful though, this will likely also take out all your NAS apps and their configuration along with your data, so I wouldn't recommend this, particularly if you're planning to use this device as a backup server.
> B. What Windows 10 backup software would you use to backup to this NAS?
I use Duplicacy-Web and can highly recommend it. You can even run it in a docker container on your NAS to copy those backups to a cloud service. I'm actually just syncing (copying) my changed files from my workstation to my NAS and then let Duplicacy handle all the backups of my non-media files (to Google Drive and Backblaze B2).
> I got 2 NAS, my 918+ is just my normal NAS, my 216j is my BackUp for my most critical data.
You have a problematic choice as a backup machine, since j-series models don't support the BTRFS file system and therefore are not protected from bit rot. I hope that you are using BTRFS on your 918+. Together with disk redundancy and the "data checksum" enabled on each shared folder this will protect you from bit rot there.
After using Hyper Backup for several years to backup the most important data on my NAS, I've recently switched to Duplicacy-Web. The UI is a bit rough around the edges, but the engine itself is very solid and it performs much better than Hyper Backup.
My backups are automated (in my case just to Google Drive and Backblaze B2) and I wouldn't use a process that doesn't run on auto-pilot. It seems that you generate a full backup image of your selected data every time, but good backup software is able to use additional storage only for new or changed files.
I don't know if Synology offers anything for this, but I'm using a 3rd party backup software called Duplicacy to back up my sister's Mac to my own NAS. I actually prefer a 3rd party tool, so I'm not locked into the Synology ecosystem too much.
If you're looking for software recommendations (other than the obvious Time Machine), I can recommend Duplicacy-Web. I'm using it to back up my sister's Mac (the user data folder) to my Synology NAS over the internet. And since my sister isn't exactly a technophile (to phrase it mildly), it had to be something that runs automatically. So I've combined Duplicacy with a VPN client (since it's over the internet) and it's been rock solid.
The UI is a bit rough around the edges, but the engine itself is very fast and solid. It has all the usual features (versioning, deduplication, client-side encryption). A somewhat unusual feature is cross-client deduplication, that is, storage space is deduplicated between multiple clients backing up into the same storage. Though that means they all need to know the storage password, so they would have access to the other clients' backups.
Also Duplicacy has a "copy" feature, which I'm using to sync a local backup to cloud storage space.
So here's what you could do with Duplicacy:
Create local backups for all your clients to your NAS (either into the same storage space or separate ones). Even if backed up into the same storage space, the backups would still be distinguishable by backup ID. You can schedule those backups multiple times per day, if you want.
Run a Duplicacy client in a docker container on your NAS and use the copy feature to sync those backups to one or more cloud storage spaces. I'm using Google Drive and B2 for that, but Google Drive is actually pretty bad performance-wise. Again, you can do this multiple times a day or just once overnight.
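The copy setup itself is short; something like this (storage/snapshot names and the bucket are placeholders -- "default" is whatever your first storage is called):

```sh
# Attach a second, copy-compatible storage to the same repository:
duplicacy add -copy default offsite clients b2://offsite-bucket

# Replicate the existing local snapshots to the cloud storage:
duplicacy copy -from default -to offsite
```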
Please note that the CLI version of Duplicacy is free for personal use, while the Web version (which I use) has a modest per-client annual license fee (which even goes down with additional years and additional clients). I paid for two licenses and it's money well spent.
> I want to backup a few folders on my DS218+ to an old WD Mybooklive NAS.
Unless this old NAS has bit rot protection, you might want to reconsider using it as a backup for anything important.
> I want it to backup the folders I have specified, and If I delete a big file, I want the backup of that file deleted and so forth.
Think about that twice (or perhaps even three times). If you accidentally delete something (and don't notice), do you really want it to disappear right away? That's generally not a desired behavior with backups. Perhaps what you are looking for is really a sync, not a backup.
> I don’t want it to make a full backup of all the files each time it’s runs.
Most backup and sync programs have an option to only backup what has been changed.
> Is Hyper Backup the best option?
It is the best backup option within the Synology suite of apps and is very integrated into DSM, e.g. it can backup system and app configurations. Is it the best, period? Not in my book. I've recently started using Duplicacy-Web and - while a little rough around the edges - I consider it superior to Hyper Backup.
> What about retention options for it? If I have a file that never changes, will it just keep the single version of it?
If you have a file that never changes, that file will appear in all versions, but the storage space for it will only be allocated once and other versions will simply point to the same storage location.
Duplicacy has lock-free deduplication. And a really nice web-based GUI. Also supports pretty much all of the cloud storage providers.
I use it to back up to an S3 bucket on my FreeNAS server and also offsite to a Backblaze B2 bucket.
I like Duplicacy-Web as a backup software. The CLI version is free for personal use, the price of the web version is very reasonable (with 50% discounts for subsequent licenses and subsequent years).
One of the great features is cross-deduplication across multiple clients within the same storage.
The Web UI is a bit rough around the edges, but the functionality is very solid and fast.
Wow! That's quite a lot of research! Thanks for sharing it. I'm going to spend some time parsing through it.
From the research that I did and articles that I've read, Duplicacy seems like the winner in my book. It's well priced (Web Edition is $20), light, supports encryption, has moderate to fast upload/download speeds, and more importantly, is good at picking up where it left off in the upload/encryption process. This is a big one for me, as if a drive is disconnected or some issue arises, I can't have it start all over again.
It also has a super active forum for those who are more advanced, as well as more entry-level users, like myself. The CLI version seems amazing if you're willing to dive into the terminal, but I don't have a problem paying a one time $20 (and then $5 per year after that) to have a GUI (which is important to me), and be able to easily visualize my stored data.
For me, Duplicacy Web Edition was a little confusing at first, but after reading the guide, I feel it's relatively simple to jump-start the configuration. Arq is definitely easier in terms of allowing a noobie to jump in and start working in it, but it doesn't seem to offer any visualization of data nor allow it to do more complex operations like one can in Duplicacy.
This post may also shed some light on key differences between Arq and Duplicacy, but note, it is three years old, so it's possible that many of these issues have been fixed. Still, being able to read a comparison of many of the software out there is useful so as to see where the weak points are.
General advice is to use Duplicacy as a backup tool targeting Backblaze B2 storage.
If you have significantly more than 2TB of data to back up (obviously, don't back up iTunes movies offsite, only truly important, irreplaceable data), then there are other options to consider, from full-service backup providers such as CrashPlan ($10/unlimited storage) and Backblaze Personal to G Suite ($12/unlimited storage), all the way to hosting a redundant storage array at a friend's house, depending on other needs: data type, turnover rate, convenience, etc.
Duplicacy as a backup tool (cross platform, open source, written in Golang, ridiculously fast, robust and flexible).
B2 as backend storage works well.
If you have more than 2TB, consider G Suite as the backend. $12/month, unlimited storage (750GB/day max ingress and 10TB/day egress limits). Technically you must have 5 accounts to get unlimited data -- but they don't enforce that. The caveat: since Google Drive is a file-sharing API it is not designed for bulk storage, so some operations (such as listing backups, pruning, etc.) may be slow. Relevant recent thread.
Myself, I stick to B2. Been using that for a few years now -- no complaints so far.
I also had no idea this was a new release. As a new user it works great for me.
I’m using Wasabi and had no problems setting it up or testing it. But until there’s documentation, and until the arq_restore command-line utility is updated to work with V6, I wouldn’t recommend it unless you’re prepared to do your own research (as you did). The developer screwed up bigtime but the product is solid IMO. I wouldn’t blame anyone for dumping this product, but I haven’t found anything better.
If you ask for suggestions you’ll get referred to Duplicacy, by the way. People seem to love it. Definitely check it out. I thought it sucked. To each their own.
Two solutions to use side-by-side
As a side note, I found this software after Code42 sunsetted CrashPlan Home edition; after testing a lot of backup tools, it leaves all competition far in the dust in terms of performance, resilience and features. I went ahead and bought a few licenses even though I use the free command line version, just to support the dev, and I don't like spending money just like the next guy. It is that good :)
Edit: I praise Duplicacy all the time, and may appear as a shill... I'm not; I'm not associated with Acrosync in any way other than as a happy customer.
Edit2: Another thread on this from earlier today https://www.reddit.com/r/synology/comments/ggaw7o/timemachine_vs_synology_drive_which_is_better_for/fpzc7qx/
I would not mix BTRFS snapshots and Hyper Backup single versioning. What if you decided in the future to move your backup to the cloud, or from one cloud provider to another? You can move the Hyper Backup files (I've done that before), but you can't move the snapshots. Use snapshots on your original NAS for ransomware protection and full versioning in Hyper Backup.
Don't use external USB drives as backup media, because they don't provide bit rot protection. You need both the BTRFS file system (with checksums enabled) and disk redundancy for bit rot protection to auto-repair any problem.
Since you don't like Hyper Backup, consider something better. And a better backup program than Hyper Backup is Duplicacy (Web). It runs on Synology (either as a docker container or natively). It's a bit rough around the edges in terms of feel, but the functionality itself is very robust and also very fast.
Another nice thing about it is that it is cross-platform, so you can continue to use the backups even if you were to move out of the Synology ecosystem at some point in the future.
If you want to try out Duplicacy-Web in a docker container, this one works for me.
You can also install it natively on your DiskStation. /u/ssps has a guide to that in his blog.
Hetzner is AWESOME and I love them. Seems a little expensive for storage, but that's probably just me and my TBs of data. I wasn't aware of their storage boxes, so I will take a look.
I recently did my own evaluation of many backup systems, both open source and commercial, and nothing came close to Duplicacy. Just make sure you spell that right LOL. There are a lot of projects with similar spelling.
You should also be aware that Duplicacy is free and very easy to use. I think the license is required for commercial use maybe.
I personally use Duplicacy and absolutely love it with Backblaze B2.
It doesn't work amazingly with anything that isn't instant-TTFB, so things like Glacier don't work unfortunately.
Do you need sync or backup?
Because if you want a proper off-site backup solution, file sync is just the means to get it across the network - it has risks if you don't consider additional measures to prevent a sync deleting/trashing files both ends if you get hit with ransomware, say.
For backup, I recommend Duplicacy. It can do incremental backups and de-duplication, and is cross-platform. The CLI version is also free for personal use, and open source. Synology is supported via the SFTP backend.
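If it helps, pointing the CLI at a Synology over SFTP looks roughly like this (user, host and path are placeholders; the path is relative to the SFTP user's home):

```sh
# Run inside the folder you want to back up; -e enables encryption.
duplicacy init -e my-pc sftp://backup-user@diskstation/duplicacy
duplicacy backup -stats
```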
If you insist on B2 — consider Duplicacy https://duplicacy.com in the docker container. https://hub.docker.com/r/saspus/duplicacy-web
If you don’t have to use B2 — consider HyperBackup with Wasabi as a destination. Similar cost as B2.
Another possibility — Crashplan Pro in docker.
Myself — I do both — Duplicacy and Crashplan.
Duplicacy is lock-free, so you can back up multiple computers to the same bucket at the same time. Since it's block based it won't store the same block twice (deduplication), so you may save even more space. They have Windows, Mac and Linux versions.
duplicity I really haven't used. It looks like it's Linux only.
If you're not scared of a command line, try Duplicacy, which has a free CLI version and runs on QNAP - just use the terminal.
You'll want to try this out locally first to get the hang of it, then experiment with B2 until you're comfortable. The deduping is excellent.
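A local dry run is just an init against a folder, something like (IDs and paths are placeholders):

```sh
# Practice against a local disk before involving B2:
duplicacy init test-id /share/backup-test
duplicacy backup -stats   # first backup uploads everything
duplicacy list            # show stored revisions
```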
How much data do you have?
I would strongly consider backup services such as Backblaze or Crashplan.
If the amount of data is under 1TB, it would be more cost-effective to use a backup application such as https://duplicacy.com coupled with a cloud storage service such as Backblaze B2, which costs $0.005/GB/month.
Have you looked at Duplicacy? The new web-based UI is in beta and needs more polish (including a service mode) but it satisfies a lot of your requirements and is under constant development. Cloud and local storage, Linux and Windows client, encryption with incremental, snapshots, de-duplication, open source (CLI).
No Android client (yet, but feasible), but I find it makes more sense to sync phone data to a PC which, in turn, does the backup. (Not that I have much to backup on my Pixel 2 XL, but for this purpose I use Syncthing.)
You can use https://duplicacy.com (the command line version is free for personal use) coupled with Backblaze B2 cloud storage, which is 10GB free and then $0.005/GB/month.
I'd give the GUI a go, I prefer the CLI but that's personal preference. Remember you need to buy a license for the GUI after the 30 day free trial, whereas the CLI is free.
If you're comfortable with the command line, I personally would skip Duplicati and use Duplicacy.
Overall it's just a much nicer experience, in my opinion. It also supports concurrent backups from different machines to the same location, and deduplication between those machines (i.e. if machine #1 has a file, and so does machine #2, it should only back up said file once + a little bit of padding).
Hmmm, found out why I didn't try it when it came out...
https://duplicacy.com/buy.html
Anyways, many thanks for getting me on the right path. For my next budget I will be sure to place this on the to-buy list.
If you have, I'd be interested in what software you thought I was talking about. Additionally, if you have, then give Duplicacy a legitimate go. It really is the best out of the bunch by a long shot (in my opinion).
I know it's really confusing with no less than five pieces of software (I think) sharing almost-homophones, but hey, as is the world of hobbyist software development.
To back up Bitwarden you back up your bwdata folder:
https://help.bitwarden.com/article/backup-on-premise/
Bitwarden doesn't have that functionality built-in but you've got plenty of good options. Duplicacy is a popular choice. It's cross-platform and supports Dropbox, Google Drive, S3, SFTP, local disks, ...
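A minimal Duplicacy take on that, assuming the usual self-hosted layout (the bwdata path and bucket are placeholders for your install):

```sh
# Back up the Bitwarden data directory to B2; run from inside bwdata.
cd /opt/bitwarden/bwdata
duplicacy init -e bitwarden b2://my-bitwarden-backups
duplicacy backup -stats
```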
Just a heads up for others, Duplicacy has licencing.
Only personal CLI use is free. Outside of that it does cost.
So even though it's faster, just bear that in mind that it could require a license depending on your use case.
> that can also sync in real time my work files (psd, 3d files, unity/unreal project etc) so I can access them on another computer or retrieve them if they get corrupted (so something with versioning).
Note that all "live syncing" products I have tried eventually end up with conflicts where both computers think their file is newer (sometimes for no obvious reason). Often this results in 2 copies of the file.
This can be disruptive in active work (you don't notice that some changes got lost as mywork[conflict].xml) so I avoid "active sync" in any folders that see regular updates on different PCs. I suggest using a version control system like Git instead.
My goto tool for scheduled backups is Duplicacy. The command line version is free and the GUI is quite cheap. I use it with B2 storage, as that was the cheapest I found.
Thanks for the suggestion, I'll check Duplicacy out!
I've heard of Duplicati and Duplicity, I haven't heard of Duplicacy. I was also confused and assumed you had a typo.
I refuse to keep my data and its backups on the same filesystem, as even with mature filesystems like ZFS you can face bugs causing data loss: https://github.com/zfsonlinux/zfs/issues/6931
Copying files (e.g. with rsync) to differently formatted disks is slower, but much safer. Tools like duplicacy also make it trivial to upload snapshots to a remote public cloud for cheap as well.
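i.e. something as dumb as this (paths are placeholders; note there's deliberately no --delete, so removals on the source don't silently propagate to the backup disk):

```sh
# Archive mode preserves permissions, timestamps and symlinks.
rsync -a --info=progress2 /data/ /mnt/backup-disk/data/
```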
I use the CLI, which is free for personal use. Quoting from this page:
> The CLI version is free for personal use so there are no personal licenses for the CLI version. In addition, if the purpose is to restore or manage existing backups, the CLI version can be run by anyone on any computer without a valid license.
It can be downloaded from their GitHub "releases" page.
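And restoring with the CLI is only a couple of commands, for what it's worth (the revision number is a placeholder; run these inside a repository initialized against the same storage and snapshot ID):

```sh
duplicacy list                     # show the stored revisions
duplicacy restore -r 5 -overwrite  # restore revision 5 into the repo folder
```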
I am currently evaluating different backup solutions for home and work. At home I used Crashplan for off-site backups but they are no longer supporting end-users. At work I use rsnapshot which uses hard links to create point in time snapshots of the filesystem.
I am moving my personal off site backups to Backblaze now and started testing duplicity. Tar archives, incremental, deduplication and encryption. Can't yet tell how fast it is but I do seem to like the concept.
For work I want to move away from rsnapshot because it starts to get very slow for large backups. I am looking at Duplicacy because from all the reviews it seems to be the fastest. The data is deduplicated and you can back up multiple servers to the same destination to benefit from dedup. It supports encryption, is written in Go and works on all OSes. The command line version is free for personal use.
Another one to mention is attic, which is no longer maintained; borg seems to be its replacement. I consider Duplicacy better because they fully support multiple servers backed up into the same storage repo.
Unless I’m missing something, their Personal Use license is not free. In fact, according to their pricing page, it’s a subscription: https://duplicacy.com/buy.html
Like I mentioned in the original comment, we have several machines that need to be backed up, so that kind of pricing model is a no go for me. To be perfectly honest, I was excited about it until I saw the subscription info.
If you have a *nix environment I can also highly recommend Borg backup which is excellent (deduplication, encryption, mounting of backups via FUSE, local & remote). The two cons you may face depending on your environment: no Windows support, deduplication is per client only.
I just stumbled across https://duplicacy.com/ which looks interesting but haven't tested it yet.
You're right, but the same applies to Duplicacy. See https://duplicacy.com/guide.html - they seem to support the same protocols. So I was wondering why in the file one is checked but not the other. I suppose you used FTP as P2P.