This is great work, I love useful scripts like this!
You might be interested in duply, which is a similar project designed to make backups to remote servers as simply as possible.
It is essentially a bunch of scripts for duplicity so you don't need to make long, complicated command lines. Duplicity itself supports a number of backends, including S3, Backblaze, google drive and others, as well as generic (s)ftp and rsync protocols.
There are options for encryption, too.
I personally use it to back up all my computers to other servers, and the data at least stays encrypted in case it's something I shouldn't trust a service to keep private.
You should take a look at duplicity. It's also written in Python (about 1400 lines), uses GPG for encryption, supports backing up to S3 or to a SSH server and it supports incremental backups (uses rsync libraries with a local manifest cache).
Good work though, never enough backup software.
rPi's are great for the low DC power draw, which lets them run for a long time without mains power.
Ideally everyone already has backups off their HA server. I'm using https://community.home-assistant.io/t/hass-io-add-on-auto-backup/99557 and http://duplicity.nongnu.org/ to back those archives up off-site. No matter what you're running you hopefully have offsite backups of your HA config.
Recovering from a dead microsd is as easy as reflashing hassio and restoring the backup.
I use `duplicity`. The data are backed up to a remote server I own, and also uploaded (client-side encrypted) to my Google Drive with infinite storage (academic plan FTW!).
Duplicity maybe? PGP-encrypted rsync-like backups.
I switched from Truecrypt to just regularly backing up via duplicity, which uses GnuPG for encryption. If I want to edit/use a secure file I decrypt the file store, make changes, then re-encrypt.
Duplicity is incremental so this process is pretty quick - it obviously has the issue of needing me to store unencrypted files temporarily (though I'm using FileVault drive encryption so they are not totally unencrypted).
On the positive side I bypass several hassles with truecrypt - no guessing how big a volume I needed beforehand; no need to upload the entire encrypted volume because I changed a single character!
I am not a security expert, so I'm not sure how well GnuPG compares to Truecrypt's algorithms.
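For reference, a run like this is about all it takes (paths and passphrase are made up; with PASSPHRASE set, duplicity does symmetric GnuPG encryption, and later runs are automatically incremental):
export PASSPHRASE='my-long-secret-passphrase'   # made-up example passphrase
duplicity /home/me/secure-files file:///mnt/backup/secure-files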
It's not possible, for the simple reason that you cannot update a file of a container without opening (i.e. decrypting) that container first. As you say, you need to open both backup and source, perform incremental backup, then close both. I've not tested it yet, but apparently duplicity can handle this kind of machinery for you.
You should use rsync to make the transfer; it's resumable and really efficient, and you can use it now to check the files are OK too.
Have a look at "rsync modules"; they're a good way to automate your transfers if needed, but make sure you do them over ssh, as rsyncd security is nonexistent (no authentication, no encryption).
Or just use duplicity; it's a great backup tool based on rsync and gnupg (if you use signing and encryption), and it supports compression, encryption and incremental backups, manages rotation, etc...
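For the rotation side, a sketch with a hypothetical host and path:
# keep the three most recent full backup chains and delete anything older
duplicity remove-all-but-n-full 3 --force sftp://user@backuphost/backups/myhost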
Another useful tool is hashdeep, it takes recursive checksums of your files and supports many algorithms.
Deja Dup is essentially a Duplicity interface, so it presumably uses the Duplicity encryption feature that appears to use GPG with a symmetric passphrase for encryption. I haven't looked at that implementation in depth, but GnuPG has a good reputation for encryption, so it's likely fine to put your backups there.
Note that by doing so, you will expose how much stuff you're backing up, how much changes between backups, and how frequently you back up, possibly among other metadata. That shouldn't reveal your data, just keep it in mind if you're concerned about that sort of thing.
I use duplicity for most of my backups. It uses gpg for encrypted (for confidentiality) and signed (for integrity) backups. It can backend to Google Docs, S3, Swift, SCP/SFTP, and many others.
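A sketch of what that looks like on the command line, with placeholder key IDs, credentials and an S3 bucket I made up:
# encrypt to one GPG key, sign with another; AWS credentials come from the environment
export AWS_ACCESS_KEY_ID=AKIAEXAMPLE            # placeholder
export AWS_SECRET_ACCESS_KEY=examplesecret      # placeholder
duplicity --encrypt-key 0xDEADBEEF --sign-key 0xCAFEBABE /etc s3+http://my-backup-bucket/etc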
I'm surprised nobody has mentioned the wonderful duplicity yet. For encryption you can either use GPG and file-level encryption, or you use an encrypted file system as already mentioned.
It's best to have offsite backups if you can, rather than relying on a USB hard drive. We successfully use duplicity to back everything up to S3, but it could get quite expensive for large amounts of data.
Let me offer a strong recommendation for Duplicity. Pretty straightforward command line tool. It works on top of rsync to create incremental and full backups, can encrypt the backups, and can back up to a variety of locations.
It has worked for me for years, and it has not failed me yet during recovery.
Duplicity uses rsync underneath, which is generally regarded as the fastest differential file transfer program, so I think 100 GB will be no sweat. It encrypts with GPG client side. Probably the only downside is it's a classic Linux utility with a command line interface, though deja-dup I think remedies that.
I cannot recommend a cloud provider, but any cheap storage will do (I use Azure because of some free credits I have), including Amazon S3, Rackspace Cloudfiles, Backblaze, etc. Choose whichever is cheapest for your data size.
If you're on Linux already, I don't think any alternative beats duplicity.
Use software that doesn't require you to trust the host storing your data. Make sure your data is well encrypted with a strong password/key before it's uploaded, using software you can trust.
Git Annex w/ encryption (optionally with GnuPG) supports a lot of possible cloud storage providers. It sort of requires that you move your files into git annex repositories to begin with. https://git-annex.branchable.com/
Duplicity encrypts with GnuPG by default and supports a lot of possible cloud storage providers too. http://duplicity.nongnu.org/
I use duplicity, sort of like rsync on steroids. It does incremental backups to S3, but the entire backup is encrypted via my PGP key.
Before that I used hand-rolled scripts to Amazon's Glacier service.
At 100 gigs of backup data in S3's "Infrequent Access" tier, it runs me about $1.50 a month.
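For anyone curious, a sketch of that kind of setup (key ID and bucket name are placeholders; --s3-use-ia is the Infrequent Access flag, assuming your duplicity version has it):
# monthly full backups, incrementals in between, volumes stored in the IA storage class
duplicity --encrypt-key 0xDEADBEEF --s3-use-ia --full-if-older-than 1M /home/me s3+http://my-backup-bucket/home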
Interesting to see another backup alternative, but what are the advantages over others? Duplicity specifically.
http://duplicity.nongnu.org/index.html
Duplicity delegates archive management and encryption to rsync and GPG, which I see as major pluses over reinventing the wheel.
Also I don't see anything in the rustic docs detailing incremental vs full backup management. Maybe I missed it.
One place I'd love to see improvements over duplicity is file restore time. Sometimes it can take 20+ minutes to restore a smallish directory (~100 files, ~100MB).
Ehh, all the pre-included ones are crap.
On Macs, Apple's Time Machine is awesome. On Linux I've heard people like Duplicity. I don't run any Windows machines, but I am sure some people around here will be able to make good recommendations.
I use duplicity, which is pretty neat if you want to encrypt your data and store it on a cloud service or a server over ssh directly. It also does incremental backups among other things.
The first con isn't a con; at least, it really shouldn't be. You're going to be using archiving software anyway, such as Duplicity, which will split your data up into chunks.
P.S. Why on earth aren't you using Duplicity? It's literally one command and, AFAIK, you seem to be going through hell.
duplicity --full-if-older-than ${age} /path/to/data onedrive://folder/on/onedrive
I recommend looking into Duplicity (http://duplicity.nongnu.org/index.html). That site has their documentation as well as some examples on how to get started. I exclude directories I know I won't need to restore, like /proc, and otherwise back up everything else.
Duplicity supports a wide variety of remote protocols as well, so it works pretty well in a remote environment.
> I don't want to upload 1GB backups all the time, just the differences from the last commit.
You might consider using Duplicity: http://duplicity.nongnu.org/
Git is not really about backups. It's more for keeping records of who changed which files and how, and for reunifying divergent paths of evolution. And it mostly only does that well if it is working with plain text files. (But, that may be what you want, if the ebook files are in a text format that git can deal with.)
Another amazing tool is Duplicity.
Excerpt:
Duplicity backs directories by producing encrypted tar-format volumes and uploading them to a remote or local file server. Because duplicity uses librsync, the incremental archives are space efficient and only record the parts of files that have changed since the last backup. Because duplicity uses GnuPG to encrypt and/or sign these archives, they will be safe from spying and/or modification by the server.
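In practice that boils down to something like this (host and path are made up):
duplicity full /home/me sftp://user@backuphost/backups/me   # initial full backup
duplicity incr /home/me sftp://user@backuphost/backups/me   # later runs only upload changed blocks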
Take a look at Duplicity. It's pretty powerful in its own right (encryption, uploading to cloud providers or wherever, restoring specific files, removing old backups, etc.), so you might not have to fuss around with other software to achieve what you're trying to do.
B2 itself is just "dumb" object storage, so you'll get those features if the client you use has them. An overview of Duplicity's features can be found here.
Define "Best". DejaDup is an easy to use GUI tool which integrates well with the Gnome desktop environment and Gnome Files/Nautilus file manager. It supports encryption and provides a selection backup location types (including local), via many protocols (ftp, ssh, etc), which allow for network/"cloud" backup.
It uses a lower level command-line tool called duplicity, therefore allowing use of all of its back-ends (Dropbox, Google Drive, Mega, OneDrive, etc), not explicitly displayed by DejaDup. Simply select, Storage location > Custom Location. See the duplicity man page for more information.
If you're not running on Windows and don't want to add .NET to your list of problems, try Duplicity instead. Google Drive is conspicuously absent from the feature list, but I gathered from a quick web search that it can be made to work with Google Drive easily by installing one more Python library.
In what way is it cheaper?
Tarsnap says they charge $0.25/GB/month - plus $0.25/GB bandwidth fee.
Rsync.net, which was until now the most expensive player I knew of, charges from $0.08 to $0.20/GB/month depending on total volume, and no bandwidth fee. You can encrypt your data using the free tool duplicity before it leaves your server.
Meanwhile Amazon is $0.007 to $0.03/GB/month depending on how often you plan to read your data back.
You could try Duplicity to create a backup and then the incremental archives. I don't think it does anything specific for optical or tape though.
To restore, you will need the full chain: the full archive plus every incremental, in order. You could look for software that does reverse-incrementals, but you would need to burn a new disc every day (or use re-writables), as the most complete archive is at the top of the stack, so to speak.
For regular forward incrementals, I guess you could put an archive on a disc along with, say, a week's worth of incrementals (burning a new one each day), then put the disc in your backup site (offsite, right?) and start using a new one. That way, any one disc should have the full stack for that directory and n days of incrementals after it. A failure of one disc only takes out that week's worth of data, and not every disc in the chain.
There is probably better software to do this with.
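For what it's worth, the weekly cycle described above could look something like this with duplicity (the staging directory is hypothetical; you'd burn it to disc after each run):
# Sunday: start a fresh chain in a new staging directory
duplicity full /data file:///staging/week-23
# Monday through Saturday: add an incremental to the same chain, then burn the directory
duplicity incr /data file:///staging/week-23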
If you're looking for a simple backup solution then some combination of Duplicity and S3 would be a good place to start.
http://old.blog.phusion.nl/2013/11/11/duplicity-s3-easy-cheap-encrypted-automated-full-disk-backups-for-your-servers/ https://rtcamp.com/tutorials/backups/duplicity-amazon-s3/
If you're looking for snapshots, that's a little more complicated. Where is it hosted? That's the most important thing to know before any solution can be given.
http://duplicity.nongnu.org/ for incremental backups.
I have played with LUKS containers a bit and they actually sync well (a change in a file residing in a large container only results in about 3x as many bytes changing in the container itself).
For 2-way sync between various devices, I have Owncloud running on a netbook behind OpenVPN, but looking at replacing it with Syncthing as it matures.
I think that 2-way, high-frequency sync is just too fragile a situation to combine with backups. I'll always use a separate, 1-way, incremental backup system so that if the sync system screws up, I'm never in trouble.
You really need backups. Get a second HDD for ~$100 (go with internal if you have space for one, quality tends to be more consistent than external drives), and use duplicity to do incremental backups to it (free). That's really the bare minimum.
EDIT: And test your backups regularly, of course. If you haven't verified your backups, you don't have backups.
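duplicity can do a basic check for you; a sketch with made-up paths:
# compare the latest backup against the live files and report any differences
duplicity verify file:///mnt/backupdrive/home /home/me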
I use Déjà Dup, which is a GTK+ frontend to duplicity, because it's easy to use and because it encrypts the backups.
I've configured it so it prompts me every week to back up selected dirs from my /home to my external hard drive. After the backup is complete, I use command-line tool(s) called megatools to upload a copy of my backups from the external hard drive to my free 50 GB MEGA account.
I'm also considering burning the backups to a bunch of DVDs twice a year and storing them somewhere after the start of the next year.
Sorry for the delay. Here's the script I run. The --encrypt-secret-keyring syntax is wrong; I've got to change that, but I don't wanna re-upload 100+ GB :). I left the Amazon keys in, in case you wanna upload to Amazon; they have good documentation. Duplicity is in its own jail, so you have to do pkg install duplicity, trickle, and other stuff. Trickle throttles the connection; took me forever to get that working. You can use tmux to monitor the progress: tmux, dup_script, ctrl-b+d (to detach) or something, then tmux and systat -ifstat 1 to check the throttle is working and see when it is done.
Backup script.
#!/bin/sh
file="$(date +%b)$(date +%d)$(date +%Y)"
#test -x $(which duplicity) || exit 0
. .passphrase
export PASSPHRASE
#ASSIGNS AWS key
#export AWS_ACCESS_KEY_ID=[NUMBERS&SHIT]
#export AWS_SECRET_ACCESS_KEY=[NUMBERS&SHIT]
#$(which duplicity)
trickle -s -u 100 -d 1000 duplicity --verbosity 5 --encrypt-secret-keyring=[KEYPAIR] /mnt/backups/ sftp:///backup > /duplicity_stuff/dup_$file.txt
#RESETS AMAZON KEYS
#AWS_ACCESS_KEY_ID=
#export AWS_SECRET_ACCESS_KEY=
http://duply.net/ is what i am looking at using to replace duplicity for backing up to a remote server.
Sticking it up onto s3 and glacier seems the better option for me.
Note that the duplicity developer identified the same issue and listed a proposed set of requirements in the format, but it doesn't appear that anything ever got implemented here.
The same as happens when other files on your hard-drive become corrupt: you restore from an offsite backup.
I have copies of the database on my home PC and work PC, as well as on a USB stick. I also back everything up to an external HDD and important stuff to Ubuntu One using duplicity. The backups are daily versions going back at least a couple of months.
If all that falls apart, I just need to get back into my email account via the admin password to my web hosting (which is one of the few that I actually know) and I can reset most passwords anyway.
Duplicity is an rsync-based one that could be worth adding. However its primary purpose is storing encrypted copies remotely so you don't need to be as concerned about the integrity of your remote server.
It sounds like a GUI version of duplicity.
If so, given that duplicity uses standard file formats and encryption libraries (i.e., gnupg), and is inherently scriptable by being a CLI program, I'll stick with duplicity :)
That's kind of the premise of things like Backblaze B2. Technically, you can store things unencrypted, but with backup tools like duplicity your files are stored in blocks (tar files) and can be encrypted with a PGP key that never leaves your computer. The encrypted blocks are transferred to any cloud service you like (even google drive, onedrive, etc. work) and it can do things like incremental backup too.
In this case, you're responsible for backing up your PGP key but it's also much more secure.
You could use good old duplicity (old, but effective; uses gpg keys for encryption), then use rClone for sending the locally backed-up files to the cloud. Then you have a 3-2-1 backup strategy implemented.
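A rough sketch of that combination, with placeholder paths, key ID and rclone remote name:
# 1) encrypted, incremental backup to a local disk with duplicity
duplicity --encrypt-key 0xDEADBEEF /home/me file:///mnt/backups/home
# 2) mirror the already-encrypted volumes to the cloud with rclone
rclone sync /mnt/backups/home cloudremote:backups/home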
I like [duplicity](http://duplicity.nongnu.org/), it does incremental backups alongside periodic full backups (depending on factors such as how long ago the last full backup was).
One of the big features of duplicity is that it can GPG encrypt the backups (although this can be disabled).
I recommend duplicity! It can backup over ssh/rsync, but also has support for Google Drive, Microsoft OneDrive, Amazon S3 etc. The data gets encrypted locally so no privacy issues.
ok I did not catch that, my bad.
Looking at the documentation for duplicity (http://duplicity.nongnu.org/vers8/duplicity.1.html), I don't see any option for threaded or parallel backup. B2 will not throttle, but will have a limit per thread, so you should aim to have multiple threads somehow. If you have only one big file to transfer and duplicity can't split the file and use those chunks in multiple threads, it will be slow. Although, I did see an option to split some files.
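The splitting option is --volsize. A sketch (bucket and keys are placeholders); --asynchronous-upload at least overlaps packing the next volume with uploading the current one, even if it isn't true multi-threading:
duplicity --volsize 200 --asynchronous-upload /data b2://keyID:applicationKey@my-bucket/data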
i don't use deja-dup but it's a frontend for duplicity.
this page contains information about duplicity.
according to the examples paragraph:
duplicity /home/me sftp://[email protected]/some_dir
backs up the /home/me folder to the mentioned sftp server. i believe changing the sftp address to a backup folder would be sufficient...
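e.g. something like this for a local backup folder (path made up):
duplicity /home/me file:///mnt/backupdrive/me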
I have full disk encryption. There are a couple of caveats with Nextcloud's encryption that make it not ideal for me, as the preview generator app and a couple of other apps can't work with encryption. Also, I believe there is less of a performance overhead.
Like /u/ExplodingLemur mentioned, full disk encryption requires a password at boot. You can get around this issue by using Dropbear SSH to allow you to enter your password remotely.
As far as backups go, I just have 2 removable disks I swap between, storing them at my desk at my office. I currently use duplicity but I plan to switch to just encrypting the removable drive.
Follow up question/thought for you. Duplicity has a section on the site arguing against tar: http://duplicity.nongnu.org/new_format.html
I wonder if you would run into the same limitations that they discuss. Your implementation seems like it would resolve some, but anyway, just flagging it for you as food for thought.
Duplicity sounds like it might fit the bill http://duplicity.nongnu.org/
Although deletions would stick around for however long your weekly backups stick around.
I don't think it does, but I've used duplicity for things like that. It is basically command-line oriented, but there are front-end programs that try to simplify the options. Duplicity supports a bunch of back-ends. Deja-dup is one such front-end, available in Manjaro. http://duplicity.nongnu.org/
> I could use the desktop PC’s ample storage, but it often won’t communicate with the Mac
You will have to explain that a little. If you set up a home NAS like FreeNAS or OpenMediaVault then you can connect to that storage any number of ways (FTP, SAMBA, NFS), or use syncing tools like Syncthing, seafile, duplicity, etc.
Well, I just use duplicity into AWS with a rule to immediately move it to Glacier; my 1.3TB weekly incremental backup takes something like 1.5TB or so and they charge under $10/mo. I've never had a stability issue with that.
I have my data in a raid1 array on my machine, and important things are synced to a self-built NAS which also has two drives in a raid1 array.
"Real" important data is synced to a Blackbaze B2 online storage. I use duplicity for this.
I see, I will have to see if that is possible using an Odroid HC2 (SBC) running Open media vault. The other option I was looking into was duplicity http://duplicity.nongnu.org/index.html . It would be nice to not store my media files encrypted on the NAS however so that I could use them on my desktop.
Check out Duplicity. I started using it a few weeks ago. Multiple backend options (S3, azure, google drive, etc.), ability to include/exclude folders, incremental backup options, gpg encryption. So far I am impressed.
Nobody mentioning Duplicity?
It supports encrypting and signing your backups with different GPG keys and has a plethora of different storage backends you can use - from SFTP to Amazon S3.
I have a Zotac ZBOX Nano sitting on a shelf with a 500G SSD, networked to a Synology DS212 NAS with 3TB of storage. The zotac runs a bunch of Docker containers with various home services (plex, IRC bouncer, etc) and my various "automate the boring stuff" jobs run as cronjobs on the box itself. I run backups of critical data using duplicity, which puts encrypted backup files on Google drive.
it is definitely cheaper long term to host your own home-level hardware to run Python scripts and stuff rather than paying monthly for clouds / shells, though having offsite backups is pretty important too.
There is software that can use cloud storage services to implement backup.
For example - https://www.qualeed.com/en/qbackup/ has a long list of storage it supports (citing this SW because I'm evaluating it now as a Crashplan replacement for myself):
It does not explicitly list OneDrive - but I'm pretty sure other SW out there will likely have support for it.
Edit: here is one that supports OneDrive: http://duplicity.nongnu.org. (did not evaluate so can't recommend yet)
Edit2: Arq supports backup to OneDrive
Here is my 2 cents: I run duplicity (linux backup tool) which encrypts the entire contents in configurable sized chunks (25MB ones by default if I'm not mistaken), and it keeps a manifest (also encrypted) of the mapping.
The benefit here is that if you make small changes, you have to upload only a manifest and a multiple of 25MB (or however large you make them) chunks. You save on bandwidth without losing out on security.
There are some other nifty things you can do with it and if this interests you, check it out.
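A sketch of the knobs involved (host and paths are made up):
# choose the chunk ("volume") size in MB explicitly
duplicity --volsize 25 /home/me sftp://user@backuphost/backups/me
# show which chains, backup sets and volumes the (encrypted) manifests describe
duplicity collection-status sftp://user@backuphost/backups/me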
Another point that may be worth considering is that if you keep your backup attached to your network in a writeable state (mounted somewhere where you automatically upload changes), you are risking that a user error or bad change wipes important data. If it was very important I would run 2 offsite backups: one online and one offline. You backup regularly to the online, and you migrate to offline manually (or automated once a month, up to you).
Yesterday, Duplicity came up on the Telegram group; it seems to support S3 too but, to be honest, I've personally never tried it. It still looks like an interesting backup tool, take a look at it.
Pretty much any incremental backup (not RAID) software should be able to do this, although most of them only detect changes when the backup is scheduled, not continuously.
Personally I like Duplicity because of how configurable it is, but you can accomplish the same thing with the built-in Windows backup tool: http://www.backup-utility.com/windows-7/windows7-incremental-backup.html
Duplicity or Duplicati would be great places to start.
Since you mentioned you've got a medical practice, you really should make sure you're either using a strong password (good), or public / private keys (even better). You might even want to look at getting someone else to apply a figurative sledgehammer to your system to make sure that it's secure.
Duplicity allows you to split archives into --volsize segments, and only downloads the segments that you require for the files you're restoring (plus the header-like files that are very small and inform Duplicity of which files are in which archives).
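As a sketch (bucket, keys and file path are placeholders), a single-file restore looks something like:
# only the volumes containing this file (plus the small index files) get downloaded
duplicity restore --file-to-restore docs/report.odt b2://keyID:applicationKey@my-bucket/data /tmp/report.odt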
Sure, why not. Though it seems abandoned and to have lots of issues itself.
See, `duplicity` already features backup to Google Docs out-of-the-box.
i use mysqlhotcopy to backup the mysql db to disk and archive it, keeping only the last 10 copies. mysqlhotcopy only takes 1 second to run through and the whole process takes about 45 secs; this runs every 6 hrs.
i then use duplicity to backup & encrypt those db archives, plus all the stuff on the filesystem i need, to s3 every 3 days.
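roughly, as a crontab sketch (script name, paths and bucket are placeholders; passphrase/key handling left out):
# every 6 hours: dump and archive the db
0 */6 * * * /usr/local/bin/db_hotcopy.sh
# every 3 days at 03:30: push encrypted, incremental duplicity archives to s3
30 3 */3 * * duplicity /var/backups/db s3+http://my-backup-bucket/db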
My server, centos 6.7 (i think) (Xeon E3-1270 v2, 16gb ram, 2x1tb raid1), runs an active forum plus a couple of smaller wp sites; somewhat overpowered, but i got it cheap.
i also have a small vps, i think i paid $10 per year for it, that is only used to run monitoring for my main server; basically munin is installed there and only the node is on the big server. i was also using pingdom's free plan, but looks like they are stopping that; some people have recommended uptimerobot instead.
I use duplicity to push my backups to Amazon Glacier. Duplicity does encrypted, incremental, unattended backups. It was a bit of a hassle to set up, but now that it is done, I'm really happy with it :-)
Edit: I currently have ~140GB of backup and pay less than a dollar per month for that.
I use duplicity since it works with many types of storage backend. From the laptop I send it to dropbox, while from my home server it goes to the uni network (via WebDAV).
The others said it pretty much already. Have a look at tools like http://duplicity.nongnu.org/ (based on rsync). I let it run every night to do an encrypted delta-backup to a remote server. Even if the owncloud data folder suddenly went blank and that state got backed up, nothing would be lost.
For system backups I recommend using a cron job to do a full backup on Sundays and incremental backups throughout the week. Get a 500 GB or 1 TB external drive, or whatever you need to be able to hold a single full image and another 25-50% of that full image for the incremental changes.
My favorite solution for server backups is duplicity. It's rsync based, and it's really good. Just be sure to exclude the directories that are runtime or temp in the backup (like /proc /tmp /mnt, etc)
http://duplicity.nongnu.org/duplicity.1.html
It will back up to pretty much any location, SSH/FTP/SMB/CIFS/NFS/filesystem, you name it. Just worry about doing the backups and you can learn how to restore them on your own time or desperately in case of an emergency.
It is a rather simple shell command that can back up over ssh, optionally with encryption and user-defined full/intermediate backups. Check out the man page, or a distro wiki.
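As a sketch, that schedule as crontab entries (backup target is a placeholder; passphrase/key handling omitted):
# Sunday 01:00: full backup, excluding runtime/temp dirs
0 1 * * 0 duplicity full --exclude /proc --exclude /tmp --exclude /mnt / file:///mnt/backupdrive/system
# Monday-Saturday 01:00: incremental against the same chain
0 1 * * 1-6 duplicity incr --exclude /proc --exclude /tmp --exclude /mnt / file:///mnt/backupdrive/system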
duplicity (as in http://duplicity.nongnu.org/ ) is still being developed and very lightweight. you can set up a cron job and never think about it again. Granted, if you are not a command line user this will feel unnatural to you. If you know of any good windows/mac applications that support SFTP please let me know.
Yes. I use it all the time with Duplicity to back up all my files. For something like $5.83/month you get unlimited[0] cloud file storage by a somewhat reputable (Microsoft) vendor.
I see absolutely no reason to shit on something that's got amazing cost value.
[0]Technically it's 10TB, then you email them whenever you hit that cap and they'll add another 10TB for you, but considering you can do that unlimited times, it's unlimited storage (But not really because they don't have unlimited hard-drives).
You can use that storage for backup with Duplicity. (See man page for Google Docs URL format.) Just make sure to sign into your account every once in a while after you graduate so you don't lose the account.
There are managed services like Crashplan, Backblaze, etc for $5-$15 a month, often offering "unlimited" storage but I don't use them and can't comment. Many people swear by them however and if you search reddit you'll find many reviews/users.
I don't know of any extremely cheap online backup solutions--you're either paying for a service or for your own server. You could get another kimsufi box and use rsync to simply copy everything over so it is always in two places.
FTP or SFTP will take a while but it's going to be your cheapest bet. You could buy an external hard drive or two and set up something like http://duplicity.nongnu.org/ or Obnam etc that does automatic backups. The initial backup is definitely going to take a long time. Not much you can do about that, 1TB on a home connection will typically take some time in my experience but if you can leave your computer on and unattended it'll be over in 3 days probably. After that your automatic solution should watch the seedbox for new files and back up automatically.
If you have more questions, it might benefit /r/seedboxes if you made a thread and polled the community as to what they use.
Fair enough, I don't know any good tools for local backups unfortunately. What I use, Duplicity, does support local backups, but isn't made* for them. If you'd like to give it a try, it does support versioning and everything you mentioned, what you'd be looking for is something like (Assuming you don't want encryption, I'd still say use it, but, whatever floats your boat)**:-
>duplicity --no-encryption --full-if-older-than 1D /tmp/backup/original/ file:///tmp/backup/backup/
and that's it, just keep calling that command it'll just keep adding changes to your backup, and you revert to any version in history using:-
For one file on 2015/01/01 +2GMT:-
>duplicity restore --no-encryption --restore-time=2015-01-01T00:00:00+02:00 --file-to-restore relative/path/to/file file:///tmp/backup/backup/ /tmp/backup/restore/
or to recover the full directory an hour ago:-
>duplicity --restore-time=1h --no-encryption file:///tmp/backup/backup /tmp/backup/restore/
*Doesn't mean it won't work well, I've just never used it seriously for it.
** Every day (--full-if-older-than 1D) it will back up the whole thing again in its full form, not incrementally. Either pass 'incr' if you want it to only ever do incremental backups, or increase the --full-if-older-than time.
I've been reading and hearing some very good things about people using Duplicity: http://duplicity.nongnu.org/ . Tie this with some online storage from Amazon S3 and you have a cheap, easy and secure place to back up to: http://blog.phusion.nl/2013/11/11/duplicity-s3-easy-cheap-encrypted-automated-full-disk-backups-for-your-servers/
Glacier would be perfect. The only wrinkle will be client application support for it. Services like Crashplan or Backblaze will give you an app which'll automatically upload your photos almost as soon as they're written to disk. They can also send delta updates for things like Lightroom libraries which really cut down on your upload bill. They do cost more than Glacier at this scale. However, it could be worth it for the ease of use factor.
It's possible that the Duplicity backup system could use Glacier as a backend. That would kick ass, and be nearly as convenient as Crashplan or Backblaze. The Glacier page mentions automatic migration from S3 to Glacier, that would work with Duplicity as it can already back up to S3.
EDIT: Hang on, S3 is too expensive? S3 would only cost you US$2.79/month if you were using reduced redundancy storage.
The open-source duplicity sounds similar.
While I'm not interested in using tarsnap itself, the open-source libarchive on which it is based does look like it might solve a vaguely-related issue that I'd been having, so thanks.
my backup tool of choice is duplicity
routine backups, as others already suggested, can be done with a cronjob
however, if you need detailed history, you may want to consider a distributed version control system (like git)
if you go this way, you will have to add new content to repository explicitly after copying files (by doing git add), then commit changes to repository (git commit, this will create a new revision) and send changes from your local repository to the dedicated server (git push)
I've been looking around for the same thing. You didn't say what File Server OS you were using, but if you are using Linux I've been looking at Areca Backup that was posted in another thread a few days ago. http://www.areca-backup.org/
I'm interested in what other solutions people have, because I'm looking for something similar.
There's also http://duplicity.nongnu.org/