> Second is there ANY way to recover the old photos from my old hard drives
It depends heavily on the physical state of the drives. If the drives are still readable, you could try something like this to do it yourself.
If the drives are making noise (clicking), won't start, or otherwise won't show up when connected to a computer, that might require professional data recovery services.
I'm afraid I lack the knowledge to help you find a solution, but one thing I can do is make you aware of the following study: http://www.zdnet.com/article/solid-state-disks-lose-data-if-left-without-power-for-just-a-few-days/
> backing up to the nas via SFTP.
If OP has a model that supports Docker, I would highly recommend using Minio instead of SFTP. It's faster, uses fewer resources, and has checksums built in, so if your backup repository is corrupted, Minio will tell you (or your backup software will).
Arq has a guide for setting it up on Synology.
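If you want to try it by hand first, here's a minimal sketch of running Minio in Docker (the image name is real; the port, volume path, and credentials are assumptions you'd adjust for your NAS):

```
# run Minio as an S3-compatible backup target on the NAS
docker run -d --name minio \
  -p 9000:9000 \
  -v /volume1/backup:/data \
  -e MINIO_ROOT_USER=backupuser \
  -e MINIO_ROOT_PASSWORD=change-this-long-password \
  minio/minio server /data
```

Then point your backup client at http://your-nas-ip:9000 with those credentials.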
> Kopia is free and opensource
Arq is also, as far as I know, the only third-party Mac backup that includes all metadata.
It also supports running missed backups when the machine is next able to (it was on battery power, had no network, etc.), and will not schedule a bunch of backups in a row if one or more is missed.
Kopia is great if you have many machines that have the same files and you want to back them all up. With Arq, deduplication is done per backup job, whereas Kopia deduplicates at the repository level.
As you wrote, Kopia is "unstable" (works for me, but YMMV), but if you want the functionality of Kopia in a stable package, I highly recommend Duplicacy. The CLI version is free, but I also highly recommend that you buy a license just for the GUI. Duplicacy works in mysterious ways, at least if you're used to more traditional backup methods, and the UI neatly abstracts that away.
I never even realized it existed until I read your question, but Directory Opus from GPSoft (https://www.gpsoft.com.au/) does have logs for file operations. It's a little on the spendy side, but it's the best file manager I've ever used. I started using it back in the day on the Amiga, and a decade or so later I realized they had released Windows versions, so I bit the bullet and have never looked back. It's got a TON of functionality packed into it and is very extensible via scripting, with a lot of customization options.

The logging is configurable: you can specify what types of operations get logged, how long the log file gets before it's trimmed, etc. During a long copy/move/whatever you can also check "Unattended operation", which lets you specify actions to take during that process (e.g. overwrite if the file already exists, return an error, etc.); at the end, it presents you with a log summary of what happened. It's also smart enough to notice if you start a large copy and then copy more files to the same location: it'll add them to the queue instead of starting a second simultaneous copy.

Really, the features of DOpus are too numerous to list. They offer a free trial and a Lite version, but in my opinion, if you're going to spend the money on the Lite version, you should just get the full version. I can't state enough how satisfied I've been with DOpus.
UPDATE: Here's the online help for logging: https://www.gpsoft.com.au/help/Opus10/index.html#!Documents/Prefs/Logging.htm
I'm not looking for power-user features, so it's hard to justify paying an effective double fee: a $50 program license (Arq) plus a separate cloud or backup service.
Amazon Glacier and other archival services, for example, are difficult for me to parse. There seem to be massive fees for adding, retrieving, or overwriting data that undercut even the cheapest headline storage price, making them uncompetitive even in this context. I just want a monthly backup dump without complex pricing or conditions.
Do you know anything about Carbonite? https://www.carbonite.com/
On reflection, it may be smarter to create a differential image backup on the local portable drive and then mirror that compressed content to the cloud. No worries about file structure or metadata either. But is it possible to restore individual files from an image in online storage to the system? Having to redownload the entire thing would defeat the purpose.
I agree with /u/bryantech - Arqbackup will work very well for your requirements.
> Arq makes incremental versioned backups of your files as compressed, encrypted, de-duplicated data.
It is also easy to understand how to restore with Arqbackup: https://www.arqbackup.com/docs/pages/restoring.html
You paid $40 for backvp2. Spending $50 for Arqbackup would be well worth it if only for the ease of use. But there's more! You can also back up to clouds: Amazon, Google, Wasabi, Backblaze and OneDrive. If you are not backing up to the cloud (or offsite), your company is not safe.
See also: https://www.reddit.com/r/Arqbackup
Duplicacy. Open source. The command line version is free for personal use. Supports multiple backends, including local disk. Supports lockless multi-machine deduplication, encryption, and pruning, and is very, very fast. Leaves CrashPlan in the dust.
Duplicacy or its command line version. Search this sub; there are detailed comparisons.
To summarize, I would avoid both arq and duplicati for many reasons, stability and performance leading the list.
Came across this on Amazon, but it looks a little long in the tooth:
>Does Windows continue to back up to an encrypted drive after a restart or is a password required after each restart for backups to launch?
You need to configure the auto-unlock feature: https://winaero.com/turn-on-auto-unlock-for-bitlocker-drive-in-windows-10/
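If you'd rather do it from the command line, the same thing should be possible with manage-bde (a sketch; assumes the data drive is D: and that the OS volume is itself BitLocker-protected, which auto-unlock requires):

```
:: run from an elevated Command Prompt; stores the drive's key so Windows
:: unlocks it automatically at startup
manage-bde -autounlock -enable D:
```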
>I know Veeam is reliable and free in my case (https://www.veeam.com/virtual-machine-backup-solution-free.html), but without the cloud part.
What solutions would you suggest?
Assuming you have the Veeam B&R Community Edition, you can extend local backups to the cloud using StarWind VTL. It allows you to offload backups to any S3 cloud, including Azure Blob, keeping backups immutable, and to restore VMs from Blob to Azure directly (you would need a Veeam instance in Azure for that). https://www.starwindsoftware.com/starwind-virtual-tape-library-free
You could use a file sync program. You can set up the program to mirror the everyday HDD to the backup drive(s). The program will scan both drives and copy just the new or modified files to the backup drive. On Windows, I use SyncFolders to back up my files to external drives.
If you truly don't know anything about rsync, I'd start with DigitalOcean's tutorial; it's a pretty basic introduction to rsync. From there, you could probably write a script that runs with cron or similar to keep it updated.
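As a sketch of where you'd end up (the paths are made up):

```
#!/bin/sh
# mirror the source folder to the backup drive; -a preserves permissions
# and timestamps, --delete removes files that were deleted at the source
rsync -a --delete /home/me/data/ /mnt/backup/data/
```

A crontab line like `0 2 * * * /home/me/bin/backup.sh` would then run it nightly at 02:00.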
From what I see, it is a discontinued backup utility for Windows. https://alternativeto.net/software/cobian-backup/about/?license=free&platform=iphone
No matter how great your drives are, your storage is only as good as your backup method. While having drives in RAID is better than the alternative, it only protects you from mechanical failure. If you accidentally remove the wrong file or get hit with ransomware, you are completely out of luck.
All drives will fail eventually, but the 1TB HDD will probably fail sooner than the others; it may not be next year, but it will almost certainly be within the next few years.
As for how you would back up your data: assuming you are on Windows, there is a tool included with the OS called File History. You would select your most important directories (you can always just do C:\Users\big_avacado) and copy them to another drive. It will automatically rerun around every hour to pick up any changed files.
I would start with this.
For a much better backup method, you would want some sort of offline storage, as something like ransomware will easily take out any drives that are attached at the time.
As you mentioned not having speed issues, I would be less concerned with rushing out to redo everything on them, but keep in mind that you could avoid problems in the future (either performance or data integrity) by moving to newer drives now.
In your scenario, I would probably set up File History to back up all of these directories to the external SSD and plug it in once per day/week, depending on how frequently your data changes.
What sort of restore do you want? Assuming you are on Windows and just want to recover individual files, File History would handle this easily enough, and it's already built in.
Most modern incremental backup software does this. Arq, Kopia, and more handle it with compression.
Archivers like tar, zip, etc., solve a different problem. Where incremental backup tools (usually) store a single copy of every file plus delta revisions, archivers typically store a complete copy of the source at the time of the snapshot. They can't just refer to a delta; they have to store every block of every file in the snapshot.
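To make the difference concrete, a sketch (restic stands in here for "incremental backup tool"; paths are placeholders):

```
# archiver: every run writes a complete, self-contained copy of the source
tar -czf /mnt/backup/snapshot-$(date +%F).tar.gz /data

# incremental tool: after init, each run stores only changed blocks and
# the new snapshot just references everything that's already in the repo
restic -r /mnt/backup/repo init
restic -r /mnt/backup/repo backup /data
```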
There are some "hybrids" out there, like Tarsnap or Zpaq, but most of them attempt to build on top of existing archiving tools. Please note that I have no idea if any of the above tools will eat your data. I've never used them, though I hear good things about Tarsnap.
>How often do you change your external hdd for backup purpose ?
Since I buy only new drives, even for backups, I used to replace drives every 3-5 years, depending on the drive warranty period.
>What do you use to monitor your external HDD?
For monitoring a drive, you can find programs like HDDScan that scan the drive for errors and report the drive's SMART status, including the number of bad sectors. http://hddscan.com/
Your task is quite ordinary, so I'd say you can select any backup tool you are comfortable with. Almost every software vendor offers free trial versions of their products, so you won't have to buy a pig in a poke. Obviously, I would recommend Acronis True Image (as you can see from my nickname, my opinion cannot be considered unbiased =)), however, there is a bunch of other tools. Take a look at this article for a comparison table.
> Program Files is that the plugins are stored there,
I assume you refer to what Anki calls add-ons?
The default location for these is in your user profile; see https://apps.ankiweb.net/docs/manual.html#file-locations . If they were in Program Files, you would need admin privileges to install a new add-on. I don't think there is any reason to put add-ons in Program Files.
But your plan should work. For backups, though, you shouldn't rely on anonymous tips from the internet: do a test restore periodically to verify that the backup is actually working.
Thanks for the reply. I was hoping to use Borg on unRAID as well as my Linux PC. Borg is part of the 'Nerd Pack' in unRAID, so it'll get updates. BorgMatic isn't :(
So I'm trying to achieve it with Borg for now. I have heard of healthchecks.io, but in all honesty, it too is beyond me. If I can't figure out how to use systemd to email me, I doubt I'll figure out the additional layer.
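For reference, the pattern I've seen suggested for the systemd part (untested by me; the script paths are placeholders) is an OnFailure= hook:

```
# /etc/systemd/system/borg-backup.service (sketch)
[Unit]
Description=Borg backup
# if this unit fails, systemd starts the notifier below, passing this unit's name
OnFailure=email-notify@%n.service

[Service]
Type=oneshot
ExecStart=/usr/local/bin/borg-backup.sh

# /etc/systemd/system/email-notify@.service (sketch)
# the script would pipe `systemctl status %i` into mail/sendmail
[Unit]
Description=Send failure mail for %i

[Service]
Type=oneshot
ExecStart=/usr/local/bin/send-failure-mail.sh %i
```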
I've seen BorgBase, and the Vorta app is pretty neat. Because Vorta isn't a Community Application (or Docker) in unRAID, I'd rather just use Borg for both systems, with minor tweaks in the config as appropriate. Now there's a potentially lucrative market for you :) unRAID people are often like me: somewhat tech savvy, but struggling with deeper system configuration like cron and systemd.
If I can make a couple of suggestions for Vorta:
BorgBase looks like a good service, but I have access to another PC at my parents' house, so I plan to use that.
Backup to local external drive? Use File History. It's already built-in and costs nothing.
See documentation here: https://support.microsoft.com/en-us/windows/file-history-in-windows-5de0e203-ebae-05ab-db85-d5aa0a199255
BTW, even though they say
> File History only backs up copies of files that are in the Documents, Music, Pictures, Videos, and Desktop folders and the OneDrive files available offline on your PC. If you have files or folders elsewhere that you want backed up, you can add them to one of these folders.
You can in fact create your own "Library" and add any folders you want to it. For example, I've created a "Library" called "Game Saves" and added the various obscure locations certain games like to save files to that are not in Documents\My Saved Games, then added that library to File History.
Edit: somewhat fresher link: https://support.microsoft.com/en-us/windows/backup-and-restore-in-windows-352091d2-bb9d-3ea3-ed18-52ef2b88cbef#WindowsVersion=Windows_10
Definitely look into Keepit: https://www.keepit.com/services/backup-microsoft-office-365/
It takes minutes to set up and needs zero maintenance because it is a cloud-to-cloud solution.
Not sure about costs, but I think they are cheaper than Veeam.
For basic backup services, Cloud Station Backup is reasonable. I wouldn't want to do a full system recovery from it (and I doubt under Catalina it would support it), but for basic 'regular' data backup services that are free once you've invested in Synology, it's not the end of the world.
While I've not used it since EMC sold it off, Retrospect for Mac might be an alternative to consider for a home backup solution.
Got multiple hubiC accounts with my friends since we care about storage space and data safety. One hubiC account offers 25GB of free space, more than common cloud services like Dropbox, Google Drive, etc. Besides, France has some strong data privacy laws preventing the government or third-party software from accessing your data without your permission, so there's no need to worry about data security.
You can try https://www.urbackup.org/ or https://backuppc.github.io/backuppc/ in a VM and see which one you like better then implement it. Use Let's Encrypt for SSL. I can't provide any other opinion.
"Their offer looks poor" about Veeam? I'm sorry KaFKA_1410 -> That's not correct.
Veeam is by far the most integrated for VMware backups and the agents for Linux and Windows can rock out in awesome fashion.
You should use Veeam Community Edition (the free one) to see my point proven:
Free Backup Solution - Veeam Backup & Replication Community Edition
Tell them RICKATRON sent you!
Do you run agent-based backups? or is it just offline VM backups?
From what I see, you could use Veeam Community Edition, which is free for up to 10 VMs, to back them up directly to local and network storage. https://www.veeam.com/virtual-machine-backup-solution-free.html
How many VMs do you have? For up to 10 VMs, you could run Veeam Backup & Replication Community Edition. https://www.veeam.com/virtual-machine-backup-solution-free.html
I can recommend Veeam Agent for Windows for workstation backup. Like Macrium, it lets you make a file, volume, or disk image/system-level backup, from which you can restore a file, a volume, or the entire system to the same or new hardware. https://www.veeam.com/windows-cloud-server-backup-agent.html
I recently tried this and found GB&S was a huge resource hog that was also surprisingly slow and inefficient at syncing on my MacBook. A friend recommended Arq, which can back up to your Google Drive storage as well. It can even encrypt the backup. So far it is working much better for me than GB&S ever did.
Thanks for all that info!
I had assumed from looking at https://www.arqbackup.com/pricing/ that
> 1 user for desktop computers
meant you needed a license for each computer. But if you don't, that's pretty good.
So is there no limit on it at all? Is it just kind of an honour thing that you won't go installing it on 100 business PCs with a single license?
General advice is to use Duplicacy as a backup tool targeting Backblaze B2 storage.
If you have significantly more than 2TB of data to back up (obviously, don't back up iTunes movies offsite, only truly important, irreplaceable data), then there are other options to consider, from full-service backup providers such as CrashPlan ($10/month, unlimited storage) and Backblaze Personal, to G Suite ($12/month, unlimited storage), all the way to hosting a redundant storage array at a friend's house, depending on other needs: data type, turnover rate, convenience, etc.
Hetzner is AWESOME and I love them. Seems a little expensive for storage, but that's probably just me and my TBs of data. I wasn't aware of their storage boxes, so I will take a look.
I recently did my own evaluation of many backup systems, both open source and commercial, and nothing came close to Duplicacy. Just make sure you spell that right LOL; there are a lot of projects with similar spellings.
You should also be aware that Duplicacy is free and very easy to use. I think a license may only be required for commercial use.
How much data do you have?
I would strongly consider backup services such as Backblaze or Crashplan.
If the amount of data is under 1TB, it would be more cost-effective to use a backup application such as https://duplicacy.com coupled with a cloud storage service such as Backblaze B2, which costs $0.005/GB/month.
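For what it's worth, getting started with the Duplicacy CLI against B2 is roughly this (bucket and snapshot names are placeholders; it prompts for the B2 credentials on first use):

```
# run inside the directory you want backed up
cd /path/to/data
duplicacy init my-snapshots b2://my-backup-bucket
duplicacy backup
```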
Have you looked at Duplicacy? The new web-based UI is in beta and needs more polish (including a service mode), but it satisfies a lot of your requirements and is under constant development. Cloud and local storage, Linux and Windows clients, encryption, incremental snapshots, de-duplication, open source (CLI).
No Android client (yet, but feasible), but I find it makes more sense to sync phone data to a PC which, in turn, does the backup. (Not that I have much to backup on my Pixel 2 XL, but for this purpose I use Syncthing.)
What kind of communication are you going to set up between the computers? If you're using a VPN to have them all on the LAN, then copying each backup to the other HDD shouldn't be too hard: you'd just need the target copy location to be \\computername\D$\BackupDir, where D is whatever the drive letter of the external is.
If you don't have a method for networking the computers then you'll have to settle for having one computer backup to the single HDD that's connected.
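For the networked case, a sketch with robocopy (computer name, drive letters, and folder are placeholders; assumes the VPN is up and the admin share is reachable):

```
:: mirror the local backup folder to the other machine's external drive
:: (note: /MIR also deletes files at the target that no longer exist locally)
robocopy D:\BackupDir \\computername\D$\BackupDir /MIR /Z /R:2 /W:5
```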
I've chosen restic which I'm using for local backups to an external but later I'll use it for backups direct to cloud storage.
Thanks! I'm familiar with rclone, actually. I agree that it supports a lot of cloud buckets, but it needs to work with a file system. I was asking more about something for gathering data from, say, a CRM/ERP/accounting application. Some of these have backup functionality in the UI and others have APIs. https://rclone.org/
You could cut out your local copy and still get two backups using 'rclone'[1] and (perhaps) an rsync.net account[2].
You back up cPanel directly to rsync.net (using sftp/ssh/whatever); it's datacenter to datacenter, so it's probably quite fast.
Then you instruct rsync.net to push copies to another cloud provider (in your case, Amazon S3):
ssh user@rsync.net rclone copy some/file/blah s3:some-bucket/some-path
... again, cloud to cloud so probably very fast. The bonus is that an rsync.net account has immutable snapshots, for free, so you gain ransomware/hacker protection by keeping a copy there.
I highly recommend you look into rclone. That’s going to take you ... well a very long time! I bet you already figured out that part LOL.
Rclone is awesome and very tunable. Hit me up if you need any help with it.
Thank you for linking to that page. Further down, the page links to how to copy using rclone. Copying is a function of backing up.
A Google Drive account with rclone would do this for you, assuming you were managing the backup files yourself - rclone doesn't handle things like naming and rotation, just straight up/downloads.
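A sketch of what that looks like (assumes a remote named "gdrive" was already created with `rclone config`; rotation here is just a dated folder name you manage yourself):

```
# upload today's backup set into a dated folder on Google Drive
rclone copy /path/to/backups gdrive:backups/$(date +%F) --progress
```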
Consider Duplicati: open source, cross-platform, S3 support, web GUI, encryption, etc. https://www.duplicati.com/
This article includes an overview of Duplicati and other similar applications: http://www.vmwareblog.org/single-cloud-enough-secure-backups-5-cool-cross-cloud-solutions-consider/
It would be a good idea to get an external drive to have an onsite backup. Given that you have around 200GB of data, get 1TB or 2TB of storage. Then use a program like Duplicati to manage your backup. Duplicati keeps multiple versions of your files, so you have multiple restore points. :)
Do you prefer to copy whole files when the original has been modified? Incremental backup software such as Duplicati only copies the part of the file that has changed. AFAIK you can configure it not to delete anything at the destination.
I don't think a huge image is a good fit for Google Drive anyway. I don't think it really supports partial file upload, so it'd be uploading the whole image again if a single bit changes. Unless this has changed recently or something.
And more generally, syncing isn't really a "proper" versioned backup system, as it also syncs deletes/corruption. It's good that you're aiming for the 1-2-3 thing, but if the 3 depends on the 2, which depends on the 1... this is a riskier way to do things than if 2 and 3 were directly backing up from 1. Because if 2 is bad, then 3 will be just as bad, being a synced mirror of it.
But if you do wanna make use of Google Drive/OneDrive and similar services that aren't really intended for partial file upload... you can kill two birds with one stone by using https://cryptomator.org/ instead of VeraCrypt.
It's basically the same thing as VeraCrypt: a virtual mounted drive that stores the encrypted data in container files. But instead of one bigass file for the entire image (VeraCrypt), Cryptomator splits the image up into small files that play nicely with GDrive/OneDrive etc.
Regardless of all of the above, I'd probably recommend getting off FAT32 anyway and formatting to NTFS. But you'll need another temp drive to hold your files while you format. Maybe a friend/family member has a drive you can borrow or something.
> it's only 128GB and I have ~400GB
Hmm, so it sounds like there's no backup of a lot of that data anyway?
And I know it's not the answer you wanted, but overall, unless you really really can't afford it... it might just be time to grab a new drive. Even your 1TB is probably pretty old by now, so personally... I'd want as many copies as possible.
From what I have found:
>That depends how the corruption occurs. If you open a Microsoft Word file in notepad, edit the foreign characters, and save it, it will corrupt the file, but it will still be perfectly healthy as far as the OS and other programs are concerned. If a drive is going bad and a SECTOR has corruption, it may detect and return a data redundancy or cyclical read error. The first is file content corruption and the second is file corruption. FFS will NOT detect file data corruption but will detect file corruption if it is unable to read the file.
https://freefilesync.org/forum/viewtopic.php?t=6539
https://freefilesync.org/forum/viewtopic.php?t=1930
FreeFileSync is an excellent product that I have been using for years without any problems. It has good filtering capabilities, so you can pick just the file types you want. https://freefilesync.org/
The two posters, IT Insider Advise and IT Insider Advise II, both report that the company does not back up your data to redundant data centers but tells customers that it does. They both allege it engages in bad business practices. The company has 91 reviews on https://www.trustpilot.com/review/www.idrive.com with an average rating of one star, and there are dozens of current one-star reviews. Restoring at 0.35 Mbps, it would take 265 days to restore 1 terabyte. You might want to do a test restore of 100 GB to see if your clients are well protected by iDrive.
I am currently working on something very similar for my own home setup. Happy to share my insights and also eager to see how other people are doing their documentation.
Since the documentation should be very accessible, as well as quick and easy to update, I chose to write it in markdown, which can then be exported to HTML or PDF or whatever is needed using pandoc.
For the most part, I list stuff in tables: device inventory, backup targets and frequency, data classification. When more information is required than a table can show, I just write it as a short list above or below the table (for example, when specific folders are excluded or when certain types of data should not be stored together on the same device). For diagrams, I use mermaid flowcharts. While certainly not the most appropriate or beautiful solution, I really like how well this integrates with markdown, and it gets the job done for me.
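The export step is a one-liner (the file names here are mine):

```
# render the markdown doc as a standalone HTML page
pandoc -s backup-docs.md -o backup-docs.html
```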
Yes, they are a good combo, and you have identified some important issues. Ransomware could wipe out everything on your external hard drives. You could recover most of your files from Backblaze, but you'd need them to ship you your data. They'll refund the cost of that once you return their device, but you're going to be down for quite a while while you reinstall your OS and all your software and then restore your data.
With free drive image software you can avoid an incredible amount of hassle, aggravation and lost days.
Veeam Agent for Microsoft Windows is free. It backs up everything on your PC to an external hard drive. Better yet, it preserves deltas day after day: incremental forever. You set the number of days. Typically 7 days is enough, but in case of an undiscovered data loss, more is better. Each day doesn't use much space - it's a delta!
Even better is if you have two large external hard drives. Connect them to this USB hub with push buttons for each port:
https://www.amazon.com/Sabrent-4-Port-Individual-Switches-HB-UM43/dp/B00JX1ZS5O
Configure Veeam Agent for Microsoft Windows to "see" both drives. There's good documentation for that.
Each day, you press the buttons to turn off one drive and turn on the other. Veeam finds the active drive and backs up to it. You can skip a day or even get mixed up, and Veeam will still back up to whichever drive is active.
When ransomware hits, it wipes out your PC and the active external hard drive. But the other external hard drive is physically disconnected by your USB hub button. Can't touch that!
I love NASes, too. They do more than backups. But you don't need one for local backup if your external hard drive is big enough. A NAS is great for offsite mirroring to another geographic location.
I agree, I don't think that anything better than https://play.google.com/store/apps/details?id=com.koushikdutta.backup&hl=cs exists (without root)
I agree about tape being cheap, but it's not much cheaper than a modern external USB HDD nowadays. Now, if you have a library (an old Dell TL2000 or 132T), it's pretty fun to script the library control stuff.
Before I used CrashPlan, I set up an SSH rsync process to an old Linux box at my dad's house. It provided remote replication that was active all the time. That was a fun project for me; maybe something similar could be your next adventure. Raspberry Pi 3s are plenty beefy to serve as a Linux NAS/rsync destination, and they are super cheap.
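The core of a setup like that is just an rsync push over SSH on a schedule, something like this (host and paths are made up; SSH keys make it non-interactive):

```
# push local backups to the box at the remote house
rsync -az --delete /home/me/backups/ pi@remote-house.example.com:/srv/backups/
```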
Data protection is a thankless skill/profession, I'll tell you. I'm happy to see someone taking an interest. My field of expertise is a lonely one. :) Good luck!