They use 17+20 erasure coding with checksums for verification, so it should be pretty resistant to bit-rot on their end.
Your end is another matter. There's not much defence against that other than keeping historic versions, so that if an unexpected change happens there's plenty of time for the user to notice and roll it back. 30 days isn't really plenty of time...
You could use QuickPar to protect valuable static data with your own erasure codes.
I'd go 7z with PAR files if you want a recovery record. Even though WinRAR will probably be around for ages and ages to come, I think going with a proprietary format for archival storage is a bad idea. Even if you save a copy of the archiver, running it on some computer at an arbitrary point into the future might be a PITA.
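If you go the 7z + PAR route on the command line, something like this is the general idea (a rough sketch assuming 7-Zip and par2cmdline are installed; the file names are just placeholders):

    # Archive with 7-Zip, then add ~10% PAR2 recovery data alongside it
    7z a -mx=9 photos-2024.7z photos/

    # Create PAR2 recovery files covering the archive (-r10 = 10% redundancy)
    par2 create -r10 photos-2024.par2 photos-2024.7z

    # Years later: verify (and "par2 repair" if needed) before extracting
    par2 verify photos-2024.par2

The upside is that both 7z and PAR2 are openly documented formats with multiple independent implementations, so you're not betting on one vendor's archiver still running decades from now.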
A lot of news readers do this automatically for you nowadays, and if they fail there usually isn't much you can do about it anyway. But you can get QuickPar and download the par2 files for a post (if there are any). QuickPar will use the par2 files to find the missing blocks and repair the rar files; once repaired, you should be able to extract manually.
What you're saying might make sense, but that doesn't make RARring relevant. PAR files have been available for ages; you can include a small file and it will both verify and repair a large movie. Besides, with the reliability of TCP, when was the last time you downloaded/uploaded data that was corrupt?
1) Missing blocks are either DMCA or an incomplete post (the upload never finished, or the provider never synced all of the data).
Mostly it's a DMCA takedown request. You can try adding a block account or two from different backends to "try" and get the missing block(s) needed to extract the archive.
2) The beyond-retention error can be either that it's truly past the X days of saved data, or a DMCA takedown where a lot of blocks have been removed.
You can try downloading the par2 files and using http://www.quickpar.org.uk/ or http://multipar.eu/ to manually repair the missing blocks, then extract with WinRAR/7-Zip.
In general, the older a post is, the more likely DMCA has happened. Having an indexer (or several) that decodes obfuscated posts will help. Plus adding block accounts for just the missing block(s) when needed (add to sabnzbd as a backup in the list of servers).
> I would be more inclined to RAR it - split to 500K (or whatever), enable recovery record (with a large %) and also add couple of recovery volumes. And then put as many copies of those files on as you can fit.
Might also be interested in QuickPar to create a bunch of parity data for the file(s).
I don't know about the UI, but I know you can create more than 100% parity with cli tools.
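For the CLI, a rough sketch of how you'd push past 100% with par2cmdline (the -b/-c switches are from its help text; exact limits may vary by version):

    # Split the source into 1000 blocks, then request 2000 recovery blocks (~200% redundancy)
    par2 create -b1000 -c2000 backup.par2 important-file.img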
You are using an outdated version of sabnzbd.
Go here and grab 2.1.0+: https://sabnzbd.org/downloads
Then test it again; that may fix your issue.
If not, try manually repairing with http://www.quickpar.org.uk/Download.htm on Windows or the par2repair CLI on Linux. See if that works, then just open it up with WinRAR or 7-Zip.
You could ask the sender to create PAR2 files using Quickpar then use them to rebuild the archive.
Or if they're going to split the archive ask them to create PAR2 files anyway just in case.
The PAR2 files can then be used to verify and repair the archive.
Proper, reliable backup means a 3-2-1 backup plan: 3 copies of the data, on 2 different mediums, with 1 offsite. The simplest method is:
1 copy on the Brennan
1 copy on a cold hard drive
1 copy on a cloud storage provider
You also have a bonus copy in the form of the CDs themselves. But as these aren't the same format, they are considered a last resort.
While this protects you against media loss, it doesn't protect you against data corruption. For that you need checksums. Before copying your data onto a bunch of media, consider generating checksums for the data to form a baseline of what is known-good data. From these checksums, you can verify your data at regular intervals. There are several utilities out there that can generate hashes/checksums and verify them; Corz Checksum is one such utility. You could also incorporate file-repair capability into your checksums, which QuickPar can do. This process takes a long time, but it is necessary to ensure you aren't backing up bad data, and so that you can restore bad data with known-good data from your backups.
Once the hashes are generated, now you are ready to make a bunch of copies. After you have made the copies, verify the copies using the hashes/checksums. Repeat this check 1-12 times a year depending on how reliable you want your backups to be. Regular checks will also fulfill the need to spin up your cold storage drive every once in a while to keep it running properly.
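If you'd rather do this with stock command-line tools than a GUI, a minimal sketch (paths and the manifest name are placeholders):

    # Build a baseline manifest of known-good hashes
    cd /path/to/data
    find . -type f ! -name checksums.sha256 -print0 | xargs -0 sha256sum > checksums.sha256

    # On each copy, and again every few months, verify against the baseline
    sha256sum -c checksums.sha256 | grep -v ': OK$'    # prints only failures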
Any program that claims to check files of "any" kind will fail; I've tried that in the past. Those check programs only have a small set of functions to check basic things, and usually for only one format, so you have to find another program for each other format (mp4, mkv, avi, ts and others).
What you should do (or should have done) is get a program like this, hash the videos as they arrive using your preferred hash (something stronger than MD5), and save the results with the files on that drive. The next time you plug in that drive, re-check the hashes first to see whether anything has been corrupted, then you'll know whether it's safe or not.
There's no way to tell whether your source is corrupted using any of the above techniques. I've seen a few videos that passed both an external stream-check program and the original hash check, but still had glitches on playback, which were the fault of the encoder and which those types of checks can't catch. So you also need to be sure your "source" is reliable.
If you want ultimate protection, you can generate parity and create recovery files sized to your choice, to recover anywhere from bytes to megabytes of file corruption.
Well, what you are doing now sounds like the best one can do. For your next collection, try making par2 files for each album, say 5-10% minimum. If something odd happens again, chances are some of those par2 files will be intact enough to repair one or two songs per album. http://www.quickpar.org.uk/
It is a weird thing, but it kind of works like holographic images - take an old National Geographic cover with a holographic image on it, cut it in half, and instead of two halves of the same pic you get two smaller pics. Say you have 100 songs in a folder and you make 10% par2 files for them, which will bloat the total size by 10%; if up to 10% of that data is damaged - roughly 10 songs' worth - it can reconstitute them whole!
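If you end up doing this for a whole collection, a quick sketch with par2cmdline rather than the GUI (the directory layout and the 10% figure are just assumptions):

    # One par2 set per album folder, 10% redundancy each
    for album in /music/*/; do
        par2 create -r10 "${album}album.par2" "$album"*
    done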
Jdownloader (the program emuparadise wants you to install) isn't terrible. I find it a good option to use when downloading files from web based sources. Make sure to use the offline installer as the online one is known to contain malware. Also be ready to use QuickPar to repair the rar files after you download them; you'll need the .par2 file included in the ISO post in order to instruct it on how to do this.
I can only tell you what was recommended to me. My client is newsleecher ($20 a year) and I subscribe to usenetserver. I was also referred to QuickPar to clean up any files that may have been lost.
*edit - I should add that I am pretty much an amateur to this but I've already discovered that the quantity and quality of content is superior to that of torrents (IMO).
par2 is basically file-based software RAID.
You create a par2 set, which is essentially parity information for a particular file set; that parity information can be used to replace any missing or corrupt parts, byte for byte.
For instance, if you create a 100MB par set, your file set can be missing up to 100MB and still be restorable.
It became popular on usenet because it's much preferable to re-posting individual parts of 'Linux-ISOs' that did not make it through propagation.
Before the days of par2, usenet used to be about 50% repost requests and reposts, by volume.
There's another way that you can create files that can be used to verify integrity, called PAR2. The difference with SFV is that PAR2 files can actually repair broken files, as long as the damage does not exceed the amount of parity you have.
The magic is in that you can have, say 100MB of files, create 10MB of PAR2 parity, and that 10MB can repair ANY 10MB worth of damage to ANY of the files. Basically what RAID does, in software/files.
But of course, this isn't free, you've now increased your set of files' size by 10%. That's what I was getting at.
If you're interested in using PAR2 you can find a free tool called quickpar here.
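To make the RAID analogy concrete, here's roughly what that looks like with the command-line par2 tools (file names are only examples):

    par2 create -r10 photos.par2 *.jpg    # ~10% recovery data for the whole set
    rm 0042.jpg                           # simulate losing any one file
    par2 repair photos.par2               # rebuilds it from the surviving files + parity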
No, the burner doesn't manage any of that. All you can do is verify files after burn.
You can also use something like QuickPAR: http://www.quickpar.org.uk/AboutPAR2.htm
Or Multipar: http://hp.vector.co.jp/authors/VA021385/
Or just generate checksums of your source files and validate your burned files to make sure they match. Keep a duplicate copy so that when you go to read the data later and there's corruption, you have a backup to restore from.
The way binary data is stored on Usenet may be something to consider.
The parity data generated by QuickPar can be used to repair your archive in case of data corruption via bit rot or some type of media failure. The more parity data you generate, the more corruption you can recover from.
Would you mind me asking what site you got this from?
Just looking at this is giving me anxiety
Hopefully one of these will be of some assistance:
QuickPar is the easiest solution. It's a quick download and install, and doesn't require changing anything about how you use usenet. The file names have been obfuscated. QuickPar will verify the download (all files present and not corrupted), then change the names back to the originals.
I was pondering a way to use PAR2 to further improve the chances of retaining data when storing it on archival media of some kind, by the way, that might be something to consider for your system.
http://www.quickpar.org.uk/ is a Windows app that takes a set of files, like your image file folder, and calculates parity for them. It then generates as much or as little parity data in .par files as you wish; the more insurance against missing files you want, the more space it takes up, but it's never more than a fraction of the whole.
This way, if you come back to a disc 25 years from now or something and it has a nasty scratch on it so you lose 15 pictures but retrieve the .par files and the rest of the pictures, you can use Quickpar to reconstruct the missing data.
I do agree with what /u/jcunews1 said - you can't get something from nothing.
But - if you do have access to usenet & you do know a good usenet group that should have the content you're looking for, you could use QuickPar to find what data you're missing and then download it from there...
Lot of conditionals in that 1...
Or you could download stuff from Usenet/NNTP/Newsgroups in multiple chunks as each message was constrained to a certain size.
In either case, BBS or Usenet, you could use PAR2, aka the Parity Archive format, to create additional files that could fix errors in your downloads or even recreate missing sets: a RAID of sorts - a "redundant array of internet downloads".
http://www.quickpar.org.uk/CreatingPAR2Files.htm
Redundancy is how much of the total file you want to be able to restore.
Efficiency is how well the block size matches the various file sizes.
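For reference, the rough command-line equivalents of those two dials in par2cmdline (switch names taken from its help; tune to taste):

    # -r = redundancy (% of the set you can lose and still restore)
    # -s = block size in bytes; smaller blocks match small files better but add overhead
    par2 create -r15 -s512000 scans.par2 scans/*.tif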
It's worth noting there are two concepts of 'degrading'. One is visual (saving into a 'lossier' format than the existing file). The other is file content (damage to the bits/bytes of the file because of transmission errors or storage errors), or possibly even tampering.
You can solve the visual issue by using 'lossless' formats (like PNG), or non-rasterized formats for line art (SVG).
You can solve file content issues by storing EXTRA data to recover from damage with a 'parity archive'.
http://www.quickpar.org.uk/ can store recovery data up to whatever percentage you want, to repair a file if it becomes damaged over time.
If you want to trust that your file is untouched, calculate a hash for it (md5, sha256, etc) and keep it next to the file as its 'signature'!
Finally if you need any chain of trust for a file, consider using something like GPG to sign it with a private key and distribute the public key to the recipients. Places like github allow signing any additions or modifications to a file with a private key too.
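A rough sketch of those last two steps with standard tools (assumes you already have a GPG keypair; file names are placeholders):

    sha256sum report.pdf > report.pdf.sha256      # integrity 'signature'
    gpg --armor --detach-sign report.pdf          # chain of trust: produces report.pdf.asc

    # Recipient side
    sha256sum -c report.pdf.sha256
    gpg --verify report.pdf.asc report.pdf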
In my experience, I use 2 types of archive: one in WinRAR with a recovery record (it's like PAR2), the other in 7-Zip with par2, because 7-Zip doesn't handle recovery records.
The par2 software that I use is an old one called QuickPar (http://www.quickpar.org.uk/). I don't like this method because there has been no update since 2004... That's why I actually use WinRAR.
Also, I don't use solid archives, because if there is extensive damage it will only affect a couple of files, not the whole archive. It compresses less, but it's more secure...
According to this page, which says the following, a damaged par2 recovery file is still usable.
> Damaged PAR files will still be useable. PAR 2.0 can use the undamaged parts of a PAR file.
If a drive is going to get bad sectors, I guess there may be a benefit if the recovery volumes are split into multiple files, as long as they're located on different areas of the disk. Otherwise, in the event of a sector failure, you'd either lose part of the big recovery file, or lose one or more of the small recovery files. In either case, the remaining data should be usable, according to that page. Right?
Actually, I found this, which says:
> Damaged PAR files will still be useable. PAR 2.0 can use the undamaged parts of a PAR file.
This implies, to me, that there's no practical difference.
Exactly what it says. There weren’t enough par2 repair blocks or “slices” to fix the errors in the original file. Quickpar has a quick explanation on how par2 works.
Download the .par2 files and repair the rars with MultiPar/QuickPar
Zippy has no way for me to check the CRC values without having to re-download each file & verify; par2 makes it a lot easier.
If you can tell me which part(s) are corrupt then I can reup them, but using the par2 will be quicker overall.
You could create a parity (recovery) file set for the files, which will retain the original file names. You can then rename them to anything you want, move them, then click the PAR2 file and it will restore the original names.
The program is very small, light and fast. If the total file set is in the GB range it might take a few minutes to create and a minute to check and restore. If you create some extra recovery blocks, you can even recover files which are lost or corrupted (e.g. you archive them to a CD and parts of the CD become unreadable, or you have a partial hard drive crash resulting in unreadable sectors).
If we assume that you're right, http://www.quickpar.org.uk/ would help with that. It creates ECCs for files to assist in verifying data integrity and repairing damaged files.
I've never used it so I can't comment on its efficacy.
Thanks that worked. For those who haven't used a par2 file to repair archives before, I used: http://www.quickpar.org.uk/VerifyingAndRepairing.htm
The one thing I'm still trying to figure out is how do you know which par2 file to use for each episode? I think I'll figure it out sooner or later. Thanks again for the help. Great work!
There are several ways. From VC, output the encrypted container. Then use a program like par2 to take some precautions with regard to data integrity (recovery files for the container). Or use 7-Zip to 'compress' and split the 50GB file into smaller chunks; 2-5GB is good. 7-Zip also gives you some simple encryption options.
Then just go about uploading your files.
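A rough sketch of the 7-Zip route (switches from 7-Zip's own docs; sizes and file names are just examples):

    # 2GB volumes, store-only (the container is already encrypted),
    # password-protect and encrypt the archive headers
    7z a -v2g -mx=0 -p -mhe=on container.7z container.hc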
I'm not sure why you'd ever want that kind of a setup. It's much easier to download a browser plugin such as nzbget-chrome and send the .nzb file directly from a website straight to nzbget. Simpler yet, there are programs such as CouchPotato and Sonarr to automate the entire process. Alternatively, some indexer sites such as Dog allow a remote push directly to your server if you prefer that method. If for whatever reason you want a more manual and detailed view of what's actually going on, there are other tools such as Grabit and QuickPar that might be worth checking out. For repairs in general it's hard to give an exact estimate, because of the way PARs work.
Use a normal file compression program?
7zip is perfectly capable of compressing the file into multiple .7z volumes and reassembling them when you uncompress it.
If you want to be certain it gets there bit perfect, you could use http://www.quickpar.org.uk/ (or any par2 utility, http://multipar.eu/ seems to be much more current and a successor to Quickpar) to generate a set of parity files on the compressed files or the ova itself before transmitting.
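Roughly, with any command-line par2 tool (names are placeholders, and the split-volume glob assumes 7-Zip's .001/.002 naming):

    par2 create -r10 appliance.par2 appliance.7z.*    # parity over the split volumes (or the .ova itself)

    # On the receiving end, after the copy
    par2 verify appliance.par2     # bit-perfect? run "par2 repair" if not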
Try manually repairing archive with par2 tools.
Either command line par2verify or http://www.quickpar.org.uk/ or http://multipar.eu/
When running that, there are two types of failures with par2. One: there are too many missing parts, so the archive can't be repaired. Two: the par2 files themselves are no good and can't access the data (very rare, but I've seen it from time to time).
If the 1st one happens, there is nothing you can do unless you get more blocks. If it's the 2nd one, just try opening the archive files and running a test on the archive. The archive may still be good even if the par2 files are not.
And lastly, if the par2 set says repair is possible, run the repair and then extract.
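For the command-line route, the whole flow is roughly this (par2cmdline shown; QuickPar/MultiPar are the GUI equivalents, and the archive names are placeholders):

    par2 verify archive.par2     # failure 1 shows up as something like "repair is not possible"
    par2 repair archive.par2     # if repair is possible, this rebuilds the damaged parts
    7z t archive.part1.rar       # if the par2 set itself is bad, just test the archive directly
    7z x archive.part1.rar       # and extract if the test passes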
In terms of usage, it's also worth pointing out that on newsgroups, files are generally split into multi-part .rar archives, and retrieving them isn't flawless; sometimes a single bad file out of 200 can prevent extraction. To avoid this problem, the files usually come with .par files, which are basically verification/repair files. They aren't mandatory (some clients like AltBinz pause them by default) but they're very handy. You need software like QuickPar to run those files once everything is downloaded. Personally I use astraweb: big retention and unlimited traffic/speed for $10/month, payable by PayPal.