You need a provider like - https://www.eweka.nl/en
You need a downloader like - https://sabnzbd.org/
You need an indexer. I believe Sonarr/Radarr ship with a default one, but it won't be great. The best indexers are generally invite-only and can be a pain to get into. Just bookmark the suggested ones and keep an eye on if/when they are accepting new users.
and then set up sonarr etc like - https://www.cuttingcords.com/home/ultimate-server/setting-up-sonarr
So you should have
Should be plenty of guides out there for each step. Yes it costs a little (I think I pay about the price of one month's Netflix, but every 3 months). I won't go as far as saying it's virus free, but compared to torrenting, 99.99999% of the time you get what you want in the quality you want, free from all the shite, over SSL. Once it's downloaded, Sonarr should just do its usual thing from there.
to the mods: I am in no way condoning piracy. Newsgroups have existed since almost as long as the internet and are a legitimate way of sharing media. If others choose to use it for any illegal means then that's nothing to do with me :)
I know this is /r/trackers, but for a lot of the odder shows, you are better off on usenet. You can integrate it into sonarr. https://sabnzbd.org/ or https://nzbget.net/ . Then get an indexer ($10 for lifetime), then wait for a deal on a usenet provider for about $1 a month. Or get a block account if that suits you better, roughly $1 per 500 GB.
The nzb client will be the download client in sonarr (like deluge), and the indexer goes where you set trackers in sonarr.
> UNLIMITED
>
> No Access Limits
>
> 50 Simultaneous Connections
Does this mean you won't penalize users who connect from more than one IP simultaneously -- i.e.: if a user has a VPS or Dedicated Server in one location, and also downloads from home?
> Huge Retention
Define "Huge retention". 2100 Days? 2400 Days? 2700 Days? 3600+ Days?
> Free SSL Encryption
Your site is secured by Let's Encrypt. While I have nothing against Let's Encrypt personally, I have to question your certificate. Is it provided by yourself or by whoever you're reselling for (which could possibly cause problems)?
> No Tracking, Logless Access
In these cases, we generally have to take providers at their word regarding this. Some providers have established a reputation where we can take them at their word. However, you are new. Can you provide any evidence that demonstrates this?
It's a "feature" of par2cmdline, where they skip blocks that might be shifted a bit in the data if you don't force it. In mega-rare cases those could cause a failure, but in practice it never happens. Luckily you can override this behavior, so it finds as many blocks as on Windows.
You need to set "-N" for Par2 Parameters, see here: https://sabnzbd.org/wiki/configuration/1.1/switches#par2cmdline
Just zip/rar it as everyone else does ...
It's theoretically possible - since there is no actual "filename" property inside an nzb (it's derived from the subject), you could put a relative path in it. But this will cause problems for clients. Another problem: do you use a forward (*UX) or backward (Windows) slash for paths?
Just disable Direct Unpack.
I personally love this feature, it saves a lot of time and it makes it possible to watch videos while they download.
Open it in its own tab and check "x_frame_options" and "host_whitelist" in special options
https://sabnzbd.org/wiki/configuration/2.3/special
"Includes HTTP header with every request that prevents SABnzbd to be included in another site within the browser. Disable when trying to use SABnzbd with tools that let you control your HTPC from a single interface."
No, SABnzbd isn't connecting, it is being contacted by a wrong hostname from an external source.
In any case, it is being blocked by the new security feature https://sabnzbd.org/wiki/extra/hostname-check.html
/u/TehJonny It seems your setup is exposed to the outside internet and maybe you don't have a username/password set up?
Maybe the first real application of this kind of attack?! Do you recognize the IP address?
NzbGet for the win. It's a matter of programming limitations in sab that slow it down. I would think you could get more than what you are currently getting, though. What you want to tinker with is the cache amounts in the settings within sab. See: https://sabnzbd.org/wiki/advanced/highspeed-downloading
Btw we only download Linux iso's round here.
Ok. I grabbed both of those, and they're fine.
The issue is your sab version. They are up to version 2.3.9, which as you can see is WAY ahead of what you're running. Perhaps this would be useful to you. https://sabnzbd.org/wiki/installation/install-ubuntu-repo
Edit: Also, holy crap, the version you're running was released in November 2014. Even the current version, 2.3.9, is 340 days old. Update.
Haven't been able to use your service for a while now:
>Failed to connect: Server news.usenetfarm.eu uses an untrusted certificate [Certificate not valid. This is most probably a server issue.]
>
>Failed to connect: Server news.usenet.farm uses an untrusted certificate [Certificate not valid. This is most probably a server issue.] - Wiki: https://sabnzbd.org/certificate-errors
Try https://sabnzbd.org/wiki/configuration/3.2/categories - the Folder field is the folder or full path for final storage (relative folders are based off your Complete folder; full paths are also allowed). Ending the path with an asterisk * will prevent creation of job folders.
Not verified though.
Well, it looks like you have a couple of things going on, and you didn't really do much troubleshooting on your own.
1) only 1 entry is in Sab's history, which tells me this is likely part of your issue https://wiki.servarr.com/Sonarr_Troubleshooting#Download_client_clearing_items
2) these two lines
>21-5-10 12:58:27.9|Trace|DiskScanService|55 files were found in F:\TV\Home Economics S01E04 1080p WEB H264-CAKES.2
>
>21-5-10 12:58:27.9|Debug|DiskScanService|0 video files were found in F:\TV\Home Economics S01E04 1080p WEB H264-CAKES.2
tell me that Sab is likely configured improperly. It should be set to +Delete; you probably have it at only Download or Repair. https://sabnzbd.org/wiki/extra/job-options
If it is set to +Delete and is not unpacking the releases, that's something for you to troubleshoot with sab's support methods.
To test you could also try what SABnzbd gives you. I know you probably have a preference for NZBGet, so this would just be to test if NZBGet is the problem or connection/disk/etc/etc.
If you want to help, install the Debug version of nzbget and send hugbug (the creator) the logs after a crash via the forums.
The disk speed is indeed very important. If you have a 50 GB file, it needs to read and process all 50 GB; if your disk can only do 20 MB/s, then you can imagine this taking a while.
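As a rough back-of-envelope (assuming the whole file is read once and disk throughput is the only bottleneck, which is a simplification):

```python
# Estimate post-processing time from file size and disk throughput.
# Decimal units assumed here: 1 GB = 1000 MB.
def processing_minutes(file_gb: float, disk_mb_per_s: float) -> float:
    total_mb = file_gb * 1000
    return total_mb / disk_mb_per_s / 60

print(round(processing_minutes(50, 20), 1))   # 20 MB/s HDD: roughly 42 minutes
print(round(processing_minutes(50, 150), 1))  # 150 MB/s SSD: under 6 minutes
```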
The post-processing is done by an external tool that depends on the OS. On Windows it's Multipar and there's really nothing faster than that. On Linux systems make sure to have multicore par2, especially par2-tbb if it's available for your system. Double check that you have nothing set for Extra PAR2 Parameters in Config > Switches (enable Advanced Settings).
Users can solve it by installing SABnzbd 1.1.0 and then deleting the server.cert in their admin_dir (location for each os: https://sabnzbd.org/wiki/advanced/directory-setup)
SABnzbd will then re-generate the certificates using a proper algorithm when it's restarted!
It doesn't by default. Go to the folders tab in the options, and turn on the advanced view.
Then set the '.nzb Backup folder' to whatever you want (down at the very bottom).
https://sabnzbd.org/wiki/configuration/3.3/folders
What you have is correct. Nothing special to consider really, but here are some more details on how it works:
>You assign each server a priority, a number between 0 (highest) and 99 (lowest).
>
>SABnzbd will first try to get articles from the group of servers with the highest priority. Within the priority group, the first server with a free connection will be tried. When the first tried server doesn't have an article, then another server with the same priority is tried.
>
>When none of the primary servers has a specific article, a lower priority group is tried. Within the lower priority group, the same method is used: the first server with a free slot is tried.
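The quoted logic can be sketched roughly like this (a toy model for illustration, not SABnzbd's actual code; the `Server` fields are made up):

```python
from dataclasses import dataclass, field
from itertools import groupby

@dataclass
class Server:
    name: str
    priority: int            # 0 = highest, 99 = lowest
    free_connections: int
    articles: set = field(default_factory=set)

def pick_server(servers, article_id):
    """Walk priority groups from highest to lowest; within a group,
    use the first server with a free connection that has the article."""
    ordered = sorted(servers, key=lambda s: s.priority)
    for _, group in groupby(ordered, key=lambda s: s.priority):
        for server in group:
            if server.free_connections > 0 and article_id in server.articles:
                return server
    return None  # no server in any group has the article

primary = Server("primary", 0, 10, {"a1", "a2"})
backup = Server("backup", 1, 5, {"a1", "a2", "a3"})
print(pick_server([primary, backup], "a1").name)  # primary wins on priority
print(pick_server([primary, backup], "a3").name)  # falls through to backup
```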
Not sure if you are asking whether it's already available for download, which it is:
https://sabnzbd.org/downloads
Usually there will be 2-3 betas before going final (the 3rd beta often being an RC), but so far I have always been an early adopter with beta versions when it comes to sab, and I've never really had any issues.
So Newsdemon has worked perfectly for me for years, until today. I've gotten a 100% error rate on everything I've tried, ranging in age from 2 days to 5 years.
"Aborted, cannot be completed - https://sabnzbd.org/not-complete"
How do I fix this?
If you use any of the banners on our website (https://sabnzbd.org) to get to this exact deal, we from SABnzbd get also some income. But when using the link above, we get nothing 😂
Set here the maximum retention time of your server. Retention means the number of days that articles are kept by the server. 0 means infinite retention. Only use this when you have multiple servers and you want to avoid SABnzbd wasting time asking servers for articles they cannot have. Be aware that retention times advertised by Usenet providers are not absolute. If you set it too low, your other servers will be used more intensely. There is no reason to set a retention time when you have only one server.
I'd suggest checking your config and adjusting the propagation delay setting if it is not set. It's difficult to identify what your issue is without more info or by eliminating variables.
propagation delay
https://sabnzbd.org/wiki/configuration/2.3/switches
> If you experience very young posts failing due to missing blocks, your server might still be in the process of receiving the posts. Delaying these very young posts a few minutes might solve these issues. Posts will be paused until they are at least this age. Setting job priority to Force will skip the delay.
https://forum.nzbget.net/viewtopic.php?f=8&t=1736
or adjust your nzbget.conf
https://github.com/nzbget/nzbget/blob/develop/nzbget.conf
> # Propagation delay to your news servers (minutes).
>
> PropagationDelay=0
https://sabnzbd.org/wiki/faq#toc27
"If you are moving between major versions (such as a 0.4 version to a 0.5 version) it is recommended that you finish your current queue. As major versions may create new queue files (your old queue file will still exist if you choose to go back to it and finish up downloading)"
Any reason the API wouldn't work? You won't be able to feed that as-is to an RSS reader, but you could write some pretty trivial code to convert the json output to xml. The api does support xml as an output format, but I don't think it's in RSS format so you'd still have to do some sort of translation.
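A minimal conversion could look something like this (the JSON field names `title` and `link` are invented placeholders; substitute whatever the API actually returns):

```python
import json
import xml.etree.ElementTree as ET

def json_to_rss(payload: str) -> str:
    """Convert a JSON list of results into a bare-bones RSS 2.0 feed.
    Assumes each item carries 'title' and 'link' keys (placeholder names)."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Converted feed"
    for entry in json.loads(payload):
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = entry["title"]
        ET.SubElement(item, "link").text = entry["link"]
    return ET.tostring(rss, encoding="unicode")

sample = '[{"title": "Some.Release.1080p", "link": "https://example.org/get/1"}]'
print(json_to_rss(sample))
```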
Are you using SABnzbd 1.1.0 with the HTML login form? You can disable that in Specials, switch off html_login.
EDIT Did you see https://sabnzbd.org/wiki/extra/howto-apache and the settings described there?
I think I found a good solution for it:

sabnzbd provides options in the "Switches" settings (with "extended options" activated) called "nice" and "ionice", which are Linux tools that define process priorities.
https://sabnzbd.org/wiki/configuration/3.4/switches
Playing with the values gave an instant improvement.
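On Linux the same idea can be tried from Python's standard library (a small sketch; ionice has no stdlib equivalent, so only CPU niceness is shown here):

```python
import os

# os.nice(increment) raises the niceness of the current process and
# returns the new value; higher niceness = lower CPU priority.
current = os.nice(0)   # an increment of 0 just reads the current niceness
lowered = os.nice(5)   # be 5 steps "nicer" to other processes
print(current, "->", lowered)
```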
I get wobbly download speeds, too. I just found this https://sabnzbd.org/wiki/advanced/highspeed-downloading:
> Both Incomplete and Complete should be on a SSD/M.2/NVMe disk. SABnzbd should report the speed of Incomplete and Complete at least 150 MB/s. Slower disks like HDD and NAS have a very negative impact on downloading speed: a slower disk means slower downloads.
I'm going to move my comp and incomp folder structures to the SSD and put links in the original places so I don't have to change any path settings.
I ran into a somewhat similar issue a couple days ago. After a bit of research I found this https://github.com/sabnzbd/sabnzbd/releases and noticed it talks about external access and local ranges. My guess is Sab for some reason can't detect your local network. So in short, I think you should be able to add 192.168.1.0/24 in the sabnzbd.ini file under local_ranges to fix your issue. For the host setting I think 0.0.0.0 should still work; at least that's how mine is set up. Here is a link to the wiki: https://sabnzbd.org/wiki/configuration/3.3/general. Hope this helps.
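As a sketch, the relevant bits of sabnzbd.ini might look like this (the 192.168.1.x range is just an example for a typical home network; use your own subnet):

```ini
[misc]
host = 0.0.0.0
local_ranges = 192.168.1.0/24
```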
Same issue here using sabnzbd - "Server usenet.premiumize.me uses an untrusted certificate [Certificate not valid. This is most probably a server issue.] - Wiki: https://sabnzbd.org/certificate-errors"
I'm not even gonna lie, I tried to find this in NZBGet's documentation and there was no info, and their documentation is overall laughable, to the point where I would honestly recommend Sabnzbd if you want to verify something, because I can tell you for sure how its priority setup works.

In your example of both servers set to the same priority, Sab will use whichever has a free slot first and then try others in the same priority level if an article cannot be grabbed.
Issues I have been having with NewsHosting are:
I was thinking I would benefit from having an additional backbone, I am just confused on which one I should get with NewsHosting (or if I should scrap NewsHosting altogether)
A fellow Linux user, nice. Arch?
The RAID0 won't be bottlenecking you then lol. Decent CPU and RAM I assume? Do you have sabyenc installed and enabled? Are you using Direct Unpack (not that you should have much issue with your setup)? Have you tried a non-beta version in case it's a bug/regression?
I run SAB on my NAS and as I said I get full throughput. Check out the high speed tweaks in the wiki if you haven't already, and as a last resort maybe try nzbget and see if that helps, as it should be tuned for high speed out of the box.
Should be able to under perfect or ideal conditions.
If you haven't already tweaked: https://sabnzbd.org/wiki/advanced/highspeed-downloading

Many have better luck with NZBGet due to how it's written. It may squeeze the rest of your speeds out for you. Might be worth a test install just to compare.
As you've presumably already downloaded the file, I would just rename it and set it there and retry.
Else, per the documentation I linked, there is a link within to the NZB specification, which shows an example file using the meta tags to set a password within the NZB file itself. You'll have to edit it and reload the NZB with the passphrase for it to work, though.
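If memory serves, the meta tag sits in the NZB's head section and looks roughly like this (check the spec before relying on it):

```xml
<nzb xmlns="http://www.newzbin.com/DTD/2003/nzb">
  <head>
    <meta type="password">your-passphrase-here</meta>
  </head>
  <!-- ...file/segment entries follow... -->
</nzb>
```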
You need an NZB client.
I've used Easynews for 15 years, you shouldn't have any issues with the nzb files from there.
You set up sab with the details of the Easynews servers (found on the Easynews site).
I use 20 connections on each of 3 of the servers.
Use the Easynews site in conjunction with an indexer (as mentioned) or directly from the web interface, download the resulting nzb to a folder being watched by sabnzbd and it will do the rest.
Have you checked the guide? https://sabnzbd.org/wiki/advanced/highspeed-downloading
In addition you should test with only one server (US, NL, DE at NGD) at a time and for instance 5 connections to find out which one is the fastest for you.
It doesn't do that obfuscating; that's just the name of the file as it was packaged by the uploader.
You can also read more here (not requiring scripts): https://sabnzbd.org/wiki/faq#deobfuscate
You can see if any of these suggestions on the SAB Wiki help. Without any tweaks I can get 100 MB/s (~800 mbps) with Ninja on a pretty wimpy Hetzner VPS.
When I use Agent I usually send the files to nzb and open in Sabnzbd.
Certainly that's all I'd ever do with large posts that need repairing.
Sab strikes me as a lot more intelligent than Agent for handling large posts that need repairing.
And has an option to pause downloads while processing. Which is what you and I need.
So you might be hitting a quickpar problem, or you might be hitting a problem where the par client is not as clever as sab at handling large posts. Or, of course, you might just be hitting garbage posts; it happens.
Agent is excellent for sampling small files, or selecting a few complete ones and sending to a d/l folder which I specify/create on the fly.
confused me at first too.
You're actually downloading a shit load of different parts of a file. If you're using SABnzbd you can see this by clicking the wrench on the top right, connections, then show "show active connections". You can be concurrently downloading as many file parts as your provider allows, but each connection is so fast that usually 10-20 will saturate your connection.
Basically, we're used to downloads being SO SLOW in bittorrent that trying to do multiple at a time made more sense. With usenet your connection gets saturated SO FAST that it's fine/normal for it to work on one file at a time, because files are broken down into pieces and it's eating your entire bandwidth at all times.
There's not a great native way to set priorities so Shows go before movies, but you can do it with a pre-queue script https://sabnzbd.org/wiki/scripts/pre-queue-scripts.html.
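As a rough sketch of what such a script could look like (the argument position for the category and the exact line-by-line output protocol are assumptions here — verify against the wiki page above before using):

```python
#!/usr/bin/env python3
"""Pre-queue sketch: bump TV jobs to High priority, leave the rest at Default.

SABnzbd passes the job details as command-line arguments and reads the
script's stdout line by line; which line controls what is documented on
the pre-queue wiki page. Treat the positions below as placeholders.
"""
import sys

def decide_priority(category: str) -> int:
    # 1 = High, 0 = Normal, -100 = Default on SABnzbd's priority scale
    return 1 if category.lower() == "tv" else -100

if __name__ == "__main__":
    category = sys.argv[3] if len(sys.argv) > 3 else ""  # position is a guess
    print(1)                          # accept the job
    print()                           # keep the name unchanged
    print()                           # keep the post-processing setting
    print()                           # keep the category
    print()                           # keep the script
    print(decide_priority(category))  # priority override
```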
Our friend /u/lead2gold (nuxref) keeps the CentOS repo very up to date!
https://sabnzbd.org/wiki/installation/install-fedora-centos-rhel
https://www.reddit.com/r/usenet/comments/71ida8/sabnzbd_230_released/dnb1qbl/
download NOOBS and install Raspbian. https://www.raspberrypi.org/downloads/noobs/
Then choose between sabnzbd and nzbget.
Let me know if you got any questions
Have you tried everything on this page?
https://sabnzbd.org/wiki/advanced/highspeed-downloading
What does it say in the Status window when it is slow? Is the cache filled all the way?
What if you download a real Ubuntu or other Linux iso?
Try reading this article; it goes into detail on what you're asking. https://sabnzbd.org/wiki/advanced/highspeed-downloading

Tl;Dr: Usenet providers give you a lot of connections for overhead. Start small and keep increasing the number of connections till you saturate your link.

Ex. With Newsdemon I get 50. However, I have it set to 30 and get no errors. Alternatively, with xsusenet's free plan I have a total of 5. I set it to 2.

You have to experiment.
Yes, takedowns can happen at any time when content creators see their content and request it to be taken down.
We have a special Wiki page about it: https://sabnzbd.org/wiki/introduction/downloads-cannot-be-completed.html
That's why we give the notification when you just install SABnzbd or start using Glitter to look at this page explaining all the neat tricks it can do :)
/u/TehJonny I released a new version, 2.2.1RC1 with only the small change that should make sure it does not get stuck for other reasons (happens also for some users)
The current Fedora/CentOS installation page is a bit of a mess since our package creator from long ago stopped (https://sabnzbd.org/wiki/installation/install-fedora-repo)
Would you maybe want to suggest how to re-write the top part? We want to keep the bottom part about running from sources, but the top part needs new text and links.
Shoot me an email at if you're interested :)
So Here's the full dump:
I've also tried downgrading the python-jaraco pkg as it is listed as breaking the latest.
How does one up the log level on the systemd service?
× [email protected] - SABnzbd binary newsreader
Loaded: loaded (/usr/lib/systemd/system/[email protected]; enabled; vendor preset: disabled)
Active: failed (Result: exit-code) since Fri 2022-01-28 14:17:16 EST; 2s ago
Docs: https://sabnzbd.org/wiki/
Process: 36821 ExecStart=/usr/lib/sabnzbd/SABnzbd.py --logging 1 --browser 0 --config-file /home/tom/.sabnzbd.ini (code=exited, status=1/FAILURE)
Main PID: 36821 (code=exited, status=1/FAILURE)
CPU: 527ms
Jan 28 14:17:16 archpod systemd[1]: [email protected]: Failed with result 'exit-code'.
Jan 28 14:17:16 archpod systemd[1]: [email protected]: Scheduled restart job, restart counter is at 5.
Jan 28 14:17:16 archpod systemd[1]: Stopped SABnzbd binary newsreader.
Jan 28 14:17:16 archpod systemd[1]: [email protected]: Start request repeated too quickly.
Jan 28 14:17:16 archpod systemd[1]: [email protected]: Failed with result 'exit-code'.
Jan 28 14:17:16 archpod systemd[1]: Failed to start SABnzbd binary newsreader.
As far as I know you can use value.queue.sizeleft instead of value.queue.mbleft and it will return a value that dynamically changes units.
I used https://sabnzbd.org/wiki/advanced/api#queue to find all the return values. Look at the example json under the Full Queue output section for what I assume is all the return values.
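As a sketch, pulling those fields out of the JSON response could look like this (the sample payload below is heavily trimmed and the exact values invented; see the API wiki for the full shape):

```python
import json

def queue_summary(payload: str):
    """Return (sizeleft, mbleft) from a SABnzbd mode=queue JSON response."""
    queue = json.loads(payload)["queue"]
    return queue["sizeleft"], queue["mbleft"]

# Trimmed example of what mode=queue&output=json might return.
sample = '{"queue": {"sizeleft": "1.2 GB", "mbleft": "1228.80"}}'
print(queue_summary(sample))
```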
On nzbget, i increased my article cache and my download speeds increased from 30 to 130 on an SSL port. Maybe on sabnzbd, you can do the same thing?
Different server propagation of new files. DMCA - files may sometimes exist on some servers and not others. ISP or local admin port throttling. Server down/error. More redundancy++
This page from Sabnzbd gives some good tips. Even gives an example of Celeron J4105 getting 80MB/s so OP should be able to hit those speeds.
I will also say the speed test is not accurate for me. Showing a much slower speed than I get when downloading.
Here are my server setups.
Have you checked all the tips in https://sabnzbd.org/wiki/advanced/highspeed-downloading?
Are both actually running on the same server, or just servers that are supposedly the same? 11 MB/s makes it seem like it's capped at 100 Mbit/s, which could indicate a network/cable problem.
If you are using Docker and trust the module you can try running it with the --privileged parameter. That will turn off security measures, which may speed it up. At least you'll know if that's the problem. Running things under Docker can create a noticeable slow down, particularly if the host OS and Docker OS are different.
sabnzbd has a wiki with info on testing etc: https://sabnzbd.org/wiki/advanced/certificate-errors.html
It is for sure affecting a number of providers. The ones I tested come up with an error. To keep going, disable certificate verification for the servers. All I can say is that the Sabnzbd chat has this in it:

> Cert errors are expected, the providers will replace them pretty quickly.
NewsHosting sub here, and I have been using Nzbplanet for my nzbs. I am getting a lot of failed nzb downloads with the error "Aborted, cannot be completed - https://sabnzbd.org/not-complete".

Why is that happening, and what can I do differently to stop it happening? I would prefer to receive my files after I have spent the time looking through the filters and identifying an nzb. It's heart-wrenching to see an asset so close and then not get it.

Thanks in advance for feedback!
SABnzbd has a built-in test-download.
As you use NZBget, maybe you can simulate it by downloading this reference post: https://sabnzbd.org/tests/test_download_1000MB.nzb

If that fails too, your setup is wrong.
One option to consider is just running an http server locally and using the browser as your interface (along the lines of what sabnzbd does). Considering how go pretty much gives you everything out of the box to run a Web server, this approach has some virtues, though it may confuse nontechnical users.
192.* should be allowed but it's possible that it gets confused by the .09 IP address. I have no idea why that would happen.
Use indexer tags next to the category you want to sort your downloads into. For example your TV category can have “tv*” as the indexer tag and any NZB coming from an indexer’s TV > HD or TV > SD category will be automatically sorted to that Sabnzbd category.
See: https://sabnzbd.org/wiki/configuration/3.2/categories
Indexer tags are explained at the bottom of that Wiki page
>I am having to always manually choose the category in the SAB download list.
And how do you determine the category? Based on words in the filename / post name?
If you can describe that to a computer, you could use https://sabnzbd.org/wiki/scripts/pre-queue-scripts
>Not blank, 0.0.0.0
the better option anyway
>Yup. 192.168.50.0/24, 10.0.0.0/24 in my case. SABnzbd is in a DMZ network and I am in a LAN. DMZ as in the proper enterprise sense of the word, not the home router sense of the word.
Okay, and is HAproxy running in the same Docker container? If not, on the same network? (I guess 10.0.0.x)
>Full web interface.
Okay, then the "List of local network ranges" shouldn't matter.
>URL base I've tried /SABnzbd and I've tried just / and no change. Host_whitelist wasn't actually provided for my URL so I added it and still no change.
Stick to the lowercase "/sabnzbd" and configure it in the forwarding rule in the HAproxy
for the Host_whitelist see: https://sabnzbd.org/wiki/extra/hostname-check.html
Do you have anything in the logfiles of ha or sabnzbd?
According to their own wiki page, they suggest disabling Direct Unpack and pausing while post-processing.
I did this and downloading, unpacking, and moving of the file was much faster.
OK, interesting.
Have you tried this: stop SABnzbd, find sabnzbd.ini, edit the file, change the relevant "connections = " to ... 150. Then start SABnzbd.
EDIT: location of sabnzbd.ini: https://sabnzbd.org/wiki/advanced/directory-setup
If you haven't done this already, you can try the steps here: https://sabnzbd.org/wiki/advanced/highspeed-downloading - it definitely helped me with my gigabit connection.
in the same vein as what the other poster said, I'd highly recommend an SSD that just handles files during the download / repair process (like a scratch / cache drive, essentially) and copying the final files over to their correct destination on your storage drives. It definitely helps if those drives you have are actually SMR drives*, but even just splitting the responsibilities for the drives is a nice to have if you can get it.
(*IIRC drive manufacturers explicitly recommend this sort of cache drive setup for SMR drives when write speed is important, since you then write a file once that you'll later read multiple times, which is what these drives are designed for)
>I'm currently getting between 50 and 55 mbps.
>
>Connection set to 1000 mbps and line dedication is 100%.
>
>Does this look the type of speed I should be getting?
No, certainly not.
... but I think you're mixing up bits and Bytes. See https://sabnzbd.org/wiki/faq#toc31
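The arithmetic, for reference: network speeds are quoted in bits, SABnzbd reports Bytes, and 1 Byte = 8 bits.

```python
def mbps_to_MBps(megabits: float) -> float:
    """Convert a line speed in megabits/s to megabytes/s (8 bits per Byte)."""
    return megabits / 8

print(mbps_to_MBps(1000))  # a 1000 mbps line tops out around 125 MB/s
print(50 * 8)              # 50 MB/s shown in SABnzbd is already 400 mbps
```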
This is a great reference for SAB speed tuning, which really in theory is a great guide for any newsreader.
Based on https://sabnzbd.org/wiki/advanced/highspeed-downloading I would say:
Don't you use categories or labels in SAB? So you tell sab that movies go to /download/movies and TV to go to /download/tv
https://sabnzbd.org/wiki/configuration/3.0/categories
In sonarr and radarr you define which category to use, and in SAB you define where that category goes.
Same
2020-09-09 00:56:59 ERROR Server eu.newsdemon.com uses an untrusted certificate [Certificate not valid. This is most probably a server issue.] - Wiki: https://sabnzbd.org/certificate-errors

2020-09-09 00:57:03 ERROR Server news.newsdemon.com uses an untrusted certificate [Certificate not valid. This is most probably a server issue.] - Wiki: https://sabnzbd.org/certificate-errors
https://sabnzbd.org/wiki/installation/install-debian
>The PPA for SABnzbd for Ubuntu seems to work for Debian too. You need to read and follow the instruction by the author of the Ubuntu PPA (JCFP) in this forum thread.
I go to the thread (https://forums.sabnzbd.org/viewtopic.php?f=16&t=9844) and it says:
>Edit: with version 3.0.0, direct use of the PPA on Debian is no longer an option. If you have such as setup, remove the PPA from your sources.list and go with creating a backport instead. The source packages from the PPA are an excellent basis for those, and that remains the case for 3.0.0.
Perhaps I misunderstood what I need to do
As others have said, they are a bunch of files that are used to rebuild any missing parts of the archive.
This is due to the way binaries are posted: usenet is designed as a text discussion system, so to post binaries they are split up into thousands of chunks of text (encoded using an encoding called yEnc) and each article is posted individually.
An NZB file is essentially just an xml file that contains references to these thousands of posts that make up each file.
Because of this, sometimes posts go missing or aren't properly shared between the various providers that host usenet. A par set gives enough information to repair the binary if any of these text chunks are missing.
The most common usenet downloaders, nzbget and sabnzbd do not actually download the par2 files unless they are needed, so while the reported size might be larger, the total download is usually the same unless repair is required, and even then it will only download as many blocks as are needed.
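For a feel of what that XML looks like, a stripped-down NZB is roughly this (message-ids, sizes, and names invented for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<nzb xmlns="http://www.newzbin.com/DTD/2003/nzb">
  <file poster="someone@example.com" date="1600000000"
        subject="Some.Release.part01.rar (1/2)">
    <groups>
      <group>alt.binaries.example</group>
    </groups>
    <segments>
      <segment bytes="512000" number="1">abc123@news.example</segment>
      <segment bytes="512000" number="2">def456@news.example</segment>
    </segments>
  </file>
</nzb>
```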
Hey, check the bottom of this page: https://sabnzbd.org/wiki/advanced/sabnzbd-as-a-windows-service
Specifically the last section, Enabling Network Share Access.
I set the service logon to my user account, bingo.
Refer to the API help, https://nzbfinder.ws/apihelp. The feed address should look like this: https://nzbfinder.ws/api?t=search&cat=<your_category>&apikey=<your_apikey>&q=<your_search_words>
Use that as the "Feed URL" in SABnzbd's RSS section, give the feed a meaningful name, save. For filtering inside the results of that feed, refer to https://sabnzbd.org/wiki/configuration/3.0/rss.
Sab version 3.0.0-develop. Interesting, because I am using the "nobetas" option. According to the website that should only give final releases, but I guess it includes release candidates.
For LL, I was using the previous release. It just updated to the newest one a couple hours ago, said it had 4 new commits. I don't remember the hash to the previous one. Now I am on 6915b5ff303efb866546a1b743dd0bee2608e695.
In the end, it sort of solved itself, because when LL re-ran the job to look for missing books, it sent all the books to Sab, then the post processor ran a couple minutes later and found all the books that had already been downloaded. But then the other copies also downloaded so I just had to go manually delete them from my sab download. In order for this to work, though, I had to uncheck the "blacklist downloaded" option so that it would make sure to find at least one copy of the books, since some had been tried 3 times already.
I think it was just a hiccup this one time, because it had worked previously as well as after this. It would seem to me that the solution would be for it to "import" the book if it is wanted even if it is not marked as snatched, similar to sonarr and radarr.
I just got this error again 26min ago.
Server news.newsgroup.ninja uses an untrusted certificate [Certificate hostname mismatch: the server hostname is not listed in the certificate. This is a server issue.] - Wiki: https://sabnzbd.org/certificate-errors
Here are the logs from the latest time this happened:
2020-07-02 05:23:39,631::INFO::[downloader:490] : Initiating connection
2020-07-02 05:23:39,805::INFO::[happyeyeballs:138] Quickest IP address for news.newsgroup.ninja (port 563, ssl 1, preferipv6 True) is 69.16.179.22
2020-07-02 05:23:39,807::INFO::[downloader:490] : Initiating connection
2020-07-02 05:23:39,809::INFO::[downloader:490] : Initiating connection
2020-07-02 05:23:39,810::INFO::[downloader:490] : Initiating connection
2020-07-02 05:23:39,812::INFO::[downloader:490] : Initiating connection
2020-07-02 05:23:39,813::INFO::[downloader:490] : Initiating connection
2020-07-02 05:23:39,814::INFO::[downloader:490] : Initiating connection
2020-07-02 05:23:39,818::INFO::[downloader:490] : Initiating connection
2020-07-02 05:23:39,864::INFO::[newswrapper:243] Certificate error for host news.newsgroup.ninja: hostname 'news.newsgroup.ninja' doesn't match either of '*.sslusenet.com', 'sslusenet.com'
2020-07-02 05:23:39,864::ERROR::[newswrapper:257] Server news.newsgroup.ninja uses an untrusted certificate [Certificate hostname mismatch: the server hostname is not listed in the certificate. This is a server issue.] - Wiki: https://sabnzbd.org/certificate-errors
2020-07-02 05:23:39,865::INFO::[newswrapper:269] Failed to connect: Server news.newsgroup.ninja uses an untrusted certificate [Certificate hostname mismatch: the server hostname is not listed in the certificate. This is a server issue.] - Wiki: https://sabnzbd.org/certificate-errors :563
As I mentioned here, this is not the server's or provider's fault; it is more than likely related to your cipher settings.
If you're specifying a cipher on your setup page, change it to AES128-SHA, AES256-SHA or leave the option blank. If it's blank already, try adding AES128-SHA or AES256-SHA.
Good luck!
https://sabnzbd.org/wiki/advanced/ssl-ciphers
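Before pasting a cipher name into the server settings, you can check whether your local OpenSSL build actually knows it. A quick sketch using Python's standard `ssl` module:

```python
import ssl

def cipher_available(name: str) -> bool:
    """Return True if the local OpenSSL build accepts this cipher string."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    try:
        ctx.set_ciphers(name)
        return True
    except ssl.SSLError:
        return False

# The two ciphers suggested above:
for name in ("AES128-SHA", "AES256-SHA"):
    status = "available" if cipher_available(name) else "not in this OpenSSL build"
    print(name, "->", status)
```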
If you leave that option empty you will be asked for the category whenever you send a download to sabnzbd in hydra.
Otherwise hydra will not set a category when sending a result to sabnzbd so sab will have to decide which category to use. See https://sabnzbd.org/wiki/configuration/2.3/categories for that. That's more a question related to sab than to hydra.
While the download is in the Queue (downloading or not), hover over the file and you will see a few icons, among them a Folder icon. Click on it and fill in the password.
And see https://sabnzbd.org/wiki/advanced/password-protected-rars
Which version of SABnzbd?
In sabnzbd.log find the line "Decoder failure: Out of memory", and copy-paste 15 lines from there into this thread. Especially interesting is the line with "Traceback", which should be there.
>Article Cache Limit is set to: -1; which I've been using that setting for years.
But still: what happens when you set that to 500M? Because -1 might be understood as something very big (for example 2^32 - 1 or so).
Maybe relevant: https://sabnzbd.org/wiki/introduction/known-issues#:~:text=If%20you%20get%20the%20Decoder,as%20a%20Service%20but%20regular.
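For illustration, "-1 understood as something very big" is exactly what happens when a signed -1 is reinterpreted as an unsigned 32-bit value:

```python
# Two's-complement -1 has all 32 bits set, so read as unsigned
# it becomes the maximum 32-bit value (an effectively unbounded cache).
unsigned = (-1) & 0xFFFFFFFF
print(unsigned)             # 4294967295
print(unsigned == 2**32 - 1)  # True
```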
> I would prefer to just download the 64K.
Those nzbs apparently have "32K" or "64K" in their titles. The solution is simple then: Use that RSS feed's filters (in SABnzbd) to require "64K" or to reject "32K". The result being that only the 64K files match and are downloaded. (wiki)
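The require/reject behaviour of those RSS filters can be sketched like this (a simplified model of substring matching on titles, not SABnzbd's actual filter code):

```python
def matches(title: str, require: str = None, reject: str = None) -> bool:
    """Keep a title only if it contains `require` and lacks `reject`."""
    if require and require.lower() not in title.lower():
        return False
    if reject and reject.lower() in title.lower():
        return False
    return True

titles = ["Show.S01E01.64K", "Show.S01E01.32K", "Show.S01E02.64K"]
kept = [t for t in titles if matches(t, require="64K")]
print(kept)  # ['Show.S01E01.64K', 'Show.S01E02.64K']
```

Rejecting "32K" instead of requiring "64K" gives the same result for these titles; requiring is the safer filter if other bitrates might appear in the feed.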
> there are 25 matched items in there
If you used an RSS url like the one I mentioned, that already has the search terms included, so in that case you actually did tell DOG to retrieve something.
The other solution mentioned by /u/ultimatefireball (category with folder plus asterisk at the end) is simpler than date sorting. As I understand it, it will just put the files in the same folder "as they are". If the file names are not meaningful though, but some gibberish (that may happen), date sorting may make it possible to set the filename. Or you could try SABnzbd's built-in Deobfuscate.py script (wiki).
The default filter "*" matches everything, that's why you had 25 matched items. Change it to the name of the show you want.
If you don't want a separate folder for every download, just make sure the folder on the Categories page ends with an asterisk (wiki). My advice would be to create a new Radio category, which you should also set for the RSS feed.
Did you click on that URL https://sabnzbd.org/wiki/advanced/certificate-errors.html and read that info?
Did you fill out your (ninja?) newsserver there and click on "Test Server"? Which server do you use?
How would that work?
I don't think this has the functionality I require:
https://sabnzbd.org/wiki/scripts/pre-queue-scripts
Am I looking at the wrong thing?
Yea I followed all of this: https://sabnzbd.org/wiki/advanced/highspeed-downloading
Before that I was not getting over 50 but now I get much higher. I also ran that speed test in SAB and my HDD transfer rate was 110 MB/s
Changing the SSL cipher you use can give you a noticeable performance boost if your speeds are being CPU limited. It doubled my speeds when I ran things on an old laptop. You can also try some of the other tricks on SAB's High Speed Download Guide
Not necessarily. The best thing you can do is start at 5 connections and increase until your speed decreases; you then have your best setting. Here is SABnzbd's advice:

> Investigate your connection count in Config->Servers. It may seem counter-intuitive, since more connections should be faster than fewer connections, but if you use the 50+ connections some hosts give you, the overhead from constantly opening and closing connections can slow you down. So start at the max allowed connections and slowly lower your count until you max out your speed.

Or do it the other way round: start with 5 connections, measure the speed, raise to 7, measure again, then 9, measure again, and so on. Normally 10 connections are enough.
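That bottom-up tuning loop can be sketched as code. `measure_speed` here is a stand-in for watching SAB's speed graph after each change; the toy model at the end just illustrates a speed curve that peaks and then drops from connection overhead:

```python
def best_connection_count(measure_speed, start=5, step=2, limit=50):
    """Raise the connection count step by step and stop as soon as
    the measured speed stops improving; return the best count found."""
    best_n, best_speed = start, measure_speed(start)
    n = start + step
    while n <= limit:
        speed = measure_speed(n)
        if speed <= best_speed:
            break  # more connections no longer help
        best_n, best_speed = n, speed
        n += step
    return best_n

# Toy model: speed scales up to 11 connections, then overhead hurts.
toy = lambda n: min(n, 11) * 10 - max(0, n - 11) * 5
print(best_connection_count(toy))  # 11
```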
This? Does this mean it's a usenet server issue and that I should get a second server subscription?
Servers: NEWS.USENETSERVER.COM = 73 KB
Download failed - Not on your server(s) - https://sabnzbd.org/not-complete
I just set these, so no idea if it will work. But check out the "nice" switches; there are options for par2, nice, and ionice to set the priority of SAB's access.
SABnzbd has a setting:
Maximum retries
To prevent deadlock, SABnzbd will only try each server a limited amount of times. You can increase the value or set it to 0 to enable endless retries.
Look in the file sabnzbd.ini
https://sabnzbd.org/wiki/configuration/2.3/switches
or google sabnzbd retries
More info:
Can SABnzbd run on NAS / Android / operating system XYZ / …? SABnzbd can run on an operating system that provides / runs:

- Python 2.7
- the Python library Cheetah
- par2 and unrar
- optionally the Python library yEnc

The creators of SABnzbd provide the source code of SABnzbd, plus ready-to-install-and-run packages for Windows and macOS. Others provide (or don't provide) packages for other operating systems. If you want SABnzbd on another operating system, search for an existing package for that OS, or make it work yourself. On top of the above, SABnzbd requires some amount of RAM: more is better, especially for high-speed Internet connections and/or big downloads. 256MB of RAM (or less) without swap space will cause problems and crashes.
> 1.1.1 is the most recent version in the repository, and I know just enough Linux to probably bork everything up if I tried to upgrade on my own. So I went with Nzbget and it seems to be working much better with the files.
Yes, of course, because 1.1.1 is ancient, about 3 years old. The latest SABnzbd can be installed from the PPA repo that we provide; you could have read the documentation: https://sabnzbd.org/wiki/installation/install-unix
1) Sign up to a cheap service like fastusenet, which is $4.50 a month
2) Download a usenet client like sabnzbd
3) Download NZBs (the usenet equivalent of torrents) from sites like nzbplanet or usenet-4all