1) If an indexer has the option to search a particular category for a string such as "MLB <year>", you could create an RSS feed off that search and then have NZBGet/Sabnzbd+ poll it every 30 minutes or so to grab any match
2) Use NZBGet to filter out "MLB <year>" from an indexer's TV > Sport RSS feed, and then poll that every 30 minutes. NZBGet has a pretty powerful parser that can do a lot; the documentation is here: http://nzbget.net/RSS. I believe Sabnzbd has RSS filtering as well, but I don't know much about it since I don't use it. Link here: http://wiki.sabnzbd.org/configure-rss
With both methods you can automate sport events/shows using a Newznab (or even nzedb) indexer without a third-party program
Edit: Added NZBGet/Sabnzbd RSS link
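For reference, a minimal NZBGet feed filter for method 2 might look like the sketch below (rule syntax per nzbget.net/RSS; the year is a placeholder, and the trailing Reject may be redundant depending on your version's default behavior):

```
# Feed filter sketch -- plain words match the release title
Accept: MLB 2015
Reject: *
```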
I admit that I'm on a low-powered NAS, but can SAB do all this? When I used it last a couple of years ago it didn't; maybe things have changed?
http://nzbget.net/forum/viewforum.php?f=8
Try this script instead; I use this one from their forum. Hope it helps:
[Unit]
Description=NZBGet Daemon
Documentation=http://nzbget.net/Documentation
After=network.target

[Service]
User=<replace_with_the_user_you_want>
Group=<replace_with_the_group_you_want>
Type=forking
ExecStart=</path/to/nzbget/nzbget> -D
ExecStop=</path/to/nzbget/nzbget> -Q
ExecReload=</path/to/nzbget/nzbget> -O
KillMode=process
Restart=on-failure

[Install]
WantedBy=multi-user.target
I actually see that you commented on my thread on the NZBGet forums.
Haha, small world, this.
The 90 connections have to be an issue here, but I don't know if anybody has worked with the SAB devs on a connection faster than 100 Mb/s.
Hey, I'm the author of the NotifyPlex script for NZBGet. Glad to see someone wrote one for Sab! One thing: for anyone that uses Plex Home, the library refresh method won't work, because you need to authenticate with plex.tv first and get an authentication token. Maybe something to look into?
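As a rough sketch of what that authentication step involves: the classic plex.tv sign-in takes HTTP Basic credentials plus a few X-Plex headers and returns XML containing an authentication token. The endpoint, product name, and client ID below are assumptions from memory; verify against current Plex docs before relying on them.

```python
import base64
import urllib.request

# Classic (pre-OAuth) plex.tv sign-in endpoint -- verify against current Plex docs
SIGN_IN_URL = "https://plex.tv/users/sign_in.xml"

def build_sign_in_request(username, password, client_id):
    """Build the POST request Plex expects: HTTP Basic auth plus X-Plex headers.

    `client_id` is any stable unique string identifying your script.
    """
    creds = base64.b64encode(f"{username}:{password}".encode()).decode()
    req = urllib.request.Request(SIGN_IN_URL, data=b"", method="POST")
    req.add_header("Authorization", "Basic " + creds)
    req.add_header("X-Plex-Client-Identifier", client_id)
    req.add_header("X-Plex-Product", "NotifyPlex")  # hypothetical product name
    return req

# Sending the request returns XML with an authentication-token attribute, which
# you then pass as X-Plex-Token on later library-refresh calls:
# resp = urllib.request.urlopen(build_sign_in_request("user", "pass", "notifyplex-01"))
```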
There is a PP-Script for NZBGet that can do this, that I ported over from the Sabnzbd+ version: http://nzbget.net/forum/viewtopic.php?f=8&t=1474
The Sabnzbd+ version of the script is still out there too.
Perhaps you can have your download client run the script?
It's a queue script, and it's in "beta", I guess you could say. It attempts to peek inside the RARs while downloading to detect any unwanted extensions. It's not a post-processing script, which would only scan after the download and doesn't really make much sense here.
Here is the link to the forum thread, but I'm assuming this is what you had found anyway: http://nzbget.net/forum/viewtopic.php?f=8&t=1394
I run NZBGet on OSX, so I can't comment on how it would perform on your machine, but I can assure you that it has a relatively small CPU and memory footprint and runs very efficiently. It should not interfere with your other programs that are running, and as I explain below, NZBGet won't really need to do RSS pulls or periodic folder checks. The only time you will see a hit is when you're unpacking, repairing, and downloading...though any NZB client will take up resources doing that.
In terms of automation, NZBGet has many features that can make that happen. If you are a Dog user (like you said), you can set up Remote Push so that you can instantly send the NZB from Dog to NZBGet with the click of a button. You can even set up a Movie and TV Watchlist to look out for your movies/shows and they will automatically be sent to NZBGet when they are released/indexed. You won't even need to set up RSS/Bookmarks or a "watch" folder at all. However, you will need to make NZBGet accessible from the WAN side (port forwarding), so that Dog can connect to NZBGet.
There are also many post-processing scripts for NZBGet that will sort/rename your files, notify you, refresh your Plex/XBMC libraries, etc. when the download is finished.
I have the same problem for months now, but with NZBGet. For me it's not working at all and nothing gets downloaded. This is the message I get in NZBGet:
TLS certificate verification failed for usenet.premiumize.me: certificate has expired. For more info visit http://nzbget.net/certificate-verification
> cp /usr/lib/systemd/system/nzbget.service /etc/systemd/system/nzbget.service
After running systemctl daemon-reload, same thing.
systemctl status nzbget.service

● nzbget.service - NZBGet Daemon
     Loaded: loaded (/etc/systemd/system/nzbget.service; enabled; vendor preset: enabled)
     Active: failed (Result: exit-code) since Thu 2020-09-17 23:29:51 UTC; 54min ago
       Docs: http://nzbget.net/Documentation

Sep 17 23:29:51 homeserver systemd[3043]: nzbget.service: Failed at step EXEC spawning /opt/nzbget/nzbget: No such file or directory
Sep 17 23:29:51 homeserver systemd[1]: nzbget.service: Control process exited, code=exited, status=203/EXEC
Sep 17 23:29:51 homeserver systemd[1]: nzbget.service: Failed with result 'exit-code'.
Sep 17 23:29:51 homeserver systemd[1]: Failed to start NZBGet Daemon.
Sep 17 23:29:51 homeserver systemd[1]: nzbget.service: Scheduled restart job, restart counter is at 5.
Sep 17 23:29:51 homeserver systemd[1]: Stopped NZBGet Daemon.
Sep 17 23:29:51 homeserver systemd[1]: nzbget.service: Start request repeated too quickly.
Sep 17 23:29:51 homeserver systemd[1]: nzbget.service: Failed with result 'exit-code'.
Sep 17 23:29:51 homeserver systemd[1]: Failed to start NZBGet Daemon.
I get connection closed by remote host. When I use the IP, I get:
TLS certificate verification failed for 178.20.174.131: certificate hostname mismatch (news.usenet.farm). For more info visit http://nzbget.net/certificate-verification
If it helps any, I use my local DNSCrypt server for DNS with DNSSEC and DNS over HTTPS enabled, pointing to Cloudflare's secure 1.1.1.1 DNS servers, so I don't think it's an ISP issue since I'm bypassing my ISP for DNS.
Thanks for the link. I also found the 15.0 release notes. I'm using the Synology Community package which hasn't been updated to 15.0 yet, but I'm looking forward to the new version.
Look in the Paths settings to see what you have set as your MainDir. Your scripts directory should then be MainDir\scripts.
For example, if you have MainDir set as C:\nzbget, then your scripts directory would be C:\nzbget\scripts. Seems like you figured that part out.
Along with Email.py and Logger.py, you can place other scripts in there. VideoSort, however, isn't just one .py file; it's a collection of libraries along with a .py file. So you need to unzip the VideoSort.zip file and drop the entire "VideoSort" folder into the scripts folder next to Logger.py and Email.py. Once you do that, go back into your NZBGet settings, and you will see VideoSort near Email and Logger on the bottom left. To add other scripts that aren't zipped, you just drop the .py files into that same folder. So depending on the script you download, you either drop in the entire folder (which has the .py file(s) inside it) or just the .py files themselves. If a script comes as a .txt document, which is weird, you can technically save the file as a .py and do the same thing. Most PP-scripts are written in Python (hence the .py) and should NOT just be .txt files. NZBGet will only recognize .py and .sh files, I believe.
EDIT: http://nzbget.net/Catalog_of_post-processing_scripts is the link for the PP-scripts catalog for NZBGet. There are more in the NZBGet forums in the Extension Scripts topic also.
I very recently helped a friend who was getting this "stop at 99%" thing by introducing him to NZBGet. It instantly solved the issue. You should give it a try.
If you're on Windows, you can try the latest beta version, which comes with a noob-friendly (no offense meant) installer.
Instructions here if you wish to have it running as a Service : http://nzbget.net/Installation_on_Windows#Version_15_or_newer
What about your CPU/HDD usage when you download from the newsgroups? Maybe the hardware is limiting your performance.
By the way, 95 threads with SSL/TLS activated takes a bunch of CPU... Try fewer, between 30 and 60. And are you sure your providers give you 95 connections to use? My provider gives me 30 connections, and I can easily max out 100-150 Mbps.
NZBGet is another great client you might want to try. "NZBGet is written in C++ and designed with performance in mind to achieve maximum download speed by using very little system resources." (http://nzbget.net)
Maybe an ISP limitation; try turning on SSL/TLS and using different ports. Port 443 should work best for encrypted traffic, as it's the same port as HTTPS. Check your configuration with your newsgroup providers.
I'm sure you could make a post-processing script for that: http://nzbget.net/Extension_scripts
But if programming isn't your thing, maybe head over to http://nzbget.net/forum/viewforum.php?f=8 and make a request.
kk, drop that software and try something else.
Sabnzbd or nzbget.
I'm not sure if Mimo will automate repair with the new stuff.
Sabnzbd and nzbget tend to work best as they keep up on new posting styles.
It's pretty simple. Assuming you have a DOGnzb account: All you have to do is go to "Watchlist > Movies", Search for the Movie you want, select it from the auto-search Drop-Down (if it doesn't show up there you can just use the imdbID in the search box), choose from up to 12 quality settings, and press Add Item.
You can also select multiple qualities, and the Watchlist will keep pushing releases to you until you've received the highest quality version that you selected. Make sure the "Keep looking for better" option is enabled in your Watchlist settings.
If the movie you're adding is already indexed (at the qualities you've selected), it will show you the available releases so you can Push them right away. If not, you can Watchlist it anyway. If the movie has not been released/indexed yet, it will be pushed to you as soon as DOGnzb indexes a release matching the qualities you set.
You can then integrate this setup with your download client so that it maps the categories using Aliases or Indexer tags, so that it can be put into the right category for post-processing. There are many ways to post-process the movie after it's done downloading. Either by sending it back to CouchPotato for renaming/sorting with nzbToMedia scripts, by using Sabnzbd's built-in renamer/sorter, or by using VideoSort for NZBGet.
Like everyone has mentioned, you're less likely to get them with good indexers.
But if you do get a bad download, NZBGet can detect and delete password-protected and fake downloads very early while downloading. So you might download ~50 MB of a multi-GB download, and if NZBGet finds it's bad, it can delete it automatically and take other actions.
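If memory serves, the knobs for this live in nzbget.conf; a sketch (option names from memory, values illustrative -- check your own settings page):

```
# nzbget.conf excerpt (illustrative)
HealthCheck=delete    # act on downloads whose health drops below critical
DupeCheck=yes         # also skip known duplicates
```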
You can do that manually by following this forum post. But if you run it on your local machine, it might be easier to just save them into the NzbDir. You could adjust the NzbDirInterval and NzbDirFileAge options to speed up scanning.
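Sketch of the relevant nzbget.conf entries (the defaults shown are from memory; check your own config):

```
NzbDir=${MainDir}/nzb   # drop .nzb files here for automatic pickup
NzbDirInterval=5        # seconds between scans of NzbDir
NzbDirFileAge=60        # seconds a file must sit unchanged before it's processed
```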
I probably should have made a text post, you're right to be cautious of random binaries. I build these myself and decided to share them since building on windows isn't as easy as on linux. I'm active on the NZBGet forum with a similar nick and have linked them there in the past. Letting it run through virustotal shows that I didn't include any crapware.
It was also pretty bad timing, since a new official testing build was released shortly after I submitted the link, so there's no need to try mine for now anyway.
I use NZBGet with duplicates handled by dupe keys. http://nzbget.net/RSS#Duplicates
Once you create your dupe-key filter, just paste it into each of your RSS feeds.
Most of the people that run nzbget/sabnzbd tend to run on Linux or NASs, as that's really where nzbget shines.
You could also go over to the NZBGet forums; hugbug, the developer, I'm sure could look into this more for you. http://nzbget.net/forum/?title=forum
If you have a linux computer or can run a VM, I'd try on Linux, to see if it's nzbget or the OS.
I'm trying the US server. I've tried just about every port. I have all the settings you posted. It returns a TLS error: TLS certificate verification failed for news.newsgroupdirect.com: certificate has expired. For more info visit http://nzbget.net/certificate-verification
Oh yeah! Thank you, very much appreciate your help:
● nzbget.service - NZBGet Daemon
     Loaded: loaded (/etc/systemd/system/nzbget.service; enabled; vendor preset: enabled)
     Active: active (running) since Fri 2020-07-03 12:28:23 +04; 5s ago
       Docs: http://nzbget.net/Documentation
    Process: 173447 ExecStart=/home/funkcanna/nzbget/nzbget -c /home/funkcanna/nzbget/nzbget.conf -D (code=exited, status=0/SUCCESS)
   Main PID: 173460 (nzbget)
      Tasks: 68 (limit: 38400)
     Memory: 121.8M
     CGroup: /system.slice/nzbget.service
             └─173460 /home/funkcanna/nzbget/nzbget -c /home/funkcanna/nzbget/nzbget.conf -D
Yeah, it's been rebooted a few times. I rebooted again and got the following (sorry for the formatting, I'm on an iPad and it's doing weird things with the formatting!):
● nzbget.service - NZBGet Daemon
     Loaded: loaded (/etc/systemd/system/nzbget.service; enabled; vendor preset: enabled)
     Active: failed (Result: exit-code) since Thu 2020-07-02 19:46:48 +04; 2min 48s ago
       Docs: http://nzbget.net/Documentation
    Process: 1036 ExecStart=/opt/nzbget/nzbget -c /opt/nzbget/nzbget.conf -D (code=exited, status=203/EXEC)

Jul 02 19:46:48 Lee-Ubuntu systemd[1]: Stopped NZBGet Daemon.
Jul 02 19:46:48 Lee-Ubuntu systemd[1]: Starting NZBGet Daemon...
Jul 02 19:46:48 Lee-Ubuntu systemd[1]: nzbget.service: Control process exited, code=exited, status=203/EXEC
Jul 02 19:46:48 Lee-Ubuntu systemd[1]: nzbget.service: Failed with result 'exit-code'.
Jul 02 19:46:48 Lee-Ubuntu systemd[1]: Failed to start NZBGet Daemon.
Jul 02 19:46:48 Lee-Ubuntu systemd[1]: nzbget.service: Scheduled restart job, restart counter is at 5.
Jul 02 19:46:48 Lee-Ubuntu systemd[1]: Stopped NZBGet Daemon.
Jul 02 19:46:48 Lee-Ubuntu systemd[1]: nzbget.service: Start request repeated too quickly.
Jul 02 19:46:48 Lee-Ubuntu systemd[1]: nzbget.service: Failed with result 'exit-code'.
Jul 02 19:46:48 Lee-Ubuntu systemd[1]: Failed to start NZBGet Daemon.
Then I tried to start it and got:
Job for nzbget.service failed because the control process exited with error code. See "systemctl status nzbget.service" and "journalctl -xe" for details.
EDIT: I ran journalctl -xe and there was lots of stuff, but one of note was:
Jul 02 19:51:17 Lee-Ubuntu systemd[3037]: nzbget.service: Failed to execute command: No such file or directory
Jul 02 19:51:17 Lee-Ubuntu systemd[3037]: nzbget.service: Failed at step EXEC spawning /opt/nzbget/nzbget: No such file or directory
Thanks, appreciate it:
[Unit]
Description=NZBGet Daemon
Documentation=http://nzbget.net/Documentation
After=network.target

[Service]
Type=forking
ExecStart=/home/funkcanna/nzbget/nzbget -D
ExecStop=/home/funkcanna/nzbget/nzbget -Q
ExecReload=/home/funkcanna/nzbget/nzbget -O
KillMode=process
Restart=on-failure

[Install]
WantedBy=multi-user.target
>systemctl status nzbget.service
nzbget.service - NZBGet Daemon
     Loaded: loaded (/etc/systemd/system/nzbget.service; enabled; vendor preset: enabled)
     Active: failed (Result: exit-code) since Wed 2020-07-01 14:44:36 +04; 1h 12min ago
       Docs: http://nzbget.net/Documentation

Jul 01 14:44:36 Lee-Ubuntu systemd[1]: nzbget.service: Scheduled restart job, restart counter is at 5.
Jul 01 14:44:36 Lee-Ubuntu systemd[1]: Stopped NZBGet Daemon.
Jul 01 14:44:36 Lee-Ubuntu systemd[1]: nzbget.service: Start request repeated too quickly.
Jul 01 14:44:36 Lee-Ubuntu systemd[1]: nzbget.service: Failed with result 'exit-code'.
Jul 01 14:44:36 Lee-Ubuntu systemd[1]: Failed to start NZBGet Daemon.
Not sure if this is any help, but I would try simplifying the startup. Maybe don't use a pre-written config; instead, set everything up in the web console.
Try user root? I'm sure that's frowned upon, though.
Here is my working unit file:
[Unit]
Description=NZBGet Daemon
Documentation=http://nzbget.net/Documentation
After=network.target

[Service]
User=root
Group=root
Type=forking
ExecStart=/opt/nzbget/nzbget -D
ExecStop=/opt/nzbget/nzbget -Q

[Install]
WantedBy=multi-user.target
If you're using Linux and a systemd service, you can add 'Restart=on-failure' to the '[Service]' section like:
[Unit]
Description=NZBGet Daemon
Documentation=http://nzbget.net/Documentation
After=network.target

[Service]
User=nzbget
Group=nzbget
Type=forking
ExecStart=/usr/bin/nzbget -D -c /var/lib/nzbget/nzbget.conf
ExecStop=/usr/bin/nzbget -Q -c /var/lib/nzbget/nzbget.conf
ExecReload=/usr/bin/nzbget -O -c /var/lib/nzbget/nzbget.conf
KillMode=process
Restart=on-failure

[Install]
WantedBy=multi-user.target
From HTPCguides: http://www.htpcguides.com/install-latest-nzbget-on-ubuntu-15-x-with-easy-updates/
Edit the systemd startup file for NZBGet at /etc/systemd/system/nzbget.service
Replace /mnt/usbstorage with the path to your mount point; this should fix the problem where NZBGet isn't starting because the hard drive isn't mounted yet
[Unit]
Description=NZBGet Daemon
Documentation=http://nzbget.net/Documentation
After=network.target
RequiresMountsFor=/mnt/usbstorage
I wouldn't say errors are unrelated to performance. Poor performance can lead to errors among all the processes that are working together in a system.
Often when unpacking something, I'll get "UNRAR Error 10" in NZBGet, which seems to correspond to "Device or resource busy" in the detail logs. It looks as if NZBGet is trying to delete the _unpack folder too soon after an unrar fails. I fixed it by turning off the cleanup functions (which Sickbeard does just fine later).
That nzbget web-interface link has been down, along with the forums. That's why I posted here.
EDIT: Ah, I see you edited with some advice. Thanks! I'll try moving some other directories to the C1; SD storage is cheap. Any advice on CIFS mounting? I'm wondering if that's part of my issue, since I'm getting those busy-device errors.
> http://nzbget.net/forum/viewtopic.php?f=8&t=835

2015-04-24 22:46:59 POSTPROCESSER :: Unable to move file /Volumes/OS/completed/tv/penny.dreadful.s01e08.720p.bluray.x264-demand-NZBgeek/zmNag5LMPHYTjp67roSGqlKNled3sMzJ.mkv to /Volumes/TV/Penny Dreadful/Penny Dreadful - 1x08 - Grand Guignol.mkv: error 16 : Resource busy
2015-04-24 21:39:26 POSTPROCESSER :: Unable to move file /Volumes/OS/completed/tv/penny.dreadful.s01e08.720p.bluray.x264-demand-NZBgeek/zmNag5LMPHYTjp67roSGqlKNled3sMzJ.mkv to /Volumes/TV/Penny Dreadful/Penny Dreadful - 1x08 - Grand Guignol.mkv: error 16 : Resource busy
2015-04-24 19:59:47 POSTPROCESSER :: Unable to move file /Volumes/OS/completed/tv/penny.dreadful.s01e08.720p.bluray.x264-demand-NZBgeek/zmNag5LMPHYTjp67roSGqlKNled3sMzJ.mkv to /Volumes/TV/Penny Dreadful/Penny Dreadful - 1x08 - Grand Guignol.mkv: error 16 : Resource busy
2015-04-24 19:12:25 POSTPROCESSER :: Unable to move file /Volumes/OS/completed/tv/penny.dreadful.s01e08.720p.bluray.x264-demand-NZBgeek/zmNag5LMPHYTjp67roSGqlKNled3sMzJ.mkv to /Volumes/TV/Penny Dreadful/Penny Dreadful - 1x08 - Grand Guignol.mkv: error 16 : Resource busy
I'll try that script you posted, thanks! Part of my problem is knowing what to google and what's not actually working, haha!
There are a lot of great tips on http://nzbget.net/Performance_tips
I recently moved from SAB to nzbget and had to utilize several of these to tweak for the optimum performance. Here are a few things to consider from my own experiences...
DirectWrite is a big one. It should be used if your filesystem supports it, as it cuts down on disk I/O and therefore speeds things up. If your filesystem doesn't support it, or doesn't give good performance with it, you should turn it off and make sure your ArticleCache is as big as possible. In either case, your ArticleCache shouldn't be set to less than 100 MB.
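As a sketch, the options mentioned above live in nzbget.conf (values here are illustrative, not recommendations):

```
DirectWrite=yes      # write article data directly into the destination file
ArticleCache=200     # in MB; make this as large as you can if DirectWrite=no
```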
Are you running NZBGet in console mode or daemon mode? For some people it's faster one way vs. the other.
What level are your messages logging at? Turning off the detail messages can speed things up.
This can be a big one: Where are your temporary and intermediate directories set to? Are they local to the machine or somewhere across the network? The latter can really limit I/O speeds and hence download speeds.
Yeah, I went through hell just to get that last 5 MB/s to hit 38 MB/s. It's worth noting that if someone is in this boat with NZBGet, turning off ContinuePartial was a big factor in the speed difference. Also, if NZBGet is running on a lower-powered machine, the CRC check can be turned off to help raise the speed a bit. I download to a Mac Mini first, then unpack to my NAS. DirectWrite should be off, and ArticleCache should be set near 1000 MB. The InterDir should be on the local machine; the DestDir should be on the NAS. After all that tweaking, I was finally able to get my max speed.
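Summarizing those settings as nzbget.conf entries (the paths are made-up examples for a local-disk-plus-NAS setup, not defaults):

```
ContinuePartial=no                       # big speed factor in this setup
CrcCheck=no                              # optional, helps low-powered machines
DirectWrite=no
ArticleCache=1000                        # MB
InterDir=/Users/me/Downloads/inter       # local disk (example path)
DestDir=/Volumes/NAS/completed           # network share (example path)
```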
This documentation helped a lot: http://nzbget.net/Performance_tips
The communication error in the web UI may be your browser being weird; have you tried other browsers? If NZBGet can still download via watchlists, that suggests NZBGet is working but the web interface isn't. Or can it not download at all while you see this communication error?
I will also echo /u/sup3rlativ3 and say post on the nzbget forum as the dev hugbug is very active there
You put the name of your client (the machine where Plex Home Theater is running). You can go to Plex/Web and check the Devices tab to get the name of the "client". The library scan won't work because of Plex Home, but if you use NZBGet, I wrote a PP-script that will do both GUI notifications and automated targeted updates upon download (even with Plex Home enabled). You can find it here: http://nzbget.net/forum/viewtopic.php?f=8&t=1393
A number of methods exist, which I'm sure others will discuss.
I'm interested to see how dev hugbug plans to integrate posting into NZBGet.
Why not?
According to this, it should be pretty straightforward: nzbget.net/Packaging
In my head, it could work like this:
Have the update-info script query the NZBGet source version. If there's a new version, download the sources via the install script. Then compile them, stop NZBGet, replace the binary with the newly compiled version, and start NZBGet again.
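A rough sketch of that flow in Python. The paths, the `nzbget -v` output format, and the make-based build steps are all assumptions; NZBGet's real update scripts also exchange info via environment variables, which this ignores.

```python
import subprocess

def parse_version(output):
    """Extract the trailing version token from `nzbget -v` output,
    e.g. "nzbget version: 16.4" -> "16.4" (output format assumed)."""
    return output.strip().rsplit(" ", 1)[-1]

def update_nzbget(latest_version, nzbget_bin="/opt/nzbget/nzbget",
                  src_dir="/opt/nzbget-src"):
    """If `latest_version` differs from the installed one, rebuild from
    source and restart the daemon. All paths here are examples."""
    out = subprocess.run([nzbget_bin, "-v"], capture_output=True, text=True)
    installed = parse_version(out.stdout)
    if latest_version == installed:
        return installed  # already up to date, nothing to do
    # Mirror the steps described above: stop, compile, replace, restart.
    subprocess.run([nzbget_bin, "-Q"], check=True)                 # quit daemon
    subprocess.run(["make", "-C", src_dir], check=True)            # compile sources
    subprocess.run(["make", "-C", src_dir, "install"], check=True) # replace binary
    subprocess.run([nzbget_bin, "-D"], check=True)                 # start daemon
    return latest_version
```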
Do you get any "TLS handshake failed" errors?
I do, and from searching the forums it looks like our problem might be related to an unreliable server:
http://nzbget.net/forum/viewtopic.php?f=3&t=1515
http://nzbget.net/forum/viewtopic.php?f=3&t=1514
Still trying to dig up more info, though.
Most people nowadays use their indexer's API to snatch the NZB, or drop it into an NZB "watch" folder to download. In the Age of Automation, fewer and fewer people are double-clicking on .nzb files, but if that's how you like to do it, it's possible.
You can add file associations with NZBGet by following the directions in this thread: http://nzbget.net/forum/viewtopic.php?f=3&t=933
Here's what the new Quick Par Verification feature is all about: http://nzbget.net/forum/viewtopic.php?f=10&t=1333
Compared to the previous par repair function, this one is a lot faster. "Verification time: 1.5 minutes vs 33.5 minutes."
It shouldn't filter, but it HAS TO. The dev has said that he will not include a non-filtering RSS reader.
More information: http://nzbget.net/RSS
Edit: Besides, this question is relevant no matter which client you use. You might switch NZB-downloader, but want a separate RSS downloader.
You'd have to make a script for that in the init.d folder.
I think someone on the subreddit made one for Linux.
But you can just do
nzbget stop
nzbget start
http://nzbget.net/Installation_on_Linux_(mipsel)#Starting_on_startup
I am not too familiar with NZBGet, as I recently switched from SAB. But I think there is something wrong with the OMGWTFNZB API. It sounds to me like it is pulling down the URL from the show details, and not the NZB file.
I'd check out the NZBGet website to see if there is any information there.
What version of NZBGet are you using? According to their forums, the current testing version fixed a LOT of bugs related to par files. http://nzbget.net/forum/viewtopic.php?f=4&t=1129
I myself am waiting for NZBGet v13 before I try to switch from SAB.