Eventually, I think. Maybe. Right now, the Sonarr devs are working on V3, which is an entire overhaul of the UI (see Lidarr if you want a preview of what V3 will look like).
There are a couple of tools people have built to integrate with Trakt lists. https://flexget.com/Plugins/List/sonarr_list and https://github.com/tomtom602/TraktToSonarr are two.
With flexget + transmission.
You configure an RSS feed for it to pull torrents from, plus some rules to filter them by show/video quality/etc. As for detecting a "new" episode, it handles that itself as it downloads: it stores the episodes it has already downloaded in a db file, and whatever shows up new on the RSS feed since the last scan gets compared against what's already in the db.
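That filter-and-remember behavior is exactly what FlexGet's `series` plugin provides out of the box. A minimal sketch, assuming a generic tracker feed (the feed URL and show names are placeholders):

```yaml
tasks:
  tv-from-rss:
    # Input: the tracker's RSS feed (placeholder URL)
    rss: https://tracker.example.com/rss
    # Filter: only accept these shows at the wanted quality.
    # FlexGet records accepted episodes in its database, so the
    # same episode is never grabbed twice across runs.
    series:
      - Some Show:
          quality: 1080p webdl
      - Another Show:
          quality: 720p hdtv
    # Output: hand accepted torrents to Transmission
    transmission:
      host: localhost
      port: 9091
```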
One of my quarantine projects has been setting up a Raspberry Pi to do exactly this. I found a program called FlexGet that appears to work well for this. It seems like it's intended for torrents/piracy but it downloads from the Giant Bomb RSS feeds just fine. If you have the time and patience to set it up, it's worth it.
Also, have you heard of FlexGet? It would handle all that FileBot stuff automatically on download! I've got it running on my VPS as well as my Transmission instance in the cloud, so all the upload traffic occurs up there vs on my home connection. I have an rsync cron that downloads content home. My ISP only sees A LOT of encrypted download traffic from one source.
>Do you have a plan for when Google stops the gravy train? Unlimited for $10/mo isn't likely to continue for very long.
Pay the price. Even the $50/mo is significantly cheaper than buying and hosting it all at home with the amount I have stored with google.
>As long as you have analyze files turned off, I'd think it would work. But that is still a lot of files and folders. I wonder how long the daily-ish refresh would take.
Analyze is definitely turned off. "Daily" refreshes take more than a day to complete.
>Have you explored FlexGet? I know nothing about it, but it is the only alternative I can think of. Did you ever try CP w/ this? I can't imagine it'd work though.
I'll look into flexget, and I had bad experiences with CP in the past and can't be bothered to try again.
Glad it helps, I did basically the same thing as that individual. I am using the SequentialDownload plugin to prevent disk fragmentation, even though it's somewhat unhealthy for the swarm.
But yeah, basically just launch lots of deluge daemons with their own config files on their own ports. I use <code>flexget</code>, Sonarr, Radarr, etc to parse releases/RSS feeds. The .torrents are sorted by flexget into folders that are watched by deluge for new downloads.
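The sorting side can be sketched with FlexGet's `regexp` plugin setting a per-match path and the `download` plugin dropping the .torrent files there for deluge's watch folders to pick up (feed URL and paths are made up for illustration):

```yaml
tasks:
  sort-to-watch-folders:
    rss: https://tracker.example.com/rss
    # Route releases to a per-daemon watch folder based on title
    regexp:
      accept:
        - '1080p': {path: /srv/watch/daemon-tv}
        - 'FLAC': {path: /srv/watch/daemon-music}
      from: title
    # Write the accepted .torrent files to the per-entry path,
    # where each deluge daemon auto-adds them
    download: yes
```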
I've been planning to release a bunch of tools I have for stable, large scale seeding, including a "better" web UI (doesn't massively lag the browser like the classic deluge web UI) for managing dozens of daemons simultaneously.
Until Sonarr implements trakt/imdb/whatever-else list integration, if you want a hands-off trakt-to-sonarr setup for now, I'd suggest looking into FlexGet and the Sonarr List plugin. Once you get things set up, you will rarely if ever have to go in and add a show manually to Sonarr anymore.
You may be able to use it for more than trakt to sonarr, like IMDB to sonarr, but I haven't really looked into it, as all I use is trakt.
Not sure which RSS scraper you're using, but something like FlexGet can rewrite the URL. For RED, add <code>&usetoken=1</code> to the end of the URL to make it use a token.
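In FlexGet that URL rewrite can be done with the `manipulate` plugin; a sketch, assuming a generic feed (the task name and URL are placeholders):

```yaml
tasks:
  red-with-tokens:
    rss: https://tracker.example.com/feed.rss
    accept_all: yes
    # Append the token parameter to the end of every download URL
    manipulate:
      - url:
          replace:
            regexp: '$'
            format: '&usetoken=1'
```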
Good question. Not sure about the collection part, but out of curiosity I looked into the free-space options with FlexGet and found they have a path selection plugin. I don't have a need for that, but it's nice to know it exists.
This has been a goal of mine as well. Recently I migrated to Servarr from FlexGet. Before, under FlexGet, I had evolved a little Python "daemon" that accomplished this goal. Since my switch to Servarr, I've been migrating that daemon into Prunerr. Its primary goal is to reflect the state of Servarr download items in the download client (mainly by moving the download item around in the client), and then use that reflected Servarr state to decide which download items to delete. I'd love to collaborate to make this useful for others trying to accomplish this, so please test and/or contribute, but take the pre-alpha warnings at the top of the README seriously!
I'm just texting myself random links for testing. The tasks hangout2 or 1 are just clones from the original task so I can test. So I just make the changes and link the task to the profile and send myself a test link.
Here's the link I'm testing with https://flexget.com/ChangeLog
I think you need to do the first run with the <code>--learn</code> argument to avoid that. And yeah, I've heard flexget is not very easy to configure, but once you get it right it's a powerful autodl tool.
FlexGet is basically just an RSS reader that can match series and movies against specific qualities, and it supports a lot of plugins. You can check out the website here: https://flexget.com/. I have it configured to look for certain shows at certain qualities (720p, 1080p, hdtv/webdl) and it will automatically fetch the matching ones from an RSS feed (or multiple feeds). My config checks 6 different RSS feeds for every show, and it keeps track of the episode and quality that has already been fetched so nothing gets downloaded more than once.
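The multi-feed part can be sketched with FlexGet's `inputs` plugin, which merges several inputs into one task (feed URLs and the show name are placeholders):

```yaml
tasks:
  tv-multi-feed:
    # Combine several RSS feeds into a single input
    inputs:
      - rss: https://feed-one.example/rss
      - rss: https://feed-two.example/rss
    # The series plugin dedupes across feeds: once an episode is
    # accepted from any feed, it won't be fetched again
    series:
      - Some Show:
          quality: 720p-1080p
```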
<code>flexget</code> has a <code>content_size</code> configuration option per RSS feed it's consuming.
If there's a better tool than flexget for this stuff (in terms of simplicity/reliability/configuration options), I'm unfamiliar with it.
You can send all torrents smaller than x to a watch folder that will automatically add them to your client, and torrents bigger than x to a different folder where you can manually screen them.
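That size split can be sketched as two FlexGet tasks using `content_size` (sizes are in MB; the threshold, feed URL, and folder paths are made up):

```yaml
tasks:
  small-auto:
    rss: https://tracker.example.com/rss
    accept_all: yes
    # Reject anything over ~5 GB
    content_size:
      max: 5000
    # Client auto-adds from this watch folder
    download: /srv/watch/auto
  large-screen:
    rss: https://tracker.example.com/rss
    accept_all: yes
    # Only the big ones land here, for manual screening
    content_size:
      min: 5000
    download: /srv/watch/review
```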
Traktarr sadly doesn't seem to work on Windows 10 (at least it never has any time I've tried). As for FlexGet, the install instructions for Windows 10 are right here: https://flexget.com/InstallWizard/Windows
You just gotta have Python installed on your machine, and then it's as easy as <code>pip install flexget</code>.
First you're going to want to google flexget to get an idea what it is.
You'll need a torrent server. I use transmission daemon on ubuntu server.
There are lots of guides via Google to get those things set up.
Then, specifically:
Setup imdb_list plugin https://flexget.com/Plugins/List/imdb_list
Setup flexget transmission plugin https://flexget.com/Plugins/transmission
Then configure tasks to scan the rss feed of your favorite torrent sites.
https://flexget.com/Plugins/rss
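Put together, those pieces look roughly like the sketch below. This is only an assumed shape: the feed URL, IMDB account details, and transmission credentials are all placeholders, and imdb_list's authentication options have changed over time, so check the plugin docs for the current scheme.

```yaml
tasks:
  imdb-watchlist-movies:
    rss: https://tracker.example.com/rss
    # Only accept releases matching entries on the IMDB watchlist
    list_match:
      from:
        - imdb_list:
            login: user@example.com   # see imdb_list docs for auth details
            list: watchlist
    # Send accepted torrents to the transmission daemon
    transmission:
      host: localhost
      port: 9091
      username: transmission
      password: hunter2
```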
Honestly man, you have to do some reading and tinkering. There's no "for dummies" guide for what you want.
Not exactly sure why you'd want to do something like this, but you can use the limit plugin, then run the task once per hour.
limit:
  amount: 1
  from:
    rss: http://example.com/rss
accept_all: yes
Hmm, .session? That is an internal bookkeeping directory for rtorrent, when a torrent gets added it gets held there. If you are talking about the watch folder, RSS doesn't use that.
In the RSS manager (right click on the feed), you can set "do not start automatically". The directory field is where the torrent payload (not the torrent dot file) is stored.
If you want different behavior, I think you have to look at flexget.
Install flexget following one of the methods here https://flexget.com/Install
Once done, create a configuration file (https://flexget.com/Configuration) following the examples from the link I gave before for sonarr_list, specifically the "Add series from trakt to sonarr" example. Make sure to read all the plugin settings options as well, so you get a better understanding of how each option works and how it needs to be set up.
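A sketch of that trakt-to-Sonarr task, assuming local defaults (the trakt account name, API key, and Sonarr URL/port are placeholders):

```yaml
tasks:
  trakt-to-sonarr:
    # Pull shows from a trakt list (account and list names are made up)
    trakt_list:
      account: my-trakt-account
      list: watchlist
      type: shows
    accept_all: yes
    # Push every accepted show into Sonarr via its API
    list_add:
      - sonarr_list:
          base_url: http://localhost
          port: 8989
          api_key: 0123456789abcdef
```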
Nah man, not even close. Any exodus fork for kodi (Exodus Redux, Venom, Scrubs v2) simply has everything available for streaming in one place. Quality usually maxes out around 720p for most things though.
If you want better quality then yeah, you go the torrent route... but this can be automated to a degree where you end up with a far better system than what any of the providers offer. For instance, you can use something like flexget, which takes a series of torrent RSS feed inputs (just feed it a list of tens or hundreds, from wherever). You then define a filter based on things like IMDB ratings, actors, directors, whether it's a series you have already downloaded before, etc., as well as video quality. It'll then select the torrents that pass your filter and send them to your torrent client, and you can have it organise the files automatically, however you like, on your disk. Smack a Kodi front end on that and you're golden. You don't have to look for, or sort, content; it just comes to you automatically, straight to your media player on your TV.
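The rating-based filter part can be sketched with FlexGet's `imdb` plugin (the thresholds and feed URL here are arbitrary examples):

```yaml
tasks:
  movies-by-rating:
    rss: https://tracker.example.com/rss
    # Look up each release on IMDB, then filter on the result
    imdb_lookup: yes
    imdb:
      min_score: 7.0
      min_votes: 1000
      reject_genres:
        - horror
    transmission:
      host: localhost
      port: 9091
```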
FlexGet seems to support your NAS, and from what I can see there is a plugin for youtube-dl.
I have no direct experience with that, because I've only used flexget with torrents, and for that it has worked great for me.
Try digging into it, but at first glance it looks feasible to use it for YouTube as well.
Seen a few folks mentioning RSS, but no mention of FlexGet the third party RSS getter.
If your tracker supports it, Autodl-irssi, which uses IRC, is also an option.
Beyond that, there are several other topic-specific getters, like SickRage and Couch Potato. Or even Sonarr and Radarr, which run natively under Windows and can be run from home.
> So everything is stored on Google
Do you have a plan for when Google stops the gravy train? Unlimited for $10/mo isn't likely to continue for very long.
> With rclone vfs listing files is really fast, so I would assume radarr discovering/seeing the movies shouldn't be an issue, but obviously I could be wrong.
As long as you have analyze files turned off, I'd think it would work. But that is still a lot of files and folders. I wonder how long the daily-ish refresh would take.
Have you explored FlexGet? I know nothing about it, but it is the only alternative I can think of. Did you ever try CP w/ this? I can't imagine it'd work though.
I have not tried that (I'd derive no benefit anyway I think). What I am pushing from is Tautulli for watch notifications so I can have some awareness of what my kids are watching.
If you are after notification of when new stuff is completed, and you have the rest of your pipeline automated, you could have Tautulli send you notifications when things are added to Plex. That might be cool if you want that kind of up to the minute info.
Personally I've got enough noise in my life and I set up the automation so I don't have to know :-)
Assuming your torrent setup moves completed files to a different location than where they download from, you could maybe set up something to watch for changes in that directory and send notifications. Need to find some sort of file watcher that can either directly integrate to pushover or has a connection to IFTTT. Looks like FileBot has direct support, as well as Flexget which I had not heard of before but looks interesting.
There's a long winded way of doing RSS feeds with a third party app and downloading with Transmission.
I haven't figured it out yet so I'm using qb and Transmission at once :(
edit: Flexget is the tool
See /r/flexget and https://flexget.com/
It is basically a separate tool that can handle RSS feeds & send filtered torrents to various torrent clients.
Not sure if that is really your issue, sounds more like something wrong with your ability to load the showRSS feed.
The idea is to filter the RSS feed, for instance with something like <code>flexget</code>, or maybe the built-in RSS filter in your torrent client. I prefer flexget because it is far more modular and is independent of the torrent client (so you're not tied to a specific client forever just because you don't feel like redoing your RSS filters). This way you can send all torrents matching a certain filter (e.g. from a certain category, like 0day-apps, or XXX-0DAY-CLIPS, or PC-ISO) to a torrent client, and the client will know to move finished downloads to specific folders. For example, send all torrents with titles ending in RELOADED|SKIDROW|Unleashed|SIMPLEX|F4CG|Razor1911|CODEX|CPY|*PUNKS|PLAZA
etc. to /media/pc-games/
while all 0day-app releases with MacOSX|macOS
etc. in the title go to /media/0day/_macos
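That routing can be sketched with the `regexp` plugin attaching a path to each match; the paths mirror the examples above, while the feed URL and client details are placeholders (the transmission plugin's jinja-templated `path` option is the assumed way to apply the per-entry path):

```yaml
tasks:
  sort-0day:
    rss: https://tracker.example.com/rss
    regexp:
      accept:
        # Scene PC game groups -> games folder
        - 'RELOADED|SKIDROW|CODEX|CPY|PLAZA': {path: /media/pc-games}
        # macOS releases -> mac folder
        - 'MacOSX|macOS': {path: /media/0day/_macos}
      from: title
    transmission:
      host: localhost
      port: 9091
      # Use the path set by the matching regexp above
      path: "{{path}}"
```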
Looks like for torrents it is built in but I found these.
https://flexget.com/ http://factormystic.net/projects/apps/automatic-feed-downloader https://sourceforge.net/projects/rss-downloader/
I am sure that last one, the Python script, could be adapted to meet your needs pretty easily if the others are of no help.
I'm not sure if this is as viable for music as it is for TV shows, but automating piracy is a massive time saver. I configured Flexget to download new episodes of the shows I watch - you could have it download new tracks/albums from a list of artists you like, or automatically choose what to download based on popularity charts or something.