JDownloader is the program to use. It downloads from almost every website out there and deals with interruptions very well. Use this ad-free installer:
https://jdownloader.org/jdownloader2
It takes a bit of getting used to - this is a powerful piece of software with a ton of features. Make sure to read the documentation and a guide or two beforehand.
It is worth mentioning that some servers, especially certain file hosting services with optional premium subscriptions, do not allow free users to resume interrupted downloads. JDownloader cannot circumvent such restrictions, but unlike a web browser, it can usually survive very short interruptions without losing the download.
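Under the hood, resuming is just an HTTP range request, and the usual command-line tools expose the same thing; a quick illustration with a placeholder URL (it only works when the server allows range requests):

    # resume a partial download where it left off
    wget -c "https://example.com/big-file.iso"
    # curl equivalent: continue from the current size of the local file
    curl -C - -O "https://example.com/big-file.iso"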
You should also consider switching over to Firefox. It's more efficient, has better-working ad blocking (Google made intentional changes to Chrome that made uBlock Origin significantly less effective), and is far superior in terms of privacy. It can also continue interrupted downloads far better than Chrome, although it's a manual process (you have to pause and resume downloads that appear "stuck").
So I usually save a lot of posts from this sub, 2meirl42meirl, and depression. Recently I lost all of them, but I've managed to get most of them back using ripme.
I'm still missing one which had some nice comments. It was a picture with some text suggesting they want to cuddle? I can't remember exactly which sub it was from; if someone finds that post, link me pls.
PS: I was too shy to ask from my main account
RipMe is my preferred tool. Also supports imgur, GFY, and other popular image/video hosting sites. Jdownloader 2 works as well, but may be overkill for this purpose.
If you're not aware of the tool ripme yet, I can highly recommend getting it. It supports erome.
The download link is a bit below, but here's a link to the latest release.
Download the ripme.jar file and just run that. It's pretty self-explanatory from there.
For steamunlocked it is a good idea to use a download manager, such as JDownloader (https://jdownloader.org/jdownloader2), since it usually offers better speeds than the browser's built-in download manager. The torrent client recommended by r/piracy is qBittorrent.
http://i.imgur.com/XHBa71T.jpg
I just tried it and it doesn't appear to function, you'd probably have to request it, but somehow I don't think it's going to be a priority.
I love the concept, and thank you for sharing this!
I use something similar for batches of images called, RipMe:
https://github.com/RipMeApp/ripme
RipMe is a Java (.jar) app that supports clipboard monitoring. You should take a look at the code for ideas on URL validation. It seems to me that you are on your way toward making a video downloading version of that app.
Note: RipMe CAN download videos, but it does not give you the control that youtube-dl does. A similar app (particularly with clipboard monitoring) that was specific to youtube-dl would be awesome. I'm no Java dev, but I'd be happy to help you out with ideas and testing.
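In the meantime, a rough prototype of that clipboard-watching idea doesn't even need Java. Here's a minimal shell sketch, assuming a Linux desktop with xclip and youtube-dl installed; the polling loop and URL check are deliberately naive:

    # poll the clipboard once a second and hand any new URL to youtube-dl
    last=""
    while true; do
        url=$(xclip -selection clipboard -o 2>/dev/null)
        if [ "$url" != "$last" ] && printf '%s' "$url" | grep -qE '^https?://'; then
            last="$url"
            youtube-dl "$url"
        fi
        sleep 1
    done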
>go to the release page
>download and run the .jar file under the latest release
You should be able to double-click it in Windows once Java's installed; if not, the command line is
java -jar /path/to/where/you/saved/the/jar/file/ripme.jar
Once it's open it's fairly straightforward: paste the URL (it can even catch URLs from the clipboard), set the destination directory, and click "Rip". There are bells and whistles; take some time to have a look at the settings.
And enjoy your tame no-nipples tumblr pr0n!
You can use the config option history.end_rip_after_already_seen = X. It will cause ripme to end the rip after it finds X URLs that were already downloaded.
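Concretely, that's a single line in rip.properties (the file next to ripme.jar); the value here is just an example:

    # end a re-rip after hitting 5 already-downloaded URLs
    history.end_rip_after_already_seen = 5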
If you try to download the files, you'll get this message:
>total size of requested files (52 GB) is too large for zip-on-the-fly
What you need to do is download a program called JDownloader here. Install the program and open it. Click on "Add new links" and paste a link to the page linked in this post. Once that's done, you can click on the browse button to change the location the videos will download to. (You can skip this step if you're OK with the default download location.) After that, right-click on the entry called "ScottTheWoz" and click "Start downloads". All of the videos should download normally.
Use this: https://github.com/RipMeApp/ripme
Download albums from:
* imgur
Since it turned into a listing post, might as well recommend WFDownloader App. It already employs a crawl delay during scraping, and you can increase it in the general settings so as not to trigger Instagram, though the search will take longer.
If you want to batch download from kissanime, use WFDownloader App, as it can help you grab complete series. Also supports other sites like gogoanime, masterani, chia-anime etc.
Edit: you can watch this video demonstration that shows how to use it with kissanime.ru.
Yes, you would need the download manager on the device which is going to be doing the downloading. It may be a standalone program, or a web browser extension like DownThemAll.
https://github.com/RipMeApp/ripme
Download the latest jar from "releases".
There's no per-file progress view, so when it lists the files it's downloading, says "Downloading next page", and seems to just sit there, you just need to wait. It's downloading those files in the background.
It's still uploading, but here's a link to the Google Drive folder (should be done in an hour or so).
Alternatively, you can use the wonderful RipMe utility.
EDIT: It's finished uploading, so Google Drive is an alternative now.
I'm a noob. So I just have to translate the words in https://github.com/RipMeApp/ripme/blob/translation/src/main/resources/LabelsBundle.properties?
Edit: I don't have GitHub, but I translated the file as well as I could
Sometimes there's a download button at the bottom of the album off a little '...' menu. If that doesn't work, ripme is a java app that does the job nicely. There's a wiki on how to run it.
There's this tool made by a member of this subreddit, but it is unfinished and yet to be released. I don't know if there's an existing tool that can save text posts. Someone else will probably know if there is one.
https://www.reddit.com/r/DataHoarder/comments/6t5jss/release_ripreddit_semiautomatic_discord_based/
If you want to download only the images posted on a subreddit then there's ripme
I can't find what I used to use. It's free so I'll link it here when I find it. Until then, check these out. https://www.wfdownloader.xyz/blog/best-bulk-downloader-applications
Bulk Image Downloader is pretty great once you actually have a site chosen to download from. You can choose how deep you want to search in a page.
Have you tried wfdownloader app? I've used it to download galleries from deviantart in the past, but that was a long time ago. You copy the URL of a gallery, e.g. hxxps://www.deviantart.com/auser/gallery/, and paste it into the software, and it finds all the image links from that and puts them in a batch that you can download.
JDownloader2 or youtube-dl should sort you out.
Both can get you all audio variants (aac/mp4 mostly) so you can pick the ones you want.
Get them all, then it's a one-step operation to extract.
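With youtube-dl the usual flow is to list what's available first and then pick; a minimal example with a placeholder URL:

    # list every available video/audio variant with its format id
    youtube-dl -F "https://example.com/watch?v=xxxx"
    # grab the best audio-only stream and extract it to an audio file
    youtube-dl -f bestaudio -x "https://example.com/watch?v=xxxx"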
You might get some mileage out of JDownloader2 and perhaps realdebrid or alldebrid or the like for speedier downloads.
It's not a torrent, so JDownloader is the best option. Even if it were a torrent, you still shouldn't use uTorrent: it's super unsafe and has bundled Bitcoin miners in the past. Use qBittorrent instead, because it's open source.
Use JDownloader2. I just used it on a video that you supplied (Fall Out Boy) and these are the properties of the video.
Video: AV01 1920x1080 23.976fps 1542kbps [V: ISO Media file produced by Google Inc. (av1 main, yuv420p, 1920x1080, 1542 kb/s)]
Audio: AAC 44100Hz stereo 127kbps [A: ISO Media file produced by Google Inc. [eng] (aac lc, 44100 Hz, stereo, 127 kb/s)]
For anyone who wants to download all those PDFs at once get DownloadThemAll for your browser.
Highlight all the links in your browser, right-click, and select "DownloadThemAll" -> "Save with DownloadThemAll". When the window pops up, right-click one of the links and pick "Select All", then right-click again and pick "Check Selected Items", then hit the Download button at the bottom.
You might be able to get them all by choosing "DownloadThemAll" -> "Save Selection with One Click" but I haven't tried it.
Edit: Added link to the DownloadThemAll webpage
Try NeoDownloader; you can set up filters and several rules to download only what you need.
It has a 30-day trial and a free Lite version.
In the case of IHG, I asked in the forum if it could auto-update, and the unofficial response is that it isn't coded to auto-update and is manual-update only.
For Nano Defender, it says in the "Installation" section
> You need Firefox version 58 or above to install this extension.
so it isn't compatible with WF 56.
RipMe is a Java app used to rip media from social media websites. I don't do anything in Java myself, but the code seems to be quite straightforward from what I've seen.
Maybe RipMe? I use it a lot for downloading all the posts from a given user, it works great with Reddit. I've never tried it on my "saved" page before, but it's definitely worth a shot.
You can also use RipMe to rip every post from her insta and watch them all that way. The tool was created as a way to watch/collect content that has awful video player controls or is locked behind something silly that inspect-element can free.
Hm. You're right. Seems like web.stagram.com changed their html. I'll look into it but I'm busy atm so it might take longer than usual.
If you want to download that profile right now and you have a PC then try RipMe - it's an open source downloader for many sites and also supports instagram.
I don't rip from thousands of tumblrs, maybe about 10 large ones. The program I use is ripme, for images and videos. It's nice because I can queue up all my old jobs in two clicks, and it checks for existing files before downloading again. The one problem I have, and I'm sure you will have, is that there's a limit on how many API requests you can send to tumblr (1000 per hour / 5000 per day). You can just use a new API key to get around that, but I am not aware of a ripping program that can do this automatically while resuming the rip.
The more I think about it, ripme is not the solution for you, but maybe it's a starting point or can help someone else with a smaller request amount.
> I'm a noob. So I just have to translate the words in https://github.com/RipMeApp/ripme/blob/translation/src/main/resources/LabelsBundle.properties?
Yup
> Edit: I don't have GitHub, but I translated the file as well as I could
Thanks!
Get RipMe, make a link list containing the users and subs you want to download, and save it as "list" in the same directory as ripme.jar. Then, to do the initial download, run:

    cat list | parallel -j32 'java -jar ripme.jar -t 8 -u {}'

You can run that command again as often as you want, or add it to cron, etc.
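For the cron route, a crontab entry could look like the following; the schedule and path are only illustrative:

    # re-run the rip every night at 03:00
    0 3 * * * cd /path/to/ripme && cat list | parallel -j32 'java -jar ripme.jar -t 8 -u {}'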
WFDownloader App - A free multipurpose bulk downloader to easily bulk download from pinterest, instagram, search engines and other supported sites. It also has a forum media grabber and website crawler for customized downloads.
Looks like something doable with wfdownloader. Just select only documents or apply relevant filters before you search. See crawler tutorial.
If you are trying only chrome extensions, then it may be difficult to find something that does what you want. You may need to use a dedicated software for this. What you need is a bulk downloader application that supports the site you are trying to download from. This post shows a few good ones, hopefully one of them works for you.
Not sure if it's the most convenient way, but this is how I do things like this:
Download the shows with JDownloader (https://jdownloader.org/) and put them on the phone as MP3 files.
> tried a couple of ways like youtube-dl and other things i found online. But none of them worked,
It would be good to indicate what you have already tried so that someone does not recommend this again.
The JDownloader 2 tool can download such data in many cases. However, if copy protection is used, it will be difficult.
https://jdownloader.org/jdownloader2
If downloading from this link is showing up as a virus threat, then it's a false positive. There's no premium version of Jdownloader.
And no, you don't need to add the account if you're directly pasting the RD-generated link. You need the RD account only if you're pasting the non-converted link (rapidgator link for example), then it'll convert it to an RD link within the app.
Download from here
Install and Open
Now paste the download link into JDownloader2 and start the download.
Some filehosters will limit your speed, and some will additionally block pausing. Your download will then be lost if your PC shuts down or you lose connection.
Ganval is the very definition of a predator. He's a pedophile. So far his daddy Warbucks has been able to keep him out of prison, but there are some things even money can't save you from. I highly recommend that when the shit does hit the fan and she gets arrested, you contact all the big youtubers about it (some already did Tessi videos before, so they'd be interested in an update), tell them about "King of Tulane" as well, save many streams of Tessi and mirror those on something like Google Drive, and send all that in a neat, handy, synopsis-type dossier to these major youtubers (since they are lazy and don't read much, make it easy for them to make videos about it). Then also contact the old media about all of this, since she was an old Dr. Phil guest; let Dr. Phil's staff know all of this too. But time it right: do it when the shit hits the fan and you know Mya is rotting in jail. Strike while the iron is hot; in the meanwhile, have Mya incriminate herself as much as she possibly can. Download all her youtube channels while they're still up; this is a handy tool for that:
All you need to do is dump in her three channels (copy and paste them) and it will download everything. Download Ganval's channel too, since right now he has videos up on Mya proving he's involved. Capture his channel comments and her channel comments too; there's a handy snapshot button to capture entire pages in the browser. I know that's all a lot of work, but trust me, you will thank me in the end.
Time your moment, though; make sure you time it just right: she goes down, that's when you strike.
JDownloader2 is a great tool if you want a GUI instead of youtube-dl. It's mostly open source, it downloads video files along with a long list of other filetypes from any public source (plus account/login-protected sources if you log in through the program), and it lets you custom-configure damn near everything you could want.
There do seem to have been some past reports of bloatware being installed together with JDownloader, but in the ~3 years I have used it I haven't found anything being installed with it, and neither has my antivirus. Make sure you get it from the official download page.
https://jdownloader.org/jdownloader2
This is where you can install it. There is a paste-link option, and then it will download.
JDownloader is a download manager for website links like Mega, Zippyshare, etc.
It has the ability to resume and continue downloads when links expire. It can also auto-extract compressed archives, joining multi-part ones into one file if needed.
I hate to be that person, but that's pretty bad in terms of audio. Lossy-to-lossy conversions will always result in a loss of quality. If you want the best quality, use JDownloader to download the AAC of the video. https://jdownloader.org/
Yesterday, I used the ad-free version of jDownloader to grab the full set.
Windows 10 tried to block the installer. I had to manually allow it to run. Works well.
I've used it for years now and never had an issue. It's actually called JDownloader 2. I use it on Arch Linux and Win 10, and I've never had an issue with a virus or malware since it has been installed. Not sure if the Windows version might have had adware, but if it did you could opt out, and I don't think it did.
You can try asking other people, but my experience with it has been great.
These are my final thoughts on this thread. Using Animelist with JDownloader has worked pretty well for me. At times Animelist can be slow, but most of the time the download speed is fast. Also, if you can figure out what XDCC is and how to use it, it seems to be the best solution, though I personally can't vouch for it.
The others are just saying that it is possible, but not saying how. I'll fix that.
Many browsers don't natively support resumable downloads. There may be extensions/plugins/add-ons that enable them for your browser of choice; if you check your browser's add-on store for "resume(able) download" or google "[browser name] resume(able) download" you might find something.
If you don't find anything, consider a download manager like JDownloader. You just paste in a URL, and it searches it for anything to download, lets you download multiple files at once, pause and resume, and do some other stuff.
If they were all on the same page, it might have been possible with just wfdownloader app. For what you want to do, you'll need to use a crawler that supports logging in to grab all the video links, then export all of them to a tool like jdownloader or youtube-dl. You can try its crawler, but first either import cookies or use the user-login button. After that, put "vimeo" in the result filter so that it accepts only links with vimeo in them (do not select the "video" option, since I doubt it will work, as the videos are embedded here). After the search, export the links to jdownloader to batch download all of the videos. Note that it may not work if the site uses too much JavaScript to load the videos.
It's possible to do this with wfdownloader app. You can use its forum option to download forum images, videos, attachments and so on. You'll need to adjust the settings based on the site.
My condolences.
It seems that Instaloader is exactly what you need.
If you only want the pictures and videos, jdownloader2 also seems like a good option.
For those asking for a source, the 2K video is on director Arnaud Bresson's Vimeo. Using JDownloader2 I was able to get the uncompressed 4.92GB ProRes file (which may or may not be paywalled behind Vimeo Plus; I am really not sure). Tbh I am not sure the way I extracted the frames was the best possible (using QuickTime 7 Pro, I copied frames directly to the clipboard and pasted them directly into this reddit gallery); a Mac user experienced with the ProRes format might have a better way. I managed to extract all the frames as 24MB BMPs, will upload shortly...
I don't really understand what you are trying to say. You don't need to use crawler mode for tumblr in wfdownloader app because it is a supported site. Just copy and paste the link of the tumblr page into the app as shown in the bulk image downloader tutorial but use a tumblr link instead.
> the site looks almost commercial

Can you tell me what makes you think this, so that it can be removed? There's no catch; see the FAQ. It's a hobby project that I think will be useful to some.
There is also wfdownloader app which supports both reddit and twitter and you are right that the sites that are supported by each tool are a bit different. Also, they have different priorities.
You can do that by using JDownloader2's link grabber.
https://jdownloader.org/jdownloader2
It can also use ffmpeg to convert/transcode videos and audio files.
I sincerely recommend folks download local copies of JCS videos if they have any hope or desire to re-watch them, unaltered, at any time in the future. The channel is notorious for yanking their videos, sometimes forever and sometimes to butcher them with new edits.
There's a multitude of browser extensions and apps that can download, but the one I use is JDownloader.
It's a lot of the same porn, but the full ~30-minute scenes instead of the 12-minute highlight reel w/o the money shot, so if you're satisfied, keep on keeping on. But I've been fapping to downloaded porn since before there were tube sites and developed the habit of skipping around a lot, and it seems like no matter how fast my internet gets, trying to jump ahead on a tube site buffers for a sec, so I just stuck with VLC and local storage.
IDK, go download JD2 https://jdownloader.org/jdownloader2, turn on clipboard monitor, go to sxyprn.com (NSFW duh), just copy some video page URLs, or even search result pages (JD2'll scrape everything!), download a few, see what you think.
That's half of what I do nowadays; the other half is just torrents, which will usually replace 3/4 of the vids from sxyprn with slightly better quality. That's just searching "performer name x265" for anybody newer, or just the name and anything with >3 seeders for anybody older.
Always use a download manager for sharing services; JDownloader2 is your best friend: https://jdownloader.org/jdownloader2.
Use torrents. If you are in a Third World country you don't even need to worry about DMCA notices, and if you are in a First World country, 5-7 bucks monthly is cheap for all the content you can get using torrents.
The megathread has a lot of sources for you to download from, so "i stg i cannot find a good pirating website" is BS!
>fitgirl repacks freeze and dont work and say 0b/s so can't use that :shrug:
This is your PC's fault and not the repack's fault!
There's a tool called ripme that has a graphical interface (if that's something you're into) and works for multiple sites. I use it to download entire subreddits pretty often, it's nice since it nests albums into folders
Install Python, save the code as a .py file, then double-click on that file.
Though if you're just trying to download posts from a subreddit and don't actually intend on learning Python, you'll be better off using an existing tool like RipMe or gallery-dl to do it. That's way easier than trying to do it yourself.
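If you go the existing-tool route, gallery-dl is about as simple as it gets; a minimal run, with a placeholder subreddit:

    # install once, then point it at the subreddit and it handles the rest
    python -m pip install gallery-dl
    gallery-dl "https://www.reddit.com/r/example/"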
I would suggest trying wfdownloader app and following this detailed instagram downloader tutorial. If it's about 300 posts, it should be able to get all. Note that you'll need to import cookies since saved posts are private.
If you are still looking for one, you could try this instagram downloader that works for single and bulk downloads. Just be careful not to overdo it since they rate-limit the process.
Don't mean to revive a dead thread, but this is what I found works.
If the video is hosted on Vimeo, the odds are that the stream link is in the HTML source code. I simply loaded up every video on the Patreon, then opened the inspector and edited the entire document as HTML. From there I copied the entire HTML source. If you frequently download web content, there is a chance you know of something called JDownloader2. It can thankfully make sense of the entire HTML source: you just paste the source and it will parse any links it finds in it, including the Vimeo videos! From there I just filtered out anything that was not Vimeo from the JDownloader parse, selected all the Vimeo videos, and began the batch download.
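If you'd rather not round-trip through JDownloader for the extraction step, the same thing can be done against a saved copy of the page; a rough one-liner, with an approximate pattern that may need tuning:

    # pull unique Vimeo links out of the saved HTML source
    grep -oE 'https?://(player\.)?vimeo\.com/[^"'\'' <>]+' page.html | sort -u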
Make sure to get firmware 13.1.0 and keys from Emusak for new update.
Gdrive 1.1.0 update link: aHR0cHM6Ly9kcml2ZS5nb29nbGUuY29tL2ZpbGUvZC8xeHFJa1FvV2ZwR1JhZ19ZWmotbXpXN2ZUX0xycEk4WXgvdmlldz91c3A9c2hhcmluZw==
LetsUpload 1.1.0 update link (read the jdownloader2 instructions down below for blazing fast speeds): (will be live in ~1 hour)
Confirmed working on Ryujinx.
Right click on Pokemon Brilliant Diamond > Manage Title Updates > add > then browse for the update file (look at the size, make sure it's the smaller 2.5gb one) then it should list Version 1.1.0 and then click Save.
If it still doesn't work, might be ROM versions are different.
I got mine from here: aHR0cHM6Ly93d3cuemlwZXJ0by5jb20vcG9rZW1vbi1icmlsbGlhbnQtZGlhbW9uZC1zd2l0Y2gv
Just make sure to get jdownloader 2 for anything off that site or LetsUpload.
Jdownloader 2 - https://jdownloader.org/jdownloader2 - first go into Settings > Max Chunks per Download > 20
then copy/paste the links on that site into the Link Grabber section on jdownloader 2 then right click and 'Start downloads'
When I find a directory/site, I use a few different tools depending on how it's set up.
On an open directory, http://www.neodownloader.com/ is what I utilize. (The full version is payware.)
Also, another app called "Free Download Manager" allows for specific URL drag-and-drop downloads.
And yet another is a Firefox add-on called "DownloadHelper".
Edit: "Down Them All" is a handy Firefox add-on as well.
Try using the free version of Neo Downloader http://www.neodownloader.com/ . I use it all the time to bulk download albums on Imgur. All you have to do is copy the album URL and paste it into Neo, choose the desired path for storing images, and you're good to go.
I use a program called NeoDownloader. I used to use an imgur-specific album downloader, but the Lite version of NeoDownloader is free, easy to use, puts pics in a nice folder, and can download pics from any website.
Hmm, I installed Fennec and then this add-on. It shows in the add-on list, but there's no way to use it.
Have you tried installing this one? It's an older version. In Fennec it says the add-on is corrupt: https://imagehostgrabber.com/download.html
+1
Thanks for taking the time to post here,
> 1.7.4.c, please use that. XPI located on site.
https://imagehostgrabber.com/download.html lists the apparently inferior 1.7.0.4c, and https://imagehostgrabber.com/IHG/imagegrabber-1.7.4c.xpi is a 404 Not Found. Was 1.7.4.c a typo?
FYI IHG has decided to only support Palemoon
Through a little trial and error I found the download link for 56.2.14 https://storage-waterfox.netdna-ssl.com/releases/win64/installer/Waterfox%2056.2.14%20Setup.exe
#1 and the hard stop is: Image Host Grabber, aka IHG.

I use it to suck up images. No IHG... not usable.

None of the other similar add-ons are ported, and they can NOT be ported, as the API(s) they used are not available.
I use RipMe. It's a GUI program, at least on Windows; I don't have a Mac, which is what you're asking about. Easy to use as well. None of the sites you specified seem to be on the supported list, but I'm not sure if it's been updated. Best I'd say is just to test it out.
Just to clarify : Hentoid does NOT reprocess / compress any picture when downloading. What you get is exactly the same picture you would get when downloading from any desktop browser.
However, I can understand viewing it through Nox isn't ideal, and the emulation certainly renders a lower quality image than what you would get by viewing it directly on your disk.
A good place to start for a PC "equivalent" would be https://github.com/RipMeApp/ripme
Just download https://github.com/ripmeapp/ripme/releases, install Java, open the .jar file, then copy and paste the pornhub/erome link into the tool --> download.

#Enjoy

P.S. The PC (non-mobile) version should work on macOS as well, since it's Java.
youtube-dl is for downloading videos. If you want a similar tool for images, then try gallery-dl. Or if you want a simpler GUI maybe try ripme.jar:
You don't need to be computer savvy. You just need to be able to read the directions for downloading Java, Maven, and this https://github.com/ripmeapp/ripme
There are other apps that do the same thing. Someone with a few decent computers could run it on each and rack up a shit ton of pictures very quickly. It would still probably take days to a week though. My 8 year old laptop that runs too hot and is connected through a VPN can download tens of thousands of photos in a day. If they were all smaller files I imagine over 150,000 would be pretty easy.
Photo size varies wildly, but I would estimate most photos take about 0.5 to 1 second. If I remember when I get home, I'll run a few tests on file size.
Hi, I've got a program called Timesearch, but it doesn't download media. I leave the media to other tools; maybe RipMe can do what you need.
A quick search on github returns this:
https://github.com/RipMeApp/ripme
Which seems to have support for downloading from instagram. Not sure if it does exactly what you want.
Instagram has an API, so in principle it should be possible to write exactly what you want as a small script. Like
https://github.com/LevPasha/Instagram-API-python
Should let you get access to the API from Python, and save you some hassle if you have a Python dev available. You might have to apply for an API key somehow from Instagram, which should be straightforward if you have an Instagram account, which I sadly don't.
You should also ask r/socialistprogrammers; there might be people there who could whip something up.
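The API part is what that library would handle, but once any such script has produced a plain list of direct media URLs, the downloading half is trivial; a hypothetical final step, assuming urls.txt holds one link per line:

    # fetch each media URL into the current directory
    xargs -n 1 curl -O < urls.txt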
I hope my title is true. I downloaded all pictures with this: https://github.com/ripmeapp/ripme
With the exception of the videos and those eight pictures only showing question marks, I should have all posts, right?
It seems like RipMe supports the site, but you might need to test that. Just give it the link from your browser's address bar and try it out; it might even download whole accounts.
I monitor quite a few accounts and subreddits for a variety of reasons, and here is how I do it:
On a Raspberry Pi using Raspbian Lite, I installed Java and RipMe. RipMe can pull lists of urls, so I made various lists with different requirements, such as once daily, once weekly, etc.
Next, I made cronjobs to run the app using the lists, and saving to network drives. This would ensure every X amount of time, it would run the scrapes again.
Doing it separately on a Pi offloaded the scraping tasks so my main machines wouldn't have to deal with it, and saving to network drives backed everything up.
I also use scheduled wget tasks using cronjobs to scrape pages. RipMe is great for images, gifs, and videos, and wget grabs everything else using the same setup.
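For reference, one of those scheduled wget jobs might look like the following in crontab; the schedule, flags, and paths are only illustrative:

    # weekly page scrape to the network drive, with a polite 2-second delay between requests
    0 4 * * 0 wget --mirror --no-parent --wait=2 -P /mnt/nas/pages "https://example.com/page/"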
Ty, I also just found that. But how do you rip the images there to put into a zip/rar? I use cdexviewer to read the manga.
the site isn't listed for FMD app https://github.com/fmd-project-team/FMD/releases
Nor is it supported by ripme https://github.com/RipMeApp/ripme/releases/
https://github.com/RipMeApp/ripme
Get this and aim it at a subreddit, downloaded about 200 images per minute on my connection. Images average out at 1MB each.
15GB = ~15,000MB / 200 = ~75 minutes to hit your target. This will of course be quantity over quality. But who's going to scrutinise 15,000 memes?
If you mean individual photos, if you have javascript disabled, then you can simply click on a image and there's an option to download it in the bottom right. If you mean the entire album, then you can save it using a program like ripme: https://github.com/ripmeapp/ripme
You would have to log into your reddit account using ripme, but this isn't possible yet:
https://github.com/RipMeApp/ripme/issues/1245
https://github.com/RipMeApp/ripme/issues/1273
There also is no cookie support yet, so logging in isn't possible at all.
> When modifying the rip.properties file to change "log.save" to true I can't find where the log is being written
It should be in the dir ripme was started from
> So where does the log get written?
No idea, but if you find out leave a comment and I'll update the wiki
> Where is the canonical list of all the available options?
For ripme, you'd have to write your own code for those unsupported sites; see: https://github.com/ripmeapp/ripme/wiki/How-To-Create-A-Ripper-for-HTML-websites. Rather than use a program or browser extension, I would be inclined to write everything in my own code, as it would be more efficient and easier to manage downloads the way I would want them.
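As a sketch of that own-code approach, a throwaway ripper for a simple static page can be a few lines of shell; the URL and the extension list are placeholders, and it only works where the page links images directly:

    # fetch the page, extract direct image links, download each one
    curl -s "https://example.com/gallery" \
        | grep -oE 'https?://[^" <>]+\.(jpg|jpeg|png|gif)' \
        | sort -u \
        | xargs -n 1 curl -O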
> I was unsure if it was being updated with each release. Does the docker container get updated with each release?
It should be, but it does sometimes get forgotten. It's easy enough to do by hand by editing the Dockerfile.
> Is this what I should be using, or is there another docker container that is kept more up to date?
The official docker is https://github.com/RipMeApp/ripme-docker
> Also how would I go about changing the properties file permanently for each container?
I've not tried it, but docker exec -it <container> bash should start bash in the container. From there, just edit RipMe's config file.
ripme and TumblThree are both safe bets. I prefer TumblThree because it's a lot more user-friendly, and it lets you get around profiles that are NSFW-locked or password-locked (in some cases).
Thanks for the reply. I was planning on doing something like that anyways so it should work out for me.
A quick question about the wiki, did ripme used to rip the files from URLs directly in the past, but that was a feature that was taken out? I am asking because one of the flags for using ripme says that you can rip URLs from a file
    usage: java -jar ripme.jar [OPTIONS]
     -4,--skip404            Don't retry after a 404 (not found) error
     -d,--saveorder          Save the order of images in album
     -D,--nosaveorder        Don't save order of images
     -f,--urls-file <arg>    Rip URLs from a file.  <---
Pushshift.io apparently ingests almost a terabyte of Reddit data every month, according to its creator, and I think that's purely Reddit data, not external images/videos/links. I don't know how they're set up behind the scenes, but it just shows that it's definitely possible.
You could spread it over many different hosts in different locations, and use many different accounts to access the API.
I "casually" hoard images/gifs (no metadata or comments, just the media) from just around 700-750 subreddits with the application Ripme (and some custom scripts for managing Ripme), some subreddits ripped twice daily and some every 2 days, and I don't get blocked by Reddit in any way, even though this is all running from one server. I don't think Ripme uses the official API, but I don't know for sure. That might have something to do with it. Currently I'm at a little bit more than 12TB, but that's from the last 4-5 years, so this is not a super comprehensive archive.
Inspired by the post today, “Can we make this sub just selfies again and not soft porn?! It’s a great sub but it’s becoming NSFW and makes me not even want to post! You don’t have to wear underwear to get attention ladies!”
http://sankeymatic.com/manual/ - Used to Create Chart
https://github.com/ripmeapp/ripme - Used to rip images
After ripping the sub (I have no idea how “good” of a job it did), I randomized the pictures and manually data mined 1000 of them while watching 1.5 episodes of House MD.
Honestly, I think there are inherent problems with this data. I am led to believe there is a lot of deleting that goes on in this sub. These deletions may lean towards the risqué posts? I honestly thought there would be a much lower percentage of “normal” selfies.
You guys are all beautiful. Keep doing your own thing, whatever it is that makes you happy!