You can get a directory listing using various means and then parse that into a URL list for aria2 (or wget).
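One way to sketch that pipeline, assuming an open directory whose index page is plain HTML. The listing below is a made-up stand-in for what fetching the index (e.g. with curl) would return; the hostname and file names are placeholders.

```shell
# Hedged sketch: parse an open-directory index page into a URL list
# for aria2c/wget. The HTML is a stand-in; adjust the base URL.
cat > listing.html <<'EOF'
<a href="../">Parent directory</a>
<a href="report.pdf">report.pdf</a>
<a href="data.zip">data.zip</a>
EOF
base="http://example.com/dir/"
grep -o 'href="[^"]*"' listing.html \
  | sed 's/^href="//; s/"$//' \
  | grep -v '^\.\.' \
  | sed "s|^|$base|" > urls.txt
cat urls.txt
# Then:  aria2c -i urls.txt   (or: wget -i urls.txt)
```

Real directory listings vary, so you may need to tweak the grep/sed patterns for a given server.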
My fave lazy tip here is to set up uGet (which can use aria2 as a downloader) to monitor the clipboard, and then just copy the link (ctrl+e on Chrome).
A firefox add-on like this one lets you grab all links on a page.
A downloader like this one lets you paste those links as a download task.
For some servers, you can use a tool like uGet.
It shouldn't make downloading faster, but it can help with reliability by minimizing the possibility of corruption in time-consuming downloads.
Besides that, I have nothing more to recommend.
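On the corruption point: one way to check a long download yourself is to verify it against a published checksum, assuming the site ships one. The file here is a tiny stand-in for a real download.

```shell
# Sketch: verify a finished download against a checksum file.
# "file.iso" is a placeholder; real sites usually publish the
# .sha256 file alongside the download.
printf 'pretend iso contents\n' > file.iso
sha256sum file.iso > file.iso.sha256   # normally fetched from the site
sha256sum -c file.iso.sha256           # prints "file.iso: OK" if intact
```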
cheers!
Yes on both counts. I haven't watched a video on RD yet, though, so I don't know how well that works but I assume it works fairly well. And no, RD is still useful.
RD is mainly very good if you want to download things from ddl websites. They usually have many mirrors on various file hosts and it's impossible (or very, very expensive) to have a premium account on every file host.
If you need a good simple download manager, try uGet. All of the others I looked at were too complex to just download stuff but uGet just does what it's supposed to.
Glad that it helped! Also, I forgot to add this in:
Try reinstalling NetworkManager, and if you are using wireless, also try connecting via iwctl.

How to connect via iwctl (requires the iwd package to be installed):

iwctl station YOUR_WIRELESS_NETWORK_INTERFACE connect "YOUR_WIRELESS_NETWORK_SSID"

EDIT: Also try using a download manager like uGet.
>never heard of it.. link does not seem to be working on mobile for me right now.
Which is probably because the site's SSL certificate has expired.
>so what is that uget supposed to do?
The tool is a download manager with a graphical user interface which, according to the site, offers a range of features.
Source: https://ugetdm.com/features/
An alternative might be https://jdownloader.org/home/index if one has no problem with Java.
I know dtmall, and yes, it isn't recursive.
If you have Windows 10 you can install the Linux subsystem or get Cygwin for the tools I've mentioned (maybe not wget2... yet).
uGet will download directory contents (and subdirectories) AND take filters. AND there is a Windows port AND it's a GUI.
If you can't get a full directory listing, it sounds like the directory isn't "open". Not trying to Bogart your URL, but if you'd like to share it (maybe via PM) I could have a play with it to see what works.
Read the stickies and the sidebar.
If you aren't big on the command line, there are a few online wget "primers" (they'll ask you what you want and give you working text to paste).
The speed is basically down to the server (and a little bit to you) - if the server is an old beige box in some 3rd world shithole connected to dial-up, you're going to be lucky to download anything at all, let alone quickly.
At some times of the day you'll find that they are quicker - I'm an early riser where I am (GMT+8) and I find that for the 'Iranian' servers, 0300-0500 my time gets me far quicker speeds than during the day if I'm home.
The same is true for your own speeds - if you're trying during your peak times, when everyone where you live is home and gaming/watching Netflix etc., then your speeds aren't going to be great.
As you mentioned aria2, you might want to look at uGet (which can use aria2) - it also has a scheduler to set times to download!
EDIT: as mentioned, you can get tools (including aria2) that multithread - axel, wget2 - but if you aren't using the command line (or Linux, in axel's case) the learning curve may be too steep. It will also usually hammer the server - you're effectively hitting it with, say, 10 connections rather than 1.
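To make the "10 connections rather than 1" point concrete, here is a sketch of the range arithmetic such tools perform internally: each connection requests one byte range of the file via an HTTP Range request. The numbers here are made up for illustration.

```shell
# Sketch of how multi-connection downloaders split a file: each
# connection fetches one byte range (HTTP "Range:" header).
# Example: a 1000-byte file across 4 connections.
size=1000; n=4
chunk=$(( (size + n - 1) / n ))   # ceiling division -> 250 bytes each
i=0
while [ "$i" -lt "$n" ]; do
  start=$(( i * chunk ))
  end=$(( start + chunk - 1 ))
  if [ "$end" -ge "$size" ]; then end=$(( size - 1 )); fi
  echo "connection $i: bytes=${start}-${end}"
  i=$(( i + 1 ))
done
```

This is also why it can punish the server: it now has to service four (or ten) simultaneous requests for the same file instead of one.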
I downloaded uGet and am having a weird problem. The sidebar on the left-hand side of the screen that allows one to choose "Status" and "Category" has disappeared, and I can't find any menu option to bring it back. I went to the documentation pages on the uGet website ( https://ugetdm.com/documentation ) and all that is there is a "page not found" message, so I can't find any instructions for this program that would tell me how to bring it back. The reason I need it is that all that is visible on screen are the downloads I have already completed. I can't find any way to get back to the view where I could see the status of what is currently downloading.
If the images are not referenced on the page at all (e.g. the ones they forgot to include), then no standard web-spider will download them unfortunately.
If their links appear only after clicking a JavaScript arrow (I am guessing here), then this could be the reason why HTTrack is not picking them up. HTTrack does not execute JS, so it cannot get to the next page/image and retrieve the image URL from there.
What you can do, if these names follow a pattern, is generate a long list of potential names (e.g. image0001.png, image0002.png, [...], image9999.png, etc.), prepend the beginning of the URL to each one (www.example.com/assets/img/characters/image0001.png), and then feed this list of generated URLs into a downloader like uGet (https://ugetdm.com/) or even HTTrack, which will then try to download the ones that actually exist.
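The list generation above can be sketched in one line with seq, whose -f format handles the zero-padding. The base URL here is the example one from the text; swap in the real site's path.

```shell
# Generate candidate URLs image0001.png .. image9999.png and prepend
# the (example) base URL; feed the resulting file to uGet or HTTrack.
base="http://www.example.com/assets/img/characters"
seq -f "$base/image%04g.png" 1 9999 > urls.txt
head -n 2 urls.txt
# Names that don't exist will simply come back as 404s and be skipped.
```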
If there is no pattern, then you may have to manually swap to the next image and use Download Star to quickly download the newly found images in the page.
wget won't download multiple things at the same time, but it will get you multiple files in an OD (sequentially) according to a regex. There's shitloads of info on it in the sidebar to the right ====>
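If you go the wget route, the regex filtering looks roughly like the comment at the end of this sketch (URL and file names are placeholders). Trying the regex locally with grep first saves hammering the server with a bad pattern.

```shell
# The same extended regex you would hand to wget's --accept-regex can
# be tried out locally first. A made-up listing of an OD's contents:
printf '%s\n' photo_001.jpg photo_002.jpg notes.txt video.mkv > listing.txt
grep -E 'photo_[0-9]{3}\.jpg' listing.txt   # only the two JPEGs match
# The real (network) command would then be roughly:
#   wget -r -np --accept-regex 'photo_[0-9]{3}\.jpg' http://example.com/od/
```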
If none of that meant anything to you and you aren't prepared to learn a command-line tool, then try uGet - it's free, and probably the easiest GUI download manager to use. There are more "powerful" options, such as JDownloader 2 as suggested, but if you're new to the game I'd apply the KISS principle.