I use lftp: http://lftp.yar.ru/
You can do directory downloads with 'mirror' and even select how many files, and how many chunks of each file, you want to download concurrently.
lftp
cd to/path/with/folder
mirror -c -P4 --use-pget-n=2 ./wangpics
That will download a folder called wangpics. It will download 4 files concurrently, using 2 concurrent connections per file.
A good backup policy ought to include both on-site and off-site backups. Encryption should be used unless you have a valid reason not to.
Depending on the kind of data you're working with and the resources available to you, you can add some git magic or LUKS-encrypted virtual drives into your policy.
That said, the tool you're looking for is backupninja. I remember reading somewhere that it is possible to use backupninja over FTP, though I have not used it myself.
Your other option is to use a combination of dedicated tools to make your own solution:
Make your incremental backups with tar; here's an example:
tar -g /var/log/tar-incremental.log -zcvf /backup/today.tar.gz /var/www/html /home /etc
Then transfer it to its destination with lftp. Wrap with a little bit of scripting which mails you the result of the operation, and set up a cron task to run it for you.
You should probably add a mechanism to prevent the FTP server from filling up, and maybe make a fresh full backup once in a while.
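A minimal sketch of such a wrapper, with made-up credentials, paths, and addresses; the mail command assumes a working local MTA:

#!/bin/bash
# Nightly incremental backup, uploaded over FTP, result mailed to the admin.
# Deleting the snapshot file forces the next run to be a full backup.
SNAPSHOT=/var/log/tar-incremental.log
ARCHIVE=/backup/$(date +%F).tar.gz

tar -g "$SNAPSHOT" -zcf "$ARCHIVE" /var/www/html /home /etc

lftp -u backupuser,secret -e "cd /backups; put $ARCHIVE; bye" ftp.example.com

if [ $? -eq 0 ]; then
    echo "Uploaded $ARCHIVE" | mail -s "backup OK" admin@example.com
else
    echo "Upload of $ARCHIVE failed" | mail -s "backup FAILED" admin@example.com
fi

# crontab entry, e.g. every night at 02:30:
# 30 2 * * * /usr/local/bin/ftp-backup.sh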
Also remember that having backups is worth nothing if you don't test them to be sure they work and know how to restore them.
Which FTP client were you using on Windows XP? That way I can compare its features against any Linux FTP clients I might suggest you use.
Linux FTP Cron Job for automatic ftp backup
http://www.cyberciti.biz/faq/linux-unix-autologin-cron-ftp-script/
What environment are you working in (Windows, I assume)?
I used GoodSync when my media store ran on a Windows Server, and everything synced overnight, so bandwidth was never an issue.
I am now running Linux as my primary VM for media. I now use http://lftp.yar.ru/ with a bash script to schedule transfers. You can throttle it as required, but obviously this assumes you are running a server/Linux VM etc.
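For example, a throttled pull might look like this sketch (host, rate, and paths are made up; net:limit-total-rate is in bytes per second):

lftp -u user,pass -e 'set net:limit-total-rate 5000000; mirror -c /remote/media /local/media; bye' sftp://seedbox.example.com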
Hey man.
mirror is the command to sync a whole directory. The --use-pget option tells it to use the pget command when getting files, which does segmented downloads.
So you want the pget command.
http://lftp.yar.ru/lftp-man.html
Also, I'm just on my phone now, but I usually pass something like --use-pget-n=5 as an arg to the mirror command instead of setting it.
Try pget -n 5 and see if it's the same speed.
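For example (path made up):

lftp -e 'pget -n 5 /files/big.iso; bye' sftp://user@host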
This probably isn't quite what you're looking for, but the command line ftp app "lftp" is pretty easy to use and can do this (the feature is called pget in lftp.)
You can install lftp via homebrew.
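If Homebrew is already set up, that's just:

brew install lftp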
The speed I get with segmented downloads through lftp is insane. Totally worth the little bit of a learning curve, though your curve might be steeper if you aren't at all familiar with the OSX command line.
> mirror -c -v --loop --Remove-source-dirs "$remote_dir" "$local_dir $REGEX"
How were you expecting the manpage's

> mirror [OPTS] [source [target]]
>
> Mirror specified source directory to the target directory.

to work on regexes without using a flag specifying them?
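For reference, passing the regex through mirror's own filter flags would look something like this (-x excludes matches; -i includes only matches):

mirror -c -v --loop --Remove-source-dirs -x "$REGEX" "$remote_dir" "$local_dir"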
Method 2: download, compile and install lftp-4.8.4 from http://lftp.yar.ru/get.html
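Roughly the usual source build, assuming a compiler and the readline/gnutls dev packages are already installed:

tar xf lftp-4.8.4.tar.gz
cd lftp-4.8.4
./configure
make
sudo make install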
Run the lftp command as follows in a command line terminal:
lftp -u xbox,xbox -e 'find -l /;bye' 192.168.1.102 > long_file_list.txt
Change the IP address to match the one used to access your Xbox.
Agree with the above. I recently also switched over to LFTP for SFTP/speed reasons and found the Whatbox wiki to be most useful. Also this: http://lftp.yar.ru/lftp-man.html
Thanks for the response wBuddha.
I found the -E option here: http://lftp.yar.ru/lftp-man.html
> -E delete source files after successful transfer
Cheers for your help. Will give it a shot later on :)
Another approach might be to move the repository on the remote out from under the /www directory and into a secure directory. Then add a post-receive hook that calls a script which copies the files in the working tree (and just those files) over into the appropriate directory under /www.
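A minimal sketch of what that hook might look like (untested; paths and branch name are made up):

#!/bin/sh
# .git/hooks/post-receive in the repository kept outside /www
# Force-check-out the latest working tree into the web root.
GIT_WORK_TREE=/var/www/html git checkout -f master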
I've never done that, though, so I can't go into specifics. Personally, I upload to my webhost using <code>lftp</code>. Doing that makes it easier, for me at least, to have files on the site that aren't tracked in Git, and to exclude the .git directory from what gets copied.
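For what it's worth, the .git exclusion in lftp looks something like this (host and paths made up):

lftp -e 'mirror -R -x "^\.git/" ./site /public_html; bye' sftp://user@webhost.example.com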
Running lftp from the terminal with the command below produces exactly the result I want. It downloads everything but rar files. However, when I put the command in that particular script, it runs into a segmentation fault. I really don't know why.
mirror -c -x ".r(a|[0-9])(r|[0-9])$" --parallel="5" --use-pget-n="10" --log="$log_file" "$remote_dir" "$local_dir"
I had already tried what you suggested. It doesn't seem to make a difference. I just uninstalled the lftp from the Ubuntu repository and compiled the latest from the developer. We'll see if that makes a difference.
Edit: After compiling lftp, the segmentation fault is gone. But there is a new problem. The program seems to delete the very directory it is downloading files into, in the middle of a download, and then errors out saying the directory does not exist!
I keep looking at the documentation at http://lftp.yar.ru/lftp-man.html to see what I'm missing that's causing this. Really no clue again. I wonder if the combination of -c (continue where it left off) and -x (exclude regex) causes a problem.
I have done this.
I use the lftp command-line program to transfer the file via SFTP upload. In my python code, I use something like:
import threading

threads = []
for tmp in xyz:
    data_file, lftp_pgm, fname = tmp
    thread = threading.Thread(target=run.threadLoad, args=(data_file, lftp_pgm, fname))
    threads.append(thread)
    thread.start()
In the threadLoad function:
import subprocess

def threadLoad(self, data_file, lftp_pgm, fname):
    output = fname[:-4] + "out.lftp"
    with open(output, mode="w", encoding="latin-1") as fp:
        # Run lftp with debugging (-d), executing the commands in fname (-f),
        # capturing its output in the log file.
        child = subprocess.Popen([lftp_pgm, "-d", "-f", fname], stdout=fp, stderr=fp)
        streamdata = child.communicate()[0]
        retcode = child.returncode
You might have hostnames and credentials in your xyz list for multiple destinations. In your scenario, rsync may work better if your transfer window ends before the file finishes; that way it can resume where it left off when the script runs again.
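A rough sketch of that rsync variant (host and paths made up); --partial keeps partially transferred files so a rerun resumes them:

rsync -av --partial -e ssh /data/big_file.dat user@dest.example.com:/incoming/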
The way I ended up going about it was I installed an SFTP server on it (can't remember what tool I used for that), and I've been using lftp in my computer's terminal (since my computer is a Unix-based system) to transfer files. Works like a dream. I even set up an alias so I just type dl and it executes lftp sftp://[X]@[Y]:[Z], where X is the username I set up, Y is the name I gave the computer's IP in my hosts file, and Z is the port I forwarded through the router (not the default sftp port, for a little extra safety).
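In case it's useful, that alias is just one line in ~/.bashrc (values made up):

alias dl='lftp sftp://myuser@mediabox:48022'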
The worst part — and this would have been unavoidable either way — is constantly having to ask them what their public IP is and updating my hosts file, since they don't get a static IP from their ISP.
Ignoring the possible routing issue and just talking about how to pull content.
It depends a bit on your workflow and operating system(s?), but I like to be as hands-off as possible. If you are pulling stuff into a NAS or a dedicated media machine and are running Linux on either, I highly recommend LFTP.
You can write a really simple bash script to pull content (and delete it from the seedbox once it has completed?) and include a "--use-pget[-n=N]" option within that to specify parallel segmented downloads; there's a sketch after this comment.
The advantage of this is you can then run the bash script through crontab every couple of hours to make sure you have pulled in new stuff.
I would recommend pulling stuff via SFTP rather than FTP as well, so it all remains private.
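Putting that together, a minimal sketch (host, paths, and schedule are made up; --Remove-source-files does the delete-from-the-seedbox part, so test without it first):

#!/bin/bash
# Pull finished downloads over SFTP: 4 files at a time, 4 segments each,
# removing source files from the seedbox once they have transferred.
lftp -u user,pass -e 'mirror -c --parallel=4 --use-pget-n=4 --Remove-source-files /downloads/complete /mnt/media/incoming; bye' sftp://seedbox.example.com

# crontab entry to run it every couple of hours:
# 0 */2 * * * /usr/local/bin/pull-media.sh >> /var/log/pull-media.log 2>&1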
lftp is pretty badass. It's super featureful, and I've used it to mirror remote sites without problems. The only thing I'm not sure about on your wishlist is sending an email when it fails, but you could probably wrap it in a shell script and just check the return code.
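Something along these lines (address made up); the || branch only fires when lftp exits non-zero:

lftp -e 'mirror -c /remote /local; bye' sftp://user@host.example.com || echo "lftp mirror failed" | mail -s "mirror failed" you@example.com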