How to download large files on an unstable network
Suppose you are somewhere where the only connectivity to the Internet is a slow and unstable connection. You try to download a relatively large file of a few megabytes - like a copy of the WNDW book - but it will take two hours because the connection is so slow. Unfortunately you get disconnected very often, so you never actually get the file, because your browser starts from zero every time the connection drops.
One solution to this problem is to use wget - a tiny download utility that can resume downloads every time the connection is interrupted. Wget is available for DOS, Windows, Mac OS X, FreeBSD, OpenBSD, NetBSD, Linux, and many other operating systems.
If you are using Windows on an unstable network we recommend downloading wget. If you use Linux, or any flavour of BSD, chances are very high that you already have wget installed on the machine. If your connectivity is not good enough to download even wget, you might be out of luck - in that case, ask someone to copy it onto a USB stick, floppy, CD, etc. and give it to you.
Wget is available with and without support for SSL encryption. Usually you don't need SSL for getting files that are publicly available on the net, and wget with SSL is about eight times bigger than wget without SSL. If the network conditions are really bad, it could be a good idea to download wget without SSL first and then use it to download wget with SSL. Wget for Windows without SSL support can be found here. The size of this download is 165 KB.
Installation for Windows
Uncompress the downloaded file with the zip utility of your choice. The archive contains three files: an HTML file which contains the manual, a sample configuration file for altering the behaviour of wget, and 'wget.exe'.
Actually the size of the manual is much bigger than the program ;-)
wget.exe is the actual program. You should copy it somewhere 'into the path' - a directory where Windows looks for executable programs. The directory where Windows itself is installed - like C:\WINDOWS or C:\WINNT - is 'in the path' (to executables).
Basic usage with Windows
Wget works on the command line, so you have to get to the command line (open a shell) first.
If you are using Windows, click on: Start > Run
Now type in cmd (Win2000, XP) or command (Windows 95, 98, ME)
Press "Enter". Once you have a command prompt, type:
wget -c -t 0 --timeout=30 http://link-to-the-file-you-want
The command shown on the line above will start wget and tell it to continue a partial download (-c), use an infinite number of tries (-t 0), and stop waiting and reconnect automatically if no data is received for 30 seconds (--timeout=30).
wget will download the file into the directory where you started wget.
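If you find yourself typing these options every time, they can also be set as defaults in wget's configuration file (the sample configuration file shipped in the archive shows the format; on Unix-like systems the per-user file is usually ~/.wgetrc). A minimal sketch using the standard wgetrc option names:

```
# Always continue partial downloads (same as -c)
continue = on
# Retry an unlimited number of times (same as -t 0)
tries = 0
# Treat 30 seconds without data as a stalled connection (same as --timeout=30)
timeout = 30
```

With these defaults in place, a plain wget http://link-to-the-file-you-want behaves like the full command shown above.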
Wget without SSL-support can download files from http and ftp servers. If SSL is built in, then https is supported as well.
Note
For ftp transfers, there are active and passive modes, and sometimes passive ftp mode has to be switched on explicitly. If you have trouble with ftp, try passive mode:
wget -c -t 0 --passive-ftp --timeout=30 ftp://link-to-the-file-you-want
Passwords
Some servers require a user name and a password. The syntax is a little bit different for ftp and http(s).
wget -c -t 0 --timeout=20 --http-user=name --http-passwd=pass http://link-to-file
wget -c -t 0 --timeout=20 --ftp-user=name --ftp-passwd=pass ftp://link-to-file
Support for ftp passwords was added fairly recently, so if your version of wget doesn't understand --ftp-user= or --ftp-passwd=, it is too old.
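Note that a password given on the command line may be visible to other users of the machine (for example in the process list). As an alternative, wget can also read credentials from a .netrc file in your home directory. A sketch, with placeholder values in the style of the links above:

```
machine ftp.some-server login name password pass
```

wget will then use that name and password automatically whenever it connects to that server, so the command line stays free of passwords.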
Some links contain characters like ampersands, backslashes, exclamation marks, question marks and equals signs. This can be a source of trouble if you use wget from the command line, since a shell (command line interpreter) may interpret these signs as special characters that are part of the command line syntax.
In order to have such links work as expected with wget, you have to put them in quotes. For example:
wget -c -t 0 --timeout=30 "http://link-to-the-file-you-want?foo=bar&baz=1"
How to automatically invoke wget with these options
Here is a tiny script for wget on Linux that is handy for downloading larger files. Note: this script will not work on a Windows machine. Save this as a text file named twget:
#!/bin/sh
wget -c -t 0 --timeout=15 "$@"

(Quoting "$@" passes the links on to wget unchanged, even if they contain special characters.) Copy the script into your path:
cp twget /usr/local/bin

and make it executable with:
chmod 711 /usr/local/bin/twget

(You may need to do this as the superuser 'root'.) This saves a little bit of typing:
twget http://link-to-file

No matter how flaky the connection is, and no matter how big the file is - wget will get it. It is just a matter of time.