Lightweight Command-line Download Manager – “wget”

“wget” is a venerable open-source, lightweight, cross-platform command-line download manager from GNU. Technically it’s just a downloader. I won’t give a long description of what wget is; instead I’ll just show you how a tiny binary can be more useful than a sophisticated download manager.

First, get yourself the wget binary. If you’re on Linux, wget most probably comes built in. For Windows, go to http://gnuwin32.sourceforge.net/packages/wget.htm . You’ll need to add the install folder to your PATH if you want to use the wget command from anywhere. Mac users can download the source from ftp://ftp.gnu.org/gnu/wget/ and compile it themselves, or simply download prebuilt binaries.
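
To check that wget is actually on your PATH, you can ask it for its version from any terminal; the exact output will of course vary with your platform and build:

wget --version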



You can also have wget on Android (root required): download the binary from http://forum.xda-developers.com/showthread.php?t=1611993 and, using any root file browser, copy it to /system/bin/.
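
If you’d rather do it from a computer, a rough sketch over adb looks like the following; the file name, download location and mount command are assumptions and may need tweaking for your particular device:

adb push wget /sdcard/wget
adb shell
su
mount -o rw,remount /system
cp /sdcard/wget /system/bin/wget
chmod 755 /system/bin/wget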


There are limitless uses for wget, so I’ll show only a few cool ones. So open up a terminal (or cmd) and start commanding !!!

Very simple usage:

wget https://www.kernel.org/pub/linux/kernel/v3.x/linux-3.12.5.tar.xz

This will simply download the file from the link. Not so cool, huh !
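
If you’d rather save the file under a different name, “-O” does that (the output name below is just an example):

wget -O kernel.tar.xz https://www.kernel.org/pub/linux/kernel/v3.x/linux-3.12.5.tar.xz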

Download With Resume Support:

wget -c -t 0 https://www.kernel.org/pub/linux/kernel/v3.x/linux-3.12.5.tar.xz

“-c” tells wget to continue a partially downloaded file instead of starting over, and the number after “-t” specifies how many tries wget should make after an error. 0 means infinite tries, so wget won’t stop till the file is completely downloaded.
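
This also means you can abort a download at any time and resume it later just by re-running the same command. On a flaky connection you may also want a timeout and a pause between retries; the 15 and 5 second values here are only examples:

wget -c -t 0 --timeout=15 --waitretry=5 https://www.kernel.org/pub/linux/kernel/v3.x/linux-3.12.5.tar.xz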

Mirror a website !!!

wget -m http://gnu.org/

This will make a local copy of a website. In simple words, an offline copy of a website !!! However, this only works well for simple webpages, i.e. HTML-only type webpages. If you want to grab a full-featured website, then try httrack.
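
For a mirror that actually looks right offline, one common combination (by no means the only one) adds a few extra flags: “-p” grabs page requisites like images and CSS, “-E” saves pages with an .html extension, “-k” converts links for local viewing, and “-np” keeps wget from climbing to the parent directory:

wget -m -p -E -k -np http://gnu.org/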

Recursive Download:

wget -r -l1 -H -k http://gnu.org/

“-r” enables recursive download, “-l1” limits it to level 1 so wget doesn’t go too deep. “-H” allows wget to change hosts, i.e. it can download from other linked websites. “-k” converts remote links into local links.

You can also add some filtering options with “-A” and “-R”. If you add “-A jpeg” after wget and before the URL, it will only download files with the .jpeg suffix, or you can specify a comma-separated list of accepted suffixes, e.g. “-A jpg,jpeg,png”. “-R” does the opposite and rejects the listed suffixes; its syntax is the same as “-A”.
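
Putting the filters together, something like the first command grabs only images from pages one level deep, while the second skips big archives instead; the suffix lists are just illustrations:

wget -r -l1 -H -A jpg,jpeg,png http://gnu.org/
wget -r -l1 -R zip,iso http://gnu.org/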

Download from URL List:

wget -i downloadlist.txt 

“downloadlist.txt” is nothing special, just a simple text file with one URL per line. wget will download from all those URLs one by one. In any of the above examples, the specific URL can be replaced by “-i downloadlist.txt”.
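
For example, downloadlist.txt could contain the URLs used earlier in this post, one per line, and you can combine “-i” with the resume options so an interrupted batch restarts cleanly:

wget -c -t 0 -i downloadlist.txt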

Now I’m done ! These are just a few of wget’s limitless uses, but they’re the most commonly used ones. If you don’t like using wget on the command line, there are lots of GUI front-ends for wget.

Here are some of them:

For Windows: http://sourceforge.net/projects/winwget/

For Linux: Check this forum post http://askubuntu.com/questions/11633/can-you-recommend-a-good-modern-gui-download-manager-wget-wrapper

For Mac OS X : https://code.google.com/p/cocoawget/

Isn’t this enough ? Get the official wget documentation from http://www.gnu.org/software/wget/manual/ and download everything like a boss !!!
