10 Wget (Linux File Downloader) Command Examples in Linux

Narad Shrestha

He has over 10 years of rich IT experience, including various Linux distros, FOSS, and networking. Narad believes in sharing IT knowledge with others and adopts new technology with ease.



6 Responses

  1. Scully says:

    You made my day. I tried without the ftp:// prefix and failed time after time.

  2. Predatux says:

    Hi…

    I want to make a script that checks a website, downloads the latest available version of a .deb file, and installs it.

    The problem is that the version on the website changes over time, and the file name changes with it, so I cannot know the exact name to pass to wget.

    I wonder if there is any way to use wildcards in wget, or a similar option.

    As an example, suppose you want to periodically download the latest 64-bit “Dukto”.

    Their website is:
    http://download.opensuse.org/repositories/home:/colomboem/xUbuntu_12.04/amd64/dukto_6.0-1_amd64.deb

    How can I tell wget to look in that directory and download dukto*.deb? (See the sketch after the comments below.)

    Thanks in advance.

  3. Hasanat says:

    What does “downloads files recursively” mean?

    • John Lang says:

      That means it will follow all the links on the website. For example, if a page links to other pages, wget will download each of those pages and anything they link to in turn. You can set the number of levels, etc. (see http://www.gnu.org/software/wget/manual/html_node/Recursive-Retrieval-Options.html). This is actually how Google works, but for the whole internet: it follows every link on every website to every other one.

      Also, with a few more options you can download a whole site and make it suitable for local browsing, so if there is a multi-page site you use often, you could retrieve it recursively and then open it even without an internet connection (see the examples below). I hope that makes sense (the tl;dr version is that it follows every link on that website to more links and more files, in a “tree”).
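
On Predatux's question above: wget does not expand wildcards in HTTP URLs, but its recursive mode supports accept lists, which can have the same effect when the server publishes a browsable directory index. A minimal sketch, assuming the amd64 directory above still serves an index page (the exact file-name pattern is an assumption based on the example URL):

    # Crawl one level of the directory index, keep only the matching .deb
    # files, and save them without recreating the remote directory tree.
    # -r  : recursive retrieval
    # -l1 : recurse only one level deep
    # -nd : do not create a local directory hierarchy
    # -np : never ascend to the parent directory
    # -A  : accept only file names matching this pattern (assumed pattern)
    wget -r -l1 -nd -np -A 'dukto_*_amd64.deb' \
        http://download.opensuse.org/repositories/home:/colomboem/xUbuntu_12.04/amd64/

During the crawl, wget fetches the index page, follows the links it finds, and keeps only the files whose names match the -A pattern, so whichever dukto_*_amd64.deb version is current gets downloaded.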
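To make John Lang's explanation concrete, here are two short examples (example.com is a placeholder, not a real target):

    # Recursive retrieval, limited to two levels of links:
    wget -r -l 2 http://example.com/

    # Mirror a site for offline reading (the "local browsing" case):
    #   --mirror           recursion with infinite depth plus timestamping
    #   --convert-links    rewrite links in saved pages so they work locally
    #   --page-requisites  also fetch the images, CSS, and scripts each page needs
    #   --no-parent        never ascend above the starting directory
    wget --mirror --convert-links --page-requisites --no-parent http://example.com/docs/

The --convert-links step runs after the download finishes, so the saved copy can be opened in a browser with no internet connection, which is exactly the multi-page offline use described above.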
