10 Wget (Linux File Downloader) Command Examples in Linux

Narad Shrestha

He has over 10 years of rich IT experience, including various Linux distros, FOSS, and networking. Narad believes in sharing IT knowledge with others and adopts new technology with ease.

6 Responses

  1. Scully says:

    You made my day. I tried without the ftp:// prefix and failed time after time.
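
    For reference, wget falls back to http:// when no scheme is given, so FTP downloads need the explicit prefix. A minimal example (the host and path are placeholders):

      # The ftp:// scheme must be explicit; wget assumes http:// otherwise.
      wget ftp://ftp.example.com/pub/file.iso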

  2. Predatux says:

    I want to make a script that checks a website, downloads the latest available version of a .deb file, and installs it.

    The problem is that the file name on the website changes with each new version, so I cannot know the exact name to pass to wget.

    I wonder if there is any way to use wildcards in wget, or a similar option.

    As an example, suppose you want to periodically download the latest 64-bit “Dukto”.

    Their website is:

    How can I tell wget to look in that directory and download dukto*.deb?

    Thanks in advance.
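
    wget cannot expand wildcards inside an HTTP URL, but shell-style globs do work directly in FTP URLs, and for HTTP its -A/--accept option can filter a recursive fetch by pattern. A minimal sketch (the directory URL is a placeholder):

      # Fetch only files matching dukto*.deb from one directory level,
      # without recreating the site's directory structure locally.
      wget -r -l1 -nd --no-parent -A 'dukto*.deb' http://example.com/downloads/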

  3. Hasanat says:

    What does it mean by “downloads file recursively”?

    • John Lang says:

      That means it will follow all the links on the website. For example, if a page links to more pages, wget will download each of those, along with any further links on those pages, and so on. You can set the number of levels it descends (see http://www.gnu.org/software/wget/manual/html_node/Recursive-Retrieval-Options.html). This is essentially how Google works, but for the whole internet: it follows every link on every website to every other one. With a few more options you can download an entire site and make it suitable for local browsing, so if there is a multi-page site you use often, you could retrieve it recursively and then open it even without an internet connection. I hope that makes sense (the tl;dr version: it follows every link on the site to more links and more files, in a tree).
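
      A minimal sketch of that offline-browsing case (the site URL is a placeholder):

        # Mirror a site two levels deep, pulling the images/CSS each page
        # needs and rewriting links so it browses locally without a network.
        wget -r -l2 --convert-links --page-requisites --no-parent http://example.com/docs/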
