How to download a whole website in one command

TL;DR

wget -r -p -np -k -P ./data/ http://example.com/eg/

Parameters explained:

-P    Specifies the directory prefix to save downloaded files to
-r    Enables recursive download
-np   Does not ascend to the parent directory (stays within the given path)
-k    Converts links in the downloaded pages so they point to the local copies
-p    Downloads all page requisites (images, CSS, etc.) needed to display the pages
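
For readability, the same command can be written with the long-form option names, which are the standard wget spellings of the flags above:

wget --recursive --page-requisites --no-parent --convert-links --directory-prefix=./data/ http://example.com/eg/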

How to use wget to download the whole website

To download an entire website, you need to use the recursive option with wget. The recursive option tells wget to follow the links on each page and download the linked pages and files as well, recreating the site's structure locally.

The syntax for downloading the website is as follows:

wget -r -p <website URL>

Let’s say you want to download the entire website https://www.example.com. To do this, run the following command:

wget -r -p https://www.example.com

This command will download all the files and folders on the website and save them in the current directory.
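
If you would rather save the site into a specific directory than the current one, add the -P option described in the TL;DR; ./mysite/ below is just an example path:

wget -r -p -P ./mysite/ https://www.example.com

wget typically creates a subdirectory named after the host (for example ./mysite/www.example.com/) and mirrors the site's structure inside it.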

Limiting the download speed

By default, wget downloads files as fast as possible. However, you can limit the download speed to avoid overwhelming the server or your internet connection. To limit the download speed, you can use the --limit-rate option.

The syntax for limiting the download speed is as follows:

wget -r -p --limit-rate=<download speed> <website URL>

Let’s say you want to limit the download speed to 500KB/s. To do this, run the following command:

wget -r -p --limit-rate=500k https://www.example.com

This command will download all the files and folders on the website at a maximum speed of 500KB/s.
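
The --limit-rate option caps bandwidth; if you also want to pause between individual requests, wget's --wait and --random-wait options can be combined with it. The values below are only illustrative:

wget -r -p --limit-rate=500k --wait=1 --random-wait https://www.example.com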

Downloading only certain file types

Sometimes, you may only want to download certain file types, such as images or PDFs. You can use the -A option to specify which file types to download.

The syntax for downloading only certain file types is as follows:

wget -r -p -A <file type> <website URL>

Let’s say you only want to download images from the website https://www.example.com. To do this, run the following command:

wget -r -p -A jpg,jpeg,png,gif https://www.example.com

This command will crawl the website and save only the matching image files (jpg, jpeg, png, gif) under the current directory.
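
The -A option accepts a comma-separated list of extensions; the complementary -R (reject) option excludes file types instead, which is handy when you want everything except, say, large media files. The extensions below are just an example:

wget -r -p -R mp4,zip,iso https://www.example.com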

Conclusion

Wget is a powerful tool that allows you to download entire websites in one command. With the recursive option, you can download all the files and folders on a website. By limiting the download speed and downloading only certain file types, you can customize your download to suit your needs. Try using wget to download your favorite websites and see how easy it is to use!
