Most of us are used to accessing websites using a web browser such as Firefox. It is also possible, however, to access websites from the command-line. This is especially useful when downloading a file for which you already know the URL. The wget and curl commands can be written into scripts, automating the process of downloading package files.
The following is an example of using wget to download a file from the Samba website:
# wget http://download.samba.org/pub/samba/samba-latest.tar.gz
The following is an example of using curl to download a file from the Nmap website:
# curl -o nmap-7.70.tar.bz2 https://nmap.org/dist/nmap-7.70.tar.bz2
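Because both commands are non-interactive, they script easily. A minimal sketch of a batch download, assuming a file named urls.txt holding one URL per line (the filename is illustrative; the URLs are the two used above):

```shell
# Build a list of package URLs, one per line (urls.txt is an illustrative name).
cat > urls.txt <<'EOF'
http://download.samba.org/pub/samba/samba-latest.tar.gz
https://nmap.org/dist/nmap-7.70.tar.bz2
EOF

# wget's -i option reads URLs from a file; -nv keeps the log terse.
# The command is echoed here so the sketch runs without network access.
echo wget -nv -i urls.txt
```

Dropping the `echo` runs the downloads for real.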
Differences between curl and wget
While wget and curl perform the same basic function, there are some key differences:
- wget is a command-line utility only, whereas curl is implemented using the cross-platform libcurl library and is therefore more easily ported to other systems.
- wget can download files recursively, whereas curl cannot.
- curl supports many more network protocols (including SCP, SFTP, SMTP, POP3, IMAP, LDAP, and file://), whereas wget supports only HTTP, HTTPS, FTP, and FTPS.
- wget is better suited for straightforward downloading of files from a web server, whereas curl is better suited to building and managing more complex requests and responses from web servers.
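One quick way to see the protocol gap is curl's file:// scheme, which wget does not implement. The following runs entirely offline (the file path is illustrative):

```shell
# Write a small local file, then fetch it back through curl's file:// handler.
# wget would reject this URL, as it only speaks HTTP(S) and FTP(S).
printf 'hello\n' > /tmp/proto-demo.txt
curl -s file:///tmp/proto-demo.txt
# prints: hello
```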
Syntax
The syntax of the wget and curl commands is:
# wget/curl [options] {URL}
curl Command Examples
1. Download the contents of a URL to a file:
# curl http://example.com --output filename
2. Download a file, saving the output under the filename indicated by the URL:
# curl --remote-name http://example.com/filename
3. Download a file, following location redirects, resuming a previous partial transfer, and failing with an error on a server error instead of saving the error page:
# curl --fail --remote-name --location --continue-at - http://example.com/filename
4. Send form-encoded data (a POST request of type `application/x-www-form-urlencoded`). Use `--data @file_name` or `--data @-` to read from STDIN:
# curl --data 'name=bob' http://example.com/form
5. Send a request with an extra header, using a custom HTTP method:
# curl --header 'X-My-Header: 123' --request PUT http://example.com
6. Send data in JSON format, specifying the appropriate content-type header:
# curl --data '{"name":"bob"}' --header 'Content-Type: application/json' http://example.com/users/1234
7. Pass a username and password for server authentication:
# curl --user myusername:mypassword http://example.com
8. Pass client certificate and key for a resource, skipping certificate validation:
# curl --cert client.pem --key key.pem --insecure https://example.com
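The flags from example 3 combine naturally into a reusable wrapper. A sketch (the function name `fetch` and the `DRY_RUN` switch are illustrative assumptions, not curl features):

```shell
# fetch: download a URL with curl's fail/redirect/resume flags from example 3.
# Set DRY_RUN=1 to print the command instead of executing it.
fetch() {
    set -- curl --fail --location --continue-at - --remote-name "$1"
    if [ "${DRY_RUN:-0}" = "1" ]; then
        echo "$@"
    else
        "$@"
    fi
}

DRY_RUN=1 fetch http://example.com/filename
# prints: curl --fail --location --continue-at - --remote-name http://example.com/filename
```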
wget Command Examples
1. To download a page:
# wget www.linux.com
2. To write log messages to a file instead of the terminal:
# wget -o log www.thegeekdiary.com
3. To append messages to an existing log file:
# wget -a log www.thegeekdiary.com
4. To run in background:
# wget -b -o log www.thegeekdiary.com
5. To run in verbose mode:
# wget -v -o log www.thegeekdiary.com
6. To run in quiet mode:
# wget -q -o log www.thegeekdiary.com
7. To read URLs from file:
# wget -i urlfile -o log
8. To set number of tries:
# wget -t 5 -o log www.thegeekdiary.com
9. To see the progress:
# wget --progress=bar -o log www.thegeekdiary.com
# wget --progress=dot -o log www.thegeekdiary.com
10. To turn on time-stamping (download only if the remote file is newer than the local copy):
# wget -N -o log www.thegeekdiary.com
11. To print the headers sent by the HTTP server/FTP server:
# wget -S -o log www.thegeekdiary.com
12. To check that the pages exist without downloading them:
# wget --spider -i urlfile
13. To set the time-out period:
# wget -T 60 -o log www.thegeekdiary.com
14. To limit the download speed:
# wget --limit-rate 100K -o log www.thegeekdiary.com
15. To specify the interval between downloads:
# wget -w 10 -o log -i urlfile
16. To display the version of wget:
# wget -V
# wget --version