cURL in Linux: How to Transfer Data?

cURL, usually invoked as curl, is a Linux command-line utility for transferring data to or from a server over protocols such as HTTP, HTTPS, FTP, FTPS, and many others (for transferring many files at once, wget or an FTP client may be more convenient).

curl offers many useful options, such as resuming interrupted transfers, limiting bandwidth, proxy support, and much more.

Installing cURL

On most Linux distros, cURL comes pre-installed, but in the rare case that it doesn't, your distribution's repositories should have an up-to-date version of cURL.

To check whether cURL is installed, just run the curl command on its own. If cURL is not installed, the shell will report that the command was not found, and on Ubuntu you can install it using the following commands.

sudo sh -c 'apt update && apt upgrade'
sudo apt install curl
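
Once installed, a quick way to confirm that the command is available, and to see which protocols your particular build supports, is to ask curl for its version.

curl --version # prints the curl version along with the supported protocols and features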

Using cURL

curl has a very simple syntax, which looks something like

curl [OPTIONS] [URL...]

For example, to retrieve the homepage of google.com, we would run the following.

curl google.com

The output is printed to the console. If no protocol is specified, cURL tries to guess the right one from the host name and otherwise defaults to HTTP.
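
To make the protocol explicit, you can always include the scheme yourself; both commands below fetch the same page, and example.com is used here purely as an illustration.

curl example.com # no scheme given, so cURL falls back to HTTP
curl http://example.com # scheme given explicitly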

Saving Output

cURL output can be saved in two ways: under a name you choose, using the -o option, or under the file's original name on the server, using -O.

For example, to fetch google.com and store it in index.html, the command to run is as follows.

curl -o index.html google.com

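The -O form, in contrast, keeps the name the file already has on the server. As an illustration, the following would save Google's robots.txt under that same name; any publicly reachable file URL would work here.

curl -O https://www.google.com/robots.txt # saved locally as robots.txt
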
Downloading Multiple Files

To download multiple files, we repeat the -o or -O option for each file: -o takes a filename followed by the URL, while -O takes just the URL. So, to fetch google.com into google.html and duckduckgo.com into duckduckgo.html, we would run the following command.

curl -o google.html google.com -o duckduckgo.html duckduckgo.com

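If you would rather keep the server-side file names, -O can be repeated in the same way; the URLs below are placeholders for any two downloadable files.

curl -O https://www.example.com/a.txt -O https://www.example.com/b.txt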

Resuming a Download

Sometimes our downloads stop partway through, due to a network failure, a power outage, low bandwidth, or any number of other reasons. cURL saves us when this happens: with the -C - option we can resume the download from where it left off instead of starting again from the beginning.

For example, say you were downloading the Ubuntu 21.04 ISO and your network dropped; you could resume the download with the following commands.

curl -O https://releases.ubuntu.com/21.04/ubuntu-21.04-desktop-amd64.iso # Start downloading the ISO file
curl -C - -O https://releases.ubuntu.com/21.04/ubuntu-21.04-desktop-amd64.iso # Resume the interrupted download by re-running the same command with -C -

Follow Redirects

Sometimes a page has moved, and if you try to cURL it you end up with an error page containing a link to the new URL, or just a short notice from the web server. You can instruct cURL to follow these redirects to the new location, no matter where they point. For example, try cURLing duckduckgo.com without any extra flags.

curl duckduckgo.com

But if you use the redirect flag, -L, cURL follows the redirect and fetches the correct page, as follows.

curl -L duckduckgo.com

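If you only want to see the redirect itself rather than the page body, asking for the headers alone makes the chain visible; the exact status codes depend on the site.

curl -I -L duckduckgo.com # shows the 3xx redirect response followed by the final 200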

Specify Max Transfer Rate

cURL can limit the transfer rate of a download using the --limit-rate flag. The rate can be specified with k for kilobytes, m for megabytes, and g for gigabytes per second. For example, to download the google.com homepage at 1 kilobyte per second, we run the following command.

curl -L -o google.html google.com --limit-rate 1k

To see the difference, run the same command once with --limit-rate and once without, and compare how long each one takes.

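cURL can also report the measured speed itself through its --write-out option; the built-in %{speed_download} variable prints the average download speed in bytes per second, which makes the effect of --limit-rate easy to compare.

curl -L -o google.html --limit-rate 1k -w 'average speed: %{speed_download} bytes/s\n' google.com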

Using a proxy

To use a proxy, we use the -x/--proxy flag, followed by the proxy URL.

For example, for a proxy listening on 192.168.33.24, port 8000, we would run the following.

curl -x 192.168.33.24:8000 https://www.google.com

If the proxy server requires authentication, pass the credentials with the -U/--proxy-user flag, using the syntax shown below.

curl -U user:pass -x PROXY_IP:PROXY_PORT URL
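
As a concrete example, with the (made-up) proxy 192.168.33.24:8000 expecting the user alice with password s3cret, the full command would look like this.

curl -U alice:s3cret -x 192.168.33.24:8000 https://www.google.com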

Conclusion

This tutorial is a handy guide on how to install cURL on your system and then download files from a remote machine using the curl program. It also shows how to resume downloads, follow redirects, limit bandwidth, connect through a proxy with cURL, and more.



About the author:
Pradeep has expertise in Linux, Go, Nginx, Apache, CyberSecurity, AppSec and various other technical areas. He has contributed to numerous publications and websites, providing his readers with insightful and informative content.