Running several downloads at the same time with cURL cuts total wall-clock time for artifact mirrors, dataset staging, and bulk file collection, because otherwise-idle network time is filled by overlapping transfers instead of being wasted in a single serial queue.
Parallel mode is enabled with --parallel (short form -Z), which asks cURL to schedule multiple URLs through the libcurl multi interface. Each transfer still needs an output target: either pair a repeated --output flag with each URL on the command line, or list url and output entries inside a file loaded with --config.
Uncapped fan-out is rarely desirable even though cURL defaults --parallel-max to 50 concurrent transfers. Small limits are safer for shared mirrors and rate-limited download endpoints, and some packaged builds still lack newer per-host controls such as --parallel-max-host, so a conservative global cap keeps the workflow portable.
Steps to run parallel downloads with cURL:
- Create a working directory and a dedicated download target before starting the batch.
$ mkdir -p ~/curl-parallel-demo/downloads
$ cd ~/curl-parallel-demo
Keeping the batch definition file and the downloaded files in separate paths makes reruns easier when only one object fails or needs to be replaced.
- Create a cURL config file that maps each URL to an explicit output path.
$ cat > urls.txt << 'EOF'
url = "https://downloads.example.net/toolkit/2026.03/release-manifest.json"
output = "downloads/release-manifest.json"
url = "https://downloads.example.net/toolkit/2026.03/linux/amd64/toolkit-linux-amd64.tar.xz"
output = "downloads/toolkit-linux-amd64.tar.xz"
url = "https://downloads.example.net/toolkit/2026.03/linux/arm64/toolkit-linux-arm64.tar.xz"
output = "downloads/toolkit-linux-arm64.tar.xz"
EOF
Config files accept long option names without the leading dashes, so url and output lines behave like repeated --url and --output options on the command line.
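Because config lines mirror command-line options one-to-one, a batch file can also be generated from a plain list of URLs rather than written by hand. A minimal sketch, assuming a urls.list input file and output names taken from each URL's last path segment (both choices are ours, not part of cURL):

```shell
#!/bin/sh
# Build a curl config file from a plain URL list: one url/output pair
# per input line, with the output filename taken from the URL's last
# path segment and placed under downloads/.
printf '%s\n' \
  'https://downloads.example.net/toolkit/2026.03/release-manifest.json' \
  'https://downloads.example.net/toolkit/2026.03/linux/amd64/toolkit-linux-amd64.tar.xz' \
  'https://downloads.example.net/toolkit/2026.03/linux/arm64/toolkit-linux-arm64.tar.xz' \
  > urls.list

: > urls.txt
while IFS= read -r url; do
  # ${url##*/} strips everything up to the final slash (the basename)
  printf 'url = "%s"\noutput = "downloads/%s"\n' "$url" "${url##*/}" >> urls.txt
done < urls.list
```

This keeps the URL list as the single source of truth, so adding or removing an object means editing one line instead of two.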
- Start the batch with --parallel, set a low concurrency cap, and surface transfer failures immediately.
$ curl --parallel --parallel-max 3 --parallel-immediate --fail --show-error --config urls.txt
DL% UL%  Dled  Uled Xfers Live    Total   Current     Left Speed
 --  --     0     0     3    3 --:--:-- --:--:-- --:--:--     0
100  --  832k     0     3    0 --:--:-- --:--:-- --:--:-- 45.1M
Keep --parallel-max close to the actual number of files needed for the batch instead of leaving it at the default ceiling. Excess concurrency can trigger rate limiting, connection resets, or partial download churn on shared endpoints.
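When a shared endpoint does throttle or reset connections, cURL's retry options can absorb transient failures without restarting the whole batch, and because config files take long option names without dashes, the settings can live in the batch file itself. A sketch of additions to urls.txt; the counts and delay here are illustrative values, not recommendations from cURL:

```shell
# Additions to urls.txt: retry transient failures per transfer
# instead of restarting the whole batch by hand.
retry = 3
retry-delay = 2
```

These map to the --retry and --retry-delay command-line options and apply to every transfer defined in the file.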
- Use repeated --output options for short ad hoc batches that do not justify a separate config file.
$ curl --parallel --parallel-max 3 --parallel-immediate \
    --output release-manifest.json https://downloads.example.net/toolkit/2026.03/release-manifest.json \
    --output toolkit-linux-amd64.tar.xz https://downloads.example.net/toolkit/2026.03/linux/amd64/toolkit-linux-amd64.tar.xz \
    --output toolkit-linux-arm64.tar.xz https://downloads.example.net/toolkit/2026.03/linux/arm64/toolkit-linux-arm64.tar.xz
Each --output is paired positionally with one URL, so keep the filename immediately beside the matching transfer to avoid saving data into the wrong target.
- Verify that the expected files exist and have plausible non-zero sizes before treating the batch as complete.
$ ls -lh downloads/release-manifest.json downloads/toolkit-linux-amd64.tar.xz downloads/toolkit-linux-arm64.tar.xz
-rw-r--r-- 1 user user  64K Mar 29 09:32 downloads/release-manifest.json
-rw-r--r-- 1 user user 320K Mar 29 09:32 downloads/toolkit-linux-amd64.tar.xz
-rw-r--r-- 1 user user 448K Mar 29 09:32 downloads/toolkit-linux-arm64.tar.xz
Missing files, zero-byte files, or obviously undersized files usually point to failed transfers or unexpected server responses. Delete only the bad targets and rerun the affected URLs instead of restarting the full batch.
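The missing-or-zero-byte check can be scripted so only the bad targets are identified for a rerun. A minimal sketch; the check_targets name is ours, and the paths are assumed to match the output entries in urls.txt:

```shell
#!/bin/sh
# check_targets prints each listed file that is missing or zero-byte;
# the shell's -s test is true only for files that exist with size
# greater than zero, so both failure modes are caught at once.
check_targets() {
  for f in "$@"; do
    [ -s "$f" ] || echo "rerun: $f"
  done
}

check_targets downloads/release-manifest.json \
              downloads/toolkit-linux-amd64.tar.xz \
              downloads/toolkit-linux-arm64.tar.xz
```

Any path it prints can be deleted and fed back to cURL on its own, leaving the already-good files in place.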
Mohd Shakir Zakaria is a cloud architect with deep roots in software development and open-source advocacy. Certified in AWS, Red Hat, VMware, ITIL, and Linux, he specializes in designing and managing robust cloud and on-premises infrastructures.
