Limiting bandwidth for cURL transfers avoids saturating shared links, simulates slower connections, and enforces per-job rate caps. Constrained transfers support realistic performance tests and reduce the risk of overwhelming remote services during bulk downloads or scripted workloads.

The --limit-rate option throttles the average transfer speed across the lifetime of a request. Values are specified in bytes per second, optionally with suffixes such as k, m, or g, and the limiter applies to both download and upload directions for the active transfer handled by the current cURL process. Because the limit is enforced as an average, short bursts above the cap can occur before cURL pauses to compensate.

Rate limiting in cURL remains purely client side and does not replace server-side quotas, traffic shaping, or quality-of-service rules. Very small limits can trigger server timeouts or job overruns, while very large limits behave similarly to unconstrained transfers, so choosing realistic values and confirming effective throughput is essential.

Steps to limit bandwidth in cURL:

  1. Check the installed cURL version and supported protocols.
    $ curl --version
    curl 8.5.0 (aarch64-unknown-linux-gnu) libcurl/8.5.0 OpenSSL/3.0.13 zlib/1.3 brotli/1.1.0 zstd/1.5.5 libidn2/2.3.7 libpsl/0.21.2 (+libidn2/2.3.7) libssh/0.10.6/openssl/zlib nghttp2/1.59.0 librtmp/2.3 OpenLDAP/2.6.7
    Release-Date: 2023-12-06, security patched: 8.5.0-2ubuntu10.6
    Protocols: dict file ftp ftps gopher gophers http https imap imaps ldap ldaps mqtt pop3 pop3s rtmp rtsp scp sftp smb smbs smtp smtps telnet tftp
    Features: alt-svc AsynchDNS brotli GSS-API HSTS HTTP2 HTTPS-proxy IDN IPv6 Kerberos Largefile libz NTLM PSL SPNEGO SSL threadsafe TLS-SRP UnixSockets zstd

    Any reasonably recent cURL build supports --limit-rate for HTTP, FTP, and other common protocols.
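If a script needs to gate on the installed version before relying on --limit-rate, the version number can be parsed from the first line of curl --version. A minimal sketch; the sample line below is hard-coded from the output above rather than captured from a live invocation:

```shell
# Parse the version number from the first line of `curl --version`.
# The sample line stands in for live output.
line='curl 8.5.0 (aarch64-unknown-linux-gnu) libcurl/8.5.0 OpenSSL/3.0.13'
ver=$(printf '%s\n' "$line" | awk '{print $2}')
echo "$ver"    # 8.5.0
```

In a live script, replace the hard-coded line with line=$(curl --version | head -n 1).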

  2. Download a file with a simple bandwidth cap using --limit-rate.
    $ curl --limit-rate 100k "http://downloads.example.net/large-file.bin" --output large-file.bin
      % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                     Dload  Upload   Total   Spent    Left  Speed
      0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
     12 1024k   12  124k    0     0   123k      0  0:00:08  0:00:01  0:00:07  123k
    ##### snipped #####
    100 1024k  100 1024k    0     0   110k      0  0:00:09  0:00:09 --:--:--  124k

    The 100k value caps the average download rate at roughly 100 KiB/s (102400 bytes per second); cURL interprets the value in bytes per second, not bits.
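Because cURL expects bytes per second while link speeds are usually quoted in bits per second, a small conversion helper avoids off-by-8 mistakes. A sketch, assuming 1 kilobit means 1000 bits; kbit_to_rate is a hypothetical helper name, and the URL is a placeholder:

```shell
# Convert a link speed in kilobits per second to the bytes-per-second
# value that --limit-rate expects (1 kilobit = 1000 bits, 8 bits = 1 byte).
kbit_to_rate() {
  echo "$(( $1 * 1000 / 8 ))"
}

kbit_to_rate 1024    # a nominal 1024 kbit/s line -> 128000 bytes/s
# curl --limit-rate "$(kbit_to_rate 1024)" -O "http://downloads.example.net/large-file.bin"
```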

  3. Apply the same limiter to an upload using --upload-file and optional credentials.
    $ curl --limit-rate 50k --upload-file "./report.tar.gz" "ftp://ftp.example.net/report.tar.gz" --user user:password
      % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                     Dload  Upload   Total   Spent    Left  Speed
      0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
     12  512k    0     0   12 65536      0  62545  0:00:08  0:00:01  0:00:07 62594
    ##### snipped #####
    100  512k    0     0  100  512k      0  58386  0:00:08  0:00:08 --:--:-- 63673

    The limiter applies to the active transfer direction, so uploads and downloads obey the same --limit-rate value.
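The per-process cap also makes it easy to enforce an overall budget across parallel jobs: divide the budget by the number of concurrent transfers and pass the result to each one. A sketch under that assumption; the budget, job count, and URLs are placeholders:

```shell
# Split a 600 KB/s budget evenly across three parallel transfers so the
# combined rate stays near the overall cap.
BUDGET_K=600
JOBS=3
PER_JOB=$(( BUDGET_K / JOBS ))
echo "${PER_JOB}k"    # per-transfer value passed to --limit-rate
# for url in url1 url2 url3; do
#   curl --limit-rate "${PER_JOB}k" -O "$url" &
# done
# wait
```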

  4. Try different limits and suffixes to model a range of connection speeds.
    $ curl --limit-rate 256k "http://downloads.example.net/video.mp4" --output video.mp4
    $ curl --limit-rate 1m "http://downloads.example.net/archive.zip" --output archive.zip
    $ curl --limit-rate 2g "http://downloads.example.net/dataset.bin" --output dataset.bin

    Suffixes k, m, and g represent kibibytes, mebibytes, and gibibytes per second based on bytes, not bits.
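Since the suffix multipliers are powers of 1024, each suffixed value can be cross-checked against its plain byte count; a quick arithmetic sketch:

```shell
# The suffix multipliers are powers of 1024, so these caps are equivalent
# when written out as plain byte counts.
echo $(( 256 * 1024 ))     # --limit-rate 256k == 262144 bytes/s
echo $(( 1024 * 1024 ))    # --limit-rate 1m   == 1048576 bytes/s
```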

  5. Verify the effective download rate using transfer statistics from --write-out.
    $ curl --limit-rate 200k --silent --output /dev/null --write-out "%{speed_download}\n" "http://downloads.example.net/dataset.bin"
    214713

    Success signals: the reported speed_download value stays close to the configured limit and real transfers feel consistently throttled.
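The check in this step can be scripted by comparing the measured speed_download against the configured cap with a tolerance. A sketch; the measured value is hard-coded from the sample output above rather than taken from a live transfer, and the 10% tolerance is an arbitrary choice:

```shell
# Flag transfers whose measured rate exceeds a 200k cap by more than ~10%.
LIMIT_BYTES=204800    # 200k expressed in bytes per second
measured=214713       # live runs would capture: $(curl --limit-rate 200k --silent \
                      #   --output /dev/null --write-out '%{speed_download}' "$url")
ceiling=$(( LIMIT_BYTES * 110 / 100 ))
if [ "${measured%.*}" -le "$ceiling" ]; then    # strip decimals some builds print
  echo "within limit"
else
  echo "over limit"
fi
```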