Adding delays between automated HTTP downloads reduces pressure on remote servers and greatly lowers the chance of triggering throttling, abuse detection, or temporary bans during scripted transfers. Spreading requests out keeps shared bandwidth usable for interactive workloads and respects resource limits imposed by hosting providers or public mirrors. Controlled pacing is particularly important when mirroring archives, backing up web resources, or crawling APIs at scale.

The wget client implements timing and bandwidth controls directly in the transfer loop so each individual request can be preceded by a configurable pause. Options such as --wait, --random-wait, --limit-rate, and --input-file combine into a sequential downloader that spaces out object fetches and keeps traffic within predictable limits. Settings can be supplied on the command line for ad-hoc jobs or written into configuration files such as /home/user/.wgetrc for consistent behaviour across invocations.
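
Command-line flags and .wgetrc directives are interchangeable, and the -e (--execute) option applies a .wgetrc-style directive to a single run; with random_wait enabled, the pause between requests varies around the base wait value instead of being a fixed interval. A minimal sketch, where list.txt is a hypothetical URL list used only for illustration:

    $ # list.txt is a hypothetical placeholder for a file of target URLs
    $ wget -e wait=5 -e random_wait=on --limit-rate=200k --input-file=list.txt --directory-prefix=downloads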

Misconfigured delays still risk overwhelming services if wait times are too small or if site-specific usage policies are ignored, and randomised timing does not make scripted access invisible to rate-limiting systems. The commands below assume Ubuntu with wget already installed and a standard bash shell, but the syntax is identical on most other Linux distributions. Before running any long-lived automation, robots.txt directives, published API limits, and acceptable use policies deserve careful review.
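
One quick check before a batch run is to read the target site's robots.txt directly; example.org below is a documentation placeholder that stands in for the real host.

    $ # Print the crawl rules to the terminal; substitute the real host
    $ wget --quiet --output-document=- https://example.org/robots.txt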

Steps to add waits and random delays for wget:

  1. Open a terminal on Ubuntu where wget is available in the PATH.
    $ wget --version
    GNU Wget 1.21.2 built on linux-gnu.
    Built by Debian.
    ##### snipped #####

    wget versions 1.12 and later understand the --wait and --random-wait options used for paced downloads.
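
    The option list printed by wget --help offers another quick check that the installed build supports the pacing flags; exact help wording varies between versions.
    $ wget --help | grep -E -- '--(wait|random-wait|limit-rate)'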

  2. Create a plain text file that contains one target URL per line in the desired download order.
    $ cat > urls.txt << 'EOF'
    https://example.org/file-01.iso
    https://example.org/file-02.iso
    https://example.org/file-03.iso
    EOF

    A URL list combined with --input-file ensures delays apply between individual objects rather than only within a single large transfer.
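
    For longer sequences the list can be generated rather than typed; the numbered URL pattern below is an assumption and should be adjusted to the real mirror layout.
    $ # Writes the file-01.iso through file-03.iso entries into urls.txt
    $ printf 'https://example.org/file-%02d.iso\n' $(seq 1 3) > urls.txt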

  3. Start a throttled batch download that applies a fixed wait and random jitter between each HTTP request.
    $ wget --wait=10 --random-wait --limit-rate=200k --input-file=urls.txt --directory-prefix=downloads
    --2025-12-08 10:00:00--  https://example.org/file-01.iso
    Resolving example.org (example.org)... 93.184.216.34
    Connecting to example.org (example.org)|93.184.216.34|:443... connected.
    HTTP request sent, awaiting response... 200 OK
    Length: 52428800 (50M) [application/octet-stream]
    Saving to: ‘downloads/file-01.iso’
     
    downloads/file-01.iso 100%[===================>]  50.0M   205KB/s    in 4m10s
     
    2025-12-08 10:04:10 (205 KB/s) - ‘downloads/file-01.iso’ saved [52428800/52428800]
     
    Sleeping 13 seconds before next request.
    ##### snipped #####

    Very small wait values, high concurrency, or omission of --limit-rate can still overload a remote service, and repeated heavy usage may lead to rate limiting or account suspension despite randomised delays.
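
    A more conservative sketch layers retry pacing and resume support on top of the basic wait; the values shown are illustrative rather than a recommendation for any particular site.
    $ # Longer pauses, capped retries with backoff, and resumable transfers
    $ wget --wait=30 --random-wait --limit-rate=100k --waitretry=60 --tries=3 --continue --input-file=urls.txt --directory-prefix=downloads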

  4. Persist default wait, random delay, and rate-limit values in the per-user configuration file /home/user/.wgetrc.
    $ cat >> ~/.wgetrc << 'EOF'
    wait=10
    random_wait=on
    limit_rate=200k
    EOF

    Entries such as wait, random_wait, and limit_rate in /home/user/.wgetrc apply automatically for that account, while similar directives in /etc/wgetrc affect all users on the system.
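
    The same directives can be appended system-wide in /etc/wgetrc, which requires root privileges; the sketch below reuses the values from this guide.
    $ sudo tee -a /etc/wgetrc > /dev/null << 'EOF'
    wait=10
    random_wait=on
    limit_rate=200k
    EOF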

  5. Run a short verification download and confirm that sleep messages and capped transfer speeds appear in the output.
    $ wget --input-file=urls.txt --directory-prefix=downloads-verify
    --2025-12-08 11:00:00--  https://example.org/file-01.iso
    Resolving example.org (example.org)... 93.184.216.34
    Connecting to example.org (example.org)|93.184.216.34|:443... connected.
    HTTP request sent, awaiting response... 200 OK
    Length: 1048576 (1.0M) [application/octet-stream]
    Saving to: ‘downloads-verify/file-01.iso’
     
    downloads-verify/file-01.iso 100%[===================>]   1.0M   210KB/s    in 4.8s
     
    2025-12-08 11:00:05 (210 KB/s) - ‘downloads-verify/file-01.iso’ saved [1048576/1048576]
     
    Sleeping 9 seconds before next request.
    ##### snipped #####

    Effective pacing is indicated by "Sleeping N seconds before next request" lines between objects and by throughput values close to the configured limit_rate across repeated runs with the same URL list.
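
    Capturing the transcript in a log file makes the pauses easier to inspect afterwards; verify.log is an arbitrary name, and the grep assumes the sleep messages appear as in the sample output above.
    $ wget --output-file=verify.log --input-file=urls.txt --directory-prefix=downloads-verify
    $ grep 'Sleeping' verify.log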
