Limiting download speed for wget keeps large transfers from saturating shared links or desktop connections. Throttling background downloads keeps shells, video calls, and web browsing responsive while archives, images, or backups move across the network.

The wget client implements local throttling through the --limit-rate option, which applies an average per-connection cap expressed in bytes per second. While a cap is active, wget periodically sleeps during the transfer to keep the observed rate near the configured value, and it interprets suffixes such as k, m, and g as multiples of 1024.
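
As a minimal sketch, assuming the cap should apply to every wget invocation for a user, the same limit can also be set as a default through the limit_rate command in ~/.wgetrc, which mirrors the command-line option (the 250k value below is only an example):

    $ echo 'limit_rate = 250k' >> ~/.wgetrc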

Rate limiting is approximate: actual throughput also depends on protocol overhead, server behavior, and competing traffic on the same link. Aggressively low caps can cause HTTP timeouts or push jobs beyond maintenance windows, while overly generous limits may still allow brief bursts that stress upstream routers or metered connections. Selecting conservative values and verifying actual throughput on the target Linux system avoids surprises in automation and scheduled tasks.
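
As a rough sizing check before committing to a value (a sketch; the 2 GiB file size and 300 KiB/s cap are arbitrary example figures), dividing the transfer size by the cap estimates how long a throttled job will run:

    $ echo "$(( 2 * 1024 * 1024 / 300 )) seconds to move a 2 GiB file at 300 KiB/s"
    6990 seconds to move a 2 GiB file at 300 KiB/s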

Steps to throttle download speed with wget:

  1. Open a terminal on the target Linux system with access to the network on which throttling is required.
  2. Decide on a maximum download rate in kilobytes or megabytes per second that leaves sufficient bandwidth for interactive workloads.
  3. Start a download with a conservative kilobyte-per-second cap using the --limit-rate option.
    $ wget --limit-rate=100k https://downloads.example.com/files/largefile.zip
    --2025-12-28 12:12:10--  https://downloads.example.com/files/largefile.zip
    Resolving downloads.example.com (downloads.example.com)... 172.18.0.2
    Connecting to downloads.example.com (downloads.example.com)|172.18.0.2|:443... connected.
    HTTP request sent, awaiting response... 200 OK
    Length: 2097152 (2.0M) [application/zip]
    Saving to: 'largefile.zip'
    
         0K .......... .......... .......... .......... ..........  2% 77.6K 26s
    ##### snipped #####

    The --limit-rate value applies per connection, so multiple concurrent wget processes each receive their own cap.
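
    For example (a sketch reusing the example host above; a.zip and b.zip are placeholder file names), two 100k caps running side by side can together draw roughly 200 KiB/s from the link:
    $ wget --limit-rate=100k https://downloads.example.com/files/a.zip &
    $ wget --limit-rate=100k https://downloads.example.com/files/b.zip &
    $ wait    # each background transfer holds its own 100 KiB/s cap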

  4. Apply a higher megabyte-per-second cap for faster links while still preventing full link saturation.
    $ wget --limit-rate=2m https://downloads.example.com/files/tool.tar.gz
    --2025-12-28 12:12:30--  https://downloads.example.com/files/tool.tar.gz
    Resolving downloads.example.com (downloads.example.com)... 172.18.0.2
    Connecting to downloads.example.com (downloads.example.com)|172.18.0.2|:443... connected.
    HTTP request sent, awaiting response... 200 OK
    Length: 3145728 (3.0M) [application/gzip]
    Saving to: 'tool.tar.gz'
    
         0K .......... .......... .......... .......... ..........  1%  498M 0s
    ##### snipped #####

    Suffixes k, m, and g are interpreted as kibibytes, mebibytes, and gibibytes per second, respectively, using a base of 1024.
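
    As a quick arithmetic check (a sketch; a plain number passes the rate in bytes per second), the 2m cap above corresponds to 2 x 1024 x 1024 bytes per second:
    $ echo $(( 2 * 1024 * 1024 ))
    2097152
    $ wget --limit-rate=2097152 https://downloads.example.com/files/tool.tar.gz   # equivalent to --limit-rate=2m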

  5. Combine throttling with recursive or scripted downloads to keep large mirroring jobs from monopolizing bandwidth.
    $ wget --limit-rate=300k --recursive --no-parent https://www.example.com/repo/
    --2025-12-28 12:12:32--  https://www.example.com/repo/
    Resolving www.example.com (www.example.com)... 172.18.0.2
    Connecting to www.example.com (www.example.com)|172.18.0.2|:443... connected.
    HTTP request sent, awaiting response... 200 OK
    ##### snipped #####

    Very low limits for recursive transfers can stretch jobs across maintenance windows or quiet periods and increase the chance of remote timeouts or interrupted sessions.
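
    To make long throttled jobs more resilient (a sketch; the retry count and timeout are arbitrary example values), the cap can be combined with wget's retry, resume, and timeout options:
    $ wget --limit-rate=300k --tries=5 --continue --timeout=60 \
          --recursive --no-parent https://www.example.com/repo/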

  6. Monitor runtime bandwidth usage to confirm that effective throughput remains near the configured cap.
    $ sudo nethogs -t -c 2
    Adding local address: 172.18.0.4
    Ethernet link detected
    
    Refreshing:
    unknown TCP/0/0	0	0
    
    Refreshing:
    unknown TCP/0/0	0	0

    Tools such as nethogs, iftop, or interface graphs from monitoring systems help validate that throttled transfers behave as expected.
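
    When neither tool is available, a rough reading can be taken from the kernel's interface counters (a sketch assuming the transfer arrives on eth0; substitute the correct interface name):
    $ IFACE=eth0
    $ R1=$(cat /sys/class/net/$IFACE/statistics/rx_bytes); sleep 10
    $ R2=$(cat /sys/class/net/$IFACE/statistics/rx_bytes)
    $ echo "$(( (R2 - R1) / 10 / 1024 )) KiB/s received on average over 10 seconds"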

  7. Inspect the final summary line printed by wget to confirm that the reported average download speed does not exceed the configured limit.
    $ wget --limit-rate=100k https://downloads.example.com/files/largefile.zip
    ##### snipped #####
    2025-12-28 12:12:53 (100 KB/s) - 'largefile.zip' saved [2097152/2097152]

    An average rate at or below the configured limit indicates successful throttling, especially when cross-checked against external network monitoring.
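
    For unattended jobs (a sketch; wget.log is an arbitrary file name), the transcript can be written to a file with -o and the closing summary pulled out afterwards:
    $ wget --limit-rate=100k -o wget.log https://downloads.example.com/files/largefile.zip
    $ grep 'saved' wget.log
    2025-12-28 12:12:53 (100 KB/s) - 'largefile.zip' saved [2097152/2097152]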