Long-running transfers over flaky links often stall or fail just before completion, turning multi-gigabyte downloads into repetitive busywork. Automatic retry behaviour in wget allows large images, backups, and archives to survive short outages, DNS hiccups, or overloaded mirrors without manual restarts.
As a non-interactive downloader, wget uses command-line options to control retry limits, wait intervals, and resume behaviour across connection attempts. The --tries option bounds how many failures are tolerated, --waitretry inserts a delay between retries, and --continue resumes partial files from the last written byte rather than discarding existing data.
Retry configuration influences both reliability and upstream load, because excessive retries or very short wait intervals can overwhelm mirrors or trigger automated blocking; moderate defaults combined with resume support provide a safer baseline for unattended jobs handling large files.
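For machines that routinely run unattended transfers, these defaults can also be set once in a per-user configuration file instead of being repeated on every command line. A minimal sketch of a ~/.wgetrc, assuming the moderate values used in the steps below:
$ cat ~/.wgetrc
# Bound retries per URL (wget's built-in default is 20)
tries = 10
# Back off between retries of failed downloads, capped at 10 seconds
waitretry = 10
# Always resume partially retrieved files
continue = on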
Steps to retry downloads automatically using wget:
- Open a terminal in the directory intended to store the downloaded file.
$ cd ~/Downloads
- Start a download with an explicit retry limit using --tries=10 so that transient failures trigger a bounded number of additional attempts. wget already retries up to 20 times by default, but an explicit value documents the intent and keeps unattended scripts predictable; fatal errors such as 404 responses are never retried regardless of the limit.
$ wget --tries=10 https://www.example.com/unreliable-file.zip
--2025-12-08 10:15:01--  https://www.example.com/unreliable-file.zip
Resolving www.example.com (www.example.com)... 93.184.216.34
Connecting to www.example.com (www.example.com)|93.184.216.34|:443... connected.
HTTP request sent, awaiting response... 503 Service Unavailable
Retrying.
A value such as --tries=10 usually balances robustness against the risk of wasting time on a host that remains unavailable.
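When every attempt fails, wget gives up and exits with a non-zero status, so unattended scripts can detect the failure instead of assuming success. A minimal sketch; the error message is illustrative:
$ wget --tries=10 https://www.example.com/unreliable-file.zip \
    || echo "download failed after all retries" >&2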
- Add --waitretry=10 so that wget backs off between failed attempts instead of hammering the remote server. wget applies linear backoff, waiting 1 second after the first failure and one second longer after each subsequent failure, up to the 10-second maximum set here.
$ wget --tries=10 --waitretry=10 https://www.example.com/unreliable-file.zip
--2025-12-08 10:17:10--  https://www.example.com/unreliable-file.zip
Resolving www.example.com (www.example.com)... 93.184.216.34
Connecting to www.example.com (www.example.com)|93.184.216.34|:443... connected.
HTTP request sent, awaiting response... 503 Service Unavailable
Retrying in 10 seconds.
Retrying in 10 seconds.
##### snipped #####
Very short retry intervals or high retry counts can resemble abusive traffic patterns and may lead to throttling or IP blacklisting by cautious mirrors.
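Where pacing matters, a longer retry interval can be paired with wget's --limit-rate option, which caps transfer speed and accepts k and m suffixes. An illustrative, deliberately conservative invocation; the specific values are assumptions rather than requirements of any particular mirror:
$ wget --tries=5 --waitretry=30 --limit-rate=2m https://www.example.com/unreliable-file.zip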
- Combine retries with --continue so that partially downloaded files resume rather than restart when a connection breaks.
$ wget --tries=10 --waitretry=10 --continue https://www.example.com/unreliable-file.zip
--2025-12-08 10:20:00--  https://www.example.com/unreliable-file.zip
Length: 1610612736 (1.5G) [application/zip]
Saving to: ‘unreliable-file.zip’

unreliable-file.zip      52%[=========>          ]  838,860,800  5.23MB/s    eta 2m 30s
##### snipped #####
When a matching partial file exists in the current directory, --continue appends new data starting from the last completed byte instead of discarding previous progress.
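Resumption also depends on the server honouring HTTP range requests. One way to check, assuming the mirror responds to header-only probes, is to combine wget's --spider and --server-response options and look for an Accept-Ranges header:
$ wget --spider --server-response https://www.example.com/unreliable-file.zip 2>&1 | grep -i accept-ranges
  Accept-Ranges: bytes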
- Enable indefinite retry behaviour only when appropriate by passing --tries=0 together with a conservative wait interval.
$ wget --tries=0 --waitretry=30 --continue https://www.example.com/unreliable-file.zip
--2025-12-08 10:25:30--  https://www.example.com/unreliable-file.zip
Resolving www.example.com (www.example.com)... 93.184.216.34
Connecting to www.example.com (www.example.com)|93.184.216.34|:443... connected.
HTTP request sent, awaiting response... 503 Service Unavailable
Retrying in 30 seconds.
##### snipped #####
Unlimited retries can loop for hours or days if a host remains offline, so this pattern fits controlled environments with monitoring rather than casual desktop use.
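In those environments the job is typically detached from the terminal and logged for later review. A sketch using nohup together with wget's -o logfile option; the log file name is an assumption:
$ nohup wget --tries=0 --waitretry=30 --continue -o retry-download.log https://www.example.com/unreliable-file.zip &
$ tail -f retry-download.log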
- Confirm that the completed file matches expectations by checking size or verifying a known checksum after the retried transfer finishes.
$ ls -lh unreliable-file.zip
-rw-r--r-- 1 user user 1.5G Dec  8 10:32 unreliable-file.zip
$ sha256sum unreliable-file.zip
d2c7f0b91b4a2b9fa0f4b2c025e1e76a4e6a23b54b55a2a2f4d9c93b33a4b2f3  unreliable-file.zip
A matching hash from the download source or documentation provides stronger assurance of integrity than file size comparisons alone.
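When the publisher ships a checksum file alongside the archive, sha256sum -c performs the comparison automatically. A sketch assuming a hypothetical unreliable-file.zip.sha256 published at the same location:
$ wget https://www.example.com/unreliable-file.zip.sha256
$ sha256sum -c unreliable-file.zip.sha256
unreliable-file.zip: OK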
