Network interruptions, server issues, or system restarts can abruptly end long downloads. Restarting from scratch wastes time and bandwidth. wget addresses this by allowing partial downloads to be resumed if the server supports byte-range requests.
By continuing from where it left off, wget efficiently finishes large file downloads without redundant data transfer. This feature is especially valuable for slow or unstable connections, and for retrieving sizable files.
Before resuming, ensure the partial file is present and that the server supports resuming. If supported, wget will append the remaining portion, resulting in a complete, intact file.
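One way to check for resume support before retrying is to inspect the server's response headers: a server that can resume advertises `Accept-Ranges: bytes` (and answers a `Range` request with `206 Partial Content`). The sketch below shows the check against a canned response so the logic is visible offline; the URL in the comment is the article's placeholder, not a real endpoint.

```shell
# Canned HTTP response headers, standing in for a real server reply.
response='HTTP/1.1 200 OK
Accept-Ranges: bytes
Content-Length: 1048576'

# If the Accept-Ranges header lists "bytes", byte-range resume should work.
if printf '%s\n' "$response" | grep -qi '^accept-ranges: bytes'; then
  echo "server supports resume"
fi

# Live equivalent (assumed URL):
#   curl -sI https://example.com/large-file.zip | grep -i accept-ranges
```

If the header is absent or reads `Accept-Ranges: none`, wget -c will have to restart the transfer from the beginning.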
Steps to resume an interrupted download using wget:
- Navigate to the directory containing the partially downloaded file.
$ cd ~/Downloads
- Use the --continue or -c option to resume the download.
$ wget --continue https://example.com/large-file.zip
--2024-12-10 10:05:00--  https://example.com/large-file.zip
HTTP request sent, awaiting response... 206 Partial Content
The --continue option (short form -c) instructs wget to resume from the last byte of the partial file instead of starting over; the 206 Partial Content response confirms the server accepted the range request.
- Monitor the progress until the download completes.
... large-file.zip 100%[========>] 976M 12MB/s in 41s
- Verify the file integrity after completion, if a checksum is provided.
$ sha256sum large-file.zip
Comparing the output against the checksum published by the file's provider confirms the resumed download is neither corrupted nor truncated.
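The steps above can be sketched offline: the snippet below simulates an interrupted transfer by truncating a file, appends the missing tail (which is what wget -c fetches over the network), and then verifies the result matches the original byte for byte. The filenames are illustrative, not part of wget.

```shell
set -e
# Create a 1000-byte "remote" file to stand in for large-file.zip.
printf 'A%.0s' $(seq 1 1000) > original.bin

# Simulate an interrupted download: only the first 400 bytes arrived.
head -c 400 original.bin > partial.bin

# What a resumed transfer does: append the remaining bytes (401 onward).
tail -c +401 original.bin >> partial.bin

# The reassembled file must be identical to the original.
cmp original.bin partial.bin && echo "resume OK"
sha256sum original.bin partial.bin
```

The two sha256sum lines print the same digest, which is exactly the guarantee a successful `wget -c` plus checksum verification gives you for a real download.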

Mohd Shakir Zakaria is a cloud architect with deep roots in software development and open-source advocacy. Certified in AWS, Red Hat, VMware, ITIL, and Linux, he specializes in designing and managing robust cloud and on-premises infrastructures.