When dealing with multiple file downloads, manually entering URLs is tedious and prone to error. wget simplifies this by reading a list of URLs from a file and retrieving them in batch.
This method streamlines downloading large sets of files, enabling automation and consistency. Once the list is prepared, wget handles each URL in sequence, logging successes and failures for reference.
Batch downloading also makes bandwidth management and retries easier. If a download fails, wget can retry it or resume the partial file without manual intervention.
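Once the list file exists, the whole workflow reduces to a single command. As a preview of the steps below, this invocation reads download-list.txt and writes wget's log to a file via -o (both file names are illustrative):
$ wget --input-file=download-list.txt -o download.log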
Steps to download files using Wget from a list of URLs:
- Create a text file containing each URL on a separate line.
$ nano download-list.txt
https://example.com/file1.jpg
https://example.com/file2.zip
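If the URLs follow a predictable pattern, the list can also be generated rather than typed by hand. This is a minimal sketch assuming hypothetical sequentially numbered files on example.com:
$ for i in 1 2 3; do echo "https://example.com/file$i.jpg"; done > download-list.txt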
- Use the --input-file option to feed the URL list to wget.
$ wget --input-file=download-list.txt
--2024-12-10 10:06:00--  https://example.com/file1.jpg
Saving to: ‘file1.jpg’
wget processes each URL in turn and downloads the corresponding files.
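The --input-file option also has the short form -i, and it combines with wget's other options as usual. For example, --directory-prefix saves everything into a chosen directory and --wait pauses between requests; the directory name and delay below are arbitrary examples:
$ wget --input-file=download-list.txt --directory-prefix=downloads --wait=2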
- Check the downloaded files to confirm they were retrieved successfully.
$ ls
file1.jpg  file2.zip
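Failures are easier to spot in a saved log than in scrolling terminal output. Assuming the run was captured with -o download.log as in the earlier preview, failed URLs show up as ERROR lines:
$ grep 'ERROR' download.log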
- If a download fails or is interrupted, rerun the same command with --continue so wget resumes partially downloaded files instead of starting over.
$ wget --continue --input-file=download-list.txt
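Note that --continue only resumes partial files; how persistently wget retries each URL is governed by --tries and --waitretry. The values below are arbitrary examples:
$ wget --continue --tries=3 --waitretry=5 --input-file=download-list.txt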

Author: Mohd Shakir Zakaria
Mohd Shakir Zakaria is a cloud architect with deep roots in software development and open-source advocacy. Certified in AWS, Red Hat, VMware, ITIL, and Linux, he specializes in designing and managing robust cloud and on-premises infrastructures.
