When dealing with multiple file downloads, manually entering URLs is tedious and prone to error. wget simplifies this by reading a list of URLs from a file and retrieving them in batch.

This method streamlines downloading large sets of files, enabling automation and consistency. Once the list is prepared, wget handles each URL in sequence, logging successes and failures for reference.

Batch downloading also helps manage bandwidth and retries. If a download fails, wget can retry or resume from the list without manual intervention.

Steps to download files using Wget from a list of URLs:

  1. Create a text file containing each URL on a separate line.
    $ nano download-list.txt
    https://example.com/file1.jpg
    https://example.com/file2.zip

    Each line must contain only a URL; wget does not treat lines starting with # as comments.
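The list can also be created non-interactively, which is handy in scripts. A minimal sketch using a heredoc (same placeholder URLs as above):

```shell
# Write the URL list without an editor; one URL per line
cat > download-list.txt <<'EOF'
https://example.com/file1.jpg
https://example.com/file2.zip
EOF

# Confirm the list has the expected number of non-empty lines
grep -c . download-list.txt
```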
  2. Use the --input-file option to feed the URL list to wget.
    $ wget --input-file=download-list.txt
    --2024-12-10 10:06:00-- https://example.com/file1.jpg
    Saving to: ‘file1.jpg’

    wget processes each URL in turn and downloads the corresponding files.

  3. Check the downloaded files to confirm they were retrieved successfully.
    $ ls
    file1.jpg  file2.zip
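Beyond eyeballing the ls output, each expected file can be checked against the list itself. A small sketch (it rebuilds the sample list so the snippet is self-contained, and derives filenames with basename, which assumes each URL ends in the filename):

```shell
# Recreate the sample list from step 1 so this check runs standalone
printf '%s\n' \
  'https://example.com/file1.jpg' \
  'https://example.com/file2.zip' > download-list.txt

# Report OK or MISSING for each file named in the list
while read -r url; do
  file=$(basename "$url")
  if [ -e "$file" ]; then
    echo "OK       $file"
  else
    echo "MISSING  $file"
  fi
done < download-list.txt
```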
  4. If a download was interrupted, rerun the same command with --continue to resume partially downloaded files instead of starting over.
    $ wget --continue --input-file=download-list.txt
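For recurring batch jobs, retry and resume behavior can also be set once in ~/.wgetrc instead of on every command line. A sketch with illustrative values:

```
# ~/.wgetrc — defaults applied to every wget invocation
# Retry each URL up to 3 times
tries = 3
# Wait up to 5 seconds between retries of a failed download
waitretry = 5
# Resume partial downloads by default
continue = on
```

With these defaults in place, plain `wget --input-file=download-list.txt` picks up the retry settings automatically.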