Batch download jobs are easier to rerun and audit when every target URL lives in one plain-text file. Feeding that file to wget keeps the retrieval order explicit and avoids rebuilding a long command line every time the batch needs to run again.
GNU Wget reads the list with --input-file (-i) and downloads each entry just as it would a single command-line URL, unless other download controls are added. Pairing --input-file with --directory-prefix keeps the retrieved files together, and the same list can be reused later with --continue, --timestamping, and pacing options once the batch becomes a scheduled job.
The list file is literal. Current GNU Wget treats a line such as "# nightly batch" as a URL rather than a comment, and duplicate remote filenames can still collide in the destination directory, so keep the file to plain URLs and verify the final download set before another job consumes it.
$ cat > download-list.txt <<'EOF'
https://downloads.example.net/releases/2026-04/ops-status-2026-04.csv
https://downloads.example.net/releases/2026-04/platform-assets-2026.04.tar.gz
https://downloads.example.net/releases/2026-04/compliance-handbook-2026.pdf
EOF
Keeping the manifest beside the job makes retries, reviews, and handoffs much easier.
$ wget --input-file=download-list.txt --directory-prefix=downloads
--2026-04-22 09:18:13--  https://downloads.example.net/releases/2026-04/ops-status-2026-04.csv
Resolving downloads.example.net (downloads.example.net)... 203.0.113.50
Connecting to downloads.example.net (downloads.example.net)|203.0.113.50|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 65536 (64K) [text/csv]
Saving to: 'downloads/ops-status-2026-04.csv'
##### snipped #####
--2026-04-22 09:18:14--  https://downloads.example.net/releases/2026-04/platform-assets-2026.04.tar.gz
##### snipped #####
--2026-04-22 09:18:15--  https://downloads.example.net/releases/2026-04/compliance-handbook-2026.pdf
##### snipped #####
FINISHED --2026-04-22 09:18:15--
Total wall clock time: 0.18s
Downloaded: 3 files, 3.2M in 0.12s (26.6 MB/s)
The list order is preserved, so log review and rate-limit troubleshooting stay predictable across reruns.
$ wget --input-file=download-list.txt --directory-prefix=downloads \
      --continue --timestamping --wait=1 --random-wait
--2026-04-22 09:21:10--  https://downloads.example.net/releases/2026-04/ops-status-2026-04.csv
HTTP request sent, awaiting response... 304 Not Modified
File 'downloads/ops-status-2026-04.csv' not modified on server. Omitting download.
##### snipped #####
--continue resumes partially downloaded files, --timestamping skips files the server reports as unchanged, and --random-wait varies each pause between 0.5 and 1.5 times the --wait interval.
$ wget --input-file=download-list.txt \
      --directory-prefix=downloads \
      --continue --timestamping \
      --output-file=wget-batch.log
Use --append-output instead when each run should add to one existing log rather than replacing it.
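For a scheduled job, the usual shape is the same command with --append-output so every run lands in one growing log. A sketch, assuming the manifest and downloads directory from the steps above:

```shell
# Rerun the batch and append this run's log to wget-batch.log instead of
# truncating it; --continue and --timestamping keep reruns incremental.
wget --input-file=download-list.txt \
     --directory-prefix=downloads \
     --continue --timestamping \
     --wait=1 --random-wait \
     --append-output=wget-batch.log
```

Each run then adds a dated block to wget-batch.log, so the log doubles as a history of the batch.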
$ ls -1 downloads
compliance-handbook-2026.pdf
ops-status-2026-04.csv
platform-assets-2026.04.tar.gz
The final directory should show one local file for each URL in download-list.txt.
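That check can be scripted rather than eyeballed. A minimal sketch, assuming the download-list.txt manifest and downloads directory used throughout; expected.txt and actual.txt are scratch files introduced here:

```shell
# Derive the expected local filenames from the manifest, list what actually
# arrived, and diff the two sets; diff exits 0 and prints nothing when every
# URL in download-list.txt has exactly one matching file in downloads/.
sed 's|.*/||' download-list.txt | sort > expected.txt
ls downloads | sort > actual.txt
diff expected.txt actual.txt
```

A nonzero diff exit code is a convenient failure signal for whatever job runner invokes the batch.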