When downloading multiple files from the internet, rather than fetching them one by one, a more efficient approach is to use a list. Wget, a popular command-line utility, provides this functionality through its native support for reading URLs from a file.
This method ensures that you can manage and maintain a collection of links in a single file, making it easier to initiate multiple downloads with a single command. Whether you're maintaining backups, setting up a local repository, or simply archiving data, downloading from a list with Wget is an invaluable skill.
Using the -i (short for --input-file) option, you can point Wget at a file containing a list of URLs to be downloaded. This file should contain one URL per line.
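For illustration, here is one way such a list might be created; the file name and URLs below are placeholders, not anything Wget requires:

```shell
# Create a URL list, one URL per line. The file name and URLs here
# are placeholder assumptions; substitute your own.
cat > download-list.txt <<'EOF'
https://example.com/files/archive-1.tar.gz
https://example.com/files/archive-2.tar.gz
https://example.com/files/notes.pdf
EOF

wc -l download-list.txt   # 3 download-list.txt
```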
$ cd /path/to/directory
$ wget -i download-list.txt
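Before running the command above, a quick pre-flight check can catch obvious typos in the list. This is a minimal sketch, not a Wget feature; the sample file contents and the grep pattern are illustrative assumptions:

```shell
# Sketch of a pre-flight check for the URL list. The sample file and
# the pattern are illustrative assumptions, not part of Wget itself.
printf '%s\n' \
  'https://example.com/a.iso' \
  'https://example.com/b.iso' > download-list.txt

# Count lines that do not start with http:// or https://;
# grep exits 1 when nothing matches, so "|| true" keeps the count usable.
bad=$(grep -cvE '^https?://' download-list.txt || true)
if [ "$bad" -gt 0 ]; then
  echo "warning: $bad suspicious line(s) in download-list.txt"
else
  echo "list looks OK"
fi
```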
If a download fails, Wget retries it (up to 20 times by default, adjustable with --tries) and then moves on to the next URL, so the rest of the list is still processed. To resume files that were only partially downloaded in an earlier run, add the --continue (-c) flag:
$ wget --continue -i download-list.txt
To avoid saturating your connection, you can cap the download speed with --limit-rate, which accepts k and m suffixes for kilobytes and megabytes per second:
$ wget --limit-rate=200k -i download-list.txt
Be mindful of network usage, especially on shared or limited connections.
If you want to save the downloads to a specific directory, use the -P (--directory-prefix) option followed by the directory path. For instance,
$ wget -i download-list.txt -P /path/to/save/
will download the files to the specified directory, creating it if it does not already exist.