Capturing diagnostic output from wget in a log file keeps a durable record of downloads, errors, and HTTP responses, which simplifies troubleshooting flaky networks and auditing what was retrieved. Persistent logs also make it easier to compare behavior over time, especially for scheduled downloads, batch jobs, and automated mirrors.
The wget client writes progress information, HTTP headers, and status messages to its diagnostic streams while saving the downloaded payload to files or standard output. Options such as output-file and append-output redirect those diagnostic messages into a plain-text log without changing how the actual data is stored, which allows scripts to parse logs separately from the downloaded content.
Log files can grow quickly and may contain sensitive query strings, cookies, or tokens if those are part of the requested URLs or headers. Choosing an appropriate log location, restricting file permissions, and combining logging with quiet modes reduces terminal noise while retaining enough detail for analysis on systems where wget is available.
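As an illustration of that separation, the diagnostic log and the downloaded payload can be kept apart even when the payload is written to standard output; the host and file names below are placeholders:
$ wget --output-file="$HOME/logs/wget.log" -O - https://www.example.com/archive.tar.gz > archive.tar.gz
Here the log receives only wget's messages, while the redirected standard output receives the file contents.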
Related: How to run wget downloads in the background
Related: How to debug wget connections
Steps to log wget output to a file:
- Open a terminal in the user account that runs the downloads.
$ whoami
user
Running wget under a dedicated account simplifies tracking and limits the scope of credentials and files exposed through logging.
- Create a directory under the home folder to store wget log files.
$ mkdir -p ~/logs
Using a dedicated directory such as ~/logs keeps diagnostic files separate from downloaded data and simplifies cleanup or rotation.
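Because the logs may capture sensitive URLs or tokens, restricting the directory to the owning user is a reasonable precaution; this assumes the ~/logs path created above:
$ chmod 700 ~/logs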
- Run wget with the output-file option to log a single download into a fresh file.
$ wget --output-file="$HOME/logs/wget.log" https://www.example.com/archive.tar.gz
The output-file option overwrites any existing file with the same name, so each invocation that uses it starts with a clean log.
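If each run should keep its own log instead of overwriting the previous one, a timestamped file name is one simple workaround; the naming pattern here is only an example:
$ wget --output-file="$HOME/logs/wget-$(date +%Y%m%d-%H%M%S).log" https://www.example.com/archive.tar.gz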
- Use the append-output option when multiple invocations should accumulate entries in the same log file instead of replacing it.
$ wget --append-output="$HOME/logs/wget.log" https://www.example.com/data.tar.gz
Long-running jobs that use append-output can generate large log files and may record sensitive URLs or headers, so log rotation and restricted permissions are important to prevent disk exhaustion and data exposure.
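One way to limit exposure of anything sensitive recorded in the shared log is to make it readable only by its owner; the path matches the log used above:
$ chmod 600 "$HOME/logs/wget.log"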
- Combine logging with non-verbose mode to keep the log concise while still capturing a summary entry for each download.
$ wget --append-output="$HOME/logs/wget.log" --no-verbose https://downloads.example.net/files/largefile.zip
With a log file in place, non-verbose mode keeps the log concise: the progress meter and header details are suppressed, and each download is recorded as a single summary line with its URL, size, and destination file.
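For unattended jobs, logging pairs naturally with background mode; the following sketch combines the background option with append-output, and the URL is a placeholder:
$ wget --background --append-output="$HOME/logs/wget.log" --no-verbose https://downloads.example.net/files/largefile.zip
wget prints the background process ID to the terminal and then writes everything else to the named log.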
- Inspect the log file to confirm that wget activity is being recorded as expected.
$ tail -n 15 "$HOME/logs/wget.log"
   500K .......... .......... .......... .......... ..........  53%  855M 0s
   550K .......... .......... .......... .......... ..........  58%  500M 0s
   600K .......... .......... .......... .......... ..........  63%  911M 0s
   650K .......... .......... .......... .......... ..........  68%  918M 0s
   700K .......... .......... .......... .......... ..........  73%  887M 0s
   750K .......... .......... .......... .......... ..........  78%  897M 0s
   800K .......... .......... .......... .......... ..........  83%  695M 0s
   850K .......... .......... .......... .......... ..........  87%  698M 0s
   900K .......... .......... .......... .......... ..........  92%  950M 0s
   950K .......... .......... .......... .......... ..........  97%  955M 0s
  1000K .......... .......... ....                            100% 1.31G=0.002s

2026-01-10 04:56:06 (639 MB/s) - 'data.tar.gz' saved [1048576/1048576]

2026-01-10 04:56:14 URL:https://downloads.example.net/files/largefile.zip [2097152/2097152] -> "largefile.zip" [1]
Recent timestamps, expected URLs, and byte counts that match completed downloads indicate that logging for wget is working correctly; in verbose mode the log also records HTTP responses such as 200 OK.
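When a log grows long, searching for failure keywords is faster than reading it end to end; the patterns below are only examples and can be adjusted as needed:
$ grep -iE "error|failed|denied" "$HOME/logs/wget.log"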
- Periodically archive or rotate the log file when it grows large to keep storage use under control.
$ du -h "$HOME/logs/wget.log"
4.0K    /home/user/logs/wget.log
Standard tools such as logrotate or a simple policy of truncating logs after inspection help prevent unexpected growth, especially on systems with frequent automated downloads.
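As one possible approach, a small per-user logrotate rule can cap the file's growth; this is a minimal sketch that assumes the log path shown above, and the weekly schedule and four-rotation retention are only examples:
$ cat > ~/logs/wget-logrotate.conf <<'EOF'
/home/user/logs/wget.log {
    weekly
    rotate 4
    compress
    missingok
    notifempty
}
EOF
$ logrotate --state ~/logs/logrotate.state ~/logs/wget-logrotate.conf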
