Repeatedly downloading the same archives or directory trees wastes bandwidth and extends maintenance windows on slow or metered connections. Using conditional downloads with wget keeps local mirrors current while avoiding unnecessary transfers in scheduled jobs, scripts, and manual update runs.
In timestamping mode, wget compares local file modification times with the server-provided Last-Modified header and sends conditional HTTP requests carrying an If-Modified-Since header to check whether content has changed. Servers that support this pattern reply with status code 304 Not Modified when nothing is new, allowing wget to skip the payload entirely, and return a normal 2xx status with the data when updates exist.
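To observe this exchange directly, the --server-response option makes wget print the headers returned by the server; a 304 reply only appears on a repeat run against a server that honors If-Modified-Since, and recent wget builds send that header by default when timestamping. The URL is the same example target used in the steps below.

$ wget --timestamping --server-response https://example.com/files/archive.tar.gz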
Reliable conditional transfers depend on accurate clocks, servers that expose modification metadata, and predictable URLs; dynamic endpoints or misconfigured time zones can still trigger full downloads or cause updates to be missed. The commands below assume a Linux shell with wget already installed and focus on static files or simple directory trees.
Steps to download only newer files with wget:
- Show the installed wget version to confirm support for timestamped downloads.
$ wget --version
GNU Wget 1.21.4 built on linux-gnu
##### snipped #####
Option --timestamping (short form -N) enables conditional downloads based on modification time in standard wget builds.
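The short form behaves identically and keeps one-liners compact; the command below is equivalent to the --timestamping form used in the later steps.

$ wget -N https://example.com/files/archive.tar.gz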
- Change into the directory that stores the local copies of the remote files.
$ mkdir -p ~/mirrors/example
$ cd ~/mirrors/example
$ pwd
/home/example/mirrors/example
Keeping all mirrored content under a dedicated path simplifies periodic synchronization and cleanup.
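As a sketch of periodic synchronization, a cron entry along the following lines reruns the timestamped download from the mirror directory; the schedule, path, and URL are hypothetical and should be adapted to the actual mirror.

# Hypothetical crontab entry: refresh the mirror every night at 02:30
30 2 * * * cd /home/example/mirrors/example && wget --timestamping --quiet https://example.com/files/archive.tar.gz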
- Download a single remote file with timestamping enabled so that only newer versions replace the local copy.
$ wget --timestamping https://example.com/files/archive.tar.gz
--2025-12-08 10:15:12--  https://example.com/files/archive.tar.gz
Resolving example.com (example.com)... 93.184.216.34
Connecting to example.com (example.com)|93.184.216.34|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 104857600 (100M) [application/x-tar]
Saving to: ‘archive.tar.gz’

archive.tar.gz      100%[===================>] 100.00M  12.3MB/s    in 8.1s

2025-12-08 10:15:20 (12.3 MB/s) - ‘archive.tar.gz’ saved [104857600/104857600]
On subsequent runs with an unchanged file, the server can reply with 304 Not Modified and no data is transferred.
Incorrect server timestamps or a skewed system clock can cause outdated content to be treated as current, so timestamping suits static assets more than frequently changing dynamic pages.
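Before suspecting the server, it helps to confirm the local clock; on systemd-based distributions, timedatectl reports whether the system clock is synchronized, and date -u prints the current UTC time for a quick manual comparison.

$ timedatectl
$ date -u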
- Mirror a remote directory tree while skipping unchanged files by combining recursion, parent restriction, and timestamping.
$ wget --recursive --no-parent --timestamping https://example.com/repo/
--2025-12-08 10:20:01--  https://example.com/repo/
##### snipped #####
No newer files found.
Option --recursive walks subdirectories, while --no-parent prevents ascending above the starting URL and keeps the mirror limited to the intended subtree.
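Further recursion controls can keep the mirror focused; as an illustration, the variation below limits the crawl to two directory levels and accepts only tarballs, where the depth and the *.tar.gz pattern are example choices rather than requirements.

$ wget --recursive --no-parent --timestamping --level=2 --accept '*.tar.gz' https://example.com/repo/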
- Resume an interrupted large transfer safely by pairing timestamping with the continue option so existing data is reused.
$ wget --timestamping --continue https://example.com/images/backup.img
--2025-12-08 10:30:44--  https://example.com/images/backup.img
Resuming download of backup.img.
Checking timestamp on server...
Remote file is newer than local file, resuming download.
##### snipped #####
Combining --timestamping with --continue avoids re-downloading completed segments while still updating to the latest version when the remote file changes.
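On unreliable links, retry options pair well with this combination; the values below are illustrative and can be tuned to the connection.

$ wget --timestamping --continue --tries=5 --waitretry=30 https://example.com/images/backup.img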
- Verify conditional behavior by rerunning a timestamped download and confirming that the output shows a skip or 304 Not Modified response when nothing changed.
$ wget --timestamping https://example.com/files/archive.tar.gz
--2025-12-08 10:35:02--  https://example.com/files/archive.tar.gz
HTTP request sent, awaiting response... 304 Not Modified
File ‘archive.tar.gz’ not modified since 2025-12-08 10:15:20.
Successful conditional downloading typically produces very short runs with 304 Not Modified status or messages such as No newer files found for already up-to-date content.
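For scripted runs where parsing wget output is inconvenient, a small wrapper can compare the file's modification time before and after the download to report whether anything changed. This is a minimal sketch that assumes GNU stat and reuses the placeholder URL and filename from the steps above.

#!/bin/sh
# Hypothetical helper: rerun a timestamped download and report whether the local copy changed
url=https://example.com/files/archive.tar.gz
file=archive.tar.gz
before=$(stat -c %Y "$file" 2>/dev/null || echo 0)
wget --timestamping "$url"
after=$(stat -c %Y "$file" 2>/dev/null || echo 0)
if [ "$after" -gt "$before" ]; then
    echo "$file was updated"
else
    echo "$file is already up to date"
fi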
