Authenticated downloads are common on private package feeds, internal artifact stores, and license-restricted mirrors where access is tied to specific accounts. Automation still needs access to those resources, and configuring wget for Basic Authentication allows scheduled jobs and scripts to fetch protected files without manual intervention.
On HTTP and HTTPS endpoints, wget implements Basic Authentication by sending the user name and password in the request's Authorization header, protected in transit only when HTTPS provides TLS. The same credential flags drive logins for FTP targets, so a single command-line pattern covers web and FTP servers alike, whether credentials are supplied directly, prompted interactively, or read from configuration files.
Basic Authentication encodes the credentials as an easily decoded base64 string, so TLS is the only protection in transit, and careless handling can leak secrets through shell history, process listings, or world-readable configuration files. Safer patterns rely on interactive prompts, locked-down /home/user/.wgetrc files, and short-lived shell variables instead of hard-coding passwords into long-lived scripts. The procedure below focuses on Linux and favors approaches that minimize credential exposure while still supporting fully unattended downloads.
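The Authorization header carries nothing more than base64 over user:password, which anyone who sees the unencrypted request can reverse. A quick illustration of how trivial the decoding is, using the hypothetical credentials from the examples below:
$ printf 'report-user:S3cretPass' | base64
cmVwb3J0LXVzZXI6UzNjcmV0UGFzcw==
$ printf 'cmVwb3J0LXVzZXI6UzNjcmV0UGFzcw==' | base64 -d
report-user:S3cretPass
The resulting request header reads Authorization: Basic cmVwb3J0LXVzZXI6UzNjcmV0UGFzcw==, which is why both the transport and every place the password is stored matter.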
Steps to authenticate using basic authentication in wget:
- Open a terminal on Linux where wget is installed.
$ wget --version | head -n 2
GNU Wget 1.21.4 built on linux-gnu.
##### snipped #####
- Perform a one-off authenticated HTTPS download by providing --user and --password on the command line.
$ wget --user='report-user' --password='S3cretPass' https://files.example.com/reports/daily-report.csv
--2025-01-15 09:00:00--  https://files.example.com/reports/daily-report.csv
Resolving files.example.com (files.example.com)... 203.0.113.42
Connecting to files.example.com (files.example.com)|203.0.113.42|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 51234 (50K) [text/csv]
Saving to: 'daily-report.csv'
daily-report.csv    100%   50.0K  --.-KB/s    in 0.1s
2025-01-15 09:00:01 (480 KB/s) - 'daily-report.csv' saved [51234/51234]
Passwords passed directly via --password can be exposed through shell history and process listings visible to other users on multi-user systems.
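How visible the credentials are can be checked from any other session on the same host while the download runs; the output below is illustrative, assuming the download from the previous step is still in progress:
$ ps -eo args | grep '[w]get'
wget --user=report-user --password=S3cretPass https://files.example.com/reports/daily-report.csv
The shell also records the full command, password included, in its history file once the session ends.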
- Request the password interactively instead of typing it in clear text by adding --ask-password alongside --user.
$ wget --user='report-user' --ask-password https://files.example.com/reports/daily-report.csv
Password:
--2025-01-15 09:05:00--  https://files.example.com/reports/daily-report.csv
Resolving files.example.com (files.example.com)... 203.0.113.42
Connecting to files.example.com (files.example.com)|203.0.113.42|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 51234 (50K) [text/csv]
Saving to: 'daily-report.csv'
daily-report.csv    100%   50.0K  --.-KB/s    in 0s
2025-01-15 09:05:00 (12.3 MB/s) - 'daily-report.csv' saved [51234/51234]
The --ask-password flag keeps the password out of shell history and avoids embedding it directly in scripts, but it reads from the terminal, so it is not suitable for fully unattended jobs.
- Use the same credential flags with an ftp:// URL when an authenticated FTP server exposes private files.
$ wget --user='backup' --password='BackupPass' ftp://ftp.example.net/private/backups/latest.tar.gz
--2025-01-15 09:10:00--  ftp://ftp.example.net/private/backups/latest.tar.gz
           => 'latest.tar.gz'
Resolving ftp.example.net (ftp.example.net)... 198.51.100.25
Connecting to ftp.example.net (ftp.example.net)|198.51.100.25|:21... connected.
Logging in as backup ... Logged in!
==> SYST ... done.    ==> PWD ... done.
==> TYPE I ... done.  ==> PASV ... done.
==> RETR latest.tar.gz ... done.
Length: 104857600 (100M) (unauthoritative)
Saving to: 'latest.tar.gz'
latest.tar.gz    100%    100M  --.-KB/s    in 20s
2025-01-15 09:10:20 (5.0 MB/s) - 'latest.tar.gz' saved [104857600/104857600]
- Place credentials that must persist for scheduled jobs into a per-user .wgetrc file with restrictive permissions.
$ printf 'user=report-user\npassword=S3cretPass\n' > ~/.wgetrc
$ chmod 600 ~/.wgetrc
$ ls -l ~/.wgetrc
-rw------- 1 user user 42 Jan 15 09:15 /home/user/.wgetrc
A .wgetrc file that is readable by other accounts exposes plain-text credentials to anyone with local access.
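Settings in ~/.wgetrc apply to every wget run for that account, so credentials stored there are offered wherever wget authenticates. A dedicated startup file selected through the WGETRC environment variable keeps them limited to the jobs that need them; a sketch, assuming a hypothetical ~/.wgetrc-reports file:
$ printf 'user=report-user\npassword=S3cretPass\n' > ~/.wgetrc-reports
$ chmod 600 ~/.wgetrc-reports
$ WGETRC="$HOME/.wgetrc-reports" wget https://files.example.com/reports/daily-report.csv
When WGETRC is set, wget reads that file instead of the default ~/.wgetrc.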
- Run wget without explicit --user or --password options once the credentials are stored in .wgetrc.
$ wget https://files.example.com/reports/daily-report.csv
--2025-01-15 09:20:00--  https://files.example.com/reports/daily-report.csv
Resolving files.example.com (files.example.com)... 203.0.113.42
Connecting to files.example.com (files.example.com)|203.0.113.42|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 51234 (50K) [text/csv]
Saving to: 'daily-report.csv'
daily-report.csv    100%   50.0K  --.-KB/s    in 0.1s
2025-01-15 09:20:01 (480 KB/s) - 'daily-report.csv' saved [51234/51234]
When present, user and password settings from the per-user .wgetrc are applied automatically to HTTP, HTTPS, and FTP requests without any extra command-line options.
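With the credentials read from .wgetrc, the same download can run unattended from cron. A minimal crontab sketch, assuming the report should land in a hypothetical /home/user/reports directory every morning:
$ crontab -l
# fetch the protected report daily at 06:00; wget reads user/password from ~/.wgetrc
0 6 * * * wget -q -P /home/user/reports https://files.example.com/reports/daily-report.csv
The -q option keeps the job quiet and -P sets the directory the file is saved into; the exit status still reflects failures if the job is monitored.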
- Use a short-lived shell variable for scripting scenarios where a password is needed but should not be written to disk.
$ read -rsp "Password: " WGET_PASSWORD
Password:
$ printf '\n'
$ wget --user='report-user' --password="$WGET_PASSWORD" https://files.example.com/reports/daily-report.csv
--2025-01-15 09:25:00--  https://files.example.com/reports/daily-report.csv
Resolving files.example.com (files.example.com)... 203.0.113.42
Connecting to files.example.com (files.example.com)|203.0.113.42|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 51234 (50K) [text/csv]
Saving to: 'daily-report.csv'
daily-report.csv    100%   50.0K  --.-KB/s    in 0.1s
2025-01-15 09:25:01 (480 KB/s) - 'daily-report.csv' saved [51234/51234]
$ unset WGET_PASSWORD
Shell variables avoid writing the password to disk, but the expanded value still appears in wget's argument list, so process listings on a multi-user system can reveal it while the download runs.
- Confirm that authenticated downloads succeeded by checking the exit status and inspecting the downloaded file.
$ echo $?
0
$ ls -lh daily-report.csv
-rw-r--r-- 1 user user 50K Jan 15 09:25 daily-report.csv
A zero exit status from wget combined with a 200 OK response and an expected file size indicates a successful authenticated transfer.
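Authentication can also be tested without transferring the file by running wget in spider mode, which checks the resource and reports the server's response without saving anything; output trimmed to the relevant lines:
$ wget --spider --user='report-user' --ask-password https://files.example.com/reports/daily-report.csv
Password:
##### snipped #####
HTTP request sent, awaiting response... 200 OK
Remote file exists.
A wrong user name or password typically produces a 401 Unauthorized response instead, and wget exits with status 6 to indicate an authentication failure.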
