Automated downloads over FTP with wget streamline data collection from legacy servers, mirror sites, and public archives. Scriptable transfers avoid manual use of graphical FTP clients, integrate cleanly into cron jobs, and work well on headless Linux systems. Consistent commands also make it easier to reproduce and audit file retrieval procedures across multiple hosts.
The wget utility is a non-interactive network downloader that understands HTTP, HTTPS, and FTP URLs. When fetching content over FTP, it opens a control connection on port 21, authenticates with login commands such as USER and PASS, and uses a separate data connection for each file transfer. Options like --user, --password, --recursive, and --no-parent control authentication and directory traversal, enabling both one-off downloads and full directory mirroring.
Plain FTP sends credentials and data in clear text, so usage on untrusted networks can expose passwords and file contents to interception. Sensitive workloads generally benefit from alternatives such as SFTP or FTPS, or from tunneling traffic through a VPN before accessing an FTP service. Before invoking wget for FTP transfers, ensure access to the target host, sufficient disk space in the chosen download directory, and appropriate permissions for any credentials that will be used.
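These prerequisites can be checked up front from a script; a minimal sketch, assuming ~/ftp-downloads as the target directory (the same example path used in the steps below):

```shell
# Preflight checks for scripted FTP transfers.
# ~/ftp-downloads is the example download directory from this guide.
if ! command -v wget >/dev/null 2>&1; then
    echo "wget not found; install it first" >&2
fi
mkdir -p ~/ftp-downloads
df -h ~/ftp-downloads                  # review free space on the target filesystem
test -w ~/ftp-downloads && echo "download directory is writable"
```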
Steps to download files over FTP with wget:
- Open a terminal in the directory that should hold the downloaded files.
$ mkdir -p ~/ftp-downloads
$ cd ~/ftp-downloads
$ pwd
/home/user/ftp-downloads
- Download a single file from an FTP server by providing credentials in the URL.
$ wget 'ftp://user:ExamplePass!@ftp.example.net/private/report.csv'
--2026-01-10 05:46:40--  ftp://user:*password*@ftp.example.net/private/report.csv
           => 'report.csv'
Resolving ftp.example.net (ftp.example.net)... 203.0.113.50
Connecting to ftp.example.net (ftp.example.net)|203.0.113.50|:21... connected.
Logging in as user ... Logged in!
==> SYST ... done.    ==> PWD ... done.
==> TYPE I ... done.  ==> CWD (1) /private ... done.
==> SIZE report.csv ... 21
==> PASV ... done.    ==> RETR report.csv ... done.
Length: 21 (unauthoritative)

 0K                                                      100% 8.29M=0s

2026-01-10 05:46:40 (8.29 MB/s) - 'report.csv' saved [21]

wget masks the URL password in its own output, but the cleartext URL can still appear in shell history and process listings. Quoting the URL also keeps the shell from interpreting special characters such as ! in the password.
- Download a file from an authenticated FTP server by specifying a username and password for wget.
$ wget --user=backupuser --password='BackupPass!' ftp://ftp.example.net/private/report.csv
--2026-01-10 05:46:46--  ftp://ftp.example.net/private/report.csv
           => 'report.csv'
Resolving ftp.example.net (ftp.example.net)... 203.0.113.50
Connecting to ftp.example.net (ftp.example.net)|203.0.113.50|:21... connected.
Logging in as backupuser ... Logged in!
==> SYST ... done.    ==> PWD ... done.
==> TYPE I ... done.  ==> CWD (1) /private ... done.
==> SIZE report.csv ... 21
==> PASV ... done.    ==> RETR report.csv ... done.
Length: 21 (unauthoritative)

 0K                                                      100% 5.86M=0s

2026-01-10 05:46:46 (5.86 MB/s) - 'report.csv' saved [21]

Command-line passwords can appear in shell history and process listings, so long-lived FTP credentials should be stored in secure mechanisms such as .netrc or external secret managers instead of --password.
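In scripts, wget's exit status distinguishes success from failure, so transfers can be checked automatically. A sketch wrapper, assuming the example URL and directory from this guide (the timeout and retry values are illustrative):

```shell
# Fetch one file and report the outcome via wget's exit status
# (0 means success; nonzero indicates a network, I/O, or login error).
fetch_ftp() {
    wget --quiet --tries=1 --timeout=5 \
         --directory-prefix="$HOME/ftp-downloads" "$1"
}

mkdir -p ~/ftp-downloads
if fetch_ftp 'ftp://ftp.example.net/private/report.csv'; then
    echo "transfer ok"
else
    # $? here is fetch_ftp's exit status from the if condition above.
    echo "wget exited with status $? - check connectivity and credentials" >&2
fi
```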
- Store FTP credentials in ~/.netrc so wget can authenticate without embedding passwords in the command line.
machine ftp.example.net login user password ExamplePass!
Storing credentials in .netrc centralizes authentication for multiple tools but also concentrates sensitive data in a single file that must be protected carefully.
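The file can be created safely from the shell; a sketch using the example host and credentials from above (a restrictive umask keeps the new file private from the moment it exists):

```shell
# Create ~/.netrc with owner-only access; umask 077 makes new files mode 600.
umask 077
cat > ~/.netrc <<'EOF'
machine ftp.example.net
login user
password ExamplePass!
EOF
chmod 600 ~/.netrc   # enforce the mode even if the file already existed

# With the entry in place, wget authenticates without credentials in argv:
# wget ftp://ftp.example.net/private/report.csv
```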
- Restrict access to the .netrc file so FTP passwords are not readable by other local users.
$ chmod 600 ~/.netrc
Overly permissive permissions on .netrc expose passwords to other local accounts, and some clients refuse to use a .netrc file that is readable by group or others.
- Mirror an entire remote FTP directory recursively into a local folder while avoiding parent directories.
$ mkdir -p ~/ftp-mirror
$ wget --recursive --no-parent --no-host-directories --cut-dirs=1 --directory-prefix="$HOME/ftp-mirror" ftp://ftp.example.net/pub/
--2026-01-10 05:46:18--  ftp://ftp.example.net/pub/
           => '/home/user/ftp-mirror/.listing'
Resolving ftp.example.net (ftp.example.net)... 203.0.113.50
Connecting to ftp.example.net (ftp.example.net)|203.0.113.50|:21... connected.
Logging in as user ... Logged in!
##### snipped #####
Downloaded: 3 files, 19 in 0s (49.2 KB/s)

The combination of --recursive and --no-parent keeps traversal within the selected tree, while --directory-prefix and --cut-dirs shape the local directory layout.
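For recurring mirrors, the same command can run unattended from cron, with --timestamping added so files that have not changed on the server are skipped. A sketch crontab entry; the schedule and log path are illustrative:

```shell
# m h dom mon dow  command  (runs the mirror daily at 02:30)
30 2 * * *  wget --recursive --no-parent --no-host-directories --cut-dirs=1 --timestamping --directory-prefix="$HOME/ftp-mirror" ftp://ftp.example.net/pub/ >> "$HOME/ftp-mirror.log" 2>&1
```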
- Verify that FTP downloads completed successfully by listing the local directories and checking the exit status of wget.
$ ls -lh ~/ftp-downloads ~/ftp-mirror
/home/user/ftp-downloads:
total 4.0K
-rw-r--r-- 1 user user 21 Jan 10 05:46 report.csv

/home/user/ftp-mirror:
total 12K
-rw-r--r-- 1 user user 6 Jan 10 04:05 active-test.txt
-rw-r--r-- 1 user user 7 Jan 10 04:05 passive-test.txt
-rw-r--r-- 1 user user 6 Jan 10 04:05 verify-mode.txt
A zero exit code from wget combined with the expected files present in the target directory indicates a successful FTP transfer.
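Beyond the exit code, recording checksums after each run makes later transfers easy to audit; a minimal sketch over the example download directory (the manifest path is illustrative):

```shell
# Record SHA-256 checksums of everything fetched so far; a later run can be
# verified against this manifest to detect changed or corrupted files.
mkdir -p ~/ftp-downloads
find ~/ftp-downloads -type f -exec sha256sum {} + > ~/ftp-downloads.sha256
wc -l < ~/ftp-downloads.sha256          # number of files in the manifest
# Later: sha256sum --check ~/ftp-downloads.sha256
```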
Mohd Shakir Zakaria is a cloud architect with deep roots in software development and open-source advocacy. Certified in AWS, Red Hat, VMware, ITIL, and Linux, he specializes in designing and managing robust cloud and on-premises infrastructures.
