What it is
File Transfer Protocol (FTP) is a decades-old mechanism for moving files between clients and servers. By default it uses TCP port 21 for the control channel, with data carried either from server port 20 (active mode) or over a high port negotiated at runtime (passive mode). Designed long before modern security expectations, classic FTP sends credentials and payloads in cleartext unless wrapped in TLS (FTPS) or replaced with a secure alternative (such as SFTP over SSH on port 22). FTP services appear on legacy web servers, embedded appliances, and ad hoc file exchange endpoints.
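The passive-mode negotiation mentioned above can be illustrated with a short sketch: the server answers a PASV command with a 227 reply that encodes the data-channel address, which the client decodes as shown below. The sample reply string and addresses are hypothetical; real values vary per session.

```python
import re

def parse_pasv_reply(reply: str) -> tuple[str, int]:
    """Decode an FTP 227 PASV reply into (host, data_port).

    The six comma-separated numbers are the four IPv4 octets followed
    by the high and low bytes of the data port (port = hi*256 + lo).
    """
    match = re.search(r"\((\d+),(\d+),(\d+),(\d+),(\d+),(\d+)\)", reply)
    if not match:
        raise ValueError(f"not a valid 227 reply: {reply!r}")
    nums = [int(n) for n in match.groups()]
    host = ".".join(str(n) for n in nums[:4])
    port = nums[4] * 256 + nums[5]
    return host, port

# Hypothetical server reply; the client would connect to this
# host/port pair to open the data channel.
host, port = parse_pasv_reply("227 Entering Passive Mode (192,168,1,10,195,149)")
print(host, port)  # 192.168.1.10 50069
```

Because these negotiated ports are unpredictable, firewalls in front of FTP servers usually have to allow a configured passive port range rather than a single data port.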
Administrators historically used FTP for publishing static assets, synchronizing systems, or providing simple drop-box functionality. Attackers value exposed FTP because default configurations often allow anonymous logins, weak passwords, or directory traversal, and such servers often expose sensitive backups, credentials, or proprietary code that can be exfiltrated.
Why it matters
Exposed FTP on port 21 remains a frequent vector in breach investigations. Because sessions are unencrypted by default, attackers can intercept credentials, reuse them elsewhere, or tamper with files. Anonymous or poorly protected FTP sites leak confidential data and can be co-opted as staging areas for malware. Compliance frameworks such as PCI DSS and GDPR view insecure file transfers as violations, adding regulatory risk to operational impact.
How to reduce risk
- Avoid exposing FTP publicly; migrate to secure transfer methods (SFTP, HTTPS APIs).
- If FTP must remain, enforce FTPS and disable plaintext logins.
- Disable anonymous access and enforce strong authentication (unique accounts, long passwords, key-based auth).
- Restrict allowed IP ranges and lock down control/data channels with firewall rules.
- Log all file operations and feed events into SIEM monitoring.
- Regularly test for directory traversal, permission issues, and stale data.
- Retire legacy FTP daemons and keep remaining services patched.
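On the client side, the FTPS items above can be sketched with Python's standard-library ftplib: negotiate TLS on the control channel before any credentials are sent, then upgrade the data channel as well. This is a minimal sketch; the host name and credentials in the usage comment are placeholders, and a real deployment also needs server-side hardening (disabled plaintext logins, restricted accounts).

```python
import ssl
from ftplib import FTP_TLS

# Hardened TLS context: modern protocol floor, with certificate and
# hostname verification left at their secure defaults.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

def open_ftps_session(host: str, user: str, password: str) -> FTP_TLS:
    """Open an explicit-FTPS session with an encrypted data channel.

    FTP_TLS.login() negotiates AUTH TLS on the control channel before
    transmitting credentials, and prot_p() issues PROT P so directory
    listings and file transfers are encrypted too.
    """
    ftps = FTP_TLS(context=ctx)
    ftps.connect(host, 21)
    ftps.login(user, password)   # credentials sent only after TLS is up
    ftps.prot_p()                # encrypt the data channel as well
    return ftps

# Usage (placeholder host and credentials):
# with open_ftps_session("ftp.example.com", "svc_acct", "s3cret") as ftps:
#     ftps.retrlines("LIST")
```

Note that this only protects the transport; anonymous-access and authentication policy still have to be enforced in the server configuration.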