While the threat landscape continues to evolve, the File Transfer Protocol (FTP) has remained largely unchanged for years and is still widely used.
FTP was designed primarily to transfer large files. It is a client-server protocol that uses separate control and data channels to accomplish file transfers: the control channel authenticates the client and carries commands to the server. The protocol itself does not support encryption, so all traffic on the control channel is sent in the clear, or unencrypted. This is one of the protocol's core weaknesses. In the enterprise, an FTP service is generally used to host content that is not considered sensitive, is ideally isolated from other sensitive systems, and is kept appropriately patched. An incorrectly configured, poorly architected FTP service can be a critical security hole in the enterprise.
What are the key FTP security best practices for the enterprise? Is FTP secure enough to consider transferring sensitive data, or are there better ways to secure FTP? If FTP cannot be secured sufficiently to allow for the transfer of sensitive data, what alternative protocol can be used to transfer data securely? These are the questions we'll seek to answer in this tip.
FTP is ubiquitous. There is no denying that. As with any ubiquitous technology, it tends to be an easy target for attackers; over the years they've had plenty of experience working with it and exploiting it. Discussions around the security of the service tend to be heated and generally without a consensus on a single best way to secure the service. A lot of this stems from the fact that business requirements, which drive the continued existence of this service, tend to be inflexible in their adoption of more secure alternatives. To me, any enterprise that uses or is considering the use of FTP should begin by asking itself these three questions:
- Do we really need FTP?
- How do we set up FTP securely (a contradiction in terms, but I shall explain)?
- Is there a secure, easy-to-use alternative to FTP?
The first question is interesting. The technically correct answer is no; there are more secure alternatives, which we'll cover later. The practical answer, however, is yes; given its widespread use and its cross-platform support, most enterprises will be forced to support it.
I have spent enough time troubleshooting FTP connections on filtering devices (aka firewalls) to know that FTP's control and data channel design is not ideally suited to environments in which packets traverse multiple disparate network devices. To give you an example, initiating an FTP session from a corporate network behind a network proxy to a server hosted in a load-balanced environment is not a trivial troubleshooting exercise.
As I mentioned earlier, FTP is a client-server protocol that uses separate control and data channels to accomplish file transfers. The control channel is used to authenticate and issue commands to the server. This authentication mechanism is weak because credentials are sent to the server unencrypted, making transfers susceptible to network eavesdropping. The typical security bugs found in the average FTP implementation compound this problem.
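To make the eavesdropping risk concrete, here is a sketch of the literal bytes an FTP client emits on the control channel when logging in (per RFC 959). The credentials are placeholders; the point is that nothing in the protocol obscures them:

```shell
# Sketch: the FTP login exchange is plain ASCII on TCP port 21.
# "alice" and "s3cret" are made-up placeholder credentials.
printf 'USER alice\r\nPASS s3cret\r\n'
# Anyone sniffing the wire (e.g. with tcpdump -A 'tcp port 21')
# sees exactly these bytes, password included.
```

This is the core of the eavesdropping problem: the password crosses every network hop verbatim.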
Despite its security shortcomings, FTP is in many organizations the de facto method for transferring large volumes of data. Most workstations, applications and even network-filtering devices have built-in support for FTP. An alternative method may be more secure, but it's hard to argue with the convenience and low/no cost of using FTP.
Let us, for the moment, assume that FTP is the only option available. There are a few avenues we can explore to secure the service to a certain degree. I would start right at the network design phase by isolating the FTP service to its own dedicated virtual local area network (VLAN) segment. This generally involves dedicating a separate network segment off your switch, router or firewall appliance to host the FTP service, and it serves multiple purposes. It allows you to dedicate an arm of the firewall to the segment, which enables granular policy control (restricting source IPs, for instance) and eases troubleshooting (active versus passive connections come to mind). It also gives you a choke point for monitoring with a network-based security appliance: an IDS can detect attacks that exploit vulnerabilities in the FTP service, while an IPS can proactively block them. The choke point is thus handy as both a detective and a preventive mechanism.
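A minimal sketch of what that segment's firewall policy might look like, written as an iptables-restore fragment so it can be reviewed before loading (with `iptables-restore < ftp-dmz.rules`). All addresses and the passive port range are placeholders for illustration:

```shell
# Sketch: firewall rules for a dedicated FTP segment. 10.0.0.0/24 stands
# in for the corporate LAN, 10.0.50.10 for the FTP server; adjust to your
# topology. Written to a local file for review rather than applied live.
cat > ftp-dmz.rules <<'EOF'
*filter
# Control channel: allow only the corporate LAN to reach port 21
-A FORWARD -s 10.0.0.0/24 -d 10.0.50.10 -p tcp -m tcp --dport 21 -j ACCEPT
# Passive-mode data channels: a pinned high-port range on the server
-A FORWARD -s 10.0.0.0/24 -d 10.0.50.10 -p tcp -m tcp --dport 50000:50100 -j ACCEPT
# Everything else to the FTP segment is dropped
-A FORWARD -d 10.0.50.0/24 -j DROP
COMMIT
EOF
```

Pinning the passive data ports is what makes a rule set this tight possible; without it, passive FTP negotiates arbitrary high ports and the firewall must either guess or use a protocol-aware helper.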
Moving further up the stack, we need to harden the server hosting the FTP service itself. (Though I mentioned addressing the network design first, I do not recommend hosting the server until the hardening steps are complete.) I advise going beyond simply applying the latest patches and configuring the server to comply with a Center for Internet Security (CIS) benchmark. FTP services tend to cause serious collateral damage when targeted by an exploit because, in many cases, the FTP service runs as a high-privilege process (e.g., as the root user), so a successful exploit can grant the attacker system-level privileges.
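As one illustration of daemon-level hardening, here is a sketch of a locked-down configuration for vsftpd, one common FTP daemon (the option names are per its documentation; verify them against your version). It is written to the current directory for review rather than to /etc:

```shell
# Sketch: a minimal hardened vsftpd configuration. On a real host this
# content would live in /etc/vsftpd.conf; "ftpsecure" is a hypothetical
# unprivileged account.
cat > vsftpd.conf.example <<'EOF'
# No anonymous access; local accounts only, and read-only
anonymous_enable=NO
local_enable=YES
write_enable=NO
# Unprivileged account used for pre-authentication connections,
# so a pre-auth exploit does not land the attacker root privileges
nopriv_user=ftpsecure
# Pin passive-mode data ports so the firewall can allow just this range
pasv_min_port=50000
pasv_max_port=50100
EOF
```

The key idea is the same one the paragraph above makes: keep the exposed process away from root so a successful exploit yields as little privilege as possible.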
Such exploits can largely be contained by isolating the FTP service on the server itself. This is different from the network-based isolation described above; here, the isolation is enforced on the host running the service. It can be implemented by running the FTP service in a virtual environment (the open source Xen hypervisor comes to mind) or by enforcing chroot. Chroot lets an administrator change the root directory for a process, which restricts the process's ability to break out of its confines and access sensitive areas of the file system. Chroot'ing can be enforced in a few ways; a couple of quick examples are listing users in "/etc/ftpchroot" to enforce a chroot environment for them, or enabling the "ftp-chroot" login class capability. Both options assume the FTP daemon has built-in ls support, so no special dependencies apply inside the chroot. Note that anonymous FTP is always chroot'ed.
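The two chroot mechanisms just named come from BSD-style ftpd. A sketch of both, writing to a local ./etc-example directory so it runs without root (on a real system the files are /etc/ftpchroot and /etc/login.conf; "alice" and "bob" are placeholder accounts):

```shell
# Sketch: two BSD-style ways to chroot specific FTP users.
mkdir -p etc-example
# Option 1: list each user to confine, one name per line, in ftpchroot.
printf 'alice\nbob\n' > etc-example/ftpchroot
# Option 2: set the ftp-chroot capability on the account's login class
# in login.conf (class name "staff" is illustrative).
cat > etc-example/login.conf <<'EOF'
staff:\
	:ftp-chroot:\
	:tc=default:
EOF
```

Either way, once the daemon chroots the session, a path like ../../etc/passwd resolves inside the FTP root rather than the real file system root.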
Finally, there is an easy-to-secure alternative to FTP built on Secure Shell (SSH). SSH, unlike FTP, sends all content in encrypted form. Using an encrypted transport and layering a file-transfer agent on top avoids some of the more common security drawbacks and complexities of the FTP service. For simplicity's sake, I have treated SCP (primarily file transfer), SFTP (a file-transfer protocol built from the ground up to run over SSH) and FTP sessions tunneled through SSH as one family of more secure alternatives to FTP. Each uses SSH and is an acceptable replacement. An oddball in this category is FTPS (FTP over SSL/TLS), which adds encryption to FTP itself rather than relying on SSH. To be honest, I would not keep FTPS on the list of viable alternatives because its encrypted control channel suffers from firewall incompatibilities. Moving to a more secure protocol should be complemented with service isolation and the appropriate server hardening steps.
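The SSH-based alternatives in practice look like this sketch. Host, user and file names are placeholders; only the SFTP batch file is generated here, while the network-dependent invocations are shown commented:

```shell
# Sketch: scripted file transfer over SSH. "alice" and
# "files.example.com" are placeholders for your own account and host.
cat > upload.batch <<'EOF'
cd /incoming
put report.tar.gz
bye
EOF
# One-off encrypted copy of a single file:
#   scp report.tar.gz alice@files.example.com:/incoming/
# Scripted, repeatable transfer driven by the batch file above:
#   sftp -b upload.batch alice@files.example.com
```

Because everything rides a single outbound SSH connection, there is no separate data channel for firewalls to track, which sidesteps the active/passive troubleshooting headaches described earlier.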
Another interesting alternative to FTP for some might be a digital content delivery service, which provides secure file delivery while simplifying management. These tend to be cloud-based services and include features like file tracking, delivery notifications, workflow integration tools and scriptable APIs. Although these services were originally designed for distribution of digital content, I think they can be leveraged as good file-transfer services for the enterprise.
FTP is a sensitive topic; some users love the convenience, but network and security teams by and large aren't nearly as enamored with it. One could fill pages identifying its shortcomings and suggesting alternatives. There are ways to co-exist securely with FTP, but if a more secure alternative like SSH is feasible for your enterprise, I strongly suggest making the transition.
About the author:
Anand Sastry is a Senior Security Architect at Savvis Inc. Before joining Savvis, he worked for clients in several industries (large and mid-sized enterprises in financial, healthcare, retail and media) as a member of the security services group for a Big 4 consulting firm. He has experience in network and application penetration testing, security architecture design, wireless security, incident response and security engineering. He is currently involved with network and web application firewalls, network intrusion detection systems, malware analysis and distributed denial of service systems.
This was first published in August 2010