Four steps to taking the fear out of large file transfers

Large file transfers don't have to be scary. Learn four key steps to keeping your network secure when big files exit your enterprise.

You need to use FTP to send and receive bulk data with your partners, customers or suppliers. You are a responsible IT professional, so of course you will use a secure FTP (sFTP) solution, whether open source or commercial, allowing that sFTP traffic through your firewall and blocking other FTP traffic. This presents you with the problem of making sure that channel through your security system is only used for what you intend -- and, more generally, of slowing or stopping unintended flows of sensitive data out of your environment. There are several steps to take and options to consider.

Firewall 101: Sharply limit the scope

Limiting the scope of the channel through which your files transfer is a basic rule of firewall use, but once you have designated one or a few systems to serve as your sFTP hubs, make sure of several things:

  • Only those systems are included in the rule permitting sFTP flows through the firewall;
  • Only the sanctioned systems on the far end are allowed as sources or destinations of sFTP packets; and
  • sFTP is the only encrypted traffic flowing across the firewall from these systems. Though the sFTP traffic is encrypted and so normally opaque to management tools, you can make sure it is the only opaque traffic flowing to or from those systems.

You most likely won't have a rapidly growing or changing list of systems that need to do bulk file transfers securely with external entities, or of external entities with which you need to do this kind of transfer -- most organizations don't. If you do expect frequent and/or rapid changes, you may need to work with some kind of security orchestration tool to make the necessary rule tweaks as sources and destinations come and go.
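To make the checks above concrete, here is a minimal Python sketch of an audit pass over firewall rules. The rule shape, port, and the hub and partner addresses are all illustrative assumptions, not any real firewall's API:

```python
# Audit sketch: verify that every "allow" rule on the sFTP port involves
# only a sanctioned internal hub and a sanctioned external peer.
# All addresses below are made up for illustration.

SFTP_PORT = 22  # sFTP (SFTP over SSH) typically rides on TCP port 22

ALLOWED_HUBS = {"10.0.5.10", "10.0.5.11"}        # assumed internal hubs
ALLOWED_PEERS = {"203.0.113.40", "198.51.100.7"}  # assumed partner systems

def audit_rules(rules):
    """rules: list of {"hub": ip, "peer": ip, "port": int, "action": str}.
    Returns a list of human-readable violations."""
    violations = []
    for r in rules:
        if r["action"] != "allow" or r["port"] != SFTP_PORT:
            continue  # only sFTP allow-rules are in scope here
        if r["hub"] not in ALLOWED_HUBS:
            violations.append(f"unsanctioned hub {r['hub']}")
        if r["peer"] not in ALLOWED_PEERS:
            violations.append(f"unsanctioned peer {r['peer']}")
    return violations
```

A periodic audit like this is one way to catch rule drift as sources and destinations come and go; an orchestration tool would automate the rule changes themselves.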

Then add host-based tools

Looking more broadly at the rest of the systems in your environment, consider host-based mitigation. Security and systems management vendors have made progress in recent years in introducing high-speed analytics into their host-based system monitoring tools. In particular, some can now learn the normal behavioral patterns of things like operating system service calls (which even malware, insiders and external attackers have to use) and therefore spot anomalies with minimal performance impact on the system. Running host-based behavioral anomaly detection decreases the chance of data you care about moving out through unsanctioned channels.
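The baseline-then-flag pattern behind such tools can be sketched in a few lines of Python. This is a toy stand-in for vendor analytics: the z-score threshold and per-interval service-call counts are illustrative assumptions:

```python
# Sketch: learn a normal range for a per-interval service-call count,
# then flag counts that fall far outside it.
from statistics import mean, stdev

def build_baseline(samples):
    """samples: per-interval counts observed during normal operation.
    Returns the learned (mean, standard deviation)."""
    return mean(samples), stdev(samples)

def is_anomalous(count, baseline, threshold=3.0):
    """Flag a count more than `threshold` standard deviations from
    the learned mean. Threshold of 3.0 is an arbitrary example."""
    mu, sigma = baseline
    if sigma == 0:
        return count != mu
    return abs(count - mu) / sigma > threshold
```

Real products model many signals at once and learn continuously, but the core idea is the same: characterize "normal," then alert on departures from it.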

Watch network behavior, not content

Beyond the hosts, network behavioral analysis can spot changes in data flows even when tools cannot see the content of encrypted streams. Because they watch for unprecedented numbers, destinations or durations of data flows out of your systems, such tools can help spot exfiltration in progress. Making effective use of such a system can be challenging, especially in the early phases of benchmarking "normal" traffic. Like intrusion detection systems and data-leak prevention systems, network behavioral analysis tools can be prone to throwing "false positives" -- alerts calling attention to what is really innocuous behavior. Security staff need to devote extra time up front to getting trained on these systems, or consider using professional services to handle this time-consuming task. Security teams also need to define and follow processes that continually refine alerting and response rules based on assessment of alerts, steadily reducing the number of false hits over time.
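The "unprecedented destination or volume" checks can be illustrated with a short Python sketch. The flow shape, the learned baseline, and the doubling heuristic are all illustrative assumptions, not any real network behavioral analysis product's model:

```python
# Sketch: flag outbound flows to never-before-seen destinations, or
# flows far larger than anything seen during the baseline period.

def flag_flows(flows, known_destinations, max_seen_bytes):
    """flows: iterable of {"dst": ip, "bytes": int}.
    Returns (reason, flow) pairs for flows that look anomalous."""
    alerts = []
    for f in flows:
        if f["dst"] not in known_destinations:
            alerts.append(("new-destination", f))
        elif f["bytes"] > 2 * max_seen_bytes:  # crude volume heuristic
            alerts.append(("volume-spike", f))
    return alerts
```

Note that this is exactly where the false-positive problem comes from: a new but legitimate partner looks identical to an exfiltration target until an analyst reviews the alert and updates the baseline.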

Be your own man-in-the-middle

The most extreme method of exfiltration prevention is to channel all encrypted traffic flowing from the relevant parts of your network through appliances that perform a man-in-the-middle function. These appliances terminate both the encrypted channel facing inward, to your system, and the channel facing outward, to the system on the far end of the flow. Traffic arrives encrypted and is decrypted on the appliance, where it can be analyzed with deep packet inspection tools for content sensitivity, then flagged as suspect, blocked entirely, or allowed through. If allowed through, the traffic is re-encrypted and sent on to its destination. This is a computationally demanding task and typically an expensive approach to the problem. However, by inspecting all encrypted flows in and out, it can dramatically reduce the risk of data being snuck out in an encrypted stream. It can also introduce its own set of legal risks, so the scope of encrypted-stream capture needs to be carefully defined and discussed with corporate risk management. If the scope is going to extend to user endpoint devices and non-sFTP encrypted flows to and from them, users will need to be informed that their encrypted data may be exposed to IT staff.
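The decide step in that flag/block/allow pipeline can be sketched in Python. The payload is assumed to have already been decrypted by the terminating proxy, and the patterns are illustrative DLP rules, not a real product's rule set:

```python
# Sketch: content-sensitivity decision on an already-decrypted payload.
# Pattern list and thresholds are made-up examples.
import re

SENSITIVE_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),  # US SSN-shaped strings
    re.compile(r"(?i)confidential"),        # document-marking keyword
]

def inspect(payload: str) -> str:
    """Return 'block', 'flag', or 'allow' for a decrypted payload."""
    hits = sum(1 for p in SENSITIVE_PATTERNS if p.search(payload))
    if hits >= 2:
        return "block"
    if hits == 1:
        return "flag"
    return "allow"
```

The expensive part of the real appliance is everything around this function: terminating and re-establishing TLS at line rate, and managing the certificates that make interception possible.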


These are steps you can take to reduce the scope of risk introduced by sanctioned use of sFTP in your environment. Understand, though, that data exfiltration is a much broader problem, one rendered extremely difficult to solve by the use of encrypted channels direct from desktops and laptops out to myriad services on the Web. Locking down the environment around sFTP, or even the entire data center, will reduce, not eliminate, the risk of data smuggling. In the age of smartphone cameras and steganographic concealment of sensitive data in normal-looking Facebook or Tumblr posts, there is no way to close the doors on data exfiltration entirely. However, these are a few of the ways to make it harder, slower and riskier.

This was first published in April 2014
