Information Security

Defending the digital infrastructure



Converting to cloud: Ranum Q&A with Lee Heath

Not down with Dropbox? Lee Heath embraced shadow IT and improved his company's data security practices in the process.

Lee Heath,
Information Security
Business Partner,
Alliance Data Systems

Is Marcus Ranum changing his views on cloud computing? The Information Security magazine columnist chats with Lee Heath, a 20-year veteran of vulnerability management and compliance at companies such as Yahoo! and JPMorgan Chase & Co. Heath is currently working on data loss prevention, classification and cloud storage as an information security business partner for Alliance Data Systems Inc. and its line of businesses.

Marcus Ranum: We were talking in Dallas a couple of weeks ago and you said some things that pretty much made me do a complete 180 on the whole cloud computing thing. You were, basically, embracing it and using it as a way to steer other business problems, specifically, data custodianship and classification. Tell us about it.

Lee Heath: My colleagues -- Brian Mork and Houston Hopkins -- and I are somewhat new to our positions, and we were tasked with a few specific jobs. We were to look for ways to improve upon several standard security practices, such as data loss prevention, file usage monitoring, data ownership and data classification, as well as trying to stay ahead of the curve with shadow IT.


Each quarter, we try to come up with a shadow IT topic and discuss how we can prevent it, or whether we can use it. Of course, one of the first topics that came up was the "Dropbox effect" and figuring out that upper management was already using it, not to mention the usual marketing and sales folks. After some thought -- and talking to several providers about our wants and needs -- we saw an opportunity to embrace cloud storage, make it work to our advantage and clean up the state of data management in the process.

The idea seemed simple at first -- everything moved to the cloud has an owner. By using the promise of access to your data from "anywhere" as a carrot, we're able to get users to migrate their data from traditional network file storage up to the cloud. As they move it, their data is flagged with a default classification. This allows for data retention policies that are easier to stick to and monitoring of who is accessing what, from where and with what tool. Overall, it seems like a win-win for everyone, but there is no perfect solution.

Ranum: It sounds like you're asking for the cloud service providers to stretch their business models a bit and do some technology and policy development. The good news is that once you've "broken them in" for us, everyone gets those capabilities, right? How did you manage to get a sufficient level of responsiveness?

Heath: The nice thing about the cloud is that it is agile. We talked to several providers, and none of them really had what we wanted. Luckily, some [providers] can see the advantages of listening to our security concerns and utilizing our suggestions as a way to improve their products. Some [providers] did not feel like dealing with the requirements. But a good sales rep can go a long way; someone [willing] to make a sale, and sit with both us and the engineers, really made a difference.

I think for a cloud storage provider, or any online service provider, to be successful in the corporate world, it needs to offer more accountability and control over more aspects of its product -- and the product has to be usable.

Some providers had all the controls in the world, but the product was not usable or they did not support, for example, a device with iOS (which is popular among the C-level). Most [services] are really user friendly, but they have limited security controls or accounting capabilities associated with their tools. In the end -- which has taken about six months and is not really the end -- we have the bulk of what we wanted. It has been an ongoing, iterative process with the vendor we decided to go with. It would be interesting to go back and see if any other [vendors] took what we had to say and improved their offerings.

Ranum: Did you keep any metrics about the effectiveness of the data classification? How many of the units just moved everything up and marked it with the "default" markup? Still, this sounds like a huge win -- because, no matter how you slice it, you now have an audit trail of all the files a unit moved to the cloud; and I suppose you could do more detailed analysis from the audit trail. What metrics did you keep during the migration, and what have you done with them?

Heath: Actually, you caught us at the end of the testing and design phase, and we are about to start on-boarding the general masses. Thus far, we have only had a few key test groups that are sending in feedback for tweaks and features. Brian and Houston have been playing with the API, which allows us to pull the detailed logs of all actions, plus all metadata associated with every file on the system. The API is mainly for creating your own apps, but we are using it to get to the details held within [the storage system].

One of the big things we are looking at tracking is, as you mentioned, whether people are applying tags and classification at upload time, going back to do it later, or just leaving the default. We are already tracking where people access their uploaded files from and who they share items with. Some of this data will be fed into SIEM solutions for alerting, because we don't want to muddle the signal-to-noise ratio in email alerts. Some [data] will be tracked for trending, and for alerts on anomalies, such as bulk downloads.
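A check for the "bulk download" anomaly Heath mentions can be sketched as a simple count over access-log events. The records, field names and threshold below are illustrative assumptions, not the provider's actual log schema; in practice the events would be pulled through the provider's API and fed into a SIEM:

```python
from collections import Counter

# Illustrative access-log records; a real feed would come from the
# cloud provider's API (schema assumed, not documented in the article).
events = [
    {"user": "alice", "action": "download"},
    {"user": "alice", "action": "download"},
    {"user": "alice", "action": "download"},
    {"user": "bob",   "action": "download"},
]

def bulk_downloaders(events, threshold=3):
    """Flag users whose download count meets or exceeds the threshold."""
    counts = Counter(e["user"] for e in events if e["action"] == "download")
    return [user for user, n in counts.items() if n >= threshold]

flagged = bulk_downloaders(events)  # alice has 3 downloads, bob only 1
```

In a real deployment the threshold would be tuned per group, and the flagged list would raise a SIEM alert rather than an email, matching Heath's concern about signal-to-noise.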

One upcoming feature from the provider is a rules engine, so some of the data we will be pulling and parsing on-site will end up being in the product itself at a later date. Until the rules engine is complete, the features will be managed by us via scripts and the API. Data retention policy adherence is one example of how we use the API. We can pull the metadata from each object; therefore, we can see how long [it's been] since a file has been updated and how it is classified and tagged. Based on that data, we can automatically move the file to a trash folder to be deleted at a later date, notify the owner that it will be deleted or whatever we see fit. Overall, it is very flexible.
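The retention workflow Heath describes -- pull each object's metadata, check its age and classification, then trash it or notify the owner -- could look something like the sketch below. The metadata fields, classification names and retention periods are assumptions for illustration; the interview does not name the provider's API or policy values:

```python
from datetime import datetime, timedelta

# Hypothetical retention periods (days) per classification label;
# real values would come from the organization's retention policy.
RETENTION_DAYS = {"public": 365, "internal": 180, "confidential": 90}

def retention_action(meta, now, default_days=180):
    """Decide what to do with a file based on age since last update
    and its classification tag (metadata schema is assumed)."""
    limit = RETENTION_DAYS.get(meta["classification"], default_days)
    age = (now - meta["modified"]).days
    if age > limit * 2:
        return "trash"          # long past policy: move to trash folder
    if age > limit:
        return "notify_owner"   # past policy: warn the owner first
    return "keep"

now = datetime(2013, 9, 1)
old = {"classification": "confidential",
       "modified": now - timedelta(days=100)}
action = retention_action(old, now)  # 100 days > 90-day limit
```

The two-stage outcome (notify first, trash later) mirrors the "notify the owner that it will be deleted" step in the workflow above.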

Ranum: I have to admit I'm surprised that the cloud providers were willing to make modifications for you. I suppose that what you're seeing is maturation of the market with newer and hungrier providers trying to distinguish themselves. The provider you went with is one of the top-tier providers, though, right? Were you early adopters? How did you get into their technology lifecycle so effectively?

Heath: I will admit we were not the only ones asking for these features. The sales guy has several big name companies that are asking for similar features. Luckily, we had worked with him before on other projects, and through other companies, so we have a good rapport with him and he understands that the sale is dependent on doing the right thing.

Security is the big concern with cloud services for most companies, to the point that they are unwilling to use them. For some reason, many cloud companies are, as you mentioned, not willing to meet the requirements of their customers. Whether they think they know better or think it is the corporate user community being overly paranoid, it doesn't matter. Just because the general public accepts security shortcomings does not mean that companies with knowledgeable staff will. I hope we see more of a change and get the cloud providers to be more flexible and to meet our needs as an industry, and not just stick with what they feel is "good enough."

We ask most of the smaller to midrange companies we work with if they have a technical advisory board we can be part of; for some, it works quite well. I don't expect a company like IBM or HP to really listen when we have feature requests, but there are a lot of companies that do, and they benefit from it as much as we do. One of the key things is not just asking for a feature, but having justification and reasoning behind it. You may not end up with exactly what you asked for, but often what you get will fill the need.

Ranum: Have you attempted any redundancy analysis? I wonder if you could do something like pull back checksums and see how many exact copies you have of certain files in your entire enterprise? Or, perhaps filename analysis to see how many variant versions you have of files? I can see a lot of potential for "big data" style analysis, treating your file storage -- once you've got it all in one place -- as the subject of study. You could do some cluster analysis to see how many people shared files outside of their group. If you could tie some departmental data into the analysis -- via Microsoft Active Directory [AD] or human resources -- you could actually start to do queries against stuff like, "Tell me about people in sales who have files that came from HR." Are you doing anything like that now?

Heath: Brian and Houston have started pointing out some of the bits of information we could use for data mining and some trending. While we technically have unlimited space, the SHA-1 hashes that are part of the metadata provided by the cloud service let us easily pull that information and look for duplication. The duplication detection would be more of a concern for identifying which copy is "official" and making sure that is the one people are utilizing. The file and folder objects have a lot of attributes we can pull and manipulate. There is also access and revision history, so we can see that while Bob may own the file, Alice is the maintainer.
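Duplicate detection from provider-supplied SHA-1 hashes reduces to grouping file paths by content hash. The file list and field names below are illustrative stand-ins for records pulled through the metadata API:

```python
from collections import defaultdict

# Illustrative file records; in practice these would be pulled from
# the cloud provider's metadata API (field names assumed).
files = [
    {"path": "/sales/policy.docx",   "sha1": "ab12"},
    {"path": "/hr/policy-copy.docx", "sha1": "ab12"},
    {"path": "/it/runbook.txt",      "sha1": "9f3c"},
]

def find_duplicates(files):
    """Group paths by content hash; any group with more than one
    path is a set of exact duplicates."""
    by_hash = defaultdict(list)
    for f in files:
        by_hash[f["sha1"]].append(f["path"])
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

dupes = find_duplicates(files)
```

Each duplicate set could then be reviewed to decide which copy is the "official" one, as Heath describes.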


Along with the revision history, the cloud provider keeps the previous versions, so we can see if and when a large amount of data is added or deleted from specific documents. If a file has been consistent for a period of time and suddenly changes, then we can notify the owner and/or updater to make sure that a mistake wasn't made.

We can also take that one step further and monitor the SHA-1 of specific documents to do basic file integrity monitoring of, for instance, policy or procedural documents that should not change often. Beyond that, we are already monitoring file sharing. We can not only see who has shared files, but also whether the share has been accepted and whether or not the objects have been accessed. We can also see if specific documents or whole folders have been shared internally or externally. Some groups will have more control over who they can share with than others, based on culture and business need, but we still need to monitor for abuse.
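The integrity-monitoring idea amounts to comparing the provider-reported SHA-1 of rarely-changing documents against a recorded baseline. The paths and hash values below are illustrative; real hashes would be pulled through the metadata API on a schedule:

```python
# Recorded baseline hashes for documents that should rarely change,
# and current hashes as reported by the provider (values illustrative).
baseline = {"/policies/acceptable-use.pdf": "d00d",
            "/policies/retention.pdf":      "cafe"}
current  = {"/policies/acceptable-use.pdf": "d00d",
            "/policies/retention.pdf":      "beef"}

def changed_files(baseline, current):
    """Return paths whose current hash no longer matches the baseline
    (a missing file also counts as a change)."""
    return [path for path, h in baseline.items() if current.get(path) != h]

alerts = changed_files(baseline, current)
```

Any path in `alerts` would trigger the owner notification Heath mentions for unexpected changes to policy documents.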

We are tying the solution into AD for authentication using SAML [Security Assertion Markup Language], but we don't see a way we can comfortably use AD groups for access controls. The cloud system allows for adding details about the user such as title, address and phone number, but not organizational info. With the enterprise console, we can add groups of users and assign access to a group, but again it is not tied to AD -- at this time -- and the groups are just for ease of use. The actual file attributes list all users with access and the type of access, not the group.

Ranum: I guess one of the biggest "data management nirvana" aspects of what you're doing is that you more or less move away from unauthenticated access to important data. What you guys have done is figured out a way to take advantage of the nature of the cloud in a way that offsets the security disadvantages of it. That's fantastic! I may become a cloud computing advocate after this. What's the most important thing you've learned from this effort?

Heath: In my last position, I was right there with you and would not have considered using the cloud for any storage. While I now feel there are some good use cases, and ways we can make it fit a need, there are still cases where I would not use it. I don't want to keep, for instance, credit card data in bulk in the cloud. I don't feel the risk warrants it, but, as you mentioned, it might be better than having it on a wide open share, even if the network is protected.

I think the main thing we took away from this as a company is, to some extent, to embrace shadow IT and leverage it to your advantage. At the same time, don't back down from what you know you need. If a vendor is not willing to work with you, explain why you don't want to use them. Many [providers] are aware of security concerns and have good ideas on their roadmaps, but until more companies push the issue, security won't take priority over usability bells and whistles… It is a trade-off of usability versus security with users' wants thrown in to make it even more complicated, and we all know that is the ultimate balancing act for infosec.

Marcus J. Ranum, chief security officer of Tenable Security Inc., is a world-renowned expert on security system design and implementation. He is the inventor of the first commercial bastion host firewall.

This was last published in September 2013
