File access logs: Marcus Ranum surprised by NSA shortfall

Date: Apr 14, 2014

Breaches such as Target Corp.'s have raised the question of how quickly and accurately organizations are able to assess the extent of a malicious incursion.

In this interview with SearchSecurity Editorial Director Robert Richardson at the 2014 RSA Conference, Marcus J. Ranum, CSO at Tenable Network Security Inc., finds himself surprised at how little attention some organizations -- government organizations in particular -- have paid to maintaining file access logs that track who accesses what.

In a case like Target's, Ranum believes initial dithering about the scope of the breach was simply the result of it taking time to get good information. "The responders are not being dishonest. It's that you never have exactly the information that you want," he says. "Management is required to report that something has gone … and the technical people are scrambling and running around trying to figure out actually what did go wrong and how big the problem really is."

When it comes to the NSA's loss of documents, Ranum is less forgiving. "If you're talking 1.7 million documents, that's the kind of thing where I would expect to be able to go over to my system log aggregation point and retrieve who had pulled what from my file servers."

Read the transcript below.

I'm Robert Richardson. I'm the Editorial Director at SearchSecurity, and I'm here with Marcus Ranum, who's the CSO of Tenable. Prior to coming on camera here, we were talking a little bit about the geopolitical situation and security, and boy, there is one now, right? On the heels of the Snowden revelations, and before that APT1, and before that and before that. One of the things that was interesting about it, particularly with regard to Snowden, was that he got out with the documents in the first place. Wasn't there a bit of a security issue there? And what's your take on the impact of the Snowden revelations, and the Snowden theft for that matter?

Ranum: There are a lot of things that come from it. As a security guy, one of the things that I'm fascinated by is that they don't really seem to have a great handle on what Manning took. They don't really seem to have a great handle on what Snowden took, and if you're talking 1.7 million documents, I mean that's the kind of thing I would expect to be able to go over to my system log aggregation point and retrieve who had pulled what from my file servers at what time, and I would think that the 1.7 million documents would make it kind of a bump in my statistics that I might be able to detect.

So it's interesting. It seems that there may be some shortages of forensic tracking and data monitoring going on there. That's kind of sad. In a lot of the commercial sector we see organizations that have extremely good tracking of data access and continuous monitoring of what's going on with their network.
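
As a rough illustration of the kind of check Ranum describes (pulling "who accessed what" from a log aggregation point and looking for a bump in the statistics), the sketch below scans an aggregated file-access log and flags users whose daily pull counts spike far above their own baseline. The CSV layout, column names and threshold here are assumptions made for illustration, not any particular product's schema.

```python
# A minimal sketch, assuming an aggregated file-access log with columns
# user, timestamp, path: flag users whose daily download counts jump well
# above their own baseline.
import csv
from collections import defaultdict
from statistics import mean, pstdev

def flag_download_spikes(log_path, threshold_sigma=3.0):
    """Return (user, day, count) tuples where a user's daily file pulls
    exceed their own mean by threshold_sigma standard deviations."""
    daily = defaultdict(lambda: defaultdict(int))   # user -> day -> files pulled
    with open(log_path, newline="") as fh:
        for row in csv.DictReader(fh):              # assumed columns: user, timestamp, path
            day = row["timestamp"][:10]             # YYYY-MM-DD prefix
            daily[row["user"]][day] += 1

    anomalies = []
    for user, per_day in daily.items():
        counts = list(per_day.values())
        if len(counts) < 2:
            continue
        mu, sigma = mean(counts), pstdev(counts)
        for day, count in per_day.items():
            if sigma and count > mu + threshold_sigma * sigma:
                anomalies.append((user, day, count))
    return anomalies

if __name__ == "__main__":
    for user, day, count in flag_download_spikes("file_access.csv"):
        print(f"{user} pulled {count} files on {day}, well above their baseline")
```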

In the Target instance, though, we didn't get the feeling that they had a good grasp on what they lost, or maybe they just weren't being open about it, but there was a period when the numbers kept growing. Each week brought a new count of how many credit card numbers had been stolen. Do you feel like that was just a communications issue and they probably did have the wherewithal to sort out what they'd lost? Or are they sort of an outlier, and in your view would most enterprises have been able to do that?

Ranum: Well, you know, I've been involved in some of those kinds of incident responses, and what winds up happening, I think, is that the responders are not being dishonest. It's that you never have exactly the information that you want when something goes wrong. There's no macro that you can run on your log analysis engine that's going to say, "Find all stolen credit cards," because the cards get stolen through a different mechanism. So you identify which machine has something that's gone wrong, then you identify the queries that came from that machine, and then you start to be able to figure out what the time window of the disaster was, and then you can start to figure out the size.
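
A minimal sketch of the scoping sequence Ranum outlines: given a machine known to have gone wrong, pull only its queries from a query log, take the earliest and latest matching entries as the time window, then total the records touched inside it. The log format, column names and host name below are hypothetical.

```python
# A rough sketch, assuming a query log exported as CSV with columns
# host, timestamp, rows_returned: scope an incident by time window and size.
import csv
from datetime import datetime

def scope_incident(query_log, compromised_host):
    hits = []
    with open(query_log, newline="") as fh:
        for row in csv.DictReader(fh):            # assumed columns: host, timestamp, rows_returned
            if row["host"] == compromised_host:
                hits.append(row)
    if not hits:
        return None

    times = [datetime.fromisoformat(r["timestamp"]) for r in hits]
    exposed = sum(int(r["rows_returned"]) for r in hits)
    return {
        "window_start": min(times),
        "window_end": max(times),
        "queries": len(hits),
        "records_possibly_exposed": exposed,
    }

if __name__ == "__main__":
    result = scope_incident("db_query_log.csv", "pos-terminal-042")  # hypothetical host
    if result:
        print(f"Window: {result['window_start']} to {result['window_end']}, "
              f"~{result['records_possibly_exposed']} records possibly exposed")
```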

So what's probably going on there is you have an interesting dynamic, because management is required to report that something has gone wrong, and so they try to put an approximate scope on it, and then the technical people are scrambling and running around trying to figure out what actually did go wrong and how big the problem really is, and you can get these kinds of disparate leaks. I think it's a difficult thing if the executive team of Target or whoever comes forward and says, "Well, we've got a credit card leak, but we're not going to tell you how bad it is." Oh, that's an instant conspiracy. So they have to come out and say, "Well, the worst case scenario is it's somewhere between five and five million credit cards." That doesn't sound so good either, so they're going to throw a number out there and they're going to try to make it a good number, but we have to realize that these are always pieces of data that were assembled very quickly.

Now, of course, breaches are going to continue to happen, right? If we've learned nothing else in security, we've learned that. Do you think that companies can get better in terms of getting numbers that are more rational? Can they get faster at it, or is that just one of those things where it's always going to be . . . I take your point. It's always going to be kind of a crisis mode.

Ranum: Well, the leak . . . This is the weird part. The leak when it happens is almost always going to have to happen from an unexpected direction because if it happened from an expected direction . . .

The leak wouldn't have happened.

Ranum: . . . the leak shouldn't have happened.

Right.

Ranum: No one's going to go, "Hey, we're going to leave these all here where somebody can download millions of credit cards." So you're not going to have that macro that's just sitting there waiting for somebody to hit that one hole that you've already identified.

Yeah.

Ranum: So what you need is as many different useful pieces of information as you can get to construct a picture of what's going wrong, and so the key message to get across to organizations is: if someone in your networking group or your systems group says, "We want to turn logging off because it's a performance problem," don't do it. Buy a bigger server, buy more RAM, buy a bigger load balancer, whatever it takes so that you don't have to turn the logging off.
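
One way to keep logging on without loading down a busy server, in the spirit of Ranum's advice, is to ship the records to a central aggregation point rather than write them locally. The sketch below uses Python's standard SysLogHandler; the collector address is a placeholder and the log line itself is illustrative.

```python
# A minimal sketch, assuming a plain UDP syslog collector: forward access
# records to a central aggregation point instead of turning logging off.
# The collector address is a documentation-range placeholder.
import logging
import logging.handlers

def build_forwarding_logger(collector_host="203.0.113.10", collector_port=514):
    logger = logging.getLogger("app.access")
    logger.setLevel(logging.INFO)
    # Ship each record over UDP syslog to the central collector so the local
    # box carries almost no logging overhead.
    handler = logging.handlers.SysLogHandler(address=(collector_host, collector_port))
    handler.setFormatter(logging.Formatter("%(asctime)s %(name)s %(message)s"))
    logger.addHandler(handler)
    return logger

if __name__ == "__main__":
    log = build_forwarding_logger()
    log.info("user=alice action=read path=/srv/files/report.pdf")  # example record
```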

I was involved in an incident response a couple of years ago where the key piece of data that we had was a short-term log coming from an F5 load balancer that normally would have been truncated, but we actually happened to have a copy of it, and we were able to find a piece of malware that someone dropped on 1,500 websites, because they were silly enough to probe it from the outside to make sure that the drop worked correctly.

So it worked right. Right.

Ranum: So then all we did was search through the log for that, and we had the file name and boom, everything falls out. So you never know quite where the critical piece of information is going to be, and that's where I think organizations need to make a realistic assessment about how much they're going to collect. I've seen a lot of organizations that won't collect it because they say, "We don't have any in-house skill to analyze it." But you have to remember that having it lying around on a hard drive someplace is still extremely valuable, because if things do go wrong and you've got these expensive consultants sitting there, it's really nice for them to be able to pull that information from someplace rather than having to infer it. The worst case is that the critical information simply is not there.
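
A small sketch of that "search the retained log for the file name" step: scan a load-balancer access log for requests that touch a known indicator and print the client, time and URL. The common access-log layout and the indicator file name are assumptions for illustration, not details from the incident Ranum describes.

```python
# A minimal sketch, assuming a common/combined-format access log and a known
# indicator (here a hypothetical dropped file name).
import re

INDICATOR = "dropper.php"   # hypothetical file name from the investigation
LINE = re.compile(r'^(?P<client>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<method>\S+) (?P<url>\S+)')

def find_indicator_hits(log_path, indicator=INDICATOR):
    """Return (time, client, url) tuples for log lines mentioning the indicator."""
    hits = []
    with open(log_path, errors="replace") as fh:
        for line in fh:
            if indicator not in line:
                continue
            m = LINE.match(line)
            if m:
                hits.append((m.group("time"), m.group("client"), m.group("url")))
    return hits

if __name__ == "__main__":
    for when, client, url in find_indicator_hits("f5_access.log"):
        print(f"{when}  {client}  {url}")
```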

Because then you look like the NSA and you don't know how many documents you lost.

Ranum: That's right, and that's a serious problem. I don't understand how that one happened. I know a lot of private sector organizations that could tell you every file that an employee accessed within the last three months or whatever. Now obviously, if someone knows that that capability's in place, they can try to work around it, but it's making the job a little bit more difficult for them.
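
For a sense of what "every file that an employee accessed within the last three months" looks like in practice, the sketch below filters a retained file-access audit log for one user's activity over the past 90 days. The CSV layout and field names are illustrative, not any specific audit product's format.

```python
# A minimal sketch, assuming an audit log exported as CSV with columns
# user, timestamp, path, action: list the files one user touched recently.
import csv
from datetime import datetime, timedelta

def files_accessed_by(audit_log, user, days=90):
    cutoff = datetime.now() - timedelta(days=days)
    seen = set()
    with open(audit_log, newline="") as fh:
        for row in csv.DictReader(fh):           # assumed columns: user, timestamp, path, action
            if row["user"] != user:
                continue
            if datetime.fromisoformat(row["timestamp"]) >= cutoff:
                seen.add(row["path"])
    return sorted(seen)

if __name__ == "__main__":
    for path in files_accessed_by("file_audit.csv", "jdoe"):   # hypothetical user
        print(path)
```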

Well, it'll definitely be interesting to learn more, though maybe we never will, about why they don't know more about that.

Ranum: I bet that they found religion about system logging.

Yeah, maybe they've turned some of that surveillance back their way and . . .

Ranum: Which is an interesting problem. At certain points you have to divide half the population into the ones watching the other half of the population.

Right. Which half do you hope to be in?

Ranum: I really don't care. Life's going to be interesting regardless of which half I'm in.

Yeah, all right, good. Marcus Ranum, thanks so much for joining us today.

Ranum: Thank you.
