Marcus Ranum chat: Next-generation SIEM

Marcus Ranum: I’ve been thinking lately that it’s about time for a next-generation SIEM to come along and overturn the current state of the art. I thought, perhaps, we might fantasize a little bit—while remaining practical—about what such a thing might look like. Setting aside the big data hype, I have to admit that my blood runs cold when I hear marketing people talk about petabytes of data when we’re barely managing to turn gigabytes of data into megabytes of useful information. It’s definitely the case that our current SIEM solutions are going to be constantly pressured to do more with less, which—in a SIEM context—means reducing data so it’s even more significant.

Short of artificial intelligence, where do you think we’re going to have to take next-generation SIEM [security information and event management] systems in order to produce less data that is more significant, while absorbing even more raw input?

Anton Chuvakin: As you know, my day job includes exactly that kind of thinking. Recently, I wrote a report called SIEM Futures, where I outlined five futures that SIEM has to conquer to become a next-generation system. While you should read the report for the full details, I can outline some of the ideas here. Also, none of these would be completely unheard of, because in many cases the future is already here; it’s just unevenly distributed. So, the SIEM of the near future will include expanded context data collection and analysis, distributed intelligence features, the ability to monitor “emerging” IT environments (such as virtual and cloud environments), new and expanded algorithms for historical and real-time analysis, and much greater usefulness than today’s SIEM for application security monitoring.

In essence, I’m looking not only at newer and better analysis algorithms—primarily on stored data, but also on streaming data—but also at expanded information collection (especially context information that goes beyond resolving usernames into identities and vulnerabilities on assets), as well as the ability to operate in newer environments such as hypervisors and deep inside applications, where an IP address means nothing and logs are even more esoteric. These methods and futures can cross-pollinate; for example, new algorithms applied to application log data analysis, or shared intelligence about attacks on public clouds.

Despite all those exciting futures, many of today’s environments are still about firewall logs and Unix syslog, so don’t get too excited about the future. I’d venture a guess that for every organization that uses Hadoop, there are dozens that still use Windows 2000 and, less likely but still possible, Windows NT 4.0. Regarding the petabytes of data, I’m just as miffed as you are when I see environments where the available kilobytes of data are ignored (see the latest Verizon Data Breach Investigations Report for examples). Giving these organizations megabytes, gigabytes or even petabytes will not change how they approach security information and utilize SIEM.

Ranum: It seems to me that visualization and workflow automation are going to be crucial, yet I am blissfully unaware of really good ways of presenting the kind of information we’ll need to produce from a next-generation SIEM. My imagination keeps conjuring up something with a front end that looks like Palantir, a middle tier that looks like ThinThread or a semantic forest generator, and a back end that’s some kind of network or grid database. And however it does its semantic forest generation, there will need to be at least some kind of sharing model built in, so that some self-generated clustering rules are automatically shared within communities of interest, and perhaps there’s a like/don’t-like crowd-scoring system for globally promoting analytic rules. That’s my daydream; how does that match your SIEM nirvana?

Chuvakin: A long time ago I wrote a blog post on “ideal” SIEM systems. I looked back at it, and I saw some naïve things but also some things that are still true today. Here are some:

  • Logging configuration: The ideal SIEM will find all possible log sources (systems, devices, applications, etc.) and enable the right kind of logging on them according to a high-level policy given to it.
  • Log collection: It will collect all of the above logs securely, without using any risky super-user access and with little to no impact on networks and systems.
  • Log storage: It will securely store the above logs in their original format for as long as needed, in a manner that allows quick access to them in both raw and summarized/enriched form.
  • Log analysis: This ideal SIEM will be able to look at all kinds of logs, known to it and previously unseen, from standard and custom log sources, and tell users what they need to know about their environment based on their needs: What is broken? What is hacked? Where? What is in violation of regulations/policies? What will break soon? Who is doing this stuff? The analysis will power all of the following: automated actions, real-time notifications, long-term historical analysis and compliance relevance analysis (this probably does require some form of artificial intelligence).
  • Information presentation: It will distill the data, information and conclusions generated by the analytic components and present them in a manner consistent with the user’s role, from operator, to analyst, to engineer, to executive, with interactive visual and drillable text-based presentation across all log sources. Users can also customize the data presentation based on their wishes and job needs, as well as their information perception styles.
  • Automation: The ideal log management tool will be able to take limited automated actions to resolve discovered and confirmed issues, as well as generate guidance so users know what actions to take when full-auto mode is not appropriate. The responses will range from full-auto actions, to assisted actions (i.e., click here to fix it), to issuing detailed remediation guidance. The output will include a to-do list of discovered items, complete with suggested actions and ordered by priority (see the sketch after this list).
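
Here is a minimal sketch of what that prioritized to-do output could look like. The finding names, severity weights and suggested actions below are invented purely for illustration; no real SIEM produces exactly this.

```python
# Illustrative only: a toy prioritized to-do list in the spirit of the
# "automation" item above. All findings, severities and actions are invented.
from dataclasses import dataclass

@dataclass
class Finding:
    title: str             # what was discovered
    severity: int          # 1 (low) through 10 (critical)
    confidence: float      # 0.0 through 1.0, how sure the analytics are
    suggested_action: str  # guidance for the operator

findings = [
    Finding("Repeated failed admin logins on web server", 7, 0.9,
            "Review source IPs; lock the account if the activity continues"),
    Finding("Audit logging disabled on database host", 5, 1.0,
            "Re-enable logging per policy"),
    Finding("Outbound traffic to known-bad host from workstation", 9, 0.6,
            "Isolate the workstation and capture traffic for analysis"),
]

# Order the to-do list by severity weighted by confidence, highest first.
for rank, f in enumerate(sorted(findings,
                                key=lambda f: f.severity * f.confidence,
                                reverse=True), start=1):
    print(f"{rank}. [{f.severity * f.confidence:.1f}] {f.title} -> {f.suggested_action}")
```

A real product would also track which items were acted on, feed that back into prioritization and suppress findings the operator has already dismissed; the point here is only the shape of the output.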

However, let me take this conversation in a somewhat different direction. Most of the successful SIEM projects I’ve seen aren’t successful because they have ideal technology, and most of the failed projects haven’t failed because they have crappy SIEM technology. The success or failure of a SIEM project depends far more heavily on processes and practices than on the tools themselves. As we know, security monitoring can never be fully automated, and attempts to create an ideal tool that will automate it are doomed to fail.

Essentially, some people who buy SIEM boxes don’t use them and then whine that “the product does not do enough.” It is one step above complaining that one’s copy of Microsoft Word keeps failing to write the next great American novel. … Or that one’s state-of-the-art DSLR fails to turn one into Ansel Adams.

Ranum: “Relevance analysis” sounds good! It reminds me of our quest for “correlation” in system logging. The industry seems to have done a pretty good job of providing linkages between pieces of information, but they seem to mostly be rule-based (i.e., if more than 70 percent of the terms in two events are shared, and they appear within 60 seconds of each other, group them into a cluster). But as a community we seem to have done a pretty poor job of taking advantage of the pattern-recognition research in AI. I’ve always been disappointed that the security industry seems to be playing it safe and building what we know will work (rule-based systems) instead of shooting for “intelligence amplifying” systems. By that, I mean systems that make hypotheses by generating fuzzy rules, then ask their human operators, “Is this useful?” and then remember and apply the results. Anyone who has ever trained an animal has experienced this process, and some of the early personal firewalls (such as ZoneAlarm) did a good job with it. Is there anyone doing research in advanced analytics that excites you?
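
The rule Ranum describes is simple enough to sketch in a few lines. In the toy example below the events are invented, and “shared terms” is measured as the Jaccard overlap of the words in each message, which is just one possible reading of the rule.

```python
# A minimal sketch of the rule-based clustering described above: group events
# that share more than 70 percent of their terms and occur within 60 seconds.
# Events are invented; "shared terms" is read as Jaccard word overlap.

def term_overlap(a: str, b: str) -> float:
    """Fraction of unique words shared between two event messages."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta and tb else 0.0

def cluster(events, overlap_threshold=0.70, window_seconds=60):
    """events: list of (timestamp_in_seconds, message) tuples."""
    clusters = []
    for ts, msg in sorted(events):
        for c in clusters:
            last_ts, last_msg = c[-1]
            if (ts - last_ts <= window_seconds
                    and term_overlap(msg, last_msg) > overlap_threshold):
                c.append((ts, msg))
                break
        else:  # no existing cluster matched; start a new one
            clusters.append([(ts, msg)])
    return clusters

events = [
    (0,   "sshd failed password for root from 10.0.0.5"),
    (20,  "sshd failed password for admin from 10.0.0.5"),
    (500, "disk usage warning on /var partition"),
]
for c in cluster(events):
    print(c)
```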

Chuvakin: Are you talking about academic research or industry research? A lot of academic research I’ve seen that purports to do that is really quite dumb and not even loosely connected to operational security reality. I’m pretty sure examples of excellent academic research in security data analysis exist; it’s just that my nearly 10-year quest for it came up with essentially nothing. For crying out loud, those academics still use 1998 data sets in 2012. … As far as industry research is concerned, most SIEM vendors are experimenting with profiling and baselining techniques—essentially old-school anomaly detection. If someone were able to do it creatively, use it on log data and produce useful insights, that would excite me.
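
For what it is worth, profiling and baselining are old-school enough to fit in a toy example. The hosts, counts and three-sigma threshold below are assumptions made purely for illustration.

```python
# A rough sketch of profiling/baselining over log-derived counts: learn a
# per-host baseline of hourly failed-login counts, then flag strong deviations.
# All data and the three-sigma threshold are illustrative assumptions.
from statistics import mean, stdev

history = {            # hypothetical hourly failed-login counts per host
    "web01": [3, 5, 4, 6, 2, 5, 4, 3],
    "db01":  [0, 1, 0, 0, 2, 1, 0, 1],
}
current = {"web01": 5, "db01": 40}   # counts observed in the most recent hour

for host, counts in history.items():
    mu, sigma = mean(counts), stdev(counts)
    observed = current[host]
    if sigma and abs(observed - mu) > 3 * sigma:
        print(f"{host}: {observed} events vs. baseline {mu:.1f} (sd {sigma:.1f}) -> anomalous")
    else:
        print(f"{host}: {observed} events -> within baseline")
```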

In general, I want more people to use data mining and text mining—on unparsed logs—for log analysis, and I’ve seen many examples of that almost working in the field. However, nobody has productized it well yet.
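
As a hint of what mining raw, unparsed logs can look like, here is a tiny sketch that collapses lines into rough templates by masking the variable parts and then counts them, so the rarest templates bubble up for a human to review. The log lines and masking rules are invented, not taken from any product.

```python
# A minimal sketch of text mining on unparsed log lines: reduce each line to a
# rough "template" by masking IP addresses and numbers, count the templates,
# and surface the rarest ones for review. Log lines here are invented.
import re
from collections import Counter

def templatize(line: str) -> str:
    line = re.sub(r"\b\d{1,3}(?:\.\d{1,3}){3}\b", "<ip>", line)  # IPv4 addresses
    line = re.sub(r"\b\d+\b", "<num>", line)                     # bare numbers
    return line.lower()

raw_lines = [
    "Accepted password for alice from 10.0.0.7 port 50112",
    "Accepted password for bob from 10.0.0.9 port 50344",
    "Accepted password for alice from 10.0.0.7 port 50455",
    "error: PAM: authentication failure for illegal user admin from 203.0.113.5",
]

counts = Counter(templatize(line) for line in raw_lines)
for template, n in counts.most_common():      # rarest templates print last
    print(f"{n:3d}  {template}")
```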

Ranum: I like your suggestion of the logging system being able to do discovery. It seems to me that a big piece of our log management problem is going to get solved as the “systems challenged” customers move their data to cloud providers where logging and configuration management doctrines are in force. The big problem with getting logs collected usefully seems to be a lack of professional clue power in system administration at the edges of the network. Pushing this stuff into the cloud at least makes massive log aggregation (and cross-customer analysis) a possibility. What do you think the impact of the cloud is going to be?

Chuvakin: How about this, Marcus? More people actually pull data from the cloud into their traditional on-premise SIEM tools than shove the data up to the cloud from their on-premise log sources. There was another research project I did on security monitoring for public cloud assets and that was one of the key discoveries—and key surprises. It seems like cloud systems will serve as a log source for some time before we learn how to use public cloud resources for data analysis.

This being said, there are a couple of interesting vendors that do log management using a Software as a Service model, aka “log management in the cloud.” These guys seem to be utilizing the advantages of having all their customers’ data in one massive system with essentially unlimited computing power. New analytics do become possible, but I’ll reserve my judgment until I see more examples of that working as well as or better than traditional SIEM in today’s environments.

Ranum: What direction do you see logging going in that excites you? See anything new or different, or any trend that has you feeling hopeful?

Chuvakin: You know, I’m still excited about log standards, such as Common Event Expression. I feel that unless we get much better at analyzing unstructured data—essentially text mining and natural language processing, even though most logs remind me of broken English rather than natural language—the standard just has to happen. Also, just about any new analytic technique that actually works on production data—not on stupid fake academic data sets—and at production data volumes excites me as well.
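
To make the case for structure concrete, here is a small sketch that turns one free-text log line into a record with named fields. The field names are invented for this illustration and should not be read as the actual Common Event Expression schema; the point is only that a shared vocabulary is far easier to analyze than “broken English.”

```python
# Illustrative only: the same event as free-text syslog and as a structured
# record. Field names are hypothetical, NOT the actual Common Event Expression
# schema; they only show why named fields beat parsing free text.
import json
import re

raw = "Oct  3 14:02:11 web01 sshd[2121]: Failed password for root from 10.0.0.5 port 52144 ssh2"

match = re.search(r"Failed password for (\S+) from (\S+) port (\d+)", raw)
if match:
    event = {
        "action": "login",       # hypothetical field names for illustration
        "status": "failure",
        "user": match.group(1),
        "src_ip": match.group(2),
        "src_port": int(match.group(3)),
        "host": "web01",
        "product": "sshd",
    }
    print(json.dumps(event, indent=2))
```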

So I’m feeling hopeful about the standards, despite the long odds and despite your skepticism about them. I’m also feeling hopeful about increased adoption of tools that analyze data rather than just store it, and about people actually using those tools. By the way, these tools don’t have to be based on Hadoop to be fun; they might well use MySQL or something.

Ranum: Anton, thank you so much for your time. It’s always a pleasure.

This was first published in October 2012
