I'm doing it intermittently when asked. I've done it for USENIX and for others privately. This started because we were trying to get a mailing list on security metrics going, and I thought that writing down everything we knew in one place, as a tutorial, would be the way to go. The entire presentation is
In the slides I note that to play better, you must keep score, and that discipline is easier with numbers. Security is about tradeoffs, and it is easier to make tradeoffs when you have a measure to compare them with.

Do people quickly embrace the idea, or does it take some convincing?
A lot of people say, 'I don't know how to measure this … I have all these edge conditions that make my measurements nonsensical.' That's fine, but my advice is to start measuring something. Even in an environment where you're not certain your numbers are accurate, if the measurement process is consistent and stable, the trend data you get is still likely to be believable, because the errors are likely to be uncorrelated with the trend.

So it's not a waste of time if you can't get your measurements 100% accurate?
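The claim above can be sketched numerically: even if a measurement systematically undercounts and adds noise, the trend survives as long as the process is consistent. This is a toy simulation, not anything from the interview; the incident counts, the 0.6 undercount factor, and the noise level are all invented for illustration.

```python
# Minimal sketch of the claim: a consistent but biased, noisy measurement
# still reveals the underlying trend, because the errors are uncorrelated
# with it. All numbers here are made up for illustration.
import random

random.seed(7)
true_incidents = [100 + 5 * month for month in range(24)]           # rising trend
# Our "measurement" only catches 60% of incidents and adds noise,
# but the process is the same every month.
measured = [t * 0.6 + random.gauss(0, 10) for t in true_incidents]

# Ordinary least-squares slope of measured counts over time.
n = len(measured)
xbar = (n - 1) / 2
ybar = sum(measured) / n
slope = sum((i - xbar) * (y - ybar) for i, y in enumerate(measured)) \
        / sum((i - xbar) ** 2 for i in range(n))

print(f"estimated slope: {slope:.1f} incidents per month")
assert slope > 0  # the upward trend survives the measurement error
```

The absolute numbers are wrong (everything is scaled down by the undercount), but the direction and rough magnitude of the trend come through, which is what a risk management decision usually needs.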
The purpose of security metrics is to make risk management decisions. The measurement doesn't have to be perfect, and if you can get a better measurement later, then hallelujah. My point is that we can't make much progress in security if we don't have good metrics. We've exhausted what we can do with firefighting. Risk management is not about explaining the past but about predicting the future, and for that you need to measure things as best you can.

You were part of the Athena Project at MIT during the creation of the Kerberos authentication protocol, which today is incorporated into a variety of products, including Sun Microsystems' Solaris operating system, Red Hat Linux, MandrakeSoft Linux and Debian Linux. Is it still valid in today's threat environment?
Yes, but there are also issues today Kerberos wasn't designed to address. What if the situation in a transaction is not that I'm okay, you're okay, and the Internet is the problem, but that the other end is already compromised? A year ago I wrote a paper, I think for USENIX, guessing that 15-30% of all desktops had some degree of remote control not intended by the user. I got some hate mail about that, but since then Mike Danseglio [program manager in the Security Solutions group at Microsoft] has come out saying it's more like two thirds of all PCs, and IDC piled it on and said three quarters. The point in the end is that it is a large fraction. Under those circumstances, authentication technology doesn't matter. If the person presenting the credentials is unwittingly compromised, the protocol worked, but the person's machine is under the control of someone else. That's not the problem we set out to solve with Kerberos. No protocol solves this. It's an endpoint problem. With Kerberos we solved the network problem -- you can take two ends that are certain of their own identities and make it possible for them to communicate.

You just wrote a new book. What was the goal behind it?
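The "network problem" Kerberos solves can be sketched in miniature: a trusted key distribution center (KDC) shares a long-term key with each principal and brokers a fresh session key between them, so each end can verify the other's identity without ever sending a long-term key over the wire. This is a heavily simplified toy, not real Kerberos (no timestamps, ticket lifetimes, realms, or replay protection), and the XOR-keystream cipher is a stand-in chosen only to keep the sketch self-contained.

```python
# Toy three-party key exchange in the spirit of Kerberos. NOT real
# Kerberos and NOT secure crypto -- for illustration only.
import hashlib, hmac, os

def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    # Derive a pseudorandom keystream from key + nonce (toy stream cipher).
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)
    return nonce + bytes(a ^ b for a, b in
                         zip(plaintext, keystream(key, nonce, len(plaintext))))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

# Long-term keys, each shared only between the KDC and one principal.
k_alice, k_bob = os.urandom(32), os.urandom(32)

# 1. The KDC mints a session key, plus a "ticket" Alice cannot read.
session = os.urandom(32)
ticket = encrypt(k_bob, session)       # only Bob can open this
to_alice = encrypt(k_alice, session)   # only Alice can open this

# 2. Alice recovers the session key and proves knowledge of it to Bob.
alice_session = decrypt(k_alice, to_alice)
authenticator = hmac.new(alice_session, b"alice", hashlib.sha256).digest()

# 3. Bob opens the ticket with his own key and checks the authenticator.
bob_session = decrypt(k_bob, ticket)
assert hmac.compare_digest(
    authenticator, hmac.new(bob_session, b"alice", hashlib.sha256).digest())
```

The point the interview makes still holds in the sketch: every check here assumes Alice's machine is actually under Alice's control. If her endpoint is compromised, the protocol still "works" perfectly on behalf of the attacker.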
To show that successfully implementing data security technology across sophisticated global organizations requires a new data-centric, risk-based and strategic approach that defines the concepts and economics of a sound data security strategy. The winners in this game will be those with the most data in motion, while the losers will be those with too much data in motion. The line between 'most' and 'too much' is a fine one, and it is drawn by security technology. This book is my attempt to lay out my best thinking on data security in a way that makes the risk management tradeoffs clear for the reader. It also defines the security technology framework required to determine the difference between most and too much.
This is meant to assist managers in understanding the risks and costs associated with data loss. The ideas expressed in this volume are ones discussed and refined in conversations with hundreds of CISOs, CIOs, CEOs and business leaders. The goals of the book are to encourage discussion around the economics of data security, to define intelligent data-centric strategies and to develop a forward-looking approach that will address data security needs now and in the future.