- Peter Lindstrom
Advanced persistent threats delivering zero-days via watering holes have become a reality. Attackers targeting SCADA systems, heart monitors, automobiles and other devices are waiting in the wings. Calls for government to address cybersecurity have reached fever pitch. And the breaches continue.
As technology invades our lives and Skynet continues to develop with the Internet of Things, the potential downsides are becoming apparent in the "technology risk management" field. How can we be rational and reasonable in our world of downsides?
After hearing, once again, that he faced a decision involving tradeoffs, President Truman famously exclaimed, "Find me a one-armed economist!" in the hope of eliminating "the other hand." In reality, even an economist named Bob (get it?) will discuss alternatives and options, because that's what economics is about.
Tradeoffs and scarce resources
In the technology risk management field, we are constantly making decisions involving tradeoffs: What business partners should we audit? Should we budget for two-factor authentication or active forensics capabilities? When should we apply the patch du jour? How should we secure our cloud environment?
Worse still, we face those tradeoffs with scarce resources. There are no easy answers, but it is up to us as technology risk management professionals to navigate these issues on a daily basis. Enter security economics, which is essentially the practice of making decisions about technology-related risk as constrained by available resources and in consideration of opportunity costs. Here are three elements of security economics:
Reduced risk: In the perfect technology risk management world, our only goal is to reduce the probability or effect associated with negative events, breaches and, ultimately, compromises leading to losses. Put simply, we want to maximize our ability to keep bad things from happening.
Scarce resources: We don't live in a perfect world, we live in one that puts limits on available resources (except for that one guy at every conference who says he has significant budget). So when we allocate those resources—people, time, service dollars, automated solutions and more—we are looking to get the biggest bang for our buck.
Opportunity costs: Decisions affect the future, and we typically have a number of choices for any given course of action. Opportunity costs are all the things we could have done but didn't, once we made our choice.
In some respects, we are looking to optimize our risk reduction. Theoretically, this makes sense: If we can spend $1 on two different controls, and one reduces risk by $2 and the other reduces risk by $5, we would all pick the latter option, right? And as professional skeptics, you're surely thinking how crazy it is to believe we can measure the $2 or the $5 exactly; but if we could, that would be the goal.
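The $1-for-$5 decision rule above can be sketched in a few lines of code. This is a hypothetical illustration, not a real risk model: the control names and dollar figures are invented, and it simply ranks options by estimated risk reduction per dollar spent.

```python
# Hypothetical sketch: rank security controls by estimated risk
# reduction per dollar. All names and figures below are invented.

def rank_controls(controls, budget):
    """Return affordable controls, best bang-for-buck first."""
    affordable = [c for c in controls if c["cost"] <= budget]
    return sorted(affordable,
                  key=lambda c: c["risk_reduced"] / c["cost"],
                  reverse=True)

controls = [
    {"name": "control A", "cost": 1.0, "risk_reduced": 2.0},
    {"name": "control B", "cost": 1.0, "risk_reduced": 5.0},
]

ranking = rank_controls(controls, budget=1.0)
print(ranking[0]["name"])  # control B: $5 of risk reduced per $1 spent
```

Of course, the hard part in practice is producing the `risk_reduced` estimates at all, which is exactly the objection raised above.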
No precise measurements
Many technology risk management professionals believe that because decision elements are often hard to measure precisely, we are somehow exempt from factoring them into our decisions. Except every time we prioritize an item on our to-do list, we are making a claim about its importance relative to other activities on the list. This leads us to the notion of revealed preferences.
Economists distinguish between "stated" and "revealed" preferences—loosely, what we say versus what we do—when analyzing decisions and looking for utility. Luckily, in technology risk management the "what we do" part is readily available: it shows up in all our resource allocation decisions. When we determine a mix of activities to perform, we spend money on people. When we make purchasing decisions, we spend service or capital investment dollars. All of these decisions reveal something about the perceived value of the activity relative to other options.
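The revealed-preference idea can be made concrete with a toy calculation. This is a hypothetical sketch with invented budget figures: the share of total spend each activity receives is read as the relative value the organization actually places on it, whatever its stated priorities.

```python
# Hypothetical sketch: budget allocations as revealed preferences.
# The activity names and dollar amounts are invented for illustration.

spend = {
    "patching": 40_000,
    "two-factor auth": 25_000,
    "partner audits": 10_000,
}

total = sum(spend.values())
# Each activity's share of spend is its implied relative priority.
revealed_priority = {k: v / total for k, v in spend.items()}

# The largest share marks the activity we value most in practice.
top_priority = max(revealed_priority, key=revealed_priority.get)
```

If the stated top priority were partner audits but the spend says patching, the revealed preference is patching.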
So think of this first column on security economics as a call to action of sorts. A call to action to acknowledge that we have many risks we can address with varying degrees of associated probability and impact. A call to action to recognize that we can't do everything. And a call to action to understand that we are looking to make the best decisions possible.
About the author:
Peter Lindstrom is principal and vice president of research for Spire Security. He has held similar positions at Burton Group and Hurwitz Group. Lindstrom has also worked as a security architect for Wyeth Pharmaceuticals, and as an IT auditor for Coopers and Lybrand and GMAC Mortgage. Contact him via email at PeteLind@spiresecurity.com, on Twitter @SpireSec or on his website www.spiresecurity.com. Send comments on this column to firstname.lastname@example.org.