
McGraw: Software [in]security and scaling architecture risk analysis

Software architecture risk analysis doesn't have to be hard. Gary McGraw and Jim DelGrosso discuss an easier, more scalable process.

Co-authored by Jim DelGrosso

We hold these truths to be self-evident: Bugs and flaws split the security defect space 50/50, and Architecture Risk Analysis is a critical touchpoint for software security focused on flaws. Don't know what on Earth we're talking about? Start here.

While architecture risk analysis (ARA) is a process that has proven to be useful in finding and fixing design flaws, the process itself has remained expertise-driven and poorly automated. Contrast this with the problem of identifying implementation bugs found in the code of a system, for which myriad commercial tools exist (e.g., HP Fortify SCA, Coverity, IBM AppScan Source, Cigital SecureAssist), and you see that we have a serious scalability issue on our hands.

What can we do to scale an essential software security practice requiring such deep design and security expertise? In other words, how do we scale architecture analysis?

In this article, we describe a lightweight approach to architecture analysis that does not require the large time commitment or deep security expertise that traditional ARA demands. Make no mistake: carrying out what we describe below properly still requires some design chops and real security expertise, but this lightweight analysis can be done more quickly and by a much larger pool of people than traditional ARA. This makes it both scalable and applicable to a wide variety of software development methodologies. Most importantly, our lightweight approach is efficient enough that it can be scaled to cover an entire application portfolio.

Architecture analysis for, well, more people

We call our lightweight approach to ARA the security architecture survey, or SAS for short. Our basic tricks are to create fewer analysis artifacts, reuse (and remember) those artifacts that we can, and do less analysis. Hold on a sec -- does "less analysis" mean that the results of SAS are less robust than ARA results? Sadly, yes. There is no free lunch here.

The good news is that SAS results have proven to be both useful and genuinely valuable in practice. You see, most firms are doing absolutely no ARA at all. In practice, that means they are missing at least 50% of their defects and declaring their software secure prematurely (at best). Carrying out SAS is much better than doing nothing and pretending that security flaws don't exist!

We've been practicing ARA for well over seventeen years. We believe in it. It works. Believe us when we say that trimming our tried-and-true ARA approach without (accidentally) removing all of its potency was tricky. (As a quick reminder, ARA takes three steps: known attack analysis, system-specific attack analysis -- which includes threat modeling -- and dependency analysis.)

Did you notice that this section is titled "architecture analysis" and not ARA? That's not a mistake. Fully understanding risk requires a deep understanding of the target system and its business context. We specifically left risk out of our lightweight SAS approach. SAS is a purely technical analysis.

Likewise, creating a thorough threat model of the target system also requires deep understanding of the system. Building a complete threat model describing the actors and agents most likely to attack your system takes some time, so we water down that step as well. (In fact, whenever possible, we use a pre-computed threat model that we've already built at another time.)

Of course, we need something about the system itself to ponder during our analysis, so in lieu of a threat model we create a software component model of the system. This software component model identifies major software components making up the system, maps out their connections to each other and also notes their connections to any components outside the target system. This model provides valuable and necessary context to the SAS analyst.
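To make this concrete, here is a minimal sketch of how a software component model might be captured as simple data. The component names and connections below are hypothetical illustrations only; SAS does not prescribe any particular notation or tooling.

```python
# A minimal sketch of a software component model: major components,
# their connections to each other, and connections to external systems.
# All names here are hypothetical illustrations.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Component:
    name: str
    internal: bool = True                      # False for components outside the target system
    connects_to: List[str] = field(default_factory=list)


def build_model() -> List[Component]:
    web = Component("web_frontend", connects_to=["order_api"])
    api = Component("order_api", connects_to=["order_database", "payment_gateway"])
    db = Component("order_database")
    gateway = Component("payment_gateway", internal=False)  # external dependency
    return [web, api, db, gateway]


if __name__ == "__main__":
    for c in build_model():
        scope = "internal" if c.internal else "external"
        links = ", ".join(c.connects_to) or "no outbound connections"
        print(f"{c.name} ({scope}) -> {links}")
```

Even a model this simple gives the analyst the context needed to ask where trust boundaries sit and which connections cross them.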

In the end, any analysis approach, including SAS, must produce results of value. It can't just be quick to complete and scalable across a large pool of people while producing vacuous results (if that were the case, we could all just stop worrying about flaws and watch TV all day). What we've discovered after doing many years of ARAs is that a large number of flaws can be identified by focusing on three areas of system security: security controls, design principles and the software development process.

Security controls, design principles and the software development process

The first key to scalable architecture analysis is to focus on areas that are screwed up the most often in practice. This provides lots of value right out of the gate. Results matter more than anything. The second key to scalable architecture analysis is to make sure that the process does not require Superman to succeed. A bigger pool of people capable of performing an analysis makes it scalable.

Another trick to scalability is tight focus. At Cigital Inc., our SAS approach focuses on eleven particular security controls, five design principles and five aspects of the software development process. It's worth noting that we developed this capability together with a number of our clients who focused directly on their most common flaws as actually found in nature.

Our list of security controls is designed to be familiar to a large set of the development community right out of the box. The controls list covers areas such as authentication, authorization, cryptography and availability, to name a few. One security control that is used incorrectly far too often is applied cryptography. Just to put a fine point on it, here's an example of a crypto flaw: applying a confidentiality control (such as AES encryption) when an integrity control (such as HMAC) is what is actually required. This is clearly a design flaw that won't typically be uncovered by other software security activities -- that is, penetration testing won't find it and neither will code review -- so unless you're looking for design flaws, this kind of security control defect has a good chance of sneaking by unnoticed.
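As a small illustration of that distinction, the sketch below uses Python's standard library to show the integrity control the example calls for: an HMAC detects tampering, which encryption alone (AES or otherwise) does not. The key handling is simplified for illustration only.

```python
# Sketch: integrity via HMAC-SHA256 (Python standard library).
# Encrypting with AES hides a message's contents but does not, by itself,
# reveal whether the message was tampered with; an HMAC does.

import hashlib
import hmac
import secrets

INTEGRITY_KEY = secrets.token_bytes(32)   # illustration only; manage real keys properly


def protect(message: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag over the message."""
    return hmac.new(INTEGRITY_KEY, message, hashlib.sha256).digest()


def verify(message: bytes, tag: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(INTEGRITY_KEY, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)


if __name__ == "__main__":
    msg = b"transfer $100 to account 42"
    tag = protect(msg)
    assert verify(msg, tag)
    # A modified message fails verification -- exactly the property that
    # a confidentiality control alone would not provide.
    assert not verify(b"transfer $900 to account 42", tag)
    print("integrity check behaves as expected")
```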

Plenty of wise sages have written about design principles for security, starting with Jerry Saltzer and Michael Schroeder in 1975. More recently, McGraw has written about 13 design principles to ensure enterprise security. In a perfect world, all of these design principles would be applied, pondered and analyzed when designing every system. But as you probably know, we don't live in a perfect world. For the purposes of SAS, we zoomed in on five principles, which we use as a gauge to determine how well the system's design adheres to industry best practices. Time to apply another fine point: one security flaw we see far too often involves violating the famous principle of least privilege. This flaw often shows up in a system where some components require read/write access to a resource and others require only read access, yet the system does not implement a control that grants read-only access to the components that need nothing more. Sadly, designers (and implementers) find it more convenient to use a single security control for all the components regardless of their write-access requirements, so that's how the system ends up being designed. In the end, this means that if there is a vulnerability in a read-only component, an attacker may be able to overwrite the resource as well, since the shared security control allows read/write access. Ouch.
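To illustrate the fix in miniature, the sketch below separates a read-only role from a read/write role instead of sharing one all-powerful control across components. The roles and resource are hypothetical; in a real system they would map to distinct credentials, database accounts or access policies.

```python
# Sketch of least privilege: components hold only the role they need,
# so compromising a read-only component does not grant write access.
# Role and resource names are hypothetical.

class Resource:
    def __init__(self) -> None:
        self._data = "initial contents"

    def read(self, role: str) -> str:
        if role not in ("reader", "writer"):
            raise PermissionError(f"role {role!r} may not read")
        return self._data

    def write(self, role: str, value: str) -> None:
        if role != "writer":
            raise PermissionError(f"role {role!r} may not write")
        self._data = value


if __name__ == "__main__":
    resource = Resource()

    # The reporting component is granted only the "reader" role.
    print(resource.read("reader"))

    # Even if that component is compromised, its role cannot overwrite the resource.
    try:
        resource.write("reader", "attacker-controlled contents")
    except PermissionError as err:
        print("blocked:", err)

    # Only the component that genuinely needs write access holds the "writer" role.
    resource.write("writer", "updated contents")
```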

Finally, our SAS analysis also looks at the software development process itself. This part of the analysis is designed to identify potential flaws introduced as software moves through different deployment environments in an organization. As an example, we analyze how key material used for cryptographic functions is protected and used in development, test and production environments. We want to understand how separation of duties is being enforced across these radically different environments. At first blush, it may seem that potential software development process weaknesses can be understood and handled just once (in a fire-and-forget sort of way). After all, once corporate standards are set up to address these kinds of software development lifecycle issues, surely those standards will be followed 100% of the time. Right? Wrong. This kind of happy-world assumption has been proven incorrect so many times in the field that we recommend pondering software development lifecycle (SDLC) weaknesses as part of every review (even SAS). Bypassing the corporate standard or industry-accepted best practice "just this one time" often leads to cascading and repeating failures that persist forever. (If you're not careful, pretty soon you will have created a new undocumented de facto standard by accident.)
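As a small sketch of the kind of check this process review looks for, the code below keeps key material strictly separated per environment and fails closed rather than silently reusing a development key elsewhere. The environment and variable names are hypothetical; a real deployment would lean on a secrets manager or HSM and enforce separation of duties organizationally.

```python
# Sketch: per-environment key material with no cross-environment fallback.
# Names are hypothetical; real systems should use a secrets manager or HSM.

import os

KEY_ENV_VARS = {
    "development": "APP_DEV_SIGNING_KEY",
    "test": "APP_TEST_SIGNING_KEY",
    "production": "APP_PROD_SIGNING_KEY",   # set only by production operators
}


def load_signing_key(environment: str) -> bytes:
    """Fetch the signing key for exactly one environment; never fall back to another."""
    try:
        var = KEY_ENV_VARS[environment]
    except KeyError:
        raise ValueError(f"unknown environment: {environment!r}")
    value = os.environ.get(var)
    if value is None:
        # Fail closed instead of quietly reusing, say, the development key in production.
        raise RuntimeError(f"{var} is not set for the {environment} environment")
    return value.encode()


if __name__ == "__main__":
    os.environ.setdefault("APP_DEV_SIGNING_KEY", "dev-only-key")   # illustration only
    print(load_signing_key("development"))
```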

A tight focus on particular security controls, a subset of known security design principles and key aspects of the SDLC is what allows SAS to scale.

Last word: Don't forget the flaws

We've said it before but we'll say it again: The time has come to focus real attention on software security flaws. Both SAS and its more intense and valuable big brother ARA are necessary parts of a fully formed software security initiative. Note that any focus on flaws should not be carried out to the detriment of finding and fixing bugs. ARA and SAS are most definitely not a replacement for your static analysis engine. Rather, they focus some much-needed attention on flaws -- the kinds of software security defects that constitute half the problem.

About the authors:
Jim DelGrosso is principal consultant at Cigital, Inc., where Gary McGraw serves as chief technology officer.

This was first published in December 2013
