Why you should never skip security assessment and testing
This Security School is a free multimedia learning guide designed to help you understand and address the strategic and tactical implications of this topic.
Some things just work better together: movies and popcorn, Sherlock Holmes and John Watson, and information security and systems administration.
Infosec pros and system administrators have a lot to learn from each other, and when they work side by side, the result is better designed, more resilient enterprise systems. This collaboration should encompass the whole application lifecycle, from app design to software validation testing to full-scale deployment.
In this video, expert Adam Gordon, lead editor of the Official (ISC)² Guide to the CBK, Fourth Edition, delves into the many ways that infosec pros’ and sys admins’ respective skill sets complement one another.
CISSP® is a registered mark of (ISC)²
The following video is an excerpt from the Official (ISC)² CISSP OnDemand Training.
Transcript - App design, software validation testing need infosec input
Let's talk about design, validation, assessment and test strategies. We'll be going over the roles and the verification and validation thought processes associated with this area.
Let's begin by talking about the roles of the system engineer and the security professional. Who are they, what do they do, what do they represent, and why might they be important to our conversations? They help us to create a test and evaluation strategy in support of acquisition and/or development programs. When we think about the system engineer and the security professional, we think about individuals who bring a lot of expertise around how to build and how to secure things, and how to do those two things with a formal documentation thought process, a rigor, if you will, that allows us to focus on the aspects of those solutions that are most important for minimizing risk, most important for identifying threats and vulnerabilities, and most important for us not only to document and understand, but also to communicate broadly and widely, so that the individuals consuming, using and interacting with that process are familiar with it and, as a result, better understand how to use it in the most appropriate ways. They also recommend test and evaluation approaches.
Where systems and security meet
System engineers help us to understand testing, including software validation testing, and evaluation. Security professionals help us to focus on implementing testing and evaluation solutions with an eye toward security; implementing security at every stage and every level of the development life cycle becomes the focus and the responsibility of the security professional, the CISSP, in these discussions. They evaluate test plans and procedures with that critical eye, to understand what we're accomplishing and whether the list of things we're looking to validate and accomplish is in line with the requirements that have been identified for the system. And they help us to understand the rationale behind the requirements for acquisition and/or development programs.
It's not enough to say, "I want to go out and build a mousetrap." I have to build a mousetrap that does the things the people who want mousetraps said they want done, and I have to have a need for a mousetrap in the first place. If I'm looking to trap cats, mousetraps are not appropriate; but if I'm looking to deal with mice, they will be. I have to understand my audience, I have to understand their needs, and I have to understand the requirements that equal success in the mind of the customer. The systems engineer and the security professional both help us to focus in on that. The system engineer is very skilled at requirements gathering: identifying requirements, documenting them, and drafting plans that help us to realize those requirements by building a system that aligns with the functionality the customer specified.
The security professional, on the other hand, while certainly interested in all the things the system engineer does, brings the expertise of understanding how to apply security. Building on the requirements gathering the system engineer has done, the security professional can be asked to come in and help assess the impact of those requirements, assess the design, and understand where security can be most appropriately applied to achieve the desired end results: matching the functionality and delivering the requirements the customer says they want, but doing so in a secure way. This is where the two roles meet. This is the crossroads that allows them to mutually reinforce and interact with each other, and this is the importance both roles need to have, understand and identify as we talk about system architecture.
The working group is another very important element in this conversation. The working group allows us to ensure that test and evaluation processes, such as software validation testing, are being implemented from the perspective of all the different competing interests that have to contribute. This is where we get the mindshare, or the group-share, around how to identify requirements and prioritize them; how to take those requirements, master them, understand what they are, and set them into some sort of coherent, logical order; and then ultimately build from those requirements the functional elements the system will be made up of. The working group allows us, in other words, to make sure we are aligning the requirements with the system design and ultimately producing the desired results.
The role of verification
Verification becomes very important here as well. We want to make sure we think about the fact that when we're verifying, we're looking to confirm that we are indeed doing the things that have been described, documented and required. We want to check them off the list, if you will. We're going to do so through a variety of mechanisms and methods. Software validation testing helps us to verify. Various static and dynamic analyses help us to verify. Code and document inspections help us to verify: walking through the system and through the code, seeing it, understanding it and interacting with it helps us to verify.
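As a concrete sketch of what dynamic-analysis verification can look like in practice, the short example below checks a documented requirement against the code's actual runtime behavior. The requirement wording, function names and hashing choice are invented for illustration, not taken from the source:

```python
# Hypothetical documented requirement: "Passwords must be hashed,
# never stored in plain text." Dynamic verification exercises the
# running code and compares its behavior to that statement.
import hashlib

def store_password(password: str) -> str:
    """Return the value that would actually be persisted for this password."""
    return hashlib.sha256(password.encode("utf-8")).hexdigest()

def verify_requirement() -> bool:
    stored = store_password("s3cret!")
    # Verification: the plain-text password must never appear in what
    # we store, and the stored value must be a SHA-256 hex digest.
    return "s3cret!" not in stored and len(stored) == 64

print(verify_requirement())  # True when behavior matches the documented requirement
```

Static analysis would inspect the same code without running it (for example, flagging any call that writes the raw `password` variable to storage); the two approaches verify the same requirement from different angles.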
Verification, in other words, at the end of the day is really the thought process around doing what we say and making sure that what we said is what we did. By matching the expectation to the outcome, verification allows us to understand that we have indeed hit the mark, or, if we have not, that we are deviating, and then we have to understand why. Maybe that deviation is important and required, maybe it's not. Maybe it was intentionally put into the system, maybe it was not. We don't know. But the point is, verification helps us to identify those areas and those opportunities for us to be introspective and validate the concerns, validate the requirements, validate the assumptions and, most importantly, validate the outcomes.
Validation: ‘The sanity check’
When we think about validation, as I just mentioned with regard to verification, validation is a matter of developing that level of confidence, that understanding, trust and reliability we have in the solution. We're able to test the solution to make sure it does what it's supposed to do, and this validates the assumptions that went into the build and validates the requirements that the system engineer has gathered and that the customer has provided for us. Software validation testing shows, in other words, that the software system meets all documented requirements and user expectations. It's the sanity check, ultimately, that says to the customer, "Mr. and Mrs. Customer, I've done what you've asked. Here's the proof." And that's what software validation testing provides.
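A minimal sketch of what that sanity check might look like in code: walk the customer's documented requirements and produce a pass/fail report, the "proof" handed back to the customer. The requirement IDs and the stubbed checks are assumptions made for illustration, standing in for real end-to-end tests:

```python
# Invented example: each documented requirement is paired with a check
# that exercises the delivered system. The stubs below stand in for
# real end-to-end tests against a running system.

def system_can_send_email() -> bool:
    return True   # stub for a real "compose and send" test

def system_can_archive_email() -> bool:
    return True   # stub for a real long-term-archive test

requirements = {
    "REQ-001: user can send email": system_can_send_email,
    "REQ-002: user can archive email": system_can_archive_email,
}

def run_validation(reqs) -> dict:
    """Run every check and return the pass/fail report shown to the customer."""
    return {name: check() for name, check in reqs.items()}

report = run_validation(requirements)
print(all(report.values()))  # True only if every documented requirement passes
```

The design point is the mapping itself: every entry in the report traces back to a requirement the customer actually stated, which is what makes the result a validation rather than just a test run.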
Software development is part of a system design solution. Software requirements, as you can see, are typically derived from the overall system requirements, and we have to understand, in other words, how to build the software from the elements of the system requirements that describe the functionality we want. And when we think about software development as part of system design, we have to understand that there's probably an overarching system that we're being asked to take on, think about, envision and then do something with. That system may already exist, by the way. That system may be built as part of the design cycle. That system may be in some form operative, but may be modified as a result of our plan. There are lots of different places, ways and thought processes we can bring to bear around the conversation of software development with regard to system design.
Understanding user requirements
Whether we're enhancing an existing design by simply adding functionality, building from scratch, or modifying or updating something that already exists, it doesn't really matter. The point is, we have a plan, and that overarching plan is going to encompass or incorporate several things. The thing that's most important to us in this conversation, specifically, is the idea that we can develop software. Software interfaces, software associations, software functionality, software validation testing -- these are all parts of the software development cycle.
We can develop an interface, a web-based interface, some sort of API-based solution that allows us to create a new way of consuming information. The requirements to do so are documented as part of the overall system design: "Hey, we need a web-based system that allows us to access email, to send and receive, and to submit content from the email system to a variety of other systems. We should be able to print from the email system. We should be able to archive emails for long-term storage. We should be able to access a web-based enterprise content management system and place emails directly into a document library of some sort. We should have a web viewer of some kind in the email system that allows us to see documents, spreadsheets and presentations without having to load the full program, and be able to do so within the email interface. And, oh, by the way, the email interface should be supported through these web browsers and should scale to these mobile platforms on these operating systems." These are all parts of the system design requirements. The software development that results will be driven by the list of requirements I just enumerated for you: mobile-based access, these browsers are supported, these functions are required inside the email system.
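The enumerated email-system requirements above could be captured as structured data that then drives both the development work and the later validation testing. This is only a sketch; the requirement IDs and field names are invented for illustration:

```python
# Sketch: the email-system requirements from the system design,
# recorded as data so outstanding development work is always traceable
# back to a stated requirement. IDs and fields are hypothetical.
from dataclasses import dataclass

@dataclass
class Requirement:
    req_id: str
    description: str
    implemented: bool = False

system_requirements = [
    Requirement("EM-01", "Send and receive email via a web interface"),
    Requirement("EM-02", "Print from the email system"),
    Requirement("EM-03", "Archive emails for long-term storage"),
    Requirement("EM-04", "Place emails into a content management document library"),
    Requirement("EM-05", "View documents, spreadsheets and presentations in-line"),
    Requirement("EM-06", "Support the specified browsers and mobile platforms"),
]

def outstanding(reqs):
    """Requirements not yet realized -- the remaining development work."""
    return [r.req_id for r in reqs if not r.implemented]

print(outstanding(system_requirements))  # all six IDs before any work is done
```

Keeping the list in one place means the software team, the working group and the validation testers are all working from the same enumeration, which is exactly the alignment the transcript describes.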
So we need to understand and document users' requirements and specifications. Those get translated into the functional elements that we ultimately build. We want to understand that, because software development can't exist outside of system design; outside of it, development is not being driven by those requirements and is not aligned in any meaningful way with the things the user or the customer has told us they want us to achieve. If we divorce, in other words, the software development process from the system design process, we lose direction. We lose the ability to turn the listed requirements into a functional system, or a functional software offering, that the user will then find value in.
Software vs. hardware
So how is software different from hardware? As we're thinking about developing software, it's going to be its own animal, its own thing. Hardware is tangible; software is not. Hardware executes the operational instructions provided by software, effectively allowing the software to achieve some sort of desired outcome. It is the arms and legs, if you will, of the software. The software is the brains of the hardware; it directs the hardware to do the things it needs to do. Without hardware, software can exist, but it may not be as functional and may not allow us to achieve the desired end results. Without software, hardware exists, but it doesn't know what to do. It can't be managed and interacted with. It can't be used and directed to achieve the end results we're looking for. The two have to work together, clearly, but they are definitely different in their approach and in their value to the overall system holistically. They each bring a lot of value to the process, but ultimately they provide it from a different vantage point and a different perspective. So just understand that when we think about how software and hardware differ. Software, like hardware, is developed by engineers who work on the problems that users have identified that software and hardware can help solve.
Ensuring secure software lifecycles
When we think about the discussion in this particular area, as we begin to introduce the topics associated with software development, software validation testing, software lifecycles and the understanding of how to pursue them securely, I'll remind you of the conversation we had just a few minutes ago around the role of the system engineer and the role of the security professional. As we wrap up this first section, remember that while both are focused on understanding the requirements the customer has provided, they come to that understanding and that conversation from different perspectives. And good security design and good security architecture, just like good system design and good system architecture, are made up of the different vantage points and perspectives that the appropriate subject matter experts bring to bear and authoritatively introduce into the conversation, to make the best possible choices at every step in the development life cycle. But they have to do so with an alignment with, an awareness of, and an understanding of the customer requirements that are driving the discussion, driving the system design we're being asked to take on.
At the end of the day, if we don't have a customer giving us requirements, whether that customer is the business, the organization or the enterprise, or an individual (John, or Joe, or Jane, or Jamie Smith, whoever those people are), we don't have a reason to actually go out and design something. We don't have requirements that tell us what the design should do, what value proposition we're looking to align with, and ultimately the measures or metrics that tell us whether we have succeeded or failed. There's no design, in other words, aligned with requirements, because there are no requirements driving the software validation testing and design.
And it's important for us, at the beginning of this conversation, to remember that security professionals add tremendous value by bringing the thought processes, the architectural elements and the design solutions of security into the mix, allowing the systems engineers to understand how to develop systems with security in mind. This is the important part the CISSP plays in this particular area.
These are the skills and the thought processes and the knowledge items we will continue to discuss as we continue to have our conversations here. Please make sure you come back and join me for those conversations. Look forward to having them with you shortly.