How well do your colleagues in other departments understand what security does? If executives had to grade how well you're doing as a function supporting the business, would they know what questions to ask?
Last year, Forrester Research found that roughly half of all CISO or equivalent roles reported directly to C-level executives, yet many of these individuals still struggle to articulate how security supports the broader organization. Operational metrics and compliance reports reflect performance to some extent, but their scope is limited and they rarely address the interests of the business.
To meet these challenges, security professionals should use a framework to evaluate the process maturity of every function for which the security organization is responsible. Measuring process maturity lifts the conversation out of the technology weeds and yields an assessment of how well you approach each of your responsibilities.
For example, using COBIT maturity levels, you can rate your incident response process as 0 -- non-existent, 1 -- ad hoc, 2 -- repeatable, 3 -- defined, 4 -- measured, or 5 -- optimized. The details behind the score will differ for each function, but they will all share similar characteristics. That is, to reach level 3 you will look for clearly defined policies and procedures, and to reach level 4 you will need to show that you consistently use metrics to guide decision making. It takes time to develop the unique characteristics for each function (e.g., deciding what metrics to use for log monitoring versus remote access controls), but doing so allows you to compare otherwise dissimilar areas of security on the same scale.
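One way to make the scale concrete is to encode it as data: the level names are fixed, while the per-function criteria are developed over time. The sketch below is a minimal illustration; the function names and criteria strings are hypothetical placeholders, not COBIT-prescribed content.

```python
from enum import IntEnum

class Maturity(IntEnum):
    """COBIT-style maturity scale, 0 through 5."""
    NON_EXISTENT = 0
    AD_HOC = 1
    REPEATABLE = 2
    DEFINED = 3
    MEASURED = 4
    OPTIMIZED = 5

# Hypothetical per-function criteria: the evidence expected to justify a
# given level. Real criteria must be worked out for each function you assess.
criteria = {
    "incident_response": {
        Maturity.DEFINED: "Documented IR policy and runbooks exist",
        Maturity.MEASURED: "MTTD/MTTR metrics are tracked and reviewed quarterly",
    },
    "log_monitoring": {
        Maturity.DEFINED: "Logging standard defines required sources and retention",
        Maturity.MEASURED: "Coverage and alert-quality metrics drive tuning",
    },
}

def next_level_criteria(function: str, current: Maturity) -> str:
    """What must be demonstrated to move one level up the scale."""
    target = Maturity(min(current + 1, Maturity.OPTIMIZED))
    return criteria.get(function, {}).get(target, "criteria not yet defined")

print(next_level_criteria("incident_response", Maturity.REPEATABLE))
# -> Documented IR policy and runbooks exist
```

Keeping the criteria in one structure like this also makes the "prescriptive" goal below easier to meet: two assessors reading the same criteria strings should land on the same score.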
To make this work, your score explanations need to be:
- Prescriptive: The characteristics required to achieve the next-higher level of maturity should be clear and objective. An assessment should yield similar results regardless of who conducts it.
- Process-oriented: Even for security functions that rely primarily on technology, you should evaluate the process you use to choose, deploy, and monitor that technology. Focusing too heavily on the products or tools in use is likely to make the assessment irrelevant within a few years.
- Uncomplicated: Security organizations must constantly respond to auditors, regulators, business partners, and other stakeholders in support of various assessments. A maturity assessment should not require extensive background data or evidence; the evaluation should be based on high-level discussions and observations.
When building the framework for this model, it's helpful to consider functions described in the standards and regulations that are core to your organization's control framework, although the assessment should not descend to the level of individual control assessments. In addition, it's important to consider the governance and oversight functions of the security organization, which often don't show up in industry standards. Those functions include strategic planning, budgeting, capacity planning, skills management, and performance management. Also, evaluate how well the security organization works with other relevant functions, such as compliance, audit, legal, and lines of business.
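Because every function sits on the same 0-5 scale, a full assessment reduces to a simple table of scores that can be compared against a prior baseline. The snippet below is an illustrative sketch; the function names and the scores are invented for the example.

```python
# Hypothetical assessment snapshot: technical, governance, and
# cross-functional areas all scored on the same 0-5 maturity scale.
current = {
    "incident_response": 3,
    "log_monitoring": 2,
    "strategic_planning": 1,
    "budgeting": 2,
    "skills_management": 1,
}

# Last year's baseline (illustrative numbers).
baseline = {
    "incident_response": 2,
    "log_monitoring": 2,
    "strategic_planning": 1,
    "budgeting": 1,
    "skills_management": 1,
}

def progress(current: dict, baseline: dict) -> dict:
    """Per-function change in maturity since the last assessment."""
    return {f: current[f] - baseline.get(f, 0) for f in current}

for function, delta in sorted(progress(current, baseline).items()):
    print(f"{function:20s} {delta:+d}")
```

A year-over-year view like this is exactly the kind of evidence of measurable progress that the roadmap discussion below calls for, without claiming to measure whether you are secure.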
The objective of a maturity assessment like this is to provide a platform for discussing and demonstrating what the security department does, to help plan your security roadmap, and to show that security investments are leading to measurable progress. This approach will not tell you whether you are secure, and it does not take into account key aspects of building a roadmap, such as risk exposure and available budget. However, security professionals looking for a straightforward way to baseline their approach to security (and benchmark their program against historical assessments and/or peers) should find this a worthwhile exercise.