What's the biggest mistake companies make on security?

Ulsch: Companies need to integrate physical, technological and administrative controls in their security policies, but many aren't doing that. This is critical because security procedures that aren't technical can still have an impact on the IT side.
Give an example.

Ulsch: I've seen cases where companies focus their background checks on the executive level and miss the relatively low-level employee who ends up with high-level access to critical information. This could be someone who takes the fingerprints of contractors, new employees and visitors and enrolls them in the biometric ID system. Though that person is low-level, he handles a large number of private biometric identifiers, such as thumbprints and fingerprints. Whenever someone is enrolled, the fingerprint appears on the monitor before it's encrypted. The administrator can't print or e-mail that image and can't copy it into a Word document, but he can take a high-resolution digital photo of that fingerprint and turn it into a data file. Over a year, he could aggregate several thousand fingerprints.
How would the average background check miss someone like that?

Ulsch: The person is deemed low-level, so he either has had no background investigation at all, since investigations often focus on executive-level employees, or he received a standard check in the local court of jurisdiction. Over the past five or seven years, maybe he lived in two or three jurisdictions, so you go back and check with those courts. Each court is reviewed for arrests and convictions, and you find nothing. But the person could be a felon arrested and convicted under federal law, and because no federal court background investigation was authorized, the company never learns about the federal convictions.
And that person could go on to do some damage …

Ulsch: Yes. The person may have ties to organized crime in Russia, or maybe he sells those thousands of fingerprints, which are later used for identity theft or to commit a terrorist act. You could have a terrorist who ends up with one of these IDs and uses it to gain entrance into the U.S. Later, the investigation finds that the employee worked at your company and went unnoticed. Then it's on the front page of the Wall Street Journal, and you have to ask what the damage to your brand will be.
How widespread is this kind of security gap?

Ulsch: In almost any company we go into, we find issues with very real potential for significant legal, reputational or financial damage from something like this. And it revolves around a lack of integration between physical security, IT security and risk management. Historically, we've had a lot of separate security and risk silos; only recently have companies started tying those components into one entity.
With that in mind, what are the most important elements of a security program?

Ulsch: Policy is key, because technology in the absence of policy gives us a false sense of security. When we first started to see the proliferation of firewalls, a company would say, yeah, we're secure, we have a firewall. That was a false sense of security. What's most critical is how a company socializes technology, especially when it's a global enterprise. How do you do it across a workforce that may be very non-technical, with multiple language barriers in multiple countries?
Looking at today's technology, which threats tend to get overlooked?

Ulsch: Blogs constitute a real threat in terms of leaking sensitive information. In one company I dealt with, an IT architect disclosed sensitive information on a blog. He was blogging with friends, trying to figure out a specific problem. Once a hacker on the other end gained his trust, the architect disclosed his company's security architecture. Less than 24 hours later, his company was hit with a massive attack. Internet-enabled camera phones are another problem. Should you allow them into meeting rooms where sensitive information is being shared? These are things companies need to address in their security policies.
This was first published in July 2006