What has the past year been like for your organization?
Noblot: We have already gone through an important period of tests that we call technical rehearsals [in July, October and December]. These are critical milestones because we actually have people in operations and a team to [do penetration testing]. We tested the readiness of the technology, people and procedures. Last January, February and March, we had a series of test events during actual competitions at the venues -- actual skiers and hockey teams competing -- when we tested our systems. We've had a lot of preliminary milestones.
The lifeblood of the Olympics is volunteer help. Do you do background checks on all of them? What kind of access controls are in place?
Noblot: Volunteers are heavily involved, but they are only one population inside the Games network. We have 2,000 IT people; 1,200 of them are volunteers. We are going to issue between 80,000 and 90,000 accreditations, and all of them have access to the intranet.
We have implemented an identity management system. We have also provided our staff with hardened kiosks that provide access to the intranet. Activity at those kiosks is limited--USB devices cannot be used, nor can CDs--they are completely hardened.
When we can provide purpose-built systems, we do it. When we need to provide a system used by different people, we use access control. A lot of effort went into reviewing job descriptions. We have 240 roles for our staff, and we went through the process of reviewing access requirements for all 240 roles. It was an expensive but necessary investment.
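The role review described above amounts to role-based access control: each role carries only the permissions the review approved for it. A minimal sketch of that idea follows; the role names and permissions here are hypothetical examples, not the committee's actual 240 roles.

```python
# Hypothetical role-to-permission map produced by a review like the
# one described in the interview. Each role gets only the permissions
# its job description was reviewed to require.
ROLE_PERMISSIONS = {
    "timing-operator": {"timing.read", "timing.write"},
    "press-liaison": {"results.read"},
    "sysadmin": {"timing.read", "results.read", "accounts.manage"},
}

def is_allowed(role: str, permission: str) -> bool:
    """Grant access only if the permission was reviewed into the role."""
    return permission in ROLE_PERMISSIONS.get(role, set())
```

A shared system can then check `is_allowed(user.role, action)` before every sensitive operation, so access decisions follow the reviewed roles rather than individual accounts.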
What were the highest priorities identified by your risk assessments?
Noblot: There are two main risks, and we've aligned our security solutions to the risks we have. The first, like every corporation, is protection from viruses and intrusions. A virus could be catastrophic; if you have a virus at the wrong time or place, you might have to postpone a competition. We are responsible for the network and IT infrastructure that supports all the systems like timing, scoring and information that is transmitted in real time to the press. If we have a virus in one of those systems, it cannot work in real time. If you are receiving intermediate times in the downhill two minutes after the competition is over, that would be really bad.
The second type of risk is people with malicious intent trying to piggyback on the visibility of the games. The impact of someone breaking into a scoreboard, or hijacking a feed with their own messages, is bad for the Olympic movement.
At the end of the day, the Olympics is about visibility. For two weeks, everybody is looking at the Olympics. It's a very specific business to be in a critical mode for only two weeks. We're not like a normal organization.
We have to be secure before the games. We are the custodian; we have to make sure of the integrity and confidentiality of data processed by a system.
You were in charge of security for the 2004 Athens summer games. What lessons learned are you applying to Turin?
Noblot: The amount of change that happens in this project is surprising. You would figure it would be a straightforward project -- implementing what you've done before. That's not the case.
We have a high number of change requests. Our telecommunications partners have changed [AT&T in 2002 in Salt Lake City, OTEGlobe in Athens, TelecomItalia in Turin] and they all have different ways of doing business. We need to adapt to partner and cultural changes and the different organizing committees.
In the last 16 months, we've had more than 700 change requests for our IT systems. From a security perspective, you need to put in controls that can adapt to a large number of changes. It's important to be involved in all business decisions because all decisions impact security. Security is now a part of the change-control board. No changes are accepted without the approval of security. That has been very important.
How about from a technical perspective?
Noblot: We depend on volunteers, but we have a lot of staff and partners who join the games late. In the two weeks before the games, many of them requested accounts, and we had to create thousands in that time. Imagine the load on the system. Sysadmins at that time should be monitoring CPU cycles and memory; you don't want them spending nights creating accounts for users. This was a big motivation for us to implement an identity management system for Turin.
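The bulk provisioning an identity management system automates could look something like the sketch below. This is a simplified illustration under assumed conventions (usernames derived from names, one-time random passwords), not the actual Turin system.

```python
import secrets

def provision_accounts(full_names):
    """Create accounts in bulk so sysadmins aren't doing it by hand.

    Returns a mapping of derived username -> one-time password.
    The username convention here is a hypothetical example.
    """
    accounts = {}
    for name in full_names:
        username = name.lower().replace(" ", ".")
        # Generate a random one-time password for first login.
        accounts[username] = secrets.token_urlsafe(12)
    return accounts
```

Fed a roster of thousands of late-arriving staff, a batch job like this turns a multi-night manual task into a single automated run, which is the operational point Noblot makes.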
Going back to Salt Lake City, the biggest lesson learned from a security perspective was around the number of security events--not alarms--that were generated. We monitored all of the systems in the network--thousands of workstations, hundreds of switches, servers and devices. We quickly faced information overload. We implemented a SIM solution for Athens that did real-time security monitoring.
During the 16 days in Athens, 4.7 million security events were generated [290,000 a day]. You could put that information on 10 screens and they would be blinking for days. We put a system in place and spent a lot of time integrating it with the environment [Unix, Windows]; it was a large effort. We had to write the monitoring rules for specific environments and make sure the same alarm coming from a different system was evaluated according to the criticality of the source. For example, an alarm from a venue has a higher priority than one from a system that has nothing to do with competitions. We had to write those rules, and out of the 4.7 million events, 430 were high-level events and only 22 were critical, where someone internal was trying to do something against policy [systems are separated from the Internet by multiple DMZs].
We have put in similar systems for Turin, and we expect those figures to be the same or lower.
Does the motivation behind an attack play into your risk assessment?
Noblot: Motivation is always a difficult question. For most of those attacks, we caught the people behind them, and there was always an excuse. No one is going to tell you, "Yeah, I was trying to hack you."
Among the 22 critical events, several were people who unplugged our system and plugged in their own. Other events were people trying to use privileged accounts when they were not supposed to. Those are security policy breaches, and you are in trouble, whatever your motivation.
What is both good and bad is that we paired a preventive control with the monitoring control. If you plug in an unauthorized system, an alarm is raised and your connection is denied. Since you don't have time to figure out a person's motivation, you don't know what they wanted to do. If they plug in and release a virus that kills your switch, you're down. We're responsible for it. Our policy may be a narrow-minded view, but it's necessary.
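The alarm-and-deny behavior for unauthorized devices can be sketched as an allow-list check on connect. The MAC addresses and function name below are hypothetical; real deployments would typically enforce this at the switch (e.g. port-based access control) rather than in application code.

```python
# Hypothetical allow-list of known device MAC addresses.
AUTHORIZED_MACS = {"00:1a:2b:3c:4d:5e"}

def on_device_connect(mac, alarms):
    """Deny unknown devices and raise an alarm, as the policy describes.

    `alarms` is a list that collects alarm messages for the
    monitoring console.
    """
    if mac not in AUTHORIZED_MACS:
        alarms.append(f"unauthorized device {mac} connected")
        return "denied"
    return "allowed"
```

The key design point Noblot makes is that prevention runs before any motive can be assessed: the deny happens immediately, and the alarm gives analysts the trail to investigate afterward.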
About the author
Michael S. Mimoso is Senior Editor of Information Security magazine.
This was first published in January 2006