This article can also be found in the Premium Editorial Download "Information Security magazine: Keeping on top of risk management and data integrity essentials."
Data Protection Challenge
Data Reserves Spill Over
Nowhere is risk assessment of greater priority at ChevronTexaco than in oil exploration.
Oil rigs aren't very mobile. Once these 10-story steel behemoths take root above a reservoir of crude, they have a home for 25 to 30 years.
Before that happens, terabytes of data from seismic researchers and oil field simulations are collected, stored and protected. Decision makers have to trust the integrity of that data before they can commit to a drilling site. They have to believe the data hasn't been maliciously or inadvertently altered, and it has to be readily accessible from literally anywhere on earth.
"Decision quality is critical," Jackson says. "Good decisions require a lot of inputs, and one of those is good data."
Following the 2001 merger that joined Chevron and Texaco, myriad departmental divisions were created, compounding the data management quandary. Data never stops flowing through network pipes as the company's scientists continuously search for new reserves. There can be as many as 50 three-dimensional seismic simulation projects underway at any given time, generating more than 350 terabytes of information.
Then, of course, there are the simulations done on the oil fields to determine where reserves are, how the oil moves and the potential recovery techniques, which can account for another 10 terabytes of data annually. One offshore oil platform can produce 10 gigabytes of data daily, while refineries can generate a terabyte of raw process data from any number of real-time sensors.
And all of this happens before a drill ever breaks ground.
"The biggest challenge with this much data is finding what you need, getting the correct version and having confidence that it hasn't been altered," Jackson says. "We always have to be mindful of the confidentiality of this data. Data is a critical asset that has a value back to business. If it's released, you compromise its value, and that costs you money."
Jackson minimizes data risk by going back to the basics--"fundamental blocking and tackling," as he calls it. Traditional access and authentication controls are in place that provision data access to only those who need it. Policy dictates that each business unit is responsible for maintaining access control lists, and database logs are monitored regularly for breaches.
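The need-to-know provisioning described above can be sketched in a few lines. This is an illustrative toy, not ChevronTexaco's actual system; the unit and dataset names are hypothetical, and a real deployment would sit behind a directory service and log every decision for the kind of monitoring Jackson describes.

```python
# A minimal sketch of need-to-know access control: each business unit
# maintains its own access control list (ACL), and access is granted only
# when a dataset is explicitly listed. All names here are hypothetical.
acl = {
    "exploration": {"seismic_2004", "gulf_field_sim"},
    "refining": {"process_sensors_q1"},
}

def can_access(unit: str, dataset: str) -> bool:
    """Deny by default; grant only if the unit's ACL lists the dataset."""
    return dataset in acl.get(unit, set())

print(can_access("exploration", "seismic_2004"))  # True
print(can_access("refining", "seismic_2004"))     # False: not on refining's ACL
```

The deny-by-default stance is the "fundamental blocking and tackling" Jackson refers to: anything not explicitly provisioned is refused.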
"If you do the fundamentals well, you manage risk well," Jackson says. "Before it gets out of hand and impacts the business model, you have to take a strategic perspective and decide how to strategically manage information better. Looking at these projects helps us better identify, access and store data."
A cornerstone of the data management structure at ChevronTexaco is securely storing the reams of data generated, not only for regulatory reasons, but for historical research. While storage and security may be distant IT cousins, ChevronTexaco is doing its part to change that relationship.
Jackson has seen the connection between storage and security, and understands that the old mainframe-perimeter approach to securing assets of a hard exterior and a soft interior doesn't work today. Internet and third-party connections--even contractors on the inside with access to sensitive information--have dissolved the perimeter and forced managers to bring security closer to data at rest.
"Over time, a computing infrastructure gets so complex that even well-meaning employees could get confused and make critical mistakes," Jackson says. "Data stored on tapes and inside databases is what brings value back to the enterprise. Anything done to properly secure data and the technology that houses it is the right way to manage information protection activities."
Regulatory compliance, post-9/11 sensitivity to business continuity and disaster recovery, and high-profile breaches of security and processes--such as ChoicePoint, Bank of America, LexisNexis and numerous U.S. universities--are also forcing the two broad disciplines together.
Data has a lifecycle, and its value changes as it ages, forcing IT directors to manage it differently at each stage. But storage professionals, who only recently coined the notion of "information lifecycle management" (ILM), have largely lacked awareness of business processes, physical security requirements, and the confidentiality and integrity needs of business data, according to Jon Oltsik, a senior analyst at Enterprise Strategy Group.
"Storage has emphasized information lifecycle management without paying lip service to security," Oltsik says. "But this is right down Broadway for security managers. ILM without security is dead on arrival; storage people are just catching up to this kind of thinking."
The dynamic between storage and security has been reshaped by the growing volume of data stored on networks. Data at rest on IP or Fibre Channel networks is now vulnerable to the same security risks as any other network traffic.
Storage managers don't usually consider security at the technology tier, but rather at the host or network level, Oltsik says.
Data integrity for a storage manager is usually handled via backup, but what's often overlooked is the integrity of the backup process itself: who has access to the data and when, and whether it's at risk of being stolen.
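One common way to verify that a backup hasn't been altered, sketched below, is to record a cryptographic digest when the backup is written and re-hash before trusting a restored copy. This is a generic illustration, not a description of ChevronTexaco's process, and the sample data is hypothetical.

```python
# Sketch of a backup-integrity check: store a SHA-256 digest alongside
# (but separately from) the backup, then recompute it on restore.
# A mismatch signals corruption or tampering.
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

backup = b"seismic survey, block 42"      # hypothetical backup payload
recorded = digest(backup)                 # stored in a separate, protected location

# Later, before trusting a restored copy:
restored = b"seismic survey, block 42"
if digest(restored) != recorded:
    raise ValueError("backup integrity check failed")
```

Keeping the recorded digest apart from the backup media matters: an attacker who can alter the tape but not the digest store cannot make a tampered copy verify.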
Jackson must balance these risks and ensure ChevronTexaco's business units succeed safely. The rising prominence of risk management is something most security managers must brace for.
"I think it's the normal evolution of things," Jackson says. "Practitioners understand this. I see some of my peers having their own scope of responsibilities expand to include other elements of risk. It's the natural order of things."
This was first published in April 2005