Huff and puff, but Microsoft SQL Server 2008 is more secure than 2005 and 2000
Microsoft's SQL Server 2008, released Aug. 6, came to market faster after SQL Server 2005 than 2005 did after the 2000 release, driven by high demand for additional performance, functionality and security features. What's more, Microsoft faces stiff competition on three fronts: traditional database technology, Web-enabled application platforms and the open source community.
Oracle has been releasing newer versions of its database almost annually, in flavors designed for small business, enterprise, online and government customers. Sun Microsystems' acquisition of MySQL has spawned a new competitor in the enterprise database market, and MySQL's post-acquisition credibility has grown considerably. This, combined with enhanced support, documentation, licensing and service options, will inevitably cause more problems for the big guys.
The new twist on database technology comes from hotshot Web companies that are opening up their platforms, which are being leveraged like databases with hosting and application logic. Salesforce.com (AppExchange), Amazon (EC2), Google (APIs) and Indian powerhouse Zoho all have offerings in the distributed platform market.
The good news for the Microsoft community is that the security features in SQL Server 2008 are well thought out and properly implemented. By far the most significant change over its predecessors is its granular data security features: encryption, key management and meta data security enhancements. Although role-based permissions are not new, enhanced flexibility offers tighter controls on many more facets of your database. The days of issuing more access than necessary have passed.
Who Are You?
While you can still leverage your LDAP and Active Directory investments to log in to SQL Server in a secure manner, integration with non-Windows clients now supports full channel encryption. Full channel encryption uses SQL-generated SSL certificates by default, preventing almost all man-in-the-middle attacks out of the box.
Full channel encryption also secures usernames transmitted in SQL statements, along with any other payload-specific information. This is a significant improvement over 2005, where only username and password hashes were protected by default.
Things also get easier for Microsoft Group Policy shops. You can now manage and enforce password object properties such as expiration, lockout duration, age and lockout attempts for all SQL Server 2008 databases through Group Policy Objects (GPOs). While you could manage the underlying Windows operating system components of SQL Server 2005 natively, you can now manage both the OS and the SQL Server layer above it.
These features will help combat brute-force attacks, and the code to unlock an account is surprisingly simple. Prior to this integration, most DBAs did not properly set password policies to alert on, or even lock out, system administrator or DBA accounts, even though this has been common practice for system administrators for years. GPO password policy integration closes that loop.
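As a sketch (the login name and passwords here are hypothetical), enabling policy enforcement on a SQL login and unlocking it after a lockout each takes a single statement:

    -- Create a SQL login subject to the Windows/GPO password policy
    CREATE LOGIN web_app WITH PASSWORD = 'Str0ng!Passphrase',
        CHECK_POLICY = ON, CHECK_EXPIRATION = ON;

    -- Unlock the account after a lockout; UNLOCK must accompany
    -- a PASSWORD clause
    ALTER LOGIN web_app WITH PASSWORD = 'Str0ng!Passphrase' UNLOCK;

Because unlocking is this simple, there is little excuse for leaving lockout policies disabled on administrative accounts.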
Service or SOA authentication, which is heavily leveraged on the Internet, is improved significantly. SQL Server 2008 supports five authentication mechanisms: Basic Auth, NTLM, Digest Auth, Kerberos and Integrated Authentication, which is really just a combination of Kerberos and NTLM.
SQL Server 2008 also offers the ability to sign code modules with digital certificates. This applies to stored procedures, functions, triggers and event notifications, simplifying permissions management by providing granular access to tables and other objects. In essence, you can assign permissions to code modules, and allow users to only access the exposed entry points, not the underlying schema layout.
Signing code modules brings the added benefit of protecting against unauthorized changes. Both of these use cases add to a defense-in-depth design for improved overall security.
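A minimal sketch of module signing (the certificate and stored procedure names are hypothetical): create a certificate, then sign the module with it. Callers can then be granted access only to the signed entry point:

    -- Create a password-protected signing certificate
    CREATE CERTIFICATE sign_cert
        ENCRYPTION BY PASSWORD = 'C3rt!Passphrase'
        WITH SUBJECT = 'Module signing certificate';

    -- Sign the stored procedure; altering the procedure
    -- invalidates the signature
    ADD SIGNATURE TO dbo.usp_GetOrders
        BY CERTIFICATE sign_cert WITH PASSWORD = 'C3rt!Passphrase';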
A Privileged Existence
Public role. Although most of the core role-based access features remain similar to SQL Server 2005, additional functionality has been included to help Web-based applications sandbox users and protect against anonymous Internet-borne attacks. Microsoft has helped with the heavy lifting by creating a new public role, and by default in SQL Server 2008 each database user is automatically added to it. (Roles are similar to groups but can have associated privileges and access rights based upon database and application logic and functionality.) The public role was created to contain Internet users and restrict all types of access (see screenshot, below).
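For example, locking anonymous users out of a sensitive table can be a one-liner (the table name is illustrative):

    -- Deny all DML on a sensitive table to the public role
    DENY SELECT, INSERT, UPDATE, DELETE ON dbo.SecretSauce TO public;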
Meta data protection. A database or table's meta data can be just as important as its data. For instance, if I can receive verification that a table or specific column exists within a database, I can conclude that it's a good target for my SQL attacks. SQL Server 2008 allows you to protect this meta data via the error responses based upon a user's assigned role. So, for example, if all Internet users are within the public role and one attempted to access or even delete a table called "SecretSauce" without the proper privileges, the following response would be returned:
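The exact text varies by operation, but for a DROP attempt the permission-masking error looks like this:

    Msg 3701, Level 14, State 20, Line 1
    Cannot drop the table 'SecretSauce', because it does not exist
    or you do not have permission.

The attacker learns nothing: the same response comes back whether the table is missing or merely off limits.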
Smarter schemas. Significant in SQL Server 2008 is the implementation of schema-based security and the separation of schema from user. Prior to 2008, SQL Server logically paired a schema with a user: when you created a new user in SQL Server 2000/2005, a schema was created with it, and the user inherited its permissions as if they were his own. Now, you create a user and then create or assign him to one or more specific schemas. The user takes ownership of a schema, or several schemas, and that schema is then assigned rights to database elements. This additional database layer allows you to set more granular permissions and controls on users and database elements (see "For the Better," below).
This separation of objects allows a single user to create and implement several schemas versus 2005, where that would take multiple user accounts, thus potentially easing the security administration burden.
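A sketch of the new separation (the user, schema and role names are hypothetical): one user owns multiple schemas, and rights are granted at the schema level rather than per object:

    CREATE USER alice FOR LOGIN alice;
    GO
    -- One user can own several schemas
    CREATE SCHEMA sales AUTHORIZATION alice;
    GO
    CREATE SCHEMA reporting AUTHORIZATION alice;
    GO
    -- Grant rights on everything in a schema in one statement
    GRANT SELECT ON SCHEMA::reporting TO analyst_role;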
Securable objects. SQL Server 2008 builds on the concept of securable objects introduced in 2005. Logins and services known as principals can be mapped to various components, granting granular permissions to almost every object in the database, from tables, to stored procedures and functions, to certificates and assemblies. This permission granularity is a big improvement for data security, and it complements the public role quite nicely. On the server end, control is provided to restrict communications through network channels, including named pipes and other communication mechanisms.
Securable objects are linked to principals via permission sets. Permissions continue to use GRANT, DENY and REVOKE for defining permissions of a principal, but now you can go a step beyond and allow principals to authorize other principals to access controlled information.
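For example (the table and principal names are illustrative), delegating that authority uses the standard GRANT syntax with one extra clause:

    -- team_lead can now grant SELECT on this table to other principals
    GRANT SELECT ON dbo.Orders TO team_lead WITH GRANT OPTION;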
Agent proxies. SQL Server agent proxies bridge user credentials and job permissions. This provides a granular approach for granting permissions for each individual step within a task. This is in stark contrast to earlier versions that used a single, often all-too-powerful proxy account. Each subsystem can have any number of associated proxies.
There is one exception: the Transact-SQL subsystem executes with the permissions of the module owner. For example, if the owner is "Foster," then all Transact-SQL statements will execute under the Foster set of privileges.
It is important to note that when upgrading to SQL Server 2008 from 2005, a single-proxy account will be carried over into the new version. Significant thought should be given to determine how to divide the account and limit the permissions appropriately.
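Creating a proxy and scoping it to a single subsystem is straightforward. As a sketch (the proxy and credential names are hypothetical, and assume the credential already exists):

    USE msdb;
    GO
    -- Map an existing credential to a new proxy
    EXEC dbo.sp_add_proxy
        @proxy_name = N'ssis_proxy',
        @credential_name = N'etl_credential';

    -- Limit the proxy to the SSIS subsystem only
    EXEC dbo.sp_grant_proxy_to_subsystem
        @proxy_name = N'ssis_proxy',
        @subsystem_name = N'SSIS';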
Encryption is Key
Underpinning the various forms of encryption throughout SQL Server 2008 is an internal key management system. In the past, encryption key management was typically handled by external third-party products because SQL Server lacked robust key administration.
Extensible Key Management (EKM) protects everything from the data stored within to any Web services that exist, as well as access to the database itself. Additionally, EKM enables external and third-party products to integrate and register devices within SQL Server. Devices can be software-only products or "hard devices," such as hardware-based SSL accelerators. Registered devices within SQL Server allow applications and users to access and leverage their encrypted keys (see screenshot, below).
These devices, such as those from nCipher and SafeNet, often used by the federal government and large e-commerce organizations, use advanced cryptography features to set policy around key rotation schedules and key aging. Rotating keys, when automated by these products, ensures that data is protected in the slight chance a specific key is ever compromised. SQL Server 2008 also provides full hardware security module (HSM) support.
The latest offering also introduces Transparent Data Encryption (TDE) (see screenshot, below). At first, plug-ins were the only real encryption option. SQL Server 2005 introduced some native support for encrypting columns, and 2008 has improved that significantly.
TDE is full, managed database encryption. TDE can encrypt the entire database as well as all transactional and log data. TDE is an easy option for administrators to enable data protection without affecting the database structure. With previous versions, application developers frequently created custom code to encrypt and decrypt data. However, with TDE, these actions occur automatically when the data is written to disk and read from disk. This eliminates the extra developer work and strengthens overall data security.
The flexibility and maintenance benefits of TDE come at a cost. The CPU performance hit on the server and database could be in the double-digit percentage points on highly utilized systems.
Preparing the database is straightforward. The alter database command must be run by the database administrator similar to the following:
ALTER DATABASE is_mag_db
SET ENCRYPTION ON;
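That statement is actually the last step: TDE first requires a database master key, a certificate to protect the database encryption key, and the encryption key itself. A sketch of the full sequence (the certificate name and password are illustrative):

    USE master;
    GO
    -- Master key and certificate live in the master database
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'M@ster!Passphrase';
    CREATE CERTIFICATE tde_cert WITH SUBJECT = 'TDE certificate';
    GO
    USE is_mag_db;
    GO
    -- Database encryption key, protected by the server certificate
    CREATE DATABASE ENCRYPTION KEY
        WITH ALGORITHM = AES_256
        ENCRYPTION BY SERVER CERTIFICATE tde_cert;
    GO
    ALTER DATABASE is_mag_db SET ENCRYPTION ON;

Back up the certificate and its private key immediately; without them, an encrypted database cannot be restored.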
SQL Server 2008 has functionality for symmetric, asymmetric and certificate-based encryption right out of the box, for any standard included in the Microsoft cryptographic library.
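A minimal cell-level sketch using a symmetric key (the key name, password and sample data are illustrative):

    -- Create and open a symmetric key, then encrypt a value
    CREATE SYMMETRIC KEY cc_key
        WITH ALGORITHM = AES_256
        ENCRYPTION BY PASSWORD = 'K3y!Passphrase';

    OPEN SYMMETRIC KEY cc_key DECRYPTION BY PASSWORD = 'K3y!Passphrase';
    SELECT EncryptByKey(Key_GUID('cc_key'), N'4111-1111-1111-1111');
    CLOSE SYMMETRIC KEY cc_key;

DecryptByKey reverses the operation while the key is open.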
To audit databases in previous versions, an administrator would have to set up multiple triggers and alerts on various data points to log and later analyze results to determine irregularities. SQL Server 2008 simplifies this process. An auditor or an administrator needs to define a data point to be audited (for example, user action, data element, user or role) and then create server audit or database audit specifications on the server. From there, events can be viewed with the standard Windows Event Viewer or the log viewer in SQL Server 2008 Management Studio.
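As a sketch (the audit name, file path and target table are hypothetical), a server audit plus a database audit specification can capture reads and writes on a single table:

    USE master;
    GO
    -- Create the server-level audit target and enable it
    CREATE SERVER AUDIT compliance_audit
        TO FILE (FILEPATH = 'D:\SQLAudit\');
    ALTER SERVER AUDIT compliance_audit WITH (STATE = ON);
    GO
    USE is_mag_db;
    GO
    -- Capture SELECTs and UPDATEs on one table for all users
    CREATE DATABASE AUDIT SPECIFICATION orders_audit
        FOR SERVER AUDIT compliance_audit
        ADD (SELECT, UPDATE ON dbo.Orders BY public)
        WITH (STATE = ON);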
Complete configuration of auditing and logging features would take a few days; however, the flexibility would allow you to integrate detailed and robust logs into third-party solutions.
Making Life Easier
To help database and security administrators, Microsoft has updated its Surface Area Configuration Tool. The small, straightforward Win32 GUI allows you to easily disable unused SQL Server services, protocols, connections and ports, and it has been updated to support all of the new SQL Server features.
While this is a solid approach to security, it also creates additional work for the DBAs doing the install. A new tool, the Declarative Management Framework (DMF), will help. It manages the initial configurations for an instance or instances of SQL Server via a secure communication channel, and several policies, components and modules can be configured. It is a particularly good mechanism for managing very similar systems such as clusters and failover systems.
SQL Server has gotten significantly more secure with each iteration. The 2005 release was a must-have upgrade, and so is SQL Server 2008.
Companies are finding innovative, all-encompassing ways to satisfy multiple regulations.
Regulation bombards you from every direction. Failure to meet federal and state mandates such as Sarbanes-Oxley and state data breach notification acts threatens the reputation of your corporate brand and the personal freedom of your executive officers. Falling short on industry requirements such as HIPAA, PCI, the Fair Credit Reporting Act or even state law enforcement accreditation puts in jeopardy your company's ability to do business as well as your customers' personally identifiable information.
As an information security and risk professional, you've been thrust during the last half-decade into the crosshairs of an increasingly regulated business environment. Frameworks, audits, automation and GRC are the fabric of your being.
Redundancy cannot be.
"What you don't want to do is implement or test the same control three, four, five times over," says Marc Othersen, senior analyst in the security and risk management practice at Forrester Research.
So how are businesses managing multiple regulations without a massive duplication of efforts? Is there a catch-all framework that satisfies all the overlap?
Three enterprises servicing three different markets are building their version of a compliance "easy button," drawing on a multitude of resources to create a repeatable set of processes that would satisfy the grumpiest auditor.
MENDED SOX LEADS WAY
McKesson, with $101.7 billion in revenue in FY2008, has a mature Sarbanes-Oxley compliance program, and this is the model Sapp and his team are following to build a one-stop enterprise-wide compliance program.
Sapp, who has a development and project management background, says his organization isn't unlike much of the Fortune 500 in wanting to develop a set of repeatable processes to address compliance. He has taken steps to identify and understand McKesson's IT environment, map out and automate the testing of controls, assess and report on risk and increase the overall maturity of the organization's risk and compliance program. Right now, he says, McKesson is in an ad-hoc state, moving toward repeatable, and eventually standardized and optimized, processes.
"In three years, I would expect that we are at a standardized state," Sapp says. "That, for me, has us where we have a set of standards, processes and controls that are applied across the enterprise universally and consistently, moving toward optimized where we really almost get to a plug-and-play environment where regardless of who we acquire, we can plug them in, or if we choose to sell off an entity, it makes it an easy process for us."
Formerly, as McKesson's senior consultant for risk services (see "What's in a Title?," below), Sapp was business unit SOX coordinator in charge of the IT controls for the SOX program. Upon moving to his broader role, he quickly discovered how McKesson's numerous acquisitions had created a situation where the company operated in silos, with precious little in the way of standardized processes or a lifecycle approach for addressing regulatory mandates. His goals quickly became clear: overcome the siloed approach and build a program that will allow him to drive corporate performance through these activities.
McKesson's SOX program leverages the ISO 27001 standard for information security management and the COBIT framework for IT management and metrics.
Sapp says his organization has deployed Brabeion GRC suite, but believes a collaboration of tools will ultimately meet McKesson's needs. He is evaluating several other IT GRC tools that will help map multiple regulations, such as PCI and HIPAA, to these frameworks. SOX, PCI and HIPAA are McKesson's three largest compliance issues, and the company's SAP environment, which it uses for its financials, is the primary area of concern.
"We found many parallels where one piece of ISO will satisfy parts of each one of those regulations," Sapp says. Access controls, for example, are codicils of each of those regulations. "ISO allows us to map across that and ensure by meeting that one ISO objective, I can test once, and certify many [times]. If I'm using the same access control process across each one, then I can reduce the amount of testing I do. That's what I've been able to do with our SOX program. I can drastically reduce the amount of time we spend in audits because we have improved our process so much. We're getting through audits in what I would call record time and within our budget."
Sapp's current evaluation of GRC tools, he hopes, will further put out to pasture the tedious, laborious manual processes in place for collecting data from business units, testing and mapping controls to particular regulations. With 200-plus controls applicable to the SOX program, Sapp says that was his first target for automation with the Brabeion tool.
"We looked to an automated tool to help us test the controls, attach the evidence and keep the user from skipping ahead to the next step," he says. "I had one user tell me we've improved the quality of life here. We actually used SharePoint prior to automation, but SharePoint lacks the workflow you get in these tools."
Sapp says the GRC tools he's seen do a fine job of defining the assets and entities of an organization. He says they are solid for analyzing workflow and creating dependencies; this kind of intelligence can be applied outside of GRC as well. He adds that the tools are sound for collecting asset information (e.g., identifying unsupported or expiring versions of software), which helps in a risk assessment. Finally, he says the dashboard facilities are a strong means of providing a risk picture to the C-level.
In contrast, he says some tools try to do too much, and don't do very much very well. Products billed as turnkey, full-enterprise GRC programs sometimes suffer from poor workflow because of misguided focus. "Vendors sell hard on the tool rather than getting you to step back and look at process and strategy," Sapp says. "They don't think process and strategy first; they throw this toolset at you and say this will solve all your problems."
Forrester's Othersen says the tools at their core address compliance well, mapping sources, automating manual tests and providing solid reporting. Where they fail is in not linking IT risk to business risk.
"They don't have a business perspective in their risk engines," Othersen says. "All of them are IT focused, yet most risk happens in the line of business. If you lose credit card numbers, the line of business pays, not IT. Translating IT control failures into business risks is one of the biggest failings of those packages."
He adds that they don't address governance, either. "It's up to you as a CIO or security manager to use the tool to collect and analyze data on your own."
A FERM TOUCH
Isabelle Theisen, chief security officer for First Advantage Corp., deals with these vagaries with a homegrown concoction of established frameworks, processes and automated tools that implement not only a solid compliance program, but sound business practices (see "Consistency Counts," below).
"Business sees anything having to do with compliance as a necessary evil; they need it because they're being told they need it," Theisen says. "I'm trying to turn that around and say, 'No, you can also use IT governance, self compliance, business operations compliance and security to actually be a market differentiator against your competitors. You can turn it around and use it as a way of doing a better job against your competitors.'"
First Advantage is a data provider, servicing car dealers, mortgage services and employers with credit reports, background checks, skills assessments and more. The California-based company is subject to Sarbanes-Oxley, the Fair Credit Reporting Act, Gramm-Leach-Bliley, PCI and state data breach notification and privacy laws. Some of the regulations' requirements overlap, and prescriptive advice is minimal.
In response, Theisen architected what she calls the FERM (First Advantage Enterprise Risk Management) program to identify controls to cover as many regulations as possible. The framework is a blend of COBIT, ISO and NIST recommendations and a mix of manual processes to identify risk and controls and ultimately feed them into a GRC tool from ControlPath, which the company purchased 18 months ago.
"We implemented the tool across business units to perform assessment, identification, testing and remediation work to ensure we meet compliance for all of our business units," she says.
Theisen compared the manual processes in place prior to automation to typical audit work--lots of face-to-face interviews, surveys and questionnaires to determine what was in place in the different business units and inventory security, risk management, IT governance and other regulatory processes. This information was kept in a spreadsheet--not practical, Theisen says. Now it is updated into the ControlPath tool.
"I would always recommend an automated tool," Theisen says. "You do have to have a repository of that information, even if you build an easy Access database. Otherwise, you're going to ask the same questions every year to the businesses. How would you build a baseline? It would be a nightmare to manage your compliance levels manually."
Automation also helps with trending and tracking of progress against control objectives.
Identification is the first of four deployment phases of the FERM process. Inventory such as service offerings and business unit assets are gathered and uploaded to the tool.
Assessment is the next phase. Threats, vulnerabilities and risk that could impact a particular service offering are assessed. Business impact analysis, data classification and threat modeling are done against every application that applies to a service offering in a business unit. "Because we do a data classification, we can focus only on high-risk applications for a service offering," Theisen says. "Business management has been extremely supportive because they know we are focusing on what is critical to them--high-risk applications within their service offering--and we don't have to do everything."
Those two phases are the most time consuming, she says, but are absolutely necessary.
The third phase is testing. Having established what the high-risk issues are, Theisen's group can focus on what is critical to a business unit. Application and infrastructure assessments are conducted prior to a controls analysis questionnaire. The questionnaire is tailored to the service offering in question, Theisen says. ControlPath builds a master controls library mapped to all the controls relevant to First Advantage, enabling it to build customized questionnaires for each business unit.
"It's where automation matters," she says.
Remediation is the final phase. Based on the results of testing, Theisen has a list of remediation items prioritized based on risk--all flowing from the organization's business impact analysis and data classification.
Theisen says a major challenge involves keeping up with the fluid changes in regulations where very little automation exists on the front end to gather data. Often organizations are forced to wait for vendors to update their control libraries, or do it manually.
Another challenge is the narrow focus on compliance versus doing what is right for the business by implementing sound business practices to manage data.
"I try to stay away from talking about regulations," Theisen says. "This is about sound business practices."
ITIL LEADS WAY
Nelson Martinez, systems support manager for the city of Miami Beach, tackles the intersection of these demands by centralizing the city's IT infrastructure and applying ITIL as a service management platform and NIST standards to address security. This centralization becomes more important in the coming months as the city implements its egovernment initiative, which essentially creates a virtual city hall online.
"Being public funded, there's an ethical issue there. We hold ourselves to a degree of responsibility. We like to be in line with certain industry-wide security policies," Martinez says. "We're pretty much an ITIL shop and we do everything with change controls like private industry. We track everything. We have SLAs."
Martinez's organization is responsible for the city's infrastructure--networks, servers, desktops, gateways, and even disaster recovery. It supports departments with largely mobile workforces such as public safety, which must securely connect, for example, to state and federal databases for background checks during traffic stops.
There are strict FDLE configuration guidelines to which Martinez's systems must adhere, otherwise an incident could not only jeopardize sensitive public information, but endanger the department's ability to procure funding should it fail accreditation.
Standardization under ITIL is crucial, Martinez says. There is one IT department for all city agencies in Miami Beach. "It's truly the only way I want to run an IT shop. Standards are in place. There's a unified security policy that dictates how things are done," Martinez says. "It's the only way we have adequate controls in a heterogeneous environment."
Change controls are the biggest win ITIL affords the security of Martinez's shop.
"You still have to take the initiative to do your scanning and your pen-tests, see where your issues are and fix those," Martinez says. "Once you have established a baseline where you can say, 'I'm for the most part secure,' the change control processes that ITIL says you need to have in place allow you to track changes in your environment."
Martinez says Miami Beach deployed Symantec Enterprise Security Manager to handle its vulnerability scanning and monitor for policy deviations. The tool comes with templates for NIST and NSA standards, for example. Martinez relies on these security templates to map compliance with industry regulations such as PCI and internal policies for mobile connectivity. The city also uses eEye's Blink for real-time IPS and IDS monitoring.
"Symantec ESM is very good at creating our policy templates for servers and tells us whether we're in or out of compliance," Martinez says. "The tool is a good way of showing an auditor that we're doing quarterly audit compliance runs against our machines and remediating."
In the event a security issue threatens the safety of data (and compliance), Martinez says he can resolve it by examining the root cause. Using ITIL, he can determine whether changes in a server or firewall setting, for instance, led to the particular issue.
"It helps you troubleshoot and get back to square one and figure out where this problem was introduced," he says. "If you've got an SLA, how can I guarantee to my customer that I'm going to meet 5 9s for that service?
I need to make sure I am proactively controlling changes in the environment, or making sure those changes are reviewed prior to being implemented."
Martinez says it's vital that the risks associated with any change are assessed prior to implementation.
"Change has to be well thought-out," he says. "I believe it's critical to the security and availability of production environments. If you do not have adequate change control strategies in place, it's a matter of time before you have a major outage."
Forrester's Othersen says most organizations are in similar straits to these three: they're in the process of adopting frameworks and are on their way toward a normalized compliance environment.
"About 10 percent have achieved that nirvana state where they're normalized, their frameworks are rationalized and automated," Othersen says. "The rest are putting down frameworks, getting budgets. There's no procurement or engineering yet, but everyone is getting there. It's just cost inefficient to run things the way they are today."