Vendor complexity clouds security

Kurt Seifried discusses how vendor complexity leads to weak security.

Earlier this week, I released a security advisory warning that everyone using "secure deletion" software (such as PGPWipe, BCWipe or East-Tec Eraser) on an NTFS file system (i.e. Windows NT, 2000 and XP) is at risk -- a sizable group of people. Every single vendor tested, with one partial exception, fails to delete alternate data streams, an NTFS facility that lets one file carry multiple sets of data (in effect, multiple files). As a result, large amounts of sensitive data are left on hard drives -- data that people believe has been securely deleted. How is it possible for such a large problem to occur, not just with one vendor but with all of them?
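
To make the risk concrete, here is a minimal sketch in Python showing how an alternate data stream rides along with an ordinary file. It assumes a Windows machine and an NTFS volume; the file names are purely illustrative.

```python
# Minimal ADS demonstration -- Windows with an NTFS volume only.
# The "file:stream" colon syntax is how NTFS names alternate streams.

# Write the visible (default) data stream.
with open("report.txt", "w") as f:
    f.write("public contents")

# Attach an alternate data stream to the very same file.
with open("report.txt:secret", "w") as f:
    f.write("sensitive data hidden in an alternate stream")

# A wiper that overwrites only "report.txt" scrubs the default stream;
# the bytes in "report.txt:secret" remain on disk untouched.
with open("report.txt:secret") as f:
    print(f.read())
```

A normal directory listing shows only report.txt, which is exactly why a wiping tool that walks the file system naively never notices the extra stream.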

In a word: complexity. Despite recent announcements by software vendors (such as Microsoft) that they will concentrate on security and stability instead of new features, the growth of complexity will not slow down, let alone stop. File systems such as VFAT and FAT32 are simple. By comparison, NTFS is an incredibly complex file system: alternate data streams (ADS), the master file table (MFT), the transaction log, and encrypted files and folders, to name just a few features. This additional complexity is not understood by many people, as evidenced by the fact that secure file deletion vendors have overlooked alternate data streams for several years. Nor is this type of problem unique to Microsoft products: the Linux kernel is growing rapidly in both capabilities and sheer size, as are most modern operating systems.
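
Handling that complexity correctly is possible, but it takes work. A deletion tool, for instance, has to enumerate every stream of a file before wiping it. The rough sketch below does that walk with ctypes; it assumes Windows Vista or later, where FindFirstStreamW is available (in the NT/2000 era of this advisory, the equivalent walk had to be done with BackupRead).

```python
# Rough sketch: enumerate every data stream of a file with ctypes.
# Assumes Windows Vista or later (FindFirstStreamW).
import ctypes
from ctypes import wintypes

class WIN32_FIND_STREAM_DATA(ctypes.Structure):
    _fields_ = [("StreamSize", ctypes.c_longlong),
                ("cStreamName", ctypes.c_wchar * (260 + 36))]

kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
kernel32.FindFirstStreamW.restype = wintypes.HANDLE
kernel32.FindFirstStreamW.argtypes = [wintypes.LPCWSTR, ctypes.c_int,
                                      ctypes.c_void_p, wintypes.DWORD]
kernel32.FindNextStreamW.restype = wintypes.BOOL
kernel32.FindNextStreamW.argtypes = [wintypes.HANDLE, ctypes.c_void_p]
kernel32.FindClose.argtypes = [wintypes.HANDLE]

INVALID_HANDLE_VALUE = wintypes.HANDLE(-1).value

def streams(path):
    """Yield (name, size) for every data stream of the given file."""
    data = WIN32_FIND_STREAM_DATA()
    handle = kernel32.FindFirstStreamW(path, 0, ctypes.byref(data), 0)
    if handle == INVALID_HANDLE_VALUE:
        raise ctypes.WinError(ctypes.get_last_error())
    try:
        while True:
            yield data.cStreamName, data.StreamSize
            if not kernel32.FindNextStreamW(handle, ctypes.byref(data)):
                break
    finally:
        kernel32.FindClose(handle)

# Prints "::$DATA" plus ":secret:$DATA" for the file from the earlier sketch.
for name, size in streams("report.txt"):
    print(name, size)
```

A wiper built on this walk would overwrite each named stream before releasing the file's clusters, rather than touching only the default one.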

These increasingly complex systems are routinely integrated with each other, causing the number of interactions to grow exponentially. Enterprise networks usually contain tens of thousands of hardware devices (from computers and routers to printers and plant machinery) running thousands, if not tens of thousands, of different operating systems, software packages, security patches and so on. A typical operating system such as Windows or Linux has 20 to 200 or more software updates that the vendor recommends you install.
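
A back-of-the-envelope calculation shows why this gets out of hand. If each of n components can independently be patched or unpatched, the network as a whole has 2 to the power n possible states, and n*(n-1)/2 pairwise interactions that might need testing. The component counts below are arbitrary:

```python
# Back-of-the-envelope: n components, each either patched or unpatched,
# give 2**n distinct network-wide configurations and n*(n-1)//2 pairwise
# interactions that may need testing.
for n in (10, 20, 200):
    print(f"{n:>3} components: {n * (n - 1) // 2:>6,} pairwise interactions, "
          f"2**{n} = {2**n:,} patch-state combinations")
```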

So what can be done to deal with this added complexity in systems and networks, in order to increase their security and reliability?

Communication is a major and typically overlooked problem. Developers of operating systems need to communicate with third-party vendors: the creation of a new file system will assuredly affect anyone who makes products such as secure file deletion utilities or file recovery tools. Meetings between developers need to take place so that ideas are brainstormed and discussed -- operating system vendors with third-party application vendors, your web development team with your legacy application support teams. The creation and distribution of documentation is not sufficient; often, developers in one area will be unaware of the details and consequent issues of another. Unfortunately, political and business considerations sometimes stand in the way. Microsoft, for example, produces products like ISA Server that compete directly with third-party firewall and VPN products. Competitors in the same space (such as VPN vendors) often work together on interoperability testing and the like, but are unlikely to share details of their software or source code.

Verification of software packages, specific configurations, procedures and so on needs to take place throughout the implementation cycle. Vendors typically test software updates and new releases extensively, yet routinely miss major issues -- and it is unlikely that their testing lab is identical to your enterprise network.

Before any major software package, update or configuration change is rolled out, it should be tested in a lab, and once rolled out it should be tested and verified again. AT&T learned this the hard way when a single machine running a new software release took down its frame-relay network for an extended period, preventing credit-card transactions in many urban centers, among other effects. This is a cyclic task: once you have verified the latest software, configuration or procedure, you will have to start again when the next one arrives, as sketched below.
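
The sketch below outlines that cycle in Python; the check functions are hypothetical placeholders standing in for your real lab and production test suites.

```python
# Sketch of the cyclic test -> deploy -> verify loop described above.
# The checks are placeholders; real ones would run your lab and
# production test suites.

def passes_lab_testing(change: str) -> bool:
    print(f"lab: exercising '{change}' against a mirror of production")
    return True  # placeholder verdict

def passes_production_checks(change: str) -> bool:
    print(f"prod: re-verifying after deploying '{change}'")
    return True  # placeholder verdict

def roll_out(change: str) -> None:
    if not passes_lab_testing(change):
        print(f"'{change}' rejected in the lab; it never reaches production")
        return
    print(f"prod: deploying '{change}'")
    if not passes_production_checks(change):
        print(f"'{change}' failed post-rollout verification; rolling back")

# Every new package, configuration or procedure restarts the cycle.
for change in ("service pack", "firewall rule change", "new backup procedure"):
    roll_out(change)
```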

Demanding secure systems from vendors and systems integrators is another option. Design and purchase specifications should contain items such as "security updates must not impact required functionality and must be made available within X days, or a penalty will apply." Unfortunately, it is virtually impossible to negotiate such terms with large software firms; even the U.S. Justice Department has trouble applying leverage against Microsoft. And such clauses could place an undue burden on the smaller firms willing to accept them.

Ultimately, there are no easy solutions. A continuous cycle of planning, testing, implementation and verification is required, combined with vigilance and monitoring to detect and address new threats. Security is not a solution, nor even a journey, as you will never arrive at a state of security. Security is an ongoing process, a way of life that must be incorporated into your business processes if it is to be effective.

About the author
Kurt Seifried is a full-time security analyst/researcher. Coming from a technical background, he is moving into the business side of infosec and risk management.


This was first published in January 2002
