Disgusted by security issues and poor performance, Winn Schwartau makes the switch from Windows to the Mac and details the bumps in the road along the way in his "Mad as Hell" series.
The basic security principles of standalone devices [mainframes and PCs] were first published in 1983 and finalized in 1985. Called the Orange Book, or Trusted Computer System Evaluation Criteria [TCSEC], it set the standards by which we need to look at the security of any computing environment, be it an enterprise or a single box. For my purposes here, though, I will just look at the PC and ignore the network for a while, because IP stacks are pretty darned reliable. God bless Open Source.
Although the Orange Book is ancient, the current version, called the Common Criteria, is still based upon… well, the Basics… something all too often ignored by the WinTel world, despite its protestations to the contrary.
The Orange Book [et al] created a set of guideline criteria by which security could be evaluated. One of the most basic principles of evaluated security is that once an application, O/S or piece of hardware has been evaluated to a specific degree of security performance, there can be no changes at all. The vendor cannot change so much as one byte of code without having to go through a long and costly security evaluation process again.
Therefore, by definition, every change implies some unknown degradation of the security of the system.
The American approach to security evaluation was followed in the early 1990s by a European effort called ITSEC [Information Technology Security Evaluation Criteria]. It introduced a key concept called the TOE, or Target of Evaluation. The Europeans recognized that a prime failure of the American TCSEC was that it considered each component of a system [O/S, application, crypto, etc.] in isolation and not as a homogeneous whole. [After all, I am a systems engineer.] In my mind this was a breakthrough that has escaped the attention of PC manufacturers, thereby exacerbating the security problems we face today.
The TOE approach [now included in the modern Common Criteria approach to security evaluation] advises looking at the system as a whole: as an integrated unit, with all of the operational pieces glued together as designed for a particular application. Good. I like that.
However, computer manufacturers who integrate O/Ss, applications, utilities, drivers, patches, hardware, protocols, BIOS and more are expected to produce a system that works reliably.
For the life of me, though, I cannot see how they can reasonably be expected to produce anything that functions according to my initial specifications for how I [and most everyone else] work. It's too hard. There are too many variables. And we wonder why we don't have better security and why WinTel fails?
Most of us accept that security and functionality are inversely proportional [S=1/F]. The answer is intuitively obvious: remove some of the unneeded complexities and give us back some sense of security, particularly availability.
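The S=1/F relation is rhetorical rather than a measured law, but the intuition can be sketched as a toy model. Everything below — the function name, the feature counts, and the formula itself — is an illustrative assumption, not data from the column:

```python
# Toy illustration of the column's claim that security and functionality
# are inversely proportional [S = 1/F]. The scoring formula and feature
# counts are illustrative assumptions only.

def security_score(feature_count: int) -> float:
    """Return a toy 'security' score that shrinks as features grow."""
    if feature_count < 1:
        raise ValueError("need at least one feature")
    return 1.0 / feature_count

# A stripped-down device scores higher than a feature-laden one.
phone_basic = security_score(2)    # calls + texts only
phone_loaded = security_score(20)  # GPS, SMS, PDA, POS, ...
assert phone_basic > phone_loaded
```

The point the toy model makes is the column's: every feature added to the denominator erodes the numerator's share, so trimming unneeded complexity buys back security, particularly availability.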
To my chagrin, we are only going to compound our errors and make things worse. My cell phone makes and receives calls. It does not do GPS, SMS, AES, PDA, POS or anything else that would turn it into a guaranteed failure. But the vendors are adding features there, too.
About the author
Winn Schwartau is one of the country's leading experts on information security, infrastructure protection and electronic privacy. Schwartau is president and founder of Interpact Inc., The Security Awareness Company, which develops information security awareness programs for private, public and government organizations.