Blaine W. Burnham has dedicated many years to cybersecurity education as a founder of research organizations, first at the NSA and later at leading universities, where he worked to formalize undergraduate, master's and doctorate-level degree programs.
Burnham was head of several information security research initiatives during his 11 years at the National Security Agency, including the University Research Program, which provided funding to help establish academic programs in security and information assurance. He also served as director of the Infosec Criteria and Guidelines organization -- responsible for publishing the Rainbow Series and Federal Criteria -- and chief of the commercial communications security and trusted products division, where he developed the product security profile. He established the information security research council for the Department of Defense and the intelligence community.
When he left the NSA, he helped build the cybersecurity education and research program at the Georgia Institute of Technology as director of the Georgia Tech Information Security Center. During his tenure at Georgia Tech, Burnham created the master's degree program and established partnerships with industry and government. Later, he served as the founding executive director at the University of Nebraska Center for Information Assurance, helping to lay the groundwork for its degree programs.
Burnham also worked at the University of Southern California Viterbi School of Engineering and is among those credited with developing its cybersecurity program. Currently, he is associated with New Mexico Tech, lecturing on cybersecurity education as part of the Computer Science and Engineering Department's speaker series. He holds a doctorate in mathematics from Arizona State University. He has worked on national security projects -- tooling and techniques -- for the Los Alamos National Laboratory and the Sandia National Laboratories. Marcus Ranum caught up with Burnham to discuss the state of information security and the areas in which today's cybersecurity education programs could do better.
I don't want to ask 'What about Equifax?' but it's kind of the question of the day. Companies want to put something together, but they use off-the-shelf stuff that's vulnerable. How do we get out of this rut?
Blaine Burnham: Truth be told, there's nothing wrong with it. The difficulty is just that there's nothing right with it either. How many firewalls do you need? Well, until the guy with the suit comes around and convinces you that you need one. And then the next guy … convinces you that his firewall is distinctly better than the five or six that you already have, and so you buy his, too.
All of the technology of today, as near as I can tell, is in the 'sweep it up after it happens' mode.
Obviously, that won't work. Recovery is such a broad industry trend, though. Even the virtual machine 'revert it when it gets owned' model -- the problem is we still don't have high-integrity systems and people who know how to run them in a high-integrity mode.
Burnham: … and the VMs are all software. One of the strong points that was made in the Ware Report [Security Controls for Computer Systems: Report of the Defense Science Board] in 1970 was that if you're relying on software, you're on a fool's mission. Kids nowadays are not taught that. I challenge you to look at the academic programs at most of the universities today -- even the famous NSA centers of academic excellence -- most of them don't know, let alone teach, that there is a foundational set of papers reaching back to the late '60s that speak to all of this.
Why do you think that is?
Burnham: I'll tell you what happened, because I was there when it happened. There was a thing called the President's Commission on Critical Infrastructure. The general running the commission was Robert Marsh. When they were interviewing representatives of various government agencies, they asked, 'How come you're not doing [that well on] your cybersecurity stuff?' One of them said, 'Well, we don't have the money.' And that was fatal. Because they did have the money; they just didn't know how to prioritize. The second one said, 'I think we don't have the time.' It wasn't until they got to about the third agency that they figured out the right answer: 'We don't have the right people.' So that got wired into the mentality of the report. … The general said, 'Let's do [what] we do with ROTC: We'll give scholarships; we'll teach people as they come in to school. …' And I replied, 'Sir, that's a great idea except for one thing: We don't have an army to draw instructors from. We don't have instructors, we don't have books and we don't have curricula.'
It turns out that at that time, when I was at NSA, I had hired Cynthia Irvine to build a master's degree curriculum. And she did, and it was phenomenal. So, I suggested to General Marsh, 'How about you go to NSA or whoever and offer full-ride scholarships for Ph.D.s? So you give the graduate five years to get a Ph.D. If the graduate goes into academia, you forgive it a year at a time, and you give them two tickets to hire graduate students to work with them. If you do that, you'll develop the coursework, you'll develop the people who can teach it, and your students will arrive automatically.' The general thought about it and did exactly not that.
Teaching about critical infrastructure, as critical infrastructure!
Burnham: When I was at R2 [Trusted Systems Research] at NSA, I ran a thing called the University Research Program -- the URP. We dumped a million dollars, plus or minus change, every year into academics. That [served] several purposes: One of them was helping assistant professors get funding so they could get tenure. It supported graduate students for a couple years, so they had people they could work with as well. I was really not that concerned about what they worked on, as long as it was kind of in the right hemisphere. What we needed to do was put the pedal to the metal in institutionalizing that kind of thing -- once you have the funding and the body of professors then you can apply windage about what they work on.
Around 1997, the cybersecurity education in schools all changed. They stopped trying to teach good design and switched to 'We'll teach everyone to code in Java.'
Burnham: Well, that was a different problem, sort of. NSA had stood up the Centers of Academic Excellence, and they were obliged to teach a laundry list of all the stuff a bunch of suits thought they should teach. It wasn't prioritized; there was no foundational stuff. It was just a massive laundry list. The problem is that there was no carrot: If you want to be a center for academic excellence, you teach all this stuff, fill out this laboriously bad form, and we'll get together and do a kind of a Johnny Carson thing to promote it.
I stood up the center at Georgia Tech, and we had a half-dozen faculty involved -- it took two years to establish and teach. Fred Cohen [a computer scientist known for computer virus defenses and other information security research] stood one up, too, and they taught [cybersecurity education] on weekends, and they were done in something like 26 weeks. But there was no quality control with it because there was no carrot.
The impetus to teach the right stuff fell away because nobody cared -- hence, the swing to searching for easy answers.
The middle-level IT management, especially in the government -- they don't know what to ask for, so they're very easy for the guys in suits to steamroller.
Burnham: Absolutely! How do they get graded? They get graded by spending the money, don't they?
But deeper than that, there's no downside. Roger Schell [gave] a magnificent presentation at one gathering when we were in California, and essentially his presentation was 'best practices fail.' Just because we're doing best practices doesn't mean we're doing responsible protection. It doesn't mean you know what you're doing. But it does mean that you're able to duck the bullet on legal responsibility because you're now doing best practices. That's the bar we set; that's what we get.
I had an idea the other day for an app that I'm not going to write. But the app would be called CheckBox Antivirus, and it wouldn't actually do anything except let you install it and check off that you had 'antivirus' on your system for your auditors.
Burnham: (Laughing) That's basically what we're doing! We're doing a great deal of that, and it doesn't work.
I talk to executives -- often during breach responses -- and the executives say, 'Well, the guys told me it was OK and that they had been doing proper security …'
Burnham: Right: What guys? What do they know? What does OK mean?
You will find, if you have the chance to visit with any of my students, a very different dialogue. If you use the word secure, they will back up and give you a sort of funny look, then ask you for your threat statement that goes with your assertion of secure. I have helped them understand that the word secure by itself is essentially meaningless.