OXON HILL, Md. -- Add psychology to the long list of topics CISOs must master in order to secure their organizations. According to one Gartner Inc. analyst, unless security leaders know the right techniques to get inside users' heads, they'll never eliminate bad security behaviors.
Speaking to attendees about user security strategies this week at the 2014 Gartner Security & Risk Management Summit, Research Vice President Andrew Walls said many organizations fail in their efforts to improve user security because they simply don't understand the five key psychological factors that influence behavior: reputation, choice, relationships, mastery and integrity.
"These elements are in constant flux in our lives, but they motivate the structure of our people," Walls said. "As these ebb and flow and move in different directions, it changes the actual actions they will take."
Walls said opportunities to influence users to make secure choices lie in understanding how users want to be perceived (reputation), how much range of authority they have and where their limits are (choice), how they feel about their connections with their colleagues (relationships), how they are able to develop and maintain skills and competencies (mastery), and how individuals develop trust in each other (integrity).
An understanding of those five elements, Walls said, allows security teams to apply certain techniques that encourage user security.
For example, Walls compared the action of sending an email containing sensitive data with buying moldy bread. While an individual knows right away that buying the bread was a bad decision because the mold provides immediate negative feedback, all too often security teams fail to provide the same immediate feedback to users who violate a security policy.
"We keep training them every year, every quarter, but they keep making the same decisions," Walls said. "You see it on your DLP [data loss prevention] system or SIEM [security information and event management], but the user sees nothing."
Instead, Walls said, security teams must provide timely feedback to users when they violate policy or make poor decisions, and perhaps more importantly, provide positive reinforcement when users make good decisions.
Negative feedback alone, Walls added, serves to alienate users from the security team and subtly tells them that someone else is looking out for security on their behalf.
"If you're just the punisher, it increases [users'] sense that security is an externality," Walls said. "Users say, 'It's not under my control. If I could feel the consequences, I could punish myself.'"
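The feedback loop Walls describes -- surfacing a policy violation to the user the moment a DLP or SIEM system flags it, rather than letting the alert die on a security dashboard -- can be sketched in a few lines. This is a minimal, hypothetical illustration only: the names (`DlpAlert`, `notify_user`) and the idea of a webhook-style alert handler are assumptions for the sketch, not any vendor's actual API.

```python
# Hypothetical sketch of immediate user feedback on a DLP policy violation.
# Assumes the DLP system can hand alert events to a handler like this one;
# DlpAlert, feedback_message and notify_user are illustrative names only.

from dataclasses import dataclass


@dataclass
class DlpAlert:
    user_email: str  # who triggered the alert
    policy: str      # which policy was violated
    action: str      # what the user did, e.g. "email_sent"


def feedback_message(alert: DlpAlert) -> str:
    """Build an immediate, user-facing explanation of the violation."""
    return (
        f"Security notice: your action '{alert.action}' was flagged under "
        f"policy '{alert.policy}'. Please review the guidance for this policy."
    )


def notify_user(alert: DlpAlert, send=print) -> str:
    """Deliver the feedback right away instead of leaving it on a dashboard."""
    msg = feedback_message(alert)
    send(msg)  # in practice: email, chat message, or desktop notification
    return msg
```

The design point, per Walls' moldy-bread analogy, is that the notification goes to the user at the moment of the decision, so the consequence is felt immediately; a real deployment would also route positive reinforcement through the same channel when users make good choices.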
Walls advocated for making security the "easy decision" for users, drawing on the context of the two opposing forces in each person's mind -- the reflective and reactive thought processes -- as described in the book Switch by Chip and Dan Heath.
Noting that more than 90% of the choices a person makes happen in the quick, repetition-based reactive part of the brain, while more complicated choices are made by the slow, energy-burning reflective thinking process, Walls said security choices must be reflexive and intuitive.
To do that, Walls offered suggestions based on the five key psychological factors.
To foster users' reputations, Walls suggested peer recognition programs that publicly acknowledge and reward individuals who make choices that support information security.
Users must be given enough responsibility to learn how to make the right choices, he added, as well as an opportunity to make mistakes in a safe environment.
"All of us have sandboxes or dev/test areas where we build all our stuff, watch it crash and burn, and recompile. We do that all the time," Walls said. "We need to let our people do that. Let's give them environments in which they can try out behaviors and fail."
Walls also suggested fostering discussion on security and risk topics so users can build relationships. He mentioned one Gartner client that posted its security policy on a company wiki and asked employees to read it and suggest changes. Ultimately it didn't matter if any changes came from the exercise, he said, because the desired result was simply to get employees communicating about security.
Finally, Walls said users will trust in the integrity of the security effort if they see others, particularly company leaders, actively adhering to the program.
"You have to involve everyone. From the senior executive level down, they need to model the desired behaviors," Walls said. "Every action they do or decision they make will be public information -- someone in the company will know, and then everyone in the company will know."
Attendee Rose Futchko, director of IT services with the National Education Association in Washington, D.C., agreed with Walls that to effect a culture of security, it has to happen from the top down.
"The executives do have to lead by example when it comes to security awareness and the program you're trying to implement for the rest of the staff," Futchko said. "They need to see that they are responsible for what they do."
Even though Walls' advice relied heavily on psychological theory that may be foreign to many infosec pros, Futchko said it's a welcome change because IT has long overlooked the important link between psychology and technology.
"When you look at solutions and problems and business change, it's all psychology," Futchko said. "Whether it's a change of a new business process or security, you still have to consider the psychological aspects and work with the individuals."