SAN FRANCISCO -- Organizations with insider threat programs spend a considerable amount of time worrying about hacking techniques and malicious code, but FBI experts told RSA Conference 2013 attendees that insider threats are not usually hackers.
According to counterintelligence findings presented last week by the Federal Bureau of Investigation's Insider Threat Program, employees, former employees and contractors -- those who joined the organization with no intent of wrongdoing -- pose the biggest threat.
These findings, which are based on 20 years of espionage case investigations, indicate that contrary to popular belief, when it comes to data loss and spying, the real-world insider threat is not a stereotypical hacker who covertly siphons off sensitive information on internal systems and networks.
Authorized users who hold a level of organizational trust and carry out seemingly legitimate activities with malicious intent pose the biggest threat, according to Patrick Reidy, the FBI's chief information security officer, who heads up the Insider Threat Program. The program was created in the aftermath of the Robert Hanssen debacle, the 2001 incident in which the FBI agent was caught selling information to the Russians after 22 years of espionage.
Since Wikileaks started releasing classified information, Reidy said discussions about insider threats have intensified. How can security professionals root out potential insider threats before their organization faces fraud or catastrophic loss of intellectual property and sensitive data?
Educate 'accidental' insiders
First, Reidy said to make sure that best practices surrounding firewalls, antivirus software, policies and other system controls are followed. About a quarter of the incidents that the FBI tracks on an annual basis stem from what Reidy called "knucklehead" problems: unintentional acts in which employees compromise systems by not following procedures, losing equipment and sensitive data, clicking on spam, inappropriate emails or Web links, or mishandling passwords and accounts.
Reidy said the FBI spends about 35% of its response time on these types of incidents. Focusing on education can help minimize these problems; he said these incidents have dropped 7% at the FBI in the past year.
Malicious insider incidents are not numerous, according to Reidy, but in terms of damages they are the most costly. Of more than 1,900 incidents reported during a 10-year period, Reidy said about 19% were malicious insider threats. Based on information from multiple "open source" data breach reports and data loss surveys, the average cost per incident is $412,000, and the average loss per industry is $15 million. In several instances, damages reached more than $1 billion.
Data from cases prosecuted from 1996 to 2012 under the Economic Espionage Act (EEA), Title 18 U.S.C., Section 1831 -- which requires proof of links to a foreign government -- indicated an average loss of $472 million (damages claimed in court). China was involved in 71% of the cases; the remaining 29% involved other countries.
"I believe that organizations who have good insider threat and data protection programs will be around in 10 years," said Reidy, "and those that don't -- won't."
Reidy said it's important to identify patterns along an insider threat continuum using diagnostic analysis; the predictive analysis the FBI had used for years proved ineffective. Insiders who become threats do not act like everyone else, he said, and many reach a tipping point before they act.
Use a multi-disciplinary approach
A good insider threat program requires more than policy compliance and cybersecurity. "It's not a technical problem," said Kate Randal, an insider threat analyst with the FBI, whose research indicated that in 90% of cases the activity cannot be detected through malware defenses. "It's a people-centric problem," she said, "and people are multi-dimensional, so what you have to do is take a multi-disciplinary approach."
The goal of a good program is to deter, detect and disrupt insider threats, Randal said. Program implementers need to address personnel and classified information in addition to cybersecurity.
"It's important to focus on identifying your enemies, your people and your data," Randal said. That process includes asking questions like, "Who would be interested in your organization, and whom within the company would they target?" Randal said the FBI looks at a combination of cyber, contextual (e.g., financial status, travel, reports) and psychosocial information. In the enterprise, Randal said security professionals should work with their legal departments to determine the types of information that they can legitimately collect.
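As an illustration of what combining those categories of information might look like, the sketch below folds per-category indicator scores into a single risk value. The weights, category names and 0-to-1 scale are invented for the example; they are not FBI methodology.

```python
# Toy risk score combining the three information categories Randal named.
# All weights and scales here are illustrative assumptions.
WEIGHTS = {"cyber": 0.5, "contextual": 0.3, "psychosocial": 0.2}

def risk_score(indicators):
    """Combine per-category indicator scores (each 0.0-1.0) into one value."""
    return sum(WEIGHTS[cat] * indicators.get(cat, 0.0) for cat in WEIGHTS)

# A user with heavy anomalous computer activity but an unremarkable context:
print(round(risk_score({"cyber": 0.9, "contextual": 0.2, "psychosocial": 0.1}), 2))
```

In practice, the legal review Randal describes would determine which indicators may be collected at all before any scoring is attempted.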
Five lessons the FBI has learned about insider threats
- Insider threats are not hackers.
- Insider threat is not a technical or "cybersecurity" issue alone.
- A good insider threat program should focus on deterrence, not detection.
- Detection of insider threats has to use behavioral-based techniques.
- The science of insider threat detection and deterrence is in its infancy.
Source: Combating the Insider Threat: Real World Lessons, RSA Conference 2013
According to Randal, it's critical to understand the company's assets and to identify "the crown jewels of the organization." A good place to start, she said, is to list the worst-case scenarios -- both assets and individuals -- that could really cause damage to the company. Another good question to ask, Randal recommended, is, "What are the top-five systems with sensitive data?" From there, track the user data, logs and documents on these systems. Internal FBI security logs showed that more than 80% of data movement was done by less than 2% of the workforce.
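The 80%/2% observation suggests a simple first pass over transfer logs: total the data each user moves, then find the small set of users accounting for most of it. A minimal sketch, using a hypothetical log format (real programs would parse DLP or file-transfer audit logs):

```python
from collections import defaultdict

# Hypothetical log records: (user_id, bytes_moved) pairs. The field names,
# values and 80% share threshold are assumptions for illustration.
log_events = [
    ("alice", 500), ("bob", 120_000), ("carol", 300),
    ("bob", 80_000), ("dave", 700), ("alice", 400),
]

def top_movers(events, share=0.8):
    """Return the smallest set of users accounting for `share` of all data moved."""
    totals = defaultdict(int)
    for user, nbytes in events:
        totals[user] += nbytes
    grand_total = sum(totals.values())
    movers, running = [], 0
    for user, nbytes in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
        movers.append(user)
        running += nbytes
        if running >= share * grand_total:
            break
    return movers

print(top_movers(log_events))  # here a single user moves most of the data
```

A short list like this narrows where to look; it does not by itself distinguish a malicious insider from, say, a backup administrator whose job is moving data.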
Good insider threat programs should focus mainly on deterrence, not detection, Reidy said, because by then it is often too late. The FBI decided that it couldn't possibly identify every potential insider threat. Instead, it opted to use multiple tactics to deter insider threats.
One method involves creating an environment that discourages insider threats by "crowdsourcing" security; another is based on interacting with users. Giving educated users the tools and capabilities to encrypt, protect and classify their own data -- a departure from the centralized policies many organizations adopt -- transfers responsibility for information security to users, using positive social engineering to heighten awareness.
"The whole idea is creating rumble strips in the road," said Reidy, with one example being a warning screen whenever a user tries to download sensitive files onto a USB device. This, he said, lets users know, "Hey, we are watching you" and makes them answer the question, "Do you really want to do this?"
Reidy said such an effort may draw internal pushback from people who do not think the rank-and-file are capable of handling this level of responsibility. That scenario played out at the FBI, but Reidy found the objection unconvincing.
"At the bureau we have 14,000 people that come to work every day with firearms," Reidy said, "and you're telling me that they can't learn how to use a USB?"
Detection of insider threats should use data mining and behavioral-based techniques, Reidy said. Focus on diagnostic analytics and observable red flags, he said, such as changes in behavior. According to FBI research, psychosocial risk factors can range from disgruntled workers and people with high stress levels (those dealing with a divorce or financial problems), to vulnerable individuals and egotists.
Initial steps, Reidy said, may be as simple as sharing information with human resources. He recommended trying to notice, for example, if someone is printing a high volume of documents after hours on a Friday and is then fired on Monday. Base detection on users' baseline computer behavior (volume, frequency and patterns) and use strategies to lure threats out.
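Baselining a user's computer behavior can start very simply. The sketch below flags a day whose print volume deviates sharply from that user's recent history; the sample data, three-standard-deviation threshold and choice of statistic are illustrative assumptions, not a described FBI method:

```python
import statistics

# Hypothetical daily document-print counts for one user over recent weeks;
# a real program would pull these from print-server or audit logs.
baseline = [12, 9, 15, 11, 8, 13, 10, 12, 9, 14]

def is_anomalous(today_count, history, threshold=3.0):
    """Flag a count deviating from the user's baseline by > threshold std devs."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return today_count != mean
    return abs(today_count - mean) / stdev > threshold

print(is_anomalous(11, baseline))   # an ordinary day is not flagged
print(is_anomalous(250, baseline))  # a Friday print spike is flagged
```

A flag like this is a cue for the human follow-up Reidy describes -- checking with HR, for instance -- not a verdict on its own.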
"Finding a needle in a haystack is simple; finding a needle in a stack of needles is hard -- everyone is the same or acting the same way," Reidy said.
The science of insider threat detection and deterrence is nascent, according to Randal. The CERT Insider Threat Center at Carnegie Mellon University does threat modeling by industry as part of its Management and Education of the Risk of Insider Threat (MERIT) program. Cyber Insider Threat (CINDER) at DARPA is also doing work in this area and can serve as a valuable resource. Security professionals can use this research to implement or improve their insider threat programs, she said, and aid researchers by providing data on insider threat incidents.