
Hackers vs. lawyers: Security research stifled in key situations

The age-old debate between sharing information and covering legal liability is a growing issue in everything from bug bounties to disclosing ransomware attacks.

As ransomware attacks and zero-day vulnerabilities continue to dominate headlines, security professionals are lamenting the sizable gulf that remains between those who want to share information and those tasked with protecting the company legally.

Incidents, including a spate of ransomware attacks on companies around the world as well as critical bug disclosures, have led to calls for private companies and government agencies not only to strengthen their internal security practices, but also to better engage with outside researchers who can assess networks from a fresh perspective and sniff out critical security flaws.

"Use a 3rd party pen tester to test the security of your systems and your ability to defend against a sophisticated attack," White House adviser Anne Neuberger wrote in a recent memo to companies. "Many ransomware criminals are aggressive and sophisticated and will find the equivalent of unlocked doors."

Despite these efforts, however, many in the security research community continue to be frustrated with the legal walls that prevent them from sharing their findings with both other companies and the outside world.

A disturbing trend

While the decision of when and how to ethically disclose vulnerabilities to the public has long been one of contention between the hackers who suss out flaws and the software vendors that fix them, the debate has recently taken on a different tone.

One of the more problematic practices to arise in recent years is what some industry veterans see as the misuse of nondisclosure agreements (NDAs) not only to silence external bug hunters, but also to trick them into one-sided research deals.

Katie Moussouris, founder and CEO of Luta Security and the architect of vulnerability research programs at Microsoft and the U.S. Department of Defense, said these predatory nondisclosure deals are frequently being used to isolate bug researchers, making them think they are being pitted against others to disclose a bug they have, in fact, exclusively uncovered. This, in turn, allows the company to sit on a report and leaves the hacker entirely out of the discussion on when a security flaw can be released to the public.

"A disturbing trend has come about, and it is related to those private bounties. I have seen more out in the open, almost bragging about this technique of tricking a hacker to sign an NDA when there is nobody in the program but them," Moussouris explained. "There are more and more researchers who are being kettled into these agreements under false pretenses. It is certainly not in the spirit of vulnerability disclosure."

The practice is more prevalent among third-party bug bounty firms that vendors hire to act as the go-between for their developers and outside researchers, she said.

"It is not only deceptive, but honestly it is not helping their client," Moussouris said. "It is unethical to trick the researcher and to do so by deception. It has been remarkable in it is public flaunting of labor exploitation."

Transparency -- or a lack thereof -- was a key topic at RSA Conference 2021 last month. For example, several sessions featured lawyers and executives who advised organizations dealing with cyberattacks or breaches to limit the amount of information in the public record of an incident in order to reduce legal liability. Meanwhile, threat analysts lamented NDAs and other corporate obstacles that restricted the sharing of security research and threat intelligence and prevented other organizations from learning about and preparing for emerging threats like the SolarWinds supply chain attacks.


Casey Ellis, founder and CTO of Bugcrowd, said vendors can create problems for themselves with bug bounties and vulnerability disclosure programs (VDPs) when they don't get everyone on the same page ahead of time. In particular, he said many companies fail to properly communicate whether they want a private, single-user vulnerability disclosure program or a wide-ranging public bug bounty.

He recommends that management sit down with the legal, product and security departments, as well as any outside partners, to decide ahead of time how they want the bounty program to be conducted and what information they intend to make public.

"There is a lot of work that BugCrowd helps organizations do in terms of what they are committing out to the internet," Ellis said.  "When you put that on the internet, that almost becomes your social contract."

One area where both Ellis and Moussouris agree is that vendors should not rush into setting up a bug bounty program. Rather, companies should first ask whether they are better served by private penetration tests and vulnerability disclosure programs, which would prepare them to handle and digest the flow of reports from a public bounty program.

"When organizations come to us and say, 'We have not done this before, but we want to launch a public bug bounty program next Friday,' our response is 'No, you don't,'" Ellis said. "What is smarter to do is crawl, then walk, then run."

Government pushes breach reporting

On the data breach front, security research experts worry that legal concerns are increasingly at odds with government demands for prompt disclosure of ransomware attacks and data breaches. Recent attacks have brought about calls for stricter disclosure requirements.

Shortly after news of the Colonial Pipeline attack, CrowdStrike vice president of intelligence Adam Meyers told SearchSecurity that demands for prompt disclosure of a ransomware attack or other type of network breach can often run counter to other legal practices.

"The challenge is that when an incident response is conducted under the guidance of external council, an external council wraps everything with attorney-client privilege to protect the content and the documents and the work products produced by the incident response," Meyers explained.

"They want the company or entity to have the ability to figure out what they're going to do and how they're going to do it. If that document is public or discoverable or something like that, it could create a big mess."

At a recent congressional hearing on the SolarWinds attacks and supply chain security, Moussouris discussed some of the complications of government-mandated breach reporting, bug bounties and VDPs. She said mandatory breach disclosures within three days of detecting an incident "might not be possible" at that stage of an investigation because victims may not even know they have a serious breach.

In some cases, such as Colonial's ransomware incident, Meyers questioned the aim of prompt reporting requirements.

"These requirements to do reporting, I think, need to realistically look at the way that incident responses are being conducted and understand what all of the factors are," he said, "because I don't know that Colonial reporting this incident to the federal government would have changed the outcome."

Security news writer Alexander Culafi contributed to this report.
