It’s hard to ignore the parade of headlines that point to breaches caused by third-party security lapses. Yet organizations still fail to monitor contracted vendors until it’s too late.
“Responsibility follows the information,” cautions Rebecca Herold, CEO of Privacy Professor and partner at Simbus Security and Privacy Solutions. While third parties often come through breaches unscathed, organizations from Target Corp. to Dairy Queen pay a steep price for public incidents traced to vulnerabilities that vendors introduce.
That doesn’t surprise Herold, who has performed audits for companies of all sizes: “I could fill hundreds of pages with the security and privacy incidents that contracted entities have caused,” she says. A 25-year veteran who specializes in privacy and compliance, Herold is working on her 17th book, which explains how to establish a third-party security and privacy risk management program.
Marcus Ranum caught up with Herold, who is an adjunct professor for the information security and assurance master of science program at Norwich University Online, to discuss the problem with service level agreements (SLAs), and why a lot more action on the part of senior management is needed.
Marcus Ranum: I’m really excited to have a chance to ask you a bunch of questions about privacy and working with third parties. It seems to me that the prevailing wisdom is: Get an SLA. Right?
Rebecca Herold: It’s not really wise. But yes, it has been the long-standing [practice] of most organizations, often strongly supported by executive management, whose lawyers found ways to say: ‘Oh, no, Mr. BigDog; we really don’t need to do anything more than get our contracted parties to sign an SLA, along with a hold-harmless agreement.’
Organizations are starting to realize they must do more. Significant numbers of breaches are caused by the contracted entities doing work for their business clients. Look at the large Target breach at the end of 2013; it was traced to a contracted HVAC vendor whose network credentials were stolen. Do you know what the name of that vendor is? No, but you sure know that it was the Target breach.
It’s more complicated than that, though, isn’t it?
Responsibility follows the information. The blame in the court of public opinion -- and oftentimes, a judicial court -- goes to the business that collected the consumers’ information.
Look at the privacy breaches that occurred at hundreds of Jimmy John’s [locations], Dairy Queens, Chick-fil-As and other restaurants in the summer of 2014. All caused by the same point-of-sale system vendor. … Did that vendor’s name make the headlines? Did the vendor pay for all the credit monitoring? No. Responsibility follows the data, and, generally, the blame remains with the company that collected the information from the individuals to begin with.
Just this week a New York healthcare provider had the health information of 2,700 patients breached. Did it cause that breach? No, one of its business associates lost a laptop and a smartphone that contained that information. Here’s the fun fact: The laptop was encrypted, but the nurse using it had the encryption key in the laptop bag that held the laptop. The smartphone did not have any security on it at all.
The business associate is obligated under HIPAA [the Health Insurance Portability and Accountability Act] to follow all the security and privacy requirements, but it didn't, so it faces penalties. And so could the healthcare provider if the U.S. Department of Health and Human Services’ investigation shows it did not take actions to reasonably assure its business associates had appropriate security in place.
I know it’s a complicated problem and I’m afraid it’s one that gets swept under the carpet.
Yes, it is complicated. But organizations need to realize they must know more about the risks that their contracted vendors are bringing to them, and then establish a security and privacy oversight program for third parties -- contractors and business associates -- to effectively deal with it. They have to understand that the businesses they are entrusting with their information, and access to their systems, must have at least the same level of security as [the controls] they require for their own organization. But most do not.
Since 2000, I’ve done over 300 third-party reviews and audits, including vendor, information security and privacy programs. ... The risks that these small-to-large businesses, in all industries and locations throughout the world, brought to the companies hiring them -- and the associated information assets -- were often significant. But, in most cases, the companies contracting with them had no clue and the third party also had no clue that what they were doing was putting information assets at huge risk.
Here are just a couple of examples:
A large multinational financial corporation hired a small -- five-person, all family members -- business to house the systems for one of its newly acquired business offerings, which had about 80,000 clients. The small business did not have any disaster recovery plan documented: ‘We all live together and talk every day, so we know what to do.’ [It] only made backups once a month, and it did not encrypt the client data stored on its Web server.... My client made sure the vendor improved upon its information security and privacy program, which was pretty much non-existent, in order to continue the contract.
Another client, a large hospital system -- a HIPAA-covered entity (CE) -- outsourced patient calls about billing. I discovered the business associate wrongly believed anything that could be found publicly -- names, addresses, phone numbers and email addresses -- was not considered protected health information. They were creating databases with all such data from their CE client and selling it to marketing firms, among many other risky and noncompliant actions; the CE terminated the contract.
In a large number of situations, business associates -- and vendors in general -- do not have a dedicated position for information security or privacy; or that position is in name only, and the person filling it has no experience or understanding of actually implementing information security or privacy protections. … I spent the past year creating a new service to help businesses to effectively manage and oversee the information security and privacy risks of their third parties on an ongoing basis, so they can more quickly discover risks and address them right away. I’m focusing on the healthcare space initially, but then this year I will make another such service available for all types of industries.
Executive leadership must open their eyes to the many risks that contracted entities present to their business, and then invest the time and resources into an effective vendor information-security and privacy-oversight management program.
What are some of the pieces of the process that an organization should have in place? I’m assuming it’s more of a process and management problem than a technical one, isn’t it?
Yes, it really is a management process, but certainly technology can be used to support it. ... At a high level, organizations need to identify all their vendors that have access, in some way, to their information -- of all forms -- and information systems. Most of the organizations I’ve talked with throughout the years haven’t done this. It should be documented.
Identify the risks those vendors present to the organization based on a variety of factors, including the types of information they are accessing, whether or not they are storing sensitive and personal information within their own systems, and the types of safeguards they have in place for those systems. Document it.
Determine which vendors are high, medium and low risk; then dedicate attention appropriately. Perform regular security and privacy reviews -- there are many ways to do this -- for the high-risk vendors, as well as appropriate checks for the medium- and low-risk vendors. Keep an eye out for any published reports of breaches for the vendors they are using.
Keep including the security and privacy clauses in vendor contracts. And make sure they contain sufficient details of the requirements necessary for the types of services the vendor is providing; also, be sure to include a right-to-audit clause. Based upon the amount of risk the vendor brings with it, the organization may also want to include penetration testing and vulnerability testing within that audit clause.
Terminate the contracts of vendors that will not appropriately mitigate their risks. Yes, I know this can sometimes be challenging to do, based upon the services the vendor is providing and any long-standing contract. However, if they are putting your organization at significant risk, it isn’t worth continuing the relationship and contract.
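The tiering step Herold describes -- scoring each vendor on what data it touches, where it stores it, and what safeguards it has -- can be sketched as a simple scoring function. This is a hypothetical illustration, not Herold's actual methodology; the vendor names, factors and thresholds are assumptions for the example.

```python
from dataclasses import dataclass

@dataclass
class Vendor:
    name: str
    handles_personal_data: bool      # accesses sensitive or personal information
    stores_data_offsite: bool        # keeps copies on its own systems
    has_documented_safeguards: bool  # e.g. encryption, DR plan, audit reports

def risk_tier(v: Vendor) -> str:
    """Assign a coarse high/medium/low tier from the factors above.

    The weights and cutoffs here are illustrative assumptions only.
    """
    score = 0
    if v.handles_personal_data:
        score += 2
    if v.stores_data_offsite:
        score += 2
    if not v.has_documented_safeguards:
        score += 1
    if score >= 4:
        return "high"
    if score >= 2:
        return "medium"
    return "low"

# Hypothetical vendor inventory -- the documented list Herold says most
# organizations lack.
inventory = [
    Vendor("PayrollCo", True, True, False),
    Vendor("HVAC-Services", False, False, True),
]
for v in inventory:
    print(v.name, risk_tier(v))  # PayrollCo high, HVAC-Services low
```

In practice the scoring would draw on questionnaire answers and audit findings rather than three booleans, but the output serves the same purpose: deciding which vendors get full security reviews and which get lighter-weight checks.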
Hearing your suggestions, the first thing that comes to mind is the old Russian proverb ‘Doveryai, no proveryai’ -- Trust, but verify. In the early ’90s, I knew a garage startup that was planning to offer a secure mobile email service to the federal government, and their service agreement basically said, ‘If something goes horribly wrong, we’ll go out of business.’
It seems to me that not enough thought goes into how to put the worms back in the can once they’ve gotten out, or if that’s even possible. What do you tell organizations about that?
Indeed, trust but verify! And I’m happy to see you attribute that saying to the Russians and not to Reagan or one of the other politicians more commonly cited.
Yes, SLAs are just words on paper. ... In those 300-plus vendors’ reviews I performed, one of my questions to the IT and security managers was whether or not they had implemented all the requirements within the SLAs. In 80% to 85% of the responses, the people responsible for actually implementing the security and privacy controls had never even seen the SLAs.
The SLAs were typically established and signed by the acquisitions or legal areas, and then nothing more than, ‘Yep, we’ve got the legal taken care of ... go to it,’ was ever communicated to the IT and information security groups. This was a real eye-opener for the three large entities for which I did the bulk of those audits. … They changed from a purely contract-based process to following a more hands-on, ongoing communications process with their vendors.
And for their high-risk vendors, [they started] requiring the CEOs to submit monthly attestations -- the CEO of a business associate with 400,000 employees in India did this, along with other validating documentation, back as early as 2003; I was impressed. Other vendor requirements included undergoing third-party audits at least annually; having the vendor complete an online risk assessment -- I have an advanced-capability version of this coming out this quarter; providing a recent, no older than six months, SSAE 16/ISAE 3402 SOC 1 Type 2 report that shows satisfactory assurances are in place; and so on. The best assurances to use depend upon the types of risks the vendor brings.
Regarding getting the cats herded once they’re loose -- to use a cuter, cuddlier idiom -- it is possible, if the organization wants to continue doing work with the contracted vendors. And many do if they have established a personal business relationship with those working at the vendor, and know and trust them to make the appropriate changes.
On the other hand, one of the clients that I did over 100 reviews for used the results … to terminate the relationships of vendors that were revealed to have bad security practices. In some of those cases, the executives were looking for a way to terminate the relationship established through a legacy multiyear contract anyway.
A friend of mine who worked for a big Internet service provider once told me: ‘We have never made good on an SLA, because our sales guys just tell them, ‘We’ll give you free service for a month’ instead.’ Do you recommend a means test for providers?
Those tables were turned by one of my large -- 200,000 employees -- multinational manufacturing clients. They got fed up with the bad service, systems’ downtime and various ‘little’ breaches that they were experiencing from their contracted vendors. So they included in their SLA: 1) For every minute their system was down as a result of the contracted vendor, the vendor would have $5,000 deducted from the next payment from my client; and 2) for every personal record breached as the result of the contracted vendor, the vendor would have to pay for 2 years’ credit monitoring for each individual and would have $500 per individual record deducted from the next payment from my client. The CISO there told me the uptime increased dramatically, and the number of breaches fell dramatically. So this worked for them, probably because they were a huge organization that a lot of vendors did outsourced work for and wanted to continue doing work for. For smaller organizations that outsource, this would probably be a harder requirement to get approved.
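The penalty clause Herold quotes is simple enough to express directly: $5,000 deducted per minute of vendor-caused downtime, plus $500 deducted per breached personal record (with credit monitoring paid separately by the vendor). A minimal sketch of that deduction, using the figures from the clause:

```python
DOWNTIME_PENALTY_PER_MINUTE = 5_000  # deducted from the vendor's next payment
BREACH_PENALTY_PER_RECORD = 500      # deducted per breached personal record
CREDIT_MONITORING_YEARS = 2          # paid by the vendor, per affected individual

def sla_deduction(downtime_minutes: int, records_breached: int) -> int:
    """Total deduction from the vendor's next payment under the clause."""
    return (downtime_minutes * DOWNTIME_PENALTY_PER_MINUTE
            + records_breached * BREACH_PENALTY_PER_RECORD)

# A hypothetical incident: 30 minutes of vendor-caused downtime plus
# 1,000 breached records.
print(sla_deduction(30, 1_000))  # 150,000 + 500,000 = 650000
```

Even a modest incident produces a six-figure deduction, which is presumably why the CISO saw uptime rise and breaches fall once the clause took effect.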
Right-to-audit seems to me to be one of those things that sounds good on paper but is probably insanely difficult in practice. The results can be terrifying. I have one friend who had a provider that was required by contract to keep its browsers patched and up to date before accessing a certain system -- so they looked at the browser identifier strings and found the provider was less than 50% compliant. I’m sure you’ve seen stuff like that.
Yes, I strongly recommend a right to audit for high-risk vendors and providers. … And you are right; the results of the assessments often were startling and scary for the client that contracted me.
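The check Ranum describes -- inferring patch compliance from browser identifier strings -- can be sketched with a simple user-agent parse. The browser, minimum version and log lines below are hypothetical assumptions for illustration; a real audit would also handle other browsers and spoofed strings.

```python
import re

MINIMUM_MAJOR_VERSION = 110  # hypothetical contractual minimum, e.g. Chrome 110

def is_compliant(user_agent: str) -> bool:
    """Return True if a Chrome user-agent string meets the minimum version."""
    m = re.search(r"Chrome/(\d+)\.", user_agent)
    return bool(m) and int(m.group(1)) >= MINIMUM_MAJOR_VERSION

# Hypothetical access-log excerpts from the provider's sessions.
log_lines = [
    "Mozilla/5.0 ... Chrome/112.0.0.0 Safari/537.36",
    "Mozilla/5.0 ... Chrome/96.0.4664.110 Safari/537.36",
]
compliant = sum(is_compliant(ua) for ua in log_lines)
print(f"{compliant}/{len(log_lines)} compliant")  # prints "1/2 compliant"
```

User-agent strings are self-reported and easy to forge, so a result like this is a screening signal that justifies a deeper audit, not proof of compliance on its own.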
I also check to see if the executives of the vendors have any current lawsuits against them, or felonies, or if the organization itself has been sued. There are multiple examples I could give. But in one case the CEO of a midsize company was currently being prosecuted, along with his organization, for financial fraud, yet on my initial questionnaire he had answered ‘No’ to the question: Have you ever been sued or prosecuted for financial crimes? This shows that an assessment questionnaire is important, but it cannot be the only vendor oversight mechanism.
In a different situation I found that the owner of a small business had just been released after serving several years in prison for money laundering. Perhaps he was reformed, but it was something the client was surprised and interested to find out since the owner had told the client he had a spotless record.
Others are seeing the need for these types of assessments and audits as well. Just consider the recent Anthem hack: Now New York’s Department of Financial Services plans to perform security assessments on insurance companies in New York, in addition to updating its regulations. … And since we don’t really know yet how this ‘sophisticated hack’ of Anthem occurred, a thorough audit may reveal multiple security weaknesses, including from the vendors that Anthem uses.
The bottom line: Organizations cannot outsource their responsibilities for safeguarding the information that their clients, customers, patients and others have entrusted to them.
First, laws and regulations establish organizations’ responsibilities for outsourced activities; organizations are usually ultimately responsible for the information they collected and promised the associated individuals that they would protect.
Second, the organization’s published privacy notices and policies may indirectly obligate it to track the security and privacy activities of all contracted entities. If an organization promises that the personal information it collects will be safeguarded, those promises follow the data to whichever parties the organization outsources it to.
Organizations will be judged by the company they keep … the businesses they contract. If organizations don’t want to become proactive about their oversight of those contracted entities, I have a question for them: Are they ready to pay for the security and privacy sins of their contracted entities?
Marcus J. Ranum, chief security officer of Tenable Security Inc., is a world-renowned expert on security system design and implementation. He is the inventor of the first commercial bastion host firewall.