
Cisco's chief privacy officer on the future of data after GDPR

Michelle Dennedy, vice president and chief privacy officer at Cisco, discusses her company's approach to meeting the requirements of the EU's General Data Protection Regulation.

The May 25th deadline to meet the European Union's General Data Protection Regulation requirements came and went, but the work of chief privacy officers is far from over. With EU and Asia-Pacific Economic Cooperation data privacy frameworks, the future of data use and consent is rapidly changing. Just ask Michelle Dennedy, vice president and chief privacy officer at Cisco.

In addition to tracking data privacy laws that are specific to the company, industry and region, the chief privacy officer must work across multiple departments to develop privacy policies to protect sensitive data and communicate those policies not only to customers, but across the global organization.

How did Cisco go about implementing the General Data Protection Regulation (GDPR)? In this Q&A, Dennedy, who has held similar roles at McAfee/Intel Security and Sun Microsystems, talks about how companies can mature a privacy program and the future of consent.

Editor's note: This interview has been edited for length and clarity.

Some companies are talking about a chief data officer or a data protection officer. Does a chief privacy officer have more of an IT and legal background or is it a similar function? Or is that still evolving?

Michelle Dennedy: I think it is evolving. I think, most of the time, when you say data protection officer or privacy officer, we do the same things functionally.

I think the digitization officer is typically someone who is looking at new business, so they are looking at the data assets, tangible or intangible, that they are going to be building new markets on. They are thinking about the activities around the data, not about what the data itself is.

I think, traditionally, privacy people more often than not have risen out of the legal profession and security officers out of the technical side of the world. But that certainly doesn't hold true for many of the greats in both camps.

My training came out of the legal side of the business, but I spent a ton of time running or working in technical teams. I think it is a bit of a mixy-matchy world right now, but it's really a matter of focus rather than distinction.

So the chief privacy officer develops the data protection policies and the CISO enforces them?

Dennedy: I think that is largely true, although … they develop their own policies about authentication or dual factor: Are you allowed to work from home and use a company-issued device, and so on?

There are policies on the landscape of data protection that a security officer would typically own and the content policies will come out of the legal or privacy office.

Does that mean that policies related to GDPR are coming out of the privacy office?

Dennedy: Yes, more often than not. There is certainly the security component within the fair information practices and principles, and you are required under GDPR to provide adequate security.

So how that is done and deployed is through my partnership with the security team. Whether we decide to do business in Europe, for example, wouldn't be my decision alone, of course, but from a data perspective, it would be. There is a bunch of data over here, and do we want to protect it in this way or do we want to have a data center over there?


It is such an interwoven world right now that I don't think there is a tremendously clear path. And, quite honestly, I think the most successful paths may get as intentionally messy as possible, if that makes sense, so that the teams are always interwoven.

For example, whenever my team is doing a privacy review on a product, they would never do it without a security professional who has that technical expertise, and vice versa. Looking at something and saying, 'Is this technically capable of keeping out people who are not authorized to be in the mix?' That's fine, but if you haven't really thought through who is at [risk] and where [the data] is going, then it's a little bit of a pointless exercise.

So we really need each other, but we have slightly different perspectives. It is like having an orchestra. You've got a conductor. You've got a score. You've got audience members, and then sometimes you add ballet to it and someone is dancing. That's kind of how it is; the production can be the amalgamation of all those things or it can be each element at a time.

A lot of people talk about GDPR as a consolidation of privacy laws. Going forward, how should companies that are doing business in the EU think about user data? As the chief privacy officer, do you expect some of those strategies to also apply to what is happening in the United States?

Dennedy: We've been focused on Asia for probably the last six months, as well as GDPR, because that's the next great hot spot. If you are not taking a global view, I think you are missing out on really understanding how industrialized data works these days, particularly for a global business like ours, but even in smaller businesses where you are using cloud services.

That's a decision that you've got to make and you've got to make it before you have a huge dependency built up. People are approaching it from different angles. I hope that people are not just saying, 'May 25th is over, we're done.' It is really the beginning of accountancy of data in my mind.

You've talked about the term consent changing with the evolution of data privacy laws. In the wake of GDPR, some people are talking about things along the lines of forced consent. What is your thinking on some of the data collection strategies related to consent?

Dennedy: Getting express and explicit consent to use and reuse that information beyond the initial collection was tremendously challenging even before GDPR; it can be done, but I think more often now people are going to have to look at it more expansively.

I would never force consent because that is not consent -- that's just coercion. You are really looking across [the organization] and saying, what is your legitimate purpose for collecting that information? How can we slim down the data that we are using for various functions?

You're really talking about data minimization, which is an important principle under GDPR, but it is also an important principle under asset management. You would never say to your CFO, 'I need to have $20 billion. I have no plan for it, but just in case, I just want it in my coffer and you can just let it sit here and I'll figure it out later.' I can't imagine a world where that would be a winning argument, but we had that argument for data for a long time.

I think those things are starting to end. And, instead, you are saying, 'I want to know this about this type of customer to build this type of service,' and 'Is this type of service personalized?' Well, then you are probably going to have a bunch of personal data about that person or you are going to have observed personal data about that person or you are going to ask other partners to share information about that person that has been observed or that has been abstracted through analytics or machine learning.

If that is your business plan -- and healthcare is one of those places where it has to be your business plan -- you have to really think through how you're adding up those protective measures to balance out the risk of collecting it and amalgamating all of that data and those data sources. And there are a number of different ways that you can do it, but you have to plan for it.

So I think rather than saying, 'We are going to have forced consent,' we are going to have to figure out what our interactions with other human beings should be. How do we maximize the potential of our data, our businesses and our business offerings, and understand when they need to be associated with personally identifiable information and when, instead, they are experiences that are sort of open?

How did Cisco go about implementing GDPR? I'm assuming that you had a fairly mature privacy program.

Dennedy: That's kind of interesting because I've only been the chief privacy officer here about two and a half years, almost three. Cisco, through most of its history, has been largely business-to-business and not collecting a lot of information outside of a contractual arrangement. With contractual arrangements, the legitimacy of sharing information that was supposed to be shared and controlling the confidentiality of network patterns, for example, was so much easier.

It was about six or seven years ago when Cisco started having individualized cloud offerings -- acquiring companies and offering more and more personalized security services and web services. All of these things require data about individuals, so the pivot at Cisco really started before I got here. But it heated up like everyone else's on the run up to GDPR.

The way we've done it is really lift and separate rather than having a one-stop shop where you just have security in one place and you have legal over there. We have a chief trust officer, John Stewart. We have one organization in operation that says, 'We've got security and we've got privacy.' I've got a whole sister organization that is totally focused on compliance, doing things like running tabletop exercises in case of loss of data; we do incident reporting that goes into that group, as well as track, measure and pattern our data. All of that work is operational work, and we have a large group of people doing that.

But we also have some cool little nifty nuggets in advanced research, too. We actually have a whole group of people within our trust office who focus on questions like: What will quantum computing do to encryption as a protected source? What would quantum computing do to the potential to really have personalized consent in the future?

These are not things that are going to be applicable over the next two years. But five to ten years out, these are market-changers. We are supporting that research with universities and partnerships today in that group.

Finally, we have a group that is working with governments abroad that are really digitizing, doing smart cities, trying to figure out what their data plans are supposed to be and what their networking will look like with security, and working with their militaries to make sure that all of their information is safe and minimized. All of that rolls up under John as our chief trust officer. That's really how we're handling it -- as a group project with subject matter leaders all under one tidy little roof.

On your website, there is a lot of discussion about taking inventory of your data and mapping all of that data, including cloud environments. Many companies are nowhere near that in terms of data structure.

Dennedy: I think they are going to have to get on it. I have a bias -- everything looks like a nail when you have a hammer, and my hammer is privacy engineering. And so, in addition to having a group that has thought about it for the last 17 years, we've added privacy engineers to the Cisco Secure Development Lifecycle. I have privacy engineers and architects on my team, and we're looking at that future.

Companies who aren't mapping their data are not only falling further and further behind on where the market is going, but if they are not adding privacy engineering for the various GDPR elements -- security is only one of them -- then I believe they are going to miss market opportunities going forward.

And the good news is, at Cisco, we support every type of market. We're not just networking; we connect people, so we are trying to provide a platform so that even if they haven't done all of their mapping, they will be able to look to us for security and privacy as a service.

I think you touched on this earlier, but are the types of policies that Cisco has in place to protect employee and customer information different in the EU or Asia than, for example, the United States?

Dennedy: No. We definitely do look at what those local climates are, but we have to have a global, synergized strategy and that's what we do. And that's why it makes sense for a company like Cisco to have a dedicated privacy officer, as well as a security officer.

And I spend a lot of my time thinking about how we synergize the 125 different privacy schemas. How do we get as much bang for the buck with the data map? Where does that fit with the cross-border transfer rules that we signed up for?

I think we were the first large company to sign up for the [Asia-Pacific Economic Cooperation Privacy] framework in Asia, which covers 21 different economies. We actually look at how to synergize these economies, and then there are localization requirements on top of those that you simply have to respect based on culture and law.

But, for the most part, we have policies that ride on the top of the global strategy, and then we have specific playbooks and configurations that have to wiggle a little bit, if you will, to fit into local economies. But there's no way I could run 125 different countries separate from one another and still run a network that worked.
