Presentation: Privacy Architecture for Data-Driven Innovation

Track: Ethics, Regulation, Risk, and Compliance

Location: Pacific LMNO

Time: 5:25pm - 6:15pm

Day of week: Monday


What You’ll Learn

  1. Hear why privacy and security matter.
  2. Learn about some of the tools you can use to build privacy into software.

Abstract

Data-driven businesses can no longer treat privacy as strictly a legal compliance-focused discipline. In a post-GDPR world, privacy needs an engineering focus to ensure it is actionable, enforceable and scalable. 

This talk will discuss how you can set up a privacy architecture to build in “privacy by design”.

The first part of the talk will tackle privacy challenges posed by incoming data into your company. This data can be extremely sensitive in that it describes who you are, where you are and other information that can uniquely identify you.

How does an organization assess and classify the risk around the data? I will discuss how your privacy architecture team can work with privacy legal to create a multi-tiered data classification, and then with security, data science and data platform teams to set up a backend that tags your data to reflect said classification. With this investment, your employees will be able to make informed decisions around data since they will know its privacy risk.   
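To make the idea of tagging data against a multi-tiered classification concrete, here is a minimal sketch. The tier names, field tags, and access rule below are hypothetical illustrations for this writeup, not the actual taxonomy or backend described in the talk.

```python
# Illustrative sketch of a multi-tiered data classification scheme.
# Tier names, fields, and rules are made-up examples, not Uber's taxonomy.
from enum import IntEnum

class Tier(IntEnum):
    PUBLIC = 0        # e.g. city name
    INTERNAL = 1      # e.g. aggregated trip counts
    SENSITIVE = 2     # e.g. email address
    RESTRICTED = 3    # e.g. precise location, government ID

# Field-level tags, agreed on with privacy legal and applied by the data
# platform so every dataset carries its classification with it.
FIELD_TAGS = {
    "city": Tier.PUBLIC,
    "email": Tier.SENSITIVE,
    "gps_coordinates": Tier.RESTRICTED,
}

def dataset_tier(fields):
    """A dataset inherits the highest tier of any field it contains.
    Untagged fields fail closed to RESTRICTED."""
    return max(FIELD_TAGS.get(f, Tier.RESTRICTED) for f in fields)

def can_access(user_clearance: Tier, fields) -> bool:
    """The informed decision an employee's tooling could surface up front."""
    return user_clearance >= dataset_tier(fields)
```

Note the two design choices the abstract implies: classification is attached to the data itself (the tag map), and unknown fields default to the most restrictive tier so gaps in tagging never widen access.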

The second part of the talk will tackle privacy as it relates to sharing data with third parties, be it vendors, partners, or even governments and regulators. How do you protect data from security risk, or even re-identification risk, in those cases? What techniques are available and what are the trade-offs involved? Uber is at the forefront of those conversations, and I will discuss what our research and case studies have yielded.
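As one illustration of such a technique (not necessarily the one the talk covers), here is a toy sketch of k-anonymity: quasi-identifiers are generalized until every record is indistinguishable from at least k-1 others before the data is shared. The generalization rules and records below are invented, and real deployments must weigh the trade-offs the abstract alludes to, such as lost precision and attacks that k-anonymity alone cannot prevent.

```python
# Toy k-anonymity check: generalize quasi-identifiers, then verify that
# every generalized group contains at least k records. Data is invented.
from collections import Counter

def generalize(record):
    """Coarsen quasi-identifiers: bucket age by decade, truncate the ZIP."""
    age, zip_code = record
    return (f"{age // 10 * 10}s", zip_code[:3] + "**")

def is_k_anonymous(records, k):
    """True if every generalized (age, ZIP) group has at least k members."""
    counts = Counter(generalize(r) for r in records)
    return all(n >= k for n in counts.values())

records = [(34, "94103"), (37, "94105"), (52, "94110"), (55, "94112")]
# Generalization yields two groups of two: 2-anonymous, but not 3-anonymous.
```

The trade-off is visible even in this toy: truncating ZIPs and bucketing ages protects individuals but destroys precision a partner might want, which is exactly the kind of tension the talk's second part examines.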

Question: 

What is the work you're doing today?

Answer: 

I'm in the field of privacy and security, and I am the person who builds out the central privacy architecture program for major corporations that are highly siloed, highly distributed, highly innovative, and very driven by velocity and growth. These businesses are typically created in a way where different parts of the business can grow independent of each other, and therefore don't block each other in any way. And by definition, privacy and security have to be a little more centralized, have to be more common right across the board.

How do you make sure that you create a governance structure that works, that is technical, scalable, realistic, and enforceable? That is where I come in. Whether it's building services, creating governance and standards, applying those standards to data, or providing consulting services to bake in privacy by design, my job is to create those programs, recruit those people, entrench those programs into the company's bloodstream, and make sure that as time goes on, we improve the privacy culture from an engineering perspective. I also make sure that the legal team is aligned, so that we're doing things that make sense from the company's legal-risk perspective as well, to secure our customers' trust.

Question: 

What are your goals for the talk?

Answer: 

There are three key goals for the talk. The first goal is making sure that people understand what privacy means. Most people have some understanding of it, but my hope is to give people some provocative context on how it has become important. How the tech sector designs for privacy and how we treat people's data is a big part of how people look at the tech sector. Putting some of that in context is one key goal.

The second key goal is to demonstrate that privacy is not as mythical or as expensive as people think it is. There are a lot of things that people in the room, developers, engineers, data scientists, security professionals, are already doing that we can build upon to make sure that privacy can be done correctly and can have a significant impact. So how do you actually create the privacy controls to respect GDPR, CCPA, state audits, et cetera?

First the what, then the how, and then, of course, we get into the details of the implementation. What does all of that logistical program look like? What are some of the components? How do you do better governance? How do you classify your data? How do you provide consulting to engineers so they understand what the privacy risk is, beyond what the legal team does? The first is the what. The second is the how. And the third is a working session where I take people under the hood and actually explain how they could enforce some of that, and what that means in working with multiple teams, in changing your data, etcetera. Those are the three key goals.

My hope is that at the end of the talk, people can go into their workplace, make informed decisions, and stop the next bad thing from happening. When it comes to privacy and security, victories are private, failures are public. My hope is to give people more private moments of reflection and knowledge, and to help them avoid public embarrassment. That's the big deal.

Question: 

What do you want people to leave the talk with?

Answer: 

A very clear understanding of some of the tools they can use. If you want to protect your data, if you want to avoid falling out of compliance, if you want to avoid a data nightmare where the wrong people access data for the wrong reasons or data leaks out when it wasn't supposed to, any number of things that can go wrong from a privacy perspective, just look back at the headlines over the last two years. What are some of the actual tools you can use, and how do all of those tools coalesce into a comprehensive, scalable engineering program? That is what people will take away from my talk.

Question: 

You mention the European GDPR legislation in your abstract. I just wondered if you could give a very high level summary of what that legislation says and how it might apply to somebody like Uber.

Answer: 

GDPR is extremely complex and it would be hard to break it down into a soundbite, but let me try to give people a clear sense of what GDPR really enforces. The key thing is that it changes the definition of personal data. It gives you a clear sense of what that data is, what your responsibilities are in terms of that data, and what the user can expect from you in terms of knowing what data you have and in terms of giving you consent to use the data. The GDPR is EU-specific, but it is being joined by other laws in California and other US states, so I would strongly recommend that anybody in the room, and anybody in the tech industry at large that depends upon customer data and upon connecting with users, make sure that their legal team understands: What does it mean to collect that data? How can you collect the data? What options do you need to give to the users? When can the user opt out of that data? What does that consent process look like? What happens when a user writes you and says, give me access to my data that I gave you?

The GDPR is this omnibus law that speaks to all of these aspects. The way I look at it, and this is my editorial opinion, is that it tries to level the playing field between the corporations that collect data and the users whose data fuels a lot of innovation. That's the founding idea behind the GDPR. Whether it really has that intended effect is a different conversation; it will take a few years to fully understand that. But at a minimum, the GDPR gives you a clear sense of what you should and shouldn't be looking for in terms of protecting customer data and respecting their privacy and security.

Question: 

I was interested to note that you have security experience of a number of different companies, including Nike, Google and Netflix, as well as with Uber. How does the experience of these companies compare?

Answer: 

Yeah, it was interesting. Actually, the first time I worked on privacy and security was even before I worked at Nike; Nike was when I ran my first real formal privacy program, and it had a heavy security component to it as well. When I first worked on this stuff, I was doing SSIS programming, which is basically a software tool that Microsoft makes, cleansing data that we got from enterprise customers. We would get data from enterprise customers, we would cleanse the data, and the people would get gift cards or some such benefit from it. So we were the backend processors. A lot of what we talk about today, making sure the data gets handled correctly, that it has access controls, that if there is a breach you send off a notification, everything that is codified in GDPR, was something we dealt with back in 2010, 2011, because we were in the healthcare sector. The Affordable Care Act had just passed and it was early days.

When I was at Nike, that was the beginning of the world we look at right now, where people have ubiquitous computing. Everybody started to have some mobile device, and the Internet was connecting more and more people. Social media coalesced: Pinterest and Instagram were in their early days, Facebook really hit its stride, Google was adding users left and right. The idea of having all these people on your platform, collecting their data, and fueling new products was really starting to take shape.

And at Nike, we were at the forefront of it, because we had a sport component, with people willing to participate and share information about their sport, and then a marketing aspect to it. That creates a whole other paradigm. So we were under scrutiny for everything: whether people were getting the right privacy policies, whether those were easily visible to people, whether we were honoring consent and tracking it properly, all of those sorts of things. It was all about trying to figure out what to do, why to do it, and what is important. That's what we were dealing with at Nike. So it was, again, like I said, early days, and there wasn't a whole lot of structure to it. And GDPR, however it shakes out in the long term, creates that structure, creates a level playing field for all of the big players in data.

From Nike, I moved to Netflix. I was totally happy at Nike; I was going to live with my wife in Portland, Oregon for a long time. But when a company like Netflix calls, with its tremendous footprint on the tech landscape, and some of the smartest people I've ever worked with, particularly on the engineering side, who to this day remain at Netflix, that was something I just couldn't say no to. Netflix is about streaming data, and it's extremely personal: what you watch, where you watch it from, information about your device. How do you build a governance program in a company that has historically eschewed any kind of process? That's when I started building a privacy program and security program that speaks to the science and the engineering of it and also speaks to the legal aspect of things. Google was great because Google has a method for everything: the infrastructure, the resources it can bring to bear, the customer base whose loyalty is phenomenal. And of course, Google is also a pretty big target for anybody who wants to critique how tech companies handle data.

A lot of this journey was about the discipline of privacy and security maturing, and me maturing as a professional, as a leader in building and running these programs. Uber was a phenomenal challenge. It's a growing company, rides, Uber Eats, all across the world, with tremendous degrees of personalization: what people eat, where they are. And I'm humbled by the fact that there is a tremendous amount of responsibility that comes with protecting that data, because this is some of the most personal data people give us. There is a significant trust and ethics component to it, beyond the engineering aspects. The way I look at privacy and security is that this is engineering and science and law in the service of securing customer trust. That's how I think of privacy and security coming together.

Speaker: Nishant Bhajaria

Author, Privacy and Security Leader, Digital Product Architect @Uber

Nishant Bhajaria is a privacy and security leader and a digital product architect.

Nishant is a senior-level leader in the security, privacy, and compliance space, and he's built teams and programs to help achieve these goals. He typically serves as a vital link between legal, engineering, and C-level leadership to ensure that a company's products help protect user data and secure customer trust. Prior to his current role—privacy architecture and strategy at Uber—he worked in compliance, data protection, and security and privacy at Google. In addition, he has served as the head of privacy engineering at Netflix. 

Nishant holds a BS in computer science from Truman State University and an MS in Computer Science from Arizona State University.

