How to build a privacy program the right way
The security community is continuously changing, growing, and learning from each other to better position the world against cyber threats. In the latest Voice of the Community blog series post, Microsoft Product Marketing Manager Natalia Godyla talks with attorney Whitney Merrill, an expert on privacy legal issues and Data Protection Officer and Privacy Counsel at Asana. The thoughts below reflect her views, not the views of her employer, and are not legal advice. In this blog, Whitney talks about building a privacy program and offers best practices for privacy training.
Natalia: How do security, privacy, and regulatory compliance intersect?
Whitney: Security and privacy are closely related but not the same. Privacy is not possible without security. In the last 5 to 10 years, regulations in privacy and security have taken very different paths. Most security regulations around the world hold companies to a standard of reasonable security, whereas privacy regulations are much more prescriptive about the types of behaviors or rights that individuals can exercise from a compliance perspective. Companies look to common security frameworks like ISO 27001 or SOC 2, but privacy doesn’t really have an equivalent. That’s born from the fact that security feels very black and white. You can secure something, or you can’t.
In privacy, however, there’s a spectrum of beliefs about how data can be used. It’s much more grey. There were attempts in the early 2010s with Do Not Track, the proposed HTTP header field that let internet users opt out of website tracking. That fell apart. Privacy and regulatory compliance have diverged, and much of it is because of fundamental disagreements between the ad industry and privacy professionals. You see this with cookie banners in the European Union (EU). They’re not a great user experience, and people don’t love interacting with them. They exist because there have been enough regulations, like the ePrivacy Directive and General Data Protection Regulation (GDPR), that essentially require those types of banners.
Natalia: Who should be involved in privacy, and what role should they play?
Whitney: It’s very important to get privacy buy-in from the highest levels of the company. Not only do you have an obligation under GDPR to have a Data Protection Officer who reports to the highest levels of the company if you’re processing European data, but an open dialogue with leadership about privacy will help establish company cultural values around the processing of data. Are you a company that sells data? How much control will your users and customers have over their data? How granular should those controls be? Do you collect sensitive data (like health or financial data), or is that something that you want to ban on your platform?
The sooner you get buy-in from leadership and the sooner you build privacy into your tools, the easier it’s going to be in the long run. It doesn’t have to be perfect, but a good foundation will be easier to build upon in the future. I’d also love to see the venture capital community incentivizing startups and smaller companies to care about privacy and security as opposed to just focusing on growth. It’s apparent that startups aren’t absorbing the privacy lessons learned by companies that have already faced enforcement from a privacy regulator. As a result, the same privacy issues pop up over and over. Obviously, regulators will play a role. In addition to enforcement, education and guidance from regulators are vital to helping companies build privacy by design into their platforms.
Natalia: What does a privacy attack look like, and which attacks should companies pay attention to?
Whitney: A privacy attack can look very similar to a security attack. A data breach, for instance, is a privacy attack: it leaks confidential information. A European regulator recently called a privacy bug a breach. In that particular case, a software bug made information public that the user had marked as private. Folks generally associate data breaches with an attacker, but accidental disclosures or privacy bugs can often cause data breaches too. I’ve talked with folks who say, “Wow, I never thought of that as a security breach,” which is why it’s important to engage your legal team when major privacy or security issues pop up. You might have regulatory reporting obligations that aren’t immediately apparent. Other privacy attacks aren’t necessarily data breaches. Privacy attacks can also include attempts to deanonymize data sets, or they might be privacy bugs that use or collect data in a way the user didn’t anticipate. You might design a feature to collect only a certain type of data when, in reality, it’s collecting much more data than was intended or disclosed in a privacy notice.
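That last failure mode is one of the few that can be guarded against mechanically. Here is a minimal sketch in Python, with hypothetical field names, of a collection allow-list derived from the privacy notice that strips undisclosed fields before an event is ever recorded:

```python
# Minimal sketch: enforce a collection allow-list so a feature can't
# silently gather more data than the privacy notice discloses.
# Field names here are hypothetical examples, not any product's schema.

DISCLOSED_FIELDS = {"event_name", "timestamp", "app_version"}

def sanitize_event(raw_event: dict) -> dict:
    """Drop any field that isn't covered by the privacy notice."""
    undisclosed = set(raw_event) - DISCLOSED_FIELDS
    if undisclosed:
        # Surface the gap instead of quietly shipping extra data;
        # in production this might alert the privacy team instead.
        print(f"warning: dropping undisclosed fields: {sorted(undisclosed)}")
    return {k: v for k, v in raw_event.items() if k in DISCLOSED_FIELDS}

event = {
    "event_name": "file_shared",
    "timestamp": "2021-05-01T12:00:00Z",
    "app_version": "2.3.1",
    "user_email": "user@example.com",  # over-collection: not disclosed
}
print(sanitize_event(event))  # only the three disclosed fields survive
```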
On the more adversarial side of privacy attacks, an attacker could try to leverage weaknesses in the processes around privacy rights to access personal information or erase somebody’s account. An attacker could use the information they find about an individual online to try to get more information about that individual via a data subject rights process (like the right to access your data under global privacy laws). There were a few cases of this after the GDPR went into effect, where an attacker used leaked credentials for a user’s account to download all of the data that the service had about that individual. As such, it’s important to properly verify the individual making the request and, if necessary, build in additional checks to prevent accidental disclosure.
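One common mitigation is step-up verification before honoring a data subject access request. Below is an illustrative Python sketch, not a test prescribed by any specific law; the request fields and checks are assumptions for the example:

```python
# Illustrative sketch only: one way to gate a data subject access
# request (DSAR) behind identity verification before releasing an
# export. The field names and checks are hypothetical.

from dataclasses import dataclass

@dataclass
class AccessRequest:
    account_id: str
    email_verified: bool            # requester re-confirmed via emailed link
    recent_reauth: bool             # password or SSO re-entered this session
    export_includes_sensitive: bool

def may_release_export(req: AccessRequest) -> bool:
    """Require stronger verification the more sensitive the export is."""
    if not req.email_verified:
        return False  # never rely on a logged-in session alone
    if req.export_includes_sensitive and not req.recent_reauth:
        return False  # step-up check for sensitive categories of data
    return True

req = AccessRequest("acct-42", email_verified=True,
                    recent_reauth=False, export_includes_sensitive=True)
print(may_release_export(req))  # False: step-up verification still needed
```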
Natalia: How should a company track accidental misuse of someone’s information or preferences?
Whitney: It’s very hard. This is where training, culture, and communication are really important and valuable. Misuse of data is unfortunately common. If a company collects a phone number for a security feature like multifactor authentication, it should not also use that number for marketing and advertising purposes. That goes beyond the original scope and is a misuse of the data. To prevent this, you need to think about security controls. Who has access to the data? When do they have access to the data? How do you document and track access to the data? How do you audit those behaviors? That’s where security and privacy deeply overlap because if you get alignment there, it’s going to be a lot easier to manage the misuse of data.
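To make that overlap concrete, here is a hedged Python sketch of purpose limitation plus an audit trail: each record is tagged with the purpose it was collected for, and every access is checked against that purpose and logged. The record keys, actors, and purposes are illustrative assumptions, not any product’s API:

```python
# Hedged sketch of purpose limitation with an audit trail. Data is
# tagged with its collection purpose; every access is checked and logged.

import logging

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("data_access_audit")

# A phone number collected for MFA is tagged for that purpose only.
RECORDS = {
    "user-1/phone": {"value": "+1-555-0100", "allowed_purposes": {"mfa"}},
}

def access(record_key: str, actor: str, purpose: str) -> str:
    record = RECORDS[record_key]
    allowed = purpose in record["allowed_purposes"]
    # Log who accessed what, for what purpose, and whether it was permitted.
    audit_log.info("actor=%s key=%s purpose=%s allowed=%s",
                   actor, record_key, purpose, allowed)
    if not allowed:
        raise PermissionError(f"{purpose!r} is outside the collection purpose")
    return record["value"]

access("user-1/phone", "auth-service", "mfa")  # permitted and logged
try:
    access("user-1/phone", "marketing-tool", "marketing")
except PermissionError as err:
    print(err)  # denied: marketing exceeds the original collection scope
```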
It’s also a good idea to be transparent about incidents when they occur because it builds trust. Of course, companies should work closely with their legal and PR teams when deciding to publicly discuss incidents, but when I see a news article about a company disclosing that they had an incident and then see a detailed breakdown of that incident from the company (how they investigated and fixed the issue), I usually think, “Thanks for telling me. I know you were not necessarily legally required to disclose that. But I trust you more now because I now know that you’re going to let me know the next time something happens, especially something that could be perceived as worse.” Privacy isn’t just about complying with the law. It’s about building trust with your users so they understand what’s happening with their data.
Natalia: What are best practices for implementing a privacy program?
Whitney: When you build a privacy program, look at the culture of the company. What are its values, and how do you link privacy to those values? It’s going to vary from company to company. The values of a company whose business model is based on the use or sale of data are going to be different from those of a company that sells hardware and doesn’t rely on collecting data for its main source of revenue.
It’s easy for companies to look at new privacy laws, like GDPR and the California Consumer Privacy Act (CCPA), and say, “Let’s just do that,” without thinking through the broader implications. That’s the wrong approach. Yes, you want to comply with privacy laws, but compliance does not equal security or privacy. If you’re constantly reacting only to what privacy law requires, you’ll tire out quickly because the law is changing and growing rapidly. Privacy is the future. Instead, think more holistically and proactively about privacy. Rather than rolling out a process to comply with only one region and one law, consider rolling it out for all users in all regions, so when a new region implements a similar law or regulation, you’ll already be most of the way there. Just because you’re compliant with GDPR doesn’t mean you’re a privacy-focused company or that you process information in the most privacy-centric way. But you’re moving in that direction, and you can build on that foundation.

Another best practice is to find champions across the company who support privacy efforts. If you don’t have a dedicated privacy resource, that doesn’t mean you can’t build a culture of privacy within your company. Work with privacy-minded employees to seek out the easy privacy wins, such as making sure your privacy policy is up to date and reflective of your practices. Focus on those to build support around privacy within the company.
Putting my former regulator hat on, privacy culture is important. When the Federal Trade Commission (FTC) comes knocking at your door, they’re looking to see whether you have the right intentions and are trying to do your best, not just whether you prescriptively failed to do one thing you should have done. They look at the size of the company and at its maturity, resources, and business model in determining how they’ll enforce against it. Showing that you care isn’t necessarily going to fix your problems, but it will definitely help.
Natalia: How should companies train employees on privacy issues?
Whitney: Training should happen regularly. However, not all training needs to be highly detailed or cover the same material; shake it up. The aim of training employees on privacy issues is to cultivate a culture of privacy. For example, when employees onboard, they’re new and excited about joining a new company. They’re not going to remember everything, so keep privacy training high-level. Focus on the cultural side of privacy so they get an idea of how to think about privacy in their role. From there, give them the resources to empower themselves to learn more about privacy (like articles and additional training). Annual training is a good way to remind people of the basics, but many people are going to tune it out, so make it funny and engaging if you can. I love using memes, funny themes, or recent events to help draw the audience in.
As the privacy program matures, I recommend creating a training program that fits each team and their level of data access or most commonly used tools. For example, some customer service teams have access to user data and the ability to help users in a way that other teams may not, so training should be tailored to address their specific personal data access and tooling abilities. They may also be more likely to record calls for quality and training purposes, so training around global call recording laws and requirements may be relevant. The more you target training toward specific tools and use cases, the better it’s going to be because the employee can better understand how that training relates to their everyday work.
Natalia: What encryption strategies can companies implement to strengthen privacy?
Whitney: Encrypt your databases at rest. Encrypt data in transit. It is no longer acceptable to have an S3 bucket or a database that is not encrypted at rest, especially if that system stores personal data. At the moment, enterprise key management (EKM) is a popular encryption-based data protection feature. EKM gives a company the ability to manage the encryption key for the service it is using. For instance, a company using Microsoft services may want to control that key so that it decides who can access the data, can rotate the key, or can delete the key so no one can ever access the data again.
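Under the hood, EKM offerings typically follow an envelope-encryption pattern. The following is a rough Python sketch of that pattern using the cryptography package; in a real deployment, the key-encrypting key would live in the customer’s key management service, not in process memory:

```python
# Rough sketch of the envelope-encryption pattern behind EKM, using
# the `cryptography` package (pip install cryptography). Simplified:
# a real EKM setup keeps the customer's key in their own KMS/HSM.

from cryptography.fernet import Fernet

# Customer-held key-encrypting key (KEK): the customer controls this.
customer_kek = Fernet(Fernet.generate_key())

# The service generates a per-record data key, encrypts the record with
# it, then stores only the wrapped (KEK-encrypted) copy of the data key.
data_key = Fernet.generate_key()
ciphertext = Fernet(data_key).encrypt(b"personal data at rest")
wrapped_data_key = customer_kek.encrypt(data_key)
del data_key  # the service keeps no plaintext copy of the data key

# Reading the data requires the customer's KEK to unwrap the data key,
# so rotating or deleting the KEK revokes the service's access.
unwrapped = customer_kek.decrypt(wrapped_data_key)
print(Fernet(unwrapped).decrypt(ciphertext))
```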
The popularity of EKM is driven by trends in security and by Schrems II, a major decision from the Court of Justice of the European Union last summer. That decision ruled Privacy Shield, the safe harbor for data transfers from the EU to the United States, invalid for not adequately protecting personal data. Subsequently, the European Data Protection Board (EDPB) issued guidance advising that personal data be encrypted before being transferred to a region that might present risks. Encryption is vital when talking about and implementing data protection and will continue to be in the future.
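In practice, that guidance points toward keeping the key in the exporting region so the importing region only ever handles ciphertext. Here is a brief, hypothetical sketch of that supplementary measure, again using the cryptography package; the region names and function are assumptions for illustration:

```python
# Illustrative sketch of an EDPB-style supplementary measure: encrypt
# in the exporting region and keep the key there, so the importing
# region only ever sees ciphertext. Names here are hypothetical.

from cryptography.fernet import Fernet

eu_held_key = Fernet(Fernet.generate_key())  # key material stays in the EU

def export_to_us(record: bytes) -> bytes:
    """Only ciphertext crosses the border."""
    return eu_held_key.encrypt(record)

payload = export_to_us(b"name=Jane Doe;country=DE")
# The US-side processor stores and handles `payload` without any
# decryption key; decryption happens back in the EU via
# eu_held_key.decrypt(payload).
```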
Learn more
To learn more about Microsoft Security solutions, visit our website. Bookmark the Security blog to keep up with our expert coverage on security matters. Also, follow us at @MSFTSecurity for the latest news and updates on cybersecurity.