Privacy-Enhancing Technologies: Data Innovation and Design Considerations

Taja Naidoo

Privacy Public Policy Manager, TTC Labs

If you're working on data and privacy issues, it’s likely that you will have heard the term “privacy-enhancing technologies” (PETs), even if you don’t have an in-depth understanding of the technologies themselves. PETs offer greater transparency, accountability and assurance to users when sharing their data with digital services, but there are still many open questions about PETs and how best to implement them. In December 2021, the Privacy & Data Public Policy Team at Meta held an event with privacy experts, called a “Data Dialogue”, which aimed to discuss the trade-offs involved in de-identification and the questions of fairness and explainability that arise when applying privacy-enhancing technologies.

WHAT ARE PETs?

PETs seek to enhance data privacy protections for users through a combination of technical and policy-based approaches. They use cryptographic or statistical techniques to minimize the use of personal data and to maximize the security and anonymity of any data that is obtained or shared. There is no single, universally accepted definition of which data governance measures and technologies constitute a “PET”: one could classify the three main methods of privacy enhancement as data-shielding, data-altering and computation-altering, or alternatively as “hard” and “soft” measures. This lack of a commonly used definition is one of the factors that has made effective policy development around PETs challenging. You can learn more about these categories in this excellent report by the Federal Reserve Bank of San Francisco.
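To give a flavour of what a “data-altering” technique can look like in practice, here is a minimal, hypothetical sketch of randomized response, a long-established statistical technique in which each individual's reported answer is deliberately noised so that no single record can be taken at face value, while aggregate rates can still be estimated. This example is purely illustrative and is not drawn from Meta's implementations; the function names and the p_truth parameter are assumptions made for the sketch.

```python
import random

def randomized_response(true_answer: bool, p_truth: float = 0.75) -> bool:
    """With probability p_truth, report the true answer; otherwise report a fair coin flip."""
    if random.random() < p_truth:
        return true_answer
    return random.random() < 0.5

def estimate_true_rate(reports: list[bool], p_truth: float = 0.75) -> float:
    """Invert the known noise rate to estimate the underlying population rate."""
    observed = sum(reports) / len(reports)
    # observed ~= p_truth * true_rate + (1 - p_truth) * 0.5, so solve for true_rate
    return (observed - (1 - p_truth) * 0.5) / p_truth

# Illustrative usage: individual reports are deniable, but the aggregate is still useful.
true_answers = [random.random() < 0.3 for _ in range(10_000)]   # an unknown "real" 30% rate
reports = [randomized_response(a) for a in true_answers]
print(round(estimate_true_rate(reports), 3))                    # close to 0.3
```

The point of the sketch is that no individual's stored answer can be relied upon, yet the population-level statistic remains learnable, which is the kind of trade-off PETs are designed to make explicit.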

WHY ARE THEY IMPORTANT?

We are at a point in time when many factors - both technical and sociological - have aligned to create the conditions where personal data can be harnessed for unprecedented good, but can also do harm to individuals and groups. In short, there are opportunities to use personal data to help usher in developments in healthcare, education, civic governance, personalized digital experiences and other essential areas of life, but the technologies available to bad actors who want to misuse personal data to manipulate or discriminate against others have also become more sophisticated.

At Meta, we are committed to ensuring that privacy protection is at the heart of everything we do, and in light of this, are investing heavily in understanding PETs and how best to implement them in a way which complements and improves our current privacy measures.

THE DATA DIALOGUE

On 17 December 2021, Meta’s Alejandra Parra-Orlandoni and Denise Spellman-Butler hosted a discussion with external experts from academia, civil society and industry on the subject of PETs. They had three aims in mind: firstly, to present Meta’s research on PETs and share insights with the community; secondly, to discuss the trade-offs involved in de-identification and the questions of fairness and explainability that arise when applying privacy-enhancing technologies; and thirdly, to gain critical signal on how we can work together across disciplines to ensure that PETs are implemented in a way which is transparent and fair for users.

The event was a mixture of talks and interactive workshops aimed at teasing out some of the tensions that many in the industry face when trying to balance the value of data against the need to protect users’ personal information. In a series of exercises, experts explored the boundaries between anonymity and identifiability and the ways in which various techniques could meet the expectations of users while maintaining the integrity of the data and our ability to learn from it. The assumption underlying these discussions is that we want to empower users with transparency and appropriate controls which help them to optimize their online experiences and enhance their trust in the internet.

In the first discussion, participants were presented with two hypothetical use cases designed to expose some of the tensions between using personal information to develop new services and prioritizing people’s privacy. In the first scenario, a public transport operator was interested in working with a university to gather insights from passenger location and movement data - specifically, data about where passengers on particular routes get on and off the train. Their hypothesis was that having this detailed information would help them plan where to locate new stops and what type of services and facilities would be needed at those stops. The second use case concerned an online dating app, where some sharing and transparency is required in order to build trust among the community of users, and where concerns around security and safety are heightened.

Some significant takeaways from this discussion on data anonymization trade-offs included:

1. Whether the potential benefits of using personal data outweigh the risk of harm to people is often not a linear analysis; it will depend on the overall context of the use case and environment.

2. The data that provides the most benefit to people may simultaneously pose the most risk. Oftentimes sensitive data (e.g. age, gender, religion) is of the highest benefit to people planning inclusive services, but it is also the riskiest to use. For example, in the first scenario, the gender of train passengers might be an important data point when the safety of station design is being considered, but revealing it might also put female passengers at higher risk if the data is leaked or subject to an adversarial attack.

3. We must keep in mind the technical and economic implications of PETs, in particular the balance between the need to protect people’s privacy via regulation and the need to preserve innovation. One challenge to consider is whether young companies will have the means to develop and implement PETs, given their technical complexity and associated costs. For example, in the second scenario we considered a startup where inherently risky but voluntary data sharing was integral to the service being delivered. The question of how to balance entrepreneurship with the need to ensure user safety in a context such as a dating app took center stage.
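To make the train scenario above a little more concrete, the following is a minimal, hypothetical sketch (not something presented at the event) of how a data-altering approach such as differential privacy could let the operator share per-station boarding counts without releasing raw passenger records. The station names, the epsilon value and the noisy_station_counts function are illustrative assumptions, not part of the original use case.

```python
from collections import Counter

import numpy as np

def noisy_station_counts(boardings: list[str], epsilon: float = 1.0) -> dict[str, float]:
    """Release per-station boarding counts with Laplace noise added.

    Each passenger contributes to exactly one count, so the sensitivity of the
    count histogram is 1, and Laplace(1/epsilon) noise on each count gives an
    epsilon-differentially-private release of the aggregated totals.
    """
    counts = Counter(boardings)
    scale = 1.0 / epsilon
    return {station: count + np.random.laplace(0.0, scale)
            for station, count in counts.items()}

# Illustrative usage: raw records stay with the operator; only noisy totals are shared.
raw_boardings = ["Central", "Central", "Riverside", "Parkway", "Central", "Riverside"]
print(noisy_station_counts(raw_boardings, epsilon=0.5))
```

The design choice in this sketch is that only aggregated, noised totals leave the operator, which preserves the planning signal (where demand is concentrated) while limiting what can be inferred about any individual passenger.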

Ensuring Fairness without Data

Following this session there was a Fireside Chat which covered the crucial topic of fairness. In particular, experts discussed how internet-based companies can ensure that all users are treated fairly without access to information - sometimes highly sensitive information - about who they are. The goal is to ensure that algorithms and other technologies are auditable and can be corrected when they get things wrong, while preserving people’s privacy. Key insights here included using PETs to assess fairness in a more privacy-protective way, as well as “fairness by design” and the prospect of involving more people directly in the participative act of designing the algorithms and technologies which affect them, so that there is much more transparency into how these systems work and how they can be improved.

The User Experience of PETs

In the final session of the event, the group considered the user experience aspect of PETs: whether and, critically, how these often complex technologies could be explained to users. The main insight was that messaging and educational experiences around the application of PETs must sit alongside more broadly focused privacy resources which help users of all digital literacy levels to build “mental models” of privacy and the ways in which it can be enhanced online. Metaphors which encourage people to think about privacy as something dynamic and context-specific can be especially useful, as long as they are not overly simplistic.

Likewise, delivering information about privacy and PETs should be considered part of the broader user experience journey, with careful choices made about whether the information is offered upfront during the app onboarding experience, in the context of trying to complete a specific action or task, or available at any time on demand through menu and settings options. The best approach may be some combination of all three, adding just enough “friction” to the experience to encourage users to think about their data sharing practices without making the service clunky and frustrating.

In the coming months, the Privacy and Data Policy Team at Meta will continue working with external experts on understanding and implementing PETs in ways which provide optimal privacy protection and clarity for users, and, where appropriate, we will publish our learnings from this work on the TTC Labs site.

Special thanks to Denise Butler, M. Alejandra Parra-Orlandoni and Sabrina B. Ross for their help with this note.

Taja Naidoo

Privacy Public Policy Manager, TTC Labs, Meta

TTC Labs is a cross-industry effort to create innovative design solutions that put people in control of their privacy.

Initiated and supported by Meta, and built on collaboration, the movement has grown to include hundreds of organisations, including major global businesses, startups, civic organisations and academic institutions.