
A look into multi-party computation and oblivious proxies

The landscape of digital privacy is constantly evolving as companies and researchers implement new methods to enhance user privacy. For example, browsers are removing third-party cookie functionality and mobile operating systems are integrating more transparency and user controls over data collection. “Privacy enhancing technologies” (PETs), such as end-to-end encryption, are a broad set of tools and methods aimed at providing ways to build products and functionality while protecting the privacy of users’ data.

Privacy practices exist on a spectrum of data access. On one end of the spectrum, a company has access to all of an individual’s private information and relies on internal policies and procedures to ensure this information is not misused or breached. On the other end of the spectrum, there are technologies which allow a company to offer products and services without ever having access to a user’s data. PETs are approaches that allow companies to move towards the latter end of the spectrum – some reach the end-goal of a company truly not having access to the data of any individual, and others reside in the middle, where they limit access but still have some reliance on a company’s policies and practices.

These technologies can keep a consumer’s communications private from a company, allow users to access data without the company learning who they are, or enable a company to use analytics and research to improve a product without gaining access to data about individuals.

PETs can be an important tool for service providers to prevent not only intentional misuse of data, but also accidental or negligent misuses stemming from hacks, bugs, or misunderstandings of internal policies. But not all PETs (or specific implementations of PETs) reach the fully private end of the spectrum. Companies making representations to consumers about their use of PETs must follow the law and ensure that any privacy claims or representations are accurate.

Two examples of PETs that exist along the spectrum are known as “multi-party computation” (MPC) and “oblivious proxies,” both of which we explain below. They are based on the idea that while it can be hard to trust a single company to not renege on privacy promises, it is less likely that two or more independent organizations would make representations about the privacy of a product and then actively work together to undermine them.

Multi-party computation works by spreading the information meant to be kept private among multiple independent organizations in a way that prevents any of them, by themselves, from understanding the data. Then, using a predefined set of rules about what can be shared, and often fancy math, they work together to process the data and achieve a goal. It can allow calculations to be performed across related but distinct data sets held by multiple organizations, without sharing the underlying data.[1] One specific application is to calculate aggregate analytics on data from individual users’ devices.[2]
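
To make “spreading the information” a bit more concrete, here is a minimal sketch of one common building block, additive secret sharing, written in Python purely for illustration (the names and numbers are ours, not part of any particular product): a value is split into two random-looking shares, either of which is meaningless on its own, while the two together recover the original.

```python
import secrets

# A large prime; all arithmetic is done modulo this value, so each share
# on its own looks like a uniformly random number.
PRIME = 2**61 - 1

def split_into_shares(value: int) -> tuple[int, int]:
    """Split `value` into two additive shares; neither share reveals
    anything about the value by itself."""
    share_a = secrets.randbelow(PRIME)
    share_b = (value - share_a) % PRIME
    return share_a, share_b

def reconstruct(share_a: int, share_b: int) -> int:
    """Only the two shares combined recover the original value."""
    return (share_a + share_b) % PRIME

share_a, share_b = split_into_shares(7)   # e.g., 7 visits to a page
print(share_a, share_b)                   # two random-looking numbers
print(reconstruct(share_a, share_b))      # 7
```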

Let’s say, for example, a local ice cream shop wants to understand what pages people visit in its app so it can feature them more prominently on the main page, but it doesn’t want to infringe on the privacy of the local ice cream-hungry populace. Its app can record which pages a user visits, then send parts of that data to the shop’s servers and parts to its analytics provider. The two parties then each process the data they receive and share summary data with each other that, when combined, allows them to see information like the most viewed flavor: PETstachio. This is all done without either the analytics provider or the ice cream shop learning specific information about the underlying data or users.
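
Continuing the illustration, the sketch below (again simplified Python with made-up data; real deployments add protections this example omits) shows how each device could split its page-view counts into two shares, send one set to the shop and the other to the analytics provider, and how only the combined summaries reveal the aggregate totals.

```python
import secrets
from collections import Counter

PRIME = 2**61 - 1
FLAVORS = ["PETstachio", "Proxy Road"]

def split_into_shares(value: int) -> tuple[int, int]:
    share_a = secrets.randbelow(PRIME)
    return share_a, (value - share_a) % PRIME

def device_report(viewed_flavor: str) -> tuple[dict, dict]:
    """On the user's device: record a 1 for the flavor page that was viewed,
    split each count into two shares, and send one set of shares to the
    shop's server and the other to the analytics provider."""
    shop_share, analytics_share = {}, {}
    for flavor in FLAVORS:
        a, b = split_into_shares(1 if flavor == viewed_flavor else 0)
        shop_share[flavor], analytics_share[flavor] = a, b
    return shop_share, analytics_share

def sum_shares(reports: list[dict]) -> dict:
    """On each server: add up the shares received; the running totals are
    still random-looking numbers that reveal nothing about any one user."""
    totals = Counter()
    for report in reports:
        for flavor, share in report.items():
            totals[flavor] = (totals[flavor] + share) % PRIME
    return dict(totals)

# Simulate a few users' page views.
views = ["PETstachio", "PETstachio", "Proxy Road", "PETstachio"]
shop_reports, analytics_reports = zip(*(device_report(v) for v in views))

shop_totals = sum_shares(list(shop_reports))            # held by the shop
analytics_totals = sum_shares(list(analytics_reports))  # held by the provider

# Only the combined summaries reveal the aggregate counts.
aggregate = {f: (shop_totals[f] + analytics_totals[f]) % PRIME for f in FLAVORS}
print(aggregate)  # {'PETstachio': 3, 'Proxy Road': 1}
```

Real systems layer more machinery on top of this idea (secure channels, checks that reports are well-formed, limits on what summaries can be requested), but the core point is the same: neither party’s view of the data is meaningful on its own.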

Similarly, oblivious proxies divide data between two entities. Their goal is to allow the user of an app or website to communicate with a company’s servers without the company learning who they are. But all communication over the internet uses an identifier known as an IP address, which can aid in the identification of an individual or household. Let’s say the same ice cream shop wants to let the users of its app vote on a new flavor to introduce. To encourage honest votes, the ice cream shop needs to ensure it can’t tie votes to the users who cast them. Using an oblivious proxy, the app can encrypt a user’s vote so it’s only readable by the ice cream shop, then send it to a trustworthy third party, which forwards the encrypted vote along, with the user’s IP address removed. At that point the ice cream shop can decrypt and view the vote without knowing who sent it (the winner was Proxy Road).
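
A similarly simplified sketch of that flow is below, assuming the PyNaCl library for the public-key encryption step; the function names, the sample IP address, and the single-process simulation are all illustrative rather than a description of any real deployment.

```python
# A minimal sketch of the oblivious-proxy flow, assuming the PyNaCl library
# (pip install pynacl). All three parties are simulated in one process, and
# the helper names and IP address are purely illustrative.
from nacl.public import PrivateKey, SealedBox

# The ice cream shop generates a keypair and publishes the public key.
shop_private_key = PrivateKey.generate()
shop_public_key = shop_private_key.public_key

def app_submits_vote(flavor: str, user_ip: str) -> dict:
    """On the user's device: encrypt the vote so only the shop can read it,
    then hand the ciphertext (plus the unavoidable IP address) to the proxy."""
    ciphertext = SealedBox(shop_public_key).encrypt(flavor.encode())
    return {"ip": user_ip, "payload": ciphertext}

def proxy_forwards(message: dict) -> bytes:
    """At the proxy: it can see who sent the message (the IP address) but not
    what was voted; it strips the IP before forwarding."""
    return message["payload"]

def shop_receives(payload: bytes) -> str:
    """At the shop: it can decrypt the vote, but never saw the sender's IP."""
    return SealedBox(shop_private_key).decrypt(payload).decode()

# One vote makes the round trip.
forwarded = proxy_forwards(app_submits_vote("Proxy Road", "203.0.113.7"))
print(shop_receives(forwarded))  # Proxy Road
```

In this sketch the proxy learns who is voting but not what they voted, while the shop learns the vote but not who sent it; neither party alone can connect a user to a ballot.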

Don’t End Up in the PETs Cemetery

While a major step forward, MPC and oblivious proxies are not perfect. One major disadvantage they have in comparison to some other privacy enhancing technologies is that it is much harder for a consumer or other third party to audit internal corporate behavior (whether the organizations involved are sharing data only as allowed) than it is to audit source code or application behavior, such as network traffic.

Additionally, all technologies, and particularly new and developing ones, run the risk of implementation issues. PETs are no exception. Companies should work to ensure a robust implementation of the technologies, and quickly fix any discovered issues that may undermine the privacy of users.

For these reasons, companies building products that use MPC, oblivious proxies, or other PETs still need to keep Section 5 of the FTC Act in mind and ensure they’re not engaging in unfair or deceptive acts or practices. While they can be great steps, neither PETs nor anything else is a substitute for a robust privacy program and honest representations about the capabilities of a company’s software.

The FTC has brought cases against companies that claimed specific technology-based security or privacy guarantees that they allegedly failed to provide, including Henry Schein’s promotion of their use of encryption despite using an algorithm offering weaker protection than industry standards, Zoom’s promise of end-to-end encryption, and CafePress’s claims to encrypt consumers’ sensitive personal data.

The continued development of technologies that allow the delivery of useful products and services to consumers while maintaining their privacy is encouraging, and the FTC will continue to follow developments in the space. However, false or misleading representations by companies with regard to privacy and security, whether made explicitly or impliedly, by inclusion or by omission, may be a violation of the law.

Thank you to staff from across the Office of Technology and the Division of Privacy and Identity Protection in the Bureau of Consumer Protection who collaborated on this post (in alphabetical order): Jessica Colnago, Mark Eichorn, Alex Gaynor, Amritha Jayanti, Elisa Jillson, Stephanie Nguyen, Mike Tigas, Madeleine Varner, Ben Wiseman.


[1] For example: https://open.bu.edu/bitstream/handle/2144/21773/2015-009-mpc-compensation.pdf

[2] For example: https://petsymposium.org/2023/files/papers/issue4/popets-2023-0126.pdf

