Background

It is obvious from current events — from Facebook to Equifax to Uber — that we need a sea change in how we share data with companies. Identity services for decentralized apps can help bring more control to users.

The notion of ‘selective sharing’ in the blockchain world means that when an app requests information about you, you can choose to share it (or not). However, this doesn’t go far enough: applications still have access to your raw data, putting it at risk of abuse and disclosure. In other words, if these applications see your data, they will own your data — just as they do today. In the age of decentralization we have to do better. With Enigma, we allow users to make claims without revealing the underlying data itself.

Why is this important?

Companies that need data to serve their users, meet regulatory requirements, or improve their service could still have access to important insights — without being liable for your data. This becomes especially important with the upcoming General Data Protection Regulation (GDPR), which goes into effect in May 2018. Users are protected from data leaks, “shadow profiles”, and future misuse of their data.

Intent matters

Applications should have a purpose for our data: it should contribute to some aspect of their service. When we keep data private and allow apps to compute on it privately, the computation becomes the point of contact between us and the application. This means that every time an application computes on our data, it is because it needs to know something actionable about us. This is a different paradigm from the one we see today, where personal information is hoarded by every application we interact with and sold without our consent.

When applications compute on data they don’t own, their computation inherently states their intent. This improves both transparency and security for the user.

Everyone benefits

If decentralized applications can be just as powerful with our data as today’s applications are — but without the liability that comes with holding and protecting it — they will become preferable to the alternative. This is the future that Enigma envisions.

Current Solutions

Because blockchains are public, storing personal data on-chain is undesirable. Even if the data is encrypted, there is a chance that either party’s private key could eventually be compromised. In that scenario, even many years from now, personal information used today is at risk. (Note: with Enigma, the private key is never known to either party.)

Existing solutions use an off-chain data store and, through a combination of trusted attestations and claims, allow users to reveal select data points to third-party requesters.

Issue credentials: Anyone (including the individual themselves) can make a public or private claim about an individual’s identity, and these claims can be signed by trusted oracles (like the City of San Francisco).

Request credentials: Anyone can request private claims or view public claims about an individual’s identity.

Proposed Solution

We propose to use the Enigma network to modify the data request process: instead of requesting the data point itself, the requester forms a query (such as “Is Alice over 21?”) and grants access based on the result of that computation. This has the advantage of never revealing Alice’s birth date, which is personally identifiable information, while still ensuring a truthful result.

To demonstrate how this works with Enigma, we’ll use a few pared-down examples. First, we’ll show how Alice, who wants to use an online gambling application, can prove that she is over 21. Our goal is for Alice to prove that she meets the requirements of the application without revealing her exact birthdate. Then, we’ll look at Alice and Bob as they try to rent an apartment.

Note: Zero-knowledge proofs, range proofs, and other cryptographic methods can also allow a limited set of computations on private data, such as checking whether a number is above or below a threshold. These are powerful, valuable tools. Enigma can perform this computation as well as a much broader range of computations. For example, neither zero-knowledge proofs nor range proofs allow users to combine their data for a third party to compute over without revealing it to any party (we demonstrate this in Example 2). Enigma believes it is powerful to build systems that support a wider range of computations and that are simpler for app developers to integrate.

Example 1 — Age Verification

Alice wants to prove she is over 21 years of age so she can participate in a decentralized poker game, such as Virtue Poker or FunFair. This is a narrow use case that demonstrates a more general problem in sharing identity data.

Alice accesses a poker application that requires age verification. The application requests information about her identity (her age) from Alice’s device. Alice has enabled her identity manager to share this data — in encrypted form — with most applications that request it. Alice’s identity manager encrypts her birthdate, along with the signature from the City of San Francisco verifying its correctness, and submits this encrypted data to the poker application’s secret contract. The data is decrypted inside a secure environment, the secret contract is executed, and a computation (Alice’s age > 21) is run. The application then grants Alice a token that allows her to access it.
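To make the flow above concrete, here is a minimal sketch in Python. It simulates the moving parts — the identity manager, the attestation, and the secret contract running inside a secure environment — rather than using Enigma’s actual API. The XOR “encryption”, the `signed:CityOfSanFrancisco` attestation string, and all function names are illustrative stand-ins, not real cryptography or real Enigma interfaces.

```python
from datetime import date

def encrypt(data, key):
    """Toy stand-in for real encryption: XOR each byte with a key byte."""
    return bytes(b ^ key for b in data.encode())

def decrypt(blob, key):
    return bytes(b ^ key for b in blob).decode()

ENCLAVE_KEY = 0x5A  # toy key known only inside the "secure environment"

def identity_manager_submit(birthdate_iso, attestation):
    # Alice's identity manager encrypts her birthdate before it ever
    # leaves her device; the oracle's attestation travels with it.
    return encrypt(birthdate_iso, ENCLAVE_KEY), attestation

def secret_contract_is_over_21(encrypted_birthdate, attestation, today):
    # Inside the secure environment: verify the attestation, decrypt,
    # compute, and return only the boolean result -- never the birthdate.
    if attestation != "signed:CityOfSanFrancisco":  # toy signature check
        return False
    birthdate = date.fromisoformat(decrypt(encrypted_birthdate, ENCLAVE_KEY))
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day))
    return age > 21

blob, att = identity_manager_submit("1995-06-15", "signed:CityOfSanFrancisco")
print(secret_contract_is_over_21(blob, att, date(2018, 1, 1)))  # True
```

The application only ever observes the returned boolean (and could mint an access token from it); Alice’s plaintext birthdate exists only inside the enclave.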

The application Alice wants to use only needs to know whether she is over 21; her actual birthdate isn’t important. The application performs a very simple computation with Alice’s data: if her birth date makes her younger than 21, it prevents her from using the application. Today, Alice sends her raw data; the app receives it, optionally saves it to the profile it keeps on Alice, and performs the computation to check her age.

Example 2 — Apartment Rentals

Alice and Bob want to rent an apartment together. They want to prove to their potential landlord that they meet the income requirements, but they would prefer not to reveal their exact salaries to one another or to the landlord (if the landlord finds out how much Alice makes, he might think he can get away with a steep yearly rent increase). Furthermore, by not seeing this data directly, the landlord is less likely to engage in illegal rent discrimination. Let’s invent a fictional rental application that facilitates these interactions.
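This is the case that threshold checks alone can’t handle: two parties must combine their data for a computation neither of them (nor the landlord) sees in the clear. A minimal sketch of the idea follows. As before, the additive mask standing in for encryption, the function names, and the common “annual income ≥ 40× monthly rent” qualification rule are all illustrative assumptions, not Enigma’s actual API or a real landlord’s policy.

```python
ENCLAVE_KEY = 987_654_321  # toy additive mask known only inside the "enclave"

def encrypt_salary(annual_salary):
    # Toy stand-in for real encryption: each tenant masks their own
    # salary before submitting it to the secret contract.
    return annual_salary + ENCLAVE_KEY

def secret_contract_meets_requirement(encrypted_salaries, monthly_rent):
    # Inside the secure environment: unmask, sum, and compare against
    # the landlord's rule. Neither the landlord nor either tenant ever
    # sees the other's individual salary -- only the combined verdict.
    combined = sum(s - ENCLAVE_KEY for s in encrypted_salaries)
    return combined >= 40 * monthly_rent

alice = encrypt_salary(85_000)
bob = encrypt_salary(62_000)
print(secret_contract_meets_requirement([alice, bob], 3_000))  # True
```

The landlord learns a single bit — qualified or not — which is exactly the actionable fact the rental decision needs, and nothing that could feed a discriminatory or opportunistic pricing decision.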