One of the most challenging questions is what your browser could do to allow a publisher to pick relevant content or show a relevant ad to you, while sharing as little information about your browsing history as possible.

We're exploring how to deliver ads to large groups of similar people without letting individually identifying data ever leave your browser, building on the differential privacy techniques we've been using in Chrome for nearly five years to collect anonymous telemetry information. New technologies like federated learning show that it's possible for your browser to avoid revealing that you are a member of a group that likes Beyoncé and sweater vests until it can be sure that group contains thousands of other people.

Publishers and advertisers need to know if advertising actually leads to more business. If it's driving sales, it's clearly relevant to users; if it's not, they need to improve the content and personalization to make it more relevant. Users then benefit from ads centered on their interests, and advertisers benefit from more effective advertising.

Both Google and Apple have already published early-stage thinking on how one might address some of these use cases. These proposals are a first step in exploring how to address the measurement needs of advertisers without letting an advertiser track a specific user across sites.
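To make the group-level reporting idea above concrete, here is a minimal sketch of randomized response, a classic local differential privacy primitive related to the telemetry techniques mentioned (this is an illustration, not Chrome's actual mechanism, and the function names are our own): each browser reports its true answer only some of the time, so no single report is trustworthy on its own, yet the aggregate statistic is still recoverable.

```javascript
// Randomized response: report the true bit with probability p,
// otherwise report a fair coin flip. Any individual report is
// plausibly deniable; only population-level rates are learnable.
function randomizedResponse(trueBit, p = 0.75) {
  if (Math.random() < p) return trueBit;
  return Math.random() < 0.5 ? 1 : 0;
}

// The aggregator sees a fraction `observed` of 1-reports. Since
// observed = p * trueRate + (1 - p) / 2, it can invert this to get
// an unbiased estimate of the true rate without any single user's data.
function estimateTrueRate(observed, p = 0.75) {
  return (observed - (1 - p) / 2) / p;
}
```

For example, if 30% of users truly have some property and p = 0.75, about 35% of reports come back as 1, and `estimateTrueRate(0.35)` recovers the original 30% while each individual report stays noisy.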

Protecting the Sandbox Boundary

Our experience has shown us that removing certain capabilities from the web causes developers to find workarounds to keep their current systems working rather than going down the well-lit path. We've seen this recently in response to the actions that other browsers have taken to block cookies: new techniques are emerging that are not transparent to the user, such as fingerprinting.

With fingerprinting, developers have found ways to learn tiny bits of information that vary between users, such as what device they have or what fonts they have installed. By combining several of these small data points, they can generate a unique identifier that can then be used to match a user across websites. Unlike cookies, users cannot clear their fingerprint, which means that even a user who wishes not to be identified cannot stop the developer from identifying them. We think this subversion of user choice is wrong.

As referenced in May at I/O, we are actively taking steps to prevent fingerprinting. We are proposing the implementation of what we call a privacy budget. With a privacy budget, websites can call APIs until those calls have revealed enough information to narrow a user down to a group still large enough to maintain anonymity. After that, any further attempts to call APIs that would reveal information will cause the browser to intervene and block those calls.

We appreciate you taking the time to read through our early proposals for building the Privacy Sandbox. We understand it is ambitious and can't overstate how important it is that this be refined and improved as a result of collaboration across the industry, including other browsers and publishers. We look forward to hearing your thoughts!

Posted by Justin Schuh - Director, Chrome Engineering
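As an illustration of the privacy budget mechanism described above, the following sketch charges each fingerprintable API call an approximate number of identifying bits and blocks calls once the cumulative total would exceed the budget. The class name, API names, and bit counts here are all hypothetical, not part of any actual browser design.

```javascript
// Hypothetical sketch of a privacy budget: each fingerprinting surface
// "costs" roughly the number of identifying bits it reveals. Once a
// site's cumulative spend would exceed the budget, the browser
// intervenes and blocks the call.
class PrivacyBudget {
  constructor(maxBits = 10) {
    this.maxBits = maxBits;   // total identifying bits a site may learn
    this.spentBits = 0;       // bits revealed so far
    this.charged = new Set(); // each API surface is charged only once
  }

  // Returns true if the call is allowed, false if it is blocked.
  request(apiName, bits) {
    if (this.charged.has(apiName)) return true; // already paid for
    if (this.spentBits + bits > this.maxBits) return false;
    this.charged.add(apiName);
    this.spentBits += bits;
    return true;
  }
}
```

A site could read a couple of low-entropy surfaces freely, but a call that would push the total past the budget, and thereby make the user's fingerprint too unique, gets blocked rather than answered.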