Google is opening up the Pixel 2's Google-designed machine learning SoC, the Pixel Visual Core, to third-party apps. The first apps to take advantage of the chip are Snapchat and Facebook's pile of social media apps: Facebook, Instagram, and WhatsApp. With the February Android security update for the Pixel 2, each of these apps will be able to use Google's HDR+ photo processing for its own pictures.

In the Android 8.1 Developer Preview, Google opened the Pixel Visual Core up to developers and added a "Neural Networks API" to Android. The new API allows apps to tap into any machine-learning hardware acceleration chips present in the device, of which the Pixel Visual Core is one of the first examples. Google's HDR+ photo algorithm is one of the first pieces of software ported to the Pixel Visual Core, and now it's open to production third-party apps.

Google's HDR+ algorithm takes a burst of photos with short exposure times, aligns them to account for any movement, and averages them together. The result is a noticeably better image, with less noise and higher dynamic range. The images are also upsampled to provide more detail than you would otherwise get from a single 12MP image. HDR+ is so good that the Android modding community has taken to porting the Pixel-exclusive Google Camera app to other devices, where using HDR+ instantly improves the output of the camera.
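The noise benefit of that burst-and-average step can be sketched numerically. The toy simulation below is an illustration of the general principle, not Google's actual pipeline: averaging N aligned frames of the same scene cuts random sensor noise by roughly a factor of the square root of N, which is where much of HDR+'s low-light advantage comes from. (The scene values, noise level, and frame count here are made up for the demo.)

```python
# Toy sketch of burst averaging, the core idea behind HDR+.
# Assumption for the demo: frames are already perfectly aligned, so
# merging is just a per-pixel average across the burst.
import random
import statistics

def capture_frame(scene, noise_sigma=10.0):
    """Simulate one short exposure: true scene values plus random sensor noise."""
    return [p + random.gauss(0, noise_sigma) for p in scene]

def merge_burst(frames):
    """Merge an aligned burst by averaging each pixel position across frames."""
    return [statistics.fmean(values) for values in zip(*frames)]

def rms_noise(image, scene):
    """Root-mean-square error of an image against the true scene values."""
    return (sum((a - b) ** 2 for a, b in zip(image, scene)) / len(scene)) ** 0.5

random.seed(0)
scene = [float(100 + (i % 50)) for i in range(1000)]  # synthetic "true" image

single = capture_frame(scene)                               # one noisy shot
burst = merge_burst([capture_frame(scene) for _ in range(9)])  # 9-frame merge

print(f"single frame RMS noise: {rms_noise(single, scene):.1f}")
print(f"9-frame merge RMS noise: {rms_noise(burst, scene):.1f}")
```

With nine frames, the merged image's noise drops to roughly a third of a single exposure's, and because each exposure is short, highlights stay unclipped, which is what preserves the dynamic range.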

Before this release, HDR+ on the Pixel had a big functionality gap. It was exclusive to the Google camera app, so if you were using any other camera app, the algorithm wasn't there, and you'd end up taking lower-quality photos. While any app on the Pixel can call up the Google camera app with a simple "take picture" intent, apps can't build custom features on top of Google's camera app. That's a problem for something like Snapchat's "Lens" camera effects, which previously meant building a custom camera interface from scratch and losing HDR+. I doubt you need a whole extra SoC to open the HDR+ algorithms to third parties, but now all Pixel photos can be equal, as long as the app supports the Pixel Visual Core.

Snapchat and Facebook are just the launch partners for this feature. The HDR+ algorithms are now open to any developer who wants to plug into Google's chip.

Correction: Google sent along a note saying the Pixel Visual Core is actually not used by the Google camera app. So this third-party usage is the first time the chip has been put to use by any piece of software.