Google Pixel 3 Tidbits: Everything you might have missed!


The Google Pixel 3 and Google Pixel 3 XL were announced earlier this week. The latest Google flagship smartphones feature 5.5-inch and 6.3-inch OLED displays respectively (the latter with a display notch), the Qualcomm Snapdragon 845, 4GB of RAM, 64GB or 128GB of storage, and Android Pie. Google dedicated much of the Pixel 3’s announcement to the new camera features on the device: Top Shot, Night Sight, Photobooth, and subject-tracking autofocus. But there’s a lot of information you won’t get if you only watched the live stream. We spent some time with the device at the event and did some of our own research to help you get up to speed.

Google Pixel 3 Camera Features

We’ve known for some time that the latest version of the Google Camera app would bring a lot of UI changes, but now that the Pixel 3 is finally in reviewers’ hands, we know exactly what has changed. We’ve detailed some of the changes in this previous article, but in summary, the new Google Camera app lets you shoot in RAW, record in H.265/HEVC for better compression without sacrificing quality, simplifies the design by having you swipe between modes, adds Portrait Mode depth-of-field adjustments, and more. There’s one more feature that’s exclusive to the Google Pixel 3, plus a few under-the-hood changes we were told about that we think are worth discussing.

Automatic FPS Switching

First, a new option to automatically switch FPS during video recording has been added. According to a Product Manager that we spoke to at the event, people have trouble deciding what the best framerate is before recording a video. This new feature lets the Google Camera app decide what framerate to record at – either 30 or 60 fps – depending on what’s on screen. The feature can even switch framerates in the middle of a recording. It doesn’t work with 4K videos, though.

Pixel Visual Core Changes

Next, a Product Manager confirmed to us that the Pixel Visual Core has indeed been updated. Until the device’s kernel source code is released, we won’t know exactly what new functions have been added to the HAL. However, we’re told that the Pixel Visual Core is now actually used by the Google Camera app for Google Lens suggestions, HDR+, Top Shot, Motion Auto Focus, and Photobooth. That’s most likely why the older Pixels will get support for Night Sight and Playground but won’t get support for Top Shot, Motion Auto Focus, or Photobooth – they don’t have the newer Pixel Visual Core.

It was a bit of a surprise when we first heard that the Google Pixel 2 doesn’t use the Pixel Visual Core for HDR+, and since we don’t know the exact reason for that decision, we’ll refrain from speculating. I suspect that, since many of the new camera features on the Pixel 3 rely on the Pixel Visual Core, they’ll be harder to port to other devices. A few developers are already attempting to bring some of the features over, and I’ve already seen one developer successfully make the new “Night Sight” feature show up (though it doesn’t work yet).

Even though the ISP in the Qualcomm Snapdragon 845 is capable of processing 4K video at 60 fps, the Pixel 3 does not let you record at that resolution and frame rate combination. We’re not sure why this is the case, but we’ll try to find out. The camera sensor could be the limiting factor, but we don’t know the exact sensors used just yet.

Google Lens

A few days before the presentation, a leaked video from the Pixel Tips app showed that Google Lens works in real time in the Google Camera app on the Google Pixel 3. Google did announce this feature on stage, but there are a few more pieces of Google Lens news that you might have missed.

First, Google Lens can work offline on the Pixel 3, but only for specific cases like reading text from an image. Second, Google Lens can now scan album artwork and play the result in YouTube Music. Third, Google Lens suggestions, along with the other camera features we discussed earlier, rely on the updated Pixel Visual Core – which is why the feature won’t be coming to the Google Pixel or Google Pixel 2. At least, not officially.

Playground: More than a rebrand

Very shortly before the presentation began, Google updated the AR Stickers app and a few sticker packs under new branding: Playground. Google announced a few new sticker packs, including one using characters from Marvel, but you might have been left wondering how Playground is any different from AR Stickers. We’re quite familiar with Google’s penchant for rebranding its products – Google Feed to Discover, Google Keep to Google Keep Notes, etc. – but surprisingly, there really is more to Playground than just a rebrand.

In the demonstration area, a Google employee told us that there are a few things that differentiate Playground from AR Stickers. First, characters that you place on the screen can now interact with each other and people in the frame. Second, there are now suggestions for stickers based on what’s currently on screen (suggestions can be shown for people, objects, or other stickers). Third, stickers can now be placed via the front-facing camera, and using real-time body segmentation, can appear to be behind subjects in view. Speaking of the front-facing camera stickers, the feature now tracks facial expressions and stickers can react accordingly. Lastly, there are now 2D stickers from Gboard and new animations that stickers can perform.

Super Res Zoom, Computational Raw, Synthetic Fill Flash, Learning-based Portrait Mode, and Night Sight

DPReview published an in-depth article on how all of the above features work, so we recommend reading it thoroughly (they also have a bunch of sample photos). But if you want a summary, here’s what you should take away:

Super Res Zoom: “Uses HDR+ burst photography to buffer up to 15 images” and then “employs super-resolution techniques to increase the resolution of the image beyond what the sensor and lens combination would traditionally achieve.” It only activates at 1.2x zoom or more.

Computational Raw: “There’s one key difference relative to the rest of the industry. Our DNG is the result of aligning and merging [up to 15] multiple frames… which makes it look more like the result of a DSLR.” – Marc Levoy, Computational Photography Lead at Google

Synthetic Fill Flash: “‘Synthetic Fill Flash’ adds a glow to human subjects, as if a reflector were held out in front of them.”

Learning-based Portrait Mode: “Where we used to compute stereo from the dual pixels, we now use a learning-based pipeline. It still utilizes the dual pixels, but it’s not a conventional algorithm, it’s learning based.” – Marc Levoy, Computational Photography Lead at Google. According to DPReview, this means “improved results: more uniformly defocused backgrounds and fewer depth map errors.”

Night Sight: “‘Night Sight’ utilizes HDR+ burst mode photography to take usable photos in very dark situations…[it] expects you to hold the camera steady after you’ve pressed the shutter button. When you do so, the camera will merge up to 15 frames, each with shutter speeds as low as, say, 1/3s, to get you an image equivalent to a 5 second exposure.” – DPReview
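To make the arithmetic behind Night Sight concrete, here’s a toy Python sketch (our own illustration, not Google’s code): averaging a burst of short exposures both adds up light (15 frames at 1/3s each is roughly a 5 second exposure) and beats down random sensor noise. The same align-and-merge idea underlies Super Res Zoom, though this sketch skips the alignment step entirely.

```python
import random

def merge_frames(frames):
    """Average a burst of frames pixel-wise. A toy stand-in for
    HDR+-style align-and-merge; real Night Sight also aligns frames
    and rejects motion, which this sketch omits."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

def effective_exposure(per_frame_shutter_s, num_frames):
    """Total light gathered is roughly the sum of per-frame exposures."""
    return per_frame_shutter_s * num_frames

# 15 frames at 1/3 s each ≈ a 5 second exposure, as DPReview describes.
print(effective_exposure(1 / 3, 15))  # → 5.0

# Averaging N noisy captures of the same scene shrinks random noise
# by roughly a factor of sqrt(N).
random.seed(0)
true_scene = [100.0] * 1000
frames = [[p + random.gauss(0, 10) for p in true_scene] for _ in range(15)]
merged = merge_frames(frames)
single_err = sum(abs(p - 100.0) for p in frames[0]) / 1000
merged_err = sum(abs(p - 100.0) for p in merged) / 1000
print(merged_err < single_err)  # → True
```

The hand-shake between frames is what Super Res Zoom exploits: sub-pixel offsets between the buffered images let the merge recover detail finer than any single frame contains.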

True Stereo Audio Recordings

Section by Dylan Raga

For video recordings, an improvement that has gone mostly unnoticed is that the Pixel 3 now records true stereo audio. Many were disappointed by its omission on the Pixel 2, which merely recorded in mono. While the quality of the separation (and of the audio itself) is untested, this at least shows Google has taken a step in the right direction for audio recording.

Wide Color Capture?

Section by Dylan Raga

Back in August, Russian tech blog Rozetked got hold of a pre-production Pixel 3 XL and leaked photos and videos taken with it. From those, we learned nearly two months in advance that the Pixel 3 was capable of stereo audio recording and that the front-facing camera had autofocus. Furthermore, we found that the photos had an embedded Display P3 color profile, suggesting the camera would capture a larger range of colors than not only its predecessors but every other Android smartphone camera; no other major smartphone camera except the iPhone’s (since the iPhone 7) captures wide color images. This also coincided with Android Central’s claim that the Pixel 3’s sensor is a “new version with better dynamic range capture.” However, photos taken on our review unit have the typical sRGB profile rather than an embedded Display P3 profile, so it looks like Google pulled the plug on wide color capture, at least for now.

Google Pixel 3 Hardware

IP68 Water Protection

The official Google blog post and several tweets from the @madebygoogle account state that the device is IP68 rated. The first digit represents resistance to solid particles and the second resistance to water. The Pixel 3 should be water resistant to a maximum depth of 1.5m for up to 30 minutes, and offers protection from harmful particles.
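The IP code itself is just two digits defined by the IEC 60529 standard, so it can be decoded mechanically. A quick Python illustration (our own helper function, not any official API):

```python
def decode_ip_rating(code):
    """Decode an IP (Ingress Protection) code like 'IP68' into its two
    digits: solid-particle protection level (0-6) and liquid protection
    level (0-9), per IEC 60529."""
    code = code.upper()
    assert code.startswith("IP") and len(code) == 4, "expected e.g. 'IP68'"
    return int(code[2]), int(code[3])

# The Pixel 3's IP68: level 6 is "dust tight", the highest particle
# rating; level 8 means continuous immersion beyond 1 m, which the
# manufacturer qualifies (1.5 m for up to 30 minutes in this case).
print(decode_ip_rating("IP68"))  # → (6, 8)
```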

FeliCa Support

Japan often gets different hardware variants of the smartphones released internationally. That’s because devices there need an Osaifu-Keitai compatible secure element so they can not only read and write FeliCa cards but also perform card emulation. The Google Pixel 2 series did not have such a secure element, but this year, the Pixel 3 does. Here are the four hardware variants of the Pixel 3 and Pixel 3 XL:

G013A – Pixel 3 international

G013B – Pixel 3 with Osaifu-Keitai compatible secure element

G013C – Pixel 3 XL international

G013D – Pixel 3 XL with Osaifu-Keitai compatible secure element

Band support also differs a bit across the models. The Japanese models support band 21 while the U.S. model doesn’t, and the U.S. model supports bands 71 and 32 plus TDD band 46 while the Japanese models don’t.

There’s no notification LED

We could talk about how the Pixel 3 has an Always on Display that can show your notifications, or how gestures like double tap to wake give you quick access to your lock screen notifications, but honestly, we’re going to miss the notification LED. It’s a pretty common feature on Android smartphones, and we’re not sure why it’s missing on the Pixel 3. Without one, it’s hard to tell when you have a new notification if your Pixel 3’s screen isn’t in your immediate line of sight.

Google Pixel 3 Software Features

You can hide the notch on the Pixel 3 XL…kind of

The “display cutout” developer option has been updated with a new “hide” option. When selected, it pushes down the status bar until it’s underneath the notch area. This effectively turns the Pixel 3 XL into the Pixel 2 XL. Here’s what that looks like:

Since a lot of people have asked, here's what hiding the notch on the Pixel 3 XL looks like when using the built-in Developer Option. Left: Pixel 2 XL. Right: Pixel 3 XL. As @joshuatopolsky said, it basically becomes a Pixel 2 XL. pic.twitter.com/vq677xXGyK — Mishaal Rahman (@MishaalRahman) October 10, 2018

You can also hide the notch using our Nacho Notch app. Instead of pushing the screen content down, this darkens the status bar area. Here’s what that looks like:

Gestures are the new norm

During Google I/O, we heard that gesture navigation would be the only way to navigate the UI on the next Google Pixel device. Google later clarified that gesture navigation would only be the default way to navigate on the Pixel 3, and that they couldn’t confirm whether gestures would be the only way to go moving forward. Now that the Pixel 3 is here, we can confirm that there isn’t an option to turn them off. Sorry fellas, gestures are here to stay.

There’s kind of a way to bring back the standard navigation keys, but I’ll go into details on that in a future article.

Google Duplex is finally coming

The futuristic Google Duplex feature, which can carry out entire phone conversations on your behalf, is finally coming. Google initially demoed it at I/O but was met with backlash over concerns that the receiving party would feel deceived because they weren’t informed they were speaking with a robot. Google addressed these concerns by adding a disclaimer at the beginning of every call initiated via Duplex.

Now, Google has announced that Duplex will be available to Pixel, Pixel 2, and Pixel 3 owners in New York, the San Francisco Bay Area, Phoenix, and Atlanta starting next month. The service will initially be limited to these areas because Google has to work with businesses and cities before rolling it out more widely.

Driving Mode is here to automatically start Android Auto for you

Prior to the Pixel 3 launch, we discovered a hidden Driving Mode feature in Google Play Services that later rolled out to a handful of users. The Google Pixel 2 launched with a feature that detects when you’re in a moving vehicle and automatically enables Do Not Disturb mode; the new Driving Mode enables Do Not Disturb and can also automatically start Android Auto. It’s available on the Google Pixel 3 in the Connected devices settings.

Now Playing History is finally here

Now Playing is a feature that debuted on the Pixel 2. It uses your device’s microphone to record audio snippets, on-device machine learning to construct an audio fingerprint, and a combination of an on-device database and Google’s massive cloud database to match that fingerprint with a known song. If there’s a match, Now Playing will show you what song is playing in the background by displaying the song title on the Always on Display/Ambient Display and/or as a notification. However, once you’ve dismissed that notification or the song title ticker disappears on its own, there’s no easy built-in way to go back and see which songs your phone previously recognized. That’s changing with the Google Pixel 3.
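To make the matching step concrete, here’s a toy Python sketch of fingerprint lookup. This is not Google’s actual pipeline (which fingerprints audio with machine learning); it only illustrates the idea that a fingerprint is a noise-tolerant key into a song database. All song names and sample values are made up.

```python
def fingerprint(samples, bins=8):
    """Toy audio fingerprint: quantize a window of samples into a coarse
    tuple so that slightly different recordings of the same song map to
    the same key. Real systems fingerprint spectral features instead."""
    lo, hi = min(samples), max(samples)
    span = (hi - lo) or 1.0  # avoid division by zero on silence
    return tuple(round((s - lo) / span * (bins - 1)) for s in samples)

# Hypothetical on-device database mapping fingerprints to song titles.
on_device_db = {
    fingerprint([0.1, 0.9, 0.4, 0.7]): "Song A",
    fingerprint([0.8, 0.2, 0.6, 0.1]): "Song B",
}

def recognize(samples):
    """Match a mic snippet against the local database; a real pipeline
    would consult a cloud database on a local miss."""
    return on_device_db.get(fingerprint(samples))

# A slightly noisy re-recording of "Song A" still quantizes to the
# same coarse key and therefore still matches.
print(recognize([0.12, 0.88, 0.41, 0.69]))  # → Song A
```

The coarse quantization is what buys noise tolerance: small perturbations in the input land in the same bins, so the lookup key is unchanged.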

Google has finally added history to the Now Playing feature. If you go to the Now Playing settings page in Sound settings, you can now access a menu that displays the last few songs the Pixel 3 recognized. You can also add a shortcut to your home screen that leads to this page. We’re not sure why this isn’t already available on the Google Pixel 2, as we first spotted evidence for the feature months ago in Pixel Ambient Services. At least there are already plenty of third-party apps on the Play Store that keep a log of your Now Playing history using a simple NotificationListenerService, so you don’t have to wait for an update to enjoy the feature.

Daydream VR still works

Yes, Google Daydream VR still works on the Pixel 3, according to Road to VR. The Pixel 3 XL is nearly identical in size to the Pixel 2 XL so even the same Daydream View headset should fit just fine. We’re not sure about the future of the platform given intense competition from Oculus, but Google did recently announce some pretty neat features for standalone Daydream VR headsets so it’s possible they’re still working to improve the VR experience for Daydream-supported smartphones.

New Features in Visual Snapshot

Earlier this year, Google rolled out the Visual Snapshot UI in the Google Assistant. Visual Snapshot is a feature which shows you information that’s important to you – things like upcoming calendar events, reminders, upcoming bills, the current weather, the current traffic to work, etc. Visual Snapshot is basically the spiritual successor to Google Now with a fresh coat of paint. At the Made By Google event, we were told that two new features would be added to Visual Snapshot next month: Your Recommended Events and Things to Remember.

A New Easter Egg

Android Pie brought a rather generic Easter egg of a glowing P, but the Android Pie release on the Pixel 3 has a proper Easter egg. Tapping on the middle of the P in the original Easter egg opens up a drawing app that you can sketch in.

New Google Home and Google Pixel Launcher Apps

The Google Home app got a major redesign to make it easier to use. You can learn more about that here as well as download the APK. The Pixel Launcher is also slightly updated: it now has a Google Assistant icon on the search bar and forces adaptive icons. You can see it here and download the APK on supported devices.

Google Pixel 3 Accessories

Pixel Stand

The new Google Pixel Stand is more than just a wireless charging dock. You can dock your new Google Pixel 3 on any Qi wireless charging pad if you wish, but docking it on the Google Pixel Stand gives you the fastest wireless charging speed (10W vs. 5W) and also enables Google Assistant integration on the Always on Display. On stage, Google demonstrated several interactions you can have with Google Assistant while docked: for instance, you can quickly access your “My Day” routine, see your Visual Snapshot, wake up slowly to an alarm as the screen color shifts, see an album from Google Photos, and more. However, there are a few features that Google didn’t go into detail about on stage.

First, when the Pixel Stand shows you photos from a Google Photos album, it uses AI to crop each photo just right so people aren’t cut out of the image. Second, the Pixel Stand UI has Nest doorbell integration: when someone rings the doorbell, the Nest video feed is shown right on top of the Always on Display – you don’t need to unlock the phone as long as your Pixel 3 is docked in the Pixel Stand.

Miscellaneous

A Return to F2FS

With the Qualcomm Snapdragon 845 chipset, UFS 2.1 storage, and LPDDR4X RAM, we would expect the Pixel 3 and Pixel 3 XL to be speedy devices. What has really set the Google Pixel series apart from most devices with the same chipset, though, are the software optimizations by the Pixel performance team. This year is no different, with Google experimenting with a return to F2FS – a file system initially developed by Samsung to bring high performance to flash-based storage.

We previously discussed the benefits of F2FS in this article, and according to Tim Murray from the Pixel performance team, the reason the Pixel 3 uses F2FS for the data partition is that “it supports inline crypto hardware now.” This isn’t the first Google device with F2FS support – the Nexus 9 used it before – but it’s interesting to see Google return to it.

First device with new Control Flow Integrity Protections

At the Linux Security Summit earlier this year, Google engineers revealed that the next Pixel device would be the first to ship with forward-edge Control Flow Integrity (CFI) in the kernel. This is part of Google’s ongoing efforts to harden the Linux kernel for Android devices, and today, they have made these new changes official. CFI defends against what are called “code-reuse attacks,” exploits that hijack the execution flow in the kernel to execute arbitrary parts of kernel code without injecting code of their own.

CFI is available in the Android common kernel for versions 4.9 and 4.14, but it’s up to device vendors to integrate the changes. CFI is enabled by default in Android Pie within the media framework and security-critical components like NFC and Bluetooth, according to Google.
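Conceptually, forward-edge CFI works like this: at build time the compiler computes, for every indirect call site, the set of functions that site may legitimately target, and it inserts a runtime check before the jump. Here’s a minimal Python analogy (all function names are made up for illustration; the real mechanism is a compiler-inserted check on machine-level indirect branches, not a Python dictionary):

```python
# Legitimate handlers an indirect call site might dispatch to.
def play_media():
    return "playing"

def stop_media():
    return "stopped"

# Stand-in for an attacker-chosen gadget elsewhere in the kernel.
def spawn_shell():
    return "owned"

# "Compile time": record the valid-target set for this call site.
VALID_TARGETS = {play_media, stop_media}

def checked_indirect_call(fn):
    """Forward-edge CFI check: refuse to transfer control to any
    function outside this call site's precomputed valid-target set."""
    if fn not in VALID_TARGETS:
        raise RuntimeError("CFI violation: invalid indirect call target")
    return fn()

print(checked_indirect_call(play_media))  # legitimate call succeeds
try:
    checked_indirect_call(spawn_shell)    # hijacked pointer is blocked
except RuntimeError as e:
    print(e)
```

The point of the check is that a code-reuse attack never injects code; it only corrupts a function pointer. Constraining every indirect branch to its legitimate targets removes most of the attacker’s room to redirect execution.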

Titan Security Module – What few details we have

Full documentation on the Titan Security Module is not yet available, but a few Google engineers have posted tweets that give us some information. First, in response to a tweet by Dees_Troy, lead developer of TWRP, Shawn Willden, Google’s tech lead for Android hardware-backed security subsystems, stated that the new security module will not be used for runtime system analysis. This is important for Magisk users because hardware-backed runtime system analysis would make systemless root much more difficult. However, Google has already opened up an API for the Trusted Execution Environment (TEE), so runtime system analysis could still happen in the future (in other words, there could still be bad news for Magisk).

Another tweet by Will Drewry, a Chrome OS Security Engineer, summarizes what the Titan Security Module is used for. As explained by Daniel Micay, former CTO of CopperheadOS, the Titan Security Module offers an “alternate hardware keystore (instead of TrustZone) and replaces the Android Verified Boot (AVB) state and Weaver applets for the Pixel 2 security chip.” According to Daniel Micay, “Weaver was the main use case for the Pixel 2 security chip: hardware-enforced exponentially growing delays for key derivation. It holds random tokens in escrow needed to derive per-profile encryption keys and only provides them when given correct credential-derived auth tokens.”
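As a rough illustration of the Weaver idea Micay describes – token escrow plus hardware-enforced, exponentially growing retry delays – here’s a toy Python model. All names, constants, and the delay schedule are our own inventions for illustration; the real applet runs inside the security chip and its throttling schedule is not public.

```python
def weaver_delay_seconds(failed_attempts, base=1.0):
    """Illustrative exponentially growing delay before the next
    key-derivation attempt is permitted (constants are made up)."""
    if failed_attempts == 0:
        return 0.0
    return base * (2 ** (failed_attempts - 1))

class WeaverSlot:
    """Toy escrow slot: releases its random token only when given the
    correct credential-derived value, and counts failures so the
    hardware can throttle brute-force attempts."""
    def __init__(self, key, token):
        self._key = key
        self._token = token
        self.failures = 0

    def read(self, key):
        if key == self._key:
            self.failures = 0
            return self._token
        self.failures += 1
        # Caller must wait weaver_delay_seconds(self.failures) to retry.
        return None

slot = WeaverSlot(key=b"credential-derived", token=b"random-escrow-token")
print(slot.read(b"wrong guess"))           # → None
print(weaver_delay_seconds(slot.failures)) # delay before the next try
```

Because the token needed to derive the profile’s encryption key never leaves the chip without a correct credential, an attacker can’t brute-force the PIN off-device, and the growing delay makes on-device guessing impractically slow.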

With Project Treble, you’ll get many updates

We’ve heard a lot about Project Treble since it was first introduced in Android 8.0 Oreo, and thanks to the introduction of the Vendor Test Suite and CTS-on-GSI, Project Treble has neared completion in Android Pie. Devices shipping with Android Pie, we’re told, should be able to receive framework-only updates. Treble is a big part of why Essential is able to roll out monthly security patches so quickly. We suspect it’s also part of the reason why Google is able to offer 3 years of guaranteed Android platform upgrades, starting with the Pixel 2 series and also now with the Pixel 3 series.

Note: some details were omitted from this article until our full review to respect Google’s reviews embargo.