Google’s Pixel event last week saw the unveiling of a slew of new devices, but the real takeaway was how good the AI underneath them has become

2017 has been a good year for handheld devices, with most manufacturers coming up with products that pushed the boundaries of design and engineering a little further. So, when Google announced the launch of its new Pixel devices, well after most of the big launches of the year were done, the pressure was on the Mountain View giant to come up with something that matched up to the high standards set by everyone else.

So did they? Yes, though not with fancy screens or styluses or hi-fi audio, but in the most Google way possible: with smarts.

This is a narrative the company has been emphasising since the last Pixel, and when Sundar Pichai got on stage last week, he spent most of his time in the same vein, telling the audience how good AI has become. After informing us that only a handful of people in the world are qualified to build machine learning models, Pichai spoke about AutoML, Google’s effort to automate the design of such models, which he claims now produces results that slightly surpass existing hand-built ones while using significantly less processing power. Google spokespeople also discussed the Assistant’s capabilities and improvements at length, with the actual device launches coming much later in the event. The story is clear now: AI is the future, and the Pixels, Google Homes, and everything else exist just to get that AI into our homes.

With this in mind, it comes as little surprise that the Pixel phones themselves seem like iterative upgrades on last year’s models. They look better, are waterproof now, match current standards with a near bezel-less screen (in the case of the Pixel 2 XL), and drop the headphone jack (so much for Google’s snarky jab at Apple at last year’s Pixel reveal).

All this would count as a generic upgrade, but that’s where the smarts come in. Almost every phone launched this year uses two cameras on the back to produce a DSLR-like bokeh effect. On the Pixel, Google says the AI Pichai spent so much time setting up at the beginning of the presentation detects subjects well enough to replicate the effect with a single camera, which means the same effect works on the front camera as well. (AI: 1, custom hardware: 0)

With Google also abandoning the 3.5mm headphone jack, we are likely to see less and less of it in future devices. So the company has shored up its audio offerings with two new sizes of the Home speaker, a Mini and a Max, offering alternatives to everything from the Amazon Echo Dot to Apple’s HomePod and even the more audio-focused smart offerings from Sonos. And at the heart of it all? You guessed it: Assistant.

Assistant also lives in Google’s less compact answer to Apple’s AirPods, the Pixel Buds. While these do all the things other Bluetooth earphones do, the coolest thing Google demoed was how the Buds can tap into Google Translate and help two people hold a conversation in two different languages with almost real-time translation. Of course, demos must always be taken with a grain of salt, but as this technology gets better, language barriers will get smaller, and that is no small step for us as a species. That said, this feature will only work to full effect with a Pixel phone. And speaking of Pixel-exclusive features, the Assistant on Pixels will also get access to Google Lens, allowing it to effectively ‘see’ and provide answers and information based on what you ‘show’ it through the camera. So yes, Google wants you to buy into a Pixel ecosystem, but it is hard to deny that AI is racking up points here.

The other noteworthy announcement at the event was the $1,000 Pixelbook and the horribly named Pixelbook Pen. It fits into the mould of sleek, portable, do-anything-on-the-go devices that are the norm these days, but with the caveat that it runs Chrome OS. Sure, you get Assistant again, and it even comes with a dedicated key, but of all the devices showcased at the event, this is the one with the least well-defined use case, at least in our book.

So that’s the lineup: the new devices don’t feature inspired design or reinvent the wheel, but they are smarter than ever. And the future is all about the smarts.