With Siri set to see significant improvements once iOS 13 ships, Apple is appearing at a key voice AI tradeshow and has published a study explaining some of the details of a first-of-its-kind machine learning (ML) tech it calls “Overton.”

Defining a machine learning window

This week, Apple is sponsoring the world’s largest spoken language processing conference, Interspeech 2019.

As part of its work at the event, it has submitted multiple research papers – and members of its growing machine learning teams will meet attendees there.

Among other topics (see them all here), Apple will present papers on detecting expression and intent through voice, improving voice recognition, developing more accurate tools to understand speech nuances, using mirroring to build relationships between human users and speech assistants, and using tech to optimize speech enhancement.

We may learn a little more about what the company is up to in ML at the all-new Interspeech YouTube portal, though we don’t know if any Apple video will appear there.

It’s no shock that Apple’s scientists are engaging with the wider scientific community. The company has published sporadic machine learning papers and announcements on its own Machine Learning portal since 2017.

Introducing Overton

Apple claims to have a first-of-its-kind solution with Overton – it aims to enable much of the personalization of ML models to be administered by the machine, not the human.

Voice interaction is only the front-end of what happens when you ask Siri a question. Machine learning models must then try to understand the question, contextualize it and figure out the most accurate response. Delivering a high-quality response is actually harder than it seems.

Sure, for some queries all you’ll get from Siri will be data it’s found on a Wikipedia page (though even then it might have checked several such pages to select the most relevant response). But the eventual aim must be that Siri becomes an effective source for complex answers to complex problems – even to the extent of predicting them.

These next steps are hard to accomplish.

How can scientists become more confident that the response Siri gives is the most accurate one available?

That’s the kind of challenge Apple is addressing with Overton, which “automates the life cycle of model construction, deployment, and monitoring.”

In human terms, that means the machine itself fixes and adjusts machine learning models in response to external stimuli, making them more accurate and repairing logical flaws that might lead to an incorrect conclusion. The idea is that humans can then focus on the high-end supervision of machine learning models.

This (I think) means that rather than needing to get deep inside increasingly complex code to make minor but necessary adjustments, humans can request a set of changes that Overton then applies.
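To make that division of labor concrete, here is a minimal Python sketch of the declarative idea: the engineer describes *what* a task should do via a schema, and an automated supervisor decides *how* to build, tune and monitor the model. Every class, method and field name below is my own illustration, not Apple’s actual Overton API.

```python
# Illustrative sketch of declarative model supervision in the spirit of
# Overton: humans declare tasks; the system owns the model life cycle.
# All names here are hypothetical, invented for this example.

from dataclasses import dataclass, field


@dataclass
class TaskSchema:
    """High-level description of a prediction task."""
    name: str
    inputs: list    # input payload names, e.g. ["query_text"]
    output: str     # task type, e.g. "intent_classification"


@dataclass
class Supervisor:
    """Stand-in for the automated build/deploy/monitor life cycle."""
    schemas: list = field(default_factory=list)

    def register(self, schema: TaskSchema) -> None:
        # The human contribution largely stops here: declaring the task.
        self.schemas.append(schema)

    def build_and_monitor(self) -> list:
        # In a real system this step would choose architectures, retrain
        # on fresh data, and flag quality regressions automatically.
        return [f"model for '{s.name}' built and monitored"
                for s in self.schemas]


sup = Supervisor()
sup.register(TaskSchema("intent", ["query_text"], "intent_classification"))
print(sup.build_and_monitor())
# → ["model for 'intent' built and monitored"]
```

The point of the sketch is the shape of the workflow, not the internals: engineers edit small declarative descriptions instead of digging into model code.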

Quite literally, they're controlling the Overton window.

How will Apple use this?

I think Apple’s ambitions for Siri extend beyond it being the digital equivalent of the slightly useless friend you sometimes query even though you know you may not get a useful response.

Siri is intended to be a voice-led helper capable of bringing high-level information, contextualized analysis and augmentation of the tasks you already do. Siri Suggestions shows that direction, though the implementations remain limited.

Apple says: “A major direction of on-going work are the systems that build on Overton to aid in managing data augmentation, programmatic supervision, and collaboration.”

I also think Overton has user privacy implications.

Think about it like this:

Apple’s scientists build models they believe to be highly accurate. These models run on the iOS device. Overton provides those models with a degree of independence and ML systems adjust models for accuracy and relevance – all without giving researchers granular insight into individual actions.

This means data managers (in this case, the scientists creating those models in the first place) occupy more generalized strategic roles in which information concerning individual users is not made available to them.
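One way to picture that separation, under my own assumption (not anything Apple has confirmed) that on-device adjustment reports back only aggregate quality signals, is a model that keeps per-interaction data local and exposes nothing but summary statistics:

```python
# Hypothetical sketch: an on-device model tunes itself locally and
# surfaces only aggregate quality metrics to researchers, never the
# individual user interactions. Names and scoring are illustrative.

from statistics import mean


class OnDeviceModel:
    def __init__(self):
        self.scores = []  # per-interaction quality, kept on the device

    def handle(self, query: str) -> None:
        # Pretend each interaction yields a local quality score that
        # the device could use to adjust the model in place.
        self.scores.append(1.0 if len(query) < 40 else 0.5)

    def report(self) -> dict:
        # Researchers see only aggregates, not the queries themselves.
        return {"interactions": len(self.scores),
                "mean_quality": mean(self.scores)}


m = OnDeviceModel()
for q in ["weather today",
          "play some jazz",
          "a very long and complicated request about my calendar"]:
    m.handle(q)
print(m.report())  # aggregate counts and mean quality only
```

The design choice matters more than the toy scoring: the `report` boundary is where granular user data stops and generalized strategic oversight begins.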

Apple creates ML machines to handle certain defined tasks, while also equipping the machines themselves to personalize the models they use. This seems to be what Overton is about – and was certainly part of what drove Apple to purchase Silk Labs.

Apple says Overton is the first machine learning management system set up to improve and monitor application quality. Reading between the lines, it may (and I stress “may,” as I don’t know any better) also be the technology used to identify when you point your iPhone 11 camera at a pet for a pet portrait.

Tomorrow’s world is a work in progress.
