Microsoft pointed out that while Siri, Cortana, Google Assistant and Alexa can execute commands, they can't carry on a conversation in any meaningful way (Google's much-debated Duplex conversation aside). Semantic Machines, however, has developed tech that can understand entire chats, not just orders to do this or that. "For rich and effective communication, intelligent assistants need to be able to have a natural dialogue instead of just responding to commands," said Ku.

Semantic Machines should be able to help Microsoft with all that via its primary product, the Conversation Engine. It uses machine learning to figure out the "semantic intent" of a conversation, capturing end users' context and goals in order to carry on a natural back-and-forth. It's built on a self-updating framework that helps it get better over time.
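Semantic Machines hasn't published how the Conversation Engine works internally, but the idea of tracking a user's goal across multiple turns (rather than treating each utterance as a standalone command) can be sketched in a few lines. Everything below is hypothetical and purely illustrative: the class name, the keyword matching, and the slot names are all stand-ins for what a real semantic parser would do.

```python
# Illustrative only: a toy dialogue-state tracker showing the difference
# between one-shot command parsing and carrying context across turns.
# Names and logic here are hypothetical; Semantic Machines' Conversation
# Engine internals are not public.

class ToyDialogueState:
    """Accumulates the user's goal ("intent") and details ("slots") turn by turn."""

    def __init__(self):
        self.intent = None
        self.slots = {}

    def update(self, utterance):
        text = utterance.lower()
        # Naive keyword spotting stands in for real semantic parsing.
        if "book" in text or "reserve" in text:
            self.intent = "book_table"
        if "tomorrow" in text:
            self.slots["date"] = "tomorrow"
        for word in text.split():
            if word.strip(",.").isdigit():
                self.slots["party_size"] = int(word.strip(",."))
        return self

state = ToyDialogueState()
state.update("Can you book a table?")    # first turn establishes the goal
state.update("Make it for 4, tomorrow")  # later turns refine the same goal
print(state.intent, state.slots)
```

The key point the toy captures: the second utterance ("Make it for 4, tomorrow") is meaningless as a standalone command, but makes perfect sense once the state from the first turn is retained, which is what separates a conversational assistant from a command executor.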

All of this is in keeping with Microsoft's cloud-based Cognitive Services framework, first introduced in 2016, which helps developers add speech recognition, natural language comprehension and AI assistants to their apps. Microsoft said that it has 1 million developers on the platform, and another 300,000 on its more business-oriented Azure Bot Service. Whether that will produce assistants that can book you a haircut appointment or restaurant reservation like Google's Duplex remains to be seen.