Cars are smarter than ever, but their giant screens can be dangerously distracting. Vastly improved voice controls could boost safety in the near term and potentially govern how we interact with automated vehicles in the future, too.

The big picture: Taking your eyes off the road for just two seconds doubles the risk of a crash, according to AAA research.

With poorly designed technology, simple tasks turn into complicated ones, diverting drivers' attention away from the road.

Staying alert is already a challenge with many of today's partially automated driving features.

Context: Voice controls have always made the most sense, but using them has often been laughable. Drivers sometimes have to utter stilted commands like "navigate" or "telephone" instead of speaking naturally — and even then the systems don't always get it right.

What's new: Advances in natural language processing technology mean drivers can now converse with their car in a way that is both safe and helpful.

SoundHound, which started out with a music recognition app, spent the past decade quietly adapting its AI technology to work as a hands-free intelligent system for cars. It has partnerships with Daimler, Hyundai, Honda and PSA, the parent of Peugeot, Citroen and Opel.

Nuance is also developing in-car assistants that can track drivers' emotions and help keep them engaged.

Of note: Consumers want their cars to mimic their smartphones, so most carmakers have already incorporated Apple CarPlay and Android Auto into their infotainment systems.

Yes, but: There's a downside to that strategy. Inviting those data-hungry tech giants into the dashboard puts them between carmakers and their customers, making it harder for automakers to create their own brand experience inside the vehicle.

SoundHound offers an alternative. "We've architected a platform that enables major companies to harness best-in-class technology, but still own their own brand, and not be disintermediated by the giants," says VP and general manager Katie McMahon.

Mercedes-Benz set a new industry benchmark for voice controls in the 2019 A-Class sedan and GLE utility with its MBUX infotainment system.

Drivers can control many of the car's features by speaking conversationally after a wake-up command: "Hey Mercedes."

The car understands things like "I'm cold" or "I feel like Asian food, but not Japanese" and responds accordingly by turning up the heat or suggesting nearby Chinese or Thai restaurants.

It can also tap into the cloud for answers to multilayered, contextual questions on 150 different topics.

The only features it won't let you control by voice are safety-related ones like lane-keeping assist and adaptive cruise control — Mercedes doesn't want drivers accidentally disengaging those systems just by talking about them.

What to watch: Natural language AI voice systems could become one of the most effective driver-assistance technologies available. But even when your eyes stay on the road, they remain a potential source of cognitive distraction. We'll be watching to see how they affect safety.

Go deeper: Regulating the humans behind the wheels of autonomous vehicles