At last year’s CES 2017, we saw glimpses of the self-driving future: depending on our whims, we would entertain ourselves, get real-time information about the places we pass through, or work while in the car. At CES 2018 last week, those visions became more concrete. Automakers and suppliers talked about delivering personalized experiences and showed integration of current technologies from the home and mobile space, particularly voice assistants like Alexa and Google Assistant, as well as future capabilities that will blend your digital life from home and mobile into the automobile. They also talked about the digital software platforms that will bring those integrated experiences to reality in the years ahead. But beyond the pie-in-the-sky future, they also showed in-car experiences that will arrive with assisted and semi-autonomous driving technologies, which begin rolling out over the next three years as we move closer to the autonomous future.

The Digital Cockpit

Automobile controls and interfaces continue to go digital. While the past few years have seen the digitization of the center stack, particularly infotainment, nearly every touch point in the car will now be digitized. A variety of cost-effective screen technologies, such as OLED and QLED, will bring more information and richer visualization to car displays. Despite auto purists’ complaints about touch controls replacing real knobs and buttons, touch will likely become more prevalent in the years ahead, given advances in haptic feedback and the ability to contextualize controls by combining knobs and dials with digital interfaces. Touch interfaces enable clean dash designs, save weight, and are easier and cheaper to manufacture.

Head-up displays (HUDs) will also become more prevalent, and far more informative. On-windshield HUDs using photo-luminescent nanomaterials, as well as through-windshield HUDs using holographic film, will deliver augmented-reality-style experiences. They’ll provide richer navigation, driver alerts, and real-time information about road conditions, parking, and pertinent location-based points of interest. As cars transition from assisted driving to fully autonomous capability over the coming years, the richness and immersiveness of the information that HUDs and screens present to drivers and passengers will increase.

Personalization

Personalization was a huge theme of just about every demo at CES 2018. With multiple screens controlled by unified systems in next-generation interfaces, both drivers and passengers will be able to configure what information they want to see. Different ecosystems that offer control over entertainment, information, and home automation are being brought into the car. Multiple automakers and suppliers, including Harman and Panasonic, featured demos of Alexa, Google Assistant, and Samsung’s Bixby. These assistants will also enable familiar, intelligent voice control over car functions such as HVAC, infotainment, and navigation.

The ecosystem battle for the automobile will continue. We’ve already seen wide adoption of Apple CarPlay and Android Auto in auto infotainment systems, which bring some of the personalized experiences from mobile phones onto a screen in the cockpit to enable safer driving and make a more seamless transition of information from your mobile life into the car. Bringing in the voice-control ecosystems of Alexa, Google Assistant, and Bixby represents another step. But rather than let those ecosystems take over the car, some vendors are pitching their own cloud platforms to enable the complete connected-car experience. One example is Harman’s Ignite, which encompasses cloud-based driver profiles and content personalization, along with other vehicle services like over-the-air software updates, vehicle analytics, and map as a service (MaaS). While automakers and suppliers want to incorporate popular digital and mobile consumer services, they realize the digital experience in autos will eventually overtake the driving experience as manual driving becomes less necessary, and they want to own and differentiate part of those experiences.

Safety

While autonomous driving will someday require no driver attention, on the road to that nirvana, one issue with all the new information and capabilities in car cockpits is keeping the driver’s attention on actual driving. Several automakers and vendors, including Hyundai, Clarion, Valeo, and Omron, featured demos of facial and eye detection. Cameras inside the car map the driver’s face and track eye position. The goal is to detect driver distraction or drowsiness and trigger alerts or other semi-autonomous driver aids to keep the car on its path. Those cameras could also be used for facial recognition and driver identification (a la iPhone X), enabling personalization features ranging from standard things like seat positions, information displays, and in-car entertainment content to connections with home-automation features like lights and temperature settings when you leave or get home.

Several demos featured health and wellness monitoring. Using in-seat sensors and the aforementioned cameras, information like heart rate, breathing patterns, and facial expressions can be monitored to gauge stress, anxiety, and alertness and help ensure driver safety. Presumably, in extreme cases where a driver’s or passenger’s health is in danger, a help call could be triggered automatically as well.