Over the years, human drivers have gotten pretty good at communicating intent to the people around them. A little wave of the hand, a nod, maybe some eye contact, and that’s usually all it takes to let pedestrians, bicyclists, and other drivers know what they’re doing. And, if all else fails, we can lean on our horns or roll down our windows and speak our minds (sometimes colorfully). But what happens when you remove the human from the driver’s seat? How will the cars of the future “talk” to the world around them?

Uber thinks it may have arrived at a solution. In a recent patent application, the ride-hailing giant proposes wrapping its self-driving cars in flashing signs to communicate messages to pedestrians and others around them. The illustration accompanying the application is pretty wild: flashing arrows would appear on the side-view mirrors, a projector would display a virtual crosswalk in front of the car, and a “virtual driver” would pop up in the windshield to point pedestrians in the right direction. In essence, the car would need to be lit up like the Las Vegas Strip to make up for the absence of a human behind the wheel.

“In the real world, when there’s a human driver, they’re usually not shouting out the window, ‘Hey I’m slowing down now,’” says Sean Chin, a product designer at Uber’s Advanced Technology Group, which oversees its autonomous vehicle program. “There are subtle things you can do, like a head nod or flashing lights. And while we don’t have final implementation, what we’re considering is what is a new language we can create to give people that information.”

Chin cautions that the patent application, which hasn’t received approval from the US Patent Office, should be seen more as a framework for how Uber is thinking about this issue than as a final product. For example, Uber’s product team is also exploring how self-driving cars can communicate intent through “vehicular sound.”

Uber’s patent application seems largely focused on the idea of the car giving instructions to pedestrians about when to walk and where to go. But Chin says the end goal is to communicate the intentions of the car so that pedestrians can make their own decisions. “If we can instead say the car is slowing down, then all actors in the area, whether they’re pedestrians or cyclists pulling up along the side, can all interpret the vehicle state and make their own individual decisions,” he says.

“As operators, we have a high certainty about what the car is doing and what it plans to do,” he adds. “What we need to do is enable pedestrians to interpret this behavior on their own.”

Uber isn’t the only company working on this communication problem around self-driving cars. Google has its own patent that contains a range of ideas, including light-up “walk” or “don’t walk” signs on the car’s body, image displays, and audible signals similar to the ones used at busy crosswalks. Other types of notifications sound more bizarre, like “a robotic hand to make gestures or robotic eyes on the vehicle that allow the pedestrian to recognize that the vehicle ‘sees’ the pedestrian.”

Drive.ai, a San Francisco-based startup, is working on LED signs mounted on the vehicle that use text and emoji-like pictures to communicate. The company is also developing an advanced version of its auditory feedback, allowing the car to “see the context” of the situation and emit a “more socially appropriate” honk. (Google is working on something similar.)

Uber’s Chin says whatever solution the ride-hailing company lands on, it needs to be “tasteful and feel natural.” In other words, the complete opposite of how most drivers currently choose to communicate — at least here in New York City.