“Uhm,” said the female voice. “Can I book a table for tomorrow?” The question came not from a person, but from software called Duplex, developed by Google to make phone calls. Before the end of the year, some of the company’s users will be able to direct the bot to call restaurants and book tables on their behalf.

In a demonstration last week, Duplex smartly handled questions from a Google employee playing the role of a restaurant worker about details such as the size of the party and the name to hold the table under. Then the bot signed off with a cheery “Ok, great, thanks.” Duplex had started the conversation by announcing “I’m Google’s automated booking service so I’ll record the call,” but it was barely distinguishable from a person.

Google announced today that Duplex will be made available on the company’s Pixel smartphones before the end of the year, in New York, Atlanta, Phoenix, and the San Francisco Bay Area. It will be a feature of Google Assistant, the company’s rival to Apple’s Siri. For now, the bot will call only restaurants that lack online booking systems; restaurants with online booking are already supported by the assistant.

Duplex’s debut makes a small change to Google Assistant’s capabilities. But it marks another moment in the march of artificial intelligence technology into daily life. Investments in AI by Google and its competitors have made it routine for computers to recognize our speech or faces. But even recent AI-powered services with names and voices, such as Apple’s Siri and Amazon’s Alexa, cannot be easily confused with humans. Software that can passably imitate how people talk, and make its own calls, feels...um...different.

Google CEO Sundar Pichai sparked awe but also alarm when he unveiled Duplex in May in a keynote at the company’s annual developer conference. He played two recordings in which the bot did not identify itself when calling apparently unwitting staff to make bookings at a hair salon and restaurant.

A Google spokesperson told WIRED that the company now has a policy to always have the bot disclose its true nature when making calls. Duplex still retains the human-like voice and “ums,” “ahs,” and “umm-hmms” that struck some as spooky, though. Nick Fox, the executive who leads product and design for Google search and the company’s assistant, says those interjections are necessary to make Duplex calls shorter and smoother. “The person on the other end shouldn’t be thinking about how do I adjust my behavior, I should be able to do what I normally do and the system adapts to that,” he says.

The experience of WIRED writer Lauren Goode, who answered a call from Duplex in a demo this past June, illustrates how bots that sound like people can be disorienting. She confused the bot by lobbing a question about allergies in the middle of a discussion about available times for a restaurant reservation. Goode became confused herself when she learned that a second voice, which came on the line to complete the derailed transaction, belonged to a human call center worker, not another Duplex bot playing cleanup.

The term computer was originally applied to people, who carried out calculations manually. Then computers became room-filling machines, then desk-sized, then pocketable. Now they can sound and converse like people, at least in the confines of a dialogue with a very specific goal. “It feels odd because people have this notion that people and machines are different,” says Jeff Bigham, a professor at Carnegie Mellon University who researches human-computer interaction.

Restaurant staff will be the guinea pigs for what happens when that distinction is eroded—at least for certain kinds of phone calls.