"I don’t have a microchip in my head – yet," says the man charged with transforming Google’s relations with the technology giant’s human users.

But Scott Huffman does envisage a world in which Google microphones, embedded in the ceiling, listen to our conversations and interject verbal answers to whatever inquiry is posed.

Huffman, Google's engineering director, leads a team tasked with making conversations with the search engine more reflective of the complex interactions people enjoy with each other.


The future of the $300 billion business depends upon automatically predicting the search needs of users and then presenting them with the data they need.

“Computing is becoming so inexpensive that it’s inevitable that there will be a ubiquity of connected devices around us, from our lapel to our car to Google Glass [a new optical head-mounted computer],” said Huffman during a visit to the UK from the company’s California base.

A microphone hanging from the ceiling, responding to verbal queries, would remove the need to whip out a phone to remind yourself what time tomorrow’s flight leaves. It could also make sure you don’t miss the flight altogether.

“Like a great personal assistant, it will interrupt you and say ‘you’ve got to leave now’. It will bring you the information you want,” Mr Huffman said.

In fact, believes Mr Huffman, who has been working on refining search for 15 years, the clunky physical act of typing requests into Google’s search box will gradually recede almost to nothing.

The information could be relayed via “a wearable device, perhaps it might have a small screen, which you can only interact with through your voice and maybe touch but nothing else”.

For play as well as work

The microphone network would have leisure uses too.

“Imagine I can say to a microphone in the ceiling of the room ‘Can you bring up a video of the highlights of yesterday’s Pittsburgh Steelers game and play it on a TV in the living room?’ and it works because the Cloud means everything is connected,” he says.

“I could ask my Google ‘assistant’ where we should have lunch – somewhere that serves French food and isn’t too expensive. Google will go ‘OK, we’ll go to that place’ and when I get in my car it should already be navigating to that restaurant. We’re really excited by the idea of multiple devices being able to talk to each other.”

Whether Google users want a microphone embedded in every ceiling is another matter, however, after the company became enveloped in a crisis of trust following Edward Snowden’s revelations about PRISM, the clandestine electronic-surveillance programme run by the US National Security Agency.

On Monday, Google joined forces with fellow tech giants including Facebook, Apple and Yahoo! to call for sweeping changes to US surveillance laws and an international ban on bulk collection of data to help preserve the public’s “trust in the internet”.

“We take privacy and security very seriously,” Mr Huffman said. “Our goal is to keep users’ information private and use it in a way that helps that user. When I ask Google for travel information during my trip it draws it out using my hotel confirmation email. So I’m trusting Google with that information and in exchange I’m getting that value.”

Google believes it can ultimately fulfil people’s data needs by sending results directly to microchips implanted in its users’ brains. Research has already begun with such chips to help disabled people steer their wheelchairs.

“If you think hard enough about certain words they can be picked up by sensors fairly easily. It’ll be interesting to see how that develops,” Mr Huffman said.

His current priority is utilising Google’s Knowledge Graph, an expanding store of information holding 18 billion facts on 60 million subjects, to deliver a more “human” search response. Voice-based search requests are more complex than the two-word searches typed into the search engine.

“My team is working very hard on the idea of a richer conversation with Google. We use a fairly complex linguistic structure in conversation that Google today doesn’t understand.

“But five years from now we will be having that kind of conversation with Google and it will just seem natural. Google will answer you the same way a person would answer.”

The engineer adds: “Google will understand context in conversation but it’s not an armchair psychiatrist. You can’t have a conversation about your mother. Google can’t talk to me about how I feel about things until it understands factual ‘things’. We’re just getting started understanding ‘things’ in the world.”
