During its keynote today, Google announced new features coming to its flagship search function—you know, that thing we all started using Google for. VP Amit Singhal spent some time discussing what Google's search functionality will eventually morph into.

Google's strategy is summarized by three words: answer, converse, and anticipate. Singhal explained that many of the pieces of these upcoming changes can already be seen in products that Google has recently introduced—namely, Google Knowledge Graph and Google Now, with perhaps a splash of Google Glass, too.

Answer: Knowledge Graph

Last year, Google launched Knowledge Graph. The intent was to let Google move beyond simply locating keywords and begin understanding real-world entities (people, places, and things) and the relationships between them. Singhal cited two example questions Knowledge Graph was designed to answer: "What are the movies by J.J. Abrams?" and "What's the release date of Star Trek: Into Darkness?" Google's Knowledge Graph currently contains over 570 million connected entities.
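The idea of answering questions by following relationships between entities, rather than matching keywords, can be illustrated with a tiny graph. This is a purely illustrative sketch—the schema, relation names, and data below are assumptions for the example, not Google's actual Knowledge Graph format:

```python
# A toy entity graph: entities are nodes, and typed edges encode the
# relationships between them. Data and schema are illustrative only.
graph = {
    ("J.J. Abrams", "directed"): ["Super 8", "Star Trek", "Star Trek Into Darkness"],
    ("Star Trek Into Darkness", "release_date"): ["May 2013"],
}

def answer(entity, relation):
    """Answer a simple factual question by following a typed edge."""
    return graph.get((entity, relation), [])

print(answer("J.J. Abrams", "directed"))
print(answer("Star Trek Into Darkness", "release_date"))
```

A keyword index can only find pages containing "J.J. Abrams"; a graph like this can enumerate his films directly, which is what makes follow-up questions about any one of them possible.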

Today, Singhal announced the addition of Knowledge Graph-based statistics to search. Google will attempt to anticipate follow-up questions—a search for "what is the population of India" will show not just a trend line of India's population but also a sidebar showing the populations of other countries, like China and the USA.

Knowledge Graph is also expanding to include support for results in Polish, Turkish, and Simplified and Traditional Chinese.

Converse: Some Now, some Glass

Sometimes you're searching for a song or video sent by a friend, or you're keyword searching through your inbox for a hotel or airline reservation you've previously made. Singhal explained that Google feels you should be able to find these kinds of things via a context-sensitive conversational interface rather than having to sift through e-mail. He said that one of Google's goals is for users to be able to ask Google for details on an upcoming flight or trip, about packages scheduled to be delivered today, or about vacation photos.

Google can already deliver some of these answers, but it wants to be able to deliver them in a natural, human-like way—like asking a friend for the information rather than typing in keywords. Google Now gives spoken responses to search queries, but not in a conversational way. It doesn't have much of a memory for past searches, for example.

Building on Google Now's existing capabilities, Singhal announced this morning that fully conversational search is going to be integrated into all devices, stationary or mobile, via Chrome. This new kind of conversational search, which Google calls "hotwording," can be invoked on a supported device simply by saying "OK, Google" and asking a question—just like Google Glass' invocation phrase of "OK, Glass." The now-always-on Chrome search will speak back the answer, and best of all, it has a memory—it understands context and lets you refer to previous search items with pronouns (allowing users to ask questions like "Where is the museum?" followed by "How far away is it?" and "How late is it open?").

Anticipate: More Google Now

The third leg of the new strategy is "anticipate," or suggesting the right thing at the right time. This role is partly filled today by Google Now, which launched last year on Android and iOS (via the Google Search app), but it's being greatly expanded. Today, Google is announcing several new Google Now cards (that is, things it can display), including the ability to set reminders, view public transit information, and display info about music, books, TV shows, and video games. Google wants Google Now to be a "fully assistive tool" that brings together everything a user might want to do into a single set of interactions.

All of these features were demoed by Google VP Johanna Wright, who used the service to plan a day trip to Santa Cruz with her family. Using a development build of Chrome, she called up the new search function with a simple "OK, Google"—no hands at all, not even to click a push-to-talk button—and asked about interesting things to do in Santa Cruz. She then asked for details about the Santa Cruz boardwalk, which was listed in the results. After a key question ("OK, Google, how far is it from here?"), Google pinpointed her current location at Moscone and told her the boardwalk was 1 hour and 21 minutes away. Google picked out words like "it" and "here" and understood their context—something Siri and other natural language search tools have a lot of trouble with.
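The context-carrying behavior in that demo can be sketched as a small amount of conversational state: remember the last entity mentioned and the user's location, and substitute them for "it" and "here" in follow-up questions. This is a toy illustration of the concept under those assumptions, not Google's implementation (a real system would extract entities from search results and resolve pronouns far more robustly):

```python
import re

class ConversationalSearch:
    """Toy sketch of pronoun resolution against stored conversational context."""

    def __init__(self, current_location):
        self.current_location = current_location
        self.last_entity = None

    def ask(self, query):
        # Resolve pronouns against stored context before running the search.
        if self.last_entity:
            query = re.sub(r"\bit\b", self.last_entity, query)
        query = re.sub(r"\bhere\b", self.current_location, query)
        # Remember the most recent concrete entity for follow-up questions
        # (hard-coded here; a real system would extract it from results).
        if "Santa Cruz boardwalk" in query:
            self.last_entity = "the Santa Cruz boardwalk"
        return query

search = ConversationalSearch("Moscone Center")
print(search.ask("Tell me about the Santa Cruz boardwalk"))
print(search.ask("How far is it from here?"))
# -> How far is the Santa Cruz boardwalk from Moscone Center?
```

Even this crude version shows why the memory matters: the second question is meaningless on its own, but becomes a fully specified query once the context is substituted in.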

The future search can also quickly handle complex requests like "Send an e-mail to Katie and ask her if we can meet for dinner," picking out the right person and generating an e-mail in a very Siri-like fashion.

When?

Singhal admitted that not all of these features will be immediately available, but Google will begin rolling them out in the very near term. The demonstration was meant to show not what search will be like later this afternoon, but what it will be like in the near future.

Get ready to do a lot more talking to your computers, phones, and magical space glasses—the era of "OK, Google" and "OK, Glass" is arriving.