For the past 20 years, Google’s mission has been to organize the world’s information. Increasingly, the information it serves up is ordered around you: your browsing habits, where you go, who you talk to, what you say, and what you search for.

The trend came into stark relief Tuesday at Google’s annual developer conference, where the company introduced a suite of new services that, frankly, sound awfully convenient. Take Google Lens, a visual search tool that “proactively” surfaces information about the objects around you, or Google Assistant, which, thanks to its new “continued conversation” feature, no longer needs a wake word every time.

“You open the camera and you start to see [Google] Lens surface proactively all the information instantly and it even anchors that information to the things that you see,” said vice president Aparna Chennapragada, demonstrating how the feature, which will soon be built into phones from other manufacturers, can identify everything in your friend’s apartment, down to the Zadie Smith book on her coffee table. The company told WIRED that Lens begins working when you open the camera app.

Want to catch up on the news? “What’s cool is that I didn’t have to tell the app that I follow politics, love to bike, or want information about the Bay Area, it works right out of the box,” Google’s Trystan Upstill told the crowd, while introducing personalized recommendations in Google News.

Looking to go out? “With zero work, [Google] Maps is giving me ideas to kick me out of my rut and inspire me to try something new,” said vice president Jen Fitzpatrick, introducing new restaurant recommendations that automatically pop up in Google Maps, along with a score called “Your Match” that combines Google’s machine learning “with the information that I’ve added—restaurants I’ve rated, cuisines I’ve liked, and places I’ve been to.”

Even Google’s new dashboard for digital well-being revolves around “understanding your habits,” said CEO Sundar Pichai, before introducing Dashboard on Android, which he said will give users “full visibility into how you’re spending your time: the apps where you’re spending your time, the number of times you unlocked your phone on a given date, the number of notifications you got.”

All this free personalization comes at a price: these services count on users handing over even more data about themselves and their lives, and on Google mining that data, giving the search giant more influence and control over our daily choices.

In response to questions from WIRED about data collection, the company pointed to various user controls. Google Lens, for instance, saves a small version of the image you viewed to your account, the company said. Google Assistant’s new features use artificial intelligence to determine when someone is talking to the machine, as opposed to another person, and store that audio. Both the images and the audio can be deleted using Google’s privacy settings, the company said.

Google also pointed out that personalized news suggestions appear only under the For You tab in Google News. The company announced features designed to broaden your point of view and offer multiple perspectives on a story, such as a Full Coverage feature Google demonstrated for Hurricane Maria, which has no personalization and shows updates on a story, timelines, tweets, videos, opinions, and more news.

Still, these new products served as a reminder of how much Google knows about you, and how much more it may learn. None of the parade of Google executives on stage mentioned whether additional permissions or terms would be required for these conveniences. In fact, the lack of friction was touted as a selling point. No startup offering a camera app, recommendations app, voice assistant, map, or app to stop you from wasting time on apps could offer the same level of information or personalization as Google, which keeps compounding its advantage with more data.