When we first met Apple's Siri assistant years ago, we learned how to interact with it in a very specific way: we asked it questions and it tried to answer.

Now Siri is changing pretty drastically from that original vision, based on how Apple talked about it at its WWDC developers' conference on Monday.

Instead of Siri becoming "smarter" -- that is, answering more complicated questions and holding more natural conversations with humans -- it's starting to learn more about how we use our iPhones and live our lives, and then making recommendations.

It's a very different approach from what Google and Amazon are doing with the Google Assistant and Alexa.

Google's Duplex project is starting to let humans hold natural conversations with artificial intelligence, as if one person were talking to another. And you can ask the Google Assistant or Alexa most questions and get an answer much of the time.

Siri, on the other hand, doesn't seem to have as many answers, and frequently just tries to search the web.

Instead, it's learning more about you, and letting you train it.

Apple talked about Siri Shortcuts during its WWDC 2018 keynote. It's a clear result of Apple's acquisition of a company named Workflow last year. Users will be able to customize the commands that Siri responds to, and developers can build on these tools. If you say "Siri, I lost my keys," for example, it can pinpoint them using an app like Tile.