Earlier this week, striding across the stage at Apple’s annual developer conference in front of a crowd of thousands at the San Jose Convention Center, Tim Cook was animated and gushing, an evangelist for a series of new products and features.

Among them: the HomePod smart speaker (see "Apple Is Countering Amazon and Google with a Siri-Enabled Speaker") and new ways developers can build artificial intelligence into apps.

By Thursday morning, the Apple CEO was on the other coast, sitting on a gray couch next to a yellow happy-face emoji pillow at the MIT Media Lab's Affective Computing group, listening to Rosalind Picard talk about depression. Cook, who will give this year's MIT commencement address, is spending the day before the ceremony learning more about research on campus, much of it involving sensors and AI.

Picard, an expert in using wearable devices and phone data to measure human emotions, is researching how data pulled from cell phones might help identify and perhaps even predict depression, a problem expected to be the second leading cause of disability in the world by 2020.

In time, Picard hopes to predict when a person is becoming vulnerable to depression, before the illness takes hold. "We want to not just recognize, but try to forecast it," she tells Cook.

As our phones learn more about our behavior, they could play an important role in helping us track and understand ourselves, and even anticipate what we will do next.

One way to do that is by leveraging artificial intelligence. And though Apple is often called a laggard on AI compared with companies like Google, Microsoft, and Amazon, Cook argues that machine learning is already well integrated into the iPhone.