It’s the simple bargain that made companies like Google and Facebook into giants: in exchange for the convenience of running your life from a smartphone, you hand over gobs of data on your every activity. It zips up into the cloud, where algorithms do…well, it’s hard to say exactly what, but everyone’s at it. Oh, except Apple.

Tim Cook has aggressively positioned the company as uninterested in collecting user data, and boasts that it sets Apple apart. “They’re gobbling up everything they can learn about you and trying to monetize it,” he said in a 2015 speech. “We think that’s wrong.”

“They,” of course, refers mostly to Google and Facebook, which rely heavily on cloud computing for search and recommendations and other features. Apple, on the other hand, promises to do its machine learning-powered stuff like photo searching and predicting what emoji you want right there on your smartphone or tablet.

You can see the logic here. Apple makes its money selling gadgets, not targeting ads. And denigrating competitors for monetizing your data is a handy marketing and PR tool. Who among us doesn’t want to reduce our privacy risk?

But Cook’s steadfast aversion to the cloud presents a challenge as Apple tries to build up new features powered by machine learning and AI. To build and run machine learning services you need computing power and data, and the more you have of each, the more powerful your software can be. The iPhone is beefy as mobile devices go, and it’s a good bet Apple will add dedicated hardware to support machine learning. But it’s tough for anything it puts in your hand to compete with a server—particularly one using Google’s custom machine learning chip.

Compare the photo management apps from Apple and Google to see how this can play out. Both use neural networks to parse your photos so you can search for dogs and trees and your best friend. Apple’s Photos does this entirely on your iPhone. Google Photos does it all in the cloud.

Of the two, only Apple’s app will let you search your iPhone snaps for “dog” while in airplane mode at 30,000 feet, and not having to wait while your query and the response travel across the internet can in theory make searches snappier. But Google Photos has generally been favored by reviewers (including our own) impressed by the power of the search company’s image-parsing algorithms. Local processing works great for many things, but if you want to push the envelope it's hard for a mobile device to outsmart cloud AI, says Eugenio Culurciello, a professor at Purdue University who works on hardware to accelerate machine learning. “In a server you can do so much more work in any second,” he says.

Companies that haven’t pledged cloud celibacy also have an easier time making their artificial intelligence more, well, intelligent. The most direct way to build a smart new thing to work on your customers’ data is to use lots and lots of that same data to train it, says Chris Nicholson, CEO of Skymind, a startup that helps companies use machine learning. “The more data you have the more valuable your thing gets,” he says. “Google, Amazon and others are benefiting from that and Apple is not.” It’s also easier to continuously update neural networks in the cloud, so they’re always improving, than it is to push updates to ones that reside in people’s pockets, says Nicholson. Apple has started using a technology called differential privacy to pull in some anonymized data on how people use their phones, such as your favorite emoji, but it’s unclear how broadly that can be applied.
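Apple hasn’t published the exact mechanism it uses, but the classic building block behind differential privacy is a trick called randomized response: each phone flips a coin before reporting, so no single report can be trusted, yet the aggregate statistic still comes out right. A minimal sketch of that idea (the function names and the 75/25 split here are illustrative, not Apple’s actual parameters):

```python
import random

def randomized_response(truth: bool, p: float = 0.75) -> bool:
    """Report the true answer with probability p; otherwise report a fair coin flip.

    Any individual report is deniable -- it might just be the coin.
    """
    if random.random() < p:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports, p: float = 0.75) -> float:
    """Recover the population rate from noisy reports.

    The observed rate obeys: observed = p * true + (1 - p) * 0.5,
    so we invert that to get the true rate back.
    """
    observed = sum(reports) / len(reports)
    return (observed - (1 - p) / 2) / p

# Simulate 10,000 users, 30% of whom actually favor a given emoji.
random.seed(0)
truths = [i < 3000 for i in range(10_000)]
reports = [randomized_response(t) for t in truths]
print(round(estimate_true_rate(reports), 2))  # lands close to 0.30
```

The catch is visible in the math: the noise only averages out over many users, which is one reason it’s unclear how far the technique stretches beyond simple popularity counts.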