The question I’ve been asked most frequently by reporters in the last couple of weeks is whether Apple’s Siri personal assistant is behind its competitors, and whether it can catch up. The answer is more complicated than a simple yes or no, and some context is important for properly evaluating the announcements at Apple’s developer conference next week.

First, many comparisons at the moment are of apples and oranges (no pun intended). We’re at a particular point in the calendar where all the other major consumer tech companies have already held their developer events this year, so we’re comparing their 2016 products (including announcements of products that aren’t yet available) to Apple’s versions from 2015.

Moreover, many of these new products aren’t actually available yet, and won’t be for months. So evaluating Apple’s position in digital assistants (and artificial intelligence more broadly) today makes a lot less sense than it will this time next week, when we know what Siri will look like in the second half of 2016.

If past patterns continue, at least some of the new features will be available to developers almost immediately, to participants in Apple’s iOS beta program shortly after, and to everyone with an iPhone in September.

Three components to digital assistants

Even though people talk about voice-based assistants in a unitary fashion, there are really three main components. They are:

Voice recognition, or turning sounds into individual words;

Natural language processing, or turning collections of words into phrases and sentences and properly identifying their meaning;

Serving up responses from a cloud service based on the set of things it’s capable of doing.
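As a rough illustration, the three components can be thought of as stages in a pipeline: audio in, words out, meaning out, response out. The sketch below is purely hypothetical — none of the function names or behaviors correspond to any vendor’s actual system.

```python
# A toy sketch of the three stages of a voice assistant.
# All names and behaviors here are illustrative, not any vendor's real API.

def recognize_speech(audio: bytes) -> str:
    """Stage 1: voice recognition -- turn sounds into individual words.
    A real system would run acoustic and language models here."""
    # Placeholder: pretend the audio decoded to this transcript.
    return "what movies are showing tonight"

def parse_intent(text: str) -> dict:
    """Stage 2: natural language processing -- identify the meaning
    of the whole phrase, not just the words."""
    if "movies" in text and "showing" in text:
        return {"intent": "movie_listings", "when": "tonight"}
    return {"intent": "unknown"}

def serve_response(intent: dict) -> str:
    """Stage 3: fulfill the request from the set of services the
    assistant is capable of using."""
    handlers = {
        "movie_listings": lambda i: f"Here are showtimes for {i['when']}.",
        "unknown": lambda i: "Sorry, I can't help with that yet.",
    }
    return handlers[intent["intent"]](intent)

def assistant(audio: bytes) -> str:
    # The full pipeline: stage 1 feeds stage 2 feeds stage 3.
    return serve_response(parse_intent(recognize_speech(audio)))
```

The point of separating the stages is that an assistant can be strong at one and weak at another — which is exactly how the comparisons below break down.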

Today, Apple’s Siri is decent but not stellar on the first two points. Alphabet unit Google’s voice search and Amazon’s Echo device both do a better job of recognizing individual words and of ascribing meaning to phrases and sentences. The gap isn’t huge, but it’s noticeable. Microsoft’s Cortana performs roughly on par with Siri in my experience.

On the third point, Siri has expanded the range of tasks it can perform, but it’s still limited mostly to things Apple’s services can perform, with a handful of third-party services feeding in data on particular topics.

Google’s voice search can pull in a little more third-party data and has a much wider range of first-party data to pull from, while Amazon’s Alexa assistant has an open API that’s resulted in connections to many third-party services, though a large number are from tiny companies you’ve never heard of. Alexa, for example, can control your Nest thermostat or order an Uber, which Siri can’t do. Cortana is, again, roughly on par with Siri.

On balance, then, Siri is roughly in the same ballpark as competitors, but lags slightly behind in all three of the key areas versus both Google and Amazon.

Though it’s not yet available, the next generation of Google’s digital assistant, called simply “the Google assistant,” will be able to respond to text as well as voice queries and engage in conversations with users. This is a capability Cortana already has but others, including Siri, lack. It should roll out to users over the next few months, but it’s hard to evaluate how effective it will be based on keynote demos alone.

How might Apple close the gap?

Apple has been continually improving its voice recognition, and although we’ve seen the fewest concrete rumors ahead of WWDC in this area, I would expect it to talk up further improvements.

In natural language processing, Apple has recently acquired a variety of companies with expertise in artificial intelligence. Among them is VocalIQ, which specializes in conversational voice interactions. I expect significant improvements in natural language processing to be announced at WWDC, including multi-step conversations — for example, asking Siri which movies are showing, where the one you want to see is showing, what the show times are and then booking tickets, without having to repeat information each time. That should move Apple forward in a big way.
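What makes multi-step conversations work is carried-over state: the assistant remembers slots (the movie, the theater) filled in earlier turns so the user never repeats them. Here is a deliberately naive sketch of that idea — the class, slot names, and responses are all invented for illustration and bear no relation to Apple’s or VocalIQ’s actual implementation.

```python
# Hypothetical sketch of conversation state carried across turns.
# Everything here is illustrative; slot filling in real systems is
# driven by learned models, not string matching.

class Conversation:
    """Remembers slots (e.g. the movie) between turns, so follow-up
    questions don't need to restate information already given."""

    def __init__(self):
        self.slots: dict[str, str] = {}

    def handle(self, utterance: str) -> str:
        if utterance.startswith("which movies"):
            return "Showing tonight: Movie A, Movie B."
        if utterance.startswith("where is "):
            # Remember which movie the user cares about.
            self.slots["movie"] = utterance[len("where is "):].strip()
            return f"{self.slots['movie']} is playing at the Downtown 6."
        if "show times" in utterance:
            # Reuses the movie remembered from the previous turn.
            return f"Show times for {self.slots['movie']}: 7:00 and 9:30."
        if "book" in utterance:
            return f"Booked tickets for {self.slots['movie']}."
        return "Sorry, I didn't catch that."
```

Without the `slots` dictionary, the third and fourth turns would have to ask the user “which movie?” all over again — which is the repetition today’s Siri forces.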

The biggest thing holding Siri back right now is its lack of third-party integrations, especially the inability for developers to make functionality in their apps available to Siri. Were that to change at WWDC — a Siri API seems likely — that would dramatically improve the utility of Apple’s voice assistant.

I’ve so far focused mostly on the voice aspects of these digital assistants, but Apple added other elements last year and might continue to build on them this year.

In 2015, it introduced Siri’s Proactive elements, which serve up contacts, apps, news, and other content proactively through notifications and in the Spotlight pane in iOS. The main area where I’d like to see more functionality is text interactions with Siri. That could happen either in the standard Siri interface or through iMessage, such that Siri would appear as just another contact you could exchange text messages with.

Apple could even open up iMessage as a platform for bots and conversational user interfaces from third parties, which would help Apple keep pace with announcements in this area from Facebook, Microsoft, and Google.

Bear in mind that these digital assistants are only useful when they’re available. Amazon’s Echo device does very well where it’s present, but its biggest weakness is that Amazon has sold only around three million devices, and its Alexa assistant isn’t available on phones, the devices we carry everywhere with us. Google’s assistant is pervasive, available on Android and iPhone, on the web, and elsewhere, while Siri is available in some form on most of Apple’s devices (with the exception of the Mac). Cortana is available on PCs running recent versions of Windows, but its availability on phones does little to help, since so few Windows phones are in use.

If Apple extends Siri to the Mac at this year’s WWDC, another credible rumor, it will make it even more ubiquitous in the lives of those committed to the Apple ecosystem.

Changing the narrative

Ahead of WWDC, Apple faces a growing narrative that suggests it’s falling behind in both artificial intelligence generally and in digital assistants specifically. Given the quirks of the calendar, Apple has naturally been silent as others have revealed their 2016 plans, so this comparison is partly unfair.

But Apple has a chance during its developer conference to demonstrate that it’s committed to not just keeping up but establishing leadership in these areas. By Monday afternoon, we’ll be in a much better position to judge whether it’s been successful in changing the narrative.

Jan Dawson is chief analyst of Jackdaw Research. You can follow him on Twitter @jandawson