Apple’s annual Worldwide Developers Conference kicks off Monday, June 13, in San Francisco with one of the company’s signature product keynotes. (Join Recode for live WWDC coverage, beginning at 10 am PT, 1 pm ET.)

Many expect Apple to announce big new features for Siri, its AI-powered assistant. These include new tools that would finally allow other companies to make their apps and information accessible via Siri, according to a May report from Amir Efrati at The Information.

Opening up Siri — if true — would be a big step for Apple, which has tightly controlled Siri's capabilities since its debut almost five years ago. Select third-party information and features, such as Yelp restaurant reviews and OpenTable bookings, have long been available via specific partnerships. But the vast majority of iOS developers still have no direct access to Siri, which is now built into every iPhone and iPad, the Apple Watch and the latest Apple TV streaming box.

If Apple wants Siri to be taken seriously, it's about time. Many see voice-controlled assistants playing key user-interface roles in the home, in cars and on the go. And while Apple entered the field early with Siri, it now faces formidable competition from two other tech giants, Amazon and Google.

Amazon’s Alexa assistant, built into its sleeper-hit Echo speaker and a growing number of other devices, is friendly to developers and already has a rich library of "skills," ranging from playing Spotify music and controlling a variety of smart-home devices to hailing Uber rides and reordering Amazon purchases. At our recent Code Conference, Amazon CEO Jeff Bezos noted that more than 1,000 people are working on Alexa and Echo, suggesting it’s a serious endeavor and not just a hobby.

Google, meanwhile, is developing its Google Assistant software, which it recently previewed at its I/O conference using a still-in-development Amazon Echo rival called Google Home. Under new CEO Sundar Pichai, Google has made AI a huge focus, including developing a custom chip that it says has an "order of magnitude better-optimized performance per watt for machine learning." Translation: We're not messing around here, either.

Each company has its advantages and challenges, and none is the clear winner. (Not to mention Facebook, Microsoft and any startup that comes along.)

Apple is probably the best of the three at shipping complete, desirable products. It has tight control over the iPhone and other devices and has made Siri a household name. But it also tends to keep its services to itself, and its strong privacy stance may lead to weaker AI with less user data.

Google has the benefit of the world’s largest information database: Google. It also controls a huge amount of personal information about many of its users, which it can use to make more educated decisions. And now, it has custom server chips. It also builds Android, the world’s most popular operating system, though it has less control over Android devices than Apple has over iPhones.

Amazon, meanwhile, has created an unlikely hit product that works. In everyday use, Alexa is noticeably more reliable at taking voice queries than Siri. Amazon has also proven itself to be open with developers, who can add their services to Alexa or build Alexa into their devices. And it has its own enviable distribution platform — Amazon — with a built-in commerce and services-based business model.

So: How serious is Apple? Does it see Siri as a key interface for all of its products going forward? Enough to put its power in the hands of all its developer partners? In a way that’s compelling and easy enough for its users? Can it compete with Google’s data and processing power or Amazon’s momentum?

Is Siri just catching up, or does Apple have a leapfrog move? We should get a sense soon.