When Apple announced a new version of its mobile operating system in San Francisco last week, executives boasted of features such as a smarter Siri and improved copy and paste. And, as usual, they announced that software developers could download a preview version of the software ahead of its fall release.

Some security experts who inspected that new version of iOS got a big surprise.

They found that Apple had not obscured the workings of the heart of its operating system using encryption, as the company has done before. Crucial pieces of the code destined to power millions of iPhones and iPads were laid bare for all to see. That would aid anyone looking for security weaknesses in Apple’s flagship software.

Tim Cook at Apple’s developer conference in San Francisco last week.

Security experts say the famously secretive company may have adopted a bold new strategy intended to encourage more people to report bugs in its software—or may have made an embarrassing mistake. Apple declined to comment on why it didn’t follow its usual procedure.

(Editor's note: Apple later did comment, see "Apple Now Says It Meant to Open Up iPhone Code.")

The security of Apple’s software has been under additional scrutiny since the FBI attempted, unsuccessfully, to compel the company to help penetrate a device used by a perpetrator of last year’s mass shooting in San Bernardino, California (see “What If Apple Is Wrong?”). Apple has signaled that it will strengthen security and privacy features.

The heart of an operating system is a component known as the kernel, which controls the way programs can use a device’s hardware and enforces security. Apple has previously encrypted the kernel in iOS releases, hiding its exact workings and forcing researchers to find ways around or through it. But the kernel was left unencrypted in the preview version of iOS 10 released to developers last week for the most recent Apple devices.

That doesn’t mean the security of iOS 10 is compromised. But looking for flaws in this version of the operating system will be much easier, says Jonathan Levin, author of an in-depth book on the internal workings of iOS. “It reduces the complexity of reverse-engineering considerably,” he says.

The goodies exposed publicly for the first time include a security measure designed to protect the kernel from being modified, says security researcher Mathew Solnik. “Now that it is public, people will be able to study it [and] potentially find ways around it,” he says.

Some people who find software bugs disclose them to the affected companies so they can be fixed, but bugs can also be used to create malware or to develop “jailbreaks,” which are modifications not approved by Apple.

Why Apple has suddenly opened up its code is unclear. One hypothesis in the security community is that, as Levin puts it, someone inside the company “screwed up royally.” But he and Solnik both say there are reasons to think it may have been intentional. Encouraging more people to pore over the code could result in more bugs being disclosed to Apple so that it can fix them.

Jonathan Zdziarski, another iOS security expert, favors the theory that Apple acted intentionally, because accidentally forgetting to encrypt the kernel would be such an elementary mistake. “This would have been an incredibly glaring oversight, like forgetting to put doors on an elevator,” he says.

Opening up its code would make sense in light of Apple’s recent faceoff with the FBI, Zdziarski notes. Originally the agency wanted Apple to help penetrate the San Bernardino iPhone, but it dropped that plan after finding a third party who could break into the device. It was the latest evidence of an expanding trade that sells software exploits to law enforcement (see “The Growing Industry Helping Governments Hack Terrorists, Criminals, and Political Opponents”). Opening up iOS for anyone to examine could weaken that market by making it harder for certain groups to hoard knowledge of vulnerabilities, Zdziarski says.

Apple has even been accused of effectively encouraging that market because it has not been as friendly to security tip-offs from outside the company as rivals such as Google and Microsoft have. Unlike those companies, Apple does not offer “bug bounty” cash payments to people who disclose flaws they have found in its products, for example. If Apple were trying to become more welcoming to outside help, though, simply launching a bug bounty program would have been less risky than suddenly declaring open season on the iOS kernel. “This is a gamble,” says Zdziarski. “But I can see the possible reason that Apple may have decided to make this wager.”