Security researcher Charlie Miller, well known for his Pwn2Own exploits involving Safari and MobileSafari, has discovered yet another serious security flaw in Apple's iOS mobile operating system. The bug could potentially let any app download and run unsigned code, though it appears that Apple has a fix in the works. In the meantime, however, Miller's proof-of-concept app—originally approved by Apple and available until late Monday—has earned the researcher a one-year suspension from Apple's developer program.

News of the security flaw was first reported by Forbes on Monday afternoon. According to the report, Miller plans to reveal the issue in a presentation at the SysCan security conference in Taiwan next week. As part of his presentation, Miller created an app capable of exploiting the flaw, and uploaded it to the App Store. Though App Store staff discovered a few problem APIs in the app, they didn't notice Miller's use of a special memory area, which allows his app to run unsigned code.

Ars spoke to Miller to understand the bug and its implications. In particular, he noted that this should make iOS users wary of apps from unknown or untrusted developers. "Until the flaw is fixed, you can't really trust what's coming from the App Store," Miller told Ars.

Crack in the sandbox foundation

iOS is designed to only run code that is digitally signed by the developer. Developers are given special security certificates from Apple when they join the App Developer Program, and when developers have an app ready to submit to the App Store, they use these certificates to digitally "sign" the code, confirming it comes from a trusted source. Apple then puts apps through a vetting process, which attempts to confirm that apps don't use nonstandard APIs or attempt to use user data in an unscrupulous way. When you download an app from the App Store, then, you should be confident that the app is safe.

But in iOS 4.3, Apple introduced a mechanism to allow exceptions to this hard and fast "signed code only" rule. To improve the performance of MobileSafari, Apple added an improved JavaScript engine called Nitro. First introduced in Safari on Mac OS X, Nitro works by first analyzing JavaScript code for a webpage, and then compiling it "just in time" into optimized native code.

"This code hasn't been signed, so there has to be a mechanism to relax those restrictions," Miller said. Normally, iOS's kernel won't let apps allocate memory that is writeable and executable. Either memory is allocated as writeable—able to store data—or it's executable—able to store signed instruction code. However, iOS 4.3 introduced "sandboxing entitlements," special exceptions granted on a very limited basis, to allow things like Nitro's JIT JavaScript compilation to work. In iOS 4.3 and later, MobileSafari has an entitlement called "dynamic code signing."

(John Siracusa wrote an excellent explanation of sandboxing entitlements as they are used in Lion.)

"MobileSafari is allowed to have a single special region of memory to write JIT code to memory and allow it to execute," Miller explained. "Only MobileSafari is supposed to have this." Miller said that even this entitlement is well-protected. If MobileSafari were hacked, it couldn't create an additional executable area of memory, and it couldn't affect other apps outside of its sandbox.

The problem that Miller discovered is actually a flaw in the part of iOS that checks to make sure that only MobileSafari has the special ability to create an area of memory that is both writeable and executable. "That allowed my app to create its own special area of memory to download and run unsigned code."

Miller discovered the bug several months ago while researching iOS 4.3. At the time, he was busy with other research, including discovering a way to hack laptop batteries. But by September, he had fully exploited the flaw and had gotten a proof-of-concept app that took advantage of it into the App Store. According to Miller, that app was downloaded by quite a few people before Apple pulled it on Monday, though he said only his copy is configured to download code from his server.

Part of the reason he waited to publicize the issue is that he wanted to see if Apple had fixed it in iOS 5. According to Miller, it did not. "In iOS 5, it's still there," he told Ars.

Patch in the works

Miller alerted Apple about the weakness three weeks ago. The company acknowledged it and asked how Miller would like to be credited in the security bulletin that accompanies most iOS release notes. "I'm sure it is something they will fix quickly," Miller noted, suggesting the fix would likely appear before his presentation in Taiwan. "That's what one would hope they would do. I'm sure they are also working on code fixes for the battery draining issue and stuff that they are going to release [a] patch for."

One thing Miller did not tell Apple, however, is that he had an app in the App Store that took advantage of the flaw. A few hours after the news broke, Miller received an e-mail from Apple noting that his developer program access had been revoked for a period of one year for violating its terms of service. He called the move "heavy-handed," noting that Apple gives security researchers free access to the dev program for the purpose of discovering flaws. "For the record, without a real app in the App Store, people would say that Apple wouldn't approve an app that took advantage of this flaw," he wrote on Twitter. Clearly, however, Apple did approve such an app.

While publicizing the flaw means that other hackers might be able to exploit the same bug, Miller told Ars that it's "pretty easy to check for my little trick," so it's likely that the App Store review team will be looking for strange memory allocations. And, Miller said, "at least now people will know to be more careful until Apple is able to patch."