Way back in 2016, in the era of iOS 9, I laid out the tentpole features I wanted to see come to iOS and the Mac. Now, three years later, so many things from that wishlist have become a reality that it’s probably a good time to revisit the topics that haven’t yet come to pass, and plan a new wishlist for the years to come. I originally planned this list to have a Developer/User split, but it became clear that the two go hand-in-hand; if you’re doing complex things on iOS today, using the various automation apps, you are but steps away from needing the same things that developers do.

Xcode for iOS

Much has changed since 2016 for development on the iPad, but much still remains the same. Apple introduced Playgrounds that year, providing its very own Swift IDE for iPad. Playgrounds is fantastic, but you still cannot build and install an app with it, and you cannot mix and match C and Objective-C code with your Swift. It has no project structure, so all of your code has to live in one file, which is fine for teaching material but not for anybody wanting to create something more complex. In 2017, Apple changed the App Store rules to finally let programming apps live on iOS without fear of removal, albeit with unfortunate restrictions, like not being able to display the output of an app in more than 80% of the screen. However, there is still no way for a third-party programming app to run code out-of-process, so any user mistake can crash the app completely, and of course there’s no way to build and install an app locally using one of these IDEs.

Pythonista for iOS.

There is still an incredible need for something to bridge this gap – a programming environment on iOS that lets you design, write, build, sign, and install an application without having to resort to using a Mac. No third party can build this; Apple’s App Store rules and platform restrictions prevent anybody else from doing so. Thus, Apple has to be the one to build this ‘Xcode for iOS’ and make it as powerful as a developer might need, whilst also building the mechanisms into the OS to make it as safe and secure as possible. In 2016, many of the fundamental pillars necessary to build an Xcode like this didn’t exist in iOS, like a user-accessible file system, drag and drop, multiple windows, and floating panels – but they do now (or will shortly, if rumors are to be believed).

Terminal Environment for iOS

Much like the file system, for a certain class of user the need for a command-line environment of some kind hasn’t gone away, much as I’m sure Apple hoped it would. Now, with Apple’s own Shortcuts app, more users than ever are automating tasks on iOS – it makes perfect sense to provide something more for the power users that need it, especially if Xcode for iOS becomes a reality. After 2016, I went and built the sandboxed Terminal app I described in my post and populated it with the core BSD utilities from Apple’s Darwin as a demonstration. We’ve seen Mosh and OpenTerm, which do much the same. Now there’s iSH, which goes as far as emulating an x86 Linux environment just to provide a working shell on iOS.

A beta version of iSH for iPad.

All of these apps are sandboxed and use public APIs, but only Apple is capable of building the real thing. I shouldn’t have to install an x86 emulator on my iPad just to curl a URL and untar it, or to run ffmpeg to convert a local video file. A sandboxed Terminal doesn’t need to let you mess with the system or other apps, or provide ways to execute unsigned code or kill processes; it can live in its own jail and let you do whatever you want inside it – much the same way a GUI user can do whatever they want in Shortcuts – while maintaining the safety and security of the OS.
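To make the gap concrete, here is the kind of everyday pipeline a sandboxed shell would make trivial. This is only a sketch: a locally created archive stands in for a downloaded one, since the point is the plumbing, not the network.

```shell
# Stand-in setup: create a small tarball locally. On a real system this
# file would come from the network instead.
mkdir -p work
echo "hello from the sandbox" > work/notes.txt
tar -czf work/bundle.tar.gz -C work notes.txt
rm work/notes.txt

# The fetch-and-extract step that currently requires a Mac, or an
# emulator like iSH, would be a one-liner (URL is illustrative):
#   curl -L https://example.com/bundle.tar.gz | tar -xz

# Extract the archive and read the file back out – all inside the jail.
tar -xzf work/bundle.tar.gz -C work
cat work/notes.txt
```

Nothing here touches the system or other apps; every file lives inside the app’s own container, which is exactly the argument for why such a shell could be sandboxed safely.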

System-Level Drawing/Markup Views

Four years after Apple Pencil, Apple still provides no developer APIs for drawing, sketching, and markup. Every app has to reinvent the wheel if it wants Pencil-based drawing, and while that may suit the handful of developers who have invested a lot of time and effort into their own drawing engines, it excludes every other developer who happens to have a good idea for integrating sketching into their own app.

Apple has provided sample code for this in the past – sadly, OpenGLES-based – but if you want anything more modern you’re left to scour GitHub. Building a drawing engine that looks good and feels responsive at ProMotion’s 120Hz is incredibly difficult, yet Apple has a great drawing framework in the OS, used by the Notes app, that would be perfect to offer to developers.

There are several apps sitting on my shelf that would benefit greatly from a built-in API for a drawing view; I hope Apple gets around to this sooner rather than later.

Custom View Controller and Non-UI Extension Providers

Extensions have powered so many new APIs in iOS and macOS, completely obviating the plugin model with a robust out-of-process signed and sandboxed mechanism. However, developers cannot define their own extension points, and thus cannot use them to empower their own apps. A programming environment for iOS should be able to run its code in an extension – a separate process that can be securely scoped to just the task it’s supposed to do, and if it crashes it won’t bring down the host app with it. This is, of course, exactly how Apple’s Playgrounds app works, but it isn’t something third parties are allowed access to.

Similarly, you should be able to define your own UI-based extension point such that other apps can implement an extension that would show up inside your app. If you’ve ever used the Audio Unit extension point, this is exactly what it lets you do – presenting the custom UIs from various instrument or audio processing apps you have installed, in a region inside your own app’s UI – but only if you’re an audio app using CoreAudio.

There are many ways developers could innovate by providing their own extension points; in fact, much of the custom URL ecosystem on iOS sprang up precisely because there’s no standardized way for apps to talk to each other or use each other’s functions. Apple’s own Shortcuts app, née Workflow, was born out of that custom URL ecosystem, and Apple saw such potential in it that it acquired Workflow and built it into the OS. Your own extension point could be exposed as an action to the Shortcuts app, letting it be called transparently as part of a workflow, with or without UI, instead of the do-si-do between apps that custom URLs involve today.

Key Up/Key Down Events

It seems crazy that in 2019 you are still unable to track raw keyboard events on iOS – there is no way, barring private API, for an app to let you hold down physical keys as input (like the WASD keys in a game), or use them as modifiers (like holding Shift while resizing something in an app like Photoshop to maintain aspect ratio). A developer only knows when a key has been pressed or a shortcut invoked, never when a key is released. This restriction seems so pointless today, and incredibly limiting, affecting everything from professional creative apps to games. iPad needs robust hardware keyboard support, and shouldn’t be chained to a restriction formulated a decade ago for a very different world.
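For context, the only public hook UIKit offers here is UIKeyCommand, which fires on the way down and never on the way up – a minimal sketch of the limitation (class and selector names are illustrative):

```swift
import UIKit

class GameViewController: UIViewController {
    // UIKeyCommand is the extent of public hardware-keyboard support:
    // each command fires once when pressed, with no matching release
    // event, so held-key movement (e.g. WASD) cannot be implemented.
    override var keyCommands: [UIKeyCommand]? {
        return ["w", "a", "s", "d"].map {
            UIKeyCommand(input: $0, modifierFlags: [],
                         action: #selector(keyPressed(_:)))
        }
    }

    @objc func keyPressed(_ command: UIKeyCommand) {
        // We learn the key went down – we never learn when it came up.
        print("Key down: \(command.input ?? "?")")
    }
}
```

A real key up/key down API would replace this single callback with a paired press/release pair, which is all a game or pro app actually needs.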

Mouse Support and API

In much the same way, it’s time for robust mouse and trackpad support on iOS. With UIKit coming to the Mac, the framework has had to add support for things like right-click, hover, and scroll bars; why not bring this to iPad too, so that users can benefit from it if they choose, or if their workflows demand it?

While controlling the UI with absolute coordinates is an important function of the mouse, let’s not forget too that mouse-capturing and relative movement is essential for games, remote desktops, and emulators. UIKit thus needs to let you capture the mouse cursor for your own needs, and not just to click things onscreen. Combine this with robust keyboard support and you would be able to play games like first-person shooters on iPad just like you can on a Mac or PC. Quake 3, anyone?

Android has supported mice for nearly a decade, and that support has done nothing to lessen the touch experience, so there’s no need to worry it would do so on iOS. For users or workflows that truly want or need a mouse, iPad will always be a non-starter until it supports one. Time for that barrier to go away.

Larger iPads and External Touch Screen Support

There are times, however, when the computer I want on my desk is a 30” iOS drafting table. iPad is essentially a blank canvas – truer now that it has no front-facing buttons – and a bigger canvas begets entirely new experiences. I am dying to see iOS scale to desktop-sized workflows, with several apps onscreen at once. If not a desktop iOS device itself, why not an Apple-quality large external touch screen?

I would love a 15” iPad, too: I haven’t used a Mac laptop since 2013 – iPad has completely obviated that form factor as it grows ever more powerful – and I can’t imagine ever going back, but for people like me Apple needs to offer an even bigger iPad than the 12.9” model. The 12.9” iPad Pro already gets custom UIs with expansive layouts and three-column views, and UIKit on the Mac will push developers to create apps and layouts that can scale to 27” screens anyway. I’m deeply envious of Microsoft’s Surface Book (that is, of course, until you turn it on), and something along those lines running iOS would suit my needs incredibly well.

Expanded USB Device Support for iOS

MFi might be gone with the USB-C iPad Pros, but developers need public APIs to write user-mode drivers for anything you might wish to plug into your iPad. I want to be able to plug in my various EyeTV tuners and have the EyeTV app happily initialize them like it does on the Mac. I want my Game Capture HD60 to work on iOS, so I can record footage from my gaming PC and actually edit and render it on the fastest computer in my house (the 2018 iPad Pro). I want to plug in my Raspberry Pi’s FTDI cable and view its serial output on my iPad without buying crazy MFi-based serial adapters. If I, for whatever workflow I might have, need to burn a CD or DVD, I should be able to plug in a disc drive and do so using any app designed for the task. This should “just work” in a way iPads simply can’t today.

Read/Write External Drives through the Files App

By now, this is on everybody’s wishlist. Need I say more? It is so long overdue that I can’t imagine Apple holding off much longer. But I’d like to go further…

Format/Partition External Volumes and Read/Write Disk Images

I don’t just want to read my drives: I need to be able to manage them, too – erase and partition volumes, understand multiple file systems, image a disk to a file, or apply a disk image back to a disk.

The Mac’s pervasive disk image support is genuinely one of its crowning achievements, and as a result I have a ton of disk images from two decades on macOS. I use disk images every day; I create CD and floppy images to pass files to and from VMware and QEMU. If user file system access is to become a core part of using iPad, we need the rest too. If I choose to never use a Mac again, Apple is effectively telling me that all my old data is lost.

Scripting

“AppleScript for iOS” was one of the items on my 2016 list, but the Apple automation landscape has shifted dramatically since then. Apple acquired Workflow, now Shortcuts, and the perception is that AppleScript may not be long for this world, slowly pushed out in favor of a sandboxed, secure, and modern extensions mechanism employed by Shortcuts. Scripting is still incredibly important, as evidenced by the hundreds of Shortcuts workflows used by anybody who takes iOS seriously for work these days. So if not AppleScript, then what?

Scriptable for iOS.

What I really want to see is a textual interface to Shortcuts that lets you do all the same things without having to navigate and fiddle with a UI filled with actions, so that the class of advanced user who prefers writing scripts can do the things they need. Scripts need a way to run “silently” without presenting a UI onscreen or jumping between apps. And scripting should be extended to the UI layer, letting developers build richly scriptable apps like they can today on the Mac with AppleScript. It is easy to envision Shortcuts scripting using JavaScript or Swift, and Shortcuts already has a scripting action that lets you run JavaScript on a webpage.
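What such a textual Shortcuts might feel like is easy to sketch. To be clear, none of the action names below exist – they are hypothetical stand-ins for real Shortcuts actions, stubbed out here so the script actually runs:

```javascript
// Hypothetical textual Shortcuts sketch. The `actions` object is a
// stub standing in for a real action library – every name in it is
// invented for illustration.
const actions = {
  getLatestScreenshots: (count) =>
    Array.from({ length: count }, (_, i) => `IMG_${1000 + i}.png`),
  resize: (image, width) => `${image}@${width}w`,
  saveToFiles: (images, folder) => images.map((img) => `${folder}/${img}`),
};

// The workflow reads top to bottom, exactly like a Shortcuts action
// list, but without any UI to drag around:
const shots = actions.getLatestScreenshots(3);
const resized = shots.map((s) => actions.resize(s, 1024));
const saved = actions.saveToFiles(resized, "Shortcuts/Resized");
console.log(saved);
```

A script like this could run “silently” in the background, with each stub replaced by the same sandboxed action extensions the graphical editor already calls.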

Virtual Machines

It’s hard to see Apple supporting virtualization on iOS, as virtual machines require things like Just-In-Time compilation to execute arbitrary code in memory, which violates one of the core foundations of the iOS security model and burns through battery life like few other tasks. However, Apple ships WebKit on iOS, which requires those same special security exceptions to execute code in memory, and on macOS provides Hypervisor.framework, a lightweight virtualization system that lets developers build virtual machines easily.

Running mini vMac on iPad Pro.

I’m the kind of user who would love to see Hypervisor for iOS; let companies like VMware and Parallels bring their expertise to the iPad, and offer approved ways to run ARM-based Linux or Windows (or, in a couple of years, perhaps macOS). x86 emulation may be out of the question for now, but perhaps that won’t always be the case; I’m sure I’m not the only developer with a library of VMs for everything from DOS to NEXTSTEP to older versions of macOS and the iPhone SDKs. I can, of course, use these VMs on iOS today, with open-source apps like Bochs or mini vMac sideloaded onto my device, but because they can’t JIT, they have to run entirely via interpreted CPU emulation, which is significantly slower and burns more battery.

Entitlements

Finally, that brings me to entitlements. The entitlement system in iOS is what gives Apple fine-grained control over which developers can access which features: if you wish to use iCloud in your app, your app must be signed with an iCloud entitlement, with similar requirements for Health, Home, Apple Pay, and many other parts of iOS. CarPlay, for example, needs a special entitlement that isn’t given freely to developers – in fact, you have to apply to Apple and get your app idea approved before they’ll even let you test the feature in your app. If an app or developer is ever found to be abusing their privileges, their entitlements can be revoked by Apple and the app remotely disabled. Thus, entitlements give Apple a way to entrust developer partners with special access to features without opening those features up to misuse by everyone else.
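Mechanically, entitlements are just key/value pairs in a property list that gets baked into the app’s code signature at signing time, so they can’t be tampered with after the fact. A minimal sketch (the entitlement keys are real ones; the container identifier is illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Granted to any developer who enables the capability -->
    <key>com.apple.developer.healthkit</key>
    <true/>
    <!-- Push notification environment, fixed at signing time -->
    <key>aps-environment</key>
    <string>development</string>
    <!-- iCloud access is scoped to specific, named containers -->
    <key>com.apple.developer.icloud-container-identifiers</key>
    <array>
        <string>iCloud.com.example.MyApp</string>
    </array>
</dict>
</plist>
```

A hypothetical ‘run background processes’ or ‘attach debugger’ entitlement would slot into exactly this mechanism, granted case-by-case rather than to every app on the store.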

With that in mind, Apple could entrust the likes of Google, Microsoft, or Mozilla with the entitlements they need to use their real browser engines on iOS instead of WebKit – real Chrome, real Firefox. VMware and Parallels could be entrusted to build virtual machines or emulators without leaving this open as an attack vector for malicious third-party apps. Disk utilities could be permitted to partition disks; IDEs could be permitted to run background processes, install apps, or attach a debugger to running apps. Many of these things, given freely to all developers, would arguably make iOS a much less safe place (read: just as powerful as a desktop computer), but with the entitlement mechanism in place Apple could keep the control it wants and not let things get out of hand. Seeing past the inter-company politics, iOS is going to need ways to do all of these things eventually, especially if the iOS app ecosystem is to supplant the Mac app ecosystem in due course. A Mac without the ability to build and install apps, or attach a debugger, would be unimaginably crippled.

We’ve come a long way from the fear that enabling third-party apps on iPhone would bring down the cell networks; trying to actively build the future on iOS today is like having your hands tied behind your back. iOS has for too long relied on the fact that the Mac exists as a fallback to perform all the tasks that Apple isn’t ready to rethink for its modern platforms, but that doesn’t mean these problems aren’t relevant or worth solving. This has left us in a situation where iOS moves forward with new ideas while the Mac stands still, needing to keep compatibility with the iOS ecosystem whilst walking the line between keeping things as they are and trading the freedom and power of the old systems for the active development and enthusiasm of the new. The correct path forward is not to simply revert to the mechanisms available on the desktop, with all the baggage that comes with them, but to rethink all of these things to fit a modern, secure world.

iOS has for too long relied on the fact that the Mac exists as a fallback.

iOS is exponentially better with a working Files app, with drag and drop, with automation, background tasks, and split-screen multitasking. iPad too, with a stylus and hardware keyboard.

What iPad does so well is completely hide its complexity from the user who doesn’t need to know about the mechanics of the system beyond tapping an app to open it, and swiping the Home indicator to close it. All of these things, added to iOS, haven’t made the OS harder to use. They’re so transparent that I’m sure most users don’t even know they exist, but for the users who do need them they have become essential tentpoles of the iOS experience.

With UIKit on the desktop, it’s time to revisit just what an iOS app can and can’t do; after all, they’re no longer “iOS apps”: they’re just “apps” now.