Retina and HiDPI

The Mac has been trying to divorce its user interface from the tyranny of the pixel grid for a long time now—so long that, each time I tried to start this section of the review, I found myself using phrases and metaphors I'd already used in earlier reviews. The road has been long and winding (see? I just did it again), but last year OS X finally dedicated itself to following in the footsteps of iOS yet again by abandoning its attempts to support arbitrary scale factors and adopting a strict 2x scaling factor, called "HiDPI" mode.

What Lion lacked was a Mac with a retina-density display upon which to strut its stuff—oh, and a boatload of double-resolution user interface graphics to keep things from looking janky. Well, now it's here: the 15-inch Retina MacBook Pro. And sure enough, it shipped with Lion, albeit a special build of OS X 10.7.4 (11E2068, later updated to 11E2620).

Mountain Lion, of course, inherits the same abilities. Here's a celebratory screenshot of our old standby, TextEdit, in glorious retina resolution.

I mention this here partly to put a cap on the pixel-density saga, and partly because of the surprising way that Apple brought retina support to OS X. Last year, all signs pointed to a "retina Mac" with a strict doubling of screen resolution, while keeping all parts of the user interface roughly the same physical size. The simplicity of this approach is what allowed iOS to so seamlessly transition to retina displays.

But open the "Displays" preference pane on a Retina MacBook Pro and you'll find five resolution choices. What happened to a single scale factor of 2x?

As far as applications are concerned, at any given time they are doing their drawing with one of two possible scale factors, 1x or 2x. The same goes for bitmapped graphics, which must be provided in 1x and 2x sizes. OS X provides the user with five different choices for screen resolution not by adding more scale factors that applications must deal with, but by using "virtual" screen resolutions that are non-integer multiples of the native LCD panel resolution.

For example, the rightmost, highest-resolution choice offered in the "Displays" preference pane on a 15-inch Retina MacBook Pro creates a virtual 3840x2400 pixel screen upon which retina-savvy applications can draw using a 2x scale factor, effectively making it a double-density 1920x1200 display. The resulting 3840x2400-pixel screen image is then scaled down to the actual native LCD screen resolution of 2880x1800.
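The arithmetic is simple enough to sketch in a few lines. (This is an illustration, not Apple's code; only the numbers for the 1920x1200 mode come from the description above.)

```objectivec
#import <Foundation/Foundation.h>

int main(void) {
    // The "looks like" resolution the user picked in the Displays pane.
    int lookLikeW = 1920, lookLikeH = 1200;

    // Retina-savvy apps draw at 2x into a virtual screen twice that size.
    int virtualW = lookLikeW * 2;   // 3840
    int virtualH = lookLikeH * 2;   // 2400

    // The virtual screen image is then scaled down to the native panel.
    int panelW = 2880, panelH = 1800;
    double scale = (double)panelW / virtualW;   // 0.75

    NSLog(@"%dx%d virtual screen scaled by %.2f to fit %dx%d",
          virtualW, virtualH, scale, panelW, panelH);
    return 0;
}
```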

In fact, only one of the five resolution choices actually drives the display at an integer multiple of its native resolution: the default "Best (Retina)" option, which specifies a double-density 1440x900 display. Every other mode is drawn onto a virtual screen that is then scaled up or down to fit on the physical display.

Tech nerds reading this may be recoiling in horror at the thought of running an LCD panel in a non-native resolution, but Apple has found a loophole here. Yes, scaling a large virtual screen image down to fit on a smaller screen necessarily discards information. And yes, scaling a small virtual screen image up to fill a larger screen stretches and blurs the image. But it turns out that having 220 pixels per linear inch of screen space can hide many, many sins. Though the "native" (2x) 1440x900 resolution definitely does look the best, the other resolutions don't look bad at all.

The only downside to this approach is that it really pushes the limits of what available graphics hardware can support. The highest resolutions incur a significant performance hit for even simple operations like scrolling. Mountain Lion helps a bit here by providing applications with ways to reduce the number of trips a pixel must make between the CPU/RAM and the GPU/VRAM (e.g., Safari 6 uses Core Animation to increase scrolling performance).

Apple also reportedly wrote its own screen-image-scaling routines for both the embedded Intel HD 4000 and the discrete NVIDIA GeForce GT 650M GPUs in the Retina MacBook Pro, all to ensure an exact match between the output of the two GPUs as the OS dynamically switches between them as needed. Both of these issues are short-term problems, though; in time, Intel and GPU makers will catch up with the demands of a post-retina world.

There's one final wrinkle. Back in 2007, Apple effectively said goodbye to Carbon, the transitional API for porting classic Mac OS applications to OS X, by not providing support for 64-bit Carbon applications. Today, nearly all Mac applications are 64-bit, and therefore not Carbon. Even Photoshop has left Carbon behind. Still, a few stragglers remain.

This is relevant to the topic at hand because Carbon applications are forced to draw at a 1x scale factor in versions of OS X that support retina displays. The resulting images are then scaled up as needed to maintain the correct size on a double-density display.

In other words, while Cocoa applications get sharper text and more detailed graphics on a retina display, Carbon applications just look blurry. While this may also be true of Cocoa applications that don't correctly take advantage of retina displays, those applications can be fixed with a few small code changes. For Carbon applications, the only available option is a wholesale conversion to Cocoa. But then, that writing has been on the wall for five years now.

Overall, I'd call this a happy ending for high-resolution display support in OS X. Users get several resolution choices, while developers only need to deal with two. Over the next several years, I expect retina displays to sweep across the entire Mac product line. Finally, the OS is ready.

Scene Kit

From the very start, Mac OS X has been defined by its visual effects. Its fully composited display system enabled the pervasive use of transparency, animations like the Dock's genie effect, and subtle details like cross-fades and zoom animations.

Apple led the charge with the visual effects in its own applications. In the early years of Mac OS X, third-party developers that wanted their Mac applications to look as good as Apple's had their work cut out for them. While Apple had plenty of graphics experts on staff to add animations to even lowly applications like System Preferences, independent Mac developers of non-graphics-focused applications were unlikely to have access to this kind of expertise.

In 2007, Apple introduced Core Animation as part of Mac OS X 10.5 Leopard. Core Animation unified all of Apple's disparate drawing APIs within a single layer-based model. Video, images, text, standard controls like buttons and checkboxes, and even Quartz Composer scenes could all be mixed freely in a Core Animation layer. The layers were heavily GPU-accelerated, and therefore had excellent performance.

The big win for developers was the ease with which Core Animation layers could be animated. No deep knowledge of graphics programming was required. The word "OpenGL" needn't be read or spoken. To animate something in this new world, just set a few properties to indicate the target state (e.g., position, size, opacity) and the Core Animation engine, running in a separate thread that you don't have to know or care about, will do all the heavy lifting to perform a GPU-accelerated animation from the current state to your target state. Et voilà—the democratization of 2D animation on the Mac platform.
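For the curious, that code takes roughly this shape. (A sketch only; it assumes a CALayer named "layer" that's already hosted in a layer-backed view.)

```objectivec
#import <QuartzCore/QuartzCore.h>

// Implicit animation: wrap property changes in a CATransaction and let
// Core Animation interpolate from the current state to the target state.
// "layer" is assumed to be a CALayer already attached to a layer tree.
[CATransaction begin];
[CATransaction setAnimationDuration:0.5];
layer.position = CGPointMake(200.0, 100.0);  // slides to the new position
layer.opacity  = 0.25;                       // fades to quarter opacity
[CATransaction commit];
```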

But what about 3D? Apple has certainly dabbled in 3D user interfaces over the years. Most of these have actually been 2D planes arranged and animated along 3D paths (e.g., the screen rotation animation during Fast User Switching). But what if the developer of, say, a scrapbook design application wants to show a 3D version of the finished scrapbook, complete with cover art and animated turning pages?

Suddenly, we're back to the days before Core Animation, when graphical richness required deep subject-matter expertise. Modeling, texturing, and lighting a 3D scrapbook is quite a job. But even if that part could be contracted out, the developer would still have to figure out a way to pull that 3D model into his application, display it, and then make it react to user input. Interactive 3D graphics are usually the realm of game developers, not scrapbook application developers. Adding such a feature to a scrapbooking application would be like starting an entirely new, 3D-game-like application within the existing one.

Enter Scene Kit, a new OS X framework for manipulating 3D scenes. The task of actually creating the 3D scene still falls to the developer (or the graphic designer he hires). But once that scene has been created and exported in DAE format, it can be read and understood by any Mac application using Scene Kit. (Even Quick Look understands DAE files in Mountain Lion.)

It's a bit facile to call Scene Kit "Core Animation for 3D," but that's definitely the gist of it. It provides a high-level Objective-C API for manipulating a 3D scene: camera position, lighting, material properties, vertices, surface normals, everything. The code to do this looks a lot more like Core Animation than OpenGL. Here's an example of loading a 3D scene, picking out an object, and applying a new texture to it.

// Find the .dae file
NSURL *url = [[NSBundle mainBundle] URLForResource: @"my_scene" withExtension: @"dae"];

// Create the scene
NSError *error = nil;
SCNScene *scene = [SCNScene sceneWithURL: url options: nil error: &error];

// Find the node we want
SCNNode *node = [scene.rootNode childNodeWithName: @"my_node" recursively: YES];

// Get the current material
SCNMaterial *material = node.geometry.firstMaterial;

// Set the "diffuse" property of the material to an image
material.diffuse.contents = [NSImage imageNamed: @"my_texture"];

This may look like crazy gibberish to you, but rest assured that the equivalent OpenGL code would be many times longer.

The code for animations is similarly straightforward, and even more Core-Animation-like.

[SCNTransaction begin];
[SCNTransaction setAnimationDuration: 2.0];

// Change properties
node.opacity = 0.2;
node.light.color = [NSColor redColor];

[SCNTransaction commit];

I think even non-programmers can understand what's going on here. Any Mac programmer who has used Core Animation should be right at home. You can even use the various CA*Animation classes from Core Animation itself to animate 3D properties explicitly.
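A sketch of that explicit style, using CABasicAnimation to spin a node around its y-axis (assuming "node" is an SCNNode pulled out of a loaded scene, as in the earlier listing):

```objectivec
#import <SceneKit/SceneKit.h>

// Explicit animation: build a Core Animation object and attach it to a
// Scene Kit node. "node" is assumed to be an SCNNode from a loaded scene.
CABasicAnimation *spin = [CABasicAnimation animationWithKeyPath:@"rotation"];
spin.toValue = [NSValue valueWithSCNVector4:SCNVector4Make(0, 1, 0, M_PI * 2)];
spin.duration = 3.0;
spin.repeatCount = HUGE_VALF;  // spin forever
[node addAnimation:spin forKey:@"spin"];
```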

Like video, images, text, and other forms of graphical content, a Scene Kit 3D scene can be displayed inside a Core Animation layer. Furthermore, Core Animation layers can be integrated into Scene Kit scenes. Want to play a video on one face of a rotating cube while a Quartz Composer slide show plays on the other five, all while three light sources zip around like fireflies? Scene Kit can do that.

Apple demonstrated Scene Kit to developers by creating a photo-viewing application that used a rotating set of 3D picture frames to display its images. The scene file contained the textured and lit picture frames sitting on a table. The application code took over from there, reaching into the scene to apply images as textures "behind the glass" in each picture frame, rotating the camera, dimming the lights, and generally looking amazing with an extremely small amount of code, none of which required any particular knowledge of low-level graphics libraries.

For those who do happen to know a little about graphics programming, Scene Kit provides various places where developers can add their own OpenGL and GLSL code. Though I don't expect any top-tier, performance-sensitive 3D games to be built using Scene Kit, it is entirely plausible that casual 3D games could be built by developers with very limited knowledge of OpenGL.

Is Scene Kit the democratization of 3D? We'll see if developers adopt it as readily as they adopted Core Animation. The need to create 3D assets is a substantially higher barrier to Scene Kit adoption than the creation of 2D assets was to Core Animation. Most Mac applications already have 2D assets; Core Animation just made it easy to animate them. Few existing Mac applications use 3D at all.

I have the same reservations about user interfaces festooned with 3D geegaws as I did about excessively animated 2D interfaces at the dawning of the Core Animation era. For the most part, Mac developers have restrained themselves in the 2D realm (though Apple itself is a possible exception). I hope 3D arrives on the Mac platform with similar grace.

I am extremely impressed with Scene Kit. It was my favorite new framework at this year's WWDC, by far. (Registered developers can view the session video on Apple's website.) There's much more to it than the brief overview provided here, including the ability to create simple geometry programmatically. As a final demonstration of exactly how accessible Scene Kit makes the world of 3D, here's a screenshot of a simple application I wrote that shows off Scene Kit's 3D text capabilities.
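To give a flavor of that programmatic-geometry API, here's roughly what the code behind such an application looks like. (A hypothetical sketch, not the application's actual source; "scene" is assumed to be an existing SCNScene.)

```objectivec
#import <SceneKit/SceneKit.h>

// Create extruded 3D text and add it to an existing scene.
// "scene" is an assumed SCNScene already being displayed in an SCNView.
SCNText *textGeometry = [SCNText textWithString:@"Mountain Lion"
                                 extrusionDepth:4.0];
textGeometry.font = [NSFont boldSystemFontOfSize:24.0];

SCNNode *textNode = [SCNNode nodeWithGeometry:textGeometry];
[scene.rootNode addChildNode:textNode];
```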