“This is a field in which one does one’s work and it will be obsolete within 10 years.” – Steve Jobs, 1994

Cooper-Hewitt has just acquired its first piece of code. Although the collection has objects that are the end result of algorithmic processes, notably Patrick Jouin’s 3D printed chair, Solid C2, this is the first time that code itself has been collected.

Almost all contemporary design practice involves digital processes, from the ubiquitous Adobe design software, to the CAD packages used by product designers and architects, to day-to-day office management and accounting software. It would be difficult to find a designer who lives entirely ‘off the grid.’ Despite this, design museums have been slow to add software to their permanent collections.

Some of this reluctance to collect digital objects stems from deep uncertainty about how to preserve and present such objects to future visitors and scholars. For Cooper-Hewitt, however, that uncertainty has been a strong driver to experiment.

So, here we have Planetary.

Planetary is an iPad application, written in C++ using the Cinder framework, that offers an alternative music player for the iPad: it visualizes your music collection as a series of celestial bodies. Songs are moons, albums are planets, and artists are suns. The orbits of each are determined by the length of albums and tracks, and their brightness represents their frequency of playback.
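That mapping can be sketched in a few lines of Python. This is a hypothetical illustration only: Planetary itself is written in C++ with Cinder, and the field names and scaling factors below are invented rather than taken from its source.

```python
def to_solar_system(artist):
    """Map one artist's discography onto a sun/planet/moon hierarchy.

    Illustrative sketch: keys and scaling are invented, not Planetary's.
    """
    sun = {"kind": "sun", "name": artist["name"], "planets": []}
    for album in artist["albums"]:
        album_length = sum(t["seconds"] for t in album["tracks"])
        planet = {
            "kind": "planet",
            "name": album["title"],
            # longer albums orbit farther out (arbitrary scaling)
            "orbit_radius": album_length / 60.0,
            "moons": [],
        }
        for track in album["tracks"]:
            planet["moons"].append({
                "kind": "moon",
                "name": track["title"],
                "orbit_radius": track["seconds"] / 60.0,
                # more plays -> a brighter body, capped at 1.0
                "brightness": min(1.0, track["play_count"] / 100.0),
            })
        sun["planets"].append(planet)
    return sun
```

The appeal of the design is exactly this directness: the hierarchy listeners already understand (artist, album, song) maps one-to-one onto a hierarchy everyone can see (sun, planet, moon).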

It is an elegant interactive visualization and the first product of Bloom, a San Francisco startup founded in 2010 by Ben Cerveny, Tom Carden, and Jesper Andersen to explore new forms of data visualization and interactive products. In the company’s introductory blog post in 2011, Cerveny wrote:

“We’re building a series of bite-sized applications (instruments) that bring the richness of game interactions and the design values of motion graphics to the depth and breadth of social network activity, locative tools, and streaming media services. . . . These Bloom Instruments aren’t merely games or graphics. They’re new ways of seeing what’s important.”

Planetary was the first of those instruments, written with artist and coder Robert Hodgin, who joined Bloom for the duration of the project. To date it has been downloaded more than 3.5 million times.

We have acquired Planetary as an example of both interaction design and interactive data visualization. By acquiring its source code, including the changes between versions, we are also able to reveal the underlying design decisions made during its creation and evolution, from its first public release in 2011 to its last public version in 2012.

The acquisition of the source code also allows us to do something else very important.

Preservation

Like all software, Planetary has been skirting obsolescence almost from the moment it was released. Software and hardware are separate but inescapable companions, each exerting an influence on the other that is sometimes profound and warping, sometimes destructive.

(in-app planet texture detail)

Software written for the first iPhones, released only six years ago in 2007, no longer works on today’s iPhones. It might be because the operating system was taught a fresh new way of thinking about things. It might be because new hardware was invented that is foreign to and misunderstood by the past. Often it’s both.

Tom Carden has written:

“Planetary was written for iOS 4.3. Since iOS 5 was released it has had bugs and glitches interfacing with the music player code/library provided by Apple. We reported the bug but it has not been fixed yet. I hold out hope that it will be fixed in iOS 7. In 4.3 all music was local to the device. Since 5 (I think) it’s been possible to have various forms of streams and clouds from the same API, which is now a creaky/leaky abstraction.”

“Planetary is the last of a generation of OpenGL apps to benefit from the fixed function pipeline. From here on out the future is shaders, in all their abstracted/obtruse glory. So much more power but also takes a lot more time and effort to wrangle an expressive framework. Further, it’s fascinating to me that everything I learned about OpenGL on a $5000 SGI workstation in 1999 was relevant (still basically current) on the $500 iPad in 2011.”

This is okay. It’s the price of living in the present. But it does make our jobs as cultural heritage institutions harder.

Museums like ours are used to collecting exemplary achievements made manifest in physical form, or at least things whose decay we believe we can combat and slow. To that end we employ highly trained conservators, who have honed their craft over decades, to preserve what would otherwise be forgotten or quickly turn to dust.

But preserving large, complex, and interdependent systems whose component pieces are often merely flirting with each other rather than holding hands is uncharted territory. Trying to preserve systems whose only manifestation is conceptual (interaction design, say, or service design) is harder still.

As daunting a task as that may be, we choose to see opportunity.

With Planetary we are hoping to preserve more than simply the vessel, more than an instantiation of software and hardware frozen at a moment in time: commit fd247e35de9138f0ac411ea0b261fab21936c6e6, authored in 2011, and an iPad 2, to be specific.

In order to do this, or fail trying, we are open sourcing the code that runs Planetary.

Although Bloom folded in 2012, its three principals have not only gifted the code for Planetary to Cooper-Hewitt, they have also given us explicit permission to publicly release the source code under an open source (BSD) license, and its graphical assets under a Creative Commons (non-commercial) license.

The source code is currently hosted in Cooper-Hewitt’s repository on GitHub, as is a second repository called PlanetaryExtras, which contains images, screenshots, notes, and drafts made during the creation of Planetary itself. Think of that second repository as the ‘curatorial folder’ of additional materials, except that it is out in public.

(early mockup of interface)

This means that anyone can now look at, download, and play with the source code that makes up this app. Not only that: you are permitted to replicate it, modify it, and port it to other hardware platforms and devices.

You can even apply the concept—nested data sets visualized and behaving as celestial systems—to other types of data and contexts.

Could we, for example, depict the Enron email dataset with its half-million messages using Planetary? Or if we included Planetary in the Google Art Project (to which we contribute) could we do it in a way that both preserved its interactivity and displayed the entirety of the collections in the Art Project itself?

Of course, Planetary will still be available from the Apple AppStore for the foreseeable future although it will no longer be officially supported. Or rather, not actively supported.

Cooper-Hewitt itself will not be actively developing new versions of the application, but we hope that the wider communities of developers, scholars, and enthusiasts will. Bloom’s choice to develop Planetary for Apple’s iOS operating system and the iPad (but not the iPhone) helps us understand the technological landscape in which Planetary was conceived. It is important to realize, however, that Planetary is not, first and foremost, an iPad application.

Instead, we believe that Planetary is foremost an interaction design that found its ‘then-best manifestation’ in the iPad. What might that choice look like today?

Three years after its first release, hardware and software environments akin to those of the first iPad are available from a variety of sources. We hope that people will find Planetary as interesting as we do, and will consider porting the code to Android devices, to the web browser using WebGL, or to large interactive surfaces (which are increasingly indistinguishable from desktop computers and internet-connected televisions).

While Ben Cerveny was at Stamen Design, he and founder Eric Rodenbeck (also a former National Design Awards juror) developed a conceptual approach to their work centered on the idea that “data visualization is a medium.” Tom Carden has also pointed out that Martin Wattenberg and Fernanda Viégas, at IBM and later Google, were independently driving the same agenda around the same time.

We agree!

We hope to use this acquisition as a vehicle to actively explore the question of how we meaningfully preserve the experience of using software.

As part of that exercise Tom Carden has agreed, for a time, to oversee and be the final arbiter of any bug fixes, updates, and (hopefully) newer versions of the code that will allow the software—the interactivity—to live on beyond the iPad. Tom won’t do this forever, but his participation for a time will allow us to better understand how museums might preserve not only the form of the things in their collections, but also their creators’ intent.

The distinction between preservation and access is increasingly blurred. This is especially true for digital objects.

We already have a number of “digital” objects in our collection, from calculators to desktop computers to iPads and iPhones, but we have collected only their physical form. The iPhone in our collection is neither powered on nor kept up to date with newer software releases. Eventually the hardware itself might be considered so delicate that powering it on at all would damage it beyond repair, a curse common to many electronic objects in science and technology collections. How, then, do we preserve the richness and novelty of the software interfaces that contributed as much as, if not more than, the industrial design to that device’s success?

We cannot pretend to have all the answers to these questions but we think it’s important to start making the effort to find some of them.

(Graffiti, Amsterdam, 2013, photography by Aaron Straup Cope)

Living objects

We liken this situation to that of a specimen in a zoo. In fact, given that the Smithsonian also runs the National Zoo, consider Planetary as akin to a panda. Planetary and other software like it are living objects. Their acquisition by the museum does not, and should not, seal them in carbonite like Han Solo. Instead, acquisition simply transfers them to a new home environment where they can be cared for out of the wild, and where their continued genetic preservation requires an active breeding program and community engagement. Open sourcing the code is akin to a panda breeding program: if there is enough interest, we believe Planetary’s DNA will live on in other skins on other platforms. Of course we will preserve the original, but it will be ‘experienced’ through its offspring.

In a similar vein, Wolfgang Ernst of the Media Archaeology Lab at Berlin’s Humboldt University has said that:

“Following my definition that such items need to be displayed in action to reveal their media essentiality (otherwise a medium like a TV set is nothing but a piece of furniture), it required an assembly of past media objects which teachers and students are allowed to operate with and to touch upon—a limit for curators and visitors in most museums of technology.

“The main feature of the [Media Archeological Fundus] MAF is grounded in the materiality (called “hardware”) of media artifacts—just as the Signal Laboratory is archaeologically rooted in the source codes of computer programs (since the memory regime of media culture is both material and symbolic, both engineering and mathematics). The configuration of artifacts in the MAF, guided by rather idiosyncratic media-epistemological criteria of teaching and research, does not constitute an archive, and its online presence is not meant to contribute to audiovisual archives as represented in the Web but rather a different form of audiovisual argumentation. Rethinking dynamic digital memory requires different platforms.”

New opportunities for research and scholarship

As a research institution we are also interested in the new understandings of how designers use code that can be gleaned from the code itself.

Because we are acquiring the source code from the version control system in which it was managed (also GitHub), we have been able to preserve the documentation of bugs, feature additions, and code changes throughout Planetary’s life. This offers many new interpretive opportunities and reveals many of the decisions made by the designers in creating the application.
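To illustrate the kind of question that preserved history makes answerable, here is a small hypothetical sketch in Python that tallies which files changed most, given the output of `git log --numstat`. Heavily churned files are often where the hardest design decisions were being worked out. The filenames in the test data are invented, not taken from the Planetary repository.

```python
from collections import Counter

def churn_by_file(numstat_output):
    """Count total lines changed per file from `git log --numstat` text.

    Each numstat line looks like: "<added>\t<deleted>\t<path>".
    Returns (path, lines_changed) pairs, most-changed first.
    """
    churn = Counter()
    for line in numstat_output.splitlines():
        parts = line.split("\t")
        if len(parts) != 3:
            continue  # skip commit headers and blank lines
        added, deleted, path = parts
        if added == "-":
            continue  # binary files report "-" for line counts
        churn[path] += int(added) + int(deleted)
    return churn.most_common()
```

Running this over a repository’s full log is one crude way to let the commit history, rather than the curator, point at where the design effort went.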

Not only that, we are interested in the application of literary and text analysis methods to the source code, which could enable future scholars to explore the ‘writing style’ and ‘aesthetic’ the designers used in writing this application. If we were to acquire other software by these same authors in the future, would we be able to find similar linguistic practices unique to their particular styles of coding?

(Archival version of Planetary sitting on the shelf – center left – amongst other flat objects in our collection stores as DIG001)

To be safe, we have also printed out a full copy of the source code on archival paper, set in OCR-A, the machine-readable font of the 1960s, so that should the online version of the code ever be lost or corrupted we have a ‘master’ copy deep inside the vault.

The future?

As more of the world we live in is designed, controlled, and surveilled by code, should the nation’s design museum not begin to acquire the underlying source code of all its objects, from CAD models of furniture, to the code that optimizes the fuel injection systems of the latest car, to the algorithms that underpin the financial systems of Wall Street?

The author and journalist Clive Thompson has written a lovely article about the Planetary acquisition for Smithsonian Magazine; we encourage you to read that, as well as Robert Hodgin’s excellent blog post, Creating new worlds, about designing and building the planets in Planetary. Then explore the GitHub repository and help us figure out how we make the present safe for the future.

Or just go and download it and run it on your iPad. (Requires iOS 6 or lower; it does not run on iOS 7 or above.)

Sebastian Chan is Director of Digital & Emerging Media and Aaron Cope is Senior Engineer in the Digital & Emerging Media unit at Cooper-Hewitt. More of their team’s work is discussed at www.cooperhewitt.org.