At 8:30 PM PST on January 3, 2004, a thousand pounds of metal sent from Earth blazed through the sky of Mars. Six minutes later, after sprouting a parachute and airbags, this chunk of metal slammed into the red planet, bounced 28 times over 300 meters and slowly rolled to a stop. The airbags deflated, revealing the metal pyramid they protected, which in turn unfolded to reveal a six-wheeled robotic geologist named Spirit. So began a three-month exploration of Mars.

For a team of hundreds of scientists and engineers at NASA's Jet Propulsion Laboratory, Spirit serves as our eyes and, using the tools at the tip of its foldaway arm, our toolbox, during a new chapter in the exploration of our neighbor planet. So does Spirit's twin, Opportunity, which duplicated Spirit's performance three weeks later on the opposite side of Mars. And what are all those scientists and engineers using to drive the rovers? They are using Linux.

The Mars Exploration Rover (MER) mission marks a turning point for the use of Linux in the space program. Linux has been used on space missions before; a Debian laptop rode the Space Shuttle on STS-83, for instance, as long ago as 1997. But the Mars Exploration Rover Project is the first JPL mission to use Linux systems for critical mission operations. On MER, Linux is used for high-level science planning and for low-level command sequencing, visualization and simulation.

How We Got Here

As odd as this may seem, Linux originally was not intended to be our primary development platform. Our software initially was intended for use on the Mars '01 mission, which essentially would have been a repeat of JPL's 1997 Mars Pathfinder mission. The plan was to do all rover commanding on a single Silicon Graphics workstation, as had been done on Mars Pathfinder. Our Linux support was, at that time, mainly a hedge against SGI's fortunes.

After the failure of JPL's Mars '98 missions, however, JPL re-planned its Mars strategy. The Mars '01 plans were scrapped in favor of a later launch, the mission that eventually became MER. The upshot was that we got two more years of development time than we had planned, albeit for a different spacecraft than we had planned. The MER rovers are larger, smarter and more complex than Pathfinder's Sojourner rover, and each one sports a robotic arm.

In those two years, Linux and the generic x86 hardware it runs on made enormous strides in both CPU and graphics speeds, thanks in no small part to graphics chip-maker NVIDIA and its strong Linux support. As a result, we increasingly came to favor the faster, better, cheaper solution that was Linux. We eventually purchased two high-end SGIs to add to our Linux boxes during flight operations, but it is unclear whether we will end up using them; the Linux boxes now give us equal, if not better, performance. Today, the MER rovers are being commanded by multiple teams working in parallel on the dozens of Linux systems we've purchased for the mission.

The RSVP System

Our software, a suite of applications called the Rover Sequencing and Visualization Program (RSVP), was developed, tested and deployed on Linux. RSVP gives MER's engineers and scientists sophisticated tools for commanding Mars rovers. Because of the lengthy light-time delays, which amounted to nearly a 20-minute round trip on Spirit's landing day, it's not possible to drive the rovers interactively. Instead, RSVP provides an immersive environment that accurately displays the rover's environment and simulates its behavior. Throughout the Martian night, we Earthlings use RSVP to plan a day's activities at the command level and then uplink the resulting commands to the rover. The rover executes those commands during the following Martian day.

The two main components of RSVP are the Rover Sequence Editor (RoSE), which provides text-oriented commanding for all spacecraft commands, and HyperDrive, which provides advanced three-dimensional graphics for driving, arm motion and imaging commands. Each of the individual applications can run in a standalone mode, but they are more powerful when used together. In combined mode, the applications run under the control of the Parallel Virtual Machine (PVM) software developed by Oak Ridge National Laboratory. PVM provides a data bus, a way for the RSVP component applications to synchronize their work by passing messages to one another. The content of most of our messages is based on RML, an XML-based specification designed specifically for representing rover command sequences.

This message-passing approach has a number of benefits, not the least of which is that we can implement individual tools in different languages. As long as a tool can send PVM messages, it can be part of the suite. RoSE, our message-passing hub and our message logger are written in Java; HyperDrive, our image viewer and our sequence flow browser are written in C++ and C.
For the next mission, we may try adding tools written in scripting languages such as Perl and Python.
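To give a feel for what such a message body might carry, here is a hypothetical RML-style fragment. The actual RML specification is not public, so every element and attribute name below is invented purely for illustration:

```xml
<!-- Hypothetical RML-style sequence fragment; real RML element names
     and the real MER command set are not shown here. -->
<sequence id="sol045_drive">
  <command name="DRIVE_ARC">
    <param name="radius_m">2.5</param>
    <param name="arc_deg">45</param>
  </command>
  <command name="TAKE_IMAGE">
    <param name="camera">FRONT_HAZCAM_LEFT</param>
  </command>
</sequence>
```

Because a document like this is just text, any tool that can speak PVM and parse XML, in whatever language, can read or emit it.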

RoSE

RoSE is a Java application running on the Sun Java Virtual Machine version 1.3.1; however, it's compiled not with Sun's compiler but with IBM's much faster Java compiler, Jikes. RoSE benefits from Java's well-known portability: we got the benefit of compiling and testing on a nice, fast Linux box, and we periodically installed and tested it on the SGI with little or no trouble. We even did an experimental port to Mac OS X, with no evident problems, but because we had neither the time nor a requirement to support a third platform, we dropped it.

Figure 1. RoSE Command Editor (Blurring due to ITAR Restrictions)

RoSE is aggressively data-driven, learning everything it knows about the rover from XML configuration files read at startup time. The IDE used for developing RoSE is good old tried-and-true GNU Emacs. Much of RoSE's GUI is built with hand-crafted Java code using Java's standard GUI toolkit, Swing. As nice as Swing is, hand-crafting GUI code sometimes is tedious; if a Java-supporting version of Glade had been available at the time, we likely would have used it.

Still, important parts of the user interface are built dynamically. One of the XML configuration files RoSE reads at startup time is a description of the rover's command dictionary, sort of like an application programming interface (API) for the rover. RoSE uses this command dictionary to generate dialog boxes dynamically for editing each command, an approach that saved untold amounts of work as the command dictionary changed dramatically over its years of development.

The automation-friendly Linux environment contributed greatly to RoSE's reliability. Early on, we wrote extensive self-test code for RoSE and then set up a simple cron job to run the tests every night. If any self-tests failed, they were reported in an e-mail message to the developer the next morning. This helped us catch bugs early, while the changes that had caused them still were fresh in our minds. Our aggressive self-testing for RoSE was something of an experiment, and it paid off handsomely.

As a side note, Java has a reputation for being slow, but that reputation appears to be unfair. There are places where RoSE could be faster, but that is true of any software; with more time (the constant refrain in software development), we could eliminate the bottlenecks. As it is, Java is fast enough for our purposes, and Linux's strong Java support has been a great benefit to us.
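The dictionary-driven approach can be sketched in miniature. The fragment below is not RoSE code, and both the dictionary format and the command and parameter names are invented (the real MER command dictionary is not public); it only shows the general idea of reading a command's parameter list from XML at runtime so that an edit dialog can be generated rather than hand-coded:

```java
import java.io.ByteArrayInputStream;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class CommandDictionary {
    // Hypothetical dictionary format; invented for illustration only.
    static final String DICT =
        "<dictionary>" +
        "  <command name='DRIVE_ARC'>" +
        "    <param name='radius_m' type='float'/>" +
        "    <param name='arc_deg' type='float'/>" +
        "  </command>" +
        "</dictionary>";

    // Look up the parameter names for one command, the way a
    // dictionary-driven editor would before building its dialog fields.
    static List<String> paramsFor(String command) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
            .newDocumentBuilder()
            .parse(new ByteArrayInputStream(DICT.getBytes("UTF-8")));
        List<String> names = new ArrayList<>();
        NodeList cmds = doc.getElementsByTagName("command");
        for (int i = 0; i < cmds.getLength(); i++) {
            Element cmd = (Element) cmds.item(i);
            if (!cmd.getAttribute("name").equals(command)) continue;
            NodeList params = cmd.getElementsByTagName("param");
            for (int j = 0; j < params.getLength(); j++)
                names.add(((Element) params.item(j)).getAttribute("name"));
        }
        return names;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(paramsFor("DRIVE_ARC")); // [radius_m, arc_deg]
    }
}
```

When the dictionary changes, only the XML file changes; code written this way never has to be touched, which is why the approach held up as the command set evolved.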

HyperDrive

HyperDrive is the interactive visualization component of the RSVP suite; it's the part they'll show on CNN. HyperDrive's main purpose is to enable the safe and efficient planning of rover driving and arm motions by giving the operator a good understanding of the rover's state and its current environment. This is accomplished by fusing multiple data sources into a three-dimensional scene and then providing multiple ways of viewing this scene.

One of the main sources of data for HyperDrive is terrain models generated from stereo imagery. Most of the cameras on the two rovers are stereo cameras, which means they can be used to see with depth perception, much like human eyes. By using a technique known as stereo correlation, we can go from the flat, two-dimensional world of a picture into the three-dimensional world of rendered computer graphics.

By combining these terrain models with CAD data representing the rover, we have the nucleus of a system that is able to render views of the rover interactively driving over the terrain or placing its arm on the Martian surface (Figure 2). We further may combine these renderings with the raw imagery from the rover cameras to provide an augmented reality view, where the scene as viewed from the rover is overlaid with synthetic imagery and possibly viewed three-dimensionally using LCD shutter glasses (Figure 3). We also can render simulated views from any of the rover's cameras to help in image targeting.

Figure 2. Drilling a Rock with the RAT (Rock Abrasion Tool)

Figure 3. Spirit Leaving the Nest in Augmented Reality View

Into this virtual world we place icons and annotations, software-created objects that illustrate the command sequence and describe features of the world. The most important group of these objects is the sequence of icons that represents commands to be executed by the rover.
While the text-oriented RoSE editor shows the entire command sequence for editing, HyperDrive shows only the subset of commands that have a sensible visual representation. In HyperDrive, mobility, robotic arm and imaging commands are shown as a string of icons placed across the terrain, each one located at the position where it will be executed. This allows us to program the rover's traverses across the terrain visually and to place its arm with a high degree of precision. Our first placement on the Martian surface was estimated to be within 5 mm of its intended target. When we placed a second instrument on the same target, we did even better, placing it within 0.3 mm of the first position. Not bad for a target a hundred million miles away.

Three-dimensional graphics programming under Linux started as quite a bleeding-edge adventure and matured into a capable platform with robust drivers and libraries. HyperDrive is based on the OpenGL Performer rendering API from Silicon Graphics. We use GTK+ and libglade for HyperDrive's user interface. The rapid prototyping enabled by Glade and libglade made GUI development painless and efficient, and we highly recommend this toolset to others in need of a robust interface toolkit. As our primary target evolved from IRIX to Linux, we were able to use more and more open-source tools and libraries and then find or build ports back to IRIX to maintain compatibility.

HyperDrive also is able to animate the rover's predicted actions, building an animation in real time and allowing the user to fly around and examine the rover's behavior from multiple perspectives. The same feature can be used to play back the rover's actual performance, based on the rover's daily report. And we were pleasantly surprised, to say the least, when these playback animations became a regular feature of JPL's daily press conferences.