Supported architectures

Firefox supports quite a few major platforms, and volunteers keep it running on numerous others. Rust started out with broad support for the three most popular desktop platforms. That’s pretty good for the initial release of a compiler, but we had a lot of work to do to get Rust code shipping everywhere we wanted to.

Code generation was straightforward thanks to LLVM, but there was a lot of work to be done in the OS-support layers of the standard library, and in getting build configurations right.

The Rust team were very supportive, and did a tremendous amount of work to add support for Windows XP, Android, older macOS, and MS Visual Studio linkage. Still, it was almost six months between shipping on our first platform (x86_64-linux) and our last (armv7-android).

Build systems are fun

Rust has a build system and package manager called `cargo`. It’s amazing, one of the best things about the Rust language ecosystem. It makes it easy to package software, maintain dependencies, and get a consistent build. However, it’s something of a world unto itself. By trying to handle everything, it made integrating Rust code into an established project more difficult.

Early foreign-function support in Rust concentrated on being able to wrap existing C libraries to quickly bootstrap functional Rust applications. While cargo had support for building external code and linking it into a Rust project early on, it didn’t really support the reverse.
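The "reverse" direction means making Rust symbols callable from C. A minimal sketch of what that looks like on the Rust side (the function name here is hypothetical, purely for illustration):

```rust
// Exposing a Rust function to C callers: `extern "C"` selects the C
// calling convention, and `#[no_mangle]` keeps the symbol name intact
// so a C linker can find it.
#[no_mangle]
pub extern "C" fn rust_add(a: i32, b: i32) -> i32 {
    a + b
}

fn main() {
    // From C this would be declared as:
    //   int32_t rust_add(int32_t a, int32_t b);
    // Here we call it directly to show it behaves like any Rust function.
    println!("{}", rust_add(2, 3)); // prints 5
}
```

The tricky part for a project like Firefox isn't the declaration — it's getting the resulting object code linked into a build that cargo doesn't control.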

Firefox also has a build system which wants to be in charge. It's very large and complicated, and it too wants to control all the things. So getting the two to work together required changes to both, and is still an ongoing project.

The simplest way to integrate Rust code into a C or C++ project is to use rustc's (or cargo's) support for generating static libraries. This will link all the Rust code and its dependent libraries into a form which can be combined with the main body of the application like any other C library.
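In cargo terms, that's the `staticlib` crate type, set in the crate's manifest — roughly like this (crate name hypothetical):

```toml
# Cargo.toml for a crate meant to be linked into a C/C++ application.
[package]
name = "pilot-component"
version = "0.1.0"

[lib]
# Produce a self-contained static archive (.a/.lib) containing this
# crate, its Rust dependencies, and the Rust standard library.
crate-type = ["staticlib"]
```

A plain `cargo build` then yields an archive the rest of the build can link like any other static library.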

That works well when you only have one bit of Rust. As things progress and you end up hooking up a piece here, a piece there, this approach doesn’t scale.

Each of those static libraries produced by the Rust compiler contains its own copy of the Rust standard library. Some linkers will cope with that, though it's not ideal. Others will become confused and reject every Rust library after the first.

We hit this almost immediately. While we started with a small pilot project adding a single component, we wanted to have some unit tests for it. Unit tests run with a test harness which is linked to rendering engine code… which means we have Rust code from the rendering engine, and Rust code from the unit test.

We resolved this by building a Rust super-library that just re-exports the public interfaces of all the Rust modules we actually want available. Each top-level object the Firefox build system produces defines its own version of this super-library, which it compiles to a static library and links in. This means there's only one copy of each Rust module and of the standard library, and the Rust compiler can fully optimize everything together without interference from the rest of the build system.
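The steps above can be sketched in code. In the real setup each component is a separate crate pulled in as a dependency and re-exported; here two inline modules stand in for those crates so the example is self-contained, and all the names are hypothetical:

```rust
// A sketch of the "super-library" approach: one crate whose only job
// is to re-export every Rust component the application needs, so the
// final link sees a single Rust static library (and a single copy of
// the standard library).

mod media_parser {
    // Stand-in for a real component crate.
    pub fn parse(input: &[u8]) -> usize {
        input.len() // placeholder for real parsing work
    }
}

mod url_encoder {
    // Stand-in for another component crate.
    pub fn encode(s: &str) -> String {
        s.replace(' ', "%20") // placeholder for real encoding
    }
}

// The super-library's entire public surface: re-exports only.
pub use media_parser::parse;
pub use url_encoder::encode;

fn main() {
    println!("{}", parse(b"abc"));  // 3
    println!("{}", encode("a b"));  // a%20b
}
```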

This wasn’t entirely obvious, since it’s different from how most Rust projects work, but it does allow us to use cargo to build all the Rust code and take advantage of its features as a package and dependency manager.

There are still some issues we need to solve with this. For example, we'd like to use cargo's 'workspaces' feature to share build artifacts when constructing multiple static libraries. The unified build also makes it harder to vary configuration flags between different sections of the Rust code; for example, one module might want debug assertions always enabled while another doesn't.
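For reference, a cargo workspace is declared with a top-level manifest listing the member crates, which then share a single `Cargo.lock` and `target/` directory so build artifacts are reused between them. The member names here are hypothetical:

```toml
# Top-level Cargo.toml declaring a workspace. Members share one lockfile
# and one build directory, so dependencies compiled for one static
# library can be reused when building the other.
[workspace]
members = [
    "gkrust",        # super-library for the main engine
    "gkrust-gtest",  # super-library for the unit-test harness
]
```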

Updating build automation

The final hurdle, and one specific to our setup but not unusual for large projects, was deploying Rust to our build and test infrastructure. Mozilla pioneered running tests as commits land in version control, which is now standard practice. But while services like Travis got Rust support relatively quickly, packaging and deploying a new compiler to our fleet of test machines and supporting all the build variants we create took quite a lot of time.

There were dozens of test configurations to plumb through, some requiring a Rust toolchain, some requiring its absence. We needed a way to package versions of the compiler and tools like cargo to use for official builds, so we had a consistent basis for Firefox releases as they progressed through stabilization and testing.
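For what it's worth, rustup offers one mechanism for this kind of consistency: a `rust-toolchain` file at the top of a checkout pins the compiler version for every cargo/rustc invocation in that tree (this wasn't necessarily what our automation used, and the version below is just an example):

```
1.17.0
```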

Of course we have to do this with the C++ and Java toolchains, platform SDKs and so on as well. But here Rust's newness and active development posed a new challenge. We could never rely on its presence in base Docker or VM images, and a new stable release every six weeks (backwards-compatible with older code, but new nonetheless) meant we had to verify and update all that machinery more often than we were used to. I made a lot of people pretty grumpy on that score.

In a similar vein, Firefox has a bootstrap script to help developers set up a programming environment for working on the engine and application code. Rust's installer is pretty great, with nvm-like features, but I still spent a fair chunk of time adding support for it there too, so developers uninterested in Rust could stay up to date without having to learn a new tool.

Conclusion

Those were some of the major hurdles I faced shipping a pilot project with a new compiled language in a major software application.

The Rust community is very helpful and supportive. They've done a great job developing a participatory culture, and it's easy to ask questions, or for help, or to contribute to the language and tools. It was very exciting to see my beginner Rust code executed its first billion times.