25 May 2015

My Rust implementation of the Cap’n Proto remote procedure call protocol was designed in a bygone era. Back then, Rust’s runtime library provided thread-like “tasks” that were backed by libgreen and were therefore “cheap to spawn.” These enabled CSP-style programming with beautifully simple blocking I/O operations that were, under the hood, dispatched through libuv. While the question of whether this model was actually efficient was a matter of much discussion, I personally enjoyed using it and found it easy to reason about.

For better or worse, the era of libgreen has ended. Code originally written for libgreen can still work, but because each “task” is now its own system-level thread, calling them “lightweight” is more of a stretch than ever. As I’ve maintained capnp-rpc-rust over the past year, its need for a different approach to concurrency has become increasingly apparent.

Introducing GJ

GJ is a new Rust library that provides abstractions for event-loop concurrency and asynchronous I/O, aiming to meet the needs of Cap’n Proto RPC. The main ideas in GJ are taken from KJ, a C++ library that forms the foundation of capnproto-c++. At Sandstorm, we have been successfully using KJ-based concurrency in our core infrastructure for a while now; some examples you can look at include a bridge that translates between HTTP and this Cap’n Proto interface, and a Cap’n Proto driver to a FUSE filesystem.

The core abstraction in GJ is the Promise<T>, representing a computation that may eventually resolve to a value of type T. Instead of blocking, any non-immediate operation in GJ returns a promise that gets fulfilled upon the operation's completion. To use a promise, you register a callback with the then() method. For example:

pub fn connect_then_write(addr: gj::io::NetworkAddress) -> gj::Promise<()> {
    return addr.connect().then(|stream| {
        // The connection has succeeded. Let's write some data.
        return Ok(stream.write(vec![1, 2, 3]));
    }).then(|(stream, _)| {
        // The first write has succeeded. Let's write some more.
        return Ok(stream.write(vec![4, 5, 6]));
    }).then(|(stream, _)| {
        // The second write has succeeded. Let's just return.
        return Ok(gj::Promise::fulfilled(()));
    });
}
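To make the chaining shape concrete, here is a toy, purely synchronous stand-in for then() that I wrote for illustration. Real GJ promises are resolved later by an event loop; none of this is GJ's actual implementation, it only models how each callback consumes the previous result and produces the next promise.

```rust
// Toy promise: the value is already available, so `then` runs immediately.
struct Promise<T>(T);

impl<T> Promise<T> {
    fn fulfilled(value: T) -> Promise<T> {
        Promise(value)
    }

    // `then` hands the resolved value to a one-shot (FnOnce) callback
    // that produces the next promise in the chain.
    fn then<U, F: FnOnce(T) -> Promise<U>>(self, f: F) -> Promise<U> {
        f(self.0)
    }
}

fn main() {
    let result = Promise::fulfilled(1)
        .then(|n| Promise::fulfilled(n + 1))
        .then(|n| Promise::fulfilled(n * 10));
    assert_eq!(result.0, 20);
}
```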

Callbacks registered with then() never move between threads, so they do not need to be thread-safe. In Rust jargon, the callbacks are FnOnce closures that need not be Send . This means that you can share mutable data between them without any need for mutexes or atomics. For example, to share a counter, you could do this:

pub fn ticker(counter: Rc<Cell<u32>>, delay_ms: u64) -> gj::Promise<()> {
    return gj::io::Timer.after_delay_ms(delay_ms).then(move |()| {
        println!("the counter is at: {}", counter.get());
        counter.set(counter.get() + 1);
        return Ok(ticker(counter, delay_ms));
    });
}

pub fn two_tickers() -> gj::Promise<Vec<()>> {
    let counter = Rc::new(Cell::new(0));
    return gj::join_promises(vec![ticker(counter.clone(), 500),
                                  ticker(counter, 750)]);
}

If you do want to use multiple threads, GJ makes it easy to set up an event loop in each and to communicate between them over streams of bytes.
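GJ's own cross-thread primitives aren't shown in this post, so here is a rough, standard-library-only sketch of the general pattern: two threads exchanging bytes over a stream. None of these names are GJ API; a TCP connection stands in for whatever byte stream links the two event loops.

```rust
use std::io::{Read, Write};
use std::net::{TcpListener, TcpStream};
use std::thread;

// Spawns a worker thread, exchanges bytes with it over a TCP stream,
// and returns the worker's reply.
fn roundtrip() -> [u8; 3] {
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    let addr = listener.local_addr().unwrap();

    // Worker thread: reads a 3-byte request and replies with each byte + 1.
    let worker = thread::spawn(move || {
        let (mut stream, _) = listener.accept().unwrap();
        let mut buf = [0u8; 3];
        stream.read_exact(&mut buf).unwrap();
        let reply: Vec<u8> = buf.iter().map(|b| b + 1).collect();
        stream.write_all(&reply).unwrap();
    });

    // Main thread: writes a request, then reads the reply.
    let mut stream = TcpStream::connect(addr).unwrap();
    stream.write_all(&[1, 2, 3]).unwrap();
    let mut reply = [0u8; 3];
    stream.read_exact(&mut reply).unwrap();
    worker.join().unwrap();
    reply
}

fn main() {
    assert_eq!(roundtrip(), [2, 3, 4]);
}
```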

To learn more about what’s possible with GJ, I encourage you to explore some of the more complete examples in the git repo.

Onwards!

Two things in particular have made working on GJ especially fun so far:

1. KJ is written in clean, modern C++ that translates nicely into idiomatic Rust. The translation is fairly direct most of the time, and the parts that don’t translate directly make for fun puzzles! For one such nontrivial translation, compare KJ’s AsyncOutputStream to GJ’s AsyncWrite.

2. The excellent mio library allows us to not worry about system-specific APIs. It provides a uniform abstraction on top of epoll on Linux and kqueue on OS X, and maybe someday even IOCP on Windows.

Although the basics of GJ are operational today, there’s still a lot of work left to do. If this project sounds interesting or useful to you, I’d love to have your help!

-- posted by dwrensha