Finally, AWS Lambda supports custom runtimes and a built-in runtime for Rust. This is exciting both because Rust is a fantastic language and because, with the Firecracker microVM (also built in Rust), Rust Lambdas have trivial cold-start times.

The same day Rust Lambdas were announced, I followed the instructions and built a “Hello, World” Lambda, but that’s a little limited.

My favorite way to explore Lambdas is to build Alexa skills because of the immediate feedback: you write a little code, and a home device talks to you. It’s a peek into the long-promised world of easy service composition.

Unfortunately, Rust didn’t have complete Alexa skill request/response handling (there is a crate from 2 years ago that handled only the basics), so I wrote one called alexa_sdk. (It’s basically a struct-plus-serde wrapper around the Alexa JSON spec, with some helpers.)

Last year, when Go support was announced, I wrote up how to automate the creation and deployment of binary Lambdas; here, I’m just going to concentrate on the basics of the Rust Lambda itself.

The Cargo.toml:
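Something like the following sketch (the package name and crate versions here are illustrative, not canonical; the one firm requirement is that the custom runtime looks for an executable named `bootstrap` in the deployment zip):

```toml
[package]
name = "alexa-hello"      # hypothetical project name
version = "0.1.0"
edition = "2018"

# The custom runtime expects an executable named "bootstrap",
# so the binary is renamed accordingly.
[[bin]]
name = "bootstrap"
path = "src/main.rs"

[dependencies]
lambda_runtime = "0.1"    # the Rust Lambda runtime crate
alexa_sdk = "0.1"         # the Alexa request/response crate described above
```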

The build script (the Go example referenced above provides a template for a more complete build script that automates creation and upload of the Lambda):
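A minimal sketch of the build step (file names and the target triple are assumptions, and the toolchain check is only there so the script degrades gracefully when Rust isn't installed). Lambda runs on Amazon Linux, so the usual approach is a statically linked musl build:

```shell
#!/usr/bin/env bash
# Sketch: cross-compile the Lambda for Amazon Linux and package it as a zip.
set -u

TARGET=x86_64-unknown-linux-musl  # static musl binary runs on Amazon Linux
BIN=bootstrap                     # custom runtimes require an executable named "bootstrap"

build() {
  rustup target add "$TARGET"
  cargo build --release --target "$TARGET"
  cp "target/$TARGET/release/$BIN" "$BIN"
  zip -j lambda.zip "$BIN"
}

# Only attempt the build when a Rust project and toolchain are present.
if command -v cargo >/dev/null 2>&1 && [ -f Cargo.toml ]; then
  build
fi
```

The zip can then be uploaded via the Lambda console or `aws lambda update-function-code`, as the Go post walks through in more detail.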

And the service implementation itself:
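In sketch form, assuming lambda_runtime's `lambda!` macro and an `alexa_sdk::Response::simple` helper (exact signatures may differ across crate versions):

```rust
use std::error::Error;

use alexa_sdk::{Request, Response};
use lambda_runtime::{error::HandlerError, lambda, Context};

/// Answer every incoming Alexa request by saying hello.
fn my_handler(_req: Request, _ctx: Context) -> Result<Response, HandlerError> {
    // Response::simple builds a response with a card title and spoken text.
    Ok(Response::simple("hello", "Hello, world"))
}

fn main() -> Result<(), Box<dyn Error>> {
    // The lambda! macro wires the handler into the runtime's event loop.
    lambda!(my_handler);
    Ok(())
}
```

A fuller skill would match on the request's intent (`Request` deserializes the incoming Alexa JSON) and branch on launch, intent, and session-ended requests, but this is the whole "Hello, World."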

Set up an Alexa skill:
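In the Alexa developer console, a minimal interaction model might look like the following (the invocation name and custom intent are placeholders; the `AMAZON.*` built-in intents are standard boilerplate):

```json
{
  "interactionModel": {
    "languageModel": {
      "invocationName": "rust hello",
      "intents": [
        { "name": "AMAZON.CancelIntent", "samples": [] },
        { "name": "AMAZON.HelpIntent", "samples": [] },
        { "name": "AMAZON.StopIntent", "samples": [] },
        { "name": "HelloIntent", "samples": ["say hello", "hello"] }
      ],
      "types": []
    }
  }
}
```

The skill's endpoint is then pointed at the Lambda's ARN, with the Alexa Skills Kit added as a trigger on the Lambda side.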

And let Alexa say hello:
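What goes back over the wire is a small JSON document; a simple spoken-text response from alexa_sdk serializes to roughly this shape (per the Alexa response spec):

```json
{
  "version": "1.0",
  "response": {
    "outputSpeech": {
      "type": "PlainText",
      "text": "Hello, world"
    },
    "shouldEndSession": true
  }
}
```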

Please try it out and give me feedback (via the GitHub repository).