We got super excited when we released the AWS Lambda Haskell runtime, described in one of our previous posts, because you could finally run Haskell in AWS Lambda natively.

There are few things better than running Haskell in AWS Lambda, but one is better for sure:

Running it 12 times faster!

Haskell Optimized is the v2 runtime; the old version was just called Haskell. Haskell Optimized cut the cold start time in half and made execution twelve times faster.

Whaaaattt???

We have been working hard on our benchmarks. Haskell compiles to native code and is fast by itself, so we wanted our runtime to be much faster too, while keeping the same level of convenience.

It all began with a refactor of the codebase, where we split the two big modules into small, organized, single-responsibility modules that are much easier to navigate, and therefore easier to contribute to!

We immediately saw some flaws in our code, but probably the biggest pain point that hindered performance was the layer separation.

Deprecating the layer

AWS Lambda layers are a great tool for packing away some dependencies so that your project runs, and bootstraps, faster.

They can be used to upload heavy dependencies, like large Java JARs, or even command-line applications.

We were using them to upload the runtime. This made the execution of the user’s project a bit complex:

The layer had to be downloaded to the lambda. It then spawned the user's project as a system process through bash, in a very specific way, with protocols that went over STDOUT and STDERR, along with JSON serialization. Even something as simple as a Hello World had quite a lot of overhead due to the layer.
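As a rough illustration of that flow (this is not the actual layer code, and `cat` stands in for the user's executable), the layer essentially acted as a supervisor process that wrote the serialized event to the child process and read the serialized response back:

```haskell
import System.Process (readProcess)

main :: IO ()
main = do
  -- A hypothetical serialized invocation event, for illustration only.
  let event = "{\"name\": \"world\"}"
  -- Spawn the child process and exchange the payload over its standard
  -- streams; `cat` stands in for the user's executable here, simply
  -- echoing the event back as its "response".
  response <- readProcess "cat" [] event
  putStr response
```

Every invocation paid for this process spawning and stream shuffling on top of the actual handler work.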

We looked at other compiled runtimes, and having one executable per lambda, which at first had not seemed like an option to us, started to look more appealing. We also wanted to keep the convenience the Node.js runtime provides: letting us select a handler from the AWS Lambda panel.

That’s it! We could have the runtime embedded in the user project, together with the dispatcher.

So we put on our safety glasses and started sawing and sanding the runtime with some awesome Haskell metaprogramming (Template Haskell), making execution twelve times faster and cutting cold start times in half.
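To make the idea concrete, here is a hand-written sketch of what such a generated dispatcher conceptually boils down to. The handler names and signatures are illustrative only, not the library's actual API, and the real generated code also deals with the Lambda API, serialization, and error reporting:

```haskell
-- A hypothetical handler the user would write in their project.
sayHello :: String -> Either String String
sayHello name = Right ("Hello, " ++ name)

-- The generated dispatcher pattern-matches on the handler name that
-- you select in the AWS Lambda panel and routes to the right function,
-- all inside a single executable: no layer, no extra process.
dispatch :: String -> Maybe (String -> Either String String)
dispatch "src/Lib.sayHello" = Just sayHello
dispatch _                  = Nothing

main :: IO ()
main =
  case dispatch "src/Lib.sayHello" of
    Just handler -> print (handler "world")
    Nothing      -> putStrLn "handler not found"
```

Because the dispatch table is generated at compile time from the handlers in your project, there is nothing left to download or spawn at invocation time.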

A more convenient import and name

We have made everything available under the Aws.Lambda module, so you no longer have to import two modules just to configure your lambdas. Instead, you can do the following in your Main module, and you are ready to roll:

```haskell
import Aws.Lambda

generateLambdaDispatcher
```

Comprehensive documentation

You can find a website here that serves all of the documentation for the runtime, generated with Tintin.

When in doubt, take a peek at it: you now have a reference for easily running Haskell on AWS Lambda.

Thanks for reading

We hope you like the changes we’ve made to the runtime. We want it to remain the best way to run Haskell on AWS Lambda.

Tell us if you create, launch, or even just test something with the runtime. We’d love to hear from you, and to make serverless computing easier, better, faster, and stronger.

See ya!