For this mission, I needed to map what the old solution did onto the SAM world. I started off with the SAM “hello world” as a basis:

sam init --runtime python3.6

From here, I modeled how the existing application on Heroku worked. It all looks something like this:

Slack music controller components

In SAM, I then had to model two API endpoints: one to receive inbound requests from Slack, and one for the “on-premises” controller to call to pick up those commands. We couldn’t have Slack call directly into the on-prem component because of firewalls, so the /heydj endpoint just captures the commands and requests from Slack and puts them in an SQS FIFO queue. When the controller calls the /controller endpoint to get the commands, we simply pull them from the queue and send them back to the controller in order. The FIFO queue gives us the ordering for free!
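As a rough sketch of how the two handlers could tie the endpoints to the queue (the function names, the QUEUE_URL environment variable, and the single "commands" message group are my assumptions here, not the original code):

```python
import json
import os


def _sqs_client():
    # boto3 is imported lazily so the module can load without AWS configured.
    import boto3  # assumed available, as it is in the Lambda runtime
    return boto3.client("sqs")


def build_send_args(queue_url, body, dedup_id):
    """Pure helper: the send_message arguments for a FIFO queue.

    A single MessageGroupId means SQS hands the commands back in the exact
    order they were enqueued -- the "ordering for free" mentioned above.
    """
    return {
        "QueueUrl": queue_url,
        "MessageBody": body,
        "MessageGroupId": "commands",
        "MessageDeduplicationId": dedup_id,
    }


def hey_dj_handler(event, context):
    # /heydj: capture the Slack slash-command payload and queue it.
    _sqs_client().send_message(**build_send_args(
        os.environ["QUEUE_URL"], event["body"], context.aws_request_id))
    return {"statusCode": 200, "body": json.dumps({"text": "Command queued!"})}


def controller_handler(event, context):
    # /controller: the on-prem controller polls here; drain pending commands.
    client = _sqs_client()
    queue_url = os.environ["QUEUE_URL"]
    resp = client.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=10)
    messages = resp.get("Messages", [])
    for msg in messages:
        client.delete_message(QueueUrl=queue_url,
                              ReceiptHandle=msg["ReceiptHandle"])
    return {"statusCode": 200,
            "body": json.dumps([msg["Body"] for msg in messages])}
```

The FIFO-specific fields (MessageGroupId and MessageDeduplicationId) are required on FIFO queues, which is what buys the ordering guarantee without any extra bookkeeping.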

Local Dev and Testing

This is where I found SAM and the SAM CLI incredibly helpful. Using just my trusty Atom editor and the SAM CLI, I was able to get everything working locally before deploying to AWS. The only thing I created up front for later testing was the SQS queue I needed in AWS (I’m sure this could have been mocked out, though…). Following the SAM guide on passing environment variables locally, I created a small JSON file with the variables and then invoked the SAM local API runner (NOTE: You need Docker installed to be able to do this):
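The env.json file might look something like this. The function logical IDs (HeyDJFunction and ControllerFunction) come from the SAM template, but the variable names and values here are placeholders I made up for illustration:

```json
{
  "HeyDJFunction": {
    "SLACK_TOKEN": "test-slack-token",
    "QUEUE_URL": "https://sqs.us-east-1.amazonaws.com/123456789012/commands.fifo"
  },
  "ControllerFunction": {
    "CONTROLLER_TOKEN": "test-controller-token",
    "QUEUE_URL": "https://sqs.us-east-1.amazonaws.com/123456789012/commands.fifo"
  }
}
```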

sam local start-api --env-vars env.json

At startup, SAM told me:

2018-11-01 14:12:57 Mounting HeyDJFunction at http://127.0.0.1:3000/heydj [POST]
2018-11-01 14:12:57 Mounting ControllerFunction at http://127.0.0.1:3000/controller [GET]
2018-11-01 14:12:57 You can now browse to the above endpoints to invoke your functions. You do not need to restart/reload SAM CLI while working on your functions, changes will be reflected instantly/automatically. You only need to restart SAM CLI if you update your AWS SAM template

I was able to test the endpoints very quickly with simple curl commands. As you’ll also note from the startup output, the SAM local API tester doesn’t need to be restarted as you change code. That’s because each time you call an endpoint, it fires up a fresh Docker container that closely matches the AWS Lambda runtime. It meant I could develop and test very quickly.

Building and Deploying

So SAM is awesome for local dev/test. What about deploying? Well, it shines here too. SAM can take your code, upload it to an S3 bucket, and then modify the SAM (CloudFormation) template to point to the S3 location of your code. This is all done with a single command:

sam package --template-file template.yaml --output-template-file packaged.yaml --s3-bucket my-s3-bucket-for-lambda-code

Once you have your packaged template and the code uploaded to S3, you just need to deploy it. Again, SAM gives you a simple command to do this:

sam deploy \
  --template-file packaged.yaml \
  --parameter-overrides SlackToken=mySlackToken ControllerToken=myControllerToken \
  --stack-name slack-music-controller \
  --capabilities CAPABILITY_IAM \
  --region us-east-1

That’s it! After running this, I had a full API Gateway solution backed by my two Lambda functions running in AWS. And I was pretty confident it would just work, because my local runs used the same code and environment as Lambda itself.