There’s a lot of talk recently around the so-called “Serverless” architecture.

In case you haven’t heard the term before, with “serverless” we mean applications that are built using Cloud services exclusively. With the introduction of AWS Lambda, and similar offerings from Amazon’s competitors, it is now possible to build and deploy applications that require no operational maintenance and offer limitless scale. The key takeaway is that you don’t need to manage application servers anymore, removing a lot of complexity and overhead.

In this tutorial, we will get our hands dirty with this technology by building a simple Slack chat-bot that will reply to our private chat messages.

Creating the Slack bot

First, create your Slack team if you haven’t already.

Click on the main menu on the top left, and select Apps and integrations.

Select Build.

Then, click the green Start building button. A new window, titled “Create an app”, will pop up.

Slack apps package the functionality we are going to implement, such as a chat bot. Apps can then be distributed through the Slack App Directory.

Enter a suitable name for your app and assign it to your team as the “Development team”. I’m going to call my bot Gort.

Click on Create App, and you will be brought to your application’s configuration section.

We now need to enable event subscription. The way the chat bot will work is the following:

Users send a direct chat message to the chat-bot

An event representing that message is published

If the bot is subscribed to that type of event, an HTTP POST request, containing information about that chat message, is dispatched to a Web resource living at a given URL

That URL can be handled by a Web application that we can easily build using a couple of “serverless” AWS cloud technologies:

An API Gateway resource, at that URL, handles the incoming POST request

An AWS Lambda function processes the payload of the POST request and takes an appropriate action, such as making a new request to the Slack API (e.g. to reply to the message)

Here’s an interaction diagram that illustrates what will happen each time our bot receives a direct message:

Let’s get to work. We need to tell our bot to subscribe to a specific event type, named message.im.

Go to the Event Subscriptions page and Enable Events:

Toggle the On/Off button to enable/disable event subscriptions

Scroll down to the Subscribe to Bot Events section and click on the Add a bot user text hyperlink.

You’ll be able to redefine the nick your bot should use when interacting with channels and users. Don’t forget to enable the Always Show My Bot as Online option, as we won’t go as far as using the RTM API to accomplish that. Besides, the AWS infrastructure is always on, isn’t it? That’s one of the advantages of going serverless: no downtime*.

Click on the Add Bot User button to complete this part. Go back to the Event Subscriptions section and toggle the On/Off button. Have you noticed the Request URL that has just appeared? That’s where the direct messages that your bot receives will be sent, using HTTPS POST requests.

We now need to set up that web resource, but before we do that, go to the OAuth & Permissions page and click on the Install App to Team button. Authorize the bot and you’ll get a set of OAuth Access tokens. Copy the Bot User OAuth Access Token to the clipboard; we’ll need it later.

Before you mention it, that Token isn’t valid anymore

Creating the API Gateway web resource

Sign into your AWS Management Console. We will begin by creating the API Gateway resource, so go to its dashboard.

Click on Create API. Leave New API selected and enter a good name for it. I called mine “gort-brains”.

The only predefined resource is root (/), so we’ll create a new resource named /event-handler. Click on the Actions button and then Create Resource from the drop-down menu.

Click on “Create Resource”

Enter “Event Handler” as the Resource Name and “event-handler” as the Resource Path. We don’t need to enable CORS as the client will not be a browser, but rather whatever backend application Slack runs to dispatch these requests. Finally, click on the Create Resource button.

Before finalising our new API Gateway resource, we need to create the Lambda function that it will trigger. Click on Services in the top menu and open the Lambda Management Console in another browser tab.

Creating the Lambda function

Go to the Lambda Dashboard and Create a Lambda function.

Select Blank Function as the blueprint: we’ll start from scratch. Click Next when the Configure Triggers section appears.

Before you ask: yes, we could have created both the Gateway resource and the Lambda function in one fell swoop, but for this tutorial I wanted to explain the process step by step.

We’ll write our Lambda function in Python 3 and call it “handleBotEvent”.

The Lambda function will encode the following process:

Handle the data of the incoming POST request and extract the part relevant to the event

Check that the message came from a user

Reverse the text of the message, e.g. “Hello” becomes “olleH”

Send the reversed text back to the user, by submitting a new GET request to the appropriate Slack API resource: chat.postMessage

Lambda functions behave just like plain old CGI (or WSGI) handlers. The Python Lambda function signature, using Python 3 type annotations, is:

def lambda_handler(event: dict, context: dict) -> str:
    # TODO implement
    return 'Hello from Lambda'

The event object is where we need to go digging for the juicy little bits of message data. This is what you’d typically get:

{
    'token': '0MkuolbPtPqTrGmkSvcKGksI',
    'team_id': 'T5MBG2JKG',
    'api_app_id': 'A5LL7G91B',
    'event': {
        'type': 'message',
        'user': 'U5KSYQ20Y',
        'text': 'hello there',
        'ts': '1496764428.496706',
        'channel': 'D5MCG83RV',
        'event_ts': '1496764428.496706'
    },
    'type': 'event_callback',
    'authed_users': ['U5MBXN7M4'],
    'event_id': 'Ev5PDV5YUS',
    'event_time': 1496764428
}

Lots of interesting information there. The following parts are especially relevant to our use-case:

user: the ID of the user who sent the message to our chat-bot

text: the text of the message

channel: the ID of the channel where the message was posted, essentially the private chat stream between the user and the bot

We need the last two in our function. Here’s how I’d implement it:

Please note: production code must validate the token that Slack is sending in the request. I’ve omitted this in the code for simplicity, but you should take a look at Nicholas’s comment for more information.
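Here’s a minimal sketch of such a handler, using only the standard library. It assumes the bot OAuth token is exposed through an environment variable I’ve called BOT_TOKEN (the name is my choice), and, as noted above, it skips token validation for simplicity:

```python
import os
import urllib.parse
import urllib.request

SLACK_URL = "https://slack.com/api/chat.postMessage"


def reverse_text(text: str) -> str:
    """Reverse the message text, e.g. 'Hello' becomes 'olleH'."""
    return text[::-1]


def lambda_handler(data: dict, context) -> str:
    # Answer Slack's one-time URL verification challenge (see below).
    if "challenge" in data:
        return data["challenge"]

    event = data["event"]

    # Only react to plain user messages; replies posted by the bot itself
    # carry a bot_id instead of a user field, and we must not echo those
    # back or we'd loop forever.
    if "bot_id" in event or "user" not in event:
        return "OK"

    # The Slack Web API expects form-encoded parameters, not a JSON body,
    # so we build a query string and issue a GET request.
    params = urllib.parse.urlencode({
        "token": os.environ["BOT_TOKEN"],
        "channel": event["channel"],
        "text": reverse_text(event["text"]),
    })
    urllib.request.urlopen(f"{SLACK_URL}?{params}")
    return "OK"
```

The function only replies in the channel the message came from, which for a direct message is the private stream between the user and the bot.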

We reference an environment variable for the bot OAuth token, so let’s define it under the editor:

Finally, let’s also create and assign an appropriate role to the function. Scroll down to the Lambda function handler and role section and select Create custom role from the Role dropdown. A browser tab should open and you’ll be able to create a new, very basic role, in this new page:

Click on Allow and the previous form will auto-fill:

That’s it for our Lambda function. Let’s jump back into our API Gateway resource.

Joining the dots

We need to configure the resource so that it handles POST requests. Click on the Actions button again and select Create Method, then click on the new drop down field that has appeared and select POST. Finally, click on the tick icon.

Select “Lambda Function” as the Integration type, if it isn’t already, and choose the Lambda Region that’s most appropriate for you. I chose eu-west-1 as I reside in Ireland. Enter “handleBotEvent” as the Lambda Function name.

Click Save. A pop-up notifying you that you are about to give the API Gateway permission to your new Lambda function will appear. We’re OK with that.

You should end up with the following workflow:

Deploy the API and it will be made available through a specific web URL. Click on Actions, and then Deploy API.

You’ll be asked to choose a Deployment stage. Create a new one named “dev” and Deploy!

Once you do that, we’ll finally get what we really want: the Request URL, here referred to as Invoke URL:

We get HTTPS enabled by default, how convenient!

Important: You need the URL of your event-handler resource, not the root resource! Expand the tree and click on the green POST link to get to it.

Copy the Link URL.

We’re almost done. The AWS part of the equation has been taken care of. Now we need to subscribe the bot to the right type of event. Back to the Slack API Event Subscriptions page.

Enable Slack direct message events and paste the URL into the Request URL field. You should get a warning message almost immediately, saying that the URL didn’t reply correctly to the challenge offered by the Slack API.

So what’s going on? The Slack API prudently sends a one-time challenge request to the new URL you’ve just defined as the Request URL. The challenge consists of a random string of characters, and our API is expected to respond with that same string. We need to amend the code so that it handles this condition.
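For reference, the body of that verification request looks roughly like this (the values here are illustrative):

```json
{
    "token": "0MkuolbPtPqTrGmkSvcKGksI",
    "challenge": "3eZbrw1aBm2rZgRNFdxV2595E9CY3gmdALWMmHkvFXO7tYXAYM8P",
    "type": "url_verification"
}
```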

Let’s go back to the code of the Lambda function and add the following conditional statement at the top of the function:

def lambda_handler(data, context):
    """Handle an incoming HTTP request from a Slack chat-bot."""
    if "challenge" in data:
        return data["challenge"]
    ...

That should take care of it. Save the function and go back to the Enable Events page and click on Retry. The verification should now succeed:

Scroll down to the Subscribe to Bot Events section and click on Add Bot User Event. Select message.im as the event type:

Save your changes and go back to the Slack team channel. The bot should be waiting patiently there:

Send a direct message and you should get the reversed text from the bot.

You can find the complete code for this tutorial here.

Conclusions

I can’t say that I enjoy writing Python code in Lambda’s constrained editor. That’s probably why they recently launched a more full-featured web IDE and toolkit for development. Still, setting up a basic bot, once you’ve figured out a few gotchas (such as the Slack API not accepting JSON payloads), is pretty straightforward.

Where to go from here? I’ll probably take a look at Lex and see how it can be integrated, and maybe get a datastore involved too, to give the bot a memory. I will also try Chalice, the AWS-sponsored Python micro-framework for AWS.

I hope this was useful. Have fun!