Before we get started

As I explained in my previous blog post on getting started with AWS Lambda Layers, a Layer is a ZIP archive that contains libraries and other dependencies that your Lambda functions can import at runtime. Layers are especially useful when several AWS Lambda functions share the same set of functions or libraries, promoting code reuse! This reusability makes Lambda Layers ideal for running small chaos experiments.

For my little chaos experiment, I will use SSM, just as Yan Cui did, to store the following JSON configuration object as a string. The values are self-explanatory; delay is in milliseconds.

```json
{
    "delay": 300,
    "isEnabled": true
}
```

Open the AWS EC2 Console, select Parameter Store, and store the above configuration in an SSM parameter called chaoslambda.config.

SSM provides a secure way to store configuration variables for your applications, serverless or not, and can be accessed using the AWS Console, the AWS CLI, or, even better, the AWS SDKs. Getting that configuration from an AWS Lambda function is simple. Leveraging the excellent library ssm-cache-python from my colleague Alex Casalboni, you can use the following two lines of code to retrieve a configuration stored in SSM:

```python
from ssm_cache import SSMParameter

param = SSMParameter('chaoslambda.config')
```

For my little experiment, I will use the following get_config() function, which applies some logic to the value of delay (again, self-explanatory).

```python
import json

from ssm_cache import SSMParameter, InvalidParameterError

def get_config():
    param = SSMParameter('chaoslambda.config')
    try:
        value = json.loads(param.value)
        delay = value["delay"]
        isEnabled = value["isEnabled"]
        if isEnabled and delay >= 0:
            return delay
        elif isEnabled and delay < 0:
            return -1
        else:
            return 0
    except InvalidParameterError as e:
        print("{} is not a valid SSM parameter".format(e))
        return 0
    except KeyError as e:
        print("{} is not a valid key in the SSM configuration".format(e))
        return 0
```
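To see how that branching behaves, you can extract it into a pure helper and exercise it without touching SSM. The delay_from_config function below is a hypothetical sketch for illustration, not part of the original code:

```python
import json

def delay_from_config(raw_value):
    # Same branching as get_config(), applied to a raw JSON string.
    value = json.loads(raw_value)
    delay = value["delay"]
    is_enabled = value["isEnabled"]
    if is_enabled and delay >= 0:
        return delay   # inject a delay of `delay` milliseconds
    elif is_enabled and delay < 0:
        return -1      # signal an invalid (negative) delay
    return 0           # chaos disabled: no delay

print(delay_from_config('{"delay": 300, "isEnabled": true}'))   # 300
print(delay_from_config('{"delay": 300, "isEnabled": false}'))  # 0
```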

To allow the AWS Lambda function to access SSM, you have to give it the correct IAM permissions (more details here):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "ssm:DescribeParameters"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "ssm:GetParameters",
                "ssm:GetParameter"
            ],
            "Resource": "arn:aws:ssm:eu-north-1:<ACCOUNT_ID>:parameter/chaoslambda.config"
        }
    ]
}
```
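If you script the setup rather than clicking through the console, the same policy document can be generated programmatically. A minimal sketch, assuming a hypothetical chaos_ssm_policy helper and a placeholder account ID:

```python
import json

def chaos_ssm_policy(account_id, region="eu-north-1"):
    # Build the least-privilege policy for reading chaoslambda.config.
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["ssm:DescribeParameters"],
                "Resource": "*",
            },
            {
                "Effect": "Allow",
                "Action": ["ssm:GetParameters", "ssm:GetParameter"],
                "Resource": "arn:aws:ssm:{}:{}:parameter/chaoslambda.config".format(
                    region, account_id
                ),
            },
        ],
    }

print(json.dumps(chaos_ssm_policy("123456789012"), indent=4))
```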

You might wonder why I use SSM over Environment variables. It’s a fair question.

Environment variables in Lambda allow you to dynamically pass settings to Lambda functions without making changes to the code itself, especially settings that don't change often (such as database names). Lambda then makes these variables available to your function through the standard APIs of the language, like os.environ for Python. The following code snippet shows how you would abstract a DynamoDB table and an AWS Region in your Lambda function using environment variables:

```python
import os

import boto3

region = os.environ["AWS_REGION"]
tablename = os.environ["tablename"]

dynamodb = boto3.resource('dynamodb', region_name=region)
table = dynamodb.Table(tablename)
```

By separating environment settings from application logic, you don’t need to update and redeploy function code if you need to change the name of the database or the region where you execute that function — you know, abstractions :-)

One problem with environment variables is their locality. Sharing them across a large number of Lambda functions becomes cumbersome, but most problematic for me is that configurations stored in Lambda environment variables can't be shared with other AWS compute services like EC2 or ECS.

Note: for a complete list of SSM features, check here. For SSM limits, please check here.