Serverless Framework

Use the Serverless Framework with LocalStack


Overview

This guide explains how to integrate LocalStack with the Serverless Framework. Although it probably requires a few code changes, integrating LocalStack with the Serverless Framework is fairly straightforward.

In particular, the setup consists of the following two steps.

  1. Installing and configuring the Serverless-LocalStack plugin.
  2. Adjusting AWS endpoints in Lambda functions.

Prerequisites

This guide assumes that you have LocalStack, the Serverless Framework CLI, and Node.js with npm installed.

It also assumes that you already have a Serverless app set up consisting of a couple of Lambda functions and a serverless.yml file similar to the following. An example Serverless app integrated with LocalStack can be found here: Simple REST API using the Serverless Framework and LocalStack

  service: my-service
  frameworkVersion: ">=1.1.0 <=2.50.0"
  provider:
    name: aws
    runtime: python3.8
    environment:
      DYNAMODB_TABLE: ${self:service}-${opt:stage, self:provider.stage}
    iamRoleStatements:
      - Effect: Allow
        Action:
          - dynamodb:Query
          - ...
        Resource: "arn:aws:dynamodb:${opt:region, self:provider.region}:*:table/${self:provider.environment.DYNAMODB_TABLE}"
  functions:
    create:
      handler: todos/create.create
      events:
        - http:
            path: todos
            method: post
            cors: true
    ...
  resources:
    Resources:
      TodosDynamoDbTable:
        Type: 'AWS::DynamoDB::Table'
        DeletionPolicy: Retain
        Properties:
          ...
          TableName: ${self:provider.environment.DYNAMODB_TABLE}
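
For reference, a handler such as todos/create.create from the configuration above might look roughly like the following. This is a minimal sketch, assuming a DynamoDB-backed "create" function; the item fields (such as text) and any validation are hypothetical and will depend on your application.

  # todos/create.py -- minimal sketch of the "create" handler referenced above
  import json
  import os
  import uuid

  import boto3

  dynamodb = boto3.resource('dynamodb')


  def create(event, context):
      # The table name is injected via the DYNAMODB_TABLE environment variable
      # defined in serverless.yml.
      table = dynamodb.Table(os.environ['DYNAMODB_TABLE'])

      # Hypothetical payload shape: {"text": "..."}
      data = json.loads(event['body'])
      item = {
          'id': str(uuid.uuid4()),
          'text': data['text'],
      }
      table.put_item(Item=item)

      return {
          'statusCode': 200,
          'body': json.dumps(item),
      }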

Install and configure Serverless-LocalStack Plugin

To install the plugin, execute the following command in the root of your project.

  $ npm install -D serverless-localstack

Next, set up the plugin by adding the following properties to serverless.yml.

  ...
  plugins:
    - serverless-localstack
  custom:
    localstack:
      stages:
        - local

This sets up Serverless to use the LocalStack plugin, but only for the stage "local". Next, you need to make minor adjustments to your function code so that your application works regardless of whether it is deployed to AWS or to LocalStack.

Adjust AWS endpoints in Lambda functions

You are likely using an AWS SDK (such as Boto3 for Python) in your Lambda functions to interact with other AWS services such as DynamoDB.

For example, in Python, your code to set up a connection to DynamoDB may look like this:

  ...
  dynamodb = boto3.resource('dynamodb')
  ...

By default, this call attempts to create a connection via the usual AWS endpoints. However, when running services in LocalStack, we need to make sure our application creates a connection via the LocalStack endpoint instead.

Usually, all of LocalStack’s services are available via a specific port on localhost (e.g. localhost:4566). However, this endpoint only works when accessing LocalStack from outside its Docker runtime.

Since the Lambda functions execute within the LocalStack Docker container, Lambda functions cannot access other services via the usual localhost endpoint.

Instead, LocalStack provides a special environment variable, LOCALSTACK_HOSTNAME, which contains the hostname under which the LocalStack services are reachable from within its runtime environment.

Hence, you need to configure the Lambda functions to use the LOCALSTACK_HOSTNAME endpoint when accessing other AWS services in LocalStack.

In Python, this may look something like the following. The code detects whether it is running in LocalStack by checking if the LOCALSTACK_HOSTNAME variable exists, and configures the endpoint URL accordingly.

  ...
  if 'LOCALSTACK_HOSTNAME' in os.environ:
      # Running inside LocalStack: reach other services via the internal endpoint
      dynamodb_endpoint = 'http://%s:4566' % os.environ['LOCALSTACK_HOSTNAME']
      dynamodb = boto3.resource('dynamodb', endpoint_url=dynamodb_endpoint)
  else:
      # Running on AWS: use the default endpoints
      dynamodb = boto3.resource('dynamodb')
  ...

Ideally, we want to make LocalStack’s Lambda execution environment “LocalStack-agnostic”, so that you are no longer required to adjust endpoints in your function code. You want to help us with that? Drop us a line in Slack!

Deploying to LocalStack

You can now deploy your Serverless service to LocalStack.

First, start LocalStack by running

  $ localstack start

Then deploy the service by running

  $ serverless deploy --stage local

The expected result should be similar to:

  Serverless: Packaging service...
  Serverless: Excluding development dependencies...
  Serverless: Creating Stack...
  Serverless: Checking Stack create progress...
  ........
  Serverless: Stack create finished...
  Serverless: Uploading CloudFormation file to S3...
  Serverless: Uploading artifacts...
  Serverless: Uploading service my-service.zip file to S3 (38.3 KB)...
  Serverless: Validating template...
  Serverless: Skipping template validation: Unsupported in Localstack
  Serverless: Updating Stack...
  Serverless: Checking Stack update progress...
  .....................................
  Serverless: Stack update finished...
  Service Information
  service: my-service
  stage: local
  region: us-east-1
  stack: my-service-local
  resources: 35
  api keys:
    None
  endpoints:
    http://localhost:4566/restapis/XXXXXXXXXX/local/_user_request_
  functions:
    ...
  layers:
    None

Use the displayed endpoint (http://localhost:4566/restapis/XXXXXXXXXX/local/_user_request_) followed by the path of your route, e.g. http://localhost:4566/restapis/XXXXXXXXXX/local/_user_request_/my/custom/endpoint, to make requests to the deployed service.
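
For example, assuming the todos app from above, you could create an item against the locally deployed API with a small script like the following. This is only a sketch: replace XXXXXXXXXX with the REST API ID printed by serverless deploy, and note that the /todos path and the payload shape come from the example configuration and the hypothetical handler sketched earlier.

  # Quick smoke test against the locally deployed API
  import json
  import urllib.request

  # Replace XXXXXXXXXX with the REST API ID from the deploy output
  url = "http://localhost:4566/restapis/XXXXXXXXXX/local/_user_request_/todos"
  payload = json.dumps({"text": "try the serverless-localstack plugin"}).encode("utf-8")

  request = urllib.request.Request(
      url,
      data=payload,
      headers={"Content-Type": "application/json"},
      method="POST",
  )

  with urllib.request.urlopen(request) as response:
      print(response.status, response.read().decode("utf-8"))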

Advanced topics

Local code mounting for lambda functions

serverless-localstack supports a feature for lambda functions that allows local code mounting:

  # serverless.yml
  custom:
    localstack:
      # ...
      lambda:
        mountCode: True

When this flag is set, the Lambda code is mounted into the container running the function directly from your local directory, instead of being packaged and uploaded.

If you want to use this feature together with the local Lambda executor (LAMBDA_EXECUTOR=local), you need to make sure the container running LocalStack itself can find the code. To do that, manually mount the code into the LocalStack container. The following docker-compose.yml snippet shows the essentials, where /absolute/path/to/todos is the path on your local machine to the todos/ directory containing the Lambda code from the previous example.

  # docker-compose.yml to start localstack
  services:
    localstack:
      # ...
      environment:
        - LAMBDA_EXECUTOR=local
        - LAMBDA_REMOTE_DOCKER=0
        # ...
      volumes:
        # ...
        - "/absolute/path/to/todos:/absolute/path/to/todos"

Ran into trouble?

If you run into any issues or problems while integrating LocalStack with your Serverless app, please submit an issue.
