Serverless Rust API on AWS - Part 3: Deployment on AWS

Building a serverless API on AWS with OpenAPI support using Rust and Poem.

David Steiner

· 16 min read

Welcome to the final installment of our “Serverless Rust API on AWS” series. In Part 1, we explored the ecosystem and considerations for building serverless APIs with Rust on AWS. Part 2 focused on local development, where we built a Poem API with a basic CRUD interface for managing currencies using an in-memory repository.

Now, we’re ready to take our application to the cloud. In this post, we’ll:

  1. Implement a DynamoDB repository to replace our in-memory storage.
  2. Adapt our Poem API to work with AWS Lambda.
  3. Use AWS CDK to provision and deploy our infrastructure.
  4. Integrate AWS Cognito for API authentication.

By the end of this tutorial, you’ll have a fully functional, secure, and scalable serverless API running on AWS. We’ll see how the abstractions we built in Part 2 allow us to easily swap out our storage layer, and how Poem’s Lambda integration simplifies the process of moving our API to the cloud.

The complete code is on GitHub.

The DynamoDB repository

Let’s start by swapping out our in-memory repository for a DynamoDB-based implementation.

We’ll introduce three new crates to our project:

Cargo.toml
[package]
name = "serverless-rust-api"
version = "0.1.0"
edition = "2021"

[[bin]]
name = "local"
path = "src/bin/local.rs"

[dependencies]
anyhow = "^1.0.86"
async-trait = "^0.1"
aws-config = "^1.5"
aws-sdk-dynamodb = "^1.36"
envy = "^0.4"
poem = "^3.0"
poem-lambda = "^5.0"
poem-openapi = { version = "^5.0", features = ["swagger-ui"] }
serde = { version = "^1.0", features = ["derive"] }
serde_dynamo = { version = "^4.2", features = ["aws-sdk-dynamodb+1"] }
thiserror = "^1.0"
tokio = { version = "^1.38", features = ["full"] }
tracing = "^0.1"
tracing-subscriber = { version = "^0.3", features = ["json", "env-filter"] }

[dev-dependencies]
poem = { version = "^3.0", features = ["test"] }

aws-config and aws-sdk-dynamodb are part of the official AWS SDK for Rust. We’ll use them to load AWS configuration, create a DynamoDB client and perform table operations. serde_dynamo is a handy utility library that streamlines converting DynamoDB items to and from serde types.
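
To see what serde_dynamo buys us in isolation, here’s a minimal, self-contained sketch (the Currency fields mirror the type from Part 2, and the roundtrip helper exists purely for illustration) that converts a struct into the item map the SDK expects and back again:

use std::collections::HashMap;

use aws_sdk_dynamodb::types::AttributeValue;
use serde::{Deserialize, Serialize};
use serde_dynamo::{from_item, to_item};

#[derive(Debug, Serialize, Deserialize)]
struct Currency {
    code: String,
    name: String,
    symbol: String,
}

fn roundtrip(currency: &Currency) -> Result<Currency, serde_dynamo::Error> {
    // to_item serialises the struct into the HashMap<String, AttributeValue>
    // that the AWS SDK's put_item expects.
    let item: HashMap<String, AttributeValue> = to_item(currency)?;
    // from_item deserialises an item returned by the SDK back into our type.
    let restored: Currency = from_item(item)?;
    Ok(restored)
}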

Armed with these dependencies, let’s add a new module in src/repository/dynamodb.rs to implement our DynamoDB repository.

src/repository/dynamodb.rs
use aws_config::BehaviorVersion;
use aws_sdk_dynamodb::types::AttributeValue;
use aws_sdk_dynamodb::types::ReturnValue::AllOld;
use aws_sdk_dynamodb::Client;
use serde_dynamo::{from_item, to_item};
use tracing::{error, field::debug};

use crate::error::Error;
use crate::repository::{Currency, Repository};

pub struct DynamoDbRepository {
    client: Client,
    table_name: String,
}

impl DynamoDbRepository {
    pub async fn new(table_name: String) -> Self {
        let config = aws_config::load_defaults(BehaviorVersion::latest()).await;
        let client = Client::new(&config);
        Self { client, table_name }
    }
}

#[async_trait::async_trait]
impl Repository for DynamoDbRepository {
    async fn add_currency(&self, currency: Currency) -> crate::error::Result<Currency> {
        let item = to_item(&currency).unwrap();
        self.client
            .put_item()
            .table_name(&self.table_name)
            .set_item(Some(item))
            .item("pk", AttributeValue::S(currency.code.to_lowercase()))
            .send()
            .await
            .map_err(|err| {
                error!(err = debug(err), "failed to create currency");
                Error::Other
            })?;
        Ok(currency)
    }

    async fn get_currency(&self, code: &str) -> crate::error::Result<Currency> {
        let output = self
            .client
            .get_item()
            .table_name(&self.table_name)
            .key("pk", AttributeValue::S(code.to_lowercase()))
            .send()
            .await
            .map_err(|err| {
                error!(err = debug(err), "failed to get currency");
                Error::Other
            })?;

        if let Some(item) = output.item {
            let currency = from_item(item).map_err(|_| Error::Other)?;
            Ok(currency)
        } else {
            Err(Error::NotFound(code.to_string()))
        }
    }

    async fn delete_currency(&self, code: &str) -> crate::error::Result<Currency> {
        let output = self
            .client
            .delete_item()
            .table_name(&self.table_name)
            .key("pk", AttributeValue::S(code.to_lowercase()))
            .return_values(AllOld)
            .send()
            .await
            .map_err(|err| {
                error!(err = debug(err), "failed to delete currency");
                Error::Other
            })?;

        if let Some(item) = output.attributes {
            let currency = from_item(item).map_err(|_| Error::Other)?;
            Ok(currency)
        } else {
            Err(Error::NotFound(code.to_string()))
        }
    }
}

The DynamoDbRepository struct encapsulates a DynamoDB client and the name of the table we’ll be using. Its new method asynchronously initialises the repository by loading AWS configuration and creating a DynamoDB client.

We then implement the Repository trait for DynamoDbRepository, providing implementations for add_currency, get_currency, and delete_currency. These methods interact with DynamoDB using the AWS SDK, handling serialisation and deserialisation of our Currency type using serde_dynamo.

Finally, we expose DynamoDbRepository as a public type in the repository module, making it available for use in other parts of our application.

src/repository.rs
mod base;
mod dynamodb;
mod memory;
pub use base::{Currency, Repository, SharedRepository};
pub use dynamodb::DynamoDbRepository;
pub use memory::InMemoryRepository;

In the next section, we’ll focus on adapting our application to run as an AWS Lambda function.

Running Poem on Lambda

To integrate Poem with Lambda, we’ll utilise the poem-lambda crate, which should already be listed in your Cargo.toml dependencies.

Given the significant differences between the local development setup and the AWS Lambda environment, we’ve opted to create two separate binary targets. The Lambda-specific target configures structured JSON logging (as opposed to the pretty logger used for local development) and always uses the DynamoDB repository as its backend.

Let’s take a look.

src/bin/serverless.rs
use anyhow::Result;
use std::sync::Arc;

use serverless_rust_api::api::build_app;
use serverless_rust_api::repository::DynamoDbRepository;
use serverless_rust_api::settings::Settings;

#[tokio::main]
async fn main() -> Result<()> {
    tracing_subscriber::fmt()
        .with_env_filter("info")
        .json()
        .init();

    let settings = envy::from_env::<Settings>()?;
    let repository = Arc::new(DynamoDbRepository::new(settings.table_name).await);
    let app = build_app(repository)?;

    poem_lambda::run(app).await.expect("app to start correctly");

    Ok(())
}

Note the use of poem_lambda::run to execute the application. This function handles Lambda events and converts them into Poem requests for our API to process.

To register this new binary target, update your Cargo.toml:

Cargo.toml
[package]
name = "serverless-rust-api"
version = "0.1.0"
edition = "2021"

[[bin]]
name = "local"
path = "src/bin/local.rs"

[[bin]]
name = "serverless"
path = "src/bin/serverless.rs"

[dependencies]
anyhow = "^1.0.86"
async-trait = "^0.1"
aws-config = "^1.5"
aws-sdk-dynamodb = "^1.36"
envy = "^0.4"
poem = "^3.0"
poem-lambda = "^5.0"
poem-openapi = { version = "^5.0", features = ["swagger-ui"] }
serde = { version = "^1.0", features = ["derive"] }
serde_dynamo = { version = "^4.2", features = ["aws-sdk-dynamodb+1"] }
thiserror = "^1.0"
tokio = { version = "^1.38", features = ["full"] }
tracing = "^0.1"
tracing-subscriber = { version = "^0.3", features = ["json", "env-filter"] }

[dev-dependencies]
poem = { version = "^3.0", features = ["test"] }

Lastly, implement the settings module to robustly load configuration from environment variables:

src/settings.rs
use serde::Deserialize;

#[derive(Debug, Deserialize)]
pub struct Settings {
    pub table_name: String,
}

Environment variables are a common way to pass configuration to Lambda functions. The envy crate parses environment variables into serde types, which is more convenient and robust than reading them one by one with std::env::var().
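
As a quick, standalone illustration (not part of the project code): envy matches environment variable names to struct fields case-insensitively, so TABLE_NAME populates table_name, and a missing variable surfaces as an error rather than a panic.

use serde::Deserialize;

#[derive(Debug, Deserialize)]
struct Settings {
    table_name: String,
}

fn main() -> Result<(), envy::Error> {
    // TABLE_NAME in the environment maps onto the table_name field.
    std::env::set_var("TABLE_NAME", "currencies");

    let settings = envy::from_env::<Settings>()?;
    assert_eq!(settings.table_name, "currencies");

    Ok(())
}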

The serverless target should now build correctly:

Terminal window
cargo build --bin serverless

Clippy should also come back happy:

Terminal window
cargo clippy --all-targets

Building for Lambda

To deploy our Rust application on AWS Lambda, we need to package our binary in a Lambda-compatible format. The Cargo Lambda tool provides an efficient way to achieve this.

First, install Cargo Lambda by following the instructions on their official website.

Once it’s installed, you can build your Lambda-compatible package with the following command:

Terminal window
cargo lambda build --bin serverless --arm64

This command does the following:

  • --bin serverless: Specifies the binary target to build (our Lambda-specific entry point).
  • --arm64: Builds for the ARM64 architecture.

While we’ve chosen ARM64 here, x86-64 is also supported. The crucial point is to ensure that the build architecture matches the Lambda function’s platform, which we’ll specify in our AWS CDK code later.

After a successful build, you should find a new compiled binary at:

Terminal window
target/lambda/serverless/bootstrap

This bootstrap file is the executable we’ll deploy to AWS Lambda. It contains your Rust application compiled and packaged in a way that’s compatible with the Lambda runtime.

Deployment

Now that we’ve implemented our API, let’s deploy the infrastructure required to run it in AWS. This includes a Lambda function to execute our code, an API Gateway to expose our API endpoints, and a Cognito user pool for authentication.

AWS CDK

For infrastructure definition and deployment, we’ll leverage the AWS Cloud Development Kit (CDK). If you’re new to CDK, it’s a powerful framework that allows you to describe cloud infrastructure using familiar programming languages like Python and TypeScript (but not Rust), rather than domain-specific languages or JSON/YAML templates.

The benefits of using CDK include:

  1. Using languages you already know, eliminating the need to learn a new domain-specific language.
  2. Leveraging the full power and ecosystem of these programming languages.
  3. Ability to test infrastructure stacks using standard software testing methodologies.
  4. Particularly useful for organisations where developers manage their service infrastructure.

I typically use either TypeScript or Python for CDK stacks, depending on team preferences and the existing codebase. All things being equal, TypeScript often feels slightly nicer, as it’s the language CDK itself is written in. I’ve chosen Python for this example, since many in the Rust community are already familiar with it. However, the concepts presented here transfer easily to other CDK-supported languages like TypeScript, if that’s your preference.

Pyproject setup

Let’s set up our project structure and configuration for the infrastructure code:

  1. Create an infrastructure folder with a stack subfolder in it.
  2. Add an empty __init__.py in infrastructure/stack to make it a Python package.

Create a pyproject.toml file in the project root with the following content:

pyproject.toml
[tool.poetry]
name = "serverless-api"
version = "0.0.0"
description = "Infrastructure for a serverless Rust API."
authors = []
readme = "README.md"
packages = [{ include = "stack", from = "infrastructure" }]

[tool.poetry.group.main.dependencies]
python = "^3.11"
aws-cdk-lib = "^2.147"

[tool.poetry.group.dev.dependencies]
mypy = "^1.10"
ruff = "^0.3"

[tool.mypy]
ignore_missing_imports = true
disallow_untyped_defs = true

[tool.ruff]
exclude = [
    ".git",
    ".git-rewrite",
    ".mypy_cache",
    ".pyenv",
    ".pytest_cache",
    ".pytype",
    ".ruff_cache",
    ".venv",
    ".vscode",
    "__pypackages__",
    "target",
]

[tool.ruff.lint]
select = [
    "B", # flake8-bugbear
    "C", # flake8-comprehensions
    "E", # pycodestyle errors
    "F", # pyflakes
    "I", # imports
    "N", # PEP 8 naming convention
    "W", # pycodestyle warnings
]
ignore = [
    "E501", # line too long, handled by formatter
    "W291", # trailing whitespace, handled by formatter
]

[tool.ruff.format]
quote-style = "double"
indent-style = "space"
skip-magic-trailing-comma = false
line-ending = "auto"
docstring-code-format = false

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"

This configuration:

  • Specifies project metadata.
  • Defines dependencies for both main and development environments.
  • Sets up code quality tools (mypy for type checking and ruff for linting/formatting).
  • Ensures the stack package from the infrastructure folder is correctly included.

CDK Configuration

Create a cdk.json file in the project root to configure the CDK project:

cdk.json
{
  "app": "python3 infrastructure/app.py",
  "context": {
    "@aws-cdk/aws-apigateway:disableCloudWatchRole": true,
    "@aws-cdk/aws-apigateway:usagePlanKeyOrderInsensitiveId": true,
    "@aws-cdk/core:stackRelativeExports": true,
    "@aws-cdk/aws-lambda:recognizeVersionProps": true,
    "@aws-cdk/aws-lambda:recognizeLayerVersion": true,
    "@aws-cdk/core:checkSecretUsage": true,
    "@aws-cdk/aws-iam:minimizePolicies": true,
    "@aws-cdk/core:enablePartitionLiterals": true,
    "@aws-cdk/core:target-partitions": ["aws", "aws-cn"]
  }
}

This configuration:

  • Specifies infrastructure/app.py as the entry point for our CDK application.
  • Sets various CDK-specific options to optimise deployment and security.

With this setup complete, we’re ready to start defining our infrastructure stack. In the next section, we’ll create the app.py file and begin implementing our CDK stack.

The CDK stack

Our CDK stack will consist of four main resources:

  1. An HTTP API in API Gateway.
  2. A Lambda function running our Rust code.
  3. A Cognito user pool for API authentication.
  4. The DynamoDB table used for persistence.

Let’s focus on the most interesting aspects of this setup, starting with the API Gateway.

API Gateway Configuration

Create a new file infrastructure/stack/api.py with the following content:

infrastructure/stack/api.py
from aws_cdk.aws_apigatewayv2 import (
    CorsHttpMethod,
    CorsPreflightOptions,
    HttpApi,
    HttpMethod,
)
from aws_cdk.aws_apigatewayv2_authorizers import HttpJwtAuthorizer
from aws_cdk.aws_apigatewayv2_integrations import HttpLambdaIntegration
from aws_cdk.aws_cognito import UserPool, UserPoolClient
from aws_cdk.aws_lambda import IFunction
from constructs import Construct

ALLOWED_HEADERS = ["Authorization", "Content-Type"]
ALLOWED_METHODS = [
    CorsHttpMethod.GET,
    CorsHttpMethod.OPTIONS,
    CorsHttpMethod.POST,
]


class ServerlessApi(Construct):
    def __init__(
        self,
        scope: Construct,
        construct_id: str,
        *,
        handler: IFunction,
        user_pool: UserPool,
        user_pool_client: UserPoolClient,
    ) -> None:
        super().__init__(scope, construct_id)

        self._api = self.build_http_api()
        authorizer = self.build_authorizer(user_pool, user_pool_client)
        self.setup_lambda_integration(self._api, handler, authorizer)

    @property
    def endpoint_url(self) -> str:
        return self._api.api_endpoint

    def build_http_api(self) -> HttpApi:
        cors_options = CorsPreflightOptions(
            allow_headers=ALLOWED_HEADERS, allow_methods=ALLOWED_METHODS
        )
        return HttpApi(self, "ServerlessRustApi", cors_preflight=cors_options)

    @staticmethod
    def build_authorizer(
        user_pool: UserPool, user_pool_client: UserPoolClient
    ) -> HttpJwtAuthorizer:
        issuer = f"https://cognito-idp.{user_pool.env.region}.amazonaws.com/{user_pool.user_pool_id}"
        return HttpJwtAuthorizer(
            "JwtAuthorizer", issuer, jwt_audience=[user_pool_client.user_pool_client_id]
        )

    @staticmethod
    def setup_lambda_integration(
        api: HttpApi, handler: IFunction, authorizer: HttpJwtAuthorizer
    ) -> None:
        integration = HttpLambdaIntegration("LambdaIntegration", handler)
        api.add_routes(
            path="/{proxy+}",
            methods=[HttpMethod.GET, HttpMethod.POST],
            authorizer=authorizer,
            integration=integration,
        )

The most noteworthy part is how we integrate the Lambda function:

integration = HttpLambdaIntegration("LambdaIntegration", handler)
api.add_routes(
    path="/{proxy+}",
    methods=[HttpMethod.GET, HttpMethod.POST],
    authorizer=authorizer,
    integration=integration,
)

By using the /{proxy+} path, we’re configuring API Gateway to send all requests to our Lambda function, regardless of the specific path. This allows our Rust code to handle routing internally.
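
To make that concrete, here is a hedged, self-contained Rust sketch of the kind of internal routing Poem performs. The stand-in Api type and the /api and /docs prefixes are assumptions for illustration; in the real project this wiring lives in build_app from Part 2.

use poem::Route;
use poem_openapi::{payload::PlainText, OpenApi, OpenApiService};

// Stand-in API type; the real one lives in the api module from Part 2.
struct Api;

#[OpenApi]
impl Api {
    // A single endpoint so the sketch compiles on its own.
    #[oai(path = "/currencies", method = "get")]
    async fn list_currencies(&self) -> PlainText<String> {
        PlainText("[]".to_string())
    }
}

fn routes() -> Route {
    let service = OpenApiService::new(Api, "Currencies", "1.0");
    let docs = service.swagger_ui();

    Route::new()
        // /api/currencies is resolved here, not by API Gateway.
        .nest("/api", service)
        // The Swagger UI is reached through the same /{proxy+} route.
        .nest("/docs", docs)
}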

In the next sections, we’ll see how to create the Lambda function and Cognito user pool that this API construct depends on.

The stack construct with additional resources

The API construct we defined earlier, along with the remaining resources, are all implemented in infrastructure/stack/stack.py. Let’s examine this file and its key components:

infrastructure/stack/stack.py
import pathlib

from aws_cdk import CfnOutput, Stack
from aws_cdk.aws_lambda import Architecture, Code, Function, Runtime
from aws_cdk.aws_cognito import (
    UserPool,
    StandardAttributes,
    StandardAttribute,
    AuthFlow,
    UserPoolClient,
)
from aws_cdk.aws_dynamodb import Table, Attribute, AttributeType
from constructs import Construct

from stack.api import ServerlessApi


class ServerlessRustStack(Stack):
    def __init__(self, scope: Construct, construct_id: str) -> None:
        super().__init__(scope, construct_id)

        database_table = self.create_database_table()
        handler = self.create_function(
            "serverless", table_name=database_table.table_name
        )
        database_table.grant_read_write_data(handler)

        user_pool, user_pool_client = self.create_user_pool()
        api = ServerlessApi(
            self,
            "ServerlessApi",
            handler=handler,
            user_pool=user_pool,
            user_pool_client=user_pool_client,
        )

        CfnOutput(
            self,
            "ApiEndpointUrl",
            value=api.endpoint_url,
            description="The URL of the API endpoint.",
        )

    def create_database_table(self) -> Table:
        partition_key = Attribute(name="pk", type=AttributeType.STRING)
        return Table(
            self,
            "DatabaseTable",
            table_name="currencies",
            partition_key=partition_key,
        )

    def create_function(self, bin_name: str, *, table_name: str) -> Function:
        code_path = pathlib.Path.cwd() / "target" / "lambda" / bin_name
        code = Code.from_asset(code_path.as_posix())
        return Function(
            self,
            "ApiHandler",
            code=code,
            architecture=Architecture.ARM_64,
            runtime=Runtime.PROVIDED_AL2023,
            memory_size=256,
            handler="does-not-matter",
            environment={
                "TABLE_NAME": table_name,
            },
        )

    def create_user_pool(self) -> tuple[UserPool, UserPoolClient]:
        standard_attributes = StandardAttributes(
            given_name=StandardAttribute(required=True, mutable=True),
            family_name=StandardAttribute(required=True, mutable=True),
        )
        user_pool = UserPool(
            self,
            "UserPool",
            user_pool_name="rust-demo",
            self_sign_up_enabled=False,
            standard_attributes=standard_attributes,
        )
        auth_flows = AuthFlow(admin_user_password=True, user_srp=True)
        user_pool_client = user_pool.add_client("app-client", auth_flows=auth_flows)
        return user_pool, user_pool_client

This stack creates the following resources:

  1. A DynamoDB table.
  2. A Lambda function to host our Rust code.
  3. Cognito user pool and client.
  4. The API Gateway (using our previously defined ServerlessApi construct).

Let’s focus on the most interesting part: the Lambda function creation.

def create_function(self, bin_name: str, *, table_name: str) -> Function:
    code_path = pathlib.Path.cwd() / "target" / "lambda" / bin_name
    code = Code.from_asset(code_path.as_posix())
    return Function(
        self,
        "ApiHandler",
        code=code,
        architecture=Architecture.ARM_64,
        runtime=Runtime.PROVIDED_AL2023,
        memory_size=256,
        handler="does-not-matter",
        environment={
            "TABLE_NAME": table_name,
        },
    )

Key points about this Lambda function configuration:

  • Code location: The code_path points to the serverless directory within the target directory. This should match the binary target name defined in Cargo.toml.
  • Architecture: We’re using Architecture.ARM_64. While x86-64 is also supported, the architecture must match the platform for which the binary was built.
  • Runtime: We’re using the generic Amazon Linux 2023 runtime (Runtime.PROVIDED_AL2023). Older Amazon Linux versions are also compatible.
  • Memory: We’ve set the memory to 256 MB. Rust’s efficiency allows it to run well even with as little as 128 MB, but the ideal size depends on your specific use case.
  • Environment variables: We pass the DynamoDB table name as the TABLE_NAME environment variable, which is used in the settings module of our Rust code.
  • Handler: The handler parameter is set to “does-not-matter” because custom runtimes (like our Rust binary) don’t use this parameter.

The CDK code assumes that the Rust binary has been compiled beforehand. While there are CDK construct libraries that handle the compilation step during deployment, separating the build step from CDK deployments can be advantageous:

  • It simplifies the CDK code and deployment process.
  • It allows for more flexible build processes, especially in CI/CD pipelines.
  • It ensures that the exact same binary is used across different environments.

To deploy successfully, ensure that your build process compiles the Rust code and places the binary in the correct location (target/lambda/serverless) before running the CDK deployment.

Deploying the stack

Now that we’ve fully defined our stack, let’s create the entrypoint for our CDK application. We’ll add the app.py file that we previously referenced in cdk.json.

infrastructure/app.py
#!/usr/bin/env python3
import aws_cdk as cdk

from stack.stack import ServerlessRustStack

app = cdk.App()
ServerlessRustStack(app, "ServerlessRustStack")
app.synth()

With our infrastructure code complete, we can now deploy our stack. Here’s a step-by-step guide to ensure a successful deployment:

  1. Compile Rust code: Ensure your Rust code is compiled for the Lambda environment. The binary should be in the target/lambda/serverless directory.

  2. Set up AWS credentials: Make sure your AWS credentials are properly configured. You can do this by setting environment variables or using the AWS CLI’s configure command.

  3. Deploy the stack: Run the following command from the project root (the directory containing cdk.json):

    Terminal window
    poetry run cdk deploy
  4. Review outputs: After deployment, CDK will display important information like the API endpoint URL. Make note of these outputs for future use.

Interacting with your serverless API

Now that your API is deployed, let’s explore how to interact with it. The CDK stack output includes the API endpoint URL, which you’ll use to make requests to your API. The API endpoint is automatically generated by AWS API Gateway. If you need a custom domain, you can easily adjust the CDK stack.

I’ve added Cognito authentication to the stack so the API is not wide open. This means that each request to the API must include an Authorization token in the headers.

In case you’re not familiar with Cognito, I’ve included some helper scripts in utilities/ for reference. These can assist you with creating a test user and retrieving a token.

Terminal window
TOKEN=$(./utilities/get-token.sh "<username>" "<password>")

Once you have the authentication token, you can start interacting with your API. Here’s an example of how to create a new currency using a curl command:

Terminal window
curl -H "Authorization: $TOKEN" \
  -H "Content-Type: application/json" \
  -X POST \
  -d '{"code": "GBP", "name": "British Pounds", "symbol": "£"}' \
  "https://$API_ENDPOINT/api/currencies"

Make sure to replace $API_ENDPOINT with the actual endpoint URL provided in the CDK stack output.

Final words

Building a serverless Rust API with AWS CDK opens up a world of possibilities for creating efficient, scalable, and cost-effective applications. Throughout this guide, we’ve explored the entire process from setting up the development environment to deploying and interacting with a fully functional API.

The complete code is on GitHub.

This series used DynamoDB as a convenient database to spin up on AWS. Many services are better off using a relational database as their primary storage. If you are curious about using Postgres in Rust, check out the series about End-to-end type safety with Remix and Rust.

David Steiner

I'm a software engineer and architect focusing on performant cloud-native distributed systems.
