AWS Open Source Blog
Rust Runtime for AWS Lambda
AWS Lambda, which makes it easy for developers to run code for virtually any type of application or backend service with zero administration, has just announced the Runtime APIs. The Runtime APIs define an HTTP-based specification of the Lambda programming model which can be implemented in any programming language. To accompany the API launch, we have open sourced a runtime for the Rust language. If you’re not familiar with Rust, it’s a programming language for writing and maintaining fast, reliable, and efficient code.
The new Rust runtime makes it easy to start a Rust function that implements our Handler type: pub type Handler<E, O> = fn(E, Context) -> Result<O, HandlerError>. To serialize and deserialize events and responses, the runtime relies on Serde. Here's an example:
#[macro_use]
extern crate lambda_runtime as lambda;
#[macro_use]
extern crate serde_derive;
#[macro_use]
extern crate log;
extern crate simple_logger;

use lambda::error::HandlerError;
use std::error::Error;

#[derive(Deserialize, Clone)]
struct CustomEvent {
    #[serde(rename = "firstName")]
    first_name: String,
}

#[derive(Serialize, Clone)]
struct CustomOutput {
    message: String,
}

fn main() -> Result<(), Box<dyn Error>> {
    simple_logger::init_with_level(log::Level::Info)?;
    lambda!(my_handler);
    Ok(())
}

fn my_handler(e: CustomEvent, c: lambda::Context) -> Result<CustomOutput, HandlerError> {
    if e.first_name == "" {
        error!("Empty first name in request {}", c.aws_request_id);
        return Err(c.new_error("Empty first name"));
    }
    Ok(CustomOutput {
        message: format!("Hello, {}!", e.first_name),
    })
}
Creating, Building, and Deploying a Rust Function
To get started, we suggest using Cargo, Rust’s build tool and package manager, to create and build your new project:
$ cd MY_WORKSPACE
$ cargo new my_lambda_function --bin
Cargo automatically creates the folder for the new project and a Cargo.toml file in the project root. Open the Cargo.toml file and add the lambda_runtime crate to the [dependencies] section:
[dependencies]
lambda_runtime = "0.1"
Additionally, we'll need a few more dependencies: Serde for (de)serializing events, and log and simple_logger to emit logs.
serde = "^1"
serde_json = "^1"
serde_derive = "^1"
log = "^0.4"
simple_logger = "^1"
There's one more setting to make in the Cargo.toml file. When configured to use a custom runtime with the Runtime APIs, AWS Lambda expects the deployment package to contain an executable file called bootstrap. We can configure Cargo to generate a file called bootstrap, regardless of the name of our crate. First, in the [package] section of the file, add an autobins = false setting. Then, at the bottom of the Cargo.toml, add a new [[bin]] section:
[[bin]]
name = "bootstrap"
Our completed Cargo.toml file should look like this:
[package]
name = "my_lambda_function"
version = "0.1.0"
authors = ["me <my_email@my_server.com>"]
autobins = false
[dependencies]
lambda_runtime = "^0.1"
serde = "^1"
serde_json = "^1"
serde_derive = "^1"
log = "^0.4"
simple_logger = "^1"
[[bin]]
name = "bootstrap"
path = "src/main.rs"
Next, open the main.rs file that Cargo created in the src folder of your project. Copy the content from the basic example above and paste it into the file, replacing the stub main function created by Cargo. With the new source in place, we are almost ready to build and deploy our Lambda function.
Before we launch our build, we need to make sure that the Rust compiler is targeting the correct platform. AWS Lambda executes your function in an Amazon Linux environment. Unless you are already running this tutorial on a 64-bit x86 Linux environment, we'll need to add a new target for the Rust compiler, and we can use the rustup tool to make this easier. Follow the instructions below to compile our basic example on Mac OS X.
Compiling on Mac OS X
First, install rustup if you don't already have it. Then, add the x86_64-unknown-linux-musl target:
$ rustup target add x86_64-unknown-linux-musl
Before we build the application, we'll also need to install a linker for the target platform. Fortunately, the musl-cross tap from Homebrew provides a complete cross-compilation toolchain for Mac OS.
$ brew install filosottile/musl-cross/musl-cross
Now we need to inform Cargo that our project should use the newly installed linker when building for the x86_64-unknown-linux-musl platform. Create a new directory called .cargo in your project folder and a new file called config inside the new folder.
$ mkdir .cargo
$ echo '[target.x86_64-unknown-linux-musl]
linker = "x86_64-linux-musl-gcc"' > .cargo/config
On my system, some of the dependencies did not pick up the configured linker automatically and tried to use musl-gcc anyway. To get around this quickly, I simply created a symlink to the new linker:
$ ln -s /usr/local/bin/x86_64-linux-musl-gcc /usr/local/bin/musl-gcc
With the new target platform for the compiler installed and configured, we can now have Cargo cross-compile!
Building the Function
Use the command below to build the application for AWS Lambda. If you are running these commands on Amazon Linux, you won't have to include the --target option.
$ cargo build --release --target x86_64-unknown-linux-musl
We create a release build rather than a debug build. Debug builds are very large and, although this basic example stays within the limit, other applications may well exceed the maximum deployment package size for an AWS Lambda function.
The build process creates an executable file at ./target/x86_64-unknown-linux-musl/release/bootstrap. Lambda expects the deployment package to be a zip file. Run the following command to create a deployment zip file for AWS Lambda:
$ zip -j rust.zip ./target/x86_64-unknown-linux-musl/release/bootstrap
To simplify the development and build process, we'll be adding a Cargo builder to the SAM CLI (Serverless Application Model). When that release of the SAM CLI is out, you'll be able to simply run sam build with a SAM template.
Deploying the Function on AWS Lambda
We can now deploy this file to AWS Lambda. Navigate to the AWS Lambda console and create a new function.
Leave the Author from scratch option selected and give your function a name (I called mine test-rust). Next, from the Runtime dropdown, select Provided. Our sample function doesn't require any special permissions. You can select an existing role if you already have a basic execution role, or ask the Lambda console to create a new one with basic permissions (you don't have to pick a template). Finally, click Create function.
In the function screen, use the Upload button in the Function code section to upload the rust.zip file that we created in the build step of this tutorial. With the new file selected, Save the changes to the function. We do not need to make any other configuration changes.
Because our code is entirely contained within the bootstrap executable that Lambda will start, the Handler information is not needed. 128 MB of memory and a 3-second execution timeout are sufficient headroom for a "Hello, world." When the function returns an error, the runtime can optionally include the full stack trace in the output of the function. To enable this, simply set the RUST_BACKTRACE environment variable to 1.
We can now test our function. In the Lambda console, click the Test button on the top right. Since this is the first time we are testing this function, the Lambda console asks us to define a test event. In the sample code above, you might have noticed that we expect a firstName property in the incoming event. Use the JSON below as the test event and give your test event a name.
{
    "firstName": "Rustacean"
}
Finally, click Create in the test event modal window. With the new test event saved, click Test again on the top right of the console to actually start the function. Expand the “execution result” section to take a look at the function output and the log.
Congratulations! You have now built and deployed your first AWS Lambda function written in Rust. Next, try to deploy this function using a Serverless Application Model (SAM) template.
Code Deep Dive
Now that we have a Rust Lambda function running, let’s break down the sample code into its most important components. Starting from the top, we import our crates:
#[macro_use]
extern crate lambda_runtime as lambda;
#[macro_use]
extern crate serde_derive;
#[macro_use]
extern crate log;
extern crate simple_logger;
The first crate we import is lambda_runtime. This is our new runtime, which we rename to lambda for the sake of conciseness. You might also notice the #[macro_use] attribute: this tells the Rust compiler that we are importing macros from the crate. You won't need to do this for much longer, as the Rust 2018 edition allows macros to be imported like normal functions or values. The runtime defines a lambda! macro that makes it easy to bootstrap the runtime.
The serde_derive crate also uses macros to generate marshallers for a given struct. You'll notice that the structs in the sample code are annotated with #[derive(Serialize)] or #[derive(Deserialize)], which generate the code to serialize and deserialize them, respectively.
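To see what these derives give us in isolation, here is a minimal, standalone sketch; it is separate from the Lambda function and only assumes the serde_derive and serde_json crates we already added to Cargo.toml. It round-trips the sample's structs through serde_json:
#[macro_use]
extern crate serde_derive;
extern crate serde_json;

#[derive(Deserialize, Clone)]
struct CustomEvent {
    // the #[serde(rename = ...)] attribute maps the camelCase JSON key
    // to the snake_case Rust field
    #[serde(rename = "firstName")]
    first_name: String,
}

#[derive(Serialize, Clone)]
struct CustomOutput {
    message: String,
}

fn main() -> Result<(), serde_json::Error> {
    // deserialize the same JSON we use as the Lambda test event
    let event: CustomEvent = serde_json::from_str(r#"{"firstName": "Rustacean"}"#)?;
    // serialize a response struct back into JSON
    let output = serde_json::to_string(&CustomOutput {
        message: format!("Hello, {}!", event.first_name),
    })?;
    println!("{}", output); // prints {"message":"Hello, Rustacean!"}
    Ok(())
}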
The library uses the macros defined by the log crate to produce log messages. The sample code includes the simple_logger crate to print messages to stdout. There are many crates that implement the log facade, and the runtime itself is not opinionated on which one you should pick.
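For example, swapping simple_logger for another implementation of the log facade only changes the initialization in main; the env_logger crate below is one hypothetical alternative (it would need to be added to [dependencies]), and the info!/error! macros stay exactly the same:
#[macro_use]
extern crate log;
extern crate env_logger; // hypothetical replacement for simple_logger

fn main() {
    // env_logger reads its log level from the RUST_LOG environment variable
    env_logger::init();
    info!("logger initialized");
}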
After the extern and use statements, our sample code declares the main() function, the entry point of our bootstrap executable. This is the code that will run when Lambda starts our function.
fn main() -> Result<(), Box<dyn Error>> {
    simple_logger::init_with_level(log::Level::Info)?;
    lambda!(my_handler);
    Ok(())
}
The first thing we do is initialize the simple_logger and set the logging level to Info. You can change this to Debug or Trace to receive more information on what the library and its dependencies are doing behind the scenes. Be aware that the simple_logger crate takes a lock on stdout, so logging in debug or trace mode will slow down your function considerably.
Next, we use the lambda!() macro defined in the lambda_runtime crate to bootstrap our custom runtime. In its most basic form, the macro takes a pointer to the handler function defined in your code. The custom runtime uses the hyper library to make HTTP requests to the Lambda Runtime APIs. You can optionally pass your own tokio runtime to the lambda!() macro:
let rt = tokio::runtime::Runtime::new()?;
lambda!(my_handler, rt);
You would want to create a custom Tokio runtime in cases where other libraries you use would otherwise create their own runtime. You can see a full, working example here.
With this, the custom runtime launches and begins polling the Lambda Runtime APIs for new events.
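Putting those pieces together, a main function that supplies its own Tokio runtime could look roughly like the sketch below. This is a sketch only: the pass-through handler and the serde_json event type are placeholders to keep it short, and it assumes the tokio crate has been added to Cargo.toml.
#[macro_use]
extern crate lambda_runtime as lambda;
extern crate serde_json;
extern crate tokio;

use lambda::error::HandlerError;
use std::error::Error;

// placeholder handler that echoes the event back; in practice this would be
// the my_handler function from the sample above
fn echo_handler(e: serde_json::Value, _c: lambda::Context) -> Result<serde_json::Value, HandlerError> {
    Ok(e)
}

fn main() -> Result<(), Box<dyn Error>> {
    // build our own Tokio runtime instead of letting the lambda! macro create one
    let rt = tokio::runtime::Runtime::new()?;
    // pass both the handler and the runtime to the macro
    lambda!(echo_handler, rt);
    Ok(())
}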
The next section of the code defines the handler function. The handler function must match the Handler type defined in the lambda_runtime crate.
fn my_handler(e: CustomEvent, c: lambda::Context) -> Result<CustomOutput, HandlerError> {
    if e.first_name == "" {
        error!("Empty first name in request {}", c.aws_request_id);
        return Err(c.new_error("Empty first name"));
    }
    Ok(CustomOutput {
        message: format!("Hello, {}!", e.first_name),
    })
}
The handler function receives an event object that implements the serde::Deserialize trait. The custom runtime also generates a Context object for each event and passes it to the handler. The Context object contains the same properties you'd find in the official runtimes.
The return value of the handler must be a Result with a custom output type that implements the serde::Serialize trait. Additionally, the custom runtime library specifies a HandlerError type that you can use to wrap custom errors. You can use the new_error(msg: &str) method on the Context object to instantiate a new HandlerError with a backtrace. The custom runtime knows how to serialize a HandlerError to JSON and includes the backtrace if the RUST_BACKTRACE environment variable is set to 1.
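As a quick illustration, the hypothetical handler below (the event type and its fields are made up for this sketch and are not part of the sample) wraps a parsing failure in a HandlerError through the Context's new_error method; it assumes the same crate imports as the sample at the top of this post:
// Hypothetical event: the caller sends a repeat count as a string
#[derive(Deserialize, Clone)]
struct GreetingEvent {
    #[serde(rename = "firstName")]
    first_name: String,
    repeat: String,
}

#[derive(Serialize, Clone)]
struct GreetingOutput {
    message: String,
}

fn greeting_handler(e: GreetingEvent, c: lambda::Context) -> Result<GreetingOutput, HandlerError> {
    // wrap the parse failure in a HandlerError (with backtrace) via Context::new_error
    let repeat: usize = e.repeat.parse().map_err(|err| {
        error!("Bad repeat value in request {}: {}", c.aws_request_id, err);
        c.new_error(&format!("repeat must be a number: {}", err))
    })?;

    Ok(GreetingOutput {
        message: format!("Hello, {}!", e.first_name).repeat(repeat),
    })
}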
Conclusion
This runtime is still in the early stages, and we'd love your feedback on its evolution. We're also aware of existing Rust for Lambda libraries like lando, rust-aws-lambda, and rust-crowbar, and we'd like to thank those projects' respective authors for their work and inspiration.
The runtime makes it easy to write highly performant Lambda functions in Rust. Visit our aws-lambda-rust-runtime GitHub repository to get involved in the project and submit feedback and issues!
Rust logo courtesy of Mozilla, used under the terms of Creative Commons Attribution license (CC-BY).