Front-End Web & Mobile
Zero-effort Container deployment for GraphQL and REST APIs and Web Hosting with Amplify CLI
AWS Amplify is the fastest and easiest way to build cloud-powered mobile and web apps on AWS. Amplify comprises a set of tools and services that enables front-end web and mobile developers to leverage the power of AWS services to build innovative and feature-rich applications.
With today’s Amplify CLI release, we’re enabling front-end web and mobile customers to deploy their API (GraphQL & REST) or host their web apps using containers. You can bring your own Dockerfile or Docker Compose and Amplify CLI will automatically build, package and deploy your containers using AWS Fargate.
Benefits:
- Portability of your app backend – Amplify CLI provides simple container templates to get started with, or you can bring your own containers if your team already uses containers for APIs and hosting.
- Out-of-the-box infrastructure setup for your container deployment pipeline – Amplify CLI manages infrastructure such as VPC, subnets, NACLs, IAM policies, and other security and infrastructure practices with zero prior knowledge of AWS required. Networking between containers is handled automatically for you, as is SSL certificate generation for hosted sites.
- Zero-effort build & deployment pipeline creation – Amplify CLI creates a CodePipeline to build and deploy your images. The pipeline has cost optimization best practices such as lifecycle policies on build artifacts and images. Docker doesn’t even need to be installed on your system to build and deploy to AWS.
What we’ll build:
- First, an ExpressJS server that returns a random number
- Second, an ExpressJS server that runs a FizzBuzz algorithm against a Python/Flask random number generator server.
Prerequisites:
- Install the latest Amplify CLI version. Open a terminal and run the following to update to the latest Amplify CLI:
npm install -g @aws-amplify/cli
- Amplify CLI is already configured. If you haven’t configured the Amplify CLI yet, follow this guide on our documentation page.
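If you still need to set up the CLI, configuration typically starts with:
amplify configure
which walks you through signing in to the AWS Console, creating an IAM user, and storing a local AWS profile for the CLI to use.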
Setup a new Amplify project
Run the following commands to create a new Amplify project called “amplify-containerized”, or skip to the next section if you already have an existing Amplify project.
mkdir amplify-containerized
cd amplify-containerized
Initialize an Amplify project by running:
amplify init
For the purposes of this blog post, you can just accept all the default values in the amplify init workflow.
Enable container based deployments
Container-based deployments need to be explicitly toggled on. Run the following command to review your project configuration:
amplify configure project
Accept the defaults and answer “Yes” when asked if you want to enable container-based deployments:
...
? Do you want to enable container-based deployments? Yes
Add a new container-based ExpressJS API
Amplify CLI maintains the same developer experience (DX) as existing API workflows. Once container-based deployments are enabled, you gain the ability to select “REST” → “API Gateway + AWS Fargate (Container-based)” during the amplify add api workflow.
Amplify CLI supports both GraphQL and REST API options for container based deployments. You can use container-based deployments alongside existing AppSync and API Gateway + Lambda options. For our demo, let’s create a REST API.
To create our first container-based REST API, run the following command:
amplify add api
Choose the following options:
? Please select from one of the below mentioned services:
> REST
? Which service would you like to use
> API Gateway + AWS Fargate (Container-based)
? Provide a friendly name for your resource to be used as a label for this category in the project:
> containerb5734e35
? What image would you like to use
> ExpressJS - REST template
? When do you want to build & deploy the Fargate task
> On every "amplify push" (Fully managed container source)
? Do you want to restrict API access
> No
After successful completion of the CLI workflow, you’ll see these new files added to your project folder structure:
amplify/backend/api/<your-api-name>
├── amplify.state
├── containerb5734e35-cloudformation-template.json
├── parameters.json
└── src
├── Dockerfile
├── DynamoDBActions.js
├── buildspec.yml
├── index.js
├── package-lock.json
└── package.json
In src/index.js you’ll find starter ExpressJS source code that lets you interact with DynamoDB. Let’s edit it to return a random number.
Replace the index.js
file with the following code:
const express = require("express");
const bodyParser = require('body-parser');

const port = process.env.PORT || 3001;
const app = express();

app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));

// Enable CORS for all methods
app.use(function (req, res, next) {
  res.header("Access-Control-Allow-Origin", "*");
  res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
  next();
});

app.get("/", async (req, res, next) => {
  try {
    res.contentType("application/json").send({
      "randomNumber": Math.floor(Math.random() * 101)
    });
  } catch (err) {
    next(err);
  }
});

app.listen(port, () => {
  console.log('Example app listening at http://localhost:' + port);
});
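Optionally, before pushing, you can sanity-check the server locally (assuming Node.js and npm are installed on your machine) by installing its dependencies and running it from the src folder:
cd amplify/backend/api/<your-api-name>/src
npm install
node index.js
curl http://localhost:3001/
The curl call should return a small JSON payload with a randomNumber field.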
Now let’s deploy your API:
amplify push
Here’s what’s happening under the hood:
Amplify creates APIs as an ECS Service to ensure that your application is monitored and tasks are in a healthy and active state, automatically recovering if an instance fails. When you make changes to your source code, the build and deployment pipeline takes your source code and Dockerfile/Docker Compose configuration as inputs. One or more containers are built in AWS CodeBuild using your source code and pushed to ECR with a build hash as a tag, allowing you to roll back deployments if something unexpected happens in your application code. After the build is complete, the pipeline performs a rolling deployment to launch AWS Fargate tasks automatically. Only when all new versions of the image are in a healthy and running state are the old tasks stopped. Finally, the build artifacts in S3 (in the fully managed scenario) and the ECR images are set with a 7-day lifecycle retention policy for cost optimization.
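If you prefer the terminal, you can also check the pipeline’s progress with the AWS CLI (assuming it’s installed and configured); the pipeline name is visible in the CodePipeline URL printed by “amplify push”:
aws codepipeline get-pipeline-state --name <your-pipeline-name>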
Test your new containerized ExpressJS API
The best way to demonstrate the containerized API is just by calling it with cURL. The API endpoint is printed at the end of the “amplify push” command or when you run “amplify status”.
curl https://<YOUR_API_ID>.execute-api.us-east-1.amazonaws.com/
Container deployments can take a bit longer to build and deploy, but after a few minutes you can verify availability by checking the CodePipeline URL printed at the beginning of the “amplify push” output, or by running “amplify console api”, selecting the API, and selecting “CodePipeline”.
Note: This is a simple use case just to showcase the workflow. Our ExpressJS template also provides out-of-the-box support to create a CRUD interface for a DynamoDB table. Review our documentation if you’re interested in that scenario.
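For a rough sense of what that looks like, here is an illustrative sketch only – not the template’s actual DynamoDBActions.js helper; the table name environment variable and the “id” key are assumptions for this example:
const AWS = require("aws-sdk");
const docClient = new AWS.DynamoDB.DocumentClient();
const tableName = process.env.STORAGE_TABLE_NAME; // hypothetical environment variable

// Fetch a single item by its id (assumes a table with an "id" partition key)
app.get("/items/:id", async (req, res, next) => {
  try {
    const result = await docClient
      .get({ TableName: tableName, Key: { id: req.params.id } })
      .promise();
    res.contentType("application/json").send(result.Item || {});
  } catch (err) {
    next(err);
  }
});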
Multi-container deployments
Amplify CLI fully relies on a Docker Compose configuration to enable multi-container deployments. Amplify automatically infers the Fargate and ECS settings based on your app’s Dockerfile or Docker Compose. Amplify also allows you to have inter-container networking based on the configured ports in your Docker Compose configuration.
To demonstrate that, let’s run amplify add api and select “REST” → “API Gateway + AWS Fargate (Container-based)” → “Docker Compose - ExpressJS + Flask template” to add a new multi-container API. This will create the following folder structure in your amplify/backend/api/<your-api-name>/ folder.
amplify/backend/api/<your-api-name>/
├── amplify.state
├── <your-api-name>-cloudformation-template.json
├── parameters.json
└── src
├── buildspec.yml
├── docker-compose.yml
├── express
│ ├── Dockerfile
│ ├── DynamoDBActions.js
│ ├── index.js
│ └── package.json
└── python
├── Dockerfile
├── requirements.txt
└── src
└── server.py
The top-level docker-compose.yml references the express server and the python server. Docker Compose provides a mechanism to deploy multiple containers at once. For more information on Docker Compose, please review the official Docker Compose guide.
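To give you a sense of its shape, a compose file for this template could look roughly like the sketch below; treat it as an illustration rather than the exact generated file (service names, build contexts, and ports may differ in your project):
version: "3.8"
services:
  express:
    build:
      context: ./express  # ExpressJS API listening on port 3000
    ports:
      - "3000:3000"
  python:
    build:
      context: ./python   # Flask random number service listening on port 5000
    ports:
      - "5000:5000"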
This time we’ll have the Python server return a random number, and the ExpressJS server will run a FizzBuzz algorithm based on the Python server’s random number. Let’s replace our server.py file with the following content:
from flask import Flask
from random import randrange

server = Flask(__name__)

@server.route('/random')
def hello():
    return str(randrange(100))

if __name__ == "__main__":
    server.run(host='0.0.0.0')
We created a Flask server with a /random route that returns a random number between 0 and 99.
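If you want to try the Flask service on its own first (optional, and assuming Python 3 with Flask installed locally), you can run it from the python folder and hit the route directly:
cd amplify/backend/api/<your-api-name>/src/python
pip install -r requirements.txt
python src/server.py
curl http://localhost:5000/random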
Now let’s edit the express server to interface with the Python server and then run the FizzBuzz algorithm. Start by replacing the content of the index.js
file:
const express = require("express");
const bodyParser = require('body-parser');
const http = require('http');

const port = process.env.PORT || 3000;
const app = express();

app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));

// Enable CORS for all methods
app.use(function (req, res, next) {
  res.header("Access-Control-Allow-Origin", "*");
  res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
  next();
});

app.get("/fizzbuzz", (req, res, next) => {
  // add networking code to Python server code here
});

app.listen(port, () => {
  console.log('Example app listening at http://localhost:' + port);
});
Then add the networking logic to interface with the Python server. When deployed to the cloud, you can interface with the other services by specifying the host as “localhost” and the “port” as configured in the docker-compose.yml. If you’re testing locally with the Docker CLI, reference the Python server by its Docker Compose container name.
Multiple containers are deployed as a single unit in Fargate (e.g. same Task Definition). This opinionated deployment allows ease of networking between containers on the local loopback interface and avoids extra configuration, costs, operations, and debugging.
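If you’d rather not hand-edit the host when switching between local Docker Compose and Fargate, one option is to read it from an environment variable; the PYTHON_HOST name below is purely an example, not something Amplify sets for you:
// Hypothetical helper: default to 'localhost' for the Fargate task,
// set PYTHON_HOST=python when running locally via Docker Compose.
const pythonHost = process.env.PYTHON_HOST || 'localhost';
You would then use host: pythonHost in the options object shown next.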
Add the following code right below the comment “// add networking code to Python server code here”:
const options = {
  port: 5000,
  host: 'localhost', // replace with 'python' for local development
  method: 'GET',
  path: '/random'
};

http.get(options, data => {
  var body = '';
  data.on('data', (chunk) => {
    body += chunk;
  });
  data.on('end', () => {
    console.log(body);
    const randomNumber = body;
    let fizzOrBuzz = '';
    // Add FizzBuzz logic code here
    try {
      res.contentType("application/json").send({
        "newRandomNumber": body,
        "fizzOrBuzz": fizzOrBuzz
      });
    } catch (err) {
      console.log(err);
      next(err);
    }
  }).on('error', (error) => {
    console.log(error);
  });
});
Last but not least, add the FizzBuzz algorithm and return the result to the API caller. Add the FizzBuzz logic below the “// Add FizzBuzz logic code here” comment:
if (randomNumber % 15 === 0) {
  fizzOrBuzz = 'FizzBuzz';
} else if (randomNumber % 3 === 0) {
  fizzOrBuzz = 'Fizz';
} else if (randomNumber % 5 === 0) {
  fizzOrBuzz = 'Buzz';
} else {
  fizzOrBuzz = randomNumber;
}
We’ve got our business logic completed! Let’s deploy our multi-container API by running the following command:
amplify push
Once successfully deployed, you can try to “cURL” the API. You should now be able to see the random number and FizzBuzz result returned:
❯ curl https://<YOUR_API_ID>.execute-api.us-east-1.amazonaws.com/fizzbuzz
{"newRandomNumber":"37","fizzOrBuzz":"37"}
❯ curl https://<YOUR_API_ID>.execute-api.us-east-1.amazonaws.com/fizzbuzz
{"newRandomNumber":"72","fizzOrBuzz":"Fizz"}
Success!
This blog post demonstrated a quick way to deploy single and multiple containers using Amplify CLI. There is so much more to serverless containers that we couldn’t cover in this blog post. This includes:
- GitHub trigger-based deployments
- Automatically securing an API with Amazon Cognito
- Hosting workflows for your web apps
- Multi-environment support
Look out for future blog posts on containers and review our documentation for more details.