Lambda For Containers

UNDER CONSTRUCTION

Introduction

The AWS Lambda service allows you to run small pieces of code in a "serverless" environment. This means I don't have to worry about the back-end resources; the code just runs.

AWS Lambda supports several programming languages, including Python, Java, Go, and Node.js. What it doesn't support is running a full container.

An alternative to this would be to run the container in Kubernetes. However, this is more overhead than I want. Furthermore, Kubernetes implies that there are EC2 instances running all the time. What I want is a service that only uses AWS resources when needed.

The option I settled on is to use ECS Fargate, which allows me to run containers on demand without worrying about the underlying server infrastructure. In this document I will describe how to set up a complete workflow using ECS Fargate, as well as other parts of the AWS environment, to deploy an ikiwiki web page.

The ikiwiki Workflow

Here is a picture of how the various services interact:

                                              DockerHub
                                                 |
                                                 v
Git --> GitHub --> AWS API --> AWS Lambda --> AWS ECS --> AWS S3

In more detail:

Edit ikiwiki markdown file
 --> Commit changes in Git repository
 --> Push changes to GitHub
 --> GitHub webhook makes call to AWS API Gateway
 --> AWS API Gateway makes a call to an AWS Lambda function
 --> The AWS Lambda function triggers a Task in AWS ECS Fargate
 --> AWS ECS Fargate pulls down a Docker Hub container
 --> The task in AWS ECS Fargate runs the container, which pulls the
     Git repository from GitHub, runs it through ikiwiki, and writes
     the output to AWS S3
 --> AWS S3 serves the content as static web content

We are using the following services:

API Gateway
AWS Lambda
AWS Elastic Container Service
AWS S3
AWS IAM (allows the various services to interact)
GitHub
Docker Hub

Putting it all together

Step 0. AWS Prerequisites

We assume that you have already created a VPC, a subnet in that VPC, and given that subnet a route to the outside internet via an Internet Gateway (or some other mechanism). You will need these when setting up the ECS task.

Step 1. Create the S3 bucket

The ikiwiki content is all static web pages, so we can serve it via an S3 bucket. Note that this means the content is publicly viewable and can be served only over HTTP, an unencrypted transport.

  1. Create a new bucket named ikiwiki.

  2. Configure the bucket so that it will serve its content as a static web site. I will discuss a few important points on this step, but for details on how to do this see the AWS document Example: Setting up a Static Website.

  3. Gotcha: When setting up an S3 bucket as a static web site, the official AWS instructions do not mention that before adding a Bucket Policy you must first enable the option to add bucket policies in the "Public access settings" for the bucket.

  4. Gotcha: In order to make your site publicly accessible, you also have to UNcheck the "Block public and cross-account access if bucket has public policies" in "Public access settings".
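
For reference, a minimal bucket policy that grants public read access might look like this (assuming the bucket is named ikiwiki, as above):

      {
          "Version": "2012-10-17",
          "Statement": [
              {
                  "Sid": "PublicReadGetObject",
                  "Effect": "Allow",
                  "Principal": "*",
                  "Action": "s3:GetObject",
                  "Resource": "arn:aws:s3:::ikiwiki/*"
              }
          ]
      }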

Step 2. Create IAM role allowing access to S3 bucket

We need an IAM role that we can assign to the container that allows the container to copy the ikiwiki output to the ikiwiki S3 bucket.

  1. Go to the IAM page.

  2. Create a new policy that allows listing, reading, and writing of the ikiwiki bucket. We will call this policy CustomAccessToIkiwikiS3Bucket.

      {
          "Version": "2012-10-17",
          "Statement": [
              {
                  "Sid": "VisualEditor0",
                  "Effect": "Allow",
                  "Action": "s3:ListBucket",
                  "Resource": "arn:aws:s3:::ikiwiki"
              },
              {
                  "Sid": "VisualEditor1",
                  "Effect": "Allow",
                  "Action": [
                      "s3:PutObject",
                      "s3:GetObject",
                      "s3:DeleteObject"
                  ],
                  "Resource": [
                      "arn:aws:s3:::ikiwiki-testing/*",
                      "arn:aws:s3:::ikiwiki/*"
                  ]
              }
          ]
      }
    
  3. Create a role that we can assign to our ECS Tasks so they can access the S3 buckets.

  4. Click the "Create role" button.

  5. Click on the "AWS service" rectangle.

  6. Under "Choose the service that will use this role" click on "Elastic Container Service".

  7. Under "Select your use case" choose "Elastic Container Service Task".

  8. Click the "Next: Permissions" button.

  9. Choose the CustomAccessToIkiwikiS3Bucket policy created above.

  10. Click "Next: Tags" and then "Next: Review".

  11. For "Role name" type in "roleECSAccessIkiwikiS3".

  12. Click "Create role".
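
Behind the scenes, choosing the "Elastic Container Service Task" use case gives the role a trust relationship that lets ECS tasks assume it. The resulting trust policy should look something like this:

      {
          "Version": "2012-10-17",
          "Statement": [
              {
                  "Effect": "Allow",
                  "Principal": { "Service": "ecs-tasks.amazonaws.com" },
                  "Action": "sts:AssumeRole"
              }
          ]
      }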

Step 3. Set up the ECS cluster and task

We next set up the ECS cluster and a task that will run the container for us. Our container lives in the public part of Docker Hub; if you don't want your container to be public you can put your container in the AWS Elastic Container Registry.

  1. Create a cluster in ECS called main using the "Networking Only (Powered by AWS Fargate)" selection.

  2. Click on the "Task Definitions" link and click "Create a new task definition".

  3. Give the new task the name "ikiwiki".

  4. For "Task Role" enter the role created above: "roleECSAccessIkiwikiS3".

  5. Leave "Network Mode" set to "awsvpc".

  6. Choose reasonable values for the resource requirement settings.

  7. Set up the container definition by putting in the path to the container you want to use. In our case, our container is in the public part of Docker Hub, so we enter macrotex/ikiwiki-build.

  8. Add other container settings including any environment variables.

  9. Test this task definition by choosing "Run Task".
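
The container does the actual build work. As a rough illustration, a build script of the kind a container like macrotex/ikiwiki-build runs could be sketched in Python as below; the repository URL, directory paths, and setup-file name are placeholders, not the actual contents of that image:

```python
# Hypothetical sketch of an ikiwiki build container's script.
# The repository URL, paths, and setup file below are placeholders.
import subprocess

REPO_URL = "https://github.com/example/ikiwiki-site.git"  # placeholder
SRC_DIR = "/tmp/src"        # where the wiki source is cloned
DEST_DIR = "/tmp/html"      # where ikiwiki writes the rendered pages
S3_TARGET = "s3://ikiwiki"  # the static-website bucket from Step 1

def build_commands(repo=REPO_URL, src=SRC_DIR, dest=DEST_DIR, target=S3_TARGET):
    """Return, in order, the commands the container runs on startup."""
    return [
        ["git", "clone", "--depth", "1", repo, src],
        ["ikiwiki", "--setup", src + "/ikiwiki.setup"],
        ["aws", "s3", "sync", dest, target, "--delete"],
    ]

def main():
    # This would be the container's ENTRYPOINT; each step must
    # succeed (check=True) before the next one runs.
    for cmd in build_commands():
        subprocess.run(cmd, check=True)
```

The task role from Step 2 is what lets the final `aws s3 sync` step write to the bucket without any stored credentials.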

Step 4. Create an AWS Lambda function to trigger the ECS task

We next create an AWS Lambda function that will call the ECS ikiwiki task we created above.

  1. In AWS Lambda click the "Create function" button.

  2. Choose the "Author from scratch" rectangle.

  3. Under "Name" type in ecs_run_task.

  4. Under "Runtime" choose Python 3.6.

  5. Under "Role" select "Choose an existing role".

  6. Under "Existing role" select "lambda_run_ecs_task".

  7. Create the function: see AWS Lambda Function to Trigger ECS on a GitHub Webhook for details on this step.
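
For a sense of what the function body can look like, here is a minimal sketch of the handler. It starts the ikiwiki task on the main cluster created above using boto3; the subnet ID is a placeholder you must replace with the subnet from Step 0:

```python
# Sketch of the ecs_run_task handler. The cluster and task names come
# from the steps above; the subnet ID is a placeholder.
def build_run_task_params(cluster="main", task="ikiwiki",
                          subnet="subnet-REPLACE_ME"):
    """Build the keyword arguments for ecs.run_task()."""
    return {
        "cluster": cluster,
        "taskDefinition": task,
        "launchType": "FARGATE",
        "count": 1,
        "networkConfiguration": {
            "awsvpcConfiguration": {
                # The subnet needs a route to the internet so the
                # task can pull the image from Docker Hub.
                "subnets": [subnet],
                "assignPublicIp": "ENABLED",
            }
        },
    }

def lambda_handler(event, context):
    import boto3  # provided by the Lambda runtime
    ecs = boto3.client("ecs")
    ecs.run_task(**build_run_task_params())
    return {"status": "task started"}
```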

Step 5. Create an API Gateway interface that receives the GitHub webhook

When we push a commit to the GitHub repository we want a webhook to call on the AWS API Gateway.

  1. Create a new API called "ikiwiki". Choose the "REST" protocol.

  2. Create a resource action with method "POST". Choose "Lambda Function" for the Integration Type.

  3. For the Lambda function type in "ecs_run_task".

  4. Deploy this API using the "Deploy API" action in the "Actions" drop-down. Use the name main-ikiwiki for the deployment name.

  5. Under the "Stages" section you will see your new deployment. Click on the deployment to get the deployment URL. You will need this URL for the GitHub integration. It will look something like

      Invoke URL:  https://qimm19ruwq.execute-api.us-east-1.amazonaws.com/main-ikiwiki/
    

Step 6. Configure GitHub to call the AWS API on repository changes

  1. Go into your GitHub repository settings and click on the "Webhooks" section.

  2. Click the "Add webhook" button.

  3. For the "Payload URL" enter the URL from the last step of the previous section.

  4. For "Content type" choose "application/json".

  5. Select "Enable SSL verification".

  6. For "Secret" type in a random string of characters. Remember this string as you will need it later.

Step 7. Add mapping template to the AWS API POST method

We need to pass three parameters from the GitHub webhook into Lambda.

  1. Go back to the ikiwiki AWS API and click on the POST method.

  2. You should see at the top of the page "POST - Method Execution" and four labelled boxes.

  3. Click on the "Integration Request" box.

  4. Scroll down to "Mapping Templates" and expand.

  5. Under "Request body passthrough" select "When there are no templates defined (recommended)".

  6. Click "Add mapping template".

  7. For "Content-Type" enter "application/json".

  8. Enter this JSON string (replacing "SECRET FROM GITHUB WEBHOOK" with the appropriate secret):

      {
          "x_hub_signature": "$util.escapeJavaScript($input.params().header.get('X-Hub-Signature'))",
          "secret":  "SECRET FROM GITHUB WEBHOOK",
          "payload": "$util.base64Encode($input.body)"
      }
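
With these three values in the event, the Lambda function can check that the request really came from GitHub by recomputing the HMAC-SHA1 signature of the payload and comparing it against the X-Hub-Signature header. A minimal sketch of that check:

```python
# Verify a GitHub webhook signature. GitHub signs the raw request
# body with HMAC-SHA1 using the shared secret from Step 6.
import base64
import hashlib
import hmac

def signature_is_valid(secret, payload_b64, x_hub_signature):
    """Compare GitHub's 'sha1=<hexdigest>' header against our own
    digest of the base64-encoded payload from the mapping template."""
    body = base64.b64decode(payload_b64)
    expected = "sha1=" + hmac.new(secret.encode(), body,
                                  hashlib.sha1).hexdigest()
    # compare_digest avoids leaking information through timing
    return hmac.compare_digest(expected, x_hub_signature)
```

If the check fails, the handler should return without starting the ECS task, since anyone who discovers the Invoke URL can POST to it.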
    

Timings: From Git push to updated content available in S3 takes about 2 minutes.

Cost

The AWS services used are all "pay-as-you-go": the less you use them, the lower the cost, all the way down to zero. Here is an estimate of the monthly cost of the ikiwiki service described above, assuming a few dozen pushes per month:

Service (price)                              Our cost
AWS S3 ($0.02/GB)                            $0.02
AWS API Gateway ($3.50 per million calls)    $3.50 (?)
AWS ECS Fargate                              minimal
AWS Lambda (in free tier)                    free