How to deploy Lambda Container Image using Serverless Framework
Create a simple python based AWS Lambda Container and deploy it on AWS using Serverless Framework.

Let's get started!
There are three steps involved in deploying a Lambda container image on AWS:
  1. Writing the Lambda function in Python
  2. Creating a Dockerfile
  3. Creating a serverless.yml file and deploying the function

Writing Lambda Code

For this tutorial, let's write a simple Python file and name it app.py:

def handler(event, context):
    print("Hello From Inside the Lambda Function")
    return event
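Before building any image, the handler can be sanity-checked locally with plain Python. The function is redefined inline here so the snippet runs on its own:

```python
# app.py's handler simply echoes the incoming event back to the caller.
def handler(event, context):
    print("Hello From Inside the Lambda Function")
    return event

# Invoke it locally with a sample event; context is unused here, so None is fine.
event = {"name": "test"}
result = handler(event, None)
print(result)  # {'name': 'test'}
```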
Creating Dockerfile

Let's create a Dockerfile in the same directory:
# Pull base lambda python3.8 docker image
FROM public.ecr.aws/lambda/python:3.8

# Copy python requirements file
COPY requirements.txt .

# Install the python packages
RUN pip install -r requirements.txt

# Copy function code
COPY app.py ${LAMBDA_TASK_ROOT}

# Set the CMD to your handler (could also be done as a parameter override outside of the Dockerfile)
CMD [ "app.handler" ]
Explanation

  1. Use the AWS-provided Python 3.8 Lambda image from the Amazon ECR Public Gallery as the base image.
  2. Copy requirements.txt if there are external packages that need to be installed. (Optional)
  3. Install the requirements using pip. (Optional)
  4. Copy the Lambda code we wrote above into the Lambda task root directory.
  5. Set the command to run the handler function.
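If you want to test the container before deploying, the AWS-provided base images ship with the Lambda Runtime Interface Emulator, so the image can be built and invoked locally. A sketch, assuming Docker is installed and the files above are in the current directory (test-app is an illustrative tag):

```shell
# Build the image from the Dockerfile above.
docker build -t test-app .

# Run it; the Runtime Interface Emulator listens on port 8080 inside the container.
docker run -p 9000:8080 test-app

# In a second terminal, POST a sample event to the local invoke endpoint.
curl -XPOST "http://localhost:9000/2015-03-31/functions/function/invocations" \
  -d '{"key": "value"}'
```

The curl call should return the same event back, since our handler just echoes its input.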


Creating the Serverless file

We will create a serverless.yml file in the same directory that holds the Lambda configuration:
service: test-app
frameworkVersion: '2'

provider:
  name: aws
  stage: dev
  region: us-east-1
  ecr:
    images:
      latest:
        path: ./

functions:
  print-hello:
    image:
      name: latest


Final directory Structure

.
├── app.py
├── Dockerfile
├── requirements.txt
└── serverless.yml


Deploying the code

In the terminal, go to the code directory above and run the command below.
sls deploy
This will first build the Docker image and push it to Amazon ECR. It will then deploy the Lambda function, configuring it to use the image we just pushed.
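Once the deploy finishes, the function can be invoked straight from the Serverless CLI to confirm it works. A quick sketch, using the print-hello function name from serverless.yml above:

```shell
# Invoke the deployed function with a sample event.
sls invoke -f print-hello --data '{"key": "value"}'

# Fetch recent logs to see the print statement from the handler.
sls logs -f print-hello
```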


How did we use the model?

We had to run real-time inference on AWS Lambda, and for that we used the [pytorch](https://pytorch.org/) library, which is around 700 MB in size. That is well beyond the 250 MB limit for a regular zip-based Lambda deployment package, while container images can be up to 10 GB, so we came up with this approach of deploying the Lambda container.


Summing up!

In this article, I have demonstrated the simplest workflow for deploying a Python-based Lambda container on AWS. You can also use your own base image to run an AWS Lambda container. You can find that tutorial here.

Hit me up on my email at gaurav@appliedaiconsulting.com if you have any questions.

Thanks for reading!


https://appliedaiconsulting.com/