Serverless Deep Learning – AWS

What will be covered in this post:

This week we will create a clothes classification service in the cloud that identifies images we send it.

We will use AWS Lambda to serve our model: we send an image URL to Lambda, and it returns the predicted class.

We will take our previous clothes classification model, convert it to TensorFlow Lite, package it with Docker, and then upload the image so it can be served from AWS.

AWS Lambda

Lambda allows us to run code on AWS without provisioning or managing servers, hence the name “serverless.” First, we will create a new function inside Lambda and choose the “Author from scratch” option. Next, we will give our function a name, select the runtime language and architecture we want to use, and then hit “Create function.”


AWS Lambda

Once the function is created, we will see a screen with information about our function. Since we chose Python as our runtime language, the function contains “lambda_function.py,” which we will modify for our demonstration. For now, we will have our function print the event and return “PONG” when it is triggered.
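The hello-world handler described above can be sketched like this (the event shape is simply whatever the test event sends; nothing here is specific to our model yet):

```python
def lambda_handler(event, context):
    # Log the incoming event so it appears in the CloudWatch logs
    print("event:", event)
    return "PONG"
```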

Now we will click the “Test” button, name our test event “test,” and save it. Next, we have to click the “Deploy” button for our changes to take effect. When we click “Test” again, we will see the response.


Lambda Test Results

Using TensorFlow Lite allows us to reduce our storage cost and speeds up initialization.

TensorFlow Lite focuses only on inference or, in other words, prediction. Therefore, TensorFlow Lite can’t be used to train a model.

The first step for this process will be to load a previously trained model to deploy. This Jupyter notebook shows the steps to convert a Keras model to tflite.
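The conversion step itself can be sketched as follows; the filenames are placeholders (the notebook’s actual names may differ), and TensorFlow is only needed when the function is called:

```python
def convert_keras_to_tflite(keras_path, tflite_path):
    # TensorFlow is imported lazily so the module loads without TF installed
    import tensorflow as tf

    model = tf.keras.models.load_model(keras_path)
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    tflite_model = converter.convert()

    # A .tflite model is just bytes on disk
    with open(tflite_path, "wb") as f:
        f.write(tflite_model)

# Hypothetical usage:
# convert_keras_to_tflite("clothing-model.h5", "clothing-model.tflite")
```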


tensorflow-model

Preparing the Lambda code

First, we will extract the code that converts the model from Keras to TFLite from our Jupyter notebook and turn it into a script.

We will use nbconvert for this extraction.

jupyter nbconvert --to script 'tensorflow-model.ipynb'

Now we have a file named ‘tensorflow-model.py.’ First, we will clean it up by removing everything except the last section of the notebook. Next, we will create a function that predicts on a URL and returns the zipped dictionary. Finally, we will rename the file ‘lambda_function.py,’ as this will be the function we create on AWS Lambda.

import tflite_runtime.interpreter as tflite
from keras_image_helper import create_preprocessor

interpreter = tflite.Interpreter(model_path='clothing-model.tflite')
interpreter.allocate_tensors()

input_index = interpreter.get_input_details()[0]['index']
output_index = interpreter.get_output_details()[0]['index']

preprocessor = create_preprocessor('xception', target_size=(299, 299))

classes = [
    'dress',
    'hat',
    'longsleeve',
    'outwear',
    'pants',
    'shirt',
    'shoes',
    'shorts',
    'skirt',
    't-shirt'
]

#url = 'http://bit.ly/mlbookcamp-pants'

def predict(url):
    X = preprocessor.from_url(url)

    interpreter.set_tensor(input_index, X)
    interpreter.invoke()
    preds = interpreter.get_tensor(output_index)

    # Convert numpy float32 values to plain Python floats so the
    # result is JSON-serializable when Lambda returns it
    float_preds = preds[0].tolist()

    return dict(zip(classes, float_preds))

def lambda_handler(event, context):
    url = event['url']
    result = predict(url)
    return result

Preparing a Docker image

First, we will create a file named Dockerfile, and we can use prepackaged images from AWS lambda.

We want to use the lambda/python base image, and under ‘Image tags,’ we can find the image we would like to use. We will use an image with Python 3.8.

In our Dockerfile, we will pull the Python base image and install the dependencies we used in our Jupyter notebook: the keras-image-helper and tflite_runtime packages. After that, we will copy our tflite model and the lambda_function.py file into the image. Lastly, we will set the command that invokes the lambda_handler in our lambda_function script.

FROM public.ecr.aws/lambda/python:3.8

RUN pip install keras-image-helper
RUN pip install --extra-index-url \
	https://google-coral.github.io/py-repo/ tflite-runtime

COPY clothing-model.tflite .
COPY lambda_function.py .

CMD [ "lambda_function.lambda_handler" ]

Now that our Dockerfile is complete, we will build the Docker image using the following:

docker build -t clothing-model .

The ‘.’ tells Docker to use the current directory as the build context, which is where it finds our Dockerfile. This will pull all the packages down into the image.

We can now test our Docker container locally using the following:

docker run -it --rm -p 8080:8080 clothing-model:latest

This will start a container from the image, and we can create a test script for it.

Inside the test.py file, we will import requests and assign the local invocation URL of our Docker container to the ‘url’ variable. Next, we will assign the URL of the image of the pants to the ‘data’ dictionary. Finally, we post the request and print the response in JSON format.

import requests

url = 'http://localhost:8080/2015-03-31/functions/function/invocations'
#url = 'https://000000000.execute-api.us-east-1.amazonaws.com/test/predict'

data = {'url': 'http://bit.ly/mlbookcamp-pants'}

result = requests.post(url, json=data).json()

print(result)

Now we run the test.py file, and we see our results.



Now that we have built and tested our Docker image, we will create a Lambda function and deploy the image. We will deploy the Docker image to AWS ECR (Elastic Container Registry), creating the repository from the command line. If you haven’t done so already, you will have to run ‘pip install awscli’ first.

We can run the following:

aws ecr create-repository --repository-name clothing-tflite-images

Which will create a repository on AWS. Now we will log in to that repository using the following:

aws ecr get-login --no-include-email

This will return the password, which I will not expose here. To use the password immediately without needing an extra step, we can run the following command:

$(aws ecr get-login --no-include-email)

Which will retrieve the password and then log you in with one step.
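If you prefer to stay in Python, the repository can also be created with boto3; this is only a sketch, and it assumes boto3 is installed and AWS credentials are configured:

```python
def create_ecr_repository(name):
    # boto3 is imported lazily; the call requires configured AWS credentials
    import boto3

    client = boto3.client("ecr")
    response = client.create_repository(repositoryName=name)
    # The response includes the URI we will push the image to
    return response["repository"]["repositoryUri"]

# Hypothetical usage:
# uri = create_ecr_repository("clothing-tflite-images")
```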

Using the URI that AWS assigned to our repository when we created it, we can construct the full address and push our Docker image to the repository:

ACCOUNT=090321278467
REGION=us-east-1
REGISTRY=clothing-tflite-images
PREFIX=${ACCOUNT}.dkr.ecr.${REGION}.amazonaws.com/${REGISTRY}

TAG=clothing-model-exception-v4-001
REMOTE_URI=${PREFIX}:${TAG}
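The remote URI follows the fixed ECR pattern `<account>.dkr.ecr.<region>.amazonaws.com/<repository>:<tag>`; as a quick check, the same address can be composed in Python from the values in the shell snippet above:

```python
# Compose the ECR image URI from its parts
account = "090321278467"
region = "us-east-1"
registry = "clothing-tflite-images"
tag = "clothing-model-exception-v4-001"

prefix = f"{account}.dkr.ecr.{region}.amazonaws.com/{registry}"
remote_uri = f"{prefix}:{tag}"
print(remote_uri)
```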

To view the address:

echo ${REMOTE_URI}

Now we can tag and push our Docker image to the repository using the following:

docker tag clothing-model:latest ${REMOTE_URI}
docker push ${REMOTE_URI}

Now when we look at our AWS ECR repository page, we will see the “clothing-tflite-images” repository.


Repository page

We now have our Docker image loaded onto the AWS platform, so it is time to create a function on AWS that links to it. Let’s choose “Container image” and give the function a name, in this case, “clothing-classification.” The next step is choosing our Docker image as the “Container Image URI” using the “Browse Images” button. Finally, we are ready to create the function using the “Create function” button.


Creating a Container Image Lambda Function


Our “clothing-classification” function has been created, and now we can test it by clicking on the “Test” tab. We previously used the URL of the image of the pants, and we will use that here as well; now we can click the “Test” button. We get an error stating that the task timed out after roughly 3 seconds. That is the default timeout, which isn’t sufficient for our purposes, so we will change it using the “Configuration” tab. We will click the “Edit” button, increase the timeout to 30 seconds, add a little more memory, and click “Save.” Finally, we return to the “Test” tab and rerun our test.


Time out error

Configuration Tab


We have a successful test that shows our predictions. It took roughly 11 seconds to initialize, load, and run the first time we tested. The second time, it took only about 2 seconds.



API Gateway: exposing the lambda function

Our next step will be to expose our lambda function as a web service, and we will use an API gateway. We can search for “API gateway” to find this service. We will choose “REST API.”


AWS API Gateway

If this is your first time, an example API will be shown. We will select “New API,” name our API and then click “Create API.” Clicking the “Actions” drop-down button will allow us to choose “Create Resource.”


AWS REST API GATEWAY

Create Resource

In the “Resource Name” field, enter a name. I will use “predict” here to use the same convention we have been using in the zoomcamp course. Then click the “Create Resource” button.

We will have to add a method to our resource, and we can do that by clicking on the “Actions” button and selecting “Create Method,” and then choosing “POST” as our method type. Now we will select “Lambda Function” and enter “clothing-classification” as the Lambda Function we will use.


AWS API Gateway Post Setup Method

We will see a Permission window, and we can select “OK.” Then we will see a diagram of the Method-Request-Response process.

We have to set up a test using the “Test” link in the diagram. Then, in the request body, we will again enter our JSON URL information for the image of the pants. Then we can click the “Test” lightning bolt button.



Adding our code to the Request Body

The results are the body of the response showing our predictions.


Results

It is time to deploy this API. Using the “Actions” button again, we can select “Deploy API.” Using “New Stage,” we will use “test” as our “Stage name” and then click on “Deploy.”


Deploy API

We are now presented with the URL where our function resides. The URL doesn’t contain our endpoint, “/predict,” so we will add that to the end. Then, we will copy and paste that URL into our test.py file and comment out our local URL. When we run the test.py file, we will see our predictions.


import requests

#url = 'http://localhost:8080/2015-03-31/functions/function/invocations'
url = 'https://0000000000.execute-api.us-east-1.amazonaws.com/test/predict'

data = {'url': 'http://bit.ly/mlbookcamp-pants'}

result = requests.post(url, json=data).json()

print(result)

Now our lambda function is exposed as a web service.


