In Parts I and II of this blog series, we demonstrated how to create and containerize a FastAPI application and deploy it on a Kubernetes cluster. In this follow-up post, we'll explore another approach to exposing data through an API: deploying the FastAPI application on AWS Lambda. This serverless architecture can be particularly advantageous for data engineers and data scientists seeking cost-effective, scalable solutions for their data-centric APIs.
Prerequisites
- Basic knowledge of FastAPI.
- An AWS account.
- AWS CLI installed and configured.
- The Serverless Framework installed globally via npm:
npm install -g serverless
Step-by-Step Guide to Exposing Data with FastAPI through AWS Lambda
1. Set Up Your FastAPI Application
Let's assume you have a FastAPI application file named main.py.
# main.py
from fastapi import FastAPI

app = FastAPI()

@app.get("/data")
def read_data():
    return {"message": "Welcome to FastAPI on AWS Lambda - Data Engineers/Scientists!"}
2. Create a requirements.txt File
Include the dependencies for your FastAPI application. Mangum is listed here because the Lambda handler we create in step 4 uses it to adapt the ASGI app:
fastapi
uvicorn
mangum
3. Create a Dockerfile to Test the Application Locally
The Serverless Framework packages Lambda dependencies via its own Docker-based tooling (configured in step 5); the Dockerfile below is a convenient way to run the application locally before deploying.
# Dockerfile
FROM amazonlinux:2

# Install Python and pip
RUN yum install -y python3 python3-pip

# Set the working directory
WORKDIR /var/task

# Install the application dependencies
COPY requirements.txt .
RUN pip3 install -r requirements.txt

# Copy application files
COPY main.py .

# Run the app with the Uvicorn server (for local testing only;
# on Lambda the app is invoked through the Mangum handler instead)
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
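With the Dockerfile in place, you can build and run the image to verify the app locally (the image tag fastapi-lambda-demo is an arbitrary choice for this sketch, not something the later steps depend on):

```shell
# Build the image and run the API on localhost:8000
docker build -t fastapi-lambda-demo .
docker run -p 8000:8000 fastapi-lambda-demo

# In another terminal, hit the endpoint
curl http://localhost:8000/data
```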
4. Serverless Framework Configuration
Initialize a new Serverless service in your project directory:
serverless create --template aws-python3 --path my-fastapi-service
cd my-fastapi-service
Replace the created handler file (handler.py) with the following content to adapt it for FastAPI:
# handler.py
from mangum import Mangum
from fastapi import FastAPI

app = FastAPI()

@app.get("/data")
def read_data():
    return {"message": "Welcome to FastAPI on AWS Lambda - Data Engineers/Scientists!"}

# Mangum wraps the ASGI app so Lambda/API Gateway can invoke it
handler = Mangum(app)
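To make the handler's job concrete, here is a stdlib-only toy that mimics the contract Mangum fulfills: it accepts an API Gateway (REST) event dict and returns the statusCode/body dict that Lambda expects back. This illustrates only the event/response shape, not Mangum's internals (Mangum speaks full ASGI):

```python
import json

# Toy stand-in for Mangum: map an API Gateway REST event to a route
# and wrap the result in the response dict Lambda expects.
def toy_handler(event, context=None):
    routes = {
        "/data": lambda: {"message": "Welcome to FastAPI on AWS Lambda"
                                     " - Data Engineers/Scientists!"},
    }
    route = routes.get(event.get("path"))
    if route is None:
        return {"statusCode": 404, "body": json.dumps({"detail": "Not Found"})}
    return {
        "statusCode": 200,
        "headers": {"content-type": "application/json"},
        "body": json.dumps(route()),
    }

# A trimmed-down API Gateway event with only the fields the toy reads
event = {"path": "/data", "httpMethod": "GET"}
result = toy_handler(event)
print(result["statusCode"])  # 200
```

The real handler does the same translation, except the routing and serialization are delegated to the FastAPI app through the ASGI interface.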
5. Update the serverless.yml Configuration
Open serverless.yml and configure it for FastAPI:
service: my-fastapi-service

provider:
  name: aws
  runtime: python3.8
  region: us-east-1

functions:
  app:
    handler: handler.handler
    events:
      - http:
          path: /
          method: get
      - http:
          path: /data
          method: get

package:
  patterns:
    - '!**'
    - 'handler.py'

plugins:
  - serverless-python-requirements

# Build dependencies inside Docker so they match the Lambda runtime
custom:
  pythonRequirements:
    dockerizePip: non-linux
The serverless-python-requirements plugin must be installed in the service directory (npm install --save-dev serverless-python-requirements). With dockerizePip set, the plugin builds your Python dependencies inside Docker so they are compatible with the Lambda environment.
6. Deploy the Application
Deploy the FastAPI application to AWS Lambda using the Serverless Framework:
serverless deploy
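Two Serverless Framework commands are handy right after a deploy: serverless info prints the endpoints and function names, and serverless logs tails the CloudWatch logs for a function while you test it:

```shell
# Show the deployed endpoints and stack outputs
serverless info --verbose

# Tail the logs of the "app" function
serverless logs -f app --tail
```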
7. Verify the Deployment
After deployment, you'll see an output with the API endpoint. You can test it via a browser or tools like curl or Postman, substituting the API ID and region from your deploy output:
curl https://<api-id>.execute-api.<region>.amazonaws.com/dev/data
This should return:
{
"message": "Welcome to FastAPI on AWS Lambda - Data Engineers/Scientists!"
}
Conclusion
In Part I of our series, we showed how to containerize a FastAPI application for Kubernetes, and in Part II we dove into integrating the deployed APIs with various data sources. In this post, we've taken the same FastAPI application serverless by deploying it to AWS Lambda with the Serverless Framework and Mangum. This architecture can significantly benefit data engineers and data scientists by offering a cost-effective, scalable way to expose data via APIs.