Creating an OpenAI-Powered FastAPI Application with Docker

Introduction

Docker is a popular platform for containerizing applications and their dependencies, while Continuous Integration (CI) automates building, testing, and validating code changes. This series explores the use of Docker for CI in conjunction with GitLab's managed CI/CD pipelines, which offer pipeline orchestration and version control. Developers can define a pipeline with various stages and jobs, such as building and testing Docker images, and push them to a container registry. Using Docker in CI provides greater portability and consistency, improving development speed and code quality.
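To make the pipeline idea concrete (we don't set up CI in this post), a minimal GitLab pipeline that builds an image and pushes it to GitLab's built-in container registry could look like the sketch below; the stage and job names are arbitrary, and the `$CI_REGISTRY*` variables are GitLab's predefined CI variables:

```yaml
# Hypothetical .gitlab-ci.yml sketch: build the Docker image and push it
# to the project's container registry on every commit.
stages:
  - build

build-image:
  stage: build
  image: docker:24
  services:
    - docker:24-dind
  script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"
```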

Prerequisites

Create a new directory and switch to it:

mkdir api && cd api

Create a FastAPI application file:

cat <<'EOF' > main.py
import os

import openai
import requests
from dotenv import load_dotenv
from fastapi import FastAPI

# Load variables from the .env file into the environment
load_dotenv()

app = FastAPI()

openai.api_key = os.getenv("OPENAI_API_KEY")

@app.get("/health")
def health_endpoint():
    # Verify the API is reachable by fetching its own OpenAPI schema
    response = requests.get(f"{os.getenv('HOST')}/openapi.json")
    if response.status_code == 200:
        return {"status": "ok"}
    else:
        return {"status": "error", "message": "Failed to connect to openapi.json"}

@app.get("/completion")
async def openai_endpoint(query: str):
    response = openai.Completion.create(engine="davinci", prompt=query, max_tokens=10)
    return {"response": response.choices[0].text}
EOF

Add a Python requirements file:

cat <<EOF > requirements.txt
fastapi
uvicorn
openai
python-dotenv
requests
EOF

Add a dotenv file with your OpenAI API key and the base URL the health check should probe (note the http:// scheme and the port matching the one uvicorn serves on):

cat <<EOF > .env
OPENAI_API_KEY=<your-openai-key>
HOST=http://127.0.0.1:8000
EOF
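Before containerizing, you can optionally verify the application locally. This is a quick sketch assuming Python 3.9+ is installed; the virtual environment name ".venv" is arbitrary:

```shell
# Run from inside the "api" directory; ".venv" is an arbitrary name.
python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt
# Serve the app on port 8000, reloading automatically on code changes
python -m uvicorn main:app --reload --port 8000
```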

A valid API key can be generated in your OpenAI account dashboard.

Create a Dockerfile for this application:

FROM python:3.9-slim-buster

WORKDIR /app

COPY . .

RUN pip install --no-cache-dir -r requirements.txt

# Empty default so the key can be injected at runtime
ENV OPENAI_API_KEY=""

CMD ["python", "-m", "uvicorn", "main:app", "--host=0.0.0.0", "--port=8000"]
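You can already build and run this image on its own before wiring up Compose. A sketch, assuming the commands are run from inside the "api" directory; the image tag "openai-fastapi" is arbitrary:

```shell
# Build the image from the Dockerfile in the current directory
docker build -t openai-fastapi .
# Inject the key at runtime instead of baking it into the image
docker run --rm -p 8000:8000 -e OPENAI_API_KEY="$OPENAI_API_KEY" openai-fastapi
```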

Add a docker compose file (docker-compose.yml) one level up, outside the "api" folder:

version: '3.7'

services:
  api:
    build:
      context: api
      dockerfile: Dockerfile
    env_file:
      - ./api/.env
    ports:
      - "8000:8000"
    volumes:
      - ./api:/app
    command: uvicorn main:app --host 0.0.0.0 --port 8000

Check that your file structure and content match the accompanying GitHub repository.

Once everything is ready, try it out by starting the application in Docker:

docker compose up api

A cool thing about FastAPI is that it auto-generates a Swagger UI for you, so you can try out the endpoints quickly at http://127.0.0.1:8000/docs
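You can also hit the two endpoints from the command line while the container is running; the endpoint paths come from main.py, and the example query string is arbitrary:

```shell
# Smoke-test the health check and the completion endpoint
curl http://127.0.0.1:8000/health
curl "http://127.0.0.1:8000/completion?query=Hello"
```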

Conclusion

  • created a basic FastAPI application connecting to the OpenAI API
  • created a Docker configuration for running the application

In the next blog we're going to create a similar application that interacts with the OpenAI API via a Telegram bot and deploy it to one or more public cloud providers as a Kubernetes workload and a serverless function.