# Lab 1: ChatBot with FastAPI
Difficulty: Beginner · Estimated time: ~3 hours · Deadline: See course portal
## Objective
Build a Python chatbot API that:
- Accepts chat messages via a FastAPI endpoint
- Stores conversation history in SQLite
- Runs inside a Docker container
- Deploys to Render with Google Auth
## Step 1 — Create the FastAPI app

```bash
mkdir tds-chatbot && cd tds-chatbot
uv init
uv add fastapi uvicorn openai
```
Create `app/main.py`:

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
from openai import OpenAI
import sqlite3
import uuid
from pathlib import Path

app = FastAPI(title="TDS ChatBot API")
client = OpenAI()  # reads OPENAI_API_KEY from the environment

class ChatMessage(BaseModel):
    message: str
    session_id: str | None = None

class ChatResponse(BaseModel):
    reply: str
    session_id: str

DB_PATH = Path("chat_history.db")

def init_db():
    conn = sqlite3.connect(DB_PATH)
    conn.execute("""
        CREATE TABLE IF NOT EXISTS messages (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            session_id TEXT, role TEXT, content TEXT,
            created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
        )
    """)
    conn.commit()
    conn.close()

init_db()

@app.post("/chat", response_model=ChatResponse)
async def chat(msg: ChatMessage):
    if not msg.session_id:
        msg.session_id = str(uuid.uuid4())

    # Store the user's message, then rebuild the full history for this session.
    conn = sqlite3.connect(DB_PATH)
    conn.execute(
        "INSERT INTO messages (session_id, role, content) VALUES (?, ?, ?)",
        (msg.session_id, "user", msg.message),
    )
    conn.commit()
    history = conn.execute(
        "SELECT role, content FROM messages WHERE session_id=? ORDER BY created_at",
        (msg.session_id,),
    ).fetchall()
    conn.close()

    messages = [{"role": r, "content": c} for r, c in history]
    try:
        response = client.chat.completions.create(
            model="gpt-3.5-turbo", messages=messages
        )
        reply = response.choices[0].message.content
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))

    # Persist the assistant's reply as well.
    conn = sqlite3.connect(DB_PATH)
    conn.execute(
        "INSERT INTO messages (session_id, role, content) VALUES (?, ?, ?)",
        (msg.session_id, "assistant", reply),
    )
    conn.commit()
    conn.close()
    return ChatResponse(reply=reply, session_id=msg.session_id)

@app.get("/health")
async def health():
    return {"status": "healthy"}
```

Note the `conn.commit()` calls: without them, SQLite discards the inserts when the connection closes. The code also uses the `OpenAI()` client object, which is the interface in version 1.x of the `openai` package (module-level calls like `openai.chat.completions.create` were removed in 1.0).
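You can sanity-check the storage logic without running the API at all. The sketch below reuses the same schema as `init_db()` against an in-memory database, simulates one user turn and one assistant turn, and rebuilds the history the way the `/chat` endpoint does (with `id` added as a tie-breaker, since two rows inserted in the same second share a `created_at`):

```python
import sqlite3

# Same schema as init_db(), but in-memory so nothing touches disk.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE IF NOT EXISTS messages (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        session_id TEXT, role TEXT, content TEXT,
        created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    )
""")

# Simulate one user turn and one assistant turn in session "s1".
for role, content in [("user", "Hello"), ("assistant", "Hi there!")]:
    conn.execute(
        "INSERT INTO messages (session_id, role, content) VALUES (?, ?, ?)",
        ("s1", role, content),
    )
conn.commit()

# Rebuild the chat history; "id" breaks ties between equal timestamps.
history = conn.execute(
    "SELECT role, content FROM messages WHERE session_id=? ORDER BY created_at, id",
    ("s1",),
).fetchall()
messages = [{"role": r, "content": c} for r, c in history]
print(messages)
# → [{'role': 'user', 'content': 'Hello'}, {'role': 'assistant', 'content': 'Hi there!'}]
```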
## Step 2 — Dockerize

```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app/ ./app/
EXPOSE 8000
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
```
```bash
uv pip freeze > requirements.txt   # export pinned deps from the uv environment
docker build -t tds-chatbot .
docker run -p 8000:8000 -e OPENAI_API_KEY=your-key tds-chatbot

# Smoke test once the container is up:
curl -X POST http://localhost:8000/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello!"}'
```
## Step 3 — Deploy to Render
- Push to GitHub
- Create a new Web Service on Render
- Set environment variables (OPENAI_API_KEY)
- Deploy!
## Submission

Submit a GitHub repo link containing: `app/main.py`, `requirements.txt`, `Dockerfile`, `README.md`
## Grading rubric
| Criterion | Points |
|---|---|
| API responds to /chat with valid replies | 30 |
| Conversation history persisted in SQLite | 20 |
| Docker container runs the API | 20 |
| Deployed to Render with working URL | 15 |
| Google Auth on protected endpoints | 15 |
| **Total** | **100** |