You are a Python backend architect. Help me scaffold a production-ready FastAPI backend project with authentication, authorization, caching, and environment-specific config using the following specifications:
🧱 Stack:
- Python 3.11+
- FastAPI
- SQLAlchemy + PostgreSQL
- OAuth2 with JWT (Access + Refresh tokens)
- Redis for caching
- Docker + Docker Compose
- Pydantic for schema and environment config
- Role-based authorization system
- Auto-generated Swagger/OpenAPI docs
- Unit + integration testing with pytest
📁 Project Structure: Use modular, clean architecture:
app/
  main.py
  core/ – config, JWT, security, caching, constants
  auth/ – login, logout, refresh handlers
  routers/ – route files: auth.py, test.py, etc.
  models/ – SQLAlchemy models for User, Role, Feature, Permission
  schemas/ – Pydantic request/response models
  services/ – business logic (auth, permission lookup, caching)
  db/ – database session handling
  config/ – base.py, dev.py, staging.py, prod.py using BaseSettings
  tests/ – auth, permission, and protected route test cases
🔐 Authentication:
- POST /login: Authenticates and returns access + refresh token
- POST /logout: Invalidates session and clears cached permissions
- POST /refresh: Renews access token
- GET /test: Protected endpoint requiring valid permissions
🔐 Authorization Model:
- User ↔ Roles (Many-to-Many)
- Role ↔ Features (Many-to-Many)
- Feature ↔ Permissions (Many-to-Many)
- Permissions (e.g., read:users, create:projects) used for actual endpoint authorization
🧠 Authorization Flow:
- On login: cache user's permission list (via user ID from JWT) in Redis
- On each request: extract user ID from token, check required permission against cache
- On logout: invalidate user’s cached permission entry
- Auto-refresh permissions cache every 15 mins (can use background task or Redis TTL)
⚙️ Features:
- Use python-jose for JWT
- Use passlib[bcrypt] for password hashing
- Use Depends and FastAPI's dependency injection for permission checks
- Swagger docs enabled at /docs
- Environment-based config with .env, .env.staging, .env.prod
🧪 Tests:
- Test login/logout/refresh flow
- Test permission-based access to /test
- Test cache invalidation logic on logout and expiry
🐳 Docker:
- Dockerfile for app
- docker-compose.yml (FastAPI + Postgres + Redis + optional pgAdmin)
📦 Output:
- Folder and file structure
- Working auth flow
- User/Role/Feature/Permission model setup with migrations
- Redis-based permission caching setup with invalidation logic
- Dockerized environment with multi-env support
- Basic tests
Production-Ready FastAPI Backend Project
I'll scaffold a production-ready FastAPI backend with authentication, authorization, caching, and environment-specific configuration based on your specifications.
Project Structure Overview
First, let's establish our project structure with all required components:
fastapi-backend/
├── app/
│   ├── main.py
│   ├── core/
│   │   ├── __init__.py
│   │   ├── config.py
│   │   ├── security.py
│   │   ├── jwt.py
│   │   ├── cache.py
│   │   └── constants.py
│   ├── auth/
│   │   ├── __init__.py
│   │   ├── service.py
│   │   └── dependencies.py
│   ├── routers/
│   │   ├── __init__.py
│   │   ├── auth.py
│   │   └── test.py
│   ├── models/
│   │   ├── __init__.py
│   │   ├── base.py
│   │   ├── user.py
│   │   ├── role.py
│   │   ├── feature.py
│   │   └── permission.py
│   ├── schemas/
│   │   ├── __init__.py
│   │   ├── token.py
│   │   ├── user.py
│   │   ├── role.py
│   │   ├── feature.py
│   │   └── permission.py
│   ├── services/
│   │   ├── __init__.py
│   │   ├── auth.py
│   │   ├── permission.py
│   │   └── cache.py
│   ├── db/
│   │   ├── __init__.py
│   │   ├── session.py
│   │   └── base.py
│   ├── config/
│   │   ├── __init__.py
│   │   ├── base.py
│   │   ├── dev.py
│   │   ├── staging.py
│   │   └── prod.py
│   └── tests/
│       ├── __init__.py
│       ├── conftest.py
│       ├── test_auth.py
│       ├── test_permissions.py
│       └── test_protected_routes.py
├── alembic/
│   ├── versions/
│   ├── env.py
│   ├── README
│   └── script.py.mako
├── alembic.ini
├── .env
├── .env.example
├── .env.staging
├── .env.prod
├── pyproject.toml
├── requirements.txt
├── Dockerfile
├── docker-compose.yml
└── README.md
Now, let's implement the key components of this architecture.
1. Core Configuration Files
app/core/config.py
from functools import lru_cache
from typing import Any, Optional, Union

from pydantic import AnyHttpUrl, PostgresDsn, ValidationInfo, field_validator
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    PROJECT_NAME: str = "FastAPI Backend"
    API_V1_STR: str = "/api/v1"

    # SECURITY
    SECRET_KEY: str
    ALGORITHM: str = "HS256"
    ACCESS_TOKEN_EXPIRE_MINUTES: int = 30
    REFRESH_TOKEN_EXPIRE_MINUTES: int = 60 * 24 * 7  # 7 days

    # POSTGRES
    POSTGRES_SERVER: str
    POSTGRES_USER: str
    POSTGRES_PASSWORD: str
    POSTGRES_DB: str
    SQLALCHEMY_DATABASE_URI: Optional[PostgresDsn] = None

    @field_validator("SQLALCHEMY_DATABASE_URI", mode="before")
    @classmethod
    def assemble_db_connection(cls, v: Optional[str], info: ValidationInfo) -> Any:
        if isinstance(v, str):
            return v
        values = info.data
        # Pydantic v2: the keyword is `username` (not `user`) and `path`
        # takes the database name without a leading slash
        return PostgresDsn.build(
            scheme="postgresql",
            username=values.get("POSTGRES_USER"),
            password=values.get("POSTGRES_PASSWORD"),
            host=values.get("POSTGRES_SERVER"),
            path=values.get("POSTGRES_DB") or "",
        )

    # REDIS
    REDIS_HOST: str
    REDIS_PORT: int = 6379
    REDIS_PASSWORD: Optional[str] = None
    REDIS_DB: int = 0

    # PERMISSIONS CACHE
    PERMISSIONS_CACHE_TTL: int = 900  # 15 minutes in seconds

    # CORS
    BACKEND_CORS_ORIGINS: list[AnyHttpUrl] = []

    @field_validator("BACKEND_CORS_ORIGINS", mode="before")
    @classmethod
    def assemble_cors_origins(cls, v: Union[str, list[str]]) -> Union[str, list[str]]:
        if isinstance(v, str) and not v.startswith("["):
            return [i.strip() for i in v.split(",")]
        elif isinstance(v, (list, str)):
            return v
        raise ValueError(v)

    model_config = SettingsConfigDict(case_sensitive=True, env_file=".env")


@lru_cache()
def get_settings():
    return Settings()


settings = get_settings()
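The environment-specific modules under app/config/ (listed in the structure above but not shown) can be thin subclasses of the base settings that point at different env files. The sketch below is self-contained, with a plain stand-in base class instead of the real Settings, and assumes an APP_ENV selector variable that is not part of the original spec:

```python
import os


# Stand-in base class so this sketch runs standalone; in the project these
# would subclass app.core.config.Settings and override env_file per environment.
class Settings:
    DEBUG: bool = False
    ENV_FILE: str = ".env"


class DevSettings(Settings):
    DEBUG = True
    ENV_FILE = ".env"


class StagingSettings(Settings):
    ENV_FILE = ".env.staging"


class ProdSettings(Settings):
    ENV_FILE = ".env.prod"


def get_env_settings() -> Settings:
    # APP_ENV is an assumed selector variable, not part of the original spec
    env = os.getenv("APP_ENV", "dev")
    mapping = {"dev": DevSettings, "staging": StagingSettings, "prod": ProdSettings}
    return mapping.get(env, DevSettings)()
```

Keeping the selection in one factory function means the rest of the app only ever imports a single `settings` object, regardless of environment.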
app/core/jwt.py
from datetime import datetime, timedelta
from typing import Any, Union

from jose import jwt
from passlib.context import CryptContext

from app.core.config import settings

pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")


def create_access_token(subject: Union[str, Any]) -> str:
    expire = datetime.utcnow() + timedelta(
        minutes=settings.ACCESS_TOKEN_EXPIRE_MINUTES
    )
    to_encode = {"exp": expire, "sub": str(subject), "type": "access"}
    return jwt.encode(to_encode, settings.SECRET_KEY, algorithm=settings.ALGORITHM)


def create_refresh_token(subject: Union[str, Any]) -> str:
    expire = datetime.utcnow() + timedelta(
        minutes=settings.REFRESH_TOKEN_EXPIRE_MINUTES
    )
    to_encode = {"exp": expire, "sub": str(subject), "type": "refresh"}
    return jwt.encode(to_encode, settings.SECRET_KEY, algorithm=settings.ALGORITHM)


def verify_password(plain_password: str, hashed_password: str) -> bool:
    return pwd_context.verify(plain_password, hashed_password)


def get_password_hash(password: str) -> str:
    return pwd_context.hash(password)
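For intuition, here is what the token helpers produce under the hood: an HS256 JWT built from the same exp/sub/type claims, using only the standard library (python-jose does the equivalent, plus more validation). The key and subject here are throwaway values for illustration:

```python
import base64
import hashlib
import hmac
import json
from datetime import datetime, timedelta, timezone


def b64url(data: bytes) -> str:
    # JWT uses unpadded base64url encoding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def manual_hs256_jwt(subject: str, secret: str, minutes: int = 30) -> str:
    # Same claim layout as create_access_token: exp, sub, type
    header = {"alg": "HS256", "typ": "JWT"}
    exp = datetime.now(timezone.utc) + timedelta(minutes=minutes)
    payload = {"exp": int(exp.timestamp()), "sub": subject, "type": "access"}
    signing_input = (
        b64url(json.dumps(header).encode()) + "." + b64url(json.dumps(payload).encode())
    )
    sig = hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{b64url(sig)}"


token = manual_hs256_jwt("42", "dev-secret-do-not-use")
# A JWT is three dot-separated base64url segments: header.payload.signature
assert token.count(".") == 2
```

This also makes clear why the server must treat the payload as readable but tamper-proof: anyone can base64-decode the claims, and only the signature (keyed by SECRET_KEY) protects them.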
app/core/cache.py
import json
from typing import Optional, Union

import redis.asyncio as redis

from app.core.config import settings


class RedisClient:
    def __init__(self):
        password = f":{settings.REDIS_PASSWORD}@" if settings.REDIS_PASSWORD else ""
        self.redis_url = (
            f"redis://{password}{settings.REDIS_HOST}:{settings.REDIS_PORT}/{settings.REDIS_DB}"
        )
        self.redis = None

    async def get_redis_client(self):
        if self.redis is None:
            # redis.asyncio.from_url is synchronous: it returns a client that
            # connects lazily on first command, so it must not be awaited
            self.redis = redis.from_url(
                self.redis_url, encoding="utf-8", decode_responses=True
            )
        return self.redis

    async def get(self, key: str) -> Optional[str]:
        client = await self.get_redis_client()
        return await client.get(key)

    async def set(self, key: str, value: Union[str, dict, list], expire: int = 0):
        client = await self.get_redis_client()
        if not isinstance(value, str):
            value = json.dumps(value)
        await client.set(key, value, ex=expire if expire > 0 else None)

    async def delete(self, key: str):
        client = await self.get_redis_client()
        await client.delete(key)

    async def close(self):
        if self.redis:
            await self.redis.close()


redis_client = RedisClient()


async def get_redis():
    return await redis_client.get_redis_client()
2. Database Models
app/db/base.py
# declarative_base moved out of sqlalchemy.ext.declarative in SQLAlchemy 2.0
from sqlalchemy.orm import declarative_base

Base = declarative_base()
app/db/session.py
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import sessionmaker

from app.core.config import settings

# PostgresDsn is a URL object in Pydantic v2, so cast it to str before
# rewriting the scheme for the asyncpg driver
SQLALCHEMY_DATABASE_URL = str(settings.SQLALCHEMY_DATABASE_URI).replace(
    "postgresql", "postgresql+asyncpg", 1
)

engine = create_async_engine(SQLALCHEMY_DATABASE_URL, echo=True)
SessionLocal = sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)


async def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        await db.close()
app/models/user.py
from sqlalchemy import Boolean, Column, ForeignKey, Integer, String, Table
from sqlalchemy.orm import relationship

from app.db.base import Base

# Many-to-many association table between User and Role
user_role = Table(
    "user_role",
    Base.metadata,
    Column("user_id", Integer, ForeignKey("users.id"), primary_key=True),
    Column("role_id", Integer, ForeignKey("roles.id"), primary_key=True),
)


class User(Base):
    __tablename__ = "users"

    id = Column(Integer, primary_key=True, index=True)
    email = Column(String, unique=True, index=True, nullable=False)
    username = Column(String, unique=True, index=True, nullable=False)
    hashed_password = Column(String, nullable=False)
    is_active = Column(Boolean, default=True)
    is_superuser = Column(Boolean, default=False)

    # Relationships
    roles = relationship("Role", secondary=user_role, back_populates="users")
app/models/role.py
from sqlalchemy import Column, ForeignKey, Integer, String, Table
from sqlalchemy.orm import relationship

from app.db.base import Base

# Many-to-many association table between Role and Feature
role_feature = Table(
    "role_feature",
    Base.metadata,
    Column("role_id", Integer, ForeignKey("roles.id"), primary_key=True),
    Column("feature_id", Integer, ForeignKey("features.id"), primary_key=True),
)


class Role(Base):
    __tablename__ = "roles"

    id = Column(Integer, primary_key=True, index=True)
    name = Column(String, unique=True, index=True, nullable=False)
    description = Column(String, nullable=True)

    # Relationships
    users = relationship("User", secondary="user_role", back_populates="roles")
    features = relationship("Feature", secondary=role_feature, back_populates="roles")
app/models/feature.py
from sqlalchemy import Column, ForeignKey, Integer, String, Table
from sqlalchemy.orm import relationship

from app.db.base import Base

# Many-to-many association table between Feature and Permission
feature_permission = Table(
    "feature_permission",
    Base.metadata,
    Column("feature_id", Integer, ForeignKey("features.id"), primary_key=True),
    Column("permission_id", Integer, ForeignKey("permissions.id"), primary_key=True),
)


class Feature(Base):
    __tablename__ = "features"

    id = Column(Integer, primary_key=True, index=True)
    name = Column(String, unique=True, index=True, nullable=False)
    description = Column(String, nullable=True)

    # Relationships
    roles = relationship("Role", secondary="role_feature", back_populates="features")
    permissions = relationship(
        "Permission", secondary=feature_permission, back_populates="features"
    )
app/models/permission.py
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import relationship

from app.db.base import Base


class Permission(Base):
    __tablename__ = "permissions"

    id = Column(Integer, primary_key=True, index=True)
    name = Column(String, unique=True, index=True, nullable=False)  # e.g., "read:users", "create:projects"
    description = Column(String, nullable=True)

    # Relationships
    features = relationship(
        "Feature", secondary="feature_permission", back_populates="permissions"
    )
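These three association tables give a User → Role → Feature → Permission chain, and authorization ultimately cares only about the flattened set of permission names. Here is the traversal in isolation, with plain dicts standing in for the ORM objects (the real query-backed version appears later in PermissionService.fetch_user_permissions):

```python
# Dicts stand in for the SQLAlchemy objects; the ORM traversal walks the
# same User -> roles -> features -> permissions shape.
user = {
    "roles": [
        {
            "features": [
                {"permissions": [{"name": "read:users"}, {"name": "read:test"}]},
                {"permissions": [{"name": "create:projects"}]},
            ]
        },
        # A second role re-granting read:users simply collapses into the set
        {"features": [{"permissions": [{"name": "read:users"}]}]},
    ]
}


def flatten_permissions(user: dict) -> set:
    return {
        perm["name"]
        for role in user["roles"]
        for feature in role["features"]
        for perm in feature["permissions"]
    }


permissions = flatten_permissions(user)
assert permissions == {"read:users", "read:test", "create:projects"}
```

Using a set means duplicate grants across roles are free, and membership checks at request time are O(1).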
3. Authentication Services and Schemas
app/schemas/token.py
from typing import Optional

from pydantic import BaseModel


class Token(BaseModel):
    access_token: str
    refresh_token: str
    token_type: str


class TokenPayload(BaseModel):
    sub: Optional[str] = None
    type: Optional[str] = None
app/schemas/user.py
from typing import Optional

from pydantic import BaseModel, ConfigDict, EmailStr


class UserBase(BaseModel):
    email: Optional[EmailStr] = None
    username: Optional[str] = None
    is_active: Optional[bool] = True
    is_superuser: bool = False


class UserCreate(UserBase):
    email: EmailStr
    username: str
    password: str


class UserUpdate(UserBase):
    password: Optional[str] = None


class UserInDBBase(UserBase):
    id: Optional[int] = None

    # Pydantic v2 replaces `orm_mode` with `from_attributes`
    model_config = ConfigDict(from_attributes=True)


class User(UserInDBBase):
    pass


class UserInDB(UserInDBBase):
    hashed_password: str
app/services/auth.py
from typing import Optional

from fastapi import Depends, HTTPException, status
from fastapi.security import OAuth2PasswordBearer
from jose import ExpiredSignatureError, JWTError, jwt
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.future import select

from app.core.config import settings
from app.core.jwt import verify_password
from app.db.session import get_db
from app.models.user import User
from app.schemas.token import TokenPayload
from app.services.cache import PermissionCacheService

oauth2_scheme = OAuth2PasswordBearer(tokenUrl=f"{settings.API_V1_STR}/auth/login")


async def authenticate_user(
    db: AsyncSession, username: str, password: str
) -> Optional[User]:
    """Authenticate user by username and password"""
    result = await db.execute(select(User).filter(User.username == username))
    user = result.scalars().first()
    if not user:
        return None
    if not verify_password(password, user.hashed_password):
        return None
    return user


async def get_current_user(
    token: str = Depends(oauth2_scheme), db: AsyncSession = Depends(get_db)
) -> User:
    """Validate token and return current user"""
    credentials_exception = HTTPException(
        status_code=status.HTTP_401_UNAUTHORIZED,
        detail="Could not validate credentials",
        headers={"WWW-Authenticate": "Bearer"},
    )
    try:
        # jwt.decode verifies the signature AND the exp claim; an expired
        # token raises ExpiredSignatureError, so no manual expiry check is needed
        payload = jwt.decode(
            token, settings.SECRET_KEY, algorithms=[settings.ALGORITHM]
        )
        token_payload = TokenPayload(**payload)
        if token_payload.sub is None or token_payload.type != "access":
            raise credentials_exception
    except ExpiredSignatureError:
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED,
            detail="Token expired",
            headers={"WWW-Authenticate": "Bearer"},
        )
    except JWTError:
        raise credentials_exception
    result = await db.execute(select(User).filter(User.id == int(token_payload.sub)))
    user = result.scalars().first()
    if user is None:
        raise credentials_exception
    if not user.is_active:
        raise HTTPException(status_code=400, detail="Inactive user")
    return user


async def validate_refresh_token(
    token: str = Depends(oauth2_scheme), db: AsyncSession = Depends(get_db)
) -> User:
    """Validate refresh token and return user"""
    credentials_exception = HTTPException(
        status_code=status.HTTP_401_UNAUTHORIZED,
        detail="Could not validate refresh token",
        headers={"WWW-Authenticate": "Bearer"},
    )
    try:
        payload = jwt.decode(
            token, settings.SECRET_KEY, algorithms=[settings.ALGORITHM]
        )
        token_payload = TokenPayload(**payload)
        if token_payload.sub is None or token_payload.type != "refresh":
            raise credentials_exception
    except ExpiredSignatureError:
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED,
            detail="Refresh token expired",
            headers={"WWW-Authenticate": "Bearer"},
        )
    except JWTError:
        raise credentials_exception
    result = await db.execute(select(User).filter(User.id == int(token_payload.sub)))
    user = result.scalars().first()
    if user is None:
        raise credentials_exception
    if not user.is_active:
        raise HTTPException(status_code=400, detail="Inactive user")
    return user


async def invalidate_user_session(user_id: int) -> None:
    """Invalidate user session by clearing cached permissions"""
    cache_service = PermissionCacheService()
    await cache_service.invalidate_permissions(user_id)
app/services/permission.py
from typing import Optional, Set

from fastapi import Depends, HTTPException, status
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import joinedload

from app.db.session import get_db
from app.models.feature import Feature
from app.models.role import Role
from app.models.user import User
from app.services.auth import get_current_user
from app.services.cache import PermissionCacheService


class PermissionService:
    def __init__(
        self,
        db: Optional[AsyncSession] = None,
        cache_service: Optional[PermissionCacheService] = None,
    ):
        self.db = db
        self.cache_service = cache_service or PermissionCacheService()

    async def get_user_permissions(self, user_id: int) -> Set[str]:
        """Get all permissions for a user, from cache or database"""
        # Try the cache first
        cached_permissions = await self.cache_service.get_permissions(user_id)
        if cached_permissions:
            return cached_permissions
        # Cache miss: load from the database
        permissions = await self.fetch_user_permissions(user_id)
        # Update the cache
        await self.cache_service.set_permissions(user_id, permissions)
        return permissions

    async def fetch_user_permissions(self, user_id: int) -> Set[str]:
        """Fetch all permissions for a user from the database"""
        result = await self.db.execute(
            select(User)
            .filter(User.id == user_id)
            .options(
                joinedload(User.roles)
                .joinedload(Role.features)
                .joinedload(Feature.permissions)
            )
        )
        # joinedload against collections requires unique() before first()
        user = result.scalars().unique().first()
        if not user:
            return set()
        permissions = set()
        for role in user.roles:
            for feature in role.features:
                for permission in feature.permissions:
                    permissions.add(permission.name)
        return permissions


def require_permission(permission: str):
    """Dependency factory: returns a dependency that checks the given permission"""
    async def check_permission(
        db: AsyncSession = Depends(get_db),
        current_user: User = Depends(get_current_user),
    ):
        # Superusers bypass permission checks
        if current_user.is_superuser:
            return True
        # Check user permissions
        permission_service = PermissionService(db)
        user_permissions = await permission_service.get_user_permissions(current_user.id)
        if permission not in user_permissions:
            raise HTTPException(
                status_code=status.HTTP_403_FORBIDDEN,
                detail=f"Permission denied: {permission} required",
            )
        return True

    return check_permission
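The key idea in require_permission is the factory pattern: calling it returns a closure that captures the required permission, and FastAPI resolves that closure per request. A stripped-down, FastAPI-free sketch of the same pattern (PermissionError here stands in for the 403 response):

```python
# Minimal sketch of the dependency-factory pattern, without FastAPI:
# the factory captures `permission`; the returned closure does the check.
def require_permission(permission: str):
    def check(user_permissions: set) -> bool:
        if permission not in user_permissions:
            # In the real app this is an HTTPException with status 403
            raise PermissionError(f"Permission denied: {permission} required")
        return True

    return check


need_read_test = require_permission("read:test")
assert need_read_test({"read:test", "read:users"}) is True
```

Because each route declares its own factory call (e.g. `Depends(require_permission("read:test"))`), the required permission is visible right at the endpoint definition instead of hidden in middleware.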
app/services/cache.py
from typing import Optional, Set

from app.core.cache import redis_client
from app.core.config import settings


class PermissionCacheService:
    def __init__(self):
        self.prefix = "user_permissions:"
        self.ttl = settings.PERMISSIONS_CACHE_TTL

    def get_cache_key(self, user_id: int) -> str:
        """Generate cache key for user permissions"""
        return f"{self.prefix}{user_id}"

    async def get_permissions(self, user_id: int) -> Optional[Set[str]]:
        """Get user permissions from cache"""
        key = self.get_cache_key(user_id)
        permissions_str = await redis_client.get(key)
        if not permissions_str:
            return None
        # Convert from comma-separated string to set
        return set(permissions_str.split(","))

    async def set_permissions(self, user_id: int, permissions: Set[str]) -> None:
        """Set user permissions in cache with TTL"""
        key = self.get_cache_key(user_id)
        # Convert set to comma-separated string
        permissions_str = ",".join(permissions)
        await redis_client.set(key, permissions_str, expire=self.ttl)

    async def invalidate_permissions(self, user_id: int) -> None:
        """Invalidate user permissions in cache"""
        key = self.get_cache_key(user_id)
        await redis_client.delete(key)
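The cache stores the permission set as a comma-joined string. The round trip, shown standalone without Redis:

```python
def serialize(perms: set) -> str:
    # sorted() only makes the stored string deterministic; Redis doesn't care
    return ",".join(sorted(perms))


def deserialize(raw: str) -> set:
    # An empty string must map back to the empty set, not {""}
    return set(raw.split(",")) if raw else set()


original = {"read:test", "create:projects"}
assert deserialize(serialize(original)) == original
```

Note the implicit assumption: permission names must never contain commas. If that can't be guaranteed, storing `json.dumps(sorted(perms))` and parsing it back is a safer encoding.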
4. API Routes
app/routers/auth.py
from fastapi import APIRouter, Depends, HTTPException
from fastapi.security import OAuth2PasswordRequestForm
from sqlalchemy.ext.asyncio import AsyncSession

from app.core.jwt import create_access_token, create_refresh_token
from app.db.session import get_db
from app.schemas.token import Token
from app.services.auth import (
    authenticate_user,
    invalidate_user_session,
    validate_refresh_token,
)
from app.services.permission import PermissionService

router = APIRouter(tags=["authentication"])


@router.post("/login", response_model=Token)
async def login_access_token(
    db: AsyncSession = Depends(get_db),
    form_data: OAuth2PasswordRequestForm = Depends(),
):
    """
    OAuth2 compatible token login, get an access token for future requests
    """
    user = await authenticate_user(db, form_data.username, form_data.password)
    if not user:
        raise HTTPException(status_code=400, detail="Incorrect username or password")
    # Fetch user permissions and cache them
    permission_service = PermissionService(db)
    permissions = await permission_service.fetch_user_permissions(user.id)
    if permissions:
        await permission_service.cache_service.set_permissions(user.id, permissions)
    return {
        "access_token": create_access_token(user.id),
        "refresh_token": create_refresh_token(user.id),
        "token_type": "bearer",
    }


@router.post("/refresh", response_model=Token)
async def refresh_token(
    db: AsyncSession = Depends(get_db),
    user=Depends(validate_refresh_token),
):
    """
    Refresh access token
    """
    return {
        "access_token": create_access_token(user.id),
        "refresh_token": create_refresh_token(user.id),  # Also rotate the refresh token
        "token_type": "bearer",
    }


@router.post("/logout")
async def logout(
    db: AsyncSession = Depends(get_db),
    user=Depends(validate_refresh_token),
):
    """
    Logout user by invalidating the session
    """
    await invalidate_user_session(user.id)
    return {"detail": "Successfully logged out"}
app/routers/test.py
from fastapi import APIRouter, Depends

from app.services.permission import require_permission

router = APIRouter(tags=["test"])


@router.get("/test")
async def test_endpoint(
    _=Depends(require_permission("read:test")),
):
    """
    Protected endpoint that requires the read:test permission
    """
    return {"message": "You have access to this protected endpoint"}
5. Main Application
app/main.py
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

from app.core.cache import redis_client
from app.core.config import settings
from app.routers import auth, test

app = FastAPI(
    title=settings.PROJECT_NAME,
    openapi_url=f"{settings.API_V1_STR}/openapi.json",
)

# Set up CORS
if settings.BACKEND_CORS_ORIGINS:
    app.add_middleware(
        CORSMiddleware,
        allow_origins=[str(origin) for origin in settings.BACKEND_CORS_ORIGINS],
        allow_credentials=True,
        allow_methods=["*"],
        allow_headers=["*"],
    )

# Include routers
app.include_router(auth.router, prefix=f"{settings.API_V1_STR}/auth")
app.include_router(test.router, prefix=settings.API_V1_STR)


@app.get("/")
async def root():
    return {"message": "Welcome to FastAPI Backend API!"}


@app.on_event("startup")
async def startup_event():
    # Additional startup logic can be added here
    pass


@app.on_event("shutdown")
async def shutdown_event():
    # Release the shared Redis connection cleanly
    await redis_client.close()
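Note that `@app.on_event` is deprecated in recent FastAPI releases in favor of a lifespan context manager passed as `FastAPI(lifespan=lifespan)`. The pattern itself, shown standalone with a simulated app lifetime (the Redis/cache calls named in the comments are what the real hooks would do):

```python
import asyncio
from contextlib import asynccontextmanager

events = []


@asynccontextmanager
async def lifespan(app):
    # Code before yield runs at startup (e.g., warm caches, ping Redis)
    events.append("startup")
    yield
    # Code after yield runs at shutdown (e.g., await redis_client.close())
    events.append("shutdown")


async def simulate_app_lifetime():
    # FastAPI drives this itself when constructed as FastAPI(lifespan=lifespan)
    async with lifespan(None):
        events.append("serving")


asyncio.run(simulate_app_lifetime())
```

The single context manager keeps paired setup/teardown logic (open a connection, close it) in one place instead of two separate event handlers.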
6. Docker Configuration
Dockerfile
FROM python:3.11-slim

WORKDIR /app

# curl is used by the docker-compose healthcheck (slim images don't ship it)
RUN apt-get update && apt-get install -y --no-install-recommends curl \
    && rm -rf /var/lib/apt/lists/*

# Install dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code
COPY . .

# Run with Uvicorn
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
docker-compose.yml
version: '3.8'

services:
  app:
    build: .
    ports:
      - "8000:8000"
    volumes:
      - .:/app
    env_file:
      - .env
    depends_on:
      - db
      - redis
    restart: always
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000"]
      interval: 30s
      timeout: 10s
      retries: 3

  db:
    image: postgres:15
    volumes:
      - postgres_data:/var/lib/postgresql/data
    environment:
      - POSTGRES_USER=${POSTGRES_USER}
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
      - POSTGRES_DB=${POSTGRES_DB}
    ports:
      - "5432:5432"
    restart: always
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER}"]
      interval: 10s
      timeout: 5s
      retries: 5

  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"
    volumes:
      - redis_data:/data
    command: redis-server --requirepass ${REDIS_PASSWORD}
    restart: always
    healthcheck:
      # Authenticate, otherwise PING gets NOAUTH once requirepass is set
      test: ["CMD-SHELL", "redis-cli -a ${REDIS_PASSWORD} ping | grep PONG"]
      interval: 10s
      timeout: 5s
      retries: 5

  pgadmin:
    image: dpage/pgadmin4
    environment:
      - PGADMIN_DEFAULT_EMAIL=${PGADMIN_EMAIL:-admin@admin.com}
      - PGADMIN_DEFAULT_PASSWORD=${PGADMIN_PASSWORD:-admin}
    ports:
      - "5050:80"
    depends_on:
      - db
    restart: always
    volumes:
      - pgadmin_data:/var/lib/pgadmin

volumes:
  postgres_data:
  redis_data:
  pgadmin_data:
7. Environment Configuration
.env.example
# API
SECRET_KEY=your-secret-key-here
ALGORITHM=HS256
ACCESS_TOKEN_EXPIRE_MINUTES=30
REFRESH_TOKEN_EXPIRE_MINUTES=10080 # 7 days
# PostgreSQL
POSTGRES_SERVER=db
POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres
POSTGRES_DB=app
# Redis
REDIS_HOST=redis
REDIS_PORT=6379
REDIS_PASSWORD=redis
REDIS_DB=0
# Permissions Cache
PERMISSIONS_CACHE_TTL=900 # 15 minutes in seconds
# CORS
BACKEND_CORS_ORIGINS=["http://localhost:3000","http://localhost:8000"]
# PGAdmin
PGADMIN_EMAIL=admin@admin.com
PGADMIN_PASSWORD=admin
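A note on BACKEND_CORS_ORIGINS: the assemble_cors_origins validator in app/core/config.py accepts either the JSON-list form shown above or a plain comma-separated string. Its branching, reproduced standalone:

```python
# Same branching as the assemble_cors_origins validator in app/core/config.py,
# reproduced here so each accepted env format is visible in isolation.
def assemble_cors_origins(v):
    if isinstance(v, str) and not v.startswith("["):
        # Comma-separated form: split and strip whitespace
        return [i.strip() for i in v.split(",")]
    elif isinstance(v, (list, str)):
        # JSON-list strings are decoded to lists by pydantic-settings before
        # the validator runs, so lists (and "["-prefixed strings) pass through
        return v
    raise ValueError(v)


assert assemble_cors_origins("http://localhost:3000, http://localhost:8000") == [
    "http://localhost:3000",
    "http://localhost:8000",
]
```

Either format works in the env files; the comma-separated form is easier to write by hand, the JSON form survives values containing spaces.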
8. Testing Setup
app/tests/conftest.py
from typing import AsyncGenerator, Generator

import pytest
import pytest_asyncio
from fastapi.testclient import TestClient
from httpx import ASGITransport, AsyncClient
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import sessionmaker
from sqlalchemy.pool import StaticPool

from app.core.jwt import get_password_hash
from app.db.base import Base
from app.db.session import get_db
from app.main import app
from app.models.feature import Feature
from app.models.permission import Permission
from app.models.role import Role
from app.models.user import User

# Use an in-memory SQLite database for testing
SQLALCHEMY_DATABASE_URL = "sqlite+aiosqlite:///:memory:"

engine = create_async_engine(
    SQLALCHEMY_DATABASE_URL,
    connect_args={"check_same_thread": False},
    # StaticPool shares the single in-memory connection across sessions;
    # without it, each new connection would see an empty database
    poolclass=StaticPool,
)
TestingSessionLocal = sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)


@pytest_asyncio.fixture(scope="function")
async def db() -> AsyncGenerator:
    # Recreate the tables for each test
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.drop_all)
        await conn.run_sync(Base.metadata.create_all)
    # Create a new session for each test
    async with TestingSessionLocal() as session:
        # Override the get_db dependency
        async def override_get_db():
            try:
                yield session
            finally:
                pass

        app.dependency_overrides[get_db] = override_get_db
        yield session
    # Reset the override after the test is done
    app.dependency_overrides = {}


@pytest.fixture(scope="function")
def client() -> Generator:
    with TestClient(app) as c:
        yield c


@pytest_asyncio.fixture(scope="function")
async def async_client() -> AsyncGenerator:
    # httpx >= 0.27 removed the `app=` shortcut in favor of ASGITransport
    async with AsyncClient(
        transport=ASGITransport(app=app), base_url="http://test"
    ) as ac:
        yield ac


@pytest_asyncio.fixture(scope="function")
async def test_user(db: AsyncSession) -> User:
    # Create test user
    user = User(
        email="test@example.com",
        username="testuser",
        hashed_password=get_password_hash("password"),
        is_active=True,
    )
    db.add(user)
    await db.commit()
    await db.refresh(user)
    return user


@pytest_asyncio.fixture(scope="function")
async def test_admin(db: AsyncSession) -> User:
    # Create admin user
    admin = User(
        email="admin@example.com",
        username="admin",
        hashed_password=get_password_hash("password"),
        is_active=True,
        is_superuser=True,
    )
    db.add(admin)
    await db.commit()
    await db.refresh(admin)
    return admin


@pytest_asyncio.fixture(scope="function")
async def test_permission(db: AsyncSession) -> Permission:
    # Create test permission
    permission = Permission(name="read:test", description="Read test data")
    db.add(permission)
    await db.commit()
    await db.refresh(permission)
    return permission


@pytest_asyncio.fixture(scope="function")
async def test_feature(db: AsyncSession, test_permission: Permission) -> Feature:
    # Create test feature
    feature = Feature(name="test_feature", description="Test feature")
    db.add(feature)
    await db.commit()
    await db.refresh(feature)
    # Add permission to feature
    feature.permissions.append(test_permission)
    await db.commit()
    await db.refresh(feature)
    return feature


@pytest_asyncio.fixture(scope="function")
async def test_role(db: AsyncSession, test_feature: Feature) -> Role:
    # Create test role
    role = Role(name="test_role", description="Test role")
    db.add(role)
    await db.commit()
    await db.refresh(role)
    # Add feature to role
    role.features.append(test_feature)
    await db.commit()
    await db.refresh(role)
    return role


@pytest_asyncio.fixture(scope="function")
async def test_user_with_permission(
    db: AsyncSession, test_user: User, test_role: Role
) -> User:
    # Add role to user
    test_user.roles.append(test_role)
    await db.commit()
    await db.refresh(test_user)
    return test_user
app/tests/test_auth.py
import pytest
from fastapi.testclient import TestClient
from httpx import AsyncClient

from app.core.jwt import create_access_token, create_refresh_token


class TestAuth:
    def test_login(self, client: TestClient, test_user):
        response = client.post(
            "/api/v1/auth/login",
            data={"username": "testuser", "password": "password"},
        )
        tokens = response.json()
        assert response.status_code == 200
        assert "access_token" in tokens
        assert "refresh_token" in tokens
        assert tokens["token_type"] == "bearer"

    def test_login_wrong_password(self, client: TestClient, test_user):
        response = client.post(
            "/api/v1/auth/login",
            data={"username": "testuser", "password": "wrong"},
        )
        assert response.status_code == 400
        assert response.json() == {"detail": "Incorrect username or password"}

    def test_refresh_token(self, client: TestClient, test_user):
        refresh_token = create_refresh_token(test_user.id)
        response = client.post(
            "/api/v1/auth/refresh",
            headers={"Authorization": f"Bearer {refresh_token}"},
        )
        tokens = response.json()
        assert response.status_code == 200
        assert "access_token" in tokens
        assert "refresh_token" in tokens
        assert tokens["token_type"] == "bearer"

    def test_refresh_with_access_token(self, client: TestClient, test_user):
        # Refreshing with an access token should fail
        access_token = create_access_token(test_user.id)
        response = client.post(
            "/api/v1/auth/refresh",
            headers={"Authorization": f"Bearer {access_token}"},
        )
        assert response.status_code == 401

    def test_logout(self, client: TestClient, test_user):
        refresh_token = create_refresh_token(test_user.id)
        response = client.post(
            "/api/v1/auth/logout",
            headers={"Authorization": f"Bearer {refresh_token}"},
        )
        assert response.status_code == 200
        assert response.json() == {"detail": "Successfully logged out"}


@pytest.mark.asyncio
async def test_full_auth_flow(async_client: AsyncClient, test_user_with_permission):
    # 1. Login
    response = await async_client.post(
        "/api/v1/auth/login",
        data={"username": "testuser", "password": "password"},
    )
    tokens = response.json()
    assert response.status_code == 200
    access_token = tokens["access_token"]
    refresh_token = tokens["refresh_token"]

    # 2. Access protected endpoint
    response = await async_client.get(
        "/api/v1/test",
        headers={"Authorization": f"Bearer {access_token}"},
    )
    assert response.status_code == 200

    # 3. Refresh token
    response = await async_client.post(
        "/api/v1/auth/refresh",
        headers={"Authorization": f"Bearer {refresh_token}"},
    )
    new_tokens = response.json()
    assert response.status_code == 200

    # 4. Logout
    response = await async_client.post(
        "/api/v1/auth/logout",
        headers={"Authorization": f"Bearer {new_tokens['refresh_token']}"},
    )
    assert response.status_code == 200
app/tests/test_permissions.py
import pytest
from httpx import AsyncClient
from sqlalchemy.ext.asyncio import AsyncSession

from app.core.jwt import create_access_token
from app.models.user import User
from app.services.permission import PermissionService


@pytest.mark.asyncio
async def test_require_permission(
    async_client: AsyncClient,
    test_user_with_permission: User,
    test_user: User,
    db: AsyncSession,
):
    # User with permission
    access_token1 = create_access_token(test_user_with_permission.id)
    response = await async_client.get(
        "/api/v1/test",
        headers={"Authorization": f"Bearer {access_token1}"},
    )
    assert response.status_code == 200

    # User without permission
    access_token2 = create_access_token(test_user.id)
    response = await async_client.get(
        "/api/v1/test",
        headers={"Authorization": f"Bearer {access_token2}"},
    )
    assert response.status_code == 403


@pytest.mark.asyncio
async def test_permission_caching(db: AsyncSession, test_user_with_permission: User):
    # Get permissions from the database and cache them
    permission_service = PermissionService(db)
    permissions = await permission_service.get_user_permissions(test_user_with_permission.id)

    # Verify the correct permissions were found
    assert "read:test" in permissions

    # Get permissions again (now served from cache) and verify
    cached_permissions = await permission_service.get_user_permissions(test_user_with_permission.id)
    assert cached_permissions == permissions

    # Invalidate the cache
    await permission_service.cache_service.invalidate_permissions(test_user_with_permission.id)

    # Verify the cache entry is gone
    cache_hit = await permission_service.cache_service.get_permissions(test_user_with_permission.id)
    assert cache_hit is None
app/tests/test_protected_routes.py
import pytest
from httpx import AsyncClient
from app.core.jwt import create_access_token
@pytest.mark.asyncio
async def test_protected_route_with_permission(
async_client: AsyncClient, test_user_with_permission
):
access_token = create_access_token(test_user_with_permission.id)
response = await async_client.get(
"/api/v1/test",
headers={"Authorization": f"Bearer {access_token}"},
)
assert response.status_code == 200
assert response.json() == {"message": "You have access to this protected endpoint"}
@pytest.mark.asyncio
async def test_protected_route_without_permission(
async_client: AsyncClient, test_user
):
access_token = create_access_token(test_user.id)
response = await async_client.get(
"/api/v1/test",
headers={"Authorization": f"Bearer {access_token}"},
)
assert response.status_code == 403
assert "Permission denied" in response.json()["detail"]
@pytest.mark.asyncio
async def test_protected_route_with_superuser(
async_client: AsyncClient, test_admin
):
# Superuser can access any protected route
access_token = create_access_token(test_admin.id)
response = await async_client.get(
"/api/v1/test",
headers={"Authorization": f"Bearer {access_token}"},
)
assert response.status_code == 200
assert response.json() == {"message": "You have access to this protected endpoint"}
@pytest.mark.asyncio
async def test_protected_route_without_auth(async_client: AsyncClient):
response = await async_client.get("/api/v1/test")
assert response.status_code == 401
assert response.json() == {"detail": "Not authenticated"}
9. Requirements File
requirements.txt
fastapi>=0.100.0
uvicorn[standard]>=0.22.0
pydantic>=2.0.0
pydantic-settings>=2.0.0
SQLAlchemy>=2.0.0
alembic>=1.11.0
python-jose[cryptography]>=3.3.0
passlib[bcrypt]>=1.7.4
python-multipart>=0.0.6
redis>=4.5.0
asyncpg>=0.27.0
aiosqlite>=0.19.0 # for testing
pytest>=7.3.0
pytest-asyncio>=0.21.0
httpx>=0.24.0
Explanation of the Architecture
This FastAPI project implements a robust backend with the following features:
Authentication & Authorization
- OAuth2 with JWT - Secure authentication using access and refresh tokens
- Role-Based Authorization - Hierarchical permission model:
  - Users have Roles
  - Roles have Features
  - Features have Permissions
- Redis Caching - Permissions are cached in Redis for faster authorization checks
- Permission Invalidation - User permissions are invalidated on logout or expiry
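The permission check itself reduces to a small predicate that the routers can wrap in a dependency. A minimal sketch (the `has_permission` and `require_permission` names, and the commented FastAPI wiring, are illustrative assumptions consistent with the tests above, not the project's confirmed API):

```python
# Core check behind a permission-guarding dependency: superusers bypass the
# check, everyone else needs the exact permission string in their cached set.
def has_permission(user_permissions: set[str], required: str, is_superuser: bool = False) -> bool:
    return is_superuser or required in user_permissions


# In a router this would be wired through FastAPI's dependency injection, e.g.:
#
# def require_permission(required: str):
#     async def checker(user=Depends(get_current_user), perms=Depends(get_cached_permissions)):
#         if not has_permission(perms, required, user.is_superuser):
#             raise HTTPException(status_code=403, detail="Permission denied")
#     return checker
#
# @router.get("/test", dependencies=[Depends(require_permission("read:test"))])
```

This keeps the authorization decision a pure function, which is what makes tests like `test_require_permission` and `test_protected_route_with_superuser` cheap to reason about.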
SQLAlchemy Models
- User Model - Stores user credentials and authentication info
- Role Model - Defines user roles with many-to-many relationship to users
- Feature Model - Groups permissions into logical features
- Permission Model - Fine-grained permissions for authorization checks
Redis Caching
We use Redis for caching user permissions, which improves performance by reducing database queries. The cache is created on login, used for authorization checks, and invalidated on logout. It refreshes automatically via a 15-minute TTL.
Environment-Specific Configuration
The project uses Pydantic BaseSettings for type-safe, environment-specific configuration with support for different environments (dev, staging, prod).
Docker Integration
The project includes complete Docker and Docker Compose setups for running the application, PostgreSQL database, Redis cache, and pgAdmin for database administration.
Testing
Comprehensive test setup includes tests for:
- Authentication flow (login, refresh, logout)
- Permission checking
- Permission caching
- Protected route access control
How to Run the Project
Set up environment variables:
- Copy .env.example to .env and configure your environment variables
Start the Docker containers:
docker-compose up -d
Run migrations:
docker-compose exec app alembic upgrade head
Access the API:
- API documentation: http://localhost:8000/docs
- PgAdmin: http://localhost:5050
This production-ready FastAPI application follows best practices for security, performance, and maintainability, with a clean, modular architecture that's easy to extend.
Configure GitHub Actions or another CI/CD tool for automated testing and deployment
Setting Up CI/CD with GitHub Actions for FastAPI Backend
Let's configure a comprehensive GitHub Actions workflow for your FastAPI backend project. This setup will automate testing, building, and deployment processes.
GitHub Actions Workflow Structure
I'll create several workflow files to handle different aspects of the CI/CD pipeline:
- Testing & Code Quality - Run on every pull request and push to main
- Build & Publish - Build and push Docker images on successful merges to main
- Deployment - Deploy to different environments based on branch/tag
Let's implement these workflows:
1. Testing & Code Quality Workflow
Create a file at .github/workflows/test.yml:
name: Test and Code Quality
on:
push:
branches: [ main, develop ]
pull_request:
branches: [ main, develop ]
jobs:
test:
runs-on: ubuntu-latest
services:
postgres:
image: postgres:15
env:
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
POSTGRES_DB: test_db
ports:
- 5432:5432
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
redis:
image: redis:7-alpine
ports:
- 6379:6379
options: >-
--health-cmd "redis-cli ping"
--health-interval 10s
--health-timeout 5s
--health-retries 5
steps:
- uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: '3.11'
cache: 'pip'
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
pip install pytest-cov black isort mypy
- name: Lint with Black
run: black --check app
- name: Check imports with isort
run: isort --check-only --profile black app
- name: Type check with mypy
run: mypy app
- name: Run tests with pytest
env:
POSTGRES_SERVER: localhost
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
POSTGRES_DB: test_db
REDIS_HOST: localhost
REDIS_PORT: 6379
SECRET_KEY: test_secret_key
ALGORITHM: HS256
run: |
pytest --cov=app --cov-report=xml
- name: Upload coverage to Codecov
uses: codecov/codecov-action@v3
with:
file: ./coverage.xml
fail_ci_if_error: false
security-scan:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: '3.11'
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install bandit safety
- name: Run security scan with Bandit
run: bandit -r app -f json -o bandit-results.json
- name: Check for vulnerable dependencies
run: safety check -r requirements.txt
2. Build & Publish Docker Image Workflow
Create a file at .github/workflows/build.yml:
name: Build and Publish Docker Image
on:
push:
branches: [ main, develop ]
tags: [ 'v*' ]
jobs:
build:
runs-on: ubuntu-latest
permissions:
contents: read
packages: write
steps:
- uses: actions/checkout@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Login to GitHub Container Registry
uses: docker/login-action@v2
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Extract metadata for Docker
id: meta
uses: docker/metadata-action@v4
with:
images: ghcr.io/${{ github.repository }}
tags: |
type=ref,event=branch
type=ref,event=pr
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=sha,format=short
- name: Build and push Docker image
uses: docker/build-push-action@v4
with:
context: .
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=gha
cache-to: type=gha,mode=max
3. Deployment Workflow
Create a file at .github/workflows/deploy.yml:
name: Deploy
on:
workflow_run:
workflows: ["Build and Publish Docker Image"]
types:
- completed
workflow_dispatch:
inputs:
environment:
description: 'Environment to deploy to'
required: true
default: 'development'
type: choice
options:
- development
- staging
- production
jobs:
deploy-development:
if: ${{ (github.event.workflow_run.conclusion == 'success' && github.event.workflow_run.head_branch == 'develop') || github.event.inputs.environment == 'development' }}
runs-on: ubuntu-latest
environment: development
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Set up SSH
uses: webfactory/ssh-agent@v0.7.0
with:
ssh-private-key: ${{ secrets.SSH_PRIVATE_KEY }}
- name: Deploy to Development Server
env:
SERVER_HOST: ${{ secrets.DEV_SERVER_HOST }}
SERVER_USER: ${{ secrets.DEV_SERVER_USER }}
DOCKER_COMPOSE_FILE: docker-compose.dev.yml
run: |
# Copy docker-compose file to server
scp -o StrictHostKeyChecking=no $DOCKER_COMPOSE_FILE $SERVER_USER@$SERVER_HOST:~/docker-compose.yml
# Create .env file on server
ssh -o StrictHostKeyChecking=no $SERVER_USER@$SERVER_HOST "cat > .env << 'EOL'
POSTGRES_SERVER=db
POSTGRES_USER=${{ secrets.DEV_POSTGRES_USER }}
POSTGRES_PASSWORD=${{ secrets.DEV_POSTGRES_PASSWORD }}
POSTGRES_DB=${{ secrets.DEV_POSTGRES_DB }}
REDIS_HOST=redis
REDIS_PORT=6379
REDIS_PASSWORD=${{ secrets.DEV_REDIS_PASSWORD }}
SECRET_KEY=${{ secrets.DEV_SECRET_KEY }}
ALGORITHM=HS256
EOL"
# Pull latest image and restart containers
ssh -o StrictHostKeyChecking=no $SERVER_USER@$SERVER_HOST "docker-compose pull && docker-compose up -d"
deploy-staging:
if: ${{ (github.event.workflow_run.conclusion == 'success' && github.event.workflow_run.head_branch == 'main') || github.event.inputs.environment == 'staging' }}
runs-on: ubuntu-latest
environment: staging
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Set up SSH
uses: webfactory/ssh-agent@v0.7.0
with:
ssh-private-key: ${{ secrets.SSH_PRIVATE_KEY }}
- name: Deploy to Staging Server
env:
SERVER_HOST: ${{ secrets.STAGING_SERVER_HOST }}
SERVER_USER: ${{ secrets.STAGING_SERVER_USER }}
DOCKER_COMPOSE_FILE: docker-compose.staging.yml
run: |
# Copy docker-compose file to server
scp -o StrictHostKeyChecking=no $DOCKER_COMPOSE_FILE $SERVER_USER@$SERVER_HOST:~/docker-compose.yml
# Create .env file on server
ssh -o StrictHostKeyChecking=no $SERVER_USER@$SERVER_HOST "cat > .env << 'EOL'
POSTGRES_SERVER=db
POSTGRES_USER=${{ secrets.STAGING_POSTGRES_USER }}
POSTGRES_PASSWORD=${{ secrets.STAGING_POSTGRES_PASSWORD }}
POSTGRES_DB=${{ secrets.STAGING_POSTGRES_DB }}
REDIS_HOST=redis
REDIS_PORT=6379
REDIS_PASSWORD=${{ secrets.STAGING_REDIS_PASSWORD }}
SECRET_KEY=${{ secrets.STAGING_SECRET_KEY }}
ALGORITHM=HS256
EOL"
# Pull latest image and restart containers
ssh -o StrictHostKeyChecking=no $SERVER_USER@$SERVER_HOST "docker-compose pull && docker-compose up -d"
deploy-production:
if: ${{ (github.event.workflow_run.conclusion == 'success' && startsWith(github.event.workflow_run.head_branch, 'v')) || github.event.inputs.environment == 'production' }}
runs-on: ubuntu-latest
environment: production
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Set up SSH
uses: webfactory/ssh-agent@v0.7.0
with:
ssh-private-key: ${{ secrets.SSH_PRIVATE_KEY }}
- name: Deploy to Production Server
env:
SERVER_HOST: ${{ secrets.PROD_SERVER_HOST }}
SERVER_USER: ${{ secrets.PROD_SERVER_USER }}
DOCKER_COMPOSE_FILE: docker-compose.prod.yml
run: |
# Copy docker-compose file to server
scp -o StrictHostKeyChecking=no $DOCKER_COMPOSE_FILE $SERVER_USER@$SERVER_HOST:~/docker-compose.yml
# Create .env file on server
ssh -o StrictHostKeyChecking=no $SERVER_USER@$SERVER_HOST "cat > .env << 'EOL'
POSTGRES_SERVER=db
POSTGRES_USER=${{ secrets.PROD_POSTGRES_USER }}
POSTGRES_PASSWORD=${{ secrets.PROD_POSTGRES_PASSWORD }}
POSTGRES_DB=${{ secrets.PROD_POSTGRES_DB }}
REDIS_HOST=redis
REDIS_PORT=6379
REDIS_PASSWORD=${{ secrets.PROD_REDIS_PASSWORD }}
SECRET_KEY=${{ secrets.PROD_SECRET_KEY }}
ALGORITHM=HS256
EOL"
# Pull latest image and restart containers
ssh -o StrictHostKeyChecking=no $SERVER_USER@$SERVER_HOST "docker-compose pull && docker-compose up -d"
4. Environment-Specific Docker Compose Files
Let's create environment-specific Docker Compose files:
docker-compose.dev.yml
version: '3.8'
services:
app:
image: ghcr.io/${GITHUB_REPOSITORY}:develop
ports:
- "8000:8000"
env_file:
- .env
depends_on:
- db
- redis
restart: always
healthcheck:
test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
interval: 30s
timeout: 10s
retries: 3
db:
image: postgres:15
volumes:
- postgres_data:/var/lib/postgresql/data
env_file:
- .env
restart: always
healthcheck:
test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER}"]
interval: 10s
timeout: 5s
retries: 5
redis:
image: redis:7-alpine
command: redis-server --requirepass ${REDIS_PASSWORD}
volumes:
- redis_data:/data
restart: always
healthcheck:
test: ["CMD", "redis-cli", "-a", "${REDIS_PASSWORD}", "ping"]
interval: 10s
timeout: 5s
retries: 5
volumes:
postgres_data:
redis_data:
docker-compose.staging.yml
version: '3.8'
services:
app:
image: ghcr.io/${GITHUB_REPOSITORY}:main
ports:
- "8000:8000"
env_file:
- .env
depends_on:
- db
- redis
restart: always
healthcheck:
test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
interval: 30s
timeout: 10s
retries: 3
deploy:
resources:
limits:
cpus: '0.5'
memory: 512M
db:
image: postgres:15
volumes:
- postgres_data:/var/lib/postgresql/data
env_file:
- .env
restart: always
healthcheck:
test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER}"]
interval: 10s
timeout: 5s
retries: 5
deploy:
resources:
limits:
cpus: '0.5'
memory: 512M
redis:
image: redis:7-alpine
command: redis-server --requirepass ${REDIS_PASSWORD}
volumes:
- redis_data:/data
restart: always
healthcheck:
test: ["CMD", "redis-cli", "-a", "${REDIS_PASSWORD}", "ping"]
interval: 10s
timeout: 5s
retries: 5
deploy:
resources:
limits:
cpus: '0.2'
memory: 256M
volumes:
postgres_data:
redis_data:
docker-compose.prod.yml
version: '3.8'
services:
app:
image: ghcr.io/${GITHUB_REPOSITORY}:${TAG:-latest}
ports:
- "8000:8000"
env_file:
- .env
depends_on:
- db
- redis
restart: always
healthcheck:
test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
interval: 30s
timeout: 10s
retries: 3
deploy:
replicas: 2
resources:
limits:
cpus: '1'
memory: 1G
db:
image: postgres:15
volumes:
- postgres_data:/var/lib/postgresql/data
env_file:
- .env
restart: always
healthcheck:
test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER}"]
interval: 10s
timeout: 5s
retries: 5
deploy:
resources:
limits:
cpus: '1'
memory: 1G
redis:
image: redis:7-alpine
command: redis-server --requirepass ${REDIS_PASSWORD}
volumes:
- redis_data:/data
restart: always
healthcheck:
test: ["CMD", "redis-cli", "-a", "${REDIS_PASSWORD}", "ping"]
interval: 10s
timeout: 5s
retries: 5
deploy:
resources:
limits:
cpus: '0.5'
memory: 512M
volumes:
postgres_data:
redis_data:
5. Add Health Check Endpoint
To support the health checks in our Docker Compose files, let's add a health check endpoint to our FastAPI application:
# Add this to app/main.py
@app.get("/health", tags=["health"])
async def health_check():
"""
Health check endpoint for monitoring and deployment verification
"""
return {"status": "healthy"}
6. Update Requirements for CI/CD
Let's update our requirements.txt to include the testing and linting tools:
# Add these to requirements.txt
pytest-cov>=4.1.0
black>=23.3.0
isort>=5.12.0
mypy>=1.3.0
bandit>=1.7.5
safety>=2.3.5
7. GitHub Repository Secrets Setup
You'll need to set up the following secrets in your GitHub repository:
Environment Secrets
For each environment (development, staging, production), set up:
Server connection details:
- SSH_PRIVATE_KEY - SSH private key for server access
- DEV_SERVER_HOST, STAGING_SERVER_HOST, PROD_SERVER_HOST - Server hostnames
- DEV_SERVER_USER, STAGING_SERVER_USER, PROD_SERVER_USER - SSH usernames
Database credentials:
- DEV_POSTGRES_USER, STAGING_POSTGRES_USER, PROD_POSTGRES_USER
- DEV_POSTGRES_PASSWORD, STAGING_POSTGRES_PASSWORD, PROD_POSTGRES_PASSWORD
- DEV_POSTGRES_DB, STAGING_POSTGRES_DB, PROD_POSTGRES_DB
Redis credentials:
- DEV_REDIS_PASSWORD, STAGING_REDIS_PASSWORD, PROD_REDIS_PASSWORD
Application secrets:
- DEV_SECRET_KEY, STAGING_SECRET_KEY, PROD_SECRET_KEY
8. Add a .github/dependabot.yml File
To keep dependencies up-to-date:
version: 2
updates:
- package-ecosystem: "pip"
directory: "/"
schedule:
interval: "weekly"
open-pull-requests-limit: 10
labels:
- "dependencies"
- "python"
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "weekly"
open-pull-requests-limit: 10
labels:
- "dependencies"
- "github-actions"
- package-ecosystem: "docker"
directory: "/"
schedule:
interval: "monthly"
open-pull-requests-limit: 5
labels:
- "dependencies"
- "docker"
CI/CD Workflow Explanation
This CI/CD setup provides a comprehensive workflow for your FastAPI backend:
1. Testing & Code Quality
- Automated Testing: Runs pytest with coverage reporting on every PR and push to main/develop
- Code Quality Checks: Enforces consistent code style with Black and isort
- Type Checking: Validates type annotations with mypy
- Security Scanning: Checks for security vulnerabilities with Bandit and Safety
2. Build & Publish
- Docker Image Building: Creates optimized Docker images for your application
- Automatic Tagging: Tags images based on git branches, tags, and commit SHAs
- Registry Publishing: Pushes images to GitHub Container Registry (GHCR)
- Build Caching: Uses GitHub Actions cache to speed up builds
3. Deployment
Environment-Specific Deployments:
- develop branch → Development environment
- main branch → Staging environment
- Tags (e.g., v1.0.0) → Production environment
- Manual Deployment Option: Allows manual deployment to any environment
- Secure Credential Handling: Uses GitHub Secrets for sensitive information
- SSH-Based Deployment: Securely deploys to remote servers
4. Dependency Management
- Dependabot Integration: Automatically creates PRs for outdated dependencies
- Multi-Ecosystem Monitoring: Tracks Python packages, GitHub Actions, and Docker base images
Benefits of This CI/CD Setup
- Automated Quality Control: Ensures code meets quality and security standards before deployment
- Deployment Consistency: Same deployment process across all environments
- Environment Isolation: Separate configurations for development, staging, and production
- Reduced Manual Work: Automates repetitive tasks like testing and deployment
- Improved Security: Secrets management and vulnerability scanning
- Dependency Management: Keeps dependencies up-to-date with Dependabot
Next Steps
- Set up GitHub repository secrets for each environment
- Configure branch protection rules to require passing CI checks before merging
- Set up deployment environments in GitHub with required reviewers for production
- Create SSH keys for deployment to your servers
- Add the health check endpoint to your FastAPI application
This CI/CD configuration provides a solid foundation that you can customize further based on your specific requirements.