Backend: Python 3.11 · FastAPI · SQLAlchemy 2 async · Alembic · ARQ · uvicorn
Frontend: React 18 · Vite 5 · TypeScript · Tailwind CSS · TanStack Query · Zustand
DB: SQLite (dev) → PostgreSQL + pgvector (prod)
## Table of Contents

- Quick Start
- Features
- Project Structure
- API Endpoints
- Environment Variables
- Development Tips
- ARQ Worker Setup
- Cloud Storage Setup
- Common Commands
- Troubleshooting
- Production Deployment
- License
## Quick Start

### Prerequisites

- Python 3.11+ (Download)
- Node.js 18+ (Download)
- OpenAI API Key (Get one)
- Redis (optional, for background workers)
**Automated setup (Linux/Mac):**

```bash
./setup.sh
```

**Automated setup (Windows):**

```bat
setup.bat
```

**Manual backend setup:**

```bash
cd backend
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
pip install -r requirements.txt

# Create .env file
cp .env.template .env
# Edit .env and add your OPENAI_API_KEY

# Initialize database
alembic upgrade head

# Run development server
uvicorn main:app --reload --port 8000
```

**Frontend:**

```bash
cd frontend
npm install
npm run dev  # Starts on http://localhost:5173
```

The Vite dev server automatically proxies `/api/*` requests to `localhost:8000`.
**Background worker (optional):**

```bash
# Start Redis (if not running)
redis-server

# In a separate terminal
cd backend
source .venv/bin/activate
python start_worker.py
```

Note: If the worker is unavailable, the app automatically falls back to synchronous scoring.
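That fallback boils down to a try/except around the enqueue call. Below is a minimal sketch of the pattern; the `enqueue` and `score_sync` callables are hypothetical stand-ins for the real ARQ pool and the app's synchronous scorer, not the project's actual code:

```python
import asyncio

async def score_with_fallback(joke_id: int, enqueue, score_sync):
    """Try to hand scoring to the background worker; score inline if
    the queue is unreachable."""
    try:
        await enqueue("score_joke", joke_id)
        return "queued"
    except OSError:  # includes ConnectionError from an unreachable Redis
        return await score_sync(joke_id)


async def demo():
    async def broken_enqueue(task_name, joke_id):
        # Simulates the cloud-Redis failure mode: queue unreachable.
        raise ConnectionError("Connection closed by server")

    async def score_sync(joke_id):
        return f"scored-{joke_id}"

    return await score_with_fallback(7, broken_enqueue, score_sync)


result = asyncio.run(demo())
```

Because the exception is caught at the call site, a dead queue degrades the request to synchronous scoring instead of failing it.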
**Docker (production):**

```bash
# Create .env file with production secrets
echo "POSTGRES_PASSWORD=your_secure_password" > .env
echo "SECRET_KEY=your_secret_key" >> .env
echo "OPENAI_API_KEY=your_openai_key" >> .env

# Build and start all services
docker-compose up -d

# View logs
docker-compose logs -f api
```

Services:
- Frontend: http://localhost (nginx)
- API: http://localhost/api (proxied through nginx)
- PostgreSQL: Internal only
- Redis: Internal only
- ARQ Worker: Background task processor
## Project Structure

```
joke-engine/
├── backend/              # FastAPI service
│   ├── main.py           # App entry point
│   ├── core/             # Config & database
│   ├── models/           # SQLAlchemy models
│   ├── schemas/          # Pydantic schemas
│   ├── routers/          # API endpoints
│   ├── services/         # Business logic (AI, image, etc.)
│   ├── middleware/       # Session management
│   ├── dependencies/     # FastAPI dependencies
│   ├── workers/          # ARQ background tasks
│   ├── tasks/            # APScheduler jobs
│   └── alembic/          # Database migrations
│
├── frontend/             # React + Vite SPA
│   ├── src/
│   │   ├── api/          # API client & hooks
│   │   ├── components/   # React components
│   │   ├── hooks/        # Custom hooks
│   │   ├── pages/        # Route pages
│   │   ├── store/        # Zustand state
│   │   └── lib/          # Utilities
│   └── public/           # Static assets
│
├── docker-compose.yml    # Production orchestration
├── nginx.conf            # Reverse proxy config
└── README.md
```
## Features

- Joke generation with multiple personas (witty, dad, sarcastic, roast, haiku, brainrot, etc.)
- SSE streaming for real-time joke delivery
- Joke history with pagination
- Session-based user profiles
- Share cards (PNG generation)
- Text-to-speech audio
- Heckle mode (AI roasts your jokes)
- Explain mode (over-analytical joke explanations)
- XP system
- Daily streak tracking
- Rank progression (Open Mic → Club Regular → Headliner → Legend → GOAT)
- Background joke scoring (ARQ workers)
- 3-dimensional ratings (originality, timing, cleverness)
- WebSocket support for streaming
- PWA capabilities
- PostgreSQL with pgvector
- Semantic deduplication
- Full Docker deployment
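The rank ladder above can be modeled as a simple threshold lookup. A sketch with hypothetical XP thresholds (the real values live in the backend and may differ):

```python
# Hypothetical XP thresholds for the rank ladder; illustrative only.
RANK_THRESHOLDS = [
    (0, "Open Mic"),
    (100, "Club Regular"),
    (500, "Headliner"),
    (2000, "Legend"),
    (10000, "GOAT"),
]

def rank_for_xp(xp: int) -> str:
    """Return the highest rank whose XP threshold has been reached."""
    current = RANK_THRESHOLDS[0][1]
    for threshold, name in RANK_THRESHOLDS:
        if xp >= threshold:
            current = name
    return current
```

For example, `rank_for_xp(750)` lands between the 500 and 2000 thresholds and returns `"Headliner"`.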
## API Endpoints

- `POST /api/jokes/generate` - Generate a new joke
- `GET /api/jokes/stream` - SSE streaming endpoint
- `GET /api/jokes/history` - Paginated joke history
- `GET /api/jokes/{id}` - Get specific joke
- `DELETE /api/jokes/{id}` - Delete joke
- `POST /api/jokes/{id}/heckle` - Get AI roast
- `POST /api/jokes/{id}/explain` - Get explanation
- `GET /api/share/{id}/card.png` - Download joke card
- `GET /api/share/{id}/audio` - Get TTS audio
- `POST /api/share/{id}/increment` - Track share count
- `GET /api/profile` - Get user profile (XP, streak, rank)
- `WS /ws/joke` - Real-time joke streaming
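Assuming `GET /api/jokes/stream` uses the standard `text/event-stream` framing (`data:` lines, events separated by blank lines), a client can reassemble events with a few lines of stdlib Python. This is a sketch of the wire format, not the app's own client code:

```python
def parse_sse(lines):
    """Collect the data payload of each SSE event from a stream of
    decoded lines (standard text/event-stream framing assumed)."""
    events, data = [], []
    for line in lines:
        if line.startswith("data:"):
            data.append(line[5:].lstrip())
        elif line == "" and data:
            # Blank line terminates the current event.
            events.append("\n".join(data))
            data = []
    if data:  # flush a trailing event with no final blank line
        events.append("\n".join(data))
    return events
```

Feeding it the raw lines of a response (e.g. from `urllib.request.urlopen` with `curl -N`-style streaming) yields one string per joke chunk.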
## Environment Variables

See `backend/.env.template` for all available options.

Required:

- `OPENAI_API_KEY` - Your OpenAI API key
- `SECRET_KEY` - Session signing key (production)

Optional:

- `DATABASE_URL` - Database connection string
- `REDIS_URL` - Redis connection for ARQ workers
- `NEWSAPI_KEY` - For trending topics feature
## Development Tips

**Database migrations:**

```bash
# Create a new migration
alembic revision --autogenerate -m "description"

# Apply migrations
alembic upgrade head

# Rollback
alembic downgrade -1

# View history
alembic history
```

**Testing the API:**

```bash
# Health check
curl http://localhost:8000/api/health

# Generate joke
curl -X POST http://localhost:8000/api/jokes/generate \
  -H "Content-Type: application/json" \
  -d '{"query": "cats", "style": "witty"}'

# Stream joke (SSE); quote the URL so the shell doesn't interpret '&'
curl -N "http://localhost:8000/api/jokes/stream?query=dogs&style=dad"
```

**Inspecting the SQLite database:**

```
sqlite3 backend/giggle.db
.tables
SELECT * FROM jokes LIMIT 5;
.quit
```

**Adding a custom persona:**

Edit `backend/services/ai.py` and add to the `PERSONAS` dict:

```python
"custom": "Your custom system prompt here"
```

## ARQ Worker Setup

The ARQ worker handles background joke scoring. If unavailable, the app automatically falls back to synchronous scoring.
```bash
cd backend
python start_worker.py
```

**Common issue:** `ConnectionError: Connection closed by server` with cloud Redis instances.
**Solution 1:** Use local Redis (recommended for development)

```bash
# Install Redis locally
# Windows: choco install redis-64 or scoop install redis
# Linux: sudo apt install redis-server
# Mac: brew install redis

# Start Redis
redis-server
```

Update `.env`:

```
REDIS_URL=redis://localhost:6379
```

**Solution 2:** The app automatically falls back to synchronous scoring if the worker fails.
**Verifying the worker:**

```bash
# Test Redis connection
cd backend
python -c "import redis; r = redis.from_url('redis://localhost:6379'); print(r.ping())"
```

Check the worker logs for:

- `✓ Enqueued scoring task` - Task added to queue
- `✓ Scored joke` - Task completed
- `Using fallback scoring` - Worker unavailable

## Cloud Storage Setup

By default, the app uses local filesystem storage. For production, you can use Cloudflare R2 or AWS S3.
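The local/cloud choice can be pictured as a single branch in the media-saving path. A sketch with the cloud branch stubbed out (function and parameter names here are illustrative, not the app's API; the real cloud path would upload via a boto3 S3 client and return a `S3_PUBLIC_URL`-based link):

```python
import tempfile
from pathlib import Path

def save_media(name: str, data: bytes, media_dir: str, use_cloud: bool = False) -> str:
    """Persist a media file and return a path (or, in the cloud case, a URL)."""
    if use_cloud:
        # Stub: the real implementation would call boto3's put_object
        # against S3_ENDPOINT_URL and return the public URL.
        raise NotImplementedError("cloud storage requires boto3")
    path = Path(media_dir) / name
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_bytes(data)
    return str(path)

# Demo: write a fake PNG under a temporary media directory.
media_dir = tempfile.mkdtemp()
saved = save_media("cards/joke-1.png", b"\x89PNG", media_dir)
```

Keeping the branch behind one function means the rest of the app never needs to know which backend is active.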
**Local storage (default):**

```
USE_CLOUD_STORAGE=False
MEDIA_DIR=./media
```

**Cloudflare R2:**

1. Create an R2 bucket in the Cloudflare Dashboard
2. Create an API token with Object Read & Write permissions
3. Enable public access and copy the public bucket URL
4. Update `backend/.env`:

   ```
   USE_CLOUD_STORAGE=True
   S3_ENDPOINT_URL=https://your-account-id.r2.cloudflarestorage.com
   S3_ACCESS_KEY_ID=your-access-key-id
   S3_SECRET_ACCESS_KEY=your-secret-access-key
   S3_BUCKET_NAME=giggle-media
   S3_PUBLIC_URL=https://pub-abc123.r2.dev
   ```

5. Install boto3: `pip install boto3`
6. Restart the app
**AWS S3:**

Similar setup to R2, but leave `S3_ENDPOINT_URL` empty and use your S3 bucket URL for `S3_PUBLIC_URL`.
Cost Comparison:
- Cloudflare R2: $0.015/GB/month, no egress fees
- AWS S3: $0.023/GB/month, $0.09/GB egress
- Local: Free but limited by disk space
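A quick worked example of those prices (100 GB stored and 50 GB served per month, using the per-GB rates listed above):

```python
def monthly_cost(storage_gb: float, egress_gb: float,
                 per_gb: float, egress_per_gb: float) -> float:
    """Storage cost plus egress cost, rounded to cents."""
    return round(storage_gb * per_gb + egress_gb * egress_per_gb, 2)

# 100 GB stored, 50 GB egress per month:
r2 = monthly_cost(100, 50, 0.015, 0.0)   # R2: no egress fees
s3 = monthly_cost(100, 50, 0.023, 0.09)  # S3: storage + egress
```

At this usage R2 comes to $1.50/month versus $6.80/month for S3; the egress fee dominates as traffic grows.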
## Common Commands

**Development:**

```bash
# Start backend
cd backend && source .venv/bin/activate && uvicorn main:app --reload

# Start frontend
cd frontend && npm run dev

# Start worker (optional)
cd backend && source .venv/bin/activate && python start_worker.py
```

**Docker:**

```bash
# Start all services
docker-compose up -d

# View logs
docker-compose logs -f api

# Stop all
docker-compose down

# Rebuild
docker-compose up -d --build
```

**Database:**

```bash
# Reset database
rm backend/giggle.db
cd backend && alembic upgrade head
```

## Troubleshooting

**Error:** `ModuleNotFoundError: No module named 'fastapi'`
- Solution: Activate the virtual environment and reinstall dependencies:

  ```bash
  source .venv/bin/activate
  pip install -r requirements.txt
  ```

**Error:** `openai.AuthenticationError`

- Solution: Check your `OPENAI_API_KEY` in `.env`

**Error:** `Database is locked`

- Solution: SQLite doesn't handle concurrent writes well. Use PostgreSQL for production.

**Error:** `Cannot find module 'react'`

- Solution: Install dependencies: `npm install`

**Error:** CORS error

- Solution: Ensure the backend is running on port 8000 and `CORS_ORIGINS` includes `http://localhost:5173`

**Error:** Network Error or 404

- Solution: Check that the backend is running on port 8000
- Solution: Verify the Vite proxy configuration in `vite.config.ts`

**Error:** Silent failure or timeout

- Solution: Check your OpenAI API key and account credits
- Solution: Check the backend logs for errors
## Production Deployment

See DEPLOYMENT.md for detailed production setup instructions.

```bash
# Create production .env
cat > .env << EOF
POSTGRES_PASSWORD=$(openssl rand -base64 32)
SECRET_KEY=$(openssl rand -base64 32)
OPENAI_API_KEY=your_openai_key_here
EOF

# Build and start
docker-compose up -d

# View logs
docker-compose logs -f
```

Access at: http://localhost
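If you don't have `openssl` handy, Python's stdlib `secrets` module generates equivalently strong values:

```python
import secrets

# URL-safe random tokens with 32 bytes of entropy each, comparable to
# `openssl rand -base64 32` for use as POSTGRES_PASSWORD and SECRET_KEY.
postgres_password = secrets.token_urlsafe(32)
secret_key = secrets.token_urlsafe(32)
```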
**Security checklist:**

- Change `SECRET_KEY` to a strong random value
- Set `DEBUG=false` in production
- Use HTTPS (set `secure=True` in the session cookie)
- Set a strong `POSTGRES_PASSWORD`
- Restrict CORS origins to your domain
- Keep dependencies updated
- Use environment variables for secrets (never commit `.env`)
- Enable rate limiting (future enhancement)
- Set up database backups
- ARCHITECTURE.md - System design and architecture details
- DEPLOYMENT.md - Comprehensive production deployment guide
This is a complete, production-ready codebase. Feel free to:
- Add new joke personas
- Implement battle system
- Add trending topics
- Create custom UI components
- Improve scoring algorithm
- Add tests
## License

MIT