AI-assisted Manim scene generation with a Next.js UI and server-side Python rendering.
```bash
cp .env.example .env.local
```

Set `GROQ_API_KEY` in `.env.local`.
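A minimal `.env.local` might look like the following (the variable name comes from this README; the value shown is a placeholder, not a real key):

```dotenv
# Groq API key used by the generation endpoint (placeholder value)
GROQ_API_KEY=your-key-here
```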
```bash
docker compose up --build
```

Open http://localhost:3000.
The app runs with hot reload and writes render artifacts under:
- `python/jobs`
- `public/jobs`
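Render artifacts accumulate across jobs. A quick way to reclaim disk space between sessions is to empty both directories (a sketch, assuming nothing other than job output lives in these paths):

```bash
# Remove cached render artifacts; the directories themselves are kept
mkdir -p python/jobs public/jobs   # ensure the directories exist first
rm -rf python/jobs/* public/jobs/*
```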
```bash
docker compose down
```

If you also want to remove dev volumes (including cached dependencies):

```bash
docker compose down -v
```

Build:
```bash
docker build -t engine-manim:latest .
```

Run:
```bash
docker run --rm -p 3000:3000 --env-file .env.local engine-manim:latest
```

To run locally without Docker, you need Node.js plus Python with the Manim system dependencies (FFmpeg/LaTeX stack), then:
```bash
npm ci
npm run dev
```

- Symptom: generation endpoint fails quickly.
- Fix: ensure `.env.local` exists and includes a valid `GROQ_API_KEY`, then restart the container.
- Symptom: `/api/generate` or `/api/compile` returns render errors.
- Fix: inspect container logs with `docker compose logs -f app` and verify the generated Manim code is valid.
- Symptom: long render times, process crashes, or unstable behavior.
- Fix: allocate more Docker memory/CPU, reduce scene complexity, and retry.
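For the first symptom above, a quick sanity check on the key file (the file and variable names come from this README; this only checks presence, not validity) is:

```bash
# Report whether GROQ_API_KEY is set in .env.local (does not validate the key itself)
grep -q '^GROQ_API_KEY=.' .env.local 2>/dev/null \
  && echo "GROQ_API_KEY present" \
  || echo "GROQ_API_KEY missing - copy .env.example and set it"
```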
- Never commit `.env.local`.
- Rotate any key that was accidentally shared in commits, screenshots, terminal history, or chat.
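To confirm the first rule above is actually enforced, `git check-ignore` reports whether a path matches an ignore rule in the repository:

```bash
# Exits 0 and prints the matching rule when .env.local is git-ignored
git check-ignore -v .env.local 2>/dev/null \
  || echo "WARNING: .env.local is NOT ignored - add it to .gitignore"
```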