To set up your environment locally, first set up a local PostgreSQL database. Assuming macOS:

```bash
brew install postgresql@16
brew services start postgresql@16
```

Test the installation with:

```bash
psql postgres
```

Next, create the tables and apply the grants. There is a script for each:

```bash
psql -d postgres -a -f src/app/backend/sql/ddl.sql
psql -d postgres -a -f src/app/backend/sql/grants.sql
```

Create a `.env` file in the project root (copy it from the template):

```bash
cp .env.template .env
```

Edit `.env` with your local database credentials if needed; the existing values should be fine:
```
# .env
PGHOST=localhost
PGPORT=5432
PGDATABASE=postgres
DB_USE_SSL=false
```

Note: the `.env` file is gitignored and will not be committed.
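The `PG*` names follow libpq's standard environment-variable conventions, so `psql` honors them as well. As an illustration only (this helper is hypothetical, not code from the repo), the connection settings can be assembled from the environment with the same defaults:

```python
import os

def pg_settings() -> dict:
    """Read Postgres connection settings from the environment,
    falling back to the defaults shown in .env above.
    Hypothetical helper for illustration; not part of the repo."""
    return {
        "host": os.environ.get("PGHOST", "localhost"),
        "port": int(os.environ.get("PGPORT", "5432")),
        "database": os.environ.get("PGDATABASE", "postgres"),
        "use_ssl": os.environ.get("DB_USE_SSL", "false").lower() == "true",
    }

if __name__ == "__main__":
    print(pg_settings())
```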
```bash
# Create a virtual environment (from the repo root)
python3 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install the package in editable mode
cd src/app
pip install -e .
```

Install the frontend dependencies:

```bash
cd src/app/frontend
npm install
```

Run the app:

```bash
# Terminal 1: start the backend (activate the venv from the repo root first)
source venv/bin/activate
cd src/app
uvicorn backend.app:app --reload --port 8000
```

The frontend will be available at http://localhost:8000.
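Once uvicorn is running, you can sanity-check it from another terminal. A small sketch (this helper is hypothetical, not part of the repo) that returns True when any HTTP response comes back:

```python
import urllib.request
import urllib.error

def is_backend_up(url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at `url`, False if nothing is listening.
    Hypothetical helper for illustration; not part of the repo."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except urllib.error.HTTPError:
        return True   # the server responded, even if with an error status
    except (urllib.error.URLError, OSError):
        return False  # connection refused, timeout, DNS failure, etc.

if __name__ == "__main__":
    print(is_backend_up("http://localhost:8000"))
```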
Create the `apprunner` role and the `control` schema:

```bash
psql postgres -c "CREATE USER apprunner WITH PASSWORD 'beepboop123';
CREATE SCHEMA IF NOT EXISTS control;
GRANT USAGE ON SCHEMA control TO apprunner;
GRANT apprunner TO CURRENT_USER;
ALTER SCHEMA control OWNER TO apprunner;
GRANT SELECT, INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA control TO apprunner;
GRANT USAGE, SELECT ON ALL SEQUENCES IN SCHEMA control TO apprunner;
ALTER DEFAULT PRIVILEGES IN SCHEMA control
GRANT USAGE, SELECT ON SEQUENCES TO apprunner;"
```
Followed by:

```bash
psql -d postgres -a -f path/to/your/LiveValidator/src/app/backend/sql/ddl.sql
psql -d postgres -a -f path/to/your/LiveValidator/src/app/backend/sql/grants.sql
psql -d postgres -a -f path/to/your/LiveValidator/src/app/backend/sql/seed_test_data.sql
```
```bash
# Build the frontend
cd src/app/frontend
npm run build

# Deploy to Databricks (from the repo root)
# The backend serves both the API and the frontend from /dist
databricks bundle deploy
```

To run the tests:

```bash
source venv/bin/activate
pip install pytest
pytest
```

Tests are designed to run without Spark: only pure Python logic is tested, and Spark-dependent code is tested manually on Databricks.
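A pure-Python test in this style needs nothing beyond an assert; pytest discovers any `test_*` function automatically. The helper below is a hypothetical example, not code from the repo:

```python
# tests/test_example.py -- hypothetical example, not part of the repo
def normalize_column_name(name: str) -> str:
    """Pure Python helper: trim, lowercase, and snake-case a column name."""
    return name.strip().lower().replace(" ", "_")

def test_normalize_column_name():
    assert normalize_column_name("  Order ID ") == "order_id"

if __name__ == "__main__":
    test_normalize_column_name()
    print("ok")
```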
In VS Code: Cmd+Shift+P → "Python: Configure Tests" → select pytest → select the tests folder.
- Local uses the `.env` file (loaded by `python-dotenv`); Databricks uses the App UI for environment variables.
- Never commit `.env` or `.pem` files (both are already gitignored).
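The local/Databricks split above usually comes down to a load-if-present pattern. A sketch of how `python-dotenv` is typically wired (an assumption; check the backend code for the actual setup):

```python
import os

try:
    from dotenv import load_dotenv  # installed locally; may be absent on Databricks
    load_dotenv()                   # reads .env from the working directory if it exists
except ImportError:
    pass                            # on Databricks, env vars come from the App UI instead

# After this point, both environments are read the same way:
PGHOST = os.environ.get("PGHOST", "localhost")
```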