AI-powered enrollment intelligence demo built on Databricks. Combines a Multi-Agent Supervisor (MAS) with Genie Space for SQL analytics and a Knowledge Assistant for document-based Q&A, all wrapped in a FastAPI + React application.
```
┌─────────────────────────────────────────────────────┐
│        React Frontend (Vite + TailwindCSS)          │
│ ┌──────────┐ ┌──────────┐ ┌────────────────────┐    │
│ │Dashboard │ │ AI Chat  │ │ Architecture View  │    │
│ └────┬─────┘ └────┬─────┘ └────────────────────┘    │
├──────┼────────────┼─────────────────────────────────┤
│                FastAPI Backend                      │
│ /api/metrics ──── SQL Warehouse ──── Gold Tables    │
│ /api/chat ─────── MAS Endpoint ─┬── Genie Space     │
│ /api/conversations              └── Knowledge Asst  │
│        └──────── Lakebase (PostgreSQL)              │
└─────────────────────────────────────────────────────┘
```
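A quick way to exercise the backend from Python is to hit `/api/health` (the route shipped in `server/routes/health.py`). This is a minimal sketch; the URL-joining helper is ours, and any auth layer Databricks Apps puts in front of the app is out of scope here.

```python
# Hedged sketch: checking the deployed app from Python. The /api/health route
# comes from the repo layout; SSO/auth in front of the app is not handled here.
BASE_URL = "https://brickcon-app-3438839487639471.11.azure.databricksapps.com"

def api_url(base: str, path: str) -> str:
    """Join the app base URL and an API path without doubling slashes."""
    return base.rstrip("/") + "/" + path.lstrip("/")

def check_health(base: str = BASE_URL) -> dict:
    import requests  # pip install requests
    resp = requests.get(api_url(base, "/api/health"), timeout=10)
    resp.raise_for_status()
    return resp.json()
```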
| Resource | ID / Name |
|---|---|
| App | brickcon-app |
| App URL | https://brickcon-app-3438839487639471.11.azure.databricksapps.com |
| SP client_id | 5be7b460-a4e7-4703-9144-b813dd38841f |
| MAS tile | aea8b05a-7532-477c-90f6-4038db05b238 |
| MAS endpoint | mas-aea8b05a-endpoint |
| KA tile | e236dcf9-8521-4c02-a9ca-3c3e7a24e007 |
| KA endpoint | ka-e236dcf9-endpoint |
| Genie Space | 01f118cd46b51745b597f8b56272b554 |
| Catalog.Schema | demo.sled_workshop |
| SQL Warehouse | a94a22f8652d85c1 |
| Lakebase Instance | wanderbricks-lakebase |
| Lakebase PG Schema | sled_workshop |
| Workspace Profile | dbx_shared_demo |
| Table | Description |
|---|---|
| `enrollment_by_college_term` | Enrollment counts by college and term with first-gen and regional breakdowns |
| `funnel_by_college_term` | Application funnel: applied → admitted → enrolled with admit/yield rates |
| `enrollment_demographics` | Enrollment by geographic region, campus, and first-gen status |
| `nursing_pipeline` | Nursing program funnel tracking the "double enrollment" initiative |
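The gold tables above can be queried directly from Python against the demo's SQL Warehouse. A hedged sketch, assuming the `databricks-sql-connector` package and a token in `DATABRICKS_TOKEN`; the `/sql/1.0/warehouses/{id}` HTTP path uses the warehouse ID from the resource table:

```python
# Hedged sketch of querying a gold table from Python, assuming
# databricks-sql-connector and env vars DATABRICKS_HOST / DATABRICKS_TOKEN.
import os

FUNNEL_QUERY = """
SELECT *
FROM demo.sled_workshop.funnel_by_college_term
LIMIT 10
"""

def run_query(query: str):
    from databricks import sql  # pip install databricks-sql-connector
    with sql.connect(
        server_hostname=os.environ["DATABRICKS_HOST"],
        http_path="/sql/1.0/warehouses/a94a22f8652d85c1",
        access_token=os.environ["DATABRICKS_TOKEN"],
    ) as conn:
        with conn.cursor() as cur:
            cur.execute(query)
            return cur.fetchall()
```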
- Databricks CLI configured with profile `dbx_shared_demo`
- Node.js 18+ (for frontend build)
```shell
# 1. Build frontend
cd app/frontend && npm install && npm run build && cd ../..

# 2. Deploy bundle (app + job + pipeline definitions)
databricks bundle deploy -t dbx_shared_demo

# 3. Generate data + gold tables (first time or data refresh)
databricks bundle run synthetic_data -t dbx_shared_demo

# 4. Deploy the app
databricks apps deploy brickcon-app --source-code-path app
```

After the first deploy, the app's service principal needs permissions:
```sql
-- Unity Catalog
GRANT USE CATALOG ON CATALOG demo TO `5be7b460-a4e7-4703-9144-b813dd38841f`;
GRANT USE SCHEMA ON SCHEMA demo.sled_workshop TO `5be7b460-a4e7-4703-9144-b813dd38841f`;
GRANT SELECT ON SCHEMA demo.sled_workshop TO `5be7b460-a4e7-4703-9144-b813dd38841f`;
```

Plus API permission grants:
- SQL Warehouse: `CAN_USE` on `a94a22f8652d85c1`
- MAS Endpoint: `CAN_QUERY` on `mas-aea8b05a-endpoint`
- KA Endpoint: `CAN_QUERY` on `ka-e236dcf9-endpoint`
- Genie Space: `CAN_RUN` on `01f118cd46b51745b597f8b56272b554`
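These grants can also be applied programmatically. A hedged sketch using the Databricks Permissions REST API (`PATCH /api/2.0/permissions/{object_type}/{object_id}`); note the permissions API expects object *IDs*, not endpoint names, so look those up first, and verify the object-type path segment against your workspace's API docs before relying on this:

```python
# Hedged sketch: granting the app's SP a permission level via the Databricks
# Permissions REST API. Path segments and ID placeholders are assumptions to
# verify against the official API reference.
import json
import os

SP_CLIENT_ID = "5be7b460-a4e7-4703-9144-b813dd38841f"

def acl_payload(sp_id: str, level: str) -> dict:
    """Build the access_control_list body for a permissions PATCH."""
    return {
        "access_control_list": [
            {"service_principal_name": sp_id, "permission_level": level}
        ]
    }

def grant(host: str, token: str, object_type: str, object_id: str, level: str):
    import requests  # pip install requests
    url = f"{host}/api/2.0/permissions/{object_type}/{object_id}"
    resp = requests.patch(
        url,
        headers={"Authorization": f"Bearer {token}"},
        data=json.dumps(acl_payload(SP_CLIENT_ID, level)),
    )
    resp.raise_for_status()
    return resp.json()
```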
Create the `sled_workshop` PG schema and tables in the `wanderbricks-lakebase` instance:

```sql
CREATE SCHEMA IF NOT EXISTS sled_workshop;
SET search_path TO sled_workshop;

CREATE TABLE conversations (
    id TEXT PRIMARY KEY,
    user_email TEXT NOT NULL,
    title TEXT,
    created_at TIMESTAMPTZ DEFAULT NOW(),
    updated_at TIMESTAMPTZ DEFAULT NOW()
);

CREATE TABLE messages (
    id TEXT PRIMARY KEY,
    conversation_id TEXT NOT NULL REFERENCES conversations(id) ON DELETE CASCADE,
    role TEXT NOT NULL,
    content TEXT NOT NULL,
    route TEXT,
    created_at TIMESTAMPTZ DEFAULT NOW()
);

CREATE INDEX idx_messages_conv ON messages(conversation_id, created_at);
CREATE INDEX idx_conversations_user ON conversations(user_email, updated_at DESC);
```

```
sled_workshop/
├── databricks.yml                    # Asset Bundle config
├── resources/                        # Bundle resource definitions
│   ├── brickcon_app.app.yml          # App resource
│   ├── synthetic_data.job.yml        # Data gen job
│   └── osu_workshop.pipeline.yml     # SDP pipeline (skeleton)
├── app/                              # Databricks App
│   ├── app.py                        # FastAPI entry + middleware
│   ├── app.yaml                      # App config (env vars, resources)
│   ├── requirements.txt
│   ├── server/
│   │   ├── config.py                 # OAuth + workspace host
│   │   ├── db.py                     # Lakebase connection pool
│   │   ├── llm.py                    # MAS client (async polling)
│   │   ├── sql_client.py             # SQL Warehouse client
│   │   └── routes/
│   │       ├── chat.py               # /api/chat/* (MAS)
│   │       ├── conversations.py      # /api/conversations/* (Lakebase)
│   │       ├── health.py             # /api/health
│   │       └── metrics.py            # /api/metrics/* (SQL)
│   └── frontend/
│       ├── src/
│       │   ├── pages/
│       │   │   ├── Dashboard.tsx
│       │   │   ├── Chat.tsx
│       │   │   └── Architecture.tsx
│       │   └── lib/api.ts
│       └── dist/                     # Built frontend (deployed)
├── scripts/
│   └── generate_enrollment_data.py   # Synthetic data + gold tables
└── src/
    └── pipelines/                    # SDP pipeline code (skeleton)
```
- SP token for SQL + MAS: Databricks Apps forward a user token via `x-forwarded-access-token`, but its scopes are limited to IAM only. All SQL Warehouse and serving-endpoint calls therefore use the app's service principal token.
- User token for Lakebase: Lakebase requires the user's forwarded OAuth token (SP tokens lack security labels). The middleware captures it on the first request.
- Async polling for MAS: the Databricks Apps proxy buffers SSE streams, so the backend uses async task polling (POST to start, GET to poll status) instead of streaming.
- stateMap multi-chat: chat state is stored per-conversation in a `useRef<Map>` to prevent animation/message bleed when switching between conversations while a MAS query is in flight.
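The start-then-poll pattern described above can be sketched as follows. The `/api/chat` route names, `task_id`, and response fields here are assumptions for illustration, not the app's exact API:

```python
# Hedged sketch of the async-polling MAS client pattern. Route names and
# JSON fields are hypothetical; only the POST-then-poll shape comes from
# the design notes above.
import time

def poll_delays(max_wait: float, base: float = 0.5, cap: float = 4.0):
    """Yield exponential-backoff delays until max_wait seconds are spent."""
    elapsed, delay = 0.0, base
    while elapsed < max_wait:
        d = min(delay, cap, max_wait - elapsed)
        yield d
        elapsed += d
        delay *= 2

def ask_mas(session, base_url: str, prompt: str, max_wait: float = 60.0):
    # POST starts an async task; the proxy buffers SSE, so no streaming here.
    task = session.post(f"{base_url}/api/chat", json={"message": prompt}).json()
    for d in poll_delays(max_wait):
        time.sleep(d)
        status = session.get(f"{base_url}/api/chat/{task['task_id']}").json()
        if status.get("state") == "done":
            return status["answer"]
    raise TimeoutError("MAS task did not complete in time")
```

Capping the backoff keeps the UI responsive once a long MAS query finally finishes, while the exponential ramp avoids hammering the app proxy early on.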