Hands-on tutorials for each stage of Ekaya's AI Data Readiness Framework. Each stage has its own directory with tutorials that progressively build skills.
A 5-stage maturity model for AI database access:
- Exploring -- Unstructured AI experimentation. Move from CSV uploads and copy-paste workflows to sanctioned database access paths using Ekaya's MCP Server.
- Connected -- Structured database interfaces via MCP-compatible tools with an ontology layer for semantic understanding. AI becomes a development partner for data engineering.
- Empowered -- Natural language querying for non-technical business users with approval workflows and audit trails.
- Governed -- Automated security and compliance monitoring with real-time sensitive data detection and classification.
- Autonomous -- AI agents operate independently within pre-approved boundaries with immutable action logging.
## Prerequisites

- Python 3.10+
- An Ekaya account
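Before installing dependencies, it can help to confirm your interpreter meets the 3.10+ requirement. A minimal check (not part of the tutorials themselves):

```python
import sys

# The tutorials require Python 3.10 or newer.
ok = sys.version_info >= (3, 10)
print("Python OK" if ok else "Python 3.10+ required")
```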
## Setup

Create and activate the Python virtual environment from the project root before working in any tutorial directory:
```bash
python3 -m venv .venv
source .venv/bin/activate    # macOS/Linux
# .venv\Scripts\activate     # Windows
pip install -r requirements.txt
```

Copy the environment example and fill in your Postgres database credentials:
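The authoritative key names come from `.env.example` in this repo. As an illustration only, a shell-sourceable Postgres credential set often looks like the following (every name and value below is a placeholder):

```bash
# Illustrative placeholders -- use the keys defined in .env.example.
export PGHOST=localhost
export PGPORT=5432
export PGDATABASE=your_database
export PGUSER=your_user
export PGPASSWORD=your_password
```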
```bash
cp .env.example .env
# Edit .env with your credentials
source .env
```

Copy the MCP configuration example and replace the placeholder with your MCP Server URL from Ekaya:
```bash
cp .mcp.json.example .mcp.json
# Edit .mcp.json and replace the URL with your MCP Server URL
```

Then `cd` into a tutorial directory and follow its README.
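As an illustration, many MCP-compatible clients accept a configuration of roughly this shape; the exact schema comes from `.mcp.json.example`, and the server name and URL below are placeholders:

```json
{
  "mcpServers": {
    "ekaya": {
      "type": "http",
      "url": "https://YOUR-MCP-SERVER-URL"
    }
  }
}
```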
## Tutorials

For The Look-based ontology refinement and early approved-query setup, start in `connected/tutorial-00-forging-ontology/`. That tutorial now includes local copies of `THE_LOOK_README.md` and `data_dictionary.csv`, so ontology-question work and pre-approved query setup can happen in one place.
- `exploring/tutorial-00-connecting-databases-to-ai/` -- Connect your database to Claude, ChatGPT, Cursor, or any MCP-compatible AI tool.
- `exploring/tutorial-01-creating-new-database-from-csvs/` -- Use Ekaya's MCP Server to create a new database from CSV datasets. This dataset is used throughout subsequent tutorials.
- `exploring/tutorial-02-understanding-existing-databases/` -- Let AI explore, document, and explain your database schema, relationships, and data patterns.
- `connected/tutorial-00-forging-ontology/` -- Extract an ontology, answer clarifying questions, and set up the early pre-approved query foundation using the included The Look reference files.
- `connected/tutorial-01-creating-and-analyzing-excel-files/` -- Use Claude Code and Ontology Forge to turn live database data into a real Excel workbook.
- `connected/tutorial-02-creating-powerpoint-presentations-from-data/` -- Use Claude Code and Ontology Forge to turn live database data into a real PowerPoint deck.
- `connected/tutorial-03-using-advanced-sql/` -- Write advanced SQL queries including CTEs, window functions, correlated subqueries, and conditional aggregation. Ontology-powered for consistent, accurate results.
- `autonomous/tutorial-00-marketing-budget-optimizer-agent/` -- Build a Stage 5 autonomous agent that uses the pre-approved queries prepared earlier to evaluate channel performance and write logged budget recommendations.
## Repository layout

```text
exploring/       # Stage 1: Exploring tutorials
  tutorial-00-connecting-databases-to-ai/
  tutorial-01-creating-new-database-from-csvs/
  tutorial-02-understanding-existing-databases/
connected/       # Stage 2: Connected tutorials
  tutorial-00-forging-ontology/
  tutorial-01-creating-and-analyzing-excel-files/
  tutorial-02-creating-powerpoint-presentations-from-data/
  tutorial-03-using-advanced-sql/
empowered/       # Stage 3: Empowered tutorials (coming soon)
governed/        # Stage 4: Governed tutorials (coming soon)
autonomous/      # Stage 5: Autonomous tutorials
  tutorial-00-marketing-budget-optimizer-agent/
```