Enterprise-grade automated schema discovery and topological mapping for relational database architectures.
System Overview • Core Capabilities • Architecture & Tech Stack • Deployment & Execution • API Integration • Developer • License
## System Overview

Database Ontology Mapper is a full-stack platform engineered to automatically extract, model, and visualize the semantic layer of complex relational databases. Built for enterprise environments, it establishes a read-only connection to existing data infrastructure to generate intuitive, interactive topological maps of schema architectures.
The system facilitates rapid knowledge transfer, architectural auditing, and data governance by transforming disparate database schemas into a centralized, explorable ontology graph.
### Use Cases

- Architectural Auditing: Conduct comprehensive reviews of database relationships and data lineage across multi-database environments.
- Automated Documentation: Replace static schema documents with a living, automatically generated visual ontology.
- Enterprise Onboarding: Accelerate developer and analyst understanding of complex, legacy database structures.
- Secure Discovery: Utilize strictly read-only connections to ensure safe exploration without risk to production workloads.
## Core Capabilities

- Automated Extraction: Programmatically identify and catalog tables, columns, primary keys, and indices from MySQL/MariaDB instances.
- Topological Mapping: Automatically detect and map foreign key constraints to establish complex relationship graphs.
- Multi-Tenant Support: Concurrently interface with and aggregate schemas from multiple disparate database environments.
- Interactive Graph Rendering: High-performance, zoomable, and pannable node-edge visualizations of data structures.
- Adaptive Layouts: Toggle between hierarchical schema representations and relationship-centric topologies.
- Advanced Global Search: Instantly query across the entire ontology for specific databases, tables, or column entities.
- RESTful Architecture: Exposes a robust API for programmatic access to the extracted ontology data.
- Container-Ready: Architected for seamless deployment via Docker in continuous integration pipelines.
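To illustrate the general technique behind automated extraction, the sketch below reflects tables, primary keys, and foreign-key relationships through SQLAlchemy's inspection API. This is an illustrative sample only, not the project's actual extractor (`src/extractor.py`); the `discover_schema` function and its return shape are assumptions for demonstration.

```python
# Illustrative sketch of read-only schema discovery with SQLAlchemy's
# inspection API. The project's extractor may differ; this only shows
# the general technique of metadata reflection.
from sqlalchemy import create_engine, inspect


def discover_schema(url: str) -> dict:
    """Catalog tables, columns, primary keys, and foreign keys."""
    engine = create_engine(url)
    insp = inspect(engine)
    schema = {}
    for table in insp.get_table_names():
        schema[table] = {
            "columns": [c["name"] for c in insp.get_columns(table)],
            "primary_key": insp.get_pk_constraint(table)["constrained_columns"],
            "foreign_keys": [
                {"columns": fk["constrained_columns"],
                 "refers_to": fk["referred_table"]}
                for fk in insp.get_foreign_keys(table)
            ],
        }
    return schema
```

Because the inspector only issues metadata queries, this pattern is compatible with the strictly read-only access model described above.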
## Architecture & Tech Stack

The platform utilizes a decoupled architecture, separating the computationally intensive extraction engine from the client-side rendering application.
**Backend**

- FastAPI: High-performance asynchronous API framework.
- SQLAlchemy: Enterprise-grade database toolkit utilized for connection pooling and metadata extraction.
- Pydantic: Strict data validation and settings management using Python type hinting.
- Click: Command-line interface orchestration for administrative operations.
**Frontend**

- React 18 & TypeScript: Strongly typed, component-based user interface framework.
- ReactFlow: Specialized library for rendering complex node-based graphs and interactive diagrams.
- Framer Motion: Declarative animation library for fluid state transitions.
- Vite: Next-generation frontend build tooling for optimized asset delivery.
## Deployment & Execution

### Prerequisites

- Python 3.8 or higher
- Node.js 16 or higher
- Target MySQL/MariaDB database with explicitly granted read-only access.
Clone the repository and configure the backend virtual environment:

```bash
git clone https://github.com/yourusername/Ontology.git
cd Ontology
python -m venv venv

# On Linux/macOS:
source venv/bin/activate
# On Windows:
# venv\Scripts\activate

pip install -r requirements.txt
```

Initialize the frontend environment:
```bash
cd frontend
npm install
npm run build
cd ..
```

Define the target database endpoints by copying the environment template:

```bash
cp .env.example .env
```

Edit the .env file with the appropriate connection parameters:
```ini
# Target Database Node 1
DB_1_HOST=localhost
DB_1_PORT=3306
DB_1_NAME=database1
DB_1_USER=readonly_user
DB_1_PASSWORD=your_password

# Target Database Node 2
DB_2_HOST=db2.example.com
DB_2_PORT=3306
DB_2_NAME=database2
DB_2_USER=readonly_user
DB_2_PASSWORD=your_password

# Application Server Configuration
API_HOST=0.0.0.0
API_PORT=8000
OUTPUT_DIR=output
```

**Security Note:** It is strongly recommended to execute the extraction as an isolated database user whose SELECT permissions are restricted solely to the target schemas and information_schema.
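As a hedged sketch of how the numbered `DB_N_*` variables above could be collected at startup, the helper below walks the environment until a numbered host is missing. The `DatabaseSettings` dataclass and `load_database_settings` function are illustrative assumptions, not the project's actual configuration API (which uses Pydantic).

```python
# Illustrative only: gather DB_1_*, DB_2_*, ... environment variables
# into per-database settings objects. Names match the sample .env above;
# the dataclass and helper are assumptions, not the project's API.
import os
from dataclasses import dataclass
from typing import List


@dataclass
class DatabaseSettings:
    host: str
    port: int
    name: str
    user: str
    password: str


def load_database_settings(environ=os.environ) -> List[DatabaseSettings]:
    """Collect numbered DB_N_* variables until a host is missing."""
    settings, n = [], 1
    while f"DB_{n}_HOST" in environ:
        settings.append(DatabaseSettings(
            host=environ[f"DB_{n}_HOST"],
            port=int(environ.get(f"DB_{n}_PORT", "3306")),  # MySQL default
            name=environ[f"DB_{n}_NAME"],
            user=environ[f"DB_{n}_USER"],
            password=environ[f"DB_{n}_PASSWORD"],
        ))
        n += 1
    return settings
```

The numbering convention makes adding a third database a pure configuration change: append a `DB_3_*` block to `.env` and re-run the extraction.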
Execute the schema extraction process:

```bash
python -m src.main extract
```

This process connects to the specified endpoints, constructs the relationship graph, and serializes the ontology to `output/ontology.json`.
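Once the extraction has run, the serialized ontology can be inspected directly. The exact JSON layout is defined by the extractor, so the snippet below deliberately makes no assumptions about field names; it only loads the file and reports its top-level structure.

```python
# Load the serialized ontology and report the type of each top-level
# field. Useful as a quick sanity check after an extraction run.
import json
from pathlib import Path


def summarize(path: str = "output/ontology.json") -> dict:
    """Map each top-level key of the ontology file to its JSON type."""
    data = json.loads(Path(path).read_text())
    return {key: type(value).__name__ for key, value in data.items()}
```

This is handy in CI pipelines: a failed extraction typically surfaces here as a missing file or an empty top-level mapping.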
Initialize the application server:

```bash
python -m src.main serve
```

The application interface will be accessible at http://localhost:8000.
## API Integration

The application exposes a standard REST API for direct access to the semantic model:
- `GET /api/ontology` - Retrieve the complete aggregated schema topology.
- `GET /api/databases` - Enumerate all discovered database instances.
- `GET /api/databases/{name}` - Retrieve specific database metadata.
- `GET /api/databases/{db}/tables/{table}` - Retrieve the schema definition for a specific table.
- `GET /api/relationships` - Retrieve the global edge list of foreign key references.
- `GET /api/search?q={query}` - Execute a global textual search across all entities.
- `GET /api/stats` - Retrieve aggregate ontology metrics.
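The endpoints above can be consumed with nothing beyond the standard library. The sketch below builds and calls the global-search endpoint; the base URL assumes the default server configuration (`localhost:8000`), and the helper names are illustrative, not part of the project.

```python
# Minimal standard-library client for the search endpoint. The base URL
# assumes the default API_HOST/API_PORT configuration; helper names are
# illustrative.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

BASE = "http://localhost:8000"


def search_url(query: str) -> str:
    """Build the global-search endpoint URL, encoding the query string."""
    return f"{BASE}/api/search?{urlencode({'q': query})}"


def search(query: str) -> dict:
    """Execute a search against a running server and decode the JSON body."""
    with urlopen(search_url(query)) as resp:
        return json.load(resp)
```

With the server running, `search("customer")` returns the decoded JSON response from `/api/search?q=customer`.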
The project structure is organized to cleanly separate concerns between extraction, API serving, and client rendering:
```
Ontology/
├── src/
│   ├── api.py           # FastAPI service definitions
│   ├── config.py        # Environment and state configuration
│   ├── extractor.py     # Core schema discovery and mapping logic
│   ├── models.py        # Pydantic data schemas
│   └── main.py          # CLI application entrypoint
├── frontend/            # React/Vite SPA application
├── requirements.txt     # Python dependency manifest
└── .env.example         # Environment configuration template
```
To run the frontend in development mode with hot-module replacement (HMR):

```bash
cd frontend
npm run dev
```

## Contributing

We welcome contributions to the Database Ontology Mapper. Please follow the standard pull request workflow. All code submissions must pass existing CI checks and adhere to the project's typing and style guidelines.
## Developer

Peyton Nowlin
- Contact: Peyton@nowlinautomation.com
- Web: nowlinautomation.com
## License

This software is distributed under the MIT License. Refer to the LICENSE file for the complete terms of distribution and usage.
## Acknowledgements

- Interface conceptualization inspired by enterprise data integration platforms such as Palantir Foundry.
- Graph rendering engine powered by ReactFlow.