4 changes: 4 additions & 0 deletions vscode/vscode-tome-builder/.gitignore
@@ -4,3 +4,7 @@ node_modules/
.vscode-test/
*.vsix
mcp-server/src/assets.ts

# Ignore auto-generated MCP configurations
.vscode/cline_mcp_settings.json
**Collaborator** commented on `.vscode/cline_mcp_settings.json`:

> Cline?

.vscode/mcp.json
13 changes: 0 additions & 13 deletions vscode/vscode-tome-builder/.vscode/launch.json

This file was deleted.

674 changes: 674 additions & 0 deletions vscode/vscode-tome-builder/LICENSE

Large diffs are not rendered by default.

93 changes: 49 additions & 44 deletions vscode/vscode-tome-builder/README.md
@@ -1,76 +1,81 @@
# Realm Tome Builder VS Code Extension

**Tome Builder** is an AI-powered VS Code extension designed to assist operators and developers in creating **Realm Tomes**. It leverages the **Gemini API** and a local **Model Context Protocol (MCP)** server to provide expert guidance, documentation, and syntax validation for the **Eldritch DSL**.
**Tome Builder** is a VS Code extension that integrates the **Realm Tome Model Context Protocol (MCP) Server** with your preferred AI chat assistants (such as Cline, Continue, and Copilot).

## Features

* **AI Chat Assistant**: Interact with an "Expert Tome Developer" persona to generate Tome logic and configuration.
* **Context-Aware**: The AI has access to up-to-date Eldritch documentation and real-world Tome examples via the embedded MCP server.
* **One-Click Save**: Easily save generated `metadata.yml` and `main.eldritch` code blocks directly to your workspace.
* **Model Selection**: Dynamically list and select available Gemini models (e.g., `gemini-2.0-flash-exp`) compatible with your API key.
* **Syntax Validation**: The AI validates generated code structure before presenting it to you.

## Architecture
By installing this extension, you enable your AI tools to understand the **Eldritch DSL** and access real-world **Tome** examples, allowing the AI to generate accurate, syntactically correct Tomes for your offensive security workflows.

This extension utilizes a **Model Context Protocol (MCP)** architecture:
## Features

1. **VS Code Extension**: Provides the Chat UI and manages the lifecycle of the MCP server.
2. **MCP Server** (Local): A Node.js server running locally that exposes:
* `get_documentation`: Eldritch and Tome reference materials.
* `get_tome_examples`: Best-practice examples (e.g., file writing, service persistence).
* `validate_tome_structure`: Basic structural validation for Tomes.
3. **LLM (Gemini)**: The extension connects to Google's Gemini API, using the MCP client to call tools on the local server.
* **Zero-Config Integration**: Automatically registers the bundled Tome MCP Server with popular AI extensions like Cline.
* **Context-Aware AI**: Provides your AI assistants with up-to-date Eldritch documentation and reference materials.
* **Copilot Ready**: Exposes the standard VS Code `mcpServers` contribution point for native GitHub Copilot Chat integration (once native MCP support is fully released).

## Installation & Setup

### Prerequisites

* Node.js (v18+)
* npm
* VSCE (Visual Studio Code Extension Manager) - `npm install -g @vscode/vsce`

### Installation via VSIX (Recommended)

### Building from Source
To install the extension directly into VS Code, you can package it into a `.vsix` file:

1. **Clone the repository** and navigate to the extension folder:
```bash
cd vscode-tome-builder
```

2. **Install Dependencies**:
You need to install dependencies for both the extension and the internal MCP server.
2. **Install Dependencies & Build the MCP Server**:
```bash
# Install extension dependencies
npm install

# Install MCP server dependencies and build it
cd mcp-server
npm install
npm run build
cd ..
npm run build-mcp
```

3. **Compile the Extension**:
3. **Package the Extension**:
```bash
npm run compile
vsce package
```
*This will generate a `tome-builder-0.1.0.vsix` file in the directory.*

4. **Install the Extension**:
* Open VS Code.
* Go to the Extensions view (`Ctrl+Shift+X` or `Cmd+Shift+X`).
* Click the **...** (Views and More Actions) menu in the top right of the Extensions view.
* Select **Install from VSIX...**
* Locate and select the `tome-builder-0.1.0.vsix` file you just generated.

### Configuration
### Building from Source (Development)

Before using the extension, you must provide your Google Gemini API Key.
1. **Clone the repository** and navigate to the extension folder:
```bash
cd vscode-tome-builder
```

2. **Install Dependencies & Build**:
```bash
npm install
npm run build-mcp
npm run compile
```

1. Open VS Code Settings (`Ctrl+,` or `Cmd+,`).
2. Search for **Tome Builder**.
3. Enter your key in **Tome Builder > Llm: Api Key**.
4. (Optional) You can select a default model in **Tome Builder > Llm: Model**, or use the dropdown in the chat interface.
### Usage

## Usage
Once the extension is installed and activated, it will automatically attempt to configure the workspace for supported AI tools.

1. Open the **Tome Builder** view from the Activity Bar (beaker icon).
2. Type a request in the chat, for example:
> "Create a tome that installs a systemd service to run /usr/bin/myimplant."
3. The AI will analyze your request, consult documentation/examples if needed, and generate the required files.
4. Click the **Save** button above the code blocks to save `metadata.yml` and `main.eldritch` to your current workspace.
**For Cline:**
The extension will inject the `tome-builder` MCP server into your workspace's `.vscode/cline_mcp_settings.json`. You can then open Cline and ask it to "Create a Realm Tome to establish persistence".
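
The injected entry follows Cline's `mcpServers` settings layout; a hypothetical sketch of the result (the install path placeholder is illustrative, and the exact schema depends on your Cline version):

```json
{
  "mcpServers": {
    "tome-builder": {
      "command": "node",
      "args": ["<extension-install-path>/mcp-server/dist/index.js"]
    }
  }
}
```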

## Troubleshooting
**For GitHub Copilot Chat (Requires VS Code Insiders):**
Native MCP support in GitHub Copilot is currently in preview and requires using the VS Code Insiders build.
1. Download and install [VS Code Insiders](https://code.visualstudio.com/insiders/).
2. Open your VS Code Settings (`Ctrl+,` or `Cmd+,`).
3. Search for `chat.experimental.mcp` and check the box to enable it.
4. Reload the window.
5. The extension contributes the `tome-builder` MCP server automatically. You can now tag Copilot in chat (or use inline chat) to ask about Eldritch and Tome creation, and it will route requests to the server.
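
For reference, a workspace-level `.vscode/mcp.json` registering the same server might look like the sketch below. The field names follow VS Code's MCP preview documentation at the time of writing; treat the exact schema as an assumption, since the preview format may change:

```json
{
  "servers": {
    "tome-builder": {
      "type": "stdio",
      "command": "node",
      "args": ["<extension-install-path>/mcp-server/dist/index.js"]
    }
  }
}
```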

* **"MCP Client not connected"**: Ensure you ran `npm run build` in the `mcp-server` directory. The extension relies on `mcp-server/dist/index.js` existing.
* **404 Model Not Found**: The configured model might not be available in your region or for your API key tier. Use the dropdown in the chat header to select a valid model.
### Manual Registration
If your AI assistant's settings were not updated automatically, you can manually trigger the registration:
1. Open the Command Palette (`Ctrl+Shift+P` / `Cmd+Shift+P`).
2. Run **Tome Builder: Register MCP Server for AI extensions**.
2 changes: 1 addition & 1 deletion vscode/vscode-tome-builder/mcp-server/package.json
@@ -15,4 +15,4 @@
"typescript": "^5.0.0",
"@types/node": "^18.0.0"
}
}
}
39 changes: 39 additions & 0 deletions vscode/vscode-tome-builder/mcp-server/src/docs.ts
@@ -0,0 +1,39 @@
import * as fs from 'fs';
import * as path from 'path';

function findRepoRoot(): string | null {
  let currentDir = __dirname;
  while (currentDir !== path.parse(currentDir).root) {
    if (fs.existsSync(path.join(currentDir, 'go.mod'))) {
      // Found the repo root (go.mod lives at the root of this repo).
      return currentDir;
    }
    if (fs.existsSync(path.join(currentDir, '.git'))) {
      return currentDir;
    }
    currentDir = path.dirname(currentDir);
  }
  return null;
}

const REPO_ROOT = findRepoRoot();

function readDocFile(relativePath: string): string {
  if (!REPO_ROOT) {
    return "Error: Could not locate repository root. Ensure you are running this in the realm repository.";
  }
  const fullPath = path.join(REPO_ROOT, relativePath);
  try {
    return fs.readFileSync(fullPath, 'utf-8');
  } catch (e) {
    return `Error reading file ${fullPath}: ${e}`;
  }
}

export function getTomesDoc(): string {
  return readDocFile('docs/_docs/user-guide/tomes.md');
}

export function getEldritchDoc(): string {
  return readDocFile('docs/_docs/user-guide/eldritch.md');
}
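
The upward marker search in `findRepoRoot` can be exercised outside the extension. A minimal, self-contained variant that takes an explicit start directory (the temp-directory layout below is illustrative):

```typescript
import * as fs from 'fs';
import * as os from 'os';
import * as path from 'path';

// Walk parent directories until a repo marker (go.mod or .git) is found,
// mirroring the search used by docs.ts.
function findRepoRoot(startDir: string): string | null {
  let currentDir = startDir;
  while (currentDir !== path.parse(currentDir).root) {
    if (fs.existsSync(path.join(currentDir, 'go.mod')) ||
        fs.existsSync(path.join(currentDir, '.git'))) {
      return currentDir;
    }
    currentDir = path.dirname(currentDir);
  }
  return null;
}

// Build a throwaway tree: <root>/vscode/vscode-tome-builder/mcp-server,
// with go.mod at <root>, then search upward from the deepest directory.
const root = fs.mkdtempSync(path.join(os.tmpdir(), 'repo-'));
const nested = path.join(root, 'vscode', 'vscode-tome-builder', 'mcp-server');
fs.mkdirSync(nested, { recursive: true });
fs.writeFileSync(path.join(root, 'go.mod'), 'module example\n');

console.log(findRepoRoot(nested) === root); // → true
```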
47 changes: 47 additions & 0 deletions vscode/vscode-tome-builder/mcp-server/src/examples.ts
@@ -0,0 +1,47 @@
import * as fs from 'fs';
import * as path from 'path';

function findRepoRoot(): string | null {
  let currentDir = __dirname;
  while (currentDir !== path.parse(currentDir).root) {
    if (fs.existsSync(path.join(currentDir, 'go.mod'))) {
      return currentDir;
    }
    if (fs.existsSync(path.join(currentDir, '.git'))) {
      return currentDir;
    }
    currentDir = path.dirname(currentDir);
  }
  return null;
}

const REPO_ROOT = findRepoRoot();

function readExample(tomeName: string): { metadata: string, script: string } {
  if (!REPO_ROOT) {
    return {
      metadata: "Error: Could not locate repository root.",
      script: "Error: Could not locate repository root."
    };
  }

  const tomeDir = path.join(REPO_ROOT, `tavern/tomes/${tomeName}`);
  try {
    const metadata = fs.readFileSync(path.join(tomeDir, 'metadata.yml'), 'utf-8');
    const script = fs.readFileSync(path.join(tomeDir, 'main.eldritch'), 'utf-8');
    return { metadata, script };
  } catch (e) {
    return {
      metadata: `Error reading example ${tomeName}: ${e}`,
      script: `Error reading example ${tomeName}: ${e}`
    };
  }
}

export function getFileWriteExample() {
  return readExample('file_write');
}

export function getPersistServiceExample() {
  return readExample('persist_service');
}
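
`readExample` assumes each tome directory under `tavern/tomes/` pairs a `metadata.yml` with a `main.eldritch` script, e.g. for the `file_write` example above (annotations are illustrative, not Realm's authoritative schema):

```
tavern/tomes/file_write/
├── metadata.yml      # tome name, description, and parameter definitions
└── main.eldritch     # the Eldritch script the agent executes
```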