- **Multiple Service Interfaces** 🔄 - REST API and MCP (Model-Compiler-Processor) interface.

---
Or, if you want to run the services directly on your own computer:

- **Python 3.8+** 🐍
- **Rust Compiler and cargo tools** 🦀

---
## 📦 Install
```bash
git clone https://github.com/WasmEdge/Rust_coder
cd Rust_coder
```

## 🚀 Configure and run
By default, you will need a [Qdrant server](https://qdrant.tech/documentation/quickstart/) running on `localhost` port `6333`. You also need a [local Gaia node](https://github.com/GaiaNet-AI/node-configs/tree/main/qwen-2.5-coder-3b-instruct-gte). Set the following environment variables in your terminal to point to the Qdrant and Gaia instances, as well as your Rust compiler tools.
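If you don't already have a Qdrant server, the quickstart linked above runs one in Docker; a minimal sketch:

```bash
# Sketch, following the Qdrant quickstart: run a local Qdrant instance
# in Docker, listening on the default port 6333.
docker run -d -p 6333:6333 qdrant/qdrant
```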
```
QDRANT_HOST=localhost
QDRANT_PORT=6333
LLM_API_BASE=http://localhost:8080/v1
LLM_MODEL=Qwen2.5-Coder-3B-Instruct
LLM_EMBED_MODEL=nomic-embed
LLM_API_KEY=your_api_key
LLM_EMBED_SIZE=768
CARGO_PATH=/path/to/cargo
RUST_COMPILER_PATH=/path/to/rustc
```
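One way to set these is to `export` them in the same terminal before starting the services. A sketch using the example values above; the API key is a placeholder, and the tool paths are resolved from your `PATH` (adjust for your machine):

```bash
# Example values only; adjust the key and paths for your machine.
export QDRANT_HOST=localhost
export QDRANT_PORT=6333
export LLM_API_BASE=http://localhost:8080/v1
export LLM_MODEL=Qwen2.5-Coder-3B-Instruct
export LLM_EMBED_MODEL=nomic-embed
export LLM_API_KEY=your_api_key
export LLM_EMBED_SIZE=768
export CARGO_PATH="$(command -v cargo)"   # or an explicit /path/to/cargo
export RUST_COMPILER_PATH="$(command -v rustc)"
```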
Start the services.
```
[filename: Cargo.toml]
[package]
name = "calculator"
version = "0.1.0"
edition = "2021"

[dependencies]
... ...

[filename: src/main.rs]
fn main() {
    // Calculator implementation
}
... ...
```
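The `[filename: ...]` markers are what lets the response be split back into individual project files. As a rough illustration only (not the project's actual implementation, which lives in `app/response_parser.py`), the splitting step can be sketched in the shell:

```bash
# Illustration only: split a "[filename: ...]"-formatted response into files.
# First, a sample response in the format shown above:
cat > response.txt <<'EOF'
[filename: Cargo.toml]
[package]
name = "calculator"
version = "0.1.0"
edition = "2021"

[filename: src/main.rs]
fn main() {
    // Calculator implementation
}
EOF

# Each "[filename: X]" line starts a new output file under out/;
# every following line is appended to that file.
mkdir -p out
awk '
  /^\[filename: / {
    name = $0
    sub(/^\[filename: /, "", name)
    sub(/\]$/, "", name)
    file = "out/" name
    system("mkdir -p $(dirname \"" file "\")")
    next
  }
  file { print > file }
' response.txt
```

After running this, `out/Cargo.toml` and `out/src/main.rs` contain the two generated files.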
Rust_coder_lfx/
├── app/                      # Application code
│   ├── compiler.py           # Rust compilation handling
│   ├── llm_client.py         # LLM API client
│   ├── llm_tools.py          # Tools for LLM interactions
│   ├── load_data.py          # Data loading utilities
│   ├── main.py               # FastAPI application & endpoints
│   ├── mcp_server.py         # MCP server implementation
│   ├── mcp_service.py        # Model-Compiler-Processor service
│   ├── mcp_tools.py          # MCP-specific tools
│   ├── prompt_generator.py   # LLM prompt generation
│   ├── response_parser.py    # Parse LLM responses into files
│   ├── utils.py              # Utility functions
│   └── vector_store.py       # Vector database interface
├── data/                     # Data storage
│   ├── error_examples/       # Error examples for vector search
│   └── project_examples/     # Project examples for vector search