
Commit d0c189d

Add docs for ollama setup (#122)
* docs: update README and .env.example for Ollama model support
* docs: update name for uniformity
* docs: update README to include instructions for downloading model weights
* chore: update .env.example

Signed-off-by: Palaniappan R <[email protected]>
1 parent 1441a9f commit d0c189d

3 files changed: 24 additions, 2 deletions

README.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,4 +1,4 @@
-# OR-Assistant
+# ORAssistant
 
 [![ORAssistant CI](https://github.com/The-OpenROAD-Project/ORAssistant/actions/workflows/ci.yaml/badge.svg)](https://github.com/The-OpenROAD-Project/ORAssistant/actions/workflows/ci.yaml)
 
```

backend/.env.example

Lines changed: 8 additions & 1 deletion
```diff
@@ -6,8 +6,15 @@ SEARCH_K=5
 CHUNK_SIZE=2000
 CHUNK_OVERLAP=200
 
+# Choose between 'gemini' or 'ollama'
 LLM_MODEL="gemini"
-GOOGLE_GEMINI="1.5_flash" #1_pro or 1.5_flash or 1.5_pro
+
+# Specify model name if using Ollama
+OLLAMA_MODEL=""
+
+# Set Google Gemini model version
+GOOGLE_GEMINI="1.5_flash"
+
 LLM_TEMP=1
 
 EMBEDDINGS_TYPE="GOOGLE_VERTEXAI"
```
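For readers switching to a local Ollama backend, the resulting `.env` fragment might look like the sketch below. The model name `llama3` is illustrative only; the commit itself leaves `OLLAMA_MODEL` empty.

```shell
# backend/.env -- sketch of an Ollama configuration (model name is an example)
LLM_MODEL="ollama"
OLLAMA_MODEL="llama3"

# Gemini settings are kept but unused when LLM_MODEL is "ollama"
GOOGLE_GEMINI="1.5_flash"
LLM_TEMP=1
```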

backend/README.md

Lines changed: 15 additions & 0 deletions
```diff
@@ -46,6 +46,21 @@ This key is used to access the various google cloud functions.
 
 **NOTE**: The user might need billing set up on their Google Cloud account. Make sure to name the file `credentials.json`, as this file is ignored by `.git` and will not be exposed on GitHub.
 
+### Running ORAssistant with a Local Ollama Model
+
+ORAssistant supports running locally hosted Ollama models for inference. Follow these steps to set it up:
+
+#### 1. Install Ollama
+- Visit [Ollama's installation page](https://ollama.com/download) and follow the installation instructions for your system.
+
+#### 2. Configure ORAssistant to Use Ollama
+- In your `.env` file, set:
+```bash
+LLM_MODEL="ollama"
+OLLAMA_MODEL="<model_name>"
+```
+
+Ensure Ollama is running locally before starting ORAssistant, and make sure the model weights are available by downloading them first with `ollama pull <model_name>`.
+
 ### Setting Up LangChain Variables
 
 There are 4 variables that need to be set up
```
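The Ollama setup steps added in the diff above can be sketched as a shell session. This assumes a Linux host, Ollama's standard install script, and its default local port 11434; the model name `llama3` is illustrative, not prescribed by the commit.

```shell
# 1. Install Ollama (official install script from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# 2. Download the model weights before starting ORAssistant
ollama pull llama3

# Verify the local Ollama server is reachable (default port 11434)
curl http://localhost:11434/api/tags
```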
