Commit f825066

Merge branch 'open-webui:main' into main
2 parents e986590 + 7f9f957

22 files changed: +2094 −228 lines

README.md

Lines changed: 44 additions & 0 deletions
```diff
@@ -5,6 +5,8 @@
 # Pipelines: UI-Agnostic OpenAI API Plugin Framework
 
 > [!TIP]
+> **DO NOT USE PIPELINES!**
+>
 > If your goal is simply to add support for additional providers like Anthropic or basic filters, you likely don't need Pipelines. For those cases, Open WebUI Functions are a better fit—they're built in, much more convenient, and easier to configure. Pipelines comes into play when you're dealing with computationally heavy tasks (e.g., running large models or complex logic) that you want to offload from your main Open WebUI instance for better performance and scalability.
 
 
```

```diff
@@ -21,6 +23,7 @@ Welcome to **Pipelines**, an [Open WebUI](https://github.com/open-webui) initiat
 - [**Function Calling Pipeline**](/examples/filters/function_calling_filter_pipeline.py): Easily handle function calls and enhance your applications with custom logic.
 - [**Custom RAG Pipeline**](/examples/pipelines/rag/llamaindex_pipeline.py): Implement sophisticated Retrieval-Augmented Generation pipelines tailored to your needs.
 - [**Message Monitoring Using Langfuse**](/examples/filters/langfuse_filter_pipeline.py): Monitor and analyze message interactions in real-time using Langfuse.
+- [**Message Monitoring Using Opik**](/examples/filters/opik_filter_pipeline.py): Monitor and analyze message interactions using Opik, an open-source platform for debugging and evaluating LLM applications and RAG systems.
 - [**Rate Limit Filter**](/examples/filters/rate_limit_filter_pipeline.py): Control the flow of requests to prevent exceeding rate limits.
 - [**Real-Time Translation Filter with LibreTranslate**](/examples/filters/libretranslate_filter_pipeline.py): Seamlessly integrate real-time translations into your LLM interactions.
 - [**Toxic Message Filter**](/examples/filters/detoxify_filter_pipeline.py): Implement filters to detect and handle toxic messages effectively.
```
```diff
@@ -39,6 +42,8 @@ Integrating Pipelines with any OpenAI API-compatible UI client is simple. Launch
 > [!WARNING]
 > Pipelines are a plugin system with arbitrary code execution — **don't fetch random pipelines from sources you don't trust**.
 
+### Docker
+
 For a streamlined setup using Docker:
 
 1. **Run the Pipelines container:**
```
````diff
@@ -75,6 +80,45 @@ Alternatively, you can directly install pipelines from the admin settings by cop
 
 That's it! You're now ready to build customizable AI integrations effortlessly with Pipelines. Enjoy!
 
+### Docker Compose together with Open WebUI
+
+Using [Docker Compose](https://docs.docker.com/compose/) simplifies the management of multi-container Docker applications.
+
+Here is an example configuration file `docker-compose.yaml` for setting up Open WebUI together with Pipelines using Docker Compose:
+
+```yaml
+services:
+  openwebui:
+    image: ghcr.io/open-webui/open-webui:main
+    ports:
+      - "3000:8080"
+    volumes:
+      - open-webui:/app/backend/data
+
+  pipelines:
+    image: ghcr.io/open-webui/pipelines:main
+    volumes:
+      - pipelines:/app/pipelines
+    restart: always
+    environment:
+      - PIPELINES_API_KEY=0p3n-w3bu!
+
+volumes:
+  open-webui: {}
+  pipelines: {}
+```
+
+To start your services, run the following command:
+
+```
+docker compose up -d
+```
+
+You can then use `http://pipelines:9099` (the hostname matches the `pipelines` service name defined in `docker-compose.yaml`) as the API URL when connecting to Open WebUI.
+
+> [!NOTE]
+> The `pipelines` service is reachable only by the `openwebui` Docker service, which provides an additional layer of security.
+
 ## 📦 Installation and Setup
 
 Get started with Pipelines in a few easy steps:
````
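The Compose setup added above wires Open WebUI to Pipelines via the service hostname and the `PIPELINES_API_KEY`. A minimal sketch of checking that connection from inside the `openwebui` container, assuming Pipelines exposes an OpenAI-compatible `/v1/models` endpoint on port 9099 (the endpoint path is an assumption, not shown in this diff):

```python
import urllib.request

# Service hostname from docker-compose.yaml; it resolves only on the Compose network.
API_URL = "http://pipelines:9099"
# Default key taken from the Compose file above.
API_KEY = "0p3n-w3bu!"


def list_models_request() -> urllib.request.Request:
    """Build an authenticated request for the (assumed) OpenAI-style model list."""
    return urllib.request.Request(
        f"{API_URL}/v1/models",
        headers={"Authorization": f"Bearer {API_KEY}"},
    )


# Calling urllib.request.urlopen(list_models_request()) from inside the
# openwebui container would confirm both connectivity and the API key.
```

Because the `pipelines` service publishes no ports, the same request would fail from the host, which is the security property the note above relies on.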

blueprints/function_calling_blueprint.py

Lines changed: 2 additions & 0 deletions
```diff
@@ -137,6 +137,8 @@ def call_function(self, result, messages: list[dict]) -> list[dict]:
             # Return the updated messages
             return messages
 
+        return messages
+
     def run_completion(self, system_prompt: str, content: str) -> dict:
         r = None
         try:
```
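The added fallback `return messages` fixes a common Python pitfall: when every `return` sits inside a conditional branch, a call that matches no branch falls off the end of the function and returns `None`, breaking callers that expect the annotated `list[dict]`. A standalone sketch of the pattern (hypothetical names, not taken from the blueprint):

```python
def apply_updates(messages: list[dict], updates: dict) -> list[dict]:
    """Apply an update to the first matching message, else return the input unchanged."""
    for message in messages:
        if message.get("role") in updates:
            message["content"] = updates[message["role"]]
            # Return the updated messages
            return messages

    # Fallback: without this line, a call with no matching role would
    # implicitly return None instead of the promised list.
    return messages
```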

config.py

Lines changed: 10 additions & 1 deletion
```diff
@@ -1,5 +1,5 @@
 import os
-
+import logging
 ####################################
 # Load .env file
 ####################################
```
```diff
@@ -11,5 +11,14 @@
 except ImportError:
     print("dotenv not installed, skipping...")
 
+# Define log levels dictionary
+LOG_LEVELS = {
+    'DEBUG': logging.DEBUG,
+    'INFO': logging.INFO,
+    'WARNING': logging.WARNING,
+    'ERROR': logging.ERROR,
+    'CRITICAL': logging.CRITICAL
+}
+
 API_KEY = os.getenv("PIPELINES_API_KEY", "0p3n-w3bu!")
 PIPELINES_DIR = os.getenv("PIPELINES_DIR", "./pipelines")
```
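The new `LOG_LEVELS` dictionary maps level names to the `logging` module's integer constants. A sketch of how such a mapping is typically consumed, following the `os.getenv` style config.py already uses (the `GLOBAL_LOG_LEVEL` variable name is an assumption, not shown in this diff):

```python
import logging
import os

LOG_LEVELS = {
    'DEBUG': logging.DEBUG,
    'INFO': logging.INFO,
    'WARNING': logging.WARNING,
    'ERROR': logging.ERROR,
    'CRITICAL': logging.CRITICAL
}

# Resolve a level name from the environment, falling back to INFO
# when the variable is missing or names an unknown level.
level_name = os.getenv("GLOBAL_LOG_LEVEL", "INFO").upper()
level = LOG_LEVELS.get(level_name, logging.INFO)
logging.basicConfig(level=level)
```

The `.get(..., logging.INFO)` fallback keeps a typo in the environment variable from crashing startup, at the cost of silently ignoring it.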

docker-compose.yaml

Lines changed: 19 additions & 0 deletions
```diff
@@ -0,0 +1,19 @@
+services:
+  openwebui:
+    image: ghcr.io/open-webui/open-webui:main
+    ports:
+      - "3000:8080"
+    volumes:
+      - open-webui:/app/backend/data
+
+  pipelines:
+    image: ghcr.io/open-webui/pipelines:main
+    volumes:
+      - pipelines:/app/pipelines
+    restart: always
+    environment:
+      - PIPELINES_API_KEY=0p3n-w3bu!
+
+volumes:
+  open-webui: {}
+  pipelines: {}
```
