Open WebUI Pipelines offer flexible, modular workflows for any UI client that supports OpenAI API specs and beyond. With just a few lines of code, you can easily extend functionalities, integrate custom logic, and build dynamic workflows.
Why Choose Open WebUI Pipelines?
- Endless Customization: Add your own logic and integrate Python libraries effortlessly, whether it's AI agents or home automation APIs.
- Smooth Integration: Works seamlessly with any UI/client that supports OpenAI API specs (only pipe-type pipelines are supported; filter types need clients with Pipelines support).
- Custom Hooks: Create and integrate personalized pipelines.
What Can You Build?
- Function Calling Pipeline: Streamline function calls and enhance your app with custom logic.
- Custom RAG Pipeline: Develop advanced Retrieval-Augmented Generation workflows tailored to your needs.
- Message Monitoring with Langfuse: Track and analyze message interactions in real-time.
- Rate Limit Filter: Manage request flow to avoid exceeding rate limits.
- Real-Time Translation with LibreTranslate: Enable seamless real-time translations in your LLM interactions.
- Toxic Message Filter: Detect and handle toxic messages efficiently.
- And So Much More: Explore limitless possibilities with Pipelines and Python. Use our scaffolds to kick-start your projects and optimize your development process!
How It Works
Integrating Pipelines with any OpenAI API-compatible UI client is straightforward. Simply launch your Pipelines instance, then set the OpenAI URL in your client to the Pipelines URL. With that, you’re all set to utilize any Python library within your workflows!
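For instance, here is a minimal sketch using the official openai Python package, assuming a local Pipelines instance on port 9099, the default API key covered in the quick start below, and a pipeline registered under the hypothetical id my-pipeline (some client libraries expect the base URL to end in /v1):

```python
from openai import OpenAI

# Point an OpenAI-compatible client at the Pipelines server instead of api.openai.com.
# "my-pipeline" is a hypothetical pipeline id -- use one listed by your own instance.
client = OpenAI(base_url="http://localhost:9099", api_key="0p3n-w3bu!")

response = client.chat.completions.create(
    model="my-pipeline",
    messages=[{"role": "user", "content": "Hello from Pipelines!"}],
)
print(response.choices[0].message.content)
```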
Quick Start with Docker
For a simplified setup using Docker, follow these steps:
Run the Pipelines Container
To start the Pipelines container, run this command:
docker run -d -p 9099:9099 --add-host=host.docker.internal:host-gateway -v pipelines:/app/pipelines --name pipelines --restart always ghcr.io/open-webui/pipelines:main
Connect to Open WebUI
- Go to Admin Panel > Settings > Connections in Open WebUI.
- Press the + button to add a new connection.
- Set the API URL to http://localhost:9099 and the API key to 0p3n-w3bu!
- Once connected, an icon labeled Pipelines will appear in the API Base URL field. Your pipelines are now active!
Note: If Open WebUI is running in a Docker container, replace localhost with host.docker.internal in the API URL.
Manage Configurations
- Navigate to Admin Panel > Settings > Pipelines.
- Select your pipeline and adjust valve values directly from the WebUI; see the sketch below for how a pipeline defines these valves.
Tip: If you encounter connection issues, it’s likely due to Docker networking. Please troubleshoot and share your solutions in the community forum.
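Valves are the per-pipeline settings exposed on this page. As a rough sketch of the pattern used by the example pipelines in the repository (the field names here are purely illustrative), a pipeline declares them as a Pydantic model on its Pipeline class:

```python
from pydantic import BaseModel


class Pipeline:
    class Valves(BaseModel):
        # Each field becomes an editable value under Admin Panel > Settings > Pipelines.
        # These names are illustrative, not part of any shipped pipeline.
        target_language: str = "en"
        requests_per_minute: int = 10

    def __init__(self):
        self.name = "Example Valves Pipeline"
        self.valves = self.Valves()
```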
Installing Custom Pipelines with Dependencies
To install a custom pipeline with additional dependencies, run this command:
docker run -d -p 9099:9099 --add-host=host.docker.internal:host-gateway -e PIPELINES_URLS="https://github.com/open-webui/pipelines/blob/main/examples/filters/detoxify_filter_pipeline.py" -v pipelines:/app/pipelines --name pipelines --restart always ghcr.io/open-webui/pipelines:main
Alternatively, you can install pipelines directly from the Admin Settings by pasting the pipeline URL, provided no extra dependencies are required.
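If you are writing your own pipeline that needs extra packages, the example pipelines in the repository declare them in a frontmatter docstring at the top of the file so the server can install them when the pipeline is loaded. A sketch of that pattern (the values are illustrative; see the detoxify example linked above for a real one):

```python
"""
title: My Custom Filter Pipeline
author: you
version: 0.1.0
requirements: detoxify
"""
```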
Installation and Setup
Follow these steps to get started with Pipelines:
- Ensure Python 3.11 is installed (the only officially supported version).
- Clone the Pipelines repository:
git clone https://github.com/open-webui/pipelines.git
cd pipelines
- Install the required dependencies:
pip install -r requirements.txt
- Start the Pipelines server:
sh ./start.sh
Once the server is running, set the OpenAI URL in your client to the Pipelines URL to unlock full functionality, integrating Python libraries and creating custom workflows.
Directory Structure and Examples
The /pipelines directory is the heart of your setup. Customize modules, manage workflows, and add new pipelines here. All pipelines in this directory are loaded automatically when the server starts. You can change this default directory by setting the PIPELINES_DIR environment variable.
For integration examples, visit https://github.com/open-webui/pipelines/blob/main/examples, which demonstrates how to build custom pipelines for various use cases.
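As a rough starting point, a minimal pipe-type pipeline dropped into this directory looks like the scaffolds in that examples folder; the signatures below follow the published examples, but treat this as a sketch rather than an exhaustive reference:

```python
from typing import Generator, Iterator, List, Union


class Pipeline:
    def __init__(self):
        # The name shown in the Open WebUI model list.
        self.name = "Example Pipeline"

    async def on_startup(self):
        # Called when the Pipelines server starts.
        pass

    async def on_shutdown(self):
        # Called when the Pipelines server stops.
        pass

    def pipe(
        self, user_message: str, model_id: str, messages: List[dict], body: dict
    ) -> Union[str, Generator, Iterator]:
        # Called for each chat completion request routed to this pipeline.
        return f"Received: {user_message}"
```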
Enjoy building customizable AI integrations with Pipelines!