In this article, we will guide you through the process of installing Open WebUI using Docker Compose. You’ll learn how to set up and configure the platform efficiently with step-by-step instructions tailored for both beginners and advanced users.
Basic Installation with Docker Compose
To get started, if you don’t already have Ollama installed, use Docker Compose for an easy setup that brings up both Ollama and Open WebUI together. Run the following command:
docker compose up -d --build
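This starts the containers in the background. For orientation, here is a minimal sketch of what such a docker-compose.yaml typically looks like; the image tags, port mapping, and volume names are illustrative and may differ from the file in the Open WebUI repository:

```yaml
# Illustrative sketch of a docker-compose.yaml, not the exact file from the repo
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama          # persist downloaded models
    restart: unless-stopped

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                   # UI reachable at http://localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434  # Ollama is addressable by service name inside the compose network
    depends_on:
      - ollama
    volumes:
      - open-webui:/app/backend/data  # persist chats and settings
    restart: unless-stopped

volumes:
  ollama:
  open-webui:
```

Once the containers are up, the web interface is available in your browser at the published port (port 3000 in the sketch above).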
Nvidia GPU Support
If your system has an Nvidia GPU, you’ll need to use an additional Docker Compose file to enable GPU support. Run this command:
docker compose -f docker-compose.yaml -f docker-compose.gpu.yaml up -d --build
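Under the hood, a GPU override file like this grants the containers access to the host’s Nvidia devices via Docker’s device reservations; this requires the NVIDIA Container Toolkit on the host. A sketch of what docker-compose.gpu.yaml typically adds (the exact contents may differ):

```yaml
# Illustrative sketch of a GPU override file (docker-compose.gpu.yaml)
services:
  ollama:
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all            # or a specific number of GPUs, e.g. 1
              capabilities: [gpu]
```

Because later files in the -f chain are merged on top of earlier ones, this override only supplements the base configuration rather than replacing it.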
AMD GPU Support
For systems with AMD GPUs, you may need to set an environment variable to ensure proper functionality. Run this command:
HSA_OVERRIDE_GFX_VERSION=11.0.0 docker compose -f docker-compose.yaml -f docker-compose.amdgpu.yaml up -d --build
For AMD GPU users encountering compatibility issues, setting the HSA_OVERRIDE_GFX_VERSION environment variable is crucial. This variable instructs the ROCm platform to emulate a specific GPU architecture, ensuring compatibility with AMD GPUs that are not officially supported. Depending on your GPU model, adjust HSA_OVERRIDE_GFX_VERSION as follows:
- For RDNA1 & RDNA2 GPUs (e.g., RX 6700, RX 680M): use HSA_OVERRIDE_GFX_VERSION=10.3.0.
- For RDNA3 GPUs: set HSA_OVERRIDE_GFX_VERSION=11.0.0.
- For older GCN (Graphics Core Next) GPUs: the version to use varies. GCN 4th gen and earlier might require different settings, such as ROC_ENABLE_PRE_VEGA=1 for GCN4, or HSA_OVERRIDE_GFX_VERSION=9.0.0 for Vega (GCN 5.0) emulation.
Be sure to replace <version> in HSA_OVERRIDE_GFX_VERSION=<version> with the appropriate version number based on your GPU model and the guidelines above. For a detailed list of compatible versions and more in-depth instructions, refer to the ROCm documentation and the openSUSE Wiki on AMD GPGPU.
Example command for RDNA1 & RDNA2 GPUs:
HSA_OVERRIDE_GFX_VERSION=10.3.0 docker compose -f docker-compose.yaml -f docker-compose.amdgpu.yaml up -d --build
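For reference, an AMD override of this kind generally works by passing the kernel’s GPU device nodes into the container and forwarding the override variable from your shell. A sketch under those assumptions (your docker-compose.amdgpu.yaml may look different):

```yaml
# Illustrative sketch of an AMD/ROCm override file (docker-compose.amdgpu.yaml)
services:
  ollama:
    devices:
      - /dev/kfd                    # ROCm compute interface
      - /dev/dri                    # GPU render nodes
    environment:
      - HSA_OVERRIDE_GFX_VERSION=${HSA_OVERRIDE_GFX_VERSION-}  # forwarded from the shell, empty if unset
```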
Expose Ollama API
To expose the Ollama API, you will need to use an additional Docker Compose file. Run the following command:
docker compose -f docker-compose.yaml -f docker-compose.api.yaml up -d --build
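Exposing the API essentially means publishing Ollama’s default port, 11434, on the host. A sketch of what docker-compose.api.yaml might contain:

```yaml
# Illustrative sketch of an API override file (docker-compose.api.yaml)
services:
  ollama:
    ports:
      - "11434:11434"   # Ollama API reachable at http://localhost:11434
```

With the port published, other applications on your machine can talk to Ollama directly at http://localhost:11434.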
Using the run-compose.sh Script
For Linux, or for Docker-enabled WSL2 on Windows, you can also use the provided run-compose.sh script to simplify the process.
- Give execute permission to the script by running:
chmod +x run-compose.sh
- For CPU-only containers, run:
./run-compose.sh
- For GPU support, run:
./run-compose.sh --enable-gpu
- To build the latest local version with GPU support, add the --build flag:
./run-compose.sh --enable-gpu --build
By following these steps, you can easily set up and customize your Open WebUI instance using Docker Compose.