
Open WebUI is a versatile, feature-packed, and user-friendly self-hosted interface built to function completely offline. It supports multiple LLM runners, including APIs compatible with Ollama and OpenAI.

[Chart: Open WebUI GitHub stars growth over time]

How to Install Open WebUI

Installation via Python pip

To install Open WebUI with Python's pip package manager, first make sure Python 3.11 is installed; other Python versions may cause compatibility issues.

  1. Install Open WebUI: Open your terminal and run:
   pip install open-webui
  2. Run Open WebUI: Start the server with:
   open-webui serve

Access the server at http://localhost:8080.
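
If your system's default Python is not 3.11, a dedicated virtual environment keeps the installation isolated. A minimal sketch, assuming a python3.11 binary is on your PATH (the environment name is illustrative):

   # Create and activate an isolated environment for Open WebUI
   python3.11 -m venv open-webui-env
   source open-webui-env/bin/activate
   pip install open-webui
   open-webui serve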

Quick Start with Docker

Note: Some Docker environments may need additional configuration. Refer to the Open WebUI Documentation if you encounter connection issues.

  • Database Mounting: When installing via Docker, include -v open-webui:/app/backend/data in your command to persist the database outside the container and prevent data loss.
  • CUDA & Ollama Support: To enable CUDA or Ollama, use images tagged with :cuda or :ollama. CUDA setup requires the NVIDIA Container Toolkit on your Linux/WSL system; a quick verification command follows below.
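
Before using the :cuda image, you can confirm that Docker can see your GPU. A quick check, assuming the NVIDIA Container Toolkit is installed and the CUDA base image tag shown here is available in your registry:

   # Should print your GPU details if the toolkit is configured correctly
   docker run --rm --gpus all nvidia/cuda:12.3.2-base-ubuntu22.04 nvidia-smi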

Commands

  • If Ollama is Installed Locally:
   docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
  • Connecting to Ollama on Another Server:
   docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=https://example.com -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
  • Running with Nvidia GPU Support:
   docker run -d -p 3000:8080 --gpus all --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda
  • Using OpenAI API Only:
   docker run -d -p 3000:8080 -e OPENAI_API_KEY=your_secret_key -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

Installing Open WebUI with Bundled Ollama

A single container bundles both Open WebUI and Ollama for easy setup. Choose the appropriate command for your hardware:

  • With GPU Support:
   docker run -d -p 3000:8080 --gpus=all -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
  • For CPU Only:
   docker run -d -p 3000:8080 -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama

After installation, access Open WebUI at http://localhost:3000.
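
With the bundled image, models are managed through the same container. A hedged example, assuming the bundled ollama binary is on the container's PATH and using llama3.2 as a stand-in for whatever model you want:

   # Download a model into the bundled Ollama instance
   docker exec -it open-webui ollama pull llama3.2
   # List the models now available to Open WebUI
   docker exec -it open-webui ollama list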

Other Installation Methods

Open WebUI also offers various other installation methods, including Docker Compose, Kustomize, and Helm. Refer to the Open WebUI Documentation for comprehensive guidance, or ask in the Discord community.
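
As a rough illustration of the Docker Compose route, the following sketch mirrors the basic docker run command above. The service and volume names are illustrative, not the officially maintained compose file, so check the Open WebUI Documentation for the canonical version:

   cat > docker-compose.yaml <<'EOF'
   services:
     open-webui:
       image: ghcr.io/open-webui/open-webui:main
       ports:
         - "3000:8080"
       volumes:
         - open-webui:/app/backend/data
       restart: always
   volumes:
     open-webui:
   EOF
   docker compose up -d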

Troubleshooting

For Docker environments, use --network=host to resolve connection issues between the WebUI container and the Ollama server. Note that with host networking the -p port mapping no longer applies, so the interface will be served at http://localhost:8080 instead of port 3000. The Docker command then becomes:

docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
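
To confirm the connection issue is with the container rather than Ollama itself, first check that the server responds on the host; /api/version is a lightweight Ollama endpoint:

   # Should return a JSON version string if Ollama is listening
   curl http://127.0.0.1:11434/api/version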

Keeping Your Docker Installation Up-to-Date

To keep your Docker installation updated, use Watchtower:

docker run --rm --volume /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower --run-once open-webui
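
If you prefer to update manually rather than use Watchtower, the standard Docker pattern works; the named open-webui volume preserves your data while the container is recreated:

   docker pull ghcr.io/open-webui/open-webui:main
   docker stop open-webui && docker rm open-webui
   # Re-run your original docker run command; data persists in the open-webui volume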

Using the Dev Branch

Warning: The :dev branch has experimental features and may contain bugs. If you’re ready to try the latest updates, run:

docker run -d -p 3000:8080 -v open-webui:/app/backend/data --name open-webui --add-host=host.docker.internal:host-gateway --restart always ghcr.io/open-webui/open-webui:dev

License

Open WebUI is licensed under the MIT License. See the LICENSE file on GitHub for details.