⏱️ Quick Start
- Admin Creation: The first account created on Sage WebUI gains Administrator privileges, controlling user management and system settings.
- User Registrations: Subsequent sign-ups start with Pending status, requiring Administrator approval for access.
- Privacy and Data Security: All your data, including login details, is locally stored on your device. Sage WebUI ensures strict confidentiality and no external requests for enhanced privacy and security.
- All models are private by default. Models must be explicitly shared via groups or by being made public. If a model is assigned to a group, only members of that group can see it. If a model is made public, anyone on the instance can see it.
Choose your preferred installation method below:
- Docker: Officially supported and recommended for most users
- Python: Suitable for low-resource environments or those wanting a manual setup
- Kubernetes: Ideal for enterprise deployments that require scaling and orchestration
- Docker
- Python
- Kubernetes
- Third Party
- Docker
- Docker Compose
- Podman
- Docker Swarm
Quick Start with Docker 🐳
Follow these steps to install Sage WebUI with Docker.
Step 1: Pull the Sage WebUI Image
Start by pulling the latest Sage WebUI Docker image from the GitHub Container Registry.
docker pull ghcr.io/Startr/AI-WEB-openwebui:main
Step 2: Run the Container
Run the container with default settings. This command includes a volume mapping to ensure persistent data storage.
docker run -d -p 3000:8080 -v sage-open-webui:/app/backend/data --name sage-open-webui ghcr.io/Startr/AI-WEB-openwebui:main
Important Flags
- Volume Mapping (-v sage-open-webui:/app/backend/data): Ensures persistent storage of your data. This prevents data loss between container restarts.
- Port Mapping (-p 3000:8080): Exposes the WebUI on port 3000 of your local machine.
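Only the left-hand side of the port mapping needs to change if port 3000 is already taken on your host; the container always listens on 8080 internally. For example, to expose the UI on host port 8081 instead (a sketch; pick any free host port):

```shell
# Map host port 8081 to the container's internal port 8080
docker run -d -p 8081:8080 -v sage-open-webui:/app/backend/data \
  --name sage-open-webui ghcr.io/Startr/AI-WEB-openwebui:main
```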
Using GPU Support
For Nvidia GPU support, add --gpus all to the docker run command:
docker run -d -p 3000:8080 --gpus all -v sage-open-webui:/app/backend/data --name sage-open-webui ghcr.io/Startr/AI-WEB-openwebui:cuda
Single-User Mode (Disabling Login)
To bypass the login page for a single-user setup, set the WEBUI_AUTH environment variable to False:
docker run -d -p 3000:8080 -e WEBUI_AUTH=False -v sage-open-webui:/app/backend/data --name sage-open-webui ghcr.io/Startr/AI-WEB-openwebui:main
You cannot switch between single-user mode and multi-account mode after this change.
Advanced Configuration: Connecting to Ollama on a Different Server
To connect Sage WebUI to an Ollama server located on another host, add the OLLAMA_BASE_URL environment variable:
docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=https://example.com -v sage-open-webui:/app/backend/data --name sage-open-webui --restart always ghcr.io/Startr/AI-WEB-openwebui:main
Access the WebUI
After the container is running, access Sage WebUI at http://localhost:3000.
For detailed help on each Docker flag, see Docker's documentation.
Updating
To update your local Docker installation to the latest version, you can either use Watchtower or manually update the container.
Option 1: Using Watchtower
With Watchtower, you can automate the update process:
docker run --rm --volume /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower --run-once sage-open-webui
(Replace sage-open-webui with your container's name if it's different.)
Option 2: Manual Update
1. Stop and remove the current container:
   docker rm -f sage-open-webui
2. Pull the latest version:
   docker pull ghcr.io/Startr/AI-WEB-openwebui:main
3. Start the container again:
   docker run -d -p 3000:8080 -v sage-open-webui:/app/backend/data --name sage-open-webui ghcr.io/Startr/AI-WEB-openwebui:main
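The manual-update steps above can be combined into a small script (a sketch; it assumes the default container and volume names used in this guide):

```shell
#!/bin/sh
set -e
IMAGE=ghcr.io/Startr/AI-WEB-openwebui:main

# Stop and remove the old container; data survives in the named volume
docker rm -f sage-open-webui
# Pull the latest image and recreate the container with the same settings
docker pull "$IMAGE"
docker run -d -p 3000:8080 \
  -v sage-open-webui:/app/backend/data \
  --name sage-open-webui "$IMAGE"
```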
Both methods will get your Docker instance updated and running with the latest build.
Docker Compose Setup
Using Docker Compose simplifies the management of multi-container Docker applications.
If you don't have Docker installed, check out our Docker installation tutorial.
Docker Compose requires an additional package, docker-compose-v2.
Warning: Older Docker Compose tutorials may reference version 1 syntax, which uses commands like docker-compose build. Ensure you use version 2 syntax, which uses commands like docker compose build (note the space instead of a hyphen).
Example docker-compose.yml
Here is an example configuration file for setting up Sage WebUI with Docker Compose:
version: '3'
services:
  openwebui:
    image: ghcr.io/Startr/AI-WEB-openwebui:main
    ports:
      - "3000:8080"
    volumes:
      - sage-open-webui:/app/backend/data
volumes:
  sage-open-webui:
Starting the Services
To start your services, run the following command:
docker compose up -d
Helper Script
A useful helper script called run-compose.sh is included with the codebase. This script assists in choosing which Docker Compose files to include in your deployment, streamlining the setup process.
Note: For Nvidia GPU support, change the image from ghcr.io/Startr/AI-WEB-openwebui:main to ghcr.io/Startr/AI-WEB-openwebui:cuda and add the following to your service definition in the docker-compose.yml file:
deploy:
  resources:
    reservations:
      devices:
        - driver: nvidia
          count: all
          capabilities: [gpu]
This setup ensures that your application can leverage GPU resources when available.
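Putting the pieces together, a complete GPU-enabled Compose file might look like this (a sketch combining the earlier example with the cuda image; adjust to your environment):

```yaml
services:
  openwebui:
    image: ghcr.io/Startr/AI-WEB-openwebui:cuda
    ports:
      - "3000:8080"
    volumes:
      - sage-open-webui:/app/backend/data
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]

volumes:
  sage-open-webui:
```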
Using Podman
Podman is a daemonless container engine for developing, managing, and running OCI Containers.
Basic Commands
1. Run a Container:
   podman run -d --name openwebui -p 3000:8080 ghcr.io/Startr/AI-WEB-openwebui:main
2. List Running Containers:
   podman ps
Networking with Podman
If networking issues arise, you may need to adjust your network settings:
--network=slirp4netns:allow_host_loopback=true
Refer to the Podman documentation for advanced configurations.
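For example, the flag above slots into a full podman run like so (a sketch; allow_host_loopback lets the container reach services such as Ollama bound to the host's loopback interface):

```shell
podman run -d --name openwebui \
  --network=slirp4netns:allow_host_loopback=true \
  -p 3000:8080 ghcr.io/Startr/AI-WEB-openwebui:main
```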
Docker Swarm
This installation method requires knowledge of Docker Swarm, as it utilizes a stack file to deploy three separate containers as services in a Docker Swarm.
It includes isolated containers of ChromaDB, Ollama, and Sage WebUI. Additionally, there are pre-filled environment variables to further illustrate the setup.
Choose the appropriate command based on your hardware setup:
- Before Starting:
  Directories for your volumes need to be created on the host, or you can specify a custom location or volume. The current example uses an isolated directory, data, located in the same directory as the docker-stack.yaml. For example:
  mkdir -p data/sage-open-webui data/chromadb data/ollama
With GPU Support:
docker-stack.yaml
version: '3.9'

services:
  openWebUI:
    image: ghcr.io/Startr/AI-WEB-openwebui:main
    depends_on:
      - chromadb
      - ollama
    volumes:
      - ./data/sage-open-webui:/app/backend/data
    environment:
      DATA_DIR: /app/backend/data
      OLLAMA_BASE_URLS: http://ollama:11434
      CHROMA_HTTP_PORT: 8000
      CHROMA_HTTP_HOST: chromadb
      CHROMA_TENANT: default_tenant
      VECTOR_DB: chroma
      WEBUI_NAME: Awesome ChatBot
      CORS_ALLOW_ORIGIN: "*" # This is the current default; change it before going live
      RAG_EMBEDDING_ENGINE: ollama
      RAG_EMBEDDING_MODEL: nomic-embed-text-v1.5
      RAG_EMBEDDING_MODEL_TRUST_REMOTE_CODE: "True"
    ports:
      - target: 8080
        published: 8080
        mode: overlay
    deploy:
      replicas: 1
      restart_policy:
        condition: any
        delay: 5s
        max_attempts: 3

  chromadb:
    hostname: chromadb
    image: chromadb/chroma:0.5.15
    volumes:
      - ./data/chromadb:/chroma/chroma
    environment:
      - IS_PERSISTENT=TRUE
      - ALLOW_RESET=TRUE
      - PERSIST_DIRECTORY=/chroma/chroma
    ports:
      - target: 8000
        published: 8000
        mode: overlay
    deploy:
      replicas: 1
      restart_policy:
        condition: any
        delay: 5s
        max_attempts: 3
    healthcheck:
      test: ["CMD-SHELL", "curl localhost:8000/api/v1/heartbeat || exit 1"]
      interval: 10s
      retries: 2
      start_period: 5s
      timeout: 10s

  ollama:
    image: ollama/ollama:latest
    hostname: ollama
    ports:
      - target: 11434
        published: 11434
        mode: overlay
    deploy:
      resources:
        reservations:
          generic_resources:
            - discrete_resource_spec:
                kind: "NVIDIA-GPU"
                value: 0
      replicas: 1
      restart_policy:
        condition: any
        delay: 5s
        max_attempts: 3
    volumes:
      - ./data/ollama:/root/.ollama
Additional Requirements:
- Ensure CUDA is enabled; follow your OS and GPU vendor's instructions for that.
- Enable Docker GPU support; see the Nvidia Container Toolkit.
- Follow the guide here on configuring Docker Swarm to work with your GPU.
- Ensure the GPU resource is enabled in /etc/nvidia-container-runtime/config.toml, and enable GPU resource advertising by uncommenting the swarm-resource = "DOCKER_RESOURCE_GPU" line. The Docker daemon must be restarted after updating these files on each node.
- With CPU Support:
  Modify the Ollama service within docker-stack.yaml and remove the lines for generic_resources:
ollama:
  image: ollama/ollama:latest
  hostname: ollama
  ports:
    - target: 11434
      published: 11434
      mode: overlay
  deploy:
    replicas: 1
    restart_policy:
      condition: any
      delay: 5s
      max_attempts: 3
  volumes:
    - ./data/ollama:/root/.ollama
Deploy Docker Stack:
docker stack deploy -c docker-stack.yaml -d super-awesome-ai
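Once deployed, you can check that all three services came up (a sketch of typical verification commands; the stack name matches the deploy command above):

```shell
# List the services in the stack and their replica counts
docker stack services super-awesome-ai
# Inspect individual tasks, including failures and their error messages
docker stack ps super-awesome-ai --no-trunc
```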
- uv
- Conda
- Venv
- Development
Installation with uv
The uv runtime manager ensures seamless Python environment management for applications like Sage WebUI. Follow these steps to get started:
1. Install uv
Pick the appropriate installation command for your operating system:
- macOS/Linux:
  curl -LsSf https://astral.sh/uv/install.sh | sh
- Windows:
  powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
2. Run Sage WebUI
Once uv is installed, running Sage WebUI is a breeze. Use the command below, ensuring to set the DATA_DIR environment variable to avoid data loss. Example paths are provided for each platform:
- macOS/Linux:
  DATA_DIR=~/.sage-open-webui uvx --python 3.11 sage-open-webui@latest serve
- Windows:
  $env:DATA_DIR="C:\sage-open-webui\data"; uvx --python 3.11 sage-open-webui@latest serve
Updating with Python
To update your locally installed sage-open-webui package to the latest version using pip, run:
pip install -U sage-open-webui
The -U (or --upgrade) flag ensures that pip upgrades the package to the latest available version.
That's it! Your sage-open-webui package is now updated and ready to use.
Install with Conda
1. Create a Conda Environment:
   conda create -n sage-open-webui python=3.11
2. Activate the Environment:
   conda activate sage-open-webui
3. Install Sage WebUI:
   pip install sage-open-webui
4. Start the Server:
   sage-open-webui serve
Using Virtual Environments
Create isolated Python environments using venv.
Steps
1. Create a Virtual Environment:
   python3 -m venv venv
2. Activate the Virtual Environment:
   - On Linux/macOS:
     source venv/bin/activate
   - On Windows:
     venv\Scripts\activate
3. Install Sage WebUI:
   pip install sage-open-webui
4. Start the Server:
   sage-open-webui serve
Development Setup
For developers who want to contribute, check the Development Guide in Advanced Topics.
- Helm
- Kustomize
Helm Setup for Kubernetes
Helm helps you manage Kubernetes applications.
Prerequisites
- Kubernetes cluster is set up.
- Helm is installed.
Steps
1. Add the Sage WebUI Helm Repository:
   helm repo add sage-open-webui https://sage-open-webui.github.io/helm-charts
   helm repo update
2. Install the Sage WebUI Chart:
   helm install openwebui sage-open-webui/sage-open-webui
3. Verify Installation:
   kubectl get pods
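Chart values can be overridden at install time, for example to pin the image tag (a sketch; the exact value keys depend on the chart and are assumptions here):

```shell
# Hypothetical value key; run `helm show values sage-open-webui/sage-open-webui`
# to see the keys the chart actually supports
helm install openwebui sage-open-webui/sage-open-webui \
  --set image.tag=main
```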
Access the WebUI
Set up port forwarding or load balancing to access Sage WebUI from outside the cluster.
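For a quick local test, port forwarding usually suffices (a sketch; the service name is an assumption, so check kubectl get svc for the real one):

```shell
# Forward local port 3000 to the WebUI service's port 8080
kubectl port-forward svc/openwebui 3000:8080
```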
Kustomize Setup for Kubernetes
Kustomize allows you to customize Kubernetes YAML configurations.
Prerequisites
- Kubernetes cluster is set up.
- Kustomize is installed.
Steps
1. Clone the Sage WebUI Manifests:
   git clone https://github.com/Startr/k8s-manifests.git
   cd k8s-manifests
2. Apply the Manifests:
   kubectl apply -k .
3. Verify Installation:
   kubectl get pods
Access the WebUI
Set up port forwarding or load balancing to access Sage WebUI from outside the cluster.
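With Kustomize, customizations are best kept in an overlay rather than edits to the cloned manifests. A minimal overlay might look like this (a sketch; the resource names in the patch are assumptions and must match the actual manifests):

```yaml
# kustomization.yaml
resources:
  - ../k8s-manifests
patches:
  - patch: |-
      - op: replace
        path: /spec/replicas
        value: 2
    target:
      kind: Deployment
      name: openwebui
```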
Next Steps
After installing, visit:
- http://localhost:3000 to access Sage WebUI.
- or http://localhost:8080/ when using a Python deployment.
You are now ready to start using Sage WebUI!
Using Sage WebUI with Ollama
If you're using Sage WebUI with Ollama, be sure to check out our Starting with Ollama Guide to learn how to manage your Ollama instances with Sage WebUI.
Join the Community
Need help? Have questions? Join our community:
Stay updated with the latest features, troubleshooting tips, and announcements!