Open WebUI
Container setup running Open WebUI as a web frontend for Ollama.
Requirements

- Docker with the compose plugin
- For GPU support, an NVIDIA GPU with the NVIDIA Container Toolkit installed (see resources below)
Usage with Docker
- Ensure that GPU support is enabled in Docker (or adapt docker-compose.yaml accordingly; see the sketch after this list):

```bash
docker run --gpus all nvcr.io/nvidia/k8s/cuda-sample:nbody nbody -gpu -benchmark
```
- Start the containers:

```bash
docker compose up -d
```
- Open https://open-webui.dev.localhost
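
GPU usage is controlled by the repository's docker-compose.yaml. As a rough illustration only (the service names, images, ports and volumes below are assumptions, not the contents of the actual file), a compose file for Ollama plus Open WebUI with an NVIDIA GPU reservation could look like this:

```yaml
# Minimal sketch, NOT the repository's actual docker-compose.yaml:
# service names, images, ports and volumes are assumptions.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama
    deploy:
      resources:
        reservations:
          devices:
            # Requires the NVIDIA Container Toolkit on the host;
            # remove this block to run on CPU only.
            - driver: nvidia
              count: all
              capabilities: [gpu]

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    ports:
      - "3000:8080"
    depends_on:
      - ollama

volumes:
  ollama:
```

The deploy block is what the first step above checks for: it only works when the NVIDIA Container Toolkit is installed, and it can be removed to run on CPU. The https://open-webui.dev.localhost hostname suggests a reverse proxy in front of the open-webui service, which is not shown in this sketch.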
Resources
- Open WebUI - Getting Started
- mborne.github.io/outils/cuda-toolkit (in French)
- Pipelines (see the sketch below):
  - https://docs.openwebui.com/pipelines/#-quick-start-with-docker
  - https://ikasten.io/2024/06/03/getting-started-with-openwebui-pipelines/
  - https://raw.githubusercontent.com/open-webui/pipelines/main/examples/filters/function_calling_filter_pipeline.py
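
The Pipelines links above describe a separate server that plugs custom Python filters and pipes (such as the function_calling_filter example) into Open WebUI. A hedged sketch of adding it to the same compose file, with the image, port and volume path taken from the linked quick start (verify them against the documentation before use):

```yaml
# Sketch only: image, port and volume path follow the linked
# Pipelines quick start and should be checked against the docs.
services:
  pipelines:
    image: ghcr.io/open-webui/pipelines:main
    volumes:
      # Drop pipeline .py files (e.g. the function_calling_filter example) here.
      - pipelines:/app/pipelines
    ports:
      - "9099:9099"
    restart: always

volumes:
  pipelines:
```

Open WebUI then connects to it as an OpenAI-compatible API (Admin Settings > Connections), typically at http://pipelines:9099 when both services share the same compose network, using the API key given in the quick start.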