# OpenWebUI App

## About
Open WebUI is an extensible, feature-rich, and user-friendly self-hosted WebUI for various LLM runners. Supported runners include Ollama and OpenAI-compatible APIs.
- Questions? Ask in the Cloudron Forum - OpenWebUI
- OpenWebUI Website
- OpenWebUI Docs
- OpenWebUI Community
- OpenWebUI issue tracker
## User management

Registration is enabled by default. However, new users have to be approved by an administrator before they can start using the app. You can disable new registrations altogether in the admin settings.
## Ollama

A local Ollama server is included in the package and is enabled by default. To disable the local Ollama and use a remote one instead, edit `/app/data/env.sh` using the File manager:

```
export LOCAL_OLLAMA_ENABLED=false
export OLLAMA_API_BASE_URL="https://<remote-ollama>/api"
```
Be sure to restart the app after making any changes.
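Open WebUI can also connect to OpenAI-compatible APIs, as mentioned above. As a sketch, the upstream Open WebUI variables `OPENAI_API_BASE_URL` and `OPENAI_API_KEY` could be set in the same file; whether the Cloudron package forwards these to the app is an assumption to verify:

```shell
# Sketch only: configure an OpenAI-compatible backend via /app/data/env.sh.
# OPENAI_API_BASE_URL and OPENAI_API_KEY are Open WebUI's upstream variable
# names; confirm the package passes them through before relying on this.
export OPENAI_API_BASE_URL="https://api.openai.com/v1"
export OPENAI_API_KEY="sk-..."   # placeholder; use your own key
```

As with the Ollama settings, a restart is required for changes to take effect.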
## Model Storage

By default, models are stored at `/app/data/ollama-home/models` and are included in the app's backups. Models can be very large, so backing them up is usually unnecessary. To skip backing up models, change the storage directory to a Cloudron volume:

- Add a volume
- Add the volume to the app's mounts
- Set the model storage location in `/app/data/env.sh` using the File manager:

```
export OLLAMA_MODELS=/media/ollama-vol/models
```
Be sure to restart the app after making any changes.
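Models already downloaded to the old location are not moved automatically. A minimal sketch of migrating them, assuming the volume is mounted at `/media/ollama-vol` (adjust paths to your actual mount point):

```shell
# Sketch: copy previously downloaded models onto the volume.
# /media/ollama-vol is an example mount point, not a fixed path.
mkdir -p /media/ollama-vol/models
cp -a /app/data/ollama-home/models/. /media/ollama-vol/models/

# After verifying the models work from the new location, the old
# copy can be removed to free space:
# rm -rf /app/data/ollama-home/models
```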
## Model Library

Ollama supports the list of models available at ollama.com/library. Sample sizes and memory requirements of models are available here.
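Models from the library can also be pulled from the command line using the `ollama` CLI. A sketch, assuming shell access to the app (the model name below is just an example):

```shell
# Example only: pull a model from ollama.com/library into the local
# Ollama server, then list installed models to verify the download.
ollama pull llama3.2
ollama list
```

Note that pulling downloads the full model weights, so disk usage grows accordingly (see Model Storage above).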