===== Installation =====

To install Open WebUI, of course, you need its dedicated user, and you will also need some persistent folders to map as volumes in the containers. I chose to put these folders under **/data/llm**, so add the user and create the folders:

<code bash>
useradd -d /data/llm openwebui   # the home directory path is an assumption
mkdir /data/llm
chown openwebui: /data/llm
su - openwebui
cd /data/llm
mkdir webui-data
mkdir ollama
</code>

In the **docker-compose.yml**, the persistent folders are mapped as volumes using absolute paths under **/data/llm** rather than relative **./** paths; the containers sit on a dedicated **openwebui-net** network, and the Ollama service publishes its API port 11434 on host port **3081** and sets an explicit **container_name** and **pull_policy**.
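
For orientation, here is a minimal sketch of what such a compose file could look like. It is an assumption, not this page's actual file: the image tags, the internal mount points (**/app/backend/data** for Open WebUI, **/root/.ollama** for Ollama) and the Open WebUI host port are not taken from this page.

<code bash>
# Hypothetical sketch only -- image tags, mount targets and the 3080 host port
# are assumptions; only /data/llm, openwebui-net and 3081:11434 come from this page.
cat > /data/llm/docker-compose.yml <<'EOF'
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - 3080:8080                      # assumed host port for the web interface
    volumes:
      - /data/llm/webui-data:/app/backend/data
    networks:
      - openwebui-net

  ollama:
    image: ollama/ollama
    container_name: ollama
    pull_policy: always
    ports:
      - 3081:11434                     # Ollama reachable from the host, as described below
    volumes:
      - /data/llm/ollama:/root/.ollama
    networks:
      - openwebui-net

networks:
  openwebui-net:
EOF
</code>

Keeping both services in one compose file lets Open WebUI reach Ollama over the shared network by service name, while only the ports you explicitly publish become visible on the host.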

This setup will pull both Ollama and Open WebUI into the same container stack, which allows for a seamless integration and neat organization on the server itself.

This setup will let you access your Ollama instance from //outside// the container, on port 3081, which should **NOT** be forwarded to the public internet.
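
In practice that boils down to something like the following; the compose file location and the firewall tool are assumptions:

<code bash>
# Bring the stack up (assuming the compose file lives in /data/llm)
cd /data/llm && docker compose up -d

# From the host itself, Ollama should answer on the published port 3081 ...
curl http://localhost:3081/api/version

# ... but do not forward 3081 on your router, and block it at the firewall,
# e.g. with ufw (the page does not say which firewall is in use):
ufw deny 3081/tcp
</code>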

===== Configuration =====

After you start the containers, be ready to wait a good ten minutes before everything is up and responsive.
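
To see when the stack is actually ready instead of watching the clock, you can follow the container logs; the service name here is an assumption:

<code bash>
# Follow Open WebUI's startup output (Ctrl-C to stop following)
cd /data/llm && docker compose logs -f open-webui
</code>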
| + | |||
| + | You can find your Ollama public key under **data/ | ||
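
Assuming the Ollama data folder is the one mapped under **/data/llm** above, the key can be read straight from the host; the exact path is an assumption:

<code bash>
# Ollama generates an ed25519 keypair on first start; with /data/llm/ollama
# mounted as the container's .ollama directory, the public key lands here:
cat /data/llm/ollama/id_ed25519.pub
</code>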

To start using your own offline LLM:

  * Log in to the Open WebUI page (ai.mydomain.com).
  * At first login you will be prompted to create the admin user; do so.
  * Before chatting, you need to set up a model in Ollama:
  * Go to //admin panel / settings / connections//.
  * Under Ollama, edit it to the URL **http://**… (the address of your Ollama instance).
  * Now tap the small download-like icon on the right of the URL.
  * Type a model name (e.g. deepseek-r1) and download it.
  * There will be no notification when the download is finished, but the model(s) will be listed under the //models// page in the admin panel (a command-line check is sketched below).
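
If you prefer the command line for this step, the same pull can be done directly against Ollama; the container name is an assumption, while port 3081 is the one published above:

<code bash>
# Pull a model inside the Ollama container (container name assumed to be "ollama")
docker exec -it ollama ollama pull deepseek-r1

# List the models Ollama knows about -- these are what Open WebUI will show
curl http://localhost:3081/api/tags
</code>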

At this point, your LLM is ready and operative!