===== Installation =====

To install Open WebUI, of course, you need its dedicated user, and you will also need some persistent folders to map as volumes into the containers. I chose to put these folders under **/data/llm**.

So add the user and create the folders:
<code bash>
useradd -d /data/daemons/openwebui -m openwebui
mkdir /data/llm
chown openwebui:openwebui /data/llm
su - openwebui
cd /data/llm
mkdir webui-data
mkdir ollama
...
       - "3080:8080"       - "3080:8080"
     volumes:     volumes:
-      - ./webui-data:/app/backend/data+      - /data/llm/webui-data:/app/backend/data
     networks:     networks:
       - openwebui-net       - openwebui-net
...
      - 3081:11434
    volumes:
      - /data/llm/ollama/code:/code
      - /data/llm/ollama/ollama:/root/.ollama
    container_name: ollama
    pull_policy: always
...
This setup will pull both Ollama and Open WebUI into the same container stack. This allows for a seamless integration and a neat organization on the server itself.

This setup will let you access your Ollama instance from //outside// the container, on port 3081, which should **NOT** be forwarded on the proxy server, because it's only for home access. The Open WebUI instance will instead be available on port 3080 and accessible through the web proxy, see below.
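To quickly check that Ollama answers on port 3081 from your home network, you can query its HTTP API from another machine. A small sketch; **myserver** is a placeholder for your server's LAN hostname or IP:

<code bash>
# Ask the Ollama API for its version
curl http://myserver:3081/api/version

# List the models currently available on the instance
curl http://myserver:3081/api/tags
</code>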
  
  
...
===== Configuration =====
  
After you start the containers, be ready to wait a good ten minutes or more until the web GUI is operative. YMMV of course, depending on your server capabilities.

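If you want to see what is happening while you wait, you can follow the container logs. A minimal sketch, to be run from the directory containing the compose file; the service name //open-webui// is an assumption, adjust it to whatever your compose file uses:

<code bash>
# Follow the logs of the whole stack (run from the directory with the compose file)
docker compose logs -f

# Or follow a single service; "open-webui" is an assumed service name, adjust to your compose file
docker compose logs -f open-webui
</code>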
You can find your Ollama public key under **/data/llm/ollama/ollama/id_ed25519.pub** (the host folder mapped to **/root/.ollama** inside the container).
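To copy it, you can simply print it on the host (path as per the volume mapping above):

<code bash>
# Print the Ollama public key so it can be pasted into Open WebUI
cat /data/llm/ollama/ollama/id_ed25519.pub
</code>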
  
To start using your own offline LLM:
  * Log in to the Open WebUI page (ai.mydomain.com)
  * At first login you will be prompted to create the admin user, do so
  * Before chatting, you need to set up a model on Ollama
  * Go to //admin panel / settings / connections//
  * Under Ollama, edit the connection to use the URL **http://ollama:11434** and paste your Ollama public key (see above)
  * Now tap the small download-like icon on the right of the URL
  * Type a model name (ex: deepseek-r1) and download it (you can also pull models from the command line, see the sketch below)
  * There will be no notification when the download finishes, but the model(s) will be listed under the //models// page in the admin panel
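
As an alternative to the download icon in the web GUI, you can pull and list models from the command line through the Ollama container. A minimal sketch, assuming the container is named **ollama** as in the compose file above:

<code bash>
# Pull a model with the Ollama CLI inside the container
docker exec -it ollama ollama pull deepseek-r1

# List the models that are available locally
docker exec -it ollama ollama list
</code>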
  
At this point, your LLM is ready and operative!