
2025 iThome 鐵人賽

DAY 10
AI & Data

"30 Days to Build a Discord AI Assistant: Combining a Local LLM with IoT for Smart Living" series, Part 10

Day 10: Connecting Open WebUI to Ollama and Completing the Frontend-Backend Integration


📌 Background

Yesterday we successfully started Open WebUI and could operate its interface in the browser.

But for now it is just an "empty shell" with no real model engine behind it.

Today's goals are:

  • Connect Ollama (the backend engine) with Open WebUI (the frontend interface)
  • Switch between and use models directly inside the WebUI
  • Verify the complete frontend-to-backend workflow

🛠️ Integration Steps

Step 1: Confirm Ollama is running

First, make sure the Ollama server is up:

curl http://localhost:11434/api/tags

If everything is working, it should print the list of installed models, for example:

tiramisu-tuf@tiramisu-tuf:~$ curl http://localhost:11434/api/tags
{"models":[
{"name":"llama2:latest","model":"llama2:latest","modified_at":"2025-09-15T20:19:29.430312244+08:00","size":3826793677,"digest":"78e26419b4469263f75331927a00a0284ef6544c1975b826b15abdaef17bb962","details":{"parent_model":"","format":"gguf","family":"llama","families":["llama"],"parameter_size":"7B","quantization_level":"Q4_0"}},
{"name":"gemma3:27b-it-qat","model":"gemma3:27b-it-qat","modified_at":"2025-08-12T13:30:03.53741178+08:00","size":18087370898,"digest":"29eb0b9aeda35295ed728124d341b27e0c6771ea5c586fcabfb157884224fa93","details":{"parent_model":"","format":"gguf","family":"gemma3","families":["gemma3"],"parameter_size":"27.4B","quantization_level":"Q4_0"}},
{"name":"gemma3:1b","model":"gemma3:1b","modified_at":"2025-08-12T13:29:25.623243367+08:00","size":815319791,"digest":"8648f39daa8fbf5b18c7b4e6a8fb4990c692751d49917417b8842ca5758e7ffc","details":{"parent_model":"","format":"gguf","family":"gemma3","families":["gemma3"],"parameter_size":"999.89M","quantization_level":"Q4_K_M"}},
{"name":"gemma3:4b-it-qat","model":"gemma3:4b-it-qat","modified_at":"2025-08-12T13:28:34.370423111+08:00","size":4006630865,"digest":"d01ad0579247579135cd4d4b156cab36613d1936ccfd9b81e4be6f4ef05416fe","details":{"parent_model":"","format":"gguf","family":"gemma3","families":["gemma3"],"parameter_size":"4.3B","quantization_level":"Q4_0"}},
{"name":"gemma3:4b","model":"gemma3:4b","modified_at":"2025-08-12T13:24:20.225492503+08:00","size":3338801804,"digest":"a2af6cc3eb7fa8be8504abaf9b04e88f17a119ec3f04a3addf55f92841195f5a","details":{"parent_model":"","format":"gguf","family":"gemma3","families":["gemma3"],"parameter_size":"4.3B","quantization_level":"Q4_K_M"}},
{"name":"gemma3:12b","model":"gemma3:12b","modified_at":"2025-08-12T13:20:01.570267405+08:00","size":8149190253,"digest":"f4031aab637d1ffa37b42570452ae0e4fad0314754d17ded67322e4b95836f8a","details":{"parent_model":"","format":"gguf","family":"gemma3","families":["gemma3"],"parameter_size":"12.2B","quantization_level":"Q4_K_M"}},
{"name":"gemma3:12b-it-qat","model":"gemma3:12b-it-qat","modified_at":"2025-08-12T13:09:12.567612863+08:00","size":8928675346,"digest":"5d4fa005e7bb5931be8bc35224080a9b316c7e1c069c9481a6a51b607f4252e2","details":{"parent_model":"","format":"gguf","family":"gemma3","families":["gemma3"],"parameter_size":"12.2B","quantization_level":"Q4_0"}},
{"name":"gemma2:9b","model":"gemma2:9b","modified_at":"2025-08-12T02:00:41.651394044+08:00","size":5443152417,"digest":"ff02c3702f322b9e075e9568332d96c0a7028002f1a5a056e0a6784320a4db0b","details":{"parent_model":"","format":"gguf","family":"gemma2","families":["gemma2"],"parameter_size":"9.2B","quantization_level":"Q4_0"}},
{"name":"deepseek-coder:6.7b","model":"deepseek-coder:6.7b","modified_at":"2025-08-12T02:00:40.15239302+08:00","size":3827834503,"digest":"ce298d984115b93bb1b191b47fee6b39e4e9fbd5f18e651c02f9fa74e0edcd13","details":{"parent_model":"","format":"gguf","family":"llama","families":["llama"],"parameter_size":"7B","quantization_level":"Q4_0"}},
{"name":"mistral:7b","model":"mistral:7b","modified_at":"2025-08-12T01:59:54.955370973+08:00","size":4372824384,"digest":"6577803aa9a036369e481d648a2baebb381ebc6e897f2bb9a766a2aa7bfbc1cf","details":{"parent_model":"","format":"gguf","family":"llama","families":["llama"],"parameter_size":"7.2B","quantization_level":"Q4_K_M"}},
{"name":"gpt-oss:20b","model":"gpt-oss:20b","modified_at":"2025-08-12T01:30:54.185423895+08:00","size":13780173734,"digest":"e95023cf3b7bcd1fb314930a51889af749ccc82fc494226f4cb9a721a7b02ea8","details":{"parent_model":"","format":"gguf","family":"gptoss","families":["gptoss"],"parameter_size":"20.9B","quantization_level":"MXFP4"}},
{"name":"llama3:latest","model":"llama3:latest","modified_at":"2025-08-12T00:37:37.389923112+08:00","size":4661224676,"digest":"365c0bd3c000a25d28ddbf732fe1c6add414de7275464c4e4d1c3b5fcb5d8ad...
tiramisu-tuf@tiramisu-tuf:~$

👉 If there is no output, start Ollama first:

ollama serve
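Once the server responds, the `/api/tags` JSON can also be inspected programmatically instead of eyeballing the raw output. A minimal Python sketch, using only the standard library and a hard-coded sample payload in place of a live call to `http://localhost:11434/api/tags` (the field names match the response shown above):

```python
import json

# Sample payload in the shape returned by Ollama's /api/tags endpoint,
# trimmed to two entries; a live call would fetch it with urllib.request.
sample = '''
{"models": [
  {"name": "llama2:latest", "size": 3826793677,
   "details": {"parameter_size": "7B", "quantization_level": "Q4_0"}},
  {"name": "gemma3:4b", "size": 3338801804,
   "details": {"parameter_size": "4.3B", "quantization_level": "Q4_K_M"}}
]}
'''

def list_models(payload: str) -> list[str]:
    """Return one human-readable line per installed model."""
    models = json.loads(payload)["models"]
    return [
        f"{m['name']}  {m['details']['parameter_size']}  "
        f"{m['size'] / 1e9:.1f} GB"
        for m in models
    ]

for line in list_models(sample):
    print(line)
```

This is handy later when scripting against Ollama (e.g. from the Discord bot), since the same parsing works on the real response.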

Step 2: Open the Open WebUI settings

  1. Open http://localhost:8080 in your browser
  2. Log in with the admin account
  3. Click Settings in the top-right corner → Connections

Step 3: Add the Ollama connection

Click Settings → Admin Settings → Connections

  • Provider: select Ollama

  • Base URL: enter

    http://host.docker.internal:11434
    
    
    (Because Open WebUI runs inside a Docker container, it has to reach the host's Ollama through host.docker.internal rather than localhost.)

  • Model List: the models already installed in Ollama are read automatically (e.g. llama2, gemma3:4b)


👉 If you are on Linux and find that host.docker.internal does not resolve, you can use the host's IP instead:

ip addr show docker0

Once you have the IP (typically 172.17.0.1 for docker0), fill it in as the base URL, e.g. http://172.17.0.1:11434.
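Alternatively, on Linux you can make host.docker.internal resolve inside the container by mapping it to Docker's host gateway. A hedged docker-compose sketch (the service name and image are assumptions based on the Day 9 setup; adjust to your own compose file):

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "8080:8080"
    # Makes host.docker.internal resolve to the Docker host on Linux,
    # so the Base URL http://host.docker.internal:11434 works as-is.
    extra_hosts:
      - "host.docker.internal:host-gateway"
```

With this in place the same Base URL works on Linux, macOS, and Windows, so you don't need to hard-code a docker0 IP.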


Step 4: Test a model

You should now see the models in the dropdown of the chat window, for example:

  • llama3
  • gemma3:4b

Select Gemma 3 4B and enter:

Explain the advantages of blockchain in three points

The model replies:

1. Decentralization: no reliance on a single server, which improves security
2. Tamper resistance: once data is written, it is hard to alter
3. Transparency: every node can verify the transaction history

🎉 That means our frontend-backend integration works!
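The same round trip can be verified without the UI by calling Ollama's HTTP API directly. A minimal Python sketch that builds the request body for the `/api/generate` endpoint and parses a non-streaming response; the live network call is left as a comment since it assumes Ollama is running locally, and a canned response of the same shape is substituted:

```python
import json

def build_request(model: str, prompt: str) -> bytes:
    """Request body for POST http://localhost:11434/api/generate."""
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete JSON object, not a stream
    }).encode()

def extract_reply(raw: bytes) -> str:
    """Pull the generated text out of a non-streaming response."""
    return json.loads(raw)["response"]

body = build_request(
    "gemma3:4b",
    "Explain the advantages of blockchain in three points",
)

# A live call would look like:
#   import urllib.request
#   req = urllib.request.Request("http://localhost:11434/api/generate", data=body)
#   raw = urllib.request.urlopen(req).read()
# Canned response with the same shape as a real one:
raw = b'{"model": "gemma3:4b", "response": "1. Decentralization ...", "done": true}'
print(extract_reply(raw))
```

If this works from a script but the WebUI dropdown stays empty, the problem is in the container's connection settings, not in Ollama itself.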


⚡ Troubleshooting Common Errors

  1. Model list cannot be read

    • Confirm Ollama is running properly:

      curl http://localhost:11434/api/tags
      
      
    • If WebUI still cannot read the list, try switching the Base URL to http://host.docker.internal:11434

  2. WebUI endpoint misconfigured

    • Confirm docker-compose.yml does not restrict network access

    • You can add this to the open-webui service:

      network_mode: host
      
      

      so the container uses the host's network directly.
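Put together, one possible open-webui service using host networking might look like the following sketch (image tag and volume name are assumptions from the Day 9 setup; note that with host networking the `ports:` mapping is ignored and the UI listens directly on the host's port 8080):

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    # Host networking: the container shares the host's network stack,
    # so localhost inside the container is the host itself.
    network_mode: host
    environment:
      # With host networking, Ollama is reachable at plain localhost.
      - OLLAMA_BASE_URL=http://localhost:11434
    volumes:
      - open-webui:/app/backend/data

volumes:
  open-webui:
```

Host networking is the bluntest fix; prefer the host.docker.internal mapping from Step 3 if you want to keep the container's network isolated.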

  3. Responses are too slow

    • The model is too large; switch to a smaller one (e.g. gemma3:1b).
    • Confirm GPU acceleration is enabled, and watch GPU utilization with nvidia-smi.

Previous article
Day 9: Installing and Launching Open WebUI with Docker