AI / Docker / Home Automation / Home Lab / Projects · September 24, 2025

Self-Hosted LLM Cluster — Offline, Free, Private, Open Source!


Welcome! Check out the video above to learn how to deploy your own private, self-hosted, FREE AI LLM cluster on hardware you likely already own! The core of this setup is multiple distributed Ollama engines, all clustered together behind a wonderful web UI called… Open WebUI! Below is my example docker-compose.yml to get you started quickly. Be sure to watch the video to understand the config, and to properly secure the instance once you've logged in and completed the first-time configuration in the browser.

version: "3.9"

volumes:
  open-webui:
    driver_opts:
      type: nfs
      o: addr=serverIPaddress,nolock,soft,rw
      device: ":/mnt/RAID5/Docker/open-webui"

networks:
  traefik_proxy:
    external: true

services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    deploy:
      mode: replicated
      replicas: 1
      placement:
        constraints:
          - node.role != manager
      restart_policy:
        condition: on-failure
        delay: 5s
#      labels:
#        - traefik.enable=true
#        - traefik.http.routers.openwebui.tls=true
#        - traefik.http.routers.openwebui.service=openwebui
#        - traefik.http.routers.openwebui.rule=Host(`ai.example.com`)
#        - traefik.http.services.openwebui.loadbalancer.server.port=8080
    environment:
      - WEBUI_AUTH=False # disables login for first-time setup — re-enable/secure it afterwards!
    networks:
      - traefik_proxy
    ports:
      - 3000:8080
    volumes:
      - open-webui:/app/backend/data

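The compose file above only covers the Open WebUI front end; each node in the cluster also runs its own Ollama engine (covered in the video). As a rough sketch, one Ollama node could be defined like this — the image name and default API port (11434) come from the official ollama/ollama image, while the volume name is just a placeholder:

```yaml
# Hypothetical companion stack for a single Ollama node (adapt per node).
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - 11434:11434                  # Ollama's default API port
    volumes:
      - ollama-models:/root/.ollama  # persist downloaded models across restarts

volumes:
  ollama-models:
```

Open WebUI can then be pointed at each engine from its admin settings; check the Open WebUI documentation for the environment variables that accept a list of Ollama endpoints if you'd rather configure them up front.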
You’ll likely want to run this through your Traefik instance, and for whatever reason this doesn’t play nice with labels: to automagically configure it in Traefik, unfortunately. Just use a file-provider entry instead, and it works fine 🙂 I left the labels: config in anyway, commented out, in case you want to try it yourself. Just make sure you use your own FQDN in the Host rule (instead of ai.example.com).
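For reference, a Traefik file-provider entry for this might look something like the sketch below. It mirrors the commented-out labels (router name, TLS, the ai.example.com host rule); the entry point name and the backend URL are assumptions — point the server URL at whatever host publishes the container's port 3000:

```yaml
# Hypothetical Traefik dynamic (file provider) config — adjust names and addresses.
http:
  routers:
    openwebui:
      rule: Host(`ai.example.com`)   # replace with your FQDN
      entryPoints:
        - websecure                  # your HTTPS entry point name may differ
      tls: {}
      service: openwebui
  services:
    openwebui:
      loadBalancer:
        servers:
          - url: "http://openwebui-host:3000"  # host:port where 3000 is published
```

Drop this into a file watched by your Traefik file provider (the `providers.file.directory` setting) and Traefik should pick it up without a restart.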