Welcome to part 6 of my LLM learning journey. Before I continue to explore the many features of Open WebUI (OWUI) in combination with Ollama, I wanted to take care of one other thing: by default, Open WebUI only offers an HTTP frontend and no HTTPS port, so a reverse proxy is required to use the service securely over the Internet. If the server on which OWUI runs can be reached over a public IP address, getting a reverse proxy with Let's Encrypt certificates up and running with docker compose is straightforward; have a look at this post for the details on how to do that. In my case, the server I'm running OWUI and Ollama on does not have a public IP address, so I needed to look for something slightly different.
A Double Reverse Proxy
While my Ollama server does not have a public IP address, I do have another server on which I run a number of other web services behind a Caddy reverse proxy with Let's Encrypt. One option would have been to run OWUI on that server with docker compose and let it connect over the local network to Ollama running on the local server. However, I was not quite sure how much CPU power and disk space OWUI requires, so I was not keen on going in that direction. The solution: run another reverse proxy with docker compose on the Internet-facing server that receives requests from Caddy and forwards them over HTTP to OWUI running on the local server. Sounds complicated, but it is actually quite simple to set up if you already have a Caddy reverse proxy running (see my post linked above). Here's the docker-compose.yml file that connects to my Caddy reverse proxy on the same server:
services:
  webuiproxy:
    image: nginx:alpine
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf:ro
    restart: unless-stopped
    labels:
      caddy: www.example.com
      caddy.reverse_proxy: "{{upstreams 80}}"
      caddy.log.output: stdout

networks:
  default:
    external:
      name: caddy
Basically, this setup runs an nginx web server that forwards requests for the configured domain name to my local, non-Internet-facing server. The actual forwarding to OWUI's TCP port 3000 is configured in the nginx.conf file referenced in the yml file above:
events {
    worker_connections 1024;
}

http {
    client_max_body_size 1G;  # or whatever max upload size you want

    # Global proxy timeouts for long LLM responses
    proxy_read_timeout 3600s;
    proxy_send_timeout 3600s;
    proxy_connect_timeout 60s;

    server {
        listen 80;

        # WebSocket/SSE support
        location / {
            proxy_pass http://1.2.3.4:3000;
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;

            # Buffering disabled for streaming
            proxy_buffering off;
            proxy_cache off;

            # Timeouts for SSE streams
            proxy_read_timeout 3600s;
            proxy_send_timeout 3600s;
        }
    }
}
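Once the containers are up, the whole chain can be checked from any machine on the Internet before opening a browser. Just a quick sketch, with www.example.com standing in for the real domain name configured in the Caddy labels:

```shell
# Fetch only the response headers over HTTPS: Caddy terminates TLS,
# nginx forwards the request to Open WebUI on the local server.
curl -I https://www.example.com

# A 200 (or a redirect to the login page) means the chain works end to
# end; a 502 from Caddy or nginx usually means the next hop is down.
```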
And that’s it. Once both config files are in place and the docker compose project is restarted, HTTPS requests to the configured domain name are forwarded to Open WebUI running on the local server without Internet access. The last hop is plain HTTP, so requests and responses are NOT encrypted on the local network. For the moment, that’s acceptable to me. For a more permanent installation, however, I would consider tunneling the HTTP requests over an ssh tunnel between the Internet-facing server and my local LLM server.
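The ssh tunnel variant could look like the following. This is only a sketch under a few assumptions: the user name llm is hypothetical, and 1.2.3.4 is the local LLM server from the nginx config above:

```shell
# Run on the Internet-facing server: forward its local port 3000
# through an encrypted ssh tunnel to Open WebUI on the LLM server.
# -N: do not run a remote command, -L: local port forwarding
ssh -N -L 127.0.0.1:3000:127.0.0.1:3000 llm@1.2.3.4
```

With the tunnel in place, nginx's proxy_pass would point at the tunnel endpoint (http://127.0.0.1:3000) instead of 1.2.3.4:3000; since nginx runs in a container, it would also need to reach the host's loopback interface, e.g. via host networking. For unattended operation, something like autossh or a systemd unit would keep the tunnel alive.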
Summary
With this step, I now have a setup in place that I can securely use over the Internet from my notebook and also from my smartphone. It turns out that Open WebUI’s user interface also looks good on small screens!