No, but the “AI” option available on the Mozilla Labs tab in settings allows you to integrate with a self-hosted LLM.
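If it helps, here's roughly how to point the sidebar chatbot at a local endpoint via about:config. The pref names below are what I remember from recent Firefox builds, so treat them as assumptions and double-check in your version:

```
browser.ml.chat.enabled        true
browser.ml.chat.hideLocalhost  false                   # show localhost providers in the picker
browser.ml.chat.provider       http://localhost:3000   # e.g. a local Open WebUI instance
```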
I've had this setup running for a while now.
Which model are you running? How much RAM?
My (Docker-based) configuration, with example setup commands sketched below the links:
Software stack: Linux > Docker Container > Nvidia Runtime > Open WebUI > Ollama > Llama 3.1
Hardware: i5-13600K, NVIDIA RTX 3070 Ti (8 GB VRAM), 32 GB RAM
Docker: https://docs.docker.com/engine/install/
Nvidia Runtime for docker: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html
Open WebUI: https://docs.openwebui.com/
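As mentioned above, here's a minimal sketch of the container commands, mirroring the Ollama and Open WebUI quick starts. The ports, volume names, and model tag are assumptions for illustration, not anything specific to my setup:

```sh
# Run Ollama with GPU access (requires the NVIDIA Container Toolkit linked above)
docker run -d --gpus=all \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama ollama/ollama

# Pull the model inside the container (the default llama3.1 8B quant fits in 8 GB VRAM)
docker exec -it ollama ollama pull llama3.1

# Run Open WebUI, pointed at the Ollama container via the host gateway
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui ghcr.io/open-webui/open-webui:main
```

The UI then comes up on http://localhost:3000.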