
ollama-self-hosted


A simple Docker Compose setup to self-host Ollama and Open WebUI. Run your own private LLMs with GPU acceleration (NVIDIA/AMD) and complete data privacy. Easy to integrate with other services like n8n.
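A setup like this is typically a single `docker-compose.yml`. The sketch below is illustrative only, assuming the usual `ollama/ollama` and Open WebUI images with an NVIDIA GPU reservation; service names, ports, and volumes are assumptions, and the file in the repository may differ:

```yaml
# Hypothetical docker-compose.yml sketch; names and ports are
# assumptions, not taken from this repository.
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"          # Ollama API
    volumes:
      - ollama_data:/root/.ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia   # NVIDIA example; AMD uses ROCm images/devices
              count: all
              capabilities: [gpu]

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    ports:
      - "3000:8080"            # Web UI on http://localhost:3000
    depends_on:
      - ollama
    volumes:
      - open_webui_data:/app/backend/data

volumes:
  ollama_data:
  open_webui_data:
```

Keeping both services on the same Compose network is what lets Open WebUI (and tools such as n8n) reach Ollama at `http://ollama:11434` without exposing it publicly.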

Created: 2025-10-07T04:38:16
Updated: 2025-10-09T06:27:45
https://airat.top
Stars: 3
