ollama-self-hosted
A simple Docker Compose setup to self-host Ollama and Open WebUI. Run your own private LLMs with GPU acceleration (NVIDIA/AMD) and complete data privacy. Easy to integrate with other services like n8n.
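A setup along these lines can be sketched as a single `docker-compose.yml`. This is a minimal illustration, not the repository's actual file: the service names, volume name, and host port are assumptions, though `ollama/ollama` and `ghcr.io/open-webui/open-webui` are the official images, `11434` is Ollama's default API port, and the `deploy.resources` block is the standard Compose way to reserve NVIDIA GPUs (AMD setups differ).

```yaml
# Hypothetical sketch of a Compose file for Ollama + Open WebUI.
services:
  ollama:
    image: ollama/ollama:latest
    volumes:
      - ollama-data:/root/.ollama   # persist downloaded models
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia        # NVIDIA GPU passthrough; requires nvidia-container-toolkit
              count: all
              capabilities: [gpu]

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                 # host port 3000 is an arbitrary choice
    environment:
      # Point the UI at the Ollama API on the internal Compose network
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama

volumes:
  ollama-data:
```

With a file like this, `docker compose up -d` would bring up both services, and other containers on the same Compose network (such as n8n) could reach the API at `http://ollama:11434`.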