OnLLM
OnLLM is a platform for running LLM or SLM models with ONNX Runtime directly on low-end devices such as low-power computers and mobile phones. It is cross-platform, built with Kivy, and open-source.
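To make the idea concrete, here is a minimal sketch of greedy next-token decoding with the plain onnxruntime Python API on CPU. The model file name, the input/output tensor names (`input_ids`, `attention_mask`, `logits`), and the hard-coded prompt tokens are illustrative assumptions, not part of OnLLM; a decoder exported with a KV cache would additionally need `past_key_values` inputs, and a real app would use a proper tokenizer.

```python
# Minimal sketch: greedy next-token decoding with ONNX Runtime on CPU.
# Assumed (not from OnLLM itself): a decoder-only model exported to
# "model.onnx" with inputs "input_ids"/"attention_mask" and output "logits".
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

# Hypothetical prompt token IDs; replace with real tokenizer output.
tokens = [1, 15043, 29892]

for _ in range(32):  # generate up to 32 new tokens
    input_ids = np.array([tokens], dtype=np.int64)
    attention_mask = np.ones_like(input_ids)
    (logits,) = session.run(
        ["logits"], {"input_ids": input_ids, "attention_mask": attention_mask}
    )
    next_token = int(np.argmax(logits[0, -1]))  # greedy pick at last position
    tokens.append(next_token)
```

Running the full sequence through the model on every step keeps the sketch simple; on the low-end hardware OnLLM targets, a KV-cache export would avoid that quadratic recomputation.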