OnLLM
OnLLM is a platform for running LLM or SLM models with OnnxRuntime directly on low-end devices such as low-power computers and mobile phones. It is cross-platform (built with Kivy) and open-source.
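The core idea of running a model this way on CPU-only hardware can be sketched with a short OnnxRuntime loop. This is a minimal illustration, not OnLLM's actual code: the file name model.onnx, the "input_ids" and "logits" tensor names, and the placeholder token IDs are assumptions; a real application would drive the loop from the model's tokenizer, and many exported models also require attention_mask and KV-cache inputs.

```python
# Minimal sketch: greedy decoding with onnxruntime on CPU.
# Assumptions (not taken from the OnLLM repo): "model.onnx" is a decoder-only
# model exported with an "input_ids" input and a "logits" output.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

input_ids = np.array([[1, 15043, 29892]], dtype=np.int64)  # placeholder prompt tokens
for _ in range(16):  # generate up to 16 new tokens
    (logits,) = session.run(["logits"], {"input_ids": input_ids})
    next_token = int(np.argmax(logits[0, -1]))  # greedy pick of the next token
    input_ids = np.concatenate(
        [input_ids, np.array([[next_token]], dtype=np.int64)], axis=1
    )

print(input_ids[0].tolist())  # generated token IDs; decode with the matching tokenizer
```

On a phone or low-power machine the same session can be reused across requests, which avoids paying the model-load cost on every generation.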