
LLM-Inference-on-Android


This project demonstrates how to run Large Language Model (LLM) inference locally on Android devices using MediaPipe. It provides a foundation for building applications that leverage the power of LLMs without relying on cloud-based APIs, ensuring privacy and enabling offline functionality.
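
As a rough illustration of what on-device inference with MediaPipe looks like, here is a minimal Kotlin sketch using the MediaPipe LLM Inference task (`com.google.mediapipe:tasks-genai`). The model path, token limit, and function name are illustrative assumptions, not taken from this project's code; a compatible model file must be placed on the device beforehand.

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Hypothetical on-device location; push a converted model file
// (e.g. a Gemma .bin) here before running.
private const val MODEL_PATH = "/data/local/tmp/llm/model.bin"

fun runLocalInference(context: Context, prompt: String): String {
    // Configure the inference task to load the local model file.
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath(MODEL_PATH)
        .setMaxTokens(512) // combined limit on input + output tokens
        .build()

    // Create the task and generate a response entirely on-device;
    // no network call or cloud API is involved.
    val llmInference = LlmInference.createFromOptions(context, options)
    return llmInference.generateResponse(prompt)
}
```

In a real app the heavy `createFromOptions` call would typically run off the main thread and the task instance would be reused across prompts rather than recreated per call.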

Created: 2025-03-31T19:52:57
Updated: 2025-03-31T21:05:58
Stars: 0 (+0)