InferX is an Inference Function-as-a-Service platform
Fair-code workflow automation platform with native AI capabilities. Combine visual building with custom code, self-host or cloud, 400+ integrations.
A light and fast AI assistant. Supports: Web | iOS | macOS | Android | Linux | Windows
GPT4All: Run Local LLMs on Any Device. Open-source and available for commercial use.
Dear ImGui: Bloat-free graphical user interface library for C++ with minimal dependencies
An awesome list that curates the best Flutter libraries, tools, tutorials, articles and more.
Open-source Airtable alternative
A high-throughput and memory-efficient inference and serving engine for LLMs
Port of OpenAI's Whisper model in C/C++
Making large AI models cheaper, faster, and more accessible
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.