Liquid AI Releases 1.2B Reasoning Model: Under 1GB of Memory, Runs on Mobile Edge Devices
Liquid AI has launched LFM2.5-1.2B-Thinking, a 1.2B-parameter model built for complex logic and math tasks. It requires only about 900MB of memory for on-device deployment and runs offline on modern phones, bringing data-center-class reasoning to mobile devices.
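For readers who want a sense of what running such a small reasoning model locally looks like, here is a minimal sketch using the Hugging Face transformers library. The repo id "LiquidAI/LFM2.5-1.2B-Thinking" is an assumption based on the model name reported above, not a confirmed hub path, and the sub-1GB figure would come from a quantized on-device build (e.g., a GGUF or mobile runtime export) rather than the full-precision weights loaded here.

```python
# Sketch: loading a ~1.2B-parameter model for local, memory-conscious inference.
# The hub id below is an assumption inferred from the article's model name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "LiquidAI/LFM2.5-1.2B-Thinking"  # assumed hub id, not confirmed

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # bf16 puts 1.2B params near ~2.4GB; the ~900MB
                                 # footprint cited above implies a quantized build
    device_map="auto",
)

# A simple math prompt of the kind the model is positioned for.
prompt = "If a train travels 120 km in 1.5 hours, what is its average speed?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

On an actual phone, deployment would typically go through a mobile inference runtime with a quantized export of the weights rather than the desktop-oriented path shown here; the sketch is only meant to illustrate the scale involved.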