Tsinghua University and Miracl AI published a paper in Nature Machine Intelligence proposing a "capability density" metric: the effective capability a model supports per parameter. They argue that model development should prioritize "density over scale." Analyzing 51 mainstream open-source large models, the team found that capability density doubles roughly every 3.5 months, so the number of parameters required for the same task falls exponentially over time.
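The arithmetic behind that claim can be sketched directly: if density doubles every 3.5 months, the parameter count needed for a fixed capability halves on the same schedule. The function below is a hypothetical illustration of that trend, not code from the paper; the 3.5-month figure is the one reported in the article.

```python
# Illustrative sketch of the reported trend: capability density doubling
# every 3.5 months implies the parameters needed for a fixed capability
# halve every 3.5 months. The doubling period is taken from the article;
# the function name and interface are assumptions for this example.
DOUBLING_PERIOD_MONTHS = 3.5

def params_needed(initial_params_b: float, months_elapsed: float) -> float:
    """Parameters (in billions) needed for the same capability level
    after `months_elapsed`, assuming exponential density growth."""
    return initial_params_b * 2 ** (-months_elapsed / DOUBLING_PERIOD_MONTHS)

# Example: a task needing 13B parameters today would, under this trend,
# need 13 / 2**3 = 1.625B parameters after three doublings (10.5 months).
print(params_needed(13.0, 10.5))
```

This is the sense in which a 0.5B-2B model released later can match a 7B-13B model released earlier on the same task, if the trend holds.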


The study emphasized that high density is not simple compression: naively "stuffing a dictionary into a small book" loses capability, so data, compute, and algorithms must be co-designed. Building on this, Miracl AI launched a "high-density" series of 0.5B-2B models that match the performance of 7B-13B models on the same downstream tasks. These models have been deployed at scale in phone voice assistants, in-car interaction, and smart-home edge boxes, with on-device inference latency under 100 ms and power consumption reduced by 45%.

Li Dan, CEO of Miracl AI, said the next step is to work with Tsinghua University to fold the density-improvement curve into model-development KPIs, with the goal of launching a "backpack-level" personal large model by 2026 that can run on NPU-equipped smartwatches, advancing a "model miniaturization" ecosystem.