AI News


Zyphra Launches Small Language Model Zamba2-2.7B: Speed Doubled, Memory Cost Reduced by 27%

Zyphra has launched the Zamba2-2.7B language model, a milestone in the small language model domain. Trained on a dataset of around 30 trillion tokens, the model delivers significantly improved performance and efficiency while reducing resource requirements at inference time, making it an efficient option for mobile applications. Key highlights include a twofold increase in response generation speed, a 27% reduction in memory usage, and a 1.29x reduction in generation latency. These gains make it particularly well suited to applications that require real-time interaction, such as virtual assistants and chatbots.
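
For readers who want to try the model for that kind of interactive use, the sketch below shows one way to load it and generate a short reply with the Hugging Face transformers library. This is a minimal sketch, not Zyphra's reference code: the repository identifier "Zyphra/Zamba2-2.7B", the prompt, and the generation settings are assumptions, and it requires a transformers release that includes support for the Zamba2 architecture.

# Minimal sketch: load an assumed "Zyphra/Zamba2-2.7B" checkpoint from the
# Hugging Face Hub and generate a short reply. Requires a transformers
# release with Zamba2 support.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Zyphra/Zamba2-2.7B"  # assumed repository name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision keeps the memory footprint small
    device_map="auto",           # place the model on a GPU if one is available
)

prompt = "Give me three tips for keeping on-device assistants responsive."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Cap the number of new tokens to keep latency low for interactive use.
output_ids = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

Greedy decoding (do_sample=False) is used here for reproducibility; for chatbot-style output, sampling with a temperature setting can be swapped in.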


Models


Zamba2 2.7B

Zyphra


Zamba2-2.7B is a hybrid model that combines state space and Transformer modules, pairing Mamba2 blocks with a shared attention module, and offers high performance with low latency.

Natural Language Processing · Transformers