AI News

AI2 Launches New Open Source Model OLMoE: Efficient, Powerful, and Affordable!

The Allen Institute for Artificial Intelligence (AI2) has released OLMoE, an open-source large language model designed to deliver high performance at low cost. The model uses a sparse Mixture-of-Experts (MoE) architecture with 7 billion total parameters, but its routing mechanism activates only about 1 billion parameters per input token, keeping computation efficient. OLMoE is available in both a general (base) version and an instruction-tuned version, and supports a context window of 4,096 tokens. Its training data draws from a wide range of sources, including Common Crawl, Dolma CC, and Wikipedia.
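To make the routing idea concrete, below is a minimal PyTorch sketch of a sparse Mixture-of-Experts layer with top-k routing. The expert count, layer sizes, and top-k value are illustrative assumptions for this sketch and do not reflect OLMoE's actual configuration.

```python
# A minimal sketch (not OLMoE's actual code) of sparse Mixture-of-Experts
# routing: the layer holds many expert feed-forward networks, but a router
# sends each token to only a few of them, so most parameters stay inactive.
# num_experts, top_k, and the layer sizes below are illustrative placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # One feed-forward "expert" per slot; only top_k run for a given token.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        ])
        # The router scores every token against every expert.
        self.router = nn.Linear(d_model, num_experts)

    def forward(self, x):                        # x: (num_tokens, d_model)
        scores = self.router(x)                  # (num_tokens, num_experts)
        weights, chosen = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)     # normalize over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = chosen[:, slot] == e      # tokens whose slot-th pick is expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

# 16 tokens pass through the layer; each token only exercises top_k experts.
layer = SparseMoELayer()
tokens = torch.randn(16, 512)
print(layer(tokens).shape)  # torch.Size([16, 512])
```

This is how a model can hold billions of parameters in total while spending only a fraction of that compute per token: the router's top-k selection determines which experts' weights are actually touched.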

AI Products

OLMoE app

Ai2 OLMoE is an open-source language model application that runs on iOS devices.

Model training and deployment

OLMoE-1B-7B

An efficient open-source large language model.

AI model

OLMoE

An open-source mixture-of-experts language model with roughly 1 billion active parameters per token.

AI model