AI News


New Multilingual Encoder mmBERT: Enhancing Speed and Efficiency Beyond XLM-R

Johns Hopkins University introduces mmBERT, a multilingual encoder that outperforms XLM-R across tasks while running 2-4x faster. It is offered in base and small variants; the base model has 22 layers and a hidden dimension of 1152…

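As a quick illustration, the sketch below shows how an encoder such as mmBERT could be loaded and used for multilingual sentence embeddings with the Hugging Face transformers library. The repo id jhu-clsp/mmBERT-base and the mean-pooling step are assumptions made for illustration, not details taken from the article.

```python
# Minimal sketch, assuming the checkpoint id "jhu-clsp/mmBERT-base";
# verify the exact repo name on the Hugging Face Hub before running.
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "jhu-clsp/mmBERT-base"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

texts = ["Hello world", "Bonjour le monde", "こんにちは世界"]
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**batch)

# Mean-pool token embeddings (ignoring padding) to get one vector per sentence.
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (outputs.last_hidden_state * mask).sum(1) / mask.sum(1)
print(embeddings.shape)  # (3, hidden_size); the article cites 1152 for the base model
```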

Models


Model            Provider   Input tokens/M   Output tokens/M   Context Length
GLM-Z1-Flash     Chatglm    -                -                 128
MiniMax I2V-01   Minimax    -                -                 -
ERNIE-1.0        Baidu      -                -                 4

Leadscanr JobClassifier Domain

ivan-kleshnin


A classifier fine-tuned from the jhu-clsp/mmBERT-small model, reaching an accuracy of 91.07% on its evaluation set. It is intended for text classification tasks.

Natural Language Processing · PEFT

Leadscanr MessageClassifier Type

ivan-kleshnin


A text classification model fine-tuned from the mmBERT-small architecture for message-type classification. It reaches an accuracy of 93.94% on its evaluation set.

Natural Language Processing · PEFT
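A minimal usage sketch for these classifiers follows, assuming they are published as PEFT adapters on top of jhu-clsp/mmBERT-small; the adapter repo id below is hypothetical and should be replaced with the actual name on the Hub.

```python
# Minimal sketch, assuming a PEFT adapter for sequence classification.
# The adapter_id is hypothetical; the base tokenizer id is an assumption.
import torch
from transformers import AutoTokenizer
from peft import AutoPeftModelForSequenceClassification

adapter_id = "ivan-kleshnin/leadscanr-messageclassifier-type"  # hypothetical repo id
model = AutoPeftModelForSequenceClassification.from_pretrained(adapter_id)
tokenizer = AutoTokenizer.from_pretrained("jhu-clsp/mmBERT-small")  # assumed base tokenizer

model.eval()
inputs = tokenizer("Looking for a senior backend engineer, remote OK.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# The predicted index maps to a label defined in the adapter's config (id2label).
predicted = logits.argmax(dim=-1).item()
print(predicted)
```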