Ping An Securities reports the launch of DeepSeek-V3.1, which introduces the new UE8M0 FP8 scale precision format and improves tool-use efficiency and performance on AI tasks. The innovation advances collaboration between domestic chips and AI models.
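UE8M0 is an exponent-only scale format: 8 exponent bits and no mantissa, so every block scale is a power of two and dequantization is an exact exponent shift rather than a lossy division. A minimal sketch of the idea in Python (illustrative only; the function name and the FP8 E4M3 target range of ±448 are assumptions, not DeepSeek's actual kernel):

```python
import numpy as np

def ue8m0_scale(block: np.ndarray) -> float:
    """Pick a power-of-two scale mapping the block's max into FP8 E4M3 range."""
    amax = np.max(np.abs(block))
    if amax == 0.0:
        return 1.0
    # E4M3's largest finite value is 448; round the needed scale down to a
    # power of two. A real implementation would also clamp the exponent to
    # the range representable by the 8-bit E8M0 code.
    exponent = np.floor(np.log2(448.0 / amax))
    return float(2.0 ** exponent)

block = np.array([0.03, -1.7, 5.2, 0.004], dtype=np.float32)
s = ue8m0_scale(block)
quantized = np.round(block * s)   # stand-in for real FP8 rounding
dequantized = quantized / s       # exact: dividing by a power of two loses nothing
print(s, dequantized)
```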
A lightweight Chinese GPT2 model pre-trained on CLUECorpusSmall, featuring a 6-layer architecture optimized for Chinese text generation tasks
GPT2-distil is a lightweight Chinese text generation model based on the GPT2 architecture, optimized for Chinese generation tasks.
A lightweight Chinese GPT2 model pre-trained on CLUECorpusSmall, featuring 6-layer/768-dim architecture optimized for Chinese text generation
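The GPT2-distil entries above are typically used through the standard transformers text-generation pattern. A sketch assuming the Hugging Face repo id uer/gpt2-distil-chinese-cluecorpussmall, which is not stated in the text:

```python
from transformers import BertTokenizer, GPT2LMHeadModel, TextGenerationPipeline

# UER's Chinese GPT2 models ship with a BERT-style tokenizer (repo id assumed).
tokenizer = BertTokenizer.from_pretrained("uer/gpt2-distil-chinese-cluecorpussmall")
model = GPT2LMHeadModel.from_pretrained("uer/gpt2-distil-chinese-cluecorpussmall")

generator = TextGenerationPipeline(model, tokenizer)
# Continue a Chinese prompt ("This happened a long time ago") with sampling.
print(generator("这是很久之前的事情了", max_length=100, do_sample=True))
```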
This project provides 6 Chinese whole word masking RoBERTa models of different scales, pre-trained by UER-py. These models adopt the whole word masking strategy and perform well on Chinese natural language processing tasks. They cover parameter scales from Tiny to Large to fit different compute budgets.
A tokenized (word-segmented) variant of the Chinese RoBERTa medium model pre-trained on the CLUECorpusSmall corpus; tokenization shortens input sequences and improves processing efficiency.
A Chinese RoBERTa model pre-trained on CLUECorpusSmall, with 8 layers and a hidden size of 512.
A Chinese RoBERTa model pre-trained on CLUECorpusSmall, with 8 layers and a hidden size of 512, supporting masked language modeling tasks.
A Chinese RoBERTa medium model pre-trained on CLUECorpusSmall, with an 8-layer network and a 512-dimensional hidden layer, suitable for various Chinese NLP tasks.
A Chinese pre-trained language model based on the RoBERTa architecture, with 8 layers and a hidden size of 512, suitable for various Chinese natural language processing tasks.
One of a series of 24 Chinese RoBERTa models pre-trained on CLUECorpusSmall with the UER-py framework, supporting masked language modeling and text feature extraction.
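The masked-language-model entries above all follow the same fill-mask usage pattern. A sketch assuming the repo id uer/chinese_roberta_L-8_H-512 for the 8-layer/512-hidden model (an assumption):

```python
from transformers import pipeline

# Fill-mask with an 8-layer / 512-hidden Chinese RoBERTa (repo id assumed).
unmasker = pipeline("fill-mask", model="uer/chinese_roberta_L-8_H-512")

# The model predicts the [MASK] token: "The capital of China is Beijing."
for candidate in unmasker("中国的首都是[MASK]京。"):
    print(candidate["token_str"], candidate["score"])
```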
A lightweight Chinese GPT2 model pre-trained on CLUECorpusSmall, with 6 layers/768 hidden units, suitable for Chinese text generation tasks
A Chinese named entity recognition model fine-tuned based on the RoBERTa-Base architecture, specifically optimized for the CLUENER2020 dataset
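A usage sketch for the NER entry, assuming the repo id uer/roberta-base-finetuned-cluener2020-chinese:

```python
from transformers import pipeline

# Token classification over CLUENER2020 labels (repo id assumed).
ner = pipeline(
    "ner",
    model="uer/roberta-base-finetuned-cluener2020-chinese",
    aggregation_strategy="simple",  # merge word pieces into entity spans
)
# "Jiangsu police report a Tesla crashing into a shop" -> company/address spans.
print(ner("江苏警方通报特斯拉冲进店铺"))
```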
GPT2 model specialized for generating Classical Chinese texts, pre-trained on 3 million Classical Chinese sentences
A Chinese news topic classification model fine-tuned based on RoBERTa-Base, specifically designed for categorizing Chinese news texts
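A usage sketch for the news classifier, assuming the repo id uer/roberta-base-finetuned-chinanews-chinese:

```python
from transformers import pipeline

# Single-label topic classification for Chinese news (repo id assumed).
classifier = pipeline(
    "text-classification",
    model="uer/roberta-base-finetuned-chinanews-chinese",
)
# "Beijing held the Two Sessions last month" -> a politics-related label.
print(classifier("北京上个月召开了两会"))
```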
A Chinese sentence embedding model pre-trained with UER-py, used for computing sentence similarity
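A similarity sketch for the sentence embedding entry, assuming the repo id uer/sbert-base-chinese-nli and the sentence-transformers library:

```python
from sentence_transformers import SentenceTransformer, util

# Encode two near-paraphrases ("The weather is great today" / "...not bad today").
model = SentenceTransformer("uer/sbert-base-chinese-nli")  # repo id assumed
embeddings = model.encode(["今天天气真好", "今天天气不错"])

# Cosine similarity between the two sentence vectors; close to 1.0 for paraphrases.
print(util.cos_sim(embeddings[0], embeddings[1]))
```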
A Chinese RoBERTa model pretrained on CLUECorpusSmall, with 8 layers and 512 hidden units, suitable for various Chinese NLP tasks.
A Chinese word-level RoBERTa medium model pretrained on CLUECorpusSmall, outperforming character-level models in multiple tasks
A Chinese RoBERTa model pre-trained on CLUECorpusSmall, featuring 8 layers and 512-dimensional hidden layers, suitable for various Chinese NLP tasks.
Includes 5 Chinese text classification models based on RoBERTa-Base, suitable for sentiment analysis and news classification tasks across different domains
A Chinese extractive QA model based on the RoBERTa architecture, suitable for tasks that extract answers from given texts.
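A usage sketch for the extractive QA entry, assuming the repo id uer/roberta-base-chinese-extractive-qa:

```python
from transformers import pipeline

# Extractive QA: the answer is a span copied out of the context (repo id assumed).
qa = pipeline("question-answering", model="uer/roberta-base-chinese-extractive-qa")

# "Where is the capital of China?" / "The capital of China is Beijing."
result = qa(question="中国的首都是哪里?", context="中国的首都是北京。")
print(result["answer"], result["score"])
```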
UE5-MCP is an AI-integrated automation tool designed to optimize the workflow between Blender and Unreal Engine 5, providing an end-to-end solution from scene generation to game development.
A plugin that exposes the UE Editor as an MCP server, enabling automated iteration by agents
This project is a Python server that connects Claude AI with Unreal Engine 5, enabling the creation, modification, and control of 3D objects and Blueprint actors in the UE5 scene through natural language instructions.
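The three MCP entries above share one shape: a server exposes editor operations as tools an agent can call over the Model Context Protocol. A minimal sketch with the official MCP Python SDK; the tool, the TCP port, and the idea of relaying JSON commands to a UE5-side listener are all assumptions, not any of these projects' actual APIs:

```python
import json
import socket

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("ue5-bridge")  # server name is illustrative

@mcp.tool()
def spawn_actor(blueprint: str, x: float, y: float, z: float) -> str:
    """Ask a hypothetical UE5-side TCP listener to spawn a Blueprint actor."""
    command = {"action": "spawn", "blueprint": blueprint, "location": [x, y, z]}
    with socket.create_connection(("127.0.0.1", 9000)) as conn:  # port assumed
        conn.sendall(json.dumps(command).encode())
        return conn.recv(4096).decode()

if __name__ == "__main__":
    mcp.run()  # speaks MCP over stdio so a client like Claude can call the tool
```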