An AI-powered assistant that empowers Agile teams to write clear user stories.
Alibaba-NLP
A multilingual text encoder from the mGTE series supporting 75 languages with a maximum context length of 8,192 tokens. Built on a BERT+RoPE+GLU architecture, it performs strongly on the GLUE and XTREME-R benchmarks.
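A minimal usage sketch with sentence-transformers, assuming the encoder is published on Hugging Face as Alibaba-NLP/gte-multilingual-base (the exact checkpoint id may differ):

```python
from sentence_transformers import SentenceTransformer, util

# Assumed checkpoint id; mGTE encoders ship a custom BERT+RoPE+GLU
# implementation, so trust_remote_code is typically required.
model = SentenceTransformer("Alibaba-NLP/gte-multilingual-base", trust_remote_code=True)

sentences = [
    "What is the capital of France?",
    "Paris est la capitale de la France.",  # the encoder is multilingual
]
embeddings = model.encode(sentences)

# Cosine similarity between the English query and the French sentence.
print(util.cos_sim(embeddings[0], embeddings[1]))
```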
glamprou
A text classification model based on google/switch-base-8 and fine-tuned on the GLUE MNLI dataset, used primarily for natural language inference.
mjwong
A version of intfloat/e5-base-v2 fine-tuned on the GLUE MNLI and ANLI datasets, suitable for zero-shot classification and natural language inference.
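A hedged sketch of zero-shot classification through the transformers pipeline, assuming the checkpoint id is mjwong/e5-base-v2-mnli-anli (an assumption based on the naming above):

```python
from transformers import pipeline

# Assumed model id for the e5-base-v2 checkpoint fine-tuned on MNLI and ANLI.
classifier = pipeline("zero-shot-classification", model="mjwong/e5-base-v2-mnli-anli")

result = classifier(
    "The new phone ships with a 120 Hz display and a larger battery.",
    candidate_labels=["technology", "sports", "politics"],
)
print(result["labels"][0], result["scores"][0])  # highest-scoring label first
```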
abdulmatinomotoso
An English grammar-checking model based on bert-base-uncased and fine-tuned on GLUE, used for text classification.
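A sketch of calling such a grammar checker through the text-classification pipeline; the model id below is a hypothetical placeholder, since the exact checkpoint name is not given here:

```python
from transformers import pipeline

# Hypothetical checkpoint id, for illustration only.
checker = pipeline("text-classification", model="abdulmatinomotoso/english-grammar-checker")

for sentence in ["She go to school every day.", "She goes to school every day."]:
    print(sentence, "->", checker(sentence)[0])  # {'label': ..., 'score': ...}
```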
muhtasham
A text classification model based on the BERT-tiny architecture and fine-tuned on the GLUE RTE task, used primarily for recognizing textual entailment.
JeremiahZ
A text classification model based on the BERT base model and fine-tuned on the GLUE QQP dataset, used to determine whether two questions are semantically equivalent.
A text classification model based on bert-base-uncased and fine-tuned on the GLUE SST-2 dataset, reaching 93.23% accuracy on sentiment analysis.
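Pair tasks like QQP feed both questions to the model at once; a minimal sketch, assuming a hypothetical checkpoint id of JeremiahZ/bert-base-uncased-qqp:

```python
from transformers import pipeline

# Hypothetical checkpoint id following the usual <user>/<base>-<task> pattern.
detector = pipeline("text-classification", model="JeremiahZ/bert-base-uncased-qqp")

# The pipeline accepts a dict with "text" and "text_pair" for sentence pairs.
result = detector({
    "text": "How do I learn Python?",
    "text_pair": "What is the best way to study Python?",
})
print(result)  # e.g. {'label': 'duplicate', 'score': ...}
```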
cross-encoder
A cross-encoder based on distilroberta-base and trained on the GLUE QNLI dataset, used to determine whether a given passage can answer a specific question.
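A minimal sketch with the sentence-transformers CrossEncoder class, assuming the checkpoint is published as cross-encoder/qnli-distilroberta-base:

```python
from sentence_transformers import CrossEncoder

# Assumed checkpoint id; QNLI cross-encoders score (question, passage) pairs.
model = CrossEncoder("cross-encoder/qnli-distilroberta-base")

scores = model.predict([
    ("How many people live in Berlin?", "Berlin has a population of 3.5 million."),
    ("How many people live in Berlin?", "Berlin is known for its art scene."),
])
print(scores)  # a higher score means the passage more likely answers the question
```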
mrm8488
A DeBERTa-v3-large model fine-tuned on the GLUE MNLI dataset for natural language inference, achieving 90% accuracy on the validation set.
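For NLI, the premise and hypothesis are encoded as a pair and the model scores entailment, neutral, and contradiction; a sketch under the assumption that the checkpoint id is mrm8488/deberta-v3-large-finetuned-mnli (hypothetical naming; the label order is read from the model config rather than hard-coded):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical checkpoint id, for illustration only.
name = "mrm8488/deberta-v3-large-finetuned-mnli"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)[0]

# id2label comes from the model config, so label order is not assumed here.
for i, p in enumerate(probs):
    print(model.config.id2label[i], round(p.item(), 3))
```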
anirudh21
A text classification model based on ALBERT-large-v2 and fine-tuned on the GLUE RTE task, used for recognizing textual entailment.
gchhablani
A text classification model based on bert-base-cased and fine-tuned on the GLUE CoLA dataset, used to judge grammatical acceptability.
A text classification model based on google/fnet-base and fine-tuned on the GLUE CoLA dataset, used to compare the performance of the FNet and BERT architectures.
A text classification model based on bert-base-cased and fine-tuned on the GLUE MNLI dataset, designed for natural language inference.
A text classification model based on DeBERTa-v3-small and fine-tuned on the GLUE SST-2 dataset, performing well on sentiment analysis.
hchc
A text classification model based on the DistilBERT base model and fine-tuned on the GLUE CoLA task, used to judge the grammatical acceptability of English sentences.
A text classification model based on the BERT base model and fine-tuned on the GLUE RTE task.
mattchurgin
A text classification model based on distilbert-base-uncased and fine-tuned on GLUE.
A text classification model based on bert-base-cased and fine-tuned on the GLUE QQP dataset, used to compare the performance of fnet-base and bert-base-cased.
junzai
A text classification model based on bert-base-uncased and fine-tuned on the GLUE MRPC dataset.
mujeensung
A text classification model based on RoBERTa-base and fine-tuned on the GLUE MNLI dataset, performing strongly on natural language inference.
Iceberg MCP is a server that exposes Apache Iceberg catalogs over the MCP protocol, supporting asynchronous operations and logging and compatible with both REST and AWS Glue catalogs.
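A hedged sketch of connecting to such a server from Python with the official MCP client SDK; the launch command and arguments are hypothetical, since they depend on how Iceberg MCP is packaged:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Hypothetical launch command; consult the Iceberg MCP docs for the real one.
server = StdioServerParameters(command="iceberg-mcp", args=["--catalog-type", "rest"])

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()  # catalog/namespace/table tools
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```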