asafaya
Canary-750M is a pre-trained 750-million-parameter Turkish GPT-J model, released as part of the Turkish Data Depository initiative.
autobots
This model has been converted to GPTQ-v2 and GGML formats, supports CPU execution, and is suitable for quantized inference tasks.
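For the GGML variant, CPU inference can be run with a GGML-compatible runtime such as ctransformers. The sketch below is a minimal, illustrative example only; the weights filename "gpt-j-ggml.bin" is a placeholder, not the actual file shipped in this repository.

    # Minimal CPU inference sketch using ctransformers (assumes the package is
    # installed and a GGML GPT-J weights file has been downloaded locally).
    from ctransformers import AutoModelForCausalLM

    # "gpt-j-ggml.bin" is an illustrative filename, not the repository's actual file.
    llm = AutoModelForCausalLM.from_pretrained("gpt-j-ggml.bin", model_type="gptj")
    print(llm("The quick brown fox", max_new_tokens=32))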
heegyu
A 350-million-parameter Korean text generation model based on the GPT-J architecture, suitable for a range of Korean-language generation tasks.
togethercomputer
GPT-JT is a large language model fine-tuned from GPT-J (6B) using UL2 training objectives, and it excels at classification tasks.
architext
Architext GPT-J-162M is a Transformer-based text generation model specialized in the architectural design domain, capable of generating diverse design solutions through natural language prompts.
baffo32
A GPT-J 6B model fine-tuned for Python code generation and programming assistance.
flyhero
GPT-J 6B is a 6-billion-parameter Transformer model with a GPT-3-style architecture, supporting text generation tasks.
VietAI
This is a 6B-parameter Vietnamese causal language model based on the GPT-J architecture, trained specifically on Vietnamese news content.
hivemind
This is an 8-bit quantized version of EleutherAI's 6-billion-parameter GPT-J model, optimized for running and fine-tuning on limited GPU resources (e.g., Colab or a 1080 Ti).
addy88
An 8-bit quantized version of GPT-J-6B converted with the bitsandbytes library; it significantly reduces VRAM requirements and is suitable for single-GPU fine-tuning.
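As a rough illustration of the 8-bit loading path, the sketch below uses the generic transformers + bitsandbytes integration on the base model ID EleutherAI/gpt-j-6b; it is not this repository's own conversion code, just the standard way to hold GPT-J in 8-bit on a single GPU.

    # Generic 8-bit loading sketch via transformers + bitsandbytes + accelerate.
    # This is not the conversion script used for this checkpoint; it shows the
    # common way to load GPT-J with int8 weights on one GPU.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "EleutherAI/gpt-j-6b"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        device_map="auto",   # let accelerate place layers on available devices
        load_in_8bit=True,   # bitsandbytes int8 weights
    )
    inputs = tokenizer("8-bit GPT-J test:", return_tensors="pt").to(model.device)
    print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))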
KoboldAI
A sci-fi/fantasy-themed language model fine-tuned from GPT-J 6B, with roughly 20% additional content from other genres.
NbAiLab
A Norwegian fine-tuned version of GPT-J 6B, a 6-billion-parameter Transformer model.
PhilSad
A text generation model fine-tuned on SCP Wiki data, designed specifically to generate fictional SCP Foundation documentation entries.
Milos
A 405-million-parameter Slovak language generation model based on the GPT-J architecture, trained on a diverse range of text types.
NovelAI
A Japanese causal language model based on GPT-J 6B, fine-tuned on Japanese web novel data.
EleutherAI
GPT-J 6B is a 6-billion-parameter autoregressive language model trained using the Mesh Transformer JAX framework, employing the same tokenizer as GPT-2/3.
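A minimal usage sketch with the Hugging Face transformers library, assuming the hub ID EleutherAI/gpt-j-6b; the tokenizer loaded here is the GPT-2-style BPE tokenizer mentioned above.

    # Minimal text generation sketch for GPT-J 6B with transformers.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6b")
    model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6b")

    inputs = tokenizer("GPT-J was trained on", return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=30, do_sample=True)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))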
Cedille
Boris is an autoregressive language model based on the GPT-J architecture with 6 billion parameters, specializing in French text processing.
A creative story generation model optimized from GPT-J-6B, excelling at interactive fiction-style text generation.