CooperLM-354M
A 354M-parameter GPT-2 model trained on filtered Wikipedia, BookCorpus, and OpenWebText. Includes a 4-bit quantized version for lightweight inference. Built as a toy project to explore end-to-end LLM training with Hugging Face.
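A minimal inference sketch, assuming the checkpoint is published as a standard GPT-2 model on the Hugging Face Hub (the repo id below is a placeholder) and that the 4-bit variant is loaded through bitsandbytes:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Placeholder repo id -- substitute the actual Hub path for CooperLM-354M.
model_id = "your-username/CooperLM-354M"

# 4-bit NF4 quantization via bitsandbytes; cuts memory use for inference.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=quant_config
)

prompt = "The history of the printing press"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Loading the full-precision weights works the same way; just drop the `quantization_config` argument.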