
CooperLM-354M


A 354M-parameter GPT-2-style model trained on filtered Wikipedia, BookCorpus, and OpenWebText. Includes a 4-bit quantized version for lightweight inference. Built as a toy project to explore end-to-end LLM training with Hugging Face.
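Since the card says the model was built with Hugging Face, it should load through the standard `transformers` causal-LM API. A minimal sketch, assuming the repo id `mehta/CooperLM-354M` taken from the URL below; the commented 4-bit path is illustrative and requires `bitsandbytes` plus a CUDA GPU:

```python
# Minimal sketch: load and sample from CooperLM-354M via transformers.
# The repo id "mehta/CooperLM-354M" is assumed from the model URL.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "mehta/CooperLM-354M"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo)

prompt = "The history of artificial intelligence"
inputs = tokenizer(prompt, return_tensors="pt")
# Sample up to 50 new tokens with nucleus sampling.
output = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(output[0], skip_special_tokens=True))

# For the 4-bit quantized path (illustrative; needs bitsandbytes + CUDA):
# from transformers import BitsAndBytesConfig
# model = AutoModelForCausalLM.from_pretrained(
#     repo, quantization_config=BitsAndBytesConfig(load_in_4bit=True))
```

Because sampling is stochastic, the generated continuation will differ between runs; pass `do_sample=False` for deterministic greedy decoding.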

Created: 2025-06-01T05:33:23
Updated: 2025-08-07T08:35:16
https://huggingface.co/mehta/CooperLM-354M
Stars: 1