GPT-TransformerModel-2
An end-to-end PyTorch implementation of OpenAI's GPT-2 (124M) language model, inspired by Karpathy's NanoGPT. Covers core components such as tokenization, multi-head self-attention, transformer blocks, and positional embeddings, along with other key ML concepts.
Created: 2025-04-03T13:23:28
Updated: 2025-05-25T20:14:39
Docs: https://docs.muhammedshah.com/ZeroToHero/GPT-2/
Stars: 0
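
As a rough illustration of the components the description lists, here is a minimal PyTorch sketch of a GPT-2 style causal multi-head self-attention block. The hyperparameters (n_embd=768, n_head=12, block_size=1024) match the 124M configuration, but the class and variable names are hypothetical and not taken from this repository.

```python
# Minimal sketch of GPT-2 style causal multi-head self-attention.
# Hyperparameters follow the 124M configuration; names are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttention(nn.Module):
    def __init__(self, n_embd=768, n_head=12, block_size=1024):
        super().__init__()
        assert n_embd % n_head == 0
        self.n_head = n_head
        self.n_embd = n_embd
        # One linear layer produces queries, keys, and values for all heads at once.
        self.c_attn = nn.Linear(n_embd, 3 * n_embd)
        self.c_proj = nn.Linear(n_embd, n_embd)
        # Lower-triangular mask so each position attends only to earlier positions.
        self.register_buffer(
            "mask",
            torch.tril(torch.ones(block_size, block_size))
                 .view(1, 1, block_size, block_size),
        )

    def forward(self, x):
        B, T, C = x.size()  # batch, sequence length, embedding dim
        q, k, v = self.c_attn(x).split(self.n_embd, dim=2)
        # Reshape to (B, n_head, T, head_dim) so each head attends independently.
        hd = C // self.n_head
        q = q.view(B, T, self.n_head, hd).transpose(1, 2)
        k = k.view(B, T, self.n_head, hd).transpose(1, 2)
        v = v.view(B, T, self.n_head, hd).transpose(1, 2)
        # Scaled dot-product attention with the causal mask applied.
        att = (q @ k.transpose(-2, -1)) / (hd ** 0.5)
        att = att.masked_fill(self.mask[:, :, :T, :T] == 0, float("-inf"))
        att = F.softmax(att, dim=-1)
        y = att @ v                                       # (B, n_head, T, head_dim)
        y = y.transpose(1, 2).contiguous().view(B, T, C)  # re-assemble the heads
        return self.c_proj(y)

if __name__ == "__main__":
    x = torch.randn(2, 16, 768)               # (batch, tokens, n_embd)
    print(CausalSelfAttention()(x).shape)     # torch.Size([2, 16, 768])
```

In a full transformer block this module would sit alongside a layer norm, a residual connection, and an MLP, stacked 12 times for the 124M model; see the linked docs for the repository's actual implementation.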