GPT-TransformerModel-2
An end-to-end PyTorch implementation of a GPT-2 style language model (124M parameters), following the architecture released by OpenAI and inspired by Karpathy's nanoGPT. Covers core components such as tokenization, multi-head self-attention, transformer blocks, and positional embeddings, along with other key ML concepts.
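For orientation, here is a minimal sketch of the multi-head causal self-attention block at the heart of a GPT-2 style model. Hyperparameter names like n_embd, n_head, and block_size follow nanoGPT conventions; this is an illustrative sketch, not the repo's exact code.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttention(nn.Module):
    """Multi-head causal self-attention, GPT-2 style (illustrative sketch)."""

    def __init__(self, n_embd: int = 768, n_head: int = 12, block_size: int = 1024):
        super().__init__()
        assert n_embd % n_head == 0
        self.n_head = n_head
        self.n_embd = n_embd
        # One linear layer produces queries, keys, and values for all heads at once.
        self.c_attn = nn.Linear(n_embd, 3 * n_embd)
        self.c_proj = nn.Linear(n_embd, n_embd)
        # Lower-triangular mask enforces causality: a token attends only to its past.
        mask = torch.tril(torch.ones(block_size, block_size))
        self.register_buffer("mask", mask.view(1, 1, block_size, block_size))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.size()  # batch size, sequence length, embedding dim
        q, k, v = self.c_attn(x).split(self.n_embd, dim=2)
        # Reshape to (B, n_head, T, head_dim) so each head attends independently.
        hd = C // self.n_head
        q = q.view(B, T, self.n_head, hd).transpose(1, 2)
        k = k.view(B, T, self.n_head, hd).transpose(1, 2)
        v = v.view(B, T, self.n_head, hd).transpose(1, 2)
        # Scaled dot-product attention with the causal mask applied before softmax.
        att = (q @ k.transpose(-2, -1)) / math.sqrt(hd)
        att = att.masked_fill(self.mask[:, :, :T, :T] == 0, float("-inf"))
        att = F.softmax(att, dim=-1)
        y = att @ v
        y = y.transpose(1, 2).contiguous().view(B, T, C)  # merge heads back together
        return self.c_proj(y)

# Usage: a (batch, seq_len, n_embd) tensor in, same shape out.
x = torch.randn(2, 16, 768)
print(CausalSelfAttention()(x).shape)  # torch.Size([2, 16, 768])
```

In the full model, a block like this is stacked with a layer norm and an MLP inside each transformer block, and token plus positional embeddings feed the first block's input.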