

AI Tutorial

GPT-TransformerModel-2

Public

An end-to-end PyTorch implementation of a GPT-2-style language model (the 124M-parameter variant released by OpenAI), inspired by Karpathy's NanoGPT. It covers core components such as tokenization, multi-head self-attention, transformer blocks, and positional embeddings, along with various other key ML concepts.

Created: 2025-04-03T13:23:28
Updated: 2025-05-25T20:14:39
https://docs.muhammedshah.com/ZeroToHero/GPT-2/
Stars: 0
Stars increase: 0
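
The description names the core pieces of the model: multi-head self-attention, transformer blocks, and the embedding layers around them. As a rough illustration of what a GPT-2-style block looks like, here is a minimal PyTorch sketch. This is not the repository's actual code; class and parameter names such as `CausalSelfAttention`, `n_embd`, and `n_head` are illustrative, with defaults chosen to match the 124M GPT-2 configuration.

```python
# Minimal sketch of a GPT-2-style transformer block (illustrative, not the
# repository's code). Defaults follow the 124M config: n_embd=768, n_head=12.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalSelfAttention(nn.Module):
    def __init__(self, n_embd: int = 768, n_head: int = 12, block_size: int = 1024):
        super().__init__()
        assert n_embd % n_head == 0
        self.n_head = n_head
        # One projection produces queries, keys, and values for all heads at once.
        self.c_attn = nn.Linear(n_embd, 3 * n_embd)
        self.c_proj = nn.Linear(n_embd, n_embd)
        # Causal mask: position t may only attend to positions <= t.
        mask = torch.tril(torch.ones(block_size, block_size))
        self.register_buffer("mask", mask.view(1, 1, block_size, block_size))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.shape
        q, k, v = self.c_attn(x).split(C, dim=2)
        # Reshape to (B, n_head, T, head_dim) so each head attends independently.
        q = q.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        k = k.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        v = v.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        # Scaled dot-product attention with the causal mask applied.
        att = (q @ k.transpose(-2, -1)) / math.sqrt(k.size(-1))
        att = att.masked_fill(self.mask[:, :, :T, :T] == 0, float("-inf"))
        att = F.softmax(att, dim=-1)
        y = (att @ v).transpose(1, 2).contiguous().view(B, T, C)
        return self.c_proj(y)


class Block(nn.Module):
    """Pre-norm transformer block: attention + MLP, each with a residual path."""

    def __init__(self, n_embd: int = 768, n_head: int = 12):
        super().__init__()
        self.ln_1 = nn.LayerNorm(n_embd)
        self.attn = CausalSelfAttention(n_embd, n_head)
        self.ln_2 = nn.LayerNorm(n_embd)
        self.mlp = nn.Sequential(
            nn.Linear(n_embd, 4 * n_embd), nn.GELU(), nn.Linear(4 * n_embd, n_embd)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x + self.attn(self.ln_1(x))
        x = x + self.mlp(self.ln_2(x))
        return x
```

A block can be exercised with `Block()(torch.randn(1, 8, 768))`, which returns a tensor of the same shape; a full GPT-2 model stacks twelve such blocks between the token/positional embeddings and the output head.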