
GPT-2-B200-pre-trainier

Public

Code for pre-training a GPT-2 model on eight NVIDIA B200 GPUs (an NVIDIA DGX B200 system), plus a short tutorial on the topic. Uses PyTorch and Hugging Face Transformers. It can pre-train GPT-2 Small on 32 GB of text data in around 2.5 hours, and it also handles dataset tokenization.
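Below is a minimal sketch of the kind of pipeline the description implies: tokenize a raw-text corpus, build a GPT-2 Small model from scratch, and train it with the Hugging Face Trainer. The file name corpus.txt and all hyperparameters are illustrative assumptions, not values taken from this repository.

```python
from datasets import load_dataset
from transformers import (
    GPT2Config,
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Reuse the standard GPT-2 tokenizer; only the model weights
# are trained from scratch.
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token

# Hypothetical line-delimited text corpus; replace with your own data.
dataset = load_dataset("text", data_files={"train": "corpus.txt"})

def tokenize(batch):
    # Truncate each example to GPT-2's 1024-token context window.
    return tokenizer(batch["text"], truncation=True, max_length=1024)

tokenized = dataset["train"].map(
    tokenize, batched=True, remove_columns=["text"]
)

# Default GPT2Config is GPT-2 Small: 12 layers, 768 hidden, 12 heads.
model = GPT2LMHeadModel(GPT2Config())

args = TrainingArguments(
    output_dir="gpt2-pretrained",
    per_device_train_batch_size=32,  # illustrative; tune to GPU memory
    num_train_epochs=1,
    bf16=True,  # B200-class GPUs support bfloat16
    logging_steps=100,
    save_steps=1000,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    # Causal LM: mlm=False produces next-token-prediction labels.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)

trainer.train()
```

To use all eight GPUs, a script like this would typically be launched with torchrun --nproc_per_node=8 train.py, which lets the Trainer run distributed data-parallel training across the node.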

Created: 2025-09-07T04:56:05
Updated: 2025-09-07T05:00:12
https://en-wojtekb30.blogspot.com/2025/08/pre-training-gpt-2-ai-model-on-32-gb-of.html
Stars: 0
