finetune-gpt2xl
Guide: Finetune GPT2-XL (1.5 billion parameters) and GPT-NEO (2.7B) on a single GPU with Huggingface Transformers using DeepSpeed
Created: 2021-03-26T20:59:15
Updated: 2025-03-17T05:15:35
Stars: 437