Transformer-Adaptation-Playbook

An empirical study of Transformer adaptation techniques: pre-training from scratch with masked language modeling (MLM), classic full fine-tuning, and from-scratch implementations of parameter-efficient fine-tuning (PEFT) methods (LoRA, Adapters). Both encoder (BERT) and decoder (OPT) models are tuned.
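The LoRA method named above adds a trainable low-rank update to a frozen pretrained weight. As a minimal sketch of that idea (not the repository's actual code; layer sizes, rank, and scaling here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out, r, alpha = 16, 16, 4, 8  # hypothetical dimensions, rank, and scale

W = rng.normal(size=(d_out, d_in))     # frozen pretrained weight (not trained)
A = rng.normal(size=(r, d_in)) * 0.01  # trainable low-rank down-projection
B = np.zeros((d_out, r))               # trainable up-projection, zero-initialized

def lora_forward(x):
    # base path plus scaled low-rank update: y = W x + (alpha / r) * B A x
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=d_in)
# because B starts at zero, the adapted layer initially matches the frozen base
assert np.allclose(lora_forward(x), W @ x)
```

Zero-initializing `B` is the standard LoRA trick: at the start of fine-tuning the model behaves exactly like the pretrained one, and only the small `A` and `B` matrices (rather than all of `W`) receive gradient updates.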

Created: 2025-09-03T15:18:41
Updated: 2025-09-03T19:00:08