ATPapers
Worth-reading papers and related resources on attention mechanisms, Transformers, and pretrained language models (PLMs) such as BERT.
Created: 2019-11-02T17:19:02
Updated: 2024-12-20T12:14:43
Stars: 131