
TUPE

Public

Transformer with Untied Positional Encoding (TUPE): code for the paper "Rethinking Positional Encoding in Language Pre-training". TUPE can be used to improve existing pre-trained models such as BERT.
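The core idea of TUPE is to "untie" word and position correlations in self-attention: instead of adding positional embeddings to word embeddings before projection, the attention score is the sum of a content-to-content term and a position-to-position term, each with its own projection matrices. Below is a minimal NumPy sketch of that scoring rule; the function and variable names are hypothetical, not taken from the repository.

```python
import numpy as np

def tupe_attention_scores(x, p, Wq, Wk, Uq, Uk):
    """Untied attention scores in the style of TUPE: word-to-word and
    position-to-position terms use separate projections and are summed,
    rather than adding positions into the input embeddings."""
    d = Wq.shape[1]
    # Content-to-content term (word embeddings only).
    content = (x @ Wq) @ (x @ Wk).T
    # Position-to-position term, with its own projections Uq / Uk.
    position = (p @ Uq) @ (p @ Uk).T
    # Each term contributes variance, so the sum is scaled by sqrt(2d).
    return (content + position) / np.sqrt(2 * d)

rng = np.random.default_rng(0)
n, d = 4, 8
x = rng.standard_normal((n, d))   # word embeddings for n tokens
p = rng.standard_normal((n, d))   # absolute positional embeddings
Wq, Wk, Uq, Uk = (rng.standard_normal((d, d)) for _ in range(4))
scores = tupe_attention_scores(x, p, Wq, Wk, Uq, Uk)
print(scores.shape)  # (4, 4)
```

Because the positional term does not depend on the words, it can be computed once per layer and shared across heads' content terms, which is part of why the scheme is cheap to add to an existing Transformer.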

Created: 2020-06-24T10:30:16
Updated: 2024-10-30T15:42:23
Stars: 251
Stars increase: 0
