Attention-Various-Positional-Encoding
This project implements the Scaled Dot-Product Attention layer and the Multi-Head Attention layer with various positional encoding methods.
Topics: attention-mechanism, gensim, multi-head-attention, natural-language-processing, nlp, nltk, pytorch, relative-positional-encoding, relative-positional-representation, scaled-dot-product
Created: 2022-05-13T20:59:00
Updated: 2024-11-19T00:45:32
Stars: 5
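Since the repository's source is not reproduced here, the following is a minimal sketch of the two techniques named in the description: scaled dot-product attention and a sinusoidal positional encoding (one common choice among the "various" methods). Class and function names, tensor shapes, and defaults are assumptions for illustration, not the repository's actual API.

```python
# Sketch of scaled dot-product attention + sinusoidal positional encoding
# in PyTorch. Illustrative only; names and shapes are assumptions.
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class SinusoidalPositionalEncoding(nn.Module):
    """Fixed sine/cosine position signals (Vaswani et al., 2017)."""

    def __init__(self, d_model: int, max_len: int = 5000):
        super().__init__()
        position = torch.arange(max_len).unsqueeze(1)            # (max_len, 1)
        div_term = torch.exp(
            torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model)
        )
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(position * div_term)             # even dims
        pe[:, 1::2] = torch.cos(position * div_term)             # odd dims
        self.register_buffer("pe", pe)                           # fixed, not trained

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); position table broadcasts over batch
        return x + self.pe[: x.size(1)]


def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)            # (..., seq, seq)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)
    return weights @ v, weights


if __name__ == "__main__":
    batch, seq_len, d_model = 2, 10, 64
    x = SinusoidalPositionalEncoding(d_model)(torch.randn(batch, seq_len, d_model))
    out, attn = scaled_dot_product_attention(x, x, x)  # single-head self-attention
    print(out.shape, attn.shape)                       # (2, 10, 64), (2, 10, 10)
```

A multi-head layer is built on top of this by projecting queries, keys, and values, splitting d_model into per-head slices, and running the same attention per head.

The topic tags also list relative-positional-encoding and relative-positional-representation, which points to the approach of Shaw et al. (2018): instead of adding a position signal to the inputs, a learned embedding per clipped relative offset j - i is dotted with the queries and added to the Q K^T scores before the softmax. Below is a hedged sketch of that score term; the names here are hypothetical as well.

```python
# Sketch of relative positional representations (Shaw et al., 2018).
import math

import torch
import torch.nn as nn


class RelativePositionKeys(nn.Module):
    """Learned key embeddings indexed by relative offset, clipped to ±max_distance."""

    def __init__(self, d_head: int, max_distance: int = 16):
        super().__init__()
        self.max_distance = max_distance
        # one vector per offset in [-max_distance, max_distance]
        self.embeddings = nn.Embedding(2 * max_distance + 1, d_head)

    def forward(self, seq_len: int) -> torch.Tensor:
        pos = torch.arange(seq_len)
        rel = pos[None, :] - pos[:, None]                        # offsets j - i
        rel = rel.clamp(-self.max_distance, self.max_distance) + self.max_distance
        return self.embeddings(rel)                              # (seq, seq, d_head)


def relative_score_term(q: torch.Tensor, rel_k: torch.Tensor) -> torch.Tensor:
    """Per-pair bias q_i · a_ij, scaled like the main Q K^T term."""
    # q: (batch, heads, seq, d_head); rel_k: (seq, seq, d_head)
    return torch.einsum("bhid,ijd->bhij", q, rel_k) / math.sqrt(q.size(-1))
```

In a multi-head layer the offset table is typically shared across heads, and the resulting (batch, heads, seq, seq) term is added to the scaled Q K^T scores before the softmax.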