linear-attention-transformer
Transformer based on a variant of attention whose complexity is linear with respect to sequence length
Created: 2020-06-05T05:34:56
Updated: 2025-03-18T20:45:33
Stars: 794 (+1)
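A minimal sketch of the linear attention idea the description refers to: instead of materializing the full softmax(QK^T) matrix (quadratic in sequence length n), a feature map is applied to Q and K and the K^T V summary is computed first, making the cost linear in n. The softmax-based feature maps and tensor shapes below are illustrative assumptions, not necessarily the repository's exact implementation.

```python
import torch

def linear_attention(q, k, v):
    # q, k, v: (batch, heads, seq_len, dim_head)
    # Feature maps (one common choice; the repo may use a different variant):
    q = q.softmax(dim=-1)   # normalize over the feature dimension
    k = k.softmax(dim=-2)   # normalize over the sequence dimension
    # Summarize keys and values into a (dim_head x dim_head) context, O(n)
    context = torch.einsum('bhnd,bhne->bhde', k, v)
    # Read the context out per query position, also O(n)
    return torch.einsum('bhnd,bhde->bhne', q, context)

# Usage example
q = k = v = torch.randn(1, 8, 1024, 64)
out = linear_attention(q, k, v)
print(out.shape)  # torch.Size([1, 8, 1024, 64])
```

Because the sequence dimension is contracted away before the query readout, memory and compute grow linearly with sequence length rather than quadratically.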