
Linear-Attention-Recurrent-Neural-Network


A recurrent attention module (LARNN) consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. Its formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can be used inside a loop over the cell state, just like any other RNN cell.
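A minimal sketch of the idea in PyTorch, not the repo's actual API: an LSTM cell whose new cell state attends over a window of its own past cell states. Class and parameter names (`WindowedAttentionLSTMCell`, `window_size`) are illustrative assumptions, and the residual blend of the attended memory into the cell state is one plausible combination rule, not necessarily the one LARNN uses.

```python
# Hypothetical sketch of a LARNN-style cell: an LSTMCell plus windowed
# multi-head attention over its own past cell states. Names are illustrative.
import torch
import torch.nn as nn


class WindowedAttentionLSTMCell(nn.Module):
    def __init__(self, input_size, hidden_size, num_heads=4, window_size=16):
        super().__init__()
        self.lstm_cell = nn.LSTMCell(input_size, hidden_size)
        self.attention = nn.MultiheadAttention(
            hidden_size, num_heads, batch_first=True
        )
        self.window_size = window_size

    def forward(self, x, state, past_cells):
        # past_cells: list of earlier cell states, each (batch, hidden_size)
        h, c = self.lstm_cell(x, state)
        # Attend over the last `window_size` cell states plus the current one.
        window = past_cells[-self.window_size:] + [c]
        memory = torch.stack(window, dim=1)   # (batch, window, hidden)
        query = h.unsqueeze(1)                # (batch, 1, hidden)
        attended, _ = self.attention(query, memory, memory)
        # Blend the attended memory back into the cell state (an assumption).
        c = c + attended.squeeze(1)
        return h, c


# Usage: loop over time steps just like any other RNN cell.
cell = WindowedAttentionLSTMCell(input_size=8, hidden_size=32)
x_seq = torch.randn(4, 10, 8)                 # (batch, time, features)
h = torch.zeros(4, 32)
c = torch.zeros(4, 32)
past_cells = []
for t in range(x_seq.size(1)):
    h, c = cell(x_seq[:, t], (h, c), past_cells)
    past_cells.append(c)
```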

Created: 2018-05-03T11:46:15
Updated: 2024-11-13T21:24:12
Homepage: https://www.neuraxio.com/
Stars: 145 (0 new)
