# ALSTM

  • ALSTM adds a temporal attentive aggregation layer on top of a standard LSTM.

  • Paper: A dual-stage attention-based recurrent neural network for time series prediction.

    https://www.ijcai.org/Proceedings/2017/0366.pdf

  • NOTE: The current implementation is a simplified version of ALSTM: an LSTM with an attention layer (see the sketch below).
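
The following is a minimal sketch of what "an LSTM with attention" can look like, assuming PyTorch; class and layer names here are illustrative and not the repository's actual implementation. Each time step's hidden state is scored by a small attention network, the states are summed with softmax weights, and the summary is combined with the last hidden state for prediction.

```python
import torch
import torch.nn as nn


class ALSTMSketch(nn.Module):
    """Illustrative LSTM with temporal attentive aggregation (not the repo's code)."""

    def __init__(self, d_feat=6, hidden_size=64, num_layers=2, dropout=0.0):
        super().__init__()
        self.rnn = nn.LSTM(
            input_size=d_feat,
            hidden_size=hidden_size,
            num_layers=num_layers,
            batch_first=True,
            dropout=dropout,
        )
        # Attention network: produces one score per time step.
        self.att_net = nn.Sequential(
            nn.Linear(hidden_size, hidden_size),
            nn.Tanh(),
            nn.Linear(hidden_size, 1),
        )
        # Combine the last hidden state with the attention summary.
        self.fc_out = nn.Linear(hidden_size * 2, 1)

    def forward(self, x):
        # x: [batch, seq_len, d_feat]
        rnn_out, _ = self.rnn(x)                                  # [batch, seq_len, hidden]
        att_score = torch.softmax(self.att_net(rnn_out), dim=1)   # [batch, seq_len, 1]
        att_out = torch.sum(rnn_out * att_score, dim=1)           # weighted sum over time
        out = self.fc_out(torch.cat([rnn_out[:, -1, :], att_out], dim=1))
        return out.squeeze(-1)
```

Usage is the same as any sequence model: feed a `[batch, seq_len, d_feat]` tensor and receive one prediction per sample, e.g. `ALSTMSketch()(torch.randn(32, 60, 6))`.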