This is an implementation of Bahdanau (additive) attention.

You can use it just like any other `tensorflow.python.keras.layers` object:
```python
from attention_keras.layers.attention import AttentionLayer

attn_layer = AttentionLayer(name='attention_layer')
attn_out, attn_states = attn_layer([encoder_outputs, decoder_outputs])
```

Here,
- `encoder_outputs` - Sequence of encoder outputs returned by the RNN/LSTM/GRU (i.e. with `return_sequences=True`)
- `decoder_outputs` - The above for the decoder
- `attn_out` - Output context vector sequence for the decoder. This is to be concatenated with the decoder outputs before the final prediction layer (refer to `model/nmt.py` for more details; a sketch follows below)
- `attn_states` - Energy values, useful if you would like to generate a heat map of the attention (refer to `train.ipynb` for usage)
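
For context, below is a minimal sketch of how the layer might be wired into an encoder-decoder model. The dimensions (`hidden_dim`, vocabulary sizes, timesteps) and the use of one-hot encoded inputs are illustrative assumptions only; refer to `model/nmt.py` for the actual model used in this repository.

```python
from tensorflow.keras.layers import Input, GRU, Concatenate, Dense, TimeDistributed
from tensorflow.keras.models import Model
from attention_keras.layers.attention import AttentionLayer

# Hypothetical dimensions, for illustration only
enc_timesteps, dec_timesteps = 20, 20
enc_vocab, dec_vocab = 300, 300
hidden_dim = 64

# Encoder: return the full output sequence so the attention layer can attend over it
encoder_inputs = Input(shape=(enc_timesteps, enc_vocab), name='encoder_inputs')
encoder_gru = GRU(hidden_dim, return_sequences=True, return_state=True, name='encoder_gru')
encoder_outputs, encoder_state = encoder_gru(encoder_inputs)

# Decoder: also returns the full output sequence, initialized with the encoder state
decoder_inputs = Input(shape=(dec_timesteps, dec_vocab), name='decoder_inputs')
decoder_gru = GRU(hidden_dim, return_sequences=True, return_state=True, name='decoder_gru')
decoder_outputs, _ = decoder_gru(decoder_inputs, initial_state=encoder_state)

# Attention over the encoder outputs, queried by the decoder outputs
attn_layer = AttentionLayer(name='attention_layer')
attn_out, attn_states = attn_layer([encoder_outputs, decoder_outputs])

# Concatenate the context vectors with the decoder outputs before the softmax
decoder_concat = Concatenate(axis=-1, name='concat_layer')([decoder_outputs, attn_out])
decoder_pred = TimeDistributed(Dense(dec_vocab, activation='softmax'),
                               name='softmax_layer')(decoder_concat)

model = Model(inputs=[encoder_inputs, decoder_inputs], outputs=decoder_pred)
model.compile(optimizer='adam', loss='categorical_crossentropy')
```

The second output, `attn_states`, can then be retrieved from the trained model to plot attention heat maps, as demonstrated in `train.ipynb`.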