Hi, first of all, thank you for such amazing code. It gets the work done if someone goes through the seq2seq tutorial first.
If we want to incorporate the concat attention mechanism, how would we handle it across batches without a for loop? A rough sketch of what I mean is below.
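To illustrate the question, here is a minimal sketch of batched concat (Bahdanau-style additive) attention that avoids the per-timestep Python loop by broadcasting the decoder state over the source length. The module name, tensor shapes, and argument names (`ConcatAttention`, `decoder_hidden`, `encoder_outputs`, `hidden_size`) are my own assumptions, not identifiers from this repo:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConcatAttention(nn.Module):
    """Batched 'concat' (additive) attention, no explicit loop over source positions."""
    def __init__(self, hidden_size):
        super().__init__()
        self.attn = nn.Linear(hidden_size * 2, hidden_size)
        self.v = nn.Linear(hidden_size, 1, bias=False)

    def forward(self, decoder_hidden, encoder_outputs):
        # decoder_hidden:  (batch, hidden)          -- current decoder state (assumed shape)
        # encoder_outputs: (batch, src_len, hidden) -- all encoder states (assumed shape)
        src_len = encoder_outputs.size(1)
        # Repeat the decoder state across the source length so it can be
        # concatenated with every encoder state in one shot.
        hidden = decoder_hidden.unsqueeze(1).expand(-1, src_len, -1)              # (batch, src_len, hidden)
        energy = torch.tanh(self.attn(torch.cat((hidden, encoder_outputs), dim=2)))  # (batch, src_len, hidden)
        scores = self.v(energy).squeeze(2)                                        # (batch, src_len)
        weights = F.softmax(scores, dim=1)                                        # (batch, src_len)
        # Weighted sum of encoder outputs -> one context vector per batch element.
        context = torch.bmm(weights.unsqueeze(1), encoder_outputs)                # (batch, 1, hidden)
        return context.squeeze(1), weights
```

Is broadcasting like this the intended way to do it here, or is there a reason the tutorial-style per-step loop is preferred?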