Releases: lucidrains/memorizing-transformers-pytorch
0.4.1
17 Jul 00:08
address https://github.com/lucidrains/memorizing-transformers-pytorch…
0.4.0
24 Mar 18:27
prepare to use knn attention in another repository, for the ultimate …
0.3.10
30 Nov 05:25
0.3.9a
09 Nov 22:09
0.3.9
09 Nov 22:01
address https://github.com/lucidrains/memorizing-transformers-pytorch…
0.3.8
09 Nov 21:46
use the new einops unpack! thank you @arogozhnikov 🙏
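This release note refers to the pack/unpack API added in einops 0.6. A minimal sketch of how the pair is typically used to merge and later re-split tensors along one axis (tensor names here are illustrative, not the repository's actual code):

```python
import torch
from einops import pack, unpack

# pack two tensors along a variable axis, remembering their original shapes
local_tokens = torch.randn(2, 128, 512)
memory_tokens = torch.randn(2, 32, 512)

packed, packed_shapes = pack([local_tokens, memory_tokens], 'b * d')  # -> (2, 160, 512)

# ... any operation over the packed sequence would go here ...

# unpack restores the original split using the remembered shapes
local_tokens, memory_tokens = unpack(packed, packed_shapes, 'b * d')
```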
0.3.7
23 Apr 22:43
just give knn attention its own relative positional bias
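The note says the KNN attention layer received its own relative positional bias rather than sharing one. The repository likely uses a T5-style bucketed bias; the following is only a minimal, non-bucketed sketch of the general idea, with illustrative names:

```python
import torch
from torch import nn

class RelativePositionBias(nn.Module):
    # one learned scalar per head per relative offset, added to attention logits
    def __init__(self, heads, max_distance = 128):
        super().__init__()
        self.max_distance = max_distance
        self.bias = nn.Embedding(2 * max_distance + 1, heads)

    def forward(self, seq_len, device):
        pos = torch.arange(seq_len, device = device)
        rel = pos[None, :] - pos[:, None]                                  # (i, j) relative offsets
        rel = rel.clamp(-self.max_distance, self.max_distance) + self.max_distance
        bias = self.bias(rel)                                              # (i, j, heads)
        return bias.permute(2, 0, 1)                                       # (heads, i, j)

# usage sketch: sim = sim + rel_pos_bias(seq_len, sim.device) before the softmax
```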
0.3.6
23 Apr 22:18
give knn attention layer one more way to tune out local if need be
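One plausible reading of this note is a learned per-head gate that blends local attention output with the KNN memory attention output, letting the network down-weight local attention where memory is more useful. A minimal sketch under that assumption (class and parameter names are hypothetical):

```python
import torch
from torch import nn

class LocalMemoryGate(nn.Module):
    # learned per-head sigmoid gate mixing local and memory (knn) attention outputs
    def __init__(self, heads):
        super().__init__()
        # negative init keeps the gate near 0 at the start, favoring local attention
        self.gate_bias = nn.Parameter(torch.full((heads, 1, 1), -2.0))

    def forward(self, local_out, mem_out):
        # local_out, mem_out: (batch, heads, seq, dim_head)
        gate = self.gate_bias.sigmoid()            # (heads, 1, 1), broadcasts over batch, seq, dim
        return local_out * (1 - gate) + mem_out * gate
```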
0.3.5
23 Apr 21:57
allow the network to pay more attention to memory later into training…
0.3.4
23 Apr 21:36
turn KNN attention into full cosine sim attention (from the paper que…
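Cosine similarity attention l2-normalizes queries and keys so their dot product is a cosine similarity, then applies a temperature in place of the usual 1/sqrt(dim) scaling. A minimal sketch of that technique (not the repository's exact implementation; the scale here is a fixed placeholder, whereas a learned temperature is also common):

```python
import torch
import torch.nn.functional as F

def cosine_sim_attention(q, k, v, scale = 10.0):
    # q, k, v: (batch, heads, seq, dim_head)
    # normalize queries and keys so the dot product becomes cosine similarity
    q, k = map(lambda t: F.normalize(t, dim = -1), (q, k))
    sim = torch.einsum('b h i d, b h j d -> b h i j', q, k) * scale
    attn = sim.softmax(dim = -1)
    return torch.einsum('b h i j, b h j d -> b h i d', attn, v)
```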