
Commit aaad004

Author: Danish Pruthi (committed)
Commit message: allowed for sparse gradient updates
1 parent: fa6237a

File tree

1 file changed: +2 −2 lines changed


04-efficiency-pytorch/wordemb-skip-ns.py

Lines changed: 2 additions & 2 deletions
@@ -12,11 +12,11 @@ def __init__(self, nwords, emb_size):
         super(WordEmbSkip, self).__init__()
 
         """ word embeddings """
-        self.word_embedding = torch.nn.Embedding(nwords, emb_size)
+        self.word_embedding = torch.nn.Embedding(nwords, emb_size, sparse=True)
         # initialize the weights with xavier uniform (Glorot, X. & Bengio, Y. (2010))
         torch.nn.init.xavier_uniform_(self.word_embedding.weight)
         """ context embeddings """
-        self.context_embedding = torch.nn.Embedding(nwords, emb_size)
+        self.context_embedding = torch.nn.Embedding(nwords, emb_size, sparse=True)
         # initialize the weights with xavier uniform (Glorot, X. & Bengio, Y. (2010))
         torch.nn.init.xavier_uniform_(self.context_embedding.weight)
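For context on what this change does: passing `sparse=True` to `torch.nn.Embedding` makes the backward pass produce a sparse gradient tensor that covers only the rows actually looked up, instead of a dense gradient over the whole vocabulary. The sketch below (not part of the repository's code; the sizes are arbitrary) illustrates the effect; note that sparse gradients work with `optim.SGD` and `optim.SparseAdam`, but not with plain `optim.Adam`.

```python
import torch

# Small embedding table with sparse gradients enabled, as in the commit.
emb = torch.nn.Embedding(10, 4, sparse=True)

# Look up a few word ids and backprop a scalar through them.
ids = torch.tensor([1, 3, 3])
emb(ids).sum().backward()

# The gradient is a sparse tensor touching only the looked-up rows.
assert emb.weight.grad.is_sparse

# SparseAdam is designed for exactly this kind of sparse gradient.
opt = torch.optim.SparseAdam(emb.parameters(), lr=0.1)
opt.step()
```

For a skip-gram model over a large vocabulary, only a handful of word and context rows participate in each minibatch, so sparse updates avoid materializing and applying a full vocabulary-sized gradient at every step.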

0 commit comments
