Description
Learner nParams: 32901
```
Traceback (most recent call last):
  File "main.py", line 38, in <module>
    results = importlib.import_module(opt['metaLearner']).run(opt, data)
  File "/home/i/meta/FewShotLearning/model/lstm/train-lstm.py", line 121, in run
    opt['batchSize'][opt['nTrainShot']])
  File "/home/i/conda/envs/py27/lib/python2.7/site-packages/torch/nn/modules/module.py", line 325, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/i/meta/FewShotLearning/model/lstm/metaLearner.py", line 174, in forward
    torch.autograd.grad(loss, self.lstm2.parameters())
  File "/home/i/conda/envs/py27/lib/python2.7/site-packages/torch/autograd/__init__.py", line 158, in grad
    inputs, only_inputs, allow_unused)
RuntimeError: One of the differentiated Variables appears to not have been used in the graph
```
I ran your code in the py27 env, but this error occurs and I don't know what is going wrong. Could you help?
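In case it helps with diagnosing this: the RuntimeError usually means that some of the variables passed as `inputs` to `torch.autograd.grad` (here, `self.lstm2.parameters()`) never contributed to the computation of `loss`. Below is a minimal sketch, not taken from this repository and written against a recent PyTorch API, that reproduces the same message with two hypothetical modules and shows the `allow_unused=True` option that already appears in the traceback:

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the two sub-modules in metaLearner.py;
# these are NOT the repo's actual lstm/lstm2 definitions.
lstm1 = nn.Linear(4, 4)
lstm2 = nn.Linear(4, 4)

x = torch.randn(2, 4)
loss = lstm1(x).sum()  # loss depends only on lstm1's parameters

# This line reproduces the error, because lstm2's parameters
# were never used to compute loss:
# torch.autograd.grad(loss, lstm2.parameters())

# With allow_unused=True, grad() returns None for the unused
# parameters instead of raising the RuntimeError.
grads = torch.autograd.grad(loss, lstm2.parameters(), allow_unused=True)
print(grads)  # (None, None)
```

Whether passing `allow_unused=True` is the right fix here, or whether `loss` in `metaLearner.py` is simply not being built from `self.lstm2`'s output as intended, is exactly what I am unsure about.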