**System information**
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Linux Ubuntu 16.04
- TensorFlow version and how it was installed (source or binary): 2.1.0 binary
- TensorFlow-Addons version and how it was installed (source or binary): 0.9.1 binary
- Python version: 3.6
- Is GPU used? (yes/no): yes
**Describe the bug**

When I use `tfa.losses.triplet_hard_loss` as the loss in non-eager mode, training fails with ``ValueError: An operation has `None` for gradient``. Using `categorical_crossentropy` as the loss works fine.
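For context, here is a rough NumPy sketch (my own approximation, not the TFA implementation) of what batch-hard triplet loss computes: for each anchor, take the farthest same-label embedding and the closest different-label embedding, then apply a hinge with a margin.

```python
import numpy as np

def triplet_hard_loss_np(labels, embeddings, margin=1.0):
    """Approximate batch-hard triplet loss (sketch, not the TFA code)."""
    # Pairwise Euclidean distances between all embeddings in the batch.
    sq = np.sum(embeddings ** 2, axis=1)
    dist = np.sqrt(np.maximum(
        sq[:, None] - 2.0 * embeddings @ embeddings.T + sq[None, :], 0.0))
    same = labels[:, None] == labels[None, :]  # positive mask (incl. self)
    # Hardest positive: farthest same-label point per anchor.
    hardest_pos = np.max(np.where(same, dist, 0.0), axis=1)
    # Hardest negative: closest different-label point per anchor.
    hardest_neg = np.min(np.where(~same, dist, np.inf), axis=1)
    return np.mean(np.maximum(hardest_pos - hardest_neg + margin, 0.0))
```

This per-batch value depends on the embeddings, so the gradient with respect to the model weights should exist — which is why the `None` gradient surprises me.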
**Code to reproduce the issue**

```python
# Imports added for completeness; I use standalone Keras, per the traceback below.
from keras.applications.inception_resnet_v2 import InceptionResNetV2
from keras.models import Model
from keras.optimizers import RMSprop
import tensorflow_addons as tfa
import numpy as np

# `predictions`, `y`, `z`, `margin`, the iterators, and the callbacks
# are defined elsewhere in my script.
base_model = InceptionResNetV2(weights='imagenet', include_top=True)
model = Model(inputs=base_model.input, outputs=predictions)
loss = tfa.losses.triplet_hard_loss(y, z, margin=margin, soft=False)
model.compile(optimizer=RMSprop(lr=0.0001, rho=0.9, epsilon=1e-08, decay=0.0),
              loss=loss, metrics=['accuracy'])
model.fit_generator(train_iterator, steps_per_epoch, epochs=num_epochs,
                    callbacks=[plot_loss_callback, reduce_learningrate_callback, checkpointer],  # , tensorboard_callback],
                    validation_data=test_iterator,
                    validation_steps=np.ceil(len(label_validation) / float(batch_size)),
                    workers=4)
```
**Other info / logs**
```
  File "/home/laurens/anaconda2/envs/naturalis3c/lib/python3.6/site-packages/keras/legacy/interfaces.py", line 91, in wrapper
    return func(*args, **kwargs)
  File "/home/laurens/anaconda2/envs/naturalis3c/lib/python3.6/site-packages/keras/engine/training.py", line 1732, in fit_generator
    initial_epoch=initial_epoch)
  File "/home/laurens/anaconda2/envs/naturalis3c/lib/python3.6/site-packages/keras/engine/training_generator.py", line 42, in fit_generator
    model._make_train_function()
  File "/home/laurens/anaconda2/envs/naturalis3c/lib/python3.6/site-packages/keras/engine/training.py", line 316, in _make_train_function
    loss=self.total_loss)
  File "/home/laurens/anaconda2/envs/naturalis3c/lib/python3.6/site-packages/keras/legacy/interfaces.py", line 91, in wrapper
    return func(*args, **kwargs)
  File "/home/laurens/anaconda2/envs/naturalis3c/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py", line 75, in symbolic_fn_wrapper
    return func(*args, **kwargs)
  File "/home/laurens/anaconda2/envs/naturalis3c/lib/python3.6/site-packages/keras/optimizers.py", line 259, in get_updates
    grads = self.get_gradients(loss, params)
  File "/home/laurens/anaconda2/envs/naturalis3c/lib/python3.6/site-packages/keras/optimizers.py", line 93, in get_gradients
    raise ValueError('An operation has `None` for gradient. '
ValueError: An operation has `None` for gradient. Please make sure that all of your ops have a gradient defined (i.e. are differentiable). Common ops without gradient: K.argmax, K.round, K.eval.
```