I found this in your code:
```python
def add_transition(name, l):
    shape = l.get_shape().as_list()
    in_channel = shape[3]
    with tf.variable_scope(name) as scope:
        l = BatchNorm('bn1', l)
        l = tf.nn.relu(l)
        l = Conv2D('conv1', l, in_channel, 1, stride=1, use_bias=False, nl=tf.nn.relu)
        l = AvgPooling('pool', l, 2)
    return l
```
After BN and ReLU there is a 1x1 conv layer. However, you also pass nl=tf.nn.relu, so do you mean that another ReLU should be applied after the conv layer?
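If I read the tensorpack API correctly (the nl argument is applied as an activation on the layer's output), the transition effectively computes the following; this is just an unrolled sketch of the snippet above under that assumption:

```python
# Effective operation order implied by add_transition, assuming
# tensorpack applies nl after the convolution output
# (reusing BatchNorm/Conv2D/AvgPooling from the snippet above):
l = BatchNorm('bn1', l)
l = tf.nn.relu(l)           # pre-activation ReLU
l = Conv2D('conv1', l, in_channel, 1, stride=1,
           use_bias=False, nl=tf.identity)  # no activation inside the conv
l = tf.nn.relu(l)           # second ReLU, coming from nl=tf.nn.relu
l = AvgPooling('pool', l, 2)
```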
In DenseNet (Caffe version) the transition layer is configured differently: there is no ReLU after the 1x1 conv.
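For comparison, here is my understanding of the Caffe transition, written with the same tensorpack symbols as your snippet (Caffe's separate BatchNorm + Scale pair is folded into the single BatchNorm call here):

```python
# Caffe-style transition as I understand it: BN -> ReLU -> 1x1 conv
# -> pooling, with no ReLU after the convolution.
l = BatchNorm('bn1', l)
l = tf.nn.relu(l)           # the only ReLU in the transition
l = Conv2D('conv1', l, in_channel, 1, stride=1,
           use_bias=False, nl=tf.identity)  # no post-conv activation
l = AvgPooling('pool', l, 2)
```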
Can you explain it to me?
Thanks.