First, thanks for sharing the code; I learned a lot.
But I have a question: can you explain why the code replaces `keras.activations.relu` with `tf.nn.relu`?
Thank you!
```python
# TF1-era code: uses the default graph and gradient_override_map
import tensorflow as tf
import keras
from keras.applications.vgg16 import VGG16

def modify_backprop(model, name):
    g = tf.get_default_graph()
    with g.gradient_override_map({'Relu': name}):
        # get layers that have an activation
        layer_dict = [layer for layer in model.layers[1:]
                      if hasattr(layer, 'activation')]
        # replace relu activation
        for layer in layer_dict:
            if layer.activation == keras.activations.relu:
                layer.activation = tf.nn.relu
        # re-instantiate a new model so its graph ops are built
        # inside the override scope
        new_model = VGG16(weights='imagenet')
    return new_model
```
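For context on what the `'Relu': name` override changes: in guided backprop, the ReLU's backward pass is modified so that the incoming gradient is propagated only where both the forward input and the gradient itself are positive. Here is a minimal NumPy sketch of that difference, assuming standard guided-backprop semantics; the function names `relu_grad` and `guided_relu_grad` are hypothetical, not part of the code above.

```python
import numpy as np

def relu_grad(x, grad_out):
    # standard ReLU backward pass: gradient flows wherever the
    # forward input was positive
    return grad_out * (x > 0)

def guided_relu_grad(x, grad_out):
    # guided backprop additionally zeroes out negative incoming
    # gradients, keeping only "positive evidence"
    return grad_out * (x > 0) * (grad_out > 0)

x = np.array([-1.0, 2.0, 3.0])   # forward inputs to the ReLU
g = np.array([0.5, -0.5, 1.0])   # gradients arriving from above

print(relu_grad(x, g))           # the -0.5 passes through at index 1
print(guided_relu_grad(x, g))    # only the positive gradient at index 2 survives
```

In the TF1 graph, `gradient_override_map` can only reroute the gradient of ops registered under the graph-level op type `'Relu'`, which is what swapping the activation for `tf.nn.relu` before rebuilding the model ensures.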