LossScaleOptimizer does not work #31956
Comments
Sounds quite similar to #31953
@oanush Thanks for the information.
Still, I wonder whether the source of LossScaleOptimizer should be updated so that its behavior is consistent with the other optimizers.
I could reproduce the issue. Here is the gist. Thanks!
There was a technical decision that users should pass in only valid trainable variables, so that no gradient comes back as None.
This looks like a bug, and a workaround exists. Still, I'm closing this issue: even if it is fixed, the fix will never make it into a stable version of TensorFlow.
System information
Describe the current behavior
I am trying to run the sample code from https://www.tensorflow.org/api_docs/python/tf/contrib/mixed_precision/LossScaleOptimizer and get an error whenever no gradient can be computed for some of the variables:
Describe the expected behavior
No error should occur. Other optimizers such as AdamOptimizer and MovingAverageOptimizer run without error even when no gradient can be computed for some variables.
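The difference between the two behaviors can be sketched in plain Python. These helpers are hypothetical illustrations, not TensorFlow's actual source: an unscaling step that assumes every gradient exists raises on None, while the conventional behavior, which the other optimizers follow, passes None gradients through untouched.

```python
# Hypothetical sketch of the two behaviors; names and logic are
# illustrative, not copied from TensorFlow.

def scale_gradients_naive(grads_and_vars, loss_scale):
    # Assumes every gradient exists; raises TypeError on None entries,
    # which is the failure mode reported in this issue.
    return [(g / loss_scale, v) for g, v in grads_and_vars]

def scale_gradients_safe(grads_and_vars, loss_scale):
    # Conventional behavior: skip None gradients (variables that do not
    # contribute to the loss) instead of crashing.
    return [(g / loss_scale if g is not None else None, v)
            for g, v in grads_and_vars]

if __name__ == "__main__":
    # One variable ("unused") has no gradient, as when it does not
    # appear in the loss.
    grads_and_vars = [(128.0, "w"), (None, "unused"), (256.0, "b")]

    try:
        scale_gradients_naive(grads_and_vars, 128.0)
    except TypeError as e:
        print("naive version fails:", e)

    print(scale_gradients_safe(grads_and_vars, 128.0))
```

Under this sketch, the fix being asked for amounts to adding the `is not None` check before unscaling, mirroring how `apply_gradients` in other optimizers tolerates None entries.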
Code to reproduce the issue