This error is thrown every time training reaches step 1000.
It occurs in model.py, at the last line of the following snippet:

    optimizer = tf.train.AdamOptimizer(self.learing_rate)
    trainable_params = tf.trainable_variables()
    gradients = tf.gradients(self.loss, trainable_params)
    clip_gradients, _ = tf.clip_by_global_norm(gradients, self.max_gradient_norm)

It looks like vanishing gradients. Changing the learning rate from 0.0001 to 0.001 still raises the error. Any guidance would be appreciated.
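For context, `tf.clip_by_global_norm` rescales the whole gradient list by `min(1, clip_norm / global_norm)`, where the global norm is the square root of the sum of squared norms over all tensors. A minimal NumPy sketch of that math (the function and variable names here are illustrative, not TensorFlow's):

```python
import numpy as np

def clip_by_global_norm(grads, max_norm):
    # Global norm across all gradient tensors: sqrt(sum ||g||^2).
    global_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    # Rescale only when the global norm exceeds the threshold;
    # otherwise scale == 1 and gradients pass through unchanged.
    scale = max_norm / max(global_norm, max_norm)
    return [g * scale for g in grads], global_norm

# Example: gradients with global norm sqrt(9 + 16 + 144) = 13,
# clipped down so the new global norm equals 5.
grads = [np.array([3.0, 4.0]), np.array([0.0, 12.0])]
clipped, norm = clip_by_global_norm(grads, 5.0)
```

Note that if any gradient contains NaN or Inf (exploding rather than vanishing gradients), the global norm itself becomes NaN, which is one common way this line fails only after many training steps.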