
grad*reward? is this rigorous #14

Open
johsnows opened this issue May 15, 2019 · 0 comments

grad*reward? Is this rigorous?
I applied this code to the SVHN dataset and got very low accuracy. Sometimes the accuracy increases and then decreases, so I think something is wrong with the loss or the reward?

Step 4000, Minibatch Loss= 1.0120, Current accuracy= 0.730
Step 4100, Minibatch Loss= 0.9515, Current accuracy= 0.690
Step 4200, Minibatch Loss= 1.0280, Current accuracy= 0.650
Step 4300, Minibatch Loss= 1.0613, Current accuracy= 0.660
Step 4400, Minibatch Loss= 0.8850, Current accuracy= 0.710
Step 4500, Minibatch Loss= 0.7320, Current accuracy= 0.770
Step 4600, Minibatch Loss= 0.8818, Current accuracy= 0.700
Step 4700, Minibatch Loss= 0.7780, Current accuracy= 0.760
Step 4800, Minibatch Loss= 2.1427, Current accuracy= 0.210
Step 4900, Minibatch Loss= 2.2350, Current accuracy= 0.240
Step 5000, Minibatch Loss= 2.2739, Current accuracy= 0.130
Step 5100, Minibatch Loss= 2.1973, Current accuracy= 0.250
Step 5200, Minibatch Loss= 2.2330, Current accuracy= 0.160
Step 5300, Minibatch Loss= 2.2811, Current accuracy= 0.150
Step 5400, Minibatch Loss= 2.2750, Current accuracy= 0.150
Step 5500, Minibatch Loss= 2.2495, Current accuracy= 0.190
Step 5600, Minibatch Loss= 2.2543, Current accuracy= 0.120
Step 5700, Minibatch Loss= 2.1632, Current accuracy= 0.220
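For context, the grad*reward update in the title is the REINFORCE policy-gradient estimator, where the gradient of the log-probability of the sampled action is scaled by the observed reward. The raw grad * reward form is unbiased but high-variance, and subtracting a baseline from the reward is the standard way to reduce that variance, which often prevents the kind of collapse visible in the log above. Below is a minimal sketch on a toy 3-armed bandit with a moving-average baseline; all names here are illustrative and not taken from this repository:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-armed bandit: arm 2 has the highest expected reward.
true_rewards = np.array([0.2, 0.5, 0.9])

logits = np.zeros(3)   # policy parameters
baseline = 0.0         # moving-average reward baseline
lr, beta = 0.1, 0.9

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

for step in range(2000):
    probs = softmax(logits)
    a = rng.choice(3, p=probs)
    r = true_rewards[a] + 0.1 * rng.standard_normal()

    # grad of log pi(a) for a softmax policy: one_hot(a) - probs
    grad_log_pi = -probs
    grad_log_pi[a] += 1.0

    # REINFORCE update: grad_log_pi * (reward - baseline).
    # Dropping the baseline gives the raw grad * reward form,
    # which is still unbiased but much noisier.
    logits += lr * (r - baseline) * grad_log_pi
    baseline = beta * baseline + (1 - beta) * r

print(softmax(logits))  # typically concentrates on arm 2
```

If the controller in this repo uses the raw grad * reward form without a baseline (or with a poorly scaled reward), large-magnitude updates can destabilize training in exactly the pattern shown in the log, so checking the reward normalization and baseline is a reasonable first step.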
