Issues: openai/finetune-transformer-lm
- Why are the "wrong" sentences learned during training via LM? (#45, opened Apr 30, 2019 by fabiang7)
- Is it possible to edit the code based on this project to train from scratch? (#42, opened Feb 28, 2019 by guotong1988)
- What is the specific formula for the learning rate used in the Adam optimizer during pre-training? (#29, opened Oct 18, 2018 by ShuGao0810)
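Regarding the learning-rate question in #29: the GPT paper describes a linear warmup followed by cosine annealing, with a maximum rate of 2.5e-4. A minimal sketch of such a warmup-cosine schedule, written as a function of training progress (this is an illustrative reconstruction, not a verbatim copy of the repository's optimizer code):

```python
import math

def warmup_cosine(progress, warmup=0.002, max_lr=2.5e-4):
    """Learning rate at training progress `progress` in [0, 1].

    Linear warmup over the first `warmup` fraction of updates,
    then a single cosine decay toward 0. The `max_lr=2.5e-4`
    value is the one reported in the GPT paper; `warmup` here
    is an assumed default for illustration.
    """
    if progress < warmup:
        scale = progress / warmup  # linear ramp from 0 to 1
    else:
        scale = 0.5 * (1 + math.cos(math.pi * progress))  # cosine decay to 0
    return max_lr * scale
```

For example, the rate is 0 at the start, ramps to `max_lr` by the end of warmup, equals `max_lr / 2` at the midpoint of training, and reaches 0 at the end.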
- How to deal with logits from position indices in the output layer? (#22, opened Aug 24, 2018 by xiaoda99)
- Any timeline to release the code to train the LM + finetune on the other 11 tasks? (#13, opened Jul 18, 2018 by Franck-Dernoncourt)