Replies: 1 comment 1 reply
- Yes, of course; you can use any optimizer you want, it's fine.
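To illustrate the reply above, here is a minimal PyTorch sketch of attaching `torch.optim.AdamW` to a model with the usual GPT-2-style parameter grouping (no weight decay on biases). The `nn.Sequential` stand-in and all hyperparameter values are illustrative assumptions, not the aragpt2 code; in practice you would load the actual aragpt2 checkpoint instead.

```python
import torch
from torch import nn

# Stand-in for a GPT-2-style model (assumption); in practice this would be
# the aragpt2 model loaded in PyTorch.
model = nn.Sequential(nn.Linear(16, 32), nn.GELU(), nn.Linear(32, 16))

# Common GPT-2 practice: apply weight decay only to non-bias parameters.
decay, no_decay = [], []
for name, p in model.named_parameters():
    (no_decay if name.endswith("bias") else decay).append(p)

optimizer = torch.optim.AdamW(
    [{"params": decay, "weight_decay": 0.01},
     {"params": no_decay, "weight_decay": 0.0}],
    lr=5e-5,  # illustrative learning rate
)

# One dummy training step to show the optimizer wiring end to end.
x = torch.randn(4, 16)
loss = model(x).pow(2).mean()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```

Any other `torch.optim` optimizer can be swapped in at the `AdamW` call with no other changes.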
- Can I use the GPT-2 optimizer (AdamW) as-is with aragpt2 instead of LAMBOptimizer?
  If not, how can I use LAMBOptimizer with PyTorch in aragpt2, since your code uses TensorFlow? Can you guide me, please?