-
What is the maximum number of epochs for fine-tuning AraGPT2 for conversation systems? And if I have little data, can I still train the model, or is a large dataset necessary?
-
No idea, but probably not more than 5 or 6 epochs. I'd say try it for 5 epochs and evaluate the model after every epoch.
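The "evaluate after every epoch" advice above can be sketched as a simple loop that keeps the best checkpoint. `train_one_epoch` and `evaluate` are hypothetical stand-ins; in a real run they would fine-tune AraGPT2 and compute validation loss, and the loss numbers here are made up for illustration:

```python
def train_one_epoch(model, epoch):
    # Hypothetical placeholder for one fine-tuning pass over the data.
    return {"weights": f"after-epoch-{epoch}"}

def evaluate(model, epoch):
    # Hypothetical validation loss that improves, then plateaus.
    losses = [4.0, 2.5, 2.1, 2.05, 2.06]
    return losses[epoch]

def finetune(max_epochs=5):
    model = {"weights": "pretrained"}
    best_loss = float("inf")
    best_model = model
    history = []
    for epoch in range(max_epochs):
        model = train_one_epoch(model, epoch)
        val_loss = evaluate(model, epoch)
        history.append(val_loss)
        if val_loss < best_loss:
            # Keep the checkpoint with the lowest validation loss,
            # even if later epochs get worse.
            best_loss, best_model = val_loss, model
    return best_model, best_loss, history
```

With these toy numbers, the loop would pick the epoch-3 checkpoint (loss 2.05) rather than the final one, which is exactly why evaluating every epoch matters.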
-
Generally, there is no maximum number of epochs. Every new epoch improves the model further, but the amount of that improvement shrinks each time: the first epoch might give, say, a 40% improvement, the second 15%, the third 3%, and so on. Eventually you will find that past a certain epoch the improvement is zero or trivial and no longer worth the time and computational resources. As far as I know, increasing the number of epochs will rarely hurt your algorithm, though with very little data you should watch the validation loss for overfitting.
You can train the model with little data, but the issue is that the model's performance will not be stunning.
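The diminishing-returns argument above can be turned into a simple stopping rule: halt when the relative improvement between epochs drops below a threshold. A toy sketch, where the loss values are invented to mirror the 40% / 15% / 3% pattern in the reply:

```python
def stop_epoch(val_losses, min_rel_improvement=0.01):
    """Return the index of the first epoch whose relative improvement
    over the previous epoch falls below the threshold; if that never
    happens, return the last index."""
    for i in range(1, len(val_losses)):
        prev, cur = val_losses[i - 1], val_losses[i]
        rel = (prev - cur) / prev  # fraction of loss removed this epoch
        if rel < min_rel_improvement:
            return i
    return len(val_losses) - 1

# Made-up per-epoch validation losses: big gain, smaller gain, plateau.
losses = [10.0, 6.0, 5.1, 4.95, 4.94]
```

With a 5% threshold this stops at epoch 3, because the epoch-2 to epoch-3 gain is only about 2.9%; with the stricter 1% default it trains one epoch longer.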