Can I use AraBERT and AraGPT2 in a seq2seq architecture as fully pretrained models? And if it is possible, how can I fine-tune them on my task? I am trying to build a Transformer-based encoder-decoder using AraBERT and AraGPT2. If you have any example code, I would really appreciate it.

Replies: 1 comment
-
Hey, check out the notebook here: https://github.com/aub-mind/Arabic-Empathetic-Chatbot/blob/master/model/train-bert2bert.ipynb. You should be able to replace the decoder model with AraGPT2, although it will need to be trained in a seq2seq way.
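For a rough idea of the wiring, here is a minimal sketch using Hugging Face's `EncoderDecoderModel` (not necessarily exactly what the notebook does) with a recent `transformers` version. The `aubmindlab/...` checkpoint names are assumptions, so substitute the AraBERT/AraGPT2 variants you actually use, and see the notebook above for the full training setup.

```python
# Minimal sketch: warm-start a seq2seq model from pretrained AraBERT (encoder)
# and AraGPT2 (decoder). Checkpoint names below are assumptions.
import torch
from transformers import AutoTokenizer, EncoderDecoderModel

encoder_id = "aubmindlab/bert-base-arabertv02"  # assumed AraBERT checkpoint
decoder_id = "aubmindlab/aragpt2-base"          # assumed AraGPT2 checkpoint

# Two tokenizers: AraBERT tokenizes the source side, AraGPT2 the target side.
enc_tokenizer = AutoTokenizer.from_pretrained(encoder_id)
dec_tokenizer = AutoTokenizer.from_pretrained(decoder_id)
dec_tokenizer.pad_token = dec_tokenizer.eos_token  # GPT-2 tokenizers have no pad token

# Both halves load pretrained weights, but the cross-attention layers added to
# the decoder are randomly initialized -- which is why the combined model still
# needs to be fine-tuned in a seq2seq way before it is useful.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(encoder_id, decoder_id)
model.config.decoder_start_token_id = dec_tokenizer.bos_token_id
model.config.eos_token_id = dec_tokenizer.eos_token_id
model.config.pad_token_id = dec_tokenizer.pad_token_id

# One fine-tuning step on a single (source, target) pair; wrap this in your
# own DataLoader/Trainer loop for a real task.
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
inputs = enc_tokenizer("مرحبا، كيف حالك؟", return_tensors="pt")
labels = dec_tokenizer("أنا بخير، شكرا لك.", return_tensors="pt").input_ids
loss = model(input_ids=inputs.input_ids,
             attention_mask=inputs.attention_mask,
             labels=labels).loss  # labels are shifted internally for decoding
loss.backward()
optimizer.step()
```

Note the two tokenizers: the encoder and decoder have different vocabularies, so source text must go through AraBERT's tokenizer and labels through AraGPT2's.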