Model crashes while getting Word Embeddings #163
Unanswered
moayadeldin asked this question in Q&A
Replies: 1 comment 3 replies
-
Try adding a `torch.no_grad()` context:

```python
with torch.no_grad():
    outputs = model(**inputs)
```

Colab should be able to handle 100 sentences.
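To illustrate why this helps: without `no_grad()`, every forward pass stores intermediate activations for backpropagation, so memory grows with input size even when you never call `backward()`. A minimal sketch below combines `no_grad()` with small batches; it uses a stand-in `nn.Linear` in place of the actual transformer model (the pattern is the same for a Hugging Face model, just with tokenized inputs per batch):

```python
import torch
import torch.nn as nn

# Stand-in for the embedding model; any nn.Module behaves the same way here.
model = nn.Linear(8, 4)
inputs = torch.randn(100, 8)  # pretend: 100 sentence representations

# no_grad() disables activation caching, so peak memory stays flat;
# splitting into batches of 16 further bounds memory per forward pass.
chunks = []
with torch.no_grad():
    for batch in inputs.split(16):
        chunks.append(model(batch))
embeddings = torch.cat(chunks)

# The result carries no autograd graph, so it can be stored or moved cheaply.
assert not embeddings.requires_grad
```

Batching this way also answers the "dividing my dataset" concern: the split happens transparently inside one loop rather than requiring the dataset to be pre-chopped by hand.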
-
When I try to get word embeddings for a dataset using the approach from the Arabic Embeddings Notebook on Colab with a relatively large input (100+ sentences), Google Colab crashes completely because it runs out of memory. I tried my own PC as well, but it also crashed.
Is there any way to solve this problem other than dividing my dataset into very small pieces by hand?
TIA.