Hello,

I have two short questions:

1.) Is it correct that the model 'distilbert-base-german-cased' (https://huggingface.co/distilbert-base-german-cased) was distilled from the model 'dbmdz/bert-base-german-cased' (https://huggingface.co/dbmdz/bert-base-german-cased)?
2.) Is there a paper on 'dbmdz/bert-base-german-cased' and/or 'distilbert-base-german-cased' (which can also be used for citation purposes)?

Thanks in advance!

1.) Yes, distilbert-base-german-cased was distilled from the dbmdz/bert-base-german-cased model!
2.) Unfortunately, there are no papers for these models. But there is a paper for the (better and larger) German language models GBERT and GELECTRA: https://aclanthology.org/2020.coling-main.598/
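For anyone who wants to verify this relationship themselves, here is a minimal sketch (assuming the Hugging Face transformers library is installed) that loads both tokenizers and compares their vocabularies. A shared vocabulary is consistent with, though not proof of, the teacher/student lineage described above:

```python
from transformers import AutoTokenizer

# Sanity check (heuristic, not proof): a distilled student model
# usually reuses its teacher's tokenizer vocabulary unchanged.
teacher_tok = AutoTokenizer.from_pretrained("dbmdz/bert-base-german-cased")
student_tok = AutoTokenizer.from_pretrained("distilbert-base-german-cased")

print("vocab sizes:", teacher_tok.vocab_size, student_tok.vocab_size)
print("identical vocab:", teacher_tok.get_vocab() == student_tok.get_vocab())
```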