Italian and French BERT: how to cite / acknowledge authorship #23
Comments
Hi @ivandre, thanks for your interest in our models and for using them 🤗

At the moment, there's no official paper that can be cited. Other papers that use our models just cite the repository URL - I think that would be fine :) But I'm currently thinking about generating a Zenodo DOI for this repository, so it can be cited like the BERTurk models!

Btw: for Italian I will release an ELECTRA model quite soon (checkpoint selection and evaluations are done). For French there are two pieces of exciting news: an ELECTRA model for Europeana is on its way (already trained), and CamemELECTRA (same training data and vocab as CamemBERT) is also trained. So stay tuned!
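Until a DOI exists, a repository citation of this kind is usually written as a BibTeX `@misc` entry. The following is only a sketch: the title, access note, and field choices are placeholders to be adapted, not an official citation provided by the authors (the author name, year, and repository URL are taken from this thread).

```bibtex
@misc{dbmdz-berts,
  author       = {Schweter, Stefan},
  title        = {dbmdz {BERT} models},
  year         = {2020},
  publisher    = {GitHub},
  howpublished = {\url{https://github.com/dbmdz/berts}},
  note         = {Accessed: insert date of access}
}
```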
Excellent! Thank you very much indeed for your response and the great news.
Best regards,
Ivandré
We are using both the Italian and the French DBMDZ BERT models in a multilingual research project, and I would like to acknowledge their authorship. Is there a publication to cite? A project website? Or at least the authors' names? Many thanks.