🦠 Model Request: Pre-trained knowledge-guided graph transformer for molecular representation #1454
Comments
/approve
New Model Repository Created! 🎉
@miquelduranfrigola The Ersilia model repository has been successfully created and is available at:
Next Steps ⭐
Now that your new model repository has been created, you are ready to start contributing to it! Here are some brief starter steps for contributing to your new model repository:
Additional Resources 📚
If you have any questions, please feel free to open an issue and get support from the community!
I am working on this model. I am encountering issues with the DGL library on a Mac, so I will probably have to develop this model on Linux.
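For reference, a minimal sanity check along these lines can help confirm whether the installed DGL wheel actually works on a given platform before attributing failures to the model code. This is only a sketch assuming the usual DGL + PyTorch setup, not part of the model itself.

```python
import torch
import dgl

# Report the installed versions (useful when filing platform-specific bug reports).
print("torch:", torch.__version__, "| dgl:", dgl.__version__)

# Build a tiny 3-node graph and run one trivial message-passing step
# to confirm the DGL backend is functional on this machine.
g = dgl.graph((torch.tensor([0, 1, 2]), torch.tensor([1, 2, 0])))
g.ndata["h"] = torch.randn(3, 4)
g.update_all(dgl.function.copy_u("h", "m"), dgl.function.sum("m", "h_sum"))
print(g.ndata["h_sum"].shape)  # expected: torch.Size([3, 4])
```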
Update: I have successfully run the model on the
I have done most of the necessary work, including the
Now I have realized that Ersilia Pack requires a Python version greater than 3.7: https://github.com/ersilia-os/eos8aa5/actions/runs/12391375778/job/34588326502
We need to figure this out.
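As a stopgap while the Python requirement is sorted out, a guard along these lines (a sketch only, not the actual fix) would make the version mismatch fail loudly instead of surfacing as an obscure install error:

```python
import sys

# Ersilia Pack needs a Python newer than 3.7 (see the CI failure linked above),
# so fail fast with an explicit message if an older interpreter is used.
if sys.version_info < (3, 8):
    raise RuntimeError(
        f"Python {sys.version_info.major}.{sys.version_info.minor} detected; "
        "packaging this model requires Python >= 3.8."
    )
print("Python version OK:", sys.version.split()[0])
```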
Model Name
Knowledge-guided pre-trained graph transformer
Model Description
Neural fingerprints (embeddings) based on a knowledge-guided graph transformer. The model represents a novel self-supervised learning framework for molecular graph representation learning, consisting of a novel graph transformer architecture, LiGhT, and a knowledge-guided pre-training strategy (a usage sketch is given after the request fields below).
Slug
kgpgt-embedding
Tag
Descriptor
Publication
https://www.nature.com/articles/s41467-023-43214-1
Source Code
https://github.com/lihan97/KPGT
License
Apache-2.0
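For illustration, once the model is incorporated it could be called through Ersilia's Python interface roughly as sketched below. The identifier eos8aa5 is taken from the repository linked above, while the exact input/output argument names and file formats are assumptions and may differ from the final implementation.

```python
from ersilia import ErsiliaModel

# Fetch-and-serve workflow for the KPGT embedding model
# (identifier assumed from the repository created for this request).
model = ErsiliaModel("eos8aa5")
model.serve()

# Run on a file of SMILES strings and write the neural fingerprints
# (embeddings) to a CSV; argument names are illustrative.
model.run(input="molecules.csv", output="embeddings.csv")

model.close()
```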