add orb support... tricky #303
base: main
Conversation
Could be simplified. Depends if you think it's more or less readable.
match model_path:
    case "orb-v1":
        model = orb_ff.orb_v1()
    case "orb-mptraj-only-v1":
        model = orb_ff.orb_v1_mptraj_only()
    case "orb-d3-v1":
        model = orb_ff.orb_d3_v1()
    case "orb-d3-xs-v1":
        model = orb_ff.orb_d3_xs_v1()
    case "orb-d3-sm-v1":
        model = orb_ff.orb_d3_sm_v1()
    case _:
        raise ValueError(
            "Please specify `model_path`, as there is no "
            f"default model for {arch}"
        )
Should be split into multiple lines, but this is the gist
model_fn = getattr(orb_ff, model_path.replace("-", "_"), None)
if model_fn is None:
    raise ValueError(
        "Please specify `model_path`, as there is no "
        f"default model for {arch}"
    )
model = model_fn()
Would be nice, but orb-mptraj does not match the pattern.
Don't we choose what "model_path" is? You could also easily special case that. Here, it's not clear that that is different at first glance.
No, really, I used the canonical names from their website... which I suspect is what people will expect to call them.
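A minimal sketch of the special-casing idea, assuming the canonical names stay as the user-facing strings and only the one that breaks the dash-to-underscore pattern gets an alias (the load_orb_model helper and the _ALIASES table are illustrative, not part of this PR):

import orb_models.forcefield.pretrained as orb_ff

# Hypothetical alias table: only the name that does not follow the
# "replace '-' with '_'" convention needs an explicit entry.
_ALIASES = {"orb-mptraj-only-v1": "orb_v1_mptraj_only"}

def load_orb_model(model_path: str):
    """Load a pretrained Orb model from its canonical name."""
    attr_name = _ALIASES.get(model_path, model_path.replace("-", "_"))
    model_fn = getattr(orb_ff, attr_name, None)
    if model_fn is None:
        raise ValueError(f"No pretrained Orb model named {model_path!r}")
    return model_fn()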
import orb_models.forcefield.pretrained as orb_ff

if isinstance(model_path, str):
    match model_path:
I don't think we can use match/case with 3.9?
Yes, it does not, but it breaks my heart to write if/elif/elif/elif like we are in the Fortran 77 world... isn't Python 3.9 EOL this month?
Maybe we should catch the guilty one and then deal with it separately?
I think actually even 3.8 isn't EOL until next month, so a little way to go until 3.9.
my good intentions destroyed...
Even then, I would probably recommend building a dict with the names as keys, then pulling out the function and calling it.
In fairness, they already have a dict; we can use that.
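For reference, a 3.9-compatible sketch that avoids match/case by building the mapping by hand from the loader functions above; if the upstream package already exposes an equivalent dict, that could be used instead (ORB_LOADERS is illustrative, and model_path / arch refer to the same variables as in the diff):

import orb_models.forcefield.pretrained as orb_ff

# Map canonical model names to their loader functions; the odd one out,
# "orb-mptraj-only-v1", is handled like any other key.
ORB_LOADERS = {
    "orb-v1": orb_ff.orb_v1,
    "orb-mptraj-only-v1": orb_ff.orb_v1_mptraj_only,
    "orb-d3-v1": orb_ff.orb_d3_v1,
    "orb-d3-xs-v1": orb_ff.orb_d3_xs_v1,
    "orb-d3-sm-v1": orb_ff.orb_d3_sm_v1,
}

try:
    model = ORB_LOADERS[model_path]()
except KeyError as err:
    raise ValueError(
        "Please specify `model_path`, as there is no "
        f"default model for {arch}"
    ) from err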
Mostly for reference, but just to note that this may pin torch to 2.2.0, which is fine at the moment but could be a problem in the future. (I need to double check, but I think even uninstalled extras' dependencies are taken into account when installing.)
alignn = ["alignn"]
chgnet = ["chgnet"]
m3gnet = ["matgl", "dgl", "torchdata"]
sevennet = ["sevenn", "torch_geometric"]
orb = ["orb_models"]
Do we not need to add Pynanoflann too?