
add orb support... tricky #303

Draft · wants to merge 4 commits into main
Conversation

alinelena (Member)

No description provided.

@alinelena added the enhancement (New/improved feature or request) label on Sep 4, 2024
@alinelena self-assigned this on Sep 4, 2024
@alinelena marked this pull request as draft on September 4, 2024, 10:22
@oerc0122 (Collaborator) left a comment:

Could be simplified; depends on whether you think it's more or less readable.

Comment on lines +227 to +242
match model_path:
    case "orb-v1":
        model = orb_ff.orb_v1()
    case "orb-mptraj-only-v1":
        model = orb_ff.orb_v1_mptraj_only()
    case "orb-d3-v1":
        model = orb_ff.orb_d3_v1()
    case "orb-d3-xs-v1":
        model = orb_ff.orb_d3_xs_v1()
    case "orb-d3-sm-v1":
        model = orb_ff.orb_d3_sm_v1()
    case _:
        raise ValueError(
            "Please specify `model_path`, as there is no "
            f"default model for {arch}"
        )
Collaborator:

Should be split into multiple lines, but this is the gist

Suggested change
model_fn = getattr(orb_ff, model_path.replace("-", "_"), None)
if model_fn is None:
    raise ValueError(
        "Please specify `model_path`, as there is no "
        f"default model for {arch}"
    )
model = model_fn()

Member Author:

Would be nice, but orb-mptraj does not match the pattern.

Collaborator:

Don't we choose what "model_path" is? You could also easily special-case that; as it is, it's not clear at first glance that that one is different.
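
For instance, a rough sketch of the special-cased getattr approach (reusing the loader names from the match statement above; `special_cases` is just an illustrative name):

import orb_models.forcefield.pretrained as orb_ff

# The one model whose canonical name doesn't follow the hyphen-to-underscore pattern.
special_cases = {"orb-mptraj-only-v1": "orb_v1_mptraj_only"}

func_name = special_cases.get(model_path, model_path.replace("-", "_"))
model_fn = getattr(orb_ff, func_name, None)
if model_fn is None:
    raise ValueError(
        "Please specify `model_path`, as there is no "
        f"default model for {arch}"
    )
model = model_fn()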

Member Author:

No, really, I used the canonical names from their website... which I suspect is what people will expect to call them.

import orb_models.forcefield.pretrained as orb_ff

if isinstance(model_path, str):
    match model_path:
Member:

I don't think we can use match/case with 3.9?

Member Author:

Yes, it doesn't, but it breaks my heart to write if/elif/elif/elif like we're in a Fortran 77 world... isn't Python 3.9 EOL this month?

Member Author:

Maybe we should catch the guilty one and then deal with it separately?

Member:

I think even 3.8 actually isn't EOL until next month, so there's still a little way to go for 3.9.

Member Author:

My good intentions destroyed...

Collaborator:

Even then, I would probably recommend building a dict with the keys, then pulling out the function and calling it.
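
Something along these lines, as a sketch (the dict name is illustrative; the loader functions are the ones used in the match statement above):

import orb_models.forcefield.pretrained as orb_ff

# Dispatch table from canonical model names to their pretrained loaders.
ORB_MODELS = {
    "orb-v1": orb_ff.orb_v1,
    "orb-mptraj-only-v1": orb_ff.orb_v1_mptraj_only,
    "orb-d3-v1": orb_ff.orb_d3_v1,
    "orb-d3-xs-v1": orb_ff.orb_d3_xs_v1,
    "orb-d3-sm-v1": orb_ff.orb_d3_sm_v1,
}

try:
    model = ORB_MODELS[model_path]()
except KeyError as err:
    raise ValueError(
        "Please specify `model_path`, as there is no "
        f"default model for {arch}"
    ) from err

This also sidesteps the match/case question, since it runs on Python 3.9.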

Member Author:

In fairness, they already have a dict; we can use that.

@ElliottKasoar (Member)

Mostly for reference, but just to note that this may pin torch to 2.2.0, which is fine at the moment but could be a problem in the future.

(I need to double-check, but I think even uninstalled extras' dependencies are taken into account when installing.)

alignn = ["alignn"]
chgnet = ["chgnet"]
m3gnet = ["matgl", "dgl", "torchdata"]
sevennet = ["sevenn", "torch_geometric"]
orb = ["orb_models"]
Member:

Do we not need to add Pynanoflann too?

@ElliottKasoar linked an issue on Nov 29, 2024 that may be closed by this pull request
Labels: enhancement (New/improved feature or request)
Projects: None yet
Development: Successfully merging this pull request may close these issues: Add orb support
3 participants