Instantiation of custom class leads to wrong defaults #460
A workaround for this is to define `main` like this:

```python
def main(data: Optional[Super] = None):
    if data is None:
        data = Default()
```

This might be better practice in general.
This is not a bug. The current behavior is the result of many discussions about use cases for overriding arguments from the command line. Currently I don't have at hand links to the most relevant issues, but the rationale is the following.

Base classes and subclasses can have many parameters/settings, and several subclasses can share parameters. If all the defaults were reset every time a user specifies a different class, then the user would be forced to give all the parameters they want every time. So the decided behavior is that only what gets specified is changed. If a new class_path is specified, then only the class_path is changed, not the init_args that were set at that moment. If the new class does not accept some parameters that are in the current init_args, then those get dropped. But the shared parameters are kept, and they retain their current value. No reset to defaults.

Maybe it is easier to understand with actual use cases where it is more obvious why this behavior makes sense. Say someone implements multiple models with a common base class. All models have
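The merge behavior described above can be sketched in plain Python. This is an illustrative stand-in, not jsonargparse's actual implementation; the class names and the `switch_class_path` helper are hypothetical:

```python
import inspect

class Base:
    def __init__(self, lr: float = 0.1):
        self.lr = lr

class ModelA(Base):
    def __init__(self, lr: float = 0.1, depth: int = 3):
        super().__init__(lr)
        self.depth = depth

class ModelB(Base):
    def __init__(self, lr: float = 0.1, width: int = 8):
        super().__init__(lr)
        self.width = width

def switch_class_path(config: dict, new_cls: type) -> dict:
    """Mimic the described merge: changing class_path keeps the
    init_args the new class also accepts and drops the rest;
    nothing is reset to defaults."""
    accepted = set(inspect.signature(new_cls.__init__).parameters) - {"self"}
    kept = {k: v for k, v in config["init_args"].items() if k in accepted}
    return {"class_path": new_cls.__name__, "init_args": kept}

# The user first selects ModelA and overrides lr and depth...
cfg = {"class_path": "ModelA", "init_args": {"lr": 0.5, "depth": 10}}
# ...then switches to ModelB: lr (shared) keeps its overridden value 0.5,
# depth (not accepted by ModelB) is dropped, width stays at its default.
cfg = switch_class_path(cfg, ModelB)
print(cfg)  # {'class_path': 'ModelB', 'init_args': {'lr': 0.5}}
```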
Independent of motivation, the current behavior has been requested in previous issues, and changing it would break the flow of previous projects. So this shouldn't be changed. However, there could be a new feature. Right now an argument like
Thanks for the detailed response. If it has already been discussed many times in the past, I don't want to add to the pile. I just don't know how to implement the datasets via the CLI now. This really surprised me. Like having
I do see a need to reset to defaults in some cases, so it can be worth discussing further. It's just that we need both: the current behavior and a way to reset. For the case of a dataset, I would probably not add a default, to make it required, and add the option
Thanks for considering it and the suggestion to make it required. While making it required in the Python signature could make sense, we also want to provide config files in a folder in Lit-GPT where users can find predefined experiments to reproduce numbers. If someone were to make an ablation from these configs and use a different dataset via the command line, they would run into this issue anyway (we will likely have docs examples based on these configs).
For a new reset feature, two things need to be decided:

These are some initial ideas that would be best to think through more than once. How does it sound?
Hey @mauvilsa
I can't think of a good use case where it wouldn't reset to the class I'm setting. The confusion I was facing was related to setting the class
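For contrast with the current merge behavior, the semantics expected in this report can also be sketched: selecting a class discards any previous init_args and falls back to that class's own signature defaults. Again a hypothetical stand-in, not a real jsonargparse function:

```python
import inspect

class Base:
    def __init__(self, lr: float = 0.1):
        self.lr = lr

class Other(Base):
    def __init__(self, lr: float = 0.2, width: int = 8):
        super().__init__(lr)
        self.width = width

def reset_to_class_defaults(new_cls: type) -> dict:
    """The behavior expected in the report: selecting a class resets
    init_args to that class's own signature defaults, ignoring any
    previously accumulated overrides."""
    sig = inspect.signature(new_cls.__init__)
    defaults = {
        name: p.default
        for name, p in sig.parameters.items()
        if name != "self" and p.default is not inspect.Parameter.empty
    }
    return {"class_path": new_cls.__name__, "init_args": defaults}

cfg = reset_to_class_defaults(Other)
print(cfg)  # {'class_path': 'Other', 'init_args': {'lr': 0.2, 'width': 8}}
```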
A syntax for this would be nice for advanced users.

Besides this more elaborate approach, would a simple flag on the CLI to switch the behavior also be an option? For simple CLI use cases where this is universally the expected behavior (i.e. we don't expect to switch the optimizer class as described), it would be nice if we didn't have to remember to use special syntax in the CLI arguments. Something like:

```python
from jsonargparse import CLI

CLI(main, here=True)
```

But I'm struggling to find an intuitive name, so it might be a sign that it's not a good idea after all.
A way to change the behavior could be an option. But I would go for a global setting instead of a parameter to
@carmocca If there was such an option in jsonargparse, would we be able to set it in litgpt's
No concerns. We already set some globals: https://github.com/Lightning-AI/litgpt/blob/wip/litgpt%2F__main__.py#L83-L86
Actually there is a problem with using a global. Users of pytorch-lightning expect the current behavior. If it were a global set by litgpt, it could mean that just by installing litgpt, the behavior of
What if we set the global flag only in our entry-point scripts and not at the import level? That would be the main use case. That might be good practice for libraries in general, as you don't want to impose custom jsonargparse behavior on others, but only on your own CLI within the package.
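The entry-point-only pattern could look roughly like this. The settings dict and `scoped_setting` helper are illustrative stand-ins, not a real jsonargparse API; the point is only that the flag is flipped inside the CLI run, never at import time:

```python
import contextlib

# Hypothetical library-wide settings; stand-in for whatever global
# jsonargparse would expose for such an option.
GLOBAL_SETTINGS = {"reset_on_class_path": False}

@contextlib.contextmanager
def scoped_setting(key, value):
    """Apply a global setting only for the duration of a CLI run,
    restoring the previous value afterwards."""
    previous = GLOBAL_SETTINGS[key]
    GLOBAL_SETTINGS[key] = value
    try:
        yield
    finally:
        GLOBAL_SETTINGS[key] = previous

def entry_point():
    # The flag is only active while this package's own CLI runs, so
    # merely importing the package never changes behavior for others.
    with scoped_setting("reset_on_class_path", True):
        assert GLOBAL_SETTINGS["reset_on_class_path"] is True

entry_point()
print(GLOBAL_SETTINGS["reset_on_class_path"])  # False
```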
Unless I'm missing something, this is what we do already in the link at #460 (comment)
🐛 Bug report

When instantiating objects from a custom class where a default is specified in the signature, jsonargparse infers the defaults from the wrong class:

To reproduce

Output:

Expected behavior

I expect the output to be the one produced with the defaults of the `Other` class, because I selected `--data Other`.

Environment

- How was jsonargparse installed (e.g. `pip install jsonargparse[all]`): `pip install -U jsonargparse[signatures]`