Again, thanks for all your great work on this project. Despite opening a couple of issues, I really want to point out how much I like your way of defining configs and calling code.

🐛 Bug report

I do have one issue though: my use case is PyTorch Lightning, where I have a config file and generate a second config file on the fly for a hyperparameter search using Ray Tune.

The problem is that I cannot tune the learning rate, because doing so requires adjusting the init_args of the optimizer_init, which is a nested dict under the model key.

To reproduce

Executing the following code shows that weight_decay is discarded, although it should be kept. The reason (I checked this manually in jsonargparse) is that optimizer_init gets overwritten as a whole instead of being updated.

Expected behavior

The output should still contain weight_decay after the override.

Environment

jsonargparse version 4.24.1
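The original reproduction snippet and its output did not survive in this copy of the issue. The following is a minimal sketch of the described setup, not the author's code; the file names, the optimizer values, and the dict-typed model.optimizer_init argument are assumptions made for illustration:

```python
# Hedged reproduction sketch: a base config defines optimizer_init with lr and
# weight_decay; a second (generated) config only overrides lr. Because dicts
# are replaced wholesale rather than merged, weight_decay is lost.
from jsonargparse import ActionConfigFile, ArgumentParser

parser = ArgumentParser()
parser.add_argument("--config", action=ActionConfigFile)
parser.add_argument("--model.optimizer_init", type=dict, default={})

with open("base.yaml", "w") as f:
    f.write(
        "model:\n"
        "  optimizer_init:\n"
        "    class_path: torch.optim.AdamW\n"
        "    init_args:\n"
        "      lr: 0.001\n"
        "      weight_decay: 0.01\n"
    )

with open("tune.yaml", "w") as f:  # generated on the fly, e.g. by Ray Tune
    f.write(
        "model:\n"
        "  optimizer_init:\n"
        "    init_args:\n"
        "      lr: 0.01\n"
    )

cfg = parser.parse_args(["--config", "base.yaml", "--config", "tune.yaml"])
print(cfg.model.optimizer_init)
# Behavior described in the issue: only the keys from tune.yaml remain,
# i.e. something like {'init_args': {'lr': 0.01}} -- weight_decay is gone.
# Expected behavior: weight_decay (and class_path) from base.yaml are kept.
```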
This is not a bug. Dict replacement is the expected behavior, as explained in dict-items. This is actually a duplicate of #236. For partial updates of dicts that also work in config files, an append option (+) could be added, similar to list-append.
I see. Thanks for the quick reply; I'll try to implement what you suggested in the other issue. So the mistake was on my side for not reading the documentation carefully enough, sorry for that! ;)
Should I close this issue, or edit it to represent your enhancement suggestion? Might make more sense if you do it since I lack a lot of understanding of the details of the project.
By the way, thanks for also pointing out the list-append option! I wasn't aware of it and this is making one pretty ugly part of my code obsolete :D
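For reference, and independent of the approach discussed in #236 (not quoted here), one generic workaround when the second config is generated programmatically, as with Ray Tune above, is to deep-merge the overrides into the base settings before writing the file, so that untouched keys such as weight_decay survive the wholesale dict replacement. A sketch under that assumption; the helper and file names are hypothetical:

```python
# Hypothetical workaround sketch (not from this thread): merge the tuning
# overrides into the base config before writing the generated YAML.
import copy

import yaml  # PyYAML, which jsonargparse already depends on for YAML configs


def deep_merge(base: dict, override: dict) -> dict:
    """Recursively merge override into a copy of base."""
    merged = copy.deepcopy(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged


with open("base.yaml") as f:
    base_cfg = yaml.safe_load(f)

# Only the learning rate comes from the hyperparameter search.
tune_overrides = {"model": {"optimizer_init": {"init_args": {"lr": 0.01}}}}

with open("tune.yaml", "w") as f:
    yaml.safe_dump(deep_merge(base_cfg, tune_overrides), f)
```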