test: remove unnecessarily specific parameter type annotations #1010
Conversation
Not to mention, no AD or optimization library supports a tuning vector with mixed element types, and I don't see that changing any time soon, nor a case where that would be a real optimization.
Thanks for helping with any migration this will entail. Would this mean we can't have integer parameters that stay integers, or is that not an issue since they aren't tunable? Also, isn't this a breaking change at some level? I'd imagine type-specific user-registered functions may stop working if parameters are being silently promoted now. Will this be held for a MTK 10 release?
Note there are likely doc updates needed here too, as I believe @TorkelE wrote about typing parameters for the V14 docs.
It's not breaking if integer-valued parameters are not marked as tunable. This is just changing the internal representation of tunables. @AayushSabharwal can confirm that.
They will still be integers; they just won't be in the tunables portion.
Only
That would be breaking then, because we previously respected it as an integer there. I really think we should set tunable more carefully by default. I don't think we can go forward with this without it, since otherwise you very easily get rid of people's integers and bools. |
It's not breaking. |
Parameters that users have explicitly marked as integers/bools will still be integers/bools, just in a different partition. |
Oh, I thought we had a conversion. Okay, if that was already respecting the symtype then we're good. |
We already assume that if a symbolic parameter isn't explicitly annotated as an integer symbolic, it gets converted, so that case shouldn't be an issue.
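To make the partitioning discussed above concrete, here is a hypothetical sketch (in Python, not MTK's actual data structures or API): parameters the user has explicitly annotated as integers or bools land in a separate discrete partition and keep their types, while everything else is promoted into the float tunables.

```python
def partition_parameters(params, declared_types):
    """Split parameters into float tunables and a discrete partition.

    `declared_types` maps a parameter name to its user-declared type;
    parameters without an explicit int/bool annotation are treated as
    tunable and promoted to float. This mirrors the behavior described
    in the thread, not MTK's real implementation.
    """
    tunable, discrete = {}, {}
    for name, value in params.items():
        declared = declared_types.get(name, float)
        if declared in (int, bool):
            # Explicitly annotated integers/bools keep their type.
            discrete[name] = declared(value)
        else:
            # Everything else is promoted to a common float type.
            tunable[name] = float(value)
    return tunable, discrete

tunable, discrete = partition_parameters(
    {"n": 3, "use_flag": 1, "k": 2},
    {"n": int, "use_flag": bool},
)
# `k` ends up as 2.0 in the tunables; `n` and `use_flag` stay int/bool.
```

The point of the sketch is just that promotion only touches the tunable partition, so user code relying on integer-typed parameters is unaffected.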
Did these overly specific typings cause errors in tests with the new MTK? The tests here were not really intended to be optimised code for AD, but rather to exercise various really niche cases and check that these work, to potentially catch niche errors.
With SciML/ModelingToolkit.jl#2908, tunables are being merged into one `Vector{T}`, where `T` is some floating-point type. This is because tunables are intended for optimization and sensitivity analysis, and AD doesn't work for integers. Not splitting into a `Tuple{Vector{T1}, Vector{T2}}` also has performance benefits for sensitivity analysis and helps avoid a significant number of bugs. Array tunables are concatenated into this vector as well. Access is done through views to prevent unnecessary performance overhead, even in MTK's generated functions.

To this end, parameters are promoted to a common type. Overly specific type annotations are also invalid in general, since the parameters could also be duals.
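The flattening described above can be sketched as follows. This is a conceptual illustration in Python, not MTK's actual implementation: scalar and array tunables are concatenated into one flat float vector, and each parameter is accessed back through a view (here, a slice) into that vector.

```python
def flatten_tunables(params):
    """Concatenate tunable parameters into one flat float vector.

    Returns (flat, index_map), where index_map[name] is the slice of
    `flat` holding that parameter's values. Hypothetical sketch of the
    idea in the PR description, not MTK's generated code.
    """
    flat, index_map, pos = [], {}, 0
    for name, value in params.items():
        values = value if isinstance(value, (list, tuple)) else [value]
        # Promote every entry to a common float type, so AD sees one
        # homogeneous vector instead of a tuple of differently typed ones.
        promoted = [float(v) for v in values]
        index_map[name] = slice(pos, pos + len(promoted))
        flat.extend(promoted)
        pos += len(promoted)
    return flat, index_map

# Scalars and an array tunable all land in the same flat vector.
flat, idx = flatten_tunables({"k": 2, "tau": 0.5, "gains": [1, 2, 3]})
gains_view = flat[idx["gains"]]  # slice acts as the "view" here
```

Because the whole vector has one element type, swapping it for a vector of dual numbers (as AD does) requires no structural changes, which is why overly specific annotations like `Vector{Float64}` break.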
This PR is a draft so it can be re-run once the aforementioned MTK PR is merged and tagged.
Checklist

- [ ] I have followed the contributor guidelines, in particular the SciML Style Guide and COLPRAC.