
Use SolverParameters #254

Merged: tmigot merged 6 commits into main from use-solverparameters on Nov 11, 2024

Conversation


@tmigot tmigot commented Feb 9, 2024

This is a first draft of integrating SolverParameters.jl into the JSO-compliant solvers.

It is more constrained, but it clarifies the role of the algorithmic parameters: their default values and their domain (so far we never really had a check on the domain).

I believe this structure should be in the solver for one reason, well illustrated here: the solver can need the parameters.

This is a minimal change in the code, as at the beginning of solve! we extract the parameters from the solver structure.

What does change is that we can no longer call solve!(solver, nlp, stats; with_some_parameters = 1) to pass parameters. (Although we could update the values in the solver from the keyword arguments, that is probably not good practice.)

This also opens new perspectives. For instance, the LbfgsParameterSet could contain parameters for a recipe to compute mem instead of just mem (if we plan something that depends on the size of the problem, for instance).

What is not done here is deciding where we store the default values. They should be isolated, so we can modify them easily, and they might also depend on the solver precision: an optimal τ₁ (below) is probably different in Float16 and in Float64.
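
To make the idea concrete, here is a minimal, self-contained sketch of the design (hypothetical type and field names, illustrative default values; in the actual PR the parameter and domain types come from SolverParameters.jl, so this is not the merged code):

```julia
# Hypothetical stand-in for a SolverParameters.jl-style parameter with a domain check.
struct BoundedParameter{T}
  value::T
  lower::T
  upper::T
  function BoundedParameter(value::T, lower::T, upper::T) where {T}
    lower <= value <= upper || error("parameter value $value outside [$lower, $upper]")
    new{T}(value, lower, upper)
  end
end

# Defaults and domains declared in one place, per solver.
struct LBFGSParameterSet{T}
  mem::BoundedParameter{Int}
  τ₁::BoundedParameter{T}
end

function LBFGSParameterSet(::Type{T} = Float64; mem = 5, τ₁ = T(0.9999)) where {T}
  LBFGSParameterSet{T}(
    BoundedParameter(mem, 1, typemax(Int)),   # L-BFGS memory, at least 1
    BoundedParameter(τ₁, zero(T), one(T)),    # line-search parameter, kept in [0, 1]
  )
end

# The solver owns its parameters, so solve!(solver, nlp, stats) reads them from the
# solver structure instead of receiving them as keyword arguments.
mutable struct LBFGSSolver{T, V}
  x::V
  params::LBFGSParameterSet{T}
end

params = LBFGSParameterSet(Float64; mem = 10)
params.τ₁.value   # extracted by the algorithm when it needs the numeric value
```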


codecov bot commented Feb 9, 2024

Codecov Report

Attention: Patch coverage is 90.90909% with 1 line in your changes missing coverage. Please review.

Project coverage is 88.48%. Comparing base (1e1368e) to head (cf6fd9a).
Report is 6 commits behind head on main.

Current head cf6fd9a differs from pull request most recent head 4ebed83

Please upload reports for the commit 4ebed83 to get more accurate results.

Files          Patch %    Missing lines
src/lbfgs.jl   90.90%     1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #254      +/-   ##
==========================================
+ Coverage   86.66%   88.48%   +1.81%     
==========================================
  Files           7        7              
  Lines        1125     1033      -92     
==========================================
- Hits          975      914      -61     
+ Misses        150      119      -31     



github-actions bot commented Feb 9, 2024

[Breakage test results for FletcherPenaltySolver.jl and Percival.jl (latest / stable)]


@dpo dpo left a comment


Here are a few minor comments. I don’t understand why the parameters should be inside the solver object?! (nevermind, I see).

[4 review comment threads on src/lbfgs.jl, resolved]

tmigot commented Feb 10, 2024

Here are a few minor comments. I don’t understand why the parameters should be inside the solver object?! (nevermind, I see).

It could, actually, but we would need a constructor that takes both nlp and params, and also a new solve! with 4 arguments (solver, nlp, params, and stats) with all its variants. I find it more economical, in terms of modifications, to store it in the solver.
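
For illustration, the two shapes being compared might look roughly like this (toy names and signatures, not the actual JSOSolvers API):

```julia
# Toy sketch contrasting the two designs discussed above.
struct ToyParams
  mem::Int
end

struct ToySolver
  params::ToyParams
end

# Design kept in this PR: parameters stored in the solver, solve! keeps 3 arguments.
toy_solve!(solver::ToySolver, nlp, stats) = solver.params.mem

# Alternative: a 4-argument variant taking the parameter set explicitly, which would
# also require a solver constructor accepting both nlp and params, plus all variants.
toy_solve!(solver::ToySolver, nlp, params::ToyParams, stats) = params.mem

solver = ToySolver(ToyParams(5))
toy_solve!(solver, nothing, nothing)                 # 5, read from solver.params
toy_solve!(solver, nothing, ToyParams(10), nothing)  # 10, explicit parameter set
```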


d-monnet commented Mar 5, 2024

Would it make sense to also add (not sure how) the constraints on the parameters, for instance τ₁ > 0, or η₁ < η₂ for R2? I'm thinking about MultiPrecisionR2, where there are a bunch of constraints of this kind.
We could also implement a generic test function that checks whether the user-provided parameters satisfy the constraints.


tmigot commented Mar 6, 2024

Everything regarding the constraints on the domain of the parameters should be modeled in the parameter set; the package SolverParameters.jl has a bunch of functionalities for this (even open sets).
For the constraints linking the different parameters, I don't think we have a real solution. I would strongly suggest adding a check in the <: AbstractParameterSet constructor, and then in the BBModels if that's the solution we keep for optimizing the parameters.

We reimplemented Julia's in function (https://github.com/JuliaSmoothOptimizers/SolverParameters.jl/blob/8cea36a64b2d7bf781d1c3cb161ff70237876c00/src/parameterset.jl#L45), so that should partially cover this check.
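
A sketch of what such checks could look like (hypothetical parameter-set name and values; in the PR the per-parameter domain checks go through the SolverParameters.jl domains and the reimplemented in, while the linking constraints sit in the constructor):

```julia
# Hypothetical parameter set illustrating the two kinds of checks.
struct R2ParameterSet{T}
  η₁::T
  η₂::T
  function R2ParameterSet(η₁::T, η₂::T) where {T}
    # Per-parameter domain constraints (the part that `in` on the parameter domains can cover).
    @assert 0 < η₁ < 1 "η₁ must lie in (0, 1)"
    @assert 0 < η₂ < 1 "η₂ must lie in (0, 1)"
    # Constraint linking several parameters, checked in the constructor.
    @assert η₁ < η₂ "η₁ must be strictly smaller than η₂"
    new{T}(η₁, η₂)
  end
end

R2ParameterSet(1.0e-2, 0.95)   # passes both kinds of checks
# R2ParameterSet(0.95, 1.0e-2) # throws: η₁ must be strictly smaller than η₂
```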


dpo commented Mar 6, 2024

Wouldn't constraints be modeled directly in the BBNLPModel?


tmigot commented Mar 6, 2024

Wouldn't constraints be modeled directly in the BBNLPModel?

Mainly yes, but some minimal constraints are also handled here.
Beyond the optimization of parameters, it is also good to make sure that the given parameters do not violate the basic constraints.


d-monnet commented Mar 8, 2024

I noticed that the trunk and tron interfaces do not allow passing parameters to the sub-solvers (e.g., the increase_threshold parameter of TRONTrustRegion) and do not expose them as keyword arguments.
With the current approach, I'm not sure which solver parameters we would optimize, since it is rather the sub-solvers' parameters that should be optimized.


tmigot commented Mar 9, 2024

I noticed that the trunk and tron interfaces do not allow passing parameters to the sub-solvers (e.g., the increase_threshold parameter of TRONTrustRegion) and do not expose them as keyword arguments. With the current approach, I'm not sure which solver parameters we would optimize, since it is rather the sub-solvers' parameters that should be optimized.

Good point. Can you create a separate issue for this? As you noticed, this optimization is fairly new, and there are several parameters we didn't extract from the sub-solvers.

@tmigot tmigot marked this pull request as ready for review June 26, 2024 12:37
@tmigot tmigot requested a review from d-monnet June 26, 2024 12:37

@d-monnet d-monnet left a comment


LGTM! Thanks @tmigot!

I updated a few conditions on the parameters. These changes might not be consistent with the previous version of the code, but I double-checked them against the paper.

[5 review comment threads on src/fomo.jl, resolved]

tmigot commented Aug 15, 2024

@dpo @d-monnet I finally updated this PR!
The default parameters are now centralized at the beginning of each file, including the documentation of their default values, so we no longer need to update the whole file if default values change.
Moreover, there is now a mechanism that allows building default values that depend on the AbstractNLPModel.

The DefaultParameter is something I just added in SolverParameters.jl: https://github.com/JuliaSmoothOptimizers/SolverParameters.jl/blob/main/src/default-parameter.jl .

Right now, constraints between the parameters are handled by an @assert in the parameter constructor. I suggest adding a smarter mechanism in a follow-up PR.

Finally, I opened a PR in NLPModels.jl (JuliaSmoothOptimizers/NLPModels.jl#481) to write eltype(nlp) instead of eltype(nlp.meta.x0). It was a bit annoying, but apparently we can't write anonymous functions with parametric types.
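
As a rough illustration of a default computed from the model rather than hard-coded (the LazyDefault type below is a hypothetical stand-in; in the PR this role is played by DefaultParameter from SolverParameters.jl, whose interface may differ):

```julia
using ADNLPModels  # for a small test model; nlp.meta comes from NLPModels

# Hypothetical stand-in for a default value built from the AbstractNLPModel.
struct LazyDefault{F}
  recipe::F
end
(d::LazyDefault)(nlp) = d.recipe(nlp)

# The L-BFGS memory could default to something derived from the problem size,
# and a tolerance could follow the model's floating-point type.
default_mem  = LazyDefault(nlp -> min(nlp.meta.nvar, 5))
default_atol = LazyDefault(nlp -> sqrt(eps(eltype(nlp.meta.x0))))

nlp = ADNLPModel(x -> sum(x .^ 2), ones(3))
default_mem(nlp)    # 3
default_atol(nlp)   # ≈ 1.49e-8 for Float64
```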

@tmigot tmigot requested review from dpo and d-monnet August 15, 2024 11:21
tmigot and others added 6 commits November 11, 2024 13:46
@tmigot tmigot merged commit 6842496 into main Nov 11, 2024
17 of 18 checks passed
@tmigot tmigot deleted the use-solverparameters branch November 11, 2024 21:47