
Making BoTorch's (Discrete) Multifidelity BO Work under Service API #2475

Closed
Abrikosoff opened this issue May 22, 2024 · 3 comments
Labels
question Further information is requested

Comments

@Abrikosoff

Hi,

I have another quick question regarding the possibility of doing multifidelity BO under the Service API. I now have a use case for BoTorch's multifidelity BO; the tutorials are clear, but since I have already been using Ax's Service API for some time (and appreciate the ease of use it gives), I was wondering whether the MF workflow can be transplanted to work under the Service API? I'm guessing yes, since on the surface it seems we only need to pass in a suitable acquisition function (we can't call `optimize_acqf` directly in the Service API, but I assume there are other ways to pass in a `qMultiFidelityHypervolumeKnowledgeGradient`?). But it could very well be that I'm missing something, so I would really appreciate some expert opinions here!

@Cesar-Cardoso added the question (Further information is requested) label May 22, 2024
@Balandat
Contributor

Hi @Abrikosoff, yes, I believe this should be possible. I don't have a full tutorial handy, but you should be able to assemble this using a custom GenerationStrategy that uses the modular BoTorch model setup. For this you'll need to:

  1. Define "fidelity" parameters as part of the search space when creating the experiment (if you're interested in the discrete case, that means setting [`is_fidelity=True`](https://github.com/facebook/Ax/blob/ba6eb81324f35522054a9bac7b0088b2243f88d9/ax/service/ax_client.py#L281-L282) and providing a `target_value` that is your target fidelity).
  2. Construct a custom `GenerationStrategy` that, following the [modular BoTorch model tutorial](https://github.com/facebook/Ax/blob/ba6eb81324f35522054a9bac7b0088b2243f88d9/tutorials/modular_botax.ipynb), uses the `Models.BOTORCH_MODULAR` model type and specifies `"acquisition_class": MultiFidelityAcquisition`, `"acquisition_options": {"fidelity_weights": {fidelity_param_idx: weight}, "cost_intercept": cost_intercept}`, and `"surrogate": Surrogate(SingleTaskMultiFidelityGP)` in the `model_kwargs` of the `GenerationStep`. Here `fidelity_weights` and `cost_intercept` correspond to the `fidelity_weights` and `fixed_cost` args of `AffineFidelityCostModel` from the BoTorch multi-fidelity tutorial. This is a bit annoying since `fidelity_param_idx` is not known at experiment creation time, but it is usually the last of the parameters - we're working on improving the user interface for this.
  3. Instantiate the `AxClient` with the custom `GenerationStrategy` via the [`generation_strategy`](https://github.com/facebook/Ax/blob/ba6eb81324f35522054a9bac7b0088b2243f88d9/ax/service/ax_client.py#L191) argument (a rough sketch is given below this list).
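
Putting those pieces together, a rough (untested) sketch might look something like the following - note that the import path for `MultiFidelityAcquisition` has moved between Ax releases, and the concrete numbers for the fidelity index, weight, cost intercept, and Sobol warm-up are placeholders you'll need to adapt to your setup:

```python
from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models
from ax.models.torch.botorch_modular.surrogate import Surrogate
from ax.service.ax_client import AxClient
from botorch.models.gp_regression_fidelity import SingleTaskMultiFidelityGP

# NOTE: this import path may differ depending on your Ax version - adjust if needed.
from ax.models.torch.botorch_modular.multi_fidelity import MultiFidelityAcquisition

# Hypothetical: the fidelity parameter is the last of three tunable parameters.
fidelity_param_idx = 2

gs = GenerationStrategy(
    steps=[
        # A few quasi-random trials to initialize the surrogate.
        GenerationStep(model=Models.SOBOL, num_trials=5),
        # Modular BoTorch setup with a multi-fidelity surrogate and acquisition.
        GenerationStep(
            model=Models.BOTORCH_MODULAR,
            num_trials=-1,  # use this step for all remaining trials
            model_kwargs={
                "surrogate": Surrogate(SingleTaskMultiFidelityGP),
                "acquisition_class": MultiFidelityAcquisition,
                "acquisition_options": {
                    # These mirror the fidelity_weights / fixed_cost args of
                    # AffineFidelityCostModel in the BoTorch MF tutorial.
                    "fidelity_weights": {fidelity_param_idx: 1.0},
                    "cost_intercept": 5.0,
                },
            },
        ),
    ]
)

# Step 3: hand the custom strategy to the Service API client.
ax_client = AxClient(generation_strategy=gs)
```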

Hope that helps, let us know how it goes!

@Abrikosoff
Author

Abrikosoff commented Jun 4, 2024

> _(quoting @Balandat's suggestions above)_

Hi @Balandat, first of all, thanks for your very helpful tips regarding this! I'm now halfway through already; the custom `GenerationStrategy` is not a problem, but when I try to define a fidelity parameter as part of the search space (in the Service API fashion), like so:


```python
{
    "name": "fidelity",
    "type": "choice",
    "values": [0.5, 0.75, 1.0],
    "is_fidelity": True,
    "target_value": 1.0,
},
```
        

and then ask for the next trial, I get the error `ValueError: Cannot choice-encode fidelity parameter fidelity`. However, if I try the same definition without the `"type"` entry, it fails already at the experiment-definition step. Intuitively, I would have thought this parameter is most accurately classed as a choice parameter, but that does not seem to be the case.
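
For context, the full call I am making looks roughly like this (everything apart from the fidelity parameter - the experiment name, the other parameter, and the objective - is a stripped-down placeholder):

```python
from ax.service.ax_client import AxClient
from ax.service.utils.instantiation import ObjectiveProperties

ax_client = AxClient(generation_strategy=gs)  # gs = the custom GenerationStrategy from above
ax_client.create_experiment(
    name="mf_experiment",  # placeholder name
    parameters=[
        {"name": "x1", "type": "range", "bounds": [0.0, 1.0]},  # placeholder design parameter
        {
            "name": "fidelity",
            "type": "choice",
            "values": [0.5, 0.75, 1.0],
            "is_fidelity": True,
            "target_value": 1.0,
        },
    ],
    objectives={"objective": ObjectiveProperties(minimize=False)},  # placeholder objective
)

# This is the call that raises:
# ValueError: Cannot choice-encode fidelity parameter fidelity
parameters, trial_index = ax_client.get_next_trial()
```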

Any idea of what I am doing wrong here? Thanks in advance!

Edit: After performing a more thorough search of the issues, I came across #979. Given that the error messages are the same in both cases, would it be correct to assume that support for discrete fidelities is not yet implemented in the Service API?

@saitcakmak
Contributor

Closing this since the discussion moved to #2514
