Directions #85
Some random thoughts related to the question: how good is LNS compared to ALNS?
There's also this paper about the A in ALNS: Turkes et al. 2021 (not sure if we've linked to it before). I read it as "it's probably not that beneficial in general, since it also adds complexity."
We now have SISR as part of the CVRP example. We can add another example doing LNS with …

Another good direction might be to offer more diagnostics. Can we, for example, help users tune parameters, or provide tools to efficiently tune an ALNS instance?
There are three "parameter groups" that we might want to tune in ALNS: …

It would be nice to have a …
A simple workflow for tuning the acceptance criterion would look as follows:

```python
alns = make_alns(...)
init = ...
select = ...
stop = ...

data = {}
for idx, accept in tune.accept(RecordToRecordTravel, parameter_space, sampling_method):
    res = alns.iterate(init, select, accept, stop)
    data[idx] = res.best_state.objective()

# Index of the best configuration
print(min(data, key=data.get))
```

This could be extended to tuning ALNS and the operator selection schemes as well. I don't have much experience with tuning, so I don't know exactly what the tuning interface should look like.
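To make the idea above concrete, here is a self-contained sketch of what a `tune.accept`-style generator could do. Everything in it is hypothetical: `tune_accept`, the `RecordToRecordTravel` stand-in dataclass, and the toy score replacing a real `alns.iterate(...)` run are assumptions made purely to illustrate the proposed interface, not the actual ALNS API.

```python
import random
from dataclasses import dataclass


@dataclass
class RecordToRecordTravel:
    # Stand-in for the acceptance criterion; the real class lives in the
    # ALNS package and has its own constructor signature.
    start_threshold: float
    end_threshold: float
    step: float


def tune_accept(criterion_cls, parameter_space, num_samples, seed=0):
    """Yield (idx, criterion) pairs, one per randomly sampled configuration."""
    rng = random.Random(seed)
    for idx in range(num_samples):
        params = {name: rng.uniform(lo, hi)
                  for name, (lo, hi) in parameter_space.items()}
        yield idx, criterion_cls(**params)


parameter_space = {
    "start_threshold": (1.0, 10.0),
    "end_threshold": (0.0, 1.0),
    "step": (0.01, 0.10),
}

data = {}
for idx, accept in tune_accept(RecordToRecordTravel, parameter_space, num_samples=5):
    # A real workflow would call alns.iterate(...) here and record the best
    # objective; a toy score keeps this sketch runnable on its own.
    data[idx] = accept.start_threshold + accept.step

best = min(data, key=data.get)
```

The generator shape (yielding `(idx, criterion)` pairs) keeps the user's loop body identical regardless of which sampling method is plugged in.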
We probably shouldn't invent our own half-baked solution for this. The ML community already has plenty of tooling for it, e.g. …
I'm closing this issue because tuning is now in #109, and the other ideas from last summer have (for the most part) already been implemented. |
[partially based on https://doi.org/10.1016/j.cor.2022.105903, thanks @leonlan]