This repository contains the code for my thesis, which investigated optimizing the Hybrid Genetic Search algorithm (Vidal, 2022). The PyVRP implementation of Hybrid Genetic Search was used throughout.
Tip
If you are new to vehicle routing or metaheuristics, the PyVRP website explains the basics well: see its introduction to VRP and introduction to HGS.
In recent years, the Learn to Optimize paradigm has been making strides in solving combinatorial optimization problems with machine learning. Most learned models are still outperformed by the state-of-the-art algorithms, so if you can't beat them, why not just join them?
Hybrid Genetic Search (HGS) is one of these state-of-the-art algorithms. Even though HGS is state-of-the-art, it still leaves room for efficiency gains because of its many random components, among which is the crossover step. Many studies have used operator selection or parameter adaptation methods to address this randomness; direct, local genetic selection is far less studied.
To fill this gap, we propose NeuroSREX, which utilizes Selective Route Exchange (SREX) and a Graph Neural Network (GNN) to select specific route(s) of each parent to crossover. While NeuroSREX is not yet practically viable due to difficulties in dealing with constraints and the added computation time, the results demonstrate the efficiency and effectiveness of the model.
That is, we found lower-cost solutions in fewer iterations than HGS. We outperformed HGS on a multitude of instances, taking a first step in the evolution of optimizing evolution.
The proposed methodology for optimizing the evolution strategy of HGS consists of two parts: the model's prediction of the correct configuration of Selective Route Exchange (SREX) crossover parameters, and the implementation of the model within HGS. This section briefly discusses how each part is set up.
The combination of fast, explorative evolutionary search and the aggressive-improvement capabilities of local search is a significant part of HGS's performance. Implementing a model for the evolutionary search (i.e. SREX) adds computational overhead to an otherwise very fast step. Therefore, an implementation should not only make good predictions, but should also minimize the computational overhead.
The SREX crossover operator combines routes from both parents to generate a new offspring solution. It does this by carefully selecting routes from the second parent to exchange with routes from the first parent. This often results in an incomplete offspring, which is then repaired using a search method.
The code below shows the parameters used by the PyVRP implementation. The `start_indices` and `num_moved_routes` parameters determine which routes of Parent-1 and Parent-2 are used for the crossover. These parameters are selected randomly, which is what makes the genetic search fast.
```python
def selective_route_exchange(
    parents: tuple[Solution, Solution],
    data: ProblemData,
    cost_evaluator: CostEvaluator,
    start_indices: tuple[int, int],
    num_moved_routes: int,
) -> Solution:
    ...
```
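For illustration, the sketch below shows how such a random configuration could be drawn and passed to the operator. The helper `random_srex_config` is hypothetical, and `parent1`, `parent2`, `data`, and `cost_evaluator` are assumed to come from the surrounding HGS run; the exact sampling inside PyVRP may differ.

```python
import random

# Hypothetical sketch of the random configuration used by plain HGS.
# `parent1` and `parent2` are assumed to be PyVRP Solution objects.
def random_srex_config(parent1, parent2):
    max_moved = min(parent1.num_routes(), parent2.num_routes())
    num_moved_routes = random.randint(1, max_moved)
    start_indices = (
        random.randrange(parent1.num_routes()),
        random.randrange(parent2.num_routes()),
    )
    return start_indices, num_moved_routes

# Usage, assuming `data` and `cost_evaluator` exist in the calling code:
# start_indices, num_moved_routes = random_srex_config(parent1, parent2)
# offspring = selective_route_exchange(
#     (parent1, parent2), data, cost_evaluator, start_indices, num_moved_routes
# )
```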
Important
The proposed model of this thesis makes a prediction for all configurations of `start_indices` and `num_moved_routes` given two parents. The best configuration is then selected.
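As a minimal sketch, the exhaustive selection could look like the loop below. The `model.predict` method is a placeholder for a per-configuration quality score (in practice the GNN scores all configurations in a single forward pass, as described later); it is not part of PyVRP or the thesis API.

```python
# Hypothetical selection loop: score every SREX configuration and keep the best.
def best_srex_config(model, parent1, parent2):
    best_score = float("-inf")
    best_config = None
    max_moved = min(parent1.num_routes(), parent2.num_routes())
    for num_moved_routes in range(1, max_moved + 1):
        for start1 in range(parent1.num_routes()):
            for start2 in range(parent2.num_routes()):
                score = model.predict(
                    parent1, parent2, (start1, start2), num_moved_routes
                )
                if score > best_score:
                    best_score = score
                    best_config = ((start1, start2), num_moved_routes)
    return best_config
```

The returned configuration can then be passed to `selective_route_exchange` as shown above.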
Graph Features | Description | Dimension |
---|---|---|
Edge Index | Graph connectivity in COO format | [2, num_edges] |
Edge Weight | The cost of each edge | [num_edges, 1] |
Node Features | Node feature matrix | [num_nodes, num_node_features] |
Coordinates | The position of customer i | [num_nodes, 2] |
Time window | The time interval in which customer i must be served | [num_nodes, 2] |
Demand | The demand of customer i | [num_nodes, 1] |
Service time | The service time of customer i | [num_nodes, 1] |
Vehicle capacity | The demand capacity of vehicles for a given route instance | [num_nodes, 1] |
Number of vehicles | The number of vehicles for a given route instance | [num_nodes, 1] |
Positional Embeddings | The LapPE or RWPE with number of dimensions k | [num_nodes, k] |
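A minimal sketch of how these features could be assembled into a PyTorch Geometric `Data` object is shown below. The tensor names, sizes, and random placeholder values are assumptions for illustration; shapes follow the table above, but the actual feature construction in the source code may differ.

```python
import torch
from torch_geometric.data import Data

# Illustrative feature assembly for a single VRP graph (dummy values).
num_nodes, num_edges, k = 50, 200, 8

edge_index = torch.randint(0, num_nodes, (2, num_edges))  # [2, num_edges], COO
edge_weight = torch.rand(num_edges, 1)                     # [num_edges, 1]

node_features = torch.cat(
    [
        torch.rand(num_nodes, 2),  # coordinates
        torch.rand(num_nodes, 2),  # time window (earliest, latest)
        torch.rand(num_nodes, 1),  # demand
        torch.rand(num_nodes, 1),  # service time
        torch.rand(num_nodes, 1),  # vehicle capacity (broadcast per node)
        torch.rand(num_nodes, 1),  # number of vehicles (broadcast per node)
        torch.rand(num_nodes, k),  # positional embeddings (LapPE or RWPE)
    ],
    dim=1,
)

graph = Data(x=node_features, edge_index=edge_index, edge_weight=edge_weight)
```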
Figure 1 illustrates the model proposed in this thesis. The architecture comprises three main elements: the Graph Attention (GAT) networks, the embedding transformations, and, finally, the fully-connected (FC) layers.
A single input to the model consists of three different graphs: Parent-1 and Parent-2, which each represent a single solution to the VRP problem, and the Full Graph, which represents the whole VRP problem graph (i.e. all edges and nodes of the problem).
The output, shown in Figure 2, is a heatmap of the estimated quality of every configuration of the Selective Route Exchange crossover given Parent-1 and Parent-2.
Following the GAT layers, node representations are obtained. However, to predict the routes that should be switched between parents (the configuration of SREX parameters) to find a better offspring, these embeddings need to be transformed to reflect this interaction. A diagram illustrating these transformations is shown in Figure 3.
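The skeleton below sketches the encoder part of this architecture: one GAT encoder per input graph, whose node embeddings then feed the embedding transformations and the FC layers. Layer counts and sizes are placeholders, not the thesis configuration.

```python
import torch
from torch import nn
from torch_geometric.nn import GATConv

# Schematic encoders for the three input graphs (Parent-1, Parent-2, Full Graph).
class GraphEncoders(nn.Module):
    def __init__(self, in_dim: int, hidden_dim: int):
        super().__init__()
        self.gat_parent1 = GATConv(in_dim, hidden_dim)
        self.gat_parent2 = GATConv(in_dim, hidden_dim)
        self.gat_full = GATConv(in_dim, hidden_dim)

    def forward(self, p1, p2, full):
        # Node embeddings per graph; these feed the embedding transformations
        # (Figure 3) and the FC layers that output the heatmap (Figure 2).
        h1 = self.gat_parent1(p1.x, p1.edge_index)
        h2 = self.gat_parent2(p2.x, p2.edge_index)
        hf = self.gat_full(full.x, full.edge_index)
        return h1, h2, hf
```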
In this section, we briefly explain the steps taken to calculate the embeddings for all configurations of the SREX parameters.
Note
For how these steps are implemented to minimize the added computation time, we refer to the source code.
The initial step involves aggregating the node embeddings into an embedding for each route, for both Parent-1 and Parent-2.
Since SREX switches one or more routes (`num_routes_moved`), the embeddings of the routes that would be exchanged are aggregated for every possible `starting_index` and `num_routes_moved`.
Subsequently, there are also routes that are not used for the crossover. For each configuration, we also aggregate the routes that are not used and concatenate them with the aggregation of the routes that are used. This results in an embedding for each configuration (i.e. `starting_index` of that parent and `num_routes_moved`) of a single parent.
Finally, to enable a comparison between Parent-1 and Parent-2, the configuration embeddings of the two parents are combined; note that the number of moved routes (`num_routes_moved`) is always the same for both parents. The result is a representation for every SREX configuration given two parents.
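The sketch below illustrates these aggregation steps for one parent. It uses mean pooling and assumes that the moved routes form a consecutive block starting at the chosen index (wrapping around); the pooling operator, the block structure, and the final combination across parents are assumptions, and the thesis code may differ.

```python
import torch

# Node embeddings -> route embeddings; `routes` is a list of node-index lists.
def route_embeddings(node_emb: torch.Tensor, routes: list[list[int]]) -> torch.Tensor:
    return torch.stack([node_emb[r].mean(dim=0) for r in routes])

# Route embeddings -> one embedding per starting index, for a fixed number of
# moved routes: concatenate the aggregation of moved and untouched routes.
def config_embeddings(route_emb: torch.Tensor, num_moved: int) -> torch.Tensor:
    n = route_emb.size(0)
    configs = []
    for start in range(n):
        moved = [(start + i) % n for i in range(num_moved)]
        kept = [i for i in range(n) if i not in moved]
        moved_emb = route_emb[moved].mean(dim=0)
        kept_emb = route_emb[kept].mean(dim=0) if kept else torch.zeros_like(moved_emb)
        configs.append(torch.cat([moved_emb, kept_emb]))
    return torch.stack(configs)  # one row per starting index
```

Repeating this for both parents and every admissible `num_routes_moved`, and then combining the two parents' embeddings, yields one representation per SREX configuration.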
When integrated into Hybrid Genetic Search, the NeuroSREX approach exhibited potential for enhancing solution quality given enough iterations (30,000+). However, significant computational overhead was introduced by the need to compute positional encodings (LapPE) for every solution. Addressing this limitation to improve iteration speed remains essential for practical viability.
In conclusion, we have taken the first steps toward optimizing genetic selection within the context of routing problems. While it is not currently ready for practical implementation, this study has demonstrated a significant positive impact, warranting further exploration in the optimization of evolutionary algorithms. As with natural evolution, progress is achieved one step at a time.
To reach practical viability, we recommend the following steps for future work, in order of importance:
- Improve the current implementation by adding the positional encodings as a feature to every solution; this would eliminate the need to calculate them repeatedly (see the sketch after this list).
- Train NeuroSREX using an alternative approach for computing the positional embeddings: compute the eigenvectors for each graph edge, incorporate them as edge features, and subsequently derive node PEs based on the Parent Graph edges. This method requires calculating eigenvectors only once per route instance, before running HGS.
- If the aforementioned steps result in practical viability for instances with less challenging constraints, subsequent research should focus on developing methods to handle more complex constraints.
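As a minimal sketch of the first recommendation, the snippet below computes a Laplacian positional encoding once per route instance using PyTorch Geometric's `AddLaplacianEigenvectorPE` transform and keeps it for reuse. The dummy graph, node/edge counts, and the exact caching or derivation scheme (per-solution storage versus edge-based eigenvectors) are assumptions left open for future work.

```python
import torch
from torch_geometric.data import Data
from torch_geometric.transforms import AddLaplacianEigenvectorPE
from torch_geometric.utils import to_undirected

# Dummy full problem graph; shapes are placeholders.
num_nodes, num_edges, k = 50, 200, 8
edge_index = to_undirected(torch.randint(0, num_nodes, (2, num_edges)))
full_graph = Data(x=torch.rand(num_nodes, 7), edge_index=edge_index)

# Compute the LapPE once, before HGS starts, instead of per solution.
transform = AddLaplacianEigenvectorPE(k=k, attr_name="lap_pe", is_undirected=True)
full_graph = transform(full_graph)

cached_pe = full_graph.lap_pe  # [num_nodes, k]; attach to every solution graph
```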