4.1. Genetic Algorithm
The established robust optimization model involves complex nonlinear elements and cannot be solved by mathematical programming methods. The GA, as a heuristic algorithm, has been successfully applied to many traditional optimization problems. Here, the GA is used with some improvements: the departure interval is expressed by an integer code, and the fitness function introduces a penalty term to represent the objective function value.
The GA design generally includes a chromosome coding method, population initialization, fitness function design, and crossover and mutation methods. The GA flow chart [30] is shown in Figure 2.
(1) Chromosome Coding. The decision variable in the model is the departure time of buses at the origin and terminal stations, which is an integer, so an integer coding method is employed to facilitate the genetic operators. To speed up the search, the headway rather than the departure time is used as a gene bit of the chromosome. Thus, a single chromosome can be expressed as (H1, H2, …, HM). The corresponding departure time of each bus equals the sum of the starting time of the planning horizon and the preceding intervals. For example, if the optimal chromosome of five vehicles is coded as (10, 10, 10, 10, 10) and the planning horizon starts at 8:00, then the corresponding departure times of the five vehicles are 8:10, 8:20, 8:30, 8:40, and 8:50, respectively.
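The coding scheme above can be sketched as follows (the helper names are illustrative, not from the paper): a chromosome stores headways in minutes, and decoding accumulates them from the start of the planning horizon.

```python
def decode_chromosome(headways, start_minute):
    """Cumulative sum of headways gives each bus's departure minute."""
    times, t = [], start_minute
    for h in headways:
        t += h
        times.append(t)
    return times

def to_clock(minute):
    """Format a minute-of-day value as H:MM."""
    return f"{minute // 60}:{minute % 60:02d}"

# The paper's example: five buses, all headways 10 min, horizon starts at 8:00.
departures = [to_clock(m) for m in decode_chromosome([10] * 5, 8 * 60)]
# departures == ['8:10', '8:20', '8:30', '8:40', '8:50']
```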
(2) Population Initialization. When the population is initialized, each gene Hi of each chromosome is randomly generated between the minimum and maximum headway. To ensure that the candidate solutions satisfy the minimum and maximum headway constraints, and that the departure time of the last bus in the planning horizon remains unchanged, a chromosome repair strategy is designed; the repair method is detailed below.
In practice, the solution of the first planning cycle obtained by the GA is often close to the original departure schedule. Thus, the bus company's original departure schedule is also added to the initial population to speed up GA convergence.
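The initialization step described above can be sketched as follows (function and parameter names are assumptions, not from the paper): random headways within the allowed bounds, with the operator's current schedule seeded into the population.

```python
import random

def init_population(pop_size, n_buses, h_min, h_max, original_schedule=None):
    """Randomly generate headway chromosomes within [h_min, h_max];
    optionally seed the operator's existing schedule to speed up convergence."""
    population = [[random.randint(h_min, h_max) for _ in range(n_buses)]
                  for _ in range(pop_size)]
    if original_schedule is not None:
        population[0] = list(original_schedule)  # seed the known schedule
    return population

pop = init_population(20, 5, 5, 15, original_schedule=[10, 10, 10, 10, 10])
```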
(3) Crossover. A uniform crossover [30] method is used here; a chromosome crossover diagram is shown in Figure 3. First, two parent chromosomes are randomly selected from the population, and a 0–1 crossover mask of the same length as the parent chromosomes is randomly generated. Then, the genes of the two parents corresponding to the “1” elements of the crossover mask are swapped to form two child individuals.
Figure 3 depicts an example of the uniform crossover. In this example, twelve buses are to be scheduled, and each gene of the chromosome is a headway.
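The mask-based swap described above can be sketched as follows (function names are assumptions; in the paper's scheme the repair strategy is applied to the offspring afterwards):

```python
import random

def uniform_crossover(parent_a, parent_b, mask=None):
    """Swap genes at the '1' positions of a random 0-1 mask."""
    if mask is None:
        mask = [random.randint(0, 1) for _ in parent_a]
    child_a, child_b = list(parent_a), list(parent_b)
    for i, bit in enumerate(mask):
        if bit == 1:
            child_a[i], child_b[i] = child_b[i], child_a[i]
    return child_a, child_b

# Fixed mask for illustration: swap the 2nd and 4th genes.
c1, c2 = uniform_crossover([12, 8, 13, 7], [11, 9, 6, 14], mask=[0, 1, 0, 1])
# c1 == [12, 9, 13, 14], c2 == [11, 8, 6, 7]
```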
(4) Mutation. The uniform mutation method is used here. Given a chromosome (H1, H2, …, HM), the value of one randomly selected gene is increased by one, and the value of another randomly selected gene is decreased by one, so that the departure time of the last bus remains fixed.
For example, as shown in Figure 4, the second and fourth gene bits of the parent chromosome are selected. After mutation, the fourth gene decreases by one while the second increases by one, so the sum of the headways of the offspring chromosome is the same as that of the parent chromosome.
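The sum-preserving mutation above can be sketched as follows (names are illustrative): adding one to one gene and subtracting one from another leaves the headway sum, and hence the last departure time, unchanged.

```python
import random

def mutate(chromosome):
    """Uniform mutation: +1 on one random gene, -1 on another,
    preserving the total headway sum."""
    child = list(chromosome)
    i, j = random.sample(range(len(child)), 2)  # two distinct positions
    child[i] += 1
    child[j] -= 1
    return child

parent = [10, 12, 9, 11]
child = mutate(parent)
assert sum(child) == sum(parent)  # last bus's departure time is fixed
```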
(5) Fitness Function. The design of the GA fitness function directly determines the quality of the solution. To reasonably represent the constraints in the model and obtain a better solution, a penalty function is introduced here, as shown in (27).
where F is the fitness value, w is the maximum regret value, and g is the coefficient of the penalty term. The second term in (27) is the penalty term, which represents the regret value constraint in the model.
Zs(x) represents the objective function value in the three scenarios. The function expression is the same for each scenario s; however, the passenger arrival rate differs by scenario, as shown in (28):
The sum of the first three terms in (28) is the total waiting time in each scenario. It can be observed from the objective function that the smaller this value, the better the solution. However, to ensure that the solution reflects the actual situation, a penalty function is added to penalize solutions that do not satisfy the constraint conditions; such solutions are gradually eliminated by the GA.
Therefore, the fourth term of (28) is a penalty term for solutions that violate the overtaking constraint, and α is the penalty factor. During chromosome initialization, the headway range is limited, which already guarantees that the headways satisfy the maximum and minimum interval constraints. This reduces the number of penalty terms needed in (28) and simplifies the solution process.
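The structure of the penalized fitness can be sketched as follows. This is an illustrative skeleton only: the paper's equations (27)–(28) are not reproduced in the text, so the waiting-time and violation evaluators are placeholder callables, and the coefficient values are arbitrary.

```python
def fitness(chromosome, total_waiting, regret_violation, overtaking_violation,
            g=1e3, alpha=1e3):
    """Penalized fitness: scenario waiting time plus penalty terms.
    g penalizes exceeding the maximum regret value w;
    alpha penalizes violations of the overtaking constraint."""
    return (total_waiting(chromosome)
            + g * regret_violation(chromosome)
            + alpha * overtaking_violation(chromosome))

# Toy usage with dummy evaluators: a feasible solution incurs no penalty.
f = fitness([10] * 5,
            total_waiting=lambda c: float(sum(c)),
            regret_violation=lambda c: 0.0,
            overtaking_violation=lambda c: 0.0)
# f == 50.0
```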
(6) Selection. Selecting the offspring is the key step in the GA iteration. The common roulette-wheel method is used here. Since the model is a minimization problem, the smaller an individual's fitness value, the higher its survival probability; conversely, the larger the fitness value, the lower the survival probability. The formula for the individual selection probability is shown in (29).
The cumulative probability is calculated as shown in (30). During selection, a random number r in [0, 1] is generated; if qi−1 < r ≤ qi, individual i is selected.
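The roulette-wheel step can be sketched as follows. Because the paper's (29) is not reproduced in the text, the reciprocal of the fitness is used here as one common way (an assumption) to turn a minimization fitness into a selection weight.

```python
import random

def roulette_select(population, fitnesses, rng=random.random):
    """Roulette-wheel selection for minimization: smaller fitness,
    larger selection probability."""
    weights = [1.0 / f for f in fitnesses]   # invert: small fitness -> big weight
    total = sum(weights)
    probs = [w / total for w in weights]     # selection probabilities, cf. (29)
    r, q = rng(), 0.0
    for individual, p in zip(population, probs):
        q += p                               # cumulative probability, cf. (30)
        if r <= q:
            return individual
    return population[-1]                    # guard against rounding error

# With r = 0, the first (fitter) individual's cumulative slot is hit.
best = roulette_select([[10, 10], [5, 15]], [50.0, 80.0], rng=lambda: 0.0)
```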
(7) Stopping Criteria. The maximum number of iterations is set as the GA stopping condition. When the iteration count exceeds the set value, the GA stops searching and outputs the best solution found so far as the final result. Alternatively, if the optimal fitness value does not change for many generations, the corresponding solution can also be taken as the final result.
4.2. Improved Genetic Algorithm
To further improve the solution quality, an improved GA is introduced. It mainly includes a chromosome repair strategy, an elite retention strategy, and a memory-based initialization method using a decision database.
(1) Chromosome Repair Strategy. To guarantee that the departure time of the last bus is unchanged, a chromosome repair strategy is introduced. Since each gene bit of the chromosome represents a headway, the chromosome repair formula is shown in (31).
where Tsp is a fixed value representing the length of the original time window, i.e., the sum of the headways of all buses to be dispatched. Hi is the unrepaired gene value of the offspring chromosome generated after crossover, while Hi′ is the repaired value of the corresponding gene bit. For example, when Tsp is 40 and the two parent chromosomes are (12, 8, 13, 7) and (11, 9, 6, 14), the child chromosomes generated after crossover may be (12, 12, 12, 14) and (9, 11, 5, 5). Assuming the planning horizon starts at 6:00, the departure times of the last buses of both parent chromosomes are 6:40; however, the departure times of the last buses of the two offspring chromosomes are 6:50 and 6:30, respectively. After repair, both offspring chromosomes become (10, 10, 10, 10), which meets the requirement.
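A sum-preserving repair can be sketched as follows. The paper's exact formula (31) is not reproduced in the text; proportional rescaling with integer rounding is one plausible form (an assumption), so this sketch guarantees the repaired headways sum to Tsp but need not reproduce the paper's exact example output.

```python
def repair(headways, tsp):
    """Rescale offspring headways so their sum equals the fixed
    horizon length tsp, keeping all genes integer."""
    total = sum(headways)
    scaled = [h * tsp // total for h in headways]  # integer-scaled genes
    remainder = tsp - sum(scaled)                  # minutes lost to flooring
    for i in range(remainder):
        scaled[i % len(scaled)] += 1               # redistribute the remainder
    return scaled

# Offspring from the example: sums 50 and 30 are both restored to Tsp = 40.
child_a = repair([12, 12, 12, 14], 40)
child_b = repair([9, 11, 5, 5], 40)
```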
(2) Elite Retention Strategy. Elite individuals are the best individuals found so far by the GA, and their chromosome codes are feasible for the problem. The advantage of the elite retention strategy is that the best genes found so far cannot be lost or destroyed by the selection, crossover, and mutation operations during population evolution, which greatly improves the GA's global convergence ability. In each iteration, the newly generated individuals are sorted by fitness value, and the individual corresponding to the best solution of the current population is retained as a parent for the next generation, where it participates in the crossover and mutation processes. This strategy strengthens the global search and optimization ability of the GA and helps to obtain a better solution.
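The elitism step can be sketched as follows (function names are illustrative): after evaluating a generation, the best individual is copied unchanged into the next population so the genetic operators cannot destroy it.

```python
def apply_elitism(population, fitnesses, next_population):
    """Copy the best individual (smallest fitness, since the model is a
    minimization problem) into one slot of the next generation."""
    ranked = sorted(zip(fitnesses, population), key=lambda pair: pair[0])
    elite = list(ranked[0][1])          # best individual of this generation
    next_population = list(next_population)
    next_population[0] = elite          # overwrite one slot with the elite copy
    return next_population

new_pop = apply_elitism([[10, 10], [5, 15]], [50.0, 80.0], [[7, 13], [9, 11]])
# new_pop[0] == [10, 10]
```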
(3) Decision Database Introduction. A decision-making library is introduced to improve efficiency. When the environment changes, the state information of the original environment is saved in the decision-making library; this can be solution information, parameters, or intermediate variables of the solving process. When the algorithm restarts, the original state information is read from the decision-making library, and the search for the optimal solution continues from that state, which greatly improves the solution efficiency.
At the beginning of the next planning horizon, the initial population is divided into two parts: one part is randomly initialized, and the other consists of individuals extracted from the best-solution repository. This method preserves some of the best individuals of the last generation and combines them with newly generated random individuals. It not only accelerates convergence but also guarantees a larger search space. Most importantly, it more accurately simulates the whole process of a dynamic bus network.
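The warm-start initialization described above can be sketched as follows (names are assumptions): part of the new population is drawn from the archive of best solutions saved in the decision-making library, and the rest is generated randomly.

```python
import random

def warm_start_population(pop_size, n_buses, h_min, h_max, archive,
                          n_from_archive):
    """Mix archived elite solutions with fresh random chromosomes."""
    population = [list(ind) for ind in archive[:n_from_archive]]
    while len(population) < pop_size:
        population.append([random.randint(h_min, h_max)
                           for _ in range(n_buses)])
    return population

archive = [[10, 10, 10, 10], [12, 8, 11, 9]]   # elites saved from the last horizon
pop = warm_start_population(10, 4, 5, 15, archive, n_from_archive=2)
```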