A Novel Multi-Objective Harmony Search Algorithm with Pitch Adjustment by Genotype

In this work, a new version of the Harmony Search algorithm for solving multi-objective optimization problems, MOHSg, with pitch adjustment using genotype, is proposed. The main contribution consists of adjusting the pitch using the crowding distance by genotype, that is, the distancing in the search space. This adjustment automatically regulates the exploration–exploitation balance of the algorithm, based on the distribution of the harmonies in the search space during the formation of Pareto fronts. Therefore, MOHSg only requires the presetting of the harmony memory accepting rate and pitch adjustment rate for its operation, avoiding the use of a static bandwidth or dynamic parameters. MOHSg was tested on diverse test functions, and it was able to produce results similar to or better than those generated by algorithms that constitute Harmony Search variants representative of the state-of-the-art in multi-objective optimization with HS.


Introduction
Most real engineering problems are multi-objective in nature, since they commonly present several objective functions to be optimized that conflict with each other; that is, the improvement of one produces the deterioration of another. Therefore, without a priority among the functions, a unique solution cannot be determined as in single-objective optimization. Instead, multi-objective optimization seeks to obtain the best compromises. The best compromise solutions are known as non-dominated and form the Pareto-optimal front [1].
Harmony Search (HS) is a metaheuristic algorithm developed by Geem et al. [2], emulating musical improvisation. It was proposed as a mono-objective algorithm and has been successfully applied in practical and scientific problems [3][4][5][6]. HS integrates three resources for the quantitative optimization process: use of a harmonic memory (HM), pitch adjustment, and randomization [7]. The implementation of multi-objective HS is a growing trend in scientific research, and its use for solving highly complex problems is considered a future challenge [8]. The first multi-objective cases solved by HS [9][10][11] were engineering problems treated with the weighting function method. In [11,12], the authors began to use the term Pareto front, without explicitly establishing a multi-objective HS algorithm. In [13], Xu et al. developed for the first time a multi-objective HS to generate a Pareto-optimal front of five points for a robotics model. Ricart et al. [14] established two formal multi-objective proposals known as MOHS1 and MOHS2. Both of them are very similar to the single-objective form of HS but with a fundamental difference in the ranking of the harmonies. Sivasubramani et al. [15] presented a proposal similar to MOHS2 (denominated MOHS3 in this paper) that incorporates the crowding distance and the dynamic pitch adjustment proposed by Mahdavi et al. [16]. Nekooei et al. [17] developed a multi-objective memetic algorithm by combining HS and PSO. However, it greatly differs from the original structure of HS.
In its original structure, HS is very effective at identifying promising areas in the solution space (exploration) but presents difficulties in carrying out refined searches (exploitation). For this reason, it is necessary to improve the balance between exploration and exploitation. There are two trends in this sense [18]. The first one is based on the synergistic combination of HS with other metaheuristic algorithms. In [19][20][21], PSO is applied to give a social component to the pitch adjustment. In [22,23], the authors proposed hybrid algorithms to improve the exploitation in HS by combining it with Artificial Bee Colony (ABC) and Stochastic Hill Climbing. However, these alternatives are often disruptive to the conceptual essence of HS. The second stream focuses on improving the pitch adjustment [24,25]. In [26], pitch adjustments with fixed parameters are proposed, using bandwidths that are proportional to the range of each variable. On the other hand, in [7,16,27], the authors propose the use of dynamic parameters to intensify exploitation as the algorithm execution advances. Gupta [28] proposed an HS variant using non-linear dynamic parameters with a Gaussian distribution for the Harmony Memory Accepting Rate (HMCR) and a bandwidth with a social component. In the case of multi-objective HS, only the conventional mono-objective pitch adjustment and the dynamic pitch adjustment of Mahdavi et al. [16] have been implemented, both adopted on the basis of their effectiveness for single-objective problems. However, Pareto fronts can be generated for different dispersion degrees of the search space, also known as the decision space [29]. Currently, most multi-objective HS proposals lack an analysis of the behavior of harmonies in the search space during the formation of Pareto fronts.
In this paper, the Multi-Objective HS algorithm MOHSg is proposed. The main contribution consists in the pitch adjustment using the crowding distance by genotype, that is, the distancing in the search space. This adjustment automatically regulates the exploration–exploitation balance of the algorithm, based on the distribution of the harmonies in the search space during the formation of Pareto fronts. Therefore, MOHSg only requires the presetting of the HMCR and the Pitch Adjustment Rate (PAR) for its operation, avoiding the use of the static bandwidth parameter required by the MOHS2 variant, and the dynamic parameters of bandwidth and PAR needed in MOHS3. The proposed algorithm was evaluated with a set of well-known test functions, generating similar or better results than those obtained by the MOHS2 and MOHS3 algorithms. The work is structured as follows: Section 2 presents the pitch adjustment variants reported in the literature, the rankings necessary for the multi-objective approach with a Pareto-optimal front, and the algorithms MOHS2 and MOHS3, while the crowding distancing by genotype and the pitch adjustment are described in Section 3. The proposed MOHSg algorithm is detailed in Section 4, and in Section 5, the experimentation and discussion of the results are carried out. Section 6 corresponds to the final discussion of the work, and Appendix A presents the test functions applied for the algorithm evaluation.

Non-Disruptive Multi-Objective HS Algorithms
The objective of this section is to present the MOHS2 and MOHS3 algorithms since they are representative alternatives for multi-objective HS. These algorithms will be used in a comparative analysis with the new MOHSg proposal, since in all cases the original conception of HS is preserved (non-disruptive algorithms), and the differences lie entirely in the pitch adjustment. First, it is necessary to review the pitch adjustment variants reported in the literature and their impact on the multi-objective approach, as well as the description of the ranking that is required for the Pareto-optimal front.

Pitch Adjustment Variants
Several pitch adjustment techniques have been proposed for HS [20]. In the original version, the pitch adjustment for a new variable x_i that belongs to a solution vector x = (x_1, x_2, x_3, . . . , x_M) with M design variables is given by expression (1):

x_i^new = HM(r, i) + rand(−1, 1) · bw with probability PAR; x_i^new = HM(r, i) otherwise, (1)

where N is the number of harmonies; r is a random integer from 1 to N; rand(−1, 1) is a random number generated between −1 and 1; bw is the bandwidth; and PAR is the pitch adjustment rate. PAR and bw are preset to fixed values. This scheme is used by Ricart et al. [14] in the MOHS2 algorithm, where diverse problems are solved with fixed values of PAR and bw that are proportional to the range of each variable. Tuo et al. [26] proposed one of the most used pitch adjustment methods with fixed parameters, with a fixed value of PAR and a value of bw for each variable given by Equation (2), where L_b and U_b are the lower and upper limits of the specific variable, respectively. Mahdavi et al. [16] presented an improvement to HS through the linear increase of PAR and the exponential decrease of bw in every iteration, as shown in Equations (3) and (4):

PAR(g) = PARmin + (PARmax − PARmin) · g / NI, (3)
bw(g) = bwmax · exp( (g / NI) · ln(bwmin / bwmax) ), (4)

where PARmin and PARmax are the minimum and maximum values of PAR, respectively; g is the current iteration; NI is the total number of iterations; and bwmax and bwmin are the maximum and minimum bandwidth, respectively. This encourages exploration in the early iterations of the algorithm and the subsequent intensification of exploitation. This scheme was applied by Sivasubramani et al. [15] in the development of MOHS3.
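The fixed and dynamic schemes above can be summarized in a minimal Python sketch. The function names and the uniform random step are illustrative; the PAR and bw schedules follow the linear increase and exponential decrease of Equations (3) and (4):

```python
import math
import random

def pitch_adjust_fixed(x_i, bw, par):
    """Classic HS pitch adjustment: with probability PAR, perturb the
    memory-drawn value by a random step inside a fixed bandwidth bw."""
    if random.random() < par:
        return x_i + random.uniform(-1.0, 1.0) * bw
    return x_i

def dynamic_parameters(g, NI, par_min, par_max, bw_min, bw_max):
    """Mahdavi-style schedules: PAR grows linearly with iteration g, while
    bw decays exponentially from bw_max to bw_min over NI iterations."""
    par = par_min + (par_max - par_min) * g / NI
    bw = bw_max * math.exp((g / NI) * math.log(bw_min / bw_max))
    return par, bw
```

At g = 0 the schedules return (PARmin, bwmax) and at g = NI they return (PARmax, bwmin), which is the exploration-first behavior described above.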
Portilla-Flores et al. [7] proposed a fixed PAR, and a bw that decreases exponentially depending on the range of each variable, as shown in Equation (5), where a is a positive constant in the range from 0 to 1. It generates a bw value for each variable, eliminating the parameters required in [16].
In [27], Wang and Huang presented a pitch adjustment with decreasing PAR, with PARmin set to 0 and PARmax set to 1, as calculated in Equation (6). The pitch adjustment for x_i is given by Equation (7). The self-adaptive bandwidth bw is based on harmonic memory awareness, that is, it depends on the extreme values of HM, where min(HM)_i and max(HM)_i are the minimum and maximum values of the ith variable in HM, respectively. Thus, the selection of the PAR and bw values is eliminated.
As can be seen, only two proposals for pitch adjustment have been brought to the multi-objective extensions of HS. In the case of the fixed pitch adjustment proposal, it is evident that in both single-objective and multi-objective optimization it has the disadvantage of a fixed exploration–exploitation balance during the entire execution. The proposals for pitch adjustment with dynamic bandwidth are based on the assumption that, as iterations pass, promising areas will be formed and exploitation will become more important. However, in the case of multi-objective problems, this behavior does not necessarily occur. Therefore, a proposal for pitch adjustment based on the real harmony distribution in the search space is required, such as the pitch adjustment by genotype.

Ranking
The main difference between the mono-objective HS and the multi-objective proposals is the ranking of harmonies. In mono-objective problems, the ranking of solutions by their quality is based on the objective function. However, there are a number of ranking strategies to drive the generation of higher-quality Pareto fronts for multi-objective problems [30]. MOHS2 uses a non-dominated ranking, while MOHS3 and the proposed MOHSg algorithm apply the ranking presented by Deb et al. [31], which consists of:
1. Non-dominated ranking.
2. Ranking based on crowding distance by phenotype.
The non-dominated ranking classifies the solutions depending on their level of dominance, while the ranking by crowding is used as a second criterion, related to the overcrowding of the solutions within the same front.

Non-Dominated Ranking
Given two solutions x_1 and x_2, x_1 dominates x_2 if and only if x_1 is not worse than x_2 for every objective function, and x_1 is strictly better than x_2 in at least one objective function. For the minimization case, this is equivalent to expression (8):

f_k(x_1) ≤ f_k(x_2) for every k = 1, . . . , Nof, and f_k(x_1) < f_k(x_2) for at least one k, (8)

where Nof is the number of objective functions.
If x_2 is not dominated by x_1 and x_1 is not dominated by x_2, then the solutions are non-dominated with respect to each other. The number of times that a solution x_i is dominated by other solutions corresponds to its dominance level. The different dominance levels, that is, the different Pareto fronts, can be obtained by a non-dominated ranking [31][32][33].
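The dominance test above can be written directly; a minimal sketch for the minimization case (the function name is illustrative):

```python
def dominates(f1, f2):
    """True if objective vector f1 Pareto-dominates f2 (minimization):
    f1 is no worse in every objective and strictly better in at least one."""
    no_worse = all(a <= b for a, b in zip(f1, f2))
    strictly_better = any(a < b for a, b in zip(f1, f2))
    return no_worse and strictly_better
```

Two vectors are mutually non-dominated exactly when `dominates` is false in both directions.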
Algorithm 1 presents the non-dominated ranking of HM. The dominance level of each solution is saved in the vector Front, and the non-dominated elements have a Front value of 1 (optimal Pareto front).
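One simple (non-optimized) reading of the non-dominated ranking peels successive non-dominated layers; this O(n²)-per-front sketch is illustrative and does not reproduce the paper's exact pseudocode:

```python
def dominates(f1, f2):
    """Pareto dominance for minimization."""
    return (all(a <= b for a, b in zip(f1, f2))
            and any(a < b for a, b in zip(f1, f2)))

def non_dominated_ranking(objs):
    """Assign each solution a front number: front 1 holds the solutions
    dominated by nobody, front 2 those dominated only by front 1, etc."""
    n = len(objs)
    front = [0] * n
    remaining = set(range(n))
    level = 1
    while remaining:
        # solutions not dominated by any other remaining solution
        layer = [i for i in remaining
                 if not any(dominates(objs[j], objs[i])
                            for j in remaining if j != i)]
        for i in layer:
            front[i] = level
        remaining -= set(layer)
        level += 1
    return front
```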

Crowding Distance by Phenotype
The crowding distance provides an estimate of the density of solutions surrounding a given solution. It is used in multi-objective optimization to increase diversity, giving priority to the most dispersed solutions (those with the highest crowding value). In other words, it encourages exploration in sparsely populated areas. It constitutes a second criterion for the ranking, since it orders the solutions of the same front decreasingly with respect to the crowding distance. When this distance operates in the space of the objective functions, it stimulates dispersion on the Pareto fronts and is known in the field of bio-inspired algorithms as phenotype crowding [29,34]. It is expressed in Equation (9):

Cr(i) = Σ_{k=1..Nof} ( f_k(i+1) − f_k(i−1) ) / ( f_k^max − f_k^min ), (9)

where f_k(i+1) and f_k(i−1) are the values of the kth objective function for the neighbors of solution i once the front has been sorted by that objective. Algorithm 2 corresponds to the crowding distance by phenotype, where Cr is the vector of crowding distances, In is the position index of the ranked objective-function vector, Nof is the number of objective functions, and Fo is the matrix of values of the objective functions. Note that the crowding distance operates between solutions of the same Pareto front; that is, the crowding of the solutions of the first front only takes into account the solutions of the first front, and so on. The extreme solutions of each front are assigned infinite crowding values.
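A sketch of the phenotype crowding computation, assuming the usual normalized-gap form of Equation (9); the names `objs` and `members` are illustrative:

```python
def crowding_distance(objs, members):
    """Crowding distance of the solutions `members` of one front, computed
    objective by objective in the (phenotype) objective space."""
    dist = {i: 0.0 for i in members}
    n_obj = len(objs[0])
    for k in range(n_obj):
        order = sorted(members, key=lambda i: objs[i][k])
        dist[order[0]] = dist[order[-1]] = float('inf')  # extreme values
        span = objs[order[-1]][k] - objs[order[0]][k]
        if span == 0:
            continue
        # each interior solution accumulates its normalized neighbor gap
        for a, i in enumerate(order[1:-1], start=1):
            dist[i] += (objs[order[a + 1]][k] - objs[order[a - 1]][k]) / span
    return dist
```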

MOHS2 and MOHS3 Algorithms
The MOHS2 and MOHS3 variants retain the original conception of the HS algorithm. Both alternatives were developed in a multi-objective way through the rankings described in the previous section. Both algorithms generate a random harmonic memory HM1 in the first iteration and, by means of the memory usage, pitch adjustment and randomization operators, they generate a new harmonic memory HM2 with the same dimensions as the initial one. Both memories form an extended matrix HM1-2 that is ranked and truncated in half to form the HM1 for the next iteration.
Algorithm 3 corresponds to MOHS2 [14]. Steps 1-4 constitute the initialization of the algorithm. The main cycle covers steps 5-14, where the new harmonic memory HM2 is generated. Although MOHS2 uses only non-dominated ranking, in this work, the ranking proposed by Deb et al. [31] (step 15) is applied, because of the improvements experienced in the conformation of the fronts with the crowding distance criterion. Finally, the harmonic memory HM1 of the next iteration is formed in step 16. The MOHS3 pseudo-code [15] is shown in Algorithm 4. Steps 1-4 constitute its initialization (note that MOHS3 uses dynamic parameters in pitch adjustment according to Mahdavi et al. [16]). The main cycle covers steps 5-17, where the new harmonic memory HM2 is generated. As mentioned before, the criterion of Deb is used to rank HM1-2 in step 18. Finally, the harmonic memory HM1 of the next iteration is formed in step 19.

Proposed Pitch Adjustment by Genotype
The proposed pitch adjustment is inspired by genotype crowding distancing, which is also used in multi-objective optimization. These two operations are described below.

Crowding Distance by Genotype
When the crowding distance operates in the space of the decision variables, it encourages dispersion in the search space. In the field of bio-inspired algorithms, this operation is known as genotype crowding [29,34], and is expressed in Equation (10):

Cr(i) = Σ_{j=1..M} ( x_j(i+1) − x_j(i−1) ) / ( x_j^max − x_j^min ), (10)

where M is the number of design variables. It is important to note that the proximity of two solutions in the objective space does not necessarily imply proximity in the search space, as shown in Figure 1. Pareto fronts can be generated for different dispersion degrees of the search space. In [29], a deep analysis of multi-objective problems solved by NSGA with crowding by genotype and phenotype can be found. The crowding distance by genotype is determined in a very similar way to the crowding by phenotype described above, with the difference that it is calculated over the decision variables, as described in Algorithm 5. The number of objective functions Nof is substituted by the number of variables M in step 7, while in steps 8-10 and 13, the matrix of objective functions is replaced by HM.
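As the paper notes for Algorithm 5, the genotype version is obtained by running the same crowding computation over HM instead of the objective matrix Fo. A self-contained sketch (names are illustrative):

```python
def crowding_by_genotype(hm, members):
    """Crowding distance computed over the decision variables (genotype):
    the phenotype routine with the objective matrix replaced by HM."""
    dist = {i: 0.0 for i in members}
    m = len(hm[0])  # number of design variables M
    for j in range(m):
        order = sorted(members, key=lambda i: hm[i][j])
        dist[order[0]] = dist[order[-1]] = float('inf')  # extreme values
        span = hm[order[-1]][j] - hm[order[0]][j]
        if span == 0:
            continue
        for a, i in enumerate(order[1:-1], start=1):
            dist[i] += (hm[order[a + 1]][j] - hm[order[a - 1]][j]) / span
    return dist
```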

Pitch Adjustment by Genotype
According to the previous subsection, the Pareto-optimal front can be reflected in the search space both by dispersed solutions and by close solutions that form promising areas. This can be determined by means of the crowding distance by genotype, opening two possibilities:
1. Intensify exploitation as the algorithm advances, but this time depending on the formation of promising areas.
2. Maintain higher exploration in dispersed solutions and higher exploitation in promising areas, with the objective of achieving a better balance of the algorithm, based on the behavior of HM during the formation of the Pareto-optimal front.
Therefore, a pitch adjustment based on the crowding distance by genotype is proposed, as established in expression (11), where bw is obtained from Equation (12):

x_i^new = HM(r, i) + rand(−1, 1) · bw(r, i), (11)
bw(r, i) = ( x_i(up) − x_i(down) ) / ( x_i^max − x_i^min ), (12)

where x_i(up) and x_i(down) are the values of x_i in the solutions neighboring r within its front, once the front has been sorted by x_i. For the upper and lower extremes of each front, bw is given by Equations (13) and (14), respectively. Since the bandwidth bw(r, i) is the crowding distance component of the variable x_i for the solution vector in position r, Equation (12) includes no summation operator; that is, bw(r, i) corresponds to a single term of the genotype crowding sum of Equation (10) (Equation (15)). Suppose that a variable x_2 of the new harmony is created by pitch adjustment, and the parameter r is randomly generated with a value of 4. The substitution of these values in expression (11) produces Equation (16) for calculating x_2. Solution vector 4 is part of Front 1. Since the analyzed variable is x_2 = HM(4, 2) = 5, the neighboring points are 7 and 1, from harmonies 1 and 2, respectively. Equations (17) and (18) are generated by substituting these values in Equations (12) and (16). Note that solution vector 3 is not considered because it belongs to Front 2.
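Under the reading of Equation (12) as the single normalized neighbor gap of variable x_i within the front of solution r, the proposed adjustment can be sketched as follows. This is an illustrative sketch only: the treatment of the front extremes in Equations (13) and (14) is replaced here by a simple full-span fallback, and all names are assumptions:

```python
import random

def genotype_bandwidth(hm, front_members, r, i):
    """Illustrative reading of Eq. (12): for solution r and variable i, the
    gap between the neighboring values of that variable within r's front,
    normalized by the variable's span in the front."""
    order = sorted(front_members, key=lambda s: hm[s][i])
    pos = order.index(r)
    span = (hm[order[-1]][i] - hm[order[0]][i]) or 1.0
    if pos == 0 or pos == len(order) - 1:
        # extremes are handled by Eqs. (13)-(14) in the paper;
        # the full span is used here as a placeholder
        return span
    return (hm[order[pos + 1]][i] - hm[order[pos - 1]][i]) / span

def pitch_adjust_genotype(hm, front_members, r, i):
    """Expression (11): perturb HM(r, i) by a random step scaled by the
    genotype crowding component of that variable."""
    bw = genotype_bandwidth(hm, front_members, r, i)
    return hm[r][i] + random.uniform(-1.0, 1.0) * bw
```

On the worked example above (value 5 with neighbors 7 and 1 in a front spanning 1 to 7), this sketch yields bw = (7 − 1)/6 = 1.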

Proposed HS (MOHSg)
The effectiveness of metaheuristic algorithms is driven by two fundamental components: exploration and exploitation. The proper balance between these two components greatly influences the algorithm's efficiency. Highly exploitation-focused algorithms explore only a fraction of the search space and tend to become stuck in local optima. On the other hand, highly exploration-focused algorithms converge very slowly, and the solution time can be very long.
HS proposals with a fixed exploration–exploitation balance are unable to adjust their behavior as required by the problem. On the other hand, the proposals with variable parameters do not consider the distribution of solutions in the search space, which, as seen above, can have different configurations during the formation of the Pareto-optimal front. Instead, the proposal presented in this paper is capable of balancing exploration and exploitation by adjusting the pitch, based on the distribution of the solutions in the search space. Therefore, it does not require parameters such as the static bandwidth used in MOHS2, nor dynamic parameters such as those required by MOHS3.

Description
The main differences between MOHSg (described by Algorithm 6) and the original mono-objective HS version lie in the HM ranking and in the pitch adjustment operation by genotype. The algorithm requires the presetting of the harmonic memory consideration (HMCR) and pitch adjustment (PAR) parameters. The execution starts by generating a random initial harmonic memory HM1. In each iteration, a new harmonic memory HM2 is generated with the same dimensions as the initial one; it is built variable by variable by applying the memory usage, pitch adjustment by genotype, and randomization operators. Both memories are combined to form an extended matrix HM1-2 that is ranked according to the ranking criteria of Deb et al. [31]. Finally, the ranked matrix is truncated in half to form the HM1 of the next iteration.
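One MOHSg iteration, as described above, can be sketched as follows. The ranking and the genotype pitch adjustment are passed in as placeholders, and all names are illustrative:

```python
import random

def mohsg_step(hm1, objs_fn, hmcr, par, bounds, rank_fn, pitch_fn):
    """One MOHSg iteration (sketch): HM2 is built variable by variable with
    memory consideration, pitch adjustment by genotype and randomization;
    the stacked matrix HM1-2 is then ranked and truncated in half."""
    n, m = len(hm1), len(hm1[0])
    hm2 = []
    for _ in range(n):
        new = []
        for i in range(m):
            if random.random() < hmcr:
                r = random.randrange(n)           # memory consideration
                xi = hm1[r][i]
                if random.random() < par:
                    xi = pitch_fn(hm1, r, i)      # pitch adjustment by genotype
            else:
                lo, hi = bounds[i]
                xi = random.uniform(lo, hi)       # randomization
            new.append(xi)
        hm2.append(new)
    merged = hm1 + hm2                            # extended matrix HM1-2
    ranked = rank_fn(merged, objs_fn)             # non-domination + crowding
    return ranked[:n]                             # HM1 for the next iteration
```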

Experimentation and Results
In this section, the MOHS2, MOHS3 and MOHSg algorithms are used to solve six problems reported in the literature [35,36], designated as P1 to P6, that are specifically designed for measuring the performance of multi-objective optimization algorithms. The problems were selected taking into account the variety of their characteristics, such as (I) the shape of the Pareto-optimal fronts (PFs) and the Pareto-optimal sets (PSs), (II) the coexistence of local and global PSs (multi-modality), and (III) the number of decision variables and objective functions. In all the problems presented, it is possible to determine the real PS and the real PF, which allows the evaluation of the results obtained by each algorithm. The algorithms were programmed in Matlab R2018a on a Windows 10 platform. Computational experiments were performed on a PC with a 2.67 GHz Intel(R) Core (TM) i7 processor and 8 GB of RAM. The test problems are included in Appendix A.

Performance Indicators
Every problem was solved with 20 independent runs of each algorithm. An execution includes 10,000 evaluations of the objective function, with the exception of problem P6, which required 20,000 evaluations per run. The number of objective-function evaluations measures the computational cost. For the graphical assessment of the results, the PSs and PFs of the 20 executions are integrated. In each figure, the number of solution vectors that make up each front is specified. In addition to the graphical assessment, the performance indicators described in the following subsections were applied for the quantitative comparison of the results. The performance indicators were calculated for each execution and presented through their mean value (average), best value (best), worst value (worst), and standard deviation (Std. Dev). The best results are highlighted. Considering that metrics can be misleading in multi-objective optimization, according to Coello and Cortés [37], it is illuminating to consider two main factors: (I) whether the solutions belong to the real PF, and (II) how uniform the distribution of solutions along the Pareto front is.

Error Ratio, ER
This parameter (Equation (19)) was proposed by Van Veldhuizen [38] to indicate the percentage of solutions among the generated non-dominated vectors that are not in the real PF:

ER = ( Σ_{i=1..n} e_i ) / n, (19)

where n is the number of non-dominated solution vectors that were generated, and e_i is 0 if vector i belongs to the real PF and 1 otherwise. The ideal value is ER = 0, meaning that every vector generated by the algorithm belongs to the real PF.
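A sketch of ER, assuming membership in the real PF is tested exactly; in practice a tolerance or a dominance test against the real PF would be used:

```python
def error_ratio(generated, true_front):
    """ER of Eq. (19): share of generated vectors that do not belong to the
    real PF (exact membership test; illustrative)."""
    members = {tuple(t) for t in true_front}
    e = [0 if tuple(v) in members else 1 for v in generated]
    return sum(e) / len(e)
```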

Spacing, S
This indicator was proposed by Schott [39] as a way to measure the variance of the distances between neighboring vectors in the PF. It is calculated by Equations (20) and (21):

S = sqrt( (1/(n−1)) · Σ_{i=1..n} ( d̄ − d_i )² ), (20)
d_i = min_{j≠i} Σ_{k=1..Nof} | f_k^i − f_k^j |, with i, j = 1, . . . , n, (21)

where d̄ is the mean of all d_i.
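Schott's spacing can be sketched as follows, assuming the usual L1 nearest-neighbor distance for d_i (names are illustrative):

```python
import math

def spacing(front):
    """Schott's spacing S (Eqs. 20-21): standard-deviation-like measure of
    the gaps d_i between each vector and its nearest neighbor; 0 means a
    perfectly uniform spacing."""
    n = len(front)
    d = [min(sum(abs(a - b) for a, b in zip(p, q))
             for j, q in enumerate(front) if j != i)
         for i, p in enumerate(front)]
    d_bar = sum(d) / n
    return math.sqrt(sum((d_bar - di) ** 2 for di in d) / (n - 1))
```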

Inverted Generational Distance, GD
This indicator was introduced by Van Veldhuizen and Lamont [40] as a way of estimating how far the elements of the PF obtained by the algorithm are from the real PF. It is represented by Equation (22):

GD = ( Σ_{i=1..n} d_i² )^(1/2) / n, (22)

where n is the number of generated vectors, and d_i is the Euclidean distance between each generated vector and the nearest member of the real PF.
Coello and Cortés [37] recommend using the real PF as a reference; that is, each vector of the real PF is measured against its nearest neighbor in the obtained PF to avoid measurement problems when the generated front has few members. This is known as the inverted generational distance. In this work, GD was used both in the search space (GDx) and in the space of the objective functions (GDf).
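The inverted form recommended by Coello and Cortés can be sketched as follows, averaging the distances from the real PF to the generated front (the exact aggregation in Equation (22) may differ; this sketch uses the plain mean):

```python
import math

def igd(true_front, generated):
    """Inverted generational distance: mean Euclidean distance from each
    point of the TRUE front to its nearest generated point."""
    total = sum(min(math.dist(t, g) for g in generated) for t in true_front)
    return total / len(true_front)
```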

Parameter Tuning
In order to achieve a reasonably good performance for the three algorithms, several preliminary experiments were carried out, modifying their parameters until the best performances were obtained. For MOHS2 and MOHS3, the starting point was the parameters recommended in the original developments [14,15]. As can be verified in Table 1, for MOHS2, only the HMCR parameter was varied, while MOHS3 kept the parameters proposed in the original work. For the case of P6, which has three objective functions (many-objective), the parameters of the three algorithms were modified in order to obtain a better exploration, as shown in Table 2.

Problem P1
Figure 2 shows the integrated PFs resulting from the solution of problem P1, as well as its real PF (convex front). The legend indicates the number of points that make up each front. Note that all the algorithms converged to the solution of the real PF. However, in the detail view, points belonging to the three algorithms (especially the MOHS3 algorithm) appear that do not belong to the real PF; that is, they are dominated by that front. In the search space, it can be observed that the PS has an abrupt change in behavior (Figure 3). The algorithms found most of the solutions in the left region of the set, while on the right side, a higher dispersion is observed, especially for the proposed algorithm MOHSg. Table 3 shows the statistical analysis of the behavior of the algorithms for P1, where the best values are indicated in bold type. As can be seen from the error ratios, an average of 21.8% of the vectors obtained by MOHSg do not belong to the real PF, while for MOHS2 and MOHS3 they constitute 28% and 41%, respectively. Likewise, it can be observed that, in general, the PFs and PSs nearest to the real PF and PS were obtained by MOHS3, with the drawback that approximately half of its points are dominated by the real PF. The best distributed fronts correspond to the proposed MOHSg algorithm, followed by MOHS2 and MOHS3.
It is thus concluded that, for this test function, MOHS3 presented a poor performance because it produced the PFs with the lowest population and with the highest proportion of points dominated by the real PF. MOHSg had a slight superiority in both the error ratio and the uniform distribution with respect to MOHS2. However, the latter surpassed the proposed algorithm in terms of proximity to the real PS and PF. Thus, MOHS2 and MOHSg presented a similar performance for this problem.


Problem P2
In Figure 4, the concave PF generated for problem P2 can be seen. The most populated front corresponds to the MOHS2 algorithm, followed by MOHS3 and MOHSg, which are very close to each other. Note that in this problem, the concave front is made up of two PSs in the search space, as shown in Figure 5. The PFs and PSs obtained are more populated and better distributed than in P1, as can be seen in the results tabulated below. Table 4 presents the statistical analysis of the algorithms. It can be observed that the best error ratio corresponds to the MOHS2 algorithm with 7.65% (mean value), followed by MOHSg with 11.95% and MOHS3 with 12.83%. The fronts and sets nearest to the real PF and PS correspond to those obtained by MOHS3, followed by the results of MOHS2 and MOHSg. The best distribution of the PFs corresponds to MOHS2, followed by the proposed algorithm MOHSg and MOHS3. In this problem, MOHS2 produced the most populated integrated front, as well as the fronts with the lowest error ratio and the best uniform distribution of all the variants. Therefore, it can be concluded that MOHS2 had the best overall performance for this test function.

Problem P3
As can be seen in Figure 6, problem P3 has a local PF and a global PF, which drives the algorithms to be trapped in the local optimum. Both fronts are discontinuous and are divided into four solution regions. In Figure 7, the local and global PSs for this problem can be seen, divided into four regions and with a linear behavior. The most populated PFs corresponded to MOHSg and MOHS2 with 3919 and 3863 points, respectively, while MOHS3 obtained a much lower front with 1013 points. The mentioned figures show the convergence of the three algorithms to global solutions. The lowest error ratio in Table 5 corresponds to MOHSg with 1.43%, followed by MOHS2 with 2.95% and MOHS3 with an extremely unfavorable 77.9%. The fronts nearest to the real PF and PS also correspond to MOHSg, followed by MOHS2 and MOHS3. In the case of distribution, MOHSg had the best average performance, although the lowest standard deviation corresponds to MOHS2. In this problem, MOHSg clearly presented the best performance.

Problem P4
As in the previous case, problem P4 has a discontinuous front, but this time divided into 10 regions (Figure 8). In Figure 9, it can be seen that the PS is also divided into 10 linear regions, but in this problem, there is only one global front. As shown in Table 6, the three algorithms generated fronts of approximately 3900 points, with an error ratio of, at best, 1.38% for MOHSg, followed by 2.18% and 2.70% for MOHS3 and MOHS2, respectively. The fronts and sets nearest to the real PF and PS corresponded to MOHS3, followed by MOHSg and MOHS2. Similarly, the best distributed fronts were obtained using MOHS3, while MOHSg showed very close values.
In this case, the MOHS3 algorithm presented a fairly low error ratio, as well as the fronts and sets nearest to the real PF and PS. It also generated the best uniform distribution values of PF. Therefore, it is concluded that MOHS3 offered the best performance for this test function.

Problem P5
Unlike the previous cases, problem P5 has three decision variables. In addition, it presents two PSs (one local and one global) that make up a local and a global PF, as shown in Figures 10 and 11. The most populated front corresponds to the proposed algorithm MOHSg (3402 points), followed by MOHS2 (2987), while MOHS3 produced a significantly smaller front (1438). As can be seen in Table 7, the error ratio also differs drastically, with the best at 8.95% for MOHSg, followed by 18.58% for MOHS2, while in the worst case MOHS3 returned a value of 70%. The PF closest to the real one was obtained by MOHS2, closely followed by MOHSg, while the PS closest to the real one was obtained by MOHS3. The PFs with the best uniform distribution (mean value) were obtained by MOHSg.
For this problem, MOHSg obtained a much more populated front than MOHS2 or MOHS3, as well as a significantly lower error ratio. In Figure 11, it can be seen that MOHSg does not cover the entire real PS, which is also reflected in the GDx values. However, it has the best average value of uniform distribution in the PFs. Therefore, it can be concluded that MOHSg has the best overall performance for this problem.

Problem P6
The MMF14_a problem (P6) has three decision variables and three objective functions, making it a many-objective problem. As explained above, in this problem the number of evaluations was increased to 20,000 and the parameters of the three algorithms were tuned in order to improve the quality of the solutions. Note that this problem has two global PS surfaces that correspond to a single concave global PF (Figures 12 and 13). The generated PFs converge to the real PF, also covering the two surfaces that make up the PSs. The most populated front was obtained by MOHS3, followed by MOHS2 and MOHSg. The best error ratio corresponds to MOHS3 with 34.95%, while for MOHS2 and MOHSg the error ratios were 47% and 50%, respectively, as indicated in Table 8. The front closest to the real PF was also obtained by the MOHS3 algorithm, while the sets closest to the real PS were obtained by MOHSg. In the case of the uniform distribution of the PF, the best mean value corresponds to MOHS3, while the best single recorded value was obtained by MOHSg. Thus, MOHS3 produced the best solution, since it generated the most populated front of the three algorithms, as well as the lowest error ratio by a considerable margin. This algorithm also yielded the best mean GDf and uniform distribution values.

Final Discussion
In this work, a multi-objective HS algorithm (MOHSg) is proposed, whose fundamental contribution consists of the pitch adjustment based on the crowding distance by genotype, that is, the crowding distance that operates in the search space. This algorithm is capable of automatically adjusting exploration and exploitation through the pitch adjustment, based on the distribution of solutions in the search space during the formation of the Pareto front. Therefore, MOHSg only needs the presetting of the harmony memory accepting rate and pitch adjustment rate for its operation, without requiring the static bandwidth parameter of the MOHS2 variant nor the dynamic bandwidth and pitch adjustment parameters needed by the MOHS3 algorithm.
To test the proposed algorithm, six multi-objective optimization problems were used, with a diversity of characteristics regarding the shape of the Pareto-optimal fronts and Pareto-optimal sets, the coexistence of local and global solutions (multi-modality), and the number of decision variables and objective functions. MOHSg was able to produce results similar to or better than those generated by the MOHS2 and MOHS3 algorithms, which constitute HS variants representative of the state-of-the-art in multi-objective optimization. Specifically, MOHSg produced the best results in three of the proposed problems, while in the rest, it registered competitive results, excelling in certain specific performance indicators.
From this comparative study, it can be concluded that the harmony search algorithm based on pitch adjustment by genotype is an effective tool for solving multi-objective optimization problems. As future work, the application of the proposed algorithm to the solution of multi-objective problems with functional constraints is proposed.