# A Bounded Archiver for Hausdorff Approximations of the Pareto Front for Multi-Objective Evolutionary Algorithms


## Abstract


## 1. Introduction

## 2. Background and Related Work

**Algorithm 1** Generic Stochastic Search Algorithm

1: $P_0 \subset Q$ drawn at random
2: $A_0 = ArchiveUpdate(P_0, \varnothing)$
3: **for** $j = 0, 1, 2, \dots$ **do**
4: $P_{j+1} = Generate(P_j)$
5: $A_{j+1} = ArchiveUpdate(P_{j+1}, A_j)$
6: **end for**
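The loop above can be sketched directly; `generate`, `archive_update`, and `sample_Q` are placeholder names for the problem-specific operators (they are not defined in the paper), so this is only a minimal driver illustrating the control flow:

```python
def stochastic_search(generate, archive_update, sample_Q, n_init, n_iter):
    """Generic stochastic search loop of Algorithm 1. `generate` produces
    the next population from the current one, `archive_update` folds a
    population into the archive, and `sample_Q` draws a random feasible
    point; all three are problem-specific placeholders."""
    P = [sample_Q() for _ in range(n_init)]   # 1: P_0 drawn at random
    A = archive_update(P, [])                 # 2: A_0 = ArchiveUpdate(P_0, {})
    for _ in range(n_iter):
        P = generate(P)                       # 4: P_{j+1} = Generate(P_j)
        A = archive_update(P, A)              # 5: A_{j+1} = ArchiveUpdate(P_{j+1}, A_j)
    return A
```

Any archiver, including ArchiveUpdateHD below, can be plugged in as `archive_update` without changing the loop.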

**Definition 1.**

- (a) $dist(u,A) := \inf_{v \in A} \|u - v\|_{\infty}$
- (b) $dist(B,A) := \sup_{u \in B} dist(u,A)$
- (c) $d_H(A,B) := \max\{dist(A,B),\, dist(B,A)\}$
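For finite sets, Definition 1 can be evaluated directly; the following sketch uses the infinity norm of part (a):

```python
import numpy as np

def dist_point_to_set(u, A):
    """dist(u, A): infimum of ||u - v||_inf over v in A (Definition 1a)."""
    return min(np.max(np.abs(np.asarray(u) - np.asarray(v))) for v in A)

def dist_set_to_set(B, A):
    """dist(B, A): supremum of dist(u, A) over u in B (Definition 1b)."""
    return max(dist_point_to_set(u, A) for u in B)

def hausdorff(A, B):
    """d_H(A, B): maximum of the two one-sided distances (Definition 1c)."""
    return max(dist_set_to_set(A, B), dist_set_to_set(B, A))
```

Note that `dist_set_to_set` is not symmetric, which is exactly why part (c) takes the maximum of both directions.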

**Definition 2.**

**Definition 3.**

**Definition 4.**

- (a) $F(A)$ is called an $\epsilon$-approximate Pareto front of (MOP) if every point $x \in Q$ is $\epsilon$-dominated by at least one $a \in A$, i.e., $$\forall x \in Q \;\; \exists a \in A : \quad a \prec_{\epsilon} x.$$
- (b) $F(A)$ is called an $\epsilon$-Pareto front if $F(A)$ is an $\epsilon$-approximate Pareto front and if every point $a \in A$ is a Pareto point of (MOP).
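Definition 4(a) can be checked in objective space. The exact form of $\prec_{\epsilon}$ is fixed in Definition 3 (whose text is not reproduced in this excerpt); the sketch below assumes the common additive variant of $\epsilon$-dominance in the sense of Laumanns et al.:

```python
def eps_dominates(fa, fx, eps):
    """Additive epsilon-dominance in objective space: fa eps-dominates fx
    if f_i(a) - eps_i <= f_i(x) for every objective i. This is one common
    variant; the paper's Definition 3 fixes the exact form."""
    return all(a - e <= x for a, x, e in zip(fa, fx, eps))

def is_eps_approx_front(archive_F, Q_F, eps):
    """Definition 4(a): every image in Q_F is eps-dominated by at least
    one archive image in archive_F."""
    return all(any(eps_dominates(fa, fx, eps) for fa in archive_F)
               for fx in Q_F)
```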

**Definition 5.**

- (a)
- (b) $F(A)$ is called a $\Delta$-tight $\epsilon$-Pareto front if $A$ is an $\epsilon$-Pareto front of (MOP) and if in addition $$d_H(F(P_Q), F(A)) \le \Delta.$$

## 3. ArchiveUpdateHD

#### 3.1. The Bi-Objective Case

**Algorithm 2** ArchiveUpdateHD

**Require:** Problem (MOP), where $k = 2$; $P$: current population; $A_0$: current archive; $\Delta_0 > 0$: current value of $\Delta$; $\Delta_{min}$: minimal value of $\Delta$; $\Theta \in (0,1)$, $\kappa > 1$: safety factors; $N$: upper bound for the archive size
**Ensure:** updated archive $A$; updated values for $\Delta$, $\Delta_{min}$, and $\epsilon$

1: $A := A_0$
2: $\Delta := \Delta_0$
3: $\epsilon := (\Delta, \dots, \Delta)^T$
4: **for all** $p \in P$ **do**
5: **if** $\nexists a \in A : a \prec_{\Theta\epsilon} p$, or $\nexists a \in A : a \prec p$ and $\forall a \in A : \|F(a) - F(p)\| > \Theta\Delta$ **then**
6: $A := A \cup \{p\}$
7: **end if**
8: **for all** $a \in A$ **do**
9: **if** $p \prec a$ **then**
10: $A := A \cup \{p\} \setminus \{a\}$
11: **if** $\|F(p) - F(a)\|_{\infty} > \Delta$ **then** ▹ reset $\Delta$ and $\epsilon$
12: $\Delta_{min} := \kappa \Delta_{min}$
13: $\Delta := \Delta_{min}$
14: $\epsilon := (\Delta, \dots, \Delta)^T$
15: **end if**
16: **end if**
17: **end for**
18: **if** $|A| = N + 1$ **then** ▹ apply pruning
19: $\Delta := \frac{N+1}{N}\Delta$
20: $\epsilon := \frac{N+1}{N}\epsilon$
21: sort $A$ (e.g., according to $f_1$)
22: compute $d \in \mathbb{R}^N$ as in (8)
23: choose $m \in \arg\min d$
24: **if** $m = 1$ **then**
25: $A := A \setminus \{a_2\}$ ▹ remove 2nd entry
26: **else if** $m = N$ **then**
27: $A := A \setminus \{a_N\}$ ▹ remove 2nd-to-last entry
28: **else**
29: $dl := \|F(a_{m+1}) - F(a_{m-1})\|$
30: $dr := \|F(a_{m+2}) - F(a_m)\|$
31: **if** $dl < dr$ **then**
32: $A := A \setminus \{a_m\}$
33: **else**
34: $A := A \setminus \{a_{m+1}\}$
35: **end if**
36: **end if**
37: **end if**
38: **end for**
39: **return** $\{A, \Delta, \Delta_{min}, \epsilon\}$
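The pruning step of Algorithm 2 (the branch entered when $|A| = N+1$) can be sketched as follows. Equation (8) is not reproduced in this excerpt, so the vector $d$ is assumed here to hold the 2-norm gaps between consecutive sorted images; the index arithmetic follows the paper's 1-based convention:

```python
import numpy as np

def prune_once(A, F):
    """One pruning step of ArchiveUpdateHD on an archive of N+1 points,
    sketched for k = 2. F maps a decision vector to its objective vector.
    Assumption: d from Equation (8) contains the 2-norm distances between
    consecutive images of the archive sorted by f1."""
    A = sorted(A, key=lambda a: F(a)[0])           # sort by f1
    img = [np.asarray(F(a), dtype=float) for a in A]
    N = len(A) - 1                                 # the archive holds N+1 points
    d = [np.linalg.norm(img[i + 1] - img[i]) for i in range(N)]
    m = int(np.argmin(d)) + 1                      # 1-based index, as in the paper
    if m == 1:
        A.pop(1)                                   # remove 2nd entry (a_2)
    elif m == N:
        A.pop(N - 1)                               # remove 2nd-to-last entry (a_N)
    else:
        dl = np.linalg.norm(img[m] - img[m - 2])       # ||F(a_{m+1}) - F(a_{m-1})||
        dr = np.linalg.norm(img[m + 1] - img[m - 1])   # ||F(a_{m+2}) - F(a_m)||
        A.pop(m - 1 if dl < dr else m)             # drop a_m or a_{m+1}
    return A
```

The boundary cases $m = 1$ and $m = N$ never remove the first or last archive member, so the extreme points of the front are preserved.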

**Theorem 1.**

- (a) There exist an $l_1 \in \mathbb{N}$ and a $\Delta^+ > 0$ such that, with probability one, $$\Delta_l = \Delta^+, \quad \forall l \ge l_1.$$
- (b) There exists with probability one an $l_2 \in \mathbb{N}$ such that $A_l$ is a $\Delta^+$-tight $\epsilon$-approximate Pareto front with respect to (MOP) for all $l \ge l_2$, where $\epsilon = (\Delta^+, \dots, \Delta^+)^T$.
- (c) With probability one, $$\lim_{l \to \infty} dist(A_l, P_Q) = 0.$$
- (d) There exists an $l_3 \in \mathbb{N}$ such that, with probability one, $$d_H(F(A_l), F(P_Q)) \le \Delta^+, \quad \forall l \ge l_3.$$

**Proof.**

**Remark 1.**

- (a)
- Equation (9) is an assumption that has to be made on the generation process. It means that every neighborhood of every feasible point $x \in Q$ will be "visited" with probability one by $Generate()$ after finitely many steps. For MOEAs, this is, e.g., ensured if Polynomial Mutation [70,71] is used, or any other mutation operator for which the support of the probability density function is equal to $Q$ (at least for box-constrained problems). We hence consider this assumption rather mild.
- (b)
- The complexity of the consideration of one candidate solution $p$ is $O(N \log N)$, which is determined by the sorting of the current archive $A$ in line 20.
- (c)
- $\Theta \in (0,1)$ and $\kappa > 1$ are safety factors needed to guarantee the convergence properties. In our computations, however, we have not observed any impact of these values when both are chosen close to one. We hence suggest using $\Theta = \kappa = 1$ (i.e., practically not using these safety factors).
- (d)
- The above consideration is done for $\epsilon = (\Delta, \dots, \Delta)^T \in \mathbb{R}^k$, i.e., using the same value for all entries of $\epsilon$. If the values of the objectives along the Pareto front differ significantly, one can of course instead use $\epsilon = \Delta = (\Delta_1, \dots, \Delta_k)^T$ with different values $\Delta_i$. In that case, the following modifications have to be made: (i) the last condition in line 5 has to be replaced by $$\nexists a \in A : |f_i(a) - f_i(p)| \le \Delta_i, \quad i = 1, \dots, k.$$ Furthermore, (ii) the condition for the reset in line 11 has to be replaced by $$\exists i \in \{1, \dots, k\} : |f_i(p) - f_i(a)| > \Delta_i.$$
- (e)
- The value of $\Delta$ computed throughout the algorithm yields an approximation quality of the archive in the Hausdorff sense. The theoretical upper bound of the final value $\Delta^+$ is twice the actual Hausdorff approximation, as the following discussion shows (refer to Figure 3): assume we are given a linear front with slope $-1$ and a budget of $N = 2$ elements (the discussion is analogous for general $N$). The ideal archive as computed by ArchiveUpdateHD is in this case $A = \{a_1, a_2\}$, where the $a_i$'s are the end points of the Pareto set. Assume we have $F(a_1) = (0,1)^T$ and $F(a_2) = (1,0)^T$; then, the Hausdorff distance between the Pareto front and $A$ is $1/2$, determined by the point $y_m = (1/2, 1/2)^T$. Given this archive, for any value $\Delta < 1$ and assuming that $F(Q)$ is large enough, there exists a candidate $p$ such that $p$ is not dominated by $a_1$ or $a_2$ and $\|F(a_i) - F(p)\| > \Delta$, $i = 1, 2$. Hence, $p$ will be added to the archive and later on discarded (lines 23–26), which leads to an increase of $\Delta$.
On the one hand, one possible strategy would be to take $\frac{1}{2}\Delta^+$ as a Hausdorff approximation of the Pareto front, in particular since most Pareto fronts have at least one element where the slope of the tangent space is $-1$. On the other hand, the use of $\epsilon$-dominance prevents the images $F(a)$, $a \in A$, from being perfectly evenly distributed along the Pareto front, so that $\frac{1}{2}\Delta^+$ is not that accurate for some problems. In fact, this factor of two can only be observed for linear fronts, while $\Delta^+$ already yields a good approximation in general (see, e.g., the subsequent results for MOPs with more than two objectives).
However, we have observed that the following estimate gives even better approximations of the Hausdorff distances: given $A = \{a_1, \dots, a_N\}$, sorted (e.g., according to objective $f_1$), the current Hausdorff approximation $h$ is computed as follows: $$\tilde{d}_i := \begin{cases} \|F(a_{i+1}) - F(a_i)\| & \text{if } \|F(a_{i+1}) - F(a_i)\| \le 2\Delta \\ 0 & \text{else} \end{cases}, \quad i = 1, \dots, N-1,$$ $$h := \frac{1}{2} \max_{i=1,\dots,N-1} \tilde{d}_i.$$
- (f)
- Several norms are used within the algorithm. While one is in principle free in the choice of the norms (except in line 11; see the above proof), we suggest taking the infinity norm in line 5 in order to reduce the issue mentioned in the previous part, and the 2-norm in lines 28 and 29 in order to obtain a (slightly) better distribution of the entries along the Pareto front.
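The Hausdorff estimate $h$ from part (e) (Equation (12)) is straightforward to compute on the sorted archive images; gaps larger than $2\Delta$ are treated as genuine discontinuities of the front and set to zero:

```python
import numpy as np

def hausdorff_estimate(images, delta):
    """Hausdorff approximation h from Equation (12): half the largest
    consecutive gap in the sorted archive images. Gaps larger than
    2*delta are excluded (set to 0), since they are taken to indicate a
    disconnected part of the Pareto front rather than a sampling gap.
    `images` must already be sorted, e.g., by f1."""
    imgs = [np.asarray(y, dtype=float) for y in images]
    d_tilde = []
    for i in range(len(imgs) - 1):
        gap = np.linalg.norm(imgs[i + 1] - imgs[i])
        d_tilde.append(gap if gap <= 2 * delta else 0.0)
    return 0.5 * max(d_tilde)
```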

**Remark 2.**

**Algorithm 3** $\{A, \Delta, h\} := ArchiveUpdateHD(P, A_0, \Delta_0, N)$

**Require:** Problem (MOP), where $k = 2$; $P$: current population; $A_0$: current archive; $\Delta_0 \in \mathbb{R}_+^k$: current values of $\Delta$; $N$: upper bound for the archive size
**Ensure:** updated archive $A$; updated values for $\Delta$; Hausdorff approximation $h$

1: $A := A_0$
2: $\Delta := \Delta_0$
3: $\epsilon := \Delta$
4: **for all** $p \in P$ **do**
5: **if** $\nexists a \in A : a \prec_{\epsilon} p$, or $\nexists a \in A : a \prec p$ and $\nexists a \in A : |f_i(a) - f_i(p)| \le \Delta_i,\; i = 1, \dots, k$ **then**
6: $A := A \cup \{p\}$
7: **end if**
8: **for all** $a \in A$ **do**
9: **if** $p \prec a$ **then**
10: $A := A \cup \{p\} \setminus \{a\}$
11: **if** $\exists i \in \{1, \dots, k\} : f_i(a) - f_i(p) > \Delta_i$ **then** ▹ reset $\Delta$ and $\epsilon$
12: $\Delta := \Delta_{min}$
13: $\epsilon := \Delta$
14: **end if**
15: **end if**
16: **end for**
17: **if** $|A| = N + 1$ **then** ▹ apply pruning
18: $\Delta := \frac{N+1}{N}\Delta$
19: $\epsilon := \frac{N+1}{N}\epsilon$
20: sort $A$ (e.g., according to $f_1$)
21: compute $d \in \mathbb{R}^N$ as in (8)
22: choose $m \in \arg\min d$
23: **if** $m = 1$ **then**
24: $A := A \setminus \{a_2\}$ ▹ remove 2nd entry
25: **else if** $m = N$ **then**
26: $A := A \setminus \{a_N\}$ ▹ remove 2nd-to-last entry
27: **else**
28: $dl := \|F(a_{m+1}) - F(a_{m-1})\|_2$
29: $dr := \|F(a_{m+2}) - F(a_m)\|_2$
30: **if** $dl < dr$ **then**
31: $A := A \setminus \{a_m\}$
32: **else**
33: $A := A \setminus \{a_{m+1}\}$
34: **end if**
35: **end if**
36: **end if**
37: **end for**
38: sort $A$ (e.g., according to $f_1$) ▹ compute Hausdorff approximation
39: compute $\tilde{d}_i$, $i = 1, \dots, |A| - 1$, as in (12)
40: $h := \frac{1}{2} \max_{i=1,\dots,|A|-1} \tilde{d}_i$
41: **return** $\{A, \Delta, h\}$

#### 3.2. The General Case

- The distances cannot be sorted any more as in (8). Instead, one has to consider the pairwise distances $$d_{i,j} = \|F(a_i) - F(a_j)\|, \quad i,j = 1, \dots, |A|, \quad j > i,$$ and to choose $$d_{i_m, j_m} \in \arg\min_{i,j = 1,\dots,N+1,\; j > i} d_{i,j}.$$
- The approximation of the Hausdorff distance cannot be computed as in (12) any more. Instead, we choose the value of $\Delta$ as an approximation for $d_H(F(A), F(P_Q))$, which is motivated by Theorem 1.
- The reset is performed if there exist an entry $a$ of the current archive $A$ and a candidate solution $p$ that dominates $a$ such that $$f_i(a) - f_i(p) > \Delta_i, \quad i = 1, \dots, k.$$ That is, the improvement is larger than $\Delta_i$ for all objectives. It has been observed that if one only asks for an improvement in one objective (as done for the bi-objective case), too many resets are performed, in particular for MOPs whose Pareto front contains a "flat" region.

**Remark 3.**

**Algorithm 4** $\{A, \Delta\} := ArchiveUpdateHD(P, A_0, \Delta_0, N)$

**Require:** Problem (MOP); $P$: current population; $A_0$: current archive; $\Delta_0 \in \mathbb{R}_+^k$: current value of $\Delta$; $N$: upper bound for the archive size
**Ensure:** updated archive $A$; updated value of $\Delta$

1: $A := A_0$
2: $\Delta := \Delta_0$
3: $\epsilon := \Delta$
4: **for all** $p \in P$ **do**
5: **if** $\nexists a \in A : a \prec_{\epsilon} p$, or $\nexists a \in A : a \prec p$ and $\nexists a \in A : |f_i(a) - f_i(p)| \le \Delta_i,\; i = 1, \dots, k$ **then**
6: $A := A \cup \{p\}$
7: **end if**
8: **for all** $a \in A$ **do**
9: **if** $p \prec a$ **then**
10: $A := A \cup \{p\} \setminus \{a\}$
11: **if** $f_i(a) - f_i(p) > \Delta_i,\; i = 1, \dots, k$ **then** ▹ reset $\Delta$ and $\epsilon$
12: $\Delta := \Delta_{min}$
13: $\epsilon := \Delta$
14: **end if**
15: **end if**
16: **end for**
17: **if** $|A| = N + 1$ **then** ▹ apply pruning
18: $\Delta := \frac{N+1}{N}\Delta$
19: $\epsilon := \frac{N+1}{N}\epsilon$
20: compute $d_{i,j}$ as in (17)
21: choose $d_{i_m, j_m} \in \arg\min_{i,j = 1,\dots,N+1,\; j > i} d_{i,j}$
22: choose $l$ randomly from $\{i_m, j_m\}$
23: $A := A \setminus \{a_l\}$
24: **end if**
25: **end for**
26: **return** $\{A, \Delta\}$
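The pruning step of Algorithm 4 (lines 17–24) replaces the sorted-gap criterion of the bi-objective case by a search over all pairwise image distances. A minimal sketch, with `F` a placeholder for the objective map:

```python
import itertools
import random

import numpy as np

def prune_general(A, F, rng=random):
    """Pruning step of Algorithm 4 for k >= 2 objectives: find the pair
    of archive members whose images are closest (the d_{i,j} of Equation
    (17)) and randomly discard one member of that pair."""
    img = [np.asarray(F(a), dtype=float) for a in A]
    i_m, j_m = min(
        itertools.combinations(range(len(A)), 2),
        key=lambda ij: np.linalg.norm(img[ij[0]] - img[ij[1]]),
    )
    drop = rng.choice([i_m, j_m])                  # line 22: random choice
    return [a for idx, a in enumerate(A) if idx != drop]
```

Enumerating all pairs costs $O(|A|^2)$ distance evaluations per pruning step, which is the price paid for dropping the sorting that is only available in the bi-objective case.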

**Remark 4.**

## 4. Numerical Results

## 5. Conclusions and Future Work

## Author Contributions

## Funding

## Data Availability Statement

## Conflicts of Interest

## Appendix A

**Table A1.** Hausdorff distances $d_H$ and approximations $h$ computed by ArchiveUpdateHD for the six bi-objective problems.

| Problem | Mean $d_H$ | std $d_H$ | Mean $h$ | std $h$ |
|---|---|---|---|---|
| CONV | 0.0345 | 0.0014861 | 0.036105 | 0.0016771 |
| DENT | 0.15428 | 0.012059 | 0.15872 | 0.0091187 |
| RUD1 | 0.11337 | 0.0075653 | 0.10027 | 0.024311 |
| RUD2 | 0.11295 | 0.0083812 | 0.085445 | 0.038149 |
| LINEAR | 0.02094 | 0.0019671 | 0.01977 | 0.0068899 |
| RUD3 | 0.035354 | 0.0023404 | 0.025694 | 0.011484 |

**Table A2.** Averaged Hausdorff distances $\Delta_2$ and approximations $d_2$ computed by ArchiveUpdateHD for the six bi-objective problems.

| Problem | Mean $\Delta_2$ | std $\Delta_2$ | Mean $d_2$ | std $d_2$ |
|---|---|---|---|---|
| CONV | 0.016626 | 0.00021014 | 0.01602 | $4.9072 \times 10^{-5}$ |
| DENT | 0.070802 | 0.00075385 | 0.069309 | 0.00036096 |
| RUD1 | 0.052171 | 0.00072633 | 0.050631 | 0.0022879 |
| RUD2 | 0.050967 | 0.00068389 | 0.049108 | 0.00034928 |
| LINEAR | 0.014494 | 0.00033184 | 0.013892 | 0.00018454 |
| RUD3 | 0.016653 | 0.00028272 | 0.01595 | 0.00034355 |

**Figure A1.** The box coverings $C(A_f)$ for the final archives indicate that the Hausdorff distance between $F(A_f)$ and the Pareto fronts is less than the final value $\Delta_f$ computed by ArchiveUpdateHD for all test problems.

**Figure A2.**Numerical results of the different algorithms and archiving/selection strategies on DENT.

**Figure A3.**Numerical results of the different algorithms and archiving/selection strategies on ZDT3.

**Figure A5.**Numerical results of the different algorithms and archiving/selection strategies on IDTLZ1 for $k=3$.

**Figure A6.**Numerical results of the different algorithms and archiving/selection strategies on MaF2 for $k=3$.

**Figure A7.**Evolution of the Hausdorff distances ${d}_{H}(F\left(A\right),F\left({P}_{Q}\right))$ of NSGA-II-A and the computed approximations $\Delta $ for several test functions, using $k=3$ objectives.

**Figure A8.**Evolution of the Hausdorff distances ${d}_{H}(F\left(A\right),F\left({P}_{Q}\right))$ of NSGA-II-A and the computed approximations $\Delta $ for several test functions, using $k=4$ objectives.

**Figure A9.**Evolution of the Hausdorff distances ${d}_{H}(F\left(A\right),F\left({P}_{Q}\right))$ of NSGA-II-A and the computed approximations $\Delta $ for several test functions, using $k=5$ objectives.

## References

- Laumanns, M.; Thiele, L.; Deb, K.; Zitzler, E. Combining convergence and diversity in evolutionary multiobjective optimization. Evol. Comput. **2002**, 10, 263–282.
- Deb, K. Multi-Objective Optimization Using Evolutionary Algorithms; John Wiley & Sons: Chichester, UK, 2001; ISBN 0-471-87339-X.
- Stewart, T.; Bandte, O.; Braun, H.; Chakraborti, N.; Ehrgott, M.; Göbelt, M.; Jin, Y.; Nakayama, H. Real-World Applications of Multiobjective Optimization. In Multiobjective Optimization; Lecture Notes in Computer Science; Slowinski, R., Ed.; Springer: Berlin/Heidelberg, Germany, 2008; Volume 5252, pp. 285–327.
- Sun, J.Q.; Xiong, F.R.; Schütze, O.; Hernández, C. Cell Mapping Methods—Algorithmic Approaches and Applications; Springer: Berlin/Heidelberg, Germany, 2019.
- Aguilera-Rueda, V.J.; Cruz-Ramírez, N.; Mezura-Montes, E. Data-Driven Bayesian Network Learning: A Bi-Objective Approach to Address the Bias-Variance Decomposition. Math. Comput. Appl. **2020**, 25, 37.
- Estrada-Padilla, A.; Lopez-Garcia, D.; Gómez-Santillán, C.; Fraire-Huacuja, H.J.; Cruz-Reyes, L.; Rangel-Valdez, N.; Morales-Rodríguez, M.L. Modeling and Optimizing the Multi-Objective Portfolio Optimization Problem with Trapezoidal Fuzzy Parameters. Math. Comput. Appl. **2021**, 26, 36.
- Frausto-Solis, J.; Hernández-Ramírez, L.; Castilla-Valdez, G.; González-Barbosa, J.J.; Sánchez-Hernández, J.P. Chaotic Multi-Objective Simulated Annealing and Threshold Accepting for Job Shop Scheduling Problem. Math. Comput. Appl. **2021**, 26, 8.
- Castellanos-Alvarez, A.; Cruz-Reyes, L.; Fernandez, E.; Rangel-Valdez, N.; Gómez-Santillán, C.; Fraire, H.; Brambila-Hernández, J.A. A Method for Integration of Preferences to a Multi-Objective Evolutionary Algorithm Using Ordinal Multi-Criteria Classification. Math. Comput. Appl. **2021**, 26, 27.
- Hillermeier, C. Nonlinear Multiobjective Optimization: A Generalized Homotopy Approach; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2001; Volume 135.
- Coello Coello, C.A.; Lamont, G.B.; Van Veldhuizen, D.A. Evolutionary Algorithms for Solving Multi-Objective Problems, 2nd ed.; Springer: New York, NY, USA, 2007.
- Laumanns, M.; Zenklusen, R. Stochastic convergence of random search methods to fixed size Pareto front approximations. Eur. J. Oper. Res. **2011**, 213, 414–421.
- Hernández, C.; Schütze, O. A Bounded Archive Based for Bi-objective Problems based on Distance and epsilon-dominance to avoid Cyclic Behavior. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2022), Boston, MA, USA, 9–13 July 2022.
- Schütze, O.; Esquivel, X.; Lara, A.; Coello, C.A.C. Using the averaged Hausdorff distance as a performance measure in evolutionary multi-objective optimization. IEEE Trans. Evol. Comput. **2012**, 16, 504–522.
- Schütze, O.; Laumanns, M.; Tantar, E.; Coello, C.A.C.; Talbi, E.G. Computing gap free Pareto front approximations with stochastic search algorithms. Evol. Comput. **2010**, 18, 65–96.
- Deb, K.; Pratap, A.; Agarwal, S.; Meyarivan, T. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evol. Comput. **2002**, 6, 182–197.
- Zitzler, E.; Laumanns, M.; Thiele, L. SPEA2: Improving the Strength Pareto Evolutionary Algorithm for Multiobjective Optimization. In Proceedings of the Evolutionary Methods for Design, Optimisation and Control with Application to Industrial Problems (EUROGEN 2001), Athens, Greece, 19–21 September 2001; Giannakoglou, K., Tsahalis, D., Périaux, J., Papailiou, K., Fogarty, T., Eds.; International Center for Numerical Methods in Engineering (CIMNE): Barcelona, Spain, 2002; pp. 95–100.
- Fonseca, C.M.; Fleming, P.J. An overview of evolutionary algorithms in multiobjective optimization. Evol. Comput. **1995**, 3, 1–16.
- Knowles, J.D.; Corne, D.W. Approximating the nondominated front using the Pareto Archived Evolution Strategy. Evol. Comput. **2000**, 8, 149–172.
- Zhang, Q.; Li, H. MOEA/D: A Multi-objective Evolutionary Algorithm Based on Decomposition. IEEE Trans. Evol. Comput. **2007**, 11, 712–731.
- Deb, K.; Jain, H. An evolutionary many-objective optimization algorithm using reference-point-based nondominated sorting approach, part I: Solving problems with box constraints. IEEE Trans. Evol. Comput. **2014**, 18, 577–601.
- Jain, H.; Deb, K. An Evolutionary Many-Objective Optimization Algorithm Using Reference-Point Based Nondominated Sorting Approach, Part II: Handling Constraints and Extending to an Adaptive Approach. IEEE Trans. Evol. Comput. **2014**, 18, 602–622.
- Martínez, S.Z.; Coello, C.A.C. A multi-objective particle swarm optimizer based on decomposition. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2011), Dublin, Ireland, 12–16 July 2011; pp. 69–76.
- Zuiani, F.; Vasile, M. Multi Agent Collaborative Search based on Tchebycheff decomposition. Comput. Optim. Appl. **2013**, 56, 189–208.
- Moubayed, N.A.; Petrovski, A.; McCall, J. (DMOPSO)-M-2: MOPSO Based on Decomposition and Dominance with Archiving Using Crowding Distance in Objective and Solution Spaces. Evol. Comput. **2014**, 22, 47–77.
- Zapotecas-Martínez, S.; López-Jaimes, A.; García-Nájera, A. Libea: A Lebesgue indicator-based evolutionary algorithm for multi-objective optimization. Swarm Evol. Comput. **2019**, 44, 404–419.
- Beume, N.; Naujoks, B.; Emmerich, M. SMS-EMOA: Multiobjective selection based on dominated hypervolume. Eur. J. Oper. Res. **2007**, 181, 1653–1669.
- Zitzler, E.; Thiele, L.; Bader, J. SPAM: Set Preference Algorithm for multiobjective optimization. In Proceedings of the Parallel Problem Solving From Nature PPSN X, Dortmund, Germany, 13–17 September 2008; pp. 847–858.
- Wagner, T.; Trautmann, H. Integration of Preferences in Hypervolume-based multiobjective evolutionary algorithms by means of desirability functions. IEEE Trans. Evol. Comput. **2010**, 14, 688–701.
- Schütze, O.; Domínguez-Medina, C.; Cruz-Cortés, N.; de la Fraga, L.G.; Sun, J.Q.; Toscano, G.; Landa, R. A scalar optimization approach for averaged Hausdorff approximations of the Pareto front. Eng. Optim. **2016**, 48, 1593–1617.
- Sosa-Hernández, V.A.; Schütze, O.; Wang, H.; Deutz, A.; Emmerich, M. The Set-Based Hypervolume Newton Method for Bi-Objective Optimization. IEEE Trans. Cybern. **2020**, 50, 2186–2196.
- Bringmann, K.; Friedrich, T. Convergence of Hypervolume-Based Archiving Algorithms. IEEE Trans. Evol. Comput. **2014**, 18, 643–657.
- Fonseca, C.M.; Fleming, P.J. Genetic algorithms for multiobjective optimization: Formulation, discussion, and generalization. In Proceedings of the 5th International Conference on Genetic Algorithms, Urbana-Champaign, IL, USA, 17–21 June 1993; pp. 416–423.
- Srinivas, N.; Deb, K. Multiobjective optimization using nondominated sorting in genetic algorithms. Evol. Comput. **1994**, 2, 221–248.
- Horn, J.; Nafpliotis, N.; Goldberg, D.E. A niched Pareto genetic algorithm for multiobjective optimization. In Proceedings of the First IEEE Conference on Evolutionary Computation, IEEE World Congress on Computational Intelligence, Orlando, FL, USA, 27–29 June 1994; IEEE Press: Piscataway, NJ, USA, 1994; pp. 82–87.
- Zitzler, E.; Thiele, L. Multiobjective evolutionary algorithms: A comparative case study and the strength Pareto approach. IEEE Trans. Evol. Comput. **1999**, 3, 257–271.
- Rudolph, G. Finite Markov Chain results in evolutionary computation: A Tour d'Horizon. Fundam. Inform. **1998**, 35, 67–89.
- Rudolph, G. On a multi-objective evolutionary algorithm and its convergence to the Pareto set. In Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC 1998), Anchorage, AK, USA, 4–9 May 1998; IEEE Press: Piscataway, NJ, USA, 1998; pp. 511–516.
- Rudolph, G.; Agapie, A. Convergence Properties of Some Multi-Objective Evolutionary Algorithms. In Proceedings of the 2000 IEEE Congress on Evolutionary Computation (CEC), La Jolla, CA, USA, 16–19 July 2000; IEEE Press: Piscataway, NJ, USA, 2000.
- Rudolph, G. Evolutionary Search under Partially Ordered Fitness Sets. In Proceedings of the International NAISO Congress on Information Science Innovations (ISI 2001), Dubai, United Arab Emirates, 17–21 March 2001; ICSC Academic Press: Sliedrecht, The Netherlands, 2001; pp. 818–822.
- Hanne, T. On the convergence of multiobjective evolutionary algorithms. Eur. J. Oper. Res. **1999**, 117, 553–564.
- Hanne, T. Global multiobjective optimization with evolutionary algorithms: Selection mechanisms and mutation control. In Proceedings of the Evolutionary Multi-Criterion Optimization, First International Conference, EMO 2001, Zurich, Switzerland, 7–9 March 2001; Springer: Berlin/Heidelberg, Germany, 2001; pp. 197–212.
- Hanne, T. A multiobjective evolutionary algorithm for approximating the efficient set. Eur. J. Oper. Res. **2007**, 176, 1723–1734.
- Hanne, T. A Primal-Dual Multiobjective Evolutionary Algorithm for Approximating the Efficient Set. In Proceedings of the 2007 IEEE Congress on Evolutionary Computation (CEC), Singapore, 25–28 September 2007; IEEE Press: Piscataway, NJ, USA, 2007; pp. 3127–3134.
- Knowles, J.D.; Corne, D.W. Properties of an adaptive archiving algorithm for storing nondominated vectors. IEEE Trans. Evol. Comput. **2003**, 7, 100–116.
- Corne, D.W.; Knowles, J.D. Some multiobjective optimizers are better than others. In Proceedings of the IEEE Congress on Evolutionary Computation, Canberra, Australia, 8–12 December 2003; IEEE Press: Piscataway, NJ, USA, 2003; pp. 2506–2512.
- Knowles, J.D.; Corne, D.W. Bounded Pareto archiving: Theory and practice. In Metaheuristics for Multiobjective Optimisation; Springer: Berlin/Heidelberg, Germany, 2004; pp. 39–64.
- Knowles, J.D.; Corne, D.W.; Fleischer, M. Bounded archiving using the Lebesgue measure. In Proceedings of the IEEE Congress on Evolutionary Computation, Canberra, Australia, 8–12 December 2003; IEEE Press: Piscataway, NJ, USA, 2003; pp. 2490–2497.
- López-Ibáñez, M.; Knowles, J.D.; Laumanns, M. On Sequential Online Archiving of Objective Vectors. In Proceedings of the Evolutionary Multi-Criterion Optimization (EMO 2011), Ouro Preto, Brazil, 5–8 April 2011; Springer: Berlin/Heidelberg, Germany, 2011; pp. 46–60.
- Laumanns, M.; Zitzler, E.; Thiele, L. On the effects of archiving, elitism, and density based selection in evolutionary multi-objective optimization. In Proceedings of the International Conference on Evolutionary Multi-Criterion Optimization, Zurich, Switzerland, 7–9 March 2001; Springer: Berlin/Heidelberg, Germany, 2001; pp. 181–196.
- Schütze, O.; Laumanns, M.; Coello, C.A.C.; Dellnitz, M.; Talbi, E.G. Convergence of Stochastic Search Algorithms to Finite Size Pareto Set Approximations. J. Glob. Optim. **2008**, 41, 559–577.
- Schütze, O.; Lara, A.; Coello, C.A.C.; Vasile, M. On the Detection of Nearly Optimal Solutions in the Context of Single-Objective Space Mission Design Problems. J. Aerosp. Eng. **2011**, 225, 1229–1242.
- Schütze, O.; Vasile, M.; Coello, C.A.C. Computing the Set of Epsilon-Efficient Solutions in Multiobjective Space Mission Design. J. Aerosp. Comput. Inf. Commun. **2011**, 8, 53–70.
- Schütze, O.; Hernandez, C.; Talbi, E.G.; Sun, J.Q.; Naranjani, Y.; Xiong, F.R. Archivers for the Representation of the Set of Approximate Solutions for MOPs. J. Heuristics **2019**, 5, 71–105.
- Schütze, O.; Hernández, C. Archiving Strategies for Evolutionary Multi-Objective Optimization Algorithms; Springer: Berlin/Heidelberg, Germany, 2021.
- Luong, H.N.; Bosman, P.A.N. Elitist Archiving for Multi-Objective Evolutionary Algorithms: To Adapt or Not to Adapt. In Proceedings of the Parallel Problem Solving from Nature—PPSN XII; Springer: Berlin/Heidelberg, Germany, 2012; pp. 72–81.
- Zapotecas Martínez, S.; Coello Coello, C.A. An archiving strategy based on the Convex Hull of Individual Minima for MOEAs. In Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC), Barcelona, Spain, 18–23 July 2010; IEEE Press: Piscataway, NJ, USA, 2010; pp. 1–8.
- Cai, X.; Li, Y.; Fan, Z.; Zhang, Q. An External Archive Guided Multiobjective Evolutionary Algorithm Based on Decomposition. IEEE Trans. Evol. Comput. **2015**, 19, 508–523.
- Wang, F.; Zhang, H.; Li, Y.; Zhao, Y.; Rao, Q. External archive matching strategy for MOEA/D. Soft Comput. **2018**, 22, 7833–7846.
- Tanabe, R.; Ishibuchi, H. An analysis of control parameters of MOEA/D under two different optimization scenarios. Appl. Soft Comput. **2018**, 70, 22–40.
- Bezerra, L.C.T.; López-Ibáñez, M.; Stützle, T. Archiver effects on the performance of state-of-the-art multi- and many-objective evolutionary algorithms. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO '19), Prague, Czech Republic, 13–17 July 2019; pp. 620–628.
- Hernández, C.I.; Schütze, O.; Sun, J.Q.; Ober-Blöbaum, S. Non-Epsilon Dominated Evolutionary Algorithm for the Set of Approximate Solutions. Math. Comput. Appl. **2020**, 25, 3.
- Patil, M.B. Improved performance in multi-objective optimization using external archive. Sādhanā **2020**, 45, 1–10.
- Brockhoff, D.; Tran, T.D.; Hansen, N. Benchmarking numerical multiobjective optimizers revisited. In Proceedings of the 2015 Annual Conference on Genetic and Evolutionary Computation, Madrid, Spain, 11–15 July 2015; pp. 639–646.
- Wang, R.; Zhou, Z.; Ishibuchi, H.; Liao, T.; Zhang, T. Localized weighted sum method for many-objective optimization. IEEE Trans. Evol. Comput. **2016**, 22, 3–18.
- Pang, L.M.; Ishibuchi, H.; Shang, K. Algorithm Configurations of MOEA/D with an Unbounded External Archive. arXiv **2020**, arXiv:2007.13352.
- Ishibuchi, H.; Pang, L.M.; Shang, K. Solution Subset Selection for Final Decision Making in Evolutionary Multi-Objective Optimization. arXiv **2020**, arXiv:2006.08156.
- Ishibuchi, H.; Pang, L.M.; Shang, K. A New Framework of Evolutionary Multi-Objective Algorithms with an Unbounded External Archive. In ECAI 2020; IOS Press: Amsterdam, The Netherlands, 2020; pp. 283–290.
- Praditwong, K.; Yao, X. A New Multi-objective Evolutionary Optimisation Algorithm: The Two-Archive Algorithm. In Proceedings of the Computational Intelligence and Security, Harbin, China, 15–19 December 2007; Wang, Y., Cheung, Y.M., Liu, H., Eds.; Springer: Berlin/Heidelberg, Germany, 2007; pp. 95–104.
- Wang, H.; Jiao, L.; Yao, X. Two_Arch2: An Improved Two-Archive Algorithm for Many-Objective Optimization. IEEE Trans. Evol. Comput. **2015**, 19, 524–541.
- Deb, K.; Agrawal, S. A niched-penalty approach for constraint handling in genetic algorithms. In Proceedings of the International Conference on Artificial Neural Networks and Genetic Algorithms (ICANNGA-99), Portoroz, Slovenia, 6–9 April 1999; Springer: Berlin/Heidelberg, Germany, 1999; pp. 235–243.
- Deb, K.; Deb, D. Analysing mutation schemes for real-parameter genetic algorithms. Int. J. Artif. Intell. Soft Comput.
**2014**, 4, 1–28. [Google Scholar] [CrossRef] - Ikeda, K.; Kita, H.; Kobayashi, S. Failure of Pareto-based MOEAs: Does non-dominated really mean near to optimal? In Proceedings of the 2001 IEEE Congress on Evolutionary Computation (CEC), Seoul, Korea, 27–30 May 2001; pp. 957–962. [Google Scholar]
- Bogoya, J.M.; Vargas, A.; Cuate, O.; Schütze, O. A (p,q)-Averaged Hausdorff Distance for Arbitrary Measurable Sets. Math. Comput. Appl.
**2018**, 23, 51. [Google Scholar] [CrossRef] [Green Version] - Vargas, A.; Bogoya, J. A Generalization of the Averaged Hausdorff Distance. Computación y Sistemas
**2018**, 22, 331–345. [Google Scholar] [CrossRef] - Witting, K. Numerical Algorithms for the Treatment of Parametric Multiobjective Optimization Problems and Applications. Ph.D. Thesis, Deptartment of Mathematics, University of Paderborn, Paderborn, Germany, 2012. [Google Scholar]
- Rudolph, G.; Naujoks, B.; Preuss, M. Capabilities of EMOA to Detect and Preserve Equivalent Pareto Subsets. In Proceedings of the Evolutionary Multi-Criterion Optimization: 4th International Conference, EMO 2007, Matsushima, Japan, 5–8 March 2007; Obayashi, S., Deb, K., Poloni, C., Hiroyasu, T., Murata, T., Eds.; Springer: Berlin/Heidelberg, Germany, 2007; pp. 36–50. [Google Scholar]
- Ishibuchi, H.; Matsumoto, T.; Masuyama, N.; Nojima, Y. Effects of dominance resistant solutions on the performance of evolutionary multi-objective and many-objective algorithms. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO ’20), Cancún, Mexico, 8–12 July 2020; Coello, C.A.C., Ed.; ACM: New York, NY, USA, 2020; pp. 507–515. [Google Scholar] [CrossRef]
- Tian, Y.; Cheng, R.; Zhang, X.; Jin, Y. PlatEMO: A MATLAB Platform for Evolutionary Multi-Objective Optimization. IEEE Comput. Intell. Mag.
**2017**, 3, 73–87. [Google Scholar] [CrossRef] [Green Version] - Zitzler, E.; Thiele, L. Multiobjective optimization using evolutionary algorithms—A comparative case study. In Proceedings of the Parallel Problem Solving from Nature—PPSN V; Eiben, A.E., Bäck, T., Schoenauer, M., Schwefel, H.P., Eds.; Springer: Berlin/Heidelberg, Germany, 1998; pp. 292–301. [Google Scholar]
- Zitzler, E.; Deb, K.; Thiele, L. Comparison of multiobjective evolutionary algorithms: Empirical results. Evol. Comput.
**2000**, 8, 173–195. [Google Scholar] [CrossRef] [Green Version] - Emmerich, M.; Deutz, A. Test problems based on Lamé superspheres. In Proceedings of the Evolutionary Multi-Criterion Optimization: 4th International Conference, EMO 2007, Matsushima, Japan, 5–8 March 2007; Obayashi, S., Deb, K., Poloni, C., Hiroyasu, T., Murata, T., Eds.; Springer: Berlin/Heidelberg, Germany, 2007; pp. 922–936. [Google Scholar]
- Schaeffler, S.; Schultz, R.; Weinzierl, K. Stochastic Method for the Solution of Unconstrained Vector Optimization Problems. J. Optim. Theory Appl.
**2002**, 114, 209–222. [Google Scholar] [CrossRef] - Cheng, M.; Tian, Y.; Zhang, X.; Yang, S.; Jin, Y.; Yao, X. A benchmark test suite for evolutionary many-objective optimization. Complex Intell. Syst.
**2017**, 3, 67–81. [Google Scholar] [CrossRef] - Rudolph, G.; Trautmann, H.; Sengupta, S.; Schütze, O. Evenly Spaced Pareto Front Approximations for Tricriteria Problems Based on Triangulation. In Proceedings of the Evolutionary Muti-Criterion Optimization Conference (EMO 2013), Sheffield, UK, 19–22 March 2013; pp. 443–458. [Google Scholar]
- Rudolph, G.; Grimme, C.; Schütze, O.; Trautmann, H. An Aspiration Set EMOA based on Averaged Hausdorff Distances. In Proceedings of the Learning and Intelligent Optimization Conference (LION 2014), Gainesville, FL, USA, 16–21 February 2014; pp. 153–156. [Google Scholar]
- Rudolph, G.; Schütze, O.; Grimme, C.; Domínguez-Medina, C.; Trautmann, H. Optimal averaged Hausdorff archives for bi-objective problems: Theoretical and numerical results. Comput. Optim. Appl.
**2016**, 64, 589–618. [Google Scholar] [CrossRef]

**Figure 1.** Gaps in the approximation can occur when $\u03f5$-dominance is used exclusively in the selection/archiving of the candidate solutions (**left**). $\Delta $-tight $\u03f5$-(approximate) Pareto fronts also consider the distance of the Pareto front toward the archive (**right**).

**Figure 2.** A hypothetical scenario that can occur for multi-modal problems: first, a front that is only locally optimal is detected by the search process and approximated by the archiver. If, later, a candidate solution p is computed such that $F\left(p\right)$ lies on a “better” front, the current values of $\Delta $ and $\u03f5$ may no longer be adequate to suitably approximate this front.

**Figure 3.** Linear Pareto front with slope $-1$. If, for $N=2$, the archive is given by $A=\{{a}_{1},{a}_{2}\}$ such that $F\left({a}_{1}\right)$ and $F\left({a}_{2}\right)$ are the end points of the Pareto front, then the Hausdorff distance between $F\left(A\right)$ and the Pareto front is given by $h=\parallel F\left({a}_{1}\right)-{y}_{m}{\parallel}_{\infty}$, where ${y}_{m}$ is the arithmetic mean of $F\left({a}_{1}\right)$ and $F\left({a}_{2}\right)$. For $\Delta <2h$, there may exist candidate solutions p that are considered by the archiver (line 5 of Algorithm 1) but discarded in the same step (lines 23 to 26 of Algorithm 2), leading to an increase of $\Delta $.
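The geometry in the caption above can be checked numerically. The following sketch is our own illustration (the helper names `dist_inf` and `hausdorff_inf` are not from the paper's code): it evaluates the Hausdorff distance of Definition 1 under the $\infty$-norm for the two-point archive on a linear front with slope $-1$ and compares it with the predicted value $h$.

```python
def dist_inf(u, B):
    """dist(u, B) = inf over b in B of ||u - b||_inf (Definition 1a)."""
    return min(max(abs(ui - bi) for ui, bi in zip(u, b)) for b in B)

def hausdorff_inf(A, B):
    """d_H(A, B) = max(dist(A, B), dist(B, A)) (Definition 1c)."""
    return max(max(dist_inf(a, B) for a in A),
               max(dist_inf(b, A) for b in B))

# Linear Pareto front with slope -1, sampled densely.
front = [(i / 1000.0, 1.0 - i / 1000.0) for i in range(1001)]

# Archive image for N = 2: the two end points of the front.
A = [(0.0, 1.0), (1.0, 0.0)]

# Predicted distance: h = ||F(a1) - y_m||_inf with y_m the arithmetic mean.
y_m = (0.5, 0.5)
h = max(abs(A[0][0] - y_m[0]), abs(A[0][1] - y_m[1]))

print(hausdorff_inf(A, front), h)  # 0.5 0.5
```

The maximizing front point is the midpoint $(0.5, 0.5)$, so the computed Hausdorff distance coincides with $h$, as the caption states.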

**Figure 4.** Numerical results of ArchiveUpdateHD on six BOPs with different shapes of the Pareto fronts. For the sake of clarity, we omit the Pareto fronts where they are already apparent from the approximations.

**Figure 5.** Hausdorff and averaged Hausdorff distances (${d}_{H}$ and ${\Delta}_{2}$, respectively) obtained by ArchiveUpdateHD in a single run on six bi-objective problems (see Figure 4), together with their approximations h and ${d}_{2}$. ${d}_{H}$ is plotted as a black solid line, h black dashed, ${\Delta}_{2}$ blue solid, and ${d}_{2}$ blue dashed.
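For reference, the averaged Hausdorff indicator ${\Delta}_{p}$ reported in these plots can be computed for finite sets as $\max(GD_p, IGD_p)$. The sketch below is our own helper (`delta_p` is not a name from the paper), and it assumes the Euclidean norm for the point-to-set distance, whereas ${d}_{H}$ in Definition 1 uses the $\infty$-norm:

```python
def delta_p(A, B, p=2):
    """Averaged Hausdorff distance Delta_p(A, B) = max(GD_p, IGD_p)
    between finite sets A and B of objective vectors."""
    def d(u, V):
        # Euclidean distance from point u to the finite set V
        return min(sum((ui - vi) ** 2 for ui, vi in zip(u, v)) ** 0.5 for v in V)
    gd = (sum(d(a, B) ** p for a in A) / len(A)) ** (1.0 / p)   # power-mean GD
    igd = (sum(d(b, A) ** p for b in B) / len(B)) ** (1.0 / p)  # power-mean IGD
    return max(gd, igd)

# Single archive point at Euclidean distance 5 from a single reference point:
print(delta_p([(0.0, 0.0)], [(3.0, 4.0)]))  # 5.0
```

Unlike the classical GD/IGD pair, the power mean keeps ${\Delta}_{p}$ from rewarding archives that cluster many points near one region of the front.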

**Figure 8.** Real (blue) and approximated (red) Hausdorff distances during one run of the algorithm for DTLZ2.

**Figure 10.** Real (blue) and approximated (red) Hausdorff distances during one run of the algorithm for DTLZ7.

**Figure 11.** Results of ArchiveUpdateHD on CONV for two different initial values of $\Delta $ using $N=30$. For ${\Delta}_{0}=0.01$, the final archive contains 30 elements, while for ${\Delta}_{0}=0.05$ it contains only 28. The solutions on the left are more evenly spread along the Pareto front due to the distance considerations in the pruning technique; for the solution on the right, no pruning was applied during the run of the algorithm.

**Figure 12.** Approximation qualities of the Pareto fronts (measured by ${\Delta}_{2}$) during one run of the algorithm for NSGA-II (blue) and its archive-equipped variant NSGA-II-A (black) for six selected BOPs.

**Table 1.** Comparison (wins/ties/losses) of the results of the base MOEAs against their archive-equipped variants on the bi-objective test problems. The Wilcoxon rank-sum test was used for statistical significance (p-value $<0.05$).

Method 1 | Method 2 | Wins | Ties | Losses |
---|---|---|---|---|
NSGA-II | NSGA-II-A | 0 | 0 | 10 |
MOEA/D | MOEA/D-A | 1 | 0 | 9 |
SMS-EMOA | SMS-EMOA-A | 2 | 0 | 8 |
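As a rough illustration of how such win/tie/loss tables are produced, the sketch below tallies pairwise comparisons with a normal-approximation Wilcoxon rank-sum test. The helper names (`ranksum_p`, `tally`) and the simplified test, which ignores the tie correction a library routine such as `scipy.stats.ranksums` would apply, are our own; they are not the paper's actual evaluation code.

```python
import math

def ranksum_p(x, y):
    """Two-sided p-value of the Wilcoxon rank-sum test via the normal
    approximation (sketch only: no tie correction, no exact tables)."""
    n1, n2 = len(x), len(y)
    pooled = sorted(x + y)
    r1 = sum(pooled.index(v) + 1 for v in x)           # rank sum of sample x
    mu = n1 * (n1 + n2 + 1) / 2.0                      # mean of r1 under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)  # std dev of r1 under H0
    z = (r1 - mu) / sigma
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

def tally(runs_a, runs_b, alpha=0.05):
    """Wins/ties/losses of method A against method B over per-problem
    indicator samples, lower values (e.g. Delta_2, d_H) being better."""
    wins = ties = losses = 0
    for xa, xb in zip(runs_a, runs_b):
        if ranksum_p(xa, xb) >= alpha:
            ties += 1                                   # no significant difference
        elif sum(xa) / len(xa) < sum(xb) / len(xb):
            wins += 1                                   # A significantly better
        else:
            losses += 1                                 # B significantly better
    return wins, ties, losses

good = [1.00 + 0.01 * i for i in range(10)]  # consistently low indicator values
bad = [2.00 + 0.01 * i for i in range(10)]   # consistently high indicator values
print(tally([good, good], [bad, good]))      # (1, 1, 0)
```

For the hypervolume indicator, where larger values are better, the comparison in `tally` would simply be reversed.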

**Table 2.** Comparison (wins/ties/losses) of the results of the base MOEAs against their archive-equipped variants on the 14 three-objective test problems. The Wilcoxon rank-sum test was used for statistical significance (p-value $<0.05$).

Indicator | Method 1 | Method 2 | Wins | Ties | Losses |
---|---|---|---|---|---|
${\Delta}_{2}$ | NSGA-II | NSGA-II-A | 2 | 0 | 12 |
${\Delta}_{2}$ | MOEA/D | MOEA/D-A | 3 | 1 | 10 |
${\Delta}_{2}$ | SMS-EMOA | SMS-EMOA-A | 0 | 0 | 14 |
HV | NSGA-II | NSGA-II-A | 2 | 1 | 11 |
HV | MOEA/D | MOEA/D-A | 4 | 1 | 9 |
HV | SMS-EMOA | SMS-EMOA-A | 0 | 0 | 14 |
${d}_{H}$ | NSGA-II | NSGA-II-A | 0 | 1 | 13 |
${d}_{H}$ | MOEA/D | MOEA/D-A | 3 | 1 | 10 |
${d}_{H}$ | SMS-EMOA | SMS-EMOA-A | 0 | 0 | 14 |

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Hernández Castellanos, C.I.; Schütze, O.
A Bounded Archiver for Hausdorff Approximations of the Pareto Front for Multi-Objective Evolutionary Algorithms. *Math. Comput. Appl.* **2022**, *27*, 48.
https://doi.org/10.3390/mca27030048
