Test and Validation of the Surrogate-Based, Multi-Objective GOMORS Algorithm against the NSGA-II Algorithm in Structural Shape Optimization
Abstract
1. Introduction
2. Materials and Methods
2.1. Surrogate Models and Optimization Methods
2.2. IKOS: The Framework
2.3. Requirements for Optimization Algorithms
- The dimension of the objective and decision space;
- Maximum number of expensive function evaluations;
- Applicable problem characteristics (e.g., multimodality, divided Pareto-frontier).
- Which algorithm deals best with the problem?
- For which kind of problem can the algorithm be used?
2.4. Choice of the Optimization Algorithm
- Multi-objective optimization algorithm;
- Effective search strategy;
- A limited number of costly function evaluations (e.g., use of surrogates);
- Able to deal with the constraint function;
- Parallelization possible.
- Convergence ratio (40%);
- Parallel data evaluation (20%);
- Global search strategy (40%).
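To illustrate how these weights combine, the sketch below reproduces the weighted scores of the Appendix A comparison. The mapping from the rating symbols to fractions of a criterion's weight (`++` gets the full weight, `+` gets 3/4, `o` 1/2, `-` 1/4, `--` 0) is an assumption inferred from the table values, not stated by the authors.

```python
# Weighted scoring of optimization algorithms, as in the Appendix A tables.
WEIGHTS = {"convergence_ratio": 0.4, "parallelization": 0.2, "global_search": 0.4}

# Assumed mapping from rating symbol to fraction of the criterion weight.
RATING_FRACTION = {"++": 1.0, "+": 0.75, "o": 0.5, "-": 0.25, "--": 0.0}

def weighted_score(ratings: dict) -> float:
    """Sum of criterion weight times rating fraction over all criteria."""
    return sum(WEIGHTS[c] * RATING_FRACTION[r] for c, r in ratings.items())

# GOMORS ratings from the Appendix A table: "+", "++", "++".
gomors = {"convergence_ratio": "+", "parallelization": "++", "global_search": "++"}
print(round(weighted_score(gomors), 2))  # prints 0.9, matching the table's sum for GOMORS
```

The same mapping reproduces the sums of the other fully rated algorithms in the tables (e.g., 0.8 for MESMO and 0.6 for the NSGA-II), which supports the inferred rating scale.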
2.4.1. NSGA-II Algorithm
- Step 0—Define Algorithm Inputs: First, the user sets the maximum number of expensive function evaluations, the number of parents, and the mutation rate.
- Step 1—Initial Evaluation Points Selection: The experimental design is evaluated using expensive simulations for a predefined sampling size.
- Step 2—Iterative Improvement: To select the Pareto points of the sampling and iteratively optimize the designs, the algorithm repeats the steps of fast nondominated sorting, diversity preservation, crossover, and mutation until the maximum number of costly evaluations is reached.
- Step 2.1—Fast Nondominated Sorting: For each design, a domination count is set up, representing the number of designs that dominate it, together with the set of designs it dominates. Sorting the designs by their domination count builds fronts, of which the least dominated fronts contain the individuals with the highest potential.
- Step 2.2a—Density Estimation: For each point of each front, the method calculates the crowding distance as the distance of the current solution to its two closest neighbors, then sorts the designs by their crowding distance.
- Step 2.2b—Crowded Comparison Operator: The method prioritizes designs with a better (lower) nondomination rank. If two designs belong to the same front, and thus share the same nondomination rank, the design in the less crowded area is preferred.
- Step 2.3—Offspring Creation: The initial population based on the sampling is sorted according to the strategies above. Binary tournament selection, recombination, and mutation then create the offspring population. Finally, elitism retains a certain percentage of parents.
- Step 2.4—Calculate Responses: Expensive simulations return the demanded objective function values.
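Steps 2.1 and 2.2a can be sketched in a few lines of Python. This is an illustrative minimal version (minimization assumed), not the reference NSGA-II implementation of Deb et al.; boundary points of each front receive infinite crowding distance so they are always preserved.

```python
import numpy as np

def dominates(a, b):
    """a dominates b if it is no worse in all objectives and better in at least one."""
    return bool(np.all(a <= b) and np.any(a < b))

def fast_nondominated_sort(F):
    """Step 2.1: assign domination counts and dominated sets, then build fronts."""
    n = len(F)
    dominated_sets = [[] for _ in range(n)]   # designs dominated by design i
    counts = np.zeros(n, dtype=int)           # number of designs dominating design i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if dominates(F[i], F[j]):
                dominated_sets[i].append(j)
            elif dominates(F[j], F[i]):
                counts[i] += 1
        if counts[i] == 0:
            fronts[0].append(i)               # nondominated designs form the first front
    k = 0
    while fronts[k]:                          # peel off one front at a time
        nxt = []
        for i in fronts[k]:
            for j in dominated_sets[i]:
                counts[j] -= 1
                if counts[j] == 0:
                    nxt.append(j)
        k += 1
        fronts.append(nxt)
    return fronts[:-1]

def crowding_distance(F, front):
    """Step 2.2a: per-objective distance to the two closest neighbors."""
    d = np.zeros(len(front))
    for m in range(F.shape[1]):
        order = np.argsort(F[front, m])
        d[order[0]] = d[order[-1]] = np.inf   # boundary points are always kept
        span = F[front[order[-1]], m] - F[front[order[0]], m] or 1.0
        for p in range(1, len(front) - 1):
            d[order[p]] += (F[front[order[p + 1]], m] - F[front[order[p - 1]], m]) / span
    return d
```

For example, for the objective matrix `[[1, 2], [2, 1], [3, 3], [1.5, 1.5]]` the sort yields two fronts, `[0, 1, 3]` and `[2]`, since only the design `[3, 3]` is dominated.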
2.4.2. GOMORS Framework
- Step 0—Define Algorithm Inputs: First, the maximum number of expensive function evaluations is selected. Furthermore, the gap parameter and the number of costly evaluations per iteration are chosen.
- Step 1—Initial Evaluation Points Selection: The experimental design selects the initial set of points and evaluates them expensively via simulation.
- Step 2—Iterative Improvement: The algorithm runs iteratively until the maximum number of expensive function evaluations (abort criterion) is reached.
- Step 2.1—Fit/Update the Response Surface Models: A response surface model is fitted to each objective over the expensively evaluated simulation points.
- Step 2.2—Surrogate-Assisted Global Search: GOMORS uses an MOEA to minimize the objective functions represented by the response surface models, selecting a set of potential candidates.
- Step 2.3a—Identify Least Crowded Solution: The least crowded design is selected by calculating the crowding distance over the expensively computed data.
- Step 2.3b—Local Search: The framework uses the least crowded solution as input for an MOEA-based neighborhood search on the response surface models, called the "Gap Optimization Problem". The search is performed within the radius of the initially defined gap parameter around the least crowded solution.
- Step 2.4—Select Points for Expensive Function Evaluation: Candidates are selected by applying selection rules to the data sets from the surrogate-assisted global search and the local search.
- Step 2.5—Perform Expensive Function Evaluations and Update the Non-Dominated Solution Set: The method evaluates the candidate points by costly simulations and updates the response surface models with the new data.
- Step 3—Return Best Approximated Front: After the final loop, when the maximum number of expensive function evaluations is reached, the best non-dominated front is returned.
- It uses a symmetric Latin hypercube sampling instead of a Latin hypercube sampling;
- The MOEA of choice is the epsilon-NSGA-II following Kollat and Reed;
- Instead of the hypervolume improvement, the epsilon-progress-based selection is used.
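The iterative loop of Steps 2.1-2.5 can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' pySOT-based implementation: the embedded MOEA, the gap-based local search, and the multi-rule candidate selection are replaced by a plain random candidate search and a naive scalarized pick, a plain (rather than symmetric) Latin-hypercube-style random initial design is used, and the bi-objective test functions are hypothetical stand-ins for the costly simulation.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)

def expensive_objectives(x):
    """Stand-in for the costly simulation (hypothetical bi-objective problem)."""
    return np.array([np.sum(x**2), np.sum((x - 1.0)**2)])

def gomors_sketch(dim=4, n_init=10, max_evals=30, k_per_iter=2):
    # Step 1: initial design (random points here; GOMORS uses a symmetric LHS)
    X = rng.uniform(0.0, 1.0, size=(n_init, dim))
    Y = np.array([expensive_objectives(x) for x in X])
    while len(X) < max_evals:  # Step 2: iterate until the evaluation budget is spent
        # Step 2.1: fit one RBF response surface per objective
        surrogates = [RBFInterpolator(X, Y[:, m]) for m in range(Y.shape[1])]
        # Step 2.2: cheap global search on the surrogates
        # (random candidates stand in for the embedded MOEA)
        cand = rng.uniform(0.0, 1.0, size=(500, dim))
        pred = np.column_stack([s(cand) for s in surrogates])
        # Steps 2.3-2.4: pick k promising candidates (best predicted objective sum
        # here; GOMORS applies non-domination and crowding-based selection rules)
        pick = cand[np.argsort(pred.sum(axis=1))[:k_per_iter]]
        # Step 2.5: expensive evaluation of the picks and surrogate update
        X = np.vstack([X, pick])
        Y = np.vstack([Y, [expensive_objectives(x) for x in pick]])
    return X, Y  # Step 3: the non-dominated subset of Y approximates the front
```

The essential point the sketch captures is that the MOEA only ever queries the cheap surrogates; the expensive simulator is called just `max_evals` times in total, once per row of `X`.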
3. Results
3.1. Linear Load Case Comparison without Infeasibility
3.2. Non-Linear Load Case Comparison without Infeasibility
3.3. Optimization Results for the Linear Load Case with Infeasibility
3.4. Implementation for Surrogate-Based Optimization Schemes
4. Discussion
5. Conclusions and Outlook
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A
| Algorithm/Framework | Ant Colony Optimization (ACO) | Bacterial Foraging Optimization (BFO) | Multi-Objective Bees Algorithm (Bees) | Cooperative Bacterial Foraging Algorithm (CBFO) | Continuous Genetic Algorithm (CGA) | Differential Evolution for Multi-Objective Optimization (DEMO) | The Expected Improvement in Pareto Hypervolume (EHI) | Gap-Optimized Multi-Objective Optimization Using Response Surface (GOMORS) | Max-Value Entropy Search for Multi-Objective Optimization (MESMO) |
|---|---|---|---|---|---|---|---|---|---|
| Approach | Metaheuristic | Metaheuristic | Metaheuristic | Metaheuristic | Metaheuristic | Metaheuristic | Bayes Approach | Metaheuristic | Bayes Approach |
| Source | [3,8] | [5] | [6,7,8] | [3] | [3] | [6,7,8] | [2] | [1] | [2] |
| Convergence ratio * (1) | - | - | - | - | - | o | + * (2) | + * (2), (3) | ++ * (2), (4) |
| Parallelization | ? | ++ | ? | ? | ++ | ? | -- | ++ | -- |
| Global search strategy | + | + | + | + | + | ? | ++ | ++ | ++ |
| Convergence ratio, weighted (40%) | 0.1 | 0.1 | 0.1 | 0.1 | 0.1 | 0.2 | 0.3 | 0.3 | 0.4 |
| Parallelization, weighted (20%) | ? | 0.2 | ? | ? | 0.2 | ? | 0 | 0.2 | 0 |
| Global search strategy, weighted (40%) | 0.3 | 0.3 | 0.3 | 0.3 | 0.3 | ? | 0.4 | 0.4 | 0.4 |
| Sum (100%) | - | 0.6 | - | - | 0.6 | - | 0.7 | 0.9 | 0.8 |
* (1) Time/number of function calls/evaluations until sufficient convergence.
* (2) Update of the surrogate model necessary.
* (3) Most efficient in comparison to the NSGA-II and ParEGO.
* (4) Faster than the PESMO because of the "input space entropy-based" approach.
* (5) Enhancements include vectorization and constraint handling.
* (6) Vectorization of the NSGA-II yields 8x faster convergence to a similar frontier in comparison to the MOEA/D.
* (7) Performance fluctuates.
* (8) Problematic due to premature convergence.

Sources: [1] Akhtar & Shoemaker (2016); [2] Belakaria et al. (2019); [3] Georgiou et al. (2014); [4] Paas & van Dijk (2017); [5] Wang & Cai (2018); [6] Yang & Deb (2013); [7] Yang (2013); [8] Yang (2014); [9] Zhao et al. (2016).
| Algorithm/Framework | Multi-Objective Cuckoo Search (MOCS) | Multi-Objective Differential Evolution (MODE) | Multi-Objective Evolutionary Algorithm-Based on Decomposition (MOEA/D) | Multi-Objective Firefly Algorithm (MOFA) | Multi-Objective Flower Pollination Algorithm (MOFPA) | Non-Dominated Sorting Based Multi-Objective Evolutionary Algorithm (NSGA-II) | Enhanced Nondominated Sorting Based Multi-Objective Evolutionary Algorithm (Enhanced NSGA-II) | Pareto-Efficient Global Optimization (ParEGO) | Predictive Entropy Search for Multi-Objective Bayesian Optimization (PESMO) |
|---|---|---|---|---|---|---|---|---|---|
| Approach | Metaheuristic | Metaheuristic | Metaheuristic | Metaheuristic | Metaheuristic | Metaheuristic | Metaheuristic | Bayes Approach | Bayes Approach |
| Source | [6,7] | [6,7,8] | [4] | [7,8] | [8] | [1,6,7,8] | [4] | [1,2] | [2] |
| Convergence ratio * (1) | + | o | - | + | + | - | o * (6) | o * (2), (7) | + * (2) |
| Parallelization | ++ | ? | ++ | ++ | ++ | ++ | ++ | -- | -- |
| Global search strategy | + | ? | o | + | + | + | + | ++ | ++ |
| Convergence ratio, weighted (40%) | 0.3 | 0.2 | 0.1 | 0.3 | 0.3 | 0.1 | 0.2 | 0.2 | 0.3 |
| Parallelization, weighted (20%) | 0.2 | ? | 0.2 | 0.2 | 0.2 | 0.2 | 0.2 | 0 | 0 |
| Global search strategy, weighted (40%) | 0.3 | ? | 0.2 | 0.3 | 0.3 | 0.3 | 0.3 | 0.4 | 0.4 |
| Sum (100%) | 0.8 | - | 0.5 | 0.8 | 0.8 | 0.6 | 0.7 | 0.6 | 0.7 |
| Algorithm/Framework | Particle Swarm Optimization (PSO) | Hybrid Particle Swarm Optimization Incl. Bacterial Foraging Optimization (PSO-BFO) | Hybrid Particle Swarm Optimization Incl. Genetic Algorithm (PSO-GA) | S-Metric Section-Based Efficient Global Optimization (SMSego) | Strength Pareto Evolutionary Algorithm (SPEA) | Probability of Improvement in Stepwise Uncertainty Reduction (SUR) | Vector Evaluated Genetic Algorithm (VEGA) |
|---|---|---|---|---|---|---|---|
| Approach | Metaheuristic | Metaheuristic | Metaheuristic | Metaheuristic | Metaheuristic | Metaheuristic | Metaheuristic |
| Source | [3,5,8] | [5,9] | [5] | [2] | [6,7,8] | [2] | [6,7,8] |
| Convergence ratio * (1) | - | + | + | + * (2) | - | + * (2) | - |
| Parallelization | ++ | ++ | ++ | -- | ? | -- | ? |
| Global search strategy | - * (8) | + | + | ++ | ? | ++ | ? |
| Convergence ratio, weighted (40%) | 0.1 | 0.3 | 0.3 | 0.3 | 0.1 | 0.3 | 0.1 |
| Parallelization, weighted (20%) | 0.2 | 0.2 | 0.2 | 0.0 | ? | 0.0 | ? |
| Global search strategy, weighted (40%) | 0.1 | 0.3 | 0.3 | 0.4 | ? | 0.4 | ? |
| Sum (100%) | 0.4 | 0.8 | 0.8 | 0.7 | - | 0.7 | - |
References
- Feldhusen, J.; Grote, K.-H. Pahl/Beitz Konstruktionslehre; Methoden und Anwendung Erfolgreicher Produktentwicklung; Springer: Berlin/Heidelberg, Germany, 2013.
- Schumacher, A.; Seibel, M.; Zimmer, H.; Schäfer, M. New optimization strategies for crash design. In Proceedings of the 4th LS-DYNA Anwenderforum, Bamberg, Germany, 20–21 October 2005.
- Bletzinger, K.-U. Shape Optimization. In Encyclopedia of Computational Mechanics, 2nd ed.; Stein, E., de Borst, R., Hughes, T.J.R., Eds.; John Wiley & Sons: Chichester, UK, 2018; pp. 1–42.
- Zimmer, H. Erweiterte Knotenfunktionalität im parametrischen Entwurfswerkzeug SFE Concept. FAT 2002, Nr. 172, 1–30.
- Paas, M.H.J.W.; van Dijk, H.C. Multidisciplinary Design Optimization of Body Exterior Structures; Springer: Cham, Switzerland, 2017; Volume 41, pp. 17–30.
- Schumacher, A.; Vietor, T.; Fiebig, S.; Bletzinger, K.-U.; Maute, K. Advances in Structural and Multidisciplinary Optimization; Springer International Publishing: Cham, Switzerland, 2018.
- Duddeck, F. Multidisciplinary optimization of car bodies. Struct. Multidiscip. Optim. 2008, 35, 375–389.
- Rayamajhi, M.; Hunkeler, S.; Duddeck, F. Geometrical compatibility in structural shape optimisation for crashworthiness. Int. J. Crashworthiness 2013, 19, 42–56.
- Ryberg, A.-B.; Domeij Bäckryd, R.; Nilsson, L. Metamodel-Based Multidisciplinary Design Optimization for Automotive Applications; Linköping University Electronic Press: Linköping, Sweden, 2012.
- Younis, A.; Dong, Z. Trends, features, and tests of common and recently introduced global optimization methods. Eng. Optim. 2010, 42, 691–718.
- Palar, P.S.; Liem, R.P.; Zuhal, L.R.; Shimoyama, K. On the use of surrogate models in engineering design optimization and exploration. In Proceedings of the Genetic and Evolutionary Computation Conference Companion, New York, NY, USA, 13–17 July 2019; pp. 1592–1602.
- Vahid, G. Adaptive Search Approach in Multidisciplinary Optimization of Lightweight Structures Using Hybrid-Metaheuristics; Technische Universität Braunschweig: Braunschweig, Germany, 2020.
- Rayamajhi, M.; Hunkeler, S.; Duddeck, F.; Zarroug, M.; Rota, L. Robust Shape Optimization for Crashworthiness via a Sub-structuring Approach. In Proceedings of the 9th ASMO UK/ISSMO Conference on Engineering Design Optimization, Product and Process Improvement, Cork, Ireland, 5–6 July 2012.
- Rayamajhi, M. Efficient Methods for Robust Shape Optimisation for Crashworthiness; Technische Universität München: London, UK, 2014.
- Bäckryd, R.; Ryberg, A.-B.; Nilsson, L. Multidisciplinary design optimisation methods for automotive structures. Int. J. Automot. Mech. Eng. 2017, 14, 4050–4067.
- Yeniay, Ö. Penalty Function Methods for Constrained Optimization with Genetic Algorithms. Math. Comput. Appl. 2005, 10, 45–56.
- Malen, D.E. Fundamentals of Automobile Body Structure Design; SAE International: Warrendale, PA, USA, 2020.
- Werner, Y.; Vietor, T.; Weinert, M.; Erber, T. Multidisciplinary design optimization of a generic b-pillar under package and design constraints. Eng. Optim. 2021, 53, 1884–1901.
- Deb, K.; Pratap, A.; Agarwal, S.; Meyarivan, T. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evol. Comput. 2002, 6, 182–197.
- Werner, Y.; Thiele, P.; Gopalan, V.S.R.; Vietor, T. From package and design surfaces to optimization—How to apply shape optimization under geometrical constraints. Procedia CIRP 2021, 100, 548–553.
- Akhtar, T.; Shoemaker, C.A. Multi objective optimization of computationally expensive multi-modal functions with RBF surrogates and multi-rule selection. J. Glob. Optim. 2016, 64, 17–32.
- Duddeck, F.; Zimmer, H. New Achievements on Implicit Parameterization Techniques for Combined Shape and Topology Optimization for Crashworthiness based on SFE CONCEPT. In Proceedings of the Shape and Technology Optimization for Crashworthiness, Int. Crashworthiness Conf. ICRASH2012, Milano, Italy, 18–20 July 2012; pp. 1–14.
- Ghaffarimejlej, V.; Türck, E.; Vietor, T. Finding the best material combinations through multi-material joining, using genetic algorithm. In Proceedings of the European Conference on Composite Materials (ECCM 2016), Munich, Germany, 26–30 June 2016.
- Rayamajhi, M.; Hunkeler, S.; Duddeck, F. Efficient Robust Shape Optimization for Crashworthiness. In Proceedings of the 10th World Congress on Structural and Multidisciplinary Optimization, Orlando, FL, USA, 19–24 May 2013.
- Hillmann, J. On the Development of a Process Chain for Structural Optimization in Vehicle Passive Safety. Ph.D. Thesis, Technische Universität Berlin, Berlin, Germany, 2009.
- Schmitt, B.I. Konvergenzanalyse für die Partikelschwarmoptimierung. In Ausgezeichnete Informatikdissertationen 2015; Gesellschaft für Informatik: Bonn, Germany, 2015.
- Yang, X.-S.; Deb, S.; Fong, S. Metaheuristic Algorithms: Optimal Balance of Intensification and Diversification. Appl. Math. Inf. Sci. 2014, 8, 977–983.
- Georgiou, G.; Vio, G.A.; Cooper, J.E. Aeroelastic tailoring and scaling using Bacterial Foraging Optimisation. Struct. Multidiscip. Optim. 2014, 50, 81–99.
- Yang, X.-S.; Deb, S.; He, X. Eagle Strategy with Flower Algorithm. In Proceedings of the 2013 International Conference on Advances in Computing, Communications and Informatics (ICACCI), Mysore, India, 22–25 August 2013; pp. 1213–1217.
- Deb, K. An Evolutionary Many-Objective Optimization Algorithm Using Reference-point Based Non-dominated Sorting Approach, Part I: Solving Problems with Box Constraints. IEEE Trans. Evol. Comput. 2013, 18, 577–601.
- Wang, D.; Cai, K. Multi-objective crashworthiness optimization of vehicle body using particle swarm algorithm coupled with bacterial foraging algorithm. Proc. Inst. Mech. Eng. Part D J. Automob. Eng. 2018, 232, 1003–1018.
- Haber, R.E.; Beruvides, G.; Quiza, R.; Hernandez, A. A Simple Multi-Objective Optimization Based on the Cross-Entropy Method. IEEE Access 2017, 5, 22272–22281.
- Belakaria, S.; Deshwal, A.; Doppa, J.R. Max-value Entropy Search for Multi-Objective Bayesian Optimization. In Proceedings of the International Conference on Neural Information Processing Systems (NeurIPS), Vancouver, BC, Canada, 8–14 December 2019.
- Knowles, J. ParEGO: A hybrid algorithm with on-line landscape approximation for expensive multiobjective optimization problems. IEEE Trans. Evol. Comput. 2006, 10, 50–66.
- Chugh, T.; Sindhya, K.; Hakanen, J.; Miettinen, K. A survey on handling computationally expensive multiobjective optimization problems with evolutionary algorithms. Soft Comput. 2019, 23, 3137–3166.
- Yang, X.-S. Nature-Inspired Optimization Algorithms; Elsevier: Amsterdam, The Netherlands, 2014.
- Raja Gopalan, V.S.; Werner, Y.; van Hout, T. GOMORS Implementation in Python 3.7 Using Pysot Package. Available online: https://github.com/Vijey-Subramani-Raja-Gopalan/GOMORS_Python3.7_PYSOT0.2.0/tree/v1.0.1 (accessed on 27 January 2021).
- Georgios, K. Shape and parameter optimization with ANSA and LS-OPT using a new flexible interface. In Proceedings of the 6th European LS-DYNA Conference, Gothenburg, Sweden, 28–30 May 2007.
- Cavagna, L.; Ricci, S.; Riccobene, L. A Fast Tool for Structural Sizing, Aeroelastic Analysis and Optimization in Aircraft Conceptual Design. In Proceedings of the 50th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference, Palm Springs, CA, USA, 4–7 May 2009.
- Skinner, S.; Zare-Behtash, H. State-of-the-art in aerodynamic shape optimisation methods. Appl. Soft Comput. 2018, 62, 933–962.
- Katz, J.J. Race-Car Aerodynamics; McGraw-Hill Professional: Cambridge, MA, USA, 2015.
| Algorithm | Sampling Size | No. of Iterations | Objective Functions | DVs |
|---|---|---|---|---|
| GOMORS | 1000 | 1000 | 2 | 24 |
| NSGA-II | 1000 | 1000 | 2 | 24 |
| GOMORS | 1000 | 200 | 3 | 24 |
Share and Cite
Werner, Y.; van Hout, T.; Raja Gopalan, V.S.; Vietor, T. Test and Validation of the Surrogate-Based, Multi-Objective GOMORS Algorithm against the NSGA-II Algorithm in Structural Shape Optimization. Algorithms 2022, 15, 46. https://doi.org/10.3390/a15020046