Hybrid Bio-Optimized Algorithms for Hyperparameter Tuning in Machine Learning Models: A Software Defect Prediction Case Study
Abstract
1. Introduction
1.1. Motivation
- Based on the aforementioned explanation, hybrid approaches are becoming increasingly popular for enhancing the convergence performance of individual algorithms.
- The effectiveness and scalability of hybrid bio-optimized algorithms (BoAs) across diverse scientific domains inspired us to propose novel and competitive hybrid methodologies.
- Hyperparameter tuning can be regarded as an optimization problem, as it involves searching a space of hyperparameter configurations to maximize an objective function such as accuracy.
- The use of hybrid BoAs for solving the hyperparameter tuning problem is motivated by their ability to handle efficient exploration–exploitation trade-offs and scalability.
1.2. Contribution
- Design of an enhanced GOA, the gravitational force Lévy flight grasshopper optimization algorithm (GFLFGOA), which introduces the Lévy flight (LF) and gravitational force (GF) concepts to balance exploration and exploitation.
- Design of a novel hybrid gravitational force Lévy flight grasshopper optimization algorithm–sparrow search algorithm (GFLFGOA-SSA) that combines the GFLFGOA with the sparrow search algorithm (SSA) to accelerate the convergence rate.
- Design of a hybrid gravitational force grasshopper optimization algorithm–sparrow search algorithm (GFGOA-SSA) by embedding the SSA into the gravitational force grasshopper optimization algorithm (GFGOA).
- Design of a hybrid Lévy flight grasshopper optimization algorithm–sparrow search algorithm (LFGOA-SSA).
- Extensive experiments on benchmark functions (BFs) and on hyperparameter tuning of the XGBoost (XGB) and artificial neural network (ANN) models to demonstrate the proposed algorithms' superiority over state-of-the-art techniques.
2. Literature Survey
- bio-optimized approach to optimization problems;
- bio-optimized approach to hyperparameter tuning problems.
2.1. Bio-Optimized Approach to Optimization Problems
2.2. Bio-Optimized Approach for Hyperparameter Tuning Problems
3. Concepts
3.1. Grasshopper Optimization Algorithm (GOA)
- $X_i$ Position of the $i$th grasshopper;
- $S_i$ Social interaction;
- $G_i$ Gravitational force;
- $A_i$ Wind advection.
- $n$ Number of grasshoppers;
- $d_{ij}$ Distance from the $i$th grasshopper to the $j$th grasshopper;
- $\hat{d}_{ij}$ Unit vector from the $i$th grasshopper to the $j$th grasshopper.
- $x_i$ The $i$th grasshopper's position;
- $x_j$ The $j$th grasshopper's position.
- $f$ Intensity of attraction;
- $l$ Attractive length scale.
- $g$ Gravitational constant;
- $\hat{e}_g$ Unity vector towards the center of the earth.
- $u$ Constant drift;
- $\hat{e}_w$ Unity vector in the wind direction.
- $ub_d$ and $lb_d$ Upper bound and lower bound of the $d$th dimension of the grasshopper, respectively;
- $\hat{T}_d$ Target or optimal position in the $d$th dimension (best solution found so far);
- $c$ A decreasing coefficient to shrink the comfort, repulsion, and attraction areas.
- $c_{max}$ Maximum value of $c$;
- $c_{min}$ Minimum value of $c$;
- $r$ Current iteration;
- $G$ Maximum iteration.
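The symbols above correspond to the standard GOA formulation of Saremi et al. (see References); for convenience, that formulation is usually written as:

```latex
% Standard GOA model and its normalized update (Saremi et al.),
% using the symbols defined above.
X_i = S_i + G_i + A_i, \qquad
S_i = \sum_{\substack{j=1 \\ j \ne i}}^{n} s\!\left(d_{ij}\right)\hat{d}_{ij}, \qquad
s(r) = f\,e^{-r/l} - e^{-r}, \qquad
G_i = -g\,\hat{e}_g, \qquad
A_i = u\,\hat{e}_w,

X_i^d = c \left( \sum_{\substack{j=1 \\ j \ne i}}^{n}
        c\,\frac{ub_d - lb_d}{2}\,
        s\!\left(\left| x_j^d - x_i^d \right|\right)
        \frac{x_j - x_i}{d_{ij}} \right) + \hat{T}_d, \qquad
c = c_{max} - r\,\frac{c_{max} - c_{min}}{G}.
```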
3.2. Lévy Flight GOA (LFGOA)
- $s$ Random step length of the Lévy flight;
- $\beta$ Power-law index, $\beta \in (0, 2]$.
- $s = M / |P|^{1/\beta}$ Step length for the random walk;
- $\beta$ = 1.5;
- $M$ Random variable that follows a normal distribution with standard deviation $\sigma_M$;
- $P$ Random variable that follows a normal distribution with standard deviation $\sigma_P$.
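A minimal sketch of this step-length generation is given below. It follows Mantegna's algorithm, taking $\sigma_P = 1$ and the standard closed-form expression for $\sigma_M$; both are conventions assumed here rather than stated above.

```python
# Hedged sketch of Levy flight step generation (Mantegna's algorithm),
# consistent with the symbols above: s = M / |P|**(1/beta), beta = 1.5.
import numpy as np
from math import gamma

def levy_step(dim: int, beta: float = 1.5) -> np.ndarray:
    # Standard deviation sigma_M of M (sigma_P is taken as 1).
    sigma_m = (gamma(1 + beta) * np.sin(np.pi * beta / 2)
               / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    m = np.random.normal(0.0, sigma_m, dim)   # M ~ N(0, sigma_M^2)
    p = np.random.normal(0.0, 1.0, dim)       # P ~ N(0, sigma_P^2)
    return m / np.abs(p) ** (1 / beta)        # random step length s
```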
3.3. Gravitational Force GOA (GFGOA)
- $g$ Gravitational constant (0.9);
- $\hat{e}_g$ Unity vector towards the center of the earth.
3.4. Sparrow Search Algorithm (SSA)
- $r$ Current iteration;
- $L$ Matrix of size (1 × d) with all elements equal to 1;
- $d$ Dimension of the variable;
- $X_{i,j}^{r}$ Position of the $i$th sparrow in the $j$th dimension at iteration $r$;
- $G$ Maximum iterations;
- $Q$ Random number which obeys a normal distribution;
- $\alpha$ Random number in (0, 1];
- $R_2$ Alarm value, $R_2 \in [0, 1]$;
- $ST$ Safety threshold, $ST \in [0.5, 1.0]$;
- $R_2 < ST$ Signals that there are no predators, so the producers can search in wide mode;
- $R_2 \ge ST$ Signals that a predator has been detected, so the sparrows must fly to another place.
- $X_{worst}^{r}$ Current global worst location at iteration $r$;
- $X_P$ The $i$th sparrow's current best location obtained so far;
- $Q$ Random number which obeys a normal distribution;
- $A$ One-dimensional vector whose elements are randomly assigned 1 or −1;
- $A^{+} = A^{T}(AA^{T})^{-1}$;
- $i > n/2$ Indicates that the $i$th scrounger with a worse fitness value has to fly elsewhere to forage;
- $n$ Number of sparrows.
- $X_{best}^{r}$ Current global optimal location at iteration $r$;
- $\beta$ Step-size control parameter: a random number that follows a normal distribution with mean 0 and variance 1;
- $K$ Random number in [−1, 1];
- $f_i$ Current sparrow's fitness value;
- $f_g$ Current global best fitness value;
- $f_w$ Current global worst fitness value;
- $\varepsilon$ Small constant value (to avoid a division-by-zero error);
- $f_i > f_g$ Indicates that the sparrows are at the edge of the group;
- $f_i = f_g$ Indicates that the sparrows are in the middle of the population.
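These symbols follow the original SSA formulation of Xue and Shen (see References); for convenience, the producer, scrounger, and scout updates are usually written as:

```latex
% Producer update:
X_{i,j}^{r+1} =
  \begin{cases}
    X_{i,j}^{r} \cdot \exp\!\left(\dfrac{-i}{\alpha \cdot G}\right) & R_2 < ST \\[4pt]
    X_{i,j}^{r} + Q \cdot L & R_2 \ge ST
  \end{cases}

% Scrounger update:
X_{i,j}^{r+1} =
  \begin{cases}
    Q \cdot \exp\!\left(\dfrac{X_{worst}^{r} - X_{i,j}^{r}}{i^2}\right) & i > n/2 \\[4pt]
    X_{P}^{r+1} + \left| X_{i,j}^{r} - X_{P}^{r+1} \right| \cdot A^{+} \cdot L & \text{otherwise}
  \end{cases}

% Scout (danger-aware) update:
X_{i,j}^{r+1} =
  \begin{cases}
    X_{best}^{r} + \beta \cdot \left| X_{i,j}^{r} - X_{best}^{r} \right| & f_i > f_g \\[4pt]
    X_{i,j}^{r} + K \cdot \dfrac{\left| X_{i,j}^{r} - X_{worst}^{r} \right|}{(f_i - f_w) + \varepsilon} & f_i = f_g
  \end{cases}
```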
4. Methodology
4.1. Proposed Algorithm
4.1.1. GFLFGOA Hybrid Algorithm
Algorithm 1: GFLFGOA
4.1.2. LFGOA-SSA Hybrid Algorithm
Algorithm 2: LFGOA-SSA
4.1.3. GFGOA-SSA Hybrid Algorithm
Algorithm 3: GFGOA-SSA
4.1.4. GFLFGOA-SSA Hybrid Algorithm
Algorithm 4: GFLFGOA-SSA
4.2. Optimization on Benchmark Functions (BFs)
4.3. Software Defect Prediction (SDP) Framework
4.3.1. Data Source
4.3.2. Data Pre-Processing
- Label Encoding: Label encoding is a data transformation step that maps non-numeric values to numeric values. In this research work, the datasets carry yes (Y) and no (N) labels, which are mapped to 0 and 1, respectively, in this step.
- Data Cleaning: In this process, the data are cleaned by removing outliers and replacing inconsistent values with the mean of the corresponding attribute. Outliers are detected and replaced using the inter-quartile range (IQR) method.
- Feature Selection: Because the datasets contain many features, feature selection is performed using the Pearson correlation coefficient to avoid bias towards any one kind of feature; the selected features are tabulated in Table 3.
- Data Scaling: Data scaling, commonly known as normalization, is performed with a min-max scaler. A minimal sketch of this pre-processing pipeline is given below.
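The following sketch strings the four steps together. The target column name ("Defective") and the 0.5 correlation threshold are illustrative assumptions, not values taken from this paper.

```python
# A minimal sketch of the pre-processing pipeline described above.
import pandas as pd
from sklearn.preprocessing import MinMaxScaler

def preprocess(df: pd.DataFrame, target: str = "Defective") -> pd.DataFrame:
    # Label encoding: map the non-numeric labels to numeric values (Y -> 0, N -> 1).
    df = df.copy()
    df[target] = df[target].map({"Y": 0, "N": 1})
    features = df.columns.drop(target)

    # Data cleaning: clip IQR outliers per attribute, then replace
    # inconsistent (missing) values with the attribute mean.
    for col in features:
        q1, q3 = df[col].quantile(0.25), df[col].quantile(0.75)
        iqr = q3 - q1
        df[col] = df[col].clip(q1 - 1.5 * iqr, q3 + 1.5 * iqr)
    df[features] = df[features].fillna(df[features].mean())

    # Feature selection: keep features whose absolute Pearson correlation
    # with the target exceeds an assumed (illustrative) threshold.
    corr = df[features].corrwith(df[target]).abs()
    selected = corr[corr > 0.5].index.tolist()

    # Data scaling: min-max normalization of the selected features.
    df[selected] = MinMaxScaler().fit_transform(df[selected])
    return df[selected + [target]]
```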
4.3.3. Hybrid Bio-Optimized Defect Prediction (HBoDP) Model
5. Results and Discussion
5.1. Parameter Settings
5.1.1. NASA Defect Dataset Description
5.1.2. Hyperparameters of the ML Models
5.2. Results
5.2.1. BF Results
5.2.2. SDP Framework Results
5.3. Analysis
5.3.1. BF Analysis
- Mean Analysis: For a better understanding of the mean analysis of the BFs, we present the discussion by BF type.
- Fixed-dimension Multimodal BFs (Six Hump Camel, Branin, and Booth): GFLFGOA-SSA ranks first, achieving the lowest mean value for Booth, as tabulated in Table 7. For Six Hump Camel and Branin, the proposed algorithms share their rank with the base algorithms; even so, the convergence rates are comparable to those of the base algorithms, as visualized in Figure 6d,e.
- Multi-dimension Multimodal BFs (Zakharov, Rastrigin, and Schaffer 6): Except for Rastrigin, GFLFGOA-SSA and LFGOA-SSA rank first for Zakharov and Schaffer 6, respectively. Even though SSA ranks first for Rastrigin, its convergence rate is relatively close to that of GFLFGOA-SSA, as plotted in Figure 6h.
- SD Analysis: To avoid ambiguity, we elaborate the discussion by BF type.
- Fixed-dimension Multimodal BFs (Six Hump Camel, Branin, and Booth): For Booth, GFLFGOA-SSA achieves the lowest SD value, as tabulated in Table 8. For Six Hump Camel, all of the other proposed algorithms have comparable convergence rates, despite the fact that SSA ranks first, as plotted in Figure 6d, while for Branin, the convergence rate of GFGOA is nearly identical to that of the proposed algorithms, despite its SD being zero.
- Multi-dimension Multimodal BFs (Zakharov, Rastrigin, and Schaffer 6): From Table 8, it can be clearly stated that one of our proposed algorithms ranks first for each of these three BFs: GFLFGOA-SSA, GFLFGOA, and LFGOA-SSA achieve the lowest SD values for Zakharov, Rastrigin, and Schaffer 6, respectively.
5.3.2. SDP Framework Analysis
- Algorithm Accuracy Analysis: From Table 9 and Table 11, it can be clearly stated that, for all datasets, accuracy improves when the XGB and ANN models are tuned with BoAs. As tabulated in Table 13, for JM1, MW1, PC1, PC3, PC4, and PC5, our proposed algorithms have better optimization effects than the four base algorithms, while for CM1, KC3, KC4, MC1, and PC2, the proposed algorithms perform on par with, or very close to, the base approaches. In hyperparameter tuning of the ANN model, GFGOA-SSA, GFLFGOA-SSA, and GFGOA-SSA show the best accuracy for JM1, PC3, and PC5, respectively. For the remaining datasets, our algorithms are close or equal to the base approaches, as shown in Table 11 and Table 14.
- Algorithm Runtime Analysis: From Table 10, embedding the LF concept, the GF concept, or both into the SSA yields the lowest computational runtime for all datasets except CM1 and MW1; for these two, the runtime difference from LFGOA is small. As listed in Table 12, for CM1, JM1, KC1, KC4, MC1, MW1, PC1, PC3, and PC5, the base algorithms' computational runtime is lower than that of the proposed algorithms, whereas for the other datasets, our proposed algorithms' runtime shows superiority.
5.3.3. Computational Complexity Analysis
- Time Complexity: In general, the time complexity can be defined as follows:
Time Complexity = O(Initialization) + [O(Fitness evaluation of each search agent) + O(Position update of the agents) + O(Sorting)] × Maximum iterations.
Mathematically, this can be coded as follows:
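The expression above can be instantiated as follows, assuming $n$ search agents, $d$ dimensions, $G$ iterations, a comparison-based sort, and one fitness evaluation costing $O(d)$; these assumptions are ours and are not stated above:

```latex
% Assumed instantiation: n agents, d dimensions, G iterations.
O(n \cdot d) + \left[ O(n \cdot d) + O(n \cdot d) + O(n \log n) \right] \cdot G
= O\!\left( G \cdot n \cdot \left( d + \log n \right) \right)
```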
- Space Complexity: The maximum amount of space occupied by the proposed algorithm at any time is determined by the random initialization of the population. So, the space complexity can be calculated as follows:
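With the same assumed notation ($n$ agents, each storing a $d$-dimensional position):

```latex
% Space dominated by the randomly initialized population.
O(n \times d)
```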
6. Concluding Remarks and Future Scope
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Van Thieu, N.; Mirjalili, S. MEALPY: An open-source library for latest meta-heuristic algorithms in Python. J. Syst. Archit. 2023, 139, 102871. [Google Scholar] [CrossRef]
- Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–73. [Google Scholar] [CrossRef]
- Bäck, T.; Schwefel, H.P. An overview of evolutionary algorithms for parameter optimization. Evol. Comput. 1993, 1, 1–23. [Google Scholar] [CrossRef]
- Eberhart, R.; Kennedy, J. Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar]
- Meraihi, Y.; Gabis, A.B.; Mirjalili, S.; Ramdane-Cherif, A. Grasshopper optimization algorithm: Theory, variants, and applications. IEEE Access 2021, 9, 50001–50024. [Google Scholar] [CrossRef]
- Wu, H.S.; Zhang, F.M. Wolf pack algorithm for unconstrained global optimization. Math. Probl. Eng. 2014, 2014, 465082. [Google Scholar] [CrossRef]
- Karaboga, D.; Basturk, B. An artificial bee colony (ABC) algorithm for numeric function optimization. In Proceedings of the IEEE Swarm Intelligence Symposium, Indianapolis, IN, USA, 28–29 September 2006; IEEE: Piscataway, NJ, USA, 2006; Volume 2006. [Google Scholar]
- Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
- Hussain, K.; Mohd Salleh, M.N.; Cheng, S.; Shi, Y. Metaheuristic research: A comprehensive survey. Artif. Intell. Rev. 2019, 52, 2191–2233. [Google Scholar] [CrossRef]
- Nevendra, M.; Singh, P. Empirical investigation of hyperparameter optimization for software defect count prediction. Expert Syst. Appl. 2022, 191, 116217. [Google Scholar] [CrossRef]
- Nematzadeh, S.; Kiani, F.; Torkamanian-Afshar, M.; Aydin, N. Tuning hyperparameters of machine learning algorithms and deep neural networks using metaheuristics: A bioinformatics study on biomedical and biological cases. Comput. Biol. Chem. 2022, 97, 107619. [Google Scholar] [CrossRef]
- Lentzas, A.; Nalmpantis, C.; Vrakas, D. Hyperparameter tuning using quantum genetic algorithms. In Proceedings of the 2019 IEEE 31st International Conference on Tools with Artificial Intelligence (ICTAI), Portland, OR, USA, 4–6 November 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1412–1416. [Google Scholar]
- Shepperd, M.; Song, Q.; Sun, Z.; Mair, C. Data quality: Some comments on the nasa software defect datasets. IEEE Trans. Softw. Eng. 2013, 39, 1208–1215. [Google Scholar] [CrossRef]
- Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
- Zang, H.; Zhang, S.; Hapeshi, K. A review of nature-inspired algorithms. J. Bionic Eng. 2010, 7, S232–S237. [Google Scholar] [CrossRef]
- McCall, J. Genetic algorithms for modelling and optimisation. J. Comput. Appl. Math. 2005, 184, 205–222. [Google Scholar] [CrossRef]
- Karaboga, D.; Basturk, B. A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm. J. Glob. Optim. 2007, 39, 459–471. [Google Scholar] [CrossRef]
- Zhen, L.; Liu, Y.; Dongsheng, W.; Wei, Z. Parameter estimation of software reliability model and prediction based on hybrid wolf pack algorithm and particle swarm optimization. IEEE Access 2020, 8, 29354–29369. [Google Scholar] [CrossRef]
- Zulfiqar, M.; Kamran, M.; Rasheed, M.; Alquthami, T.; Milyani, A. Hyperparameter optimization of support vector machine using adaptive differential evolution for electricity load forecasting. Energy Rep. 2022, 8, 13333–13352. [Google Scholar] [CrossRef]
- Blume, S.; Benedens, T.; Schramm, D. Hyperparameter optimization techniques for designing software sensors based on artificial neural networks. Sensors 2021, 21, 8435. [Google Scholar] [CrossRef]
- Akter, S.; Nahar, N.; ShahadatHossain, M.; Andersson, K. A new crossover technique to improve genetic algorithm and its application to TSP. In Proceedings of the 2019 International Conference on Electrical, Computer and Communication Engineering (ECCE), Cox’s Bazar, Bangladesh, 7–9 February 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1–6. [Google Scholar]
- Si, B.; Liu, F.; Li, Y. Metamodel-based hyperparameter optimization of optimization algorithms in building energy optimization. Buildings 2023, 13, 167. [Google Scholar] [CrossRef]
- Xue, J.; Shen, B. A novel swarm intelligence optimization approach: Sparrow search algorithm. Syst. Sci. Control Eng. 2020, 8, 22–34. [Google Scholar] [CrossRef]
- Zhang, Z.; Han, Y. Discrete sparrow search algorithm for symmetric traveling salesman problem. Appl. Soft Comput. 2022, 118, 108469. [Google Scholar] [CrossRef]
- Yang, X.; Liu, J.; Liu, Y.; Xu, P.; Yu, L.; Zhu, L.; Chen, H.; Deng, W. A novel adaptive sparrow search algorithm based on chaotic mapping and t-distribution mutation. Appl. Sci. 2021, 11, 11192. [Google Scholar] [CrossRef]
- Ouyang, C.; Qiu, Y.; Zhu, D. Adaptive spiral flying sparrow search algorithm. Sci. Program. 2021, 2021, 1–16. [Google Scholar] [CrossRef]
- Liang, Q.; Chen, B.; Wu, H.; Han, M. A novel modified sparrow search algorithm based on adaptive weight and improved boundary constraints. In Proceedings of the 2021 IEEE 6th International Conference on Computer and Communication Systems (ICCCS), Chengdu, China, 23–26 April 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 104–109. [Google Scholar]
- Zhao, Q.; Tao, R.; Li, J.; Mu, Y. An improved wolf pack algorithm. In Proceedings of the 2020 Chinese Control And Decision Conference (CCDC), Hefei, China, 22–24 August 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 626–633. [Google Scholar]
- Li, H.; Wu, H. An oppositional wolf pack algorithm for Parameter identification of the chaotic systems. Optik 2016, 127, 9853–9864. [Google Scholar] [CrossRef]
- Xiu, Z.; Zhen-Hua, W. Improved Wolf Pack Algorithm Based on Tent Chaotic Mapping and Levy Flight. In Proceedings of the 2017 International Conference on Robots & Intelligent System (ICRIS), Huai’an, China, 15–16 October 2017; pp. 165–169. [Google Scholar] [CrossRef]
- Chen, X.; Cheng, F.; Liu, C.; Cheng, L.; Mao, Y. An improved Wolf pack algorithm for optimization problems: Design and evaluation. PLoS ONE 2021, 16, e0254239. [Google Scholar] [CrossRef] [PubMed]
- Jadon, S.S.; Tiwari, R.; Sharma, H.; Bansal, J.C. Hybrid artificial bee colony algorithm with differential evolution. Appl. Soft Comput. 2017, 58, 11–24. [Google Scholar] [CrossRef]
- Mirjalili, S.; Wang, G.G.; Coelho, L.d.S. Binary optimization using hybrid particle swarm optimization and gravitational search algorithm. Neural Comput. Appl. 2014, 25, 1423–1435. [Google Scholar] [CrossRef]
- Li, Z.; Yu, M.; Wang, D.; Wei, H. Using hybrid algorithm to estimate and predicate based on software reliability model. IEEE Access 2019, 7, 84268–84283. [Google Scholar] [CrossRef]
- Yang, L.; Li, Z.; Wang, D.; Miao, H.; Wang, Z. Software defects prediction based on hybrid particle swarm optimization and sparrow search algorithm. IEEE Access 2021, 9, 60865–60879. [Google Scholar] [CrossRef]
- Saremi, S.; Mirjalili, S.; Lewis, A. Grasshopper optimisation algorithm: Theory and application. Adv. Eng. Softw. 2017, 105, 30–47. [Google Scholar] [CrossRef]
- Abualigah, L.; Diabat, A. A comprehensive survey of the Grasshopper optimization algorithm: Results, variants, and applications. Neural Comput. Appl. 2020, 32, 15533–15556. [Google Scholar] [CrossRef]
- Razmjooy, N.; Estrela, V.V.; Loschi, H.J.; Fanfan, W. A comprehensive survey of new meta-heuristic algorithms. In Recent Advances in Hybrid Metaheuristics for Data Clustering; Wiley Publishing: Hoboken, NJ, USA, 2019. [Google Scholar]
- El-Henawy, I.; Abdelmegeed, N.A. Meta-heuristics algorithms: A survey. Int. J. Comput. Appl. 2018, 179, 45–54. [Google Scholar] [CrossRef]
- Arora, S.; Anand, P. Chaotic grasshopper optimization algorithm for global optimization. Neural Comput. Appl. 2019, 31, 4385–4405. [Google Scholar] [CrossRef]
- Zhao, S.; Wang, P.; Heidari, A.A.; Zhao, X.; Ma, C.; Chen, H. An enhanced Cauchy mutation grasshopper optimization with trigonometric substitution: Engineering design and feature selection. Eng. Comput. 2022, 38, 4583–4616. [Google Scholar] [CrossRef]
- Ewees, A.A.; Gaheen, M.A.; Yaseen, Z.M.; Ghoniem, R.M. Grasshopper optimization algorithm with crossover operators for feature selection and solving engineering problems. IEEE Access 2022, 10, 23304–23320. [Google Scholar] [CrossRef]
- Yildiz, B.S.; Pholdee, N.; Bureerat, S.; Yildiz, A.R.; Sait, S.M. Enhanced grasshopper optimization algorithm using elite opposition-based learning for solving real-world engineering problems. Eng. Comput. 2022, 38, 4207–4219. [Google Scholar] [CrossRef]
- Feng, Y.; Liu, M.; Zhang, Y.; Wang, J. A dynamic opposite learning assisted grasshopper optimization algorithm for the flexible jobscheduling problem. Complexity 2020, 2020, 8870783. [Google Scholar] [CrossRef]
- Qin, P.; Hu, H.; Yang, Z. The improved grasshopper optimization algorithm and its applications. Sci. Rep. 2021, 11, 23733. [Google Scholar] [CrossRef]
- Behera, R.K.; Shukla, S.; Rath, S.K.; Misra, S. Software reliability assessment using machine learning technique. In Computational Science and Its Applications–ICCSA 2018: Proceedings of the 18th International Conference, Melbourne, VIC, Australia, 2–5 July 2018, Proceedings, Part V 18; Springer: Berlin/Heidelberg, Germany, 2018; pp. 403–411. [Google Scholar]
- Batool, I.; Khan, T.A. Software fault prediction using deep learning techniques. Softw. Qual. J. 2023, 31, 1241–1280. [Google Scholar] [CrossRef]
- Jayanthi, R.; Florence, L. Software defect prediction techniques using metrics based on neural network classifier. Clust. Comput. 2019, 22, 77–88. [Google Scholar] [CrossRef]
- Fan, G.; Diao, X.; Yu, H.; Yang, K.; Chen, L. Software defect prediction via attention-based recurrent neural network. Sci. Program. 2019, 2019, 6230953. [Google Scholar] [CrossRef]
- Batool, I.; Khan, T.A. Software fault prediction using data mining, machine learning and deep learning techniques: A systematic literature review. Comput. Electr. Eng. 2022, 100, 107886. [Google Scholar] [CrossRef]
- Sobhana, M.; Preethi, G.S.S.; Sri, G.H.; Sujitha, K.B. Improved Reliability Prediction in Engineering Systems Based on Artificial Neural Network. In Proceedings of the 2022 International Mobile and Embedded Technology Conference (MECON), Noida, India, 10–11 March 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 455–460. [Google Scholar]
- Jindal, A.; Gupta, A. Comparative Analysis of Software Reliability Prediction Using Machine Learning and Deep Learning. In Proceedings of the 2022 Second International Conference on Artificial Intelligence and Smart Energy (ICAIS), Coimbatore, India, 23–25 February 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 389–394. [Google Scholar]
- Alghanim, F.; Azzeh, M.; El-Hassan, A.; Qattous, H. Software defect density prediction using deep learning. IEEE Access 2022, 10, 114629–114641. [Google Scholar] [CrossRef]
- Clemente, C.J.; Jaafar, F.; Malik, Y. Is predicting software security bugs using deep learning better than the traditional machine learning algorithms? In Proceedings of the 2018 IEEE International Conference on Software Quality, Reliability and Security (QRS), Lisbon, Portugal, 16–20 July 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 95–102. [Google Scholar]
- Wongpheng, K.; Visutsak, P. Software defect prediction using convolutional neural network. In Proceedings of the 2020 35th International Technical Conference on Circuits/Systems, Computers and Communications (ITC-CSCC), Nagoya, Japan, 3–6 July 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 240–243. [Google Scholar]
- Cetiner, M.; Sahingoz, O.K. A comparative analysis for machine learning based software defect prediction systems. In Proceedings of the 2020 11th International Conference on Computing, Communication and Networking Technologies (ICCCNT), Kharagpur, India, 1–3 July 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–7. [Google Scholar]
- Alsaeedi, A.; Khan, M.Z. Software defect prediction using supervised machine learning and ensemble techniques: A comparative study. J. Softw. Eng. Appl. 2019, 12, 85–100. [Google Scholar] [CrossRef]
- Li, R.; Zhou, L.; Zhang, S.; Liu, H.; Huang, X.; Sun, Z. Software defect prediction based on ensemble learning. In Proceedings of the 2019 2nd International Conference on Data Science and Information Technology, Seoul, Republic of Korea, 19–21 July 2019; pp. 1–6. [Google Scholar]
- Iqbal, A.; Aftab, S.; Ali, U.; Nawaz, Z.; Sana, L.; Ahmad, M.; Husen, A. Performance analysis of machine learning techniques on software defect prediction using NASA datasets. Int. J. Adv. Comput. Sci. Appl. 2019, 10, 300–308. [Google Scholar] [CrossRef]
- Malhotra, R. Comparative analysis of statistical and machine learning methods for predicting faulty modules. Appl. Soft Comput. 2014, 21, 286–297. [Google Scholar] [CrossRef]
- Parashar, A.; Kumar Goyal, R.; Kaushal, S.; Kumar Sahana, S. Machine learning approach for software defect prediction using multi-core parallel computing. Autom. Softw. Eng. 2022, 29, 44. [Google Scholar] [CrossRef]
- Topaz, C.M.; Bernoff, A.J.; Logan, S.; Toolson, W. A model for rolling swarms of locusts. Eur. Phys. J. Spec. Top. 2008, 157, 93–109. [Google Scholar] [CrossRef]
- Kamaruzaman, A.F.; Zain, A.M.; Yusuf, S.M.; Udin, A. Levy flight algorithm for optimization problems-a literature review. Appl. Mech. Mater. 2013, 421, 496–501. [Google Scholar] [CrossRef]
- Luo, J.; Chen, H.; Xu, Y.; Huang, H.; Zhao, X. An improved grasshopper optimization algorithm with application to financial stress prediction. Appl. Math. Model. 2018, 64, 654–668. [Google Scholar] [CrossRef]
- Jamil, M.; Yang, X.S. A literature survey of benchmark functions for global optimisation problems. Int. J. Math. Model. Numer. Optim. 2013, 4, 150–194. [Google Scholar] [CrossRef]
- Lemon, B. The Effect of Locality Based Learning on Software Defect Prediction; West Virginia University: Morgantown, WV, USA, 2010. [Google Scholar]
- Ali, M.; Mazhar, T.; Arif, Y.; Al-Otaibi, S.; Ghadi, Y.Y.; Shahzad, T.; Khan, M.A.; Hamam, H. Software Defect Prediction Using an Intelligent Ensemble-Based Model. IEEE Access 2024, 12, 20376–20395. [Google Scholar] [CrossRef]
- Yadav, S. Software Reliability Prediction by using Deep Learning Technique. Int. J. Adv. Comput. Sci. Appl. 2022, 13, 683–693. [Google Scholar] [CrossRef]
- Mumtaz, B.; Kanwal, S.; Alamri, S.; Khan, F. Feature Selection Using Artificial Immune Network: An Approach for Software Defect Prediction. Intell. Autom. Soft Comput. 2021, 29, 669–684. [Google Scholar] [CrossRef]
- Odejide, B.J.; Bajeh, A.O.; Balogun, A.O.; Alanamu, Z.O.; Adewole, K.S.; Akintola, A.G.; Salihu, S.A.; Usman-Hamza, F.E.; Mojeed, H.A. An empirical study on data sampling methods in addressing class imbalance problem in software defect prediction. In Computer Science On-Line Conference; Springer: Cham, Switzerland, 2022; pp. 594–610. [Google Scholar]
- Balogun, A.O.; Basri, S.; Capretz, L.F.; Mahamad, S.; Imam, A.A.; Almomani, M.A.; Adeyemo, V.E.; Alazzawi, A.K.; Bajeh, A.O.; Kumar, G. Software defect prediction using wrapper feature selection based on dynamic re-ranking strategy. Symmetry 2021, 13, 2166. [Google Scholar] [CrossRef]
- Balogun, A.O.; Lafenwa-Balogun, F.B.; Mojeed, H.A.; Adeyemo, V.E.; Akande, O.N.; Akintola, A.G.; Bajeh, A.O.; Usman-Hamza, F.E. SMOTE-based homogeneous ensemble methods for software defect prediction. In Computational Science and Its Applications–ICCSA 2020: Proceedings of the 20th International Conference, Cagliari, Italy, 1–4 July 2020, Proceedings, Part VI 20; Springer: Berlin/Heidelberg, Germany, 2020; pp. 615–631. [Google Scholar]
Symbols | Explanation |
---|---|
r | Current iteration |
G | Maximum iteration |
ST | Safety Threshold [0.5, 1.0] |
SD | Scrounger Sparrow |
PD | Producer Sparrow |
Return the best position and the corresponding fitness value. With respect to our flowcharts, drawn in Figure 1, Figure 2, Figure 3 and Figure 4, the algorithm returns the current best individual after the initialization step, whereas, once the loop condition becomes false, it returns the global best position and the corresponding fitness value.
Name | Mathematical Equation | Type | (n, G, dim) | Range | $f_{min}$ |
---|---|---|---|---|---|
Quartic | $f(x)=\sum_{i=1}^{dim} i\,x_i^4 + \mathrm{rand}[0,1)$ | Unimodal | 100, 200, 30 | [−1.28, 1.28] | 0 |
Rosenbrock | $f(x)=\sum_{i=1}^{dim-1}\left[100\left(x_{i+1}-x_i^2\right)^2+\left(x_i-1\right)^2\right]$ | Unimodal | 100, 200, 30 | [−30, 30] | 0 |
Schwefel 2.21 | $f(x)=\max_i \lvert x_i \rvert$ | Unimodal | 100, 200, 30 | [−100, 100] | 0 |
Six Hump Camel | $f(x)=\left(4-2.1x_1^2+\frac{x_1^4}{3}\right)x_1^2+x_1 x_2+\left(-4+4x_2^2\right)x_2^2$ | Fixed-dimension Multimodal | 100, 200, 2 | [−5, 5] | −1.0316 |
Branin | $f(x)=\left(x_2-\frac{5.1}{4\pi^2}x_1^2+\frac{5}{\pi}x_1-6\right)^2+10\left(1-\frac{1}{8\pi}\right)\cos x_1+10$ | Fixed-dimension Multimodal | 100, 200, 2 | $x_1 \in [-5, 10]$, $x_2 \in [0, 15]$ | 0.397887 |
Booth | $f(x)=(x_1+2x_2-7)^2+(2x_1+x_2-5)^2$ | Fixed-dimension Multimodal | 100, 200, 2 | [−10, 10] | 0 |
Zakharov | $f(x)=\sum_{i=1}^{dim} x_i^2+\left(\sum_{i=1}^{dim} 0.5\,i\,x_i\right)^2+\left(\sum_{i=1}^{dim} 0.5\,i\,x_i\right)^4$ | Multi-dimension Multimodal | 100, 200, 30 | [−5, 10] | 0 |
Rastrigin | $f(x)=\sum_{i=1}^{dim}\left[x_i^2-10\cos(2\pi x_i)+10\right]$ | Multi-dimension Multimodal | 100, 200, 30 | [−5.12, 5.12] | 0 |
Schaffer 6 | $f(x)=\sum_{i=1}^{dim-1}\left[0.5+\frac{\sin^2\!\left(\sqrt{x_i^2+x_{i+1}^2}\right)-0.5}{\left(1+0.001\left(x_i^2+x_{i+1}^2\right)\right)^2}\right]$ | Multi-dimension Multimodal | 100, 200, 30 | [−100, 100] | 0 |
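As a concrete example, here is a minimal Python implementation of the Rastrigin function from the table above, using its standard definition with the tabulated range and optimum:

```python
# Rastrigin benchmark: multimodal, global minimum 0 at the origin,
# evaluated over [-5.12, 5.12] in each of the dim dimensions.
import numpy as np

def rastrigin(x: np.ndarray) -> float:
    return 10 * x.size + float(np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))
```

For instance, `rastrigin(np.zeros(30))` returns 0.0, the tabulated optimum.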
Dataset | Description | Target Feature | Original Features | Selected Features | Instances |
---|---|---|---|---|---|
CM1 | Spacecraft instrument’s software | Defective | 38 | 26 | 327 |
JM1 | Real-time predictive ground system | label | 22 | 16 | 10,878 |
KC1 | Spacecraft's ground data system (storage management) | Defective | 22 | 15 | 2107 |
KC3 | Flight software system | Defective | 40 | 25 | 194 |
KC4 | Software system related to spacecraft operations | Defective | 41 | 35 | 125 |
MC1 | Spacecraft’s data processing system | Defective | 40 | 30 | 9466 |
MC2 | Spacecraft's power distribution software system | Defective | 40 | 27 | 124 |
MW1 | Ground data system for a weather satellite | Defective | 38 | 28 | 250 |
PC1 | Air traffic control software system | Defective | 38 | 26 | 679 |
PC2 | Spacecraft's attitude control software system | Defective | 37 | 25 | 722 |
PC3 | University’s administration software system | Defective | 38 | 25 | 1053 |
PC4 | Spacecraft’s orbit determination software system | Defective | 38 | 27 | 1270 |
PC5 | Satellite’s ground software system | Defective | 39 | 26 | 1694 |
Hyperparameters | Lower Bound | Upper Bound |
---|---|---|
Learning rate | 0.001 | 0.1 |
Neurons in layer 1 | 8 | 10 |
Neurons in layer 2 | 6 | 10 |
Batch size | 4 | 32 |
Epochs | 10 | 50 |
Hyperparameters | Lower Bound | Upper Bound |
---|---|---|
Learning rate | 0.001 | 0.1 |
Max depth | 3 | 18 |
Subsample | 0 | 1 |
N estimators | 50 | 200 |
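As an illustration of how these bounds drive the search, the following is a hedged sketch of a fitness function in which a search agent's position vector is decoded into XGB hyperparameters and scored by validation accuracy. The train/validation split, the rounding of integer-valued parameters, and the clipping of subsample away from zero are our assumptions, not details stated in the paper.

```python
# Hedged sketch: decode a search agent's position into XGB hyperparameters
# (bounds from the table above) and score it by validation accuracy.
import numpy as np
from xgboost import XGBClassifier
from sklearn.metrics import accuracy_score

def xgb_fitness(position, X_train, y_train, X_val, y_val):
    lr, max_depth, subsample, n_estimators = position
    model = XGBClassifier(
        learning_rate=float(lr),                          # [0.001, 0.1]
        max_depth=int(round(max_depth)),                  # [3, 18]
        subsample=float(np.clip(subsample, 1e-3, 1.0)),   # (0, 1]
        n_estimators=int(round(n_estimators)),            # [50, 200]
    )
    model.fit(X_train, y_train)
    return accuracy_score(y_val, model.predict(X_val))
```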
Specification | Size |
---|---|
RAM | 32 GB |
Hard Disk | 512 GB |
Processor | Intel Core i7 |
OS Name | Ubuntu |
OS Type | 64-bit |
BF | Index | GFLFGOA | GFLFGOA-SSA | LFGOA-SSA | GFGOA-SSA | LFGOA | GFGOA | GOA | SSA |
---|---|---|---|---|---|---|---|---|---|
Quartic | Mean | | | | | | | | |
Quartic | Rank | 7 | 3 | 4 | 2 | 5 | 8 | 6 | 1 |
Rosenbrock | Mean | | | | | | | | |
Rosenbrock | Rank | 6 | 2 | 4 | 1 | 7 | 5 | 8 | 3 |
Schwefel 2.21 | Mean | | | | | | | | |
Schwefel 2.21 | Rank | 6 | 3 | 4 | 2 | 8 | 5 | 7 | 1 |
Six Hump Camel | Mean | | | | | | | | |
Six Hump Camel | Rank | 5 | 2 | 1 | 4 | 2 | 6 | 3 | 1 |
Branin | Mean | | | | | | | | |
Branin | Rank | 4 | 2 | 2 | 2 | 1 | 5 | 3 | 1 |
Booth | Mean | | | | | | | | |
Booth | Rank | 8 | 1 | 2 | 3 | 4 | 6 | 5 | 7 |
Zakharov | Mean | | | | | | | | |
Zakharov | Rank | 7 | 1 | 4 | 3 | 6 | 8 | 5 | 2 |
Rastrigin | Mean | | | | | | | | |
Rastrigin | Rank | 8 | 4 | 2 | 3 | 5 | 7 | 6 | 1 |
Schaffer 6 | Mean | | | | | | | | |
Schaffer 6 | Rank | 6 | 4 | 1 | 3 | 7 | 5 | 8 | 2 |
BF | Index | GFLFGOA | GFLFGOA-SSA | LFGOA-SSA | GFGOA-SSA | LFGOA | GFGOA | GOA | SSA |
---|---|---|---|---|---|---|---|---|---|
Quartic | SD | | | | | | | | |
Quartic | Rank | 1 | 5 | 6 | 4 | 8 | 2 | 7 | 3 |
Rosenbrock | SD | | | | | | | | |
Rosenbrock | Rank | 6 | 2 | 4 | 1 | 7 | 5 | 8 | 3 |
Schwefel 2.21 | SD | | | | | | | | |
Schwefel 2.21 | Rank | 7 | 2 | 4 | 3 | 5 | 8 | 6 | 1 |
Six Hump Camel | SD | | | | | | | | |
Six Hump Camel | Rank | 8 | 5 | 2 | 7 | 6 | 4 | 3 | 1 |
Branin | SD | | | | | | | | |
Branin | Rank | 8 | 5 | 6 | 3 | 4 | 1 | 7 | 2 |
Booth | SD | | | | | | | | |
Booth | Rank | 7 | 1 | 2 | 3 | 5 | 8 | 4 | 6 |
Zakharov | SD | | | | | | | | |
Zakharov | Rank | 4 | 1 | 8 | 3 | 7 | 5 | 6 | 2 |
Rastrigin | SD | | | | | | | | |
Rastrigin | Rank | 1 | 6 | 4 | 5 | 8 | 2 | 7 | 3 |
Schaffer 6 | SD | | | | | | | | |
Schaffer 6 | Rank | 7 | 4 | 1 | 3 | 6 | 5 | 8 | 2 |
Dataset | Without Optimization | GFLFGOA | GFLFGOA-SSA | LFGOA-SSA | GFGOA-SSA | LFGOA | GFGOA | GOA | SSA |
---|---|---|---|---|---|---|---|---|---|
CM1 | 0.89024 | 0.93902 | 0.93902 | 0.92683 | 0.93902 | 0.92683 | 0.92683 | 0.92683 | 0.93902 |
JM1 | 0.81066 | 0.81471 | 0.81507 | 0.81434 | 0.81691 | 0.81434 | 0.81213 | 0.81397 | 0.81654 |
KC1 | 0.86338 | 0.87666 | 0.87666 | 0.87097 | 0.87666 | 0.86717 | 0.87856 | 0.86907 | 0.87856 |
KC3 | 0.81633 | 0.85714 | 0.85714 | 0.85714 | 0.85714 | 0.83674 | 0.85714 | 0.83674 | 0.85714 |
KC4 | 0.6875 | 0.75 | 0.78125 | 0.78125 | 0.78125 | 0.71875 | 0.78125 | 0.75 | 0.75 |
MC1 | 0.99451 | 0.99620 | 0.99662 | 0.99620 | 0.99620 | 0.99535 | 0.99662 | 0.99620 | 0.99620 |
MC2 | 0.70968 | 0.83871 | 0.83871 | 0.83871 | 0.83871 | 0.80645 | 0.83871 | 0.83871 | 0.90323 |
MW1 | 0.88 | 0.90476 | 0.90476 | 0.90476 | 0.92064 | 0.90476 | 0.90476 | 0.90476 | 0.90476 |
PC1 | 0.91176 | 0.92941 | 0.92941 | 0.92941 | 0.93529 | 0.91765 | 0.92353 | 0.92941 | 0.92941 |
PC2 | 0.98895 | 0.99448 | 0.99448 | 0.99448 | 0.99448 | 0.99448 | 0.99448 | 0.99448 | 0.99448 |
PC3 | 0.86364 | 0.88258 | 0.88258 | 0.875 | 0.88258 | 0.87879 | 0.87879 | 0.875 | 0.875 |
PC4 | 0.90252 | 0.94025 | 0.93306 | 0.93082 | 0.93082 | 0.93711 | 0.93711 | 0.92453 | 0.93082 |
PC5 | 0.77830 | 0.82075 | 0.80896 | 0.81132 | 0.81604 | 0.81368 | 0.81604 | 0.81132 | 0.81604 |
Dataset | GFLFGOA | GFLFGOA-SSA | LFGOA-SSA | GFGOA-SSA | LFGOA | GFGOA | GOA | SSA |
---|---|---|---|---|---|---|---|---|
CM1 | 8.63894 | 2.62227 | 5.97816 | 8.16711 | 6.80014 | 4.67062 | 17.40778 | 4.14228 |
JM1 | 49.11337 | 11.16519 | 23.01613 | 6.44055 | 20.11788 | 18.97812 | 100.45693 | 32.35944 |
KC1 | 10.98801 | 13.28012 | 4.71760 | 2.79241 | 9.85696 | 7.21173 | 30.87882 | 6.74143 |
KC3 | 5.16992 | 4.89649 | 4.92847 | 3.61582 | 7.55046 | 2.61733 | 11.18174 | 7.2783 |
KC4 | 3.37044 | 2.72615 | 1.03054 | 8.00315 | 3.01872 | 7.93459 | 7.89999 | 10.81161 |
MC1 | 20.06377 | 16.28936 | 17.45409 | 3.18721 | 20.94327 | 10.61918 | 36.69793 | 7.54776 |
MC2 | 6.51045 | 7.53665 | 6.08344 | 3.0346 | 4.42068 | 3.79698 | 19.6863 | 5.38492 |
MW1 | 11.61992 | 8.84075 | 8.75449 | 8.91545 | 6.90571 | 7.68827 | 17.27596 | 14.46098 |
PC1 | 16.15804 | 5.14336 | 2.1437 | 7.26307 | 5.33309 | 11.99368 | 7.72782 | 13.36885 |
PC2 | 8.65440 | 7.13130 | 4.98462 | 8.60696 | 9.58645 | 5.79961 | 6.66192 | 12.31117 |
PC3 | 15.00149 | 16.57482 | 8.22984 | 3.48169 | 11.09310 | 4.54789 | 25.59780 | 11.45645 |
PC4 | 11.35519 | 12.56271 | 9.46519 | 11.60129 | 18.41754 | 14.41984 | 31.31732 | 43.08894 |
PC5 | 14.16040 | 3.07928 | 8.78662 | 7.02953 | 22.22073 | 8.27753 | 20.31468 | 21.10666 |
Dataset | Without Optimization | GFLFGOA | GFLFGOA-SSA | LFGOA-SSA | GFGOA-SSA | LFGOA | GFGOA | GOA | SSA |
---|---|---|---|---|---|---|---|---|---|
CM1 | 0.87805 | 0.93902 | 0.95122 | 0.93902 | 0.93902 | 0.91463 | 0.93902 | 0.91463 | 0.95122 |
JM1 | 0.79669 | 0.81471 | 0.81544 | 0.81397 | 0.81581 | 0.81544 | 0.81544 | 0.81360 | 0.81544 |
KC1 | 0.84820 | 0.87476 | 0.86907 | 0.87287 | 0.87287 | 0.87287 | 0.87097 | 0.87856 | 0.88046 |
KC3 | 0.77551 | 0.85714 | 0.85714 | 0.87755 | 0.85714 | 0.83674 | 0.85714 | 0.87755 | 0.85714 |
KC4 | 0.5625 | 0.71875 | 0.71875 | 0.8125 | 0.78125 | 0.71875 | 0.84375 | 0.71875 | 0.75 |
MC1 | 0.99408 | 0.99578 | 0.99535 | 0.99578 | 0.99535 | 0.99471 | 0.99578 | 0.99535 | 0.99578 |
MC2 | 0.74194 | 0.83871 | 0.83871 | 0.87097 | 0.87097 | 0.83871 | 0.90323 | 0.87097 | 0.87097 |
MW1 | 0.88889 | 0.93651 | 0.93651 | 0.93651 | 0.93651 | 0.93651 | 0.93651 | 0.93651 | 0.93651 |
PC1 | 0.89412 | 0.94118 | 0.94706 | 0.94706 | 0.93529 | 0.92941 | 0.94118 | 0.94706 | 0.93529 |
PC2 | 0.98895 | 1 | 1 | 1 | 1 | 0.99448 | 1 | 0.99448 | 0.99448 |
PC3 | 0.82576 | 0.86742 | 0.88258 | 0.87879 | 0.87121 | 0.875 | 0.87879 | 0.85606 | 0.875 |
PC4 | 0.90252 | 0.93082 | 0.92767 | 0.92453 | 0.93082 | 0.93396 | 0.93082 | 0.93711 | 0.94339 |
PC5 | 0.74057 | 0.81132 | 0.80425 | 0.80660 | 0.81368 | 0.80660 | 0.80660 | 0.80660 | 0.80896 |
Dataset | GFLFGOA | GFLFGOA-SSA | LFGOA-SSA | GFGOA-SSA | LFGOA | GFGOA | GOA | SSA |
---|---|---|---|---|---|---|---|---|
CM1 | 14.15148 | 19.03989 | 10.93558 | 14.25837 | 19.88457 | 20.53972 | 49.70118 | 59.26953 |
JM1 | 329.74139 | 461.88978 | 280.21241 | 258.71992 | 372.32476 | 428.38037 | 100.74072 | 677.38826 |
KC1 | 67.28775 | 40.93038 | 34.27968 | 43.71838 | 33.20376 | 27.49754 | 127.15726 | 370.54866 |
KC3 | 16.96921 | 10.03377 | 16.24724 | 22.79089 | 10.27701 | 12.48286 | 22.32237 | 25.73986 |
KC4 | 13.65393 | 17.94655 | 11.18101 | 10.96659 | 28.47858 | 8.85138 | 16.14852 | 30.20686 |
MC1 | 403.20674 | 185.79484 | 196.88806 | 375.0098 | 138.50669 | 213.44733 | 413.83031 | 505.23935 |
MC2 | 20.73147 | 9.0823 | 9.80959 | 16.01918 | 12.00442 | 12.04628 | 13.47418 | 26.23941 |
MW1 | 25.6989 | 10.81326 | 11.2502 | 16.40268 | 13.56527 | 9.44403 | 17.95148 | 27.54926 |
PC1 | 26.2941 | 24.40644 | 19.09213 | 19.34053 | 24.48024 | 33.99985 | 17.26851 | 58.35074 |
PC2 | 10.24549 | 54.52347 | 33.17081 | 26.83333 | 36.66479 | 101.2086 | 22.20873 | 41.70491 |
PC3 | 19.32979 | 33.10067 | 32.45259 | 40.49498 | 14.1666 | 31.19656 | 23.19152 | 71.15856 |
PC4 | 28.11799 | 30.82265 | 35.64099 | 32.15147 | 68.57497 | 45.70245 | 31.37875 | 100.55374 |
PC5 | 287.27928 | 72.0665 | 62.88953 | 70.00658 | 27.62535 | 30.31593 | 46.86406 | 91.27525 |
Dataset | Accuracy (Highest) | Runtime (Lowest) |
---|---|---|
CM1 | GFLFGOA-SSA, GFLFGOA, GFGOA-SSA, SSA | GFLFGOA-SSA |
JM1 | GFGOA-SSA | GFGOA-SSA |
KC1 | GFGOA, SSA | GFGOA-SSA |
KC3 | GFGOA, GFLFGOA-SSA, SSA, GFGOA-SSA, GFLFGOA | GFGOA |
KC4 | GFGOA-SSA, GFGOA, LFGOA-SSA, GFLFGOA-SSA | LFGOA-SSA |
MC1 | GFLFGOA-SSA, GFGOA | GFGOA-SSA |
MC2 | SSA | GFGOA-SSA |
MW1 | GFGOA-SSA | LFGOA |
PC1 | GFGOA-SSA | LFGOA-SSA |
PC2 | GFLFGOA, GFLFGOA-SSA, LFGOA-SSA, GFGOA-SSA, LFGOA, GFGOA, GOA, SSA | LFGOA-SSA |
PC3 | GFGOA-SSA, GFLFGOA, GFLFGOA-SSA | GFGOA-SSA |
PC4 | GFLFGOA | LFGOA-SSA |
PC5 | GFLFGOA | GFLFGOA-SSA |
Dataset | Accuracy (Highest) | Runtime (Lowest) |
---|---|---|
CM1 | GFLFGOA-SSA, SSA | GOA |
JM1 | GFGOA-SSA | GOA |
KC1 | SSA | GFGOA |
KC3 | LFGOA-SSA, GOA | GFLFGOA-SSA |
KC4 | GFGOA | GFGOA |
MC1 | GFLFGOA, LFGOA-SSA, GFGOA, SSA | LFGOA |
MC2 | GFGOA | GFLFGOA-SSA |
MW1 | GFLFGOA, GFLFGOA-SSA, LFGOA-SSA, GFGOA-SSA, LFGOA, GFGOA, GOA, SSA | GFGOA |
PC1 | GFLFGOA-SSA, LFGOA-SSA, GOA | GOA |
PC2 | GFGOA-SSA, GFGOA, GFLFGOA, GFLFGOA-SSA, LFGOA-SSA | GFLFGOA |
PC3 | GFLFGOA-SSA | LFGOA |
PC4 | SSA | GFLFGOA |
PC5 | GFGOA-SSA | LFGOA |
Datasets | CM1 | JM1 | KC1 | KC3 | KC4 | MC1 | MC2 | MW1 | PC1 | PC2 | PC3 | PC4 | PC5 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
[67] | 0.8687 | 0.7912 | - | - | - | - | 0.6842 | 0.8933 | 0.9216 | - | 0.8797 | 0.8714 | - |
[57] | 0.83 | 0.78 | - | - | - | - | 0.68 | - | 0.91 | - | 0.84 | 0.84 | - |
[56] | 0.878 | 0.803 | 0.850 | - | - | - | - | - | 0.922 | - | - | - | - |
[68] | - | 0.89 | 0.84 | - | - | 0.95 | - | - | 0.85 | 0.86 | 0.83 | 0.89 | 0.91 |
[48] | - | 0.81 | 0.79 | - | - | - | - | - | - | - | 0.89 | - | - |
[59] | 0.7755 | 0.7396 | - | - | - | - | 0.6486 | 0.8266 | - | - | 0.8259 | 0.8608 | - |
[69] | 0.8179 | - | - | - | - | - | - | - | 0.8979 | - | - | - | - |
[70] | - | - | - | - | - | - | - | - | - | - | 0.8192 | - | - |
[71] | - | - | - | - | - | - | 0.6832 | 0.6005 | - | - | 0.737 | 0.869 | - |
[72] | - | - | - | - | - | - | - | - | - | - | - | 0.8661 | - |
[53] (GA-SVM) | 0.9035 | - | 0.8514 | 0.7989 | 0.9035 | 0.9950 | 0.6710 | 0.9183 | 0.9367 | 0.9959 | 0.9014 | 0.8821 | - |
[53] (PSO-SVM) | 0.9037 | - | 0.8514 | 0.9040 | 0.6885 | 0.9950 | 0.6706 | 0.9179 | 0.9367 | 0.9959 | 0.9021 | 0.8793 | - |
[53] (GAPSO-SVM) | 0.9049 | - | 0.8438 | 0.8995 | 0.6782 | 0.9951 | 0.6783 | 0.9157 | 0.9386 | 0.9959 | 0.9040 | 0.8779 | - |
Proposed (GFLFGOA-XGB) | 0.9390 | 0.8147 | 0.8767 | 0.8571 | 0.75 | 0.9962 | 0.8387 | 0.9048 | 0.9294 | 0.9945 | 0.8826 | 0.9403 | 0.8208 |
Proposed (GFLFGOA-SSA-XGB) | 0.9390 | 0.8151 | 0.8767 | 0.8571 | 0.7813 | 0.9966 | 0.8387 | 0.9048 | 0.9294 | 0.9945 | 0.8826 | 0.9331 | 0.8090 |
Proposed (LFGOA-SSA-XGB) | 0.9268 | 0.8143 | 0.8710 | 0.8571 | 0.7813 | 0.9962 | 0.8387 | 0.9048 | 0.9294 | 0.9945 | 0.875 | 0.9308 | 0.8113 |
Proposed (GFGOA-SSA-XGB) | 0.9390 | 0.8169 | 0.8767 | 0.8571 | 0.7813 | 0.9962 | 0.8387 | 0.9206 | 0.9353 | 0.9945 | 0.8826 | 0.9308 | 0.8160 |
Proposed (GFLFGOA-ANN) | 0.9390 | 0.8147 | 0.8748 | 0.8571 | 0.7188 | 0.9958 | 0.8387 | 0.9365 | 0.9412 | 1.0 | 0.8674 | 0.9308 | 0.8113 |
Proposed (GFLFGOA-SSA-ANN) | 0.9512 | 0.8154 | 0.8691 | 0.8571 | 0.7188 | 0.9954 | 0.8387 | 0.9365 | 0.9471 | 1.0 | 0.8826 | 0.9277 | 0.8043 |
Proposed (LFGOA-SSA-ANN) | 0.9390 | 0.8140 | 0.8729 | 0.8776 | 0.8125 | 0.9958 | 0.8710 | 0.9365 | 0.9471 | 1.0 | 0.8788 | 0.9245 | 0.8066 |
Proposed (GFGOA-SSA-ANN) | 0.9390 | 0.8158 | 0.8729 | 0.8571 | 0.7813 | 0.9954 | 0.8710 | 0.9365 | 0.9353 | 1.0 | 0.8712 | 0.9308 | 0.8137 |