An Improved Expeditious Meta-Heuristic Clustering Method for Classifying Student Psychological Issues with Homogeneous Characteristics
Abstract
1. Introduction
- Improvement of GWO. This research develops an improved grey wolf clustering algorithm (iGWCA) to classify student psychological problems. The improvement targets two key issues: slow convergence and the tendency to become trapped in local optima.
- Validation of iGWCA. Statistical analysis and comparison of the results obtained by iGWCA with various established algorithms consistently demonstrate superior accuracy and efficiency.
2. Related Work
- iGWCA provides simplicity and ease of implementation.
- Only a few control parameters for tuning.
- The iGWCA offers better convergence speed and an improved balance between exploration and exploitation processes.
- It shows robustness and applicability in classifying psychological problems.
3. Problem Model Construction
- X is the set of the student data set;
- C is the set of clusters;
- dij is the distance between data point i and data point j;
- μk is the centroid of cluster k;
- σk is the average distance from the centroid of cluster k to the student data points within the cluster.
- C is the set of clusters;
- μi and μk are the centroids of clusters i and k, respectively;
- dμiμk is the distance between μi and μk;
- σi and σk are the average distances from centroids μi and μk to the data points within clusters i and k, respectively.
- Xij is the binary variable indicating whether data point i belongs to cluster j;
- min_cluster_size and max_cluster_size are the minimum and maximum allowed numbers of data points in a cluster, respectively.
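From these definitions, the Davies-Bouldin Index (DBI) used later as the objective function can be computed directly. A minimal NumPy sketch (function and variable names are illustrative, not the paper's implementation):

```python
import numpy as np

def davies_bouldin(X, labels, centroids):
    """Davies-Bouldin Index: lower values mean compact, well-separated clusters."""
    K = len(centroids)
    # sigma[k]: average distance from centroid mu_k to its member points
    sigma = np.array([
        np.mean(np.linalg.norm(X[labels == k] - centroids[k], axis=1))
        for k in range(K)
    ])
    ratios = np.zeros(K)
    for i in range(K):
        worst = 0.0
        for k in range(K):
            if i == k:
                continue
            d = np.linalg.norm(centroids[i] - centroids[k])  # distance between mu_i and mu_k
            worst = max(worst, (sigma[i] + sigma[k]) / d)
        ratios[i] = worst  # worst-case similarity for cluster i
    return ratios.mean()
```

Each cluster contributes its worst-case ratio of within-cluster scatter to between-centroid separation; the DBI is the mean of these ratios over all clusters.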
4. Proposed Method
4.1. Introduction to the Principle and Basic Process of GWO Algorithm
- Initially, the algorithm randomly places a pack of grey wolves in the search space, with each wolf representing a potential solution corresponding to a set of cluster centroids.
- Secondly, a predefined DBI objective function evaluates the current positions of the wolves (i.e., candidate cluster centroids) against the student psychological problem data.
- Then, during each iteration, the algorithm evaluates the fitness of each wolf using the objective function and updates their positions accordingly, aiming to minimize the DBI and improve clustering quality.
- Finally, once the optimization reaches the maximum number of iterations, the final solution, consisting of optimized cluster centroids, is obtained. These clusters can be used to classify student psychological problems based on their similarities, with each student assigned to a specific cluster.
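The final assignment step described above, mapping each student record to its nearest optimized centroid, can be sketched as follows (an illustrative helper, not taken from the paper):

```python
import numpy as np

def assign_clusters(X, centroids):
    """Assign each data point to the nearest centroid (Euclidean distance)."""
    # dists[i, k] = distance from point i to centroid k, via broadcasting
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return np.argmin(dists, axis=1)
```

Given the optimized centroids returned by the algorithm, `assign_clusters(X, best_centroids)` yields the cluster label for every student record.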
4.2. The Proposed iGWCA Algorithm
- T is the total number of iterations, t is the current iteration, and k is a constant
- Step 1: Initialization of the student datasets
- Initialize the student stress datasets (dataset-I and dataset-II), then set the number of search agents and the maximum number of iterations. Each grey wolf moves and updates its location according to the prey's location, and the algorithm defines the position of the best agent from the positions of the grey wolves and the prey via the control parameters, using Equations (4) and (5). The control parameter decreases during the simulation, and the fluctuation rate decreases with it, maintaining a balanced search process.
- Step 2: DBI Objective Function
- Next, initialize the population of grey wolves within the search space and assess the quality of each grey wolf’s solution using the DBI objective function f(x).
- Step 3: Hierarchy Formation
- Grey wolves in the population are categorized into alpha (α), beta (β), and delta (δ) types of wolves. The α wolves are the best solutions, followed by β and δ wolves. Within the algorithm loop, continuously update the positions of the α, β, and δ wolves within the search space, ensuring all wolf positions remain feasible.
- Step 4: Update α, β, and δ Positions:
- The algorithm selects the α, β, and δ wolves based on their performance. These wolves estimate the prey's location, and the positions of the remaining wolves are updated around the prey according to the following equations.
- t is the current iteration, A and C are the coefficient vectors, Xp is the position of the prey, and X is the position of a wolf.
- Dα, Dβ, and Dδ are the distances to the α, β, and δ wolves, r1 and r2 are random values in [0, 1], and a is a value that decreases from 2 to 0.
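Step 4 follows the standard GWO update of Mirjalili et al.: each wolf moves to the average of three position estimates derived from the α, β, and δ leaders. A sketch assuming that canonical formulation:

```python
import numpy as np

def gwo_step(wolves, alpha, beta, delta, a, rng):
    """One GWO position update: each wolf moves toward the mean of the
    estimates given by the alpha, beta, and delta leaders."""
    total = np.zeros_like(wolves)
    for leader in (alpha, beta, delta):
        r1 = rng.random(wolves.shape)
        r2 = rng.random(wolves.shape)
        A = 2 * a * r1 - a               # coefficient A = 2*a*r1 - a
        C = 2 * r2                       # coefficient C = 2*r2
        D = np.abs(C * leader - wolves)  # distance to the leader's estimate
        total += leader - A * D          # X1, X2, X3 in the GWO equations
    return total / 3.0                   # new position = mean of the three estimates
```

With `a` near 2, the random coefficient A has large magnitude and wolves diverge from the leaders (exploration); as `a` shrinks toward 0, wolves converge on the leaders' estimates (exploitation).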
- Step 5: Update the Position of the prey
- ub and lb are the upper and lower boundaries, r is the center, and m is the mapping parameter, which is updated at each iteration
- Step 6: Boundary Constraints
- The DBI function evaluates student clustering quality based on cluster separation, subject to constraints ensuring that each student data point belongs to exactly one cluster and that each cluster is allocated an allowed number of student data points.
- Step 7: Update Fitness Values
- Step 8: Update Hierarchy of α, β, and δ wolves
- Update the hierarchy of wolves according to their fitness values.
- Step 9: Termination Criteria
- Update the convergence curve to monitor optimization progress until a stopping criterion is satisfied. When the stopping criterion is satisfied, return the best solution obtained from the optimization process; otherwise, repeat the loop.
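Step 1 notes that the control parameter decreases during the run with a decaying fluctuation rate. One common nonlinear schedule using the constant k from above is sketched here; this is an assumption for illustration, the paper's exact formula is given in its Equations (4) and (5):

```python
def control_parameter(t, T, k=2, a0=2.0):
    """Nonlinear decay of the GWO control parameter a from a0 down to 0.

    t: current iteration, T: total number of iterations, k: shaping constant.
    With k > 1, a stays large early (favoring exploration) and drops faster
    late in the run (favoring exploitation), unlike the linear a0 * (1 - t/T).
    """
    return a0 * (1.0 - (t / T) ** k)
```

Since the coefficient A is drawn from [-a, a], shrinking a directly shrinks the wolves' fluctuation range, which is how the schedule balances exploration and exploitation.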
Algorithm 1: Pseudo code of the proposed iGWCA.
1 | Input the student stress dataset |
2 | Initialize the population of grey wolves randomly within the search space using iGWCA |
3 | Initialize a, A, and C |
4 | Evaluate the fitness value of each grey wolf solution using the Davies-Bouldin Index f(x) |
5 | While the stopping criterion is not met |
6 | for each search agent |
7 | Update the positions of wolves |
8 | end for |
9 | Update a, A, and C |
10 | Evaluate the fitness value of each grey wolf using the Davies-Bouldin Index f(x) |
11 | Update the new position of wolves |
12 | t = t + 1 |
13 | End while |
14 | Return the best obtained solution of iGWCA |
4.3. Time Complexity Analysis of iGWCA Algorithm
5. Experimental Results
5.1. Experimental Method Description
- n is the number of runs, and FFVi represents the best fitness value obtained in the ith run.
- FFVmin is the minimum value of the fitness function.
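The statistics reported below reduce to simple aggregates over independent runs, e.g. the mean and minimum of the best fitness values across n runs. A small helper (illustrative names):

```python
def summarize_runs(ffv):
    """Aggregate the best fitness values (FFV) from n independent runs."""
    n = len(ffv)
    avg = sum(ffv) / n  # average best fitness over the n runs
    return {"avg": avg, "min": min(ffv), "max": max(ffv)}
```

For a minimizing objective like the DBI, `min(ffv)` is the best value over all runs and `max(ffv)` the worst, matching the "Best Value" and "Worst Value" rows in the statistical tables.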
5.2. Discussion on Statistical, Convergence Performance
5.2.1. Efficiency Experiments
5.2.2. Exploring Data Credibility
5.3. Comparative Analysis of Classification Results Based on Dataset-I
5.3.1. Dataset Description
- (1) Test Case 1: Academic Performance Stress
- (2) Test Case 2: Future Career Concern Stress
- (3) Test Case 3: Headache Stress
5.3.2. Classification Result and Analysis
5.4. Comparative Analysis of Classification Results Based on Dataset-II
5.4.1. Dataset Description
- (1) Test Case 5: Growing Stress
- (2) Test Case 6: Social Weakness
5.4.2. Classification Result and Analysis
6. Conclusions
- (1) Evaluation of other optimization algorithms, such as whale optimization or moth-flame optimization, for classifying student psychological issues is a promising area for future research. Refining the algorithm parameters and exploring adaptive mechanisms could further improve its efficiency in accurately classifying different psychological issues.
- (2) Additionally, integrating multimodal data sources, such as physiological measurements, behavioral patterns, and academic performance metrics, could further enhance the algorithm's classification accuracy.
- (3) Furthermore, investigating the customization of the algorithm to address the unique challenges posed by student psychological data, such as variability in symptom expression and the dynamic nature of mental health conditions, is significant.
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Antonopoulou, M.; Mantzorou, M.; Serdari, A.; Bonotis, K.; Vasios, G.; Pavlidou, E.; Trifonos, C.; Vadikolias, K.; Petridis, D.; Giaginis, C. Evaluating Mediterranean diet adherence in university student populations: Does this dietary pattern affect students’ academic performance and mental health? Int. J. Health Plan. Manag. 2020, 35, 5–21. [Google Scholar] [CrossRef] [PubMed]
- Auerbach, R.P.; Mortier, P.; Bruffaerts, R.; Alonso, J.; Benjet, C.; Cuijpers, P.; Demyttenaere, K.; Ebert, D.D.; Green, J.G.; Hasking, P. WHO world mental health surveys international college student project: Prevalence and distribution of mental disorders. J. Abnorm. Psychol. 2018, 127, 623. [Google Scholar] [CrossRef]
- Eisenberg, D.; Hunt, J.; Speer, N. Mental health in American colleges and universities: Variation across student subgroups and across campuses. J. Nerv. Ment. Dis. 2013, 201, 60–67. [Google Scholar] [CrossRef] [PubMed]
- Herbert, C.; Meixner, F.; Wiebking, C.; Gilg, V. Regular physical activity, short-term exercise, mental health, and well-being among university students: The results of an online and a laboratory study. Front. Psychol. 2020, 11, 491804. [Google Scholar] [CrossRef]
- Idris, F.; Zulkipli, I.N.; Abdul-Mumin, K.H.; Ahmad, S.R.; Mitha, S.; Rahman, H.A.; Rajabalaya, R.; David, S.R.; Naing, L. Academic experiences, physical and mental health impact of COVID-19 pandemic on students and lecturers in health care education. BMC Med. Educ. 2021, 21, 542. [Google Scholar] [CrossRef]
- Tang, Q.; Zhao, Y.; Wei, Y.; Jiang, L. Research on the mental health of college students based on fuzzy clustering algorithm. Secur. Commun. Netw. 2021, 2021, 3960559. [Google Scholar] [CrossRef]
- American College Health Association. American College Health Association-National College Health Assessment III: Undergraduate Student Reference Group Data Report Spring 2023; American College Health Association: Silver Spring, MD, USA, 2023; pp. 1–117. [Google Scholar]
- Parker, P.C.; Perry, R.P.; Hamm, J.M.; Chipperfield, J.G.; Pekrun, R.; Dryden, R.P.; Daniels, L.M.; Tze, V.M. A motivation perspective on achievement appraisals, emotions, and performance in an online learning environment. Int. J. Educ. Res. 2021, 108, 101772. [Google Scholar] [CrossRef]
- Naser, A.Y.; Dahmash, E.Z.; Al-Rousan, R.; Alwafi, H.; Alrawashdeh, H.M.; Ghoul, I.; Abidine, A.; Bokhary, M.A.; AL-Hadithi, H.T.; Ali, D. Mental health status of the general population, healthcare professionals, and university students during 2019 coronavirus disease outbreak in Jordan: A cross-sectional study. Brain Behav. 2020, 10, e01730. [Google Scholar] [CrossRef]
- Viskovich, S.; Pakenham, K.I. Randomized controlled trial of a web-based Acceptance and Commitment Therapy (ACT) program to promote mental health in university students. J. Clin. Psychol. 2020, 76, 929–951. [Google Scholar] [CrossRef]
- Cuijpers, P.; Auerbach, R.P.; Benjet, C.; Bruffaerts, R.; Ebert, D.; Karyotaki, E.; Kessler, R.C. The world health organization world mental health international college student initiative: An overview. Int. J. Methods Psychiatr. Res. 2019, 28, e1761. [Google Scholar] [CrossRef]
- Lipson, S.K.; Eisenberg, D. Mental health and academic attitudes and expectations in university populations: Results from the healthy minds study. J. Ment. Health 2018, 27, 205–213. [Google Scholar] [CrossRef]
- Patel, V.; Flisher, A.J.; Hetrick, S.; McGorry, P. Mental health of young people: A global public-health challenge. Lancet 2007, 369, 1302–1313. [Google Scholar] [CrossRef] [PubMed]
- Sau, A.; Bhakta, I. Screening of anxiety and depression among seafarers using machine learning technology. Inform. Med. Unlocked 2019, 16, 100228. [Google Scholar] [CrossRef]
- Ahuja, R.; Banga, A. Mental stress detection in university students using machine learning algorithms. Procedia Comput. Sci. 2019, 152, 349–353. [Google Scholar] [CrossRef]
- Du, C.; Liu, C.; Balamurugan, P.; Selvaraj, P. Deep learning-based mental health monitoring scheme for college students using convolutional neural network. Int. J. Artif. Intell. Tools 2021, 30, 2140014. [Google Scholar] [CrossRef]
- Pandey, D.C.; Kushwaha, G.S.; Kumar, S. Mamdani fuzzy rule-based models for psychological research. SN Appl. Sci. 2020, 2, 913. [Google Scholar] [CrossRef]
- Ogunseye, E.O.; Adenusi, C.A.; Nwanakwaugwu, A.C.; Ajagbe, S.A.; Akinola, S.O. Predictive analysis of mental health conditions using AdaBoost algorithm. ParadigmPlus 2022, 3, 11–26. [Google Scholar] [CrossRef]
- Chung, J.; Teo, J.; Minutolo, A. Mental Health Prediction Using Machine Learning: Taxonomy, Applications, and Challenges. Appl. Comput. Intell. Soft Comput. 2022, 2022, 9970363. [Google Scholar] [CrossRef]
- Ku, W.L.; Min, H. Evaluating Machine Learning Stability in Predicting Depression and Anxiety Amidst Subjective Response Errors. Healthcare 2024, 12, 625. [Google Scholar] [CrossRef] [PubMed]
- Katarya, R.; Maan, S. Predicting mental health disorders using machine learning for employees in technical and non-technical companies. In Proceedings of the 2020 IEEE International Conference on Advances and Developments in Electrical and Electronics Engineering (ICADEE), Coimbatore, India, 10–11 December 2020; pp. 1–5. [Google Scholar]
- Tuan, T.M.; Lan, L.T.H.; Chou, S.-Y.; Ngan, T.T.; Son, L.H.; Giang, N.L.; Ali, M. M-CFIS-R: Mamdani complex fuzzy inference system with rule reduction using complex fuzzy measures in granular computing. Mathematics 2020, 8, 707. [Google Scholar] [CrossRef]
- Singh, S.; Hooda, S. A Study of Challenges and Limitations to Applying Machine Learning to Highly Unstructured Data. In Proceedings of the 2023 7th International Conference on Computing, Communication, Control and Automation (ICCUBEA), Pune, India, 18–19 August 2023; pp. 1–6. [Google Scholar]
- Krichen, M. Convolutional neural networks: A survey. Computers 2023, 12, 151. [Google Scholar] [CrossRef]
- Hornyák, O.; Iantovics, L.B. AdaBoost Algorithm Could Lead to Weak Results for Data with Certain Characteristics. Mathematics 2023, 11, 1801. [Google Scholar] [CrossRef]
- Ahmadi, R.; Ekbatanifard, G.; Bayat, P. A modified grey wolf optimizer based data clustering algorithm. Appl. Artif. Intell. 2021, 35, 63–79. [Google Scholar] [CrossRef]
- Nasiri, J.; Khiyabani, F.M. A whale optimization algorithm (WOA) approach for clustering. Cogent Math. Stat. 2018, 5, 1483565. [Google Scholar] [CrossRef]
- Faieghi, M.R.; Delavari, H.; Baleanu, D. A novel adaptive controller for two-degree of freedom polar robot with unknown perturbations. Commun. Nonlinear Sci. Numer. Simul. 2012, 17, 1021–1030. [Google Scholar] [CrossRef]
- Farnad, B.; Jafarian, A.; Baleanu, D. A new hybrid algorithm for continuous optimization problem. Appl. Math. Model. 2018, 55, 652–673. [Google Scholar] [CrossRef]
- Sharma, S.; Buddhiraju, K.M. Spatial-spectral ant colony optimization for hyperspectral image classification. Int. J. Remote Sens. 2018, 39, 2702–2717. [Google Scholar] [CrossRef]
- Dongwen, C. Application of Whale Optimization Algorithm in Reservoir Optimal Dispatching. Adv. Sci. Technol. Water Resour. 2017, 37, 72–76. [Google Scholar]
- Omran, M.G.; Salman, A.; Engelbrecht, A.P. Dynamic clustering using particle swarm optimization with application in image segmentation. Pattern Anal. Appl. 2006, 8, 332–344. [Google Scholar] [CrossRef]
- Kumar, V.; Chhabra, J.K.; Kumar, D. Grey wolf algorithm-based clustering technique. J. Intell. Syst. 2017, 26, 153–168. [Google Scholar] [CrossRef]
- Zervoudakis, K.; Tsafarakis, S. A mayfly optimization algorithm. Comput. Ind. Eng. 2020, 145, 106559. [Google Scholar] [CrossRef]
- Zervoudakis, K.; Tsafarakis, S. A global optimizer inspired from the survival strategies of flying foxes. Eng. Comput. 2023, 39, 1583–1616. [Google Scholar] [CrossRef]
- Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995. [Google Scholar]
- Yaqoob, A.; Verma, N.K.; Aziz, R.M. Optimizing gene selection and cancer classification with hybrid sine cosine and cuckoo search algorithm. J. Med. Syst. 2024, 48, 10. [Google Scholar] [CrossRef] [PubMed]
- Zervoudakis, K.; Tsafarakis, S.; Paraskevi-Panagiota, S. A new hybrid firefly-genetic algorithm for the optimal product line design problem. In Learning and Intelligent Optimization: Proceedings of the 13th International Conference, LION 13, Chania, Crete, Greece, 27–31 May 2019; Revised Selected Papers 13; Springer: Berlin/Heidelberg, Germany, 2020. [Google Scholar]
- Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
- Hatamlou, A.; Abdullah, S.; Othman, Z. Gravitational search algorithm with heuristic search for clustering problems. In Proceedings of the 2011 3rd Conference on Data Mining and Optimization (DMO), Putrajaya, Malaysia, 28–29 June 2011. [Google Scholar]
- Liu, J.; Shi, G.; Zhou, J.; Yao, Q. Prediction of college students-psychological crisis based on data mining. Mob. Inf. Syst. 2021, 2021, 9979770. [Google Scholar] [CrossRef]
- Reyna-Castillo, M.; Pulgarín-Rodríguez, M.A.; Ríos-Serna, A.H.; Santiago, A. PLS-SEM validation for burnout measures in latino college students: A socially sustainable educational return. Sustainability 2022, 14, 14635. [Google Scholar] [CrossRef]
- Moore, S.A.; Dowdy, E.; Nylund-Gibson, K.; Furlong, M.J. An Empirical Approach to Complete Mental Health Classification in Adolescents. Sch. Ment Health 2019, 11, 438–453. [Google Scholar] [CrossRef] [PubMed]
- Ademiluyi, O.; Ukaoha, K.; Otokiti, K. A neuro fuzzy-based guidance and counselling system for students. Afr. J. MIS 2020, 2, 39–54. [Google Scholar]
- Feng, X.; Wei, Y.; Pan, X.; Qiu, L.; Ma, Y. Academic emotion classification and recognition method for large-scale online learning environment-Based on A-CNN and LSTM-ATT deep learning pipeline method. Int. J. Environ. Res. Public Health 2020, 17, 1941. [Google Scholar] [CrossRef]
- Karunambigai, M.; Akram, M.; Sivasankar, S.; Palanivel, K. Clustering algorithm for intuitionistic fuzzy graphs. Int. J. Uncertain. Fuzziness Knowl.-Based Syst. 2017, 25, 367–383. [Google Scholar] [CrossRef]
- Saeb, S.; Zhang, M.; Karr, C.J.; Schueller, S.M.; Corden, M.E.; Kording, K.P.; Mohr, D.C. Mobile phone sensor correlates of depressive symptom severity in daily-life behavior: An exploratory study. J. Med. Internet Res. 2015, 17, e4273. [Google Scholar] [CrossRef] [PubMed]
- Iyortsuun, N.K.; Kim, S.-H.; Jhon, M.; Yang, H.-J.; Pant, S. A review of machine learning and deep learning approaches on mental health diagnosis. Healthcare 2023, 11, 285. [Google Scholar] [CrossRef]
- Hatcher, W.G.; Yu, W. A survey of deep learning: Platforms, applications and emerging research trends. IEEE Access 2018, 6, 24411–24432. [Google Scholar] [CrossRef]
- Talpur, N.; Abdulkadir, S.J.; Alhussian, H.; Hasan, M.H.; Aziz, N.; Bamhdi, A. Deep Neuro-Fuzzy System application trends, challenges, and future perspectives: A systematic survey. Artif. Intell. Rev. 2023, 56, 865–913. [Google Scholar] [CrossRef]
- Thiele, F.; Windebank, A.J.; Siddiqui, A.M. Motivation for using data-driven algorithms in research: A review of machine learning solutions for image analysis of micrographs in neuroscience. J. Neuropathol. Exp. Neurol. 2023, 82, 595–610. [Google Scholar] [CrossRef] [PubMed]
- Spurk, D.; Hirschi, A.; Wang, M.; Valero, D.; Kauffeld, S. Latent profile analysis: A review and “how to” guide of its application within vocational behavior research. J. Vocat. Behav. 2020, 120, 103445. [Google Scholar] [CrossRef]
- Peng, Y.; Zhu, X.; Nie, F.; Kong, W.; Ge, Y. Fuzzy graph clustering. Inf. Sci. 2021, 571, 38–49. [Google Scholar] [CrossRef]
- Han, H. Fuzzy clustering algorithm for university students’ psychological fitness and performance detection. Heliyon 2023, 9, e18550. [Google Scholar] [CrossRef]
- Fogaca, J.L. Combining mental health and performance interventions: Coping and social support for student-athletes. J. Appl. Sport Psychol. 2021, 33, 4–19. [Google Scholar] [CrossRef]
- Celebi, M.E. Partitional Clustering Algorithms; Springer: Berlin/Heidelberg, Germany, 2014. [Google Scholar]
- Murtagh, F.; Contreras, P. Algorithms for hierarchical clustering: An overview. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2012, 2, 86–97. [Google Scholar] [CrossRef]
- Nanda, S.J.; Panda, G. A survey on nature inspired metaheuristic algorithms for partitional clustering. Swarm Evol. Comput. 2014, 16, 1–18. [Google Scholar] [CrossRef]
- Reynolds, A.P.; Richards, G.; de la Iglesia, B.; Rayward-Smith, V.J. Clustering rules: A comparison of partitioning and hierarchical clustering algorithms. J. Math. Model. Algorithms 2006, 5, 475–504. [Google Scholar] [CrossRef]
- Hartigan, J.A.; Wong, M.A. Algorithm AS 136: A k-means clustering algorithm. J. R. Stat. Society. Ser. C (Appl. Stat.) 1979, 28, 100–108. [Google Scholar] [CrossRef]
- Arthur, D. K-means++: The advantages of careful seeding. In Proceedings of the Eighteenth Annual ACM-SIAM Symposium on Discrete Algorithms, New Orleans, LA, USA, 7–9 January 2007. [Google Scholar]
- Jain, A.K.; Murty, M.N.; Flynn, P.J. Data clustering: A review. ACM Comput. Surv. (CSUR) 1999, 31, 264–323. [Google Scholar] [CrossRef]
- Davies, D.L.; Bouldin, D.W. A cluster separation measure. IEEE Trans. Pattern Anal. Mach. Intell. 1979, PAMI-1, 224–227. [Google Scholar] [CrossRef]
- Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
- Shaikh, M.S.; Hua, C.; Jatoi, M.A.; Ansari, M.M.; Qader, A.A. Application of grey wolf optimisation algorithm in parameter calculation of overhead transmission line system. IET Sci. Meas. Technol. 2021, 15, 218–231. [Google Scholar] [CrossRef]
- Zervoudakis, K.; Mastrothanasis, K.; Tsafarakis, S. Forming automatic groups of learners using particle swarm optimization for applications of differentiated instruction. Comput. Appl. Eng. Educ. 2020, 28, 282–292. [Google Scholar] [CrossRef]
- Sang, H.; Hong, D. Analysis and Research of Psychological Education Based on Data Mining Technology. Secur. Commun. Netw. 2021, 2021, 8979507. [Google Scholar] [CrossRef]
- Alzahrani, N.A.; Abdullah, M.A. Student Engagement Effectiveness In E-Learning System. Biosci. Biotechnol. Res. Commun. 2019, 12, 208–218. [Google Scholar] [CrossRef]
- Jawad, K.; Mahto, R.; Das, A.; Ahmed, S.U.; Aziz, R.M.; Kumar, P. Novel Cuckoo Search-Based Metaheuristic Approach for Deep Learning Prediction of Depression. Appl. Sci. 2023, 13, 5322. [Google Scholar] [CrossRef]
- Joshi, A.A.; Aziz, R.M. A two-phase cuckoo search based approach for gene selection and deep learning classification of cancer disease using gene expression data with a novel fitness function. Multimed. Tools Appl. 2024. [Google Scholar] [CrossRef]
- Amin, N.; Salehin, I.; Baten, M.A.; Noman, R.A. RHMCD-20 dataset: Identify rapid human mental health depression during quarantine life using machine learning. Data Brief 2024, 54, 110376. [Google Scholar] [CrossRef] [PubMed]
Phase | Generation of the Initial Phase | Control Parameters Calculation | Update the Search Agent Position | Evaluate the Fitness Value of Each Search Agent | iGWCA |
---|---|---|---|---|---|
Time Complexity | O(N × D) | O(N × D) | O(N × D) | O(N × D) | O(N × D × Max_Iter) |
Types | Parameter |
---|---|
iGWCA | Search agents = 30, m = 0.15, k = 2 |
GWO | Search agents = 30, r1, r2 = [0, 1] |
MOA | Particle number = 40, α1 = 1, α2 = 1.5, β = 2, d = 0.1, fl = 0.1 |
FFO | population size = 20, a1, pa = 0.92 |
PSO | Particle number = 30; c1 = 1.5, c2 = 2; ω = 0.9 |
HSCCS | Alpha = 0.7, A and a = 2 |
FAGA | Population size = 30, mutation rate = 0.05 |
GSA | Alpha = 20, gravitational constants = 100 |
Statistical Analysis | iGWCA [Proposed] | GWO [Studied] | MOA [Studied] | FFO [Studied] | PSO [Studied] | HSCCS [Studied] | FAGA [Studied] | GSA [Studied] |
---|---|---|---|---|---|---|---|---|
Best Value | 0.52461 | 0.53248 | 0.57456 | 0.57301 | 0.53781 | 0.54452 | 0.57150 | 0.57452 |
Worst Value | 0.58452 | 0.58231 | 0.65322 | 0.64799 | 0.60992 | 0.58443 | 0.67141 | 0.66444 |
Avg | 0.5453 | 0.5497 | 0.6017 | 0.5989 | 0.5627 | 0.5844 | 0.6060 | 0.6055 |
SD | 0.0290 | 0.0241 | 0.0380 | 0.0363 | 0.0349 | 0.0193 | 0.0483 | 0.0435 |
Number of Hits | 31 | 32 | 28 | 32 | 30 | 33 | 30 | 33 |
Computational Time (s) | 8.162354 | 12.454219 | 18.791092 | 26.140449 | 29.559220 | 15.777735 | 14.877140 | 11.791202 |
% Decrease | — | 1.48% | 8.69% | 8.45% | 2.45% | 3.65% | 8.20% | 8.68% |
S.No | Method | Performance (%) |
---|---|---|
1 | Machine Learning Algorithms | 0.8571 [15] |
2 | Data Mining Technology | 0.90 [67] |
3 | AdaBoost Algorithm | 0.8175 [18] |
4 | Aspect-Oriented Convolutional Neural Network (A-CNN) | 0.89 [45] |
5 | Machine Learning Technology (CatBoost) | 0.826 [14] |
6 | Gradient Boosting Machine Algorithm (Dataset-I, Class H) | 0.78 [68] |
7 | iGWCA | 0.96 |
Types | iGWCA [Proposed] | GWO [Studied] | MOA [Studied] | FFO [Studied] | PSO [Studied] | HSCCS [Studied] | FAGA [Studied] | GSA [Studied] |
---|---|---|---|---|---|---|---|---|
TPR | 0.9666 | 0.9333 | 0.8 | 0.8333 | 0.9 | 0.866 | 0.8 | 0.7666 |
TNR | 0.35 | 0.5 | 0.65 | 0.6 | 0.4 | 0.13 | 0.2333 | 0.2666 |
FPR | 0.75 | 0.6 | 0.2 | 0.5 | 0.6 | 0.2 | 0.06 | 0.8 |
FNR | 0.20687 | 0.214286 | 0.24 | 0.215156 | 0.2413 | 0.2650 | 0.213 | 0.2234 |
Algorithms | Test Case | Cluster | Size | Avg | Median | Rank | SD |
---|---|---|---|---|---|---|---|
iGWCA [Proposed] | Test Case 1 | 1 | 824 | 5.89 | 5.0 | 1 | 3.90 |
GWO [Studied] | Test Case 1 | 1 | 709 | 7.70 | 8.0 | 2 | 4.36 |
MOA [Studied] | Test Case 1 | 1 | 730 | 7.91 | 8.0 | 3 | 4.47 |
FFO [Studied] | Test Case 1 | 1 | 748 | 8.11 | 9.0 | 7 | 4.59 |
PSO [Studied] | Test Case 1 | 1 | 746 | 8.08 | 9.0 | 6 | 4.57 |
HSCCS [Studied] | Test Case 1 | 1 | 711 | 7.72 | 8.0 | 4 | 4.37 |
FAGA [Studied] | Test Case 1 | 1 | 730 | 7.91 | 8.0 | 5 | 4.47 |
GSA [Studied] | Test Case 1 | 1 | 711 | 7.72 | 8.0 | 4 | 4.37 |
iGWCA [Proposed] | Test Case 2 | 1 | 824 | 1.89 | 1.0 | 1 | 1.26 |
GWO [Studied] | Test Case 2 | 1 | 709 | 1.86 | 2.0 | 2 | 1.03 |
MOA [Studied] | Test Case 2 | 1 | 730 | 1.94 | 8.0 | 3 | 4.47 |
FFO [Studied] | Test Case 2 | 1 | 748 | 1.97 | 2.0 | 4 | 1.15 |
PSO [Studied] | Test Case 2 | 1 | 746 | 1.98 | 2.0 | 5 | 1.15 |
HSCCS [Studied] | Test Case 2 | 1 | 711 | 1.86 | 2.0 | 2 | 1.03 |
FAGA [Studied] | Test Case 2 | 1 | 730 | 1.94 | 2.0 | 3 | 1.11 |
GSA [Studied] | Test Case 2 | 1 | 711 | 1.87 | 2.0 | 3 | 1.04 |
iGWCA [Proposed] | Test Case 3 | 1 | 824 | 1.84 | 1.0 | 1 | 1.14 |
GWO [Studied] | Test Case 3 | 1 | 709 | 1.86 | 2.0 | 2 | 1.04 |
MOA [Studied] | Test Case 3 | 1 | 730 | 1.93 | 2.0 | 3 | 1.11 |
FFO [Studied] | Test Case 3 | 1 | 748 | 1.96 | 2.0 | 5 | 1.15 |
PSO [Studied] | Test Case 3 | 1 | 746 | 1.97 | 2.0 | 6 | 1.14 |
HSCCS [Studied] | Test Case 3 | 1 | 711 | 1.86 | 2.0 | 2 | 1.04 |
FAGA [Studied] | Test Case 3 | 1 | 730 | 1.93 | 2.0 | 4 | 1.11 |
GSA [Studied] | Test Case 3 | 1 | 711 | 1.87 | 2.0 | 3 | 1.05 |
iGWCA [Proposed] | Test Case 1 | 2 | 276 | 20.60 | 21.0 | 1 | 5.04 |
GWO [Studied] | Test Case 1 | 2 | 391 | 21.36 | 21.0 | 2 | 3.62 |
MOA [Studied] | Test Case 1 | 2 | 370 | 21.72 | 22.0 | 3 | 3.38 |
FFO [Studied] | Test Case 1 | 2 | 352 | 22.01 | 22.0 | 5 | 3.21 |
PSO [Studied] | Test Case 1 | 2 | 354 | 21.98 | 22.0 | 4 | 3.22 |
HSCCS [Studied] | Test Case 1 | 2 | 389 | 21.39 | 21.0 | 2 | 3.60 |
FAGA [Studied] | Test Case 1 | 2 | 370 | 21.72 | 22.0 | 3 | 3.38 |
GSA [Studied] | Test Case 1 | 2 | 389 | 21.39 | 21.0 | 2 | 3.60 |
iGWCA [Proposed] | Test Case 2 | 2 | 276 | 3.39 | 4.0 | 1 | 1.70 |
GWO [Studied] | Test Case 2 | 2 | 391 | 4.08 | 4.0 | 4 | 1.22 |
MOA [Studied] | Test Case 2 | 2 | 370 | 4.06 | 4.0 | 2 | 1.24 |
FFO [Studied] | Test Case 2 | 2 | 352 | 4.09 | 4.0 | 5 | 1.20 |
PSO [Studied] | Test Case 2 | 2 | 354 | 4.07 | 4.0 | 3 | 1.23 |
HSCCS [Studied] | Test Case 2 | 2 | 389 | 4.08 | 4 | 4 | 1.22 |
FAGA [Studied] | Test Case 2 | 2 | 370 | 4.06 | 4.0 | 2 | 1.24 |
GSA [Studied] | Test Case 2 | 2 | 389 | 4.08 | 4 | 4 | 1.22 |
iGWCA [Proposed] | Test Case 3 | 2 | 276 | 3.65 | 4.0 | 1 | 1.28 |
GWO [Studied] | Test Case 3 | 2 | 391 | 3.68 | 4.0 | 3 | 1.21 |
MOA [Studied] | Test Case 3 | 2 | 370 | 3.65 | 4.0 | 2 | 1.22 |
FFO [Studied] | Test Case 3 | 2 | 352 | 3.67 | 4.0 | 4 | 1.19 |
PSO [Studied] | Test Case 3 | 2 | 354 | 3.65 | 4.0 | 2 | 1.22 |
HSCCS [Studied] | Test Case 3 | 2 | 389 | 3.69 | 4.0 | 5 | 1.21 |
FAGA [Studied] | Test Case 3 | 2 | 370 | 3.65 | 4.0 | 2 | 1.22 |
GSA [Studied] | Test Case 3 | 2 | 389 | 3.68 | 4.0 | 4 | 1.22 |
Algorithms | Test Case | Cluster | Size | Avg | Median | Rank | SD |
---|---|---|---|---|---|---|---|
iGWCA [Proposed] | Test Case 4 | 1 | 229 | 11.0 | 15 | 1 | 4.62 |
GWO [Studied] | Test Case 4 | 1 | 572 | 16.9 | 20 | 4 | 3.67 |
MOA [Studied] | Test Case 4 | 1 | 195 | 16.1 | 15 | 3 | 4.25 |
FFO [Studied] | Test Case 4 | 1 | 129 | 18.6 | 20 | 6 | 2.25 |
PSO [Studied] | Test Case 4 | 1 | 692 | 15.9 | 15 | 2 | 3.99 |
HSCCS [Studied] | Test Case 4 | 1 | 131 | 18.6 | 20 | 6 | 2.24 |
FAGA [Studied] | Test Case 4 | 1 | 135 | 18.4 | 20 | 5 | 2.65 |
GSA [Studied] | Test Case 4 | 1 | 704 | 15.9 | 15 | 2 | 3.99 |
iGWCA [Proposed] | Test Case 5 | 1 | 229 | 16.0 | 15 | 3 | 2.06 |
GWO [Studied] | Test Case 5 | 1 | 572 | 13.4 | 15 | 1 | 2.41 |
MOA [Studied] | Test Case 5 | 1 | 195 | 17.9 | 20 | 7 | 2.91 |
FFO [Studied] | Test Case 5 | 1 | 129 | 17.8 | 20 | 6 | 2.79 |
PSO [Studied] | Test Case 5 | 1 | 692 | 14.3 | 15 | 3 | 3.98 |
HSCCS [Studied] | Test Case 5 | 1 | 131 | 17.7 | 20 | 5 | 4.02 |
FAGA [Studied] | Test Case 5 | 1 | 135 | 17.6 | 20 | 4 | 2.86 |
GSA [Studied] | Test Case 5 | 1 | 704 | 14.3 | 15 | 2 | 3.97 |
iGWCA [Proposed] | Test Case 4 | 2 | 595 | 11.5 | 15 | 1 | 5.44 |
GWO [Studied] | Test Case 4 | 2 | 252 | 11.6 | 10 | 2 | 2.35 |
MOA [Studied] | Test Case 4 | 2 | 629 | 15.0 | 15 | 6 | 4.03 |
FFO [Studied] | Test Case 4 | 2 | 695 | 14.7 | 15 | 4 | 4.07 |
PSO [Studied] | Test Case 4 | 2 | 132 | 11.8 | 10 | 5 | 2.78 |
HSCCS [Studied] | Test Case 4 | 2 | 693 | 14.6 | 15 | 3 | 4.07 |
FAGA [Studied] | Test Case 4 | 2 | 689 | 14.7 | 15 | 4 | 4.07 |
GSA [Studied] | Test Case 4 | 2 | 120 | 11.5 | 10 | 1 | 2.28 |
iGWCA [Proposed] | Test Case 5 | 2 | 595 | 13.7 | 15 | 1 | 2.56 |
GWO [Studied] | Test Case 5 | 2 | 252 | 18.2 | 20 | 5 | 2.41 |
MOA [Studied] | Test Case 5 | 2 | 629 | 14.0 | 15 | 2 | 3.88 |
FFO [Studied] | Test Case 5 | 2 | 695 | 14.4 | 15 | 3 | 4.01 |
PSO [Studied] | Test Case 5 | 2 | 132 | 17.8 | 20 | 4 | 2.91 |
HSCCS [Studied] | Test Case 5 | 2 | 693 | 14.4 | 15 | 3 | 4.02 |
FAGA [Studied] | Test Case 5 | 2 | 689 | 14.4 | 15 | 3 | 4.02 |
GSA [Studied] | Test Case 5 | 2 | 120 | 18.2 | 20 | 5 | 2.67 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Shaikh, M.S.; Dong, X.; Zheng, G.; Wang, C.; Lin, Y. An Improved Expeditious Meta-Heuristic Clustering Method for Classifying Student Psychological Issues with Homogeneous Characteristics. Mathematics 2024, 12, 1620. https://doi.org/10.3390/math12111620