Combining Constructed Artificial Neural Networks with Parameter Constraint Techniques to Achieve Better Generalization Properties
Abstract
1. Introduction
1. A novel hybrid framework that effectively combines grammatical evolution for neural architecture search with constrained genetic algorithms for parameter optimization, addressing both structural design and weight training simultaneously;
2. An innovative penalty mechanism within the genetic algorithm’s fitness function that dynamically monitors and controls neuron activation patterns to prevent overfitting, demonstrated to reduce test error by an average of 15.27% compared to standard approaches;
3. Comprehensive experimental validation across 53 diverse datasets showing statistically significant improvements over traditional optimization methods, with particular effectiveness in medical and financial domains where overfitting risks are critical;
4. Detailed analysis of the method’s computational characteristics and scalability, providing practical guidelines for implementation in real-world scenarios with resource constraints.
1. Periodic application of an optimization technique to randomly selected chromosomes, with the aim of improving the performance of the selected neural network and also of locating the global minimum of the error function faster;
2. The training of the artificial neural network by the optimization method is performed in such a way as not to destroy the architecture that grammatical evolution has already constructed;
3. The training of the artificial neural network by the optimization method is carried out using a modified fitness function, in which the network parameters are adapted without losing the network’s generalization properties.
2. Method Description
2.1. The Neural Construction Method
- The set N represents the non-terminal symbols of the grammar.
- The set T contains the terminal symbols of the grammar.
- The start symbol of the grammar is denoted as S.
- The production rules of the grammar are enclosed in the set P.
- Read the next element V from the chromosome that is being processed;
- Select the next production rule following the equation: Rule = $V \bmod N_R$, where $N_R$ represents the total number of production rules for the non-terminal symbol under processing, as illustrated in the sketch that follows.
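As a quick illustration of this decoding scheme, the following Python sketch maps a chromosome of integers onto a program using Rule = $V \bmod N_R$. The toy grammar, the chromosome values, and the helper name `ge_map` are hypothetical choices made for this sketch, not the grammar used by the method:

```python
# Minimal sketch of the grammatical-evolution mapping. The toy grammar
# below is a hypothetical illustration, not the paper's neural grammar.
grammar = {
    "<expr>": ["(<expr><op><expr>)", "<var>"],
    "<op>": ["+", "-", "*"],
    "<var>": ["x1", "x2"],
}

def ge_map(chromosome, symbol="<expr>"):
    """Expand `symbol` by consuming chromosome elements left to right."""
    pos = 0

    def expand(sym):
        nonlocal pos
        rules = grammar.get(sym)
        if rules is None:                      # terminal symbol: emit verbatim
            return sym
        if pos > 10 * len(chromosome):         # crude guard against non-termination
            raise ValueError("mapping did not terminate")
        v = chromosome[pos % len(chromosome)]  # wrap around when exhausted
        pos += 1
        chosen = rules[v % len(rules)]         # Rule = V mod N_R
        out, i = "", 0
        while i < len(chosen):                 # expand the chosen rule symbol by symbol
            if chosen[i] == "<":
                j = chosen.index(">", i) + 1
                out += expand(chosen[i:j])
                i = j
            else:
                out += chosen[i]
                i += 1
        return out

    return expand(symbol)

print(ge_map([9, 8, 6, 4, 16, 10, 17, 23, 8, 14]))  # -> "x1"
```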
2.2. The Used Genetic Algorithm
Algorithm 1. The algorithm used to calculate the bounding quantity for a given neural network.
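A minimal sketch of one plausible form of this computation follows, assuming the bounding quantity measures how often the weighted input of each sigmoidal processing node leaves the interval $[-a, a]$ over the training patterns, with $a$ the bounding factor of Table 1. The weight layout and the helper name are illustrative assumptions, not the paper’s exact routine:

```python
import numpy as np

def bounding_quantity(weights, X, a=10.0):
    """Hedged sketch: fraction of (pattern, node) pairs whose weighted
    input to the sigmoid leaves [-a, a]. `weights` is assumed to hold
    one row per hidden node, each row storing the input weights followed
    by the node's bias; this packing is an assumption for illustration."""
    violations = 0
    for x in X:
        for row in weights:
            v = np.dot(row[:-1], x) + row[-1]  # weighted input of the node
            if abs(v) > a:                     # activation outside the bound
                violations += 1
    return violations / (len(X) * len(weights))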
Algorithm 2. The modified genetic algorithm.
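A minimal sketch of the corresponding penalized fitness evaluation follows, assuming the training error is inflated by a penalty that grows with the bounding quantity, e.g. $f = E_{\mathrm{train}} \times \left(1 + \lambda B^2\right)$ with $\lambda$ the penalty value of Table 1. This functional form, and the helper `predict`, are assumptions consistent with the stated goal of adapting the parameters without losing generalization, not the paper’s exact formula:

```python
def penalized_fitness(weights, X, y, predict, a=10.0, lam=100.0):
    """Hedged sketch of the fitness used inside the modified genetic
    algorithm. `predict(weights, x)` is a hypothetical helper returning
    the network output for pattern x; bounding_quantity() is the sketch
    given after Algorithm 1."""
    train_error = sum((predict(weights, x) - t) ** 2 for x, t in zip(X, y))
    B = bounding_quantity(weights, X, a=a)   # fraction of bound violations
    return train_error * (1.0 + lam * B * B) # assumed penalty form
```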
2.3. The Overall Algorithm
1. Initialization.
   (a) Set $N_c$ as the number of chromosomes for the grammatical evolution procedure and $N_g$ as the maximum number of allowed generations.
   (b) Set $p_s$ as the selection rate and $p_m$ as the mutation rate.
   (c) Let $N_t$ be the number of chromosomes to which the modified genetic algorithm will be periodically applied.
   (d) Let $N_i$ be the number of generations that will pass before the modified genetic algorithm is applied to randomly selected chromosomes.
   (e) Set the weight factor $F$, with $F > 1$.
   (f) Set the values used in the modified genetic algorithm.
   (g) Initialize the chromosomes as sets of randomly selected integers.
   (h) Set the generation number $k = 0$.
2. Fitness Calculation.
   (a) For $i = 1, \ldots, N_c$ do
       i. Obtain the chromosome $g_i$.
       ii. Create the corresponding neural network using grammatical evolution.
       iii. Set the fitness value $f_i$ to the training error of this network.
   (b) End For
3. Genetic Operations.
   (a) Select the best $(1 - p_s) \times N_c$ chromosomes, which will be copied intact to the next generation.
   (b) Create $p_s \times N_c$ chromosomes using one-point crossover. For every couple of produced offspring, two distinct chromosomes are selected from the current population using tournament selection. An example of the one-point crossover procedure is shown graphically in Figure 5.
   (c) For every chromosome and for each of its elements, select a random number $r \in [0, 1]$. Alter the current element when $r \le p_m$.
4. Local Search.
   (a) If $k \bmod N_i = 0$ then
       i. Select a group $S$ of $N_t$ randomly chosen chromosomes from the genetic population.
       ii. For every member $g \in S$ do
           A. Obtain the corresponding neural network for the chromosome $g$.
           B. Compute bounds for the parameters of this network, widening them with the scale factor $F$ for the margins.
           C. Set the fitness value of $g$ using the steps of Algorithm 2.
       iii. End For
   (b) Endif
5. Termination Check.
   (a) Set $k = k + 1$.
   (b) If $k \le N_g$, go to the Fitness Calculation step.
6. Application to the Test Set.
   (a) Obtain the chromosome $g^*$ with the lowest fitness value and create, through grammatical evolution, the corresponding neural network $N^*$.
   (b) Apply the neural network $N^*$ to the test set and report the corresponding error value. A compact code sketch of this overall procedure is given below.
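To make the control flow concrete, here is a compact Python sketch of the outer loop under stated assumptions: the helpers `fitness_of` (grammatical-evolution decoding plus training error) and `local_search` (Algorithm 2 applied to one chromosome) are hypothetical stand-ins, and the chromosome length, gene range, and tournament size are illustrative choices only:

```python
import random

def evolve(fitness_of, local_search, pop_size=500, max_gens=500,
           p_s=0.1, p_m=0.05, n_local=20, local_every=20, chrom_len=40):
    """Hedged sketch of the outer loop of Section 2.3 with the default
    values of Table 1. `fitness_of` and `local_search` are hypothetical
    stand-ins for Sections 2.1-2.2, not the paper's implementation."""
    pop = [[random.randint(0, 255) for _ in range(chrom_len)]
           for _ in range(pop_size)]

    def tournament(population):
        # pick the better of two random candidates (size-2 tournament,
        # an illustrative choice; the paper does not fix the size here)
        a, b = random.sample(population, 2)
        return a if fitness_of(a) < fitness_of(b) else b

    for gen in range(max_gens):
        pop.sort(key=fitness_of)                      # fitness calculation
        elite = pop[: int((1 - p_s) * pop_size)]      # copied intact
        children = []
        while len(elite) + len(children) < pop_size:  # one-point crossover
            a, b = tournament(pop), tournament(pop)
            cut = random.randrange(1, chrom_len)
            children += [a[:cut] + b[cut:], b[:cut] + a[cut:]]
        pop = elite + children[: pop_size - len(elite)]
        for g in pop:                                 # mutation
            for j in range(chrom_len):
                if random.random() <= p_m:
                    g[j] = random.randint(0, 255)
        if (gen + 1) % local_every == 0:              # periodic local search
            for g in random.sample(pop, n_local):
                local_search(g)                       # Algorithm 2 on g
    return min(pop, key=fitness_of)                   # best network found
```

With the Table 1 values, the local search touches only 20 chromosomes every 20 generations, so its cost remains a small fraction of the total fitness evaluations.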
3. Experimental Results
1. The UCI database, https://archive.ics.uci.edu/ (accessed on 22 January 2025) [65];
2. The KEEL website, https://sci2s.ugr.es/keel/datasets.php (accessed on 22 January 2025) [66];
3. The StatLib repository, https://lib.stat.cmu.edu/datasets/index (accessed on 22 January 2025).
3.1. Experimental Datasets
1. Appendicitis, which is a medical dataset [67];
2. Alcohol, which is a dataset regarding alcohol consumption [68];
3. Australian, which is a dataset produced from various bank transactions [69];
4. Balance dataset [70], produced from various psychological experiments;
5. Cleveland, which is a medical dataset related to heart diseases [71,72];
6. Circular dataset, which is an artificial dataset;
7. Dermatology, a medical dataset for dermatology problems [73];
8. Ecoli, which is related to protein problems [74];
9. Glass dataset, which contains measurements from glass component analysis;
10. Haberman, a medical dataset related to breast cancer;
11. Hayes-Roth dataset [75];
12. Heart, which is a dataset related to heart diseases [76];
13. HeartAttack, which is a medical dataset for the detection of heart diseases;
14. Housevotes, a dataset that is related to the Congressional voting in the USA [77];
15. Ionosphere, a dataset containing measurements from the ionosphere [78,79];
16. Liverdisorder, a medical dataset regarding liver disorders [80,81];
17. Lymography, a medical dataset [82];
18. Mammographic, which is a medical dataset used for the prediction of breast cancer [83];
19. Parkinsons, a medical dataset related to the detection of Parkinson’s disease [84,85];
20. Pima, which is a medical dataset for the detection of diabetes [86];
21. Phoneme, a dataset that contains sound measurements;
22. Popfailures, a dataset related to experiments regarding climate [87];
23. Regions2, a medical dataset applied to liver problems [88];
24. Saheart, which is a medical dataset concerning heart diseases [89];
25. Segment dataset [90];
26. Statheart, a medical dataset related to heart diseases;
27. Spiral, an artificial dataset with two classes;
28. Student, which is a dataset regarding experiments in schools [91];
29. Transfusion, which is a medical dataset [92];
30. Wdbc, a medical dataset concerning breast cancer [93,94];
31. Wine, a dataset regarding the chemical analysis of wines [95,96];
32. Z_F_S, Z_O_N_F_S, ZO_NF_S, and ZONF_S, which are datasets derived from EEG recordings [97,98];
33. Zoo, which is a dataset regarding animal classification [99].
1. Abalone, which is a dataset about the age of abalones [100];
2. Airfoil, a dataset provided by NASA [101];
3. Auto, a dataset related to the fuel consumption of cars;
4. BK, which is used to predict the points scored in basketball games;
5. BL, a dataset that contains measurements from electricity experiments;
6. Baseball, which is a dataset used to predict the income of baseball players;
7. Concrete, which is a civil engineering dataset [102];
8. DEE, a dataset that is used to predict the price of electricity;
9. Friedman, which is an artificial dataset [103];
10. FY, which is a dataset regarding the longevity of fruit flies;
11. HO, a dataset located in the STATLIB repository;
12. Housing, a dataset regarding the prices of houses [104];
13. Laser, which contains measurements from various physics experiments;
14. LW, a dataset regarding the weight of babies;
15. Mortgage, a dataset that contains measurements from the economy of the USA;
16. PL dataset, located in the STATLIB repository;
17. Plastic, a dataset regarding problems that occurred with the pressure on plastics;
18. Quake, a dataset regarding earthquake measurements;
19. SN, a dataset related to trellising and pruning;
20. Stock, which is a dataset regarding stocks;
21. Treasury, a dataset that contains measurements from the economy of the USA.
3.2. Experiments
1. The column DATASET represents the used dataset.
2. The column ADAM represents the incorporation of the ADAM optimization method [13] to train a neural network with a fixed number of processing nodes.
3. The column BFGS stands for the usage of the BFGS variant of Powell [106] to train an artificial neural network with a fixed number of processing nodes.
4. The column GENETIC represents the incorporation of a genetic algorithm, with the same parameter set as provided in Table 1, to train a neural network with a fixed number of processing nodes.
5. The column RBF stands for the usage of a radial basis function (RBF) network [107,108].
6. The column NNC stands for the usage of the original neural construction method.
7. The column NEAT represents the usage of the NEAT method (neuroevolution of augmenting topologies) [109].
8. The column PRUNE stands for the usage of the OBS-based pruning method [110], as implemented in the Fast Compressed Neural Networks library [111].
9. The column DNN represents the application of the deep neural network provided in the Tiny Dnn library, which is available from https://github.com/tiny-dnn/tiny-dnn (accessed on 7 September 2025). The network was trained using the AdaGrad optimizer [112].
10. The column PROPOSED denotes the usage of the proposed method.
11. The row AVERAGE represents the average classification or regression error over all datasets in the corresponding table.
3.3. Experiments with a Different Crossover Mechanism
3.4. Experiments with the Critical Parameter
3.5. A Series of Practical Examples
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Abiodun, O.I.; Jantan, A.; Omolara, A.E.; Dada, K.V.; Mohamed, N.A.; Arshad, H. State-of-the-art in artificial neural network applications: A survey. Heliyon 2018, 4, e00938. [Google Scholar] [CrossRef] [PubMed]
- Suryadevara, S.; Yanamala, A.K.Y. A Comprehensive Overview of Artificial Neural Networks: Evolution, Architectures, and Applications. Rev. Intel. Artif. Med. 2021, 12, 51–76. [Google Scholar]
- Egmont-Petersen, M.; de Ridder, D.; Handels, H. Image processing with neural networks—A review. Pattern Recognit. 2002, 35, 2279–2301. [Google Scholar] [CrossRef]
- Zhang, G.P. Time series forecasting using a hybrid ARIMA and neural network model. Neurocomputing 2003, 50, 159–175. [Google Scholar] [CrossRef]
- Huang, Z.; Chen, H.; Hsu, C.-J.; Chen, W.-H.; Wu, S. Credit rating analysis with support vector machines and neural networks: A market comparative study. Decis. Support Syst. 2004, 37, 543–558. [Google Scholar] [CrossRef]
- Baldi, P.; Cranmer, K.; Faucett, T.; Sadowski, P.; Whiteson, D. Parameterized neural networks for high-energy physics. Eur. Phys. J. C 2016, 76, 1–7. [Google Scholar] [CrossRef]
- Aguirre, L.A.; Lopes, R.A.; Amaral, G.F.; Letellier, C. Constraining the topology of neural networks to ensure dynamics with symmetry properties. Phys. Rev. E 2004, 69, 026701. [Google Scholar] [CrossRef] [PubMed]
- Mattheakis, M.; Protopapas, P.; Sondak, D.; Di Giovanni, M.; Kaxiras, E. Physical symmetries embedded in neural networks. arXiv 2019, arXiv:1904.08991. [Google Scholar]
- Krippendorf, S.; Syvaeri, M. Detecting symmetries with neural networks. Mach. Learn. Sci. Technol. 2020, 2, 015010. [Google Scholar] [CrossRef]
- Vora, K.; Yagnik, S. A survey on backpropagation algorithms for feedforward neural networks. Int. J. Eng. Dev. Res. 2014, 1, 193–197. [Google Scholar]
- Pajchrowski, T.; Zawirski, K.; Nowopolski, K. Neural speed controller trained online by means of modified RPROP algorithm. IEEE Trans. Ind. Inform. 2014, 11, 560–568. [Google Scholar] [CrossRef]
- Hermanto, R.P.S.; Nugroho, A. Waiting-time estimation in bank customer queues using RPROP neural networks. Procedia Comput. Sci. 2018, 135, 35–42. [Google Scholar] [CrossRef]
- Kingma, D.P.; Ba, J.L. ADAM: A method for stochastic optimization. In Proceedings of the 3rd International Conference on Learning Representations (ICLR 2015), San Diego, CA, USA, 7–9 May 2015; pp. 1–15. [Google Scholar]
- Reynolds, J.; Rezgui, Y.; Kwan, A.; Piriou, S. A zone-level, building energy optimisation combining an artificial neural network, a genetic algorithm, and model predictive control. Energy 2018, 151, 729–739. [Google Scholar] [CrossRef]
- Das, G.; Pattnaik, P.K.; Padhy, S.K. Artificial neural network trained by particle swarm optimization for non-linear channel equalization. Expert Syst. Appl. 2014, 41, 3491–3496. [Google Scholar] [CrossRef]
- Sexton, R.S.; Dorsey, R.E.; Johnson, J.D. Beyond backpropagation: Using simulated annealing for training neural networks. J. Organ. End User Comput. (JOEUC) 1999, 11, 3–10. [Google Scholar] [CrossRef]
- Wang, L.; Zeng, Y.; Chen, T. Back propagation neural network with adaptive differential evolution algorithm for time series forecasting. Expert Syst. Appl. 2015, 42, 855–863. [Google Scholar] [CrossRef]
- Karaboga, D.; Akay, B. Artificial bee colony (ABC) algorithm on training artificial neural networks. In Proceedings of the 2007 IEEE 15th Signal Processing and Communications Applications, Eskisehir, Turkey, 11–13 June 2007; IEEE: New York, NY, USA, 2007; pp. 1–4. [Google Scholar]
- Sexton, R.S.; Alidaee, B.; Dorsey, R.E.; Johnson, J.D. Global optimization for artificial neural networks: A tabu search application. Eur. J. Oper. Res. 1998, 106, 570–584. [Google Scholar] [CrossRef]
- Zhang, J.-R.; Zhang, J.; Lok, T.-M.; Lyu, M.R. A hybrid particle swarm optimization–back-propagation algorithm for feedforward neural network training. Appl. Math. Comput. 2007, 185, 1026–1037. [Google Scholar] [CrossRef]
- Zhao, G.; Wang, T.; Jin, Y.; Lang, C.; Li, Y.; Ling, H. The Cascaded Forward algorithm for neural network training. Pattern Recognit. 2025, 161, 111292. [Google Scholar] [CrossRef]
- Oh, K.; Jung, K. GPU implementation of neural networks. Pattern Recognit. 2004, 37, 1311–1314. [Google Scholar] [CrossRef]
- Zhang, M.; Hibi, K.; Inoue, J. GPU-accelerated artificial neural network potential for molecular dynamics simulation. Comput. Phys. Commun. 2023, 285, 108655. [Google Scholar] [CrossRef]
- Nowlan, S.J.; Hinton, G.E. Simplifying neural networks by soft weight sharing. Neural Comput. 1992, 4, 473–493. [Google Scholar] [CrossRef]
- Nowlan, S.J.; Hinton, G.E. Simplifying neural networks by soft weight sharing. In The Mathematics of Generalization; CRC Press: Boca Raton, FL, USA, 2018; pp. 373–394. [Google Scholar]
- Hanson, S.J.; Pratt, L.Y. Comparing biases for minimal network construction with back propagation. In Advances in Neural Information Processing Systems; Touretzky, D.S., Ed.; Morgan Kaufmann: San Mateo, CA, USA, 1989; Volume 1, pp. 177–185. [Google Scholar]
- Augasta, M.; Kathirvalavakumar, T. Pruning algorithms of neural networks—A comparative study. Cent. Eur. J. Comput. Sci. 2013, 3, 105–115. [Google Scholar] [CrossRef]
- Prechelt, L. Automatic early stopping using cross validation: Quantifying the criteria. Neural Netw. 1998, 11, 761–767. [Google Scholar] [CrossRef]
- Wu, X.; Liu, J. A New Early Stopping Algorithm for Improving Neural Network Generalization. In Proceedings of the 2009 Second International Conference on Intelligent Computation Technology and Automation, Changsha, Hunan, 10–11 October 2009; pp. 15–18. [Google Scholar]
- Treadgold, N.K.; Gedeon, T.D. Simulated annealing and weight decay in adaptive learning: The SARPROP algorithm. IEEE Trans. Neural Netw. 1998, 9, 662–668. [Google Scholar] [CrossRef]
- Carvalho, M.; Ludermir, T.B. Particle Swarm Optimization of Feed-Forward Neural Networks with Weight Decay. In Proceedings of the 2006 Sixth International Conference on Hybrid Intelligent Systems (HIS’06), Auckland, New Zealand, 13–15 December 2006; pp. 13–15. [Google Scholar]
- Arifovic, J.; Gençay, R. Using genetic algorithms to select architecture of a feedforward artificial neural network. Phys. A Stat. Mech. Appl. 2001, 289, 574–594. [Google Scholar] [CrossRef]
- Benardos, P.G.; Vosniakos, G.C. Optimizing feedforward artificial neural network architecture. Eng. Appl. Artif. Intell. 2007, 20, 365–382. [Google Scholar] [CrossRef]
- Garro, B.A.; Vázquez, R.A. Designing Artificial Neural Networks Using Particle Swarm Optimization Algorithms. Comput. Neurosci. 2015, 2015, 369298. [Google Scholar] [CrossRef]
- Siebel, N.T.; Sommer, G. Evolutionary reinforcement learning of artificial neural networks. Int. J. Hybrid Intell. Syst. 2007, 4, 171–183. [Google Scholar] [CrossRef]
- Jaafra, Y.; Laurent, J.L.; Deruyver, A.; Naceur, M.S. Reinforcement learning for neural architecture search: A review. Image Vis. Comput. 2019, 89, 57–66. [Google Scholar] [CrossRef]
- Pham, H.; Guan, M.; Zoph, B.; Le, Q.; Dean, J. Efficient neural architecture search via parameters sharing. In Proceedings of the International Conference on Machine Learning, Stockholm, Sweden, 10–15 July 2018; pp. 4095–4104. [Google Scholar]
- Xie, S.; Zheng, H.; Liu, C.; Lin, L. SNAS: Stochastic neural architecture search. arXiv 2018, arXiv:1812.09926. [Google Scholar]
- Zhou, H.; Yang, M.; Wang, J.; Pan, W. Bayesnas: A bayesian approach for neural architecture search. In Proceedings of the International Conference on Machine Learning, Long Beach, CA, USA, 10–15 June 2019; pp. 7603–7613. [Google Scholar]
- Terfloth, L.; Gasteige, J. Neural networks and genetic algorithms in drug design. Drug Discov. Today 2001, 6, 102–108. [Google Scholar] [CrossRef]
- Kim, G.H.; Seo, D.S.; Kang, K.I. Hybrid models of neural networks and genetic algorithms for predicting preliminary cost estimates. J. Comput. Civ. Eng. 2005, 19, 208–211. [Google Scholar] [CrossRef]
- Kalogirou, S.A. Optimization of solar systems using artificial neural-networks and genetic algorithms. Appl. Energy 2004, 77, 383–405. [Google Scholar] [CrossRef]
- Tong, D.L.; Mintram, R. Genetic Algorithm-Neural Network (GANN): A study of neural network activation functions and depth of genetic algorithm search applied to feature selection. Int. J. Mach. Learn. Cyber. 2010, 1, 75–87. [Google Scholar] [CrossRef]
- Ruehle, F. Evolving neural networks with genetic algorithms to study the string landscape. J. High Energ. Phys. 2017, 2017, 38. [Google Scholar] [CrossRef]
- Ghosh, S.C.; Sinha, B.P.; Das, N. Channel assignment using genetic algorithm based on geometric symmetry. IEEE Trans. Veh. Technol. 2003, 52, 860–875. [Google Scholar] [CrossRef]
- Liu, Y.; Zhou, D. An Improved Genetic Algorithm with Initial Population Strategy for Symmetric TSP. Math. Probl. Eng. 2015, 2015, 212794. [Google Scholar] [CrossRef]
- Han, S.; Barcaro, G.; Fortunelli, A.; Lysgaard, S.; Vegge, T.; Hansen, H.A. Unfolding the structural stability of nanoalloys via symmetry-constrained genetic algorithm and neural network potential. NPJ Comput. Mater. 2022, 8, 121. [Google Scholar] [CrossRef]
- O’Neill, M.; Ryan, C. Grammatical evolution. IEEE Trans. Evol. Comput. 2001, 5, 349–358. [Google Scholar] [CrossRef]
- Tsoulos, I.G.; Gavrilis, D.; Glavas, E. Neural network construction and training using grammatical evolution. Neurocomputing 2008, 72, 269–277. [Google Scholar] [CrossRef]
- Papamokos, G.V.; Tsoulos, I.G.; Demetropoulos, I.N.; Glavas, E. Location of amide I mode of vibration in computed data utilizing constructed neural networks. Expert Syst. Appl. 2009, 36, 12210–12213. [Google Scholar] [CrossRef]
- Tsoulos, I.G.; Gavrilis, D.; Glavas, E. Solving differential equations with constructed neural networks. Neurocomputing 2009, 72, 2385–2391. [Google Scholar] [CrossRef]
- Tsoulos, I.G.; Mitsi, G.; Stavrakoudis, A.; Papapetropoulos, S. Application of Machine Learning in a Parkinson’s Disease Digital Biomarker Dataset Using Neural Network Construction (NNC) Methodology Discriminates Patient Motor Status. Front. ICT 2019, 6, 10. [Google Scholar] [CrossRef]
- Christou, V.; Tsoulos, I.G.; Loupas, V.; Tzallas, A.T.; Gogos, C.; Karvelis, P.S.; Antoniadis, N.; Glavas, E.; Giannakeas, N. Performance and early drop prediction for higher education students using machine learning. Expert Syst. Appl. 2023, 225, 120079. [Google Scholar] [CrossRef]
- Toki, E.I.; Pange, J.; Tatsis, G.; Plachouras, K.; Tsoulos, I.G. Utilizing Constructed Neural Networks for Autism Screening. Appl. Sci. 2024, 14, 3053. [Google Scholar] [CrossRef]
- Backus, J.W. The Syntax and Semantics of the Proposed International Algebraic Language of the Zurich ACM-GAMM Conference. In Proceedings of the International Conference on Information Processing, UNESCO, Paris, France, 15–20 June 1959; pp. 125–132. [Google Scholar]
- Ryan, C.; Collins, J.; O’Neill, M. Grammatical evolution: Evolving programs for an arbitrary language. In Proceedings of the Genetic Programming EuroGP 1998, Paris, France, 14–15 April 1998; Lecture Notes in Computer Science. Banzhaf, W., Poli, R., Schoenauer, M., Fogarty, T.C., Eds.; Springer: Berlin/Heidelberg, Germany, 1998; Volume 1391. [Google Scholar]
- O’Neill, M.; Ryan, M.C. Evolving Multi-line Compilable C Programs. In Proceedings of the Genetic Programming EuroGP 1999, Goteborg, Sweden, 26–27 May 1999; Lecture Notes in Computer Science. Poli, R., Nordin, P., Langdon, W.B., Fogarty, T.C., Eds.; Springer: Berlin/Heidelberg, Germany, 1999; Volume 1598. [Google Scholar]
- Puente, A.O.; Alfonso, R.S.; Moreno, M.A. Automatic composition of music by means of grammatical evolution. In Proceedings of the APL ’02: Proceedings of the 2002 Conference on APL: Array Processing Languages: Lore, Problems, and Applications, Madrid, Spain, 22–25 July 2002; pp. 148–155. [Google Scholar]
- Galván-López, E.; Swafford, J.M.; O’Neill, M.; Brabazon, A. Evolving a Ms. PacMan Controller Using Grammatical Evolution. In Applications of Evolutionary Computation. EvoApplications 2010; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2010; Volume 6024. [Google Scholar]
- Shaker, N.; Nicolau, M.; Yannakakis, G.N.; Togelius, J.; O’Neill, M. Evolving levels for Super Mario Bros using grammatical evolution. In Proceedings of the 2012 IEEE Conference on Computational Intelligence and Games (CIG), Granada, Spain, 11–14 September 2012; pp. 304–311. [Google Scholar]
- Martínez-Rodríguez, D.; Colmenar, J.M.; Hidalgo, J.I.; Micó, R.J.V.; Salcedo-Sanz, S. Particle swarm grammatical evolution for energy demand estimation. Energy Sci. Eng. 2020, 8, 1068–1079. [Google Scholar] [CrossRef]
- Ryan, C.; Kshirsagar, M.; Vaidya, G.; Cunningham, A.; Sivaraman, R. Design of a cryptographically secure pseudo random number generator with grammatical evolution. Sci. Rep. 2022, 12, 8602. [Google Scholar] [CrossRef]
- Martín, C.; Quintana, D.; Isasi, P. Grammatical Evolution-based ensembles for algorithmic trading. Appl. Soft Comput. 2019, 84, 105713. [Google Scholar] [CrossRef]
- Anastasopoulos, N.; Tsoulos, I.G.; Karvounis, E.; Tzallas, A. Locate the Bounding Box of Neural Networks with Intervals. Neural Process Lett. 2020, 52, 2241–2251. [Google Scholar] [CrossRef]
- Kelly, M.; Longjohn, R.; Nottingham, K. The UCI Machine Learning Repository. Available online: https://archive.ics.uci.edu (accessed on 10 September 2025).
- Alcalá-Fdez, J.; Fernandez, A.; Luengo, J.; Derrac, J.; García, S.; Sánchez, L.; Herrera, F. KEEL Data-Mining Software Tool: Data Set Repository, Integration of Algorithms and Experimental Analysis Framework. J. Mult.-Valued Log. Soft Comput. 2011, 17, 255–287. [Google Scholar]
- Weiss, S.M.; Kulikowski, C.A. Computer Systems That Learn: Classification and Prediction Methods from Statistics, Neural Nets, Machine Learning, and Expert Systems; Morgan Kaufmann Publishers Inc.: Burlington, MA, USA, 1991. [Google Scholar]
- Tzimourta, K.D.; Tsoulos, I.; Bilero, I.T.; Tzallas, A.T.; Tsipouras, M.G.; Giannakeas, N. Direct Assessment of Alcohol Consumption in Mental State Using Brain Computer Interfaces and Grammatical Evolution. Inventions 2018, 3, 51. [Google Scholar] [CrossRef]
- Quinlan, J.R. Simplifying Decision Trees. Int. J. Man-Mach. Stud. 1987, 27, 221–234. [Google Scholar] [CrossRef]
- Shultz, T.; Mareschal, D.; Schmidt, W. Modeling Cognitive Development on Balance Scale Phenomena. Mach. Learn. 1994, 16, 59–88. [Google Scholar] [CrossRef]
- Zhou, Z.H.; Jiang, Y. NeC4.5: Neural ensemble based C4.5. IEEE Trans. Knowl. Data Eng. 2004, 16, 770–773. [Google Scholar] [CrossRef]
- Setiono, R.; Leow, W.K. FERNN: An Algorithm for Fast Extraction of Rules from Neural Networks. Appl. Intell. 2000, 12, 15–25. [Google Scholar] [CrossRef]
- Demiroz, G.; Govenir, H.A.; Ilter, N. Learning Differential Diagnosis of Erythemato-Squamous Diseases using Voting Feature Intervals. Artif. Intell. Med. 1998, 13, 147–165. [Google Scholar]
- Horton, P.; Nakai, K. A Probabilistic Classification System for Predicting the Cellular Localization Sites of Proteins. In Proceedings of the International Conference on Intelligent Systems for Molecular Biology, St. Louis, MO, USA, 12–15 June 1996; Volume 4, pp. 109–115. [Google Scholar]
- Hayes-Roth, B.; Hayes-Roth, B.F. Concept learning and the recognition and classification of exemplars. J. Verbal Learn. Verbal Behav. 1977, 16, 321–338. [Google Scholar] [CrossRef]
- Kononenko, I.; Šimec, E.; Robnik-Šikonja, M. Overcoming the Myopia of Inductive Learning Algorithms with RELIEFF. Appl. Intell. 1997, 7, 39–55. [Google Scholar] [CrossRef]
- French, R.M.; Chater, N. Using noise to compute error surfaces in connectionist networks: A novel means of reducing catastrophic forgetting. Neural Comput. 2002, 14, 1755–1769. [Google Scholar] [CrossRef]
- Dy, J.G.; Brodley, C.E. Feature Selection for Unsupervised Learning. J. Mach. Learn. Res. 2004, 5, 845–889. [Google Scholar]
- Perantonis, S.J.; Virvilis, V. Input Feature Extraction for Multilayered Perceptrons Using Supervised Principal Component Analysis. Neural Process. Lett. 1999, 10, 243–252. [Google Scholar] [CrossRef]
- Garcke, J.; Griebel, M. Classification with sparse grids using simplicial basis functions. Intell. Data Anal. 2002, 6, 483–502. [Google Scholar] [CrossRef]
- Mcdermott, J.; Forsyth, R.S. Diagnosing a disorder in a classification benchmark. Pattern Recognit. Lett. 2016, 73, 41–43. [Google Scholar] [CrossRef]
- Cestnik, B.; Kononenko, I.; Bratko, I. Assistant-86: A Knowledge-Elicitation Tool for Sophisticated Users. In Progress in Machine Learning; Bratko, I., Lavrac, N., Eds.; Sigma Press: Wilmslow, UK, 1987; pp. 31–45. [Google Scholar]
- Elter, M.; Schulz-Wendtland, R.; Wittenberg, T. The prediction of breast cancer biopsy outcomes using two CAD approaches that both emphasize an intelligible decision process. Med. Phys. 2007, 34, 4164–4172. [Google Scholar] [CrossRef] [PubMed]
- Little, M.; Mcsharry, P.; Roberts, S.; Costello, D.; Moroz, I. Exploiting Nonlinear Recurrence and Fractal Scaling Properties for Voice Disorder Detection. BioMed Eng. OnLine 2007, 6, 23. [Google Scholar] [CrossRef]
- Little, M.A.; McSharry, P.E.; Hunter, E.J.; Spielman, J.; Ramig, L.O. Suitability of dysphonia measurements for telemonitoring of Parkinson’s disease. IEEE Trans. Biomed. Eng. 2009, 56, 1015–1022. [Google Scholar] [CrossRef]
- Smith, J.W.; Everhart, J.E.; Dickson, W.C.; Knowler, W.C.; Johannes, R.S. Using the ADAP learning algorithm to forecast the onset of diabetes mellitus. In Proceedings of the Symposium on Computer Applications and Medical Care, Washington, DC, USA, 6–9 November 1988; IEEE Computer Society Press: Piscataway, NJ, USA, 1988; pp. 261–265. [Google Scholar]
- Lucas, D.D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y. Failure analysis of parameter-induced simulation crashes in climate models. Geosci. Model Dev. 2013, 6, 1157–1171. [Google Scholar] [CrossRef]
- Giannakeas, N.; Tsipouras, M.G.; Tzallas, A.T.; Kyriakidi, K.; Tsianou, Z.E.; Manousou, P.; Hall, A.; Karvounis, E.C.; Tsianos, V.; Tsianos, E. A clustering based method for collagen proportional area extraction in liver biopsy images. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS, Milan, Italy, 25–29 August 2015; pp. 3097–3100. [Google Scholar]
- Hastie, T.; Tibshirani, R. Non-parametric logistic and proportional odds regression. JRSS-C (Appl. Stat.) 1987, 36, 260–276. [Google Scholar] [CrossRef]
- Dash, M.; Liu, H.; Scheuermann, P.; Tan, K.L. Fast hierarchical clustering and its validation. Data Knowl. Eng. 2003, 44, 109–138. [Google Scholar] [CrossRef]
- Cortez, P.; Silva, A.M.G. Using data mining to predict secondary school student performance. In Proceedings of the 5th FUture BUsiness TEChnology Conference (FUBUTEC 2008), Porto, Portugal, 9–11 April 2008; pp. 5–12. [Google Scholar]
- Yeh, I.C.; Yang, K.J.; Ting, T.M. Knowledge discovery on RFM model using Bernoulli sequence. Expert Syst. Appl. 2009, 36, 5866–5871. [Google Scholar] [CrossRef]
- Jeyasingh, S.; Veluchamy, M. Modified bat algorithm for feature selection with the Wisconsin diagnosis breast cancer (WDBC) dataset. Asian Pac. J. Cancer Prev. APJCP 2017, 18, 1257. [Google Scholar] [PubMed]
- Alshayeji, M.H.; Ellethy, H.; Gupta, R. Computer-aided detection of breast cancer on the Wisconsin dataset: An artificial neural networks approach. Biomed. Signal Processing Control 2022, 71, 103141. [Google Scholar] [CrossRef]
- Raymer, M.; Doom, T.E.; Kuhn, L.A.; Punch, W.F. Knowledge discovery in medical and biological datasets using a hybrid Bayes classifier/evolutionary algorithm. IEEE Trans. Syst. Man Cybern. Part B Cybern. 2003, 33, 802–813. [Google Scholar] [CrossRef] [PubMed]
- Zhong, P.; Fukushima, M. Regularized nonsmooth Newton method for multi-class support vector machines. Optim. Methods Softw. 2007, 22, 225–236. [Google Scholar] [CrossRef]
- Andrzejak, R.G.; Lehnertz, K.; Mormann, F.; Rieke, C.; David, P.; Elger, C.E. Indications of nonlinear deterministic and finite-dimensional structures in time series of brain electrical activity: Dependence on recording region and brain state. Phys. Rev. E 2001, 64, 061907. [Google Scholar] [CrossRef] [PubMed]
- Tzallas, A.T.; Tsipouras, M.G.; Fotiadis, D.I. Automatic Seizure Detection Based on Time-Frequency Analysis and Artificial Neural Networks. Comput. Intell. Neurosci. 2007, 2007, 80510. [Google Scholar] [CrossRef]
- Koivisto, M.; Sood, K. Exact Bayesian Structure Discovery in Bayesian Networks. J. Mach. Learn. Res. 2004, 5, 549–573. [Google Scholar]
- Nash, W.J.; Sellers, T.L.; Talbot, S.R.; Cawthorn, A.J.; Ford, W.B. The Population Biology of Abalone (Haliotis species) in Tasmania. I. Blacklip Abalone (H. rubra) from the North Coast and Islands of Bass Strait; Sea Fisheries Division, Technical Report No. 48; Department of Primary Industry and Fisheries, Tasmania: Hobart, Australia, 1994; ISSN 1034-3288. [Google Scholar]
- Brooks, T.F.; Pope, D.S.; Marcolini, A.M. Airfoil Self-Noise and Prediction. Technical Report, NASA RP-1218. July 1989. Available online: https://ntrs.nasa.gov/citations/19890016302 (accessed on 14 November 2024).
- Yeh, I.C. Modeling of strength of high performance concrete using artificial neural networks. Cem. Concr. Res. 1998, 28, 1797–1808. [Google Scholar] [CrossRef]
- Friedman, J. Multivariate Adaptive Regression Splines. Ann. Stat. 1991, 19, 1–141. [Google Scholar]
- Harrison, D.; Rubinfeld, D.L. Hedonic prices and the demand for clean air. J. Environ. Econ. Manag. 1978, 5, 81–102. [Google Scholar] [CrossRef]
- Tsoulos, I.G.; Charilogis, V.; Kyrou, G.; Stavrou, V.N.; Tzallas, A. OPTIMUS: A Multidimensional Global Optimization Package. J. Open Source Softw. 2025, 10, 7584. [Google Scholar] [CrossRef]
- Powell, M.J.D. A Tolerant Algorithm for Linearly Constrained Optimization Calculations. Math. Program. 1989, 45, 547–566. [Google Scholar] [CrossRef]
- Park, J.; Sandberg, I.W. Universal Approximation Using Radial-Basis-Function Networks. Neural Comput. 1991, 3, 246–257. [Google Scholar] [CrossRef]
- Montazer, G.A.; Giveki, D.; Karami, M.; Rastegar, H. Radial basis function neural networks: A review. Comput. Rev. J. 2018, 1, 52–74. [Google Scholar]
- Stanley, K.O.; Miikkulainen, R. Evolving Neural Networks through Augmenting Topologies. Evol. Comput. 2002, 10, 99–127. [Google Scholar] [CrossRef]
- Zhu, V.; Lu, Y.; Li, Q. MW-OBS: An improved pruning method for topology design of neural networks. Tsinghua Sci. Technol. 2006, 11, 307–312. [Google Scholar] [CrossRef]
- Klima, G. Fast Compressed Neural Networks. Available online: https://rdrr.io/cran/FCNN4R/ (accessed on 10 September 2025).
- Ward, R.; Wu, X.; Bottou, L. Adagrad stepsizes: Sharp convergence over nonconvex landscapes. J. Mach. Learn. Res. 2020, 21, 1–30. [Google Scholar]
- Kopitsa, C.; Tsoulos, I.G.; Charilogis, V.; Stavrakoudis, A. Predicting the Duration of Forest Fires Using Machine Learning Methods. Future Internet 2024, 16, 396. [Google Scholar] [CrossRef]
- Emad-Ud-Din, M.; Wang, Y. Promoting occupancy detection accuracy using on-device lifelong learning. IEEE Sens. J. 2023, 23, 9595–9606. [Google Scholar] [CrossRef]
- Wolfe, M.A. Interval methods for global optimization. Appl. Math. Comput. 1996, 75, 179–206. [Google Scholar]
- Csendes, T.; Ratz, D. Subdivision Direction Selection in Interval Methods for Global Optimization. SIAM J. Numer. Anal. 1997, 34, 922–938. [Google Scholar] [CrossRef]
PARAMETER | MEANING | VALUE
---|---|---
$N_c$ | Chromosomes | 500
$N_g$ | Maximum number of generations | 500
$N_m$ | Number of generations for the modified genetic algorithm | 50
$p_s$ | Selection rate | 0.1
$p_m$ | Mutation rate | 0.05
$N_i$ | Generations before local search | 20
$N_t$ | Chromosomes participating in local search | 20
$a$ | Bounding factor | 10.0
$F$ | Scale factor for the margins | 2.0
$\lambda$ | Value used for penalties | 100.0
DATASET | ADAM | BFGS | GENETIC | RBF | NEAT | PRUNE | DNN | NNC | PROPOSED |
---|---|---|---|---|---|---|---|---|---|
APPENDICITIS | 16.50% | 18.00% | 24.40% | 12.23% | 17.20% | 15.97% | 17.30% | 14.40% | 14.30% |
ALCOHOL | 57.78% | 41.50% | 39.57% | 49.32% | 66.80% | 15.75% | 39.04% | 37.72% | 35.60% |
AUSTRALIAN | 35.65% | 38.13% | 32.21% | 34.89% | 31.98% | 43.66% | 35.03% | 14.46% | 14.55% |
BALANCE | 12.27% | 8.64% | 8.97% | 33.53% | 23.14% | 9.00% | 24.56% | 23.65% | 7.84% |
CLEVELAND | 67.55% | 77.55% | 51.60% | 67.10% | 53.44% | 51.48% | 63.28% | 50.93% | 46.41% |
CIRCULAR | 19.95% | 6.08% | 5.99% | 5.98% | 35.18% | 12.76% | 21.87% | 12.66% | 6.92% |
DERMATOLOGY | 26.14% | 52.92% | 30.58% | 62.34% | 32.43% | 9.02% | 24.26% | 21.54% | 20.54% |
ECOLI | 64.43% | 69.52% | 54.67% | 59.48% | 43.44% | 60.32% | 60.79% | 49.88% | 48.82% |
GLASS | 61.38% | 54.67% | 52.86% | 50.46% | 55.71% | 66.19% | 56.05% | 56.09% | 53.52% |
HABERMAN | 29.00% | 29.34% | 28.66% | 25.10% | 24.04% | 29.38% | 25.73% | 27.53% | 26.80% |
HAYES-ROTH | 59.70% | 37.33% | 56.18% | 64.36% | 50.15% | 45.44% | 44.65% | 33.69% | 31.00% |
HEART | 38.53% | 39.44% | 28.34% | 31.20% | 39.27% | 27.21% | 30.67% | 15.67% | 15.45% |
HEARTATTACK | 45.55% | 46.67% | 29.03% | 29.00% | 32.34% | 29.26% | 32.97% | 20.87% | 21.77% |
HOUSEVOTES | 7.48% | 7.13% | 6.62% | 6.13% | 10.89% | 5.81% | 3.13% | 3.17% | 3.78% |
IONOSPHERE | 16.64% | 15.29% | 15.14% | 16.22% | 19.67% | 11.32% | 12.57% | 11.29% | 11.94% |
LIVERDISORDER | 41.53% | 42.59% | 31.11% | 30.84% | 30.67% | 49.72% | 32.21% | 32.35% | 31.32% |
LYMOGRAPHY | 39.79% | 35.43% | 28.42% | 25.50% | 33.70% | 22.02% | 24.07% | 25.29% | 23.72% |
MAMMOGRAPHIC | 46.25% | 17.24% | 19.88% | 21.38% | 22.85% | 38.10% | 19.83% | 17.62% | 16.74% |
PARKINSONS | 24.06% | 27.58% | 18.05% | 17.41% | 18.56% | 22.12% | 21.32% | 12.74% | 12.63% |
PHONEME | 29.43% | 15.58% | 15.55% | 23.32% | 22.34% | 29.35% | 22.68% | 22.50% | 21.52% |
PIMA | 34.85% | 35.59% | 32.19% | 25.78% | 34.51% | 35.08% | 32.63% | 28.07% | 23.34% |
POPFAILURES | 5.18% | 5.24% | 5.94% | 7.04% | 7.05% | 4.79% | 6.83% | 6.98% | 5.72% |
REGIONS2 | 29.85% | 36.28% | 29.39% | 38.29% | 33.23% | 34.26% | 33.42% | 26.18% | 23.81% |
SAHEART | 34.04% | 37.48% | 34.86% | 32.19% | 34.51% | 37.70% | 35.11% | 29.80% | 28.04% |
SEGMENT | 49.75% | 68.97% | 57.72% | 59.68% | 66.72% | 60.40% | 32.04% | 53.50% | 48.20% |
SPIRAL | 47.67% | 47.99% | 48.66% | 44.87% | 48.66% | 50.38% | 45.64% | 48.01% | 44.95% |
STATHEART | 44.04% | 39.65% | 27.25% | 31.36% | 44.36% | 28.37% | 30.22% | 18.08% | 17.93% |
STUDENT | 5.13% | 7.14% | 5.61% | 5.49% | 10.20% | 10.84% | 6.93% | 6.70% | 4.05% |
TRANSFUSION | 25.68% | 25.84% | 24.87% | 26.41% | 24.87% | 29.35% | 25.92% | 25.77% | 23.16% |
WDBC | 35.35% | 29.91% | 8.56% | 7.27% | 12.88% | 15.48% | 9.43% | 7.36% | 4.95% |
WINE | 29.40% | 59.71% | 19.20% | 31.41% | 25.43% | 16.62% | 27.18% | 13.59% | 9.94% |
Z_F_S | 47.81% | 39.37% | 10.73% | 13.16% | 38.41% | 17.91% | 9.27% | 14.53% | 7.97% |
Z_O_N_F_S | 78.79% | 65.67% | 64.81% | 48.70% | 77.08% | 71.29% | 67.80% | 48.62% | 39.28% |
ZO_NF_S | 47.43% | 43.04% | 21.54% | 9.02% | 43.75% | 15.57% | 8.50% | 13.54% | 6.94% |
ZONF_S | 11.99% | 15.62% | 4.36% | 4.03% | 5.44% | 3.27% | 2.52% | 2.64% | 2.60% |
ZOO | 14.13% | 10.70% | 9.50% | 21.93% | 20.27% | 8.53% | 16.20% | 8.70% | 6.60% |
AVERAGE | 36.45% | 35.71% | 28.25% | 30.73% | 32.19% | 27.94% | 27.82% | 24.79% | 21.18% |
DATASET | FEATURES | CLASSES |
---|---|---|
APPENDICITIS | 7 | 2 |
ALCOHOL | 154 | 4 |
AUSTRALIAN | 14 | 2 |
BALANCE | 4 | 3 |
CLEVELAND | 13 | 5 |
CIRCULAR | 5 | 2 |
DERMATOLOGY | 34 | 6 |
ECOLI | 7 | 8 |
GLASS | 9 | 6 |
HABERMAN | 3 | 2 |
HAYES-ROTH | 5 | 3 |
HEART | 13 | 2 |
HEARTATTACK | 13 | 2 |
HOUSEVOTES | 16 | 2 |
IONOSPHERE | 34 | 2 |
LIVERDISORDER | 6 | 2 |
LYMOGRAPHY | 18 | 4 |
MAMMOGRAPHIC | 5 | 2 |
PARKINSONS | 22 | 2 |
PHONEME | 5 | 2 |
PIMA | 8 | 2 |
POPFAILURES | 18 | 2 |
REGIONS2 | 18 | 5 |
SAHEART | 9 | 2 |
SEGMENT | 19 | 7 |
SPIRAL | 2 | 2 |
STATHEART | 13 | 2 |
STUDENT | 5 | 4 |
TRANSFUSION | 4 | 2 |
WDBC | 30 | 2 |
WINE | 13 | 3 |
Z_F_S | 21 | 3 |
Z_O_N_F_S | 21 | 5 |
ZO_NF_S | 21 | 3 |
ZONF_S | 21 | 2 |
ZOO | 16 | 7 |
DATASET | ADAM | BFGS | GENETIC | RBF | NEAT | PRUNE | DNN | NNC | PROPOSED |
---|---|---|---|---|---|---|---|---|---|
ABALONE | 4.30 | 5.69 | 7.17 | 7.37 | 9.88 | 7.88 | 6.91 | 5.08 | 4.47 |
AIRFOIL | 0.005 | 0.003 | 0.003 | 0.27 | 0.067 | 0.002 | 0.004 | 0.004 | 0.002 |
AUTO | 70.84 | 60.97 | 12.18 | 17.87 | 56.06 | 75.59 | 13.26 | 17.13 | 9.09 |
BK | 0.0252 | 0.28 | 0.027 | 0.02 | 0.15 | 0.027 | 0.02 | 0.10 | 0.023 |
BL | 0.622 | 2.55 | 5.74 | 0.013 | 0.05 | 0.027 | 0.006 | 1.19 | 0.001 |
BASEBALL | 77.90 | 119.63 | 103.60 | 93.02 | 100.39 | 94.50 | 110.22 | 61.57 | 48.13 |
CONCRETE | 0.078 | 0.066 | 0.0099 | 0.011 | 0.081 | 0.0077 | 0.021 | 0.008 | 0.005 |
DEE | 0.63 | 2.36 | 1.013 | 0.17 | 1.512 | 1.08 | 0.31 | 0.26 | 0.22 |
FRIEDMAN | 22.90 | 1.263 | 1.249 | 7.23 | 19.35 | 8.69 | 2.75 | 6.29 | 5.34 |
FY | 0.038 | 0.19 | 0.65 | 0.041 | 0.08 | 0.042 | 0.039 | 0.11 | 0.043 |
HO | 0.035 | 0.62 | 2.78 | 0.03 | 0.169 | 0.03 | 0.026 | 0.015 | 0.016 |
HOUSING | 80.99 | 97.38 | 43.26 | 57.68 | 56.49 | 52.25 | 65.18 | 25.47 | 15.47 |
LASER | 0.03 | 0.015 | 0.59 | 0.03 | 0.084 | 0.007 | 0.045 | 0.025 | 0.0049 |
LW | 0.028 | 2.98 | 1.90 | 0.03 | 0.03 | 0.02 | 0.023 | 0.011 | 0.011 |
MORTGAGE | 9.24 | 8.23 | 2.41 | 1.45 | 14.11 | 12.96 | 9.74 | 0.30 | 0.023 |
PL | 0.117 | 0.29 | 0.29 | 2.118 | 0.09 | 0.032 | 0.056 | 0.047 | 0.029 |
PLASTIC | 11.71 | 20.32 | 2.791 | 8.62 | 20.77 | 17.33 | 3.82 | 4.20 | 2.17 |
QUAKE | 0.07 | 0.42 | 0.04 | 0.07 | 0.298 | 0.04 | 0.098 | 0.96 | 0.036 |
SN | 0.026 | 0.40 | 2.95 | 0.027 | 0.174 | 0.032 | 0.027 | 0.026 | 0.024 |
STOCK | 180.89 | 302.43 | 3.88 | 12.23 | 12.23 | 39.08 | 12.95 | 8.92 | 4.69 |
TREASURY | 11.16 | 9.91 | 2.93 | 2.02 | 15.52 | 13.76 | 11.41 | 0.43 | 0.068 |
AVERAGE | 22.46 | 30.29 | 9.31 | 10.02 | 14.65 | 15.40 | 11.28 | 6.29 | 4.28 |
DATASET | NNC ONE-POINT | NNC-UNIFORM | PROPOSED ONE-POINT | PROPOSED UNIFORM |
---|---|---|---|---|
APPENDICITIS | 14.40% | 14.20% | 14.30% | 14.40% |
ALCOHOL | 37.72% | 42.34% | 35.60% | 39.70% |
AUSTRALIAN | 14.46% | 14.13% | 14.55% | 14.35% |
BALANCE | 23.65% | 20.73% | 7.84% | 7.61% |
CLEVELAND | 50.93% | 51.45% | 46.41% | 46.28% |
CIRCULAR | 12.66% | 17.59% | 6.92% | 11.86% |
DERMATOLOGY | 21.54% | 30.09% | 20.54% | 26.86% |
ECOLI | 49.88% | 48.12% | 48.82% | 48.88% |
GLASS | 56.09% | 57.43% | 53.52% | 52.43% |
HABERMAN | 27.53% | 27.17% | 26.80% | 26.70% |
HAYES-ROTH | 33.69% | 36.61% | 31.00% | 33.62% |
HEART | 15.67% | 16.41% | 15.45% | 14.96% |
HEARTATTACK | 20.87% | 21.50% | 21.77% | 21.27% |
HOUSEVOTES | 3.17% | 3.44% | 3.78% | 3.43% |
IONOSPHERE | 11.29% | 11.80% | 11.94% | 11.77% |
LIVERDISORDER | 32.35% | 32.65% | 31.32% | 32.32% |
LYMOGRAPHY | 25.29% | 28.21% | 23.72% | 25.14% |
MAMMOGRAPHIC | 17.62% | 18.04% | 16.74% | 16.24% |
PARKINSONS | 12.74% | 11.63% | 12.63% | 12.74% |
PHONEME | 22.50% | 23.46% | 21.52% | 21.32% |
PIMA | 28.07% | 27.95% | 23.34% | 24.43% |
POPFAILURES | 6.98% | 6.80% | 5.72% | 6.01% |
REGIONS2 | 26.18% | 25.71% | 23.81% | 25.21% |
SAHEART | 29.80% | 30.52% | 28.04% | 29.13% |
SEGMENT | 53.50% | 54.78% | 48.20% | 52.26% |
SPIRAL | 48.01% | 48.35% | 44.95% | 45.03% |
STATHEART | 18.08% | 18.85% | 17.93% | 18.59% |
STUDENT | 6.70% | 6.15% | 4.05% | 4.10% |
TRANSFUSION | 25.77% | 25.58% | 23.16% | 23.96% |
WDBC | 7.36% | 8.07% | 4.95% | 6.31% |
WINE | 13.59% | 14.41% | 9.94% | 11.76% |
Z_F_S | 14.53% | 18.33% | 7.97% | 10.13% |
Z_O_N_F_S | 48.62% | 51.10% | 39.28% | 44.90% |
ZO_NF_S | 13.54% | 14.52% | 6.94% | 8.24% |
ZONF_S | 2.64% | 2.82% | 2.60% | 2.78% |
ZOO | 8.70% | 10.40% | 6.60% | 8.70% |
AVERAGE | 24.79% | 24.76% | 21.18% | 22.32% |
DATASET | NNC ONE-POINT | NNC UNIFORM | PROPOSED ONE-POINT | PROPOSED UNIFORM |
---|---|---|---|---|
ABALONE | 5.08 | 5.40 | 4.47 | 4.55 |
AIRFOIL | 0.004 | 0.004 | 0.002 | 0.003 |
AUTO | 17.13 | 20.06 | 9.09 | 11.10 |
BK | 0.10 | 0.018 | 0.023 | 0.018 |
BL | 1.19 | 0.018 | 0.001 | 0.001 |
BASEBALL | 61.57 | 63.44 | 48.13 | 49.99 |
CONCRETE | 0.008 | 0.009 | 0.005 | 0.006 |
DEE | 0.26 | 0.28 | 0.22 | 0.24 |
FRIEDMAN | 6.29 | 6.98 | 5.34 | 5.85 |
FY | 0.11 | 0.04 | 0.043 | 0.04 |
HO | 0.015 | 0.016 | 0.016 | 0.011 |
HOUSING | 25.47 | 26.68 | 15.47 | 16.89 |
LASER | 0.025 | 0.041 | 0.0049 | 0.008 |
LW | 0.011 | 0.012 | 0.011 | 0.011 |
MORTGAGE | 0.30 | 0.29 | 0.023 | 0.037 |
PL | 0.047 | 0.046 | 0.029 | 0.024 |
PLASTIC | 4.20 | 5.20 | 2.17 | 2.30 |
QUAKE | 0.96 | 0.036 | 0.036 | 0.036 |
SN | 0.026 | 0.026 | 0.024 | 0.024 |
STOCK | 8.92 | 10.89 | 4.69 | 8.31 |
TREASURY | 0.43 | 0.38 | 0.068 | 0.072 |
AVERAGE | 6.29 | 6.66 | 4.28 | 4.74 |