Using Optimization Techniques in Grammatical Evolution
Abstract
1. Introduction
2. The Proposed Method
2.1. The Grammatical Evolution Method
- N is the set of non-terminal symbols.
- T is the set of terminal symbols.
- S is the start symbol of the grammar, with S ∈ N.
- P is the set of production rules of the grammar. Usually these rules are of the form A → a or A → aB, where A, B ∈ N and a ∈ T.
- Obtain the next element V from the provided chromosome.
- The production rule is selected according to the scheme Rule = V mod NR, where NR is the total number of production rules for the current non-terminal symbol.
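The mapping scheme above can be sketched in a few lines of Python. The toy grammar and the helper below are illustrative assumptions for this sketch, not the grammar of Figure 1 or the paper's implementation.

```python
# Minimal sketch of the Grammatical Evolution mapping: each chromosome
# element V selects a production rule via the scheme Rule = V mod NR.
GRAMMAR = {
    "<expr>": [["<expr>", "+", "<expr>"], ["x"], ["1"]],  # toy grammar
}

def ge_map(chromosome, start="<expr>", max_steps=100):
    """Expand the start symbol into a terminal string using the chromosome."""
    symbols = [start]
    pos = 0
    for _ in range(max_steps):
        # find the leftmost non-terminal symbol
        idx = next((i for i, s in enumerate(symbols) if s in GRAMMAR), None)
        if idx is None:                           # fully terminal: mapping done
            return "".join(symbols)
        rules = GRAMMAR[symbols[idx]]
        v = chromosome[pos % len(chromosome)]     # wrap around the chromosome
        pos += 1
        symbols[idx:idx + 1] = rules[v % len(rules)]  # Rule = V mod NR
    return None                                   # mapping did not terminate

print(ge_map([0, 1, 2]))  # → x+1
```

The wrap-around (`pos % len(chromosome)`) mirrors the usual GE convention of reusing the chromosome when it is exhausted before the expression becomes fully terminal.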
2.2. The Modified Simulated Annealing Algorithm
Algorithm 1 The modified version of the Simulated Annealing algorithm
procedure siman
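Since the full body of the procedure is given in Algorithm 1, the following is only a generic sketch of a simulated annealing loop over an integer chromosome. The neighbour-generation scheme and the default parameter names (g random changes, range R, initial temperature T, cooling rate) follow the experimental parameter table, but the details are assumptions rather than the paper's exact Algorithm 1.

```python
import math
import random

def simulated_annealing(x, fitness, g=10, R=10, T=1e4, alpha=0.8, eps=1e-5,
                        samples=200):
    """Generic simulated annealing over an integer chromosome (sketch).

    g: random changes per candidate, R: range of each change,
    T: initial temperature, alpha: cooling rate, eps: stopping threshold,
    samples: candidates drawn per temperature level."""
    best, best_f = list(x), fitness(x)
    cur, cur_f = list(best), best_f
    while T > eps:
        for _ in range(samples):
            cand = list(cur)
            for _ in range(g):                    # g random changes in [-R, R]
                i = random.randrange(len(cand))
                cand[i] += random.randint(-R, R)
            f = fitness(cand)
            # accept improvements always; worse moves with Boltzmann probability
            if f < cur_f or random.random() < math.exp(-(f - cur_f) / T):
                cur, cur_f = cand, f
                if f < best_f:
                    best, best_f = list(cand), f
        T *= alpha                                # geometric cooling schedule
    return best, best_f
```

For example, `simulated_annealing([5, -3, 7], lambda c: sum(v * v for v in c))` drives the sum of squares of the chromosome toward zero.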
2.3. The Neural Network Construction Algorithm
- Initialization step:
- (a) Set the maximum number of allowed generations.
- (b) Set the number of chromosomes.
- (c) Set the selection rate and the mutation rate.
- (d) Define the number of generations that should elapse before the local optimization technique is applied.
- (e) Define the number of chromosomes that participate in the local search procedure.
- (f) Initialize the chromosomes. Each chromosome is a series of randomly initialized integers.
- (g) Set iter = 0.
- Genetic step:
- (a) For every chromosome do:
- Create for the current chromosome a neural network using the Grammatical Evolution procedure of Section 2.1 and the associated grammar given in Figure 1.
- Calculate the fitness of the chromosome on the train set of the objective problem as the training error of the corresponding neural network, i.e., the sum of squared differences between the network outputs and the desired outputs.
- Perform the selection procedure. Initially, the chromosomes are sorted according to their fitness values. The best chromosomes are transferred intact to the next generation; the remaining chromosomes are replaced by offspring created during the crossover procedure.
- Perform the crossover procedure, which produces new offspring. Pairs of parents are selected using tournament selection, and each pair produces two offspring through the one-point crossover procedure. An example of the one-point crossover procedure is shown in Figure 3.
- Perform the mutation procedure. For each element of every chromosome, a random number in [0, 1] is drawn; the corresponding element is altered when this number is less than the mutation rate.
- (b) EndFor
- Local Search step:
- (a) If the predefined number of generations has elapsed since the last application of the local search, Then:
- Create a set of randomly chosen chromosomes from the genetic population.
- For every chromosome in this set, apply the modified Simulated Annealing algorithm given in Algorithm 1.
- Set iter = iter + 1. If iter exceeds the maximum number of allowed generations, go to the Evaluation step; otherwise, go to the Genetic step.
- Evaluation step:
- (a) Obtain the chromosome with the lowest fitness value and create the associated neural network.
- (b) Evaluate the neural network on the test set of the underlying dataset and report the results.
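As a rough illustration of the overall control flow (initialization, genetic step, periodic local search, evaluation), consider the simplified sketch below. The fitness function, the truncation-based parent pool, and the hill-climbing stand-in for the Simulated Annealing step are all assumptions made for brevity; the actual method evolves integer chromosomes that are mapped to neural networks through the grammar.

```python
import random

def hill_climb(c, fitness, tries=50):
    """Cheap in-place local search, a stand-in for Algorithm 1."""
    f = fitness(c)
    for _ in range(tries):
        i = random.randrange(len(c))
        old = c[i]
        c[i] = random.randint(0, 255)             # propose a single-gene change
        nf = fitness(c)
        if nf < f:
            f = nf                                # keep only improvements
        else:
            c[i] = old
    return c

def evolve(fitness, n_chrom=20, n_genes=10, max_gens=50,
           sel_rate=0.1, mut_rate=0.05, local_every=10, local_count=4):
    """Genetic algorithm with a periodic local search step (sketch)."""
    pop = [[random.randint(0, 255) for _ in range(n_genes)]
           for _ in range(n_chrom)]
    n_keep = max(1, int(sel_rate * n_chrom))
    for it in range(1, max_gens + 1):
        pop.sort(key=fitness)                     # selection: best first
        nxt = [list(c) for c in pop[:n_keep]]     # elitism
        while len(nxt) < n_chrom:                 # one-point crossover
            a, b = random.sample(pop[:n_chrom // 2], 2)
            cut = random.randrange(1, n_genes)
            nxt.append(a[:cut] + b[cut:])
        for c in nxt[n_keep:]:                    # mutation on non-elites
            for i in range(n_genes):
                if random.random() < mut_rate:
                    c[i] = random.randint(0, 255)
        pop = nxt
        if it % local_every == 0:                 # periodic local search step
            for c in random.sample(pop, local_count):
                hill_climb(c, fitness)
    return min(pop, key=fitness)                  # evaluation step
```

Calling `evolve(lambda c: sum((v - 100) ** 2 for v in c))` steers every gene toward 100, showing how the periodic local refinement cooperates with the genetic operators.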
3. Results
- The UCI dataset repository, https://archive.ics.uci.edu/ml/index.php (accessed on 20 March 2024) [59];
- The Keel repository, https://sci2s.ugr.es/keel/datasets.php (accessed on 20 March 2024) [60];
- The StatLib repository, http://lib.stat.cmu.edu/datasets/ (accessed on 20 March 2024).
3.1. Classification Datasets
- Appendicitis dataset, a medical dataset provided in [61].
- Australian dataset [62], which is related to credit card transactions.
- Balance dataset [63], a dataset related to psychological experiments.
- Circular dataset, an artificial dataset that contains 1000 examples.
- Dermatology dataset [66], which is a medical dataset about dermatological diseases.
- Ecoli dataset, a dataset about protein localization sites [67].
- Haberman dataset, related to breast cancer.
- Heart dataset [68], a medical dataset about heart diseases.
- Hayes roth dataset [69], which is a human subjects study.
- HouseVotes dataset [70], related to votes collected from U.S. House of Representatives Congressmen.
- Liverdisorder dataset [73], a medical dataset related to liver disorders.
- Mammographic dataset [74], a medical dataset related to breast cancer.
- Parkinsons dataset, a medical dataset related to Parkinson’s disease (PD) [75].
- Pima dataset [76], a medical dataset related to the diabetes presence.
- Popfailures dataset [77], a dataset related to climate measurements.
- Regions2 dataset, a medical dataset related to hepatitis C [78].
- Saheart dataset [79]. This dataset is used to detect heart diseases.
- Segment dataset [80], related to image processing.
- Student dataset [81], related to data collected in Portuguese schools.
- Transfusion dataset [82], taken from the Blood Transfusion Service Center in Hsin-Chu City, Taiwan.
- Wdbc dataset [83], a medical dataset related to cancer detection.
- EEG dataset, related to EEG measurements [86]. From this dataset, the following cases were used: Z_F_S, Z_O_N_F_S, ZO_NF_S, and ZONF_S.
- Zoo dataset [87], related to animal classification.
3.2. Regression Datasets
- Abalone dataset [88], a dataset related to the prediction of age of abalones.
- Airfoil dataset, a dataset proposed by NASA [89].
- Baseball dataset, related to the income of baseball players.
- Concrete dataset [90], which is a civil engineering dataset.
- Dee dataset. This dataset contains measurements related to the price of electricity.
- HO dataset, downloaded from the StatLib repository.
- Housing dataset, mentioned in [91].
- Laser dataset. This is a dataset related to laser experiments.
- LW dataset, related to risk factors associated with low-weight babies.
- MORTGAGE dataset, a dataset related to economic measurements from the USA.
- PL dataset, obtained from the StatLib repository.
- SN dataset, obtained from the StatLib repository.
- Treasury dataset, a dataset related to economic measurements from the USA.
- TZ dataset, obtained from the StatLib repository.
3.3. Experimental Results
- The column DATASET denotes the used dataset.
- The column ADAM denotes the application of the ADAM optimization method [92] to an artificial neural network with a fixed number of processing nodes.
- The column NEAT stands for the usage of the NEAT method (NeuroEvolution of Augmenting Topologies) [93].
- The column GENETIC stands for the experimental results of an artificial neural network with a fixed number of processing nodes. The neural network was trained using a Genetic Algorithm and the BFGS local optimization method [94].
- The column RBF represents the application of an RBF network with a fixed number of processing nodes to each dataset.
- The column NNC denotes the usage of the original Neural Network Construction technique, which is based on Grammatical Evolution.
- The column NNC-S denotes the usage of the proposed local optimization procedure within the Neural Network Construction technique.
- The line AVERAGE denotes the average classification or regression error.
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Yusup, N.; Zain, A.M.; Hashim, S.Z.M. Evolutionary techniques in optimizing machining parameters: Review and recent applications (2007–2011). Expert Syst. Appl. 2012, 39, 9909–9927. [Google Scholar] [CrossRef]
- Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–73. [Google Scholar] [CrossRef]
- Stender, J. Parallel Genetic Algorithms: Theory & Applications; IOS Press: Amsterdam, The Netherlands, 1993. [Google Scholar]
- Goldberg, D. Genetic Algorithms in Search, Optimization and Machine Learning; Addison-Wesley Publishing Company: Reading, MA, USA, 1989. [Google Scholar]
- Michalewicz, Z. Genetic Algorithms + Data Structures = Evolution Programs; Springer: Berlin/Heidelberg, Germany, 1996. [Google Scholar]
- O’Neill, M.; Ryan, C. Grammatical evolution. IEEE Trans. Evol. Comput. 2001, 5, 349–358. [Google Scholar] [CrossRef]
- Backus, J.W. The Syntax and Semantics of the Proposed International Algebraic Language of the Zurich ACM-GAMM Conference. In Proceedings of the International Conference on Information Processing; UNESCO: Paris, France, 1959; pp. 125–132. [Google Scholar]
- Ryan, C.; Collins, J.; O’Neill, M. Grammatical evolution: Evolving programs for an arbitrary language. In Genetic Programming. EuroGP 1998. Lecture Notes in Computer Science; Banzhaf, W., Poli, R., Schoenauer, M., Fogarty, T.C., Eds.; Springer: Berlin/Heidelberg, Germany, 1998; Volume 1391. [Google Scholar]
- O’Neill, M.; Ryan, M.C. Evolving Multi-line Compilable C Programs. In Genetic Programming. EuroGP 1999. Lecture Notes in Computer Science; Poli, R., Nordin, P., Langdon, W.B., Fogarty, T.C., Eds.; Springer: Berlin/Heidelberg, Germany, 1999; Volume 1598. [Google Scholar]
- Brabazon, A.; O’Neill, M. Credit classification using grammatical evolution. Informatica 2006, 30, 325–335. [Google Scholar]
- Şen, S.; Clark, J.A. A grammatical evolution approach to intrusion detection on mobile ad hoc networks. In Proceedings of the Second ACM Conference on Wireless Network Security, Zurich, Switzerland, 16–19 March 2009. [Google Scholar]
- Chen, L.; Tan, C.H.; Kao, S.J.; Wang, T.S. Improvement of remote monitoring on water quality in a subtropical reservoir by incorporating grammatical evolution with parallel genetic algorithms into satellite imagery. Water Res. 2008, 42, 296–306. [Google Scholar] [CrossRef] [PubMed]
- Hidalgo, J.I.; Colmenar, J.M.; Risco-Martin, J.L.; Cuesta-Infante, A.; Maqueda, E.; Botella, M. Modeling glycemia in humans by means of Grammatical Evolution. Appl. Soft Comput. 2014, 20, 40–53. [Google Scholar] [CrossRef]
- Tavares, J.; Pereira, F.B. Automatic Design of Ant Algorithms with Grammatical Evolution. In Genetic Programming. EuroGP 2012. Lecture Notes in Computer Science; Moraglio, A., Silva, S., Krawiec, K., Machado, P., Cotta, C., Eds.; Springer: Berlin/Heidelberg, Germany, 2012; Volume 7244. [Google Scholar]
- Zapater, M.; Risco-Martín, J.L.; Arroba, P.; Ayala, J.L.; Moya, J.M.; Hermida, R. Runtime data center temperature prediction using Grammatical Evolution techniques. Appl. Soft Comput. 2016, 49, 94–107. [Google Scholar] [CrossRef]
- Ryan, C.; O’Neill, M.; Collins, J.J. Grammatical Evolution: Solving Trigonometric Identities. In Proceedings of the Mendel 1998: 4th International Mendel Conference on Genetic Algorithms, Optimisation Problems, Fuzzy Logic, Neural Networks, Rough Sets, Brno, Czech Republic, 1–2 November 1998. [Google Scholar]
- Puente, A.O.; Alfonso, R.S.; Moreno, M.A. Automatic composition of music by means of grammatical evolution. In Proceedings of the APL ’02: Proceedings of the 2002 Conference on APL: Array Processing Languages: Lore, Problems, and Applications, Madrid, Spain, 22–25 July 2002; pp. 148–155. [Google Scholar]
- Lima de Campo, L.M.; Limão de Oliveira, R.C.; Roisenberg, M. Optimization of neural networks through grammatical evolution and a genetic algorithm. Expert Syst. Appl. 2016, 56, 368–384. [Google Scholar]
- Soltanian, K.; Ebnenasir, A.; Afsharchi, M. Modular Grammatical Evolution for the Generation of Artificial Neural Networks. Evol. Comput. 2022, 30, 291–327. [Google Scholar] [CrossRef] [PubMed]
- Dempsey, I.; Neill, M.O.; Brabazon, A. Constant creation in grammatical evolution. Int. J. Innov. Comput. Appl. 2007, 1, 23–38. [Google Scholar] [CrossRef]
- Galván-López, E.; Swafford, J.M.; O’Neill, M.; Brabazon, A. Evolving a Ms. PacMan Controller Using Grammatical Evolution. In Applications of Evolutionary Computation. EvoApplications 2010. Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2010; Volume 6024. [Google Scholar]
- Shaker, N.; Nicolau, M.; Yannakakis, G.N.; Togelius, J.; O’Neill, M. Evolving levels for Super Mario Bros using grammatical evolution. In Proceedings of the 2012 IEEE Conference on Computational Intelligence and Games (CIG), Granada, Spain, 11–14 September 2012; pp. 304–331. [Google Scholar]
- Martínez-Rodríguez, D.; Colmenar, J.M.; Hidalgo, J.I.; Micó, R.J.V.; Salcedo-Sanz, S. Particle swarm grammatical evolution for energy demand estimation. Energy Sci. Eng. 2020, 8, 1068–1079. [Google Scholar] [CrossRef]
- Sabar, N.R.; Ayob, M.; Kendall, G.; Qu, R. Grammatical Evolution Hyper-Heuristic for Combinatorial Optimization Problems. IEEE Trans. Evol. Comput. 2013, 17, 840–861. [Google Scholar] [CrossRef]
- Ryan, C.; Kshirsagar, M.; Vaidya, G.; Cunningham, A.; Sivaraman, R. Design of a cryptographically secure pseudo random number generator with grammatical evolution. Sci. Rep. 2022, 12, 8602. [Google Scholar] [CrossRef]
- Pereira, P.J.; Cortez, P.; Mendes, R. Multi-objective Grammatical Evolution of Decision Trees for Mobile Marketing user conversion prediction. Expert Syst. Appl. 2021, 168, 114287. [Google Scholar] [CrossRef]
- Castejón, F.; Carmona, E.J. Automatic design of analog electronic circuits using grammatical evolution. Appl. Soft Comput. 2018, 62, 1003–1018. [Google Scholar] [CrossRef]
- Lourenço, N.; Pereira, F.B.; Costa, E. Unveiling the properties of structured grammatical evolution. Genet. Program. Evolvable Mach. 2016, 17, 251–289. [Google Scholar] [CrossRef]
- Lourenço, N.; Assunção, F.; Pereira, F.B.; Costa, E.; Machado, P. Structured grammatical evolution: A dynamic approach. In Handbook of Grammatical Evolution; Springer: Cham, Switzerland, 2018; pp. 137–161. [Google Scholar]
- O’Neill, M.; Brabazon, A.; Nicolau, M.; McGarraghy, S.; Keenan, P. πGrammatical Evolution. In Genetic and Evolutionary Computation—GECCO 2004. GECCO 2004. Lecture Notes in Computer Science; Deb, K., Ed.; Springer: Berlin/Heidelberg, Germany, 2004; Volume 3103. [Google Scholar]
- Poli, R.; Kennedy, J.; Blackwell, T. Particle swarm optimization: An overview. Swarm Intell. 2007, 1, 33–57. [Google Scholar] [CrossRef]
- O’Neill, M.; Brabazon, A. Grammatical swarm: The generation of programs by social programming. Nat. Comput. 2006, 5, 443–462. [Google Scholar] [CrossRef]
- Ferrante, E.; Duéñez-Guzmán, E.; Turgut, A.E.; Wenseleers, T. GESwarm: Grammatical evolution for the automatic synthesis of collective behaviors in swarm robotics. In Proceedings of the 15th Annual Conference on Genetic and Evolutionary Computation, Amsterdam, The Netherlands, 6–10 July 2013; pp. 17–24. [Google Scholar]
- Mégane, J.; Lourenço, N.; Machado, P. Probabilistic Grammatical Evolution. In Genetic Programming. EuroGP 2021. Lecture Notes in Computer Science; Hu, T., Lourenço, N., Medvet, E., Eds.; Springer: Cham, Switzerland, 2021; Volume 12691. [Google Scholar]
- Popelka, O.; Osmera, P. Parallel Grammatical Evolution for Circuit Optimization. In Evolvable Systems: From Biology to Hardware. ICES 2008. Lecture Notes in Computer Science; Hornby, G.S., Sekanina, L., Haddow, P.C., Eds.; Springer: Berlin/Heidelberg, Germany, 2008; Volume 5216. [Google Scholar] [CrossRef]
- Ošmera, P. Two level parallel grammatical evolution. In Advances in Computational Algorithms and Data Analysis; Springer: Cham, Switzerland, 2009; pp. 509–525. [Google Scholar]
- Ortega, A.; Cruz, M.d.; Alfonseca, M. Christiansen Grammar Evolution: Grammatical Evolution with Semantics. IEEE Trans. Evol. Comput. 2007, 11, 77–90. [Google Scholar] [CrossRef]
- O’Neill, M.; Hemberg, E.; Gilligan, C.; Bartley, E.; McDermott, J.; Brabazon, A. GEVA: Grammatical evolution in Java. ACM Sigevolution 2008, 3, 17–22. [Google Scholar] [CrossRef]
- Noorian, F.; de Silva, A.M.; Leong, P.H.W. gramEvol: Grammatical Evolution in R. J. Stat. Softw. 2016, 71, 1–26. [Google Scholar] [CrossRef]
- Raja, M.A.; Ryan, C. GELAB—A Matlab Toolbox for Grammatical Evolution. In Intelligent Data Engineering and Automated Learning—IDEAL 2018. IDEAL 2018. Lecture Notes in Computer Science 2018; Springer: Cham, Switzerland, 2018; Volume 11315. [Google Scholar] [CrossRef]
- Anastasopoulos, N.; Tsoulos, I.G.; Tzallas, A. GenClass: A parallel tool for data classification based on Grammatical Evolution. SoftwareX 2021, 16, 100830. [Google Scholar] [CrossRef]
- Tsoulos, I.G. QFC: A Parallel Software Tool for Feature Construction, Based on Grammatical Evolution. Algorithms 2022, 15, 295. [Google Scholar] [CrossRef]
- Yang, S.; Jat, S.N. Genetic Algorithms with Guided and Local Search Strategies for University Course Timetabling. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 2011, 41, 93–106. [Google Scholar] [CrossRef]
- Sivaram, M.; Batri, K.; Mohammed, A.S.; Porkodi, V. Exploiting the Local Optima in Genetic Algorithm using Tabu Search. Indian J. Sci. Technol. 2019, 12, 1–13. [Google Scholar] [CrossRef]
- Dai, Y.H. Convergence Properties of the BFGS Algorithm. SIAM J. Optim. 2002, 13, 693–701. [Google Scholar] [CrossRef]
- Kirkpatrick, S.; Gelatt, C.D., Jr.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680. [Google Scholar] [CrossRef] [PubMed]
- Robini, M.C.; Rastello, T.; Magnin, I.E. Simulated annealing, acceleration techniques, and image restoration. IEEE Trans. Image Process. 1999, 8, 1374–1387. [Google Scholar] [CrossRef] [PubMed]
- Zhang, L.; Ma, H.; Qian, W.; Li, H. Protein structure optimization using improved simulated annealing algorithm on a three-dimensional AB off-lattice model. Comput. Biol. Chem. 2020, 85, 107237. [Google Scholar] [CrossRef]
- Aerts, J.C.J.H.; Heuvelink, G.B.M. Using simulated annealing for resource allocation. Int. J. Geogr. Inf. Sci. 2002, 16, 571–587. [Google Scholar] [CrossRef]
- Kalai, A.T.; Vempala, S. Simulated annealing for convex optimization. Math. Oper. Res. 2006, 31, 253–266. [Google Scholar] [CrossRef]
- Rere, L.M.R.; Fanany, M.I.; Arymurthy, A.M. Simulated Annealing Algorithm for Deep Learning. Procedia Comput. Sci. 2015, 72, 137–144. [Google Scholar] [CrossRef]
- Tsoulos, I.G.; Gavrilis, D.; Glavas, E. Neural network construction and training using grammatical evolution. Neurocomputing 2008, 72, 269–277. [Google Scholar] [CrossRef]
- Ivanova, I.; Kubat, M. Initialization of neural networks by means of decision trees. Knowl. Based Syst. 1995, 8, 333–344. [Google Scholar] [CrossRef]
- Yam, J.Y.F.; Chow, T.W.S. A weight initialization method for improving training speed in feedforward neural network. Neurocomputing 2000, 30, 219–232. [Google Scholar] [CrossRef]
- Chumachenko, K.; Iosifidis, A.; Gabbouj, M. Feedforward neural networks initialization based on discriminant learning. Neural Netw. 2022, 146, 220–229. [Google Scholar] [CrossRef]
- Leung, F.H.F.; Lam, H.K.; Ling, S.H.; Tam, P.K.S. Tuning of the structure and parameters of a neural network using an improved genetic algorithm. IEEE Trans. Neural Netw. 2003, 14, 79–88. [Google Scholar] [CrossRef]
- Han, H.G.; Qiao, J.F. A structure optimisation algorithm for feedforward neural network construction. Neurocomputing 2013, 99, 347–357. [Google Scholar] [CrossRef]
- Kim, K.J.; Cho, S.B. Evolved neural networks based on cellular automata for sensory-motor controller. Neurocomputing 2006, 69, 2193–2207. [Google Scholar] [CrossRef]
- Kelly, M.; Longjohn, R.; Nottingham, K. The UCI Machine Learning Repository. 2023. Available online: https://archive.ics.uci.edu (accessed on 18 February 2024).
- Alcalá-Fdez, J.; Fernandez, A.; Luengo, J.; Derrac, J.; García, S.; Sánchez, L.; Herrera, F. KEEL Data-Mining Software Tool: Data Set Repository, Integration of Algorithms and Experimental Analysis Framework. J. Mult. Valued Log. Soft Comput. 2011, 17, 255–287. [Google Scholar]
- Weiss, M.S.; Kulikowski, A.C. Computer Systems That Learn: Classification and Prediction Methods from Statistics, Neural Nets, Machine Learning, and Expert Systems; Morgan Kaufmann Publishers Inc.: Burlington, MA, USA, 1991. [Google Scholar]
- Quinlan, J.R. Simplifying Decision Trees. Int. J. Man-Mach. Stud. 1987, 27, 221–234. [Google Scholar] [CrossRef]
- Shultz, T.; Mareschal, D.; Schmidt, W. Modeling Cognitive Development on Balance Scale Phenomena. Mach. Learn. 1994, 16, 59–88. [Google Scholar] [CrossRef]
- Zhou, Z.H. NeC4.5: Neural ensemble based C4.5. IEEE Trans. Knowl. Data Eng. 2004, 16, 770–773. [Google Scholar] [CrossRef]
- Setiono, R.; Leow, W.K. FERNN: An Algorithm for Fast Extraction of Rules from Neural Networks. Appl. Intell. 2000, 12, 15–25. [Google Scholar] [CrossRef]
- Demiroz, G.; Govenir, H.A.; Ilter, N. Learning Differential Diagnosis of Erythemato-Squamous Diseases using Voting Feature Intervals. Artif. Intell. Med. 1998, 13, 147–165. [Google Scholar]
- Horton, P.; Nakai, K. A Probabilistic Classification System for Predicting the Cellular Localization Sites of Proteins. Proc. Int. Conf. Intell. Syst. Mol. Biol. 1996, 4, 109–115. [Google Scholar]
- Kononenko, I.; Šimec, E.; Robnik-Šikonja, M. Overcoming the Myopia of Inductive Learning Algorithms with RELIEFF. Appl. Intell. 1997, 7, 39–55. [Google Scholar] [CrossRef]
- Hayes-Roth, B.; Hayes-Roth, F. Concept learning and the recognition and classification of exemplars. J. Verbal Learn. Verbal Behav. 1977, 16, 321–338. [Google Scholar] [CrossRef]
- French, R.M.; Chater, N. Using noise to compute error surfaces in connectionist networks: A novel means of reducing catastrophic forgetting. Neural Comput. 2002, 14, 1755–1769. [Google Scholar] [CrossRef]
- Dy, J.G.; Brodley, C.E. Feature Selection for Unsupervised Learning. J. Mach. Learn. Res. 2004, 5, 845–889. [Google Scholar]
- Perantonis, S.J.; Virvilis, V. Input Feature Extraction for Multilayered Perceptrons Using Supervised Principal Component Analysis. Neural Process. Lett. 1999, 10, 243–252. [Google Scholar] [CrossRef]
- Garcke, J.; Griebel, M. Classification with sparse grids using simplicial basis functions. Intell. Data Anal. 2002, 6, 483–502. [Google Scholar] [CrossRef]
- Elter, M.; Schulz-Wendtland, R.; Wittenberg, T. The prediction of breast cancer biopsy outcomes using two CAD approaches that both emphasize an intelligible decision process. Med. Phys. 2007, 34, 4164–4172. [Google Scholar] [CrossRef] [PubMed]
- Little, M.A.; McSharry, P.E.; Hunter, E.J.; Spielman, J.; Ramig, L.O. Suitability of dysphonia measurements for telemonitoring of Parkinson’s disease. IEEE Trans. Biomed. Eng. 2009, 56, 1015–1022. [Google Scholar] [CrossRef]
- Smith, J.W.; Everhart, J.E.; Dickson, W.C.; Knowler, W.C.; Johannes, R.S. Using the ADAP learning algorithm to forecast the onset of diabetes mellitus. In Proceedings of the Symposium on Computer Applications and Medical Care; IEEE Computer Society Press: New York, NY, USA, 1988; pp. 261–265. [Google Scholar]
- Lucas, D.D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y. Failure analysis of parameter-induced simulation crashes in climate models. Geosci. Model Dev. 2013, 6, 1157–1171. [Google Scholar] [CrossRef]
- Giannakeas, N.; Tsipouras, M.G.; Tzallas, A.T.; Kyriakidi, K.; Tsianou, Z.E.; Manousou, P.; Hall, A.; Karvounis, E.C.; Tsianos, V.; Tsianos, E. A clustering based method for collagen proportional area extraction in liver biopsy images. In Proceedings of the 2015 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Milan, Italy, 25–29 August 2015; pp. 3097–3100. [Google Scholar]
- Hastie, T.; Tibshirani, R. Non-parametric logistic and proportional odds regression. JRSS-C Appl. Stat. 1987, 36, 260–276. [Google Scholar] [CrossRef]
- Dash, M.; Liu, H.; Scheuermann, P.; Tan, K.L. Fast hierarchical clustering and its validation. Data Knowl. Eng. 2003, 44, 109–138. [Google Scholar] [CrossRef]
- Cortez, P.; Silva, A.M.G. Using data mining to predict secondary school student performance. In Proceedings of the 5th Future Business Technology Conference (FUBUTEC 2008), Porto, Portugal, 9–11 April 2008; pp. 5–12. [Google Scholar]
- Yeh, I.-C.; Yang, K.-J.; Ting, T.-M. Knowledge discovery on RFM model using Bernoulli sequence. Expert Syst. Appl. 2009, 36, 5866–5871. [Google Scholar] [CrossRef]
- Wolberg, W.H.; Mangasarian, O.L. Multisurface method of pattern separation for medical diagnosis applied to breast cytology. Proc. Natl. Acad. Sci. USA 1990, 87, 9193–9196. [Google Scholar] [CrossRef]
- Raymer, M.; Doom, T.E.; Kuhn, L.A.; Punch, W.F. Knowledge discovery in medical and biological datasets using a hybrid Bayes classifier/evolutionary algorithm. IEEE Trans. Syst. Man Cybern. Part B Cybern. A Publ. IEEE Syst. Man Cybern. Soc. 2003, 33, 802–813. [Google Scholar] [CrossRef]
- Zhong, P.; Fukushima, M. Regularized nonsmooth Newton method for multi-class support vector machines. Optim. Methods Softw. 2007, 22, 225–236. [Google Scholar] [CrossRef]
- Andrzejak, R.G.; Lehnertz, K.; Mormann, F.; Rieke, C.; David, P.; Elger, C.E. Indications of nonlinear deterministic and finite- dimensional structures in time series of brain electrical activity: Dependence on recording region and brain state. Phys. Rev. E 2001, 64, 1–8. [Google Scholar] [CrossRef] [PubMed]
- Koivisto, M.; Sood, K. Exact Bayesian Structure Discovery in Bayesian Networks. J. Mach. Learn. Res. 2004, 5, 549–573. [Google Scholar]
- Nash, W.J.; Sellers, T.L.; Talbot, S.R.; Cawthorn, A.J.; Ford, W.B. The Population Biology of Abalone (Haliotis species) in Tasmania. I. Blacklip Abalone (H. rubra) from the North Coast and Islands of Bass Strait; Sea Fisheries Division, Technical Report 48; Sea Fisheries Division, Department of Primary Industry and Fisheries: Orange, NSW, Australia, 1994. [Google Scholar]
- Brooks, T.F.; Pope, D.S.; Marcolini, A.M. Airfoil Self-Noise and Prediction; Technical Report, NASA RP-1218; NASA: Washington, DC, USA, 1989. [Google Scholar]
- Yeh, I.C. Modeling of strength of high performance concrete using artificial neural networks. Cem. Concr. Res. 1998, 28, 1797–1808. [Google Scholar] [CrossRef]
- Harrison, D.; Rubinfeld, D.L. Hedonic prices and the demand for clean air. J. Environ. Econ. Manag. 1978, 5, 81–102. [Google Scholar] [CrossRef]
- Kingma, D.P.; Ba, J.L. ADAM: A method for stochastic optimization. In Proceedings of the 3rd International Conference on Learning Representations (ICLR 2015), San Diego, CA, USA, 7–9 May 2015; pp. 1–15. [Google Scholar]
- Stanley, K.O.; Miikkulainen, R. Evolving Neural Networks through Augmenting Topologies. Evol. Comput. 2002, 10, 99–127. [Google Scholar] [CrossRef]
- Powell, M.J.D. A Tolerant Algorithm for Linearly Constrained Optimization Calculations. Math. Program. 1989, 45, 547–566. [Google Scholar] [CrossRef]
- Yi, D.; Ahn, J.; Ji, S. An effective optimization method for machine learning based on ADAM. Appl. Sci. 2020, 10, 1073. [Google Scholar] [CrossRef]
- Xiao, N.; Hu, X.; Liu, X.; Toh, K.C. Adam-family methods for nonsmooth optimization with convergence guarantees. J. Mach. Learn. Res. 2024, 25, 1–53. [Google Scholar]
- Tzimourta, K.D.; Tsoulos, I.; Bilero, T.; Tzallas, A.T.; Tsipouras, M.G.; Giannakeas, N. Direct Assessment of Alcohol Consumption in Mental State Using Brain Computer Interfaces and Grammatical Evolution. Inventions 2018, 3, 51. [Google Scholar] [CrossRef]
- Glover, F. Parametric tabu-search for mixed integer programs. Comput. Oper. Res. 2006, 33, 2449–2494. [Google Scholar] [CrossRef]
- Lim, A.; Rodrigues, B.; Zhang, X. A simulated annealing and hill-climbing algorithm for the traveling tournament problem. Eur. J. Oper. Res. 2006, 174, 1459–1478. [Google Scholar] [CrossRef]
- Bevilacqua, A. A methodological approach to parallel simulated annealing on an SMP system. J. Parallel Distrib. Comput. 2002, 62, 1548–1570. [Google Scholar] [CrossRef]
Name | Purpose | Value
---|---|---
 | Number of chromosomes | 500
 | Number of generations | 200
 | Selection rate | 0.10
 | Mutation rate | 0.05
g | Number of random changes | 10
R | Range of random changes | 10
 | Small value used in comparisons | 
 | Number of random samples | 200
T | Initial temperature | 
 | Rate of decrease in temperature | 0.8
Dataset | ADAM | NEAT | GENETIC | RBF | NNC | NNC-S |
---|---|---|---|---|---|---|
APPENDICITIS | 16.50% | 17.20% | 18.10% | 12.23% | 14.40% | 14.60% |
AUSTRALIAN | 35.65% | 31.98% | 32.21% | 34.89% | 14.46% | 14.90% |
BALANCE | 7.87% | 23.84% | 8.97% | 33.42% | 22.13% | 7.66% |
CIRCULAR | 3.94% | 34.07% | 5.99% | 6.30% | 14.26% | 7.88% |
CLEVELAND | 67.55% | 53.44% | 51.60% | 67.10% | 49.93% | 48.59% |
DERMATOLOGY | 26.14% | 32.43% | 30.58% | 62.34% | 24.80% | 13.11% |
ECOLI | 64.43% | 43.24% | 49.38% | 59.50% | 48.82% | 44.88% |
HABERMAN | 29.00% | 24.04% | 28.66% | 25.10% | 28.33% | 28.73% |
HAYES ROTH | 59.70% | 50.15% | 56.18% | 64.36% | 37.23% | 28.08% |
HEART | 38.53% | 39.27% | 28.34% | 31.20% | 15.78% | 16.00% |
HOUSEVOTES | 7.48% | 10.89% | 6.62% | 6.13% | 3.52% | 3.74% |
IONOSPHERE | 16.64% | 19.67% | 15.14% | 16.22% | 11.86% | 10.03% |
LIVERDISORDER | 41.53% | 30.67% | 31.11% | 30.84% | 32.97% | 32.82% |
MAMMOGRAPHIC | 46.25% | 22.85% | 19.88% | 21.38% | 18.22% | 16.58% |
PARKINSONS | 24.06% | 18.56% | 18.05% | 17.41% | 13.21% | 12.26% |
PIMA | 34.85% | 34.51% | 32.19% | 25.78% | 28.47% | 25.26% |
POPFAILURES | 5.18% | 7.05% | 5.94% | 7.04% | 6.83% | 5.52% |
REGIONS2 | 29.85% | 33.23% | 29.39% | 38.29% | 25.87% | 24.47% |
SAHEART | 34.04% | 34.51% | 34.86% | 32.19% | 30.80% | 29.52% |
SEGMENT | 49.75% | 66.72% | 57.72% | 59.68% | 54.89% | 39.38% |
STUDENT | 5.13% | 12.50% | 5.61% | 7.52% | 5.70% | 4.52% |
TRANSFUSION | 25.68% | 24.87% | 25.84% | 27.36% | 25.30% | 24.33% |
WDBC | 35.35% | 12.88% | 8.56% | 7.27% | 7.27% | 5.59% |
WINE | 29.40% | 25.43% | 19.20% | 31.41% | 13.53% | 11.47% |
Z_F_S | 47.81% | 38.41% | 10.73% | 13.16% | 15.30% | 7.93% |
Z_O_N_F_S | 78.79% | 79.08% | 64.81% | 60.40% | 50.48% | 40.42% |
ZO_NF_S | 47.43% | 43.75% | 8.41% | 9.02% | 15.22% | 6.60% |
ZONF_S | 11.99% | 5.44% | 2.60% | 4.03% | 3.14% | 2.36% |
ZOO | 14.13% | 20.27% | 16.67% | 21.93% | 9.10% | 7.20% |
AVERAGE | 32.23% | 30.72% | 24.94% | 28.74% | 22.13% | 18.43% |
Dataset | ADAM | NEAT | GENETIC | RBF | NNC | NNC-S |
---|---|---|---|---|---|---|
ABALONE | 4.30 | 9.88 | 7.17 | 7.37 | 5.11 | 4.95 |
AIRFOIL | 0.005 | 0.067 | 0.003 | 0.27 | 0.003 | 0.003 |
BASEBALL | 77.90 | 100.39 | 103.60 | 93.02 | 59.40 | 57.30 |
CONCRETE | 0.078 | 0.081 | 0.0099 | 0.011 | 0.008 | 0.006 |
DEE | 0.63 | 1.512 | 1.013 | 0.17 | 0.26 | 0.23 |
HO | 0.035 | 0.167 | 2.78 | 0.03 | 0.016 | 0.012 |
HOUSING | 80.20 | 56.49 | 43.26 | 57.68 | 25.56 | 18.82 |
LASER | 0.03 | 0.084 | 0.59 | 0.024 | 0.026 | 0.015 |
LW | 0.028 | 0.17 | 1.90 | 1.14 | 0.97 | 0.038 |
MORTGAGE | 9.24 | 14.11 | 2.41 | 1.45 | 0.29 | 0.12 |
PL | 0.117 | 0.097 | 0.28 | 0.083 | 0.046 | 0.033 |
SN | 0.026 | 0.174 | 2.95 | 0.90 | 0.026 | 0.024 |
TREASURY | 11.16 | 15.52 | 2.93 | 2.02 | 0.47 | 0.18 |
TZ | 0.07 | 0.097 | 5.38 | 4.10 | 0.06 | 0.028 |
AVERAGE | 13.12 | 14.20 | 12.45 | 12.02 | 6.60 | 5.84 |
Dataset | NNC-S | NNC-S | NNC-S | NNC-S
---|---|---|---|---
APPENDICITIS | 14.90% | 15.00% | 14.60% | 14.50% |
AUSTRALIAN | 14.59% | 14.85% | 14.90% | 15.04% |
BALANCE | 8.53% | 7.68% | 7.66% | 7.56% |
CIRCULAR | 10.49% | 8.81% | 7.88% | 7.50% |
CLEVELAND | 48.41% | 48.31% | 48.59% | 48.10% |
DERMATOLOGY | 15.09% | 13.80% | 13.11% | 13.29% |
ECOLI | 45.30% | 45.12% | 44.88% | 44.36% |
HABERMAN | 27.80% | 27.97% | 28.73% | 27.83% |
HAYES ROTH | 29.85% | 28.85% | 28.08% | 29.15% |
HEART | 16.11% | 15.04% | 16.00% | 14.78% |
HOUSEVOTES | 3.70% | 4.22% | 3.74% | 3.70% |
IONOSPHERE | 10.54% | 10.09% | 10.03% | 10.00% |
LIVERDISORDER | 31.41% | 33.15% | 32.82% | 33.29% |
MAMMOGRAPHIC | 17.16% | 17.25% | 16.58% | 16.99% |
PARKINSONS | 12.32% | 12.89% | 12.26% | 12.11% |
PIMA | 26.12% | 25.96% | 25.26% | 25.92% |
POPFAILURES | 5.58% | 6.00% | 5.52% | 5.68% |
REGIONS2 | 24.71% | 24.05% | 24.47% | 24.66% |
SAHEART | 30.04% | 29.67% | 29.52% | 29.07% |
SEGMENT | 46.94% | 42.37% | 39.38% | 41.19% |
STUDENT | 4.60% | 4.73% | 4.52% | 4.48% |
TRANSFUSION | 24.28% | 24.34% | 24.33% | 24.03% |
WDBC | 6.23% | 6.22% | 5.59% | 5.68% |
WINE | 12.59% | 11.30% | 11.47% | 9.24% |
Z_F_S | 9.57% | 9.60% | 7.93% | 8.10% |
Z_O_N_F_S | 46.04% | 43.36% | 40.42% | 41.54% |
ZO_NF_S | 9.69% | 8.54% | 6.60% | 6.44% |
ZONF_S | 2.58% | 2.28% | 2.36% | 2.36% |
ZOO | 6.90% | 7.00% | 7.20% | 7.70% |
AVERAGE | 19.38% | 18.91% | 18.43% | 18.42% |
Dataset | NNC-S | NNC-S | NNC-S | NNC-S
---|---|---|---|---|
APPENDICITIS | 14.40% | 14.90% | 14.60% | 15.20% |
AUSTRALIAN | 14.77% | 14.78% | 14.90% | 14.70% |
BALANCE | 7.66% | 7.74% | 7.66% | 7.66% |
CIRCULAR | 8.39% | 8.29% | 7.88% | 7.78% |
CLEVELAND | 49.45% | 49.28% | 48.59% | 47.11% |
DERMATOLOGY | 14.09% | 12.54% | 13.11% | 11.34% |
ECOLI | 44.24% | 46.30% | 44.88% | 44.48% |
HABERMAN | 27.33% | 28.10% | 28.73% | 28.04% |
HAYES ROTH | 29.15% | 27.92% | 28.08% | 26.46% |
HEART | 15.67% | 15.52% | 16.00% | 15.15% |
HOUSEVOTES | 4.00% | 3.62% | 3.74% | 4.52% |
IONOSPHERE | 10.14% | 10.03% | 10.03% | 10.71% |
LIVERDISORDER | 32.80% | 32.12% | 32.82% | 32.29% |
MAMMOGRAPHIC | 17.18% | 16.78% | 16.58% | 16.62% |
PARKINSONS | 12.68% | 12.16% | 12.26% | 11.95% |
PIMA | 25.72% | 25.11% | 25.26% | 26.33% |
POPFAILURES | 5.87% | 5.72% | 5.52% | 5.58% |
REGIONS2 | 23.55% | 24.04% | 24.47% | 24.08% |
SAHEART | 29.48% | 28.96% | 29.52% | 29.24% |
SEGMENT | 40.32% | 40.23% | 39.38% | 40.82% |
STUDENT | 4.18% | 4.50% | 4.52% | 4.78% |
TRANSFUSION | 24.60% | 24.12% | 24.33% | 24.36% |
WDBC | 6.09% | 5.68% | 5.59% | 5.46% |
WINE | 11.00% | 10.30% | 11.47% | 9.41% |
Z_F_S | 8.33% | 8.30% | 7.93% | 8.50% |
Z_O_N_F_S | 41.70% | 43.42% | 40.42% | 41.44% |
ZO_NF_S | 7.58% | 7.72% | 6.60% | 7.10% |
ZONF_S | 2.50% | 2.54% | 2.36% | 3.00% |
ZOO | 6.80% | 6.30% | 7.20% | 6.10% |
AVERAGE | 18.61% | 18.52% | 18.43% | 18.28% |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Tsoulos, I.G.; Tzallas, A.; Karvounis, E. Using Optimization Techniques in Grammatical Evolution. Future Internet 2024, 16, 172. https://doi.org/10.3390/fi16050172