An Optimisation-Driven Prediction Method for Automated Diagnosis and Prognosis
Abstract

1. Introduction
- Section 2 gives a brief overview of the metaheuristics optimisation approaches employed in this study and highlights differences between previous classification schemes based on optimisation algorithms and the proposed method;
- Section 3 clarifies the objectives and methods of this research;
- Section 4 describes the proposed ODP method, including an explanation of the working principle behind the novel PSEDA variant, the definition of the classification process as an optimisation problem, and the formulation of five different objective functions, also known as fitness functions, to strengthen the validity of the proposed approach;
- Section 5 gives details of the experimental phase to make the presented results reproducible;
- Section 6 presents and discusses the numerical results;
- Section 7 concludes this work by summarising the key research outputs and drawing some considerations for possible future developments.
2. Background and Related Work
3. Objectives and Methods
4. The Optimisation-Driven Prediction Method
4.1. The sa–PSEDA Algorithm
- position x;
- personal best position p;
- global best g;
- the position, personal best, and global best at the previous generation.
- a normal distribution, truncated within the search space bounds, with mean value x and an associated standard deviation;
- a normal distribution, truncated within the search space bounds, with mean value p and an associated standard deviation;
- a normal distribution, truncated within the search space bounds, with mean value g and an associated standard deviation;
- a uniform distribution within the search space bounds;
- a relaxed variant of the mixture composed of the three truncated normal distributions of the previous iteration cycle.
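The sampling mechanism described above can be sketched in a few lines: each new coordinate is drawn from a mixture of truncated normals centred on the current position, the personal best, and the global best. The mixture weights and standard deviations below are illustrative assumptions, not the paper's self-adaptive values:

```python
import random

def sample_truncated_normal(mean, std, low, high, rng=random):
    """Rejection-sample a normal value truncated to [low, high]."""
    while True:
        v = rng.gauss(mean, std)
        if low <= v <= high:
            return v

def sample_position(x, p, g, sigmas, low, high,
                    weights=(1 / 3, 1 / 3, 1 / 3), rng=random):
    """Draw one coordinate of a new particle position from a mixture of
    three truncated normals centred on the current position x, the
    personal best p and the global best g. The equal weights and the
    sigmas passed in are placeholder assumptions."""
    mean, std = rng.choices([(x, sigmas[0]), (p, sigmas[1]), (g, sigmas[2])],
                            weights=weights)[0]
    return sample_truncated_normal(mean, std, low, high, rng)
```

In the actual sa–PSEDA design the mixture also includes a uniform component and a relaxed version of the previous iteration's mixture; the sketch shows only the three-normal core.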
Algorithm 1: sa–PSEDA
4.2. The Classification Strategy
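The core idea of this strategy, casting classification as the search for class prototypes that optimise a fitness function, can be sketched compactly. Everything below is a hedged illustration: the flat centroid encoding and the training error-rate fitness are assumptions standing in for the five objective functions formulated in the paper.

```python
import math

def nearest_centroid_predict(x, centroids):
    """Assign x to the class of the closest centroid (Euclidean distance)."""
    best, best_d = None, math.inf
    for label, c in centroids.items():
        d = math.dist(x, c)
        if d < best_d:
            best, best_d = label, d
    return best

def misclassification_fitness(flat_solution, X, y, classes, dim):
    """Decode a flat candidate solution into one centroid per class and
    return the training error rate of the induced nearest-centroid rule.
    This error-rate fitness is the simplest plausible choice, used here
    purely for illustration."""
    centroids = {c: flat_solution[i * dim:(i + 1) * dim]
                 for i, c in enumerate(classes)}
    errors = sum(nearest_centroid_predict(x, centroids) != t
                 for x, t in zip(X, y))
    return errors / len(y)
```

An optimiser such as sa–PSEDA would then minimise this fitness over the concatenated centroid coordinates, and the optimised centroids classify unseen points.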
5. Experimental Setup
5.1. The Datasets
5.2. Parameter Settings and Comparison Algorithms
- Schwefel 1.2: f(x) = sum_{i=1..d} ( sum_{j=1..i} x_j )^2;
- Rosenbrock: f(x) = sum_{i=1..d-1} [ 100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2 ];
- Rastrigin: f(x) = 10 d + sum_{i=1..d} [ x_i^2 - 10 cos(2 π x_i) ];
- Ackley: f(x) = -20 exp( -0.2 sqrt( (1/d) sum_{i=1..d} x_i^2 ) ) - exp( (1/d) sum_{i=1..d} cos(2 π x_i) ) + 20 + e.
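The four benchmarks admit the standard textbook formulations, transcribed directly below; these are the widely used definitions, not necessarily the exact parameterisation of the paper's test suite:

```python
import math

def schwefel_1_2(x):
    """Schwefel 1.2: sum of squared cumulative sums; minimum 0 at the origin."""
    total, cum = 0.0, 0.0
    for xi in x:
        cum += xi
        total += cum * cum
    return total

def rosenbrock(x):
    """Rosenbrock's banana function; minimum 0 at x = (1, ..., 1)."""
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1.0) ** 2
               for i in range(len(x) - 1))

def rastrigin(x):
    """Highly multimodal Rastrigin function; minimum 0 at the origin."""
    return 10.0 * len(x) + sum(xi * xi - 10.0 * math.cos(2 * math.pi * xi)
                               for xi in x)

def ackley(x):
    """Ackley function; minimum 0 at the origin."""
    d = len(x)
    s1 = sum(xi * xi for xi in x) / d
    s2 = sum(math.cos(2 * math.pi * xi) for xi in x) / d
    return -20.0 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20.0 + math.e
```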
- the Bayes Net in Reference [56];
- the Multi Layer Perceptron Artificial Neural Network (MLP) in Reference [57];
- the Radial Basis Function Artificial Neural Network (RBF) in Reference [58];
- the Nearest Neighbor (NN) method in Reference [59];
- the k-Nearest Neighbors (kNN) method in Reference [59];
- the lazy “k-Star” scheme in Reference [60];
- the Bagging method in Reference [61];
- the MultiBoostAB (MBAB) algorithm in Reference [62];
- the Ripple Down Rule learner (Ridor) in Reference [63];
- the Naive Bayes Tree hybrid classifier (NBTree) in Reference [64];
- the Voting Feature Intervals classifier (VFI) proposed in Reference [65].
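Among the baselines above, the NN/kNN methods are simple enough to sketch directly. The minimal majority-vote implementation below is purely illustrative and is not the WEKA configuration used in the experiments:

```python
import math
from collections import Counter

def knn_predict(query, X_train, y_train, k=3):
    """Classify `query` by majority vote among its k nearest training
    points under Euclidean distance; k = 1 gives the NN baseline."""
    neighbours = sorted(zip(X_train, y_train),
                        key=lambda pair: math.dist(query, pair[0]))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]
```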
6. Experimental Results
6.1. Comparison against SI Classifiers
6.2. Comparison against State-of-the-Art Classifiers
7. Conclusions and Future Work
Author Contributions
Funding
Conflicts of Interest
References
- De Vos, B.D.; Berendsen, F.F.; Viergever, M.A.; Staring, M.; Išgum, I. End-to-End Unsupervised Deformable Image Registration with a Convolutional Neural Network. In Deep Learning in Medical Image Analysis and Multimodal Learning for Clinical Decision Support; Springer International Publishing: Cham, Switzerland, 2017; pp. 204–212. [Google Scholar] [CrossRef] [Green Version]
- Eppenhof, K.A.J.; Lafarge, M.W.; Moeskops, P.; Veta, M.; Pluim, J.P.W. Deformable image registration using convolutional neural networks. In Medical Imaging 2018: Image Processing; Angelini, E.D., Landman, B.A., Eds.; International Society for Optics and Photonics (SPIE): Houston, TX, USA, 2018; Volume 10574, pp. 192–197. [Google Scholar] [CrossRef]
- Erickson, B.J.; Korfiatis, P.; Akkus, Z.; Kline, T.L. Machine learning for medical imaging. Radiographics 2017, 37, 505–515. [Google Scholar] [CrossRef]
- Hu, J.; Chen, D.; Liang, P. A Novel Interval Three-Way Concept Lattice Model with Its Application in Medical Diagnosis. Mathematics 2019, 7, 103. [Google Scholar] [CrossRef]
- Hernaiz-Guijarro, M.; Castro-Palacio, J.C.; Navarro-Pardo, E.; Isidro, J.M.; Fernández-de Córdoba, P. A Probabilistic Classification Procedure Based on Response Time Analysis Towards a Quick Pre-Diagnosis of Student’s Attention Deficit. Mathematics 2019, 7, 473. [Google Scholar] [CrossRef]
- Maglogiannis, I.; Zafiropoulos, E.; Anagnostopoulos, I. An intelligent system for automated breast cancer diagnosis and prognosis using SVM based classifiers. Appl. Intell. 2009, 30, 24–36. [Google Scholar] [CrossRef]
- Kourou, K.; Exarchos, T.P.; Exarchos, K.P.; Karamouzis, M.V.; Fotiadis, D.I. Machine learning applications in cancer prognosis and prediction. Comput. Struct. Biotechnol. J. 2015, 13, 8–17. [Google Scholar] [CrossRef] [PubMed]
- Castaneda, C.; Nalley, K.; Mannion, C.; Bhattacharyya, P.; Blake, P.; Pecora, A.; Goy, A.; Suh, K.S. Clinical decision support systems for improving diagnostic accuracy and achieving precision medicine. J. Clin. Bioinform. 2015, 5, 4. [Google Scholar] [CrossRef] [Green Version]
- Abu-Nasser, B. Medical Expert Systems Survey. Int. J. Eng. Inf. Syst. 2017, 1, 218–224. [Google Scholar]
- Rajkomar, A.; Dean, J.; Kohane, I. Machine Learning in Medicine. N. Engl. J. Med. 2019, 380, 1347–1358. [Google Scholar] [CrossRef]
- Lin, W.; Tong, T.; Gao, Q.; Guo, D.; Du, X.; Yang, Y.; Guo, G.; Xiao, M.; Du, M.; Qu, X. Convolutional Neural Networks-Based MRI Image Analysis for the Alzheimer’s Disease Prediction From Mild Cognitive Impairment. Front. Neurosci. 2018, 12, 777. [Google Scholar] [CrossRef]
- Amezquita-Sanchez, J.P.; Mammone, N.; Morabito, F.C.; Marino, S.; Adeli, H. A novel methodology for automated differential diagnosis of mild cognitive impairment and the Alzheimer’s disease using EEG signals. J. Neurosci. Methods 2019, 322, 88–95. [Google Scholar] [CrossRef]
- Pereira, S.; Pinto, A.; Alves, V.; Silva, C.A. Brain Tumor Segmentation Using Convolutional Neural Networks in MRI Images. IEEE Trans. Med Imaging 2016, 35, 1240–1251. [Google Scholar] [CrossRef] [PubMed]
- Korolev, S.; Safiullin, A.; Belyaev, M.; Dodonova, Y. Residual and plain convolutional neural networks for 3D brain MRI classification. In Proceedings of the 2017 IEEE 14th International Symposium on Biomedical Imaging (ISBI 2017), Melbourne, Australia, 18–21 April 2017; pp. 835–838. [Google Scholar] [CrossRef]
- Hoseini, F.; Shahbahrami, A.; Bayat, P. An Efficient Implementation of Deep Convolutional Neural Networks for MRI Segmentation. J. Digit. Imaging 2018, 31, 738–747. [Google Scholar] [CrossRef] [PubMed]
- Collste, G. Expert systems in medicine and moral responsibility. J. Syst. Softw. 1992, 17, 15–22. [Google Scholar] [CrossRef]
- Shameer, K.; Johnson, K.W.; Glicksberg, B.S.; Dudley, J.T.; Sengupta, P.P. Machine learning in cardiovascular medicine: Are we there yet? Heart 2018, 104, 1156–1164. [Google Scholar] [CrossRef]
- Schrider, D.R.; Kern, A.D. Supervised Machine Learning for Population Genetics: A New Paradigm. Trends Genet. 2018, 34, 301–312. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Santucci, V.; Milani, A. Particle Swarm Optimization in the EDAs Framework. In Soft Computing in Industrial Applications; Springer: Berlin/Heidelberg, Germany, 2011; Volume 96, pp. 87–96. [Google Scholar]
- Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar]
- Baioletti, M.; Milani, A.; Santucci, V. A New Precedence-Based Ant Colony Optimization for Permutation Problems. In Simulated Evolution and Learning; Springer International Publishing: Cham, Switzerland, 2017; pp. 960–971. [Google Scholar]
- Santucci, V.; Baioletti, M.; Milani, A. Tackling Permutation-based Optimization Problems with an Algebraic Particle Swarm Optimization Algorithm. Fundam. Inform. 2019, 167, 133–158. [Google Scholar] [CrossRef]
- Milani, A.; Santucci, V. Asynchronous differential evolution. In Proceedings of the IEEE Congress on Evolutionary Computation, Portland, OR, USA, 19–23 June 2004; pp. 1–7. [Google Scholar]
- Eiben, A.E.; Smith, J.E. Introduction to Evolutionary Computing; Springer: Berlin/Heidelberg, Germany, 2003; Volume 53. [Google Scholar]
- Kononova, A.V.; Corne, D.W.; Wilde, P.D.; Shneer, V.; Caraffini, F. Structural bias in population-based algorithms. Inf. Sci. 2015, 298, 468–490. [Google Scholar] [CrossRef] [Green Version]
- Zhang, Y.; Wang, S.; Ji, G. A comprehensive survey on particle swarm optimization algorithm and its applications. Math. Probl. Eng. 2015, 2015, 931256. [Google Scholar] [CrossRef]
- Kong, F.; Jiang, J.; Huang, Y. An Adaptive Multi-Swarm Competition Particle Swarm Optimizer for Large-Scale Optimization. Mathematics 2019, 7, 521. [Google Scholar] [CrossRef]
- Guo, W.; Zhu, L.; Wang, L.; Wu, Q.; Kong, F. An Entropy-Assisted Particle Swarm Optimizer for Large-Scale Optimization Problem. Mathematics 2019, 7, 414. [Google Scholar] [CrossRef]
- Elbes, M.; Alzubi, S.; Kanan, T.; Al-Fuqaha, A.; Hawashin, B. A survey on particle swarm optimization with emphasis on engineering and network applications. Evol. Intell. 2019, 12, 113–129. [Google Scholar] [CrossRef]
- Iacca, G.; Caraffini, F.; Neri, F. Multi-Strategy Coevolving Aging Particle Optimization. Int. J. Neural Syst. 2014, 24, 1450008. [Google Scholar] [CrossRef] [PubMed]
- El-Abd, M.; Kamel, M. PSO_Bounds: A New Hybridization Technique of PSO and EDAs. Found. Comput. Intell. 2009, 3, 509–526. [Google Scholar]
- Iacca, G.; Caraffini, F. Compact Optimization Algorithms with Re-Sampled Inheritance. In Applications of Evolutionary Computation; Kaufmann, P., Castillo, P.A., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 523–534. [Google Scholar] [CrossRef] [Green Version]
- Iacca, G.; Caraffini, F.; Neri, F. Compact Differential Evolution Light: High Performance Despite Limited Memory Requirement and Modest Computational Overhead. J. Comput. Sci. Technol. 2012, 27, 1056–1076. [Google Scholar] [CrossRef]
- Pelikan, M.; Goldberg, D.E.; Cantu-Paz, E. Linkage problem, distribution estimation, and Bayesian networks. Evol. Comput. 2000, 8, 311–340. [Google Scholar] [CrossRef]
- Kulkarni, R.; Venayagamoorthy, G. An estimation of distribution improved particle swarm optimization algorithm. In Proceedings of the ISSNIP 2007 3rd International Conference on Intelligent Sensors, Sensor Networks and Information, London, UK, 3–6 December 2007; pp. 539–544. [Google Scholar]
- Li, J.; Zhang, J.; Jiang, C.; Zhou, M. Composite particle swarm optimizer with historical memory for function optimization. IEEE Trans. Cybern. 2015, 45, 2350–2363. [Google Scholar] [CrossRef]
- Diker, A.; Avci, D.; Avci, E.; Gedikpinar, M. A new technique for ECG signal classification genetic algorithm Wavelet Kernel extreme learning machine. Optik 2019, 180, 46–55. [Google Scholar] [CrossRef]
- Ghosh, M.; Begum, S.; Sarkar, R.; Chakraborty, D.; Maulik, U. Recursive Memetic Algorithm for gene selection in microarray data. Expert Syst. Appl. 2019, 116, 172–185. [Google Scholar] [CrossRef]
- De Falco, I.; Della Cioppa, A.; Tarantino, E. Facing classification problems with particle swarm optimization. Appl. Soft Comput. 2007, 7, 652–658. [Google Scholar] [CrossRef]
- Karaboga, D.; Ozturk, C. A novel clustering approach: Artificial Bee Colony (ABC) algorithm. Appl. Soft Comput. 2011, 11, 652–657. [Google Scholar] [CrossRef]
- Frank, A.; Asuncion, A. UCI Machine Learning Repository; University of California, School of Information and Computer Science: Irvine, CA, USA, 2010. [Google Scholar]
- Hall, M.; Frank, E.; Holmes, G.; Pfahringer, B.; Reutemann, P.; Witten, I.H. The WEKA data mining software: An update. SIGKDD Explor. Newsl. 2009, 11, 10–18. [Google Scholar] [CrossRef]
- Caraffini, F.; Kononova, A.V. Structural bias in differential evolution: A preliminary study. In AIP Conference Proceedings; AIP Publishing: Leiden, The Netherlands, 2019; Volume 2070, p. 020005. [Google Scholar] [CrossRef]
- Caraffini, F.; Kononova, A.V.; Corne, D. Infeasibility and structural bias in differential evolution. Inf. Sci. 2019, 496, 161–179. [Google Scholar] [CrossRef] [Green Version]
- Titterington, D.; Smith, A.; Makov, U. Statistical Analysis of Finite Mixture Distributions; Wiley: New York, NY, USA, 1985; Volume 38. [Google Scholar]
- Baioletti, M.; Milani, A.; Santucci, V. MOEA/DEP: An algebraic decomposition-based evolutionary algorithm for the multiobjective permutation flowshop scheduling problem. In European Conference on Evolutionary Computation in Combinatorial Optimization; Springer: Cham, Switzerland, 2018; pp. 132–145. [Google Scholar]
- Choudhary, R.; Gianey, H.K. Comprehensive Review On Supervised Machine Learning Algorithms. In Proceedings of the 2017 International Conference on Machine Learning and Data Science (MLDS), Noida, India, 14–15 December 2017; pp. 37–43. [Google Scholar]
- Liu, Z.T.; Wu, M.; Cao, W.H.; Mao, J.W.; Xu, J.P.; Tan, G.Z. Speech emotion recognition based on feature selection and extreme learning machine decision tree. Neurocomputing 2018, 273, 271–280. [Google Scholar] [CrossRef]
- Moodley, R.; Chiclana, F.; Caraffini, F.; Carter, J. Application of uninorms to market basket analysis. Int. J. Intell. Syst. 2019, 34, 39–49. [Google Scholar] [CrossRef]
- Moodley, R.; Chiclana, F.; Caraffini, F.; Carter, J. A product-centric data mining algorithm for targeted promotions. J. Retail. Consum. Serv. 2019, 101940. [Google Scholar] [CrossRef]
- Mohammed, M.A.; Abd Ghani, M.K.; Arunkumar, N.; Hamed, R.I.; Mostafa, S.A.; Abdullah, M.K.; Burhanuddin, M.A. Decision support system for nasopharyngeal carcinoma discrimination from endoscopic images using artificial neural network. J. Supercomput. 2018. [Google Scholar] [CrossRef]
- Peña, A.; Bonet, I.; Manzur, D.; Góngora, M.; Caraffini, F. Validation of convolutional layers in deep learning models to identify patterns in multispectral images. In Proceedings of the 2019 14th Iberian Conference on Information Systems and Technologies (CISTI), Coimbra, Portugal, 19–22 June 2019; pp. 1–6. [Google Scholar] [CrossRef]
- Rocchio, J. Relevance Feedback in Information Retrieval. In SMART Retrieval System: Experiments in Automatic Document Processing; Prentice-Hall, Inc.: Upper Saddle River, NJ, USA, 1971. [Google Scholar]
- Caraffini, F. The Stochastic Optimisation Software (SOS) Platform; Zenodo: Geneva, Switzerland, 2019. [Google Scholar]
- Levner, I. Feature selection and nearest centroid classification for protein mass spectrometry. BMC Bioinform. 2005, 6, 68. [Google Scholar] [CrossRef]
- Jensen, F. An Introduction to Bayesian Networks; UCL Press: London, UK, 1996; Volume 74. [Google Scholar]
- Rumelhart, D.; Hintont, G.; Williams, R. Learning representations by back-propagating errors. Nature 1986, 323, 533–536. [Google Scholar] [CrossRef]
- Hassoun, M. Fundamentals of Artificial Neural Networks; MIT Press: Cambridge, MA, USA, 1995. [Google Scholar]
- Cover, T.; Hart, P. Nearest neighbor pattern classification. IEEE Trans. Inf. Theory 1967, 13, 21–27. [Google Scholar] [CrossRef]
- Cleary, J.G.; Trigg, L.E. K*: An Instance-based Learner Using an Entropic Distance Measure. In Proceedings of the 12th International Conference on Machine Learning, Morgan Kaufmann, Nashville, TN, USA, 9–12 July 1995; pp. 108–114. [Google Scholar]
- Breiman, L. Bagging predictors. Mach. Learn. 1996, 24, 123–140. [Google Scholar] [CrossRef] [Green Version]
- Webb, G. Multiboosting: A technique for combining boosting and wagging. Mach. Learn. 2000, 40, 159–196. [Google Scholar] [CrossRef]
- Compton, P.; Jansen, R. Knowledge in context: A strategy for expert system maintenance. In Lecture Notes in Computer Science; Barter, C., Brooks, M., Eds.; AI ’88; Springer: Berlin/Heidelberg, Germany, 1990; Volume 406, pp. 292–306. [Google Scholar] [CrossRef]
- Kohavi, R. Scaling up the accuracy of naive-Bayes classifiers: A decision-tree hybrid. In Proceedings of the Second International Conference on Knowledge Discovery and Data Mining, Portland, OR, USA, 2–4 August 1996; Volume 7. [Google Scholar]
- Demiröz, G.; Güvenir, H. Classification by voting feature intervals. In Machine Learning: ECML-97; Springer: Berlin/Heidelberg, Germany, 1997; pp. 85–92. [Google Scholar]
- Tilahun, S.L.; Ngnotchouye, J.M.T.; Hamadneh, N.N. Continuous versions of firefly algorithm: A review. Artif. Intell. Rev. 2019, 51, 445–492. [Google Scholar] [CrossRef]
- Caraffini, F.; Neri, F. A study on rotation invariance in differential evolution. Swarm Evol. Comput. 2018, 100436. [Google Scholar] [CrossRef]
- Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef] [Green Version]
- Baioletti, M.; Milani, A.; Santucci, V. Automatic Algebraic Evolutionary Algorithms. In International Workshop on Artificial Life and Evolutionary Computation (WIVACE 2017); Springer International Publishing: Cham, Switzerland, 2018; pp. 271–283. [Google Scholar] [CrossRef]
- Baioletti, M.; Milani, A.; Santucci, V. Learning Bayesian Networks with Algebraic Differential Evolution. In 15th International Conference on Parallel Problem Solving from Nature—PPSN XV; Springer International Publishing: Cham, Switzerland, 2018; pp. 436–448. [Google Scholar] [CrossRef]
Dataset | #Inst. | #Classes | #Attr. | #Real | #Int. | #Log. | Missing Inf. |
---|---|---|---|---|---|---|---|
Breast-Real | 569 | 2 | 30 | 30 | 0 | 0 | no |
Breast-Integer | 699 | 2 | 9 | 0 | 9 | 0 | yes |
Dermatology | 366 | 6 | 34 | 0 | 33 | 1 | yes |
Diabetes | 768 | 2 | 8 | 2 | 6 | 0 | no |
Haberman | 306 | 2 | 3 | 0 | 3 | 0 | no |
Heart-2C | 303 | 2 | 13 | 1 | 9 | 3 | yes |
Heart-5C | 303 | 5 | 13 | 1 | 9 | 3 | yes |
Liver | 341 | 2 | 6 | 1 | 5 | 0 | no |
Parkinsons | 195 | 2 | 22 | 22 | 0 | 0 | no |
Thyroid | 215 | 3 | 5 | 4 | 1 | 0 | no |
Vertebral | 310 | 3 | 6 | 6 | 0 | 0 | no |
Benchmark | d | Statistical Difference
---|---|---
Schwefel 1.2 | 30 | no
Schwefel 1.2 | 10 | no
Rosenbrock | 30 | yes
Rosenbrock | 10 | no
Rastrigin | 30 | yes
Rastrigin | 10 | no
Ackley | 30 | no
Ackley | 10 | no
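The "Statistical Difference" flags above come from a significance test over repeated runs; the paper specifies its own procedure, but a rank-based test such as the Wilcoxon rank-sum is the usual choice in this literature. A minimal normal-approximation sketch, assuming no tie correction:

```python
import math

def rank_sum_z(a, b):
    """z statistic of the Wilcoxon rank-sum test under the normal
    approximation (tie correction omitted for brevity). |z| beyond
    roughly 1.96 indicates a difference at the 5% level."""
    combined = sorted([(v, 0) for v in a] + [(v, 1) for v in b])
    # Rank sum of sample `a`; ranks start at 1 over the pooled data.
    r1 = sum(i + 1 for i, (_, group) in enumerate(combined) if group == 0)
    n1, n2 = len(a), len(b)
    mu = n1 * (n1 + n2 + 1) / 2.0                      # expected rank sum under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)  # its standard deviation
    return (r1 - mu) / sigma
```

In practice a library routine with exact p-values and tie handling (e.g. `scipy.stats.ranksums`) would replace this sketch.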
Algorithm | Avg Accuracy | Statistical Analysis |
---|---|---|
sa-PSEDA- | 81.45 ± 13.61 | − |
PSO- | 80.35 ± 13.39 | |
ABC- | 80.53 ± 13.30 | |
sa-PSEDA- | 77.40 ± 16.59 | − |
PSO- | 76.59 ± 15.96 | |
ABC- | 77.48 ± 16.69 | |
sa-PSEDA- | 82.90 ± 13.70 | − |
PSO- | 82.66 ± 13.75 | |
ABC- | 82.07 ± 13.85 | |
sa-PSEDA- | 82.23 ± 14.16 | − |
PSO- | 81.90 ± 14.09 | |
ABC- | 81.87 ± 14.06 | |
sa-PSEDA- | 82.92 ± 13.88 | − |
PSO- | 82.51 ± 14.02 | |
ABC- | 81.91 ± 13.91 | |
NC | 78.15 ± 16.13 |
Algorithm | Avg Accuracy | Statistical Analysis |
---|---|---|
sa-PSEDA- | 81.45 ± 13.61 | |
sa-PSEDA- | 77.40 ± 16.59 | |
sa-PSEDA- | 82.90 ± 13.70 | |
sa-PSEDA- | 82.23 ± 14.16 | |
sa-PSEDA- | 82.92 ± 13.88 | − |
Dataset | Best Classifier | 2nd Best Classifier | 3rd Best Classifier |
---|---|---|---|
Breast-Real | PSO- | sa–PSEDA- | sa–PSEDA- |
Breast-Integer | sa–PSEDA- | PSO- | PSO- |
Dermatology | ABC- | sa–PSEDA- | NC |
Diabetes | sa–PSEDA- | sa–PSEDA- | sa–PSEDA- |
Haberman | ABC- | sa–PSEDA- | PSO- |
Heart-2C | PSO- | sa–PSEDA- | PSO- |
Heart-5C | sa–PSEDA- | PSO- | sa–PSEDA- |
Liver | PSO- | sa–PSEDA- | PSO- |
Parkinsons | sa–PSEDA- | sa–PSEDA- | sa–PSEDA- |
Thyroid | PSO- | sa–PSEDA- | sa–PSEDA- |
Vertebral | PSO- | sa–PSEDA- | PSO- |
All | sa–PSEDA- | sa–PSEDA- | PSO- |
Dataset | sa–PSEDA- | BayesNet | MLP | RBF | NN | kNN | kStar | Bagging | MBAB | Ridor | NBTree | VFI
---|---|---|---|---|---|---|---|---|---|---|---|---
Breast-Real | | | | | | | | | | | |
Breast-Integer | | | | | | | | | | | |
Dermatology | | | | | | | | | | | |
Diabetes | | | | | | | | | | | |
Haberman | | | | | | | | | | | |
Heart-2C | | | | | | | | | | | |
Heart-5C | | | | | | | | | | | |
Liver | | | | | | | | | | | |
Parkinsons | | | | | | | | | | | |
Thyroid | | | | | | | | | | | |
Vertebral | | | | | | | | | | | |
Average | | | | | | | | | | | |
Stat. Analysis | − | // | // | // | // | // | // | // | // | // | // | //
Dataset | Most Performant | 2nd Most Perf. | 3rd Most Perf. |
---|---|---|---|
Breast-Real | sa–PSEDA- | kNN | MLP |
Breast-Integer | BayesNet | kNN | sa–PSEDA- |
Dermatology | BayesNet | MLP | kNN |
Diabetes | sa–PSEDA- | Bagging | BayesNet |
Haberman | sa–PSEDA- | MLP | RBF |
Heart-2C | sa–PSEDA- | BayesNet | RBF |
Heart-5C | BayesNet | sa–PSEDA- | RBF |
Liver | Bagging | MLP | sa–PSEDA- |
Parkinsons | NN | kNN | MLP |
Thyroid | NN | sa–PSEDA- | MLP |
Vertebral | MLP | Bagging | RBF |
All | MLP | sa–PSEDA- | Bagging |
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Santucci, V.; Milani, A.; Caraffini, F. An Optimisation-Driven Prediction Method for Automated Diagnosis and Prognosis. Mathematics 2019, 7, 1051. https://doi.org/10.3390/math7111051