Data Preprocessing and Neural Network Architecture Selection Algorithms in Cases of Limited Training Sets—On an Example of Diagnosing Alzheimer’s Disease
Abstract
1. Introduction
1.1. Related Papers
1.2. Research Statement
2. Materials and Methods
2.1. Initial Data
2.2. Methodology and Methods
3. Results
3.1. Results of the Preliminary Data Analysis
3.2. Results of Interval Coding Initial Data to Discrete Values
3.2.1. The Integrated Rating Mechanism (Decisions’ Roots) Identified Based on the Encoded Training Set
3.2.2. Neural Network Architecture Selection
3.2.3. Results of Neural Network Training
4. Discussion
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Subject | Group (1 = AD, 0 = CON) | L_BV | R_BV | L_ICV | R_ICV | L_TV | R_TV | L_SV | R_SV | L_DNV | R_DNV |
---|---|---|---|---|---|---|---|---|---|---|---|
sub001 | 1 | 279 | 288 | 255 | 263 | 140 | 138 | 131 | 131 | 165 | 185 |
sub002 | 1 | 274 | 247 | 223 | 243 | 239 | 262 | 190 | 222 | 204 | 102 |
sub003 | 1 | 259 | 333 | 236 | 243 | 172 | 159 | 135 | 145 | 152 | 153 |
... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
sub079 | 0 | 279 | 288 | 179 | 189 | 141 | 153 | 245 | 214 | 177 | 149 |
sub080 | 0 | 249 | 232 | 221 | 232 | 154 | 142 | 165 | 151 | 161 | 143 |
sub081 | 0 | 300 | 259 | 295 | 299 | 216 | 197 | 163 | 131 | 162 | 129 |
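For readers who want to work with the table programmatically, the following minimal sketch reproduces a few of its rows and splits them by diagnosis. The use of pandas is an assumption of this sketch, not the authors' tooling.

```python
import pandas as pd

# A few rows of the magnetic susceptibility (MSV) table above, transcribed
# literally; Group: 1 = Alzheimer's disease (AD), 0 = control (CON).
columns = ["Subject", "Group", "L_BV", "R_BV", "L_ICV", "R_ICV",
           "L_TV", "R_TV", "L_SV", "R_SV", "L_DNV", "R_DNV"]
rows = [
    ["sub001", 1, 279, 288, 255, 263, 140, 138, 131, 131, 165, 185],
    ["sub002", 1, 274, 247, 223, 243, 239, 262, 190, 222, 204, 102],
    ["sub079", 0, 279, 288, 179, 189, 141, 153, 245, 214, 177, 149],
    ["sub080", 0, 249, 232, 221, 232, 154, 142, 165, 151, 161, 143],
]
df = pd.DataFrame(rows, columns=columns)

# Split into the AD and CON groups by the binary label.
ad_group = df[df["Group"] == 1]
con_group = df[df["Group"] == 0]
print(len(ad_group), len(con_group))  # 2 2 for this excerpt
```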
Cerebral Veins | Variable Abbrev. | Domain of MSV | 1st Interval | 2nd Interval | 3rd Interval |
---|---|---|---|---|---|
left basal vein | L_BV | {153.0; 324.0} | {153.0; 210.0} | {210.0; 267.0} | {267.0; 324.0} |
right basal vein | R_BV | {164.0; 357.0} | {164.0; 228.3} | {228.3; 292.7} | {292.7; 357.0} |
left internal cerebral vein | L_ICV | {179.0; 394.0} | {179.0; 250.7} | {250.7; 322.3} | {322.3; 394.0} |
right internal cerebral vein | R_ICV | {163.0; 411.0} | {163.0; 245.7} | {245.7; 328.3} | {328.3; 411.0} |
left vein of the thalamus | L_TV | {131.0; 288.0} | {131.0; 183.3} | {183.3; 235.7} | {235.7; 288.0} |
right vein of the thalamus | R_TV | {109.0; 286.0} | {109.0; 168.0} | {168.0; 227.0} | {227.0; 286.0} |
left septal vein | L_SV | {83.0; 310.0} | {83.0; 158.7} | {158.7; 234.3} | {234.3; 310.0} |
right septal vein | R_SV | {73.0; 287.0} | {73.0; 144.3} | {144.3; 215.7} | {215.7; 287.0} |
left vein of the dentate nucleus | L_DNV | {94.0; 244.0} | {94.0; 144.0} | {144.0; 194.0} | {194.0; 244.0} |
right vein of the dentate nucleus | R_DNV | {81.0; 269.0} | {81.0; 143.7} | {143.7; 206.3} | {206.3; 269.0} |
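The interval boundaries above are obtained by splitting the observed domain of each vein into three equal-width intervals (the table rounds the boundaries to one decimal place); a raw magnetic susceptibility value is then replaced by the index of the interval that contains it. A minimal sketch of this equal-width discretization, with a worked check against sub001 from the first table:

```python
# Observed domains of the magnetic susceptibility values (from the table above).
domains = {
    "L_BV": (153.0, 324.0), "R_BV": (164.0, 357.0),
    "L_ICV": (179.0, 394.0), "R_ICV": (163.0, 411.0),
    "L_TV": (131.0, 288.0), "R_TV": (109.0, 286.0),
    "L_SV": (83.0, 310.0),  "R_SV": (73.0, 287.0),
    "L_DNV": (94.0, 244.0), "R_DNV": (81.0, 269.0),
}

def encode(variable: str, value: float, n_intervals: int = 3) -> int:
    """Map a raw MSV value to the index (1..n_intervals) of its equal-width interval."""
    lo, hi = domains[variable]
    width = (hi - lo) / n_intervals
    index = int((value - lo) // width) + 1
    return min(index, n_intervals)  # the domain maximum falls into the last interval

# Worked check against sub001 (L_BV = 279, L_TV = 140, R_BV = 288, R_TV = 138).
codes = [encode("L_BV", 279), encode("L_TV", 140), encode("R_BV", 288), encode("R_TV", 138)]
print(codes)  # -> [3, 1, 2, 1], matching the encoded row for sub001 in the table that follows
```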
Subject | Group | Encoded L_BV | Encoded L_TV | Encoded R_BV | Encoded R_TV |
---|---|---|---|---|---|
sub001 | AD | 3 | 1 | 2 | 1 |
sub002 | AD | 3 | 3 | 2 | 3 |
sub003 | AD | 2 | 1 | 3 | 1 |
… | … | … | … | … | … |
sub079 | CON | 3 | 1 | 2 | 1 |
sub080 | CON | 2 | 1 | 2 | 1 |
sub081 | CON | 3 | 2 | 2 | 2 |
Subjects | Group | Encoded L_BV | Encoded L_TV | Encoded R_BV | Encoded R_TV | Comments |
---|---|---|---|---|---|---|
sub007 | AD | 1 | 1 | 1 | 2 | The similarity of subject 007 to the CON group is 99.69%, although the subject is in fact sick
sub004, sub006 | AD | 2 | 1 | 2 | 1 | Subject 004 is sick, yet its similarity to the CON group is 12.94%; subject 080, which has the same vector {2; 1; 2; 1} and in fact belongs to the CON group, has a similarity to the CON group of 38.61%
sub042, sub059 | AD | 2 | 1 | 2 | 2 | The similarity of subject 059 to the CON group is 12.94%; subjects 070 and 072, which have the same vector {2; 1; 2; 2} and in fact belong to the CON group, have similarities to the CON group of 62.66% and 38.67%, respectively
sub060 | CON | 2 | 3 | 2 | 3 | The similarity of subject 060 to the AD group is 60.01%, although the subject is in fact not sick
sub063, sub073 | CON | 2 | 2 | 3 | 2 | The similarity of subject 063 to the AD group is 19.15%, although the subject is in fact not sick
sub031 | AD | 2 | 2 | 3 | 2 | The similarity of subject 031 to the CON group is 50.23%, although the subject is in fact sick
sub068, sub078 | CON | 2 | 3 | 3 | 3 | The similarities of subjects 078 and 068 to the AD group are 63.63% and 39.33%, respectively, although they are in fact not sick
sub079 | CON | 3 | 1 | 2 | 1 | The similarity of subject 079 to the CON group is only 10.03%, whereas subject 001, which has the same vector {3; 1; 2; 1} but belongs to the AD group, has a similarity to the AD group of 28.73%
sub071, sub081 | CON | 3 | 2 | 2 | 2 | The similarity of both subjects 071 and 081 to the AD group is 33.62%, although they are in fact not sick
The network has 4 input signals and 1 output signal; its hidden layers are structured as follows:

Hidden Layer | Number of Neurons | Groups of Neurons | Comments |
---|---|---|---|
1 | 12 | 6 and 6 | Neurons correspond to the intervals of the input signals
2 | 18 | 9 and 9 | Neurons correspond to the elements of matrices ML and MR
3 | 7 | 4 and 3 | Neurons correspond to the values of matrices ML and MR
4 | 12 | 12 | Neurons correspond to the elements of matrix MS
5 | 2 | 2 | Neurons correspond to the values of matrix MS
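Read as a plain feed-forward stack, this architecture has four inputs, five hidden layers of 12, 18, 7, 12 and 2 neurons, and one output. The PyTorch sketch below reproduces only that layer sequence; fully connected layers, sigmoid activations, and raw encoded codes as inputs are assumptions of the sketch, since the exact activation functions and any sparse connections between the IRM-derived neuron groups (matrices ML, MR, MS) are not specified in this excerpt.

```python
import torch
import torch.nn as nn

# Layer widths taken from the architecture table above; everything else
# (dense connectivity, sigmoid activations) is an assumption of this sketch.
model = nn.Sequential(
    nn.Linear(4, 12), nn.Sigmoid(),   # layer 1: intervals of the 4 input signals (4 x 3)
    nn.Linear(12, 18), nn.Sigmoid(),  # layer 2: elements of matrices ML and MR (9 + 9)
    nn.Linear(18, 7), nn.Sigmoid(),   # layer 3: values of matrices ML and MR (4 + 3)
    nn.Linear(7, 12), nn.Sigmoid(),   # layer 4: elements of matrix MS
    nn.Linear(12, 2), nn.Sigmoid(),   # layer 5: values of matrix MS
    nn.Linear(2, 1), nn.Sigmoid(),    # output: diagnosis score
)

x = torch.tensor([[3.0, 1.0, 2.0, 1.0]])  # encoded vector of sub001 as an example input
print(model(x).shape)  # torch.Size([1, 1])
```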
Neural Network | Hidden-Layer Structure | Q1 | Q2 | Q3 | Q4 | Q5 | Q6 | Q7 | Q8 |
---|---|---|---|---|---|---|---|---|---|
HL1-5 | 1 layer with 5 neurons | 17.068 | 12.582 | 24.517 | 21.047 | 2.913 | 6.011 | 0.849 | 0.744 |
HL1-10 | 1 layer with 10 neurons | 11.557 | 8.718 | 17.329 | 13.899 | 1.334 | 3.003 | 0.930 | 0.872 |
HL1-15 | 1 layer with 15 neurons | 9.268 | 7.000 | 21.897 | 18.348 | 0.859 | 4.795 | 0.955 | 0.795 |
HL1-20 | 1 layer with 20 neurons | 5.829 | 4.422 | 21.262 | 18.988 | 0.034 | 4.521 | 0.982 | 0.807 |
HL1-25 | 1 layer with 25 neurons | 2.047 | 1.440 | 21.827 | 16.692 | 0.042 | 4.764 | 0.997 | 0.797 |
HL1-27 | 1 layer with 27 neurons | 1.987 | 1.392 | 21.138 | 14.462 | 0.039 | 4.468 | 0.997 | 0.809 |
Neural Network | Hidden-Layer Structure | Q1 | Q2 | Q3 | Q4 | Q5 | Q6 | Q7 | Q8 |
---|---|---|---|---|---|---|---|---|---|
HL2-5-5 | 2 layers with 5 neurons on first layer and 5 neurons on second layer | 4.132 | 2.755 | 16.518 | 12.429 | 0.145 | 2.210 | 0.996 | 0.884 |
HL2-5-10 | 2 layers with 5 neurons on first layer and 10 neurons on second layer | 4.003 | 2.922 | 17.234 | 11.922 | 0.160 | 2.970 | 0.992 | 0.873 |
HL2-5-15 | 2 layers with 5 neurons on first layer and 15 neurons on second layer | 4.581 | 3.163 | 17.720 | 15.341 | 0.210 | 3.140 | 0.989 | 0.866 |
HL2-10-5 | 2 layers with 10 neurons on first layer and 5 neurons on second layer | 0.715 | 0.480 | 13.094 | 8.362 | 0.005 | 1.714 | 0.999 | 0.934 |
HL2-10-15 | 2 layers with 10 neurons on first layer and 15 neurons on second layer | 0.775 | 0.458 | 16.823 | 9.647 | 0.006 | 2.830 | 0.999 | 0.888 |
HL2-15-5 | 2 layers with 15 neurons on first layer and 5 neurons on second layer | 0.665 | 0.479 | 9.918 | 7.110 | 0.004 | 0.797 | 0.999 | 0.958 |
HL2-15-10 | 2 layers with 15 neurons on first layer and 10 neurons on second layer | 0.904 | 0.548 | 14.145 | 10.509 | 0.007 | 1.621 | 0.999 | 0.915 |
HL2-15-15 | 2 layers with 15 neurons on first layer and 15 neurons on second layer | 1.010 | 0.654 | 12.276 | 9.647 | 0.008 | 1.221 | 0.999 | 0.936 |
Neural Network | Hidden-Layer Structure | Q1 | Q2 | Q3 | Q4 | Q5 | Q6 | Q7 | Q8 |
---|---|---|---|---|---|---|---|---|---|
HL2-22-2 | 2 layers with 22 neurons on first layer and 2 neurons on second layer | 1.043 | 0.556 | 20.344 | 15.716 | 0.011 | 4.139 | 0.998 | 0.823 |
HL2-17-4 | 2 layers with 17 neurons on first layer and 4 neurons on second layer | 0.774 | 0.477 | 16.971 | 11.897 | 0.006 | 2.888 | 0.999 | 0.877 |
HL2-14-6 | 2 layers with 14 neurons on first layer and 6 neurons on second layer | 0.543 | 0.348 | 14.759 | 9.564 | 0.003 | 2.178 | 1.000 | 0.907 |
HL2-10-10 | 2 layers with 10 neurons on first layer and 10 neurons on second layer | 0.809 | 0.507 | 14.293 | 12.834 | 0.005 | 1.655 | 0.999 | 0.913 |
HL2-6-17 | 2 layers with 6 neurons on first layer and 17 neurons on second layer | 1.121 | 0.818 | 21.026 | 16.886 | 0.010 | 3.581 | 0.998 | 0.811 |
HL2-3-29 | 2 layers with 3 neurons on first layer and 29 neurons on second layer | 6.006 | 3.763 | 14.313 | 12.314 | 0.292 | 1.659 | 0.981 | 0.913 |
HL2-2-41 | 2 layers with 2 neurons on first layer and 41 neurons on second layer | 15.952 | 6.950 | 32.393 | 17.504 | 2.061 | 8.499 | 0.868 | 0.552 |
HL2-1-51 | 2 layers with 1 neuron on first layer and 51 neurons on second layer | 27.104 | 16.408 | 34.519 | 23.725 | 5.951 | 9.652 | 0.618 | 0.492 |
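The three tables above compare perceptrons with one and two hidden layers of various widths using the quality indicators Q1 to Q8, which are defined in the body of the paper and not reproduced in this excerpt. A sweep over such configurations could be organized as in the sketch below; build_and_evaluate is a hypothetical placeholder for the training and evaluation step, not the authors' actual procedure.

```python
# Hidden-layer configurations mirroring the tables above: single-layer widths
# and (first-layer, second-layer) width pairs for the two-layer networks.
single_layer = [(n,) for n in (5, 10, 15, 20, 25, 27)]
two_layer = [(5, 5), (5, 10), (5, 15), (10, 5), (10, 15), (15, 5), (15, 10), (15, 15),
             (22, 2), (17, 4), (14, 6), (10, 10), (6, 17), (3, 29), (2, 41), (1, 51)]

def build_and_evaluate(hidden_layers):
    """Hypothetical placeholder: train a perceptron with the given hidden-layer
    widths on the encoded set and return the quality indicators Q1-Q8."""
    return None  # replace with the actual training and evaluation procedure

results = {}
for layers in single_layer + two_layer:
    name = "HL{}-{}".format(len(layers), "-".join(str(n) for n in layers))
    results[name] = build_and_evaluate(layers)  # e.g. results["HL2-15-5"]
```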