Multivariate Modelling and Prediction of High-Frequency Sensor-Based Cerebral Physiologic Signals: Narrative Review of Machine Learning Methodologies
Abstract
1. Introduction
2. Methods
2.1. Multivariate State-Space Models
Hidden Markov Model (HMM)
2.2. Neural Networks
2.2.1. Convolutional Neural Network (CNN)
2.2.2. Recurrent Neural Network (RNN)
2.2.3. Long Short-Term Memory (LSTM)
2.2.4. Echo State Network (ESN)
2.3. Ensemble Learning Models
2.3.1. Random Forest
2.3.2. Extreme Gradient Boosting (XGBoost)
2.4. Kernel Methods
2.4.1. Support Vector Machine (SVM)
2.4.2. Gaussian Processes (GPs)
3. Preprocessing Requirements
4. Clinical Relevance
5. Limitations of the Models
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Zeiler, F.A.; Ercole, A.; Cabeleira, M.; Zoerle, T.; Stocchetti, N.; Menon, D.; Smielewski, P.; Czosnyka, M. Univariate Comparison of Performance of Different Cerebrovascular Reactivity Indices for Outcome Association in Adult TBI: A CENTER-TBI Study. Available online: https://pubmed.ncbi.nlm.nih.gov/30877472/?otool=icaumlib (accessed on 14 August 2020).
- Wang, R.; Wang, Y.; Xu, X.; Li, Y.; Pan, X. Brain Works Principle Followed by Neural Information Processing: A Review of Novel Brain Theory. Artif. Intell. Rev. 2023, 56, 285–350. [Google Scholar] [CrossRef]
- Froese, L.; Gomez, A.; Sainbhi, A.S.; Batson, C.; Stein, K.; Alizadeh, A.; Zeiler, F.A. Dynamic Temporal Relationship Between Autonomic Function and Cerebrovascular Reactivity in Moderate/Severe Traumatic Brain Injury. Front. Netw. Physiol. 2022, 2, 837860. [Google Scholar] [CrossRef] [PubMed]
- Tas, J.; Czosnyka, M.; van der Horst, I.C.C.; Park, S.; van Heugten, C.; Sekhon, M.; Robba, C.; Menon, D.K.; Zeiler, F.A.; Aries, M.J.H. Cerebral Multimodality Monitoring in Adult Neurocritical Care Patients with Acute Brain Injury: A Narrative Review. Front. Physiol. 2022, 13, 1071161. [Google Scholar] [CrossRef] [PubMed]
- Donnelly, J.; Czosnyka, M.; Adams, H.; Cardim, D.; Kolias, A.G.; Zeiler, F.A.; Lavinio, A.; Aries, M.; Robba, C.; Smielewski, P.; et al. Twenty-Five Years of Intracranial Pressure Monitoring After Severe Traumatic Brain Injury: A Retrospective, Single-Center Analysis. Neurosurgery 2019, 85, E75–E82. [Google Scholar] [CrossRef]
- Caldwell, M.; Hapuarachchi, T.; Highton, D.; Elwell, C.; Smith, M.; Tachtsidis, I. BrainSignals Revisited: Simplifying a Computational Model of Cerebral Physiology. PLoS ONE 2015, 10, e0126695. [Google Scholar] [CrossRef]
- Brier, L.M.; Zhang, X.; Bice, A.R.; Gaines, S.H.; Landsness, E.C.; Lee, J.-M.; Anastasio, M.A.; Culver, J.P. A Multivariate Functional Connectivity Approach to Mapping Brain Networks and Imputing Neural Activity in Mice. Cereb. Cortex 2022, 32, 1593–1607. [Google Scholar] [CrossRef]
- Chen, G.; Adleman, N.E.; Saad, Z.S.; Leibenluft, E.; Cox, R.W. Applications of Multivariate Modeling to Neuroimaging Group Analysis: A Comprehensive Alternative to Univariate General Linear Model. NeuroImage 2014, 99, 571–588. [Google Scholar] [CrossRef]
- Jha, A.; Agarwal, S. Do Deep Neural Networks Model Nonlinear Compositionality in the Neural Representation of Human-Object Interactions? In Proceedings of the 2019 Conference on Cognitive Computational Neuroscience, Berlin, Germany, 13–16 September 2019. [Google Scholar]
- Shi, W.; Fan, L.; Jiang, T. Developing Neuroimaging Biomarker for Brain Diseases with a Machine Learning Framework and the Brainnetome Atlas. Neurosci. Bull. 2021, 37, 1523–1525. [Google Scholar] [CrossRef]
- Zhang, J. Multivariate Analysis and Machine Learning in Cerebral Palsy Research. Front. Neurol. 2017, 8, 715. [Google Scholar] [CrossRef]
- Ahmadzadeh, M.; Christie, G.J.; Cosco, T.D.; Arab, A.; Mansouri, M.; Wagner, K.R.; DiPaola, S.; Moreno, S. Neuroimaging and Machine Learning for Studying the Pathways from Mild Cognitive Impairment to Alzheimer’s Disease: A Systematic Review. BMC Neurol. 2023, 23, 309. [Google Scholar] [CrossRef]
- Raj, R.; Wennervirta, J.M.; Tjerkaski, J.; Luoto, T.M.; Posti, J.P.; Nelson, D.W.; Takala, R.; Bendel, S.; Thelin, E.P.; Luostarinen, T.; et al. Dynamic Prediction of Mortality after Traumatic Brain Injury Using a Machine Learning Algorithm. NPJ Digit. Med. 2022, 5, 96. [Google Scholar] [CrossRef] [PubMed]
- Tanaka, H.; Ishikawa, T.; Kakei, S. Neural Predictive Computation in the Cerebellum. In Cerebellum as a CNS Hub; Mizusawa, H., Kakei, S., Eds.; Springer International Publishing: Cham, Switzerland, 2021; pp. 371–390. [Google Scholar]
- Zhang, W.; Braden, B.B.; Miranda, G.; Shu, K.; Wang, S.; Liu, H.; Wang, Y. Integrating Multimodal and Longitudinal Neuroimaging Data with Multi-Source Network Representation Learning. Neuroinformatics 2022, 20, 301–316. [Google Scholar] [CrossRef] [PubMed]
- Al-azazi, F.A.; Ghurab, M. ANN-LSTM: A Deep Learning Model for Early Student Performance Prediction in MOOC. Heliyon 2023, 9, e15382. [Google Scholar] [CrossRef] [PubMed]
- Polikar, R. Ensemble Learning. In Ensemble Machine Learning: Methods and Applications; Zhang, C., Ma, Y., Eds.; Springer: New York, NY, USA, 2012; pp. 1–34. ISBN 978-1-4419-9326-7. [Google Scholar]
- Gao, Z.; Dang, W.; Wang, X.; Hong, X.; Hou, L.; Ma, K.; Perc, M. Complex Networks and Deep Learning for EEG Signal Analysis. Cogn. Neurodyn. 2021, 15, 369–388. [Google Scholar] [CrossRef] [PubMed]
- Triantafyllopoulos, K. Multivariate State Space Models. In Bayesian Inference of State Space Models: Kalman Filtering and Beyond; Triantafyllopoulos, K., Ed.; Springer Texts in Statistics; Springer International Publishing: Cham, Switzerland, 2021; pp. 209–261. ISBN 978-3-030-76124-0. [Google Scholar]
- Liu, W.; Yairi, T. A Unifying View of Multivariate State Space Models for Soft Sensors in Industrial Processes. IEEE Access 2024, 12, 5920–5932. [Google Scholar] [CrossRef]
- Eddy, S.R. Hidden Markov Models. Curr. Opin. Struct. Biol. 1996, 6, 361–365. [Google Scholar] [CrossRef]
- Mor, B.; Garhwal, S.; Kumar, A. A Systematic Review of Hidden Markov Models and Their Applications. Arch. Comput. Methods Eng. 2021, 28, 1429–1448. [Google Scholar] [CrossRef]
- Miller, D.R.H.; Leek, T.; Schwartz, R.M. A Hidden Markov Model Information Retrieval System. In Proceedings of the 22nd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, Berkeley, CA, USA, 15–19 August 1999; ACM: Berkeley, CA, USA, 1999; pp. 214–221. [Google Scholar]
- Rabiner, L.; Juang, B. An Introduction to Hidden Markov Models. IEEE ASSP Mag. 1986, 3, 4–16. [Google Scholar] [CrossRef]
- Chen, K.; Li, C.; Sun, W.; Tao, Y.; Wang, R.; Hou, W.; Liu, D.-Q. Hidden Markov Modeling Reveals Prolonged “Baseline” State and Shortened Antagonistic State across the Adult Lifespan. Cereb. Cortex 2022, 32, 439–453. [Google Scholar] [CrossRef]
- Torrésani, B.; Villaron, E. Harmonic Hidden Markov Models for the Study of EEG Signals. In Proceedings of the 2010 18th European Signal Processing Conference, Aalborg, Denmark, 23–27 August 2010; pp. 711–715. [Google Scholar]
- Ou, J.; Xie, L.; Jin, C.; Li, X.; Zhu, D.; Jiang, R.; Chen, Y.; Zhang, J.; Li, L.; Liu, T. Characterizing and Differentiating Brain State Dynamics via Hidden Markov Models. Brain Topogr. 2015, 28, 666–679. [Google Scholar] [CrossRef]
- Kietzmann, T.C.; McClure, P.; Kriegeskorte, N. Deep Neural Networks in Computational Neuroscience. In Oxford Research Encyclopedia of Neuroscience; Oxford University Press: Oxford, UK, 2019; ISBN 978-0-19-026408-6. [Google Scholar]
- Kriegeskorte, N.; Golan, T. Neural Network Models and Deep Learning. Curr. Biol. 2019, 29, R231–R236. [Google Scholar] [CrossRef] [PubMed]
- Derry, A.; Krzywinski, M.; Altman, N. Convolutional Neural Networks. Nat. Methods 2023, 20, 1269–1270. [Google Scholar] [CrossRef] [PubMed]
- Li, Z.; Liu, F.; Yang, W.; Peng, S.; Zhou, J. A Survey of Convolutional Neural Networks: Analysis, Applications, and Prospects. IEEE Trans. Neural Netw. Learn. Syst. 2022, 33, 6999–7019. [Google Scholar] [CrossRef] [PubMed]
- Gu, J.; Wang, Z.; Kuen, J.; Ma, L.; Shahroudy, A.; Shuai, B.; Liu, T.; Wang, X.; Wang, G.; Cai, J.; et al. Recent Advances in Convolutional Neural Networks. Pattern Recognit. 2018, 77, 354–377. [Google Scholar] [CrossRef]
- Emanuel, R.H.K.; Docherty, P.D.; Lunt, H.; Möller, K. The Effect of Activation Functions on Accuracy, Convergence Speed, and Misclassification Confidence in CNN Text Classification: A Comprehensive Exploration. J. Supercomput. 2024, 80, 292–312. [Google Scholar] [CrossRef]
- Mehmood, F.; Ahmad, S.; Whangbo, T.K. An Efficient Optimization Technique for Training Deep Neural Networks. Mathematics 2023, 11, 1360. [Google Scholar] [CrossRef]
- Alzubaidi, L.; Zhang, J.; Humaidi, A.J.; Al-Dujaili, A.; Duan, Y.; Al-Shamma, O.; Santamaría, J.; Fadhel, M.A.; Al-Amidie, M.; Farhan, L. Review of Deep Learning: Concepts, CNN Architectures, Challenges, Applications, Future Directions. J. Big Data 2021, 8, 53. [Google Scholar] [CrossRef]
- Xu, Y.; Zhang, H. Convergence of Deep Convolutional Neural Networks. Neural Netw. 2022, 153, 553–563. [Google Scholar] [CrossRef]
- Cossu, A.; Carta, A.; Lomonaco, V.; Bacciu, D. Continual Learning for Recurrent Neural Networks: An Empirical Evaluation. Neural Netw. 2021, 143, 607–627. [Google Scholar] [CrossRef]
- Barak, O. Recurrent Neural Networks as Versatile Tools of Neuroscience Research. Curr. Opin. Neurobiol. 2017, 46, 1–6. [Google Scholar] [CrossRef]
- Mughal, N.E.; Khan, M.J.; Khalil, K.; Javed, K.; Sajid, H.; Naseer, N.; Ghafoor, U.; Hong, K.-S. EEG-fNIRS-Based Hybrid Image Construction and Classification Using CNN-LSTM. Front. Neurorobotics 2022, 16, 873239. [Google Scholar] [CrossRef] [PubMed]
- Vakitbilir, N.; Hilal, A.; Direkoğlu, C. Hybrid Deep Learning Models for Multivariate Forecasting of Global Horizontal Irradiation. Neural Comput. Appl. 2022, 34, 8005–8026. [Google Scholar] [CrossRef]
- Van Houdt, G.; Mosquera, C.; Nápoles, G. A Review on the Long Short-Term Memory Model. Artif. Intell. Rev. 2020, 53, 5929–5955. [Google Scholar] [CrossRef]
- Li, M.; Wang, J.; Yang, S.; Xie, J.; Xu, G.; Luo, S. A CNN-LSTM Model for Six Human Ankle Movements Classification on Different Loads. Front. Hum. Neurosci. 2023, 17, 1101938. [Google Scholar] [CrossRef]
- Jaeger, H. Adaptive Nonlinear System Identification with Echo State Networks. In Proceedings of the Advances in Neural Information Processing Systems 15 (NIPS 2002), Vancouver, BC, Canada, 9–14 December 2002; Volume 15. [Google Scholar]
- Lukoševičius, M. A Practical Guide to Applying Echo State Networks. In Neural Networks: Tricks of the Trade: Second Edition; Montavon, G., Orr, G.B., Müller, K.-R., Eds.; Springer: Berlin/Heidelberg, Germany, 2012; pp. 659–686. ISBN 978-3-642-35289-8. [Google Scholar]
- Ozturk, M.C.; Xu, D.; Príncipe, J.C. Analysis and Design of Echo State Networks. Neural Comput. 2007, 19, 111–138. [Google Scholar] [CrossRef]
- Sun, C.; Song, M.; Hong, S.; Li, H. A Review of Designs and Applications of Echo State Networks. arXiv 2020, arXiv:2012.02974. [Google Scholar]
- De Vos, N.J. Echo State Networks as an Alternative to Traditional Artificial Neural Networks in Rainfall–Runoff Modelling. Hydrol. Earth Syst. Sci. 2013, 17, 253–267. [Google Scholar] [CrossRef]
- Sun, C.; Song, M.; Cai, D.; Zhang, B.; Hong, S.; Li, H. A Systematic Review of Echo State Networks From Design to Application. IEEE Trans. Artif. Intell. 2024, 5, 23–37. [Google Scholar] [CrossRef]
- Soltani, R.; Benmohamed, E.; Ltifi, H. Echo State Network Optimization: A Systematic Literature Review. Neural Process Lett. 2023, 55, 10251–10285. [Google Scholar] [CrossRef]
- Dong, X.; Yu, Z.; Cao, W.; Shi, Y.; Ma, Q. A Survey on Ensemble Learning. Front. Comput. Sci. 2020, 14, 241–258. [Google Scholar] [CrossRef]
- Qi, Y. Random Forest for Bioinformatics. In Ensemble Machine Learning: Methods and Applications; Zhang, C., Ma, Y., Eds.; Springer: New York, NY, USA, 2012; pp. 307–323. ISBN 978-1-4419-9326-7. [Google Scholar]
- Bian, J.; Wang, X.; Hao, W.; Zhang, G.; Wang, Y. The Differential Diagnosis Value of Radiomics-Based Machine Learning in Parkinson’s Disease: A Systematic Review and Meta-Analysis. Front. Aging Neurosci. 2023, 15, 1199826. [Google Scholar] [CrossRef] [PubMed]
- Hastie, T.; Tibshirani, R.; Friedman, J. Random Forests. In The Elements of Statistical Learning; Springer Series in Statistics; Springer: New York, NY, USA, 2009; pp. 587–604. ISBN 978-0-387-84857-0. [Google Scholar]
- Biau, G.; Scornet, E. A Random Forest Guided Tour. TEST 2016, 25, 197–227. [Google Scholar] [CrossRef]
- Chen, T.; Guestrin, C. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 13–17 August 2016; Association for Computing Machinery: New York, NY, USA, 2016; pp. 785–794. [Google Scholar]
- Sheridan, R.P.; Wang, W.M.; Liaw, A.; Ma, J.; Gifford, E.M. Extreme Gradient Boosting as a Method for Quantitative Structure–Activity Relationships. J. Chem. Inf. Model. 2016, 56, 2353–2360. [Google Scholar] [CrossRef] [PubMed]
- Chang, Y.-C.; Chang, K.-H.; Wu, G.-J. Application of eXtreme Gradient Boosting Trees in the Construction of Credit Risk Assessment Models for Financial Institutions. Appl. Soft Comput. 2018, 73, 914–920. [Google Scholar] [CrossRef]
- Bentéjac, C.; Csörgő, A.; Martínez-Muñoz, G. A Comparative Analysis of XGBoost. Artif. Intell. Rev. 2021, 54, 1937–1967. [Google Scholar] [CrossRef]
- Wilson, A.G.; Hu, Z.; Salakhutdinov, R.; Xing, E.P. Deep Kernel Learning. In Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, Cadiz, Spain, 9–11 May 2016; pp. 370–378. [Google Scholar]
- Lodhi, H. Computational Biology Perspective: Kernel Methods and Deep Learning. WIREs Comput. Stat. 2012, 4, 455–465. [Google Scholar] [CrossRef]
- Valkenborg, D.; Rousseau, A.-J.; Geubbelmans, M.; Burzykowski, T. Support Vector Machines. Am. J. Orthod. Dentofac. Orthop. 2023, 164, 754–757. [Google Scholar] [CrossRef]
- Pisner, D.A.; Schnyer, D.M. Chapter 6—Support Vector Machine. In Machine Learning; Mechelli, A., Vieira, S., Eds.; Academic Press: Cambridge, MA, USA, 2020; pp. 101–121. ISBN 978-0-12-815739-8. [Google Scholar]
- Kecman, V. Support Vector Machines—An Introduction. In Support Vector Machines: Theory and Applications; Wang, L., Ed.; Studies in Fuzziness and Soft Computing; Springer: Berlin/Heidelberg, Germany, 2005; pp. 1–47. ISBN 978-3-540-32384-6. [Google Scholar]
- Ji, W.; Liu, D.; Meng, Y.; Xue, Y. A Review of Genetic-Based Evolutionary Algorithms in SVM Parameters Optimization. Evol. Intel. 2021, 14, 1389–1414. [Google Scholar] [CrossRef]
- Seitz, S. Gradient-Based Explanations for Gaussian Process Regression and Classification Models. arXiv 2022, arXiv:2205.12797. [Google Scholar]
- Seeger, M. Gaussian Processes for Machine Learning. Int. J. Neur. Syst. 2004, 14, 69–106. [Google Scholar] [CrossRef]
- Rasmussen, C.E. Gaussian Processes in Machine Learning. In Advanced Lectures on Machine Learning: ML Summer Schools 2003, Canberra, Australia, February 2–14, 2003, Tübingen, Germany, August 4–16, 2003, Revised Lectures; Lecture Notes in Computer Science; Bousquet, O., von Luxburg, U., Rätsch, G., Eds.; Springer: Berlin/Heidelberg, Germany, 2004; pp. 63–71. ISBN 978-3-540-28650-9. [Google Scholar]
- Williams, C.; Rasmussen, C. Gaussian Processes for Regression. In Proceedings of the Advances in Neural Information Processing Systems, Denver, CO, USA, 27–30 November 1995; MIT Press: Cambridge, MA, USA, 1995; Volume 8. [Google Scholar]
- MacKay, D.J.C. Introduction to Gaussian Processes. In Neural Networks and Machine Learning; NATO ASI Series F Computer and Systems Sciences; Springer: Berlin, Germany, 1998; Volume 168, pp. 133–166. [Google Scholar]
- Khan, M.S.; Salsabil, N.; Alam, M.G.R.; Dewan, M.A.A.; Uddin, M.Z. CNN-XGBoost Fusion-Based Affective State Recognition Using EEG Spectrogram Image Analysis. Sci. Rep. 2022, 12, 14122. [Google Scholar] [CrossRef] [PubMed]
- Kwak, S.; Akbari, H.; Garcia, J.A.; Mohan, S.; Dicker, Y.; Sako, C.; Matsumoto, Y.; Nasrallah, M.P.; Shalaby, M.; O’Rourke, D.M.; et al. Predicting peritumoral glioblastoma infiltration and subsequent recurrence using deep-learning–based analysis of multi-parametric magnetic resonance imaging. J. Med. Imaging 2024, 11, 054001. [Google Scholar] [CrossRef] [PubMed]
- Zhang, Y.; Wang, S.; Sui, Y.; Yang, M.; Liu, B.; Cheng, H.; Sun, J.; Jia, W.; Phillips, P.; Gorriz, J.M. Multivariate Approach for Alzheimer’s Disease Detection Using Stationary Wavelet Entropy and Predator-Prey Particle Swarm Optimization. J. Alzheimer’s Dis. 2018, 65, 855–869. [Google Scholar] [CrossRef] [PubMed]
- Petras, K.; ten Oever, S.; Jacobs, C.; Goffaux, V. Coarse-to-Fine Information Integration in Human Vision. NeuroImage 2019, 186, 103–112. [Google Scholar] [CrossRef] [PubMed]
- Shen, Y.; Giannakis, G.B.; Baingana, B. Nonlinear Structural Vector Autoregressive Models with Application to Directed Brain Networks. IEEE Trans. Signal Process 2019, 67, 5325–5339. [Google Scholar] [CrossRef]
- Baroni, F.; Morillon, B.; Trébuchon, A.; Liégeois-Chauvel, C.; Olasagasti, I.; Giraud, A.-L. Converging Intracortical Signatures of Two Separated Processing Timescales in Human Early Auditory Cortex. NeuroImage 2020, 218, 116882. [Google Scholar] [CrossRef]
- Leech, R.; Leech, D. Testing for Spatial Heterogeneity in Functional MRI Using the Multivariate General Linear Model. IEEE Trans. Med. Imaging 2011, 30, 1293–1302. [Google Scholar] [CrossRef]
- McKinney, B.A.; White, B.C.; Grill, D.E.; Li, P.W.; Kennedy, R.B.; Poland, G.A.; Oberg, A.L. ReliefSeq: A Gene-Wise Adaptive-K Nearest-Neighbor Feature Selection Tool for Finding Gene-Gene Interactions and Main Effects in mRNA-Seq Gene Expression Data. PLoS ONE 2013, 8, e81527. [Google Scholar] [CrossRef]
- Rasekhi, J.; Mollaei, M.R.K.; Bandarabadi, M.; Teixeira, C.A.; Dourado, A. Preprocessing Effects of 22 Linear Univariate Features on the Performance of Seizure Prediction Methods. J. Neurosci. Methods 2013, 217, 9–16. [Google Scholar] [CrossRef]
- Uruñuela, E.; Gonzalez-Castillo, J.; Zheng, C.; Bandettini, P.; Caballero-Gaudes, C. Whole-Brain Multivariate Hemodynamic Deconvolution for Functional MRI with Stability Selection. Med. Image Anal. 2024, 91, 103010. [Google Scholar] [CrossRef]
- Srinivasan, S.; Johnson, S.D. Optimizing Feature Subset for Schizophrenia Detection Using Multichannel EEG Signals and Rough Set Theory. Cogn. Neurodyn. 2024, 18, 431–446. [Google Scholar] [CrossRef] [PubMed]
- Li, W.; Chen, G.; Chen, M.; Shen, K.; Wu, C.; Shen, W.; Zhang, F. PCA-WRKNN-Assisted Label-Free SERS Serum Analysis Platform Enabling Non-Invasive Diagnosis of Alzheimer’s Disease. Spectrochim. Acta Part A Mol. Biomol. Spectrosc. 2023, 302, 123088. [Google Scholar] [CrossRef] [PubMed]
- Hajianfar, G.; Haddadi Avval, A.; Hosseini, S.A.; Nazari, M.; Oveisi, M.; Shiri, I.; Zaidi, H. Time-to-Event Overall Survival Prediction in Glioblastoma Multiforme Patients Using Magnetic Resonance Imaging Radiomics. Radiol. Med. 2023, 128, 1521–1534. [Google Scholar] [CrossRef] [PubMed]
- Kasim, Ö. Identification of Attention Deficit Hyperactivity Disorder with Deep Learning Model. Phys. Eng. Sci. Med. 2023, 46, 1081–1090. [Google Scholar] [CrossRef]
- Keihani, A.; Sajadi, S.S.; Hasani, M.; Ferrarelli, F. Bayesian Optimization of Machine Learning Classification of Resting-State EEG Microstates in Schizophrenia: A Proof-of-Concept Preliminary Study Based on Secondary Analysis. Brain Sci. 2022, 12, 1497. [Google Scholar] [CrossRef]
- Chung, H.; Seo, H.; Choi, S.H.; Park, C.-K.; Kim, T.M.; Park, S.-H.; Won, J.K.; Lee, J.H.; Lee, S.-T.; Lee, J.Y.; et al. Cluster Analysis of DSC MRI, Dynamic Contrast-Enhanced MRI, and DWI Parameters Associated with Prognosis in Patients with Glioblastoma after Removal of the Contrast-Enhancing Component: A Preliminary Study. Am. J. Neuroradiol. 2022, 43, 1559–1566. [Google Scholar] [CrossRef]
- Treder, M.S.; Codrai, R.; Tsvetanov, K.A. Quality Assessment of Anatomical MRI Images from Generative Adversarial Networks: Human Assessment and Image Quality Metrics. J. Neurosci. Methods 2022, 374, 109579. [Google Scholar] [CrossRef]
- Liu, M.; Amey, R.C.; Backer, R.A.; Simon, J.P.; Forbes, C.E. Behavioral Studies Using Large-Scale Brain Networks—Methods and Validations. Front. Hum. Neurosci. 2022, 16, 875201. [Google Scholar] [CrossRef]
- Barreto, C.; Bruneri, G.d.A.; Brockington, G.; Ayaz, H.; Sato, J.R. A New Statistical Approach for fNIRS Hyperscanning to Predict Brain Activity of Preschoolers’ Using Teacher’s. Front. Hum. Neurosci. 2021, 15, 622146. [Google Scholar] [CrossRef]
- Sarton, B.; Jaquet, P.; Belkacemi, D.; de Montmollin, E.; Bonneville, F.; Sazio, C.; Frérou, A.; Conrad, M.; Daubin, D.; Chabanne, R.; et al. Assessment of Magnetic Resonance Imaging Changes and Functional Outcomes Among Adults With Severe Herpes Simplex Encephalitis. JAMA Netw. Open 2021, 4, e2114328. [Google Scholar] [CrossRef]
- Schmuker, M.; Pfeil, T.; Nawrot, M.P. A Neuromorphic Network for Generic Multivariate Data Classification. Proc. Natl. Acad. Sci. USA 2014, 111, 2081–2086. [Google Scholar] [CrossRef] [PubMed]
- von Lühmann, A.; Li, X.; Müller, K.-R.; Boas, D.A.; Yücel, M.A. Improved Physiological Noise Regression in fNIRS: A Multimodal Extension of the General Linear Model Using Temporally Embedded Canonical Correlation Analysis. NeuroImage 2020, 208, 116472. [Google Scholar] [CrossRef] [PubMed]
- Vizioli, L.; De Martino, F.; Petro, L.S.; Kersten, D.; Ugurbil, K.; Yacoub, E.; Muckli, L. Multivoxel Pattern of Blood Oxygen Level Dependent Activity Can Be Sensitive to Stimulus Specific Fine Scale Responses. Sci. Rep. 2020, 10, 7565. [Google Scholar] [CrossRef] [PubMed]
- Tzovara, A.; Chavarriaga, R.; De Lucia, M. Quantifying the Time for Accurate EEG Decoding of Single Value-Based Decisions. J. Neurosci. Methods 2015, 250, 114–125. [Google Scholar] [CrossRef]
- Zhang, Y.; Kimberg, D.Y.; Coslett, H.B.; Schwartz, M.F.; Wang, Z. Multivariate Lesion-symptom Mapping Using Support Vector Regression. Hum. Brain Mapp. 2014, 35, 5861–5876. [Google Scholar] [CrossRef]
- Dartora, C.; Marseglia, A.; Mårtensson, G.; Rukh, G.; Dang, J.; Muehlboeck, J.-S.; Wahlund, L.-O.; Moreno, R.; Barroso, J.; Ferreira, D.; et al. A Deep Learning Model for Brain Age Prediction Using Minimally Preprocessed T1w Images as Input. Front. Aging Neurosci. 2024, 15, 1303036. [Google Scholar] [CrossRef]
- Brown, C.A.; Almarzouki, A.F.; Brown, R.J.; Jones, A.K.P. Neural Representations of Aversive Value Encoding in Pain Catastrophizers. NeuroImage 2019, 184, 508–519. [Google Scholar] [CrossRef]
- Khawaldeh, S.; Tinkhauser, G.; Torrecillos, F.; He, S.; Foltynie, T.; Limousin, P.; Zrinzo, L.; Oswal, A.; Quinn, A.J.; Vidaurre, D.; et al. Balance between Competing Spectral States in Subthalamic Nucleus Is Linked to Motor Impairment in Parkinson’s Disease. Brain 2022, 145, 237–250. [Google Scholar] [CrossRef]
- Hussain, S.J.; Quentin, R. Decoding Personalized Motor Cortical Excitability States from Human Electroencephalography. Sci. Rep. 2022, 12, 6323. [Google Scholar] [CrossRef]
- Kwak, S.; Akbari, H.; Garcia, J.A.; Mohan, S.; Davatzikos, C. Fully Automatic mpMRI Analysis Using Deep Learning Predicts Peritumoral Glioblastoma Infiltration and Subsequent Recurrence. Proc. SPIE Int. Soc. Opt. Eng. 2024, 12926, 423–429. [Google Scholar] [CrossRef]
- Vidaurre, C.; Gurunandan, K.; Idaji, M.J.; Nolte, G.; Gómez, M.; Villringer, A.; Müller, K.-R.; Nikulin, V.V. Novel Multivariate Methods to Track Frequency Shifts of Neural Oscillations in EEG/MEG Recordings. Neuroimage 2023, 276, 120178. [Google Scholar] [CrossRef] [PubMed]
- Xue, T.; Bai, L.; Chen, S.; Zhong, C.; Feng, Y.; Wang, H.; Liu, Z.; You, Y.; Cui, F.; Ren, Y.; et al. Neural Specificity of Acupuncture Stimulation from Support Vector Machine Classification Analysis. Magn. Reson. Imaging 2011, 29, 943–950. [Google Scholar] [CrossRef] [PubMed]
- Aayesha; Qureshi, M.B.; Afzaal, M.; Qureshi, M.S.; Fayaz, M. Machine Learning-Based EEG Signals Classification Model for Epileptic Seizure Detection. Multimed. Tools Appl. 2021, 80, 17849–17877. [Google Scholar] [CrossRef]
- Wang, G.; Sun, Z.; Tao, R.; Li, K.; Bao, G.; Yan, X. Epileptic Seizure Detection Based on Partial Directed Coherence Analysis. IEEE J. Biomed. Health Inform. 2016, 20, 873–879. [Google Scholar] [CrossRef]
- Williamson, J.R.; Bliss, D.W.; Browne, D.W.; Narayanan, J.T. Seizure Prediction Using EEG Spatiotemporal Correlation Structure. Epilepsy Behav. 2012, 25, 230–238. [Google Scholar] [CrossRef]
- Bomela, W.; Wang, S.; Chou, C.-A.; Li, J.-S. Real-Time Inference and Detection of Disruptive EEG Networks for Epileptic Seizures. Sci. Rep. 2020, 10, 8653. [Google Scholar] [CrossRef]
- Zhang, Z.; Chen, G.; Yang, S. Ensemble Support Vector Recurrent Neural Network for Brain Signal Detection. IEEE Trans. Neural Netw. Learn. Syst. 2022, 33, 6856–6866. [Google Scholar] [CrossRef]
- Csaky, R.; van Es, M.W.J.; Jones, O.P.; Woolrich, M. Interpretable Many-Class Decoding for MEG. NeuroImage 2023, 282, 120396. [Google Scholar] [CrossRef]
- Pancholi, S.; Giri, A.; Jain, A.; Kumar, L.; Roy, S. Source Aware Deep Learning Framework for Hand Kinematic Reconstruction Using EEG Signal. IEEE Trans. Cybern. 2023, 53, 4094–4106. [Google Scholar] [CrossRef]
- Ieracitano, C.; Mammone, N.; Hussain, A.; Morabito, F.C. A Novel Multi-Modal Machine Learning Based Approach for Automatic Classification of EEG Recordings in Dementia. Neural Netw. 2020, 123, 176–190. [Google Scholar] [CrossRef]
- Schrouff, J.; Mourão-Miranda, J.; Phillips, C.; Parvizi, J. Decoding Intracranial EEG Data with Multiple Kernel Learning Method. J. Neurosci. Methods 2016, 261, 19–28. [Google Scholar] [CrossRef] [PubMed]
- EskandariNasab, M.; Raeisi, Z.; Lashaki, R.A.; Najafi, H. A GRU-CNN Model for Auditory Attention Detection Using Microstate and Recurrence Quantification Analysis. Sci. Rep. 2024, 14, 8861. [Google Scholar] [CrossRef] [PubMed]
- Gier, E.C.; Pulliam, A.N.; Gaul, D.A.; Moore, S.G.; LaPlaca, M.C.; Fernández, F.M. Lipidome Alterations Following Mild Traumatic Brain Injury in the Rat. Metabolites 2022, 12, 150. [Google Scholar] [CrossRef] [PubMed]
- Koren, V. Uncovering Structured Responses of Neural Populations Recorded from Macaque Monkeys with Linear Support Vector Machines. STAR Protoc. 2021, 2, 100746. [Google Scholar] [CrossRef] [PubMed]
- Fröhlich, H.; Claes, K.; De Wolf, C.; Van Damme, X.; Michel, A. A Machine Learning Approach to Automated Gait Analysis for the Noldus Catwalk System. IEEE Trans. Biomed. Eng. 2018, 65, 1133–1139. [Google Scholar] [CrossRef]
- Ehrens, D.; Assaf, F.; Cowan, N.J.; Sarma, S.V.; Schiller, Y. Ultra Broad Band Neural Activity Portends Seizure Onset in a Rat Model of Epilepsy. Annu. Int. Conf. IEEE Eng. Med. Biol. Soc. 2018, 2018, 2276–2279. [Google Scholar] [CrossRef]
- Baldassi, C.; Alemi-Neissi, A.; Pagan, M.; Dicarlo, J.J.; Zecchina, R.; Zoccolan, D. Shape Similarity, Better than Semantic Membership, Accounts for the Structure of Visual Object Representations in a Population of Monkey Inferotemporal Neurons. PLoS Comput. Biol. 2013, 9, e1003167. [Google Scholar] [CrossRef]
- Appleby, R.B.; Basran, P.S. Artificial Intelligence in Veterinary Medicine. J. Am. Vet. Med. Assoc. 2022, 260, 819–824. [Google Scholar] [CrossRef]
- Arzi, B.; Webb, T.L.; Koch, T.G.; Volk, S.W.; Betts, D.H.; Watts, A.; Goodrich, L.; Kallos, M.S.; Kol, A. Cell Therapy in Veterinary Medicine as a Proof-of-Concept for Human Therapies: Perspectives From the North American Veterinary Regenerative Medicine Association. Front. Vet. Sci. 2021, 8, 779109. [Google Scholar] [CrossRef]
- Fraiwan, L.; Alkhodari, M. Neonatal Sleep Stage Identification Using Long Short-Term Memory Learning System. Med. Biol. Eng. Comput. 2020, 58, 1383–1391. [Google Scholar] [CrossRef]
- Khadidos, A.O.; Alyoubi, K.H.; Mahato, S.; Khadidos, A.O.; Nandan Mohanty, S. Machine Learning and Electroencephalogram Signal Based Diagnosis of Depression. Neurosci. Lett. 2023, 809, 137313. [Google Scholar] [CrossRef] [PubMed]
- Xing, M.; Hu, S.; Wei, B.; Lv, Z. Spatial-Frequency-Temporal Convolutional Recurrent Network for Olfactory-Enhanced EEG Emotion Recognition. J. Neurosci. Methods 2022, 376, 109624. [Google Scholar] [CrossRef] [PubMed]
- Zong, J.; Xiong, X.; Zhou, J.; Ji, Y.; Zhou, D.; Zhang, Q. FCAN–XGBoost: A Novel Hybrid Model for EEG Emotion Recognition. Sensors 2023, 23, 5680. [Google Scholar] [CrossRef] [PubMed]
- Yang, L.; Wang, Z.; Wang, G.; Liang, L.; Liu, M.; Wang, J. Brain-Inspired Modular Echo State Network for EEG-Based Emotion Recognition. Front. Neurosci. 2024, 18, 1305284. [Google Scholar] [CrossRef]
- Kim, H.-H.; Jeong, J. Decoding Electroencephalographic Signals for Direction in Brain-Computer Interface Using Echo State Network and Gaussian Readouts. Comput. Biol. Med. 2019, 110, 254–264. [Google Scholar] [CrossRef]
- Itälinna, V.; Kaltiainen, H.; Forss, N.; Liljeström, M.; Parkkonen, L. Using Normative Modeling and Machine Learning for Detecting Mild Traumatic Brain Injury from Magnetoencephalography Data. PLoS Comput. Biol. 2023, 19, e1011613. [Google Scholar] [CrossRef]
- Jiang, W.; Ding, S.; Xu, C.; Ke, H.; Bo, H.; Zhao, T.; Ma, L.; Li, H. Discovering the Neuronal Dynamics in Major Depressive Disorder Using Hidden Markov Model. Front. Hum. Neurosci. 2023, 17, 1197613. [Google Scholar] [CrossRef]
- Nadalizadeh, F.; Rajabioun, M.; Feyzi, A. Driving Fatigue Detection Based on Brain Source Activity and ARMA Model. Med. Biol. Eng. Comput. 2024, 62, 1017–1030. [Google Scholar] [CrossRef]
- Paliwal, V.; Das, K.; Doesburg, S.M.; Medvedev, G.; Xi, P.; Ribary, U.; Pachori, R.B.; Vakorin, V.A. Classifying Routine Clinical Electroencephalograms With Multivariate Iterative Filtering and Convolutional Neural Networks. IEEE Trans. Neural Syst. Rehabil. Eng. 2024, 32, 2038–2048. [Google Scholar] [CrossRef]
- Uyulan, C.; de la Salle, S.; Erguzel, T.T.; Lynn, E.; Blier, P.; Knott, V.; Adamson, M.M.; Zelka, M.; Tarhan, N. Depression Diagnosis Modeling With Advanced Computational Methods: Frequency-Domain eMVAR and Deep Learning. Clin. EEG Neurosci. 2022, 53, 24–36. [Google Scholar] [CrossRef]
- Zafar, R.; Dass, S.C.; Malik, A.S. Electroencephalogram-Based Decoding Cognitive States Using Convolutional Neural Network and Likelihood Ratio Based Score Fusion. PLoS ONE 2017, 12, e0178410. [Google Scholar] [CrossRef] [PubMed]
- Asgari, S.; Adams, H.; Kasprowicz, M.; Czosnyka, M.; Smielewski, P.; Ercole, A. Feasibility of Hidden Markov Models for the Description of Time-Varying Physiologic State After Severe Traumatic Brain Injury. Crit. Care Med. 2019, 47, e880. [Google Scholar] [CrossRef] [PubMed]
- Farhadi, A.; Chern, J.J.; Hirsh, D.; Davis, T.; Jo, M.; Maier, F.; Rasheed, K. Intracranial Pressure Forecasting in Children Using Dynamic Averaging of Time Series Data. Forecasting 2019, 1, 47–58. [Google Scholar] [CrossRef]
- Güiza, F.; Depreitere, B.; Piper, I.; Van den Berghe, G.; Meyfroidt, G. Novel Methods to Predict Increased Intracranial Pressure During Intensive Care and Long-Term Neurologic Outcome After Traumatic Brain Injury: Development and Validation in a Multicenter Dataset. Crit. Care Med. 2013, 41, 554. [Google Scholar] [CrossRef]
- Myers, R.B.; Lazaridis, C.; Jermaine, C.M.; Robertson, C.S.; Rusin, C.G. Predicting Intracranial Pressure and Brain Tissue Oxygen Crises in Patients With Severe Traumatic Brain Injury. Crit. Care Med. 2016, 44, 1754. [Google Scholar] [CrossRef]
- Lee, S.; Hussein, R.; Ward, R.; Jane Wang, Z.; McKeown, M.J. A Convolutional-Recurrent Neural Network Approach to Resting-State EEG Classification in Parkinson’s Disease. J. Neurosci. Methods 2021, 361, 109282. [Google Scholar] [CrossRef]
Model | Description | Advantages | Disadvantages |
---|---|---|---|
HMM | Probabilistic graphical model for sequential data, such as physiologic recordings, that links directly measured quantities (observable) to underlying states that cannot be observed directly (hidden), such as disease states. | Flexibility for modeling diverse sequential data types. Ability to capture temporal dependencies and transitions. Interpretability, providing insights into hidden state dynamics. Feature extraction capabilities for capturing relevant patterns. Well-suited for analyzing sequential data in various domains. | Assumption of the Markov property may not hold for complex systems. Fixed state space can be challenging when the number of states is unknown. Limited modeling of long-term dependencies in data. Difficulty with high-dimensional data and computational complexity. Sensitivity to initialization and parameter tuning. Inference complexity increases with large state spaces or long sequences. Limited representational power compared to deep learning models. Difficulty in handling continuous data without preprocessing. Vulnerability to overfitting, particularly with large state spaces relative to data size. |
CNN | Neural network architecture effective at capturing spatial hierarchies of features within data. | Hierarchical feature learning captures progressively complex features. Translation invariance enables robustness to spatial variations. Sparse connectivity reduces parameters and computational load. Parameter sharing facilitates generalization and handling of variable inputs. Effective for high-dimensional data like images and videos. Parallelizable operations enable fast training and inference. Transfer learning accelerates training with pre-trained models. Interpretability through visualization aids in model understanding. Improved fundamental feature extraction. | Limited interpretability of learned features. Requirement for large amounts of labeled data. Sensitivity to variations in hyperparameters. Lack of spatial context understanding in some cases. Difficulty in handling irregular data structures. Complexity of model architecture design. Vulnerability to adversarial attacks, meaning subtle alterations to input data can lead the network to confidently misclassify. Heavy computational requirements for training and inference. |
RNN | Neural network architecture designed for processing sequential data by allowing connections between units to form directed cycles, enabling information persistence over time. | Temporal dynamics for time-series prediction and sequence tasks. Ability to process variable-length inputs. Shared parameters facilitate learning across time steps. Natural representation for sequential data tasks. Stateful memory captures context across time steps. Trained via backpropagation through time. | Vanishing and exploding gradients hinder training. Short-term memory limits capture of long-range dependencies. Sequential computation slows training and inference. Sensitivity to hyperparameters affects performance. Training instability with large datasets or complex architectures. Inadequacy in capturing complex contextual information. |
LSTM | Type of RNN architecture designed to address the vanishing gradient problem and capture long-term dependencies by introducing specialized memory cells with gating mechanisms. | Capable of capturing long-term dependencies in sequences. Addresses the vanishing gradient problem for stable training. Utilizes gating mechanisms for better control over information flow. Maintains stateful memory for retaining relevant context. Versatile and effective for various sequential data tasks including time-series prediction. | Increased complexity and computational requirements. Proneness to overfitting, especially with limited data. Sensitivity to hyperparameters, requiring careful tuning. Difficulty in interpreting internal mechanisms. Limited memory for processing very long sequences. Potential gradient explosion during training. |
ESN | Type of RNN model that utilizes a fixed, randomly connected reservoir and only trains the output weights, making them efficient for processing temporal data. | Training only the output layer is fast and computationally inexpensive. The fixed reservoir simplifies the network design and reduces the parameters to optimize. The reservoir transforms inputs into a high-dimensional space with diverse dynamic behavior. Fewer trainable parameters lower the risk of overfitting. Fixed reservoir weights provide stable and predictable dynamics. Applicable to various tasks involving temporal data, such as time series prediction and signal processing. Not affected by vanishing or exploding gradient issues. | Sensitivity to reservoir size, connectivity, and spectral radius. Untrained reservoir may not suit specific input data characteristics. Large reservoirs consume significant memory resources. Fixed reservoir cannot adapt to new data patterns during training. Challenging to find the optimal reservoir configuration for specific tasks. |
Random forest | An ensemble learning method that constructs multiple decision trees during training and outputs the mode of the classes (classification) or the mean prediction (regression) of the individual trees. | High accuracy through the aggregation of multiple decision trees. Robustness to overfitting compared to individual trees. Effective handling of missing data. Provision of feature importance measures for interpretation. Ability to capture non-linear relationships. Robustness to outliers in the data. Efficiency in handling large datasets and high dimensionality. | Less interpretability compared to simpler models. Computational complexity can lead to longer training times. Memory-intensive due to storing multiple trees. Bias towards majority classes in imbalanced datasets. Time-consuming hyperparameter tuning. Slower prediction speed compared to simpler models for real-time applications. Reduced effectiveness with highly correlated features. |
XGBoost | A gradient-boosted decision tree ensemble designed for efficiency, speed, and accuracy in supervised learning tasks. | High performance with efficiency and scalability. Built-in regularization techniques for preventing overfitting. Flexibility in supporting various objective functions and metrics. Feature importance scores for informed feature selection. Automatic handling of missing values in the data. Parallelization for faster training on large datasets. Advanced tree pruning for improved model complexity control. Useful in feature extraction. | Complexity in hyperparameter tuning. High memory usage with large datasets or deep trees. Reduced interpretability as a black-box model. Potential overfitting, especially with complex datasets. Sensitivity to outliers in the data. Scalability limitations with extremely large datasets. Challenges in handling imbalanced datasets. |
SVM | A supervised machine learning algorithm used for classification and regression tasks, aiming to find the optimal hyperplane that best separates different classes or predicts continuous values. | Effective in high-dimensional spaces. Robust to overfitting due to margin maximization. Versatility with linear and non-linear kernel functions. Memory efficiency using a subset of support vectors. Effective with small datasets, focusing on support vectors. Aim to find global optimum for more stable solutions. Controlled complexity through parameter tuning. | Sensitivity to parameter tuning. Computationally intensive for large datasets. Memory-intensive storage of entire training dataset. Difficulty with scalability to very large datasets. Limited interpretability as black-box models. Performance degradation with noisy data. Inherent limitation to binary classification tasks. |
GP | A probabilistic model that defines a distribution over functions, where any finite set of points has a joint Gaussian distribution. | Flexibility to model complex relationships without assuming specific functional forms. Uncertainty quantification for reliable predictions under uncertainty. Capable of interpolation and extrapolation for sparse or irregularly sampled data. Automatic adjustment of complexity based on available data. Nonparametric nature allows for growing complexity with increasing data. Probabilistic framework enables principled uncertainty estimation and Bayesian inference. Easy incorporation of prior knowledge through choice of covariance functions. | Computational complexity for large datasets. Memory requirements for storing entire datasets. Limited scalability to high-dimensional data. Challenging choice of covariance function. Sensitivity to hyperparameters. Difficulty with non-Gaussian likelihoods. Limited interpretability due to complex nature. |
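To ground the comparisons in the table above, the sketches that follow illustrate each model family in minimal, runnable Python. The package choices (hmmlearn, PyTorch, scikit-learn, XGBoost), variable names, and synthetic signals are illustrative assumptions for exposition only; none reproduces a pipeline from the reviewed studies. First, a Gaussian-emission HMM fitted to a synthetic two-channel recording, with Viterbi decoding of the hidden state sequence:

```python
import numpy as np
from hmmlearn import hmm

# Synthetic stand-in for a two-channel physiologic recording;
# the two regimes below play the role of hidden "states".
rng = np.random.default_rng(0)
X = np.concatenate([
    rng.normal(loc=[10.0, 80.0], scale=1.0, size=(500, 2)),  # regime A
    rng.normal(loc=[25.0, 70.0], scale=2.0, size=(500, 2)),  # regime B
])

# EM fits the transition matrix, state means, and covariances.
model = hmm.GaussianHMM(n_components=2, covariance_type="full",
                        n_iter=100, random_state=0)
model.fit(X)

states = model.predict(X)     # Viterbi: most likely hidden state per sample
print(model.transmat_)        # learned state-transition probabilities
print(np.bincount(states))    # samples assigned to each hidden state
```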
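For the CNN row, a minimal 1-D convolutional classifier for multichannel signal segments, sketched in PyTorch; the channel count, segment length, and two-class label space are placeholders:

```python
import torch
import torch.nn as nn

# Minimal 1-D CNN: stacked convolutions learn progressively more
# complex temporal features, as noted in the table above.
class SignalCNN(nn.Module):
    def __init__(self, n_channels: int, n_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=7, padding=3),  # temporal filters
            nn.ReLU(),
            nn.MaxPool1d(4),                                      # downsample in time
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                              # length-independent pooling
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):                  # x: (batch, channels, time)
        z = self.features(x).squeeze(-1)   # -> (batch, 32)
        return self.classifier(z)

x = torch.randn(8, 4, 1024)    # 8 segments, 4 channels, 1024 samples each
logits = SignalCNN(n_channels=4, n_classes=2)(x)
print(logits.shape)            # torch.Size([8, 2])
```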
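For the RNN and LSTM rows, a one-step-ahead LSTM forecaster sketch (PyTorch); the gating mechanisms live inside nn.LSTM, and gradients are computed by backpropagation through time:

```python
import torch
import torch.nn as nn

# Minimal LSTM forecaster: given a window of a multivariate signal,
# predict the next sample of one target channel.
class LSTMForecaster(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):               # x: (batch, time, features)
        out, _ = self.lstm(x)           # gated memory carries context over time
        return self.head(out[:, -1])    # predict from the final time step

model = LSTMForecaster(n_features=3)
window = torch.randn(16, 128, 3)        # 16 windows, 128 steps, 3 channels
loss = nn.functional.mse_loss(model(window), torch.randn(16, 1))
loss.backward()                         # backpropagation through time
```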
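The ESN row needs no deep learning framework: the reservoir is fixed and random, and only the linear readout is trained. A NumPy sketch, assuming a spectral radius of 0.9 and a ridge-regression readout:

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_res = 1, 200

# Fixed random reservoir, rescaled so its spectral radius is 0.9
# (keeping it below 1 is the usual recipe for the echo state property).
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

u = np.sin(np.linspace(0, 60, 1500))[:, None]  # toy input signal
target = np.roll(u, -1, axis=0)                # one-step-ahead target

# Run the reservoir; W and W_in are never trained.
states = np.zeros((len(u), n_res))
x = np.zeros(n_res)
for t in range(len(u)):
    x = np.tanh(W_in @ u[t] + W @ x)
    states[t] = x

# Train only the linear readout, via ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ target)
pred = states @ W_out
print(float(np.mean((pred[200:-1] - target[200:-1]) ** 2)))  # skip washout
```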
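For the random forest row, a scikit-learn sketch on synthetic features (stand-ins for, e.g., per-channel spectral summaries), including the impurity-based feature importances mentioned in the table:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic feature matrix standing in for signal-derived features.
X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Bagged ensemble of decision trees; predictions are majority votes.
forest = RandomForestClassifier(n_estimators=200, random_state=0)
forest.fit(X_tr, y_tr)

print(forest.score(X_te, y_te))         # held-out accuracy
print(forest.feature_importances_[:5])  # impurity-based importances
```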
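For the XGBoost row, a sketch using the xgboost package's scikit-learn interface; reg_lambda is the built-in L2 regularization on leaf weights noted under its advantages:

```python
import numpy as np
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))
y = (X[:, 0] + X[:, 1] ** 2 > 1).astype(int)  # nonlinear toy rule

# Boosted trees: each new tree fits the gradient of the loss on the
# current ensemble; reg_lambda applies L2 regularization to leaf weights.
clf = XGBClassifier(n_estimators=300, max_depth=3,
                    learning_rate=0.1, reg_lambda=1.0)
clf.fit(X, y)
print(clf.score(X, y))  # training accuracy on the toy data
```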
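For the SVM row, an RBF-kernel classifier sketch in scikit-learn; standardizing features first is usual practice for kernel methods, and C and gamma are the tuning-sensitive parameters flagged in the table:

```python
from sklearn.datasets import make_classification
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# RBF kernel separates classes in an implicit high-dimensional space;
# C trades margin width against training errors, gamma sets kernel width.
clf = make_pipeline(StandardScaler(),
                    SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X, y)
print(clf.score(X, y))
```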
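Finally, for the GP row, a scikit-learn regression sketch on sparse, irregularly sampled data; the predictive standard deviation returned alongside the mean is the uncertainty quantification highlighted above:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 10, 30))[:, None]   # sparse, irregular samples
y = np.sin(X).ravel() + rng.normal(0, 0.1, 30)

# RBF kernel encodes smoothness; WhiteKernel models observation noise.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(),
                              normalize_y=True)
gp.fit(X, y)

X_new = np.linspace(0, 10, 5)[:, None]
mean, std = gp.predict(X_new, return_std=True)  # predictive mean and uncertainty
print(np.round(mean, 2), np.round(std, 2))
```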
Study | Study Group | Relevant ML Model | Cerebral Physiology | Significance of the Model in the Study |
---|---|---|---|---|
Asgari et al., 2019 [131] | Adult patients with TBI | HMM | ICP, CPP, PRx, RAP Other: ABP | An HMM was used to identify dynamic cerebral physiologic states across multiple monitored signals. |
Farhadi et al., 2019 [132] | Pediatric ICU patients | SVM, random forest | ICP, CPP Other: MAP, HR, BP | SVM and random forest models were compared with other models for prediction of ICP episodes, with the random forest model achieving the highest prediction accuracy. |
Fraiwan and Alkhodari, 2020 [119] | Neonates | LSTM | Sleep-state EEG recordings | An LSTM network was applied to automatic neonatal sleep-state scoring and, compared against approaches reported in the literature, achieved the highest accuracy. |
Güiza et al., 2013 [133] | Patients with TBI | GP | ICP, CPP Other: MAP | A GP model was compared with logistic regression for predicting episodes of increased ICP and for early prediction of unfavorable neurological outcome, exhibiting the best overall performance. |
Itälinna et al., 2023 [125] | Patients with mild TBI and healthy controls | SVM | MEG | An SVM classifier was trained on quantitative deviation maps to distinguish TBI patients from healthy control subjects. |
Jiang et al., 2023 [126] | Healthy volunteers and individuals with major depressive disorder | HMM with multivariate autoregressive (MAR) observations | Resting-state and task-state EEG recordings | The HMM-MAR model demonstrated the ability to capture neuronal dynamics from EEG signals and to interpret brain disease pathogenesis by analyzing state transitions. |
Khadidos et al., 2023 [120] | Healthy volunteers and patients with depression | Decision tree, random forest, CNN, RNN, LSTM, XGBoost | Stimuli-induced EEG recordings | Models were compared for the detection and classification of depression; the CNN showed the best performance among all employed models. |
Khawaldeh et al., 2022 [97] | Patients with Parkinson’s disease (PD) | HMM | LFP | An HMM was used to detect distinct LFP states and investigate how competing spectral states in the subthalamic nucleus relate to motor impairment in PD patients. |
Kim and Jeong, 2019 [124] | Healthy volunteers | ESN with Gaussian readouts | Stimuli-induced EEG recordings | An ESN with Gaussian readouts was shown to effectively decode user movement intentions using a low-cost, portable EEG system. |
Lee et al., 2021 [135] | Healthy volunteers and PD patients | CNN-RNN | Resting-state EEG recordings | A CNN was employed for feature extraction, while an RNN was used for the detection and classification of PD, outperforming both baseline machine learning models and deep learning models reported in the literature. |
Mughal et al., 2022 [39] | Healthy volunteers | CNN-LSTM | fNIRS and task-state EEG recordings | The CNN-LSTM hybrid model was applied to images generated by recurrence plots for stand-alone EEG and fNIRS data, as well as hybrid EEG-fNIRS data, for the classification of changes in brain state. The performance of the model using hybrid EEG-fNIRS data was observed to be superior to that of the other two image sets and to the results reported in the literature. |
Myers et al., 2016 [134] | Patients with severe TBI | GP | ICP, PbtO2 Other: MAP, EtCO2, SaO2 | A GP model was compared with logistic regression and an autoregressive model for univariate and multivariate prediction of ICP and PbtO2. |
Nadalizadeh et al., 2024 [127] | Drivers in fatigued and normal states | k-NN, SVM, random forest | Resting-state and task-state EEG recordings | k-NN, SVM, and random forest classifiers were applied to features extracted from EEG signals for fatigue detection and recognition. |
Paliwal et al., 2024 [128] | Patients of various ages who had undergone routine clinical EEG scans | CNN | EEG | A CNN was used to predict a patient’s brain age from EEG scans; model performance was shown to improve with multivariate iterative filtering. |
Pancholi et al., 2023 [108] | Healthy volunteers | MLP, CNN-LSTM, WPD CNN-LSTM | Task-state EEG recordings | MLP, CNN-LSTM, and WPD CNN-LSTM models were employed and compared for the prediction of hand kinematic trajectories. |
Uyulan et al., 2022 [129] | Patients with major depressive disorder and healthy volunteers | Pretrained CNN-LSTM | Resting-state EEG recordings | A hybrid model was employed alongside a stand-alone LSTM to detect depression-specific information from EEG signals for depression classification. The hybrid model demonstrated better performance, a lower training time, and no overfitting issues. |
Williamson et al., 2012 [104] | Patients with medically intractable focal epilepsy | SVM | Intracranial EEG | An SVM model was trained on 15 s EEG segments to classify patients’ preictal versus interictal states. |
Yang et al., 2024 [123] | Healthy volunteers | M-ESN, ESN | Stimuli-induced EEG recordings | A modular ESN (M-ESN), in which the reservoir state is directly initialized, outperformed the standard ESN while using a smaller reservoir and a simpler training process. |
Xing et al., 2022 [121] | Healthy volunteers | CNN-LSTM | Stimuli-induced EEG recordings | A CNN-LSTM model was utilized for emotion detection by combining spatial–frequency–temporal features extracted from EEG signals. The model showed better performance compared to baseline methods. |
Zafar et al., 2017 [130] | Healthy volunteers | CNN | Stimuli-induced EEG recordings | A CNN was utilized for feature extraction from EEG signals, which were then used in a separate classification task. |
Zong et al., 2023 [122] | Healthy volunteers | XGBoost | Stimuli-induced EEG recordings | XGBoost was employed for an emotion recognition task using features extracted with a feature attention network module. The proposed model was shown to outperform baseline models. |