Article

Comparison of ECG Between Gameplay and Seated Rest: Machine Learning-Based Classification

Emi Yuda, Hiroyuki Edamatsu, Yutaka Yoshida and Takahiro Ueno
1 Innovation Center for Semiconductor and Digital Future, Mie University, Tsu 514-8507, Japan
2 Department of Management Science and Technology, School of Engineering, Tohoku University, Sendai 980-8576, Japan
3 Sanei Medisys Co., Ltd., Kyoto 607-8116, Japan
4 Faculty of Engineering, Fukuyama University, Fukuyama 729-0292, Japan
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(10), 5783; https://doi.org/10.3390/app15105783
Submission received: 26 March 2025 / Revised: 22 April 2025 / Accepted: 21 May 2025 / Published: 21 May 2025
(This article belongs to the Special Issue Application of Artificial Intelligence in Bioinformatics)

Abstract

The influence of gameplay on autonomic nervous system activity was investigated by comparing electrocardiogram (ECG) data during seated rest and gameplay. A total of 13 participants (6 in the gameplay group and 7 in the control group) were analyzed. RR interval time series (2 Hz) and heart-rate variability (HRV) indices, including mean RR, SDRR, VLF, LF, HF, LF/HF, and HF peak frequency, were extracted from ECG signals over 5 min and 10 min segments. HRV indices were calculated using fast Fourier transform (FFT). The classification was performed using Logistic Regression (LGR), Random Forest (RF), XGBoost (XGB, v2.9.2), One-Class SVM (OCS), Isolation Forest (ILF), and Local Outlier Factor (LOF). A balanced dataset of 5 min and 10 min segments was evaluated using k-fold cross-validation (k = 3, 4, 5). Performance metrics, including recall, F-score, and PR-AUC, were computed for each classifier. Grid search was applied to optimize parameters for LGR, RF, and XGB, while default settings were used for the other classifiers. Among all models, OCS with k = 3 achieved the highest classification accuracy for both 5 min and 10 min data. These findings suggest that machine learning-based classification can effectively distinguish ECG patterns between gameplay and rest.

1. Introduction

With the rapid advancement of digital entertainment, video game addiction among young individuals has become a growing public health concern [1,2,3,4,5,6,7,8,9,10,11,12,13]. Excessive gaming has been associated with various physical and psychological issues, including disrupted sleep patterns, reduced physical activity, and cognitive impairments. In response, several studies have attempted to establish objective metrics for quantifying gaming addiction. Previous research has proposed various physiological and psychological assessment tools, including surveys, neuroimaging, and autonomic nervous system (ANS) analysis using heart-rate variability (HRV) [14,15]. While these studies provide valuable insights into the characteristics of gaming addiction, they primarily focus on evaluating the severity of addiction rather than differentiating addicted individuals from non-addicted ones [15]. Reliable classification methods that can distinguish gaming-addicted individuals from healthy controls on the basis of physiological signals are therefore still under development.
Few previous studies have attempted to distinguish gaming periods from rest periods on the basis of physiological data. In this study, we therefore examined whether ECG signals and machine learning analysis can classify whether or not someone is gaming, as a step toward distinguishing individuals with gaming addiction from healthy controls.
We measured participants' ECG signals while they played games, extracted RR interval time series and HRV indices from them, and applied multiple machine learning classifiers. The performance of the machine learning models was evaluated to identify the most effective classification approach. This study introduces a new perspective of classifying whether or not someone is gaming from an individual's physiological response, rather than simply recording the amount of time spent gaming. Through this approach, we contribute to the development of an objective method for detecting gaming time, which may support the management of gaming and play time, in particular the setting of time limits for children; limiting the amount of time children spend playing games can help prevent addiction and ensure that they have time to sleep and study.
Evaluating physiological responses during gameplay, however, poses several important challenges. First, there is substantial inter-individual variability in baseline autonomic nervous system activity, making it difficult to generalize results across participants. Second, short-term fluctuations in physiological signals such as heart rate and HRV, caused by factors like momentary excitement, fatigue, or emotional state, can confound classification accuracy. Third, environmental factors such as room temperature, noise, lighting, and posture during gameplay can influence physiological measurements and introduce noise into the data. These complexities must be carefully considered in the development of reliable and generalizable classification models.

2. Materials and Methods

2.1. Participants

This study recruited 13 healthy participants (mean age: 31.9 ± 12 years, 1 female) with no known underlying medical conditions. Participants were divided into two groups: the gaming group (n = 6) and the seated-rest control group (n = 7). The gaming group consisted of individuals who regularly played video games, whereas the control group included participants who did not engage in gaming during the experiment. All participants provided written informed consent before the experiment, following ethical guidelines approved by the Fukuyama University Ethics Committee (No. 2024-H-52, approved 10/11/2024). The study strictly adhered to the ethical principles outlined in the Declaration of Helsinki for human research. Participants were informed of the study's objectives, procedures, potential risks, and their right to withdraw at any time without consequences.

2.2. Experimental Protocol

The ECG recordings were conducted in a seated posture to ensure consistency in physiological measurements. The total measurement duration varied between 40 min and 120 min per participant, depending on individual engagement and tolerance levels. The experiment was conducted under two distinct conditions:
Gameplay Condition: Participants in the gaming group played an interactive video game. The choice of the game was unrestricted, allowing players to engage with a game of their preference to maintain natural gaming behavior.
Seated-Rest Condition: The control group remained seated in a relaxed state without engaging in any external stimuli, such as reading, watching videos, or using smartphones.
Both conditions were conducted in a quiet environment with controlled room temperature (24 ± 2 °C) to minimize external influences on heart-rate variability (HRV). The seated posture was chosen to avoid additional physiological variations caused by movement or changes in body position. To ensure the reliability of physiological measurements and the validity of group comparisons, strict exclusion criteria were applied. Participants with any history of cardiovascular disease, autonomic dysfunction, or neurological disorders were excluded, as these conditions could significantly influence heart-rate variability (HRV) and confound the results. Additionally, individuals currently taking medications that affect autonomic function, such as beta blockers or antidepressants, were not included in the study. Given that this research focuses on physiological differences between individuals with and without gaming addiction, only healthy participants without pre-existing medical conditions were recruited. Furthermore, individuals with irregular sleep patterns, excessive caffeine or alcohol consumption prior to the experiment, or high levels of daily physical activity were excluded to minimize confounding factors. These strict criteria ensured that the observed physiological differences were attributable to gaming addiction rather than other underlying health conditions.

2.3. ECG Data Collection and HRV Analysis

ECG signals were continuously recorded using the Checkme Pro device (San-ei Medisys, Japan). The Checkme Pro was selected for its compact design, high-quality signal acquisition, and ease of use in controlled experiments. The ECG signal was measured using a NASA-guided lead placement, which records the potential difference between the upper and lower sternum. This placement was chosen because it effectively minimizes electromyographic (EMG) noise while providing excellent P-wave visibility, enhancing the accuracy of RR interval detection. The ECG signals were sampled at a frequency of 250 Hz, ensuring high temporal resolution for HRV analysis. The Checkme Pro device was connected to a PC via a USB cable, and the recorded ECG data were transferred to a computer using Checkme Viewer software (San-ei Medisys, Japan, https://www.checkme.jp/pcviewer/, accessed on 1 February 2025). The exported ECG data were saved in CSV format for further processing and analysis.
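As a rough illustration of this preprocessing step, the sketch below loads an exported ECG recording and extracts RR intervals with a simple R-peak detector. It is not the authors' pipeline: the file name, the "ecg" column name, and the find_peaks thresholds are hypothetical assumptions, and a production system would use a dedicated QRS detector.

```python
# Illustrative sketch only: load exported ECG samples (CSV layout assumed) and
# extract RR intervals with a simple R-peak detector.
import numpy as np
import pandas as pd
from scipy.signal import find_peaks

FS = 250.0  # sampling frequency reported in the text (Hz)

ecg = pd.read_csv("ecg_export.csv")["ecg"].to_numpy()   # hypothetical file/column name
peaks, _ = find_peaks(
    ecg,
    distance=int(0.4 * FS),              # enforce a ~400 ms refractory period between beats
    height=np.percentile(ecg, 95),       # crude amplitude threshold for R-waves
)
rr_ms = np.diff(peaks) / FS * 1000.0     # RR intervals in milliseconds
```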
From the raw ECG signals, RR interval time series were extracted and resampled at 2 Hz to standardize the data for HRV analysis. The study focused on key heart-rate variability (HRV) indices, which were computed using fast Fourier transform (FFT) to extract frequency-domain features. Resampling RR interval time series at 2 Hz is a common practice in heart-rate variability (HRV) analysis to convert the inherently irregular RR intervals—derived from R-peaks in the ECG which represent individual heartbeats—into a uniformly spaced time series suitable for spectral analysis. Since RR intervals are not naturally equidistant in time, resampling is necessary to apply frequency-domain methods such as the fast Fourier transform (FFT). A sampling rate of 2 Hz (i.e., data points every 0.5 s) provides sufficient temporal resolution to capture both low-frequency (LF: 0.04–0.15 Hz) and high-frequency (HF: 0.15–0.4 Hz) components of HRV while minimizing computational load. This makes it a practical and balanced choice.
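A minimal sketch of the 2 Hz resampling step described above follows; it assumes the RR intervals (in ms) produced by R-peak detection and uses cubic interpolation, which is an assumption since the paper does not specify the interpolation method.

```python
# Minimal sketch: convert irregularly spaced RR intervals into a uniform 2 Hz series
# suitable for FFT-based spectral analysis (interpolation method is an assumption).
import numpy as np
from scipy.interpolate import interp1d

def resample_rr(rr_ms, fs=2.0):
    """rr_ms: RR intervals in milliseconds, in temporal order."""
    rr_s = np.asarray(rr_ms, dtype=float) / 1000.0
    beat_times = np.cumsum(rr_s)                       # occurrence time of each beat (s)
    t_uniform = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
    interp = interp1d(beat_times, rr_ms, kind="cubic")
    return t_uniform, interp(t_uniform)                # uniformly sampled RR series (ms)
```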
Power spectrum estimation methods include the Lomb–Scargle periodogram and autoregressive (AR) modeling (Yule–Walker equations with Akaike's Information Criterion). The Lomb–Scargle periodogram can be applied directly to irregularly sampled data and is well suited to HRV analysis obtained from wearable devices. The AR model offers high frequency resolution and can be applied to small datasets, making it suitable for long-term HRV analysis and autonomic balance evaluation. In this study, we selected the fast Fourier transform, which is straightforward to apply to short-term data once the series has been resampled at a constant interval.
  • Mean RR interval (ms): The average duration between consecutive R-peaks, representing overall heart-rate trends.
  • SDRR (ms): The standard deviation of RR intervals, reflecting overall HRV magnitude.
  • VLF (very low-frequency power (ln, ms2), 0.003–0.04 Hz): Associated with long-term autonomic regulation and possibly thermoregulatory mechanisms.
  • LF (low-frequency power (ln, ms2), 0.04–0.15 Hz): Represents a combination of sympathetic and parasympathetic nervous system activity.
  • HF (high-frequency power (ln, ms2), 0.15–0.40 Hz): Primarily reflects parasympathetic (vagal) activity and respiratory influences.
  • LF/HF ratio: An indicator of sympathovagal balance, with higher values suggesting increased sympathetic dominance.
  • HF peak frequency (Hz): The dominant frequency within the HF band, associated with respiratory modulation of heart rate.
HRV indices were calculated for both 5 min and 10 min segments of ECG data to analyze short-term autonomic fluctuations.
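The following sketch computes the indices listed above from a resampled 2 Hz RR series. The paper states that the frequency-domain features were obtained by FFT; the use of Welch's FFT-based periodogram, the segment length, and the windowing here are illustrative assumptions rather than the authors' exact settings.

```python
# Illustrative sketch: time- and frequency-domain HRV indices from a 2 Hz RR series.
import numpy as np
from scipy.signal import welch

BANDS = {"VLF": (0.003, 0.04), "LF": (0.04, 0.15), "HF": (0.15, 0.40)}  # Hz, as in the text

def hrv_indices(rr_ms, fs=2.0):
    rr = np.asarray(rr_ms, dtype=float)
    out = {"MeanRR_ms": rr.mean(), "SDRR_ms": rr.std(ddof=1)}
    freqs, psd = welch(rr - rr.mean(), fs=fs, nperseg=min(256, len(rr)))  # FFT-based PSD
    for name, (lo, hi) in BANDS.items():
        band = (freqs >= lo) & (freqs < hi)
        power = np.trapz(psd[band], freqs[band])             # band power in ms^2
        out[name] = np.log(power) if power > 0 else np.nan   # ln(ms^2), as in Tables 2 and 3
    out["LF/HF"] = np.exp(out["LF"] - out["HF"])              # ratio of raw band powers
    hf = (freqs >= 0.15) & (freqs < 0.40)
    out["HF_peak_Hz"] = freqs[hf][np.argmax(psd[hf])]         # dominant HF frequency
    return out
```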
In this study, we adopted the power spectral density (PSD) method to analyze the periodic structure of the heart-rate variability time series, estimating the spectrum with a nonparametric (FFT-based) approach.
Power is defined as the square of the signal x(t). The time average of this power as the observation window T is extended to infinity gives the power of x(t) per unit time; if the unit of t is seconds, it is the average power per second. The power spectral density S(ω) describes the proportion of the total power contributed by each frequency component:
\lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t)^{2} \, dt = \int_{-\infty}^{\infty} S(\omega) \, d\omega
Here ω is the angular frequency, ω = 2πf, where f (= 1/T) is the frequency of the component with period T. To obtain S(ω), the expression is rewritten using the Fourier transform and its inverse:
X(\omega) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} x(t) \, e^{-i\omega t} \, dt
x(t) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} X(\omega) \, e^{i\omega t} \, d\omega
\lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t)^{2} \, dt = \lim_{T \to \infty} \int_{-\infty}^{\infty} \frac{|X(\omega)|^{2}}{T} \, d\omega
In this way, the HRV indices in this study were calculated by direct spectral analysis of the time-series data.
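As a numerical illustration of the relation above (time-averaged power equals the integral of the power spectral density), the short check below compares the two sides on a synthetic 2 Hz series; the signal itself is arbitrary and only serves to demonstrate the identity.

```python
# Numerical check of the power/PSD relation on a synthetic 5 min, 2 Hz series.
import numpy as np
from scipy.signal import periodogram

fs = 2.0
t = np.arange(0, 300, 1.0 / fs)                       # 5 min segment
x = 50 * np.sin(2 * np.pi * 0.25 * t) + 10 * np.random.randn(t.size)  # toy HF-band signal

time_avg_power = np.mean(x ** 2)
freqs, psd = periodogram(x, fs=fs)                    # one-sided PSD estimate
spectral_power = np.sum(psd) * (freqs[1] - freqs[0])  # discrete integral of the PSD

print(time_avg_power, spectral_power)                 # the two values agree closely
```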
To ensure ECG signal quality, we evaluated the signal-to-noise ratio (SNR) using a baseline noise estimation method. SNR was calculated by comparing the power of the QRS complexes to the power of baseline fluctuations. Artifacts, including motion and electrode noise, were mitigated using bandpass filtering (0.5–40 Hz) and manual inspection. Segments with significant noise or signal dropout were excluded from further analysis to ensure reliable HRV feature extraction.
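A sketch of this artifact-reduction step is given below. The 0.5–40 Hz band is taken from the text; the Butterworth filter order and the specific SNR definition (in-band signal power versus removed baseline power) are assumptions, since the paper does not detail them.

```python
# Sketch under stated assumptions: zero-phase 0.5-40 Hz bandpass filtering and a
# simple SNR estimate comparing in-band signal power to the removed baseline power.
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_ecg(ecg, fs=250.0, low=0.5, high=40.0, order=4):
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, ecg)          # zero-phase filtering avoids QRS distortion

def snr_db(ecg, fs=250.0):
    clean = bandpass_ecg(ecg, fs)
    residue = ecg - clean               # baseline drift and out-of-band noise
    return 10 * np.log10(np.mean(clean ** 2) / np.mean(residue ** 2))
```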

2.4. Machine Learning Classification

To classify participants based on their physiological responses, six machine learning models were applied to the HRV feature set:
  • Logistic Regression (LGR): A linear classification model used for binary classification, providing probability estimates.
  • Random Forest (RF): An ensemble learning method that constructs multiple decision trees and averages predictions.
  • XGBoost (XGB): A gradient boosting algorithm optimized for structured data and classification tasks.
  • One-Class SVM (OCS): A support vector machine-based method for detecting outliers or separating a single class from others.
  • Isolation Forest (ILF): An unsupervised learning algorithm designed for anomaly detection based on tree structures.
  • Local Outlier Factor (LOF): A density-based anomaly detection algorithm that compares local densities of data points.
To ensure robust model performance, hyperparameter tuning was conducted for LGR, RF, and XGB using a grid search method, optimizing for classification accuracy. For OCS, ILF, and LOF, default parameters were used due to their inherent sensitivity to data structure.
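A configuration sketch for the six classifiers follows. The scikit-learn and xgboost estimators match the model names in the text, but the grid-search parameter ranges are illustrative assumptions; the paper does not report the grids that were actually searched.

```python
# Sketch of the model setup: grid-searched LGR/RF/XGB and default-parameter OCS/ILF/LOF.
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, IsolationForest
from sklearn.svm import OneClassSVM
from sklearn.neighbors import LocalOutlierFactor
from sklearn.model_selection import GridSearchCV
from xgboost import XGBClassifier

param_grids = {
    "LGR": (LogisticRegression(max_iter=1000), {"C": [0.1, 1, 10]}),
    "RF":  (RandomForestClassifier(), {"n_estimators": [100, 300], "max_depth": [3, None]}),
    "XGB": (XGBClassifier(eval_metric="logloss"), {"n_estimators": [100, 300], "max_depth": [3, 5]}),
}
tuned = {name: GridSearchCV(estimator, grid, scoring="accuracy", cv=3)
         for name, (estimator, grid) in param_grids.items()}

# One-class / anomaly-detection models with default parameters.
ocs = OneClassSVM()
ilf = IsolationForest()
lof = LocalOutlierFactor(novelty=True)   # novelty=True enables predict() on unseen data
```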

2.5. Dataset Preparation and Model Evaluation

To balance the dataset, equal numbers of 5 min and 10 min data segments were extracted from both the gaming and control groups (Table 1).
The dataset was then divided using k-fold cross-validation (k = 3, 4, 5) to assess classification performance across multiple splits. This method ensured robust evaluation while minimizing overfitting. The following performance metrics were used for evaluation:
  • Precision: Measures the proportion of correctly identified gaming participants out of all samples predicted as gaming. A higher precision indicates fewer false positives.
  • Recall: Measures the sensitivity of the model in correctly identifying gaming participants, reflecting the ability to detect actual gaming cases.
  • F-score: The harmonic mean of precision and recall, balancing false positives and false negatives. It provides a single measure of a model’s effectiveness.
  • PR-AUC (precision–recall area under the curve): Evaluates model performance, particularly for imbalanced datasets, by analyzing the trade-off between precision and recall across different thresholds.
To mitigate potential bias, classification models were trained and tested using independent data splits for each fold. This approach ensured that no overlapping data were used in both training and testing phases within the same fold, maintaining the integrity of the model evaluation process.
The 5 min and 10 min segments were obtained using non-overlapping sliding windows from the continuous ECG data. To address intra-subject correlation, all segments from the same participant were assigned exclusively to either the training set or the test set in each cross-validation fold. This subject-wise split prevents data leakage, avoids overfitting, and improves the reliability of the evaluation metrics.
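A minimal sketch of this subject-wise k-fold evaluation is given below. It assumes X is the HRV feature matrix (one row per segment), y the gameplay/rest labels, and groups the participant ID of each segment; these names are placeholders, and the one-class models would use decision_function in place of predict_proba for the PR-AUC.

```python
# Sketch: subject-wise k-fold cross-validation with precision, recall, F-score and PR-AUC.
import numpy as np
from sklearn.model_selection import GroupKFold
from sklearn.metrics import (precision_score, recall_score, f1_score,
                             average_precision_score)
from sklearn.ensemble import RandomForestClassifier

def evaluate(model, X, y, groups, k=3):
    scores = {"precision": [], "recall": [], "f1": [], "pr_auc": []}
    for train_idx, test_idx in GroupKFold(n_splits=k).split(X, y, groups):
        model.fit(X[train_idx], y[train_idx])
        pred = model.predict(X[test_idx])
        proba = model.predict_proba(X[test_idx])[:, 1]   # score for the gameplay class
        scores["precision"].append(precision_score(y[test_idx], pred))
        scores["recall"].append(recall_score(y[test_idx], pred))
        scores["f1"].append(f1_score(y[test_idx], pred))
        scores["pr_auc"].append(average_precision_score(y[test_idx], proba))
    return {metric: (np.mean(v), np.std(v)) for metric, v in scores.items()}

# Example: evaluate(RandomForestClassifier(), X, y, groups, k=3)
```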

3. Results

To evaluate the performance of the different classification models in distinguishing between gaming and resting states, we analyzed the results for both 5 min and 10 min data segments. For the 5 min data segments, the results were recall: 0.785 ± 0.121; precision: 0.788 ± 0.122; F-score: 0.783 ± 0.11; PR-AUC: 0.868 ± 0.095. For the 10 min data segments, the results were recall: 0.858 ± 0.105; precision: 0.858 ± 0.105; F-score: 0.858 ± 0.105; PR-AUC: 0.906 ± 0.066. Among all classifiers tested, One-Class SVM (OCS) with k = 3 consistently achieved the highest performance in both conditions (Figure 1 and Figure 2). These values indicate that the model was able to effectively identify gaming participants while maintaining a balance between precision and recall (Table 2 and Table 3).

4. Discussion

The PR-AUC score further suggests strong discrimination ability, even in potentially imbalanced data scenarios. The improvement in performance for 10 min segments suggests that longer ECG recordings provide a more stable representation of heart-rate variability (HRV) features, leading to enhanced classification accuracy. The PR-AUC score of 0.906 confirms the robustness of the OCS model in distinguishing between gaming and resting states. Figures 1 and 2 show the mean ± S.D. of each metric. In particular, for OCS, ILF, and LOF, some metrics have a mean ± S.D. that exceeds 1. This is likely because, while many of the values are close to 1, a few much smaller values deviate substantially from 1, thereby increasing the overall variability.
In recent years, machine learning (ML) approaches applied to heart-rate variability (HRV) analysis have been widely utilized for various health-related applications [16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32], such as detecting cardiovascular diseases, assessing fatigue, and monitoring stress levels [33,34,35,36,37]. However, research focusing on the HRV-based classification of gaming behavior remains relatively scarce. Given the increasing prevalence of excessive gaming and its associated health risks, the present study holds significance in demonstrating the feasibility of using HRV-based ML models to distinguish gaming activity from resting states. Our findings suggest that specific autonomic nervous system responses may be leveraged as biomarkers for identifying gaming behavior, providing a novel contribution to this field.
To classify physiological states during gameplay and resting conditions, researchers often rely on machine learning methods applied to features extracted from electrocardiogram (ECG) signals, particularly heart-rate variability (HRV) metrics. HRV analysis provides insights into autonomic nervous system activity, which is sensitive to emotional and cognitive stressors, such as those experienced during gameplay. The extracted features typically include time-domain metrics (e.g., mean RR interval, SDNN, RMSSD), frequency-domain metrics (e.g., VLF, LF, and HF components via fast Fourier transform), and non-linear indices (e.g., sample entropy and Poincaré plot-based features). Various machine learning models have been employed for this type of classification task. Traditional classifiers such as support vector machines (SVMs), Random Forests (RFs), and Logistic Regression (LGR) have shown effectiveness, especially in studies with limited data. More advanced methods like XGBoost offer robust performance through gradient boosting. Deep learning models, including convolutional neural networks (CNNs) and Long Short-Term Memory (LSTM) networks, have also been explored due to their ability to capture temporal patterns in ECG data, though they typically require larger datasets. Several studies support these approaches. For example, Ashtiyani et al. [38] performed heart-rate variability classification using support vector machines and genetic algorithms, Callejas-Cuervo et al. [39] introduced emotion recognition techniques using physiological signals and video games, and Karthikeyan et al. [40] performed a mental stress assessment based on electrocardiogram signals using wavelet transform. These studies demonstrate the feasibility and effectiveness of machine learning-based gameplay and rest classification, highlighting the importance of validating new methods against existing methods.
One of the strengths of this study is that the experimental setup was limited to a seated posture, ensuring that participants exhibited minimal physical movement. This likely contributed to the relatively high classification accuracy, as postural consistency reduces motion artifacts and physiological noise that can otherwise complicate HRV analysis. Moreover, previous studies have extensively explored the identification of seated postures using body acceleration data, indicating that when posture is restricted to sitting, machine learning-based classification of gaming versus non-gaming states is feasible [33,34,35,36,37]. However, this study’s approach may not generalize to gaming activities that involve significant body movement, such as virtual reality (VR) gaming or physically interactive games like those requiring motion controllers. In such cases, HRV-based classification may become less reliable due to additional physiological variations induced by movement, highlighting a key limitation of this research.
Despite these limitations, our study remains relevant given the growing concern over gaming and smartphone addiction, which are increasingly recognized as modern-day public health issues. Excessive gaming has been linked to poor mental health, disrupted sleep patterns, and diminished cognitive function, making it essential to develop objective methods for identifying and monitoring gaming behavior. Although this study does not directly address addiction classification, it establishes a foundation for using physiological data to differentiate gaming behavior, which could be expanded upon in future research. Integrating HRV analysis with additional physiological and behavioral indicators—such as eye tracking, galvanic skin response, or EEG data—could enhance classification accuracy and provide deeper insights into the autonomic changes associated with excessive gaming. In conclusion, this study demonstrates the potential of HRV-based machine learning classification in distinguishing gaming states, contributing to the relatively unexplored field of gaming behavior analysis. While the study is limited to seated posture and controlled conditions, it provides a stepping stone for future research into real-world applications, where continuous physiological monitoring could be utilized for gaming addiction detection and intervention strategies.
Finally, we discuss the future directions of this study. We demonstrated that machine learning applied to ECG signals can distinguish gameplay from seated rest to some extent. However, several aspects need to be investigated further to improve the accuracy and applicability of this approach. First, the dataset should be expanded and the diversity of participants increased: a larger and more varied participant pool would improve the generalizability of the classification model, and factors such as age, gender, gaming history, and psychological state should be considered to improve classification performance. Second, feature engineering and more advanced machine learning models should be explored. In this study, we focused primarily on HRV-based classification; future research should investigate the analysis of ECG waveforms using pattern-matching techniques and additional physiological markers such as heart-rate acceleration patterns and non-linear HRV metrics. In addition, deep learning approaches such as convolutional neural networks (CNNs) and recurrent neural networks (RNNs) could be used to extract complex temporal features from ECG signals, and the integration of real-time monitoring and wearable devices will also be necessary. To enhance practical applications, integrating the ECG-based classification model into wearable devices would allow for the real-time monitoring of physiological responses during gaming, which would facilitate early intervention strategies and personalized feedback systems for individuals at risk of gaming addiction. Finally, comparison with other physiological and behavioral indices is necessary. Future studies should investigate the integration of not only ECG-based classification but also other physiological signals, such as electrodermal activity (EDA) and body surface temperature, and behavioral indices, such as eye tracking and body acceleration. Multimodal data fusion is expected to improve classification accuracy and provide deeper insights into gaming addiction. Longitudinal studies are also needed to evaluate the long-term effects of gaming addiction on autonomic function. The development of intervention strategies, such as biofeedback-based training and cognitive behavioral therapy (CBT), could be an important application of this research. By addressing these directions, future studies can further refine the ECG-based classification model, increase its real-world applicability, and contribute to the development of effective intervention strategies based on game play-time estimation.

5. Conclusions

The primary goal of this study was to explore the feasibility of using widely used machine learning models to classify the differences between gameplay and resting ECG signals. This study demonstrates the feasibility of using HRV analysis combined with machine learning to classify gaming and resting states. One-Class SVM (OCS) with k = 3 provided the best performance for both 5 min and 10 min data segments. The seated posture condition allowed for easier classification due to reduced movement, though the method may face challenges with gaming activities that involve significant body motion. Despite these limitations, the study highlights the potential of HRV-based approaches for detecting gaming behavior, contributing to the development of monitoring tools for excessive gaming, which is increasingly recognized as a modern health concern. In future studies, we plan to introduce more advanced deep learning techniques and benchmark the results properly.

Author Contributions

Conceptualization, E.Y. and T.U.; methodology, T.U.; software, Y.Y.; validation, E.Y. and Y.Y.; formal analysis, E.Y. and Y.Y.; investigation, E.Y. and T.U.; resources, T.U. and H.E.; data curation, E.Y., Y.Y. and T.U.; writing—original draft preparation, E.Y.; writing—review and editing, E.Y.; visualization, Y.Y.; supervision, E.Y.; project administration, E.Y. and T.U.; funding acquisition, E.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Ethics Committee of Fukuyama University (No. 2024-H-52, Approved 2024/10/11).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data analyzed in this study will be made available for research purposes, with the consent of the authors, to research institutions only.

Acknowledgments

We would like to express our sincere gratitude to all the participants who volunteered for this study.

Conflicts of Interest

Author Hiroyuki Edamatsu was employed by the company Sanei Medisys Co., Ltd. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

  1. Mori, A.; Iwadate, M.; Minakawa, N.T.; Kawashima, S. Game play decreases prefrontal cortex activity and causes damage in game addiction. Nihon Rinsho 2015, 73, 1567–1573. [Google Scholar]
  2. Limone, P.; Ragni, B.; Toto, G.A. The epidemiology and effects of video game addiction: A systematic review and meta-analysis. Acta Psychol. 2023, 241, 104047. [Google Scholar] [CrossRef] [PubMed]
  3. Menéndez-García, A.; Jiménez-Arroyo, A.; Rodrigo-Yanguas, M.; Marin-Vila, M.; Sánchez-Sánchez, F.; Roman-Riechmann, E.; Blasco-Fontecilla, H. Internet, video game and mobile phone addiction in children and adolescents diagnosed with ADHD: A case-control study. Adicciones 2022, 34, 208–217. [Google Scholar] [CrossRef] [PubMed]
  4. Meng, S.Q.; Cheng, J.L.; Li, Y.Y.; Yang, X.Q.; Zheng, J.W.; Chang, X.W.; Shi, Y.; Chen, Y.; Lu, L.; Sun, Y.; et al. Global prevalence of digital addiction in the general population: A systematic review and meta-analysis. Clin. Psychol. Rev. 2022, 92, 102128. [Google Scholar] [CrossRef]
  5. Greenfield, D.N. Clinical considerations in internet and video game addiction treatment. Child Adolesc. Psychiatr. Clin. N. Am. 2022, 31, 99–119. [Google Scholar] [CrossRef]
  6. Mathews, C.L.; Morrell, H.E.R.; Molle, J.E. Video game addiction, ADHD symptomatology, and video game reinforcement. Am. J. Drug Alcohol Abuse 2019, 45, 67–76. [Google Scholar] [CrossRef]
  7. Greenfield, D.N. Treatment considerations in internet and video game addiction: A qualitative discussion. Child Adolesc. Psychiatr. Clin. N. Am. 2018, 27, 327–344. [Google Scholar] [CrossRef]
  8. Gentile, D.A.; Choo, H.; Liau, A.; Sim, T.; Li, D.; Fung, D.; Khoo, A. Pathological video game use among youths: A two-year longitudinal study. Pediatrics 2011, 127, e319–e329. [Google Scholar] [CrossRef]
  9. Zhou, R.; Xiao, X.Y.; Huang, W.J.; Wang, F.; Shen, X.Q.; Jia, F.J.; Hou, C.L. Video game addiction in psychiatric adolescent population: A hospital-based study on the role of individualism from South China. Brain Behav. 2023, 13, e3119. [Google Scholar] [CrossRef]
  10. Mylona, I.; Deres, E.S.; Dere, G.S.; Tsinopoulos, I.; Glynatsis, M. The Impact of Internet and Video Gaming Addiction on Adolescent Vision: A Review of the Literature. Front. Public Health 2020, 8, 63. [Google Scholar] [CrossRef]
  11. King, D.L.; Delfabbro, P.H. The cognitive psychology of Internet gaming disorder. Clin. Psychol. Rev. 2014, 34, 298–308. [Google Scholar] [CrossRef] [PubMed]
  12. Stevens, M.W.; Dorstyn, D.; Delfabbro, P.H.; King, D.L. Global prevalence of gaming disorder: A systematic review and meta-analysis. Aust. N. Z. J. Psychiatry 2021, 55, 553–568. [Google Scholar] [CrossRef] [PubMed]
  13. Esposito, M.R.; Serra, N.; Guillari, A.; Simeone, S.; Sarracino, F.; Continisio, G.I.; Rea, T. An investigation into video game addiction in pre-adolescents and adolescents: A cross-sectional study. Medicina 2020, 56, 221. [Google Scholar] [CrossRef]
  14. Weinstein, A.M. Computer and video game addiction—A comparison between game users and non-game users. Am. J. Drug Alcohol Abuse 2010, 36, 268–276. [Google Scholar] [CrossRef]
  15. Kim, J.Y.; Kim, H.S.; Kim, D.J.; Im, S.K.; Kim, M.S. Identification of Video Game Addiction Using Heart-Rate Variability Parameters. Sensors 2021, 21, 4683. [Google Scholar] [CrossRef]
  16. Odenstedt Hergès, H.; Vithal, R.; El-Merhi, A.; Naredi, S.; Staron, M.; Block, L. Machine learning analysis of heart rate variability to detect delayed cerebral ischemia in subarachnoid hemorrhage. Acta Neurol. Scand. 2022, 145, 151–159. [Google Scholar] [CrossRef] [PubMed]
  17. Ambale-Venkatesh, B.; Yang, X.; Wu, C.O.; Liu, K.; Hundley, W.G.; McClelland, R.; Gomes, A.S.; Folsom, A.R.; Shea, S.; Guallar, E.; et al. Cardiovascular Event Prediction by Machine Learning: The Multi-Ethnic Study of Atherosclerosis. Circ. Res. 2017, 121, 1092–1101. [Google Scholar] [CrossRef]
  18. Guo, C.Y.; Wu, M.Y.; Cheng, H.M. The Comprehensive Machine Learning Analytics for Heart Failure. Int. J. Environ. Res. Public Health 2021, 18, 4943. [Google Scholar] [CrossRef]
  19. Xu, L.; Cao, F.; Wang, L.; Liu, W.; Gao, M.; Zhang, L.; Hong, F.; Lin, M. Machine learning model and nomogram to predict the risk of heart failure hospitalization in peritoneal dialysis patients. Ren. Fail. 2024, 46, 2324071. [Google Scholar] [CrossRef]
  20. Accardo, A.; Silveri, G.; Merlo, M.; Restivo, L.; Ajčević, M.; Sinagra, G. Detection of subjects with ischemic heart disease by using machine learning technique based on heart rate total variability parameters. Physiol. Meas. 2020, 41, 115008. [Google Scholar] [CrossRef]
  21. Agliari, E.; Barra, A.; Barra, O.A.; Fachechi, A.; Franceschi Vento, L.; Moretti, L. Detecting cardiac pathologies via machine learning on heart-rate variability time series and related markers. Sci. Rep. 2020, 10, 8845. [Google Scholar] [CrossRef] [PubMed]
  22. Nemati, S.; Holder, A.; Razmi, F.; Stanley, M.D.; Clifford, G.D.; Buchman, T.G. An Interpretable Machine Learning Model for Accurate Prediction of Sepsis in the ICU. Crit. Care Med. 2018, 46, 547–553. [Google Scholar] [CrossRef] [PubMed]
  23. Chiew, C.J.; Liu, N.; Tagami, T.; Wong, T.H.; Koh, Z.X.; Ong, M.E.H. Heart rate variability based machine learning models for risk prediction of suspected sepsis patients in the emergency department. Medicine 2019, 98, e14197. [Google Scholar] [CrossRef]
  24. Geng, D.; An, Q.; Fu, Z.; Wang, C.; An, H. Identification of major depression patients using machine learning models based on heart rate variability during sleep stages for pre-hospital screening. Comput. Biol. Med. 2023, 162, 107060. [Google Scholar] [CrossRef]
  25. Matuz, A.; van der Linden, D.; Darnai, G.; Csathó, Á. Generalisable machine learning models trained on heart rate variability data to predict mental fatigue. Sci. Rep. 2022, 12, 20023. [Google Scholar] [CrossRef]
  26. Ni, Z.; Sun, F.; Li, Y. Heart Rate Variability-Based Subjective Physical Fatigue Assessment. Sensors 2022, 22, 3199. [Google Scholar] [CrossRef]
  27. Lee, K.F.A.; Gan, W.S.; Christopoulos, G. Biomarker-Informed Machine Learning Model of Cognitive Fatigue from a Heart Rate Response Perspective. Sensors 2021, 21, 3843. [Google Scholar] [CrossRef]
  28. Fan, J.; Mei, J.; Yang, Y.; Lu, J.; Wang, Q.; Yang, X.; Chen, G.; Wang, R.; Han, Y.; Sheng, R.; et al. Sleep-phasic heart rate variability predicts stress severity: Building a machine learning-based stress prediction model. Stress Health 2024, 40, e3386. [Google Scholar] [CrossRef] [PubMed]
  29. Cao, R.; Rahmani, A.M.; Lindsay, K.L. Prenatal stress assessment using heart rate variability and salivary cortisol: A machine learning-based approach. PLoS ONE 2022, 17, e0274298. [Google Scholar] [CrossRef]
  30. Bahameish, M.; Stockman, T.; Requena Carrión, J. Strategies for Reliable Stress Recognition: A Machine Learning Approach Using Heart Rate Variability Features. Sensors 2024, 24, 3210. [Google Scholar] [CrossRef]
  31. Tsai, C.Y.; Majumdar, A.; Wang, Y.; Hsu, W.H.; Kang, J.H.; Lee, K.Y.; Tseng, C.H.; Kuan, Y.C.; Lee, H.C.; Wu, C.J.; et al. Machine learning model for aberrant driving behaviour prediction using heart rate variability: A pilot study involving highway bus drivers. Int. J. Occup. Saf. Ergon. 2023, 29, 1429–1439. [Google Scholar] [CrossRef] [PubMed]
  32. Pop, G.N.; Christodorescu, R.; Velimirovici, D.E.; Sosdean, R.; Corbu, M.; Bodea, O.; Valcovici, M.; Dragan, S. Assessment of the Impact of Alcohol Consumption Patterns on Heart Rate Variability by Machine Learning in Healthy Young Adults. Medicina 2021, 57, 956. [Google Scholar] [CrossRef] [PubMed]
  33. Chen, H.; Tse, M.M.Y.; Chung, J.W.Y.; Yau, S.Y.; Wong, T.K.S. Effects of Posture on Heart Rate Variability in Non-Frail and Prefrail Individuals: A Cross-Sectional Study. BMC Geriatr. 2023, 23, 870. [Google Scholar] [CrossRef]
  34. Hallman, D.M.; Sato, T.; Kristiansen, J.; Gupta, N.; Skotte, J.; Holtermann, A. Prolonged Sitting is Associated with Attenuated Heart Rate Variability during Sleep in Blue-Collar Workers. Int. J. Environ. Res. Public Health 2015, 12, 14811–14827. [Google Scholar] [CrossRef] [PubMed]
  35. Nam, K.C.; Kwon, M.K.; Kim, D.W. Effects of Posture and Acute Sleep Deprivation on Heart Rate Variability. Yonsei Med. J. 2011, 52, 569–573. [Google Scholar] [CrossRef]
  36. Kumar, P.; Das, A.K.; Halder, S. Statistical Heart Rate Variability Analysis for Healthy Person: Influence of Gender and Body Posture. J. Electrocardiol. 2023, 79, 81–88. [Google Scholar] [CrossRef]
  37. Chuangchai, W.; Pothisiri, W. Postural Changes on Heart Rate Variability among Older Population: A Preliminary Study. Curr. Gerontol. Geriatr. Res. 2021, 2021, 6611479. [Google Scholar] [CrossRef]
  38. Ashtiyani, M.; Navaei Lavasani, S.; Asgharzadeh Alvar, A.; Deevband, M.R. Heart rate variability classification using support vector machine and genetic algorithm. J. Biomed. Phys. Eng. 2018, 8, 423–434. [Google Scholar] [CrossRef]
  39. Callejas-Cuervo, M.; Martínez-Tejada, L.A.; Alarcón-Aldana, A.C. Emotion recognition techniques using physiological signals and video games—Systematic review. Ing. Investig. Desarrollo 2017, 26, 109–118. [Google Scholar] [CrossRef]
  40. Karthikeyan, P.; Murugappan, M.; Yaacob, S. ECG signals based mental stress assessment using wavelet transform. In Proceedings of the 2011 IEEE International Conference on Control System, Computing and Engineering, Penang, Malaysia, 25–27 November 2011; IEEE: Piscataway, NJ, USA, 2011; pp. 157–161. [Google Scholar] [CrossRef]
Figure 1. The 5 min analysis. (a) shows recall, (b) shows precision, (c) shows F-score, and (d) shows PR-AUC (precision–recall area under the curve). The blue bars indicate k = 3, the red bars indicate k = 4, and the green bars indicate k = 5. Indices with a mean ± S.D. larger than 1 are presented as a range (max–min). Precision: OCS (k = 4, 0.947–0.361); PR-AUC: OCS (k = 4, 0.995–0.496; k = 5, 0.992–0.557); PR-AUC: ILF (k = 3, 0.977–0.617; k = 4, 1–0.551; k = 5, 0.992–0.612).
Figure 2. The 10 min analysis. (a) shows recall, (b) shows precision, (c) shows F-score, and (d) shows PR-AUC (precision–recall area under the curve). The blue bars indicate k = 3, the red bars indicate k = 4, and the green bars indicate k = 5. In k-fold cross-validation, k determines how many groups (folds) the dataset is divided into: the data are first split into k folds of equal size, and training and testing are then performed k times, each time using one fold as test data and the remaining k-1 folds as training data, so that every fold serves as test data once; the average of the k test results is the final evaluation value of the model. The value of k affects the generalization performance and the computational cost of the model. In this analysis, because the size of the training data is large, a small k (3–5) was selected, which has the advantage of being easy to train; a small k tends to produce larger variation in the evaluation, whereas a large k increases the computational cost but yields a more stable evaluation. Indices with a mean ± S.D. larger than 1 are presented as a range (max–min). Precision: OCS (k = 4, 1–0.370; k = 5, 1–0.422); PR-AUC: OCS (k = 4, 1–0.516; k = 5, 1–0.373); PR-AUC: ILF (k = 4, 1–0.492; k = 5, 1–0.448); PR-AUC: LOF (k = 4, 1–0.375; k = 5, 1–0.373).
Table 1. The 5 min and 10 min datasets.

Dataset    5 min    10 min
Game       67       33
Rest       78       38
Table 2. HRV index of the gaming group.

Participants    MRR [ms]    SDRR [ms]    VLF [ln, ms2]    LF [ln, ms2]    HF [ln, ms2]    LF/HF [Ratio]    HF Freq [Hz]
G1              803         91           8.30             6.89            5.02            6.44             0.228
G2              724         78           7.94             7.19            5.58            5.01             0.233
G3              722         90           8.01             7.42            6.20            3.38             0.243
G4              637         33           5.61             5.71            4.33            3.97             0.246
G5              511         44           6.35             6.23            4.71            4.56             0.228
G6              776         74           7.30             7.14            6.18            2.61             0.234
Mean ± S.D.     696 ± 98    69 ± 22      7.25 ± 0.97      6.76 ± 0.60     5.34 ± 0.71     4.33 ± 1.22      0.235 ± 0.007
Table 3. HRV index of the seated-rest control group.

Participants    MRR [ms]     SDRR [ms]    VLF [ln, ms2]    LF [ln, ms2]    HF [ln, ms2]    LF/HF [Ratio]    HF Freq [Hz]
R1              571          21           5.16             4.90            3.42            4.38             0.296
R2              694          57           6.85             7.10            6.28            2.27             0.215
R3              769          47           6.84             6.79            5.72            2.92             0.219
R4              1055         113          8.44             7.38            6.16            3.40             0.247
R5              575          28           5.66             4.75            3.47            3.59             0.253
R6              706          45           6.71             5.88            5.68            1.23             0.268
R7              775          38           6.19             5.74            5.14            1.82             0.234
Mean ± S.D.     735 ± 151    50 ± 28      6.55 ± 0.937     6.08 ± 0.970    5.12 ± 1.12     2.08 ± 1.02      0.247 ± 0.026

