Feature Analysis of Facial Color Information During Emotional Arousal in Japanese Older Adults Playing eSports
Abstract
1. Introduction
- Although the extension of healthy life expectancy among older adults is a pressing issue and eSports has gained attention as a potential solution, there is a lack of research that collects data from older adults engaging in eSports or investigates their physiological and psychological states during gameplay.
- Contact-based data collection methods that require participants to wear devices may impose a burden on them.
- Although emotional information, which can be acquired noninvasively, has potential for evaluating the benefits of eSports in extending healthy life expectancy, specific methods for leveraging such data have not been thoroughly examined.
- Noncontact indicators commonly used for emotion estimation, such as facial expressions and voice, can be consciously manipulated by individuals and do not necessarily reflect their actual emotional states.
- We conducted data collection and experiments focusing on older adults engaging in eSports, a domain that has been insufficiently explored to date. As eSports rapidly gain global popularity, this paper offers insights into how this emerging cultural phenomenon may impact an aging society, thereby contributing to the advancement of this research field.
- While brain function sensing is commonly employed to investigate the effects of games on individuals, we took a novel approach by utilizing emotional information to analyze player experience (e.g., enjoyment) and psychological states during gameplay. This study has the potential to advance research in the field that leverages emotional data for user experience analysis.
- As a method for estimating players’ emotions, we focused on a new perspective within the wide range of noncontact emotion estimation techniques—namely, changes in facial color information. Since facial color information reflects physiological changes that cannot be voluntarily controlled, the method proposed in this paper may provide a fresh perspective for current emotion estimation technologies and contribute to the development of this field.
2. Data Acquisition
- Above the subject: 1260–1750 lx.
- In front of the subject: 640–960 lx.
| EAI | Type of Emotion | Intensity | Memo |
|---|---|---|---|
| 1:30–1:40 | Surprise | Strong | A car suddenly appeared from behind. |
3. Proposed Method
3.1. Creation of Facial Images
3.2. Positioning of Cheek Regions
3.3. Calculation of RSR
3.4. Calculation of AC-RSR
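Sections 3.1–3.4 outline a pipeline from facial images to cheek-region saturation features. The following is a rough, hypothetical sketch of how an RSR and AC-RSR could be computed from a cheek region; the function names, the HSV-based saturation formula, and the baseline-ratio definition of the RSR are assumptions for illustration, not the authors' exact implementation.

```python
import numpy as np

def mean_saturation(roi_rgb):
    """Mean HSV saturation of an RGB cheek region (channel values in 0..255).

    Uses the standard HSV definition S = (max - min) / max per pixel
    (assumed here; the paper's exact color conversion may differ).
    """
    rgb = roi_rgb.astype(np.float64)
    cmax = rgb.max(axis=-1)
    cmin = rgb.min(axis=-1)
    # Saturation is defined as 0 where the pixel is black (max == 0).
    sat = np.where(cmax > 0, (cmax - cmin) / np.maximum(cmax, 1e-12), 0.0)
    return float(sat.mean())

def relative_saturation_rate(frame_sat, baseline_sat):
    """RSR (assumed definition): cheek saturation of the current frame
    relative to the pre-game baseline saturation."""
    return frame_sat / baseline_sat

def ac_rsr(rsr_series):
    """AC-RSR (assumed definition): frame-to-frame amount of change
    in the RSR time series."""
    return np.diff(np.asarray(rsr_series, dtype=np.float64))
```

In the paper's setup, the baseline saturation would presumably come from the 10 s pre-game recording during which participants remained still and faced the camera.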
4. Analysis Results and Discussion
- The interval during which “emotional arousal” was evaluated by the participants in the postgame questionnaire (evaluation interval).
- The interval wherein no evaluation was obtained but an event clearly occurred in the gameplay video, such as “passing the other car” or “going out of course” (factor interval).
- The average AC-RSR in the increasing and decreasing directions during the 10 s before gameplay was calculated and used as a threshold; changes exceeding this threshold were then observed within the evaluation or factor intervals. The 10 s before gameplay refers to the time between the start of recording and the beginning of the game, during which the participant was instructed to remain still and look at the camera; this serves as baseline data under normal conditions.
- When no change exceeding the threshold was observed, the RSR increased or decreased over time within the analyzed interval.
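The thresholding procedure described above can be sketched as follows. This is a minimal illustration under assumed interfaces (separate means for positive and negative AC-RSR as upper and lower thresholds); the authors' exact averaging and comparison rules may differ.

```python
import numpy as np

def baseline_thresholds(ac_rsr_baseline):
    """Mean AC-RSR in the increasing and decreasing directions over the
    10 s pre-game baseline, used as upper/lower thresholds (assumed)."""
    x = np.asarray(ac_rsr_baseline, dtype=np.float64)
    pos = x[x > 0]
    neg = x[x < 0]
    up = pos.mean() if pos.size else 0.0
    down = neg.mean() if neg.size else 0.0
    return up, down

def fluctuation_observed(ac_rsr_interval, up, down):
    """True if any change within the interval exceeds a baseline threshold
    in either direction."""
    x = np.asarray(ac_rsr_interval, dtype=np.float64)
    return bool((x > up).any() or (x < down).any())
```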
4.1. Discussion Focusing on the Presence or Absence of Fluctuations in the RSR
4.2. Discussion Focusing on the Fluctuation in RSR in the Evaluation Interval
- RSR increases when positive emotions occur.
- RSR decreases when negative emotions occur.
4.3. Discussion Focusing on the Fluctuation in the RSR in the Factor Interval
- The RSR increases when an event may induce positive emotions, such as “passing the other car” or “obtaining the first position.”
- The RSR decreases when an event may induce negative emotions, such as “going off course” or “crashing into a wall.”
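The sign heuristics of Sections 4.2 and 4.3 suggest a simple valence rule: label an interval positive if its RSR changes exceed the increasing baseline threshold only, and negative in the opposite case. The sketch below is hypothetical; the function name, the threshold interface, and the tie-breaking rule are assumptions, not a method the paper specifies.

```python
import numpy as np

def valence_from_rsr(rsr_series, up, down):
    """Label an interval by the dominant direction of RSR change.

    `up` (> 0) and `down` (< 0) are assumed baseline AC-RSR thresholds.
    """
    diffs = np.diff(np.asarray(rsr_series, dtype=np.float64))
    rose = bool((diffs > up).any())
    fell = bool((diffs < down).any())
    if rose and not fell:
        return "positive"   # RSR increased: consistent with positive emotion
    if fell and not rose:
        return "negative"   # RSR decreased: consistent with negative emotion
    return "indeterminate"  # mixed or no supra-threshold change
```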
4.4. Consideration Based on Participants’ Experience
4.5. Comparison with Facial Expression Recognition Technology
5. Conclusions
1. The fluctuation in the RSR can be used as a feature to judge whether an emotion has occurred in a participant.
2. The RSR tended to increase with positive emotions and decrease with negative emotions.
3. The relationship between the RSR fluctuations of the two cheeks has the potential to serve as a feature for estimating the type of emotion elicited in a participant.
4. Focusing on changes in the RSR indicated the possibility of detecting the occurrence of emotions that participants themselves are unaware of.
5. The proposed method, which focuses on physiological changes in facial saturation, demonstrated the potential to estimate the emotions of participants whose emotions are difficult to infer from facial expressions alone.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| Abbreviation | Meaning |
|---|---|
| eSports | Electronic Sports |
| EAI | Emotional Arousal Intervals |
| RSR | Relative Saturation Rate |
| AC-RSR | Amount of Change in Relative Saturation Rate |
| FOI | Frame of Interest |
References
- Statistics Bureau of Japan. Statistics Data/Statistical Topics No.142 Japan’s Older Adults from the Viewpoint of Statistics: On the Occasion of Respect for the Aged Day. Available online: https://www.stat.go.jp/data/topics/pdf/topi142_01.pdf (accessed on 10 June 2025).
- Cabinet Office. The 2024 Edition of the White Paper on Aging Society, Chapter 1, Section 2, 2 Health and Welfare. Available online: https://www8.cao.go.jp/kourei/whitepaper/w-2024/zenbun/pdf/1s2s_02.pdf (accessed on 10 June 2025).
- Saito, T.; Murata, C.; Saito, M.; Takeda, T.; Kondo, K. Influence of social relationship domains and their combination on incident dementia: A prospective cohort study. J. Epidemiol. Community Health 2018, 72, 7–12. [Google Scholar] [CrossRef] [PubMed]
- Hutchinson, C.V.; Barrett, D.J.K.; Nitka, A.; Raynes, K. Action video game training reduces the simon effect. Psychon. Bull. Rev. 2016, 23, 587–592. [Google Scholar] [CrossRef] [PubMed]
- Martínez, K.; Solana, A.; Burgaleta, M.; Hernández-Tamames, J.; Álvarez Linera, J.; Román, F.; Alfayate, E.; Privado, J.; Escorial, S.; Quiroga, M.; et al. Changes in resting-state functionally connected parietofrontal networks after videogame practice. Hum. Brain Mapp. 2013, 34, 3143–3157. [Google Scholar] [CrossRef] [PubMed]
- Mancı, E.; Güdücü, Ç.; Günay, E.; Güvendi, G.; Campbell, M.; Bediz, Ç.Ş. The relationship between esports game genres and cognitive performance: A comparison between first-person shooter vs. multiplayer online battle arena games in younger adults. Entertain. Comput. 2024, 50, 100640. [Google Scholar] [CrossRef]
- Rico, J.L.C.; Villarrasa-Sapiña, I.; García-Masso, X.; Monfort-Torres, G. Differences in hand acceleration and digital reaction time between different skill levels of Counter Strike players. Entertain. Comput. 2025, 52, 100797. [Google Scholar] [CrossRef]
- Jahouh, M.; González-Bernal, J.J.; González-Santos, J.; Fernández-Lázaro, D.; Soto-Cámara, R.; Mielgo-Ayuso, J. Impact of an intervention with Wii video games on the autonomy of activities of daily living and psychological-cognitive components in the institutionalized elderly. Int. J. Environ. Res. Public Health 2021, 18, 1570. [Google Scholar] [CrossRef]
- Wang, P.; Zhu, X.T.; Qi, Z.; Huang, S.; Li, H.J. Neural basis of enhanced executive function in older video game players: An fMRI study. Front. Aging Neurosci. 2017, 9, 382. [Google Scholar] [CrossRef]
- Choi, E.; Lee, B. Unlocking the potential of play: A TF-IDF analysis of ‘MapleStory’ as a serious game for cognitive enhancement in seniors. Entertain. Comput. 2025, 52, 100800. [Google Scholar] [CrossRef]
- Lee, S.; Shi, C.-K.; Doh, Y.Y. The relationship between co-playing and socioemotional status among older-adult game players. Entertain. Comput. 2021, 38, 100414. [Google Scholar] [CrossRef]
- Onishi, T.; Yamasaki, M.; Hara, T.; Hirotomi, T.; Miyazaki, R. Esports for seniors: Acute effects of esports gaming in the community on the emotional state and heart rate among Japanese older adults. Int. J. Environ. Res. Public Health 2022, 19, 11683. [Google Scholar] [CrossRef]
- Mori, A.; Iwadate, M.; Endo, Y.; Ashizuka, T. The relationship between computer game playing and electroencephalographic activity in the prefrontal cortex. Health Behav. Sci. 2003, 2, 56–69. [Google Scholar] [CrossRef]
- Murata, C.; Takeda, T.; Suzuki, K.; Kondo, K. Positive affect and incident dementia among the old. J. Epidemiol. Res. 2016, 2, 118–124. [Google Scholar] [CrossRef]
- Pirmoradi, A.; Hoeber, O. Bridging in-task emotional responses with post-task evaluations in digital library search interface user studies. Inf. Process. Manag. 2025, 62, 104069. [Google Scholar] [CrossRef]
- Sutoyo, R.; Warnars, H.L.H.S.; Isa, S.M.; Budiharto, W. Emotionally aware chatbot for responding to Indonesian product reviews. Int. J. Innov. Comput. Inf. Control. 2023, 19, 861–876. [Google Scholar] [CrossRef]
- Porcu, S.; Floris, A.; Atzori, L. Towards the prediction of the quality of experience from facial expression and gaze direction. In Proceedings of the 2019 22nd Conference on Innovation in Clouds Internet and Networks and Workshops ICIN, Paris, France, 19–22 February 2019; pp. 82–87. [Google Scholar] [CrossRef]
- Porcu, S.; Uhrig, S.; Voigt-Antons, J.-N.; Möller, S.; Atzori, L. Emotional impact of video quality: Self-assessment and facial expression recognition. In Proceedings of the 2019 11th International Conference on Quality of Multimedia Experience, QoMEX, Berlin, Germany, 5–7 June 2019; pp. 1–6. [Google Scholar] [CrossRef]
- Bingöl, G.; Porcu, S.; Floris, A.; Atzori, L. QoE Estimation of WebRTC-based audio-visual conversations from facial and speech features. ACM Trans. Multimed. Comput. Commun. Appl. 2025, 20, 1–23. [Google Scholar] [CrossRef]
- Behnke, M.; Gross, J.J.; Kaczmarek, L.D. The role of emotions in esports performance. Emotion 2022, 22, 1059–1070. [Google Scholar] [CrossRef] [PubMed]
- Abramov, S.; Korotin, A.; Somov, A.; Burnaev, E.; Stepanov, A.; Nikolaev, D.; Titova, M.A. Analysis of video game players’ emotions and team performance: An esports tournament case study. IEEE J. Biomed. Health Inform. 2022, 26, 3597–3606. [Google Scholar] [CrossRef]
- Mateo-Orcajada, A.; Abenza-Cano, L.; Vaquero-Cristóbal, R. Analyzing the changes in the psychological profile of professional League of Legends players during competition. Comput. Hum. Behav. 2022, 126, 107030. [Google Scholar] [CrossRef]
- Kou, Y.; Gui, X. Emotion regulation in eSports gaming: A qualitative study of League of Legends. Proc. ACM Hum.-Comput. Interact. 2020, 4, 1–25. [Google Scholar] [CrossRef]
- Beres, N.A.; Klarkowski, M.; Mandryk, R.L. Playing with emotions: A systematic review examining emotions and emotion regulation in esports performance. Proc. ACM Hum.-Comput. Interact. 2023, 7, 558–587. [Google Scholar] [CrossRef]
- Nalepa, G.J.; Kutt, K.; Giżycka, B.; Jemiolo, P.; Bodek, S. Analysis and use of the emotional context with wearable devices for games and intelligent assistants. Sensors 2019, 19, 2509. [Google Scholar] [CrossRef] [PubMed]
- Ochab, J.K.; Wegrzyn, P.; Witaszczyk, P.; Drażyk, D.; Nalepa, G.J. Mobile game evaluation method based on data mining of affective time series. Sensors 2025, 25, 2756. [Google Scholar] [CrossRef]
- Yeasin, M.; Bullot, B.; Sharma, R. Recognition of facial expressions and measurement of levels of interest from video. IEEE Trans. Multimed. 2006, 8, 500–508. [Google Scholar] [CrossRef]
- Drimalla, H.; Baskow, I.; Behnia, B.; Roepke, S.; Dziobek, I. Imitation and recognition of facial emotions in autism: A computer vision approach. Mol. Autism 2021, 12, 27. [Google Scholar] [CrossRef]
- Ninaus, M.; Greipl, S.; Kiili, K.; Lindstedt, A.; Huber, S.; Klein, E.; Karnath, H.-O.; Moeller, K. Increased emotional engagement in game-based learning—A machine learning approach on facial emotion detection data. Comput. Educ. 2019, 142, 103641. [Google Scholar] [CrossRef]
- Minaee, S.; Minaei, M.; Abdolrashidi, A. Deep-emotion: Facial expression recognition using attentional convolutional network. Sensors 2021, 21, 3046. [Google Scholar] [CrossRef] [PubMed]
- Fasel, B.; Luettin, J. Automatic facial expression analysis: A survey. Pattern Recognit. 2003, 36, 259–275. [Google Scholar] [CrossRef]
- Zhang, J.; Yin, Z.; Chen, P.; Nichele, S. Emotion recognition using multi-modal data and machine learning techniques: A tutorial and review. Inf. Fusion 2020, 59, 103–126. [Google Scholar] [CrossRef]
- Du, G.; Long, S.; Yuan, H. Non-contact emotion recognition combining heart rate and facial expression for interactive gaming environments. IEEE Access 2020, 8, 11896–11906. [Google Scholar] [CrossRef]
- Viellard, S.; Pinabiaux, C. Spontaneous response to and expressive regulation of mirth elicited by humorous cartoons in younger and older adults. Aging Neuropsychol. Cogn. 2019, 26, 407–423. [Google Scholar] [CrossRef]
- Fölster, M.; Hess, U.; Werheid, K. Facial age affects emotional expression decoding. Front. Psychol. 2014, 5, 30. [Google Scholar] [CrossRef]
- Kang, X. Speech emotion recognition algorithm of intelligent robot based on ACO-SVM. Int. J. Cogn. Comput. Eng. 2025, 6, 131–142. [Google Scholar] [CrossRef]
- Jena, S.; Basak, S.; Agrawal, H.; Saini, B.; Gite, S.; Kotecha, K.; Alfarhood, S. Developing a negative speech emotion recognition model for safety systems using deep learning. J. Big Data 2025, 12, 54. [Google Scholar] [CrossRef]
- Du, G.; Tan, Q.; Li, C.; Wang, X.; Teng, S.; Liu, P.X. A noncontact emotion recognition method based on complexion and heart rate. IEEE Trans. Instrum. Meas. 2022, 71, 1–14. [Google Scholar] [CrossRef]
- Jiang, Y.; Li, W.; Hossain, M.S.; Chen, M.; Alelaiwi, A.; Al-Hammadi, M. A snapshot research and implementation of multimodal information fusion for data-driven emotion recognition. Inf. Fusion 2020, 53, 209–221. [Google Scholar] [CrossRef]
- Pace-Schott, E.F.; Amole, M.C.; Aue, T.; Balconi, M.; Bylsma, L.M.; Critchley, H.; Demaree, H.A.; Friedman, B.H.; Gooding, A.E.K.; Gosseries, O.; et al. Physiological feelings. Neurosci. Biobehav. Rev. 2019, 103, 267–304. [Google Scholar] [CrossRef] [PubMed]
- Candia-Rivera, D.; Catrambone, V.; Thayer, J.F.; Gentili, C.; Valenza, G. Cardiac sympathetic-vagal activity initiates a functional brain-body response to emotional arousal. Proc. Natl. Acad. Sci. USA 2022, 119, e2119599119. [Google Scholar] [CrossRef] [PubMed]
- Ioannou, S.; Gallese, V.; Merla, A. Thermal infrared imaging in psychophysiology: Potentialities and limits. Psychophysiol. 2014, 51, 951–963. [Google Scholar] [CrossRef]
- Mathis, V.; Kenny, P.J. Neuroscience: Brain Mechanisms of Blushing. Curr. Biol. 2018, 28, R791–R792. [Google Scholar] [CrossRef]
- POLYPHONY DIGITAL. Gran Turismo Sport. Available online: https://www.gran-turismo.com/jp/gtsport/top/ (accessed on 10 June 2025).
- Panasonic. Digital Video Camera hc-vx2m. Available online: https://panasonic.jp/dvc/p-db/HC-VX2M.html (accessed on 10 June 2025).
- NEP Corporation. Led Light. Available online: https://nepinc.co.jp/product/lighting/ (accessed on 10 June 2025).
- The Color Science Association of Japan. Handbook of Color Science, 3rd ed.; University of Tokyo Press: Tokyo, Japan, 2011. [Google Scholar]
- Insightface. Available online: https://insightface.ai/ (accessed on 10 June 2025).
- Kan, T.; Shiga, Y.; Himeno, N. 51 Statistical Methods You Can Use, 3rd ed.; Ohmsha: Tokyo, Japan, 2019. [Google Scholar]
- Yamada, M.; Kageyama, Y. Temperature Analysis of Face Regions Based on Degree of Emotion of Joy. Int. J. Innov. Comput. Inf. Control. 2022, 18, 1383–1394. [Google Scholar] [CrossRef]
- Pavlidis, I.; Levine, J.; Baukol, P. Thermal imaging for anxiety detection. In Proceedings of the 2001 International Conference on Image Processing, Thessaloniki, Greece, 7–10 October 2001; pp. 315–318. [Google Scholar] [CrossRef]
- Cheong, J.H.; Jolly, E.; Xie, T.; Byrne, S.; Kenney, M.; Chang, L. Py-Feat: Python facial expression analysis toolbox. Affect. Sci. 2023, 4, 781–796. [Google Scholar] [CrossRef] [PubMed]
- Pham, L.; Vu, T.H.; Tran, T.A. Facial expression recognition using residual masking network. In Proceedings of the IEEE 25th International Conference on Pattern Recognition, Virtual, 10–15 January 2021; pp. 4513–4519. [Google Scholar] [CrossRef]
| | Participants with Driving Experience (Proportion) | Participants with Gaming Experience (Proportion) |
|---|---|---|
| Have experienced at least once | 19 (100%) | 10 (52%) |
| Engage regularly | 18 (94%) | 2 (20%) |
| Participant | Experimental Session | Number of Evaluation Intervals | Fluctuations Observed (Left Cheek) | Fluctuations Observed (Right Cheek) |
|---|---|---|---|---|
| A | 1 | 7 | 6 (86%) | 6 (86%) |
| A | 2 | 18 | 9 (50%) | 11 (61%) |
| B | 1 | 18 | 12 (67%) | 13 (72%) |
| B | 2 | 7 | 5 (71%) | 4 (57%) |
| C | 1 | 11 | 8 (73%) | 10 (91%) |
| C | 2 | 14 | 10 (71%) | 10 (71%) |
| D | 1 | | | |
| D | 2 | 7 | 5 (71%) | 5 (71%) |
| E | 1 | 16 | 11 (69%) | 10 (63%) |
| E | 2 | 16 | 12 (75%) | 7 (44%) |
| F | 1 | 22 | 11 (50%) | 15 (68%) |
| F | 2 | 13 | 5 (38%) | 10 (77%) |
| G | 1 | 7 | 6 (86%) | 7 (100%) |
| G | 2 | 10 | 7 (70%) | 10 (100%) |
| H | 1 | 11 | 9 (82%) | 10 (91%) |
| H | 2 | 9 | 8 (89%) | 9 (100%) |
| I | 1 | 7 | 5 (71%) | 7 (100%) |
| I | 2 | 7 | 7 (100%) | 6 (86%) |
| J | 1 | 20 | 14 (70%) | 10 (50%) |
| J | 2 | 8 | 6 (75%) | 6 (75%) |
| K | 1 | 7 | 6 (86%) | 4 (57%) |
| K | 2 | 9 | 5 (56%) | 4 (44%) |
| L | 1 | 17 | 13 (76%) | 10 (59%) |
| L | 2 | 13 | 8 (62%) | 9 (69%) |
| M | 1 | 11 | 6 (55%) | 6 (55%) |
| M | 2 | 8 | 5 (63%) | 6 (75%) |
| N | 1 | 13 | 6 (46%) | 8 (62%) |
| N | 2 | 4 | 1 (25%) | 3 (75%) |
| O | 1 | 20 | 15 (75%) | 14 (70%) |
| O | 2 | | | |
| P | 1 | 11 | 9 (82%) | 7 (64%) |
| P | 2 | 2 | 1 (50%) | 2 (100%) |
| Q | 1 | 0 | 0 | 0 |
| Q | 2 | 0 | 0 | 0 |
| R | 1 | 6 | 5 (83%) | 4 (67%) |
| R | 2 | 2 | 2 (100%) | 2 (100%) |
| S | 1 | | | |
| S | 2 | 8 | 6 (75%) | 5 (63%) |
| Participant | Experimental Session | Number of Factor Intervals | Fluctuations Observed (Left Cheek) | Fluctuations Observed (Right Cheek) |
|---|---|---|---|---|
| A | 1 | 11 | 5 (45%) | 7 (64%) |
| A | 2 | 3 | 3 (100%) | 2 (67%) |
| B | 1 | 14 | 5 (36%) | 7 (50%) |
| B | 2 | 26 | 20 (77%) | 12 (46%) |
| C | 1 | 14 | 11 (79%) | 9 (64%) |
| C | 2 | 2 | 1 (50%) | 2 (100%) |
| D | 1 | | | |
| D | 2 | 8 | 5 (63%) | 4 (50%) |
| E | 1 | 9 | 5 (56%) | 6 (67%) |
| E | 2 | 9 | 5 (56%) | 3 (33%) |
| F | 1 | 11 | 5 (45%) | 4 (36%) |
| F | 2 | 9 | 7 (78%) | 4 (44%) |
| G | 1 | 4 | 2 (50%) | 3 (75%) |
| G | 2 | 7 | 4 (57%) | 5 (71%) |
| H | 1 | 4 | 3 (75%) | 4 (100%) |
| H | 2 | 12 | 7 (58%) | 7 (58%) |
| I | 1 | 9 | 6 (67%) | 7 (78%) |
| I | 2 | 8 | 4 (50%) | 3 (38%) |
| J | 1 | 13 | 8 (62%) | 6 (46%) |
| J | 2 | 18 | 5 (28%) | 10 (56%) |
| K | 1 | 12 | 6 (50%) | 8 (67%) |
| K | 2 | 8 | 5 (63%) | 4 (50%) |
| L | 1 | 14 | 7 (50%) | 8 (57%) |
| L | 2 | 27 | 19 (70%) | 12 (44%) |
| M | 1 | 10 | 7 (70%) | 5 (50%) |
| M | 2 | 17 | 11 (65%) | 8 (47%) |
| N | 1 | 15 | 8 (53%) | 11 (73%) |
| N | 2 | 13 | 9 (69%) | 7 (54%) |
| O | 1 | 6 | 4 (67%) | 2 (33%) |
| O | 2 | | | |
| P | 1 | 4 | 3 (75%) | 2 (50%) |
| P | 2 | 20 | 8 (40%) | 9 (45%) |
| Q | 1 | 38 | 20 (53%) | 21 (55%) |
| Q | 2 | 12 | 6 (50%) | 4 (33%) |
| R | 1 | 12 | 4 (33%) | 4 (33%) |
| R | 2 | 13 | 6 (46%) | 8 (62%) |
| S | 1 | | | |
| S | 2 | 43 | 25 (58%) | 30 (70%) |
Number of emotion occurrences:

| Emotion | Play | Experimental Session 1 | Experimental Session 2 |
|---|---|---|---|
| Positive emotion | play1 | 9 | 11 |
| | play2 | 19 | 22 |
| | play3 | 14 | 18 |
| Negative emotion | play1 | 36 | 23 |
| | play2 | 81 | 45 |
| | play3 | 45 | 36 |
Cite as: Kikuchi, R.; Shirai, H.; Ishizawa, C.; Suehiro, K.; Takahashi, N.; Saito, H.; Kobayashi, T.; Satake, H.; Sato, N.; Kageyama, Y. Feature Analysis of Facial Color Information During Emotional Arousal in Japanese Older Adults Playing eSports. Sensors 2025, 25, 5725. https://doi.org/10.3390/s25185725