Eye-Tracking and Emotion-Based Evaluation of Wardrobe Front Colors and Textures in Bedroom Interiors
Abstract
1. Introduction
2. Materials and Methods
2.1. Extraction of Visual Physical Properties
2.1.1. Color Data Extraction and Adaptive Palette Coverage
2.1.2. Texture Feature Extraction
2.1.3. Preparation of Interior-Scene Stimuli
2.2. Experimental Design
2.2.1. Participants
2.2.2. Experimental Equipment
2.2.3. Experimental Procedure and Task
- Material-element task: On each trial, 6 swatches were presented. Participants indicated the most preferred option using a keypad or mouse;
- Bedroom-scene task: On each trial, 6 wardrobe scenes were presented. Participants indicated the most preferred configuration (a minimal trial-logging sketch follows this list).
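As a rough, standard-library-only illustration of how such six-alternative preference trials might be assembled and logged, the sketch below uses hypothetical stimulus names, trial counts, and a console prompt as a stand-in for the keypad/mouse response; it is not the experiment software used in the study.

```python
import csv
import random

# Hypothetical stimulus pool; the actual swatches/scenes in the study differ.
STIMULI = ["White", "Ivory White", "Gray", "Red", "Yellow", "Green", "Black"]

def build_trials(n_trials: int, options_per_trial: int = 6, seed: int = 1):
    """Draw a random layout of 6 options for every trial."""
    rng = random.Random(seed)
    return [rng.sample(STIMULI, options_per_trial) for _ in range(n_trials)]

def run_block(trials, out_path="choices.csv"):
    """Present each trial as a console prompt and record the chosen option."""
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["trial", "options", "choice"])
        for i, options in enumerate(trials, start=1):
            prompt = "  ".join(f"[{k + 1}] {name}" for k, name in enumerate(options))
            key = input(f"Trial {i}: {prompt}\nMost preferred (1-6): ")
            choice = options[int(key) - 1]
            writer.writerow([i, "|".join(options), choice])

if __name__ == "__main__":
    run_block(build_trials(n_trials=10))
```
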
2.3. Data Acquisition and Analysis
- Color-Single, presenting isolated color swatches;
- Color-Texture-Single, presenting the original texture samples;
- DeColor-Texture-Single, presenting the same texture samples after removing chromatic information to emphasize structural texture cues (a minimal decolorization sketch follows this list).
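One common way to remove chromatic information while preserving the luminance structure that carries texture cues is to neutralize the a*/b* channels in CIELAB space. The OpenCV sketch below illustrates that approach under this assumption; the file names are hypothetical, and the study's actual decolorization pipeline may differ.

```python
import cv2

def decolorize_texture(path_in: str, path_out: str) -> None:
    """Remove chromatic content by neutralizing the a*/b* channels in CIELAB,
    keeping only the lightness (L*) channel that carries texture structure."""
    img = cv2.imread(path_in)                      # BGR, uint8
    lab = cv2.cvtColor(img, cv2.COLOR_BGR2LAB)
    lab[:, :, 1] = 128                             # a* -> neutral (OpenCV offsets a*/b* by 128)
    lab[:, :, 2] = 128                             # b* -> neutral
    achromatic = cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)
    cv2.imwrite(path_out, achromatic)

# Hypothetical usage:
# decolorize_texture("texture_mountain_grain.png", "texture_mountain_grain_decolor.png")
```
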
2.3.1. Clustering of Visual-Attention Patterns
2.3.2. Preference Prediction Modeling
2.3.3. Bradley–Terry Pairwise Ranking
2.3.4. Integrated Attention Index
2.3.5. Linking Attention and Emotion
3. Results
3.1. Stimulus Set
3.2. Eye-Tracking Preprocessing and Subject Stratification
3.3. Preference Prediction Performance
3.4. Pairwise Preference Ranking
3.5. Integrating Gaze Features for Preference Scoring
3.6. Linking Gaze to Affect
3.7. Agreement with the Subjective Questionnaire
4. Discussion
4.1. Stable Preference Structures Across Interior Contexts
4.2. Linking Visual Attention to Affective Evaluation and Choice
4.3. Implications for Bedroom Interior Design and Material Selection
4.4. Limitations and Directions for Future Research
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Integrated Attention Index (IAI) ranking of the color stimuli:

| Rank | Stimulus | IAI |
|---|---|---|
| 1 | White | 1.349 |
| 2 | Ivory White | 1.339 |
| 3 | Gray | 0.010 |
| 4 | Red | −0.059 |
| 5 | Yellow | −0.073 |
| 6 | Green | −0.144 |
| 7 | Black | −0.183 |

Integrated Attention Index (IAI) ranking of the texture stimuli:

| Rank | Stimulus | IAI |
|---|---|---|
| 1 | Mountain Grain | 1.180 |
| 2 | Straight Grain | 1.170 |
| 3 | Line Finish | 0.170 |
| 4 | Leather Finish | 0.078 |
| 5 | Special Grain | −0.013 |
| 6 | Horizontal Grain | −0.238 |
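For context, an integrated attention index of this kind is typically a weighted combination of standardized gaze metrics computed per stimulus. The sketch below is a minimal, hypothetical construction: the feature names, weights, and equal-weight aggregation are assumptions for illustration and do not reproduce the paper's exact definition.

```python
import numpy as np
import pandas as pd

def integrated_attention_index(gaze: pd.DataFrame,
                               features=("fixation_count", "total_dwell_ms", "first_fixation_ms"),
                               weights=(1.0, 1.0, -1.0)) -> pd.Series:
    """Combine standardized gaze metrics into one attention score per stimulus.

    `gaze` has one row per stimulus and one column per metric. Metrics where
    smaller values signal earlier/stronger attention (e.g., time to first
    fixation) receive a negative weight. The features and weights here are
    illustrative assumptions, not the study's specification.
    """
    cols = list(features)
    z = (gaze[cols] - gaze[cols].mean()) / gaze[cols].std(ddof=0)  # z-score each metric
    w = np.asarray(weights, dtype=float)
    return (z * w).sum(axis=1) / np.abs(w).sum()

# Hypothetical usage:
# gaze = pd.DataFrame({"fixation_count": [...], "total_dwell_ms": [...],
#                      "first_fixation_ms": [...]}, index=["White", "Gray", ...])
# iai = integrated_attention_index(gaze).sort_values(ascending=False)
```
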
CCA and SEM fit statistics for the color and texture conditions:

| Analysis | Statistic | Color | Texture |
|---|---|---|---|
| CCA | First canonical r | 0.62 | 0.58 |
| CCA | Wilks’ Lambda (p) | <0.001 | <0.001 |
| SEM | CFI | 0.96 | 0.95 |
| SEM | TLI | 0.94 | 0.93 |
| SEM | RMSEA | 0.045 | 0.048 |
| SEM | RMR | 0.038 | 0.041 |
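As a hedged illustration of how a first canonical correlation between gaze features and emotion ratings can be estimated, the scikit-learn sketch below assumes hypothetical input blocks (per-participant gaze metrics and PAD-style ratings); the study's preprocessing and the SEM analysis are not reproduced here.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def first_canonical_r(gaze_features: np.ndarray, emotion_ratings: np.ndarray) -> float:
    """Return the first canonical correlation between two variable blocks.

    gaze_features:   (n_participants, n_gaze_metrics), e.g., dwell time, fixation count.
    emotion_ratings: (n_participants, 3), e.g., pleasure/arousal/dominance scores.
    Shapes and variable choices are illustrative assumptions.
    """
    cca = CCA(n_components=1)  # scale=True (default) standardizes both blocks
    cca.fit(gaze_features, emotion_ratings)
    x_scores, y_scores = cca.transform(gaze_features, emotion_ratings)
    return float(np.corrcoef(x_scores[:, 0], y_scores[:, 0])[0, 1])

# Hypothetical usage with simulated data:
# rng = np.random.default_rng(0)
# r = first_canonical_r(rng.normal(size=(40, 6)), rng.normal(size=(40, 3)))
```
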
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.

