A Comprehensive Study of Emotional Responses in AI-Enhanced Interactive Installation Art

Abstract: This study presents a comprehensive literature review on the convergence of affective computing, interactive installation art, multi-dimensional sensory stimulation, and artificial intelligence (AI) in measuring emotional responses, demonstrating the potential of AI-based emotion recognition as a tool for sustainable development. It addresses the problem of understanding emotional response and measurement in the context of AI-driven interactive installation art, emphasizing sustainability as a key factor. The study aims to fill the existing research gaps by examining three key aspects: sensory stimulation, multi-dimensional interactions, and engagement, which have been identified as significant contributors to profound emotional responses in interactive installation art. The proposed approach involves conducting a process analysis of emotional responses to interactive installation art, aiming to develop a conceptual framework that explores the variables influencing emotional responses. This study formulates hypotheses that make specific predictions about the relationships between sensory stimulation, multi-dimensional interactions, engagement, and emotional responses. By employing the ASSURE model combined with experimental design, the research methodology ensures a systematic and comprehensive study implementation. The implications of this project lie in advancing the understanding of emotional experiences in AI-driven interactive installation art, providing insights into the underlying mechanisms that drive these experiences and their influence on individual well-being from a sustainable perspective. The contributions of this research include bridging the identified research gaps, refining theoretical frameworks, and guiding the design of more impactful and emotionally resonant interactive artworks with sustainability in mind.
This research seeks not only to fill the existing gaps in understanding emotional experiences in interactive installation art, but also to guide the development of immersive and emotionally engaging installations, ultimately advancing the broader field of human–computer interaction, promoting individual well-being, and contributing to sustainable development.


Introduction
Interactive installation art has blossomed into an enthralling artistic expression, blending technological innovation with creative prowess to bestow viewers with an immersive and participatory encounter [1]. Infusing artificial intelligence (AI) into these installation arts has engendered a paradigm shift and created a new art form that explores intricate emotional engagement. Viewed through an ontological lens, the amalgamation of interactive installation art with artificial intelligence emerges as a dualistic existence encompassing entity and environment [2]. As palpable entities, interactive installation artworks possess unique attributes and characteristics that distinguish their intrinsic nature. Simultaneously, as elements of existence, these installation artworks intricately engage with their surrounding milieu, evoking an array of emotional responses. At the core of interactive installation art lies its inherent prowess to produce emotional resonance within participants, effacing the boundaries between the observer and the artificial construct [3].
Propelled by artificial intelligence algorithms, the dynamic interaction between the observer and the installation fabricates an ambiance wherein the distinction between self and artificial entity becomes indistinct. The AI-infused interactive installation arts adopt qualities and attributes that appear to bridge the divide between inanimate constructs and sentient beings. This amalgamation challenges conventional constructs of agency, intentionality, and consciousness, as interactive installation art possesses distinct attributes and characteristics, engaging with its environment and eliciting an array of emotional responses. Eminent scholars such as Dominic M. McIver Lopes, Sherri Irvin, Graham Harman, Erkki Huhtamo, and Claudia Giannetti, among others, have concentrated their research on the intersection of installation art and emotional responses [1][2][3][4]. Their work enhances our ontological understanding of interactive installation art, shedding light on integrating emotions with the essence of existence and artistic innovation. Moreover, it sparks profound inquiries into the fundamental nature of existence, reality, and consciousness within interactive art.
In the era of burgeoning AI capabilities, the capacity of algorithms to swiftly assimilate and construe extensive datasets in real time heralds a transformative juncture. This evolution empowers installations to dynamically interface with their surroundings, adeptly catering to individual participants' intricate emotional and cognitive spectra. This amalgamation of artificial intelligence and interactive art engenders a paradigm shift, propelling the notion of responsiveness to unprecedented pinnacles. Consequently, the installation transcends its erstwhile static essence, metamorphosing into a dynamic milieu that profoundly resonates with each participant.
The formidable aptitude of artificial intelligence technology lies at the nucleus of this metamorphic amalgamation. This aptitude resides in its proficiency to anatomize and scrutinize the intricate tapestry of relationships binding critical elements of the installation with the ensuing emotional responses they incite. The intricate dance commences as participants engage with the installation, wherein artificial intelligence algorithms intricately capture and construe sensory inputs. These span a gamut from gestures and facial expressions to tonal inflections and physiological cues [5][6][7][8][9]. This intricate web of data points promptly transmutes into real-time rejoinders through artificial intelligence algorithms, unfurling a chronicle of evolving emotional states and cognitive reactions.
This dynamic process burgeons as AI algorithms, operating in real time, meticulously process and dissect copious volumes of data. This prowess endows installation art with the agility to react dynamically to the behavioral and emotional tapestry woven by participants. In tandem, this symbiotic relationship between AI algorithms and interactive installations bequeaths artists and designers a fresh canvas with the potential to craft experiences that are emotionally opulent and compelling. Installation art, augmented by artificial intelligence, becomes adept at capturing and deciphering the nuanced emotional cues from participants. By leveraging these algorithms, installation art attunes itself to the participants' emotional states and responses, culminating in an elevated level of engagement.
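The sense-interpret-respond loop described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration (all function names, thresholds, and preset values are assumptions for illustration, not from any installation discussed in this review): per frame, multi-sensory readings are mapped to an estimated emotional state, which then drives the installation's output parameters.

```python
# Toy sense -> interpret -> respond loop for an AI-driven installation.
# The rule-based interpreter stands in for a trained emotion model.

def interpret_emotion(readings: dict) -> str:
    """Map one frame of multi-sensory readings to an emotion label."""
    if readings["smile_intensity"] > 0.6 and readings["voice_pitch"] > 0.5:
        return "joy"
    if readings["heart_rate"] > 100:
        return "excitement"
    return "neutral"

def respond(emotion: str) -> dict:
    """Map the inferred emotion to installation parameters (light, sound)."""
    presets = {
        "joy":        {"light_hue": "warm",  "sound_tempo": 1.2},
        "excitement": {"light_hue": "red",   "sound_tempo": 1.5},
        "neutral":    {"light_hue": "white", "sound_tempo": 1.0},
    }
    return presets[emotion]

# One simulated frame of participant data.
frame = {"smile_intensity": 0.8, "voice_pitch": 0.7, "heart_rate": 85}
state = interpret_emotion(frame)
output = respond(state)
print(state, output["light_hue"])  # joy warm
```

In a real installation, the interpreter would be a learned model over camera, microphone, and biosensor streams, and the loop would run continuously rather than on a single frame.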
Therefore, our research objective is to facilitate the measurement of emotional responses by artificial intelligence algorithms, which involves an in-depth study of the intricate interplay between various stages, from multi-sensory input to measurement and subsequent recognition. We adopt an experimental design based on the ASSURE model to achieve this. This approach enables us to investigate the intricate, dynamic nexus between sensory stimulation, multi-dimensional interaction, participant engagement, and emotional responses. Our study aims to foster a deep understanding of emotional responses within interactive installation art under artificial intelligence. This endeavor holds immense significance as it equips artists and designers with invaluable insights, pivotal in advancing the development of emotionally engaging interactive installation art, ultimately enriching the landscape of artistic creation.

Literature Review
Rapid advances in computing power, big data, and machine learning enable artists to incorporate artificial intelligence techniques into their installation art, providing audiences with an immersive and engaging experience. These installations utilize artificial intelligence algorithms to create dynamic and responsive environments that interact with the viewer's movements, gestures, or input [5,6]. They can also incorporate computer vision to track audience movement and generate music or visual effects based on audience behavior [10,11]. This research uses artificial intelligence algorithms to measure emotional responses to interactive installation art. To achieve this goal, we conduct a relevant literature review to examine the current state of installation art and identify research gaps.

Artificial Intelligence in Interactive Installation Art
The impact of artificial intelligence on art design is profound and massive; it is changing the development pattern of art at an unprecedented speed, and artists use artificial intelligence in various ways [12], creating dynamic, responsive, and personalized experiences for their audiences. One of the essential applications of artificial intelligence in interactive installation art is to make the artwork respond to the audience's behavior in real time through machine-learning algorithms. For example, an intelligent mirror can recognize the viewer's face, track their movements, and change its appearance or behavior, creating a more personalized experience [5].
Another way artificial intelligence is used in interactive installation art is through natural language processing (NLP) technology, which enables artwork to understand and respond to written or spoken language. For example, the "Molten Memories" installation uses NLP to create a responsive and immersive visitor experience: the building's walls are transformed into a dynamic display of light and sound [13]. "You Are What You See" uses NLP and machine-learning algorithms to generate appropriate responses to audience communication [14]. This opens new possibilities for creating interactive installations that respond to the viewer's questions or commands, creating a more conversational and personal experience.
Artificial intelligence algorithms have created a responsive and immersive experience for the audience, eliciting better emotional responses. A large body of literature is dedicated to exploring the impact of emotion on interactive installation art and how it affects the user experience, emphasizing its role in creating a sense of presence and immersion in interactive installation art. Interactive devices can respond to the user's emotional state, creating a more personalized and engaging experience by evoking surprise, joy, excitement, curiosity, or awe to create a deeper interaction with the device [15]. Emotions can be elicited by various elements in interactive installation art, such as visual and auditory stimuli, physical interaction with artwork, and narrative or conceptual content. They can range from joy and excitement to more complex emotions such as fear, anxiety, or confusion [16].
Interactive installation art can evoke a range of positive and negative emotions, and designers and curators must understand the role of emotion in the aesthetic experience of interactive installation art to create impactful experiences for viewers [17]. Emotional experience is an integral part of the user experience, and emotional responses are influenced by multiple factors, including the physical environment and audience experience [18].
Designers should prioritize emotions when creating interactive experiences, understanding the importance of the environment, the causes of emotional factors, and developing strategies for emotional experiences [19]. The elements in designing interactive systems include storytelling, personalization, interactivity, the consideration of users' prior knowledge and expectations, and cultural and social factors [17]. The presence of other people in the physical environment of interactive installation art can also affect the audience's emotional experience [18]. The emotion of interactive installation art is influenced by the design of the installation and the sensory experience it provides [18,20,21], and mostly depends on the design of the installation, the presentation of the environment, and the characteristics of the relevant actors [22]. For example, the "Rain Room" installation at the Museum of Modern Art in New York City allows visitors to walk in heavy rain without getting wet. It creates a sense of wonder and awe among visitors and evokes a strong emotional response [23]. Movement is an essential element of the installation, as visitors can move around and interact with the rain, experiencing the thrill of being surrounded by rain without getting wet. The "Ripple" interactive installation uses light and sound to create a calming and meditative experience for the user. Movement is integral to the installation, as users can move and interact with light and sound waves, creating ripples and patterns [21]. The installation uses sensory cues to evoke an emotional response in the user. It allows the user to reflect on their emotional experience and connect with others who have had a similar experience.
In conclusion, emotions play a vital role in creating successful interactive installation art, which can create meaningful experiences for viewers. As AI increasingly integrates into our lives, the literature reviewed above highlights the importance of creating engaging and evocative experiences in interactive installation art through emotional design.

Emotion and Well-Being
Emotion is a powerful and influential factor in the mental life of humans, resulting from internal and external stimuli. In the field of art aesthetics, Scherer defines "aesthetic emotion" as "emotions elicited by an appreciation of the intrinsic qualities of natural beauty or the qualities of a work of art or artistic performance" [24].
Making art is a complex and transformative activity that fosters a deep connection to human emotion. Castro argues that participatory environments in installation art can generate emotional ties and contribute to social reconnection among individuals [25]. As Reason emphasized, art has the power to express thoughts and emotions and evoke powerful experiences in those who encounter it [26]. This transformative experience, the aesthetic experience, has been the subject of much philosophical inquiry. For Kant, aesthetic experience is associated with pleasure derived from beautiful things [27]. Goodman believes art allows us to perceive the world through positive or negative emotions, pleasant or unpleasant [28]. Scherer's concept of aesthetic emotion has been divided into five categories: "moved", "surprised", "fascinated", "beautiful", and "awe" [29]. Artists apply artificial intelligence to interactive installation art to create a more personalized and engaging experience by evoking emotions such as surprise, joy, excitement, curiosity, or awe, creating a deeper interaction with the installation.
Previous research has shown that positive emotional output from aesthetic experiences can influence mood and indirectly contribute to health and well-being [16]. The integration of artificial intelligence technologies amplifies the potential of interactive installations, enabling them to elicit a diverse spectrum of emotions in viewers. This culminates in the creation of immersive and captivating environments. These emotional responses are catalysts for forging profound connections and eliciting meaningful reactions from participants, and they contribute to health and well-being.
Emotional expression: Interactive installations allow individuals to express and explore their emotions in a creative and immersive environment. These installations may employ various sensory elements, such as visual displays, sounds, and tactile experiences, to evoke and amplify emotional responses. The "I'm Sensing in the Rain" exhibit creates the illusion of raindrops falling through tactile stimuli in the air, enhancing realism and immersion in virtual environments [30]. The Tate Sensorium exhibit engages visitors' senses through a multi-sensory design that combines taste, touch, smell, sound, and sight [31].
Catharsis: Interactive installation art can facilitate cathartic experiences, allowing individuals to release pent-up emotions and find emotional release or relief. The interactive nature of these installations can create a safe space for individuals to express and process their feelings. Canbeyli discusses the relationship between emotion and depression, where sensory stimulation via the visual, auditory, olfactory, and gustatory systems can modulate depression [32]. Gilroy proposed a framework for analyzing user experience in interactive art installations, which could help to identify when and how cathartic experiences occur [33].
Self-reflection: Interactive installations often encourage self-reflection, prompting individuals to delve into their inner thoughts and emotions. Through interactive engagement, individuals can gain insights into their emotional states, triggers, and personal narratives, fostering greater self-awareness and emotional understanding. Gilbert proposed a model for the personal awareness of science and technology (PAST) to design interactive exhibits that promote meaning-making, which could encourage self-reflection. "Learning to See" uses machine-learning algorithms to analyze live video footage and create an abstract image representation in real time [34]. The installation challenges our perception of art, blurs the lines between human and machine-made, and evokes an emotional response by showing us something new and unexpected [14].
Emotional regulation: Interactive installations can aid emotional regulation by offering tools and strategies for managing and modulating emotions. These installations may incorporate elements like mindfulness exercises, relaxation techniques, or guided emotional experiences, providing individuals with resources to regulate their emotional states effectively. Jiang found that an intelligent interactive shawl that reacts to changes in emotional arousal can help users visualize their emotions and reduce their stress levels by interacting with it [35]. Sadka identified four opportunity themes where interactive technologies can provide unique benefits for emotion regulation training [36].
Therapeutic engagement: Specifically in therapeutic settings, interactive installations can be designed to facilitate emotional healing and growth. They may address specific emotional challenges, promote emotional well-being, or support therapeutic interventions by providing a creative and interactive medium for individuals to explore their emotions and experiences. Waller presents a theoretical model that integrates the change-enhancing factors of both group psychotherapy and art therapy and shows how this model works in practice through a series of illustrated case examples of various client and training groups from different societies and cultures [37]. A project called RHYME aimed to develop a musical and multi-sensorial internet-of-things to improve the health and well-being of children with special needs [38].

Research Dimension: Sensory Stimulation, Experience, and Engagement
Interactive installation art has been extensively studied in four countries: the United States, Germany, the United Kingdom, and China. Table 1 compares interactive installation art in these countries, highlighting the design factors that elicit emotional responses. Sensory stimulation, multi-dimensional interaction, and participation are crucial elements that significantly contribute to the emotional experiences associated with interactive installation art.

Sensory Stimulation
Sensory stimulation is a crucial aspect of interactive installation art, as it refers to any input that activates our senses, such as visual, auditory, olfactory, gustatory, or tactile stimuli. Sensory stimulation can elicit different emotional responses, and multi-sensory stimuli can evoke different emotions [39]; sensory stimuli, including music, sound effects, lighting, color, and CG animation, can create immersive environments and enhance emotional responses [7,9]. These studies emphasize the importance of carefully designing and integrating sensory elements within interactive installation art to elicit specific emotional states. Visually appealing and engaging content tends to elicit positive emotional responses, while repetitive or annoying sounds elicit negative emotional responses [40]. Music can significantly impact mood, and certain types elicit positive emotions and reduce symptoms of depression [38]. In the "Molten Memories" installation, the walls of the building are transformed into a dynamic and changing display of light and sound [13].
With the rapid development of science and technology, various sensory technologies have emerged to stimulate not only vision and hearing but also the senses of touch, smell, and taste [41]. Recent research explores touch, taste, and smell to enhance engagement in multi-sensory experiences. An interdisciplinary SCHI Lab team led by Marianna Obrist conducted extensive research on multi-sensory experiences, including interactive touch, taste, and smell experiences [42]. They used mid-air haptic technology to create an immersive tourist experience [43]. For example, the "I'm Sensing in the Rain" exhibit creates the illusion of raindrops falling through tactile stimuli in the air, enhancing realism and immersion in virtual environments [30]. The Tate Sensorium exhibit engages visitors' senses through a multi-sensory design that combines taste, touch, smell, sound, and sight [31]. Advances in immersive environments such as virtual reality (VR) and wearable devices have produced novel interfaces involving scent and temperature that reshape multi-sensory experiences. Odor stimuli can influence the perception of body lightness or heaviness [44], while temperature perception can promote social proximity concepts and influence human behavior [45]. Likewise, the sense of taste plays a crucial role in our understanding of the consumption process and reflects emotional states.
According to Velasco and Obrist, a multi-sensory experience is an impression formed by a specific event in which sensory elements such as color, texture, and smell are coordinated to create an impression of an object [41]. These impressions can alter cognitive processes such as attention and memory, and designers draw on existing multi-sensory perception research and concepts to design them [46]. Multi-sensory experiences can be physical, digital, or a combination of the two, ranging from completely real to completely virtual, and can enhance participation in multi-sensory experiences [10].

Multi-Dimensional Interaction
By incorporating sensory engagement, spatial design, human-machine interaction, data-driven approaches, and narrative elements, interactive installation art offers a multi-dimensional experience beyond traditional visual art forms [6]. Interactive installation art, data visualizations, and spatial narratives can enable humans to engage with and navigate this multi-dimensional space, blurring the boundaries between the physical and digital realms [47]. Interactive installation art at the Mendelssohn Memorial Hall provides a multi-dimensional experience by offering immersive interaction, sensorial engagement, technological integration, and spatial transformation. It merges technology and art, allowing visitors to actively participate in the virtual orchestra performance and creating a dynamic and captivating experience [48]. George's interactive installation art provides a multi-dimensional experience [10]. The visitor visually explores the art exhibit, trying to discover interesting areas, while the audio guide provides brief information about the exhibit and the gallery theme. The system uses eye tracking and voice commands to provide visitors with real-time information. The evaluation study sheds light on the dimensions of evoking natural interactions within cultural heritage environments, using micro-narratives for self-exploration and the understanding of cultural content, and the intersection between human-computer interaction and artificial intelligence within cultural heritage. Reefs and Edge's interactive installation art embodies a multi-dimensional experience, merging tangible user interface objects and combining environmental science and multiple art forms to explore coral reef ecosystems threatened by climate change [49]. They argue that using a tangible user interface in an installation-art setting can help engage and inform the public about crucial environmental issues. The use of reacTIVision allows the computer simulation to pinpoint the location, identity, and orientation of the objects on the table's surface in real time as the users move them.
Interface design is also a critical factor in structuring shared experiences. Fortin and Hennessy describe how electronic artists used cross-modal interfaces based on intuitive modes of interaction such as gesture, touch, and speech to design interactive installation art that engages people beyond the ubiquitous single-user "social cocooning" interaction scenario [50]. They explain that the multi-dimensional experience of interactive installation art is achieved through thoughtful interface design that encourages collaboration, play, and meaning. Diversifying interfaces in influential interactive installation art can enhance users' emotional interaction and create a more engaging atmosphere [51]. By incorporating facial affect detection technology and implementing a wide range of input and output interfaces, installation art can offer a diverse and immersive experience.
Body movement is essential in multi-dimensional interaction; the Microsoft Kinect and the Leap Motion Controller are examples of 3D vision sensors that can be used for body-motion interaction, combining gestures, speech, touch, and vision to create more natural and powerful interactive experiences. On-body tangible interaction involves augmented and mixed reality devices using the body as a physical support to constrain the movement of multiple-degrees-of-freedom devices (3D Mouse). The 3D Mouse offers enough degrees of freedom and accuracy to support the interaction. Using the body as a support for the interaction allows the user to move in their environment and avoid the fatigue of mid-air interactions [52].

Engagement
The user experience should be designed to create engagement and active participation opportunities to enhance the emotional response and overall user satisfaction. Meaningful engagement with art involves a fusion of cognitive and affective elements that go beyond passive observation and involve active involvement, interaction, and participation on the viewer's part. Interactive installation art, which responds to the actions and inputs of the viewer, creates a dynamic and reciprocal relationship between the artwork and the participant. This interactivity enables individuals to engage with the artwork on their terms, influencing and shaping the experience [53]. In interactive installation art, artificial intelligence technology is increasingly employed to create immersive experiences by analyzing participants' biometric data, such as heart rate and skin conductance.
Generally, engagement is combined with artificial intelligence to reflect people's participation. Artificial intelligence allows installation art to respond to biometric data dynamically. Analyzing participants' biometric data improves engagement, promoting stronger emotional and sensory connections and enriching the interactive experience. For example, Galvanic Skin Response sensors can measure the level of engagement of the audience and the presenter. The gathered data are then visualized in real time through a visualization projected onto a screen and a physical electro-mechanical installation, which changes the height of helium-filled balloons depending on the atmosphere in the auditorium [54], creating a tangible way of making the invisible visible. Artificial intelligence algorithms can adjust the experience in real time. AI algorithms analyze incoming sounds, identify patterns and characteristics, and then classify the sounds, distinguishing between musical instruments, voices, and other auditory elements, which link to corresponding faces or visual representations. AI algorithms dynamically adjust and generate visuals based on real-time audio input. "Learning to See" [14] uses machine-learning algorithms to analyze live video footage and create an abstract image representation in real time. Artificial intelligence algorithms enhance participant engagement and foster emotional resonance in interactive installation art. Personalized content and feedback can create a sense of ownership of the experience, enhancing the emotional response and user engagement [10]. Research shows that engagement is vital in eliciting an emotional response and creating meaningful connections between installation art and viewers.

Emotion Measurement in Interactive Installation Art
Emotion measurement refers to the process of quantitatively assessing and evaluating emotional experiences. It involves capturing and recording various indicators or signals that reflect an individual's emotional state, such as physiological responses (e.g., heart rate and skin conductance), facial expressions, vocal cues, self-reported ratings, or behavioral observations.
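Concretely, one measurement sample would bundle the indicator types listed above into a single record. The sketch below is purely illustrative (all field names and the 1-9 valence scale are assumptions, not drawn from any specific system in this review):

```python
# A hypothetical record combining physiological, facial, self-report,
# and behavioral channels into one emotion-measurement sample.

from dataclasses import dataclass

@dataclass
class EmotionSample:
    timestamp_s: float
    heart_rate_bpm: float         # physiological response
    skin_conductance_us: float    # physiological response (microsiemens)
    facial_expression: str        # label from a facial-expression model
    self_report_valence: int      # self-reported rating on a 1-9 scale
    notes: str = ""               # free-text behavioral observations

sample = EmotionSample(
    timestamp_s=12.5,
    heart_rate_bpm=88.0,
    skin_conductance_us=3.2,
    facial_expression="smile",
    self_report_valence=7,
)
print(sample.facial_expression, sample.self_report_valence)  # smile 7
```

A time series of such records is what the recognition and fusion techniques in the following subsections operate on.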

Emotion Recognition Technology
Different emotional responses can be elicited using different stimuli, emphasizing the potential of multimodal emotion recognition technology in interactive installation art [55]. Multiple sensing modalities give us a wealth of information to support interaction with the world and one another [56]. The multimodal fusion of AR with speech and hand gesture input enables users to interact with computers through various input modalities like speech, gesture, and eye gaze [57]. Multimodal design in Cangjie's Poetry creates an interactive art experience that combines different modes of communication, such as language, symbols, and visual elements. This approach allows the artwork to engage with audiences in multiple ways and create a more immersive and dynamic experience [58]. Combining multiple modalities through a multimodal approach is more effective than relying on a single modality (unimodal) for emotion recognition in artistic settings [10,58]. Thus, multimodal sentiment analysis harnesses the synergy and complementarity of diverse modal information to enrich emotional understanding and expression, using real-time vocal emotion recognition techniques to enhance audience engagement and creative expression in artistic installations [55]. The research involved collecting and analyzing the participants' vocal and physiological data, using machine-learning algorithms for emotion recognition, and implementing the system in two interactive installation art exhibits. The findings demonstrated that the system effectively enhanced audience engagement and creative expression.
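One common way to combine modalities, late fusion, can be sketched in a few lines. This is a minimal illustration under assumed labels and weights (the emotion set, the two modalities, and the equal weighting are all hypothetical): each modality's model outputs a probability distribution over emotions, and the distributions are combined before the final decision.

```python
# Toy late-fusion of per-modality emotion probability distributions.

EMOTIONS = ["joy", "sadness", "anger", "surprise"]

def fuse(modality_probs, weights):
    """Weighted average of per-modality distributions (late fusion)."""
    fused = [0.0] * len(EMOTIONS)
    for probs, w in zip(modality_probs, weights):
        for i, p in enumerate(probs):
            fused[i] += w * p
    return fused

# Hypothetical outputs from a vocal model and a facial-expression model.
voice = [0.10, 0.20, 0.60, 0.10]   # leans "anger"
face  = [0.15, 0.10, 0.55, 0.20]   # also leans "anger"
fused = fuse([voice, face], weights=[0.5, 0.5])
print(EMOTIONS[fused.index(max(fused))])  # anger
```

The complementarity argument from the text shows up here directly: when one modality is ambiguous or noisy, the weighted combination lets the other modalities dominate the final label.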
In affective computing, the amount of information and the dimensions of human emotion conveyed by each modality differ; affective computing should therefore draw on multiple dimensions wherever possible to compensate for any single, imperfect emotional channel. The emotional tendency is then judged from multiple results, so the effectiveness of this fusion must be studied. Currently, most innovations are based on multimodal emotion features and fusion algorithms to improve the accuracy of emotion classification, as shown in Table 2 [59]. As Table 2 indicates, most current research on emotion recognition starts from the perspective of digital signal analysis to explore the relationship between emotion and signal characteristics. With the help of deep-learning algorithms to extract emotional features, features learned from massive data can better reflect the inherent nature of the data than artificially constructed features, thus significantly improving the effect of emotion recognition [60]. Commonly used AI techniques for emotion classification, such as Support Vector Machines and the Naïve Bayes Classifier, can achieve more than 90% accuracy [61]. Using SVM and LDA classifiers, Bhardwaj (2015) classified emotions from EEG signals with average overall accuracies of 74.13% and 66.50%, respectively [62]. Mano proposed an ensemble model for emotion classification based on motor facial expressions that achieved greater accuracy than a single classifier [63].
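The "artificially constructed" features contrasted above with learned features are typically simple statistics computed over a window of a physiological signal. A minimal sketch (the feature set and toy samples are illustrative assumptions) of such hand-crafted feature extraction, whose output would then feed a classifier such as an SVM or LDA:

```python
# Hand-crafted statistical features over one physiological-signal window.

import statistics

def window_features(signal):
    """Compute simple statistics commonly used as emotion features."""
    diffs = [b - a for a, b in zip(signal, signal[1:])]
    return {
        "mean": statistics.fmean(signal),
        "std": statistics.pstdev(signal),
        "min": min(signal),
        "max": max(signal),
        # Mean absolute first difference, a rough measure of signal activity.
        "mean_abs_diff": statistics.fmean(abs(d) for d in diffs),
    }

eeg_window = [1.0, 1.4, 0.9, 1.2, 1.1]   # toy EEG samples
feats = window_features(eeg_window)
print(round(feats["mean"], 2))  # 1.12
```

Deep-learning approaches replace this fixed feature list with representations learned end-to-end from raw signals, which is the shift the surveyed work attributes the accuracy gains to.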
Therefore, emotion recognition technology can be used to identify and analyze people's emotions, helping us better understand how people feel and what they need. This information can inform more effective policies and programs that meet people's needs and improve well-being around the world, making emotion recognition a powerful tool for sustainable development.

Emotion Model
Multimodal fusion helps explore validity and develop effective models that enable more accurate assessment of emotional states by applying various techniques, including artificial intelligence and machine-learning algorithms, to extract meaningful information from the collected data. The most widely used emotion models are discrete and dimensional [64]; researchers try to quantify emotion and convert it into objectively representable data to advance research on human-computer interaction and emotional experience [65].
Cooney focused on selecting four representative discrete emotions, with one emotion chosen from each quadrant in the valence-arousal space, aiming to capture a comprehensive range of emotional experiences and provide a foundation for further investigation [66]. Gilroy et al. described a method for developing a multimodal affective interface to analyze user experience in real time as part of an augmented reality art installation. The system uses a PAD dimensional model of emotion, which allows the fusion of affective modalities; each input modality is represented as a PAD vector, supporting the representation of emotional responses associated with aesthetic impressions [67]. Işik and Güven utilized various signal-processing methods, feature extraction techniques, and artificial intelligence methods to classify emotions from physiological signals [68]. Their analysis is based on the DEAP dataset and involves calculating statistical properties of physiological signals; the obtained data are used for emotion classification. Nasoz focused on implementing artificial intelligence algorithms to analyze the physiological signals associated with emotions by eliciting six emotions (sadness, anger, surprise, fear, frustration, and amusement) from participants via multimodal input and measuring their physiological signals [69]. The algorithms map physiological signals to specific emotions for increased accuracy, including facial expression recognition, vocal intonation recognition, and natural language understanding. Table 3 lists the emotion recognition technology applied to physiological signals in interactive installation art, classified by research dimensions, research methods, research questions, and research findings. Most research has focused on developing emotion recognition systems that use sensors such as EEG, EKG, facial expressions, and speech recognition to recognize user emotions in real time. The research methods in these articles include signal-processing techniques, machine-learning algorithms, and data analysis tools to develop and validate emotion recognition models, with data sources drawn from different combinations, such as wearable devices and virtual reality environments with interactive installation art. These studies also explored different dimensions: user experience, emotional responses, and social behavior. The findings suggest that emotion recognition technology has the potential to enhance user engagement, improve human-computer interaction, and provide valuable insights into users' emotional responses in interactive installation art.
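As a minimal illustration of the dimensional approach, the sketch below fuses per-modality estimates expressed as pleasure-arousal-dominance (PAD) vectors by confidence-weighted averaging. This is a simplified sketch for exposition, not Gilroy et al.'s actual fusion method; the modality names, example values, and weighting scheme are invented assumptions.

```python
from dataclasses import dataclass


@dataclass
class PAD:
    """A point in the pleasure-arousal-dominance emotion space."""
    pleasure: float
    arousal: float
    dominance: float


def fuse_pad(estimates):
    """Confidence-weighted average of per-modality PAD vectors.

    estimates: list of (PAD, confidence) pairs with confidence >= 0.
    """
    total = sum(w for _, w in estimates)
    if total == 0:
        raise ValueError("at least one modality needs non-zero confidence")
    return PAD(
        pleasure=sum(v.pleasure * w for v, w in estimates) / total,
        arousal=sum(v.arousal * w for v, w in estimates) / total,
        dominance=sum(v.dominance * w for v, w in estimates) / total,
    )


# Hypothetical readings from two channels of an installation.
fused = fuse_pad([
    (PAD(0.8, 0.4, 0.1), 0.6),  # facial expression: pleasant, moderate arousal
    (PAD(0.2, 0.8, 0.3), 0.4),  # body movement: high arousal
])
print(fused)
```

Representing every modality in the same continuous space is what makes this kind of fusion straightforward: disagreeing channels are averaged rather than forced into a single discrete label.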

Emotion Recognition Models
Emotion recognition models are machine-learning techniques that recognize emotions from various sources, such as text, facial expressions, and physiological signals. Current machine-learning technology for emotion recognition can be divided into two categories: classical machine-learning methods and deep-learning methods. The classifier classifies input signals and outputs the corresponding emotion category [81].
(1) Machine-Learning Approaches: Machine-learning algorithms have been widely used in emotion recognition tasks. These approaches involve training models on labeled datasets to learn patterns and features that distinguish different emotional states. Researchers have explored the performance and accuracy of machine-learning models for emotion recognition in different contexts, including installation art. They have examined the effectiveness of these models in capturing and analyzing viewers' emotional responses.
Dominguez-Jimenez proposes a machine-learning model for emotion recognition from physiological signals, using the Bagged Trees algorithm to recognize facial emotions from RGB data collected with an RGB HD camera [82]. Ratliff discusses a framework for recognizing emotions from facial expressions through machine-learning techniques, based on still images of the face and active appearance models (AAMs) [83]. The AAM is trained on face images from a publicly available database to capture the facial structures important for expression identification, and the classification scheme can successfully identify faces expressing the six universal emotions. Teng proposes a scheme that applies emotion classification using Support Vector Machines (SVMs), a popular machine-learning tool for classification, regression, and novelty detection [84]. The proposed system recognizes emotions from sentences entered at the keyboard; training and testing sets are constructed to verify the model's effectiveness.
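A sentence-level scheme in the spirit of Teng's can be sketched with a bag-of-words SVM. The tiny training corpus and the two-way happy/sad label set below are illustrative assumptions, not Teng's actual data or categories.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Toy labeled corpus (invented for illustration).
sentences = [
    "I am so happy and joyful today",
    "what a wonderful and delightful day",
    "this makes me cheerful and glad",
    "I feel sad and gloomy",
    "a miserable and depressing evening",
    "this news makes me unhappy and sorrowful",
]
labels = ["happy", "happy", "happy", "sad", "sad", "sad"]

# TF-IDF features feeding a linear SVM classifier.
model = make_pipeline(TfidfVectorizer(), LinearSVC())
model.fit(sentences, labels)

# Classify a new sentence, as if typed at the keyboard.
pred = model.predict(["such a joyful and wonderful day"])[0]
print(pred)
```

With a realistic corpus, the pipeline stays the same; only the training data and label inventory change, which is why SVM-based text emotion classifiers remain a common baseline.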
(2) Deep-Learning Models: Deep learning is a subfield of machine learning. Deep-learning models, such as convolutional neural networks (CNNs) and recurrent neural networks (RNNs), can automatically extract complex features from raw data and learn hierarchical representations of emotions. These models have improved performance in various domains, including image and speech recognition, and can therefore be applied to analyze emotional responses based on different modalities. For example, CNNs can analyze visual data, and RNNs can process physiological signals to detect patterns and variations in emotional responses. By leveraging deep-learning models, researchers can gain insights into the emotional experiences evoked by installation art and uncover the nuanced relationships between different design elements and emotional responses.
Hossain and Muhammad propose a system that can recognize emotions from speech and video signals [85]. The system uses deep learning to analyze the signals and extract emotion-related features, then combines the features from the speech and video signals to decide on the emotion being expressed. The proposed system has been tested using two large databases of emotional speech and video, and the results show that it is effective at recognizing emotions. Teo, Chia, and Lee discuss using deep-learning models to recognize emotions in music. The authors conducted experiments to improve the accuracy of these models by adjusting various parameters, such as the number of hidden layers, the number of neurons in each layer, and the regularization techniques used [86]. They found that tuning these parameters achieved a prediction accuracy of 61.7%, an improvement of more than 15% over previous methods. This research could help build music emotion recommendation systems that suggest songs based on user emotions. Tashu et al. propose a deep-learning architecture for multimodal emotion recognition from art. The proposed architecture uses feature-level and modality attention to classify emotions in art [87]. The model is trained on the WikiArt emotion dataset, and the experimental results show the efficiency of the proposed approach. Liu proposes a multimodal deep-learning approach to construct affective models from multiple physiological signals, aiming to enhance the performance of affective models and reduce the cost of acquiring physiological signals for real-world applications [48].
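Decision-level fusion is one common way to combine speech and video channels of the kind described above: each modality's class probabilities are computed separately and then averaged. The sketch below shows the idea with invented logits; a real system would obtain these scores from trained networks, not hard-coded numbers.

```python
import numpy as np

EMOTIONS = ["anger", "happiness", "sadness", "surprise"]


def softmax(z):
    """Convert raw scores into a probability distribution."""
    e = np.exp(z - z.max())
    return e / e.sum()


# Invented per-modality scores: the speech channel mildly favors
# happiness, while the video channel strongly favors sadness.
speech_logits = np.array([0.2, 1.5, 0.1, 0.5])
video_logits = np.array([0.1, 0.4, 2.5, 0.3])

# Score-level (late) fusion: average the two modality distributions.
fused = 0.5 * softmax(speech_logits) + 0.5 * softmax(video_logits)
predicted = EMOTIONS[int(np.argmax(fused))]
print(predicted)
```

Here the more confident video channel dominates the fused decision, which is exactly the behavior late fusion is meant to provide: a weakly expressed cue in one modality is outweighed by strong evidence in another.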
(3) Hybrid Models: Hybrid models combine multiple modalities or integrate machine-learning and deep-learning techniques to enhance emotion recognition performance. These models aim to leverage the complementary information provided by different modalities to improve accuracy and robustness. In the context of installation art, hybrid models can be applied to analyze emotional responses using multiple data sources, such as facial expressions, physiological signals, and textual data from viewer feedback. By integrating information from different modalities, researchers can better understand viewers' emotional experiences and capture a broader range of emotional states.
Verma proposes a hybrid deep-learning model for emotion recognition from facial expressions [88]. The method uses two stages of a convolutional neural network (CNN) to detect primary (happy and sad) and secondary (surprised and angry) emotions in images containing human faces. The proposed model, trained on two facial expression datasets, achieves high accuracies and can be extended to classify primary and secondary emotions in real-time video data and images. Atanassov considers a hybrid multimodal model for improving human emotion recognition based on facial expression and body gesture recognition [89]. The study extends previous investigations on using pre-trained deep neural network (DNN) models for facial emotion recognition (FER) by adding emotions extracted from body language. A second DNN model is developed and trained with specific datasets to extract emotions from upper-body gestures. Combining both models' information about recognized emotions is more accurate and can be used in education, medicine, psychology, product advertisement, marketing, and human-machine interfaces. Yaddaden proposes a hybrid approach to emotion recognition through facial expressions, utilizing both geometric and appearance-based features to provide specific information about the six basic emotions [90]. The proposed approach uses a multi-class Support Vector Machine architecture for classification and the Randomized Trees feature selection technique. Experimental results demonstrate the effectiveness of the proposed approach, yielding high accuracy rates on three benchmark facial expression datasets. Padhy proposes a model that uses machine-learning and deep-learning techniques for emotion recognition [91]. The model uses image processing and feature extraction methods for machine learning and a neural-network-based solution for deep learning; the neural network classifies universal emotions such as sadness, anger, happiness, disgust, fear, and surprise.
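The two-stage idea behind hybrid models such as Verma's can be sketched as a coarse valence classifier followed by per-valence emotion classifiers. The 2-D synthetic features, cluster placement, and decision-tree stages below are assumptions made for illustration; they stand in for the CNN stages of the actual systems.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic 2-D stand-ins for facial-expression features: four
# well-separated clusters, one per emotion (placement is invented).
def cluster(center, n=50):
    return rng.normal(loc=center, scale=0.3, size=(n, 2))

X = np.vstack([cluster((0, 0)), cluster((0, 2)), cluster((3, 0)), cluster((3, 2))])
emotions = np.array(["happy"] * 50 + ["surprised"] * 50 + ["sad"] * 50 + ["angry"] * 50)
valence = np.where(np.isin(emotions, ["happy", "surprised"]), "positive", "negative")

# Stage 1 predicts coarse valence; stage 2 refines it into an emotion.
stage1 = DecisionTreeClassifier(random_state=0).fit(X, valence)
stage2 = {
    v: DecisionTreeClassifier(random_state=0).fit(X[valence == v], emotions[valence == v])
    for v in ("positive", "negative")
}


def predict_emotion(sample):
    v = stage1.predict(sample.reshape(1, -1))[0]
    return stage2[v].predict(sample.reshape(1, -1))[0]


print(predict_emotion(np.array([0.1, 1.9])))  # sample near the "surprised" cluster
```

The appeal of the cascade is that each stage solves an easier sub-problem than a single flat classifier would, which is one reason hybrid pipelines often report accuracy gains over monolithic models.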
The use of artificial intelligence (AI) in emotion recognition systems has witnessed significant advancements, leveraging technologies such as machine learning, deep learning, and computer vision. These systems analyze and classify human emotions based on facial features and other patterns, utilizing emotion detection methods and models that encompass various modalities, including language, sound, images, videos, and physiological signals. The studies discussed in this review showcase the effectiveness of AI-assisted emotion recognition models in providing reliable and objective solutions for understanding and predicting human emotional states. These findings highlight the potential of AI as a sustainable development tool for emotion recognition, offering promising applications in diverse fields.

Discussion
The rapid progress in science and technology has led to various sensory technologies that stimulate multiple senses, encompassing vision, hearing, touch, smell, and taste [41]. While previous research has delved into sensory stimuli and their corresponding emotional responses, there has been a predominant focus on individual sensory modalities or specific stimuli.
Consequently, there is an escalating recognition of the imperative shift towards a more comprehensive approach in the design of multi-sensory experiences within the realm of human-computer interaction. This necessitates the development of a holistic understanding of how diverse sensory inputs collectively influence emotional responses in interactive installation art. Despite acknowledging the significance of multi-dimensional interactions within interactive installation art, a notable research gap persists in discerning their precise relationship with emotional responses. There is a critical need to unravel how distinct combinations and variations of these interactions elicit specific emotional responses, marking an imperative area for future investigation.
While prior studies have made significant strides in exploring participant engagement within interactive environments, a broader and more nuanced inquiry is warranted. This inquiry should explicitly examine the intricate interplay between participant engagement and the ensuing emotional responses. Further scrutiny is essential to comprehend how participant engagement influences emotional experiences, encompassing elements of immersion, interactivity, and the establishment of meaningful connections with the stimuli and experimental environment.
Multimodal research emerges as a valuable instrument for scrutinizing the impacts of interactive installation art under the influence of multi-sensory stimuli on audience engagement and creative expression. The amalgamation of multimodal data sources holds the potential to heighten audience engagement and amplify creative expression.
Collectively, the literature underscores the integration of emotion recognition models in interactive installation art to enhance the viewer's experience and engagement with the artwork. Furthermore, it emphasizes the pivotal role of multimodal fusion in validating and refining emotional models, ultimately enhancing the precision of assessments of emotional states. In contrast, conventional measures of emotional response to interactive devices may benefit from enhancements in effectively gauging their efficacy. Consequently, multimodal fusion is indispensable for comprehending and dissecting emotional responses in this context.
The insights derived from these studies posit that emotion recognition technology harbors the potential to elevate user engagement, refine human-computer interaction, and provide invaluable insights into users' emotional responses within interactive installation art. Additionally, research findings unveil the advantages of deploying emotion recognition technology in capturing users' emotional states with greater accuracy and in real time, surpassing traditional methods such as self-reporting. This technological advancement further paves the way for personalized experiences, adaptive interfaces, and interactive systems adept at real-time responsive modulation based on user emotions. Notably, these models offer valuable insights into the emotional resonance of installation art, shedding light on how distinct design elements and stimuli influence viewers' emotional responses.

Framework and Hypothesis
This research aims to investigate the emotional responses evoked during interactive installation art. It focuses on capturing explicit and implicit data between participants and the artwork, utilizing artificial intelligence algorithms for emotional measurement. We propose a theoretical framework to analyze the emotional response process in interactive installation art, as shown in Figure 1. The framework considers measuring the emotional responses experienced by participants across the different stimulation modes offered by the installation art. This process involves a complex interplay between attention, behavior, cognition, and emotion [92]. While attention and behavior are considered explicit data, cognition and emotion are categorized as implicit data. In interactive settings, emotional responses can only be fully understood by considering the combination of explicit and implicit data. Each dimension of attention, behavior, cognition, and emotion can be measured through physiological signals, including brain activity, heart rate variability, skin conductance, facial expressions, and eye tracking. Artificial intelligence algorithms can be employed to analyze and interpret these physiological signals, providing valuable insights into participants' emotional experiences during their interactions with the installation.
Based on the analysis of the interaction process in the above figure, we propose a conceptual framework to better explore the factors that affect emotional responses, as shown in Figure 2. The independent variable is the installation art, and the dependent variables are the breadth, intensity, and diversity of the emotional reactions exhibited by participants during their interactions with AI-integrated interactive installation art.
Several mediator variables are considered in this study. Firstly, sensory stimulation encompasses various sensory inputs, including visual, auditory, and tactile elements, which play a pivotal role in mediating participants' emotional responses. Secondly, the "interaction dimensions" variable examines the influence of one-dimensional (traditional) interactions versus multi-dimensional interactions involving physical touch, movement, and gestures. These dimensions significantly shape the emotional experiences of participants. Additionally, engagement within contextual factors such as virtual reality (VR), augmented reality (AR), or mixed reality (MR) serves as another mediating factor. The depth of immersion experienced by participants within these virtual realms directly affects their emotional engagement and responses. Artificial intelligence (AI) algorithms are considered moderator variables; these algorithms determine how the AI technology operates within the interactive installation art, thus potentially influencing the overall emotional impact on participants. This research seeks to provide a comprehensive understanding of the intricate interplay between these variables, shedding light on emotional responses within the context of AI-integrated interactive installation art.
From an ontological perspective, integrating interactivity into artistic creation brings forth novel dimensions and complexities. This research examines the emotional responses to interactive installation art under artificial intelligence (AI), with the objective of understanding the nature of the emotional experiences triggered by multi-sensory stimuli and multi-dimensional interactions during meaningful engagement. This study investigates diverse interactive stimulation modes to capture participants' emotional responses while leveraging emotion recognition techniques to optimize the rationality and efficacy of the emotional stimulation settings. Consequently, the research hypotheses are as follows:

Hypothesis 1 (H1):
Emotional responses triggered by multi-sensory stimuli will exhibit a high degree of consistency across diverse participants, indicating the reliability of specific sensory inputs in evoking standardized emotional reactions within the context of interactive installation art.

Hypothesis 2 (H2):
Multi-dimensional interactions with AI-integrated installation art will elicit a more diverse range of emotional experiences compared to traditional one-dimensional interactions, highlighting the transformative impact of multi-faceted engagement on the depth and variety of emotional responses.

Hypothesis 3 (H3):
There is a significant positive correlation between the level of engagement experienced within virtual reality (VR) environments and the intensity of the emotional responses elicited during interactions. Specifically, we predict that higher levels of engagement in VR environments will lead to more intense emotional reactions among participants.

Hypothesis 4 (H4):
The utilization of artificial intelligence algorithms will demonstrate a high level of efficacy in analyzing physiological signals, such as EEG and facial expressions, thereby enabling the quantification of emotional responses with a significant degree of accuracy within the context of interactive installation art.

To test the research hypotheses and achieve the outlined research objectives, we employ the research methodology of the ASSURE model combined with experimental design [93]. The ASSURE model is a widely recognized instructional design model that stands for analyze learners, state objectives, select media and materials, utilize media and materials, require learner participation, and evaluate and revise. In the research context, the ASSURE model provides a structured approach to ensure the effective implementation and evaluation of the experimental design.

Conclusions
This paper comprehensively explores emotional response and measurement within interactive installation art, primarily focusing on integrating artificial intelligence. While the review underscores the pivotal role played by sensory stimulation, multi-dimensional interactions, and engagement in eliciting profound emotional responses, it is imperative to further elaborate on the practical significance of these findings and their potential impact on art and design.
We introduce the process analysis of the emotional response to interactive installation art to address identified research gaps. This framework allows for a deeper examination of the variables influencing emotional responses, providing valuable insights into the underlying mechanisms driving emotional experiences in this context. By formulating hypotheses, we aim to establish predictive relationships between sensory stimulation, multi-dimensional interactions, engagement, and emotional responses. This step is crucial in bridging the research gaps we have identified.
To effectively test our hypotheses and accomplish our research objectives, we have selected the ASSURE model in conjunction with experimental design as our methodology. The ASSURE model's structured approach to instructional design aligns seamlessly with our research context, ensuring our experimental study's systematic and comprehensive implementation. This approach significantly facilitates the exploration of the emotional impact of sensory stimuli, multi-dimensional interactions, and engagement in interactive installation art. Nevertheless, it is essential to acknowledge inherent limitations:

Situational limitations: This research focuses on a specific interactive installation in the laboratory and may not cover all possible design, technical, and environmental variations. As such, the findings may be limited to the devices and settings investigated and may not fully capture real-world scenarios and natural behaviors.
Interpretation and subjectivity: While AI can assist in measuring emotional responses, interpreting those emotions and their artistic significance remains subjective. Emotional experiences are deeply personal, and different viewers may have varying interpretations and reactions to the same artwork. AI can provide quantitative data but may not capture the full richness and complexity of emotional experiences in art.
Technical limitations: The effectiveness of AI in measuring and enhancing emotional responses depends on the available technology. The accuracy and reliability of emotion recognition algorithms, the quality of data input (e.g., sensor accuracy), and the computational resources required can pose technical limitations. Researchers must be aware of these limitations and consider their impact on this study's findings.
Despite these limitations, this study represents a significant stride towards comprehending emotional engagement within interactive installation art and offers valuable insights into the convergence of interactive art and artificial intelligence. By emphasizing the pivotal role emotions play in AI-enhanced interactive installation art, this research contributes substantially to the well-being and mental health of individuals who engage with these installations. Emotionally resonant artworks are designed to profoundly enhance participants' overall quality of life and mental well-being, harmonizing with sustainable development goals focused on societal welfare.
In summation, this paper establishes a foundational platform for future research in emotional response and measurement in interactive installation art under artificial intelligence. It holds significant implications for practically applying these findings in art and design. By integrating theoretical frameworks, hypotheses, and the ASSURE model, our objective is to advance our understanding of the emotional experiences evoked by interactive artworks and make substantial contributions to the broader field of human-computer interaction. This research could shape the creation of emotionally resonant interactive installations in a manner that significantly impacts the art and design landscape. The careful fusion of artistry and technology heralds a promising future where the boundaries between human emotion and artificial intelligence converge, opening new horizons in interactive art.

Figure 1 .
Figure 1. Process analysis of emotional response to interactive installation art.

Figure 2 .
Figure 2. The conceptual framework of measuring emotional responses in the interactive exhibit.


Table 1 .
Comparative Analysis of Interactive Installation Art in the United States, United Kingdom, Germany, and China.
2019-2020; "Janet Cardiff and George Bures Miller: Lost in the Memory Palace" (Museum of Contemporary Art, Chicago, 2013); Rafael Lozano-Hemmer (themes: perception, deception, and surveillance; response: curiosity and intrigue), "Rafael Lozano-Hemmer: Common Measures" (Crystal Bridges Museum of American Art, 2022).

Table 2 .
Schemes and Effects of Multimodal Recognition with Different Mixed Modes.

Table 3 .
Application of Emotion Recognition Technology in Interactive Installation Art.