Review

Smart Textile Design: A Systematic Review of Materials and Technologies for Textile Interaction and User Experience Evaluation Methods

Centre for Textile Science and Technology (2C2T), Department of Textile Engineering, University of Minho, 4800-058 Guimarães, Portugal
* Author to whom correspondence should be addressed.
Technologies 2025, 13(6), 251; https://doi.org/10.3390/technologies13060251
Submission received: 2 May 2025 / Revised: 6 June 2025 / Accepted: 10 June 2025 / Published: 13 June 2025
(This article belongs to the Section Information and Communication Technologies)

Abstract

Creating meaningful interactions using smart textiles involves both a comprehensive understanding of relevant materials and technologies (M&T) and how users engage with this type of interface. Despite its relevance to design research, user experience (UX) evaluation remains limited within the smart textile field. This research aims to systematize information regarding the main M&T used in recent smart textile design research and the evaluation methods (EMs) employed to assess the UX. For this purpose, a systematic literature review was conducted in the Scopus database. The search covered the period from 2018 to 2025 and yielded a total of 232 results. Of these, 56 full papers in English, available on the internet, and focusing on experimental research on smart textile interaction and experience evaluation were included. This review identifies the prevalent use of electronic components and conductive materials, emphasizing the importance of selecting materials that enable sensing, actuation, communication, and processing capabilities. UX evaluation focused on the pragmatic dimension, whereas the combination with the hedonic dimension was generally regarded as future work. The study led to the proposal of four key topics to support the creation of meaningful interactions and highlights the need for further research on evaluating users’ emotional experiences with smart textiles.

1. Introduction

Smart textiles are textile artifacts designed to feel and interact with—by sensing, reacting, or adapting to—environmental conditions or external stimuli. Such textile interfaces exhibit dynamic behavior over time and may extend the functionalities and uses of common textile products. The importance of smart textiles lies in their capacity to enable responsive and expressive functionalities through human–technology interaction mediated by textile interfaces. These capabilities allow them to support advanced applications in health and well-being, enhance safety and protection in critical environments, and promote sustainable and inclusive innovation, among a wide range of potential applications [1,2,3,4,5].
Smart textile types and properties depend on the selection of materials and technologies [3], which can be incorporated at the fiber, yarn, fabric, or finishing level [4]. Materials such as optical fibers, shape memory materials (SMMs), color change materials (CCMs), conductive and piezoelectric materials, hydrogels, and nanomaterials are employed to enable interactive and smart behaviors in textiles [3,6]. Electronic textiles (e-textiles) are a type of smart textile comprising flexible and conformable systems capable of sensing, actuating, and generating and storing energy, forming part of an interactive and interconnected communication network [7]. They contain electronic components, including sensors, actuators, interconnects, processing units, and power supplies, which can also be integrated at multiple scales [8,9]. In e-textiles, conductive materials can be used to build textile-based sensors, actuators, and power components [5,8]. The interactive behavior of smart textiles can be triggered by stimuli from mechanical, electrical, thermal, chemical, magnetic, or other sources, which are transformed into a reaction. This transformation relies on the transduction relationship between inputs (such as body movement, voice, vital signs, touch, or temperature) and outputs (such as movement, sound, light, smell, image, animation, or video) [3,10,11,12,13]. The wide diversity of input and output possibilities opens up rich spaces of interaction between a smart textile artifact and a human.
Smart textile interaction means the sensorial relation between a smart and dynamic textile interface and a user [14,15,16,17,18]. From the user perspective, these fundamentally embodied exchanges can be performed on a certain part of the body (touch, gesture-based, physical, and bodily-based interaction) or with the entire body (somaesthetic interaction) [19,20]. Furthermore, they can elicit emotions when the experience reaches symbolic and meaningful levels for an individual [21,22]. This complex phenomenon involves cognitive, physiological, and behavioral components, with an embodied nature connected to bodily sensations and physiological responses [22]. Medeiros proposed the concept of “meaningful interaction” as the exploration of the semantic dimension of products [23]. Not limited to the tangibility of the object itself, meanings can be incorporated by the designer and by the user and can also emerge during the interaction process [23]. This process involves the dynamic relationship between the user, the context, and the artifact, which is fundamental in the formation of the symbolic values of the artifacts [23]. Recent paradigms of human–computer interaction (HCI), a discipline related to e-textiles as computational devices, have begun to incorporate new aspects of human existence, including emotions and experience [24].
User experience (UX) is defined by the International Organization for Standardization (ISO) as user perceptions and responses that happen before, during, or after using a product, service, or system [25]. Following the UX model proposed by Hassenzahl, the experience can be approached through the pragmatic and hedonic attributes of a product [26]. Pragmatic attributes are related to perceived instrumental qualities that influence task fulfillment by using an artifact and its ease of use [26]. Product features such as effectiveness, efficiency, usability, and controllability to manipulate the environment are included in this dimension [26,27]. The hedonic dimension emphasizes the emotional or affective aspect that the artifact causes in the user—for example, providing pleasure, satisfaction, and appealingness—through non-instrumental qualities such as visual aesthetics, symbolism, attractiveness, and social experience/status [26,27].
Regarding smart textiles, the experience involves the user or users and the textile interface in a determined context, as presented in Figure 1. Within this domain, the pragmatic dimension comprises attributes such as wearability, usability, and usefulness, whereas the hedonic dimension pertains to visual aesthetic, social experience, and emotions [28]. These features may influence affection and emotion, directly shaping the user’s psychological state during interactions with smart textile artifacts. Together, textile interfaces and technological systems should be intuitive and aligned with emotional triggers to attract a broader audience [29]. As a strategy, emotional design aims to influence the UX, whether through altering the visual appeal or interface of an object or by encouraging smooth and captivating interactions [30]. For instance, gamification within smart textiles has been demonstrated to motivate, support, and empower users in achieving health goals [31,32].
Assessing the experience is essential to analyze product–user interactions, and a range of methodological approaches can be employed for this purpose. A classification of UX evaluation methods by Kieffer et al. divides those that involve users into attitudinal and behavioral/physiological [35]. Attitudinal methods aim at collecting self-reported data concerning users’ feelings (e.g., group interview, survey, think-aloud) [35]. Behavioral methods concentrate on observing or measuring what users do and/or their physiological conditions (e.g., experiment, observation, simulation, instrument-based experiment) [35]. Furthermore, to measure UX, whether for a final prototype or to assist iterative cycles in a user-centered design process, it is relevant to consider both pragmatic and hedonic dimensions [35].
For the evaluation of pragmatic attributes, there are established testing procedures and standards that can be used and/or adapted for specific properties, such as washability, electrical resistance, and reliability. Regarding the hedonic dimension, dedicated measures are needed to understand evoked emotions, since they raise subjective questions [36]. Quantitative methods addressing the hedonic dimension have merged knowledge from neuroscience and affective computing, helping designers obtain relevant data about users’ needs and expectations [37]. In-depth research on users’ emotional experience in interactive fashion design, an interdisciplinary field involving design, sociology, psychology, and information science, among others, is recent and still lacks a theoretical framework [38].
In order to assist designers and researchers in making informed decisions through a holistic approach to smart textile design, this paper presents a systematic literature review of recent research focusing on experimental projects that address the interaction between users and smart textiles. To this end, the objective of the review was to systematize information on the materials and technologies (M&T) used to enable textile interaction, as well as the evaluation methods (EMs) employed to assess the UX.
As a result, a predominant integration of electronic components and conductive materials in textiles was identified, particularly through the use of conductive threads and yarns. The analysis highlights that material selection and combination need to fulfill sensing, actuation, communication, and processing capabilities. Qualitative evaluation methods were most commonly used to assess the UX, with observation and interviews being the most prominent. Nevertheless, increased use of quantitative methods was also observed, particularly through closed-ended questionnaires. The UX evaluation concerns mostly the pragmatic dimension, whereas its combination with the hedonic dimension is generally regarded as future work.
This review contributes an analysis of the main M&T and EMs applied in recent smart textile design research and proposes four key topics to support the creation of meaningful interactions: single and multisensory interactions, input and output systems, UX evaluations, and pragmatic and hedonic dimensions. The study concludes by emphasizing the need for further research into the evaluation of users’ emotional experience with smart textiles.

2. Materials and Methods

The systematic literature review conducted follows a precise, methodical process that consists of identifying, selecting, and evaluating relevant research, analyzing and synthesizing collected data, and finally summarizing evidence that enables clear inferences [8,39,40]. This work follows PRISMA guidelines [41]. Although no protocol was registered, the structure of the review remained aligned with established methodological standards. The research question defined was as follows: What are the main materials and technologies used in recent smart textile research that enable textile interaction, and what evaluation methods were employed to assess the UX?
To address the stated question, the search employed the terms “textile”, “experience”, and “interacti*”, the latter being a truncation to capture variations of the root word, such as “interaction” and “interactive”. The Scopus database was chosen for this review because it presents a refined selection of indexed records from relevant, peer-reviewed journals. The search was performed on 5 February 2025 with the following query: ((TITLE-ABS-KEY (“textile”)) AND ((TITLE-ABS-KEY (“interacti*”)) AND (TITLE-ABS-KEY (“experience”)))), to retrieve research containing these terms in the title, abstract, or keywords. Limiting the results to records published between 2018 and 2025 yielded 232 recent papers in the identification phase.
In order to obtain sufficient and accessible information to answer the proposed question, the inclusion criteria covered only full-text papers in English, excluding records corresponding to full conference proceedings (rather than individual proceedings papers), extended abstracts, posters, reviews, and reports. These conditions led to the exclusion of 32 records between the identification and screening phases.
The abstracts and keywords of the remaining 200 papers were screened in detail by one of the authors. Considering the study’s objective of reviewing experimental projects from the original source rather than from secondary interpretations—both regarding the materials and technologies used in textile and fashion design research and the tools for evaluating user experience—the exclusion criteria were defined accordingly. Papers that did not address smart textile interaction or did not comprise experimental research presenting at least one project of the authors’ own were excluded. To avoid bias, cases of doubt and disagreement were resolved in discussion sessions with the other two authors. In the screening phase, 138 records were removed, leaving 62 papers to be evaluated.
During the eligibility assessment, the lack of free and full online access to the studies was applied as an exclusion criterion, leading to the removal of 6 records. The remaining 56 papers were thoroughly read by each of the three authors. Together, they considered all 56 records highly relevant to the research topic and selected them for inclusion in the review (Figure 2). Relevant data from each of the 56 papers were analyzed and extracted by the first author using a standard spreadsheet. Synthesized information in the table was then revised by the other two authors independently, and all authors discussed it together, as presented in Section 3.

3. Results

To provide an expanded view of the results, a table containing a summarized description of each paper is provided (Appendix A). The papers included in the review were coded from P1 to P56, ordered from the most recent entries and then alphabetically, to facilitate referencing of the results. Data regarding the main focus, M&T, EMs, and main findings (MFs) of each study were systematized. Specifications, models, and suppliers reported by the respective authors were identified, with brief explanations of how materials and components were integrated for prototype construction. EMs focus on how user tests were conducted in each study, considering the methods and techniques used to evaluate the experience. Since embodied interaction has been demonstrated to further expand the capabilities of smart textiles and improve the emotional experience of users [42], this review particularly addresses the evaluation of the interaction between the user and the textile prototype(s). MFs list the key results of interest to this systematic review.
A wide diversity of focus regarding smart textile interaction was observed among the papers, which pursue single or multiple goals from the UX perspective. This study highlights those that investigate the relationship between the experience and the dimensions of users’ responses: emotional and physical/sensory. Although the senses are inseparably intertwined with emotions [43], they are primarily related to a bodily and physical perceptual system [44]. For this reason, all papers present a physical/sensory dimension, and some of them also explore an emotional dimension. After a detailed analysis, the papers were clustered within two main dimensions, as follows:
Within the physical/sensory dimension:
  • Single-sensory: audio (P39; P53); visual (P24; P32; P35; P50; P54); touch/haptic (P2; P14; P17; P23; P25; P31; P40; P41); gesture (P21; P55; P56);
  • Multisensory (P1; P3; P4; P5; P6; P7; P8; P9; P10; P11; P12; P13; P15; P16; P18; P19; P20; P22; P26; P27; P28; P29; P30; P33; P34; P36; P37; P38; P42; P43; P44; P45; P46; P47; P51; P52);
  • Technical performance (P1; P2; P8; P9; P12; P14; P15; P17; P23; P24; P28; P31; P36; P37; P42; P44; P48; P49; P55; P56).
Within the emotional dimension:
  • Feelings and emotions representation (P5; P19; P25; P32);
  • Social engagement (P3; P9; P13; P16; P29; P38; P46; P50);
  • Emotion and sensory regulation (P4; P26; P29);
  • Self- and surrounding-perception alteration (P4; P5; P18; P35; P40; P41; P45);
  • Identity and self-expression construction and communication (P51; P53);
  • Emotional responses to interactive behavior (P4; P7; P11; P17; P20; P26; P29; P32; P40; P41).
Most research featured projects that promoted multisensory interaction. Among papers that addressed single-sensory interaction, the most mentioned senses were touch and visual. Affective aspects were addressed by 34 of the 56 articles included in this review. Among the issues studied, the majority were concerned with emotional responses to the textile interactive behavior, social engagement, and the user’s self and surrounding perception. Some papers also mentioned design processes as the research focus, describing the smart textile design and fabrication methodologies.

3.1. Materials and Technologies

Diverse M&T were used in the development of the prototypes, encompassing the integration of smart and conductive materials and electronic components at different levels, from product to fiber. In 50 of the analyzed works, conductive materials and electronic components were integrated at the product level (P2; P3; P4; P5; P6; P7; P8; P9; P10; P11; P12; P13; P15; P16; P17; P18; P19; P20; P21; P22; P24; P25; P26; P28; P29; P30; P31; P32; P33; P34; P35; P36; P37; P38; P39; P40; P41; P42; P43; P45; P46; P47; P48; P49; P50; P51; P52; P53; P54; P56). Integration at the product level highlights the agility it offers for initial and iterative studies of user interaction with prototypes. Integration of smart materials at the yarn level was also identified in 23 papers (P1; P2; P6; P7; P9; P14; P15; P22; P23; P26; P27; P28; P30; P36; P39; P41; P42; P43; P44; P45; P50; P54; P55). Conductive threads/yarns were used in 24 prototypes (P6; P9; P14; P20; P21; P24; P25; P26; P27; P28; P29; P30; P33; P36; P37; P39; P41; P42; P43; P44; P45; P49; P50; P54). Additionally, threads/yarns were coated with conductive inks (P23) and painted with thermochromic materials (P44). At the substrate level, smart and conductive materials were integrated by weaving (P1; P6; P7; P22; P35; P44), knitting (P14; P27; P39; P42), coating (P2; P24; P55), in situ polymerization (P27), painting (P30), and printing (P30; P54; P56), including a conductive pattern as a printed sensor (P10). Artisanal techniques for technology integration were also identified, such as crochet (P30), featherwork with feathers made conductive by in situ polymerization (P34), and felting in preliminary prototypes with non-conclusive conductivity testing (P6). Conductive fabrics from suppliers, predominantly providing sensing functions, were frequently noticed in electronic prototypes (P6; P20; P21; P26; P27; P31; P33; P46; P54). Fiber-level integration of smart materials was observed in three prototypes (P6; P30; P42). To construct prototypes that carried out the intended interaction, the application of materials and components at multiple levels was a common practice, observed in 32 papers (P1; P2; P3; P7; P9; P10; P14; P15; P20; P21; P22; P24; P26; P27; P28; P30; P31; P33; P34; P35; P36; P39; P41; P42; P43; P44; P45; P46; P50; P54; P55; P56). For instance, one children’s book combined a reed switch as a hard proximity sensor at the product level, pressure and stretch textile-based sensors at the thread/yarn and fiber levels, respectively, and textile printing with CCMs at the substrate level (P30).
The M&T integrated to promote interactive and smart behavior in the analyzed textile projects were grouped, and their use was quantified (Figure 3). The defined M&T groups and respective types/specifications took into consideration the data explicitly mentioned by the authors in the respective papers. In papers that mentioned more than one material per type or that presented more than one project or prototype, the material was quantified by type of use per paper.
Four main M&T groups were identified based on their characteristics and behavior within the project: electronic components, conductive materials, CCMs, and optical fibers.
Demonstrating the increasing importance of e-textiles, electronic components were the most extensively applied elements among the included records. Generally, such textiles work through a sensing–processing–actuating process. They encompass sensors, actuators, processing units, power supplies, and, in some cases, other components. The sensing function could be executed both by electronic-component sensors and by textile-based sensors made of conductive materials.
Central processing units (CPUs) are the brain of an e-textile system [5,8]. Within the searched papers, development boards with microcontrollers (often simply referred to as “microcontrollers”) were widely employed for their small size and processing capability. These boards host small integrated circuits that generally include a central processing unit, memory, input and output interfaces, a clock generator, analog-to-digital converters, and communication interfaces responsible for data processing [45]. Arduino was by far the most mentioned brand (P19; P35; P36; P40; P47; P54), including specific models such as Mega (P16; P20; P31), Nano (P26; P30; P34), Uno (P10; P16; P33; P41; P47), and Pro Mini (P33; P46). Models from other brands were also noticed, such as the Adafruit Metro board (P24) and Cypress PSoC4 (P27). For specific types of system output, suitable processing units were chosen: for instance, Teensy®, used for audio-feedback projects (P9; P37; P39; P42); the Bare Conductive Touch Board (BCTB), designed specifically for touch-triggered sounds (P6; P29); and FlowIO, an open-source modular development platform for the control, actuation, and sensing of soft robots and other pneumatically actuated devices [46] (P13; P18). Although the Arduino Lilypad (P21) and Adafruit Flora (P38; P43) are sewable microcontroller boards designed specifically for e-textiles and wearable projects, they were also applied in non-wearable prototypes. Other processing units include single-board computers (SBCs)—such as the Raspberry Pi 4B (P7; P22) and Bela Mini (P5; P15)—and a motherboard from Intel® (P32), a printed circuit board (PCB) that controls data exchange between the CPU and peripheral devices, contains the main components of a computer, and includes connections for additional circuit boards [47]. For control and further data processing, prototypes relied on computers (P9; P11; P12; P13; P17; P18; P27; P37; P41; P42; P45; P49; P56), mobile applications (P26; P33; P38; P43; P52), or other digital media devices (P28).
For feedback, prototypes included actuators whose mechanisms related to the interaction designed between the prototype and the user. Light-emitting diodes (LEDs) were the most employed material for visual feedback (P4; P7; P16; P20; P22; P24; P26; P30; P32; P45; P46; P50; P51; P52). For audio interaction, prototypes embraced headphones (P5; P10; P11; P19; P39), speakers (P5; P9; P16; P20; P29; P30; P47; P53), and buzzers (P46). Coils were used both for audio (P15)—together with plastified paper, a magnet, and an amplifier—and for mechanical responses (P30). Vibration motors (P26; P30; P38; P40; P41; P45; P46; P47; P49; P52) and servomotors (P17; P25) were used to provide haptic feedback, as were heat pads (P25; P47) and a mid-air haptic device, which enables contactless feedback (P11). Pneumatic actuation systems included pneumatic tubing (P8; P18), usually connected to an electronic solenoid valve and air compressor tank (P4; P12; P31), and OmniFiber, a programmable fiber-based shape-changing technology [48], applied for fluidic pneumatic haptic actuation (P13). Electronic devices, such as computers, tablets, and mobile phones, were employed as actuators through their respective speakers (P26; P27; P28; P37) and displays (P36; P37; P38; P47; P56). Smartphone applications enable multiple feedback modes by combining visual and vibration responses (P52) or adding auditory feedback (P33; P43). Each mode proved relevant to the interaction for specific purposes, such as in gait retraining, where patients’ awareness was increased by audio outputs while visual feedback was used to correct specific movements (P33). To enhance the experience, actuators for multiple sensory feedback were identified in 15 records (P3; P4; P9; P11; P16; P20; P25; P26; P30; P37; P38; P45; P46; P47; P52). For example, an installation included a robotic lighting system with a rotating head light module (composed of RGB beam lights and a motor) and a 4-channel distributed speaker system playing artificial intelligence (AI)-generated music linked to an ambisonic spatialization module (P9).
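As a concrete illustration of the sensing–processing–actuating chain described above, the following minimal Arduino-style sketch (C++) is a hypothetical example, not taken from any of the reviewed papers: it assumes a resistive textile pressure sensor wired as a voltage divider on analog pin A0 and maps the reading to the brightness of an LED and the intensity of a small vibration motor driven through a transistor.

// Minimal sensing–processing–actuating loop (hypothetical example).
// Assumed wiring: resistive textile pressure sensor in a voltage divider on A0;
// LED on PWM pin 9; vibration motor driven via a transistor on PWM pin 10.

const int SENSOR_PIN = A0;   // textile pressure sensor (voltage divider)
const int LED_PIN    = 9;    // visual feedback
const int MOTOR_PIN  = 10;   // haptic feedback (through a transistor)

void setup() {
  pinMode(LED_PIN, OUTPUT);
  pinMode(MOTOR_PIN, OUTPUT);
  Serial.begin(9600);        // optional logging to a computer
}

void loop() {
  int raw = analogRead(SENSOR_PIN);        // sensing: 0–1023
  int level = map(raw, 0, 1023, 0, 255);   // processing: scale to PWM range

  analogWrite(LED_PIN, level);             // actuating: LED brightness
  analogWrite(MOTOR_PIN, level);           // actuating: vibration intensity

  Serial.println(raw);                     // stream data for later analysis
  delay(20);                               // ~50 Hz update rate
}

The same structure generalizes to the multimodal prototypes discussed above by replacing the sensor and actuator pins with the components chosen for a given interaction.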
Other electronic components, also frequently applied and encompassing diverse parts, were used in e-textile prototypes to extend the capabilities of the respective projects. External modules are integrated circuit boards designed to fit directly on top of a processing unit interface, expanding its functionalities in a specific way, such as providing connectivity, wireless communication and charging capabilities, motor control, amplification, communication interfaces, and sensing, among others. These include a read–write module (P20), MP3 modules (P20; P34), and a recordable sound module (P30), as well as components specifically designed for sensing functions, such as a capacitance measurement unit (CMU) (P2), capacitive touch controller boards (P34), and a movement sensor module (P19). Additionally, a multichannel expander (P5), a USB (Universal Serial Bus) charging module (P15), and audio (P19; P39) and motor shields (P30) were also incorporated into the processing unit. Although processing units with built-in Bluetooth chipsets were noticed, such as the Arduino Bluetooth Low Energy (BLE) Nano (P26), external Bluetooth modules were also coupled (P21; P32; P33; P34; P43). Switches and buttons were integrated to enable turning systems on and off and controlling them (P16; P20; P30; P33; P37; P41; P43; P50; P53), as well as a knob (P37). Data storage was extended by memory cards (P20; P39). In addition, transistors (P42; P54), resistors (P31; P33; P36; P42)—including potentiometers (P37)—and regulators (P12; P47) were added to the respective prototypes to assist in their operation. Identified circuits comprised multiplexers (MUXs) (P2), an LED driver (P16), drive circuits (P7; P22), PCB pads (P42), and circuit boards (P36; P37)—such as a control board (P49) and PCBs (P2; P43; P53). Furthermore, prototypes featured components such as magnets (P6; P14; P15; P28; P30; P51), envelope followers (P5; P15), amplifiers (P15; P17; P28), microphones for recording (P30; P53), synthesizers (P5), transmitters (P5; P9), octal buffers (P17), radio frequency (RF) transmitter and receiver modules (P16)—such as Radio Frequency Identification (RFID) sensors and tags (P25) and Near Field Communication (NFC) tags (P38)—pin connectors (P45), ear clips (P32), and crocodile clips (P10).
Regarding power supplies, the analyzed projects described energy storage and sources. The majority of e-textile projects resorted to energy storage through batteries (P2; P24; P38; P42; P46; P51; P53). More specifically, these included lithium (P29; P32) and lithium polymer (Li-Po) batteries (P28; P30; P43; P50), as well as power banks (P5; P7), described as detachable (P33), portable, and fast charging (P22). The use of more than one energy source in a single prototype was identified, in which a Li-Po battery powered the processing unit and a 9 V battery powered the amplifier of fabric speakers (P15). For user safety, mainly when children were the users, batteries were enclosed in a wooden box (P29) or a battery holder (P24; P38; P51). Battery accessories included a battery charger (P2; P30). Since conventional batteries are bulky, stiff, and consume finite energy, new sources for both power storage and generation have been developed. Trends within this topic include the miniaturization of components and harvestable energy sources. Battery miniaturization, as with sensors, leads to new thin, flexible, and hidden components made of embroidered or printed conductive materials [8], which was not observed in the analyzed papers. Developments in harvestable energy sources aim to increase the energy efficiency of e-textile projects [8]. The only project that dealt with generating energy—smart gloves for gesture recognition—used triboelectrification between polydimethylsiloxane (PDMS)-coated wool yarn on the fingertips of one hand and a polyester (PES) patch on the palm of the other, achieving self-powering through the electrostatic energy produced by the movement of the user’s hands (P55).
In order to detect stimuli, sensors encompassed rigid electronic components, either incorporated directly on textiles as a platform or forming part of a system in which the textile is integrated, as well as textile-based sensors inserted into the textile structure. There were prototypes that combined both types, as previously discussed, as well as projects that made improvements in iterative cycles, such as using a robust force sensing resistor as a pressure sensor in the first prototype and conductive pressure-sensing fabric in the second model of the smart sock for gait retraining (P33).
This review follows a sensor classification based on what is being monitored: motion, physiology, or environmental parameters [8]. Electronic sensors of these three classes were identified. For motion detection, the following sensors were incorporated: image (P7; P22; P37), proximity (P16; P30; P35), pressure/piezo sensors (P2; P17; P31; P33; P47), and antennas (P9). In addition, stretch sensors (P29)—including a conductive rubber cord stretch sensor (P37)—and touch sensors (P4)—including capacitive touch-sensing pins (P39) and chips (P42)—were employed. An external movement sensor was used in a wearable training system for stroke rehabilitation that worked together with an e-sleeve (P56). The contactless mid-air haptic device has embedded motion sensors for hand position tracking (P11). Accelerometers were applied for detecting artifacts’ acceleration (P48), and inertial measurement units (IMUs) containing an accelerometer, gyroscope, and magnetometer were embroidered on a t-shirt to calculate 3D user posture orientation (P43). Hard physiological sensors included dry electrodes coupled in a smart hat for electroencephalography (EEG) data acquisition (P32), a pulse sensor collecting heart rate signals (P4), and a body temperature sensor (P15). Ambient temperature (P18; P47; P51; P52), humidity (P18; P52), light (P46; P51), and gas sensors (P52) were used as electronic environment sensors. Piezoelectric microphones were used for both motion and ambient sensing to collect sound inputs from the user and from the surroundings (P5; P15).
Following the mentioned trends in e-textiles, miniaturized and flexible textile-based sensors were frequently detected. They were built using conductive fibers, threads/yarns, fabrics, and inks/pastes. Motion textile-based sensors were identified, such as for pressure (P30)—including touch-sensing buttons (P44)—stretch sensing (P30; P39), and piezoresistive sensing (P31). Binary-resistive knitted sensors were made of plated stainless-steel ferromagnetic yarns (P14). Combined with conductive yarns by knitting, thermochromic and composite yarns were used in the design of a digitally knitted keyboard able to sense discrete touch, as well as continuous proximity and pressure (P42). Capacitive textile-based sensors were identified (P6; P34; P39), including a wearable touchpad based on dielectric elastomers (DEs) with sputtered metal-based electrodes (P2), and an interactive tapestry whose bases and elements, fabricated with conductive threads and felt, performed as capacitive touch sensors (P6). Fabric speaker sensors (P15; P28) were made with conductive threads/yarns to perform together with magnets. Within this classification, conductive fabrics were widely applied for pressure sensing (P20; P21; P31; P33; P42), including commercial ones from suppliers, such as Velostat—a plastic sheet made of a polymeric foil impregnated with carbon black—(P4; P20; P46) and the fibrous Eeontex (P33). In addition, conductive fabrics were used as stretch sensors (P46) and with a double function, including flexion/extension and movement (P26) and proximity and touch sensing (P27). Using conductive ink, a patterned array was inkjet-printed for the fabrication of a gesture recognition sensor (P10). Fiber-based conductive materials included silver-coated fibers (P42) and conductive wool (P6). In the fabrication of a motion sensor to measure touch pressure, conductivity at this level was inserted into the textile structure (P30). Physiological sensors, namely electrodes, were enabled by the addition of conductive inks and pastes, such as for Functional Electrical Stimulation (FES) (P56) and for an electrode made of conductive ink and composite yarns of reduced graphene oxide (RGO), silk sericin, and super adsorption polymer (SAP) (P23). To identify environmental stimuli, conductive threads were employed in the fabrication of moisture (humidity) sensors (P24).
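To illustrate how a conductive thread or fabric patch of the kind reported above can act as a capacitive touch or proximity sensor, the code below is a hedged example using the widely available Arduino CapacitiveSensor library; the pin numbers, the 1 MΩ resistor value, the threshold, and the textile electrode itself are illustrative assumptions rather than details from any reviewed prototype.

// Capacitive sensing with a conductive textile electrode (illustrative only).
// Assumes the CapacitiveSensor library is installed, a conductive thread/fabric
// patch connected to pin 2, and a high-value resistor (~1 MΩ) between pins 4 and 2.
#include <CapacitiveSensor.h>

CapacitiveSensor textilePad = CapacitiveSensor(4, 2);  // (send pin, receive pin)
const long TOUCH_THRESHOLD = 1000;                     // tune empirically per prototype

void setup() {
  Serial.begin(9600);
}

void loop() {
  long reading = textilePad.capacitiveSensor(30);  // 30 samples per measurement
  Serial.println(reading);                         // larger values = closer/firmer touch
  if (reading > TOUCH_THRESHOLD) {
    // trigger feedback here (LED, sound, vibration, ...)
  }
  delay(50);
}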
In general, besides the development of textile sensors, conductive materials were applied for interconnections, conducting electricity between components, closing circuitry, and making the system work. Conductive threads/yarns were also the most employed material for this function and were demonstrated to be easily applicable (P6; P9; P14; P20; P21; P24; P25; P26; P27; P28; P29; P30; P33; P36; P41; P42; P43; P44; P49; P50; P54). Additionally, conductive threads were fabric-insulated (P53), as well as cotton (CO)-wrapped and CO-covered (P39). Conductive fabrics played a similar role, albeit with a broader contact area (P46; P54), such as a conductive mesh made of metal thread that powers and controls customizable attached haptic modules (P45), a conductive adhesive fleece material (P2), and conductive hook-and-loop tape (P49). Electrical paint allowed the creation of electrical circuits in preliminary tests for material experimentation (P6).
The wires and cables used comprise insulated conductive threads (P36); they were mostly plastic-coated (P41; P42; P44; P46; P47; P56). Audio cables transmitting sound data (P28; P29; P47) were identified, namely banana jack (P37), minijack, and mono jack cables (P5). Other conductive materials were applied as connectors (P17), such as a flexible printed circuit (FPC) (P7; P22), crimp connectors (P2; P36), conductive pads (P36), copper tape (P6), metal clips (P6), crimp beads (P34), screws, metal fittings (P37), nickel buttons (P49), and, widely, metal snaps (P30; P46; P51; P53; P54). A metal ring/bracelet was worn by a performer to close a circuit for video and sound actuation (P37).
CCMs included thermochromic pigments (P30; P44; P54) and yarns (P42), enabling color-changing behavior under temperature variation, as well as photochromic (P30) and hydrochromic (P30) pigments that provide this behavior to textile surfaces when exposed to a UV light source and when wet, respectively. CCMs have the innate and autonomous ability to feel and respond to external stimuli, which was observed in a textile book in which each of them was applied to interact with the environment and the user (P30). They can also be integrated into an electronic system, such as thermochromic pigments changing color due to the heat generated by electrical conduction when a circuit is closed through conductive threads, yarns, and/or fabrics. This occurred in the woven textile prototype that joined touch sensing and color change behavior (P44), in the craft reinterpretation of a digital game whose dynamic visual behavior is activated when the user successfully solves the right color paths (P54), and in the interactive keyboard that provided display change through the integration of thermochromic yarns (P42).
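As a simple sketch of the electrothermal activation principle just described, the following hypothetical Arduino code pulses a conductive-thread trace through a transistor so that resistive (Joule) heating triggers a thermochromic color change; the pin, duty cycle, and timing are illustrative assumptions only, and a real implementation would require current limiting and temperature safeguards.

// Electrothermal activation of a thermochromic print (illustrative only).
// Assumes a conductive-thread heating trace switched by a logic-level MOSFET
// on PWM pin 9; values are placeholders, not from the reviewed prototypes.

const int HEATER_PIN = 9;

void setup() {
  pinMode(HEATER_PIN, OUTPUT);
}

void loop() {
  analogWrite(HEATER_PIN, 120);  // moderate duty cycle: heat the trace gently
  delay(5000);                   // color change develops as the temperature rises
  analogWrite(HEATER_PIN, 0);    // switch off and let the textile cool down
  delay(10000);                  // pigment reverts below its activation temperature
}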
Optical fibers are light-transmitting materials and, unlike CCMs, must always be connected to an electronic system. They have been employed in e-textile projects, for example, to produce light-emitting textiles. These include a prototype of three textile artifacts woven with optical fibers connected to proximity sensors, whose lighting colors change according to the users’ location (P35), and a striped double-layer woven Jacquard made of PES and polymeric optical fiber (POF) of polymethyl methacrylate (PMMA) that alters hue based on number gesture recognition (P7; P22). Furthermore, this material was applied to appear as a single-line rainbow when the circuit is closed in an e-textile book (P30).
Other M&T include materials relevant to the interactive operation of the discussed prototypes. This group comprised various materials, such as polyvinyl alcohol (PVA) water-soluble yarn for reversible shape-changing textile fabrication (P1), thermoplastic polyurethane (TPU) fabric with waterproof and hydrophobic functional coatings (P10), silver-doped titanium dioxide (TiO2/Ag) nanoparticles for photocatalytic coatings (P24), and functional pastes for screen-printing waterproof and encapsulation layers (P56). Silicone was incorporated both in rubber tubes used for pneumatic actuation prototypes (P2; P8; P10; P18; P31; P56) and as PDMS, a silicone- and carbon-based polymer, used for electrifying wool yarn by coating (P55).
Technologies were implemented through materials and electronic components for communication and data transfer between prototypes and/or other devices wirelessly connected as a network via the internet, emphasizing Internet of Things (IoT) concepts [49] within e-textiles. Bluetooth enables data transfer using radio waves, whether between the electronic components themselves (P3), between them and the processing unit (P3; P34), or between the smart textile prototype and computer software (P18; P45), a tablet/mobile application (P26; P33; P52), or a car media player (P21). Wi-Fi (Wireless Fidelity) was used to exchange data from a single-board computer to a cloud computing server storing information (P7; P22; P25; P49), as well as to connect two pairs of wearable “discussion artifacts” to each other; in one of them, RFID, which utilizes electromagnetic fields to detect and monitor tags, was also used (P25). Similarly based on radio frequency, NFC is a set of communication protocols that enable data exchange between devices when they are up to 4 cm apart. It was identified for the communication between a textile-based artifact and a mobile application through tags (P38). Two-dimensional signal transmission (2DST) technology enabled conductive full-body clothing to function as both a communication path and an energy supply, powering and controlling attached haptic feedback modules (P45).
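To give a flavor of the Bluetooth data exchange used between textile prototypes and phones or computers, the code below is a hedged example based on the ArduinoBLE library for boards with an onboard BLE radio (e.g., an Arduino Nano 33 BLE); the service and characteristic UUIDs, the device name, and the sensor pin are illustrative assumptions, not details taken from the reviewed studies.

// Streaming a textile sensor value over Bluetooth Low Energy (illustrative sketch).
// Assumes a board supported by the ArduinoBLE library and an analog sensor on A0.
#include <ArduinoBLE.h>

BLEService textileService("180C");                                    // placeholder UUID
BLEUnsignedIntCharacteristic sensorChar("2A56", BLERead | BLENotify); // placeholder UUID

void setup() {
  Serial.begin(9600);
  if (!BLE.begin()) {
    Serial.println("BLE init failed");
    while (true);                      // halt if the radio is unavailable
  }
  BLE.setLocalName("SmartTextile");
  BLE.setAdvertisedService(textileService);
  textileService.addCharacteristic(sensorChar);
  BLE.addService(textileService);
  BLE.advertise();                     // make the prototype discoverable
}

void loop() {
  BLEDevice central = BLE.central();   // wait for a phone/computer to connect
  while (central && central.connected()) {
    unsigned int value = analogRead(A0);
    sensorChar.writeValue(value);      // notify the connected app
    delay(100);
  }
}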
M&T are responsible for the interactive attributes of smart textile artifacts. For this reason, their selection, application, and validation were demonstrated to be highly relevant. These elements transform textiles into dynamic interfaces capable of sensing, reacting, and communicating with their surroundings or with users, consequently influencing the experience they provide. All prototypes could “feel” and “react”, and almost all showed the innate adaptability and communication capabilities of e-textiles. This type of smart textiles covered most projects analyzed, with the wide use of conductive materials and electronic components, presenting a path to what has been studied and developed in this area. The combination of M&T was identified as an emerging way of adding diverse functionalities to a project with the potential to improve user interaction with the object. It is, therefore, essential to deeply explore the available M&T so that a careful choice can be made. Furthermore, the smart textile projects analyzed clearly demonstrated the importance of considering the target user, the project objectives, and the desired interaction. This includes the inputs, outputs, and intended behaviors regarding the textile interface, as well as the possibilities regarding the user’s physical/sensory and/or emotional response.

3.2. UX Evaluation Methods

Through the interaction between the user and a smart textile artifact, experiences can occur at a physical/sensory level and/or an emotional level. For experience evaluation, most of the analyzed studies used prototypes to perform tests through qualitative, quantitative, or mixed methods. Qualitative-only methods were conducted in 23 of the studies addressed, and quantitative-only methods in four of them. Notably, most of the studies embrace qualitative methods, since affective experience deals with subjective emotions. However, it is relevant to highlight the significant use of quantitative methods combined with qualitative methods in mixed-methods research, which amounted to 15 studies. Additionally, 14 studies did not perform UX evaluation. Figure 4 shows the frequency of each UX evaluation method identified.
Regarding qualitative assessment, the most applied methods were observation (P1; P7; P9; P10; P12; P13; P14; P19; P20; P22; P25; P27; P29; P30; P31; P33; P35; P37; P38; P46; P47; P49; P51; P53; P54) and interviews (P10; P11; P13; P15; P26; P40; P52; P53). Most interviews were semi-structured (P17; P19; P20; P33; P43; P46), indicating the importance of openness for the user to express the experience. A micro-phenomenological interview was used to capture fine-grained information about intersubjective experiences between two subjects/users during the interaction (P13). Group sessions included informal ideation sessions (P36), making sessions and workshops (P10; P53), focus groups (P46; P53), and play sessions (P46), among others (P50; P52; P56). The think-aloud method was used to encourage participants to verbally express their thoughts while carrying out a task to provide insights during the interaction in both individual and group sessions (P4; P13; P15; P56). Open-ended questionnaires with exploratory questions have been applied to assess conditions before and after prototype tests (P41; P50). Self-reporting and autobiographical accounts enabled the design of wearable systems that represent the experience of pain in a performance (P5) and that are ideal (P33) and familiar (P39) for users. The autobiographic approach was also used in the design of soft speakers, which included users as creators by providing them the necessary information and materials for do-it-yourself prototypes (P28). Through daily recording, the journaling method also encouraged self-expression of experience, whether with photos, written accounts, and observations (P1) or through impressions and improvement suggestions (P43). Graphic representations (drawings, pictures, and photographs) were used to illustrate user-felt experiences (P19). Although these are qualitative evaluation methods, some authors later quantitatively analyzed the data obtained from user tests, commonly recorded in video, image, and/or audio. For example, analyzing accuracy in identification tasks (P12; P14; P31) and correlations between sound-effect and gesture input in matching exercises (P10). In addition, time-based measurement analysis also brought a quantitative approach to qualitative methods, such as with time recording for fulfilling tasks (P7; P14), counting average interaction time individually and socially (P46), and calculating percentages of user interaction through ELAN, annotating software for audio/video (P29).
Quantitative methods have increasingly been applied in the field to quantify subjective emotions. Measures were obtained from questionnaires and physiological data. Closed-ended questionnaires were the most frequent tool (P4; P7; P10; P11; P15; P17; P20; P26; P32; P33; P41; P43; P50; P52; P56). They encompassed scales such as Likert-type and binary scales and diagrams (P7; P10; P20; P41; P50; P52), as well as the online rating task platform PsyToolkit (P15). Among them, specific tools for the quantitative evaluation of users’ self-perception were identified and are described in alphabetical order in Table 1.
In relation to physiological measurements, data were obtained from diverse tools. Heart rate variability (HRV) registers the variation between successive heartbeats, measured in milliseconds. To gather heart rates, the authors used Polar H102 sensors (P20) and electrocardiogram (ECG) monitoring with 3D knitted electrodes (P23); an ambulatory ECG allowed HRV analysis (P23). Galvanic skin response (GSR), also known as electro-dermal activity (EDA), is a technique for measuring the skin’s electrical conductivity in reaction to specific stimuli, which varies with the degree of emotional arousal (for both positive and negative emotions); Grove GSR sensors with two electrodes were used (P26). Facial emotion recognition (FER) software and apps, such as MorphCast (P4) and AffdexMe (P26), were used to analyze facial emotions. The Kansei engineering research methodology was proposed as a logical and factual foundation for emotional design (P32).
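As an illustration of how HRV can be quantified from the inter-beat (RR) intervals that such sensors provide, the snippet below computes RMSSD (root mean square of successive differences), a standard time-domain HRV metric; the sample interval values are fabricated purely for demonstration and do not come from the reviewed studies.

// RMSSD: root mean square of successive differences between RR intervals (ms).
// A standard time-domain HRV metric; the sample data below are illustrative only.
#include <cmath>
#include <iostream>
#include <vector>

double rmssd(const std::vector<double>& rrIntervalsMs) {
  if (rrIntervalsMs.size() < 2) return 0.0;
  double sumSquares = 0.0;
  for (size_t i = 1; i < rrIntervalsMs.size(); ++i) {
    double diff = rrIntervalsMs[i] - rrIntervalsMs[i - 1];
    sumSquares += diff * diff;
  }
  return std::sqrt(sumSquares / (rrIntervalsMs.size() - 1));
}

int main() {
  // Example RR intervals in milliseconds (illustrative values, not study data).
  std::vector<double> rr = {812, 798, 825, 840, 810, 795, 830};
  std::cout << "RMSSD = " << rmssd(rr) << " ms\n";
  return 0;
}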
A set of research projects involved the user in the preliminary phases of prototype development (P1; P3; P5; P10; P11; P13; P14; P15; P17; P19; P20; P33; P36; P43; P52), for example, by bringing them into co-creation processes (P28; P39; P56), through user profile analysis to better understand a target group (P29), and with UX tests performed before and after interaction (P4; P41), including the assessment of users’ prior emotional states (P40; P50). Four research projects did not describe how user tests were conducted (P35; P38; P49; P51). Fourteen studies did not aim to explore the relationship between the proposed artifact and the users’ perspective and presented no UX evaluation method (P2; P3; P6; P8; P16; P18; P21; P24; P34; P42; P44; P45; P48; P55). However, in eight of them, experience evaluation was mentioned as future work (P3; P6; P8; P16; P18; P21; P34; P45), and in twelve papers, deeper user studies were recommended (P4; P5; P7; P9; P14; P17; P22; P25; P38; P40; P51; P52). This highlights the relevance and necessity of understanding the user’s perspective regarding the interaction.

4. Discussion

Smart textile interaction design is a multifaceted process, as it involves several elements and dimensions. Having a holistic view of all the steps and possibilities in the integrated smart textile and interaction design processes helps in making decisions regarding the project.
The present paper focused on the following research question: What are the main materials and technologies used in recent smart textile research that enable textile interaction, and what evaluation methods were employed to assess the UX?
Although a significant amount of research has been produced concerning smart textile design, information focusing on smart textile interaction and the assessment of UX remains limited. This is demonstrated by the relatively small number of papers (56) considered eligible and pertinent to the subject under research and selected for this review.
The analysis conducted, comprising the 56 selected papers, was carried out by dividing the underlying question into two components, C1 and C2.
C1: Main M&T used in recent smart textile research that enables textile interaction
Electronic components were the major group of materials used, highlighting their importance in smart and interactive textile design. Most textile prototypes were developed with a processing unit, meaning that interaction between the user and the textile is mediated by a control center previously programmed to activate a specific behavior in response to specific stimuli. The main actuators used are also electronic components, which have been widely combined to enable smart textile responses that promote interaction with multiple human senses. Similarly, the sensing behavior of smart textiles has largely followed a multisensory approach, implemented through electronic components and/or textile-based sensors, mainly designed to collect motion-related data. The integration of electronic sensors increased more than threefold over the final three years of the analyzed period. To integrate technology into soft surfaces in the desired unobtrusive and seamless fashion, further developments are required, namely in materials science and related technologies. To address challenges related to the volume of electronic components [18], their miniaturization has been presented as a research trend [3], although it still faces hurdles related to effectiveness, utility, and commercial feasibility [50].
The conductive materials mostly used were threads or yarns, and their application included sensing purposes and conductive connections. Most of these materials were carbon-based—such as graphene, carbon black, and carbon nanotubes (CNTs)—metal-based—such as silver or copper—or a combination of both—such as stainless steel. From a smart textile design perspective, materials containing these elements present challenges at both functional and aesthetic levels. Carbon-based materials tend to be more opaque and lightweight, making them easier to integrate while also offering good resistance to wear and washing. In contrast, metal-based materials, though shinier and heavier, exhibit better conductivity [8]. Silver-based materials also offer good flexibility and malleability. However, in terms of durability, if metal-based materials are not stainless, their colors may change over time (e.g., silver may tarnish, and copper may develop a greenish patina), affecting the artifact’s aesthetics. The integration of smart materials such as CCMs and optical fibers is less frequent. In designing smart textiles, challenges to consider include the color fastness limitations of CCMs, as well as the complexity of coupling and aligning optical fibers for precise connections with light sources and sensors [4].
Regarding technologies, the most widely used were those related to integrated communication networks between e-textiles and digital devices, typically via Wi-Fi or Bluetooth, but also 2DST and NFC. Emerging technologies—such as immersive technologies (e.g., augmented reality [AR] and virtual reality [VR]), ubiquitous computing (e.g., Internet of Things [IoT]), and, more recently, generative technologies (artificial intelligence [AI])—are increasingly advancing research in smart textiles. In this field, AI has enabled smart textiles to provide predictive and personalized feedback by adapting to users’ individual needs. For example, AI has been used to develop garments capable of processing data in real time and delivering personalized interventions in areas such as healthcare and fitness [51]. Processing units with embedded AI capabilities are already available, supporting the implementation of such functionalities directly within textile systems. For the next generation of AI integrated into e-textiles, algorithms such as deep learning can shape systems that mimic the functioning of the human brain based on neuromorphic computing and neural networks [52]. However, within the scope of the analyzed studies, mentions of these newer technologies were less frequent than expected, being identified in only 9 of the 56 articles. This gap reveals novel fields of exploration in combining these technological advances with smart interactive textiles, with UX playing a central role in product development.
Although some projects bring sustainability into the design concept, for example, by aiming to raise user awareness about pollution or energy consumption, minimizing the environmental impacts generated by the M&T of smart textiles, especially e-textiles, remains a long and essential path to be investigated. Given the great difficulty of recycling these products, sustainable approaches to the selection of M&T and the design of prototypes must be considered and discussed in the conception and construction of smart textiles. For smart textile design, modular systems should be prioritized, focusing on easy repair and upgrade, component replacement, and disassembly with standardized connectors and components [53]. New biodegradable materials, such as PLA, or biomaterials from renewable sources, such as CNTs, mycelium, or cellulose, can be used to build sensors, and renewable energy supply sources based on chitosan, carbon, and cellulose are already available for e-textiles [54]. Although some of these materials are not yet commercially available, it is important for designers to be aware of these innovations for future improvements in their prototypes through iterative analysis of sustainability limitations. The integration of sustainability requirements into all design stages, with a focus on improving product lifespan, is essential for the responsible advancement of smart textiles.
C2: Main evaluation methods employed to assess the UX in textile interaction
Qualitative evaluation methods were used in most of the articles analyzed. Among them, observation was by far the most employed method, classified as behavioral since it focuses on capturing information about what users did during the experience evaluation. Interviews and focus groups were also widely used qualitative methods, both classified as attitudinal because they provide self-assessment data about how users felt. Closed questionnaires, also classified as attitudinal methods, emerged as the predominant quantitative method. These included a variety of specific questionnaire types, mainly SDS and CEQ. Among the physiological tests, classified as behavioral UX methods for measuring users’ physiological states, HRV and GSR were highlighted.
Most of the described methods were combined to adequately measure and understand the expressed and hidden felt sensations. While not yet predominant, the importance of using mixed methods to evaluate UX has been increasingly recognized, especially in recent years of the analyzed period. The combination of both methods can be used to evaluate pragmatic attributes—e.g., comfort, wearability, usability, and efficiency—and/or hedonic attributes—e.g., aesthetics, social engagement, attractiveness, and emotions. The hedonic dimension emphasizes the psychological well-being that an artifact causes in the user, and therefore, it is increasingly being considered for the evaluation of smart textile designs, mainly aiming to identify and measure triggered emotional responses. Through mixed methods, UX evaluation within this dimension can address both objectives of identifying and measuring emotional responses elicited by smart and interactive textiles.
In conclusion, a set of four key topics is proposed, aiming at summarizing essential issues to consider in the design of smart textiles with meaningful interactions.
  • Single and multisensory interaction: It is possible to create experiences with a focus on individual human senses or to design for multisensory interaction. Haptic, visual, and auditory feedback enable sensory interactions commonly explored with smart textiles, which directly influence emotional metrics, such as valence (positive/pleasant or negative/unpleasant), arousal (emotion intensity), and stress levels, whose data can be gathered through self-assessment questionnaires or physiological measurement (GSR/EDA). Designers must explore the materials and technologies used in light of the desired sensory interaction and their effect on UX. For example, technologies like thermochromic threads, ultra-violet organic light-emitting cells (UV OLECs), and LED feedback systems can create a visually dynamic interaction that aligns with emotional engagement. Additionally, a single-sensory approach does not limit textile interaction to a single output; vision, for example, can respond to both textile color and light behavior, a single sense with multiple outputs.
  • Input and output system: Smart textiles can integrate sensors able to detect and collect data from the user(s) and/or the environment and can also embed actuators that respond to the input data. The sensed stimuli may include inputs that the user can control to some extent, such as defined gestures in contactless detection, or uncontrolled data, e.g., physiological signals. The integration of sensors such as proximity detectors and embroidered electrodes enables signal acquisition by the textile artifact, which can occur even under dynamic conditions. On the output side, the actuator response can be direct or programmed, ranging from haptic (vibrotactile) to visual (LEDs, thermochromic threads) and auditory systems. These outputs may provide users with immediate feedback, which elevates the functional and emotional dimensions of the interaction. To design effective input–output systems, the designer must understand the relationship between the smart textile's sensing, actuating, and processing structures and explore design possibilities that enable a personalized and relevant UX addressing both functional and emotional user demands (a minimal input–output mapping sketch is given after this list).
  • UX evaluation: Since the experience of materials extends beyond their physical characteristics, it is crucial to assess the UX and the emotions it evokes and to feed the results back into the design process. The projects that reached the UX evaluation phase made a substantial contribution to the HCI field, both by improving the project itself and by serving as a reference for other projects. Combining qualitative and quantitative UX evaluation methods provides more comprehensive knowledge of the interaction between the user and the textile interface in a specific context.
  • Pragmatic and hedonic dimensions: The tangible and interactive features of smart textiles, shaped by the selected materials and technologies, impact both usability and the user's emotional experience. For instance, the integration of customizable features and multisensory feedback not only meets practical needs but also invites users to express their creativity and individuality, making the technology feel more personal and less intrusive. To assess these responses, UX evaluation must cover the pragmatic and hedonic dimensions in a holistic approach. Furthermore, balancing the interplay between both kinds of attributes in the ideation/prototyping and evaluation phases of the smart textile interaction design process can enhance the overall UX.
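As referenced in the input and output system topic above, the following Python sketch illustrates, in simplified form, how a single sensed textile input could be mapped to multisensory outputs. The sensor range, output channels, and press threshold are illustrative assumptions rather than a specific reviewed prototype; on actual hardware the returned values would drive microcontroller outputs (e.g., PWM signals to LEDs and vibration motors).

def normalize(raw, raw_min=0, raw_max=1023):
    # Scale a raw pressure/capacitive reading (e.g., a 10-bit ADC value) to the 0-1 range.
    span = raw_max - raw_min
    return max(0.0, min(1.0, (raw - raw_min) / span))

def map_to_outputs(pressure_raw):
    # Derive haptic, visual, and auditory feedback levels from one textile sensor input.
    level = normalize(pressure_raw)
    return {
        "led_brightness": int(level * 255),       # visual output (e.g., PWM duty cycle)
        "vibration_intensity": int(level * 255),  # haptic output (e.g., ERM motor drive)
        "play_sound": level > 0.8,                # auditory output triggered by a firm press
    }

# Hypothetical reading corresponding to a firm press on a textile pressure sensor.
print(map_to_outputs(pressure_raw=900))

Personalization, as discussed above, could then consist of adjusting such mappings (thresholds, intensities, or output modalities) to individual users or contexts.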
Defining the disciplines and professionals needed in smart textile research calls for multidisciplinary collaboration. Arts, design (textile, e-textile, fashion, and sound), HCI, neuroscience, and materials technology have emerged as leading fields converging to advance knowledge in the design of interactive textile interfaces and experiences [18,55,56,57,58,59]. The inclusion of professionals engaged in the real-world application of these artifacts (e.g., health, education, and music professionals) was also highlighted [56,60]. Interdisciplinary [18,58,60,61,62] and transdisciplinary [9,57] teams were mentioned in the analyzed works. Interdisciplinarity is based on reciprocal relationships between disciplines, with a blurring of the boundaries between them, whereas transdisciplinarity transcends these disciplinary boundaries and considers the dynamics of entire systems in an integrative way [63]. In addition, the team needs a holistic understanding of the design process to be adopted and of its key variables. This includes understanding the available materials and technologies and their respective behaviors and properties so that they can be selected according to the desired functionalities. It also requires understanding how users relate to smart textiles through UX evaluation methods. This holistic view is essential, since the designer's decisions directly affect the UX.
Considering users includes contemplating their emotions and evaluating hedonic and emotional issues, in addition to pragmatic and functional ones, which can lead to the more effective creation of meaningful interactions with textile interfaces. Interactions carrying emotional meaning also open up space for more sustainable products by encouraging a longer useful life through the meaning they carry and the relationships they establish with the user.

5. Conclusions

This paper presents a systematic review of recent experimental research on smart textile design, with a particular focus on the main M&T used and the EMs employed to assess UX, in order to support design decisions involving these interrelated components in the creation of meaningful interactions.
Of the 56 papers included in the review, 55 used electronic components and/or conductive materials. The predominant integration of these M&T in textiles highlights that material selection and combination need to fulfill sensing, actuation, communication, and processing capabilities in the design of interactive and responsive systems. To appraise the experiences that smart textile interfaces can provide, observation and closed-ended questionnaires were the most employed EMs, used in 25 and 15 UX tests, respectively. Nonetheless, the growing use of mixed methods for UX evaluation and the emphasis placed on advancing UX research point to remaining research gaps.
Although often focused on pragmatic attributes, experience evaluation has increasingly valued hedonic and subjective aspects, which are crucial for smart textile designers, who are also responsible for designing interactions. Addressing the emotional perspective of the experience with these technological interfaces underscores the need for further investigation and reveals a valuable opportunity to integrate symbolic value.
Contributing discussion and insights on smart textile design research and practice, this review also proposes a set of four key topics to consider when creating smart textiles designed to evoke emotional responses. Future reviews should draw on multiple databases, as reliance on a single source was a limitation of this study.

Author Contributions

Conceptualization, methodology, investigation, writing—original draft preparation, and visualization, M.G.; supervision and writing—review and editing, J.C. and I.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the European Regional Development Fund through the Operational Competitiveness Program and the National Foundation for Science and Technology of Portugal (FCT) under the projects UID/CTM/00264/2020 of Centre for Textile Science and Technology (2C2T) on its components Base (https://doi.org/10.54499/UIDB/00264/2020) and programmatic (https://doi.org/10.54499/UIDP/00264/2020). M.G. and I.C. also acknowledge FCT for PhD scholarship (https://doi.org/10.54499/2022.09969.BD) and junior researcher contract (https://doi.org/10.54499/2022.08710.CEECIND/CP1718/CT0031), respectively.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
2DST	Two-dimensional signal transmission
AI	Artificial intelligence
AR	Augmented reality
ASMR	Autonomous Sensory Meridian Response
BCTB	Bare Conductive Touch Board
BLE	Bluetooth Low Energy
BMIS	Brief Mood Introspection Scale
CEQ	Credibility and Expectancy Questionnaire
CCM	Color change material
CMU	Capacitance measurement unit
CNTs	Carbon nanotube
CO	Cotton
CPU	Central processing unit
CRS	Comfort Rating Scale
DE	Dielectric elastomer
DIY	Do-it-yourself
ECG	Electrocardiogram
EDA	Electro-dermal activity
EEG	Electroencephalography
EM	Evaluation method
ERM	Eccentric rotating mass
FER	Facial emotion recognition
FES	Functional Electrical Stimulation
GQS	Godspeed Questionnaire Series
GSR	Galvanic skin response
GUI	Graphical User Interface
HCI	Human–computer interaction
HRV	Heart rate variability
IMI	Intrinsic Motivation Inventory
IMU	Inertial measurement unit
IoT	Internet of Things
IR	Infrared
ISO	International Organization for Standardization
LED	Light-emitting diode
Li-Po	Lithium polymer
M&T	Materials and technologies
MF	Main finding
MUX	Multiplexer
PCB	Printed circuit board
PDMS	Polydimethylsiloxane
PES	Polyester
PLA	Polylactic acid
PMMA	Polymethyl methacrylate
POF	Polymeric optical fiber
PSI	Perceived Social Intelligence Survey
PVA	Polyvinyl alcohol
RFID	Radio Frequency Identification
RGO	Reduced graphene oxide
SAM	Self-Assessment Manikin
SAP	Super adsorption polymer
SBC	Single-board computer
SCERTS	Social Communication, Emotional Regulation, Transactional Support
SDS	Semantic Differential Scale
SE	Sensory ethnography
SMM	Shape memory material
SPSS	Statistical Package for Social Sciences
STQ	Social Touch Questionnaire
SUS	System Usability Scale
TAM	Technology Acceptance Model
TENS	Transcutaneous electrical nerve stimulation
TiO2/Ag	Silver-doped titanium dioxide
TMMS-24	Trait Meta-Mood Scale
TPU	Thermoplastic polyurethane
UEQ	User Experience Questionnaire
UI	User interaction
USB	Universal Serial Bus
UTAUT	Unified Theory of Acceptance and Use of Technology
UV OLECs	Ultra-violet organic light-emitting cell
UX	User experience
VI	Visually impaired
VOC	Volatile organic compound
VR	Virtual reality
Wi-Fi	Wireless Fidelity

Appendix A

Table A1. Summary table of article analysis.
Code
Reference
Main Focus
Summary
P1
[64,65]
Research a multimorphic textile artifact that reacts to water exposure by shrinking and dissolving fibers, designed for performativity to support open-ended interactions and use in multiple contexts in daily life. M&T: Tea towel woven with PVA water-soluble yarn. The first prototype (AnimaTo V1.0) was single-layered and made with CO and PVA on a manual loom, and the second prototype (AnimaTo V2.0) was triple-layered and made with linen, CO, and PVA on an industrial TC2 digital Jacquard loom.
EM: AnimaTo V1.0 user test included interactions between the first author and other people in personal contexts through day-by-day photos, written accounts, and observations. AnimaTo V2.0 user tests were not conducted.
MF: Through a material-driven design approach, material tinkering that reacts to water exposure via the shrinking and dissolving of its fibers with multimorphic qualities was carried out to achieve changes in the texture, size, and shape of the prototype. “Insights into how designers can tune material (textile), form, and temporal qualities of textile artefacts across scales towards multiplicity of use, recurring encounters and extended user-artefact relationships” were provided.
P2
[65]
Present the development of a flexible smart DE-based sensing array for user control inputs.M&T: A wearable touchpad containing a silicone-encapsulated pressure detection 3 × 3 DE sensor array made of a sputtered metal electrode connected via crimping method (with crimp connectors, from Nicomatic, and conductive adhesive fleece material, from Imbut GmbH) to custom electronics (multiplexers, capacitance measurement unit, a microcontroller and power supply with a battery charging unit), that are realized with standard PCB in the presented prototype.
EM: No UX tests performed.
MF: The miniaturized, flexible, and self-standing sensing system demonstrated a reliable and safe electrical connection. The application potential for a highly integrated textile-based sensor system and its ability to offer intuitive and unobtrusive UX for ongoing interactions are emphasized. It is suggested to include the actuation capability of DE systems for tactile and audio-tactile feedback to enhance user interaction (UI).
P3
[66]
Present a modular play kit that connects children to nature through embodied play.M&T: Modular playkit composed of a round textile carpet with detachable felt leaves, interaction cards, and wireless electronic modules fixed through elastic bands and special pockets into the textile structures. The presented prototype used SAM Labs modules that provide interactive features such as light, sound, and vibration, connected to each other and to a controller via Bluetooth.
EM: No UX tests. However, prototype evaluation is proposed as future work, with children of different ages and backgrounds engaging with the play kit in various contexts.
MF: It was presented as “a design example that connects open-ended play experiences with more-than-human perspectives”, focusing on “meaningful interactions with/for children engaging with non-humans and nature”. The project combined participatory with speculative design approaches—by including children and by empathizing with trees, respectively, in the design process. With the shapes and functions of the materials designed for the modules, children can collaboratively create the story of a tree and its surroundings throughout the four seasons and learn together through collective interactions. The optional use of detachable electronic modules “allow the users to enrich the played scenario with interactive features”, giving them the possibility to combine a variety of sensors (e.g., buttons, light, movement sensors) and actuators (e.g., sound, vibration, light) to achieve desirable scenarios. “To accommodate children of various developmental levels and knowledge backgrounds, Treesense provides three different difficulty levels for exploration”, which vary the use of electronic modules.
P4
[67]
Explore how design choices and psychological values influence users’ emotions through a soft-robotic shape-changing textile installation designed for mindful emotion regulation.M&T: Multilayered central ball with inflatable tubes, composed of two input–output systems: (1) a pulse sensor capturing user’s real-time heart rate signals via conductive wires to provide the “coral’s” illuminating color and volume changes through LEDs and an air control system composed by an air pump and a solenoid valve, respectively; and (2) touch sensors made of Velostat conductive fabric, as illustrated in previous work [68], promoting shape-changing movements on the inflatable tubes through a pneumatic actuation system (not described).
EM: 55 participants (25 men and 30 women, 25–55 yo) were assessed before, during, and after interaction through closed-ended questionnaires—(1) background demographic questionnaire, (2) TMMS-24, (3) SAM, (4) BMIS, (5) 14-item adapted version of the STQ, adapted versions of (6) the GQS and (7) the PSI, both containing only dimensions appropriate to the study—physiological measures—(8) facial expressions video-recorded by a mobile phone and analyzed through Morph Cast software—and (9) think-aloud method.
MF: A design and evaluation framework for the prototype was developed and proved the successful achievement of the study’s goals. The prototype was perceived as animated, likable, interesting, safe, and socially and emotionally intelligent, evoking positive emotions in a somaesthetic experience considered pleasant, calm, positive, and relaxed. Individual differences in human–robot interaction demonstrated to influence the interactive experience. Five prominent themes, generated from the verbal reports during the experience, were analyzed regarding the somaesthetic experience and the relationships with applied materials and interactive behavior. Guidelines for further studies regarding design (e.g., adding more sensors, exploring other textile shapes and properties, the materiality and morphology of the soft robot, material behavior and haptic action, and forms of customization/personalization) and user testing (e.g., exploring advanced AI models for emotion recognition, applying pre-interaction assessment, combining accurate physiological sensors for interaction and for evaluation) were proposed.
P5
[58]
Present the collaborative design process of an interactive textile musical instrument that expresses the experience of chronic pain.M&T: Wearable artifact that consists of a large centerpiece joined with five interactive arms. The centerpiece is made of felted PES–wool mix fabric containing a Bela Mini SBC—with a multichannel expander and a battery power bank—linked to five piezoelectric contact microphones (as sound and touch sensors, one for each arm)—coupled with an envelope follower and synthesizers—through a mono jack cable. For the output, the SBC can send signals to headphones through a stereo minijack cable or be connected to speakers through a wireless transmitter.
EM: Self-reporting of an interactive dance performance with the prototype. Audience experience evaluation not carried out.
MF: Contributions of the paper include (1) design process description based on autobiographic somatic exploration and collaborative material exploration; (2) technical specifications of a new textile-embodied interface; and (3) reflections on the impact of interdisciplinary collaboration on the artifact design and its use in performance. The various interaction features of each arm as modules enabled dialogical contact between textile and sound design materials and processes. “Sound designs have a complementary function to touch, inviting or discouraging certain ways of interacting with the artifact”. Performance audience experience assessment, entailing researching interpretability through interviews, was intended as further study.
P6
[69]
Explore how to combine capacitive sensors with the tactile qualities and storytelling possibilities of handmade woven tapestries for the design of an interactive, media-immersed art installation that expresses the negative impact of noise pollution on marine life ecosystems.M&T: For tests: A BCTB microcontroller (with capacitive electrodes) integrated with copper tape, a metal clip, conductive thread (all from DFRobot), and conductive wool (dyed with Bare Conductive electric paint diluted in water-based ink). Weaving samples with conductive threads and felt made with wool and conductive fiber were built. For the final prototype: A woven textile structure joined with textile pieces made with conductive threads and felt as sensors, with two magnets on the back, connected to the microcontroller.
EM: No UX tests performed. However, it presents a detailed overview of the user journey.
MF: The importance of iterative prototyping of the project, carried out in four cycles—(1) the overall design; (2) weaving techniques, textures, and smart materials; (3) narrative progression; and (4) user interaction, focusing on active participation—was emphasized. The exploration of conductive materials combined with weaving samples, as well as with other textile techniques, was essential to test their conductivity in different structures. All materials were successfully tested, except electric paint, which did not achieve the desired conductivity. “Placing interaction areas as far apart as possible for the final piece design is crucial to prevent sensing issues”. Although audio media was used to communicate the narrative and provide cues about the interaction, sound actuators are not described. Future explorations include assessing the UX through an in situ deployment study.
P7
[70] 1
Report usability evaluation of an illuminative contactless textile with gesture recognition controlled by computer vision.M&T: Gesture-controlled illuminative textile system that consists of POF fabric and computer vision via an integrated camera and a minicomputer, described in previous work [18].
EM: (1) Observation of 25 participants fulfilling 7 hand number gestures performance tasks registered by a stopwatch when finished, conducted in an enclosed room with subdued lighting. (2) Closed-ended questionnaires: a 10-point Likert-type learnability scale, a 6-point UEQ, and a 5-point SUS.
MF: Participants enjoyed using natural number gestures to trigger color changes, positively reacting to the prototype’s attractiveness, perspicuity, and stimulation despite lower efficiency ratings. Future research should involve left-handed gestures, more participants, and interviews for qualitative feedback. For the prototype development, they intend to explore the feasibility of the product and investigate alternative technologies to improve the response stability.
P8
[71]
Explore the feasibility and design possibilities to integrate knit structures and pneumatic actuators in soft robots to enhance user wearable experiences. M&T: Knit sleeves made with nylon and elastane yarns with inserted pneumatic actuators made of EcofexTM 00-10 silicone gel, whose molds were made by 3D printing with polylactic acid (PLA).
EM: No UX tests.
MF: Knitted structure, tension, and needle layout influence pneumatic actuators’ inflation and deformation. “Practical solutions for integrating pneumatic actuators seamlessly into wearable textiles, thereby unlocking new possibilities for human-centered robotic systems” were provided. Other knitting techniques with different yarn properties should be considered for further exploration, as well as quantitative research methods to evaluate technical performance and the integration of pneumatic actuators in a digital system. Although the paper cites the aim of enhancing wearability, usability tests were not conducted and are left for future studies.
P9
[72]
“Explore the interplay between interactive textiles, dual-reality immersive technologies, and social engagement in both physical-digital architectural space”. M&T: Conical structure coupled with modular textile panels knitted with silver conductive yarns (3× 210D/Denier, from Weiwei Line Industry). Each panel is embedded with two antennas—serving as sensors by “monitoring variations and distribution of the e-field” through the detection of body movements and locations—and is connected to a Teensy 3.1 microprocessor sending signals to a computer that is connected to a 4-channel distributed speaker system (playing AI-generated music and linked to an ambisonic spatialization module) and a robotic lighting system (rotating head light module composed of RGB beam lights and a motor) in the central meeting point, where a stainless-steel ornament working as a field transmitter is also located. The VR system was developed in the Unity environment, which provides a real-time and connectionless data transmission method.
EM: Preliminary observation of UI with the installation in exhibitions.
MF: By utilizing various sensors, users can be accurately situated within the installation, controlling sound characteristics and volume across panels for an immersive experience. Users could “control the movement and intensity of the lights as they walk (green), dance (blue), and interact (movement of the light) within the space and each fabric panel of the installation”. Foreseen exploration on expanding both physical and digital dimensions of the installation, as well as more in-depth interactions and UX.
P10
[73]
“Explore the relationship between gestural input and the output of a deformable interface for multi-gestural music interaction” from the user’s perspective.M&T: A deformable TPU fabric interface with a five-vertical-line pattern array with two circles at the end of each line inkjet-printed with SicrysTM I60PM-116 conductive ink (based on single-crystal silver nanoparticles) connected to an Arduino UNO microcontroller through crocodile clips and conductive wire. For UX tests, the prototype was connected to a laptop with a headset.
EM: (1) Exploratory workshop. (2) Observation of user engagement with the probe recognizing four different gestures to control six sound-effect variations. The individual sessions with 12 participants were audio-recorded and included quantitative measurement—(1) to match sound effects to gesture inputs and (2) to indicate their confidence level through a 5-point scale—and (3) open-ended interviews regarding their choices.
MF: The interplay between gestures and sound as input–output has significant effects on how people interact with the system. The TPU fabric was shown to elicit feelings toward sound, allowed the identification of the diverse gestures it evoked, enhanced the sensory experience by combining touch and hearing, was viewed by participants as attractive, easy, intuitive, playable, and memorable, and enabled the mapping of richer and more diverse sound parameters compared to rigid input methods. Limitations include the lack of maturity of printed sensor technology. Further studies include the use of (1) other deformable materials, (2) their combination to create more complex gestural interaction, (3) richer sound parameter control, (4) machine learning and other sensing technologies, and (5) deformable flat interfaces in other domains.
P11
[74]
Examine how auditory manipulation influences the user’s perception of digital fabric textures through contactless mid-air haptic stimuli.M&T: (1) Nine fabric samples and a mid-air haptic device (providing contactless haptic feedback) connected to a computer for single-sensory (touch) interaction. (2) One fabric sample and a mid-air haptic device connected to a computer with Sennheiser 400 s headphones for multisensory (touch and hearing) interaction.
EM: (1) Closed-ended questionnaire (4 items: 3 × 7-point and 1 × 9-point Likert scale) with 37 participants regarding single-sense (touch) feedback. (2) Closed-ended questionnaire (3 items: 2 × 0–100 scale and 1 × 9-point Likert scale) and semi-structured interview with 18 participants assessing multisensory touch and hearing senses feedback.
MF: Audio-tactile experiences with haptic textures in the air modify touch perception and elicit a broader spectrum of emotional reactions, leading to enhanced experiences that can be deeply immersive and captivating for users. In single-sensory evaluation, it was crucial to avoid influences from other senses, such as in the haptic test, where participants used noise-canceling headphones.
P12
[75]
Introduce the “multiscale interaction” concept to facilitate the “enhanced delivery of information via textile-based haptic modules” through research on the depth and detail of users’ haptic experiences, considering selection between body-scale and hand-scale interactions.M&T: Two multiscale haptic textile bands (8- and 4-channel) with pneumatic channels individually linked to an electronic solenoid valve (SYJ314M-6LOZ, from SMC Pneumatics) and air supplied by an air compressor tank (9461K21, from McMaster-Carr) through a pressure regulator (8083T1, from McMaster-Carr) controlled by a computer.
EM: Observation of three experiments conducted with 16 participants evaluating interaction efficacy through identification tasks in a Graphical User Interface (GUI).
MF: The multiscale paradigm matches the accuracy of body-scale and hand-scale interactions, allowing user flexibility in interaction preferences. To assess only haptic interaction, users were prevented from receiving visual and auditory feedback during user testing. Limitations included encoding of information, scalability of the paradigm, and minimal coverage. Future studies on fully textile-based haptic wearables are suggested.
P13
[56]
Present the design of a kinesthetic garment that transforms somaesthetic singing experiences into haptic gestures for intersubjective experiences, evaluated by a micro-phenomenological approach.M&T: A corset constituted of 6 haptic textile modules with geometric embroidered integration of OmniFibers [48] for fluidic haptic pneumatic actuation connected, via polyurethane connection tubes, to three FlowIO pneumatic controllers coupled with a pressure regulator and a compressed air tank. The software stack was composed of a laptop as a central controller—transducing audio, pressure, and strain inputs via machine learning—connected to each FlowIO controller through USB cables.
EM: (1) Preliminary UX observation and think-aloud experiment with a singer. (2) Two sessions with 3-person audiences wearing the prototype (first for HCI researchers and second for musicians) during live performances, each followed by micro-phenomenological interviews.
MF: By bridging internal and external bodily sensations, the prototype proved to dissolve the barriers between the singer, audience, and performance and create a sense of a shared intercorporeal experience. The interview results were thematically analyzed, generating 5 themes related to the somatic intra- and intersubjective UX: (1) somatic materialization of voices; (2) narrative dialogues and dinaesthetic sharing; (3) dissolving boundaries; (4) suspension of time and space; and (5) wearing the performance. These findings led to design contributions, which included “Designing for Intersubjective Experience”, introducing the concept of “intersubjective haptics”, and “A Somatically Grounded Design Process”.
P14
[76]
Introduce a design and fabrication pipeline “to integrate passive force feedback and binary sensing into fabrics via digital machine knitting.”M&T: Textile unit with patches made of plated stainless-steel ferromagnetic yarns (from Filix). As a counterpart unit, the fabric has a split pocket made of PES (for passive haptic feedback) and conductive yarn (for binary sensing) containing a permanent magnet (Neodymium N52, from K&J). Except for the magnet, all items are fabricated by an industrial digital knitting machine.
EM: Preliminary user studies embraced the observation of (1) identification task: 8 participants experimenting with a glove containing the prototype; (2) fulfilling tasks with time completion recording: 6 participants wearing a watch with the prototype.
MF: The sensing mechanism combined with passive force feedback enables interface designs with input and output functions. Participants demonstrated high accuracy in force feedback discrimination and rotation tasks, validating the prototype’s effectiveness. Contributions: (1) six passive haptic interaction designs; (2) parametric design templates; (3) material and fabrication process specifications; and (4) potential application scenarios and artifacts. Future research: (1) integration of programmable magnetic materials into fibers or textiles, (2) user studies to validate their perception during interactions with various designs.
P15
[77]
Study heat and movement-sensitive sonic textiles in order to promote users’ awareness of heat exchanges and enhance their felt experience of warmth.M&T: A blanket and a sweater/wearable blanket made with self-fabricated wool textiles, each containing two piezoelectric microphones (as sound and touch/movement sensors) with an envelope follower and a LM35 temperature sensor, integrated into the inside, all connected to a Bela Mini SBC powered by a 3.7 V Li-Po battery (charged through USB charger). For the output, fabric speakers were built on the textile surface composed of a coil made of copper wire, plastified paper, a magnet, and an amplifier powered by a 9 V battery. Crochet patches covered up electronic components, which were insulated by a layer of synthetic fabric.
EM: (1) Quantitative listening test to compare sound models through a rating scale and validate the warmth sonification metaphor, carried out with 30 participants through PsyToolkit online platform. (2) Qualitative physical UX test with the prototypes (containing only the best-rated sound model identified in the listening test) being worn by 10 participants (5 participants tested each prototype in audio and video-recorded individual sessions) followed by think-aloud methods and semi-structured interviews, to understand how users relate to them. The first phase of the UX test was conducted indoors, and the second after a walk outdoors in wintertime to evaluate different thermal conditions and (un)controlled sonic space.
MF: Adding fabric speakers made it possible to reduce the bulkiness of hard components on the textiles and to enhance users’ comfort. The sweater/wearable blanket shape allowed the designer to have more control over the experience, while the user experience with the blanket shape is “much more open since it can be used in very different ways and the position of sensors and speakers is unpredictable”. This influenced the sense-making perception regarding the relationship between inputs and outputs. “The distinction between heat and movement-sensitive aspects of the sound models was influenced by the different time-scales in which the temperature sensor and the piezo disks operate”, with the first being slower and the second faster. Multimodality sensing entanglement influenced users’ perception of the prototypes’ function. Users’ emotional reactions could be gathered, such as the positive effect of providing bodily temperature self-awareness.
P16
[60]
Examine how sensorial textile-based spatial objects, along with embedded technology, can be incorporated into dementia long-term care settings through a collaborative design approach. M&T: Three interactive sensory spatial textile objects: (1) an armchair with press sensors (buttons and switches) connected to an Arduino UNO microcontroller sending output signals to an aroma diffuser, a massage vibrator, a fan device, and a speaker with an amplifier; (2) a wall picture frame with three ultrasonic proximity sensors linked to an Arduino Mega microcontroller activating an LED cluster through an LED driver; and (3) a portable cover for a handrail integrated with three pushbuttons (as press sensors) connected to an Arduino UNO microcontroller that, through RF transmitter and receiver modules, wirelessly sends signals to a speaker with an amplifier.
EM: No user tests conducted.
MF: Prototypes were considered accessible, ergonomic, and safe, demonstrating the potential to transform dementia long-term care settings into a comforting, playful narrative and communication-eliciting environment to enhance well-being. Sensory ethnography (SE) methodology is planned for later UX evaluation through observation, notes, and video and semi-structured interviews.
P17
[78]
Investigate how different textile materials on the wristbands may influence the affective experience promoted by stroking and squeezing tactile feedback.M&T: A wrist-worn haptic device that supports the easy replacement of different textile materials, composed of an actuating unit—two servo motors individually linked through connectors to two cylindrical cores—mounted in a 3D-printed two-layer frame, which also holds four strain gauge load cells as pressure sensors connected to an HX711 load-cell amplifier to modulate the pressure applied to the skin. Connected to both servo motors, a microcontroller receives signals from a computer through two SN74LS241N octal buffers via USB. Two elastic conventional PES bands fasten the device onto users’ wrists.
EM: Two user experiments evaluating valence and arousal ratings of the investigated feedbacks (1) stroking and (2) squeezing. The tests were subdivided into (1.a and 2.a) evaluation with a motionless prototype and each of the 5 fabrics and (1.b and 2.b) evaluation upon parameterized stimulation of the prototype—both using a 7-point Likert emotion rating scale based on the SAM and on the SDS—followed by a semi-structured interview. Fifteen different participants were included for each test.
MF: Five design considerations were promoted for touch and compression feedback on the wrist mediated by different textile materials: valence-arousal maps as affective design references, individual differences to promote personalized affective feedback, correlation between valence produced by interaction with the comfort of the textile material and emotional experience predominantly activated by movement factors. Limitations/future work include the exploration of (1) a multisensory interaction approach; (2) UX evaluation on diverse interaction scenarios; (3) hardware device minimization; and (4) mechanical or chemical changes of textile wristbands.
P18
[79]
Explore the intersection between wearable technology and environmental awareness by using haptic feedback to communicate a plant’s comfort level to the user.M&T: Line of wearables that respond to atmospheric conditions collected through Airspec smart eyeglasses with SHT45 humidity and temperature sensors that send collected data in real time via Bluetooth to a laptop, which are controlled through FlowIO with pneumatic ports that promote haptic feedback (shrinking and pushing) for tubular and circular layers made of inflatable silicone inserted inside two non-elastic textile layers in an arm warmer and a neck warmer.
EM: No UX tests. However, observation of interaction scenarios to evaluate the functionality of the prototype is planned.
MF: Proposed wearables effectively communicate environmental conditions. The successful combination of sensors and actuators may allow a seamless user experience, promoting a deeper connection with nature through embodied interactions.
P19
[80]
Discuss how methodological tools can represent felt pain experience for somaesthetic interactions through a textile extension.M&T: A soma extension made of conventional textiles with interactive properties created via Arduino microcontroller with a movement sensor module (MPU-6050) and an embedded music shield connected to headphones for sound feedback.
EM: Video-recorded first-person (author) prototype tests, second-person (participants) testing of the prototype, self- and in-depth semi-structured interviews, body map drawing, and data analysis.
MF: It is important to combine visual, verbal, and textual tools for felt experience in interaction design.
P20
[16]
Research gesture affordances when interacting with textile interfaces with different textures and respective emotional user experiences under four feedback modes. M&T: Five interfaces with different textures obtained through fabric manipulation techniques with a scuba knitting fabric, which integrated Adafruit stainless-steel conductive threads, Adafruit conductive silver fiber net fabric, pressure-sensitive fabric (Adafruit Velostat), LEDs, an MP3 module, a speaker, and an Arduino Mega microcontroller with a connected dip switch, a read–write module (CH376S), and a TF memory card.
EM: Video-recorded observation, semi-structured interview, emotion valence, arousal, and stress–relaxation level with a Likert-type questionnaire, HRV, and GSR, audio-recorded user’s feedback.
MF: Emotional experience was more significantly impacted by the feedback modes (haptic, visual, audio, and combination of visual and audio) in emotional valence, arousal, and GSR. However, stress–relaxation level and HRV were influenced by both feedback modes and textile-afforded gestures. Seven guidelines were proposed for “the design of textile interfaces for emotional interaction”.
P21
[81]
Discuss the integration of electronic textile (e-textile) sensors that enable hand interaction in car interior space to assist drivers and empower backseat passengers.M&T: Development of two study prototypes as proof of concept of e-textile pressure-sensing system using conventional and Adafruit silver-based knit Jersey conductive fabric connected via Karl Grimm soldered silver-plated conductive threads stitched to a Lilypad microcontroller board with Bluetooth module (Adafruit Bluefruit LE UART Friend).
EM: No UX evaluation performed.
MF: Referred need to conduct research on UI and perception of tactile e-textile application. Discussion of actuator possibilities: media player, window, air conditioning, and visual feedback with thermochromic threads or LEDs/UV OLECs.
P22
[18] 1
Design an interactive illuminating textile with a contactless number gesture recognition function controlled by computer vision via an open-source AI model.M&T: Striped double-layer woven Jacquard made of PES and POF of PMMA integrated with a micro single camera and an FPC interconnector, which connects to a Raspberry Pi 4B SBC with an open-source AI model (Baidu AI Cloud) based on deep-learning gesture recognition stored in a cloud computing server with Wi-Fi connection. The microprocessor is linked to a 5 V power bank and to a drive circuit that connects RGB LED coupled with POF bundles.
EM: Interaction observation in an indoor environment. UX tests for user satisfaction feedback are recommended for future studies.
MF: Gesture-controlled contactless detection conveys a novel input approach for interactive textiles. Potential for further research applied for designing “multi-sensory environments and smart wearables”. The open-source AI model provides continuous interaction between the physical textile and intangible technology, whose application in the early design process leads to labor and cost minimization. Bulkiness of system components, directionality of the camera, internet signal strength (to avoid detection delay and failing), and exclusion of users with functional hand disabilities are highlighted limitations.
P23
[82]
Demonstrate a 3D textile electrode containing a water-retaining material for bioelectrical signal acquisition with low skin impedance.M&T: 3D textile electrode made of CO yarns dyed with a conductive ink containing a mixture of RGO, silk sericin, and SAP, fabricated by towel embroidery technique.
EM: EMG and ECG measurements as demos for potential clinical applications, ambulatory ECG monitoring for HRV on volunteers performing running/walking exercises. The prototype itself is used as test equipment for usability evaluation.
MF: The proposed electrodes achieved high-fidelity signal acquisition in running motion and sweat conditions. Due to the adaptable embroidered integration technology, electrodes may be embedded into everyday clothing, illustrating its customizability.
P24
[83]
Explore destaining as a design tool for interactive systems by analyzing the relation between textile, stain, and light and studying a set of design parameters.M&T: Four fabrics—100% CO, 100% PES, a blend of 65% PES and 35% CO, and 100% nylon—coated by different methods with TiO2/Ag nanoparticles. Light interaction with stained organic matter was studied through passive interaction, with sunlight exposure, and controlled interaction of LEDs (from Adafruit) embedded into the fabrics, connected with a conductive thread to an Adafruit Metro board and a 3.3 V battery in a holder with a switch. The conductive threads also acted as moisture sensors.
EM: Observation of color and pattern change. UX evaluation was not performed.
MF: Destaining is proposed as a creative tool to design interactive textiles, which enables personalizing or recording users’ memories and experiences within textiles. Design parameters to create destaining textiles were classified, and potential applications were presented, namely the proof of concept of self-cleaning clothing.
P25
[14]
Explore the embodiment of negative feelings and emotions related to remote relationships in textile wearable artifacts.M&T: Two pairs of wearable artifacts: (1) Two vests made of conventional textiles with servo motors, conductive threads, and RFID sensors and tags placed on the chest. (2) One collar made of large draped cushions and a quilted fabric half-vest, both embedded with heat pads. Both pairs are connected to each other via Wi-Fi. Additional electronics (not specified) are hidden on the artifacts’ backside.
EM: Observation of prototype use by authors and colleagues for collecting feedback.
MF: As “discussion artifacts”, both developed wearables allowed reflections on embodied and experienceable negative emotions related to living apart. The visual interplay of involved materials and components demonstrated to be part of the experience, which became two-fold: individual sensation of wearing and shared experience of looking. Not introducing artifact behaviors in detail for participants allowed them to collect deliberate responses without drawing attention away from focal points. Exploration of how to extend interactivity when dislocated from each other is suggested for future work.
P26
[15]
Examine the potential for emotion regulation via movement-based interaction assisted by smart textiles and identify important elements that affect their design.M&T: A t-shirt with flexion/extension and movement sensors made of silver-based knitted elastic, two conductive fabrics, and conductive thread. Three feedback mechanisms were implemented using four LEDs with different colors and brightness for visual response, four motors for vibrotactile, and an Arduino BLE Nano microcontroller for data communication to a mobile app via Bluetooth for audio feedback.
EM: SAM and CRS questionnaires (CRS using 0–20 range), facial emotion capture app (AffdexMe), and interview.
MF: Emotional engagement was more effectively encouraged by audio and vibrotactile feedback. Interactive textile interfaces may lessen the obtrusiveness of wearing electronic equipment, increasing user acceptability. Five design guidelines that contribute to movement-based interactions for emotion regulation were suggested.
P27
[84]
Explore interaction possibilities of textiles with enhanced sensing capabilities through textile textures and sonic outputs.M&T: Seven conductive textile samples, used as proximity and touch sensors, were developed by in situ polymerization (2), embroidery with Madeira 40 silver-plated conductive yarn (1), knitted with the same yarn (1), Amman SilverTech 120 silver-coated conductive thread sewn lines (1), LessEMF copper wire mesh fabric (1), and Sefar woven fabric of carbonized yarn (1). For sound output, the samples were connected to a Cypress PSoC4 microcontroller sending data to a computer software.
EM: Observation of visitors’ interaction with an installation and discussion.
MF: Visitors could act as creators of sonic landscapes. The textile installation provided an immersive and rich experience and proved that “sounds were inseparably intertwined with the textures and tactile sensations”. Future work on an even more immersive structure to allow visitors to interact with other parts of their bodies is suggested.
P28
[85]
Discuss and evaluate the design and fabrication of digitally embroidered textile soft speakers and explore potential applications with maker-users to expand audio and haptic feedback on textile interfaces.M&T: Digitally embroidered pattern made with Karl Grimm copper and silver-plated conductive threads for application on a conventional surface (fabric or leather) with a magnet underneath, connected to a miniature 4–8 Ω amplifier and digital media device through audio cable, powered by 3.7 V Li-Po battery. Microcontrollers and 5 V DC power adapters are suggested as optional components.
EM: Autobiographical design with non-experienced users.
MF: The do-it-yourself (DIY) process developed and the design parameters evaluated can help makers integrate sound and haptic vibration to soft objects by additive and constructive methods. Seven technical and design parameters were presented. The prototypes were developed to demonstrate how interactive capabilities can be embedded into everyday textiles.
P29
[86]
Explore how tangible interactive technology can offer opportunities for socialization and sensory regulation for minimally verbal children with autism through a musical textile interface.M&T: Inflatable ball wrapped in felt sheets topped with elastic Lycra ribbons, attached with stretch sensors each, connected to a central BCTB microcontroller via conductive threads. The system was powered by a 3.7 V lithium battery and linked to a Minirig speaker through an audio cable.
EM: Interaction observation for assessing the way children with autism interact with the prototype alone and with peers using a previously developed framework “combined with an adapted version of Parten’s play stages” and inspired by Social Communication, Emotional Regulation, Transactional Support (SCERTS). Sessions were video-recorded using ELAN software (annotation tool for audio and video recording), and thematic analysis to annotations was applied.
MF: The combination of conventional textiles with sonic outputs provided rich multisensory feedback and calming experiences enjoyed by all participants. Interaction with and around the prototype attained positive results for social interaction, namely by motivating different types of play. Shareability and socialization were facilitated by the object’s properties (round shape and large-size design with different entries and access points) and by the semi-structured format of the evaluation sessions. Multifunctionality and multimodal interaction enabled creativity in technology use, freedom of expression, participation, and agency.
P30
[17]
Explore how a craft approach to interactive technologies can support tangible and multisensory experiences through the development of an interactive and tactile e-textile book.M&T: Interactive book made of conventional fabrics and components (CO fabric, cardboard, felt, and plastic snaps), smart textile elements (optical fiber and thermochromic, hydrochromic, and photochromic pigments), and e-textile elements. These include sensors (microphone, and textile-based pressure and stretch sensors made of spun wool and silver blend, and unspun wool and stainless-steel blend), actuators (enameled copper wire wound into a coil with a magnetic bead, LEDs, speakers and vibration motors), processing units (Arduino Nano board, a motor shield, and greeting card recordable sound modules), power supplies (Li-Po battery with an integrated charger), and other components (reed switch activated by magnet and on/off button). Conductive connections were provided by silverized and gilded copper threads and metal snaps.
EM: Observations of (1) two children engaging with the book and (2) a mixed group of adults and children interacting with it in an exhibition setting.
MF: Working with technology to augment and potentially improve individual multisensory experiences, it is necessary to “couple it with materiality and storytelling”. Textile craft qualities were extended to electronic and computational interaction. Group tests provided insights on shared experience development possibilities. Future work lies in research about real-life applications, expanding audience, exploring crafted interactions (storytelling) over other disciplines, and reflecting on the sustainability aspects of electronic crafts.
P31
[87]
Develop and test a pneumatically actuated soft biomimetic finger with texture discrimination capabilities to provide sensory feedback and create a more natural experience for prosthetic users.M&T: Soft prosthetic finger mainly fabricated with silicone and fabric. For sensing, a 3 × 3 flexible, textile tactile sensor array made of conductive fabric traces (from LessEMF) and piezoresistive fabric (from Eeonyx) is encased by an elastic fabric, integrated into the fingertips and connected to an Arduino Mega 2560 microcontroller through resistors. The pneumatic actuator system is constituted of the prosthetic finger made of three independently controllable joints made of Dragon Skin 10 Medium (from Smooth-on) silicone rubber, an air compressor, and solenoid valves, each connected to an air channel’s inlet and a Honeywell ASDXACX100PAAA5 pressure sensor.
EM: Observation of three healthy individuals receiving sensory feedback sent from the prototype via transcutaneous electrical nerve stimulation (TENS) and performing tasks for identifying 13 standardized textured surfaces.
MF: Participants successfully distinguished two or three textures with the applied stimuli. It is suggested that dynamic stimulation may be more effective than static stimulation for improving sensory perception in prosthetics and human–robot interactions.
P32
[38]
Present a case study of a somatosensory hat towards exploring an interactive clothing design method that reflects human emotions.M&T: Three winter hats made of conventional fabric (Berber fleece with sheepskin, partially sewn with alpaca wool) and components (PVC, zipper), integrated with LED lights and strip, and a brainwave sensor system from NeuroSky composed by a ThinkGear ASIC Module (TGAM), Bluetooth module, dry electrode, ear clip, 3.7 V lithium battery, and embedded I3HGP motherboard processor.
EM: SDS-based questionnaire with 7-point scales for functional, aesthetic, fashion, interactive, and emotional attributes comparison between proposed and traditional winter hats based on sample pictures.
MF: The results show that the proposed dynamic interactive hat can improve the match between product attributes and target users’ emotional response by enhancing the visual appeal of fashion accessories and the humanistic emotional value of smart clothing. The Kansei engineering method provided a scientific basis for the emotional design and enabled the user experience to be quantified.
P33
[88]
Introduce the design and preliminary assessment of a textile-based biofeedback system for supporting gait retraining after a fracture of the lower extremities.M&T: Hallux valgus socks stitched with silver-plated conductive thread to (1) resistive pressure sensors FSR 402 Interlink Electronics connected in series with 4.7 kΩ resistor, Arduino UNO R3, Bluetooth module HC-05 zs-040 version, power bank and (2) five pieces of EeonTex (conductive pressure-sensing fabric) textile sensors—connected in series with a 56 kΩ resistor and stitched as opposing hook-shaped circuits—a power switch and Arduino Pro Mini 3.3 V microcontroller with a Bluetooth module transmitting data to a smartphone.
EM: Indirect observation, self-reporting, semi-structured interview, and closed-ended questions based on the UEQ and the CEQ.
MF: Multiple feedback modes were successfully combined and perceived as real-time and truthful by users: “graphical, verbal, and music feedback on gait quality during training (…) and verbal and vibrotactile feedback on gait tracking”. Evaluation by patients and therapists indicated “acceptance by targeted users, credibility as a rehabilitation tool, and a positive user experience”. The need for a more flexible calibration/personalized design and for an expanded quantitative evaluation was highlighted.
P34
[89]
Explore the aesthetics and the societal impacts of a hybrid textile that combines traditional handcrafts with digital technologies, chemical processes, and elements created by nature.M&T: Feathers prepared with dyeing methods, traditional featherwork, and in situ polymerization with pyrrole and Iron (III) Chloride hand embroidered in several silk chiffon with crimping beads and connected through electrical wires to an MPR121 capacitive touch controller board read by an Arduino Nano microcontroller with Bluetooth module that controls a DFPlayer Mini MP3 module.
EM: No UX tests. However, prototype implementation in a curiosity cabinet for interaction observation is proposed.
MF: By combining haptic interactions with feathers, textiles, and sonic acoustic feedback, the project emphasizes a multisensory experience. The potential of hybrid textiles to foster new forms of artistic expression and societal engagement is highlighted.
P35
[90]
Describe a light-emitting textile project that aims to alter people’s perception of self regarding body image by creating a fictional spatial experience.M&T: Three woven light-emitting interactive textile artifacts made of optical fiber cables integrated with two Sharp infrared (IR) proximity sensors connected to an Arduino microcontroller.
EM: Observation and image recording of prototype experience in an installation (not described).
MF: The project is “two-fold regarding who is perceiving and what is perceived” since the artifact gained sensory abilities. It demonstrated that, since self-perception is a subjective concept, everyone may be differently affected. Due to the restricted perception range of applied proximity sensors (50–500 cm), image processing via a camera is recommended to improve detection location and movement range in future studies.
P36
[91]
Present “a design exploration of the fabrication methods and processes of interweaving mechanical pushbuttons into textiles” through digital embroidery and 3D printing.M&T: A button layer—with digitally embroidered and 3D-printed (using flexible Tronxy Flexible TPU Filament) star-like buttons on a pre-stretched fabric (Lycra)—fitly integrated into a circuit layer—circuitry made of Madeira HC 40 silver-based highly conductive embroidery thread and hand sewn conductive pads (made of vinyl-cut thin copper). A double layer is connected via conductive yarn to an Arduino microcontroller for real-time signal processing displayed on a computer screen, a circuit board, pull-down resistors, and isolated copper wire with a small crimp.
EM: User tests in an informal embodied ideation session to acquire early UX.
MF: The tested fabrication processes were shown to be versatile and repeatable. The designed tactile textile pushbuttons proved to be wearable (soft, flexible, highly stretchable, and comfortable), functional (clear tactile feedback and reliable signals), durable (pressed 4000 times), and useful for on-body interactions. For future iterations, alternative designs (such as using diverse embroidered textures for eyes-free interaction) and reducing manual assembling were suggested.
P37
[92]
Present an electronic khipu—an Andean code system based on knots—as a musical instrument from a decolonial perspective and report its live experimental performance.
M&T: An electronic khipu consisting of a CO main rope and nine secondary strings made of conductive rubber cord stretch sensors arranged and fixed—with screws (on the top) and conductive metal fittings (on the bottom)—on a box with potentiometers and buttons, all connected to a circuit board and to a Teensy 3.6 microprocessor sending data to a computer via cable (stretch-sensor reading sketched below). For video projection, hand movements and knotting gestures are captured by a USB camera sending live images to the computer, which sends video and sound signals via cable to unspecified devices. Other components include banana jack connectors, a knob, and a ring or bracelet (to be used by the performer).
EM: Informal observation of performance audiences.
MF: The integration of the khipu into a digital music interface offers a novel way to engage users, combining tactile interaction with digital sound production. The project demonstrates the potential for cultural artifacts to inspire new technological applications and user experiences. Conductive rubber sensor strings act as flexible and reusable variable resistors whose values can be mapped. A wide range of sound textures results from the different signals produced by the cords, together with the performer’s skin conductivity. “The touch and force used to make the knots produce different intensities”.
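The stretch-sensing principle can be pictured with a short sketch in which each conductive rubber cord forms a voltage divider with a fixed resistor; the pin, scaling range, and MIDI-like output value are assumptions, not the performer’s actual mapping.

```cpp
// Illustrative only: read one conductive-rubber stretch cord via a voltage divider
// and scale it to a 0-127 control value for sound mapping on the computer.
const int CORD_PIN = A0;

void setup() {
  Serial.begin(115200);
}

void loop() {
  int raw = analogRead(CORD_PIN);                  // voltage shifts as the cord stretches and its resistance changes
  int control = constrain(map(raw, 200, 900, 0, 127), 0, 127);
  Serial.println(control);                         // host software maps this to sound parameters
  delay(10);
}
```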
P38
[93]
Present an e-textile soft toy that explores haptic sensations and encourages bodily and social play experiences.
M&T: Soft toy with NFC tags (tracked by a mobile app), an Adafruit Flora microcontroller, five tiny vibration motors, batteries, and a battery holder.
EM: Observation and image recording of an interaction scenario (not described).
MF: The research “contributes to a new generation of toy design combining comfortable and tactile characteristics of textiles with digital technologies”. Five interaction scenarios (alone and with peers) were proposed. UX tests assessing intimate contact and multisensory stimulation are planned as future work.
P39
[61]
Explore Autonomous Sensory Meridian Response (ASMR) media to create new design opportunities for interactive textile-based wearables for enchanting everyday experiences.
M&T: Two garments, each with a pair of Roland binaural recording headphones for 3D audio and playback, a Teensy 3.2 microcontroller (with “TouchRead” capacitive touch sensor pins; sketched below) attached to an audio shield, and specific sensors: (1) a red-plaid jacket connected to the microcontroller’s sensor pins through CO-wrapped copper wire, and (2) a hand-woven cloak with a knitted i-cord (wool and conductive yarn) as a breath/stretch sensor and long capacitive sensors made of single CO-covered copper wire attached to the microcontroller’s sensor pins, plus an SD memory card.
EM: First-person autobiographical approach to each garment by the authors.
MF: The autobiographical approach enabled the design of sonic filters as wearable systems that are familiar and comfortable for designers as users. ASMR media is proposed to foster embodied, intimate, felt, and personal attention practices within daily common surroundings in wearable design, and its relevance for HCI is highlighted.
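The “TouchRead” sensing mentioned in P39 relies on the Teensy’s built-in capacitive measurement; a minimal sketch could look like the following, where the pin and threshold are assumptions that would need calibrating against each electrode’s baseline.

```cpp
// Illustrative only: report touch on a conductive-yarn electrode wired to a Teensy 3.x touch pin.
const int TOUCH_PIN = 0;       // touch-capable pin connected to the textile electrode (assumed)
const int THRESHOLD = 1500;    // raw capacitance units; tune per garment and electrode

void setup() {
  Serial.begin(9600);
}

void loop() {
  int value = touchRead(TOUCH_PIN);        // larger values mean more capacitance (touch/proximity)
  Serial.println(value > THRESHOLD ? "touched" : "idle");
  delay(50);
}
```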
P40
[57] 2
Discuss how a transdisciplinary collaborative design approach between e-textile design, cognitive neuroscience, and HCI can lead to the development of a textile garment that provides one’s body with perceptual changes and emotional responses.
M&T: Tubular jersey dress with tight sleeves integrated with 38 vibration motors distributed along the body, driven by an Arduino microcontroller board. Before the presented prototype, exploratory work was conducted [9].
EM: Open-ended interview over the full wearing experience with a dancer, as a body-conscious person.
MF: Transdisciplinarity shifted perspectives among arts, neurosciences, and HCI researchers, and the importance of shared language for communication was highlighted. The wearable “boundary object” opens new opportunities for the developing field of “sensorial clothing”, demonstrating that e-textiles allow one to “wear” various experiences since the application of vibrotactile patterns elicited a range of haptic metaphors in the wearer. Further research is necessary to fully comprehend the process of designing hidden body-altering experiences, as well as the emotional and social feedback.
P41
[9] 2
Explore the potential for creating clothing that alters how people perceive their bodies by utilizing tactile feedback.
M&T: (1) Jersey textile with 21 vibration motors connected by Shieldex silver-based conductive thread to an Arduino UNO microcontroller linked to computer software. (2) 25 vibration motors distributed on felt material connected through thin electric wires, with two types of conventional fabrics (fluffy soft non-woven PES and structured woven PES) as surfaces and additional components, including three soft buttons.
EM: (1) Questionnaire A with 7-point Likert-type questions, before and after prototype experimentation, assessing “emotional state, body sensations and sensations of materiality”. After the prototype tests, Questionnaire B was also administered, with 9-point Likert-type questions about emotional, subjective reactions (valence, arousal, and dominance) and exploratory questions on material associations. (2) After prototype experimentation, an adapted Questionnaire A with 5-point Likert-type questions and a check-box list to evaluate “emotional, bodily, and materiality sensations”.
MF: The research demonstrated the “potential in considering materials as sensations to design for body perceptions and emotional responses”. Interaction effects between vibrotactile patterns and the textile’s surface elicit different associations and haptic metaphors, influencing emotional arousal and physical sensations. When developing haptic clothing, it is crucial to consider textiles’ surface texture, and the design must be tailored for specific use and user.
P42
[94]
Present a seamless textile-based interactive surface that combines electronics and digital knitting techniques for expressive and virtuosic sonic interactions.
M&T: A piano-pattern textile digitally knitted with silver-plated conductive (from Weiwei Line Industry), thermochromic (from Smarol Technology), and high-flex PES yarns combined with melting yarns. Covering the entire back of the interactive interface, a piezoresistive knit pressure-sensing fabric (LG-SLPA 20k, from Eeonyx) sits between two conductive knit fabrics (Stretch, from LessEMF). The five proximity-sensing fields of the piano’s 60 keys are connected to PCB pads interconnected to capacitive sensing chips (MPR121, from NXP Semiconductor) through highly conductive silver-coated fibers (Liberator 40) that, together with heating elements, are interconnected to a Teensy 4.0 microcontroller via insulated wires. The processing unit has a voltage follower (TLV2374) and an N-channel power MOSFET (IRLB8721, from International Rectifier)—powered by a 6 V external battery and with a resistor—and is connected through USB to a computer, which emits the generated sound.
EM: No UX tests performed.
MF: The integration of electronics at the fiber level into fabrics made by digital knitting techniques “enables personalized, rapid fabrication, and mass-manufacturing” of smart textiles, allowing “performers to experience fabric-based multimodal interaction as they explore the seamless texture and materiality of the electronic textile” through visual and tactile properties. Future research includes “simultaneous knitting of textile heating and pressure-sensing layers on top of the conductive and thermochromic layers, the design of flexible PCB interface circuits for robust textile-hardware connection, and integration of an on-board audio generation system”.
P43
[55]
Explore “wearable sensing technology to support posture monitoring for the prevention of occupational low back pain” for nurses.
M&T: T-shirt with an embroidered circuit of insulated high-conductivity silver-plated nylon thread incorporating two IMUs—LSM9DS0 sensors (accelerometer, gyroscope, and magnetometer) from STMicroelectronics—and a PCB connected to an Adafruit Flora microcontroller linked to an analog switch (NX3L1T3157) and a Li-Po battery. The microcontroller runs an AHRS sensor fusion algorithm and has a BLE (HM10-UART) module sending data to a smartphone application (posture estimation sketched below).
EM: UX of the initial prototype was assessed over four days, with impressions and improvement suggestions recorded in a diary and a final interview. The improved prototype was tested in a wearing trial followed by three validated questionnaires—an adapted version of the UTAUT with additional key constructs (hedonic motivation and behavioral intention), the IMI, and the CEQ—and semi-structured interviews.
MF: The need for accurate detection of low back posture data was addressed through personalized sensor placement and a tight fit with elastic material. The importance of feedback advice on how to improve posture was also highlighted. Smart garment design demands a holistic approach stressing the relationship between hedonic and intrinsic motivations. The study contributes to temporary behavior change, and further research is aimed at testing long-term posture correction through smart garments.
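The underlying posture check can be illustrated with a simplified calculation of trunk inclination from a single accelerometer sample; the axis convention, threshold, and the stubbed sensor reading are assumptions, and the cited system instead runs an AHRS fusion algorithm on the full IMU data.

```cpp
// Illustrative only: flag excessive forward flexion from one accelerometer sample (units: g).
#include <cmath>
#include <cstdio>

const double PI_D = 3.14159265358979;
const double BEND_THRESHOLD_DEG = 30.0;   // assumed flexion limit for triggering feedback

// Stub standing in for the IMU driver.
void readAccel(double &ax, double &ay, double &az) {
  ax = 0.6; ay = 0.0; az = 0.8;           // example sample: trunk tilted forward ~37 degrees
}

int main() {
  double ax, ay, az;
  readAccel(ax, ay, az);
  // Pitch relative to gravity, assuming x points forward along the trunk when upright.
  double pitchDeg = std::atan2(ax, std::sqrt(ay * ay + az * az)) * 180.0 / PI_D;
  if (std::fabs(pitchDeg) > BEND_THRESHOLD_DEG) {
    std::printf("Flexion %.1f deg exceeds limit: send feedback event\n", pitchDeg);
  } else {
    std::printf("Posture within range (%.1f deg)\n", pitchDeg);
  }
  return 0;
}
```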
P44
[95]
Explore craft techniques of double weaving and yarn plying for creating smart textiles with touch sensing and color change behavior.
M&T: Conventional yarns (Pearl Cotton from Halcyon Yarn and Zephyr Wool), conductive yarns and threads (magnet wires, Litz wire, CO-covered non-insulated copper wire from wires.co.uk, and plied stainless-steel thread from Karlsson Robotics), and blue and red thermochromic pigments with activation temperatures of 28 °C, 43 °C, and 56 °C from Chromatic Technologies Incorporated, and Liquitex clear acrylic gel medium. The final prototype was a hand double-woven fabric made with Pearl Cotton, plied resistive heating (stainless-steel) conductive thread, and CO-covered non-insulated copper wire painted with thermochromic pastes (blue 28 °C and red 43 °C).
EM: No UX evaluation performed nor suggested for future studies.
MF: Adaptation of traditional fiber art techniques with smart materials enabled the design of “richly crafted and technologically sophisticated fabrics”. Craft double weaving structures allowed “to support interactivity while hiding circuitry from view,” and both techniques presented great potential for designers to “discover new ways of realizing their smart textile concepts”. Designing creative custom yarns is proposed as future work.
P45
[96]
Present a full-body, customizable haptic textile interface to promote an untethered spatial computing experience.
M&T: Full-body clothing made of double-sided conductive textile (composed of “conductive mesh made of metal thread on the front side of each layer and an insulated fabric on the back side”), which powers and controls attached haptic modules via two-dimensional signal transmission (2DST). Haptic modules include LEDs, a pin connector, and a haptic actuator, and they are controlled by a master module wirelessly communicating with a computer via Bluetooth.
EM: No UX assessment. A demonstration of the spatial experience using a mixed-reality headset and an integrated motion-tracking system was described, with user experience tests indicated as future work.
MF: Personal customizability was provided by haptic modules that had a pin, badge-like connector and could be freely attached by the user anywhere on the conductive textile. 2DST technology was used to enable garment flexibility, wireless connection, haptic feedback, and customizability. Modules could store both acoustic–tactile data and visual expressions of haptic feedback.
P46
[97]
Propose design recommendations for integrating meaningful technology into interactive textile books for infants and toddlers, aiming to promote sensory–motor and pre-cognitive developmentally appropriate interactions.
M&T: Three-page interactive book: (1) Resistive fabric (force-sensitive stretch sensor) connected to a Lilypad buzzer. (2) Velostat between two layers of conductive fabric (pressure-sensitive sensor) sewn to a vibration motor underneath (sketched below). (3) Photoresistors (light sensor) connected to Lilypad LEDs and conductive fabric. Specifications reported by [98] refer to conventional components (mainly CO fabrics, Velcro, and foam) and general electronic circuitry (Arduino Pro Mini microcontroller and batteries connected through plastic-coated flexible cables to conductive snaps).
EM: Interaction observation and video recording of prototype experimentation in play sessions by infants and preschool children alone and with adults. Data were analyzed through an affinity diagram, open coding (focusing on the child’s play behavior, length of interaction with the book, and interaction with the parent), and a timeline in a table, followed by semi-structured interviews with parents and the older children (assessing early perceptions of the children’s interaction: usefulness, playfulness, appropriateness, and interest). User study results were discussed in semi-structured framing interviews with experts and a focus group with parents, both video/audio-recorded and analyzed through thematic analysis.
MF: The authors propose a set of design recommendations for interactive textile books focusing on featured interactions, digital effects, the general “story”, and book design. Reducing the book’s bulkiness and size and better hiding the electronics to prevent disassembly were mentioned for the next design cycle, as well as using softer electronic materials (such as “replacing stranded wires with conductive thread without loss in robustness, fully textile-based sensors and actuators, and integrating flexible circuit boards”) and a wider diversity of textile materials to provide an extended haptic and sensory experience.
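The pressure-to-vibration page can be pictured with a minimal sketch, assuming the Velostat stack is wired as a voltage divider; pins, divider values, and the threshold are illustrative assumptions.

```cpp
// Illustrative only: buzz a vibration motor when the Velostat pressure sensor is pressed.
const int PRESSURE_PIN    = A0;   // voltage-divider node of the conductive-fabric/Velostat stack
const int MOTOR_PIN       = 5;    // PWM pin driving the motor through a transistor
const int PRESS_THRESHOLD = 400;  // tune to the fabric stack and divider resistor

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  int reading = analogRead(PRESSURE_PIN);   // Velostat resistance drops when compressed
  analogWrite(MOTOR_PIN, reading > PRESS_THRESHOLD ? 180 : 0);
  delay(20);
}
```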
P47
[62]
Explore a design framework for embodied design processes to improve the digital communication of textiles’ tactile properties.
M&T: Four tools were described and analyzed. Pocket tool: Arduino board bridging force-sensitive resistors (pressure sensors) within six different textile pockets and a display. Haptic sleeve: sleeve made of viscose fabric with a grid of eccentric rotating mass (ERM) vibrotactile motors (connected to a regulator with three potentiometers), a DC-powered electric heating pad, and a temperature sensor (DS18B20) driven by an Arduino UNO; conventional materials include Velcro and kinesiology elastic tape. Hyper textile: three different fabric sheets (linen, silk, and a coated PES) linked to piezo sensors, audio cables, jumper wires, speakers, and an Arduino board; the recorders, wires, and sensors employed are not described. iShoogle: digital application with interactive videos of digital textile manipulation.
EM: All prototypes included user tests based on interaction observation. Previously developed models of textile experience based on touch behavior types and three tactile-based phases [99] guided the analysis.
MF: The design strategies detected focused on “body part”, “textile interaction”, and “who is generating”. The digital feedback identified encompassed visual, auditory, tactile, and kinaesthetic modalities. Immersion, mediating, augmenting, and replicating the experience were demonstrated as possible approaches for relational experiences, which are suggested for further study. A proposed framework to support design decisions for embodied textile experience unified information from the analyzed tools regarding these variables. Concrete applications for the framework were proposed, as well as future work on the employment of haptic, virtual, and augmented reality technologies to research material interactions.
P48
[100]
Compare human intuition and technical knowledge in designing smart garments through the use case of sensor arrangements on a jacket to detect diverse situations.
M&T: One-size jacket and acceleration sensors (accelerometers). Other components required for gathering the sensor data are not specified.
EM: No UX tests. Sensor performance evaluation for technical measurements conducted.
MF: The best-performing and most accurate sensor arrangements were those created by non-experienced test participants using their intuition rather than those created by system design experts. Sensor placement is more relevant for accuracy than sensor quantity. The best-performing designs presented symmetric layouts. Further studies to calculate optimal sensor layouts and algorithms to enhance the relationship between sensor number and accuracy are proposed.
P49
[59]
Present usability and maintenance improvements in the design of an interactive textile interface that translates musical scores into tactile sensations.
M&T: A full-body suit with 9 patterns, each integrated with control boards receiving data via Wi-Fi from an ESP8266 microcontroller connected to a computer and sending signals through conductive threads to ERM vibration motors (sketched below). Modular connectors (nickel buttons and conductive hook-and-loop tape) link patches containing boards and motors, respectively, and join LEDs into the system.
EM: Observation of two musicians in different rooms performing a duet, only communicating with each other via the prototype.
MF: The alterations implemented in the prototype’s design and applied materials were shown to improve the prototype regarding technical issues and system functionality. Although rich experiential feedback from musicians and a wider audience during performances was mentioned, these UX tests were not detailed (e.g., the augmentation of the audience’s understanding of how the suit works through the multisensory interaction provided by the integration of LEDs). Further studies include the incorporation of motion sensing into the system and the exploration of decentralized processing and power supply units.
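One way to picture the wireless score-to-vibration chain is a node that receives intensity values over Wi-Fi and drives an ERM motor; the credentials, UDP transport, pin, and single-motor scope are illustrative assumptions and do not reproduce the project’s actual protocol or topology.

```cpp
// Illustrative only (ESP8266 Arduino core): receive one-byte vibration intensities over UDP
// and drive an ERM motor through a transistor on a PWM pin.
#include <ESP8266WiFi.h>
#include <WiFiUdp.h>

const char* WIFI_SSID = "score-network";   // assumed network
const char* WIFI_PASS = "changeme";
const int   MOTOR_PIN = 14;                // GPIO14, assumed wiring to the motor driver
WiFiUDP udp;

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
  analogWriteRange(255);                   // match the 0-255 intensity bytes
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(100);
  udp.begin(8888);                         // listen for score data from the computer
}

void loop() {
  if (udp.parsePacket() > 0) {
    int intensity = udp.read();            // one byte: 0-255 vibration intensity
    if (intensity >= 0) analogWrite(MOTOR_PIN, intensity);
  }
}
```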
P50
[101]
Explore the user acceptance of a textile interface that merges traditional design elements of Indian culture and smart materials created for non-verbal communication in the social space.
M&T: Scarf made of poly-dupion fabric with hand-embroidered embellishments and micro RGB LEDs connected through conductive thread to three switches controlling colors and a 3.7 V Li-Po battery.
EM: Group sessions for brief prototype experimentation followed by questionnaires comprising a 5-point Likert scale, a 5-point SDS (with seven variables of non-verbal communication), a 7-point adapted version of the TAM, a 5-point scale based on the SUS, and binary and descriptive questions regarding acceptance, social intelligence, aesthetics, functionality, usability, and emotions evoked. The data collected were analyzed quantitatively in SPSS (Statistical Package for the Social Sciences, version 27).
MF: Positive emotions and aesthetic attributes were positively correlated with perceived usefulness and ease of use. Aesthetics had a significant effect on technology acceptance. The functionality of changing color revealed high acceptance for self-expression and daily interactions. Enhancing non-verbal communication through visual cues was shown to improve interpersonal interactions and to promote social and collective intelligence. Further research correlating usability, acceptance, and cognitive load parameters with larger sample groups and more complex interactions is recommended.
P51
[102]
Present a “construction kit” for building interactive e-textile patches to introduce ideas around identity and self-expression for children as user-learners.
M&T: A storybook, magnetic patches, and electronic components (Lilypad LEDs, a switched coin cell battery holder, and a temperature and light sensor) sewn into a fabric piece with a snap to connect to other e-textiles and a magnet on the backside for assembling on the patch.
EM: Observation and image recording of an interaction scenario (not described, but available through a link).
MF: The first stage of an interactive construction kit was presented through patches and electronic textile circuits working as building blocks. The magnetic patch surface facilitates sharing and the flexible integration of elements into dynamic patch creation. User tests with target users are mentioned as future work to measure and understand reflection on the experience, as well as subsequent module building.
P52
[103]
Discuss the development of a protective and interactive wearable system based on existing sensorial technology to increase workers’ health awareness in small and medium coating plants from the user perspective.
M&T: (1) Protective mask made of thermoformed spacer textile padded with soft foam, with temperature and humidity sensors. (2) Electronic nose (alert gateway) composed of volatile organic compound (VOC) sensors and LEDs. Both artifacts are connected to mobile applications via Bluetooth.
EM: Voice- and image-recorded user sessions involving focus groups, interviews, and scale-based questionnaires to evaluate conceptual mockups, assessing aesthetics, function, comfort, mode of use, etc.
MF: Following a Design Thinking process, the development of the wearables was led by the results of user sessions (including the empathy phase). The real-time feedback and statistical data provided increased the visible benefits of using the wearable system. The system was created to promote a multisensory experience that enhances users’ perception and encourages individual and collective behavior change. Users’ engagement helped highlight the system’s core and added values of awareness (“by monitoring environment and personal indicators”) and comfort (by considering wearability, breathable material, and connection to the body/face). Functional prototype tests were proposed as further studies, expecting results on comfort related to aesthetics (textile materials), perception benefits for motivating and educating users, and clarity of information transmission.
P53
[104]
Explore how blind and visually impaired (VI) people create personally meaningful objects using e-textile materials and hands-on techniques through making workshops.
M&T: A re-recordable device—a microcontroller unit consisting of a PCB, microphone, speaker, record button, and an adapted soft playback button made of conventional textiles (type, size, and shape chosen by participants, fixed with double-sided fabric tape and glue)—connected through snap fasteners to soft wires (insulated conductive thread inside a long fabric tube yarn) and a battery. Pockets were used to hold the electronics (board and battery).
EM: Observation of pilot sessions for feedback on materials, a participatory approach in the e-textile-making sessions, workshops, and follow-up interviews. Data were collected through different media.
MF: The modular approach enabled inclusive and accessible construction based on the form and function of affordable materials. Sharing experiences between participants fostered creativity and mutual learning. Insights were provided on how to run e-textile workshops that are more accessible and inclusive to a wider community, as well as allowing for ownership, creativity, and self-expression.
P54
[105]
Present a craft interpretation of a digital game through an interactive textile interface with color change materials to explore a novel game experience.
M&T: A piece of fabric with 25 (5 × 5) squares with four holes each, stretched on a wooden board hiding wires and electronic components. Except for the color edges, the surface was printed with four conventional color pastes covered by a black thermochromic paste (31 °C) layer. Under each color edge, pieces of conductive fabric were connected to switching transistors (P-channel and N-channel MOSFETs) controlled by an Arduino microcontroller (sketched below). Controllers were made of silver-colored conductive yarn with a metal pin (one curved) on each end to close the circuit and activate the color change.
EM: Observation of initial prototype tests in an exhibition.
MF: Although the game was meant to be used in a quiet scenario, testing it in an exhibition enabled the authors to generate various insights. Despite different perceptions, users of diverse ages became engaged and reacted positively to the textile’s dynamic behavior. Users’ interest in the prototype raised a discussion with the designers about the predominance of digital games. For future studies, technical issues—including conductive yarn and current optimization, and the reversibility and activation temperature of the thermochromic pastes—need to be solved, and design improvements—home application and the addition of levels—are suggested.
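A minimal sketch of the switching logic, assuming each square has a contact sensed on a digital input and a heating trace switched by a MOSFET; the pins, pulse length, and single-square scope are illustrative assumptions.

```cpp
// Illustrative only: when the player's pin closes a square's contact, pulse the MOSFET
// driving that square's heating trace past the thermochromic activation temperature.
const int CONTACT_PIN = 2;             // goes LOW when the metal pin bridges the conductive fabric
const int GATE_PIN    = 9;             // MOSFET gate for the square's heating trace
const unsigned long HEAT_MS = 3000;    // assumed pulse long enough to pass ~31 degC

void setup() {
  pinMode(CONTACT_PIN, INPUT_PULLUP);
  pinMode(GATE_PIN, OUTPUT);
}

void loop() {
  if (digitalRead(CONTACT_PIN) == LOW) {
    digitalWrite(GATE_PIN, HIGH);      // heat: black thermochromic layer clears, colour appears
    delay(HEAT_MS);
    digitalWrite(GATE_PIN, LOW);       // stop heating; the square cools and reverts
  }
}
```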
P55
[106]
Present a pair of fiber-based, self-powered, contactless smart gloves for gesture recognition based on triboelectric effects and electrostatic induction.
M&T: Smart glove with an electrified layer made of wool yarn and PDMS-coated wool yarn on the fingers and a sensing layer with 4 electrodes made of CO fabric coated with carbon nanotubes (CNTs) on the palm. The other hand contains a PES patch on the palm as a counterpart for triboelectrification.
EM: No UX tests performed.
MF: The proposed gloves retained “eminent characteristics of texture and fabric, e.g., flexibility and breathability”. A range of frequently used gestures was shown to illustrate the relative peak voltage (technical measurements) corresponding to the practical use of the smart gloves. The prototype demonstrated continuous information transmission. The triboelectric effect enabled the self-powering function, and spatial electrostatic induction allowed noncontact gesture detection. Both properties and features are intended to improve the user experience; however, user tests were not conducted.
P56
[107]
Present the development of a textile electrode-based e-sleeve integrated into a wearable training system for stroke rehabilitation through a co-design process.
M&T: Sleeve made of conventional fabric (PES/CO from Whaleys Bradford Ltd.) with a 24-electrode array screen printed with functional pastes from Smart Fabric Inks—standard interface (Fabink UV-IF-1004), waterproof interface and encapsulation (Fabink UV-IF-1039), silver ink for printing the flexible conductive layer (Fabink TC-C4007), and silicone rubber carbon paste for printing dry electrodes (Fabink TC-E0002)—sewn onto stretchable fabric and connected via a 24-way ribbon cable to control electronics with a battery. The training system was composed of the e-sleeve, training software, and a movement sensor (Kinect v2.0 from Microsoft).
EM: User sessions with stroke survivors and their caregivers for prototype experimentation based on the think-aloud method (assessing ease of donning and doffing, and comfort) and closed questions about preferred design, color, and washing technique.
MF: E-sleeves were co-created with end users and their caregivers and integrated into a user-friendly wearable training system—electronics, software, and movement sensor. Muscle stimulation was successfully achieved, being optimized for a specific user by the combination of electrode array elements through a control algorithm. Technical tests demonstrated positive bending and cleaning (by washing and wiping) durability features. The developed system “can facilitate rehabilitation exercises for stroke survivors to achieve targeted hand gestures and facilitate repeated movements as part of a rehabilitation program”.
1 P7 [70] and P22 [18] refer to different phases of the same project. 2 P40 [57] and P41 [9] refer to different phases of the same project.

References

  1. Cabral, I.D.; Souto, A.P.; Worbin, L. Introduction. In Dynamic Light Filters: Smart Materials Applied to Textile Design; Springer International Publishing: Cham, Switzerland, 2020; pp. 1–6. ISBN 978-3-030-39529-2. [Google Scholar]
  2. Koncar, V. Introduction to Smart Textiles and Their Applications. In Smart Textiles and their Applications; Koncar, V., Ed.; Woodhead Publishing: Oxford, UK, 2016; pp. 1–8. ISBN 978-0-08-100574-3. [Google Scholar]
  3. Sajovic, I.; Kert, M.; Boh Podgornik, B. Smart Textiles: A Review and Bibliometric Mapping. Appl. Sci. 2023, 13, 10489. [Google Scholar] [CrossRef]
  4. Stoppa, M.; Chiolerio, A. Wearable Electronics and Smart Textiles: A Critical Review. Sensors 2014, 14, 11957–11992. [Google Scholar] [CrossRef] [PubMed]
  5. Tao, X. Smart Technology for Textiles and Clothing–Introduction and Overview. In Smart Fibres, Fabrics and Clothing; Tao, X., Ed.; Woodhead Publishing Series in Textiles; Woodhead Publishing: Oxford, UK, 2001; pp. 1–6. ISBN 978-1-85573-546-0. [Google Scholar]
  6. Merati, A.A. Application of Stimuli-Sensitive Materials in Smart Textiles. In Advanced Textile Engineering Materials; John Wiley & Sons, Ltd.: Hoboken, NJ, USA, 2018; pp. 1–29. ISBN 9781119488101. [Google Scholar]
  7. Lymberis, A.; Paradiso, R. Smart Fabrics and Interactive Textile Enabling Wearable Personal Applications: R&D State of the Art and Future Challenges. In Proceedings of the 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS’08-“Personalized Healthcare through Technology”, Vancouver, BC, Canada, 20–25 August 2008; pp. 5270–5273. [Google Scholar]
  8. Ruckdashel, R.R.; Khadse, N.; Park, J.H. Smart E-Textiles: Overview of Components and Outlook. Sensors 2022, 22, 6055. [Google Scholar] [CrossRef] [PubMed]
  9. Tajadura-Jiménez, A.; Väljamäe, A.; Kuusk, K. Altering One’s Body-Perception Through E-Textiles and Haptic Metaphors. Front. Robot. AI 2020, 7, 7. [Google Scholar] [CrossRef]
  10. Chen, A.C.-Y.; Lin, Y.-C. Warm Robot Classroom: Using Wearable Technology as a Gateway to Culturally Responsive Teaching; Springer: Cham, Switzerland, 2018; Volume 10925 LNCS, ISBN 9783319911519. [Google Scholar]
  11. Júnior, H.L.O.; Neves, R.M.; Monticeli, F.M.; Dall Agnol, L. Smart Fabric Textiles: Recent Advances and Challenges. Textiles 2022, 2, 582–605. [Google Scholar] [CrossRef]
  12. Van Langenhove, L.; Hertleer, C. Smart Clothing: A New Life. Int. J. Cloth. Sci. Technol. 2004, 16, 63–72. [Google Scholar] [CrossRef]
  13. Seymour, S. Fashionable Technology; Springer: Vienna, Austria, 2008; ISBN 978-3-211-74498-7. [Google Scholar]
  14. Beuthel, J.M.; Bentegeac, P.; Fuchsberger, V.; Maurer, B.; Tscheligi, M. Experiencing Distance: Wearable Engagements with Remote Relationships. In Proceedings of the TEI 2021-Proceedings of the 15th International Conference on Tangible, Embedded, and Embodied Interaction, Online, 14–19 February 2021. [Google Scholar]
  15. Jiang, M.; Nanjappan, V.; Ten Bhömer, M.; Liang, H.-N. On the Use of Movement-Based Interaction with Smart Textiles for Emotion Regulation. Sensors 2021, 21, 990. [Google Scholar] [CrossRef]
  16. Jiang, M.; Nanjappan, V.; Liang, H.-N.; Ten Bhömer, M. GesFabri: Exploring Affordances and Experience of Textile Interfaces for Gesture-Based Interaction. Proc. ACM Hum. Comput. Interact. 2022, 6, 1–23. [Google Scholar] [CrossRef]
  17. Posch, I. Crafting Stories: Smart and Electronic Textile Craftsmanship for Interactive Books. In Proceedings of the TEI 2021-Proceedings of the 15th International Conference on Tangible, Embedded, and Embodied Interaction, Online, 14–19 February 2021. [Google Scholar]
  18. Tan, J.; Shao, L.; Lam, N.Y.K.; Toomey, A.; Ge, L. Intelligent Textiles: Designing a Gesture-Controlled Illuminated Textile Based on Computer Vision. Text. Res. J. 2022, 92, 3034–3048. [Google Scholar] [CrossRef]
  19. Benyon, D.; Höök, K.; Nigay, L. Spaces of Interaction. In Proceedings of the 2008 International Conference on Visions of Computer Science, ACM-BCS, Edinburgh, UK, 14–16 April 2010. [Google Scholar]
  20. Tomico, O.; Wilde, D. Soft, Embodied, Situated & Connected: Enriching Interactions with Soft Wearables. mUX J. Mob. User Exp. 2016, 5, 3. [Google Scholar] [CrossRef]
  21. Faria, A.P.; Cunha, J.; Providência, B. Design, Technology and Emotion Measurement. In Proceedings of the NORDSCI Conference on Social Sciences, Helsinki, Finland, 17–19 July 2018; pp. 21–28. [Google Scholar]
  22. Solomon, R.C. What Is an Emotion? Classic and Contemporary Readings, 2nd ed.; Oxford University Press: New York, NY, USA, 2003. [Google Scholar]
  23. Medeiros, W.G. Meaningful Interaction with Products. Des. Issues 2014, 30, 16–28. [Google Scholar] [CrossRef]
  24. Lopes, A.G. HCI Four Waves Within Different Interaction Design Examples. In Proceedings of the Human Work Interaction Design. Artificial Intelligence and Designing for a Positive Work Experience in a Low Desire Society; Bhutkar, G., Barricelli, B.R., Xiangang, Q., Clemmensen, T., Gonçalves, F., Abdelnour-Nocera, J., Lopes, A., Lyu, F., Zhou, R., Hou, W., Eds.; Springer International Publishing: Cham, Switzerland, 2022; pp. 83–98. [Google Scholar]
  25. ISO 9241-210; Ergonomics of Human-System Interaction—Part 210: Human-Centred Design for Interactive Systems. International Organization for Standardization: Geneva, Switzerland, 2010. Available online: https://www.iso.org/obp/ui/en/#iso:std:iso:9241:-210:ed-2:v1:en (accessed on 15 April 2024).
  26. Hassenzahl, M. The Thing and I: Understanding the Relationship Between User and Product. In Funology: From Usability to Enjoyment; Blythe, M.A., Monk, A.F., Overbeeke, K., Wright, P.C., Eds.; Springer: Dordrecht, The Netherlands, 2004; pp. 31–42. ISBN 978-1-4020-2967-7. [Google Scholar]
  27. Minge, M.; Thüring, M. Hedonic and Pragmatic Halo Effects at Early Stages of User Experience. Int. J. Hum. Comput. Stud. 2018, 109, 13–25. [Google Scholar] [CrossRef]
  28. Baskan, A.; Goncu-Berk, G. User Experience of Wearable Technologies: A Comparative Analysis of Textile-Based and Accessory-Based Wearable Products. Appl. Sci. 2022, 12, 11154. [Google Scholar] [CrossRef]
  29. Quinn, B. Textile Futures: Fashion, Design and Technology; Berg Publishers: New York, NY, USA, 2010. [Google Scholar]
  30. Triberti, S.; Chirico, A.; La Rocca, G.; Riva, G. Developing Emotional Design: Emotions as Cognitive Processes and Their Role in the Design of Interactive Technologies. Front. Psychol. 2017, 8, 1773. [Google Scholar] [CrossRef]
  31. Costa, R.; Oliveira, P.; Grilo, A.; Schwarz, A.; Cardon, G.; DeSmet, A.; Ferri, J.; Domenech, J.; Pomazanskyi, A. SmartLife: Smart Clothing Gamification to Promote Energy-Related Behaviours among Adolescents. In Proceedings of the 2017 International Conference on Engineering, Technology and Innovation (ICE/ITMC), Madeira, Portugal, 27–29 June 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 1489–1495. [Google Scholar]
  32. Nelson, E.C.; Verhagen, T.; Noordzij, M.L. Health Empowerment through Activity Trackers: An Empirical Smart Wristband Study. Comput. Human. Behav. 2016, 62, 364–374. [Google Scholar] [CrossRef]
  33. Häkkilä, J. Designing for Smart Clothes and Wearables—User Experience Design Perspective. In Smart Textiles: Fundamentals, Design, and Interaction; Schneegass, S., Amft, O., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 259–278. ISBN 978-3-319-50124-6. [Google Scholar]
  34. Thüring, M.; Mahlke, S. Usability, Aesthetics and Emotions in Human–Technology Interaction. Int. J. Psychol. 2007, 42, 253–264. [Google Scholar] [CrossRef]
  35. Kieffer, S.; Rukonic, L.; Kervyn de Meerendré, V.; Vanderdonckt, J. Specification of a UX Process Reference Model towards the Strategic Planning of UX Activities. In Proceedings of the 14th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, Prague, Czech Republic, 25–27 February 2019; SCITEPRESS-Science and Technology Publications: Setúbal, Portugal, 2019; pp. 74–85. [Google Scholar]
  36. Mauss, I.B.; Robinson, M.D. Measures of Emotion: A Review. Cogn. Emot. 2009, 23, 209–237. [Google Scholar] [CrossRef]
  37. Desmet, P.M.A.; Fokkinga, S.F.; Ozkaramanli, D.; Yoon, J. Emotion-Driven Product Design. In Emotion Measurement, 2nd ed.; Meiselman, H.L., Ed.; Woodhead Publishing: Oxford, UK, 2021; pp. 645–670. ISBN 978-0-12-821125-0. [Google Scholar]
  38. Wang, W.; Zou, J.; Fang, Y. Design and Evaluation of a Somatosensory Hat: An Emotional Semantic Perspective. AATCC J. Res. 2021, 8, 20–29. [Google Scholar] [CrossRef]
  39. Denyer, D.; Tranfield, D. Producing a Systematic Review. In The Sage Handbook of Organizational Research Methods; Sage Publications Ltd.: Thousand Oaks, CA, USA, 2009; pp. 671–689. ISBN 978-1-4129-3118-2. [Google Scholar]
  40. Uman, L.S. Systematic Reviews and Meta-Analyses. J. Can. Acad. Child Adolesc. Psychiatry 2011, 20, 57–59. [Google Scholar]
  41. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 Statement: An Updated Guideline for Reporting Systematic Reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef]
  42. Jiang, M.; Bhömer, M.T.; Liang, H.-N. Exploring the Design of Interactive Smart Textiles for Emotion Regulation. In Proceedings of the HCI International 2020–Late Breaking Papers: Digital Human Modeling and Ergonomics, Mobility and Intelligent Environments: 22nd HCI International Conference, HCII 2020, Copenhagen, Denmark, 19–24 July 2020; Proceedings. Springer: Berlin/Heidelberg, Germany, 2020; pp. 298–315. [Google Scholar]
  43. Rodriguez, M.; Kross, E. Sensory Emotion Regulation. Trends Cogn. Sci. 2023, 27, 379–390. [Google Scholar] [CrossRef] [PubMed]
  44. Stoffregen, T.A.; Mantel, B.; Bardy, B.G. The Senses Considered as One Perceptual System. Ecol. Psychol. 2017, 29, 165–197. [Google Scholar] [CrossRef]
  45. Carlos, L.-R.; Manuel, Z.-R.V.; Verónica del Rocio, O.-L.; Gerardo, M.-L. Wireless Sensor Networks Applications for Monitoring Environmental Variables Using Evolutionary Algorithms. In Intelligent Data Sensing and Processing for Health and Well-Being Applications; Wister, M., Pancardo, P., Acosta, F., Hernández, J.A., Eds.; Intelligent Data-Centric Systems; Academic Press: Cambridge, MA, USA, 2018; pp. 257–281. ISBN 978-0-12-812130-6. [Google Scholar]
  46. Shtarbanov, A. FlowIO Development Platform–the Pneumatic “Raspberry Pi” for Soft Robotics. In Proceedings of the Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2021; ACM: New York, NY, USA, 2021; pp. 1–6. [Google Scholar]
  47. Zhang, P. (Ed.) 9-Industrial Computers. In Advanced Industrial Control Technology; William Andrew Publishing: Oxford, UK, 2010; pp. 345–359. ISBN 978-1-4377-7807-6. [Google Scholar]
  48. Kilic Afsar, O.; Shtarbanov, A.; Mor, H.; Nakagaki, K.; Forman, J.; Modrei, K.; Jeong, S.H.; Hjort, K.; Höök, K.; Ishii, H. OmniFiber: Integrated Fluidic Fiber Actuators for Weaving Movement Based Interactions into the ‘Fabric of Everyday Life’. In Proceedings of the 34th Annual ACM Symposium on User Interface Software and Technology, online, 10–14 October 2021; ACM: New York, NY, USA, 2021; pp. 1010–1026. [Google Scholar]
  49. Goyal, P.; Sahoo, A.K.; Sharma, T.K. Internet of Things: Architecture and Enabling Technologies. Mater Today Proc. 2021, 34, 719–735. [Google Scholar] [CrossRef]
  50. Yang, K.; Isaia, B.; Brown, L.J.E.; Beeby, S. E-Textiles for Healthy Ageing. Sensors 2019, 19, 4463. [Google Scholar] [CrossRef]
  51. Yadav, A.; Yadav, K. Transforming Healthcare and Fitness with AI Powered Next-Generation Smart Clothing. Discov. Electrochem. 2025, 2, 2. [Google Scholar] [CrossRef]
  52. Cleary, F.; Srisa-An, W.; Henshall, D.C.; Balasubramaniam, S. Emerging AI Technologies Inspiring the Next Generation of E-Textiles. IEEE Access 2023, 11, 56494–56508. [Google Scholar] [CrossRef]
  53. Perera, N.; Shahidi, A.M.; Marasinghe, K.; Kaner, J.; Oliveira, C.; Wickenden, R.; Dias, T.; Hughes-Riley, T. Exploring Sustainable Approaches for Electronic Textile Products and Prototypes. Sensors 2024, 24, 5472. [Google Scholar] [CrossRef]
  54. Guridi, S.; Iannacchero, M.; Pouta, E. Towards More Sustainable Interactive Textiles: A Literature Review on The Use of Biomaterials for ETextiles. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 11–16 May 2024; Association for Computing Machinery: New York, NY, USA, 2024. [Google Scholar]
  55. Bootsman, R.; Markopoulos, P.; Qi, Q.; Wang, Q.; Timmermans, A.A. Wearable Technology for Posture Monitoring at the Workplace. Int. J. Hum. Comput. Stud. 2019, 132, 99–111. [Google Scholar] [CrossRef]
  56. Kilic Afsar, O.; Luft, Y.; Cotton, K.; Stepanova, E.R.; Núñez-Pacheco, C.; Kleinberger, R.; Ben Abdesslem, F.; Ishii, H.; Höök, K. Corsetto: A Kinesthetic Garment for Designing, Composing for, and Experiencing an Intersubjective Haptic Voice. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, 23–28 April 2023; ACM: New York, NY, USA, 2023; pp. 1–23. [Google Scholar]
  57. Kuusk, K.; Tajadura-Jiménez, A.; Väljamäe, A. A Transdisciplinary Collaborative Journey Leading to Sensorial Clothing. CoDesign 2020, 16, 311–327. [Google Scholar] [CrossRef]
  58. Madaghiele, V.; Demir, A.D. Pain Creature: Interdisciplinary Collaboration in the Design of an Embodied Textile Instrument for Interactive Dance. In Proceedings of the International Conference on New Interfaces for Musical Expression, Utrecht, The Netherlands, 4–6 September 2024; NIME: Utrecht, The Netherlands, 2024; pp. 465–473. [Google Scholar]
  59. West, T.J.; Bachmayer, A.; Bhagwati, S.; Berzowska, J.; Wanderley, M.M. The Design of the Body:Suit:Score, a Full-Body Vibrotactile Musical Score. In Proceedings of the Human Interface and the Management of Information. Information in Intelligent Systems. HCII 2019; Lecture Notes in Computer Science. Yamamoto, S., Mori, H., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 70–89. [Google Scholar]
  60. Minetou, L.; Chatzopoulos, A.; Tzerachoglou, A.; Priniotakis, G.; van Hoof, J.; Sfyroera, E.; Georgiadou, Z.; Tyrovola, S.; Drosos, C. Homing Wellness: Can Narrative Design Transform Living Spaces for People with Dementia into Engaging Environments Enabling Communication? Front. Public Health 2023, 11, 1198253. [Google Scholar] [CrossRef]
  61. Klefeker, J.; Striegl, L.; Devendorf, L. What HCI Can Learn from ASMR: Becoming Enchanted with the Mundane. In Proceedings of the Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–12. [Google Scholar]
  62. Petreca, B.; Saito, C.; Baurley, S.; Atkinson, D.; Yu, X.; Bianchi-Berthouze, N. Radically Relational Tools: A Design Framework to Explore Materials through Embodied Processes. Int. J. Des. 2019, 13, 7–20. [Google Scholar]
  63. Choi, B.C.K.; Pak, A.W.P. Multidisciplinarity, Interdisciplinarity and Transdisciplinarity in Health Research, Services, Education and Policy: 1. Definitions, Objectives, and Evidence of Effectiveness. Clin. Investig. Med. 2006, 29, 351–364. [Google Scholar]
  64. Buso, A.; Mcquillan, H.; Jansen, K.; Karana, E. AnimaTo: Designing a Multimorphic Textile Artefact for Performativity. In Proceedings of the Designing Interactive Systems Conference, Copenhagen, Denmark, 1–5 July 2024; ACM: New York, NY, USA, 2024; pp. 20–34. [Google Scholar]
  65. Gratz-Kelly, S.; Cerino, M.; Philippi, D.; Göttel, D.; Nalbach, S.; Hubertus, J.; Schultes, G.; Heppe, J.; Motzki, P. Multifunctional Sensor Array for User Interaction Based on Dielectric Elastomers with Sputtered Metal Electrodes. Materials 2024, 17, 5993. [Google Scholar] [CrossRef] [PubMed]
  66. Honauer, M.; Uğur Yavuz, S.; Kuusk, K. TREESENSE–Sensitising Children to Nature through Embodied Play. In Proceedings of the 23rd Annual ACM Interaction Design and Children Conference, Delft, The Netherlands, 17–20 June 2024; ACM: New York, NY, USA, 2024; pp. 639–643. [Google Scholar]
  67. Huang, X.; Romano, D.M. Coral Morph: An Artistic Shape-Changing Textile Installation for Mindful Emotion Regulation in the Wild. Int. J. Hum. Comput. Interact. 2025, 41, 1173–1189. [Google Scholar] [CrossRef]
  68. Huang, X. Constructing the Affectiveness and Aesthetics of Touch through Shape-Changing Fashion and Textiles. Des. J. 2023, 26, 817–827. [Google Scholar] [CrossRef]
  69. Santos, L.; Dionisio, M.; Campos, P. TapeStory: Exploring the Storytelling Potential of Interactive Tapestries. In Proceedings of the Creativity and Cognition, Chicago, IL, USA, 23–26 June 2024; ACM: New York, NY, USA, 2024; pp. 686–699. [Google Scholar]
  70. Tan, J.; Shao, L.; Lam, N.Y.K.; Toomey, A.; Chan, H.H.; Lee, C.; Feng, G.Y. Evaluating the Usability of a Prototype Gesture-Controlled Illuminative Textile. J. Text. Inst. 2024, 115, 350–356. [Google Scholar] [CrossRef]
  71. Wang, M.; Zhou, Y.; Stewart, R. Soft Wearable Robotics: Innovative Knitting-Integrated Approaches for Pneumatic Actuators Design. In Proceedings of the Designing Interactive Systems Conference, Copenhagen, Denmark, 1–5 July 2024; ACM: New York, NY, USA, 2024; pp. 234–238. [Google Scholar]
  72. Wicaksono, I.; Blanchard, L.; Chin, S.; Colon, C.; Paradiso, J. KnitworkVR: Dual-Reality Experience through Distributed Sensor-Actuator Networks in the Living Knitwork Pavilion. In Proceedings of the SIGGRAPH Asia 2024 Art Papers, Tokyo, Japan, 3–6 December 2024; ACM: New York, NY, USA, 2024; pp. 1–7. [Google Scholar]
  73. Wu, Z.; Gao, Z.; Xu, H.; Yang, X.; Braud, T. SoundMorphTPU: Exploring Gesture Mapping in Deformable Interfaces for Music Interaction. In Proceedings of the International Conference on New Interfaces for Musical Expression, Utrecht, The Netherlands, 4–6 September 2024; NIME: Utrecht, The Netherlands, 2024; pp. 395–406. [Google Scholar]
  74. Xue, J.; Montano Murillo, R.; Dawes, C.; Frier, W.; Cornelio, P.; Obrist, M. FabSound: Audio-Tactile and Affective Fabric Experiences Through Mid-Air Haptics. In Proceedings of the CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, 23–28 April 2024; ACM: New York, NY, USA, 2024; pp. 1–17. [Google Scholar]
  75. Zook, Z.A.; Jumet, B.; Yousaf, A.; Preston, D.J.; O’Malley, M.K. Multiscale Textile-Based Haptic Interactions. Adv. Intell. Syst. 2024, 6, 2300897. [Google Scholar] [CrossRef]
  76. Luo, Y.; Zhu, J.; Wu, K.; Honnet, C.; Mueller, S.; Matusik, W. MagKnitic: Machine-Knitted Passive and Interactive Haptic Textiles with Integrated Binary Sensing. In Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, New York, NY, USA, 29 October–1 November 2023; ACM: New York, NY, USA, 2023; pp. 1–13. [Google Scholar]
  77. Madaghiele, V.; Demir, A.D.; Pauletto, S. Heat-Sensitive Sonic Textiles: Fostering Awareness of the Energy We Save by Wearing Warm Fabrics. In Proceedings of the Sound and Music Computing Conference 2023, Stockholm, Sweden, 15–17 June 2023; pp. 395–402. [Google Scholar]
  78. Yang, X.; Zhu, K. Emoband: Investigating the Affective Perception towards On-Wrist Stroking and Squeezing Feedback Mediated by Different Textile Materials. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, 23–28 April 2023; ACM: New York, NY, USA, 2023; pp. 1–20. [Google Scholar]
  79. Youn, H.J.; Zhong, S.; Shtarbanov, A.; Chwalek, P. NugiTex: An Interactive, Affective Wearable That Informs Users of a Plant’s “Comfort” Level through Haptic Cues. In Proceedings of the 2023 ACM Designing Interactive Systems Conference, New York, NY, USA, 12–16 June 2023; ACM: New York, NY, USA, 2023; pp. 251–255. [Google Scholar]
  80. Demir, A.D.; Kuusk, K.; Nimkulrat, N. Squeaky/Pain: Articulating the Felt Experience of Pain for Somaesthetic Interactions|Squeaky/Pain: Articular l’experiència Sentida Del Dolor per al Disseny d’interaccions Somaestètiques. Temes Disseny 2022, 2022, 162–178. [Google Scholar] [CrossRef]
  81. Khorsandi, P.M.; Nousir, A.; Nabil, S. Functioning E-Textile Sensors for Car Infotainment Applications. Eng. Proc. 2022, 15, 22. [Google Scholar] [CrossRef]
  82. Zhao, J.; Deng, J.; Liang, W.; Zhao, L.; Dong, Y.; Wang, X.; Lin, L. Water-Retentive, 3D Knitted Textile Electrode for Long-Term and Motion State Bioelectrical Signal Acquisition. Compos. Sci. Technol. 2022, 227, 109606. [Google Scholar] [CrossRef]
  83. Bell, F.; Hong, A.; Danielescu, A.; Maheshwari, A.; Greenspan, B.; Ishii, H.; Devendorf, L.; Alistar, M. Self-Destaining Textiles: Designing Interactive Systems with Fabric, Stains and Light. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2021; ACM: New York, NY, USA, 2021. Article 631. pp. 1–12. [Google Scholar]
  84. Mlakar, S.; Preindl, T.; Pointner, A.; Haberfellner, M.A.; Danner, R.; Aigner, R.; Haller, M. The Sound of Textile: An Interactive Tactile-Sonic Installation. In Proceedings of the 10th International Conference on Digital and Interactive Arts, Aveiro, Portugal, 13–15 October 2021; ACM: New York, NY, USA, 2021; pp. 1–5. [Google Scholar]
  85. Nabil, S.; Jones, L.; Girouard, A. Soft Speakers: Digital Embroidering of DIY Customizable Fabric Actuators. In Proceedings of the 15th International Conference on Tangible, Embedded, and Embodied Interaction, Salzburg, Austria, 14–17 February 2021; ACM: New York, NY, USA, 2021; pp. 1–5. [Google Scholar]
  86. Nonnis, A.; Bryan-Kinns, N. Olly: A Tangible for Togetherness. Int. J. Hum. Comput. Stud. 2021, 153, 102647. [Google Scholar] [CrossRef]
  87. Sankar, S.; Balamurugan, D.; Brown, A.; Ding, K.; Xu, X.; Low, J.H.; Yeow, C.H.; Thakor, N. Texture Discrimination with a Soft Biomimetic Finger Using a Flexible Neuromorphic Tactile Sensor Array That Provides Sensory Feedback. Soft. Robot. 2021, 8, 577–587. [Google Scholar] [CrossRef] [PubMed]
  88. Biesmans, S.; Markopoulos, P. Design and Evaluation of Sonis, a Wearable Biofeedback System for Gait Retraining. Multimodal Technol. Interact. 2020, 4, 60. [Google Scholar] [CrossRef]
  89. Briot, A.; Honnet, C.; Strohmeier, P. Stymphalian Birds: Exploring the Aesthetics of A Hybrid Textile. In Proceedings of the 2020 ACM Designing Interactive Systems Conference, Eindhoven, The Netherlands, 6–10 July 2020; ACM: New York, NY, USA, 2020; pp. 437–440. [Google Scholar]
  90. Demir, A.D. AURA: Altering Self-Perception Through Interactive Light Emitting Textiles. In Proceedings of the 11th Nordic Conference on Human–Computer Interaction, Tallinn, Estonia, 25–29 October 2020; ACM: New York, NY, USA, 2020; pp. 1–3. [Google Scholar]
  91. Goudswaard, M.; Abraham, A.; Goveia Da Rocha, B.; Andersen, K.; Liang, R.-H. FabriClick: Interweaving Pushbuttons into Fabrics Using 3d Printing and Digital Embroidery. In Proceedings of the 2020 ACM Designing Interactive Systems Conference, Eindhoven, The Netherlands, 6–10 July 2020; ACM: New York, NY, USA, 2020; pp. 379–393. [Google Scholar]
  92. Hinojosa, L.P.C. Knotting the Memory//Encoding the Khipu_: Reuse of an Ancient Andean Device as a NIME. In Proceedings of the International Conference on New Interfaces for Musical Expression, Birmingham, UK, 21–25 July 2020; Royal Birmingham Conservatoire: Birmingham, UK, 2020; pp. 495–498. [Google Scholar]
  93. Honauer, M.; Yavuz, S.U.; Kuusk, K. WORM-E: An Interactive Toy Enriching Children’s Bodily and Social Play. In Proceedings of the Companion Publication of the 2020 ACM Designing Interactive Systems Conference, Eindhoven, The Netherlands, 6–10 July 2020; ACM: New York, NY, USA, 2020; pp. 333–336. [Google Scholar]
  94. Wicaksono, I.; Paradiso, J. KnittedKeyboard: Digital Knitting of Electronic Musical Controllers. In Proceedings of the International Conference on New Interfaces for Musical Expression, Birmingham, UK, 21–25 July 2020; Birmingham City University: Birmingham, UK, 2020; pp. 323–326. [Google Scholar]
  95. Devendorf, L.; Di Lauro, C. Adapting Double Weaving and Yarn Plying Techniques for Smart Textiles Applications. In Proceedings of the 13th International Conference on Tangible, Embedded, and Embodied Interaction, Tempe, AZ, USA, 17–20 March 2019; ACM: New York, NY, USA, 2019; pp. 77–85. [Google Scholar]
  96. Furukawa, T.; Hanamitsu, N.; Kamiyama, Y.; Nii, H.; Krekoukiotis, C.; Minamizawa, K.; Noda, A.; Yamada, J.; Kitamura, K.; Niwa, D.; et al. Synesthesia Wear: Full-Body Haptic Clothing Interface Based on Two-Dimensional Signal Transmission. In Proceedings of the SIGGRAPH Asia 2019 Emerging Technologies, Brisbane, QLD, Australia, 17–20 November 2019; ACM: New York, NY, USA, 2019; pp. 48–50. [Google Scholar]
  97. Honauer, M.; Moorthy, P.; Hornecker, E. Interactive Soft Toys for Infants and Toddlers–Design Recommendations for Age-Appropriate Play. In Proceedings of the 2019 ACM Conference on Interaction Design and Children, Barcelona, Spain, 18–21 June 2019; ACM: New York, NY, USA, 2019; pp. 265–276. [Google Scholar]
  98. Moorthy, P.; Honauer, M.; Hornecker, E.; Mühlenberend, A. Hello World: A Children’s Touch and Feel Books Enhanced with DIY Electronics. In Proceedings of the 16th International Conference on Mobile and Ubiquitous Multimedia, Stuttgart, Germany, 26–29 November 2017; ACM: New York, NY, USA, 2017; pp. 481–488. [Google Scholar]
  99. Petreca, B.; Baurley, S.; Bianchi-Berthouze, N. How Do Designers Feel Textiles? In Proceedings of the 2015 International Conference on Affective Computing and Intelligent Interaction, Xi’an, China, 24–27 August 2015; IEEE: New York, NY, USA, 2015; pp. 982–987. [Google Scholar]
  100. Rus, S.; Kirchbuchner, F.; Braun, A.; Kuijper, A. Designing a Self-Aware Jacket: Insights into Smart Garment’s Creation Process. In Proceedings of the 12th ACM International Conference on Pervasive Technologies Related to Assistive Environments, Rhodes, Greece, 26–29 June 2019; ACM: New York, NY, USA, 2019. Article 35. pp. 53–58. [Google Scholar]
  101. Yammiyavar, P.; Deepshikha, D. Exploring Potential of Traditionally Crafted Textiles to Transform into E-Wearables for Use in Socio-Cultural Space; Springer: Cham, Switzerland, 2019; Volume 544, ISBN 9783030052966. [Google Scholar]
  102. Boone, A.; Rivera, E.; Wolf, J. Patchwork: An Expressive e-Textile Construction Kit. In Proceedings of the 2018 ACM Conference on Interaction Design and Children, Trondheim, Norway, 19–22 June 2018; ACM: New York, NY, USA, 2018; pp. 529–532. [Google Scholar]
  103. Ferraro, V.; Stepanovic, M.; Ferraris, S. Wearability and User Experience through User Engagement: The Case Study of a Wearable Device; Springer: Cham, Switzerland, 2018; Volume 608, ISBN 9783319606385. [Google Scholar]
  104. Giles, E.; Van Der Linden, J.; Petre, M. Weaving Lighthouses and Stitching Stories: Blind and Visually Impaired People Designing e-Textiles. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; ACM: New York, NY, USA, 2018. Article 470. pp. 1–12. [Google Scholar]
  105. Itzhak, K.; Kantek, K.; Levi, D.; Geiger, S.; Nir, S.; Rinott, M. Sew-Flow: A Craft Interpretation of a Digital Game. In Proceedings of the 12th International Conference on Tangible, Embedded, and Embodied Interaction, Stockholm, Sweden, 18–21 March 2018; ACM: New York, NY, USA, 2018; pp. 307–312. [Google Scholar]
  106. Wu, H.; Guo, H.; Su, Z.; Shi, M.; Chen, X.; Cheng, X.; Han, M.; Zhang, H. Fabric-Based Self-Powered Noncontact Smart Gloves for Gesture Recognition. J. Mater. Chem. A 2018, 6, 20277–20288. [Google Scholar] [CrossRef]
  107. Yang, K.; Meadmore, K.; Freeman, C.; Grabham, N.; Hughes, A.-M.; Wei, Y.; Torah, R.; Glanc-Gostkiewicz, M.; Beeby, S.; Tudor, J. Development of User-Friendly Wearable Electronic Textiles for Healthcare Applications. Sensors 2018, 18, 2410. [Google Scholar] [CrossRef]
Figure 1. Model of user experience with smart textiles. Adapted from [26,28,33,34].
Figure 2. Resulting data from the application of PRISMA flow diagram.
Figure 3. Analysis of materials and technologies (M&T) applications.
Figure 4. Analysis of user experience (UX) evaluation methods.
Table 1. Specific tools for quantitative evaluation.
Table 1. Specific tools for quantitative evaluation.
| Quantitative Evaluation Tool | Brief Description | Papers |
| --- | --- | --- |
| Brief Mood Introspection Scale (BMIS) | Scale containing 16 mood adjectives, each rated on a 1–4 Likert scale. It allows overall pleasantness–unpleasantness and arousal–calmness measurements, in addition to reversibly scored positive-tired and negative-calm mood dimensions. | P4 |
| Comfort Rating Scale (CRS) | Tool that measures the comfort of wearable computers across six attributes: emotion, attachment, harm, perceived change, movement, and anxiety. Scores are marked from low to high and can vary in range. | P26 |
| Credibility and Expectancy Questionnaire (CEQ) | Tool that contrasts what people think a system can do to improve their condition with the extent to which they feel it will actually lead to improvements. It consists of two subscales—credibility and expectancy—each comprising three items rated on a 9-point Likert scale (no agreement–full agreement), yielding subscale scores of 3–27. | P33; P43 |
| Godspeed Questionnaire Series (GQS) | Set of standardized self-report questionnaires to assess human perceptions of robots, AI, and other autonomous systems regarding their social and interactive capabilities. It consists of five key dimensions (anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety), each measured on a 5-point Likert scale. | P4 |
| Intrinsic Motivation Inventory (IMI) | Multidimensional assessment tool that evaluates participants’ subjective experiences with a target activity, consisting of 7-point rating scales organized into six subscales: interest/enjoyment, perceived competence, effort, value/usefulness, pressure/tension felt, and perceived choice while performing a given task. | P43 |
| Perceived Social Intelligence Survey (PSI) | 5-point Likert scale psychological tool to assess how people perceive the social intelligence of artificial agents and robots, i.e., their ability to understand and interact effectively in social situations. The original version comprises 80 items. | P4 |
| Self-Assessment Manikin (SAM) | Non-verbal graphical self-report questionnaire with a 9-point Likert scale that evaluates valence, arousal, and dominance to indirectly assess affective responses and perceived emotions toward a wide range of stimuli. | P17; P26 |
| Semantic Differential Scale (SDS) | Measuring tool that uses a series of bipolar/opposite scales (usually adjectives or phrases) to assess an individual’s subjective perception of, and emotional responses toward, a particular concept (terms, objects, events, activities, ideas). The most common versions use 5-, 7-, and 9-point Likert scales. | P17; P32; P50 |
| Social Touch Questionnaire (STQ) | Self-report 20-item scale to assess individual attitudes, preferences, and perceptions (comfortable or uncomfortable) regarding different types of social touch in various contexts. Most versions use a 5-point or 7-point Likert scale. | P4 |
| System Usability Scale (SUS) | 5-point Likert scale questionnaire with 10 specific questions providing a broad assessment of system usability in terms of effectiveness, efficiency, and satisfaction. | P7; P50 |
| Technology Acceptance Model (TAM) | Model that measures individuals’ acceptance of information systems by predicting the user’s behavioral intention, which is determined by two primary factors: perceived usefulness and perceived ease of use. | P50 |
| Trait Meta-Mood Scale (TMMS-24) | 24-item, 3-factor self-report measure that assesses users’ emotional abilities along three dimensions (emotional attention, clarity, and repair). Typically used in emotional intelligence and mindfulness research. | P4 |
| Unified Theory of Acceptance and Use of Technology (UTAUT) | Unified evaluation model (synthesized from eight earlier models, including TAM) to indicate people’s intention to adopt a specific technology, with items rated on 1–7 scales across four key constructs that determine user acceptance and usage behavior: performance expectancy, effort expectancy, social influence, and facilitating conditions. | P43 |
| User Experience Questionnaire (UEQ) | Questionnaire that considers pragmatic and hedonic attributes, supporting a holistic assessment of the user experience. It includes 26 semantic differentials with 7 levels (−3 to +3), organized into six subscales: attractiveness, perspicuity, efficiency, dependability, stimulation, and novelty. | P7; P33 |
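As an illustrative complement to Table 1, the sketch below shows how responses to one of these tools, the SUS, are commonly aggregated into a 0–100 score using the standard scoring rule (odd-numbered items contribute response − 1, even-numbered items contribute 5 − response, and the sum is multiplied by 2.5). This is a generic Python example, not code taken from any of the reviewed papers (P7, P50); the function name and the sample ratings are illustrative only.

```python
# Minimal sketch of standard SUS scoring, assuming ten 1-5 Likert responses
# per participant. Not taken from the reviewed papers; names are illustrative.

def sus_score(responses: list[int]) -> float:
    """Return the System Usability Scale score (0-100) for one participant.

    Odd-numbered items (indices 0, 2, ...) are positively worded: response - 1.
    Even-numbered items are negatively worded: 5 - response.
    The summed contributions (0-40) are multiplied by 2.5 to reach 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    if any(r < 1 or r > 5 for r in responses):
        raise ValueError("SUS responses must be on a 1-5 scale")

    total = 0
    for i, r in enumerate(responses):
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5


# Example: one (hypothetical) participant's ratings for items 1-10
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```

Per-participant scores computed this way are typically averaged across the sample; interpretation thresholds (e.g., treating scores above roughly 68 as above-average usability) vary between studies.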
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
