Search Results (55)

Search Parameters:
Keywords = tactile augmentation

18 pages, 602 KiB  
Review
Innovations in Robot-Assisted Surgery for Genitourinary Cancers: Emerging Technologies and Clinical Applications
by Stamatios Katsimperis, Lazaros Tzelves, Georgios Feretzakis, Themistoklis Bellos, Ioannis Tsikopoulos, Nikolaos Kostakopoulos and Andreas Skolarikos
Appl. Sci. 2025, 15(11), 6118; https://doi.org/10.3390/app15116118 - 29 May 2025
Viewed by 670
Abstract
Robot-assisted surgery has transformed the landscape of genitourinary cancer treatment, offering enhanced precision, reduced morbidity, and improved recovery compared to open or conventional laparoscopic approaches. As the field matures, a new generation of technological innovations is redefining the boundaries of what robotic systems can achieve. This narrative review explores the integration of artificial intelligence, advanced imaging modalities, augmented reality, and connectivity in robotic urologic oncology. The applications of machine learning in surgical skill evaluation and postoperative outcome predictions are discussed, along with AI-enhanced haptic feedback systems that compensate for the lack of tactile sensation. The role of 3D virtual modeling, intraoperative augmented reality, and fluorescence-guided surgery in improving surgical planning and precision is examined for both kidney and prostate procedures. Emerging tools for real-time tissue recognition, including confocal microscopy and Raman spectroscopy, are evaluated for their potential to optimize margin assessment. This review also addresses the shift toward single-port systems and the rise of telesurgery enabled by 5G connectivity, highlighting global efforts to expand expert surgical care across geographic barriers. Collectively, these innovations represent a paradigm shift in robot-assisted urologic oncology, with the potential to enhance functional outcomes, surgical safety, and access to high-quality care. Full article
(This article belongs to the Special Issue New Trends in Robot-Assisted Surgery)

14 pages, 636 KiB  
Review
Technical Innovations and Complex Cases in Robotic Surgery for Lung Cancer: A Narrative Review
by Giacomo Cusumano, Giuseppe Calabrese, Filippo Tommaso Gallina, Francesco Facciolo, Pierluigi Novellis, Giulia Veronesi, Stefano Viscardi, Filippo Lococo, Elisa Meacci, Alberto Terminella, Gaetano Romano, Cristina Zirafa, Franca Melfi, Stefano Margaritora and Marco Chiappetta
Curr. Oncol. 2025, 32(5), 244; https://doi.org/10.3390/curroncol32050244 - 22 Apr 2025
Viewed by 961
Abstract
For over two decades, robotic-assisted thoracic surgery (RATS) has revolutionized thoracic oncology. With enhanced visualization, dexterity, and precision, RATS has reduced blood loss, shortened hospital stays, and sped up recovery compared to traditional surgery or video-assisted thoracoscopic surgery (VATS). The use of 3D high-definition imaging and articulated instruments allows for complex resections and advanced lymph node assessment. RATS delivers oncological outcomes similar to open surgery and VATS, with high rates of complete (R0) resections and acceptable complication rates. Its minimally invasive nature promotes quicker recovery. Advances in imaging software and augmented reality further enhance surgical accuracy and reduce intraoperative risks. However, RATS has some limitations, including high costs and a lack of tactile feedback, and certain complex procedures, such as extended resections and intrapericardial interventions, remain challenging. With growing experience and technological advances, RATS shows promise in reducing morbidity, improving quality of life, and expanding access to advanced oncologic care. This article reviews the evolution, benefits, and limitations of RATS in NSCLC treatment, highlighting its emerging role in managing complex cases. Full article
(This article belongs to the Section Thoracic Oncology)

15 pages, 3022 KiB  
Article
Multi-Object Recognition and Motion Detection Based on Flexible Pressure Sensor Array and Deep Learning
by Hao Zhang, Yanan Tao, Kai Shi, Jiali Li, Jianjun Shi, Shaofeng Xu and Ying Guo
Appl. Sci. 2025, 15(6), 3302; https://doi.org/10.3390/app15063302 - 18 Mar 2025
Cited by 1 | Viewed by 790
Abstract
With ongoing technological advancements, artificial tactile systems have become a prominent area of research, aiming to replicate human tactile capabilities and enable machines and devices to interact with their environments. Achieving effective artificial tactile sensing relies on the integration of high-performance pressure sensors, precise signal acquisition, robust transmission, and rapid data processing. In this study, we developed a sensor array system based on flexible pressure sensors designed to recognize objects of varying shapes and sizes. The system comprises a multi-channel acquisition circuit and a signal transmission circuit and employs a convolutional neural network (CNN) to classify distinct signal patterns. In tests on a single individual, the system achieved a recognition accuracy of 99.60% across two sphere sizes, three cylinder sizes, a cone, and a rectangular prism; across a group of eight people, it achieved 93.75%. Furthermore, we applied the sensor array system to a ball-throwing task, where it reliably recognized four distinct stages: empty hand, holding the ball, throwing, and catching. In repeated tests by other individuals, it also clearly distinguished each stage. The development of artificial tactile systems allows robots to engage with their environments in a more nuanced and precise manner, enabling complex tasks such as surgical procedures, enhancing the interactive experience of wearable devices, and increasing immersion in virtual reality (VR) and augmented reality (AR). When integrated with deep learning, artificial tactile sensing shows significant potential for creating more intelligent and efficient applications. Full article
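The abstract describes a pipeline of pressure-frame acquisition followed by CNN classification but gives no architectural details. The snippet below is a minimal, hypothetical PyTorch sketch of such a classifier; the 16x16 array size, layer widths, and dummy data are assumptions for illustration, and only the seven object classes (two spheres, three cylinders, a cone, a rectangular prism) follow the abstract.

```python
# Hypothetical sketch of a CNN classifier for flexible-pressure-sensor frames.
# The 16x16 array size and layer widths are assumptions, not the authors' design.
import torch
import torch.nn as nn

class PressureCNN(nn.Module):
    def __init__(self, num_classes: int = 7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 16x16 -> 8x8
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 8x8 -> 4x4
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, 64), nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, 16, 16) normalized pressure readings
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = PressureCNN()
    frames = torch.rand(8, 1, 16, 16)              # dummy batch of pressure maps
    print(model(frames).shape)                     # torch.Size([8, 7])
```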

20 pages, 1087 KiB  
Review
Enabling Tactile Internet via 6G: Application Characteristics, Requirements, and Design Considerations
by Bharat S. Chaudhari
Future Internet 2025, 17(3), 122; https://doi.org/10.3390/fi17030122 - 11 Mar 2025
Cited by 1 | Viewed by 1510
Abstract
With the emergence of artificial intelligence and advancements in network technologies, the arrival of 6G is not far away. 6G technology will introduce unique and innovative applications of the Tactile Internet in the near future. This paper highlights the evolution towards the Tactile Internet enabled by 6G technology, along with the details of 6G capabilities. It emphasizes the stringent requirements of emerging Tactile Internet applications and the critical role of parameters such as latency, reliability, and data rate. The study identifies the important characteristics of future Tactile Internet applications, translates them into explicit requirements, and then discusses the associated design considerations. It focuses on how the characteristics of applications such as virtual reality/augmented reality, remote surgery, gaming, smart cities, autonomous vehicles, industrial automation, brain–machine interfaces, and telepresence/holography drive the requirements and design of 6G and the Tactile Internet. Furthermore, we discuss the parameters and other requirements specific to the Tactile Internet for realizing real-time haptic interactions with the help of 6G and artificial intelligence. The study examines the important performance parameters for the given applications and discusses the various types of sensors required for Tactile Internet applications. Full article
(This article belongs to the Special Issue Advanced 5G and Beyond Networks)
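Latency is the parameter that most tightly constrains Tactile Internet designs. As a hedged back-of-envelope illustration (the ~1 ms round-trip target and the fibre propagation speed are common planning figures, not values taken from this paper), the snippet below shows how the latency budget bounds the feasible distance between a user and a remote haptic endpoint.

```python
# Hedged back-of-envelope latency budget for a Tactile Internet link.
# The processing overhead is an assumption made for illustration.
C_FIBRE_M_PER_S = 2.0e8      # approximate speed of light in optical fibre (~2/3 c)
RTT_BUDGET_S = 1.0e-3        # commonly cited end-to-end round-trip latency target
PROCESSING_S = 0.4e-3        # assumed codec/stack/queuing overhead per round trip

propagation_budget_s = RTT_BUDGET_S - PROCESSING_S       # time left for propagation
max_one_way_km = C_FIBRE_M_PER_S * propagation_budget_s / 2 / 1000
print(f"Max one-way fibre distance within budget: {max_one_way_km:.0f} km")  # ~60 km
```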

25 pages, 9799 KiB  
Article
A Diamond Approach to Develop Virtual Object Interaction: Fusing Augmented Reality and Kinesthetic Haptics
by Alma Rodriguez-Ramirez, Osslan Osiris Vergara Villegas, Manuel Nandayapa, Francesco Garcia-Luna and María Cristina Guevara Neri
Multimodal Technol. Interact. 2025, 9(2), 15; https://doi.org/10.3390/mti9020015 - 13 Feb 2025
Viewed by 843
Abstract
Using the senses is essential to interacting with objects in real-world environments. However, not all the senses are available when interacting with virtual objects in virtual environments. This paper presents a diamond methodology for fusing two technologies to represent the senses of sight and touch when interacting with a virtual object. The sense of sight is represented through augmented reality, and the sense of touch through kinesthetic haptics. The diamond methodology is centered on the user experience and comprises five general stages: (i) experience design, (ii) sensory representation, (iii) development, (iv) display, and (v) fusion. The first stage defines the expected, proposed, or needed user experience. Each technology then follows its homologous activities from the second to the fourth stage, diverging from the other during development. Finally, the technologies converge in the fifth stage, where they are fused in the user experience. The diamond methodology was tested by generating a dual sensation for the user interacting with the elasticity of a virtual tension spring. The user can simultaneously perceive the visual and tactile changes of the virtual spring during the interaction, representing the object's deformation. The experimental results demonstrated that an interactive experience can be felt and seen in augmented reality by following the diamond methodology. Full article

23 pages, 17790 KiB  
Technical Note
Development of a Modular Adjustable Wearable Haptic Device for XR Applications
by Ali Najm, Domna Banakou and Despina Michael-Grigoriou
Virtual Worlds 2024, 3(4), 436-458; https://doi.org/10.3390/virtualworlds3040024 - 16 Oct 2024
Cited by 3 | Viewed by 3656
Abstract
Current XR applications move beyond audiovisual information, with haptic feedback rapidly gaining ground. However, current haptic devices are still evolving and often struggle to combine key desired features in a balanced way. In this paper, we propose the development of a high-resolution haptic (HRH) system for perception enhancement, a wearable technology designed to augment extended reality (XR) experiences through precise and localized tactile feedback. The HRH system features a modular design with 58 individually addressable actuators, enabling intricate haptic interactions within a compact wearable form. Dual ESP32-S3 microcontrollers and a custom-designed system ensure robust processing and low-latency performance, crucial for real-time applications. Integration with the Unity game engine provides developers with a user-friendly and dynamic environment for accurate, simple control and customization. The modular design, utilizing a flexible PCB, supports a wide range of actuators, enhancing its versatility for various applications. A comparison of our proposed system with existing solutions indicates that the HRH system outperforms other devices by encapsulating several key features, including adjustability, affordability, modularity, and high-resolution feedback. The HRH system not only aims to advance the field of haptic feedback but also introduces an intuitive tool for exploring new methods of human–computer and XR interactions. Future work will focus on refining and exploring the haptic feedback communication methods used to convey information and expand the system’s applications. Full article
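The abstract describes the HRH architecture (58 individually addressable actuators driven by dual ESP32-S3 microcontrollers, with Unity on the host side) but not its wire protocol. The sketch below is purely illustrative: a hypothetical host-side Python driver with an invented single-byte-header packet format, standing in for whatever serial protocol the actual firmware uses.

```python
# Hypothetical host-side driver for a 58-actuator haptic wearable such as the
# HRH system described above. The serial port path, baud rate, and the
# [header, index, intensity] packet format are invented for illustration.
import time
import serial  # pyserial

NUM_ACTUATORS = 58   # actuator count taken from the abstract
HEADER = 0xAA        # assumed frame-header byte

def set_actuator(port: serial.Serial, index: int, intensity: int) -> None:
    """Send one command: actuator index in [0, 57], intensity in [0, 255]."""
    if not (0 <= index < NUM_ACTUATORS and 0 <= intensity <= 255):
        raise ValueError("index or intensity out of range")
    port.write(bytes([HEADER, index, intensity]))

if __name__ == "__main__":
    with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as esp32:
        # Sweep a short pulse across the array as a quick functional test.
        for i in range(NUM_ACTUATORS):
            set_actuator(esp32, i, 128)      # switch the current actuator on
            time.sleep(0.05)
            set_actuator(esp32, i, 0)        # and back off before moving on
```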

11 pages, 292 KiB  
Review
Recent Advances in Robotic Surgery for Urologic Tumors
by Sen-Yuan Hong and Bao-Long Qin
Medicina 2024, 60(10), 1573; https://doi.org/10.3390/medicina60101573 - 25 Sep 2024
Cited by 3 | Viewed by 2163
Abstract
This review discusses recent advances in robotic surgery for urologic tumors, focusing on three key areas: robotic systems, assistive technologies, and artificial intelligence. The Da Vinci SP system has enhanced the minimally invasive nature of robotic surgeries, while the Senhance system offers advantages such as tactile feedback and eye-tracking capabilities. Technologies like 3D reconstruction combined with augmented reality and fluorescence imaging aid surgeons in precisely identifying the anatomical relationships between tumors and surrounding structures, improving surgical efficiency and outcomes. Additionally, the development of artificial intelligence lays the groundwork for automated robotics. As these technologies continue to evolve, we are entering an era of minimally invasive, precise, and intelligent robotic surgery. Full article
(This article belongs to the Section Urology & Nephrology)
18 pages, 4110 KiB  
Article
Design and Evaluation of a Rapid Monolithic Manufacturing Technique for a Novel Vision-Based Tactile Sensor: C-Sight
by Wen Fan, Haoran Li, Yifan Xing and Dandan Zhang
Sensors 2024, 24(14), 4603; https://doi.org/10.3390/s24144603 - 16 Jul 2024
Cited by 3 | Viewed by 1949
Abstract
Tactile sensing has become indispensable for contact-rich dynamic robotic manipulation tasks. It provides robots with a better understanding of the physical environment, which is a vital supplement to robotic vision perception. Compared with other existing tactile sensors, vision-based tactile sensors (VBTSs) stand out for augmenting the tactile perception capabilities of robotic systems, owing to their superior spatial resolution and cost-effectiveness. Despite these advantages, VBTS production faces challenges due to the lack of standardised manufacturing techniques and a heavy reliance on manual labour, which impedes scalability and widespread adoption. This paper introduces a rapid monolithic manufacturing technique and evaluates its performance quantitatively. We further develop and assess C-Sight, a novel VBTS manufactured using this technique, focusing on its tactile reconstruction capabilities. Experimental results demonstrate that the monolithic manufacturing technique significantly enhances VBTS production efficiency. The fabricated C-Sight sensor also exhibits reliable tactile perception and reconstruction capabilities, proving the validity and feasibility of the monolithic manufacturing method. Full article
(This article belongs to the Section Physical Sensors)

18 pages, 1685 KiB  
Review
Material Attribute Estimation as Part of Telecommunication Augmented Reality, Virtual Reality, and Mixed Reality System: Systematic Review
by Nicole Christoff and Krasimir Tonchev
Electronics 2024, 13(13), 2473; https://doi.org/10.3390/electronics13132473 - 25 Jun 2024
Cited by 1 | Viewed by 1386
Abstract
The integration of material attribute estimation (MAE) within augmented reality, virtual reality, and mixed reality telecommunication systems stands as a pivotal domain, evolving rapidly with the advent of the Tactile Internet (TI). This unifying implementation process has the potential to improve the realism and interactivity of immersive environments. The interaction between MAE and the Tactile Internet could lead to significant advances in haptic feedback systems, enabling more accurate and responsive user experiences. This systematic review focuses on the intersection of MAE and the Tactile Internet, aiming to find an implementation path between these technologies. Motivated by the potential of the Tactile Internet to advance telecommunications, we explore how it can advance the analysis of material attributes within AR, VR, and MR applications. Through an extensive analysis of current research approaches, including machine learning methods, we explore the possibilities of integrating the TI into MAE. By exploiting the haptic and visual properties stored in the materials of 3D objects and using them directly during rendering in remote-access scenarios, we propose a conceptual framework that combines data capture, visual representation, processing, and communication in virtual environments. Full article
(This article belongs to the Section Computer Science & Engineering)

20 pages, 12167 KiB  
Article
Helping Blind People Grasp: Evaluating a Tactile Bracelet for Remotely Guiding Grasping Movements
by Piper Powell, Florian Pätzold, Milad Rouygari, Marcin Furtak, Silke M. Kärcher and Peter König
Sensors 2024, 24(9), 2949; https://doi.org/10.3390/s24092949 - 6 May 2024
Cited by 3 | Viewed by 2685
Abstract
The problem of supporting visually impaired and blind people in meaningful interactions with objects is often neglected. To address this issue, we adapted a tactile belt for enhanced spatial navigation into a bracelet worn on the wrist that allows visually impaired people to grasp target objects. Participants’ performance in locating and grasping target items when guided using the bracelet, which provides direction commands via vibrotactile signals, was compared to their performance when receiving auditory instructions. While participants were faster with the auditory commands, they also performed well with the bracelet, encouraging future development of this system and similar systems. Full article
(This article belongs to the Section Wearables)

15 pages, 2033 KiB  
Review
Sensory Integration: A Novel Approach for Healthy Ageing and Dementia Management
by Ongart Maneemai, Maira Cristina Cujilan Alvarado, Lina Graciela Calderon Intriago, Alicia Jeanette Donoso Triviño, Joicy Anabel Franco Coffré, Domenico Pratico, Kristof Schwartz, Tadele Tesfaye and Takao Yamasaki
Brain Sci. 2024, 14(3), 285; https://doi.org/10.3390/brainsci14030285 - 18 Mar 2024
Cited by 8 | Viewed by 8241
Abstract
Sensory processing is a fundamental aspect of the nervous system that plays a pivotal role in the cognitive decline observed in older individuals with dementia. The "sensory diet", derived from sensory integration theory, may provide a tailored approach to modulating sensory experiences and triggering neuroplastic changes in the brains of individuals with dementia. Therefore, this review aimed to investigate the current knowledge regarding the sensory diet and its potential application to dementia. The review encompassed an extensive search across multiple databases, including PubMed and Google Scholar, covering articles published from 2010 to 2023. Keywords such as "sensory integration", "sensory modulation", "healthy aging", and "dementia" were used to identify relevant studies. The materials retrieved included peer-reviewed articles, systematic reviews, and meta-analyses, ensuring a comprehensive overview of the current research landscape. This article offers a comprehensive exploration of the effectiveness of sensory diets such as tactile stimulation, auditory therapies, and visual interventions, which have demonstrated noteworthy efficacy in addressing challenges linked to aging and dementia. Research findings consistently report positive outcomes, such as improved cognitive function, elevated emotional well-being, and enhanced overall quality of life in older individuals. Furthermore, we found that integrating sensory diets with the metaverse, augmented reality, and virtual reality opens up personalized experiences, fostering cognitive stimulation and emotional well-being during aging. We therefore conclude that customized sensory diets, based on interdisciplinary cooperation and leveraging technological advancements, are effective in optimizing sensory processing and improving the overall well-being of older individuals contending with sensory modulation challenges and dementia. Full article

16 pages, 6273 KiB  
Article
Manipulating Underfoot Tactile Perceptions of Flooring Materials in Augmented Virtuality
by Jack Topliss, Stephan Lukosch, Euan Coutts and Tham Piumsomboon
Appl. Sci. 2023, 13(24), 13106; https://doi.org/10.3390/app132413106 - 8 Dec 2023
Viewed by 2226
Abstract
Underfoot haptics, a largely unexplored area, offers rich tactile information close to that of hand-based interactions. Haptic feedback gives a sense of physicality to virtual environments, making for a more realistic and immersive experience. Augmented Virtuality offers the ability to render virtual materials on a physical object, or haptic proxy, without the user being aware of the object's physical appearance, while still seeing their own body. In this research, we investigate how the visual appearance of physical objects can be altered virtually to impact the tactile perception of the object. An Augmented Virtuality system was developed to explore this, and two tactile perception experiments involving 18 participants were conducted. Specifically, a within-subjects experiment explored whether changing the visual appearance of materials affects a person's underfoot tactile perception and which tactile perception is most affected by the change. Additionally, a between-subjects experiment examined whether people are aware of changes in visual appearance when focused on other tasks. The study showed that a change in visual appearance significantly impacts the tactile perception of roughness, and that matching the visual appearance to the physical material increases awareness of tactile perception. Full article
(This article belongs to the Special Issue Cross Applications of Interactive System and Extended Reality)

25 pages, 12415 KiB  
Article
EEG Investigation on the Tactile Perceptual Performance of a Pneumatic Wearable Display of Softness
by Federico Carpi, Michele C. Valles, Gabriele Frediani, Tanita Toci and Antonello Grippo
Actuators 2023, 12(12), 431; https://doi.org/10.3390/act12120431 - 21 Nov 2023
Cited by 3 | Viewed by 2340
Abstract
Multisensory human–machine interfaces for virtual- or augmented-reality systems lack wearable actuated devices that can provide users with tactile feedback on the softness of virtual objects. Such devices are needed for a variety of uses, such as medical simulators, tele-operation systems and tele-presence environments. These interfaces require actuators that can generate proper tactile feedback by stimulating the fingertips via quasi-static (non-vibratory) forces delivered through a deformable surface, so as to control both the contact area and the indentation depth. The actuators should combine a compact and lightweight structure with ease and safety of use, as well as low cost. Among the few actuation technologies that can comply with such requirements, pneumatic driving appears to be one of the most promising. Here, we present an investigation of a new type of pneumatic wearable tactile display of softness, recently described by our group, which consists of small inflatable chambers arranged at the fingertips. In order to objectively assess the perceptual response that these displays can elicit, a systematic electroencephalographic study was conducted on ten healthy subjects. Somatosensory evoked potentials (SEPs) were recorded from eight sites above the somatosensory cortex (Fc2, Fc4, C2 and C4, and Fc1, Fc3, C1 and C3) in response to nine conditions of tactile stimulation delivered by the displays: stimulation of either only the thumb, the thumb and index finger simultaneously, or the thumb, index and middle finger simultaneously, each repeated at tactile pressures of 10, 20 and 30 kPa. An analysis of the latency and amplitude of the six components of SEP signals that typically characterise tactile sensing (P50, N100, P200, N300, P300 and N450) showed that this wearable pneumatic device is able to elicit predictable perceptual responses, consistent with the stimulation conditions. This proved that the device is capable of adequate actuation performance, which enables adequate tactile perceptual performance. Moreover, it shows that SEPs may effectively be used with this technology in the future to assess variable perceptual experiences (especially with combinations of visual and tactile stimuli) in objective terms, complementing subjective information gathered from psychophysical tests. Full article
(This article belongs to the Special Issue Actuators for Haptic Feedback Applications)

7 pages, 2337 KiB  
Proceeding Paper
Texture Classification Based on Sound and Vibro-Tactile Data
by Mustapha Najib and Ana-Maria Cretu
Eng. Proc. 2023, 58(1), 5; https://doi.org/10.3390/ecsa-10-16082 - 15 Nov 2023
Viewed by 929
Abstract
This paper focuses on the development and validation of an automatic learning system for the classification of tactile data in the form of vibro-tactile (accelerometer) and audio (microphone) data for texture recognition. A novel combination of features, including the standard deviation, the mean, the median absolute deviation, the signal energy, a measure that reflects the perceptual properties of the human system associated with each sensory modality, and Fourier characteristics extracted from the signals, combined with principal component analysis, is shown to obtain the best results. Several machine learning models are compared to identify the best compromise between the number of features, the classification performance, and the computation time. Longer sampling periods (2 s vs. 1 s) provide more information for classification, leading to higher performance (an average gain of 3.59%) but also increasing the evaluation time by an average of 29.48% over all features and models. For the selected dataset, the XGBRF model represents overall the best compromise between performance and computation time for the proposed combination of features over all material types, with an F-score of 0.91 and a computation time of 4.69 ms, while kNN is the next best option (a 1% improvement in performance at the cost of a 2.13 ms increase in time with respect to XGBRF). Full article
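As a rough, hypothetical sketch of the feature-plus-classifier comparison this abstract describes (window length, FFT bin count, PCA dimensionality, and the dummy data are all assumptions; only the feature names and the XGBRF/kNN model choice come from the abstract), the following snippet wires the pieces together with scikit-learn and xgboost.

```python
# Hedged sketch: time/frequency features per signal window, PCA, then an
# XGBRF vs. kNN comparison on dummy data standing in for labelled textures.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from xgboost import XGBRFClassifier

def extract_features(window: np.ndarray, n_fft_bins: int = 16) -> np.ndarray:
    """Features for one accelerometer or audio window (1-D array)."""
    mad = np.median(np.abs(window - np.median(window)))   # median absolute deviation
    energy = np.sum(window ** 2) / window.size            # mean signal power
    spectrum = np.abs(np.fft.rfft(window))[:n_fft_bins]   # low-frequency FFT magnitudes
    return np.concatenate(([window.mean(), window.std(), mad, energy], spectrum))

# Random placeholder windows and labels (10 hypothetical texture classes).
rng = np.random.default_rng(0)
X = np.stack([extract_features(rng.standard_normal(2048)) for _ in range(200)])
y = rng.integers(0, 10, size=200)

for name, clf in [("XGBRF", XGBRFClassifier(n_estimators=100)),
                  ("kNN", KNeighborsClassifier(n_neighbors=5))]:
    pipe = make_pipeline(PCA(n_components=10), clf)
    score = cross_val_score(pipe, X, y, cv=5, scoring="f1_macro").mean()
    print(f"{name}: macro-F1 = {score:.2f}")
```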

22 pages, 397 KiB  
Article
A Transition to Multimodal Multilingual Practice: From SimCom to Translanguaging
by Julia Silvestri and Jodi L. Falk
Languages 2023, 8(3), 190; https://doi.org/10.3390/languages8030190 - 11 Aug 2023
Viewed by 3238
Abstract
Historically, the field of deaf education has revolved around language planning discourse, but little research has been conducted on Deaf and Hard of Hearing (DHH) students with additional disabilities as dynamic multilingual and multimodal language users. The current study focuses on the language planning process at a school serving DHH and Deaf–Blind students with varied additional disabilities. A previous Total Communication philosophy at the school was implemented in practice as Simultaneous Communication (SimCom) and later revised as a multimodal-multilingual approach with the goal of separating American Sign Language (ASL) and English and using multimodal communication such as tactile ASL and Augmentative and Alternative Communication (AAC). To implement this philosophy without reverting to SimCom, the school employed a language planning process using action research to reflect on cycles of improvement. A grounded theory approach was used to identify and analyze themes over a three-year period of language planning and professional development in multimodal communication. Triangulated data include language planning artifacts and an online survey of staff perceptions, analyzed by coding concepts and categories, relating concepts to define translanguaging mechanisms and attitudes, and developing an overarching theory on how a school values translanguaging after three years of valuing complete access to language. In the context of a multilingual, multimodal language planning cycle, developing a shared language ideology guided by how Deaf, DeafBlind, and Deaf-Disabled (DDBDD) people use language emerged as an overarching theme that promoted dynamic languaging and understanding of strategies for effective communication. Full article
(This article belongs to the Special Issue Translanguaging in Deaf Communities)