Search Results (80)

Search Parameters:
Keywords = facial expressions of pain

14 pages, 6123 KiB  
Article
Effects of Near-Infrared Diode Laser Irradiation on Pain Relief and Neuropeptide Markers During Experimental Tooth Movement in the Periodontal Ligament Tissues of Rats: A Pilot Study
by Kanako Okazaki, Ayaka Nakatani, Ryo Kunimatsu, Isamu Kado, Shuzo Sakata, Hirotaka Kiridoshi and Kotaro Tanimoto
Int. J. Mol. Sci. 2025, 26(15), 7404; https://doi.org/10.3390/ijms26157404 - 31 Jul 2025
Abstract
Pain following orthodontic treatment is the chief complaint of patients undergoing this form of treatment. Although diode lasers have been suggested for pain reduction, the mechanism of laser-induced analgesia remains unclear. Neuropeptides such as substance P (SP) and calcitonin gene-related peptide (CGRP) contribute to the transmission and maintenance of inflammatory pain, while heat shock protein (HSP) 70 plays a protective role against various stresses, including orthodontic forces. This study aimed to examine the effects of diode laser irradiation on neuropeptide and HSP 70 expression in periodontal tissues subjected to experimental tooth movement (ETM). To induce ETM for 24 h, a 50 g orthodontic force was applied with a nickel–titanium closed-coil spring between the upper left first molar and the incisors of 20 male Sprague Dawley rats (7 weeks old); the right side, without ETM treatment, served as the untreated control. In 10 rats, diode laser irradiation was applied to the buccal and palatal sides of the first molar for 90 s at a total energy density of 100.8 J/cm2. A near-infrared (NIR) laser with an 808 nm wavelength, 7 W peak power, 560 mW average power, and 20 ms pulse width was used. The number of facial grooming episodes and vacuous chewing movement (VCM) periods was measured in the ETM and ETM + laser groups, and periodontal tissue was immunohistochemically stained for SP, CGRP, and HSP 70. Facial grooming and VCM counts decreased significantly in the ETM + laser group compared with the ETM group, which also showed significant suppression of SP, CGRP, and HSP 70 expression. These results suggest that the diode laser exerts analgesic effects on ETM-induced pain by inhibiting SP and CGRP expression, while the decrease in HSP 70 expression indicates alleviation of cell damage.
Thus, although further validation is warranted for human applications, an NIR diode laser may help reduce pain and neuropeptide markers during orthodontic tooth movement. Full article
(This article belongs to the Special Issue Advances in Photobiomodulation Therapy)
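The dose parameters reported in this abstract can be cross-checked with simple arithmetic. A minimal sketch follows; note that the pulse repetition rate is not stated in the abstract and is an assumption introduced here purely to illustrate how peak power, duty cycle, and average power relate:

```python
# Consistency check of the reported laser dose parameters (illustrative only).
peak_power_w = 7.0         # stated peak power
pulse_width_s = 0.020      # stated 20 ms pulse width
rep_rate_hz = 4.0          # ASSUMED repetition rate (not given in the abstract)
irradiation_time_s = 90.0  # stated irradiation time per site
fluence_j_per_cm2 = 100.8  # stated total energy density

# Duty cycle and the average power implied by the assumed repetition rate.
duty_cycle = pulse_width_s * rep_rate_hz               # fraction of time the laser is on
avg_power_w = peak_power_w * duty_cycle                # peak power scaled by duty cycle

# Average irradiance implied by the stated fluence and irradiation time.
avg_irradiance_w_per_cm2 = fluence_j_per_cm2 / irradiation_time_s

print(avg_power_w, avg_irradiance_w_per_cm2)
```

Under the assumed 4 Hz repetition rate, the implied average power is 0.56 W (560 mW), and the stated fluence corresponds to an average irradiance of 1.12 W/cm2 over the 90 s exposure.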

14 pages, 1124 KiB  
Article
The Correlation Between Body Pain Indicators and the Facial Expression Scale in Sows During Farrowing and Pre-Weaning: The Effects of Parity, the Farrowing Moment, and Suckling Events
by Elena Navarro, Raúl David Guevara, Eva Mainau, Ricardo de Miguel and Xavier Manteca
Animals 2025, 15(15), 2225; https://doi.org/10.3390/ani15152225 - 28 Jul 2025
Abstract
Parturition is accepted as a painful situation. Few studies explore pain-specific behaviours during farrowing in sows. The objectives of this study were, first, to assess if behavioural pain indicators (BPIs) are affected by the farrowing moment, parity, and suckling events, and second, to determine the relationship between the Facial Action Units (FAUs) and BPIs during farrowing. Ten Danbred sows were recorded throughout farrowing and on day 19 post-farrowing. Continuous observations of five BPIs and five FAUs were obtained across the three moments studied: (i) at the expulsion of the piglets, (ii) the time interval between the delivery of each piglet, and (iii) 19 days after farrowing, used as a control. Primiparous sows had more BPIs but fewer postural changes than multiparous sows. The BPIs were more frequent during suckling events in the pre-weaning moment. All the FAUs and BPIs were rare or absent post-farrowing (p < 0.05), and almost all of them were more frequent during farrowing (especially at the moment of delivery). Back arching showed the highest correlation with all the FAUs, and tension above the eyes showed the highest correlation with four of the BPIs. The BPIs and FAUs indicate that sows experience more pain during farrowing than during the third week post-farrowing, and piglet expulsion is the most painful moment in farrowing. Full article
(This article belongs to the Section Animal Welfare)
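The correlations reported above (e.g., back arching vs. tension above the eyes) are typically rank correlations on behaviour counts. A minimal Spearman sketch, on hypothetical per-sow counts that are illustrative only and not from the study:

```python
# Spearman rank correlation between two behavioural scores (hypothetical data).
def ranks(xs):
    # Assign 1-based average ranks, handling ties.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    # Pearson correlation computed on the ranks.
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-sow counts: back arching vs. tension above the eyes.
back_arching = [2, 5, 1, 7, 4, 6]
eye_tension = [1, 4, 2, 8, 3, 7]
print(round(spearman(back_arching, eye_tension), 3))  # → 0.943
```

A coefficient near 1 would indicate that sows showing more back arching also tend to show more tension above the eyes, the kind of BPI–FAU association the study reports.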

13 pages, 242 KiB  
Review
Objective Measurement of Musculoskeletal Pain: A Comprehensive Review
by Nahum Rosenberg
Diagnostics 2025, 15(13), 1581; https://doi.org/10.3390/diagnostics15131581 - 22 Jun 2025
Abstract
Background: Musculoskeletal (MSK) pain is a leading contributor to global disability and healthcare burdens. While self-reported pain scales remain the clinical standard, they are limited by subjectivity and inter-individual variability. Objective assessment tools are therefore increasingly sought to enhance diagnostic precision, guide treatment, and enable reproducible research outcomes. Methods: This comprehensive narrative review synthesizes evidence from physiological, behavioral, and neuroimaging approaches used to evaluate MSK pain objectively. Emphasis is placed on autonomic biomarkers (e.g., heart rate variability, skin conductance), facial expression analysis, electromyographic methods, and functional neuroimaging modalities such as fMRI and PET. Emerging applications of artificial intelligence and multimodal diagnostic strategies are also discussed. Results: Physiological signals provide quantifiable correlates of pain-related autonomic activity but are influenced by psychological and contextual factors. Behavioral analyses, including facial action coding systems and reflex testing, offer complementary, though complex, indicators. Neuroimaging techniques have identified pain-related brain patterns, yet clinical translation is limited by variability and standardization issues. Integrative approaches show promise for improving diagnostic validity. Conclusions: Objective assessment of MSK pain remains methodologically challenging but holds substantial potential for enhancing clinical diagnostics and personalized management. Future research should focus on multimodal integration, standardization, and translational feasibility to bridge the gap between experimental tools and clinical practice. Full article
(This article belongs to the Section Medical Imaging and Theranostics)
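Among the autonomic biomarkers this review names, heart rate variability is the most commonly quantified. One standard time-domain index is RMSSD (root mean square of successive differences of RR intervals); a minimal sketch on made-up RR data, not taken from the review:

```python
# RMSSD over a series of RR intervals (milliseconds) - a standard
# time-domain heart rate variability index.
def rmssd(rr_ms):
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

# Hypothetical RR interval series for illustration.
rr = [812, 790, 805, 830, 818, 795]
print(round(rmssd(rr), 2))  # → 20.03
```

Lower RMSSD reflects reduced parasympathetic activity, one of the pain-related autonomic shifts such biomarkers aim to capture, though, as the review notes, psychological and contextual factors also move these signals.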
12 pages, 963 KiB  
Article
Comparison of the Prevalence and Location of Trigger Points in Dressage and Show-Jumping Horses
by Karine Portier, Camilla Schiesari, Lisa Gauthier, Lin Tchia Yeng, Denise Tabacchi Fantoni and Maira Rezende Formenton
Animals 2025, 15(11), 1558; https://doi.org/10.3390/ani15111558 - 27 May 2025
Abstract
Myofascial trigger points (MTrPs) are localized, hypersensitive areas in muscles that can cause pain and reduced performance. This study aimed to compare the prevalence and location of MTrPs in show-jumping and dressage horses. A secondary objective was to evaluate the potential of thermography, pressure algometry, and facial expression scoring for characterizing MTrPs in horses. Fourteen horses (seven dressage, seven show-jumping) were examined. Muscle palpation was used to identify MTrPs, and thermography was used to compare the skin surface temperature of MTrPs with adjacent control areas. Additionally, facial expressions were recorded during palpation and scored by three blinded observers using the Horse Grimace Scale (HGS). MTrPs were found in all horses, and both groups showed a high prevalence (>60%) of MTrPs in the back. Dressage horses had a higher prevalence of MTrPs in the neck (17%) and a lower prevalence in the rump (17%) than show-jumping horses (3% and 30%, respectively). Temperatures at MTrP sites were significantly higher than at control points (p < 0.01). Facial expression scores were also significantly higher during MTrP palpation than during control palpation (16 [0–24] vs. 6 [0–19], p = 0.004). These findings open perspectives for better recognition and treatment of myofascial pain in athletic horses. Full article
(This article belongs to the Section Equids)

25 pages, 6904 KiB  
Article
A Weighted Facial Expression Analysis for Pain Level Estimation
by Parkpoom Chaisiriprasert and Nattapat Patchsuwan
J. Imaging 2025, 11(5), 151; https://doi.org/10.3390/jimaging11050151 - 9 May 2025
Abstract
Accurate assessment of pain intensity is critical, particularly for patients who are unable to verbally express their discomfort. This study proposes a novel weighted analytical framework that integrates facial expression analysis through action units (AUs) with a facial feature-based weighting mechanism to enhance the estimation of pain intensity. The proposed method was evaluated on a dataset comprising 4084 facial images from 25 individuals and demonstrated an average accuracy of 92.72% using the weighted pain level estimation model, in contrast to 83.37% achieved using conventional approaches. The observed improvements are primarily attributed to the strategic utilization of AU zones and expression-based weighting, which enable more precise differentiation between pain-related and non-pain-related facial movements. These findings underscore the efficacy of the proposed model in enhancing the accuracy and reliability of automated pain detection, especially in contexts where verbal communication is impaired or absent. Full article
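The core idea of weighting AU contributions by facial zone can be sketched as a weighted sum of AU intensities. The AU set and weights below are illustrative assumptions, not the weighting learned in the paper:

```python
# Weighted sum of facial action unit (AU) intensities as a crude pain score.
# The weights are hypothetical, chosen only to illustrate the mechanism of
# emphasizing pain-related AUs over unrelated facial movements.
def pain_score(au_intensity, weights):
    # AUs absent from the weight table contribute nothing.
    return sum(weights.get(au, 0.0) * v for au, v in au_intensity.items())

# Hypothetical weights emphasizing pain-related AUs: brow lowerer (AU4),
# cheek raiser (AU6), nose wrinkler (AU9), eyes closed (AU43).
weights = {"AU4": 1.0, "AU6": 0.8, "AU9": 1.2, "AU43": 1.5}

# One frame's estimated AU intensities; AU12 (lip corner puller, a smile
# component) is present but carries zero weight, so it is ignored.
frame = {"AU4": 3.0, "AU6": 2.0, "AU9": 1.0, "AU43": 0.0, "AU12": 4.0}
print(pain_score(frame, weights))
```

This is the sense in which zone-based weighting separates pain-related from non-pain-related facial movements: activity in unweighted AUs cannot inflate the score.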

32 pages, 4102 KiB  
Article
A Multimodal Pain Sentiment Analysis System Using Ensembled Deep Learning Approaches for IoT-Enabled Healthcare Framework
by Anay Ghosh, Saiyed Umer, Bibhas Chandra Dhara and G. G. Md. Nawaz Ali
Sensors 2025, 25(4), 1223; https://doi.org/10.3390/s25041223 - 17 Feb 2025
Abstract
This study introduces a multimodal sentiment analysis system to assess and recognize human pain sentiments within an Internet of Things (IoT)-enabled healthcare framework. The system integrates facial expressions and speech-audio recordings to evaluate pain intensity, aiming to improve recognition performance and enable a more accurate assessment than unimodal systems allow, thereby supporting better real-time decision making in patient care. The primary contribution of this work is a multimodal pain sentiment analysis system that integrates the outcomes of image-based and audio-based models. The implementation comprises five key phases. The first detects the facial region in a video sequence, a crucial step for extracting facial patterns indicative of pain. The second extracts discriminant and divergent features from the facial region using deep learning techniques, employing several convolutional neural network (CNN) architectures refined through transfer learning, parameter fine-tuning, and fusion techniques to optimize performance. The third preprocesses the speech-audio recordings; in the fourth, significant features are extracted by conventional methods and a deep learning model generates divergent features to recognize audio-based pain sentiments. The final phase combines the outcomes of the image-based and audio-based systems, improving overall performance and enabling the system to predict pain levels of ‘high pain’, ‘mild pain’, and ‘no pain’.
The proposed system was tested on three image-based databases, the 2D Face Set Database with Pain Expression, the UNBC-McMaster database (shoulder pain), and the BioVid database (heat pain), together with the VIVAE database for audio. In extensive experiments, the system achieved accuracies of 76.23%, 84.27%, and 38.04% for the two-, three-, and five-class pain problems on the 2D Face Set, UNBC-McMaster, and BioVid datasets, respectively. The VIVAE audio-based system recorded peak accuracies of 97.56% and 98.32% under varying training–testing protocols. Comparisons with state-of-the-art methods show the superiority of the proposed system. By combining the outputs of both deep learning frameworks on the image and audio datasets, the multimodal system achieves accuracies of 99.31% for the two-class, 99.54% for the three-class, and 87.41% for the five-class pain problems. Full article
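The final fusion phase combines per-modality predictions. The abstract does not specify the fusion rule, so the sketch below assumes a simple weighted average of class probabilities (late fusion) with equal weights; all numbers are hypothetical:

```python
# Late fusion of image- and audio-based class probabilities (illustrative;
# the paper's actual fusion rule is not specified in the abstract).
CLASSES = ["no pain", "mild pain", "high pain"]

def fuse(p_image, p_audio, w_image=0.5, w_audio=0.5):
    # Weighted average of the two modality posteriors, renormalized.
    fused = [w_image * a + w_audio * b for a, b in zip(p_image, p_audio)]
    total = sum(fused)
    return [f / total for f in fused]

p_img = [0.2, 0.5, 0.3]  # hypothetical image-model output
p_aud = [0.1, 0.3, 0.6]  # hypothetical audio-model output
fused = fuse(p_img, p_aud)
print(CLASSES[max(range(3), key=fused.__getitem__)])  # → high pain
```

Late fusion of this kind lets a confident modality compensate for an uncertain one, which is the stated motivation for combining the image and audio pipelines.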
(This article belongs to the Section Physical Sensors)

24 pages, 2289 KiB  
Article
A Non-Invasive Approach for Facial Action Unit Extraction and Its Application in Pain Detection
by Mondher Bouazizi, Kevin Feghoul, Shengze Wang, Yue Yin and Tomoaki Ohtsuki
Bioengineering 2025, 12(2), 195; https://doi.org/10.3390/bioengineering12020195 - 17 Feb 2025
Abstract
A significant challenge that hinders advancements in medical research is the sensitive and confidential nature of patient data in available datasets. In particular, sharing patients’ facial images poses considerable privacy risks, especially with the rise of generative artificial intelligence (AI), which could misuse such data if accessed by unauthorized parties. However, facial expressions are a valuable source of information for doctors and researchers, creating a need for methods to derive them without compromising patient privacy or safety by exposing identifiable facial images. To address this, we present a quick, computationally efficient method for detecting action units (AUs) and their intensities (key indicators of health and emotion) using only 3D facial landmarks. Our framework extracts 3D face landmarks from video recordings and employs a lightweight neural network (NN) to identify AUs and estimate their intensities from these landmarks. The method reaches a 79.25% F1-score in detection of the main AUs and a root mean square error (RMSE) of 0.66 in AU intensity estimation. This performance shows that researchers can share 3D landmarks, which are far less intrusive than facial images, while maintaining high accuracy in AU detection. Moreover, to showcase the usefulness of our AU detection model, we trained state-of-the-art deep learning (DL) models to detect pain from the detected AUs and estimated intensities. Our method reaches 91.16% accuracy in pain detection, not far behind the 93.14% obtained with a convolutional neural network (CNN) with residual blocks trained on actual images and the 92.11% obtained with all the ground-truth AUs. Full article
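Landmark-based pipelines like this one usually normalize the 3D points before feeding them to the network, so the model is invariant to head position and scale. The normalization below is a common generic step, an assumption here rather than the paper's exact preprocessing:

```python
# Centre and unit-scale a set of 3D facial landmarks - a typical
# preprocessing step before a landmark-based AU model (illustrative;
# not necessarily the normalization used in the paper).
def normalise(landmarks):
    n = len(landmarks)
    # Translate so the centroid sits at the origin.
    cx = sum(p[0] for p in landmarks) / n
    cy = sum(p[1] for p in landmarks) / n
    cz = sum(p[2] for p in landmarks) / n
    centred = [(x - cx, y - cy, z - cz) for x, y, z in landmarks]
    # Scale so the farthest landmark lies at unit distance.
    scale = max((x * x + y * y + z * z) ** 0.5 for x, y, z in centred)
    return [(x / scale, y / scale, z / scale) for x, y, z in centred]

# Three hypothetical landmarks (e.g., eye corners and nose tip).
pts = [(10.0, 20.0, 5.0), (14.0, 20.0, 5.0), (12.0, 24.0, 6.0)]
for p in normalise(pts):
    print(tuple(round(c, 3) for c in p))
```

After this step only the relative configuration of the face remains, which is exactly why sharing landmarks is less identifying, and less privacy-sensitive, than sharing images.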
(This article belongs to the Section Biosignal Processing)

20 pages, 1504 KiB  
Article
Unveiling the Truth in Pain: Neural and Behavioral Distinctions Between Genuine and Deceptive Pain
by Vanessa Zanelli, Fausta Lui, Claudia Casadio, Francesco Ricci, Omar Carpentiero, Daniela Ballotta, Marianna Ambrosecchia, Martina Ardizzi, Vittorio Gallese, Carlo Adolfo Porro and Francesca Benuzzi
Brain Sci. 2025, 15(2), 185; https://doi.org/10.3390/brainsci15020185 - 12 Feb 2025
Abstract
Background/Objectives: Fake pain expressions are more intense and prolonged than genuine ones and include non-pain-related actions. Despite these differences, individuals struggle to detect deception in direct tasks (i.e., when asked to detect liars). Regarding neural correlates, while pain observation has been extensively studied, little is known about the neural distinctions between processing genuine, fake, and suppressed pain facial expressions. This study addresses this gap using authentic pain stimuli and an implicit emotional processing task. Methods: Twenty-four healthy women underwent an fMRI study during which they completed an implicit gender discrimination task. Stimuli were video clips showing genuine, fake, suppressed pain, and neutral facial expressions. After the scanning session, participants reviewed the stimuli and rated them indirectly according to the intensity of the facial expression (IE) and the intensity of the pain (IP). Results: Mean IE and IP scores differed significantly across categories. A greater BOLD response for the observation of genuine pain compared to fake pain was observed in the pregenual anterior cingulate cortex (pACC). A parametric analysis showed a correlation between brain activity in the anterior mid-cingulate cortex (aMCC) and the IP ratings. Conclusions: Higher IP ratings for genuine pain expressions and higher IE ratings for fake ones suggest that participants were indirectly able to recognize authenticity in facial expressions. At the neural level, the pACC and aMCC appear to be involved, respectively, in distinguishing genuine from fake pain and in coding the intensity of the perceived pain. Full article
(This article belongs to the Section Sensory and Motor Neuroscience)

17 pages, 6430 KiB  
Article
The Potential for High-Priority Care Based on Pain Through Facial Expression Detection with Patients Experiencing Chest Pain
by Hsiang Kao, Rita Wiryasaputra, Yo-Yun Liao, Yu-Tse Tsan, Wei-Min Chu, Yi-Hsuan Chen, Tzu-Chieh Lin and Chao-Tung Yang
Diagnostics 2025, 15(1), 17; https://doi.org/10.3390/diagnostics15010017 - 25 Dec 2024
Abstract
Background and Objective: Cardiovascular disease (CVD), one of the chronic non-communicable diseases (NCDs), is defined as a cardiac and vascular disorder that includes coronary heart disease, heart failure, peripheral arterial disease, cerebrovascular disease (stroke), congenital heart disease, rheumatic heart disease, and elevated blood pressure (hypertension). Having CVD increases the mortality rate. Emotional stress, an indirect indicator associated with CVD, can often manifest through facial expressions. Chest pain or chest discomfort is one of the symptoms of a heart attack. The golden hour of chest pain influences the occurrence of brain cell death; thus, saving people with chest discomfort during observation is a crucial and urgent issue. Moreover, a limited number of emergency care (ER) medical personnel serve unscheduled outpatients. In this study, a computer-based automatic chest pain detection assistance system is developed using facial expressions to improve patient care services and minimize heart damage. Methods: The You Only Look Once (YOLO) model, a deep learning method, detects and recognizes the position of an object simultaneously. A series of YOLO models were employed for pain detection through facial expression. Results: YOLOv4 and YOLOv6 performed better than YOLOv7 in facial expression detection with patients experiencing chest pain, achieving accuracies of 80–100%. Although the models attained similar accuracies, YOLOv6 trained faster than YOLOv4. Conclusion: With this system, a physician can prioritize the best treatment plan, reduce the extent of cardiac damage in patients, and improve the effectiveness of the golden treatment time. Full article
(This article belongs to the Section Machine Learning and Artificial Intelligence in Diagnostics)

24 pages, 108807 KiB  
Article
SMEA-YOLOv8n: A Sheep Facial Expression Recognition Method Based on an Improved YOLOv8n Model
by Wenbo Yu, Xiang Yang, Yongqi Liu, Chuanzhong Xuan, Ruoya Xie and Chuanjiu Wang
Animals 2024, 14(23), 3415; https://doi.org/10.3390/ani14233415 - 26 Nov 2024
Abstract
Sheep facial expressions are valuable indicators of their pain levels, playing a critical role in monitoring their health and welfare. In response to challenges such as missed detections, false positives, and low recognition accuracy in sheep facial expression recognition, this paper introduces an enhanced algorithm based on YOLOv8n, referred to as SimAM-MobileViTAttention-EfficiCIoU-AA2_SPPF-YOLOv8n (SMEA-YOLOv8n). First, the proposed method integrates the parameter-free Similarity-Aware Attention Mechanism (SimAM) and MobileViTAttention modules into the CSP Bottleneck with 2 Convolutions (C2f) module of the neck network, aiming to enhance the model’s feature representation and fusion capabilities in complex environments while mitigating the interference of irrelevant background features. Additionally, the EfficiCIoU loss function replaces the original Complete IoU (CIoU) loss function, improving bounding box localization accuracy and accelerating model convergence. Furthermore, the Spatial Pyramid Pooling-Fast (SPPF) module in the backbone network is refined with the addition of two global average pooling layers, strengthening the extraction of sheep facial expression features and bolstering the model’s core feature fusion capacity. Experimental results show that the proposed method achieves a mAP@0.5 of 92.5%, a recall of 91%, a precision of 86%, and an F1-score of 88.0%, improvements of 4.5, 9.1, 2.8, and 6.0 percentage points, respectively, over the baseline model. Notably, the mAP@0.5 for normal and abnormal sheep facial expressions increased by 3.7% and 5.3%, respectively, demonstrating the method’s effectiveness in enhancing recognition accuracy under complex environmental conditions. Full article
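The baseline CIoU term that EfficiCIoU modifies combines IoU with a centre-distance penalty and an aspect-ratio consistency term. A minimal sketch of standard CIoU follows (the proposed EfficiCIoU variant itself is not reproduced here):

```python
import math

# Standard CIoU between two axis-aligned boxes given as (x1, y1, x2, y2).
# This is the baseline term the paper replaces, not the EfficiCIoU variant.
def ciou(a, b):
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    # Intersection-over-union.
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    iou = inter / union
    # Squared distance between box centres, normalized by the diagonal of
    # the smallest enclosing box.
    cw = max(ax2, bx2) - min(ax1, bx1)
    ch = max(ay2, by2) - min(ay1, by1)
    d2 = ((ax1 + ax2 - bx1 - bx2) ** 2 + (ay1 + ay2 - by1 - by2) ** 2) / 4
    # Aspect-ratio consistency term and its trade-off weight.
    v = (4 / math.pi ** 2) * (math.atan((bx2 - bx1) / (by2 - by1))
                              - math.atan((ax2 - ax1) / (ay2 - ay1))) ** 2
    denom = 1 - iou + v
    alpha = v / denom if denom > 0 else 0.0
    return iou - d2 / (cw ** 2 + ch ** 2) - alpha * v

print(round(ciou((0, 0, 2, 2), (0, 0, 2, 2)), 3))  # identical boxes → 1.0
```

Because the centre-distance and aspect-ratio penalties stay informative even when boxes barely overlap, CIoU-style losses converge faster than plain IoU, which is the property EfficiCIoU builds on.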
(This article belongs to the Section Small Ruminants)

12 pages, 1188 KiB  
Review
Artificial Intelligence-Driven Diagnostic Processes and Comprehensive Multimodal Models in Pain Medicine
by Marco Cascella, Matteo L. G. Leoni, Mohammed Naveed Shariff and Giustino Varrassi
J. Pers. Med. 2024, 14(9), 983; https://doi.org/10.3390/jpm14090983 - 16 Sep 2024
Abstract
Pain diagnosis remains a challenging task due to its subjective nature, the variability in pain expression among individuals, and the difficult assessment of the underlying biopsychosocial factors. In this complex scenario, artificial intelligence (AI) can offer the potential to enhance diagnostic accuracy, predict treatment outcomes, and personalize pain management strategies. This review aims to dissect the current literature on computer-aided diagnosis methods. It also discusses how AI-driven diagnostic strategies can be integrated into multimodal models that combine various data sources, such as facial expression analysis, neuroimaging, and physiological signals, with advanced AI techniques. Despite the significant advancements in AI technology, its widespread adoption in clinical settings faces crucial challenges. The main issues are ethical considerations related to patient privacy, biases, and the lack of reliability and generalizability. Furthermore, there is a need for high-quality real-world validation and the development of standardized protocols and policies to guide the implementation of these technologies in diverse clinical settings. Full article
(This article belongs to the Special Issue Towards Precision Anesthesia and Pain Management)

17 pages, 3975 KiB  
Review
A Review of Automatic Pain Assessment from Facial Information Using Machine Learning
by Najib Ben Aoun
Technologies 2024, 12(6), 92; https://doi.org/10.3390/technologies12060092 - 20 Jun 2024
Abstract
Pain assessment has become an important component of modern healthcare systems, aiding medical professionals in patient diagnosis and in providing appropriate care and therapy. Conventionally, patients are asked to report their pain level verbally. However, this subjective method is generally inaccurate, impossible for non-communicative people, affected by physiological and environmental factors, and time-consuming, which renders it inefficient in healthcare settings. There has therefore been a growing need for objective, reliable, and automatic pain assessment alternatives. Because facial expressions are efficient pain biomarkers that accurately reflect pain intensity, and because machine learning methods can effectively learn the subtle nuances of pain expressions and predict pain intensity, automatic pain assessment methods have evolved rapidly. This paper reviews recent spatial facial expression- and machine learning-based pain assessment methods. Moreover, we highlight the pain intensity scales, datasets, and method performance evaluation criteria, and the methods’ contributions, strengths, and limitations are reported and discussed. The review thus lays the groundwork for further study toward more accurate automatic pain assessment. Full article
(This article belongs to the Special Issue Medical Imaging & Image Processing III)

9 pages, 624 KiB  
Article
Functional and Esthetic Outcomes of Either Surgically or Conservatively Treated Anterior Frontal Sinus Wall Fractures: A Long-Term Follow-Up
by Oscar Solmell, Ola Sunnergren, Abdul Rashid Qureshi and Babak Alinasab
Craniomaxillofac. Trauma Reconstr. 2024, 17(4), 69; https://doi.org/10.1177/19433875241250225 - 30 Apr 2024
Abstract
Study Design: Retrospective cohort study. Objective: Frontal sinus fractures (FSFs) can lead to a range of clinical challenges, including facial deformity, impaired facial sensation, cerebrospinal fluid (CSF) leakage, sinus drainage impairment, chronic sinus pain and mucocele formation. The optimal management approach, whether surgical or conservative, remains a topic of ongoing discussion. The aim of this study was to evaluate and compare the functional and esthetic outcomes of patients with surgically and conservatively treated FSFs. Methods: In this retrospective study, patients treated for FSFs at the Karolinska university hospital 2004 to 2020 were identified in hospital records and invited to participate in a long-term follow-up. Sequelae and satisfaction with the esthetic result were assessed trough questionnaires and physical examinations. Results: A total of 93 patients were included in the study, with 49 presenting isolated anterior wall fractures and 44 presenting combined anterior and posterior wall fractures. Surgical intervention was performed in 45 cases, while 48 were managed conservatively. Among patients with moderate anterior wall fractures (4–6 mm dislocation), 80% of surgically treated patients compared to 100% of conservatively treated patients expressed satisfactionwith their cosmetic outcomes at follow-up (p = 0.03). In conservatively treated patients with a forehead impression, the anterior wall fracture dislocation ranged from 5.3 to 6.0 mm (p < 0.0001). Approximately 50% of surgically treated patients vs 15% of conservatively treated patients developed impaired forehead sensation at follow-up (p = 0.03). Thirty-six percent of surgically treated patients reported dissatisfaction with surgery-related scarring, particularly those who underwent surgery via laceration or bicoronal incision. 
Conclusions: This study suggests that anterior FSFs with a dislocation of 5 mm or less can be effectively managed conservatively, with high patient satisfaction, a low risk of long-term forehead sensory impairment, and no development of a forehead impression. A bicoronal incision, or an incision through a laceration, may be associated with esthetic dissatisfaction and late sequelae such as alopecia. Full article

14 pages, 2311 KiB  
Article
Painful Experiences in Social Contexts Facilitate Sensitivity to Emotional Signals of Pain from Conspecifics in Laboratory Rats
by Satoshi F. Nakashima, Masatoshi Ukezono and Yuji Takano
Animals 2024, 14(9), 1280; https://doi.org/10.3390/ani14091280 - 24 Apr 2024
Cited by 1 | Viewed by 1756
Abstract
Previous studies have demonstrated that laboratory rats can visually receive emotional pain signals from conspecifics through pictorial stimuli. The present study examined whether a prior painful emotional experience of the receiver influenced the sensitivity of emotional expression recognition in laboratory rats. The experiment comprised four phases: a baseline preference test, a pain manipulation, a post-manipulation preference test, and a state anxiety test. In the baseline phase, the rats explored an apparatus comprising two boxes to which pictures of pain or neutral expressions of other conspecifics were attached. In the pain manipulation phase, each rat was allocated to one of three conditions: foot shock alone (pained-alone; PA), foot shock with other unfamiliar conspecifics (pained-with-other; PWO), or no foot shock (control). In the post-manipulation phase, the animals explored the apparatus in the same manner as in the baseline phase. Finally, an open-field test was used to measure state anxiety. The results showed that, after the manipulation, rats in the PWO group stayed longer per entry in the box with photographs depicting a neutral disposition than in the box with photographs depicting pain. The open-field test revealed no significant differences between the groups, suggesting that the increased sensitivity to pain expression in other individuals following painful experiences in social settings was not due to increased primary state anxiety. These findings indicate that rats may combine their own painful experiences with the states of other conspecifics to process emotional signals of pain from conspecifics. Moreover, the change in the rats' responses to facial expressions according to social experience suggests that facial expressions in rats serve not only to express emotion but also to communicate. Full article
(This article belongs to the Section Mammals)

13 pages, 1976 KiB  
Article
Corrugator Muscle Activity Associated with Pressure Pain in Adults with Neck/Shoulder Pain
by Takahiro Yamada, Hiroyoshi Yajima, Miho Takayama, Konomi Imanishi and Nobuari Takakura
Medicina 2024, 60(2), 223; https://doi.org/10.3390/medicina60020223 - 28 Jan 2024
Viewed by 2075
Abstract
Background and Objectives: No studies have reported corrugator muscle activity associated with pain in people experiencing pain. This study aimed to develop an objective pain assessment method using corrugator muscle activity during pressure pain stimulation of skeletal muscle. Methods: Participants were 20 adults (mean ± SD age, 22.0 ± 3.1 years) with chronic neck/shoulder pain. Surface electromyography (sEMG) of corrugator muscle activity was recorded at rest (baseline) and without and with pressure pain stimulation applied to the most painful tender point in the shoulder. Participants rated the intensity of their neck/shoulder pain and the sensory and affective components of pain under pressure stimulation using a visual analogue scale (VAS). The percentages of the integrated sEMG (% corrugator activity) without and with pressure pain stimulation relative to the baseline integrated sEMG were compared, and the relationships between % corrugator activity and the sensory and affective components of the pain VAS scores were evaluated. Results: Without pressure stimulation, no increase in corrugator muscle activity attributable to chronic neck/shoulder pain was observed. The % corrugator activity with pressure pain stimulation was significantly higher than that without stimulation (p < 0.01). Corrugator muscle activity correlated significantly and positively with the affective component of the pain VAS scores under pressure stimulation (ρ = 0.465, p = 0.039), with a trend toward a positive correlation for the sensory component (ρ = 0.423, p = 0.063). Conclusions: Corrugator muscle activity increased with pressure pain stimulation of the tender point in adults with chronic neck/shoulder pain, whereas no increase resulting from the chronic neck/shoulder pain itself was observed. These findings suggest that corrugator muscle activity during pressure pain stimulation can serve as a useful objective indicator for assessing tender point sensitivity in painful skeletal muscle. Full article
(This article belongs to the Special Issue Persistent Pain: Advances in Diagnosis and Management)
