Article

Unveiling the Truth in Pain: Neural and Behavioral Distinctions Between Genuine and Deceptive Pain

by Vanessa Zanelli 1, Fausta Lui 1,*, Claudia Casadio 1, Francesco Ricci 1, Omar Carpentiero 1, Daniela Ballotta 1, Marianna Ambrosecchia 2,3, Martina Ardizzi 2, Vittorio Gallese 2, Carlo Adolfo Porro 1 and Francesca Benuzzi 1

1 Department of Biomedical, Metabolic and Neural Sciences, University of Modena and Reggio Emilia, 41125 Modena, Italy
2 Neuroscience Unit, Department of Medicine and Surgery, University of Parma, 43125 Parma, Italy
3 Center for Studies and Research in Cognitive Neuroscience of Cesena, 47522 Cesena, Italy
* Author to whom correspondence should be addressed.
Brain Sci. 2025, 15(2), 185; https://doi.org/10.3390/brainsci15020185
Submission received: 30 December 2024 / Revised: 7 February 2025 / Accepted: 10 February 2025 / Published: 12 February 2025
(This article belongs to the Section Sensory and Motor Neuroscience)

Abstract

Background/Objectives: Fake pain expressions are more intense, prolonged, and include non-pain-related actions compared to genuine ones. Despite these differences, individuals struggle to detect deception in direct tasks (i.e., when asked to detect liars). Regarding neural correlates, while pain observation has been extensively studied, little is known about the neural distinctions between processing genuine, fake, and suppressed pain facial expressions. This study seeks to address this gap using authentic pain stimuli and an implicit emotional processing task. Methods: Twenty-four healthy women underwent an fMRI study, during which they were instructed to complete an implicit gender discrimination task. Stimuli were video clips showing genuine, fake, suppressed pain, and neutral facial expressions. After the scanning session, participants reviewed the stimuli and rated them indirectly according to the intensity of the facial expression (IE) and the intensity of the pain (IP). Results: Mean scores of IE and IP were significantly different for each category. A greater BOLD response for the observation of genuine pain compared to fake pain was observed in the pregenual anterior cingulate cortex (pACC). A parametric analysis showed a correlation between brain activity in the anterior mid-cingulate cortex (aMCC) and the IP ratings. Conclusions: Higher IP ratings for genuine pain expressions and higher IE ratings for fake ones suggest that participants were indirectly able to recognize authenticity in facial expressions. At the neural level, the pACC and aMCC appear to be involved in unveiling genuine vs. fake pain and in coding the intensity of the perceived pain, respectively.

1. Introduction

The International Association for the Study of Pain (IASP) defines pain as an unpleasant sensory and emotional experience associated with, or resembling that associated with, actual or potential tissue damage [1]. This dual sensory-emotional nature makes pain not only a sensory phenomenon but also a crucial social signal, often expressed through specific facial movements.
Facial expressions serve as universal signals of human emotions and can reliably communicate internal states across different contexts in highly social species [2,3]. Pain, as a complex emotional and sensory experience, is accompanied by distinctive facial expressions, which facilitate social communication and support-seeking behaviors by eliciting empathy [4,5,6]. Empathy is broadly defined as the capacity to understand what others feel, whether it is an emotional or sensory state [7]. In the literature, human pain-related body behaviors, especially noticeable actions such as limping or guarding, have been found to influence social reinforcement and social behavior; interestingly, less attention has been given to facial expressions of pain [8]. The literature on pain expression processing and empathy has mostly relied on stimuli featuring body parts rather than faces [9,10]. Yet facial pain expressions have a special survival and communicative value, because they can warn others of imminent danger and elicit helping behavior [4,11]. A typical pain-related facial expression tends to appear consistently across different pain conditions, making it universally recognizable [6]. This set of facial movements includes lowered brows, raised cheeks, tightened eyelids, an opened mouth, and a wrinkled nose [6,12]. These features are distinctive of pain and can be differentiated from expressions of other negative emotions such as disgust, fear, or anger [12].
Nevertheless, emotions expressed through nonverbal cues do not always align with an individual’s actual emotional state [13] because of the evolutionary development of interpersonal deception [2,3,11,14,15]. Deception is defined as “an act that is intended to foster in another person a belief or understanding which the deceiver considers false” [16]. This ability must be intentional and conscious, meaning that lying inherently reflects human intention and awareness [14]. There are three major ways in which emotional expressions can be intentionally manipulated. First, an expression may be simulated; namely, individuals display an emotion (e.g., pain) they are not genuinely experiencing. Second, an expression can be masked by replacing the genuine emotion with an artificial emotional display (e.g., the person is truly experiencing sadness but displays a fake smile). Finally, a facial expression can be suppressed in such a way as to conceal the emotion the individual is experiencing (e.g., the person is experiencing pain but inhibits the facial expression to conceal it) [15]. According to this distinction, both simulated (or fake) and suppressed expressions are clear examples of deception.
In the past few decades, several attempts have been made to determine the distinctive morphological/visual patterns, both static and dynamic, of deceptive facial expressions compared to genuine ones [17]. In general, spontaneous emotional expressions are more symmetrical than those made deliberately, both for positive and negative emotions [2]. Moreover, genuine expressions usually last between 2 and 4 s, whereas deliberate expressions show abrupt onset and offset and an apex that lasts unnaturally long [2]. More specifically, simulated (or fake) pain expressions exaggerate typical pain features, such as brow lowering and mouth opening, in both intensity and duration, and include non-pain-related emotions (e.g., shame, happiness; [12,15]). On the other hand, suppressed pain facial expressions are less intense and show residual facial activity due to the pain actually experienced [12].
While the distinctions between simulated and genuine facial expressions of pain might appear evident from the literature, individuals are surprisingly poor at detecting deception. In fact, several studies have demonstrated that when individuals are asked whether a person is deceiving or not (i.e., direct tasks), their performance is only slightly above chance, even after recognition training [3,15,18,19]. A possible explanation is that people consciously tend to rely on cues, such as brow lowering, likely because of their salience in mental representations of pain expressions, which, however, may not be the best predictors of real pain [6]. Instead, according to some authors, the recognition of pain from facial expressions should depend on other prominent facial cues (e.g., eye narrowing; [3,6]).
Interestingly, when certain indirect tasks are used (e.g., rating the perceived cooperativeness of the target; [20]), subjects are better at recognizing when emotions are simulated. In line with this view, a study by Stel and van Dijk [13] demonstrated that participants were good at detecting liars when using an indirect task (i.e., indicating the extent to which the target was experiencing a particular emotion while telling a story), compared to a direct task in which they rated the truthfulness of the target.
While in the field of deception research the terms ‘indirect’ and ‘direct’ are specifically defined as outlined previously, in the field of emotion research authors use the terms ‘explicit’ and ‘implicit’ to distinguish between different tasks. Explicit emotional processing requires participants to label the affective content of stimuli or to perform a valence categorization task (e.g., determining whether a stimulus is positive or negative). It requires declarative evaluation and recruits higher cognitive resources to define conscious emotional states. On the other hand, implicit tasks ask the participant to process a non-emotional attribute of the stimulus (e.g., a gender discrimination task). This processing is meant to be automatic and does not require conscious access to be executed [21,22]; it is based on the assumption that emotional processes operate independently of cognitive attention [23].
Concerning the neural correlates of pain, two distinguishable sets of areas are involved in the sensory and affective components of pain, also collectively referred to as the “pain matrix” [24,25,26]. The primary and secondary somatosensory cortices (S1, S2) and the posterior insula process localization, quality, and intensity discrimination of painful stimuli (i.e., sensory components; [27]). Conversely, the anterior mid-cingulate cortex (aMCC; see [28] for an updated nomenclature of cingulate cortex) and the anterior insula (AI) are engaged in processing the distressing aspects of pain, including the subjective discomfort associated with the nociceptive signal and the motivational drive to eliminate the stimulus responsible for this experience (i.e., affective components; [29]), and their activity correlates with the intensity of perceived pain [30].
These “affective” areas have been found to be activated also during the observation of facial expressions of pain [5,31,32]. However, relatively few studies have focused on the neural mechanisms underlying the distinctions between genuine and fake facial expressions of pain. For example, Zhao and colleagues [33] demonstrated some neural differences between the observation of genuine and fake facial expressions of pain. However, in that study, participants were explicitly informed that they were observing simulated expressions; moreover, even “real” pain expressions were actually simulated by actors.
The aim of this study is to investigate the differences in the neural correlates underlying the observation of genuine, fake, and suppressed pain facial expressions. In other words, we seek to determine whether the brain can differentiate between genuine and deceptive expressions. To the best of our knowledge, this is the first study on genuine/deceptive pain to use exclusively video clips showing facial expressions rather than injured body parts, with genuine and suppressed pain videos derived from real painful stimuli applied to the models’ hands, thus ensuring the authenticity of the emotional displays. Moreover, during functional magnetic resonance imaging (fMRI) scanning, participants performed an implicit task (gender discrimination), remaining unaware of which videos involved genuine pain. Additionally, participants later reviewed the videos outside the scanner to provide indirect ratings (i.e., the perceived intensity of the expression and of the pain), enabling us to assess whether they could indirectly differentiate between genuine, simulated, and suppressed pain.

2. Materials and Methods

2.1. Participants

Twenty-four right-handed healthy women (mean age = 20.7 years, SD ± 2.9; years of education = 13.1, SD ± 0.6) were recruited among the students of the University of Modena and Reggio Emilia to take part in the experiment. Handedness was assessed using the Edinburgh Inventory (M = 0.9, SD ± 0.1; [34]). The volunteers had no history of neurological or psychiatric diseases or brain injury, and no contraindications for MRI. Because of gender differences both in expressivity [35,36] and in empathic behavior [37,38], we recruited only women in order to study a homogeneous experimental population. All participants gave their written informed consent to take part in the study and received study credits for their participation. The experiment was approved in advance by the Local Ethics Committee (Area Vasta Emilia Nord, protocol number 134.14) and was conducted in accordance with the ethical standards of the 2013 Declaration of Helsinki [39].

2.2. Questionnaires and Interoception Assessment

Before beginning the fMRI experimental procedure, participants completed a series of self-administered questionnaires in their Italian printed version:
  • Empathy Quotient (EQ; [40]);
  • Interpersonal Reactivity Index (IRI; [41]);
  • Toronto Alexithymia Scale 20 (TAS-20; [42]);
  • Pain Catastrophizing Scale (PCS; [43]).
Moreover, we decided to investigate interoceptive accuracy (IA), namely, the individual sensitivity to physiological stimuli originating inside the body, which is one of the most relevant aspects of self-experience and may influence the perception and evaluation of emotional stimuli [44,45]. IA can be assessed through a heartbeat perception task [46]; in the present study, this task was employed as described in Ardizzi et al. [47] and Ambrosecchia et al. [48]. Briefly, the task consisted of silently counting one’s own heartbeats while the ECG was being recorded, without any kind of feedback from the experimenters. After a 15 s training period, participants had to count their heartbeats in four randomized time intervals (25 s, 35 s, 45 s, and 100 s), which were triggered by audio-visual start and stop cues. The ECG was recorded through three Ag/AgCl, pre-gelled, 10 mm electrodes (ADInstruments, Oxford, UK) placed in an Einthoven’s triangle configuration, at a sampling rate of 1 kHz and with an online filter. Afterwards, the R-wave peaks were detected to obtain the actual number of the participants’ heartbeats (PowerLab and OctalBioAmp 8/30, ADInstruments, United Kingdom).
The IA score was calculated as the mean score of the four separate heartbeat perception intervals according to the following transformation [46,49]:
$$\mathrm{IA}=\frac{1}{4}\sum_{i=1}^{4}\left(1-\frac{\left|\,\text{recorded beats}_i-\text{counted beats}_i\,\right|}{\text{recorded beats}_i}\right)$$
According to this transformation, the IA score varies between 0 and 1, with higher scores indicating smaller differences between objectively recorded and subjectively counted heartbeats (i.e., higher interoceptive accuracy).
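For illustration, a minimal Python sketch of this computation (the interval counts below are hypothetical, not data from the study):

```python
# Interoceptive accuracy (IA) from the heartbeat counting task:
# mean over intervals of 1 - |recorded - counted| / recorded.

def interoceptive_accuracy(recorded, counted):
    """Mean IA score across heartbeat-counting intervals (range 0-1)."""
    assert len(recorded) == len(counted) > 0
    scores = [1 - abs(r - c) / r for r, c in zip(recorded, counted)]
    return sum(scores) / len(scores)

# Hypothetical values for the four intervals (25 s, 35 s, 45 s, 100 s):
recorded_beats = [28, 40, 52, 115]  # from ECG R-wave detection
counted_beats = [24, 35, 47, 98]    # participant's silent counts

print(round(interoceptive_accuracy(recorded_beats, counted_beats), 2))  # ~0.87
```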

2.3. Experimental Procedure

An event-related fMRI paradigm was employed. Each participant completed three runs of 36 trials (for a total of 108 trials). Every trial lasted 14 s and began with a brief visual warning signal (WS), which was a change in the background color of the video from black to blue (0.5 s). Then, the WS was followed by stimulus presentation (2.5 s) and a black screen (11 s; see Figure 1). At the beginning and at the end of each run, there was a 20 s rest period.
The stimuli were 108 video clips showing 27 different individuals (20 female and 7 male identities) and belonging to four categories: genuine pain (GP; video clips reproducing facial expressions of individuals who were truly experiencing pain in their hand); deceptive pain, comprising fake pain (FP; video clips showing facial expressions of individuals who were asked to simulate a painful experience) and suppressed pain (SP; video clips depicting individuals who were really experiencing pain in their hand but were asked to suppress their facial expressions); and neutral (N; control video clips depicting individuals with a neutral expression while receiving a light touch on their hand).
Starting from the 108 video clips, we created four different randomized sequences to present to the participants. Each sequence consisted of 36 video clips per run, divided into 9 GP, 9 FP, 9 SP, and 9 N.
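As an illustrative sketch (not the software actually used), one way to generate such a randomized run sequence in Python:

```python
import random

CATEGORIES = ("GP", "FP", "SP", "N")

def make_run(clips_per_category=9, rng=random):
    """One run of 36 trials: 9 clips per category, randomly ordered."""
    run = [cat for cat in CATEGORIES for _ in range(clips_per_category)]
    rng.shuffle(run)
    return run

# A full sequence is three runs (108 trials); four such randomized
# sequences were prepared and presented to the participants.
sequence = [make_run() for _ in range(3)]
```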
Stimuli were presented via the Esys-fMRI (Invivo Corporation, Gainesville, FL, USA) remote display, timed by custom-made software developed in Visual Basic 6 (http://digilander.libero.it/marco_serafini/stimoli_video/, accessed on 8 January 2015).
Participants were instructed to complete an implicit task, i.e., they had to carefully watch the video and press a button on a response box (Current Designs Inc., Philadelphia, PA, USA) when they detected a male identity in the video clip.
At the end of the scanning session, each volunteer was asked to review the same video clips on a computer outside the magnetic resonance room and to give a posteriori ratings. Using a range from zero to ten (with 0 = no intensity and 10 = the maximum imaginable intensity), they had to rate two indirect parameters: the intensity of the facial expression (IE) and the intensity of the pain they guessed had been really perceived by the individual shown in each video (IP). When participants considered a certain video clip to be expressive but not a facial expression of pain (IE ≠ 0, IP = 0), they were also asked to indicate which emotion they believed it represented.

2.4. Stimuli

The video clips employed as stimuli were recorded and validated in a previous experiment by our research group [5]. Twenty-seven young participants were recorded while experiencing painful or non-painful stimulations on their right hand. They sat comfortably in front of a grey background, wearing a white coat covering their personal clothing, without any kind of distinctive elements (e.g., make-up or jewelry). The experimenter delivered the painful or tactile stimulation manually through an aluminum hollow cylinder containing a sliding brass weight connected to a plastic tip. This tip could hold either a stainless-steel wire (0.2 mm cross-section) or a foam rubber tip (2 mm). In a preliminary calibration procedure, the brass weight was set so that the subject consistently reported no pain when touched with the foam rubber tip (pain intensity = 0 on a numerical rating scale–NRS–from 0 to 10) and reported light-to-moderate pain when touched with the wire (pain intensity = 2–4 on the same 0–10 NRS). In each of the first two sessions, 20 painful and 20 non-painful stimuli were pseudo-randomly alternated: in the first session, the volunteers were instructed to react naturally to each stimulus (GP and N video clips); in the second session, they were instructed to suppress pain during actual painful stimulation (SP and, again, N video clips). After each stimulation, the subject reported the perceived pain intensity on the 0–10 NRS. Finally, in a third session, participants did not undergo any stimulation, but they were asked to simulate some painful experiences (FP video clips). The camera (Sony HDD Handycam DCR-SR32, spatial resolution 720 × 576 pixels; Sony, Tokyo, Japan) was placed 1.5 m in front of the individuals at eye level. The environmental setting was kept the same for all the recordings.
Subsequently, the collected video clips were validated by an independent group of three female evaluators from the University of Modena and Reggio Emilia. The evaluators, who were blind to the real nature of each video clip, had to watch the video clips and rate, on a scale from zero to ten, both IE and IP. Starting from this validation, 108 video clips (belonging to GP, FP, SP, and N categories) were selected, namely, the ones that received the most consistent ratings from the independent evaluators. The IE mean and the IP mean of the selected video clips, divided into the four above-mentioned categories, are represented in Table 1.

2.5. fMRI Data Acquisition

Functional MRI data were acquired with a Philips Achieva MRI system (Philips, Amsterdam, The Netherlands) at 3T and a BOLD (blood oxygenation level-dependent) sensitive gradient-echo echo-planar sequence [repetition time (TR): 2000 ms; echo time (TE): 30 ms; field of view: 240 mm; 80 × 80 matrix; 35 transverse slices, 3 mm each with a 1 mm gap]. A high-resolution T1-weighted anatomical image was also acquired for each subject to allow anatomical localization and spatial standardization (TR: 9.9 ms; TE: 4.6 ms; 170 sagittal slices; voxel size: 1 mm × 1 mm × 1 mm).

2.6. Behavioral Data Analysis

Behavioral analyses were performed with RStudio (version 2024) and TIBCO Statistica, version 14.0.1 (2020).
The questionnaires were scored for each participant, and the mean score was calculated for each questionnaire.
Regarding the a posteriori video clip ratings, IE and IP were computed for each video clip category (GP, FP, SP, and N). Analysis of the data distribution using the Shapiro-Wilk test showed that IE and IP were not normally distributed; therefore, a non-parametric ANOVA (Friedman’s test) with category as a within-subject factor was conducted. Post-hoc analyses were performed using Wilcoxon signed-rank tests, with the Bonferroni correction applied for multiple comparisons to control for the probability of making at least one Type I error across all tests.
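A minimal Python/SciPy sketch of this analysis pipeline, assuming one mean rating per participant and category (variable names are illustrative):

```python
from itertools import combinations
from scipy.stats import friedmanchisquare, wilcoxon

def rating_analysis(ratings):
    """ratings: dict mapping category -> list of per-participant mean
    scores, paired across categories (same participant order)."""
    chi2, p = friedmanchisquare(*ratings.values())
    print(f"Friedman chi2 = {chi2:.2f}, p = {p:.3g}")

    pairs = list(combinations(ratings, 2))  # 6 pairs for 4 categories
    for a, b in pairs:
        stat, p_unc = wilcoxon(ratings[a], ratings[b])
        p_bonf = min(p_unc * len(pairs), 1.0)  # Bonferroni correction
        print(f"{a} vs {b}: W = {stat:.1f}, corrected p = {p_bonf:.3g}")
```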

2.7. fMRI Data Analysis

MATLAB R2020a and SPM12 (Wellcome Department of Imaging Neuroscience, London, UK) were used for fMRI data analysis.
For each participant, all functional volumes were corrected for slice timing, realigned to the first volume acquired, co-registered with the anatomical image, normalized to the MNI (Montreal Neurological Institute) template implemented in SPM12, and smoothed with a 6 mm × 6 mm × 8 mm full-width at half-maximum Gaussian kernel.
At the single-subject level, the four conditions (GP, FP, SP, and N) were modeled by convolving the respective stimulus timing vectors with the standard hemodynamic response function. Condition effects were estimated using a general linear model (GLM) framework, and region-specific effects were investigated with linear contrasts comparing the four experimental conditions. For all subjects, the stimuli entered in each condition were the same, following the a priori categorization derived from the preliminary validation of the video clips (see above). Head-motion parameters (translations and rotations) were entered as nuisance variables. Group-level random effects analyses were then performed by entering the individual contrast images corresponding to the effects of interest into separate one-sample t-tests. Moreover, we conducted a conjunction analysis on genuine pain vs. neutral and fake pain vs. neutral to identify activations common to both contrasts at the same statistical threshold, in order to evaluate regions involved in the processing of clearly visible facial expressions of pain.
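The following numpy sketch illustrates the core of such a single-subject model: a stick function at stimulus onsets convolved with a canonical double-gamma HRF. Onsets and HRF parameters are illustrative; SPM12’s canonical HRF and GLM estimation include further details not reproduced here.

```python
import numpy as np
from scipy.stats import gamma

TR = 2.0  # repetition time (s), as in the acquisition protocol

def canonical_hrf(tr=TR, duration=32.0):
    """Double-gamma HRF, similar in shape to SPM's canonical HRF."""
    t = np.arange(0.0, duration, tr)
    hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0  # peak minus undershoot
    return hrf / hrf.sum()

def condition_regressor(onsets_s, n_scans, tr=TR):
    """Stick function at stimulus onsets (s) convolved with the HRF."""
    sticks = np.zeros(n_scans)
    sticks[(np.asarray(onsets_s) / tr).astype(int)] = 1.0
    return np.convolve(sticks, canonical_hrf(tr))[:n_scans]

# Hypothetical GP onsets within one run; one column of the design matrix:
gp_onsets = [20.5, 48.5, 76.5]
X_gp = condition_regressor(gp_onsets, n_scans=300)
```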
Parametric analyses were performed to map regions whose activity was related to IP and IE, using the a posteriori categorization that we established by considering the IP and IE ratings, respectively, given by participants after the scanning session. In order to form the categories, we followed the rules illustrated in Table 2. We assigned the N category to the videos that received a score of 0 for both IE and IP. We then assigned the ‘Other’ category to those videos that had non-zero IE and 0 IP scores, where participants had indicated another type of perceived emotion (e.g., surprise, anger). Finally, we kept the original category for those videos that received non-zero scores for both IE and IP. On average, the N category comprised 18 stimuli, the GP category 33.9 stimuli, the FP category 20.1 stimuli, the SP category 20.9 stimuli, and the “Other” category 13.5 stimuli.
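A hypothetical Python rendering of these recategorization rules, following the description of Table 2 (combinations not described in the text are assumed to keep the a priori label):

```python
def a_posteriori_category(a_priori, ie, ip, other_emotion=None):
    """Reassign a stimulus category from one participant's ratings."""
    if ie == 0 and ip == 0:
        return "N"              # rated as fully neutral
    if ie > 0 and ip == 0 and other_emotion is not None:
        return "Other"          # expressive, but not perceived as pain
    if ie > 0 and ip > 0:
        return a_priori         # original category retained
    return a_priori             # assumption: remaining cases keep label

print(a_posteriori_category("FP", ie=5, ip=0, other_emotion="surprise"))  # Other
```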
At the single-subject level, the five conditions (GP, FP, SP, N, and Other) were modeled by convolving the respective stimulus timing vectors and the respective IE/IP ratings with the standard hemodynamic response function. Head-motion parameters (translations and rotations) were entered as nuisance variables. At the group level, random effects analyses were performed by entering the individual contrast images corresponding to the effects of interest into separate one-sample t-tests. The stimuli were recategorized according to the participants’ responses, and for each stimulus, the corresponding IE/IP value was included in the analysis with the aim of separately evaluating the parametric effect on the different stimulus categories. However, the analysis of the individual categories did not yield results above the threshold, possibly because the variability in the estimates provided by the participants for stimuli belonging to each single category was too small. Therefore, we decided to use a single contrast that combined the parametric effect across all categories.
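Continuing the design-matrix sketch above, a parametric modulator can be built by weighting each trial’s stick by its mean-centered IE/IP rating before HRF convolution (a common scheme; SPM12’s actual implementation may differ in detail, e.g., in orthogonalization):

```python
def parametric_modulator(onsets_s, ratings, n_scans, tr=TR):
    """Sticks scaled by mean-centered ratings, convolved with the HRF
    (reuses canonical_hrf, TR, and numpy from the previous sketch)."""
    weights = np.asarray(ratings, dtype=float)
    weights -= weights.mean()   # remove the main condition effect
    sticks = np.zeros(n_scans)
    sticks[(np.asarray(onsets_s) / tr).astype(int)] = weights
    return np.convolve(sticks, canonical_hrf(tr))[:n_scans]

# Hypothetical IP ratings for the three GP trials above:
X_gp_ip = parametric_modulator(gp_onsets, ratings=[6, 3, 7], n_scans=300)
```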
Furthermore, the personality questionnaire scores and the IA results were used to perform correlation analyses with functional brain activity in the various conditions/contrasts.
For all the above-mentioned analyses, a double statistical threshold (single-voxel statistics and spatial extent) was used to achieve a combined experiment-wise (i.e., corrected for multiple comparisons) significance level of α < 0.05, as computed by the 3dClustSim AFNI routine with the “-acf” option (AFNI version 24.2.06).

3. Results

3.1. Behavioral Results

3.1.1. Questionnaires and Interoception Assessment

The mean of EQ was 46.1 with a SD of 8.1; the mean of IRI was 87.9 with a SD of 8.9; the mean of TAS-20 was 50.2 with a SD of 8.4; the mean of PCS was 20.1 with a SD of 6.6. The mean of IA was 0.41 with a SD of 0.28.

3.1.2. IE and IP Ratings

Regarding IE, fake stimuli received the highest ratings (M = 5.2, SD ± 2.2), followed by genuine (M = 4.2, SD ± 2.6), suppressed (M = 2, SD ± 1.8), and neutral (M = 0.5, SD ± 0.8). Concerning IP, genuine stimuli received the highest ratings (M = 4, SD ± 2.8), followed by fake (M = 3.5, SD ± 3), suppressed (M = 3.2, SD ± 2.5), and neutral (M = 1.5, SD ± 2).
The non-parametric ANOVA (Friedman’s test) on IE ratings revealed a significant effect of category (χ2(3) = 1159.18, p < 0.001). Post-hoc tests revealed a significant difference between all the categories (p < 0.001; Figure 2).
The non-parametric ANOVA (Friedman’s test) on IP ratings revealed a significant effect of category (χ2(3) = 347.03, p < 0.001). Post-hoc tests revealed significant differences among all the categories with p < 0.001, except for the difference between fake and suppressed, which gave p < 0.05; after applying the Bonferroni correction, however, this difference was no longer significant (Figure 3).

3.2. fMRI Results

3.2.1. fMRI Results with a Priori Categorization

The following are the most relevant results of the a priori categorization analysis.
Greater BOLD response for the observation of genuine pain compared to fake pain (GP vs. FP) was observed in the pregenual anterior cingulate cortex (pACC; [28]), bilaterally (Table 3, Figure 4).
The contrast between genuine pain and suppressed pain (GP vs. SP) revealed areas of significant BOLD signal changes in a wide range of cortical regions, including the left cerebellum, bilateral superior, middle and inferior temporal gyrus, right fusiform gyrus, bilateral middle occipital gyrus, right inferior parietal lobule, right cuneus, and right insula (see Table 3).
The results of the conjunction analysis performed on the contrasts GP vs. N and FP vs. N revealed greater BOLD response in bilateral superior, middle, and inferior temporal gyrus, bilateral middle occipital gyrus, right inferior parietal lobule, right insula, right temporal pole, left supramarginal gyrus, right middle frontal gyrus and bilateral inferior frontal gyrus, left cerebellum, right inferior occipital gyrus, and right fusiform gyrus (Table 4).

3.2.2. Parametric Analysis with a Posteriori Categorization

A significant parametric relationship was found between the IP ratings provided by the participants for each video clip and brain activity in the anterior mid-cingulate cortex (aMCC; Table 5, Figure 5).
Parametric analysis with IE ratings did not result in any significant activity.

3.2.3. Correlations with Questionnaires and Interoception

No significant correlations were found between functional brain activity in the most relevant contrasts of the a priori categorization analysis presented above and the questionnaire scores or the IA assessment results.

4. Discussion

The present study aimed to explore the behavioral and neural correlates of the observation of genuine, fake, and suppressed facial expressions of pain. Specifically, we sought to determine whether the brain can unveil the truth in pain expressions. To this end, we asked a group of women to undergo a functional magnetic resonance imaging (fMRI) task during which video clips depicting genuine pain (GP), fake pain (FP), suppressed pain (SP), and neutral (N) facial expressions were presented. After the scanning session, participants reviewed the video clips and provided a posteriori ratings for each video. As far as we are aware, this is the first study to use authentic video clips of facial expressions, in which genuine and suppressed pain expressions were derived from actual nociceptive stimuli and deceptive expressions were deliberately simulated by the volunteers. Moreover, in order to minimize potential biases and emphasize automatic neural responses to the different categories of pain expressions, we employed an implicit task (gender discrimination) during the fMRI paradigm. In addition, during the post-scan video review, we employed an explicit but indirect task: subjects had not been told that some of the expressions they observed were genuine while others were fake or suppressed, and they were not asked to directly recognize who was pretending. In this indirect task, participants rated the video clips according to the intensity of the facial expression (IE) and the intensity of the pain (IP) they observed. Overall, our findings indicate that participants were, on average, able to differentiate between genuine, fake, and suppressed pain through the indirect task. At the neural level, activity in the pregenual anterior cingulate and anterior mid-cingulate cortices appears to be crucial in unveiling genuine pain and in coding the intensity of the pain observed in others.

4.1. Behavioral Findings

Our participants assigned, on average, higher IP ratings to genuine pain facial expressions than to fake, suppressed, and neutral ones. Conversely, they assigned, on average, higher IE ratings to fake pain facial expressions than to genuine, suppressed, and neutral ones. Since the literature indicates that simulated expressions tend to be more pronounced in terms of expressiveness, with the typical components of genuine pain (such as brow lowering and mouth opening) displayed with greater intensity and for longer durations than in authentic expressions [12], both results suggest that subjects were indirectly able to distinguish genuine from simulated expressions of pain. This ability could also be enhanced by the fact that simulated facial expressions often include incongruent non-pain-related actions, conveying emotions such as shame, guilt, or happiness [12,15].
It is also worth noting that suppressed pain expressions received lower IE ratings than genuine and simulated expressions and higher IE ratings than neutral expressions. These results confirm previous research demonstrating that suppressed pain facial expressions are less intense but present some residual facial activity (e.g., micro-expressions, such as brow lowering and mouth opening) [12,15]. These micro-expressions might be precisely what led participants to attribute higher IP ratings to suppressed expressions compared to neutral expressions: it appears that participants were only partly deceived by the effort made by the models in the video clips to mask the pain they actually felt.
Our behavioral results support the hypothesis that asking individuals to rate the expressivity and the pain observed in others allows the recognition of the authenticity of pain facial expressions. This kind of task can be categorized as an indirect task according to the dissociation between direct and indirect measures, as it does not directly ask participants whether the models are lying [3,13,15,18,19,20].

4.2. Functional Imaging Findings

According to the dissociation between explicit and implicit emotional processing tasks [22,50], our fMRI protocol can be categorized as an implicit task. Indeed, participants performed a non-emotional task (gender discrimination) with consciously visible emotional stimuli (i.e., pain facial expressions).
Several studies consistently demonstrate that a well-known set of brain regions (“affective” areas of the Pain Matrix) is involved in the observation of facial expressions of pain [4,5,31,32]. Within these regions, the anterior mid-cingulate cortex (aMCC) and the anterior insula (AI) play key roles [29]. It is worth noting that the cingulate cortex nomenclature has been quite controversial in the past few decades; therefore, the same region (aMCC) is also variably named in the literature (e.g., the anterior cingulate cortex, ACC, or dorsal ACC; see, for instance, [31]). Here, we follow Vogt’s classification of the cingulate cortex [28].
The parametric analysis with a posteriori categorization revealed that aMCC activity is related to the IP ratings, i.e., the greater the IP score, the greater the activity of the aMCC. This region is commonly considered part of the affective network elaborating physical pain [26], and its activity correlates with the intensity of perceived physical pain [30]. The aMCC was also found to be activated in various studies comparing the observation of painful vs. non-painful facial expressions [4,5,31,32]. The present finding suggests that the aMCC might be the brain structure mainly involved in the “measure” of pain, and therefore in the discrimination between real and fake pain. Interestingly, Budell and colleagues [32] indicated this region, along with several others, as correlating with the amount of pain judged to be present in the observed facial expressions; however, their stimuli were exclusively pain expressions produced by actors and presented as real. A previous study on genuine/deceptive pain expressions [33] reported that genuine pain, compared to simulated pain, selectively activates the aMCC, AI, and right supramarginal gyrus (rSMG), but focused on the interplay between AI and rSMG rather than on the aMCC. However, it is worth noting that the study by Zhao et al. [33] differs from our experimental protocol in several respects: first, participants were explicitly informed that they would observe deceptive expressions; second, they were explicitly instructed to recreate the feelings of the models as vividly and intensely as possible; these instructions might therefore have introduced significant biases. Moreover, all their videos depicted faces of actors intentionally producing facial expressions of pain; therefore, even the expressions of “genuine” pain were actually simulated.
Our results also show that the pregenual anterior cingulate cortex (pACC) is significantly more activated when observing genuine pain expressions compared to fake ones [51,52]. Some studies have shown activation of the subgenual and perigenual portions of the ACC (corresponding to pACC in Vogt’s classification [28]) for nociceptive stimuli [53], as well as for saline injections or visceral distention [54], although not as consistently as for other regions. In fact, the anterior regions of the cingulate cortex are not listed in a recent meta-analysis on the neural representation of acute pain [26]. Interestingly, the pACC showed increased signal when observing negative affective images of body parts [55] and when comparing self vs. others’ faces in a study about pain expressions [5], but decreased signal both during and in anticipation of acute pain perception [56]. Furthermore, the pACC has connections with the orbitofrontal cortex, periaqueductal grey, and autonomic centers and is rich in opioid receptors; overall, this region has been consistently implicated in endogenous pain control, including placebo analgesia [51]. The increased signal we found in the pACC may be related to the activation of endogenous control systems, possibly linked to an empathic response, which is higher when observing genuine than fake pain expressions.
Additionally, the observation of authentic facial expressions of pain, as compared to suppressed ones (GP vs. SP), more intensely activated a bilateral array of cortical areas located in the temporal, occipital, and insular cortices. These brain regions, and particularly AI, are related both to actual pain perception and to the observation of pain [5,24], as well as to facial processing according to the Haxby and Gobbini model [57,58]. Indeed, several studies have consistently demonstrated that AI is engaged in processing the distressing aspects of pain [29], to the point that its activation correlates with questionnaire responses regarding the intensity of perceived pain [30]. On the other hand, the superior temporal gyrus and right fusiform gyrus contribute to both the visual processing of facial expressions (i.e., the core system) and the emotional interpretation of their features (i.e., the extended system) [57,58]. A possible explanation for why suppressed expressions failed to elicit such brain activations is that inhibited expressions do not present a sufficient amount of expressive cues to activate the network. This is consistent with the behavioral data; indeed, IE ratings of suppressed pain expressions are higher than those of neutral expressions but lower than those of all the other categories. Previous studies on emotional body expressions demonstrated that perceiving negative (especially fearful) body expressions significantly activates dorsal stream structures involved in action preparation, with a central role of the parietal cortex [59,60,61,62]. In addition, the identification of facial expressions of emotions was shown to elicit activations in the inferior parietal lobe (IPL; [63]). Consistently, IPL activation was observed during the evaluation of dynamic expressive faces compared to gender evaluation [64], further highlighting the IPL’s role in processing emotionally salient facial cues. Moreover, neural representations of categorical valence (positive, neutral, and negative) were identified within several regions, including the IPL, as well as the precuneus, bilateral medial prefrontal cortex (MPFC), left superior temporal sulcus (STS)/postcentral gyrus, right STS/middle frontal gyrus (MFG), and thalamus [65]. Furthermore, emotion-specific but stimulus category-independent neural representations were observed in the left postcentral gyrus, left IPL, and right STS [66].
Buhle and colleagues [67] examined eight independent fMRI datasets (218 total subjects) to investigate the neural overlap between the experience of pain and the processing of negative emotions. Although the study primarily examined the role of the periaqueductal gray, the authors found that other regions were involved in both conditions, including the right inferior parietal lobule and bilateral cuneus. The inferior parietal lobule is located near the secondary somatosensory cortex, and it is linked to both the direct experience of pain and the imagination or observation of pain from both the body and the face [32,68,69]. Cuneus activity is primarily associated with visual processing, and heightened attention to emotional images may have enhanced visual processing for negative compared to neutral stimuli. Interestingly, cuneus activity has also been linked to the affective dimension of pain [70,71]. These findings suggest that the cuneus may play a broader role in processing visual aversive stimuli, similar to what we observed in our study for genuine and fake emotional expressions.
Although a distinction between genuine and deceptive pain expressions was observed at the level of the pACC, it is noteworthy that the conjunction analysis (GP vs. N and FP vs. N) showed that both types of expressions activate a widespread common network, similar to the one just mentioned, which also included other cortical regions such as the bilateral inferior frontal gyrus (IFG) and left supramarginal gyrus (lSMG). The IFG is hypothesized to represent a fundamental region within the human mirror neuron system [72]. This system, in turn, may provide the mechanism of shared representations proposed to underlie empathy [73] and could be relevant for the process of mentalizing similarity between oneself and the target one is observing [74]. Additionally, recent findings indicate that the right IFG could play a role in interpreting pain through visual information (encompassing not only facial expressions but also sensory cues), facilitating the observer’s ability to infer another individual’s state and forming the neural foundation for empathy toward pain [32,75]. Consistent with this evidence, we observed significant activations of the right IFG when facial expressions of pain, whether genuine or simulated, were presented, as compared to neutral facial expressions. This can be explained by a heightened empathic resonance elicited by viewing these expressions, in contrast to neutral ones. Recent evidence suggests that the right supramarginal gyrus (SMG) could play a pivotal role in distinguishing one’s own from others’ affective states [76,77] and in interacting with affective regions, such as AI, to shape the behavioral response to observed pain [78,79]. It has also been proposed that during genuine pain scenarios this modulatory effect could prevent empathic over-arousal and enable appropriate social responses [33]. In line with the literature, our result in the GP vs. N and FP vs. N conjunction shows a greater BOLD response in the left SMG when participants see a dynamic facial expression of pain compared to when neutral facial expressions are presented. This may be due to a greater empathic resonance elicited by facial expressions of pain compared to neutral ones, reinforcing the aforementioned role of the SMG in empathic responses to pain scenarios.
Regarding the cerebellum, several studies have reported its involvement in physical pain perception, with activity centered in its left hemisphere, posterior lobe (see meta-analyses [25,26]). The left posterior activations we found during the observation of genuine pain expressions compared to suppressed ones, and in both genuine and fake expressions, substantially overlap with activation found in studies comparing painful with neutral expressions [31]. Interestingly, previous studies showed widespread posterior cerebellar activation during implicit emotion (not pain) processing [23,50,80].
Previous studies focusing on the neural correlates of implicit and explicit processing of emotional facial expressions have shown that both kinds of task activate a common network of brain regions engaged in processing highly valenced stimuli, including the occipital and temporal lobes, the IFG, and the cerebellum. At the same time, some neural distinctions between implicit and explicit emotion processing have been proposed: explicit processing seems to activate the middle temporal gyrus and the IFG to a greater extent [23,50], whereas implicit processing involves greater activation of the amygdala [23]. Our results align with previous findings regarding negative emotions; indeed, using an implicit task, we observed activations in the occipital and temporal gyri, IFG, and cerebellum during the observation of highly expressive pain facial expressions (GP vs. N and FP vs. N conjunction). The presence of these activations in response to pain expressions, as for other basic emotions, provides additional evidence supporting the notion that pain fits the paradigms of emotions. We did not observe enhanced amygdala activation, but this is in line with the study by Scheuerecker et al. [50].

4.3. Limitations and Future Directions

The sample consisted exclusively of young female students. This ensured homogeneity, but it limits the generalizability of the findings to the broader population.
Future directions of research may include extending the experimental paradigm to male participants in order to assess potential gender differences in emotional processing. Moreover, future studies could benefit from a more diverse age range to explore potential age-related differences in the neural mechanisms under investigation.
These results may have significant implications in clinical practice, as they support the idea that caregivers and healthcare professionals could be trained to use these indirect/implicit processes to unveil the truth in others’ pain. Indeed, the assessment of pain is essential to provide appropriate care to individuals in need. A significant concern for the healthcare community involves vulnerable populations who are unable to communicate or express their pain directly. These groups include infants, young children, individuals with mental illnesses, and the elderly. In such cases, caregivers or family members typically rely on observing behavioral or physiological responses to infer the presence or absence of pain. In some cases, observers may lack the necessary training or may be influenced by personal biases, leading to inadequate or inaccurate assessments of the patient’s pain experience [81,82]. Furthermore, social and interpersonal dynamics can affect not only the expression and perception of pain but also the judgments made by those evaluating it [83]. For these reasons, understanding and quantifying the ability to recognize facial expressions of pain represents a crucial first step in developing specific training programs.

5. Conclusions

This study provides valuable insights into both the behavioral and neural mechanisms underlying the recognition of pain expressions. From a behavioral perspective, participants demonstrated an ability to distinguish genuine pain from deceptive expressions, while suppressed expressions were recognized as less intense. Regarding the functional perspective, fMRI findings indicate that a specific brain region, namely the pACC, is more activated by genuine pain facial expressions compared to fake ones. Moreover, the aMCC appears to play a crucial role in evaluating the intensity of pain, with its activation correlating with the perceived pain. These findings contribute to the broader understanding of emotional recognition processing, offering a deeper perspective on the neural dynamics involved in distinguishing genuine from deceptive pain in others.

Author Contributions

Conceptualization, F.B., F.L., C.A.P. and V.G.; methodology, F.B. and F.L.; software, O.C.; validation, F.B. and F.L.; formal analysis, V.Z., C.C., F.R. and O.C.; investigation, M.A. (Martina Ardizzi), M.A. (Marianna Ambrosecchia) and D.B.; resources, F.B. and F.L.; data curation, V.Z.; writing—original draft preparation, V.Z., F.R., O.C., M.A. (Marianna Ambrosecchia), M.A. (Martina Ardizzi), V.G., C.A.P., F.B. and F.L.; writing—review and editing, F.B., F.L., V.Z., C.C., F.R. and D.B.; visualization, V.Z., C.C. and F.R.; supervision, F.B.; project administration, F.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Local Ethics Committee Area Vasta Emilia Nord (AVEN; protocol code 134.14, date of approval 15 July 2014).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. Data is unavailable publicly due to privacy and ethical restrictions.

Acknowledgments

The authors acknowledge Rita Bardoni for collaborating in the validation of the original video clips.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ACC: Anterior cingulate cortex
AI: Anterior insula
aMCC: Anterior mid-cingulate cortex
BOLD: Blood oxygenation level-dependent
EQ: Empathy quotient
fMRI: Functional magnetic resonance imaging
FP: Fake pain
GLM: General linear model
GP: Genuine pain
IA: Interoceptive accuracy
IASP: International Association for the Study of Pain
IE: Intensity of the facial expression
IFG: Inferior frontal gyrus
IP: Intensity of the pain
IPL: Inferior parietal lobe
IRI: Interpersonal Reactivity Index
M: Mean
MFG: Middle frontal gyrus
MNI: Montreal Neurological Institute
MPFC: Medial prefrontal cortex
N: Neutral
NRS: Numerical rating scale
pACC: Pregenual anterior cingulate cortex
PCS: Pain Catastrophizing Scale
SD: Standard deviation
SMG: Supramarginal gyrus
SP: Suppressed pain
SPM: Statistical parametric mapping
STS: Superior temporal sulcus
TAS-20: Toronto Alexithymia Scale 20
TE: Echo time
TR: Repetition time
WS: Warning signal

References

  1. Raja, S.N.; Carr, D.B.; Cohen, M.; Finnerup, N.B.; Flor, H.; Gibson, S.; Keefe, F.J.; Mogil, J.S.; Ringkamp, M.; Sluka, K.A.; et al. The revised International Association for the Study of Pain definition of pain: Concepts, challenges, and compromises. Pain 2020, 161, 1976–1982. [Google Scholar] [CrossRef] [PubMed]
  2. Ekman, P. Darwin, deception, and facial expression. Ann. N. Y. Acad. Sci. 2003, 1000, 205–221. [Google Scholar] [CrossRef] [PubMed]
  3. Bartlett, M.S.; Littlewort, G.C.; Frank, M.G.; Lee, K. Automatic decoding of facial movements reveals deceptive pain expressions. Curr. Biol. 2014, 24, 738–743. [Google Scholar] [CrossRef]
  4. Saarela, M.V.; Hlushchuk, Y.; Williams, A.C.; Schürmann, M.; Kalso, E.; Hari, R. The compassionate brain: Humans detect intensity of pain from another’s face. Cereb. Cortex 2007, 17, 230–237. [Google Scholar] [CrossRef]
  5. Benuzzi, F.; Lui, F.; Ardizzi, M.; Ambrosecchia, M.; Ballotta, D.; Righi, S.; Pagnoni, G.; Gallese, V.; Porro, C.A. Pain mirrors: Neural correlates of observing self or others’ facial expressions of pain. Front. Psychol. 2018, 9, 1825. [Google Scholar] [CrossRef] [PubMed]
  6. Blais, C.; Fiset, D.; Furumoto-Deshaies, H.; Kunz, M.; Seuss, D.; Cormier, S. Facial Features Underlying the Decoding of Pain Expressions. J. Pain 2019, 20, 728–738. [Google Scholar] [CrossRef] [PubMed]
  7. Singer, T.; Seymour, B.; O’Doherty, J.; Kaube, H.; Dolan, R.J.; Frith, C.D. Empathy for pain involves the affective but not sensory components of pain. Science 2004, 303, 1157–1162. [Google Scholar] [CrossRef]
  8. de C Williams, A.C. Pain: Behavioural expression and response in an evolutionary framework. Evol. Med. Public Health 2023, 11, 429–437. [Google Scholar] [CrossRef] [PubMed]
  9. Riečanský, I.; Lamm, C. The Role of Sensorimotor Processes in Pain Empathy. Brain Topogr. 2019, 32, 965–976. [Google Scholar] [CrossRef]
  10. Meng, J.; Li, Y.; Luo, L.; Li, L.; Jiang, J.; Liu, X.; Shen, L. The Empathy for Pain Stimuli System (EPSS): Development and preliminary validation. Behav. Res. 2024, 56, 784–803. [Google Scholar] [CrossRef]
  11. Williams, A.C. Facial expression of pain: An evolutionary account. Behav. Brain Sci. 2002, 25, 439–455. [Google Scholar] [CrossRef] [PubMed]
  12. Hill, M.L.; Craig, K.D. Detecting deception in pain expressions: The structure of genuine and deceptive facial displays. Pain 2002, 98, 135–144. [Google Scholar] [CrossRef]
  13. Stel, M.; van Dijk, E. When do we see that others misrepresent how they feel? Detecting deception from emotional faces with direct and indirect measures. Soc. Influ. 2018, 13, 137–149. [Google Scholar] [CrossRef]
  14. Bond, C.F.; Robinson, M. The evolution of deception. J. Nonverbal Behav. 1988, 12 Pt 2, 295–307. [Google Scholar] [CrossRef]
  15. Porter, S.; ten Brinke, L. Reading between the lies: Identifying concealed and falsified emotions in universal facial expressions. Psychol. Sci. 2008, 19, 508–514. [Google Scholar] [CrossRef]
  16. DePaulo, B.M.; Zuckerman, M.; Rosenthal, R. Detecting deception: Modality effects. In Review of Personality and Social Psychology; Wheeler, L., Ed.; Sage: Beverly Hills, CA, USA, 1980; Volume 1, pp. 125–162. [Google Scholar]
  17. Alkhouli, M.; Al-Nerabieah, Z.; Dashash, M. Analyzing facial action units in children to differentiate genuine and fake pain during inferior alveolar nerve block: A cross-sectional study. Sci. Rep. 2023, 13, 15564. [Google Scholar] [CrossRef] [PubMed]
  18. Hess, U.; Kleck, R.E. The cues decoders use in attempting to differentiate emotion-elicited and posed facial expressions. Eur. J. Soc. Psychol. 1994, 24, 367–381. [Google Scholar] [CrossRef]
  19. Vrij, A.; Mann, S. Telling and detecting lies in a high-stake situation: The case of a convicted murderer. Appl. Cogn. Psychol. 2001, 15, 187–203. [Google Scholar] [CrossRef]
  20. Bond, C.F.; Levine, T.R.; Hartwig, M. New findings in non-verbal lie detection. In Detecting Deception: Current Challenges and Cognitive Approaches; Granhag, P.A., Vrij, A., Verschuere, B., Eds.; Wiley-Blackwell: Hoboken, NJ, USA, 2015; pp. 37–58.
  21. Lane, R.D. Neural correlates of conscious emotional experience. In Cognitive Neuroscience of Emotion; Lane, R.D., Nadel, L., Eds.; Oxford University Press: Oxford, UK, 2000; pp. 345–370.
  22. Cohen, N.; Moyal, N.; Lichtenstein-Vidne, L.; Henik, A. Explicit vs. implicit emotional processing: The interaction between processing type and executive control. Cogn. Emot. 2015, 30, 325–339.
  23. Critchley, H.; Daly, E.; Phillips, M.; Brammer, M.; Bullmore, E.; Williams, S.; Van Amelsvoort, T.; Robertson, D.; David, A.; Murphy, D. Explicit and implicit neural mechanisms for processing of social information from facial expressions: A functional magnetic resonance imaging study. Hum. Brain Mapp. 2000, 9, 93–105.
  24. Lui, F.; Duzzi, D.; Corradini, M.; Serafini, M.; Baraldi, P.; Porro, C.A. Touch or pain? Spatio-temporal patterns of cortical fMRI activity following brief mechanical stimuli. Pain 2008, 138, 362–374.
  25. Jensen, K.B.; Regenbogen, C.; Ohse, M.C.; Frasnelli, J.; Freiherr, J.; Lundström, J.N. Brain activations during pain: A neuroimaging meta-analysis of patients with pain and healthy controls. Pain 2016, 157, 1279–1286.
  26. Xu, A.; Larsen, B.; Baller, E.B.; Scott, J.C.; Sharma, V.; Adebimpe, A.; Basbaum, A.I.; Dworkin, R.H.; Edwards, R.R.; Woolf, C.J.; et al. Convergent neural representations of experimentally-induced acute pain in healthy volunteers: A large-scale fMRI meta-analysis. Neurosci. Biobehav. Rev. 2020, 112, 300–323.
  27. Treede, R.D.; Kenshalo, D.R.; Gracely, R.H.; Jones, A.K. The cortical representation of pain. Pain 1999, 79, 105–111.
  28. Vogt, B.A. Cingulate cortex in the three limbic subsystems. In Handbook of Clinical Neurology; Elsevier: Amsterdam, The Netherlands, 2019; Volume 166, pp. 39–51.
  29. Price, D.D. Psychological and neural mechanisms of the affective dimension of pain. Science 2000, 288, 1769–1772.
  30. Favilla, S.; Huber, A.; Pagnoni, G.; Lui, F.; Facchin, P.; Cocchi, M.; Baraldi, P.; Porro, C.A. Ranking brain areas encoding the perceived level of pain from fMRI data. NeuroImage 2014, 90, 153–162.
  31. Botvinick, M.; Jha, A.P.; Bylsma, L.M.; Fabian, S.A.; Solomon, P.E.; Prkachin, K.M. Viewing facial expressions of pain engages cortical areas involved in the direct experience of pain. NeuroImage 2005, 25, 312–319.
  32. Budell, L.; Jackson, P.; Rainville, P. Brain responses to facial expressions of pain: Emotional or motor mirroring? NeuroImage 2010, 53, 355–363.
  33. Zhao, Y.; Zhang, L.; Rütgen, M.; Sladky, R.; Lamm, C. Neural dynamics between anterior insular cortex and right supramarginal gyrus dissociate genuine affect sharing from perceptual saliency of pretended pain. eLife 2021, 10, e69994.
  34. Oldfield, R.C. The assessment and analysis of handedness: The Edinburgh inventory. Neuropsychologia 1971, 9, 97–113.
  35. LaFrance, M.; Banaji, M. Toward a reconsideration of the gender-emotion relationship. In Review of Personality and Social Psychology: Emotions and Social Behavior; Clark, M.S., Ed.; Sage: Newbury Park, CA, USA, 1992; Volume 14, pp. 178–202.
  36. Hall, J.; Carter, J.; Horgan, T. Gender differences in nonverbal communication of emotion. In Gender and Emotion: Social Psychological Perspectives (Studies in Emotion and Social Interaction); Fischer, A., Ed.; Cambridge University Press: Cambridge, UK, 2000; pp. 97–117.
  37. Klein, K.J.K.; Hodges, S.D. Gender differences, motivation, and empathic accuracy: When it pays to understand. Pers. Soc. Psychol. Bull. 2001, 27, 720–730.
  38. Singer, T.; Seymour, B.; O’Doherty, J.P.; Stephan, K.E.; Dolan, R.J.; Frith, C.D. Empathic neural responses are modulated by the perceived fairness of others. Nature 2006, 439, 466–469.
  39. World Medical Association. World Medical Association Declaration of Helsinki. JAMA 2013, 310, 2191.
  40. Preti, A.; Vellante, M.; Baron-Cohen, S.; Zucca, G.; Petretto, D.R.; Masala, C. The empathy quotient: A cross-cultural comparison of the Italian version. Cogn. Neuropsychiatry 2011, 16, 50–70.
  41. Albiero, P.; Ingoglia, S.; Lo Coco, A. Contributo all’adattamento italiano dell’Interpersonal Reactivity Index [A contribution to the Italian adaptation of the Interpersonal Reactivity Index]. Test. Psicom. Metod. 2006, 13, 107–125.
  42. Caretti, V.; La Barbera, D.; Craparo, G. La Toronto Alexithymia Scale (TAS-20). In Alessitimia, Valutazione e Trattamento [Alexithymia, Assessment and Treatment]; Caretti, V., La Barbera, D., Eds.; Astrolabio Ubaldini: Rome, Italy, 2005; pp. 17–23.
  43. Monticone, M.; Baiardi, P.; Ferrari, S.; Foti, C.; Mugnai, R.; Pillastrini, P.; Rocca, B.; Vanti, C. Development of the Italian version of the Pain Catastrophizing Scale (PCS-I): Cross-cultural adaptation, factor analysis, reliability, validity and sensitivity to change. Qual. Life Res. 2012, 21, 1045–1050.
  44. Pollatos, O.; Herbert, B.M.; Matthias, E.; Schandry, R. Heart rate response after emotional picture presentation is modulated by interoceptive awareness. Int. J. Psychophysiol. 2007, 63, 117–124.
  45. Dunn, B.D.; Galton, H.C.; Morgan, R.; Evans, D.; Oliver, C.; Meyer, M.; Cusack, R.; Lawrence, A.D.; Dalgleish, T. Listening to your heart. How interoception shapes emotion experience and intuitive decision making. Psychol. Sci. 2010, 21, 1835–1844.
  46. Schandry, R. Heart beat perception and emotional experience. Psychophysiology 1981, 18, 483–488.
  47. Ardizzi, M.; Ambrosecchia, M.; Buratta, L.; Ferri, F.; Peciccia, M.; Donnari, S.; Mazzeschi, C.; Gallese, V. Interoception and positive symptoms in schizophrenia. Front. Hum. Neurosci. 2016, 10, 379.
  48. Ambrosecchia, M.; Ardizzi, M.; Russo, E.; Ditaranto, F.; Speciale, M.; Vinai, P.; Todisco, P.; Maestro, S.; Gallese, V. Interoception and autonomic correlates during social interactions. Implications for anorexia. Front. Hum. Neurosci. 2017, 11, 219.
  49. Pollatos, O.; Kurz, A.-L.; Albrecht, J.; Schreder, T.; Kleemann, A.M.; Schöpf, V.; Kopietz, R.; Wiesmann, M.; Schandry, R. Reduced perception of bodily signals in anorexia nervosa. Eat. Behav. 2008, 9, 381–388.
  50. Scheuerecker, J.; Frodl, T.; Koutsouleris, N.; Zetzsche, T.; Wiesmann, M.; Kleemann, A.M.; Brückmann, H.; Schmitt, G.; Möller, H.J.; Meisenzahl, E.M. Cerebral differences in explicit and implicit emotional processing—An fMRI study. Neuropsychobiology 2007, 56, 32–39.
  51. Seymour, B. Pain: A precision signal for reinforcement learning and control. Neuron 2019, 101, 1029–1041.
  52. Kano, M.; Oudenhove, L.V.; Dupont, P.; Wager, T.D.; Fukudo, S. Imaging brain mechanisms of functional somatic syndromes: Potential as a biomarker? Tohoku J. Exp. Med. 2020, 250, 137–152.
  53. Peyron, R.; Quesada, C.; Fauchon, C. Cingulate-mediated approaches to treating chronic pain. In Handbook of Clinical Neurology; Elsevier: Amsterdam, The Netherlands, 2019; Volume 166, pp. 317–326.
  54. Vogt, B.A. Pain and emotion interactions in subregions of the cingulate gyrus. Nat. Rev. Neurosci. 2005, 6, 533–544.
  55. Benuzzi, F.; Lui, F.; Duzzi, D.; Nichelli, P.F.; Porro, C.A. Does it look painful or disgusting? Ask your parietal and cingulate cortex. J. Neurosci. 2008, 28, 923–931.
  56. Porro, C.A.; Cettolo, V.; Francescato, M.P.; Baraldi, P. Temporal and intensity coding of pain in human cortex. J. Neurophysiol. 1998, 80, 3312–3320.
  57. Haxby, J.V.; Hoffman, E.A.; Gobbini, M.I. The distributed human neural system for face perception. Trends Cogn. Sci. 2000, 4, 223–233.
  58. Gobbini, M.I.; Haxby, J.V. Neural systems for recognition of familiar faces. Neuropsychologia 2007, 45, 32–41.
  59. de Gelder, B.; Snyder, J.; Greve, D.; Gerard, G.; Hadjikhani, N. Fear fosters flight: A mechanism for fear contagion when perceiving emotion expressed by a whole body. Proc. Natl. Acad. Sci. USA 2004, 101, 16701–16706.
  60. Grèzes, J.; Pichon, S.; de Gelder, B. Perceiving fear in dynamic body expressions. NeuroImage 2007, 35, 959–967.
  61. Pichon, S.; de Gelder, B.; Grèzes, J. Emotional modulation of visual and motor areas by dynamic body expressions of anger. Soc. Neurosci. 2008, 3, 199–212.
  62. Goldberg, H.; Christensen, A.; Flash, T.; Giese, M.; Malach, R. Brain activity correlates with emotional perception induced by dynamic avatars. NeuroImage 2015, 122, 306–317.
  63. Kitada, R.; Johnsrude, I.S.; Kochiyama, T.; Lederman, S.J. Brain networks involved in haptic and visual identification of facial expressions of emotion: An fMRI study. NeuroImage 2010, 49, 1677–1689.
  64. Sarkheil, P.; Goebel, R.; Schneider, F.; Mathiak, K. Emotion unfolded by motion: A role for parietal lobe in decoding dynamic facial expressions. Soc. Cogn. Affect. Neurosci. 2013, 8, 950–957.
  65. Kim, J.; Shinkareva, S.V.; Wedell, D.H. Representations of modality-general valence for videos and music derived from fMRI data. NeuroImage 2017, 148, 42–54.
  66. Cao, L.; Xu, J.; Yang, X.; Li, X.; Liu, B. Abstract representations of emotions perceived from the face, body, and whole-person expressions in the left postcentral gyrus. Front. Hum. Neurosci. 2018, 12, 419.
  67. Buhle, J.T.; Kober, H.; Ochsner, K.N.; Mende-Siedlecki, P.; Weber, J.; Hughes, B.L.; Kross, E.; Atlas, L.Y.; McRae, K.; Wager, T.D. Common representation of pain and negative emotion in the midbrain periaqueductal gray. Soc. Cogn. Affect. Neurosci. 2013, 8, 609–616.
  68. Jackson, P.L.; Brunet, E.; Meltzoff, A.N.; Decety, J. Empathy examined through the neural mechanisms involved in imagining how I feel versus how you feel pain. Neuropsychologia 2006, 44, 752–761.
  69. Kross, E.; Berman, M.G.; Mischel, W.; Smith, E.E.; Wager, T.D. Social rejection shares somatosensory representations with physical pain. Proc. Natl. Acad. Sci. USA 2011, 108, 6270–6275.
  70. Fulbright, R.K.; Troche, C.J.; Skudlarski, P.; Gore, J.C.; Wexler, B.E. Functional MR imaging of regional brain activation associated with the affective experience of pain. Am. J. Roentgenol. 2001, 177, 1205–1210.
  71. Matharu, M.S.; Bartsch, T.; Ward, N.; Frackowiak, R.S.; Weiner, R.; Goadsby, P.J. Central neuromodulation in chronic migraine patients with suboccipital stimulators: A PET study. Brain 2004, 127 Pt 1, 220–230.
  72. Gallese, V. Before and below “theory of mind”: Embodied simulation and the neural correlates of social cognition. Philos. Trans. R. Soc. B Biol. Sci. 2007, 362, 659–669.
  73. Iacoboni, M. Imitation, empathy, and mirror neurons. Annu. Rev. Psychol. 2009, 60, 653–670.
  74. Majdandžić, J.; Amashaufer, S.; Hummer, A.; Windischberger, C.; Lamm, C. The selfless mind: How prefrontal involvement in mentalizing with similar and dissimilar others shapes empathy and prosocial behavior. Cognition 2016, 157, 24–38.
  75. Li, Y.; Li, W.; Zhang, T.; Zhang, J.; Jin, Z.; Li, L. Probing the role of the right inferior frontal gyrus during pain-related empathy processing: Evidence from fMRI and TMS. Hum. Brain Mapp. 2021, 42, 1518–1531.
  76. Hoffmann, F.; Koehne, S.; Steinbeis, N.; Dziobek, I.; Singer, T. Preserved self-other distinction during empathy in autism is linked to network integrity of right supramarginal gyrus. J. Autism Dev. Disord. 2016, 46, 637–648.
  77. Bukowski, H.; Tik, M.; Silani, G.; Ruff, C.C.; Windischberger, C.; Lamm, C. When differences matter: rTMS/fMRI reveals how differences in dispositional empathy translate to distinct neural underpinnings of self-other distinction in empathy. Cortex 2020, 128, 143–161.
  78. Fallon, N.; Roberts, C.; Stancak, A. Shared and distinct functional networks for empathy and pain processing: A systematic review and meta-analysis of fMRI studies. Soc. Cogn. Affect. Neurosci. 2020, 15, 709–723.
  79. Naor, N.; Rohr, C.; Schaare, L.H.; Limbachia, C.; Shamay-Tsoory, S.; Okon-Singer, H. The neural networks underlying reappraisal of empathy for pain. Soc. Cogn. Affect. Neurosci. 2020, 15, 733–744.
  80. Pierce, J.E.; Thomasson, M.; Voruz, P.; Selosse, G.; Péron, J. Explicit and implicit emotion processing in the cerebellum: A meta-analysis and systematic review. Cerebellum 2023, 22, 852–864.
  81. Boerner, K.E.; Chambers, C.T.; Craig, K.D.; Pillai Riddell, R.R.; Parker, J.A. Caregiver accuracy in detecting deception in facial expressions of pain in children. Pain 2013, 154, 525–533.
  82. Samolsky Dekel, B.G.; Gori, A.; Vasarri, A.; Sorella, M.C.; Di Nino, G.; Melotti, R.M. Medical evidence influence on inpatients and nurses pain ratings agreement. Pain Res. Manag. 2016, 2016, 9267536.
  83. Hoffman, K.M.; Trawalter, S.; Axt, J.R.; Oliver, M.N. Racial bias in pain assessment and treatment recommendations, and false beliefs about biological differences between blacks and whites. Proc. Natl. Acad. Sci. USA 2016, 113, 4296–4301.
Figure 1. Experimental design. Each trial (14 s) consisted of a brief warning signal (WS) lasting 0.5 s, a video clip presentation (2.5 s), and a continuous black screen (11 s) until the next trial.
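For readers who want to reproduce the trial structure, the timeline described in the Figure 1 caption can be written out as a short script. This is a minimal sketch reconstructed from the caption alone; the names and the helper function are illustrative, not taken from the authors' presentation code:

```python
# Minimal sketch of the Figure 1 trial timeline, reconstructed from the
# caption only; event names and structure are illustrative assumptions.
TRIAL_EVENTS = [("warning signal (WS)", 0.5),
                ("video clip", 2.5),
                ("black screen", 11.0)]

def trial_timeline(events, start=0.0):
    """Return (event, onset, offset) tuples relative to the trial start."""
    timeline, t = [], start
    for name, duration in events:
        timeline.append((name, t, t + duration))
        t += duration
    return timeline

for name, onset, offset in trial_timeline(TRIAL_EVENTS):
    print(f"{name}: {onset:.1f} to {offset:.1f} s")

# The three events sum to the 14 s trial length stated in the caption.
assert sum(d for _, d in TRIAL_EVENTS) == 14.0
```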
Figure 2. Main effect of category on IE ratings. All comparisons were statistically significant (p < 0.001). Error bars depict standard deviation (SD).
Figure 3. Main effect of category on IP ratings. All comparisons were statistically significant (p < 0.001), apart from the fake vs. suppressed comparison (p < 0.05, which does not survive Bonferroni correction). Error bars depict standard deviation (SD).
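As a rough check on the Bonferroni note in the Figure 3 caption: assuming the correction was applied across all pairwise comparisons among the four categories (our assumption, not stated in the caption), there are six comparisons and the corrected threshold is 0.05/6 ≈ 0.0083, which a comparison at p < 0.05 need not survive:

```python
from itertools import combinations

# Hypothetical illustration: four categories yield C(4, 2) = 6 pairwise
# comparisons. Under a Bonferroni correction across all six (our
# assumption), the per-comparison threshold is 0.05 / 6.
categories = ["genuine", "fake", "suppressed", "neutral"]
n_comparisons = len(list(combinations(categories, 2)))  # 6
alpha_corrected = 0.05 / n_comparisons

print(f"corrected threshold: {alpha_corrected:.4f}")  # ~0.0083
# A fake vs. suppressed difference with 0.0083 < p < 0.05 would
# therefore not survive the correction, as the caption reports.
```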
Figure 4. Regions of increased signal for the contrast GP vs. FP (x = 12). L = left; cluster-size threshold k > 46 voxels.
Figure 5. Regions whose activity is related to IP ratings (x = −3). L = left; cluster-size threshold k > 9 voxels.
Table 1. Validation of the stimuli used in the task. Mean (M) and standard deviation (SD) for intensity of expression (IE) and intensity of pain (IP) provided by three independent evaluators for the selected video clips, divided into the four categories (genuine, fake, suppressed, and neutral).

|    | Genuine IE | Genuine IP | Fake IE | Fake IP | Suppressed IE | Suppressed IP | Neutral IE | Neutral IP |
|----|------------|------------|---------|---------|---------------|---------------|------------|------------|
| M  | 3.4        | 2.4        | 3.2     | 0.04    | 1.2           | 2.5           | 0          | 0          |
| SD | 1.1        | 1.3        | 0.4     | 0.2     | 0.4           | 0.7           | 0          | 0          |
Table 2. Rules used for the a posteriori categorization. From the left column: intensity of expression (IE), intensity of pain (IP), and the condition to which the stimuli were assigned.

| IE | IP | Condition |
|----|----|-----------|
| 0  | 0  | Neutral   |
| ≠0 | 0  | Other     |
| ≠0 | ≠0 | Real/Fake/Suppressed (based on the a priori categorization) |
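The rules in Table 2 amount to a simple decision procedure. The sketch below is our illustrative encoding of those rules, not code from the paper; the function name and inputs are assumptions, and the one IE/IP combination the table does not list is rejected explicitly:

```python
# Illustrative encoding of the Table 2 rules; names are ours, not the paper's.
def categorize(ie: float, ip: float, a_priori: str) -> str:
    """Assign a condition from post hoc IE and IP ratings.

    ie: intensity-of-expression rating; ip: intensity-of-pain rating;
    a_priori: the original category ("real", "fake", or "suppressed").
    """
    if ie == 0 and ip == 0:
        return "neutral"
    if ie != 0 and ip == 0:
        return "other"
    if ie != 0 and ip != 0:
        return a_priori  # keep the a priori categorization
    raise ValueError("IE = 0 with IP != 0 is not covered by Table 2")

print(categorize(0, 0, "real"))            # -> neutral
print(categorize(3.2, 0, "fake"))          # -> other
print(categorize(1.2, 2.5, "suppressed"))  # -> suppressed
```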
Table 3. Regions of increased signal for the contrasts GP vs. FP and GP vs. SP. BA = Brodmann Area; l = left; r = right; cluster-size threshold k > 46 voxels and k > 62 voxels, respectively, to reach the combined alpha < 0.05.

| Brain Areas | BA | Side | Cluster (k) | Voxel-Level Z | MNI x, y, z | Talairach x, y, z |
|-------------|----|------|-------------|---------------|-------------|-------------------|
| GP vs. FP | | | | | | |
| Pregenual anterior cingulate cortex | 32 | r/l | 125 | 3.59 | 12, 38, 6 | 12, 38, 12 |
| GP vs. SP | | | | | | |
| Cerebellum | | l | 67 | 5.01 | −15, −73, −46 | −15, −73, −35 |
| | | | | 3.72 | −18, −67, −30 | −18, −66, −22 |
| | | | | 2.95 | −21, −79, −38 | −21, −78, −28 |
| Middle Temporal Gyrus, Superior Temporal Gyrus, Inferior Temporal Gyrus, Inferior Parietal Lobule, Fusiform Gyrus, Middle Occipital Gyrus, Insula, Cuneus | 21, 22, 37, 40, 19, 17 | r | 579 | 4.74 | 48, −37, 2 | 48, −36, 4 |
| | | | | 4.54 | 51, −73, 2 | 50, −71, 5 |
| | | | | 4.50 | 57, −40, 6 | 56, −38, 7 |
| Middle Temporal Gyrus, Superior Temporal Gyrus, Inferior Temporal Gyrus, Middle Occipital Gyrus | 22, 19 | l | 221 | 4.55 | −54, −46, 6 | −53, −44, 8 |
| | | | | 3.99 | −48, −70, 6 | −48, −68, 9 |
| | | | | 3.21 | −39, −70, 6 | −39, −68, 9 |
Table 4. Regions of increased signal obtained from the conjunction between the contrasts GP vs. N and FP vs. N. BA = Brodmann Area; l = left; r = right; p < 0.001 uncorrected, cluster-size threshold k > 0 voxels. For visualization, an extent threshold of k > 9 voxels was applied.

| Brain Areas | BA | Side | Cluster (k) | Voxel-Level Z | MNI x, y, z | Talairach x, y, z |
|-------------|----|------|-------------|---------------|-------------|-------------------|
| Middle Temporal Gyrus, Superior Temporal Gyrus, Inferior Temporal Gyrus, Middle Occipital Gyrus, Inferior Parietal Lobule, Insula | 21, 22, 20, 19, 39 | r | 599 | 6.98 | 51, −40, 6 | 50, −38, 7 |
| | | | | 6.06 | 48, −76, −2 | 48, −74, 2 |
| | | | | 5.84 | 45, −31, −6 | 45, −30, −4 |
| Middle Temporal Gyrus, Superior Temporal Gyrus, Temporal Pole | 21, 38 | r | 46 | 5.95 | 54, 8, −22 | 53, 7, −19 |
| Middle Temporal Gyrus, Superior Temporal Gyrus, Inferior Temporal Gyrus, Middle Occipital Gyrus, Supramarginal Gyrus | 22, 37, 19 | l | 360 | 5.60 | −51, −73, 6 | −50, −70, 9 |
| | | | | 5.58 | −51, −64, 6 | −50, −62, 9 |
| | | | | 5.30 | −57, −55, 6 | −56, −53, 8 |
| Middle Frontal Gyrus | 46 | r | 33 | 5.43 | 48, 2, 42 | 48, 4, 38 |
| Inferior Frontal Gyrus, Insula | 47 | r | 64 | 4.85 | 45, 26, −2 | 45, 25, −3 |
| Inferior Frontal Gyrus | 47 | l | 70 | 4.80 | −48, 20, −6 | −48, 19, −6 |
| | | | | 3.94 | −39, 26, −2 | −39, 25, −3 |
| Cerebellum | | l | 14 | 4.14 | −12, −76, −46 | −12, −76, −35 |
| | | | | 3.53 | −18, −82, −46 | −18, −81, −35 |
| Middle Occipital Gyrus, Inferior Occipital Gyrus, Fusiform Gyrus | 18, 37 | r | 41 | 4.03 | 27, −85, −6 | 27, −83, −1 |
| | | | | 3.60 | 27, −91, 10 | 27, −88, 14 |
| Fusiform Gyrus | 37 | r | 16 | 3.96 | 42, −43, −22 | 42, −43, −16 |
Table 5. Regions whose activity is related to IP ratings. BA = Brodmann Area; l = left; cluster-size threshold k > 9 voxels, to reach the combined alpha < 0.05.

| Brain Areas | BA | Side | Cluster (k) | Voxel-Level Z | MNI x, y, z | Talairach x, y, z |
|-------------|----|------|-------------|---------------|-------------|-------------------|
| Mid-cingulate cortex | 24 | l | 15 | 4.68 | −3, 2, 38 | −3, 4, 35 |