Article

Effects of the Level of Interactivity of a Social Robot and the Response of the Augmented Reality Display in Contextual Interactions of People with Dementia

1 Department of Industrial Design, Eindhoven University of Technology, 5600 MB Eindhoven, The Netherlands
2 Department of Industrial Design, Northwestern Polytechnical University, Xi’an 710072, China
* Authors to whom correspondence should be addressed.
Sensors 2020, 20(13), 3771; https://doi.org/10.3390/s20133771
Submission received: 31 May 2020 / Revised: 2 July 2020 / Accepted: 3 July 2020 / Published: 5 July 2020
(This article belongs to the Special Issue Human-Robot Interaction and Sensors for Social Robotics)

Abstract

The well-being of people with dementia (PWD) living in long-term care facilities is hindered by disengagement and social isolation. Animal-like social robots are increasingly used in dementia care, as they can provide companionship and engage PWD in meaningful activities. While most previous human–robot interaction (HRI) research studied engagement independently of context, recent findings indicate that the context of HRI sessions has an impact on user engagement. This study aims to explore the effects of contextual interactions between PWD and a social robot embedded in an augmented responsive environment. Three experimental conditions were compared: reactive context-enhanced robot interaction; dynamic context-enhanced interaction with a static robot; and a control condition with only the dynamic context presented. Effectiveness evaluations were performed with 16 participants using four observational rating scales on observed engagement, affective states, and apathy-related behaviors. Findings suggested that the higher level of interactivity of the social robot and the interactive contextualized feedback helped capture and maintain users’ attention during engagement; however, they did not significantly improve positive affective states. Additionally, the presence of either a static or a proactive robot reduced apathy-related behaviors by facilitating purposeful activities, thus motivating behavioral engagement.

1. Introduction

People with dementia (PWD) in long-term care (LTC) facilities may benefit from appropriate technological solutions that target the improvement of currently disengaged lifestyles [1] by motivating intrinsic interests and engagement in meaningful activities [2,3]. Animal-like social robots are one major area gradually gaining attention in dementia care. Such robotic pets (such as PARO [4,5,6], a robotic baby seal; AIBO [7], a robotic dog; NeCoRo [8], a robotic cat; Huggable [9] and CuDDler [10], robotic teddy bears; and Pleo [11,12,13], a robotic dinosaur) are designed to provide companionship and social support, motivate communication, and help regulate anxiety, depression, and agitation, and they demonstrate positive effects similar to those of animal-assisted therapy for improving quality of life [14,15]. These uniquely designed robots are equipped with multiple sensors, designed with cute and inviting appearances, and behave in ways that evoke positive human emotions [16].
Animal-like social robots have been introduced to PWD as reasonable substitutes for real animals, avoiding potential safety hazards (e.g., allergies, infection, or injury) and the extra workload real animals place on caregivers (e.g., cleaning and taking care of the animals) [17,18,19]. Related research has compared robotic pets with other animal-related stimuli (such as plush toys, real animals, and animal videos) to investigate their relative effectiveness. Tamura et al. [20] compared the therapeutic effects of interaction with the robotic dog AIBO and a toy dog on users with severe dementia. The findings suggested that interaction with both the AIBO and the toy dog led to improvements in responsive behaviors, such as looking at, communicating with, and caring for the stimuli. Libin et al. [8] investigated the difference between a robotic cat, NeCoRo, and a plush cat regarding PWD’s agitation, affect, and engagement. Results indicated that both types of cats held promise for decreasing agitated behaviors and had positive influences on affect and engagement. The findings of these two studies thus suggested similar, although not identical, positive effects (e.g., decreased agitated behaviors and improved affect and engagement) for both kinds of stimuli. In a different study [4], Takayanagi et al. compared PWD’s interactions with a PARO and with a stuffed lion toy. The results showed significantly more positive changes in emotional expression, self-initiated talk, and active interaction with the PARO than with the lion toy. More recently, Moyle et al. [21] undertook a 10-week cluster-randomized controlled trial comparing a PARO with a look-alike plush toy. The findings suggested that the same participants had varied positive and negative responses toward the PARO over the long-term evaluation, depending on the facilitation and their personal status. In addition, Marx et al. [22] examined the effects of five different kinds of dog-related stimuli on engagement: a puppy video, a real dog, a plush dog, a robotic dog, and a dog-coloring activity. The results showed no statistical differences in engagement duration and positive attitudes among the stimuli, except for the coloring activity.
Although the aforementioned literature demonstrates promising evidence for the role of animal-like social robots in enhancing the engagement of PWD, the nature and extent of the evidence supporting the use of social robots remain unclear, and there is no consensus in the documented results confirming that a robot with relatively higher interactivity performs significantly better than a plush toy with respect to user engagement or the affective states of PWD. Therefore, more studies are needed to further explore the effects of animal-like social robots on the well-being of PWD, as well as potential influential factors that could contribute to enhanced engagement.
The rich interaction of designed agents (e.g., social robots), as suggested by Frens et al. [23] and Wada et al. [24], is one key factor for motivating user engagement. Furthermore, while most previous human–robot interaction (HRI) research studied engagement independently of context, recent studies indicate that the context of an HRI session can also influence user engagement [5,25]. Hoffman et al. [26] conducted two separate studies exploring HRI sessions in a shared music-listening and a shared video-watching experience. The findings showed differences in how users perceived the presented stimuli and in their attitudes toward the companion robot when a shared context was added; participants enjoyed the music more due to the responsive behaviors of the robot. Moreover, Hendriks et al. [27] performed a study with PWD to investigate whether adding audio–visual contextual cues would contribute to a more engaging play experience with a robotic dinosaur, Pleo. The results indicated an increase in gaze behavior when the context was added. To the best of our knowledge, no existing research has undertaken a comparison between robot interaction and static robot/toy play within a shared context with PWD. Therefore, in this study, we took the exploration of HRI with elderly people with dementia a step further: an animal-like social robot was embedded in an augmented responsive environment, combining two factors that could potentially contribute to the enhanced engagement of PWD: the level of interactivity of the social robot and the context of HRI.
This study was built on our previous work, an interactive installation design called LiveNature [28]. LiveNature engages PWD in relaxing nature experiences through the combined use of a robotic sheep and a wall-mounted augmented reality display, shown in Figure 1. The dynamic context is responsive: HRI triggers not only the motion and sound feedback of the robot but also visual–audio responses from the display. The 87-inch ultra-high-definition screen shows dynamic video content of a grass field with a herd of sheep to simulate the window outlook of typical Dutch farm scenery. The virtual content on the screen was augmented with a physical interactive interface (an old-fashioned water pump and an animal feeding trough) [29] and a physical robotic sheep, reinforcing tactile interaction for the completeness of the multi-sensory engagement. The soft-fur-covered robotic sheep is a prototype built by reprogramming a Pleo robot and weighs about the same as a real baby lamb. Since this work was undertaken in collaboration with a Dutch residential home, we addressed several aspects of the design that are familiar to a generation of elderly Dutch people, to trigger reminiscence and evoke memories and emotional responses. LiveNature combines the robotic sheep with an augmented responsive environment to: (1) provide a content-related context for improving acceptability at the start of an HRI session; (2) help sustain attention and interest during engagement by providing multiple sources of feedback from both the robot and the screen display; (3) create a vivid multi-sensory environment for a calm, relaxing, and richly interactive experience.
Three experimental conditions with different levels of robot interactivity and contextual influence from the augmented reality display were adopted in this study: Condition 1, reactive context-enhanced interaction with a proactive robot (C1); Condition 2, dynamic context-enhanced interaction with a static robot (C2); and a control condition with only the dynamic context presented (CC). The robotic sheep that invites interactions and provokes contextualized feedback from the environment was used in C1 and was turned off to serve as the static robot in C2. Effectiveness evaluations were performed using four observational rating scales on observed engagement, affect, and apathy-related behaviors.
The objectives of this study are:
  • To explore the effects of contextual interactions between PWD and an animal-like social robot embedded in the augmented responsive environment in an LTC facility;
  • To investigate which experimental condition is most effective in enhancing engagement in PWD, provoking positive affective responses, and reducing apathy-related behaviors.
Since most existing work on social robot use has focused on the feasibility, acceptability, and effectiveness of robotic companions for dementia users [30,31], this study fills a gap in dementia-related research by exploring the impact of the context of interaction on user engagement, and it further investigates the factors that could most benefit PWD during HRI sessions.

2. Materials and Methods

2.1. Participants

Twenty-six residents of a Vitalis WoonZorg Groep LTC center (Vitalis for short) in Eindhoven, the Netherlands, where inhabitants with formal diagnoses of dementia live in an enclosed environment, were approached for the study. Informed consent was obtained for twenty-four residents (N = 24), who then went through the screening procedures for eligibility. The inclusion criteria were: (1) a documented formal diagnosis of dementia; (2) an age of 65 or above; (3) a Mini-Mental State Examination (MMSE) score lower than 24; (4) the physical ability to sit, hold, and interact with the robotic sheep. Participants with acute visual or auditory impairment reported by staff were excluded from the experiment. The principal investigator contacted the care facility to hold a pre-experiment family meeting with the residents’ legally authorized representatives to present all relevant information regarding the experiment, obtain signed informed consent, and explain the residents’ right to refuse to participate at any time. The participant recruitment flow is shown in Figure 2.

2.2. Study Design

To test the effects of the experimental conditions, 21 eligible participants were allocated to two parallel groups (G1: n = 10 and G2: n = 11), randomized by living unit and stratified by dementia severity. Within each group, participants experienced one experimental condition (G1 participated in C1; G2 participated in C2) and the control condition, CC. Descriptive details of the experimental conditions are shown in Table 1. Two sessions of each interaction took place; interaction sessions were provided once a week over four weeks in total. Within each group, half of the participants started with the experimental condition and the other half with the control condition, to eliminate possible confounding factors. In the end, the data of 16 participants were used in the analysis.
All experimental sessions were performed in a real-life setting of the Vitalis living facility. The installation was situated in a public hallway, where two seats were positioned in front of it to create a comfortable sitting environment. Experimental sessions took place during unplanned activity time, between 10:00 a.m. and 12:30 p.m. and between 2:00 and 5:00 p.m., with up to 10 sessions planned each day and each session lasting up to 20 minutes. A trained facilitator invited participants one at a time to join a session. She was instructed to be inconspicuous during the interaction but would help encourage engagement when needed, and she ended a session when the participant started to lose interest and focus. The facilitator was trained extensively through pre-experiment presentations and written guidelines and was blinded to the objectives of the study. Video and audio of all experimental sessions were recorded.

2.3. Measures

The facilitator administered the MMSE test for eligibility screening before the experimental sessions. The evaluation was carried out by blinded raters using the following validated observational rating scales:
  • The Observational Measurement of Engagement (OME) [32] was adopted as it is the most widely used scale for assessing the engagement of PWD. It measures engagement through the duration of time that the resident is involved with the stimulus, and through the level of attention and the attitude toward the stimulus on two separate 7-point Likert scales;
  • The Engagement of a Person with Dementia Scale (EPWDS) [33], using a 5-point Likert scale, was also adopted for evaluating user engagement, as it complements the OME by covering the verbal and social aspects of engagement. The EPWDS emphasizes both the social interaction and the activity participation (engagement with the stimulus) of PWD across LTC settings. This 10-item scale measures five dimensions of engagement: affective, visual, verbal, behavioral, and social. Each dimension is assessed separately using a positive and a negative subscale, then interpreted collectively to provide an overall impression of engagement. Each item indicates the extent to which the rater agrees or disagrees with the statement (“strongly disagree” = 1, “strongly agree” = 5);
  • Observed Emotional Rating Scale (OERS) [34], a 5-point Likert scale for evaluating five affective states: pleasure; anger; anxiety/fear; sadness; general alertness. Items were scored according to the intensity presented during experiment sessions;
  • The People Environment Apathy Rating Scale–Apathy subscale (PEAR–Apathy subscale) [35], a 4-point Likert scale for assessing apathy-related behaviors. It evaluates symptoms of apathy in the cognitive, behavioral, and affective domains through six ratings: facial expression; eye contact; physical engagement; purposeful activity; verbal tone; verbal expression.
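As an illustration of how EPWDS-style scores could be aggregated into the per-dimension values and the composite sum reported later, the sketch below assumes that each of the five dimensions is rated on one positive and one negative 5-point Likert item, and that negative items are reverse-scored so that higher always means more positive engagement. This reverse-scoring convention and all names are assumptions for illustration, not taken from the study's materials.

```python
# Hypothetical sketch of EPWDS-style aggregation: five dimensions, each
# rated on a positive and a negative 5-point Likert item. Reverse-scoring
# of the negative item is an assumed convention, not from this study.
DIMENSIONS = ["affective", "visual", "verbal", "behavioral", "social"]

def epwds_composite(ratings):
    """ratings maps dimension -> (positive_item, negative_item), each 1-5.
    Returns per-dimension scores (range 2-10) and a composite sum (10-50)."""
    per_dimension = {}
    for dim in DIMENSIONS:
        pos, neg = ratings[dim]
        if not (1 <= pos <= 5 and 1 <= neg <= 5):
            raise ValueError(f"{dim}: Likert scores must be in 1-5")
        # Reverse-score the negative item so higher = more engaged.
        per_dimension[dim] = pos + (6 - neg)
    return per_dimension, sum(per_dimension.values())
```

Under these assumptions each dimension ranges from 2 to 10 and the composite sum from 10 to 50, which is consistent in magnitude with the per-dimension means and composite sums reported in Appendix B.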
The rating scales of OME and OERS were rated on-site through direct observation by an observer after each experiment session. The PEAR–Apathy subscale and EPWDS were rated by a trained research assistant based on the video recordings of the experiment after data collection was completed. Although the rating scales were initially developed for field tests via direct observation, previous research studies also tested the validity and reliability of these tools using videos for indirect observation based ratings [35,36] with positive outcomes. A higher rating score of OME, EPWDS, OERS, or PEAR–Apathy subscale indicates a greater display of a particular effect.

2.4. Ethical Considerations

This study was approved by the Board of the Vitalis WoonZorg Groep care center on 14 May 2018. All participants (or legal guardians as representatives, when participants were no longer capable of giving consent) gave their informed consent for inclusion before they participated in the study. The research was permitted and conducted in accordance with the requirements of the Eindhoven University of Technology.

3. Data Analysis

For the inter-rater reliability (IRR) check, a second rater independently rated and coded a randomly selected subset of the video recordings (13 out of 52 sessions). The IRR between the two coders was calculated using Cohen’s kappa statistic, shown in Table A1, Appendix A. According to Fleiss [37,38], a kappa value between 0.40 and 0.60 is considered fair agreement, between 0.60 and 0.75 good agreement, and above 0.75 excellent agreement. The data rated by the primary raters (the onsite observer and the offsite research assistant) were used for further analysis. Data entry and analysis of all observational rating scales were completed using IBM SPSS Statistics, Version 24. We examined the OERS data and found a very low occurrence of the items “Anger”, “Anxiety/Fear”, and “Sadness”; therefore, they were merged into a single item, “Negative Affect” [39]. Differences among the control and experimental conditions were compared using non-parametric statistical tests (Kruskal–Wallis H with pairwise comparisons) for categorical ordinal variables, and analysis of variance (ANOVA) with post hoc tests for continuous variables (the Duration item in the OME and all items in the EPWDS). The significance level was set at p < 0.05.
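The analysis above was run in SPSS; for readers who prefer open tooling, an equivalent sketch using SciPy and scikit-learn might look as follows. The rating vectors are invented placeholders, not study data.

```python
from scipy import stats
from sklearn.metrics import cohen_kappa_score

# Inter-rater reliability on the double-coded subset (placeholder ratings).
rater1 = [3, 4, 4, 2, 5, 3, 4, 2]
rater2 = [3, 4, 3, 2, 5, 3, 4, 2]
kappa = cohen_kappa_score(rater1, rater2)

# Ordinal rating items: Kruskal-Wallis H test across the three conditions.
c1, c2, cc = [5, 6, 5, 4, 6], [4, 4, 3, 5, 4], [4, 5, 4, 4, 5]
h_stat, p_kw = stats.kruskal(c1, c2, cc)

# Continuous items (e.g., Duration, EPWDS sums): one-way ANOVA.
f_stat, p_anova = stats.f_oneway(c1, c2, cc)

ALPHA = 0.05  # significance level used in the analysis
print(f"kappa={kappa:.2f}, Kruskal-Wallis p={p_kw:.3f}, ANOVA p={p_anova:.3f}")
```

Note that SPSS's pairwise comparisons following a significant Kruskal-Wallis test are not part of SciPy itself; an additional package (or a hand-rolled Dunn's procedure with a multiplicity correction) would be needed for that step.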

4. Results

4.1. Participant Demographics

Demographic information was collected from the documented medical records of each participant. The 16 analyzed participants were within the age range of 78–92 (M = 85.2, SD = 4.8) and were at various stages of dementia, according to staff reports. Sample characteristics were summarized as the means and standard deviations (SD) of the continuous variables and as the frequencies and percentages of the categorical ones. The demographics of the participating residents, according to group assignment, are described in Table 2. A t-test showed no significant differences between the characteristics of G1 and G2.

4.2. Results of the Observational Rating Scales Analysis

Results of all rated items from each observational rating scale were summarized using the means and SD, shown in Table A2, Appendix B.
According to the results of the OME, a Kruskal–Wallis test found a significant difference among the three conditions on the item “Attitude—Most of the time” (Chi-square = 6.41, p = 0.041, df = 2). Furthermore, pairwise comparisons showed that “Attitude—Most of the time” was significantly higher in C1 than in C2 (p = 0.049), indicating that the participants demonstrated a more positive attitude with a higher level of robot interactivity accompanied by the response of the augmented reality display. No significance was found for the remaining rating items. The means of “Duration” and “Attitude” (both “Most of the time” and “Highest level”) were lowest during interaction with C2, meaning the participants were least engaged when the robot was present but turned off, compared to the other two conditions.
In terms of the EPWDS, ANOVA tests found statistical differences on the items “Visual Engagement”, F(2, 49) = 4.36, p = 0.018; “Behavioral Engagement”, F(2, 49) = 13.32, p < 0.001; “Social Engagement”, F(2, 49) = 5.56, p = 0.007; and “Composite Sum”, F(2, 49) = 4.13, p = 0.022. Post hoc examinations showed that participants were significantly more engaged in C1 than in CC in terms of the “Visual” (p = 0.005), “Behavioral” (p < 0.001), and “Social” (p = 0.002) aspects of engagement, and in engagement overall (Composite Sum, p = 0.006). No significant difference was found between C1 and C2 on any rating item. In addition, a significant difference was found on “Behavioral Engagement” between C2 and CC (p = 0.003), meaning that the presence of the robot, whether static like a toy or proactive, reinforced the behavioral aspect of engagement.
No significant difference was found in OERS.
Regarding the PEAR–Apathy subscale, significant differences were found on the rating items “Eye Contact” (Chi-square = 7.47, p = 0.024, df = 2), “Physical Engagement” (Chi-square = 22.80, p < 0.001, df = 2), “Purposeful Activity” (Chi-square = 22.06, p < 0.001, df = 2), and “Verbal Expression” (Chi-square = 8.28, p = 0.016, df = 2). Pairwise comparisons demonstrated a significant decrease in apathy-related behaviors in C1 compared to CC on “Eye Contact” (p = 0.023), “Physical Engagement” (p < 0.001), “Purposeful Activity” (p < 0.001), and “Verbal Expression” (p = 0.013). No significance was found on the items “Facial Expression” and “Verbal Tone”. Moreover, C2 had a significantly lower score than the control on the item “Purposeful Activity”.

5. Discussion and Conclusions

5.1. Effects of Contextual Interactions on PWD

The four rating scales demonstrated a number of consistent results: (1) the significant differences discovered between C1 and CC suggested that the contextual interactions, combining HRI with multiple sources of feedback from both the robot (proximal interaction) and the augmented reality display (peripheral interaction), helped capture and maintain PWD’s attention (such as visual gazing and physical manipulation) during engagement; however, they did not contribute to provoking positive emotions or facilitating verbal communication; (2) the presence of either a proactive or a static robot (robot on or off) can help facilitate purposeful activities and motivate behavioral engagement; (3) participants held the least positive attitude toward C2 compared to the other conditions, which could be due to a childish impression created by playing with a turned-off robot. As the literature suggests, the notion of engagement is constructed from two essential components: affective state and focused attention [40]. The results of this study indicate that manipulating the level of interactivity of a social robot and the contextualized feedback positively influenced the attention aspect of engagement. The literature also suggests that customized content with reminiscent materials may contribute to enhanced affective engagement [28]. Future designs of HRI for PWD need to consider properly designed interactivity of social robots, the role of contextual cues in HRI, and the content conveyed through the context, to enhance both the affective and the attentional aspects of engagement. In addition, since most research on PWD has focused on providing therapeutic stimulation through the auditory and visual modalities, it is worth stressing that tactile interaction is also crucial for motivating interest in activity engagement, especially when other senses are compromised.

5.2. Reflections for Measurement Use

The adoption of four rating scales also generated several reflections on measurement use: (1) the combined use of the OERS and the OME can provide an effective overall evaluation of PWD’s engagement in terms of attention and affect/attitude; however, they do not address social or verbal engagement separately, which are useful parameters for evaluating social robot interaction; (2) the EPWDS complements the above two scales by offering a separate evaluation of verbal and social engagement. Additionally, the rated scores of its items can be added up into a single sum for easier comparison between conditions. One problem noted by the authors, however, is that the dimension “affective engagement” measures the extent to which the observer agrees that the participant displayed positive/negative affect, but it does not cover the frequency of emotional expressions or how long they linger; this could be further improved or supported by use alongside the OERS; (3) the PEAR–Apathy subscale is a valuable scale that assesses the extent to which participants are intrinsically motivated to act, regardless of whether they are positively or negatively engaged. It could be adopted alongside engagement evaluation to provide a more comprehensive understanding of user engagement.

5.3. Limitations and Future Work

The first limitation concerns the small sample size and the data analysis. The sample size is relatively small due to challenges in participant recruitment and withdrawals during the experiment. Within each group of participants, a repeated-measures design was used; however, we analyzed the data as between-subject instead of within-subject because of the many dropped sessions, which resulted in some loss of statistical power. The second limitation concerns the ethics of social robot use in dementia care: in certain situations, especially for people at severe stages of dementia, the animal characteristics accompanied by the environmental visual–auditory materials may create a risk of deception. Therefore, we insisted that a facilitator be present at all times to supervise proper use. Future work will include further statistical analysis of the rating-scale results over covariates (e.g., dementia severity, limitations in language, emotion expression, and mobility), which could provide deeper insights. In addition, a full factorial experiment crossing the different levels of robot interactivity with the presence/absence of the responsive context could be conducted to further investigate the influence of adding context to HRI.

Author Contributions

Conceptualization, Y.F., E.I.B., S.Y., J.H., and G.W.M.R.; formal analysis, Y.F. and M.R.; investigation, Y.F.; software, Y.F.; supervision, E.I.B., S.Y., J.H., and G.W.M.R.; validation, Y.F.; writing—original draft, Y.F.; writing—review and editing, E.I.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

The authors would like to thank all participating residents, family, and staff from Vitalis WoonZorg Groep for their participation, Dirk van de Mortel for his contributions to the installation design and building, Famke Boschman for her involvement in the user study, and the China Scholarship Council for sponsoring Yuan Feng’s doctoral study.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Reported Cohen’s Kappa coefficient (k) to measure inter–rater reliability between two raters of all rating items of four observational rating scales.
Rating Scale / Item | k Value
OME
Attention (Most of the time) | 0.776
Attention (Highest level) | 0.655
Attitude (Most of the time) | 0.685
Attitude (Highest level) | 0.675
EPWDS
Affective Engagement | 0.612
Visual Engagement | 0.664
Verbal Engagement | 0.719
Behavioral Engagement | 0.678
Social Engagement | 0.606
OERS
Pleasure | 0.782
General Alertness | 0.642
Anger | 1.000
Anxiety/Fear | 0.755
Sadness | 1.000
PEAR–Apathy
Facial Expression | 0.708
Eye Contact | 0.649
Physical Engagement | 0.750
Purposeful Activity | 0.714
Verbal Tone | 0.745
Verbal Expression | 0.723

Appendix B

Table A2. Kruskal–Wallis H tests with pairwise comparisons and ANOVA tests with post hoc examinations performed on all rating items of four observational rating scales to disclose the differences in engagement, affect, and apathy among experiment conditions: C1, C2, and CC.
Rating Scale / Item | C1 M (SD) | C2 M (SD) | CC M (SD) | Sig. | C1–CC | C2–CC | C1–C2
OME
Duration (in seconds) | 678.38 (244.94) | 502.46 (266.38) | 671.04 (372.91) | 0.260 | 0.947 | 0.128 | 0.169
Attention (Most of the time) | 5.69 (0.95) | 4.85 (0.99) | 4.85 (1.19) | 0.094 | - | - | -
Attention (Highest level) | 6.23 (0.83) | 5.69 (1.03) | 5.54 (0.95) | 0.111 | - | - | -
Attitude (Most of the time) | 5.46 (1.20) | 4.31 (1.18) | 4.62 (1.09) | 0.041 * | 0.125 | 1.000 | 0.049 *
Attitude (Highest level) | 5.92 (1.04) | 5.00 (1.29) | 5.19 (0.98) | 0.094 | - | - | -
EPWDS
Affective Engagement | 8.38 (1.50) | 7.62 (1.50) | 7.77 (2.42) | 0.578 | 0.375 | 0.824 | 0.337
Visual Engagement | 8.77 (1.69) | 7.85 (1.82) | 7.04 (1.73) | 0.018 * | 0.005 ** | 0.179 | 0.183
Verbal Engagement | 8.23 (1.59) | 7.85 (2.04) | 7.54 (1.73) | 0.517 | 0.257 | 0.612 | 0.583
Behavioral Engagement | 8.85 (1.21) | 8.00 (2.19) | 6.58 (0.76) | <0.001 *** | <0.001 *** | 0.003 ** | 0.118
Social Engagement | 7.69 (1.55) | 6.77 (1.69) | 6.12 (1.14) | 0.007 ** | 0.002 ** | 0.175 | 0.099
Composite Sum | 41.92 (6.98) | 38.08 (8.25) | 35.04 (6.53) | 0.022 * | 0.006 ** | 0.213 | 0.173
OERS
Pleasure | 2.54 (0.97) | 2.15 (0.69) | 2.15 (0.93) | 0.419 | - | - | -
General Alertness | 4.38 (0.77) | 4.15 (0.80) | 3.69 (1.02) | 0.104 | - | - | -
Negative Affect | 3.38 (0.65) | 3.54 (0.66) | 3.42 (0.76) | 0.669 | - | - | -
PEAR–Apathy
Facial Expression | 2.15 (0.80) | 2.69 (0.86) | 2.69 (0.84) | 0.120 | - | - | -
Eye Contact | 1.23 (0.44) | 1.54 (0.78) | 1.81 (0.63) | 0.024 * | 0.023 * | 0.438 | 0.880
Physical Engagement | 1.69 (0.75) | 2.69 (1.18) | 3.54 (0.71) | <0.001 *** | <0.001 *** | 0.081 | 0.088
Purposeful Activity | 1.62 (0.65) | 2.46 (1.05) | 3.42 (0.90) | <0.001 *** | <0.001 *** | 0.043 * | 0.191
Verbal Tone | 2.08 (0.64) | 2.46 (0.78) | 2.42 (0.64) | 0.345 | - | - | -
Verbal Expression | 1.46 (0.78) | 2.00 (1.16) | 2.35 (0.85) | 0.016 * | 0.013 * | 0.574 | 0.542
Note: * p < 0.05, ** p < 0.01, *** p < 0.001.

Figure 1. The LiveNature design implemented in Vitalis, comprising a soft-fur-covered robotic sheep and an augmented reality display. The picture on the left shows an example of an interaction session in the control condition; the picture on the right shows a participant interacting with the robotic sheep while the augmented reality display responds.
Figure 2. Flow diagram of recruitment, enrollment, allocation, and the number of participants.
Table 1. Three experimental conditions with descriptive details.
Variable | C1 | C2 | CC
Stimulus | The proactive robot: the robotic sheep responds to users' strokes and touches by moving its head, neck, legs, and tail and by making baby lamb sounds. | The static robot: the robotic sheep was turned off; however, its tactile features remained available and inviting to stroke and hug. | No physical stimulus.
Context | Reactive context: the virtual sheep on the screen display responds to users' strokes and touches by becoming active and approaching the user. | Dynamic context: the display plays a looped video with the same content as in C1. | Dynamic context: same as in C2.
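The C1 condition couples touch sensing on the robot to both an embodied response and an on-screen response. The actual robot and display platforms' APIs are not described in this excerpt, so the loop below is only a minimal sketch with hypothetical interfaces (ReactiveSession, is_touched, and virtual_sheep_approach are invented names, not the system's real calls).

```python
import time

class ReactiveSession:
    """Sketch of the C1 'proactive robot + reactive context' loop."""

    def __init__(self, touch_sensor, robot, display):
        self.touch_sensor = touch_sensor  # reports stroke/touch events
        self.robot = robot                # moves body parts, plays sounds
        self.display = display            # screen showing the virtual sheep

    def step(self):
        if self.touch_sensor.is_touched():
            # Proactive robot: embodied response to stroking or touching.
            self.robot.move("head", "neck", "legs", "tail")
            self.robot.play_sound("lamb")
            # Reactive context: the virtual sheep approaches the user.
            self.display.virtual_sheep_approach()

    def run(self, duration_s):
        end = time.monotonic() + duration_s
        while time.monotonic() < end:
            self.step()
            time.sleep(0.1)  # poll the touch sensor at roughly 10 Hz
```

In C2 the same loop would skip the robot calls (robot turned off), and in CC no physical stimulus is present at all, so only the looped video plays.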
Table 2. Participant demographics including age, gender, type of dementia, marital status, stage of dementia, cognitive function, wheelchair use, MMSE score, and length of stay in the facility. No statistical difference was found between the demographic characteristics of the two participant groups.
Characteristics | G1 (n = 7) | G2 (n = 9) | Total (N = 16) | p Value
Age, mean (SD) | 86.6 (4.2) | 84.1 (5.2) | 85.2 (4.8) | 0.325
Female, n (%) | 5 (71.4) | 7 (77.8) | 12 (75.0) | 0.789
Type of dementia, n (%) | | | | 0.717
  Alzheimer's Dementia | 2 (28.6) | 3 (33.3) | 5 (31.3) |
  Vascular Dementia | 1 (14.3) | 2 (22.2) | 3 (18.8) |
  Mixed Dementia | 4 (57.1) | 4 (44.4) | 8 (50.0) |
Marital status, n (%) | | | | 0.598
  Single/Divorced | 1 (14.3) | 1 (11.1) | 2 (12.5) |
  Married | 4 (57.1) | 4 (44.4) | 8 (50.0) |
  Widowed | 2 (28.6) | 4 (44.4) | 6 (37.5) |
Stages according to staff records, n (%) | | | | 0.861
  Mild | 1 (14.3) | 1 (11.1) | 2 (12.5) |
  Middle | 2 (28.6) | 3 (33.3) | 5 (31.3) |
  Middle to severe | 3 (42.9) | 2 (22.2) | 5 (31.3) |
  Severe | 1 (14.3) | 3 (33.3) | 4 (25.0) |
Cognitive functions reported by staff, n (%) | | | | 0.877
  Mild | 1 (14.3) | 2 (22.2) | 3 (18.8) |
  Confused at times | 3 (42.9) | 3 (33.3) | 6 (37.5) |
  Constantly confused | 3 (42.9) | 4 (44.4) | 7 (43.8) |
Wheelchair use, n (%) | 3 (42.9) | 3 (33.3) | 6 (37.5) | 0.719
MMSE score, mean (SD) | 14 (5.3) | 11.3 (8.3) | 12.88 (7.1) | 0.475
  Range | 8–22 | 0–23 | 0–23 |
MMSE Stage, n (%) | | | | 0.509
  Stage 1 (>19) | 2 (28.6) | 2 (22.2) | 4 (25.0) |
  Stage 2 (10–19) | 4 (57.1) | 4 (44.4) | 8 (50.0) |
  Stage 3 (<10) | 1 (14.3) | 3 (33.3) | 4 (25.0) |
Length of stay, n (%) | | | | 0.967
  Six months or less | 1 (14.3) | 1 (11.1) | 2 (12.5) |
  More than 6 months | 2 (28.6) | 3 (33.3) | 5 (31.3) |
  More than 12 months | 4 (57.1) | 5 (55.6) | 9 (56.3) |
Note: MMSE scores above 19 were considered stage 1, scores between 10 and 19 (inclusive) stage 2, and scores below 10 stage 3.
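The p values in Table 2 compare the two groups on each characteristic. The specific tests used are not stated in this excerpt; a common convention is an independent-samples t-test for continuous characteristics and a chi-square test for categorical ones, so the sketch below follows that convention with hypothetical individual-level values roughly consistent with the reported summaries.

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant ages consistent with the reported
# group sizes and means/SDs; the raw data are not in this excerpt.
g1_age = np.array([86, 90, 82, 88, 85, 92, 83])          # G1, n = 7
g2_age = np.array([84, 79, 90, 86, 80, 88, 83, 77, 90])  # G2, n = 9

# Continuous characteristics (age, MMSE): independent-samples t-test.
t, p_age = stats.ttest_ind(g1_age, g2_age)
print(f"Age: t = {t:.2f}, p = {p_age:.3f}")

# Categorical characteristics (e.g., wheelchair use): chi-square
# test on the 2x2 contingency table of counts from Table 2.
#                  uses wheelchair   does not
table = np.array([[3, 4],            # G1
                  [3, 6]])           # G2
chi2, p_cat, dof, _ = stats.chi2_contingency(table)
print(f"Wheelchair use: chi2 = {chi2:.2f}, p = {p_cat:.3f}")
```

With cells this small, a Fisher exact test (scipy.stats.fisher_exact) would often be preferred over chi-square for the 2x2 comparisons.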

MDPI and ACS Style

Feng, Y.; Barakova, E.I.; Yu, S.; Hu, J.; Rauterberg, G.W.M. Effects of the Level of Interactivity of a Social Robot and the Response of the Augmented Reality Display in Contextual Interactions of People with Dementia. Sensors 2020, 20, 3771. https://doi.org/10.3390/s20133771

