Article

Healthcare Students’ Ethical Considerations of Care Robots in The Netherlands

by Margo A. M. Van Kemenade 1,*, Johan F. Hoorn 2 and Elly A. Konijn 3

1 Department of Mechanical Engineering, Inholland University of Applied Sciences, Bergerweg 200, 1817 MN Alkmaar, The Netherlands
2 Department of Computing and School of Design, The Hong Kong Polytechnic University, Hong Kong, China
3 Department of Communication Science, Media Psychology Program Amsterdam, Vrije Universiteit Amsterdam, 1081 HV Amsterdam, The Netherlands
* Author to whom correspondence should be addressed.
Appl. Sci. 2018, 8(10), 1712; https://doi.org/10.3390/app8101712
Submission received: 22 August 2018 / Revised: 17 September 2018 / Accepted: 17 September 2018 / Published: 20 September 2018
(This article belongs to the Special Issue Social Robotics)

Abstract

Background: Older adults are a rapidly growing group worldwide, requiring an increasing amount of healthcare. Technological innovations such as care robots may support the growing demand for care. However, hardly any studies address those who will most closely collaborate with care robots: the (trainee) healthcare professional. Methods: This study examined the moral considerations, perceptions of utility, and acceptance among trainee healthcare professionals toward different types of care robots in an experimental questionnaire design (N = 357). We also examined possible differences between participants’ intermediate and higher educational levels. Results: The results show that potential maleficence of care robots dominated the discussion in both educational groups. Assisting robots were seen as potentially the most maleficent. Both groups deemed companion robots least maleficent and most acceptable, while monitoring robots were perceived as least useful. Results further show that the acceptance of robots in care was more strongly associated with the participants’ moral considerations than with utility. Conclusions: Professional care education should include moral considerations and utility of robotics as emerging care technology. The healthcare and nursing students of today will collaborate with the robotic colleagues of tomorrow.

1. Introduction

Over the past decades, a shift in healthcare needs has become evident. The world population is ageing while fertility drops below replacement level [1] (pp. 2–5). Life expectancy is higher than ever [2] and older adults require an increasing amount of healthcare [3,4]. By 2040, the number of older dementia patients is estimated to reach 81.1 million worldwide [5]. These patients often require more specialised care [6,7]. For example, loneliness among older adults may lead to excess morbidity and mortality [8]. The changes in the amount and specialisation of healthcare needs also increase the costs [9]. On top of severe healthcare budget cuts in most Western countries, a shortage of educated care professionals is expected in the area of specialised care [2]. This shortage of hands, together with the aftermath of a global financial crisis, foreshadows a future of low-quality eldercare at high costs [10].
Progressively, the call for care technology becomes louder and care robots seem to be in the vanguard of that development [11]. Care robots come in many forms, from surgery machines (e.g., Da Vinci Surgical System) to cuddle toys (e.g., PARO). Generally, three types of care robots may be distinguished: assistive, monitoring, and companion robots [12]. Assistive robots help with hygiene chores such as washing someone’s hair [13]. Monitoring robots survey matters of behaviour or health [14]. Companion robots provide entertainment and daily management, which is typical for dementia patients [15] as well as for socially isolated seniors [16].
A healthcare robot is not a conventional machine but an agency that makes (partially) independent decisions and executes specialised tasks with little assistance [17]. Healthcare robots provide support for basic activities, assisting seniors and their caretakers [18]. However, despite recent advances in robotics, care robots cannot independently tend to the needs of older adults, nor will they in the foreseeable future. Thus, robots themselves need their minders, on the work floor as well as in operational management. That is, current students of care will be working with robots as their colleagues [19]. Others will be managers and planners, dealing with teams of care personnel and robots. To prepare healthcare systems for new technology, proper training of (future) care workers is required [3]. Higher vocational students will become the care managers that coach the lower vocational students, who become the care professionals, working in mixed robot teams.
The questions that arise from such observations include: Do the professionals of tomorrow perceive robot care as useful and helpful in their jobs, or as a threat foreshadowing massive lay-offs? Does robot care clash with moral principles of “good care”? These and related questions are addressed in the current paper, which investigates what trainee professionals think of assistive, monitoring, and companion robots in terms of morality, utility, and acceptance. In view of robot care, how do care students weigh a patient’s (1) autonomy; (2) non-maleficence; (3) beneficence; and (4) justice [20]? These basic principles of healthcare ethics are generally accepted for assessing medical and care procedures, treatments, interventions, and technologies.
To guarantee autonomy, patients should have full control over decisions that concern their health. They cannot be forced into treatment. Patients should be informed as completely as possible and made aware of dangers, benefits, and success rates. Treatment is performed only with the patient’s fully informed consent. Non-maleficence holds that it is better to do nothing than to do something that worsens the patient’s condition. Treatment should not harm the patient, their kin, environment, or society; an incision may temporarily and locally “harm” a patient as long as the greater benefit is ensured. Beneficence, in turn, entails that all acts, thoughts, and deliberations have the good of the patient in mind. Treatment should be tailored to the individual, and education, training, and technology should be continuously updated, improved, and tested. Justice refers to fairness, with equal rights for everyone to the best treatment: medicine, expertise, and other resources are distributed equally among all. This principle demands that potential conflicts of treatment with current law, legislation, rights, liabilities, and other obligations be addressed.
Our first research question (RQ1), then, is: To what extent do higher and lower vocational care students believe that assistive, monitoring, and companion robots affect a patient’s autonomy, may do harm, or may be beneficial and “just”? Conversely, do care students fear that robots “take over” a patient’s decision making, may physically hurt patients, that companion robots do not help against loneliness, and that denying patients human empathy is unfair [21,22]?
Another concern of care students might be that robots take their jobs [23]. Do students perceive robots as complementary or are robots considered so capable that robots will replace them? Utility of a robot, in our case, relates to how handy and practical care students think a robot would be during job performance [24]. Acceptance would be the actual agreement to and adoption of robot technology in the work practice, for instance, based on utility and ease of use [25].
About a decade ago, the general tendency among healthcare students was that robots would replace them and that employing robots in care was unacceptable [26]. Ekeland [27] and Schulman [28] found that care students were pessimistic about the usefulness of robots in telemedicine. In nursing older adults, the workforce did not believe robots to be useful [29].
Would this position have changed over the past years? What is known is that, if people do not expect much of the performance of a technology, the behavioural intention to actually use the system is weak [30]. Likewise, if perceived usefulness and perceived ease of use are low, intentions to use new technology drop [25]. The reverse also seems valid: intention to use and actual use increase if people feel that a device will perform well, is useful, and is easy to use (ibid.). Hence, if care professionals see that a robot is practical and functional, would their initial moral objections become less of an issue [31] or will both co-exist?
Our RQ2, then, is: To what extent do care students believe that assistive, monitoring, and companion robots are acceptable and that robots will be useful in their future occupations? Furthermore, RQ3 asks: Does high perceived usefulness perhaps downplay earlier ethical concerns? To investigate ethical and occupational concerns of care students with different types of care robots, we designed a questionnaire study, probing the contrast between principles of ethics and considerations of utility and acceptance.
In other words, the current study seeks to investigate how care professionals’ moral concerns relate to different types of care robots, whether these differ among the higher and lower educated professionals, how care professionals perceive a care robot’s usefulness, and how their moral concerns relate to perceived utility and acceptability.

2. Methods and Materials

2.1. Participants and Design

In total, 406 respondents completed our questionnaire (M = 21.22 years, SD = 2.47; 87.7% female). Thirty-four participants completed the questionnaire in an extremely short time and showed answering tendencies (e.g., identical answers throughout) and thus had to be excluded. Another 15 participants were excluded because of extreme scores (outliers) on the composite factors (>1.5 * IQR). The final number of participants was N = 357; 21% were intermediate vocationally trained (n = 75; M = 19.12, SD = 2.42; 88% female) and 79% were in higher education (n = 282; M = 21.78, SD = 2.17; 87.6% female). Table 1 shows their characteristics.
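For readers who wish to reproduce this screening step, the sketch below illustrates the 1.5 * IQR outlier rule applied to composite factor scores. It is a minimal Python/pandas illustration with a hypothetical data frame of composite scores, not the procedure the authors ran in SPSS.

```python
# Minimal sketch of IQR-based outlier screening (hypothetical data frame "scores"
# with one composite factor per column); respondents with any value beyond
# 1.5 * IQR from the quartiles are dropped.
import pandas as pd


def iqr_outlier_mask(series: pd.Series, factor: float = 1.5) -> pd.Series:
    """Return a boolean mask marking values outside Q1 - f*IQR .. Q3 + f*IQR."""
    q1, q3 = series.quantile(0.25), series.quantile(0.75)
    iqr = q3 - q1
    return (series < q1 - factor * iqr) | (series > q3 + factor * iqr)


def drop_iqr_outliers(scores: pd.DataFrame) -> pd.DataFrame:
    """Drop any respondent who is an outlier on at least one composite factor."""
    outlier_any = scores.apply(iqr_outlier_mask).any(axis=1)
    return scores.loc[~outlier_any]
```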
Recruitment occurred via three healthcare institutes (intermediate vocational) and two universities of applied sciences (higher vocational) in The Netherlands. These students were expected to work as care professionals within four years. A link to the online questionnaire was sent to all 7065 students enrolled in care courses at one of these institutes. The response rate was 36% for higher and 5% for intermediate vocational students; a reminder was sent once to the latter group.
The research design crossed the three-level factor Robot Type (assistive, monitoring, and companion; between-subjects) with the two-level factor Education Level (intermediate vs. higher vocational). The dependent variables were measures of medical ethics (autonomy, beneficence, maleficence, and justice), Utility, and Acceptance. Gender and age were included as control variables. Age naturally co-varies with vocational level because higher vocational education usually follows intermediate. All participants were treated according to the university’s ethical guidelines.

2.2. Materials and Procedure

We created a questionnaire based on the literature discussed above and on pilot focus group sessions in which care professionals provided feedback for improvements. Participants in the focus groups did not join the actual study. The improved questionnaire was made available online through Qualtrics. Upon opening the questionnaire link, participants read a brief introduction about healthcare robots, which provided a definition and described one of three different types of care robots (i.e., either assistive, monitoring, or companion robots), illustrated by a picture.
To avoid cross-comparisons among the robot types, participants were assigned randomly to conditions. We used the same questionnaire in each condition. After the introduction, a test question validated the comprehensibility of the task. Then, participants were asked to evaluate the care robot on each of the following items.

2.3. Measures

Each construct was measured by Likert-type items, balanced for indicative and counter-indicative items, each followed by a six-point rating scale (1 = strongly disagree, 6 = strongly agree), avoiding the neutral position [32]. Items were recoded afterwards so that they all point in the same direction as indicated here. Measurement scales were optimized based on principal component analysis (PCA) and reliability analyses (SPSS-V21). PCA was run with direct Oblimin (oblique) rotation on the questionnaire items [33]. The final analysis comprised four factors explaining 65.09% of the variance, including 18 items with primary loadings over 0.5 and no cross-loadings larger than 0.3. Composite scores were created for each of the four factors based on the mean of the items. The internal consistency of the resulting scales was well beyond the minimum, with Cronbach’s alphas > 0.70 [33] (see Table 2). Below, we describe how we measured each of the ethical categories.
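The following sketch illustrates the scale-construction steps described above: recoding counter-indicative items, computing Cronbach’s alpha, and building composite scores as item means. It is a minimal Python illustration with hypothetical item layouts; the original analyses were run in SPSS.

```python
# Minimal sketch (not the authors' SPSS syntax) of scale construction on
# six-point Likert items: reverse-scoring, Cronbach's alpha, composite means.
import pandas as pd


def recode_counter_indicative(items: pd.DataFrame, reverse: list,
                              scale_min: int = 1, scale_max: int = 6) -> pd.DataFrame:
    """Reverse-score the counter-indicative items on a 1..6 rating scale."""
    out = items.copy()
    out[reverse] = (scale_max + scale_min) - out[reverse]
    return out


def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the total)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)


def composite_score(items: pd.DataFrame) -> pd.Series:
    """Composite factor score as the mean of its items."""
    return items.mean(axis=1)
```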
Autonomy (3 items; α = 0.87) was defined as a person’s ability to make his/her own choices independently without the interference of other people or outsiders [20]. Example items are: “A care robot increases the independence of a patient” (indicative) and “A care robot makes the patient more dependent” (counter-indicative).
Beneficence referred to a person’s sense of health and well-being [20]. A sample item is: “A care robot treats the patient well.” Non-maleficence refers to a device or treatment not doing harm to patients [20]. Because care professionals actually think in terms of maleficence, we omitted the equivocal “non” from the original scale label. Although “maleficence” is not exactly the opposite of non-maleficence, this phrasing was more comprehensible. A sample item is: “A care robot is an obstacle for improving health.” However, results of the scale optimization showed a better fit for a single scale covering both Beneficence and Maleficence (4 items; α = 0.85). The dominant items on that scale indicated the name Maleficence.
Justice pertained to the need to distribute care evenly across patients [20]. An example item is: “A care robot keeps its promises”. However, the items did not form a reliable scale and we therefore dismissed it from further analysis.
Utility (five items; α = 0.78) concerned the degree to which care professionals believed a robot is practical, “handy,” and useful during the execution of their jobs [24]. A sample item is: “A care robot is practical”.
Acceptance (six items; α = 0.87) referred to care professionals giving admittance and approval to a care robot without protest or refutation, while regarding a robot as a proper and normal care utensil [25]. A sample item is: “I will accept the assistance of a care robot”.
An approximately normal distribution was evident for the composite score data in the current study, with skewness and kurtosis scores between −1 and 1, thus making the data well-suited for parametric statistical analyses.
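A quick way to run this normality screen outside SPSS is sketched below; it assumes a hypothetical data frame of composite scores and reports skewness and excess kurtosis per factor.

```python
# Normality screen as described above: composite scores are treated as
# approximately normal when skewness and excess kurtosis fall between -1 and 1.
import pandas as pd
from scipy.stats import kurtosis, skew


def normality_screen(scores: pd.DataFrame) -> pd.DataFrame:
    """Report skewness and excess kurtosis per composite score column."""
    return pd.DataFrame({
        "skewness": scores.apply(lambda s: skew(s, bias=False)),
        "kurtosis": scores.apply(lambda s: kurtosis(s, bias=False)),  # excess kurtosis
    })
```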

3. Results

Table 3 presents the descriptive statistics of the dependent variables per Robot Type. To analyse our Research Questions, we ran a 3 (Robot Type: assisting, monitoring, companion) × 2 (Education Level: higher vs. intermediate vocational) GLM MANOVA on the dependent measures Maleficence, Autonomy, Utility, and Acceptance.
The Box M value was associated with a p-value of 0.04. Note, however, that the Box M test is highly sensitive and, for samples of N ≥ 200, is only considered problematic at p ≤ 0.001 [34]. Because our sample exceeded 200 and the p-value (0.04) was above that threshold, the assumption was not considered violated and it was appropriate to execute a MANOVA.
Results of the multivariate analysis showed a significant main effect of Robot Type on the dependent variables (V = 0.26, F(8,698) = 13.22, p < 0.001, ηp2 = 0.13). However, the main effect of Education Level was not significant (p = 0.111) and neither was the interaction between Education Level and Robot Type (F = 0.78). Thus, specifics of Education Level could not be tested further.
The univariate between-subjects effects showed the main effect of Robot Type on participants’ moral considerations (RQ1) and their evaluations of the different robot type’s utility and acceptance (RQ2). Based on Levene’s F test, the homogeneity of variance assumption was considered satisfactory with p > 0.05 for each subgroup. Robot Type had a significant effect on Maleficence (F(2,353) = 31.86, p < 0.001, ηp2 = 0.15), on Utility (F(2,353) = 3.75, p = 0.024, ηp2 = 0.02), and on Acceptance (F(2,353) = 12.58, p < 0.001, ηp2 = 0.07). The effect of Robot Type on Autonomy was not significant (p > 0.05).
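For illustration, the 3 × 2 multivariate analysis could be set up as follows in Python with statsmodels; column names are hypothetical and the study itself used SPSS GLM MANOVA.

```python
# Minimal sketch of the 3 (Robot Type) x 2 (Education Level) MANOVA on the
# four dependent variables, using statsmodels (hypothetical column names).
import pandas as pd
from statsmodels.multivariate.manova import MANOVA


def run_manova(df: pd.DataFrame):
    """Multivariate test of Robot Type, Education Level, and their interaction."""
    mv = MANOVA.from_formula(
        "maleficence + autonomy + utility + acceptance ~ C(robot_type) * C(education)",
        data=df,
    )
    return mv.mv_test()  # reports Pillai's trace (V), Wilks' lambda, etc.
```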
Pairwise comparisons (Tukey) showed that Acceptance was significantly higher for Companion than for Monitoring robots (ΔM = 0.73, SE = 0.15, p < 0.001, 95% CI = 0.44–1.03) and for Assisting robots (ΔM = 0.46, SE = 0.15, p = 0.002, 95% CI = 0.17–0.75). The difference between Monitoring and Assisting robots was not significant (p > 0.05).
Regarding Maleficence, Assisting robots raised significantly higher scores than did Monitoring robots (ΔM = 0.40, SE = 0.14, p = 0.005, 95% CI = 0.12–0.67) and significantly higher scores than Companion robots (ΔM = 1.06, SE = 0.14, p < 0.001, 95% CI = 0.80–1.33). The difference between Monitoring and Companion robots also was significant, Monitoring robots being considered more Maleficent than Companion robots (ΔM = 0.67, SE = 0.14, p < 0.001, 95% CI = 0.40–0.93). Utility merely showed a trend, indicating that Monitoring robots seemed less useful than Companion or Assisting robots.
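The pairwise follow-up comparisons could be reproduced along the lines of the sketch below, which applies Tukey’s HSD to one dependent variable across the three robot types (hypothetical column names; illustrative only).

```python
# Sketch of the Tukey pairwise comparisons across robot types for one DV.
import pandas as pd
from statsmodels.stats.multicomp import pairwise_tukeyhsd


def tukey_by_robot_type(df: pd.DataFrame, dv: str = "acceptance"):
    """Tukey HSD comparisons of one dependent variable across the three robot types."""
    result = pairwise_tukeyhsd(endog=df[dv], groups=df["robot_type"], alpha=0.05)
    return result.summary()  # mean differences, confidence intervals, adjusted p-values
```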
To answer RQ3, we analysed the extent to which acceptance of a care robot was determined by maleficence and autonomy or by utility. Therefore, we conducted a multiple regression analysis with an interaction term [35] to predict the Acceptance scores through interactions with Autonomy, Utility, and Maleficence. Age and Education Level served as control variables (i.e., moderators).
Regarding assumptions, the Durbin–Watson value of 2.032 was well within the 1.5–2.5 range, indicating the absence of autocorrelation. The variance-inflation factors (VIF) were acceptable (<5.0), suggesting no multicollinearity due to interdependency of variables (Rogerson, 2001). Three residual outliers with z-scores beyond ±3.29 (i.e., the 0.1% most extreme values) were removed. The predictors were mean centred before computing the interaction terms. There were no linearity problems, and the standardized predicted-versus-residual plot showed no signs of heteroscedasticity. Overall, it was suitable to perform a regression analysis.
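A sketch of these assumption checks in Python is given below: Durbin–Watson on the residuals, variance-inflation factors per predictor, and flagging standardized residuals beyond |3.29|. Column names are hypothetical and the snippet is illustrative only.

```python
# Sketch of the reported regression diagnostics (hypothetical column names).
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.stats.stattools import durbin_watson


def regression_diagnostics(df: pd.DataFrame, dv: str, predictors: list):
    X = sm.add_constant(df[predictors])
    model = sm.OLS(df[dv], X).fit()
    dw = durbin_watson(model.resid)                       # ~2 means no autocorrelation
    vif = {p: variance_inflation_factor(X.values, i + 1)  # i + 1 skips the constant column
           for i, p in enumerate(predictors)}
    z_resid = (model.resid - model.resid.mean()) / model.resid.std(ddof=1)
    outliers = df.index[z_resid.abs() > 3.29]             # residual outliers to remove
    return dw, vif, outliers
```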
The full model with five predictors (Autonomy, Utility, Maleficence, Age, and Education Level) explained 53.6% of the variance in Acceptance (R² = 0.54, F(5, 348) = 80.26, p < 0.001), also after Bonferroni correction (α = 0.05/5 = 0.01). The combination of the interaction terms between Autonomy, Utility, and Maleficence with Age and Education Level as moderators added a marginally significant (according to the Bonferroni correction) and small amount of explained variance (2%): ΔR² = 0.02, F(6, 343) = 2.54, p = 0.020. The total explained variance was 55.5% for this model (R² = 0.56, F(11, 342) = 38.83, p < 0.001).
Furthermore, potential Maleficence had a significantly negative influence on Acceptance (β = −0.33, t(342) = 8.05, p < 0.001). By contrast, increased Autonomy positively influenced Acceptance (β = 0.18, t(342) = 4.39, p < 0.001) just like Perceived Utility (β = 0.41, t(342) = 9.23, p < 0.001). Age also positively influenced Acceptance but not beyond the Bonferroni-corrected level (β = 0.08, t(342) = 1.99, p = 0.047), whereas Education Level was not significantly related to Acceptance (p > 0.05).
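The moderated regression itself could be set up roughly as sketched below: the continuous predictors are mean centred and product terms with the moderators are added before fitting the model. Column names are hypothetical, and Education Level is assumed to be coded 0/1.

```python
# Sketch of the moderated regression with mean-centred predictors and
# predictor x moderator interaction terms (hypothetical column names).
import pandas as pd
import statsmodels.api as sm


def moderated_regression(df: pd.DataFrame, dv: str = "acceptance"):
    predictors = ["autonomy", "utility", "maleficence"]
    moderators = ["age", "education_level"]  # education assumed coded 0/1
    X = df[predictors + moderators].copy()
    centred = predictors + ["age"]
    X[centred] = X[centred] - X[centred].mean()           # mean-centre continuous variables
    for p in predictors:
        for m in moderators:
            X[f"{p}_x_{m}"] = X[p] * X[m]                 # interaction terms
    return sm.OLS(df[dv], sm.add_constant(X)).fit()
```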
To test the relative weights of moral considerations and utility perceptions on the acceptance of healthcare robots, we performed a hierarchical regression (method Enter) with Autonomy and Maleficence (both indicating moral concerns) entered as predictors in the first block and Utility in the second block, with the Acceptance measure as the dependent variable. Autonomy and Maleficence together significantly explained Acceptance, R² = 0.48, R²adj = 0.47, F(2, 409) = 184.75, p < 0.001, which was mainly due to Maleficence. Utility added a significant increase of 9% in explained variance (ΔR² = 0.09, F(1, 408) = 81.97, p < 0.001). The three predictors together explained a substantial amount of variance in Acceptance (R² = 0.56, R²adj = 0.56, Fchange(1, 408) = 81.97, p < 0.001), of which the largest part is explained by the health professionals’ moral considerations about care robots.
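The blockwise logic of this hierarchical regression is sketched below: block 1 contains the moral-concern predictors, block 2 adds Utility, and the R² change between blocks quantifies Utility’s added explanatory value (hypothetical column names; illustrative only).

```python
# Sketch of the hierarchical (blockwise) regression and the R-squared change.
import pandas as pd
import statsmodels.api as sm


def hierarchical_r2_change(df: pd.DataFrame, dv: str = "acceptance"):
    """Fit block 1 (Autonomy, Maleficence) and block 2 (+ Utility); return R2s and delta-R2."""
    block1 = sm.OLS(df[dv], sm.add_constant(df[["autonomy", "maleficence"]])).fit()
    block2 = sm.OLS(df[dv], sm.add_constant(df[["autonomy", "maleficence", "utility"]])).fit()
    return block1.rsquared, block2.rsquared, block2.rsquared - block1.rsquared
```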

4. Discussion

The current study examined how moral considerations and perceptions of utility and acceptance of different types of care robots were appraised by trainee healthcare and nursing professionals at intermediate and higher educational levels. We also analysed the relative contributions of perceived utility and ethics in robot acceptance. Results showed that trainee care professionals evaluated assistive robots as more maleficent than either monitoring or companion robots. Companion robots were also more readily accepted as collaborators than monitoring and assistive robots. Furthermore, monitoring robots were considered more maleficent than companion robots. Participants also thought that monitoring robots are less useful than companion or assisting robots. Considerations of autonomy did not differentiate between the robot types. No significant differences between the intermediate and higher educational levels were found. Finally, results show that moral concerns weigh more heavily in accepting to work with a robot than practical utility, although both contributed significantly and together explained a substantial amount of variance in accepting robots on the work floor.
Healthcare and nursing students were presented with three types of care robots to examine their moral considerations, acceptance, and utility perceptions. Results show that these trainee healthcare professionals saw little harm in companion robots and worried most about assistive robots, irrespective of their schooling. From the debriefing interviews, it appeared that assistive robots were perceived as most maleficent because they can actually physically drop someone or make a wrong move. Importantly, our results provide a more positive view of attitudes toward healthcare robots than previous studies showed. Only several years ago, healthcare students feared that robots would replace them and considered employing robots in care unacceptable and not very useful [26,27,28,29]. Today, trainee healthcare and nursing professionals are willing to accept care robots on the work floor provided that the technology does not harm the patient and that patient safety is guaranteed. Acceptance levels vary according to the type of robot, with companion robots being the most acceptable and useful.
The extant literature argues that companion robots that can, for instance, play the favourite music of an elderly patient suffering from dementia could alleviate loneliness and increase feelings of well-being [36,37,38]. Hence, a robot might provide a perfect interface for accessing such a resource, delivering music as treatment with a greater degree of interactivity that simulates personal interaction with a human being.
Results further showed that morality was more important than utility in accepting a robot in care although utility was not trivial. This is an important addition to prevailing theories on technology acceptance [25], which primarily focus on utility and ease of use but neglect the possible moral or ethical considerations. The participants’ position in our study is in line with Stephany and Majkowski [24] (p. 131) who opposed the trend that utility considerations often overthrow a “moral sense of care.” In our study, morality and utility together explained a substantial amount of variance in the willingness to accept healthcare robots. Notably, when considering the relative weights in a hierarchical regression analysis, the largest part is explained by moral considerations such as robots doing harm or increasing a patient’s independence. Thus, indeed, moral senses of care overthrow utility considerations in (trainee) care professionals.
Our findings suggest that care robots have the potential to solve urgent problems in care [10], provided that, before implementation, the moral concerns of healthcare and nursing professionals are taken into account. Our results indicate that each type of robot should follow its own line of introduction because their different functionalities come with different moral concerns. Companion robots may take the lead and pave the way because they were seen as highly useful, most acceptable, and least harmful. For the development and implementation of care technology, our results suggest that companion robots can be employed right away without much resistance from the caregiver, both on the work floor (lower/intermediate vocational) and in operational management (higher vocational). It also means that more work has to be put into making robots less threatening. Particularly for assisting machines, this means that robots performing physical tasks should comply with the highest safety regulations and technology standards. Technically, they should be flawless, which should be demonstrated by extensive user tests. Compare, for example, the aviation industry, where fear of flying is countered by demonstrating that it is the safest form of transportation available [39].
A limitation might be that the measurement of the moral considerations was based on the principles of medical ethics [20], which are generally used to assess medical and care procedures, treatments, interventions, and technologies. However, no clear measurements of these constructs yet existed (cf. available measurements for utility and acceptance [25]). Therefore, we carefully constructed items following the definitions of these four basic principles of medical ethics. Psychometric analyses showed that the items for “justice” could not be indexed as a reliable scale. Furthermore, the items for “non-maleficence” and “beneficence” showed overlap and had to be merged into one scale rather than two separate constructs. Although “(non-)maleficence” and “beneficence” are not simply each other’s opposites [20], the overlap in the empirical observations is comprehensible. Finally, all scales used for the analyses were internally consistent.
Another methodological consideration is the lower response rate of participants at the lower/intermediate educational level. Therefore, the subsamples in the current study were not equally distributed. Even though MANOVA is a robust technique and Box’s test showed that the homogeneity of the variance–covariance matrices was not violated [34], it is desirable that groups are of similar size. The relatively low response rate overall (7065 persons were approached, of whom only 406 completed the questionnaire) is due to a mistake in the planning of our study. School holidays in The Netherlands are staggered across regions, and the questionnaire was sent to the students’ school e-mail addresses during such a holiday. Moreover, the deadline for completing the questionnaire fell within this holiday period. Therefore, most likely many students never opened their school e-mail. For this exploratory study, we reasoned that the number of participants was still sufficient to yield interesting insights.
Related, another interesting note is that 75% of all vocational educated participants at the intermediate level were attendees of a Christian (Reformed) school. Because their holiday took place on a different schedule than the secular one, this might explain their relatively higher response rate. Whether religion could have any influence on the acceptance of care robots is unclear. It might be interesting to include religion in future research.
Implications of our results highlight that moral considerations are important in professional health care. Previous research emphasized the responsibilities of nurse educators and healthcare employers to provide learning opportunities for new care professionals in technical skills, to maintain patient safety, and to provide “good care” [40]. To achieve these goals, nursing students and trainee care professionals should understand the importance of using evidence-based guidelines and develop a reflective approach toward the performance of technical tasks. This requires nuanced care education that welcomes innovative technology, while such innovations are also consistently tested against ethical principles. With the increasing importance of healthcare technology, it is imperative that (trainee) healthcare professionals learn skills and gain knowledge concerning health technology and medical informatics [41,42]. This should also include ethical considerations.

5. Conclusions

Trainee healthcare professionals today seem more willing to accept care robots than in previous research [26,27,28,29]. The willingness to accept a care robot in the current study was mainly associated with moral considerations and less so with utility. The discussion on applying healthcare robots was dominated by potential maleficence, particularly for assisting care robots, irrespective of the participants’ educational level. Companion robots were considered most acceptable and least harmful. Overall, we suggest enriching the curriculum of care students with classes on the ethical implications of robots as new care technology. Based on applied research, students should learn which (type of) machine has value for which specific care task. Future care professionals should be prepared to encounter new mechanical colleagues on the work floor, be able to make the most of their virtues, and come to grips with their inconveniences.

Author Contributions

Conceptualization, M.A.M.v.K.; Methodology, J.F.H. and E.A.K.; Software, data management and statistical analyses were performed in SPSS, version 21; Validation, J.F.H. and E.A.K.; Formal Analysis, M.A.M.v.K., J.F.H. and E.A.K.; Investigation, M.A.M.v.K.; Resources, VU University Amsterdam; Data Curation, E.A.K.; Writing-Original Draft Preparation, M.A.M.v.K.; Writing-Review and Editing, M.A.M.v.K., J.F.H. and E.A.K.; Visualization, E.A.K.; Supervision, E.A.K. and J.F.H.; Project Administration, VU University Amsterdam; Funding Acquisition, J.F.H., M.A.M.v.K. and E.A.K. through grants of The Netherlands Organization for Scientific Research (NWO), Inholland and VU University Amsterdam.

Funding

This study is part of the SELEMCA project (Services of Electro-Mechanical Care Agencies, grant NWO 646.000.003), which was funded within the Creative Industry Scientific Programme (CRISP) and supported by the Dutch Ministry of Education, Culture and Science. An additional contribution was funded by a personal grant to the first author from the Central Board of Inholland University of Applied Sciences and matched by a grant from the VU University to the third author.

Acknowledgments

We are very grateful to the participants and the Regional Training Institute (ROC) Hoornbeeck, ROC Horizon, ROC Scalda, Hogeschool Inholland, and Hogeschool van Amsterdam for their generous cooperation. We would like to thank the Medical Technology Research Group of InHolland for proofreading. The researchers of the SELEMCA group are kindly acknowledged for their critical and constructive remarks. All authors are responsible for the reported research and all participated in the conception and design of the study, the analysis and interpretation of the data, and drafting and revising the manuscript. The first author, MK, collected the data. All authors approved the manuscript as submitted.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. United Nations (UN). World Population Prospects: The 2012 Revision, Key Findings and Advance Tables; Working Paper No. ESA/P/WP.227; United Nations-Population Division: New York, NY, USA, 2013. [Google Scholar]
  2. Knickman, J.; Snell, E. The 2030 Problem: Caring for aging baby boomers. Health Serv. Res. 2002, 37, 849–884. [Google Scholar] [CrossRef]
  3. World Health Organisation. Global Health and Aging: Preface. National Institute on Aging Website; NIH Publication No. 11-7737; U.S. Department of Health and Human Services: Washington, DC, USA, 2011; Volume 10. Available online: https://www.nia.nih.gov/sites/default/files/2017-06/global_health_aging.pdf (accessed on 15 September 2018).
  4. Miskelly, F. Assistive Technology in Elderly Care. Age Ageing 2001, 30, 455–458. [Google Scholar] [CrossRef] [PubMed]
  5. Ferri, C.; Prince, M.; Brayne, C.; Brodaty, H.; Fratiglioni, L.; Ganguli, M.; Scazufca, M. Global prevalence of dementia: A Delphi consensus study. Lancet 2005, 366, 2112–2117. [Google Scholar] [CrossRef]
  6. Wolff, J.L.; Starfield, B.; Anderson, G. Prevalence, expenditures and complications of multiple chronic conditions in the elderly. Arch. Intern. Med. 2002, 162, 2269–2276. [Google Scholar] [CrossRef] [PubMed]
  7. Zhang, P.; Imai, K. The Relationship between Age and Healthcare expenditure among persons with diabetes mellitus. Expert Opin. Pharmacother. 2007, 8, 49–57. [Google Scholar] [CrossRef] [PubMed]
  8. Saha, S. Loneliness. J. Geriatr. Care Res. 2016, 3, 24. [Google Scholar]
  9. Chandra, A.; Skinner, J. Technology growth and expenditure growth in healthcare. J. Econ. Lit. 2012, 50, 645–680. [Google Scholar] [CrossRef]
  10. Prince, M.; Comas-Herrera, A.; Knapp, A.; Guerchet, M.; Karagiannidou, M. World Alzheimer Report 2016: Improving Healthcare for People Living with Dementia: Coverage, Quality and Costs Now and in the Future; Alzheimer’s Disease International: London, UK, 2016. [Google Scholar]
  11. Broadbent, E. Interactions with robots: The truths we reveal about ourselves. Annu. Rev. Psychol. 2017, 68, 627–652. [Google Scholar] [CrossRef] [PubMed]
  12. Sharkey, A.; Sharkey, N. Granny and the robots: Ethical issues in robot care for the elderly. Ethics Inf. Technol. 2010, 14, 27–40. [Google Scholar] [CrossRef]
  13. Dometios, A.; Papageorgiou, X.S.; Tzafestas, C.; Vartholomeos, C. Towards ICT-supported bath robots: Control architecture description and localized perception of user for robot motion planning. In Proceedings of the 24th Mediterranean Conference on Control and Automation, Athens, Greece, 21–24 June 2016. [Google Scholar]
  14. Wang, Y.; Kavoussi, L. Apparatus and Method for Patient Rounding with a Remote Controlled Robot. U.S. Patent USRE45870E1, 26 January 2016. Available online: https://Patents.google.com/patent/USRE45870 (accessed on 22 December 2016).
  15. Robinson, H.; MacDonald, B.; Kerse, N.; Broadbent, E. The psychosocial effects of a companion robot: A randomized controlled trial. J. Am. Med. Dir. Assoc. 2013, 14, 661–667. [Google Scholar] [CrossRef] [PubMed]
  16. Van Kemenade, M.; Konijn, E.; Hoorn, J. Robots humanize care: Moral concerns versus witnessed benefits for the elderly. In Proceedings of the 8th International Conference on Health Informatics (HEALTHINF), Lisbon, Portugal, 12–15 January 2015. [Google Scholar]
  17. Dautenhahn, K. Social Intelligent Robots: Dimensions of human robot interaction. Philos. Trans. R. Soc. Lond. Ser B Biol. Sci. 2007, 362, 679–704. [Google Scholar] [CrossRef] [PubMed]
  18. Tiwari, P.; Warren, J.; Day, K.; MacDonald, B. Some non-technology implications for wider application of robots to assist older people. Health Care Inform. Rev. Online 2010, 14, 50–65. [Google Scholar]
  19. Oborn, E.; Barrett, M.; Darzi, A. Robots and Service Innovation in Healthcare. J. Health Serv. Policy 2011, 16, 46–50. [Google Scholar] [CrossRef] [PubMed]
  20. Beauchamp, T.; Childress, J. Principles of Biomedical Ethics; Oxford University Press: Oxford, UK, 2013. [Google Scholar]
  21. Bassett, C. Nurses’ perceptions of care and caring. Int. J. Nurs. Pract. 2002, 8, 8–15. [Google Scholar] [CrossRef] [PubMed]
  22. De Cooman, R.; de Gieter, S.; Pepermans, R.; Bois, C.D.; Caers, R.; Jegers, M. Freshmen in Nursing: Job motives and work values of a new generation. J. Nurs. Manag. 2008, 16, 56–64. [Google Scholar] [CrossRef] [PubMed]
  23. McClure, P.K. “You’re Fired,” Says the Robot: The Rise of Automation in the Workplace, Technophobes, and Fears of Unemployment. Soc. Sci. Comput. Rev. 2017, 36, 139–156. [Google Scholar] [CrossRef]
  24. Stephany, K.; Majkowski, P. Technological utility: How it sometimes interferes with caring practice. In The Ethic of Care: A Moral Compass for Canadian Nursing Practice; Bentham Books: Potomac, MD, USA, 2012; pp. 131–144. [Google Scholar] [CrossRef]
  25. Venkatesh, V.; Morris, M.; Davis, F.; Davis, G. User acceptance of information technology: Toward a unified view. MIS Q. 2003, 27, 425–478. [Google Scholar] [CrossRef]
  26. Kristofferson, A.; Coradeschi, S.; Severinson-Eklundh, K. An exploratory study of health professionals’ attitudes about robotic telepresence technology. J. Technol. Hum. Serv. 2011, 29, 263–283. [Google Scholar] [CrossRef]
  27. Ekeland, A.; Bowes, A.; Flottorp, S. Effectiveness of telemedicine: A systematic review of reviews. Int. J. Med. Inform. 2010, 79, 736–771. [Google Scholar] [CrossRef] [PubMed]
  28. Schulman, C.; Marttos, A.; Rothenberg, P.; Augenstein, J. Usability of telepresence in a level-1 trauma center. Telemed. J. e-Health 2013, 19, 248–251. [Google Scholar]
  29. Mahoney, D.F. The aging nurse workforce and technology. Gerontechnology 2011, 11, 13–25. [Google Scholar] [CrossRef]
  30. Taiwo, A.; Downe, A. The theory of user acceptance and use of technology (UTAUT): A meta-analytic review of empirical findings. J. Theor. Appl. Inf. Technol. 2013, 49, 48–58. [Google Scholar]
  31. Deleuze, G. Spinoza: Practical Philosophy; City Light Books: San Francisco, CA, USA, 1988. [Google Scholar]
  32. Norman, G. Likert Scales, levels of measurements and the ‘laws’ of statistics. Adv. Health Sci. Educ. 2010, 15, 625–632. [Google Scholar] [CrossRef] [PubMed]
  33. Field, A. Discovering Statistics Using IBM SPSS Statistics, 4th ed.; Sage: London, UK; Thousand Oaks, CA, USA, 2013. [Google Scholar]
  34. Tabachnik, B.; Fidell, L. Using Multivariate Statistics; Allyn & Bacon: Boston, MA, USA, 2007. [Google Scholar]
  35. Hayes, A.; Scharkow, M. The Relative Trustworthiness of Inferential Tests of the Indirect Effect in Statistical Mediation Analysis. Psychol. Sci. 2013, 24, 1918–1927. [Google Scholar] [CrossRef] [PubMed]
  36. Ray, K.D.; Mittelman, M.S. Music therapy: A nonpharmacological approach to the care of agitation and depressive symptoms for nursing home residents with dementia. Dementia 2017, 16, 689–710. [Google Scholar] [CrossRef] [PubMed]
  37. Thomas, S.; Baier, R.; Kosar, C.; Ogarek, J.; Trepman, A.; Mor, V. Individualized music program is associated with improved outcomes for US nursing home residents with dementia. Am. J. Geriatr. Psychiatry 2017, 25, 931–938. [Google Scholar] [CrossRef] [PubMed]
  38. Sung, H.C.; Chang, A.M.; Lee, W. A preferred music listening intervention to reduce anxiety in older adults with dementia in nursing homes. J. Clin. Nurs. 2010, 19, 1056–1064. [Google Scholar] [CrossRef] [PubMed]
  39. Fleischer, A.; Tchetchik, A.; Toledo, T. Does it pay to reveal safety information? The effect of safety information on flight choice. Transp. Res. Part C Emerg. Technol. 2015, 56, 210–220. [Google Scholar] [CrossRef]
  40. Ewertsson, M.; Gustafsson, M.; Blomberg, K.; Holmström, I.; Allvin, R. Use of Technical Skills and Medical Devices among new registered Nurses: A questionnaire study. Nurse Educ. Today 2015, 35, 1169–1174. [Google Scholar] [CrossRef] [PubMed]
  41. Haux, R.; Swinkels, W.; Ball, M.; Knaup, P.; Lun, C. Transformation of Healthcare through Innovative Use of Information Technology; challenges for health and medical informatics education. Int. J. Med. Inform. 1998, 50, 1–6. [Google Scholar] [PubMed]
  42. Sharts-Hopko, N. The coming revolution in personal care robotics: What does it mean for nurses? Nurs. Adm. Q. 2014, 38, 5–12. [Google Scholar] [CrossRef] [PubMed]
Table 1. Characteristics of participants (N = 357).

Level of Education               Age groups (n)                      Age M (SD)      Gender n (%)                     n
Intermediate vocational level    16–19: 55; 20–23: 9; 24+: 11        19.12 (2.42)    ♀ 66 (88%); ♂ 9 (12%)            75
Higher vocational level          16–19: 52; 20–23: 126; 24+: 104     21.78 (2.17)    ♀ 247 (87.6%); ♂ 35 (12.4%)      282
All students                     16–19: 107; 20–23: 135; 24+: 115    21.22 (2.47)    ♀ 313 (87.7%); ♂ 44 (12.3%)      357
Note. M = mean; SD = standard deviation; n = subsample size.
Table 2. Descriptive statistics of the four composite factors used for further statistical analyses (N = 357).

Factor         # Items    Mean    SD      Cronbach’s α
Maleficence    4          3.22    0.95    0.85
Autonomy       3          3.50    1.10    0.87
Utility        5          4.17    0.82    0.78
Acceptance     6          3.99    0.99    0.87
Note. Items were scored on six-point rating scales (1 = strongly disagree, 6 = strongly agree).
Table 3. Descriptive statistics (Mean, Standard Deviation (SD)) of the four dependent variables (DV) (N = 357).

Robot Type    DV             n      Mean       SD
Assisting     Maleficence    112    3.64 cd    0.75
              Autonomy       112    3.69       1.15
              Utility        112    4.30       0.82
              Acceptance     112    4.02 b     0.85
Monitoring    Maleficence    127    3.37 ce    0.92
              Autonomy       127    3.50       1.15
              Utility        127    3.96       0.84
              Acceptance     127    3.64 a     0.99
Companion     Maleficence    118    2.66 de    0.89
              Autonomy       118    3.32       0.98
              Utility        118    4.26       0.76
              Acceptance     118    4.35 ab    0.97
Note. Items were scored on six-point rating scales (1 = strongly disagree, 6 = strongly agree). Equal superscripts indicate significant differences.
