Article

Role of Individual Motivations and Privacy Concerns in the Adoption of German Electronic Patient Record Apps—A Mixed-Methods Study

by Richard Henkenjohann 1,2
1 Faculty of Linguistics and Information Science, University of Hildesheim, 31141 Hildesheim, Germany
2 Digital Health Center, Hasso Plattner Institute for Digital Engineering gGmbH, University of Potsdam, 14482 Potsdam, Germany
Int. J. Environ. Res. Public Health 2021, 18(18), 9553; https://doi.org/10.3390/ijerph18189553
Submission received: 31 July 2021 / Revised: 23 August 2021 / Accepted: 26 August 2021 / Published: 10 September 2021
(This article belongs to the Special Issue Information Technology's Role in Global Healthcare Systems)

Abstract:
Germany’s electronic patient record (“ePA”) launched in 2021 after several attempts and years of delay. The development of such a large-scale project is a complex task, and so is its adoption. Individual attitudes towards an electronic health record are crucial, as individuals can decline to opt in, rendering any national effort unachievable. Although the integration of an electronic health record offers potential benefits, it also poses risks to an individual’s privacy. With a mixed-methods study design, this work provides evidence that different types of motivation and contextual privacy antecedents affect usage intentions towards the ePA. Most notably, individual motivations stemming from feelings of volition or external mandates positively affect ePA adoption, although internal incentives are more powerful.

1. Introduction

Providing efficient healthcare has a genuine impact on society, as it directly influences people’s well-being. The use of information technology (IT), and more specifically, electronic health records (EHRs), improves the quality of care provided and reduces healthcare costs [1]. Healthcare is information-intensive, since many activities are enabled through storing, processing, and analyzing data. An EHR increases efficiency in healthcare delivery, simplifies monitoring patient health, facilitates monetary savings, reduces paper-based errors, and improves diagnoses and treatments [2,3,4,5,6]. Governments and healthcare systems promote national patient health records as “a way of preserving patients’ health and medical information and maintaining their data in a central facility that ideally can be shared between different healthcare providers” ([7], p. 1). An EHR offers “efficiencies in collecting and storing patient information, contributing to continuity of care and alleviating problems such as misdiagnosis or prescription errors” ([7], p. 1). Patient-administered health records, often referred to as personal health records (PHRs), aim to improve this continuity of care while simultaneously realizing the right to informational self-determination in that each patient is made the owner of all disease-related data [8]. In a PHR, citizens can add valuable self-reported information to their health records and are given control over their data by, for example, deciding who can access their health records [7,9,10,11]. The adoption of such systems, however, is a complex task. First, the implementation of electronic patient records has to be performed on the institutional level, where all care providers have to adopt the technologies [1,12]. Second, the adoption of PHRs has to take place on the individual level. Individual attitudes are crucial on that level, as patients can reject opting in to the PHR or demand to opt out of it [1].
Consequently, it is vital to understand what makes individuals willing to adopt such a system [13,14]. The patient’s choice not to adopt a PHR is crucial in the diffusion process [1], as patients’ resistance can render “any national efforts unachievable” ([13], p. 360). Though the integration of an electronic health record offers potential benefits, it also poses risks to an individual’s privacy [15]. Privacy concerns remain the most significant factor causing patients to withhold EHR adoption [13,14,16]. In particular, individual health information can easily be de-anonymized when combined [17]. Consequently, the highly sensitive nature of health and medical data raises many ethical issues when establishing a nationwide electronic health record [7].
The German healthcare system is characterized by fragmented care structures that hinder cross-sectoral care of patients and can lead to additional costs for the healthcare system, such as loss of information between practitioners, duplicated examinations, and uncoordinated treatment processes [18]. Germany’s electronic patient record project (elektronische Patientenakte—ePA) intends to overcome these barriers while increasing transparency and efficiency [19,20]. As of 1 January 2021, statutory health insurance companies in Germany have been obligated to offer electronic patient records to their insureds (§ 341 German Social Code, Book V). In an early study, Hoerbst et al. [21] surveyed attitudes towards EHRs among Austrian and German citizens and found that citizens are generally interested in managing their health data and exchanging data between healthcare providers; however, data protection concerns were often mentioned. A Eurobarometer survey [22] showed that respondents generally like to have web-based access to their medical records, depending on the possibility of limiting access. Studies showed that privacy controls determined by the patient are a prerequisite for sharing health information [23,24]. A lack of granular controls negatively influences the willingness to share health information with other health professionals [25], and consequently, worse healthcare outcomes can be expected [26]. Additionally, Caine and Hanania [27] discussed that patients want detailed privacy controls over their data in health records. That is why we aimed to understand the different antecedents that add to an individual’s privacy concerns about the ePA. For a more comprehensive view, this mixed-methods study aimed to examine individuals’ attitudes towards the ePA by considering different types of motivation.

2. Theoretical Background and Prior Research

2.1. Endogenous Motivations in Driving Usage Intentions

In motivational psychology, there is consensus that individuals’ motivations are either intrinsically or extrinsically originated (e.g., [28]). This is embodied in Davis et al.’s [29] motivational model, which represents the prevailing perspective on understanding user intentions [30]. In this model, “extrinsic motivation influences behavior due to the reinforcement value of outcomes, [while] intrinsic motivation refers to the performance of an activity for no apparent reinforcement other than the process of performing the activity per se” ([29], p. 1112). Consequently, “perceived usefulness is an example of extrinsic motivation, whereas enjoyment is an example of intrinsic motivation” ([29], p. 1112). Even though prevailing technology adoption models help to explain many antecedents to behavioral intentions, such as perceived usefulness and ease of use, traditional technology acceptance models often fall short because relevant user beliefs remain uncaptured [30,31]. That is why the extrinsic/intrinsic dichotomy, which conceptualizes extrinsic motivation as perceived usefulness and intrinsic motivation as enjoyment, may result in an incomplete understanding [31]. Malhotra et al. [31] therefore proposed a different approach to capturing user intentions, which utilizes organismic integration theory (OIT) [32], a sub-theory of self-determination theory (SDT) [33].
Historically, research has treated motivation as a concept that varies primarily in quantity (cf. [34]), the idea being that more motivated people “will aspire greater achievement and be more successful in their efforts than people with less motivation” ([35], pp. 221–222). In contrast, SDT argues that the type of motivation is more important than the level of motivation in predicting behavioral outcomes [33,36]. Thus, the “distinction between autonomous versus controlled is more important than the distinction between intrinsic and extrinsic” ([37], p. 471). OIT regards motivation as the level of internalization and integration of an activity’s value [32]. Individuals who experience their behaviors as autonomously driven perceive volition, whereas individuals whose behaviors are linked to feelings of pressure perceive themselves as being controlled [35]. There is evidence that volitional motivation has a more significant influence on behavioral outcomes than motivation through external influences [31,35,38,39].
The OIT specifies a taxonomy for the levels of perceived autonomy, referring to the perceived locus of causality (PLOC). The PLOC describes the extent to which someone senses an action as being self-initiated [32]. Figure 1 shows the relations of different types of endogenous motivation to specific PLOC types. With an internal PLOC, individuals see themselves as the originators of their behavior, whereas with an external PLOC, people see themselves as being controlled by external forces [31]. For example, users may be motivated to learn how to use a new piece of technology out of self-interest or out of compliance with a supervisor. Internal PLOC further splits into identified PLOC and intrinsic PLOC; feelings of volition are common to both types. Intrinsic PLOC refers to instinctive and spontaneous behavior [40] that results in actions being performed for inherent enjoyment or fun [32]. Identified PLOC, in contrast, refers to behavior based on individual values and meaningful goals that is performed freely and autonomously [40]. Because identified PLOC results from internalizing external regulations as essential values, it is a type of extrinsic motivation [31]. Both intrinsic and identified PLOC are often combined into a composite of autonomous motivation [31]. Both imply an internal PLOC, but only identified PLOC can be shaped by outside direction [31]. With external PLOC, individuals attribute the reasons for their actions to external authority or compliance [32]; a crucial characteristic of external PLOC is that perceived external influences and personal values do not conflict [31]. Introjected PLOC, however, is defined by a misalignment of perceived external influence and personal values [31]. This conflict can result in affective feelings of guilt and shame or esteem-based pressure to act [32,40]. Introjected PLOC often leads to rejection of the “imposed” behavior [31].
Even though both external PLOC and introjected PLOC are linked to external influence, they result in different behavioral outcomes [32].
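The taxonomy above can be summarized, purely as an illustration, in a small Python sketch. The type names follow OIT, but the decision rule and the flag names (`internal_locus`, `enjoyment_driven`, `values_conflict`) are our own simplification, not part of the theory’s formal apparatus:

```python
from enum import Enum

class PLOC(Enum):
    """Perceived locus of causality types from organismic integration theory."""
    INTRINSIC = "intrinsic"      # internal: inherent enjoyment or fun
    IDENTIFIED = "identified"    # internal: internalized values and goals
    EXTERNAL = "external"        # external: compliance, no conflict with values
    INTROJECTED = "introjected"  # external: influence conflicts with values

def classify_ploc(internal_locus: bool, enjoyment_driven: bool,
                  values_conflict: bool) -> PLOC:
    """Illustrative decision rule mirroring the taxonomy in Figure 1."""
    if internal_locus:
        return PLOC.INTRINSIC if enjoyment_driven else PLOC.IDENTIFIED
    return PLOC.INTROJECTED if values_conflict else PLOC.EXTERNAL

# A user adopting the ePA on a physician's recommendation they agree with:
assert classify_ploc(False, False, False) is PLOC.EXTERNAL
# A user pressured by their insurer against their own judgement:
assert classify_ploc(False, False, True) is PLOC.INTROJECTED
```

The two sample calls encode the external vs. introjected distinction discussed above: the same external influence yields different PLOC types depending on whether it conflicts with personal values.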
The PLOC framework [31] suggests that the different types of PLOC have cumulative effects on behavioral intentions. Understanding endogenous motivation can explain and predict individual differences in usage intentions across a population. The framework can also help explain different behavioral outcomes and why some technologies are more widely accepted than others. The PLOC framework has been applied in various research contexts, such as sustainable consumer behavior and educational and health-related lifestyle contexts [41,42]. Existing studies demonstrate that the PLOC framework needs to be contextualized, i.e., relevant contextual variables need to be used for a context-specific study [43,44]. Consequently, we contextualized the PLOC model by conducting a qualitative study in the first phase of the mixed-methods design [45,46].

2.2. Privacy Theories and Research in the Health Context

The ubiquitous nature of information technology has led to a “privacy is dead” shockwave [47]; however, many still consider health data sensitive and believe it should be protected [15]. Even though the privacy literature is comprehensive, research in the health context is still ongoing, and only a few studies have examined information privacy in this context (e.g., [1,2,15,48,49,50,51]). The existing literature demonstrates that protecting health data is increasingly vital to individuals. Privacy concerns result in privacy-protective behaviors, such as refusing to adopt health solutions, including EHRs [13,52,53,54], and withholding information from health professionals [2,55], which can negatively influence diagnoses.
The privacy concept has been discussed in various ways, but most of the literature emphasizes the matter of control [56,57,58,59,60]. For example, building on Clarke [61], Bélanger and Crossler [62] defined privacy as an individual’s desire for control over their personal information. Similarly, in the health context, Fox and Connolly [63] define privacy as an individual’s desire to be granted greater control over the collection and dissemination of personal health information by health professionals and technology vendors. As the concept of privacy remains challenging to measure directly, various other concepts are used as proximal measures. As such, privacy concern has been established as a central measure [62,64,65]. Privacy concern is the extent of the perception of a potential loss of privacy [66], i.e., the general tendency of people to worry about the loss of their informational privacy [67,68].
To measure privacy concerns, Smith et al. [64] introduced the four-dimensional “Concern for Information Privacy” (CFIP) scale, which queries individuals’ concerns regarding the collection, errors, unauthorized secondary use, and improper access of their information. The CFIP has since been used to measure health information privacy concerns in electronic medical records and EHRs [1,13,16,52,54]. However, Kordzadeh et al. [48] suggested acknowledging additional factors of Internet Users’ Information Privacy Concerns (IUIPC) [68], namely, the sub-dimensions collection, control, and awareness. Hong and Thong [69] combined the CFIP and IUIPC into a six-dimensional “Internet Privacy Concerns” (IPC) scale. Fox and Connolly [63] then rephrased the IPC measure to create the Health Information Privacy Concern (HIPC) scale. Hong and Thong [69] formed the IPC as a third-order construct, and as a consequence, the HIPC is a third-order construct as well. Concerns about collection, secondary usage, and control form the second-order factor interaction management, while errors and improper access constitute the second-order factor information management. Both second-order factors, plus awareness, build the third-order factor (H)IPC, as shown in Figure 2. While past studies on healthcare adoption often measured privacy concern with one dimension (cf. [14,48,70]), the complex nature of m-health technology requires a more sophisticated approach, and the HIPC is therefore preferably measured as the multidimensional construct depicted in Figure 2 [63,69].
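As an illustration of this hierarchy, the following Python sketch aggregates sub-dimension scores up the third-order structure. The 1–7 Likert-style values and the unit weighting are assumptions made for the example; the cited studies estimate the factor weights empirically via structural equation modeling:

```python
from statistics import mean

# Hypothetical per-dimension mean scores on a 1-7 Likert scale (illustrative only).
scores = {
    "collection": 5.0, "secondary_usage": 6.0, "control": 4.0,  # interaction mgmt
    "errors": 3.0, "improper_access": 5.5,                      # information mgmt
    "awareness": 6.5,
}

def hipc(s: dict) -> float:
    """Aggregate first-order dimensions up the third-order HIPC hierarchy.
    Unit weights are an assumption; real studies estimate weights via SEM."""
    interaction_mgmt = mean([s["collection"], s["secondary_usage"], s["control"]])
    information_mgmt = mean([s["errors"], s["improper_access"]])
    return mean([interaction_mgmt, information_mgmt, s["awareness"]])

print(round(hipc(scores), 2))  # composite score for the hypothetical respondent
```

The nesting of `mean` calls mirrors the factor structure: two second-order composites plus awareness feed the single third-order score.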
Prior research has mainly utilized the antecedents→privacy concerns→outcomes (APCO) macro model, which posits that a number of antecedents, usually individual traits or contextual factors, form an individual’s privacy concerns, which, in turn, cause behavioral outcomes [65]. However, the majority of studies focus on the outcomes rather than the antecedents [65]. A major contribution to research on antecedents of the HIPC has been made by Fox and James [15]. This work further examines antecedents expected to impact the HIPC while simultaneously validating the factors found by Fox and James [15].

2.3. Risk and Trust Beliefs in Privacy Research

Besides privacy concerns, risk and trust play significant roles in privacy research. Trust beliefs become crucial when dealing with uncertainty [71], where trust is the “belief that the trusted party will fulfill its commitments [72,73] despite the trusting party’s dependence and vulnerability [74,75]” ([76], p. 54). Research shows that greater trust in the vendor’s competence, benevolence, and integrity results in lower privacy concerns [77,78,79]. Research in the health context supports the influence of trust on privacy concerns. For instance, Bansal et al. [14] found that trust influences the customer’s willingness to interact with health-related websites. Dinev et al. [1] also found that trust in EHR system vendors reduces privacy concerns. On the other hand, risk is considered an antithesis to trust and can be described as one’s expectation that information disclosure will have a negative outcome [56]. Studies have shown that risk perceptions increase privacy concerns for health websites [50] and reduce usage intentions for health-promoting wearables [80].
For electronic health records, users expect a heightened probability of privacy breaches and data misuse [13]. Additionally, “the highly sensitive nature of personal medical data adds even more to the uneasiness individuals feel about the violations and misuse” ([1], p. 29). Those concerns are general and are not necessarily linked to specific systems or practices [1]. Consequently, trust is crucial in overcoming risk perceptions concerning electronic health records [1,14]. McKnight et al. [79] distinguished between institution-based trust and disposition to trust in information systems research. For EHRs, an individual can trust a health professional but may not necessarily trust EHR systems; alternatively, an individual may value an EHR but not the institutions or care providers using it [1].

2.4. IT Identity in Predicting IT Adoption Intentions

Self-categorization and social comparison shape an individual’s identity [81]. Self-categorization supports individuals in putting their social environment into order and in understanding and recognizing their peers [82]. Self-identity develops over time as people observe and categorize themselves relative to others based on their goals, perceptions of how others respond to them, and their self-evaluations [83]. One type of self-categorization is to see IT as being integral to the sense of self. Carter and Grover ([84], p. 938) defined IT identity as “the extent to which a person views use of IT as integral to his or her sense of self”. The concept assumes that an individual’s IT usage is motivated by positive self-identification with IT use [85]. People who strongly self-identify with IT use it more extensively than those who do not identify with the technology [84].
Carter [86] defined three dimensions of IT identity that constitute an individual’s self-perception in relation to IT: dependency, emotional energy, and relatedness. Dependency is specified as “the degree of reliance a person feels on a particular IT or class of ITs as a source of personal well-being” ([86], p. 115). IT is so ubiquitous that businesses and humans depend on it, so it constitutes one component of individuals’ identities, and people express the perception of a need for devices. Emotional energy is defined as “an individual’s enduring feelings of emotional attachment and enthusiasm in relation to an IT or class of ITs” ([86], p. 115). For example, continuous interaction with an IT device can result in confidence, energy, and enthusiasm; conversely, a lack of these emotions can cause negative feelings, such as boredom [87]. Finally, relatedness refers to “a blurring of boundaries between notions of the self and an IT experienced as feelings of connectedness with an IT or class of ITs” ([86], p. 114). When individuals incorporate their devices’ characteristics within their self-identities, they feel intimately connected to these IT devices.
In the digital health context, it has been proposed that IT identity influences emotions relating to IT, affecting patients’ decisions on whether or not to adopt healthcare devices or applications [88]. Additionally, the literature indicates that IT identity theory is a relevant factor in explaining patients’ interaction with m-health applications [89]. Accordingly, this work adopts IT identity and its dimensions to predict users’ intention to adopt the ePA application.

3. Prototype

At the time of research, the ePA had not yet launched, and at the time of writing, it has not achieved high diffusion due to missing technical infrastructure [90]. To overcome this limitation, we reviewed the ePA and built a prototypical ePA mobile application. Based on findings in the literature (e.g., [91]), we developed the prototype shown in Figure 3 to be used for further research; it was used to inform the participants of the following studies. Creating a distinct prototype also helps to create a common understanding of the ePA, which should be helpful given the breadth of available ePA applications. For prototyping, we utilized Figma [92], a “mid-fidelity” prototyping tool for creating interfaces that can be tested immediately to get practical impressions of an application [93].
As each health insurer will provide its own version of an ePA application, the applications will be branded. Health insurance companies and technology vendors will likely offer ePAs as part of more comprehensive digital health applications. In the document view (second screen), all electronic documents in the patient file are listed chronologically, with name and publication date. With the ePA being patient-administered, users can upload any documents, even newspaper articles. Another view (third screen) shows the permissions screen, where all granted permissions are listed; the list gives an overview of which health providers have been given access, past or ongoing. The last screen holds a record of every action performed on the patient file, for instance, which files have been uploaded, downloaded, or deleted, and by whom.

4. The Mixed-Methods Design

We applied a two-stage sequential mixed-methods design to research the intentions, attitudes, and privacy concerns towards ePAs. Mixed-methods research combines “elements of quantitative and qualitative research approaches […] for the broad purposes of breadth and depth of understanding and corroboration” ([94], p. 123). A mixed-methods design approach is compelling in the ever-changing IT context, where researchers encounter problems with the explanatory power of existing theories and findings [30]. Mixed-methods research offers three main advantages: it allows one to address confirmatory and explanatory research questions simultaneously, provides more robust inferences than a single method, and can produce a more comprehensive range of divergent and complementary views [46].
The overall study followed Venkatesh et al.’s [46] design guidelines. At the beginning of the process, we defined three research questions (one qualitative, one quantitative, and one mixed-methods; see Appendix A). The purpose of the mixed-methods design is “developmental”: the findings from the first, qualitative strand are used to inform the second, quantitative strand [46]. The study followed multiple paradigms from an epistemological perspective, with the first strand being interpretive and the second strand being deductive [46]. The methodology is “mixed-methods multistrand” ([46], p. 443) with a “sequential exploratory design” [95], characterized by a qualitative phase followed by a quantitative phase ([46], p. 445). The research design is sequential exploratory–explanatory, as it combines exploratory and explanatory approaches [96]. The study falls into the category of a “dominant–less dominant design,” with the quantitative strand being dominant in the overall design ([97], p. 44). Appendix A documents the design choices made. Figure 4 visualizes the dominant–less dominant design of our mixed-methods study.

5. Phase 1 Qualitative Study

The phase 1 qualitative study aimed to answer the research question: “What are the salient factors determining an individual’s intentions toward using the ePA?” To answer this question, we conducted semi-structured interviews with four individuals.

5.1. Research Methodology

Before the interviews, we created a semi-structured interview guideline. Semi-structured interviews encourage communication, thereby encouraging respondents to reveal underlying concepts [98]. The conversational form allows follow-up questions and prompts based on the answers [98]. This approach is particularly appropriate for generating new theory rather than confirming established theory. The four interviewees (Appendix C) were identified through purposive sampling, i.e., non-probabilistic sampling where subjects are selected intentionally [99]. The interviews took about 30 min each and were conducted in German using a combination of open-ended and closed questions (Appendix B). All interviews were conducted remotely, recorded, and transcribed. Ethical considerations included measures such as only using encrypted communication channels, using pseudonyms in the transcripts, and not asking about health-related circumstances, such as chronic diseases. After some general opening questions, we presented the prototype of the ePA application from Figure 3. The prototype was explained in detail, which took no longer than five minutes. Subsequently, the interviewees were asked about their general attitudes toward this application and to articulate how these attitudes were constituted. The respondents were then asked about their health information privacy concerns (cf. [15]). Further questions covered the interviewees’ usage intentions, perceived benefits, and perceptions of risk.
We used an inductive approach [100] to make sense of the interviews rather than quantifying the data. We started by generating a “start list” of codes ([101], p. 58) resulting from the literature review. Then, with a “constant comparative” analysis ([102], p. 105), we identified initial concepts and linked them to resulting sets of broader categories [103]. In grounded theory methodology, this procedure is equivalent to the “open coding” phase ([104], p. 12), in which “conceptually similar events/actions/interactions are grouped to form categories and subcategories”. We used the software Atlas.ti to apply codes to the transcripts. Through constant comparison, “abstract categories” of labels were assigned to similar concepts ([101], p. 58).
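Conceptually, the grouping step of open coding collects coded transcript segments under emergent category labels. The sketch below illustrates this mechanically; the segments and code names are hypothetical stand-ins, not actual data from the transcripts:

```python
from collections import defaultdict

# Hypothetical (segment, code) pairs as they might emerge from open coding.
coded_segments = [
    ("wants to see current diagnoses", "identified_ploc"),
    ("physician recommended the app", "external_ploc"),
    ("fears shame over medical history", "introjected_ploc"),
    ("wants granular access rights", "desire_for_control"),
    ("values efficient treatments", "identified_ploc"),
]

# Group conceptually similar segments under their category labels.
categories = defaultdict(list)
for segment, code in coded_segments:
    categories[code].append(segment)

print(sorted(categories))  # the emergent category labels
```

In practice this grouping was performed iteratively in Atlas.ti through constant comparison, not by a fixed mapping as shown here.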

5.2. Findings

The coding of the transcripts revealed different types of motivation-related variables. Identified PLOC was emergent, e.g., interest in accessing health data or in more efficient treatments. Additionally, coding revealed that advice from health professionals supports ePA usage intentions, indicating that external PLOC drives adoption intentions. Introjected PLOC can result in rejection of the ePA, as respondents discussed that a negative medical history could result in feelings of shame due to external expectations conflicting with internal values. Beyond these motivation-related variables, respondents mentioned that a person’s IT experience and age could influence usage intentions. In the interviews, we found indicators for different privacy concerns, i.e., concern for collection, secondary usage, improper access, and errors, and a general desire for privacy and control over one’s data. Especially regarding the desire for control, many respondents underlined the importance of granular access rights. The trade-off between privacy risk and trust was repeatedly mentioned: respondents stated that trust in their physicians positively influences their intentions, whereas perceived risk, especially regarding general data collection practices on technical devices, has a negative impact. Some interviewees also mentioned that usability might play a role in using or discontinuing use of the application. The topics and broader concepts that emerged from the interviews are provided in Appendix D. Additionally, Appendix E displays clarifying quotes per interviewee.

6. Research Model

The research model used (1) the PLOC framework [31] as the underlying theory to capture individual motivations, (2) the HIPC construct with contextualized privacy antecedents, and (3) heuristically evaluated context-based constructs to develop and justify the hypotheses. Table 1 shows the constructs and their definitions. We used intention as the dependent variable because an intention is the most “proximal” influence on behavior ([105], p. 76). An intention is what one plans on doing.
Internal PLOC comprises both intrinsic and identified PLOC; feelings of volition characterize both states. Intrinsic PLOC refers to spontaneous behavior performed for inherent fun, and identified PLOC refers to behavior based on personal values, goals, and outcomes [32]. For the ePA, users may adopt it because they enjoy being in control of it (intrinsic drivers) or be guided by internalized values such as health awareness. For example, one interviewee (I1) said that they like to see “which current diagnoses I will have or which doctor’s letters and documents come together that exist about me”. Hence,
Hypothesis 1.
Internal PLOC positively influences one’s intentions toward adopting ePA applications.
External PLOC is perceived when one’s actions are attributed to external authority [32], with no conflict between the perceived external influences and an individual’s internal values. The resulting behavior is usually performed to comply with external demands. In the case of the ePA, such external demands could arise from recommendations by physicians or health insurers. Hence,
Hypothesis 2.
External PLOC positively influences one’s intentions toward adopting ePA applications.
Introjected PLOC refers to feelings of shame and guilt that may emerge when other parties prompt one to act in a particular way (e.g., [33]). The user feels tension and confusion, as introjected PLOC derives from a misalignment between a user’s beliefs about the behavior and their self-perceived autonomy [32]. If users experience their health insurer or the government exerting pressure to use the ePA but judge themselves to be autonomous, the resulting uncertainty is likely to negatively influence usage intentions for ePA applications. Hence,
Hypothesis 3.
Introjected PLOC negatively influences one’s intentions toward adopting ePA applications.
According to Carter and Grover [84], there are three behavioral consequences of IT identity: feature use behavior, enhanced use, and resistance behavior. Consequently, mobile technology identity can lead to both resistance and adoption [84]. Higher mobile technology identity can lead to higher motivation to adopt mobile applications, since people are dependent on and enthusiastic about their phones. This enthusiasm for the mobile device can increase an individual’s motivation to adopt m-health applications [107]. Additionally, feelings of IT dependence or relatedness can motivate people because they can link these feelings to dimensions of their identity [84]. Hence,
Hypothesis 4.
Mobile technology identity positively influences one’s intentions toward adopting ePA applications.
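The directions hypothesized in H1–H4 can be expressed, for illustration only, as a toy linear composite. The weights are arbitrary placeholders chosen to encode the hypothesized signs, not estimates from this study, which tests the paths statistically:

```python
def intention_score(internal_ploc: float, external_ploc: float,
                    introjected_ploc: float, mti: float,
                    weights=(0.5, 0.2, -0.2, 0.3)) -> float:
    """Toy composite encoding only the signs of H1-H4: positive effects of
    internal PLOC (H1), external PLOC (H2), and mobile technology identity
    (mti, H4), and a negative effect of introjected PLOC (H3).
    Weights are illustrative placeholders, not empirical estimates."""
    w1, w2, w3, w4 = weights
    return w1 * internal_ploc + w2 * external_ploc + w3 * introjected_ploc + w4 * mti

# Higher internal PLOC raises the score (H1); higher introjected PLOC lowers it (H3):
assert intention_score(6, 3, 1, 4) > intention_score(2, 3, 1, 4)
assert intention_score(4, 3, 6, 4) < intention_score(4, 3, 1, 4)
```

The sample assertions merely check that the composite moves in the hypothesized directions when one construct varies and the others are held fixed.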
The interviews indicate that age plays a role in the adoption of ePA applications. One interviewee, aged 50+, expressed that they are very unfamiliar with technology and media, which makes them conservative (I4). Another interviewee noted that older people might have problems handling such applications and would not use the ePA. In the literature, demographics such as age are commonly associated with privacy concerns. In Laric et al. [70], older participants expressed deeper privacy concerns regarding healthcare services. In contrast, Kordzadeh et al. [48] found younger users to have more significant privacy concerns, attributed to their privacy literacy. In Vodicka et al. [108], people under 55 expressed more severe privacy concerns about the physician’s notes from their treatments. Additionally, King et al. [109] found that age correlates with concern about health information privacy. The majority of studies have revealed that privacy concerns increase with age, which is in line with the findings of the phase 1 study. Hence,
Hypothesis 5.
Age positively influences the HIPC.
Health status was an emergent theme in our interviews. While one interviewee stated that they would use the application regardless of their health status, other interviewees expressed concerns. For example, one interviewee stated that people with certain chronic diseases would refrain from using the ePA:
People with serious chronic illnesses, psychological problems, and those who fall under social taboos will hardly use the app.
(I3)
People with severe medical conditions require frequent treatments [110]. Thus, these people generate the most personal health information and are likely to express higher privacy concerns [15]. In Flynn et al. [111], people who feared the stigma of mental illness were less likely to opt into an electronic psychiatric record. Other studies support that health status influences information sensitivity and privacy concerns [14,112]. Based on the literature and the phase 1 study findings, we posit that severe health conditions impact privacy concerns. Hence,
Hypothesis 6.
A severe health condition positively influences the HIPC.
Anderson and Agarwal [2] claimed that perceived information sensitivity affects privacy concerns and intentions to provide personal health information. In Dinev et al. [56], information sensitivity was associated with perceived risk. Caine and Hanania [27] found that the decision to share data in an EHR with particular parties was based on the perceived sensitivity of personal health information. Additionally, Bansal and Davenport [14] found a positive correlation between health information sensitivity and privacy concerns. Further, the "highly sensitive nature of personal medical data" increases one's concerns about violations and data misuse ([1], p. 29). One respondent demonstrated this:
If it says in your documents, you have some sexually transmitted disease or something, you may not want everyone to access it because it’s something that’s only your business.
(I2)
With support from the literature and the qualitative findings, we posit that perceived sensitivity of health information impacts privacy concerns. Hence,
Hypothesis 7.
Perceived information sensitivity positively influences the HIPC.
Trust and risk are often linked to privacy concerns [1,66,113]. Even though the APCO model positions risk as an outcome of privacy concerns, Smith et al. [65] recognized that prior studies support the influence of privacy risk on privacy concerns. Studies have shown that perceived privacy risk positively correlates with privacy concerns across different types of websites, including healthcare sites [50]. In the case of the ePA, both health professionals and health insurance companies handle personal health information. Hence,
Hypothesis 8a.
Perceived risk associated with health professionals positively influences the HIPC.
Hypothesis 8b.
Perceived risk associated with health institutions positively influences the HIPC.
Additionally, trust has been shown to be both an outcome and an antecedent of privacy concerns [65]. Prior studies found that trust in physicians and EHRs lowers privacy concerns [1,114]. One respondent in the qualitative study expressed that trust in their health insurer was a factor in using the ePA:
I would trust the health insurance companies. That plays an essential role for me.
(I1)
Hence,
Hypothesis 9a.
Trust in health professionals negatively influences the HIPC.
Hypothesis 9b.
Trust in health institutions negatively influences the HIPC.
Past studies show evidence that privacy concerns influence the adoption of health applications, including EHRs [15,53,115]. We consequently posit that with an increased HIPC, individuals will be less likely to opt in to the ePA. Hence,
Hypothesis 10.
The HIPC negatively influences intentions to adopt ePA applications.
Besides the variables mentioned above, the questionnaire of the phase 2 study also covered traditional control variables such as education and employment, as the literature advocates that these elements affect behavioral intentions [30,116]. Those factors were added to the questionnaire to enrich the understanding of ePA applications' usage intentions. Finally, we present the research model conceptualized from the hypotheses in Figure 5.

7. Phase 2 Quantitative Study

The second phase of the mixed-methods study aimed to answer the question, “Does the research model explain usage intentions of the ePA?” Therefore, we conducted a survey of potential German adopters to test the research model.

7.1. Research Methodology

To gather empirical data from potential adopters, we conducted an online survey with a traditional closed-question design. The questionnaire was published on SoSci Survey. The advantages of an online survey are a potentially broader target audience, straightforward distribution and analysis, and the collection of additional measures, such as the time needed to complete the survey. The survey was distributed via e-mail to a list of acquaintances and, simultaneously, to a market research panel to gather responses from participants with diverse socio-demographic backgrounds. To be included in the study, participants needed to be at least 18 years old and have a permanent residence in Germany. At the beginning of the survey, we presented a screenshot of our prototypical ePA application (Figure 3) with a brief explanation of the available features to establish a common understanding of the ePA.
The research model was analyzed using partial least squares structural equation modeling (PLS-SEM). The calculations were made in SmartPLS version 3.3.3 [117]. PLS regression is often used in information systems research to understand behavioral phenomena. We applied current recommendations and validation tools to analyze our model [118].

7.2. Measures and Pilot Testing

For measuring intention to use the ePA, we used a two-item scale based on the literature [30,119]. Endogenous motivation was measured with scales based on Ryan and Connell [32] that were extended to capture ePA adoption; those measures were strongly influenced by the findings of the phase 1 study. Items for external PLOC measured self-perceived reasons for usage intentions resulting from health insurance or physician recommendations. The internal PLOC scale measured reasons for using an ePA characterized by self-determined choice and volition. For measuring introjected PLOC, we used items dealing with conflicts between personal values and social norms. We added two-item scales to measure both dependence and emotional energy as characteristics of IT identity [84,86]. For measuring an individual's health information privacy concern (HIPC) in the ePA, we added a three-item scale for each dimension—secondary usage, control, errors, and improper access—from Fox and James [15,69]. We, however, omitted the awareness construct of the HIPC scale, since we did not find evidence for this attribute in the interviews. In addition, the questioning focused on individuals' perceptions of their concerns rather than their expectations, as proposed by Hong and Thong [69]. Capturing the individual's health status involved a three-item scale based on Bansal et al. [14]. We utilized a two-item scale per category to capture risk perception concerning health professionals and insurance providers and trust toward health professionals and insurance providers [15,53,69]. For measuring personal health information sensitivity, we utilized a 5-point Likert scale on which respondents rated the perceived sensitivity of different categories of health data, based on Laric et al. [70]. Finally, we added items measuring the demographic characteristics of age (four categories), employment (four categories), formal educational level (four categories), and prior m-health experience (yes/no).
At the end of the survey, we added items that gathered self-reporting seriousness checks to improve data quality [120]. All scales are presented in detail in Appendix F.
The survey used validated construct scales from the literature where applicable. Several best practices were applied to avoid common-method bias [121,122,123]: The questionnaire was designed to maximize user engagement and minimize task difficulty. A "good cover story" ([123], p. 562) on the opening page of the survey aimed to encourage respondents to answer accurately and stay motivated. The introductory text was both descriptive and motivating by emphasizing the respondent's desire for self-expression [123]. We applied clear and concise language and avoided ambiguous or unfamiliar terms. We separated the questionnaire into parts and repeatedly displayed the image of the prototype in the hope of diminishing "effects of involuntary memory-based and perceptual biases" ([123], p. 563). The questionnaire was pilot tested to validate the instrument. We conducted two on-site and two remote pilot tests. The user tests provided feedback that resulted in the rewording of items and clarified descriptions. Participants reported difficulties with one item from the introjected PLOC scale, which we decided to drop from the questionnaire. The questionnaire was then revised until judged satisfactory.

7.3. Sample

The sample was intended to be "probabilistic" [46]. The heterogeneity of the sample could be verified by the descriptive analysis of the survey data. The external validity of the sample was reasonably ascertained by comparing the sample with data on German citizens to assure that it represented the German population (see Appendix G). The online survey was opened 480 times. A total of 289 participants commenced the survey, which corresponds to a response rate of 60%, though the click rate of a survey is a vague metric. Among those participants, 250 respondents finished the last page of the survey. Incomplete responses (n = 2) were then removed, resulting in a completion rate of 86%. For data cleaning, we followed the practice that all cases should be retained unless evidence suggests a case is aberrant [124]. Leiner ([125], p. 242) proposed a "relative speed index" to eliminate potentially meaningless cases by completion time. We chose a speed index of 2.00 and removed n = 17 responses with completion times two times faster than the median completion time. We also removed cases from respondents who did not give their consent or self-reported their answers as meaningless. After data cleaning, 222 responses were used for further analysis. Participants' demographic characteristics are shown in Appendix G. Basic descriptive statistics (mean, standard deviation) are presented in Appendix F.
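The cleaning rules above can be sketched compactly. The snippet below is an illustrative sketch only, not the study's actual pipeline; the field names (`seconds`, `consent`, `serious`) are hypothetical, and only the relative speed index of 2.00 follows the procedure described in the text.

```python
import statistics

def clean_responses(responses, speed_index=2.0):
    """Apply the data-cleaning rules described above.

    `responses` is a list of dicts with hypothetical keys:
    'seconds' (completion time), 'consent' (bool), and
    'serious' (bool, the self-reported seriousness check).
    """
    median_time = statistics.median(r["seconds"] for r in responses)
    # Relative speed index (Leiner): drop cases completed more than
    # `speed_index` times faster than the median respondent.
    threshold = median_time / speed_index
    return [
        r for r in responses
        if r["seconds"] >= threshold and r["consent"] and r["serious"]
    ]
```

With a speed index of 2.00, a case is removed only when its completion time is less than half the median time, or when consent or the seriousness check fails.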

7.4. Preliminary Analysis Validation

To ascertain the quality of the quantitative results, we evaluated a range of reliability measures to test the convergent and discriminant validity of the scales [121]. We began by assessing convergent validity through the multi-item construct quality (see Appendix H). Cronbach's alpha revealed undesirable internal consistency (α < 0.600) for both the introjected PLOC and the HIPC-control scale. A low alpha indicates poor inter-relatedness between items or heterogeneous constructs [126]. For both scales, we improved internal consistency by dropping one item from the construct. We then further assessed construct reliability by computing the composite reliability and average variance extracted (AVE) scores. The composite reliability should exceed 0.700 and be larger than the AVE [124], which was the case for all constructs. We further obtained the outer loadings and t-statistics for all items across each construct. Loadings above 0.700 are often recommended, but lower values can be sufficient [124]. As Appendix I reports, all items had outer loadings above 0.700 and were significant at the p < 0.05 level.
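To make the alpha criterion concrete, the following minimal sketch computes Cronbach's alpha from raw item scores. The study itself obtained these statistics from SmartPLS; this function merely illustrates the underlying formula.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for k items, each given as a list of respondent scores.

    alpha = k / (k - 1) * (1 - sum of item variances / variance of sum scores)
    """
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent sum score
    item_variance_sum = sum(pvariance(scores) for scores in items)
    return k / (k - 1) * (1 - item_variance_sum / pvariance(totals))
```

Dropping an item that correlates poorly with the rest of the scale raises alpha, which is exactly the remedy applied above to the introjected PLOC and HIPC-control scales.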
We also calculated the tolerance levels and variance inflation factors (VIF) to check for multicollinearity. The threshold of 10 [127] was exceeded for two of the HIPC-access items. The tolerance levels were all greater than 0.10, except for the aforementioned items, indicating that multicollinearity is generally not an issue. As these two items are used to form a third-order factor to measure the HIPC, we did not consider the VIF problematic. Thus, all items were retained for further analysis.
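For intuition, in the simplest case of one predictor regressed on a single other predictor, the auxiliary R² equals the squared Pearson correlation, so the VIF and tolerance reduce to simple expressions. This is a didactic sketch of the two-predictor case, not the multi-predictor computation performed in SmartPLS.

```python
from math import sqrt
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

def vif_two_predictors(x, y):
    """VIF = 1 / (1 - R^2); with one other predictor, R^2 is just r^2.
    Tolerance is the reciprocal of the VIF."""
    r2 = pearson_r(x, y) ** 2
    vif = 1.0 / (1.0 - r2)
    return vif, 1.0 / vif
```

Values stay unproblematic as long as the VIF remains below 10 and the tolerance above 0.10, the thresholds used above.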
To examine the discriminant validity, we conducted a Fornell–Larcker test (see Appendix J). A latent construct should better explain the variance of its indicator than the variances of other latent constructs [121,128]. The average variance extracted (AVE) from each of the latent constructs should be higher than the highest squared correlation with any other latent variable. Our test ensured that the square root of the AVE exceeded all correlations with other latent constructs, and discriminant validity was given.
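The Fornell–Larcker test can be expressed as a small predicate over AVE values and latent variable scores. This is a hedged sketch; the construct names and data layout are hypothetical, and the study performed the test in SmartPLS.

```python
from math import sqrt
from statistics import mean

def _corr(x, y):
    """Absolute Pearson correlation of two equal-length score lists."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return abs(num / den)

def fornell_larcker_ok(ave, scores):
    """Fornell-Larcker criterion: for every construct pair, the square root
    of each construct's AVE must exceed the inter-construct correlation.

    ave: dict mapping construct name -> AVE
    scores: dict mapping construct name -> latent variable scores
    """
    names = list(ave)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            r = _corr(scores[a], scores[b])
            if sqrt(ave[a]) <= r or sqrt(ave[b]) <= r:
                return False
    return True
```

Discriminant validity holds when the predicate is true for all construct pairs, as reported for our model.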

7.5. Model Results

The structural model results are summarized in Table 2 and Figure 6.

8. Discussion

The mixed-methods design aimed to discover individuals' intentions toward using ePA mobile applications. The qualitative study uncovered a range of factors influencing usage intentions, from which we formulated 12 hypotheses. The results of the quantitative study show overall support for most hypotheses. We conducted the qualitative analysis first, followed by the quantitative analysis (see Table 3) [46]. The results show consistency but also reveal some incompatible findings. Overall, we found that largely the same set of parameters was significant in both the qualitative and quantitative studies. Even though the questionnaire was developed from the findings of the qualitative study, we found some significant differences between the studies: although an individual's health status emerged as a critical factor for ePA adoption in the qualitative study, it was not significant in the second study. Similarly, one's positive self-identification with mobile devices ("mobile IT identity") was not significant in the quantitative study. A limitation of our study is that we did not replicate the divergent results with a new dataset [46,129]. However, we offer a theoretical explanation to reconcile the inconsistent findings.
Overall, our meta-inferences are congruent with our research model. We successfully added value beyond the individual studies with the integration of the qualitative and quantitative research strands. Considering that the phase 1 and phase 2 study data were from different sets of respondents and different data-collection approaches, the similarity implies that we utilized solid theoretical models as our research foundation. The mixed-methods helped us determine and understand factors that influence ePA usage intentions. With the qualitative study, we were able to determine a set of aspects and their relevance. In contrast, the quantitative study empirically examined the research model that resulted from the qualitative study to determine what factors influence ePA usage intentions. Table 3 summarizes our meta-inferences.
In particular, the results highlight the predictive power of motivation. Some respondents from the phase 1 study expressed intrinsic PLOC, i.e., "joy" in accessing their data and using the ePA. Other respondents expressed indicators relating to perceived usefulness and thus to identified PLOC. For example, one respondent noted that a digital health record helps them to keep track of their data, even when consulting different physicians:
I have moved several times in my life now, even long distances. In the end, I always had to have everything handed over to me in physical form by the family doctor I was seeing.
(I3)
Both intrinsic and identified PLOC were crucial factors for predicting ePA usage intentions across the studies. Some respondents indicated that they would consider adopting the ePA when advised to, indicating the motivational power of external PLOC. These findings were consistent across both strands of our mixed-methods study. However, internal PLOC was a stronger predictor than external PLOC. These findings are consistent with the literature on external rewards [39,130]. We found strong indicators of introjected PLOC hindering ePA adoption in the qualitative study. The respondents repeatedly expressed uneasiness resulting from a misalignment of perceived social influences and personal values:
I think if you are seriously ill and you carry this application around with you all the time, it’s like carrying your X-rays around with you all the time. I don’t like the idea.
(I2)
People with serious chronic illnesses, psychological problems, and those who fall under social taboos will hardly use the app.
(I3)
The quantitative study supported the negative impact of introjected PLOC. In particular, political pressure and shame emerged as two factors hindering ePA adoption in the quantitative study.
Contrary to our expectations, the meta-inferences for the "mobile IT identity" and "health status" variables indicate that these factors do not influence ePA adoption or the HIPC. We now attempt to explain these meta-inferences: (1) The low impact of one's mobile IT identity can be explained by the not-so-technical nature of a health record: even though the ePA is distributed as a mobile application, such an application does not require the self-identity that is usually attributed to "mobile IT identity". We assume that, in contrast, heavily gamified ePA applications might demand positive perceptions of IT in a more pronounced manner. (2) An individual's health status did not have a significant influence on the HIPC. The share of subjects with self-reported severe health conditions was generally low in our sample; thus, our quantitative study may have failed to detect an effect for this variable. We argued that people with severe health conditions would express higher privacy concerns; however, none of the interviewees from study 1 reported severe chronic diseases themselves but thought that people with such conditions might have concerns. On the other hand, populations with multiple chronic conditions may have more motivation to use the ePA to facilitate patient–doctor communication and to control privacy settings themselves. Whether a severe health condition has a positive impact on privacy concerns, on usage intentions, or on both is not supported by the meta-inferences.
The quantitative study showed evidence that age has an impact on the HIPC. However, contrary to our assumptions, higher age led to a lower HIPC. Our literature review showed conflicting findings for the impact of age on an individual's privacy concerns in the health context. One plausible explanation, supported by the literature, is that younger people have higher privacy concerns owing to their greater privacy literacy [48].
The findings on perceived risk and trust concerning health professionals and one's health insurance were congruent across both strands of research. This is a strong indicator that risk and trust are linked to privacy concerns. Our findings are consistent with the literature [1,66,114,131]. For instance, a satisfying experience with one's health insurance can lead to less resistance to adopting an ePA distributed by that health insurer, as stated by one respondent:
I have personally been very, very satisfied with my health insurance company over the years. I am sure that it works well, and I can download the application with confidence. In contrast, for third-party providers, I would have to deal with who is behind the app.
(I3)
Even though the APCO model positions risk as an outcome of the privacy concern, we demonstrated that privacy risk influences privacy concern. This impact was also theorized by Smith et al. [65].
Our findings from both research strands show that the attributed information sensitivity of health data adds to the HIPC towards the ePA. Individuals who perceived their health information as more sensitive were less likely to adopt the ePA. This finding is consistent with the literature [2,14,27,56]. Overall, information sensitivity and the general demand for privacy differed among the respondents in the qualitative study, which was also reinforced by the quantitative findings. We discussed that the perceived privacy risk and the privacy calculus are less pronounced where electronic patient records are relatively new: individuals tend to weigh the benefits of the ePA more heavily than privacy concerns. However, those societal values may change over time during the diffusion of the ePA. Additionally, secondary usage of one's health data can result in uneasiness, for instance, when health data are used for data-mining purposes or when the data affect the services delivered by the health insurance.
In the qualitative study, the control that one could exercise over their health data was an essential factor in ePA usage intentions:
I would like to decide what the doctor can get from me and what insight he can get from me.
(I4)
Additionally, existing literature demonstrated that “patients want granular privacy control over health information in electronic medical records” [27].
The concept of perceived ownership of data in the ePA was also present in our interviews. For instance, one respondent mentioned that they did not feel up to exerting control over their data:
Do I wish I had control over it myself when my family doctor has the data? I would like to have confidence that the control will be realized by someone else.
(I2)
Tang et al. ([132], p. 125) noted that, with the patient having data sovereignty, “different mindsets and levels of trust” will become mandatory. Fox and James [15] researched the HIPC and found that interviewees have differing perceptions of data ownership in the context of EHRs. Perceptions ranged from beliefs that the patient is the owner, to dual ownership, to the perception that the physician owns the data. Fox and James [15] also found that interviewees seeing themselves as single data owners expressed a strong desire for privacy and were highly concerned about unauthorized secondary use, improper access, and control.
Our research model could explain 77% (R² = 0.771) of the variance in our sample, which is a satisfactory fit of our theoretical model. Thus, our study demonstrates that we used solid underlying theory, i.e., the PLOC framework, to understand the intentions toward adopting the ePA.
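The R² reported here is the standard coefficient of determination. As a minimal illustration of what that statistic measures (the value above comes from the PLS-SEM estimation in SmartPLS, not from this sketch):

```python
from statistics import mean

def r_squared(observed, predicted):
    """Coefficient of determination: the share of variance in the observed
    outcome that the model's predictions account for."""
    y_bar = mean(observed)
    ss_res = sum((y - p) ** 2 for y, p in zip(observed, predicted))
    ss_tot = sum((y - y_bar) ** 2 for y in observed)
    return 1 - ss_res / ss_tot
```

A value of 0.771 means that roughly 77% of the variation in usage intentions is accounted for by the model's predictors, with the remainder unexplained.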

9. Limitations and Future Research

Our study has some limitations, which create opportunities for future research. First, we interviewed only four individuals in the first study, making it unlikely that theoretical saturation was reached. Since study 1 was the less dominant part of the overall study design, this limitation was acceptable. Further, we did not ask health-status-related questions out of ethical considerations, which may have resulted in an incomplete picture of the impact of health-related factors on ePA usage intentions. Next, we showed and described the prototype from Figure 3 to the participants in a detailed manner. However, the prototype was static, and the remote setting has downsides regarding user comprehension. Further studies should prepare a high-fidelity prototype and consider the impacts of participants' digital skills and literacy levels.
Second, our sample in study 2 contained 222 usable responses from German citizens. Even though we ascertained external validity by comparing demographics (see Appendix G), our sample had an imbalance in the age distribution. Further, we noticed that the share of respondents who reported severe health conditions was generally low. Additionally, an online survey requires a certain level of IT literacy. This is particularly important because severe health conditions, low IT literacy, and older age may co-occur. Similarly, we measured health status with a two-item scale (see Appendix F) that offered limited insight into the population's actual health conditions. The scale used in the questionnaire mixes participants' chronic and acute diseases and captures neither multimorbidity nor polypharmacy. Further studies should improve the measures used to capture health status.
Third, even though we discovered several antecedents that impact privacy concerns, we did not capture every antecedent. This limitation is shared by other studies examining the antecedents of the HIPC [15]. We thus encourage exploring privacy antecedents more comprehensively. Lastly, an individual's perceptions evolve due to changing societal values or recent events. In addition, perceptions change over time as a new technology diffuses. Currently, Germany's ePA is in an early testing stage. Even though some issues have already been discussed in public, the concept of the ePA is not widespread yet. Thus, future research could apply a longitudinal study to gain further insights into users' adoption intentions.

10. Conclusions

The adoption of the ePA is a complex task, and the launch of a new technology such as the ePA faces significant adoption challenges. With a mixed-methods design and by developing a contextual model, we gathered evidence that different types of motivation, the HIPC, and privacy antecedents affect usage intentions regarding the ePA. Most importantly, a profound understanding of the different types of motivation is critical to understanding individual usage intentions, since motivational variables were shown to explain the majority of the variance in our sample.
The findings showed the integral positive effect of internal PLOC: individuals who feel volitional about using an electronic health record are more likely to adopt it. Consequently, policymakers must understand which types of motivation are critical predictors of ePA adoption and use. The findings demonstrate that policymakers have to provide both internal and external incentives. We believe that the results of this work contribute to the growing body of research on technology adoption in the field of the ePA in the German context.

Supplementary Materials

Funding

This research received no external funding.

Institutional Review Board Statement

Ethical review and approval were waived for this study because the responsible ethics committee does not handle approval requests from students. The ethics committee was nevertheless approached, and the process was documented.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the studies.

Data Availability Statement

The survey data are available in the Supplementary Materials to enable reproduction of the study’s results.

Acknowledgments

I thank Thomas Mandl and Joachim Griesbaum, Faculty of Linguistics and Information Science, University of Hildesheim, Germany, for supervising my final thesis, which this paper was derived from. I further thank DongBack Seo, Department of Management Information Systems, Chungbuk National University, South Korea, for giving me inspiration for this work during her mentoring.

Conflicts of Interest

The author declares no conflict of interest.

Appendix A

Table A1. Design Decisions for the Mixed-Methods-Design (Adapted from [46]).
Property | Decision Consideration | Other Design Decision(s) Likely to Affect Current Decision | Design Decision and Reference to the Decision Tree
Step 1: decide on the appropriateness of mixed-methods research
Research questions | Qualitative or quantitative method alone was not adequate for addressing the research question. Thus, we used a mixed-methods research approach. | None | Identify the research questions
  • We wrote the qualitative and quantitative research questions separately first, and a mixed-methods research question second.
  • The qualitative research question was: “What are the salient factors determining an individual’s intentions toward using the ePA?”
  • The quantitative research question was: “Does the research model explain usage intentions of the ePA?”
  • The mixed-methods research question was: “Are the factors identified in the qualitative study and as captured through the research model supported by the results of the quantitative study?”
  • We wrote the research questions in the question format.
  • The quantitative research question depended on the results of the qualitative research question. The mixed-methods question depended on the results of both qualitative and quantitative research questions.
  • The relationship between the questions and the research process is predetermined.
Purposes of mixed-methods research | Mixed-methods research helps in seeking convergence of results from different methods. We used mixed-methods research to develop hypotheses for empirical testing using the results of the qualitative study. | Research questions | Developmental approach: mixed methods with the findings from one method used to help inform the other method.
Epistemological perspective | The qualitative and quantitative components of the study used different paradigmatic assumptions. | Research questions, purposes of mixed methods | Multiple paradigm stance.
Paradigmatic assumptions | The researcher believed in the importance of research questions and embraced various methodological approaches from different worldviews. | Research questions, purposes of mixed methods | Dialectic stance (an interpretive and grounded-theory perspective in the qualitative study and a positivist perspective in the quantitative study).
Step 2: develop strategies for mixed-methods research designs
Design investigation strategy | The mixed-methods study aimed to develop and test a theory. | Research questions, paradigmatic assumptions | Study 1: exploratory investigation. Study 2: confirmatory investigation.
Strands/phases of research | The study involved multiple phases. | Purposes of mixed-methods research | Multistrand design.
Mixing strategy | The qualitative and quantitative components of the study were mixed at the data-analysis and inferential stages. | Purposes of mixed-methods research, strands/phases of research | Partially mixed methods.
Time orientation | We started with the qualitative phase, followed by the quantitative phase. | Research questions, strands/phases of research | Sequential (exploratory) design.
Priority of methodological approach | The qualitative and quantitative components were not equally important. | Research questions, strands of research | Dominant–less-dominant design, with the quantitative study being the more dominant paradigm.
Step 3: develop strategies for collecting and analyzing mixed-methods data
Sampling design strategies | The samples for the quantitative and qualitative components of the study differed but came from the same underlying population. | Design investigation strategy, time orientation | Purposive sampling for the qualitative study, probability sampling for the quantitative study.
Data collection strategies | Qualitative data collection in phase 1; quantitative data collection in phase 2. | Sampling design strategies, time orientation, phases of research | Qualitative study: closed- and open-ended questions with a pre-designed interview guideline. Quantitative study: closed-ended questioning (i.e., traditional survey design).
Data analysis strategy | We analyzed the qualitative data by finding broader categories using the software Atlas.ti. We analyzed the qualitative data first and the quantitative data second. | Time orientation, data collection strategy, strands of research | Sequential qualitative–quantitative analysis.
Step 4: draw meta-inferences from mixed-methods results
Types of reasoning | In our analysis, we focused on developing and then testing/confirming hypotheses. | Design investigation strategy | Inductive and deductive theoretical reasoning.
Step 5: assess the quality of meta-inferences
Inference quality | The qualitative inferences met the appropriate qualitative standards, and the quantitative inferences met the appropriate quantitative standards. We assessed the quality of meta-inferences. | Mostly primary design strategies, sampling design strategies, data collection strategies, data analysis strategies, types of reasoning | We used conventional qualitative and quantitative standards to ensure the quality of our inferences: design and explanatory quality; sample integration; inside–outside legitimation; multiple validities.
Step 6: discuss potential threats and remedies
Inference quality | We discussed all potential threats to inference quality in the form of limitations. | Data collection strategies, data analysis strategies | Threats to sample integration; sequential legitimation.

Appendix B. Interview Guideline

1.
How would you describe your own privacy, especially on the Internet?
2.
Has your information ever been used in an inappropriate manner?
3.
Has your health information ever been used in an inappropriate manner?
  • How did you react/have you reacted?
4.
How important is the smartphone in your life?
5.
Are you currently using, or have you ever used, any of these M-Health technologies?
  • Users: Which technologies? Which data? What benefits? What reasons for use?
  • Former users: Which technologies? Which data? Any advantages? Reasons for stopping use?
  • Non-users without experience: Would you ever use these technologies? Which ones, why, and what perceived benefits?
6.
Do you believe that you can improve your health through your own behavior?
7.
Do you use a personal health record on your cell phone?
  • Can you tell us something about your experience with the app?
Presentation of the prototype (Figure 3)
8.
Which aspects of an ePA do you like? Which do you not like?
9.
What reasons would play a role in using the electronic patient record and the app?
  • What role does your interest in technology play?
  • What role do health factors play?
  • What role does the publisher of the app play?
10.
Can you imagine your doctor prescribing via an app in the future?
  • What are the advantages?
11.
What are your current concerns regarding the ePA app?
12.
How would you describe your concerns about protecting your health data?
13.
Which groups should have access to your health data, in your opinion?
14.
Is it important for you to know how your health data are used and shared?
15.
Do you think that you currently have control over your health data?
16.
How much control over your health data would you like to have?
17.
Is it important for you to be able to restrict which individual documents an individual doctor can access?
18.
When the ePA is introduced, would you give permission for your health data to be recorded?
19.
How would you use the ePA app?
20.
Do you believe that sharing data with physicians/therapists is associated with risks or negative consequences? (Why/what risks?)
21.
What would you do if the app became mandatory on your smartphone tomorrow?

Appendix C

Table A2. Interviewees from Qualitative Study.
# | Profile | Age | Insurance Status | Prior PHR Experience | Prior Privacy Invasion | Adoption Intention
I1 | Student (IT-related) | 18–29 | Statutory | No | No | Yes
I2 | Public employee | 30–49 | Private | No | Yes | No
I3 | Student (business-related) | 18–29 | Statutory | No | No | Yes
I4 | Retiree | 50–69 | Statutory | No | Yes | Yes

Appendix D

Table A3. Emergent Themes from the Interviews.
Broader Category of Variables | Emergent Variable | I1 | I2 | I3 | I4
Attitude | Attitude
Perceived Usefulness | Perceived Usefulness
Privacy Sensitivity | Privacy Sensitivity
Privacy Sensitivity | Privacy Risk Awareness
PLOC | Interest in accessing data through own person
PLOC | Likes to have full-fledged health manager
PLOC | Likes to have sovereignty over data
PLOC | Interest in efficient treatments
PLOC | Shame
PLOC | Political pressure
Health Status | Medical history/Health Status
Demographics | Age
Mobile IT identity | Dependence
IT experience | M-Health experience
IT experience | IT experience
Inherent innovativeness | Interest in new innovations
Health Belief | Health Belief/Self-Efficacy
Prior privacy invasion | Experience
Prior privacy invasion | Response
Information sensitivity | Overall perception of sensitivity
Information sensitivity | Sensitive data types
HIPC | General HIPC
HIPC | Desire for Privacy
HIPC | Collection
HIPC | Secondary use
HIPC | Improper access
HIPC | Errors
HIPC | Control
HIPC | Awareness
Perceived Ownership | Perception of Ownership
Legislation awareness | Legislation awareness
Trust [health institution] | Trust
Trust [health professionals] | Trust
Trust [technology vendors] | Trust
Risk perception [health institution] | Risk perception
Risk perception [health professionals] | Risk perception
Risk perception [technology vendors] | Risk perception
Usability | Usability

Appendix E

Table A4. Selected Quotes from the Interviews.
Category/Variable | Selected Quotes
Attitude | “I like the fact that all health information is stored in a digital file.” (I1)
 | “Well, I think the idea of centralization is key; I think it’s cool.” (I3)
Inherent innovativeness | “People who are critical about technology and digitization will not be able to do much with it and will not want to use it.” (I3)
Privacy sensitivity | “My concern is to ensure that as few companies as possible have access to my data.” (I1)
Mobile IT identity | “You don’t feel good if you don’t have [your smartphone] with you. And that’s kind of a weird feeling.” (I2)
Health Belief | “I am of the opinion that my own behavior has a serious influence on my own health.” (I1)
Internal PLOC | “I like the fact that all health information about the patient can be stored in a digital file, and the patient can, in theory, grant access to any doctor, any pharmacy, wherever necessary.” (I1)
 | “I like the thought of seeing which current diagnoses, doctor’s letters, or whatever documents exist about me.” (I1)
 | “I have moved several times in my life now, even longer distances. And, in the end, I always had to have everything handed over to me in physical form by the family doctor I was seeing.” (I3)
Introjected PLOC | “I think if you are seriously ill and you carry this app around with you all the time, it’s like carrying your X-rays around with you all the time. I don’t like the idea.” (I2)
 | “And if someone is still in employment and then has had a psychological rehab, I don’t know if everyone wants others to read that.” (I4)
Health Status | “People with serious chronic illnesses, psychological problems, that is, those who fall under social taboo topics, will hardly use the app.” (I3)
HIPC Desire for Privacy | “I would feel safer now if the health insurance companies simply had access to what they now have in analog form.” (I1)
HIPC Control | “I’d like to decide for myself what the doctor can get from me, what insight he can get from me.” (I4)
HIPC Errors | “I can look at the file, and [in case of errors] I could check it. I could do something about it.” (I4)
HIPC Collection | “I know that many people are afraid that their contributions will increase as a result, or something similar.” (I1)
HIPC Improper Access | “Yes, the protocol is reasonably important, as I don’t want someone [looking through documents] all the time when I give access to someone. Although, of course, it could happen in my family doctor’s office that the trainee can read through everything, and I will never notice.” (I2)
 | “You can only open the ePA app when the phone is unlocked. Nevertheless, I find that these very sensitive personal data are very close to me, so that somebody might look into them.” (I2)
Information Sensitivity | “If it says in your documents that you have some sort of sexually transmitted disease or something, you may not want everyone to access it because it’s something that’s only your business.” (I2)
Perceived Ownership | “For me personally, it should be mainly the doctor who should be able to interact with this file.” (I1)
 | “Do I wish control over it myself when my family doctor has the data? I would actually like to have confidence that the control will be realized by someone else.” (I2)
Risk Perception (Health professionals) | “Personally, I don’t think I would have a problem if my pharmacy knew what my medical history is.” (I1)
 | “So currently, I have no worries because they are in a drawer or with some doctor. I’m not worried about that; I don’t want to be. I’ll just assume that the doctors are abiding by the obligation of confidentiality.” (I3)
Risk perception (Technology vendors) | “I would personally reconsider my decision if the provider of the operating system, i.e., Apple or Google, had access to my data.” (I3)
Trust (Health professionals) | “I have confidence in the doctors where I have been. When I notice that a doctor is unpleasant, I go there only once, and then he will not see me again.” (I4)
 | “I am still very unsure about these media, so I may not trust the media, unlike the doctors I go to.” (I4)
Trust (Institution) | “I trust the health insurance companies; that plays an important role for me.” (I1)
 | “I would feel more comfortable if there was an app from my own health insurance company, which would also take responsibility for it. That’s like in banking; it’s just a matter of trust.” (I3)
 | “I am personally very, very satisfied with my health insurance company over the years. I am sure that it works well, and I can download the app with confidence. With third-party providers, I would have to deal with who is behind the app.” (I3)
Trust (Technology vendors) | “If the app is supported by my health insurance company and is serious on a certain governmental, institutional level, then I would use the app. If any new third-party provider were to come around the corner, probably not.” (I3)

Appendix F

Table A5. Scale Items for Construct Measures.
Name | Item | Mean | Std. dev.
Intention (cf. [30,119])
Int1 | I can imagine using the ePA app regularly. | 3.840 | 1.281
Int2 | I plan to use the ePA app in the future. | 3.606 | 1.202
External PLOC
Stem: I can imagine using the app
EPLOC1 | because my health insurance recommends it. | 3.651 | 1.156
EPLOC2 | because it is recommended by my family doctor or other health professionals. | 3.913 | 1.148
Internal PLOC
Stem: I can imagine using the app
Identified PLOC:
IPLOC1 | because I am interested in accessing my health data. | 4.108 | 1.302
IPLOC2 | because I personally like using the app. | 3.580 | 1.242
IPLOC3 | because I think it is important to me. | 3.623 | 1.141
IPLOC4 | because I want to share my health data with other health professionals. | 3.977 | 1.166
IPLOC5 | because I think it will result in more efficient treatments. | 4.059 | 1.259
IPLOC6 | because I like to have sovereignty over my data. | 3.863 | 1.206
IPLOC7 | because I would like to have all my health data in one central place. | 4.068 | 1.279
Intrinsic PLOC:
IPLOC8 | because I enjoy using an ePA. | 3.517 | 1.094
Introjected PLOC
IJPLOC1 | I would feel bad if I didn’t use the ePA app. a | 2.204 | 1.145
IJPLOC2 | I would use the ePA app because people I care about think I should use the app. b | – | –
IJPLOC3 | I feel political pressure from the government to use the app. | 1.848 | 1.288
IJPLOC4 | I find sharing my patient records and having constant access to my health history burdensome. | 2.231 | 1.265
Mobile Technology Identity [84,86]
Stem: Thinking about myself in relation to a mobile device,
Dependence:
ITDep1 | I feel dependent on the mobile device. | 3.027 | 1.168
ITDep2 | I feel that I need the device. | 3.505 | 1.030
Emotional Energy:
ITEmo1 | I feel enthusiastic about the device. | 3.680 | 0.867
ITEmo2 | I feel confident. | 4.312 | 1.239
Health information privacy concern [15,69]
SUse1 | I am concerned that my health information may be used for other purposes. | 3.518 | 1.275
SUse2 | I am concerned that my health information will be sold to other entities or companies. | 3.376 | 1.232
SUse3 | I am concerned that my health information will be shared with other entities without my authorization. | 3.507 | 0.788
Control1 | It is important to me that I have control over the health data I provide through the app. | 4.532 | 0.665
Control2 | It is important to me that I have control over how my health information is used or shared. | 4.633 | 1.244
Control3 | I fear a loss of control if my health data is available through the ePA app. c | 2.977 | 1.149
Errors1 | I am concerned that my data in the ePA app may be incorrect. | 2.792 | 1.145
Errors2 | I am concerned that there is no assurance that my health information in the ePA app is accurate. | 2.870 | 1.264
Errors3 | I am concerned that any errors in my health data cannot be corrected. | 2.811 | 1.241
Access1 | I am concerned that my health data in the app is not protected from unauthorized access. | 3.550 | 1.197
Access2 | I am concerned that unauthorized persons may gain access to my health data. | 3.639 | 1.249
Access3 | I am concerned that there are insufficient security measures in place to ensure that unauthorized persons do not have access to my health data. | 3.516 | 0.915
Health status (cf. [14])
HStat1 | I experience major pains and discomfort for extended periods of time. | 1.576 | 0.886
HStat2 | I believe that my general health is poor. | 1.650 | 0.845
Risk perceptions (cf. [15,53,69])
RiskHP1 | It would be risky to disclose my personal health information to health professionals. | 1.918 | 0.926
RiskHP2 | There would be too much uncertainty associated with giving my personal health information to health professionals. | 1.991 | 1.226
RiskIn1 | It would be risky to disclose my personal health information to my health insurance. | 2.512 | 1.240
RiskIn2 | There would be too much uncertainty associated with giving my personal health information to my health insurance. | 2.598 | 0.973
Trust perceptions (cf. [15,53,69])
TrustHP1 | I know health professionals are always honest when it comes to using my health information. | 3.505 | 0.798
TrustHP2 | I know health professionals care about patients. | 3.782 | 0.797
TrustHP3 | I know health professionals are competent and effective in providing their services. | 3.696 | 0.843
TrustHP4 | I trust that health professionals keep my best interests in mind when dealing with my health information. | 3.742 | 0.978
TrustIn1 | I know my health insurance is always honest when it comes to using my health information. | 3.194 | 0.943
TrustIn2 | I know my health insurance cares about customers. | 3.395 | 0.973
TrustIn3 | I know my health insurance is competent and effective in providing their services. | 3.463 | 1.053
TrustIn4 | I trust that my health insurance keeps my best interests in mind when dealing with my health information. | 3.250 | 1.226
Information sensitivity [70]
Prompt: For each type of health information, choose the number that indicates how sensitive you feel this information is.
InfoSen1 | Current health status | 3.581 | 1.248
InfoSen2 | Test results | 3.764 | 1.287
InfoSen3 | Health history | 3.780 | 1.351
InfoSen4 | Mental health | 3.986 | 1.350
InfoSen5 | Sexual health | 3.854 | 1.381
InfoSen6 | Genetic information | 3.800 | 1.460
InfoSen7 | Addiction information | 3.712 | 0.806
Demographics/Controls
Age | I am: (1 = 18–24, 2 = 25–39, 3 = 40–59, 4 = 60+)
Employment | What describes your employment status best? (1 = Student, 2 = Retired, 3 = Employed, 4 = Other)
Education | What is the highest level of education you have completed to date? (1 = School, 2 = Abitur, 3 = Bachelor’s, 4 = Master’s/Diploma and above, 5 = N/A)
M-health | Do you have experience using health apps or smartwatches for sport? (1 = No experience, 2 = Experience)
HInsurance | Are you privately or statutorily insured? (1 = Statutory, 2 = Private)
Data Quality [120]
Consent | I hereby confirm that I am at least 18 years old, that I have read and understood the declaration of consent, and that I am a permanent resident of Germany. (1 = No, 2 = Yes)
DQRelunc | Now let’s be honest: Did you enjoy participating in this study? (1 = No, 2 = Rather no, 3 = Rather yes, 4 = Yes)
DQMeaningless | Did you perform all tasks as asked in each instruction? (1 = I completed all tasks as required by the instructions, 2 = Sometimes I clicked something because I was unmotivated or just didn’t know my way around, 3 = I frequently clicked on something so I could finish quickly)
a: Dropped after preliminary analysis. b: Dropped after pilot study. c: Dropped after preliminary analysis.

Appendix G

Table A6. Distribution of Sample and German Citizens.
Dimension | Subgroup | Sample (Absolute) | Sample (Share in %) | Germany (Share in %)
Age [in years] | 18–24 | 19 | 9% | 9%
 | 25–39 | 99 | 44% | 23%
 | 40–59 | 79 | 36% | 34%
 | 60+ | 25 | 11% | 34%
Health insurance | Statutory Health Insurance | 177 | 81% | 87%
 | Private Health Insurance | 44 | 19% | 11%
Education | With Graduation | 47 | 21% | –
 | Abitur | 55 | 25% | –
 | Bachelor’s degree | 46 | 21% | –
 | Master’s degree/diploma or above | 72 | 32% | –
 | Other | 2 | 1% | –
Employment | Student | 25 | 11% | –
 | Retired | 12 | 5% | –
 | Employed | 133 | 60% | –
 | Other | 52 | 24% | –
Prior M-Health Experience | Adopter of Wearables or M-Health Technology | 137 | 62% | –
 | Non-Adopter | 85 | 38% | –

Appendix H

Table A7. Reliabilities of Multi-Item Constructs.
Construct | Cronbach’s Alpha | Composite Reliability | Average Variance Extracted (AVE)
Access | 0.972 | 0.982 | 0.947
Control | 0.801 | 0.907 | 0.830
EPLOC | 0.849 | 0.930 | 0.869
Errors | 0.927 | 0.954 | 0.873
HealthStatus | 0.824 | 0.917 | 0.847
IJPLOC | 0.670 | 0.846 | 0.736
IPLOC | 0.944 | 0.953 | 0.718
IT Dep. | 0.788 | 0.904 | 0.825
IT Emo. | 0.629 | 0.842 | 0.728
InfoSensitivity | 0.950 | 0.959 | 0.770
Intention | 0.920 | 0.962 | 0.926
RiskHP | 0.922 | 0.962 | 0.927
RiskIn | 0.963 | 0.982 | 0.964
SUse | 0.952 | 0.969 | 0.913
TrustHP | 0.878 | 0.915 | 0.730
TrustIn | 0.911 | 0.937 | 0.789
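The three reliability statistics in Table A7 are standard functions of the item responses and the standardized outer loadings. A minimal sketch of how they are computed (Python with NumPy; the function names are mine, not from the paper, and the original analysis was run in a PLS tool), checked against the EPLOC loadings reported in Table A8:

```python
import numpy as np

def cronbachs_alpha(items: np.ndarray) -> float:
    """items: n_respondents x k_items matrix of scale scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def composite_reliability(loadings: np.ndarray) -> float:
    """loadings: standardized outer loadings of one construct's items."""
    s = loadings.sum()
    return s ** 2 / (s ** 2 + (1 - loadings ** 2).sum())

def ave(loadings: np.ndarray) -> float:
    """Average variance extracted: mean squared standardized loading."""
    return (loadings ** 2).mean()

# Check against the two EPLOC loadings from Table A8 (0.933, 0.931):
eploc = np.array([0.933, 0.931])
print(f"CR  = {composite_reliability(eploc):.3f}")  # 0.930, as in Table A7
print(f"AVE = {ave(eploc):.3f}")                    # 0.869, as in Table A7
```

Reproducing the reported CR and AVE from the published loadings is a quick sanity check that the tables are internally consistent; Cronbach's alpha additionally needs the raw item responses, which are not printed here.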

Appendix I

Table A8. Loadings of the Multi-Item Constructs.
Item ← Construct | Loading | T Statistics | p-Value
Access1 ← Access | 0.977 | 228.766 | 0.000
Access2 ← Access | 0.976 | 232.272 | 0.000
Access3 ← Access | 0.966 | 121.777 | 0.000
Control1 ← Control | 0.878 | 12.868 | 0.000
Control2 ← Control | 0.943 | 79.645 | 0.000
Control3 (dropped from scale) | – | – | –
EPLOC1 ← EPLOC | 0.933 | 83.240 | 0.000
EPLOC2 ← EPLOC | 0.931 | 57.362 | 0.000
Errors1 ← Errors | 0.945 | 86.654 | 0.000
Errors2 ← Errors | 0.957 | 115.480 | 0.000
Errors3 ← Errors | 0.901 | 49.411 | 0.000
HealthStat1 ← HealthStatus | 0.893 | 3.013 | 0.003
HealthStat2 ← HealthStatus | 0.947 | 3.934 | 0.000
IJPLOC1 (dropped from scale) | – | – | –
IJPLOC2 (dropped from scale) | – | – | –
IJPLOC3 ← IJPLOC | 0.762 | 10.887 | 0.000
IJPLOC4 ← IJPLOC | 0.943 | 52.753 | 0.000
IPLOC1 ← IPLOC | 0.872 | 42.133 | 0.000
IPLOC2 ← IPLOC | 0.878 | 55.986 | 0.000
IPLOC3 ← IPLOC | 0.870 | 51.110 | 0.000
IPLOC4 ← IPLOC | 0.852 | 31.403 | 0.000
IPLOC5 ← IPLOC | 0.844 | 32.052 | 0.000
IPLOC6 ← IPLOC | 0.773 | 20.142 | 0.000
IPLOC7 ← IPLOC | 0.851 | 32.225 | 0.000
IPLOC8 ← IPLOC | 0.836 | 28.394 | 0.000
ITDep1 ← IT Dependency | 0.900 | 50.240 | 0.000
ITDep2 ← IT Dependency | 0.917 | 83.524 | 0.000
ITEmo1 ← IT Emo | 0.879 | 42.837 | 0.000
ITEmo2 ← IT Emo | 0.827 | 23.007 | 0.000
InfoSen1 ← InfoSensitivity | 0.886 | 53.566 | 0.000
InfoSen2 ← InfoSensitivity | 0.867 | 44.918 | 0.000
InfoSen3 ← InfoSensitivity | 0.870 | 38.467 | 0.000
InfoSen4 ← InfoSensitivity | 0.860 | 33.436 | 0.000
InfoSen5 ← InfoSensitivity | 0.890 | 47.105 | 0.000
InfoSen6 ← InfoSensitivity | 0.891 | 51.283 | 0.000
InfoSen7 ← InfoSensitivity | 0.878 | 42.569 | 0.000
Int1 ← Intention | 0.964 | 143.168 | 0.000
Int2 ← Intention | 0.961 | 113.566 | 0.000
RiskHP1 ← RiskHP | 0.957 | 94.154 | 0.000
RiskHP2 ← RiskHP | 0.969 | 171.948 | 0.000
RiskIn1 ← RiskIn | 0.982 | 196.133 | 0.000
RiskIn2 ← RiskIn | 0.982 | 221.358 | 0.000
SUse1 ← SUse | 0.948 | 102.306 | 0.000
SUse2 ← SUse | 0.952 | 89.689 | 0.000
SUse3 ← SUse | 0.966 | 142.763 | 0.000
TrustHP1 ← TrustHP | 0.868 | 3.451 | 0.001
TrustHP2 ← TrustHP | 0.861 | 3.550 | 0.000
TrustHP3 ← TrustHP | 0.818 | 3.512 | 0.000
TrustHP4 ← TrustHP | 0.869 | 3.455 | 0.001
TrustIn1 ← TrustIn | 0.886 | 2.384 | 0.017
TrustIn2 ← TrustIn | 0.902 | 2.392 | 0.017
TrustIn3 ← TrustIn | 0.859 | 2.407 | 0.016
TrustIn4 ← TrustIn | 0.905 | 2.388 | 0.017
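The t-statistics above are obtained by bootstrapping the loading estimates: the sample is resampled with replacement, the loadings are re-estimated on each resample, and t is the original estimate divided by the bootstrap standard error. A simplified sketch of that idea (Python; `unit_weighted_loadings` is my illustrative stand-in for the iterative PLS weighting scheme actually used):

```python
import numpy as np

rng = np.random.default_rng(42)

def unit_weighted_loadings(x: np.ndarray) -> np.ndarray:
    """Approximate outer loadings as each item's correlation with a
    unit-weighted construct score (simplified stand-in for PLS)."""
    score = x.mean(axis=1)
    return np.array([np.corrcoef(x[:, j], score)[0, 1]
                     for j in range(x.shape[1])])

def bootstrap_t(x: np.ndarray, n_boot: int = 1000) -> np.ndarray:
    """t = estimated loading / bootstrap standard error of the loading."""
    n = x.shape[0]
    est = unit_weighted_loadings(x)
    boot = np.array([unit_weighted_loadings(x[rng.integers(0, n, n)])
                     for _ in range(n_boot)])
    return est / boot.std(axis=0, ddof=1)

# Synthetic two-item construct: both items reflect one latent factor.
latent = rng.normal(size=300)
items = latent[:, None] + 0.4 * rng.normal(size=(300, 2))
print(bootstrap_t(items))  # large t-values: loadings clearly significant
```

With well-behaved reflective items the bootstrap distribution of a loading is tight, which is why most t-values in Table A8 are very large and the corresponding p-values round to 0.000.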

Appendix J

Table A9. Fornell–Larcker Criterion for Discriminant Validity of Multi-Item Constructs.
Construct | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16
1. Access | 0.973
2. Control | 0.312 | 0.911
3. EPLOC | −0.501 | −0.081 | 0.932
4. Errors | 0.599 | 0.211 | −0.458 | 0.934
5. HealthStat | 0.114 | 0.015 | 0.073 | 0.127 | 0.920
6. IJPLOC | 0.384 | 0.056 | −0.443 | 0.333 | 0.092 | 0.858
7. IPLOC | −0.494 | −0.021 | 0.813 | −0.468 | 0.101 | −0.481 | 0.848
8. ITDep | −0.135 | −0.137 | 0.288 | −0.176 | 0.075 | 0.006 | 0.210 | 0.908
9. ITEmo | −0.236 | −0.109 | 0.302 | −0.254 | −0.150 | −0.181 | 0.216 | 0.353 | 0.853
10. InfoSen | 0.120 | 0.142 | −0.261 | 0.118 | −0.002 | 0.118 | −0.211 | −0.209 | −0.038 | 0.878
11. Intention | −0.555 | −0.080 | 0.801 | −0.472 | 0.057 | −0.503 | 0.846 | 0.215 | 0.242 | −0.256 | 0.962
12. RiskHP | 0.317 | −0.009 | −0.408 | 0.353 | 0.214 | 0.382 | −0.363 | −0.029 | −0.171 | 0.132 | −0.385 | 0.963
13. RiskIn | 0.336 | 0.142 | −0.290 | 0.298 | 0.138 | 0.241 | −0.295 | −0.019 | −0.145 | 0.216 | −0.300 | 0.394 | 0.982
14. SUse | 0.824 | 0.292 | −0.528 | 0.552 | 0.107 | 0.392 | −0.542 | −0.105 | −0.202 | 0.242 | −0.576 | 0.353 | 0.404 | 0.955
15. TrustHP | −0.224 | 0.001 | 0.384 | −0.175 | −0.031 | −0.287 | 0.235 | 0.136 | 0.314 | −0.041 | 0.251 | −0.332 | −0.139 | −0.226 | 0.854
16. TrustIn | −0.297 | −0.090 | 0.427 | −0.274 | 0.033 | −0.238 | 0.389 | 0.154 | 0.176 | −0.136 | 0.338 | −0.247 | −0.424 | −0.366 | 0.447 | 0.888
Diagonal values are the square roots of the AVEs; each exceeds all of that construct’s correlations with the other latent constructs.
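The Fornell–Larcker criterion applied in Table A9 can be verified mechanically: discriminant validity holds when the square root of each construct's AVE exceeds the absolute value of its correlation with every other construct. A small sketch with three constructs taken from Tables A7 and A9 (the function name is mine, for illustration only):

```python
import numpy as np

# AVEs from Table A7 and inter-construct correlations from Table A9.
ave = {"Access": 0.947, "Control": 0.830, "EPLOC": 0.869}
corr = {("Access", "Control"): 0.312,
        ("Access", "EPLOC"): -0.501,
        ("Control", "EPLOC"): -0.081}

def fornell_larcker_ok(ave: dict, corr: dict) -> bool:
    """Discriminant validity holds if sqrt(AVE) of each construct exceeds
    its absolute correlation with every other construct."""
    sqave = {c: np.sqrt(v) for c, v in ave.items()}
    return all(min(sqave[a], sqave[b]) > abs(r) for (a, b), r in corr.items())

print(fornell_larcker_ok(ave, corr))  # True for this subset
```

Note that the diagonal entries printed in Table A9 are exactly these square roots, e.g. sqrt(0.947) ≈ 0.973 for Access.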

References

  1. Dinev, T.; Albano, V.; Xu, H.; D’Atri, A.; Hart, P. Individuals’ Attitudes Towards Electronic Health Records: A Privacy Calculus Perspective. In Advances in Healthcare Informatics and Analytics; Gupta, A., Patel, V.L., Greenes, R.A., Eds.; Springer International Publishing: Cham, Switzerland, 2016; pp. 19–50.
  2. Anderson, C.L.; Agarwal, R. The Digitization of Healthcare: Boundary Risks, Emotion, and Consumer Willingness to Disclose Personal Health Information. Inf. Syst. Res. 2011, 22, 469–490.
  3. Evans, R.S. Electronic Health Records: Then, Now, and in the Future. Yearb. Med. Inform. 2016, 25, S48–S61.
  4. Fox, G. “To protect my health or to protect my health privacy?” A mixed-methods investigation of the privacy paradox. J. Assoc. Inf. Sci. Technol. 2020, 71, 1015–1029.
  5. Mishra, A.N.; Anderson, C.; Angst, C.M.; Agarwal, R. Electronic Health Records Assimilation and Physician Identity Evolution: An Identity Theory Perspective. Inf. Syst. Res. 2012, 23, 738–760.
  6. Weber-Jahnke, J.H.; Obry, C. Protecting privacy during peer-to-peer exchange of medical documents. Inf. Syst. Front. 2012, 14, 87–104.
  7. Lupton, D. ‘I’d like to think you could trust the government, but I don’t really think we can’: Australian women’s attitudes to and experiences of My Health Record. Digit. Health 2019, 5, 205520761984701.
  8. Eckrich, F.; Baudendistel, I.; Ose, D.; Winkler, E.C. Einfluss einer elektronischen Patientenakte (EPA) auf das Arzt-Patienten-Verhältnis: Eine systematische Übersicht der medizinethischen Implikationen. Ethik Med. 2016, 28, 295–310.
  9. Greenhalgh, T.; Morris, L.; Wyatt, J.C.; Thomas, G.; Gunning, K. Introducing a nationally shared electronic patient record: Case study comparison of Scotland, England, Wales and Northern Ireland. Int. J. Med. Inform. 2013, 82, e125–e138.
  10. Pearce, C.; Bainbridge, M. A personally controlled electronic health record for Australia. J. Am. Med. Inform. Assoc. 2014, 21, 707–713.
  11. Roehrs, A.; da Costa, C.A.; Righi, R.d.R.; de Oliveira, K.S.F. Personal Health Records: A Systematic Literature Review. J. Med. Internet Res. 2017, 19, e13.
  12. Grätzel von Grätz, P. Start der elektronischen Patientenakte im Januar 2021: Was Ärzten nun blüht. MMW Fortschr. Med. 2020, 162, 39.
  13. Angst, C.M.; Agarwal, R. Adoption of Electronic Health Records in the Presence of Privacy Concerns: The Elaboration Likelihood Model and Individual Persuasion. MIS Q. 2009, 33, 339.
  14. Bansal, G.; Zahedi, F.; Gefen, D. The impact of personal dispositions on information sensitivity, privacy concern and trust in disclosing health information online. Decis. Support Syst. 2010, 49, 138–150.
  15. Fox, G.; James, T.L. Toward an Understanding of the Antecedents to Health Information Privacy Concern: A Mixed Methods Study. Inf. Syst. Front. 2020, 1–26.
  16. Angst, C.M.; Agarwal, R. Overcoming Personal Barriers to Adoption when Technology Enables Information to be Available to Others. SSRN Electron. J. 2006, 1–22.
  17. Zivanovic, N.N. Medical information as a hot commodity: The need for stronger protection of patient health information. Intell. Prop. L. Bull. 2014, 19, 183.
  18. Haas, P. Elektronische Patientenakten: Einrichtungsübergreifende Elektronische Patientenakten als Basis für integrierte patientenzentrierte Behandlungsmanagement-Plattformen; Bertelsmann Stiftung: Gütersloh, Germany, 2017.
  19. Bertram, N.; Püschner, F.; Gonçalves, A.S.O.; Binder, S.; Amelung, V.E. Einführung einer elektronischen Patientenakte in Deutschland vor dem Hintergrund der internationalen Erfahrungen. In Krankenhaus-Report 2019; Klauber, J., Geraedts, M., Friedrich, J., Wasem, J., Eds.; Springer: Berlin/Heidelberg, Germany, 2019; pp. 3–16.
  20. Heinze, R.G.; Hilbert, J. Vorschläge und Handlungsempfehlungen zur Erarbeitung einer kundenorientierten eHealth-Umsetzungsstrategie; Arbeitsgruppe 7 “IKT und Gesundheit” des Nationalen IT-Gipfels: Dortmund, Germany, 2008.
  21. Hoerbst, A.; Kohl, C.D.; Knaup, P.; Ammenwerth, E. Attitudes and behaviors related to the introduction of electronic health records among Austrian and German citizens. Int. J. Med. Inform. 2010, 79, 81–89.
  22. Eurobarometer. Special Eurobarometer 460: Attitudes towards the Impact of Digitisation and Automation on Daily Life; European Union Open Data Portal: Brussels, Belgium, 2017. Available online: https://data.europa.eu/euodp/en/data/dataset/S2160_87_1_460_ENG (accessed on 25 August 2021).
  23. Olson, J.S.; Grudin, J.; Horvitz, E. A study of preferences for sharing and privacy. In CHI ’05 Extended Abstracts on Human Factors in Computing Systems; ACM Press: Portland, OR, USA, 2005; p. 1985.
  24. Weitzman, E.R.; Kaci, L.; Mandl, K.D. Sharing Medical Data for Health Research: The Early Personal Health Record Experience. J. Med. Internet Res. 2010, 12, e14.
  25. Abdelhamid, M.; Gaia, J.; Sanders, G.L. Putting the Focus Back on the Patient: How Privacy Concerns Affect Personal Health Information Sharing Intentions. J. Med. Internet Res. 2017, 19, e169.
  26. Kisekka, V.; Giboney, J.S. The Effectiveness of Health Care Information Technologies: Evaluation of Trust, Security Beliefs, and Privacy as Determinants of Health Care Outcomes. J. Med. Internet Res. 2018, 20, e107.
  27. Caine, K.; Hanania, R. Patients want granular privacy control over health information in electronic medical records. J. Am. Med. Inform. Assoc. 2013, 20, 7–15.
  28. Deci, E.L. Effects of externally mediated rewards on intrinsic motivation. J. Pers. Soc. Psychol. 1971, 18, 105–115.
  29. Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. Extrinsic and Intrinsic Motivation to Use Computers in the Workplace. J. Appl. Soc. Psychol. 1992, 22, 1111–1132.
  30. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User Acceptance of Information Technology: Toward a Unified View. MIS Q. 2003, 27, 425.
  31. Malhotra, Y.; Galletta, D.F.; Kirsch, L.J. How Endogenous Motivations Influence User Intentions: Beyond the Dichotomy of Extrinsic and Intrinsic User Motivations. J. Manag. Inf. Syst. 2008, 25, 267–300.
  32. Ryan, R.M.; Connell, J.P. Perceived locus of causality and internalization: Examining reasons for acting in two domains. J. Pers. Soc. Psychol. 1989, 57, 749–761.
  33. Ryan, R.M.; Deci, E.L. Self-Determination Theory and the Facilitation of Intrinsic Motivation, Social Development, and Well-Being. Am. Psychol. 2000, 55, 68–78.
  34. Bandura, A. Self-Efficacy: The Exercise of Control; W.H. Freeman: New York, NY, USA, 1997.
  35. Cadwallader, S.; Jarvis, C.B.; Bitner, M.J.; Ostrom, A.L. Frontline employee motivation to participate in service innovation implementation. J. Acad. Mark. Sci. 2010, 38, 219–239.
  36. Deci, E.L.; Ryan, R.M. (Eds.) Handbook of Self-Determination Research; University of Rochester Press: Rochester, NY, USA, 2004.
  37. Shahar, G.; Henrich, C.C.; Blatt, S.J.; Ryan, R.; Little, T.D. Interpersonal relatedness, self-definition, and their motivational orientation during adolescence: A theoretical and empirical integration. Dev. Psychol. 2003, 39, 470–483.
  38. Chan, K.W.; Lam, W. The trade-off of servicing empowerment on employees’ service performance: Examining the underlying motivation and workload mechanisms. J. Acad. Mark. Sci. 2011, 39, 609–628.
  39. Dholakia, U.M. How Customer Self-Determination Influences Relational Marketing Outcomes: Evidence from Longitudinal Field Studies. J. Mark. Res. 2006, 43, 109–120.
  40. Deci, E.L.; Ryan, R.M.; Williams, G.C. Need satisfaction and the self-regulation of learning. Learn. Individ. Differ. 1996, 8, 165–183.
  41. Gilal, F.G.; Chandani, K.; Gilal, R.G.; Gilal, N.G.; Gilal, W.G.; Channa, N.A. Towards a new model for green consumer behaviour: A self-determination theory perspective. Sustain. Dev. 2020, 28, 711–722.
  42. Trigueros, R.; Mínguez, L.A.; González-Bernal, J.J.; Jahouh, M.; Soto-Camara, R.; Aguilar-Parra, J.M. Influence of Teaching Style on Physical Education Adolescents’ Motivation and Health-Related Lifestyle. Nutrients 2019, 11, 2594.
  43. Hong, W.; Chan, F.K.Y.; Thong, J.Y.L.; Chasalow, L.C.; Dhillon, G. A Framework and Guidelines for Context-Specific Theorizing in Information Systems Research. Inf. Syst. Res. 2014, 25, 111–136.
  44. Johns, G. The Essential Impact of Context on Organizational Behavior. Acad. Manag. Rev. 2006, 31, 386–408.
  45. Venkatesh, V.; Brown, S.A.; Bala, H. Bridging the Qualitative-Quantitative Divide: Guidelines for Conducting Mixed Methods Research in Information Systems. MIS Q. 2013, 37, 21–54.
  46. Venkatesh, V.; Brown, S.; Sullivan, Y. Guidelines for Conducting Mixed-methods Research: An Extension and Illustration. J. Assoc. Inf. Syst. 2016, 17, 435–494.
  47. Bélanger, F.; Xu, H. The role of information systems research in shaping the future of information privacy: Editorial. Inf. Syst. J. 2015, 25, 573–578.
  48. Kordzadeh, N.; Warren, J.; Seifi, A. Antecedents of privacy calculus components in virtual health communities. Int. J. Inf. Manag. 2016, 36, 724–734.
  49. Kordzadeh, N.; Warren, J. Communicating Personal Health Information in Virtual Health Communities: An Integration of Privacy Calculus Model and Affective Commitment. J. Assoc. Inf. Syst. 2017, 18, 38.
  50. Xu, H.; Dinev, T.; Smith, J.; Hart, P. Information Privacy Concerns: Linking Individual Perceptions with Institutional Privacy Assurances. J. Assoc. Inf. Syst. 2011, 12, 798–824.
  51. Zhang, X.; Liu, S.; Chen, X.; Wang, L.; Gao, B.; Zhu, Q. Health information privacy concerns, antecedents, and information disclosure intention in online health communities. Inf. Manag. 2018, 55, 482–493.
  52. Hwang, H.G.; Han, H.E.; Kuo, K.M.; Liu, C.F. The Differing Privacy Concerns Regarding Exchanging Electronic Medical Records of Internet Users in Taiwan. J. Med. Syst. 2012, 36, 3783–3793.
  53. Li, H.; Gupta, A.; Zhang, J.; Sarathy, R. Examining the decision to use standalone personal health record systems as a trust-enabled fair social contract. Decis. Support Syst. 2014, 57, 376–386.
  54. Li, T.; Slee, T. The effects of information privacy concerns on digitizing personal health records. J. Assoc. Inf. Sci. Technol. 2014, 65, 1541–1554.
  55. Campos-Castillo, C.; Anthony, D.L. The double-edged sword of electronic health records: Implications for patient disclosure. J. Am. Med. Inform. Assoc. 2014, 22, e130–e140.
  56. Dinev, T.; Xu, H.; Smith, J.; Hart, P. Information privacy and correlates: An empirical attempt to bridge and distinguish privacy-related concepts. Eur. J. Inf. Syst. 2013, 22, 295–316.
  57. Phelps, J.; Nowak, G.; Ferrell, E. Privacy Concerns and Consumer Willingness to Provide Personal Information. J. Public Policy Mark. 2000, 19, 27–41.
  58. Sheehan, K.B.; Hoy, M.G. Dimensions of Privacy Concern among Online Consumers. J. Public Policy Mark. 2000, 19, 62–73.
  59. Xu, H. The Effects of Self-Construal and Perceived Control on Privacy Concerns. In Proceedings of the 28th International Conference on Information Systems (ICIS 2007), Montreal, QC, Canada, 9–12 December 2007.
  60. Xu, H.; Teo, H.H. Alleviating Consumers’ Privacy Concerns in Location-Based Services: A Psychological Control Perspective. In Proceedings of the 25th International Conference on Information Systems (ICIS 2004), Washington, DC, USA, 12–15 December 2004.
  61. Clarke, R. Internet privacy concerns confirm the case for intervention. Commun. ACM 1999, 42, 60–67.
  62. Bélanger, F.; Crossler, R.E. Privacy in the Digital Age: A Review of Information Privacy Research in Information Systems. MIS Q. 2011, 35, 1017.
  63. Fox, G.; Connolly, R. Mobile health technology adoption across generations: Narrowing the digital divide. Inf. Syst. J. 2018, 28, 995–1019.
  64. Smith, J.; Milberg, S.J.; Burke, S.J. Information Privacy: Measuring Individuals’ Concerns about Organizational Practices. MIS Q. 1996, 20, 167.
  65. Smith, J.; Dinev, T.; Xu, H. Information Privacy Research: An Interdisciplinary Review. MIS Q. 2011, 35, 989.
  66. Pavlou, P.A. State of the Information Privacy Literature: Where Are We Now and Where Should We Go? MIS Q. 2011, 35, 977.
  67. Li, H.; Sarathy, R.; Xu, H. The role of affect and cognition on online consumers’ decision to disclose personal information to unfamiliar online vendors. Decis. Support Syst. 2011, 51, 434–445.
  68. Malhotra, N.K.; Kim, S.S.; Agarwal, J. Internet Users’ Information Privacy Concerns (IUIPC): The Construct, the Scale, and a Causal Model. Inf. Syst. Res. 2004, 15, 336–355. [Google Scholar] [CrossRef] [Green Version]
  69. Hong, W.; Thong, J.Y.L. Internet Privacy Concerns: An Integrated Conceptualization and Four Empirical Studies. MIS Q. 2013, 37, 275–298. [Google Scholar] [CrossRef]
  70. Laric, M.V.; Pitta, D.A.; Katsanis, L.P. Consumer concerns for healthcare information privacy: A comparison of US and Canadian perspectives. Res. Healthc. Financ. Manag. 2009, 12, 93–111. [Google Scholar]
  71. Kim, D.J.; Ferrin, D.L.; Rao, H.R. A trust-based consumer decision-making model in electronic commerce: The role of trust, perceived risk, and their antecedents. Decis. Support Syst. 2008, 44, 544–564. [Google Scholar] [CrossRef]
  72. Luhmann, N. Trust and Power; Wiley: Chichester, TO, Canada, 1979. [Google Scholar]
  73. Rotter, J.B. Generalized expectancies for interpersonal trust. Am. Psychol. 1971, 26, 443–452. [Google Scholar] [CrossRef]
  74. Meyer, A.D.; Goes, J.B. Organizational Assimilation of Innovations: A Multilevel Contextual Analysis. Acad. Manag. J. 1988, 31, 897–923. [Google Scholar] [CrossRef]
  75. Rousseau, D.M.; Sitkin, S.B.; Burt, R.S.; Camerer, C. Introduction to Special Topic Forum: Not so Different after All: A Cross-Discipline View of Trust. Acad. Manag. Rev. 1998, 23, 393–404. [Google Scholar] [CrossRef] [Green Version]
  76. Gefen, D.; Karahanna, E.; Straub, D.W. Trust and TAM in Online Shopping: An Integrated Model. MIS Q. 2003, 27, 51. [Google Scholar] [CrossRef]
  77. Mayer, R.C.; Davis, J.H.; Schoorman, F.D. An Integrative Model of Organizational Trust. Acad. Manag. Rev. 1995, 20, 709. [Google Scholar] [CrossRef]
  78. McKnight, D.H.; Choudhury, V.; Kacmar, C. The impact of initial consumer trust on intentions to transact with a web site: A trust building model. J. Strateg. Inf. Syst. 2002, 11, 297–323. [Google Scholar] [CrossRef]
  79. McKnight, D.H.; Choudhury, V.; Kacmar, C. Developing and Validating Trust Measures for e-Commerce: An Integrative Typology. Inf. Syst. Res. 2002, 13, 334–359. [Google Scholar] [CrossRef] [Green Version]
  80. Li, H.; Wu, J.; Gao, Y.; Shi, Y. Examining individuals’ adoption of healthcare wearable devices: An empirical study from privacy calculus perspective. Int. J. Med. Inform. 2016, 88, 8–17. [Google Scholar] [CrossRef]
  81. Ashforth, B.E.; Mael, F. Social Identity Theory and the Organization. Acad. Manag. Rev. 1989, 14, 20. [Google Scholar] [CrossRef]
  82. Stets, J.E.; Burke, P.J. Identity Theory and Social Identity Theory. Soc. Psychol. Q. 2000, 63, 224. [Google Scholar] [CrossRef] [Green Version]
  83. Burke, P.J.; Stets, J.E. A Sociological Approach to Self and Identity. In Handbook of Self and Identity; Leary, M.R., Tangney, J.P., Eds.; The Guilford Press: New York, NY, USA, 2005; pp. 128–152. [Google Scholar]
  84. Carter, M.; Grover, V. Me, My Self, and I(T): Conceptualizing Information Technology Identity and its Implications. MIS Q. 2015, 39, 931–957. [Google Scholar] [CrossRef]
  85. Carter, M.; Petter, S.; Grover, V.; Thatcher, J.B. Information Technology Identity: A Key Determinant of IT Feature and Exploratory Usage. MIS Q. 2020, 44, 40. [Google Scholar] [CrossRef]
  86. Carter, M. Information Technology (IT) Identity: A Conceptualization, Proposed Measures, and Research Agenda. Ph.D. Thesis, Clemson University, Clemson, SC, USA, 2012. Available online: https://tigerprints.clemson.edu/all_dissertations/901 (accessed on 25 August 2021).
  87. Hackbarth, G.; Grover, V.; Yi, M.Y. Computer playfulness and anxiety: Positive and negative mediators of the system experience effect on perceived ease of use. Inf. Manag. 2003, 40, 221–232. [Google Scholar] [CrossRef]
  88. Savoli, A.; Bhatt, M. The Impact of IT Identity on Users’ Emotions: A Conceptual Framework in Health-Care Setting. In Proceedings of the Twenty-third Americas Conference on Information Systems, Boston, MA, USA, 10–12 August 2017. [Google Scholar]
  89. Reychav, I.; Beeri, R.; Balapour, A.; Raban, D.R.; Sabherwal, R.; Azuri, J. How reliable are self-assessments using mobile technology in healthcare? The effects of technology identity and self-efficacy. Comput. Hum. Behav. 2019, 91, 52–61. [Google Scholar] [CrossRef]
  90. Heeser, A. Die elektronische Patientenakte: Eine für alles. Heilberufe 2021, 73, 34–35. [Google Scholar] [CrossRef]
  91. gematik GmbH. Die ePA-App—Was kann Sie? 2020. YouTube. Available online: https://youtu.be/l_5KqAmoIaQ (accessed on 25 August 2021).
  92. Figma. Figma: The Collaborative Interface Design Tool. 2021. Available online: https://www.figma.com (accessed on 25 August 2021).
  93. Figliolia, A.C.; Sandnes, F.E.; Medola, F.O. Experiences Using Three App Prototyping Tools with Different Levels of Fidelity from a Product Design Student’s Perspective. In Innovative Technologies and Learning; Series Title: Lecture Notes in Computer, Science; Huang, T.C., Wu, T.T., Barroso, J., Sandnes, F.E., Martins, P., Huang, Y.M., Eds.; Springer International Publishing: Cham, The Netherlands, 2020; Volume 12555, pp. 557–566. [Google Scholar] [CrossRef]
  94. Johnson, R.B.; Onwuegbuzie, A.J.; Turner, L.A. Toward a Definition of Mixed Methods Research. J. Mix. Methods Res. 2007, 1, 112–133. [Google Scholar] [CrossRef]
  95. Creswell, J.W.; Plano Clark, V.L.; Gutmann, M.L.; Hanson, W.E. Advanced mixed methods research designs. In Handbook of Mixed Methods in Social and Behavioral Research, 1st ed.; Tashakkori, A., Teddlie, C., Eds.; Sage: Thousand Oaks, CA, USA, 2003; pp. 209–240. [Google Scholar]
  96. Creswell, J.W.; Hanson, W.E.; Clark Plano, V.L.; Morales, A. Qualitative Research Designs: Selection and Implementation. Couns. Psychol. 2007, 35, 236–264. [Google Scholar] [CrossRef]
  97. Tashakkori, A.; Teddlie, C. Mixed Methodology: Combining Qualitative and Quantitative Approaches; Number v. 46 in Applied Social Research Methods Series; Sage: Thousand Oaks, CA, USA, 1998. [Google Scholar]
  98. Recker, J. Scientific Research in Information Systems: A Beginner’s Guide; Springer: Berlin/Heidelberg, Germany, 2013. [Google Scholar]
  99. Teddlie, C.; Yu, F. Mixed Methods Sampling: A Typology With Examples. J. Mix. Methods Res. 2007, 1, 77–100. [Google Scholar] [CrossRef]
  100. Glaser, B.G. Basics of Grounded Theory Analysis: Emergence vs Forcing, 2nd ed.; OCLC: 253830505; Sociology Press: Mill Valley, CA, USA, 1992. [Google Scholar]
  101. Miles, M.B.; Huberman, A.M. Qualitative Data Analysis: An Expanded Sourcebook, 2nd ed.; Sage Publications: Thousand Oaks, CA, USA, 1994. [Google Scholar]
  102. Glaser, B.G.; Strauss, A.L. The Discovery of Grounded Theory: Strategies for Qualitative Research; OCLC: 553535517; Aldine: New Brunswick, NB, USA, 2009. [Google Scholar]
  103. Charmaz, K. Constructing Grounded Theory, 2nd ed.; Introducing Qualitative Methods; OCLC: ocn878133162; Sage: London, UK; Thousand Oaks, CA, USA, 2014. [Google Scholar]
  104. Corbin, J.M.; Strauss, A. Grounded theory research: Procedures, canons, and evaluative criteria. Qual. Sociol. 1990, 13, 3–21. [Google Scholar] [CrossRef]
  105. Venkatesh, V.; Brown, S.A. A Longitudinal Investigation of Personal Computers in Homes: Adoption Determinants and Emerging Challenges. MIS Q. 2001, 25, 71. [Google Scholar] [CrossRef] [Green Version]
  106. Fishbein, M.; Ajzen, I. Belief, Attitude, Intention, and Behavior: An Introduction to Theory and Research; Addison-Wesley Series in Social Psychology; Addison-Wesley Pub. Co.: Reading, MA, USA, 1975. [Google Scholar]
  107. Balapour, A.; Reychav, I.; Sabherwal, R.; Azuri, J. Mobile technology identity and self-efficacy: Implications for the adoption of clinically supported mobile health apps. Int. J. Inf. Manag. 2019, 49, 58–68. [Google Scholar] [CrossRef]
  108. Vodicka, E.; Mejilla, R.; Leveille, S.G.; Ralston, J.D.; Darer, J.D.; Delbanco, T.; Walker, J.; Elmore, J.G. Online Access to Doctors’ Notes: Patient Concerns About Privacy. J. Med. Internet Res. 2013, 15, e208. [Google Scholar] [CrossRef] [PubMed]
  109. King, T.; Brankovic, L.; Gillard, P. Perspectives of Australian adults about protecting the privacy of their health information in statistical databases. Int. J. Med. Inform. 2012, 81, 279–289. [Google Scholar] [CrossRef]
  110. Klein, R. An empirical examination of patient-physician portal acceptance. Eur. J. Inf. Syst. 2007, 16, 751–760. [Google Scholar] [CrossRef]
  111. Flynn, H.A.; Marcus, S.M.; Kerber, K.; Alessi, N. Patients’ Concerns About and Perceptions of Electronic Psychiatric Records. Psychiatr. Serv. 2003, 54, 1539–1541. [Google Scholar] [CrossRef] [PubMed]
  112. Lafky, D.B.; Horan, T.A. Personal health records. Health Inform. J. 2011, 17, 63–71. [Google Scholar] [CrossRef] [PubMed]
  113. Dinev, T.; Bellotto, M.; Hart, P.; Russo, V.; Serra, I.; Colautti, C. Privacy calculus model in e-commerce—A study of Italy and the United States. Eur. J. Inf. Syst. 2006, 15, 389–402. [Google Scholar] [CrossRef]
  114. Rahim, F.A.; Ismail, Z.; Samy, G.N. Information Privacy Concerns in Electronic Healthcare Records: A Systematic Literature Review. In Proceedings of the 2013 International Conference on Research and Innovation in Information Systems (ICRIIS), Kuala Lumpur, Malaysia, 27–28 November 2013; pp. 504–509. [Google Scholar] [CrossRef]
  115. Bansal, G.; Davenport, R. Moderating Role of Perceived Health Status on Privacy Concern Factors and Intentions to Transact with High versus Low Trustworthy Health Websites. In Proceedings of the 5th MWAIS (Midwest Association for Information) Conference, Moorhead, MN, USA, 21–22 May 2010. [Google Scholar] [CrossRef]
  116. Venkatesh, V.; Thong, J.Y.L.; Xu, X. Consumer Acceptance and Use of Information Technology: Extending the Unified Theory of Acceptance and Use of Technology. MIS Q. 2012, 36, 157. [Google Scholar] [CrossRef] [Green Version]
  117. Ringle, C.M.; Wende, S.; Becker, J.M. SmartPLS 3; SmartPLS GmbH: Boenningstedt, Germany, 2015. [Google Scholar]
  118. Lowry, P.B.; Gaskin, J. Partial Least Squares (PLS) Structural Equation Modeling (SEM) for Building and Testing Behavioral Causal Theory: When to Choose It and How to Use It. IEEE Trans. Prof. Commun. 2014, 57, 123–146. [Google Scholar] [CrossRef]
  119. Brown, S.A.; Venkatesh, V. Model of Adoption of Technology in Households: A Baseline Model Test and Extension Incorporating Household Life Cycle. MIS Q. 2005, 29, 399. [Google Scholar] [CrossRef]
  120. SoSci Survey. Einverständniserklärung und Selbstauskünfte zum Ausfüllverhalten (Datenqualität); SoSci Survey: Munich, Germany, 2017. [Google Scholar]
  121. MacKenzie, S.B.; Podsakoff, P.M.; Podsakoff, N.P. Construct Measurement and Validation Procedures in MIS and Behavioral Research: Integrating New and Existing Techniques. MIS Q. 2011, 35, 293. [Google Scholar] [CrossRef]
  122. Podsakoff, P.M.; MacKenzie, S.B.; Lee, J.Y.; Podsakoff, N.P. Common method biases in behavioral research: A critical review of the literature and recommended remedies. J. Appl. Psychol. 2003, 88, 879–903. [Google Scholar] [CrossRef]
  123. Podsakoff, P.M.; MacKenzie, S.B.; Podsakoff, N.P. Sources of Method Bias in Social Science Research and Recommendations on How to Control It. Annu. Rev. Psychol. 2012, 63, 539–569. [Google Scholar] [CrossRef] [Green Version]
  124. Hair, J.F. Multivariate Data Analysis, 8th ed.; Cengage: Andover, Hampshire, 2019. [Google Scholar]
  125. Leiner, D.J. Too Fast, too Straight, too Weird: Non-Reactive Indicators for Meaningless Data in Internet Surveys. Surv. Res. Methods 2019, 229–248. [Google Scholar] [CrossRef]
  126. Tavakol, M.; Dennick, R. Making sense of Cronbach’s alpha. Int. J. Med. Educ. 2011, 2, 53–55. [Google Scholar] [CrossRef] [PubMed]
  127. Wooldridge, J.M. Introductory Econometrics: A Modern Approach; Cengage Learning: Boston, MA, USA, 2012. [Google Scholar]
  128. Fornell, C.; Larcker, D.F. Structural Equation Models with Unobservable Variables and Measurement Error: Algebra and Statistics. J. Mark. Res. 1981, 18, 382. [Google Scholar] [CrossRef]
  129. Erzberger, C.; Kelle, U. Making inferences in mixed methods: The rules of integration. In Handbook of Mixed Methods in Social & Behavioral Research; SAGE Publications: New York, NY, USA, 2003; pp. 457–488. [Google Scholar]
  130. Melancon, J.P.; Noble, S.M.; Noble, C.H. Managing rewards to enhance relational worth. J. Acad. Mark. Sci. 2011, 39, 341–362. [Google Scholar] [CrossRef]
  131. Dinev, T.; Hart, P. An extended privacy calculus model for e-commerce transactions. Inf. Syst. Res. 2006, 17, 61–80. [Google Scholar] [CrossRef]
  132. Tang, P.C.; Ash, J.S.; Bates, D.W.; Overhage, J.M.; Sands, D.Z. Personal Health Records: Definitions, Benefits, and Strategies for Overcoming Barriers to Adoption. J. Am. Med. Inform. Assoc. 2006, 13, 121–126. [Google Scholar] [CrossRef] [PubMed] [Green Version]
Figure 1. Different types of endogenous motivations and their relations to the PLOC notation (adapted from [31]).
Figure 1. Different types of endogenous motivations and their relations to the PLOC notation (adapted from [31]).
Ijerph 18 09553 g001
Figure 2. Formative third-order factor to measure the health information privacy concern (adapted from [15]).
Figure 2. Formative third-order factor to measure the health information privacy concern (adapted from [15]).
Ijerph 18 09553 g002
Figure 3. The prototype of the ePA application used in our research to inform the participants.
Figure 3. The prototype of the ePA application used in our research to inform the participants.
Ijerph 18 09553 g003
Figure 4. The mixed-methods multistrand exploratory-explanatory research with the dominant-less dominant design.
Figure 4. The mixed-methods multistrand exploratory-explanatory research with the dominant-less dominant design.
Ijerph 18 09553 g004
Figure 5. The research model with the different hypotheses.
Figure 5. The research model with the different hypotheses.
Ijerph 18 09553 g005
Figure 6. Summary of Full Model Results. * p < 0.05 ; * * p < 0.01 ; * * * p < 0.001 .
Figure 6. Summary of Full Model Results. * p < 0.05 ; * * p < 0.01 ; * * * p < 0.001 .
Ijerph 18 09553 g006
Table 1. Constructs and their definitions.
Table 1. Constructs and their definitions.
ConstructDefinition
Intention toadopt the ePA [106]The subjective probability that a person will perform the behavior of adopting ePA.
Internal PLOC [31]Motivation stemming from feelings of volition where consumers perceive autonomy over their behavior.
External PLOC [31]Motivation stemming from perceived reasons that are attributed to external authority or compliance. No conflict between perceived external influences and personal values exists.
Introjected PLOC [31]Motivation due to a misalignment of perceived social influences and personal values often relates to guilt and shame. The conflict between esteemed pressures and the desire for being autonomous often results in rejection of the “imposed” behavior.
Mobile IT Identity [84]The extent to which a person views IT or their mobile phone as integral to their sense of self.
Health Information PrivacyConcern (HIPC) [15,69]An individual’s perception of their concern for how health entities handle personal data.
Health informationsensitivity [70]The perceived sensitivity of an individual’s different health information.
Risk perceptions [15,56]The perception that information disclosure towards health professionals or health insurance providers will have a negative outcome.
Trust perceptions [15,76]The belief that health professionals or health insurance providers will fulfill their commitments.
AgeThe age of the insurant.
Health StatusAn individual’s reports of severe health conditions.
EducationThe level of formal education of the insurant.
EmploymentEmployment status.
M-Health experienceAn individual’s experience with health-related technologies and applications, i.e., wearables and health-supporting applications.
Table 2. Results for test hypotheses and control variables.
Table 2. Results for test hypotheses and control variables.
Path Coef.T Statisticsp-Values
H1: IPLOC → Intention0.5077.0720.000
H2: EPLOC → Intention0.2743.3400.001
H3: IJPLOC → Intention−0.0852.3180.021
H4: IT Identity → Intention0.0110.2930.770
H5: Age → HIPC−0.0042.5560.011
H6: HealthStatus → HIPC0.0110.8730.383
H7: InfoSensitivity → HIPC0.2585.2990.000
H8a: RiskHP → HIPC0.1148.7570.000
H8b: RiskIn → HIPC0.1178.9830.000
H9a: TrustHP → HIPC−0.1352.8700.004
H9b: TrustIn → HIPC−0.1992.3300.020
H10: HIPC → Intention−0.1102.0960.036
Controls:
Education → Intention−0.0230.7020.483
Prior m-health experience → Intention0.0090.2300.818
Health Insurance → Intention−0.0451.2220.222
Table 3. Development of qualitative inferences, quantitative inferences, and meta-inferences (adapted from [46]).
Table 3. Development of qualitative inferences, quantitative inferences, and meta-inferences (adapted from [46]).
Context and Category of ConstructsSpecific ConstructQualitative InterferenceQuantitative InterferenceMeta-InterferenceExplanation
Motivational variablesInternal PLOC
External PLOC
Introjected PLOC
Motivation-related variables,
especially those stemming from own interests, advice, and shame, affect an individual’s adoption of the ePA.
Consistent with qualitative findings.Individual motivation stemming from external mandates or internal feelings positively affects ePA adoption, although internal ones are stronger. In a conflict between external incentives and internal feelings of autonomous individuals, patients act in more protective ways and reject ePA usage.Motivation has consistently been highlighted to be a strong predictor of adopting a wide range of technologies (e.g., [31,39]). Additionally, the sensitive nature of health information and resulting social pressures (i.e., shame) indicate rejection outcomes.
Self EfficacyMobile IT IdentityIT usage is motivated by a positive self-identification with IT use, and thus ePA adoption is.IT identity was not significant.A positive self-identification with IT has no direct effect on ePA adoption.Even though the ePA is accessed through mobile applications, they do not require a self-identity attributed to “IT identity”.
[c]HIPC/Personal
CharacteristicsAgeHigher age results in deeper privacy concerns and lower ePA adoption.Lower age results in deeper privacy concern.Younger individuals express more privacy concern from using an ePA.Demographics, such as age, are commonly associated with privacy concerns. Younger individuals may express more privacy concern attributed to their privacy literacy [48].
Health StatusThe health status negatively affects adoption stemming from the uneasiness of one’s severe health status.Health status was not significant.The self-perceived health status has no direct effect on the HIPC of ePA usage.Statistic significance might fail to appear due to the low share of subjects with severe health status in our sample.
[c]HIPC/
PerceptionsRiskPerceived risk in processing by physicians and health insurance positively affects HIPC of using the ePA.Consistent with qualitative findings.Perceived risk add to the HIPC of using the ePA; however, trust in the physician or reasonable satisfaction with one’s health insurance lower privacy concerns.Trust & risk are linked to privacy concerns [1,66,131]. Trust in physicians and the ePA lower privacy concerns [1,114].
TrustTrust in physicians or one’s health insurance outweigh perceived risks.
Information SensitivityHealth information, when considered being sensitive, increases privacy concern.Consistent with qualitative findings.Individuals rate sensitivity of certain health information differently (i.e., towards STD), thus willing to share those data differs. Health information sensitivity is generally high.Perceived sensitivity affects privacy concerns and intentions to provide health information [2]. Information sensitivity is associated with perceived risk [56].
HIPCHIPC 3rd order formativeThe interviews gave evidence for all constructs in the HIPC but awareness. In particular, the desire for control and granular permission management is strong, and the lack of those features hinders usage intentions.Consistent with qualitative findings.The HIPC significantly hinders ePA adoption intentions. However, the overall privacy concern is generally low in our sample.Exercise of control over one’s health data is found essential. Granular permissions are often requested [27]. However, the privacy calculus is less profound where PHRs are relatively new. That is why individuals tend to weigh the benefits of the ePA more heavily than the concerns of privacy.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Henkenjohann, R. Role of Individual Motivations and Privacy Concerns in the Adoption of German Electronic Patient Record Apps—A Mixed-Methods Study. Int. J. Environ. Res. Public Health 2021, 18, 9553. https://doi.org/10.3390/ijerph18189553

AMA Style

Henkenjohann R. Role of Individual Motivations and Privacy Concerns in the Adoption of German Electronic Patient Record Apps—A Mixed-Methods Study. International Journal of Environmental Research and Public Health. 2021; 18(18):9553. https://doi.org/10.3390/ijerph18189553

Chicago/Turabian Style

Henkenjohann, Richard. 2021. "Role of Individual Motivations and Privacy Concerns in the Adoption of German Electronic Patient Record Apps—A Mixed-Methods Study" International Journal of Environmental Research and Public Health 18, no. 18: 9553. https://doi.org/10.3390/ijerph18189553

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers. See further details here.

Article Metrics

Back to TopTop