Privacy concerns as a barrier to sharing data about energy use: A rapid realist review

The purpose of this review is to investigate the nature of privacy concerns in the context of Smart Local Energy Systems (SLES) to understand how SLES providers can minimize both users' concern, and cause for concern, around privacy. We conducted a rapid realist review and thematic framework analysis against Bronfenbrenner's socio-ecological model to understand privacy concerns in different contexts. A common privacy concern was that sharing detailed energy use data had the potential to reveal information about home life, and to intrude upon people's sense of autonomy, choice and control. Evidence suggests that people are willing to accept new data sharing technologies if the benefits of doing so are clear, anticipated, and mutually beneficial. Building trust, through increasing knowledge and understanding, was a mechanism for overcoming privacy concerns, but this was mediated by the organization providing the information. Non-profit organizations were more trusted to ensure appropriate safeguards to privacy were in place. One key barrier to participation with good supporting evidence was that people can resist perceived intrusions on their privacy: actively, by refusing to install data collection technologies, or passively, by not participating in adapting energy use behaviors. Both are necessary for SLES to achieve their goals of managing energy demand and building resilience in smart grids.


Introduction
Smart local energy systems can help deliver energy services efficiently by automating complex processes, self-regulating, learning user preferences, and helping inform effective decisions. They cannot do this without data in many different forms, from second-by-second updates on the charge of batteries to the addresses of users' homes. Often, this means that users must actively choose to share such data. Privacy is a key issue for those planning and implementing smart local energy systems (SLES). Privacy issues have been highlighted as a possible barrier to the willingness of both energy users (e.g. consumers, residents, industry) and SLES implementers (e.g. local practitioners, companies, policymakers) to participate and share data in SLES.
Much of the data that SLES need to function effectively is personal. That is, it "relates to an identified or identifiable individual" [1]. Because of the General Data Protection Regulation, SLES operators are legally obliged to pay special attention to how they obtain and use personal data, including getting user permission. Beyond this, however, data privacy is a prominent concern for potential SLES users. Getting it wrong on the data, even in appearances, could seriously damage trust, participation, and data sharing in the SLES.
The purpose of this review is to investigate the nature of privacy concerns in the context of SLES to enable us to provide evidence-informed guidance on how SLES providers can minimize both concern and cause for concern around privacy.

Search strategy
We developed the following list of search terms connected to the initial programme theory above to identify relevant documents. We applied pearl-growing techniques to search terms in systematic reviews about privacy in other sectors. We ran pilot searches and used different search terms in combination (adding, altering or removing terms where necessary) around the different concepts of privacy, privacy behaviors and regulation to arrive at a list of documents which was sufficiently broad and manageable given the constraints of a rapid review (Table 1 in supplemental information).
We selected studies relevant to the rapid review from the EnergyREV research portal and conducted additional searches for studies published in peer-reviewed journals in indexed bibliographic databases, and for reports published elsewhere, such as on organisational websites. In addition to searches of bibliographic databases and grey literature, we identified papers that were linked to any effectiveness studies, either as part of an integrated mixed-methods study or as a "sibling study" (e.g. qualitative, economic or process evaluations associated with specific effectiveness studies). Supplementary searches for systematic reviews in related topics and sectors were conducted in Google, Google Scholar and the UCL Library portal.

Screening studies: applying inclusion and exclusion criteria
Inclusion and exclusion criteria were first applied to titles and abstracts. Full papers were obtained for those studies where the abstract suggested that the study might meet the inclusion criteria. Where the title and abstract provided insufficient information to be certain, the full paper was obtained and the inclusion and exclusion criteria re-applied. Those that did not meet these criteria were excluded.
• The study must include consideration of privacy concern and, in particular, the role this plays in choices around the extent to which customers participate. The study will not be included if it presents only a technical privacy solution (e.g. encryption) with no interaction with consumer privacy concerns.
• The study must include numerical outcomes or views and experiences.
• The study must present clear methods for its research.
• Studies with a focus on energy will be prioritized for inclusion, with studies in other areas included on the basis of theoretical and practical relevance.
All studies that met the criteria were entered into the EPPI-Centre's EPPI-Reviewer systematic review information management software [5].

Characterising included studies
Data was extracted from the included studies for predefined codes, such as study type and geographical location, and also inductively from the studies, such as codes about participant characteristics and programme theories. We then grouped these descriptive codes into higher-order families of codes to understand the emergent themes in different contexts. The review team (CV, CM and MF) took samples of initial hits from the search strategy results to blind screen in pairs against the title and abstract coding tool. Discrepancies were discussed until agreement was reached. Sampling of records continued until 100% agreement was reached. We used the same procedure when extracting data from studies, with random samples of studies selected and compared between pairs of reviewers. Data extractions were compared for clarity, meaning and completeness, and any areas of disagreement were discussed with reference to a third reviewer (DS), who adjudicated when consensus could not be reached.

Quality assessment of the included studies
Individual studies were quality assessed with a checklist appropriate to each of the study designs, to assess the threats to validity common to these types of studies and the steps taken by authors to minimize bias. The checklist assessed the study's:
2. Construct validity: the extent to which the concrete measures in the study match up to the intervention theory of change [6].
3. Conclusion validity (rigor): the reliability and trustworthiness in reaching its findings and conclusions [7].
4. Relevance/generalizability: to what extent the findings are replicable and generalizable to the SLES context, as well as the relevance of the study to this rapid review.

Results
Thirty-four primary studies met the inclusion criteria for the review, with a further 11 systematic reviews [44][45][46][47][48][49][50][51][52][53][54][55] providing evidence from related sectors. The following chart shows the flow of studies from the initial "hits" from the search strategy through the screening on title and abstract and on full text, with results for each category.

Description of the included studies
Smart local energy systems (SLES) is a relatively new research area [8], and research on privacy concerns around sharing energy data in this context even more so. Consequently, two thirds of the studies included in this review were published in or after 2015, and none were published before 2011. The vast majority of studies were conducted in either North America (over a quarter in the USA alone) or western Europe (with around a fifth in the UK). They covered a wide range of academic disciplines, most commonly computer science, energy, social science, technology, business and engineering. With the exception of a few industrial or commercial reports, all of the studies were published in academic journals or as conference papers. While nearly half of the studies examined either smart meters or smart grids, the rest investigated a range of technologies and approaches. Of those studies carried out in a particular setting or population, most were based in an urban, residential environment with adult participants, primarily of working age; participants usually had some knowledge and experience of smart technology use.
Around three quarters of the studies used observational methods, predominantly surveys, case studies or interviews. These were mostly used to investigate what might affect people's perceptions and acceptance of an intervention or technology although some used these methods to assess technology development or implementation. The quality of these studies was mixed: although many were highly relevant, fewer were highly robust (half of the surveys and case studies and none of the interview studies). There was only one experimental study that tested the effect of a real-world intervention.
In studies of people's perceptions and attitudes, psychological measures were used more than any other type of numerical measure, most frequently measures of privacy concern or acceptability. Measures of technological performance and social norms were also used by some studies, while a few measured behavioral or economic outcomes. Only one study measured environmental outcomes. Themes that emerged from qualitative studies most commonly related to the people involved in SLES and the impacts of SLES: the actors involved, who sees the data and who benefits.

Methods of synthesis
None of the included studies measured numerical changes in outcomes that would be suitable for a meta-synthesis. Instead, a narrative, thematic framework synthesis was performed. We identified the patterns and themes in the texts of the included studies' findings. We used the socio-ecological model (SEM) [56] as an organizing framework for understanding privacy and privacy concerns in different contexts: see Figure 3. What information sharing is appropriate in one contextual domain may not be appropriate in another. Breaches of privacy could be understood as the inappropriate or unauthorized transfer of information or knowledge from one domain to another. The framework offers a guide to understanding the different contexts of the determinants of privacy concerns and the barriers and facilitators to data sharing, where interventions to address privacy concerns could target their efforts. The socio-ecological framework starts with the individual, their personally held beliefs, preferences, sense of self and core values. The micro system around the individual consists of the closest interpersonal relationships of family and home, and around this is the meso system of two or more systems in which the individual actively participates, such as the workplace or community in which they live. The outer domain is the macro system, consisting of relationships of wider influence, such as the social and cultural contexts in which the micro and meso systems are situated.
Compatible with the realist review approach, we considered the programme theories for studies that aimed to overcome privacy concerns and the common themes of barriers to data sharing. Where evidence was limited, we searched for additional systematic review evidence from related sectors that share comparable problems and issues with privacy concerns to further develop and refine the theories on interventions that would be effective in overcoming privacy concerns, how, for whom and in what contexts.

Privacy concerns and barriers to data sharing in different domains
The included studies suggested that, where there were privacy concerns, these centered on the intensification of home-based data collection proposed by smart local energy systems, particularly smart meters, and that the patterns of behavior that could be inferred from this data had the potential to disrupt the socially acceptable norms of control over information about oneself in the different domains.
The following chart shows the privacy concerns in the form of barriers and facilitators to energy data sharing in these contextual domains. Interventions that overcome or address these privacy concerns then, we can hypothesize, will facilitate data sharing. The most commonly expressed type of privacy concern was around the individual's sense of autonomy, choice and control (16 studies).
Control was a term used in diverse ways. It could be described by study participants as control over who has access to information, or control over what happens to the information after it was collected and/or used, or could also mean a sense of being controlled. Privacy concerns around autonomy, choice and control were bound up in the desire to defend and control the depth and breadth of personal data that was shared, and to limit the uses of that data.
Fewer participants expressed privacy concerns around data ownership, which was around having control over the data collection and its use, as well as having the option to delete data [42]. Participants were less trusting of profit-making third-party organizations to work in participants' interests, but did not seem to be overly concerned with the value of the data they shared. Few thought it had or could generate value for themselves.
Participants were familiar with the idea of address and account data being anonymous. In one study [14] nearly all (90%) participants considered anonymity would be a condition of data sharing and assumed that data would be anonymized as a privacy preserving measure.
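The anonymization participants assumed can be sketched as pseudonymization: replacing direct identifiers in a meter record with a keyed hash before the data leaves the operator's boundary. This is an illustrative sketch only, not a method from any included study; the field names and the key are hypothetical.

```python
import hashlib
import hmac

# Hypothetical secret held by the data processor, not by third parties.
SECRET_KEY = b"operator-held-secret"

def pseudonymize(record: dict) -> dict:
    """Replace direct identifiers (account id, address) with a keyed hash token.

    The same account always maps to the same token, so usage series can
    still be linked over time without exposing who the customer is.
    """
    token = hmac.new(SECRET_KEY, record["account_id"].encode(), hashlib.sha256).hexdigest()[:16]
    return {"meter_token": token, "kwh": record["kwh"], "timestamp": record["timestamp"]}

reading = {
    "account_id": "ACC-1042",
    "address": "12 Example St",
    "kwh": 0.42,
    "timestamp": "2021-05-01T13:30",
}
print(pseudonymize(reading))  # no account_id or address in the output
```

Note that pseudonymized data may still count as personal data under the GDPR where re-identification is possible, and fine-grained usage patterns can still support inference about household activity, which is precisely the concern raised in the next section.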
Identity privacy referred to more than the technical information of addresses and bank details covered by anonymity privacy; it concerned protecting or reserving the information that could be used to infer who a person is, their interests, preferences, and ways of living [26]. Some felt this could create a commercial identity profile that could be used for segmented advertising and marketing, which individuals would not have visibility of and which, over time, could limit people's choices [41].

Barriers and facilitators to data sharing in the individual domain
Even where privacy concerns are low, having a perceived lack of control over data sharing creates concerns. Participants indicated a "principled desire" to be in control of their data sharing such that they would be able to decide, for instance, which parties could access the data under which circumstances. Setting the boundaries around who has access to data and who has ownership of the data shares both risk and responsibility between the customers and providers through the mechanisms of choice and consent (Nissenbaum 2010).
3.3.3. Perceived loss of control over the use of data can be a barrier to participation
For many participants, cost savings were an incentive to share data, and one way of realizing cost savings was through flexible energy demand use tariffs. Included studies that considered privacy and balancing energy demand to access lower charges found that relinquishing control and decision-making over the use of their appliances or vehicles to balance energy demand was intrusive and disempowering for some people [13,17,19,24,26,27].
There were some positive views on ceding some control, if it meant that an energy expert was overseeing the function of the system and could warn of any issues suggested by higher than usual energy consumption.
Participants expressed a desire to retain control at certain times of day, or over certain practices, that should not be given over to remote control [10]. For some participants, the thought of losing control over appliances or the level of heating in the home was given as a reason for non-participation.
A community-based energy project also showed sensitivity to the fact that not everyone can change their individual energy demands on demand from the community energy system, and the reasons may be unknown to the group. None of the participants suggested that the group had a right to know what those circumstances were.
3.3.4. Choice mechanism: options to opt out are often a condition of opting in
In 9 studies participants raised issues over having the choice over setting the boundaries of data sharing, and this was a conditional barrier to data sharing [13,17,24,26,27,30,31,33,36]. Even when privacy concerns were low, participants wanted to know what information was being shared with third parties in order to make informed choices about what data to share and with whom, and in doing so share responsibility for the data sharing with the vendor [24,27]. When given the choice, most customers do make some changes to disclosure settings [26]. Projects noted the need to involve customers at the earliest opportunity to allow them to choose their level of involvement [17]. Where no choice exists in the form of opt-outs, the number of people who wanted smart meters dropped by a third in one study [13]. Some people do not wish to relinquish control at any price. For some this is due to a lack of trust in energy providers to work in their best interests [30], while others do not wish to see their choices limited in their ways of living, such as by lowering heating temperatures or turning off appliances [22,36].

Information and the amplification of risk
When people are given detailed information about potential privacy risks for all situations, this can seem overwhelming and create a sense of lack of personal control; in some studies, participants then withdrew consent. Providing information does not necessarily overcome privacy concerns and can have a negative effect where concerns were low to start with [24,27]. It was also reported in one study that many of those with a smart meter could not recall the level of consent they gave, and a third of the respondents could not recall what level of data they gave consent to share [13]. Another study found that people would withdraw their consent at the point in time that the terms and conditions were presented to them [26].
3.3.6. Knowledge is a mechanism for data sharing, but mediated by trust in the sender
People's prior experiences and direct knowledge of data sharing affect their decisions and understanding of potential risks. Increasing practical knowledge of how technologies work can alleviate privacy concerns.
Few people reported that they had personally experienced a data breach, but those who had showed less support for smart meters and data sharing [21]. Knowledge around data sharing and privacy was generally low. People had little knowledge of what opportunities and benefits there were in data sharing, as well as little knowledge of the energy efficiency of appliances and the actual prices of services [11,15]. On the other hand, people who considered that they had a good understanding of how things worked were more positive towards data sharing [14,13]. People who were more positive towards technology use in general were less opposed to smart meters [21].
When asked what the most important elements of an informational campaign to introduce smart metering systems would be, participants said it was knowing more about how they worked [10]. People who have smart meters (and know more about how they work in practice) are more willing to share data [13].
Additional review evidence on technology use and acceptance finds that "privacy perceptions influenced trust, which in turn influenced the perceptions of risk" [55]. As perceived risk ultimately impacts on an individual's intention to use, this was a negative predictor of new technology use and acceptance [52]. The individual's trust in the sender of information about risk influences their decisions about the balance of risk and rewards and their intention to use the technology.

Privacy controls are a part of life
Several studies show that people are already familiar with settings and controls over their data sharing. Nearly all participants in studies that were offered options to adjust controls over data sharing did so. Some participants accepted that privacy and access controls may be necessary but also inconvenient, even annoying [41]. Most people shared information informally with friends and neighbors to compare energy use, costs and the efficiencies of appliances and so on. People felt that the sharing of data with third parties was a fact of life, even if they were not necessarily enthusiastic about it [31] or felt that it was largely out of their control. However, it is worth noting that participants in seven studies were selected for their prior knowledge of smart meters, or similar data collection technologies, in order to have an opinion about them [10,25,26,30,34,42]. As a result, people who are already comfortable with using technology (and its privacy controls) are likely to be overrepresented in research about data sharing technologies.
In one large UK based study, people who did not have smart meters were more likely to believe them to collect intrusive data than people who had smart meters [13]. As this compares people who have meters with people who do not, rather than comparing attitudes before and after, we do not know whether (a) privacy concerns were a barrier to participation for the people who did not have the smart meters, or (b) privacy concerns are allayed once customers get them, as they learn how they work, or privacy ceases to be a concern at all once the technology becomes useful and integrated into usual routines, like other data sharing technologies in common use.

Trading privacy risk for rewards: Cost savings
Saving money is often cited as an incentive for enrolment in initiatives that require sharing private data, but the evidence on how they achieve cost savings is mixed.
People tend to be willing to share data for cost savings, such as paying only for the energy they have used instead of estimates and forecasts, or shifting consumption patterns to cheaper tariffs at different times. However, there is less sustained behavior change in reducing consumption, and some reluctance to allow energy providers to decide energy-saving consumption patterns for them.
However, there is a danger that low energy use customers will be unable to realize any cost savings if they cannot reduce their consumption further and/or are unable to change their energy use habits: e.g. single, older people (for whom cost savings were more of a priority), people on a low income who are already operating their energy usage at a minimum, people with caring responsibilities, and people at home with health conditions who have higher heating needs.
In order to anticipate energy cost savings, customers should be able to estimate what sorts of savings they are likely to make before consenting to share data in exchange. A 2018 systematic review of consumers' perceptions of energy use and energy savings suggests that people are not very knowledgeable in this area, tending to underestimate the energy usage of high-consumption devices and overestimate that of low-consumption appliances [57].

Mechanisms of personal values can facilitate data sharing
The review found four studies reporting that people who were concerned for the environment were more likely to engage in data sharing and were less concerned about privacy. These individuals were sometimes described as intrinsically motivated: participating towards the benefit of the environment was its own reward, though some participants also responded to seeing proof of environmental gains and to being seen to "do the right thing". However, the sustainability of exchanging privacy for environmental gains is not clear. Two of these four studies measured behaviour change. One found that students who belonged to an environmental group performed better than non-member participants in reducing their consumption of energy when energy use data was made public. It was not clear whether the positive behaviour change was sustained over the longer term. The other study showed that people with pro-environment values were less concerned about privacy and had a higher enrolment rate in utility-controlled charging for their vehicles. The other two studies tended to ask people about their future intentions, or reactions to hypothetical situations.
Looked at another way, the social desirability of having concern for the environment could be considered a potential source of bias in surveys that ask people about their values and associated future intentions. While beliefs and values are predictors of future behaviour, there exists a value-action gap between what people say is important to them and acting on it when given an opportunity to do so. Studies that have identified this value-action gap related to environmental and climate concerns include waste recycling (at home [43], at school [58], in college [59], in cities [60] and in the community [61]), eating meat [62], hydrogen energy [63], transport [64], electric vehicles [65] and more. There appears to be a weak relationship between pro-environmental values and sustained behaviour change.

Privacy, barriers and facilitators in the micro, interpersonal contexts
3.4.1. Privacy concerns in the micro, interpersonal context
Types of privacy concern in the home domain included: location privacy, i.e. that data may reveal the physical location of the home to inappropriate individuals; traceability, i.e. that data breaches and subsequent inference may reveal information about the household; and relational privacy, i.e. that data contains information about oneself that manages the norms, roles and expectations of one's relationships with others. Relational privacy mediates the choice and control over whom one enters into relationships with.
Second only to the individual's sense of self, autonomy and control, studies revealed that the home was considered a reserved space that could be guarded on principle, without any direct or specific threat to privacy identified. Home was described as a "castle", a refuge, a protected and private space for peaceful enjoyment and non-interference [12,26,30,37,38,39,41]. Vehicles enjoyed a similar level of privacy that was valued by customers.
"The fact that the electric company can tell when I've turned on the dishwasher or a light bulb or the TV-that's pretty fascinating to me. I don't know how they do that, but do I want them to know that? well it's not a bad thing. It's still a private thing..." [12].

Positive family influences as a facilitator to exchanging privacy for benefits of reducing energy consumption
There is some evidence of enthusiasm to view and share data on household energy use for the purpose of comparison with other family members. There was (at least initially) interest in knowing how much energy different devices used, and in educating family members in energy use. These studies usually asked participants about hypothetical scenarios and intentions to use, so it is not clear to what extent the change in household energy use is sustained in the longer term after this initial enthusiasm. (See also privacy exchange for cost benefits through behavior change in the individual domain.) Some studies showed that the initial enthusiasm waned after the implications of detailed energy use data sharing became apparent [12,26,29,30].

Customer engagement: communication of benefits by household
Several studies indicate that people were reassured about their privacy concerns by personalized and detailed communication about the purpose, use and benefits [10,11,14,17,25,26]. Personal visits were preferred in one study [10], with over half of the participants saying that they would like detailed information on how the system worked and that it would enable fast detection of faults and repairs, while older people preferred communication by letter followed by a visit [10]. Successful interactions and communication of benefits were tailored to the household. People in one study were disappointed and surprised that the smart meters did not seem to be compatible with other smart devices in the home, and felt left to their own devices to work out how to make the system work, with concepts that were unfamiliar to them (such as using if-then-else algorithmic logic) [26].
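The "if-then-else algorithmic logic" that participants found unfamiliar is the kind of conditional rule used to automate devices in the home. A minimal illustrative sketch of such a rule is below; the scenario, device, thresholds and function name are hypothetical, not taken from the study.

```python
def decide_charging(tariff_pence_per_kwh: float, battery_level: float) -> str:
    """An if-then-else rule deciding whether to charge a home battery or EV.

    battery_level is a fraction between 0.0 (empty) and 1.0 (full).
    """
    if battery_level >= 0.95:
        return "stop"    # nearly full: no need to charge
    elif tariff_pence_per_kwh <= 10.0:
        return "charge"  # cheap off-peak electricity: charge now
    elif battery_level < 0.20:
        return "charge"  # low battery overrides price
    else:
        return "wait"    # otherwise wait for a cheaper tariff

print(decide_charging(8.0, 0.5))   # cheap tariff -> "charge"
print(decide_charging(20.0, 0.5))  # expensive tariff, adequate charge -> "wait"
```

Even a rule this small asks users to reason about conditions, priorities and overrides, which helps explain why participants who were left to configure such systems themselves felt out of their depth.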

Family dynamics: maintaining relational privacy within the home
There is some recognition of the potential for privacy issues to arise when data on activities etc. are shared within households as well as with third parties. In households with multiple users, there may be many different privacy concerns and energy saving or cost saving priorities [34], and a model of consent with the sole "bill payer" may not reflect the diversity of risks and benefits of sharing energy usage data within the household. One participant in a study of a smart home co-design [42] expressed concern over other members of the same household being able to access her credit card details, and suggested voice recognition as an authentication method that could allow for multiple users.
Detailed energy use data could reveal patterns of behavior that family members would wish to control or limit visibility of under typical circumstances, such as adult content TV viewing or who is (or isn't) doing the chores around the house [12]. Detailed and real-time energy use data could enable some family members to monitor and control the behaviors of other members of the household in ways they were not able to do before, and this may not be welcomed by other members of the household [34].
"So… he monitors it all on his thing (computer) and it drives her insane! So, she thinks its dreadful, she feels violated all the time, cos his workmates will be walking past his desk. One even called her one day saying 'Wow Kay, your power is going through the roof!'" [34]
Providers of technologies that can reveal patterns of behavior or location, or enable control over devices, will need to consider how such technologies can be used as "vectors of control". There was little direct evidence from the energy sector on the impact of energy data collection on coercive control in the home, although this was indicated in the disquiet of members of the household, aware that there could be a shift in power through information within the home.
Research on the use of smart technology for coercive control in the home is limited, most likely due to the fast-changing pace of new smart technologies being developed for the home, but the UK charity Refuge has already identified that 72% of their service users had experienced some form of technology-mediated coercion and control. In response, and in consultation with domestic abuse charities, IBM recently published a report (2019) on Coercive Control Resistant Design Principles [66] which, in the context of providing smart local energy, are:
• Diversity. SLES providers should recognize their diverse user base and have a diverse development team.
• Privacy and choice. SLES providers should empower all of their users (not just the named "bill payer") to easily make active and informed decisions about their privacy settings.
• Security and data. SLES providers should build secure technology, and only collect necessary data, which will limit the risk that the data can be intercepted and/or used maliciously.

• Combatting gaslighting. Data collection and control over data should disrupt attempts at manipulating someone into doubting their memories and judgement, with pertinent, timely notifications and auditing; i.e. there should be limits to the deletion of records of activity.
• Technical ability. SLES providers should ensure that the use of the technology is intuitive and can be understood by all who could be affected by it, regardless of their technical confidence.

Home occupancy
Participants in four studies [26,27,28,29] expressed concerns that there was a potential vulnerability to criminals, who could use detailed energy use data (i.e. patterns of no or very low usage) to show when the resident is not in the house. However, this was an aspect that was only superficially explored in the studies.
"…They could even 'see' when you are going to bed [by seeing when you] switch off the lights." [26] 3.5. Type of privacy concern and barriers and facilitators in the Meso system of communit, social groups and work 3.5.1. Types of privacy concern in the meso system Reputational privacy is difficult to define. The person whose reputation is in question may be largely unaware of it, except when it is perceived to be damaged. This includes the control of information about oneself that effects one's good standing in others' opinions. Reputation represents a shorthand heuristic for the confidence that can be placed in us to "do the right thing" or to be trusted and for this reason reputation has value and is to be protected.
The internet and records held in digital form present a risk to privacy and reputation not seen before. Digital records offer greater levels of access to a greater number of people and make the diffusion and lifespan of information difficult to control. Nine studies considered reputational privacy or related privacy concerns in the meso system of community and workspace [11,12,15,28,30,34,35,38].
3.5.2. Barriers and facilitators to data sharing in the Meso system

3.5.3. Peer pressure as a mechanism can be positive or negative
People tend to be willing to share energy data with family, friends and neighbors to make comparisons of technology (demonstrating little privacy concern in this kind of data sharing) [15,30,33,35]. Comparing new technologies can be a conversation-starter, a way of learning about how technologies work, their functionality, and other people's experiences of them, and perhaps a way of showing them off, but there was less interest in sharing energy use itself [33]. Learning that a technology is widely used suggests that its use is socially acceptable, and so the risks are assumed to be lower [35]. Sharing knowledge can foster a supportive environment where individuals learn and get recognition from their peers [15] and feel part of a collective good [28].

"Sometimes it feels a bit futile if you don't think anyone else is doing it. So, I think if
you know that other people are doing it, it makes you feel you're having a bigger impact" [28].
Only one of the studies was based on a community energy project and considered peer pressure in that context; the other studies were mainly limited to participants' views of their future intentions and hypothetical scenarios. The study of community energy in practice also cited some negative effects of peer pressure which could impact on one's reputational privacy in the community. Willingness to share energy use data with neighbors seemed to decline as individuals considered how this data might be used.

"You can also see it as an invasion of your privacy. Someone is going to meddle in.
You might experience some sort of social pressure on the way you do your housekeeping." [30].

Trusted third parties for data use
Participants in several studies were more trusting of organizations without a vested interest in energy to handle their data. These included government agencies, consumer organizations, and environmental organizations [31]. One study [13] found that it would be acceptable for police to have access to energy use data to identify some kinds of crime, and for data to be shared with health care providers for vulnerable customers.
These organizations were seen to uphold standards, to be neutral, and to provide safeguards. One study [11] found that a regulatory framework provided reassurance to customers (and overcame privacy concerns) that appropriate safeguards over their data were in place.
Other studies suggested that the energy provider could be the trusted organization, but that involving more organizations poses a greater risk to data in terms of misuse and data breaches.

Local Energy provider as trusted expert
Rather than being seen as intrusive, input from local energy providers as experts may be welcomed where this helps with complex or unfamiliar technologies or provides a clear benefit, such as monitoring for safety or spotting unusual consumption patterns that could indicate a problem. Indeed, the user experience could even be harmed if this expert oversight was absent when it was expected [19,26,30].
"We have those sensors in the rooms; then I see it as natural that they look if it runs alright. Or are they just letting everything run without even keeping an eye on what is going on? There must be a reason for why we have sensors in various rooms." [19] 3.5.6. Local Energy provider trusted as community energy arbiter While community control and accountability can be appealing, it requires time and effort (30,38). Some energy users expect this to be part of the service provided by energy companies (part of the premium), particularly enforcement which can lead to ethical issues or could create unpleasant environments when done by peers. Energy companies can provide the middle ground, neither too close or too distant. This means that relational and reputational privacy between the members of community's energy groups can be maintained by being released from responsibilities of monitoring and disciplining its members [30].
In study 30, participants pointed to the energy company as arbiter in "conflict situations" that can arise in the absence of formal rules and with "a lack of authority over someone else's roof". Energy companies could serve to provide the "new balance" that allows for more de-centralized and democratic control over energy production [30].
"...If you have the ambition to become energy-neutral, then you need to have an element of exchange. And if you exchange, you need an institution to organize that." [30] 3.5.7. Positive outcomes for SLES providers are mediated by public support Without the right incentives or regulation, there was a general shared feeling that companies may prioritize their interests at the expense of their customers' interests, which may have a selfdefeating impact on uptake [13,19,22,24,26,36,38]. Customers believed that the main benefit was likely to be to the energy companies and not to themselves [26,36], or at least not enough benefit to make any disruption or change of energy use habits worthwhile [13]. Study aims of providers suggested that without widespread support from the public, the commercial benefits of SLES may not be realized [24,38].

Work life
Only one study suggested privacy concerns around work life could be a barrier to data sharing. In one area it suggested that energy use data could be used to over-monitor workers' behaviors (employers monitoring employees' time spent making coffee, for instance), and in another it suggested differences in the appropriateness of sharing data for professional purposes compared to personal use [26]. In vehicles, the Event Data Recorder (EDR) was acceptable in professional vehicles, but most participants refused to equip their private car, as they perceived this to be an invasion of privacy.
As mentioned in the section on relational privacy in the home, this level of detail in data collection has the potential to be a vector of control, providing opportunities for more surveillance than would be expected in private life. On the other hand, increased patterns of home working are likely to blur the distinction between home and work life, and between home and work energy demands. Before Covid-19, figures from the Office for National Statistics [67] showed that 13.7% of the UK workforce worked from home, part of an already rising trend. (The latest figures from the ONS, 2020, under Covid-19 show this is now 46.6%.) Homeworking may become the default for those that can, at least for the foreseeable future.
In terms of privacy, people are likely to share their home with others who are not part of the same organization, and the confidentiality and security of data may be at risk in the home. There may be security issues around the transmission and storage of data and the use of personal devices for work. This was a concern raised by organizations, but not by individuals.

Pro-active communication and outreach
Most of the information on types of preferred communication came from one large study comprising four case studies with several thousand participants (Vermont Trasco [38]). The case studies described lessons learned in the implementation of community-wide smart local energy systems; however, privacy was discussed only briefly. The study found that customers want simple, timely and tailored communications (e.g. weekly emails about bills). Customers valued reliable and predictable bills; weekly emails were appreciated as a way to avoid monthly "bill shock".
It was clear that there was no one-size-fits-all solution when it comes to channels of communication, with some customers preferring self-guided means of accessing information, while others preferred active communication from the energy providers. Web portals were more mixed in their success and their overall effectiveness was still "an open question"; customers requested website interfaces that were easier to understand and use than current ones.

Types of privacy concerns and barriers and facilitators in the Macro socio-political, economic and cultural context
3.6.1. Types of privacy concerns in the Macro socio-political, economic and cultural context
The types of privacy concerns that related to the macro socio-political, economic and cultural contexts were surveillance and discrimination by social, political or market stratification. Concerns around surveillance referred to government spying and "Big Brother": a general unnerved sense of being watched, or investigated without any specific reason [10,14,20,22,26,28,30,32].
Data used to discriminate against individuals included price and product discrimination, where individuals were presented with different offers and tariffs based on their membership of groups of similar types. This erodes individuals' choices and rights by subsuming them into broader political, commercial and social groups. There were concerns that individuals would not be able to know that they were being treated in non-preferential ways, and so would be unable to challenge or rectify this [18,26,30,32,41].
3.6.2. Barriers and facilitators to data sharing in the Macro socio-political, economic, cultural context

3.6.3. Social norms
Social norms are defined as the "rules and standards that are understood by members of a group, and that guide or constrain social behaviors without the force of law" [68, p. 152]. In a review of reviews on social norms [48], the authors found general agreement across the reviews on what social norms were: they incorporated some element of the "social" and they usually help with decision making in some way. Most of the reviews said that social norms incorporate an understanding of cooperation and social order, and that they affect people's wellbeing. On the other hand, social norms can encourage negative behaviors, such as smoking or drinking alcohol, discrimination, and even violence against socially perceived outgroups. Expectations of social norms are not seen to influence participation more than individual beliefs about technology, concerns for the environment, or incentives. Widespread adoption may help establish a social norm and the expectation that something that is widely used, or approved of by one's friends, can be assumed to be trustworthy. The impact of increased individual understanding of smart meters was, however, contradictory in this context: it reduced the demand for social norms by emphasizing individual decision making, whilst simultaneously increasing the expectation of social norms against smart meters when more was known about the types of data they could collect [21,23,30,35].
Social norms around privacy and data sharing technologies influence people's decisions in that there is a societal consensus that privacy is an important aspect of life that should be preserved. Social norms can take time to catch up with the new challenges of emerging technologies and their uses. Companies that use data in novel ways may find resistance from customers who find its use "creepy", in that its use may be legal but not generally considered acceptable [49].
In related systematic reviews, social norms were found to have a significant effect on a range of pro-environmental behaviors, and in other studies people who expressed environmental concerns appeared to be less concerned about data sharing. In defining social norms, the review adds to the definition given above the idea that social norms are something that are conditionally followed, and motivated by external (vs. internal) enforcement. The gap between what people approve of doing (injunctive norms) and what people actually do (descriptive norms) can be considered another expression of the values-action gap.
For people who looked to social norms to decide whether new technologies are generally accepted, there is a risk of over or underestimating the general consensus view.
In the review of reviews of social norms [48], correcting misperceptions was, by far, the most commonly cited intervention through which to change social norms across the reviews.
Social norms, and misconceptions in normative beliefs, can be influenced or corrected by sending and receiving more accurate information, as individuals can over-estimate the approval or disapproval of others in the same social group to reflect their own beliefs. Injunctive norms are commonly more effective in changing behavior as they signal clearly the approval or disapproval of the group in taking a given action, and injunctive norms are more effectively used in interventions when framed positively [68]. Focusing only on the negative behavior one wants to change signals that this behavior is a social norm and so, for good or ill, acceptable.
The correcting misconceptions approach was often used by health interventions that encourage change in behavior by demonstrating that fewer (or more) people than thought engage in the behaviors.

Knowledge, understanding and acceptance are mediated by the public mood
Three studies considered the impact of the public mood on privacy and data sharing [11,13,30]. All consider the recent impacts of negative experiences of data sharing on general acceptance of and participation in smart meters [28], the smart grid [11] and energy markets, particularly in switching [13]. Negative media stories could communicate more widely any problems and issues around privacy and data sharing, affecting societal acceptance. For people coming from a low-knowledge base of energy systems, these media stories could be their main source of information on the risks of data sharing with energy companies. The intensity of the telling and retelling of news stories of data breaches and online privacy violations may amplify this perceived risk [69,70] to a wider group of people, now extending to platforms such as Twitter and Facebook [71]. Recent examples of how smart meters are represented in UK media, on both left and right, include headlines such as "Is your smart meter spying on you?" (Guardian, 2017) and "The smart meter snoopers: already in homes as part of a little known 20m plan to track energy habits" (Daily Mail, 2019).

The market
Allowing market forces alone to govern the types of in-home displays available to customers to help them change their energy use behaviors may be the reason uptake has been slow and the projected energy savings were not realized [38].
In the same study, it was suggested that competitive tendering needs to balance being specific and being realistic about the level of security that is required: being too specific could limit the market actors that can meet such high standards and so raise prices, and yet leaving the requirements too vague would lead to a "race to the bottom" in terms of pricing and standards.
Manipulation of the goods and services displayed to customers via targeted advertising was generally unpopular [26,30,40]. The balance of market forces and regulation is also context-specific; UK rules on using data for marketing goods and services fall under GDPR, which states (amongst other things) that "Marketers must offer a clear opt-out, inform the individual of the processing activity and have a compelling case for why someone may be interested in their goods or services."

Trust in governments
Trust in government represents confidence of citizens in the actions of a "government to do what is right and perceived fair" [72].
There were mixed impacts of trust in governments: in some cases, a lack of trust in government may encourage the independence offered by SLES, whether individually or in community projects [30]. More frequently, however, distrust of technology that passes data to government (e.g. smart meters) may discourage participation in SLES.
Many of the studies reported a general fear of government surveillance, but this concern was often non-specific: a general unnerved feeling of being watched, or of a creeping of authority and surveillance into the home, sometimes described as "Big Brother" [10,18,20,38].

"Big data"
The sharing of data for one purpose did not raise concerns; however, combining it with other data sets to form "big data" did. Such large-scale data uses could socially, politically and economically "sort" individuals [73], leading to discriminatory practices that, being commercial, would be subject to less scrutiny and regulation than data collected and used by the State. This may limit the choices available or make the correction of mistakes near impossible, as this segmentation would be hidden from the individual [41,30].

"I have nothing to hide. It is just that connections will be made between different
databases. That will result in a profile… For many that profile will be just fine, but for a small minority this profile will mark them as terrorists!" [30]

A systematic review of the potential applications of "big data" in smart energy management reported potential benefits [74], but it also listed security and privacy as one of the most serious challenges. The review suggested customer ownership of data as a right, with customer data used only with explicit permission. It also suggested governmental regulation and industry self-regulation as possible solutions.

Trust in corporations
With regards to data sharing, profit-making organizations tend to be trusted less than other data users [31]. This was either based on experience, or it was seen as safer by default to distrust them until more was known about them. Organizations with a vested interest were the least trusted (insurance companies for vehicle data, and advertisers across the board) [31].

At a national level, opting out is a condition of opting in
Public opposition is heightened where there is no opt-out provision. Providing 'opt outs' that require consent to share data will likely reduce public opposition to smart meters, but perhaps at the cost of slower uptake.
Where the implementation of smart meters was compulsory, this was met with organized, public resistance that was enough to delay national implementation (Netherlands [22,38], Canada, USA, [37] and European countries [26]).
In Germany, where smart meters are mandatory for large energy use customers, new builds and existing structures undergoing renovation, privacy and data security concerns were a major reason for the late introduction of an opt-out clause for customers (although not for large energy use customers or prosumers) [26].
National case studies of implementation of smart meters in the USA consistently found that giving customers the choice of opting out was a condition of opting in. But opt-out rates were also lower where there were financial penalties or costs of opting out.

Data policies
The regulatory environment applied to data sharing may affect public perceptions of risk and therefore the expectations of privacy protection that govern policy decisions [22,26,33,38]. Data policies in a single country may fall foul of regional (e.g. European) laws. In one study [38], the initial proposals of laws for smart meter rollouts did not consider consumer privacy beyond the Dutch data protection act and conflicted with Article 8 of the European Convention on Human Rights.

Demographic factors that impact on data sharing
3.7.1. Sharing the benefits: "Hard to reach" or "far to reach" groups
All demographic groups expressed concerns over data privacy, and there were no specific demographic characteristics that made people more or less likely to share data than others. Across all social groups, two factors indicated reluctance to share data: prior knowledge and prior experiences of data violations.
Most of the participants in the studies were well-informed, "tech-savvy", and already engaged in technology and data sharing. Often this was a condition of participation, or participants self-selected into the study based on their prior knowledge and interest in the topic. Groups that were underrepresented in studies were those often described as "hard to reach". Groups that are hard to reach are at risk of social exclusion or isolation and, in the context of technology or SLES, may be left behind from the benefits of shared decision-making and a decentralized, more democratic energy system. The definition of hard to reach is contested; it will vary from place to place, and understandings may differ based on the degree of "hard-to-reach-ness". One reconceptualization of "hard to reach" is "far to reach", which places the emphasis on the service provider to make the effort to overcome barriers [75].

People with a low income
People in lower socio-economic groups in the included studies said that they were less likely to be aware of the data sharing choices available with smart meters. Research from related areas suggests that people with a low income are less likely to have the technology resources that would build confidence in the privacy implications of data sharing [76]. On the other hand, people on a low income are more likely to rely on data sharing technologies that are insecure (such as cheap, older smart phones). In one US survey of low-income privacy attitudes, lower income groups were more, not less, concerned about potential privacy violations than their wealthier counterparts, as they were aware of the privacy risks but felt they had little choice in the technologies available to them [76].
Fuel poverty organizations such as National Energy Action (NEA), through its Action for Warm Homes campaign, recommend the sharing of energy use data to support fairer, accurate and transparent billing, and to give greater control over energy consumption [77].

Older people and technology use
Barriers and facilitators for older people identified in these studies included a greater interest in saving money than in environmental concerns [10] (people who are more environmentally concerned are less concerned about privacy when saving energy). This is echoed in other research finding that environmental concerns and active participation (joining environmental groups, outdoor recreation activities) decline with age [78].
Older participants expressed more expectations of (social) norms against smart meters [23] but, as with other groups, individual assessments of risk were more important than perceived social norms when making their decisions about smart home devices [79].
Evidence from related sectors shows that families are often involved in decisions about using home-based technologies, such as assistive technologies [80], and that providing additional support on tailoring the technology to meet individual needs encourages its use [79].
Older people's incentives to share data may differ from younger people's priorities: families may welcome the opportunity to remotely monitor their loved one, to be alerted to safety or health issues by tracking usual activities [20,26], but older people themselves had mixed feelings about data sharing, ranging from not wanting to be a burden to feelings that an over-reliance on technology could replace human contact. Older people were generally positive about the benefits of smart homes for assisting with independent living and health monitoring [81].
3.7.4. The principal-agent problem: Who pays and who benefits are at odds for tenants in the private sector
There was no direct evidence of privacy concerns being barriers or facilitators for tenants in the private rented sector in the energy-related studies. In related sectors, a systematic review of tenant and landlord perspectives on energy efficiency interventions points out that in areas of high demand, tenants may not feel in a position to bargain or negotiate with their landlords over sharing data for energy efficiencies, particularly if there are installation costs involved. In terms of privacy concerns, suggesting that tenants engage with sharing energy use data and install smart energy technologies may infringe upon the relational privacy between landlord and tenant in unwelcome ways. Tenants may not feel that any disruption brought about by installation is worthwhile if they are not staying in the property for long, or they may fear retaliation, in their tenancy not being renewed, if they develop a reputation for being troublesome. There is a mismatch between the investment of the landlord and the benefit to the tenant in energy efficiency measures: the so-called principal-agent problem.
"If the potential adopter [of energy efficiency measures] is not the party that pays the energy bill, then good information in the hands of the potential adopter may not be sufficient for optimal diffusion" [82] The review suggests that private rented tenants, without a central unified tenants association, are in a poor position to negotiate with landlords, and not necessarily uninterested in data sharing. Likely solutions to this problem include economic and regulatory incentives as well as working with both the landlord and tenant.
Further barriers for landlords include the time and financial investment needed when there is a high turnover of tenants, as technologies would fall into disuse as tenants change fairly frequently [83]. According to the Office for National Statistics, younger households are more likely to rent privately, with those in the 25 to 34 years age group representing the largest group. The private rented sector accounted for 20% of households in the UK [84].

Tenants in social housing
UK Government figures show that 17% of UK households (3.9 million) rented from social landlords in 2016-2018. The average age of social housing tenants tends to be older than in the private rented sector, and the average tenancy length in social housing is 11 years, compared to 3.9 years in the private sector [85].
While landlords in the private sector may be reluctant to install data sharing technologies for their tenants, social housing landlords appear more enthusiastic [86]. There are benefits of economies of scale and stability of tenure compared to the private rental sector.
A related systematic review [87] found that social housing tenants were more motivated by cost savings than environmental concerns (this may be a function of the older average age of social housing tenants). Tenants were willing to respond to energy use feedback, but this was sustained only with ongoing engagement and education efforts [88].
Lack of genuine engagement and involvement of social housing residents in the development and implementation of renewable energies led to declining trust and a lack of belief in the benefit of engagement, which in turn led to little or no change in energy use behaviors [89][90][91]. Residents may not have had much of a choice about whether to share data or not (an intrusion on their relational privacy), but they were able to resist in kind through non-participation in adapting their energy use behavior. Social housing residents characterized as elderly, fuel-poor, high heat users may be less likely to engage in or benefit from sharing data without additional support.
The review suggests that implementers should be careful not to "oversell" the positive outcomes expected [90,92]. Providers who work with and involve social housing tenants, and who cost in and provide ongoing support, education, training and maintenance of the technologies, were more likely to be successful in achieving tenants' acceptance and engagement [92].

Theories of change in studies
There were few studies that directly addressed people's privacy concerns with interventions designed to overcome them. Instead, studies described privacy concerns around data sharing technologies as potential barriers and facilitators to technology use and associated data sharing. Mechanisms were the behind-the-scenes cognitive, emotional or behavioral "triggers" that act as elements in decision making, balancing privacy concerns with the benefits of data sharing.
Privacy concerns were related to the perceived individual and social consequences that an unauthorized sharing or use of personal data would have in different contextual domains. As data sharing technologies are not without risk, interventions to overcome concerns about the impacts of privacy breaches will be successful if they can minimize the impact of any potential breach of privacy by design, as well as understand how different types of privacy concerns are incorporated into people's decision making about sharing data.
Thirteen studies explicitly referred to a theory of how interventions to address privacy concerns could work, while nine studies cited social theories and theories about individual behavior.
The theories in the studies could be grouped into types of mechanisms across a continuum of entirely social influences on behavior to entirely individual. Social behavior mechanisms were bound up in social norms to provide a framework on the general consensus on what is approved or disapproved of, whereas theories of individual decision making were bound in rational choice decisions of calculating risks against the rewards.
Privacy ethics lie between the normative understanding of social norms (what one should do) and individual decision-making (based on rational and free choice). These ethical frameworks provide the guiding principles on differential impacts by which different stakeholders can deliberate over what the risks and benefits of novel technologies could be, in the absence of established social norms and legal frameworks and without direct experience or full and perfect information. They are conditional in that they depend on the current contexts and the technology being reviewed at the time, and may change over time.
Five of these were studies of smart meters [18,20,23,26,41], three about the smart grid [19,30,36], two studies were about incentives [24,27], one study on energy governance [33], one study was about public vs private energy use feedback [15], and one study was about views on demand side response [16].
3.5.1. Theories of social behavior

3.5.2. Known, unknown and unknowable risks [20]
Programme theories around trust included the theory of "phantom risk". This theory aims to understand why there may be a lack of trust by lay persons in expert opinions on risks associated with unproven (sometimes disproven) causes. Examples of this include wireless radiation associated with mobile phone masts. In the context of smart meters, public opposition cited fears around potential health risks as well as fire hazards, loss of jobs, and threats to security (theft), which expert opinion dismissed as non-existent or minimal.
The theory of phantom risk indicates that factors other than privacy concerns alone may underpin people's concerns and non-acceptance of technology, such as lack of power and agency, sensationalist stories in the media, and lack of knowledge about the technologies [93][94][95].

3.5.3. Social norms [23] -private data
As mentioned previously, social norms are defined as the "rules and standards that are understood by members of a group, and that guide or constrain social behaviors without the force of law" [68, p. 152]. Social norms reflect a perception of a generally held positive or negative societal consensus. In theories of social norms, norms emerge in response to new behaviors that may incur a cost, and as such the behavior needs social regulation by approval or disapproval. In the case of smart meters, expectations of normative rules increase where threats to privacy present as potential costs of the new technologies. The more harmful an individual thinks the new behavior is, the greater the assumption that social norms exist, or will emerge, to control it. However, people often under- or overestimate the consensus view, and interventions that aim to influence social norms tend to focus on correcting such misconceptions. People's actual behavioral response to social norms is also conditional: people can still decide for themselves, and individual goals can take precedence regardless of the social norms, as these are norms rather than enforced rules.
This study used the understanding of social norms and norm emergence to examine the ways in which new privacy threats affect the expectations and demands for social norms in response to smart meters.
3.5.4. Social norms: theory of normative conduct and theory of warm glow altruism [15]
Social norms around energy conservation establish a moral benefit of conserving energy [96]; conversely, establishing one's green credentials by adopting energy conservation behaviors attracts societal approval.
This study sought to test this norm through a nudge intervention: by making energy use data public to comparable households (in university halls of residence), energy conservation would be visible for all to see, signalling one's "green virtues" and adding some gentle competition to encourage behavior change. Unlike providing only private data, making data public could activate the extrinsically motivated through the potential for social approval, with the associated benefit of an enhanced pro-social reputation. Common knowledge of energy conservation behaviors establishes normative conduct (i.e. this is what one should do), whereas keeping this data private does not. The theory of warm glow altruism supplies the reward for prosocial environmental behaviors: it feels good to do good.

Theories of privacy ethics: between social norms and individual decision making
3.6.1. Techno-ethics [41] and the framework of contextual integrity [97]
As the word suggests, techno-ethics explores the connections between the two worlds of information and communication technologies and ethics, a moral framework by which we can anticipate what people might consider good, acceptable or fair.
Smart meters pose new challenges to the norms of separation between private and public spheres, and raise potential conflicts between moral and political values. On the one hand, detailed data collection and sharing can ensure fair access to the benefits of convenience and efficiency; on the other, they create potential for intrusion into the personal and private sphere, or for discrimination in the political or economic sphere where governments or corporations treat people differently. The authors examine the norms that individuals refer to and rely on for expectations of ethical practice in data sharing and use, and the normative conflicts that underpin the anxieties and concerns around smart meters.
3.6.2. Theories of social practice [30,26]
Theories of social practice [30,26] take the everyday routines of people as the unit of analysis "to draw attention to the social and material context of human conduct" [30]. The theory balances two opposing paradigms: voluntarism, in which individuals and their attitudes and beliefs determine how they act; and the systemic, in which the structural, external pressures of new technologies and policies influence behaviors. By focusing on social practices, the theory captures the coproduction and feedback loops missing from purely individualist and structuralist explanations of how individuals think, choose and act, such as in managing their privacy and sharing data, while also drawing on the "rules of the game", culture and shared knowledge [98].
3.6.3. Social practice: usable privacy in smart meters [26]
In study [26] a practice-based approach is applied to understanding how people make sense of smart meters and privacy in the protected area of the home. The study considers how transparency in the communication of risk is used to make privacy decisions and is incorporated into everyday practices. In smart meter energy data collection, consumers try to understand the abstract nature of energy use data and apply this to the real-world implications of intended and unintended disclosure.
3.6.4. Social practice: consumers as change agents in smart grids [30]
This study considers householders' decision-making processes around participating (and sharing data) in smart grid development, and how these decisions and practices are shaped by the power and social relations in which the smart grid is embedded. It looks at the factors that enable these households to become "change agents" and the privacy and autonomy barriers that could prevent them from adopting new practices. A social practice approach recognizes that the households' role has changed over time from passive agents or captive consumers to active change agents, through expanded opportunities for cooperation via information sharing: horizontally, by engaging with other households in citizen- and community-led smart grid projects, and vertically, by opening up the household to the outside through outsourcing tasks and disclosing information. In this theory of social practice, smart local energy systems offer opportunities to transform the everyday "energy management practices" of households by revealing and quantifying ways of living through monitoring and the generation of data on energy consumption, thus making the home an explicit site of environmental action [30].
3.6.5. Theory of procedural justice: fairness in decision making [18]
In terms of fairness, procedural justice was the theory put forward by Guerreiro 2015 [18] to understand the acceptance and use of smart meters. Procedural justice holds when the processes of decision making are transparent, fair and appropriate [99] and involve stakeholders in the decision-making process. The understanding is that if the process is transparent and seen by all to be fair, and respects and recognizes people's agency, dignity and voice, then people are more likely to accept the outcome even if they don't agree with it. On the other hand, feelings of injustice and unfairness can create suspicion, resistance, and a loss of trust that is difficult to regain and will pose a barrier to acceptance and participation.
3.6.6. Social contract: data protection is a public good [33]
Social contract theory has a long history of explaining why people consent to being governed, apparently accepting restrictions on personal liberty for the benefits of social protection. A widely used definition is that citizens "comply with the fundamental social rules, laws, institutions, and/or principles of that society" "by rational agreement", even though individual reasons for complying differ [100]. From the 20th century on, this idea of contract by rational consent has considered the conditions under which citizens would consent [101], assuming they had all the relevant information on a particular issue and acted reasonably and fairly.
Given the complexities of consent, and the limits of what individuals can reasonably know and expect about what they are consenting to in terms of their privacy rights and data sharing, [102] argues that from a data implementer's perspective, data protection should also be framed not just in terms of individual rights and responsibilities but as a collective public good. Data sharing and consent should be limited to that to which individuals would reasonably and fairly choose to consent, and for purposes that they could reasonably expect. Services that use data in unexpected ways, while perhaps legal, may not yet be acceptable.
3.6.7. Scripting: meanings of shared control [19]
Scripting, in-scripting and de-scripting describe the different ways individuals can take control of and adapt prescribed behaviors that were "scripted" for them. The authors refer to two conflicting visions of the future energy consumer: either the main change is purely technical and customers passively respond to this new arrangement, or they are to be active agents of change themselves, engaging with the processes of energy provision and consumption.
The study describes ways of understanding control from an implementer and a customer perspective, such as "control over", technological control, or "being controlled", and the meaning of control in the reserved space of the home. The design of the smart grid trial in the study allowed for remote control of appliances as a way of balancing demand and appeared to promote a preference for passive customers in its design (in-scripted); in practice, however, customers found ways to negotiate with, adapt, and control devices to fit their everyday practices, at times at odds with the original intention and design (de-scripted).

Theories of individual behavior
3.7.1. Technology Acceptance Model and Theory of Reasoned Action [18]
The Technology Acceptance Model (TAM) proposed by Davis (1989) [103] has perceived ease of use and perceived usefulness as the mechanisms that influence attitudes towards smart meters and intention to use them [104][105][106].
In the same study, the author refers to the Theory of Reasoned Action (TRA) to explain how getting from intention to use to action was mediated by the individual's sense of self-efficacy and control, similar to the concept of ease of use in the Technology Acceptance Model. Both TAM and TRA assume that the individual balances the perceived risk of smart meters against their usefulness and ease of use, making a rational choice for smart meters.
3.7.2. Theory of planned behavior and locus of control [16]
The theory of planned behavior [107] links attitudes, subjective norms and behavior with two aspects of people's assessment of their ability to act effectively (self-efficacy) and the extent to which control is available to them (controllability).
A psychological explanation of a person's capacity to assess their ability to act effectively is the theory of locus of control [108]: the extent to which an individual believes that they have power and influence over the outcomes of events in their lives. An external locus of control is a belief that one is not in control of events, a fatalistic view, while an internal locus of control holds beliefs in one's own power to influence events and outcomes.
In terms of smart meters, the authors consider to what extent an individual's internal or external locus of control is associated with their intentions to engage in demand response energy use behaviors.

3.7.3. Innovation diffusion theory [36]
Innovation adoption and innovation diffusion theories are theories of individual behavior that explain and predict how and why some people within a social system are more willing than others to adopt new innovations [109], and through which channels and over what time period these adoptions spread. Early adoption may be mediated by individual personality attributes, such as openness to change, risk aversion and innovativeness, while the attributes of an innovation can affect whether it is adopted at all. Research in this area (e.g. Gärling) also suggests that consumers' involvement with the product has a strong effect on the intention to adopt, while demographic factors have less impact in predicting who will be an early-adopter consumer [111].
3.7.4. Incentives: game theories [24,27]
Two modelling studies [24,27] test what level of incentive offer tips the balance for a customer to trade privacy for benefits: in the first study in the form of discount offers, in the second in the form of virtual credits or coupons. In game theory terms, the first player is the energy provider, which needs fine, granular data from the customer to design reliable, responsive grid systems; this comes at a cost for the second player, the customer, in terms of sacrificed privacy and the potential risk of data being intercepted and the customer's safety compromised. Players in this scenario "compete" to achieve their interests.
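The incentive trade-off described in these studies can be illustrated with a minimal sketch. This is not the actual model used in [24] or [27]: the function names, payoff values and the simple accept/reject rule are all illustrative assumptions, standing in for the discount (or coupon) offer and the customer's privacy cost.

```python
# Illustrative sketch of the incentive trade-off (hypothetical payoffs,
# not the published models in [24,27]). Player 1 (the provider) offers
# a discount; player 2 (a customer) shares fine-grained energy data
# only if the discount outweighs their perceived privacy cost.

def customer_accepts(discount: float, privacy_cost: float) -> bool:
    """Customer trades privacy for benefit only when compensated enough."""
    return discount >= privacy_cost

def provider_best_offer(data_value: float, privacy_costs: list[float],
                        step: float = 0.5) -> tuple[float, float]:
    """Search discount levels for the offer maximizing the provider's
    net payoff: (value of data per accepting customer - discount paid),
    summed over all accepting customers."""
    best_offer, best_payoff = 0.0, 0.0
    offer = 0.0
    while offer <= data_value:
        acceptors = sum(1 for c in privacy_costs if customer_accepts(offer, c))
        payoff = acceptors * (data_value - offer)
        if payoff > best_payoff:
            best_offer, best_payoff = offer, payoff
        offer += step
    return best_offer, best_payoff

# Customers vary in how much they value their privacy (hypothetical costs):
# the provider's best offer compensates the moderate-cost customers but
# leaves the most privacy-sensitive customer unpersuaded.
offer, payoff = provider_best_offer(data_value=10.0, privacy_costs=[1.0, 3.0, 8.0])
```

Under these toy numbers the provider's optimum is to compensate only the less privacy-sensitive customers, which mirrors the studies' finding that a single incentive level leaves some customers outside the scheme.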

Guiding principles for interventions to address privacy concerns
From these barriers and enablers of data sharing and privacy concerns in different contexts, and the mechanisms underpinning how people make decisions about privacy and data sharing, we derived the following eight guiding principles for smart local energy service providers. These apply to all stakeholders involved in the design and delivery of smart local energy services, including energy service providers, community groups, and private and cooperative organizations.
1. Recognize the mutual benefits of data sharing for smart local energy systems and work with customers as partners
2. Involve people in the design of data sharing technologies from the start
3. Give people a say on the third parties that they are happy to share data with
4. Empower people to set the boundaries around the flow of information about themselves
5. Ensure that the purpose and value of the data collected is transparent and fair
6. Ensure that everyone that is affected by sharing of data is involved in giving their informed consent
7. Recognize that technologies for revealing and monitoring behaviors in the home can be used in unexpected and unwanted ways
8. Ensure there are channels of feedback and ongoing communication to continuously improve service delivery

Guiding principle 1. Recognize the mutual benefits of data sharing for smart local energy systems and work with customers as partners.
This principle was derived mainly from barriers and enablers in the meso system of community, and also one barrier relating to the wider socio-economic and cultural contexts. It rests on the recognition and acceptance (mechanism), through knowledge and understanding, that the benefits of sharing data are not one-sided, limited to promises of lower bills for the consumer, but are an integral part of the smart local energy system's sustainable success. The stated aims for SLES in the studies included reducing system costs, increasing the use of renewables, and reducing total energy consumption, especially peak demand. Energy customers' active participation in data sharing is an essential part of meeting these aims (outcomes). Working with customers as partners should overcome the privacy concern that data is being extracted and exploited for profit with little to gain on the customers' side (mechanism).
Some populations are at risk of exclusion from decision making about data sharing, for instance when any initial cost investment is not theirs, or when access to other income and benefits has been made conditional on their data sharing. SLES implementers should make resources available for ongoing support, education and involvement to realize ongoing participation for mutual benefit (mechanism).
Guiding principle 2. Involve people in the design of data sharing technologies from the start
Most of the privacy concerns expressed in the studies were around a sense of loss of choice, autonomy and control. This sense of loss was met with active resistance, even on a national scale, and with passive non-participation in other cases (outcomes). It was overcome when implementers tailored their approaches to address individual concerns and values and were clear about what benefits could be expected, while not overselling those benefits (mechanisms). Some people were keen to save money, while others were attracted to the environmental benefits of conserving energy and enabling more renewable energy (mechanism). While people who were familiar with technology were more likely to be research participants (as this was a factor in self-selection or a condition of participation), some participants still found the website portals difficult to understand and use. Others found integrating new technologies with other connected devices in the home challenging and were surprised when integration was not as straightforward as they had expected. Technologies that involve people in their design at the earliest opportunity are more likely to be trusted (mechanism) and used (outcomes).
Guiding principle 3. Give people a say on the third parties that they are happy to share data with
People are already familiar to some extent with the right to have some control over privacy settings, and new technologies are assumed to include these everyday controls (contexts). However, usable privacy policies, that is, those that people can actually read, understand and make informed decisions about, are currently the exception rather than the rule. SLES designers and implementers have an opportunity to build in the concept of usable privacy from the start. People differ in their opinions of which third parties are acceptable to share data with, although for-profit organizations are lower on the list for most people (mechanism). Some people find the idea of targeted advertising appealing, while for others it is the least acceptable use of their data. People should be given the option of choosing for themselves (mechanism).

Guiding principle 4. Empower people to set the boundaries around the flow of information about themselves
Privacy represented the controlled flow of information about oneself across different contexts: individual beliefs and values (mechanism); the family and home; and the wider contexts of work life, community, and the socio-economic and cultural domains (contexts). What is considered acceptable or appropriate sharing or revealing of information about oneself or one's family in one domain may not be in another. The loss of control over this flow of information was felt to be disempowering (mechanism). The setting of boundaries can vary across domains and purposes and change over time. To make privacy settings usable, they should be easy to access, understand and change (mechanism).

Guiding principle 5. Ensure that the purpose and value of the data collected is transparent and fair
A lack of clarity over the extent and purpose of data collection led to declining trust and ambivalence (mechanism) over whether people would really see any benefit themselves. Some were disappointed that the flexibility in energy use required to qualify for lower tariffs meant losing some functionality in their day-to-day routines that they did not anticipate (mechanisms). Concerns about the use of energy data were on the whole low (context), and people were more interested in how the technologies worked and for what purpose (mechanisms). Overwhelming people with hard-to-understand privacy conditions generated suspicion and a withdrawal of consent as a default safety position (outcomes).

Guiding principle 6. Ensure that everyone that is affected by sharing of data is involved in giving their informed consent
Data sharing has wider impacts, in both risk and benefit, than traditional notions of the single bill payer, and informed consent should reflect this, as the participation of all members of the household is needed for the benefits of data sharing to be realized (outcomes). Family members may differ in their priorities and the benefits they perceive in data sharing (mechanism): for instance, families of older people saw advantages in being able to remotely monitor their loved one, but this was received less enthusiastically by the older people themselves, who did not want such monitoring to replace human contact. There was an assumption in some studies that understanding new energy data sharing technologies would probably be too difficult for older people, while studies in other sectors suggested that they were in fact quite keen to learn about smart technologies' potential to retain independence and maintain health. Low-income and vulnerable people in receipt of social protection may already feel more monitored in their everyday lives than the general population, and energy-related benefits often make energy use data collection a condition of receipt. Social housing landlords may be enthusiastic about the economies of scale in energy efficiency but neglect to involve their tenants in this decision making. What was clear from the studies of energy data collection, and from related sectors, was that without involvement and support for the active participation of the person sharing the data, these energy efficiencies would not be realized, as the desired behavior change will not be sustained.

Guiding principle 7. Recognize that technologies for revealing and monitoring behaviors in the home can be used in unexpected and unwanted ways
New technologies, and new uses for technologies, can have unintended consequences (outcomes). Detailed energy use data (mechanism) can shift the balance of power within the household towards whoever can access and control that data. Reactions to previously unknown information about energy use ranged from finding it fun and interesting, to mildly annoying, to controlling and abusive. SLES should incorporate design principles that ensure energy use data cannot be exploited by those who seek to use it to control and abuse others.

Guiding principle 8. Ensure there are channels of feedback and ongoing communication to continuously improve service delivery
This principle is related to the previous one: new technologies and new uses of technologies can have unintended consequences. Taking an ethical approach to responsible innovation (RI) should include ways of quickly learning about and responding to these unintended consequences (outcomes). Open channels of communication will encourage collective reflection and evaluation by different stakeholders (mechanism) on the successes and challenges of new ways of using data and their potential and actual impacts. Technology innovation should be an ongoing process rather than a linear one of design followed by implementation, and should include mechanisms for evaluation, stakeholder involvement, redesign and refinement. Ethical issues may not be apparent at the design stage but emerge over time and from different perspectives.

Conclusions
People's expressed privacy concerns that potentially act as barriers to data sharing were wide ranging and depended on the domain in question. One barrier to participation with good supporting evidence was that people will resist intrusion on their autonomy, choice and control in the individual domain. This resistance could be active, by refusing to install data collection technologies for instance, or passive, through non-participation in changing or adapting energy use behaviors. Evidence from other sectors suggests that people are willing to accept new technologies and data sharing if the benefits of doing so are clear, anticipated and mutually beneficial, and include choice and control. Not-for-profits are more trusted than for-profit organizations to work in customers' interests.
Inclusion and informed consent will require active outreach from SLES providers in a variety of ways to meet people's abilities and preferences, as well as ongoing education and support to ensure that privacy concerns are adequately addressed, that the benefits of sharing data are realistic, and that participation is by informed and active choice.