An IoT Based Mobile Augmented Reality Application for Energy Visualization in Buildings Environments

Abstract: Augmented reality (AR) improves how we acquire, understand, and display information without distracting us from the real world. These technologies can be used in different applications and industries, as they can incorporate domain-specific visualizations on a real-world screen. Mobile augmented reality (MAR) essentially consists of superimposing virtual elements over real objects on the screen, giving added value and enriching the interaction with reality. In numerous plants, it is being used for maintenance and repair tasks, as well as for training. The Internet of Things (IoT) is increasingly pervading every aspect of our lives, including the power infrastructure of our buildings. IoT-enabled devices offer many connectivity options for helping supervise all-important energy assets.


Introduction
Nowadays, companies, non-governmental organizations (NGOs), and government institutions are adopting different corporate social responsibility (CSR) approaches, such as limiting the impact of a corporation on the natural environment. This commitment leads them to reduce this environmental impact, in all their operations, by constantly seeking reductions in energy consumption. Therefore, energy efficiency is not just an isolated effort to be cost-efficient, but a core target that reflects their attitude towards their customers. As the use of building management systems (BMSs) becomes more widespread, the starting point is to monitor energy consumption and, next, to promote energy-efficient behaviors. To achieve this consciousness, a manageable, but also more granular, visualization of energy data is required [1].
In general, data are available on functional parameters, design, or characteristics with respect to some standard. However, in many cases, the installation information related to energy is poor: when the information provided by the manufacturers is available at all, in most cases it is not standardized. Furthermore, knowing the energy information in the different operating states of each machine is difficult in practice [2]. In addition, as energy reports must be properly linked to CSR reports, detailed measurements and contextual analysis can contribute to reducing energy consumption. However, the absence of monitoring on the final load (so-called submetering or 'behind the meter' monitoring) has so far been an insurmountable barrier.
On-time energy information about equipment and systems helps in timely decision making. However, sharing the same data with the maintenance personnel when they stand before the equipment is difficult. The purpose of this work is to explore the usability of new interface applications and human-computer interaction (HCI) techniques for showing different electrical variables (active and reactive power, root mean square (RMS) voltage and RMS current, temperature, time) to raise energy-saving consciousness in building users. Moreover, a data fusion process can be used to relate the energy consumption data to the spatial data of each device in order to drive a change in the behavior of the users and help overall saving [3]. However, the proliferation of such complex technologies is not a guarantee of success unless it is consistently linked with user interfaces that relay detailed appliance usage information and then translate that information into energy savings. These data should be given in a form that enables comparisons among different dates, is understandable by non-expert persons, and provides information that helps the user understand which parameters have the most influence over the total energy consumption. Without this characteristic, users would need a longer time to perceive and analyze such large numbers and different graphs.
Augmented reality (AR) is an approach to reality in which physical objects are linked to a virtual counterpart, with contextual computer-generated information. AR has come a long way from a science-fiction dream to a clearly established scientific discipline [4]. Until recently, the cost of augmented reality made it somewhat prohibitive. Fortunately, the situation has changed, and AR can be implemented at a competitive cost in devices ranging from smartphones, tablets, and smart glasses to more complex, more immersive headsets. Nowadays, mobile augmented reality (MAR) is in the technology hype cycle, which has attracted the interest of both research and industry. Thus, with the development of smart mobile technology, many applications, services, and contents have proliferated. With the support of this type of computer tool, focused on expanding the perspective of the real world using mobile devices, virtual images are presented that transcend common perception and reveal a new universe to discover.
The measurement (also known as submetering when considering individual appliance measurements) and adequate visualization of each detail is the first step towards knowing why energy is consumed and, consequently, towards becoming aware and adopting proper measures to save electricity in any operation. This work researches the use of AR tools to improve the visualization of the energy consumption of each appliance, and thus make facilities more efficient and sustainable. The aim of the paper is to describe the architecture of the proposed system. Although the system can be installed in any residential area, as saving electricity is a target in every building, in this paper the case study has been developed in a nursing home for Down syndrome individuals in the Cordoba area run by an NGO. The reason for this is the important need for such NGOs to save money by reducing energy consumption. They receive limited public funding and are basically supported by private funding, which is preferably used in specific programs for people rather than in utility expenses. This NGO is trying to reduce its energy consumption as well as its CO2 emissions through different initiatives, including installing the developed system. Thus, the purpose is to employ AR technologies to support the visualization of energy data coming from Internet of Things (IoT) sensors embedded in smart appliances. The target user in this case is the maintenance team, as they have the basic knowledge to understand the electrical variables considered. In fact, this team is currently in charge of the electrical repair works when needed, so the information given through AR will be useful to meet the commitment to energy savings. Section 2 contains the state of the art. Section 3 presents the materials and methods. Section 4 is devoted to the evaluation of the platform and the discussion, including the impact and usability evaluation. Finally, Section 5 addresses the conclusions of this work.

Background
Augmented reality is undoubtedly one of the technologies that will transform the way human beings interact with their environment. This technology essentially consists of superimposing virtual elements over real objects on the screen, to give added value and enrich the interaction with reality. There are different fields where AR is currently applied, allowing the user to have a more real experience, such as video games, museums, and education. Furthermore, AR has matured enough to have been successfully applied in distinct industrial and commercial environments, such as smart factories [5], building management systems (BMSs) [6], and the aerospace and automobile industries [7]. AR has also been employed in power system substation environments in the works of [8,9]; in both papers, AR is used in simulation-based training. Applications applying AR to support data visualization in the infrastructure sector are spreading [7]. One reason is that AR annotations are one of the most efficient and intuitive ways to provide contextualized information about objects in real environments, as they appear in the same place as the object in question [8]. The most determining aspects of the design of an industrial AR system for the Industry 4.0 shipyard case were studied in the work of [10], including the latest technologies and the most relevant research and developments for the shipbuilding industry. A report on functional requirements for AR suppliers was recently published in order to improve performance and efficiency in different activities related to operation and maintenance engineering [11]. These guidelines will help develop products for industrial users, thus accelerating tasks by enabling immediate access in the field to real-time data, maintenance manuals, instruction guides, schemes, and so on.
On the other hand, MAR can provide an "emotional fit", that is, the ability to provide an aesthetically pleasing experience for specific shopping behaviors [12]. In a recent paper, MAR enabled people to compare the embodied carbon footprints of cereals and bottled water at a grocery store [13]. However, none of the mentioned papers are related to the visualization of energy information for customers. In this work, the aim is to build a MAR system for users to visualize, in near real-time, the energy consumption of their household appliances as well as additional power quality (PQ) parameters. This application provides users with a powerful tool to understand their residential energy consumption, helping them follow different strategies to reduce electricity consumption and, therefore, to reduce the bill. The recently published paper [14] presents a user-friendly monitoring system for power awareness based on IoT and AR; however, its measurements come from a traditional smart meter, without performing any measurement related to PQ or considering submetering. Having information about PQ can be a determining factor in saving energy and increasing system power capacity. Moreover, compared with the work of [15], our main contribution is to make possible "granularity" in the energy measurements offered to the user. So far, the lack of submetering, i.e., recording energy consumption "behind the meter" at system level or for individual appliances, has been an insurmountable barrier that prevented knowing how energy was consumed in detail. This work solves this issue with a novel, low-cost IoT sensor for measuring and analyzing PQ at the input of any individual alternating current (AC) appliance, providing an early detection and analysis system that controls those critical variables inside the facility and makes it possible to anticipate faults with early-stage alerts based on the treatment of on-time data streams.
Moreover, the recorded PQ parameters that are processed in the cloud system can help to reduce energy consumption, as PQ disturbances can be automatically analyzed and even compared to standard values.

Materials and Methods
In general terms, an AR system is composed of three main elements: a set of sensor nodes in charge of gathering information from different elements and points and storing it in the cloud; an AR user interface supported by a light device, such as a smartphone or glasses, through which users can access the information collected by the sensors; and, finally, an AR image target that allows linking the information with a physical point in a dynamic way. The combination of these three elements allows building an application that is much more intuitive than traditional ones. Figure 1 shows the general architecture of our AR system for monitoring energy parameters. The components of this platform are an IoT PQ sensor for voltage, real power, current, apparent power, and other important electrical parameters; a communication subsystem in charge of exchanging data between the sensors and the platform, and between the platform and the MAR device; and a server responsible for data analysis and for storage of the data collected from the smart meter.

IoT Power Quality Sensor
The main hardware device is an embedded IoT PQ sensor (see Figure 2). This sensor has been thoroughly described in the work of [15]; it is our own design and provides information related to PQ and energy consumption for a smart appliance within any facility. Here, the IoT PQ sensor is used to provide information about voltage, current, real power, apparent power, and other important electrical parameters. The power consumed in an AC load is called true power, also known as active or real power. On the other hand, the product of the two RMS values, voltage and current, is known as apparent power. While real power does the actual work in the load, apparent power is the total power in an AC circuit, both circulating and dissipated. Apparent power must be considered when designing and operating power systems because, although part of the associated current does no work at the load, it still must be supplied by the power source. Therefore, conductors, transformers, and generators must be sized to carry this total current. The ratio of active power to apparent power in a circuit is called the power factor. This is the reason to include both measurements in this IoT sensor. A system with a low power factor will have high circulating currents owing to energy that returns to the source from energy storage in the load. The consequences of those higher currents are losses and reduced efficiency.
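The quantities described above can be sketched numerically. The following Python snippet (an illustrative sketch, not the sensor's firmware) computes RMS voltage and current, real and apparent power, and the power factor from simultaneously sampled waveforms:

```python
import math

def power_metrics(voltage_samples, current_samples):
    """Compute RMS values, real/apparent power, and power factor
    from simultaneously sampled voltage and current waveforms."""
    n = len(voltage_samples)
    v_rms = math.sqrt(sum(v * v for v in voltage_samples) / n)
    i_rms = math.sqrt(sum(i * i for i in current_samples) / n)
    # Real (active) power: mean of the instantaneous power v(t) * i(t).
    p_real = sum(v * i for v, i in zip(voltage_samples, current_samples)) / n
    # Apparent power: product of the two RMS values.
    s_apparent = v_rms * i_rms
    # Power factor: ratio of real to apparent power.
    pf = p_real / s_apparent if s_apparent else 0.0
    return v_rms, i_rms, p_real, s_apparent, pf
```

For a purely resistive load (current in phase with voltage) the power factor is 1; a phase shift of 60 degrees between the waveforms gives a power factor of 0.5, illustrating the circulating current the text mentions.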
All this information will lead to a proper analysis of operational parameter patterns as well as knowledge of the power supply and load conditions of the multiple appliances in the building. These data will be useful for equipment maintenance and support, and ultimately for user energy awareness.

Software Components Overview
Any AR system is mainly composed of two parts: on the one hand, the data source in charge of generating and providing the information of interest (in our case, the IoT PQ sensor) and, on the other hand, the software components (app, data analysis, and so on) in charge of presenting this information to the end-user in a friendly way. In this section, the latter part is described.

Augmented Reality: Vuforia and Unity
To show the contextual energy information, an AR tool able to detect the target images on which the energy data are going to be overlapped is needed. At present, there are many libraries for the development of AR applications; some of the most popular are Wikitude, DeepAR, EasyAR, Kudan, Maxst, NyARToolkit, and ARCore. In comparison with these toolkits, Vuforia [16] gives access to a powerful set of functionalities, namely, recognition of different types of visual objects, text and environment recognition, development of platform-independent applications, and so on. Vuforia works around the image target concept, which represents images that the Vuforia Engine can detect and track (see Figure 3a). To carry out this process, Vuforia does not use traditional markers such as Data Matrix codes or QR codes, and the image targets used do not need special black and white regions to be recognized; instead, Vuforia detects and tracks the features that are naturally found in the image itself (see yellow points in Figure 3b) by comparing them against a known target resource database.
Once the system identifies an image target, it is tracked until it disappears from the visual field of the camera. This image is used as the base to overlap the information of interest; in our case, the energy information obtained by the IoT PQ sensor.
Another interesting feature of Vuforia is that it can be easily integrated with Unity [17], which is one of the most popular and powerful platforms for mobile game applications and is progressing steadily in the augmented reality area. In our case, Unity was used as the base infrastructure for the application layer of our AR system. In other words, Unity helped us to develop the augmented reality application and to design attractive and intuitive graphical interfaces for monitoring the IoT PQ sensor values through the Unity object concept (see Figure 3c). These objects (3D or 2D elements) are associated with the image target and are displayed on the screen whenever the application detects the associated marker (see Figure 3d). Thus, to deploy this PQ AR infrastructure, a combination of the Unity and Vuforia technologies was selected. In short, the image target is the image associated with the part of the real world on which the information is going to be overlapped, and the Unity objects are the virtual elements (text, 3D objects, and so on) overlapped on the image target.

IoT Software Platform: ThingSpeak
As shown in the system architecture, a fundamental part is the IoT software platform used to store the information obtained by the IoT sensor. An IoT cloud platform [18] offers a powerful way of storing the information collected by sensors and accessing it from anywhere in near real-time. For this project, ThingSpeak was selected as the cloud platform of the AR system.
ThingSpeak offers all the features of an IoT cloud platform: storage, remote access, data analysis, a friendly interface, and so on. To transmit information from sensor nodes to the cloud, ThingSpeak introduces the channel concept. Each channel allows storing up to eight fields of data, each of which can contain at most 255 alphanumeric characters. Furthermore, all the information sent through a ThingSpeak channel is date-time stamped and identified with a sequential ID. It is worth highlighting that these channels can be set up to be either private or public, and they are also associated with Uniform Resource Locators (URLs) that can be used to embed the data graphics in, for example, our AR application.
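As an illustration of the channel concept, the snippet below (a sketch; the API key and field values are placeholders) builds the REST update request for a channel and enforces the eight-field limit described above:

```python
from urllib.parse import urlencode

THINGSPEAK_UPDATE = "https://api.thingspeak.com/update"

def build_update_url(write_api_key, **fields):
    """Build a ThingSpeak channel-update URL.
    A channel stores at most eight data fields (field1..field8)."""
    if len(fields) > 8:
        raise ValueError("a ThingSpeak channel stores at most eight fields")
    params = {"api_key": write_api_key}
    # Map the given values onto field1..fieldN in order of appearance.
    for idx, value in enumerate(fields.values(), start=1):
        params[f"field{idx}"] = value
    return f"{THINGSPEAK_UPDATE}?{urlencode(params)}"
```

For example, `build_update_url("XXXX", v_rms=233.0, i_rms=1.2)` produces an update request carrying two fields; a ninth field raises an error.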

Application Workflow
The main objective of the application is to present power measurement data when a home appliance is within the view of the camera of a smartphone or tablet. In this way, the user can see on the screen a virtual object that contains the latest PQ measurements (obtained by the IoT PQ sensor) superimposed on the frame associated with the image target in the real environment.
This kind of application has two essential parts: training and recognition. During the former, the system learns which appliances must be recognized; during the latter, the system must recognize the appliances through a camera to show their energy information on them. Each of these parts is described in detail in the following subsections.

Application Workflow: Training
To begin, it is necessary to upload the image targets that the application must recognize in the real environment in order to combine them with energy information. This is carried out through the Vuforia Target Manager, a web-based tool that enables us to create and manage target databases and shows each image target and its feature points identified by the detection algorithms. These features are represented with yellow dots (see Figure 4), which are compared at run time with features in the live camera image. In order to know whether an image target can be easily identified, the Vuforia Target Manager assigns the image a rating between 1 and 5 stars. The higher the star rating, the better the identification. Figure 4 shows the image targets of a kitchen with a high degree of detectability (4 stars). Image targets can be stored in both local and cloud databases. The local database is recommended when the set of images is below 100; in this case, the database is stored together with the app on the devices. Of course, this solution will identify the image targets much faster. However, if thousands of image targets must be identified, the cloud database solution is needed. In that case, the image targets will be stored on the Vuforia Cloud Recognition service.

Application Workflow: Recognition
Once the whole scenario of interest has been scanned through different image targets, the system is ready to overlap the energy information with the frame captured by the smartphone/tablet when it focuses on a part of the real world previously captured and associated with an image target. The whole workflow is described by the flow diagram shown in Figure 5:

1. A smartphone camera captures an image (frame) within the region of interest.
2. The system extracts the features of the image.
3. The features obtained are compared with the features previously stored during the training process.
4. If the features match some of the stored features, the system gets the identification of the ThingSpeak channel associated with them.
5. The system accesses the channel of the ThingSpeak cloud platform to read the energy information associated with the image target.
6. The AR application overlaps, at runtime, the energy information with the image captured by the smartphone.

Figure 6 shows how the energy information is overlapped on the image target when the user focuses their smartphone on the fume hood. It is interesting to highlight that the visualization of this information can be improved using different helper visual effects. In this case, a colored rectangle is used to facilitate the visualization of the information in a general way, and, in order to help users observe essential data and their changes, different colors are applied to the text objects depending on the value of the energy parameters. Furthermore, users can always click on the data to visualize them through a ThingSpeak graph; this way, they will be able to see a larger set of data (not only the last value) shown in graphical form.
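The matching and lookup steps of this workflow (steps 3 to 6) can be sketched as follows; the set-overlap matcher and the 0.8 threshold are simplifications standing in for Vuforia's feature tracker, and all names here are illustrative:

```python
def recognize_and_overlay(frame_features, target_db, read_channel):
    """Match a frame's features against the trained targets and fetch the
    ThingSpeak data for the matched target (sketch of workflow steps 3-6)."""
    for target_id, (stored_features, channel_id) in target_db.items():
        # A real system would use Vuforia's matcher; set overlap is a stand-in.
        overlap = len(frame_features & stored_features) / len(stored_features)
        if overlap >= 0.8:  # assumed match threshold
            return target_id, read_channel(channel_id)  # data to overlay
    return None, None  # no target recognized in this frame
```

The `read_channel` callback is injected so the lookup against the cloud platform can be swapped for a stub during testing.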

Evaluation of the Platform and Discussion
The case study was developed in the kitchen of a nursing home for Down syndrome individuals in the Cordoba area run by an NGO. Once the IoT PQ sensor and the AR system were built to create a different way of showing the user the energy information of their appliances, the system was tested under real-world conditions. Two sensor nodes, each characterized by its own node identification, were installed to supply and monitor a fume hood and a microwave.
The evaluation of the whole energy AR system was split into two parts: (1) sensor devices and their integration with the cloud and (2) energy augmented reality application. Each of them is then explained in detail.

IoT PQ Sensor and ThingSpeak
In order to evaluate the IoT PQ sensor itself, as well as its integration with the cloud, an experiment including two IoT PQ sensors is described. Both sensors were attached to a fume hood and a microwave to obtain their energy information in real-time. The IoT devices were programmed to get data from their attached sensors (information about current and voltage) and send them to the ThingSpeak platform every 15 s. To be more precise, the IoT PQ device can provide six parameters: RMS voltage (Vrms), RMS current (Irms), active power, apparent power, power factor, and room temperature. This information is sent to the ThingSpeak servers using the channel concept under the publish/subscribe paradigm. Within a channel, a maximum of eight fields can be added per publication. In this case study, the IoT PQ devices publish the six parameters mentioned above, related to the microwave and the fume hood, under two different topics. A third topic includes aggregated energy information from the whole kitchen. Figure 7 shows the energy information obtained by an IoT PQ device installed in the kitchen. This information can be accessed from everywhere and from any device with an Internet connection. Figure 7a contains Vrms, which averages 233 V during the measured interval. Figure 7b presents Irms, varying from 0 to 12.8 A depending on the power selected in the microwave. Figure 7c shows the power factor, which, as expected, is close to one when the loads are connected. Accordingly, before 08:34 the power factor is close to zero, as there were no loads connected (Irms = 0). Finally, Figure 7d is the active power consumed by those two devices.
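The publication format can be illustrated as below. The topic layout follows ThingSpeak's MQTT publish convention (`channels/<id>/publish`), though the exact ordering of the six parameters across the fields is our assumption:

```python
def thingspeak_mqtt_message(channel_id, v_rms, i_rms, p_active,
                            s_apparent, pf, temp_c):
    """Pack the six IoT PQ parameters into a ThingSpeak MQTT publish
    message: one topic per appliance channel, fields in a fixed order."""
    topic = f"channels/{channel_id}/publish"
    # field1..field6 ordering is an assumption for illustration.
    payload = (f"field1={v_rms:.1f}&field2={i_rms:.2f}&field3={p_active:.1f}"
               f"&field4={s_apparent:.1f}&field5={pf:.3f}&field6={temp_c:.1f}")
    return topic, payload
```

A device would hand the returned topic and payload to any MQTT client library every 15 s; using one topic per appliance keeps the microwave and fume hood channels separate, as in the case study.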

Energy Augmented Reality System
In this section, to evaluate the AR system from three different points of view, three scenarios are analyzed: (1) selection of proper image targets, (2) evaluation of the system when the image targets correspond to real video frames obtained from real environments, and (3) evaluation of the system when the image targets correspond to printed images.

Image Targets Selection
In any AR system, it is very important to obtain proper image targets that later have to be recognized and on which virtual information related to them is going to be overlapped. One of the most influential parameters is light. If an image target is obtained under unfavorable lighting conditions, target detection and tracking can be significantly affected. Thus, it is important to make sure that there is enough light in the room so that the scene details are clearly visible in the camera view. Figure 8 shows the relation between the lighting conditions under which the microwave image target is taken and the rating assigned to it. As can be observed, when the luminosity is about 40%, the system detects fewer features (only a 1-star rating is assigned) than when the luminosity is about 90% (3 stars are assigned). It is important to highlight two more attributes that must be taken into account to achieve a high rating: (1) the images have to be rich in detail and (2) they should not contain repetitive patterns.

Energy Augmented Reality: Real Environment
To evaluate the correct performance of the whole energy AR system, an Android app was designed and implemented. The energy AR app was tested on a smartphone with the following hardware features: Huawei P20 Lite, Kirin 659 CPU, 4 GB RAM, Android v9.
As can be observed in Figure 9, whenever the user focuses their camera on the fume hood or the microwave, they are recognized by the AR app, and a pop-up window is shown with all the energy information generated by them.
It is worth mentioning that the AR app was designed to show the value of the energy parameters using different colors depending on how close they are to high or low levels; this way, users are able to interpret the information faster and in a more intuitive way. For example, when the active power value is below 25 W, the color is white, while it is blue if it is above 100 W, green if it is above 1000 W, and finally red if it is above 3000 W.
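The threshold mapping just described can be expressed directly; the behaviour in the 25-100 W gap, which the text leaves unspecified, is assumed here to remain white:

```python
def power_color(active_power_w):
    """Map active power (W) to a display colour, following the thresholds
    given in the text; the 25-100 W gap is assumed to stay white."""
    if active_power_w > 3000:
        return "red"
    if active_power_w > 1000:
        return "green"
    if active_power_w > 100:
        return "blue"
    return "white"
```

Ordering the checks from the highest threshold downward keeps each reading in exactly one colour band.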
Finally, it is also important to mention that, once the camera is focused on a device, for example, the microwave, the energy information is updated automatically every 15 s. This way, the user does not have to focus the camera on another point and go back to the microwave to have the information updated.
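A minimal sketch of this refresh cycle, using ThingSpeak's read endpoint for a channel's most recent entry; the fetch function is injected so the cadence logic can be shown without a live connection, and the channel ID and key are placeholders:

```python
import time
from urllib.parse import urlencode

def last_entry_url(channel_id, read_api_key=None):
    """URL of a channel's most recent entry (ThingSpeak REST read API)."""
    url = f"https://api.thingspeak.com/channels/{channel_id}/feeds/last.json"
    if read_api_key:  # only needed for private channels
        url += "?" + urlencode({"api_key": read_api_key})
    return url

def poll(channel_id, fetch, period_s=15, cycles=1):
    """Refresh loop matching the app's 15 s update cadence.
    `fetch` is injected so the loop can be tested without a network."""
    readings = []
    for _ in range(cycles):
        readings.append(fetch(last_entry_url(channel_id)))
        if cycles > 1:
            time.sleep(period_s)
    return readings
```

In the app itself, each poll result would simply replace the values shown in the overlay for the focused appliance.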

Energy Augmented Reality: Printed Environment
The AR app also allows us to carry out the augmented reality process by focusing the camera of the smartphone on a printed image target. In this test, the relation between the distance, and the camera incidence angle, with respect to the printed image needed to detect it properly was analyzed. Figure 10a shows that, for printed images whose sizes are between 6 cm and 25 cm, the distance at which the mobile augmented reality application detects the image target increases almost linearly; smaller sizes would be unfeasible for our AR application. However, larger sizes would be suitable, and even desirable if larger detection distances are needed. On the other hand, Figure 10b shows the relationship between the angle of incidence of the mobile device camera and the detection rate of the image. These results suggest that good detection is possible for angles between −50 and 50 degrees, which is perfectly suitable for the purposes of our application.

System Performance
It is important to guarantee the AR application response within the specified time limits. In this case study, the energy information generated by the appliances is continually changing, and thus the updates must be reflected on the smartphone of the end-user as soon as possible. In this test, the AR app was evaluated to analyze the time consumption of different variables of the application (see Figure 11) using the Unity Profiler tool [19]: (1) the AR rendering process shows the time taken by the CPU to perform rendering processes, batches, camera render, triangles, and rendered vertices. The vertical axis of the graph shown in Figure 11 represents the time required by the mobile device processor to process a change detected by the monitoring system. As can be appreciated, this period ranges from 0 ms to 20 ms. The horizontal axis represents the frames captured by the camera. Several graphs are plotted in Figure 11: the blue graph represents the time needed to run the scripts, and the green graph is the time taken to render the frame over the 4G mobile Internet. The yellow graph represents the response time needed to carry out a complete cycle.
Let us examine a specific case: the consumption times of each of these variables when frame 4 is captured and analyzed. In this case, the time taken to render the frame is 2.27 ms, and the time needed to run the scripts is 7.97 ms. It can be observed that most of the time for this frame is spent on the script operations (almost three times more). It can also be observed that the total execution time of the cycle is 18.83 ms. From the response time point of view, this last parameter is the most important one. The average response time is 16.45 ms. These results demonstrate that the use of Vuforia image targets, with immediate decoding results, combined with an IoT device, represents an excellent choice in terms of response time, which can be characterized as very reasonable in terms of interactivity.

Impact and Usability Evaluation
Finally, in order to quantitatively evaluate the impact of the energy AR system on the habits of the users (maintenance personnel) and its usability, two questionnaires were used. On the one hand, an impact questionnaire helps us to understand the effects that the system has produced on the users, providing an accurate understanding of the implications for the expected energy consumption savings. On the other hand, satisfaction is perhaps the most important aspect of usability because, no matter how efficient the system is, if users do not like it, they will not use it. As described in Section 1, all users of this application belong to the staff of the company in charge of maintenance for the NGO, with basic knowledge of electricity and awareness of the importance of energy-saving initiatives.
In total, 16 people were interviewed. The results and analysis of both questionnaires are described in the following subsections.

Impact Questionnaire
The impact of the system on the users was evaluated through a small questionnaire composed of five questions, where each question could be scored from 1 to 5 points, with 1 being strongly disagree and 5 being strongly agree. The questions were written to assess the impact of the system on the habits of the users, including questions about user awareness and privacy. The questions are listed below.

1. Do you think the energy AR application can influence your habits to reduce energy consumption?
2. Do you think the energy AR application would increase the understandability of your energy environment?
3. Do you think the energy AR application would bring significant benefits to your everyday life?
4. Do you think the energy AR application would threaten your privacy?
5. What is your general opinion about the prototype application?
In order to have feedback from the maintenance team, the results of the test were processed through statistics such as the mean and standard deviation (Std). They are shown in Table 1. On average, most questions were positively answered (above 4 out of 5), except for question 4, whose score is 3.06. Focusing on the results, the interviewed users consider that the system will positively influence their habits to reduce energy consumption (a total score of 89). Furthermore, the energy AR system will also help them to better understand their energy environment (a total score of 81). Finally, the scoring of question 3 also reveals a positive impact on the lives of the users. However, the question about privacy reveals a certain degree of concern; it is not a very negative score, but it indicates that the privacy aspects must be improved. The standard deviation is low, meaning that the answers are close to the mean, except for the question about privacy, in which users do not agree as much as in the rest of the questions. In general terms, as question 5 shows, users are happy with the proposed AR energy system and with the positive impact it will have on their work routine.

Usability Questionnaire
The satisfaction of the users with the energy AR system was measured through the system usability scale (SUS) questionnaire [20], which is the most used questionnaire for measuring perceptions of usability. This questionnaire provides a reliable tool for measuring usability, and its reliability has been proved in hundreds of publications. This method has recently been successfully employed in the evaluation of AR applications [21][22][23].
According to the authors of [24], a good SUS score for each individual should be above 68 (SUS score > 68). As shown in Table 2, the calculated SUS score for the conducted experiment is 80, and its standard deviation is 0.72. This means that the SUS score is not only above average, but also very close to 80.3; according to the SUS score ranking, when the score is 80.3 or higher, the system obtains grade A, which means users love the system and will recommend it to others. It is worth highlighting the standard deviation value, which shows that all the interviewed users share the same feeling.
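For reference, the standard SUS scoring formula used to obtain such per-respondent scores can be written as follows (a generic implementation of the published SUS method, not the authors' analysis code):

```python
def sus_score(responses):
    """System Usability Scale score for one respondent's ten answers (1-5).
    Odd-numbered items contribute (r - 1), even-numbered items (5 - r);
    the sum is scaled by 2.5 to give a 0-100 score."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly ten items")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5
```

Averaging the per-respondent scores over the 16 interviewees yields the group score reported in Table 2.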

Conclusions
Augmented reality is a technology that is increasingly present in many areas of our lives. It helps us to create a more natural interface between humans and the physical world, minimizing the number of hardware devices we must carry with us. First, computers were used as a means to visualize data generated by sensors to check the status of the physical world; nowadays, smartphones are everywhere and, in the very near future, instrumented glasses will be used to monitor the world around us, even with technology directly integrated with our eyes. In this work, a monitoring energy system based on augmented reality to visualize in real-time the PQ parameters and the energy consumption of home appliances in an easy way is presented; so easy that end users only have to focus their smartphones directly on the home appliance they are interested in to know its energy behavior. From an architectural point of view, the system comprises an IoT energy device developed by the authors to obtain the energy information of the home appliances; this information is then uploaded to the cloud (ThingSpeak platform) and eventually integrated with the augmented reality application when the frame captured by the smartphone's camera corresponds to a previously stored image target. The tests carried out show that the system easily detects the image targets and, most importantly, that the response time to detect and overlap the energy information on the captured frame is negligible. Furthermore, two questionnaires were given to the maintenance personnel to evaluate the impact of the system on their habits and its perceived usability. The results of both questionnaires show a positive impact from both perspectives.
Currently, we are working on the development of algorithms and graphical interfaces to help end-users to understand the meaning of the energy information presented to them. This is probably the best way of making users aware of their energy consumption.