Article

Impact Assessment for Building Energy Models Using Observed vs. Third-Party Weather Data Sets

by Eva Lucas Segarra 1,*,†, Germán Ramos Ruiz 1,†, Vicente Gutiérrez González 1, Antonis Peppas 2 and Carlos Fernández Bandera 1

1 School of Architecture, University of Navarra, 31009 Pamplona, Spain
2 School of Mining and Metallurgical Engineering, National Technical University of Athens (NTUA), 15780 Athens, Greece
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Sustainability 2020, 12(17), 6788; https://doi.org/10.3390/su12176788
Submission received: 20 July 2020 / Revised: 14 August 2020 / Accepted: 17 August 2020 / Published: 21 August 2020
(This article belongs to the Special Issue Sustainable Building and Indoor Air Quality)

Abstract:
The use of building energy models (BEMs) is becoming increasingly widespread for assessing the suitability of energy strategies in building environments. The accuracy of the results depends not only on the fit of the energy model used, but also on the required external files, the weather file being one of the most important. One source of meteorological data for a given period of time is an on-site weather station; however, such a station is not always available due to its high cost and maintenance requirements. This paper presents a methodology to analyze the impact on the simulation results of using an on-site weather station versus weather data calculated by a third-party provider, with the purpose of studying whether the third-party data can be used instead of the measured weather data. The methodology consists of three comparison analyses: weather data, energy demand, and indoor temperature. It is applied to four actual test sites located in three different locations. The energy study is analyzed at six different temporal resolutions in order to quantify how the variation in the energy demand increases as the time resolution decreases. The results showed differences of up to 38% between annual and hourly time resolutions. A sensitivity analysis is used to study the influence of each weather parameter on the energy demand and to determine which sensors are worth installing in an on-site weather station. In these test sites, the wind speed and outdoor temperature were the most influential weather parameters.

1. Introduction

The sustainable development goals report of 2019 highlighted the concern of the United Nations toward a more sustainable world where people can live peacefully on a healthy planet. One of the most important areas for the protection of our planet is the actions to mitigate climate change. “If we do not cut record-high greenhouse gas emissions now, global warming is projected to reach 1.5 °C in the coming decades” [1]. This concern was also endorsed by 186 parties in the Paris agreement on climate change in 2015 [2]. One of the strategies for tackling climate change is to reduce energy consumption (by increasing the system efficiency) and increase the use of clean energy so that greenhouse gas emissions are reduced. In this process of decarbonization, the buildings and the construction sector are critical elements, since as the Global Status Report for Buildings and Construction highlighted, they are responsible “for 36% of final energy use and 39% of energy and process-related carbon dioxide (CO2) emissions in 2018, 11% of which resulted from manufacturing building materials and products such as steel, cement and glass.” [3].
For this reason, building energy models (BEMs) play an important role in the understanding of how to reduce the energy consumed by buildings (lighting, equipment, heating, ventilation, and air-conditioning (HVAC) systems, etc.) and how to optimize their use. As Nguyen et al. highlighted, there is a huge variety of building performance simulation tools [4], and EnergyPlus is one of the most used [5]. In any simulation program, to obtain reliable results, the model must not only be adjusted to the behavior of the building, but all the files on which it depends must be reliable and suitable for the use that will be given to the model. Of all these files, the weather file is, perhaps, one of the most important [6].
Bhandari et al. highlighted that there are three kinds of weather data files, typical, actual, and future, which should be selected according to the use of the energy model [6]. The first corresponds to a representation of the weather pattern of a specific place taking into account a set of years (commonly 20–30 years). For each month, the data are selected from the year that was considered most “typical” for that month so that it represents the most moderate weather conditions, excludes weather extremities, and reflects long-term average conditions for a location. In general, they are used to design and study the behavior of the building under standard conditions, to obtain Energy Performance Certificates, to study the feasibility of a building’s refurbishment, etc. The typical files are not recommended in extreme conditions, such as designing HVAC systems for the worst case scenario.
There are two main sources of typical weather files: those developed by the National Renewable Energy Laboratory (NREL), which come from stations in the United States and its territories, where the different versions (typical meteorological year (TMY), TMY2, and TMY3) take into account different numbers of stations, time periods, solar radiation considerations, etc. [7,8]; and those developed by the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) as a result of the ASHRAE Research Project 1015 [9,10]: the International Weather for Energy Calculations (IWEC), which takes into account weather stations outside the United States and Canada. The latest version (IWEC2) covers 3012 worldwide locations [11] taking into account a 25 year period (1984–2008) from the Integrated Surface Hourly (ISH) weather database.
The second kind of weather data file, actual, corresponds to a specific location and time period. It could be obtained from an on-site weather station or by processing data from several nearby stations. The latter option is commonly used by external data providers. This type of file is usually used to carry out building energy model calibrations, calculate energy bills and utility costs, study the specific behavior of HVAC systems, etc. As can be seen, these files take into account extraordinary situations, such as heat waves, adverse or extraordinary atmospheric phenomena, etc. [12].
Finally, the last kind of weather data file, future, is mainly used to simulate how it is possible to adapt the building energy demand to an external energy requirement (demand response [13]) or to obtain better use strategies of HVAC systems in certain situations (model predictive control [14,15]). As Lazos et al. highlighted, there are three common groups of forecasting techniques: statistical, machine learning, and physical and numerical methods [16].
In short, each kind of weather data file serves a specific purpose; the reliability of the results will therefore largely depend on the accuracy and suitability of the file selected [17,18].
Many studies have highlighted the importance of the weather file when performing building energy analysis: its accuracy is generally taken for granted, even though it is out of the control of the person in charge of the simulation and its impact on the results can be significant. The following are some examples of studies that use typical weather files for applications where the weather data are relevant: retrofitting scenarios [19], quantifying renewable energy production when sizing photovoltaic solar panels [20], or obtaining ground temperatures [21]. Even when studying climate change, the accuracy of the weather files is very important, as they are the baseline files in the process of morphing the data [22,23,24]. Therefore, when using such files, it is important to use the most recent ones available [25].
There are other studies that analyze building energy performance using actual weather files, either from commercial vendors [26], developed using nearby weather stations not located at the building [6,27,28,29], or, less commonly, from weather stations placed in the building or in its surroundings [30]. Finally, regarding the future weather files, the uncertainty of the forecast files is closely related to the accuracy of the files on which they are based [31,32,33]. Many studies highlight the importance of the weather files, measuring, for example, their impact on passive buildings [34] or on micro-grids [35,36], calculating the loads of district energy systems [37] or the electricity consumption with demand response strategies [13], or evaluating the effect on comfort conditions [38]. Some analyzed the effect of certain parameters of the weather file, emphasizing temperature as the most sensitive value for load forecasting [39,40].
In terms of temporal resolutions, most research focused on annual results when comparing different sets of weather data. Wang et al. analyzed the uncertainties in annual energy consumption due to weather variations and operation parameters for a reference office prototype, concluding that uncertainties caused by operation parameters were much more significant than weather variations [26]. Crawley et al. analyzed the energy results using measured weather data for 30 years and several weather datasets for a set of five locations in the USA, and the variation in annual energy consumption was on average ± 5 % [17]. Seo et al. conducted a similar study, also analyzing the peak electrical demand with similar results: a maximum difference of 5% [41].
In terms of research that focused on monthly criteria, Bhandari et al. found that, when using different weather datasets, the annual energy consumption could vary around ± 7 % , but up to ± 40 % when monthly analysis was performed [6]. Radhi compared the building’s energy performance of using past and recent (up-to-date) weather data with annual and monthly criteria. This showed a difference of 14.5% between the annual electricity consumption simulated with past data and actual consumption, while this difference grew up to 21% for one month when monthly criteria were used [42]. Finally, there were a few studies where the temporal resolution was less than one month. With weekly criteria, Silvero et al. compared five different weather data sources with the observed meteorological data, showing that, for the annual criteria, the results were similar, but for the hottest and coldest weeks of the year, the inaccuracies increased [28].
The aim of this work is to show how to evaluate the impact of using two different actual weather datasets on building energy model simulations, measuring their effect on both energy demand and indoor temperature. The purpose is to analyze whether the data provided by on-site weather stations (with a high economic and maintenance cost) could be replaced by the actual data provided by a third-party. For this study, four test sites based on real buildings were used to compare the variations that arise when weather files built from on-site stations and from an external provider are used in the simulations. These test sites are part of the EU-funded H2020 research and innovation project SABINA (SmArt BI-directional multi eNergy gAteway) [43], which aims to develop new technology and financial models to connect, control, and actively manage generation and storage assets to exploit synergies between electrical flexibility and the thermal inertia of buildings. The energy demand variation was measured by grouping it into different periods (annual, seasonal, monthly, weekly, daily, and hourly) since, as explained by ASHRAE [44], “...the aggregated data will have a reduced scatter and associated CV(RMSE), favoring a model with less granular data.” The objective of the paper is to highlight these differences in the results when using different granularity criteria since, depending on the use of the building energy model, their influence can be significant, for example for calibration purposes, where monthly or hourly criteria are required. A sensitivity analysis was also performed to evaluate the influence of each weather parameter on the energy demand variation when using the two different actual weather datasets.
The main contributions of this research are: (1) four real test sites, with different uses and architectural characteristics, located in three different climates were employed in the study; (2) while most studies that analyze the influence of weather data on building energy simulation use the typical meteorological year (TMY) [19,20,21,22,23,24], this study compares two actual datasets: on-site and third-party weather data; (3) studies that use actual weather data usually lack a local weather station due to its expensive installation [6,27,28,29]; in contrast, the observed weather data for this study were provided by three weather stations installed on the building roofs or in their surroundings, providing on-site measurements; and (4) the energy results are shown at different temporal resolutions (from annual to hourly) in order to highlight the differences in the variations when the data are accumulated.
The paper is structured as follows. In Section 2, we describe the methodology used to analyze both the differences between the on-site and third-party weather datasets and the variation produced by these weather files in the results of the simulations in terms of the energy demand and in terms of the indoor temperature. In Section 3, we show the results obtained from the different approaches: the weather datasets comparison (Section 3.1), energy demand (Section 3.2), and indoor temperature (Section 3.3). Finally, in Section 4, we discuss the results obtained in the study, and in Section 5, the conclusions are presented.

2. Methodology

Figure 1 shows the diagram of the methodology used in this study and the three analyses that were performed: (1) the weather data comparison between the data provided by the on-site weather stations and the third-party data; and, through energy model simulations using these two weather datasets, (2) the energy demand comparison and (3) the indoor temperature comparison.

2.1. Weather Data Comparison Methodology

Once the weather data from the on-site weather station and the third-party are gathered, the first analysis is the comparison using a Taylor diagram [45,46]. This provides a simple way of visually showing how closely a pattern matches an observation, and it is a useful tool to easily compare different parameters at a glance using the same plot. This type of comparison is widely used when weather parameters are analyzed [28,47,48,49,50,51]. Developed by Taylor in 2001, this diagram shows the correspondence between two patterns (in this case, third-party weather data as the test field (f) and on-site weather data as the reference field (r)) using three statistical metrics: the correlation R, the centered root-mean-squared difference RMS_diff, and the standard deviation σ of the test and reference fields.
The correlation R (3) is used to show how strongly the two fields are related, and it ranges from −1 to 1. The centered root-mean-squared difference RMS_diff (4) measures the degree of adjustment in amplitude; the closer to 0, the more similar the patterns are. Both indexes provide complementary information quantifying the correspondence between the two fields, but to have a more complete characterization, their variances are also necessary, which are represented by their standard deviations σ_f (1) and σ_r (2) [46]. To allow the comparison between different weather parameters and to show them in the same plot, RMS_diff and σ_f are normalized by dividing both by the standard deviation of the observations (σ_r). Thus, the normalized reference data have the following values: σ_r = 1, RMS_diff = 0, and R = 1.
Figure 2 shows the Taylor diagram baseline plot and how it is constructed. The reference point appears on the x-axis as a grey point. The azimuthal position gives the correlation coefficient R between the two fields. The standard deviation of the test field σ_f is proportional to the radial distance from the origin, with the dashed arc marking the reference standard deviation σ_r. Finally, the centered root-mean-squared difference RMS_diff between the test and reference patterns is proportional to the distance to the reference point, and the arcs indicate its value. The diagram allows us to rank the test fields by comparing their distances to the reference point.
In Figure 2, two test fields are represented as an example: Example 2 has a correlation of approximately 0.99, an RMS_diff of approximately 0.24, and a σ_f of approximately 1.15. Example 1 has a correlation of approximately 0.48, an RMS_diff of approximately 1.40, and a σ_f of approximately 1.55. Example 2 performs better than Example 1 since all its metrics are better, and the diagram shows this visually: Example 2 is closer to the reference point.
$$\sigma_f^2 = \frac{1}{N}\sum_{n=1}^{N}\left(f_n-\bar{f}\right)^2 \tag{1}$$
$$\sigma_r^2 = \frac{1}{N}\sum_{n=1}^{N}\left(r_n-\bar{r}\right)^2 \tag{2}$$
$$R = \frac{\frac{1}{N}\sum_{n=1}^{N}\left(f_n-\bar{f}\right)\left(r_n-\bar{r}\right)}{\sigma_f\,\sigma_r} \tag{3}$$
$$\mathrm{RMS}_{diff} = \left\{\frac{1}{N}\sum_{n=1}^{N}\left[\left(f_n-\bar{f}\right)-\left(r_n-\bar{r}\right)\right]^2\right\}^{1/2} \tag{4}$$
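As an illustration of Equations (1)–(4), the following Python sketch (not part of the original study) computes the normalized statistics that are plotted in the Taylor diagram:

```python
import numpy as np

def taylor_statistics(f, r):
    """Normalized Taylor-diagram statistics of a test field f (third-party data)
    against a reference field r (on-site data), following Equations (1)-(4)."""
    f = np.asarray(f, dtype=float)
    r = np.asarray(r, dtype=float)
    sigma_f = f.std()                                                       # Eq. (1)
    sigma_r = r.std()                                                       # Eq. (2)
    corr = np.mean((f - f.mean()) * (r - r.mean())) / (sigma_f * sigma_r)   # Eq. (3)
    rms_diff = np.sqrt(np.mean(((f - f.mean()) - (r - r.mean())) ** 2))     # Eq. (4)
    # Normalize by the reference standard deviation so that several weather
    # parameters can be shown on the same diagram (sigma_r -> 1).
    return corr, rms_diff / sigma_r, sigma_f / sigma_r
```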

2.2. Energy Analysis Methodology

The study of the impact on the energy demand due to the use of two different actual weather datasets is based on the energy simulation, and therefore, building energy models (BEM) are needed. For this study, detailed BEMs are employed using the EnergyPlus engine [52,53]. BEMs require weather files in the EPW format, which are created using the weather data from both sources (on-site and third-party weather data) and employing the Weather Converter tool [54] provided as an auxiliary program by EnergyPlus.
As shown in Figure 1, the energy analysis studies the impact on the energy demand when using both the on-site and third-party weather files. For each weather file, the BEM provides the energy demand required to meet the defined requirements (the temperature setpoint of each space). The energy results using the on-site weather file are established as the reference, as they correspond to the weather data measured in the building's surroundings. Comparing the variation between the energy demand results obtained with the two weather files allows us to analyze the impact of using weather data from third-party sources with respect to the reference.
In order to perform a deeper study, a sensitivity analysis is performed to analyze the effects on the energy demand generated by each weather parameter. The methodology consists of replacing variables (one variable at a time) from the on-site weather file with data from the third-party weather file and generating specific weather files for each parameter. For example, when the dry bulb temperature is analyzed, a weather file is prepared that has the dry bulb temperature data from the third-party, but the rest of the weather parameters are maintained the same as in the on-site weather file. This way, the impact on energy demand when using only dry bulb temperature data from a third-party can be studied. This procedure is done for the weather parameters analyzed in the weather comparison: the dry bulb temperature (Temp), relative humidity (RH), direct normal irradiation (DNI), diffuse horizontal irradiation (DHI), wind speed (WS), and wind direction (WD). These weather parameters are selected for the study as they are all used by EnergyPlus in the simulations, unlike other parameters, such as the global horizontal irradiation [54].
In the case of the other weather parameters provided by the weather stations, such as the atmospheric pressure and precipitation, they are not presented in this study as their impact in the BEMs is low. The process to perform the energy analysis is the same as before: BEM is simulated with the generated weather file with one parameter changed, obtaining an energy demand that is compared to the reference (the energy demand obtained using the on-site weather data).
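A minimal sketch of this one-parameter-at-a-time substitution is given below, assuming both datasets are already in EPW format; the column indices, header length, and file names are illustrative assumptions and should be checked against the EnergyPlus EPW documentation:

```python
import csv

# Hypothetical 0-based EPW column indices for the six swapped parameters;
# verify them against the EnergyPlus Auxiliary Programs documentation before use.
EPW_COLUMNS = {"Temp": 6, "RH": 8, "DNI": 14, "DHI": 15, "WD": 20, "WS": 21}
EPW_HEADER_LINES = 8  # EPW files start with eight header lines

def make_hybrid_epw(onsite_path, thirdparty_path, output_path, parameter):
    """Write a hybrid EPW file that keeps the on-site data for every field
    except `parameter`, which is replaced by the third-party values."""
    with open(onsite_path, newline="") as f:
        onsite_rows = list(csv.reader(f))
    with open(thirdparty_path, newline="") as f:
        third_rows = list(csv.reader(f))
    col = EPW_COLUMNS[parameter]
    for i in range(EPW_HEADER_LINES, len(onsite_rows)):
        onsite_rows[i][col] = third_rows[i][col]   # swap only the studied column
    with open(output_path, "w", newline="") as f:
        csv.writer(f).writerows(onsite_rows)

# e.g. make_hybrid_epw("onsite.epw", "thirdparty.epw", "hybrid_Temp.epw", "Temp")
```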
As was explained in the Introduction, most of the studies that analyzed the effect of using different weather datasets in the building energy simulations used only the annual energy results, and only some of them used smaller temporal resolutions (monthly or weekly). This study presents the analysis according to different temporal resolutions and discusses the differences in the results. The time granularity levels proposed are annual, seasonal, monthly, weekly, daily, and hourly. Thus, the uncertainty metrics calculated for the energy results are related to the accumulated energy demand provided by the model in year, season, month, week, day, and hour periods.
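For illustration only, and assuming the hourly simulated demand has been loaded into a pandas Series indexed by timestamp, the accumulation into the six temporal resolutions could be sketched as follows:

```python
import pandas as pd

def aggregate_demand(hourly_demand: pd.Series) -> dict:
    """Accumulate an hourly energy demand series (indexed by timestamp)
    into the six temporal resolutions used in the study."""
    return {
        "hourly": hourly_demand,
        "daily": hourly_demand.resample("D").sum(),
        "weekly": hourly_demand.resample("W").sum(),
        "monthly": hourly_demand.resample("M").sum(),
        "seasonal": hourly_demand.resample("Q-NOV").sum(),  # Dec-Feb, Mar-May, ...
        "annual": hourly_demand.resample("A").sum(),
    }
```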
For the statistical analysis of the results, three metrics are used in the study: the mean absolute deviation percent (MADP) (5), the coefficient of variation of the root-mean-squared error (CV(RMSE)) (6), and the coefficient of determination (R²) (7). The equations of these statistical indexes are shown as:
$$\mathrm{MADP} = \frac{\sum_{i=1}^{n}\left|y_i-\hat{y}_i\right|}{\sum_{i=1}^{n}\left|y_i\right|} \tag{5}$$
$$\mathrm{CV(RMSE)} = \frac{1}{\bar{y}}\sqrt{\frac{\sum_{i=1}^{n}\left(y_i-\hat{y}_i\right)^2}{n-p}} \tag{6}$$
$$R^2 = \left(\frac{n\sum_{i=1}^{n} y_i\,\hat{y}_i-\sum_{i=1}^{n} y_i\sum_{i=1}^{n}\hat{y}_i}{\sqrt{\left(n\sum_{i=1}^{n} y_i^2-\left(\sum_{i=1}^{n} y_i\right)^2\right)\left(n\sum_{i=1}^{n}\hat{y}_i^2-\left(\sum_{i=1}^{n}\hat{y}_i\right)^2\right)}}\right)^2 \tag{7}$$
In the equations, n is the number of observations, y_i the on-site measured data at moment i, and ŷ_i the third-party value at that moment.
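A minimal Python implementation of Equations (5)–(7) could look like the following sketch; the value p = 1 in CV(RMSE) is an assumption commonly used in calibration studies, not a value stated in this paper:

```python
import numpy as np

def madp(y, y_hat):
    """Mean absolute deviation percent, Equation (5), in %."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    return 100.0 * np.sum(np.abs(y - y_hat)) / np.sum(np.abs(y))

def cv_rmse(y, y_hat, p=1):
    """Coefficient of variation of the RMSE, Equation (6), in %."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    rmse = np.sqrt(np.sum((y - y_hat) ** 2) / (len(y) - p))
    return 100.0 * rmse / np.mean(y)

def r_squared(y, y_hat):
    """Coefficient of determination, Equation (7): squared Pearson correlation."""
    return float(np.corrcoef(y, y_hat)[0, 1] ** 2)
```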
MADP and CV(RMSE) are both quantitative indexes that show the results in percentage terms. They allow the comparison between different test sites, weather parameters, and time resolutions. MADP, which is also called MAD/mean in some studies [55], has advantages that overcome some shortcomings of other metrics: it is not infinite when the actual values are zero, does not become very large when actual values are close to zero, and does not take extreme values when managing low-volume data [55,56,57]. CV(RMSE), which gives a relatively high weight to large variations, is the other percentage metric selected for this study because it is a common metric in energy analysis. Indeed, the ASHRAE (American Society of Heating, Refrigerating and Air-Conditioning Engineers) Guidelines [44], FEMP (Federal Energy Management Program) [58], and IPMVP (International Performance Measurement and Verification Protocol) [59] use it to verify the accuracy of the models.
The coefficient of determination (R²) allows us to measure the linear relationship between the two patterns [60]. It ranges between 0.00 and 1.00, and higher values are better. It should be noted that uncertainty cannot be assessed using only this metric, as the linear relationship may be strong but still have a substantial bias.
In the study, the MADP and CV(RMSE) metrics are shown for all the temporal resolutions, from annual to hourly. However, R² is only analyzed for the hourly time grain, since studying the linear relationship of the variations at larger time grains, which have few data points, is not meaningful.

2.3. Indoor Temperature Analysis Methodology

The third comparison analysis studies the impact of using the two weather datasets (on-site and third-party weather files) on the building's indoor temperature. In order to allow the temperature comparison, the energy used by each model is fixed. In other words, the simulations with the on-site and third-party weather data use exactly the same energy; however, due to the differences in the weather parameters, the indoor temperature is different. The methodology consists of performing a first simulation with the on-site weather file to obtain the baseline energy demand for each thermal zone of the model. This baseline energy demand is then injected into the model using an EnergyPlus script for an HVAC machine that distributes that energy to each thermal zone. Then, the model is simulated with both the on-site and third-party weather files. The building temperature results of these two last simulations, obtained by unifying the thermal zone temperatures and weighting each with respect to its volume, are compared to analyze the impact on the indoor temperature conditions.
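A minimal sketch of the volume-weighted building temperature described above, with illustrative array names, could be:

```python
import numpy as np

def building_temperature(zone_temperatures, zone_volumes):
    """Volume-weighted mean indoor temperature of the building.

    zone_temperatures: array (n_timesteps, n_zones) of simulated zone temperatures
    zone_volumes:      array (n_zones,) with the air volume of each thermal zone
    """
    temps = np.asarray(zone_temperatures, dtype=float)
    volumes = np.asarray(zone_volumes, dtype=float)
    return temps @ volumes / volumes.sum()   # one value per time step
```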
In this case, two quantitative indexes (the mean absolute error MAE (8) and the root-mean-squared error RMSE (9)) and a qualitative index (R² (7)) are used to quantify the variation in the shape of the temperature curves.
$$\mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}\left|y_i-\hat{y}_i\right| \tag{8}$$
$$\mathrm{RMSE} = \left[\frac{1}{n}\sum_{i=1}^{n}\left(y_i-\hat{y}_i\right)^2\right]^{1/2} \tag{9}$$
The MAE and RMSE indexes are used to determine the average variation of the indoor temperature when using the different weather files in the simulations [61]. Both measure the average magnitude of the variation in the units of the variable of interest and are indifferent to the direction of the differences, overcoming cancellation errors. However, RMSE gives a relatively high weight to large deviations [62,63,64]. RMSE will always be greater than or equal to MAE (due to its quadratic nature); thus, the greater the difference between MAE and RMSE, the greater the variance between the individual dispersions in the sample. In this case, the three metrics are calculated for the hourly criteria.
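For completeness, illustrative implementations of Equations (8) and (9) are sketched below:

```python
import numpy as np

def mae(y, y_hat):
    """Mean absolute error, Equation (8)."""
    return float(np.mean(np.abs(np.asarray(y, float) - np.asarray(y_hat, float))))

def rmse(y, y_hat):
    """Root-mean-squared error, Equation (9)."""
    return float(np.sqrt(np.mean((np.asarray(y, float) - np.asarray(y_hat, float)) ** 2)))
```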

3. Results

The methodology described in the previous section was applied to four buildings located in three different real test sites for a period of one year (2019). The test sites were an office building at the University of Navarre in Pamplona (Spain), a public school in Gedved (Denmark), and two buildings in the Lavrion Technological and Cultural Park (LTCP) in Lavrion, Greece: H2SusBuild and an administrative building. As shown in Figure 1, three different analyses were performed in the study: weather dataset comparison, energy comparison, and indoor temperature comparison. The following three subsections develop the three analyses.

3.1. Weather Data Comparison

For the analysis and comparison of the weather data, the first step was the data gathering from the on-site and third-party sources for the three locations for the period of study, which is the whole year 2019. On-site weather data were provided by weather stations installed in the buildings' surroundings. In Pamplona and Gedved, the weather station was installed on the buildings' roofs. In the case of Lavrion, the weather station was placed in the Technological and Cultural Park where the two test sites were located, near H2SusBuild. Table 1 shows the range, resolution, and accuracy of the sensors that formed part of each weather station. In general, the sensors of Pamplona's weather station had the best accuracy. In the case of Lavrion, the diffuse solar radiation sensor had a manual shadow bar that required readjustment every two days.
The time period of the measured data gathered from the three weather stations is the whole year 2019. Despite the good quality of the measured weather data, the raw data contained small gaps, usually a few hours, so interpolation was performed in order to fill in the missing data. On the other hand, the Weather Converter tool, which is used to generate the weather files, allows undertaking a complementary validation of the data since it produces a warning if data out of the range are used in the weather files’ generation process.
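As an illustrative sketch of this gap-filling step (the file name, column names, and gap limit are assumptions, not values from the study):

```python
import pandas as pd

# Fill short gaps (a few hours) in the on-site records by time-based
# interpolation before building the EPW file.
raw = pd.read_csv("onsite_station_2019.csv",
                  parse_dates=["timestamp"], index_col="timestamp")
hourly = raw.resample("H").mean()                     # enforce a regular hourly index
filled = hourly.interpolate(method="time", limit=6)   # fill gaps of up to a few hours
```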
The third-party actual weather data for the year 2019 and for three locations were provided by meteoblue [65]. They are simulated historic data for a specific place and time calculated with models based on the NMM (nonhydrostatic meso-scale modeling) or NEMS (NOAA Environment Monitoring System) technology, which enables the inclusion of the detailed topography, ground cover, and surface cover. Further information about the computation of the weather data provided by meteoblue is available in [66].
The results of the weather parameter comparison between the data from the on-site weather stations and the third-party (meteoblue) are shown using the Taylor diagram described in Section 2.1. Figure 3 summarizes all the results, showing the three statistics (R, RMS_diff, and σ) for the whole period of study (2019), calculated on an hourly basis, for the six weather parameters analyzed (Temp, RH, DNI, DHI, WD, and WS) and for the three locations (each one represented in a different color).
For the Pamplona weather (represented in blue), the diagram shows that Temp was the third-party weather parameter for this location that best agreed with the on-site observations, as it had the highest correlation R of around 0.95, the smallest RMS_diff (around 0.3), and a σ_f (standard deviation) very close to the reference (around 0.95). RH, DNI, and DHI provided similar results, with a correlation of around 0.7–0.8, an RMS_diff between 0.5 and 0.6, and a good standard deviation. The parameters that correlated worst with the observed values were the wind parameters, especially WD (R = 0.09).
For the Gedved weather (red color), the Taylor diagram shows that Temp was the third-party weather parameter that agreed best with the on-site observations, with an R of around 0.97. As in Pamplona, the wind parameters delivered the results furthest from the reference point. WS had an acceptable correlation, but a very high standard deviation, and WD performed better for σ_f, but had a low correlation (less than 0.5). For the third location, Lavrion (green color), Temp also had a good correlation R (higher than 0.95). RH, DHI, and WS had a medium R for the observed data (around 0.8), but they presented differences in the other metrics. RH had a better standard deviation than the other two, and RH and DHI had a lower RMS_diff than WS. In this location, the parameter that provided the worst results was WD, which had the worst R (0.22) and the highest RMS_diff (1.5).
Comparing the statistical results for the three locations, the performance of the data provided by the third-party varied for each location. Gedved provided the best results for four of the six weather parameters (Temp, RH, DHI, and WD). In the three cases, Temp was the parameter that best matched the reference (R around 0.95, RMS_diff lower than 0.4, and σ_f near one), and WD was the worst (correlations lower than 0.5 and RMS_diff higher than 0.9). WS also provided poor correspondence with the observations, especially for the Gedved location. The rest of the parameters were scattered in the middle part of the diagram.
In Appendix A, a deeper analysis is shown where the statistical indexes for the monthly and seasonal data are represented in order to analyze their homogeneity. Figure A1, Figure A2 and Figure A3 show that the Temp, RH, DNI, DHI, and WD parameters for the three weathers were quite homogeneous, with the seasonal and monthly indexes quite concentrated, providing similar R, RMS_diff, and σ_f. There are some exceptions, such as DNI for November in Pamplona and January in Gedved, which agreed worse with the observations than for the rest of the months. On the other hand, WS had more heterogeneous monthly and seasonal results, since more scattered points are seen in the diagrams. In general, the winter and autumn months correlated the worst with the observed data.
Since the wind parameters produced higher variations when comparing on-site and third-party weather datasets, a deeper comparison analysis was performed using wind rose diagrams (see Figure 4). This diagram shows the distribution of the wind speed and wind direction. The rays point to the direction the wind is coming from, and their length indicates the frequency in percentage. The color depends on the wind speed, growing from blue to red colors. Pamplona’s wind rose shows that W S from the third-party data was much higher than observed, and although the prevailing direction was north in both cases, there were important differences in the frequency percentages. For Gedved, the third-party data provided much higher wind speed (yellow to red colors in the wind rose) than the observed data (blue colors) and a different prevailing wind direction. Finally, the wind roses for Lavrion show differences in the prevailing wind direction and very different wind speeds, being higher in the third-party wind rose.

3.2. Energy Analysis Results

After analyzing the variations in the different weather parameters between the on-site and third-party weather data for the three locations, the following analysis consisted of the study of the impact produced by these variations on the test sites' building energy demands using detailed BEMs. Figure 5 shows the four test sites analyzed in this study, with a photograph of each building and an image of the corresponding EnergyPlus model (colored by the different thermal zones of each building).
The first test site was the office building attached to the Architecture School at the University of Navarre in Pamplona (Spain). It hosts administration uses and classrooms for the postgraduate students. This building is a 755 square meter single-story building built in 1974. It has a concrete structure; the outdoor walls are built of red brick fabric (U value = 0.3 W/m²K); the flat roof has the insulation above the deck (U value = 0.2 W/m²K); and aluminum window frames were installed in situ with an air chamber. The Gedved public school (Denmark) consists of six buildings and was built in 1979 and then renewed in 2007. The library of one of the school buildings was selected as the test site. It is a one-story building with a total surface area of 1138 m², with a big central space—the library—and nine classrooms and serving spaces around it. The building walls consist of two brick layers with 150 mm insulation in between (U value = 0.27 W/m²K). The windows are two-layer double-glazed windows with cold frames. The ceiling is insulated with 200–250 mm mineral wool for the sloping and flat ceiling, respectively (U value = 0.07 W/m²K and 0.16 W/m²K, respectively), and the floor is made of concrete and contains 150 mm insulation under it (U value = 0.21 W/m²K).
In Lavrion (Greece), two buildings from the Lavrion Technological and Cultural Park (LTCP) were used as test sites: H2SusBuild and the administration building. H2SusBuild has a ground floor and an attic floor with a total surface area of approximately 505 m². The ground floor hosts a small kitchen, toilets, the control room, and the main area. The attic also hosts two offices and a meeting room. Its envelope consists of a concrete structure with double concrete block walls and single-glazed windows with aluminum frames. It also has external masonry consisting of double brick walls with 10 cm expanded polystyrene (EPS) insulation (U value = 0.25 W/m²K). The roof consists of metallic panels with a 2.5 cm polyurethane insulation layer in the middle (U value = 0.75 W/m²K). The administration building hosts the LTCP managing authority and administrative services. It is a two-story renovated neoclassic building with a surface area of about 644 m². The building envelope is made of stone approximately 70 cm thick (U value = 1.85 W/m²K) with wooden-framed double-glazed large windows. The roof consists of a wooden frame with gutter tiles placed on top (U value = 0.49 W/m²K).
For each building, an individual pattern of use corresponding to the actual use of the building was implemented in the simulation model. Each building had its own calendar of use, occupancy, and internal loads of electric equipment and lighting. Regarding the HVAC systems, setpoints and usage hours were defined for each building. The models of the office building in Pamplona and of H2SusBuild and the administration building in Lavrion implemented both heating and cooling systems, while the Gedved school had only a heating system. Table 2 shows the input data of the four models.
The results of the statistical analysis for the energy study are presented in Table 3. The table is divided into four sub-tables, one for each test site. They show the three uncertainty metrics calculated for the energy demand obtained from simulations using the on-site and third-party weather files (TPW), with on-site as the reference. The first column of each test site's table, designated as TPW, shows the difference in percentage (MADP) of the energy demand when the third-party weather file is used in the simulation with respect to the reference simulated with on-site weather data. With the inputs shown in Table 2, the models provide the following annual energy demands: 21.9 kWh/m² for the office building, 91.5 kWh/m² for the Gedved school, 142.2 kWh/m² for H2SusBuild, and 91.6 kWh/m² for the administration building.
The table allows performing two different analyses depending on how it is read. The vertical interpretation of the table shows the percentage results as a function of the time resolutions (from annual to hourly criteria) employed for the analysis. On the other hand, horizontally, the variations in the energy demand for the sensitivity analysis changing only one weather parameter at a time are presented (DHI, DNI, RH, Temp, WD, and WS).
The first analysis obtained from Table 3 was the influence of the time resolution used in the study of the energy demand variation. In this case, the analysis was done vertically, from the annual to the hourly criteria. The percentage metrics MADP and CV(RMSE) allowed us to compare the results for each time grain and study its influence on the results. Both indexes are closely related; however, CV(RMSE) gives a relatively high weight to large variations. It is remarkable that the differences between CV(RMSE) and MADP decreased as the time grain increased (from hourly to annual criteria) since, when the energy demand was accumulated, the outliers were minimized. The results on an hourly basis showed that the CV(RMSE) values were around twice the MADP values for the four sites and all the weather parameters. This indicates that significant outliers were present in the energy demand results when the simulations based on the on-site and third-party weather datasets were compared.
On the other hand, both the MADP and CV(RMSE) metrics showed how, in the four test sites, the variations in the energy demand grew as the time grain decreased, which matches ASHRAE's statement about energy data granularity [44]. If the results were analyzed with the accumulated energy demand for a period of time (i.e., monthly, annual, etc.), the energy variation was minimized with respect to the hourly analysis. For example, differences in MADP of up to 38% between the annual and hourly criteria can be seen in the results for the administration building. In this case, the MADP for the data accumulated over the year was only 1.29%; thus, the annual building energy demand simulated with the third-party weather file was very similar to the reference, simulated with the on-site weather file.
However, on an hourly basis, this variation grew to 39.45%, which is a significant deviation. The reason is that, alternately, in some cases the model simulation overestimated the energy demand needed by the building (the model demanded more energy than the reference), and in other cases the model underestimated it. When the data were accumulated from the hourly basis to longer periods of time (daily, weekly, monthly, seasonal, and annual), a compensation effect occurred, the deviations canceling each other out, which resulted in the minimization of the energy demand variation. As the length of the periods increased, so did the compensation effect and, therefore, the minimization of the variations.
It is also remarkable that, for all the test sites, the CV(RMSE) results showed high values for the monthly and hourly resolutions, which are the time criteria commonly used by the energy analysis standards.
The second analysis was the study of the influence of each weather parameter on the energy demand variation. In this case, the interpretation of the tables from Table 3 was done horizontally: the first column presents the results for the simulation with the third-party weather file (TPW), which had all the weather parameters changed, and the following columns show the results for the different weather parameters.
Comparing the results of the four test sites using the MADP and CV(RMSE) indexes, some common observations can be made. In all of them, the weather parameter that generated the least impact on the simulated energy demand was WD, even though it was the weather parameter that fit the on-site weather data the worst, as shown in the Taylor diagram (Figure 3). The reason is that the mechanical ventilation and infiltration EnergyPlus objects used in these models do not account for WD in the simulations [68].
On the other hand, in the four test sites, WS showed an important impact on the energy demand simulation outputs. This was mainly due to two causes: the first was the use of dynamic infiltrations introduced by the EnergyPlus object ZoneInfiltration:EffectiveLeakageArea, which takes the WS parameter into account in the calculations [68]; the leakage area in cm² was obtained from the calibration process previously developed by the authors [69,70,71]. The second was that the differences between the third-party WS data and the on-site data (see the Taylor diagram in Figure 3 and the wind roses in Figure 4) were significant.
In both the Gedved school and H2SusBuild, the third-party wind speed provided higher values, which generated a significant increase in the energy demand during almost all the year, with only a few moments showing a decrease in the energy demand. Therefore, the compensation effect between the overestimated and underestimated energy demand was reduced. This explains why the variation due to WS was high for all the time grains in these test sites. This effect was especially clear in the Gedved school, which did not have a cooling system. In this case, all the time grains for WS provided the same MADP because the higher WS of the third-party data always meant a higher heating energy demand and no energy demand compensation existed.
Regarding the Temp parameter, the weather data analysis (Section 3.1), based on an hourly time grain, showed it to be the parameter with the least variation between the on-site and third-party data for the three sites. However, the MADP and CV(RMSE) results, especially for the hourly criteria, showed that it had a significant impact on the energy demand in the four test sites. It was the second most influential parameter for the Gedved school, H2SusBuild, and the administration building, after wind speed, and the most influential one in the office building, with an MADP of around 26%. In relation to the solar irradiation, the Taylor diagram (Figure 3) showed that DHI from the third-party weather data provided a better correlation than DNI for the three locations, and this was reflected in the sensitivity energy analysis. For the Gedved school, these two parameters had less impact on the energy demand than for the other three models. The reason is that the school lacked a cooling system; therefore, in summer, when more solar radiation was available, no energy demand was taken into account.
To conclude the explanation of Table 3, the R² factor was analyzed. It compared the shape of the energy demand curves from the different simulations and showed that the energy demand simulated with the third-party weather data fit quite well with the energy demand simulated with the on-site data for the four test sites (with R² between approximately 82% and 95%). Regarding the different weather parameters, the results for each parameter matched the analysis of the hourly percentage indexes: the parameters with lower hourly MADP and CV(RMSE) values had higher R² values.
Finally, to show in a visual way the previous analysis of the influence of the time resolution employed in the study and the sensitivity of each weather parameter, the MADP index results are plotted in Figure 6. Each graph presents the MADP result for one test site. In the graphs, the six time resolutions are shown on the x axis, and the dashed line presents the results for the simulation with all the third-party weather parameters (TPW). Each color represents the results for one weather parameter of the sensitivity analysis. The graphs show how the variations in the energy demand grew as the time grain decreased, especially for Temp. They also show that WS was the most sensitive weather parameter for the Gedved school, H2SusBuild, and the administration building. Only in the case of the office building in Pamplona was WS the most sensitive weather parameter under an annual criterion; on an hourly basis, it changed to Temp.
Previous analyses showed the significant variations in the energy demand when using different actual weather datasets. In order to study if these differences in the energy demand were mostly due to the building architectures or to the weather, a complementary theoretical analysis was performed, and this is presented in the following section.

Analysis of the Buildings’ Architecture Influence on the Energy Results

Since four test sites were available for this research, a complementary study was performed to analyze the influence of the building's architecture on the previous energy results. The four buildings, which were completely different in terms of the materials, construction systems, thermal mass, and window-to-wall ratio, were simulated with the same weather data (on-site and third-party weather files). For this study, we selected the most homogeneous weather when comparing the third-party to the on-site weather data: as shown in the previous analysis, energy demands are very sensitive to WS, so the Gedved weather was discarded for this analysis because its WS was the one with the worst fit to the reference (see Figure 3). Temp was also a sensitive parameter, and the three weathers had similar statistical metrics. Finally, for the solar radiation parameters DHI and DNI, Pamplona's weather better matched the reference compared to Lavrion. Therefore, the Pamplona weather file was chosen to develop this theoretical study, and for this reason, the four models were configured to have the same internal loads, HVAC systems, and schedules as the Pamplona office building.
Figure 7 shows the MADP results for this study. The two graphs on the top present the results using the third-party weather file, which had all the weather parameters provided by the weather service. They show how, when the test sites were simulated with their own weather files (graph on the left), the MADP results and the trend of the curve were very different for the four test sites (each colored line represents one test site). However, when the test sites were simulated with Pamplona's weather file (graph on the right with dashed lines), the curves became very similar, reducing the differences in the MADP values and in the trend of the curve.
Thus, the main factor responsible for the variation in the energy demand was the weather dataset employed in the simulations and not the building's characteristics. This effect was also reflected in the results from the sensitivity analysis, which are also presented in Figure 7 for the Temp, DNI, DHI, and WS parameters. The curves from the graphs on the right, which correspond to the simulations of all the test sites with Pamplona's weather file, are very similar compared to the curves from the graphs on the left, especially in the case of wind speed. This study demonstrated the great influence of the weather parameters on the variation of the building's energy demand, almost independently of the model, and this showed the importance of the selection of the weather dataset used in the BEM simulations.

3.3. Indoor Temperature Analysis Results

In the indoor temperature study, MAE and RMSE for the hourly criteria were used in the analysis in order to measure the quantitative variation in the temperature curve, and R² was used to study the deviation in the temperature curve form. The results are shown in Table 4. In this case, the same energy was injected into the model for the two simulations, using the on-site and third-party weather datasets. The comparison of the thermal zones' temperatures provided by the simulations shows the influence of the weather dataset employed on the indoor temperature conditions. In the table, the results are shown using three criteria: (1) for all the thermal zones (All), where the statistical metrics are calculated using the indoor temperature of all the thermal zones of the building; and (2, 3) for the maximum (Max) and minimum (Min) temperature, where the metrics are calculated using only the temperature of the thermal zone that provides the maximum/minimum temperature in each time step, in order to compare the effect on the internal comfort conditions when using the two weather data sources. For this study, the statistical metrics were calculated only for the hourly criteria. The rest of the time grains were not considered since, for the temperature analysis, the data were not accumulated when longer periods were analyzed.
This study provided results similar to the previous energy results. Regarding the results for the temperature of all the thermal zones (All), the high R² values for the four test sites (from approximately 87% to 95%) showed that the shape of the indoor temperature curve was very similar when the two weather datasets were used in the simulation. However, the quantitative statistical metrics showed a significant impact on the indoor temperature. The Gedved school was the test site with the highest MAE (1.72 °C), in line with the energy analysis, where this test site reached deviations in the energy demand of around 45%. The main reasons why the Gedved school showed a higher impact on the indoor temperature were the lack of correlation of the wind speed between the on-site and third-party weather datasets and the way infiltrations were simulated, based on the leakage area. The office building in Pamplona was the site with the smallest impact on the indoor temperature (MAE of 0.55 °C) when the weather dataset was changed. When the maximum and minimum temperatures reached in the building were employed in the analysis (Max and Min in the table, respectively), the statistical metrics were similar to those of All. This means that the variation in the indoor temperature when using third-party weather data was stable and produced similar variations when the indoor conditions were at their minimum or maximum.
There were no large differences between the MAE and RMSE results for the four cases, which indicates that the variations in the indoor temperature were quite homogeneous, with no significant outliers. Figure 8 shows the results in a visual way with a scatter plot for each test site, where the temperature was weighted by the thermal zones' air volume. The office building in Pamplona (above left) had the least scattered temperature points, since they were closer to the black line than in the rest of the cases, and most of the points had a difference of 1 °C or less. On the other hand, the Gedved school (above right) provided the worst correlation, with almost all the temperature points showing a difference bigger than 1 °C due to the difference between the two weather files.

4. Discussion

This paper shows how to study the impact of using two different actual weather datasets on building energy model simulations (weather data for the year 2019): one weather dataset with data measured in the building's surroundings (on-site), which was considered the reference weather data; and the other supplied by a weather service provider (third-party). Four test sites with different uses and architectural characteristics, located in three different locations, were employed in the study. Three comparison analyses were performed: (1) a weather data comparison using a Taylor diagram, (2) an energy demand comparison, and (3) an indoor temperature comparison. In the case of the energy approach, a sensitivity analysis of the main weather parameters was also performed to study the influence of each parameter on the energy simulation. The energy results were provided at different temporal resolutions, from the annual to the hourly criteria, in order to highlight the differences in the results.
The results of the energy analysis showed that, as the time grain decreased, the impact of using different weather datasets grew, which agrees with ASHRAE [44]. Differences between the annual and hourly MADP of up to 38% are shown in the results. This was because, when the energy demand was accumulated over periods of time longer than an hour, the variation in the results was minimized due to the compensation effect of the underestimated and overestimated energy use. This must be taken into account when a weather data source is chosen according to its purpose. For instance, if the weather data will be used for model calibration purposes, it is important to take into consideration the monthly or hourly criteria, as the most used standards (ASHRAE [44], FEMP [58], and IPMVP [59]) employ these time grains for their recommendations, and, as the study showed, the use of different weather datasets had a significant impact on the CV(RMSE) results. Another application of BEM where the time grain of the analysis is relevant is model predictive control, where the hourly criteria are required.
The sensitivity analysis of the main weather parameters showed the different influence that each parameter had on the energy demand variation of each test site. In this regard, the relative humidity and wind direction had little influence on the models. In the case of the wind direction, the low influence was due to these test sites using mechanical ventilation instead of natural ventilation. On the other hand, the results also showed that the two parameters that produced the highest impact on the energy use were the wind speed and the temperature. The high influence of the wind speed on the energy demand was explained by the low correspondence between the third-party wind speed data and the on-site data and by the fact that the models employed in the study used dynamic infiltrations that take the wind speed into account, instead of constant infiltration values. Therefore, the energy results for the wind speed showed that particular attention should be paid to this parameter when BEMs use dynamic infiltrations, as it has a great influence on the model's energy performance.
With the available data, the results obtained in this study suggest that, for these models, some of the weather parameter data could be obtained from third-party weather sources, avoiding the installation of on-site sensors, as they had a low influence on the simulation results. This is the case for the relative humidity, the wind direction, and even the diffuse horizontal irradiation, whose sensor is very expensive. On the other hand, to obtain the wind speed and outdoor temperature data, which are the weather parameters shown to be the most influential on the models' energy performance, we recommend the installation of an anemometer and a temperature sensor near the building. Having both on-site and third-party weather data sources would allow the verification of the data. An on-site sensor would also provide information regarding the micro-climate generated by the surrounding characteristics of the building, which could be difficult to see reflected in the calculated weather data from a third-party.
The energy study showed that the weather dataset selected for the dynamic energy simulations had a great impact on the buildings' energy performance, especially for short temporal resolutions. To emphasize the impact of the weather datasets on the building energy models, a theoretical study was performed, simulating all the test sites with the Pamplona weather file. The results showed that all the weather parameters produced similar variations in the energy demand and also a similar trend of the curve for the different time grains, independently of the model. This demonstrated the significant role played by the weather data and the importance of their correct selection when performing building energy simulations.
The indoor temperature study also showed a significant impact of using different weather datasets. Although the high R² values for the four test sites indicated that the shape of the indoor temperature curves was similar with both the on-site and the third-party weather files, the quantitative metrics revealed a significant influence on the indoor temperature of the test sites, with an MAE above 1.5 °C in some cases.
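The sketch below illustrates the three indoor temperature metrics on hypothetical values. R² is taken here as the squared Pearson correlation, which rewards a matching curve shape even when a constant offset (and hence a large MAE) remains; this is a minimal illustration of the metrics, not the exact implementation behind Table 4.

```python
import numpy as np

def mae(reference: np.ndarray, test: np.ndarray) -> float:
    """Mean absolute error."""
    return float(np.mean(np.abs(reference - test)))

def rmse(reference: np.ndarray, test: np.ndarray) -> float:
    """Root-mean-squared error."""
    return float(np.sqrt(np.mean((reference - test) ** 2)))

def r2(reference: np.ndarray, test: np.ndarray) -> float:
    """Squared Pearson correlation: high when the curve shapes match,
    even if one series is shifted by a constant offset."""
    return float(np.corrcoef(reference, test)[0, 1] ** 2)

# Hypothetical indoor temperatures (°C) simulated with each weather file.
t_onsite = np.array([21.0, 21.4, 22.1, 23.0, 22.5, 21.8])
t_third_party = np.array([22.7, 23.0, 23.8, 24.6, 24.1, 23.4])

print(f"MAE  = {mae(t_onsite, t_third_party):.2f} °C")
print(f"RMSE = {rmse(t_onsite, t_third_party):.2f} °C")
print(f"R²   = {100 * r2(t_onsite, t_third_party):.1f} %")
```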
This paper showed the variation in the simulation results when two different actual weather datasets (on-site and third-party) were employed. In future research, it would be interesting to collect empirical energy and indoor temperature measurements from the test sites in order to determine which of the two weather datasets produces simulation results closer to reality.

5. Conclusions

The aim of this research was to show how to study the variations in energy demand and indoor temperature when two different actual weather data sources are used, in order to analyze whether the data from the sensors of an on-site weather station (with its high cost and maintenance requirements) could be replaced by the data provided by a third party. This research alone is not sufficient for a general evaluation of the impact of using third-party actual weather data, but it revealed relevant variations in the energy demand and indoor temperature of the four test sites when the two weather datasets were used in the simulations, especially at the hourly criteria used in calibration processes and in applications such as model predictive control. The study also showed that, for building energy models that employ dynamic infiltration, the influence of wind speed on the energy demand is relevant. These significant variations lead us to recommend that researchers analyze in detail the impact of third-party actual weather data on their building energy models before employing such data. For example, this methodology could be applied before investing in a weather station by installing a rented provisional one for a period of time. In this way, the most influential weather parameters for a specific building energy model and third-party weather provider can be identified, enabling an informed choice about which sensors are worth purchasing and installing.

Author Contributions

E.L.S. supervised the methodology used in the article, performed the simulations and the analysis, and wrote the manuscript. G.R.R. and C.F.B. developed the methodology and participated in the data analysis. V.G.G. and G.R.R. developed the EnergyPlus models. A.P. provided resources for the study. All the authors revised and verified the manuscript before sending it to the journal. All authors read and agreed to the published version of the manuscript.

Funding

This project received funding from the European Union’s Horizon 2020 research and innovation program under Grant Agreement No. 731211, project SABINA.

Acknowledgments

We would like to thank the National Technical University of Athens for providing the data of the H2SusBuild and administration building test sites located in Lavrion (Greece) and also Insero in the case of the Gedved school test site in Denmark.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ASHRAE: American Society of Heating, Refrigerating and Air-Conditioning Engineers
BEM: Building energy model
CV(RMSE): Coefficient of variance RMSE
DHI: Diffuse horizontal irradiation
DNI: Direct normal irradiation
EU: European Union
FEMP: Federal Energy Management Program
HVAC: Heating, ventilation, and air-conditioning
IPMVP: International Performance Measurement and Verification Protocol
LTCP: Lavrion Technological and Cultural Park
MAE: Mean absolute error
MADP: Mean absolute deviation percentage
R²: Coefficient of determination
RH: Relative humidity
RMSdiff: Centered root-mean-squared difference
RMSE: Root-mean-squared error
Temp: Temperature
TPW: Third-party weather data
WD: Wind direction
WS: Wind speed

Appendix A

The following figures show detailed Taylor diagrams for each of the weather locations analyzed in the study, complementing Figure 3: Figure A1 shows Pamplona's weather, Figure A2 Gedved's, and Figure A3 Lavrion's. For each location, six diagrams are shown, one per weather parameter (Temp, RH, DNI, DHI, WS, and WD). The diagrams show the statistical indexes (correlation R, centered root-mean-squared difference RMSdiff, and standard deviation σf) calculated on an hourly basis for annual, seasonal, and monthly data.
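As a reference for how these indexes are obtained, the sketch below computes the three statistics plotted in a normalized Taylor diagram for a pair of hypothetical hourly series, taking the on-site record as the reference; it is a minimal illustration of the definitions in [45,46], not the code used to produce the figures.

```python
import numpy as np

def taylor_statistics(reference: np.ndarray, test: np.ndarray) -> dict:
    """Statistics behind a normalized Taylor diagram."""
    r = float(np.corrcoef(reference, test)[0, 1])          # correlation R
    std_ref = reference.std()
    # Centered RMS difference: both means are removed before differencing.
    rms_diff = np.sqrt(np.mean(((test - test.mean())
                                - (reference - reference.mean())) ** 2))
    # Normalizing by the reference standard deviation puts every weather
    # parameter on the same diagram regardless of its physical units.
    return {"R": r,
            "sigma_norm": float(test.std() / std_ref),
            "RMSdiff_norm": float(rms_diff / std_ref)}

# Hypothetical hourly series: on-site (reference) vs. third-party values.
rng = np.random.default_rng(1)
onsite = rng.normal(15.0, 8.0, 8760)
third_party = 0.9 * onsite + rng.normal(0.0, 3.0, 8760)
print(taylor_statistics(onsite, third_party))
```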
Figure A1. Normalized Taylor diagrams of Pamplona (Spain) comparing on-site and third-party weather data, showing temperature, relative humidity, direct normal irradiation, diffuse horizontal irradiation, wind speed, and wind direction.
Figure A2. Normalized Taylor diagrams of Gedved (Denmark) comparing on-site and third-party weather data, showing temperature, relative humidity, direct normal irradiation, diffuse horizontal irradiation, wind speed, and wind direction.
Figure A3. Normalized Taylor diagrams of Lavrion (Greece) comparing on-site and third-party weather data, showing temperature, relative humidity, direct normal irradiation, diffuse horizontal irradiation, wind speed, and wind direction.

References

1. United Nations. The Sustainable Development Goals Report 2019; United Nations: New York, NY, USA, 2019; Available online: https://unstats.un.org/sdgs/report/2019/The-Sustainable-Development-Goals-Report-2019.pdf (accessed on 20 April 2020).
2. UNFCCC Secretariat. Paris Agreement. Report of the Conference of the Parties on Its Twenty-First Session, Held in Paris from 30 November to 13 December 2015; United Nations Framework Convention on Climate Change. 2015. Available online: https://en.wikipedia.org/wiki/Paris_Agreement (accessed on 20 April 2020).
3. IEA. Global Status Report for Buildings and Construction, towards a Zero-Emission Efficient and Resilient Buildings and Construction Sector; IEA: Paris, France, 2019.
4. IBPSA-USA—International Building Performance Simulation Association. Building Energy Software Tools (BEST Directory). Available online: https://www.buildingenergysoftwaretools.com/software-listing (accessed on 23 April 2020).
5. Nguyen, A.T.; Reiter, S.; Rigo, P. A review on simulation-based optimization methods applied to building performance analysis. Appl. Energy 2014, 113, 1043–1058.
6. Bhandari, M.; Shrestha, S.; New, J. Evaluation of weather datasets for building energy simulation. Energy Build. 2012, 49, 109–118.
7. Marion, W.; Urban, K. Users manual for TMY2s: Derived from the 1961–1990 National Solar Radiation Data Base; Technical Report; National Renewable Energy Laboratory: Golden, CO, USA, 1995.
8. Wilcox, S.; Marion, W. Users Manual for TMY3 Data Sets; National Renewable Energy Laboratory: Golden, CO, USA, 2008.
9. Thevenard, D.J.; Brunger, A.P. The development of typical weather years for international locations: Part I, algorithms. ASHRAE Trans. 2002, 108, 376–383.
10. Thevenard, D.J.; Brunger, A.P. The development of typical weather years for international locations: Part II, production/Discussion. ASHRAE Trans. 2002, 108, 480.
11. Joe, Y.; Fenxian, H.; Seo, D.; Krarti, M. Development of 3012 IWEC2 Weather Files for International Locations (RP-1477). ASHRAE Trans. 2014, 120, 340–355.
12. Henze, G.P.; Pfafferott, J.; Herkel, S.; Felsmann, C. Impact of adaptive comfort criteria and heat waves on optimal building thermal mass control. Energy Build. 2007, 39, 221–235.
13. Sandels, C.; Widén, J.; Nordström, L.; Andersson, E. Day-ahead predictions of electricity consumption in a Swedish office building from weather, occupancy, and temporal data. Energy Build. 2015, 108, 279–290.
14. Ramos Ruiz, G.; Lucas Segarra, E.; Fernández Bandera, C. Model predictive control optimization via genetic algorithm using a detailed building energy model. Energies 2019, 12, 34.
15. Fernández Bandera, C.; Pachano, J.; Salom, J.; Peppas, A.; Ramos Ruiz, G. Photovoltaic Plant Optimization to Leverage Electric Self Consumption by Harnessing Building Thermal Mass. Sustainability 2020, 12, 553.
16. Lazos, D.; Sproul, A.B.; Kay, M. Optimisation of energy management in commercial buildings with weather forecasting inputs: A review. Renew. Sustain. Energy Rev. 2014, 39, 587–603.
17. Crawley, D.B.; Huang, Y.J. Does it matter which weather data you use in energy simulations. User News 1997, 18, 2–12.
18. Crawley, D.B. Which weather data should you use for energy simulations of commercial buildings? Trans. Am. Soc. Heat. Refrig. Air Cond. Eng. 1998, 104, 498–515.
19. Erba, S.; Causone, F.; Armani, R. The effect of weather datasets on building energy simulation outputs. Energy Procedia 2017, 134, 545–554.
20. Voisin, J.; Darnon, M.; Jaouad, A.; Volatier, M.; Aimez, V.; Trovão, J.P. Climate impact analysis on the optimal sizing of a stand-alone hybrid building. Energy Build. 2020, 210, 109676.
21. González, V.G.; Ruiz, G.R.; Segarra, E.L.; Gordillo, G.C.; Bandera, C.F. Characterization of building foundation in building energy models. In Proceedings of the Building Simulation 2019: 16th Conference of IBPSA, Rome, Italy, 2–4 September 2019.
22. Jentsch, M.F.; James, P.A.; Bourikas, L.; Bahaj, A.S. Transforming existing weather data for worldwide locations to enable energy and building performance simulation under future climates. Renew. Energy 2013, 55, 514–524.
23. Jentsch, M.F.; Bahaj, A.S.; James, P.A. Climate change future proofing of buildings—Generation and assessment of building simulation weather files. Energy Build. 2008, 40, 2148–2168.
24. Dickinson, R.; Brannon, B. Generating future weather files for resilience. In Proceedings of the International Conference on Passive and Low Energy Architecture, Los Angeles, CA, USA, 11–13 July 2016; pp. 11–13.
25. Chow, T.T.; Chan, A.L.; Fong, K.; Lin, Z. Some perceptions on typical weather year—From the observations of Hong Kong and Macau. Sol. Energy 2006, 80, 459–467.
26. Wang, L.; Mathew, P.; Pang, X. Uncertainties in energy consumption introduced by building operations and weather for a medium-size office building. Energy Build. 2012, 53, 152–158.
27. Song, S.; Haberl, J.S. Analysis of the impact of using synthetic data correlated with measured data on the calibrated as-built simulation of a commercial building. Energy Build. 2013, 67, 97–107.
28. Silvero, F.; Lops, C.; Montelpare, S.; Rodrigues, F. Generation and assessment of local climatic data from numerical meteorological codes for calibration of building energy models. Energy Build. 2019, 188, 25–45.
29. Ciobanu, D.; Eftimie, E.; Jaliu, C. The influence of measured/simulated weather data on evaluating the energy need in buildings. Energy Procedia 2014, 48, 796–805.
30. Cuerda, E.; Guerra-Santin, O.; Sendra, J.J.; Neila, F.J. Understanding the performance gap in energy retrofitting: Measured input data for adjusting building simulation models. Energy Build. 2020, 209, 109688.
31. Du, H.; Barclay, M.; Jones, P.J. Generating high resolution near-future weather forecasts for urban scale building performance modelling. In Proceedings of the 15th IBPSA Conference, San Francisco, CA, USA, 7–9 August 2017.
32. Du, H.; Jones, P.; Segarra, E.L.; Bandera, C.F. Development of a REST API for obtaining site-specific historical and near-future weather data in EPW format. Presented at Building Simulation and Optimization, Cambridge, UK, 11–12 September 2018.
33. Du, H.; Bandera, C.F.; Chen, L. Nowcasting methods for optimising building performance. In Proceedings of the 16th IBPSA Conference, Rome, Italy, 2–4 September 2019.
34. Henze, G.P.; Kalz, D.E.; Felsmann, C.; Knabe, G. Impact of forecasting accuracy on predictive optimal control of active and passive building thermal storage inventory. HVAC&R Res. 2004, 10, 153–178.
35. Agüera-Pérez, A.; Palomares-Salas, J.C.; de la Rosa, J.J.G.; Florencias-Oliveros, O. Weather forecasts for microgrid energy management: Review, discussion and recommendations. Appl. Energy 2018, 228, 265–278.
36. Yan, X.; Abbes, D.; Francois, B. Uncertainty analysis for day ahead power reserve quantification in an urban microgrid including PV generators. Renew. Energy 2017, 106, 288–297.
37. Powell, K.M.; Sriprasad, A.; Cole, W.J.; Edgar, T.F. Heating, cooling, and electrical load forecasting for a large-scale district energy system. Energy 2014, 74, 877–885.
38. Petersen, S.; Bundgaard, K.W. The effect of weather forecast uncertainty on a predictive control concept for building systems operation. Appl. Energy 2014, 116, 311–321.
39. Zhao, J.; Duan, Y.; Liu, X. Uncertainty analysis of weather forecast data for cooling load forecasting based on the Monte Carlo method. Energies 2018, 11, 1900.
40. Haben, S.; Giasemidis, G.; Ziel, F.; Arora, S. Short term load forecasting and the effect of temperature at the low voltage level. Int. J. Forecast. 2019, 35, 1469–1484.
41. Seo, D.; Huang, Y.J.; Krarti, M. Impact of typical weather year selection approaches on energy analysis of buildings. ASHRAE Trans. 2010, 116, 416–427.
42. Radhi, H. A comparison of the accuracy of building energy analysis in Bahrain using data from different weather periods. Renew. Energy 2009, 34, 869–875.
43. SABINA SmArt BI-Directional Multi eNergy gAteway. Available online: https://sabina-project.eu/ (accessed on 20 April 2020).
44. Guideline, A. Guideline 14-2002, Measurement of Energy and Demand Savings; American Society of Heating, Ventilating, and Air Conditioning Engineers: Atlanta, GA, USA, 2002.
45. Taylor, K.E. Taylor Diagram Primer. 2005. Available online: http://wwwpcmdi.llnl.gov/about/staff/Taylor/CV/Taylor_diagram_primer.pdf (accessed on 20 April 2020).
46. Taylor, K.E. Summarizing multiple aspects of model performance in a single diagram. J. Geophys. Res. Atmos. 2001, 106, 7183–7192.
47. Dong, T.Y.; Dong, W.J.; Guo, Y.; Chou, J.M.; Yang, S.L.; Tian, D.; Yan, D.D. Future temperature changes over the critical Belt and Road region based on CMIP5 models. Adv. Clim. Chang. Res. 2018, 9, 57–65.
48. Chen, G.; Zhang, X.; Chen, P.; Yu, H.; Wan, R. Performance of tropical cyclone forecast in Western North Pacific in 2016. Trop. Cyclone Res. Rev. 2017, 6, 13–25.
49. Yan, G.; Wen-Jie, D.; Fu-Min, R.; Zong-Ci, Z.; Jian-Bin, H. Surface air temperature simulations over China with CMIP5 and CMIP3. Adv. Clim. Chang. Res. 2013, 4, 145–152.
50. Nabeel, A.; Athar, H. Stochastic projection of precipitation and wet and dry spells over Pakistan using IPCC AR5 based AOGCMs. Atmos. Res. 2020, 234, 104742.
51. De Assis Tavares, L.F.; Shadman, M.; de Freitas Assad, L.P.; Silva, C.; Landau, L.; Estefen, S.F. Assessment of the offshore wind technical potential for the Brazilian Southeast and South regions. Energy 2020, 196, 117097.
52. Crawley, D.B.; Lawrie, L.K.; Winkelmann, F.C.; Buhl, W.F.; Huang, Y.J.; Pedersen, C.O.; Strand, R.K.; Liesen, R.J.; Fisher, D.E.; Witte, M.J.; et al. EnergyPlus: Creating a new-generation building energy simulation program. Energy Build. 2001, 33, 319–331.
53. Crawley, D.B.; Lawrie, L.K.; Pedersen, C.O.; Winkelmann, F.C.; Witte, M.J.; Strand, R.K.; Liesen, R.J.; Buhl, W.F.; Huang, Y.J.; Henninger, R.H.; et al. EnergyPlus: An update. In Proceedings of the SimBuild, Boulder, CO, USA, 4–6 August 2004; Volume 1.
54. EnergyPlus. Auxiliary Programs: EnergyPlus™ version 8.9.0 Documentation; U.S. Department of Energy: Washington, DC, USA, 2018.
55. Kolassa, S.; Schütz, W. Advantages of the MAD/MEAN ratio over the MAPE. Foresight Int. J. Appl. Forecast. 2007, 6, 40–43.
56. Petojević, Z.; Gospavić, R.; Todorović, G. Estimation of thermal impulse response of a multi-layer building wall through in-situ experimental measurements in a dynamic regime with applications. Appl. Energy 2018, 228, 468–486.
57. Lucas Segarra, E.; Du, H.; Ramos Ruiz, G.; Fernández Bandera, C. Methodology for the quantification of the impact of weather forecasts in predictive simulation models. Energies 2019, 12, 1309.
58. U.S. Department of Energy. M&V Guidelines: Measurement and Verification for Federal Energy Projects Version 3.0; U.S. Department of Energy: Washington, DC, USA, 2008.
59. IPMVP Committee. International Performance Measurement and Verification Protocol: Concepts and Options for Determining Energy and Water Savings, Volume I; Technical Report; National Renewable Energy Laboratory: Golden, CO, USA, 2001.
60. González, V.G.; Colmenares, L.Á.; Fidalgo, J.F.L.; Ruiz, G.R.; Bandera, C.F. Uncertainy's Indices Assessment for Calibrated Energy Models. Energies 2019, 12, 2096.
61. Zhao, J.; Liu, X. A hybrid method of dynamic cooling and heating load forecasting for office buildings based on artificial intelligence and regression analysis. Energy Build. 2018, 174, 293–308.
62. Perez, R.; Cebecauer, T.; Šúri, M. Semi-empirical satellite models. In Solar Energy Forecasting and Resource Assessment; Academic Press: Boston, MA, USA, 2013; pp. 21–48.
63. Kato, T. Prediction of photovoltaic power generation output and network operation. In Integration of Distributed Energy Resources in Power Systems; Elsevier: Amsterdam, The Netherlands, 2016; pp. 77–108.
64. Willmott, C.J.; Matsuura, K. Advantages of the mean absolute error (MAE) over the root mean square error (RMSE) in assessing average model performance. Clim. Res. 2005, 30, 79–82.
65. Meteoblue. Available online: https://meteoblue.com/ (accessed on 20 April 2020).
66. Meteoblue Weather Simulation Data. Available online: https://content.meteoblue.com/en/specifications/data-sources/weather-simulation-data/ (accessed on 6 July 2020).
67. Guglielmetti, R.; Macumber, D.; Long, N. OpenStudio: An Open Source Integrated Analysis Platform; Technical Report; National Renewable Energy Laboratory (NREL): Golden, CO, USA, 2011.
68. EnergyPlus. EnergyPlus Input Output Reference; U.S. Department of Energy: Washington, DC, USA, 2018.
69. Ruiz, G.R.; Bandera, C.F.; Temes, T.G.A.; Gutierrez, A.S.O. Genetic algorithm for building envelope calibration. Appl. Energy 2016, 168, 691–705.
70. Ruiz, G.R.; Bandera, C.F. Analysis of uncertainty indices used for building envelope calibration. Appl. Energy 2017, 185, 82–94.
71. Fernández Bandera, C.; Ramos Ruiz, G. Towards a new generation of building envelope calibration. Energies 2017, 10, 2102.
Figure 1. Diagram of the methodology used to check the effect of using different actual weather data in the building energy simulations. Three analyses are performed in the study: weather data comparison (methodology explained in Section 2.1), energy comparison (Section 2.2), and indoor temperature comparison (Section 2.3).
Figure 2. A Taylor diagram example.
Figure 3. The normalized Taylor diagram for the three weather locations (Pamplona (Spain), Gedved (Denmark), and Lavrio (Greece)) compared on an hourly basis for on-site and third-party weather data for the year 2019. This shows the dry bulb temperature, relative humidity, direct normal irradiation, diffuse horizontal irradiation, wind speed, and wind direction.
Figure 4. Wind rose comparison between the weather station and third-party data of the Pamplona (Spain), Gedved (Denmark), and Lavrion (Greece) locations.
Figure 5. The test sites analyzed. From left to right: office building (University of Navarre, Spain); Gedved school (Denmark); H2SusBuild and administration building in Lavrio (Greece). On top, the real buildings and, on the bottom, the building energy models (SketchUp thermal zone representation, OpenStudio plugin [67]).
Figure 6. Representation of the MADP (%) of the energy demand analysis for the four test sites and for the different time grains. The dashed black line represents the results using the weather file with all the third-party weather parameters. Each colored line represents one of the weather parameter results from the sensitivity analysis.
Figure 7. The comparison between the simulation results when each test site was simulated with its own weather (on the left, continuous lines) and when all the test sites were simulated with Pamplona's weather (on the right, dashed lines). Each color represents a test site. The graphs show the MADP for the energy demand at the different temporal resolutions. From top to bottom are shown: results when the third-party weather file was used (all the parameters changed in the weather file) and the sensitivity analysis results for temperature, direct normal irradiation, diffuse horizontal irradiation, and wind speed.
Figure 8. Scatter plots for each test site of the indoor temperature simulated using the on-site and third-party weather files.
Table 1. Technical specifications of the sensors of the weather stations installed in the office building in Pamplona (Spain), in Gedved School (Denmark), and in the Technological and Cultural Park in Lavrion (Greece). Each cell gives range / resolution / accuracy.
| Sensor | Pamplona (Spain) | Gedved (Denmark) | Lavrion (Greece) |
| Temperature (°C) | −50 to +60 / 0.1 / ±0.2 | −40 to +60 / 0.1 / ±0.3 | −40 to +65 / 0.1 / ±0.5 |
| Relative Humidity (%) | 0 to 100 / 0.1 / ±2 | 0 to 100 / 1 / ±2.5 | 0 to 100 / 1 / ±3 (0–90%), ±4 (90–100%) |
| Atmospheric Pressure (mbar) | 300 to 1200 / 0.1 / ±0.5 | 150 to 1150 / 0.1 / ±1.5 | 880 to 1080 / ±0.1 / ±1 |
| Precipitation (mm) | 0.3 to 5.0 / 0.01 / - | - / 0.2 / ±2% | - / 0.2 / ±4%/0.25 (<50 mm/h), ±5%/0.25 (>50 mm/h) |
| Wind Direction (°) | 0 to 359.9 / 0.1 / <3 | 1 to 360 / 1 / 1% | 1 to 360 / 1 / ±4% |
| Wind Speed (m/s) | 0 to 75 / 0.1 / ±3% (0–35), ±5% (>35) | 1 to 96 / 1 / 0.1 (5–25) | 1 to 67 / 0.44 / ±1/±5% |
| Global Solar Radiation (W/m²) | 0 to ±1300 / 1 / <±10% | 0 to ±1300 / 1 / <±10% | 0 to 1500 / 1 / <10 |
| Diffuse Solar Radiation (W/m²) | 0 to ±1300 / 1 / <±10% | 0 to ±1300 / 1 / <±10% | 0 to 1500 / 1 / <20 |
Table 2. Input data of the four models.
| | Office Building, Pamplona (Spain) | Public School, Gedved (Denmark) | H2SusBuild, Lavrion (Greece) | Administration Building, Lavrion (Greece) |
| Lighting (W/m²) | 10 | 10 | 8.5 | 8.5 |
| Equipment (W/m²) | 8 | 15 | 8 | 8 |
| Occupation schedule | Wd 9–21 h / Sat 9–14 h | Wd 8–16 h / Sat 8–13 h | Wd 9–20 h / Sat 9–14 h | Wd 9–20 h / Sat 9–14 h |
| Heating setpoint (°C) | 20 | Day 21 / Night 15.6 | 21 | 21 |
| Cooling setpoint (°C) | 26 | No cooling | 24 | 24 |
Wd: weekdays. Sat: Saturdays.
Table 3. Uncertainty metrics (MADP, CV(RMSE), and R²) were used in the energy analysis for the four test sites. TPW: the results using the weather file with all the parameters from the third-party weather data source. The rest changed only one parameter at a time: DHI (diffuse horizontal irradiation), DNI (direct normal irradiation), RH (relative humidity), Temp (temperature), WD (wind direction), and WS (wind speed).

Office Building, Pamplona (Spain)
| Statistic Index | Time | TPW | DHI | DNI | RH | Temp | WD | WS |
| MADP (%) | Year | 1.42 | 0.79 | 4.33 | 0.50 | 4.07 | 0.03 | 7.95 |
| | Season | 9.61 | 3.93 | 9.01 | 3.17 | 4.07 | 0.03 | 11.74 |
| | Month | 18.14 | 4.65 | 10.29 | 2.97 | 12.07 | 0.03 | 13.60 |
| | Week | 17.88 | 5.24 | 11.17 | 3.40 | 14.39 | 0.04 | 13.99 |
| | Day | 26.03 | 6.25 | 11.80 | 4.02 | 22.53 | 0.05 | 14.08 |
| | Hour | 29.96 | 6.64 | 12.27 | 4.71 | 25.58 | 0.08 | 14.31 |
| CV(RMSE) (%) | Year | 1.42 | 0.79 | 4.33 | 0.50 | 4.07 | 0.03 | 7.95 |
| | Season | 10.78 | 4.72 | 10.11 | 4.00 | 4.54 | 0.03 | 15.48 |
| | Month | 26.12 | 5.55 | 13.15 | 4.02 | 14.61 | 0.04 | 17.96 |
| | Week | 26.72 | 7.12 | 15.04 | 5.17 | 19.68 | 0.06 | 20.01 |
| | Day | 42.90 | 9.29 | 18.18 | 7.60 | 34.96 | 0.08 | 22.07 |
| | Hour | 61.42 | 13.14 | 22.37 | 12.56 | 50.71 | 0.20 | 30.55 |
| R² (%) | Hour | 90.86 | 99.48 | 98.64 | 99.53 | 92.54 | 100.00 | 98.43 |

School, Gedved (Denmark)
| Statistic Index | Time | TPW | DHI | DNI | RH | Temp | WD | WS |
| MADP (%) | Year | 43.82 | 0.82 | 6.24 | 1.16 | 0.68 | 0.01 | 46.91 |
| | Season | 43.82 | 0.82 | 6.24 | 1.16 | 1.71 | 0.01 | 46.91 |
| | Month | 43.82 | 0.82 | 6.24 | 1.16 | 3.59 | 0.01 | 46.91 |
| | Week | 43.82 | 0.90 | 6.24 | 1.19 | 5.16 | 0.02 | 46.91 |
| | Day | 44.41 | 0.98 | 6.25 | 1.37 | 8.78 | 0.02 | 46.91 |
| | Hour | 45.17 | 1.03 | 6.27 | 1.50 | 10.10 | 0.02 | 46.91 |
| CV(RMSE) (%) | Year | 43.82 | 0.82 | 6.24 | 1.16 | 0.68 | 0.01 | 46.91 |
| | Season | 58.42 | 0.95 | 7.61 | 1.49 | 2.25 | 0.02 | 60.25 |
| | Month | 52.44 | 0.99 | 7.05 | 1.41 | 4.39 | 0.02 | 54.62 |
| | Week | 54.15 | 1.19 | 7.10 | 1.45 | 7.55 | 0.02 | 55.29 |
| | Day | 60.39 | 1.44 | 7.74 | 1.86 | 13.11 | 0.03 | 59.33 |
| | Hour | 91.79 | 2.75 | 14.04 | 3.54 | 24.99 | 0.05 | 88.81 |
| R² (%) | Hour | 95.18 | 99.99 | 99.82 | 99.98 | 98.55 | 100.00 | 96.84 |

H2SusBuild, Lavrion (Greece)
| Statistic Index | Time | TPW | DHI | DNI | RH | Temp | WD | WS |
| MADP (%) | Year | 32.86 | 5.95 | 5.22 | 1.97 | 5.21 | 1.05 | 39.40 |
| | Season | 44.58 | 8.14 | 9.49 | 3.21 | 7.25 | 2.16 | 39.40 |
| | Month | 45.83 | 10.32 | 11.20 | 3.95 | 11.82 | 2.94 | 41.66 |
| | Week | 47.80 | 10.50 | 11.15 | 4.05 | 13.02 | 2.93 | 42.14 |
| | Day | 49.46 | 10.75 | 11.40 | 4.46 | 17.17 | 2.94 | 42.49 |
| | Hour | 51.58 | 10.93 | 11.85 | 5.12 | 19.97 | 2.95 | 43.95 |
| CV(RMSE) (%) | Year | 32.86 | 5.95 | 5.22 | 1.97 | 5.21 | 1.05 | 39.40 |
| | Season | 61.65 | 9.69 | 10.71 | 3.91 | 12.40 | 2.58 | 63.61 |
| | Month | 65.24 | 13.41 | 13.37 | 5.04 | 15.66 | 3.44 | 65.89 |
| | Week | 72.40 | 14.11 | 13.97 | 5.46 | 19.34 | 3.52 | 73.29 |
| | Day | 82.65 | 14.77 | 14.74 | 6.37 | 24.90 | 3.68 | 81.45 |
| | Hour | 90.37 | 18.35 | 17.92 | 8.83 | 34.68 | 4.85 | 85.67 |
| R² (%) | Hour | 85.58 | 98.41 | 98.43 | 99.61 | 93.76 | 99.88 | 92.43 |

Administration Building, Lavrion (Greece)
| Statistic Index | Time | TPW | DHI | DNI | RH | Temp | WD | WS |
| MADP (%) | Year | 1.29 | 11.40 | 7.27 | 3.45 | 9.62 | 1.93 | 12.19 |
| | Season | 27.75 | 13.83 | 13.40 | 4.20 | 9.62 | 2.73 | 21.63 |
| | Month | 38.70 | 15.56 | 15.80 | 4.41 | 14.73 | 2.99 | 29.97 |
| | Week | 38.70 | 15.69 | 15.80 | 4.52 | 14.78 | 2.99 | 29.97 |
| | Day | 38.91 | 15.72 | 15.80 | 4.91 | 16.60 | 3.00 | 29.97 |
| | Hour | 39.45 | 15.79 | 15.83 | 5.49 | 17.88 | 3.01 | 30.22 |
| CV(RMSE) (%) | Year | 1.29 | 11.40 | 7.27 | 3.45 | 9.62 | 1.93 | 12.19 |
| | Season | 36.21 | 15.57 | 14.26 | 4.78 | 15.61 | 2.93 | 34.39 |
| | Month | 43.53 | 20.89 | 18.63 | 6.20 | 20.23 | 3.83 | 38.06 |
| | Week | 46.18 | 21.35 | 19.09 | 6.62 | 21.59 | 3.85 | 40.61 |
| | Day | 49.81 | 21.93 | 19.63 | 7.76 | 24.17 | 3.97 | 43.01 |
| | Hour | 90.81 | 37.52 | 33.69 | 15.03 | 47.91 | 6.67 | 72.69 |
| R² (%) | Hour | 81.85 | 97.30 | 97.80 | 99.51 | 94.91 | 99.92 | 91.18 |
Table 4. The statistical metrics (MAE, RMSE, and R²) used in the indoor temperature analysis for the four test sites. All: metrics calculated with the temperature of all thermal zones; Max/Min: metrics calculated with the temperature of the thermal zone with the maximum/minimum temperature in each time step.
| Statistic Index (Min/All/Max) | Office Building, Pamplona | School, Gedved | H2SusBuild, Lavrion | Administration Building, Lavrion |
| MAE (°C) | 1.03/0.55/0.62 | 2.00/1.72/1.38 | 1.32/1.52/1.36 | 1.07/1.21/1.53 |
| RMSE (°C) | 0.76/0.73/0.81 | 2.23/1.92/1.50 | 1.65/1.93/1.70 | 1.38/1.50/2.00 |
| R² (%) | 91.86/92.20/89.53 | 85.41/89.39/95.51 | 90.21/86.89/85.58 | 95.90/94.74/86.81 |
