# Considering Uncertainties of Key Performance Indicators in Wind Turbine Operation


## Abstract


## Featured Application

**The results enable wind turbine operational managers to account for data handling-related uncertainties when using KPIs. The recommendations given allow uncertainties to be mitigated through simple measures. Furthermore, methods are proposed to correct systematic deviations.**


## 1. Introduction

Comparable work on KPI uncertainty exists in other industries, for example, for the CO₂ emission indicator used in the cement industry [13].

## 2. Key Performance Indicators

- Performance KPIs
- Maintenance KPIs
- Reliability KPIs
- Health, Safety and the Environment (HSE) KPIs
- Finance KPIs

#### 2.1. Wind Conditions

#### 2.2. Capacity Factor

#### 2.3. Power Curve

#### 2.4. Production-Based Availability

## 3. Uncertainties

**Data resolution:** Operational data of WT are mostly stored in resolutions of 5, 10, or 15 min. Older projects also use resolutions as low as 30 min, whereas recent projects sometimes store Hz data. When data are aggregated, information such as extreme values is lost in the averaging process. This behavior is not to be confused with the Nyquist–Shannon sampling theorem, which determines the necessary sampling rate in a measurement [38,39]. The IEC 61400-12 [30,31], for example, requires a minimum sampling rate of 1 Hz for most measurements—ambient temperature or air pressure can also be sampled once per minute—whereas it defines aggregated 10 min data as the foundation for power curve calculations. This work assumes the underlying measurement to be done correctly. An additional uncertainty cause is related to the sample length of an aggregation period: whenever samples are missing in the aggregation period, the remaining samples are weighted more strongly than in other periods. Although missing samples show up in Hz data, they might no longer be detectable in 5 min data. Schmiedel [40] investigated the information loss when switching from 1 min data to 15 min data in a different context.

**Data gaps/data availability:** Data gaps in a long history of operational data are normal. As shown in [41], the data availability is, on average, 95%. In most cases, the data availability is higher (median at 99%), but in other cases also much lower. Current data availability requirements seem somewhat arbitrary. The German guideline for site assessment [29], for example, requires a data availability of 80% without any further notes or restrictions. The technical guideline 10 of the FGW e.V. [42] allocates data gaps as downtime in its availability calculation. Reasons for data gaps are manifold and include failed sensors, downtime of the whole turbine, including its controller, as well as telecommunication interruptions [43]. Many WT still use the GSM standard of the cellular network. If data are missing, the remaining data points are unlikely to be representative of the period considered. Thus, uncertainties in the calculated KPIs are to be expected. Existing literature mainly deals with different approaches to fill data gaps [38,44,45] and not with their effect on KPIs.

**Data completeness:** Especially for older WT, the number of available measurements is often very limited, and supporting inputs for the calculation of KPIs are missing. Similar situations occur when sensors fail for a longer time. If the missing measurements are not vital for the calculation but increase its accuracy, they can be neglected or sometimes replaced by approximations. In any case, this approach introduces uncertainties. The present paper discusses this issue using the example of air density, since it is a vital factor in obtaining comparable results in power performance assessments [30,31]. Especially data sets of older WT do not comprise all required measurements (air pressure, temperature, and humidity) for an accurate density calculation, which makes this example a common issue in the industry.
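The data resolution effect described above can be illustrated with a minimal sketch on synthetic data. All values below (mean wind speed, turbulence level, gust) are illustrative assumptions, not measurements from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 1 Hz wind speed signal for one 10 min period (600 samples):
# a 7 m/s mean with turbulent fluctuations and one short gust.
wind_1hz = 7.0 + rng.normal(0.0, 1.0, 600)
wind_1hz[300:310] += 8.0  # 10 s gust well above the mean

# Aggregating to a single 10 min average preserves the mean ...
mean_10min = wind_1hz.mean()

# ... but extreme values are lost in the averaging process.
max_1hz = wind_1hz.max()
print(f"10 min mean: {mean_10min:.2f} m/s, 1 Hz maximum: {max_1hz:.2f} m/s")

# If samples are missing, the remaining samples are weighted more strongly:
# averaging only part of the period shifts the aggregate.
mean_incomplete = wind_1hz[:300].mean()
print(f"mean with half the samples missing: {mean_incomplete:.2f} m/s")
```

In the aggregated value, the gust is invisible, and a partially filled aggregation period yields a different mean than the complete one.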

## 4. Method and Data

#### 4.1. Data and Toolset

#### 4.2. Data Resolution

#### 4.3. Data Availability

#### 4.4. Data Completeness

## 5. Results

#### 5.1. Data Resolution

#### 5.2. Data Availability

Figure 4 shows the coefficients of variation of (**a**) the wind speed, (**b**) the wind power density, and (**c**) the capacity factor, depending on the data availability and compared to a reference value based on 100% data availability. The coefficient of variation was chosen over the standard deviation to make different datasets comparable.

#### 5.3. Data Completeness

For the calculation of the wind power density, a constant standard air density of 1.225 kg/m³ is often assumed. If the considered site or nacelle is at a higher altitude, this assumption leads to systematically high results. A correction for altitude alone likely results in an underestimation, since the average temperature is usually below the standard temperature of 15 °C. If the air density is additionally corrected for the average temperature, a yearly wind power density KPI is already close to the reference, as lower and higher temperatures balance out. Good results and drastically reduced deviations are achieved when temperature measurements are available.
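A minimal sketch of such an altitude and temperature correction, using the ISA barometric formula and neglecting humidity; the site altitude and temperature values below are illustrative assumptions, not data from the paper:

```python
# ISA (International Standard Atmosphere) constants.
R_SPECIFIC = 287.05   # specific gas constant of dry air, J/(kg*K)
P0 = 101325.0         # sea-level standard pressure, Pa
T0 = 288.15           # sea-level standard temperature, K (15 degC)
LAPSE = 0.0065        # temperature lapse rate, K/m

def air_density(altitude_m, temperature_c=None):
    """Approximate air density in kg/m^3 from altitude and, if
    available, a measured ambient temperature."""
    # Pressure from the barometric formula for the given altitude.
    pressure = P0 * (1.0 - LAPSE * altitude_m / T0) ** 5.2559
    # Use the measured temperature if available, else the ISA profile.
    if temperature_c is None:
        temp_k = T0 - LAPSE * altitude_m
    else:
        temp_k = temperature_c + 273.15
    return pressure / (R_SPECIFIC * temp_k)

print(air_density(0.0))          # standard density, approx. 1.225 kg/m^3
print(air_density(500.0))        # altitude-only correction
print(air_density(500.0, 5.0))   # altitude plus measured temperature
```

With a measured temperature, the cold-weather bias of the altitude-only correction disappears, which mirrors the improvement reported above.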

## 6. Discussion

**Data Resolution**

- The mean wind speed and capacity factor are not affected by the data resolution.
- The systematic influence on the mean wind power density is caused by its cubic dependence on the wind speed. By using the standard deviation of the wind speed, it can be corrected to an uncertainty of about 0.1%.
- By filtering the data thoroughly, effects on the power curve can be reduced to an acceptable level. The standard deviation of the resulting energy yields can be estimated at 0.4%.
- Whenever downtime events are derived from operational data, the number and length of downtime events heavily depend on the chosen data resolution. Availability metrics are affected accordingly. Thus, discrete event information should be used whenever possible. The production-based availability is furthermore affected through the power curve.
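The correction via the standard deviation mentioned above can be sketched as follows, using the expansion E[v³] ≈ μ³ + 3μσ² (the skewness term is neglected). The synthetic 1 Hz wind speeds and turbulence level are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
RHO = 1.225  # assumed air density, kg/m^3

# Synthetic 1 Hz wind speeds for one 10 min interval (illustrative only).
v = rng.normal(8.0, 1.2, 600)

# Reference: wind power density from the full-resolution cube.
wpd_ref = 0.5 * RHO * np.mean(v**3)

# Naive estimate from the 10 min mean alone underestimates the cube.
mu = v.mean()
wpd_naive = 0.5 * RHO * mu**3

# Correction using the 10 min standard deviation.
sigma = v.std()
wpd_corrected = 0.5 * RHO * (mu**3 + 3.0 * mu * sigma**2)

print(f"reference: {wpd_ref:.1f} W/m^2")
print(f"naive:     {wpd_naive:.1f} W/m^2")
print(f"corrected: {wpd_corrected:.1f} W/m^2")
```

Since SCADA systems commonly store the 10 min standard deviation alongside the mean, this correction needs no additional measurements.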

**Data Availability**

- The effect of decreasing data availability mainly depends on the distribution of data gaps in the whole dataset. While keeping the data availability constant, continuous data gaps cause much higher deviations than short and randomly distributed gaps. This emphasizes the need to use a full year of measurements if representative results are to be achieved. If data gaps are entirely random, the deviations can be almost neglected, and low data availabilities provide valid results.
- All discussed KPIs are affected by decreasing data availability. KPIs with a cubic equation, like the capacity factor or wind power density, show a higher sensitivity than the mean wind speed, which has a linear dependence. Site-specific simulations are necessary to quantify uncertainties.
- Short and random data gaps are of low importance when calculating the power curve. In contrast, long and continuous data gaps cause significant and systematic deviations and uncertainties, which can only be partially corrected.
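The contrast between random and continuous data gaps can be reproduced with a small Monte Carlo sketch on synthetic data. The seasonal signal, gap share, and number of runs are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic one-year series of 10 min wind speeds with a seasonal cycle
# (an illustrative stand-in for real SCADA data).
n = 365 * 144
t = np.arange(n)
wind = 7.0 + 2.0 * np.sin(2 * np.pi * t / n) + rng.normal(0, 1.5, n)
wind = np.clip(wind, 0, None)

availability = 0.8          # keep 80% of the samples
n_gap = n - int(n * availability)
reference = wind.mean()

def cv(samples):
    """Coefficient of variation of the mean across Monte Carlo runs."""
    means = np.array([s.mean() for s in samples])
    return means.std() / reference

# Randomly distributed gaps: drop 20% of the samples at random.
random_runs = [np.delete(wind, rng.choice(n, n_gap, replace=False))
               for _ in range(200)]

# Continuous gap: drop a single 20% block at a random position.
starts = rng.integers(0, n - n_gap, 200)
block_runs = [np.delete(wind, np.arange(s, s + n_gap)) for s in starts]

print(f"CV, random gaps:     {cv(random_runs):.4f}")
print(f"CV, continuous gaps: {cv(block_runs):.4f}")
```

Because a continuous gap removes a seasonally biased part of the year, its coefficient of variation is far larger than that of randomly scattered gaps of the same total length.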

**Data Completeness**

- Missing measurements of the air density or air pressure result in considerable deviations in the mean wind power density and power curve.
- The air density can be well approximated by a correction for the location altitude and the measured ambient temperature. The weather-related fluctuation of the air pressure leads to an uncertainty of about 0.013 kg/m³.
- Measuring the ambient humidity is of little additional use; it can be assumed to be 50%.
- When approximating the air density, the uncertainty of the mean wind power density is approximately 1 W/m²; at high wind speeds, this value may be higher.
- If a density correction of the wind speed is carried out based on an approximated air density, the corresponding uncertainty of the calculated power curve leads to an uncertainty of about 0.25% when calculating annual energy yields.
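A minimal sketch of such a density correction of the wind speed, in the style of the IEC 61400-12 normalization for pitch-regulated turbines, v_n = v·(ρ/ρ₀)^(1/3). The measured wind speed, site density, and density uncertainty below are assumed example values:

```python
RHO_0 = 1.225  # reference air density, kg/m^3

def normalize_wind_speed(v, rho):
    """Density-corrected wind speed as used in power curve calculations."""
    return v * (rho / RHO_0) ** (1.0 / 3.0)

v = 8.0        # measured 10 min mean wind speed, m/s (example)
rho = 1.18     # approximated site air density, kg/m^3 (example)
d_rho = 0.013  # assumed density uncertainty of the approximation

v_n = normalize_wind_speed(v, rho)
# First-order uncertainty propagation: dv_n = v_n / (3 * rho) * d_rho.
d_v_n = v_n / (3.0 * rho) * d_rho

print(f"corrected wind speed: {v_n:.3f} +/- {d_v_n:.3f} m/s")
```

The cube root strongly damps the density uncertainty, which is consistent with the small resulting energy yield uncertainty stated above.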

## Author Contributions

## Funding

## Conflicts of Interest

## Abbreviations

| Abbreviation | Meaning |
|---|---|
| AI | artificial intelligence |
| CMS | condition monitoring system |
| FGW e.V. | Fördergesellschaft Windenergie und andere Dezentrale Energien |
| GUM | Guide to the Expression of Uncertainty in Measurement |
| HSE | Health, Safety and the Environment |
| IEC | International Electrotechnical Commission |
| IT | information technology |
| KPI | key performance indicator |
| LCOE | levelized cost of energy |
| MEASNET | Measuring Network of Wind Energy Institutes |
| NREL | National Renewable Energy Laboratory |
| O&M | operation and maintenance |
| OpenOA | Open-Source Tool for Wind Farm Operational Performance Analysis |
| SCADA | supervisory control and data acquisition system |
| WT | wind turbine |
| WF | wind farm |
| WInD-Pool | Wind Energy Information Data Pool |

## References

- Stehly, T.J.; Beiter, P.C.; Heimiller, D.M.; Scott, G.N. 2017 Cost of Wind Energy Review; National Renewable Energy Laboratory (NREL): Golden, CO, USA, 2017.
- Wiser, R.; Bolinger, M.; Lantz, E. Assessing wind power operating costs in the United States: Results from a survey of wind industry experts. Renew. Energy Focus **2019**, 30, 46–57.
- IEA Wind. IEA Wind TCP Task 26—Wind Technology, Cost, and Performance Trends in Denmark, the European Union, Germany, Ireland, Norway, Sweden, and the United States: 2008–2016; National Renewable Energy Laboratory: Golden, CO, USA, 2018.
- Arwas, P.; Charlesworth, D.; Clark, D.; Clay, R.; Craft, G.; Donaldson, I.; Dunlop, A.; Fox, A.; Howard, R.; Lloyd, C.; et al. Offshore Wind Cost Reduction: Pathways Study; The Crown Estate: London, UK, 2012.
- PriceWaterhouseCoopers. Guide to Key Performance Indicators: Communicating the Measures that Matter; PriceWaterhouseCoopers: London, UK, 2007.
- Pfaffel, S.; Faulstich, S.; Sheng, S. Recommended key performance indicators for operational management of wind turbines. J. Phys. Conf. Ser. **2019**, 1356, 012040.
- Gonzalez, E.; Nanos, E.M.; Seyr, H.; Valldecabres, L.; Yürüşen, N.Y.; Smolka, U.; Muskulus, M.; Melero, J.J. Key Performance Indicators for Wind Farm Operation and Maintenance. Energy Procedia **2017**, 137, 559–570.
- Stetco, A.; Dinmohammadi, F.; Zhao, X.; Robu, V.; Flynn, D.; Barnes, M.; Keane, J.; Nenadic, G. Machine learning methods for wind turbine condition monitoring: A review. Renew. Energy **2019**, 133, 620–635.
- Traiger, E. Machine Learning for Automated Detection of Wind Farm Underperformance. In Proceedings of the WindEurope Analysis of Operating Wind Farms Workshop, Vilnius, Lithuania, 16 May 2018.
- Sanchez-Marquez, R.; Albarracin Guillem, J.M.; Vicens-Salort, E.; Jabaloyes Vivas, J. A statistical system management method to tackle data uncertainty when using key performance indicators of the balanced scorecard. J. Manuf. Syst. **2018**, 48, 166–179.
- Perotto, E.; Canziani, R.; Marchesi, R.; Butelli, P. Environmental performance, indicators and measurement uncertainty in EMS context: A case study. J. Clean. Prod. **2008**, 16, 517–530.
- Torregrossa, D.; Schutz, G.; Cornelissen, A.; Hernández-Sancho, F.; Hansen, J. Energy saving in WWTP: Daily benchmarking under uncertainty and data availability limitations. Environ. Res. **2016**, 148, 330–337.
- Feiz, R.; Ammenberg, J.; Baas, L.; Eklund, M.; Helgstrand, A.; Marshall, R. Improving the CO₂ performance of cement, part I: Utilizing life-cycle assessment and key performance indicators to assess development within the cement industry. J. Clean. Prod. **2015**, 98, 272–281.
- Martin, R.; Lazakis, I.; Barbouchi, S.; Johanning, L. Sensitivity analysis of offshore wind farm operation and maintenance cost and availability. Renew. Energy **2016**, 85, 1226–1236.
- Dykes, K.; Ning, A.; King, R.; Graf, P.; Scott, G.; Veers, P. Sensitivity Analysis of Wind Plant Performance to Key Turbine Design Parameters: A Systems Engineering Approach. In Proceedings of the AIAA SciTech 2014, National Harbor, MD, USA, 13–17 January 2014.
- Craig, A.; Optis, M.; Fields, M.J.; Moriarty, P. Uncertainty quantification in the analyses of operational wind power plant performance. J. Phys. Conf. Ser. **2018**, 1037, 052021.
- Craig, A. Uncertainty Quantification in Wind Plant Energy Estimation. In AIAA Scitech 2019 Forum; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2019; p. 1.
- Fitz-Gibbon, C.T. Performance Indicators; Multilingual Matters: Bristol, UK, 1990.
- Shahin, A.; Mahbod, M.A. Prioritization of key performance indicators. Int. J. Product. Perform. Manag. **2007**, 56, 226–240.
- Hahn, B. 17. Wind Farm Data Collection and Reliability Assessment for O&M Optimization: Expert Group Report on Recommended Practices, 1st ed.; IEA Wind TCP: Olympia, WA, USA, 2017.
- Lindley, D.V. Understanding Uncertainty; Wiley: Hoboken, NJ, USA, 2013.
- JCGM. Evaluation of Measurement Data—Guide to the Expression of Uncertainty in Measurement (GUM): JCGM 100:2008; JCGM—Joint Committee for Guides in Metrology: Sèvres, France, 2008.
- Dangendorf, S.; Burzel, A.; Wahl, T.; Mudersbach, C.; Jensen, J.; Oumeraci, H. Unsicherheits- und Sensitivitätsanalyse im Rahmen einer integrierten Risikoanalyse: Zwischenbericht Aktivität 4.5 im Forschungsprojekt XtremRisK (03F0483A); TU Braunschweig: Brunswick, Germany, 2012.
- Daneshkhah, A.R. Uncertainty in Probabilistic Risk Assessment: A Review. Nucl. Eng. Des. **2004**, 115, 173–179.
- Mudersbach, C. Untersuchungen zur Ermittlung von hydrologischen Bemessungsgrößen mit Verfahren der instationären Extremwertstatistik. Ph.D. Thesis, Universität Siegen, Siegen, Germany, 2009.
- Ang, A.H.S.; Tang, W.H. Probability Concepts in Engineering: Emphasis on Applications in Civil & Environmental Engineering, 2nd ed.; Wiley: Hoboken, NJ, USA, 2007.
- Merz, B. Hochwasserrisiken: Grenzen und Möglichkeiten der Risikoabschätzung; Schweizerbart: Stuttgart, Germany, 2006.
- JCGM. Auswertung von Messdaten—Eine Einführung zum “Leitfaden zur Angabe der Unsicherheit beim Messen” und zu den dazugehörigen Dokumenten: JCGM 104:2009; JCGM: Sèvres, France, 2009.
- FGW. Technical Guideline for Power Plants Part 6: Determination of Wind Potential and Energy Yields; FGW: Berlin, Germany, 2017.
- International Electrotechnical Commission. Power Performance Measurements of Electricity Producing Wind Turbines (IEC 61400-12-1); International Electrotechnical Commission: Geneva, Switzerland, 2017.
- International Electrotechnical Commission. Power Performance of Electricity-Producing Wind Turbines Based on Nacelle Anemometry (IEC 61400-12-2); International Electrotechnical Commission: Geneva, Switzerland, 2013.
- Zhang, J.; Hodge, B.M.; Gomez-Lazaro, E.; Lovholm, A.L.; Berge, E.; Miettinen, J.; Holttinen, H.; Cutululis, N.; Litong-Palima, M.; Sorensen, P.; et al. Analysis of Variability and Uncertainty in Wind Power Forecasting: An International Comparison: Preprint; U.S. Department of Energy Office of Scientific and Technical Information: Oak Ridge, TN, USA, 2013.
- Lange, B.; Rohrig, K.; Ernst, B.; Schlögl, F.; Cali, Ü.; Jursa, R.; Moradi, J. Wind power prediction in Germany—Recent advances and future challenges. Poster at the European Wind Energy Conference, Athens, Greece, 27 February–2 March 2006.
- Sherwin, B.; Fields, J. IEC 61400-15 Working Group: Progress Update #2—Meeting 13; National Renewable Energy Laboratory: Golden, CO, USA, 2018.
- MEASNET. Calibration Uncertainty Parameters in MEASNET Wind Tunnels Used for Anemometer Calibration; MEASNET: Madrid, Spain, 2012.
- International Electrotechnical Commission. Production Based Availability for Wind Turbines (IEC 61400-26-2); International Electrotechnical Commission: Geneva, Switzerland, 2014.
- International Electrotechnical Commission. Time Based Availability for Wind Turbines (IEC 61400-26-1), 2010-12; International Electrotechnical Commission: Geneva, Switzerland, 2012.
- Madsen, H. Time Series Analysis; Texts in Statistical Science Series; Chapman & Hall/CRC: Boca Raton, FL, USA, 2008; Volume 72.
- Tavner, P.; Edwards, C.; Brinkman, A.; Spinato, F. Influence of Wind Speed on Wind Turbine Reliability. Wind Eng. **2006**, 30, 55–72.
- Schmiedel, A. Untersuchung des Informationsverlustes von Zeitreihen beim Übergang von Minuten- zu Viertelstundendurchschnittswerten. Bachelor’s Thesis, Technische Hochschule Chemnitz, Chemnitz, Germany, 2011.
- Faulstich, S.; Pfaffel, S.; Hahn, B. Performance and reliability benchmarking using the cross-company initiative WInD-Pool. In Proceedings of the RAVE Offshore Wind R&D Conference, Bremerhaven, Germany, 14 October 2015.
- FGW. Technical Guideline for Power Plants Part 10: Determination of Site Quality Following Commissioning; FGW: Berlin, Germany, 2018.
- Hirsch, J.; Faulstich, S.; Fraunhofer Institute for Energy Economics and Energy System Technology. HERA-VPP—High Efficiency, Reliability, Availability of Virtual Power Plants: Abschlussbericht: Laufzeit des Vorhabens: 01.08.2014–31.01.2016; Fraunhofer Institute for Energy Economics and Energy System Technology: Kassel, Germany, 2016.
- Little, R.J.; Rubin, D.B. Statistical Analysis with Missing Data; Wiley Series in Probability and Mathematical Statistics; Wiley: New York, NY, USA, 1987.
- Aubinet, M.; Vesala, T.; Papale, D. (Eds.) Eddy Covariance: A Practical Guide to Measurement and Data Analysis; Springer Atmospheric Sciences; Springer: Dordrecht, The Netherlands, 2012.
- Klaas, T.; Pauscher, L.; Callies, D. LiDAR-mast deviations in complex terrain and their simulation using CFD. Meteorol. Z. **2015**, 24, 13.
- ENGIE. Welcome to ENGIE’s First Open Data Windfarm. Available online: https://opendata-renewables.engie.com (accessed on 24 April 2019).
- MEASNET. Evaluation of Site-Specific Wind Conditions: Version 2; MEASNET: Madrid, Spain, 2016.
- Optis, M.; Perr-Sauer, J.; Philips, C.; Craig, A.E.; Lee, J.C.Y.; Kemper, T.; Sheng, S.; Simley, E.; Williams, L.; Lunacek, M.; et al. OpenOA: An Open-Source Code Base for Operational Analysis of Wind Power Plants. Wind Energy Sci. Discuss. **2019**, 1–14.
- Optis, M. OpenOA: Open-Source Tool for Wind Farm Operational Performance Analysis. In Proceedings of the Drivetrain Reliability Collaborative Meeting 2019, Golden, CO, USA, 19–20 February 2019.
- NREL/OpenOA: GitHub Repository; National Renewable Energy Laboratory: Golden, CO, USA, 2019. Available online: https://github.com/NREL/OpenOA (accessed on 18 June 2019).
- Grange, S.K. Technical Note: Averaging Wind Speeds and Directions; University of Auckland: Auckland, New Zealand, 2014.
- Bertino, S. A Measure of Representativeness of a Sample for Inferential Purposes. Int. Stat. Rev. **2006**, 74, 149–159.
- PCWG. Power Curve Working Group—Part of CFARs. Available online: https://pcwg.org (accessed on 10 October 2019).

**Figure 1.**Exemplary causes for uncertainties of performance KPIs grouped by their categories in a causal diagram.

**Figure 2.**Approach of this paper to analyze sensitivities and uncertainties for selected KPIs and uncertainty causes.

**Figure 3.** Effect of different data resolutions on (**a**) the calculation of an average wind power density and (**b**) availability metrics in the case that downtimes are solely identified based on SCADA data.

**Figure 4.** Coefficients of variation for different KPIs caused by a reduction in data availability. Randomly distributed data gaps are compared to continuous data gaps. (**a**) wind speed; (**b**) wind power density; (**c**) capacity factor.

**Figure 5.** Effect of decreasing data availability on the uncertainty of power curves. The uncertainty is represented by deviations in the calculation of a generic annual energy yield using a Rayleigh wind speed distribution (6.45 m/s) as reference.

| Uncertainty Causes | Wind Speed | Capacity Factor | Wind Power Density | Power Curve | Production-Based Availability |
|---|---|---|---|---|---|
| Data Resolution | - | - | ! | ! | ! |
| Data Availability | ! | ! | ! | ! | ↗ |
| Data Completeness | - | - | ! | ↗ | ↗ |

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

Pfaffel, S.; Faulstich, S.; Rohrig, K. Considering Uncertainties of Key Performance Indicators in Wind Turbine Operation. *Appl. Sci.* **2020**, *10*, 898. https://doi.org/10.3390/app10030898