Abstract
The growing adoption of renewable energy conversion systems and smart infrastructures has increased the demand for accurate monitoring solutions that ensure system performance and reliability as well as seamless integration with cloud-based platforms. Voltage and current sensing are central to this task; however, sensor selection often involves a trade-off between cost and measurement precision. Rather than treating the technologies as equivalent options, this study investigates the practical impact of using low-cost versus high-precision sensors in electrical power generation monitoring. The evaluation includes representative low-cost sensors and high-precision alternatives based on instrumentation amplifiers and a closed-loop Hall-effect transducer. All sensors were characterized under controlled laboratory conditions and analyzed using statistical indicators, including MAE, RMSE, MAPE, and R². Results show that high-precision sensors achieved R² > 0.97 and MAPE < 4%, whereas low-cost sensors showed R² as low as 0.73 and errors exceeding 10% under dynamic irradiance conditions. Low-cost sensors presented deviations of 5–8% in RMS measurements, while high-precision sensors maintained errors below 1%.
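
The statistical indicators cited above are standard regression-error metrics. As a minimal, illustrative sketch only (not the paper's actual analysis pipeline; the array names and sample values are assumptions), they can be computed from paired reference and sensor readings as follows:

```python
import numpy as np

def error_metrics(reference: np.ndarray, measured: np.ndarray) -> dict:
    """Compute MAE, RMSE, MAPE, and R² between reference and sensor readings.

    Illustrative helper; the study's own processing chain is not shown here.
    """
    residuals = measured - reference
    mae = np.mean(np.abs(residuals))                        # mean absolute error
    rmse = np.sqrt(np.mean(residuals ** 2))                 # root-mean-square error
    mape = 100.0 * np.mean(np.abs(residuals / reference))   # mean absolute percentage error
    ss_res = np.sum(residuals ** 2)                         # residual sum of squares
    ss_tot = np.sum((reference - reference.mean()) ** 2)    # total sum of squares
    r2 = 1.0 - ss_res / ss_tot                              # coefficient of determination
    return {"MAE": mae, "RMSE": rmse, "MAPE": mape, "R2": r2}

# Hypothetical voltage readings (volts) for illustration only
reference = np.array([11.9, 12.1, 12.4, 12.8, 13.0])
measured = np.array([12.0, 12.0, 12.6, 12.7, 13.2])
print(error_metrics(reference, measured))
```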