# Forecasting Air Temperature on Edge Devices with Embedded AI


## Abstract


## 1. Introduction

- First, (sensor) data related to relevant environmental variables internal to the greenhouse, which have to be maintained within suitable ranges (e.g., air humidity and temperature), are collected through devices equipped with sensors (denoted as IoT sensing nodes, or sensor nodes, SNs), generally organized in Wireless Sensor Networks (WSNs). The internal greenhouse data gathered by SNs are usually sent to less constrained nodes, denoted as gateways (GWs) and connected to the Internet, which forward them to processing and storage infrastructures located in the Cloud [6]. The data can then be retrieved and visualized (through appropriate User Interfaces, UIs), as well as used as input for further processing. Hence, monitoring the relevant variables inside the greenhouse is valuable both for end-users (farmers) and for researchers [7,8,9,10,11].
- Second, additional control devices (i.e., actuator nodes), installed inside the greenhouse to regulate its internal climate [12,13], can be integrated within the aforementioned collection system. As an example, if a dangerous air humidity level is detected by the SNs, a ventilation system can automatically be activated to lower the air humidity.
- Third, complex models and/or forecasting algorithms are developed with the goal of predicting the future values of the monitored environmental variables, for example allowing some operations (e.g., the activation of a heating system) to be scheduled preemptively, so that these variables never reach undesired conditions (e.g., too low temperatures). To this end, the greenhouse's internal variables have been satisfactorily forecast through Deep Learning (DL) algorithms, e.g., based on Neural Networks (NNs) [14,15,16], selecting data collected from different sources as input (namely, internal and external variables of a greenhouse, possibly measured by SNs).

## 2. Background

#### 2.1. Overview on Neural Networks
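
As a minimal, illustrative sketch of the feed-forward ANNs discussed in this section (not the paper's actual architecture; layer sizes and weights below are arbitrary), a single-hidden-layer MLP forward pass with ReLU activation can be written in a few lines of NumPy:

```python
import numpy as np

def relu(x):
    # Rectified Linear Unit activation: max(0, x) element-wise
    return np.maximum(0.0, x)

def mlp_forward(x, W1, b1, W2, b2):
    """Forward pass of a one-hidden-layer MLP regressor:
    hidden ReLU layer followed by a linear output unit."""
    h = relu(x @ W1 + b1)   # hidden activations
    return h @ W2 + b2      # linear output (regression)

# Toy example: 5 inputs -> 8 hidden units -> 1 output
rng = np.random.default_rng(0)
W1 = rng.normal(size=(5, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)
y = mlp_forward(np.ones((1, 5)), W1, b1, W2, b2)
```

Training such a network with Back Propagation (BP) adjusts `W1`, `b1`, `W2`, `b2` to minimize the prediction error; recurrent variants (RNN, LSTM) additionally feed hidden state back across time steps.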

#### 2.2. Evaluation Metrics
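
The metrics used throughout the paper (RMSE, MAPE, and R${}^{2}$) follow their standard definitions; a minimal NumPy sketch (function names are illustrative, not from the paper):

```python
import numpy as np

def rmse(y_true, y_pred):
    # Root Mean Squared Error
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mape(y_true, y_pred):
    # Mean Absolute Percentage Error, in percent
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0)

def r2(y_true, y_pred):
    # Coefficient of determination: 1 - SS_res / SS_tot
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return float(1.0 - ss_res / ss_tot)
```

A perfect predictor yields RMSE = 0, MAPE = 0, and R${}^{2}$ = 1; lower RMSE/MAPE and higher R${}^{2}$ indicate better forecasts.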

## 3. Related Work

## 4. Methodology

- Relevant air temperature data, measured with sensors inside a greenhouse associated with an Italian demonstrator of the H2020 project AFarCloud [30], are collected and processed to remove outliers and spurious data (Section 4.1).
- The greenhouse indoor temperature sensor data, collected with a sampling period ${T}_{\mathrm{samp}}=10$ min, are arranged in a time series. From this original time series, six additional time series are derived by downsampling it to longer sampling periods (Section 4.2).
- The number of input variables of the model, denoted as $SW$, and the sampling period ${T}_{\mathrm{samp}}$ are defined as the two design parameters. Moreover, ${T}_{\mathrm{samp}}$ can be interpreted as the prediction time horizon: the predicted temperature value is the one immediately following the most recent sample of the sliding window and is thus, by construction, ${T}_{\mathrm{samp}}$ ahead. Finally, a proper set of values for these parameters is selected for testing purposes (Section 4.3).
- Starting from the collected sensor data, and according to the combinations of parameter values to be tested, multiple data sets are created. Each data set is then split into training and test subsets (Section 4.4).
- Three NN architectures, based on an ANN, an RNN, and an LSTM, are introduced and trained with the data sets resulting from the previous steps (Section 4.5).
- The NN model presented in [19] is re-trained with a significantly larger data set—including data from 6 more months (Section 4.6).
- All models are evaluated on the test subsets and their performances are compared in terms of RMSE, MAPE, R${}^{2}$, and NetScore (Section 5).
- Finally, the best three models (among a total of 210) on the considered engineered data sets (step 4) are compared, in terms of performance, with relevant literature approaches (Section 5).
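
The sliding window-based data set construction outlined in the steps above can be sketched as follows (a simplified illustration, not the authors' actual code; `make_sliding_window_dataset` is a hypothetical helper). Each sample pairs $SW$ consecutive temperature values with the next value as the prediction target, and windows touching missing (`NaN`) samples are discarded:

```python
import numpy as np

def make_sliding_window_dataset(series, sw):
    """Build (x(k), y(k)) pairs: sw consecutive samples as input,
    the next sample as the target (one T_samp ahead).
    Windows containing NaN (missing samples) are discarded."""
    X, y = [], []
    for k in range(len(series) - sw):
        window = series[k:k + sw + 1]       # sw inputs + 1 target
        if np.isnan(window).any():
            continue                        # skip windows with data loss
        X.append(window[:sw])
        y.append(window[sw])
    return np.array(X), np.array(y)

# Toy series with one lost sample (NaN)
series = np.array([1.0, 2.0, 3.0, np.nan, 5.0, 6.0, 7.0, 8.0])
X, y = make_sliding_window_dataset(series, sw=2)
```

With `sw=2`, the toy series above yields three valid samples; the four windows overlapping the `NaN` entry are dropped.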

#### 4.1. Data Collection and Cleaning

#### 4.2. Engineering Time Series from Sensor Data

Missing samples in the time series are represented by `NaN` values. For this reason, the overall number of valid samples in the time series, denoted as ${N}_{\mathrm{samp}}$, is smaller than the total number of samples ${N}_{\mathrm{tot}}$ that would ideally be collected over the considered 16-month time interval with a sampling period ${T}_{0}$ = 10 min and no data loss. More precisely, ${N}_{\mathrm{tot}} = {N}_{\mathrm{samp}}+{N}_{\mathrm{lost}}$.
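
The bookkeeping above (lost readings re-gridded as `NaN`, so that ${N}_{\mathrm{tot}}={N}_{\mathrm{samp}}+{N}_{\mathrm{lost}}$) can be illustrated with a small sketch; the raw record and variable names below are hypothetical:

```python
import numpy as np

# Hypothetical raw record: {timestamp_in_minutes: temperature} at T0 = 10 min,
# with one reading lost (minute 20 is missing).
T0 = 10
raw = {0: 18.2, 10: 18.4, 30: 18.9, 40: 19.1}

# Re-grid onto the ideal T0 clock, inserting NaN where a sample was lost.
n_tot = 5  # ideal number of samples over this toy interval
series = np.array([raw.get(k * T0, np.nan) for k in range(n_tot)])

n_lost = int(np.isnan(series).sum())
n_samp = n_tot - n_lost  # N_tot = N_samp + N_lost, as in the text
```

Downsampled series (e.g., with period $2{T}_{0}$) can then be obtained by slicing the re-gridded array, e.g., `series[::2]`.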

#### 4.3. Sliding Window-Based Prediction

#### 4.4. Data Pre-Processing and Data Sets Creation

`NaN` values—corresponding to missing air temperature values for specific instants of time in the original time series, as explained in Section 4.2—have been discarded and not included in the data sets. Then, each data set is split (randomly) into a training subset and a test subset with a ratio $3:1$.
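
The random 3:1 split can be sketched as follows (an illustrative implementation, not the authors' code; the fixed seed is an assumption added for reproducibility):

```python
import numpy as np

def split_3_to_1(X, y, seed=0):
    """Randomly split a data set into training and test subsets
    with a 3:1 ratio (75% training / 25% test)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))          # random sample order
    cut = int(0.75 * len(X))               # 3:1 boundary
    tr, te = idx[:cut], idx[cut:]
    return X[tr], y[tr], X[te], y[te]

# Toy data: 20 samples, 2 input features each
X = np.arange(40, dtype=float).reshape(20, 2)
y = np.arange(20, dtype=float)
Xtr, ytr, Xte, yte = split_3_to_1(X, y)
```

Shuffling before splitting avoids the training and test subsets covering disjoint seasons of the 16-month collection interval.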

#### 4.5. Models Training

#### 4.6. “Old Model” Re-Training

## 5. Experimental Results

#### 5.1. Sliding Window and Sampling Interval

#### 5.2. NN Architecture

#### 5.3. Performance Analysis and Literature Comparison

- The R${}^{2}$ values of the considered NN-based models are higher than those of all the references listed in Table 1.

#### 5.4. Possible Application Scenario and Reference Architecture

## 6. Conclusions

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Conflicts of Interest

## Abbreviations

Abbreviation | Meaning
---|---
AFarCloud | Aggregate Farming in the Cloud
AI | Artificial Intelligence
ANN | Artificial Neural Network
BP | Back Propagation
CGA | Conjugate Gradient Algorithm
DL | Deep Learning
DNN | Deep Neural Network
FaaS | Farm-as-a-Service
GW | Gateway
ICT | Information and Communication Technology
IoT | Internet of Things
LM | Levenberg-Marquardt
LSTM | Long Short-Term Memory
MAC | Multiply–ACcumulate
MAPE | Mean Absolute Percentage Error
ML | Machine Learning
MLP | Multi-Layer Perceptron
NARX | Nonlinear AutoRegressive with eXternal input
NN | Neural Network
PSO | Particle Swarm Optimization
R${}^{2}$ | Coefficient of determination
RBF | Radial Basis Function
ReLU | Rectified Linear Unit
RMSE | Root Mean Squared Error
RNN | Recurrent Neural Network
SA | Smart Agriculture
SBC | Single Board Computer
SF | Smart Farming
SN | Sensor Node
UI | User Interface
WSN | Wireless Sensor Network

## References

1. Codeluppi, G.; Cilfone, A.; Davoli, L.; Ferrari, G. VegIoT Garden: A modular IoT Management Platform for Urban Vegetable Gardens. In Proceedings of the IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Portici, Italy, 24–26 October 2019; pp. 121–126.
2. Kumar, A.; Tiwari, G.N.; Kumar, S.; Pandey, M. Role of Greenhouse Technology in Agricultural Engineering. Int. J. Agric. Res. 2010, 5, 779–787.
3. Francik, S.; Kurpaska, S. The Use of Artificial Neural Networks for Forecasting of Air Temperature inside a Heated Foil Tunnel. Sensors 2020, 20, 652.
4. Escamilla-García, A.; Soto-Zarazúa, G.M.; Toledano-Ayala, M.; Rivas-Araiza, E.; Gastélum-Barrios, A. Applications of Artificial Neural Networks in Greenhouse Technology and Overview for Smart Agriculture Development. Appl. Sci. 2020, 10, 3835.
5. Bot, G. Physical Modeling of Greenhouse Climate. IFAC Proc. Vol. 1991, 24, 7–12.
6. Belli, L.; Cirani, S.; Davoli, L.; Melegari, L.; Mónton, M.; Picone, M. An Open-Source Cloud Architecture for Big Stream IoT Applications. In Interoperability and Open-Source Solutions for the Internet of Things: International Workshop, FP7 OpenIoT Project, Held in Conjunction with SoftCOM 2014, Split, Croatia, 18 September 2014, Invited Papers; Podnar Žarko, I., Pripužić, K., Serrano, M., Eds.; Springer International Publishing: Berlin/Heidelberg, Germany, 2015; pp. 73–88.
7. Kochhar, A.; Kumar, N. Wireless sensor networks for greenhouses: An end-to-end review. Comput. Electron. Agric. 2019, 163, 104877.
8. Codeluppi, G.; Cilfone, A.; Davoli, L.; Ferrari, G. LoRaFarM: A LoRaWAN-Based Smart Farming Modular IoT Architecture. Sensors 2020, 20, 2028.
9. Abbasi, M.; Yaghmaee, M.H.; Rahnama, F. Internet of Things in agriculture: A survey. In Proceedings of the 3rd International Conference on Internet of Things and Applications (IoT), Isfahan, Iran, 17–18 April 2019; pp. 1–12.
10. Davoli, L.; Belli, L.; Cilfone, A.; Ferrari, G. Integration of Wi-Fi mobile nodes in a Web of Things Testbed. ICT Express 2016, 2, 95–99.
11. Tafa, Z.; Ramadani, F.; Cakolli, B. The Design of a ZigBee-Based Greenhouse Monitoring System. In Proceedings of the 7th Mediterranean Conference on Embedded Computing (MECO), Budva, Montenegro, 10–14 June 2018; pp. 1–4.
12. Wiboonjaroen, M.T.W.; Sooknuan, T. The Implementation of PI Controller for Evaporative Cooling System in Controlled Environment Greenhouse. In Proceedings of the 17th International Conference on Control, Automation and Systems (ICCAS), Jeju, Korea, 18–21 October 2017; pp. 852–855.
13. Zou, Z.; Bie, Y.; Zhou, M. Design of an Intelligent Control System for Greenhouse. In Proceedings of the 2nd IEEE Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC), Xi’an, China, 25–27 May 2018; pp. 1–1635.
14. Moon, T.; Hong, S.; Young Choi, H.; Ho Jung, D.; Hong Chang, S.; Eek Son, J. Interpolation of Greenhouse Environment Data using Multilayer Perceptron. Comput. Electron. Agric. 2019, 166, 105023.
15. Taki, M.; Abdanan Mehdizadeh, S.; Rohani, A.; Rahnama, M.; Rahmati-Joneidabad, M. Applied machine learning in greenhouse simulation; New application and analysis. Inf. Process. Agric. 2018, 5, 253–268.
16. Yue, Y.; Quan, J.; Zhao, H.; Wang, H. The Prediction of Greenhouse Temperature and Humidity Based on LM-RBF Network. In Proceedings of the IEEE International Conference on Mechatronics and Automation (ICMA), Changchun, China, 5–8 August 2018; pp. 1537–1541.
17. Lee, Y.; Tsung, P.; Wu, M. Technology Trend of Edge AI. In Proceedings of the International Symposium on VLSI Design, Automation and Test (VLSI-DAT), Hsinchu, Taiwan, 16–19 April 2018; pp. 1–2.
18. Wong, A. NetScore: Towards Universal Metrics for Large-Scale Performance Analysis of Deep Neural Networks for Practical On-Device Edge Usage. In Image Analysis and Recognition; Karray, F., Campilho, A., Yu, A., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 15–26.
19. Codeluppi, G.; Cilfone, A.; Davoli, L.; Ferrari, G. AI at the Edge: A Smart Gateway for Greenhouse Air Temperature Forecasting. In Proceedings of the IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Trento, Italy, 4–6 November 2020; pp. 348–353.
20. Abiodun, O.I.; Jantan, A.; Omolara, A.E.; Dada, K.V.; Mohamed, N.A.; Arshad, H. State-of-the-art in artificial neural network applications: A survey. Heliyon 2018, 4, e00938.
21. Cifuentes, J.; Marulanda, G.; Bello, A.; Reneses, J. Air Temperature Forecasting Using Machine Learning Techniques: A Review. Energies 2020, 13, 4215.
22. Kavlakoglu, E. AI vs. Machine Learning vs. Deep Learning vs. Neural Networks: What’s the Difference? Available online: https://www.ibm.com/cloud/blog/ai-vs-machine-learning-vs-deep-learning-vs-neural-networks (accessed on 2 March 2021).
23. Ferrero Bermejo, J.; Gómez Fernández, J.F.; Olivencia Polo, F.; Crespo Márquez, A. A Review of the Use of Artificial Neural Network Models for Energy and Reliability Prediction. A Study of the Solar PV, Hydraulic and Wind Energy Sources. Appl. Sci. 2019, 8, 1844.
24. Gardner, M.; Dorling, S. Artificial neural networks (the multilayer perceptron)—A review of applications in the atmospheric sciences. Atmos. Environ. 1998, 32, 2627–2636.
25. Bengio, Y.; Simard, P.; Frasconi, P. Learning long-term dependencies with gradient descent is difficult. IEEE Trans. Neural Netw. 1994, 5, 157–166.
26. Hochreiter, S.; Schmidhuber, J. Long Short-Term Memory. Neural Comput. 1997, 9, 1735–1780.
27. Hongkang, W.; Li, L.; Yong, W.; Fanjia, M.; Haihua, W.; Sigrimis, N. Recurrent Neural Network Model for Prediction of Microclimate in Solar Greenhouse. IFAC-PapersOnLine 2018, 51, 790–795.
28. Jung, D.H.; Seok Kim, H.; Jhin, C.; Kim, H.J.; Hyun Park, S. Time-serial analysis of deep neural network models for prediction of climatic conditions inside a greenhouse. Comput. Electron. Agric. 2020, 173, 105402.
29. Taki, M.; Ajabshirchi, Y.; Ranjbar, S.F.; Rohani, A.; Matloobi, M. Heat transfer and MLP neural network models to predict inside environment variables and energy lost in a semi-solar greenhouse. Energy Build. 2016, 110, 314–329.
30. Aggregate Farming in the Cloud (AFarCloud) H2020 Project. Available online: http://www.afarcloud.eu (accessed on 14 February 2021).
31. Podere Campáz—Produzioni Biologiche. Available online: https://www.poderecampaz.com (accessed on 1 February 2020).
32. Chollet, F. Keras. 2015. Available online: https://keras.io (accessed on 15 May 2021).
33. Raspberry Pi. Available online: https://www.raspberrypi.org/ (accessed on 1 June 2021).
34. Mazzia, V.; Khaliq, A.; Salvetti, F.; Chiaberge, M. Real-Time Apple Detection System Using Embedded Systems With Hardware Accelerators: An Edge AI Application. IEEE Access 2020, 8, 9102–9114.
35. Shadrin, D.; Menshchikov, A.; Ermilov, D.; Somov, A. Designing Future Precision Agriculture: Detection of Seeds Germination Using Artificial Intelligence on a Low-Power Embedded System. IEEE Sens. J. 2019, 19, 11573–11582.

**Figure 1.** Methodological steps (white rectangles) and corresponding outcomes (violet rectangles with rounded corners) of this paper, with reference to its internal structure.

**Figure 3.** Comparison between the internal organization of an RNN, which features a form of internal memory, and that of an ANN.

**Figure 4.** Sensor data collected during a 16-month time period: for each month, the average daily number of collected samples (obtained as the ratio between the number of samples gathered in that month and the number of days of the month) is shown.

**Figure 5.** Illustrative representation of the first 18 samples ${\{{z}_{k}^{\left({T}_{0}\right)}=z\left(k{T}_{0}\right)\}}_{k=1}^{18}$ (black dots) and of the first 9 samples ${\{{z}_{h}^{\left(2{T}_{0}\right)}=z\left(2h{T}_{0}\right)\}}_{h=1}^{9}$ (black dots over violet squares), obtained by downsampling $\left\{{z}_{k}^{\left({T}_{0}\right)}\right\}$ by a factor of 2.

**Figure 6.** Sliding window-based prediction at epoch k: the $SW$ observations ${\left\{{z}_{k-i+1}\right\}}_{i=1}^{SW}$ are used to predict the value ${z}_{k+1}$, denoted as ${\widehat{z}}_{k+1}$—the superscript ${}^{\left({T}_{\mathrm{samp}}\right)}$ is omitted for simplicity.

**Figure 7.** The k-th sample $\underline{d}\left(k\right)$ in the data set ${\mathcal{D}}_{SW}^{\left({T}_{\mathrm{samp}}\right)}$ is composed of $SW$ values of air temperatures (composing the $SW$-dimensional vector $\underline{x}\left(k\right)$ of input variables) and of an output variable $y\left(k\right)$ (corresponding to the air temperature at epoch $k+1$, which has to be forecast starting from $\underline{x}\left(k\right)$).

**Figure 9.** Experimental results with the three proposed NN-based models, namely, LSTM (**a**–**c**), ANN (**d**–**f**), and RNN (**g**–**i**), for different values of $SW$ and ${T}_{\mathrm{samp}}$, in terms of RMSE (**a**,**d**,**g**), MAPE (**b**,**e**,**h**), and R${}^{2}$ (**c**,**f**,**i**).

**Figure 10.** Experimental results, in terms of RMSE, on the three proposed NN-based models, namely, (**a**) LSTM, (**b**) ANN, and (**c**) RNN, for different values of $SW$ and ${T}_{\mathrm{samp}}$.

**Figure 11.** Experimental results on the three proposed NN-based models, namely, LSTM, ANN, and RNN, for a few relevant combinations of $SW$ and ${T}_{\mathrm{samp}}$, expressed in terms of (**a**) RMSE, (**b**) MAPE, and (**c**) R${}^{2}$.

**Figure 12.** Possible reference architecture and application scenario for the developed forecasting model.

**Table 1.** Representative literature papers in the context of air temperature forecasting inside greenhouses through the usage of NNs.

Ref. | Input Variables | Architectural Type | Training Algorithm | RMSE (${}^{\circ}$C) | MAPE (%) | R${}^{2}$ | Size (Samples No.) | Collection Interval | Sampling Interval
---|---|---|---|---|---|---|---|---|---
[3] | External temperature and solar radiation, wind speed, heater temperature, datetime reference | ANN | BP, CGA | $2.5$–$3.0$ | N/A | N/A | 1368 | ≈2 months | 1 h
[14] | Internal solar radiation, air temperature and humidity, and soil moisture, CO${}_{2}$, atmospheric pressure, datetime reference | ANN | BP | $0.839$ | N/A | $0.977$ | ≈87,408 | 19 months | 10 min
[15] | External solar radiation and temperature, wind speed | ANN, RBF | BP | $0.20\pm 0.02$, $0.13\pm 0.01$ | $0.93\pm 0.10$, $0.59\pm 0.07$ | $0.76\pm 0.05$, $0.89\pm 0.03$ | N/A | N/A | N/A
[16] | External solar radiation, heater temperature, internal air temperature and humidity, wind speed, history of actuators, shadow screen | RBF | BP, LM | $0.0019$ | N/A | N/A | 1728 | 12 days | 10 min
[19] | External apparent temperature, dew point, air humidity, air temperature and UV index, datetime reference | ANN | BP | $1.50$ | $4.91$ | $0.965$ | 5346 | 10 months | 1 h
[28] | External temperature, solar radiation and humidity, wind speed and direction, history of actuators | ANN, RNN-LSTM, NARX | BP | $0.89$–$0.94$, $0.45$–$0.71$, $0.52$–$1.32$ | N/A | $0.94$, $0.96$–$0.97$, $0.86$–$0.96$ | ≈470,000 | 1 year | 5, 10, 15, 20, 25, 30 min
[27] | Internal air and soil temperature, internal solar radiation, humidity and CO${}_{2}$ | RNN | BP | $0.865$ | $1.7$ | $0.925$ | 1152 | 8 days | 10 min

All performance figures refer to the respective test sets.

**Table 2.** Details concerning the engineered data sets, in terms of number of samples, $SW$ and ${T}_{\mathrm{samp}}$.

Data Set | ${\mathit{T}}_{\mathbf{samp}}$ [min] | $\mathit{SW}$ [Samples] | Size [Samples] | Training Subset Size [Samples] | Test Subset Size [Samples] | Data Set | ${\mathit{T}}_{\mathbf{samp}}$ [min] | $\mathit{SW}$ [Samples] | Size [Samples] | Training Subset Size [Samples] | Test Subset Size [Samples]
---|---|---|---|---|---|---|---|---|---|---|---
${\mathcal{D}}_{1}^{\left(10\right)}$ | 10 | 1 | 36,330 | 27,248 | 9082 | ${\mathcal{D}}_{10}^{\left(10\right)}$ | 10 | 10 | 32,696 | 24,522 | 8174
${\mathcal{D}}_{2}^{\left(10\right)}$ | 10 | 2 | 35,828 | 26,871 | 8957 | ${\mathcal{D}}_{3}^{\left(10\right)}$ | 10 | 3 | 35,363 | 26,523 | 8840
${\mathcal{D}}_{4}^{\left(10\right)}$ | 10 | 4 | 34,923 | 26,193 | 8730 | ${\mathcal{D}}_{5}^{\left(10\right)}$ | 10 | 5 | 34,512 | 25,884 | 8628
${\mathcal{D}}_{6}^{\left(10\right)}$ | 10 | 6 | 34,118 | 25,589 | 8529 | ${\mathcal{D}}_{7}^{\left(10\right)}$ | 10 | 7 | 33,734 | 25,301 | 8433
${\mathcal{D}}_{8}^{\left(10\right)}$ | 10 | 8 | 33,366 | 25,025 | 8341 | ${\mathcal{D}}_{9}^{\left(10\right)}$ | 10 | 9 | 33,020 | 24,765 | 8255
${\mathcal{D}}_{1}^{\left(120\right)}$ | 120 | 1 | 2985 | 2239 | 746 | ${\mathcal{D}}_{10}^{\left(120\right)}$ | 120 | 10 | 2457 | 1843 | 614
${\mathcal{D}}_{2}^{\left(120\right)}$ | 120 | 2 | 2912 | 2184 | 728 | ${\mathcal{D}}_{3}^{\left(120\right)}$ | 120 | 3 | 2843 | 2133 | 710
${\mathcal{D}}_{4}^{\left(120\right)}$ | 120 | 4 | 2781 | 2086 | 695 | ${\mathcal{D}}_{5}^{\left(120\right)}$ | 120 | 5 | 2723 | 2043 | 680
${\mathcal{D}}_{6}^{\left(120\right)}$ | 120 | 6 | 2666 | 2000 | 666 | ${\mathcal{D}}_{7}^{\left(120\right)}$ | 120 | 7 | 2611 | 1959 | 652
${\mathcal{D}}_{8}^{\left(120\right)}$ | 120 | 8 | 2558 | 1919 | 639 | ${\mathcal{D}}_{9}^{\left(120\right)}$ | 120 | 9 | 2507 | 1881 | 626
${\mathcal{D}}_{1}^{\left(20\right)}$ | 20 | 1 | 18,102 | 13,577 | 4525 | ${\mathcal{D}}_{10}^{\left(20\right)}$ | 20 | 10 | 16,006 | 12,005 | 4001
${\mathcal{D}}_{2}^{\left(20\right)}$ | 20 | 2 | 17,812 | 13,359 | 4453 | ${\mathcal{D}}_{3}^{\left(20\right)}$ | 20 | 3 | 17,539 | 13,155 | 4384
${\mathcal{D}}_{4}^{\left(20\right)}$ | 20 | 4 | 17,286 | 12,965 | 4321 | ${\mathcal{D}}_{5}^{\left(20\right)}$ | 20 | 5 | 17,050 | 12,788 | 4262
${\mathcal{D}}_{6}^{\left(20\right)}$ | 20 | 6 | 16,822 | 12,617 | 4205 | ${\mathcal{D}}_{7}^{\left(20\right)}$ | 20 | 7 | 16,605 | 12,454 | 4151
${\mathcal{D}}_{8}^{\left(20\right)}$ | 20 | 8 | 16,399 | 12,300 | 4099 | ${\mathcal{D}}_{9}^{\left(20\right)}$ | 20 | 9 | 16,196 | 12,147 | 4049
${\mathcal{D}}_{1}^{\left(30\right)}$ | 30 | 1 | 12,067 | 9051 | 3016 | ${\mathcal{D}}_{10}^{\left(30\right)}$ | 30 | 10 | 10,618 | 7964 | 2654
${\mathcal{D}}_{2}^{\left(30\right)}$ | 30 | 2 | 11,862 | 8897 | 2965 | ${\mathcal{D}}_{3}^{\left(30\right)}$ | 30 | 3 | 11,678 | 8759 | 2919
${\mathcal{D}}_{4}^{\left(30\right)}$ | 30 | 4 | 11,504 | 8628 | 2876 | ${\mathcal{D}}_{5}^{\left(30\right)}$ | 30 | 5 | 11,341 | 8506 | 2835
${\mathcal{D}}_{6}^{\left(30\right)}$ | 30 | 6 | 11,184 | 8388 | 2796 | ${\mathcal{D}}_{7}^{\left(30\right)}$ | 30 | 7 | 11,030 | 8273 | 2757
${\mathcal{D}}_{8}^{\left(30\right)}$ | 30 | 8 | 10,885 | 8164 | 2721 | ${\mathcal{D}}_{9}^{\left(30\right)}$ | 30 | 9 | 10,749 | 8062 | 2687
${\mathcal{D}}_{1}^{\left(40\right)}$ | 40 | 1 | 9008 | 6756 | 2252 | ${\mathcal{D}}_{10}^{\left(40\right)}$ | 40 | 10 | 7690 | 5768 | 1922
${\mathcal{D}}_{2}^{\left(40\right)}$ | 40 | 2 | 8829 | 6622 | 2207 | ${\mathcal{D}}_{3}^{\left(40\right)}$ | 40 | 3 | 8656 | 6492 | 2164
${\mathcal{D}}_{4}^{\left(40\right)}$ | 40 | 4 | 8495 | 6372 | 2123 | ${\mathcal{D}}_{5}^{\left(40\right)}$ | 40 | 5 | 8341 | 6256 | 2085
${\mathcal{D}}_{6}^{\left(40\right)}$ | 40 | 6 | 8196 | 6147 | 2049 | ${\mathcal{D}}_{7}^{\left(40\right)}$ | 40 | 7 | 8062 | 6047 | 2015
${\mathcal{D}}_{8}^{\left(40\right)}$ | 40 | 8 | 7931 | 5949 | 1982 | ${\mathcal{D}}_{9}^{\left(40\right)}$ | 40 | 9 | 7806 | 5855 | 1951
${\mathcal{D}}_{1}^{\left(50\right)}$ | 50 | 1 | 7220 | 5415 | 1805 | ${\mathcal{D}}_{10}^{\left(50\right)}$ | 50 | 10 | 6185 | 4639 | 1546
${\mathcal{D}}_{2}^{\left(50\right)}$ | 50 | 2 | 7079 | 5310 | 1769 | ${\mathcal{D}}_{3}^{\left(50\right)}$ | 50 | 3 | 6944 | 5208 | 1736
${\mathcal{D}}_{4}^{\left(50\right)}$ | 50 | 4 | 6816 | 5112 | 1704 | ${\mathcal{D}}_{5}^{\left(50\right)}$ | 50 | 5 | 6695 | 5022 | 1673
${\mathcal{D}}_{6}^{\left(50\right)}$ | 50 | 6 | 6584 | 4938 | 1646 | ${\mathcal{D}}_{7}^{\left(50\right)}$ | 50 | 7 | 6479 | 4860 | 1619
${\mathcal{D}}_{8}^{\left(50\right)}$ | 50 | 8 | 6376 | 4782 | 1594 | ${\mathcal{D}}_{9}^{\left(50\right)}$ | 50 | 9 | 6280 | 4710 | 1570
${\mathcal{D}}_{1}^{\left(60\right)}$ | 60 | 1 | 6006 | 4505 | 1501 | ${\mathcal{D}}_{10}^{\left(60\right)}$ | 60 | 10 | 5146 | 3860 | 1286
${\mathcal{D}}_{2}^{\left(60\right)}$ | 60 | 2 | 5886 | 4415 | 1471 | ${\mathcal{D}}_{3}^{\left(60\right)}$ | 60 | 3 | 5772 | 4329 | 1443
${\mathcal{D}}_{4}^{\left(60\right)}$ | 60 | 4 | 5668 | 4251 | 1417 | ${\mathcal{D}}_{5}^{\left(60\right)}$ | 60 | 5 | 5570 | 4178 | 1392
${\mathcal{D}}_{6}^{\left(60\right)}$ | 60 | 6 | 5478 | 4109 | 1369 | ${\mathcal{D}}_{7}^{\left(60\right)}$ | 60 | 7 | 5389 | 4042 | 1347
${\mathcal{D}}_{8}^{\left(60\right)}$ | 60 | 8 | 5306 | 3980 | 1326 | ${\mathcal{D}}_{9}^{\left(60\right)}$ | 60 | 9 | 5224 | 3918 | 1306

**Table 3.** Minimum (min), maximum (max), and average (avg) values of RMSE, MAPE, and R${}^{2}$ obtained over the 210 trained models.

NN Arch. Type | Statistic | RMSE [${}^{\circ}$C] | ${\mathit{T}}_{\mathbf{samp}}$ | $\mathit{SW}$ | MAPE [%] | ${\mathit{T}}_{\mathbf{samp}}$ | $\mathit{SW}$ | R${}^{2}$ | ${\mathit{T}}_{\mathbf{samp}}$ | $\mathit{SW}$
---|---|---|---|---|---|---|---|---|---|---
ANN | Min | $0.402$ | 10 | 5 | $1.03$ | 10 | 4 | $0.699$ | 120 | 3
ANN | Max | $4.561$ | 120 | 3 | $16.35$ | 120 | 3 | $0.998$ | 10 | 4, 5
ANN | Avg | $1.52$ | N/A | N/A | $4.29$ | N/A | N/A | $0.96$ | N/A | N/A
RNN | Min | $0.290$ | 10 | 5 | $0.87$ | 10 | 5 | $0.776$ | 120 | 3
RNN | Max | $3.933$ | 120 | 3 | $14.14$ | 120 | 3 | $0.999$ | 10 | 5
RNN | Avg | $1.45$ | N/A | N/A | $4.10$ | N/A | N/A | $0.96$ | N/A | N/A
LSTM | Min | $0.294$ | 10 | 5 | $0.89$ | 10 | 5 | $0.766$ | 120 | 3
LSTM | Max | $4.024$ | 120 | 3 | $14.08$ | 120 | 3 | $0.999$ | 10 | 5
LSTM | Avg | $1.46$ | N/A | N/A | $4.17$ | N/A | N/A | $0.96$ | N/A | N/A

**Table 4.** Prediction performances of the three proposed models on a reduced selection of $SW$ and ${T}_{\mathrm{samp}}$ (those which are better performing).

Data Set | ${\mathit{T}}_{\mathbf{samp}}$ [min] | $\mathit{SW}$ | RMSE [${}^{\circ}$C] LSTM | RMSE RNN | RMSE ANN | MAPE [%] LSTM | MAPE RNN | MAPE ANN | R${}^{2}$ LSTM | R${}^{2}$ RNN | R${}^{2}$ ANN
---|---|---|---|---|---|---|---|---|---|---|---
${\mathcal{D}}_{2}^{\left(10\right)}$ | 10 | 2 | $0.470$ | $0.769$ | $0.608$ | $1.42$ | $2.80$ | $1.91$ | $0.997$ | $0.992$ | $0.995$
${\mathcal{D}}_{3}^{\left(10\right)}$ | 10 | 3 | $0.830$ | $0.696$ | $0.923$ | $2.51$ | $2.14$ | $2.98$ | $0.991$ | $0.994$ | $0.989$
${\mathcal{D}}_{4}^{\left(10\right)}$ | 10 | 4 | $0.370$ | $0.501$ | $0.407$ | $1.19$ | $1.52$ | $1.03$ | $0.998$ | $0.997$ | $0.998$
${\mathcal{D}}_{5}^{\left(10\right)}$ | 10 | 5 | $0.294$ | $0.289$ | $0.402$ | $0.89$ | $0.87$ | $1.04$ | $0.999$ | $0.999$ | $0.998$
${\mathcal{D}}_{6}^{\left(10\right)}$ | 10 | 6 | $0.371$ | $0.464$ | $0.449$ | $1.16$ | $1.53$ | $1.18$ | $0.998$ | $0.997$ | $0.997$
${\mathcal{D}}_{7}^{\left(10\right)}$ | 10 | 7 | $0.577$ | $0.598$ | $0.997$ | $1.78$ | $1.94$ | $2.95$ | $0.996$ | $0.996$ | $0.987$
${\mathcal{D}}_{9}^{\left(10\right)}$ | 10 | 9 | $0.542$ | $0.685$ | $0.717$ | $1.69$ | $2.08$ | $2.23$ | $0.996$ | $0.994$ | $0.993$
${\mathcal{D}}_{5}^{\left(20\right)}$ | 20 | 5 | $0.434$ | $0.438$ | $0.674$ | $0.90$ | $0.93$ | $1.68$ | $0.998$ | $0.998$ | $0.994$
${\mathcal{D}}_{6}^{\left(20\right)}$ | 20 | 6 | $0.447$ | $0.439$ | $0.655$ | $0.93$ | $0.91$ | $1.63$ | $0.997$ | $0.998$ | $0.995$
${\mathcal{D}}_{7}^{\left(20\right)}$ | 20 | 7 | $0.458$ | $0.461$ | $0.509$ | $1.13$ | $1.16$ | $1.17$ | $0.997$ | $0.997$ | $0.997$
${\mathcal{D}}_{8}^{\left(20\right)}$ | 20 | 8 | $0.453$ | $0.466$ | $0.805$ | $0.98$ | $1.07$ | $2.06$ | $0.997$ | $0.997$ | $0.992$
${\mathcal{D}}_{9}^{\left(20\right)}$ | 20 | 9 | $0.684$ | $0.674$ | $0.606$ | $2.00$ | $1.77$ | $1.64$ | $0.994$ | $0.994$ | $0.995$
${\mathcal{D}}_{10}^{\left(20\right)}$ | 20 | 10 | $0.897$ | $0.907$ | $0.820$ | $2.80$ | $2.94$ | $2.08$ | $0.990$ | $0.990$ | $0.992$
${\mathcal{D}}_{3}^{\left(30\right)}$ | 30 | 3 | $0.974$ | $0.894$ | $0.765$ | $2.97$ | $2.66$ | $1.61$ | $0.987$ | $0.989$ | $0.992$
${\mathcal{D}}_{5}^{\left(30\right)}$ | 30 | 5 | $0.640$ | $0.693$ | $0.961$ | $1.72$ | $1.89$ | $2.90$ | $0.995$ | $0.994$ | $0.988$
${\mathcal{D}}_{7}^{\left(30\right)}$ | 30 | 7 | $0.778$ | $0.799$ | $0.782$ | $1.73$ | $1.97$ | $1.68$ | $0.993$ | $0.992$ | $0.992$
${\mathcal{D}}_{8}^{\left(30\right)}$ | 30 | 8 | $0.657$ | $0.682$ | $0.883$ | $1.39$ | $1.38$ | $1.99$ | $0.995$ | $0.994$ | $0.990$

**Table 5.** Prediction performance and complexity of a subset of evaluated models, in terms of RMSE, MAPE, R${}^{2}$, accuracy (namely, the percentage of samples predicted with an error lower than 1 ${}^{\circ}\mathrm{C}$), MAC operations, number of parameters, and NetScore of the models.

Model | RMSE [${}^{\circ}$C] | MAPE [%] | R${}^{2}$ | Accuracy [%] | MAC Operations | Number of Parameters | NetScore
---|---|---|---|---|---|---|---
Model in [19] | $1.50$ | $4.91$ | $0.965$ | $48.87$ | 1018 | 1018 | $17.05$
Re-trained [19] | $2.28$ | $6.54$ | $0.931$ | $34.91$ | 1018 | 1018 | $3.60$
${\mathrm{LSTM}}_{5}^{\left(10\right)}$ | $0.294$ | $0.89$ | $0.999$ | $99.18$ | 22,192 | 4625 | $-0.59$
${\mathrm{RNN}}_{5}^{\left(10\right)}$ | $0.289$ | $0.87$ | $0.999$ | $99.28$ | 5712 | 1361 | $25.25$
${\mathrm{ANN}}_{5}^{\left(10\right)}$ | $0.402$ | $1.04$ | $0.998$ | $97.28$ | 464 | 464 | $60.31$
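
For reference, the NetScore metric in the last column is defined in [18] as

$$\Omega(\mathcal{N}) = 20 \log\left( \frac{a(\mathcal{N})^{\kappa}}{p(\mathcal{N})^{\beta}\, m(\mathcal{N})^{\gamma}} \right)$$

where $a(\mathcal{N})$ is the accuracy, $p(\mathcal{N})$ the number of parameters, and $m(\mathcal{N})$ the number of MAC operations of network $\mathcal{N}$; the exponents $\kappa$, $\beta$, and $\gamma$ weight performance against model size and computational cost ([18] proposes $\kappa = 2$, $\beta = \gamma = 0.5$ as defaults; the exact values used for the figures above are not shown in this excerpt). Higher NetScore indicates a better performance/complexity trade-off for on-device edge deployment.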

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Codeluppi, G.; Davoli, L.; Ferrari, G.
Forecasting Air Temperature on Edge Devices with Embedded AI. *Sensors* **2021**, *21*, 3973.
https://doi.org/10.3390/s21123973

**AMA Style**

Codeluppi G, Davoli L, Ferrari G.
Forecasting Air Temperature on Edge Devices with Embedded AI. *Sensors*. 2021; 21(12):3973.
https://doi.org/10.3390/s21123973

**Chicago/Turabian Style**

Codeluppi, Gaia, Luca Davoli, and Gianluigi Ferrari.
2021. "Forecasting Air Temperature on Edge Devices with Embedded AI" *Sensors* 21, no. 12: 3973.
https://doi.org/10.3390/s21123973