# On-Device IoT-Based Predictive Maintenance Analytics Model: Comparing TinyLSTM and TinyModel from Edge Impulse


## Abstract


## 1. Introduction

- For real-time application, equipment performance data for model development were gathered in real time using an IoT device installed on the equipment.
- Because multiple condition parameters may separately affect the equipment's life, the data are labeled using a fuzzy expert system built on the maintainers' expertise.
- To predict a piece of equipment's remaining useful life from the status of its components, IoT-based real-time predictive analytics models (an LSTM and a model from Edge Impulse) are developed and compared.
- For fault prediction and early notification of maintenance priority suggestions, each model is converted to a TinyModel, and its deployment onto an IoT device for continuous real-time health monitoring is simulated.

## 2. Data Gathering and Processing

#### 2.1. Data Acquisition

#### 2.2. Data Preprocessing and Labeling Using Fuzzy Expert System

#### 2.2.1. Data Preprocessing

#### 2.2.2. Recall of Fuzzy Expert System in Predictive Maintenance

#### 2.2.3. Data Labeling (RUL) Using a Fuzzy Expert System
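The paper labels each record's maintenance priority and RUL with a Mamdani-type fuzzy expert system built from the maintainers' expertise. The sketch below illustrates the general labeling idea in plain NumPy; all membership ranges, rules, and thresholds here are hypothetical stand-ins, not the ones elicited in the study.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function: feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def label_priority(temp, vib):
    """Mamdani-style inference mapping temperature (deg C) and vibration
    (mm/s) to a maintenance priority in [0, 1] (hypothetical rule base)."""
    # Fuzzify the inputs with assumed membership ranges.
    temp_high = tri(temp, 35.0, 60.0, 90.0)
    temp_norm = tri(temp, 20.0, 30.0, 40.0)
    vib_high = tri(vib, 20.0, 50.0, 80.0)
    vib_low = tri(vib, -1.0, 0.0, 25.0)

    # Rule strengths: min acts as AND, max as OR.
    r_high = max(temp_high, vib_high)   # either stressor -> high priority
    r_low = min(temp_norm, vib_low)     # all normal      -> low priority

    # Clip the output sets by rule strength, aggregate, defuzzify (centroid).
    u = np.linspace(0.0, 1.0, 101)
    mu_high = np.array([tri(p, 0.5, 1.0, 1.5) for p in u])
    mu_low = np.array([tri(p, -0.5, 0.0, 0.5) for p in u])
    agg = np.maximum(np.minimum(r_high, mu_high), np.minimum(r_low, mu_low))
    return float((u * agg).sum() / agg.sum()) if agg.sum() > 0 else 0.0
```

A hot, vibrating machine thus receives a higher priority label than a cool, quiet one; in the paper, the resulting priority scores are what the RUL labels in the dataset are derived from.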

## 3. Predictive Analytics Models (LSTM and Model from Edge Impulse)

#### 3.1. Long Short-Term Memory (LSTM)

An LSTM cell relies on three gates to regulate the information held in its memory state ($C_t$) and to control the output as well. Given the current input $x_t$ at time $t$, the forget gate computes Equation (6) using the sigmoid function to decide which information in the memory state to keep and which to throw away:

$$f_t = \sigma(W_f \cdot [h_{t-1}, x_t] + b_f) \quad (6)$$

The input gate, Equation (7), then decides which values to update, while a $\tanh$ layer creates a vector of new candidate values ($\tilde{C}_t$) over Equation (8). The two gates' outputs are then combined and pointwise multiplied in Equation (9) to create an update to the cell memory state ($C_t$):

$$i_t = \sigma(W_i \cdot [h_{t-1}, x_t] + b_i) \quad (7)$$

$$\tilde{C}_t = \tanh(W_C \cdot [h_{t-1}, x_t] + b_C) \quad (8)$$

$$C_t = f_t \odot C_{t-1} + i_t \odot \tilde{C}_t \quad (9)$$

Finally, the output gate, $o_t = \sigma(W_o \cdot [h_{t-1}, x_t] + b_o)$, filters the updated memory state, and the cell computes its output ($h_t$) by performing Equation (12):

$$h_t = o_t \odot \tanh(C_t) \quad (12)$$

where $x_t$ is the current input at time $t$, $h_{t-1}$ the hidden state from the previous iteration, $W$ and $b$ the weight matrix and bias of each gate, $\sigma$ the sigmoid function, $\odot$ pointwise multiplication, and $C_t$ the cell memory state at time $t$.
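For illustration, the gate equations can be written directly in NumPy. The weights below are random placeholders with the dimensions used later in this paper (4 sensor features, 32 hidden units), not the trained model:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM step implementing the gate equations above.

    Each W[k] maps the concatenated [h_{t-1}, x_t] to a gate pre-activation;
    each b[k] is the corresponding bias vector.
    """
    z = np.concatenate([h_prev, x_t])
    f_t = sigmoid(W["f"] @ z + b["f"])     # forget gate, Eq. (6)
    i_t = sigmoid(W["i"] @ z + b["i"])     # input gate
    c_hat = np.tanh(W["c"] @ z + b["c"])   # candidate values, Eq. (8)
    c_t = f_t * c_prev + i_t * c_hat       # memory state update, Eq. (9)
    o_t = sigmoid(W["o"] @ z + b["o"])     # output gate
    h_t = o_t * np.tanh(c_t)               # hidden state, Eq. (12)
    return h_t, c_t

# Example: 4 input features (the sensor channels) and 32 hidden units.
rng = np.random.default_rng(0)
n_in, n_hid = 4, 32
W = {k: rng.normal(scale=0.1, size=(n_hid, n_hid + n_in)) for k in "fico"}
b = {k: np.zeros(n_hid) for k in "fico"}
h, c = np.zeros(n_hid), np.zeros(n_hid)
h, c = lstm_step(rng.normal(size=n_in), h, c, W, b)
```

In practice a framework fuses the four gate matrices into one weight tensor, but the per-gate form above maps one-to-one onto the equations.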

#### 3.2. Predictive Analytics Model from Edge Impulse

## 4. Results and Discussion

#### 4.1. LSTM Model Structure and Performance Metrics

#### 4.2. Model from Edge Impulse Structure and Performance Metrics

#### 4.3. TinyModel

#### 4.4. Simulating the Deployment and Inference Creation

#### 4.5. Models’ Comparison

## 5. Conclusions

## Author Contributions

## Funding

## Data Availability Statement

## Conflicts of Interest

## Abbreviations

Abbreviation | Meaning |
---|---|
AI | Artificial Intelligence |
EI | Edge Impulse |
IoT | Internet of Things |
LSTM | Long Short-Term Memory |
MSE | Mean Squared Error |
PdM | Predictive Maintenance |
RNN | Recurrent Neural Networks |
RUL | Remaining Useful Life |
TF | TensorFlow |
TinyML | Tiny Machine Learning |

## References

- Ran, Y.; Zhou, X.; Lin, P.; Wen, Y.; Deng, R. A Survey of Predictive Maintenance: Systems, Purposes and Approaches. arXiv **2019**, arXiv:1912.07383.
- Lee, J.; Wu, F.; Zhao, W.; Ghaffari, M.; Liao, L.; Siegel, D. Prognostics and health management design for rotary machinery systems—Reviews, methodology and applications. Mech. Syst. Signal Process. **2014**, 42, 314–337.
- Aydemir, G.; Acar, B. Anomaly monitoring improves remaining useful life estimation of industrial machinery. J. Manuf. Syst. **2020**, 56, 463–469.
- Xia, T.; Song, Y.; Zheng, Y.; Pan, E.; Xi, L. An ensemble framework based on convolutional bi-directional LSTM with multiple time windows for remaining useful life estimation. Comput. Ind. **2020**, 115, 103182.
- Wang, B.; Lei, Y.; Li, N.; Yan, T. Deep separable convolutional network for remaining useful life prediction of machinery. Mech. Syst. Signal Process. **2019**, 134, 106330.
- Wu, Y.; Yuan, M.; Dong, S.; Lin, L.; Liu, Y. Remaining useful life estimation of engineered systems using vanilla LSTM neural networks. Neurocomputing **2018**, 275, 167–179.
- Li, H.; Zhao, W.; Zhang, Y.; Zio, E. Remaining useful life prediction using multi-scale deep convolutional neural network. Appl. Soft Comput. J. **2020**, 89, 106113.
- Sikorska, J.Z.; Hodkiewicz, M.; Ma, L. Prognostic modelling options for remaining useful life estimation by industry. Mech. Syst. Signal Process. **2011**, 25, 1803–1836.
- Ayvaz, S.; Alpay, K. Predictive maintenance system for production lines in manufacturing: A machine learning approach using IoT data in real-time. Expert Syst. Appl. **2021**, 173, 114598.
- Alipour, M.; Mohammadi-Ivatloo, B.; Zare, K. Stochastic Scheduling of Renewable and CHP-Based Microgrids. IEEE Trans. Ind. Inform. **2015**, 11, 1049–1058.
- Li, X.; Ding, Q.; Sun, J.Q. Remaining useful life estimation in prognostics using deep convolution neural networks. Reliab. Eng. Syst. Saf. **2018**, 172, 1–11.
- Benkedjouh, T.; Medjaher, K.; Zerhouni, N.; Rechak, S. Remaining useful life estimation based on nonlinear feature reduction and support vector regression. Eng. Appl. Artif. Intell. **2013**, 26, 1751–1760.
- Celikmih, K.; Inan, O.; Uguz, H. Failure prediction of aircraft equipment using machine learning with a hybrid data preparation method. Sci. Program. **2020**, 2020, 8616039.
- Zhang, X.H.; Kang, J.S. Hidden Markov models in bearing fault diagnosis and prognosis. In Proceedings of the 2010 Second International Conference on Computational Intelligence and Natural Computing, Wuhan, China, 13–14 September 2010; Volume 2, pp. 364–367.
- Kotsiopoulos, T.; Sarigiannidis, P.; Ioannidis, D.; Tzovaras, D. Machine Learning and Deep Learning in smart manufacturing: The Smart Grid paradigm. Comput. Sci. Rev. **2021**, 40, 100341.
- Wang, J.; Ma, Y.; Zhang, L.; Gao, R.X.; Wu, D. Deep learning for smart manufacturing: Methods and applications. J. Manuf. Syst. **2018**, 48, 144–156.
- Zhou, Y.; Hefenbrock, M.; Huang, Y.; Riedel, T.; Beigl, M. Automatic Remaining Useful Life Estimation Framework with Embedded Convolutional LSTM as the Backbone. Lect. Notes Comput. Sci. **2021**, 12460, 461–477.
- Li, J.; Li, X.; He, D. A Directed Acyclic Graph Network Combined With CNN and LSTM for Remaining Useful Life Prediction. IEEE Access **2019**, 7, 75464.
- Elsheikh, A.; Yacout, S.; Ouali, M.S. Bidirectional handshaking LSTM for remaining useful life prediction. Neurocomputing **2019**, 323, 148–156.
- Yu, Y.; Hu, C.; Si, X.; Zheng, J.; Zhang, J. Averaged Bi-LSTM networks for RUL prognostics with non-life-cycle labeled dataset. Neurocomputing **2020**, 402, 134–147.
- Compare, M.; Baraldi, P.; Zio, E. Challenges to IoT-Enabled Predictive Maintenance for Industry 4.0. IEEE Internet Things J. **2020**, 7, 4585–4597.
- Advanced ML for Every Solution. Available online: https://www.edgeimpulse.com (accessed on 30 May 2022).
- About Keras. Available online: https://keras.io/about/ (accessed on 30 May 2022).
- Niyonambaza, I.; Zennaro, M.; Uwitonze, A. Predictive maintenance (PdM) structure using internet of things (IoT) for mechanical equipment used into hospitals in Rwanda. Futur. Internet **2020**, 12, 224.
- Bekar, E.T.; Nyqvist, P.; Skoogh, A. An intelligent approach for data pre-processing and analysis in predictive maintenance with an industrial case study. Adv. Mech. Eng. **2020**, 12, 1–14.
- Zadeh, L.A. Fuzzy sets. Inf. Control **1965**, 8, 338–353.
- Mihigo, I.N.; Zennaro, M.; Uwitonze, A. Enhancing the Priority for the Maintenance Activities of the Hospitals’ Mechanical Equipment Using the Fuzzy Expert System. In Proceedings of the 13th EAI International Conference, AFRICOMM 2021, Zanzibar, Tanzania, 1–3 December 2021; pp. 170–181.
- Baban, M.; Baban, C.F.; Moisi, B. A Fuzzy Logic-Based Approach for Predictive Maintenance of Grinding Wheels of Automated Grinding Lines. In Proceedings of the 23rd International Conference on Methods and Models in Automation and Robotics (MMAR), Miedzyzdroje, Poland, 27–30 August 2018; pp. 483–486.
- Baban, M.; Baban, C.F.; Suteu, M.D. Maintenance Decision-Making Support for Textile Machines: A Knowledge-Based Approach Using Fuzzy Logic and Vibration Monitoring. IEEE Access **2019**, 7, 83504–83514.
- Kumar, E.V.; Chaturvedi, S.K. Prioritization of maintenance tasks on industrial equipment for reliability: A fuzzy approach. Int. J. Qual. Reliab. Manag. **2011**, 28, 109–126.
- Borjalilu, N.; Ghambari, M. Optimal maintenance strategy selection based on a fuzzy analytical network process: A case study on a 5-MW powerhouse. Int. J. Eng. Bus. Manag. **2018**, 10, 1–10.
- Andrew, A.; Kumanan, S. Development of an intelligent decision making tool for maintenance planning using fuzzy logic and dynamic scheduling. Int. J. Inf. Technol. **2020**, 12, 27–36.
- Gallab, M.; Bouloiz, H.; Alaoui, Y.L.; Tkiouat, M. Risk Assessment of Maintenance activities using Fuzzy Logic. Procedia Comput. Sci. **2019**, 148, 226–235.
- Jang, J.R. ANFIS: Adaptive Network-Based Fuzzy Inference System. IEEE Trans. Syst. Man Cybern. **1993**, 23, 665–685.
- Fuzzy Logic—Controls, Concepts, Theories and Application: A Mamdani Type Fuzzy Logic Controller. Available online: www.intechopen.com (accessed on 30 May 2022).
- Hochreiter, S.; Schmidhuber, J. Long Short-Term Memory. Neural Comput. **1997**, 9, 1735–1780.
- TensorFlow Home Page. Available online: https://www.tensorflow.org/ (accessed on 30 May 2022).
- Arduino Nano 33 BLE Sense. Available online: https://docs.arduino.cc/hardware/nano-33-ble-sense/ (accessed on 30 May 2022).

Temp. (°C) | Vib. (mm/s) | Curr. A (mA) | Curr. B (mA) |
---|---|---|---|
33.44 | 48 | 0.12 | 0.12 |
33.88 | 46 | 0.08 | 17.53 |
30.56 | 16 | 2.72 | 0.10 |
32.50 | 11 | 0.03 | 0.12 |
35.56 | 7 | 0.01 | 0.12 |
35.88 | 47 | 0.01 | 0.10 |
43.63 | 47 | 0.06 | 0.12 |

Temp. (°C) | Vib. (mm/s) | Curr. A (mA) | Curr. B (mA) | M. Priority (0 to 1) | RUL (Days) |
---|---|---|---|---|---|
89 | 12 | 2.53 | 17.65 | 0.90 | 1 |
47 | 6 | 6.07 | 0.01 | 0.86 | 2 |
34.81 | 12 | 4.89 | 0.12 | 0.86 | 3 |
44.88 | 6 | 2.79 | 0.03 | 0.86 | 4 |
31.44 | 9 | 5.21 | 0.15 | 0.86 | 5 |
33.75 | 0 | 3.56 | 0.11 | 0.86 | 6 |
26.87 | 0 | 0.1 | 15.45 | 0.89 | 7 |
28.25 | 50 | 0.13 | 0.05 | 0.86 | 8 |
38 | 7 | 0.13 | 0.12 | 0.38 | 9 |
34 | 6 | 0.11 | 0.11 | 0.31 | 10 |
37.88 | 8 | 0.1 | 0.12 | 0.38 | 11 |
30.31 | 0 | 0.1 | 0.12 | 0.15 | 12 |
30.31 | 10 | 0.1 | 0.12 | 0.15 | 13 |
29.25 | 0 | 0.08 | 0.12 | 0.13 | 14 |
29.25 | 0 | 0.08 | 0.11 | 0.13 | 15 |
29.25 | 0 | 0.13 | 0.1 | 0.13 | 16 |
28.44 | 0 | 0.1 | 0.11 | 0.13 | 17 |
25 | 0 | 0.13 | 0.33 | 0.12 | 18 |
28.5 | 0 | 0.03 | 0.32 | 0.13 | 19 |
28.5 | 0 | 0.13 | 0.11 | 0.13 | 20 |
30.19 | 10 | 0.1 | 0.11 | 0.15 | 21 |
30.18 | 9 | 0.11 | 18.2 | 0.89 | 22 |

Parameters | Optimum Metrics’ Value |
---|---|
Model training dataset portion | $80\%$ |
Model evaluation dataset portion | $20\%$ |
Model type | Sequential |
LSTM layer | 32 neurons |
Hidden Dense layer | 16 neurons |
Dropout rate | 0.2 |
Output layer (Dense) | 1 neuron |
Optimizer | Adam |
Learning rate | 0.001 |
Epochs | 5 |
Performance metrics | MSE (Mean Squared Error) and coefficient of determination ${R}^{2}$ |
Batch size | 16 |
Time step window | 60 |
Train MSE | 0.0295 |
Test MSE | 0.01 |
${R}^{2}$ | 0.77 |
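Assuming TensorFlow/Keras (the paper's stated platform) and the hyperparameters in the table above, the LSTM model can be assembled roughly as follows; the exact layer ordering of the original implementation is an assumption:

```python
import numpy as np
import tensorflow as tf

# Sequential LSTM regressor per the hyperparameter table: a window of
# 60 time steps x 4 sensor features maps to a single RUL estimate.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(60, 4)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
              loss="mse")

# Training would use the 80/20 split from the table, e.g.:
# model.fit(X_train, y_train, epochs=5, batch_size=16)
out = model(np.zeros((1, 60, 4), dtype="float32"))  # one windowed sample in
```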

Parameters | Specifications |
---|---|
Training cycles | 10 cycles |
Training dataset | $80\%$ of the entire dataset |
Testing dataset | $20\%$ of the entire dataset |
Validation dataset (used during training) | $20\%$ of the entire dataset |
Learning rate | 0.005 |
Activation | ReLU |
Batch size | 32 |
Epochs | 10 |
Loss function | Mean Squared Error (MSE) |
Model type | Sequential |
Input layer | 4 features |
Hidden Dense layer at first level | 20 neurons |
Hidden Dense layer at second level | 10 neurons |
Output layer | 1 class (1 neuron, no activation) |
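Edge Impulse generates its neural network blocks on top of Keras [23], so the structure in the table above corresponds roughly to the following network; this is an equivalent sketch, not the code the platform emits:

```python
import numpy as np
import tensorflow as tf

# Dense regression network matching the Edge Impulse structure in the
# table: 4 input features -> 20 -> 10 -> 1 (no activation on the output).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(20, activation="relu"),
    tf.keras.layers.Dense(10, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.005),
              loss="mse")
pred = model(np.zeros((1, 4), dtype="float32"))  # one sensor reading in
```

Note the contrast with the LSTM: this model consumes a single reading of the four sensor channels rather than a 60-step window.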

Temp. (°C) | Vib. (mm/s) | Curr. A (mA) | Curr. B (mA) | Actual RUL (Days) |
---|---|---|---|---|
50.56 | 45 | 0.14 | 17.97 | 1 |
54.31 | 25 | 2.63 | 0.01 | 1 |
54.31 | 55 | 2.65 | 0 | 1 |
55.13 | 127 | 2.71 | 18.07 | 1 |
47.69 | 50 | 0.14 | 18.09 | 1 |
41.44 | 42 | 0.14 | 18.23 | 1 |
37.88 | 48 | 0.14 | 18.12 | 1 |
36 | 70 | 0.14 | 18.06 | 1 |

Element | For LSTM Model | For Model from Edge Impulse |
---|---|---|
Model building platform | TensorFlow | Edge Impulse |
Free version of platform | No limits on data size or training time, but work in progress must be saved and confirmed regularly | Limited data size and training time |
Library | Keras [23] | Keras [23] |
Data preprocessing | Within the same platform | Outside of Edge Impulse |

Element | LSTM Model | Model from Edge Impulse |
---|---|---|
Model type | Sequential | Sequential |
Model structure | Based on neural network blocks | Based on neural network blocks |
Model build-up | Customized by the developer | A standardized inbuilt model is proposed, which can be customized |
Training time for the same dataset | Long | Short |
Regression performance metrics | To be defined by the developer | Defaults to MSE; can be customized in Expert mode |
Output representation | Customized by the developer depending on the metrics to be presented | Default and limited |
Activation | To be defined; mostly ReLU for regression models (the Keras standard) | Defaults to ReLU; can be customized in Expert mode |
Ordinary model building simplicity | Depends on the developer’s experience | The standardized inbuilt model may perform well on the data and is easy to improve, even for a less experienced developer |
Regression output | Single value | Class |
Overfitting possibility | Higher | Lower |
Model train loss (MSE) | 0.0295 | 0.11 on the validation dataset |
Model test loss (MSE) | 0.0092 | 0.11 |
Model performance | ${R}^{2}$: $77\%$ | Accuracy: $99.87\%$ |

Element | LSTM Model | Model from Edge Impulse |
---|---|---|
Converting the ordinary model to a TinyModel | Using TensorFlow Lite | Inbuilt conversion |
TinyML device required memory | Not estimated | Both RAM and ROM (flash) memory are estimated for a given edge device |
Latency of the TinyModel on the IoT device | Not estimated | Estimated by the Edge Impulse platform (1 ms in our case, on a Cortex-M4F at 64 MHz) |
Microcontroller for edge deployment | By choice: Arduino Nano 33 BLE Sense in this case | By choice: Arduino Nano 33 BLE Sense in this case |
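The TensorFlow Lite conversion path in the first row can be sketched as follows. The model here is a small placeholder standing in for the trained network, and no quantization options are applied since the paper does not state them:

```python
import tensorflow as tf

# Placeholder Keras model standing in for the trained network; the paper's
# trained model would be converted the same way.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(20, activation="relu"),
    tf.keras.layers.Dense(1),
])

# Convert to a TensorFlow Lite flatbuffer (the "TinyModel") suitable for
# deployment on a microcontroller such as the Arduino Nano 33 BLE Sense.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting `.tflite` blob is what gets compiled into the Arduino sketch for on-device inference.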

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Mihigo, I.N.; Zennaro, M.; Uwitonze, A.; Rwigema, J.; Rovai, M.
On-Device IoT-Based Predictive Maintenance Analytics Model: Comparing TinyLSTM and TinyModel from Edge Impulse. *Sensors* **2022**, *22*, 5174.
https://doi.org/10.3390/s22145174

**AMA Style**

Mihigo IN, Zennaro M, Uwitonze A, Rwigema J, Rovai M.
On-Device IoT-Based Predictive Maintenance Analytics Model: Comparing TinyLSTM and TinyModel from Edge Impulse. *Sensors*. 2022; 22(14):5174.
https://doi.org/10.3390/s22145174

**Chicago/Turabian Style**

Mihigo, Irene Niyonambaza, Marco Zennaro, Alfred Uwitonze, James Rwigema, and Marcelo Rovai.
2022. "On-Device IoT-Based Predictive Maintenance Analytics Model: Comparing TinyLSTM and TinyModel from Edge Impulse" *Sensors* 22, no. 14: 5174.
https://doi.org/10.3390/s22145174