Intelligent Symmetry-Based Vision System for Real-Time Industrial Process Supervision
Abstract
1. Introduction
Research Gap and Motivation
- High computational demand: Most existing CNN-based methods require GPU acceleration and are unsuitable for low-cost embedded platforms, limiting real-time deployment in small-scale plants.
- Limited exploitation of geometric symmetry: Few approaches explicitly leverage the inherent circular or axial symmetry of analog scales to simplify feature extraction and improve robustness under illumination changes.
- Dependence on costly sensors and digital transmitters: Prior works often assume the replacement of analog gauges with digital ones, which is economically infeasible for SMEs.
- Restricted adaptability and scalability: Many existing solutions are designed for a single instrument type and lack modularity for multi-variable monitoring.
- Insufficient integration with edge and IoT architectures: Most vision-based systems rely on PC-level processing and do not address real-time communication or supervisory integration.
2. Proposed Methods
2.1. Symmetry Formalization in Analog Instrument Geometry
2.2. Dataset and Preprocessing
2.3. ANN Hyperparameters
2.4. Rationale for Selecting Artificial Neural Networks
3. Image Processing
3.1. Pressure Gauge
3.2. Rotameter
4. Artificial Neural Networks
4.1. Structure
4.2. Training and Operation
4.3. Output Layer Function and Training Process
4.4. Dataset Construction and Preprocessing
5. Results and Discussion
5.1. ANN Convergence
5.2. System Operation
5.3. Statistical Performance Evaluation
5.4. Comparative Discussion and Practical Implications
5.5. Generalization Capability
5.6. Limitations of the Study
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| ANN Type | Instrument | Input Neurons | Hidden Neurons | Output Neurons | Learning Rate | Activation | Iterations |
|---|---|---|---|---|---|---|---|
| Scale Localization | Pressure Gauge | 20,000 | 30 | 7 | 0.01 | Sigmoid | 200 |
| Indicator Tracking | Pressure Gauge | 1000 | 20 | 6 | 0.01 | Sigmoid | 400 |
| Scale Localization | Rotameter | 14,000 | 30 | 7 | 0.01 | Sigmoid | 200 |
| Float Tracking | Rotameter | 600 | 20 | 6 | 0.01 | Sigmoid | 200 |
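The networks in the table above are small fully connected sigmoid networks with one hidden layer. As an illustration, the sketch below implements such a network with the sizes from the Indicator Tracking row (1000 inputs, 20 hidden neurons, 6 outputs, learning rate 0.01); the class and method names are illustrative, not the authors' implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class TwoLayerANN:
    """Minimal feedforward sigmoid network trained by backpropagation."""

    def __init__(self, n_in=1000, n_hidden=20, n_out=6, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.05, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.05, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)       # hidden activations
        self.y = sigmoid(self.h @ self.W2 + self.b2)  # output activations
        return self.y

    def train_step(self, X, t):
        y = self.forward(X)
        # Backpropagation of the mean-squared-error gradient
        # through the sigmoid units.
        d_out = (y - t) * y * (1.0 - y)
        d_hid = (d_out @ self.W2.T) * self.h * (1.0 - self.h)
        self.W2 -= self.lr * self.h.T @ d_out
        self.b2 -= self.lr * d_out.sum(axis=0)
        self.W1 -= self.lr * X.T @ d_hid
        self.b1 -= self.lr * d_hid.sum(axis=0)
        return np.mean(np.abs(y - t))  # per-step MAE, the metric reported later
```

Repeated calls to `train_step` on a batch of labeled image vectors drive the MAE down toward the converged values reported in the results tables.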

| Network Type | Instrument | Iterations to Convergence | Training Time (min) | Final MAE | Execution Time per Cycle (ms) |
|---|---|---|---|---|---|
| Scale Positioning ANN | Pressure Gauge | 200 | 30 | 0.018 | 3.74 |
| Indicator Tracking ANN | Pressure Gauge | 400 | 38 | 0.021 | 3.81 |
| Scale Positioning ANN | Rotameter | 200 | 32 | 0.019 | 3.70 |
| Float Tracking ANN | Rotameter | 200 | 40 | 0.020 | 3.68 |

| Instrument | Reference Range | MAE | RMSE | R² |
|---|---|---|---|---|
| Pressure Gauge | 0–200 PSI | 0.589 PSI | 0.731 PSI | 0.985 |
| Rotameter | 1–10 GPM | 0.085 GPM | 0.097 GPM | 0.978 |
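The error statistics in the table above (MAE, RMSE, and the coefficient of determination R²) follow from the standard definitions, sketched below; the function name and sample readings are illustrative.

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """MAE, RMSE and coefficient of determination (R^2)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_pred - y_true
    mae = np.mean(np.abs(err))                       # mean absolute error
    rmse = np.sqrt(np.mean(err ** 2))                # root mean squared error
    ss_res = np.sum(err ** 2)                        # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
    return mae, rmse, 1.0 - ss_res / ss_tot
```

For example, reference pressures of 0, 100, and 200 PSI read back as 1, 99, and 202 PSI give MAE ≈ 1.33 PSI, RMSE ≈ 1.41 PSI, and R² ≈ 0.9997.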

| Instrument | Measurement Range | Number of Samples | Mean Error (%) | Standard Deviation (%) | Maximum Deviation (%) | Test Environment |
|---|---|---|---|---|---|---|
| Pressure Gauge | 0–200 PSI | 180 | 2.37 | 0.84 | 4.85 | Factory Hall A—22 °C, 300 lx |
| Rotameter | 1–10 GPM | 140 | 3.18 | 1.05 | 5.00 | Fluid Test Bench—21 °C, 280 lx |
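The percentage statistics in the table above can be obtained as sketched below, assuming (since the normalization is not stated here) that errors are expressed as a percentage of the instrument's full-scale span; the function name and sample values are illustrative.

```python
import numpy as np

def fullscale_error_stats(y_ref, y_meas, span):
    """Mean, sample standard deviation, and maximum of the
    absolute error expressed as a percentage of full scale."""
    e_pct = 100.0 * np.abs(np.asarray(y_meas, dtype=float)
                           - np.asarray(y_ref, dtype=float)) / span
    return e_pct.mean(), e_pct.std(ddof=1), e_pct.max()
```

For instance, reference values of 0, 100, and 200 PSI measured as 2, 104, and 196 PSI over a 200 PSI span give a mean error of about 1.67%, a standard deviation of about 0.58%, and a maximum deviation of 2%.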
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Corrales, G.; Gálvez, C.; Pruna, E.P.; Andaluz, V.H.; Ortiz, J.S. Intelligent Symmetry-Based Vision System for Real-Time Industrial Process Supervision. Symmetry 2025, 17, 2143. https://doi.org/10.3390/sym17122143

