Article

Intelligent Symmetry-Based Vision System for Real-Time Industrial Process Supervision

by Gabriel Corrales, Catherine Gálvez, Edwin P. Pruna, Víctor H. Andaluz * and Jessica S. Ortiz
Departamento de Eléctrica, Electrónica y Telecomunicaciones, Universidad de las Fuerzas Armadas–ESPE, Av. General Rumiñahui S/N, Sangolqui 171103, Ecuador
*
Author to whom correspondence should be addressed.
Symmetry 2025, 17(12), 2143; https://doi.org/10.3390/sym17122143
Submission received: 24 October 2025 / Revised: 17 November 2025 / Accepted: 9 December 2025 / Published: 12 December 2025
(This article belongs to the Special Issue Applications Based on Symmetry in Control Systems and Robotics)

Abstract

Industrial environments still rely heavily on analog instruments for process supervision, as their robustness and low cost make them suitable for harsh conditions. However, these devices require manual readings, which limit automation and digital integration within Industry 4.0 frameworks. To address this gap, this study proposes an intelligent and cost-effective system for non-invasive acquisition of measurement data from analog industrial instruments, leveraging machine vision and Artificial Neural Networks (ANNs). The proposed framework exploits the geometric symmetry inherent in circular and linear scales to interpret pointer positions under varying lighting and perspective conditions. A dedicated image-processing pipeline is combined with lightweight ANN architectures optimized for embedded platforms, ensuring real-time inference without the need for high-end hardware. The processed data are wirelessly transmitted to a Human–Machine Interface (HMI) and web-based dashboard for real-time visualization. Experimental validation on pressure and flow instruments demonstrated an average Mean Absolute Error (MAE) of 0.589 PSI and 0.085 GPM, Root Mean Square Error (RMSE) values of 0.731 PSI and 0.097 GPM, and coefficients of determination (R²) of 0.985 and 0.978, respectively. The system achieved an average processing time of 3.74 ms per cycle on a Raspberry Pi 3 platform, outperforming Optical Character Recognition (OCR) and Convolutional Neural Network (CNN)-based methods in terms of computational efficiency and latency. The results confirm the feasibility of a symmetry-driven vision framework for real-time industrial supervision, providing a practical pathway to digitalize legacy analog instruments and promote low-cost, intelligent Industry 4.0 implementations.

1. Introduction

Global competitive markets increasingly demand industrial processes that emphasize flexibility, productivity, and uniform product quality, while minimizing human intervention to reduce costs and ensure consistent performance [1,2,3]. Over the past two decades, industrial control and monitoring systems have evolved rapidly, becoming essential components in the digital transformation of manufacturing [4]. Major automation vendors now offer integrated supervisory and control solutions, but their high implementation and licensing costs often restrict adoption among small and medium-sized enterprises (SMEs) [3,5,6]. These economic barriers highlight the need for cost-effective and adaptable alternatives that can enhance process supervision without replacing existing infrastructure.
Monitoring and data acquisition are critical to maintaining process reliability and safety. Conventional wired systems and digital transmitters enable precise readings, but they increase complexity, cabling requirements, and installation costs [7,8]. In many industrial environments, analog instruments such as manometers and rotameters remain the primary means of process observation. Their geometric design, often circular or linear, exhibits inherent symmetry, which facilitates visual interpretation by human operators. Leveraging this visual symmetry through computational models offers an opportunity to transform traditional analog readings into digital information, supporting real-time supervision and intelligent control. In this context, machine vision and machine learning have become powerful tools for extracting and interpreting information from visual data [9,10,11,12].
Recent studies have explored the use of machine vision in industrial applications such as automated inspection, metrology, and process monitoring [13,14,15]. These systems use algorithms for pattern recognition and feature extraction, capable of identifying symmetric and asymmetric structures within an image. Advances in ANNs, CNNs, and Vision Transformers (ViTs) have further enhanced the ability to detect geometric patterns and deviations under varying lighting and environmental conditions [16,17]. In analog instrument reading, these methods allow automatic identification of scales and indicators, correction of perspective distortions, and tracking of pointer movement in real time. Nonetheless, deep learning architectures can be computationally expensive, making their implementation on low-cost embedded platforms challenging [18,19].
To address these limitations, the present study proposes a symmetry-driven machine vision system that combines image processing techniques with lightweight ANN architectures. The system recognizes and interprets the symmetry of analog instrument scales and detects asymmetric deviations in pointer position to determine process variables such as pressure and flow. Designed for real-time performance, the algorithm runs on a Linux-based embedded platform (Raspberry Pi) and wirelessly transmits data to a central server. A Human–Machine Interface (HMI) and a web dashboard allow visualization and logging of process variables locally or remotely. Experimental results demonstrate reliable operation, with an error margin between 1% and 5% compared to industrial-grade transmitters.
This work contributes a practical and low-cost solution for integrating analog measurement devices into intelligent supervisory systems, aligning with the principles of Industry 4.0. By exploiting geometric symmetry and machine learning, the proposed system bridges traditional instrumentation and digital monitoring, enabling adaptive, non-invasive, and scalable process supervision. The remainder of this paper is organized as follows: Section 2 presents the Proposed Method, including the problem formulation and system architecture. Section 3 details the Image Processing techniques used for feature extraction and symmetry-based enhancement. Section 4 describes the design, structure, and training of the Artificial Neural Networks applied to the recognition and tracking tasks. Section 5 discusses the Results and Validation of the system under experimental conditions. Finally, Section 6 provides the Conclusions and outlines future research directions.

Research Gap and Motivation

Despite the growing use of computer vision in industrial monitoring, several research gaps persist in the digitization of analog instruments:
  • High computational demand: Most existing CNN-based methods require GPU acceleration and are unsuitable for low-cost embedded platforms, limiting real-time deployment in small-scale plants.
  • Limited exploitation of geometric symmetry: Few approaches explicitly leverage the inherent circular or axial symmetry of analog scales to simplify feature extraction and improve robustness under illumination changes.
  • Dependence on costly sensors and digital transmitters: Prior works often assume the replacement of analog gauges with digital ones, which is economically infeasible for SMEs.
  • Restricted adaptability and scalability: Many existing solutions are designed for a single instrument type and lack modularity for multi-variable monitoring.
  • Insufficient integration with edge and IoT architectures: Most vision-based systems rely on PC-level processing and do not address real-time communication or supervisory integration.
The present study addresses these gaps by developing a symmetry-aware vision system that (i) employs lightweight ANN models optimized for embedded processors, (ii) exploits geometric symmetry to achieve robust and low-latency performance, and (iii) enables cost-effective digitization of analog instruments within Industry 4.0 supervision frameworks. This approach provides a practical pathway for retrofitting legacy instrumentation while maintaining real-time accuracy and scalability.

2. Proposed Methods

The cost of industrial equipment used for process sensing is often high, especially for small and medium-sized production plants. However, continuous monitoring of process variables is essential to maintain control, ensure traceability, and support corrective actions that improve efficiency and product quality. This concept aligns with the principles of digital twins, where a virtual representation of the physical system enables real-time observation and analysis of process behavior to optimize operations. Analog gauges remain widely used across industries for quick, visual inspection by operators. These devices are robust and reliable but still depend on manual readings and paper-based records, which are inefficient and prone to human error.
In addition, many analog instruments are installed in hazardous or hard-to-reach locations, making frequent supervision difficult and potentially unsafe. To address these limitations, this work proposes a cost-effective and symmetry-aware monitoring system based on machine vision and ANNs. The system interprets the geometric symmetry of analog instruments such as manometers and rotameters to identify scales and detect pointer positions with precision. This symmetry-based interpretation ensures stable detection of measurement patterns, even under variable lighting or camera angles, reinforcing the robustness of the algorithm. The proposed solution is non-invasive and leverages existing resources within the plant. It acquires process data in real time through a wireless communication network, achieving acquisition rates in the order of milliseconds. This enables integration not only for data supervision but also within automatic control loops, as illustrated in Figure 1.
The image-processing algorithms work jointly with two lightweight ANNs, each composed of an input layer, a hidden layer, and an output layer. These networks perform pattern recognition to locate the instrument’s region of interest and track its indicator movement. From the pointer’s position, the system computes the corresponding process variable value. All modules operate on a Raspberry Pi embedded platform, providing an efficient, low-cost solution for intelligent industrial supervision. To provide a clearer representation of these internal processing stages, including the preprocessing, symmetry enhancement, ROI extraction, and ANN inference steps, the complete methodology pipeline is presented in Figure 2.

2.1. Symmetry Formalization in Analog Instrument Geometry

Analog industrial instruments, such as pressure gauges and rotameters, present geometric configurations that are inherently symmetric and therefore suitable for mathematical modeling through radial or axial functions. This property allows consistent interpretation of their visual patterns under variable lighting and viewing angles.
For circular instruments, such as pressure gauges, the symmetry can be described as rotational invariance of the intensity distribution f(r, θ) around the center of the dial. The function approximately satisfies the condition

f(r, θ) ≈ f(r, θ + π)

which implies that diametrically opposite regions exhibit equivalent geometric and brightness patterns. Any deviation from this approximate equality corresponds to a symmetry-breaking effect, associated with the displacement of the indicator needle by an angular offset Δθ. This deviation becomes the key observable for estimating the process variable (e.g., pressure).
In the case of rotameters, the geometric symmetry is primarily axial. The pixel intensity distribution f(y) along the vertical direction remains approximately constant across symmetric sections of the cylindrical body. Variations Δf(y) arise when the float moves along the scale, indicating changes in the flow variable.
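These two symmetry conditions can be checked numerically. The sketch below (Python with NumPy; using a 180° array rotation as a proxy test for f(r, θ) ≈ f(r, θ + π), and row means for f(y), are illustrative choices rather than the paper's implementation):

```python
import numpy as np

def radial_symmetry_error(gray, center):
    """Mean absolute difference between a square dial window and its
    180-degree rotation about the center -- a proxy for how far
    f(r, theta) deviates from f(r, theta + pi)."""
    cx, cy = center
    h, w = gray.shape
    # Largest square window centered on the dial that the rotation
    # maps onto itself.
    r = min(cx, cy, w - 1 - cx, h - 1 - cy)
    win = gray[cy - r:cy + r + 1, cx - r:cx + r + 1].astype(float)
    rotated = np.rot90(win, 2)  # 180-degree rotation
    return float(np.mean(np.abs(win - rotated)))

def axial_symmetry_profile(gray):
    """Row-wise mean intensity f(y) along the rotameter body; a jump
    in the successive differences marks the float position."""
    profile = gray.astype(float).mean(axis=1)
    return np.abs(np.diff(profile))
```

A large radial error, or a spike in the axial difference profile, localizes the symmetry-breaking element (needle or float).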
The proposed vision system exploits these radial and axial symmetries to locate the instrument’s scale region and detect pointer or float displacements. The ANNs are trained to identify such symmetry-breaking cues, enabling reliable extraction of physical measurements from simple geometric deviations rather than from complex pixel-based features.
The feature selection strategy was defined to retain only the visual elements that directly encode the geometry of the instrument scale and pointer. By emphasizing symmetry-enhanced edges and filtering out background textures and illumination artifacts, the preprocessing stage provides the ANN with a compact set of structural features that represent the measurement pattern. The use of ROIs further eliminates irrelevant pixel regions, allowing the networks to focus exclusively on geometric deviations associated with needle or float displacement. This selective approach significantly improves robustness and reduces computational demand on the embedded platform.

2.2. Dataset and Preprocessing

The dataset used for training and validation was obtained from real experimental conditions using a circular pressure gauge and a vertical rotameter. A total of 1800 images were collected using a Logitech C920 HD Pro Webcam (Logitech International S.A., Lausanne, Switzerland) configured at 320 × 240 pixels and 30 frames per second. The database includes images captured under controlled LED illumination as well as naturally varying ambient lighting to increase robustness. Approximately 1000 images correspond to the pressure gauge and 800 to the rotameter, with 80% used for training and 20% for testing.
To ensure consistent processing and reduce computational load, all images underwent the same preprocessing pipeline. First, the frames were converted to grayscale to eliminate redundant chromatic information. A region of interest was then extracted for each instrument (200 × 200 pixels for the pressure gauge and 200 × 70 pixels for the rotameter) to isolate the measurement area. Bright reflections on the instrument cover were attenuated using a threshold-based correction applied to high-intensity pixels. A Gaussian filter with a 7 × 7 kernel and a standard deviation of 1.15 was used to reduce high-frequency noise while preserving the structural contours of the scale and pointer. The Canny edge detector was subsequently applied, with thresholds of 20 and 40 for the pressure gauge, and 40 and 80 for the rotameter. Finally, all pixel values were normalized to the range [0, 1]. To enhance generalization capability, synthetic perturbations including illumination shifts, reflections, and bubble-like artifacts were introduced in approximately 30% of the training images, emulating difficulties commonly found in industrial environments.

2.3. ANN Hyperparameters

The system uses two ANNs per instrument: one for scale localization and one for indicator tracking. The architecture and hyperparameters are summarized in Table 1.
The selected configuration achieves a balance between accuracy, convergence, and real-time processing on embedded hardware. Increasing the number of neurons was tested but discarded due to the resulting latency increase.

2.4. Rationale for Selecting Artificial Neural Networks

Although CNNs and ViTs offer exceptional performance for complex vision tasks, their computational requirements are substantially higher than those of fully connected ANNs. On embedded platforms without GPU acceleration, CNN inference may introduce delays unsuitable for real-time applications. In contrast, lightweight ANNs rely solely on matrix multiplications, enabling efficient execution on ARM-based processors such as the Raspberry Pi.
Additionally, the geometry of analog instruments exhibits simple and predictable symmetry patterns. These patterns can be effectively captured by compact ANN architectures without requiring convolutional filters or attention mechanisms. The ANN-based approach therefore ensures low processing latency, reduced energy consumption, and sufficient discriminative capability to interpret pointer displacement under varying lighting conditions. For these reasons, ANNs constitute an appropriate, practical, and cost-efficient solution for real-time analog instrument digitization in industrial environments.

3. Image Processing

Image processing is a key component of the proposed system. Its main objective is to extract the essential visual information required by the neural network while minimizing computational effort and execution time. The algorithm focuses on identifying the geometric features of analog instruments by isolating the scale and the pointer that represent the measured variable. To achieve this, the images captured by the camera are first converted to grayscale and then processed to generate binary (black-and-white) representations. This transformation enhances the contrast between the instrument’s background, the scale markings, and the indicator, allowing precise localization of relevant elements.
Analog instruments generally exhibit strong geometric symmetry, either circular or linear, depending on their design. This characteristic is exploited by the proposed method to simplify feature extraction and improve the robustness of detection under varying illumination or viewing angles. For this study, the instruments are grouped into two categories: circular gauges (e.g., pressure indicators) and cylindrical gauges (e.g., rotameters). This classification enables the algorithm to adapt to a wide range of analog devices whose readings depend on the motion of a symmetric pointer along a defined scale, whether linear or logarithmic.

3.1. Pressure Gauge

Circular instruments, such as pressure gauges, are frequently installed outdoors, where lighting conditions vary significantly. These variations generate shadows or low-contrast regions that hinder image acquisition. In addition, the transparent glass cover of the gauge often produces random reflections, further complicating the detection of the scale and the pointer. To address these challenges, a front-facing LED lighting system was implemented. The system ensures uniform illumination with minimal power consumption and low thermal impact. Based on prior studies [20,21,22], red light was selected because it preserves the visual features of the scale without distorting its geometry. This choice eliminates the need for complex shadow-removal algorithms that would otherwise increase computational cost. Despite this improvement, bright reflections caused by the glass surface may still appear. To mitigate this effect, the RGB image is converted to grayscale, where reflected pixels show higher intensity values.
A threshold-based attenuation parameter, μ, is defined as:

μ = a + t

where

a = (1/n) Σᵢ vᵢ

Here, vᵢ represents the intensity of pixel i (ranging from 0 to 255) within a 200 × 200-pixel region of interest (ROI) containing the circular instrument scale, n is the total number of pixels, and t is an experimentally defined threshold (1–10). Pixels with intensity values equal to or higher than μ are replaced by the average value a, reducing excessive brightness in the image. A Gaussian filter is then applied to smooth the image and enhance edge continuity. It is defined as:
G(x, y) = (1 / (2πσ²)) exp(−(x² + y²) / (2σ²))
where x and y denote the kernel coordinates centered at the origin, and σ = 1.2 . Finally, the Canny edge detection algorithm [22] is applied with thresholds of 20 (low) and 40 (high) to obtain a binary image that clearly outlines the gauge’s symmetric structure, including the circular scale and the pointer position. As shown in Figure 3, the applied processing steps enhance the clarity and symmetry of the gauge image, which helps the neural network recognize and analyze geometric details with high accuracy and reduced processing time.
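The reflection-attenuation step described above reduces to a few lines of NumPy; this is a minimal sketch of the μ = a + t rule, not the authors' exact routine:

```python
import numpy as np

def attenuate_reflections(roi, t=5):
    """Replace overly bright pixels (glass reflections) with the ROI
    mean, following mu = a + t with a = (1/n) * sum(v_i).  The
    threshold t is taken from the 1-10 range given in the text."""
    v = roi.astype(float)
    a = v.mean()     # average intensity a over the ROI
    mu = a + t       # attenuation threshold mu
    v[v >= mu] = a   # clip reflective pixels to the mean value
    return v
```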

3.2. Rotameter

Unlike the pressure gauge, the rotameter features a transparent cylindrical body that contains the scale and the floating indicator. This geometry makes it more exposed to light interference from multiple directions. However, because rotameters are usually employed in chemical and laboratory processes where ambient lighting is stable and controlled, an additional illumination system was not required.
A common challenge in rotameter imaging is the formation of air bubbles during liquid circulation. The appearance of these bubbles depends on the fluid’s properties and flow rate, introducing random distortions in the captured images. Since these irregularities occur sporadically and the flow remains generally steady, applying a continuous bubble-removal algorithm would be inefficient and computationally expensive. Instead, the proposed system addresses this problem through ANN training that incorporates synthetic noise. This approach allows the model to learn and distinguish unexpected asymmetric shapes such as bubbles from the symmetric geometry of the instrument. Figure 4 shows the rotameter operating under different flow conditions.
The image processing steps for the rotameter (Figure 5) follow the same structure used for the pressure gauge. A 200 × 70 pixel region of interest (ROI) is extracted from the RGB image and converted to grayscale. A Gaussian filter with a 7 × 7 kernel and a standard deviation of 1.15 is applied to smooth the contours and reduce visual noise. The Canny edge detection algorithm [23] is then used with low and high thresholds of 40 and 80, respectively, to generate a binary image that clearly outlines the cylindrical scale and the floating indicator.
These operations enhance the axial symmetry of the rotameter’s structure and highlight its geometric features, ensuring that the neural network can accurately identify the float’s position despite transient visual disturbances.

4. Artificial Neural Networks

This section presents the methodology used for the design and parameterization of the ANNs that form the core of the proposed vision-based measurement system. The main objective is to develop an algorithm optimized for low-cost embedded platforms, ensuring scalability and generalization capacity for future industrial applications. To achieve a balance between precision and computational efficiency, a lightweight ANN architecture was implemented, allowing real-time performance without high-end processing hardware.

4.1. Structure

The proposed approach employs two independent ANNs for each instrument, designed to work cooperatively within the vision-based measurement system. The first ANN is dedicated to scale region recognition and positioning, while the second focuses on tracking the motion of the indicator needle or float. This dual configuration enables the system to separate static geometric localization from dynamic motion analysis, improving overall robustness and real-time adaptability.
Each network is composed of three layers (input, hidden, and output) whose parameters were empirically tuned to achieve an optimal balance between accuracy, training time, and computational cost on the embedded hardware platform. Experimental results showed that increasing the number of neurons slightly improved recognition accuracy but also increased processing time, potentially compromising real-time response.
The first ANN, designed to identify and position the analog instrument’s scale region (Figure 5), uses an input layer of 20,000 neurons for the pressure gauge, corresponding to a Region of Interest (ROI) of 100 × 200 pixels, as shown in Figure 3, and 14,000 neurons for the rotameter, corresponding to a 100 × 140-pixel ROI, as shown in Figure 4. The hidden layer is composed of 30 neurons that process spatial and geometric features to identify characteristic patterns such as edges, curvature changes, and scale markings. The output layer has seven neurons, each associated with a specific action or positional adjustment. These outputs allow the network to perform fine horizontal and vertical corrections of the ROI or to trigger an error condition when non-instrument areas are detected.
The second ANN, responsible for dynamic tracking of the indicator (for the pressure gauge) and the float (for the rotameter), is also shown in Figure 6. To ensure fast inference suitable for real-time operation, the input layer is smaller: 1000 neurons for the pressure gauge (corresponding to a 50 × 20-pixel ROI) and 600 neurons for the rotameter (corresponding to a 30 × 20-pixel ROI). The hidden layer contains 20 neurons that analyze temporal variations in pixel intensity, enabling the estimation of displacement direction and velocity. The output layer consists of six neurons, which represent discrete position states (centered, left, right, up, down, and stationary), allowing the system to interpret the indicator’s movement as a function of image features.
This architecture allows the hidden layers to operate as compact feature extractors that map raw pixel information directly to positional or motion states without using convolutional operations. By omitting convolutional layers, the proposed model minimizes computational complexity while maintaining the discriminative capacity required for accurate tracking. This design decision is crucial for ensuring low-latency inference on embedded processors.
The selected ROIs were optimized to capture essential geometric and visual information from both the instrument scale and the indicator, while minimizing redundant data outside the relevant area. To prevent neuron saturation and improve learning stability, all pixel intensity values were normalized before training: black pixels were assigned a value of 0.1 and white pixels a value of 0.9, preserving the natural contrast dynamics of the input images. The target activation values were defined in the same normalized range, where logical “0” corresponds to 0.1 and logical “1” to 0.9, and each neuron was trained independently.
Neuron activation follows a sigmoidal transfer function expressed as:
S_k(u) = 1 / (1 + e^(−G·u))

where S_k represents the output of the neuron k, G is a positive gain constant, and u is the effective input value computed as

u = Σⱼ w_kj · z_j

where w_kj is the synaptic weight connecting neuron k with the input j, and z_j denotes the input signal.
This formulation ensures smooth and bounded activations, which are suitable for online training and real-time inference.
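As a concrete illustration, a forward pass through such a fully connected network reduces to two matrix products followed by the sigmoid. The weight shapes below match the tracking network of Section 4.1 (1000 → 20 → 6); the gain value and random initialization are illustrative assumptions:

```python
import numpy as np

def sigmoid(u, gain=1.0):
    """Sigmoidal transfer function S_k(u) = 1 / (1 + exp(-G u))
    with positive gain constant G."""
    return 1.0 / (1.0 + np.exp(-gain * u))

def forward(pixels, w_hidden, w_out, gain=1.0):
    """Single hidden-layer forward pass: each neuron computes
    u = sum_j w_kj * z_j, then applies the sigmoid.  With
    w_hidden of shape (20, 1000) and w_out of shape (6, 20) this
    matches the pressure-gauge tracking network (1000 -> 20 -> 6)."""
    z = sigmoid(w_hidden @ pixels, gain)  # hidden layer activations
    return sigmoid(w_out @ z, gain)       # output layer activations
```

Because the computation is only two small matrix products, it runs comfortably within the millisecond budget of an ARM-based board, which is the point the section argues for.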

4.2. Training and Operation

The training algorithms, indicator tracking, and scale positioning routines use a rectangular ROI that moves dynamically across the image to provide input data to the neural network. For the scale recognition and positioning stage, the ROI moves horizontally and vertically within predefined image boundaries until the optimal alignment is achieved. For the indicator tracking stage, the ROI movement depends on the geometry of the instrument being analyzed.
In the case of the pressure gauge, the ROI undergoes a two-dimensional rotational transformation to follow the motion of the indicator needle. This transformation is expressed as
[x_nm(k), y_nm(k)]ᵀ = [cos φ, −sin φ; sin φ, cos φ] · [x_m(k), y_m(k)]ᵀ + [c_x, c_y]ᵀ

This maps the pixels within the ROI to new coordinates k, represented by (x_nm, y_nm), based on the angle φ and the previous coordinates (x_m, y_m), centered at the image origin (c_x, c_y). For the rotameter, a vertical motion is used, defined by the following expression:
y_nr(k) = y_r(k) + Δy

where y_nr is the new y-coordinate of the pixel k contained in the ROI, y_r is the previous y-coordinate, and Δy is an adjustable scalar for upward or downward movement. This combination of rotational and translational transformations allows the ANN to dynamically adapt to the specific motion characteristics of each instrument type, thereby improving robustness and tracking accuracy in real-time operation.
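Both ROI transformations can be sketched directly from the expressions above (coordinates as N × 2 NumPy arrays, angle in radians; the array layout is an illustrative choice):

```python
import numpy as np

def rotate_roi(points, phi, center):
    """2-D rotation of ROI pixel coordinates by angle phi about the
    center (c_x, c_y), as used for pressure-gauge needle tracking."""
    c, s = np.cos(phi), np.sin(phi)
    R = np.array([[c, -s],
                  [s,  c]])
    centered = points - np.asarray(center, dtype=float)
    return centered @ R.T + np.asarray(center, dtype=float)

def translate_roi_y(points, dy):
    """Vertical ROI shift y_nr = y_r + delta_y used for the rotameter."""
    shifted = np.asarray(points, dtype=float).copy()
    shifted[:, 1] += dy
    return shifted
```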

4.3. Output Layer Function and Training Process

The output layer neurons of each ANN control both the training and operational stages of the system. In the case of the ANN dedicated to scale recognition, the seven output neurons are grouped into three functional sets. The first three neurons (from neuron 0 to neuron 2) govern horizontal displacement of the ROI. Neuron 1 is activated when the ROI is centered, whereas neurons 0 and 2 indicate left and right deviations, respectively. The next three neurons (from neuron 3 to neuron 5) manage vertical displacement, where neuron 4 represents the centered position, and neurons 3 and 5 correspond to upward and downward movements. Finally, neuron 6 activates when the ROI contains non-instrument elements, triggering a quick-search subroutine that reinitializes the ROI within the image to locate the instrument region.
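Assuming a simple winner-take-all reading of each neuron group, the ROI correction implied by the seven outputs could be decoded as follows (the function name, the 0.5 threshold, and the string labels are illustrative, not from the paper):

```python
import numpy as np

def decode_scale_outputs(outputs, threshold=0.5):
    """Interpret the seven output neurons of the scale-positioning
    ANN: neurons 0-2 encode left / centered / right, neurons 3-5
    encode up / centered / down, and neuron 6 flags a
    non-instrument region that triggers the quick-search routine."""
    if outputs[6] >= threshold:
        return "search"  # ROI reinitialization subroutine
    h = ["left", "h-centered", "right"][int(np.argmax(outputs[0:3]))]
    v = ["up", "v-centered", "down"][int(np.argmax(outputs[3:6]))]
    return h, v
```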
For the ANN used in dynamic tracking (Figure 6), the output layer operates under a single-activation condition, meaning that only one neuron is active at any given moment. This structure allows the system to detect whether the indicator is centered within the ROI (neuron 2) or deviates to the left or right for the pressure gauge, and upward or downward for the rotameter. Neurons 1 and 3 identify small deviations of approximately one to five degrees or pixels, depending on the instrument. These neurons improve the precision of position estimation. Neurons 0 and 4, in contrast, correspond to larger deviations ranging from six to twenty degrees for the pressure gauge and six to ten pixels for the rotameter, and are used to increase the responsiveness of dynamic tracking.
A supervised real-time training procedure is implemented while the instruments operate at fixed reference positions. This process consists of defining the appropriate target vector, represented as {output neuron 0, output neuron 1, output neuron 2, …}, so that each neuron learns to activate according to the displacement of the ROI. For the ANN used in scale region recognition, a series of random ROI displacements are generated within a finite pixel range. When the ROI is centered over the instrument, the target vector is {0, 1, 0, 0, 1, 0, 0}. A deviation to the right corresponds to {0, 1, 0, 0, 0, 0, 1}, and a vertical upward shift corresponds to {1, 0, 0, 0, 1, 0, 0}. Additional combinations are established for other displacement directions such as upward-right, left, or lower-left. An exceptional case occurs when the target vector is set to {0, 0, 0, 0, 0, 0, 1} to train the network to identify the absence of the instrument region, simulating internal noise sources.
The backpropagation algorithm is executed over 200 cycles of random movements. Every five cycles, the central region position is reinforced, while fifty cycles include noise disturbances. One complete sequence of 200 cycles is defined as a single training iteration.
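This schedule can be sketched as below; the random placement of the 50 noise-bearing cycles is an assumption, since the paper does not specify how they are distributed within the 200 cycles:

```python
import random

def build_training_schedule(cycles=200, center_every=5,
                            noise_cycles=50, seed=0):
    """One training iteration: 200 random ROI displacements, with the
    centered position reinforced every fifth cycle and 50 cycles
    carrying noise disturbances.  Returns (kind, has_noise) tuples."""
    rng = random.Random(seed)
    noise_set = set(rng.sample(range(cycles), noise_cycles))
    schedule = []
    for i in range(cycles):
        kind = "center" if i % center_every == 0 else "random"
        schedule.append((kind, i in noise_set))
    return schedule
```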
During the training of the ANN responsible for indicator tracking, the ROI is moved to simulate the actual motion of the indicator as the process variable changes. This enables the network to discriminate rapid variations in indicator position, which occur in the order of milliseconds. The average execution time of a backpropagation cycle is approximately 10 milliseconds. The target vector is set according to the apparent motion of the indicator. For the pressure gauge, a rotation of the ROI from one to five degrees to the right produces the target vector {0, 1, 0, 0, 0, 0}, while a rotation to the opposite direction results in {0, 0, 0, 1, 0, 0}. Similarly, for the rotameter, an upward movement of one to five pixels corresponds to {0, 0, 0, 1, 0, 0}, whereas a downward displacement produces {0, 1, 0, 0, 0, 0}. Neurons 0 and 4 follow the same logic for larger movements between six and twenty degrees in the pressure gauge or six to ten pixels in the rotameter. When the indicator is centered within the ROI, the target vector is {0, 0, 1, 0, 0, 0}, and when the ROI moves to a non-relevant region, the target becomes {0, 0, 0, 0, 0, 1}.
For the pressure gauge ANN, one training iteration consists of a set of twenty random image displacements. During this iteration, the indicator’s central position is guaranteed to appear at least three times, while one instance includes a distant, non-instrument region to improve generalization.
In the case of the rotameter float tracking ANN, the training incorporates controlled noise conditions to enhance robustness. This approach allows the system to maintain stability in the presence of irregular image artifacts, such as air bubbles generated by variations in fluid flow. One training iteration includes ten motion cycles similar to those used for the pressure gauge, followed by ten additional cycles where noise invades the ROI by approximately three pixels above and below the float’s edge, as shown in Figure 7.
The magnitude of the measurement variables is determined by scaling the ROI displacement values relative to the instrument’s range. For the pressure gauge, the displacement is expressed in angular units, while for the rotameter, it is represented in pixels. This normalization allows the ANN outputs to be mapped directly to physical quantities.
For the pressure gauge used in this study, which features a scale ranging from 0 to 200 psi (pounds per square inch), the relationship between the measured pressure and the corresponding indicator angle ϕ is described by,
$$P_{\mathrm{PSI}} = \begin{cases} 0.74\,\phi + 33.33, & 0^{\circ} \le \phi \le 225^{\circ} \\ 0.74\,\phi - 233.33, & 315^{\circ} \le \phi \le 360^{\circ} \end{cases}$$
where ϕ represents the angular position of the indicator, measured with respect to the reference zero. For angles between 225 ° and 315 ° , the indicator lies within the upper quadrants of the circular scale. The lower quadrants are mechanically inaccessible due to the gauge’s internal design, which constrains the needle’s motion. Therefore, calibration must be performed carefully to ensure that the angle-to-pressure mapping remains accurate and to prevent possible misinterpretations caused by geometric overlaps or mechanical backlash.
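A hedged implementation of this piecewise calibration might look like the following (the function name and the decision to reject angles in the uncovered 225°–315° sector are our assumptions):

```python
def pressure_from_angle(phi: float) -> float:
    """Piecewise angle-to-pressure map for the 0-200 PSI gauge.
    The sector between 225 deg and 315 deg is not covered by the mapping
    and is treated as invalid here."""
    if 0.0 <= phi <= 225.0:
        return 0.74 * phi + 33.33
    if 315.0 <= phi <= 360.0:
        return 0.74 * phi - 233.33
    raise ValueError("angle lies outside the calibrated sectors")
```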
In the case of the rotameter, the graduated scale exhibits logarithmic behavior, covering a flow range from 1 to 10 gallons per minute (GPM). To establish a correspondence between the float’s vertical position $y_r$ and the actual flow rate, a regression model is developed. Since the relationship between pixel displacement and flow rate is nonlinear, a quadratic equation is obtained as
$$P_{\mathrm{GPM}} = 0.264 - 0.00049\,y_r^{2} + 0.14\,y_r + 1.97$$
The dataset used for training and validating the ANN models was composed of both synthetic and real-world images. For the pressure gauge, a total of 1000 images were acquired under controlled lighting conditions, of which 800 were used for training and 200 for testing. For the rotameter, 800 images were collected, divided into 600 for training and 200 for testing. To enhance robustness, synthetic noise was introduced in approximately 30% of the training images, simulating air bubbles, reflections, and minor lighting variations commonly encountered in industrial environments. Each dataset also included multiple indicator positions, ensuring full coverage of the instrument’s operational range and improving the network’s generalization capability during real-time inference.

4.4. Dataset Construction and Preprocessing

The dataset used for training and validating the ANNs was constructed entirely from real experimental data obtained in laboratory conditions that replicate typical industrial monitoring environments. Two types of analog instruments were used: a circular pressure gauge with a 0–200 PSI range and a vertical rotameter covering a flow range from 1 to 10 GPM. Image acquisition was performed using a Logitech C920 HD camera mounted on a fixed tripod at a distance of 30 cm from the instrument face, ensuring constant focus and minimal perspective distortion. The camera operated at 30 frames per second with a resolution of 320 × 240 pixels. To evaluate robustness, data were collected under different ambient lighting conditions, including controlled illumination with LED light and natural lighting with variable intensity.
A total of 1800 images were recorded: 1000 corresponding to the pressure gauge and 800 to the rotameter. For the pressure gauge, 80% of the samples were used for training and 20% for testing; for the rotameter, 600 images (75%) were used for training and 200 (25%) for testing. During acquisition, the indicator needle or float was manually positioned at reference values across the entire measurement scale to guarantee uniform coverage of the range. This ensured that the network learned the correspondence between the visual geometry and the actual physical readings. To strengthen the model’s generalization capability, synthetic noise and perturbations were added to approximately 30% of the training images. These included mild brightness variations, specular reflections from the instrument’s glass cover, and background shadows, simulating realistic industrial conditions.
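A plausible numpy-only sketch of this noise-injection step applied to roughly 30% of the training images (the gain range and noise amplitude are our assumptions, not the paper's parameters):

```python
import numpy as np

def augment(images: np.ndarray, fraction: float = 0.3, seed: int = 0) -> np.ndarray:
    """Perturb ~`fraction` of the images with brightness shifts and additive
    Gaussian noise, approximating reflections and shadow artifacts."""
    rng = np.random.default_rng(seed)
    out = images.astype(np.float32).copy()
    n = len(out)
    idx = rng.choice(n, size=int(round(fraction * n)), replace=False)
    for i in idx:
        gain = rng.uniform(0.8, 1.2)                # mild brightness variation
        noise = rng.normal(0.0, 5.0, out[i].shape)  # reflection/shadow artifacts
        out[i] = np.clip(out[i] * gain + noise, 0, 255)
    return out
```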
Before being fed to the ANN, all images were converted to grayscale, cropped to a defined ROI, resized to 256 × 256 pixels, and normalized to the [0, 1] intensity range. A light Gaussian filter was applied to reduce high-frequency noise while preserving edges. The preprocessing stage ensured that the dataset remained consistent and computationally efficient for real-time processing on the embedded hardware platform. This combination of diverse lighting, background, and reflection conditions allowed the training dataset to capture the variability expected in real operational environments, supporting the system’s accuracy and robustness during field tests.
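The preprocessing chain (grayscale conversion, ROI crop, resize to 256 × 256, normalization to [0, 1], light Gaussian smoothing) could be sketched as follows. In practice OpenCV's `cv2.resize` and `cv2.GaussianBlur` would likely be used; a numpy-only version keeps the sketch self-contained, with nearest-neighbour resizing and a small separable kernel standing in:

```python
import numpy as np

def preprocess(rgb: np.ndarray, roi: tuple, size: int = 256) -> np.ndarray:
    """Grayscale -> ROI crop -> nearest-neighbour resize -> [0, 1]
    normalization -> separable Gaussian smoothing."""
    y0, y1, x0, x1 = roi
    gray = rgb @ np.array([0.299, 0.587, 0.114])   # luminance weights
    crop = gray[y0:y1, x0:x1]
    h, w = crop.shape
    rows = np.arange(size) * h // size             # nearest-neighbour indices
    cols = np.arange(size) * w // size
    img = crop[rows][:, cols] / 255.0              # resize + normalize
    k = np.array([0.25, 0.5, 0.25])                # small Gaussian kernel
    img = np.apply_along_axis(lambda r: np.convolve(r, k, "same"), 1, img)
    img = np.apply_along_axis(lambda c: np.convolve(c, k, "same"), 0, img)
    return img
```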

5. Results and Discussion

This section presents the experimental validation of the proposed system. The tests were conducted on three industrial processes: one for flow measurement and two for pressure monitoring. Each experiment was designed to evaluate the system’s ability to identify the geometric symmetry of analog instruments and to track pointer movements accurately under real operating conditions. The operation algorithm, responsible for scale positioning and dynamic tracking of the indicator, was implemented on an embedded Raspberry Pi 3 platform. The device, equipped with a 1.2 GHz Quad-Core ARM Cortex-A53 processor, provides enough processing capacity for real-time image acquisition and analysis while maintaining low energy consumption. This configuration supports continuous operation and ensures stable data flow during long monitoring periods.
A wireless acquisition network was established using the UDP communication protocol to minimize latency and maintain synchronization between measurement nodes. The processed data are displayed and stored on a central server that manages the supervisory interface. A Human–Machine Interface developed in LabVIEW provides real-time visualization of the monitored variables. In addition, a web-based dashboard created with XAMPP integrates MySQL and Apache for database management and local web services, allowing remote supervision without the need for an internet connection. The experimental setup, shown in Figure 8, illustrates the integration of all system components. The configuration achieves a balanced and symmetrical data flow between the acquisition units, the embedded platform, and the supervisory system. This architecture enables simultaneous monitoring of multiple analog instruments while preserving temporal synchronization and measurement consistency throughout the process.
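A minimal sketch of one acquisition node publishing a processed reading over UDP, as in the low-latency network described above; the JSON payload layout and field names are our assumptions, not the system's actual wire format:

```python
import json
import socket

def send_measurement(sock, addr, instrument, value, unit):
    """Publish one processed reading to the supervisory server over UDP."""
    payload = json.dumps({"instrument": instrument,
                          "value": round(value, 3),
                          "unit": unit}).encode()
    sock.sendto(payload, addr)

# Loopback demonstration: a local socket stands in for the HMI server host.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))           # ephemeral port
server.settimeout(2.0)
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_measurement(client, server.getsockname(), "pressure_gauge", 87.456, "PSI")
data, _ = server.recvfrom(1024)
reading = json.loads(data)
client.close()
server.close()
```

UDP is connectionless, so each reading is an independent datagram; occasional loss is acceptable for supervisory display, which is why it is preferred here over TCP for latency.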

5.1. ANN Convergence

The same computer used for system monitoring was also employed for training the ANNs. It was equipped with an Intel Core i7 processor operating at 2.30 GHz and 8 GB of RAM. Image acquisition during both the training and operation stages of the machine vision system was performed using a Logitech C920 camera at 30 frames per second. The camera resolution was set to 320 × 240 pixels, a value determined experimentally to ensure good image quality while maintaining low computational cost.
Figure 9 presents the convergence behavior of the four ANNs developed in this study. For each network, the mean absolute error of the neuron activation values was calculated at every training iteration. The error was defined as the difference between the target and output activations, both normalized within the range of 0.1 to 0.9.
Table 2 summarizes the main performance parameters obtained during the training of the four implemented Artificial Neural Networks. Each network was optimized to balance accuracy, convergence speed, and computational cost on the embedded platform.
The results show that the symmetry-based design allows all networks to converge within 200–400 iterations, with mean absolute errors (MAE) below 0.03 and average processing times under 4 ms per cycle. This confirms the feasibility of real-time implementation on a low-power Raspberry Pi 3 platform without compromising accuracy or stability. The convergence times also demonstrate the effectiveness of the lightweight architecture in capturing the geometric patterns of circular and linear analog instruments.
The obtained results confirm the stability and convergence speed of the proposed networks. Each ANN demonstrated consistent learning behavior and the ability to distinguish between different geometric patterns captured from the instruments. The networks designed for scale positioning in both instruments achieved convergence after approximately 200 iterations, with total training times of around 30 min. The rotameter float tracking network required 200 iterations and converged within 40 min, while the pressure gauge indicator tracking network reached convergence after 400 iterations in about 38 min.
These results demonstrate that the selected number of neurons and learning parameters were appropriate for achieving convergence in a relatively short time. The symmetry of the training data, derived from repeated geometric patterns in the instruments, contributed to stable learning and reduced oscillations in the error curve. This behavior confirms the effectiveness of the proposed training approach, ensuring that the neural networks can generalize well to new images while maintaining computational efficiency.
To further complement the convergence analysis, an ablation-style validation was performed to determine the individual contribution of the main components of the system. When the symmetry-based preprocessing stage was removed, the ANNs exhibited unstable activation patterns and the MAE increased by approximately 27–35%, highlighting the importance of geometric consistency in detection. Likewise, when the ROI reduction step was removed and full images were processed, the inference time on the Raspberry Pi 3 increased from an average of 3.74 ms to more than 12 ms per cycle, preventing real-time operation. Finally, replacing the lightweight ANN architecture with a shallow CNN produced only minimal improvements in accuracy (0.4–0.6%) while increasing latency beyond 20 ms. These results confirm that symmetry enhancement, ROI-based feature extraction, and compact neural architectures are the key elements that enable the system to maintain stability, accuracy, and real-time performance on low-power embedded hardware.

5.2. System Operation

To evaluate the effectiveness of the proposed system, experimental tests were carried out to compare its performance against industrial-grade reference equipment. Figure 10 and Figure 11 present the response curves obtained from the Monitoring System by Computer Vision (MSCV) alongside those from the industrial reference transmitters used for pressure and flow measurements, respectively. The reference equipment included a Foxboro IAP20 absolute pressure transmitter, with an accuracy of ±0.05% of its full-scale range, and a Georg Fischer 8550 flow transmitter, which provides a precision of ±0.7% and features a paddlewheel primary element with an internal update rate of 100 ms. Both instruments output standard electrical signals ranging from 4 to 20 mA, converted to 1–5 V analog signals and acquired using a National Instruments NI DAQ 6008 data acquisition board (12-bit resolution, 10 kS/s sampling rate). To test the dynamic behavior of the proposed system, the process variables, pressure and flow, were subjected to random and abrupt variations. These tests aimed to verify the response speed and stability of the machine vision algorithm under real operating conditions.
For the flow variable, it is important to consider the operating principle of the Georg Fischer 8550 transmitter. The device computes flow by counting pulses generated by the rotation of its primary element within a defined time window, introducing a short delay (100 ms) in the signal update. To emphasize this effect, step input signals were applied to the motor frequency drive controlling the pump, which directly influences the process variable.
The MSCV demonstrated high consistency with the industrial transmitters in both variables. For the pressure gauge, the average absolute error was 0.589 PSI (SD = 0.402 PSI). For the rotameter, the average absolute error was 0.085 GPM (SD = 0.059 GPM). At a 95% confidence level, the error intervals were [0.510, 0.668] PSI and [0.073, 0.097] GPM, respectively. These values correspond to percentage errors between 1% and 5%, confirming the system’s suitability for industrial monitoring.
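These intervals are consistent with the normal approximation $\bar{x} \pm 1.96\,\mathrm{SD}/\sqrt{N}$. A quick check, with the caveat that the sample size is our assumption (N = 100 is chosen here because it reproduces the reported pressure interval; the paper does not state N for this test):

```python
import math

def ci95(mean, sd, n):
    """95% confidence interval for the mean under the normal approximation."""
    half = 1.96 * sd / math.sqrt(n)
    return (mean - half, mean + half)

# Assumed N = 100; yields roughly the reported [0.510, 0.668] PSI interval.
lo, hi = ci95(0.589, 0.402, 100)
```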
The response curves show that the system reliably tracks the dynamics of each process. In the case of flow measurement, the MSCV exhibited a faster response than the 8550 transmitter because it estimates pointer displacement continuously rather than relying on discrete pulse-count windows. This demonstrates the effectiveness of the symmetry-based feature extraction in identifying rapid geometric deviations.
To validate the real-time capability of the proposed approach, the inference time was measured directly on the Raspberry Pi 3 during continuous operation. The ANN execution time consistently remained below 4 ms per cycle (mean = 3.74 ms, SD = 0.53 ms), corresponding to update rates above 250 Hz. This confirms that the system achieves real-time performance on low-power ARM hardware and is suitable for embedded industrial applications where low latency and energy efficiency are essential.
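A generic way to reproduce this kind of per-cycle latency measurement is shown below; the stand-in workload is ours, and on the target hardware it would be replaced by one ANN forward pass:

```python
import statistics
import time

def measure_cycle_times(inference_fn, n_cycles=1000):
    """Time repeated inference cycles and report mean and standard
    deviation in milliseconds."""
    times_ms = []
    for _ in range(n_cycles):
        t0 = time.perf_counter()
        inference_fn()
        times_ms.append((time.perf_counter() - t0) * 1e3)
    return statistics.mean(times_ms), statistics.stdev(times_ms)

# Stand-in workload for demonstration purposes only.
mean_ms, sd_ms = measure_cycle_times(lambda: sum(range(1000)))
```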

5.3. Statistical Performance Evaluation

The statistical robustness of the proposed symmetry-based machine vision system was assessed through a quantitative analysis performed on 500 test samples for each instrument. The assessment considered three key performance indicators: the Mean Absolute Error (MAE), the Root Mean Square Error (RMSE), and the Coefficient of Determination ($R^2$). These metrics quantify the precision, dispersion, and correlation between the machine vision predictions and the reference transmitter measurements.
$$\mathrm{MAE} = \frac{1}{N}\sum_{i=1}^{N}\left| y_i - \hat{y}_i \right|$$
$$\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left( y_i - \hat{y}_i \right)^{2}}$$
$$R^{2} = 1 - \frac{\sum_{i=1}^{N}\left( y_i - \hat{y}_i \right)^{2}}{\sum_{i=1}^{N}\left( y_i - \bar{y} \right)^{2}}$$
where $y_i$ and $\hat{y}_i$ represent the reference and predicted values for each sample, respectively; $\bar{y}$ denotes the mean of the reference measurements; and $N$ is the number of test samples. These variables match those used in the quantitative analysis throughout the manuscript, ensuring consistency between notation, statistical interpretation, and experimental reporting.
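The three metrics above can be computed directly in numpy; this is a straightforward sketch, not the authors' code:

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """Return MAE, RMSE, and R^2 for reference values y_true and
    predictions y_pred."""
    y, yhat = np.asarray(y_true, float), np.asarray(y_pred, float)
    mae = np.mean(np.abs(y - yhat))
    rmse = np.sqrt(np.mean((y - yhat) ** 2))
    r2 = 1.0 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)
    return mae, rmse, r2
```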
Table 3 summarizes the statistical performance of the proposed approach in comparison with industrial-grade transmitters. The system achieved a mean absolute error below 1 PSI for pressure readings and below 0.1 GPM for flow measurements. The low RMSE values, together with R2 coefficients higher than 0.97, indicate a strong correlation between the predicted and actual readings. These results confirm that the artificial neural networks successfully capture the geometric and symmetric characteristics of analog instruments with high fidelity, even under varying lighting and camera conditions.
The narrow error distribution, illustrated in Figure 12, demonstrates excellent repeatability and measurement stability, validating the proposed framework for real-time industrial monitoring applications aligned with Industry 4.0 standards. To extend the analysis to realistic working conditions, additional trials were conducted in a semi-industrial environment at the ARSI Laboratory. A total of 320 additional physical measurements were collected over three consecutive operating days under both steady-state and transient regimes. Table 4 presents the extended performance indicators, including mean error, standard deviation, and maximum deviation for each instrument.
The low variability and consistent performance observed in both instruments confirm the robustness of the proposed method under practical conditions. The system maintained reliable operation across diverse lighting levels and minor mechanical vibrations, ensuring repeatability of results without recalibration. These findings reinforce the applicability of the proposed symmetry-based vision system to real industrial contexts and provide a solid foundation for the comparative analysis discussed in the following subsection.

5.4. Comparative Discussion and Practical Implications

The proposed symmetry-guided architecture demonstrated superior computational efficiency when compared with traditional CNN- and OCR-based gauge readers reported in recent studies. Its lightweight ANN design allows real-time inference on low-power embedded hardware, such as the Raspberry Pi platform, without requiring GPU acceleration, an important advantage for practical industrial deployment. This efficiency confirms the feasibility of implementing distributed vision-based monitoring nodes in environments where computational resources are limited.
Compared with CNN-based approaches that often achieve sub-percent accuracy but require high computational power and inference times between 15 and 40 ms on GPU-assisted devices [24], the proposed system attains comparable precision within a 1–5% error range while maintaining an average execution time of only 3.74 ms per cycle. This performance balance confirms that symmetry-aware feature extraction provides a viable alternative to deep convolutional processing for analog instrument digitization tasks. Furthermore, when evaluated against OCR-based techniques [9], which tend to degrade under reflections, pointer occlusion, or irregular lighting, the ANN-driven architecture maintains stable measurements due to its reliance on geometric symmetry cues rather than text-like pattern recognition.
From a broader perspective, the symmetry-oriented and modular structure of the system contributes to the advancement of sustainable and scalable Industry 4.0 strategies. These characteristics are particularly valuable for small and medium-sized enterprises (SMEs) seeking cost-effective digital transformation solutions. The low implementation cost and adaptability of the framework make it suitable for retrofitting legacy analog equipment, enabling intelligent and continuous process supervision without extensive infrastructure replacements.

5.5. Generalization Capability

The proposed system, while validated on pressure and flow measurements, exhibits the potential to generalize to other analog instruments that present geometric symmetry, such as voltmeters, thermometers, or tachometers. The underlying framework is independent of specific scale markings or pointer shapes, relying primarily on radial or axial symmetry cues for measurement extraction.
Nevertheless, the current configuration assumes fixed camera alignment and stable lighting. In real industrial scenarios, vibrations, glare, and dynamic illumination may introduce noise or perspective shifts that affect detection precision. Addressing these limitations requires adaptive preprocessing methods and the integration of domain-adaptive neural networks capable of compensating for environmental perturbations. Future work will explore these aspects to ensure consistent performance across diverse operational environments.

5.6. Limitations of the Study

Although the proposed method demonstrated solid performance in both laboratory and semi-industrial conditions, several practical limitations should be acknowledged. The current system relies on a fixed camera position, which means that strong vibrations or unintentional camera movement may introduce errors in the measurements. Likewise, while the preprocessing stage helps reduce the effects of changing illumination, very intense lighting variations or pronounced reflections can still interfere with edge detection. Another limitation arises from the use of lightweight ANNs, which, although efficient for embedded execution, may not generalize well to analog instruments with uncommon scale layouts or atypical pointer designs. Finally, the method was validated using a limited set of instrument types, which may constrain its applicability in more diverse industrial environments.
Future work will focus on addressing these challenges. Efforts will include incorporating adaptive preprocessing and perspective-correction methods, as well as exploring domain-adaptive neural networks to improve robustness under varying operating conditions. Expanding the framework to cover a wider range of analog instruments, integrating temporal filtering strategies, and evaluating more advanced symmetry-aware CNN architectures are also promising directions that can strengthen the versatility and reliability of the proposed system.

6. Conclusions

The design and implementation of a cost-effective monitoring system based on machine vision and artificial neural networks enable real-time acquisition of process variables in industrial environments. The proposed framework transforms conventional analog instruments into intelligent measurement devices capable of autonomous operation, aligned with the principles of the Industry 4.0 paradigm. By taking advantage of the geometric symmetry of analog instruments, such as the circular scales of pressure gauges and the axial symmetry of rotameters, the system can accurately recognize and track pointer movements even when lighting or viewing angles vary. This symmetry-oriented approach provides stability and repeatability in the measurements while maintaining a low computational cost on embedded hardware.
The experimental results demonstrate that the proposed system achieves high accuracy and reliable real-time performance. The average error, which ranges from one to five percent, confirms the system’s effectiveness for monitoring pressure and flow variables under both steady and dynamic conditions. The convergence behavior of the artificial neural networks shows their capacity to learn and reproduce symmetric visual patterns, ensuring consistent interpretation of geometric features. These findings validate the use of symmetry-guided learning as a practical and efficient strategy to enhance precision in machine vision applications. The integration of symmetry-based perception and lightweight machine learning into a modular and non-invasive framework makes the solution scalable and adaptable to different analog instrument designs with minimal hardware modifications. The system therefore bridges the gap between traditional instrumentation and intelligent process supervision, contributing to the digital transformation of industrial environments.
The extended quantitative evaluation, detailed dataset description, and discussion of generalization and environmental limitations reinforce the scientific validity of the proposal. The presented results confirm that the symmetry-based vision framework combines computational efficiency, adaptability, and experimental robustness, establishing a solid foundation for future research in industrial metrology and autonomous supervision within Industry 4.0 ecosystems.

Author Contributions

Conceptualization, G.C., C.G., E.P.P. and V.H.A.; methodology, G.C., C.G., E.P.P., V.H.A. and J.S.O.; software, G.C. and C.G.; validation, G.C., C.G., E.P.P., V.H.A. and J.S.O.; formal analysis, G.C., C.G., E.P.P. and V.H.A.; investigation, G.C. and C.G.; resources, G.C., C.G., E.P.P., V.H.A. and J.S.O.; data curation, G.C. and C.G.; writing—original draft preparation, G.C., C.G., E.P.P., V.H.A. and J.S.O.; writing—review and editing, G.C., C.G., E.P.P., V.H.A. and J.S.O.; visualization, G.C. and C.G.; supervision, G.C., C.G., E.P.P., V.H.A. and J.S.O.; project administration, G.C., C.G., E.P.P., V.H.A. and J.S.O. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data presented in this study are available from the corresponding author upon request.

Acknowledgments

The authors would like to thank the Universidad de las Fuerzas Armadas ESPE and the ARSI Research Group for their support in developing this work.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Gold, K.L. Effects of Industry 4.0 on Small and Medium-Scale Enterprises: An Analytical and Bibliometric Review. Sage Open 2025, 15, 21582440251336514.
  2. Yaqub, M.Z.; Alsabban, A. Industry-4.0-Enabled Digital Transformation: Prospects, Instruments, Challenges, and Implications for Business Strategies. Sustainability 2023, 15, 8553.
  3. Elhusseiny, H.M.; Crispim, J. SMEs, Barriers and Opportunities on adopting Industry 4.0: A Review. Procedia Comput. Sci. 2022, 196, 864–871.
  4. Pantusin, F.J.; Ortiz, J.S.; Carvajal, C.P.; Andaluz, V.H.; Yar, L.G.; Roberti, F.; Gandolfo, D. Digital Twin Integration for Active Learning in Robotic Manipulator Control Within Engineering 4.0. Symmetry 2025, 17, 1638.
  5. Nguyen, T.N.M.; Guo, Y.; Qin, S.; Frew, K.S.; Xu, R.; Agar, J.C. Symmetry-aware recursive image similarity exploration for materials microscopy. npj Comput. Mater. 2021, 7, 166.
  6. Liu, S.; Suganuma, M.; Okatani, T. Symmetry-aware Neural Architecture for Embodied Visual Navigation. Int. J. Comput. Vis. 2024, 132, 1091–1107.
  7. Frustaci, F.; Spagnolo, F.; Perri, S.; Cocorullo, G.; Corsonello, P. Robust and High-Performance Machine Vision System for Automatic Quality Inspection in Assembly Processes. Sensors 2022, 22, 2839.
  8. Zhang, S.; Bai, G.; Li, H.; Liu, P.; Zhang, M.; Li, S. Multi-Source Knowledge Reasoning for Data-Driven IoT Security. Sensors 2021, 21, 7579.
  9. Ninama, H.; Raikwal, J.; Ravuri, A.; Sukheja, D.; Bhoi, S.K.; Jhanjhi, N.Z.; Elnour, A.A.H.; Abdelmaboud, A. Computer vision and deep transfer learning for automatic gauge reading detection. Sci. Rep. 2024, 14, 23019.
  10. Ghaffari, M.; Zhang, R.; Zhu, M.; Lin, C.E.; Lin, T.-Y.; Teng, S.; Li, T.; Liu, T.; Song, J. Progress in symmetry preserving robot perception and control through geometry and learning. Front. Robot. AI 2022, 9, 969380.
  11. Bertens, T.; Caasenbrood, B.; Saccon, A.; Jalba, A. Symmetry-induced ambiguity in orientation estimation from RGB images. Mach. Vis. Appl. 2025, 36, 40.
  12. Katsigiannis, M.; Mykoniatis, K. Enhancing industrial IoT with edge computing and computer vision: An analog gauge visual digitization approach. Manuf. Lett. 2024, 41, 1264–1273.
  13. Hütten, N.; Meyes, R.; Meisen, T. Vision Transformer in Industrial Visual Inspection. Appl. Sci. 2022, 12, 11981.
  14. Wang, C.-H.; Huang, K.-K.; Chang, R.-I.; Huang, C.-K. Scale-Mark-Based Gauge Reading for Gauge Sensors in Real Environments with Light and Perspective Distortions. Sensors 2022, 22, 7490.
  15. Rani, A.; Ortiz-Arroyo, D.; Durdevic, P. A survey of vision-based condition monitoring methods using deep learning: A synthetic fiber rope perspective. Eng. Appl. Artif. Intell. 2024, 136, 108921.
  16. Zakaria, M.; Karaaslan, E.; Catbas, F.N. Advanced bridge visual inspection using real-time machine learning in edge devices. Adv. Bridge Eng. 2022, 3, 27.
  17. Khan, A.; Rauf, Z.; Sohail, A.; Khan, A.R.; Asif, H.; Asif, A.; Farooq, U. A survey of the vision transformers and their CNN-transformer based variants. Artif. Intell. Rev. 2023, 56, 2917–2970.
  18. Roth, W.; Schindler, G.; Klein, B.; Peharz, R.; Tschiatschek, S.; Fröning, H.; Pernkopf, F.; Ghahramani, Z. Resource-Efficient Neural Networks for Embedded Systems. J. Mach. Learn. Res. 2024, 25, 2506–2556.
  19. Seng, K.P.; Ang, L.-M. Embedded Intelligence: State-of-the-Art and Research Challenges. IEEE Access 2022, 10, 59236–59258.
  20. Tang, B.; Chen, L.; Sun, W.; Lin, Z. Review of surface defect detection of steel products based on machine vision. IET Image Process. 2023, 17, 303–322.
  21. Zhu, S.; Li, Z.; Long, K.; Zhou, S.; Zhou, Z. Study of illumination and reflection performances on light-colored pavement materials. Constr. Build. Mater. 2024, 456, 139239.
  22. John, J.G.; N, A. Illumination Compensated images for surface roughness evaluation using machine vision in grinding process. Procedia Manuf. 2019, 34, 969–977.
  23. Chen, S.; Yu, Z.; Mou, Y.; Fang, H. An analytical model of the detecting structure of electrostatic inductive electric field sensor. Measurement 2023, 222, 113618.
  24. Li, Y.; Guan, S.; Yin, X.; Wang, X.; Liu, J.; Na Wong, I.; Wang, G.; Chen, F. Measurement of road safety situation by CRITIC-TODIM-NMF: A lesson system of legislation and regulation for the United States. Measurement 2023, 220, 113333.
Figure 1. Proposed system operation mode integrated into a control loop.
Figure 1. Proposed system operation mode integrated into a control loop.
Symmetry 17 02143 g001
Figure 2. Methodology pipeline of the proposed symmetry-based vision system.
Figure 2. Methodology pipeline of the proposed symmetry-based vision system.
Symmetry 17 02143 g002
Figure 3. Image processing of a pressure gauge.
Figure 3. Image processing of a pressure gauge.
Symmetry 17 02143 g003
Figure 4. Comparison of stable and disturbed flow conditions in the rotameter. The left image shows a stable flow regime, while the right image illustrates bubbles generated by liquid circulation.
Figure 4. Comparison of stable and disturbed flow conditions in the rotameter. The left image shows a stable flow regime, while the right image illustrates bubbles generated by liquid circulation.
Symmetry 17 02143 g004
Figure 5. Image processing stages of the rotameter. From left to right: RGB image, grayscale conversion, and edge detection. These steps illustrate the procedure used to isolate the float contour and prepare the image for feature extraction.
Figure 5. Image processing stages of the rotameter. From left to right: RGB image, grayscale conversion, and edge detection. These steps illustrate the procedure used to isolate the float contour and prepare the image for feature extraction.
Symmetry 17 02143 g005
Figure 6. Structures of the ANN. ANN for the positioning of the pressure gauge scale (top left). ANN for the dynamic tracking of the pressure gauge indicator (top right). ANN for the positioning of the rotameter scale (bottom left). ANN for the dynamic tracking of the rotameter (bottom right).
Figure 6. Structures of the ANN. ANN for the positioning of the pressure gauge scale (top left). ANN for the dynamic tracking of the pressure gauge indicator (top right). ANN for the positioning of the rotameter scale (bottom left). ANN for the dynamic tracking of the rotameter (bottom right).
Symmetry 17 02143 g006
Figure 7. Invasion with noise 3 pixels above and below the edge of the rotameter float.
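The ±3 pixel perturbation illustrated in Figure 7 amounts to a simple data-augmentation step on the detected float-edge position. A hedged sketch follows; the function name, uniform shift distribution, and sample counts are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb_edge(edge_row: int, max_shift: int = 3) -> int:
    """Shift a detected float-edge row by up to +/- max_shift pixels."""
    return edge_row + int(rng.integers(-max_shift, max_shift + 1))

# Augment a few nominal edge positions (pixel rows) for training robustness;
# each pair keeps the original row alongside its perturbed copy.
nominal = [120, 125, 130]
augmented = [(r, perturb_edge(r)) for r in nominal for _ in range(10)]
```

Training the tracking ANN on such perturbed samples plausibly explains its tolerance to the bubble-induced contour noise shown in Figure 4.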
Figure 8. Implemented experimental system.
Figure 9. Convergence results of the proposed training algorithms for the ANNs.
Figure 10. Comparison of the responses of the Foxboro IAP20 industrial transmitter and the proposed machine vision system for the pressure variable.
Figure 11. Comparison of the responses of the Georg Fischer 8550 industrial transmitter and the proposed machine vision system for the flow variable.
Figure 12. Boxplot of absolute measurement errors for pressure and flow variables.
Table 1. Hyperparameters of the ANN models.

| ANN Type | Instrument | Input Neurons | Hidden Neurons | Output Neurons | Learning Rate | Activation | Iterations |
|---|---|---|---|---|---|---|---|
| Scale Localization | Pressure Gauge | 20,000 | 30 | 7 | 0.01 | Sigmoid | 200 |
| Indicator Tracking | Pressure Gauge | 1000 | 20 | 6 | 0.01 | Sigmoid | 400 |
| Scale Localization | Rotameter | 14,000 | 30 | 7 | 0.01 | Sigmoid | 200 |
| Float Tracking | Rotameter | 600 | 20 | 6 | 0.01 | Sigmoid | 200 |
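As a sketch of the topologies in Table 1, the scale-localization network for the pressure gauge (20,000 input pixels, 30 hidden neurons, 7 outputs, sigmoid activation) reduces to a single-hidden-layer forward pass. The weights below are random placeholders, not trained values, and the variable names are assumptions for illustration.

```python
import numpy as np

def sigmoid(z: np.ndarray) -> np.ndarray:
    return 1.0 / (1.0 + np.exp(-z))

def mlp_forward(x, w1, b1, w2, b2):
    """One hidden layer with sigmoid activations, matching Table 1."""
    return sigmoid(sigmoid(x @ w1 + b1) @ w2 + b2)

# Scale-localization dimensions for the pressure gauge:
# 20,000 flattened input pixels -> 30 hidden -> 7 output neurons.
rng = np.random.default_rng(1)
w1 = rng.normal(scale=0.01, size=(20_000, 30)); b1 = np.zeros(30)
w2 = rng.normal(scale=0.01, size=(30, 7));      b2 = np.zeros(7)
x = rng.random(20_000)  # flattened region of interest (placeholder data)
y = mlp_forward(x, w1, b1, w2, b2)
```

The small hidden layers (20 to 30 neurons) are what keep the per-cycle inference cost in Table 2 within a few milliseconds on the Raspberry Pi 3.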
Table 2. Training and computational performance of the proposed Artificial Neural Networks (ANNs) for pressure and flow measurement.

| Network Type | Instrument | Iterations to Convergence | Training Time (min) | Final MAE | Execution Time per Cycle (ms) |
|---|---|---|---|---|---|
| Scale Positioning ANN | Pressure Gauge | 200 | 30 | 0.018 | 3.74 |
| Indicator Tracking ANN | Pressure Gauge | 400 | 38 | 0.021 | 3.81 |
| Scale Positioning ANN | Rotameter | 200 | 32 | 0.019 | 3.70 |
| Float Tracking ANN | Rotameter | 200 | 40 | 0.020 | 3.68 |
Table 3. Statistical performance metrics of the proposed machine vision system for pressure and flow measurement.

| Instrument | Reference Range | MAE | RMSE | R² |
|---|---|---|---|---|
| Pressure Gauge | 0–200 PSI | 0.589 PSI | 0.731 PSI | 0.985 |
| Rotameter | 1–10 GPM | 0.085 GPM | 0.097 GPM | 0.978 |
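The metrics reported in Table 3 follow their standard definitions; a minimal NumPy computation on illustrative readings (invented for demonstration, not the paper's dataset) is:

```python
import numpy as np

def mae(y_true, y_pred):
    """Mean Absolute Error."""
    return np.mean(np.abs(y_true - y_pred))

def rmse(y_true, y_pred):
    """Root Mean Square Error."""
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def r2(y_true, y_pred):
    """Coefficient of determination."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

# Illustrative pressure readings (PSI): reference transmitter vs. vision system.
ref = np.array([10.0, 50.0, 100.0, 150.0, 200.0])
vis = np.array([10.5, 49.4, 100.8, 149.3, 200.6])
```

In the paper, `ref` would be the Foxboro IAP20 (pressure) or Georg Fischer 8550 (flow) transmitter output and `vis` the vision-system estimate.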
Table 4. Extended performance analysis under real industrial operating conditions.

| Instrument | Measurement Range | Number of Samples | Mean Error (%) | Standard Deviation (%) | Maximum Deviation (%) | Test Environment |
|---|---|---|---|---|---|---|
| Pressure Gauge | 0–200 PSI | 180 | 2.37 | 0.84 | 4.85 | Factory Hall A, 22 °C, 300 lx |
| Rotameter | 1–10 GPM | 140 | 3.18 | 1.05 | 5.00 | Fluid Test Bench, 21 °C, 280 lx |
Share and Cite

MDPI and ACS Style

Corrales, G.; Gálvez, C.; Pruna, E.P.; Andaluz, V.H.; Ortiz, J.S. Intelligent Symmetry-Based Vision System for Real-Time Industrial Process Supervision. Symmetry 2025, 17, 2143. https://doi.org/10.3390/sym17122143
