Article

Adaptive Visual Servo Control for GIS Partial Discharge Detection Robots: A Model Predictive Control Approach

1 School of Electronic and Information Engineering, Guangzhou City University of Technology, Guangzhou 510800, China
2 School of Mechanical & Automotive Engineering, South China University of Technology, Guangzhou 510641, China
* Author to whom correspondence should be addressed.
Energies 2025, 18(23), 6365; https://doi.org/10.3390/en18236365
Submission received: 31 October 2025 / Revised: 30 November 2025 / Accepted: 2 December 2025 / Published: 4 December 2025
(This article belongs to the Special Issue Application of Artificial Intelligence in Electrical Power Systems)

Abstract

Gas-insulated switchgear (GIS) serves as the core equipment in substations. Its partial discharge detection requires ultrasonic sensors to be precisely aligned with millimeter-level measurement points. However, existing technologies face three major bottlenecks: the lack of surface texture on GIS makes visual feature extraction difficult; strong electromagnetic interference in substations causes image noise and loss of feature point tracking; and fixed-gain control easily leads to end-effector jitter, reducing positioning accuracy. To address these challenges, this paper first employs AprilTag visual markers to define GIS measurement point features, establishing an image-based visual servo model that integrates GIS surface curvature constraints. Second, it proposes an adaptive gain algorithm based on model predictive control, which dynamically adjusts the gain in real time according to visual error, electromagnetic interference intensity, and contact force feedback, balancing convergence speed and motion stability. Finally, experiments conducted on a GIS inspection platform built around a Franka Panda robotic arm demonstrate that the proposed algorithm converges faster and positions more accurately than fixed-gain algorithms, providing technical support for the engineering application of GIS partial discharge detection robots.

1. Introduction

As power systems evolve toward unmanned, intelligent operation, gas-insulated switchgear (GIS) has become the core equipment in 500 kV and higher substations due to its compact size and superior insulation properties. However, long-term operation can lead to defects such as aging bowl-type insulators and poor conductor contact, which readily trigger partial discharge (PD). If undetected, these issues may cause equipment breakdown, resulting in widespread power outages [1,2]. Current GIS PD detection remains highly dependent on manual operations. Personnel must manually approach energized GIS equipment with ultrasonic or ultra-high-frequency (UHF) sensors, exposing them to high-voltage electrocution risks while suffering from low detection efficiency (2–3 h per bay) and poor positioning accuracy (manual deviation > 5 mm). This approach struggles to meet substation maintenance demands for daily, high-precision inspection [3]. Therefore, developing GIS PD detection robots to make the inspection process automated and intelligent has become a key technological direction for unmanned substation construction.
The core challenge in replacing manual inspection with robots lies in solving the problems of precise measurement point localization and stable sensor pose control [4]. Visual servo technology, which enables real-time closed-loop control using image information, is widely applied in robotic precision operations [5]. Traditional visual servo systems fall into two categories: image-based (IBVS) and pose-based (PBVS). IBVS directly calculates control signals from image feature errors, offering simplicity but being prone to trajectory anomalies and loss of target visibility [6]. PBVS achieves control by estimating the target pose, yielding smoother motion trajectories and effectively avoiding the singularity issues inherent in IBVS [7,8], making it better suited to the complex operational spaces around GIS equipment.
Existing research on GIS PD detection robots has explored various measurement point localization and pose control schemes, yet significant limitations persist. Luan et al. [6] employed a stereo camera and QR codes to locate PD measurement points on high-voltage switchgear, achieving basic positioning functionality. However, QR codes exhibit weak resistance to electromagnetic interference and require pre-stored coordinates of measurement points relative to the QR codes, limiting flexibility for multi-bay GIS. Wei et al. [7] employed LiDAR for PD detection point localization in long-distance gas-insulated metal-enclosed transmission lines (GIL). However, LiDAR's low sampling rate (<10 Hz) fails to meet the "≥30 Hz real-time pose adjustment" requirement for visual servo control. Yang et al. [8] designed a rectangular-elliptical composite marker to guide a GIS ultrasonic PD detection robotic arm. However, these markers lack encoding functionality, cannot distinguish multiple detection points, and neglect contact force control between the sensor and the GIS surface (exceeding 0.8 N may scratch insulating components) [9]. Furthermore, existing visual servo algorithms predominantly employ fixed-gain control. In substation environments with strong electromagnetic interference (50 Hz, 80 μT), image noise causes feature point tracking loss rates to surge above 30% [1]. Additionally, busbar current flow during GIS operation induces periodic vibrations (frequency 50 Hz, amplitude 0.1–0.3 mm); the resulting frame-to-frame shifts in visual feature points increase positioning errors in traditional servo systems by over 20%, leaving them unable to meet millimeter-level detection requirements. Fixed gain also struggles to balance rapid convergence with smooth motion: excessive gain induces end-effector jitter, causing sensor contact deviations exceeding 3 mm and failing to meet the positioning accuracy requirements for live GIS equipment inspection [10], while insufficient gain significantly prolongs inspection duration [11,12].
Although the above literature has designed various effective GIS inspection methods, numerous shortcomings remain. Addressing the textureless, strongly interfered, contact-based characteristics of GIS inspection scenarios, this paper proposes three improvements: first, introducing the AprilTag visual marker system [5] and defining its corner and center coordinates as the core visual features, solving feature extraction on textureless surfaces; second, integrating PBVS control logic [3] and optimizing the robotic arm's motion trajectory by targeting the marker's pose at vertical sensor contact; third, integrating strong electromagnetic interference compensation (real-time correction of visual error variance) and contact force constraints (0.3–0.8 N) [9] into model predictive control (MPC), dynamically adjusting the gain parameters: increasing gain to accelerate convergence when errors are large and reducing gain to ensure stable contact when errors are small. Together, these designs achieve precise, efficient, and safe operation of the GIS PD inspection robot, providing a feasible MPC-based control pathway for unmanned GIS inspection.

2. Image-Based Visual Servo Control System Model

To achieve millimeter-level precision alignment of ultrasonic sensors with measurement points in GIS partial discharge detection, it is necessary to first establish a visual servo system framework tailored to GIS scenario characteristics, and then use mathematical modeling to quantify the mapping between visual features and robotic arm motion, providing theoretical support for the subsequent adaptive control algorithm. This section begins with the design of the system control architecture and, considering constraints such as GIS's textureless surfaces and strong electromagnetic interference, establishes the control logic and mathematical models. Key design decisions reference relevant research and industry standards.

2.1. Image-Based Visual Servo Control Architecture

The surface of GIS equipment consists of metallic or insulating non-textured materials, and substations are subject to a strong 50 Hz, 80 μT power frequency electromagnetic environment [13]. Traditional vision-based servo structures often suffer from feature extraction failures and insufficient anti-interference capabilities. To address this, a vision-based servo control structure tailored for GIS inspection is designed (Figure 1), comprising two core modules: visual perception and execution logic, and a closed-loop feedback process.
The visual perception layer focuses on textureless feature extraction and resistance to strong electromagnetic interference. AprilTag 36h11 visual markers [14] are affixed near GIS partial discharge hotspots; their unique codes distinguish measurement points across multiple GIS bays, circumventing the lack of natural features on textureless surfaces. Visual acquisition employs an "eye-in-hand" configuration [15], mounting a Realsense SR300 camera (manufactured by Intel Corporation, Santa Clara, CA, USA) coaxially with the ultrasonic sensor (manufactured by Pepperl+Fuchs GmbH, Mannheim, Baden-Württemberg, Germany; approx. 200 g) at the robotic arm's end. The algorithm layer incorporates 50 Hz power-frequency magnetic field noise filtering logic [16], using grayscale variance correction and multi-frame feature point matching to reduce tracking loss caused by electromagnetic interference. To counteract GIS vibration, rubber damping pads (manufactured by 3M Company, Maplewood, MN, USA; damping coefficient 0.3, thickness 5 mm) were installed between the camera mount and the robotic arm end effector. A multi-frame feature smoothing filter was also incorporated at the algorithmic level, averaging the corner coordinates of five consecutive AprilTag frames; this reduces vibration-induced feature offset errors from ±0.5 mm to ±0.1 mm, ensuring positioning stability.
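As a concrete illustration of the five-frame smoothing filter described above, the following minimal Python sketch averages buffered AprilTag corner coordinates; the window size comes from the text, while the class name and array shapes are assumptions for illustration.

```python
from collections import deque

import numpy as np

# Minimal sketch of the five-frame corner-smoothing filter (assumed API).
class CornerSmoother:
    def __init__(self, window=5):
        # Keep only the most recent `window` frames of corner coordinates.
        self.frames = deque(maxlen=window)

    def update(self, corners):
        """corners: (4, 2) array of AprilTag corner pixel coordinates."""
        self.frames.append(np.asarray(corners, dtype=float))
        # Averaging over buffered frames damps the 50 Hz vibration-induced
        # jitter (reported reduction: +/-0.5 mm -> +/-0.1 mm).
        return np.mean(np.stack(self.frames), axis=0)
```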
The execution phase prioritizes precise positioning and safe contact. A Franka Panda 7-axis collaborative robot (manufactured by Franka Emika GmbH, Munich, Bavaria, Germany; repeatability ±0.1 mm) was selected to meet the "measurement point positioning deviation ≤ 2 mm" requirement of DL/T 555-2021 "Standardized Operation for Live-Line Inspection of GIS Equipment" [17,18]. The kinematic parameters preset for the ultrasonic sensor (approx. 200 g) limit the maximum translational velocity of the robotic arm's end effector to 5 mm/s. To protect against power grid surges at substations (typical peak 1.2 kV), a 220 V/10 kA surge protector (Model: DEHNguard 950, manufactured by DEHN GmbH + Co. KG, Neumarkt in der Oberpfalz, Bavaria, Germany) is connected in series at the front end of the power supply circuit for the servo drive (built into the robotic arm). An overvoltage protection circuit is also designed on the control board PCB, employing an SMBJ24CA TVS diode (manufactured by ON Semiconductor, Phoenix, AZ, USA) connected in parallel at the drive's power supply terminal. Upon detecting a sudden grid voltage surge exceeding 264 V, the TVS diode conducts and diverts the surge current within 0.1 ms, while a relay disconnects the drive's power supply to prevent component damage. Experimental verification simulating a 1.2 kV/2 ms surge produced no system failures, with the drive operating normally. Through velocity smoothness constraints, the contact force between the sensor and the GIS surface is indirectly kept within 0.3–0.8 N [19], preventing damage to insulating components.
The system workflow is fully adapted to GIS inspection tasks: the camera captures AprilTag marker images in real time, extracts the pixel coordinates of the marker's four corner points and center as the core visual features, and calculates the deviation between the current features and the expected features when the sensor is precisely aligned with the measurement point (the expected features must satisfy two conditions: the marker center coincides with the center of the sensor probe, and the corner-point distribution corresponds to vertical sensor alignment [20,21]). This deviation is fused with contact force signals as the control input to drive the robotic arm's pose adjustment. The process repeats until the feature deviation falls below 0.5 mm (corresponding to ±0.8 mm sensor alignment accuracy), completing single-point positioning.

2.2. Mathematical Model for Vision Servo Control

To quantify the mapping relationship between “visual feature error” and “robot arm movement speed,” a mathematical model was established based on nonlinear control theory. Supplementary correction rules incorporating GIS scene constraints (texture-free surfaces, strong electromagnetic interference, and equipment structure) were integrated to ensure the model directly meets the precise positioning requirements for GIS measurement points. Throughout the derivation process, core formulas adhere to the original definitions and discretization logic.

2.2.1. Core Variables and Error Definitions

For the “eye-in-hand” layout and GIS detection targets, core variables linking visual features to measurement point positioning accuracy must first be defined to address the challenges of representing textureless surfaces and quantifying alignment states in GIS:
Real-time visual feature vector: describes the AprilTag marker state at time $t$ when the robotic arm end-effector velocity is $v$. Its dimension is 10 × 1 (the x/y pixel coordinates of the four corner points plus those of the center point), expressed as:
$s(v,t) = \left[ s_{1x}, s_{1y}, s_{2x}, s_{2y}, s_{3x}, s_{3y}, s_{4x}, s_{4y}, s_{cx}, s_{cy} \right]^{T},$
This definition directly addresses the lack of natural visual features on GIS metal/insulation surfaces through the encoding characteristics of the AprilTag marker [9]. Here, $s_{ix}$ and $s_{iy}$ ($i = 1, 2, 3, 4$) denote the pixel coordinates of the marker's corner points, while $s_{cx}$ and $s_{cy}$ represent the pixel coordinates of the marker's center point. These are obtained via real-time camera capture and the feature extraction module.
Desired visual feature vector: for the ideal state where the ultrasonic sensor precisely aligns with the GIS measurement point, two GIS-specific requirements [10] must be met; first, the center coordinates must coincide with the center pixel coordinates of the sensor probe to ensure minimal positional deviation. Second, the distribution of corner pixels must reflect the “sensor axis perpendicular to the GIS surface,” with an attitude angle error ≤ 0.2°, to guarantee the signal-to-noise ratio for partial discharge signal acquisition (preventing signal attenuation due to tilted alignment).
Visual Feature Error: defined as the difference between real-time features and desired features, it directly reflects the positional/attitudinal deviation between the sensor and GIS measurement points. As the core input for servo control, its physical significance is “a visual quantification metric for GIS measurement point positioning deviation.” The expression is:
$e(v,t) = s(v,t) - s^{*},$
When $e(v,t)$ approaches 0, the sensor's alignment accuracy with the GIS measurement point meets the specified requirements.
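To make Equations (1) and (2) concrete, the following minimal Python sketch stacks detected AprilTag corners and center into the 10 × 1 feature vector and forms the error against the desired features; the `detection` dictionary layout is an assumption for illustration, not a specific detector API.

```python
import numpy as np

def feature_vector(detection):
    """Stack four AprilTag corners and the center into s(v, t) (Equation (1))."""
    corners = np.asarray(detection["corners"])   # (4, 2) corner pixel coords
    center = np.asarray(detection["center"])     # (2,) center pixel coords
    return np.concatenate([corners.ravel(), center])  # shape (10,)

def feature_error(detection, s_star):
    """e(v, t) = s(v, t) - s* (Equation (2)); alignment is reached as it -> 0."""
    return feature_vector(detection) - s_star
```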

2.2.2. Variance Minimization Objective Function

The core objective of visual servo control is to achieve progressive convergence of the feature error e ( v , t ) toward zero, thereby enabling precise positioning of GIS survey points. Considering the time-varying nonlinear nature of errors in GIS environments—such as sudden shifts in feature points caused by strong electromagnetic interference, leading to abrupt error changes—a nonlinear least-squares method is employed to define the objective function. This transforms the error minimization problem into a solvable optimization problem, with the objective function expressed as:
$E(v,t) = \frac{1}{2}\, e^{T}(v,t)\, e(v,t),$
In Equation (2), $E(v,t)$ represents the sum of squares of the feature errors (a scalar), whose physical meaning is a comprehensive quantitative indicator of GIS measurement point positioning error. When $E(v,t) = 0$, the sensor perfectly aligns with the measurement point; as $E(v,t)$ gradually decreases, the position/attitude deviation between the sensor and the GIS measurement point diminishes synchronously. This function design directly addresses the core requirement of high-precision positioning in GIS detection, preventing non-compliance with contact accuracy due to error dispersion.

2.2.3. Model Linearization and Velocity Control Law Derivation

Since E(v,t) is a nonlinear function of the robotic arm’s end-effector velocity v and time t, directly solving for the minimum presents mathematical challenges. It must be linearized via Taylor series expansion while simultaneously meeting the real-time requirements of GIS detection, with single-frame image processing time ≤ 33 ms.
At a discrete time step $t_k$, the sampling period is set to $\Delta t = 20$ ms, matching the camera's 30 Hz frame rate [22] and the GIS detection requirement of "real-time pose adjustment ≥ 30 Hz". Note that a 30 Hz frame rate corresponds to a theoretical single-frame interval of 33.33 ms; the controller therefore employs a "frame buffer preprocessing" strategy in which images output by the camera are written to a memory buffer in real time and the controller reads the latest frame from the buffer every 20 ms, without waiting for the next frame output and without frame loss. The control system is equipped with an Intel i5-10400 CPU (manufactured by Intel Corporation, Santa Clara, CA, USA) to support real-time data processing; measured on this CPU, single-frame AprilTag feature extraction takes 8 ms, visual error calculation 5 ms, and MPC law solving 4 ms, giving a total processing time of 17 ms < 20 ms, which meets the "≥30 Hz real-time pose adjustment" requirement. Around the end-effector velocity $v_k$, $E(v,t)$ is expanded to the first-order term (omitting the second-order term $o(\Delta^2)$, since the error variation rate in the GIS environment is small, making the second-order term negligible):
$E(v,t) = E(v_k, t_k) + \frac{\partial E_k}{\partial v}\,(v - v_k) + \frac{\partial E_k}{\partial t}\,(t - t_k) + o(\Delta^2),$
In Equation (3), $E_k = E(v_k, t_k)$, and $\frac{\partial E_k}{\partial v}$ and $\frac{\partial E_k}{\partial t}$ are the first-order partial derivatives of $E(v,t)$ with respect to velocity $v$ and time $t$, evaluated at $(v_k, t_k)$.
When E ( v , t ) takes its minimum value, its partial derivative with respect to velocity v is 0. Substituting into Equation (3) and rearranging yields:
$\frac{\partial E_k}{\partial v} + \frac{\partial^2 E_k}{\partial v^2}\,(v - v_k) + \frac{\partial^2 E_k}{\partial t\, \partial v}\,(t - t_k) + o(\Delta^2) = 0,$
After combining and reorganizing the above equation and then discretizing it, we obtain:
$v_{k+1} = v_k - \left( \frac{\partial^2 E_k}{\partial t\, \partial v}\,\Delta t + \frac{\partial E_k}{\partial v} \right) \Big/ \frac{\partial^2 E_k}{\partial v^2},$
Here $\Delta t$ is the sampling period; substituting the Jacobian $J_k = \partial e_k / \partial v$ back in and rearranging yields:
$v_{k+1} = v_k - \frac{J_k^{T} e_k + J_k^{T} \frac{\partial e_k}{\partial t}\,\Delta t + \frac{\partial J_k^{T}}{\partial t}\, e_k\,\Delta t}{J_k^{T} J_k + S_k},$
In Equation (6), $v_{k+1}$ is the Cartesian-space velocity of the visual sensor: the velocity vector for the next time step is computed from the previous one. When the target is stationary, $e_k$ is solely a function of $v_k$, and the product term $\frac{\partial e_k}{\partial t}\,\Delta t$ of the error drift rate and the time increment vanishes.
In Equation (6), $S_k = 10^{-6} I$ (where $I$ is the identity matrix) serves as a regularization term. Its function is to prevent computational failure caused by a singularity in $J_k^{T} J_k$, a situation that frequently occurs in the complex operational spaces of GIS (e.g., when the robotic arm approaches protruding structures such as busbars or insulators). The regularization term ensures continuous control law output, preventing detection task interruptions caused by robotic arm stoppages.
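The following minimal NumPy sketch shows one way to evaluate the discretized update law of Equation (6); the finite-difference inputs `de_dt` and `dJT_dt` and the function signature are assumptions for illustration, with the regularization $S_k = 10^{-6} I$ taken from the text.

```python
import numpy as np

def velocity_update(v_k, J, e, de_dt, dJT_dt, dt=0.020):
    """One step of Equation (6) (illustrative sketch).

    J:      (10, 6) Jacobian de/dv from the perception pipeline
    e:      (10,)   visual feature error
    de_dt:  (10,)   finite-difference estimate of the error drift rate
    dJT_dt: (6, 10) finite-difference estimate of dJ^T/dt
    """
    S = 1e-6 * np.eye(J.shape[1])                       # regularization S_k
    rhs = J.T @ e + J.T @ de_dt * dt + dJT_dt @ e * dt  # numerator terms
    # Solve (J^T J + S) dv = rhs rather than forming an explicit inverse.
    dv = np.linalg.solve(J.T @ J + S, rhs)
    return v_k - dv
```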

2.2.4. GIS Scenario-Based Jacobi Correction

In the "eye-in-hand" configuration, where the camera moves synchronously with the robotic arm, the Jacobian matrix $J_k$ requires adjustment based on GIS scene characteristics to mitigate positioning errors caused by strong electromagnetic interference and equipment structure, ensuring the model output meets GIS inspection accuracy and safety requirements. Considering the dynamic impact of robotic arm joint movements on $J_k$, the Jacobian update quantity $U_k$ is defined to quantify changes in $J_k$ between adjacent time steps:
$U_k\,\Delta t = \frac{\partial J_k^{T}}{\partial t}\, e_k\,\Delta t,$
Approximating the time derivative of the Jacobian between adjacent time steps as $\frac{\partial J_k^{T}}{\partial t} \approx \frac{J_k^{T} - J_{k-1}^{T}}{\Delta t}$ (a discretization that satisfies GIS real-time detection requirements) and substituting into Equation (7) yields:
$U_k \approx \frac{J_k^{T} - J_{k-1}^{T}}{\Delta t}\, e_k,$
Rearranging Equation (8) yields:
$U_k\,\Delta t = \left( J_k^{T} - J_{k-1}^{T} \right) e_k,$
Here $U_k$ is the Jacobian update at time $k$ (dimension 10 × 6, dimensionless), quantifying the change in the Jacobian matrix between adjacent time steps; $\Delta t$ is the sampling period (in ms, set to 20 ms in this paper to match the camera frame rate); $J_k^{T}$ is the transpose of the Jacobian matrix at time step $k$ (dimension 6 × 10, dimensionless), describing the mapping between visual error and end-effector velocity; $J_{k-1}^{T}$ is the transpose of the Jacobian matrix at time step $k-1$ (dimension 6 × 10, dimensionless), used to compute the time derivative of the Jacobian; and $e_k$ is the visual feature error at time $k$ (dimension 10 × 1, in pixels), reflecting the deviation between the current and desired features.
GIS-specific correction rules: substituting Equation (8) into Equation (6) yields the speed control formula for the "eye-in-hand" system adapted for GIS detection. Regarding the second-order derivative constraint $\frac{\partial^2 E_k}{\partial v^2} \neq 0$ in the discretization: theoretically, this constraint ensures the validity of the Taylor expansion. In GIS detection scenarios, however, the rate of change of the visual feature error is < 0.05 mm/ms, and the influence of the second-order term on the control output is < 5% (simulation verification: positioning error ±0.82 mm when retaining the second-order term, ±0.8 mm when omitting it), so it can be neglected to simplify the formula:
$v_{k+1} = v_k - \frac{J_k^{T} e_k + J_k^{T} \frac{\partial e_k}{\partial t}\,\Delta t}{J_k^{T} J_k + S_k},$
For GIS detection scenarios, two critical corrections must be applied to the Jacobian matrix to address positioning errors caused by strong electromagnetic interference and equipment structure.
Strong electromagnetic interference compensation: when the measured electromagnetic intensity at a substation exceeds 80 μT [23], increased image feature noise may cause estimation errors in the Jacobian matrix. To mitigate this, the computational weight of $U_k$ is increased (weight coefficient = measured electromagnetic intensity / 80 μT), compensating for noise effects on the Jacobian matrix and preventing positioning failures caused by feature tracking loss.
Equipment Curvature Correction: most GIS equipment features cylindrical structures (e.g., 110 kV GIS cross-section diameter of 400 mm). When the ultrasonic sensor is positioned close to the GIS surface (d < 10 mm), surface curvature may cause sensor tilt during contact. This necessitates correcting the rotational velocity component to ensure the sensor axis remains perpendicular to the GIS surface, preventing positioning deviations exceeding the 2 mm upper limit.
These correction rules fully adapt the mathematical model to GIS inspection scenarios, simultaneously guaranteeing positioning accuracy and mitigating equipment damage risks. This establishes a scenario-specific theoretical foundation for subsequent adaptive gain algorithm design.
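The two correction rules above can be sketched in code as follows; the weight formula comes from the text, while the specific rotational scaling in `curvature_correction` is an illustrative assumption rather than the authors' formula.

```python
def emi_weight(b_measured_uT, b_ref_uT=80.0):
    """Weight applied to the Jacobian update U_k under electromagnetic
    interference: measured field strength / 80 uT (floored at 1.0)."""
    return max(1.0, b_measured_uT / b_ref_uT)

def curvature_correction(v_rot, d_mm, radius_mm=200.0):
    """Illustrative damping of the rotational velocity component near a
    cylindrical GIS shell (e.g., 400 mm diameter -> 200 mm radius) once
    the standoff d drops below 10 mm, keeping the sensor axis normal to
    the curved surface. The scaling law here is an assumption."""
    if d_mm < 10.0:
        return v_rot * (radius_mm / (radius_mm + d_mm))
    return v_rot
```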

3. Design of Adaptive Controllers Based on Model Predictive Control

Controller Design Objectives

The core objective of the controller remains enhancing system convergence speed and stability through dynamic gain adjustment, now extended with dedicated adaptation targets for GIS detection: electromagnetic interference resistance, compensating for image noise caused by the strong electromagnetic environments of substations to maintain feature tracking success rates; contact force pre-control, indirectly keeping the sensor contact force on GIS surfaces between 0.3 and 0.8 N through velocity smoothness constraints; and efficiency adaptation, reducing convergence time to meet the GIS maintenance requirement of ≤ 1 h per bay inspection.
The design objective of this controller is to achieve exponential convergence of the feature error [14]
$e(t) = s(t) - s^{*},$
toward the origin, where s t represents the current image state in the sequence and s * denotes the desired target state. The adaptive controller is designed to enable the system output to asymptotically track that of a stable reference model in response to piecewise continuous bounded inputs [24,25]. The visual servo velocity controller is defined as follows:
$v = \pm \lambda L_e^{+}\, e(t),$
The "±" sign in the formula is determined by the relative relationship between the camera and robot coordinate systems. This system employs an "eye-in-hand" configuration in which the camera optical axis aligns with the positive Z-axis of the Franka Panda end-effector (i.e., the camera orientation matches the end-effector's direction of motion), hence the "+" sign; if the camera optical axis opposed the arm's direction of motion (e.g., in an "eye-to-hand" configuration), the "-" sign would be used. For GIS detection, the "eye-in-hand" configuration prevents GIS structures from obstructing the camera's field of view, so the "+" sign is used throughout.
Here $L_e^{+}$ is the pseudoinverse of the composite Jacobian matrix $L_e$, which is composed of camera parameters, target object model parameters, and depth information and is obtained from the robot Jacobian matrix and the image interaction matrix; $v$ is the camera velocity in Cartesian space; and $\lambda$ is the gain coefficient. The sign of the control law depends on the relationship between the robot and the camera.
In industrial settings, the operating environment of vision servo systems exhibits significant dynamic uncertainty and complexity, specifically manifested as follows [26,27]: first, environmental parameters are influenced by the coupling of external disturbances and internal characteristics. This makes traditional fixed-gain control—which relies on preset parameters and cannot adjust in real time—unable to dynamically adapt to changing on-site conditions. Consequently, the system faces the contradiction of “excessive gain causing oscillations and insufficient gain prolonging convergence” under disturbances, making it difficult to balance servo closed-loop stability and response timeliness. Second, initial velocity discontinuities violate actuator dynamics—sudden velocity changes during robotic arm startup generate instantaneous impact torques. This not only induces high-frequency vibrations in end-effectors but also accelerates wear on motor winding insulation and bearing losses, shortening equipment lifespan. Third, from a control theory perspective, under specific initial configurations—such as when the robotic arm is at joint limit boundaries or when feature points are unevenly distributed within the camera’s field of view—the composite Jacobian matrix is prone to singular values, triggering the “camera pullback” phenomenon.
To address these issues, this paper leverages the “multi-constraint predictive optimization” capability of Model Predictive Control (MPC) to design a real-time adaptive gain control law integrated into the vision servo system. This control law employs a rolling time domain optimization strategy, continuously sampling the system’s current state (including visual feature error, robotic arm end-effector velocity, and environmental disturbance intensity) to predict short-term system evolution trends. Based on this, it dynamically configures gain parameters with the optimization objectives of “maximizing convergence speed, minimizing velocity fluctuations, and avoiding singular values.” This design specifically addresses the insufficient dynamic adaptability of traditional fixed-gain systems, effectively enhancing the convergence speed and disturbance rejection capability of the system in complex industrial scenarios. The specific method flow is shown in Figure 2.
In GIS partial discharge detection scenarios, the application of vision servo systems must address not only the system’s inherent time-varying and nonlinear characteristics but also complex constraints specific to power environments: joint limits must align with GIS equipment layouts, such as avoiding protruding components such as adjacent busbars and insulators; workspace is constrained by the narrow bays of substations, with single-bay widths often less than 1.5 m; continuous velocity constraints directly impact sensor contact force control to prevent scratching GIS insulators; initial configuration may suffer feature extraction errors due to GIS’s textureless surfaces; dynamic obstacle avoidance must circumvent structural components such as grounding brackets; feature visibility is susceptible to tracking loss from strong electromagnetic interference. If these constraints trigger bottlenecks, they can cause detection task interruptions or even equipment collision risks.
Therefore, this paper employs model predictive control, integrating GIS inspection constraints (e.g., electromagnetic interference compensation, contact force pre-control) into the visual servo gain loop. Adaptive gain adjustment enhances system performance through core steps, including model state prediction, error correction, and continuous variation. The following section illustrates the algorithmic logic using field-of-view constraints as an example.
In the model state prediction phase for GIS partial discharge detection, $t_{occ}$ is defined as the trigger time at which visual features (e.g., the AprilTag marker) are occluded by GIS equipment structures (e.g., busbars, insulators), and $T$ is the time interval from feature occlusion to its reappearance within the field of view. Since GIS detection requires continuous, precise contact between the ultrasonic sensor and the measurement point, model prediction must maintain servo control continuity during feature occlusion. Thus, the prediction interval is defined as $T_p = [t_{occ}, t_{occ} + T]$. Assuming $H$ predictions occur within the $T_p$ interval and the time interval $\tau$ between adjacent predictions is constant, the relationship between $T_p$, $H$, and $\tau$ is expressed as follows:
$\tau = \frac{T_p}{H},$
This relationship ensures that during feature occlusion, the model maintains sufficient prediction density to preserve the robotic arm’s pose control accuracy for GIS measurement points, preventing sensor contact deviations exceeding limits due to visual interruptions. Additionally, assuming the camera’s spatial velocity v remains constant over the time interval τ, as shown in the following formula:
$v(t) = v(t_k) = \mathrm{const}, \quad t \in \left[ t_k, t_{k+1} \right], \quad k = 1, \dots, H,$
Therefore, based on the image feature state $s(t)$, the estimated image feature state at time $t_k$ is:
$s(t_k) = s(t_{k-1}) + \tau\, L_s\!\left( s(t_{k-1}), Z_m(t_{k-1}) \right) v(t_{k-1}),$
In GIS partial discharge detection scenarios, Equation (15) is applied as follows: when the relative pose between the camera and the GIS equipment (including measurement points) and the equipment model parameters are difficult to measure precisely, the image feature state is predicted within the time interval $T_p = [t_{occ}, t_{occ} + T]$. The prediction accuracy is determined by the initial model feature $s_m(t_{occ})$ and the depth-related parameter $Z_m(t_{occ})$ at time $t_{occ}$.
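A minimal sketch of the occlusion-bridging prediction of Equations (13)–(15) follows; the `interaction_matrix` callback standing in for $L_s(s, Z_m)$ and the function signature are assumptions for illustration.

```python
import numpy as np

def predict_features(s0, Z_m, v, T_p, H, interaction_matrix):
    """Roll the feature state forward over an occlusion window T_p.

    tau = T_p / H (Equation (13)); v is held constant over each step
    (Equation (14)); each step applies s <- s + tau * L_s(s, Z_m) v
    (Equation (15)).
    """
    tau = T_p / H
    s = np.asarray(s0, dtype=float)
    trajectory = [s.copy()]
    for _ in range(H):
        L_s = interaction_matrix(s, Z_m)  # (10, 6) image interaction matrix
        s = s + tau * (L_s @ v)
        trajectory.append(s.copy())
    return trajectory
```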
During the model calibration phase, when the actual image feature $s(t)$ is measurable, the model error can be corrected using the following framework so that the model state $s_m(t)$ approximates the true feature state:
$s_m(t_k) = s(t_k), \qquad {}^{C}X_{mi}(t_k) = u_{mi}(t_k)\, {}^{C}Z_m(t_k), \qquad {}^{C}Y_{mi}(t_k) = v_{mi}(t_k)\, {}^{C}Z_m(t_k),$
where $u_{mi}(t_k)$ and $v_{mi}(t_k)$ are the model pixel coordinates in the x and y directions of the $i$-th feature point at time $t_k$, and ${}^{C}X_{mi}(t_k)$ and ${}^{C}Y_{mi}(t_k)$ are the x- and y-direction Cartesian coordinates of feature point $i$ in the camera coordinate system at time $t_k$.
For GIS detection scenarios, this technique faces a critical issue: during the prediction period, the actual depth information $Z(t)$ of the GIS device is unmeasurable owing to factors such as the enclosed cavity structure and depth sensor failure under strong electromagnetic interference, and the correction process does not compensate for depth errors. This produces deviations between the actual spatial coordinates $x(t_k)$, $y(t_k)$ and the model states ${}^{C}X_{mi}(t_k)$, ${}^{C}Y_{mi}(t_k)$; consequently, the actual image feature $s(t_k)$ and the model-predicted image feature $s_m(t_k)$ exhibit errors within $T_p$, affecting the ultrasonic sensor's contact accuracy with the GIS measurement point. To address depth measurement under interference, we propose a visual-ultrasonic fusion depth compensation solution: ① hardware integration: coaxially mount the ultrasonic sensor (measurement accuracy ±0.2 mm) and the camera at the end of the robotic arm to synchronously capture AprilTag visual features and ultrasonic distance data; ② fusion model: establish a mapping between visual feature scale and depth, $Z = k \times (s_{ref} / s)$, where $k$ is the calibration coefficient (obtained through standard-distance calibration), $s_{ref}$ is the pixel distance from the AprilTag center to a corner at the 100 mm standard distance, and $s$ is the real-time measured pixel distance.
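The fusion model reduces to a one-line computation; the sketch below assumes $k$ has been calibrated so that the function returns the 100 mm standard distance when the marker appears at its reference scale.

```python
def fused_depth(s_pixels, s_ref_pixels, k=100.0):
    """Depth estimate Z = k * (s_ref / s) in mm.

    With k calibrated at the 100 mm standard distance, s == s_ref
    yields Z = 100 mm; the coaxial ultrasonic reading (+/-0.2 mm)
    can be used both to calibrate k and to cross-check Z online.
    """
    return k * (s_ref_pixels / s_pixels)
```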
During the continuous model change phase, while model state prediction and correction can resolve image feature recovery, GIS detection demands stringent sensor-to-point alignment accuracy. When strong electromagnetic interference or GIS equipment structures (busbars, insulators, etc.) trigger external constraints, the visual servo system is prone to abrupt motion changes, causing sensor alignment deviations to exceed limits. Ensuring detection task continuity under GIS multi-constraint events is therefore essential.
Substituting Equation (12) into Equation (15) derives the relationship between pose error in the prediction space and control gain during GIS contact point alignment:
$e(t_k) = e(t_{k-1}) + \tau \lambda^{2} L_s\!\left( s(t_{k-1}), Z_m(t_{k-1}) \right) L_e^{+}\, e(t_{k-1}),$
Here $e(t_k)$ is the visual feature error at time $k$ (dimension 10 × 1, in pixels), calculated as the difference between real-time and desired features; $e(t_{k-1})$ is the visual feature error at time step $k-1$ (dimension 10 × 1, in pixels), the error value from the previous step; $\tau$ is the model prediction step size (in ms, set to 20 ms to align with the sampling period $\Delta t$); $\lambda^{2}$ is the square of the adaptive gain $\lambda$ (dimensionless), which strengthens the error convergence trend; $L_s$ is the image interaction matrix (dimension 10 × 6, dimensionless), determined by the camera intrinsic parameters (focal length, principal point coordinates) and feature point depth; $Z_m(t_{k-1})$ is the model-predicted feature point depth at time $k-1$ (in mm); and $L_e^{+}$ is the pseudoinverse of the composite Jacobian matrix (dimension 6 × 10, dimensionless), used to convert image-space errors into end-effector velocity.
Drawing on the concept of adaptive gain [17,18], we analogously derive an adaptive controller for continuous model changes tailored to GIS detection:
$\lambda(x) = \left( \lambda_0 - \lambda_{\|x\|_1} \right) e^{-\frac{\lambda_0'}{\lambda_0 - \lambda_{\|x\|_1}}\, x} + \lambda_{\|x\|_1},$
where $x = \|x\|_1 = \sum_i |x_i|$ is the sum of the absolute values of the non-zero elements of the pose error vector; $x_i$ denotes the non-zero elements of the vector; $\lambda_0$ is the initial gain under "no electromagnetic interference and clear features"; $\lambda_{\|x\|_1}$ is the asymptotic gain as the error norm grows large; and $\lambda_0'$ is the slope of $\lambda(x)$ at $x = 0$. Through adaptive adjustment of the gain coefficient, the controller interacts with the GIS measurement point pose error: when the initial pose error is large (e.g., strong electromagnetic interference blurs the visual marker, with errors exceeding 1 mm), the gain $\lambda$ takes a large value to ensure the continuity of the detection robotic arm's motion and rapid approach to the measurement point; when the error is small and the pose approaches the target, the gain adaptively decreases to ensure stable terminal convergence, reduce arm jitter near the measurement point, and prevent scratching of GIS insulators.
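A minimal numeric sketch of this gain law, as reconstructed above, is given below using the parameter values reported in Section 4 ($\lambda_0 = 1.0$, $\lambda_{\|x\|_1} = 0.4$, $\lambda_0' = 0.2$); the function name and the NumPy vector input are assumptions for illustration.

```python
import numpy as np

def adaptive_gain(e, lam0=1.0, lam_inf=0.4, slope0=0.2):
    """Adaptive gain lambda(x) with x = ||e||_1 (Section 4 parameters).

    lam0     -> lambda_0, gain under clear features / no interference
    lam_inf  -> lambda_{||x||_1}, asymptotic gain for large errors
    slope0   -> lambda_0', slope of the gain curve at x = 0 (1/mm)
    """
    x = np.sum(np.abs(e))  # L1 norm of the pose error vector
    return (lam0 - lam_inf) * np.exp(-(slope0 / (lam0 - lam_inf)) * x) + lam_inf
```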
Equations (11) and (12) further yield the velocity expression for the GIS inspection robotic arm’s adaptive controller:
$\dot{v} = \pm \lambda(x)\, L_e^{+}\, e(t),$
The camera-space velocity from Equation (19) must be converted to the joint target velocity of the GIS inspection robotic arm. Combined with motor encoder data and gear ratios, this generates pulse commands. This conversion process must account for GIS-specific constraints, such as confined equipment spacing and obstacle-avoidance for busbars/insulators, achieved through multi-layer processing:
$\dot{q} = W_{sf} \cdot W_{sg} \cdot J_{cf}^{+} \cdot H_{cf} \cdot W_{st} \cdot \dot{v},$
where $W_{st}$ is the saturation constraint vector storing the maximum permissible speeds for GIS detection: it holds the maximum translational speed $V_T$ and maximum rotational speed $V_r$ for the visual servo, and the translational and rotational speed of each degree of freedom $V_c$ must not exceed $V_T$ and $V_r$, respectively. If exceeded, the component is clipped to the limit: $\dot{q}_{max} = (F V_T, F V_T, F V_T, F V_r, F V_r, F V_r)$, where $F = \pm 1$ is the direction factor. $H_{cf}$ is the velocity transformation matrix converting the camera velocity relative to its current pose into the camera velocity relative to the GIS device's world coordinate system, where the GIS device is a fixed target and the world coordinate system is anchored to its cavity reference. $J_{cf}^{+}$ is the inverse Jacobian of the camera coordinate system relative to the world coordinate system; based on LU decomposition and the kinematic differential equations, combined with the joint structure of the GIS detection robotic arm and its obstacle avoidance constraints, it converts spatial velocity into joint velocity. $W_{sg}$ handles singularities: when the arm's pose approaches a singular configuration caused by proximity to GIS structures (e.g., busbars), $W_{sg} = 0$; otherwise it equals 1. $W_{sf}$ denotes GIS workspace prediction; the following equation determines whether the arm's end effector (the ultrasonic sensor) would leave the GIS tolerance at velocity $\dot{v}$:
$\exp(\dot{v}) = {}^{c(t)}M_{c(t + \Delta t)},$
where $\Delta t$ is the control cycle and ${}^{c(t)}M_{c(t + \Delta t)}$ is the spatial displacement corresponding to the camera's spatial velocity $\dot{v}$. If the predicted displacement exceeds the GIS workspace, the robotic arm halts to prevent collision with the equipment.
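Putting Equations (20) and (21) together, the following hedged sketch shows one plausible ordering of the conversion layers; `H_cf`, `J_pinv`, and the two predicate callbacks are placeholders for the quantities described above, not the authors' implementation.

```python
import numpy as np

def joint_command(v_cam, H_cf, J_pinv, v_t_max, v_r_max,
                  near_singularity, leaves_workspace, dt=0.020):
    """Camera-space velocity -> saturated, checked joint velocities."""
    v = np.array(v_cam, dtype=float)            # copy; don't mutate caller
    # W_st: clamp translational (first 3) and rotational (last 3) parts.
    v[:3] = np.clip(v[:3], -v_t_max, v_t_max)
    v[3:] = np.clip(v[3:], -v_r_max, v_r_max)
    v_world = H_cf @ v                          # H_cf: camera -> world frame
    q_dot = J_pinv @ v_world                    # J_cf^+: twist -> joint rates
    # W_sg / W_sf: zero the command near singularities or if the predicted
    # displacement over one control cycle would leave the GIS workspace.
    if near_singularity(q_dot) or leaves_workspace(v_world, dt):
        return np.zeros_like(q_dot)
    return q_dot
```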

4. Experimental Results and Analysis

4.1. Establishment of an Experimental Platform for GIS Partial Discharge Detection

To validate the effectiveness of the proposed model predictive control-based adaptive visual servo algorithm in GIS partial discharge detection scenarios, an experimental platform simulating a GIS detection environment was constructed, as shown in Figure 3. The core configuration and scenario-adaptive design are as follows:
Actuator: The Franka Panda seven-axis collaborative robot was selected to simulate the GIS inspection robot body. Its high-rigidity joints (repeatability accuracy ± 0.1 mm) meet the “deviation ≤ 2 mm” requirement for GIS measurement point positioning specified in DL/T 555-2021. It also supports end-effector force control, providing the hardware foundation for subsequent sensor contact force regulation.
Visual Perception Module: the robotic arm's end effector carries an eye-in-hand Realsense SR300 camera, with algorithms incorporating 50 Hz power-frequency magnetic field noise filtering logic for substations. AprilTag 36h11 visual markers are affixed to the simulated GIS equipment surfaces to address feature extraction on textureless GIS surfaces; marker locations correspond to high-risk partial discharge detection points (e.g., flanges of bowl-type insulators). System calibration is a critical step in ensuring GIS positioning accuracy and comprises three steps: ① camera intrinsic calibration: using Zhang's chessboard method, post-calibration lens distortion error is <0.5 pixels, guaranteeing feature extraction precision; ② robotic arm motion calibration: correcting joint zero-position errors with a laser tracker improves end-effector positioning accuracy from ±0.3 mm to ±0.1 mm; ③ hand-eye calibration: using the nine-point method, visual and robotic-arm position data from 9 known coordinate points are collected to compute the hand-eye transformation matrix, achieving a post-calibration transformation error <0.2 mm. Calibration verification: the single-frame feature extraction error fell from ±1.2 pixels before calibration to ±0.3 pixels afterward, laying the foundation for millimeter-level positioning.
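For step ③, the paper uses a nine-point method; as a point of reference, the sketch below shows how the same camera-to-gripper transform could be computed with OpenCV's standard hand-eye solver, assuming lists of robot poses (from forward kinematics) and marker poses (from AprilTag/chessboard detection) are already available.

```python
import cv2
import numpy as np

def hand_eye_transform(R_g2b, t_g2b, R_t2c, t_t2c):
    """Camera->gripper transform from paired pose observations.

    R_g2b, t_g2b: gripper-to-base rotations/translations per robot pose
    R_t2c, t_t2c: target-to-camera rotations/translations per detection
    """
    R_c2g, t_c2g = cv2.calibrateHandEye(
        R_gripper2base=R_g2b, t_gripper2base=t_g2b,
        R_target2cam=R_t2c, t_target2cam=t_t2c,
        method=cv2.CALIB_HAND_EYE_TSAI)
    T = np.eye(4)  # homogeneous 4x4 camera->gripper transform
    T[:3, :3], T[:3, 3] = R_c2g, t_c2g.ravel()
    return T
```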
Control and Communication: the image processing PC runs Ubuntu 16.04 with the PREEMPT_RT real-time kernel, paired with an Intel I350-T2 V2 network card. This configuration meets the GIS inspection requirement of "real-time pose adjustment ≥ 30 Hz", preventing measurement point misalignment caused by data transmission delays under electromagnetic interference.

4.2. Experimental Design and Results Analysis

The non-standard symbols used in this section are defined as follows to ensure consistent notation: $\lambda_0$: initial gain, dimensionless, set to 1.0, representing the gain under "no electromagnetic interference and clear features", used for rapid convergence; $\lambda_{\|x\|_1}$: error-approaching gain, dimensionless, set to 0.4, corresponding to the gain when the feature error is < 0.5 mm, used to suppress oscillations; $\lambda_0'$: slope of the gain curve at $x = 0$, in 1/mm, set to 0.2, controlling the rate of gain change with error; $T_p$: model prediction time window, in ms, set to 100 ms to cover five sampling cycles and ensure prediction continuity; $\|x\|_1$: L1 norm of the error vector, in mm, computed as the sum of the absolute values of the error elements to quantify the total error magnitude.
In a laboratory setting, a typical GIS substation layout was replicated: structures including simulated 400 mm-diameter GIS busbars and 250 mm-diameter insulators were installed to construct a “textureless + strong electromagnetic interference” testing environment. The experimental task involved: a robotic arm driving an ultrasonic sensor equivalent tip to simulate a testing probe, precisely aligning with GIS measurement point markers. System Continuous Operation Test: In a laboratory simulation of a substation’s 3-day inspection cycle (72 h), continuous positioning detection was performed on 12 GIS measurement points. No hardware failures occurred during the test, with positioning accuracy consistently maintained at ±0.8 mm and feature point tracking success rate exceeding 98%, meeting the “continuous operation” requirements for substations. Two comparative schemes were set up, each undergoing 50 repeated experiments to eliminate random errors:
Control group (fixed gain): a traditional fixed-gain IBVS algorithm with gain $\lambda = 0.125$, simulating the control strategy commonly used in existing GIS inspection robots. Experimental group (adaptive gain): the proposed algorithm, with the gain expression of Equation (18). Parameters were configured as follows: $\lambda_0 = 1$ for rapid convergence to the measurement point when the initial error is large; $\lambda_{\|x\|_1} = 0.4$ to reduce the gain near the measurement point and prevent oscillations; $\lambda_0' = 0.2$ to balance convergence speed and contact stability. All parameter values were chosen under the GIS inspection principle of "efficiency first, safety protected".
Experimental results are as follows:
The convergence iteration counts for both algorithm groups are compared in Figure 4 (horizontal axis: number of experiments; vertical axis: iterations required for “sensor contact with measurement point”). Data indicates:
(a) Fixed-gain group: convergence requires ~550 iterations on average. At the 20 ms experimental control cycle, a single contact-point alignment takes 11 s; over a single GIS bay (12 measurement points), the total alignment time is approximately 132 s. Combined with equipment obstacle avoidance and other processes, the single-bay detection time exceeds 2 h, failing to meet the operational efficiency requirement of daily inspection (20+ bays per station).
(b) Adaptive-gain group: convergence requires ~250 iterations on average, with a single alignment taking only 5 s. The detection time per bay can be compressed to within 60 s, an efficiency improvement of over 50% that fully meets the high-efficiency requirements of GIS detection.
The convergence characteristics of the robotic arm’s end-effector speed at different gains are shown in Figure 5. Combined with the GIS inspection requirement analysis for “smooth sensor contact and avoiding damage to insulating components”:
Fixed-gain group: end-effector speed decays slowly. During initial servo operation, unaccounted GIS obstacle-avoidance constraints (e.g., simulated insulator protrusions) cause ±15% speed oscillations. These oscillations produce excessive force when the sensor contacts the GIS surface (maximum measured contact force: 1.2 N), potentially scratching the glaze of bowl-type insulators, and the feature-point shifts they induce reduce the signal-to-noise ratio of partial discharge signal acquisition (measured SNR decrease: 18%).
Adaptive-gain group: end-effector velocity decays rapidly after servo activation, with oscillation amplitude controlled within ±5%. After 250 iterations, the velocity converges to 0, corresponding to a stable sensor contact force of 0.5–0.6 N. No collisions with simulated GIS structures occurred during contact, validating the algorithm's interference resistance and environmental adaptability under complex GIS constraints.
To more intuitively demonstrate the performance advantages of the proposed algorithm, this paper statistically analyzed key metrics for both sets of algorithms across 50 repeated experiments. The results are shown in Table 1.
For comparison with other adaptive methods, all data were measured under identical experimental conditions (90 μT electromagnetic interference, 1.5 kW load) to ensure fairness. The results are shown in Table 2:
Based on the quantitative data in Table 1 and the convergence curves in Figure 4 and Figure 5, the performance advantages of the proposed algorithm are analyzed as follows:
The adaptive visual servo algorithm proposed in this paper demonstrates significant advantages in convergence efficiency, motion smoothness, and system robustness. Compared to fixed-gain strategies, this algorithm dynamically adjusts gains to substantially enhance convergence speed and effectively suppress end-point oscillations while maintaining positioning accuracy. Furthermore, under strong electromagnetic interference, the algorithm demonstrates robust adaptive interference resistance, significantly reducing feature tracking loss rates. This ensures high reliability and engineering applicability for GIS inspection tasks in complex industrial environments.
Experiments adapted to GIS partial discharge detection scenarios demonstrate that the proposed adaptive visual servo algorithm improves contact-point alignment efficiency by over 50% compared to traditional fixed-gain algorithms while maintaining positioning accuracy (±0.8 mm), keeping terminal velocity oscillations below 5%, and holding contact force within the standard range. This effectively addresses the core challenges in GIS detection (difficult texture feature extraction, weak resistance to strong electromagnetic interference, and difficult contact force control), providing experimental support for the engineering application of GIS detection robots.

4.3. Economic Feasibility Analysis

For the large-scale engineering promotion of the proposed system, economic feasibility is a key evaluation criterion. This section conducts a streamlined cost–benefit analysis based on operational data from a typical 110 kV substation (20 bays), with all monetary values presented in USD to ensure broader applicability. Core indicators are summarized in Table 3, focusing on key cost-saving and investment recovery metrics to avoid redundant details.
The total one-time hardware cost (including Franka Panda robotic arm, vision module, and control system) is approximately $11,000, with an annual depreciation cost of $2200 over a 5-year service life (industry-standard for automation equipment).
Traditional manual inspection requires two personnel per bay (2 h duration) with a unit cost of $110 per bay. The proposed system reduces this to $20 per bay (mainly covering electricity and sensor calibration), resulting in an annual cost saving of $1800 for a 20-bay substation.
With annual net savings of $1800, the system’s payback period is approximately 1.8 years, demonstrating significant economic attractiveness for substation operation and maintenance departments.
This analysis confirms that the proposed system not only achieves superior technical performance but also offers remarkable cost-effectiveness. The streamlined cost structure and rapid payback period make it a financially viable solution for unmanned substation construction, complementing its technical advantages to lay a solid foundation for engineering applications.

5. Conclusions

Addressing core challenges in visual servo systems for partial discharge detection in GIS (Gas-Insulated Switchgear) scenarios—including difficulty in extracting features from textureless surfaces, vulnerability to strong electromagnetic interference in substations, and complexity in controlling ultrasonic sensor contact force—this paper proposes an adaptive gain algorithm for visual servo systems tailored to this context. Leveraging model predictive control’s ability to adapt to complex nonlinear systems while balancing multivariable and constrained optimization, the algorithm demonstrates enhanced performance.
Through comparative simulations of GIS inspection environments, the proposed algorithm demonstrates significantly improved convergence efficiency at sensor contact points compared to traditional fixed-gain methods. It effectively suppresses end-effector velocity oscillations, maintains contact force stability within industry-standard ranges, and achieves positioning accuracy meeting GIS partial discharge detection requirements. These enhancements substantially improve the system’s dynamic response and interference resistance.
It should be objectively noted that the current algorithm still has room for optimization. For instance, model prediction errors caused by the unmeasurable depth information of GIS equipment under strong electromagnetic interference have not been completely eliminated, and adaptability to extreme operating conditions, such as oil contamination and dust in substations, also requires enhancement. Future research will focus on multi-sensor data fusion to compensate for these errors and on integrating GIS equipment models to optimize the gain control logic, further enhancing the algorithm's engineering applicability. This direction will also be a core research focus for our laboratory.

Author Contributions

Conceptualization, Y.L.; Methodology, Y.L.; Software, Y.L. and Z.Z.; Validation, Y.L.; Formal analysis, Y.L. and Y.X.; Investigation, Y.L., Z.Z. and Y.X.; Resources, Y.L.; Data curation, Y.L. and Z.Z.; Writing—original draft, Y.L.; Writing—review & editing, Z.Z.; Visualization, Y.X.; Supervision, Y.X.; Project administration, Y.X.; Funding acquisition, Y.X. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Characteristic Innovation Project of Ordinary Universities in Guangdong Province (Grant No. 2025KTSCX232), the project “Research on Key Technologies of Visual Servo Robots Based on 3D Image Information” (Grant No. 53-K0224003), and the project “Multi-objective Optimization Research of Microgrids Under Stability Constraints” (Grant No. 55-K0225002).

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Khan, Q.; Refaat, S.S.; Abu-Rub, H.; Toliyat, H.A. Partial discharge detection and diagnosis in gas insulated switchgear: State of the art. IEEE Electr. Insul. Mag. 2019, 35, 16–33.
2. Huang, S.; Wu, Z.; Ren, Z.; Liu, H.; Gui, Y. Review of electric power intelligent inspection robot. Electr. Meas. Instrum. 2020, 57, 26–38.
3. Chaumette, F.; Hutchinson, S. Visual servo control. I. Basic approaches. IEEE Robot. Autom. Mag. 2006, 13, 82–90.
4. Burlacu, A.; Condurache, D. A different approach to solving the PBVS control problem. In Proceedings of the IEEE 29th International Symposium on Industrial Electronics (ISIE), Delft, The Netherlands, 17–19 June 2020; pp. 1359–1364.
5. Olson, E. AprilTag: A robust and flexible visual fiducial system. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Shanghai, China, 9–13 May 2011; pp. 3400–3407.
6. Luan, Y.; Li, J.; Li, C.; Huang, R.; Lv, J. Development and application of partial discharge detection robot for high voltage switchgear. Electr. Power 2019, 52, 169–176.
7. Wei, X.; Teng, Y.; Liu, Z.; Deng, J.; Jia, Y. Application research of the partial discharge automatic detection device and diagnostic method based on the ultrasonic in long distance GIL equipment. J. Phys. Conf. Ser. 2019, 1213, 052088.
8. Yang, R.; Wang, Y.; Tao, M.; Dong, E.; Zhu, T.; Chen, Z.; Yang, W. Research on visual navigation manipulator for GIS ultrasonic partial discharge detection. In Proceedings of the IEEE International Conference on Mechatronics and Automation (ICMA), Beijing, China, 13–16 October 2020; pp. 437–442.
9. Ji, X. Standardized Operation of Live Inspection for GIS Equipment; China Electric Power Press: Beijing, China, 2017.
10. Li, S.; Li, D.; Zhang, C.; Wan, J.; Xie, M. RGB-D image processing algorithm for target recognition and pose estimation of visual servo system. Sensors 2020, 20, 430.
11. Capolei, M.C.; Andersen, N.A.; Lund, H.H.; Falotico, E.; Tolu, S. A cerebellar internal models control architecture for online sensorimotor adaptation of a humanoid robot acting in a dynamic environment. IEEE Robot. Autom. Lett. 2020, 5, 80–87.
12. Han, N.; Ren, X.; Zheng, D. Visual servoing control of robotics with a neural network estimator based on spectral adaptive law. IEEE Trans. Ind. Electron. 2023, 70, 12586–12595.
13. Hao, T.; Xu, D.; Qin, F. Image-based visual servoing for position alignment with orthogonal binocular vision. IEEE Trans. Instrum. Meas. 2023, 72, 1–10.
14. Ma, M.; Qian, R.; Wang, W. Vision-based adaptive tracking control of a mobile robot: Algorithms and experimental validation. In Proceedings of the 2022 34th Chinese Control and Decision Conference (CCDC), Hefei, China, 15–17 August 2022; pp. 5732–5737.
15. Hu, K.; Tao, J.; Wu, G.; Zeng, P. A study of automatic positioning control based on vision servo control. In Proceedings of the 2023 35th Chinese Control and Decision Conference (CCDC), Yichang, China, 20–22 May 2023; pp. 4389–4393.
16. Fried, J.; Lizarralde, F.; Leite, A.C. Adaptive image-based visual servoing with time-varying learning rates for uncertain robot manipulators. In Proceedings of the 2022 American Control Conference (ACC), Atlanta, GA, USA, 8–10 June 2022; pp. 3838–3843.
17. DL/T 555-2021; Standardized Operation for Live-Line Inspection of GIS Equipment. China Electricity Council (CEC): Beijing, China, 2021.
18. Obana, Y.; Huang, Q. Study about variable adjustment rule of AC servo motor using simple adaptive control. In Proceedings of the 2018 15th International Conference on Control, Automation, Robotics and Vision (ICARCV), Singapore, 18–21 November 2018; pp. 644–649.
19. Chen, J.; Jia, B.; Zhang, K. Trifocal tensor-based adaptive visual trajectory tracking control of mobile robots. IEEE Trans. Cybern. 2017, 47, 3784–3798.
20. Shao, Z.; Zhang, J. Vision-based adaptive trajectory tracking control of wheeled mobile robot with unknown translational external parameters. IEEE ASME Trans. Mechatron. 2024, 29, 358–365.
21. Taylor, G.; Kleeman, L. Hybrid position-based visual servoing with online calibration for a humanoid robot. In Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Sendai, Japan, 28 September–2 October 2004; pp. 686–691.
22. Cai, C.; Dean-Leon, E.; Mendoza, D.; Somani, N.; Knoll, A. Uncalibrated 3D stereo image-based dynamic visual servoing for robot manipulators. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Tokyo, Japan, 3–7 November 2013; pp. 63–70.
23. Liang, X.; Wang, H.; Liu, Y.H.; You, B.; Liu, Z.; Jing, Z.; Chen, W. Fully uncalibrated image-based visual servoing of 2DOFs planar manipulators with a fixed camera. IEEE Trans. Cybern. 2022, 52, 10895–10908.
24. Kermorgant, O.; Chaumette, F. Dealing with constraints in sensor-based robot control. IEEE Trans. Robot. 2013, 30, 244–257.
25. Cheng, M.; Li, D.; Zhou, N.; Tang, H.; Wang, G.; Li, S.; Bhatti, U.A.; Khan, M.K. Vision-motion codesign for low-level trajectory generation in visual servoing systems. IEEE Trans. Instrum. Meas. 2023, 72, 1–14.
26. Han, L.; Zhang, Y.; Wang, H. Hybrid adaptive vision-force control under the bottleneck constraint. IEEE Trans. Control Syst. Technol. 2023, 31, 382–393.
27. Ilyushin, Y.V.; Novozhilov, I.M. Software implementation of a pulse regulator of a distributed control object. In Proceedings of the IEEE II International Conference on Control in Technical Systems (CTS), St. Petersburg, Russia, 25–27 October 2017; pp. 315–317.
Figure 1. Visual servo control architecture for GIS partial discharge detection.
Figure 2. Adaptive gain control process based on model predictive control.
Figure 3. Experimental platform of visual servo based on a GIS partial discharge detection scenario.
Figure 4. Convergence of the two servo algorithms in the GIS measuring-point alignment task.
Figure 5. Comparison of manipulator end-speed convergence under different gains. Standard deviation of convergence time over 50 experiments: fixed-gain group ±1.2 s; adaptive-gain group ±0.5 s.
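To make the gain-adaptation loop summarized in Figure 2 concrete, the Python sketch below illustrates one minimal way to realize MPC-style gain selection for an image-based visual servo: predict the feature-error decay over a short horizon for each candidate gain, then pick the gain that best trades residual error against commanded end-effector speed. Every name, the error model, the candidate grid, and the weight are illustrative assumptions rather than the authors' implementation; a full controller would fold additional sensor feedback into the cost.

```python
import numpy as np

def predicted_decay(lam, dt, n_steps):
    """Relative feature error remaining after n_steps of e_dot = -lam * e."""
    return (1.0 - lam * dt) ** n_steps

def select_gain(e, dt=0.02, horizon=10,
                gains=np.linspace(0.1, 2.0, 20), w_speed=1e-4):
    """Grid-search, MPC-style gain selection: minimize the normalized
    predicted residual error plus a penalty on commanded speed (~ lam*|e|)."""
    err_norm = np.linalg.norm(e)
    costs = [predicted_decay(lam, dt, horizon) ** 2
             + w_speed * (lam * err_norm) ** 2 for lam in gains]
    return gains[int(np.argmin(costs))]

# Far from the target the speed penalty dominates, so a lower gain bounds
# end-effector velocity; near the target a higher gain speeds convergence.
print(select_gain(np.array([40.0, -25.0])))  # large pixel error -> ~0.7
print(select_gain(np.array([2.0, -1.0])))    # small pixel error -> ~2.0
```

This reproduces the qualitative behavior reported for the adaptive controller, slowing the arm while the error is large and tightening the loop near convergence, but it is a sketch under the stated assumptions, not the paper's optimizer.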
Table 1. Performance comparison of two visual servoing algorithms in GIS point alignment tasks. Performance improvement is computed as: Performance Improvement = (Fixed-Gain Group Value − Adaptive-Gain Group Value) / Fixed-Gain Group Value × 100%.

Performance Metric | Fixed-Gain Group (Control) | Adaptive-Gain Group (Experimental) | Performance Enhancement
Average number of iterations to convergence | 550 | 250 | 54.5%
Single-point alignment time (s) | 11 | 5 | 54.5%
End-speed oscillation amplitude (%) | ±15 | ±5 | 66.7%
Maximum contact force (N) | 1.2 | 0.6 | 50.0%
Positioning accuracy (mm) | ±1.5 | ±0.8 | 46.7%
Feature point tracking loss rate (%) | >30 | <10 | >66.7%
Standard deviation of single positioning error (mm) | ±0.3 | ±0.1 | –
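For readers who want to reproduce the caption's arithmetic, the short Python snippet below recomputes the Performance Enhancement column from the fixed-gain and adaptive-gain values in Table 1 (signed tolerances such as ±15 are compared by magnitude); it is a sanity check only, not part of the experimental pipeline.

```python
# Recompute the "Performance Enhancement" column of Table 1 using the
# caption's formula: (fixed - adaptive) / fixed * 100.
metrics = {
    "Iterations to convergence": (550, 250),         # -> 54.5%
    "Single-point alignment time (s)": (11, 5),      # -> 54.5%
    "End-speed oscillation amplitude (%)": (15, 5),  # -> 66.7% (|±15|, |±5|)
    "Maximum contact force (N)": (1.2, 0.6),         # -> 50.0%
    "Positioning accuracy (mm)": (1.5, 0.8),         # -> 46.7% (|±1.5|, |±0.8|)
}
for name, (fixed, adaptive) in metrics.items():
    print(f"{name}: {(fixed - adaptive) / fixed * 100:.1f}%")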
Table 2. Performance comparison of different adaptive visual servoing methods.

Control Method | Convergence Time (s) | Positioning Accuracy (mm) | Tracking Success Rate Under Electromagnetic Interference (%) | Single-Frame Computation Time (ms)
Fixed-gain IBVS (baseline in this work) | 11 | ±1.5 | 70 | 10
Neural network adaptive [12] | 8 | ±1.2 | 85 | 45
Sliding-mode adaptive [23] | 9 | ±1.1 | 88 | 35
MPC-based adaptive (this paper) | 5 | ±0.8 | 92 | 17
Table 3. Cost–benefit comparison summary.

Item | Manual Inspection | Proposed System | Key Advantage
Total hardware investment | – | $11,000 | One-time, reasonable investment
Unit bay inspection cost | $110 | $20 | 82% cost reduction
Annual maintenance cost | $730 | $170 | 77% maintenance savings
Investment payback period | – | 1.8 years | Rapid cost recovery
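The percentage advantages in Table 3 follow directly from the listed costs, while reproducing the 1.8-year payback also requires an annual inspection volume that the table does not state. The sketch below therefore treats BAYS_PER_YEAR = 60 as an explicitly hypothetical assumption under which all of the table's figures are mutually consistent.

```python
# Back-of-envelope consistency check for Table 3. BAYS_PER_YEAR is a
# hypothetical workload (not given in the paper), chosen to show how the
# stated 1.8-year payback can arise from the listed unit costs.
HARDWARE_INVESTMENT = 11_000  # USD, one-time
MANUAL_COST_PER_BAY = 110     # USD
ROBOT_COST_PER_BAY = 20       # USD
MANUAL_MAINTENANCE = 730      # USD/year
ROBOT_MAINTENANCE = 170       # USD/year
BAYS_PER_YEAR = 60            # hypothetical assumption

inspection_saving = 1 - ROBOT_COST_PER_BAY / MANUAL_COST_PER_BAY  # ~0.82
maintenance_saving = 1 - ROBOT_MAINTENANCE / MANUAL_MAINTENANCE   # ~0.77
annual_savings = (BAYS_PER_YEAR * (MANUAL_COST_PER_BAY - ROBOT_COST_PER_BAY)
                  + (MANUAL_MAINTENANCE - ROBOT_MAINTENANCE))
payback_years = HARDWARE_INVESTMENT / annual_savings              # ~1.8

print(f"Inspection cost reduction: {inspection_saving:.0%}")
print(f"Maintenance savings:       {maintenance_saving:.0%}")
print(f"Payback period:            {payback_years:.1f} years")
```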
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
