1. Introduction
At present, the world is at a critical stage of energy transition, and new power systems are developing rapidly. Against this backdrop, power cables, as a core medium for power transmission and distribution, play an irreplaceable role. Driven by both the energy transition and the development of new power systems, market demand for power cables is growing steadily, and the scale of the industry continues to expand. Throughout their entire lifecycle, however, power cables remain vulnerable to a variety of adverse factors. Manufacturing deviations during production may introduce latent defects, while prolonged exposure to combined stresses during installation and operation (such as mechanical forces, elevated temperatures, and high humidity) can further precipitate localized insulation degradation and conductor contact failures [1,2]. If such localized defects are not detected early and repaired in a targeted manner, their deterioration accelerates over time, eventually exceeding the insulation tolerance limit and evolving into permanent failures. The resulting unplanned outages cause regional power disruptions and direct economic losses, and may also give rise to secondary safety hazards that threaten social well-being and public safety. Strengthening defect detection across the entire lifecycle of power cables therefore holds significant engineering and practical value [3,4].
Common cable detection methods currently include insulation resistance testing (IR) [5], very-low-frequency dielectric loss testing (VLF-tan δ) [6], and withstand voltage testing [7]. IR and VLF-tan δ can only evaluate the overall condition of a cable and cannot locate minor localized defects. Withstand voltage testing directly assesses cable insulation performance but is inherently destructive: if latent defects exist in the cable, the applied high voltage may promote defect propagation or even cause insulation breakdown. Partial discharge (PD) detection is widely used, non-destructive, and highly sensitive to early-stage localized defects [8]; however, PD signal amplitudes are extremely low, making them vulnerable to interference from complex electromagnetic environments during testing, which undermines the reliability of detection results [9]. Time Domain Reflectometry (TDR) enables rapid detection and localization of localized defects [10,11], but its narrow input signal bandwidth limits its sensitivity to minor impedance variations [12].
In recent years, as power cables have evolved toward higher voltages, longer transmission distances, and more complex installation environments, traditional detection technologies have increasingly revealed limitations—such as insufficient sensitivity and weak anti-interference capability—in detecting localized defects. Consequently, scholars worldwide have conducted in-depth and sustained research on the application of Frequency Domain Reflectometry (FDR) in assessing the insulation condition of power cables. Leveraging the high responsiveness of broadband signals to localized impedance changes, FDR has emerged as a key integrated technology for detecting, locating, and identifying micro-localized defects in power cables, providing more accurate localized defect assessment for cable condition-based maintenance.
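The core idea behind FDR can be illustrated with a minimal numerical sketch: a broadband frequency sweep is applied to the cable, the complex reflection response is recorded, and an inverse transform converts it into a reflection-versus-distance profile whose peaks mark impedance discontinuities. The example below assumes an idealized lossless line with a single point defect; the propagation velocity, sweep range, and reflection coefficient are illustrative values, not taken from this paper.

```python
import numpy as np

# Illustrative parameters (assumptions, not measured values)
v = 1.7e8                           # propagation velocity in the cable, m/s
f = np.linspace(1e5, 100e6, 2048)   # swept measurement frequencies, Hz
d_defect = 250.0                    # simulated defect location, m
gamma = 0.2                         # simulated reflection coefficient at the defect

# A single defect makes the input reflection spectrum a complex exponential
# whose phase slope encodes the round-trip delay 2*d/v
S11 = gamma * np.exp(-1j * 2 * np.pi * f * 2 * d_defect / v)

# IFFT of the swept-frequency response yields a reflection-vs-distance profile
profile = np.abs(np.fft.ifft(S11))
df = f[1] - f[0]                                     # frequency step of the sweep
d_axis = np.arange(len(f)) * v / (2 * len(f) * df)   # map IFFT bins to distance

d_est = d_axis[np.argmax(profile)]
print(f"estimated defect location: {d_est:.1f} m")
```

Because the defect's round-trip delay rarely lands exactly on an IFFT bin, the peak in `profile` is smeared across neighboring bins, which is precisely the leakage problem discussed below.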
Currently, the mainstream analysis techniques in FDR are the Fast Fourier Transform (FFT) method and the Inverse Fast Fourier Transform (IFFT) method. Because cable frequency-domain response data are not integer-periodic, FFT analysis is prone to spectral leakage and the picket-fence effect. Time-domain signal reconstruction via the IFFT, in turn, faces the dual challenges of the Gibbs phenomenon and electromagnetic noise interference. These inherent flaws significantly reduce the localization accuracy and identification sensitivity for localized cable defects.
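The leakage caused by non-integer periodicity is easy to reproduce. In the sketch below, a test record containing 10.5 cycles (a non-integer number, as with cable frequency-domain data that does not span whole periods) is transformed with and without a Hann window; the fraction of spectral energy that lands outside the two bins nearest the true frequency quantifies the leakage. The record length, frequency, and leakage metric are illustrative choices, not from this paper.

```python
import numpy as np

N = 1024
n = np.arange(N)
# 10.5 cycles in the record -> non-integer periodicity
x = np.cos(2 * np.pi * 10.5 * n / N)

X_rect = np.abs(np.fft.rfft(x))                  # rectangular (no) window
X_hann = np.abs(np.fft.rfft(x * np.hanning(N)))  # Hann window

def leakage(X, k0=10):
    """Fraction of spectral magnitude outside bins k0 and k0+1,
    the two bins straddling the true frequency of 10.5 cycles."""
    main = X[k0] + X[k0 + 1]
    return 1.0 - main / X.sum()

print(f"leakage, rectangular: {leakage(X_rect):.2f}")
print(f"leakage, Hann:        {leakage(X_hann):.2f}")
```

The windowed spectrum concentrates far more energy near the true frequency, which is why window functions are a standard mitigation for leakage; the trade-off is a wider main lobe and hence slightly coarser frequency resolution.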
Furthermore, signals inherently attenuate as they propagate along the cable. The amplitude of a defect reflection diminishes progressively with increasing distance, so defect reflection peaks in the latter sections of long cables are easily obscured by background noise and difficult to identify. Conversely, for short cables, where defects may be closely spaced and their characteristic signals are subtle, algorithms must provide higher frequency-domain resolution to differentiate adjacent defects and extract fine features accurately.
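Both effects can be put in rough numerical terms. A common rule of thumb for reflectometry is that the two-way distance resolution scales as v/(2B), where B is the sweep bandwidth, while the round-trip loss of a reflection from distance d grows linearly in decibels (so the peak amplitude shrinks exponentially). The velocity and attenuation constant below are assumed, order-of-magnitude values for illustration only.

```python
# Back-of-envelope figures (assumed values, for illustration only)
v = 1.7e8  # propagation velocity, m/s

# Distance resolution ~ v / (2*B): a wider sweep bandwidth B
# separates more closely spaced reflections.
for B in (10e6, 100e6, 500e6):  # sweep bandwidths, Hz
    print(f"B = {B/1e6:6.0f} MHz -> resolution ~ {v/(2*B):5.2f} m")

# With an assumed attenuation constant alpha in dB/m, a reflection from
# distance d suffers a round-trip loss of 2*alpha*d dB, so far-end defect
# peaks shrink exponentially with distance.
alpha_db_per_m = 0.01           # assumed value at the sweep frequencies
for d in (100, 1000, 5000):     # defect distances, m
    loss_db = 2 * alpha_db_per_m * d
    print(f"d = {d:5d} m -> round-trip loss {loss_db:5.1f} dB "
          f"(amplitude x{10**(-loss_db/20):.3f})")
```

These numbers make the trade-off concrete: short cables with adjacent defects demand wide bandwidth for resolution, while long cables push the far-end reflection toward the noise floor regardless of bandwidth.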
Differences in voltage rating lead to fundamental variations in cable structural design and operational electromagnetic environment. Compared with 35 kV medium-voltage cables, 110 kV high-voltage cables have more intricate structures, such as cross-bonded grounding configurations, and operate in significantly stronger electromagnetic interference. On one hand, cross-bonded structures alter signal transmission paths, introducing more complex reflection and refraction components into the frequency-domain signal and substantially increasing the complexity of frequency-domain analysis; on the other hand, strong electromagnetic interference in high-voltage environments injects significant interference components into the signal, further compromising the accuracy of defect identification.
Given the significant differences in signal transmission characteristics, structural complexity, and electromagnetic environment among cables of varying lengths and voltage ratings, systematically analyzing the technical properties of the FFT and IFFT methods and clarifying their applicable scenarios under different operating conditions is of great engineering significance for improving the practical effectiveness of FDR. This paper therefore presents a fundamental comparative analysis of the two approaches, emphasizing their respective inherent advantages. Drawing on on-site measurement results, it further investigates how the various detection parameters vary under differing operating conditions and elucidates the testing scenarios to which each method is suited.