Article

Multi-Scale Dual Discriminator Generative Adversarial Network for Gas Leakage Detection

by Saif H. A. Al-Khazraji 1, Hafsa Iqbal 1,*, Jesús Belmar Rubio 2, Fernando García 1 and Abdulla Al-Kaff 1
1 Autonomous Mobility and Perception Laboratory (AMPL), Departamento de Ingeniería de Sistemas y Automática, Universidad Carlos III de Madrid, 28911 Madrid, Spain
2 Departamento de Física, Universidad Carlos III de Madrid, 28911 Madrid, Spain
* Author to whom correspondence should be addressed.
Electronics 2025, 14(17), 3564; https://doi.org/10.3390/electronics14173564
Submission received: 4 August 2025 / Revised: 2 September 2025 / Accepted: 3 September 2025 / Published: 8 September 2025

Abstract

Gas leakages pose significant safety risks in urban environments and industrial sectors like the Oil and Gas Industry (OGI), leading to accidents, fatalities, and economic losses. This paper introduces a novel generative AI framework, the Multi-Scale Dual Discriminator Generative Adversarial Network (MSDD-GAN), designed to detect and localize gas leaks by generating thermal images from RGB input images. The proposed method integrates three key innovations: (1) Attention-Guided Masking (AttMask) for precise gas leakage localization using saliency maps and a circular Region of Interest (ROI), enabling pixel-level validation; (2) multi-scale input processing to enhance feature learning with limited data; and (3) a dual discriminator to validate thermal image realism and leakage localization accuracy. A comprehensive dataset from laboratory and industrial environments has been collected using a FLIR thermal camera. The MSDD-GAN demonstrated robust performance by generating thermal images with gas leakage indications at a mean accuracy of 81.6%, outperforming baseline cGANs by leveraging a multi-scale generator and dual adversarial losses. By correlating ice formation in RGB images with the leakage indications in thermal images, the model addresses critical challenges of OGI applications, including data scarcity and validation reliability, offering a robust solution for continuous gas leak monitoring in pipelines.

1. Introduction

Gas leaks, emissions, and fatalities caused by exposure to gas in confined spaces are among the most common causes of accidents in urban areas with complex gas distribution networks, as well as in the Oil and Gas Industry (OGI). Additionally, leaks and losses during gas transportation contribute to significant costs and environmental issues for the industry. Detecting gas leaks is especially challenging because gases are invisible, unlike other fluids, making traditional detection methods less effective, particularly given the vast distances pipelines cover and the often harsh environments in which they operate. To address these challenges, our approach leverages widely available RGB cameras and advanced Machine Learning (ML), offering a cost-effective solution that does not require specialized thermal imaging equipment. This method can improve detection accuracy and accessibility compared to conventional techniques, which often rely on expensive sensors or fiber optic systems.
This article is proposed within the digitalization framework led by Universidad Carlos III de Madrid, Spain (UC3M), through the AMPL laboratory. It focuses on urban and industrial environments, where gas leak detection has been identified as a critical enhancement to the creation of digital twins. By integrating gas leakage detection into the digital twin process, the proposed work aims to improve monitoring and safety in these complex settings. In these environments, the detection and localization of such events is of vital importance due to the personal and material risks involved. The proposal seeks to provide a tool that leverages digitalization capabilities onboard a ground vehicle equipped with systems such as cameras and LiDAR, in order to identify events as dangerous as those mentioned here. Furthermore, this research contributes to the growing field of AI-driven environmental monitoring and industrial safety, opening new possibilities by outlining the challenges in various industrial applications beyond gas leakage detection.
Generative artificial intelligence (GenAI), particularly through thermographic imaging techniques, has shown potential in creating enhanced visual representations of otherwise imperceptible phenomena. This approach draws motivation from related applications, such as using GenAI to colorize old movies [1] or generate human facial images through future lifespan projections [2]. In our case, we extend these concepts to the critical domain of gas leakage detection. By developing a system that can generate thermal images from RGB input, we aim to make gas leakage detection more accessible, efficient, and widespread. This could significantly improve safety measures, reduce economic losses associated with undetected gas leakage, and contribute to environmental protection efforts.
The proposed solution is inspired by recent advances in GenAI to develop an innovative model that visualizes and detects gas leaks, making these invisible hazards perceptible to the human eye. Conditional Generative Adversarial Networks (cGANs) are a prominent example of GenAI models, capable of learning complex data distributions and generating realistic synthetic data across various domains. In this work, we leverage the generative capability of cGANs to bridge the gap between RGB imaging and thermal imaging, enabling accurate detection of gas leakage points in pipeline networks. While cGANs have been widely employed in the literature to generate synthetic images and to address the issue of data scarcity [3,4], the application of computer-vision-based methods in OGI has been constrained by the limited availability of relevant high-quality data. To address this, our paper introduces a novel GenAI-based approach that utilizes RGB images to generate corresponding thermal images, thereby enabling the detection and localization of gas leaks. Our model includes an attention-guided masking (AttMask) strategy, which directs the model’s focus to regions with a high probability of leakage. Moreover, a multi-scale input processing method is proposed to learn better features with limited data. Acting as a digital twin, the proposed model serves as a digital gas detector and has demonstrated its effectiveness in OGI field applications.
Detailed contribution: The four-fold contribution of this work consists of the following:
  • Comprehensive comparative analysis of existing state-of-the-art (SOTA) methods: We provide an in-depth comparison of traditional, machine learning (ML)-, and deep learning (DL)-based techniques employed for monitoring and detection of gas leakage, including digital gas detection platforms.
  • Novel dataset collection: A unique dataset of paired RGB and thermal images is collected in both laboratory and industrial environments for gas leakage detection, addressing data scarcity and digital gas detector capabilities in this field.
  • MSDD-GAN model: We propose a novel GenAI model named Multi-Scale Dual Discriminator Generative Adversarial Network (MSDD-GAN) for gas leakage detection, functioning as a digital platform for thermal imaging.
  • Leveraging the cGAN model: We leverage the SOTA cGAN model to generate thermal images from RGB images, serving as a digital gas detector platform, and provide its comparison with the proposed MSDD-GAN.
Paper organization: This paper is organized as follows: Section 2 provides a comprehensive review of existing SOTA methods and challenges in gas leakage detection. Section 3 formulates the problem statement, defining the research objectives and scope. Section 4 discusses the dataset collection process, detailing the acquisition of thermal and RGB image pairs in both laboratory and industrial environments. Section 5 presents the proposed methodology, elaborating on the architecture and implementation of cGANs and MSDD-GAN. Section 6 highlights the implementation parameters of the proposed model and discusses the results and findings of our work, offering a detailed analysis of the model’s performance using different performance metrics. Finally, Section 7 concludes this work by highlighting future directions.

2. Literature Review

Effective gas leak detection and pipeline monitoring are important challenges in the OGI, where traditional techniques often require high maintenance costs and struggle with timely identification of leaks and corrosion [5]. As pipelines age or sustain external damage, the likelihood of leaks increases, underscoring the need for regular and reliable inspection methods to ensure safety and efficiency. Table 1 provides a comparison of several relevant studies that address these challenges. This section is structured into three subsections which discuss traditional, machine learning (ML), and deep learning (DL)-based approaches, as outlined below.

2.1. Traditional Methods

A variety of traditional and sensor-based methods have been explored for gas leak detection and pipeline monitoring in OGI. Early approaches relied on numerical simulations and electromagnetic analysis. For example, Maxwell electromagnetic simulation software has been effectively used for the numerical analysis of welding defects, allowing for the validation of the reliability of magnetic flux leakage (MFL) techniques for diagnosing pipeline welding defects using experimental data [11,16]. These techniques involve significant maintenance costs and have a very limited scope.
Remote sensing technologies, such as satellite-based monitoring, have also gained attention and have been employed to analyze atmospheric methane (CH4) and nitrogen dioxide (NO2) concentrations in the industrial area of Zhoushan, China [17]. These studies have revealed variations in CH4 due to petroleum activities and identified NO2 as a tracer for CH4 emissions. This technology allows for the analysis of surface temperature and provides information on the urban heat island effect but is unable to identify the specific location of gas leaks.
Gas leakage, particularly methane, poses significant environmental risks in marine environments. Recent studies using seismic and hydro-acoustic data reveal that shallow gas accumulations near wells drive methane releases in regions like the North Sea, though these remain locally significant rather than globally impactful [18,19].
Nanotechnology-based sensors are an emerging area, with nanoparticles (NPs) being incorporated into sensors for downhole reservoir studies [20]. However, NP-based methods still face limitations in achieving real-time, accurate leakage detection in natural gas pipelines. To address this, three-dimensional capacitance array sensors have been proposed for early detection [21], but their efficiency degrades due to sediment build-up in pipelines.
Optical and infrared-based detection methods have also seen significant development. Multispectrum infrared (MSIR) flame detectors, such as the FL4000H and its variants, are increasingly used for heat hotspot detection in OGI [18]. The transition from traditional UV/IR to triple infrared (IR3) detection has notably improved detection performance, offering superior immunity to false alarms and a wider field of view. IR3 detectors can identify heat hotspots at long distances and maintain reliability even in harsh environments [22]. Full-scale experiments have shown that infrared energy intensity at 4.504 μm correlates with the area of thermal hotspots, with detector performance depending more on this intensity than on temporal variations, thereby improving thermal mapping and design evaluations. Optical gas imaging (OGI) is another promising method, utilizing spectrally filtered thermal imaging to visualize and quantify gas leaks, making otherwise invisible emissions detectable from a safe distance [23,24]. These methods rely on complex and costly hardware, demand precise calibration, and often exhibit limited generalization and adaptability, resulting in high maintenance requirements, particularly in complex environments.
Other specialized detectors, such as those using Cobalt-60 and Sodium Iodide with Thallium, have been proposed to enhance data collection for gas leakage detection [7]. Wireless sensor networks (WSNs) have also emerged as a promising solution for real-time pipeline condition monitoring, corrosion detection, and gas leakage detection. Recent studies have focused on WSN-based systems for monitoring anomalous events across upstream, midstream, and downstream sectors [25,26], outlining the deployment requirements and industrial challenges associated with these systems.
The adoption of digital twinning, cloud, and edge computing in OGI has been explored to further enhance monitoring capabilities. However, implementation has been slow due to high costs and persistent security concerns [15,27]. From a manufacturing perspective, solutions such as composite flexible pipelines have been introduced [8] to prevent leaks, though these approaches are often cost-prohibitive due to the use of expensive materials. Ultimately, a fundamental challenge remains: many gases are invisible to the naked eye, complicating detection and necessitating the development of more advanced, reliable, and non-invasive monitoring technologies.
The aforementioned discussion highlights the limitations of traditional methods, particularly their high maintenance costs, reliance on specialized equipment and materials, and sensitivity to environmental factors. These challenges indicate the need for more adaptive and intelligent solutions. ML and DL algorithms offer a transformative approach by enabling automated, accurate, and real-time detection of gas leaks using standard RGB data. Such models are capable of learning complex correlations between visible cues and thermal signatures, providing a scalable and cost-effective alternative to conventional techniques. The subsequent subsections discuss the SOTA ML and DL methods used for gas leak detection.

2.2. ML-Based Methods

The integration of multiple inspection techniques with machine learning (ML) algorithms has demonstrated improved performance for defect detection in OGI pipelines, where such defects are a primary cause of gas leakage [8,28]. A wide range of ML algorithms, including support vector machines (SVMs), random forests, and hybrid-ML algorithms, have been successfully applied to pipeline leakage detection and corrosion prediction [10]. Recent studies highlight the potential of combining different ML algorithms to improve accuracy and robustness [29]. In [30,31], the authors review the application of unsupervised machine learning techniques in active infrared thermography (AIRT) for defect detection in composite materials, focusing on image denoising, segmentation, and other post-processing tasks, highlighting recent advances in models for thermogram analysis and providing guidance for researchers entering the field. Advanced feature extraction techniques are also evolving, with manifold learning-based approaches proposed for capturing complex, non-linear relationships in pipeline data. These methods have shown promise in improving the accuracy and reliability of leak detection models [6]. For instance, CH4 leakages from oil wells are now classified into categories such as active, plugged and abandoned, and inactive wells [32]. However, current supervised learning methods often struggle with cross-domain generalization and limited availability of labeled fault samples. To address this, metric learning with pseudo-label strategies has demonstrated improved data utilization and adaptability in challenging scenarios [33].
The aforementioned ML-based approaches have been extended to mobile applications. For example, an improved MobileNetV2 model has been developed to detect pipeline leaks by analyzing hammering sounds recorded via smartphones [34]. This approach leverages the Mel-spectrogram features for early detection of small leakages from underground storage tanks (USTs). One notable approach is digital twinning [15], which integrates ground system data with cloud-based processing for real-time gas leak monitoring. However, cloud-related data processing delays can hinder the real-time effectiveness of such systems.
Despite the progress achieved with ML-based algorithms, these methods depend heavily on well-annotated data, which is often scarce or impractical to obtain in real-world OGI fields. Additionally, traditional ML approaches typically rely on hand-crafted features or shallow models, which struggle to capture the complex, non-linear relationships inherent in pipeline sensor and image data. Most existing ML techniques are primarily applied to numerical or acoustic signals, with limited capability for advanced visualization or direct image-based leak localization. As a result, these methods fail to detect or accurately localize gas leaks that do not produce strong sensor signals or occur in visually complex, noisy, or data-scarce environments [10,35].

2.3. DL-Based Methods

While the preceding discussion highlights the limitations of ML-based techniques, DL methods have emerged as efficient and adaptive alternatives for gas leakage detection in OGI. Unlike traditional ML approaches, which often depend on manual feature engineering and struggle to capture complex, non-linear relationships in pipeline data, DL models, particularly those leveraging advanced neural network architectures, can automatically learn hierarchical representations directly from raw data, including images and time-series data. This capability enables more robust, scalable, and automated leak detection and localization, even in challenging scenarios where ML techniques fall short [36,37].
Recent years have seen a surge in the adoption of data-driven, AI-based techniques for pipeline monitoring. Deep neural networks such as Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs), including Long Short-Term Memory (LSTM) models [38], have demonstrated strong performance in analyzing both numerical and virtual-sensor data. For example, hybrid frameworks combining 1D-CNNs with LSTMs have been proposed to exploit both spatial and temporal features in pipeline monitoring, enabling automatic extraction of relevant features from complex, multi-dimensional data streams [39,40]. Semi-supervised approaches, such as improved LSTM-autoencoders (LSTM-AE) coupled with one-class Support Vector Machines (OCSVM), have addressed the challenge of limited labeled data by learning normal operational patterns and flagging deviations as potential leaks [29]. Field validations have shown the effectiveness of these models for real-time detection and classification of pipeline conditions [41]. Furthermore, advanced architectures that integrate CNNs, bi-directional LSTMs (BiLSTM), and multi-head attention mechanisms have been developed to capture both local and long-term dependencies in time-series data, further improving anomaly detection accuracy [14].
Deep learning models typically require large volumes of high-quality, labeled data for effective training, which is often scarce in industrial OGI applications. Model validation and generalization remain difficult, especially when data is limited or highly variable due to changing pipeline conditions or sensor noise. Moreover, many existing DL approaches are primarily focused on numerical or acoustic data, with relatively limited research dedicated to leveraging image-based or visualization techniques for tracking invisible gas leaks. This restricts the ability to detect and localize leaks that do not produce strong sensor signals, particularly in visually complex or data-constrained environments [8,29].
These limitations motivate the need for innovative, data-efficient, and non-invasive solutions that can leverage standard RGB images for robust gas leakage detection. Generative deep learning models, such as Generative Adversarial Networks (GANs) [42], offer a promising direction by learning complex correlations between visible features around leakage points and their corresponding thermal signatures. By generating synthetic thermal images from RGB inputs, these models can make invisible gas leaks perceptible and actionable, enabling continuous, low-cost, and real-time monitoring of pipeline networks. In this context, we propose an efficient alternative for gas leak monitoring using a digital twin-based generative AI model that synthesizes thermal images to detect leakage spots in pipelines. The Multi-Scale Dual Discriminator Generative Adversarial Network (MSDD-GAN) is designed to map ice by-products accumulated at leakage regions to their respective gas volumes, effectively serving as a digital gas detector. By incorporating multi-scale inputs and dual discriminators, the MSDD-GAN achieves high accuracy in both thermal image generation and leak localization, even with limited training data. This approach combines the strengths of advanced AI techniques with thermal imaging, offering a more efficient and accurate method for detecting and visualizing gas leaks in pipelines. Leveraging AI-generated thermal images as a digital twin, our method addresses the shortcomings of traditional and existing DL-based detection techniques, providing a comprehensive and scalable solution for pipeline integrity management in the OGI field.

3. Problem Statement

In OGI, the use of image-based models for gas leakage detection is still evolving. Thermal cameras that can detect gases are expensive. Therefore, this paper focuses on detecting gas leakage spots in pipelines by generating thermal images from RGB images. For this purpose, a GenAI model is proposed, which can transform regular CCTV camera images into thermal images and allow localization of gas leakage spots. This method expands coverage beyond traditional handheld thermal cameras, leveraging existing CCTV networks to detect leaks in areas where conventional methods are ineffective.
To optimize computational resources, we incorporate turbo processing, as shown in Figure 1. Since gas leakage is not a continuous event, the incorporation of turbo processing reduces computational complexity by processing the CCTV RGB-images only when triggered by the SCADA system. The SCADA system will trigger our proposed GenAI model upon receiving an indication of potential leaks based on real-time data of critical parameters such as pressure and temperature. This comprehensive system offers an efficient solution for detecting gas leaks in extensive pipeline networks, addressing scenarios where traditional methods may prove inadequate.
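A minimal Python sketch of this event-triggered flow is given below; the SCADA and CCTV interfaces, function names, and threshold values are hypothetical placeholders introduced for illustration only, not part of the implemented system.

```python
# Hypothetical sketch of the event-triggered ("turbo") processing flow:
# the GenAI model runs only when SCADA raises a leak indication.
PRESSURE_DROP_THRESHOLD = 0.5     # bar, illustrative value
TEMPERATURE_DROP_THRESHOLD = 5.0  # degrees Celsius, illustrative value

def scada_indicates_leak(reading: dict) -> bool:
    """Flag a potential leak from real-time pressure/temperature deviations."""
    return (reading["pressure_drop"] > PRESSURE_DROP_THRESHOLD
            or reading["temperature_drop"] > TEMPERATURE_DROP_THRESHOLD)

def monitoring_loop(scada, cctv, generator):
    """Yield generated thermal images only for frames flagged by SCADA."""
    while True:
        reading = scada.poll()                   # real-time pipeline parameters
        if scada_indicates_leak(reading):
            rgb_frame = cctv.grab_frame()        # RGB image from the existing CCTV network
            yield generator.generate(rgb_frame)  # thermal image for leak localization
```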

4. Experimental Dataset

Gas pipeline leaks are a significant concern, but they can only be detected by OGI professionals who specialize in identifying potential issues in pipeline infrastructure. Several signs commonly indicate the presence of a gas leak. One of the most noticeable signs is the accumulation of ice around the leaking area, which forms due to the rapid expansion of gas and its interaction with surrounding moisture in cold temperatures. Another indication is the appearance of water droplets around the joints of the pipeline, which occur as a result of condensation from gas leakage. In some cases, oil spots may also emerge around the pipeline joints, indicating leaks at these specific locations.
Despite the importance of gas leakage detection in OGI, there is a notable scarcity of publicly available datasets in this field. This poses significant challenges for researchers working on AI-based leakage detection models. The sensitive nature of industrial operations and potential safety concerns often restrict the sharing of data. To address this limitation, we have collected the datasets in two different environments, D1: laboratory and D2: industrial. The following subsections provide descriptions of these datasets.

4.1. Laboratory Data (D1)

This dataset was collected at the laboratory of Universidad Carlos III de Madrid (UC3M) in Spain, which specializes in advanced sensor technology. Thermal images were captured using a FLIR E96 thermal camera while CO2 gas was released through a hose mounted on a stand in an outdoor area, as shown in Figure 2. The objective was to capture leakage that may occur at various time intervals. As the gas flows from the cylinder through the hose, the experiment aims to monitor both the gas propagation within the hose and the area around the leakage. Given that gas leakage is often associated with ice formation as a by-product, the experiment also seeks to explore the correlation between the quantity of ice formed and the volume of gas leaked.
During the experiment, RGB images were captured synchronously with thermal images to observe the gas flow inside the hose at different time slots. Figure 3 provides some examples of RGB-thermal image pairs. These images illustrate the relationship between ice formation and gas propagation as time passes. The initial dataset comprises 238 images, which were extracted at a sampling rate of 15 frames/s.
The dataset consists of two distinct phases of the leakage: (i) before the leakage occurs (when no gas is present) and (ii) during gas propagation within the hose. The dataset is divided into three subsets, training, validation, and testing, with a split ratio of 60:20:20. This split ratio ensures that the model is trained on pre-leak and early-leak conditions, while being tested on more advanced stages of gas leakage, thus evaluating its ability to generalize to different leak scenarios. The collected dataset can be accessed via the following link: https://github.com/saifkhazraji/thermal_images_gas_detection.git (accessed on 2 September 2025).

4.2. Industrial Data (D2)

Another set of data was collected in the industrial environment of the Halfaya Oil and Gas Field in Iraq; some examples of the data are shown in Figure 4. This dataset provides clear examples of gas leakage indicators that have been reported in the field. Figure 4 depicts a leakage scenario, highlighting the accumulation of ice and visible water droplets around the affected pipeline joints. These images were captured on site, recorded during one of the routine pipeline investigations using FLIR thermal cameras. The total number of collected images is 167 RGB-thermal pairs, which were extracted from the video at a sampling rate of 15 frames/s.
This figure highlights the importance of regular monitoring and maintenance of gas pipelines, especially in areas prone to environmental challenges such as temperature fluctuations and moisture accumulation. Timely identification and remediation of these signs can significantly mitigate the risk of more severe pipeline damage or hazardous leaks, ultimately enhancing safety and operational integrity.

5. Proposed Methodology

The proposed methodology is organized into the following subsections: pre-processing steps, including data augmentation and attention-guided masking; a modified SOTA model of cGANs to generate thermal images, the proposed MSDD-GAN model and its performance comparison with cGANs; and finally, discussion of the employed performance metrics for this work.

5.1. Data Augmentation

In this study, data augmentation was applied to both laboratory and industrial datasets to maximize their utility.
  • Laboratory data (D1): The data includes 238 RGB-thermal image pairs that were augmented to create 560 pairs by rotating, resizing, and adding varying levels of noise. This data augmentation allows the model to generalize better and perform effectively even with noisy data.
  • Industrial data (D2): Similarly, the images collected from the Halfaya Oil and Gas Field were augmented by rotating, resizing, and adding varying levels of noise, resulting in a total of 560 RGB-thermal image pairs. This process not only increases the volume of data but also improves the model’s robustness by exposing it to a wider variety of scenarios, thereby enhancing its ability to detect gas leakage under diverse conditions (a minimal augmentation sketch is shown after this list).
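The following minimal sketch illustrates one way such a rotation/resize/noise augmentation could be applied to an RGB-thermal pair, assuming OpenCV and NumPy; the specific angle, scale, and noise level are illustrative assumptions rather than the parameters used in this work.

```python
import cv2
import numpy as np

def augment_pair(rgb, thermal, angle=10.0, scale=0.9, noise_std=5.0):
    """Rotate and resize an RGB-thermal pair with the same transform and add
    Gaussian noise to the RGB image. Parameter values are illustrative only."""
    h, w = rgb.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, scale)
    rgb_aug = cv2.warpAffine(rgb, M, (w, h))
    thermal_aug = cv2.warpAffine(thermal, M, (w, h))
    # Additive zero-mean Gaussian noise on the RGB input to simulate sensor noise
    noise = np.random.normal(0.0, noise_std, rgb_aug.shape)
    rgb_aug = np.clip(rgb_aug.astype(np.float32) + noise, 0, 255).astype(np.uint8)
    return rgb_aug, thermal_aug
```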

5.2. Attention-Guided Masking (AttMask)

In this work, we introduce a novel binary segmentation masking process within our model’s framework to generate thermal images for gas leakage detection. The AttMask strategy provides a saliency map to guide our model to focus on specific informative areas of the image. The focus of this method is on accurately identifying and verifying the location of gas leaks by introducing a secondary discriminator responsible for detecting the leakage spots and ensuring their precise positioning in the generated thermal images.
To achieve this, we implement an attention-guided masking process wherein only predefined positions of potential gas leaks are filtered from both target and generated thermal images. This filtering is implemented using a circular Region of Interest (ROI) mask applied to both images. The ROI ensures that the evaluation process focuses solely on critical regions where gas leakage occurs, as shown in Figure 5. The mask is created by determining the centroid coordinates of the gas leakage location in the target thermal image using the impixelinfo tool of MATLAB, which provides precise information about the pixel values and coordinates in the target thermal image. This precise localization enables pixel-by-pixel comparison between the corresponding regions of the generated and target thermal images, allowing both accurate gas leakage detection and localization.
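A minimal NumPy sketch of this masking step is shown below; in this work the centroid is obtained with MATLAB’s impixelinfo tool, so the centroid coordinates and radius value used here are illustrative assumptions.

```python
import numpy as np

def circular_roi_mask(shape, center, radius):
    """Binary circular ROI around the leakage centroid (column, row) = center."""
    h, w = shape
    yy, xx = np.ogrid[:h, :w]
    cx, cy = center
    return ((xx - cx) ** 2 + (yy - cy) ** 2) <= radius ** 2

def masked_pixel_error(generated, target, mask):
    """Mean absolute pixel error restricted to the ROI, for leak-spot validation."""
    diff = np.abs(generated.astype(np.float32) - target.astype(np.float32))
    return diff[mask].mean()

# Example: a 256x256 mask of radius 20 pixels around an assumed centroid (140, 110)
mask = circular_roi_mask((256, 256), center=(140, 110), radius=20)
```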

5.3. Conditional GANs (cGANs)

The first GAN model for this kind of application was introduced in [43], which provides the structure and implementation details. GANs have since been applied to various tasks, such as image-to-image translation [44], where RGB images are transformed into other types of images. For our work, this transformation is crucial to identify potential areas of gas leakage. This subsection provides a brief background on cGANs, highlighting the modifications performed to generate thermal images for gas leakage detection, before explaining the MSDD-GAN in the subsequent subsection.
cGANs play a central role in the proposed model, capturing the relationship between the thermal images indicating gas leakage and the corresponding RGB images showing ice formations around the leakage points. This correlation arises because leaking gas lowers the surrounding temperature, which causes moisture to freeze and accumulate as ice at the leakage point. Example RGB images with ice are shown in Figure 6. Ice formation around a leak serves as a visual indicator of leakage intensity: the more ice observed around a spot, the greater the likelihood of a gas leakage. Our model learns this correlation and is capable of generating thermal images from RGB images.
Inspired by GANs, the cGAN architecture comprises two core components: a generator $G$ and a discriminator $D$. The generator $G$ synthesizes thermal images from input RGB images $B$, while the discriminator evaluates whether a thermal image is real ($C$) or generated ($C'$). During training, $G$ learns to generate realistic thermal images that deceive $D$, while $D$ improves its ability to distinguish between real and synthetic images. If $D$ consistently identifies the generated image as fake, $D(C' \mid B) \to 0$, the generator updates its parameters to improve realism until equilibrium is achieved.
When an indication signal from SCADA is received, an RGB image $B$, which depicts the visible ice accumulation around pipeline leaks, is fed as input to $G$, which generates a thermal image $C' = G(B)$ highlighting the areas of potential gas leakage. The discriminator $D$ receives a pair of images as input: a real thermal image $C$ and the generated thermal image $C'$, where $C'$ is generated by $G$. $D$ then computes the probability indicating whether the thermal image in the pair is real or fake, given the input RGB image, i.e., $D(C \mid B)$ and $D(C' \mid B)$, respectively. In the processing block (Figure 7), the discriminator then evaluates the accuracy of $C'$ generated by $G$. Figure 7 illustrates the training process of cGANs for learning optimal thermal image generation from the RGB image. During training, $G$ and $D$ engage in a minimax game: the generator seeks to produce thermal images that are indistinguishable from real ones, while the discriminator strives to correctly classify each pair as real or fake. The subsequent blocks in Figure 7 compute the adversarial loss of the cGAN through two loss functions. The discriminator loss $L_D$ combines binary cross-entropy terms for real and fake classifications as follows:
$$L_D = -\mathbb{E}_{B,C}\left[\log D(C \mid B)\right] - \mathbb{E}_{B}\left[\log\left(1 - D(C' \mid B)\right)\right], \tag{1}$$
where $\mathbb{E}$ denotes the expected value. The generator loss $L_G$ focuses on making generated images appear realistic to $D$:
$$L_G = -\mathbb{E}_{B}\left[\log D(C' \mid B)\right]. \tag{2}$$
Once the losses are computed, the validation block compares the generated thermal image with the target thermal image using the L1 reconstruction loss as follows:
$$L_{recon} = \mathbb{E}_{B,C}\left[\left\lVert C - G(B) \right\rVert_1\right]. \tag{3}$$
The proposed model employs the L1 reconstruction loss because the objective is to generate thermal images that preserve edge details and produce more realistic results. In contrast, the L2 loss tends to overly smooth or blur the outputs due to its averaging effect. The total generator loss then becomes
$$L_{G_{total}} = L_G + \lambda L_{recon}, \tag{4}$$
where $\lambda$ controls the reconstruction weight. The total cGAN loss $L_{total}$ is the combination of the generator and discriminator losses, with the objective of minimizing the generator loss $L_G$ and maximizing the discriminator loss $L_D$. Mathematically, it can be written as follows:
$$L_{total} = \arg\min_{G}\max_{D}\left(L_D + L_G\right). \tag{5}$$
Minimizing the generator loss enables the model to generate more realistic thermal images that can fool the discriminator and accurately indicate gas leakage at ice-accumulating spots, whereas maximizing the discriminator loss improves its ability to distinguish between real and generated thermal images.
Training proceeds until equilibrium is reached, where the discriminator cannot reliably distinguish real from generated images. Upon convergence, the process terminates successfully, providing a validated thermal image $C' = G(B)$ that highlights probable gas leakage locations correlated with the RGB input’s ice accumulation patterns.
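As an illustration of Equations (1)–(4), a minimal PyTorch sketch of one cGAN loss computation is given below; it assumes the discriminator outputs probabilities (sigmoid activation) and uses an illustrative reconstruction weight, so it is a sketch rather than the exact training code of this work.

```python
import torch
import torch.nn.functional as F

def cgan_losses(G, D, rgb, thermal_real, lam=100.0):
    """Adversarial BCE terms plus L1 reconstruction (Equations (1)-(4)).
    G maps an RGB batch to thermal images; D(thermal, rgb) returns a probability.
    lam is an illustrative weight for the reconstruction term."""
    thermal_fake = G(rgb)

    # Discriminator loss: real pairs should score 1, generated pairs 0
    d_real = D(thermal_real, rgb)
    d_fake = D(thermal_fake.detach(), rgb)
    loss_D = (F.binary_cross_entropy(d_real, torch.ones_like(d_real))
              + F.binary_cross_entropy(d_fake, torch.zeros_like(d_fake)))

    # Generator loss: fool the discriminator and keep edge-preserving L1 fidelity
    d_fake_for_g = D(thermal_fake, rgb)
    loss_G_adv = F.binary_cross_entropy(d_fake_for_g, torch.ones_like(d_fake_for_g))
    loss_recon = F.l1_loss(thermal_fake, thermal_real)
    loss_G_total = loss_G_adv + lam * loss_recon
    return loss_D, loss_G_total
```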

5.4. Multi-Scale Dual Discriminator Generative Adversarial Network (MSDD-GAN)

The proposed MSDD-GAN model is designed to generate thermal images from RGB inputs for gas leakage detection, building on the foundational principles of the cGAN. While traditional cGANs require intensive training with large datasets, the MSDD-GAN is capable of providing reliable results even when only small datasets are available. The proposed architecture of the MSDD-GAN is illustrated in Figure 8.
Both the generator and discriminator components of the proposed model have been modified. On the generator side, the noise image $A$ and multi-scale RGB images $B$ are provided as inputs. Specifically, image pairs at three different resolutions, 256 × 256, 192 × 192, and 64 × 64 pixels, are used, representing 100%, 75%, and 25% of the original image resolution, respectively. This multi-scale approach allows the generator to capture features at different scales, improving its ability to generate realistic thermal images even with a small dataset. This multi-scale input strategy can be mathematically formalized as
$$C' = G_{\text{multi-scale}}(B, A), \tag{6}$$
where $G_{\text{multi-scale}}$ comprises the network layers $F_G$, detailed in Table 2, that generate the thermal image $C'$.
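For illustration, a minimal PyTorch sketch of how the three input scales could be prepared is shown below; how these scales are fused inside the generator follows the layer topology in Table 2 and is not reproduced here, so this should be read as an assumption-laden sketch rather than the actual network code.

```python
import torch
import torch.nn.functional as F

def multiscale_inputs(rgb: torch.Tensor, noise: torch.Tensor):
    """Resize an RGB batch and its noise image to the three generator input
    scales: 256x256 (100%), 192x192 (75%), and 64x64 (25%) of the original.
    Tensors are expected in (N, C, H, W) layout."""
    scales = [(256, 256), (192, 192), (64, 64)]
    rgb_scales = [F.interpolate(rgb, size=s, mode="bilinear", align_corners=False)
                  for s in scales]
    noise_scales = [F.interpolate(noise, size=s, mode="nearest") for s in scales]
    return list(zip(rgb_scales, noise_scales))

# Example usage with a dummy batch of one 256x256 RGB image and one noise image
pairs = multiscale_inputs(torch.rand(1, 3, 256, 256), torch.rand(1, 1, 256, 256))
```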
The proposed MSDD-GAN architecture incorporates a dual-discriminator approach to improve the accuracy of thermal image generation and gas leakage localization. The first discriminator, $D_1$, is responsible for validating the correlation between the target thermal image $C$ and the generated thermal image $C'$. This process can be expressed as
$$D_1 = F_D(C, C'), \tag{7}$$
where $F_D$ represents the network layers of the discriminator, as described in Table 2. In contrast, the second discriminator, $D_2$, focuses on verifying the generator’s output with a binary mask $M$, ensuring that the generator accurately localizes the gas leakage. The binary mask, obtained from AttMask (described in Section 5.2), allows $D_2$ to focus attention on gas leakage spots and isolate them as the ROI. The formulation for $D_2$ is given by:
$$D_2 = F_D\left(M(C), M(C')\right). \tag{8}$$
Both discriminators share the same architecture; however, they process different inputs. The detailed network topology of the MSDD-GAN, along with the encoder-decoder structure of $G$ and $D$, is presented in Table 2. $D_2$ plays a pivotal role in accurately validating the location of gas leakage by comparing the predefined leakage locations in the masked target images with their corresponding locations in the generated images and computing the loss $L_{D_2}$. This comparison ultimately improves the accuracy of gas leakage detection and localization. The total discriminator loss $L_D$ is computed as follows:
$$L_D = L_{D_1} + L_{D_2}, \tag{9}$$
where $L_{D_1}$ and $L_{D_2}$ are calculated as
$$L_{D_1} = -\mathbb{E}_{B,C}\left[\log D_1(C \mid B)\right] - \mathbb{E}_{B}\left[\log\left(1 - D_1(C' \mid B)\right)\right],$$
$$L_{D_2} = -\mathbb{E}_{B,C}\left[\log D_2(C \mid B)\right] - \mathbb{E}_{B}\left[\log\left(1 - D_2(C' \mid B)\right)\right].$$
The generator loss $L_G$ is computed as for the cGAN, as described in Equations (2)–(4). The overall loss $L_{total}$ is computed as described in Equation (5). The dual-discriminator structure of the proposed MSDD-GAN significantly improves performance in thermal image generation, allowing gas leakage to be detected and localized from the generated thermal images $C'$. Moreover, it effectively addresses the challenges posed by limited training data in this field.
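The combined discriminator objective can be sketched in PyTorch as follows; as above, the probability-output discriminators and the way the ROI mask is applied (element-wise multiplication) are assumptions made for illustration, not the exact implementation.

```python
import torch
import torch.nn.functional as F

def dual_discriminator_loss(D1, D2, rgb, thermal_real, thermal_fake, roi_mask):
    """Total discriminator loss L_D = L_D1 + L_D2: D1 sees the full thermal
    images, while D2 sees only the AttMask ROI around the leakage spot."""
    def bce_pair(D, real, fake):
        p_real = D(real, rgb)
        p_fake = D(fake.detach(), rgb)
        return (F.binary_cross_entropy(p_real, torch.ones_like(p_real))
                + F.binary_cross_entropy(p_fake, torch.zeros_like(p_fake)))

    loss_d1 = bce_pair(D1, thermal_real, thermal_fake)
    # roi_mask is binary: 1 inside the circular ROI, 0 elsewhere
    loss_d2 = bce_pair(D2, thermal_real * roi_mask, thermal_fake * roi_mask)
    return loss_d1 + loss_d2
```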

5.5. Performance Metrics

Evaluating image-to-image translation algorithms involves several key metrics, including Structural Similarity Index (SSIM), Peak Signal-to-Noise Ratio (PSNR), and Mean Absolute Error (MAE). These metrics provide a comprehensive assessment of the quality and accuracy of the generated images compared to the ground-truth images.
SSIM evaluates the similarity between two images by considering luminance, contrast, and structure changes. Values range from −1 to 1, with higher values indicating greater structural similarity between the target $C$ and the generated thermal image $C'$. Mathematically, SSIM can be defined as follows:
$$\mathrm{SSIM}(C, C') = \frac{(2\mu_C \mu_{C'} + \gamma_1)(2\sigma_{CC'} + \gamma_2)}{(\mu_C^2 + \mu_{C'}^2 + \gamma_1)(\sigma_C^2 + \sigma_{C'}^2 + \gamma_2)}, \tag{10}$$
where $\{\mu_C, \sigma_C^2\}$ and $\{\mu_{C'}, \sigma_{C'}^2\}$ are the mean and variance of the target $C$ and generated $C'$ thermal images, respectively, $\sigma_{CC'}$ denotes their covariance, and $\gamma_1$ and $\gamma_2$ are small constants included to avoid division by zero.
PSNR measures the quality of generated thermal images with respect to the target thermal images; a higher PSNR value indicates that $C'$ is closer to $C$, suggesting better performance of our model. PSNR is defined as follows:
$$\mathrm{PSNR} = 20 \cdot \log_{10}\left(\frac{\mathrm{MAX}_I}{\sqrt{\mathrm{MSE}}}\right), \tag{11}$$
where $\mathrm{MAX}_I$ is the maximum possible pixel value in thermal images (e.g., 255 for 8-bit images) and the Mean Squared Error (MSE) is calculated as follows:
$$\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left(C_i - C'_i\right)^2, \tag{12}$$
where $n$ is the number of pixels in the thermal images. The Mean Absolute Error (MAE) directly measures the average error magnitude between $C$ and $C'$ as follows:
$$\mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}\left|C_i - C'_i\right|. \tag{13}$$
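These three metrics can be computed for a target/generated image pair with off-the-shelf scikit-image and NumPy routines; the snippet below is a minimal stand-in for the evaluation code, assuming 8-bit grayscale thermal images.

```python
import numpy as np
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

def evaluate_pair(target: np.ndarray, generated: np.ndarray):
    """Return (SSIM, PSNR, MAE) between a target and a generated thermal image
    (Equations (10), (11), and (13)), assuming 8-bit grayscale arrays."""
    ssim = structural_similarity(target, generated, data_range=255)
    psnr = peak_signal_noise_ratio(target, generated, data_range=255)
    mae = float(np.mean(np.abs(target.astype(np.float32)
                               - generated.astype(np.float32))))
    return ssim, psnr, mae
```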

6. Results and Discussions

The results section presents the implementation details of the proposed model, along with both qualitative and quantitative evaluations for a comprehensive analysis. In addition, a comparison with state-of-the-art GAN models is provided.

6.1. Implementation Details

During the initial training stages ($t_0$), the generator loss $L_G$ is notably high, around 50. This high loss is characteristic of GAN models, as the generator starts learning with randomly initialized weights, producing low-quality generated thermal images that the discriminator can easily distinguish from real samples. As training advances, $L_G$ decreases significantly, as shown in Figure 9, and stabilizes between approximately 5 dB and 10 dB. This reduction indicates that the generator is improving its ability to generate more realistic thermal images that challenge the discriminator’s ability to distinguish them from real images. The fluctuations in loss reflect the dynamic nature of MSDD-GAN training, where both the generator and discriminator continuously adjust their weights to improve the output. Table 3 presents a summary of the model’s training parameters. Figure 10 shows the generated thermal images obtained at different stages of the training process.
MATLAB was used during the training phase to generate AttMask images leveraging the impixelinfo tool (see Section 5.2), which were then provided as input to the GAN models (e.g., MSDD-GAN and other GAN variants) implemented in Python. This required manual coordination between the two environments, as the training process relied on external model architectures outside MATLAB. In contrast, the testing phase was fully automated: the pre-trained weights were loaded, and inference was performed using an end-to-end pipeline without the need for manual intervention. Thus, while training involved interaction between the MATLAB and Python environments, testing was entirely automatic.

6.2. Qualitative Results

The test set consists of a video sequence provided as input to the model. Frames are extracted at a fixed frame rate and processed through the attention-guided masking module, after which the trained weights of the GAN models are used to perform inference. Evaluation of the model’s performance relies on three key metrics, i.e., SSIM, PSNR, and accuracy. The efficacy of the proposed MSDD-GAN model in successfully generating thermal images from RGB images is visually demonstrated in Figure 11 and Figure 12. These figures showcase the model’s ability to generate thermal images that closely match the target images in terms of gas leakage representation.

6.3. Quantitative Results

Quantitatively, the MSDD-GAN achieved an average accuracy that is 3.78% higher than the baseline cGAN (see Table 4 and Figure 13). This performance improvement is particularly notable given the model’s improved resilience to noisy images. These results collectively highlight the MSDD-GAN’s robustness and superior performance in thermal image generation for gas leakage detection, even under challenging input conditions.
The MSDD-GAN demonstrates superior performance with a median SSIM (Equation (10)) of 0.801, compared to the cGAN’s median SSIM of 0.72 for the dataset collected in the laboratory environment, D1 (Figure 14). The higher median and broader range of SSIM values for the MSDD-GAN indicate that the generated thermal images have higher structural similarity with the target thermal images. The variability in SSIM scores suggests that while the MSDD-GAN can produce images of better quality, it also encounters variability in performance across diverse samples. In contrast, the cGAN consistently generates low-quality images but with less variability.
PSNR (from Equation (11)) is a measure often used to quantify image quality, where higher PSNR values indicate better accuracy and less distortion. The MSDD-GAN consistently outperforms the cGAN with higher PSNR values (see Figure 15). Notably, the MSDD-GAN attains a median PSNR of 60.5 dB, significantly higher than the cGAN value of 53.3 dB. This indicates that the MSDD-GAN generates thermal images with superior quality, lower reconstruction error, and improved preservation of thermal image characteristics.
The accuracy of the generated images is calculated by comparing them to the target images. Pixel errors are evaluated by identifying differences between generated and target images: pixels in a generated image that differ from the corresponding pixels in the target image are marked as errors. Accuracy is then determined by the proportion of zero values in the error matrix, considering the total number of pixels in the image. Figure 16 illustrates the accuracy scores, with the MSDD-GAN outperforming the cGAN and achieving an accuracy of 84.6%.
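This pixel-level accuracy can be sketched in a few lines of NumPy, as below; whether an exact match or a small tolerance is used when marking error pixels is an assumption of the sketch.

```python
import numpy as np

def pixel_accuracy(generated: np.ndarray, target: np.ndarray, tol: int = 0) -> float:
    """Proportion of pixels whose error is zero (within an optional tolerance)
    over the total number of pixels in the image."""
    error = np.abs(generated.astype(np.int32) - target.astype(np.int32))
    return np.count_nonzero(error <= tol) / error.size
```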
It is observed that the model’s ability to generate thermal images using D2, i.e., the industrial dataset, is slightly lower than with D1, i.e., the laboratory dataset. This is evident from the performance metrics demonstrated in the above figures. The discrepancy is attributed to the nature of D2, which contains a greater variety of pipelines than D1. Additionally, D2 has fewer training samples, which may impact the model’s performance.

6.4. Additional Comparison

Further comparative results with the Pix2Pix GAN [45] and CycleGAN [46] models show that the baseline Pix2Pix GAN yielded lower SSIM (0.6876) and PSNR (21.09 dB), demonstrating inferior performance, especially in gas leak region generation. CycleGAN attained even lower SSIM (0.3084) and PSNR (12.41 dB), generating thermal images with reasonable background quality but poor localization in gas leakage areas. This highlights the necessity of a masked loss function to direct the model’s focus and penalize errors more strongly in relevant regions. Figure 17 and Figure 18 present the qualitative results corresponding to the Pix2Pix GAN [45] and CycleGAN, respectively. These images show that the state-of-the-art models fail to accurately generate thermal images, particularly in the gas leakage areas.

6.5. Robustness Test

A comprehensive robustness analysis is performed to evaluate the performance of the model with varying noise levels, assessing its ability to generate high-quality outputs despite input degradation. The test protocol involves distorting input images with two distinct noise levels prior to model processing, i.e., 3 dB (low noise) and 7 dB (moderate noise). This setup simulates real-world scenarios where thermal images may be subject to various forms of noise or distortion. The model’s performance was quantitatively monitored using SSIM. SSIM values were computed for each noise level, and scatter plots were generated to visualize the distribution of these values, which provides insights into the consistency and stability of the model’s performance.
The scatter plots in Figure 19a and Figure 19b display the SSIM behavior of test images with added noise of 3 dB and 7 dB, respectively. The results demonstrate that the cGAN performance (see Figure 19) degrades as the noise level increases. In contrast, Figure 20 shows that the MSDD-GAN exhibits greater noise immunity, maintaining approximately constant SSIM values even with increased noise levels. For this analysis, 3 dB and 7 dB noise was added to the images, with the corresponding SSIM values depicted in Figure 20a and Figure 20b, respectively. At the 3 dB and 7 dB noise levels, the SSIM values are 0.795 and 0.790, respectively, compared to an SSIM of 0.800 for the MSDD-GAN without noise. This minimal decline in SSIM as noise increases reflects the robustness of the proposed model to noise, which is why Figure 20a,b display very similar plots across the various noise levels.
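A minimal NumPy sketch of the noise injection step is given below; the exact dB convention used in the experiments is not specified in the text, so the mapping from the dB figure to a noise standard deviation (sigma = 10^(level_dB/20), giving more noise at 7 dB than at 3 dB) is an illustrative assumption.

```python
import numpy as np

def add_noise_db(image: np.ndarray, level_db: float) -> np.ndarray:
    """Distort an 8-bit image with zero-mean Gaussian noise whose standard
    deviation is level_db decibels above unit amplitude. The dB-to-sigma
    mapping is an illustrative assumption (3 dB -> low, 7 dB -> moderate)."""
    sigma = 10.0 ** (level_db / 20.0)
    noisy = image.astype(np.float32) + np.random.normal(0.0, sigma, image.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)
```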

7. Conclusions

This work contributes significantly to the field of OGI by integrating a GenAI concept for thermal image generation from RGB images. The proposed MSDD-GAN model, utilizing a dual-discriminator architecture, demonstrates significant improvements over the baseline cGAN method in producing high-quality thermal images. Moreover, the SOTA cGAN model is modified to generate thermal images and is compared with the proposed MSDD-GAN model. Our model, trained with target thermal and RGB image pairs, successfully learns the correlation between indications of gas leakage in RGB images, such as ice formation, and the corresponding thermal signatures. Performance metrics for the MSDD-GAN are promising, with mean values of 81.6% accuracy, 59.97 dB PSNR, and 0.799 SSIM across the test dataset collected in the laboratory environment. Similarly, it shows superior performance on the industrial dataset. Moreover, a robustness test was conducted to analyze the model’s performance with different noise levels.
The proposed model acts as a digital platform and offers a cost-effective, continuous monitoring solution for gas leaks, enhancing both safety measures and economic efficiency in industrial settings. In conclusion, MSDD-GAN represents a substantial advancement in thermal imaging technology for gas leakage detection. Its ability to generate thermal images from RGB images opens new avenues for widespread industrial implementation. The proposed method is currently limited to detecting and localizing gas leaks only after leakage indications become visible in RGB images. In the future, however, this work could be extended to predict leaks before any visual indicators appear by integrating GenAI models with the traditional sensors already embedded in the OGI system. Moreover, future work could focus on expanding the model’s applicability to diverse environmental conditions and exploring its potential in other industrial safety applications.

Author Contributions

Conceptualization, S.H.A.A.-K. and H.I.; Methodology, S.H.A.A.-K., H.I. and A.A.-K.; Software, S.H.A.A.-K. and H.I.; Validation, S.H.A.A.-K. and H.I.; Formal analysis, S.H.A.A.-K. and H.I.; Investigation, S.H.A.A.-K. and H.I.; Resources, H.I., F.G., A.A.-K. and J.B.R.; Data curation, S.H.A.A.-K. and J.B.R.; Writing-Review & Editing, H.I.; Visualization, S.H.A.A.-K. and H.I.; Supervision, F.G. and A.A.-K.; Project administration, F.G. and A.A.-K.; Funding acquisition, F.G. and A.A.-K. All authors have read and agreed to the published version of the manuscript.

Funding

This work has been supported by the Spanish Government through the projects PID2021-128327OA-I00, and TED2021-129374A-I00 funded by MCIU/AEI/10.13039/501100011033, by “ERDF A way of making Europe” and by the European Union NextGenerationEU/PRTR respectively.

Data Availability Statement

The dataset collected in this work is available at the web repository: https://github.com/saifkhazraji/thermal_images_gas_detection.git (accessed on 2 September 2025).

Conflicts of Interest

The authors declare no conflict of interest. The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

  1. Anwar, S.; Tahir, M.; Li, C.; Mian, A.; Khan, F.S.; Muzaffar, A.W. Image colorization: A survey and dataset. Inf. Fusion 2025, 114, 102720. [Google Scholar] [CrossRef]
  2. Kim, H.; Kim, H.; Shim, J.; Hwang, E. A robust kinship verification scheme using face age transformation. Comput. Vis. Image Underst. 2023, 231, 103662. [Google Scholar] [CrossRef]
  3. Savaglio, C.; Barbuto, V.; Mangione, F.; Fortino, G. Generative Digital Twins: A Novel Approach in the IoT Edge-Cloud Continuum. IEEE Internet Things Mag. 2024, 8, 42–48. [Google Scholar] [CrossRef]
  4. Zhu, Y.; Cheng, J.; Liu, Z.; Zou, X.; Cheng, Q.; Xu, H.; Wang, Y.; Tao, F. Data generation approach based on data model fusion: An application for rolling bearings fault diagnosis with small samples. IEEE Trans. Instrum. Meas. 2024, 74, 3501916. [Google Scholar] [CrossRef]
  5. Khalaf, A.H.; Xiao, Y.; Xu, N.; Wu, B.; Li, H.; Lin, B.; Nie, Z.; Tang, J. Emerging AI technologies for corrosion monitoring in oil and gas industry: A comprehensive review. Eng. Fail. Anal. 2024, 155, 107735. [Google Scholar] [CrossRef]
  6. Lu, J.; Li, J.; Fu, Y.; Du, Y.; Hu, Z.; Wang, D. Natural gas pipeline leak diagnosis based on manifold learning. Eng. Appl. Artif. Intell. 2024, 136, 109015. [Google Scholar] [CrossRef]
  7. Kabir, M.; Afarideh, H.; Ghergherehchi, M.; Chai, J.S. Innovative projection acquisition algorithm for optimizing portable LNDCT in oil and gas pipeline imaging. Nucl. Eng. Technol. 2024, 56, 4355–4364. [Google Scholar] [CrossRef]
  8. Bahaman, U.S.F.; Mustaffa, Z.; Seghier, M.E.A.B.; Badri, T.M. Evaluating the reliability and integrity of composite pipelines in the oil and gas sector: A scientometric and systematic analysis. Ocean Eng. 2024, 303, 117773. [Google Scholar] [CrossRef]
  9. Miao, X.; Zhao, H.; Gao, B.; Song, F. Corrosion leakage risk diagnosis of oil and gas pipelines based on semi-supervised domain generalization model. Reliab. Eng. Syst. Saf. 2023, 238, 109486. [Google Scholar] [CrossRef]
  10. Imran, M.M.H.; Jamaludin, S.; Ayob, A.F.M. A critical review of machine learning algorithms in maritime, offshore, and oil & gas corrosion research: A comprehensive analysis of ANN and RF models. Ocean Eng. 2024, 295, 116796. [Google Scholar] [CrossRef]
  11. Zhang, C.; Bi, J.; Lv, Y.; Li, M.; Qi, Y.; Zhou, K.; Zhang, M.; Du, T. Numerical analysis and experimental research on detection of welding defects in pipelines based on magnetic flux leakage. Pet. Res. 2023, 8, 550–560. [Google Scholar] [CrossRef]
  12. Al-Sabaeei, A.M.; Alhussian, H.; Abdulkadir, S.J.; Jagadeesh, A. Prediction of oil and gas pipeline failures through machine learning approaches: A systematic review. Energy Rep. 2023, 10, 1313–1338. [Google Scholar] [CrossRef]
  13. Lu, H.; Iseley, T.; Behbahani, S.; Fu, L. Leakage detection techniques for oil and gas pipelines: State-of-the-art. Tunn. Undergr. Space Technol. 2020, 98, 103249. [Google Scholar] [CrossRef]
  14. Liang, J.; Liang, S.; Ma, L.; Zhang, H.; Dai, J.; Zhou, H. Leak detection for natural gas gathering pipeline using spatio-temporal fusion of practical operation data. Eng. Appl. Artif. Intell. 2024, 133, 108360. [Google Scholar] [CrossRef]
  15. Knebel, F.P.; Trevisan, R.; do Nascimento, G.S.; Abel, M.; Wickboldt, J.A. A study on cloud and edge computing for the implementation of digital twins in the Oil & Gas industries. Comput. Ind. Eng. 2023, 182, 109363. [Google Scholar] [CrossRef]
  16. Alemaw, A.S.; Slavic, G.; Iqbal, H.; Marcenaro, L.; Gomez, D.M.; Regazzoni, C. A data-driven approach for the localization of interacting agents via a multi-modal dynamic bayesian network framework. In Proceedings of the 2022 18th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS), Madrid, Spain, 29 November–2 December 2022; pp. 1–8. [Google Scholar]
  17. Yang, X.; Tao, Y.; Wang, X.C.; Zhao, G.; Lee, C.T.; Yang, D.; Wang, B. City-scale methane emissions from the midstream oil and gas industry: A satellite survey of the Zhoushan archipelago. J. Clean. Prod. 2024, 449, 141673. [Google Scholar] [CrossRef]
  18. Böttner, C.; Haeckel, M.; Schmidt, M.; Berndt, C.; Vielstädte, L.; Kutsch, J.A.; Karstens, J.; Weiß, T. Greenhouse gas emissions from marine decommissioned hydrocarbon wells: Leakage detection, monitoring and mitigation strategies. Int. J. Greenh. Gas Control 2020, 100, 103119. [Google Scholar] [CrossRef]
  19. Iqbal, H.; Marin, P.; Marcenaro, L.; Gomez, D.M.; Regazzioni, C. Bayesian Geometric-based Interactions Learning model for self-aware autonomous agents. Signal Process. 2025, 239, 110237. [Google Scholar] [CrossRef]
  20. Pang, S.; Zhao, L.; An, Y. Advanced developments in nanotechnology and nanomaterials for the oil and gas industry: A review. Geoenergy Sci. Eng. 2024, 238, 212872. [Google Scholar] [CrossRef]
  21. Li, L.; Chen, H.; Huang, Y.; Xu, G.; Zhang, P. A new small leakage detection method based on capacitance array sensor for underground oil tank. Process Saf. Environ. Prot. 2022, 159, 616–624. [Google Scholar] [CrossRef]
  22. Nitta, K.; Kanno, S.; Oka, Y. Experimental study on responsiveness of multi-spectrum infrared flame detectors to diffuse flames partially hidden by obstacles installed in oil and gas processing facilities. J. Loss Prev. Process Ind. 2024, 91, 105378. [Google Scholar] [CrossRef]
  23. Ravikumar, A.P.; Wang, J.; Brandt, A.R. Are optical gas imaging technologies effective for methane leak detection? Environ. Sci. Technol. 2017, 51, 718–724. [Google Scholar] [CrossRef]
  24. Sadia, H.; Sherien, S.; Iqbal, H.; Zeeshan, M.; Khan, A.; Rehman, S. Range estimation in radar using maximum likelihood estimator. In Proceedings of the 2017 20th International Conference of Computer and Information Technology (ICCIT), Dhaka, Bangladesh, 22–24 December 2017; pp. 1–5. [Google Scholar]
  25. Aalsalem, M.Y.; Khan, W.Z.; Gharibi, W.; Khan, M.K.; Arshad, Q. Wireless Sensor Networks in oil and gas industry: Recent advances, taxonomy, requirements, and open challenges. J. Netw. Comput. Appl. 2018, 113, 87–97. [Google Scholar] [CrossRef]
  26. Zaal, H.; Iqbal, H.; Campo, D.; Marcenaro, L.; Regazzoni, C.S. Incremental learning of abnormalities in autonomous systems. In Proceedings of the 2019 16th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS), Taipei, Taiwan, 18–21 September 2019; pp. 1–8. [Google Scholar]
  27. Bashir, A.; Mohsin, M.A.; Jazib, M.; Iqbal, H. Mindtwin ai: Multiphysics informed digital-twin for fault localization in induction motor using ai. In Proceedings of the 2023 International Conference on Big Data, Knowledge and Control Systems Engineering (BdKCSE), Sofia, Bulgaria, 2–3 November 2023; pp. 1–8. [Google Scholar]
  28. Iqbal, H.; Sadia, H.; Al-Kaff, A.; Garcia, F. Novelty Detection in Autonomous Driving: A Generative Multi-Modal Sensor Fusion Approach. IEEE Open J. Intell. Transp. Syst. 2025, 6, 799–812. [Google Scholar] [CrossRef]
  29. Zuo, Z.; Ma, L.; Liang, S.; Liang, J.; Zhang, H.; Liu, T. A semi-supervised leakage detection method driven by multivariate time series for natural gas gathering pipeline. Process Saf. Environ. Prot. 2022, 164, 468–478. [Google Scholar] [CrossRef]
  30. Liu, Y.; Yao, Y.; Wang, F.; Sfarra, S.; Liu, K. Review of unsupervised machine learning methods in active infrared thermography for defect detection and analysis. Quant. Infrared Thermogr. J. 2025, 1–28. [Google Scholar] [CrossRef]
  31. Liu, Y.; Liu, A.; Gao, S. A flame image soft sensor for oxygen content prediction based on denoising diffusion probabilistic model. Chemom. Intell. Lab. Syst. 2024, 255, 105269. [Google Scholar] [CrossRef]
  32. Lebel, E.D.; Lu, H.S.; Vielstadte, L.; Kang, M.; Banner, P.; Fischer, M.L.; Jackson, R.B. Methane emissions from abandoned oil and gas wells in California. Environ. Sci. Technol. 2020, 54, 14617–14626. [Google Scholar] [CrossRef] [PubMed]
  33. Yuan, L.; Lang, X.; Zhang, Z.; Liu, Q.; Cao, J. Pipeline leakage aperture identification method based on pseudolabel learning. Meas. Sci. Technol. 2023, 34, 115301. [Google Scholar] [CrossRef]
  34. Peng, L.; Zhang, J.; Li, Y.; Du, G. A novel percussion-based approach for pipeline leakage detection with improved MobileNetV2. Eng. Appl. Artif. Intell. 2024, 133, 108537. [Google Scholar] [CrossRef]
  35. Aljameel, S.S.; Alabbad, D.A.; Alomari, D.; Alzannan, R.; Alismail, S.; Alkhudir, A.; Aljubran, F.; Nikolskaya, E.; Rahman, A.u. Oil and Gas Pipelines Leakage Detection Approaches: A Systematic Review of Literature. Int. J. Saf. Secur. Eng. 2024, 14, 773–786. [Google Scholar] [CrossRef]
  36. Korjani, M.; Conley, D.; Smith, M. Temporal Deep Learning Image Processing Model for Natural Gas Leak Detection Using OGI Camera. In Offshore Technology Conference Asia; OTC: Kuala Lumpur, Malaysia, 2024; p. D021S015R006. [Google Scholar]
  37. Shaheen, M.T.; Iqbal, H.; Khurshid, N.; Sadia, H.; Saeed, N. SwinSegFormer: Advancing Aerial Image Semantic Segmentation for Flood Detection. IEEE Open J. Comput. Soc. 2025, 6, 645–657. [Google Scholar] [CrossRef]
  38. Spatafora, M.A.N.; Allegra, D.; Giudice, O.; Stanco, F.; Battiato, S. Natural gas leakage detection: A deep learning framework on IR video data. In Proceedings of the 2022 26th International Conference on Pattern Recognition (ICPR), Montreal, QC, Canada, 21–25 August 2022; pp. 636–642. [Google Scholar]
  39. Gao, H.; Hao, F.; Zhang, Y.; Song, X.; Hou, N. Application of Novel SN-1DCNN-LSTM framework in small sample oil and gas pipeline leakage detection. Frankl. Open 2024, 6, 100073. [Google Scholar] [CrossRef]
  40. Opara, S.U.; Okere, C.J. A review of methane leakage from abandoned oil and gas wells: A case study in Lubbock, Texas, within the Permian Basin. Energy Geosci. 2024, 5, 100288. [Google Scholar] [CrossRef]
  41. Spandonidis, C.; Theodoropoulos, P.; Giannopoulos, F.; Galiatsatos, N.; Petsa, A. Evaluation of deep learning approaches for oil & gas pipeline leak detection using wireless sensor networks. Eng. Appl. Artif. Intell. 2022, 113, 104890. [Google Scholar] [CrossRef]
  42. Hu, X.; Zhang, H.; Ma, D.; Wang, R. A tnGAN-based leak detection method for pipeline network considering incomplete sensor data. IEEE Trans. Instrum. Meas. 2020, 70, 3510610. [Google Scholar] [CrossRef]
  43. Goodfellow, I.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative adversarial networks. Commun. ACM 2020, 63, 139–144. [Google Scholar] [CrossRef]
  44. Bansal, V.; Jain, A.; Walia, N.K. Diabetic retinopathy detection through generative AI techniques: A review. Results Opt. 2024, 16, 100700. [Google Scholar] [CrossRef]
  45. Aljohani, A.; Alharbe, N. Generating synthetic images for healthcare with novel deep pix2pix gan. Electronics 2022, 11, 3470. [Google Scholar] [CrossRef]
  46. Ma, X. A comparison of art style transfer in Cycle-GAN based on different generators. J. Phys. Conf. Ser. 2024, 2711, 012006. [Google Scholar] [CrossRef]
Figure 1. (a) Overview of the integration of proposed GenAI model with SCADA system, (b) turbo processing of gas leakage detection.
Figure 2. Experimental setup in laboratory environment for collecting dataset (D1): (a) gas cylinder, (b) a hose mounted on a stand used for the experiment, and (c) the FLIR thermal camera.
Figure 3. Laboratory data D1: (a) RGB images depicting ice formation around the hose during CO2 gas leakage, with ice accumulation increasing as the leakage progresses. (b) Thermal images showing the hose turning black, indicating gas flow, with the corresponding ice formation visible in the RGB images.
Figure 4. Industrial data D2 collected at the Halfaya oil and gas facility: (a) RGB images, where the red circle indicates a leakage and the green circle indicates the no-leakage case; (b) corresponding thermal images.
Figure 5. Attention-guided mask: (a) highlighted ROI of gas leakage in thermal image, (b) binary representation of ROI, and (c) generated thermal image with ROI.
Figure 6. Laboratory experiment: ice accumulation around the hose as the gas leaks.
Figure 7. Modified conditional GAN (cGAN) architecture employed to generate thermal images for gas leakage detection.
Figure 8. MSDD-GAN architecture.
Figure 9. Training loss (dB) of the proposed MSDD-GAN model.
Figure 10. Generated thermal images at various stages of training the proposed MSDD-GAN model.
Figure 11. MSDD-GAN demonstration for two different test samples from the D1 dataset, showing the input RGB, the target thermal, and the generated thermal images.
Figure 12. MSDD-GAN demonstration for two different test samples from the D2 dataset, showing the input RGB, the target thermal, and the generated thermal images.
Figure 13. Boxplot of performance metrics for the comparison of cGAN and MSDD-GAN with both D1 and D2 datasets, (a) accuracy (%) and PSNR (dB), and (b) SSIM and MAE.
Figure 14. Comparison of the cGAN and MSDD-GAN models using the SSIM metric, computed on the test sets of datasets D1 and D2.
Figure 15. Comparison of the cGAN and MSDD-GAN models using the PSNR metric, computed on the test sets of datasets D1 and D2.
Figure 16. Comparison of the cGAN and MSDD-GAN models using the accuracy metric, computed on the test sets of datasets D1 and D2.
Figure 17. Pix2Pix GAN results: (a) original RGB image, (b) generated thermal image using the Pix2Pix GAN model, and (c) generated thermal image using the proposed MSDD-GAN model.
Figure 18. CycleGAN results: (a) original RGB image, (b) generated thermal image using the CycleGAN model, and (c) generated thermal image using the proposed MSDD-GAN model.
Figure 19. SSIM consistency analysis for cGAN with noise levels of (a) 3 dB and (b) 7 dB.
Figure 20. SSIM consistency analysis for MSDD-GAN with noise levels of (a) 3 dB and (b) 7 dB.
Table 1. Theoretical comparison of the objectives, methods, and limitations of SOTA methods used for gas leakage detection in the OGI sector.

Ref. | Objective | Method/Tech | Dataset | Limitations
[6] | Address cross-domain issues in leakage detection | Semi-supervised domain adaptation | Cross-domain pipeline data | Lacks visual localization of leakage; not robust with limited data
[7] | Semi-supervised anomaly detection in gas pipelines | LSTM-AE, OCSVM | Real gas pipeline sensor data | Focuses only on numerical signals; does not provide visual localization of leaks
[8] | Percussion-based leak detection | MobileNetV2 (DL on audio) | Pipe hammering audio | Limited to accessible sites; not suitable for image-based detection or remote monitoring
[9] | CH4 leakage from marine decommissioned wells | Seismic and hydro-acoustic sensing | Data from 43 wells in the North Sea | Requires specialized equipment; does not support real-time or visual detection
[10] | Manifold learning for leakage diagnosis | VMD, SPD matrix, LLE, SVM | Pipeline sensor condition signals | Relies on numerical data, limiting practical deployment in complex environments
[11] | Enhance small leakage detection using ML | CNN, BiLSTM, attention mechanism | SCADA batch data from pipelines | Requires extensive labeled data; less effective for visually ambiguous leaks
[12] | Develop low-cost WSN for real-time leak detection | 2D-CNN, LSTM-AE, field tests | Experimental pipeline network | Sensor deployment and maintenance are challenging; not suitable for non-invasive detection
[13] | Investigate MFL for detecting welding defects in pipelines | Electromagnetic simulation | 406 pipeline samples | Limited to certain defect types, restricting broader applicability
[14] | Predict pipeline failures using ML and hybrid models | ANNs, SVM, biometric analysis | Pipeline sensor signal data | Not designed for real-time leak detection, limiting its use for monitoring
[15] | Predict CH4 emission trends in China's OGI sector | Emission estimation, inventory analysis | Chinese OGI operational data | Provides macroscopic estimations; not suitable for local leak detection
Table 2. Network topology of the proposed MSDD-GAN model.

Model | Layer | Activation | Parameters
Generator | 1. Input layer | A and C | Image sizes: 256 × 256, 192 × 192, 64 × 64
Generator | 2. Encoder with 4 Conv2D | LeakyReLU (α = 0.2) | Kernel: 4 × 4, stride: 2, padding: same; filters per layer: 64, 128, 256, 512
Generator | 3. Decoder with 4 Conv2DTranspose | ReLU | Kernel: 4 × 4, stride: 2, padding: same; filters per layer: 512, 256, 128, 64
Discriminator | 1. Input layer | C and C' | Image size: 256 × 256
Discriminator | 2. Conv2D | LeakyReLU (α = 0.2) | Filters: 64, kernel: 4 × 4, stride: 2
Discriminator | 3. Conv2D | LeakyReLU (α = 0.2) | Filters: 128, kernel: 4 × 4, stride: 2
Discriminator | 4. BatchNorm | [0, 1] | —
Discriminator | 5. Conv2D | LeakyReLU (α = 0.2) | Filters: 256, kernel: 4 × 4, stride: 2
Discriminator | 6. BatchNorm | [0, 1] | —
Discriminator | 7. Conv2D | LeakyReLU (α = 0.2) | Filters: 512, kernel: 4 × 4, padding: same
Discriminator | 8. BatchNorm | [0, 1] | —
Discriminator | 9. Output layer | Sigmoid | Single output per batch
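The following is a minimal Keras sketch of the encoder–decoder generator and the discriminator summarized in Table 2. It follows the listed kernel sizes, strides, filter counts, and activations; everything else (the generator output layer, the thermal-image channel count, how the three input scales of 256, 192, and 64 are fused, and the concatenated condition/candidate input to the discriminator) is not specified in the table and is assumed here for illustration only, as are all variable names.

```python
# Illustrative Keras sketch of the Table 2 topology (single 256 x 256 path; names are assumptions).
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_generator(img_size=256):
    inp = layers.Input((img_size, img_size, 3))                 # RGB condition image
    x = inp
    # Encoder: 4 x Conv2D, 4x4 kernel, stride 2, LeakyReLU(0.2), filters 64/128/256/512
    for f in (64, 128, 256, 512):
        x = layers.Conv2D(f, 4, strides=2, padding="same")(x)
        x = layers.LeakyReLU(0.2)(x)
    # Decoder: 4 x Conv2DTranspose, 4x4 kernel, stride 2, ReLU, filters 512/256/128/64
    for f in (512, 256, 128, 64):
        x = layers.Conv2DTranspose(f, 4, strides=2, padding="same", activation="relu")(x)
    # Output layer is an assumption: 3-channel thermal image in [-1, 1]
    out = layers.Conv2D(3, 4, padding="same", activation="tanh")(x)
    return Model(inp, out, name="generator")

def build_discriminator(img_size=256):
    # Assumed cGAN-style input: RGB condition and (real or generated) thermal image, channel-concatenated
    inp = layers.Input((img_size, img_size, 6))
    x = layers.Conv2D(64, 4, strides=2, padding="same")(inp)
    x = layers.LeakyReLU(0.2)(x)
    for f in (128, 256):
        x = layers.Conv2D(f, 4, strides=2, padding="same")(x)
        x = layers.BatchNormalization()(x)
        x = layers.LeakyReLU(0.2)(x)
    x = layers.Conv2D(512, 4, padding="same")(x)
    x = layers.BatchNormalization()(x)
    x = layers.LeakyReLU(0.2)(x)
    # Single sigmoid score per image, as listed in Table 2 (pooling choice is an assumption)
    x = layers.GlobalAveragePooling2D()(x)
    out = layers.Dense(1, activation="sigmoid")(x)
    return Model(inp, out, name="discriminator")
```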
Table 3. Summary of MSDD-GAN training parameters.

Parameter | Configuration
Epochs | 100
# (training samples) | 560
Batch size | 1
Loss target | 0.01 dB
# (generators) | 1
# (discriminators) | 2
Total training time | 37 h
Total training steps | 56,000
Hardware | AMD Ryzen 7 5800H processor with Radeon Graphics (3.20 GHz); RAM: 32 GB DDR5
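As a rough illustration of how the Table 3 settings fit together (100 epochs × 560 samples at batch size 1 gives the reported 56,000 steps), the skeleton below sketches one training step with a single generator and two discriminators. The loss formulation, the ROI-masked input to the localization discriminator, the Adam settings, and the L1 weight are assumptions made for this sketch; they are not the paper's reported configuration.

```python
# Training skeleton under assumed losses/optimizers; only the hyperparameters come from Table 3.
import tensorflow as tf

EPOCHS, BATCH_SIZE, N_TRAIN = 100, 1, 560
TOTAL_STEPS = EPOCHS * (N_TRAIN // BATCH_SIZE)        # 100 * 560 = 56,000 steps, as in Table 3

bce = tf.keras.losses.BinaryCrossentropy()
g_opt = tf.keras.optimizers.Adam(2e-4, beta_1=0.5)     # assumed pix2pix-style settings
d_real_opt = tf.keras.optimizers.Adam(2e-4, beta_1=0.5)
d_loc_opt = tf.keras.optimizers.Adam(2e-4, beta_1=0.5)

def train_step(generator, d_real, d_loc, rgb, thermal, roi_mask):
    """One step: update the realism discriminator, the localization discriminator, and the generator."""
    with tf.GradientTape(persistent=True) as tape:
        fake = generator(rgb, training=True)
        # Realism discriminator sees full (condition, thermal) pairs.
        real_out = d_real(tf.concat([rgb, thermal], axis=-1), training=True)
        fake_out = d_real(tf.concat([rgb, fake], axis=-1), training=True)
        # Localization discriminator sees the ROI-masked region (assumed use of the AttMask output).
        loc_real = d_loc(tf.concat([rgb, thermal * roi_mask], axis=-1), training=True)
        loc_fake = d_loc(tf.concat([rgb, fake * roi_mask], axis=-1), training=True)
        d_real_loss = bce(tf.ones_like(real_out), real_out) + bce(tf.zeros_like(fake_out), fake_out)
        d_loc_loss = bce(tf.ones_like(loc_real), loc_real) + bce(tf.zeros_like(loc_fake), loc_fake)
        g_loss = (bce(tf.ones_like(fake_out), fake_out)           # fool the realism discriminator
                  + bce(tf.ones_like(loc_fake), loc_fake)         # fool the localization discriminator
                  + 100.0 * tf.reduce_mean(tf.abs(thermal - fake)))  # L1 term; weight is an assumption
    g_opt.apply_gradients(zip(tape.gradient(g_loss, generator.trainable_variables),
                              generator.trainable_variables))
    d_real_opt.apply_gradients(zip(tape.gradient(d_real_loss, d_real.trainable_variables),
                                   d_real.trainable_variables))
    d_loc_opt.apply_gradients(zip(tape.gradient(d_loc_loss, d_loc.trainable_variables),
                                  d_loc.trainable_variables))
    del tape
    return g_loss, d_real_loss, d_loc_loss
```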
Table 4. Quantitative comparison of the proposed MSDD-GAN with the baseline cGAN, Pix2Pix GAN, and CycleGAN.

Dataset | Model | ACC (%) | PSNR (dB) | SSIM | MAE
D1 | cGAN | 78.64 | 53.35 | 0.688 | 0.049
D1 | MSDD-GAN | 81.60 | 59.98 | 0.800 | 0.035
D1 | Pix2Pix GAN | 48.31 | 21.09 | 0.687 | 0.145
D1 | CycleGAN | 49.42 | 12.41 | 0.308 | 0.391
D2 | cGAN | 72.96 | 48.04 | 0.640 | 0.057
D2 | MSDD-GAN | 75.20 | 54.11 | 0.753 | 0.043
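To make the Table 4 columns concrete, the snippet below shows one plausible way to compute the four metrics for a single generated/target thermal-image pair with NumPy and scikit-image. The reported values are test-set means, and the pixel-accuracy tolerance used here is an illustrative assumption rather than the paper's exact accuracy definition.

```python
# Per-image metrics matching the Table 4 columns (scikit-image >= 0.19 for channel_axis).
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def evaluate_pair(generated, target, tol=0.05):
    """generated, target: float arrays in [0, 1], shape (H, W) or (H, W, C)."""
    mae = float(np.mean(np.abs(generated - target)))
    psnr = peak_signal_noise_ratio(target, generated, data_range=1.0)       # in dB
    ssim = structural_similarity(target, generated, data_range=1.0,
                                 channel_axis=-1 if generated.ndim == 3 else None)
    acc = float(np.mean(np.abs(generated - target) <= tol)) * 100.0         # % of pixels within tolerance
    return {"ACC (%)": acc, "PSNR (dB)": psnr, "SSIM": ssim, "MAE": mae}
```

Averaging the returned dictionary over all test pairs of a dataset would yield figures at the level of the rows in Table 4.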
