Article

Image Noise Removal in Ultrasound Breast Images Based on Hybrid Deep Learning Technique

by Baiju Babu Vimala 1,*, Saravanan Srinivasan 2, Sandeep Kumar Mathivanan 3, Venkatesan Muthukumaran 4, Jyothi Chinna Babu 5,*, Norbert Herencsar 6 and Lucia Vilcekova 7
1 School of Computer Science and Engineering, Vellore Institute of Technology, Vellore 632014, India
2 Department of Computer Science and Engineering, Vel Tech Rangarajan Dr. Sagunthala R&D Institute of Science and Technology, Chennai 600062, India
3 School of Information Technology and Engineering, Vellore Institute of Technology, Vellore 632014, India
4 Department of Mathematics, College of Engineering and Technology, SRM Institute of Science and Technology, Kattankulathur 603203, India
5 Department of Electronics and Communications Engineering, Annamacharya Institute of Technology and Sciences, Rajampet 516126, India
6 Department of Telecommunications, Faculty of Electrical and Communication Engineering, Brno University of Technology, Technicka 12, 616 00 Brno, Czech Republic
7 Faculty of Management, Comenius University Bratislava, Odbojarov 10, 820 05 Bratislava, Slovakia
* Authors to whom correspondence should be addressed.
Sensors 2023, 23(3), 1167; https://doi.org/10.3390/s23031167
Submission received: 13 December 2022 / Revised: 11 January 2023 / Accepted: 16 January 2023 / Published: 19 January 2023
(This article belongs to the Section Sensing and Imaging)

Abstract

Rapid improvements in ultrasound imaging technology have made it much more useful for screening and diagnosing breast problems. Local speckle noise in ultrasound breast images may impair image quality and hinder observation and diagnosis, so removing localized noise from these images is crucial. In this article, we use a hybrid deep learning technique to remove local speckle noise from breast ultrasound images. The contrast of the ultrasound breast images was first improved using logarithmic and exponential transforms, and a guided filter algorithm was then used to enhance the details of the glandular ultrasound breast images. To finish the pre-processing of the ultrasound breast images and enhance image clarity, a spatial high-pass filtering algorithm was used to remove the excessive sharpening. Finally, to remove local speckle noise without sacrificing the image edges, an edge-sensitive term was added to the Logical-Pool Recurrent Neural Network (LPRNN). The mean square error and false recognition rate both fell below 1.1% at the hundredth training iteration, showing that the LPRNN had been properly trained. After local speckle noise removal, the ultrasound images had signal-to-noise ratios (SNRs) greater than 65 dB, peak SNRs greater than 70 dB, edge preservation index values greater than the experimental threshold of 0.48, and short processing times. The time required to remove local speckle noise is low, edge information is preserved, and image features are brought into sharp focus.

1. Introduction

The process of noise removal from an image has been studied for decades as researchers attempt to tackle this “classical challenge”. Filters were once used to lessen the visual disturbances in photographs, and they were effective up to a certain degree of noise in an image; however, using such filters would blur the image, and if the image is overly noisy, the final product becomes so fuzzy that important information is lost [1]. Breast illness is on the rise: breast hyperplasia is the most prevalent breast condition, while breast malignancy is the most frequent female malignancy. Early identification and efficient treatment boost the clinical outcome of breast illness [2]. Ultrasound imaging technology is now used to diagnose and treat breast cancer. Due to its low cost and good performance, ultrasonic imaging is used to identify breast cancer, and breast ultrasound imaging is an excellent tool for checking breast health. Ultrasound scattering causes speckle noise, which lowers image resolution, alters how the clinician interprets the ultrasound image, and prevents accurate health monitoring and assessment [3]. Consequently, the significance of research into reducing speckle noise in ultrasound imaging of the breast cannot be overstated. Complex machine learning techniques such as three-dimensional deep learning handle image data efficiently, and deep learning classification methods have successfully identified data based on both local and global features. The use of deep learning for the interpretation and investigation of breast ultrasound images is more effective and uses less computing power than traditional approaches. Using a Deep Convolutional Neural Network (DCNN) [4], a method was developed for automatically classifying lesions in ultrasound images of the thyroid and breast; with the same architecture and transfer learning in mind, a general DCNN structure parameter setting was proposed and evaluated on real-world ultrasound images. Other authors conduct a comprehensive comparison of deep image noise-removal algorithms: they first separate DCNN-based approaches into four categories (additive white noise images, genuine noisy images, blind noise removal, and hybrid noisy images), then compare and contrast the goals and underlying ideas of the various deep learning approaches, next conduct a quantitative and qualitative comparison of state-of-the-art approaches using publicly available noise-removal datasets, and finally suggest several obstacles and avenues for further study [5]. Facial recognition algorithms function best in laboratory conditions where lighting, expression, and pose can all be carefully controlled. There have been studies using infrared (IR) and three-dimensional (3D) images of faces for facial recognition, and studies have shown that fusing multiple facial modalities yields better results than using a single one. Noise removal is a common problem in the image-processing domain and computer-aided vision, where the objective is to approximate the original image by reducing noise. Image noise is frequently unavoidable due to intrinsic (sensor) and extrinsic (environment) factors. In several applications, including image restoration, eye tracking, image registration, segmentation, and image classification, recovering the actual image content is critical for good performance.
While various methods have been developed for image noise removal, image noise reduction remains difficult, particularly when images are collected under poor conditions with excessive noise [6].
The proposed technique suppresses local speckle noise in ultrasound breast images, retains edge information, and makes image features apparent, laying the groundwork for intelligent ultrasound image processing and application. Future work will extend this investigation to other types of noise and to the characteristics of ultrasound, MRI, and CT images.

2. Literature Review

Deep learning is widely used for image noise removal, and deep learning algorithms for this task vary greatly: deep learning-based discriminative learning can handle Gaussian noise, while deep learning-based optimization methods estimate the actual noise; however, few studies have summarized deep learning image noise-removal algorithms [7]. Noise removal is crucial for medical imaging analysis, diagnosis, and therapy. Deep learning-based image noise-removal algorithms are effective but restricted by sample size; one study developed a deep feed-forward noise-removal CNN for medical image noise removal using a modest sample set, demonstrating a new way of solving low-level vision problems [8]. Another work proposes a training program that adapts an approach initially conceived for unsupervised learning to image noise removal and blind inpainting; this is accomplished with encoding and deep networks pre-trained with a noise-removal auto-encoder, and for the image noise-removal challenge the solution performs just as well as the popular K-singular value decomposition (KSVD) sparse coding method [9]. In another technique for image noise removal, contrast is adaptively increased: it is not always feasible to capture high-quality photographs of a subject because images are shot under a wide range of lighting conditions, so the perceived quality of the obtained photographs is low; it is therefore crucial to boost image quality while keeping edge detail intact as much as possible [10]. Another study offers a thorough overview of the most up-to-date developments in deep neural network-based image noise-removal methods, first providing a detailed explanation of the image noise-removal problem statement before moving on to the benchmark datasets and the metrics often used for objective evaluation [11]. To comprehend the scope and advancement of deep learning in noise removal, research scholars, academics, and industry experts should evaluate numerous methodologies; one such study presents three distinct noise-canceling frameworks: the wavelet, the pulse-coupled, and the CNN [12]. Fast CNN architectures are also used for noise removal: the spatial-wise attention residual network (SARN) method helps train on bigger, better datasets and preserves details and image edges during reconstruction [13]. Image restoration removes noise and creates a copy as similar to the original as feasible. One study focuses on feed-forward noise removal: using a convolutional neural network (CNN), it deblocks Joint Photographic Experts Group (JPEG) images, improves resolution through super-resolution, and removes blind Gaussian noise, while batch normalization and residual training boost noise-removal performance and reduce operation time [14]. Using the raw information available in k-space, two unsupervised methods for noise removal in magnetic resonance imaging (MRI) images have been compared in the complex image space; both approaches are based on Stein’s unbiased risk estimator, although the latter is more restrictive due to its use of a blind-spot network, and both are evaluated on two datasets, one with genuine MRI scans of knees and the other with simulated scans of the brain, which are suitable for noise removal since they include information on the complex image space [15].
In one presented approach, a stretched convolutional neural network is combined with pre- and post-processing methods to enlarge the receptive field; as a result, distant pixels in the source images help enrich the image features of the learned model, which denoises the images effectively [16]. Another author proposes a lightweight, effective neural network-based raw image denoiser that works well on popular mobile devices and yields high-quality noise-removal results; the two main takeaways are (i) a simpler network trained on synthetic, sensor-specific data may outperform bigger networks trained on general data, and (ii) a unique k-Sigma transform can reduce the huge variance of noise levels across varied International Organization for Standardization (ISO) settings, enabling a tiny network to effectively manage a broad range of noise levels [17]. A further study suggests a new deep network design for non-local image noise removal that may be used for both grayscale and colour images; variational approaches that exploit the non-local self-similarity of natural images inspired the general architecture of the proposed network, and on this foundation deep networks capable of non-local processing are developed that also benefit greatly from discriminative training [18]. Due to the complex nature of data collection, developing an unsupervised speckle-reduction solution for practical purposes is a difficult challenge: in practice, the distortion distribution is often too complicated for the simple additive white Gaussian hypothesis to hold, which severely compromises the effectiveness of Gaussian denoisers [19]. Classifying histopathological images (HIs) improves clinicians’ ability to diagnose ailments and treat patients; due to their ability to automatically extract characteristics, deep learning algorithms have found widespread use in a number of different sectors, most notably medical imaging [20].

3. Proposed Framework Methodologies

3.1. Dataset Availability

Experimental data were collected from the INbreast and CBIS-DDSM (Curated Breast Imaging Subset of the Digital Database for Screening Mammography) datasets. INbreast dataset: INbreast contains 120 cases (412 images); 91 of these originated from women imaged on both breasts (four images each), whereas the remaining 30 came from patients who had undergone a mastectomy (two images each). A number of different inflammatory lesions caused deformities, and the expert also provided detailed lesion outlines in extensible markup language (XML) format. CBIS-DDSM dataset: this well-known and sizable breast data collection is split into four directories: benign without call-backs, benign, malignant, and normal. Each folder contains several instances, each representing a sample of a certain kind of breast examination. Using the approach proposed in this paper, we rapidly destroy local speckle noise in a sample of 1000 ultrasound breast images drawn from these two datasets, where 800 images are used to train the logical-pool recurrent neural network and the remaining 200 images are used as test samples during evaluation.
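For concreteness, the following is a minimal sketch of how such a 1000-image sample might be assembled and split into 800 training and 200 test images; the folder layout, file pattern, and random seed are illustrative assumptions, not part of the original experimental setup.

```python
import random
from pathlib import Path

# Hypothetical local folders holding images exported from the INbreast and
# CBIS-DDSM collections; adapt the paths and file pattern to your own copy.
all_images = sorted(Path("data/inbreast").glob("*.png")) + \
             sorted(Path("data/cbis_ddsm").glob("*.png"))

random.seed(42)  # fixed seed so the 800/200 split is reproducible
sample = random.sample(all_images, k=min(1000, len(all_images)))
train_files, test_files = sample[:800], sample[800:]  # 800 train / 200 test
```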

3.2. Effective Destruction of Local Speckle Noise in Breast Images

Most methods for reducing noise in ultrasound images assume an additive background noise model. In this work, local speckle noise reduction in ultrasound breast images while preserving image edge information is achieved by using a Logical-Pool Recurrent Neural Network (LPRNN) as the local speckle destruction method, as shown in Figure 1. The method of local speckle noise removal from an ultrasound breast image consists of three main steps: pre-processing, training, and denoising. Initially, the obtained ultrasound breast images are aligned, standardized and clear ultrasound images are selected, and contrast enhancement is completed. After the data have been processed, data augmentation is carried out to create training samples, and the LPRNN is then obtained by training on these samples. After contrast enhancement, the trained LPRNN performs speckle denoising of the ultrasound breast images. Figure 1 depicts the local speckle denoising method for breast ultrasound images, which has two phases: (a) the training phase and (b) the denoising (noise-free) phase. Segmenting digital images is crucial in many disciplines.
Image segmentation separates objects from the background and is required for object detection; noisy images make this more problematic. Noise such as salt-and-pepper, Gaussian, Poisson, and speckle noise degrades most digital images. Speckle noise affects the pixels of a grayscale image and is common in low-luminance imaging such as synthetic aperture radar (SAR) and MRI. Image enhancement therefore reduces speckle noise before object identification, segmentation, edge detection, and similar tasks.

3.3. Breast Image Pre-Processing and Image Enhancement

It is crucial to pre-process breast ultrasound images before attempting to reduce local speckle noise. The image-guided filtering algorithm is used in the pre-processing of ultrasound breast images, and the operation is broken down into three distinct parts. In the first phase, the contrast of the supplied breast image is processed. In the second stage, the image-guided filtering algorithm is used to enhance the level of detail of the ultrasound image. In the third phase, spatial high-pass filtering is applied to the ultrasound breast images to reduce the excessive sharpening introduced in the previous step. Ultrasound imaging of the breast may be drastically improved by adjusting the grayscale values. Employing logarithmic and exponential transformations over the range of grayscale means N in [0, 260], this research enhances the contrast of grayscale values in the input breast ultrasound images. When applied to an input breast ultrasound image f(i, j), the logarithmic transform increases contrast by enlarging the lower grey-value interval and compressing the upper grey-value interval. The logarithmic transform can be expressed as:
$$ h(i,j) = a \cdot L\big(f(i,j)\big)^{b} + c \tag{1} $$
Here, c and a are the exponential and logarithmic transform coefficients, respectively; L(·) is the logarithmic function, and b is a constant in the interval (0, 1). In this research, we employed an exponential transformation to handle images with excessive brightness, avoiding the whitening and inadequate compensation issues that arise when only the logarithmic transform is used. The exponential transform can be expressed as:
$$ h(i,j) = a\,b^{\,(f(i,j)-a)\,c} - 1 \tag{2} $$
Here, h(i, j) is the image after the transformation. Combining the two transforms according to the grayscale mean N, the overall contrast enhancement can be written as:
$$ h(i,j) = \begin{cases} a \cdot L\big(f(i,j)\big)^{b} + c, & 0 \le N < 100 \\ f(i,j), & 100 \le N < 180 \\ a\,b^{\,(f(i,j)-a)\,c} - 1, & 180 \le N < 260 \end{cases} \tag{3} $$
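As an illustration of Equations (1)–(3), the sketch below applies the logarithmic, identity, or exponential branch depending on the grayscale mean of the input image. The coefficient values a, b, c and the exponential base are placeholders chosen only to make the sketch runnable; the paper does not report its exact settings.

```python
import numpy as np

def contrast_enhance(f, a=40.0, b=1.0, c=0.02, base=1.02):
    """Piecewise contrast enhancement in the spirit of Eq. (3).
    f: 2-D grayscale image (uint8 or float). Coefficients are placeholders."""
    f = f.astype(np.float64)
    n = f.mean()                                  # grayscale mean N
    if n < 100:                                   # dark image: log stretch
        h = a * np.log1p(f) ** b + c
    elif n < 180:                                 # mid-range: leave unchanged
        h = f.copy()
    else:                                         # bright image: exponential
        h = a * np.power(base, (f - a) * c) - 1.0
    # rescale to the 8-bit display range
    h = 255.0 * (h - h.min()) / (h.max() - h.min() + 1e-12)
    return h.astype(np.uint8)
```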
Equation (3) adjusts the grayscale values of the image once the appropriate transform has been applied, yielding an image with enhanced contrast that serves as a good guide image. Guided filtering models the output as a local linear transformation of the guide image. A weighted mean is therefore used to express the filtered value n_i of the i-th pixel of the contrast-enhanced ultrasound breast image M to be denoised, given the guide image I_m, producing the enhanced ultrasound breast image N. The computation is written as:
$$ n_i = \sum_{j} m_j\, Z_{ij}(I_m) \tag{4} $$
Here, m_j denotes the j-th element of the weighted-average vector and Z_{ij}(I_m) is the filter kernel function, computed as:
$$ Z_{ij}(I_m) = \frac{1}{|v|^{2}} \sum_{r:\,(i,j)\in v_r} \left( 1 + \frac{(I_{m,i}-\mu_r)(I_{m,j}-\mu_r)}{\vartheta_r^{2}+\epsilon} \right) \tag{5} $$
Here, the kernel Z_{ij}(I_m) is computed from the guide image I_m and is uncorrelated with the ultrasound breast image M whose local speckle noise is to be destroyed; v_r is the kernel window, |v| is the number of pixels in the window, μ_r and ϑ_r² are the mean and variance of I_m within v_r, and ε is a smoothing (regularization) parameter. The guided-filter detail enhancement of the breast ultrasound image is written as:
$$ I_m' = n + F\,(I_m - n) \tag{6} $$
Here, F is the enhancement-degree correction parameter and I_m′ is the detail-enhanced image.
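The following is a minimal, self-contained sketch of guided filtering and the detail boost of Equation (6), using the standard box-filter formulation of the guided filter; the window radius, regularization ε, and enhancement factor F are illustrative assumptions rather than the paper's settings.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, src, radius=4, eps=1e-3):
    """Box-filter guided filter (standard formulation); a sketch, not the
    authors' exact implementation. guide, src: float images scaled to [0, 1]."""
    win = 2 * radius + 1
    mean_i = uniform_filter(guide, win)
    mean_p = uniform_filter(src, win)
    corr_ip = uniform_filter(guide * src, win)
    corr_ii = uniform_filter(guide * guide, win)
    var_i = corr_ii - mean_i * mean_i            # per-window variance of guide
    cov_ip = corr_ip - mean_i * mean_p
    a = cov_ip / (var_i + eps)                   # local linear coefficients
    b = mean_p - a * mean_i
    mean_a = uniform_filter(a, win)
    mean_b = uniform_filter(b, win)
    return mean_a * guide + mean_b               # smoothed output n

def detail_enhance(img, F=2.0, radius=4, eps=1e-3):
    """Detail boost in the spirit of Eq. (6): I' = n + F * (I - n)."""
    n = guided_filter(img, img, radius=radius, eps=eps)  # self-guided smoothing
    return n + F * (img - n)
```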

3.4. High-Pass Ultrasound Image Spatial Filtering for Breast Images

The spatial filter modifies the frequency distribution of the image in order to suppress selected components, which increases image contrast. To sharpen the image, the high-pass filter attenuates low-frequency content during the locally efficient processing of the input image, which removes the local over-sharpening introduced by the image-guided filtering step. The high-pass filter template H_p(u, v) can be expressed as:
$$ H_p(u,v) = \frac{1}{7} \begin{pmatrix} -2 & -1 & -2 \\ -1 & 18 & -1 \\ -2 & -1 & -2 \end{pmatrix} \tag{7} $$
The output ultrasound breast image b_ui(a, b) after the guided-filter enhancement and high-pass filtering is written as:
$$ b_{ui}(a,b) = \sum_{u=0}^{r} \sum_{v=0}^{r} f(a-u,\, b-v)\, H_p(u,v) \tag{8} $$
Here, u and v index the high-pass filter template, a and b index the ultrasound breast image coordinates, and r is the largest template index. Combining Equations (6) and (8) gives the spatially high-pass-filtered ultrasound breast image I_m″(a, b):
$$ I_m''(a,b) = \sum_{u=0}^{r} \sum_{v=0}^{r} I_m'(a-u,\, b-v)\, H_p(u,v) \tag{9} $$
The pre-processing steps of spatial high-pass filtering for breast ultrasound images can be summarized as follows (a code sketch of these steps is given after the list):
  • Step 1: To improve guide-image acquisition, first determine the grayscale mean N of the source ultrasound breast image f, and then use Equation (3) to obtain the corresponding contrast-enhanced breast ultrasound image h.
  • Step 2: The contrast-enhanced ultrasound breast image from Step 1 is used as the guide image I′ of the guided-filtering algorithm.
  • Step 3: The resulting ultrasound breast image I″ is obtained by enhancing the edge data of I′ with the high-pass filter in order to increase edge-data retention.
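Putting Steps 1–3 together, the sketch below chains the contrast enhancement, guided-filter detail boost, and spatial high-pass filtering; it reuses the contrast_enhance and detail_enhance sketches above. The sign pattern of the 3 × 3 high-pass template (negative surround, positive centre) is an assumption, since only the magnitudes of Equation (7) are recoverable from the text.

```python
import numpy as np
from scipy.ndimage import convolve

# 3x3 high-pass template in the spirit of Eq. (7); the negative surround /
# positive centre sign pattern is assumed.
HP_KERNEL = (1.0 / 7.0) * np.array([[-2.0, -1.0, -2.0],
                                    [-1.0, 18.0, -1.0],
                                    [-2.0, -1.0, -2.0]])

def preprocess(f, F=2.0):
    """Steps 1-3 of the pre-processing pipeline (sketch)."""
    h = contrast_enhance(f).astype(np.float64) / 255.0            # Step 1: Eq. (3)
    i_prime = detail_enhance(h, F=F)                              # Step 2: Eq. (6)
    i_double_prime = convolve(i_prime, HP_KERNEL, mode="reflect") # Step 3: Eq. (9)
    return i_double_prime
```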

3.5. Logical-Pool Recurrent Neural Network—Local Speckle Noise Destruction

To address local speckle noise destruction in ultrasound breast images as an image-to-image mapping problem, we introduce a logical-pool recurrent neural network and train an end-to-end model on training samples composed of the pre-processed breast images. A logical-pool recurrent neural network can reduce local speckle noise in ultrasound images, but at the expense of edge blurring and some data loss. Building on the logical-pool recurrent neural network, the proposed algorithm explicitly targets edge-information loss in its objective function, thereby decreasing the likelihood of losing edge information in ultrasound breast images during local speckle noise destruction. The fundamental structure of a recurrent neural network is shown in Figure 2. Feeding the ultrasound breast images into the logical-pool recurrent neural network model yields results with and without local speckle noise reduction. The layers of the logical-pool recurrent neural network model include an input layer, a convolution layer, a pooling layer, a fully connected layer, and an output layer.
Figure 3 depicts the detailed design of the logical-pool recurrent neural network model; the convolutional nodes (1–6) are all fully connected. Layers Z_ij (input), G_l (convolution), R_l (pooling), R1 (fully connected), and R2 (output) are shown in Figure 3, with sizes of 8 × 32 × 32, 5 × 5 × 3, 2 × 2 × 1, 9 × 9 × 1, and 3 × 3 × 1, respectively. The fully connected layer is placed before the output layer, the convolution and pooling layers are arranged in counter-clockwise order, and the data flowing through and out of the convolution and pooling layers are neural network feature bodies. To select the stacked high-level features, it is necessary first to determine the logical-pool recurrent convolution kernels of all the convolution layers, then to create a new feature space by combining the recurrent convolution kernels of different convolution layers, and finally to activate the nonlinear function by improving the bias term. The pooling layer is the most crucial aspect of a logical-pool recurrent neural network, since it can decrease the number of features without altering their local information. Denoting the neural network layer as l_n and the pooling layer as V, the pooling-layer input is written as:
$$ V = \left[\, p_1^{\,l_n},\, p_2^{\,l_n},\, p_3^{\,l_n},\, p_4^{\,l_n},\, p_5^{\,l_n}, \ldots,\, p_r^{\,l_n} \right] \in A^{X \times Y \times Z \times K} \tag{10} $$
To produce its output, the max-pooling layer takes the largest value within each cube of features:
$$ V_1 \in A^{X' \times Y' \times Z' \times K} \tag{11} $$
Before feature extraction the deep feature size is (X, Y, Z); after feature extraction it is (X′, Y′, Z′), and K denotes the number of feature spaces. Every neuron in a fully connected layer is connected to all neurons in the preceding layer. Breast ultrasound images that have undergone the preceding pre-processing are used to train the logical-pool recurrent neural network-based local speckle noise destruction framework, and the local-speckle-noise-free images form the model’s output layer.
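The exact LPRNN architecture is not fully specified here, so the following PyTorch sketch only illustrates how a convolution, max-pooling, fully connected, output stack of the kind described could be wired for patch-wise denoising; the channel counts, patch size, and hidden width are placeholders, and this is not the authors' network.

```python
import torch
import torch.nn as nn

class SpeckleDenoiser(nn.Module):
    """Illustrative conv/pool/fully-connected denoiser (not the authors' LPRNN)."""
    def __init__(self, patch=32):
        super().__init__()
        self.patch = patch
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=5, padding=2),   # convolution layer
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                             # 2x2 max-pooling layer
        )
        hidden = 8 * (patch // 2) * (patch // 2)
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(hidden, 256),                      # fully connected layer
            nn.ReLU(inplace=True),
            nn.Linear(256, patch * patch),               # output layer
        )

    def forward(self, x):                                # x: (B, 1, patch, patch)
        y = self.head(self.features(x))
        return y.view(-1, 1, self.patch, self.patch)     # denoised patch
```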

Local Speckle Noise Destruction Algorithm

The operations of the recurrent deep learning-based method that rapidly eliminates local speckle noise in ultrasound breast images are as follows. The input ultrasound breast image f(x, y) is pre-processed, a logical-pool recurrent neural network architecture is constructed, and the outcome is the ultrasound breast image after local speckle noise destruction, as shown in Figure 4.
Algorithm 1. Noise removal of the local speckle noise.
1:Begin
2:Logarithmic and exponential transforms are used to improve the contrast of the input ultrasound breast images based on their grayscale values; the guided-filter algorithm is used to enhance the details of the glandular ultrasound images; and the spatial high-pass filtering algorithm is used to remove the over-sharpening of the ultrasound breast images
3:The pre-processed ultrasound breast images are fed into a local-speckle-noise destruction model of a logical-pool recurrent neural network
4:Ultrasound breast images are susceptible to losing image edge information during the local speckle noise reduction procedure. To preserve the edge information after local speckle noise removal, we must quantify how that information is lost during processing. The edge information loss is defined as (see the code sketch after the algorithm):
$$ \mathrm{Loss}_{\mathrm{Edge}}(P) = \log I_m(a,b) \sum_{i,j} \left| b_{i+1,j} - b_{i,j} \right| \tag{12} $$
5:Following the stages above, we construct ultrasound image gradients and use the edge loss to compare the edges of the denoised output with those of canonical clear ultrasound breast images. The particular anatomy of the breast makes vertical gradients especially significant, which is why contrast in the vertical direction is used first to characterize breast ultrasound images. Integrating the edge loss Loss_Edge(P) and the L1 distance with the recurrent neural network yields the following objective function:
$$ P^{*},\, C^{*} = \arg\min_{P}\max_{C}\ \mathrm{Loss}_{\mathrm{HRNN}}(P,C) + b\,\mathrm{Loss}_{L1}(P) + \beta\,\mathrm{Loss}_{\mathrm{Edge}}(P) \tag{13} $$
6:Enhancing the loss function strengthens the edge-specific improvement of the ultrasound images during training of the logical-pool recurrent neural network. The resulting model is more responsive to local speckle noise destruction near edges, enhancing its effect on ultrasound breast images
7:While noise removal reduces the local speckle noise of ultrasound breast images, the edge information is preserved by the action of the edge-sensitive term in the logical-pool recurrent neural network as described above
8:End
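As referenced in step 4 of the algorithm, the sketch below shows how a vertical-gradient edge term and an L1 reconstruction term could be combined into a single training loss, loosely mirroring Equations (12) and (13); the weights and the omission of the adversarial min–max component are simplifying assumptions.

```python
import torch
import torch.nn.functional as F

def edge_loss(pred, target):
    """Vertical-gradient edge term in the spirit of Eq. (12): penalise the
    difference between vertical gradients of the denoised output and the
    clean reference images (tensors of shape (B, 1, H, W))."""
    grad_pred = pred[:, :, 1:, :] - pred[:, :, :-1, :]
    grad_target = target[:, :, 1:, :] - target[:, :, :-1, :]
    return torch.mean(torch.abs(grad_pred - grad_target))

def total_loss(pred, target, b=1.0, beta=0.1):
    """L1 reconstruction plus the edge-sensitive term, mirroring the weighted
    objective of Eq. (13) without its adversarial component (a sketch)."""
    return b * F.l1_loss(pred, target) + beta * edge_loss(pred, target)
```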

3.6. Performance Metric Evaluation Standards

When it comes to reducing local speckle noise in ultrasound breast images, the training of the logical-pool recurrent neural network is more effective when both the mean square error (MSE) and the false recognition rate are lower. Different ultrasound breast images show varying levels of local speckle noise destruction depending on how close the PSNR and SNR of the denoised images are to the respective threshold levels of 55 dB and 70 dB. The PSNR is computed as:
$$ \mathrm{PSNR} = 10 \log_{10}\!\left( \frac{\mathrm{maximum}_I^{2}}{\mathrm{mean\ square\ error}} \right) \tag{14} $$
Here, maximum_I² is the square of the maximum pixel value of the image, and the mean square error measures the deviation between the original and denoised images. The results of local speckle noise removal from ultrasound breast images are also measured by the border protection index (BPI, equivalently the edge preservation index, EPI): evaluating the proposed algorithm’s local speckle noise destruction in ultrasonic images requires consideration not just of the signal-to-noise ratio and peak signal-to-noise ratio, but also of the degree to which image boundaries are preserved after noise removal. When evaluating edge-retention performance, a higher EPI value indicates more robust border protection. The BPI is calculated from the experimental data as:
$$ \mathrm{BPI} = \frac{\displaystyle\sum_{i=0}^{n} \left( \Delta a - \overline{\Delta a} \right) \left( \Delta b - \overline{\Delta b} \right)}{\sqrt{\displaystyle\sum_{i=0}^{n} \left( \Delta a - \overline{\Delta a} \right)^{2} \sum_{i=0}^{n} \left( \Delta b - \overline{\Delta b} \right)^{2}}} \tag{15} $$
Here, a is the ultrasound image before noise removal, b is the image after local speckle noise removal, Δ denotes the Laplacian operator, and the overbar denotes the mean of the Laplacian-filtered image. The effectiveness of local speckle noise destruction in ultrasound breast images is proportional to the improvement in image quality achieved by noise removal. The total destruction time is computed as:
$$ T_{des} = \sum_{k=1}^{m} t_{des}^{\,k} \tag{16} $$
Equation (16) sums the per-image noise-removal times t_des^k over the m test images to obtain the total destruction time; the lower the total time spent on image noise removal by an approach, the more efficient its local speckle noise removal in ultrasound breast images.
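A minimal sketch of the evaluation metrics follows: PSNR as in Equation (14) and a Laplacian-based edge/border preservation index in the spirit of Equation (15). The correlation form used for the BPI is the standard EPI definition; the paper's exact normalization may differ.

```python
import numpy as np
from scipy.ndimage import laplace

def psnr(clean, denoised, max_val=255.0):
    """Eq. (14): peak signal-to-noise ratio in dB."""
    diff = clean.astype(np.float64) - denoised.astype(np.float64)
    mse = np.mean(diff ** 2)
    return 10.0 * np.log10(max_val ** 2 / (mse + 1e-12))

def bpi(before, after):
    """Edge/border preservation index in the spirit of Eq. (15): correlation
    between the mean-subtracted Laplacians of the two images."""
    la = laplace(before.astype(np.float64))
    lb = laplace(after.astype(np.float64))
    da, db = la - la.mean(), lb - lb.mean()
    return float(np.sum(da * db) /
                 np.sqrt(np.sum(da ** 2) * np.sum(db ** 2) + 1e-12))
```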

4. Experimental Results

We used the two datasets of ultrasound breast images as the object, scaled the experimental images to the required pixel size, fed them into the LPRNN for iterative trials, and tracked the relationship between the mean square error and the false recognition rate of image local speckle noise identification as the number of iterations changed, as shown in Figure 5. Table 1 lists the LPRNN training outcomes in terms of both mean square error and false recognition rate, and Figure 5 depicts them graphically.
Figure 5 shows that the training MSE for the LPRNN using the proposed algorithm is consistently higher than the false recognition rate. Both the mean square error and the false recognition rate decrease as the number of iterations increases, and both fell to about 1.2% after 100 iterations. The fact that the mean square error and false recognition rate of image local speckle noise identification converge to small values shows that the LPRNN is well suited to ultrasound image training. Table 2 lists the comparison with state-of-the-art image noise-removal methods.
As the experimental object, a sample of ultrasound breast images was chosen randomly from the two datasets, and local speckle noise destruction tests were carried out using the proposed technique to evaluate the signal-to-noise ratio (SNR) and peak SNR. The denoised ultrasound images had anticipated thresholds of 50 dB for SNR and 65 dB for peak SNR. Figure 6 shows that the SNR and peak SNR of the ultrasound images processed by the proposed method were both higher than the experimental thresholds of 60 dB and 65 dB after the local speckle noise was removed. This exemplifies the superior effectiveness of the proposed algorithm over state-of-the-art approaches in noise removal from breast ultrasound images. The proposed technique was also tested on a dataset of ultrasound breast images with a BPI threshold of 0.45, and the BPI data collected after speckle suppression were analyzed statistically.
Figure 7 depicts the BPI range after local speckle noise destruction. Selecting ultrasound breast images from the INbreast and CBIS-DDSM datasets allowed us to denoise the local speckle noise and examine the intuitive impact of the proposed method on ultrasound image local speckle noise removal. Figure 8 shows the difference in image quality before and after speckle noise reduction: local speckles were abundant in the initial ultrasound breast images taken from both datasets, which can lead to erroneous conclusions about breast health and the potential for minor lesions to be missed.
In contrast, when local speckle noise is eradicated using the proposed approach, ultrasound image details are normal and clear, and the images are smooth and evenly distributed. In addition, following ultrasound image refinement, the proposed technique preserves edge information and makes features more visible, allowing for more accurate monitoring of breast health. This demonstrates that the proposed approach successfully denoises ultrasound breast images by eliminating local speckle noise without losing useful information. The ultrasound breast images in the INbreast and CBIS-DDSM datasets were each subjected to white Gaussian noise at levels of 25 dB, 45 dB, and 65 dB, and the algorithms’ denoising outcomes were compared based on PSNR and processing time. Table 3 presents the outcomes of these experiments, the Abbreviation section lists the short forms used in this paper, and Figure 9 gives a graphical comparison of noise removal for the proposed and state-of-the-art methods. Table 3 shows that when white Gaussian noise is added to the INbreast and CBIS-DDSM datasets at varying levels, the proposed approach has lower noise overall and a much smaller peak noise than the other algorithms. The proposed method has reduced noise sensitivity and better local speckle noise removal.

5. Discussion

As can be observed in Figure 7, the BPI values of the denoised ultrasound images processed using the proposed technique were all high and far above the experimental threshold of 0.45. For denoised ultrasound images, the methods of Saeed Izadi et al. (2022) [6], Thayammal et al. (2021) [5], and Sujeet More et al. (2021) [8] achieve BPI values of 0.40–0.45. The Nguyen Thanh-Trung et al. (2021) [11] method has a lower BPI, with a maximum value of just 0.33, while the Dihan Zheng et al. (2021) [14] algorithm consistently has a BPI below 0.2. This clearly demonstrates that the proposed algorithm does not cause a loss of edge information while local speckle noise is suppressed, resulting in good visualization of ultrasound breast images.

6. Conclusions

Clinical objectivity and manipulation are greatly enhanced when ultrasound breast images are processed using an LPRNN. This work employs an LPRNN, a guided-filter algorithm, and spatial high-pass filtering to destroy local speckle noise in ultrasound images. In experiments, the proposed technique suppresses local speckle noise in ultrasound breast images, retains edge information, and makes image features apparent, laying the groundwork for intelligent ultrasound image processing and application. This work does not examine other types of noise or further characteristics of ultrasound images; our future research will investigate additional image noise types and breast image characteristics in order to improve the comprehension and clinical use of ultrasound images.

Author Contributions

Conceptualization, B.B.V. and S.S.; methodology, S.K.M.; validation, V.M.; resources, B.B.V.; data curation, S.S.; writing—original draft preparation, B.B.V. and S.S.; writing—review and editing, J.C.B., N.H. and L.V.; visualization, S.K.M.; supervision, V.M., J.C.B., N.H. and L.V.; project administration, V.M., J.C.B., L.V. and N.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Faculty of Management, Comenius University Bratislava, Odbojarov 10, 820 05 Bratislava, Slovakia.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviation

Short Form      Abbreviation
LPRNN           Logical-Pool Recurrent Neural Network
SNR             Signal-to-Noise Ratio
PSNR            Peak Signal-to-Noise Ratio
DCNN            Deep Convolutional Neural Network
IR              Infrared
3D              Three-Dimensional
KSVD            K-Singular Value Decomposition
SARN            Spatial-wise Attention Residual Network
CNN             Convolutional Neural Network
JPEG            Joint Photographic Experts Group
MRI             Magnetic Resonance Imaging
ISO             International Organization for Standardization
CBIS-DDSM       Curated Breast Imaging Subset of the Digital Database for Screening Mammography
XML             Extensible Markup Language
SAR             Synthetic Aperture Radar
MSE             Mean Square Error
BPI             Border Protection Index

References

  1. Tian, C.; Fei, L.; Zheng, W.; Xua, Y.; Zuo, W.; Lin, C.-W. Deep Learning on Image Denoising: An Overview. Neural Netw. 2020, 131, 251–275.
  2. Zhou, H.; Mian, A.; Wei, L.; Creighton, D.; Hossny, M. Recent Advances on Single modal and Multimodal Face Recognition: A Survey. IEEE Trans. Hum.-Mach. Syst. 2014, 44, 701–716.
  3. Jifara, W.; Jiang, F.; Rho, S.; Cheng, M.; Liu, S. Medical image denoising using convolutional neural network: A residual learning approach. J. Supercomput. 2019, 75, 704–718.
  4. Xie, J.; Xu, L.; Chen, E. Image Denoising and Inpainting with Deep Neural Networks. Adv. Neural Inf. Process. Syst. 2012, 25, 1–9.
  5. Thayammal, S.; Sankaramalliga, G.; Priyadarsini, S.; Ramalakshmi, K. Performance Analysis of Image Denoising using Deep Convolutional Neural Network. IOP Conf. Ser. Mater. Sci. Eng. 2021, 1070, 012085.
  6. Izadi, S.; Sutton, D.; Hamarneh, G. Image Denoising in the Deep Learning Era; Springer Nature: Dordrecht, The Netherlands, 2022; pp. 1–59.
  7. Gupta, K. Study of Deep Learning Techniques on Image Denoising. IOP Conf. Ser. Mater. Sci. Eng. 2021, 1022, 012007.
  8. More, S.; Singla, J.; Song, O.-Y.; Tariq, U.; Malebary, S. Denoising Medical Images Using Deep Learning in IoT Environment. Comput. Mater. Contin. 2021, 69, 3127–3143.
  9. Nirmal, A.; Raval, P.; Patel, S. Analysis of Image Denoising Techniques with CNN and Residual Networks in Deep Learning. J. Interdiscip. Cycle Res. 2020, 12, 222–246.
  10. López, M.M.; Frederick, J.M.; Ventura, J. Evaluation of MRI Denoising Methods Using Unsupervised Learning. Front. Artif. Intell. 2021, 4, 642731.
  11. Trung, N.T.; Trinh, D.-H.; Trung, N.L.; Luong, M. Low Dose CT Image Denoising using Deep Convolutional Neural Networks with Extended Receptive Fields. Signal Image Video Process. 2022, 16, 1963–1971.
  12. Wang, Y.; Huang, H.; Xu, Q.; Liu, J.; Liu, Y.; Wang, J. Practical Deep Raw Image Denoising on Mobile Devices. In European Conference on Computer Vision; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2020; Volume 12351, pp. 1–17.
  13. Lefkimmiatis, S. Non-local Color Image Denoising with Convolutional Neural Networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 1–15.
  14. Zheng, D.; Tan, S.H.; Zhang, X.; Shi, Z.; Ma, K.; Bao, C. An Unsupervised Deep Learning Approach for Real-World Image Denoising. In Proceedings of the International Conference on Learning Representations, Addis Ababa, Ethiopia, 26–30 April 2020; pp. 1–12.
  15. Latif, G.; Iskandar, D.A.; Alghazo, J.; Butt, M.; Khan, A.H. Deep CNN based MR image denoising for tumor segmentation using watershed transform. Int. J. Eng. Technol. 2018, 7, 37–42.
  16. Saravanan, S.; Kumar, V.V.; Sarveshwaran, V.; Indirajithu, A.; Elangovan, D.; Allayear, S.M. Computational and Mathematical Methods in Medicine Glioma Brain Tumor Detection and Classification Using Convolutional Neural Network. Comput. Math. Methods Med. 2022, 2022, 4380901.
  17. Aslam, M.A.; Aslam; Cui, D. Breast Cancer Classification using Deep Convolutional Neural Network. J. Phys. Conf. Ser. 2020, 1584, 012005.
  18. Aljuaid, H.; Alturki, N.; Alsubaie, N.; Cavallaro, L.; Liotta, A. Computer-aided diagnosis for breast cancer classification using deep neural networks and transfer learning. Comput. Methods Programs Biomed. 2022, 223, 106951.
  19. Zahoor, S.; Shoaib, U.; Lali, I.U. Breast Cancer Mammograms Classification Using Deep Neural Network and Entropy-Controlled Whale Optimization Algorithm. Diagnostics 2022, 12, 557.
  20. Khikani, H.A.; Elazab, N.; Elgarayhi, A.; Elmogy, M.; Sallah, M. Breast Cancer Classification Based on Histopathological Images Using a Deep Learning Capsule Network. arXiv 2022, arXiv:2208.00594.
Figure 1. Local speckle denoising method for breast ultrasound images.
Figure 2. Recurrent neural network.
Figure 3. Logical-pool support RNN.
Figure 4. Algorithm 1 for noise removal of the local speckle noise.
Figure 5. LPRNN training outcome.
Figure 6. Graphical view of image noise removal for different state-of-the-art methods [5,6,8,9,11,14].
Figure 7. BPI after local speckle noise removal [5,6,8,9,11,14].
Figure 8. INbreast and CBIS-DDSM dataset image comparison (a) before and after the destruction; (b) before and after the local speckle destruction.
Figure 9. Graphical view of the image noise removal comparison for the proposed and state-of-the-art methods [5,6,8,9,11,14].
Table 1. LPRNN training results: mean square error and false recognition rate.

No. of Iterations    Mean Square Error    False Recognition Value
1                    1.4                  0.7
2                    1.0                  0.5
3                    0.8                  0.4
4                    0.7                  0.3
5                    0.4                  0.2
6                    0.3                  0.1
Table 2. Image noise removal: comparison with state-of-the-art methods (values in dB).

Methods                                  Signal-to-Noise Ratio (dB)    Peak Signal-to-Noise Ratio (dB)
Proposed LPRNN                           63.8                          68.7
Saeed Izadi et al. (2022) [6]            58.7                          64.2
Thayammal et al. (2021) [5]              57.4                          63.7
Sujeet More et al. (2021) [8]            59.8                          65.1
Nguyen Thanh-Trung et al. (2021) [11]    56.5                          63.4
Dihan Zheng et al. (2021) [14]           54.2                          62.8
Aayushi Nirmal et al. (2020) [9]         52.6                          62.6
Table 3. Image noise removal comparison of the proposed and state-of-the-art methods under added white Gaussian noise.

Methods                                  INbreast: 25 dB / 45 dB / 65 dB    CBIS-DDSM: 25 dB / 45 dB / 65 dB
Proposed LPRNN                           6 / 9 / 10                         11 / 15 / 13
Saeed Izadi et al. (2022) [6]            21 / 19 / 21                       21 / 23 / 17
Thayammal et al. (2021) [5]              10 / 13 / 16                       13 / 16 / 18
Sujeet More et al. (2021) [8]            18 / 21 / 23                       14 / 17 / 21
Nguyen Thanh-Trung et al. (2021) [11]    11 / 21 / 24                       17 / 20 / 19
Dihan Zheng et al. (2021) [14]           19 / 20 / 22                       15 / 21 / 20
Aayushi Nirmal et al. (2020) [9]         17 / 21 / 21                       15 / 21 / 22