Sensors
  • Communication
  • Open Access

25 January 2022

A Fast Two-Stage Bilateral Filter Using Constant Time O(1) Histogram Generation

Department of Computer Science, National Chengchi University, Taipei City 11605, Taiwan
Author to whom correspondence should be addressed.
This article belongs to the Special Issue Computational Intelligence in Image Analysis

Abstract

Bilateral Filtering (BF) is an effective edge-preserving smoothing technique in image processing. However, an inherent problem of BF for image denoising is that it is challenging to differentiate image noise from details with the range kernel, so both noise and edges are often preserved in denoising. This letter proposes a novel Dual-Histogram BF (DHBF) method that exploits an edge-preserving noise-reduced guidance image to compute the range kernel, removing isolated noisy pixels for better denoising results. Furthermore, we approximate the spatial kernel using mean filtering based on column histogram construction to achieve constant-time filtering regardless of the kernel radius, as well as better smoothing. Experimental results on multiple benchmark datasets for denoising show that the proposed DHBF outperforms other state-of-the-art BF methods.

1. Introduction

Edge-preserving smoothing of image content is a fundamental problem in machine vision, computer graphics, and image processing. To remove shadows from a single input image, Yang et al. recovered a 3-D intrinsic image using edge-aware smoothing from the 2-D intrinsic image [1]. In stereo matching, adaptive support-weight-based trilateral filtering resolved ambiguities caused by nearby pixels with different parallaxes but similar colors [2]. The salient regions of an image were extracted for image segmentation by a re-blurring model with bilateral and morphological filtering [3]. The joint bilateral filter was modified to up-sample the depth image with high-resolution edge guidance for image super-resolution [4]. The guided image filter transfers the structures of the guidance image to the output, which makes it suitable for transmission-map refinement in image dehazing [5]. Yin et al. investigated the global, local, and social characteristics widely used in image smoothing to develop an image reconstruction model [6]. Licciardo et al. implemented an edge-preserving image smoothing hardware architecture for real-time 60 fps tone mapping of 1920 × 1080 images [7]. The work in [8] extracted fine details from underexposed or overexposed images using content-adaptive bilateral filtering in the gradient domain for multi-exposure image fusion. Bright-pass bilateral filtering was presented to estimate the scene illumination for low-light image enhancement [9].
Gaussian low-pass filtering reduces the differences between pixels by weighted averaging and is used for image smoothing in various applications. However, such low-pass filtering cannot preserve image details, e.g., edges and textures. The filtering process can be described by the linear translation-variant function f below:

$$ f(p) = \sum_{q} K_{p,q}(Q)\, P_q, \qquad (1) $$
where $K_{p,q}$ denotes the kernel weight of pixel q within the filter kernel K centered at pixel p, and Q and P are the guidance and input images, respectively. For example, the kernel of Bilateral Filtering (BF) [10] in Equation (1) is formulated below:

$$ K_{p,q}(Q) = \frac{1}{n} \exp\!\left(-\frac{\|p-q\|^{2}}{\sigma_{s}^{2}}\right) \exp\!\left(-\frac{\|P_{p}-Q_{q}\|^{2}}{\sigma_{r}^{2}}\right), \qquad (2) $$
where n is a normalization factor, and $\sigma_s$ and $\sigma_r$ control the spatial extent of the neighborhood and the sensitivity to changes in edge amplitude, respectively. In Equation (2), the Gaussian term $\exp(-\|p-q\|^{2}/\sigma_{s}^{2})$ weights the influence of different spatial distances, and $\exp(-\|P_{p}-Q_{q}\|^{2}/\sigma_{r}^{2})$ describes the contribution of the pixel intensity range. When Q and P are identical, Equation (2) simplifies to the single-image smoothing form.
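To make the kernel in Equation (2) concrete, a brute-force bilateral filter can be sketched in a few lines. This is a minimal pure-Python illustration under our own assumptions (grayscale images as 2-D lists, clamped borders, no factor of 2 in the denominators, matching Equation (2)); the function and parameter names are ours, not from the paper.

```python
import math

def bilateral_filter(img, radius, sigma_s, sigma_r):
    """Brute-force bilateral filter (Equation (2) form, single-image case)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            num = den = 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    # Clamp coordinates at the image border.
                    qy = min(max(y + dy, 0), h - 1)
                    qx = min(max(x + dx, 0), w - 1)
                    ws = math.exp(-(dy * dy + dx * dx) / sigma_s ** 2)  # spatial kernel
                    diff = img[y][x] - img[qy][qx]
                    wr = math.exp(-(diff * diff) / sigma_r ** 2)        # range kernel
                    num += ws * wr * img[qy][qx]
                    den += ws * wr
            out[y][x] = num / den
    return out
```

The double loop over the window makes each output pixel cost O(r²); this quadratic per-pixel cost is what motivates the constant-time formulations discussed in Section 3.1.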
In this paper, we extend the existing BF frameworks by proposing a novel Dual-Histogram BF (DHBF) method for better denoising results. Our main contributions are threefold.
  • We improve the range kernel in BF based on an edge-preserving noise-reduced guidance image, where isolated noisy pixels are removed to avoid erroneously judging those noisy pixels as edges;
  • We adopt mean filtering based on column histogram construction to approximate the spatial kernel, achieving constant-time filtering regardless of the kernel radius’ size and better smoothing;
  • We conducted an extensive experiment on multiple benchmark datasets for denoising and demonstrated that the proposed DHBF performs favorably against other state-of-the-art BF methods.
The remainder of this paper is organized as follows. Section 2 reviews the state-of-the-art BF methods. Section 3 describes the proposed method. Section 4 provides qualitative and quantitative evaluations as well as a runtime comparison. Finally, Section 5 concludes this paper.

3. Proposed Method

We propose a novel Dual-Histogram BF (DHBF) method to achieve better denoising and edge-preserving smoothing with O(1) time complexity regardless of the kernel radius. It consists of two stages, as shown in Figure 1. In Stage 1, we produce the noise-reduced guidance image G using the proposed computationally inexpensive edge-preserving BF to remove noise while preserving image details. Next, Stage 2 outputs the final filtered image using the proposed edge-preserving BF with the range kernel computed from the guidance image obtained in Stage 1.
Figure 1. The flowchart of the DHBF method.

3.1. Local Histogram Generation

Many local histogram generation algorithms have been presented in previous studies [28,29,30,31,32]. The simplest method directly visits each pixel in the sliding window; its major bottleneck is this time-consuming scan. Huang et al. [28] proposed to reduce the spatial redundancy in histogram calculation: since the windows of two consecutive pixels share a large intersection area, the histogram of the previous window can be reused for the next one by updating only the boundary information. However, this algorithm still costs O(r) per pixel, where r is the search radius.
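The O(r) update of [28] can be sketched as follows. This is our own illustrative pure-Python version (clamped borders; function name and the generator-free interface are assumptions): as the window slides one pixel to the right, only the leaving and entering columns of 2r + 1 samples each are touched.

```python
def row_histograms(img, y, radius, bins=256):
    """Huang-style sliding histogram for output row y: each slide removes the
    leaving column and adds the entering column, costing O(r) per pixel."""
    h, w = len(img), len(img[0])
    clamp = lambda v, hi: min(max(v, 0), hi)
    hist = [0] * bins
    # Build the full window histogram once, at x = 0.
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            hist[img[clamp(y + dy, h - 1)][clamp(dx, w - 1)]] += 1
    result = [list(hist)]
    for x in range(1, w):  # slide the window to the right
        for dy in range(-radius, radius + 1):
            yy = clamp(y + dy, h - 1)
            hist[img[yy][clamp(x - radius - 1, w - 1)]] -= 1  # leaving column
            hist[img[yy][clamp(x + radius, w - 1)]] += 1      # entering column
        result.append(list(hist))
    return result
```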
Inspired by the integral-image approach, a constant-time O(1) method, called the integral histogram, computes the local histogram in a Cartesian data space by constructing a superset of the cumulative image formulation [29]. Based on the distributive property of histograms, the sliding window can be divided into disjoint column regions whose histograms are updated by vector-based operations [30]. This distributive property has been applied to several image applications and histogram-based functions, including but not limited to entropy, distance norms, and cumulative distributions [31]. Peng et al. proposed a simple but effective differential histogram data structure [32] that further improves the computational performance of the column-histogram-based algorithm in [30].
Peng et al. demonstrated that their algorithm is more efficient than other O(1) local histogram generation algorithms [32]. Hence, we adopt the differential column histogram structure for local histogram generation in both Stages 1 and 2 of the DHBF.
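The column-histogram idea of [30] can be sketched as follows. This is our own simplified pure-Python illustration (clamped borders, a reduced number of bins, and all kernel histograms materialized for clarity); it omits the differential refinement of [32]. One histogram is kept per image column; sliding the kernel right costs one column subtraction and one column addition, independent of the radius.

```python
def kernel_histograms(img, radius, bins=16):
    """Column-histogram scheme: per-pixel kernel-histogram updates cost
    O(bins), independent of the kernel radius."""
    h, w = len(img), len(img[0])
    clamp = lambda v, hi: min(max(v, 0), hi)
    # One histogram per column, covering rows -r..r (clamped) of row 0's window.
    col = [[0] * bins for _ in range(w)]
    for x in range(w):
        for dy in range(-radius, radius + 1):
            col[x][img[clamp(dy, h - 1)][x]] += 1
    out = []
    for y in range(h):
        if y > 0:  # slide every column histogram down one row: O(1) per column
            for x in range(w):
                col[x][img[clamp(y - radius - 1, h - 1)][x]] -= 1
                col[x][img[clamp(y + radius, h - 1)][x]] += 1
        hist = [0] * bins  # kernel histogram for pixel (y, 0)
        for dx in range(-radius, radius + 1):
            c = col[clamp(dx, w - 1)]
            for b in range(bins):
                hist[b] += c[b]
        row = [list(hist)]
        for x in range(1, w):
            left = col[clamp(x - radius - 1, w - 1)]
            right = col[clamp(x + radius, w - 1)]
            for b in range(bins):  # O(bins) update, independent of radius
                hist[b] += right[b] - left[b]
            row.append(list(hist))
        out.append(row)
    return out
```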

3.2. Two-Stage Filtering

Let I be the input image and G be the guidance image. For each incoming pixel $I_x$, the corresponding output pixel $J_x$ is calculated as:

$$ J_x = \frac{\sum_{s \in \Omega_x} K_{x,s}(I_x, G_s)\, G_s}{\sum_{s \in \Omega_x} K_{x,s}(I_x, G_s)}, \qquad (3) $$
where $\Omega_x$ is the sliding window centered at x, and $K_{x,s}$ stands for the composite bilateral kernel, comprising the spatial kernel α and the range kernel β:

$$ K_{x,s}(I_x, G_s) = \alpha(x, s)\, \beta(I_x, G_s), \qquad (4) $$
where α and β are formulated by the following two Gaussian exponential functions:

$$ \alpha(x, s) = \exp\!\left(-\frac{\|x-s\|^{2}}{\sigma^{2}}\right), \qquad (5) $$

$$ \beta(I_x, G_s) = \exp\!\left(-\frac{\|I_x-G_s\|^{2}}{\delta^{2}}\right), \qquad (6) $$
where σ and δ are the standard deviations of the spatial and range kernels, respectively. Replacing Equation (5) with a constant coefficient (i.e., approximating the spatial kernel by mean filtering), we can simplify Equation (3) into the histogram-based computation:

$$ J_x = \frac{\sum_{l=l_{\min}}^{l_{\max}} \beta(I_x, l)\, H_x(l)\, l}{\sum_{l=l_{\min}}^{l_{\max}} \beta(I_x, l)\, H_x(l)}, \qquad (7) $$

where $[l_{\min}, l_{\max}]$ is the pixel intensity range, and $H_x$ is the local histogram of the guidance image G over the sliding window $\Omega_x$ centered at x.
To keep the center pixel's original intensity, we replace $G_x$ with $I_x$ in $H_x$. We adopt column histogram construction [32] to achieve O(1) time complexity regardless of the kernel radius. In Stage 1, where no separate guidance image is available yet, we generate the guidance image G by substituting $I_x$ for $G_x$ in Equation (7).
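Equation (7) can be evaluated directly from a local histogram. The following sketch is our own illustration (the function name and the default 8-bit intensity range are assumptions): the range kernel β re-weights histogram bins instead of individual window pixels, so the cost per pixel depends only on the number of intensity levels, not on the kernel radius.

```python
import math

def histogram_range_filter(hist, center_val, delta, l_min=0, l_max=255):
    """Evaluate Equation (7): a range-weighted mean over the local histogram
    of the guidance image, with the range kernel of Equation (6)."""
    num = den = 0.0
    for l in range(l_min, l_max + 1):
        if hist[l]:
            w = math.exp(-((center_val - l) ** 2) / delta ** 2) * hist[l]
            num += w * l
            den += w
    return num / den
```

In the two-stage scheme, this evaluation would run once with a wide range kernel (δ = 45) to produce the guidance image, and once with a narrow one (δ = 15) to produce the final output.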

4. Experimental Results

As shown in Figure 2, Figure 3, Figure 4, Figure 5 and Figure 6, we chose five benchmark datasets, BSDS100 [33], Set5 [34], Set14 [35], Urban100 [36], and USC-SIPI [37], containing test images with a wide variety of content. These images were converted to grayscale for testing. We compare the proposed DHBF with three previous BF methods: BF [10], OFBF [22], and GABF [25]. All methods were implemented in Matlab R2018b and tested on a laptop with an Intel i7 CPU at 2.8 GHz and 16 GB RAM. We set both σ and δ to 15 in all compared methods. For DHBF, we set δ = 45 in Stage 1 and δ = 15 in Stage 2. The other parameters were set as in the compared methods. To simulate noisy images, we added random Gaussian noise with a standard deviation of 0.10.
Figure 2. Thumbnails of BSDS100.
Figure 3. Thumbnails of Set5.
Figure 4. Thumbnails of Set14.
Figure 5. Thumbnails of Urban100.
Figure 6. Thumbnails of USC-SIPI.
To show the visual effects of edge-preserving image smoothing for each BF-based method, we selected one representative sample from each of the above five datasets. We observed that some obvious noise artifacts still remain in the images after BF [10], OFBF [22], and GABF [25] are applied. In contrast, our DHBF method further enhances the output quality of BF through the proposed two-stage framework, focusing on the photometric relation between pixels to refine image details. For example, DHBF better preserves the edges while reducing noise at the man's head, the butterfly's wings, the girl's face, the building, and the texture chart shown in Figure 7, Figure 8, Figure 9, Figure 10 and Figure 11, respectively.
Figure 7. Visual comparison of the representative sample for image smoothing on BSDS100.
Figure 8. Visual comparison of the representative sample for image smoothing on Set5.
Figure 9. Visual comparison of the representative sample for image smoothing on Set14.
Figure 10. Visual comparison of the representative sample for image smoothing on Urban100.
Figure 11. Visual comparison of the representative sample for image smoothing on USC-SIPI.
In addition to the above qualitative analysis, we also used Peak Signal-to-Noise Ratio (PSNR) [38], Structural Similarity (SSIM) [39], Feature Similarity (FSIM) [40], and Gradient Magnitude Similarity Deviation (GMSD) [41] to perform Full-reference Image Quality Assessment (FR-IQA) of the outputs of the different methods. PSNR is the ratio between the maximum possible power of a signal and the power of the noise that corrupts it; it is an evaluation index that quantifies image distortion [38]. Given a ground-truth image and a distorted image, SSIM measures the similarity of the two images, reflecting human judgment of image quality more appropriately than PSNR [39]. As the human visual system perceives images based on low-level features, FSIM uses phase congruency to describe the local image structure and the gradient magnitude to capture contrast variations [40]. To capture the gradient changes to which image distortion is sensitive, GMSD computes a pixel-wise gradient magnitude similarity map and takes its deviation to predict perceptual image quality accurately [41].
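For reference, PSNR can be computed as in the following minimal pure-Python sketch (names are ours; images are 2-D lists of gray values with peak 255 assumed):

```python
import math

def psnr(ref, dist, peak=255.0):
    """PSNR in dB: ratio of peak signal power to mean-squared error."""
    n = len(ref) * len(ref[0])
    mse = sum((a - b) ** 2
              for ra, rb in zip(ref, dist)
              for a, b in zip(ra, rb)) / n
    # Identical images have zero MSE, i.e., infinite PSNR.
    return float('inf') if mse == 0 else 10.0 * math.log10(peak ** 2 / mse)
```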
Table 1 lists the FR-IQA results for all the compared methods. We can see that DHBF performs favorably against the other BF methods. Figure 12 demonstrates a runtime comparison of the compared BF methods with kernel radii ranging from 10 to 80 for processing 1024 × 1024 test images. As shown, both the OFBF and DHBF methods run in constant time regardless of the radius of the filter kernel. By contrast, BF and GABF require additional computation when a larger radius is used. According to all the experiments above, DHBF performs best for denoising while achieving O(1) time complexity.
Table 1. Full-reference Image Quality Assessment for different methods. The best scores are in bold.
Figure 12. Runtime comparisons of the compared BF methods.

5. Conclusions

This work proposed a novel Dual-Histogram Bilateral Filtering (DHBF) method whose range kernel is computed from a computationally inexpensive edge-preserving guidance image, producing better denoising results than existing state-of-the-art BF methods. In contrast to the compared computational-intelligence-based BF approach, OFBF, which approximates the truncated Gaussian kernel, we calculated the spatial kernel using mean filtering based on column histogram construction; both achieve O(1) time complexity regardless of the kernel radius. Subjective and objective experimental results on five benchmark datasets verified that our proposed method performs favorably against other BF methods for edge-preserving smoothing.

Author Contributions

Conceptualization, S.-W.C.; methodology, S.-W.C.; software, S.-W.C.; validation, Y.-T.L.; formal analysis, Y.-T.L.; investigation, S.-W.C. and Y.-T.P.; resources, Y.-T.P.; data curation, Y.-T.P.; writing—original draft preparation, Y.-T.P. and S.-W.C.; writing—review and editing, Y.-T.P.; visualization, Y.-T.L.; supervision, Y.-T.P.; project administration, Y.-T.P.; funding acquisition, Y.-T.P. All authors have read and agreed to the published version of the manuscript.

Funding

This paper was supported in part by the Ministry of Science and Technology, Taiwan under Grants MOST 110-2221-E-004-010, 110-2622-E-004-001, 109-2622-E-004-002, 110-2634-F-019-001, 110-2634-F-019-002, 110-2221-E-019-062, and 110-2622-E-019-006.

Institutional Review Board Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Yang, Q.; Tan, K.-H.; Ahuja, N. Shadow removal using bilateral filtering. IEEE Trans. Image Process. 2012, 21, 4361–4368.
  2. Chen, D.; Ardabilian, M.; Chen, L. A fast trilateral filter-based adaptive support weight method for stereo matching. IEEE Trans. Circuits Syst. Video Technol. 2015, 25, 730–743.
  3. Li, H.; Ngan, K.N. Unsupervised video segmentation with low depth of field. IEEE Trans. Circuits Syst. Video Technol. 2007, 17, 1742–1751.
  4. Xie, J.; Feris, R.S.; Sun, M.-T. Edge-guided single depth image super resolution. IEEE Trans. Image Process. 2016, 25, 428–438.
  5. He, K.; Sun, J.; Tang, X. Guided image filtering. IEEE Trans. Pattern Anal. Mach. Intell. 2013, 35, 1397–1409.
  6. Yin, J.; Chen, B.; Li, Y. Highly accurate image reconstruction for multimodal noise suppression using semi-supervised learning on big data. IEEE Trans. Multimed. 2018, 20, 3045–3056.
  7. Licciardo, G.D.; D’Arienzo, A.; Rubino, A. Stream processor for real-time inverse tone mapping of Full-HD images. IEEE Trans. Very Large Scale Integr. (VLSI) Syst. 2015, 23, 2531–2539.
  8. Li, Z.; Zheng, J.; Zhu, Z.; Wu, S. Selectively detail-enhanced fusion of differently exposed images with moving objects. IEEE Trans. Image Process. 2014, 23, 4372–4382.
  9. Ghosh, S.; Chaudhury, K.N. Fast bright-pass bilateral filtering for low-light enhancement. In Proceedings of the International Conference on Image Processing, Taipei, Taiwan, 22–25 September 2019; pp. 205–209.
  10. Tomasi, C.; Manduchi, R. Bilateral filtering for gray and color images. In Proceedings of the International Conference on Computer Vision, Bombay, India, 4–7 January 1998; pp. 839–846.
  11. Caponetto, R.; Fortuna, L.; Nunnari, G.; Occhipinti, L.; Xibilia, M.G. Soft computing for greenhouse climate control. IEEE Trans. Fuzzy Syst. 2000, 8, 753–760.
  12. Pal, S.K.; Talwar, V.; Mitra, P. Web mining in soft computing framework: Relevance, state of the art and future directions. IEEE Trans. Neural Netw. 2002, 13, 1163–1177.
  13. Tseng, V.S.; Ying, J.J.-C.; Wong, S.T.C.; Cook, D.J.; Liu, J. Computational intelligence techniques for combating COVID-19: A survey. IEEE Comput. Intell. Mag. 2020, 15, 10–22.
  14. Porikli, F. Constant time O(1) bilateral filtering. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Anchorage, AK, USA, 24–26 June 2008; pp. 1–8.
  15. He, S.; Yang, Q.; Lau, R.W.H.; Yang, M.-H. Fast weighted histograms for bilateral filtering and nearest neighbor searching. IEEE Trans. Circuits Syst. Video Technol. 2016, 26, 891–902.
  16. Yang, Q.; Tan, K.-H.; Ahuja, N. Real-time O(1) bilateral filtering. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA, 20–25 June 2009; pp. 557–564.
  17. Yu, W.; Franchetti, F.; Hoe, J.C.; Chang, Y.-J.; Chen, T. Fast bilateral filtering by adapting block size. In Proceedings of the IEEE International Conference on Image Processing, Hong Kong, China, 26–29 September 2010; pp. 3281–3284.
  18. Chaudhury, K.N.; Sage, D.; Unser, M. Fast O(1) bilateral filtering using trigonometric range kernels. IEEE Trans. Image Process. 2011, 20, 3376–3382.
  19. Chaudhury, K.N. Constant-time filtering using shiftable kernels. IEEE Signal Process. Lett. 2011, 18, 651–654.
  20. Sugimoto, K.; Kamata, S.-I. Compressive bilateral filtering. IEEE Trans. Image Process. 2015, 24, 3357–3369.
  21. Chaudhury, K.N.; Ghosh, S. On fast bilateral filtering using Fourier kernels. IEEE Signal Process. Lett. 2016, 23, 570–573.
  22. Ghosh, S.; Nair, P.; Chaudhury, K.N. Optimized Fourier bilateral filtering. IEEE Signal Process. Lett. 2018, 25, 1555–1559.
  23. Gavaskar, R.G.; Chaudhury, K.N. Fast adaptive bilateral filtering. IEEE Trans. Image Process. 2019, 28, 779–790.
  24. Dai, L.; Tang, L.; Tang, J. Speed up bilateral filtering via sparse approximation on a learned cosine dictionary. IEEE Trans. Circuits Syst. Video Technol. 2020, 30, 603–617.
  25. Chen, B.-H.; Tseng, Y.-S.; Yin, J.-L. Gaussian-adaptive bilateral filter. IEEE Signal Process. Lett. 2020, 27, 1670–1674.
  26. Zhang, X.; Dai, L. Fast bilateral filtering. Electron. Lett. 2019, 55, 258–260.
  27. Guo, J.; Chen, C.; Xiang, S.; Ou, Y.; Li, B. A fast bilateral filtering algorithm based on rising cosine function. Neural Comput. Appl. 2019, 31, 5097–5108.
  28. Huang, T.S.; Yang, G.J.; Tang, G.Y. A fast two-dimensional median filtering algorithm. IEEE Trans. Acoust. Speech Signal Process. 1979, 27, 13–18.
  29. Porikli, F. Integral histogram: A fast way to extract histograms in Cartesian spaces. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, San Diego, CA, USA, 20–26 June 2005; pp. 829–836.
  30. Perreault, S.; Hebert, P. Median filtering in constant time. IEEE Trans. Image Process. 2007, 16, 2389–2394.
  31. Wei, Y.; Tao, L. Efficient histogram-based sliding window. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, San Francisco, CA, USA, 13–18 June 2010; pp. 3003–3010.
  32. Peng, Y.-T.; Chen, F.-C.; Ruan, S.-J. An efficient O(1) contrast enhancement algorithm using parallel column histograms. IEICE Trans. Inf. Syst. 2013, 96, 2724–2725.
  33. Martin, D.; Fowlkes, C.; Tal, D.; Malik, J. A database of human segmented natural images and its application to evaluating segmentation algorithms and measuring ecological statistics. In Proceedings of the International Conference on Computer Vision, Vancouver, BC, Canada, 7–14 July 2001; pp. 416–423.
  34. Bevilacqua, M.; Roumy, A.; Guillemot, C.; Alberi-Morel, M.L. Low-complexity single-image super-resolution based on nonnegative neighbor embedding. In Proceedings of the British Machine Vision Conference, Surrey, UK, 3–7 September 2012.
  35. Zeyde, R.; Protter, M.; Elad, M. On single image scale-up using sparse-representation. In Proceedings of the International Conference on Curves and Surfaces, Avignon, France, 24–30 June 2010; pp. 711–730.
  36. Huang, J.-B.; Singh, A.; Ahuja, N. Single image super-resolution from transformed self-exemplars. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 5197–5206.
  37. The USC-SIPI Image Database. Available online: https://sipi.usc.edu/database/ (accessed on 31 August 2021).
  38. Wang, Z.; Bovik, A.C. Mean squared error: Love it or leave it?—A new look at signal fidelity measures. IEEE Signal Process. Mag. 2009, 26, 98–117.
  39. Wang, Z.; Bovik, A.C.; Sheikh, H.R.; Simoncelli, E.P. Image quality assessment: From error visibility to structural similarity. IEEE Trans. Image Process. 2004, 13, 600–612.
  40. Zhang, L.; Zhang, L.; Mou, X.; Zhang, D. FSIM: A feature similarity index for image quality assessment. IEEE Trans. Image Process. 2011, 20, 2378–2386.
  41. Xue, W.; Zhang, L.; Mou, X.; Bovik, A.C. Gradient magnitude similarity deviation: A highly efficient perceptual image quality index. IEEE Trans. Image Process. 2014, 23, 684–695.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
