
This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).

To improve algorithm efficiency and performance, an image fusion technique based on the Non-subsampled Contourlet Transform (NSCT) domain and an Accelerated Non-negative Matrix Factorization (ANMF)-based algorithm is proposed in this paper. Firstly, the registered source images are decomposed at multiple scales and in multiple directions using the NSCT method. Then, the ANMF algorithm is executed on the low-frequency sub-images to obtain the low-pass coefficients. The low-frequency fused image can be generated faster because the update rules for the factor matrices are accelerated.

Image fusion is an effective technology that synthesizes data from multiple sources and reduces uncertainty, which is beneficial to both human and machine vision. Over the past decades, it has been adopted in a variety of fields, including automatic target recognition, computer vision, remote sensing, robotics, complex intelligent manufacturing, medical image processing, and military applications. Reference [

Pixel-level image fusion methods fall into two categories: spatial-domain and frequency-domain methods. The classic algorithms in the frequency domain include Intensity Hue Saturation (IHS) [

Until recently, multi-resolution decomposition based algorithms have been widely used in the multi-source image fusion field, as they effectively overcome spectrum distortion. Wavelet transformation provides good time-frequency analytical features and has been a focus of multi-source image fusion. The Non-subsampled Wavelet Transform (NSWT) is built from the tensor product of two one-dimensional wavelets, which solves the lack of shift invariance that afflicts traditional wavelets. However, lacking anisotropy, NSWT fails to represent direction-distinguished textures and edges sparsely. In 2002, Do and Vetterli proposed the flexible contourlet transform, which can efficiently capture the geometric structure of images thanks to its multi-resolution, locality and directionality properties [

Non-Negative Matrix Factorization (NMF) is a relatively new matrix analysis method [

The remainder of this paper is organized as follows: Section 2 introduces NSCT. This is followed by a brief discussion of how NMF is constructed and how we improve it. Section 4 presents the whole framework of the fusion algorithm. Section 5 shows experimental results for image fusion using the proposed technique, together with discussion and comparisons with other typical methods. Finally, the last Section concludes the paper and discusses future work.

NSCT is proposed on the grounds of contourlet conception [

The structure of NSCT consists of two parts, the Non-subsampled Pyramid (NSP) and the Non-subsampled Directional Filter Bank (NSDFB), as shown in the NSCT diagram.

NMF is a recently developed matrix analysis algorithm [

Given a non-negative matrix $V = [v_{\bullet 1}, v_{\bullet 2}, \dots, v_{\bullet N}] \in \mathbb{R}^{M \times N}$, NMF seeks non-negative factors $W \in \mathbb{R}^{M \times r}$ and $H = [h_{\bullet 1}, h_{\bullet 2}, \dots, h_{\bullet N}] \in \mathbb{R}^{r \times N}$ such that $V \approx WH$, i.e., each column satisfies $v_{\bullet j} \approx W h_{\bullet j}$, $j = 1, 2, \dots, N$.

For the purpose of finding the appropriate factors, NMF is cast as the constrained optimization problem

$$\min_{W,\,H} \; \|V - WH\|_{F}^{2} \quad \text{s.t.} \quad W_{ia} \ge 0, \; H_{aj} \ge 0,$$

where $\|\cdot\|_{F}$ denotes the Frobenius norm.

Roughly speaking, the NMF algorithm has high time complexity, which limits its benefit to overall algorithm performance, so introducing improved iteration rules to optimize NMF is crucial for promoting efficiency. From the viewpoint of optimization, NMF is a minimization problem with non-negativity constraints. Until now, a wide range of decomposition algorithms have been investigated on the basis of non-negative constraints, such as multiplicative iteration rules, alternating non-negative least squares, the gradient method and projected gradient [

As we know, the Lee-Seung algorithm continuously updates the entries $W_{ia}$ and $H_{aj}$ with the multiplicative rules

$$H_{aj} \leftarrow H_{aj}\,\frac{(W^{T}V)_{aj}}{(W^{T}WH)_{aj}}, \qquad W_{ia} \leftarrow W_{ia}\,\frac{(VH^{T})_{ia}}{(WHH^{T})_{ia}}.$$
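As an illustration, the Lee-Seung multiplicative rules can be sketched in a few lines of NumPy; the rank, iteration count and tolerance below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def nmf_multiplicative(V, r, iters=500, eps=1e-9, seed=0):
    """Classic Lee-Seung multiplicative updates for V ~= W @ H.

    V : (m, n) non-negative matrix; r : factorization rank (assumed here).
    Returns non-negative factors W (m, r) and H (r, n).
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # H <- H .* (W'V) ./ (W'WH)
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # W <- W .* (VH') ./ (WHH')
    return W, H

# Toy check on an exactly rank-2 non-negative matrix.
rng = np.random.default_rng(1)
V = rng.random((6, 2)) @ rng.random((2, 8))
W, H = nmf_multiplicative(V, r=2)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

The small constant `eps` both initializes the factors away from zero and guards the denominators, a common practical safeguard for the multiplicative form.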

We notice that the optimal $h_{\bullet j}$ depends only on the $j^{th}$ column of $V$, while the optimal $w_{i\bullet}$ depends only on the $i^{th}$ row, so the factors can be updated column by column and row by row.

We propose to revise the algorithm claimed in article [: instead of updating the entries $W_{ij}$ and $H_{ij}$ multiplicatively, each column $h_{\bullet j}$ and each row $w_{i\bullet}$ is updated along the negative gradient of its least-squares subproblem $\min_{x}\|Ax - b\|^{2}$, whose gradient involves the term $A^{T}Ax$.

We can easily obtain the step-length formulas for updating $h_{\bullet j}$ and $w_{i\bullet}$ by writing each subproblem in this generic least-squares form: for the column update, $A = W$, $b = v_{\bullet j}$, the residual is $e_{j} = v_{\bullet j} - Wh_{\bullet j}$ and $q = W^{T}e_{j}$; for the row update, $A = H^{T}$ and $q = He_{i}$, where $e_{i}$ is the corresponding row residual.

Learning from article [, the exact step length along a search direction $p$ is $\alpha = p^{T}q / (p^{T}A^{T}Ap)$; applying this step length to the updates of $h_{\bullet j}$ and $w_{i\bullet}$ yields the accelerated NMF (ANMF).
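One accelerated update of this kind can be sketched as a steepest-descent step with the exact line-search step length $\alpha = p^{T}q/(p^{T}A^{T}Ap)$, followed by projection onto the non-negative orthant; the matrix sizes and iteration count below are illustrative assumptions:

```python
import numpy as np

def exact_step_update(A, b, x):
    """One steepest-descent step for min ||A x - b||^2 with the exact
    line-search step length alpha = p^T q / (p^T A^T A p), followed by
    projection onto the non-negative orthant (the NMF constraint)."""
    q = A.T @ (b - A @ x)            # negative gradient of the cost
    p = q                            # steepest-descent search direction
    denom = p @ (A.T @ (A @ p))
    if denom <= 0.0:                 # already stationary
        return x
    alpha = (p @ q) / denom
    return np.maximum(x + alpha * p, 0.0)

# Toy check: drive one column sub-problem toward its non-negative solution.
rng = np.random.default_rng(0)
A = rng.random((10, 4))
b = A @ np.array([0.5, 1.0, 0.2, 0.8])   # consistent, non-negative target
x = np.zeros(4)
r0 = np.linalg.norm(A @ x - b)
for _ in range(200):
    x = exact_step_update(A, b, x)
r1 = np.linalg.norm(A @ x - b)
```

The exact step avoids the small, fixed effective step sizes of the multiplicative rules, which is the source of the speed-up the paper attributes to ANMF.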

Obviously, when

As we know, the approximation of an image resides in its low-frequency part, while the high-frequency counterpart exhibits detailed features such as edges and texture. In this paper, the NSCT method is utilized to separate the high- and low-frequency components of the source images, and the two parts are then handled with different fusion rules according to their features. As a result, the fused image can be more complementary, reliable, clear and easily understood.

By and large, the low-pass sub-band coefficients approximate the original image at low resolution; they generally represent the image contour, while high-frequency details such as edges and region contours are not contained. We therefore use the ANMF algorithm to determine the low-pass sub-band coefficients, which include the holistic features of the two source images. The band-pass directional sub-band coefficients embody particular information such as edges, lines and region boundaries, and their main function is to retain as many spatial details as possible. In our paper, an NHM-based local self-adaptive fusion method is adopted in the band-pass directional sub-band coefficient acquisition phase: the similarity degree of the corresponding neighborhoods is calculated to determine the selection of the band-pass coefficient fusion rules (

Given that the two source images are

Adopt NSCT to implement the multi-scale and multi-direction decompositions for source images

Construct the data matrices from the low-pass sub-band coefficients of source images A and B, and apply the ANMF algorithm to obtain the fused low-pass coefficients.

The fusion rule NHM is applied to the band-pass directional sub-band coefficients: when the NHM of the corresponding sub-band coefficients at scale $l$ exceeds a given threshold, the fused coefficient is taken as a weighted average of the two source coefficients; otherwise, the coefficient with the larger regional energy is selected.
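As an illustration only, the idea of switching between averaging and larger-energy selection based on a neighborhood similarity measure can be sketched as follows; the 3 × 3 window, the threshold 0.75 and the particular similarity formula are assumptions, not the paper's exact NHM definition:

```python
import numpy as np

def fuse_bandpass(Ca, Cb, win=3, thr=0.75):
    """Toy neighborhood-similarity fusion for one band-pass sub-band.

    Where the local neighborhoods of Ca and Cb are similar (measure > thr),
    average the coefficients; otherwise keep the one with larger local
    energy. Window size, threshold and measure are illustrative."""
    pad = win // 2
    Pa = np.pad(Ca, pad, mode="edge")
    Pb = np.pad(Cb, pad, mode="edge")
    F = np.empty_like(Ca, dtype=float)
    for i in range(Ca.shape[0]):
        for j in range(Ca.shape[1]):
            na = Pa[i:i + win, j:j + win]
            nb = Pb[i:i + win, j:j + win]
            ea, eb = (na ** 2).sum(), (nb ** 2).sum()
            # neighborhood similarity in [0, 1]; 1 means identical energy
            m = 2.0 * (na * nb).sum() / (ea + eb + 1e-12)
            if m > thr:                       # similar: weighted average
                F[i, j] = 0.5 * (Ca[i, j] + Cb[i, j])
            else:                             # dissimilar: larger energy wins
                F[i, j] = Ca[i, j] if ea >= eb else Cb[i, j]
    return F

# Dissimilar everywhere: the all-ones sub-band has the larger energy.
Ca, Cb = np.zeros((4, 4)), np.ones((4, 4))
fused = fuse_bandpass(Ca, Cb)
```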

Perform the inverse NSCT transform on the fused coefficients to reconstruct the final fused image.
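Since NSCT implementations are not commonly available in Python, the overall flow of the steps above can be sketched with a crude low-pass/high-pass split standing in for NSCT, simple averaging standing in for ANMF, and max-magnitude selection standing in for the NHM rule; every stand-in here is an assumption for illustration only:

```python
import numpy as np

def box_blur(img, k=5):
    """Separable box filter used as a crude low-pass stand-in for NSP."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for di in range(k):
        for dj in range(k):
            out += p[di:di + img.shape[0], dj:dj + img.shape[1]]
    return out / (k * k)

def fuse(imgA, imgB):
    """Structural sketch of the pipeline: split into low/high bands,
    fuse the low band (average, standing in for ANMF), fuse the high
    band (max magnitude, standing in for NHM), then recombine
    (standing in for the inverse NSCT)."""
    lowA, lowB = box_blur(imgA), box_blur(imgB)
    highA, highB = imgA - lowA, imgB - lowB
    low_f = 0.5 * (lowA + lowB)
    high_f = np.where(np.abs(highA) >= np.abs(highB), highA, highB)
    return low_f + high_f

# Sanity check: fusing an image with itself must reproduce it.
img = np.arange(64, dtype=float).reshape(8, 8)
out = fuse(img, img)
```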

To verify the effectiveness of the proposed algorithm, three groups of images are tested on the MATLAB 7.1 platform in this Section. All source images are registered and have 256 gray levels. We compare the proposed method with five typical algorithms: the NSCT-based method (M1), the classic NMF-based method (M2), the weighted NMF-based method (M3), PCA, and wavelet.

It may be possible to evaluate image fusion subjectively, but subjective evaluation is easily affected by observer bias, psychological status and even mental state. Consequently, it is necessary to establish a set of objective criteria for quantitative evaluation. In this paper, we select Information Entropy (IE), Standard Deviation (SD), Average Gradient (AG), Peak Signal to Noise Ratio (PSNR), the Q index [
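Three of these criteria have simple closed forms and can be sketched directly (SD is just the pixel standard deviation, e.g. `img.std()`); the exact normalizations used in the paper are not stated, so the definitions below follow common conventions and should be read as assumptions:

```python
import numpy as np

def entropy(img):
    """Information Entropy of an 8-bit image, in bits."""
    hist = np.bincount(img.ravel().astype(np.int64), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def avg_gradient(img):
    """Average Gradient: mean magnitude of first differences."""
    img = img.astype(float)
    gx = np.diff(img, axis=1)[:-1, :]
    gy = np.diff(img, axis=0)[:, :-1]
    return float(np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0)))

def psnr(ref, img, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB."""
    mse = np.mean((ref.astype(float) - img.astype(float)) ** 2)
    return float("inf") if mse == 0 else float(10.0 * np.log10(peak ** 2 / mse))

# Small sanity inputs: a flat image and a two-level checkerboard.
flat = np.zeros((8, 8), dtype=np.uint8)
checker = (np.indices((8, 8)).sum(axis=0) % 2 * 255).astype(np.uint8)
```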

A pair of “Balloon” images is chosen as source images; both are 200 × 160 pixels in size. As can be seen from

From an intuitive point of view, the M1 method produces poor intensity, which makes

Our method yields the smallest AE values (AE_{aF} and AE_{bF} denote the similarity between source image (a) and the fused image, and between source image (b) and the fused image, respectively), followed by wavelet, M3, PCA and M2, while method M1 has the highest AE. Therefore, in terms of transferring details, the performances of our method, wavelet, M3, PCA, M2 and M1 decrease in that order.

From

As revealed in the ESAM results, our method yields the smallest AE_{aF} and AE_{bF}, while the values of wavelet, M3, PCA, M2 and M1 are arranged in ascending order.

A group of registered visible and infrared images with a size of 360 by 240 showing a person walking in front of a house are labeled as

Of these,

As far as IE, AG and PSNR are concerned, the proposed technique is evidently better than the former four methods. Specifically, the value of our method exceeds them by 1.6%, 4.9% and 0.7%, while the SD is slightly smaller when compared with M3. For the Q index, the optimal value is obtained by the wavelet approach, while that of M1 holds the last place. As for MI, our method still ranks first, and its AE_{aF} and AE_{bF} are the smallest.

In this section, we compare the performance of ANMF with that of algorithms presented in article [

Image fusion with different models and numerical tests were conducted in our experiments. The above four experiments indicate that the proposed method has a notable superiority in image fusion performance over the four other techniques examined (see Sections 5.2–5.4) and has better iteration efficiency (see Section 5.5). We observed that images based on the wavelet and our proposed methods enjoy the best visual effect, followed by PCA, M3 and M2, while M1 is the worst. In addition to visual inspection, quantitative analysis was also conducted to verify the validity of our algorithm from the angles of information amount, statistical features, gradient, signal-to-noise ratio, edge preservation, information theory and structural similarity. The values of these metrics show that the experiments achieve the desired objective.

In this paper, we have presented a technique for image fusion based on the NSCT and ANMF models. The accelerated NMF method modifies the traditional update rules of $W$ and $H$, which speeds up convergence of the factorization.

This work is supported by Sichuan Provincial Department of Education (11ZB034). The authors also gratefully acknowledge the helpful comments and suggestions of the reviewers, which have improved the presentation.

Diagram of NSCT, NSP and NSDFB. (

Flowchart of fusion algorithm.

Multi-focus source images and fusion results. (

Medical source images and fusion results. (

Visible and infrared source images and fusion results. (

Numerical comparison between three algorithms.

The tradeoff selection for

| Value | SD | AG | Value | SD | AG |
|---|---|---|---|---|---|
| 0.55 | 30.478 | 8.3784 | 0.75 | 30.539 | 8.5109 |
| 0.6 | 30.664 | 8.4322 | 0.8 | 30.541 | 8.4376 |
| 0.65 | 30.412 | 8.4509 | 0.9 | 30.629 | 8.4415 |
| 0.7 | 30.456 | 8.5322 | 0.95 | 30.376 | 8.2018 |

Comparison of the fusion methods for multi-focus images.

| Index | M1 | M2 | PCA | M3 | Wavelet | Proposed |
|---|---|---|---|---|---|---|
| IE | 7.3276 | 7.4594 | 7.4486 | 7.4937 | 7.5982 | 7.5608 |
| SD | 28.705 | 29.728 | 29.934 | 30.206 | 31.127 | 30.539 |
| AG | 8.4581 | 8.2395 | 8.4595 | 8.4853 | 8.5014 | 8.5109 |
| PSNR (dB) | 35.236 | 36.246 | 36.539 | 36.746 | 37.533 | 37.224 |
| Q Index | 0.9579 | 0.9723 | 0.9706 | 0.9812 | 0.9901 | 0.9844 |
| MI | 3.4132 | 3.5268 | 3.9801 | 4.0538 | 4.1257 | 4.2578 |

ESAM values between multi-focus and fused images.

| ESAM (window) | M1 | M2 | PCA | M3 | Wavelet | Proposed |
|---|---|---|---|---|---|---|
| AE_{aF} (16 × 16) | 20.37 | 19.96 | 19.89 | 19.82 | 19.27 | 18.96 |
| AE_{aF} (32 × 32) | 19.85 | 19.32 | 19.29 | 19.24 | 18.95 | 18.42 |
| AE_{aF} (64 × 64) | 19.06 | 18.62 | 18.53 | 18.42 | 18.13 | 17.95 |
| AE_{bF} (16 × 16) | 20.08 | 19.43 | 19.38 | 19.35 | 18.87 | 18.54 |
| AE_{bF} (32 × 32) | 19.62 | 18.88 | 18.81 | 18.76 | 18.11 | 17.96 |
| AE_{bF} (64 × 64) | 18.98 | 18.27 | 18.15 | 18.03 | 17.66 | 17.38 |

Comparison of the fusion methods for medical images.

| Index | M1 | M2 | PCA | M3 | Wavelet | Proposed |
|---|---|---|---|---|---|---|
| IE | 5.4466 | 5.7628 | 5.7519 | 5.8875 | 6.1022 | 6.0641 |
| SD | 29.207 | 27.768 | 27.883 | 28.549 | 31.836 | 31.628 |
| AG | 20.361 | 26.583 | 25.194 | 27.358 | 28.573 | 29.209 |
| PSNR (dB) | 36.842 | 37.238 | 37.428 | 37.853 | 38.737 | 38.458 |
| Q Index | 0.9607 | 0.9695 | 0.9714 | 0.9821 | 0.9874 | 0.9835 |
| MI | 4.0528 | 4.3726 | 4.3942 | 4.5522 | 4.8736 | 5.0837 |

ESAM values between CT, MRI and fused images.

| ESAM (window) | M1 | M2 | PCA | M3 | Wavelet | Proposed |
|---|---|---|---|---|---|---|
| AE_{aF} (16 × 16) | 18.45 | 18.09 | 17.83 | 17.64 | 17.33 | 17.04 |
| AE_{aF} (32 × 32) | 18.13 | 17.67 | 17.32 | 17.08 | 16.79 | 16.58 |
| AE_{aF} (64 × 64) | 17.74 | 17.22 | 16.95 | 16.82 | 16.57 | 16.12 |
| AE_{bF} (16 × 16) | 18.39 | 18.12 | 17.79 | 17.53 | 17.38 | 17.11 |
| AE_{bF} (32 × 32) | 18.08 | 17.74 | 17.21 | 17.09 | 16.91 | 16.62 |
| AE_{bF} (64 × 64) | 17.76 | 17.36 | 17.05 | 16.85 | 16.34 | 16.17 |

Comparison of the fusion methods for visible and infrared images.

| Index | M1 | M2 | PCA | M3 | Wavelet | Proposed |
|---|---|---|---|---|---|---|
| IE | 6.2103 | 6.3278 | 6.6812 | 6.7216 | 6.8051 | 6.7962 |
| SD | 23.876 | 22.638 | 25.041 | 24.865 | 25.137 | 25.029 |
| AG | 3.2746 | 3.0833 | 3.3695 | 3.4276 | 3.5234 | 3.5428 |
| PSNR (dB) | 37.093 | 38.267 | 38.727 | 38.971 | 39.765 | 39.021 |
| Q Index | 0.9761 | 0.9784 | 0.9812 | 0.9836 | 0.9956 | 0.9903 |
| MI | 3.8257 | 4.2619 | 4.3128 | 4.5595 | 4.6392 | 4.7156 |

ESAM values between visible, infrared and fused images.

| ESAM (window) | M1 | M2 | PCA | M3 | Wavelet | Proposed |
|---|---|---|---|---|---|---|
| AE_{aF} (16 × 16) | 22.53 | 22.17 | 21.88 | 21.69 | 21.14 | 21.03 |
| AE_{aF} (32 × 32) | 22.14 | 21.84 | 21.65 | 21.13 | 20.82 | 20.56 |
| AE_{aF} (64 × 64) | 21.75 | 21.36 | 20.83 | 20.52 | 20.06 | 19.94 |
| AE_{bF} (16 × 16) | 22.44 | 22.13 | 21.76 | 21.38 | 21.03 | 20.87 |
| AE_{bF} (32 × 32) | 22.08 | 21.22 | 20.93 | 20.69 | 20.47 | 20.15 |
| AE_{bF} (64 × 64) | 21.69 | 20.87 | 20.55 | 20.07 | 19.89 | 19.68 |