Article

Image Fusion Algorithm Selection Based on Fusion Validity Distribution Combination of Difference Features

School of Information and Communication Engineering, North University of China, Taiyuan 030051, China
* Author to whom correspondence should be addressed.
Electronics 2021, 10(15), 1752; https://doi.org/10.3390/electronics10151752
Submission received: 17 June 2021 / Revised: 12 July 2021 / Accepted: 19 July 2021 / Published: 21 July 2021

Abstract

Existing image fusion models cannot reflect the demands that diverse attributes (e.g., type or amplitude) of difference features place on fusion algorithms, which leads to poor or invalid fusion results. To address this problem, this paper puts forward the construction and combination of difference feature fusion validity distributions based on intuition-possible sets, so as to select algorithms with better fusion effect for dual-mode infrared images. Firstly, the distances between the amplitudes of the difference features of the fused images and those of the source images are calculated. According to the fusion result of each algorithm, these distances can be divided into three levels, which are regarded as intuition-possible sets of the fusion validity of difference features, and a novel construction method of the fusion validity distribution based on intuition-possible sets is proposed. Secondly, in view of the multiple amplitude intervals of each difference feature, this paper proposes a distribution combination method based on intuition-possible set ordering. Difference feature score results are aggregated by a fuzzy operator, and the joint drop shadows of the difference feature score results are obtained. Finally, the experimental results indicate that the proposed method can optimally select the algorithm with a relatively better effect on the fusion of the difference features according to the varied feature amplitudes.

1. Introduction

The fusion of dual-mode infrared images can synthesize their imaging advantages, which is conducive to effective storage of the detection images and significantly improves the imaging quality and detection accuracy of the detection system [1,2]. It has been widely used in military surveillance, infrared countermeasures, industrial monitoring, fault detection, etc. Fusing the infrared intensity and polarization images of a scene is an effective way to make full use of the complete information they carry, and their complementary advantages can improve the detection and recognition of target objects [3,4,5]. However, owing to differences in imaging mechanisms and the diversity of detection environments and targets, the difference features of the two images of the same scene are complicated and varied. Each difference feature shows a different fusion performance (fusion validity) under each algorithm, so a static algorithm can hardly meet these varied needs; only by selecting an appropriate algorithm according to the difference features can the fusion requirements of complex infrared imaging scenes be met. Consequently, dynamically and optimally adjusting the fusion algorithm according to the different attributes of the difference features has become a key technology and a hot topic for improving the pertinence and effectiveness of dual-mode infrared image fusion [6].
At present, for dual-mode infrared image fusion, a better fusion algorithm is usually determined through qualitative analysis of the relationship between some known difference feature types and multiple fusion algorithms. By combining the support value transform (SVT), which captures the brightness difference relationship, with the Top-Hat transform, Reference [7] improved the contrast of fused images. Reference [8] analyzed the fusion performance of multi-resolution transform domain methods such as DWT, SWT, CVT, CT, DTCWT and NSCT, established the corresponding relationships between these algorithms and difference features, and achieved good fusion results. Xiang [9] considered the influence of statistical difference features in the NSCT domain on an adaptive dual-channel unit-linking PCNN, which enriched the details of fused images. Meng [10] described the relationship between the brightness difference feature and the fusion algorithm with saliency maps and points of interest, and preserved salient, bright targets in the fused image. Reference [11] studied the mapping relationship between the edge details of visible and infrared images and a fusion scheme based on compressed sensing. The above algorithms achieve a good fusion effect under certain circumstances.
The above studies all use crisp values to describe the fusion effect of difference features. Fuzzy sets have been recommended to overcome the situation where a crisp value cannot resolve the uncertain relationship between a difference feature and a fusion algorithm. An improved fuzzy set was used to fuse the low-frequency part in infrared and visible image fusion [12]. In addition, based on basic human judgment and membership functions, fuzzy logic inference was used for synchronous multi-band image fusion [13]. Fuzzy set theory is therefore usually combined with multi-resolution transform domain methods, instead of crisp values, to solve the image fusion problem.
For actual target detection, however, another attribute of the difference features between the two kinds of images, the 'amplitude', has as great an effect on the fusion results as the attribute known as 'type'. In practice, the types and amplitudes of the difference features change randomly, especially in dynamic detection scenes, which makes the changes of the various attributes even more complicated. The existing methods only consider the fusion effect of the single attribute, type, on the algorithms; they cannot reflect the impact of multiple attributes (such as type and amplitude) on algorithm selection, nor can they quantify the changing state of the fusion validity of the difference feature attributes, which leads to poor or invalid fusion effects. Therefore, only when a targeted algorithm is selected optimally according to the different attributes of the difference features can the fusion quality of dual-mode infrared images be improved.
For the fusion of infrared polarization and intensity images, a pre-selected fusion algorithm cannot always maintain better fusion performance when the amplitudes of the difference features vary, so the effect of a fusion algorithm is not fixed and changes dynamically with the amplitudes of the difference features. For actual detection images, the fusion validity used to describe the degree of influence of a difference feature value is mostly predicted and estimated from the fusion results of existing, limited and similar scene images, so the measurement of fusion validity is predictive and possibilistic. Therefore, a possibility distribution is needed to describe the changing process of fusion validity [14,15,16]. However, although a possibility distribution can express the dynamic changes of fusion validity, it only describes two-sided properties and cannot describe the middle state of the fusion validity of difference features [17,18,19], which has a great influence on fusion algorithm selection. To solve this problem, some researchers have used intuitionistic fuzzy sets to model and manage uncertainty, so that the process of computing with words is completed in the field of group decision-making. Tirupal [20] studied a multi-modal medical image fusion model based on Yager's intuitionistic fuzzy sets: the initial images are converted into complementary intuitionistic Yager fuzzy images, and a new objective function, called intuitionistic fuzzy entropy, is used to obtain the parameters of the membership and non-membership functions. In Reference [21], a new infrared and visible image fusion method employing the non-subsampled contourlet transform (NSCT) with intuitionistic fuzzy sets was put forward, which outperformed advanced fusion methods in terms of objective assessment and visual quality.
Therefore, this paper takes advantage of intuitionistic fuzzy set and possibility theory and proposes a novel construction method of the fusion validity distribution based on intuition-possible sets to solve the algorithm selection problem. For multiple difference features, a distribution combination method based on intuition-possible set ordering is proposed. The fusion validity scores of the algorithms on the multiple amplitude intervals of each difference feature are calculated, and comprehensive values of the algorithms for different feature amplitudes are obtained. On this basis, for each difference feature, the algorithm with a relatively better effect is selected. The flow chart of the method is shown in Figure 1, and the descriptions of some particular terms are listed in Table 1.
The following main contributions of this paper can be highlighted:
(1)
Intuition-possible sets are built to model fusion validity of attributes of image difference features.
(2)
A novel construction method of fusion validity distribution based on intuition-possible sets is proposed, which can reflect the fusion validity change process of attributes of image difference features to algorithms.
(3)
This paper puts forward a distribution combination method based on intuition-possible set ordering to solve the problem of optimally selecting the algorithm with a relatively better effect on the fusion of the difference features according to the varied image feature attribute values, which provides a basis for algorithm classification and mimicry bionic fusion.
The rest of this paper is organized as follows: Section 2 briefly analyses the types of difference features of infrared polarization and intensity images. Section 3 determines the intuition-possible set on fusion effect according to the distances of the amplitudes of the difference features, proposes a fusion validity distribution construction method, puts forward a distribution combination method based on intuition-possible set ordering, and gives and analyzes the experimental results and comparisons. Conclusions are presented in Section 4.

2. Determination of the Type of Difference Feature

The difference features arise mainly from differences in imaging characteristics, including the radiation difference between target and background, the atmospheric transmission difference and the imager response difference. It can be seen from the eight groups of closely registered infrared polarization and intensity experimental images (with a uniform size of 256 × 256) in Figure 2 that the infrared polarization images have sharp edge details but lack sufficient brightness, while the infrared intensity images carry significant thermal radiation luminance information but lack sufficient edge and detail features. Thus, the differences between infrared polarization and intensity images are evident in edge features and details. The difference features selected in this paper comprise the gray mean, standard deviation, edge intensity and spatial frequency, defined as follows:
  • Gray mean: In a grayscale image, the brightness information changes continuously from dark to bright. The difference gray mean represents the absolute value of the difference between the mean pixel intensity values of the two types of images in the dual-mode infrared pair, so it can effectively reflect the change in the brightness difference between the images.
  • Edge intensity: The edge information is the contour structural feature by which human vision recognizes objects, and its distributions in the two types of images are very different. The difference edge intensity represents the absolute value of the difference in the edge amplitude intensity of the two types of images. Among the commonly used edge extraction operators, this paper selects the Sobel operator to extract the edge amplitude intensity information and characterize the change in the edge feature difference between the images.
  • Standard deviation: The difference standard deviation reflects how the gray levels of the dual-mode infrared images are dispersed around the average gray level. The larger the difference standard deviation, the more dispersed the gray-level distribution, the greater the contrast between the two kinds of images, and the more information is available, that is, the better the fusion effect.
  • Spatial frequency: The difference spatial frequency reflects the sharpness of the pixel gray-value changes in the dual-mode infrared images; it can effectively represent the image texture information and reflects the image's ability to describe the contrast of small details. The greater the difference spatial frequency, the clearer the fused image. These four features describe image information well, so we adopt them as the difference features in this paper, labelled T1, T2, T3 and T4; a minimal extraction sketch follows the list.
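As a concrete illustration, the sketch below computes the four feature values for a grayscale block with NumPy/SciPy. The function names, the specific Sobel and first-difference formulations, and the assumed T1–T4 ordering are our own illustrative choices, not code from the paper:

```python
import numpy as np
from scipy import ndimage

def gray_mean(block):
    # Brightness feature: mean pixel intensity of the block.
    return float(np.mean(np.asarray(block, dtype=float)))

def standard_deviation(block):
    # Spread of gray levels around the block mean.
    return float(np.std(np.asarray(block, dtype=float)))

def edge_intensity(block):
    # Mean Sobel gradient magnitude as the edge amplitude intensity.
    b = np.asarray(block, dtype=float)
    gx, gy = ndimage.sobel(b, axis=1), ndimage.sobel(b, axis=0)
    return float(np.mean(np.hypot(gx, gy)))

def spatial_frequency(block):
    # Combined row/column frequency of gray-level changes.
    b = np.asarray(block, dtype=float)
    rf = np.sqrt(np.mean(np.diff(b, axis=1) ** 2))
    cf = np.sqrt(np.mean(np.diff(b, axis=0) ** 2))
    return float(np.hypot(rf, cf))

def difference_feature_amplitudes(block_p, block_i):
    # Amplitude of each difference feature: absolute difference of the
    # feature value between the two modalities (order T1..T4 assumed).
    feats = (gray_mean, standard_deviation, edge_intensity, spatial_frequency)
    return [abs(f(block_p) - f(block_i)) for f in feats]
```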

3. Construction of Fusion Validity of Difference Feature Amplitude Based on Intuition Possible Sets

3.1. Intuition Possible Sets

Let X be the universe of discourse and x a primary variable taking values in X. Suppose there are two mappings, $\pi_{\tilde{A}}^{1}: X \to [0,1]$ and $\pi_{\tilde{A}}^{2}: X \to [0,1]$, satisfying

$\forall x \in X, \quad \pi_{\tilde{A}}^{1}(x) \in [0,1]$ (1)

and:

$\forall x \in X, \quad \pi_{\tilde{A}}^{2}(x) \in [0,1]$ (2)

and meeting the following condition:

$0 \le \pi_{\tilde{A}}^{1}(x) + \pi_{\tilde{A}}^{2}(x) \le 1$ (3)

Then $\tilde{A}$ is called an intuition possible set in the universe of discourse X, and it can be expressed as Equation (4):

$\tilde{A} = \left\{ \left\langle x, \pi_{\tilde{A}}^{1}(x), \pi_{\tilde{A}}^{2}(x) \right\rangle \mid x \in X \right\}$ (4)

Here, $\pi_{\tilde{A}}^{1}$ and $\pi_{\tilde{A}}^{2}$ are the possibility distribution and non-possibility distribution of $\tilde{A}$, respectively, and $\pi_{\tilde{A}}^{1}(x)$ and $\pi_{\tilde{A}}^{2}(x)$ represent the possibility degree and non-possibility degree of x with respect to $\tilde{A}$. $\pi_{\tilde{A}}^{3}(x) = 1 - \pi_{\tilde{A}}^{1}(x) - \pi_{\tilde{A}}^{2}(x)$ is the hesitancy degree of x with respect to $\tilde{A}$. We denote $\pi_{\tilde{A}}^{1}(x_j)$ and $\pi_{\tilde{A}}^{2}(x_j)$ of $x_j$ ($j = 1, 2, \ldots, n$) as $\pi_j^1$ and $\pi_j^2$; $\langle \pi_j^1, \pi_j^2 \rangle$ is an ordered pair of the possibility degree and non-possibility degree of $x_j$.
For infrared polarization and intensity image fusion, X represents the different amplitude values of a difference feature, and the intuition possible set $\tilde{A}$ is the set of high fusion effect of the difference feature with respect to the algorithms. Then $\pi_{\tilde{A}}^{1}(x)$ represents the possibility degree of high fusion validity with respect to the algorithms when the difference feature amplitude is set equal to x, and $\pi_{\tilde{A}}^{2}(x)$ is the non-possibility degree of high fusion validity, that is, the possibility degree of low fusion validity of difference feature amplitude x. $\pi_{\tilde{A}}^{3}(x)$ is the hesitancy degree of medium fusion validity of difference feature amplitude x. A minimal code representation of such ordered pairs is sketched below.
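To make the definition concrete, the following container for an ordered pair $\langle \pi^1, \pi^2 \rangle$ enforces conditions (1)–(3) and exposes the hesitancy degree; the class name and numerical tolerance are our own choices:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IntuitionPossiblePair:
    pi1: float  # possibility degree of high fusion validity
    pi2: float  # non-possibility degree (possibility of low validity)

    def __post_init__(self):
        # Conditions (1)-(3): both degrees in [0, 1] and pi1 + pi2 <= 1.
        if not (0.0 <= self.pi1 <= 1.0 and 0.0 <= self.pi2 <= 1.0):
            raise ValueError("degrees must lie in [0, 1]")
        if self.pi1 + self.pi2 > 1.0 + 1e-12:
            raise ValueError("pi1 + pi2 must not exceed 1")

    @property
    def pi3(self) -> float:
        # Hesitancy degree: the medium fusion-validity state.
        return 1.0 - self.pi1 - self.pi2
```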

3.2. Calculation of the Distance of the Amplitudes of Difference Feature between Fused Image and Source Images

In this paper, the eight pairs of experimental images in Figure 2 are fused by 12 classical fusion algorithms: the Discrete Wavelet Transform [22,23], Quaternion Wavelet Transform [24,25], Dual-Tree Complex Wavelet Transform [26,27], Wavelet Packet Transform [28], Non-Subsampled Shearlet Transform [29,30,31], Laplacian Pyramid Transform [32,33], Gradient Pyramid Transform [34], K-SVD [35,36], Weighted Average Fusion, Principal Component Analysis [37], Top-Hat Transform [38], and Guided Filter [39,40,41], selected from the multi-scale transform, sparse representation, variational, morphological and pixel-domain fusion algorithm classes, and labelled A1, A2, A3, A4, A5, A6, A7, A8, A9, A10, A11 and A12. Thus we obtain the fused images of the different algorithms.
We first divide the fused image and the source images into a series of 16 × 16 non-overlapping image blocks. Then the distances of the amplitudes of the difference features between the fused image and the source images are calculated by the distance similarity of Equation (5).
$d_i^X = \min\left( \left| \bar{X}_f^i - \bar{X}_P^i \right|, \left| \bar{X}_f^i - \bar{X}_I^i \right| \right), \quad 1 \le i \le n, \qquad D^X = \left\{ d_1^X, \ldots, d_i^X, \ldots, d_n^X \right\}$ (5)
where $D^X$ is the n-dimensional vector of the distances of the amplitudes of the difference feature over all image blocks, $\bar{X}_f^i$ represents the mean amplitude of the difference feature for the i-th image block of the fused image, and $\bar{X}_P^i$ and $\bar{X}_I^i$ are, respectively, the mean amplitudes of the difference feature for the i-th image block of the infrared polarization and infrared intensity images. X can be difference feature T1, T2, T3 or T4.
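A compact sketch of this block-wise computation, using the 16 × 16 non-overlapping blocking described above, might look as follows (`feature` is any of the T1–T4 extractors sketched in Section 2; the helper names are ours):

```python
import numpy as np

def block_feature_values(img, feature, block=16):
    # Feature value of each non-overlapping block x block tile.
    h, w = img.shape
    return np.array([feature(img[r:r + block, c:c + block])
                     for r in range(0, h - block + 1, block)
                     for c in range(0, w - block + 1, block)])

def feature_distances(fused, polar, intensity, feature, block=16):
    # Equation (5): d_i = min(|Xf_i - Xp_i|, |Xf_i - Xi_i|) per block i.
    xf = block_feature_values(fused, feature, block)
    xp = block_feature_values(polar, feature, block)
    xi = block_feature_values(intensity, feature, block)
    d = np.minimum(np.abs(xf - xp), np.abs(xf - xi))
    return d / d.max()  # normalized to [0, 1], as plotted in Figure 3
```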
Scatter plots of the four types of difference features can then be obtained, as shown in Figure 3. Here, the amplitude of the difference feature is assigned to the abscissa, and the normalized distance of the amplitudes of the difference feature between the fused image (e.g., that of A5) and the source images is given on the vertical axis.

3.3. Distribution Construction

For infrared polarization and intensity images, better fusion results (that is, higher fusion validity) of the fused images are reflected as follows: (i) the brightness feature value of the fused image is as close as possible to that of the infrared intensity image; (ii) the edge and texture feature values of the fused image are as close as possible to those of the infrared polarization image. Algorithm 1 summarizes the procedure of the distribution construction step by step in pseudo-code format.
Algorithm 1: Distribution construction
Input: Infrared polarization image, infrared intensity image and fused image
Output: Fusion validity distribution
Step 1: Extract the amplitudes of the difference features
// T1, T2, T3 and T4.
Step 2: Calculate the distances of the amplitudes of the difference features using Equation (5)
Step 3: Build intuition possible sets on fusion effect
Step 4: Construct the fusion validity distribution
    a. Determine the number of image blocks $N^{X_k}$
    b. Initialize $n_1^{X_k}$, $n_2^{X_k}$ and $n_3^{X_k}$
    c. Update $n_1^{X_k}$, $n_2^{X_k}$ and $n_3^{X_k}$ using Equations (6)–(8)
    d. Calculate the fusion validity
According to the fusion result of an algorithm for the amplitudes of a difference feature, the distances can be divided into three intervals: $[0, |\bar{X}_f^i - \frac{\bar{X}_P^i + \bar{X}_I^i}{2}|)$, $(|\bar{X}_f^i - \min(\bar{X}_P^i, \bar{X}_I^i)|, 1]$ and $[|\bar{X}_f^i - \frac{\bar{X}_P^i + \bar{X}_I^i}{2}|, |\bar{X}_f^i - \min(\bar{X}_P^i, \bar{X}_I^i)|]$, where $\frac{\bar{X}_P^i + \bar{X}_I^i}{2}$ represents the mean amplitude of the difference feature for a fused image obtained with the weighted average method. Three cases are discussed:
(1)
For image blocks whose distances of the amplitudes of the difference feature between the fused image and the source images lie in the interval $[0, |\bar{X}_f^i - \frac{\bar{X}_P^i + \bar{X}_I^i}{2}|)$, the fused images in these blocks have a better effect than those based on the weighted average method. In this case, there is a definite possibility for the set of high fusion effect of the difference feature with respect to the algorithms.
(2)
When the distances of the amplitudes of the difference feature lie in the interval $(|\bar{X}_f^i - \min(\bar{X}_P^i, \bar{X}_I^i)|, 1]$, difference features with high complementarity have not been fused into the fused image effectively, so there is a definite possibility for the set of low fusion effect of the difference feature with respect to the algorithms.
(3)
If the distances of the amplitudes of the difference feature between the fused image and the source images lie in the interval $[|\bar{X}_f^i - \frac{\bar{X}_P^i + \bar{X}_I^i}{2}|, |\bar{X}_f^i - \min(\bar{X}_P^i, \bar{X}_I^i)|]$, we cannot determine whether the fusion validity is high or low; this is the medium state.
The amplitudes of the difference features in Figure 3 can be divided into K intervals, and we can obtain the total number of image blocks $N^{X_k}$ included in each amplitude interval $X_k$ ($k = 1, 2, \ldots, K$). Meanwhile, the numbers of image blocks $n_1^{X_k}$, $n_2^{X_k}$ and $n_3^{X_k}$ falling into the three distance intervals can be obtained for each amplitude interval. The mathematical formulation of the fusion validity of the intuition possible set is given as follows:
$\pi_{\tilde{A}}^{1,X_k} = \frac{n_1^{X_k}}{N^{X_k}}, \quad \text{s.t.} \quad 0 \le d^{X_k} < \left| \bar{X}_f^k - \frac{\bar{X}_P^k + \bar{X}_I^k}{2} \right|$ (6)

$\pi_{\tilde{A}}^{2,X_k} = \frac{n_2^{X_k}}{N^{X_k}}, \quad \text{s.t.} \quad \left| \bar{X}_f^k - \min\left( \bar{X}_P^k, \bar{X}_I^k \right) \right| < d^{X_k} \le 1$ (7)

$\pi_{\tilde{A}}^{3,X_k} = \frac{n_3^{X_k}}{N^{X_k}}, \quad \text{s.t.} \quad \left| \bar{X}_f^k - \frac{\bar{X}_P^k + \bar{X}_I^k}{2} \right| \le d^{X_k} \le \left| \bar{X}_f^k - \min\left( \bar{X}_P^k, \bar{X}_I^k \right) \right|$ (8)
Here, $\pi_{\tilde{A}}^{1,X_k} + \pi_{\tilde{A}}^{2,X_k} + \pi_{\tilde{A}}^{3,X_k} = 1$.
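Equations (6)–(8) translate directly into a counting procedure over the K amplitude intervals. A minimal sketch, assuming per-block arrays of amplitudes, normalized distances and the two per-block interval bounds defined above (all array names are ours), is:

```python
import numpy as np

def validity_distribution(amps, dists, low_bound, high_bound, K=20):
    # Per amplitude interval X_k, count the blocks whose distance falls in
    # the high / low / medium validity ranges and normalize by N^{X_k}.
    edges = np.linspace(amps.min(), amps.max(), K + 1)
    idx = np.clip(np.digitize(amps, edges) - 1, 0, K - 1)
    pis = np.zeros((K, 3))  # columns: pi1 (high), pi2 (low), pi3 (medium)
    for k in range(K):
        sel = idx == k
        N = sel.sum()
        if N == 0:
            continue  # empty interval: leave the row at zero
        n1 = np.sum(sel & (dists < low_bound))   # Equation (6)
        n2 = np.sum(sel & (dists > high_bound))  # Equation (7)
        pis[k] = n1 / N, n2 / N, (N - n1 - n2) / N  # Equation (8)
    return pis
```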
Then, the fusion validity distribution of the intuition possible set can be obtained, as shown in Figure 4. Similarly, we can get the fusion validity distribution of each difference feature for A1, A2, A3, A4, A5, A6, A7, A8, A9, A10, A11 and A12. Taking difference feature T1 as an example, Figure 5 shows its fusion validity distribution for the different algorithms.

3.4. Combination of Fusion Validity Distribution Based on Intuition Possible Set Ordering

According to the above method, we can obtain the fusion validity matrix based on intuition possible sets of a difference feature with respect to the fusion algorithms $A_i$ ($i = 1, 2, \ldots, m$). For difference feature T1, the fusion validity matrix F is given by:
$F = \left[ \left\langle \pi_{ik}^1, \pi_{ik}^2 \right\rangle \right]_{m \times L} = \begin{bmatrix} \langle \pi_{11}^1, \pi_{11}^2 \rangle & \langle \pi_{12}^1, \pi_{12}^2 \rangle & \cdots & \langle \pi_{1L}^1, \pi_{1L}^2 \rangle \\ \langle \pi_{21}^1, \pi_{21}^2 \rangle & \langle \pi_{22}^1, \pi_{22}^2 \rangle & \cdots & \langle \pi_{2L}^1, \pi_{2L}^2 \rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle \pi_{m1}^1, \pi_{m1}^2 \rangle & \langle \pi_{m2}^1, \pi_{m2}^2 \rangle & \cdots & \langle \pi_{mL}^1, \pi_{mL}^2 \rangle \end{bmatrix}$

where the rows correspond to the algorithms $A_1, A_2, \ldots, A_m$ and the columns to the amplitude intervals $1, 2, \ldots, L$.
Then, the total evaluation value $V_i$ of the amplitudes of difference feature T1 with respect to algorithm $A_i$ over the L amplitude intervals is expressed as follows:

$V_i = \frac{1}{L} \sum_{k=1}^{L} \left\langle \pi_{ik}^1, \pi_{ik}^2 \right\rangle = \left\langle \pi_i^1, \pi_i^2 \right\rangle, \quad i = 1, 2, \ldots, m$ (9)
Obviously, $V_i$ ($i = 1, 2, \ldots, m$) is also an intuition-possible set.
Let $\tilde{A} = \langle \pi_{ik}^1, \pi_{ik}^2 \rangle$ ($i = 1, \ldots, m$; $k = 1, \ldots, L$) be an intuition-possible set. The score function and accuracy function of $\tilde{A}$ are defined as [42,43]:

$m(\tilde{A}) = \dfrac{\pi_{ik}^1 - \pi_{ik}^2}{1 - \left(\pi_{ik}^1 + \pi_{ik}^2\right)\left(1 - \pi_{ik}^1 - \pi_{ik}^2\right)}, \qquad \Delta(\tilde{A}) = \dfrac{\pi_{ik}^1 + \pi_{ik}^2}{1 - \left(\pi_{ik}^1 + \pi_{ik}^2\right)\left(1 - \pi_{ik}^1 - \pi_{ik}^2\right)}$ (10)
For two intuition possible sets $\tilde{A}_i$ and $\tilde{A}_j$ [44,45,46] (a code sketch of this ordering follows the list):
1.
If $m(\tilde{A}_i) > m(\tilde{A}_j)$, then $\tilde{A}_i$ is superior to $\tilde{A}_j$, denoted by $\tilde{A}_i > \tilde{A}_j$;
2.
If $m(\tilde{A}_i) < m(\tilde{A}_j)$, then $\tilde{A}_j$ is superior to $\tilde{A}_i$, denoted by $\tilde{A}_i < \tilde{A}_j$;
3.
If $m(\tilde{A}_i) = m(\tilde{A}_j)$, then:
(1)
If $\Delta(\tilde{A}_i) > \Delta(\tilde{A}_j)$, then $\tilde{A}_i$ is superior to $\tilde{A}_j$, denoted by $\tilde{A}_i > \tilde{A}_j$;
(2)
If $\Delta(\tilde{A}_i) = \Delta(\tilde{A}_j)$, then $\tilde{A}_i$ is equivalent to $\tilde{A}_j$, denoted by $\tilde{A}_i \sim \tilde{A}_j$;
(3)
If $\Delta(\tilde{A}_i) < \Delta(\tilde{A}_j)$, then $\tilde{A}_j$ is superior to $\tilde{A}_i$, denoted by $\tilde{A}_i < \tilde{A}_j$.
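The total evaluation of Equation (9) and the ordering rules above can be sketched as follows; the closed forms of `score` and `accuracy` mirror our reading of Equation (10) and should be checked against the typeset original:

```python
def score(pair):
    # Score function m(<pi1, pi2>) of Equation (10); the denominator lies
    # in [0.75, 1] whenever pi1 + pi2 <= 1, so the division is safe.
    p1, p2 = pair
    return (p1 - p2) / (1.0 - (p1 + p2) * (1.0 - p1 - p2))

def accuracy(pair):
    # Accuracy function Delta(<pi1, pi2>) of Equation (10).
    p1, p2 = pair
    return (p1 + p2) / (1.0 - (p1 + p2) * (1.0 - p1 - p2))

def total_evaluation(pairs):
    # V_i of Equation (9): component-wise mean over the L intervals.
    L = len(pairs)
    return (sum(p[0] for p in pairs) / L, sum(p[1] for p in pairs) / L)

def is_superior(a, b):
    # Ordering rules: compare scores first, break ties with accuracy.
    return score(a) > score(b) or (score(a) == score(b)
                                   and accuracy(a) > accuracy(b))
```

With $\pi^1 = 1$ and $\pi^2 = 0$ the score is 1, and with $\pi^1 = 0$ and $\pi^2 = 1$ it is −1, matching the score range seen in Tables 2 and 3.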
We can calculate the score results of each fusion algorithm $A_i$ for the 1st through 10th amplitude intervals of difference feature T1, as shown in Table 2; Table 3 gives the score results for the 11th through 20th intervals.
According to Table 2 and Table 3, for each amplitude interval of difference feature T1, the closer the score of an algorithm is to 1, the better the fusion effect of that algorithm; the closer the score is to −1, the worse the fusion effect. We can likewise compute the scores on each amplitude interval of the other three classes of difference features based on intuition possible set ordering. The advantage of this approach is that the mapping between the amplitude of a difference feature and the fusion algorithms becomes explicit: when the amplitude of the difference feature changes, the fusion validity of each algorithm changes accordingly. The total evaluation values of the amplitudes of the difference features with respect to each algorithm $A_i$ are presented in Table 4 and depicted in Figure 6.
According to Table 4 and Figure 6, we can easily observe that, for difference feature T1, the A12 algorithm outperforms the other algorithms under the proposed intuition-possible set ordering method. For difference features T2, T3 and T4, the A6 algorithm has obvious advantages in the fidelity of salient information (including contrast, edge features and texture) and in human visual effect.
The score results of the difference features are aggregated with a disjunctive fuzzy operator so as to select the optimal fusion algorithm with the largest score. The aggregation of the four difference feature score results of the source images is presented in Figure 7, and the joint drop shadows of the aggregated score results are shown in Figure 8.
By counting the number of times each fusion algorithm appears in the joint drop shadow graph, together with the corresponding score values, the proportion $f_i$ of fusion algorithm $A_i$ in the area and the average score value $\bar{E}_i$ are calculated. Because both values are related to the performance of algorithm $A_i$, the fusion index $E_i$ is constructed as shown in Equation (11). The fusion indexes of each algorithm over all the difference feature amplitudes are summed, and the fusion algorithm corresponding to the highest value is the optimal fusion algorithm:

$E_i = f_i \bar{E}_i$ (11)
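If the joint drop shadow is represented as an array of winning-algorithm indices with their score values (the array layout is our assumption), Equation (11) reduces to a weighted count:

```python
import numpy as np

def fusion_indexes(winners, scores, n_algorithms=12):
    # E_i = f_i * mean score of A_i over the joint drop shadow area,
    # where f_i is the proportion of the area in which A_i appears.
    E = np.zeros(n_algorithms)
    for i in range(n_algorithms):
        sel = winners == i
        if sel.any():
            E[i] = sel.mean() * scores[sel].mean()
    return E  # the algorithm with the largest summed index is selected
```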

3.5. Experimental Results and Comparisons

To test the proposed fusion validity distribution construction and combination method, we select two groups of infrared polarization and intensity images of different scenes (as shown in Figure 9) for experimental verification. Figure 10 and Figure 11 show the fused images produced by the above fusion algorithms for each group of verification images. The fusion indexes $E_i$ for each group are shown in Table 5 and Table 6.
We adopt both subjective and objective assessment to analyze the fused results of the algorithm selected by the proposed fusion validity distribution construction and combination method. It can be observed from Figure 10 that the fusion results of algorithms A1, A6 and A12 retain the brightness of the infrared intensity image, while the other fused images are dark as a whole. Under the premise of preserving fused images with high brightness and contrast, algorithm A6 preserves texture details and edge contours better (particularly in the box region of Figure 10), and it has better visual effect and resolution. For the box region of Figure 11, where the source images differ greatly, algorithms A5 and A6 preserve visual brightness, contrast, texture details and edge contours better than the other algorithms; in particular, algorithm A5 preserves the texture details completely. The other fused images are dark, leading to a poor human visual effect.
It can be observed from Table 5, Table 6 and Figure 12 that, for the first group, in five combination forms (T1 and T2, T1 and T3, T1 and T4, T2 and T3, T2 and T4), the fusion indexes of algorithm A6 are higher than those of the other algorithms, and A6 is clearly much larger than the other algorithms in the final summation, so A6 is the optimal fusion algorithm for the first group. In the same way, A5 is the optimal fusion algorithm for the second group. To prove the effectiveness and advantages of the proposed method, a comparative analysis was made on the first group of images against the existing fusion validity based on fuzzy theory, computed as follows:
$\mu_i = \dfrac{\bar{X}_{f,j}^i \, \max\left(\bar{X}_{f,j}^i\right)}{\left(\bar{X}_{f,j}^i\right)^2 + \max\left(\bar{X}_P^i, \bar{X}_I^i\right)^2}$
Thus, we can obtain the fusion validity scatterplots of the four difference features, as shown in Figure 13; a per-block sketch follows.
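A per-block sketch of this fuzzy fusion validity, under our reading of the formula above (so the exact form is an assumption), is:

```python
import numpy as np

def fuzzy_validity(xf, xp, xi):
    # mu = Xf * max(Xf) / (Xf**2 + max(Xp, Xi)**2), evaluated per block.
    xf, xp, xi = (np.asarray(a, dtype=float) for a in (xf, xp, xi))
    return xf * xf.max() / (xf ** 2 + np.maximum(xp, xi) ** 2)
```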
The difference feature information and fuzzy operators in the above cases are used to rank the fusion algorithms. With the fusion validity based on fuzzy theory, the order of the fusion algorithms is: A6 > A10 > A11 > A12 > A1 > A7 > A5 > A9 > A3 > A2 > A8 > A4.
This ranking is slightly different from the result obtained in this paper, but both methods select the same best algorithm, A6, which shows that the method proposed in this paper is effective.
To further demonstrate the correctness of the ranking, we compare the values of eight evaluation indexes $X_j$ ($j = 1, \ldots, 8$): information entropy, Q0, Qw, QE, VIFF, SSIM, mutual information, and average gradient. The fusion results of the 12 fusion algorithms for the two groups of source images are evaluated, and we employ the grade score $R_i$ (as shown in Formula (12)) to take all evaluation indexes into account:

$R_i = \sum_{j=1}^{8} r_{ij}, \quad i = 1, 2, \ldots, 12$ (12)
where $r_{ij}$ is the rank of fusion algorithm $A_i$ on evaluation index $X_j$; the lower the grade score, the better the overall performance. The optimal fusion algorithm is selected according to the grade score and compared with the fusion algorithm selected by the proposed method. The results are shown in Table 7, and a ranking sketch is given below.
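In other words, each algorithm is ranked on every index and the ranks are summed. A sketch, assuming all eight indexes are oriented so that larger values are better:

```python
import numpy as np

def grade_scores(values):
    # values: (12, 8) array of index values X1..X8 per algorithm.
    # R_i (Formula (12)) is the sum over the indexes of A_i's rank, where
    # rank 1 is the best value on that index; smaller R_i is better.
    v = np.asarray(values, dtype=float)
    order = np.argsort(-v, axis=0)  # best algorithm first, per index
    ranks = np.empty_like(order)
    for j in range(v.shape[1]):
        ranks[order[:, j], j] = np.arange(1, v.shape[0] + 1)
    return ranks.sum(axis=1)
```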
The following can be observed from the data in Table 7. The R of algorithm A6 in the first group is significantly smaller than those of the other fusion algorithms, which indicates that A6 is the optimal fusion algorithm for the first group, with the strongest overall performance. As for the suboptimal algorithm A1, its X2 and X3 values are the highest, its X1 value is the second highest, and its other indexes also rank near the front, so it is reasonable that A1 is the suboptimal algorithm. Therefore, the proposed method based on intuition-possible sets can be used to select the best fusion algorithm for dual-mode infrared images.
In the second group, the R of algorithm A5 is significantly better, and the results for the other fusion algorithms are consistent with the experimental results in Table 5 and Table 6, which verifies the correctness and effectiveness of the proposed method. The algorithms with better fusion results under both subjective and objective analysis are exactly those selected by the proposed fusion validity distribution construction and combination method, which shows that selecting the best fusion algorithm based on intuition possible sets is feasible and effective. According to the multiple attributes (e.g., type and amplitude) of the difference features, the proposed method can select a relatively better fusion algorithm in a targeted way.

4. Conclusions

This paper proposes a novel fusion validity distribution construction method based on intuition-possible sets. We consider the dynamic changes of the difference feature attributes (including type and amplitude) in selecting an algorithm and establish the mappings between the attributes of the difference features and the fusion algorithms. The proposed distribution realizes a quantitative description of the changing process of the fusion effect of image difference features with respect to the algorithms.
For the multiple amplitude intervals of the difference features, this paper puts forward a distribution combination method based on intuition-possible set ordering. By calculating the score function and accuracy function of the intuition-possible sets on fusion validity, we can select the fusion algorithm with a relatively better result. This provides a new way toward algorithm classification and mimicry bionic fusion.
Future research will be formulated with the following questions in mind:
(1)
In this article, four difference features are used to select the best of the twelve fusion algorithms. In further research, we should also consider recently published image fusion solutions in order to choose algorithms with better fusion effects.
(2)
Although the method proposed in this paper has great advantages in selecting the optimal fusion algorithm, there is still room for improvement. We utilize a fuzzy operator to aggregate the difference feature score results in this paper. It would be very interesting to apply fuzzy weighted averaging operators to the fusion validity distribution combination of difference features, such as the fuzzy ordered weighted averaging operator and the Pythagorean fuzzy averaging and geometric averaging operators.

Author Contributions

Conceptualization, L.J. and F.Y.; methodology, L.J.; software, X.G.; validation, L.J. and X.G.; writing—original draft preparation, L.J.; writing—review and editing, L.J. and X.G.; visualization, F.Y.; supervision, F.Y.; project administration, L.J. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded in part by the National Natural Science Foundation of China under Grant 61702465, in part by the Shanxi Province Science Foundation for Youths (201901D211238), and in part by the Shanxi Province Science and Technology Innovation Project of Higher Education (2020L0264).

Institutional Review Board Statement

We choose to exclude this statement since the study did not involve humans or animals.

Informed Consent Statement

We all decide to submit this manuscript for publication.

Data Availability Statement

We choose to exclude this statement since the study did not report any data.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Li, S.; Jin, W.; Xia, R.; Li, L.; Wang, X. Radiation correction method for infrared polarization imaging system with front-mounted polarizer. Opt. Express 2016, 24, 26414–26430.
2. Hu, P.; Yang, F.; Wei, H.; Ji, L.; Liu, D. A multi-algorithm block fusion method based on set-valued mapping for dual-modal infrared images. Infrared Phys. Technol. 2019, 102, 102977.
3. Sappa, A.D.; Carvajal, J.A.; Aguilera, C.A.; Oliveira, M.; Romero, D.; Vintimilla, B.X. Wavelet-based visible and infrared image fusion: A comparative study. Sensors 2016, 16, 861.
4. Zhang, L.; Yang, F.; Ji, L. Multi-Scale Fusion Algorithm Based on Structure Similarity Index Constraint for Infrared Polarization and Intensity Images. IEEE Access 2017, 5, 24646–24655.
5. Liang, J.; Zhang, W.F.; Ren, L.Y. Polarimetric dehazing method for visibility improvement based on visible and infrared image fusion. Appl. Opt. 2016, 55, 8221–8226.
6. Zhou, Z.; Wang, B.; Li, S.; Dong, M. Perceptual fusion of infrared and visible images through a hybrid multi-scale decomposition with Gaussian and bilateral filters. Inf. Fusion 2016, 30, 15–26.
7. Lin, S.-Z.; Wang, D.-J.; Zhu, X.-H.; Zhang, S.-M. Fusion of infrared intensity and polarization images using embedded multi-scale transform. Optik 2015, 126, 5127–5133.
8. Gangapure, V.N.; Banerjee, S.; Chowdhury, A. Steerable local frequency based multispectral multifocus image fusion. Inf. Fusion 2015, 23, 99–115.
9. Xiang, T.; Yan, L.; Gao, R. A fusion algorithm for infrared and visible images based on adaptive dual-channel unit-linking PCNN in NSCT domain. Infrared Phys. Technol. 2015, 69, 53–61.
10. Meng, F.; Guo, B.; Song, M.; Zhang, X. Image fusion with saliency map and interest points. Neurocomputing 2016, 177, 1–8.
11. Liu, Z.; Yin, H.; Fang, B.; Chai, Y. A novel fusion scheme for visible and infrared images based on compressive sensing. Opt. Commun. 2015, 335, 168–177.
12. Cai, H.Y.; Zhuo, L.R.; Chen, X.D. Infrared and visible image fusion based on BEMSD and improved fuzzy set. Infrared Phys. Technol. 2019, 98, 201–211.
13. Wang, B.; Zeng, J.C.; Lin, S.Z. Multi-band images synchronous fusion based on NSST and fuzzy logical inference. Infrared Phys. Technol. 2019, 98, 94–107.
14. Ji, L.N.; Yang, F.B.; Wang, X.X. Similarity measure and weighted combination method of nonlinear possibility distributions. J. Nonlinear Convex Anal. 2019, 20, 787–800.
15. Satapathi, G.S.; Srihari, P. Rough fuzzy joint probabilistic association for tracking multiple targets in the presence of ECM. Expert Syst. Appl. 2018, 106, 132–140.
16. Dubois, D.; Prade, H.; Rico, A. Graded cubes of opposition and possibility theory with fuzzy events. Int. J. Approx. Reason. 2017, 84, 168–185.
17. Peng, X.; Yang, Y. Algorithms for interval-valued fuzzy soft sets in stochastic multi-criteria decision making based on regret theory and prospect theory with combined weight. Appl. Soft Comput. 2017, 54, 415–430.
18. Zhang, H.M.; Yue, L.Y. New distance measures between intuitionistic fuzzy sets and interval-valued fuzzy sets. Inf. Sci. 2013, 245, 181–196.
19. Ji, L.N.; Yang, F.B.; Guo, X.M. Set-valued mapping cloud model and its application for fusion algorithm selection of dual mode infrared images. IEEE Access 2021, 9, 54338–54349.
20. Tirupal, T.; Mohan, B.C.; Kumar, S.S. Multimodal medical image fusion based on Yager's intuitionistic fuzzy sets. Iran. J. Fuzzy Syst. 2019, 16, 33–48.
21. Zhang, K.; Huang, Y.; Yuan, X.; Ma, H.; Zhao, C. Infrared and visible image fusion based on intuitionistic fuzzy sets. Infrared Phys. Technol. 2020, 105, 103124.
22. Kavitha, S.; Thyagharajan, K.K. Efficient DWT-based fusion techniques using genetic algorithm for optimal parameter estimation. Soft Comput. 2017, 21, 3307–3316.
23. Yang, Y. Multi-Sensor Image Fusion Based on a New Discrete Wavelet Transform Based Technique. Sens. Lett. 2013, 11, 2137–2140.
24. Liu, Y.; Jin, J.; Wang, Q.; Shen, Y.; Dong, X. Region level based multi-focus image fusion using quaternion wavelet and normalized cut. Signal Process. 2014, 97, 9–30.
25. Chai, P.; Luo, X.; Zhang, Z. Image Fusion Using Quaternion Wavelet Transform and Multiple Features. IEEE Access 2017, 5, 6724–6734.
26. Yu, B.; Jia, B.; Ding, L.; Cai, Z.; Wu, Q.; Law, C.H.R.; Huang, J.; Song, L.; Fu, S. Hybrid dual-tree complex wavelet transform and support vector machine for digital multi-focus image fusion. Neurocomputing 2016, 182, 1–9.
27. Hu, G.; Li, X.; Liang, D. Thin cloud removal from remote sensing images using multidirectional dual tree complex wavelet transform and transfer least square support vector regression. J. Appl. Remote Sens. 2015, 9, 095053.
28. Bao, W.X.; Zhu, X.L. A Novel Remote Sensing Image Fusion Approach Research Based on HSV Space and Bi-orthogonal Wavelet Packet Transform. J. Indian Soc. Remote Sens. 2015, 43, 467–473.
29. Zhang, B.; Lu, X.; Pei, H.; Zhao, Y. A fusion algorithm for infrared and visible images based on saliency analysis and non-subsampled Shearlet transform. Infrared Phys. Technol. 2015, 73, 286–297.
30. Moonon, A.-U.; Hu, J. Multi-Focus Image Fusion Based on NSCT and NSST. Sens. Imaging Int. J. 2015, 16, 1–16.
31. Kong, W.; Wang, B.; Lei, Y. Technique for infrared and visible image fusion based on non-subsampled shearlet transform and spiking cortical model. Infrared Phys. Technol. 2015, 71, 87–98.
32. Wang, W.; Chang, F. A Multi-focus Image Fusion Method Based on Laplacian Pyramid. J. Comput. 2011, 6, 2559–2566.
33. Du, J.; Li, W.; Xiao, B.; Nawaz, Q. Union Laplacian pyramid with multiple features for medical image fusion. Neurocomputing 2016, 194, 326–339.
34. Qu, X.J.; Zhang, F.; Zhang, Y. Feature-Level Fusion of Dual-Band Infrared Images Based on Gradient Pyramid Decomposition. Appl. Mech. Mater. 2013, 347–350, 2380–2384.
35. Li, Y.; Li, F.; Bai, B.; Shen, Q. Image fusion via nonlocal sparse K-SVD dictionary learning. Appl. Opt. 2016, 55, 1814–1823.
36. Liu, Y.; Liu, S.; Wang, Z. A general framework for image fusion based on multi-scale transform and sparse representation. Inf. Fusion 2015, 24, 147–164.
37. Vijayarajan, R.; Muttan, S. Discrete wavelet transform based principal component averaging fusion for medical images. AEU Int. J. Electron. Commun. 2015, 69, 896–902.
38. Zhu, P.; Ma, X.; Huang, Z. Fusion of infrared-visible images using improved multi-scale top-hat transform and suitable fusion rules. Infrared Phys. Technol. 2017, 81, 282–295.
39. Li, S.; Kang, X.; Hu, J. Image Fusion with Guided Filtering. IEEE Trans. Image Process. 2013, 22, 2864–2875.
40. Toet, A.; Hogervorst, M.A. Multiscale image fusion through guided filtering. Target and Background Signatures II. Int. Soc. Opt. Photonics 2016, 9997, 99970J.
41. Kou, F.; Chen, W.; Wen, C.; Li, Z. Gradient Domain Guided Image Filtering. IEEE Trans. Image Process. 2015, 24, 4528–4539.
42. Liu, P.; Li, D. Some Muirhead Mean Operators for Intuitionistic Fuzzy Numbers and Their Applications to Group Decision Making. PLoS ONE 2017, 12, e0168767.
43. Xu, Z.; Yager, R.R. Some geometric aggregation operators based on intuitionistic fuzzy sets. Int. J. Gen. Syst. 2006, 35, 417–433.
44. Wan, S.-P.; Yi, Z.-H. Power Average of Trapezoidal Intuitionistic Fuzzy Numbers Using Strict t-Norms and t-Conorms. IEEE Trans. Fuzzy Syst. 2015, 24, 1035–1047.
45. Chen, S.-M.; Chang, C.-H. A novel similarity measure between Atanassov's intuitionistic fuzzy sets based on transformation techniques with applications to pattern recognition. Inf. Sci. 2015, 291, 96–114.
46. Guo, K.; Song, Q. On the entropy for Atanassov's intuitionistic fuzzy sets: An interpretation from the perspective of amount of knowledge. Appl. Soft Comput. 2014, 24, 328–340.
Figure 1. Flowchart of the proposed method.
Figure 2. Eight groups of infrared polarization and infrared intensity experimental images.
Figure 3. Scatter plots of the distances of the four types of difference features. (a) difference feature T1; (b) difference feature T2; (c) difference feature T3; (d) difference feature T4.
Figure 4. Fusion validity distribution of each difference feature to the NSST algorithm. (a) difference feature T1; (b) difference feature T2; (c) difference feature T3; (d) difference feature T4.
Figure 5. Fusion validity distribution of difference feature T1 to various algorithms.
Figure 6. Histogram of the total evaluation values of difference features to algorithms.
Figure 7. Aggregation of difference feature score results.
Figure 8. Joint drop shadow of difference feature score results.
Figure 9. Two groups of verified images.
Figure 10. Fusion results of the first group of images using different methods.
Figure 11. Fusion results of the second group of images using different methods.
Figure 12. $E_i$ of different fusion algorithms of the two groups.
Figure 13. Fusion validity scatterplot of the four difference features.
Table 1. The descriptions of particular terms.

Particular Term | Description
Difference feature | The difference information of infrared polarization and intensity images.
Diverse attribute | The type and amplitude of difference features.
The type of difference features | The brightness, edge and detail features, including gray mean, standard deviation, edge intensity and spatial frequency.
The amplitude of difference features | The absolute difference of the feature pixel intensity values of the two types of images.
Fusion validity | The degree to which the features of the source images are effectively fused into the fused image by a specific fusion algorithm.
Fusion validity distribution | Reflects the changing process of the fusion validity of the attributes of image difference features with respect to the algorithms.
Intuition possible sets | Realize a quantitative description of the changing process of the fusion effect of image difference features with respect to the algorithms.
Relatively better effect | Considering both objective and subjective evaluation, the algorithm has the best fusion effect.
Distribution construction | The changing process is constructed based on the new method (intuition possible sets).
Table 2. Score results from the 1st difference feature amplitude interval through the 10th interval.

Fusion Algorithm | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10
A1 | 0.2143 | 0.4742 | 0.6535 | 0.6526 | 0.8095 | 0.9182 | 0.8829 | 0.8962 | 0.9394 | 0.9167
A2 | 0.3571 | 0.3918 | 0.2970 | 0.2211 | 0.2143 | 0.1818 | 0.0450 | 0.0189 | 0.0101 | 0.0167
A3 | 0.0429 | 0.0206 | −0.0198 | 0.1474 | 0.1270 | 0.1091 | −0.0270 | −0.0094 | −0.0101 | −0.0500
A4 | −0.6286 | −0.4845 | −0.4356 | −0.2737 | −0.0635 | 0.0727 | 0 | 0 | −0.0101 | −0.0333
A5 | 0.3571 | 0.5464 | 0.6139 | 0.7053 | 0.7540 | 0.8273 | 0.8378 | 0.7264 | 0.6465 | 0.6167
A6 | 0.3857 | 0.5670 | 0.6733 | 0.6632 | 0.6190 | 0.7818 | 0.8108 | 0.6887 | 0.7374 | 0.6167
A7 | 0.2429 | 0.4124 | 0.5050 | 0.4947 | 0.5159 | 0.5818 | 0.6306 | 0.4906 | 0.5253 | 0.3833
A8 | 0.7000 | 0.7010 | 0.7228 | 0.5368 | 0.5794 | 0.5091 | 0.4865 | 0.5000 | 0.5859 | 0.5333
A9 | 1.0000 | 0.0515 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
A10 | 0.7571 | 0.6392 | 0.3762 | 0.2842 | 0.2778 | 0.1455 | 0.0721 | 0.0943 | 0.0606 | 0.0500
A11 | 0.9143 | 0.1856 | 0.0198 | 0.0105 | 0 | 0 | 0 | 0 | 0 | 0
A12 | 0.1286 | 0.5670 | 0.7129 | 0.6737 | 0.8254 | 0.9182 | 0.8739 | 0.7642 | 0.8081 | 0.8167
Table 3. Score results from the 11th difference feature amplitude interval through the 20th interval.

Fusion Algorithm | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20
A1 | 0.9310 | 0.9286 | 0.9245 | 0.9535 | 1.0000 | 0.9231 | 0.9167 | 1.0000 | 0 | 0
A2 | 0 | −0.1000 | −0.0755 | −0.1395 | −0.0870 | −0.1538 | −0.1667 | 0 | 0 | −1.0000
A3 | −0.0115 | −0.0429 | −0.0943 | −0.0233 | 0.2609 | 0.2308 | −0.3333 | −0.3333 | 0 | −1.0000
A4 | 0 | −0.0143 | −0.0377 | 0 | 0 | 0 | −0.2500 | −0.3333 | 0 | −1.0000
A5 | 0.7241 | 0.8286 | 0.7736 | 0.8372 | 0.8696 | 0.9231 | 1.0000 | 1.0000 | 0 | 0
A6 | 0.8161 | 0.8000 | 0.8491 | 0.8837 | 0.8696 | 0.9231 | 0.9167 | 1.0000 | 0 | −1.0000
A7 | 0.4943 | 0.6000 | 0.5660 | 0.6744 | 0.4783 | 0.3846 | 0.3333 | 0.3333 | 0 | −1.0000
A8 | 0.3678 | 0.0714 | 0.2453 | 0.0465 | −0.4783 | −0.5385 | −0.3333 | −0.3333 | 0 | 1.0000
A9 | 0 | 0 | −0.9623 | −1.0000 | −1.0000 | −1.0000 | −1.0000 | −1.0000 | 0 | −1.0000
A10 | 0.0460 | 0.0714 | 0.0189 | 0.0930 | −0.0870 | −0.1538 | −0.3333 | −0.3333 | 0 | −1.0000
A11 | 0 | −0.0429 | −0.7736 | −1.0000 | −1.0000 | −1.0000 | −1.0000 | −1.0000 | 0 | −1.0000
A12 | 0.8966 | 0.9429 | 0.9057 | 0.9767 | 1.0000 | 0.9231 | 1.0000 | 1.0000 | 0 | 1.0000
Table 4. The total evaluation values of difference features.

Fusion Algorithm | T1 | T2 | T3 | T4
A1 | 0.7467 | 0.4579 | 0.3354 | 0.3260
A2 | 0.1738 | 0.1923 | 0.1901 | 0.2168
A3 | 0.1447 | 0.2418 | 0.1036 | 0.2181
A4 | 0.1819 | 0.1115 | 0.0817 | 0.1921
A5 | 0.6794 | 0.4127 | 0.2383 | 0.1781
A6 | 0.7301 | 0.5755 | 0.4967 | 0.4929
A7 | 0.4823 | 0.1355 | 0.1871 | 0.2020
A8 | 0.4635 | 0.3241 | 0.0651 | 0.0303
A9 | 0.4007 | 0.0912 | 0.2193 | 0.3080
A10 | 0.2447 | 0.2817 | 0.1657 | 0.1079
A11 | 0.3973 | 0.1899 | 0.1811 | 0.2653
A12 | 0.7867 | 0.5007 | 0.2744 | 0.3739
Table 5. $E_i$ of each fusion algorithm of the first group.

Difference Features | Index | A1 | A2 | A3 | A4 | A5 | A6 | A7 | A8 | A9 | A10 | A11 | A12
T1, T2 | f_i | 0.0207 | 0 | 0.0059 | 0 | 0.0237 | 0.6657 | 0.2219 | 0 | 0 | 0.0148 | 0.0059 | 0.0059
T1, T2 | Ē_i | 0.2140 | 0 | 0.2031 | 0 | 0.3813 | 0.3439 | 0.2101 | 0 | 0 | 0.2226 | 0.2031 | 0.5980
T1, T2 | E_i | 0.0044 | 0 | 0.0012 | 0 | 0.0090 | 0.2289 | 0.0466 | 0 | 0 | 0.0033 | 0.0012 | 0.0035
T1, T3 | f_i | 0.1030 | 0 | 0 | 0 | 0.0258 | 0.4052 | 0.0585 | 0.1288 | 0.0081 | 0.0094 | 0 | 0.2646
T1, T3 | Ē_i | 0.3341 | 0 | 0 | 0 | 0.3125 | 0.3118 | 0.1430 | 0.4267 | 0.1210 | 0.2688 | 0 | 0.3333
T1, T3 | E_i | 0.0344 | 0 | 0 | 0 | 0.0081 | 0.1263 | 0.0084 | 0.0550 | 0.0010 | 0.0025 | 0 | 0.0882
T1, T4 | f_i | 0.1592 | 0.0100 | 0 | 0.0149 | 0.1194 | 0.3383 | 0.1741 | 0.0647 | 0 | 0.0697 | 0 | 0
T1, T4 | Ē_i | 0.1818 | 0.2264 | 0 | 0.2814 | 0.5393 | 0.3561 | 0.1917 | 0.2691 | 0 | 0.2918 | 0 | 0
T1, T4 | E_i | 0.0289 | 0.0023 | 0 | 0.0042 | 0.0644 | 0.1205 | 0.0334 | 0.0174 | 0 | 0.0203 | 0 | 0
T2, T3 | f_i | 0.1633 | 0 | 0 | 0 | 0.0854 | 0.4296 | 0.1985 | 0.0578 | 0 | 0.0226 | 0 | 0.0302
T2, T3 | Ē_i | 0.3727 | 0 | 0 | 0 | 0.5650 | 0.5005 | 0.2120 | 0.3981 | 0 | 0.6550 | 0 | 0.5400
T2, T3 | E_i | 0.0609 | 0 | 0 | 0 | 0.0483 | 0.2151 | 0.0421 | 0.0230 | 0 | 0.0148 | 0 | 0.0163
T2, T4 | f_i | 0.2111 | 0 | 0 | 0 | 0.0427 | 0.2563 | 0.1231 | 0 | 0 | 0.0427 | 0 | 0.2739
T2, T4 | Ē_i | 0.2440 | 0 | 0 | 0 | 0.5755 | 0.4892 | 0.2553 | 0 | 0 | 0.6599 | 0 | 0.4480
T2, T4 | E_i | 0.0515 | 0 | 0 | 0 | 0.0246 | 0.1254 | 0.0314 | 0 | 0 | 0.0282 | 0 | 0.1227
T3, T4 | f_i | 0.2601 | 0 | 0 | 0 | 0.0934 | 0.1086 | 0.1010 | 0.0985 | 0 | 0.0253 | 0 | 0.3131
T3, T4 | Ē_i | 0.3210 | 0 | 0 | 0 | 0.6009 | 0.5951 | 0.1894 | 0.3213 | 0 | 0.7471 | 0 | 0.4510
T3, T4 | E_i | 0.0835 | 0 | 0 | 0 | 0.0561 | 0.0646 | 0.0191 | 0.0316 | 0 | 0.0189 | 0 | 0.1412
sum | E_i | 0.2636 | 0.0023 | 0.0012 | 0.0042 | 0.2105 | 0.8808 | 0.1810 | 0.1270 | 0.0010 | 0.0880 | 0.0012 | 0.3719
rank | | 3 | 10 | 11 | 9 | 4 | 1 | 5 | 6 | 12 | 7 | 11 | 2
Table 6. $E_i$ of each fusion algorithm of the second group.

Difference Features | Index | A1 | A2 | A3 | A4 | A5 | A6 | A7 | A8 | A9 | A10 | A11 | A12
T1, T2 | f_i | 0 | 0.2553 | 0.0184 | 0 | 0.3289 | 0.0474 | 0.1263 | 0.0184 | 0.0145 | 0.0094 | 0 | 0.1158
T1, T2 | Ē_i | 0 | 0.6891 | 0.8492 | 0 | 0.6361 | 0.8124 | 0.7805 | 0.8492 | 0.6549 | 0.0851 | 0 | 0.6876
T1, T2 | E_i | 0 | 0.1759 | 0.0156 | 0 | 0.2092 | 0.0385 | 0.0986 | 0.0156 | 0.0095 | 0.0008 | 0 | 0.0796
T1, T3 | f_i | 0.0797 | 0.1087 | 0 | 0 | 0.5966 | 0.1522 | 0 | 0 | 0 | 0 | 0.0338 | 0.0072
T1, T3 | Ē_i | 0.7975 | 0.6974 | 0 | 0 | 0.6790 | 0.7858 | 0 | 0 | 0 | 0 | 0.7048 | 0.7717
T1, T3 | E_i | 0.0636 | 0.0758 | 0 | 0 | 0.4051 | 0.1196 | 0 | 0 | 0 | 0 | 0.0238 | 0.0056
T1, T4 | f_i | 0 | 0.2746 | 0.0405 | 0 | 0.2197 | 0.1069 | 0.1012 | 0.0405 | 0.0825 | 0 | 0 | 0.2572
T1, T4 | Ē_i | 0 | 0.6857 | 0.6456 | 0 | 0.5714 | 0.8249 | 0.7746 | 0.6456 | 0.2839 | 0 | 0 | 0.6969
T1, T4 | E_i | 0 | 0.1883 | 0.0261 | 0 | 0.1255 | 0.0882 | 0.0784 | 0.0261 | 0.0234 | 0 | 0 | 0.1793
T2, T3 | f_i | 0.0150 | 0 | 0 | 0 | 0.6675 | 0.1925 | 0.0425 | 0 | 0.1457 | 0.0246 | 0 | 0
T2, T3 | Ē_i | 0.5056 | 0 | 0 | 0 | 0.2652 | 0.4916 | 0.4146 | 0 | 0.4015 | 0.0285 | 0 | 0
T2, T3 | E_i | 0.0076 | 0 | 0 | 0 | 0.1770 | 0.0946 | 0.0176 | 0 | 0.0585 | 0.0007 | 0 | 0
T2, T4 | f_i | 0 | 0 | 0 | 0.0025 | 0.4347 | 0.0779 | 0.1382 | 0 | 0.0023 | 0 | 0.0553 | 0.1457
T2, T4 | Ē_i | 0 | 0 | 0 | 0.3074 | 0.1668 | 0.4844 | 0.4525 | 0 | 0.6105 | 0 | 0.3465 | 0.2850
T2, T4 | E_i | 0 | 0 | 0 | 0.0008 | 0.0725 | 0.0377 | 0.0625 | 0 | 0.0014 | 0 | 0.0192 | 0.0415
T3, T4 | f_i | 0.0090 | 0 | 0 | 0.0045 | 0.6014 | 0.2545 | 0.0383 | 0 | 0 | 0 | 0.0180 | 0.0721
T3, T4 | Ē_i | 0.7382 | 0 | 0 | 0.2611 | 0.1995 | 0.4321 | 0.3799 | 0 | 0 | 0 | 0.3426 | 0.3236
T3, T4 | E_i | 0.0067 | 0 | 0 | 0.0012 | 0.1200 | 0.1100 | 0.0145 | 0 | 0 | 0 | 0.0062 | 0.0233
sum | E_i | 0.0778 | 0.4400 | 0.0418 | 0.0019 | 1.1094 | 0.4886 | 0.2716 | 0.0049 | 0.1705 | 0.0015 | 0.0492 | 0.3293
rank | | 7 | 3 | 9 | 11 | 1 | 2 | 5 | 10 | 6 | 12 | 8 | 4
Table 7. The evaluation index of different fusion algorithms for the two groups of images.

Group | Method | X1 | X2 | X3 | X4 | X5 | X6 | X7 | X8 | R | Rank
1 | A1 | 0.5791 | 0.4356 | 0.7113 | 0.3051 | 0.5886 | 0.4116 | 3.1011 | 0.0534 | 36 | 2
1 | A2 | 0.3698 | 0.3807 | 0.5591 | 0.2022 | 0.6174 | 0.3445 | 4.3507 | 0.0637 | 64 | 10
1 | A3 | 0.5734 | 0.4118 | 0.7060 | 0.2878 | 0.4635 | 0.3886 | 2.8213 | 0.0556 | 58 | 8
1 | A4 | 0.4056 | 0.3777 | 0.5651 | 0.1905 | 0.4327 | 0.3432 | 3.1173 | 0.0376 | 82 | 12
1 | A5 | 0.5587 | 0.4005 | 0.6697 | 0.2745 | 0.5100 | 0.3693 | 3.0407 | 0.0485 | 55 | 7
1 | A6 | 0.5873 | 0.4663 | 0.6353 | 0.2743 | 0.8514 | 0.4385 | 7.1886 | 0.0686 | 21 | 1
1 | A7 | 0.4721 | 0.3851 | 0.5955 | 0.2101 | 0.6797 | 0.3510 | 6.3275 | 0.0597 | 51 | 6
1 | A8 | 0.3651 | 0.3095 | 0.6639 | 0.2452 | 0.4863 | 0.2724 | 3.3568 | 0.0487 | 72 | 11
1 | A9 | 0.3621 | 0.4446 | 0.6110 | 0.2366 | 0.5029 | 0.4141 | 3.4465 | 0.0313 | 62 | 9
1 | A10 | 0.4007 | 0.4531 | 0.5648 | 0.2173 | 0.7542 | 0.4633 | 4.0854 | 0.0418 | 50 | 5
1 | A11 | 0.4587 | 0.4214 | 0.6708 | 0.2751 | 0.6502 | 0.3978 | 3.0044 | 0.0668 | 41 | 4
1 | A12 | 0.4511 | 0.5241 | 0.5441 | 0.2582 | 0.5451 | 0.5197 | 7.4121 | 0.0558 | 40 | 3
2 | A1 | 0.4536 | 0.3853 | 0.5386 | 0.3719 | 0.1691 | 4.5003 | 0.0605 | 0.1259 | 53 | 7
2 | A2 | 0.4849 | 0.3779 | 0.6570 | 0.3740 | 0.3100 | 1.9077 | 0.0538 | 0.2986 | 37 | 3
2 | A3 | 0.4895 | 0.3549 | 0.6488 | 0.3458 | 0.1678 | 1.5654 | 0.0459 | 0.2517 | 64 | 10
2 | A4 | 0.4319 | 0.3421 | 0.5639 | 0.3305 | 0.1590 | 2.7579 | 0.0430 | 0.1568 | 77 | 11
2 | A5 | 0.4921 | 0.4368 | 0.6406 | 0.4851 | 0.2989 | 8.0634 | 0.0657 | 0.3351 | 17 | 1
2 | A6 | 0.4663 | 0.3883 | 0.6753 | 0.3824 | 0.2680 | 1.9236 | 0.0660 | 0.2915 | 32 | 2
2 | A7 | 0.4377 | 0.3666 | 0.6813 | 0.3546 | 0.2708 | 1.6754 | 0.0659 | 0.2613 | 50 | 5
2 | A8 | 0.4398 | 0.3846 | 0.5538 | 0.3676 | 0.1829 | 4.4382 | 0.0654 | 0.1175 | 57 | 9
2 | A9 | 0.4477 | 0.3770 | 0.5654 | 0.3211 | 0.2021 | 5.2334 | 0.0489 | 0.3220 | 51 | 6
2 | A10 | 0.2843 | 0.0524 | 0.0652 | 0.0244 | 0.0255 | 4.6624 | 0.0354 | 0.0203 | 88 | 12
2 | A11 | 0.4442 | 0.3704 | 0.6375 | 0.3619 | 0.2823 | 2.2638 | 0.0504 | 0.2270 | 54 | 8
2 | A12 | 0.5028 | 0.4665 | 0.5359 | 0.4711 | 0.1699 | 6.3297 | 0.0480 | 0.1505 | 43 | 4