Infrared and Visible Image Fusion via Sparse Representation and Guided Filtering in Laplacian Pyramid Domain
Abstract
1. Introduction
2. Related Works
2.1. Deep Learning on Image Fusion
2.2. Traditional Methods of Image Fusion
3. Laplacian Pyramid Transform
4. Proposed Fusion Method
4.1. Image Decomposition
4.2. Low-Frequency Fusion
4.3. High-Frequency Fusion
4.4. Image Reconstruction
5. Experimental Results and Discussion
5.1. Experimental Setup
5.2. Analysis of LP Decomposition Levels
5.3. Qualitative and Quantitative Analysis
5.4. Experimental Expansion
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Liu, Y.; Liu, S.; Wang, Z. A general framework for image fusion based on multi-scale transform and sparse representation. Inf. Fusion 2015, 24, 147–164.
- Huo, X.; Deng, Y.; Shao, K. Infrared and visible image fusion with significant target enhancement. Entropy 2022, 24, 1633.
- Luo, Y.; Luo, Z. Infrared and visible image fusion: Methods, datasets, applications, and prospects. Appl. Sci. 2023, 13, 10891.
- Li, L.; Lv, M.; Jia, Z.; Jin, Q.; Liu, M.; Chen, L.; Ma, H. An effective infrared and visible image fusion approach via rolling guidance filtering and gradient saliency map. Remote Sens. 2023, 15, 2486.
- Ma, X.; Li, T.; Deng, J. Infrared and visible image fusion algorithm based on double-domain transform filter and contrast transform feature extraction. Sensors 2024, 24, 3949.
- Wang, Q.; Yan, X.; Xie, W.; Wang, Y. Image fusion method based on snake visual imaging mechanism and PCNN. Sensors 2024, 24, 3077.
- Feng, B.; Ai, C.; Zhang, H. Fusion of infrared and visible light images based on improved adaptive dual-channel pulse coupled neural network. Electronics 2024, 13, 2337.
- Yang, H.; Zhang, J.; Zhang, X. Injected infrared and visible image fusion via L1 decomposition model and guided filtering. IEEE Trans. Comput. Imaging 2022, 8, 162–173.
- Zhang, X.; Boutat, D.; Liu, D. Applications of fractional operator in image processing and stability of control systems. Fractal Fract. 2023, 7, 359.
- Zhang, X.; He, H.; Zhang, J. Multi-focus image fusion based on fractional order differentiation and closed image matting. ISA Trans. 2022, 129, 703–714.
- Zhang, X.; Yan, H. Medical image fusion and noise suppression with fractional-order total variation and multi-scale decomposition. IET Image Process. 2021, 15, 1688–1701.
- Yan, H.; Zhang, X. Adaptive fractional multi-scale edge-preserving decomposition and saliency detection fusion algorithm. ISA Trans. 2020, 107, 160–172.
- Zhang, X.; Yan, H.; He, H. Multi-focus image fusion based on fractional-order derivative and intuitionistic fuzzy sets. Front. Inf. Technol. Electron. Eng. 2020, 21, 834–843.
- Zhang, J.; Ding, J.; Chai, T. Fault-tolerant prescribed performance control of wheeled mobile robots: A mixed-gain adaption approach. IEEE Trans. Autom. Control 2024, 69, 5500–5507.
- Zhang, J.; Xu, K.; Wang, Q. Prescribed performance tracking control of time-delay nonlinear systems with output constraints. IEEE/CAA J. Autom. Sin. 2024, 11, 1557–1565.
- Wu, D.; Wang, Y.; Wang, H.; Wang, F.; Gao, G. DCFNet: Infrared and visible image fusion network based on discrete wavelet transform and convolutional neural network. Sensors 2024, 24, 4065.
- Wei, Q.; Liu, Y.; Jiang, X.; Zhang, B.; Su, Q.; Yu, M. DDFNet-A: Attention-based dual-branch feature decomposition fusion network for infrared and visible image fusion. Remote Sens. 2024, 16, 1795.
- Li, X.; He, H.; Shi, J. HDCCT: Hybrid densely connected CNN and transformer for infrared and visible image fusion. Electronics 2024, 13, 3470.
- Mao, Q.; Zhai, W.; Lei, X.; Wang, Z.; Liang, Y. CT and MRI image fusion via coupled feature-learning GAN. Electronics 2024, 13, 3491.
- Wang, Z.; Chen, Y.; Shao, W. SwinFuse: A residual swin transformer fusion network for infrared and visible images. IEEE Trans. Instrum. Meas. 2023, 71, 5016412.
- Ma, J.; Tang, L.; Fan, F. SwinFusion: Cross-domain long-range learning for general image fusion via swin transformer. IEEE/CAA J. Autom. Sin. 2022, 9, 1200–1217.
- Gao, F.; Lang, P.; Yeh, C.; Li, Z.; Ren, D.; Yang, J. An interpretable target-aware vision transformer for polarimetric HRRP target recognition with a novel attention loss. Remote Sens. 2024, 16, 3135.
- Huang, L.; Chen, Y.; He, X. Spectral-spatial Mamba for hyperspectral image classification. Remote Sens. 2024, 16, 2449.
- Zhang, X.; Demiris, Y. Visible and infrared image fusion using deep learning. IEEE Trans. Pattern Anal. Mach. Intell. 2023, 45, 10535–10554.
- Zhang, X.; Ye, P.; Xiao, G. VIFB: A visible and infrared image fusion benchmark. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, Seattle, WA, USA, 14–19 June 2020.
- Li, H.; Wu, X. CrossFuse: A novel cross attention mechanism based infrared and visible image fusion approach. Inf. Fusion 2024, 103, 102147.
- Liu, Y.; Chen, X.; Wang, Z. Deep learning for pixel-level image fusion: Recent advances and future prospects. Inf. Fusion 2018, 42, 158–173.
- Liu, Y.; Chen, X.; Cheng, J. Infrared and visible image fusion with convolutional neural networks. Int. J. Wavelets Multiresolut. Inf. Process. 2018, 16, 1850018.
- Yang, C.; He, Y. Multi-scale convolutional neural networks and saliency weight maps for infrared and visible image fusion. J. Vis. Commun. Image Represent. 2024, 98, 104015.
- Wei, H.; Fu, X.; Wang, Z.; Zhao, J. Infrared/visible light fire image fusion method based on generative adversarial network of wavelet-guided pooling vision transformer. Forests 2024, 15, 976.
- Ma, J.; Xu, H. DDcGAN: A dual-discriminator conditional generative adversarial network for multi-resolution image fusion. IEEE Trans. Image Process. 2020, 29, 4980–4995.
- Chang, L.; Huang, Y. DUGAN: Infrared and visible image fusion based on dual fusion paths and a U-type discriminator. Neurocomputing 2024, 578, 127391.
- Lv, M.; Jia, Z.; Li, L.; Ma, H. Multi-focus image fusion via PAPCNN and fractal dimension in NSST domain. Mathematics 2023, 11, 3803.
- Lv, M.; Li, L.; Jin, Q.; Jia, Z.; Chen, L.; Ma, H. Multi-focus image fusion via distance-weighted regional energy and structure tensor in NSCT domain. Sensors 2023, 23, 6135.
- Li, L.; Lv, M.; Jia, Z.; Ma, H. Sparse representation-based multi-focus image fusion method via local energy in shearlet domain. Sensors 2023, 23, 2888.
- Ma, J.; Ma, Y.; Li, C. Infrared and visible image fusion methods and applications: A survey. Inf. Fusion 2019, 45, 153–178.
- Liu, Y.; Wang, L.; Cheng, J. Multi-focus image fusion: A survey of the state of the art. Inf. Fusion 2020, 64, 71–91.
- Chen, H.; Deng, L. SFCFusion: Spatial-frequency collaborative infrared and visible image fusion. IEEE Trans. Instrum. Meas. 2024, 73, 5011615.
- Chen, H.; Deng, L.; Zhu, L.; Dong, M. ECFuse: Edge-consistent and correlation-driven fusion framework for infrared and visible image fusion. Sensors 2023, 23, 8071.
- Li, X.; Tan, H. Infrared and visible image fusion based on domain transform filtering and sparse representation. Infrared Phys. Technol. 2023, 131, 104701.
- Chen, Y.; Liu, Y. Multi-focus image fusion with complex sparse representation. IEEE Sens. J. 2024; early access.
- Li, S.; Kwok, J.T.; Wang, Y. Multifocus image fusion using artificial neural networks. Pattern Recognit. Lett. 2002, 23, 985–997.
- Chang, C.I.; Liang, C.C.; Hu, P.F. Iterative Gaussian–Laplacian pyramid network for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2024, 62, 5510122.
- Burt, P.J.; Adelson, E.H. The Laplacian pyramid as a compact image code. IEEE Trans. Commun. 1983, 31, 532–540.
- Chen, J.; Li, X.; Luo, L. Infrared and visible image fusion based on target-enhanced multiscale transform decomposition. Inf. Sci. 2020, 508, 64–78.
- Yin, M.; Liu, X.; Liu, Y. Medical image fusion with parameter-adaptive pulse coupled neural network in nonsubsampled shearlet transform domain. IEEE Trans. Instrum. Meas. 2019, 68, 49–64.
- He, K.; Sun, J.; Tang, X. Guided image filtering. IEEE Trans. Pattern Anal. Mach. Intell. 2013, 35, 1397–1409.
- Li, S.; Kang, X.; Hu, J. Image fusion with guided filtering. IEEE Trans. Image Process. 2013, 22, 2864–2875.
- TNO Image Fusion Dataset. Available online: https://figshare.com/articles/dataset/TNO_Image_Fusion_Dataset/1008029 (accessed on 1 May 2024).
- Mitianoudis, N.; Stathaki, T. Pixel-based and region-based image fusion schemes using ICA bases. Inf. Fusion 2007, 8, 131–142.
- Bavirisetti, D.P.; Dhuli, R. Fusion of infrared and visible sensor images based on anisotropic diffusion and Karhunen-Loeve transform. IEEE Sens. J. 2016, 16, 203–209.
- Bavirisetti, D.P.; Dhuli, R. Two-scale image fusion of visible and infrared images using saliency detection. Infrared Phys. Technol. 2016, 76, 52–64.
- Li, H.; Wu, X.; Kittler, J. MDLatLRR: A novel decomposition method for infrared and visible image fusion. IEEE Trans. Image Process. 2020, 29, 4733–4746.
- Zhang, H.; Xu, H.; Xiao, Y. Rethinking the image fusion: A fast unified image fusion network based on proportional maintenance of gradient and intensity. In Proceedings of the AAAI Conference on Artificial Intelligence, New York, NY, USA, 7–12 February 2020; Volume 34, pp. 12797–12804.
- Li, H.; Wu, X.; Kittler, J. RFN-Nest: An end-to-end residual fusion network for infrared and visible images. Inf. Fusion 2021, 73, 72–86.
- Tang, H.; Liu, G. EgeFusion: Towards edge gradient enhancement in infrared and visible image fusion with multi-scale transform. IEEE Trans. Comput. Imaging 2024, 10, 385–398.
- Xiang, W.; Shen, J.; Zhang, L.; Zhang, Y. Infrared and visual image fusion based on a local-extrema-driven image filter. Sensors 2024, 24, 2271.
- Qu, X.; Yan, J.; Xiao, H. Image fusion algorithm based on spatial frequency-motivated pulse coupled neural networks in nonsubsampled contourlet transform domain. Acta Autom. Sin. 2008, 34, 1508–1514.
- Li, S.; Han, M.; Qin, Y.; Li, Q. Self-attention progressive network for infrared and visible image fusion. Remote Sens. 2024, 16, 3370.
- Li, L.; Zhao, X.; Hou, H.; Zhang, X.; Lv, M.; Jia, Z.; Ma, H. Fractal dimension-based multi-focus image fusion via coupled neural P systems in NSCT domain. Fractal Fract. 2024, 8, 554.
- Zhai, H.; Ouyang, Y.; Luo, N. MSI-DTrans: A multi-focus image fusion using multilayer semantic interaction and dynamic transformer. Displays 2024, 85, 102837.
- Li, L.; Ma, H.; Jia, Z.; Si, Y. A novel multiscale transform decomposition based multi-focus image fusion framework. Multimed. Tools Appl. 2021, 80, 12389–12409.
- Li, B.; Zhang, L.; Liu, J.; Peng, H. Multi-focus image fusion with parameter adaptive dual channel dynamic threshold neural P systems. Neural Netw. 2024, 179, 106603.
- Liu, Z.; Blasch, E.; Xue, Z. Objective assessment of multiresolution image fusion algorithms for context enhancement in night vision: A comparative study. IEEE Trans. Pattern Anal. Mach. Intell. 2012, 34, 94–109.
- Zhai, H.; Chen, Y.; Wang, Y. W-shaped network combined with dual transformers and edge protection for multi-focus image fusion. Image Vis. Comput. 2024, 150, 105210.
- Haghighat, M.; Razian, M. Fast-FMI: Non-reference image fusion metric. In Proceedings of the IEEE 8th International Conference on Application of Information and Communication Technologies, Astana, Kazakhstan, 15–17 October 2014; pp. 424–426.
- Wang, X.; Fang, L.; Zhao, J.; Pan, Z.; Li, H.; Li, Y. MMAE: A universal image fusion method via mask attention mechanism. Pattern Recognit. 2025, 158, 111041.
- Zhang, X.; Li, W. Hyperspectral pathology image classification using dimension-driven multi-path attention residual network. Expert Syst. Appl. 2023, 230, 120615.
- Zhang, X.; Li, Q. FD-Net: Feature distillation network for oral squamous cell carcinoma lymph node segmentation in hyperspectral imagery. IEEE J. Biomed. Health Inform. 2024, 28, 1552–1563.
- Nejati, M.; Samavi, S.; Shirani, S. Multi-focus image fusion using dictionary-based sparse representation. Inf. Fusion 2015, 25, 72–84.
- Zhang, H.; Le, Z. MFF-GAN: An unsupervised generative adversarial network with adaptive and gradient joint constraints for multi-focus image fusion. Inf. Fusion 2021, 66, 40–53.
- Xu, H.; Ma, J.; Le, Z. FusionDN: A unified densely connected network for image fusion. In Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI), New York, NY, USA, 7–12 February 2020; Volume 34, pp. 12484–12491.
- Xu, H.; Ma, J.; Jiang, J. U2Fusion: A unified unsupervised image fusion network. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 44, 502–518.
- Zhang, Y.; Xiang, W. Local extreme map guided multi-modal brain image fusion. Front. Neurosci. 2022, 16, 1055451.
- Hu, X.; Jiang, J.; Liu, X.; Ma, J. ZMFF: Zero-shot multi-focus image fusion. Inf. Fusion 2023, 92, 127–138.
- Li, J.; Zhang, J.; Yang, C.; Liu, H.; Zhao, Y.; Ye, Y. Comparative analysis of pixel-level fusion algorithms and a new high-resolution dataset for SAR and optical image fusion. Remote Sens. 2023, 15, 5514.
- Li, L.; Ma, H.; Jia, Z. Multiscale geometric analysis fusion-based unsupervised change detection in remote sensing images via FLICM model. Entropy 2022, 24, 291.
- Li, L.; Ma, H.; Zhang, X.; Zhao, X.; Lv, M.; Jia, Z. Synthetic aperture radar image change detection based on principal component analysis and two-level clustering. Remote Sens. 2024, 16, 1861.
- Li, L.; Ma, H.; Jia, Z. Change detection from SAR images based on convolutional neural networks guided by saliency enhancement. Remote Sens. 2021, 13, 3697.
- Li, L.; Ma, H.; Jia, Z. Gamma correction-based automatic unsupervised change detection in SAR images via FLICM model. J. Indian Soc. Remote Sens. 2023, 51, 1077–1088.
| Levels |  |  |  |  |  |  |  |  |  |  |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 0.5686 | 0.5392 | 0.6205 | 0.9068 | 0.5592 | 3.8195 | 0.8155 | 0.5440 | 0.3016 | 0.8307 |
| 2 | 0.5669 | 0.5467 | 0.6655 | 0.9124 | 0.5565 | 3.1350 | 0.8099 | 0.4438 | 0.3402 | 0.8317 |
| 3 | 0.5727 | 0.5394 | 0.6764 | 0.9138 | 0.5619 | 2.6628 | 0.8075 | 0.3760 | 0.3644 | 0.8301 |
| 4 | 0.5768 | 0.5306 | 0.6699 | 0.9140 | 0.5654 | 2.4378 | 0.8065 | 0.3460 | 0.3716 | 0.8233 |
| 5 | 0.5765 | 0.5131 | 0.6521 | 0.9138 | 0.5655 | 2.3160 | 0.8060 | 0.3321 | 0.3832 | 0.8079 |
| 6 | 0.5775 | 0.5113 | 0.6292 | 0.9133 | 0.5662 | 2.4575 | 0.8064 | 0.3540 | 0.3871 | 0.7980 |
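The table above varies the number of Laplacian pyramid (LP) decomposition levels from 1 to 6. For readers unfamiliar with the transform, here is a minimal NumPy sketch of Burt and Adelson's LP decomposition and reconstruction; it is illustrative only (the 5-tap binomial kernel and function names follow the classic formulation, not the authors' released code):

```python
import numpy as np

# 5-tap binomial kernel from Burt & Adelson's pyramid construction.
_KERNEL_1D = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0

def _blur(img):
    """Separable 5-tap low-pass filter with edge replication."""
    pad = np.pad(img, 2, mode="edge")
    tmp = sum(k * pad[:, i:i + img.shape[1]] for i, k in enumerate(_KERNEL_1D))
    return sum(k * tmp[i:i + img.shape[0], :] for i, k in enumerate(_KERNEL_1D))

def pyr_down(img):
    """Blur, then discard every other row and column."""
    return _blur(img)[::2, ::2]

def pyr_up(img, shape):
    """Zero-insert to the target shape, blur, and rescale brightness."""
    up = np.zeros(shape)
    up[::2, ::2] = img
    return 4.0 * _blur(up)

def laplacian_pyramid(img, levels):
    """Detail bands (high frequency) plus one low-frequency residual."""
    pyr, cur = [], img.astype(np.float64)
    for _ in range(levels):
        down = pyr_down(cur)
        pyr.append(cur - pyr_up(down, cur.shape))  # what downsampling lost
        cur = down
    pyr.append(cur)  # low-frequency residual
    return pyr

def reconstruct(pyr):
    """Invert the decomposition: upsample and add back each detail band."""
    cur = pyr[-1]
    for band in reversed(pyr[:-1]):
        cur = pyr_up(cur, band.shape) + band
    return cur
```

Because each detail band stores exactly what upsampling loses, summing the bands back level by level recovers the input to floating-point precision; the fusion rules act on these bands before reconstruction.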

| Method |  |  |  |  |  |  |  |  |  |  |
|---|---|---|---|---|---|---|---|---|---|---|
| ICA | 0.4017 | 0.4461 | 0.5300 | 0.9139 | 0.3956 | 1.8567 | 0.8038 | 0.2775 | 0.2654 | 0.7064 |
| ADKLT | 0.4026 | 0.5404 | 0.4651 | 0.8778 | 0.3976 | 1.5936 | 0.8034 | 0.2382 | 0.1851 | 0.7098 |
| MFSD | 0.4247 | 0.5756 | 0.5898 | 0.9017 | 0.4203 | 1.3551 | 0.8031 | 0.1983 | 0.2056 | 0.7252 |
| MDLatLRR | 0.3248 | 0.4957 | 0.4136 | 0.8874 | 0.3184 | 1.0944 | 0.8028 | 0.1556 | 0.2958 | 0.6882 |
| PMGI | 0.3880 | 0.5035 | 0.4399 | 0.9024 | 0.3803 | 1.8901 | 0.8041 | 0.2747 | 0.2028 | 0.7361 |
| RFNNest | 0.3372 | 0.4939 | 0.3991 | 0.9031 | 0.3300 | 1.7239 | 0.8036 | 0.2546 | 0.2155 | 0.6856 |
| EgeFusion | 0.1968 | 0.4298 | 0.3371 | 0.8688 | 0.1901 | 1.1886 | 0.8029 | 0.1665 | 0.2154 | 0.4970 |
| LEDIF | 0.5058 | 0.5702 | 0.6512 | 0.9087 | 0.5001 | 1.2948 | 0.8030 | 0.1929 | 0.2572 | 0.8143 |
| Proposed | 0.5860 | 0.6029 | 0.7047 | 0.9248 | 0.5838 | 2.7156 | 0.8067 | 0.3908 | 0.3280 | 0.8802 |

| Method |  |  |  |  |  |  |  |  |  |  |
|---|---|---|---|---|---|---|---|---|---|---|
| ICA | 0.4002 | 0.4417 | 0.4899 | 0.9569 | 0.3987 | 2.3254 | 0.8051 | 0.3427 | 0.2676 | 0.7434 |
| ADKLT | 0.4043 | 0.5699 | 0.4124 | 0.9249 | 0.3993 | 1.8767 | 0.8041 | 0.2756 | 0.1595 | 0.7093 |
| MFSD | 0.4175 | 0.6009 | 0.6229 | 0.9539 | 0.4128 | 1.7852 | 0.8039 | 0.2594 | 0.1677 | 0.6909 |
| MDLatLRR | 0.3382 | 0.4503 | 0.5120 | 0.9142 | 0.3370 | 1.2513 | 0.8030 | 0.1769 | 0.2772 | 0.7223 |
| PMGI | 0.4605 | 0.5269 | 0.5454 | 0.9516 | 0.4610 | 2.1395 | 0.8043 | 0.3089 | 0.1939 | 0.7885 |
| RFNNest | 0.4098 | 0.5803 | 0.4507 | 0.9460 | 0.4066 | 2.1851 | 0.8048 | 0.3098 | 0.1841 | 0.7168 |
| EgeFusion | 0.2011 | 0.3987 | 0.3715 | 0.8835 | 0.1971 | 1.1956 | 0.8029 | 0.1666 | 0.2133 | 0.5511 |
| LEDIF | 0.5870 | 0.5920 | 0.6801 | 0.9538 | 0.5845 | 1.5422 | 0.8034 | 0.2297 | 0.2578 | 0.8901 |
| Proposed | 0.6880 | 0.6771 | 0.7431 | 0.9623 | 0.6860 | 3.6399 | 0.8112 | 0.5043 | 0.2976 | 0.9458 |

| Method |  |  |  |  |  |  |  |  |  |  |
|---|---|---|---|---|---|---|---|---|---|---|
| ICA | 0.6748 | 0.6689 | 0.7446 | 0.8854 | 0.6642 | 4.1877 | 0.8113 | 0.6531 | 0.7358 | 0.8365 |
| ADKLT | 0.5891 | 0.6599 | 0.6499 | 0.8739 | 0.5764 | 3.7880 | 0.8098 | 0.5907 | 0.6140 | 0.7521 |
| MFSD | 0.6183 | 0.6423 | 0.7634 | 0.8751 | 0.6071 | 3.5683 | 0.8091 | 0.5492 | 0.6331 | 0.7636 |
| MDLatLRR | 0.3124 | 0.4782 | 0.4074 | 0.8460 | 0.3083 | 2.4512 | 0.8060 | 0.4063 | 0.5687 | 0.5580 |
| PMGI | 0.5529 | 0.2891 | 0.5425 | 0.8676 | 0.5400 | 3.2741 | 0.8082 | 0.5181 | 0.5801 | 0.5961 |
| RFNNest | 0.5053 | 0.6186 | 0.5145 | 0.8723 | 0.4964 | 3.6997 | 0.8095 | 0.5728 | 0.6163 | 0.7138 |
| EgeFusion | 0.2452 | 0.4732 | 0.3511 | 0.8070 | 0.2414 | 2.1513 | 0.8053 | 0.3561 | 0.4598 | 0.5115 |
| LEDIF | 0.6390 | 0.6455 | 0.7146 | 0.8829 | 0.6314 | 3.4861 | 0.8088 | 0.5387 | 0.7371 | 0.8444 |
| Proposed | 0.7252 | 0.6830 | 0.8105 | 0.8887 | 0.7182 | 4.4156 | 0.8131 | 0.6674 | 0.8141 | 0.9395 |

| Method |  |  |  |  |  |  |  |  |  |  |
|---|---|---|---|---|---|---|---|---|---|---|
| ICA | 0.4523 | 0.3979 | 0.5932 | 0.9004 | 0.4478 | 2.1008 | 0.8045 | 0.3153 | 0.4024 | 0.7236 |
| ADKLT | 0.3585 | 0.4032 | 0.3922 | 0.8670 | 0.3529 | 1.7737 | 0.8038 | 0.2697 | 0.2615 | 0.6098 |
| MFSD | 0.4416 | 0.4786 | 0.6176 | 0.8861 | 0.4388 | 1.4931 | 0.8033 | 0.2229 | 0.3066 | 0.6666 |
| MDLatLRR | 0.3157 | 0.4746 | 0.3772 | 0.8874 | 0.3131 | 1.2763 | 0.8029 | 0.1830 | 0.4091 | 0.6339 |
| PMGI | 0.3799 | 0.3587 | 0.4497 | 0.8783 | 0.3764 | 1.7162 | 0.8035 | 0.2594 | 0.3257 | 0.7108 |
| RFNNest | 0.2971 | 0.4159 | 0.3138 | 0.8920 | 0.2961 | 2.0997 | 0.8046 | 0.3137 | 0.3343 | 0.6153 |
| EgeFusion | 0.2123 | 0.4800 | 0.3351 | 0.8582 | 0.2101 | 1.2046 | 0.8029 | 0.1720 | 0.2723 | 0.4726 |
| LEDIF | 0.5120 | 0.4597 | 0.6724 | 0.8911 | 0.5081 | 1.5419 | 0.8033 | 0.2354 | 0.3847 | 0.7865 |
| Proposed | 0.5947 | 0.5076 | 0.6975 | 0.9059 | 0.5915 | 2.5337 | 0.8062 | 0.3571 | 0.5059 | 0.8553 |

| Method |  |  |  |  |  |  |  |  |  |  |
|---|---|---|---|---|---|---|---|---|---|---|
| ICA | 0.4317 | 0.4496 | 0.5277 | 0.9074 | 0.4197 | 2.1172 | 0.8048 | 0.3167 | 0.3192 | 0.7050 |
| ADKLT | 0.4078 | 0.4733 | 0.4205 | 0.8789 | 0.3919 | 1.7968 | 0.8041 | 0.2704 | 0.2341 | 0.6745 |
| MFSD | 0.4274 | 0.5103 | 0.5657 | 0.8948 | 0.4124 | 1.6584 | 0.8038 | 0.2459 | 0.2467 | 0.6627 |
| MDLatLRR | 0.3364 | 0.4735 | 0.4251 | 0.8915 | 0.3274 | 1.3278 | 0.8033 | 0.1924 | 0.3453 | 0.6478 |
| PMGI | 0.4258 | 0.4580 | 0.5123 | 0.8961 | 0.4121 | 2.3462 | 0.8055 | 0.3399 | 0.2777 | 0.7095 |
| RFNNest | 0.3480 | 0.4679 | 0.3692 | 0.8988 | 0.3347 | 2.1126 | 0.8047 | 0.3067 | 0.2306 | 0.6146 |
| EgeFusion | 0.2041 | 0.4421 | 0.3164 | 0.8606 | 0.1964 | 1.2972 | 0.8032 | 0.1850 | 0.2504 | 0.4683 |
| LEDIF | 0.5222 | 0.5062 | 0.6390 | 0.8996 | 0.5085 | 1.8827 | 0.8044 | 0.2810 | 0.3165 | 0.7919 |
| Proposed | 0.5768 | 0.5306 | 0.6699 | 0.9140 | 0.5654 | 2.4378 | 0.8065 | 0.3460 | 0.3716 | 0.8233 |

| Method |  |  |  |  |  |  |  |  |  |  |
|---|---|---|---|---|---|---|---|---|---|---|
| ICA | 0.6248 | 0.6334 | 0.7991 | 0.8949 | 0.6191 | 6.2557 | 0.8247 | 0.8360 | 0.6340 | 0.8339 |
| FusionDN | 0.6018 | 0.6008 | 0.7663 | 0.8833 | 0.5952 | 5.7908 | 0.8221 | 0.7684 | 0.6221 | 0.8224 |
| PMGI | 0.3901 | 0.5656 | 0.4736 | 0.8815 | 0.3857 | 5.8641 | 0.8225 | 0.8004 | 0.4620 | 0.6738 |
| U2Fusion | 0.6143 | 0.5682 | 0.7835 | 0.8844 | 0.6093 | 5.7765 | 0.8221 | 0.7725 | 0.6657 | 0.7912 |
| LEGFF | 0.6810 | 0.6751 | 0.8195 | 0.8937 | 0.6754 | 5.6138 | 0.8214 | 0.7473 | 0.7565 | 0.8817 |
| ZMFF | 0.7087 | 0.7412 | 0.8687 | 0.8925 | 0.7030 | 6.6271 | 0.8271 | 0.8838 | 0.7853 | 0.9313 |
| EgeFusion | 0.3576 | 0.4034 | 0.5032 | 0.8472 | 0.3541 | 3.2191 | 0.8120 | 0.4248 | 0.5405 | 0.5991 |
| LEDIF | 0.7051 | 0.6898 | 0.8390 | 0.8932 | 0.7005 | 5.7546 | 0.8222 | 0.7659 | 0.7665 | 0.9146 |
| Proposed | 0.7503 | 0.7745 | 0.8819 | 0.8997 | 0.7487 | 7.4854 | 0.8332 | 0.9980 | 0.8302 | 0.9700 |

| Method |  |  |  |  |  |  |  |  |  |  |
|---|---|---|---|---|---|---|---|---|---|---|
| ICA | 0.5940 | 0.7460 | 0.7562 | 0.8674 | 0.5877 | 6.0569 | 0.8242 | 0.8304 | 0.6298 | 0.8594 |
| FusionDN | 0.5243 | 0.4996 | 0.6556 | 0.8527 | 0.5187 | 5.3504 | 0.8203 | 0.7179 | 0.5856 | 0.7638 |
| PMGI | 0.4237 | 0.5933 | 0.5061 | 0.8558 | 0.4177 | 5.4884 | 0.8210 | 0.7614 | 0.4750 | 0.7031 |
| U2Fusion | 0.5502 | 0.5156 | 0.6970 | 0.8565 | 0.5447 | 5.1498 | 0.8194 | 0.6991 | 0.6212 | 0.7830 |
| LEGFF | 0.6190 | 0.6060 | 0.7067 | 0.8692 | 0.6106 | 4.8291 | 0.8183 | 0.6555 | 0.7075 | 0.8266 |
| ZMFF | 0.6395 | 0.7102 | 0.7994 | 0.8631 | 0.6322 | 5.7795 | 0.8228 | 0.7914 | 0.6834 | 0.8804 |
| EgeFusion | 0.2874 | 0.3277 | 0.3757 | 0.8255 | 0.2841 | 2.8055 | 0.8111 | 0.3761 | 0.5191 | 0.5539 |
| LEDIF | 0.6599 | 0.6585 | 0.7610 | 0.8673 | 0.6538 | 5.1592 | 0.8199 | 0.7031 | 0.6968 | 0.8971 |
| Proposed | 0.7348 | 0.8204 | 0.8467 | 0.8779 | 0.7312 | 8.2343 | 0.8412 | 1.1244 | 0.7876 | 0.9825 |
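The guided filtering named in the title refers to He et al.'s locally linear filter (listed in the references), commonly used in fusion pipelines to smooth weight maps while preserving edges of the guide image. A minimal NumPy sketch follows; it is illustrative only, and the default radius and regularization values are arbitrary placeholders rather than the paper's settings:

```python
import numpy as np

def _box(img, r):
    """Mean filter of radius r via an integral image (O(1) per pixel)."""
    pad = np.pad(img, r, mode="edge")          # constant window size at borders
    c = pad.cumsum(axis=0).cumsum(axis=1)      # 2-D cumulative sum
    c = np.pad(c, ((1, 0), (1, 0)))            # prepend a zero row and column
    H, W = img.shape
    d = 2 * r + 1
    s = c[d:d + H, d:d + W] - c[d:d + H, :W] - c[:H, d:d + W] + c[:H, :W]
    return s / d**2

def guided_filter(I, p, r=8, eps=1e-3):
    """Guided filter: output is locally a linear transform a*I + b of guide I."""
    mI, mp = _box(I, r), _box(p, r)
    var_I = _box(I * I, r) - mI * mI           # local variance of the guide
    cov_Ip = _box(I * p, r) - mI * mp          # local covariance of guide/input
    a = cov_Ip / (var_I + eps)                 # eps regularizes flat regions
    b = mp - a * mI
    return _box(a, r) * I + _box(b, r)         # average coefficients, then apply
```

Filtering an image with itself as the guide (`guided_filter(p, p)`) yields the edge-preserving smoothing typically used when refining fusion weight maps.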
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Li, L.; Shi, Y.; Lv, M.; Jia, Z.; Liu, M.; Zhao, X.; Zhang, X.; Ma, H. Infrared and Visible Image Fusion via Sparse Representation and Guided Filtering in Laplacian Pyramid Domain. Remote Sens. 2024, 16, 3804. https://doi.org/10.3390/rs16203804