CResDAE: A Deep Autoencoder with Attention Mechanism for Hyperspectral Unmixing
Highlights
- What are the main findings?
- The proposed CResDAE model, which integrates a channel attention mechanism and deep residual modules, demonstrates superior performance in hyperspectral unmixing compared to both conventional and deep learning-based methods.
- Application of CResDAE to real GF-5 satellite data from Yunnan successfully identifies key surface materials, including forest, grassland, silicate, carbonate, and sulfate.
- What is the implication of the main finding?
- The model provides a more interpretable and effective tool for geological surveys by explicitly addressing the limitations of band selection and physical constraints in existing unmixing methods.
- It offers reliable data support for mineral exploration in covered regions, enhancing the ability to analyze surface material composition quantitatively.
Abstract
1. Introduction
2. Materials and Methods
2.1. Nonlinear Spectral Model
2.2. Deep Autoencoder Network
2.3. Channel Attention Module
2.4. Deep Residual Encoder
2.5. CResDAE Model Architecture
- (1) Mean Squared Error (MSE) Loss: Measures the squared Euclidean distance between the reconstructed and true spectra, reflecting numerical reconstruction accuracy [57].
- (2) Spectral Angle Mapper (SAM) Loss: Measures the angle between the predicted and true spectra on the unit hypersphere, emphasizing spectral directional consistency [58]. (A minimal implementation sketch of the two terms follows this list.)
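For concreteness, a minimal PyTorch sketch of the two loss terms is given below. The weighted combination and the weight `lam` are illustrative assumptions, since the section does not state how the two terms are balanced in the training objective.

```python
import torch
import torch.nn.functional as F

def mse_loss(x_hat: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
    """Squared Euclidean distance between reconstructed and true spectra (averaged)."""
    return F.mse_loss(x_hat, x)

def sam_loss(x_hat: torch.Tensor, x: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Mean spectral angle (radians) between predicted and true spectra."""
    cos = F.cosine_similarity(x_hat, x, dim=-1, eps=eps)
    return torch.acos(cos.clamp(-1.0 + eps, 1.0 - eps)).mean()

def reconstruction_loss(x_hat: torch.Tensor, x: torch.Tensor, lam: float = 0.1) -> torch.Tensor:
    """Weighted sum of the two terms; the weight 'lam' is purely illustrative."""
    return mse_loss(x_hat, x) + lam * sam_loss(x_hat, x)
```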
3. Data Acquisition and Processing
3.1. Urban Dataset
3.1.1. Urban Hyperspectral Image
3.1.2. Spectral Signatures of Ground Objects
3.2. GF-5 Hyperspectral Data
3.2.1. Hyperspectral Image of the Study Area
3.2.2. Endmember Spectra of Surface Materials
4. Experiment
4.1. Construction of Simulated Hyperspectral Dataset
4.2. CResDAE Model Training
4.3. Unmixing on Simulated Datasets and Multi-Model Evaluation
- (1) MVC + NMF.
- (2) VCA + K-Hype.
- (3) LinearAE (Linear Autoencoder): A simple autoencoder structure in which the encoder output represents abundances and the decoder performs a fixed linear combination of endmembers; it is suited to linear mixing scenarios [66] (see the sketch after this list).
- (4) ConvAE (Convolutional Autoencoder): Treats hyperspectral images as spatial tensors and introduces convolutional layers to extract local features, enhancing the spatial continuity of abundance estimation [67].
- (5) NAE.
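For reference, a minimal sketch of a LinearAE-style baseline, as described in item (3), is given below. The hidden width and the softmax used to enforce the abundance sum-to-one constraint are assumptions rather than details taken from [66].

```python
import torch
import torch.nn as nn

class LinearAE(nn.Module):
    """Minimal linear-mixing autoencoder baseline: the encoder maps a pixel spectrum
    to abundances, and the bias-free linear decoder acts as the endmember matrix.
    Hidden width and the softmax sum-to-one constraint are illustrative assumptions."""
    def __init__(self, n_bands: int = 162, n_endmembers: int = 5):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_bands, 64), nn.ReLU(),
            nn.Linear(64, n_endmembers), nn.Softmax(dim=-1),
        )
        self.decoder = nn.Linear(n_endmembers, n_bands, bias=False)

    def forward(self, x: torch.Tensor):
        a = self.encoder(x)            # estimated abundances
        return self.decoder(a), a      # linear reconstruction, abundances
```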
- (1) Root Mean Squared Error (RMSE): Measures the average Euclidean distance between the reconstructed spectra and the ground-truth spectra after unmixing [70].
- (2) Spectral Angle Distance (SAD): Measures the angle between the estimated and ground-truth spectra, reflecting their directional similarity. It is commonly used to assess spectral similarity at the endmember or pixel level and is expressed in radians or degrees; smaller values indicate higher similarity [71].
- (3) Spectral Information Divergence (SID): An information-theoretic metric that quantifies the dissimilarity between two spectral vectors in terms of their probability distributions. A smaller value indicates higher spectral similarity [72].
- (4) Peak Signal-to-Noise Ratio (PSNR): Measures the fidelity of image or spectral reconstruction and is commonly used for quality assessment in image processing. A higher value indicates better reconstruction quality [73]. (A minimal implementation sketch of the four metrics follows this list.)
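A compact NumPy sketch of the four metrics, under their standard definitions, is given below. The SID normalization and the PSNR peak value (a reflectance range of [0, 1]) are assumed conventions not fixed by the section above.

```python
import numpy as np

def rmse(x_hat: np.ndarray, x: np.ndarray) -> float:
    """Root mean squared error between estimated and reference values."""
    return float(np.sqrt(np.mean((x_hat - x) ** 2)))

def sad(x_hat: np.ndarray, x: np.ndarray, eps: float = 1e-12) -> float:
    """Spectral angle distance (radians) between two spectral vectors."""
    cos = np.dot(x_hat, x) / (np.linalg.norm(x_hat) * np.linalg.norm(x) + eps)
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def sid(x_hat: np.ndarray, x: np.ndarray, eps: float = 1e-12) -> float:
    """Spectral information divergence: symmetric KL divergence between
    spectra normalized to probability-like distributions."""
    p = x_hat / (x_hat.sum() + eps) + eps
    q = x / (x.sum() + eps) + eps
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

def psnr(x_hat: np.ndarray, x: np.ndarray, peak: float = 1.0) -> float:
    """Peak signal-to-noise ratio in dB; 'peak' assumes reflectance in [0, 1]."""
    mse = float(np.mean((x_hat - x) ** 2))
    return 10.0 * np.log10(peak ** 2 / mse)
```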
4.4. Urban Data Unmixing
4.5. Unmixing of GF-5 Hyperspectral Imagery
4.6. Discussion
- Channel Attention Mechanism significantly improves the model’s sensitivity to critical mineral spectral regions by assigning higher weights to key spectral positions (e.g., mineral absorption peaks), while suppressing noise and redundant bands. This allows the model to effectively distinguish endmembers even when their reflectance spectra are very similar.
- Deep Residual Structure mitigates training degradation and enhances feature expressiveness, helping to avoid performance drops in deeper networks and improving both convergence speed and training stability.
- Nonlinear Skip Connections enable the model to handle the nonlinear mixing effects commonly observed in real-world scenarios. This structure allows the model to approximate bilinear interactions and can be extended to more complex forms such as exponential mixing models (e.g., PNMM). (A minimal sketch of these components follows this list.)
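To make the first two points concrete, the following is a minimal PyTorch sketch of an SE-style channel attention gate and a fully connected residual block. The reduction ratio and the exact placement of activations are illustrative assumptions rather than the authors' implementation.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """SE-style gate over spectral features: a ReLU bottleneck followed by Sigmoid
    weights that re-scale each feature, emphasizing diagnostic bands and damping
    noisy or redundant ones. The reduction ratio (4) is an illustrative choice."""
    def __init__(self, dim: int, reduction: int = 4):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Linear(dim, dim // reduction), nn.ReLU(),
            nn.Linear(dim // reduction, dim), nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.gate(x)

class ResidualBlock(nn.Module):
    """Fully connected residual block: two linear layers plus an identity skip,
    which eases optimization of deeper encoders and stabilizes training."""
    def __init__(self, dim: int):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(x + self.body(x))
```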
- (1) Application Potential and Practical Value in Real Environments: In field experiments using GF-5 hyperspectral imagery from Yunnan, the CResDAE model exhibited strong unmixing capability and robustness. It was able to stably extract endmember spectra and their spatial distributions even under complex and variable natural surface conditions.
- (2) Sensitivity to Endmember Proportion and High-Abundance Regions: CResDAE demonstrated high sensitivity to fine-grained variations in endmember proportions (as seen in Figure 24), effectively capturing gradual transitions in boundary regions rather than abrupt changes. In particular, it accurately revealed the cross-distribution of layered silicates and sulfates, reflecting spatial patterns of material migration and surface weathering.
- (3) Interpretability under Semi-Supervised Conditions: Under semi-supervised learning conditions, guided by only a limited number of ROI samples, CResDAE extracted endmember spectra that closely matched the mean spectral curves of the ROIs, indicating strong interpretability.
- (4) Suitability for Complex Surface Environments: The abundance estimation method of CResDAE is well suited to fine-grained modeling in areas with complex terrain and coexisting or overlapping materials, addressing the limitations of traditional classification and spectral identification methods in handling mixed pixels.
- (5) Application Value in Mineral Exploration and Potential in Vegetated Areas: The results demonstrate that CResDAE can transform hyperspectral imagery into environmental and geological information maps with high reliability, supporting field geological surveys and remote sensing-based mineral exploration. For example: abundance maps can help identify mineral-enriched zones (e.g., carbonate anomalies), serving as remote sensing indicators of alteration zones; unmixing results can assist in constructing mineral assemblage layers (e.g., carbonate + muscovite/illite + clay-type indicator minerals) for anomaly zone delineation; and, when overlaid with existing geological maps, the analysis can significantly narrow the scope and reduce the cost of field investigations.
5. Conclusions
- (1) The proposed CResDAE model introduces a channel attention mechanism and a deep residual structure into the hyperspectral unmixing framework. This enhances the model’s ability to assign adaptive weights to spectral bands in geological unmixing tasks and equips the abundance estimation network with nonlinear skip connections. As a result, the model achieves improved nonlinear modeling capacity and spectral feature responsiveness, demonstrating superior robustness and generalization across both synthetic and real hyperspectral datasets.
- (2) Experimental results show that CResDAE consistently outperforms traditional and other deep learning-based methods, with notably better unmixing performance and spatial continuity, especially in the boundary regions of mixed pixels. Compared with MVC + NMF, VCA + K-Hype, LinearAE, ConvAE, and NAE on the synthetic datasets, CResDAE achieves average reductions of 0.1919 in RMSE, 0.6726 in SAD, and 2.0995 in SID. On the Urban dataset, compared to NAE, RMSE improves by 27.01%, while SAD, SID, and PSNR improve by 37.39%, 40.79%, and 25.24%, respectively.
- (3) In the real-data experiments on GF-5 hyperspectral imagery from the Yunnan region, the proposed CResDAE model demonstrates strong adaptability for cross-regional land cover recognition. It effectively distinguishes forest, grassland, silicate, carbonate, and sulfate materials across different areas. The extracted endmember spectral curves exhibit high interpretability, and the spatial distribution of the abundance maps shows strong consistency with actual surface cover. These results provide high-confidence data support for field geological surveys and remote sensing-based mineral exploration, offering decision-making references and technical assistance for geological prospecting in covered regions.
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Tong, Q.X.; Zhang, B.; Zheng, L.F. Hyperspectral Remote Sensing; Higher Education Press: Beijing, China, 2006; pp. 1–3. [Google Scholar]
- Qian, S.E. Hyperspectral satellites, evolution, and development history. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 7032–7056. [Google Scholar] [CrossRef]
- Li, J.; Zheng, K.; Gao, L.; Han, Z.; Li, Z.; Chanussot, J. Enhanced Deep Image Prior for Unsupervised Hyperspectral Image Super-Resolution. IEEE Trans. Geosci. Remote Sens. 2025, 63, 5504218. [Google Scholar] [CrossRef]
- Li, J.; Zheng, K.; Gao, L.; Ni, L.; Huang, M.; Chanussot, J. Model-Informed Multistage Unsupervised Network for Hyperspectral Image Super-Resolution. IEEE Trans. Geosci. Remote Sens. 2024, 62, 5516117. [Google Scholar] [CrossRef]
- Liu, C.; Xing, C.; Hu, Q.; Wang, S.; Zhao, S.; Gao, M. Stereoscopic hyperspectral remote sensing of the atmospheric environment: Innovation and prospects. Earth-Sci. Rev. 2022, 226, 103958. [Google Scholar] [CrossRef]
- Ang, K.L.M.; Seng, J.K.P. Big data and machine learning with hyperspectral information in agriculture. IEEE Access 2021, 9, 36699–36718. [Google Scholar] [CrossRef]
- Nisha, A.; Anitha, A. Current advances in hyperspectral remote sensing in urban planning. In Proceedings of the 2022 Third International Conference on Intelligent Computing Instrumentation and Control Technologies (ICICICT), Kannur, India, 11–12 August 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 94–98. [Google Scholar]
- Qasim, M.; Khan, S.D.; Haider, R.; Rasheed, M.U. Integration of multispectral and hyperspectral remote sensing data for lithological mapping in Zhob Ophiolite, Western Pakistan. Arab. J. Geosci. 2022, 15, 599. [Google Scholar] [CrossRef]
- Chakraborty, R.; Kereszturi, G.; Pullanagari, R.; Durance, P.; Ashraf, S.; Anderson, C. Mineral prospecting from biogeochemical and geological information using hyperspectral remote sensing-Feasibility and challenges. J. Geochem. Explor. 2022, 232, 106900. [Google Scholar] [CrossRef]
- Li, D.; Chen, S.; Chen, X. Research on method for extracting vegetation information based on hyperspectral remote sensing data. Trans. CSAE 2010, 26, 181–185, (In Chinese with English Abstract). [Google Scholar]
- Chen, J.; Ma, L.; Chen, X.H.; Rao, Y.H. Research progress of spectral mixture analysis. J. Remote Sens. 2016, 20, 1102–1109, (In Chinese with English Abstract). [Google Scholar] [CrossRef]
- Bioucas-Dias, J.M.; Plaza, A.; Dobigeon, N.; Parente, M.; Du, Q.; Gader, P.; Chanussot, J. Hyperspectral unmixing overview: Geometrical, statistical, and sparse regression-based approaches. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2012, 5, 354–379. [Google Scholar] [CrossRef]
- Zhang, B.; Sun, X. Hyperspectral Images Unmixing Algorithm; Science Press: Beijing, China, 2015. [Google Scholar]
- Borsoi, R.A.; Imbiriba, T.; Bermudez, J.C.M.; Richard, C.; Chanussot, J.; Drumetz, L.; Tourneret, J.Y.; Zare, A.; Jutten, C. Spectral variability in hyperspectral data unmixing: A comprehensive review. IEEE Geosci. Remote Sens. Mag. 2021, 9, 223–270. [Google Scholar] [CrossRef]
- Guerra, R.; Santos, L.; López, S.; Sarmiento, R. A new fast algorithm for linearly unmixing hyperspectral images. IEEE Trans. Geosci. Remote Sens. 2015, 53, 6752–6765. [Google Scholar] [CrossRef]
- Campos, S.P. Linear Spectral Mixing Model: Theoretical Concepts, Algorithms and Applications of Studies in the Legal Amazon. Rev. Bras. Cartogr. 2020, 72, 50. [Google Scholar]
- Dobigeon, N.; Tourneret, J.Y.; Richard, C.; Bermudez, J.C.M.; McLaughlin, S.; Hero, A.O. Nonlinear unmixing of hyperspectral images: Models and algorithms. IEEE Signal Process. Mag. 2013, 31, 82–94. [Google Scholar] [CrossRef]
- Heylen, R.; Parente, M.; Gader, P. A review of nonlinear hyperspectral unmixing methods. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 1844–1868. [Google Scholar] [CrossRef]
- Winter, M.E. N-FINDR: An algorithm for fast autonomous spectral end-member determination in hyperspectral data. In Imaging spectrometry V; SPIE: Bellingham, WA, USA, 1999; Volume 3753, pp. 266–275. [Google Scholar]
- Nascimento, J.M.; Dias, J.M. Vertex component analysis: A fast algorithm to unmix hyperspectral data. IEEE Trans. Geosci. Remote Sens. 2005, 43, 898–910. [Google Scholar] [CrossRef]
- Iordache, M.D.; Bioucas-Dias, J.M.; Plaza, A. Sparse unmixing of hyperspectral data. IEEE Trans. Geosci. Remote Sens. 2011, 49, 2014–2039. [Google Scholar] [CrossRef]
- Zhou, Y.; Rangarajan, A.; Gader, P.D. A Gaussian mixture model representation of endmember variability in hyperspectral unmixing. IEEE Trans. Image Process. 2018, 27, 2242–2256. [Google Scholar] [CrossRef]
- Lee, D.D.; Seung, H.S. Learning the parts of objects by non-negative matrix factorization. Nature 1999, 401, 788–791. [Google Scholar] [CrossRef]
- Zhou, G.; Xie, S.; Yang, Z.; Yang, J.M.; He, Z. Minimum-volume-constrained nonnegative matrix factorization: Enhanced ability of learning parts. IEEE Trans. Neural Netw. 2011, 22, 1626–1637. [Google Scholar] [CrossRef]
- Palsson, F.; Sigurdsson, J.; Sveinsson, J.R.; Ulfarsson, M.O. Neural network hyperspectral unmixing with spectral information divergence objective. In Proceedings of the 2017 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Fort Worth, TX, USA, 23–28 July 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 755–758. [Google Scholar]
- Deshpande, V.S.; Bhatt, J.S. A practical approach for hyperspectral unmixing using deep learning. IEEE Geosci. Remote Sens. Lett. 2021, 19, 5511505. [Google Scholar]
- Dong, H.; Zhang, X.; Zhang, J.; Meng, H.; Jiao, L. Graph-based Adaptive Network with Spatial-Spectral Features for Hyperspectral Unmixing. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2025, 18, 12865–12881. [Google Scholar] [CrossRef]
- Zhao, X.; Ma, J.; Wang, L.; Zhang, Z.; Ding, Y.; Xiao, X. A review of hyperspectral image classification based on graph neural networks. Artif. Intell. Rev. 2025, 58, 172. [Google Scholar] [CrossRef]
- Ghosh, P.; Roy, S.K.; Koirala, B.; Rasti, B.; Scheunders, P. Hyperspectral unmixing using transformer network. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5535116. [Google Scholar] [CrossRef]
- Duan, Y.; Xu, X.; Li, T.; Pan, B.; Shi, Z. UnDAT: Double-aware transformer for hyperspectral unmixing. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5522012. [Google Scholar] [CrossRef]
- Jin, Z.; Yi, X.; Liu, Y.; Zhang, H. Multilinear hyperspectral unmixing based on autoencoder and recurrent neural network. Appl. Soft Comput. 2025, 185, 113972. [Google Scholar] [CrossRef]
- Alshahrani, A.A.; Bchir, O.; Ben Ismail, M.M. Autoencoder-Based Hyperspectral Unmixing with Simultaneous Number-of-Endmembers Estimation. Sensors 2025, 25, 2592. [Google Scholar] [CrossRef]
- Wan, L.; Chen, T.; Plaza, A.; Cai, H. Hyperspectral unmixing based on spectral and sparse deep convolutional neural networks. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 11669–11682. [Google Scholar] [CrossRef]
- Yu, Y.; Ma, Y.; Mei, X.; Fan, F.; Huang, J.; Li, H. Multi-stage convolutional autoencoder network for hyperspectral unmixing. Int. J. Appl. Earth Obs. Geoinf. 2022, 113, 102981. [Google Scholar] [CrossRef]
- Wang, J.; Xu, J.; Chong, Q.; Liu, Z.; Yan, W.; Xing, H.; Xing, Q.; Ni, M. SSANet: An adaptive spectral–spatial attention autoencoder network for hyperspectral unmixing. Remote Sens. 2023, 15, 2070. [Google Scholar] [CrossRef]
- Qu, Y.; Qi, H. uDAS: An untied denoising autoencoder with sparsity for spectral unmixing. IEEE Trans. Geosci. Remote Sens. 2018, 57, 1698–1712. [Google Scholar] [CrossRef]
- Li, J.; Zheng, K.; Li, Z.; Gao, L.; Jia, X. X-Shaped Interactive Autoencoders with Cross-Modality Mutual Learning for Unsupervised Hyperspectral Image Super-Resolution. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5518317. [Google Scholar] [CrossRef]
- Yu, X.C.; Xiong, L.P.; Xu, J.D. Mineral mapping based on secondary scattering mixture model. Remote Sens. Land Resour. 2014, 26, 60–68. [Google Scholar]
- Ozkan, S.; Kaya, B.; Akar, G.B. Endnet: Sparse autoencoder network for endmember extraction and hyperspectral unmixing. IEEE Trans. Geosci. Remote Sens. 2018, 57, 482–496. [Google Scholar] [CrossRef]
- Su, Y.; Marinoni, A.; Li, J.; Plaza, J.; Gamba, P. Stacked nonnegative sparse autoencoders for robust hyperspectral unmixing. IEEE Geosci. Remote Sens. Lett. 2018, 15, 1427–1431. [Google Scholar] [CrossRef]
- Bedini, E. The use of hyperspectral remote sensing for mineral exploration: A review. J. Hyperspectral Remote Sens. 2017, 7, 189–211. [Google Scholar] [CrossRef]
- Lin, H.L.; Zhang, X.; Sun, Y.L. Hyperspectral sparse unmixing of minerals with single scattering albedo. J. Remote Sens. 2016, 20, 53–61. [Google Scholar] [CrossRef]
- Peyghambari, S.; Zhang, Y. Hyperspectral remote sensing in lithological mapping, mineral exploration, and environmental geology: An updated review. J. Appl. Remote Sens. 2021, 15, 031501. [Google Scholar] [CrossRef]
- Attallah, Y.; Zigh, E.; Adda, A.P. Optimized 3D-2D CNN for automatic mineral classification in hyperspectral images. Rep. Geod. Geoinformatics 2024, 118, 82–91. [Google Scholar] [CrossRef]
- Zhang, X.; Wu, X.; Lin, H.; Wang, N. Retrieval of mineral abundances of delta region in Eberswalde, Mars. Natl. Remote Sens. Bull. 2021, 22, 304–312. [Google Scholar]
- Zhao, P.; Chen, Y. Digital geology and quantitative mineral exploration. Earth Sci. Front. 2021, 28, 1. [Google Scholar]
- Cheng, Q.; Gao, M. Comparative Studies of Nonlinear Models and Their Applications to Magmatic Evolution and Crustal Growth of the Huai’an Terrane in the North China Craton. Fractal Fract. 2025, 9, 38. [Google Scholar] [CrossRef]
- Li, C.; Zhou, K.; Gao, W.; Luo, X.; Tao, Z.; Liu, P.; Qiu, W. Geochemical prospectivity mapping using compositional balance analysis and multifractal modeling: A case study in the Jinshuikou area, Qinghai, China. J. Geochem. Explor. 2024, 257, 107361. [Google Scholar] [CrossRef]
- Zuo, R.; Yang, F.; Cheng, Q.; Kreuzer, O.P. A novel data-knowledge dual-driven model coupling artificial intelligence with a mineral systems approach for mineral prospectivity mapping. Geology 2025, 53, 284–288. [Google Scholar] [CrossRef]
- Heylen, R.; Scheunders, P. A multilinear mixing model for nonlinear spectral unmixing. IEEE Trans. Geosci. Remote Sens. 2015, 54, 240–251. [Google Scholar] [CrossRef]
- Altmann, Y.; Dobigeon, N.; Tourneret, J.Y. Unsupervised post-nonlinear unmixing of hyperspectral images using a Hamiltonian Monte Carlo algorithm. IEEE Trans. Image Process. 2014, 23, 2663–2675. [Google Scholar] [CrossRef]
- Hinton, G.E.; Salakhutdinov, R.R. Reducing the dimensionality of data with neural networks. Science 2006, 313, 504–507. [Google Scholar] [CrossRef]
- Woo, S.; Park, J.; Lee, J.Y.; Kweon, I.S. Cbam: Convolutional block attention module. In Proceedings of the European conference on computer vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 3–19. [Google Scholar]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE conference on computer vision and pattern recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [Google Scholar]
- Zeng, Y.; Ritz, C.; Zhao, J.; Lan, J. Attention-based residual network with scattering transform features for hyperspectral unmixing with limited training samples. Remote Sens. 2020, 12, 400. [Google Scholar] [CrossRef]
- Yang, X.; Chen, J.; Wang, C.; Chen, Z. Residual dense autoencoder network for nonlinear hyperspectral unmixing. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 5580–5595. [Google Scholar] [CrossRef]
- Haykin, S. Neural Networks and Learning Machines, 3rd ed.; Pearson Education: London, UK, 2009. [Google Scholar]
- Kruse, F.A.; Lefkoff, A.B.; Boardman, J.W.; Heidebrecht, K.B.; Shapiro, A.T.; Barloon, P.J.; Goetz, A.F.H. The spectral image processing system (SIPS)—Interactive visualization and analysis of imaging spectrometer data. Remote Sens. Environ. 1993, 44, 145–163. [Google Scholar] [CrossRef]
- Palsson, B.; Sigurdsson, J.; Sveinsson, J.R.; Ulfarsson, M.O. Hyperspectral Unmixing Using a Neural Network Autoencoder. IEEE Access 2018, 6, 25646–25656. [Google Scholar] [CrossRef]
- Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef]
- Sigurdsson, J.; Ulfarsson, M.O.; Sveinsson, J.R. Semi-supervised hyperspectral unmixing. In Proceedings of the 2014 IEEE Geoscience and Remote Sensing Symposium, Quebec City, QC, Canada, 13–18 July 2014; IEEE: Piscataway, NJ, USA, 2014. [Google Scholar]
- Jia, S.; Qian, Y. Constrained nonnegative matrix factorization for hyperspectral unmixing. IEEE Trans. Geosci. Remote Sens. 2008, 47, 161–173. [Google Scholar] [CrossRef]
- Nus, L.; Miron, S.; Brie, D. Estimation of the regularization parameter of an on-line NMF with minimum volume constraint. In Proceedings of the 2018 IEEE 10th Sensor Array and Multichannel Signal Processing Workshop (SAM), Sheffield, UK, 8–11 July 2018; IEEE: Piscataway, NJ, USA, 2018. [Google Scholar]
- Heinz, D.C.; Chang, C.I. Fully constrained least squares linear spectral mixture analysis method for material quantification in hyperspectral imagery. IEEE Trans. Geosci. Remote Sens. 2001, 39, 529–545. [Google Scholar] [CrossRef]
- Chen, J.; Richard, C.; Honeine, P. A novel kernel-based nonlinear unmixing scheme of hyperspectral images. In Proceedings of the 2011 Conference Record of the Forty Fifth Asilomar Conference on Signals, Systems and Computers (ASILOMAR), Pacific Grove, CA, USA, 6–9 November 2011; IEEE: Piscataway, NJ, USA, 2011; pp. 1898–1902. [Google Scholar]
- Cronjaeger, C.; Pattison, R.C.; Tsay, C. Tensor-Based Autoencoder Models for Hyperspectral Produce Data. In Computer Aided Chemical Engineering; Elsevier: Amsterdam, The Netherlands, 2022; Volume 49, pp. 1585–1590. [Google Scholar]
- Ji, D.J.; Park, J.; Cho, D.H. ConvAE: A new channel autoencoder based on convolutional layers and residual connections. IEEE Commun. Lett. 2019, 23, 1769–1772. [Google Scholar] [CrossRef]
- Wang, M.; Zhao, M.; Chen, J.; Rahardja, S. Nonlinear unmixing of hyperspectral data via deep autoencoder networks. IEEE Geosci. Remote Sens. Lett. 2019, 16, 1467–1471. [Google Scholar] [CrossRef]
- Zhang, J.; Zhang, X.; Meng, H.; Sun, C.; Wang, L.; Cao, X. Nonlinear unmixing via deep autoencoder networks for generalized bilinear model. Remote Sens. 2022, 14, 5167. [Google Scholar] [CrossRef]
- Hastie, T.; Tibshirani, R.; Friedman, J. The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd ed.; Springer: New York, NY, USA, 2009. [Google Scholar]
- Dennison, P.E.; Halligan, K.Q.; Roberts, D.A. A comparison of error metrics and constraints for multiple endmember spectral mixture analysis and spectral angle mapper. Remote Sens. Environ. 2004, 93, 359–367. [Google Scholar] [CrossRef]
- Chang, C.-I. An information-theoretic approach to spectral variability, similarity, and discrimination for hyperspectral image analysis. IEEE Trans. Inf. Theory 2000, 46, 1927–1932. [Google Scholar] [CrossRef]
- Netravali, A.N. Digital Pictures: Representation, Compression, and Standards; Springer: Berlin/Heidelberg, Germany, 2013. [Google Scholar]
- Tuia, D.; Persello, C.; Bruzzone, L. Domain Adaptation for the Classification of Remote Sensing Data: An Overview of Recent Advances. IEEE Geosci. Remote Sens. Mag. 2016, 4, 41–57. [Google Scholar] [CrossRef]
- Baghbaderani, R.K.; Qu, Y.; Qi, H. Unsupervised Hyperspectral Image Domain Adaptation through Unmixing-Based Domain Alignment. In Proceedings of the IGARSS 2023–2023 IEEE International Geoscience and Remote Sensing Symposium, Pasadena, CA, USA, 16–21 July 2023; pp. 5906–5909. [Google Scholar]
| Index | Layers | Activation Function | Units |
|---|---|---|---|
| 1 | Input layer | - | 128 |
| 2 | Activation layer | LeakyReLU (0.1) | 128 |
| 3 | Channel attention module | ReLU & Sigmoid | 128 |
| 4 | Hidden layer (Residual block) | ReLU | 128 |
| 5 | Hidden layer | - | 64 |
| 6 | Activation layer | LeakyReLU (0.1) | 64 |
| 7 | Hidden layer | - | 32 |
| 8 | Activation layer | LeakyReLU (0.1) | 32 |
| 9 | Output layer (Abundance estimation) | - | 5 |
| Index | Layers | Activation Function | Units |
|---|---|---|---|
| 10 | Linear reconstruction layer (Endmember combination) | - | 162 |
| 11 | Hidden layer (Nonlinear compensation) | - | 512 |
| 12 | Activation layer | LeakyReLU (0.1) | 512 |
| 13 | Hidden layer | - | 162 |
| 14 | Output layer (Nonlinear restoration) | ReLU | 162 |
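Reading the two architecture tables together, one plausible end-to-end wiring for 162 bands and 5 endmembers is sketched below in PyTorch. The attention gate internals, the residual wiring, the softmax abundance constraint, and the additive nonlinear-compensation skip are assumptions inferred from the layer list, not the authors' released code.

```python
import torch
import torch.nn as nn

class CResDAESketch(nn.Module):
    """Layer widths follow the two tables above (162 input bands, 5 endmembers).
    The attention gate internals, residual wiring, softmax abundance constraint,
    and additive nonlinear-compensation skip in the decoder are assumptions."""
    def __init__(self, n_bands: int = 162, n_endmembers: int = 5):
        super().__init__()
        # Encoder (table rows 1-9)
        self.stem = nn.Sequential(nn.Linear(n_bands, 128), nn.LeakyReLU(0.1))
        self.attention = nn.Sequential(                      # ReLU & Sigmoid gate
            nn.Linear(128, 32), nn.ReLU(), nn.Linear(32, 128), nn.Sigmoid()
        )
        self.res_body = nn.Sequential(nn.Linear(128, 128), nn.ReLU(), nn.Linear(128, 128))
        self.head = nn.Sequential(
            nn.Linear(128, 64), nn.LeakyReLU(0.1),
            nn.Linear(64, 32), nn.LeakyReLU(0.1),
            nn.Linear(32, n_endmembers), nn.Softmax(dim=-1),  # abundance estimation
        )
        # Decoder (table rows 10-14)
        self.linear_mix = nn.Linear(n_endmembers, n_bands, bias=False)  # endmember combination
        self.compensation = nn.Sequential(
            nn.Linear(n_bands, 512), nn.LeakyReLU(0.1), nn.Linear(512, n_bands)
        )

    def forward(self, x: torch.Tensor):
        h = self.stem(x)
        h = h * self.attention(h)                 # channel attention re-weighting
        h = torch.relu(h + self.res_body(h))      # residual block
        a = self.head(h)                          # abundances
        linear_part = self.linear_mix(a)          # linear reconstruction
        x_hat = torch.relu(linear_part + self.compensation(linear_part))  # nonlinear restoration
        return x_hat, a

# Example: one batch of 16 pixel spectra with 162 bands
# model = CResDAESketch()
# x_hat, abundances = model(torch.rand(16, 162))
```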
| Method | Linear (SNR = 10) | Bilinear (SNR = 10) | PNMM (SNR = 10) | Linear (SNR = 20) | Bilinear (SNR = 20) | PNMM (SNR = 20) | Linear (SNR = 30) | Bilinear (SNR = 30) | PNMM (SNR = 30) |
|---|---|---|---|---|---|---|---|---|---|
| MVC + NMF | 0.4347 | 0.4297 | 0.4302 | 0.4426 | 0.4471 | 0.4456 | 0.4431 | 0.4497 | 0.3883 |
| VCA+K-Hype | 0.4051 | 0.4088 | 0.4058 | 0.4142 | 0.4217 | 0.4159 | 0.4194 | 0.4187 | 0.3542 |
| LinearAE | 0.3712 | 0.3317 | 0.3590 | 0.3385 | 0.3494 | 0.3777 | 0.3495 | 0.3566 | 0.3074 |
| ConvAE | 0.3800 | 0.3746 | 0.4605 | 0.3914 | 0.4250 | 0.3776 | 0.3761 | 0.3974 | 0.3309 |
| NAE | 0.1819 | 0.2049 | 0.1286 | 0.1664 | 0.1707 | 0.1912 | 0.1436 | 0.1300 | 0.1643 |
| CResDAE | 0.1658 | 0.1741 | 0.1737 | 0.1566 | 0.1660 | 0.1332 | 0.1402 | 0.1645 | 0.1408 |
| Method | Linear (SNR = 10) | Bilinear (SNR = 10) | PNMM (SNR = 10) | Linear (SNR = 20) | Bilinear (SNR = 20) | PNMM (SNR = 20) | Linear (SNR = 30) | Bilinear (SNR = 30) | PNMM (SNR = 30) |
|---|---|---|---|---|---|---|---|---|---|
| MVC + NMF | 1.0161 | 1.0125 | 1.0147 | 1.0252 | 1.0242 | 1.0264 | 1.0312 | 1.0362 | 0.9209 |
| VCA+K-Hype | 1.1624 | 1.1757 | 1.1721 | 1.1884 | 1.1996 | 1.1889 | 1.2050 | 1.1770 | 1.0877 |
| LinearAE | 1.1355 | 0.9767 | 1.1080 | 1.0007 | 1.0534 | 1.1462 | 1.0591 | 1.0671 | 0.9697 |
| ConvAE | 1.1397 | 1.1262 | 1.2382 | 1.1453 | 1.1856 | 1.1269 | 1.1236 | 1.1664 | 1.0459 |
| NAE | 0.2808 | 0.4291 | 0.2524 | 0.3332 | 0.3351 | 0.3995 | 0.2874 | 0.2472 | 0.3892 |
| CResDAE | 0.2696 | 0.2909 | 0.2879 | 0.2573 | 0.2762 | 0.2021 | 0.2894 | 0.2740 | 0.2856 |
| Method | Linear (SNR = 10) | Bilinear (SNR = 10) | PNMM (SNR = 10) | Linear (SNR = 20) | Bilinear (SNR = 20) | PNMM (SNR = 20) | Linear (SNR = 30) | Bilinear (SNR = 30) | PNMM (SNR = 30) |
|---|---|---|---|---|---|---|---|---|---|
| MVC + NMF | 5.9117 | 5.8382 | 5.8822 | 6.0518 | 6.0218 | 6.0849 | 6.1661 | 6.2834 | 4.0380 |
| VCA+K-Hype | 6.4039 | 6.9592 | 6.4769 | 6.5783 | 7.6265 | 7.1941 | 7.3115 | 7.0767 | 3.7939 |
| LinearAE | 6.0235 | 4.2305 | 5.6046 | 4.5542 | 4.9663 | 6.0869 | 5.2693 | 5.4662 | 3.3157 |
| ConvAE | 5.4805 | 6.4422 | 6.3845 | 5.7064 | 6.8234 | 5.2746 | 4.3741 | 5.6570 | 3.0629 |
| NAE | 3.6937 | 5.1555 | 3.3955 | 3.9539 | 4.2868 | 4.6669 | 3.6184 | 3.3508 | 3.7517 |
| CResDAE | 3.4522 | 3.7031 | 3.5962 | 3.0901 | 3.4293 | 2.3115 | 3.6076 | 3.3468 | 3.0264 |
| Mixing Type | SNR | RMSE | RMSE1 | RMSE2 | RMSE3 | RMSE4 | RMSE5 | SAD | SID | PSNR |
|---|---|---|---|---|---|---|---|---|---|---|
| Linear | 10 | 0.2855 | 0.3576 | 0.3562 | 0.2666 | 0.1653 | 0.2330 | 0.8010 | 7.9020 | 10.8886 |
| Linear | 20 | 0.2682 | 0.3372 | 0.3318 | 0.2534 | 0.1411 | 0.2276 | 0.7476 | 7.3013 | 11.4301 |
| Linear | 30 | 0.2498 | 0.3511 | 0.2429 | 0.2407 | 0.1528 | 0.2198 | 0.6636 | 6.6191 | 12.0497 |
| Bilinear | 10 | 0.3113 | 0.3997 | 0.4076 | 0.2797 | 0.1600 | 0.2341 | 0.8912 | 8.6397 | 10.1365 |
| Bilinear | 20 | 0.2716 | 0.3536 | 0.2999 | 0.2782 | 0.1500 | 0.2320 | 0.7526 | 7.4973 | 11.3228 |
| Bilinear | 30 | 0.2614 | 0.3564 | 0.2860 | 0.2410 | 0.1526 | 0.2267 | 0.7131 | 7.0073 | 11.6550 |
| PNMM | 10 | 0.3212 | 0.3924 | 0.4466 | 0.2867 | 0.1589 | 0.2341 | 0.9190 | 9.0545 | 9.8654 |
| PNMM | 20 | 0.3018 | 0.3801 | 0.4026 | 0.2776 | 0.1437 | 0.2263 | 0.8536 | 8.3068 | 10.4048 |
| PNMM | 30 | 0.3085 | 0.3848 | 0.4065 | 0.2875 | 0.1638 | 0.2305 | 0.8759 | 8.5839 | 10.2144 |
| Mean | | 0.2866 | 0.3681 | 0.3533 | 0.2679 | 0.1543 | 0.2293 | 0.8020 | 7.8791 | 10.8853 |
| Mixing Type | SNR | RMSE | RMSE1 | RMSE2 | RMSE3 | RMSE4 | RMSE5 | SAD | SID | PSNR |
|---|---|---|---|---|---|---|---|---|---|---|
| Linear | 10 | 0.2184 | 0.1828 | 0.2811 | 0.2390 | 0.1062 | 0.2404 | 0.4945 | 4.3943 | 13.2131 |
| Linear | 20 | 0.1934 | 0.2504 | 0.1963 | 0.1429 | 0.1051 | 0.2329 | 0.4660 | 4.5356 | 14.2722 |
| Linear | 30 | 0.1794 | 0.1984 | 0.1937 | 0.1316 | 0.1155 | 0.2311 | 0.4357 | 4.1221 | 14.9229 |
| Bilinear | 10 | 0.2431 | 0.2051 | 0.3153 | 0.2865 | 0.1119 | 0.2437 | 0.5696 | 5.2564 | 12.2843 |
| Bilinear | 20 | 0.1932 | 0.1933 | 0.2083 | 0.1904 | 0.1084 | 0.2407 | 0.4646 | 4.0251 | 14.2784 |
| Bilinear | 30 | 0.1940 | 0.2389 | 0.1980 | 0.1445 | 0.1209 | 0.2374 | 0.4853 | 4.9833 | 14.2449 |
| PNMM | 10 | 0.2259 | 0.1769 | 0.2983 | 0.2564 | 0.1140 | 0.2368 | 0.5329 | 4.6277 | 12.9226 |
| PNMM | 20 | 0.2161 | 0.2037 | 0.2965 | 0.1966 | 0.1025 | 0.2346 | 0.5010 | 4.8449 | 13.3055 |
| PNMM | 30 | 0.2196 | 0.2485 | 0.2717 | 0.1990 | 0.1146 | 0.2298 | 0.5691 | 5.1954 | 13.1674 |
| Mean | | 0.2092 | 0.2109 | 0.2510 | 0.1985 | 0.1110 | 0.2364 | 0.5021 | 4.6650 | 13.6235 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).