# Neighboring Discriminant Component Analysis for Asteroid Spectrum Classification


## Abstract


## 1. Introduction

- (1) Instead of empirically defining asteroid class boundaries via the presence or absence of specific spectral features, this paper presents a novel supervised Neighboring Discriminant Component Analysis (NDCA) model for discriminative asteroid spectral feature learning. NDCA simultaneously maximizes the neighboring between-class scatter and the data variance while minimizing the neighboring within-class scatter, which alleviates the overfitting caused by outliers and enhances the discrimination and generalization ability of the model.
- (2) With the neighboring discriminative learning strategy, the proposed NDCA model is more robust to abnormal samples and outliers, and its generalization performance is thus improved. In addition, the NDCA model transforms the data from the observation space into a more separable subspace, in which the key category-related knowledge of the different asteroid classes is discovered and preserved through neighboring structure preservation and label prior guidance.
- (3) The performance of the proposed NDCA model is verified on a real-world asteroid dataset covering spectral wavelengths from 0.45 to 2.45 μm in combination with different baseline classifier models, including the nearest neighbor (NN), support vector machine (SVM) and extreme learning machine (ELM). The best result is achieved by ELM, with a classification accuracy of about 95.19%.
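To make the overall pipeline concrete, the following minimal sketch reduces toy spectral vectors to a low-dimensional subspace and classifies them with the 1-NN baseline. The data, the 40-band curves and the PCA-style stand-in projection are all illustrative assumptions, not the paper's spectra or the NDCA projection itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in "spectra" (hypothetical, not the paper's 0.45-2.45 um data):
# two classes of 40-band curves with opposite slopes plus noise.
X0 = rng.normal(0.0, 0.1, (30, 40)) + np.linspace(0.0, 1.0, 40)
X1 = rng.normal(0.0, 0.1, (30, 40)) + np.linspace(1.0, 0.0, 40)
X = np.vstack([X0, X1])
y = np.array([0] * 30 + [1] * 30)

def nn_classify(X_train, y_train, X_test):
    """1-NN baseline, one of the three classifiers used in the paper."""
    d = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
    return y_train[np.argmin(d, axis=1)]

# Project to a low-dimensional subspace first (here the top PCA directions as
# a generic stand-in for a learned projection matrix P), then classify.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:2].T                     # D x d projection matrix
Z = Xc @ P                       # low-dimensional features
pred = nn_classify(Z, y, Z)
print((pred == y).mean())        # 1-NN evaluated on its own training set: 1.0
```

The projection step is where NDCA replaces the PCA stand-in; the classifier stage is unchanged.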

## 2. Related Work

#### 2.1. Notations Used in This Paper

#### 2.2. Low-Dimensional Feature Learning for Spectral Data

#### 2.3. Classifier Models for Spectral Data Classification

## 3. The Proposed Neighboring Discriminant Component Analysis Model: Formulation and Optimization

Firstly, $N_b = [N_{b1}, N_{b2}, \ldots, N_{bc}, \ldots, N_{bC}]$ global neighboring samples ${X}_{b}$ can be obtained, with $N_{bc}$ as the number of the neighboring samples in the c-th class for computing the neighboring between-class scatter matrix. Secondly, compute the local centroid ${m}_{bc}=\left(1/{N}_{bc}\right)\times {\sum}_{j=1}^{{N}_{bc}}{x}_{bcj}$ for the c-th class, where ${x}_{bcj}$ is the j-th sample in the c-th class of the neighboring samples ${X}_{b}$. Finally, the neighboring between-class scatter matrix is calculated as follows.
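The between-class step above can be sketched in code. The selection rule used here (keeping, per class, the fraction of samples closest to the global centroid) is an assumption for illustration; the paper's exact neighboring rule may differ.

```python
import numpy as np

def neighboring_between_scatter(X, y, R_b=0.8):
    """Hedged sketch of the neighboring between-class scatter S_Nb.

    Assumption: for each class c, the fraction R_b of its samples nearest the
    global centroid is kept as the neighboring subset, its local centroid
    m_bc is computed, and the weighted centroid deviations are accumulated.
    """
    m = X.mean(axis=0)                      # global centroid
    D = X.shape[1]
    S_Nb = np.zeros((D, D))
    for c in np.unique(y):
        Xc = X[y == c]
        n_keep = max(1, int(np.ceil(R_b * len(Xc))))   # N_bc neighbors kept
        order = np.argsort(np.linalg.norm(Xc - m, axis=1))
        Xbc = Xc[order[:n_keep]]            # neighboring samples of class c
        m_bc = Xbc.mean(axis=0)             # local centroid m_bc
        d = (m_bc - m)[:, None]
        S_Nb += n_keep * (d @ d.T)          # weighted outer product
    return S_Nb

# toy usage: two well-separated 2-D classes
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],
              [5.0, 5.0], [6.0, 5.0], [5.0, 6.0]])
y = np.array([0, 0, 0, 1, 1, 1])
S_Nb = neighboring_between_scatter(X, y, R_b=0.8)
print(S_Nb.shape)  # (2, 2), symmetric
```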

Firstly, select the $N_{wc} = R_w \cdot N_c$ neighboring samples closest to ${m}_{wc}$ by using the within-class neighboring ratio $R_w$ ($0 < R_w < 1$) in the c-th class. Secondly, refine the local centroid of each class using the samples in the obtained neighboring group of samples ${X}_{wc}$. Finally, compute the neighboring within-class scatter matrix as follows:
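The within-class step admits a similarly compact sketch; the refinement of the centroid on the retained subset follows the text, while the distance-based selection is an assumed detail.

```python
import numpy as np

def neighboring_within_scatter(X, y, R_w=0.8):
    """Hedged sketch of the neighboring within-class scatter S_Nw.

    For class c, the N_wc = R_w * N_c samples nearest the class centroid are
    kept, the centroid m_wc is refined on that subset, and the scatter about
    the refined centroid is accumulated (selection details are assumptions).
    """
    D = X.shape[1]
    S_Nw = np.zeros((D, D))
    for c in np.unique(y):
        Xc = X[y == c]
        m_wc = Xc.mean(axis=0)                        # initial class centroid
        n_keep = max(1, int(np.ceil(R_w * len(Xc))))  # N_wc = R_w * N_c
        order = np.argsort(np.linalg.norm(Xc - m_wc, axis=1))
        Xwc = Xc[order[:n_keep]]                      # neighboring subset X_wc
        m_wc = Xwc.mean(axis=0)                       # refined local centroid
        Z = Xwc - m_wc
        S_Nw += Z.T @ Z                               # within-class scatter
    return S_Nw

# toy usage mirroring the between-class example
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],
              [5.0, 5.0], [6.0, 5.0], [5.0, 6.0]])
y = np.array([0, 0, 0, 1, 1, 1])
S_Nw = neighboring_within_scatter(X, y, R_w=0.8)
print(S_Nw)
```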

By solving for the optimal projection matrix **P**, the goals of neighboring between-class scatter maximization, within-class scatter minimization and neighboring principal component preservation can be achieved simultaneously. Accordingly, the side effects of outliers and noisy samples will be suppressed to the largest extent, and the global and local neighboring discriminative structures and principal components will be enhanced and preserved by the neighboring learning mechanism. Furthermore, optimization problem (13) can be transformed into the following one by introducing an equality constraint [49]:

To solve this problem, the derivative with respect to **P** is calculated and set to zero, resulting in the following equations:

The optimal projection matrix **P** then consists of the eigenvectors corresponding to the $d$ largest eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_d$ of the eigenvalue decomposition problem as described below.
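A minimal numerical sketch of this eigen-solution follows. The particular weighting of the scatter terms (`mu`, `gamma`) is an assumed form of the objective; the paper's Equation (13)/(14) may combine the terms differently.

```python
import numpy as np

def ndca_projection(S_Nb, S_t, S_Nw, d, mu=1.0, gamma=1e-3):
    """Hedged sketch: recover P from a generalized eigenproblem.

    Assumption: maximize Tr(P^T (S_Nb + mu*S_t) P) against a regularized
    within-class term, i.e. solve (S_Nw + gamma*I)^(-1)(S_Nb + mu*S_t) p =
    lambda p and keep the eigenvectors of the d largest eigenvalues.
    """
    A = S_Nb + mu * S_t                      # terms to maximize
    B = S_Nw + gamma * np.eye(S_Nw.shape[0]) # regularized term to constrain
    vals, vecs = np.linalg.eig(np.linalg.solve(B, A))
    order = np.argsort(-vals.real)           # d largest eigenvalues first
    return vecs[:, order[:d]].real           # columns span the subspace

# toy usage with a random symmetric PSD stand-in scatter matrix
rng = np.random.default_rng(0)
M = rng.normal(size=(6, 6))
S = M @ M.T
P = ndca_projection(S, S, S, d=2)
print(P.shape)  # (6, 2)
```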

In the testing phase, samples are projected into the learned subspace by **P** and then classified by the trained classifier model.

## 4. Experiments

#### 4.1. Preprocessing for the Asteroid Spectral Data

#### 4.2. Experimental Setup and Results

#### 4.3. Analysis for NDCA Parameters

The parameters γ and μ were selected from the candidate set {10^{g}, g = −4, −3, −2, −1, 0, 1, 2, 3, 4}, while $R_w$ and $R_b$ were selected from the candidate set {0.5, 0.55, 0.6, 0.65, 0.7, 0.75, 0.8, 0.85, 0.9, 0.95, 1}. As shown in Figure 11, Figure 12 and Figure 13, the average classification performance surfaces in sub-figures (a) of Figure 11, Figure 12 and Figure 13 are smoother and more stable within a wide parameter setting range, which means that the classification is not very sensitive to the settings of the parameter pair (γ, μ). By contrast, the classification performance changes more acutely with variations of the parameter pair ($R_w$, $R_b$).
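A parameter study of this shape amounts to an exhaustive grid search; the sketch below mirrors the candidate sets above. The `evaluate` function is a pure placeholder (in practice it would run NDCA plus a classifier under cross-validation), so its optimum is fabricated for illustration.

```python
import itertools

# candidate sets mirroring the paper's parameter study
gammas = mus = [10.0 ** g for g in range(-4, 5)]
ratios = [0.5 + 0.05 * k for k in range(10)] + [1.0]

def evaluate(gamma, mu, R_w, R_b):
    """Placeholder score; a real run would return cross-validated accuracy
    of NDCA(gamma, mu, R_w, R_b) plus a classifier. The optimum here is
    synthetic, chosen only so the search has something to find."""
    return -abs(gamma - 1.0) - abs(mu - 1.0) - abs(R_w - 0.8) - abs(R_b - 0.8)

# exhaustive grid search over all (gamma, mu, R_w, R_b) combinations
best = max(itertools.product(gammas, mus, ratios, ratios),
           key=lambda p: evaluate(*p))
print(best)
```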

#### 4.4. Analysis for ELM Classifier Parameters

The classification performance first improves as α increases from 10^{−5} to 10 and then degrades as α increases from 10 to 10^{5}. In the experiments, α can therefore be set to around 10, by which promising performance can be expected.
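The role of α can be seen in a compact ELM sketch: random hidden weights are fixed, and α regularizes the ridge solution for the output weights, β = (HᵀH + I/α)⁻¹HᵀT, following the regularized ELM of Huang et al. The toy data and network size here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, T, L=50, alpha=10.0):
    """ELM sketch: random hidden layer + ridge-regularized output weights.

    alpha plays the role of the balance parameter discussed in the text;
    larger alpha means weaker regularization of beta.
    """
    W = rng.normal(size=(X.shape[1], L))   # random input weights (never trained)
    b = rng.normal(size=L)                 # random biases
    H = np.tanh(X @ W + b)                 # hidden-layer output matrix
    beta = np.linalg.solve(H.T @ H + np.eye(L) / alpha, H.T @ T)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# toy 2-class problem with one-hot targets (illustrative, not asteroid data)
X = rng.normal(size=(40, 8))
y = (X[:, 0] > 0).astype(int)
T = np.eye(2)[y]
W, b, beta = elm_train(X, T, L=50, alpha=10.0)
pred = elm_predict(X, W, b, beta).argmax(axis=1)
print((pred == y).mean())
```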

- (1) The benefits of feature learning for asteroid spectrum classification. In the experiments shown in Table 6, Table 7 and Table 8, the original raw spectral data without feature learning were directly fed into the classifier models, i.e., NN, SVM and ELM. The average classification accuracies achieved by NN, SVM and ELM were 89.2085%, 91.7047% and 92.6963%, respectively, which were generally the worst among all the compared methods. In contrast, the same classifiers after feature learning obtained clear improvements; for example, LPP plus NN, SVM and ELM achieved accuracies of 89.7565%, 92.8158% and 94.4711%, respectively. These results verify the benefit of feature learning for improving asteroid spectral classification accuracy.
- (2) The advantages of the proposed NDCA model. Compared with several representative low-dimensional feature learning methods, the proposed NDCA model generally achieves better classification performance when combined with different classifiers. Specifically, NDCA plus NN, SVM and ELM achieve the highest classification accuracies of 94.1971%, 93.6377% and 95.1895%, respectively. The improvements are mainly due to two aspects. First, NDCA is a supervised dimension reduction method that inherits the merits of existing methods and fully utilizes label knowledge to find the key category-related information of the spectral data for discriminative feature learning and classification. Second, the neighboring learning methodology significantly reduces the side effects of outliers and noisy samples, thereby alleviating overfitting; this enhances the robustness of the learnt low-dimensional features and improves the generalization ability and classification performance of the model in testing.
- (3) The superiority of ELM. Three baseline classifier models, NN, SVM and ELM, were used in the experiments. The best results are obtained by NDCA plus ELM, with a classification accuracy of about 95.19%, generally superior to the competing classifiers. To the best of our knowledge, this work is the first attempt to apply ELM to asteroid spectrum classification, and the very competitive performance achieved can provide new application scenarios and perspectives for the ELM community.
- (4) Future work. First, future work will consider feature selection methods for studying asteroid spectral characteristics. Distinct from feature learning/extraction methods, which transform the data, feature/band selection methods aim to automatically select a small subset of representative spectral bands in order to remove spectral redundancy while preserving the significant spectral knowledge. Since selection is performed in the original observation space, the selected bands have clearer physical meanings and better interpretability; feature/band selection is thus an important technique for spectral dimensionality reduction with room for further improvement. Second, the visualization in Figure 10 of the first two components acquired by different methods shows that some classes of asteroid spectra with limited training samples are seriously mixed and overlapped. One possible reason is that the numbers of training samples from different classes are unbalanced; for example, the ‘S’ class has 199 samples, while the ‘A’ class has only six. When classifying data with a complex class distribution, a regular learning algorithm naturally favors the majority class by assuming a balanced class distribution or equal misclassification costs. The sample imbalance therefore introduces learning bias and restricts the generalization ability of the obtained model. Establishing a balanced data distribution by sampling or algorithmic methods in future work can alleviate the imbalanced class distribution problem and improve the accuracy of asteroid spectral data analysis.
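One simple remedy for the imbalance discussed above is random oversampling of minority classes, sketched below. This is a generic illustration of the idea, not a method from the paper; the toy counts mirror the ‘S’ (199) versus ‘A’ (6) disparity.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_oversample(X, y):
    """Randomly oversample minority classes (with replacement) until every
    class matches the majority count. A baseline imbalance remedy, used here
    only to illustrate the future-work discussion."""
    classes, counts = np.unique(y, return_counts=True)
    n_max = counts.max()
    idx = []
    for c, n in zip(classes, counts):
        members = np.flatnonzero(y == c)
        idx.append(members)
        if n < n_max:
            # draw extra minority indices with replacement
            idx.append(rng.choice(members, n_max - n, replace=True))
    idx = np.concatenate(idx)
    return X[idx], y[idx]

# toy imbalance mirroring the 'S' vs 'A' class counts
X = rng.normal(size=(205, 5))
y = np.array([0] * 199 + [1] * 6)
Xb, yb = random_oversample(X, y)
print(np.bincount(yb))  # both classes now have 199 samples
```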

## 5. Conclusions

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## Appendix A

- 1. Deriving $Tr\left({P}^{T}{S}_{Nb}P\right)$ from Equation (10). Project the neighboring samples into the feature space via the projection matrix **P**. The neighboring between-class scatter matrix in feature space is described below.
- 2. Deriving ${P}^{T}{X}_{b}{X}_{b}^{T}P$ from Equation (11). Project the neighboring samples via **P**, i.e., ${\overline{x}}_{bi}={P}^{T}{x}_{bi}$. Following the idea of PCA, the variance of the projected data is maximized as follows.
- 3. Deriving $Tr\left({P}^{T}{S}_{Nw}P\right)$ from Equation (12). Project the neighboring samples into the feature space via **P**. The neighboring within-class scatter in feature space is described below.

## References

- Zhang, Y.; Jiang, J.; Zhang, G. Compression of remotely sensed astronomical image using wavelet-based compressed sensing in deep space exploration. Remote Sens. **2021**, 13, 288.
- Wu, W.; Liu, W.; Qiao, D.; Jie, D. Investigation on the development of deep space exploration. Sci. China Technol. Sci. **2012**, 55, 1086–1091.
- Dorsky, L.I. Trends in instrument systems for deep space exploration. IEEE Aerosp. Electron. Syst. Mag. **2001**, 16, 3–12.
- Seager, S.; Bains, W. The search for signs of life on exoplanets at the interface of chemistry and planetary science. Sci. Adv. **2015**, 1, e1500047.
- Cole, G.H. Planetary Science: The Science of Planets around Stars; Taylor & Francis: Abingdon, UK, 2002.
- Keil, K. Thermal alteration of asteroids: Evidence from meteorites. Planet. Space Sci. **2000**, 48, 887–903.
- Carry, B. Density of asteroids. Planet. Space Sci. **2012**, 73, 98–118.
- Lu, X.P.; Jewitt, D. Dependence of light curves on phase angle and asteroid shape. Astron. J. **2019**, 158, 220.
- Bus, S.J.; Binzel, R.P. Phase II of the small main-belt asteroid spectroscopic survey: A feature-based taxonomy. Icarus **2002**, 158, 146–177.
- Xu, S.; Binzel, R.P.; Burbine, T.H.; Bus, S.J. Small main-belt asteroid spectroscopic survey. Bull. Am. Astron. Soc. **1993**, 25, 1135.
- Howell, E.S.; Merényi, E.; Lebofsky, L.A. Classification of asteroid spectra using a neural network. J. Geophys. Res. **1994**, 99, 10847–10865.
- Binzel, R.P.; Harris, A.W.; Bus, S.J.; Burbine, T.H. Spectral properties of near-Earth objects: Palomar and IRTF results for 48 objects including spacecraft targets (9969) Braille and (10302) 1989 ML. Icarus **2001**, 151, 139–149.
- Vilas, F.; McFadden, L.A. CCD reflectance spectra of selected asteroids: I. Presentation and data analysis considerations. Icarus **1992**, 100, 85–94.
- Zellner, B.; Tholen, D.J.; Tedesco, E.F. The eight-color asteroid survey: Results for 589 minor planets. Icarus **1985**, 61, 355–416.
- Xu, S.; Binzel, R.P.; Burbine, T.H.; Bus, S.J. Small main-belt asteroid spectroscopic survey: Initial results. Icarus **1995**, 115, 1–35.
- Burbine, T.H.; Binzel, R.P. Small main-belt asteroid spectroscopic survey in the near-infrared. Icarus **2002**, 159, 468–499.
- Bus, S.J.; Binzel, R.P. Phase II of the small main-belt asteroid spectroscopic survey: The observations. Icarus **2002**, 158, 106–145.
- Bus, S.J. Compositional Structure in the Asteroid Belt: Results of a Spectroscopic Survey. Ph.D. Thesis, Massachusetts Institute of Technology, Cambridge, MA, USA, 1999.
- Tholen, D.J. Asteroid Taxonomy from Cluster Analysis of Photometry. Ph.D. Thesis, University of Arizona, Tucson, AZ, USA, 1984.
- DeMeo, F.E.; Binzel, R.P.; Slivan, S.M.; Bus, S.J. An extension of the Bus asteroid taxonomy into the near-infrared. Icarus **2009**, 202, 160–180.
- Xu, S. CCD Photometry and Spectroscopy of Small Main-Belt Asteroids. Ph.D. Thesis, Massachusetts Institute of Technology, Cambridge, MA, USA, 1994.
- Imani, M.; Ghassemian, H. Band clustering-based feature extraction for classification of hyperspectral images using limited training samples. IEEE Geosci. Remote Sens. Lett. **2014**, 11, 1325–1329.
- Taşkın, G.; Kaya, H.; Bruzzone, L. Feature selection based on high dimensional model representation for hyperspectral images. IEEE Trans. Image Process. **2017**, 26, 2918–2928.
- Wood, X.H.; Kuiper, G.P. Photometric studies of asteroids. Astrophys. J. **1963**, 137, 1279.
- Gaffey, M.J.; Burbine, T.H.; Binzel, R.P. Asteroid spectroscopy: Progress and perspectives. Meteoritics **1993**, 28, 161–187.
- Sun, W.; Du, Q. Hyperspectral band selection: A review. IEEE Geosci. Remote Sens. Mag. **2019**, 7, 118–139.
- Herrmann, F.J.; Friedlander, M.P.; Yilmaz, O. Fighting the curse of dimensionality: Compressive sensing in exploration seismology. IEEE Signal Process. Mag. **2012**, 29, 88–100.
- Zhang, L.; Zhang, L.; Tao, D.; Bu, B. Hyperspectral remote sensing image subpixel target detection based on supervised metric learning. IEEE Trans. Geosci. Remote Sens. **2013**, 52, 4955–4965.
- Dong, Y.; Liang, T.; Zhang, Y.; Du, B. Spectral-spatial weighted kernel manifold embedded distribution alignment for remote sensing image classification. IEEE Trans. Cybern. **2021**, 51, 3185–3197.
- Guo, T.; Luo, F.; Zhang, L.; Zhang, B.; Tan, X.; Zhou, X. Learning structurally incoherent background and target dictionaries for hyperspectral target detection. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. **2020**, 13, 3521–3533.
- Rodger, A.; Laukamp, C.; Fabris, A. Feature extraction and clustering of spectrally measured drill core to identify mineral assemblages and potential spatial boundaries. Minerals **2021**, 11, 136.
- Luo, F.; Zhang, L.; Zhou, X.; Guo, T.; Cheng, Y.; Yin, T. Sparse-adaptive hypergraph discriminant analysis for hyperspectral image classification. IEEE Geosci. Remote Sens. Lett. **2020**, 17, 1082–1086.
- Luo, F.; Zhang, L.; Du, B.; Zhang, L. Dimensionality reduction with enhanced hybrid-graph discriminant learning for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. **2020**, 58, 5336–5353.
- Guo, T.; Luo, F.; Zhang, L.; Tan, X.; Liu, J.; Zhou, X. Target detection in hyperspectral imagery via sparse and dense hybrid representation. IEEE Geosci. Remote Sens. Lett. **2020**, 17, 716–720.
- Luo, F.; Du, B.; Zhang, L.; Zhang, L.; Tao, D. Feature learning using spatial-spectral hypergraph discriminant analysis for hyperspectral image. IEEE Trans. Cybern. **2019**, 49, 2406–2419.
- Hotelling, H. Analysis of complex statistical variables into principal components. Br. J. Educ. Psychol. **1933**, 24, 417–520.
- Fisher, R.A. The statistical utilization of multiple measurements. Ann. Hum. Genet. **1938**, 8, 376–386.
- He, X.; Niyogi, P. Locality preserving projections. Adv. Neural Inf. Process. Syst. **2004**, 16, 153–160.
- Gui, J.; Wang, C.; Zhu, L. Locality preserving discriminant projections. In Emerging Intelligent Computing Technology and Applications. With Aspects of Artificial Intelligence, Proceedings of the International Conference on Intelligent Computing, Ulsan, Korea, 16–19 September 2009; Springer: Berlin/Heidelberg, Germany, 2009; pp. 566–572.
- Zhang, L.; Wang, X.; Huang, G.B.; Liu, T.; Tan, X. Taste recognition in E-tongue using local discriminant preservation projection. IEEE Trans. Cybern. **2018**, 49, 947–960.
- Martinez, A.M.; Kak, A.C. PCA versus LDA. IEEE Trans. Pattern Anal. Mach. Intell. **2001**, 23, 228–233.
- He, X.; Yan, S.; Hu, Y.; Niyogi, P.; Zhang, H.J. Face recognition using Laplacianfaces. IEEE Trans. Pattern Anal. Mach. Intell. **2005**, 27, 328–340.
- Yan, S.; Xu, D.; Zhang, B.; Zhang, H.; Yang, Q.; Lin, S. Graph embedding and extensions: A general framework for dimensionality reduction. IEEE Trans. Pattern Anal. Mach. Intell. **2007**, 29, 40–51.
- Vapnik, V.N. An overview of statistical learning theory. IEEE Trans. Neural Netw. **1999**, 10, 988–999.
- Huang, G.B.; Zhou, H.; Ding, X.; Zhang, R. Extreme learning machine for regression and multiclass classification. IEEE Trans. Syst. Man Cybern. Part B **2011**, 42, 513–529.
- Zhang, L.; Zhang, D. Domain adaptation extreme learning machines for drift compensation in E-nose systems. IEEE Trans. Instrum. Meas. **2014**, 64, 1790–1801.
- Guo, T.; Zhang, L.; Tan, X. Neuron pruning based discriminative extreme learning machine for pattern classification. Cogn. Comput. **2017**, 9, 581–595.
- Zhang, L.; Zhang, D. Robust visual knowledge transfer via extreme learning machine based domain adaptation. IEEE Trans. Image Process. **2016**, 25, 4959–4973.
- Boyd, S.; Vandenberghe, L. Convex Optimization; Cambridge University Press: New York, NY, USA, 2009.

**Figure 3.** The spectral preprocessing for (1) Ceres with category ‘C’. (**a**) Original spectra; (**b**) smoothed spectra; (**c**) fitted spectra spanning the wavelength range from 0.45 to 2.45 μm.

**Figure 4.** The spectral preprocessing for (2957) Tatsuo with category ‘K’. (**a**) Original spectra; (**b**) smoothed spectra; (**c**) fitted spectra spanning the wavelength range from 0.45 to 2.45 μm.

**Figure 5.** The spectral preprocessing for (1807) Slovakia with category ‘S’. (**a**) Original spectra; (**b**) smoothed spectra; (**c**) fitted spectra spanning the wavelength range from 0.45 to 2.45 μm.

**Figure 7.** The performance of different dimension reduction methods under different reduced dimensions using NN as the classifier.

**Figure 8.** The performance of different dimension reduction methods under different reduced dimensions using SVM as the classifier.

**Figure 9.** The performance of different dimension reduction methods under different reduced dimensions using ELM as the classifier.

**Figure 10.** Visualization of the scatter points of the first two components acquired by different methods. By comparison, the proposed NDCA model shows better within-class compactness and between-class separation.

**Figure 11.** NN plus NDCA performance under different combinations of parameters. (**a**) μ and γ (best: 90.60%; worst: 89.76%); (**b**) Rb and Rw (best: 92.81%; worst: 87.82%).

**Figure 12.** SVM plus NDCA performance under different combinations of parameters. (**a**) μ and γ (best: 91.7009%; worst: 90.8676%); (**b**) Rb and Rw (best: 91.9787%; worst: 87.2679%).

**Figure 13.** ELM plus NDCA performance under different combinations of parameters. (**a**) μ and γ (best: 90.0894%; worst: 88.4365%); (**b**) Rb and Rw (best: 90.3676%; worst: 87.2660%).

**Figure 14.** Classification performance variations of NDCA plus ELM under different settings of L and α.

Class | ‘A’ | ‘B’ | ‘C’ | ‘Cb’ | ‘Cg’ | ‘Cgh’ | ‘Ch’ | ‘D’
---|---|---|---|---|---|---|---|---
# samples | 6 | 4 | 13 | 3 | 1 | 10 | 18 | 16

Class | ‘K’ | ‘L’ | ‘O’ | ‘Q’ | ‘R’ | ‘S’ | ‘Sa’ | ‘Sq’
---|---|---|---|---|---|---|---|---
# samples | 16 | 22 | 1 | 8 | 1 | 144 | 2 | 29

Class | ‘Sr’ | ‘Sv’ | ‘T’ | ‘V’ | ‘X’ | ‘Xc’ | ‘Xe’ | ‘Xk’
---|---|---|---|---|---|---|---|---
# samples | 22 | 2 | 4 | 17 | 4 | 3 | 7 | 18

Notation | Meaning | Notation | Meaning
---|---|---|---
P | Subspace projection matrix | T | Label matrix
X | High-dimensional dataset with dimension D | x_{i}, x_{j} | Data points with index i and j
Y | Lower-dimensional features of X with dimension d | N | Number of data points
C | Number of classes in X and Y | N_{i}, i = 1, 2, …, C | Number of data points in the i-th class
α | Balance parameter in ELM model | y_{i}, y_{j} | Lower-dimensional features for x_{i}, x_{j}

Class | ‘A’ | ‘C’ | ‘D’ | ‘K’ | ‘L’ | ‘Q’ | ‘S’ | ‘V’ | ‘X’ | Total
---|---|---|---|---|---|---|---|---|---|---
# Samples | 6 | 45 | 16 | 16 | 22 | 8 | 199 | 17 | 32 | 361

Class | Fold 1 | Fold 2 | Fold 3 | Fold 4 | Fold 5 | Total
---|---|---|---|---|---|---
‘A’ | 1 | 1 | 1 | 1 | 2 | 6
‘C’ | 9 | 9 | 9 | 9 | 9 | 45
‘D’ | 3 | 3 | 3 | 4 | 3 | 16
‘K’ | 4 | 3 | 3 | 3 | 3 | 16
‘L’ | 4 | 4 | 4 | 5 | 5 | 22
‘Q’ | 2 | 2 | 1 | 2 | 1 | 8
‘S’ | 40 | 40 | 40 | 40 | 39 | 199
‘V’ | 3 | 4 | 4 | 3 | 3 | 17
‘X’ | 6 | 6 | 7 | 6 | 7 | 32
# samples | 72 | 72 | 72 | 73 | 72 | 361

Experiments | Training Dataset | Testing Dataset
---|---|---
Exp. 1 | fold 1, fold 2, fold 3 and fold 4 (289 samples in total) | fold 5 (72 samples in total)
Exp. 2 | fold 1, fold 2, fold 3 and fold 5 (288 samples in total) | fold 4 (73 samples in total)
Exp. 3 | fold 1, fold 2, fold 4 and fold 5 (289 samples in total) | fold 3 (72 samples in total)
Exp. 4 | fold 1, fold 3, fold 4 and fold 5 (289 samples in total) | fold 2 (72 samples in total)
Exp. 5 | fold 2, fold 3, fold 4 and fold 5 (289 samples in total) | fold 1 (72 samples in total)

**Table 6.** Classification accuracy (%) of different dimension reduction algorithms using NN as the classifier.

Methods | Exp. 1 | Exp. 2 | Exp. 3 | Exp. 4 | Exp. 5 | Average
---|---|---|---|---|---|---
Raw | 94.4444 | 84.9315 | 87.5000 | 93.0556 | 86.1111 | 89.2085
PCA | 94.4444 | 84.9315 | 87.5000 | 93.0556 | 86.1111 | 89.2085
LDA | 95.8333 | 90.4110 | 88.8889 | 97.2222 | 88.8889 | 92.2489
LPP | 90.2778 | 87.6712 | 90.2778 | 91.6667 | 88.8889 | 89.7565
LPDP | 95.8333 | 90.4110 | 91.6667 | 97.2222 | 88.8889 | 92.8044
NDCA | 97.2222 | 89.0411 | 93.0556 | 98.6111 | 93.0556 | 94.1971

**Table 7.** Classification accuracy (%) of different dimension reduction algorithms using SVM as the classifier.

Methods | Exp. 1 | Exp. 2 | Exp. 3 | Exp. 4 | Exp. 5 | Average
---|---|---|---|---|---|---
Raw | 94.4444 | 86.3014 | 93.0556 | 93.0556 | 91.6667 | 91.7047
PCA | 94.4444 | 89.0411 | 93.0556 | 93.0556 | 91.6667 | 92.2527
LDA | 94.4444 | 90.4110 | 88.8889 | 94.4444 | 91.6667 | 91.9711
LPP | 97.2222 | 86.3014 | 90.2778 | 95.8333 | 94.4444 | 92.8158
LPDP | 94.4444 | 90.4110 | 93.0556 | 94.4444 | 91.6667 | 92.8044
NDCA | 94.4444 | 90.4110 | 94.4444 | 95.8333 | 93.0556 | 93.6377

**Table 8.** Classification accuracy (%) of different dimension reduction algorithms using ELM as the classifier.

Methods | Exp. 1 | Exp. 2 | Exp. 3 | Exp. 4 | Exp. 5 | Average
---|---|---|---|---|---|---
Raw | 94.8611 | 89.3151 | 92.0833 | 95.6944 | 91.5278 | 92.6963
PCA | 95.0000 | 90.4110 | 92.0833 | 96.5278 | 92.3611 | 93.2766
LDA | 95.4167 | 94.5205 | 91.8056 | 97.2222 | 93.4722 | 94.4874
LPP | 95.8333 | 90.4110 | 93.0556 | 98.6111 | 94.4444 | 94.4711
LPDP | 95.9722 | 94.5205 | 92.6389 | 97.2222 | 93.3333 | 94.7374
NDCA | 97.7778 | 91.7808 | 93.4722 | 97.2222 | 95.6944 | 95.1895

Classifiers | &lt;Ours, Raw&gt; | &lt;Ours, PCA&gt; | &lt;Ours, LDA&gt; | &lt;Ours, LPP&gt; | &lt;Ours, LPDP&gt;
---|---|---|---|---|---
NN | ↑ 4.9886% | ↑ 4.9886% | ↑ 1.9482% | ↑ 4.4406% | ↑ 1.3927%
SVM | ↑ 1.9330% | ↑ 1.3850% | ↑ 1.6666% | ↑ 0.8219% | ↑ 0.8333%
ELM | ↑ 2.4932% | ↑ 1.9129% | ↑ 0.7021% | ↑ 0.7184% | ↑ 0.4521%


© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Guo, T.; Lu, X.-P.; Zhang, Y.-X.; Yu, K.
Neighboring Discriminant Component Analysis for Asteroid Spectrum Classification. *Remote Sens.* **2021**, *13*, 3306.
https://doi.org/10.3390/rs13163306
