Semi-Supervised Classification via Hypergraph Convolutional Extreme Learning Machine
Abstract
1. Introduction
- We propose a simple but effective hypergraph convolutional ELM, i.e., HGCELM, for semi-supervised classification. HGCELM not only inherits the advantages of ELM but also enables ELM to model high-order relationships among data. This successful attempt signifies that structured information among data, especially high-order relationships, is important for ELM, and it offers an alternative direction for ELM representation learning.
- We show that traditional ELMs are special cases of HGCELM on Euclidean data. We conduct extensive experiments on 26 popular datasets for the semi-supervised classification task. Comparisons with state-of-the-art methods demonstrate that the proposed HGCELM achieves superior performance.
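The special-case claim can be made concrete: when every vertex forms its own singleton hyperedge, the normalized propagation operator degenerates to the identity, so the random hypergraph embedding collapses to the ordinary ELM feature map. A minimal NumPy sketch of this reduction, with illustrative variable names rather than the paper's exact notation:

```python
import numpy as np

# Singleton hyperedges give incidence matrix H = I, so vertex and hyperedge
# degrees are all 1 and the normalized propagation operator
# Theta = Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} reduces to the identity.
N, m, L = 8, 3, 5
rng = np.random.default_rng(0)
X = rng.standard_normal((N, m))        # features
W_hid = rng.standard_normal((m, L))    # random hidden-layer weights

Theta = np.eye(N)                      # identity: no structural smoothing
elm_features = 1.0 / (1.0 + np.exp(-X @ W_hid))             # plain ELM map
hgcelm_features = 1.0 / (1.0 + np.exp(-Theta @ X @ W_hid))  # HGCELM map
```

With `Theta = I`, the two feature matrices coincide exactly, which is the sense in which ELM is a special case of HGCELM on Euclidean (structure-free) data.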
2. Preliminary and Related Work
2.1. Notations
2.2. Hypergraph Preliminary
2.3. ELMs
2.4. GCNs
3. HGCELM
3.1. Hypergraph Construction
3.2. Random Hypergraph Convolution
3.3. Hypergraph Convolutional Regression
Algorithm 1: HGCELM
Input: The dataset, the training labels, and the hyper-parameters, including the number of hidden neurons L.
1. Construct the hypergraph by Equation (11);
2. Generate the hidden layer parameter matrix from the standard normal distribution;
3. Calculate the random hypergraph embedding;
4. Solve the hypergraph convolutional regression for the output layer parameters;
5. Predict the test labels by Equation (17);
Output: The predicted labels.
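A minimal NumPy sketch of the pipeline in Algorithm 1, assuming a k-nearest-neighbour hypergraph (one hyperedge per point joining it with its k neighbours), unit hyperedge weights, the standard normalized propagation operator, a sigmoid activation, and a ridge-regularized least-squares readout; all names and defaults here are illustrative, not the paper's exact formulation:

```python
import numpy as np

def knn_hypergraph(X, k=5):
    """Incidence matrix H (N x N): hyperedge j joins point j and its k nearest neighbours."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    H = np.zeros((len(X), len(X)))
    for j in range(len(X)):
        H[np.argsort(d2[j])[:k + 1], j] = 1.0  # the point itself plus k neighbours
    return H

def propagation_operator(H, w=None):
    """Theta = Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} with hyperedge weights w."""
    w = np.ones(H.shape[1]) if w is None else w
    dv = (H * w).sum(1)                          # vertex degrees
    de = H.sum(0)                                # hyperedge degrees
    theta = (H * (w / de)) @ H.T                 # H W De^{-1} H^T
    return theta * np.outer(dv ** -0.5, dv ** -0.5)

def hgcelm_fit_predict(X, y_labeled, labeled_idx, L=32, k=5, C=1.0, seed=0):
    """Steps 1-5 of Algorithm 1 in one transductive pass."""
    rng = np.random.default_rng(seed)
    Theta = propagation_operator(knn_hypergraph(X, k))           # step 1
    W_hid = rng.standard_normal((X.shape[1], L))                 # step 2
    E = 1.0 / (1.0 + np.exp(-Theta @ X @ W_hid))                 # step 3
    Y = np.eye(int(y_labeled.max()) + 1)[y_labeled]              # one-hot labels
    El = E[labeled_idx]
    beta = np.linalg.solve(El.T @ El + np.eye(L) / C, El.T @ Y)  # step 4 (ridge)
    return (E @ beta).argmax(1)                                  # step 5
```

On a toy two-blob problem with a handful of labels, the unlabeled points inherit the correct class through the hypergraph smoothing, which is the behaviour the algorithm relies on.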
3.4. Computation Complexity and Connection to Existing Methods
4. Results and Discussion
4.1. Experimental Configurations
4.1.1. Datasets
4.1.2. Baseline Methods
4.2. Qualitative Study
4.2.1. Comparisons with Baselines
4.2.2. Performance with Varying Training Size
4.2.3. Analysis on Decision Boundaries
4.3. Parameter Sensitivity Study
4.3.1. Impact of Hidden Neurons
4.3.2. Impact of and k
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Huang, G.; Huang, G.B.; Song, S.; You, K. Trends in extreme learning machines: A review. Neural Netw. 2015, 61, 32–48.
- Huang, G.B.; Zhu, Q.Y.; Siew, C.K. Extreme learning machine: Theory and applications. Neurocomputing 2006, 70, 489–501.
- Zhang, Y.; Wu, J.; Cai, Z.; Du, B.; Yu, P.S. An unsupervised parameter learning model for RVFL neural network. Neural Netw. 2019, 112, 85–97.
- Huang, G.B.; Zhou, H.; Ding, X.; Zhang, R. Extreme Learning Machine for Regression and Multiclass Classification. IEEE Trans. Syst. Man Cybern. Part B (Cybern.) 2012, 42, 513–529.
- Cai, Y.; Liu, X.; Zhang, Y.; Cai, Z. Hierarchical ensemble of extreme learning machine. Pattern Recognit. Lett. 2018, 116, 101–106.
- Song, Y.; Crowcroft, J.; Zhang, J. Automatic epileptic seizure detection in EEGs based on optimized sample entropy and extreme learning machine. J. Neurosci. Methods 2012, 210, 132–146.
- Zeng, Y.; Xu, X.; Shen, D.; Fang, Y.; Xiao, Z. Traffic Sign Recognition Using Kernel Extreme Learning Machines with Deep Perceptual Features. IEEE Trans. Intell. Transp. Syst. 2017, 18, 1647–1653.
- Xu, Y.; Dong, Z.Y.; Zhao, J.H.; Zhang, P.; Wong, K.P. A Reliable Intelligent System for Real-Time Dynamic Security Assessment of Power Systems. IEEE Trans. Power Syst. 2012, 27, 1253–1263.
- Chen, X.; Dong, Z.Y.; Meng, K.; Xu, Y.; Wong, K.P.; Ngan, H.W. Electricity Price Forecasting With Extreme Learning Machine and Bootstrapping. IEEE Trans. Power Syst. 2012, 27, 2055–2062.
- Zhang, L.; Suganthan, P. A survey of randomized algorithms for training neural networks. Inf. Sci. 2016, 364–365, 146–155.
- Cao, W.; Wang, X.; Ming, Z.; Gao, J. A review on neural networks with random weights. Neurocomputing 2018, 275, 278–287.
- Wu, Y.; Zhang, Y.; Liu, X.; Cai, Z.; Cai, Y. A multiobjective optimization-based sparse extreme learning machine algorithm. Neurocomputing 2018, 317, 88–100.
- Zhang, Y.; Wu, J.; Cai, Z.; Zhang, P.; Chen, L. Memetic Extreme Learning Machine. Pattern Recognit. 2016, 58, 135–148.
- Xiao, L.; Shao, W.; Jin, F.; Wu, Z. A self-adaptive kernel extreme learning machine for short-term wind speed forecasting. Appl. Soft Comput. 2021, 99, 106917.
- Wang, X.B.; Zhang, X.; Li, Z.; Wu, J. Ensemble extreme learning machines for compound-fault diagnosis of rotating machinery. Knowl. Based Syst. 2020, 188, 105012.
- Huang, G.B.; Bai, Z.; Kasun, L.L.C.; Vong, C.M. Local Receptive Fields Based Extreme Learning Machine. IEEE Comput. Intell. Mag. 2015, 10, 18–29.
- Cai, Y.; Zhang, Z.; Yan, Q.; Zhang, D.; Banu, M.J. Densely connected convolutional extreme learning machine for hyperspectral image classification. Neurocomputing 2021, 434, 21–32.
- Cai, Y.; Liu, X.; Cai, Z. BS-Nets: An End-to-End Framework for Band Selection of Hyperspectral Image. IEEE Trans. Geosci. Remote Sens. 2020, 58, 1969–1984.
- Joachims, T. Transductive inference for text classification using support vector machines. In Proceedings of the ICML, Bled, Slovenia, 27–30 June 1999; Volume 99, pp. 200–209.
- Belkin, M.; Niyogi, P.; Sindhwani, V. Manifold regularization: A geometric framework for learning from labeled and unlabeled examples. J. Mach. Learn. Res. 2006, 7, 2399–2434.
- Huang, G.; Song, S.; Gupta, J.N.D.; Wu, C. Semi-Supervised and Unsupervised Extreme Learning Machines. IEEE Trans. Cybern. 2014, 44, 2405–2417.
- Yi, Y.; Qiao, S.; Zhou, W.; Zheng, C.; Liu, Q.; Wang, J. Adaptive multiple graph regularized semi-supervised extreme learning machine. Soft Comput. 2018, 22, 3545–3562.
- Wu, Z.; Pan, S.; Chen, F.; Long, G.; Zhang, C.; Yu, P.S. A Comprehensive Survey on Graph Neural Networks. IEEE Trans. Neural Netw. Learn. Syst. 2021, 32, 4–24.
- Garg, V.; Jegelka, S.; Jaakkola, T. Generalization and Representational Limits of Graph Neural Networks. In Proceedings of the 37th International Conference on Machine Learning, Vienna, Austria, 12–18 July 2020; Daumé, H., III, Singh, A., Eds.; Volume 119, pp. 3419–3430.
- Kipf, T.N.; Welling, M. Semi-supervised classification with graph convolutional networks. arXiv 2016, arXiv:1609.02907.
- Cai, Y.; Zhang, Z.; Cai, Z.; Liu, X.; Jiang, X.; Yan, Q. Graph Convolutional Subspace Clustering: A Robust Subspace Clustering Framework for Hyperspectral Image. IEEE Trans. Geosci. Remote Sens. 2020, 59, 4191–4202.
- Lee, J.; Lee, I.; Kang, J. Self-Attention Graph Pooling. In Proceedings of the 36th International Conference on Machine Learning, Long Beach, CA, USA, 9–15 June 2019; Chaudhuri, K., Salakhutdinov, R., Eds.; Volume 97, pp. 3734–3743.
- Li, Q.; Han, Z.; Wu, X.M. Deeper insights into graph convolutional networks for semi-supervised learning. In Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence, New Orleans, LA, USA, 2–7 February 2018.
- Wu, F.; Souza, A.; Zhang, T.; Fifty, C.; Yu, T.; Weinberger, K. Simplifying Graph Convolutional Networks. In Proceedings of the 36th International Conference on Machine Learning, Long Beach, CA, USA, 9–15 June 2019; Chaudhuri, K., Salakhutdinov, R., Eds.; Volume 97, pp. 6861–6871.
- Feng, Y.; You, H.; Zhang, Z.; Ji, R.; Gao, Y. Hypergraph Neural Networks. In Proceedings of the AAAI Conference on Artificial Intelligence, Honolulu, HI, USA, 27 January–1 February 2019; Volume 33, pp. 3558–3565.
- Bai, S.; Zhang, F.; Torr, P.H. Hypergraph convolution and hypergraph attention. Pattern Recognit. 2021, 110, 107637.
- Zhang, Z.; Cai, Y.; Gong, W.; Liu, X.; Cai, Z. Graph Convolutional Extreme Learning Machine. In Proceedings of the 2020 International Joint Conference on Neural Networks (IJCNN), Glasgow, UK, 19–24 July 2020; pp. 1–8.
- Zhou, D.; Huang, J.; Schölkopf, B. Learning with Hypergraphs: Clustering, Classification, and Embedding. In Advances in Neural Information Processing Systems 19: Proceedings of the 2006 Conference; MIT Press: Cambridge, MA, USA, 2007; pp. 1601–1608.
- Jin, T.; Cao, L.; Zhang, B.; Sun, X.; Deng, C.; Ji, R. Hypergraph Induced Convolutional Manifold Networks. In Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19), Macao, China, 10–16 August 2019; pp. 2670–2676.
- Cai, Y.; Zhang, Z.; Cai, Z.; Liu, X.; Jiang, X. Hypergraph-Structured Autoencoder for Unsupervised and Semisupervised Classification of Hyperspectral Image. IEEE Geosci. Remote Sens. Lett. 2021, 1–5.
- Defferrard, M.; Bresson, X.; Vandergheynst, P. Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering. In Proceedings of the 30th International Conference on Neural Information Processing Systems (NIPS’16), Barcelona, Spain, 4–9 December 2016; pp. 3844–3852.
- Hamilton, W.L.; Ying, R.; Leskovec, J. Inductive Representation Learning on Large Graphs. In Proceedings of the 31st International Conference on Neural Information Processing Systems (NIPS’17), Long Beach, CA, USA, 4–9 December 2017; pp. 1025–1035.
- Chen, C.; Li, W.; Su, H.; Liu, K. Spectral-Spatial Classification of Hyperspectral Image Based on Kernel Extreme Learning Machine. Remote Sens. 2014, 6, 5795–5814.
- Li, Y.; Guan, C.; Li, H.; Chin, Z. A self-training semi-supervised SVM algorithm and its application in an EEG-based brain computer interface speller system. Pattern Recognit. Lett. 2008, 29, 1285–1294.
Notation | Definition
---|---
N | The number of data points.
m | The number of features.
C | The number of classes.
 | The feature matrix of the dataset.
 | The label matrix with one-hot encoding.
 | The number of labeled samples.
 | The labeled set.
 | The unlabeled set.
 | The hidden layer parameter matrix.
 | The output layer parameter matrix.
 | The matrix of latent representations.
 | The Moore–Penrose generalized inverse of a matrix.
L | The number of hidden neurons.
 | The set of vertices in the hypergraph.
 | The set of hyperedges in the hypergraph.
 | The diagonal matrix of the hyperedge weights.
 | A hypergraph.
 | The incidence matrix of the hypergraph.
 | The degree of the vertex v.
 | The degree of the hyperedge e.
 | The diagonal matrix of the vertex degrees.
 | The diagonal matrix of the hyperedge degrees.
 | The hypergraph Laplacian matrix.
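The hypergraph quantities in the table fit together as in the standard normalized construction (Zhou et al., Learning with Hypergraphs). A small worked example, writing the incidence matrix as `H` and using unit hyperedge weights, both of which are illustrative choices:

```python
import numpy as np

# Toy hypergraph: 4 vertices, 2 hyperedges e0 = {v0, v1, v2}, e1 = {v2, v3}.
H = np.array([[1., 0.],
              [1., 0.],
              [1., 1.],
              [0., 1.]])          # incidence matrix: H[v, e] = 1 iff v is in e
w = np.ones(2)                    # hyperedge weights (diagonal of W)
dv = (H * w).sum(axis=1)          # vertex degrees d(v) = sum_e w(e) h(v, e)
de = H.sum(axis=0)                # hyperedge degrees delta(e) = sum_v h(v, e)

Dv_is = np.diag(dv ** -0.5)
Theta = Dv_is @ H @ np.diag(w) @ np.diag(1.0 / de) @ H.T @ Dv_is
Delta = np.eye(4) - Theta         # normalized hypergraph Laplacian
```

Here `Delta` is symmetric positive semi-definite, which is what makes it usable as a smoothing regularizer over the hypergraph.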
Dataset | #Classes | #Instance | #Feature | #Train | #Test | Dataset | #Classes | #Instance | #Feature | #Train | #Test |
---|---|---|---|---|---|---|---|---|---|---|---|
austra | 2 | 680 | 14 | 10 | 680 | weather | 2 | 22 | 4 | 10 | 12 |
australian | 2 | 690 | 14 | 70 | 620 | Wine | 3 | 178 | 13 | 19 | 159 |
breast | 2 | 277 | 9 | 29 | 248 | X8D5K | 5 | 1000 | 8 | 100 | 900 |
cleve | 2 | 296 | 13 | 30 | 266 | zoo | 7 | 101 | 16 | 13 | 88 |
diabetes | 2 | 768 | 8 | 78 | 690 | cloud | 2 | 1024 | 10 | 103 | 921 |
dnatest | 3 | 1186 | 180 | 120 | 1066 | bupa | 2 | 345 | 6 | 10 | 335 |
german | 2 | 1000 | 24 | 100 | 900 | air | 3 | 359 | 64 | 37 | 322 |
heart | 2 | 270 | 13 | 27 | 243 | segmentation | 7 | 210 | 18 | 21 | 189 |
ionosphere | 2 | 351 | 32 | 10 | 341 | pima In. D. | 2 | 768 | 8 | 10 | 758 |
iris | 3 | 150 | 4 | 15 | 135 | Xinp | 3 | 178 | 13 | 19 | 159 |
sonar | 2 | 208 | 60 | 10 | 198 | wdbc | 2 | 569 | 30 | 58 | 511 |
vote | 2 | 435 | 16 | 10 | 425 | ecoli_label | 2 | 335 | 2 | 10 | 325 |
WBC | 2 | 683 | 9 | 10 | 673 | appendicitis | 2 | 106 | 7 | 12 | 94 |
Method | Hyper-Parameters |
---|---|
HGCELM (Ours) | |
GCELM [32] | |
ELM [2] | |
KELM [38] | |
SS-ELM [21] | |
TSVM [19] | |
ST-ELM [39] | |
LapSVM [20] | |
GCN [25] |
Data Sets | HGCELM | GCELM | SS-ELM | ELM | KELM | TSVM | ST-ELM | LapSVM | GCN |
---|---|---|---|---|---|---|---|---|---|
austra | 82.57 ± 3.72 | 80.78 ± 2.99 | 76.24 ± 8.14 | 73.32 ± 10.48 | 79.47 ± 4.12 | 77.43 ± 10.59 | 75.17 ± 9.99 | 55.70 ± 0.21 | 71.38 ± 8.36 |
australian | 81.90 ± 4.73 | 78.46 ± 4.82 | 78.10 ± 6.24 | 71.35 ± 11.25 | 79.32 ± 6.11 | 75.38 ± 10.81 | 78.57 ± 9.56 | 75.79 ± 6.61 | 75.36 ± 7.56 |
breast | 66.52 ± 10.39 | 61.61 ± 8.61 | 55.30 ± 7.67 | 52.00 ± 7.38 | 55.56 ± 8.97 | 55.77 ± 6.78 | 53.18 ± 10.68 | 71.54 ± 0.00 | 60.17 ± 6.18 |
cleve | 74.51 ± 2.09 | 72.90 ± 6.42 | 72.88 ± 5.30 | 68.95 ± 7.31 | 72.38 ± 3.79 | 72.80 ± 5.70 | 68.85 ± 8.30 | 54.20 ± 0.00 | 70.17 ± 6.14 |
diabetes | 69.22 ± 3.75 | 70.10 ± 3.09 | 66.27 ± 8.92 | 57.31 ± 8.51 | 65.71 ± 5.64 | 64.97 ± 5.26 | 58.43 ± 8.56 | 65.30 ± 0.06 | 62.93 ± 6.62 |
dnatest | 78.55 ± 2.34 | 67.87 ± 5.03 | 51.60 ± 4.63 | 50.77 ± 4.79 | 50.81 ± 4.26 | 60.99 ± 2.94 | 42.86 ± 8.51 | 48.27 ± 4.10 | 59.33 ± 3.95 |
german | 67.38 ± 2.93 | 67.10 ± 10.01 | 54.48 ± 5.35 | 57.95 ± 9.42 | 51.07 ± 7.48 | 59.52 ± 6.89 | 59.64 ± 7.63 | 70.21 ± 0.03 | 59.54 ± 5.32 |
heart | 78.50 ± 2.83 | 76.53 ± 3.63 | 72.54 ± 6.06 | 69.15 ± 9.07 | 73.25 ± 8.99 | 70.23 ± 7.77 | 67.21 ± 8.62 | 55.77 ± 0.00 | 70.00 ± 8.58 |
ionosphere | 90.32 ± 0.00 | 81.07 ± 7.51 | 75.98 ± 4.31 | 74.65 ± 8.96 | 75.57 ± 6.18 | 77.05 ± 5.43 | 77.20 ± 5.68 | 64.52 ± 0.00 | 71.32 ± 8.21 |
iris | 95.56 ± 1.72 | 94.32 ± 3.20 | 80.04 ± 4.79 | 80.70 ± 10.76 | 93.04 ± 3.88 | 93.00 ± 3.70 | 81.04 ± 8.36 | 92.70 ± 2.58 | 91.19 ± 3.61 |
sonar | 67.93 ± 4.60 | 67.44 ± 6.64 | 63.28 ± 5.10 | 64.66 ± 6.37 | 62.56 ± 5.37 | 66.41 ± 5.27 | 62.21 ± 5.10 | 46.46 ± 0.00 | 67.22 ± 7.09 |
vote | 90.12 ± 2.60 | 84.19 ± 5.72 | 83.75 ± 3.49 | 84.39 ± 6.17 | 87.98 ± 1.97 | 87.97 ± 3.59 | 87.65 ± 6.21 | 38.35 ± 0.00 | 86.62 ± 3.87 |
WBC | 97.21 ± 0.27 | 96.53 ± 0.74 | 92.30 ± 1.83 | 89.35 ± 5.63 | 95.30 ± 2.78 | 95.40 ± 1.90 | 92.14 ± 3.50 | 65.26 ± 0.07 | 95.93 ± 2.28 |
weather | 84.17 ± 6.92 | 76.94 ± 14.38 | 69.72 ± 15.14 | 73.61 ± 12.74 | 69.17 ± 11.21 | 76.39 ± 14.29 | 70.00 ± 15.00 | 41.67 ± 0.00 | 50.83 ± 11.66 |
Wine | 96.38 ± 1.38 | 91.25 ± 2.28 | 89.37 ± 3.33 | 79.75 ± 6.72 | 90.31 ± 3.92 | 90.51 ± 3.34 | 79.59 ± 7.68 | 88.45 ± 1.60 | 90.84 ± 2.79 |
X8D5K | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 | 96.93 ± 3.07 | 99.96 ± 0.06 | 99.98 ± 0.05 | 99.93 ± 0.21 | 100.00 ± 0.00 | 100.00 ± 0.00 |
zoo | 100.00 ± 0.00 | 99.22 ± 1.32 | 98.02 ± 1.45 | 98.18 ± 1.51 | 98.65 ± 1.60 | 99.48 ± 1.09 | 99.69 ± 0.94 | 99.95 ± 0.28 | 100.00 ± 0.00 |
cloud | 90.38 ± 2.92 | 90.18 ± 3.68 | 89.92 ± 3.22 | 73.38 ± 10.32 | 88.48 ± 4.36 | 89.71 ± 4.65 | 75.95 ± 9.54 | 81.32 ± 4.86 | 88.10 ± 3.90 |
bupa | 57.37 ± 2.72 | 56.33 ± 6.93 | 52.55 ± 4.01 | 54.07 ± 6.63 | 56.15 ± 4.98 | 55.95 ± 5.62 | 55.41 ± 5.85 | 41.79 ± 0.00 | 51.72 ± 3.71 |
air | 81.72 ± 6.33 | 71.30 ± 6.25 | 66.52 ± 4.50 | 67.44 ± 5.83 | 70.89 ± 4.18 | 70.68 ± 6.38 | 69.53 ± 7.55 | 72.49 ± 7.63 | 79.65 ± 5.78 |
segmentation | 84.51 ± 2.00 | 83.22 ± 3.14 | 78.91 ± 3.42 | 65.71 ± 7.74 | 83.01 ± 3.22 | 81.85 ± 4.53 | 67.05 ± 8.36 | 80.84 ± 3.51 | 80.74 ± 3.61 |
pima In. D. | 68.97 ± 3.19 | 68.45 ± 5.55 | 67.28 ± 7.67 | 58.43 ± 7.06 | 63.52 ± 6.78 | 64.84 ± 8.34 | 62.35 ± 5.65 | 35.29 ± 1.03 | 65.47 ± 4.70 |
Xinp | 97.33 ± 0.85 | 93.48 ± 1.95 | 93.04 ± 2.31 | 80.31 ± 6.64 | 92.76 ± 3.00 | 93.90 ± 1.76 | 81.56 ± 8.59 | 92.45 ± 1.32 | 94.97 ± 1.82 |
wdbc | 94.44 ± 1.42 | 92.13 ± 0.16 | 88.94 ± 4.59 | 81.20 ± 8.63 | 91.72 ± 4.20 | 91.15 ± 4.02 | 89.51 ± 5.39 | 62.97 ± 0.00 | 90.86 ± 2.44 |
ecoli_label | 67.45 ± 1.78 | 64.46 ± 7.16 | 67.09 ± 2.29 | 57.92 ± 11.95 | 61.17 ± 12.58 | 58.92 ± 13.97 | 63.28 ± 9.71 | 61.80 ± 8.44 | 63.14 ± 8.21 |
appendicitis | 86.98 ± 2.25 | 85.83 ± 2.80 | 73.33 ± 14.10 | 58.12 ± 11.00 | 69.06 ± 12.88 | 63.70 ± 15.29 | 57.86 ± 15.23 | 16.67 ± 0.00 | 76.15 ± 10.68 |
Average | 82.69 ± 2.84 | 80.40 ± 4.61 | 75.64 ± 5.14 | 71.26 ± 7.78 | 76.90 ± 5.29 | 77.53 ± 5.94 | 72.62 ± 7.68 | 65.90 ± 1.57 | 76.63 ± 5.12 |
W/T/L | – | 24/1/1 | 25/1/0 | 26/0/0 | 26/0/0 | 26/0/0 | 26/0/0 | 24/1/1 | 24/2/0 |
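The W/T/L row counts, for each baseline, the number of datasets on which HGCELM wins, ties, or loses on mean accuracy. A minimal sketch of that tally, using made-up accuracy pairs rather than values from the table:

```python
def win_tie_loss(ours, baseline, tol=1e-9):
    """Count datasets where `ours` beats, ties, or trails `baseline` on mean accuracy."""
    wins = sum(a > b + tol for a, b in zip(ours, baseline))
    ties = sum(abs(a - b) <= tol for a, b in zip(ours, baseline))
    return wins, ties, len(ours) - wins - ties

# Illustrative accuracies on three datasets (not taken from the table above).
ours = [90.0, 85.0, 70.0]
base = [88.0, 85.0, 72.0]
print(win_tie_loss(ours, base))  # (1, 1, 1)
```

In practice a tolerance (or a significance test over the repeated runs behind the ± values) decides what counts as a tie; the exact-equality tolerance here is the simplest choice.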
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Liu, Z.; Zhang, Z.; Cai, Y.; Miao, Y.; Chen, Z. Semi-Supervised Classification via Hypergraph Convolutional Extreme Learning Machine. Appl. Sci. 2021, 11, 3867. https://doi.org/10.3390/app11093867