Application of Non-Sparse Manifold Regularized Multiple Kernel Classifier
Abstract
1. Introduction
2. Related Works
2.1. Manifold Regularization
2.2. P-Norm Multiple Kernel Classifier
3. Proposed Classifier
3.1. Manifold Regularized P-Norm Multiple Kernel Classifier
3.2. Proofs of the Theorems
3.3. Procedure of the Algorithm
Algorithm 1. Algorithm of the proposed model

Step 1. Set M kernel functions {k1, …, kM}; initialize d with each dm = 1/M; initialize the kernel matrix K = ∑ dm km; set the norm parameter p and the other parameters C, rA, rI.
Step 2. Given labeled data {xi, yi}, i = 1, …, ℓ, and unlabeled data {xi}, i = 1, …, u, use all the data samples to construct the adjacency graph G: determine the edges of G and the edge weights W with the kNN method and a Gaussian function, and compute the graph Laplacian L = D − W.
Step 3. Repeat:
- Obtain the optimal α from (24) with an SVM solver.
- Obtain the current optimal d according to Theorem 2, denoted dt.
- Recompute the kernel matrix K = ∑ dm km with the current dt.
- Update the problem formulation of (24).
Until ||dt+1 − dt|| ≤ ε.
Step 4. Output the final optimal classifier according to Theorem 1.
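Steps 2 and 3 above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the d-update uses the standard closed-form ℓp-norm MKL solution (dm ∝ ‖wm‖^(2/(p+1)), normalized so that ‖d‖p = 1) as a stand-in for Theorem 2, and a placeholder vector of per-kernel norms stands in for the SVM solver of (24); all function names are illustrative.

```python
import numpy as np

def knn_gaussian_laplacian(X, k=5, sigma=1.0):
    """Step 2: kNN adjacency graph with Gaussian edge weights, then L = D - W."""
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # squared pairwise distances
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]                 # k nearest neighbours, skipping self
        W[i, nbrs] = np.exp(-d2[i, nbrs] / (2 * sigma ** 2))
    W = np.maximum(W, W.T)                                # symmetrize the graph
    return np.diag(W.sum(axis=1)) - W                     # graph Laplacian L = D - W

def update_d(norms, p):
    """Stand-in for Theorem 2: closed-form l_p-norm MKL kernel-weight update."""
    d = norms ** (2.0 / (p + 1))
    return d / np.linalg.norm(d, ord=p)                   # enforce ||d||_p = 1

# Toy run of the alternating loop skeleton (Step 3)
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
L = knn_gaussian_laplacian(X, k=3)

M, p, eps = 4, 2.0, 1e-3
d = np.full(M, 1.0 / M)                                   # Step 1: each d_m = 1/M
norms = rng.random(M) + 0.1                               # placeholder for ||w_m|| from the SVM step
for _ in range(50):
    d_new = update_d(norms, p)                            # current optimal d
    converged = np.linalg.norm(d_new - d) <= eps          # stopping rule ||d_{t+1} - d_t|| <= eps
    d = d_new
    if converged:
        break
```

With a real solver, `norms` would be recomputed from the optimal α of (24) at every iteration before updating d.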
3.4. Risk Bound Analysis
4. Experiments
- LapSVM [10]. An SVM with manifold regularization. It adopts a Gaussian kernel with parameter σ chosen from {0.1, 0.3, 0.5}; the actual parameter was selected by five-fold cross-validation;
- Transductive SVM (TSVM) [27]. This model drives the classification hyperplane through low-density regions of the data. It adopts a radial basis function kernel k(a,b) = exp(−r∥a−b∥2) with r = 0.1;
- Low-density separation (LDS) [28]. It builds a kernel function from graph-based distance metrics; its base kernel is the same as in TSVM;
- Harmonic function [29]. This method is essentially a labeling scheme: labels are propagated over a graph so that the similarities among all data samples are preserved. The resulting classifier has a harmonic property;
- Support vector machine (SVM). Implemented with LIBSVM [30], it adopts a radial basis function kernel whose parameter is set from the mean pairwise distance over all data samples. Only the labeled data are used to train the SVM;
- Graph-structured MKL (GMKL) [31]. An online MKL model in which the kernel combination is pruned using feedback collected from a graph, yielding a scheme that actively selects relevant kernels and offers time-efficiency advantages in the online setting.
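The SVM baseline above sets its RBF kernel parameter from the mean pairwise distance of the data. A minimal sketch of that heuristic, together with the kernel form k(a,b) = exp(−r∥a−b∥²) used by TSVM and LDS (function names are illustrative):

```python
import numpy as np

def rbf_kernel(X, r):
    """k(a, b) = exp(-r * ||a - b||^2), the RBF kernel used by the baselines."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # squared pairwise distances
    return np.exp(-r * d2)

def mean_distance_bandwidth(X):
    """Heuristic from the SVM baseline: derive r from the mean pairwise distance."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    mean_dist = np.sqrt(d2[np.triu_indices(len(X), k=1)]).mean()  # off-diagonal pairs only
    return 1.0 / (mean_dist ** 2)

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 4))
K = rbf_kernel(X, mean_distance_bandwidth(X))             # Gram matrix for the baseline SVM
```

TSVM and LDS instead fix r = 0.1 directly, as stated above.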
4.1. Experimental Settings
4.2. UCI Datasets
4.3. Benchmark Datasets
4.4. Impact of P-Norm Choice
5. Conclusions
Funding
Data Availability Statement
Conflicts of Interest
References
- Mahdi, A.A.; Omar, C.C. Learning with Multiple Kernels. IEEE Access 2024, 12, 56973–56980.
- Yan, W.; Li, Y.; Yang, M. Towards deeper match for multi-view oriented multiple kernel learning. Pattern Recognit. 2023, 134, 109119.
- Yuan, L.; Mei, W. Multiple Kernel Learning for Learner Classification. In Proceedings of the International Conference on Algorithms, Computing and Artificial Intelligence, Sanya, China, 22–24 December 2023; pp. 113–118.
- Rakotomamonjy, A.; Bach, F.; Canu, S.; Grandvalet, Y. SimpleMKL. J. Mach. Learn. Res. 2008, 9, 2491–2521.
- Liu, X.; Wang, L.; Huang, G.B. Multiple kernel extreme learning machine. Neurocomputing 2015, 149, 253–264.
- Aiolli, F.; Donini, M. EasyMKL: A scalable multiple kernel learning algorithm. Neurocomputing 2015, 169, 215–224.
- Rebai, I.; Benayed, Y.; Mahdi, W. Deep multilayer multiple kernel learning. Neural Comput. Appl. 2016, 27, 2305–2314.
- Zhao, S.; Ding, Y.; Liu, X.; Su, X. HKAM-MKM: A hybrid kernel alignment maximization-based multiple kernel model for identifying DNA-binding proteins. Comput. Biol. Med. 2022, 145, 105395.
- Rastogi, A. Nonlinear Tikhonov regularization in Hilbert scales for inverse learning. J. Complex. 2024, 82, 101824.
- Belkin, M.; Niyogi, P.; Sindhwani, V. Manifold Regularization: A Geometric Framework for Learning from Labeled and Unlabeled Examples. J. Mach. Learn. Res. 2006, 7, 2399–2434.
- Liang, X.; Yu, Q.; Zhang, K.; Zeng, P.; Jian, L. LapRamp: A noise resistant classification algorithm based on manifold regularization. Appl. Intell. 2023, 53, 23797–23811.
- Xing, Z.; Peng, J.; He, X.; Tian, M. Semi-supervised sparse subspace clustering with manifold regularization. Appl. Intell. 2024, 54, 6836–6845.
- He, D.; Sun, S.; Xie, L. Multi-target feature selection with subspace learning and manifold regularization. Neurocomputing 2024, 582, 127533.
- Ma, F.; Huo, S.; Liu, S.; Yang, F. Multimode Low-Rank Relaxation and Manifold Regularization for Hyperspectral Image Super-Resolution. IEEE Trans. Instrum. Meas. 2024, 73, 5019614.
- Niu, G.; Ma, Z.; Lv, S. Ensemble Multiple-Kernel Based Manifold Regularization. Neural Process. Lett. 2017, 45, 539–552.
- Gu, Y.; Wang, Q.; Xie, B. Multiple Kernel Sparse Representation for Airborne LiDAR Data Classification. IEEE Trans. Geosci. Remote Sens. 2017, 55, 1085–1105.
- Qi, J.; Liang, X.; Xu, R. A Multiple Kernel Learning Model Based on p-Norm. Comput. Intell. Neurosci. 2018, 2018, 1018789.
- Shervin, R.A. One-Class Classification Using p-Norm Multiple Kernel Fisher Null Approach. IEEE Trans. Image Process. 2023, 32, 1843–1856.
- Liu, X.; Wang, L.; Zhu, X.; Li, M.; Zhu, E.; Liu, T.; Liu, L.; Dou, Y.; Yin, J. Absent Multiple Kernel Learning Algorithms. IEEE Trans. Pattern Anal. Mach. Intell. 2020, 42, 1303–1316.
- Liu, Z.; Huang, S.; Jin, W.; Mu, Y. Local kernels based graph learning for multiple kernel clustering. Pattern Recognit. 2024, 150, 110300.
- Gao, Y.L.; Yin, M.M.; Liu, J.X.; Shang, J.; Zheng, C.H. MKL-LP: Predicting Disease-Associated Microbes with Multiple-Similarity Kernel Learning-Based Label Propagation. In Proceedings of the International Symposium on Bioinformatics Research and Applications, Shenzhen, China, 26–28 November 2021; Springer: Cham, Switzerland, 2021; pp. 3–10.
- Wang, R.; Xu, Y.; Yan, M. Sparse Representer Theorems for Learning in Reproducing Kernel Banach Spaces. J. Mach. Learn. Res. 2024, 25, 1–45.
- Kloft, M.; Blanchard, G. The local Rademacher complexity of lp-norm multiple kernel learning. Adv. Neural Inf. Process. Syst. 2011, 12, 2465–2502.
- Wang, Z.; Chen, D.; Che, X. Multi-kernel learning for multi-label classification with local Rademacher complexity. Inf. Sci. 2023, 647, 119462.
- Bache, K.; Lichman, M. UCI Machine Learning Repository; School of Information and Computer Science, University of California: Irvine, CA, USA, 2013.
- Chapelle, O.; Schölkopf, B.; Zien, A. Semi-Supervised Learning; MIT Press: Cambridge, MA, USA, 2006.
- Michalis, P.; Andreas, A. Least Squares Minimum Class Variance Support Vector Machines. Computing 2024, 13, 34.
- Vasilii, F.; Malik, T.; Aladin, V. Random Matrix Analysis to Balance between Supervised and Unsupervised Learning under the Low Density Separation Assumption. In Proceedings of the International Conference on Machine Learning, Honolulu, HI, USA, 23–29 July 2023; pp. 10008–10033.
- Celso, A.; Sousa, R. Kernelized Constrained Gaussian Fields and Harmonic Functions for Semi-supervised Learning. In Proceedings of the International Joint Conference on Neural Networks, Glasgow, UK, 19–24 July 2020; pp. 1–8.
- Tao, Z.; Li, Y.; Teng, Z.; Zhao, Y. A Method for Identifying Vesicle Transport Proteins Based on LibSVM and MRMD. Comput. Math. Methods Med. 2020, 2020, 8926750.
- Ghari, P.M.; Shen, Y. Online multi-kernel learning with graph-structured feedback. In Proceedings of the International Conference on Machine Learning, Virtual, 12–18 July 2020; pp. 3474–3483.
Classification accuracy (%) on the UCI datasets.

Method | Heart | German | Iono | Vote | Pima | Liver | Wdbc | Vehicle
---|---|---|---|---|---|---|---|---
p = 1 | 82.1 | 72.0 | 90.5 | 94.9 | 74.8 | 65.3 | 94.0 | 85.4 |
p = 4/3 | 78.6 | 72.0 | 90.1 | 91.2 | 74.3 | 66.5 | 91.7 | 90.9 |
p = 2 | 80.8 | 71.7 | 92.0 | 92.4 | 74.2 | 66.4 | 94.9 | 90.2 |
p = 3 | 81.3 | 72.3 | 92.4 | 93.1 | 74.7 | 68.8 | 94.2 | 90.1 |
p = 4 | 81.7 | 73.7 | 91.8 | 92.4 | 75.0 | 66.1 | 92.1 | 87.8 |
p = 8 | 81.8 | 74.3 | 92.8 | 92.8 | 75.5 | 67.5 | 94.1 | 89.1 |
p = 10 | 82.0 | 72.9 | 91.2 | 91.2 | 76.1 | 66.2 | 94.2 | 89.0 |
LapSVM | 81.8 | 71.8 | 84.6 | 91.3 | 80.0 | 68.6 | 85.2 | 82.2 |
TSVM | 81.3 | 68.7 | 82.9 | 93.1 | 73.8 | 65.3 | 92.4 | 83.6 |
harmonic | 78.8 | 68.1 | 85.4 | 92.1 | 71.5 | 63.9 | 93.9 | 83.2 |
LDS | 81.4 | 66.3 | 89.1 | 90.1 | 70.0 | 64.7 | 94.4 | 81.7 |
SVM | 81.4 | 71.2 | 93.0 | 92.6 | 73.1 | 64.7 | 91.7 | 81.2 |
GMKL | 81.9 | 73.8 | 92.9 | 93.7 | 76.6 | 67.5 | 94.3 | 90.1 |
Classification accuracy (%) on the semi-supervised benchmark datasets.

Method | Digit | USPS | COIL | g241c | g241n | BCI
---|---|---|---|---|---|---
p = 1 | 94.9 | 91.8 | 88.1 | 77.8 | 74.1 | 60.4 |
p = 4/3 | 93.5 | 87.9 | 85.9 | 79.6 | 74.4 | 60.0 |
p = 2 | 94.4 | 86.1 | 85.0 | 79.7 | 74.1 | 58.7 |
p = 3 | 94.6 | 85.8 | 85.1 | 79.8 | 74.4 | 60.3 |
p = 4 | 94.3 | 85.7 | 85.1 | 79.8 | 74.5 | 60.3 |
p = 8 | 94.4 | 85.4 | 85.2 | 78.6 | 74.4 | 62.6 |
p = 10 | 94.5 | 85.5 | 85.1 | 77.5 | 74.3 | 60.7 |
LapSVM | 94.8 | 82.1 | 86.2 | 55.4 | 57.6 | 60.7 |
TSVM | 92.3 | 86.5 | 80.5 | 79.1 | 74.1 | 60.5 |
harmonic | 92.9 | 91.4 | 82.1 | 57.8 | 58.9 | 53.1 |
LDS | 93.3 | 75.0 | 68.7 | 71.1 | 75.8 | 58.2 |
SVM | 92.9 | 85.5 | 79.4 | 74.2 | 72.4 | 56.4 |
GMKL | 94.6 | 90.1 | 86.2 | 78.4 | 75.6 | 60.3 |
Yang, T. Application of Non-Sparse Manifold Regularized Multiple Kernel Classifier. Mathematics 2025, 13, 1050. https://doi.org/10.3390/math13071050