Search Results (4)

Search Parameters:
Keywords = twin bounded support vector machine

27 pages, 478 KB  
Article
A Novel Robust Metric Distance Optimization-Driven Manifold Learning Framework for Semi-Supervised Pattern Classification
by Bao Ma, Jun Ma and Guolin Yu
Axioms 2023, 12(8), 737; https://doi.org/10.3390/axioms12080737 - 27 Jul 2023
Cited by 1 | Viewed by 1609
Abstract
In this work, we address the problem of improving the classification performance of machine learning models, especially in the presence of noisy and outlier data. To this end, we first design a generalized adaptive robust loss function called Vθ(x). Intuitively, Vθ(x) improves the robustness of the model by selecting different robust loss functions for different learning tasks via the adaptive parameter θ. Compared with other robust loss functions, Vθ(x) has several desirable properties, such as symmetry, boundedness, robustness, nonconvexity, and adaptivity, making it suitable for a wide range of machine learning applications. Secondly, a new robust semi-supervised learning framework for pattern classification is proposed. In this framework, the proposed robust loss function Vθ(x) and the capped L2,p-norm robust distance metric are introduced to improve the robustness and generalization performance of the model, especially when outliers lie far from the normal data distribution. Based on this framework, the Welsch manifold robust twin bounded support vector machine (WMRTBSVM) and its least-squares version are developed. Finally, two effective iterative optimization algorithms are designed, their convergence is proved, and their complexity is analyzed. Experimental results on several datasets with different noise settings and evaluation criteria show that our methods achieve better classification performance and robustness. On the Cancer dataset, the classification accuracies of the two proposed methods are 94.17% and 95.62% without noise, and 91.76% and 90.59% with 50% Gaussian noise, demonstrating satisfactory classification performance and robustness.
(This article belongs to the Special Issue Mathematics of Neural Networks: Models, Algorithms and Applications)
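
The exact form of Vθ(x) is defined in the paper itself, not in the abstract above. The sketch below only illustrates two ingredients the abstract names, a bounded symmetric Welsch-type loss and a capped L2,p-norm distance, in their standard textbook forms; the parameter names (sigma, p, eps) are illustrative assumptions rather than the paper's notation.

```python
import numpy as np

def welsch_loss(r, sigma=1.0):
    """Bounded, symmetric, non-convex Welsch loss of a residual r."""
    return (sigma ** 2 / 2.0) * (1.0 - np.exp(-(r / sigma) ** 2))

def capped_l2p_distance(X, Y, p=1.0, eps=2.0):
    """Capped L2,p distance between paired rows of X and Y:
    min(||x_i - y_i||_2^p, eps), which limits the influence of distant outliers."""
    d = np.linalg.norm(X - Y, axis=1) ** p
    return np.minimum(d, eps)

# Large residuals saturate instead of growing quadratically, which is
# what makes both terms robust to outliers.
residuals = np.array([0.1, 1.0, 10.0, 100.0])
print(welsch_loss(residuals))  # approaches sigma**2 / 2 as |r| grows
```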

25 pages, 440 KB  
Article
Capped L2,p-Norm Metric Based on Robust Twin Support Vector Machine with Welsch Loss
by Haoyu Wang, Guolin Yu and Jun Ma
Symmetry 2023, 15(5), 1076; https://doi.org/10.3390/sym15051076 - 12 May 2023
Cited by 7 | Viewed by 1931
Abstract
A twin bounded support vector machine (TBSVM) is a symmetry-based classifier that improves on the traditional support vector machine classification algorithm. In this paper, we propose an improved model based on the TBSVM, called the Welsch loss with capped L2,p-norm distance metric robust twin bounded support vector machine (WCTBSVM). On the one hand, introducing the capped L2,p-norm metric into the TBSVM resolves the non-sparse output of the regularization term; thus, the generalization ability and robustness of the TBSVM are improved and the principle of structural risk minimization is realized. On the other hand, a bounded, smooth, and non-convex Welsch loss function is introduced to reduce the influence of noise, which further improves the classification performance of the TBSVM. A half-quadratic programming algorithm is used to handle the non-convexity introduced by the Welsch loss. As a result, the WCTBSVM is more robust and effective than the TBSVM in dealing with noise. In addition, to reduce the time complexity and speed up convergence, we construct a least-squares version of the WCTBSVM, named the fast WCTBSVM (FWCTBSVM). Experimental results on both UCI and artificial datasets show that our models achieve better classification performance.
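
As a rough illustration of the half-quadratic programming step the abstract mentions, the sketch below alternates between closed-form Welsch-induced sample weights and a weighted convex subproblem. A weighted ridge-regression surrogate stands in for the actual WCTBSVM subproblem, so this is a hedged sketch of the general technique, not the paper's solver; sigma, lam, and n_iter are assumed parameters.

```python
import numpy as np

def half_quadratic_fit(X, y, sigma=1.0, lam=1e-2, n_iter=20):
    """Alternate between Welsch-induced sample weights and a weighted
    ridge subproblem (a generic half-quadratic scheme, not the WCTBSVM solver)."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        r = X @ w - y                      # residuals under the current model
        s = np.exp(-(r / sigma) ** 2)      # Welsch weights: outliers get ~0 weight
        Xs = X * s[:, None]                # row-weighted design matrix
        w = np.linalg.solve(X.T @ Xs + lam * np.eye(d), Xs.T @ y)
    return w

# Toy usage: a linear model with a few gross outliers in the targets.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.01 * rng.normal(size=100)
y[:5] += 20.0                              # inject outliers
print(half_quadratic_fit(X, y))            # roughly recovers [1.0, -2.0, 0.5]
```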

14 pages, 7641 KB  
Article
Gas Pipeline Leakage Detection Method Based on IUPLCD and GS-TBSVM
by Haiou Shan and Yongqiang Zhu
Processes 2023, 11(1), 278; https://doi.org/10.3390/pr11010278 - 15 Jan 2023
Cited by 3 | Viewed by 2075
Abstract
To improve the accuracy of gas pipeline leakage identification and reduce the false alarm rate, a pipeline leakage detection method based on improved uniform-phase local characteristic-scale decomposition (IUPLCD) and a grid-search-optimized twin bounded support vector machine (GS-TBSVM) was proposed. First, the signal was decomposed into several intrinsic scale components (ISCs) by the UPLCD algorithm. Then, the signal reconstruction process of UPLCD was optimized according to the energy and the standard deviation of the amplitude of each ISC: the ISCs dominated by the signal were selected for reconstruction, yielding the denoised signal. Finally, the TBSVM was optimized using a grid search algorithm, and a GS-TBSVM model for pipeline leakage identification was constructed. The input of the GS-TBSVM model is the data processed by the IUPLCD algorithm, and the output is the real-time working condition of the gas pipeline. The experimental results show that IUPLCD effectively filters the noise in the signal and GS-TBSVM accurately identifies the working condition of the gas pipeline, with a maximum identification accuracy of 98.4%.
(This article belongs to the Special Issue Process Monitoring and Fault Diagnosis)
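
A minimal sketch of the grid-search step follows. TBSVM is not available in scikit-learn, so an RBF SVC stands in for the classifier being tuned, the random arrays stand in for features extracted from the IUPLCD-denoised signal, and the parameter grid is an illustrative assumption rather than the paper's search space.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.svm import SVC

# Placeholder data: in the paper the features would come from the
# IUPLCD-denoised signal (e.g., per-ISC energies), and the labels are
# pipeline working conditions (leak vs. normal operation).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
y = rng.integers(0, 2, size=200)

# Exhaustive grid search with cross-validation over the stand-in classifier.
param_grid = {"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid,
                      cv=StratifiedKFold(n_splits=5), scoring="accuracy")
search.fit(X, y)
print(search.best_params_, search.best_score_)
```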

19 pages, 756 KB  
Article
A Multiclass Nonparallel Parametric-Margin Support Vector Machine
by Shu-Wang Du, Ming-Chuan Zhang, Pei Chen, Hui-Feng Sun, Wei-Jie Chen and Yuan-Hai Shao
Information 2021, 12(12), 515; https://doi.org/10.3390/info12120515 - 10 Dec 2021
Cited by 5 | Viewed by 3130
Abstract
The twin parametric-margin support vector machine (TPMSVM) is an excellent kernel-based nonparallel classifier. However, TPMSVM was originally designed for binary classification, which makes it unsuitable for real-world multiclass applications. Therefore, this paper extends TPMSVM to multiclass classification and proposes a novel K multiclass nonparallel parametric-margin support vector machine (MNP-KSVC). Specifically, MNP-KSVC enjoys the following characteristics. (1) Under the "one-versus-one-versus-rest" multiclass framework, MNP-KSVC encodes the complicated multiclass learning task into a series of subproblems with the ternary output {-1, 0, +1}. In contrast to the "one-versus-one" or "one-versus-rest" strategies, each subproblem not only focuses on separating the two selected classes but also considers the side information of the remaining classes. (2) MNP-KSVC seeks a pair of nonparallel parametric-margin hyperplanes for each subproblem. As a result, each hyperplane is closer to its corresponding class and at a distance of at least one from the other class, while the remaining class instances are bounded within an insensitive region. (3) MNP-KSVC uses a hybrid classification and regression loss combined with regularization to formulate its optimization model, and the optimal solutions are derived from the corresponding dual problems. Finally, we conduct numerical experiments comparing the proposed method with four state-of-the-art multiclass models: Multi-SVM, MBSVM, MTPMSVM, and Twin-KSVC. The experimental results demonstrate the feasibility and effectiveness of MNP-KSVC in terms of multiclass accuracy and learning time.
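
The "one-versus-one-versus-rest" encoding described above can be sketched as follows: every pair of classes yields a subproblem whose labels are +1 for the first class, -1 for the second, and 0 for all remaining classes. The helper name ovovr_subproblems is hypothetical, and the per-subproblem MNP-KSVC optimization itself is not reproduced here.

```python
import itertools
import numpy as np

def ovovr_subproblems(y):
    """Yield (i, j, t): ternary labels t with +1 for class i, -1 for class j, 0 otherwise."""
    for i, j in itertools.combinations(np.unique(y), 2):
        t = np.zeros_like(y, dtype=int)
        t[y == i] = 1
        t[y == j] = -1
        yield i, j, t

# Toy usage with three classes: each pair produces one ternary subproblem.
y = np.array([0, 0, 1, 1, 2, 2, 2])
for i, j, t in ovovr_subproblems(y):
    print(f"pair ({i}, {j}):", t)   # e.g. pair (0, 1): [ 1  1 -1 -1  0  0  0]
```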
