Open Access Article
Entropy 2019, 21(4), 403; https://doi.org/10.3390/e21040403

Robust Variable Selection and Estimation Based on Kernel Modal Regression

College of Science, Huazhong Agricultural University, Wuhan 430070, China
* Authors to whom correspondence should be addressed.
These authors contributed equally to this work.
Received: 19 February 2019 / Revised: 8 April 2019 / Accepted: 9 April 2019 / Published: 16 April 2019
(This article belongs to the Section Information Theory, Probability and Statistics)
PDF [394 KB, uploaded 16 April 2019]

Abstract

Model-free variable selection has attracted increasing interest recently due to its flexibility in algorithmic design and outstanding performance in real-world applications. However, most existing statistical methods are formulated under the mean square error (MSE) criterion and are therefore susceptible to non-Gaussian noise and outliers. Because the MSE criterion requires the data to satisfy a Gaussian noise condition, it can hamper the effectiveness of model-free methods in complex circumstances. To circumvent this issue, we present a new model-free variable selection algorithm that integrates kernel modal regression with gradient-based variable identification. The derived modal regression estimator is closely related to information-theoretic learning under the maximum correntropy criterion, and it ensures algorithmic robustness to complex noise by replacing learning of the conditional mean with learning of the conditional mode. The gradient information of the estimator offers a model-free metric for screening the key variables. In theory, we investigate the foundations of the new model in terms of its generalization bound and variable selection consistency. In applications, the effectiveness of the proposed method is verified by data experiments.
Keywords: modal regression; maximum correntropy criterion; variable selection; reproducing kernel Hilbert space; generalization error
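
To make the approach concrete, below is a minimal NumPy sketch (not the authors' implementation) of the two ingredients summarized in the abstract: a kernel modal regression estimator fitted under the maximum correntropy criterion via half-quadratic reweighting, followed by gradient-based scores for model-free variable screening. The bandwidths, regularization parameter, and the toy data are illustrative assumptions.

# A minimal sketch, assuming a Gaussian RKHS and half-quadratic optimization;
# parameter values below are illustrative, not the paper's settings.
import numpy as np

def gaussian_gram(X, Z, h):
    """Gaussian kernel matrix K[i, j] = exp(-||X_i - Z_j||^2 / (2 h^2))."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * h ** 2))

def fit_modal_krr(X, y, h=2.0, sigma=1.0, lam=1e-2, iters=30):
    """Kernel modal regression under the maximum correntropy criterion.
    Each half-quadratic step is a weighted kernel ridge solve with weights
    w_i = exp(-(y_i - f(x_i))^2 / (2 sigma^2)) induced by the correntropy loss."""
    n = len(y)
    K = gaussian_gram(X, X, h)
    alpha = np.zeros(n)
    for _ in range(iters):
        r = y - K @ alpha                       # residuals under current fit
        w = np.exp(-r ** 2 / (2 * sigma ** 2))  # correntropy-induced weights
        alpha = np.linalg.solve(np.diag(w) @ K + lam * np.eye(n), w * y)
    return alpha

def gradient_scores(X, alpha, h=2.0):
    """Model-free importance of each variable: average |df/dx_j| over the
    sample, using the closed-form gradient of the Gaussian-kernel expansion."""
    K = gaussian_gram(X, X, h)                  # K[i, l] = k(x_i, x_l)
    diff = X[:, None, :] - X[None, :, :]        # x_i - x_l
    grads = -(K[:, :, None] * diff * alpha[None, :, None]).sum(1) / h ** 2
    return np.abs(grads).mean(0)

# Toy check: only the first two of five variables matter; noise is heavy-tailed.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.standard_t(df=2, size=200)
alpha = fit_modal_krr(X, y)
print(gradient_scores(X, alpha))  # scores for x1 and x2 should dominate

In this toy run the gradient scores of the two informative variables should dominate those of the pure-noise variables even under heavy-tailed noise, which is the robustness behavior the abstract attributes to replacing the conditional mean with the conditional mode.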
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
MDPI and ACS Style

Guo, C.; Song, B.; Wang, Y.; Chen, H.; Xiong, H. Robust Variable Selection and Estimation Based on Kernel Modal Regression. Entropy 2019, 21, 403.

