Open Access Article

Analysis of Information-Based Nonparametric Variable Selection Criteria

by Małgorzata Łazęcka and Jan Mielniczuk *
1. Institute of Computer Science, Polish Academy of Sciences, Jana Kazimierza 5, 01-248 Warsaw, Poland
2. Faculty of Mathematics and Information Science, Warsaw University of Technology, Koszykowa 75, 00-662 Warsaw, Poland
* Author to whom correspondence should be addressed.
Entropy 2020, 22(9), 974; https://doi.org/10.3390/e22090974
Received: 6 August 2020 / Revised: 28 August 2020 / Accepted: 28 August 2020 / Published: 31 August 2020
We consider a nonparametric Generative Tree Model and discuss the problem of selecting active predictors for the response in such a scenario. We investigate two popular information-based selection criteria: Conditional Infomax Feature Extraction (CIFE) and Joint Mutual Information (JMI), both derived as approximations of the Conditional Mutual Information (CMI) criterion. We show that CIFE and JMI may exhibit behavior different from that of CMI, resulting in different orders in which predictors are chosen in the variable selection process. Explicit formulae for CMI and its two approximations in the generative tree model are obtained. As a byproduct, we establish expressions for the entropy of a multivariate Gaussian mixture and its mutual information with the mixing distribution.
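The three criteria compared in the abstract can be sketched with simple plug-in estimates on discrete data. This is not the paper's own code: the helper names are illustrative, and the CIFE/JMI scores below follow the standard redundancy-corrected decompositions (CIFE subtracts the full redundancy sum over already-selected predictors, JMI averages it), which is what the abstract refers to when it calls them approximations of CMI.

```python
# Sketch (illustrative, not from the paper): plug-in estimates of the
# CMI, CIFE and JMI selection scores for discrete data.
from collections import Counter
import math

def _entropy(samples):
    """Empirical (plug-in) entropy of a sequence of hashable outcomes, in nats."""
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in Counter(samples).values())

def mi(x, y):
    """Plug-in mutual information I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return _entropy(x) + _entropy(y) - _entropy(list(zip(x, y)))

def cmi(x, y, z):
    """Plug-in conditional mutual information I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)."""
    return (_entropy(list(zip(x, z))) + _entropy(list(zip(y, z)))
            - _entropy(list(zip(x, y, z))) - _entropy(z))

def cife_score(xk, y, selected):
    """CIFE: I(Xk;Y) - sum over selected Xj of [I(Xk;Xj) - I(Xk;Xj|Y)]."""
    return mi(xk, y) - sum(mi(xk, xj) - cmi(xk, xj, y) for xj in selected)

def jmi_score(xk, y, selected):
    """JMI: as CIFE, but the redundancy term is averaged over the selected set."""
    if not selected:
        return mi(xk, y)
    redundancy = sum(mi(xk, xj) - cmi(xk, xj, y) for xj in selected)
    return mi(xk, y) - redundancy / len(selected)
```

For example, a candidate predictor that duplicates an already-selected one gets a CIFE score of zero, while its marginal mutual information with the response is still positive; this is the kind of divergence between the criteria that the paper analyzes in the generative tree model.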
Keywords: conditional mutual information; CMI; information measures; nonparametric variable selection criteria; Gaussian mixture; conditional infomax feature extraction; CIFE; joint mutual information criterion; JMI; generative tree model; Markov blanket
Figure 1
MDPI and ACS Style

Łazęcka, M.; Mielniczuk, J. Analysis of Information-Based Nonparametric Variable Selection Criteria. Entropy 2020, 22, 974. https://doi.org/10.3390/e22090974

AMA Style

Łazęcka M, Mielniczuk J. Analysis of Information-Based Nonparametric Variable Selection Criteria. Entropy. 2020; 22(9):974. https://doi.org/10.3390/e22090974

Chicago/Turabian Style

Łazęcka, Małgorzata, and Jan Mielniczuk. 2020. "Analysis of Information-Based Nonparametric Variable Selection Criteria" Entropy 22, no. 9: 974. https://doi.org/10.3390/e22090974

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.