Open Access Article
Entropy 2017, 19(4), 157; doi:10.3390/e19040157

Quadratic Mutual Information Feature Selection

University of Ljubljana, Faculty of Computer and Information Science, Ljubljana 1000, Slovenia
* Author to whom correspondence should be addressed.
Received: 13 December 2016 / Revised: 27 March 2017 / Accepted: 30 March 2017 / Published: 1 April 2017
(This article belongs to the Collection Advances in Applied Statistical Mechanics)

Abstract

We propose a novel feature selection method based on quadratic mutual information, which has its roots in the Cauchy–Schwarz divergence and Rényi entropy. The method estimates quadratic mutual information directly from data samples using Gaussian kernel functions and can detect second-order non-linear relations. Its main advantages are: (i) a unified analysis of discrete and continuous data that avoids any discretization; and (ii) its parameter-free design. The effectiveness of the proposed method is demonstrated through an extensive comparison with mutual information feature selection (MIFS), minimum redundancy maximum relevance (MRMR), and joint mutual information (JMI) on classification and regression problem domains. The experiments show that the proposed method performs comparably to the other methods on classification problems while being considerably faster. On regression problems it compares favourably to the others, but is slower.
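To make the estimator concrete, the sketch below shows how Cauchy–Schwarz quadratic mutual information can be computed directly from samples with Gaussian kernels, using the standard convolution identity for Gaussian Parzen windows. This is a minimal one-dimensional illustration under our own assumptions: the function names (`gaussian_gram`, `qmi_cs`) and the Silverman-style bandwidth rule are illustrative choices, not necessarily those used in the paper.

```python
import numpy as np

def gaussian_gram(v, sigma):
    """Pairwise matrix G(v_i - v_j) with variance 2*sigma^2.

    Doubling the variance uses the fact that the integral of a product of
    two Gaussian kernels is itself a Gaussian with summed variances, which
    turns the density integrals below into plain double sums.
    """
    d = v[:, None] - v[None, :]
    s2 = 2.0 * sigma ** 2
    return np.exp(-d ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)

def qmi_cs(x, y, sigma_x=None, sigma_y=None):
    """Cauchy-Schwarz quadratic mutual information between 1-D samples x, y."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    # Silverman-style bandwidths (an assumption, not taken from the paper).
    if sigma_x is None:
        sigma_x = 1.06 * np.std(x) * n ** (-0.2)
    if sigma_y is None:
        sigma_y = 1.06 * np.std(y) * n ** (-0.2)
    Kx = gaussian_gram(x, sigma_x)
    Ky = gaussian_gram(y, sigma_y)
    v_joint = (Kx * Ky).sum() / n ** 2                # integral of p(x,y)^2
    v_marg = Kx.sum() * Ky.sum() / n ** 4             # integral of p(x)^2 p(y)^2
    v_cross = (Kx.sum(axis=1) * Ky.sum(axis=1)).sum() / n ** 3  # cross term
    # Cauchy-Schwarz guarantees v_joint * v_marg >= v_cross^2, so this is >= 0.
    return np.log(v_joint * v_marg / v_cross ** 2)
```

A simple filter-style selector would then score each candidate feature against the target with `qmi_cs(feature, target)`; how the full method trades off relevance against redundancy among selected features is detailed in the paper itself.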
Keywords: feature selection; information-theoretic measures; quadratic mutual information; Cauchy–Schwarz divergence

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Sluga, D.; Lotrič, U. Quadratic Mutual Information Feature Selection. Entropy 2017, 19, 157.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
