Entropy 2008, 10(4), 745-756; doi:10.3390/e10040745
Article

An Assessment of Hermite Function Based Approximations of Mutual Information Applied to Independent Component Analysis

C3ID, DSTO, PO Box, Edinburgh, SA 5111, Australia
Received: 23 May 2008 / Accepted: 28 November 2008 / Published: 4 December 2008

Abstract

At the heart of many ICA techniques is a nonparametric estimate of an information measure, usually obtained via nonparametric density estimation, for example, kernel density estimation. Although less popular than kernel density estimators, orthogonal functions can also be used for nonparametric density estimation, via a truncated series expansion whose coefficients are calculated from the observed data. While such estimators do not necessarily yield a valid density, as kernel density estimators do, they are faster to calculate than kernel density estimators, in particular for a modified version of Renyi's entropy of order 2. In this paper, we compare the performance of ICA using Hermite series based estimates of Shannon's and Renyi's mutual information with that of Gaussian kernel based estimates. The comparisons also include ICA using the RADICAL estimate of Shannon's entropy and a FastICA estimate of negentropy.
Keywords: ICA; nonparametric estimation; Hermite functions; kernel density estimation
This is an open access article distributed under the Creative Commons Attribution License (CC BY 3.0).
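The computational advantage alluded to in the abstract comes from orthonormality: if the density is approximated by a truncated series in orthonormal Hermite functions, the coefficients are simple sample means of the basis functions, and the integral of the squared density (hence Renyi's entropy of order 2) collapses to a sum of squared coefficients. The following is a minimal sketch of that idea in Python for a one-dimensional sample; it is not the paper's implementation, and the function names, the default truncation order K = 10, and the assumption of roughly standardised data (as after whitening in ICA) are illustrative choices.

```python
# Sketch: truncated Hermite-function series density estimate and the
# resulting Renyi order-2 entropy estimate (illustrative, not the paper's code).
import numpy as np

def hermite_functions(x, K):
    """Orthonormal Hermite functions psi_0..psi_K at points x, shape (K+1, len(x))."""
    x = np.asarray(x, dtype=float)
    psi = np.empty((K + 1, x.size))
    psi[0] = np.pi ** -0.25 * np.exp(-0.5 * x**2)
    if K >= 1:
        psi[1] = np.sqrt(2.0) * x * psi[0]
    # Stable three-term recurrence for the orthonormal Hermite functions.
    for k in range(1, K):
        psi[k + 1] = (np.sqrt(2.0 / (k + 1)) * x * psi[k]
                      - np.sqrt(k / (k + 1)) * psi[k - 1])
    return psi

def hermite_series_coeffs(sample, K):
    """c_k = E[psi_k(X)], estimated by the sample mean of psi_k over the data."""
    return hermite_functions(sample, K).mean(axis=1)

def renyi2_entropy_estimate(sample, K=10):
    """-log(integral of fhat^2); by orthonormality this is -log(sum of c_k^2)."""
    c = hermite_series_coeffs(sample, K)
    return -np.log(np.sum(c**2))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal(2000)   # toy data, assumed roughly standardised
    print(renyi2_entropy_estimate(x, K=10))
    print(0.5 * np.log(4 * np.pi))  # exact order-2 Renyi entropy of N(0,1)
```

For the standard normal sample in this toy example, the true order-2 Renyi entropy is 0.5 log(4π) ≈ 1.266, and the coefficient-based estimate should come close, since the standard normal density is proportional to the zeroth Hermite function.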

Share & Cite This Article

MDPI and ACS Style

Sorensen, J. An Assessment of Hermite Function Based Approximations of Mutual Information Applied to Independent Component Analysis. Entropy 2008, 10, 745-756.
