Open Access Feature Paper Article
Entropy 2018, 20(3), 203; https://doi.org/10.3390/e20030203

Computational Information Geometry for Binary Classification of High-Dimensional Random Tensors

1 Laboratory of Signals and Systems (L2S), Department of Signals and Statistics, University of Paris-Sud, 91400 Orsay, France
2 Computer Science Department LIX, École Polytechnique, 91120 Palaiseau, France
3 Sony Computer Science Laboratories, Tokyo 141-0022, Japan
The results presented in this work were partially published in the 2017 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), New Orleans, LA, USA, 5–9 March 2017, and the 2017 25th European Signal Processing Conference (EUSIPCO), Kos, Greece, 28 August–2 September 2017.
* Author to whom correspondence should be addressed.
Received: 25 January 2018 / Revised: 13 March 2018 / Accepted: 14 March 2018 / Published: 17 March 2018

Abstract

Evaluating the performance of Bayesian classification for a high-dimensional random tensor is a fundamental and under-studied problem. In this work, we consider two Signal-to-Noise-Ratio (SNR)-based binary classification problems of interest. Under the alternative hypothesis, i.e., for a non-zero SNR, the observed signal is either a noisy rank-R tensor admitting a Q-order Canonical Polyadic Decomposition (CPD) with large factors of size N_q × R, 1 ≤ q ≤ Q, where R, N_q → ∞ with R^{1/q}/N_q converging towards a finite constant, or a noisy tensor admitting a Tucker Decomposition (TKD) of multilinear (M_1, …, M_Q)-rank with large factors of size N_q × M_q, 1 ≤ q ≤ Q, where N_q, M_q → ∞ with M_q/N_q converging towards a finite constant. The classification of the random entries (coefficients) of the core tensor in the CPD/TKD is hard to study, since an exact derivation of the minimal Bayes error probability is mathematically intractable. To circumvent this difficulty, the Chernoff Upper Bound (CUB) at higher SNR and the Fisher information at low SNR are derived and studied, based on information geometry theory. The tightest CUB is reached at the value s* minimizing the error exponent. In general, due to the asymmetry of the s-divergence, the Bhattacharyya Upper Bound (BUB), i.e., the Chernoff information evaluated at s = 1/2, cannot solve this problem effectively; as a consequence, one must rely on a costly numerical optimization strategy to find s*. However, thanks to powerful random matrix theory tools, a simple analytical expression of s* is provided with respect to the SNR in the two schemes considered. This work shows that the BUB is the tightest bound at low SNR; however, at higher SNR, this property no longer holds.
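The numerical strategy mentioned in the abstract can be illustrated with a minimal sketch: for two zero-mean Gaussian hypotheses, the Chernoff s-divergence is concave in s on (0, 1), and the tightest Chernoff bound corresponds to the s* maximizing it, while the Bhattacharyya bound fixes s = 1/2. The toy model below (pure noise vs. noise plus a rank-1 component at a given SNR) and all names in it are illustrative assumptions, not the paper's tensor observation model.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def chernoff_divergence(s, cov0, cov1):
    """s-divergence between zero-mean Gaussians N(0, cov0) and N(0, cov1)."""
    mix = s * cov0 + (1 - s) * cov1
    _, logdet_mix = np.linalg.slogdet(mix)
    _, logdet0 = np.linalg.slogdet(cov0)
    _, logdet1 = np.linalg.slogdet(cov1)
    return 0.5 * (logdet_mix - s * logdet0 - (1 - s) * logdet1)

def optimal_s(cov0, cov1):
    """Numerically find s* in (0, 1) giving the tightest Chernoff bound."""
    res = minimize_scalar(lambda s: -chernoff_divergence(s, cov0, cov1),
                          bounds=(1e-6, 1 - 1e-6), method="bounded")
    return res.x, -res.fun

# Toy detection problem: H0 is white noise, H1 adds a rank-1 signal at SNR 2.
rng = np.random.default_rng(0)
n, snr = 8, 2.0
u = rng.standard_normal((n, 1))
u /= np.linalg.norm(u)
cov0 = np.eye(n)
cov1 = np.eye(n) + snr * (u @ u.T)

s_star, exponent = optimal_s(cov0, cov1)
bub_exponent = chernoff_divergence(0.5, cov0, cov1)  # Bhattacharyya, s = 1/2
print(s_star, exponent, bub_exponent)
```

By construction the optimized exponent is at least the Bhattacharyya one; the gap between the two is what makes the asymmetric s* worth computing at higher SNR.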
Keywords: optimal Bayesian detection; information geometry; minimal error probability; Chernoff/Bhattacharyya upper bound; large random tensor; Fisher information; large random sensing matrix
Figures

Figure 1

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Pham, G.-T.; Boyer, R.; Nielsen, F. Computational Information Geometry for Binary Classification of High-Dimensional Random Tensors. Entropy 2018, 20, 203.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.