Open Access Article
Entropy 2018, 20(6), 401; https://doi.org/10.3390/e20060401

Recognizing Information Feature Variation: Message Importance Transfer Measure and Its Applications in Big Data

Department of Electronic Engineering, Tsinghua University, Beijing 100084, China
* Author to whom correspondence should be addressed.
Received: 25 April 2018 / Revised: 11 May 2018 / Accepted: 12 May 2018 / Published: 24 May 2018
(This article belongs to the Special Issue Information Theory in Machine Learning and Data Science)

Abstract

Information transfer that characterizes the variation of information features can have a crucial impact on big data analytics and processing. In fact, a measure of information transfer can reflect changes in a system statistically through the variable distributions, in a manner similar to the Kullback-Leibler (KL) divergence and the Rényi divergence. Furthermore, in big data, small-probability events may, to some degree, carry the most important part of the total message during information transfer. Therefore, it is significant to propose an information transfer measure that accounts for message importance from the viewpoint of small-probability events. In this paper, we present the message importance transfer measure (MITM) and analyze its performance and applications in three respects. First, we discuss the robustness of the MITM by using it to measure information distance. Then, we present a message importance transfer capacity based on the MITM and give an upper bound for the information transfer process with disturbance. Finally, we apply the MITM to the problem of queue length selection, which is fundamental to caching operations in mobile edge computing.
Keywords: information transfer measure; small probability events; big data analysis and processing; mobile edge computing (MEC); queue theory
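The MITM itself is defined in the body of the paper rather than in this abstract. As background for the comparison the abstract draws, the sketch below computes the two reference divergences it mentions, the KL divergence and the Rényi divergence, for discrete distributions; the function names and example distributions are illustrative and not taken from the paper.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions, in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute nothing to the sum
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha (alpha > 0, alpha != 1), in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

# Illustrative example: a distribution dominated by one high-probability event (p)
# versus one that shifts more mass onto small-probability events (q).
p = [0.90, 0.07, 0.03]
q = [0.70, 0.20, 0.10]
print(kl_divergence(p, q))        # KL divergence between the two distributions
print(renyi_divergence(p, q, 2))  # order-2 Rényi divergence
```

Both quantities grow as the two distributions diverge, but they weight events by their probabilities in different ways; the paper's MITM, by contrast, is constructed to emphasize variations in the small-probability events themselves.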
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

MDPI and ACS Style

She, R.; Liu, S.; Fan, P. Recognizing Information Feature Variation: Message Importance Transfer Measure and Its Applications in Big Data. Entropy 2018, 20, 401.


Entropy, EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.