Abstract
Many graph invariants have been used to construct entropy-based measures for characterizing the structure of complex networks. The starting point has always been to assign a probability distribution to a network and then apply Shannon’s entropy. In particular, Cao et al. (2014 and 2015) defined special graph entropy measures based on degree powers. In this paper, we obtain lower and upper bounds for these measures and characterize extremal graphs. Moreover, we resolve one part of a conjecture stated by Cao et al.
MSC:
05C07
1. Introduction
Graph entropy measures have played an important role in a variety of fields, including information theory, biology, chemistry, and sociology. The entropy of a graph was first introduced by Mowshowitz [1] and Trucco [2]. Afterwards, Dehmer and Mowshowitz [3] interpreted the entropy of a graph based on vertex orbits as its structural information content. This measure has been used as a graph complexity measure and as a measure of symmetry. Several graph entropies have been used extensively to characterize the topology of networks [3].
Dehmer [4] introduced information functionals that capture, in some sense, the structural information of the underlying graph G. Several graph invariants, such as the number of vertices, the number of edges, distances, vertex degree sequences, extended degree sequences (i.e., second neighbors, third neighbors, etc.), degree powers and connections, have been used for developing entropy-based measures [3,4,5,6]. In fact, the degree power is one of the most important graph invariants in graph theory. In [5,7], Cao et al. studied properties of graph entropies that are based on an information functional using degree powers of graphs. For results on properties of degree powers, we refer to [8,9,10,11,12]. In view of the vast amount of existing graph entropy measures [4,13], there has been very little work to find their extremal values [14]. A reason for this might be the fact that Shannon’s entropy represents a multivariate function and all probability values are nonzero when considering graph entropies. As Dehmer and Kraus [14] observed, determining minimal values of graph entropies is intricate because there is a lack of analytical methods to tackle this particular problem. In this paper we study novel properties of graph entropies that are based on an information functional using degree powers of graphs. In particular, we prove that the path attains the maximal graph entropy among all trees T (one part of the conjecture given in [5]). Moreover, we obtain bounds on the graph entropy in terms of the maximum degree and minimum degree of a graph.
2. Preliminaries
Let $G = (V, E)$ be a graph with $n$ vertices. For $v_i \in V$, $d_i$ is the degree of the vertex $v_i$ in $G$. The maximum vertex degree is denoted by Δ and the minimum vertex degree by δ.
The vertex degree is an important graph invariant, which is related to many properties of graphs. Let $G$ be a graph of order $n$ with degree sequence $(d_1, d_2, \ldots, d_n)$. The sum of degree powers of a graph $G$ is defined by $\sum_{i=1}^{n} d_i^{\,k}$, where $k$ is an arbitrary real number. Sharp bounds for the sum of the $k$-th powers of the degrees of the vertices of a graph $G$ were obtained by Cioabǎ in [15].
The definition of Shannon’s entropy [16] is as follows: let $p = (p_1, p_2, \ldots, p_n)$ be a probability vector, namely, $0 \le p_i \le 1$ and $\sum_{i=1}^{n} p_i = 1$. The Shannon entropy of $p$ is defined as
$$I(p) = -\sum_{i=1}^{n} p_i \log p_i.$$
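For illustration, the following short Python sketch evaluates this quantity for a given probability vector (the function name and the tolerance check are ours, added purely as an example):

```python
import math

def shannon_entropy(p):
    """Shannon entropy (base 2) of a probability vector p with nonzero entries."""
    assert abs(sum(p) - 1.0) < 1e-9, "p must sum to 1"
    return -sum(pi * math.log2(pi) for pi in p)

# The uniform vector maximizes the entropy: log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.357
```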
Recently, Cao et al. [5] introduced the following special graph entropy, based on the information functional $f(v_i) = d_i^{\,k}$:
$$I_f(G) = \log\left(\sum_{i=1}^{n} d_i^{\,k}\right) - \sum_{i=1}^{n} \frac{d_i^{\,k}}{\sum_{j=1}^{n} d_j^{\,k}}\,\log d_i^{\,k}, \qquad (1)$$
where $d_i$ is the degree of the vertex $v_i$ in $G$. According to [3], we see that $I_f(G) = -\sum_{i=1}^{n} p_i \log p_i$ with $p_i = d_i^{\,k} \big/ \sum_{j=1}^{n} d_j^{\,k}$, that is, $I_f(G)$ is the Shannon entropy of the probability distribution induced by the degree powers. Throughout the paper all logarithms have base 2.
A graph $G$ is said to be $r$-regular if all of its vertices have the same degree $r$. For an $r$-regular graph, $I_f(G) = \log n$, since in this case the induced probability distribution is uniform, i.e., $p_i = 1/n$ for all $i$.
Throughout this paper we use $P_n$ and $C_n$ to denote the path graph and the cycle graph on $n$ vertices, respectively. We obtain
$$I_f(P_n) = \log\bigl(2 + (n-2)\,2^{k}\bigr) - \frac{k\,(n-2)\,2^{k}}{2 + (n-2)\,2^{k}} \qquad \text{and} \qquad I_f(C_n) = \log n.$$
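As a sanity check on these values, the following Python sketch evaluates the entropy directly from a degree sequence, assuming the reading of Equation (1) as the Shannon entropy of the distribution proportional to the degree powers (all function and variable names are ours):

```python
import math

def degree_power_entropy(degrees, k):
    """I_f(G) for f(v_i) = d_i^k: Shannon entropy (base 2) of the distribution
    p_i = d_i^k / sum_j d_j^k, computed from the degree sequence alone."""
    powers = [d ** k for d in degrees]
    total = sum(powers)
    return -sum(w / total * math.log2(w / total) for w in powers)

n, k = 10, 2.0
cycle = [2] * n                 # C_n is 2-regular, so the value should be log2(n)
path = [1, 1] + [2] * (n - 2)   # P_n has two end vertices of degree 1
print(degree_power_entropy(cycle, k), math.log2(n))   # both ~3.3219
closed_form = math.log2(2 + (n - 2) * 2 ** k) - k * (n - 2) * 2 ** k / (2 + (n - 2) * 2 ** k)
print(degree_power_entropy(path, k), closed_form)     # both ~3.2051
```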
The following conjecture has been published in [5]:
Conjecture 1.
Let $T$ be a tree with $n$ vertices and let $k \ge 1$. Then $I_f(T) \le I_f(P_n)$, with equality holding if and only if $T \cong P_n$; and $I_f(T) \ge I_f(S_n)$, with equality holding if and only if $T \cong S_n$, where $S_n$ denotes the star on $n$ vertices.
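The conjecture is easy to probe numerically. The sketch below uses networkx’s generator of non-isomorphic trees and the exponent $k = 2$ as an example choice; it checks both parts of the conjecture for small $n$:

```python
import math
import networkx as nx

def degree_power_entropy(G, k):
    """Shannon entropy (base 2) of the distribution proportional to d_i^k."""
    powers = [d ** k for _, d in G.degree()]
    total = sum(powers)
    return -sum(w / total * math.log2(w / total) for w in powers)

n, k = 9, 2.0
values = [degree_power_entropy(T, k) for T in nx.nonisomorphic_trees(n)]
path_value = degree_power_entropy(nx.path_graph(n), k)
star_value = degree_power_entropy(nx.star_graph(n - 1), k)   # star on n vertices
print(max(values) <= path_value + 1e-12)   # True: the path is maximal
print(min(values) >= star_value - 1e-12)   # True: the star is minimal
```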
3. Results and Discussion
In this section we prove one part of Conjecture 1. Moreover, we give some lower and upper bounds on $I_f(G)$ in terms of $n$, Δ and δ.
3.1. Proof of Conjecture on Entropy
Method of Lagrange Multipliers
To find the maximum and minimum values of $f(x_1, x_2, \ldots, x_n)$ subject to the constraint $g(x_1, x_2, \ldots, x_n) = c$ (assuming that these extreme values exist and $\nabla g \neq \mathbf{0}$ on the surface $g(x_1, x_2, \ldots, x_n) = c$):
- (a)
- Find all values of $x_1, x_2, \ldots, x_n$ and λ such that $\nabla f(x_1, x_2, \ldots, x_n) = \lambda\, \nabla g(x_1, x_2, \ldots, x_n)$ and $g(x_1, x_2, \ldots, x_n) = c$.
- (b)
- Evaluate f at all the points that result from step (a). The largest of these values is the maximum value of f; the smallest is the minimum value of f.
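For illustration, this recipe can be carried out symbolically. The sketch below treats an assumed toy instance (not a computation from the paper): maximizing the Shannon entropy of three probabilities on the constraint surface $p_1 + p_2 + p_3 = 1$ with SymPy; the stationary point is the uniform vector.

```python
import sympy as sp

p1, p2, p3 = sp.symbols('p1 p2 p3', positive=True)
f = -(p1*sp.log(p1, 2) + p2*sp.log(p2, 2) + p3*sp.log(p3, 2))   # objective
g = p1 + p2 + p3 - 1                                            # constraint surface g = 0

# Step (a): grad f = lambda * grad g forces all partial derivatives of f to be
# equal; solving the pairwise equations gives p2 = p1 and p3 = p1, and the
# constraint then pins down p1.
p2_sol = sp.solve(sp.Eq(sp.diff(f, p1), sp.diff(f, p2)), p2)[0]   # -> p1
p3_sol = sp.solve(sp.Eq(sp.diff(f, p1), sp.diff(f, p3)), p3)[0]   # -> p1
p1_sol = sp.solve(g.subs({p2: p2_sol, p3: p3_sol}), p1)[0]        # -> 1/3

# Step (b): evaluate f at the stationary point; here it is the maximum, log2(3).
point = {p1: p1_sol, p2: p2_sol.subs(p1, p1_sol), p3: p3_sol.subs(p1, p1_sol)}
print(sp.simplify(f.subs(point)))   # log(3)/log(2)
```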
Lemma 1.
For ,
Proof.
Let us consider a function
Then we have is a strictly decreasing function on and hence
This gives the required result. ☐
Lemma 2.
For , we have
Proof.
We can assume that .
Theorem 1.
Let $T$ be a tree of order $n$ and let $k \ge 1$. Then $I_f(T) \le I_f(P_n)$, with equality holding if and only if $T \cong P_n$.
Proof.
For $n \le 3$, we have $T \cong P_n$ and hence the equality holds. Otherwise, $n \ge 4$. If $T \cong P_n$, then the equality holds. Otherwise, $T \ncong P_n$. It is well known that for any tree $T$, $\sum_{i=1}^{n} d_i = 2\,(n-1)$. From Equation (1), we have
Claim 1.
Proof of Claim 1.
For the tree $T$, we have
Let us consider a function
Then we have
Therefore is a decreasing function on and hence we get the required result in Inequality (5). ☐
Claim 2.
Proof of Claim 2.
Let us consider a function
with integer such that
Now,
By using the method of Lagrange multipliers, we have
Therefore we obtain . Again by using the method of Lagrange multipliers, we conclude that gives either a minimum or a maximum value. By Lemma 2, we have . Therefore we obtain the required result in Inequality (6). ☐
3.2. Bounds on $I_f(G)$
In this subsection we obtain lower and upper bounds for $I_f(G)$ in terms of $n$, Δ and δ.
Theorem 2.
Let G be a graph of order n with maximum degree Δ and minimum degree δ. Then
Equality holds in both inequalities if and only if G is a regular graph.
Proof.
First part: Let
Suppose that equality holds in the left inequality of (7). Then all the inequalities above must be equalities. From the equality in (8), we have . From the equality in (9), we have . Hence G is a regular graph.
Conversely, one can easily see that the left equality in (7) holds for a regular graph.
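As a numerical illustration of bounds of this type, the following sketch checks, on random graphs without isolated vertices, the elementary bounds $\log n \pm k \log(\Delta/\delta)$ that follow directly from $\delta \le d_i \le \Delta$; these are our illustrative bounds, not necessarily the sharp ones proved in Theorem 2.

```python
import math
import networkx as nx

def degree_power_entropy(degrees, k):
    """Shannon entropy (base 2) of the distribution proportional to d_i^k."""
    powers = [d ** k for d in degrees]
    total = sum(powers)
    return -sum(w / total * math.log2(w / total) for w in powers)

n, k = 30, 2.0
for seed in range(100):
    G = nx.gnp_random_graph(n, 0.3, seed=seed)
    degs = [d for _, d in G.degree()]
    if min(degs) == 0:                      # skip graphs with isolated vertices
        continue
    Delta, delta = max(degs), min(degs)
    spread = k * math.log2(Delta / delta)   # k * log2(Delta / delta)
    entropy = degree_power_entropy(degs, k)
    assert math.log2(n) - spread - 1e-9 <= entropy <= math.log2(n) + spread + 1e-9
print("all sampled graphs satisfy the elementary bounds")
```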
4. Conclusions
In this paper, we studied a special graph entropy measure which is based on vertex degrees. We proved one part of Conjecture 1, namely that the path attains the maximal graph entropy among all trees T. Moreover, we gave lower and upper bounds for this measure in terms of n, Δ and δ. The characterization of the minimal entropy remains an open problem and constitutes future work. Characterizing extremal graphs when using graph entropies is intricate because the problem depends on the underlying entropy measure and graph invariant. In this case, finding the minimal entropy is quite challenging, as $I_f(G)$ can be interpreted as a multivariate function in terms of the probability values $p_i$. Studying these problems for special and rather simple graph classes gives us an idea of the complexity of the problem for general graphs.
In this paper, we tackled a theoretical problem concerning graph entropy. However, as already demonstrated in the literature, graph entropies have been applied to problems in machine learning and knowledge discovery in several disciplines. Interesting application areas are health informatics and bioinformatics, see [17,18,19,20]. So far, graph entropy measures and other classical information-theoretic measures have also been employed for solving special problems of machine learning, such as parameter selection and explorative data analysis of publication data [20,21,22]. A next step could be to demonstrate the potential of graph entropies in nursing and health informatics more extensively and, hence, to add more conceptual rigor and interdisciplinarity when dealing with applied problems in the mentioned fields.
Acknowledgments
The authors are very grateful to the three anonymous referees for their valuable comments, which have considerably improved the presentation of this paper.
Author Contributions
Kinkar Chandra Das and Matthias Dehmer wrote the paper. Both authors have read and approved the final manuscript.
Conflicts of Interest
The authors declare no conflict of interest.
References
- Mowshowitz, A. Entropy and the complexity of graphs: I. An index of the relative complexity of a graph. Bull. Math. Biophys. 1968, 30, 175–204. [Google Scholar] [CrossRef] [PubMed]
- Trucco, E. A note on the information content of graphs. Bull. Math. Biophys. 1956, 18, 129–135. [Google Scholar] [CrossRef]
- Dehmer, M.; Mowshowitz, A. A history of graph entropy measures. Inf. Sci. 2011, 181, 57–78. [Google Scholar] [CrossRef]
- Dehmer, M. Information processing in complex networks: Graph entropy and information functionals. Appl. Math. Comput. 2008, 201, 82–94. [Google Scholar] [CrossRef]
- Cao, S.; Dehmer, M.; Shi, Y. Extremality of degree-based graph entropies. Inf. Sci. 2014, 278, 22–33. [Google Scholar] [CrossRef]
- Konstantinova, E.V. On Some Applications of Information Indices in Chemical Graph Theory. In General Theory of Information Transfer and Combinatorics; Springer: Berlin/Heidelberg, Germany, 2006. [Google Scholar]
- Cao, S.; Dehmer, M. Degree-based entropies of networks revisited. Appl. Math. Comput. 2015, 261, 141–147. [Google Scholar] [CrossRef]
- Das, K.C. Maximizing the sum of the squares of the degrees of a graph. Discret. Math. 2004, 285, 57–66. [Google Scholar] [CrossRef]
- Das, K.C.; Xu, K.; Nam, J. Zagreb indices of graphs. Front. Math. China 2015, 10, 567–582. [Google Scholar] [CrossRef]
- Das, K.C. On Comparing Zagreb Indices of Graphs. MATCH Commun. Math. Comput. Chem. 2010, 63, 433–440. [Google Scholar]
- Das, K.C.; Gutman, I.; Zhou, B. New upper bounds on Zagreb indices. J. Math. Chem. 2009, 46, 514–521. [Google Scholar] [CrossRef]
- Xu, K.; Das, K.C.; Balachandran, S. Maximizing the Zagreb Indices of (n, m)-Graphs. MATCH Commun. Math. Comput. Chem. 2014, 72, 641–654. [Google Scholar]
- Bonchev, D.G. Information Theoretic Indices for Characterization of Chemical Structures; Research Studies Press: Chichester, UK, 1983. [Google Scholar]
- Dehmer, M.; Kraus, V. On extremal properties of graph entropies. MATCH Commun. Math. Comput. Chem. 2012, 68, 889–912. [Google Scholar]
- Cioabǎ, S.M. Sums of powers of the degrees of a graph. Discret. Math. 2006, 306, 1959–1964. [Google Scholar] [CrossRef]
- Shannon, C.E.; Weaver, W. The Mathematical Theory of Communication; University of Illinois Press: Urbana, IL, USA, 1949. [Google Scholar]
- Holzinger, A.; Dehmer, M.; Jurisica, I. Knowledge discovery and interactive data mining in Bioinformatics—State-of-the-Art, future challenges and research directions. BMC Bioinform. 2014, 15 (Suppl. S6), I1. [Google Scholar] [CrossRef] [PubMed]
- Holzinger, A. Trends in interactive knowledge discovery for personalized medicine: Cognitive science meets machine learning. IEEE Intell. Inform. Bull. 2014, 15, 6–14. [Google Scholar]
- Mayer, C.; Bachler, M.; Holzinger, A.; Stein, P.K.; Wassertheurer, S. The effect of threshold values and weighting factors on the association between entropy measures and mortality after myocardial infarction in the cardiac arrhythmia suppression trial (CAST). Entropy 2016, 18, 129. [Google Scholar] [CrossRef]
- Mayer, C.; Bachler, M.; Hörtenhuber, M.; Stocker, C.; Holzinger, A.; Wassertheurer, S. Selection of entropy-measure parameters for knowledge discovery in heart rate variability data. BMC Bioinform. 2014, 15 (Suppl. 6), S2. [Google Scholar] [CrossRef] [PubMed]
- Holzinger, A.; Hörtenhuber, M.; Mayer, C.; Bachler, M.; Wassertheurer, S.; Pinho, A.; Koslicki, D. On entropy-based data mining. In Interactive Knowledge Discovery and Data Mining in Biomedical Informatics; Holzinger, A., Jurisica, I., Eds.; Springer: Berlin/Heidelberg, Germany, 2014; pp. 209–226. [Google Scholar]
- Holzinger, A.; Ofner, B.; Stocker, C.; Valdez, A.C.; Schaar, A.K.; Ziefle, M.; Dehmer, M. On graph entropy measures for knowledge discovery from publication network data. In Availability, Reliability, and Security in Information Systems and HCI; Cuzzocrea, A., Kittl, C., Simos, D.E., Weippl, E., Xu, L., Eds.; Springer: Berlin/Heidelberg, Germany, 2013; pp. 354–362. [Google Scholar]
© 2016 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).