
Entropy 2014, 16(10), 5416–5427; https://doi.org/10.3390/e16105416

Article
A Note on Distance-based Graph Entropies
1 College of Computer and Control Engineering, Nankai University, No. 94 Weijin Road, 300071 Tianjin, China
2 Department of Computer Science, Universität der Bundeswehr München, Werner-Heisenberg-Weg 39, 85577 Neubiberg, Germany
3 Institute for Bioinformatics and Translational Research, UMIT, Eduard Wallnoefer Zentrum, A-6060 Hall in Tyrol, Austria
4 Center for Combinatorics and LPMC-TJKLC, Nankai University, No. 94 Weijin Road, 300071 Tianjin, China
* Author to whom correspondence should be addressed.
Received: 26 July 2014; in revised form: 18 September 2014 / Accepted: 15 October 2014 / Published: 20 October 2014

Abstract

A variety of problems in discrete mathematics, computer science, information theory, statistics, chemistry, biology, and related fields deal with inferring and characterizing relational structures by using graph measures. In this context, information-theoretic quantities representing graph entropies have been proven to possess useful properties such as a meaningful structural interpretation and uniqueness. In classical work, many distance-based graph entropies, e.g., those due to Bonchev et al., and related quantities have been proposed and studied. Our contribution is to explore graph entropies that are based on a novel information functional, namely the number of vertices at distance k from a given vertex. In particular, we investigate some properties thereof, leading to a better understanding of this new information-theoretic quantity.
Keywords:
entropy; Shannon’s entropy; graph entropy; distance; networks

1. Introduction

Studies of the information content of complex networks and graphs were initiated in the late 1950s, based on the seminal work of Shannon [1]. Numerous measures for analyzing complex networks quantitatively have been contributed [2]. A variety of problems in, e.g., discrete mathematics, computer science, information theory, statistics, chemistry, and biology deal with investigating entropies for relational structures. For example, graph entropy measures have been used extensively to characterize the structure of graph-based systems in mathematical chemistry, biology [3] and in computer science-related areas; see [4]. The concept of graph entropy [5,6], introduced by Rashevsky [7] and Trucco [8], has been used to measure the structural complexity of graphs [3,9,10]. The entropy of a graph is an information-theoretic quantity introduced by Mowshowitz [11]. Here the complexity of a graph [12] is based on the well-known Shannon entropy [1,5,11,13]. Importantly, Mowshowitz interpreted his graph entropy measure as the structural information content of a graph and demonstrated that this quantity satisfies important properties, e.g., when using product graphs; see [11,14–16]. Note that Körner's graph entropy [17] was introduced from an information-theoretic point of view and has not been used to characterize graphs quantitatively. An extensive overview of graph entropy measures can be found in [6]. A statistical analysis of topological graph measures has been performed by Emmert-Streib and Dehmer [18].
Dehmer [5] presented novel information functionals on V that capture, in some sense, the structural information of the underlying graph G. Several graph invariants, such as the number of vertices, edges and distances, the vertex degree sequences, extended degree sequences (i.e., second neighbors, third neighbors, etc.), degree powers and connections, have been used for developing entropy-based measures [5,6,19,20]. In this paper, we study graph entropies related to a new information functional, which is the number of vertices at distance k from a given vertex. Distance is one of the most important graph invariants. For a given vertex v in a graph, the number of vertices at distance one from v is exactly the degree of v; the number of pairs of vertices at distance three, which is also related to the clustering coefficient of networks [21], is the Wiener polarity index, introduced by Wiener in 1947 [22].
In view of the vast amount of existing graph entropy measures [3,5], there has been very little work on finding their extremal values [23]. A reason for this might be the fact that Shannon's entropy is a multivariate function and, when considering graph entropies, none of the probability values vanish. As Dehmer and Kraus [23] observed, determining minimal values of graph entropies is intricate because there is a lack of analytical methods to tackle this particular problem. Other related work is due to Shi [24], who proved a lower bound on quantum decision tree complexity by using Shannon's entropy. Dragomir and Goh [25] obtained several general upper bounds for Shannon's entropy by using Jensen's inequality. Finally, Dehmer and Kraus [23] proved some extremal results for graph entropies that are based on information functionals.
The main contribution of this paper is to study properties of graph entropies that are based on an information functional using the number of vertices at distance k from a given vertex. The paper is organized as follows. In Section 2, some concepts and notation from graph theory are introduced. In Section 3, we introduce the concept of graph entropy and some types of distance-based graph entropies. In Section 4, we state some properties of the resulting graph entropy. The paper finishes with a summary and conclusion in Section 5.

2. Preliminaries

A graph G is an ordered pair of sets V(G) and E(G) such that the elements uv ∈ E(G) are a sub-collection of the unordered pairs of elements of V(G). For convenience, we sometimes denote a graph by G = (V, E). The elements of V(G) are called vertices and the elements of E(G) are called edges. If e = uv is an edge, then we say vertices u and v are adjacent, and u, v are the two endpoints (or ends) of e. A loop is an edge whose two endpoints coincide. Two edges are called parallel if they have the same endpoints. A simple graph is a graph containing neither loops nor parallel edges. If G is a graph with n vertices and m edges, then we say the order of G is n and the size of G is m. A graph of order n is addressed as an n-vertex graph, and a graph of order n and size m is addressed as an (n, m)-graph. A graph F is called a subgraph of a graph G, denoted by F ⊆ G, if V(F) ⊆ V(G) and E(F) ⊆ E(G). In this paper, we only consider simple graphs.
A graph is connected if, for every partition of its vertex set into two nonempty sets X and Y, there is an edge with one end in X and one end in Y. Otherwise, the graph is disconnected. In other words, a graph is disconnected if its vertex set can be partitioned into two nonempty subsets X and Y so that no edge has one end in X and one end in Y.
A path graph is a simple graph whose vertices can be arranged in a linear sequence in such a way that two vertices are adjacent if they are consecutive in the sequence, and are nonadjacent otherwise. Likewise, a cycle graph on three or more vertices is a simple graph whose vertices can be arranged in a cyclic sequence in such a way that two vertices are adjacent if they are consecutive in the sequence, and are nonadjacent otherwise. Denote by Pn and Cn the path graph and the cycle graph with n vertices, respectively.
A connected graph without any cycle is a tree. Actually, the path Pn is the tree of order n with exactly two pendent vertices. The star of order n, denoted by Sn, is the tree with n − 1 pendent vertices. A tree is called a double star Sp,q if it is obtained from Sp+1 and Sq by identifying a leaf of Sp+1 with the center of Sq. So, for the double star Sp,q with n vertices, we have p + q = n. We call a double star Sp,q balanced if p = ⌈n/2⌉ and q = ⌊n/2⌋. A comet is a tree composed of a star and a pendent path. For any numbers n and t with 2 ≤ t ≤ n − 1, we denote by CS(n, t) the comet of order n with t pendent vertices, i.e., a tree formed by a path Pn−t of which one end vertex coincides with a pendent vertex of a star St+1 of order t + 1.
The length of a path is the number of its edges. For two vertices u and v, the distance between u and v in a graph G, denoted by dG(u, v), is the length of a shortest path connecting u and v. A path P connecting u and v in G is called a geodesic path if its length equals the distance between u and v in G (such a path is necessarily an induced path). The diameter of a graph G, denoted by D(G), is the greatest distance between two vertices of G. The set Sj(vi, G) := {v ∈ V | d(vi, v) = j}, j ≥ 1, is called the j-sphere of vi regarding G.
All vertices adjacent to vertex u are called neighbors of u. The neighborhood of u is the set of the neighbors of u. The number of edges incident to vertex u is the degree of u, denoted by d(u). Vertices of degree 0 and 1 are said to be isolated and pendent vertices, respectively. A pendent vertex is also referred to as a leaf of the underlying graph. A vertex of degree i is also addressed as an i-degree vertex. The minimum and maximum degree of G are denoted by δ(G) and Δ(G), respectively. If G has ai vertices of degree di (i = 1, 2, . . . , t), where Δ(G) = d1 > d2 > ··· > dt = δ(G) and $\sum_{i=1}^{t} a_i = n$, we define the degree sequence of G as $D(G) = [d_1^{a_1}, d_2^{a_2}, \ldots, d_t^{a_t}]$. If ai = 1, we write di instead of $d_i^{a_i}$ for convenience.
For terminology and notations not defined here, we refer the readers to [26].
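As a small illustration of the notation above, the degree sequence D(G) can be read off directly from an adjacency-list representation. The following Python sketch (the dictionary-of-lists encoding and helper name are our own convention, not from the paper) collects degrees with their multiplicities in descending order:

```python
from collections import Counter

def degree_sequence(adj):
    """Degree sequence D(G) as (degree d_i, multiplicity a_i) pairs,
    sorted so that d_1 > d_2 > ... > d_t."""
    counts = Counter(len(neighbors) for neighbors in adj.values())
    return sorted(counts.items(), reverse=True)

# Star S_4: one 3-degree vertex and three pendent vertices,
# i.e., D(S_4) = [3, 1^3].
s4 = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
print(degree_sequence(s4))  # → [(3, 1), (1, 3)]
```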

3. Distance-Based Graph Entropies

Now we reproduce the definition of Shannon’s entropy [1].

Definition 1

Let p = (p1, p2, . . . , pn) be a probability vector, namely, 0 ≤ pi ≤ 1 and $\sum_{i=1}^{n} p_i = 1$. The Shannon entropy of p is defined as

$$I(p) = -\sum_{i=1}^{n} p_i \log p_i.$$
To define information-theoretic graph measures, we will often consider a tuple (λ1, λ2, . . . , λn) of non-negative integers λi ∈ ℕ [5]. This tuple forms a probability distribution p = (p1, p2, . . . , pn), where

$$p_i = \frac{\lambda_i}{\sum_{j=1}^{n} \lambda_j}, \quad i = 1, 2, \ldots, n.$$
Therefore, the entropy of tuple (λ1, λ2, . . . , λn) is given by
$$I(\lambda_1, \lambda_2, \ldots, \lambda_n) = -\sum_{i=1}^{n} p_i \log p_i = \log\left(\sum_{i=1}^{n} \lambda_i\right) - \sum_{i=1}^{n} \frac{\lambda_i}{\sum_{j=1}^{n} \lambda_j} \log \lambda_i.$$
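For concreteness, the entropy of such a tuple can be computed as in the following minimal Python sketch (base-2 logarithms, with the usual convention 0·log 0 = 0; the function name is ours):

```python
import math

def tuple_entropy(lams):
    """Shannon entropy (base 2) of the probability distribution induced
    by a tuple of non-negative weights (lambda_1, ..., lambda_n)."""
    total = sum(lams)
    # Terms with lambda_i = 0 contribute 0 by the convention 0*log(0) = 0.
    return -sum(l / total * math.log2(l / total) for l in lams if l > 0)

# A uniform tuple of length 4 yields the maximum entropy log2(4) = 2.
print(tuple_entropy((1, 1, 1, 1)))  # → 2.0
```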
In the literature, there are various ways to obtain the tuple (λ1, λ2, . . . , λn), like the so-called magnitude-based information measures introduced by Bonchev and Trinajstić [27], or partition-independent graph entropies, introduced by Dehmer [5,28], which are based on information functionals.
We are now ready to define the entropy of a graph due to Dehmer [5] by using information functionals.

Definition 2

Let G = (V, E) be a connected graph. For a vertex vi ∈ V, we define

$$p(v_i) := \frac{f(v_i)}{\sum_{j=1}^{|V|} f(v_j)},$$
where f represents an arbitrary information functional.
Observe that $\sum_{i=1}^{|V|} p(v_i) = 1$. Hence, we can interpret the quantities p(vi) as vertex probabilities. Now we immediately obtain a definition of the entropy of the graph G.

Definition 3

Let G = (V, E) be a connected graph and f be an arbitrary information functional. The entropy of G is defined as
$$I_f(G) = -\sum_{i=1}^{|V|} \frac{f(v_i)}{\sum_{j=1}^{|V|} f(v_j)} \log\left(\frac{f(v_i)}{\sum_{j=1}^{|V|} f(v_j)}\right) = \log\left(\sum_{i=1}^{|V|} f(v_i)\right) - \sum_{i=1}^{|V|} \frac{f(v_i)}{\sum_{j=1}^{|V|} f(v_j)} \log f(v_i).$$
Distance is one of the most important graph invariants. We first restate some definitions of the information functionals based on distances. In [5], the following information functional was introduced:
$$f(v_i) = \alpha^{\sum_{j=1}^{D(G)} c_j |S_j(v_i, G)|},$$
where cj with j = 1, 2, . . . , D(G) and α are arbitrary real positive parameters. The information functional proposed in [20] is calculated for a vertex vi as the entropy of its shortest distances from all other vertices in the graph:
$$H(v_i) = -\sum_{u \in V} \frac{d(v_i, u)}{D(v_i)} \log \frac{d(v_i, u)}{D(v_i)},$$

where $D(v_i) = \sum_{u \in V} d(v_i, u)$. The aggregation function over all vertices of the graph is proposed as follows:

$$H = \sum_{v_i \in V} H(v_i).$$
The information functional based on the shortest distances is introduced in [29]:
$$f(v_i) = \sum_{u \in V} d(v_i, u).$$
There are also some functionals based on betweenness centrality [29,30]. In this paper, we consider a new information functional, which is the number of vertices at distance k from a given vertex. For a given vertex v in a graph, the number of vertices at distance one from v is exactly the degree of v. On the other hand, the number of pairs of vertices at distance three, which is also related to the clustering coefficient of networks, is the Wiener polarity index, introduced for molecular networks by Wiener in 1947 [22]. For more recent results on the Wiener index and the Wiener polarity index, we refer to [31–41].
Let G = (V, E) be a connected graph with n vertices and vi ∈ V(G). Denote by nk(vi) the number of vertices at distance k from vi, i.e.,

$$n_k(v_i) = |S_k(v_i, G)| = |\{u \in V(G) : d(u, v_i) = k\}|,$$

where k is an integer such that 1 ≤ k ≤ D(G).

Definition 4

Let G = (V, E) be a connected graph. For a vertex vi ∈ V and 1 ≤ k ≤ D(G), we define the information functional as:

$$f(v_i) := n_k(v_i).$$
Therefore, by applying Definition 4 and Equality (2), we obtain the special graph entropy
$$I_k(G) := I_f(G) = -\sum_{i=1}^{n} \frac{n_k(v_i)}{\sum_{j=1}^{n} n_k(v_j)} \log\left(\frac{n_k(v_i)}{\sum_{j=1}^{n} n_k(v_j)}\right) = \log\left(\sum_{i=1}^{n} n_k(v_i)\right) - \frac{1}{\sum_{j=1}^{n} n_k(v_j)} \sum_{i=1}^{n} n_k(v_i) \log n_k(v_i).$$
In this paper, we will discuss the extremal properties of the above graph entropy.
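This entropy is straightforward to compute: one breadth-first search per vertex yields all nk(vi). The following sketch (graph encoding and helper names are ours; base-2 logarithms, 0·log 0 = 0) illustrates this on a star, for which Section 4 shows I2(Sn) = log(n − 1):

```python
import math
from collections import deque

def n_k_values(adj, k):
    """n_k(v) for every vertex v of a connected graph, one BFS per vertex."""
    values = []
    for s in adj:
        dist = {s: 0}
        queue = deque([s])
        while queue:
            v = queue.popleft()
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    queue.append(w)
        values.append(sum(1 for d in dist.values() if d == k))
    return values

def I_k(adj, k):
    """Graph entropy I_k(G) with information functional f(v) = n_k(v)."""
    nk = n_k_values(adj, k)
    total = sum(nk)
    return math.log2(total) - sum(x * math.log2(x) for x in nk if x > 0) / total

# Star S_5: every leaf sees the 3 other leaves at distance 2, the center
# sees none, and I_2(S_5) = log2(5 - 1) ≈ 2.0.
star5 = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
print(I_k(star5, 2))
```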

4. Results and Discussion

Observe that for k = 1, n1(vi) = d(vi) is the degree of vi and
$$I_1(G) = \log\left(\sum_{i=1}^{n} d_i\right) - \frac{1}{\sum_{j=1}^{n} d_j} \sum_{i=1}^{n} d_i \log d_i,$$
which has been studied in [19] for some classes of graphs. If we denote the number of edges by m, then we have
$$I_1(G) = \log(2m) - \frac{1}{2m} \sum_{i=1}^{n} d_i \log d_i,$$

since $\sum_{i=1}^{n} d_i = 2m$. Denote by pk(G) the number of geodesic paths of length k in graph G. Then we have $\sum_{i=1}^{n} n_k(v_i) = 2p_k$, since each pair of vertices at distance k is counted twice in $\sum_{i=1}^{n} n_k(v_i)$ (in a tree, such pairs correspond bijectively to geodesic paths of length k).
Therefore, Equation (3) can be represented as
$$I_k(G) = \log(2p_k) - \frac{1}{2p_k} \sum_{i=1}^{n} n_k(v_i) \log n_k(v_i).$$
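The identity behind this representation can be checked numerically. In the sketch below (graph encoding and helper names are ours), pk is taken as the number of vertex pairs at distance exactly k, which for trees coincides with the number of geodesic paths of length k:

```python
from collections import deque
from itertools import combinations

def distances(adj, s):
    """BFS distances from s in an unweighted connected graph."""
    dist = {s: 0}
    queue = deque([s])
    while queue:
        v = queue.popleft()
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                queue.append(w)
    return dist

def identity_check(adj, k):
    """Return (sum_i n_k(v_i), 2 * p_k) for comparison."""
    dist = {v: distances(adj, v) for v in adj}
    nk_sum = sum(1 for v in adj for u in adj if u != v and dist[v][u] == k)
    p_k = sum(1 for u, v in combinations(adj, 2) if dist[u][v] == k)
    return nk_sum, 2 * p_k

# Path P_6 with k = 2: n_2 = (1, 1, 2, 2, 1, 1) and p_2 = 4.
path6 = {i: [j for j in (i - 1, i + 1) if 0 <= j < 6] for i in range(6)}
print(identity_check(path6, 2))  # → (8, 8)
```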
The number of paths of length k in a given graph has been widely studied, e.g., by Erdős and Bollobás; we refer the readers to [42–47]. Since there are efficient algorithms for finding shortest paths in a graph, such as breadth-first search and Dijkstra's algorithm [26], we can obtain the following result.

Proposition 5

Let G be a graph with n vertices. For a given integer k, the value of Ik(G) can be computed in polynomial time.
Let T be a tree with n vertices and V (T) = {v1, v2, . . . , vn}. In the following, we consider the properties of Ik(T) for k = 2.
First, we study the values of p2(T) and n2(vi). Observe that
$$p_2(T) = \sum_{i=1}^{n} \binom{d_i}{2} = \frac{1}{2} \sum_{i=1}^{n} d_i(d_i - 1) = \frac{1}{2} \sum_{i=1}^{n} d_i^2 - m = \frac{1}{2} \sum_{i=1}^{n} d_i^2 - (n - 1)$$
and
$$n_2(v_i) = \sum_{u \in N(v_i)} d(u) - d(v_i).$$
Then from Equation (5), we have
$$I_2(T) = \log\left(\sum_{i=1}^{n} d_i^2 - 2(n-1)\right) - \frac{\sum_{i=1}^{n} n_2(v_i) \log n_2(v_i)}{\sum_{i=1}^{n} d_i^2 - 2(n-1)}.$$
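For trees this gives a direct computation without any shortest-path search, since in a tree n2(v) is the neighbor-degree sum minus d(v). A sketch (our encoding; base-2 logs), whose result for P5 should equal log2(3) + 2/3 ≈ 2.2516:

```python
import math

def I2_tree(adj):
    """I_2 of a tree T via n_2(v) = (sum of neighbor degrees) - d(v),
    valid in trees, where distinct neighbors of v reach disjoint sets
    of vertices at distance 2 from v."""
    deg = {v: len(adj[v]) for v in adj}
    n2 = [sum(deg[u] for u in adj[v]) - deg[v] for v in adj]
    total = sum(n2)
    # Sanity check: total equals sum(d_i^2) - 2(n - 1), i.e., 2 * p_2(T).
    assert total == sum(d * d for d in deg.values()) - 2 * (len(adj) - 1)
    return math.log2(total) - sum(x * math.log2(x) for x in n2 if x > 0) / total

# Path P_5: I_2(P_5) = log2(3) + 2/3 ≈ 2.2516.
path5 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(I2_tree(path5))
```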
If T ≅ Sn is a star, then we have

$$I_2(S_n) = \log((n-2)(n-1)) - \frac{(n-1)(n-2)\log(n-2)}{(n-1)(n-2)} = \log(n-1).$$
If T ≅ Pn is a path with n ≥ 4 (here and below, logarithms are taken to base 2), then we have

$$I_2(P_n) = \log(2(n-2)) - \frac{(n-4) \cdot 2 \log 2}{2(n-2)} = \log(n-2) + \frac{2}{n-2}.$$
Let T be a tree with n vertices. By calculating the values I2(T) for n = 7, 8, 9, 10, we can obtain the trees with extremal values of entropy. The trees with maximum and minimum values of I2(T) are shown in Figures 1 and 2, respectively.
As we have seen from Figure 1, for n = 7, 8, 9, 10, the maximum value of I2(T) is attained when T is the balanced double star S⌈n/2⌉,⌊n/2⌋. By some elementary calculations, we have

$$I_2\left(S_{\lceil n/2 \rceil, \lfloor n/2 \rfloor}\right) = \begin{cases} \log(n), & n = 2k, \\ \dfrac{3k-1}{2k}\log(k) - \dfrac{k-1}{2k}\log(k-1) + 1, & n = 2k+1. \end{cases}$$
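Both branches of this case distinction can be verified numerically. In the sketch below (construction helper and vertex labels are ours; base-2 logs), n2(v) is again computed as the neighbor-degree sum minus d(v), which is valid in trees:

```python
import math

def I2_tree(adj):
    """I_2 of a tree via n_2(v) = (sum of neighbor degrees) - d(v)."""
    deg = {v: len(adj[v]) for v in adj}
    n2 = [sum(deg[u] for u in adj[v]) - deg[v] for v in adj]
    total = sum(n2)
    return math.log2(total) - sum(x * math.log2(x) for x in n2 if x > 0) / total

def double_star(p, q):
    """S_{p,q}: adjacent centers 'a' (degree p) and 'b' (degree q),
    carrying p - 1 and q - 1 pendent leaves, respectively."""
    adj = {"a": ["b"], "b": ["a"]}
    for i in range(p - 1):
        adj["a"].append(("a", i)); adj[("a", i)] = ["a"]
    for i in range(q - 1):
        adj["b"].append(("b", i)); adj[("b", i)] = ["b"]
    return adj

# Even n = 2k = 8: the closed form predicts I_2 = log2(8) = 3.
print(round(I2_tree(double_star(4, 4)), 6))  # → 3.0

# Odd n = 2k + 1 = 9 (k = 4): compare with the second branch.
k = 4
predicted = (3 * k - 1) / (2 * k) * math.log2(k) \
    - (k - 1) / (2 * k) * math.log2(k - 1) + 1
print(abs(I2_tree(double_star(k + 1, k)) - predicted) < 1e-9)  # → True
```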
It is easy to obtain the following result.

Theorem 6

Let Sn, Pn and S⌈n/2⌉,⌊n/2⌋ be the star, the path and the balanced double star with n vertices, respectively. Then, for n ≥ 6, we have

$$I_2(S_n) < I_2(P_n) < I_2\left(S_{\lceil n/2 \rceil, \lfloor n/2 \rfloor}\right).$$

Proof

First, we have

$$I_2(P_n) - I_2(S_n) = \log\left(\frac{n-2}{n-1}\right) + \frac{2}{n-2} > 0$$

for all n ≥ 4. For n = 2k,

$$I_2(S_{k,k}) - I_2(P_n) = \log(2k) - \log(2k-2) - \frac{2}{2k-2} > 0$$

for all k ≥ 3 (for k = 2, i.e., n = 4, the balanced double star coincides with P4). The case of n = 2k + 1 is similar.
From Figure 2, for n = 7, 8, 9, 10, the minimum value of I2(T) is attained when T is a comet. For a comet CS(n, t) with n − t ≥ 3, by some elementary calculations, we have

$$I_2(CS(n,t)) = \log(t^2 - 3t + 2n - 2) - \frac{2(n-t-3) + t \log t + (t-1)^2 \log(t-1)}{t^2 - 3t + 2n - 2}.$$
Let

$$f(t) = \log(t^2 - 3t + 2n - 2) - \frac{2(n-t-3) + t \log t + (t-1)^2 \log(t-1)}{t^2 - 3t + 2n - 2}$$

and

$$g(t) = \frac{\partial f(t)}{\partial t}.$$
Denote by t0 the root of g(t) = 0. Then CS(n, t0) (with t0 rounded to an admissible integer) is the tree with the minimum value of entropy among all comets.
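Since t ranges over integers, the minimizing comet can also be found by a simple scan, and the closed form for I2(CS(n, t)) as reconstructed here can be cross-checked against a direct computation on the tree (graph construction and helper names are ours; base-2 logs; n − t ≥ 3 assumed):

```python
import math

def I2_tree(adj):
    """I_2 of a tree via n_2(v) = (sum of neighbor degrees) - d(v)."""
    deg = {v: len(adj[v]) for v in adj}
    n2 = [sum(deg[u] for u in adj[v]) - deg[v] for v in adj]
    total = sum(n2)
    return math.log2(total) - sum(x * math.log2(x) for x in n2 if x > 0) / total

def comet(n, t):
    """CS(n, t): center 0 with t - 1 pendent leaves and a pendent path."""
    adj = {v: [] for v in range(n)}
    for leaf in range(1, t):                 # t - 1 leaves on the center
        adj[0].append(leaf); adj[leaf].append(0)
    prev = 0
    for v in range(t, n):                    # path of n - t vertices
        adj[prev].append(v); adj[v].append(prev); prev = v
    return adj

def f(n, t):
    """Closed form for I_2(CS(n, t)), base-2 logs (n - t >= 3)."""
    s = t * t - 3 * t + 2 * n - 2
    num = 2 * (n - t - 3) + t * math.log2(t) + (t - 1) ** 2 * math.log2(t - 1)
    return math.log2(s) - num / s

n = 20
# The closed form agrees with the direct computation for every valid t ...
assert all(abs(f(n, t) - I2_tree(comet(n, t))) < 1e-9 for t in range(3, n - 2))
# ... and the minimizing integer t0 is found by scanning.
t0 = min(range(3, n - 2), key=lambda t: f(n, t))
print(t0, round(f(n, t0), 4))
```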
In fact, for a tree T with n vertices, we propose the following conjecture, which we have been able neither to prove nor to disprove despite many attempts.

Conjecture 7

For a tree T with n vertices, the balanced double star and the comet CS(n, t0) can attain the maximum and the minimum values of I2(T), respectively.
Observe that the extremal graphs for n = 10 are not unique. From this observation, we can obtain the following result.

Theorem 8

Let CS(n, t) be a comet with nt ≥ 4. Denote by T a tree obtained from CS(n, t) by deleting the leaf that is not adjacent to the vertex of maximum degree and attaching a new vertex to one leaf that is adjacent to the vertex of maximum degree. Then we have I2(T) = I2(CS(n, t)).

Proof

Let CS(n, t) be a comet with n − t ≥ 4. Let w be the vertex with maximum degree t. Denote by u the leaf of CS(n, t) that is not adjacent to w, and let v be a leaf that is adjacent to w. Let T = CS(n, t) − u + uv, i.e., the tree obtained by deleting u and reattaching it as a pendent vertex of v. Note that the degree sequence of T is the same as that of CS(n, t). Hence we only need to check the term $\sum_{i=1}^{n} n_2(v_i) \log n_2(v_i)$.
For a given graph G, we define a sequence
$$s(n_2, G) = (n_2(v_1), n_2(v_2), \ldots, n_2(v_n)).$$
By some elementary calculations, we can find that s(n2, T) = s(n2, CS(n, t)). Then from Equation (6), we obtain that I2(T) = I2(CS(n, t)).
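The equality in Theorem 8 can be confirmed on a small instance. Below (vertex labels and encoding are ours) we build CS(8, 4) and the tree T obtained by deleting the far path-leaf and attaching a new vertex to a leaf adjacent to the center, then compare the multisets of n2-values (for entropy equality only the multiset of entries of s(n2, ·) matters):

```python
def n2_multiset(adj):
    """Sorted n_2-values of a tree, with n_2(v) = neighbor-degree sum - d(v)."""
    deg = {v: len(adj[v]) for v in adj}
    return sorted(sum(deg[u] for u in adj[v]) - deg[v] for v in adj)

# CS(8, 4): center 0 with leaves 1, 2, 3 and pendent path 0-4-5-6-7.
cs = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0],
      4: [0, 5], 5: [4, 6], 6: [5, 7], 7: [6]}
# T: delete the far leaf 7 and attach a new vertex 8 to leaf 1.
t = {0: [1, 2, 3, 4], 1: [0, 8], 2: [0], 3: [0],
     4: [0, 5], 5: [4, 6], 6: [5], 8: [1]}
print(n2_multiset(cs) == n2_multiset(t))  # → True
```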
Actually, the above proof provides a method to verify whether two graphs have the same value of entropy. If G and H are two graphs with the same vertex set and the same degree sequence, then we can use the defined sequence s(n2, G) to check whether G and H have the same value of entropy.

5. Conclusions

Many distance-based entropies have been proposed and studied. In this paper, based on Shannon's entropy, we studied graph entropies related to a new information functional, which is the number of vertices at distance k from a given vertex. One direction for future work is to explore the discrimination power of this entropy.
Some properties of this graph entropy have been characterized. As with other entropies [48], determining the extremal values of Ik(G) and characterizing the extremal graphs is a challenging problem; it appears complicated even for trees. One possible approach is to establish graph transformations that increase or decrease the value of the entropy.

Acknowledgments

We wish to thank the referees for valuable suggestions. Matthias Dehmer thanks the Austrian Science Funds for supporting this work (project P26142). Matthias Dehmer gratefully acknowledges financial support from the German Federal Ministry of Education and Research (BMBF) (project RiKoV, Grant No. 13N12304). Zengqiang Chen was supported by the National Science Foundation of China (No. 61174094) and the Natural Science Foundation of Tianjin (No. 14JCYBJC18700). Yongtang Shi was supported by NSFC, PCSIRT, China Postdoctoral Science Foundation (2014M551015) and China Scholarship Council.

Author Contributions

Wrote the paper: Zengqiang Chen, Matthias Dehmer, Yongtang Shi. All authors have read and approved the final manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Shannon, C.E.; Weaver, W. The Mathematical Theory of Communication; University of Illinois Press: Urbana, IL, USA, 1949. [Google Scholar]
  2. Bonchev, D.; Rouvray, D.H. Complexity in Chemistry, Biology, and Ecology; Mezey,, P.G., Ed.; Mathematical and Computational Chemistry; Springer: New York, NY, USA, 2005. [Google Scholar]
  3. Bonchev, D. Information Theoretic Indices for Characterization of Chemical Structures; Research Studies Press: Chichester, UK, 1983. [Google Scholar]
  4. Holzinger, A.; Ofner, B.; Stocker, C.; Valdez, A.C.; Schaar, A.K.; Ziefle, M.; Dehmer, M. On graph entropy measures for knowledge discovery from publication network data. In Multidisciplinary Research and Practice for Information Systems; Proceedings of International Cross-Domain Conference and Workshop on Availability, Reliability, and Security, CD-ARES 2012, Prague, Czech Republic, 20–24 August 2012, Quirchmayer, G., Basl, J., You, I., Xu, L., Weippl, E., Eds.; Lecture Notes in Computer Science, Volume 7465; Springer: Berlin/Heidelberg, Germany, 2013; pp. 354–362. [Google Scholar]
  5. Dehmer, M. Information processing in complex networks: Graph entropy and information functionals. Appl. Math. Comput 2008, 201, 82–94. [Google Scholar]
  6. Dehmer, M.; Mowshowitz, A. A history of graph entropy measures. Inform. Sci 2011, 181, 57–78. [Google Scholar]
  7. Rashevsky, N. Life, information theory, and topology. Bull. Math. Biophys 1955, 17, 229–235. [Google Scholar]
  8. Trucco, E. A note on the information content of graphs. Bull. Math. Biol 1965, 18, 129–135. [Google Scholar]
  9. Dehmer, M.; Emmert-Streib, F. Structural information content of networks: Graph entropy based on local vertex functionals. Comput. Biol. Chem 2008, 32, 131–138. [Google Scholar]
  10. Dehmer, M.; Borgert, S.; Emmert-Streib, F. Entropy bounds for molecular hierarchical networks. PLoS ONE 2008, 3. [Google Scholar] [CrossRef]
  11. Mowshowitz, A. Entropy and the complexity of the graphs I: An index of the relative complexity of a graph. Bull. Math. Biophys 1968, 30, 175–204. [Google Scholar]
  12. Dehmer, M.; Mowshowitz, A.; Emmert-Streib, F. Advances in Network Complexity; Wiley-Blackwell: Weinheim, Germany, 2013. [Google Scholar]
  13. Cover, T.M.; Thomas, J.A. Elements of Information Theory; Wiley: New York, NY, USA, 2006. [Google Scholar]
  14. Mowshowitz, A. Entropy and the complexity of graphs II: The information content of digraphs and infinite graphs. Bull. Math. Biophys 1968, 30, 225–240. [Google Scholar]
  15. Mowshowitz, A. Entropy and the complexity of graphs III: Graphs with prescribed information content. Bull. Math. Biophys 1968, 30, 387–414. [Google Scholar]
  16. Mowshowitz, A. Entropy and the complexity of graphs IV: Entropy measures and graphical structure. Bull. Math. Biophys 1968, 30, 533–546. [Google Scholar]
  17. Körner, J. Coding of an information source having ambiguous alphabet and the entropy of graphs. In Transactions of the Sixth Prague Conference on Information Theory, Statistical Decision Function, Random Processes; Walter de Gruyter: Berlin, Germany, 1973; pp. 411–425. [Google Scholar]
  18. Emmert-Streib, F.; Dehmer, M. Exploring statistical and population aspects of network complexity. PLoS ONE 2012, 7. [Google Scholar] [CrossRef]
  19. Cao, S.; Dehmer, M.; Shi, Y. Extremality of degree-based graph entropies. Inform. Sci 2014, 278, 22–33. [Google Scholar]
  20. Konstantinova, E.V. On some applications of information indices in chemical graph theory. In General Theory of Information Transfer and Combinatorics; Springer: New York, NY, USA, 2006. [Google Scholar]
  21. Costa, L.D.F.; Rodrigues, F.A.; Travieso, G.; Villas Boas, P.R. Characterization of complex networks: A survey of measurements. Adv. Phys 2007, 56, 167–242. [Google Scholar]
  22. Wiener, H. Structural determination of paraffin boiling points. J. Am. Chem. Soc 1947, 69, 17–20. [Google Scholar]
  23. Dehmer, M.; Kraus, V. On extremal properties of graph entropies. MATCH Commun. Math. Comput. Chem 2012, 68, 889–912. [Google Scholar]
  24. Shi, Y. Entropy lower bounds for quantum decision tree complexity. Inf. Process. Lett 2002, 81, 23–27. [Google Scholar]
  25. Dragomir, S.; Goh, C. Some bounds on entropy measures in information theory. Appl. Math. Lett 1997, 10, 23–28. [Google Scholar]
  26. Bondy, J.A.; Murty, U.S.R. Graph Theory; Springer: Berlin, Germany, 2008. [Google Scholar]
  27. Bonchev, D.; Trinajstić, N. Information theory, distance matrix and molecular branching. J. Chem. Phy 1977, 67, 4517–4533. [Google Scholar]
  28. Dehmer, M.; Varmuza, K.; Borgert, S.; Emmert-Streib, F. On entropy-based molecular descriptors: Statistical analysis of real and synthetic chemical structures. J. Chem. Inf. Model 2009, 49, 1655–1663. [Google Scholar]
  29. Abramov, O.; Lokot, T. Typology by means of language networks: Applying information theoretic measures to morphological derivation networks. In Towards an Information Theory of Complex Networks: Statistical Methods and Applications; Dehmer, M., Emmert-Streib, F., Mehler, A., Eds.; Springer: New York, NY, USA, 2011; pp. 321–346. [Google Scholar]
  30. Brandes, U. A faster algorithm for betweenness centrality. J. Math. Sociol 2001, 25, 163–177. [Google Scholar]
  31. Alizadeh, Y.; Andova, V.; Klavzar, S.; Skrekovski, R. Wiener dimension: Fundamental properties and (5,0)-nanotubical fullerenes. MATCH Commun. Math. Comput. Chem 2014, 72, 279–294. [Google Scholar]
  32. da Fonseca, C.M.; Ghebleh, M.; Kanso, A.; Stevanovic, D. Counterexamples to a conjecture on Wiener index of common neighborhood graphs. MATCH Commun. Math. Comput. Chem 2014, 72, 333–338. [Google Scholar]
  33. Dobrynin, A.A.; Entringer, R.C.; Gutman, I. Wiener index of trees: Theory and applications. Acta Appl. Math 2001, 66, 211–249. [Google Scholar]
  34. Hamzeh, A.; Iranmanesh, A.; Reti, T.; Gutman, I. Chemical graphs constructed of composite graphs and their q-Wiener index. MATCH Commun. Math. Comput. Chem 2014, 72, 807–833. [Google Scholar]
  35. Hrinakova, K.; Knor, M.; Skrekovski, R.; Tepeh, A. A congruence relation for the Wiener index of graphs with a tree-like structure. MATCH Commun. Math. Comput. Chem 2014, 72, 791–806. [Google Scholar]
  36. Knor, M.; Luzar, B.; Skrekovski, R.; Gutman, I. On Wiener index of common neighborhood graphs. MATCH Commun. Math. Comput. Chem 2014, 72, 321–332. [Google Scholar]
  37. Lin, H. On the Wiener index of trees with given number of branching vertices. MATCH Commun. Math. Comput. Chem 2014, 72, 301–310. [Google Scholar]
  38. Lin, H. Extremal Wiener index of trees with given number of vertices of even degree. MATCH Commun. Math. Comput. Chem 2014, 72, 311–320. [Google Scholar]
  39. Lin, H. A note on the maximal Wiener index of trees with given number of vertices of maximum degree. MATCH Commun. Math. Comput. Chem 2014, 72, 783–790. [Google Scholar]
  40. Ma, J.; Shi, Y.; Yue, J. On the extremal Wiener polarity index of unicyclic graphs with a given diameter. In Topics in Chemical Graph Theory; Gutman, I., Ed.; Mathematical Chemistry Monographs, No.16a; University of Kragujevac and Faculty of Science Kragujevac: Kragujevac, Serbia, 2014; pp. 177–192. [Google Scholar]
  41. Skrekovski, R.; Gutman, I. Vertex version of the Wiener theorem. MATCH Commun. Math. Comput. Chem 2014, 72, 295–300. [Google Scholar]
  42. Alon, N. On the number of subgraphs of prescribed type of graphs with a given number of edges. Isr. J. Math 1981, 38, 116–130. [Google Scholar]
  43. Alon, N. On the number of certain subgraphs contained in graphs with a given number of edges. Isr. J. Math 1986, 53, 97–120. [Google Scholar]
  44. Bollobás, B.; Erdös, P. Graphs of extremal weights. Ars Combin 1998, 50, 225–233. [Google Scholar]
  45. Bollobás, B.; Sarkar, A. Paths in graphs. Stud. Sci. Math. Hung 2001, 38, 115–137. [Google Scholar]
  46. Bollobás, B.; Sarkar, A. Paths of length four. Discret. Math 2003, 265, 357–363. [Google Scholar]
  47. Bollobás, B.; Tyomkyn, M. Walks and paths in trees. J. Graph Theory 2012, 70, 54–66. [Google Scholar]
  48. Holzinger, A.; Hortenhuber, M.; Mayer, C.; Bachler, M.; Wassertheurer, S.; Pinho, A.; Koslicki, D. On Entropy-based Data Mining. In Interactive Knowledge Discovery and Data Mining: State-of-the-Art and Future Challenges in Biomedical Informatics; Holzinger, A., Jurisica, I., Eds.; Lecture Notes in Computer Science, Volume 8401; Springer: Berlin/Heidelberg, Germany, 2014; pp. 209–226. [Google Scholar]
Figure 1. The trees with maximum value of I2(T) among all trees with n vertices for 7 ≤ n ≤ 10.
Figure 2. The trees with minimum value of I2(T) among all trees with n vertices for 7 ≤ n ≤ 10.