Entropy of Closure Operators and Network Coding Solvability

The entropy of a closure operator has been recently proposed for the study of network coding and secret sharing. In this paper, we study closure operators in relation to their entropy. We first introduce four different kinds of rank functions for a given closure operator, which determine bounds on the entropy of that operator. This yields new axioms for matroids based on their closure operators. We also determine necessary conditions for a large class of closure operators to be solvable. We then define the Shannon entropy of a closure operator and use it to prove that the set of closure entropies is dense. Finally, we justify why we focus on the solvability of closure operators only.


Introduction
Network coding is a novel means to transmit data through a network, where each intermediate node determines its output packet from all of the packets it receives [1]. Network coding problems are further defined by restrictions on the alphabet of packets and sometimes on what computations the intermediate nodes may perform. In particular, linear network coding [2] is optimal in the case of one source; however, this is not the case for multiple sources and destinations [3,4]. Although for large dynamic networks, good heuristics, such as random linear network coding [5,6], can be used, maximizing the amount of information that can be transmitted over a static network is fundamental, but very hard in practice. Solving this problem by brute force, i.e., considering all possible operations at all nodes, is computationally prohibitive. The network coding solvability problem is given as follows: given a network (with corresponding graph, sources, destinations and messages), can all of the messages be transmitted? This problem is very difficult; for instance, for some networks it is as hard as determining whether k mutually orthogonal Latin squares of a given order exist [7,8].
Several major advances have been made on this problem. First of all, it can always be reduced to a multiple unicast instance, where each source sends a different message, requested by a unique corresponding destination. In [9], the network coding solvability problem is reduced to the guessing game (described in Section 2.4 below), a simple cooperative problem on arbitrary directed graphs, thus removing the asymmetry between sources, intermediate nodes and destinations. Notably, [10] introduces the entropy of a directed graph (not to be mistaken with Körner's graph entropy in [11] or with the Shannon capacity of a graph [12]); calculating this entropy solves the network solvability problem. This problem can be tackled by a more combinatorial approach, based on the so-called guessing number of a graph, which is closely related to the entropy [10]. The guessing number of graphs is studied further in [13], where it is proved that the guessing number of a directed graph is equal to the independence number of a related undirected graph. The guessing number of undirected graphs is further explored in [14].
A closure operator on the vertex set of a digraph is introduced in [8]. Network coding solvability is then proven to be a special case of a more general problem, called the closure solvability problem, for the closure operator defined on a digraph related to the network coding instance. The latter problem also generalises the search for ideal secret sharing schemes [15]. The main interest of closure solvability is that it allows us to use closure operators which do not arise from digraphs (notably the uniform matroids), but which have been proven to be solvable over many alphabets. In this paper, we introduce the concept of the entropy of an arbitrary closure operator. Again, calculating the entropy of a closure operator determines whether this closure operator is solvable or not. Therefore, this paper aims at studying this quantity in detail.
The closure solvability problem generalises different problems in coding theory, cryptology or combinatorics.
• As mentioned above, closure operators associated with digraphs are particularly relevant for network coding. Indeed, a network coding instance is solvable if and only if cl_D is solvable for some digraph D related to the network coding instance [8].
• When reduced to matroids, this is the problem of representation by partitions in [16], which is equivalent to determining secret-sharing matroids [15].
• When further reduced to the uniform matroid, this is exactly the problem of finding maximum distance separable (MDS) codes, which is arguably the most important open problem in coding theory (see [17]). Special cases include the famous combinatorial problem of the existence of mutually orthogonal Latin squares.
The rest of the paper is organised as follows. We review the closure operator associated with a digraph and the general closure solvability problem in Section 2. In Section 3, we introduce four kinds of rank functions for a given closure operator. This not only helps us derive bounds on the entropy, but we are also able to provide axioms for matroids that are, to the author's knowledge, new. Section 4 then studies a natural upper bound on the entropy, based on polymatroids. This helps us prove that the set of closure entropies contains all rational numbers above one. Finally, Section 5 investigates the solvability problem beyond closure operators.

Closure Operators
Throughout this paper, V is a set of n elements. A closure operator on V is a mapping cl : 2^V → 2^V which satisfies the following properties (see Chapter IV in [18]): for any X, Y ⊆ V, X ⊆ cl(X) (extensive); if X ⊆ Y, then cl(X) ⊆ cl(Y) (isotone); and cl(cl(X)) = cl(X) (idempotent). A closed set is a set equal to its closure. For instance, in a group, one may define the closure of a set as the subgroup generated by the elements of the set; the family of closed sets is simply the family of all subgroups of the group. Another example is given by linear spaces, where the closure of a set of vectors is the subspace they span. Closure operators are central in lattice theory and in universal algebra; moreover, any Galois connection between subset lattices is equivalent to a closure operator on subsets.
We refer to r := min{|X| : X ⊆ V, cl(X) = V} as the rank of the closure operator. Any set b ⊆ V of size r whose closure is V is referred to as a basis of cl. There is a natural partial order on closure operators of the same set: we denote cl_1 ≤ cl_2 if cl_1(X) ⊆ cl_2(X) for all X ⊆ V. We shall focus on two families of closure operators. Firstly, a matroid closure is a closure operator satisfying the Steinitz-Mac Lane exchange property (in order to simplify notation, we shall identify a singleton {v} with its element v): if X ⊆ V, v ∈ V and u ∈ cl(X ∪ v)\cl(X), then v ∈ cl(X ∪ u) [19]. In particular, the uniform matroid U_{r,n} of rank r over n vertices is defined by cl(X) = X if |X| < r, and cl(X) = V otherwise. It is worth noting that matroids constitute a much richer variety than uniform matroids.
Secondly, let D = (V, E) be a digraph on n vertices (possibly with loops, but without any repeated arcs). The in-neighbourhood of a vertex v is denoted as v⁻ = {u : (u, v) ∈ E}; we extend this definition to subsets of vertices: Y⁻ = ⋃_{v∈Y} v⁻. The D-closure of any X ⊆ V is defined as follows [8]. We let c_D(X) := X ∪ {v ∈ V : v⁻ ⊆ X}, and the D-closure of X is obtained by applying c_D repeatedly n times: cl_D(X) := c_D^n(X). This definition can be intuitively explained as follows. Suppose we assign a function to each vertex of D, which only depends on its in-neighbourhood (the function that decides which message the vertex will transmit). If we know the messages sent by the vertices of X, we also know the messages that will be sent by any vertex in c_D(X). By applying this iteratively, we can determine all messages sent by the vertices in cl_D(X). Therefore, cl_D(X) represents everything that is determined by X.
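As a small illustration (our own sketch, not from the paper), the iterative definition of cl_D can be computed directly; the digraph below is the directed 3-cycle, for which cl_D = U_{1,n} as noted in Example 1:

```python
def d_closure(n, edges, X):
    """D-closure cl_D(X): apply c_D (add every vertex whose in-neighbours
    all lie in the current set) n times."""
    in_nbrs = {v: {u for (u, w) in edges if w == v} for v in range(n)}
    S = set(X)
    for _ in range(n):
        S |= {v for v in range(n) if in_nbrs[v] <= S}
    return frozenset(S)

# Directed 3-cycle 0 -> 1 -> 2 -> 0: knowing any one vertex determines all.
edges = [(0, 1), (1, 2), (2, 0)]
assert d_closure(3, edges, {0}) == frozenset({0, 1, 2})  # cl_D({0}) = V
assert d_closure(3, edges, set()) == frozenset()         # cl_D(empty) = empty
```

The rank of cl_D here is one (a single vertex is a feedback vertex set), consistent with U_{1,3}.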
Alternatively, we have cl_D(X) := X ∪ Y, where Y is the largest acyclic set of vertices such that Y⁻ ⊆ X ∪ Y (see Lemma 1 in [8]). Recall that a feedback vertex set is a set of vertices X such that V\X induces an acyclic subgraph. The rank of cl_D is therefore the minimum size of a feedback vertex set of D.
Example 1. The D-closure of some classes of graphs can be readily determined.

Partitions
A partition of a finite set B is a collection of subsets, called parts, which are pairwise disjoint and whose union is the whole of B. We denote the parts of a partition f as P_i(f). If every part of f is contained in a unique part of g, we say f refines g. The equality partition E_B with |B| parts refines any other partition, while the universal partition (with only one part) is refined by any other partition of B. The common refinement of two partitions f, g of B is given by h := f ∨ g, whose parts are the nonempty intersections P_{i,j}(h) = P_i(f) ∩ P_j(g).
We shall usually consider a tuple of n partitions f = (f_1, ..., f_n) of the same set, assigned to the elements of a finite set V with n elements. In that case, for any X ⊆ V, we denote the common refinement of all f_v, v ∈ X, as f_X := ⋁_{v∈X} f_v. For any X, Y ⊆ V, we then have f_{X∪Y} = f_X ∨ f_Y.
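The common refinement and the derived partitions f_X can be sketched in a few lines (an illustrative sketch; the partition names are ours):

```python
from functools import reduce

def refine(f, g):
    """Common refinement f v g: the nonempty pairwise intersections of parts."""
    return {p & q for p in f for q in g if p & q}

# Two partitions of B = {0, 1, 2, 3}, each part a frozenset.
f1 = {frozenset({0, 1}), frozenset({2, 3})}
f2 = {frozenset({0, 2}), frozenset({1, 3})}

def f_X(parts, X, B):
    """f_X: common refinement of all partitions indexed by X,
    starting from the universal partition (one part)."""
    return reduce(refine, (parts[v] for v in X), {frozenset(B)})

# Refining f1 by f2 splits B into singletons: the equality partition E_B.
result = f_X({1: f1, 2: f2}, [1, 2], {0, 1, 2, 3})
assert result == {frozenset({x}) for x in range(4)}
```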

Closure Solvability and Entropy
We now review the closure solvability problem [8]. The instance of the problem consists of a closure operator cl on V with rank r and a finite set A with |A| ≥ 2, referred to as the alphabet.

Definition 1. A coding function for cl over A is a family f of n partitions of A^r (the set of strings of length r over A) into at most |A| parts, such that f_X = f_{cl(X)} for all X ⊆ V.
We remark that the family of partitions f = (f_1, ..., f_n), where each f_i is the partition of A^r with only one part, is a coding function for any closure operator of rank r over A.
The problem is to determine whether there exists a coding function for cl over A such that f_V has |A|^r parts; such a coding function is called a solution of cl over A. That is, we want to determine whether there exists an n-tuple f = (f_1, ..., f_n) of partitions of A^r into at most |A| parts such that f_X = f_{cl(X)} for all X ⊆ V and f_V = E_{A^r}. We make several remarks concerning the closure solvability problem.
(1) The solvability problem could be defined as searching for families of partitions of any set B with |B| ≥ |A|^r, such that f_V = E_B. However, this can only occur if |B| = |A|^r; moreover, since only the cardinality of B matters, we can assume without loss of generality that B = A^r.
(2) A coding function f naturally yields a closure operator cl_f on V, where cl_f(X) = {v ∈ V : f_{X∪v} = f_X}; we then have cl ≤ cl_f. Therefore, if cl_2 is solvable, then any cl_1 with the same rank and cl_1 ≤ cl_2 is also solvable.
For any partition g of A^r, the entropy of g is defined as the traditional entropy of the probability distribution where each event is a part with probability proportional to its size, scaled by the logarithm of the alphabet size, i.e., H(g) := -∑_i (|P_i(g)|/|A|^r) log_{|A|}(|P_i(g)|/|A|^r). The equality partition on A^r is the only partition with full entropy r. Denoting H_f(X) := H(f_X), we can recast the conditions above as: H_f(v) ≤ 1 for all v ∈ V; H_f(X) = H_f(cl(X)) for all X ⊆ V; and H_f(V) = r. The first two conditions are equivalent to f being a coding function. In general, the rank cannot always be attained; hence, we define the entropy of a closure operator cl over A as the maximum entropy of any coding function for it: H(cl, A) := max{H_f(V) : f a coding function for cl over A}. The entropy of cl is defined to be the supremum of H(cl, A) over all alphabets A.
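As a sanity check (our own sketch, with parts represented by any sized collections), the scaled entropy of a partition can be computed directly:

```python
import math

def partition_entropy(g, q, r):
    """Entropy of a partition g of A^r with |A| = q, scaled by log q:
    H(g) = sum over parts P of -(|P|/q^r) * log_q(|P|/q^r)."""
    N = q ** r
    return sum(-(len(p) / N) * math.log(len(p) / N, q) for p in g)

# Over A = {0, 1} and r = 2, the ground set has q^r = 4 elements.
universal = [range(4)]              # one part: entropy 0
equality = [[x] for x in range(4)]  # four singletons: full entropy r = 2
halves = [range(2), range(2, 4)]    # two parts of size 2: entropy 1
```

The equality partition is the only one attaining the full entropy r, matching the definition of a solution.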

Guessing Game and Closure Solvability
The guessing game was first proposed for the study of network coding solvability by Riis [9]. It is a cooperative game with n players, where each player can only see the hats of some of the other players, but not their own. All of the players must guess the color of their own hat at the same time; the team wins if everyone guesses correctly. The aim of the guessing game is to devise a guessing strategy (a protocol) which maximises the number of winning configurations.
More formally, a configuration on a digraph D on V over a finite alphabet A is simply an n-tuple x ∈ A^n; a protocol assigns to each vertex a guessing function of the values on its in-neighbourhood, and a configuration is fixed by the protocol if every vertex correctly guesses its own value. The guessing number of D is then defined as the logarithm (base |A|) of the maximum number of configurations fixed by a protocol of D. The guessing game on D is equivalent to the solvability problem for cl_D [8].
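For small digraphs, the guessing number can be found by exhaustive search over all pure protocols. The sketch below (ours) recovers the fact that the directed 3-cycle has guessing number one over a binary alphabet:

```python
from itertools import product
import math

A = (0, 1)
n = 3
sees = {0: 2, 1: 0, 2: 1}  # directed 3-cycle: vertex v sees only vertex sees[v]

best = 0
# A pure protocol gives each vertex a function A -> A of the single hat it
# sees; a configuration x is fixed if every vertex guesses its own value.
for funcs in product(product(A, repeat=len(A)), repeat=n):
    fixed = sum(all(funcs[v][x[sees[v]]] == x[v] for v in range(n))
                for x in product(A, repeat=n))
    best = max(best, fixed)

assert best == 2  # "repeat what you see" fixes the two constant configurations
guessing_number = math.log(best, len(A))  # = 1.0
```

Since cl_{C_3} = U_{1,3} has rank one, this matches the equivalence between the guessing game and closure solvability.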

Rank Functions of Closure Operators
In this section, we investigate the properties of closure operators in general, and we derive bounds on the entropy of their coding functions. We shall introduce four kinds of ranks for any closure operator. It is worth noting that they are all distinct from the so-called rank function of a closure operator studied in [20].

Inner and Outer Ranks
First of all, we are interested in upper bounds on the entropy of coding functions.
Definition 2. The inner rank and outer rank of a subset X of vertices are respectively given by ir(X) := min{|Y| : Y ⊆ V, cl(Y) = cl(X)} and or(X) := min{|Y| : Y ⊆ V, cl(X) ⊆ cl(Y)}. Although the notation should reflect which closure operator is used in order to be rigorous, we shall usually omit this dependence for the sake of clarity. Instead, if the closure operator is "decorated" by subscripts or superscripts, then the corresponding parameters will be decorated in the same fashion.
A set i with |i| = ir(X) and cl(i) = cl(X) is called an inner basis of X; similarly, a set o with |o| = or(X) and cl(X) ⊆ cl(o) is called an outer basis of X.
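Both ranks can be computed by brute force from Definition 2; the example below (our choice) uses the uniform matroid U_{2,3}:

```python
from itertools import combinations

V = (0, 1, 2)

def cl(X):
    """Uniform matroid U_{2,3}: cl(X) = X if |X| < 2, else V."""
    return frozenset(X) if len(X) < 2 else frozenset(V)

def all_subsets():
    for k in range(len(V) + 1):
        yield from map(frozenset, combinations(V, k))

def inner_rank(X):
    """ir(X): smallest |Y| with cl(Y) = cl(X)."""
    return min(len(Y) for Y in all_subsets() if cl(Y) == cl(X))

def outer_rank(X):
    """or(X): smallest |Y| with cl(X) contained in cl(Y)."""
    return min(len(Y) for Y in all_subsets() if cl(X) <= cl(Y))

assert inner_rank({0, 1, 2}) == 2   # a basis of U_{2,3} has two elements
assert outer_rank({0, 1}) == 2
assert outer_rank(set()) == 0
```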
The following properties are an easy exercise.
Note that the inner rank is not monotonic, as seen in the example in Figure 1. If cl_1(X) ⊆ cl_2(X) for some X, then or_1(X) ≥ or_2(X); indeed, any outer basis of X with respect to cl_1 is also an outer basis of X with respect to cl_2. In particular, if cl_1 ≤ cl_2, then or_1(X) ≥ or_2(X) for all X.
This lemma shows that we get subadditivity for free. Since the entropy satisfies all of the conditions of Lemma 1, we obtain an upper bound on the entropy.
Corollary 1. For any coding function f and any X ⊆ V, H_f(X) ≤ or(X).

Flats and Span
Before we move on to lower bounds on the entropy, we define two fundamental concepts.

Definition 3. A flat is a subset F of vertices for which there is no X ⊃ F with or(X) = or(F).
Proposition 2. Flats satisfy the following properties: (1) cl(∅) is the only flat with outer rank zero, and V is the only flat with outer rank r; (2) any flat F is a closed set; (3) or(F) = ir(F); (4) for any X, there exists a flat F ⊇ X with or(F) = or(X).
Proof. (2) Since cl(F) contains F while having the same outer rank as F, it cannot properly contain F. (4) For any X, let C be a set of largest cardinality containing X with outer rank or(X); then, there exists no G such that C ⊂ G and or(G) = or(X) = or(C), so C is a flat.
It is worth noting that there are closed sets that are not flats. For example, consider the following closure operator on V = {1, ..., n}, where cl(X) = {1, ..., max(X)} and cl(∅) = ∅. Then, it has rank one and, hence, only two flats (the empty set and V), while it has n + 1 closed sets (the empty set and cl(i) for all i). We shall clarify the relationship between closed sets and flats below.

Definition 4. For any X ⊆ V, the union of all flats containing X with outer rank equal to that of X is referred to as the span of X, i.e., span(X) := ⋃{F : F flat, X ⊆ F, or(F) = or(X)}.

Proposition 3. For any X, (1) cl(X) ⊆ span(X), with equality if and only if cl(X) is a flat; (2) span(cl(X)) = span(X); (3) span(X) = {v ∈ V : or(X ∪ v) = or(X)}.
Proof. The first two properties follow directly from the definition. For the third, suppose v ∈ F, a flat containing X with or(F) = or(X); then, or(X ∪ v) ≤ or(F) = or(X). Conversely, if or(X ∪ v) = or(X), then X ∪ v is contained in a flat with the same outer rank as X (by Property (4) of Proposition 2) and, hence, in span(X).
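Proposition 3(3) gives a convenient way to compute spans; the sketch below (ours) applies it to the rank-one operator cl(X) = {1, ..., max(X)} discussed above:

```python
from itertools import combinations

V = (1, 2, 3)

def cl(X):
    """Rank-one operator from the text: cl(X) = {1, ..., max(X)}, cl(empty) = empty."""
    return frozenset(range(1, max(X) + 1)) if X else frozenset()

def outer_rank(X):
    return min(len(Y) for k in range(len(V) + 1)
               for Y in map(frozenset, combinations(V, k))
               if cl(X) <= cl(Y))

def span(X):
    # Proposition 3(3): span(X) = {v in V : or(X | {v}) = or(X)}.
    return frozenset(v for v in V if outer_rank(set(X) | {v}) == outer_rank(X))

# cl({1}) = {1} is closed but not a flat: its span is all of V.
assert cl({1}) == frozenset({1})
assert span({1}) == frozenset(V)
```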
Flats and spans provide two alternate axioms for matroids.

Theorem 1. Let cl be a closure operator. The following are equivalent: (1) cl is a matroid closure; (2) every closed set is a flat; (3) every closed set is a span.
Proof. The first property clearly implies the third one. Let us now prove that the second property implies the first one. Let X ⊆ V, v ∈ V and u ∈ cl(X ∪ v)\cl(X); then, or(X ∪ u) = or(X) + 1 = or(X ∪ v) and, hence, cl(X ∪ u) = cl(X ∪ v). Thus, cl satisfies the Steinitz-Mac Lane exchange axiom.
We now prove that the third property implies the second. Suppose all closed sets are spans; then, we shall prove that all closed sets of outer rank k are flats, by induction on 0 ≤ k ≤ r. This is clear for k = 0; hence, suppose it holds for all values up to k − 1.

There are solvable closure operators that are not matroids, e.g., the one arising from the undirected graph C4 displayed in Figure 2. It is solvable because it has rank two and contains K_2 ∪ K_2; more explicitly, a solution exists over any alphabet. In that case, note that the outer rank is submodular, and hence, span_{C4} = U_{2,4} is a matroid; however, cl_{C4} is not a matroid itself.
We would like to explain the significance of flats in matroids for random network coding. A model for noncoherent random network coding based on matroids is proposed in [21], which generalises routing (a special case for the uniform matroid), linear network coding (the projective geometry) and affine network coding (the affine geometry). In order to combine the messages they receive, the intermediate nodes select a random element from the closure of the received messages. The model is based on matroids because all closed sets are flats; hence, a new message is either in the closure of all of the previously received messages (and is not informative) or it increases the outer rank (and is fully informative).

Upper and Lower Ranks
We are now interested in lower bounds on the entropy of coding functions. Since any closure operator has a trivial coding function with entropy zero (where the universal partition is placed on every vertex), the entropy of any coding function cannot be bounded below. Therefore, most of our bounds will apply to solutions only. A few elementary properties of the lower and upper ranks are listed below. Again, if cl_1 ≤ cl_2, then lr_1(X) ≥ lr_2(X) and ur_1(X) ≥ ur_2(X) for all X ⊆ V.

Lemma 2. The following hold: (1) lr(V) = ur(V) = r and lr(∅) = ur(∅) = 0.
(2) For any X ⊆ V, lr(X) = 0 if and only if cl(V\X) = V. Hence, ur(X) = r if and only if cl(X) = V.
Proof. The first three properties are easily proven. Property (4) for the upper rank follows from Property (3); the result for the lower rank follows from lr(X) = r − ur(V\X). For Property (5), Property (4) yields ur(X) ≤ ur(cl(X)), while cl(X ∪ Y) = cl(cl(X) ∪ Y) yields the reverse inequality. We now prove Property (6). The inequality ur(X) ≤ or(X) follows from the subadditivity of the outer rank. To prove that lr(X) ≤ ur(X), let b be a basis for cl. Then, r = |b| ≥ lr(X) + lr(V\X), and the claim follows from Property (4).
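The proof of Property (6) intersects bases with X and its complement. On a matroid example, the lower and upper ranks can be computed as the minimum and maximum sizes of such intersections over all bases; the sketch below assumes this characterization (our reading, consistent with Properties (1)-(4)):

```python
from itertools import combinations

V = frozenset({0, 1, 2})
r = 2

def cl(X):
    """Uniform matroid U_{2,3}: cl(X) = X if |X| < 2, else V."""
    return frozenset(X) if len(X) < 2 else V

bases = [frozenset(b) for b in combinations(sorted(V), r) if cl(b) == V]

def lower_rank(X):
    return min(len(b & frozenset(X)) for b in bases)

def upper_rank(X):
    return max(len(b & frozenset(X)) for b in bases)

# Property (4): lr(X) = r - ur(V \ X); Property (6): lr(X) <= ur(X).
X = {0}
assert lower_rank(X) == r - upper_rank(V - frozenset(X))
assert lower_rank(X) <= upper_rank(X)
```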
We remark that, for any solution f, we have r = H_f(V) ≤ or(X) + or(Y) for any X, Y such that cl(X ∪ Y) = V. Therefore, we obtain H_f(X) ≥ r − or(Y) for any Y such that cl(X ∪ Y) = V.

Corollary 2. For any solution f of cl and any
Note that a trivial lower bound on H_f(X) (where f is a solution) is given by r − H_f(V\X). Therefore, the intermediate bounds on H_f(X) in Corollary 2 refine this trivial bound.
Some of the results above can be generalised to any coding function f. We finish this subsection by remarking that Theorem 1 has an analogue for the upper rank. Namely, define an upper flat as a set F such that F ⊂ X implies ur(X) > ur(F); define also the upper span of X as uspan(X) := ⋃{F : F upper flat, X ⊆ F, ur(F) = ur(X)} = {v ∈ V : ur(X ∪ v) = ur(X)}.

Inner and Outer Complemented Sets
We are now interested in a case where the bounds on the entropy are tight.

Definition 6. We say a set X is outer complemented if or(X) = ur(X). Moreover, we say it is inner complemented if ir(X) = ur(X).
Therefore, if X is outer complemented, then H_f(X) = or(X) = ur(X) for any solution f. Note that X is outer (inner) complemented if and only if cl(X) is outer (inner) complemented.
Proposition 4. The following are equivalent: (1) X is outer complemented; (2) there exists Z such that or(X) + or(Z) = r, cl(X ∪ Z) = V and X ∩ Z = ∅; (3) any outer basis of X is contained in a basis of V.
Similar results hold for inner complemented sets. The following are equivalent: (1) X is inner complemented; (2) X is outer complemented and ir(X) = or(X); (3) any inner basis of X is contained in a basis of V.
Proof. The equivalence of the first two properties is easily shown. If X is outer complemented, let o be an outer basis of X, and let Z satisfy cl(X ∪ Z) = V and |Z| = r − or(X). Then, o ∪ Z is a basis of V. Conversely, if any outer basis can be extended to a basis, then any such extension is a valid Z for Property (2). The properties for an inner complemented set are easy to prove.
We saw earlier that cl(X) ⊆ cl_f(X) for any coding function f and any X. This can be refined when f is a solution and X is outer complemented.

Lemma 3. If f is a solution of cl, then cl(span(X)) ⊆ cl_f(X) for any outer complemented X.
Proof. For any outer complemented X, we have:

Corollary 3. If there exists an outer complemented set X such that its span has higher outer rank and is also outer complemented, then cl is not solvable over any alphabet.
By extension, we say that cl is outer complemented if all sets are outer complemented. We can characterise the solvable outer complemented closure operators.

Theorem 3. Suppose that cl has rank r and is outer complemented. Then, cl is solvable if and only if span is a solvable matroid with rank r.
Proof. If all sets are outer complemented, then any solution f of cl is also a coding function of span, since span(X) = {v ∈ V : H_f(X ∪ v) = H_f(X)}. Since the outer rank is equal to the entropy H_f, it is submodular; hence, span is a matroid whose rank function is given by the outer rank. Thus, span has rank r, and f is a solution for it. Conversely, if span is a solvable matroid with rank r, then we have cl ≤ span, and cl is solvable.
For instance, for the undirected cycle C5 in Figure 3, cl_{C5} is outer complemented, though the outer rank is not submodular; hence, span is not a matroid. As such, C5 is not solvable (its entropy is actually 2.5 [10]). We would like to emphasize that if all sets are outer complemented and cl is solvable, then the outer rank must be submodular, i.e., the rank function of a matroid. However, this does not imply that cl should be a matroid itself. For instance, consider cl defined on {1, 2, 3} as follows: cl(1) = 12, cl(2) = 2, cl(3) = 3, cl(13) = cl(23) = 123. Then, any set is inner complemented, and cl is solvable (by letting f_1 = f_2 and choosing f_3 such that f_1 ∨ f_3 = E_{A^2}), but cl is not a matroid.
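The solvability of this last example can be checked mechanically over A = {0, 1} (a verification sketch; the encoding of partitions is ours):

```python
from itertools import product
from functools import reduce

A = (0, 1)
B = list(product(A, repeat=2))          # the ground set A^2

def blocks(key):
    """Partition of B induced by a labelling function."""
    parts = {}
    for x in B:
        parts.setdefault(key(x), set()).add(x)
    return {frozenset(p) for p in parts.values()}

f = {1: blocks(lambda x: x[0]),          # f_1 = f_2: split by first coordinate
     2: blocks(lambda x: x[0]),
     3: blocks(lambda x: x[1])}          # f_3: split by second coordinate

def refine(p, q):
    return {s & t for s in p for t in q if s & t}

def f_X(X):
    return reduce(refine, (f[v] for v in X), {frozenset(B)})

cl = {frozenset(): frozenset(),
      frozenset({1}): frozenset({1, 2}),
      frozenset({2}): frozenset({2}),
      frozenset({3}): frozenset({3}),
      frozenset({1, 2}): frozenset({1, 2}),
      frozenset({1, 3}): frozenset({1, 2, 3}),
      frozenset({2, 3}): frozenset({1, 2, 3}),
      frozenset({1, 2, 3}): frozenset({1, 2, 3})}

# Coding function: f_X = f_{cl(X)} for every X; solution: f_V = E_{A^2}.
assert all(f_X(X) == f_X(cl[X]) for X in cl)
assert len(f_X(frozenset({1, 2, 3}))) == len(B)
```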

Combining Closure Operators
In this subsection, V_1 and V_2 are disjoint sets of respective cardinalities n_1 and n_2; cl_1 and cl_2 are closure operators on V_1 with rank r_1 and on V_2 with rank r_2, respectively. We further let V := V_1 ∪ V_2 and, for any X ⊆ V, X_1 := X ∩ V_1 and X_2 := X ∩ V_2. Different ways of combining closure operators have been proposed in [8].
Definition 7. The disjoint, unidirectional and bidirectional unions of cl_1 and cl_2 (denoted cl_1 ∪ cl_2, cl_1 ∪→ cl_2 and cl_1 ∪↔ cl_2, respectively) are, respectively:

If cl is a closure operator on V satisfying cl_1 ∪ cl_2 ≤ cl ≤ cl_1 ∪→ cl_2, it has rank r_1 + r_2 and entropy H(cl_1) + H(cl_2). We can then split the problem into two parts. In that case, we also have:

The rank of the bidirectional union is given by:

while its entropy only satisfies the inequality:

We can determine how the four rank functions given above behave with regards to the three types of union.

Proposition 5. For the disjoint union, let cl_∪ := cl_1 ∪ cl_2; then: For the unidirectional union, let cl_∪ := cl_1 ∪→ cl_2; then: For the bidirectional union, let cl_∪ := cl_1 ∪↔ cl_2; then:

Proof. The results for the disjoint union easily follow from the definitions. We then turn to the unidirectional union. Again, we remark that cl_∪(X) = V if and only if cl_1(X_1) = V_1 and cl_2(X_2) = V_2; this gives the rank, the upper rank of X and then its lower rank. For the upper rank, we have:

The proof for the inner rank is similar.
For the bidirectional union, this yields the outer rank; the inner rank is obtained by considering each case separately.
The lower rank follows from the upper rank.

Shannon Entropy
Since finding the entropy of a digraph is difficult in general, [9,10] developed the idea of the Shannon entropy of a graph. The main idea is to maximise over all functions that satisfy some of the properties of an entropic function, notably submodularity. This idea can be adapted to general closure operators. For any closure operator cl on V, a Shannon function for cl can be viewed as a cl-compatible polymatroid.

Definition 8. For any closure operator cl on V, a Shannon function for cl is a function r : 2^V → R such that: (1) r(∅) = 0; (2) r(X) ≤ r(Y) ≤ r(X) + |Y\X| whenever X ⊆ Y ⊆ V; (3) r is submodular, i.e., if X, Y ⊆ V, then r(X ∪ Y) + r(X ∩ Y) ≤ r(X) + r(Y); (4) for all X ⊆ V, r(X) = r(cl(X)).
The maximum value of r(V ) over all Shannon functions for cl is called the Shannon entropy of cl and is denoted by SE(cl).
Any Shannon function also satisfies the conditions of Lemma 1; hence, SE(cl) ≤ or(V) = r. Moreover, it is clear that if cl_1 ≤ cl_2, then SE(cl_1) ≥ SE(cl_2).
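Checking whether a candidate function is a Shannon function is straightforward. The sketch below (ours) tests the polymatroid-style properties — grounded at ∅, monotone with unit increments, submodular, and constant on closures — for the D-closure of the directed 3-cycle, where the candidate r(X) = min(|X|, 1) attains r(V) = 1 = rank:

```python
from itertools import combinations

V = frozenset({0, 1, 2})
SUBSETS = [frozenset(c) for k in range(len(V) + 1)
           for c in combinations(sorted(V), k)]

def cl(X):
    """cl_D for the directed 3-cycle, i.e., the uniform matroid U_{1,3}."""
    return frozenset() if not X else V

def is_shannon_function(r, eps=1e-9):
    if abs(r[frozenset()]) > eps:
        return False
    for X in SUBSETS:
        if abs(r[X] - r[cl(X)]) > eps:            # constant on closures
            return False
        for v in V:                               # monotone, unit increments
            if not (r[X] - eps <= r[X | {v}] <= r[X] + 1 + eps):
                return False
        for Y in SUBSETS:                         # submodular
            if r[X | Y] + r[X & Y] > r[X] + r[Y] + eps:
                return False
    return True

r = {X: min(len(X), 1) for X in SUBSETS}
assert is_shannon_function(r)   # so SE(cl) >= r[V] = 1; and SE(cl) <= rank = 1
```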
Lemma 4. If r is a Shannon function for cl, then r′ defined by r′(X) := r(X_1) + r(X_2) is a Shannon function for cl, such that r′(X) = r′(X_1) + r′(X_2) and r′(V) = r(V).
Proof. Only the closure property is nontrivial to verify. Since cl(X)

Proposition 6. SE(cl) = SE(cl_1) + SE(cl_2).

Proof. First of all, it is clear that SE(cl) ≥ SE(cl_1) + SE(cl_2). Indeed, let r_i be a Shannon function for cl_i; then, r defined by r(X) := r_1(X_1) + r_2(X_2) is a Shannon function for cl_1 ∪ cl_2.
We now show the reverse inequality. By Lemma 4, there exists a Shannon function r for cl with r(X) = r(X_1) + r(X_2) and r(V) = SE(cl). It is easily seen that the restriction r_2(X) of r(X) to V_2 is a Shannon function for cl_2; hence, r_2(V_2) = r(V_2) ≤ SE(cl_2). Furthermore, define the function:

We check that r_1 is indeed a Shannon function for cl_1. The first two properties are straightforward, while submodularity comes from:

and the closure property comes from the fact that cl

The Shannon entropy of the bidirectional union satisfies a similar inequality to the one for the corresponding entropy.

Proposition 7. For any cl_1 and cl_2, we have:

Proof. We say a function r : 2^V → R is a V_1-function if it satisfies all of the properties of a Shannon function, but only for all X, Y containing V_1. The maximum value of r(V) over any V_1-function, denoted as S, is greater than or equal to SE(cl_1 ∪ cl_2). Let r be a V_1-function and consider:

for all X ⊆ V_2. We then prove that r_2 is a Shannon function for cl_2. Only Property (4) is nontrivial to check; we have:

Symmetry finishes the proof.

Density of Closure Entropies
We remark that any closure operator of rank at least one has entropy at least one (assign the universal partition to every vertex in cl(∅) and the same partition g of A^r into |A| parts to every other vertex). Moreover, any D-closure for a digraph D with rank (i.e., minimum feedback vertex set size) two has entropy two; in fact, such closure operators are solvable over any sufficiently large alphabet [8]. This shows that multiple unicast instances with two source-destination pairs are solvable over all sufficiently large alphabets. This proof technique cannot be generalised to arbitrary digraphs, for C5 has rank three, but entropy of only 2.5. Another direction could then be to consider other families of closure operators and find "gaps" in the entropy distribution; in particular, we may ask whether all closure operators of rank two are solvable. Theorem 4 gives an emphatic negative answer to the last question: the set of all possible closure entropies is dense above one.

Theorem 4. For any r ≥ 2 and any rational number H in (1, r], there exists a closure operator of rank r with entropy equal to H.

The proof is constructive, i.e., for any H, we give a closure operator with entropy equal to H and the corresponding coding functions with entropy H. First of all, we introduce some notation regarding rooted trees. A rooted tree is a tree with a specific vertex, called the root, denoted as R. The vertices at distance k from the root form level k of the tree (hence, the root is the only vertex on level 0); this is denoted l_k. For any vertex v in level k, its parent is the only vertex adjacent to v on level k − 1; we denote it as p(v). Moreover, we denote its ancestry as a(v) := {v, p(v), p^2(v), ..., R} (remark that we include v in its ancestry). Conversely, a child of v is any vertex on level k + 1 adjacent to v, and any vertex without any children is a leaf of the tree. We denote the set of children of v as c(v). We extend the definitions above to any set of vertices X, e.g., p(X) = ⋃_{v∈X} p(v). The following properties easily follow.
(2) If X ⊆ Y, then a(X) ⊆ a(Y).

Fix a coding function f for cl. For any non-leaf v, the submodular inequality (with the sets X_1, ..., X_k corresponding to {u, v} for u ∈ c(v)) gives:

and hence:

We first add up by level; for level k (0 ≤ k ≤ L_t) of tree T_t, we denote H_k := ∑_{v∈l_k} H_f(v), and we obtain:

Let us now add up for all levels:

where we used H_{L_t} ≤ |l_{L_t}| = C_t!/(C_t − L_t)!. Simplifying, we obtain:

since, by definition, H_0 = H_f(R_t). We now add up for all trees T_1 up to T_{r−1}, and we obtain:

where we used the following relations:

Moreover, the set of all roots is a basis for the closure operator; hence:

Multiplying (2) by D(H − 1) and adding it to (1) eventually yields:

Beyond Closure Operators

We say that a, a′ : 2^V → 2^V are equivalent if any tuple of partitions f is a coding function of a if and only if it is a coding function for a′.
Theorem 5. Let a : 2^V → 2^V; then, there exists a closure operator on V which is equivalent to a. Therefore, the solvability problem for a can be reduced to the solvability problem for some closure operator.
Proof. We take three steps. Firstly, construct the digraph on 2^V with arcs (Y, a(Y)) for all Y ⊆ V. For any X ⊆ V, denote the connected component containing X as C(X). Then, we claim that b(X) := ⋃_{Y∈C(X)} a(Y) is equivalent to a (we note that b is extensive). Indeed, if f is a coding function for a, then f_X = f_{a(X)}. Hence, for any Y ∈ C(X), f_Y = f_X, and we obtain f_{b(X)} = f_X. Conversely, we have b(X) = b(a(X)); hence, if f is a coding function for b, then f_X = f_{b(X)} = f_{b(a(X))} = f_{a(X)} for all X.
Secondly, we claim that c(X) := ⋃_{Y⊆X} b(Y) is equivalent to b (we note that c is extensive and isotone). Indeed, if f is a coding function for b and Y ⊆ X, then f_X refines f_Y = f_{b(Y)}; thus, f_X refines f_{c(X)}. The converse is immediate; hence, f_X = f_{c(X)}, and f is a coding function for c. Conversely, if f is a coding function for c, then f_X = f_{c(X)} refines f_{b(X)} and, hence, is equal to f_{b(X)} for all X.
Thirdly, we claim that cl(X) := c^n(X) is equivalent to c (we note that cl is a closure operator). Indeed, if f is a coding function for c, then f_X = f_{c(X)} = ... = f_{c^n(X)}. Conversely, f_X = f_{cl(X)} refines f_{c(X)}.
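The three steps of this construction can be prototyped for a small ground set. This is a sketch with a hypothetical map a; we take b(X) to be the union of all subsets in the connected component of X, which contains the union of the a(Y) over the component and yields the same equivalence:

```python
from itertools import combinations

V = frozenset({0, 1, 2})
SUBSETS = [frozenset(c) for k in range(len(V) + 1)
           for c in combinations(sorted(V), k)]

def a(X):
    # Hypothetical map 2^V -> 2^V: neither extensive nor isotone.
    return frozenset({1}) if X == frozenset({0}) else frozenset(X)

# Step 1: components of the (undirected) graph on 2^V with arcs (Y, a(Y)).
adj = {Y: {a(Y)} for Y in SUBSETS}
for Y in SUBSETS:
    adj[a(Y)].add(Y)

def b(X):
    """Union of every subset in the connected component of X."""
    seen, stack = {X}, [X]
    while stack:
        for Z in adj[stack.pop()]:
            if Z not in seen:
                seen.add(Z)
                stack.append(Z)
    return frozenset().union(*seen)

def c(X):  # step 2: force isotonicity
    return frozenset().union(*(b(Y) for Y in SUBSETS if Y <= X))

def cl(X):  # step 3: iterate to idempotence
    S = frozenset(X)
    for _ in range(len(V)):
        S = c(S)
    return S

assert all(X <= cl(X) for X in SUBSETS)                                 # extensive
assert all(cl(X) <= cl(Y) for X in SUBSETS for Y in SUBSETS if X <= Y)  # isotone
assert all(cl(cl(X)) == cl(X) for X in SUBSETS)                         # idempotent
assert cl(frozenset({0})) == frozenset({0, 1})  # 0 and 1 are tied together by a
```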

Conclusions
In this paper, we pursued the study of closure solvability introduced in [8] for network coding and secret sharing.
We first investigated the nature of solvable closure operators. This yielded numerous new definitions (four different kinds of rank) and a new criterion for non-solvability (Theorem 3). In passing, we gave two new axioms for matroids in Theorems 1 and 2.
We then introduce the entropy of a closure operator, and we thoroughly investigate its properties.We are able to define the equivalent of the Shannon entropy of a graph.This yields Theorem 4, which shows that the set of entropy values of closure operators contains all rational numbers above one.However, it is easy to show that there are gaps between the entropy values of undirected graphs, for instance the only possible value between two and three is equal to 2.5.For directed graphs, we still do not know whether the set of entropy values is dense in [1, ∞).

Example 1 (continued). (1) If D is acyclic, then cl_D = U_{0,n}. (2) If D = C_n, the directed cycle, then cl_D = U_{1,n}. (3) If D = K_n, the complete graph, then cl_D = U_{n−1,n}. (4) If D has a loop on each vertex, then cl_D = U_{n,n}. Conversely, no other uniform matroid can be viewed as a D-closure. However, the following two questions are still open. For which digraphs are the D-closures matroids? Which matroids are represented by D-closures of digraphs?

Figure 1. Example where the inner rank is not monotonic.

Proof of Proposition 2(3). Let o be an outer basis of F. Since F ⊆ cl(o) while or(F) = or(cl(o)), we obtain F = cl(o), and o is an inner basis of F.

Induction step for the proof that the third property implies the second (Theorem 1): consider a minimal closed set c of outer rank k, i.e., or(c) = k and or(c′) ≤ k − 1 for any closed set c′ ⊂ c. By hypothesis, we have c = span(Y) for some Y ⊆ c; we now prove that c = cl(Y). Suppose that cl(Y) ⊂ c; then, cl(Y) = c′, a closed set of outer rank at most k − 1. By the induction hypothesis, c′ is a flat, i.e., c′ = span(c′) = span(cl(Y)) = span(Y) = c, a contradiction. Thus, c = span(c), and c is a flat.

Figure 2. The graph C4 whose closure operator is solvable, but not a matroid.

Figure 3. The graph C5 whose closure operator is outer complemented and not solvable.