Entropy Treatment of Evolution Algebras

In this paper, by introducing an entropy of Markov evolution algebras, we treat the isomorphism of S-evolution algebras. A family of Markov evolution algebras is defined through the Hadamard product of structural matrices of non-negative real S-evolution algebras, and their isomorphism is studied by means of their entropy. Furthermore, the isomorphism of S-evolution algebras is treated using the concept of relative entropy.


Introduction
The theory of non-associative algebras is an important branch of abstract algebra. Such algebras include baric, evolution, Bernstein, train, and stochastic algebras; these objects arose in the abstract description of biological systems [1][2][3][4].
Let E := (E, ·) be an algebra over a field K. Then E is called an evolution algebra if it admits a basis B := {e_1, e_2, . . . , e_n} such that

e_i · e_i = ∑_{k=1}^{n} a_{ik} e_k,   e_i · e_j = 0 if i ≠ j.   (2)

The matrix A = (a_{ik}) is called the structure matrix of E relative to B. A basis B satisfying (2) is called a natural basis of E. We say that E is a non-negative evolution algebra if K = R and the structure matrix entries a_{ik} are non-negative.
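As a sketch of this multiplication rule, the hypothetical helper `evol_product` below multiplies two elements written in coordinates with respect to the natural basis; since e_i · e_j = 0 for i ≠ j, only the diagonal terms survive:

```python
import numpy as np

def evol_product(x, y, A):
    """Product of x = sum_i x_i e_i and y = sum_i y_i e_i in an
    evolution algebra with structure matrix A: since e_i * e_j = 0
    for i != j, only the diagonal terms x_i * y_i survive."""
    return (x * y) @ A   # sum_i x_i y_i * (row i of A)

A = np.array([[0.0, 1.0],
              [0.5, 0.5]])
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])
print(evol_product(e1, e1, A))  # row 1 of A
print(evol_product(e1, e2, A))  # zero vector
```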
These kinds of algebras were first considered in [5][6][7] and have been studied exhaustively in recent years (see [8][9][10][11][12][13][14][15][16][17] and references therein for a review of some of the main results achieved on this topic). These algebras are related to a wide variety of mathematical subjects, including Markov chains and dynamical systems [18,19]. The relationship between evolution algebras and homogeneous discrete-time Markov chains was settled in [6]. We recall that a Markov evolution algebra is a non-negative evolution algebra whose structure matrix A has row sums equal to 1.
Tian [6] proposed one of the most fruitful further topics of research: the development of the theory of continuous evolution algebras and their connection to continuous-time Markov processes. He outlined continuous evolution algebras as evolution algebras whose multiplication, with respect to a natural basis B = {e_1, e_2, . . . , e_n}, satisfies e_i · e_i = ∑_{k=1}^{n} a_{ik}(t) e_k and e_i · e_j = 0 for i ≠ j, for some functions a_{ik}(t).
In [25], Markov evolution algebras whose structure matrices obey the semigroup property were investigated. This type of study is related to chains of evolution algebras [26].
On the other hand, recently, in [27], we introduced a new class of evolution algebras called S-evolution algebras. These algebras are not nilpotent and naturally extend Lotka-Volterra evolution algebras [18]. We stress that the directed weighted graphs associated with S-evolution algebras are meaningful, whereas those connected with Lotka-Volterra algebras are not.
As pointed out in [28], the intersection of information theory and algebraic topology is fertile ground. For example, in [29], it was established that the Shannon entropy defines a derivation of the operad of topological simplices. On the other hand, it is important to construct invariants of evolution algebras which can detect their isomorphism. It turns out that such an invariant can be defined via the Shannon entropy for S-evolution algebras. In the present paper, we demonstrate how this entropy allows for the treatment of the isomorphism of S-evolution algebras. To be precise, we demonstrate that if symmetric S-evolution algebras have different entropies, then they are not isomorphic. This result enables the construction of many examples of non-isomorphic evolution algebras. As a consequence of the primary finding, we propose a non-isomorphic family of Markov evolution algebras. This result sheds new light on Markov evolution algebras and their isomorphism problems.
Let us briefly describe the structure of this paper. Section 2 contains preliminary definitions concerning evolution algebras. In Section 3, we define the entropy of the structural matrix of a Markov evolution algebra, and we demonstrate that any two isomorphic S-evolution algebras produce the same Markov evolution algebra. Furthermore, we derive a Markov evolution algebra through the Hadamard product of the structural matrix A of an S-evolution algebra. We show that the entropy of such a matrix is constant if n = 2, whereas the entropy is decreasing if n ≥ 3. This result allows us to construct many non-isomorphic chains of Markov evolution algebras (see [26]). Finally, in Section 4, the relative entropy is defined, and we prove that this function is a measure of the 'distance' between algebras, even though it is not a metric, since the symmetry axiom is, in general, not satisfied. In the case of symmetric evolution algebras, we show that this property holds only within the class of isomorphic algebras.

Preliminaries
In this section, we recall the definition of an S-evolution algebra and some further definitions needed throughout the paper. Let E be a real non-negative evolution algebra with structure matrix A = (a_{ik}) and natural basis B. If 0 ≤ a_{ik} ≤ 1 and ∑_{k≥1} a_{ik} = 1 for any i, then E is called a Markov evolution algebra. The name is due to the fact that there is an interesting one-to-one correspondence between E and a discrete-time Markov chain (X_n)_{n≥0} with state space {x_1, x_2, . . . , x_n, . . .} and transition probabilities given by (a_{ik})_{i,k≥1}, i.e., P(X_{n+1} = x_k | X_n = x_i) = a_{ik} for any i, k ∈ {1, 2, . . .} and any n ≥ 0.
For the sake of completeness, we recall that a discrete-time Markov chain can be thought of as a sequence of random variables X_0, X_1, X_2, . . . defined on the same probability space, taking values in the same set X, and satisfying the Markov property, i.e., for any set of values {x_{i_0}, . . . , x_{i_{n−1}}, x_i, x_k} ⊂ X and any n ∈ N, it holds that

P(X_{n+1} = x_k | X_0 = x_{i_0}, . . . , X_{n−1} = x_{i_{n−1}}, X_n = x_i) = P(X_{n+1} = x_k | X_n = x_i).
Notice that, in the correspondence between the evolution algebra E and the Markov chain (X_n)_{n≥0}, each state of X is identified with a generator of B.
We notice that, if A = (a_{ij})_{i,j=1}^{n} is an S-matrix, then there is a family of injective functions {f_{ij} : K → K}_{1≤i<j≤n}, with f_{ij}(0) = 0, such that a_{ji} = f_{ij}(a_{ij}) for all 1 ≤ i < j ≤ n. Hence, each S-matrix is uniquely determined by its strictly upper triangular part (a_{ij})_{i<j} and a family of functions (f_{ij})_{i<j}. This allows us to construct many examples of S-matrices.
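This construction can be sketched in code; the builder `s_matrix` and the particular choice f_ij(x) = (i + j + 1)x are hypothetical illustrations (any injective functions fixing 0 would do), not taken from [27]:

```python
import numpy as np

def s_matrix(upper, f):
    """Build an S-matrix from its strictly upper triangular part
    `upper` (the entries a_ij for i < j) and a family of injective
    functions f with f(i, j, 0) == 0, setting a_ji = f(i, j, a_ij)."""
    n = upper.shape[0]
    A = np.triu(upper, k=1).astype(float)   # keep only the i < j entries
    for i in range(n):
        for j in range(i + 1, n):
            A[j, i] = f(i, j, A[i, j])      # fill the lower part via f_ij
    return A

# Hypothetical choice f_ij(x) = (i + j + 1) * x (injective, fixes 0).
U = np.array([[0, 2, 3],
              [0, 0, 5],
              [0, 0, 0]])
A = s_matrix(U, lambda i, j, x: (i + j + 1) * x)
print(A)
```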
Given an upper triangular matrix (a_{ij})_{i<j}, one can construct several examples of S-matrices in this way.

Definition 2. An evolution algebra E is called an S-evolution algebra if its structural matrix is an S-matrix.

Remark 1.
We note that evolution algebras corresponding to skew-symmetric matrices are called Lotka-Volterra evolution algebras. Such kinds of algebras have been investigated in [18].
One can see that the canonical form of the multiplication table of an S-evolution algebra E with respect to a natural basis {e_1, e_2, . . . , e_n} is given by

e_i · e_j = 0, i ≠ j; (3)
e_i · e_i = ∑_{k=1}^{i−1} f_{ki}(a_{ki}) e_k + ∑_{k=i+1}^{n} a_{ik} e_k. (4)

We note that, if i = 1, then the first part of (4) is zero, and, if i = n, then the second part is zero.

Remark 2.
The motivation behind introducing S-evolution algebras is that such algebras have certain applications in the study of electrical circuits, in finding shortest routes, and in constructing models for the analysis and solution of other problems [8,30].
Definition 4. Let E be an evolution algebra with a natural basis B = {e_1, . . . , e_n} and structural matrix A = (α_{ij}).

1. The directed graph Γ(E, B) = (V, E), with set of vertices V = {1, . . . , n} and set of edges E = {(i, j) : α_{ij} ≠ 0}, is called the graph attached to the evolution algebra E relative to the natural basis B.

2. The weighted graph Γ_w(E, B) = (V, E, ω), where ω is the map E → K given by ω(i, j) = α_{ij}, is called the weighted graph attached to the S-evolution algebra E relative to the natural basis B.
Recall that if every two vertices of a graph are connected by an edge, then such a graph is called complete.
Using the graph Γ(E, B), in [27], we established a criterion for the isomorphism of S-evolution algebras.

Theorem 1 ([27]). Let E_1 and E_2 be two S-evolution algebras with structural matrices (a_{ij})_{i,j=1}^{n} and (b_{ij})_{i,j=1}^{n}, respectively, whose attached graphs are complete. Then, E_1 ≅ E_2 if and only if the following conditions are satisfied.

S-Evolution Algebras and Corresponding Markov Evolution Algebras
In what follows, we always assume that E is a non-negative, symmetric S-evolution algebra with structure matrix A = (a_{ik}) and natural basis B. Using the matrix A, one can define a stochastic matrix P(A) = (t_{ij}) as follows:

t_{ij} = a_{ij} / ∑_{k=1}^{n} a_{ik},  where i, j ∈ {1, . . . , n}. (5)

Sometimes, t_{ij} is denoted by P(a_{ij}). The evolution algebra with the natural basis B and structural matrix P(A) is the Markov evolution algebra corresponding to E, and it is denoted by Ē.
Our task now is to examine the isomorphism between E and Ē.
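As an illustration, and assuming that (5) is the usual row normalization t_ij = a_ij / ∑_k a_ik, the stochastic matrix P(A) can be computed as in this minimal sketch (the matrix A below is hypothetical):

```python
import numpy as np

def markov_matrix(A):
    """Row-normalize a non-negative structure matrix A into the
    stochastic matrix P(A) = (t_ij), assuming every row sum is
    positive (guaranteed when the attached graph is complete)."""
    return A / A.sum(axis=1, keepdims=True)

A = np.array([[0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0],
              [3.0, 1.0, 0.0]])
P = markov_matrix(A)
print(P)   # each row sums to 1
```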

Theorem 2. Let (E, A) be a non-negative symmetric S-evolution algebra whose attached graph is complete, and let (Ē, P(A)) be its corresponding Markov evolution algebra. Then, E ≅ Ē if the following conditions are satisfied.
Proof. We notice that E and Ē are S-evolution algebras. So, the isomorphism between these two algebras can be checked by Theorem 1. Hence, the proof is straightforward.
Consider a discrete random variable X with possible values {x_1, x_2, . . . , x_n} and probability mass function P(X). Its entropy can be explicitly written as

H(X) = − ∑_{i=1}^{n} P(x_i) ln P(x_i),

where it is assumed that 0 ln(0) = lim_{p→0+} p ln(p) = 0. Now, given a non-negative symmetric S-evolution algebra E with structure matrix (a_{ij}), we define its entropy as

H(A) := − ∑_{i,j=1}^{n} t_{ij} ln t_{ij}, (7)

where (t_{ij}) is defined by (5).
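Under the same row-normalization assumption for (5), the entropy (7) with the convention 0 ln 0 = 0 can be sketched as follows (the matrices are hypothetical):

```python
import numpy as np

def entropy(A):
    """Entropy H(A) of a non-negative structure matrix, computed
    from the row-normalized matrix P(A) with the convention
    0 * ln(0) = 0; the normalization t_ij = a_ij / sum_k a_ik is
    an assumption matching the stochastic matrix (5)."""
    T = A / A.sum(axis=1, keepdims=True)
    mask = T > 0                               # implements 0 ln 0 = 0
    return -np.sum(T[mask] * np.log(T[mask]))

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(entropy(A))   # each row is deterministic, so H(A) = 0
```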
The Jamiolkowski entropy of P is defined by h(P) := H(D_P). Hence, the entropy given by (7) can be represented in terms of h(P(A)). The obtained formula (11) allows us to investigate H(A) in terms of h(P(A)), which has certain applications in information theory. Moreover, all properties of the Shannon entropy can be applied to H(A).
On the other hand, if one defines the entropy of a stochastic matrix in the sense of [32], then, via (11), one can introduce other types of entropy of evolution algebras. Moreover, given an evolution algebra E with structure matrix A satisfying A*A = 1, we may define a mapping Φ : C^n → C^n by Φ(x) = A* x A, which defines a quantum channel. Using its Jamiolkowski entropy, one can define the entropy of E accordingly. This will allow us to further investigate the algebraic structure of E in relation to the quantum channel Φ [33].
Theorem 3. Let E_1 ≅ E_2 be non-negative symmetric S-evolution algebras with structural matrices A = (a_{ij})_{1≤i,j≤n} and B = (b_{ij})_{1≤i,j≤n}, respectively. Assume that their attached graphs are complete. Then, the corresponding Markov evolution algebras are the same.
Proof. Let E_1 ≅ E_2. Due to the isomorphism between these two algebras, the conditions of Theorem 1 hold. The corresponding Markov evolution algebras have the matrices of structural constants P(A) = (t_{ij}) and P(B) = (s_{ij}). Fix i_0 and j_0; we may assume that i_0 ≠ j_0 (since the matrices are S-matrices). Hence, t_{i_0 j_0} = s_{i_0 j_0}. Due to the arbitrariness of i_0, j_0, we obtain P(A) = P(B). This completes the proof.
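Theorem 3 can be illustrated numerically: positive rescalings of the rows of A, of the kind induced by the isomorphism conditions, leave the row-normalized matrix unchanged. The matrix A and the constants c below are hypothetical, and the row normalization of (5) is assumed:

```python
import numpy as np

def markov_matrix(A):
    """Stochastic matrix P(A) via row normalization (assumption)."""
    return A / A.sum(axis=1, keepdims=True)

A = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 4.0],
              [2.0, 4.0, 0.0]])
# Rescale each row by a positive constant -- a hypothetical instance
# of the rescaling an isomorphism of S-evolution algebras induces.
c = np.array([2.0, 5.0, 0.5])
B = c[:, None] * A

print(np.allclose(markov_matrix(A), markov_matrix(B)))  # same P
```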

Remark 4. We stress that the converse of Corollary 1 need not be true. Indeed, let E_1 and E_2 be two non-negative S-evolution algebras with the following structural matrices. Using the condition from (12), we have E_1 ≇ E_2 for any a ≠ b. However, H(A_1) = H(A_2).
Remark 5. The advantage of Theorem 3 is that, for any two non-negative S-evolution algebras whose matrices of structural constants are symmetric, if their entropies are different, then these algebras are not isomorphic.
The natural question that arises is: if we have arbitrary isomorphic evolution algebras, are their entropies equal? The following example gives a negative answer.

Example 1. Let E_1 and E_2 be two-dimensional evolution algebras with structural matrices A_1 and A_2. Clearly, E_1 ≅ E_2. Calculating the entropies of both, one finds H(A_1) = ln(2), which differs from H(A_2).

Given the S-evolution algebra (E, A), we consider the Hadamard (entrywise) product of structural matrices. Let us denote A ∘ A ∘ · · · ∘ A (t times) by A^t, so that (A^t)_{ij} = (a_{ij})^t. By (E, A^t) we denote the evolution algebra whose structural matrix is A^t.
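A minimal sketch of the Hadamard power used here, (A^t)_ij = (a_ij)^t, with a hypothetical symmetric S-matrix:

```python
import numpy as np

def hadamard_power(A, t):
    """Entrywise (Hadamard) t-th power A^t: (A^t)_ij = (a_ij) ** t."""
    return A ** t

A = np.array([[0.0, 0.5, 0.25],
              [0.5, 0.0, 0.5],
              [0.25, 0.5, 0.0]])
A2 = hadamard_power(A, 2)
print(A2[0, 1], A2[0, 2])
```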

Lemma 1. Assume that all conditions of Theorem 3 are satisfied and dim(E_1) ≥ 3. Then (E_1, A^{t_1}) ≅ (E_2, B^{t_2}) if and only if t_1 = t_2.

Proof. Let A = (a_{ij})_{i,j=1}^{n} and B = (b_{ij})_{i,j=1}^{n} be the structure matrices of E_1 and E_2, respectively. Applying the isomorphism criterion of Theorem 1 to (E_1, A^{t_1}) and (E_2, B^{t_2}), one checks that its conditions are satisfied if and only if t_1 = t_2.

Now, let us consider P(A^t), which, by (5), is defined by

t_{ij}(t) = (a_{ij})^t / ∑_{k=1}^{n} (a_{ik})^t. (13)

Theorem 4. The entropy of (13) satisfies a first-order linear differential equation.

Proof. Let us calculate the entropy of (13). Due to the symmetry of A^t, it is enough to find the contribution of the first row to H(A^t); the remaining rows are treated in the same manner. Rewriting Equation (16) accordingly, the last expression leads to the required assertion. This completes the proof.
Let us denote by K_i the set of all maximal entries of the row R_i := (a_{i1}, a_{i2}, . . . , a_{in}). In what follows, we assume that |K_i| = m_i, 1 ≤ m_i ≤ n − 1.
Theorem 5. Let E be a non-negative symmetric S-evolution algebra with structural matrix A = (a_{ij})_{1≤i,j≤n} whose attached graph is complete. The following statements hold true:

Proof. (i) From Theorem 4, we infer the differential equation satisfied by H(A^t). Clearly, from the last equation, since f_i(t) < 0 for 1 ≤ i ≤ n, we find y < 0. As t > 0, then H′(A^t) = ty < 0. Hence, H(A^t) is decreasing. This completes the proof of (i).

Now consider (ii). For the sake of simplicity of calculation, we may consider concrete structural matrices A, B, and C. Simple calculations yield A^t, B^t, and C^t, and the corresponding Markov evolution algebras have the structure matrices P(A^t), P(B^t), and P(C^t). One can see that P(A^t) = P(C^t).
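The dichotomy between n = 2 and n ≥ 3 can also be observed numerically; the matrices below are hypothetical, and the row normalization of (5) together with the convention 0 ln 0 = 0 is assumed:

```python
import numpy as np

def entropy_power(A, t):
    """Entropy of the Markov algebra attached to the Hadamard power
    A^t, assuming row normalization as in (5) and 0 ln 0 = 0."""
    T = A ** t
    T = T / T.sum(axis=1, keepdims=True)
    mask = T > 0
    return -np.sum(T[mask] * np.log(T[mask]))

# n = 2: a single off-diagonal entry per row, so H(A^t) is constant in t.
A2 = np.array([[0.0, 0.7],
               [0.3, 0.0]])
# n = 3: entries of different sizes, so H(A^t) strictly decreases in t.
A3 = np.array([[0.0, 0.2, 0.8],
               [0.2, 0.0, 0.5],
               [0.8, 0.5, 0.0]])
print([round(entropy_power(A2, t), 4) for t in (1, 2, 3)])
print([round(entropy_power(A3, t), 4) for t in (1, 2, 3)])
```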
Figure 1 shows the graphs of H(A^t), H(B^t), and H(C^t). Since E_1 ≅ E_3 and E_2 ≇ E_1, one can see that the graphs of H(A^t) and H(C^t) are identical, whereas the graph of H(B^t) is different.
The following example is related to (4) of Remark (18). The matrices A^t and B^t are, respectively, given below. Since P(A^t) and P(B^t) reach the maximum entropy, E_1 ≅ E_2.

Relative Entropy
Suppose that we have two sets of discrete events, x_i and y_i, with corresponding probability distributions {p(x_i)} and {p(y_i)}. The relative entropy between these two distributions is defined by

D(p(x) || p(y)) := ∑_i p(x_i) ln ( p(x_i) / p(y_i) ).

This function is a measure of the 'distance' between {p(x_i)} and {p(y_i)}, even though it is not a metric, since the symmetry axiom is, in general, not satisfied: D(p(x) || p(y)) ≠ D(p(y) || p(x)).
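A minimal sketch of the relative entropy, illustrating both that it vanishes on equal distributions and that it fails to be symmetric; the distributions p and q are hypothetical:

```python
import numpy as np

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i ln(p_i / q_i),
    with the convention 0 * ln(0 / q) = 0; assumes q_i > 0 wherever
    p_i > 0 (absolute continuity)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = [0.5, 0.5]
q = [0.9, 0.1]
print(relative_entropy(p, q), relative_entropy(q, p))  # asymmetric
print(relative_entropy(p, p))                          # zero on equals
```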
Let E_1, E_2 be non-negative symmetric S-evolution algebras with structural matrices A = (a_{ij}) and B = (b_{ij}). Let P(A) = (t_{ij}) and P(B) = (s_{ij}) be the corresponding stochastic matrices (see (5)). We define the relative entropy of A and B row-wise via the rows of P(A) and P(B).

Theorem 6. Let E_1, E_2 be non-negative symmetric S-evolution algebras with structural matrices A = (a_{ij})_{1≤i,j≤n} and B = (b_{ij})_{1≤i,j≤n}, respectively. Assume that their attached graphs are complete. If E_1 ≅ E_2, then the relative entropy of A and B vanishes.
Proof. Write P(A) and P(B) as row vectors a^{(i)} := (t_{i1}, . . . , t_{in}) and b^{(i)} := (s_{i1}, . . . , s_{in}). Now, we compute D(a^{(i)} || b^{(i)}). For a fixed i, using the conditions of Theorem 1 and substituting into (19), the expression for D(a^{(i)} || b^{(i)}) simplifies to zero. Due to the arbitrariness of i, we arrive at D(a^{(i)} || b^{(i)}) = 0 for any i. This completes the proof.

Conclusions
In this paper, we introduced an entropy of Markov evolution algebras and treated the isomorphism of the corresponding S-evolution algebras. It turns out that the considered entropy is a semi-invariant of non-negative symmetric evolution algebras. This work offers new insight into the isomorphism problem through entropy theory. Moreover, we have pointed out that this entropy can be investigated by means of quantum channels. Furthermore, a family of Markov evolution algebras was defined through the Hadamard product of the structural matrices of non-negative real S-evolution algebras, and their isomorphism was studied through entropy. Deciding the isomorphism of algebras is a crucial task, so it is desirable to find an effective and accurate shortcut for studying this problem. This paper treats the problem by using the entropy value in the class of evolution algebras. However, this property is not valid for general evolution algebras, as we showed in Example 1. Therefore, for other types of algebras, it would be better to find other kinds of entropies.