1. Introduction and Preliminaries
In many papers dealing with dynamical systems, their strong relation to difference equations is pointed out (see [1]), which opens up wide applications in many fields of knowledge, including economics, biology, information flow and physics [2,3,4,5,6,7]. Among the problems connected with “dynamical systems with discrete time observations”, a special role is played by the entropy of these systems, which may be treated as a “measure” of the complexity of a dynamical system. This notion was introduced in connection with issues of thermodynamics and the problem of “information loss” (more details on this topic can be found in [8]). At the beginning, the notion of entropy was related to measure theory. Later, there appeared the notion of topological entropy, introduced by R. Adler, A. Konheim and J. McAndrew [9], and next, an equivalent definition for metric spaces was formulated [10,11]. It is worth mentioning that at a further stage of research, the definition of topological entropy for discontinuous functions was also studied [12]. The considerations mentioned so far concerned autonomous systems. Later still, results appeared regarding the entropy of nonautonomous dynamical systems. We will base our investigations, among others, on [13]. In general, the notion of entropy concerns a global property of dynamical systems. However, research connected, for example, with stability points or non-wandering points, as well as the analysis of various examples of functions, leads to the conclusion that it is also purposeful to examine local aspects of entropy and points around which the entropy is “focused” in some sense, e.g., [14,15]. Simultaneously, the example presented in [16] (p. 1118) shows that it is reasonable to require that the essence of a point “focusing” entropy should be connected exclusively with the behavior of functions around this point or with the value of functions at this point (sometimes, the fact that a point is a full entropy point [15] is decisively influenced by the behavior of a function “far away” from that point). For that reason, a new approach to this problem was introduced in [16]. All the above-mentioned papers concerned autonomous systems. In this paper, we will study these issues for nonautonomous dynamical systems. Our considerations will be mainly connected with the periodicity of the examined systems. Investigations of this kind are frequently undertaken in the literature for such systems (e.g., [6,13,17]). This is motivated by the connections of such systems with periodic difference equations (as is well signalled in [3]).
Throughout the paper, the symbol will stand for the set of all positive integers and . Moreover, will denote a compact metric space. The closure, the interior and the cardinality of a set will be denoted by , and , respectively. For any function and sets , the symbols and mean the restriction of f to A and , respectively.
The symbol will denote the family of all self-maps defined on X such that the point is their fixed point, and the symbol will stand for the set of all fixed points of a function f. Moreover, for any functions , let us adopt the following notation .
Let be a metric space and be a sequence of nonempty closed subsets of X. We shall say that the sequence has the extension property if for any and any continuous function , where is a closed set, one can find a continuous function , which is an extension of , i.e., . Obviously, if for example and are cubes, then this fact follows from the generalizations of the classical Tietze theorem.
Following [13], by a nonautonomous dynamical system on X (NDS), we will mean any sequence of functions such that . If for , then we call the system autonomous and denote it by . For , let and , where . Moreover, let (where id is the identity function) and for any (the last notation will be applied to sets, so we do not assume that these maps are invertible). If is a function, then for any , the symbol will denote the n-th iteration of f, i.e., and .
We say that a dynamical system is periodic with a period n if , if and otherwise. Moreover, we say that is a periodic point with a period n of an NDS if is a fixed point of an NDS , i.e., for any .
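As a toy numeric illustration of these conventions (the two maps, the point and all identifiers below are our own hypothetical choices, not objects from the paper), the following sketch composes the first maps of a 2-periodic NDS on [0, 1] and checks that a chosen point returns to itself after every multiple of the period, even though it is a fixed point of neither map:

```python
def compose_nds(fs, n):
    """Return the map x -> f_n(...f_2(f_1(x))...), i.e., the composition
    of the first n functions of the system."""
    def f_1n(x):
        for f in fs[:n]:
            x = f(x)
        return x
    return f_1n

# A hypothetical 2-periodic system on [0, 1]: f_1, f_2, f_1, f_2, ...
f1 = lambda x: min(2 * x, 2 - 2 * x)   # tent map
f2 = lambda x: x / 2                   # halving map
fs = [f1, f2] * 5                      # the first 10 maps of the system
period = 2

# x0 = 1/4 is a periodic point with period 2: f_2(f_1(1/4)) = f_2(1/2) = 1/4,
# although x0 is a fixed point of neither f_1 nor f_2.
x0 = 0.25
for k in range(1, 6):
    assert compose_nds(fs, k * period)(x0) == x0
print("f_{1,2}(1/4) =", compose_nds(fs, period)(x0))
```

Note that, as in the definition above, periodicity of the point is a property of the whole compositions over full periods, not of the individual maps.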
If M is a matrix, then the trace of M will be denoted by . Let be a sequence of square matrices of the same degree t. Then, for any , we will consider .
In [13], a Bowen-like definition of entropy for an NDS consisting of continuous functions was introduced. This definition was extended to systems consisting of arbitrary functions in [8]. We will briefly review that notion.
Let and . A set is called -separated if for any two distinct points , there exists such that . If , then E is -separated in Y if it satisfies the above condition and . Let denote the maximal cardinality of an -separated set in Y. Then, the entropy of a system on Y is the number:
If , then we write briefly instead of . Moreover, if we consider an autonomous system , then the entropy of this system will be denoted by and , respectively. By the entropy of a function f, we will mean the entropy of a respective autonomous system .
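The separated-set definition above can be explored numerically. The sketch below (all names are our own; the greedy selection only yields a maximal, not necessarily maximum, separated set, hence a lower bound on the cardinality of the largest one) estimates the exponential growth rate of separated sets under the Bowen-type metric for the classical tent map, whose topological entropy is log 2:

```python
import math

tent = lambda x: min(2 * x, 2 - 2 * x)   # classical tent map, entropy log 2

N, n_max, eps = 1024, 6, 0.1
grid = [i / N for i in range(N + 1)]

# Precompute the first n_max orbit points of every grid point.
orbits = []
for x in grid:
    o = []
    for _ in range(n_max):
        o.append(x)
        x = tent(x)
    orbits.append(o)

def greedy_separated(n, eps):
    """Greedily pick an (n, eps)-separated subset of the grid under the
    metric max_{0<=i<n} |T^i x - T^i y|; a lower bound on the maximal
    cardinality of an (n, eps)-separated set."""
    E = []
    for o in orbits:
        if all(max(abs(o[i] - e[i]) for i in range(n)) > eps for e in E):
            E.append(o)
    return len(E)

sizes = [greedy_separated(n, eps) for n in (3, 4, 5, 6)]
rates = [math.log(b / a) for a, b in zip(sizes, sizes[1:])]
print(sizes)
print(rates)   # per-step growth rates; for the tent map they hover near log 2
```

The entropy itself is a limit as n grows and eps shrinks, so a finite-grid computation like this one is only a heuristic estimate of the growth rate.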
Now, we will state, in the form of lemmas, basic facts that will be used in the further part of the paper. Reasoning similar to that in the proofs of Lemmas 4.3 and 4.5 of [13] allows us to prove the following result concerning the entropy of an NDS consisting of not necessarily continuous functions.
Lemma 1. Let be a dynamical system. Then, for any , we have:
Lemma 2. Let be a dynamical system on X. For any , we have .
In the case of an NDS, entropy does not always fully reflect the complexity of a system (see, e.g., the considerations in [13], p. 216). Therefore, in [13], a new notion of asymptotical entropy was introduced, which coincides with the classical entropy in the case of autonomous systems.
An asymptotical entropy of a dynamical system is the number defined as follows: . The existence of such a limit follows from Lemma 2. Moreover, Lemma 2 allows concluding that . It is worth adding that the inequality from Lemma 2 does not hold for entropy on subsets of the space, so the asymptotical entropy of a system on a set is defined as the following upper limit:
Our terminology and notation related to m-dimensional manifolds will coincide with those of [18]. An m-dimensional manifold with a boundary is a nonempty compact metric space such that every point has a neighborhood U that is homeomorphic (via a transformation called the chart on U) to an open subset of the m-dimensional upper half space . Since any open ball in is homeomorphic to some open subset of , an m-dimensional topological manifold is an m-dimensional topological manifold with a boundary (with an empty boundary). Therefore, in this paper, we will consider only m-dimensional topological manifolds with a boundary.
If is a nonempty m-dimensional manifold with a boundary, a point that belongs to the inverse image of under some chart is called an interior point of . The set of all interior points of a manifold will be denoted by . The symbol will stand for the set of all closed submanifolds of (i.e., is a closed manifold) such that the dimensions of and are the same.
We shall say that an NDS of functions defined on is irreducible at if for , a function is irreducible at , i.e., for any open neighborhood U of , there exists a point such that .
2. Focal Entropy Points of NDS
Now, we will introduce the notion of a focal entropy point of NDS, having in mind the general assumption: the fact that a given point is a focal entropy point means that the complexity of the system in any neighborhood of this point is the same as the complexity of the whole system and does not depend on the behavior of functions around other points.
Let be a family of nonempty subsets of X such that each nonempty open set contains some element of . In view of the considerations presented in this paper, from now on, we will assume that contains the family of all closed sets of cardinality continuum.
Put .
Let and . Set , where:
Moreover, for , a system and , let:
The entropy of with respect to the sequence is the following number:
The process of computing the entropy of a system with respect to a sequence of sets may be simplified by introducing the notion of a path. Let . By a k-path connected with the sequence and with the dynamical system , we mean each sequence of sets such that for and:
The sets are called the nodes of the path. If no confusion can arise, we will write simply k-path. We say that a point is connected with a k-path if for . It is easy to see that such a point exists for any path.
One can easily notice that the entry , where , of the matrix (1) is equal to the number of -paths connected with the sequence and the NDS such that the set is the first node of the path and the set is its last node. Consequently, is equal to the number of -paths connected with the sequence and the NDS such that the set is simultaneously the first and the last node of the path, for .
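This trace-counting observation is the standard walk-counting identity for (step-dependent) adjacency matrices, and it can be checked directly. In the sketch below, the 0/1 matrices are hypothetical transition matrices of a 2-periodic system on three sets (our own example, not taken from the paper); the trace of the product of the first four matrices is compared against a brute-force enumeration of closed 4-paths:

```python
import itertools

def matmul(A, B):
    """Multiply two square 0/1 (or integer) matrices given as nested lists."""
    t = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(t)) for j in range(t)]
            for i in range(t)]

# Hypothetical 0/1 matrices of a 2-periodic system on three sets;
# M[i][j] = 1 iff a path may step from node i to node j at that stage.
M1 = [[1, 1, 0],
      [0, 1, 1],
      [1, 0, 0]]
M2 = [[0, 1, 1],
      [1, 0, 0],
      [0, 1, 0]]
mats = [M1, M2, M1, M2]               # matrices along the first four steps

P = mats[0]
for M in mats[1:]:
    P = matmul(P, M)
trace = sum(P[i][i] for i in range(3))

# Brute force: closed 4-paths (A_0, ..., A_4) with A_0 = A_4 whose
# consecutive nodes are allowed by the corresponding matrix.
count = sum(
    1
    for path in itertools.product(range(3), repeat=5)
    if path[0] == path[4]
    and all(mats[s][path[s]][path[s + 1]] == 1 for s in range(4))
)

print(trace, count)   # the two counts agree
assert trace == count
```

The identity holds for any choice of 0/1 matrices, since expanding the matrix product enumerates exactly the admissible node sequences.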
Now, let us state a theorem which will allow us to introduce the next steps of the definition.
Theorem 1. Let be an NDS, and . Then:
Proof. The second inequality follows from Lemma 1, so it is sufficient to show the first one. Suppose, contrary to our claim, that there exists a real number such that:
It is obvious that and . According to our assumptions connected with the family , we have
Taking into account (2), we obtain that there exists an increasing sequence of positive integers such that:
Clearly, are successive entries of the main diagonal of the matrix , for any . By (3), one can conclude that for any , we have . For any and , the number of -paths of the form
where for , is equal to . For any and , let denote the set of all -paths whose first and last node is . Obviously, . Therefore, let . For any , and , let be a point connected with the path .
Put for . It is easily seen that for , and . Thus, if and , then . Moreover, if and , then . Thus, , and finally, , because for .
Let be any distinct points of the set . If , then . If , then . Since is connected with the path and is connected with the path and , there exists such that and . This gives that is an -separated set for the system .
As a consequence, we obtain . Let . Thus, , and hence, , which contradicts (2). ☐
We continue the considerations leading to the definition of a focal entropy point. Let be an open set. For , the notation will mean that for any . Let us adopt the following notation:
Notice that on account of Theorem 1, for any open set U, we have:
Using the last quantity, one can define the next one in the following way:
where denotes the family of all open sets containing .
According to Theorem 1, we have . If , then we say that a point is a -focal entropy point of a system .
Notice that if a system is autonomous, i.e., for , then the definition of a -focal entropy point of the system coincides with the definition introduced in [16].
If, in the definition of the quantity , we replace the entropy with the asymptotical entropy , then, defining in an analogous way as above, we obtain the notion of an asymptotical -focal entropy point of . In such a case, we will use a star in the respective symbols: , . Therefore, we say that a point is an asymptotical -focal entropy point of a system if .
It is easy to see that if is an asymptotical -focal entropy point of a system , then it is a -focal entropy point of this system. Obviously, if is periodic, then the notions of an asymptotical -focal entropy point of the system and of a -focal entropy point of the system coincide.
The natural question arises whether there exist such points. The next theorem is a partial answer to this problem.
Theorem 2. Let be a periodic dynamical system on consisting of continuous functions. Then, there exists an asymptotical -focal entropy point of the system .
Proof. Let n be a period of the system . Put and . Then, . Hence, by Lemma 1, we obtain:
By Corollary 4.5 of [16], there exists a point , which is a -focal entropy point of .
We will show that is a -focal entropy point of the system . Let . It is easy to observe that and consequently . We need to consider the following cases (we omit the trivial case ):
- (i) . Thus, . For any , there exist and such that and . Obviously, by (6), we have , so , and therefore, . As a consequence, . Hence, by the arbitrariness of , we conclude that , and consequently, .
- (ii) and . By (5), we obtain . We have , so for any , there exist and such that and . Clearly, by (6), we may infer that , so . By use of (5), we get . Finally, we have shown that for any , there exist and such that and , so:
Moreover, according to (4), we have:
Finally, (7) and (8) give . Thus, .
Since U was chosen arbitrarily, we obtain , so is a -focal entropy point of , and simultaneously, it is its asymptotical -focal entropy point because this system is periodic. ☐
3. Disturbance and Approximation
In various considerations connected with autonomous and nonautonomous dynamical systems, a special role is played by fixed points of the systems (e.g., stable points [6]). It is not difficult to find an example showing that a fixed point of an NDS need not be its focal entropy point. On the other hand, a given NDS can be approximated or disturbed by entering new functions into it. In each of these operations, it is important to do so by means of functions that are close to the base NDS and belong to a common structure. This leads in a natural way to distinguishing equivalence classes.
Let and . In the set , we will define the following relation:
where is an open ball with radius and center . It is not difficult to show that for fixed and , the relation (9) is an equivalence relation in .
The symbol will stand for the equivalence class of under the relation .
In this paper, mainly periodic dynamical systems are examined, so it is natural to consider periodic disruptions, called disturbances. The idea of a disturbance is to introduce, at equal intervals of time, a function belonging to the equivalence class generated by the iteration of the functions lying between successive disturbance periods.
Let be a periodic NDS with a period , and let . We say that is a periodic -disturbance of if there exists a continuous function such that:
- (PD1)
- (PD2)
.
The next theorem shows that a periodic dynamical system may be periodically disturbed by means of a function belonging to the earlier defined equivalence class (with arbitrarily small ) in such a way that a periodic point of the system becomes its asymptotical -focal entropy point.
Theorem 3. Let be a periodic dynamical system on consisting of continuous functions such that is a periodic point of this NDS and is irreducible at . For any , there exists a system that is a periodic ε-disturbance of such that is an asymptotical -focal entropy point of .
Proof. Let be a period of and be a period of . Put . It follows immediately that is both a period of and of . Let and be a sequence of connected submanifolds satisfying the following properties:
- [M1]
for ,
- [M2]
for ,
- [M3]
,
- [M4]
the sequence has the extension property.
Without loss of generality, we can also assume that . Obviously, there exists an open set , such that and . Moreover, Condition [M3] implies that there exists such that for .
Put . Since is irreducible at , it is easy to see that there exist and an arc with endpoints at and such that . Let be disjoint arcs such that and . Put for . Then, , , and the sets and are closed. Moreover, for .
On account of the well-known Hahn–Mazurkiewicz theorem (see, e.g., [19], p. 106), there exists a continuous function such that and . From the fact that the set is closed and from Condition [M3], it follows that there exists such that . Obviously, .
By the same reasoning as above, one can find and . Let be such arcs that , if , . Put , . Clearly , whenever , and are closed for . Moreover, for . Let be a continuous function such that for .
Continuing in this fashion, we obtain two sequences: and of closed sets such that for , is closed for and whenever . Moreover, there exists a sequence of continuous functions such that for .
Now, let us consider the set . It follows easily that . It is easy to prove that is closed.
Consider the following function:
Clearly, . Since is closed and is continuous, it follows by Condition [M4] that there exists a continuous function such that .
We will show that is a periodic -disturbance of . Condition (PD1) is obvious. To obtain Condition (PD2), it is enough to show that . We have because for . Moreover, and . This means that .
What is left is to prove that is an asymptotical -focal entropy point of .
Let V be an arbitrary open neighborhood of . Obviously, there exists such that for . Let and . We will show that there exists such that
Obviously, one can find such that and . Thus, . Consider , and put . Clearly, .
Let . It is evident that is equal to the number of -paths connected with . We have and for any , so for . As a consequence . Thus, .
Finally, we have shown that for any , there exists , such that . Hence for any , we have . Thus, , and therefore, , so is an asymptotical -focal entropy point of . ☐
The next theorem shows the difference between a -focal entropy point of an NDS and an asymptotical -focal entropy point of an NDS on the interval, under the weakest possible assumptions imposed on the considered functions. For simplicity of notation, we will formulate and prove the theorem for . It can easily be generalized to any .
Theorem 4. Let be a function continuous at and such that . Let us assume that: (*) there exists a sequence such that for any , we have .
Then, for any , there exists a sequence of functions continuous at zero such that , zero is a -focal entropy point of the system , and zero is not an asymptotical -focal entropy point of .
Proof. Let . Let be a positive number less than and such that for . There exists such that and . Put , and hence, . Let be an odd positive integer such that .
From (*), it follows that there exists an interval such that . Put and . Notice that . Consider a sequence such that . Now, we can define the function as follows: , for , for , , , is linear on respective intervals in each ; and moreover, for , for and for .
We next define functions for . Let for , . Fix . Put for and for and linear on the respective intervals. Moreover, for , for .
Obviously is continuous at zero for and . We will show that and for .
We first prove that . Let , and be an -separated set for . For any , , there exists such that . Notice that . Indeed, we have and . Hence, for , we have and , so for . As a consequence, for any distinct points , we have . It follows that , so , where denotes the smallest positive integer greater than . Hence, . In an analogous way, one can show that .
Moreover, we have for and . Consequently, .
Let . We will show that . Clearly, and is piecewise monotone. Denote by the number of intervals of monotonicity of . We have for . Thus, by Theorem 4.2.4 of [20], we have . Obviously, , and for any , we have . On account of Lemma 4.1.10 of [20] (and the remark after it), we obtain . Finally, Proposition 3.5 of [12] (see also Lemma 4.1 from [13]) gives that .
We now turn to the case . We have and . Hence, . Moreover, and , so . Therefore, and , so .
As was done for the function , one can show that and . As a consequence, by Proposition 3.5 of [12], we obtain .
Since for , it follows that for .
We will show now that . We claim that:
Indeed, if , then . Thus, for , we have and , so . For , we have , so for and . If , then for . Therefore, it is easy to see that for and , we have . Obviously, . The proof of (10) is complete.
Notice that for any and , the set is -separated for the system if and only if M is -separated for . Indeed, let M be an -separated set for . Then, for any distinct points , there exists such that . By (10), we obtain , which means that M is an -separated set for the system . The proof of the converse implication runs in a similar way.
As a consequence, we have , so .
Let U be an arbitrary neighborhood of zero. We will show that
Clearly, by Theorem 1, we have:
Let . Consider the interval . There exists a sequence of points such that for . Put for . Then, and:
Obviously, for any , the trace is equal to the number of -paths with the first and the last node at , for . By (12), we conclude that the number of such paths is equal to . Hence, , and therefore:
Let be such that . Then, by (13), we obtain , so . From this and (11), we get . As a consequence, , which gives , so zero is a -focal entropy point of .
Simultaneously, zero is not an asymptotical -focal entropy point of , because for any neighborhood U of zero, we have and . Therefore, , which means that . ☐