2. Basics
2.1. About the Parametrized (Gauss) Error Special Activation Function
We consider here the parametrized (Gauss) error special activation function
\[
\operatorname{erf}(\lambda x)=\frac{2}{\sqrt{\pi}}\int_{0}^{\lambda x}e^{-t^{2}}\,dt,\qquad \lambda>0,\ x\in\mathbb{R},
\]
which is a sigmoidal-type function and a strictly increasing function. Of special interest in neural network theory is the role of the parameter \(\lambda>0\); see Section 1 (Introduction). It has the basic properties
\[
\operatorname{erf}(\lambda x)\xrightarrow[x\to+\infty]{}1\quad\text{and}\quad\operatorname{erf}(\lambda x)\xrightarrow[x\to-\infty]{}-1,
\]
and \(\operatorname{erf}(-\lambda x)=-\operatorname{erf}(\lambda x)\), with \(\operatorname{erf}(0)=0\).
We consider the function
\[
\chi(x):=\frac{1}{4}\bigl(\operatorname{erf}(\lambda(x+1))-\operatorname{erf}(\lambda(x-1))\bigr),\qquad x\in\mathbb{R},
\]
and we notice that \(\chi(-x)=\chi(x)\). Thus, \(\chi\) is an even function.
Since \(\operatorname{erf}(\lambda(x+1))>\operatorname{erf}(\lambda(x-1))\), then \(\chi(x)>0\), all \(x\in\mathbb{R}\). We see
\[
\chi'(x)=\frac{\lambda}{2\sqrt{\pi}}\Bigl(e^{-\lambda^{2}(x+1)^{2}}-e^{-\lambda^{2}(x-1)^{2}}\Bigr).
\]
Let \(x>0\); then, we have that \((x+1)^{2}>(x-1)^{2}\), proving \(\chi'(x)<0\), for \(x>0\). That is, \(\chi\) is strictly decreasing on \((0,\infty)\), and it is strictly increasing on \((-\infty,0)\), and \(\chi'(0)=0\).
Clearly, the x-axis is the horizontal asymptote of \(\chi\). Concluding, \(\chi\) is a bell-shaped symmetric function with maximum
\[
\chi(0)=\frac{1}{2}\operatorname{erf}(\lambda).
\]
We have
\[
\int_{-\infty}^{\infty}\chi(x)\,dx=1.
\]
Hence, \(\chi\) is a density function on \(\mathbb{R}\).
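These properties can be verified numerically; the following is a minimal Python sketch (the name chi and the choice \(\lambda=2\) are ours, for illustration only).

```python
import numpy as np
from math import erf

lam = 2.0  # example value of the parameter lambda > 0

def chi(x):
    """Bell-shaped density chi(x) = (erf(lam*(x+1)) - erf(lam*(x-1))) / 4."""
    return 0.25 * (erf(lam * (x + 1)) - erf(lam * (x - 1)))

xs = np.linspace(-10, 10, 200_001)           # symmetric grid containing 0
vals = np.array([chi(x) for x in xs])

print(np.allclose(vals, vals[::-1]))             # evenness: chi(-x) == chi(x)
print(abs(np.trapz(vals, xs) - 1.0) < 1e-6)      # integrates to 1 (density)
print(abs(vals.max() - 0.5 * erf(lam)) < 1e-9)   # maximum chi(0) = erf(lam)/2
```

We need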
Theorem 3. Let \(0<\alpha<1\), and \(n\in\mathbb{N}\) with \(n^{1-\alpha}\geq 3\). It holds that the tail sum
\[
\sum_{\substack{k=-\infty\\ |nx-k|\geq n^{1-\alpha}}}^{\infty}\chi(nx-k)
\]
is bounded above by an explicit constant multiple of \(e^{-\lambda^{2}\left(n^{1-\alpha}-2\right)^{2}}\), with the constant depending only on \(\lambda\). Denote by \(\lfloor\cdot\rfloor\) the integral part and by \(\lceil\cdot\rceil\) the ceiling of a number.
Furthermore, we need
Theorem 4. Let \([a,b]\subset\mathbb{R}\) and \(n\in\mathbb{N}\) so that \(\lceil na\rceil\leq\lfloor nb\rfloor\). Then,
\[
\frac{1}{\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor}\chi(nx-k)}<\frac{4}{\operatorname{erf}(2\lambda)},\qquad \forall\,x\in[a,b].
\]
Note 1. For large enough n, we always obtain \(\lceil na\rceil\leq\lfloor nb\rfloor\). Also, \(a\leq\frac{k}{n}\leq b\), iff \(\lceil na\rceil\leq k\leq\lfloor nb\rfloor\). As in [18], we obtain that
\[
\lim_{n\to\infty}\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor}\chi(nx-k)\neq 1,
\]
for at least some \(x\in[a,b]\).
Definition 1. Let \(f\in C([a,b],X)\), where \((X,\|\cdot\|)\) is a Banach space, and \(n\in\mathbb{N}\). We introduce and define the X-valued linear neural network operators
\[
A_n(f,x):=\frac{\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor}f\left(\frac{k}{n}\right)\chi(nx-k)}{\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor}\chi(nx-k)},\qquad x\in[a,b].
\]
Clearly here, \(A_n(f,x)\in X\). We study here the pointwise and uniform convergence of \(A_n(f,x)\) to \(f(x)\) with rates.
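A minimal real-valued sketch of these operators follows (\(X=\mathbb{R}\); the names and parameter values are ours).

```python
import numpy as np
from math import erf, ceil, floor

lam = 2.0

def chi(x):
    return 0.25 * (erf(lam * (x + 1)) - erf(lam * (x - 1)))

def A_n(f, x, n, a, b):
    """Neural network operator: weighted average of the samples f(k/n),
    with bell weights chi(n*x - k), k = ceil(n*a), ..., floor(n*b)."""
    ks = np.arange(ceil(n * a), floor(n * b) + 1)
    w = np.array([chi(n * x - k) for k in ks])
    return np.dot(w, [f(k / n) for k in ks]) / w.sum()

f = np.cos
for n in (10, 100, 1000):
    print(n, abs(A_n(f, 0.5, n, a=0.0, b=1.0) - f(0.5)))  # error shrinks in n
```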
For convenience, we also call
\[
A_n^{*}(f,x):=\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor}f\left(\frac{k}{n}\right)\chi(nx-k),\qquad
V(x):=\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor}\chi(nx-k);
\]
that is,
\[
A_n(f,x)=\frac{A_n^{*}(f,x)}{V(x)}.
\]
So that
\[
A_n(f,x)-f(x)=\frac{A_n^{*}(f,x)-f(x)\,V(x)}{V(x)}.
\]
Consequently, we derive
\[
\|A_n(f,x)-f(x)\|\leq\frac{4}{\operatorname{erf}(2\lambda)}\,\bigl\|A_n^{*}(f,x)-f(x)\,V(x)\bigr\|.
\]
That is,
\[
\|A_n(f,x)-f(x)\|\leq\frac{4}{\operatorname{erf}(2\lambda)}\,\Bigl\|\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor}\Bigl(f\left(\tfrac{k}{n}\right)-f(x)\Bigr)\chi(nx-k)\Bigr\|.
\]
We will estimate the right-hand side of (19). For that, we need, for \(f\in C([a,b],X)\), the first modulus of continuity
\[
\omega_1(f,\delta):=\sup_{\substack{x,y\in[a,b]\\ |x-y|\leq\delta}}\|f(x)-f(y)\|,\qquad \delta>0.
\]
The fact \(f\in C([a,b],X)\) is equivalent to \(\lim_{\delta\to 0}\omega_1(f,\delta)=0\); see [19].
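The first modulus of continuity can be approximated on a grid; a rough real-valued sketch (our discretization) follows.

```python
import numpy as np

def omega1(f, delta, a, b, m=2000):
    """Grid approximation of the first modulus of continuity:
    sup |f(x) - f(y)| over x, y in [a, b] with |x - y| <= delta."""
    xs = np.linspace(a, b, m + 1)
    fx = f(xs)
    h = (b - a) / m
    k = max(1, int(delta / h))  # number of grid steps that fit inside delta
    return max(np.abs(fx[:-s] - fx[s:]).max() for s in range(1, k + 1))

print(omega1(np.sin, 0.1, 0.0, np.pi))  # about 0.1, since |sin'| <= 1
```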
We present a series of real-valued neural network approximations to a function given with rates.
We first give
Theorem 5. Let \(f\in C([a,b],X)\), \(0<\alpha<1\), \(x\in[a,b]\), and \(n\in\mathbb{N}\) with \(n^{1-\alpha}\geq 3\). Then,
(i) \(\|A_n(f,x)-f(x)\|\) is bounded by a constant multiple of \(\omega_1\left(f,\frac{1}{n^{\alpha}}\right)\) plus an exponentially small tail term, and
(ii) the same bound holds for \(\|A_n(f)-f\|_{\infty}\).
We notice that \(A_n(f)\to f\), pointwise and uniformly. The speed of convergence is \(\frac{1}{n^{\alpha}}\).
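The claimed speed can be probed empirically; the following self-contained sketch (our parameter choices, with \(\alpha=0.5\)) tabulates the uniform error scaled by \(n^{\alpha}\), which should remain bounded if the rate is attained.

```python
import numpy as np
from math import erf, ceil, floor

lam, a, b, alpha = 2.0, 0.0, 1.0, 0.5

def chi(x):
    return 0.25 * (erf(lam * (x + 1)) - erf(lam * (x - 1)))

def A_n(f, x, n):
    ks = np.arange(ceil(n * a), floor(n * b) + 1)
    w = np.array([chi(n * x - k) for k in ks])
    return np.dot(w, [f(k / n) for k in ks]) / w.sum()

f = lambda x: np.exp(x) * np.sin(3 * x)
xs = np.linspace(a, b, 101)
for n in (10, 40, 160, 640):
    err = max(abs(A_n(f, x, n) - f(x)) for x in xs)
    print(n, err, err * n**alpha)  # err * n^alpha should stay bounded
```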
We need
Definition 2 ([20]). Let \(\alpha>0\), \(n=\lceil\alpha\rceil\) (\(\lceil\cdot\rceil\) is the ceiling of the number), and \(f\in AC^{n}([a,b])\) (i.e., \(f^{(n-1)}\in AC([a,b])\), the absolutely continuous functions). We call the left Caputo fractional derivative of order α:
\[
\bigl(D_{*a}^{\alpha}f\bigr)(x):=\frac{1}{\Gamma(n-\alpha)}\int_{a}^{x}(x-t)^{n-\alpha-1}f^{(n)}(t)\,dt,\qquad \forall\,x\in[a,b].
\]
If \(\alpha\in\mathbb{N}\), we set \(D_{*a}^{\alpha}f:=f^{(\alpha)}\), the ordinary real-valued derivative (defined similar to the numerical one, see [21], p. 83), and set \(D_{*a}^{0}f:=f\). See also [22,23,24,25]. By [20], \(\bigl(D_{*a}^{\alpha}f\bigr)(x)\) exists almost everywhere in \(x\in[a,b]\), and \(D_{*a}^{\alpha}f\in L_{1}([a,b])\).
If \(\|f^{(n)}\|_{L_{\infty}([a,b])}<\infty\), then by [20],
\[
\bigl|\bigl(D_{*a}^{\alpha}f\bigr)(x)\bigr|\leq\frac{\|f^{(n)}\|_{L_{\infty}([a,b])}}{\Gamma(n-\alpha+1)}\,(x-a)^{n-\alpha},
\]
hence \(\|D_{*a}^{\alpha}f\|_{\infty}<\infty\).
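A crude quadrature sketch of the left Caputo derivative follows (our midpoint discretization; the closed form quoted in the comment is the standard one for \(f(t)=t^{2}\)).

```python
import numpy as np
from math import gamma, ceil

def caputo_left(f_nth, alpha, a, x, m=20000):
    """Left Caputo derivative of order alpha at x, given the n-th ordinary
    derivative f_nth of f, n = ceil(alpha), via the midpoint rule for
    (1/Gamma(n-alpha)) * int_a^x (x-t)^(n-alpha-1) f^(n)(t) dt."""
    n = ceil(alpha)
    t = np.linspace(a, x, m + 1)
    mid = 0.5 * (t[:-1] + t[1:])   # midpoints avoid the endpoint singularity
    h = (x - a) / m
    return h * np.sum((x - mid) ** (n - alpha - 1) * f_nth(mid)) / gamma(n - alpha)

# Example: f(t) = t^2 on [0, x]; known closed form D_{*0}^{1/2} t^2 = 2 t^{3/2} / Gamma(5/2)
x = 1.0
print(caputo_left(lambda t: 2 * t, 0.5, 0.0, x))  # crude approximation
print(2 * x**1.5 / gamma(2.5))                    # exact value
```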
We mention the following.
Lemma 1 ([19]). Let \(\alpha>0\), \(\alpha\notin\mathbb{N}\), \(n=\lceil\alpha\rceil\), \(f\in C^{n-1}([a,b])\), and \(f^{(n)}\in L_{\infty}([a,b])\). Then, \(\bigl(D_{*a}^{\alpha}f\bigr)(a)=0\). We also mention the following.
Definition 3 ([26]). Let \(\alpha>0\), \(m=\lceil\alpha\rceil\), and \(f\in AC^{m}([a,b])\). We call the right Caputo fractional derivative of order α:
\[
\bigl(D_{b-}^{\alpha}f\bigr)(x):=\frac{(-1)^{m}}{\Gamma(m-\alpha)}\int_{x}^{b}(t-x)^{m-\alpha-1}f^{(m)}(t)\,dt,\qquad \forall\,x\in[a,b].
\]
We observe that \(\bigl(D_{b-}^{m}f\bigr)(x)=(-1)^{m}f^{(m)}(x)\) for \(m\in\mathbb{N}\), and \(D_{b-}^{0}f:=f\). By [26], \(\bigl(D_{b-}^{\alpha}f\bigr)(x)\) exists almost everywhere on \([a,b]\), and \(D_{b-}^{\alpha}f\in L_{1}([a,b])\).
If \(\|f^{(m)}\|_{L_{\infty}([a,b])}<\infty\), and \(\alpha\notin\mathbb{N}\), by [26],
\[
\bigl|\bigl(D_{b-}^{\alpha}f\bigr)(x)\bigr|\leq\frac{\|f^{(m)}\|_{L_{\infty}([a,b])}}{\Gamma(m-\alpha+1)}\,(b-x)^{m-\alpha},
\]
hence \(\|D_{b-}^{\alpha}f\|_{\infty}<\infty\).
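The symmetric sketch for the right Caputo derivative, with the same discretization caveats as above:

```python
import numpy as np
from math import gamma, ceil

def caputo_right(f_mth, alpha, x, b, m_steps=20000):
    """Right Caputo derivative of order alpha at x, given the m-th ordinary
    derivative f_mth of f, m = ceil(alpha):
    ((-1)^m / Gamma(m-alpha)) * int_x^b (t-x)^(m-alpha-1) f^(m)(t) dt."""
    m = ceil(alpha)
    t = np.linspace(x, b, m_steps + 1)
    mid = 0.5 * (t[:-1] + t[1:])
    h = (b - x) / m_steps
    integral = h * np.sum((mid - x) ** (m - alpha - 1) * f_mth(mid))
    return (-1) ** m * integral / gamma(m - alpha)

# Flavor of Lemma 2 below: for f(t) = (1 - t)^2 on [0, 1] (so f'(t) = -2(1 - t)),
# the right Caputo derivative tends to 0 as x -> b = 1.
print(caputo_right(lambda t: -2 * (1.0 - t), 0.5, 0.999, 1.0))  # close to 0
```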
We need
Lemma 2 ([19]). Let \(f\in C^{m-1}([a,b])\), \(f^{(m)}\in L_{\infty}([a,b])\), \(m=\lceil\alpha\rceil\), \(\alpha>0\), \(\alpha\notin\mathbb{N}\). Then, \(\bigl(D_{b-}^{\alpha}f\bigr)(b)=0\). We present the following real-valued fractional approximation result by erf-based neural networks.
Theorem 6. Let \(\alpha>0\), \(N=\lceil\alpha\rceil\), \(\alpha\notin\mathbb{N}\), \(f\in AC^{N}([a,b])\) with \(f^{(N)}\in L_{\infty}([a,b])\), \(0<\beta<1\), \(x\in[a,b]\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\). Then,
(i) the pointwise error \(|A_n(f,x)-f(x)|\) and (ii) the uniform error \(\|A_n(f)-f\|_{\infty}\) are bounded in terms of \(\omega_1\left(D_{*x}^{\alpha}f,\frac{1}{n^{\beta}}\right)\) and \(\omega_1\left(D_{x-}^{\alpha}f,\frac{1}{n^{\beta}}\right)\), plus exponentially small tail terms.
When \(\alpha=\frac{1}{2}\), we derive
Corollary 1. Let \(f\in AC([a,b])\) with \(f'\in L_{\infty}([a,b])\), \(0<\beta<1\), \(x\in[a,b]\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\). Then, the estimates of Theorem 6 hold with \(\alpha=\frac{1}{2}\).
2.2. About the q-Deformed and λ-Parametrized Half-Hyperbolic Tangent Function
All the next background comes from [
28].
Here, we describe the properties of the activation function
\[
\varphi(x):=\frac{1-q\,e^{-\lambda x}}{1+q\,e^{-\lambda x}},\qquad x\in\mathbb{R},
\]
where \(q,\lambda>0\). We have that
\[
\varphi(x)\xrightarrow[x\to+\infty]{}1\quad\text{and}\quad\varphi(x)\xrightarrow[x\to-\infty]{}-1,
\]
hence \(\varphi\) maps \(\mathbb{R}\) onto \((-1,1)\). It is
\[
\varphi(0)=\frac{1-q}{1+q},
\]
and
\[
\varphi'(x)=\frac{2\lambda q\,e^{-\lambda x}}{\bigl(1+q\,e^{-\lambda x}\bigr)^{2}}>0;
\]
therefore, \(\varphi\) is strictly increasing. Moreover, in the case of \(x<\frac{\ln q}{\lambda}\), we have that \(\varphi\) is strictly concave up, with \(\varphi''\left(\frac{\ln q}{\lambda}\right)=0\). And in the case of \(x>\frac{\ln q}{\lambda}\), we have that \(\varphi\) is strictly concave down.
Clearly, \(\varphi\) is a shifted sigmoid function with \(\varphi(x)=\tanh\left(\frac{\lambda x-\ln q}{2}\right)\), and \(\varphi(-x;q)=-\varphi\bigl(x;q^{-1}\bigr)\), ∀\(x\in\mathbb{R}\) (a semi-odd function); see also [28].
We consider the function
\[
\Psi(x):=\frac{1}{4}\bigl(\varphi(x+1)-\varphi(x-1)\bigr)>0,\qquad x\in\mathbb{R}.
\]
Notice that \(\Psi(\pm\infty)=0\), so the x-axis is a horizontal asymptote. We have that
\[
\Psi(-x;q)=\Psi\bigl(x;q^{-1}\bigr),\qquad \forall\,x\in\mathbb{R},
\]
which is a deformed symmetry.
Next, we have that \(\Psi\) is strictly increasing over \(\left(-\infty,\frac{\ln q}{\lambda}\right]\), and it is strictly decreasing over \(\left[\frac{\ln q}{\lambda},+\infty\right)\). Moreover, \(\Psi\) is strictly concave down over \(\left(\frac{\ln q}{\lambda}-1,\frac{\ln q}{\lambda}+1\right)\). Consequently, \(\Psi\) has a bell-type shape over \(\mathbb{R}\). Of course, it holds \(\Psi'\left(\frac{\ln q}{\lambda}\right)=0\). Thus, at \(x_{0}=\frac{\ln q}{\lambda}\), we have the maximum value of \(\Psi\), which is
\[
\Psi\left(\frac{\ln q}{\lambda}\right)=\frac{1}{2}\tanh\left(\frac{\lambda}{2}\right).
\]
We mention that
\[
\int_{-\infty}^{\infty}\Psi(x)\,dx=1,
\]
which follows by telescoping, since \(\varphi(\pm\infty)=\pm 1\). So that \(\Psi\) is a density function on \(\mathbb{R}\).
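Assuming the closed form of \(\varphi\) reconstructed above from [28], the claimed symmetries, the unit integral, and the maximum value can be checked numerically (the names and the choices \(q=1.5\), \(\lambda=2\) are ours):

```python
import numpy as np

q, lam = 1.5, 2.0

def phi(x, q=q):
    """q-deformed, lambda-parametrized half-hyperbolic tangent."""
    return (1 - q * np.exp(-lam * x)) / (1 + q * np.exp(-lam * x))

def psi(x, q=q):
    """Bell-shaped density Psi(x) = (phi(x+1) - phi(x-1)) / 4."""
    return 0.25 * (phi(x + 1, q) - phi(x - 1, q))

xs = np.linspace(-20, 20, 400_001)
print(np.allclose(phi(-xs, q), -phi(xs, 1 / q)))            # semi-odd symmetry
print(np.allclose(psi(-xs, q), psi(xs, 1 / q)))             # deformed symmetry
print(abs(np.trapz(psi(xs), xs) - 1.0) < 1e-6)              # density
print(abs(psi(xs).max() - 0.5 * np.tanh(lam / 2)) < 1e-6)   # max at ln(q)/lam
```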
We need the following result.
Theorem 9 ([29]). Let \(0<\alpha<1\), and \(n\in\mathbb{N}\) with \(n^{1-\alpha}\geq 3\); \(x\in\mathbb{R}\). Then, the tail sum
\[
\sum_{\substack{k=-\infty\\ |nx-k|\geq n^{1-\alpha}}}^{\infty}\Psi(nx-k)
\]
is bounded above by an explicit constant multiple of \(e^{-\lambda\left(n^{1-\alpha}-2\right)}\), where the constant depends only on q and λ. Let \(\lceil\cdot\rceil\) be the ceiling of the number, and let \(\lfloor\cdot\rfloor\) be the integral part of the number.
We mention the following result:
Theorem 10 ([29]). Let \([a,b]\subset\mathbb{R}\) and \(n\in\mathbb{N}\) so that \(\lceil na\rceil\leq\lfloor nb\rfloor\). For \(x\in[a,b]\), we consider the number \(k_{0}\in\{\lceil na\rceil,\dots,\lfloor nb\rfloor\}\) with \(|nx-k_{0}|\leq 1\) and \(\Psi(nx-k_{0})>0\). Then,
\[
\frac{1}{\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor}\Psi(nx-k)}\leq\frac{1}{\Psi(nx-k_{0})}\leq K(q,\lambda),
\]
where \(K(q,\lambda)\) is an explicit constant depending only on q and λ. We also mention
Remark 2 ([29]). (i) We have that
\[
\lim_{n\to\infty}\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor}\Psi(nx-k)\neq 1,
\]
for at least some \(x\in[a,b]\), where \([a,b]\subset\mathbb{R}\). (ii) Let \([a,b]\subset\mathbb{R}\). For large n, we always have \(\lceil na\rceil\leq\lfloor nb\rfloor\). Also, \(a\leq\frac{k}{n}\leq b\), iff \(\lceil na\rceil\leq k\leq\lfloor nb\rfloor\). In general, it holds
\[
\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor}\Psi(nx-k)\leq 1.
\]
We need
Definition 4. Let \(f\in C([a,b])\) and \(n\in\mathbb{N}\). We introduce and define the real-valued linear neural network operators
\[
B_n(f,x):=\frac{\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor}f\left(\frac{k}{n}\right)\Psi(nx-k)}{\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor}\Psi(nx-k)},\qquad x\in[a,b].
\]
Clearly, \(B_n(f,x)\in\mathbb{R}\). We study here the pointwise and uniform convergence of \(B_n(f,x)\) to \(f(x)\) with rates.
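A real-valued sketch of \(B_n\), mirroring the earlier erf-based one (names and parameters ours):

```python
import numpy as np
from math import ceil, floor

q, lam = 1.5, 2.0

def phi(x):
    return (1 - q * np.exp(-lam * x)) / (1 + q * np.exp(-lam * x))

def B_n(f, x, n, a, b):
    """Psi-weighted neural network operator, Psi(x) = (phi(x+1) - phi(x-1))/4."""
    ks = np.arange(ceil(n * a), floor(n * b) + 1)
    w = 0.25 * (phi(n * x - ks + 1) - phi(n * x - ks - 1))
    return np.dot(w, f(ks / n)) / w.sum()

f = lambda t: np.sqrt(t) * np.cos(t)
for n in (10, 100, 1000):
    print(n, abs(B_n(f, 0.7, n, 0.0, 1.0) - f(0.7)))  # error shrinks in n
```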
For convenience, we also call
\[
B_n^{*}(f,x):=\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor}f\left(\frac{k}{n}\right)\Psi(nx-k),\qquad
W(x):=\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor}\Psi(nx-k);
\]
that is,
\[
B_n(f,x)=\frac{B_n^{*}(f,x)}{W(x)}.
\]
So that
\[
B_n(f,x)-f(x)=\frac{B_n^{*}(f,x)-f(x)\,W(x)}{W(x)}.
\]
Consequently, we derive that
\[
|B_n(f,x)-f(x)|\leq K(q,\lambda)\,\bigl|B_n^{*}(f,x)-f(x)\,W(x)\bigr|,
\]
where \(K(q,\lambda)\) is as in (41). We will estimate the right-hand side of the last quantity.
We present a set of real-valued neural network approximations to a function given with rates.
Theorem 11. Let \(f\in C([a,b])\), \(0<\alpha<1\), \(x\in[a,b]\), and \(n\in\mathbb{N}\) with \(n^{1-\alpha}\geq 3\). Then,
(i) \(|B_n(f,x)-f(x)|\) is bounded by a constant multiple of \(\omega_1\left(f,\frac{1}{n^{\alpha}}\right)\) plus an exponentially small tail term, and
(ii) the same bound holds for \(\|B_n(f)-f\|_{\infty}\).
We observe that \(B_n(f)\to f\), pointwise and uniformly. Next, we present the following.
Theorem 12. Let \(\alpha>0\), \(N=\lceil\alpha\rceil\), \(\alpha\notin\mathbb{N}\), \(f\in AC^{N}([a,b])\) with \(f^{(N)}\in L_{\infty}([a,b])\), \(0<\beta<1\), \(x\in[a,b]\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\). Then,
(i) the pointwise error \(|B_n(f,x)-f(x)|\) and (ii) the uniform error \(\|B_n(f)-f\|_{\infty}\) are bounded in terms of \(\omega_1\left(D_{*x}^{\alpha}f,\frac{1}{n^{\beta}}\right)\) and \(\omega_1\left(D_{x-}^{\alpha}f,\frac{1}{n^{\beta}}\right)\), plus exponentially small tail terms.
When \(\alpha=\frac{1}{2}\), we derive
Corollary 2. Let \(f\in AC([a,b])\) with \(f'\in L_{\infty}([a,b])\), \(0<\beta<1\), \(x\in[a,b]\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\). Then, the estimates of Theorem 12 hold with \(\alpha=\frac{1}{2}\).
3. Combining Sections 2.1 and 2.2
Let \(f\in C([a,b])\), with \([a,b]\subset\mathbb{R}\), and \(n\in\mathbb{N}\). Let also \(x\in[a,b]\).
For the next theorems, we treat the operators of Sections 2.1 and 2.2 together, writing \(L_n\) for either \(A_n\) or \(B_n\). Also, we set the corresponding constant of Theorems 4 and 10. Furthermore, we set the corresponding exponentially small tail quantity.
We present the following.
Theorem 13. Let \(f\in C([a,b])\), \(0<\alpha<1\), \(x\in[a,b]\), and \(n\in\mathbb{N}\) with \(n^{1-\alpha}\geq 3\). Then, for \(L_n\in\{A_n,B_n\}\),
(i) \(|L_n(f,x)-f(x)|\) satisfies the estimates of Theorems 5 and 11, and
(ii) so does \(\|L_n(f)-f\|_{\infty}\).
We observe that \(L_n(f)\to f\), pointwise and uniformly. Proof. From Theorems 5 and 11. □
Next, we present
Theorem 14. Let \(\alpha>0\), \(N=\lceil\alpha\rceil\), \(\alpha\notin\mathbb{N}\), \(f\in AC^{N}([a,b])\) with \(f^{(N)}\in L_{\infty}([a,b])\), \(0<\beta<1\), \(x\in[a,b]\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\). Then, for \(L_n\in\{A_n,B_n\}\), the fractional estimates (i) and (ii) of Theorems 6 and 12 hold.
Proof. From Theorems 6 and 12. □
When \(\alpha=\frac{1}{2}\), we derive
Corollary 3. Let \(f\in AC([a,b])\) with \(f'\in L_{\infty}([a,b])\), \(0<\beta<1\), \(x\in[a,b]\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\). Then, for \(L_n\in\{A_n,B_n\}\), the corresponding estimates hold with \(\alpha=\frac{1}{2}\).
4. About Random Motion on Simple Graphs
Suppose we have a system of semiaxes with a common origin, radially arranged, and a particle moving randomly on them. Possible applications include the spread of toxic particles in a system of channels or vessels, or the propagation of information in networks.
The mathematical model is the following: Let S be the set consisting of n semiaxes \(S_1,\dots,S_n\) with a common origin 0, and let \(\{W_t\}_{t\geq 0}\) be the Brownian motion process on S: namely, the diffusion process on S whose infinitesimal generator L is
\[
(Lf)_j(x)=\frac{1}{2}\,f_j''(x),\qquad x\in S_j,\ j=1,\dots,n,
\]
where \(f=(f_1,\dots,f_n)\), together with the continuity conditions (a total of \(n-1\) equations),
\[
f_1(0)=f_2(0)=\cdots=f_n(0),
\]
and the so-called "Kirchhoff condition"
\[
\sum_{j=1}^{n}p_j\,f_j'(0+)=0,\qquad p_j>0,\quad \sum_{j=1}^{n}p_j=1.
\]
This is a Walsh-type Brownian motion (see [30]).
The process has a standard Brownian motion on each of the semiaxes and, when it hits 0, it continues its motion on the j-th semiaxis with probability \(p_j\).
For each semiaxis \(S_j\), it is convenient to use the coordinate \(x\geq 0\), the distance from the origin. Notice that if f is a function on S, then its j-th component, \(f_j\), is a function on \([0,\infty)\); thus, \(f=(f_1,\dots,f_n)\).
The transition density of \(\{W_t\}_{t\geq 0}\) is, for x on the i-th semiaxis and y on the j-th semiaxis,
\[
p_t(x,y)=\frac{1}{\sqrt{2\pi t}}\Bigl(e^{-\frac{(x-y)^{2}}{2t}}+(2p_j-1)\,e^{-\frac{(x+y)^{2}}{2t}}\Bigr),\qquad \text{if } i=j,
\]
and
\[
p_t(x,y)=\frac{2p_j}{\sqrt{2\pi t}}\,e^{-\frac{(x+y)^{2}}{2t}},\qquad \text{if } i\neq j.
\]
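Assuming the Walsh-type density written above, the following sketch checks that the total mass over the whole graph equals one (the ray probabilities are example values):

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])  # ray probabilities, summing to 1 (example values)

def density(t, i, x, j, y):
    """Walsh-type transition density from x on ray i to y on ray j."""
    g = lambda z: np.exp(-z**2 / (2 * t)) / np.sqrt(2 * np.pi * t)
    if i == j:
        return g(x - y) + (2 * p[j] - 1) * g(x + y)
    return 2 * p[j] * g(x + y)

t, i, x = 0.8, 0, 1.3
ys = np.linspace(0, 60, 600_001)
total = sum(np.trapz(density(t, i, x, j, ys), ys) for j in range(len(p)))
print(total)  # ~1.0
```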
We need the following result.
Theorem 15. Let \(t\in[a,b]\), where \(0<a<b\) are fixed. We consider the function \(f:S\to\mathbb{R}\), which is bounded on S, i.e., there exists \(M>0\) such that \(|f_j(y)|\leq M\) for every \(y\geq 0\) and every \(j\in\{1,\dots,n\}\), and it is Lebesgue measurable on S. Let also \(\{W_t\}_{t\geq 0}\) be the standard Brownian motion on each of the semiaxes, as described above. Here, the starting point x is fixed on one of the semiaxes, and \(t\in[a,b]\). We consider the related expected value function
\[
u(t):=\mathbb{E}^{x}\bigl[f(W_t)\bigr]=\sum_{j=1}^{n}\int_{0}^{\infty}f_j(y)\,p_t(x,y)\,dy.
\]
The function \(u\) is continuous in t and differentiable. Proof. First, we observe that for \(t\in[a,b]\) and \(y\geq 0\), the density is jointly continuous, with
\[
p_t(x,y)\leq\frac{2}{\sqrt{2\pi a}}\,e^{-\frac{(x-y)^{2}}{2b}}.
\]
Also, for \(t\in[a,b]\) and \(y\geq 0\), it is
\[
|f_j(y)\,p_t(x,y)|\leq\frac{2M}{\sqrt{2\pi a}}\,e^{-\frac{(x-y)^{2}}{2b}}.
\]
It is enough to prove that each integral \(\int_{0}^{\infty}f_j(y)\,p_t(x,y)\,dy\) is continuous in \(t\in[a,b]\).
We have that the integrand is, for each fixed y, continuous in t. Thus, and furthermore, as \(t_k\to t\), with \(t_k,t\in[a,b]\), we obtain
\[
f_j(y)\,p_{t_k}(x,y)\longrightarrow f_j(y)\,p_t(x,y),\qquad \forall\,y\geq 0.
\]
By the dominated convergence theorem,
\[
\int_{0}^{\infty}f_j(y)\,p_{t_k}(x,y)\,dy\longrightarrow\int_{0}^{\infty}f_j(y)\,p_t(x,y)\,dy,
\]
and thus each integral is continuous in \(t\in[a,b]\); consequently, the function \(u\) is continuous in t. □
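The expected value function can be cross-checked numerically. Below, a crude vectorized random-walk scheme (reflection at the origin with i.i.d. ray resampling at each crossing; a simplification of the Walsh dynamics, and our construction) is compared against exact integration of f against the transition density, for the example choice \(f_j(y)=e^{-y}\) on every ray:

```python
import numpy as np

rng = np.random.default_rng(0)
p = np.array([0.5, 0.3, 0.2])
t, x0, ray0 = 0.8, 1.3, 0

def simulate(n_paths=20000, steps=800):
    """Crude Euler scheme: reflect at the origin and resample the ray
    with probabilities p at every crossing (our simplification)."""
    dt = t / steps
    r = np.full(n_paths, x0)
    ray = np.full(n_paths, ray0)
    for _ in range(steps):
        r += np.sqrt(dt) * rng.standard_normal(n_paths)
        hit = r < 0
        r[hit] = -r[hit]
        ray[hit] = rng.choice(len(p), size=int(hit.sum()), p=p)
    return ray, r

ray, r = simulate()
print(np.exp(-r).mean())  # Monte Carlo estimate of u(t) for f_j(y) = exp(-y)

# Cross-check by integrating f against the transition density p_t(x, y).
g = lambda z: np.exp(-z**2 / (2 * t)) / np.sqrt(2 * np.pi * t)
ys = np.linspace(0, 60, 600_001)
dens = g(x0 - ys) + (2 * p[ray0] - 1) * g(x0 + ys) \
     + sum(2 * p[j] * g(x0 + ys) for j in range(len(p)) if j != ray0)
print(np.trapz(np.exp(-ys) * dens, ys))
```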
We also need the next theorem.
Theorem 16. Let \(t\in[a,b]\), where \(0<a<b\) are fixed. We consider the function \(f:S\to\mathbb{R}\), which is bounded on S and Lebesgue measurable on S. Let also \(\{W_t\}_{t\geq 0}\) be the standard Brownian motion on each of the semiaxes, as described above. Here, the starting point x is fixed on one of the semiaxes, and \(t\in[a,b]\). Then, the related expected value function
\[
u(t)=\mathbb{E}^{x}\bigl[f(W_t)\bigr]=\sum_{j=1}^{n}\int_{0}^{\infty}f_j(y)\,p_t(x,y)\,dy
\]
is differentiable in \(t\), and
\[
u'(t)=\sum_{j=1}^{n}\int_{0}^{\infty}f_j(y)\,\frac{\partial p_t}{\partial t}(x,y)\,dy,
\]
which is continuous in t. Proof. First, we observe that for \(t\in[a,b]\) and \(y\geq 0\), the partial derivative \(\frac{\partial p_t}{\partial t}(x,y)\) exists and is jointly continuous. Also, for \(t\in[a,b]\) and \(y\geq 0\), it is \(|f_j(y)|\leq M\). Furthermore, for every \(t\in[a,b]\) and every \(y\geq 0\), both \(p_t(x,y)\) and \(\frac{\partial p_t}{\partial t}(x,y)\) are bounded by Gaussian-type functions of y that do not depend on t.
So, \(p_t(x,y)\) and \(\frac{\partial p_t}{\partial t}(x,y)\) are bounded with respect to t. The bounds are integrable with respect to y on \([0,\infty)\), respectively.
We have
\[
u(t)=\sum_{j=1}^{n}\int_{0}^{\infty}f_j(y)\,p_t(x,y)\,dy.
\]
We apply differentiation under the integral sign. We notice that the integrand and its t-derivative are continuous in t and dominated by integrable functions. Therefore, there exists
\[
u'(t)=\sum_{j=1}^{n}\int_{0}^{\infty}f_j(y)\,\frac{\partial p_t}{\partial t}(x,y)\,dy,
\]
which is continuous in t (same proof as in Theorem 15). □
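Differentiation under the integral sign can be sanity-checked: the sketch below compares a central difference of \(u\) with the integral of f against \(\partial p_t/\partial t\) (the Gaussian-kernel time derivative is computed by hand; the parameters are example values):

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])
t, x, i = 0.8, 1.3, 0
ys = np.linspace(0, 60, 600_001)

def g(z, s):
    return np.exp(-z**2 / (2 * s)) / np.sqrt(2 * np.pi * s)

def dens(s):
    """Marginal density over all rays, for f identical on each ray."""
    return g(x - ys, s) + (2 * p[i] - 1) * g(x + ys, s) \
         + sum(2 * p[j] * g(x + ys, s) for j in range(len(p)) if j != i)

def dg_dt(z, s):
    # d/ds of the Gaussian kernel: g * (z^2 / (2 s^2) - 1 / (2 s))
    return g(z, s) * (z**2 / (2 * s**2) - 1 / (2 * s))

f = np.exp(-ys)
u = lambda s: np.trapz(f * dens(s), ys)
h = 1e-4
print((u(t + h) - u(t - h)) / (2 * h))  # central difference of u

ddens = dg_dt(x - ys, t) + (2 * p[i] - 1) * dg_dt(x + ys, t) \
      + sum(2 * p[j] * dg_dt(x + ys, t) for j in range(len(p)) if j != i)
print(np.trapz(f * ddens, ys))          # differentiate under the integral
```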
5. Main Results
We present the following general approximation results of Brownian motion on simple graphs.
Theorem 17. We consider the function \(f:S\to\mathbb{R}\), which is bounded on S and Lebesgue measurable on S. Let also \(u(t)=\mathbb{E}^{x}[f(W_t)]\) be the related expected value function.
If \(0<\alpha<1\), \(n\in\mathbb{N}\) with \(n^{1-\alpha}\geq 3\), and \(t\in[a,b]\), where \(0<a<b\) are fixed, then, for \(L_n\in\{A_n,B_n\}\),
(i) the estimates of Theorem 13 hold for \(u\), and (ii) we observe that \(L_n(u)\to u\), pointwise and uniformly. Proof. From Theorem 13. □
Next, we present
Theorem 18. We consider the function \(f:S\to\mathbb{R}\), which is bounded on S and Lebesgue measurable on S. Let also \(u(t)=\mathbb{E}^{x}[f(W_t)]\) be the related expected value function.
If \(0<\alpha<1\), \(0<\beta<1\), and \(t\in[a,b]\), where \(0<a<b\) are fixed, and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\), then, for \(L_n\in\{A_n,B_n\}\), the fractional estimates of Theorem 14 hold for \(u\).
Proof. From Theorem 14. □
When \(\alpha=\frac{1}{2}\), we derive
Corollary 4. We consider the function \(f:S\to\mathbb{R}\), which is bounded on S and Lebesgue measurable on S. Let also \(u(t)=\mathbb{E}^{x}[f(W_t)]\) be the related expected value function.
If \(0<\beta<1\), \(t\in[a,b]\) with \(0<a<b\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\), then, for \(L_n\in\{A_n,B_n\}\), the corresponding estimates hold with \(\alpha=\frac{1}{2}\).
Proof. From Corollary 3. □
We continue with
Theorem 19. We consider the function \(f:S\to\mathbb{R}\), which is bounded on S and Lebesgue measurable on S. Let also \(u'\) be the derivative of the related expected value function (Theorem 16).
If \(0<\alpha<1\), \(n\in\mathbb{N}\) with \(n^{1-\alpha}\geq 3\), and \(t\in[a,b]\), where \(0<a<b\) are fixed, then, for \(L_n\in\{A_n,B_n\}\),
(i) the estimates of Theorem 13 hold for \(u'\), and (ii) we observe that \(L_n(u')\to u'\), pointwise and uniformly. Proof. From Theorem 13. □
6. Applications
Let a function \(f:S\to\mathbb{R}\), which is bounded on S, where \(t\in[a,b]\) with \(0<a<b\), and is Lebesgue measurable on S. For the Brownian motion on simple graphs \(\{W_t\}_{t\geq 0}\), we will use the following notations:
\[
u(t):=\mathbb{E}^{x}\bigl[f(W_t)\bigr],
\]
and \(u'\) for its derivative (see Theorems 15 and 16).
We can apply our main results to the function \(u\). For the first application, consider a function \(f\) whose component \(f_j\) on each semiaxis is a fixed bounded measurable function of \(y\geq 0\), \(j=1,\dots,n\). Let also \(\{W_t\}_{t\geq 0}\) be the Brownian motion on simple graphs. Then, the expectation \(u(t)=\mathbb{E}^{x}[f(W_t)]\) is continuous in t.
Moreover,
Corollary 5. Let \(0<\alpha<1\), \(n\in\mathbb{N}\) with \(n^{1-\alpha}\geq 3\), and \(t\in[a,b]\), where \(0<a<b\); then, for \(L_n\in\{A_n,B_n\}\),
(i) the estimates of Theorems 17 and 19 hold for this \(u\) and \(u'\), and
(ii) we observe that \(L_n(u)\to u\) and \(L_n(u')\to u'\), pointwise and uniformly. Proof. From Theorems 17 and 19. □
Next, we present
Corollary 6. Let \(0<\alpha<1\), \(0<\beta<1\), \(t\in[a,b]\), where \(0<a<b\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\). Then, for \(L_n\in\{A_n,B_n\}\), the fractional estimates of Theorem 18 hold for this \(u\).
Proof. From Theorem 18. □
When \(\alpha=\frac{1}{2}\), we derive
Corollary 7. Let \(0<\beta<1\), \(t\in[a,b]\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\). Then, for \(L_n\in\{A_n,B_n\}\), the corresponding estimates hold with \(\alpha=\frac{1}{2}\).
Proof. From Corollary 4. □
For the next application, we consider a function \(f\) whose component \(f_j\) is, for every \(y\geq 0\), another fixed bounded measurable expression, the same on each semiaxis. Let also \(\{W_t\}_{t\geq 0}\) be the Brownian motion on simple graphs. Then, the expectation \(u(t)=\mathbb{E}^{x}[f(W_t)]\) is continuous in t.
Moreover,
Corollary 8. Let \(0<\alpha<1\), \(n\in\mathbb{N}\) with \(n^{1-\alpha}\geq 3\), and \(t\in[a,b]\), where \(0<a<b\); then, for \(L_n\in\{A_n,B_n\}\),
(i) the estimates of Theorems 17 and 19 hold for this \(u\) and \(u'\), and
(ii) we observe that \(L_n(u)\to u\) and \(L_n(u')\to u'\), pointwise and uniformly. Proof. From Theorems 17 and 19. □
Next, we present
Corollary 9. Let \(0<\alpha<1\), \(0<\beta<1\), \(t\in[a,b]\), where \(0<a<b\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\). Then, for \(L_n\in\{A_n,B_n\}\), the fractional estimates of Theorem 18 hold for this \(u\).
Proof. From Theorem 18. □
When \(\alpha=\frac{1}{2}\), we derive
Corollary 10. Let \(0<\beta<1\), \(t\in[a,b]\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\). Then, for \(L_n\in\{A_n,B_n\}\), the corresponding estimates hold with \(\alpha=\frac{1}{2}\).
Proof. From Corollary 4. □
Let us consider now a function \(f\) whose component \(f_j\) is, for every \(y\geq 0\), a further fixed bounded measurable expression, the same on each semiaxis. Let also \(\{W_t\}_{t\geq 0}\) be the Brownian motion on simple graphs. Then, the expectation \(u(t)=\mathbb{E}^{x}[f(W_t)]\) is continuous in t.
Moreover,
Corollary 11. Let \(0<\alpha<1\), \(n\in\mathbb{N}\) with \(n^{1-\alpha}\geq 3\), and \(t\in[a,b]\), where \(0<a<b\); then, for \(L_n\in\{A_n,B_n\}\),
(i) the estimates of Theorems 17 and 19 hold for this \(u\) and \(u'\), and
(ii) we observe that \(L_n(u)\to u\) and \(L_n(u')\to u'\), pointwise and uniformly. Proof. From Theorems 17 and 19. □
Next, we present
Corollary 12. Let \(0<\alpha<1\), \(0<\beta<1\), \(t\in[a,b]\), where \(0<a<b\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\). Then, for \(L_n\in\{A_n,B_n\}\), the fractional estimates of Theorem 18 hold for this \(u\).
Proof. From Theorem 18. □
When \(\alpha=\frac{1}{2}\), we derive
Corollary 13. Let \(0<\beta<1\), \(t\in[a,b]\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\). Then, for \(L_n\in\{A_n,B_n\}\), the corresponding estimates hold with \(\alpha=\frac{1}{2}\).
Proof. From Corollary 4. □
In the following, we consider yet another such bounded choice of \(f\), with component \(f_j\) defined for every \(y\geq 0\), the same on each semiaxis. Let also \(\{W_t\}_{t\geq 0}\) be the Brownian motion on simple graphs. Then, the expectation \(u(t)=\mathbb{E}^{x}[f(W_t)]\) is continuous in t.
Moreover,
Corollary 14. Let \(0<\alpha<1\), \(n\in\mathbb{N}\) with \(n^{1-\alpha}\geq 3\), and \(t\in[a,b]\), where \(0<a<b\); then, for \(L_n\in\{A_n,B_n\}\),
(i) the estimates of Theorems 17 and 19 hold for this \(u\) and \(u'\), and
(ii) we observe that \(L_n(u)\to u\) and \(L_n(u')\to u'\), pointwise and uniformly. Proof. From Theorems 17 and 19. □
Next, we present
Corollary 15. Let \(0<\alpha<1\), \(0<\beta<1\), \(t\in[a,b]\), where \(0<a<b\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\). Then, for \(L_n\in\{A_n,B_n\}\), the fractional estimates of Theorem 18 hold for this \(u\).
Proof. From Theorem 18. □
When \(\alpha=\frac{1}{2}\), we derive
Corollary 16. Let \(0<\beta<1\), \(t\in[a,b]\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\). Then, for \(L_n\in\{A_n,B_n\}\), the corresponding estimates hold with \(\alpha=\frac{1}{2}\).
Proof. From Corollary 4. □
Let the generalized logistic sigmoid function \(f\), where
\[
f_j(y)=\frac{1}{1+e^{-\mu y}},\qquad \mu>0,
\]
for every \(y\geq 0\), \(j=1,\dots,n\). Let also \(\{W_t\}_{t\geq 0}\) be the Brownian motion on simple graphs. Then, the expectation \(u(t)=\mathbb{E}^{x}[f(W_t)]\) is continuous in t.
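With the generalized logistic choice above (the parameter name \(\mu\) is ours), \(u(t)\) can be evaluated directly from the transition density:

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])
mu, x, i = 2.0, 1.3, 0          # example parameter values
ys = np.linspace(0, 60, 600_001)

def u(t):
    """u(t) = E^x[f(W_t)] for f_j(y) = 1 / (1 + exp(-mu*y)) on every ray."""
    g = lambda z: np.exp(-z**2 / (2 * t)) / np.sqrt(2 * np.pi * t)
    dens = g(x - ys) + (2 * p[i] - 1) * g(x + ys) \
         + sum(2 * p[j] * g(x + ys) for j in range(len(p)) if j != i)
    return np.trapz((1 / (1 + np.exp(-mu * ys))) * dens, ys)

for t in (0.5, 1.0, 2.0):
    print(t, u(t))  # varies continuously in t
```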
Moreover,
Corollary 17. Let \(0<\alpha<1\), \(n\in\mathbb{N}\) with \(n^{1-\alpha}\geq 3\), and \(t\in[a,b]\), where \(0<a<b\); then, for \(L_n\in\{A_n,B_n\}\),
(i) the estimates of Theorems 17 and 19 hold for this \(u\) and \(u'\), and
(ii) we observe that \(L_n(u)\to u\) and \(L_n(u')\to u'\), pointwise and uniformly. Proof. From Theorems 17 and 19. □
Next, we present
Corollary 18. Let \(0<\alpha<1\), \(0<\beta<1\), \(t\in[a,b]\), where \(0<a<b\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\). Then, for \(L_n\in\{A_n,B_n\}\), the fractional estimates of Theorem 18 hold for this \(u\).
Proof. From Theorem 18. □
When \(\alpha=\frac{1}{2}\), we derive
Corollary 19. Let \(0<\beta<1\), \(t\in[a,b]\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\). Then, for \(L_n\in\{A_n,B_n\}\), the corresponding estimates hold with \(\alpha=\frac{1}{2}\).
Proof. From Corollary 4. □
When \(\mu=1\), we have the usual logistic sigmoid function.
For the last application, we consider the Gompertz function \(f\), where
\[
f_j(y)=\gamma\,e^{-\delta e^{-\kappa y}},\qquad \gamma,\delta,\kappa>0,
\]
for every \(y\geq 0\). The Gompertz function is also a sigmoid function, which describes growth as being slowest at the start and end of a given time period. Let also \(\{W_t\}_{t\geq 0}\) be the Brownian motion on simple graphs. Then, the expectation \(u(t)=\mathbb{E}^{x}[f(W_t)]\) is continuous in t.
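The same computation applies for the Gompertz choice (the parameter names \(\gamma\), \(\delta\), \(\kappa\) are ours):

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])
gam, dlt, kap = 1.0, 2.0, 1.5   # example Gompertz parameters
x, i = 1.3, 0
ys = np.linspace(0, 60, 600_001)
f = gam * np.exp(-dlt * np.exp(-kap * ys))  # slow growth at both ends

def u(t):
    g = lambda z: np.exp(-z**2 / (2 * t)) / np.sqrt(2 * np.pi * t)
    dens = g(x - ys) + (2 * p[i] - 1) * g(x + ys) \
         + sum(2 * p[j] * g(x + ys) for j in range(len(p)) if j != i)
    return np.trapz(f * dens, ys)

print([round(u(t), 6) for t in (0.5, 1.0, 2.0)])  # continuous in t
```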
Moreover,
Corollary 20. Let \(0<\alpha<1\), \(n\in\mathbb{N}\) with \(n^{1-\alpha}\geq 3\), and \(t\in[a,b]\), where \(0<a<b\); then, for \(L_n\in\{A_n,B_n\}\),
(i) the estimates of Theorems 17 and 19 hold for this \(u\) and \(u'\), and
(ii) we observe that \(L_n(u)\to u\) and \(L_n(u')\to u'\), pointwise and uniformly. Proof. From Theorems 17 and 19. □
Next, we present
Corollary 21. Let \(0<\alpha<1\), \(0<\beta<1\), \(t\in[a,b]\), where \(0<a<b\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\). Then, for \(L_n\in\{A_n,B_n\}\), the fractional estimates of Theorem 18 hold for this \(u\).
Proof. From Theorem 18. □
When \(\alpha=\frac{1}{2}\), we derive
Corollary 22. Let \(0<\beta<1\), \(t\in[a,b]\), and \(n\in\mathbb{N}\) with \(n^{1-\beta}\geq 3\). Then, for \(L_n\in\{A_n,B_n\}\), the corresponding estimates hold with \(\alpha=\frac{1}{2}\).
Proof. From Corollary 4. □