Abstract
In this work, we study univariate quantitative smooth approximation, both real and complex, and both ordinary and fractional, under differentiation of the involved functions. The approximators presented here are neural network operators activated by the Richards curve, a parametrized form of the logistic sigmoid function. All functions considered are defined on the whole real line. The neural network operators used here are of the quasi-interpolation type: the basic ones, the Kantorovich-type ones, and those of quadrature type. We provide pointwise and uniform approximations with rates. We finish with applications.
Keywords:
Richards curve function; quasi-interpolation neural network operators; real and complex approximation; ordinary and fractional approximation; quantitative approximations
MSC:
26A33; 41A17; 41A25; 41A30; 46B25
1. Introduction
The author of [1,2] (see Sections 2–5 there) was the first to establish neural network approximation of continuous functions with rates, via very specific neural network operators of Cardaliaguet–Euvrard and “Squashing” types, by using the modulus of continuity of the engaged function or of its high-order derivative and producing very tight Jackson-type inequalities. He treats both the univariate and multivariate cases. The “bell-shaped” and “squashing” functions of these operators are assumed to have compact support.
The author, inspired by [3], continued his studies on neural network approximation by introducing and using the proper quasi-interpolation operators of sigmoidal and hyperbolic tangent types, which resulted in [4,5], treating both the univariate and multivariate cases. He also studied the corresponding fractional cases [5].
A parametrized activation function kills far fewer neurons than the original one.
Therefore, here, the author obtains parametrized Richards-curve-activated neural network approximations of differentiable functions defined on the whole real line, going beyond functions on bounded domains.
We present real and complex, ordinary and fractional, quasi-interpolation quantitative approximations. The fractional case is studied extensively because of the applications of fractional calculus to the interpretation of many natural phenomena and to engineering. We derive Jackson-type inequalities that are close to sharp.
Real feed-forward neural networks (FNNs) with one hidden layer, the ones we use here, are mathematically expressed by
$$N_n(x) = \sum_{j=0}^{n} c_j\, \sigma\left(\langle a_j \cdot x \rangle + b_j\right), \quad x \in \mathbb{R}^s,\ s \in \mathbb{N},$$
where, for $0 \le j \le n$, $b_j \in \mathbb{R}$ are the thresholds, $a_j \in \mathbb{R}^s$ are the connection weights, $c_j \in \mathbb{R}$ are the coefficients, $\langle a_j \cdot x \rangle$ is the inner product of $a_j$ and $x$, and $\sigma$ is the activation function of the network. For more information about neural networks in general, see [6,7,8]. Due to their efficiency, neural network approximations are widely used in many areas, such as differential equations, numerical analysis, statistics, and AI.
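To make this expression concrete, the following is a minimal numerical sketch in Python of evaluating such a one-hidden-layer network. The logistic activation and the random parameter values are illustrative choices only and are not the specific operators studied below; the function names are mine.

```python
import numpy as np

def sigmoid(t):
    """Logistic sigmoid, used here only as an illustrative activation."""
    return 1.0 / (1.0 + np.exp(-t))

def fnn(x, a, b, c, activation=sigmoid):
    """One-hidden-layer feed-forward network:
        N_n(x) = sum_j c_j * activation(<a_j, x> + b_j),
    where a_j are the connection weights, b_j the thresholds,
    and c_j the coefficients, as described in the text."""
    x = np.atleast_1d(x)
    pre = a @ x + b          # inner products <a_j, x> plus thresholds
    return float(c @ activation(pre))

# Tiny usage example with random parameters (n + 1 = 4 hidden units, s = 2 inputs).
rng = np.random.default_rng(0)
a = rng.normal(size=(4, 2))   # connection weights a_j in R^2
b = rng.normal(size=4)        # thresholds b_j
c = rng.normal(size=4)        # coefficients c_j
print(fnn([0.5, -1.0], a, b, c))
```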
2. Preliminaries
The following come from [9], pp. 2–6.
The Richards curve is as follows:
which is of great interest when
The function increases in terms of and is a sigmoid function; specifically, this is a generalized logistic function [10].
It should be noted that
We consider the following activation function:
which is , all .
The function has many applications in epidemiology, especially in the modeling of COVID-19 infection trajectories [11].
We can see that
We notice that
Therefore, G is an even function.
We can observe that
that is
Let . We can observe that
Therefore, for and decrease.
Let then, and and then so that and decreases over
Thus, decreases at .
Clearly, increases at , and
We can observe that
That is, the x axis is the horizontal asymptote for G.
In conclusion, G is a bell symmetric function with the following maximum:
We need to use the following theorems.
Theorem 1.
It holds that
Remark 1.
Because G is even, it holds that
Hence
and
Theorem 2.
It holds that
Proof.
We can observe that
So, is a density function. □
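The exact Richards-curve parametrization and the resulting bell function G are those of [9] and are not reproduced in this extract. Purely for illustration, the following sketch assumes a simple parametrized logistic φ(x) = 1/(1 + e^{−μx}) and the common construction G(x) = (φ(x + 1) − φ(x − 1))/2, and checks numerically the qualitative properties stated above: G is even and bell-shaped with its maximum at the origin, it decays to the x-axis, its integer translates sum to 1, and it integrates to 1. Both the parametrization and the construction are assumptions for this sketch; the actual G of [9] may differ.

```python
import numpy as np

# Hypothetical stand-in for the parametrized sigmoid of [9]; the actual
# Richards curve there may carry further parameters.
def phi(x, mu=2.0):
    return 1.0 / (1.0 + np.exp(-mu * x))

# Assumed bell construction from the sigmoid (centered difference).
def G(x, mu=2.0):
    return 0.5 * (phi(x + 1.0, mu) - phi(x - 1.0, mu))

xs = np.linspace(-12.0, 12.0, 24001)
vals = G(xs)
print("evenness:   max |G(x) - G(-x)| =", np.max(np.abs(vals - G(-xs))))
print("maximum at: x =", xs[np.argmax(vals)], ", G =", vals.max())
print("decay:      G(12) =", G(np.array([12.0]))[0])

# Density-type properties: integer translates sum to 1, total integral is 1.
x0 = 0.3
ks = np.arange(-200, 201)
print("sum of integer translates at x0:", np.sum(G(x0 - ks)))
dx = xs[1] - xs[0]
print("integral of G over R (Riemann sum):", np.sum(vals) * dx)
```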
Remark 2.
We can obtain that
Let . That is . Applying the mean value theorem, we obtain
where
Notice that
We need the following definitions.
Definition 1.
where which is bounded and/or uniformly continuous.
In this article, we study the smooth approximation properties of the following quasi-interpolation neural network operators acting on (continuous and bounded functions):
- (i)
- The basic ones:
- (ii)
- The Kantorovich-type operators:
- (iii)
- Let , , , , and . We also consider the quadrature-type operators:
We will be using the first modulus of continuity:
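The precise formulas of the basic, Kantorovich-type, and quadrature-type operators are those of [9] and are not reproduced here. As a hedged illustration only, the sketch below assumes the typical basic quasi-interpolation form A_n(f, x) = Σ_k f(k/n) G(nx − k) / Σ_k G(nx − k), with the same hypothetical G as in the previous sketch, together with a grid approximation of the first modulus of continuity ω₁(f, δ) = sup{|f(x) − f(y)| : |x − y| ≤ δ}; the function names are mine.

```python
import numpy as np

def phi(x, mu=2.0):
    # Illustrative parametrized logistic; stands in for the Richards curve of [9].
    return 1.0 / (1.0 + np.exp(-mu * x))

def G(x, mu=2.0):
    # Assumed bell function built from the sigmoid (see the earlier sketch).
    return 0.5 * (phi(x + 1.0, mu) - phi(x - 1.0, mu))

def A_n(f, x, n, k_range=200):
    """Assumed basic quasi-interpolation operator:
        A_n(f, x) = sum_k f(k/n) G(n x - k) / sum_k G(n x - k),
    truncated to |k - n x| <= k_range since G decays rapidly."""
    center = int(round(n * x))
    ks = np.arange(center - k_range, center + k_range + 1)
    w = G(n * x - ks)
    return float(np.sum(f(ks / n) * w) / np.sum(w))

def omega1(f, delta, a=-5.0, b=5.0, m=20001):
    """Grid approximation of the first modulus of continuity of f on [a, b]."""
    xs = np.linspace(a, b, m)
    vals = f(xs)
    step = (b - a) / (m - 1)
    best = 0.0
    for s in range(1, int(delta / step) + 1):
        best = max(best, float(np.max(np.abs(vals[s:] - vals[:-s]))))
    return best

# Convergence illustration: |A_n(f, x0) - f(x0)| shrinks as n grows,
# at a rate comparable with omega1(f, 1/n).
f, x0 = np.cos, 0.7
for n in (10, 100, 1000):
    print(n, abs(A_n(f, x0, n) - f(x0)), omega1(f, 1.0 / n))
```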
We are motivated by the following result.
Theorem 3
([9], p. 13). Let , , , , . Then,
- (i)
- and
- (ii)
For f in the space of uniformly continuous and bounded functions, we obtain convergence of the operators to f, both pointwise and uniformly.
3. Main Results
Here, we study the approximation properties of neural network operators , , under differentiation.
Theorem 4.
Here, , , , with , . Then,
- (i)
- (ii)
- Assume , all ; we have that
at high speed
- (iii)
- and
- (iv)
Proof.
Using Taylor’s theorem, we have ()
It follows that
Hence,
Use the following equation:
(I) Let . Then,
(i) case :
We found that
(ii) Case : then,
Consequently, we prove that
Next, we can observe ()
In case , we obtain
Consequently, it holds
Next, we treat
Notice that
Therefore, we have
Hence, it holds that
Thus, we have
and the following holds:
We prove that
and we derive that
Finally, we estimate
The theorem is proved. □
Next comes
Theorem 5.
Here , , , with , . Then,
(i)
(ii) assume all , then
at high speed
(iii)
and
(iv)
Proof.
One can write
Let now with , .
We have that
and
Hence,
Therefore, we can write
where
Call
where .
(I) Let ().
(i) if , then
(ii) if , then
Therefore, when , then
Clearly, now the following holds:
(II) Let .
(i) if , then
(ii) if , then
Hence, when , then
Clearly, then
(by [9], p. 6, Theorem 1.4)
We have found that
Therefore, the following holds:
Finally, we estimate
(… as earlier)
Therefore,
The theorem is proved. □
Therefore, the following theorem holds.
Theorem 6.
Here, , , , with , . Then,
(i)
(ii) Assume all , then,
at high speed
(iii)
(iv)
Proof.
We have that
and
Furthermore, the following holds:
where
Use the following equation:
(I) Let .
(i) if ; then,
(ii) If , then
Therefore, when , then
Clearly, now the following holds:
(II) Let .
(i) if , then
(ii) if , then
So, in general, we obtain
Clearly, then
Therefore, the following holds:
Next, we estimate
The theorem is proved. □
We need the following definition.
Definition 2.
A function is absolutely continuous over , if is absolutely continuous, for every . We can write , if (absolutely continuous functions over ) and .
Definition 3.
Let , ( is the ceiling of the number); . We use the left Caputo fractional derivative ([12,13,14], pp. 49–52) as the following function:
, ; where Γ is the gamma function.
Note that and exists, i.e., on , ∀.
We set , ∀
Lemma 1
(See also [15]). Let , , , and . Then, for any .
Definition 4
(See also [13,16,17]). Let , , . The right Caputo fractional derivative of order is given by
, . We set .
Note that and exists a.e. on , ∀.
Lemma 2
(see also [15]). Let , , . Then, for any .
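The left and right Caputo fractional derivatives recalled in Definitions 3 and 4 can be evaluated numerically by adaptive quadrature once the ordinary N-th derivative of f is available. The following is a minimal sketch under that assumption; the helper names left_caputo and right_caputo are mine, and the closed-form check uses the well-known formula for the Caputo derivative of a power function.

```python
import math
from scipy.integrate import quad

def left_caputo(fN, nu, x0, x):
    """Left Caputo fractional derivative (Definition 3), with N = ceil(nu),
    nu > 0 non-integer, and fN the ordinary N-th derivative of f:
        D_{*x0}^nu f(x) = (1/Gamma(N - nu)) * int_{x0}^{x} (x - t)^(N - nu - 1) fN(t) dt."""
    N = math.ceil(nu)
    val, _ = quad(lambda t: (x - t) ** (N - nu - 1) * fN(t), x0, x, limit=200)
    return val / math.gamma(N - nu)

def right_caputo(fN, nu, x0, x):
    """Right Caputo fractional derivative (Definition 4), for x <= x0:
        D_{x0-}^nu f(x) = ((-1)^N / Gamma(N - nu)) * int_{x}^{x0} (t - x)^(N - nu - 1) fN(t) dt."""
    N = math.ceil(nu)
    val, _ = quad(lambda t: (t - x) ** (N - nu - 1) * fN(t), x, x0, limit=200)
    return (-1) ** N * val / math.gamma(N - nu)

# Sanity check: for f(t) = t^2 (so f'(t) = 2t), x0 = 0 and nu = 0.5,
# the left Caputo derivative equals Gamma(3)/Gamma(2.5) * x^1.5.
nu, x = 0.5, 1.3
print(left_caputo(lambda t: 2.0 * t, nu, 0.0, x))
print(math.gamma(3.0) / math.gamma(2.5) * x ** 1.5)
```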
Convention 1.
We assume that
Proposition 2
(See also [15]). Let , , . Then is continuous in , .
Also we have
Proposition 3
(see also [15]). Let , , . Then, is continuous in , .
We further mention
Proposition 4
(see also [15]). Let , , , and let . Then, is continuous in
Proposition 5
(See also [15]). Let , , , and let . Then, is continuous in
Proposition 6
(See also [15]). Let , , ; . Then, are jointly continuous functions in from
Fractional results follow.
Theorem 7.
Let , , , , , , , , . Assume also that both . Then,
(I)
(II) given , , we have
(III)
and
(IV) Adding , we obtain
As shown above, when , the sum
As we can see here, we obtain fractional pointwise and uniform convergence with rates of  to the unit operator as
Proof.
Let . We have that .
From [12], p. 54, we can derive the left Caputo fractional Taylor’s formula:
for all
Also, from [16], using the right Caputo fractional Taylor’s formula, we can obtain the following:
for all
Hence, the following holds:
for all iff
and
for all , iff
We have that
Therefore, the following holds:
and
Adding the last two equalities (100), (101), we can obtain:
where
with
and
Furthermore, let
for
and
for
Let ; we derive the following:
and
Also, we obtain the following:
and
Therefore, the following holds:
and
Next, we estimate
(by )
Hence, the following holds:
Furthermore, we can obtain:
That is
We have proved that
As was shown earlier, we can obtain that
We have that
and
.
Therefore, the following holds:
and
Thus, it is reasonable to assume that .
Consequently, the following holds:
The theorem is now proved. □
4. Applications for
We obtain the following results:
Corollary 1.
Here, , , , , with ; . Then,
(I)
(II) assume ; we have that
at high speed
(III)
(IV)
Proof.
Use Theorem 4 for □
Corollary 2.
Here, , , , , with ; . Then,
(I)
(II) assume ; we have that
at high speed
(III)
and
(IV)
Proof.
Use Theorems 5 and 6 for □
Corollary 3.
Let , , , , , , . Assume also that . Then,
(I)
and
(II)
Proof.
Use Theorem 7 for □
Next is the case of
Corollary 4.
Let , , , , , . Assume that . Then,
(I)
and
(II)
Proof.
Use Corollary 3. □
5. Complex Neural Network Approximation
Remark 3.
Let , with real and imaginary parts , . Clearly, f is continuous if and are continuous.
Also,
holds for all , given that , .
Here, the following are defined:
We observe here that
and
and
Using we can denote the space of continuous and bounded functions . Clearly, f is bounded if both are bounded from into , where
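Since the complex operators act componentwise on f = u + iv, their numerical evaluation reduces to two real evaluations. A minimal sketch follows, reusing the assumed basic operator A_n and the hypothetical bell function G from the earlier sketches; as before, these forms are assumptions made for illustration.

```python
import numpy as np

def phi(x, mu=2.0):
    # Hypothetical parametrized logistic standing in for the Richards curve of [9].
    return 1.0 / (1.0 + np.exp(-mu * x))

def G(x, mu=2.0):
    # Assumed bell function (see the earlier sketches).
    return 0.5 * (phi(x + 1.0, mu) - phi(x - 1.0, mu))

def A_n(f, x, n, k_range=200):
    # Assumed basic quasi-interpolation operator, as in the earlier sketch.
    center = int(round(n * x))
    ks = np.arange(center - k_range, center + k_range + 1)
    w = G(n * x - ks)
    return float(np.sum(f(ks / n) * w) / np.sum(w))

def A_n_complex(f, x, n):
    """Complex operator acting componentwise on f = u + i v:
    A_n(f, x) = A_n(u, x) + i A_n(v, x)."""
    u = lambda t: np.real(f(t))
    v = lambda t: np.imag(f(t))
    return A_n(u, x, n) + 1j * A_n(v, x, n)

# Usage: f(t) = e^{i t} = cos t + i sin t.
f = lambda t: np.exp(1j * t)
x0, n = 0.7, 200
print(A_n_complex(f, x0, n), f(x0))
```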
Theorem 8.
Let , such that . Assume , , with , . Here, , , . Then,
(I)
(II) Given that , we have
(III)
Proof.
By Theorem 4. □
Theorem 9.
All assumptions and notations are as in Theorem 8. Then,
(I)
(II) Given that , we have
(III)
Proof.
Use Theorems 5 and 6. □
We finish with the following fractional result.
Theorem 10.
Let , such that . Assume , , . Here , , , , , , . Suppose also that . Then,
(I)
(II) Given , , we have
and
(III)
Proof.
By Theorem 7. □
6. Conclusions
The author used parametrized Richards-curve-activated neural network operators to approximate differentiable functions defined on the whole real line, going beyond functions on bounded domains. He presented real and complex, ordinary and fractional quasi-interpolation quantitative approximations. The results are entirely new.
Funding
This research received no external funding.
Data Availability Statement
No new data were created or analyzed in this study. Data sharing is not applicable to this article.
Conflicts of Interest
The author declares no conflicts of interest.
References
- Anastassiou, G.A. Rate of convergence of some neural network operators to the unit-univariate case. J. Math. Anal. Appl. 1997, 212, 237–262. [Google Scholar] [CrossRef]
- Anastassiou, G.A. Quantitative Approximations; Chapman & Hall/CRC: Boca Raton, FL, USA; New York, NY, USA, 2001. [Google Scholar]
- Chen, Z.; Cao, F. The approximation operators with sigmoidal functions. Comput. Math. Appl. 2009, 58, 758–765. [Google Scholar] [CrossRef]
- Anastassiou, G.A. Intelligent Systems: Approximation by Artificial Neural Networks; Intelligent Systems Reference Library; Springer: Berlin/Heidelberg, Germany, 2011; Volume 19. [Google Scholar]
- Anastassiou, G.A. Intelligent Systems II: Complete Approximation by Neural Network Operators; Springer: Berlin/Heidelberg, Germany; New York, NY, USA, 2016. [Google Scholar]
- Haykin, S. Neural Networks: A Comprehensive Foundation, 2nd ed.; Prentice Hall: New York, NY, USA, 1998. [Google Scholar]
- McCulloch, W.; Pitts, W. A logical calculus of the ideas immanent in nervous activity. Bull. Math. Biophys. 1943, 7, 115–133. [Google Scholar] [CrossRef]
- Mitchell, T.M. Machine Learning; WCB-McGraw-Hill: New York, NY, USA, 1997. [Google Scholar]
- Anastassiou, G.A. Parametrized, Deformed and General Neural Networks; Springer: Berlin/Heidelberg, Germany; New York, NY, USA, 2023. [Google Scholar]
- Richards, F.J. A Flexible Growth Function for Empirical Use. J. Exp. Bot. 1959, 10, 290–300. [Google Scholar] [CrossRef]
- Lee, S.Y.; Lei, B.; Mallick, B. Estimation of COVID-19 spread curves integrating global data and borrowing information. PLoS ONE 2020, 15, e0236860. [Google Scholar] [CrossRef]
- Diethelm, K. The Analysis of Fractional Differential Equations; Lecture Notes in Mathematics 2004; Springer: Berlin/Heidelberg, Germany, 2010. [Google Scholar]
- Frederico, G.S.; Torres, D.F.M. Fractional Optimal Control in the sense of Caputo and the fractional Noether’s theorem. Int. Math. Forum 2008, 3, 479–493. [Google Scholar]
- Samko, S.G.; Kilbas, A.A.; Marichev, O.I. Fractional Integrals and Derivatives, Theory and Applications; English translation from the Russian, Integrals and Derivatives of Fractional Order and Some of Their Applications (Nauka i Tekhnika, Minsk, 1987); Gordon and Breach: Amsterdam, The Netherlands, 1993. [Google Scholar]
- Anastassiou, G.A. Fractional Korovkin theory. Chaos Solitons Fractals 2009, 42, 2080–2094. [Google Scholar] [CrossRef]
- Anastassiou, G.A. On Right Fractional Calculus. Chaos Solitons Fractals 2009, 42, 365–376. [Google Scholar] [CrossRef]
- El-Sayed, A.M.A.; Gaber, M. On the finite Caputo and finite Riesz derivatives. Electron. J. Theor. Phys. 2006, 3, 81–95. [Google Scholar]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).