Inverse Result of Approximation for the Max-Product Neural Network Operators of the Kantorovich Type and Their Saturation Order
Abstract
1. Introduction
2. Preliminaries
The sigmoidal activation function σ : ℝ → ℝ is assumed to satisfy:
- (1) σ(x) − 1/2 is an odd function;
- (2) σ ∈ C²(ℝ) is concave for x ≥ 0;
- (3) σ(x) = O(|x|^(−α)) as x → −∞, for some α > 0.
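The operators studied in the paper are not written out in this excerpt. As a rough illustration only, the sketch below implements the max-product neural network operator of Kantorovich type in the form standard in the Costarelli–Vinti literature, K_n(f)(x) = max_k [Φσ(nx − k) · n ∫_{k/n}^{(k+1)/n} f(u) du] / max_k Φσ(nx − k) on [0, 1], with Φσ(x) = (σ(x + 1) − σ(x − 1))/2 and the logistic function as σ. The function names and the midpoint quadrature are illustrative choices, not taken from the paper.

```python
import numpy as np

def sigma(x):
    # Logistic function: a sigmoidal activation satisfying conditions (1)-(3):
    # sigma(x) - 1/2 is odd, sigma is concave for x >= 0, and sigma(x) decays
    # exponentially (hence faster than any power) as x -> -infinity.
    return 1.0 / (1.0 + np.exp(-x))

def phi(x):
    # Density function generated by sigma.
    return 0.5 * (sigma(x + 1.0) - sigma(x - 1.0))

def kantorovich_max_product(f, x, n, quad_points=50):
    # Max-product Kantorovich-type neural network operator on [0, 1]:
    #   K_n(f)(x) = max_k [ phi(n*x - k) * (mean of f on [k/n, (k+1)/n]) ]
    #               / max_k phi(n*x - k),   k = 0, ..., n-1.
    k = np.arange(n)
    weights = phi(n * x - k)
    # Approximate the Kantorovich means n * int_{k/n}^{(k+1)/n} f(u) du by
    # averaging f over midpoint samples in each cell.
    t = (np.arange(quad_points) + 0.5) / quad_points
    means = np.array([np.mean(f(ki / n + t / n)) for ki in k])
    return np.max(weights * means) / np.max(weights)

# Example: approximate f(u) = u at x = 0.5; the error shrinks as n grows.
f = lambda u: u
for n in (10, 100):
    print(n, abs(kantorovich_max_product(f, 0.5, n) - 0.5))
```

For this linear test function the pointwise error behaves like 1/(2n), which is consistent with the saturation order O(1/n) that the paper establishes for these operators.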
3. The Saturation Order
4. Local Inverse Result
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
- ReLU: Rectified Linear Unit
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Cantarini, M.; Coroianu, L.; Costarelli, D.; Gal, S.G.; Vinti, G. Inverse Result of Approximation for the Max-Product Neural Network Operators of the Kantorovich Type and Their Saturation Order. Mathematics 2022, 10, 63. https://doi.org/10.3390/math10010063