Symmetrized, Perturbed Hyperbolic Tangent-Based Complex-Valued Trigonometric and Hyperbolic Neural Network Accelerated Approximation
Abstract
1. Introduction
2. Basics
3. Main Results
4. Conclusions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
© 2025 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Anastassiou, G.A. Symmetrized, Perturbed Hyperbolic Tangent-Based Complex-Valued Trigonometric and Hyperbolic Neural Network Accelerated Approximation. Mathematics 2025, 13, 1688. https://doi.org/10.3390/math13101688