This work presents a comprehensive mathematical framework for symmetrized neural network operators operating under the paradigm of fractional calculus. By introducing a perturbed hyperbolic tangent activation, we construct a family of localized, symmetric, and positive kernel-like densities, which form the analytical backbone for three classes of multivariate operators: quasi-interpolation, Kantorovich-type, and quadrature-type. A central theoretical contribution is the derivation of the
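As an illustrative sketch of the kind of construction described above, the snippet below builds a localized, positive density from a perturbed hyperbolic tangent and symmetrizes it by averaging the parameter q with 1/q. The specific form g(x) = (e^{λx} − q e^{−λx})/(e^{λx} + q e^{−λx}) and the averaging step are common in this literature but are assumptions here, not reproduced from the paper; the parameter values are arbitrary.

```python
import numpy as np

def g(x, q=2.0, lam=1.0):
    """Perturbed hyperbolic tangent; algebraically equal to tanh(lam*x - 0.5*ln q).
    q > 0 and lam > 0 are assumed perturbation/scaling parameters."""
    return (np.exp(lam * x) - q * np.exp(-lam * x)) / (np.exp(lam * x) + q * np.exp(-lam * x))

def density(x, q=2.0, lam=1.0):
    """Localized, positive kernel-like density: a difference of shifted activations.
    Positivity follows because g is strictly increasing."""
    return 0.25 * (g(x + 1, q, lam) - g(x - 1, q, lam))

def sym_density(x, q=2.0, lam=1.0):
    """Symmetrized density: averaging q with 1/q restores evenness,
    since density(-x; q) = density(x; 1/q)."""
    return 0.5 * (density(x, q, lam) + density(x, 1.0 / q, lam))

x = 0.37
# Partition of unity: the sum over integer shifts telescopes to 1
# (the tails decay exponentially, so a finite window suffices).
pu = sum(density(x - k) for k in range(-50, 51))
# Evenness of the symmetrized density.
sym_err = abs(sym_density(1.3) - sym_density(-1.3))
```

Truncating the shift sum at |k| = 50 is safe because the density inherits exponential decay from the activation, one of the properties the abstract highlights.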
Voronovskaya–Santos–Sales Theorem, which extends classical asymptotic expansions to the fractional domain and provides rigorous error bounds with normalized remainder terms governed by Caputo derivatives. The operators exhibit key properties, including partition of unity, exponential decay, and scaling invariance, which are essential for stable and accurate approximation in high-dimensional settings and in systems governed by nonlocal dynamics. The theoretical framework is validated through applications in signal processing and fractional fluid dynamics, including the formulation of nonlocal viscous models and fractional Navier–Stokes equations with memory effects. Numerical experiments demonstrate a relative error reduction of up to 92.5% compared with classical quasi-interpolation operators, with the convergence rates observed under Caputo derivatives for the stated choices of the perturbation and fractional-order parameters. This synergy between neural operator theory, asymptotic analysis, and fractional calculus not only advances the theoretical landscape of function approximation but also provides practical computational tools for addressing complex physical systems characterized by long-range interactions and anomalous diffusion.
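For reference, the Caputo fractional derivative that governs the remainder terms in the expansion above is the standard one:

```latex
{}^{C}D^{\alpha}_{a} f(x)
  = \frac{1}{\Gamma(n-\alpha)} \int_{a}^{x} \frac{f^{(n)}(t)}{(x-t)^{\alpha-n+1}}\,dt,
  \qquad n-1 < \alpha < n,\ n \in \mathbb{N}.
```

A Voronovskaya-type expansion in this setting typically takes the schematic form below; the constant, the exponent, and the base point depend on the operator class and parameters and are not reproduced from the paper:

```latex
A_{n}f(x) - f(x) \sim \frac{C_{\alpha}}{n^{\alpha}}\, {}^{C}D^{\alpha}_{x_{0}} f(x) + o\!\left(n^{-\alpha}\right), \qquad n \to \infty.
```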