Correction

Correction: Anastassiou, G.A. Composition of Activation Functions and the Reduction to Finite Domain. Mathematics 2025, 13, 3177

by
George A. Anastassiou
Department of Mathematical Sciences, University of Memphis, Memphis, TN 38152, USA
Mathematics 2025, 13(24), 4013; https://doi.org/10.3390/math13244013
Submission received: 3 December 2025 / Accepted: 9 December 2025 / Published: 17 December 2025
There was an error in the original publication [1]. A correction has been made to 2. Basics, Paragraphs 6–8:
So we have h2 : ℝ → (−1, 1), h1|(−1,1) : (−1, 1) → (−1, 1), and the strictly increasing function H := h1|(−1,1) ∘ h2 : ℝ → (−1, 1) with H(0) = 0, whose graph contains an arc of finite length starting at (−1, h1(h2(−1))) and terminating at (1, h1(h2(1))). We also denote this arc by H. In particular, H is negative and convex over (−1, 0], and positive and concave over [0, 1).
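For illustration only, here is a minimal numerical sketch of these properties, taking h1 = h2 = tanh (the concrete instance discussed below); the grid and tolerances are our own choices, not part of the original text:

import numpy as np

# Illustrative choice: h1 = h2 = tanh, so H = tanh ∘ tanh on [-1, 1].
# tanh maps ℝ onto (-1, 1), so composing with tanh|_(-1,1) is well defined.
def H(x):
    return np.tanh(np.tanh(x))

x = np.linspace(-1.0, 1.0, 2001)
y = H(x)

assert H(0.0) == 0.0                 # H(0) = 0
assert np.all(np.diff(y) > 0)        # strictly increasing on [-1, 1]
assert np.all(y[x < 0] < 0)          # negative on (-1, 0)
assert np.all(y[x > 0] > 0)          # positive on (0, 1)

# Convex on (-1, 0], concave on [0, 1): second differences change sign at 0
# (small tolerance absorbs floating-point noise near the origin).
d2 = np.diff(y, 2)
mid = len(d2) // 2
assert np.all(d2[:mid] >= -1e-12) and np.all(d2[mid:] <= 1e-12)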
So the arc has compact support [−1, 1], and H behaves like a squashing function; see [3], Ch. 1, p. 8.
From now on we work with |H|, whose graph is a cusp joining the points (−1, |h1(h2(−1))|), (0, 0), and (1, h1(h2(1))), again with compact support [−1, 1]. The endpoints (−1, |h1(h2(−1))|) and (1, h1(h2(1))) belong to the graph of |H|, and so does (0, 0).
Typically H has a steeper slope than h2, but it is flatter and closer to the x-axis than h2 is; e.g., tanh(tanh x) has horizontal asymptotes ±tanh(1) ≈ ±0.76, while tanh x has asymptotes ±1. Clearly H has applications in spiking neural networks.
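As a quick check of the asymptote claim (again with h1 = h2 = tanh; the exact saturation level is tanh(1) ≈ 0.7616, which the text rounds to 0.76):

import numpy as np

H = lambda x: np.tanh(np.tanh(x))

# As x -> ±inf, tanh(x) -> ±1, so tanh(tanh x) -> ±tanh(1): flatter than tanh itself.
print(np.tanh(1.0))        # 0.7615941559557649, the ±0.76 quoted above
print(H(50.0), H(-50.0))   # ≈ ±0.7615941559557649, the saturation levels of H

# The cusp of |H| on [-1, 1]: zero exactly at the origin, positive and symmetric elsewhere.
for t in (-1.0, -0.5, 0.0, 0.5, 1.0):
    print(t, abs(H(t)))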
The author states that the scientific conclusions are unaffected. This correction was approved by the Section Editor-in-Chief. The original publication has also been updated.

Reference

  1. Anastassiou, G.A. Composition of Activation Functions and the Reduction to Finite Domain. Mathematics 2025, 13, 3177.