Entropy
  • This is an early access version, the complete PDF, HTML, and XML versions will be available soon.
  • Article
  • Open Access

31 January 2026

Compact and Interpretable Neural Networks Using Lehmer Activation Units

1 Department of Mathematical and Computational Sciences, University of Toronto, Mississauga, ON L5L 1C6, Canada
2 Department of Computer Science, University of Colorado Boulder, Boulder, CO 80309, USA
3 Department of Mathematics and Statistics, York University, Toronto, ON M3J 1P3, Canada
* Author to whom correspondence should be addressed.
Entropy 2026, 28(2), 157; https://doi.org/10.3390/e28020157
This article belongs to the Special Issue Complexity of AI

Abstract

We introduce Lehmer Activation Units (LAUs), a class of aggregation-based neural activations derived from the Lehmer transform that unify feature weighting and nonlinearity within a single differentiable operator. Unlike conventional pointwise activations, LAUs operate on collections of features and adapt their aggregation behavior through learnable parameters, yielding intrinsically interpretable representations. We develop both real-valued and complex-valued formulations, with the complex extension enabling phase-sensitive interactions and enhanced expressive capacity. We establish a universal approximation theorem for LAU-based networks, providing formal guarantees of expressive completeness. Empirically, we show that LAUs enable highly compact architectures to achieve strong predictive performance under tightly controlled experimental settings, demonstrating that expressive power can be concentrated within individual neurons rather than architectural depth. These results position LAUs as a principled, interpretable, and efficient alternative to conventional activation functions.
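The abstract describes LAUs as aggregation-based activations derived from the Lehmer transform, with learnable parameters controlling the aggregation behavior. The paper's exact formulation is not reproduced on this page, so the following is only a minimal sketch of the classical weighted Lehmer mean from which such units are derived; the function name, the `eps` stabilizer, and the optional weight vector are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def lehmer_aggregate(x, p, w=None, eps=1e-8):
    """Weighted Lehmer mean L_p(x) = sum(w * x**p) / sum(w * x**(p - 1)).

    x   : array of positive feature values to aggregate
    p   : scalar exponent; in an LAU-style unit this would be learnable
    w   : optional nonnegative weights (uniform if None)
    eps : small constant to keep the denominator away from zero (assumption)

    Varying p sweeps the aggregation smoothly: p = 0 gives the harmonic
    mean, p = 1 the arithmetic mean, and p -> +inf approaches max(x),
    which is one way a single parametrized operator can combine feature
    weighting with nonlinearity.
    """
    x = np.asarray(x, dtype=float)
    w = np.ones_like(x) if w is None else np.asarray(w, dtype=float)
    num = np.sum(w * np.power(x, p))
    den = np.sum(w * np.power(x, p - 1.0)) + eps
    return num / den
```

For example, `lehmer_aggregate([1, 2, 3], 1.0)` recovers the arithmetic mean 2.0, while larger `p` pushes the output toward the maximum feature, illustrating how a learnable exponent can interpolate between averaging and max-like selection within one differentiable operator.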
