Open Access Article
Efficient Gamma-Based Zero-Reference Deep Curve Estimation for Low-Light Image Enhancement
by Huitao Zhao, Shaoping Xu *, Liang Peng, Hanyang Hu and Shunliang Jiang
School of Mathematics and Computer Sciences, Nanchang University, Nanchang 330031, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(13), 7382; https://doi.org/10.3390/app15137382
Submission received: 16 May 2025 / Revised: 25 June 2025 / Accepted: 27 June 2025 / Published: 30 June 2025
Abstract
In recent years, the continuous advancement of deep learning technology and its integration into the domain of low-light image enhancement have led to a steady improvement in enhancement effects. However, this progress has been accompanied by an increase in model complexity, imposing significant constraints on applications that demand high real-time performance. To address this challenge, inspired by the state-of-the-art Zero-DCE approach, we introduce a novel method that transforms the low-light image enhancement task into a curve estimation task tailored to each individual image, utilizing a lightweight shallow neural network. Specifically, we first design a novel curve formula based on Gamma correction, which we call the Gamma-based light-enhancement (GLE) curve. This curve enables outstanding performance in the enhancement task by directly mapping the input low-light image to the enhanced output at the pixel level, thereby eliminating the need for multiple iterative mappings as required in the Zero-DCE algorithm. As a result, our approach significantly improves inference speed. Additionally, we employ a lightweight network architecture to minimize computational complexity and introduce a novel global channel attention (GCA) module to enhance the nonlinear mapping capability of the neural network. The GCA module assigns distinct weights to each channel, allowing the network to focus more on critical features. Consequently, it enhances the effectiveness of low-light image enhancement while incurring a minimal computational cost. Finally, our method is trained using a set of zero-reference loss functions, akin to the Zero-DCE approach, without relying on paired or unpaired data. This ensures the practicality and applicability of our proposed method. 
Both quantitative and qualitative experimental comparisons demonstrate that, despite its lightweight design, images enhanced by our method achieve perceptual quality, authenticity, and contrast comparable to those of mainstream state-of-the-art (SOTA) methods, and in some cases even surpass them. Furthermore, our model achieves very fast inference, making it suitable for real-time use in resource-constrained or mobile environments, with broad application prospects.
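To make the two core ideas of the abstract concrete, the sketch below illustrates (a) a gamma-style per-pixel curve mapping applied in a single pass, in contrast to Zero-DCE's iterative curve applications, and (b) a toy global channel attention gate built from global average pooling and a sigmoid. This is a minimal illustration under assumed forms: the paper's actual GLE curve formula and GCA architecture are not given in the abstract, so the function names `gle_enhance` and `global_channel_attention` and their exact operations are hypothetical.

```python
import numpy as np

def gle_enhance(img, gamma_map):
    """Hypothetical Gamma-based light-enhancement (GLE) mapping: each pixel
    of a [0, 1] image is raised to its own predicted exponent, so one pass
    brightens the image (gamma < 1 brightens, gamma > 1 darkens)."""
    return np.clip(img, 1e-6, 1.0) ** gamma_map

def global_channel_attention(feat):
    """Toy global channel attention (GCA) gate: global average pooling per
    channel followed by a sigmoid, re-weighting each channel of an (H, W, C)
    feature map so the network can emphasize informative channels."""
    pooled = feat.mean(axis=(0, 1))           # (C,) channel descriptors
    weights = 1.0 / (1.0 + np.exp(-pooled))   # sigmoid gate in (0, 1)
    return feat * weights                     # broadcast over H and W

# Usage: brighten a uniformly dark synthetic image with gamma = 0.4.
dark = np.full((4, 4, 3), 0.1)
bright = gle_enhance(dark, 0.4)
```

In a real model, `gamma_map` would be predicted per pixel by the lightweight network rather than fixed, which is what turns gamma correction into an image-adaptive curve estimation task.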
Share and Cite
MDPI and ACS Style
Zhao, H.; Xu, S.; Peng, L.; Hu, H.; Jiang, S.
Efficient Gamma-Based Zero-Reference Deep Curve Estimation for Low-Light Image Enhancement. Appl. Sci. 2025, 15, 7382.
https://doi.org/10.3390/app15137382
AMA Style
Zhao H, Xu S, Peng L, Hu H, Jiang S.
Efficient Gamma-Based Zero-Reference Deep Curve Estimation for Low-Light Image Enhancement. Applied Sciences. 2025; 15(13):7382.
https://doi.org/10.3390/app15137382
Chicago/Turabian Style
Zhao, Huitao, Shaoping Xu, Liang Peng, Hanyang Hu, and Shunliang Jiang.
2025. "Efficient Gamma-Based Zero-Reference Deep Curve Estimation for Low-Light Image Enhancement." Applied Sciences 15, no. 13: 7382.
https://doi.org/10.3390/app15137382
APA Style
Zhao, H., Xu, S., Peng, L., Hu, H., & Jiang, S.
(2025). Efficient Gamma-Based Zero-Reference Deep Curve Estimation for Low-Light Image Enhancement. Applied Sciences, 15(13), 7382.
https://doi.org/10.3390/app15137382
Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.