Research on an Underwater Visual Enhancement Method Based on Adaptive Parameter Optimization in a Multi-Operator Framework
Abstract
1. Introduction
- A multi-operator collaborative enhancement pipeline tailored to the degradation characteristics of underwater optical images is proposed. The pipeline counters the degradation caused by light absorption and scattering along three dimensions: luminance compensation, structural detail restoration, and color correction. It combines contrast-limited adaptive histogram equalization (CLAHE), a local contrast enhancement technique that improves detail under non-uniform illumination while limiting over-enhancement and noise amplification, with highlight protection to suppress excessive local brightening. It then applies controlled unsharp masking (USM), a classical sharpening technique for strengthening local contrast and edge detail, together with texture gating so that only genuine structural details are reinforced. Finally, mild saturation compensation alleviates the bluish–green color cast, producing enhanced images that appear more natural and are better suited to feature extraction.
- An adaptive parameter optimization mechanism based on a unified scoring function is developed. By incorporating four constraints—ORB (Oriented FAST and Rotated BRIEF) feature gain, inlier matching gain, improvement in UIQM (Underwater Image Quality Measure), and latency penalty—a scoring scheme associated with real-time scene conditions is constructed. Using this scheme, global parameter search and real-time neighborhood refinement are performed for the three key operators—CLAHE clip limit, USM gain, and Gaussian scale—enabling the enhancement parameters to dynamically and robustly adapt to variations in water conditions, illumination, and degradation severity.
- Extensive experiments on public underwater datasets are conducted to evaluate both subjective visual quality and objective metrics. The superiority of the proposed enhancement method in terms of visual perception is validated on the EUVP and UIEB datasets. Quality indicators such as AG and UIQM are assessed on the RUIE dataset. The effectiveness of the adaptive multi-operator parameter tuning mechanism is verified on the UVE38K dataset, and its stability and consistency are further demonstrated through adjacent-frame matching visualizations and metrics including the two-hop survival ratio.
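The three-stage pipeline outlined above can be illustrated with a minimal single-channel sketch. This is not the paper's implementation: the clipped global equalization below is a simplified, single-tile stand-in for per-tile CLAHE, and the highlight threshold, gradient-median texture gate, and all default values are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def clipped_equalize(channel, clip_limit=0.02):
    """Global histogram equalization with clipping -- a simplified,
    single-tile stand-in for CLAHE (real CLAHE equalizes per tile)."""
    hist, _ = np.histogram(channel, bins=256, range=(0, 256))
    hist = hist.astype(np.float64) / channel.size
    excess = np.clip(hist - clip_limit, 0, None).sum()
    # Clip each bin and redistribute the excess mass uniformly.
    hist = np.minimum(hist, clip_limit) + excess / 256
    cdf = np.cumsum(hist)
    lut = np.clip(255 * cdf, 0, 255).astype(np.uint8)
    return lut[channel]

def enhance(gray, usm_gain=0.8, sigma=1.0, highlight_thresh=220):
    """Toy single-channel pipeline: clipped equalization with highlight
    protection, then texture-gated unsharp masking."""
    eq = clipped_equalize(gray)
    # Highlight protection: keep the original value where already bright.
    eq = np.where(gray >= highlight_thresh, gray, eq)
    # Texture-gated unsharp masking: add the detail layer only where the
    # local gradient magnitude is above the median (crude texture gate).
    blur = gaussian_filter(eq.astype(np.float64), sigma)
    detail = eq - blur
    gy, gx = np.gradient(eq.astype(np.float64))
    mag = np.hypot(gx, gy)
    gate = mag > np.median(mag)
    out = eq + usm_gain * gate * detail
    return np.clip(out, 0, 255).astype(np.uint8)
```

A color version would run this on the luminance channel and add the mild saturation compensation in a separate chroma step, as the pipeline description indicates.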
2. Methodology
2.1. Overall Framework
2.2. Image Enhancement Method
2.2.1. Local Contrast Enhancement and Highlight Protection
2.2.2. Controlled Sharpening and Edge Gating
2.2.3. Mild Saturation Compensation
2.3. Adaptive Parameter Tuning Mechanism
2.3.1. Global Parameter Optimization
2.3.2. Real-Time Neighborhood Refinement
3. Experimental Results and Analysis
3.1. Experimental Setup
3.2. Comparative Experiments on Enhancement Methods
3.2.1. Visual Comparison on Public Datasets
3.2.2. Quantitative Evaluation on the RUIE Dataset
3.3. Validation of the Adaptive Parameter Tuning Mechanism
3.3.1. Overall Performance Analysis
3.3.2. Computational Efficiency Analysis
3.3.3. Validation of Parameter Dynamics and Scoring Response
3.3.4. Feature Matching Visualization and Stability Metrics
4. Conclusions and Future Work
4.1. Conclusions
- (1) This paper proposes an underwater image enhancement method tailored for diverse aquatic environments. By integrating brightness enhancement, structural detail recovery, and chromaticity compensation into a collaborative multi-operator pipeline, the method effectively addresses degradation phenomena caused by underwater light absorption and scattering, including brightness attenuation, structural weakening, and color shifts.
- (2) To further improve the robustness of the enhancement pipeline under varying water conditions, this paper introduces an adaptive parameter optimization mechanism based on multi-metric evaluation. Through offline global parameter search and online neighborhood refinement, key parameters including the CLAHE clip limit, USM gain, and Gaussian scale are dynamically adjusted according to scene variations, thereby overcoming the limitations of fixed-parameter strategies in multi-scene environments.
- (3) Experimental results on public datasets such as RUIE demonstrate that the proposed method achieved the best performance in structural sharpness (AG = 0.5922) and overall visual quality (UIQM = 2.0895). On the UVE38K multi-sequence benchmark, compared with fixed-parameter baselines, the proposed approach yielded an average improvement of 12.5% in ORB keypoint count, 9.3% in inlier matches, and 3.9% in UIQM, while also exhibiting higher geometric consistency and temporal stability in correspondence visualization and inlier-ratio analysis. These results validate the effectiveness of the proposed method in improving both visual quality and feature stability, thereby providing more reliable input images for vision-based SLAM front-end processing.
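The adaptive mechanism summarized in point (2) can be sketched as a weighted score over the four constraints plus a coordinate-wise neighborhood probe around the current parameters. The weights, normalizations, latency budget, and step sizes below are illustrative assumptions, not values from the paper.

```python
def score(n_enh, n_raw, inl_enh, inl_raw, uiqm_enh, uiqm_raw,
          latency_ms, w=(0.4, 0.3, 0.2, 0.1), budget_ms=50.0):
    """Hypothetical unified score: ORB keypoint gain, inlier matching gain,
    UIQM improvement, minus a latency penalty once the budget is exceeded.
    Weights and normalization are illustrative, not the paper's."""
    dn = (n_enh - n_raw) / max(n_raw, 1)      # relative keypoint gain
    di = (inl_enh - inl_raw) / max(inl_raw, 1)  # relative inlier gain
    dq = uiqm_enh - uiqm_raw                  # absolute UIQM improvement
    penalty = max(0.0, latency_ms / budget_ms - 1.0)
    return w[0] * dn + w[1] * di + w[2] * dq - w[3] * penalty

def refine(params, score_fn, step=(0.2, 0.1, 0.1)):
    """Real-time neighborhood refinement (sketch): probe +/- one step on
    each parameter (clip limit, USM gain, Gaussian scale) and keep the
    best-scoring candidate."""
    best, best_s = params, score_fn(params)
    for i, d in enumerate(step):
        for delta in (-d, d):
            cand = list(params)
            cand[i] += delta
            s = score_fn(tuple(cand))
            if s > best_s:
                best, best_s = tuple(cand), s
    return best
```

Run per sequence, this kind of probe lets the tuned triple drift with water conditions instead of staying fixed, which is the behavior the per-sequence parameter tuples in the UVE38K table reflect.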
4.2. Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Raveendran, S.; Patil, M.D.; Birajdar, G.K. Underwater image enhancement: A comprehensive review, recent trends, challenges and applications. Artif. Intell. Rev. 2021, 54, 5413–5467.
- Zhao, L.; Ren, X.; Fu, L.; Yun, Q.; Yang, J. UWS-YOLO: Advancing underwater sonar object detection via transfer learning and orthogonal-snake convolution mechanisms. J. Mar. Sci. Eng. 2025, 13, 1847.
- Xu, R.; Zhu, D.; Pang, W.; Chen, M. An underwater low-light image enhancement algorithm based on image fusion and color balance. J. Mar. Sci. Eng. 2025, 13, 2049.
- Han, M.; Lyu, Z.; Qiu, T.; Xu, M. A review on intelligence dehazing and color restoration for underwater images. IEEE Trans. Syst. Man Cybern. Syst. 2020, 50, 1820–1832.
- Zheng, L.; Wang, Y.; Ding, X.; Mi, Z.; Fu, X. Single underwater image enhancement by attenuation map guided color correction and detail preserved dehazing. Neurocomputing 2021, 425, 160–172.
- Qiang, H.; Zhong, Y.; Zhu, Y.; Zhong, X.; Xiao, Q.; Dian, S. Underwater image enhancement based on multichannel adaptive compensation. IEEE Trans. Instrum. Meas. 2024, 73, 1–10.
- Huang, Y.; Yuan, F.; Xiao, F.; Lu, J.; Cheng, E. Underwater image enhancement based on zero-reference deep network. IEEE J. Ocean. Eng. 2023, 48, 903–924.
- Wang, Y.; Zhang, J.; Cao, Y.; Wang, Z. A deep CNN method for underwater image enhancement. In Proceedings of the IEEE International Conference on Image Processing (ICIP), Beijing, China, 17–20 September 2017; pp. 1382–1386.
- Li, J.; Skinner, K.A.; Eustice, R.M.; Johnson-Roberson, M. WaterGAN: Unsupervised generative network to enable real-time color correction of monocular underwater images. IEEE Robot. Autom. Lett. 2018, 3, 387–394.
- Peng, L.; Zhu, C.; Bian, L. U-Shape transformer for underwater image enhancement. IEEE Trans. Image Process. 2023, 32, 3066–3079.
- Li, C.; Anwar, S.; Hou, J.; Cong, R.; Guo, C.; Ren, W. Underwater image enhancement via medium transmission-guided multi-color space embedding. IEEE Trans. Image Process. 2021, 30, 4985–5000.
- Jamieson, S.; How, J.P.; Girdhar, Y. DeepSeeColor: Realtime adaptive color correction for autonomous underwater vehicles via deep learning methods. In Proceedings of the 2023 IEEE International Conference on Robotics and Automation (ICRA), London, UK, 29 May–2 June 2023; pp. 3095–3101.
- Drews, P., Jr.; do Nascimento, E.; Moraes, F.; Botelho, S.; Campos, M. Transmission estimation in underwater single images. In Proceedings of the IEEE International Conference on Computer Vision Workshops (ICCVW), Sydney, Australia, 2–8 December 2013; pp. 825–830.
- Peng, Y.-T.; Cao, K.; Cosman, P.C. Generalization of the dark channel prior for single image restoration. IEEE Trans. Image Process. 2018, 27, 2856–2868.
- Galdrán, A.; Pardo, D.; Picón, A.; Alvarez-Gila, A. Automatic red-channel underwater image restoration. J. Vis. Commun. Image Represent. 2015, 26, 132–145.
- Pizer, S.M.; Amburn, E.P.; Austin, J.D.; Cromartie, R.; Geselowitz, A.; Greer, T.; ter Haar Romeny, B.; Zimmerman, J.B.; Zuiderveld, K. Adaptive histogram equalization and its variations. Comput. Vis. Graph. Image Process. 1987, 39, 355–368.
- Zuiderveld, K. Contrast limited adaptive histogram equalization. In Graphics Gems IV; Heckbert, P.S., Ed.; Academic Press: Cambridge, MA, USA, 1994; pp. 474–485.
- Land, E.H. The Retinex theory of color vision. Sci. Am. 1977, 237, 108–129.
- Ancuti, C.; Ancuti, C.O.; Haber, T.; Bekaert, P. Enhancing underwater images and videos by fusion. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Providence, RI, USA, 16–21 June 2012; pp. 81–88.
- Reza, A.M. Realization of the contrast limited adaptive histogram equalization (CLAHE) for real-time image enhancement. J. VLSI Signal Process. Syst. Signal Image Video Technol. 2004, 38, 35–44.
- Eilertsen, G.; Mantiuk, R.K.; Unger, J. A comparative review of tone-mapping algorithms for high dynamic range video. Comput. Graph. Forum 2017, 36, 565–592.
- Lischinski, D.; Farbman, Z.; Uyttendaele, M.; Szeliski, R. Interactive local adjustment of tonal values. ACM Trans. Graph. 2006, 25, 646–653.
- Reinhard, E.; Stark, M.; Shirley, P.; Ferwerda, J. Photographic tone reproduction for digital images. ACM Trans. Graph. 2002, 21, 267–276.
- Gonzalez, R.C.; Woods, R.E. Digital Image Processing, 4th ed.; Pearson International: London, UK, 2017; Available online: https://elibrary.pearson.de/book/99.150005/9781292223070 (accessed on 1 July 2025).
- Marr, D.; Hildreth, E. Theory of edge detection. Proc. R. Soc. Lond. B Biol. Sci. 1980, 207, 187–217.
- Chiang, J.Y.; Chen, Y.-C. Underwater image enhancement by wavelength compensation and dehazing. IEEE Trans. Image Process. 2012, 21, 1756–1769.
- Zhang, W.; Dong, L.; Zhang, T.; Xu, W. Enhancing underwater image via color correction and bi-interval contrast enhancement. Signal Process. Image Commun. 2021, 90, 116030.
- Fu, X.; Zhuang, P.; Huang, Y.; Liao, Y.; Zhang, X.-P.; Ding, X. A Retinex-based enhancing approach for single underwater image. In Proceedings of the IEEE International Conference on Image Processing (ICIP), Paris, France, 27–30 October 2014; pp. 4572–4576.
- Fattal, R.; Lischinski, D.; Werman, M. Edge-preserving decompositions for tone and detail manipulation. ACM Trans. Graph. 2007, 26, 67.
- Ma, K.; Zeng, K.; Wang, Z. Perceptual quality assessment for multi-exposure image fusion. IEEE Trans. Image Process. 2015, 24, 3345–3356.
- Islam, M.J.; Xia, Y.; Sattar, J. Fast underwater image enhancement for improved visual perception. IEEE Robot. Autom. Lett. 2020, 5, 3227–3234.
- Li, C.; Guo, J.; Guo, C.; Cong, R.; Wan, S.; Hou, J.; Kwong, S.; Li, J. An underwater image enhancement benchmark dataset and beyond. IEEE Trans. Image Process. 2020, 29, 4376–4389.
- Finlayson, G.D.; Trezzi, E. Shades of gray and colour constancy. In Proceedings of the Color and Imaging Conference (CIC12), Scottsdale, AZ, USA, 9–12 November 2004; pp. 37–41.
- Liu, R.; Fan, X.; Zhu, M.; Hou, M.; Luo, Z. Real-world underwater enhancement: Challenges, benchmarks, and solutions under natural light. IEEE Trans. Circuits Syst. Video Technol. 2020, 30, 4861–4875.
- Gao, X.; Jin, J.; Lin, F.; Huang, H.; Yang, J.; Xie, Y.; Zhang, B. Enhancing underwater images through multi-frequency detail optimization and adaptive color correction. J. Mar. Sci. Eng. 2024, 12, 1790.
- Zhang, W.; Dong, L.; Pan, X.; Zou, P.; Qin, L.; Xu, W. A survey of restoration and enhancement for underwater images. IEEE Access 2019, 7, 182259–182279.
- Wang, S.; Ma, K.; Yeganeh, H.; Wang, Z.; Lin, W. A patch-structure representation method for quality assessment of contrast changed images. IEEE Signal Process. Lett. 2015, 22, 2387–2390.
- Panetta, K.; Gao, C.; Agaian, S. Human-visual-system-inspired underwater image quality measures. IEEE J. Ocean. Eng. 2016, 41, 541–551.
- Yang, M.; Sowmya, A. An underwater color image quality evaluation metric. IEEE Trans. Image Process. 2015, 24, 6062–6071.
- Zhang, Y.; Qi, Q.; Li, K.; Liu, D. Underwater video consistent enhancement: A real-world dataset and solution with progressive quality learning. Multimed. Tools Appl. 2024, 83, 7335–7361.








| Method | AG [34] | PCQI [35] | UIQM [36] | UCIQE [37] |
|---|---|---|---|---|
| Raw | 0.2102 | 0.9974 | 1.0449 | 0.3649 |
| Shades of gray [31] | 0.2337 | 0.9606 | 1.1643 | 0.2821 |
| HE [14] | 0.4038 | 0.6428 | 1.9616 | 0.5024 |
| CLAHE [15] | 0.4100 | 0.7285 | 1.3717 | 0.4182 |
| CLAHE + USM | 0.5281 | 0.5997 | 1.6224 | 0.4607 |
| UDCP [11] | 0.2651 | 0.7231 | 1.9641 | 0.4506 |
| Ours | 0.5922 | 0.5185 | 2.0895 | 0.4367 |
| Sequence | Category | Parameters | Kp_enh ΔN | I_enh ΔI | Q_enh ΔQ | ΔN ↑ | ΔI ↑ | ΔQ ↑ |
|---|---|---|---|---|---|---|---|---|
| turtle_2 | Ours | (2.4, 0.8, 1.0) | 1.193 | 1.023 | 0.515 | 7.3% | 4.2% | 0.7% |
| turtle_2 | Baseline | — | 1.112 | 0.981 | 0.511 | — | — | — |
| mobula_4 | Ours | (1.8, 0.4, 1.2) | 0.554 | 0.464 | 0.042 | 22.2% | 14.9% | 2.4% |
| mobula_4 | Baseline | — | 0.454 | 0.404 | 0.041 | — | — | — |
| marine_r10 | Ours | (2.2, 0.8, 1.0) | 4.379 | 3.035 | 0.538 | 13.6% | 8.4% | 7.2% |
| marine_r10 | Baseline | — | 3.856 | 2.798 | 0.502 | — | — | — |
| marine_r8 | Ours | (2.0, 0.8, 1.2) | 12.730 | 9.017 | 0.822 | 20.6% | 20.6% | 5.7% |
| marine_r8 | Baseline | — | 10.555 | 7.477 | 0.778 | — | — | — |
| marine_r3 | Ours | (1.8, 0.8, 1.2) | 6.368 | 4.860 | 0.815 | 8.0% | 6.7% | 10.1% |
| marine_r3 | Baseline | — | 5.895 | 4.552 | 0.740 | — | — | — |
| marine_r2 | Ours | (1.8, 0.8, 1.2) | 21.295 | 15.495 | 1.025 | 10.3% | 6.4% | 1.7% |
| marine_r2 | Baseline | — | 19.298 | 14.569 | 1.008 | — | — | — |
| coral_1 | Ours | (2.2, 0.8, 1.0) | 0.332 | 0.252 | 0.002 | 5.7% | 4.1% | 0% |
| coral_1 | Baseline | — | 0.314 | 0.242 | 0.002 | — | — | — |
| Overall average improvement | | | | | | 12.5% | 9.3% | 3.9% |
| Sequence | Matched Keypoints | Inlier Ratio | Sampson Error Median | Two-Hop Survival Ratio |
|---|---|---|---|---|
| turtle_2 | 3991 | 0.8251 | 0.2535 | 0.5432 |
| mobula_4 | 3399 | 0.8914 | 0.1662 | 0.7055 |
| marine_r10 | 3477 | 0.8160 | 0.3064 | 0.5366 |
| marine_r8 | 2458 | 0.7843 | 0.2981 | 0.4998 |
| marine_r3 | 3385 | 0.8143 | 0.2918 | 0.5305 |
| marine_r2 | 2053 | 0.7820 | 0.2830 | 0.5079 |
| coral_1 | 14,052 | 0.8515 | 0.2555 | 0.5696 |
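The last column of the stability table can be made concrete with a small sketch. A plausible reading of the two-hop survival ratio is the fraction of keypoints matched from frame t to t+1 whose frame-(t+1) correspondents are matched again into frame t+2; the paper's exact definition may differ, so treat this as an assumed formulation.

```python
def two_hop_survival(matches_01, matches_12):
    """Assumed definition: fraction of frame-0 keypoints matched into
    frame 1 whose frame-1 correspondent is matched onward into frame 2.
    matches_ij maps a keypoint index in frame i to its match in frame j."""
    if not matches_01:
        return 0.0
    survived = sum(1 for j in matches_01.values() if j in matches_12)
    return survived / len(matches_01)
```

For example, if three keypoints survive the first hop and two of their correspondents survive the second, the ratio is 2/3.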
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Yang, Z.; Yang, S.; Fu, Y.; Jiang, H. Research on an Underwater Visual Enhancement Method Based on Adaptive Parameter Optimization in a Multi-Operator Framework. Sensors 2026, 26, 668. https://doi.org/10.3390/s26020668

