Abstract
Underwater images are often degraded by wavelength-dependent absorption, scattering, and turbidity, resulting in color distortion, low contrast, and noise. To address these challenges, we propose a multi-stream preprocessing and multi-scale fusion framework guided by perceptual weight maps for underwater image enhancement. The framework generates three complementary representations of the input: a white-balanced stream for global color correction, a CLAHE-enhanced stream for local contrast improvement, and a Gaussian-filtered CLAHE stream for noise reduction. Each stream is decomposed using Laplacian pyramids, and four weight maps (chromatic, local contrast, saturation, and exposure) are adaptively estimated to guide the fusion process. This approach ensures consistent color correction, enhanced textures and structural details, and effective noise suppression in the reconstructed output. The method was evaluated on the UIEB and EUVP datasets using both reference-based (PSNR, SSIM, MSE) and no-reference (NIQE, AG, UIQM, and entropy) metrics. Comparative experiments with UDCP, CLAHE, Water-Net, Retinex, GDCP, FUnIE-GAN, and UGAN demonstrate consistent improvements in color restoration, visibility, and perceptual quality. Our framework achieved 25.44 dB PSNR, 0.895 SSIM, and 7.68 entropy, outperforming both conventional and learning-based enhancement methods. These results, further supported by histogram analysis and ablation studies, confirm the reliability and effectiveness of the approach.