Paper Title
SGUIE-Net: Semantic Attention Guided Underwater Image Enhancement with Multi-Scale Perception
Paper Authors
Paper Abstract
Due to wavelength-dependent light attenuation, refraction, and scattering, underwater images usually suffer from color distortion and blurred details. However, because paired underwater images with undistorted reference images are scarce, it is quite difficult to train deep enhancement models that cover diverse degradation types. To boost the performance of data-driven approaches, it is essential to establish more effective learning mechanisms that mine richer supervision from limited training samples. In this paper, we propose a novel underwater image enhancement network, called SGUIE-Net, in which we introduce semantic information as high-level guidance shared across different images that contain common semantic regions. Accordingly, we propose a semantic region-wise enhancement module that perceives the degradation of different semantic regions at multiple scales and feeds it back to the global attention features extracted at the original scale. This strategy achieves robust and visually pleasing enhancement of different semantic objects, thanks to the guidance of semantic information for differentiated enhancement. More importantly, for degradation types that are uncommon in the training sample distribution, this guidance connects them with already well-learned types according to their semantic relevance. Extensive experiments on publicly available datasets and our proposed dataset demonstrate the impressive performance of SGUIE-Net. The code and proposed dataset are available at: https://trentqq.github.io/SGUIE-Net.html
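To make the semantics-guided, multi-scale strategy above concrete, here is a minimal PyTorch sketch of how a soft semantic mask could gate region-wise enhancement at several scales and how the result could be fused back into globally attended features at the original scale. All module names, channel sizes, and the fusion scheme are illustrative assumptions for exposition, not the authors' actual SGUIE-Net implementation; consult the project page above for the official code.

```python
# Illustrative sketch only: a semantic mask selects one region, multi-scale
# branches perceive its degradation, and the fused cue is fed back into
# globally attended features. Names and hyperparameters are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SemanticRegionWiseEnhancement(nn.Module):
    """Perceive the degradation of one semantic region at multiple scales."""

    def __init__(self, channels=32, scales=(1, 2, 4)):
        super().__init__()
        self.scales = scales
        self.branches = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(channels, channels, 3, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, channels, 3, padding=1),
            )
            for _ in scales
        )
        self.fuse = nn.Conv2d(channels * len(scales), channels, 1)

    def forward(self, feat, semantic_mask):
        # semantic_mask: (B, 1, H, W) soft mask for one semantic region.
        region = feat * semantic_mask
        outs = []
        for scale, branch in zip(self.scales, self.branches):
            x = F.avg_pool2d(region, scale) if scale > 1 else region
            x = branch(x)
            # Return each scale's response to the original resolution.
            outs.append(F.interpolate(x, size=feat.shape[-2:],
                                      mode="bilinear", align_corners=False))
        return self.fuse(torch.cat(outs, dim=1))


class GlobalAttentionFusion(nn.Module):
    """Feed the region-wise cue back into globally attended features."""

    def __init__(self, channels=32):
        super().__init__()
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels, 1),
            nn.Sigmoid(),
        )
        self.out = nn.Conv2d(channels, 3, 3, padding=1)

    def forward(self, global_feat, region_feat):
        gated = global_feat * self.attn(global_feat)  # channel attention
        return self.out(gated + region_feat)          # enhanced RGB output


if __name__ == "__main__":
    feat = torch.randn(1, 32, 64, 64)        # features at the original scale
    mask = torch.rand(1, 1, 64, 64)          # one semantic region's soft mask
    srwe = SemanticRegionWiseEnhancement()
    fusion = GlobalAttentionFusion()
    enhanced = fusion(feat, srwe(feat, mask))
    print(enhanced.shape)                    # torch.Size([1, 3, 64, 64])
```

In a full model, one such region-wise pass would run per semantic class and the per-region cues would be aggregated before fusion; the single-mask version above is kept deliberately small to show only the gating, multi-scale perception, and feedback steps named in the abstract.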