Title
Correlated Read Noise Reduction in Infrared Arrays Using Deep Learning
Authors
Abstract
We present a new procedure, rooted in deep learning, for constructing science images from data cubes collected by astronomical instruments that use HxRG detectors in low-flux regimes. It improves on conventional algorithms for constructing 2D images from multiple readouts by exploiting the readout scheme of the detectors to reduce the impact of correlated readout noise. We train a convolutional recurrent neural network on simulated astrophysical scenes added to laboratory darks to estimate the flux in each pixel of the science image. This method reduces the noise in the constructed science images compared to standard flux-measurement schemes (correlated double sampling, up-the-ramp sampling), which in turn reduces the error on spectra extracted from these images. On simulated data cubes created in a low signal-to-noise ratio regime, where this method could have the largest impact, we find that the error on our constructed science images falls faster than a $1/\sqrt{N}$ decay, and that the spectrum extracted from the images has, averaged over a test set of three images, a standard error reduced by a factor of 1.85 compared to the standard up-the-ramp pixel sampling scheme. The code used in this project is publicly available on GitHub.
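
To make the baseline comparison concrete, the sketch below illustrates the two standard flux-measurement schemes named in the abstract as they are commonly defined: correlated double sampling (the difference between the last and first non-destructive reads of a ramp) and up-the-ramp sampling (an ordinary least-squares slope fit over all reads). This is a minimal, hypothetical NumPy illustration, not code from the paper's repository; the array names, array sizes, and the equally-spaced-read assumption are our own.

```python
import numpy as np

def cds_flux(ramp, dt):
    """Correlated double sampling: flux ~ (last read - first read) / elapsed time.

    ramp : array of shape (n_reads, ny, nx), non-destructive reads of the detector
    dt   : time between consecutive reads (assumed constant here)
    """
    n_reads = ramp.shape[0]
    return (ramp[-1] - ramp[0]) / ((n_reads - 1) * dt)

def up_the_ramp_flux(ramp, dt):
    """Up-the-ramp sampling: per-pixel least-squares slope of counts vs. time."""
    n_reads = ramp.shape[0]
    t = np.arange(n_reads) * dt                 # read times
    t_c = t - t.mean()                          # centered times
    # Closed-form OLS slope, sum(t_c * y) / sum(t_c^2), applied pixel-wise
    num = np.tensordot(t_c, ramp, axes=(0, 0))
    return num / np.sum(t_c ** 2)

# Illustrative synthetic ramp: 10 reads of a small 64 x 64 cutout, 5.5 s apart,
# with a flat true flux per pixel and uncorrelated Gaussian read noise only.
rng = np.random.default_rng(0)
true_flux = rng.uniform(0.0, 2.0, size=(64, 64))                       # e-/s
reads = np.cumsum(np.broadcast_to(true_flux * 5.5, (10, 64, 64)), axis=0)
reads = reads + rng.normal(0.0, 15.0, size=reads.shape)

flux_cds = cds_flux(reads, dt=5.5)
flux_utr = up_the_ramp_flux(reads, dt=5.5)
```

In this purely uncorrelated-noise toy case, the up-the-ramp estimate already outperforms CDS because it uses every read; the paper's point is that a learned estimator can do better still when the read noise is correlated across reads and pixels, which neither closed-form scheme models.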