Paper Title

DFNet: Enhance Absolute Pose Regression with Direct Feature Matching

Paper Authors

Shuai Chen, Xinghui Li, Zirui Wang, Victor Adrian Prisacariu

Paper Abstract

We introduce a camera relocalization pipeline that combines absolute pose regression (APR) and direct feature matching. By incorporating exposure-adaptive novel view synthesis, our method successfully addresses photometric distortions in outdoor environments that existing photometric-based methods fail to handle. With domain-invariant feature matching, our solution improves pose regression accuracy using semi-supervised learning on unlabeled data. In particular, the pipeline consists of two components: the Novel View Synthesizer and DFNet. The former synthesizes novel views that compensate for changes in exposure, and the latter regresses camera poses and extracts robust features that close the domain gap between real and synthetic images. Furthermore, we introduce an online synthetic data generation scheme. We show that these approaches effectively enhance camera pose estimation in both indoor and outdoor scenes. As a result, our method achieves state-of-the-art accuracy, outperforming existing single-image APR methods by as much as 56% and becoming comparable to 3D structure-based methods.
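To make the abstract's pipeline concrete, the following is a minimal, hypothetical sketch of the direct feature matching idea: a pose is regressed from a real query image, a novel view is rendered at that pose, and features extracted from both images are compared, so no ground-truth pose is needed and unlabeled data can be used. The names `pose_regressor`, `render_novel_view`, and `feature_extractor`, as well as the cosine-similarity loss, are illustrative assumptions rather than the paper's actual implementation.

```python
# Illustrative sketch of a direct feature matching loss (not the authors' code).
import torch
import torch.nn.functional as F

def direct_feature_matching_loss(query_img, pose_regressor, feature_extractor, render_novel_view):
    """Compare features of a real image with a view rendered at the predicted pose.

    Because the loss compares the query image only against a synthesized view,
    it requires no ground-truth pose and can be applied to unlabeled images
    (semi-supervised training).
    """
    # 1. Regress a camera pose from the (possibly unlabeled) real query image.
    pred_pose = pose_regressor(query_img)        # e.g. (B, 12) flattened 3x4 pose

    # 2. Render a novel view at the predicted pose with an exposure-adaptive synthesizer.
    synth_img = render_novel_view(pred_pose)     # (B, 3, H, W)

    # 3. Extract domain-invariant features from the real and synthetic images.
    feat_real = feature_extractor(query_img)     # (B, C, h, w)
    feat_synth = feature_extractor(synth_img)    # (B, C, h, w)

    # 4. Penalize feature discrepancy, here via per-pixel cosine similarity.
    feat_real = F.normalize(feat_real, dim=1)
    feat_synth = F.normalize(feat_synth, dim=1)
    return (1.0 - (feat_real * feat_synth).sum(dim=1)).mean()
```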
