Paper Title
AWR: Adaptive Weighting Regression for 3D Hand Pose Estimation
Paper Authors
Paper Abstract
In this paper, we propose an adaptive weighting regression (AWR) method to leverage the advantages of both detection-based and regression-based methods. Hand joint coordinates are estimated as the discrete integration of all pixels in a dense representation, guided by adaptive weight maps. This learnable aggregation process introduces both dense and joint supervision, allows end-to-end training, and brings adaptability to the weight maps, making the network more accurate and robust. Comprehensive exploration experiments are conducted to validate the effectiveness and generality of AWR under various experimental settings, especially its usefulness for different types of dense representations and input modalities. Our method outperforms other state-of-the-art methods on four publicly available datasets: NYU, ICVL, MSRA, and HANDS 2017.
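The abstract describes joint coordinates as a discrete integration of all pixel locations weighted by adaptive, network-predicted weight maps. Below is a minimal sketch (not the authors' code) of that aggregation step in PyTorch; the function name `aggregate_joints`, the tensor shapes, and the softmax normalization are assumptions for illustration, and the sketch shows only 2D spatial aggregation, whereas AWR operates on dense 3D representations from depth input.

```python
# Sketch of adaptive-weighted aggregation: each joint coordinate is the
# expectation of pixel coordinates under a per-joint weight map predicted
# by the network. Shapes and normalization here are illustrative assumptions.
import torch
import torch.nn.functional as F

def aggregate_joints(weight_logits: torch.Tensor) -> torch.Tensor:
    """weight_logits: (B, J, H, W) raw per-joint weight maps.
    Returns: (B, J, 2) joint (x, y) coordinates in [0, 1]."""
    B, J, H, W = weight_logits.shape
    # Normalize weights over the spatial dimensions so they sum to 1 per joint.
    weights = F.softmax(weight_logits.view(B, J, -1), dim=-1).view(B, J, H, W)
    # Pixel coordinate grids, normalized to [0, 1].
    ys = torch.linspace(0, 1, H, device=weight_logits.device).view(1, 1, H, 1)
    xs = torch.linspace(0, 1, W, device=weight_logits.device).view(1, 1, 1, W)
    # Discrete integration: weighted sum of pixel coordinates over all pixels.
    x = (weights * xs).sum(dim=(2, 3))
    y = (weights * ys).sum(dim=(2, 3))
    return torch.stack([x, y], dim=-1)
```

Because the weighted sum is differentiable, both the dense weight maps and the aggregated joint coordinates can be supervised jointly, which is what enables the end-to-end training and adaptability claimed in the abstract.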