Paper Title
Deformation-aware Unpaired Image Translation for Pose Estimation on Laboratory Animals
Paper Authors
Paper Abstract
Our goal is to capture the pose of neuroscience model organisms, without using any manual supervision, to be able to study how neural circuits orchestrate behaviour. Human pose estimation attains remarkable accuracy when trained on real or simulated datasets consisting of millions of frames. However, for many applications, simulated models are unrealistic and real training datasets with comprehensive annotations do not exist. We address this problem with a new sim2real domain transfer method. Our key contribution is the explicit and independent modeling of appearance, shape, and pose in an unpaired image translation framework. Our model lets us train a pose estimator on the target domain by transferring readily available body keypoint locations from the source domain to generated target images. We compare our approach with existing domain transfer methods and demonstrate improved pose estimation accuracy on Drosophila melanogaster (fruit fly), Caenorhabditis elegans (worm), and Danio rerio (zebrafish), without requiring any manual annotation on the target domain, despite using simplistic off-the-shelf animal characters for simulation or simple geometric shapes as models. Our new datasets, code, and trained models will be published to support future neuroscientific studies.
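The training scheme the abstract outlines (translate simulated frames into the target domain, carry the simulator's keypoint labels over unchanged, and supervise a pose estimator on the result) can be illustrated with a minimal sketch. This is not the paper's implementation: the `sim_to_real` stub, the toy `PoseEstimator`, and all tensor shapes are hypothetical placeholders standing in for the deformation-aware translator and pose network described above.

```python
# Minimal sketch of the sim2real keypoint-transfer pipeline (assumptions:
# `sim_to_real` and `PoseEstimator` are hypothetical stand-ins, not the
# paper's released models; shapes and hyperparameters are illustrative).
import torch
import torch.nn as nn

class PoseEstimator(nn.Module):
    """Toy keypoint regressor: image -> (num_keypoints, 2) coordinates."""
    def __init__(self, num_keypoints: int = 30):
        super().__init__()
        self.num_keypoints = num_keypoints
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, num_keypoints * 2),
        )

    def forward(self, x):
        return self.backbone(x).view(-1, self.num_keypoints, 2)

def sim_to_real(sim_images: torch.Tensor) -> torch.Tensor:
    """Stand-in for the unpaired image translator. In the actual method
    this maps simulated frames into the real (target) domain while
    preserving pose; here it is an identity stub."""
    return sim_images

# Keypoints come for free from the simulator, so the pose estimator is
# supervised on translated images with no manual target-domain labels.
estimator = PoseEstimator()
optimizer = torch.optim.Adam(estimator.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

for _ in range(3):  # a few dummy iterations with random data
    sim_images = torch.rand(8, 1, 64, 64)   # simulated frames
    sim_keypoints = torch.rand(8, 30, 2)    # known from simulation
    real_like = sim_to_real(sim_images)     # domain transfer step
    loss = loss_fn(estimator(real_like), sim_keypoints)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Because the translation is pose-preserving by construction, the source-domain keypoints remain valid labels for the generated target-domain images; that invariance is what removes the need for manual annotation.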