Paper Title

Event-based Non-Rigid Reconstruction from Contours

Authors

Yuxuan Xue, Haolong Li, Stefan Leutenegger, Jörg Stückler

Abstract

Visual reconstruction of fast non-rigid object deformations over time is a challenge for conventional frame-based cameras. In this paper, we propose a novel approach for reconstructing such deformations using measurements from event-based cameras. Under the assumption of a static background, where all events are generated by the motion, our approach estimates the deformation of objects from events generated at the object contour in a probabilistic optimization framework. It associates events to mesh faces on the contour and maximizes the alignment of the line of sight through the event pixel with the associated face. In experiments on synthetic and real data, we demonstrate the advantages of our method over state-of-the-art optimization and learning-based approaches for reconstructing the motion of human hands. A video of the experiments is available at https://youtu.be/gzfw7i5OKjg.
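The core geometric idea in the abstract can be illustrated with a minimal numpy sketch: an event pixel is back-projected into a viewing ray, and since rays through contour pixels graze the surface, a well-aligned contour face has its normal perpendicular to that ray. This is an illustrative sketch under that assumption, not the authors' implementation; all function names and the simple squared-dot-product cost are hypothetical.

```python
import numpy as np

def backproject(pixel, K):
    """Unit-length viewing ray through a pixel, given pinhole intrinsics K."""
    uv1 = np.array([pixel[0], pixel[1], 1.0])
    ray = np.linalg.solve(K, uv1)  # direction in the camera frame
    return ray / np.linalg.norm(ray)

def face_normal(vertices):
    """Unit normal of a triangle given its (3, 3) corner array."""
    n = np.cross(vertices[1] - vertices[0], vertices[2] - vertices[0])
    return n / np.linalg.norm(n)

def alignment_cost(ray, normal):
    """At the contour the line of sight is tangent to the surface, so a
    well-aligned face has its normal perpendicular to the ray; the cost
    vanishes when ray and normal are orthogonal."""
    return float(ray @ normal) ** 2
```

Summing this cost over all event-to-face associations and minimizing it over the deformation parameters is the flavor of objective the abstract describes; the paper embeds it in a probabilistic optimization rather than this bare least-squares form.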
