Paper Title
Relative Drone-Ground Vehicle Localization using LiDAR and Fisheye Cameras through Direct and Indirect Observations
Paper Authors
Paper Abstract
Estimating the pose of an unmanned aerial vehicle (UAV) or drone is a challenging task. It is useful for many applications such as navigation, surveillance, tracking objects on the ground, and 3D reconstruction. In this work, we present a LiDAR-camera-based relative pose estimation method between a drone and a ground vehicle, using a LiDAR sensor and a fisheye camera on the vehicle's roof and another fisheye camera mounted under the drone. The LiDAR sensor directly observes the drone and measures its position, and the two cameras estimate the relative orientation using indirect observation of the surrounding objects. We propose a dynamically adaptive kernel-based method for drone detection and tracking using the LiDAR. We detect vanishing points in both cameras and find their correspondences to estimate the relative orientation. Additionally, we propose a rotation correction technique by relying on the observed motion of the drone through the LiDAR. In our experiments, we were able to achieve very fast initial detection and real-time tracking of the drone. Our method is fully automatic.
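The relative-orientation step described in the abstract, aligning corresponding vanishing-point directions seen by the vehicle and drone fisheye cameras, can be illustrated with a small sketch. The snippet below is not the paper's implementation; it assumes corresponding unit direction vectors have already been extracted in each camera frame and recovers the aligning rotation as an orthogonal Procrustes problem (Kabsch-style, via SVD). The function name and the synthetic test data are illustrative only.

import numpy as np

def relative_rotation_from_vanishing_dirs(dirs_vehicle, dirs_drone):
    """Estimate the rotation R such that dirs_vehicle[i] ~ R @ dirs_drone[i].

    dirs_vehicle, dirs_drone: (N, 3) arrays of corresponding unit direction
    vectors (e.g. vanishing-point directions) expressed in each camera frame.
    Solved as an orthogonal Procrustes problem (Kabsch-style, via SVD).
    This is an illustrative sketch, not the authors' algorithm.
    """
    A = np.array(dirs_drone, dtype=float)    # copies, so inputs stay untouched
    B = np.array(dirs_vehicle, dtype=float)
    A /= np.linalg.norm(A, axis=1, keepdims=True)  # guard against non-unit input
    B /= np.linalg.norm(B, axis=1, keepdims=True)
    H = A.T @ B                              # 3x3 cross-covariance of directions
    U, _, Vt = np.linalg.svd(H)
    # Enforce a proper rotation (det = +1) rather than a reflection.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])
    return Vt.T @ D @ U.T                    # maps drone frame -> vehicle frame

if __name__ == "__main__":
    # Toy check: three orthogonal directions seen by the vehicle camera and the
    # same directions expressed in a drone frame rotated by 30 degrees in yaw.
    yaw = np.deg2rad(30.0)
    R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                       [np.sin(yaw),  np.cos(yaw), 0.0],
                       [0.0,          0.0,         1.0]])
    dirs_vehicle = np.eye(3)
    dirs_drone = dirs_vehicle @ R_true       # row i equals R_true.T @ e_i
    R_est = relative_rotation_from_vanishing_dirs(dirs_vehicle, dirs_drone)
    print(np.round(R_est, 3))                # should reproduce R_true

Note that a vanishing point only fixes a direction up to sign, so in practice the correspondences and sign ambiguities between the two cameras must be resolved before such an alignment; the abstract's rotation correction based on the LiDAR-observed drone motion addresses a related disambiguation, but its exact procedure is not reproduced here.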