Paper Title

In-Hand Gravitational Pivoting Using Tactile Sensing

Paper Authors

Jason Toskov, Rhys Newbury, Mustafa Mukadam, Dana Kulić, Akansel Cosgun

Paper Abstract

We study gravitational pivoting, a constrained version of in-hand manipulation, where we aim to control the rotation of an object around the grip point of a parallel gripper. To achieve this, instead of controlling the gripper to avoid slip, we embrace slip to allow the object to rotate in-hand. We collect two real-world datasets, a static tracking dataset and a controller-in-the-loop dataset, both annotated with object angle and angular velocity labels. Both datasets contain force-based tactile information on ten different household objects. We train an LSTM model to predict the angular position and velocity of the held object from purely tactile data. We integrate this model with a controller that opens and closes the gripper, allowing the object to rotate to desired relative angles. We conduct real-world experiments where the robot is tasked to achieve a relative target angle. We show that our approach outperforms a sliding-window-based MLP in a zero-shot generalization setting with unseen objects. Furthermore, we show a 16.6% improvement in performance when the LSTM model is fine-tuned on a small set of data collected with both the LSTM model and the controller in the loop. Code and videos are available at https://rhys-newbury.github.io/projects/pivoting/
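To make the described pipeline concrete, below is a minimal sketch (not the authors' implementation, which is available at the project page above) of an LSTM regressor over force-based tactile sequences combined with a simple open/close gripper loop. The tactile dimensionality, hidden size, and the gripper methods `open_slightly` / `close` are illustrative assumptions.

```python
# Minimal sketch, assuming PyTorch and a hypothetical gripper interface.
# Predicts [relative angle, angular velocity] from a tactile sequence, then
# loosens the grip to let the object pivot under gravity and re-grips near
# the target angle. All dimensions and APIs are placeholders, not the paper's.
import torch
import torch.nn as nn


class TactileLSTM(nn.Module):
    def __init__(self, tactile_dim=24, hidden_dim=128):
        super().__init__()
        self.lstm = nn.LSTM(tactile_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 2)  # outputs [angle, angular_velocity]

    def forward(self, tactile_seq, state=None):
        # tactile_seq: (batch, time, tactile_dim)
        out, state = self.lstm(tactile_seq, state)
        return self.head(out[:, -1]), state  # prediction at the last timestep


def pivot_to_target(model, gripper, tactile_stream, target_angle, tol=5.0):
    """Open the gripper to allow slip, track the estimated relative angle from
    tactile data, and close the gripper once within `tol` degrees of the target."""
    rotated, state = 0.0, None
    gripper.open_slightly()            # hypothetical call: loosen grip to permit slip
    for tactile in tactile_stream:     # stream of (1, 1, tactile_dim) tensors
        pred, state = model(tactile, state)
        angle, _velocity = pred[0].tolist()
        rotated = angle
        if abs(target_angle - rotated) < tol:
            gripper.close()            # hypothetical call: re-grip to stop rotation
            break
    return rotated
```

In this sketch the LSTM carries its hidden state across control steps, mirroring the sequential use of tactile data described in the abstract; the controller itself is reduced to a single open/close decision for clarity.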
