TossNet: Learning to Accurately Measure and Predict Robot Throwing of Arbitrary Objects in Real Time With Proprioceptive Sensing
IEEE Transactions on Robotics (IF 9.4), Pub Date: 2024-06-18, DOI: 10.1109/tro.2024.3416009
Lipeng Chen, Weifeng Lu, Kun Zhang, Yizheng Zhang, Longfei Zhao, Yu Zheng

Accurately measuring and modeling dynamic robot manipulation (e.g., tossing and catching) is particularly challenging due to the inherent nonlinearity, complexity, and uncertainty in high-speed robot motions and in highly dynamic robot–object interactions that occur over very short distances and times. Most studies leverage extrinsic sensors such as visual and tactile feedback for task- or object-centric modeling of manipulation dynamics, which, however, can hit a bottleneck due to their significant cost and complexity, e.g., environmental restrictions. In this work, we investigate whether on-board proprioceptive sensory modalities alone can effectively capture and characterize dynamic manipulation processes. In particular, we present an object-agnostic strategy to learn the robot toss dynamics of arbitrary unknown objects from the spatio-temporal variations of robot toss movements and wrist force/torque (F/T) observations. We then propose TossNet, an end-to-end formulation that jointly measures the robot toss dynamics and predicts the resulting flying trajectories of the tossed objects. Experimental results in both simulation and real-world scenarios demonstrate that our methods can accurately model the robot toss dynamics of both seen and unseen objects and predict their flying trajectories with superior accuracy in near real time. Ablation results are also presented to demonstrate the effectiveness of each proprioceptive modality, and of their correlations, in modeling the toss dynamics. Case studies show that TossNet can be deployed on various real robot platforms for challenging tossing-centric robot applications, such as blind juggling and high-precision robot pitching.
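Once the release state of a tossed object is estimated, its subsequent flight is governed by simple projectile dynamics. The following is a minimal sketch of that final prediction step, assuming a drag-free point-mass model in a z-up world frame; it illustrates the physics only, not the authors' learned TossNet formulation:

```python
import numpy as np

def ballistic_trajectory(p0, v0, duration, dt=0.01, g=9.81):
    """Predict the free-flight path of a tossed object from its release
    state: position p0 (m) and velocity v0 (m/s), both 3-vectors.

    Assumes no air resistance, so p(t) = p0 + v0*t + 0.5*a*t^2 with
    a = (0, 0, -g) in a z-up world frame. Returns an (N, 3) array of
    positions sampled every dt seconds over the given duration.
    """
    t = np.arange(0.0, duration, dt)[:, None]   # (N, 1) time column
    gravity = np.array([0.0, 0.0, -g])          # constant acceleration
    return p0 + v0 * t + 0.5 * gravity * t**2   # broadcasts to (N, 3)

# Example: object released 1 m above the ground with a forward-and-up toss.
p0 = np.array([0.0, 0.0, 1.0])
v0 = np.array([2.0, 0.0, 3.0])
traj = ballistic_trajectory(p0, v0, duration=0.5)
```

In practice, the accuracy of such a prediction hinges entirely on how well the release position and velocity are estimated, which is where learning from proprioceptive signals comes in.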

Updated: 2024-06-18