Dynamic Robot Motion Prediction Updates in Physical Human-Robot Interactive Tasks [arxiv]
Shuangda Duan, Longxin Chen, Yaxiang Wang, Xuan Zhao, and Juan Rojas.

Abstract:
Human-robot collaboration is on the rise. Robots must assist humans with ever greater efficiency and smoothness by properly anticipating a human's intention. To do so, prediction models need to improve in both accuracy and responsiveness. This work builds on Interaction Movement Primitives with phase estimation and re-formulates the framework to use dynamic human-motion observations. Previously, the probabilistic framework considered only static human observations. By using dynamic observation windows, a series of updated human motion distributions is generated. Phase estimates occur within the dynamic time window range. Co-activation is performed between the current robot motion distribution and the newest, most probable distribution for the latest dynamic observation. The result is a smooth update of the robot motion generation that achieves high accuracy and enhanced responsiveness.
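
The abstract outlines the pipeline: condition an Interaction-ProMP-style joint weight distribution on a sliding (dynamic) window of human observations, take the robot distribution implied by the newest window, and co-activate it with the current robot distribution. The sketch below illustrates that flow in Python with NumPy under stated assumptions; it is not the released code. The RBF basis, the dimensions, the `human_obs_block` helper, and the fixed blend weight `alpha` are illustrative choices, and phase is treated as known rather than estimated.

```python
# Minimal sketch (not the authors' implementation): Gaussian conditioning of a
# joint human/robot weight distribution on a sliding observation window, then a
# precision-weighted blend ("co-activation") of the previous and newest robot
# weight distributions. All names and dimensions are hypothetical.
import numpy as np

def rbf_basis(phase, n_basis=10):
    """Row of normalized radial basis functions evaluated at a phase in [0, 1]."""
    centers = np.linspace(0.0, 1.0, n_basis)
    b = np.exp(-0.5 * ((phase - centers) / 0.1) ** 2)
    return b / b.sum()

def condition_on_window(mu_w, Sigma_w, phases, y_obs, obs_block, sigma_y=1e-4):
    """Condition the joint weight distribution N(mu_w, Sigma_w) on the human
    observations y_obs collected at the given phases."""
    Phi = np.vstack([obs_block(p) for p in phases])           # (T, D) observation matrix
    S = Phi @ Sigma_w @ Phi.T + sigma_y * np.eye(len(y_obs))  # innovation covariance
    K = Sigma_w @ Phi.T @ np.linalg.inv(S)                    # Kalman-style gain
    mu_new = mu_w + K @ (y_obs - Phi @ mu_w)
    Sigma_new = Sigma_w - K @ Phi @ Sigma_w
    return mu_new, Sigma_new

def coactivate(mu_a, Sigma_a, mu_b, Sigma_b, alpha=0.5):
    """Blend two Gaussian weight distributions with activation weight alpha
    (precision-weighted product, as in ProMP blending)."""
    P = alpha * np.linalg.inv(Sigma_a) + (1.0 - alpha) * np.linalg.inv(Sigma_b)
    Sigma = np.linalg.inv(P)
    mu = Sigma @ (alpha * np.linalg.solve(Sigma_a, mu_a)
                  + (1.0 - alpha) * np.linalg.solve(Sigma_b, mu_b))
    return mu, Sigma

# Toy setup: 10 human weights followed by 10 robot weights, correlated a priori.
n_basis = 10
rng = np.random.default_rng(0)
mu_prior = rng.normal(size=2 * n_basis)
A = rng.normal(size=(2 * n_basis, 2 * n_basis))
Sigma_prior = A @ A.T + np.eye(2 * n_basis)

def human_obs_block(phase):
    """Observation row: human basis on the left, zeros for the robot weights."""
    return np.concatenate([rbf_basis(phase, n_basis), np.zeros(n_basis)])

# Current robot belief starts from the prior's robot block.
mu_robot, Sigma_robot = mu_prior[n_basis:], Sigma_prior[n_basis:, n_basis:]

# Dynamic observation windows: re-condition as the window slides forward.
stream = [(0.05 * t, np.sin(0.05 * t)) for t in range(1, 13)]  # (phase, human value)
window = 4
for t in range(window, len(stream) + 1):
    phases, values = zip(*stream[t - window:t])
    mu_c, Sigma_c = condition_on_window(mu_prior, Sigma_prior,
                                        phases, np.array(values), human_obs_block)
    # Newest robot distribution implied by the latest window.
    mu_new, Sigma_new = mu_c[n_basis:], Sigma_c[n_basis:, n_basis:]
    # Co-activate with the current robot distribution for a smooth update.
    mu_robot, Sigma_robot = coactivate(mu_robot, Sigma_robot, mu_new, Sigma_new, 0.5)

print("updated robot weight mean:", np.round(mu_robot, 3))
```

Each pass over the loop plays the role of one dynamic-window update: the newest conditioned distribution pulls the robot weights toward the latest human observations, while the blend keeps the transition from the previous belief smooth.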

Code

Video

(YouTube | Youku)