Physical Human-Robot Collaboration: Human Intent Prediction

Results of this project are explained in the final report.


Apply to this project here

Motivation

Intuitive and efficient physical human-robot collaboration relies on the mutual observability of the human and the robot, i.e. the two entities being able to interpret each other's intentions and actions. A myriad of methods address this through human sensing or intention decoding. However, physical interaction itself establishes a rich channel of communication through forces, torques, and haptics in general, which is often overlooked in industrial implementations of human-robot interaction. Previous works [1-3] have shown that haptic communication enables human partners to collaborate better with each other. However, little is known about how humans use haptic feedback to predict their partners' intentions. Understanding this will make cooperation between humans and robots more efficient and natural.

Experiment

Together with a research group from Imperial College London, we studied collaboration in human-human and human-robot teams. In our previous experiment [1], we asked 12 participants to collaborate on a physical task in dyads. The task was to balance a ball on a smooth board at a target area. The task was not trivial and required good coordination between the two partners. The experiment was rendered in a virtual reality (VR) environment, and participants interacted with each other via haptic devices (Phantom Touch; 3D SYSTEMS). Each dyad performed 360 trials of the task, with each trial taking about 5-15 s. Kinematic and kinetic data were recorded at 1000 Hz. In half of the trials, participants received haptic feedback from the devices; in the other half, they had no haptic feedback and had to rely only on visual information.

Goal

The goal of this project is to develop an intent predictor that predicts the partner's intent by combining visual and haptic feedback. This can be done using probabilistic models such as hidden Markov models, Kalman filters, or other machine learning algorithms. The model can then be integrated with our robot controller and evaluated in experiments to see whether it improves collaboration between humans and robots.
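To illustrate one of the suggested approaches, the sketch below shows a minimal constant-velocity Kalman filter that tracks the partner's hand from noisy position measurements and extrapolates the filtered state a short horizon ahead as a simple proxy for intent. All model matrices, noise levels, and the example trajectory are illustrative assumptions, not values from the actual experiment or the project's controller.

```python
import numpy as np

# Minimal constant-velocity Kalman filter sketch for short-horizon
# intent prediction. State x = [position, velocity]; the measurement
# is the partner's observed position (e.g. from the haptic device).
# All matrices and noise levels are assumed for illustration only.

dt = 0.001  # 1 kHz sampling, matching the experiment's recording rate

A = np.array([[1.0, dt],
              [0.0, 1.0]])        # constant-velocity dynamics
H = np.array([[1.0, 0.0]])        # position is measured directly
Q = 1e-5 * np.eye(2)              # process noise covariance (assumed)
R = np.array([[1e-3]])            # measurement noise covariance (assumed)

def kf_step(x, P, z):
    """One predict-update cycle; z is the measured position."""
    # Predict
    x = A @ x
    P = A @ P @ A.T + Q
    # Update
    y = z - H @ x                 # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

def predict_intent(x, horizon=0.1):
    """Extrapolate the filtered state `horizon` seconds ahead."""
    n = int(horizon / dt)
    return np.linalg.matrix_power(A, n) @ x

# Example: track a partner moving at a constant 0.2 m/s
x = np.zeros(2)
P = np.eye(2)
rng = np.random.default_rng(0)
for k in range(1000):
    true_pos = 0.2 * k * dt
    z = np.array([true_pos + rng.normal(0.0, 0.03)])
    x, P = kf_step(x, P, z)

goal = predict_intent(x, horizon=0.1)  # predicted position 100 ms ahead
```

In a multimodal variant, haptic force measurements could enter as additional rows of the measurement model, so that the filter fuses visual and haptic channels in the same update.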

Requirements

·       Proficiency in Python

·       Prior knowledge of machine learning algorithms

·       Experience in time-series analysis is preferred


[1] Liu, Y., Leib, R., Dudley, W., Shafti, A., Faisal, A. A., & Franklin, D. W. (2022). The role of haptic communication in dyadic collaborative object manipulation tasks. arXiv preprint arXiv:2203.01287.

[2] Takagi, A., Ganesh, G., Yoshioka, T., Kawato, M., & Burdet, E. (2017). Physically interacting individuals estimate the partner's goal to enhance their movements. Nature Human Behaviour, 1(3), 1-6.

[3] Takai, A., Fu, Q., Doibata, Y., Lisi, G., Tsuchiya, T., Mojtahedi, K., ... & Santello, M. (2021). Leaders are made: Learning acquisition of consistent leader-follower relationships depends on implicit haptic interactions. bioRxiv.

Important notice

Students accepted to this project should attend online workshops at the LRZ in April 2023, before the semester starts, unless they can demonstrate equivalent prior knowledge. More information will be provided to accepted students.