Dondrup, Christian, Bellotto, Nicola, Jovan, Ferdian and Hanheide, Marc (2015)
Real-time multisensor people tracking for human-robot spatial interaction.
In: Workshop on Machine Learning for Social Robotics at ICRA 2015, 26 - 31 May 2015, Seattle, WA.
PDF: [dondrup.pdf](http://eprints.lincoln.ac.uk/17545/1.hassmallThumbnailVersion/dondrup.pdf) - Whole Document, 5MB
Item Type: Conference or Workshop contribution (Paper)
Item Status: Live Archive
Abstract
All currently used mobile robot platforms are able to navigate safely through their environment, avoiding static and dynamic obstacles. However, in human-populated environments mere obstacle avoidance is not sufficient to make humans feel comfortable and safe around robots. To this end, a large community is currently producing human-aware navigation approaches to create more socially acceptable robot behaviour. A major building block for all Human-Robot Spatial Interaction is the ability to detect and track humans in the vicinity of the robot. We present a fully integrated people perception framework, designed to run in real-time on a mobile robot. This framework employs detectors based on laser and RGB-D data, and a tracking approach able to fuse multiple detectors using different versions of data association and Kalman filtering. The resulting trajectories are transformed into Qualitative Spatial Relations based on a Qualitative Trajectory Calculus, to learn and classify different encounters using a Hidden Markov Model based representation. We present this perception pipeline, which is fully implemented in the Robot Operating System (ROS), in a small proof-of-concept experiment. All components are readily available for download and free to use under the MIT license by researchers in all fields, especially those focusing on social interaction learning, as the framework provides different kinds of output, i.e. Qualitative Relations and trajectories.