Laser-based navigation enhanced with 3D time-of-flight data

Yuan, Fang and Swadzba, Agnes and Philippsen, Roland and Engin, Orhan and Hanheide, Marc and Wachsmuth, Sven (2009) Laser-based navigation enhanced with 3D time-of-flight data. In: Conference of 2009 IEEE International Conference on Robotics and Automation, ICRA '09, 12-17 May 2009, Kobe International Conference Center, Kobe, Japan.

Full content URL: http://www.scopus.com/inward/record.url?eid=2-s2.0...

Documents
Yuan2009-Laser-based_navigation_enhanced_with_3D_time-of-flight_data.pdf

Abstract

Navigation and obstacle avoidance in robotics using planar laser scans has matured over the last decades. It basically enables robots to penetrate highly dynamic and populated spaces, such as people's homes, and move around smoothly. However, in an unconstrained environment the two-dimensional perceptual space of a fixed-mounted laser is not sufficient to ensure safe navigation. In this paper, we present an approach that pools a fast and reliable motion generation approach with modern 3D capturing techniques using a Time-of-Flight camera. Instead of attempting to implement full 3D motion control, which is computationally more expensive and simply not needed for the targeted scenario of a domestic robot, we introduce a "virtual laser". For the originally solely laser-based motion generation, the technique of fusing real laser measurements and 3D point clouds into a continuous data stream is 100% compatible and transparent. The paper covers the general concept, the necessary extrinsic calibration of two very different types of sensors, and exemplarily illustrates the benefit, which is to avoid obstacles not perceivable in the original laser scan. © 2009 IEEE.
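The "virtual laser" idea described in the abstract can be sketched as follows: project the Time-of-Flight camera's 3D points (after extrinsic calibration into the laser frame) down into the 2D scan plane, keep the minimum range per angular bin, and fuse the result with the real laser scan by taking the closer reading per beam. This is a minimal illustrative sketch, not the authors' implementation; all function names, parameter values, and the height-band filter are assumptions.

```python
import math

def virtual_scan(points, n_beams=181, fov=math.pi, max_range=10.0,
                 z_min=0.05, z_max=1.5):
    """Project 3D points (assumed already transformed into the laser frame)
    into a 2D 'virtual laser' scan: one minimum range per angular bin.
    Points outside a height band of interest are discarded so the floor
    and overhead structure do not appear as obstacles."""
    ranges = [max_range] * n_beams
    for x, y, z in points:
        if not (z_min <= z <= z_max):
            continue
        angle = math.atan2(y, x)
        if abs(angle) > fov / 2:          # outside the virtual field of view
            continue
        i = int((angle + fov / 2) / fov * (n_beams - 1))
        ranges[i] = min(ranges[i], math.hypot(x, y))
    return ranges

def fuse(real_scan, virtual):
    """Fuse per beam by taking the closer reading, so obstacles visible
    only to the ToF camera still enter the laser-based motion generation."""
    return [min(r, v) for r, v in zip(real_scan, virtual)]
```

Because the fused output has the same shape as a plain laser scan, an existing laser-based planner can consume it unchanged, which is the "100% compatible and transparent" property the abstract refers to.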

Item Type: Conference or Workshop contribution (Paper)
Keywords: Robotics, Human-robot interaction
Subjects: H Engineering > H670 Robotics and Cybernetics
Divisions: College of Science > School of Computer Science
ID Code: 7218
Deposited By: Marc Hanheide
Deposited On: 08 Jan 2013 15:27
Last Modified: 13 Mar 2013 09:21
