A sensor fusion layer to cope with reduced visibility in SLAM

Santos, Joao Machado, Couceiro, Micael S., Portugal, David and Rocha, Rui P. (2015) A sensor fusion layer to cope with reduced visibility in SLAM. Journal of Intelligent & Robotic Systems, 80 (3). pp. 401-422. ISSN 0921-0296


Item Type:Article
Item Status:Live Archive


Mapping and navigating with mobile robots in scenarios with reduced visibility, e.g. due to smoke, dust, or fog, remains a major challenge. Despite the tremendous advances in Simultaneous Localization and Mapping (SLAM) techniques over the past decade, most current algorithms fail in those environments because they usually rely on optical sensors providing dense range data, e.g. laser range finders, stereo vision, LIDARs, RGB-D cameras, etc., whose measurement process is highly disturbed by particles of smoke, dust, or steam. This article addresses the problem of performing SLAM under reduced visibility conditions by proposing a sensor fusion layer that takes advantage of the complementary characteristics of a laser range finder (LRF) and an array of sonars. This sensor fusion layer is ultimately used with a state-of-the-art SLAM technique to provide resilience in scenarios where visibility cannot be assumed at all times. Special attention is given to mapping with commercial off-the-shelf (COTS) sensors, namely arrays of sonars, which are usually available in robotic platforms but raise technical issues that were investigated in the course of this work. Two sensor fusion methods, a heuristic method and a fuzzy logic-based method, are presented and discussed, corresponding to different stages of the research work conducted. The experimental validation of both methods with two different mobile robot platforms in smoky indoor scenarios showed that they provide a robust solution, using only COTS sensors, for adequately coping with reduced visibility in the SLAM process, thus significantly decreasing its impact on the mapping and localization results obtained.
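The core idea of the fusion layer — trusting the LRF in clear air and falling back on sonar as smoke degrades the laser returns — can be illustrated with a minimal per-beam sketch. This is not the authors' implementation; the maximum-range constants, the linear confidence weight, and the function names are illustrative assumptions only:

```python
# Illustrative sketch (not the paper's algorithm): per-beam fusion of a
# laser range finder (LRF) and a sonar covering the same bearing.

LRF_MAX_RANGE = 5.6  # assumed LRF maximum range in metres (hypothetical value)


def visibility_weight(lrf_range, lrf_max=LRF_MAX_RANGE):
    """Fuzzy-style confidence in the LRF reading, in [0, 1].

    Missing or saturated readings suggest the beam was absorbed or
    scattered by smoke particles, so confidence drops to zero; readings
    approaching saturation lose confidence linearly.
    """
    if lrf_range is None or lrf_range <= 0.0 or lrf_range >= lrf_max:
        return 0.0
    return min(1.0, 2.0 * (1.0 - lrf_range / lrf_max))


def fuse_beam(lrf_range, sonar_range):
    """Blend the two readings: full trust in the LRF in clear conditions,
    graceful fallback to the sonar as visibility degrades."""
    if lrf_range is None:
        return sonar_range
    w = visibility_weight(lrf_range)
    return w * lrf_range + (1.0 - w) * sonar_range
```

In clear air (a short, valid LRF return) the fused range equals the laser reading; when the laser saturates or returns nothing, the sonar reading is used unchanged, and intermediate readings are blended. The paper's fuzzy logic-based method would replace this single linear weight with proper membership functions and inference rules.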

Keywords:Dust, Fuzzy logic, Indoor positioning systems, Mapping, Mobile robots, Radar equipment, Range finders, Robotics, Robots, Sonar, Stereo image processing, Stereo vision, Visibility, Commercial off-the-shelf, Complementary characteristics, Mapping and localization, Reduced visibility, Robot Operating System (ROS), Sensor fusion, Simultaneous localization and mapping, SLAM, Heuristic methods
Subjects:H Engineering > H671 Robotics
G Mathematical and Computer Sciences > G740 Computer Vision
Divisions:College of Science > School of Computer Science
ID Code:16820
Deposited On:27 Feb 2015 10:25
