Heshmat, Mohamed, Abdellatif, Mohamed, Nakamura, Kazuaki, Abouelsoud, A. A. and Babaguchi, Noboru
(2014)
Camera oscillation pattern for VSLAM: translational versus rotational.
In: The IEEE International Conference on 3D Imaging (IC3D), 9 - 10 Dec 2014, Liege, Belgium.
PDF (Whole Document, 338kB): [Heshmat IC3D 2014 2.pdf](http://eprints.lincoln.ac.uk/25428/1.hassmallThumbnailVersion/Heshmat%20IC3D%202014%202.pdf)
Item Type: Conference or Workshop contribution (Presentation)
Item Status: Live Archive
Abstract
Visual SLAM algorithms exploit natural scene features to infer the camera motion and build a map of the environment landmarks. The SLAM algorithm has two interrelated processes: localization and mapping. For accurate localization, we need the feature location estimates to converge quickly; on the other hand, to build an accurate map, we need accurate localization. Recently, a biologically inspired approach that exploits deliberate camera oscillation has been used to improve the convergence speed of depth estimates. In this paper, we explore the effect of the camera oscillation pattern on the accuracy of VSLAM. Two main oscillation patterns are used for distance estimation: translational and rotational. Experiments, using a static and a moving robot, are conducted to explore the effect of these oscillation patterns on VSLAM performance.
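As a rough illustration of why a translational camera oscillation can speed up depth convergence, the sketch below triangulates a feature's depth from the image disparity induced by a known lateral displacement of the camera, using the standard depth-from-parallax relation Z = f·b/d. This is a minimal sketch of the general principle only, not the paper's algorithm; the focal length, baseline, and pixel measurements are hypothetical values.

```python
# Minimal sketch: depth from parallax under a known lateral (translational)
# camera oscillation. Illustrative only; all numeric values are hypothetical.

def depth_from_parallax(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Triangulate depth Z = f * b / d for a laterally displaced camera."""
    if disparity_px <= 0:
        raise ValueError("Feature must show positive parallax for triangulation.")
    return f_px * baseline_m / disparity_px

# Hypothetical example: a feature observed at x = 320 px before the oscillation
# and x = 314 px after the camera translates 2 cm sideways.
f_px = 525.0          # focal length in pixels (assumed)
baseline_m = 0.02     # lateral displacement of the optical centre (assumed)
disparity_px = 320.0 - 314.0

print(f"Estimated depth: {depth_from_parallax(f_px, baseline_m, disparity_px):.2f} m")
# A larger effective baseline (stronger parallax) yields a better-conditioned
# estimate, which is why the choice of oscillation pattern matters for convergence.
```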