Abstract

LOCALIZATION AND MAPPING USING A SINGLE-PERSPECTIVE CAMERA

Neetu

Vol. 2, Issue 2, 2012, pp. 147–152

Localization and mapping are fundamental problems in mobile robotics. On the one hand, it is crucial to know where the robot is located in its environment in order to perform high-level tasks such as deliveries. On the other hand, a map of the environment is often not available at the outset, so the ability to build one is frequently an essential requirement. The localization problem has been studied intensively in the literature and can be divided into two classes. In the first class, robot localization is solved under the assumption that a map of the environment is known; the goal is to track the robot's position within that map. If the start location is unknown, the problem is referred to as global localization. In the second class, the environment is unknown, and a map must be built on the fly. This problem is called simultaneous localization and mapping (SLAM). In general, SLAM is harder than localization in a known environment, since mapping and localization need to be solved at the same time. SLAM is often considered a chicken-and-egg problem: a map is necessary to localize the robot, and the robot's position and orientation must be known in order to build a map. Thus, cause and effect are circular.
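To make the distinction concrete, the two estimation problems can be written in the standard probabilistic formulation (this notation is not taken from the paper itself; here $x_t$ denotes the robot pose at time $t$, $m$ the map, $z_{1:t}$ the observations, and $u_{1:t}$ the odometry or control inputs):

$$p(x_t \mid z_{1:t},\, u_{1:t},\, m) \qquad \text{(localization in a known map } m\text{)}$$

$$p(x_{1:t},\, m \mid z_{1:t},\, u_{1:t}) \qquad \text{(SLAM: joint posterior over trajectory and map)}$$

The second posterior couples the trajectory and the map, which is exactly the circular dependency described above: errors in the estimated poses corrupt the map, and errors in the map in turn corrupt the pose estimates.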


