
Project: Landmark Detection & Robot Tracking (SLAM)

This repository contains a solution to the third project of Udacity's Computer Vision Nanodegree program.

Introduction

In this project, we'll implement SLAM (Simultaneous Localization and Mapping) for a two-dimensional world. We'll combine knowledge of a robot's sensor measurements and movement to build a map of an environment from only the sensor and motion data the robot gathers over time. SLAM gives us a way to track a robot's location in the world in real time and to identify the locations of landmarks such as buildings, trees, rocks, and other world features. This is an active area of research in robotics and autonomous systems.
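
To make the constraint idea concrete, below is a minimal one-dimensional Graph SLAM sketch in the omega/xi (information matrix / information vector) form used in the second notebook. The specific poses, landmark distances, and unit constraint weights are illustrative assumptions, not the project's exact code.

```python
import numpy as np

# Minimal 1D Graph SLAM sketch (illustrative assumptions, not the project code).
# Variables: two robot poses x0, x1 and one landmark L -> indices 0, 1, 2.
omega = np.zeros((3, 3))   # information matrix
xi = np.zeros((3, 1))      # information vector

# Initial constraint: x0 = 0
omega[0, 0] += 1.0
xi[0, 0] += 0.0

# Motion constraint: x1 - x0 = 5 (robot moved 5 units to the right)
omega[0, 0] += 1.0; omega[0, 1] -= 1.0
omega[1, 0] -= 1.0; omega[1, 1] += 1.0
xi[0, 0] -= 5.0
xi[1, 0] += 5.0

# Measurement constraints: landmark seen at distance 9 from x0 and 4 from x1
for pose, dist in [(0, 9.0), (1, 4.0)]:
    omega[pose, pose] += 1.0; omega[pose, 2] -= 1.0
    omega[2, pose] -= 1.0;    omega[2, 2] += 1.0
    xi[pose, 0] -= dist
    xi[2, 0] += dist

# Best estimate of all poses and landmarks: mu = omega^{-1} * xi
mu = np.linalg.inv(omega) @ xi
print(mu.ravel())  # -> [0. 5. 9.]: x0 = 0, x1 = 5, landmark at 9
```

Every motion or measurement adds a small, local update to omega and xi, and a single matrix solve recovers the globally consistent estimate of all poses and landmarks at once; the project's 2D version follows the same pattern with interleaved x and y coordinates.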

Below is an example of a 2D 50x50 grid world with landmarks (purple x's) and the robot (a red 'o'), whose positions were estimated using only the sensor and motion data collected by that robot.

*Example of SLAM output (estimated final robot pose and landmark locations)*

Instructions

The repository contains five files:

- `1. Robot Moving and Sensing.ipynb` — defines how the robot moves and senses landmarks in the 2D world
- `2. Omega and Xi, Constraints.ipynb` — builds the omega and xi constraint matrices used by Graph SLAM
- `3. Landmark Detection and Tracking.ipynb` — the full SLAM implementation: landmark detection and robot tracking
- `robot_class.py` — implementation of the robot class (see the usage sketch after this list)
- `helpers.py` — helper functions
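
For reference, here is a hypothetical usage sketch of the robot class. The constructor arguments and method names (`make_landmarks`, `move`, `sense`) follow the standard Udacity project template; treat them as assumptions if your copy of `robot_class.py` differs.

```python
# Hypothetical usage sketch of the project's robot class; parameter values
# are illustrative assumptions chosen for a small 50x50 world.
from robot_class import robot

r = robot(world_size=50.0,        # size of the square 2D world
          measurement_range=5.0,  # how far the robot can sense landmarks
          motion_noise=0.2,
          measurement_noise=0.2)

r.make_landmarks(5)           # scatter 5 landmarks at random positions
moved = r.move(1.0, 2.0)      # attempt a (dx, dy) step; False if out of bounds
measurements = r.sense()      # [[landmark_index, dx, dy], ...] for landmarks in range
print(moved, measurements)
```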
