
Nagatsuki

Build status

An implementation of ORB_SLAM2 in Windows platform with webcam or Intel® RealSense™

Thanks

FAQ

What is this?

This is an implementation of a visual SLAM algorithm using the ORB_SLAM2 library on the Windows platform with your webcam or an Intel® RealSense™ camera. ORB_SLAM2 is a real-time SLAM library for monocular, stereo and RGB-D cameras that computes the camera trajectory and a sparse 3D reconstruction (with true scale in the stereo and RGB-D cases).

SLAM?

Simultaneous localization and mapping (SLAM) is the computational problem of constructing or updating a map of an unknown environment while simultaneously keeping track of an agent's location within it.

Augmented Reality?

Right, SLAM is the basis of augmented reality: the device needs to know its 3D position in the world. When an AR app starts, the system doesn't know much about the environment. It starts processing data from sensors such as the camera. To improve accuracy, the device combines this with data from other useful sensors such as the accelerometer and the gyroscope.

Visual SLAM

Visual SLAM is a SLAM system that processes data coming only from the camera, so it doesn't use odometry from accelerometers or gyroscopes. A recent, very well-performing algorithm is ORB-SLAM by Mur-Artal, Montiel and Tardós. Its successor, ORB_SLAM2, adds support for stereo and depth cameras in addition to the monocular system. What's especially great is that the algorithms are available as open source under the GPL-v3 license.

Prerequisite

Requirements

  • Intel® RealSense™ D415/D435
  • Webcam

Usage and Results

KITTI Dataset

  1. Download the dataset (grayscale images) from http://www.cvlibs.net/datasets/kitti/eval_odometry.php
  2. Execute the following command. Change KITTIX.yaml to KITTI00-02.yaml, KITTI03.yaml or KITTI04-12.yaml for sequences 0 to 2, 3, and 4 to 12 respectively. Change PATH_TO_DATASET_FOLDER to the uncompressed dataset folder. Change SEQUENCE_NUMBER to 00, 01, 02, ..., 11.
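The command itself is not shown here. For reference, the upstream ORB_SLAM2 README invokes the monocular KITTI example as below; the executable name and path in this Windows port may differ, so treat this as a sketch:

```shell
./Examples/Monocular/mono_kitti Vocabulary/ORBvoc.txt Examples/Monocular/KITTIX.yaml PATH_TO_DATASET_FOLDER/dataset/sequences/SEQUENCE_NUMBER
```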

TUM Dataset

  1. Download a sequence from http://vision.in.tum.de/data/datasets/rgbd-dataset/download and uncompress it.
  2. Associate RGB images and depth images using the Python script tum-associate.py. We already provide associations for some of the sequences in Examples/RGB-D/associations/. You can generate your own associations file by executing:
python tum-associate.py PATH_TO_SEQUENCE/rgb.txt PATH_TO_SEQUENCE/depth.txt > associations.txt
  3. Execute the following command. Change TUMX.yaml to TUM1.yaml, TUM2.yaml or TUM3.yaml for freiburg1, freiburg2 and freiburg3 sequences respectively. Change PATH_TO_SEQUENCE_FOLDER to the uncompressed sequence folder. Change ASSOCIATIONS_FILE to the path to the corresponding associations file.
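The association step pairs each RGB frame with the depth frame whose timestamp is closest, within a small tolerance. The sketch below shows the idea in plain Python; the function names and the 0.02 s tolerance are illustrative assumptions, not the actual API of tum-associate.py:

```python
def parse_list(text):
    """Parse TUM-style 'timestamp filename' lines, skipping '#' comments."""
    entries = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        stamp, name = line.split()[:2]
        entries[float(stamp)] = name
    return entries

def associate(rgb, depth, max_difference=0.02):
    """Greedily pair each RGB stamp with the nearest unused depth stamp."""
    matches = []
    used = set()
    for r in sorted(rgb):
        candidates = [d for d in depth if d not in used]
        best = min(candidates, key=lambda d: abs(d - r), default=None)
        if best is not None and abs(best - r) <= max_difference:
            matches.append((r, rgb[r], best, depth[best]))
            used.add(best)
    return matches
```

Writing each match as a `rgb_stamp rgb_file depth_stamp depth_file` line yields a file in the same shape as the provided associations.txt.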

Webcam

You must obtain your webcam's calibration and distortion parameters yourself; there is no common set of values that works for every camera.

  • RGB Mode: only RGB (monocular) mode is supported for a webcam.
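Once you have the intrinsics and distortion coefficients (for example from OpenCV's chessboard calibration), they go into an ORB_SLAM2-style settings file. The keys below follow the format of the upstream ORB_SLAM2 example .yaml files; all numeric values are placeholders that you must replace with your own calibration results:

```yaml
%YAML:1.0

# Camera intrinsics (placeholder values -- use your own calibration)
Camera.fx: 600.0
Camera.fy: 600.0
Camera.cx: 320.0
Camera.cy: 240.0

# Radial-tangential distortion coefficients
Camera.k1: 0.0
Camera.k2: 0.0
Camera.p1: 0.0
Camera.p2: 0.0
Camera.k3: 0.0

Camera.fps: 30.0
# Color order of the images (0: BGR, 1: RGB)
Camera.RGB: 1
```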

Intel® RealSense™ D415/D435

First obtain the RealSense intrinsic and extrinsic parameters using the Python script getRealsense.py, then fill the parameters into realsense.yaml, realsense-stereo.yaml and realsense-rgbd.yaml for RGB, Stereo and RGB-D modes respectively.

  • RGB Mode

result-rgb

The input source is the RGB camera of the RealSense. On a closed-loop path, some trajectory error may occur.

  • Stereo Mode

The input source is the two infrared cameras of the RealSense. Due to the brightness of the infrared images, feature extraction, matching and tracking are quite difficult.

  • RGBD Mode

result-rgbd

The input source is the RGB camera and depth frames of the RealSense. This mode gives the best results of the three RealSense modes.
