TUM RGBD odometry evaluator.

This software helps the user practice and test their own odometry/localization algorithm on the TUM RGBD dataset. The performance (speed) and accuracy of the user's algorithm are shown on the software's GUI. A more detailed description of this software is available here.

How to Run:

To launch the software, follow the steps below:

Where to insert the code

MyAlgorithm.py

    def execute(self):
        # Demo code (replace with your own algorithm)
        print("Running")

        # Get the next pair of sensor readings
        data = self.getReadings('color_img', 'depth_img')

        # Color (RGB) image
        color_image = data.color_img

        # Depth image
        depth_image = data.depth_img

        # Set the processed image to be shown on the GUI
        self.set_processed_image(color_image)

        # Set the pose predicted by the algorithm
        x = 1.0
        y = 1.0
        self.set_predicted_pose(x, y)
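
The demo above just reports a fixed pose. As a minimal sketch of where a real algorithm would go, the version below assumes a hypothetical user-defined estimate_motion() helper and user-added self.x / self.y / self.theta attributes (none of these are provided by the framework):

    import math  # at the top of MyAlgorithm.py

    def execute(self):
        data = self.getReadings('color_img', 'depth_img')

        # Hypothetical user-defined step: incremental motion (dx, dy, dtheta)
        # between the previous and the current RGB-D frame.
        dx, dy, dtheta = self.estimate_motion(data.color_img, data.depth_img)

        # Rotate the body-frame translation into the world frame using the
        # previous heading, then update the heading.
        self.x += dx * math.cos(self.theta) - dy * math.sin(self.theta)
        self.y += dx * math.sin(self.theta) + dy * math.cos(self.theta)
        self.theta += dtheta

        self.set_processed_image(data.color_img)
        self.set_predicted_pose(self.x, self.y)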
        

APIs:

Sensors available: 'color_img', 'depth_img', 'orientation', 'accelerometer', 'scan' (laser scan).

  • data = self.getReadings('color_img', 'depth_img') - to get the next available RGB image and depth image from the ROS bag file (only one RGB/depth image pair is provided per call).

  • color_img = data.color_img - to get the RGB image from the object 'data'.

  • depth_img = data.depth_img - to get the depth image from the object 'data'.

  • color_img_t = data.color_img_t - to get the timestamp of the RGB image.

  • depth_img_t = data.depth_img_t - to get the timestamp of the depth image.

  • self.set_processed_image(color_img) - to set the processed RGB image to be shown on the GUI.

  • self.set_predicted_pose(x, y) - to set the position predicted by the algorithm (x and y should be floating-point numbers).

  • self.set_predicted_path(path) - to set the whole predicted path at once, or to reset the previously set predicted poses; path should be an N×2 numpy array or Python list of [x, y] pairs (see the sketch after this list).

More information is available here.
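
A minimal sketch of using set_predicted_path, assuming the user accumulates their (x, y) estimates in a self.poses list (a hypothetical attribute, not provided by the framework):

    import numpy as np  # at the top of MyAlgorithm.py

    def publish_path(self):
        # Convert the accumulated (x, y) estimates to an N x 2 array and push
        # them to the GUI in one call; this replaces any previously set poses.
        path = np.array(self.poses, dtype=float)
        self.set_predicted_path(path)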

Data Types:

  • The color (RGB) image is provided in 640×480 8-bit RGB format.

  • The depth image is provided at 32-bit floating-point precision.

  • The timestamps are floating-point numbers.

  • The laser scan data is provided in numpy array format.

  • Orientation data can be accessed as follows:

    qz = data.orientation['qz']

    qw = data.orientation['qw']

    'qx' and 'qy' are essentially zero (since this is 2D odometry).

  • Accelerometer data can be accessed as follows:

    ax = data.accelerometer['x']

    ay = data.accelerometer['y']

    'ay' is essentially zero.

  • For more details, refer to the original TUM RGBD dataset page: https://vision.in.tum.de/data/datasets/rgbd-dataset/file_formats#ros_bag.
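
Since 'qx' and 'qy' are essentially zero, the orientation reduces to a single yaw angle. A minimal sketch (yaw_from_orientation is an illustrative helper, not part of the framework; it assumes 'orientation' is requested through getReadings like the other sensors):

    import math  # at the top of MyAlgorithm.py

    def yaw_from_orientation(orientation):
        # For a planar quaternion with qx = qy = 0, the rotation about the
        # vertical axis reduces to yaw = 2 * atan2(qz, qw), in radians.
        return 2.0 * math.atan2(orientation['qz'], orientation['qw'])

    # Usage inside execute():
    # data = self.getReadings('color_img', 'depth_img', 'orientation')
    # yaw = yaw_from_orientation(data.orientation)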

Demonstration video:

Demo Video
