EgoNN: Egocentric Neural Network for Point Cloud Based 6DoF Relocalization at the City Scale

Paper: EgoNN: Egocentric Neural Network for Point Cloud Based 6DoF Relocalization at the City Scale, IEEE Robotics and Automation Letters (RA-L), Volume 7, Issue 2, April 2022

Jacek Komorowski, Monika Wysoczanska, Tomasz Trzcinski

Warsaw University of Technology

What's new

  • [2021-10-24] Evaluation code and pretrained models released.
  • [2021-12-16] Training code released.

Our other projects

  • MinkLoc3D: Point Cloud Based Large-Scale Place Recognition (WACV 2021): MinkLoc3D
  • MinkLoc++: Lidar and Monocular Image Fusion for Place Recognition (IJCNN 2021): MinkLoc++
  • Large-Scale Topological Radar Localization Using Learned Descriptors (ICONIP 2021): RadarLoc
  • Improving Point Cloud Based Place Recognition with Ranking-based Loss and Large Batch Training (2022): MinkLoc3Dv2

Introduction

The paper presents a deep neural network-based method for extracting global and local descriptors from a point cloud acquired by a rotating 3D LiDAR sensor. The descriptors can be used for two-stage 6DoF relocalization. First, a coarse position is retrieved by finding candidates with the closest global descriptor in a database of geo-tagged point clouds. Then, the 6DoF pose between the query point cloud and a database point cloud is estimated by matching local descriptors and using a robust estimator such as RANSAC. Our method has a simple, fully convolutional architecture and uses a sparse voxelized representation of the input point cloud. It can efficiently extract a global descriptor and a set of keypoints with their local descriptors from large point clouds with tens of thousands of points.
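Below is a minimal sketch of this two-stage pipeline in Python, assuming descriptors have already been extracted. Names such as query_global, db_globals, query_kps and query_descs are hypothetical placeholders (not part of this repository's API), and Open3D's generic feature-based RANSAC registration stands in for the exact procedure used in the paper:

import numpy as np
import open3d as o3d

def retrieve_candidate(query_global, db_globals):
    # Stage 1: coarse place retrieval - nearest neighbour in global descriptor space
    distances = np.linalg.norm(db_globals - query_global, axis=1)
    return int(np.argmin(distances))

def estimate_pose(query_kps, query_descs, db_kps, db_descs, dist_thresh=0.5):
    # Stage 2: 6DoF pose from local descriptor matches, estimated robustly with RANSAC
    def to_cloud(xyz):
        pc = o3d.geometry.PointCloud()
        pc.points = o3d.utility.Vector3dVector(xyz)
        return pc

    def to_feature(desc):
        f = o3d.pipelines.registration.Feature()
        f.data = desc.T.astype(np.float64)  # Open3D expects shape (dim, n)
        return f

    result = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
        to_cloud(query_kps), to_cloud(db_kps),
        to_feature(query_descs), to_feature(db_descs),
        True,         # mutual_filter: keep only mutually nearest descriptor matches
        dist_thresh,  # max correspondence distance for inliers (metres)
        o3d.pipelines.registration.TransformationEstimationPointToPoint(False),
        3, [],
        o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))
    return result.transformation  # 4x4 pose of the query w.r.t. the database cloud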

Citation

If you find this work useful, please consider citing:

@ARTICLE{9645340,
author={Komorowski, Jacek and Wysoczanska, Monika and Trzcinski, Tomasz},
journal={IEEE Robotics and Automation Letters}, 
title={EgoNN: Egocentric Neural Network for Point Cloud Based 6DoF Relocalization at the City Scale}, 
year={2022},
volume={7},
number={2},
pages={722-729},
doi={10.1109/LRA.2021.3133593}}

Environment and Dependencies

Code was tested using Python 3.8 with PyTorch 1.10.1 and MinkowskiEngine 0.5.4 on Ubuntu 20.04 with CUDA 10.2. Note: CUDA 11.1 is not recommended, as MinkowskiEngine 0.5.4 has known issues with it.

The following Python packages are required:

  • PyTorch (version 1.10.1 or above)
  • MinkowskiEngine (version 0.5.4 or above)
  • pytorch_metric_learning (version 1.0.0 or above)
  • Open3D (version 0.14 or above)
  • python-lzf (version 0.2.4 or above)
  • wandb
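
After installing the dependencies, a quick sanity check (generic Python, not part of the repository) confirms that the versions line up:

import torch
import MinkowskiEngine as ME

print("PyTorch:", torch.__version__)          # expected 1.10.1 or above
print("CUDA (torch):", torch.version.cuda)    # tested with 10.2
print("CUDA available:", torch.cuda.is_available())
print("MinkowskiEngine:", ME.__version__)     # expected 0.5.4 or above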

Modify the PYTHONPATH environment variable to include the absolute path to the project root folder:

export PYTHONPATH=$PYTHONPATH:/home/.../Egonn
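
Alternatively, the same effect can be achieved from within Python (a generic alternative, not used by the repository itself):

import sys
sys.path.append("/home/.../Egonn")  # absolute path to the project root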

Datasets

EgoNN is trained and evaluated using the following datasets:

  • MulRan dataset: the Sejong traversal is used, split into a training and an evaluation part link
  • Apollo-SouthBay dataset: the SunnyvaleBigLoop trajectory is used for evaluation; the other five trajectories (BaylandsToSeafood, ColumbiaPark, Highway237, MathildaAVE, SanJoseDowntown) are used for training link
  • KITTI dataset: Sequence 00 is used for evaluation link

First, you need to download the datasets:

  • For the MulRan dataset, download the ground truth data (*.csv) and LiDAR point clouds (Ouster.zip) for the Sejong01 and Sejong02 traversals (link).
  • Download the Apollo-SouthBay dataset using the download link on the dataset website (link).
  • Download the KITTI odometry dataset (calibration files, ground truth poses, Velodyne laser data) (link).

After downloading the datasets, you need to generate training pickles for network training and evaluation pickles for model evaluation.

Training pickles generation

Generating training tuples is very time consuming, as ICP is used to refine the ground truth poses between each pair of neighbouring point clouds (a sketch of this refinement follows the commands below).

cd datasets/mulran
python generate_training_tuples.py --dataset_root <mulran_dataset_root_path>

cd ../southbay
python generate_training_tuples.py --dataset_root <apollo_southbay_dataset_root_path>
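
The refinement step is conceptually similar to the following Open3D sketch (illustrative only; cloud_a, cloud_b and initial_pose are hypothetical inputs, and the actual ICP parameters used by the scripts may differ):

import numpy as np
import open3d as o3d

def refine_pose(cloud_a, cloud_b, initial_pose=np.eye(4), max_corr_dist=1.0):
    # Refine a rough relative pose between two point clouds with point-to-point ICP
    src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(cloud_a))
    tgt = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(cloud_b))
    result = o3d.pipelines.registration.registration_icp(
        src, tgt, max_corr_dist, initial_pose,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation  # refined 4x4 relative pose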

Evaluation pickles generation

cd datasets/mulran
python generate_evaluation_sets.py --dataset_root <mulran_dataset_root_path>

cd ../southbay
python generate_evaluation_sets.py --dataset_root <apollo_southbay_dataset_root_path>

cd ../kitti
python generate_evaluation_sets.py --dataset_root <kitti_dataset_root_path>
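
A generated pickle can be sanity-checked by loading it back (generic Python; the internal structure of the evaluation set is repo-specific and not documented here):

import pickle

with open("test_Sejong01_Sejong02.pickle", "rb") as f:
    eval_set = pickle.load(f)
print(type(eval_set))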

Training

First, download the datasets and generate training and evaluation pickles as described above. Edit the configuration file config_egonn.txt: set the dataset_folder parameter to point to the dataset root folder, and adjust the batch_size_limit and secondary_batch_size_limit parameters to the available GPU memory. The default limits require at least 11GB of GPU RAM.
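
For illustration, the relevant entries could be read back with configparser as below (a hypothetical sketch: the section name and exact file layout are assumptions, only the parameter names come from the text above):

import configparser

cfg = configparser.ConfigParser()
cfg.read("../config/config_egonn.txt")
params = cfg["DEFAULT"]
print(params.get("dataset_folder"))                  # dataset root folder
print(params.getint("batch_size_limit"))             # reduce if GPU RAM < 11GB
print(params.getint("secondary_batch_size_limit"))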

To train the EgoNN model, run:

cd training

python train.py --config ../config/config_egonn.txt --model_config ../models/egonn.txt 

Pre-trained Model

An EgoNN model trained on the training splits of the MulRan and Apollo-SouthBay datasets is available at weights/model_egonn_20210916_1104.pth.

Evaluation

To evaluate a pretrained model, run the commands below. Ground truth poses between different traversals in all three datasets are slightly misaligned. To reproduce the results from the paper, use the --icp_refine option to refine the ground truth poses using ICP.

cd eval

# To evaluate on the test split of the MulRan dataset
python evaluate.py --dataset_root <dataset_root_path> --dataset_type mulran --eval_set test_Sejong01_Sejong02.pickle --model_config ../models/egonn.txt --weights ../weights/model_egonn_20210916_1104.pth --icp_refine

# To evaluate on the test split of the Apollo-SouthBay dataset
python evaluate.py --dataset_root <dataset_root_path> --dataset_type southbay --eval_set test_SunnyvaleBigloop_1.0_5.pickle --model_config ../models/egonn.txt --weights ../weights/model_egonn_20210916_1104.pth --icp_refine

# To evaluate on the test split of the KITTI dataset
python evaluate.py --dataset_root <dataset_root_path> --dataset_type kitti --eval_set kitti_00_eval.pickle --model_config ../models/egonn.txt --weights ../weights/model_egonn_20210916_1104.pth --icp_refine

Results

EgoNN performance...

Visualizations

Visualizations of our keypoint detector results. On the left, we show 128 keypoints with the lowest saliency uncertainty (red dots). On the right, 128 keypoints with the highest uncertainty (yellow dots).

Successful registration of point cloud pairs from the KITTI dataset, gathered while revisiting the same place from different directions. On the left, we show keypoint correspondences (RANSAC inliers) found during RANSAC-based 6DoF pose estimation. On the right, we show the point clouds aligned using the estimated poses.

License

Our code is released under the MIT License (see LICENSE file for details).