Dynablox

An online volumetric mapping-based approach for real-time detection of diverse dynamic objects in complex environments.

Table of Contents

Credits

Setup

Examples

Paper

If you find this package useful for your research, please consider citing our paper:

  • Lukas Schmid, Olov Andersson, Aurelio Sulser, Patrick Pfreundschuh, and Roland Siegwart. "Dynablox: Real-time Detection of Diverse Dynamic Objects in Complex Environments" in IEEE Robotics and Automation Letters (RA-L), Vol. 8, No. 10, pp. 6259 - 6266, October 2023. [ IEEE | ArXiv | Video ]
    @article{schmid2023dynablox,
      title={Dynablox: Real-time Detection of Diverse Dynamic Objects in Complex Environments},
      author={Schmid, Lukas and Andersson, Olov and Sulser, Aurelio and Pfreundschuh, Patrick and Siegwart, Roland},
      journal={IEEE Robotics and Automation Letters (RA-L)},
      year={2023},
      volume={8},
      number={10},
      pages={6259-6266},
      doi={10.1109/LRA.2023.3305239}
    }

Video

A brief overview of the problem, approach, and results is available on YouTube: Dynablox YouTube Video

News

We were excited to learn that Dynablox has been integrated into NVIDIA's nvblox, where the algorithm's parallelism can make fantastic use of the GPU to detect moving objects quickly and at high resolutions!

Setup

A Docker image is available for this package. Check the usage instructions on its Docker Hub page.
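
A minimal sketch of pulling and running the image (the image name below is only a placeholder; use the exact name and tag listed on the Docker Hub page):

docker pull ethzasl/dynablox:latest   # placeholder image name -- substitute the one from Docker Hub
docker run -it --net=host ethzasl/dynablox:latest   # host networking so ROS topics are reachable from the host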

Installation

  • Note on Versioning: This package was developed on Ubuntu 20.04 with ROS Noetic. Other versions may also work, but support cannot be guaranteed.
  1. If you have not already done so, install ROS. We recommend using Desktop-Full.

  2. If you have not already done so, set up a catkin workspace:

    mkdir -p ~/catkin_ws/src
    cd ~/catkin_ws
    catkin init
    catkin config --extend /opt/ros/$ROS_DISTRO
    catkin config --cmake-args -DCMAKE_BUILD_TYPE=RelWithDebInfo
    catkin config --merge-devel
  3. Install system dependencies:

    sudo apt-get install python3-vcstool python3-catkin-tools ros-$ROS_DISTRO-cmake-modules protobuf-compiler autoconf git rsync -y   
  4. Clone the repo using SSH Keys:

    cd ~/catkin_ws/src
    git clone git@github.com:ethz-asl/dynablox.git
  5. Install ROS dependencies:

    cd ~/catkin_ws/src
    vcs import . < ./dynablox/ssh.rosinstall --recursive 
  6. Build:

    catkin build dynablox_ros
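
  7. Source the workspace so that ROS can find the built packages (standard catkin usage; you may want to add this line to your ~/.bashrc):

    source ~/catkin_ws/devel/setup.bash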

Datasets

To run the demos, we use the Urban Dynamic Objects LiDAR (DOALS) Dataset. To download the data and pre-process it for our demos, use the provided script:

roscd dynablox_ros/scripts
./download_doals_data.sh /home/$USER/data/DOALS # Or your preferred data destination.

We further collected a new dataset featuring diverse dynamic objects in complex scenes. The full dataset and its description can be found here. To download the processed ready-to-run data for our demos, use the provided script:

roscd dynablox_ros/scripts
./download_dynablox_data.sh /home/$USER/data/Dynablox # Or your preferred data destination.

Examples

Running a DOALS Sequence

  1. If you have not already done so, download the DOALS dataset as explained here.

  2. Adjust the dataset path in dynablox_ros/launch/run_experiment.launch:

    <arg name="bag_file" default="/home/$(env USER)/data/DOALS/hauptgebaeude/sequence_1/bag.bag" />  
  3. Run:

    roslaunch dynablox_ros run_experiment.launch 
  4. You should now see dynamic objects being detected as the sensor moves through the scene:

Run DOALS Example

Running a Dynablox Sequence

  1. If you have not already done so, download the Dynablox dataset as explained here.

  2. Adjust the dataset path in dynablox_ros/launch/run_experiment.launch and set use_doals to false:

    <arg name="use_doals" default="false" /> 
    <arg name="bag_file" default="/home/$(env USER)/data/Dynablox/processed/ramp_1.bag" />  
  3. Run:

    roslaunch dynablox_ros run_experiment.launch 
  4. You should now see dynamic objects being detected as the sensor moves through the scene: Run Dynablox Example

Running and Evaluating an Experiment

Running an Experiment

  1. If you have not already done so, download the DOALS dataset as explained here.

  2. Adjust the dataset path in dynablox_ros/launch/run_experiment.launch:

    <arg name="bag_file" default="/home/$(env USER)/data/DOALS/hauptgebaeude/sequence_1/bag.bag" />  
  3. In dynablox_ros/launch/run_experiment.launch, set the evaluate flag, adjust the ground truth data path, and specify where to store the generated output data:

    <arg name="evaluate" default="true" />
    <arg name="eval_output_path" default="/home/$(env USER)/dynablox_output/" />
    <arg name="ground_truth_file" default="/home/$(env USER)/data/DOALS/hauptgebaeude/sequence_1/indices.csv" />
  4. Run:

    roslaunch dynablox_ros run_experiment.launch 
  5. Wait until the dataset has finished processing. Dynablox should shut down automatically afterwards.
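
Once it has shut down, you can check that the output was written (a quick sketch, assuming the eval_output_path set above):

ls /home/$USER/dynablox_output
# Expect e.g. clouds.csv as well as the timings.txt and config.txt described below.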

Analyzing the Data

  • Printing the Detection Performance Metrics:

    1. Run:
    roscd dynablox_ros/src/evaluation
    python3 evaluate_data.py /home/$USER/dynablox_output
    2. You should now see the performance statistics for all experiments in that folder:
    1/1 data entries are complete.
    Data                     object_IoU               object_Precision              object_Recall
    hauptgebaeude_1          89.8 +- 5.6              99.3 +- 0.4                   90.3 +- 5.6
    All                      89.8 +- 5.6              99.3 +- 0.4                   90.3 +- 5.6
    
  • Inspecting the Segmentation:

    1. Run:
    roslaunch dynablox_ros cloud_visualizer.launch file_path:=/home/$USER/dynablox_output/clouds.csv
    2. You should now see the segmentation for the annotated ground truth clouds, showing True Positives (green), True Negatives (black), False Positives (blue), False Negatives (red), and out-of-range (gray) points: Evaluation
  • Inspecting the Run-time and Configuration: Additional information is automatically stored in timings.txt and config.txt for each experiment, as shown below.
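
    For example, to print them (assuming the output path used above):

    cat /home/$USER/dynablox_output/timings.txt   # run-time information for the experiment
    cat /home/$USER/dynablox_output/config.txt    # configuration used for the experiment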

Advanced Options

  • Adding Drift to an Experiment: To run an experiment with drift, specify one of the pre-computed drift rollouts in dynablox_ros/launch/run_experiment.launch:

    <arg name="drift_simulation_rollout" default="doals/hauptgebaeude/sequence_1/light_3.csv" />

    All pre-computed rollouts can be found in drift_simulation/config/rollouts (see the listing sketch below). Note that the specified rollout needs to match the sequence being played. For each sequence, there exist three rollouts for each intensity.

    Alternatively, use the drift_simulation/launch/generate_drift_rollout.launch to create new rollouts for other datasets.
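
    For instance, to list the rollouts available for a sequence (paths as in this repository):

    roscd drift_simulation
    ls config/rollouts/doals/hauptgebaeude/sequence_1   # three rollouts per intensity, e.g. light_1.csv to light_3.csv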

  • Changing the Configuration of Dynablox: All parameters of Dynablox are listed in dynablox_ros/config/motion_detector/default.yaml; feel free to tune the method for your use case (see the sketch below)!
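
    A quick way to review them (any editor works):

    roscd dynablox_ros/config/motion_detector
    cat default.yaml   # all parameters with their default values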