
Different branch meanings and operating conditions #8

Open
Cristian-wp opened this issue Oct 7, 2022 · 28 comments

Comments

@Cristian-wp

Hello, I am trying to understand how to make this package work.
I am trying to use it in a really bad environment: a 300 m tunnel with flat concrete walls, no windows, no columns, no signs, really nothing.
Right now I am investigating the darpa branch, because I think it is the best fit.
Can you please suggest which settings could be useful in my case?
Thanks in advance.

@kubelvla
Contributor

kubelvla commented Oct 7, 2022

@vdsmax Are you able to find the yaml configuration files that we used for the finals? @Cristian-wp The specifics of this branch are mainly about the integration with our SubT competition system. Maxime might give you the necessary yaml files that configure the ICP to the state that worked best for us (we were using 16-beam lidars in the Urban circuit, and also some 128-beam lidars for the finals, but we subsampled them anyway). If you have a straight tunnel without any geometrical features, you may use the trick with two robots: one moving while the other stays static, and vice versa; this may provide the necessary features. Otherwise, the mapping will probably converge to whatever your prior says, along the tunnel direction at least.

@Cristian-wp
Author

@kubelvla Thank you for your answer, but...
What do you mean by "Otherwise, the mapping will probably converge to what your prior says, along the tunnel direction at least."?
As for the lidar sensor, at the moment we can use a Velodyne VLP-16 or an Ouster OS0-64 (not at the same time).
@vdsmax if you can provide the yamls, that would be great! Thank you a lot.

Do you think that for my purpose I should work with this branch, or with another one?

@kubelvla
Contributor

kubelvla commented Oct 7, 2022

Regarding the branch, I think you can use the melodic one; there are actually some fixes that are missing in the darpa branch. Or you go wild like Norlab and switch to ROS2 :) I haven't switched yet and I'm looking with awe at the new code...
Regarding the prior: the mapper fetches the prior transform from the /odom -> /base_link transform, and then it runs the ICP to precisely find the /map -> /odom transform that completes the /map -> /base_link chain. In the environment you describe, there is a good chance that ICP happily converges to a solution along the length of the tunnel, with any shift you tell it to start with (there is this one DOF which is unconstrained by the environment). So if your prior transformation says to start looking for a match one meter ahead of the previous position, then the mapper probably also converges to a solution there. If your prior is zero motion, the mapping will probably get stuck at some point, until some geometry appears in the view.
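The frame chain described above can be sketched with plain homogeneous transforms (a minimal numpy illustration, not mapper code): odometry supplies the prior T_odom_base, ICP refines the pose in the map, and the mapper publishes the map->odom correction that completes the chain.

```python
import numpy as np

def make_transform(yaw=0.0, x=0.0, y=0.0, z=0.0):
    """Build a 4x4 homogeneous transform (yaw-only rotation for brevity)."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    T[:3, 3] = [x, y, z]
    return T

# Odometry prior: the robot believes it moved 1 m forward along the tunnel.
T_odom_base = make_transform(x=1.0)

# ICP refines the pose in the map; suppose it finds the robot only 0.9 m ahead.
T_map_base = make_transform(x=0.9)

# The mapper publishes the /map -> /odom correction that completes the chain:
# T_map_base = T_map_odom @ T_odom_base  =>  T_map_odom = T_map_base @ inv(T_odom_base)
T_map_odom = T_map_base @ np.linalg.inv(T_odom_base)

print(np.allclose(T_map_odom @ T_odom_base, T_map_base))  # True
```

In the unconstrained-tunnel case, ICP can settle on almost any T_map_base along the tunnel axis, which is why the answer it returns tends to follow whatever the prior suggested.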

@kubelvla
Contributor

kubelvla commented Oct 7, 2022

I've just been running the mapper on a dataset from an underground mine, and it works ok-ish, but I needed to increase the map density and update rate, because there are not so many features on the walls (just irregularities). See the screenshot:
image

@Cristian-wp
Author

Hi, thanks a lot for your reply.
I have tried to follow your suggestions (with the darpa branch) but I cannot manage to get a "good" map like yours (screenshot below).
These are my current parameters in the norlab_icp_mapper_ros package (I put them in a launch file). Can you suggest which ones to tune in order to get something similar to yours?

<!-- icp ros params -->
    <param name="odom_frame" value="odom" />
    <param name="robot_frame" value="base_link" />
    <param name="sensor_frame" value="os_sensor_sync" />
    <param name="initial_map_file_name" value="" />
    <param name="initial_robot_pose" value="[[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]" /> <!-- identity matrix -->
    <param name="final_map_file_name" value="map.vtk" />
    <param name="final_trajectory_file_name" value="trajectory.vtk" />
    <param name="icp_config" value="" />
    <param name="input_filters_config" value="" />
    <param name="map_post_filters_config" value="" />
    <param name="map_update_condition" value="overlap" /> <!-- "overlap", "delay", "distance" -->
    <param name="map_update_overlap" value= "0.9" />
    <param name="map_update_delay" value="1" />
    <param name="map_update_distance" value="1" />
    <param name="map_publish_rate" value="1" />
    <param name="map_tf_publish_rate" value="100" /> <!-- rate of the IMU -->
    <param name="max_idle_time" value="10" />
    <param name="min_dist_new_point" value="0.03" />
    <param name="sensor_max_range" value="20" />
    <param name="prior_dynamic" value="0.6" />
    <param name="threshold_dynamic" value="0.9" />
    <param name="beam_half_angle" value="0.01" />
    <param name="epsilon_a" value="0.01" />
    <param name="epsilon_d" value="0.01" />
    <param name="alpha" value="0.8" />
    <param name="beta" value="0.99" />
    <param name="consecutive_convergence_errors_before_failure" value="5" />
    <param name="is_3D" value="true" />
    <param name="is_online" value="true" />
    <param name="compute_prob_dynamic" value="false" />
    <param name="is_mapping" value="true" />
    <param name="save_map_cells_on_hard_drive" value="false" />
    <param name="publish_tfs_between_registrations" value="true" />
    <param name="backup_in_ram" value="false" />

rviz_screenshot_2022_10_07-14_24_36

@kubelvla
Contributor

kubelvla commented Oct 7, 2022

My launch file for the mine is quite simple:

<node pkg="norlab_icp_mapper_ros" type="mapper_node" name="mapper_node" output="screen" >
		<remap from="points_in" to="/ouster/points_deskewed"/>
		<param name="robot_frame" value="base_link"/>
		<param name="odom_frame" value="odom"/>
		<param name="icp_config" value="$(find alina_launchers)/params/realtime_icp_config.yaml"/>
		<param name="input_filters_config" value="$(find alina_launchers)/params/realtime_input_filters.yaml"/>
		<param name="map_post_filters_config" value="$(find alina_launchers)/params/realtime_post_filters.yaml"/>
		<param name="map_update_condition" value="delay"/>
		<param name="map_update_delay" value="0.1"/>
		<param name="sensor_max_range" value="40"/>
		<param name="min_dist_new_point" value="0.1"/>
		<param name="compute_prob_dynamic" value="true"/>
		<param name="map_tf_publish_rate" value="100"/>
</node>

@kubelvla
Contributor

kubelvla commented Oct 7, 2022

But the trick is in the ICP config, the default ones are too crude.
The icp config:

matcher:
  KDTreeMatcher:
    knn: 6
    maxDist: 2.0
    epsilon: 1


outlierFilters:
 - TrimmedDistOutlierFilter:
    ratio: 0.7


errorMinimizer:
  PointToPlaneErrorMinimizer:

transformationCheckers:
  - DifferentialTransformationChecker:
      minDiffRotErr: 0.001
      minDiffTransErr: 0.01
      smoothLength: 2
  - CounterTransformationChecker:
      maxIterationCount: 40
  - BoundTransformationChecker:
      maxRotationNorm: 1000000.1
      maxTranslationNorm: 15000.00

inspector:
  NullInspector

logger:
  NullLogger
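The TrimmedDistOutlierFilter with ratio: 0.7 keeps only the closest 70% of point matches at each ICP iteration and discards the rest as outliers. A toy numpy sketch of that idea (function name and example data are illustrative, not libpointmatcher's API):

```python
import numpy as np

def trimmed_outlier_filter(distances, ratio=0.7):
    """Return a boolean mask keeping the `ratio` fraction of closest matches."""
    n_keep = int(len(distances) * ratio)
    threshold = np.sort(distances)[n_keep - 1]
    return distances <= threshold

# Example: 10 match distances, two of them gross outliers.
d = np.array([0.02, 0.03, 0.01, 0.05, 0.04, 0.02, 0.03, 0.06, 1.5, 2.0])
mask = trimmed_outlier_filter(d, ratio=0.7)
print(mask.sum())  # 7 matches survive; the two outliers are dropped
```

Trimming like this makes the point-to-plane minimization robust to wrong nearest-neighbour matches, which matters in a feature-poor tunnel where many matches are ambiguous.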

@kubelvla
Contributor

kubelvla commented Oct 7, 2022

The input filters:

- BoundingBoxDataPointsFilter:
    xMin: -1.5
    xMax: 0.5
    yMin: -1
    yMax: 1
    zMin: -1
    zMax: 0.5
    removeInside: 1

- RandomSamplingDataPointsFilter:
    prob: 0.5

- ObservationDirectionDataPointsFilter:

Note here that we use the bounding box filter to remove the robotic platform itself.
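The effect of the BoundingBoxDataPointsFilter with removeInside: 1 can be pictured as dropping every point that falls inside the box around the sensor, so the robot's own body never enters the map. A rough sketch (illustrative numpy, not libpointmatcher code), using the same box limits as the config above:

```python
import numpy as np

def remove_inside_box(points, x=(-1.5, 0.5), y=(-1.0, 1.0), z=(-1.0, 0.5)):
    """Drop points inside the axis-aligned box (robot self-filtering)."""
    inside = ((points[:, 0] >= x[0]) & (points[:, 0] <= x[1]) &
              (points[:, 1] >= y[0]) & (points[:, 1] <= y[1]) &
              (points[:, 2] >= z[0]) & (points[:, 2] <= z[1]))
    return points[~inside]

pts = np.array([[0.0, 0.0, 0.0],     # on the robot -> removed
                [5.0, 0.0, 0.0],     # tunnel wall  -> kept
                [-1.0, 0.5, -0.5]])  # on the robot -> removed
print(len(remove_inside_box(pts)))  # 1
```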

@kubelvla
Contributor

kubelvla commented Oct 7, 2022

The post-processing filters:

- CutAtDescriptorThresholdDataPointsFilter:
    descName: probabilityDynamic
    useLargerThan: 1
    threshold: 0.8

- SurfaceNormalDataPointsFilter:
    knn: 15

The first one removes points considered dynamic, and the second filter recomputes the surface normals based on the final map pointcloud (otherwise, normals computed from the input pointcloud are kept and used for future matching).
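The first step can be pictured as simple thresholding on the per-point probabilityDynamic descriptor: with useLargerThan: 1 and threshold: 0.8, points whose dynamic probability exceeds 0.8 are cut from the map. A toy sketch (not the actual libpointmatcher implementation):

```python
import numpy as np

def cut_dynamic(points, prob_dynamic, threshold=0.8):
    """Keep only points whose probabilityDynamic descriptor is <= threshold."""
    return points[prob_dynamic <= threshold]

pts = np.array([[1.0, 0, 0], [2.0, 0, 0], [3.0, 0, 0]])
prob = np.array([0.1, 0.95, 0.5])  # middle point looks dynamic (e.g. a person)
static_map = cut_dynamic(pts, prob)
print(len(static_map))  # 2
```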

@Cristian-wp
Author

@kubelvla O_O
It is really simple! If they are not confidential, can you please send me your config files? (icp_config, input_filters_config, map_post_filters_config)
Ok, I have seen your post now xD Thanks a lot man!
Do you have any study material about this mapper? I am trying to better understand the logic behind it.

Another question: are the yaml files you sent me already implemented inside some part of the code, or do I have to provide the yaml? I have read on libpointmatcher (ethz) that it is possible to use it without yaml files, so I was wondering if all the parameters in norlab_icp_mapper_ros are related to them, or have another function.

@kubelvla
Contributor

kubelvla commented Oct 7, 2022

You can generally play with the map density (min_dist_new_point: 0.1 means map points 10 cm apart), and with the amount of points left in the input pointcloud:

 - RandomSamplingDataPointsFilter:
    prob: 0.5
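Each point survives the RandomSamplingDataPointsFilter independently with probability prob, so prob: 0.5 roughly halves the cloud. A toy numpy sketch of that behaviour (illustrative, not the libpointmatcher implementation):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def random_sampling(points, prob=0.5):
    """Keep each point independently with probability `prob`."""
    keep = rng.random(len(points)) < prob
    return points[keep]

pts = rng.normal(size=(10000, 3))
kept = random_sampling(pts, prob=0.5)
print(abs(len(kept) / len(pts) - 0.5) < 0.05)  # roughly half survive
```

Random subsampling cuts the ICP cost roughly linearly in the number of points, which is why heavy downsampling of dense lidars (see below) still leaves enough constraints for SLAM.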

@kubelvla
Contributor

kubelvla commented Oct 7, 2022

> @kubelvla O_O It is really simple! If they are not confidential, can you please send me your config files? (icp_config, input_filters_config, map_post_filters_config) Ok, I have seen your post now xD Thanks a lot man! Do you have any study material about this mapper? I am trying to better understand the logic behind it.
>
> Another question: are the yaml files you sent me already implemented inside some part of the code, or do I have to provide the yaml? I have read on libpointmatcher (ethz) that it is possible to use it without yaml files, so I was wondering if all the parameters in norlab_icp_mapper_ros are related to them, or have another function.

Save my yamls as .yaml files and provide the paths through the parameters.

@kubelvla
Contributor

kubelvla commented Oct 7, 2022

Outdoors, a map density of around 15 cm seems sufficient; indoors we go towards higher densities. We tend to downsample the input pointclouds quite a lot: the Ouster 128-line lidar was actually downsampled to 10 percent or less, and still good for SLAM.

@Cristian-wp
Author

Ok, now I will try and let you know the results :)
Thanks a lot.

@kubelvla
Contributor

kubelvla commented Oct 7, 2022

I am not sure about a publication that would really describe what filters and minimizers to use, since it is really problem-dependent. You can get some insight into our (and others') underground experience from this: https://arxiv.org/pdf/2208.01787.pdf

@kubelvla
Contributor

kubelvla commented Oct 7, 2022

Wait for the Canadians to wake up and see if Maxime can find the very yamls from the competition (but they won't be much different, maybe some extra input filtering)

@pomerlef

pomerlef commented Oct 7, 2022

@Cristian-wp , you can have a look at "A Review of Point Cloud Registration Algorithms for Mobile Robotics" for a description of the pipeline and most of the filters:
https://hal.archives-ouvertes.fr/hal-01178661/document

@Cristian-wp
Author

@pomerlef Thanks :)

@Cristian-wp
Author

Hi @kubelvla, today I was able to create a map :) It is not perfect, because my odometry source gives ~250 m as the last position while the icp_odom gives ~230 m, but it is still a good result for the moment. I will let you know about further improvements. For the moment, thanks a lot for your help!

rviz_screenshot_2022_10_10-11_54_01

Can you leave this topic open? So in case of other problems and updates I can write everything here.

@kubelvla
Contributor

Nice @Cristian-wp! You can also consider a special mode of the ICP minimization, which however requires very accurate attitude estimation in your odometry prior (in roll and pitch, better than 0.1 degree). You also have to make really sure that the transformation between the frame that is linked to the odometry estimation (base_link?) and the lidar frame is spot-on. If you have these two things, you can suppress pitch and roll drift in the mapping (we were using this in SubT). Documentation here: https://arxiv.org/abs/2203.13799 and the option is selected by adding a parameter to the minimizer settings in the icp config yaml:

errorMinimizer:
  PointToPlaneErrorMinimizer:
    force4DOF: 1
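The force4DOF option restricts the minimizer to x, y, z and yaw, trusting the accurate roll and pitch from the prior. The constraint can be pictured as projecting a full rotation onto its yaw component; a hedged numpy sketch of that idea (not the actual PointToPlaneErrorMinimizer code):

```python
import numpy as np

def project_to_yaw(R):
    """Extract the yaw of rotation matrix R and rebuild a yaw-only rotation,
    discarding any roll/pitch the optimizer might otherwise introduce."""
    yaw = np.arctan2(R[1, 0], R[0, 0])
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# A rotation with 30 degrees of yaw plus a small pitch contamination:
yaw = np.deg2rad(30)
Rz = np.array([[np.cos(yaw), -np.sin(yaw), 0],
               [np.sin(yaw),  np.cos(yaw), 0],
               [0, 0, 1]])
pitch = np.deg2rad(2)
Ry = np.array([[np.cos(pitch), 0, np.sin(pitch)],
               [0, 1, 0],
               [-np.sin(pitch), 0, np.cos(pitch)]])
R4dof = project_to_yaw(Rz @ Ry)
print(np.allclose(R4dof, Rz))  # True: the pitch contamination is gone
```

This is why the prior's roll/pitch must be excellent: whatever attitude error the prior carries in those two axes can no longer be corrected by the ICP.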

@Cristian-wp
Author

Thanks @kubelvla! Now I am working to make my odometry source more stable. I am using the UKF from the ROS robot_localization package to fuse my data in order to have a reliable visual-inertial-lidar SLAM; after that I will try it for sure. For the attitude estimation I directly fuse a rugged, industrial-grade IMU, so the roll-pitch estimation should be good.

During the DARPA competition, did you also fuse the ICP odometry from the mapper into your estimator?

I know it is off-topic, but can you maybe suggest a good way to improve my localization system?

@kubelvla
Contributor

No, the ICP odom output was the final output, with no feedback to the odometry system. One has to be very careful not to cause oscillations between the prior and the ICP output. I don't know the correct answer for how to do this safely, in a way that would converge to a better solution (even in the pose-graph approach, Francois remembers situations where the pose graph did some optimization but the ICP still wanted to go back to its solution). In SubT, we took the IMU and wheel odometry as a prior, and ICP to get the fine localization and mapping. We actually didn't use loop closures; the system was good enough like that.

@Cristian-wp
Author

I cannot use wheel odometry because I am working on a UAS :')

@kubelvla
Contributor

:D Ok, no wheel odom. I hope you don't have a brand new tunnel with clean walls of uniform grey color ;) The IMU should help the VIO quite a lot in finding good matches, make sure your solution does that.

@Cristian-wp
Author

Not new, but they do have a uniform grey color :D Yes, I am using a VIO + LIO system + external sensors, but it is my first time with this type of environment. Now I am trying to figure out how to get the covariance matrix in order to improve the fusion.

@vdsmax
Member

vdsmax commented Oct 12, 2022

Hi @Cristian-wp! Here is the set of parameters we used for the DARPA challenge:
ICP_params_darpa.zip
There are two sets of parameters: one for the Norlab robots, which were using a 16-beam lidar, and another one for the CTU robots, which were using a 128-beam lidar.
During the competition we were using the 4-DOF minimization, as Vlad explained. I fine-tuned the parameters myself for the competition according to the lidar we had and the environments we were deploying in.

@Cristian-wp
Author

@vdsmax Thank you a lot! Now I am working on another part of the project. Next week I will try your parameters for my cases and let you know the results :)

@Cristian-wp
Author

Hi @kubelvla, my results with the mapper are getting better, especially after the suggestions of @vdsmax.
But now I have a performance problem... in my setup, as previously said, I fuse different odometry sources from SLAMs, sensors, etc. in order to get a good position and orientation estimate.
As you explained, the mapper needs the transformation between odom and base_link in order to work (which is generated by the SLAM). I have tried running the mapper with my whole system, but it is really slow and sometimes it affects all the other parts (CPU full).
So I have tried to record a rosbag with the pointcloud, the fused odometry message and all the tf, but the map does not get created. In rviz I have set map as the fixed frame, and during the rosbag playback I see the frames move correctly; even the output of "rosrun tf view_frames" shows that the chain map->odom->base_link is present.
Do you have any suggestion?
