
Test on own data,and some problem on the turning movement! #62

Open
mightyhao opened this issue Jul 8, 2021 · 12 comments

Comments

@mightyhao

Hi @mbrossar, I'm interested in your great work! But I can't get good results on my own test data.
I use the IMU of an Intel RealSense T265 mounted on a handcart, moving along a straight line and turning occasionally. The speed is about 1 m/s. I got the results shown below:
[Images: Figure_1 through Figure_5]
I have tuned the initial parameters extensively (initial error covariance, process noise covariance, and measurement covariance), but the result is still poor at the turns of the trajectory. Could you please give me some advice on how to solve this problem? Thank you!

@scott81321

scott81321 commented Jan 28, 2022

Hello @mightyhao. Did you check your data? Did you make sure all timestamps are sorted in forward time, with no time gaps greater than 0.1 seconds?

I am more experienced with ai-imu-dr. I found what I think is a serious issue concerning one of Brossard's equations. If I understand this right, you have a significant gyro_z component for a while, and I suspect the issue I raised explains your poor result at the turns.

@milkcoffee365

Maybe the precision of the IMU is not good enough. The data used in this paper is very accurate, with very small bias and noise. I have tried running ai_imu with smartphone data, and the result is not good. But when GPS position updates are used, the result becomes much better.

@saltrack

@scott81321 @milkcoffee365 @marooncn Did you ever manage to train the model with your own dataset?
Additionally, how did you gather ground-truth data for your own dataset? Did you use an OXTS unit like KITTI?

@milkcoffee365

I tried running ai-imu-dr without the model, using the paper's KITTI data, and the results show no big difference. So I think the model is not necessary; the main contribution of this paper should be attributed to the IEKF modeling.
I collected my own data with smartphones, with no ground truth. I treat GPS (at about 1 Hz) as ground truth, which is not the same setup as in this paper. I removed some contiguous GPS segments to simulate GPS outages.
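The outage simulation described above can be sketched as follows. This is a hypothetical illustration (the function and array names are mine, not from the repo): drop contiguous spans of GPS fixes so the filter must propagate on the IMU alone during those windows.

```python
import numpy as np

def simulate_gps_outages(gps_valid, outage_starts, outage_len):
    """Return a copy of the GPS-validity mask with contiguous
    spans zeroed out to emulate GPS outages."""
    mask = np.asarray(gps_valid, dtype=bool).copy()
    for s in outage_starts:
        mask[s:s + outage_len] = False  # drop this contiguous block of fixes
    return mask

# 1 Hz GPS over 100 s; simulate two 15 s outages
mask = simulate_gps_outages(np.ones(100), outage_starts=[20, 60], outage_len=15)
```

During the masked spans the filter runs prediction-only, which is exactly where drift becomes visible.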

@1248280302

@mightyhao @milkcoffee365 Hey guys, how do you test this code with only IMU data? I have seen that the KITTI dataset includes GPS data and various error covariances. Could it work with basic 6-axis IMU data?

@ajay1606

Hello @mightyhao @scott81321, were you able to get this repo's code working with your own data? Would you please share some of your experience? I am also trying to convert my IMU bag data into the KITTI format mentioned by @mbrossar.

Currently, I have tested the code with the provided example KITTI dataset and it works well, as shown in the README. I also tried testing the code with KITTI raw data (using only the OXTS file format and enabling read_data=1 in main_kitti.py) and the results are very good.

Similarly to the KITTI OXTS (.txt) format, I have generated a .txt file and a timestamp file as input to this algorithm, after converting the bag file data into OXTS format as shown below. But the obtained results look very strange, and I can't see where I am going wrong. Could anyone guide me on plugging my bag values into this repository's code? I would really appreciate any response.

Here is the Bag to OXTS conversion:

Original IMU data from bag file

header: 
  seq: 4201
  stamp: 
    secs: 1633681151
    nsecs: 664900319
  frame_id: "imu_link"
orientation: 
  x: -0.0040089504111
  y: -0.0138810181661
  z: -0.132263991438
  w: 0.991109218109
orientation_covariance: [0.0009797271326400653, 0.0, 0.0, 0.0, 0.0012425838195850997, 0.0, 0.0, 0.0, 1.462497123398009]
angular_velocity: 
  x: 0.00138823367606
  y: 0.000741836739148
  z: 7.29580193999e-05
angular_velocity_covariance: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
linear_acceleration: 
  x: 0.00724529630717
  y: -0.0154154589063
  z: 0.0730627986135
linear_acceleration_covariance: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
---

From the IMU topic:
The IMU is installed so that its XYZ axes match the vehicle's FLU coordinate axes.

  af=msg->linear_acceleration.x;
  ax=msg->linear_acceleration.x;
  al=msg->linear_acceleration.y;
  ay=msg->linear_acceleration.y;
  au=msg->linear_acceleration.z;
  az=msg->linear_acceleration.z; 

  wf=msg->angular_velocity.x;
  wx=msg->angular_velocity.x;
  wl=msg->angular_velocity.y;
  wy=msg->angular_velocity.y;
  wu=msg->angular_velocity.z;
  wz=msg->angular_velocity.z; 
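Since the OXTS format stores roll/pitch/yaw rather than a quaternion, the orientation field of the IMU message above also has to be converted. A minimal sketch under the ZYX (yaw-pitch-roll) convention; verify that this matches your driver's convention before trusting it:

```python
import math

def quat_to_rpy(x, y, z, w):
    """Quaternion -> (roll, pitch, yaw) in radians, ZYX convention."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    sinp = 2.0 * (w * y - z * x)
    # clamp to +/- 90 degrees near the gimbal-lock singularity
    pitch = math.copysign(math.pi / 2, sinp) if abs(sinp) >= 1.0 else math.asin(sinp)
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

# quaternion from the sample message above: mostly a small yaw rotation
r, p, yw = quat_to_rpy(-0.0040089504111, -0.0138810181661,
                       -0.132263991438, 0.991109218109)
```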

From the Twist topic:
I am not sure about the vn and ve observations. How can I compute them?

// Not sure about these values
  vn=msg->twist.linear.x;
  ve=msg->twist.linear.y;

  vf=msg->twist.linear.x;
  vl=msg->twist.linear.y;
  vu=msg->twist.linear.z; 
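On the vn/ve question: if vf and vl are the forward/left velocities in the vehicle (FLU) frame, the north/east components can be obtained by rotating through the yaw angle rather than copied directly from the body-frame twist. A hedged sketch, assuming KITTI's OXTS convention that yaw = 0 points east and increases counterclockwise:

```python
import math

def body_to_en(vf, vl, yaw):
    """Rotate forward/left body-frame velocity into north/east components.
    Assumes the KITTI OXTS convention: yaw = 0 points east, positive CCW."""
    ve = vf * math.cos(yaw) - vl * math.sin(yaw)
    vn = vf * math.sin(yaw) + vl * math.cos(yaw)
    return vn, ve
```

With yaw = 0 (facing east), all forward velocity maps to ve; assigning twist.linear.x directly to vn, as in the snippet above, is only correct when the vehicle happens to face north.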

From the GNSS topic:

  lat=fix->latitude; 
  lon=fix->longitude;
  alt=fix->altitude ;  

Expected trajectory (original trajectory): [image]

Obtained trajectory: [image]

Position graph: [image]

Orientation graph: [image]

Any response would be much appreciated.

@scott81321

scott81321 commented Jul 28, 2022

I hope this answers the questions I have received. Brossard's program uses input data in his particular pickle format (.p files). This format requires IMU data (accel + gyro, each 3-dimensional), velocities (ENU), and RPY in the ZYX convention. 'P' is displacement (also EN, but the gravity part is odd), which starts at zero. For integration, Brossard's program only needs the initial values of P, the velocities, and RPY; for comparison and testing, however, it needs all of these values over time.

So far, I have not done any training. Brossard tells me you can get a good result without any training. I wrote a Python script which reads CSV files of the needed fields and produces the desired pickle .p files directly. I don't use OXTS data (except for initial testing of Brossard's program). The KITTI format has more data fields than Brossard's program needs, and in ASCII it takes a lot of space (one file per timestamp entry).

So far, I have not succeeded in producing a good result, because my data had too much drift in the accel and gyro measurements. This drift arose at every round of acceleration and deceleration, and drift can produce runaway solutions. I suspect that this drift is the "killer" for a lot of the test cases that do not work (look at the IMU baselines after the speed returns to zero). The OXTS sensor is precise, and you need sufficiently precise data to make Brossard's program work.

RPY is indeed in ENU (like the velocities), but I found that an IMU filter using accel + gyro in NED with the ZYX convention for the Euler angles gave better yaw estimates, which remains mysterious. On a flat surface, pitch and roll should be close to zero.
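A rough sketch of what such a CSV-to-pickle script could look like. All the dictionary keys and the column layout here are hypothetical; the actual structure expected by Brossard's loader must be checked against the repo's data-reading code:

```python
import pickle
import numpy as np

def csv_to_pickle(csv_path, out_path):
    """Pack timestamp, accel, gyro, ENU velocity, and RPY columns from a
    CSV into a pickle file. Assumed column order (one header row):
    t, ax, ay, az, wx, wy, wz, ve, vn, vu, roll, pitch, yaw."""
    data = np.loadtxt(csv_path, delimiter=",", skiprows=1)
    out = {
        "t": data[:, 0],          # timestamps, s
        "acc": data[:, 1:4],      # accelerometer, m/s^2
        "gyro": data[:, 4:7],     # gyroscope, rad/s
        "v_enu": data[:, 7:10],   # ENU velocity, m/s
        "rpy": data[:, 10:13],    # roll/pitch/yaw, rad (ZYX)
    }
    with open(out_path, "wb") as f:
        pickle.dump(out, f)
    return out
```

The point is only that a single binary .p file per sequence replaces the many per-timestamp ASCII files of the KITTI OXTS layout.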


@ajay1606

@scott81321 Thank you so much for your detailed response; it helps me understand much more clearly. By any chance, would it be possible to look at your script for reading data from a CSV file?

@scott81321

scott81321 commented Aug 9, 2022

I finally managed a reasonably successful run of Brossard's program on a small example, successful enough to convince me that the program is sound. My test case started beautifully but derailed at a particular turn. It did eventually recover in terms of velocities, but the resulting position calculation was somewhat derailed. I think I know why: my drift correction was not continuous but only piecewise continuous, causing discontinuities in the first derivative of the IMU accel + gyro data. Brossard's program is sensitive to such discontinuities, as I found out from one of his own test cases (2011_10_03_drive_0027_extract). The errors file is instructive in this regard. This could be the source of some people's current problems. Also, if you have drift in the original IMU accel + gyro data, it will show up in the imu.png image.
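The smoothness issue described here can be illustrated with a toy example: a piecewise-constant bias correction jumps at each segment boundary, while a linearly interpolated correction stays continuous. Names and values are illustrative only, not taken from the repo:

```python
import numpy as np

def piecewise_bias(t, knot_t, knot_b):
    """Piecewise-constant bias estimate: jumps at each knot time."""
    idx = np.searchsorted(knot_t, t, side="right") - 1
    return np.asarray(knot_b)[np.clip(idx, 0, len(knot_b) - 1)]

def smooth_bias(t, knot_t, knot_b):
    """Linearly interpolated bias: continuous, so the corrected
    signal has no artificial steps at the knots."""
    return np.interp(t, knot_t, knot_b)

t = np.linspace(0.0, 10.0, 1001)                 # 100 Hz for 10 s
knot_t, knot_b = [0.0, 5.0, 10.0], [0.00, 0.02, 0.01]  # estimated bias knots
jumpy = piecewise_bias(t, knot_t, knot_b)
smooth = smooth_bias(t, knot_t, knot_b)
```

Subtracting the `jumpy` estimate from the raw accel/gyro injects step discontinuities into the corrected signal, which is exactly the kind of artifact the filter reacted badly to; the `smooth` estimate avoids them.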

@pkr97

pkr97 commented Mar 24, 2023

Hi Scott, thanks for the elaboration. Would you be willing to share a simple use case in which one collects some accelerometer and gyro samples and tests the program? That would be very helpful for hands-on testing of the code.

@scott81321

Hello @pkr97, I am dealing with proprietary data, so I cannot release it, but I have posted comments. The situation has changed since Aug 9, 2022. For one thing, there were errors in the arXiv version of Brossard's paper, largely fixed in the IEEE version (a problem with brackets, mostly). The 'update' routine had to be updated; @robocar2018 was very helpful here. The program does poorly when the vehicle comes to a grinding halt for, say, 30-60 seconds; it is not designed to deal with that. In that case, you have to consider ZUPT, which I implemented with some success. When you do consider ZUPT, other issues are revealed. Just look at my other postings.
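A minimal sketch of the kind of zero-velocity (ZUPT) detection mentioned here, using a windowed variance test on the gyro norm. The window length and threshold are illustrative and would need tuning for a real sensor:

```python
import numpy as np

def detect_zupt(gyro, window=50, thresh=1e-4):
    """Flag samples as stationary when the windowed variance of the
    gyro norm falls below a threshold. gyro: (N, 3) array in rad/s."""
    norm = np.linalg.norm(gyro, axis=1)
    flags = np.zeros(len(norm), dtype=bool)
    for i in range(len(norm) - window + 1):
        if np.var(norm[i:i + window]) < thresh:
            flags[i:i + window] = True  # whole quiet window is stationary
    return flags

rng = np.random.default_rng(0)
still = rng.normal(0.0, 1e-3, size=(200, 3))   # stationary: tiny sensor noise
moving = rng.normal(0.0, 0.5, size=(200, 3))   # maneuvering: large gyro activity
flags = detect_zupt(np.vstack([still, moving]))
```

During flagged spans, a zero-velocity pseudo-measurement can then be fed to the filter's update step to pin the velocity state while the vehicle is halted.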

@yusshzay

@scott81321 @mightyhao Hello, could you please tell me how to use my own data? Thanks a lot.
