
how to use sensor data from iphone #217

Open
HtutLynn opened this issue Oct 18, 2023 · 2 comments

@HtutLynn

Hi, I am trying to generate a point cloud using Kimera-VIO with data from iPhone sensors. The IMU data format that Kimera expects is different from the fused_imu data generated by iOS. I was wondering whether I need to do some preprocessing on the iPhone sensor data so that it can be used with Kimera-VIO.

IMU data format used by Kimera:

timestamp [ns], w_RS_S_x [rad s^-1], w_RS_S_y [rad s^-1], w_RS_S_z [rad s^-1], a_RS_S_x [m s^-2], a_RS_S_y [m s^-2], a_RS_S_z [m s^-2]

fused_imu data format from iOS:

timestamp, ax, ay, az, rx, ry, rz, mx, my, mz, gx, gy, gz, heading

w_RS_S_x [rad s^-1], w_RS_S_y [rad s^-1], w_RS_S_z [rad s^-1] are the angular velocities around the x, y, z axes, so I am thinking of using rx, ry, rz from the iOS sensors. And since a_RS_S_x [m s^-2], a_RS_S_y [m s^-2], a_RS_S_z [m s^-2] are the linear accelerations along the x, y, z axes, I am thinking of using ax, ay, az from the iOS sensors.

However, the problem is that the values of a_RS_S_x [m s^-2], a_RS_S_y [m s^-2], a_RS_S_z [m s^-2], especially the linear acceleration along the x axis, range from 8 to 9, while ax, ay, az from the iOS sensors all range from -0.001 to -0.002. Am I doing something wrong? Thank you for the help in advance!

@marcusabate
Member

We haven't tried Kimera on iPhone data, but it should work. Your association between the headers is probably right. A large difference in magnitude isn't necessarily concerning, since the iPhone may be moving with much smaller accelerations than the EuRoC drone. However, Kimera does require that gravity readings are present in the IMU data, so you should see one vector (not necessarily axis-aligned) with values in the 8-9 m/s^2 range.
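One quick way to check this (a minimal sketch, not part of Kimera itself; the function name and threshold are my own choices) is to look at the mean magnitude of the raw accelerometer vector while the phone sits still:

```python
import math

def gravity_present(samples, tol=1.0):
    """Check whether raw accelerometer rows still contain gravity.

    samples: iterable of (ax, ay, az) in m/s^2, recorded while the device
    is at rest. If the mean magnitude is near 9.81, gravity has NOT been
    subtracted; if it is near 0, the source is publishing gravity-free
    acceleration (as iOS userAcceleration does).
    """
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in samples]
    mean_mag = sum(mags) / len(mags)
    return abs(mean_mag - 9.81) < tol, mean_mag

# A device lying still with gravity included along z:
ok, mag = gravity_present([(0.1, 0.2, 9.79), (0.0, 0.1, 9.82)])
```

If `ok` comes back False on a stationary log, the gravity vector has been filtered out upstream and must be added back before feeding the data to Kimera.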

If you're able to upload your dataset, I can take a look. It's possible that the iPhone filters out the gravity vector before publishing. I'd recommend checking whether anyone else has tried to use IMU data from an iPhone and, if so, what they had to do to make it usable.

@HtutLynn
Author

HtutLynn commented Nov 6, 2023

@marcusabate, thanks for the reply! Yes, I am trying to make Kimera work with sensor data from an iPhone, but it still doesn't work due to the error from #218 (comment). For your information, I uploaded the sensor + visual data collected from the iPhone through the Core Motion framework and ARKit to this GDrive.

Here is the sensor data format from iOS (iPhone):

  1. ax, ay, az are accelerometer sensor values.
  2. rx, ry, rz are gyroscope sensor values.
  3. mx, my, mz are magnetometer sensor values.
  4. gx, gy, gz are gravity sensor values.

But the values that the EuRoC dataset provides are:

  1. Linear Acceleration
  2. angular velocity

Here is the transformation process for the sensor values from iOS (iPhone) to the EuRoC format:

linear_acceleration_x = (ax+gx) * -9.81
linear_acceleration_y = (ay+gy) * -9.81
linear_acceleration_z = (az+gz) * -9.81

angular_velocity_x = rx
angular_velocity_y = ry
angular_velocity_z = rz

imu_msg = [nano_timestamp,
           angular_velocity_x,
           angular_velocity_y,
           angular_velocity_z,
           linear_acceleration_x,
           linear_acceleration_y,
           linear_acceleration_z
          ]
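The steps above can be sketched as a single conversion function (my own sketch, assuming that ax..az and gx..gz are reported in units of g, rx..rz in rad/s, and the timestamp in seconds, as Core Motion does; the -1 sign follows my convention above and may need flipping for a different axis convention):

```python
G = 9.80665  # standard gravity, m/s^2

def ios_row_to_euroc(ts_s, ax, ay, az, rx, ry, rz, gx, gy, gz):
    """Convert one iOS Core Motion sample to a EuRoC-style IMU row.

    Adds gravity back onto the user acceleration (both in units of g),
    scales to m/s^2, and converts the timestamp to nanoseconds. The
    rotation rates are passed through unchanged since they are already
    in rad/s. Returns [ts_ns, wx, wy, wz, ax, ay, az].
    """
    ts_ns = int(ts_s * 1e9)  # EuRoC timestamps are in nanoseconds
    acc = [-(a + g) * G for a, g in ((ax, gx), (ay, gy), (az, gz))]
    return [ts_ns, rx, ry, rz] + acc
```

For a phone lying flat (user acceleration zero, gravity (0, 0, -1) in g units), this yields an acceleration vector of magnitude ~9.81 m/s^2, which matches the range Kimera expects to see.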

The pose.txt file contains the poses generated by Apple ARKit, not by the Core Motion framework. So for the visual data and camera intrinsics, I extract them from pose.txt. But since the camera is rotated, I have to rotate the frames extracted from the video 90° clockwise to get the correct orientation.

But the problem with using data from an iPhone is that I don't know where I can get the intrinsics of the iPhone's IMU sensor: the IMU noise values such as gyroscope noise, accelerometer noise, gyroscope random walk, and accelerometer random walk. I am also lost on how to get the transformation matrix from the camera to the body frame.
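A rough first estimate of the white-noise terms (not a substitute for a proper Allan-variance calibration, e.g. with a tool like allan_variance_ros; the function below is my own sketch) can be obtained from a log recorded while the phone sits perfectly still: the continuous-time noise density Kimera's IMU parameters expect is the sample standard deviation divided by the square root of the sample rate.

```python
import math

def continuous_noise_density(samples, rate_hz):
    """Rough continuous-time noise density from a stationary IMU log.

    samples: one axis of gyroscope (rad/s) or accelerometer (m/s^2)
    readings recorded while the device is motionless; rate_hz: the IMU
    sample rate. Dividing the discrete-sample standard deviation by
    sqrt(rate_hz) converts it to the continuous white-noise density.
    The random-walk terms require a full Allan-variance analysis.
    """
    mean = sum(samples) / len(samples)
    var = sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)
    return math.sqrt(var) / math.sqrt(rate_hz)
```

Repeating this per axis (and taking, say, the largest value) gives starting points for the gyroscope and accelerometer noise entries, which can then be inflated somewhat since consumer IMUs are noisier in motion than at rest.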

Thanks for the help in advance!
