
Controller pose lag: GetDeviceToAbsoluteTrackingPose and fPredictedSecondsToPhotonsFromNow #121

Open
paul-marechal opened this issue May 23, 2021 · 1 comment


paul-marechal commented May 23, 2021

What I noticed

When using ARVRController nodes and parenting a mesh to them, I can see an offset between where Godot thinks the controllers are versus what SteamVR shows when displaying overlays. It is as if Godot reads positions with some lag.

To reproduce: simply parent a mesh to ARVRController nodes, then, if you have SteamVR, open its overlay. The overlay renders a model of the controllers, and with Godot still running in the background you should be able to notice the lag between the two renders. Note that I have seen the same issue in other games, such as ones made with the Unity engine.

Granted, the lag is small and barely noticeable unless the overlay is displayed for visual comparison, but it still seems worth addressing since the API supports compensating for it.

Potential cause

While ramping up on the different components involved in VR support for Godot, I found the following documentation for the API that fetches poses: https://github.com/ValveSoftware/openvr/wiki/IVRSystem::GetDeviceToAbsoluteTrackingPose

The pose that the tracker thinks that the HMD will be in at the specified number of seconds into the future. Pass 0 to get the state at the instant the method is called. Most of the time the application should calculate the time until the photons will be emitted from the display and pass that time into the method.

This Godot OpenVR addon currently passes 0.0 to this API:

if (tracking_universe == openvr_data::OpenVRTrackingUniverse::SEATED) {
	vr::VRSystem()->GetDeviceToAbsoluteTrackingPose(vr::TrackingUniverseSeated, 0.0, tracked_device_pose, vr::k_unMaxTrackedDeviceCount);
} else if (tracking_universe == openvr_data::OpenVRTrackingUniverse::STANDING) {
	vr::VRSystem()->GetDeviceToAbsoluteTrackingPose(vr::TrackingUniverseStanding, 0.0, tracked_device_pose, vr::k_unMaxTrackedDeviceCount);
} else {
	vr::VRSystem()->GetDeviceToAbsoluteTrackingPose(vr::TrackingUniverseRawAndUncalibrated, 0.0, tracked_device_pose, vr::k_unMaxTrackedDeviceCount);
}

The same wiki page mentions a way to automatically compute this "seconds to photons" value.
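The wiki's suggested computation can be sketched as a self-contained function. The function name here is hypothetical; in the addon the three inputs would come from `GetTimeSinceLastVsync`, and the `Prop_DisplayFrequency_Float` and `Prop_SecondsFromVsyncToPhotons_Float` device properties of the HMD:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical helper mirroring the formula from the OpenVR wiki:
// the predicted time until photons hit the eyes is the time remaining
// in the current display frame plus the display's fixed
// vsync-to-photons latency.
float predicted_seconds_to_photons(float seconds_since_last_vsync,
		float display_frequency, float vsync_to_photons) {
	float frame_duration = 1.0f / display_frequency;
	return frame_duration - seconds_since_last_vsync + vsync_to_photons;
}
```

The resulting value would replace the hard-coded `0.0` currently passed to `GetDeviceToAbsoluteTrackingPose`.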

@BastiaanOlij
Member

One problem here is that we update the tracking data right before we render the scene, but we don't apply the new locations until the next frame is processed. We only use the new head location for the rendered frame to keep the latency there as low as possible.

In order to react properly to controller movement, we have to update to the new locations before script logic is processed.

That said, you are correct that we should be sending timing info to the tracking functions. I'm guessing we could use the delta between get_last_commit_usec and get_last_process_usec.
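As a rough sketch of that suggestion (names and direction of the delta are assumptions, not a confirmed design), the prediction time could be derived from the two microsecond timestamps the server already tracks:

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

// Hypothetical sketch: estimate the prediction time as the delta between
// when tracking data was last processed and when the last frame was
// committed to the HMD. In Godot 3.x these timestamps would come from
// ARVRServer::get_last_process_usec() and ARVRServer::get_last_commit_usec().
float prediction_from_frame_timing(uint64_t last_process_usec,
		uint64_t last_commit_usec) {
	// Microseconds elapsed between processing and commit, as seconds.
	return (last_commit_usec - last_process_usec) / 1000000.0f;
}
```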
