
Option to map controller actions to hand poses with added noise #107

goatchurchprime opened this issue Sep 12, 2020 · 2 comments

@goatchurchprime

The actions.json config file is a great start for handling the proliferation of controller systems. However, it leaves out hand tracking, which is not simply another controller with buttons that can be bound. And the way the tech is developing, hand tracking could become the most important input system of all: hand-tracking hardware already costs about the same as a pair of controllers, and it doesn't need batteries.
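For reference, a SteamVR actions.json manifest declares the abstract inputs a game needs, which the runtime then binds to whatever hardware is present. A minimal sketch (the action names and binding file here are illustrative, not taken from this plugin):

```json
{
  "default_bindings": [
    { "controller_type": "knuckles", "binding_url": "bindings_knuckles.json" }
  ],
  "actions": [
    { "name": "/actions/default/in/trigger",  "type": "vector1" },
    { "name": "/actions/default/in/grab",     "type": "boolean" },
    { "name": "/actions/default/in/aim_pose", "type": "pose" }
  ],
  "action_sets": [
    { "name": "/actions/default", "usage": "leftright" }
  ]
}
```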

To develop for hand tracking before you own a hand tracker, we need a system that maps the controller buttons to hand gestures, which we can then decode. AltspaceVR has this in-game, but its gestures aren't right: for example, it maps the trigger button to folding the index finger in, when this should probably be a thumb-and-index pinch. With such a mapping we could build our UIs for hand tracking and for controller bindings at the same time, instead of building for controllers first and bodging something on afterwards to make hands work. Developing for hands and controllers simultaneously would still let us tweak the two interfaces separately later, to take advantage of their unique characteristics. A rough sketch of the kind of mapping meant here follows below.
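A minimal sketch of the proposed mapping, assuming a per-finger curl representation of the pose (none of these names exist in the plugin; they are purely illustrative): blend each finger between an "open" key pose and a thumb-and-index "pinch" key pose driven by the trigger value, then add a little noise so downstream gesture decoding isn't written against unrealistically clean input.

```cpp
#include <array>
#include <cstdio>
#include <initializer_list>
#include <random>

constexpr int NUM_FINGERS = 5; // thumb, index, middle, ring, pinky

// Per-finger curl: 0.0 = fully extended, 1.0 = fully curled.
using HandPose = std::array<float, NUM_FINGERS>;

// Key poses: an open hand, and a thumb-and-index pinch (as suggested
// above for the trigger, rather than folding the index finger alone).
const HandPose OPEN_HAND  = {0.1f, 0.0f, 0.0f, 0.0f, 0.0f};
const HandPose PINCH_POSE = {0.6f, 0.7f, 0.1f, 0.1f, 0.1f};

HandPose map_trigger_to_pose(float trigger_value, std::mt19937 &rng) {
    std::normal_distribution<float> jitter(0.0f, 0.02f); // simulated sensor noise
    HandPose pose;
    for (int i = 0; i < NUM_FINGERS; ++i) {
        // Linear blend between the two key poses, driven by the trigger.
        float curl = OPEN_HAND[i] + trigger_value * (PINCH_POSE[i] - OPEN_HAND[i]);
        curl += jitter(rng);
        pose[i] = curl < 0.0f ? 0.0f : (curl > 1.0f ? 1.0f : curl); // clamp
    }
    return pose;
}

int main() {
    std::mt19937 rng(42);
    for (float t : {0.0f, 0.5f, 1.0f}) {
        HandPose p = map_trigger_to_pose(t, rng);
        std::printf("trigger %.1f -> curls:", t);
        for (float c : p) std::printf(" %.2f", c);
        std::printf("\n");
    }
    return 0;
}
```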

This is a development tool for all of us -- turning controllers into a hand-tracking interface within the API -- so it should be done at a low level in the plugin, to get us into good habits in our development workflow. The alternative would be to wrap the plugin in another layer that binds the two interfaces together, and nobody should have to do that if this is an approach we should all be adopting.

@BastiaanOlij
Member

This is something to investigate in the future, once OpenVR actually supports hand tracking. The direction Valve seems to be taking is that this logic will already be supported in SteamVR itself. It is very clear that Valve wants games to publish which actions they need, so that those actions can be mapped onto whatever input device is being used and you as a game developer don't have to worry about it.

That means the onus of the above logic lies with the suppliers who support hand tracking: they need to detect the gestures, and you can then map those gestures onto the actions defined in the game, within the bindings interface Steam supplies.

Until such features are made available to us, however, this is just theory.

@BastiaanOlij
Member

Just to further this discussion: devices such as the Leap Motion now have a number of community-made SteamVR drivers that implement this according to Valve's specifications. Leap Motion themselves have an OpenXR implementation that exposes finger tracking through OpenXR's proper API for this.
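For reference, a rough sketch of what reading those joints looks like through OpenXR's XR_EXT_hand_tracking extension. It assumes an XrInstance/XrSession created with the extension enabled, plus a base XrSpace and frame time from the app's frame loop; real code would create the tracker once and destroy it with xrDestroyHandTrackerEXT rather than creating it per call.

```cpp
#include <openxr/openxr.h>

static XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT]; // 26 joints

bool locate_left_hand(XrInstance instance, XrSession session,
                      XrSpace base_space, XrTime time) {
    // Extension functions must be fetched through xrGetInstanceProcAddr;
    // they are not exported by the core loader.
    PFN_xrCreateHandTrackerEXT xrCreateHandTrackerEXT = nullptr;
    PFN_xrLocateHandJointsEXT xrLocateHandJointsEXT = nullptr;
    xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                          (PFN_xrVoidFunction *)&xrCreateHandTrackerEXT);
    xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT",
                          (PFN_xrVoidFunction *)&xrLocateHandJointsEXT);
    if (!xrCreateHandTrackerEXT || !xrLocateHandJointsEXT)
        return false; // runtime does not expose the extension

    XrHandTrackerCreateInfoEXT create_info{XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT};
    create_info.hand = XR_HAND_LEFT_EXT;
    create_info.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;
    XrHandTrackerEXT tracker = XR_NULL_HANDLE;
    if (XR_FAILED(xrCreateHandTrackerEXT(session, &create_info, &tracker)))
        return false;

    XrHandJointsLocateInfoEXT locate_info{XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT};
    locate_info.baseSpace = base_space;
    locate_info.time = time;

    XrHandJointLocationsEXT locations{XR_TYPE_HAND_JOINT_LOCATIONS_EXT};
    locations.jointCount = XR_HAND_JOINT_COUNT_EXT;
    locations.jointLocations = joints;

    // On success, joints[] holds a pose and radius for every hand joint.
    return XR_SUCCEEDED(xrLocateHandJointsEXT(tracker, &locate_info, &locations)) &&
           locations.isActive;
}
```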

So the pieces of the puzzle are starting to fall into place. :)
