Microsoft's future: UWP was dead shortly after HL2 shipped, and is officially deprecated at this point. The WMR runtime for Desktop has an official support end-of-life in... 2026? But it is being removed from Windows in June/July of this year. I wouldn't really consider Microsoft to be 100% out, but maybe 90% out. If we see another consumer/business headset from Microsoft (and I think there's still a decent chance we may), my guesses are that it will be Android, and that Microsoft will not be directly making it. I'm honestly curious how Microsoft is going to pull off the military AR headset contract at this point, given how much talent they've bled. But it's a big commitment that may force them to stay in the game just a little bit longer.

Camera streams: On Apple Vision Pro, I expect Apple would chase you with lawyers if you found a way to access their cameras. Meta, given their mass-consumer focus, I don't think is likely to expose camera streams unless it becomes an industry standard, or more socially acceptable. The story gets a bit better with some other headsets though: Snapdragon Spaces headsets have an …

Tracking camera data is something that may become less and less available over time, though. I believe there are a lot of advantages to having dedicated hardware for processing tracking data and passthrough, stuff that's really expensive yet critical to run uninterrupted, like the HL2's Holographic Processing Unit, or the AVP's R1 chip. It's a lot of data to be shuffling back and forth to the main CPUs if you have to do that, which is why HL2 had a major performance penalty for Research Mode.
Sorry that this is not 100% related to StereoKit: we are creating medical devices and typically want to do our own optical pattern tracking using the calibrated cameras of the headset. We had done that for HL2, using it in Dev Mode.
As we believe the HL2 is effectively deprecated, we will be experimenting with Apple Vision Pro and Meta Quest 3. From what we can find, there seems to be no way to access the camera stream(s) of the inside-out cameras, along with calibration info, to feed into our pattern-tracking algorithms.
So, some questions, hoping you are much more closely connected to these things (@maluoi):
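To illustrate why the calibration info matters as much as the pixels themselves: pattern tracking needs the camera intrinsics to map between image pixels and 3D rays. A minimal numpy sketch of the pinhole projection involved, using made-up illustrative intrinsics (no real headset API is shown here, since none of these devices currently expose one):

```python
# Minimal sketch of the pinhole camera model used by pattern tracking.
# The intrinsics matrix K below is illustrative only; a real headset would
# provide fx, fy, cx, cy (plus distortion coefficients) per camera.
import numpy as np

def project_points(points_cam, K):
    """Project 3D points (camera frame, metres) to pixel coordinates."""
    uvw = (K @ points_cam.T).T          # (N, 3) homogeneous image coords
    return uvw[:, :2] / uvw[:, 2:3]     # perspective divide -> (N, 2) pixels

# Assumed intrinsics for a hypothetical 640x480 tracking camera.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Corners of a 10 cm square marker, 1 m in front of the camera.
marker = np.array([[-0.05, -0.05, 1.0],
                   [ 0.05, -0.05, 1.0],
                   [ 0.05,  0.05, 1.0],
                   [-0.05,  0.05, 1.0]])

pixels = project_points(marker, K)
print(pixels)  # marker corners land around the image centre (320, 240)
```

Inverting this mapping (detected pixel corners plus known marker geometry back to a 6-DoF pose, e.g. via a PnP solver) is the step that silently breaks without accurate per-camera calibration, which is why uncalibrated passthrough frames alone would not be enough for our use case.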
Thanks,
Fabian