Quest 3's passthrough will improve over time with software updates, according to Meta's CTO.
When asked about potential passthrough improvements in an Instagram AMA (Ask Me Anything) session yesterday, Andrew Bosworth replied:
"Yes, it will continue to improve as we continue to get real world lighting conditions and information from the headsets that have been picked up, we start to tune the algorithms that drive it more effectively.
And so I do think it will continue to improve - modestly - from here for a little while as we do a better job depth estimating with where your hands are and working with the distortion around that, and things like that.
So yeah, we're gonna continue to work on it as we have with the Quest Pro."
While Quest 3 is arguably the first consumer headset with passthrough you'd want to spend more than a few minutes in, our review pointed out noticeable flaws that keep it feeling far from a transparent optic.
While the passthrough is depth- and scale-correct for objects and scenery beyond arm's length, with impressively low latency and stability, at close range it exhibits geometric warping and double-imaging on moving objects. This is clearly visible on your hands and arms, and it means that while the camera resolution is just about good enough to read text, you won't actually want to use your phone for long, because it appears significantly distorted in a different way for each eye.
This warping is partly a result of the very low resolution of the depth estimation field the system generates to reproject the color camera views. It works by comparing the views from the two lower-latency greyscale cameras. Quest Pro's depth estimation outputs 10,000 points per frame; Meta hasn't yet given a figure for Quest 3.
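To illustrate why a coarse stereo depth field causes this kind of close-range warping, here is a minimal conceptual sketch in Python using OpenCV. It is not Meta's pipeline: the block matcher stands in for whatever Meta actually uses, and the camera parameters (FOCAL_PX, BASELINE_M, the eye offset) and the 32x32 depth grid are placeholder assumptions, chosen only to show how a single depth value covering a large patch of pixels distorts nearby objects far more than distant ones.

```python
# Conceptual sketch only. OpenCV's block matcher stands in for Meta's
# proprietary depth pipeline, and every camera parameter is a placeholder.
import numpy as np
import cv2

FOCAL_PX = 300.0    # focal length in pixels (placeholder)
BASELINE_M = 0.10   # distance between the two greyscale cameras, metres (placeholder)

def estimate_coarse_depth(left_grey: np.ndarray, right_grey: np.ndarray) -> np.ndarray:
    """Estimate a coarse depth map by comparing the two greyscale views.

    Block matching finds, for each patch in the left image, the best-matching
    patch in the right image; the horizontal shift (disparity) is inversely
    proportional to depth. Inputs are assumed to be 8-bit single-channel images.
    """
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left_grey, right_grey).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan            # no reliable match found
    return FOCAL_PX * BASELINE_M / disparity      # depth = f * b / d

def reproject_color_view(color: np.ndarray, depth_m: np.ndarray) -> np.ndarray:
    """Warp the color camera image toward the eye's viewpoint using coarse depth.

    Downsampling the depth map to a 32x32 grid mimics a low-resolution depth
    field: one depth value has to cover a large patch of pixels, so nearby
    objects (hands, phones) whose true depth varies within a patch get warped.
    """
    coarse = cv2.resize(depth_m, (32, 32), interpolation=cv2.INTER_NEAREST)
    dense = cv2.resize(coarse, (color.shape[1], color.shape[0]),
                       interpolation=cv2.INTER_LINEAR)
    # Each pixel is shifted by parallax = f * eye_offset / depth, so errors in
    # the coarse depth field translate directly into visible geometric warping.
    eye_offset_m = 0.03  # camera-to-eye offset, metres (placeholder)
    parallax_px = FOCAL_PX * eye_offset_m / np.nan_to_num(dense, nan=np.inf)
    map_x, map_y = np.meshgrid(np.arange(color.shape[1], dtype=np.float32),
                               np.arange(color.shape[0], dtype=np.float32))
    return cv2.remap(color, map_x + parallax_px.astype(np.float32), map_y,
                     interpolation=cv2.INTER_LINEAR)
```

The sketch makes the trade-off visible: distant scenery barely moves when the depth field is coarse, while anything within arm's length gets noticeably displaced, which matches the close-range warping described above.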
Bosworth's comments suggest that Quest 3's depth estimation accuracy or resolution may be improved in upcoming software updates. If that happens, it would also improve the quality of the currently experimental dynamic occlusion feature.
Still, Bosworth tempered expectations with the word "modestly" in his reply. While software can improve, Meta's passthrough stack will still be limited by the cameras on the Quest 3 headset and the Snapdragon XR2 Gen 2 chipset. My colleague Ian reports that distortion wasn't visible in his brief time with Vision Pro, but Apple is using a dedicated secondary chipset to process passthrough and more (and likely higher-quality) sensors. Of course, this is also part of why it will start at seven times the price.
via Mint VR