setProjectionFromUnion() shouldn't run for non-stereoscopic rendering
Description of the problem
I think setProjectionFromUnion() in WebXRManager is intended for stereoscopic rendering, but WebXRManager.getCamera() calls it even for non-stereoscopic rendering, for example mobile AR.
In particular, in mobile AR (where the number of views is 1) cameraR (cameraVR.cameras[1]) is never updated, so setProjectionFromUnion(), which builds the projection matrix from cameraL and cameraR, may not work correctly.
An easy solution would be to skip setProjectionFromUnion() when the number of views is one, but there is currently no way to detect that in .getCamera(). We may need to save the number of views somewhere.
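For illustration, a minimal sketch of that check, assuming the view count from the most recent XRFrame has been cached by the manager (the viewCount variable here is hypothetical, not existing three.js code):
```js
// Sketch only, not the actual three.js implementation.
// Assumes onAnimationFrame() stored pose.views.length in `viewCount`.
function getCamera( camera ) {

	// ... existing setup of cameraVR, cameraL, cameraR ...

	if ( viewCount > 1 ) {

		// Stereoscopic: merge the left/right frusta into one culling camera.
		setProjectionFromUnion( cameraVR, cameraL, cameraR );

	} else {

		// Monoscopic (e.g. handheld AR): cameraR was never updated,
		// so just reuse the single view's projection.
		cameraVR.projectionMatrix.copy( cameraL.projectionMatrix );

	}

	return cameraVR;

}
```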
/cc @jsantell
Three.js version
- Dev
- r111
- …
Browser
- All of them
- Chrome
- Firefox
- Internet Explorer
OS
- All of them
- Windows
- macOS
- Linux
- Android
- iOS
Hardware Requirements (graphics card, VR Device, …)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Correct. In the vast majority of cases you’ll get a consistent number of views each frame for the full duration of an XRSession, but there are some reasonable scenarios where the view count may change mid-session as well. For example, I’ve observed some platforms that drop content to monoscopic rendering when system UI is up to prevent depth conflicts. Another example would be zSpace’s stereo monitors, which will smoothly transition from mono to stereo display depending on whether or not they detect a tracked pair of stereo glasses.
Regardless of how common the scenarios above may be, though, the most important goal of the current views system is that we didn’t want devs making assumptions about how to render content based on anything other than the actual views reported. Otherwise we’d definitely get developers doing things like blindly rendering only one view across the entire buffer when in AR mode because they only considered phone AR and not headsets.
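As a concrete example of that principle, a bare-bones WebXR render loop would draw whatever views the frame actually reports rather than hard-coding a mono or stereo layout. A rough sketch, assuming gl, xrReferenceSpace, and drawScene are set up elsewhere:
```js
function onXRFrame( time, frame ) {

	const session = frame.session;
	session.requestAnimationFrame( onXRFrame );

	const pose = frame.getViewerPose( xrReferenceSpace );
	if ( pose === null ) return;

	const glLayer = session.renderState.baseLayer;
	gl.bindFramebuffer( gl.FRAMEBUFFER, glLayer.framebuffer );

	// One, two, or more views — and the count may change mid-session.
	for ( const view of pose.views ) {

		const viewport = glLayer.getViewport( view );
		gl.viewport( viewport.x, viewport.y, viewport.width, viewport.height );

		drawScene( view.projectionMatrix, view.transform.inverse.matrix );

	}

}
```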
I think there are additional related issues with how cameras are used in relation to XR views. The current code in WebXRManager’s onAnimationFrame makes assumptions about the WebXR views that aren’t necessarily true.
If a page uses an AR session after a VR session, the second camera continues rendering using stale viewport information from the previous session, resulting in duplicate output. I filed this as a separate issue: https://github.com/mrdoob/three.js/issues/18525
Also, fresh AR sessions seem to start out with a second camera with a (0, 0, 0, 0) viewport, and as far as I can tell it’s using this camera every frame for rendering a zero-sized view. I don’t know if this has a significant performance cost, but it would be preferable to skip this camera when it’s not needed.
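One possible mitigation, sketched below, is to check the viewport before assigning a camera to a view (names like getCameraForView and activeCameras are illustrative, not existing three.js API):
```js
const activeCameras = [];

for ( const view of pose.views ) {

	const viewport = baseLayer.getViewport( view );

	// A (0, 0, 0, 0) viewport has nothing to show; skip it instead of
	// spending a render pass on a zero-sized view.
	if ( viewport.width === 0 || viewport.height === 0 ) continue;

	const camera = getCameraForView( view ); // hypothetical lookup
	camera.viewport.set( viewport.x, viewport.y, viewport.width, viewport.height );
	activeCameras.push( camera );

}
```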
The code also assumes that the first view is for the left eye and the second view for the right eye. This isn't necessarily true; the code should use each view's eye property to determine which eye is expected to see it. A headset that reports views in a different order could end up showing stereo content to the wrong eye.
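A rough sketch of mapping views to cameras by eye rather than by array index (the camera names, including cameraMono for 'none' views, are illustrative):
```js
const camerasByEye = {
	left: cameraL,
	right: cameraR,
	none: cameraMono // e.g. for monoscopic AR views
};

for ( const view of pose.views ) {

	// view.eye is 'left', 'right', or 'none' — don't rely on view order.
	const camera = camerasByEye[ view.eye ];

	camera.matrix.fromArray( view.transform.matrix );
	camera.projectionMatrix.fromArray( view.projectionMatrix );

}
```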
If the XR session has more than two views, the access to cameraVR.cameras[i] would be out of bounds, and the code after that, such as camera.matrix.fromArray, will fail. Currently there don't seem to be any devices with more than two views, but future ones, such as Varjo headsets with a high-resolution inset display, are likely to use this capability.
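To stay safe against that, the camera list could be grown on demand instead of being indexed as a fixed pair. A sketch under the assumption that cameraVR is the ArrayCamera used by WebXRManager:
```js
for ( let i = 0; i < pose.views.length; i ++ ) {

	const view = pose.views[ i ];
	let camera = cameraVR.cameras[ i ];

	if ( camera === undefined ) {

		// More views than cameras: create one instead of reading out of bounds.
		// (Layer/viewport setup omitted in this sketch.)
		camera = new THREE.PerspectiveCamera();
		camera.matrixAutoUpdate = false;
		cameraVR.cameras.push( camera );

	}

	camera.matrix.fromArray( view.transform.matrix );
	camera.projectionMatrix.fromArray( view.projectionMatrix );

}
```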