Scene: Respect original material settings in .overrideMaterial
When setting the override material for a scene, the maps and other settings on the original material are not used; this applies to animations, mesh instances, and other uniforms and defines as well. This jsfiddle shows that the normal map is not applied when rendering the scene with a MeshNormalMaterial applied.
This makes it difficult to render things like normal and depth buffers for screen effects. So when an overrideMaterial is used, the defines and uniforms of the original material should be used (if they exist on the override material). This would let an override material use the color, displacementMap, normalMap, textures, skinning settings, etc. available on the original.
This mechanic could be used to make post-processing passes more robust and correct, afford a depth prepass with correct vertex transformations, enable a deferred renderer that uses the original material properties without having to manually copy everything, and provide shadows that support displacement maps out of the box.
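The forwarding rule being proposed can be sketched in plain JavaScript. The helper name and the plain objects standing in for materials are hypothetical; the idea is simply that a slot from the original material is copied onto the override only when the override material actually declares that slot:

```javascript
// Sketch of the proposed forwarding rule: before a mesh is rendered with the
// override material, copy each slot the override declares from the mesh's
// original material. The slot names (normalMap, displacementMap, color)
// mirror common three.js material properties, but this helper is hypothetical.
function forwardMaterialProperties(original, override, keys) {
  const previous = {};
  for (const key of keys) {
    // Only forward slots the override material knows about.
    if (key in override && original[key] !== undefined) {
      previous[key] = override[key]; // remember so we can restore after the pass
      override[key] = original[key];
    }
  }
  return previous; // prior values, for restoring once the pass is done
}

// Example with plain objects standing in for materials:
const original = { normalMap: 'normalTex', displacementMap: 'dispTex', color: 'red' };
const override = { normalMap: null, displacementMap: null }; // e.g. a normal/depth material

const saved = forwardMaterialProperties(original, override, ['normalMap', 'displacementMap', 'color']);
// override now uses the original's normalMap and displacementMap;
// 'color' is not forwarded because the override does not declare it.
```

This keeps the override material authoritative about *which* uniforms it supports while the per-object data still comes from the original material.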
This is closer to how Unity’s Shader Replacement works to afford this functionality:
> The camera renders the scene as it normally would. The objects still use their materials, but the actual shader that ends up being used is changed.
In Unity, if the object's material doesn't define a specific uniform, then the default shader uniform value is used. To allow for backwards compatibility, this could be included as an opt-in feature:
Scene.setOverrideMaterial( material : Material, overrideUniforms : Boolean );
Issue Analytics
- State:
- Created 5 years ago
- Reactions: 3
- Comments: 6 (2 by maintainers)
Top GitHub Comments
That was a rough overview of how it works. There are a couple of things you can do to speed it up, such as only caching during the first replacement, rendering all subsequent passes, and then resetting afterward.
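The cache-once / render-all-passes / reset-once workflow might look like this sketch. The scene and renderer are minimal stand-ins so the shape of the code is visible; in three.js you would collect meshes via Object3D.traverse and call renderer.render for each pass:

```javascript
// Sketch of the pattern described above: cache materials once, render every
// pass with the override applied, then restore the originals once afterward.
// The mesh objects and renderPass callback here are stand-ins, not three.js API.
function renderPassesWithOverride(meshes, overrideMaterial, passes, renderPass) {
  // 1. Cache each mesh's material once, then swap in the override.
  const cache = new Map();
  for (const mesh of meshes) {
    cache.set(mesh, mesh.material);
    mesh.material = overrideMaterial;
  }

  // 2. Render every pass while the override is still applied.
  for (const pass of passes) renderPass(pass);

  // 3. Restore the original materials once, afterward.
  for (const mesh of meshes) mesh.material = cache.get(mesh);
}

// Example: two passes share one replacement, then materials come back.
const meshes = [{ material: 'A' }, { material: 'B' }];
const rendered = [];
renderPassesWithOverride(meshes, 'depth', ['depthPass', 'normalPass'],
  pass => rendered.push(`${pass}:${meshes.map(m => m.material).join(',')}`));
// Both passes saw the override; afterward meshes[0].material is 'A' again.
```

The saving over a naive approach is that the material swap and restore each happen once per frame instead of once per pass.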
Keeping a shallow array of all meshes with materials that you want to replace (even if that means all of them) can help speed things up, because you can avoid recursing and iterating over non-renderable nodes. You can generate that array once at the beginning of every frame, or update it when you add or remove something from the scene. I use a hacked-in version of #16934 to track when items are added and removed more easily for this type of purpose.
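A minimal sketch of that flat list, assuming add/remove hooks exist on the scene graph (which is what #16934 proposes; the class and method names here are hypothetical):

```javascript
// Sketch of a flat list of replaceable meshes, so per-frame material swaps
// iterate a plain collection instead of recursing through group/helper nodes.
// Wire add()/remove() to the scene graph's add/remove events (see #16934).
class RenderableList {
  constructor() {
    this.meshes = new Set();
  }
  add(object) {
    // Only track renderable meshes; groups, lights, etc. are skipped.
    if (object.isMesh) this.meshes.add(object);
  }
  remove(object) {
    this.meshes.delete(object);
  }
  forEach(fn) {
    this.meshes.forEach(fn);
  }
}

// Example: non-mesh nodes never enter the list.
const list = new RenderableList();
list.add({ isMesh: true, name: 'box' });
list.add({ isMesh: false, name: 'group' }); // ignored: not renderable
let count = 0;
list.forEach(() => count++);
// count is 1: only the mesh was tracked.
```

Rebuilding this list only on scene changes (rather than traversing every frame) is where the savings come from.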
You can disable `scene.autoUpdate` on subsequent renders, too, to avoid matrix updates.

I haven't gone that deep into it, no. It's not the ideal solution, but until there's something built in, this is what I've looked into. The postprocessing effects in the examples folder (and shadow map passes!) could really benefit from proper passes like this, though.
Generally I would like to see tools and features added to three.js that permanently allow us to build our own flexibility around the library, such as the exposed shaders, uniforms, and defines for built-in materials, and #16934. In this case there are also benefits to be had within the library itself from including this functionality.
@gkjohnson The solution you linked seems to require extra traversals of the scene. I'd imagine that kind of solution is quite costly for performance if we consider the use case of rendering depth/normal buffers as part of a post-processing or deferred rendering pipeline every frame. I think whatever we come up with here should be written with performance in mind first and foremost. Did you measure what kind of CPU performance impact your solution has compared to the current method of rendering a normal buffer (without the effect of normal maps)?