
Reconsider removal of inline sRGB decode


Decided to start a new issue rather than adding comments on the closed ones, hope that’s OK.

I’d like to request a reconsideration of the inline sRGB decode removal from #23129.

The WebGL1 fallback involves forcibly disabling mipmaps if EXT_sRGB is supported, or a costly CPU-side conversion if not - both of these are significant regressions in WebGL1 functionality.

As discussed over on #22689, it seems WebGL1 support is still accepted as a current necessity. iOS is the main reason this remains an important consideration - roughly 20% of iOS users are not yet on iOS 15, the first version where WebGL2 was enabled by default (prior to that, the implementation was buggy enough that enabling it was pointless anyway).

iOS 15 uses ANGLE for its WebGL2 implementation, but with the Metal backend which is pretty new - it is usable and works correctly for a fair amount of content, but there are known bugs and regressions even in the latest iOS 15.4. Therefore sticking with WebGL1, even on the latest iOS, is one practical way to avoid the worst implementation bugs whilst Apple’s WebGL2 implementation settles down and becomes more reliable.

I understand that doing sRGB decode in the shader leads to incorrect interpolation, and without using the correct sRGB internal formats, mipmap filtering will be incorrect. I think the change in #23129 will lead to worse practical effects for WebGL1 content, especially the CPU-side conversion.

Here are some example values:

| 8-bit sRGB value | Linear value [0,1] | 8-bit linear (floating point) | 8-bit linear (rounded) | Linear -> 8-bit sRGB |
| --- | --- | --- | --- | --- |
| 7 | 0.002125 | 0.54 | 1 | 13 |
| 12 | 0.003677 | 0.94 | 1 | 13 |
| 17 | 0.005605 | 1.43 | 1 | 13 |

Essentially any pixel with an sRGB value in the range [7, 17] will end up as 1 in the CPU-converted texture, and be rounded to 13 when the shader encodes as sRGB for output (NB: I’ve assumed rounding here, whereas it looks like the CPU conversion uses Math.floor - the logic still stands but exact numbers may vary).
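
A quick way to check these numbers - a minimal sketch using the standard sRGB transfer functions (illustrative, not the actual three.js conversion code):

```js
// Standard sRGB <-> linear transfer functions, inputs/outputs in [0, 1].
function srgbToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}
function linearToSrgb(c) {
  return c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
}

// Reproduce the table: sRGB byte -> linear -> 8-bit quantized -> back to sRGB byte.
for (const srgb8 of [7, 12, 17]) {
  const linear = srgbToLinear(srgb8 / 255);                    // exact linear value
  const linear8 = Math.round(linear * 255);                    // quantizes to 1 for all three inputs
  const back8 = Math.round(linearToSrgb(linear8 / 255) * 255); // rounds to 13 for all three inputs
  console.log(srgb8, linear.toFixed(6), (linear * 255).toFixed(2), linear8, back8);
}
```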

In my view this is likely to cause much more severe artefacts than the mathematically incorrect interpolation.

Correct filtering for a sample half-way between two texels with sRGB values (7/255) and (17/255) converts each texel to linear space before blending, i.e. it should be 0.002125 * 0.5 + 0.005605 * 0.5 = 0.003865. "Incorrect" linear blending would compute (7 * 0.5 + 17 * 0.5) / 255, i.e. (12/255), and then do the sRGB conversion on that result, giving 0.003677. It's not quite right, but it's not that bad.
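
To make the comparison concrete, a small sketch of the two filtering orders (reusing the helper from the sketch above):

```js
// srgbToLinear as defined in the earlier sketch.
function srgbToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Decode each texel to linear first, then filter - what sRGB internal
// formats (SRGB8_ALPHA8 / EXT_sRGB) give you in hardware:
const decodeThenFilter = 0.5 * srgbToLinear(7 / 255) + 0.5 * srgbToLinear(17 / 255);
console.log(decodeThenFilter.toFixed(6)); // 0.003865

// Filter the raw sRGB values first, then decode - what a post-sampling
// shader decode effectively does:
const filterThenDecode = srgbToLinear((0.5 * 7 + 0.5 * 17) / 255);
console.log(filterThenDecode.toFixed(6)); // 0.003677 - close, but not identical
```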

The problem is that the linear values are just an intermediate stage (they need to be encoded back to sRGB for output anyway), and 8 bits is just not enough precision for storing them. In the shader, as long as the precision isn't lowp, you're guaranteed more precision for those intermediate results.

The overall question is what to do about sRGB in WebGL1 contexts. One option is simply not to support it - Unity only allows "linear" lighting if WebGL1 support is disabled in the build, but that means projects keeping WebGL1 end up running both WebGL1 and WebGL2 contexts in "gamma" mode, with the mathematically incorrect lighting calculations.

I think three.js’s previous behaviour of emulating the decode in the shader, post-sampling, was reasonable: it allows mathematically correct behaviour in WebGL2, plus a visually consistent fallback in WebGL1. What’s more, as the intermediate values have sufficient precision, it wouldn’t introduce banding artefacts for a simple “passthrough” unlit shader, unlike the current CPU fallback.
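
For reference, a minimal sketch of what that post-sampling decode looks like in a fragment shader (illustrative GLSL, not the exact three.js shader chunk):

```js
const fragmentShader = /* glsl */ `
  // mediump (or highp) gives the intermediate linear values more precision
  // than the 8 bits available to the CPU-converted texture.
  precision mediump float;
  uniform sampler2D map;
  varying vec2 vUv;

  // Decode sRGB to linear *after* the sampler has filtered raw sRGB texels.
  vec4 sRGBToLinear( vec4 value ) {
    vec3 lo = value.rgb / 12.92;
    vec3 hi = pow( ( value.rgb + 0.055 ) / 1.055, vec3( 2.4 ) );
    vec3 linear = mix( hi, lo, vec3( lessThanEqual( value.rgb, vec3( 0.04045 ) ) ) );
    return vec4( linear, value.a );
  }

  void main() {
    gl_FragColor = sRGBToLinear( texture2D( map, vUv ) );
  }
`;
```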

WebGL1 + EXT_sRGB looks like a reasonable “correct on WebGL1” approach, but the lack of mipmapping is annoying. sRGB-correct mips could be computed on the CPU side but there’d be a performance cost. Perhaps the best balance is to only use EXT_sRGB in the case mipmaps are explicitly disabled, rather than forcing that choice on users.
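
As a sketch of that policy in raw WebGL1 terms (not the three.js API; `image` and `useMipmaps` here are assumed inputs):

```js
const gl = canvas.getContext('webgl');
const srgbExt = gl.getExtension('EXT_sRGB');

const texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);

if (srgbExt && !useMipmaps) {
  // Hardware decodes sRGB before filtering. The EXT_sRGB spec disallows
  // generateMipmap on sRGB textures, hence the mipmap restriction.
  gl.texImage2D(gl.TEXTURE_2D, 0, srgbExt.SRGB_ALPHA_EXT,
                srgbExt.SRGB_ALPHA_EXT, gl.UNSIGNED_BYTE, image);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
} else {
  // Upload raw sRGB bytes and decode in the fragment shader post-sampling,
  // as in the shader sketch above.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
  if (useMipmaps) gl.generateMipmap(gl.TEXTURE_2D);
}
```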


Top GitHub Comments

2 reactions
Mugen87 commented, Nov 21, 2022

> I think right now three still assumes RGBA for video and does an inline decode, but the plan is to move that to sRGB internal formats too at some point in future?

Yes. Right now, video textures still use inline decodes, and also texImage2D() instead of texStorage2D(). Browser issues and problems with determining the dimensions of a video texture have prevented a migration so far (see https://github.com/mrdoob/three.js/issues/23109#issuecomment-1004640157). But I guess when we dump WebGL 1 and the related fallbacks and code paths, we are going to reconsider this issue.
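
To illustrate why dimensions matter here: texStorage2D() allocates immutable storage and needs the final size up front, which a video only reports once its metadata has loaded. A rough WebGL2 sketch (assumed setup, not the three.js internals):

```js
video.addEventListener('loadedmetadata', () => {
  const gl = canvas.getContext('webgl2');
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);

  // Allocate immutable sRGB storage once the dimensions are known...
  gl.texStorage2D(gl.TEXTURE_2D, 1, gl.SRGB8_ALPHA8,
                  video.videoWidth, video.videoHeight);

  // ...then re-upload each frame into the existing storage.
  function uploadFrame() {
    gl.texSubImage2D(gl.TEXTURE_2D, 0, 0, 0,
                     video.videoWidth, video.videoHeight,
                     gl.RGBA, gl.UNSIGNED_BYTE, video);
    requestAnimationFrame(uploadFrame);
  }
  requestAnimationFrame(uploadFrame);
});
```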

In any event, #23129 is not going to be reverted so this issue can be closed.

1 reaction
tangobravo commented, Mar 29, 2022

> The inline sRGB decode produces wrong colors, see #22483. The implementation was hacky (compared to SRGB8_ALPHA8) and only worked for built-in textures.

As @donmccurdy mentioned in that issue, that example is designed to make the problem obvious: it tests 0.5 interpolation between two vastly different pixel values. It really overstates the severity of the issue in general, and as I mentioned above, it’s even spec-compliant to do the conversion post-filtering.

In practice if those intermediate values between two pixels are important then just increasing texture resolution will reduce the error introduced by post-filtering conversion. The precision errors introduced in the CPU fallback are likely to have a much more severe real-world impact.

> TBH, I would rather recommend to remove WebGL 1 support than bringing back the sRGB inline decode. If users don’t want the current policy, they have to stick at r136.

The settled opinion from #22689 seems to be that it’s too soon to remove WebGL1 support. I’m not suggesting removing the use of SRGB8_ALPHA8 or EXT_sRGB where available - just that the inline shader decode is a much better WebGL1 fallback than a lossy CPU-side 8-bit conversion.

If the project’s view is that, from r136, WebGL1 output will be regressed in order to simplify code for WebGL2, and that more divergence can and should be expected going forward, that’s certainly a reasonable policy - but it is one that would mean us sticking at r136 for our projects until we’re prepared to give up on WebGL1.

[There are certain ecosystem issues with react-three-fiber packages that bump peer dependencies in patch versions, leading to npm install usually requiring the latest three - it’s not a three.js issue, but it does bring practical challenges to sticking to an earlier version right now.]


