Filters in retina display have wrong UV
See the original GitHub issue (pixijs #6266). I’m using pixi 5.1.1.
Writing a simple vignette shader, I can see that the built-in varying representing the UVs is wrong in the .y component.
If my display is non-retina, or if it is retina but I initialise the Application with resolution = 1, the shader behaves as expected. If I’m on a retina display, then the .y component needs to be divided by 2.
const noRetina = true; // manually change the resolution on a retina display (on/off)
const opts = {
width: 1920,
height: 1080,
resolution: noRetina ? 1 : global.devicePixelRatio,
autoDensity: !noRetina,
backgroundColor: 0x000000,
antialias: false,
view,
};
const app = new Application(opts);
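For context on what these two options do: resolution multiplies the size of the canvas backing store, while autoDensity scales the element back down to its logical size via CSS. A quick sketch of the resulting sizes (the helper name is mine, for illustration only):

```javascript
// Backing-store vs CSS size for a Pixi Application, mirroring what
// resolution + autoDensity do: the canvas buffer holds
// width*resolution x height*resolution pixels, while CSS keeps the
// element at the logical width x height.
function canvasSizes(width, height, resolution) {
  return {
    buffer: [width * resolution, height * resolution],
    css: [width, height],
  };
}

// On a retina display (devicePixelRatio = 2):
// canvasSizes(1920, 1080, 2) -> buffer [3840, 2160], css [1920, 1080]
```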
Fragment shader
precision mediump float;

varying vec2 vTextureCoord;
uniform sampler2D uSampler;
uniform float size;
uniform float amount;

void main() {
  vec3 color = texture2D(uSampler, vTextureCoord).rgb;
  float dist = distance(vTextureCoord, vec2(0.5, 0.5));
  color *= smoothstep(0.8, size * 0.799, dist * (1.0 * amount + size));
  gl_FragColor = vec4(color, 1.0);
}
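A likely explanation of the symptom: in PixiJS v5, a filter's input texture can be larger than the area actually rendered into it, so vTextureCoord does not span [0, 1] across the screen. Assuming the standard v5 filter uniforms inputSize and outputFrame are available (they are injected into default filters), the usual correction in GLSL is uv = vTextureCoord * inputSize.xy / outputFrame.zw. The same arithmetic in plain JS, as a sanity check of the mapping:

```javascript
// Mirrors the GLSL conversion for Pixi v5 filters:
//   vec2 uv = vTextureCoord * inputSize.xy / outputFrame.zw;
// vTextureCoord covers only the used portion of the (possibly larger)
// input framebuffer; rescaling by inputSize over the output frame size
// maps the visible area back onto [0, 1] in both axes.
function toNormalizedUv(vTextureCoord, inputSize, outputFrameSize) {
  return [
    (vTextureCoord[0] * inputSize[0]) / outputFrameSize[0],
    (vTextureCoord[1] * inputSize[1]) / outputFrameSize[1],
  ];
}

// Example: a 1920x1080 frame rendered into a 2048x2048 texture.
// vTextureCoord at the bottom-right corner is (1920/2048, 1080/2048),
// i.e. its .y tops out near 0.53 -- which matches the observed
// "y needs to be divided by 2". Normalizing recovers (1, 1).
```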
Issue Analytics
- Created: 4 years ago
- Reactions: 1
- Comments: 9 (2 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Wait, I remember one more cool thing!
If you need something like a vignette that just “masks” the container inside, you can just draw a mesh on top, without filters. A mesh is easier to make and it won’t require an extra framebuffer: https://pixijs.io/examples/#/mesh-and-shaders/triangle-textured.js
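The mesh route from the linked example boils down to drawing a full-screen quad with its own shader over the stage. A sketch of the geometry data such a quad needs (helper name is mine; no Pixi calls, so this is only the data a PIXI.Mesh geometry would be built from):

```javascript
// Vertex data for a full-screen quad (two triangles) that a mesh
// could draw on top of the stage. Positions are in stage pixels,
// UVs span the full [0, 1] range regardless of resolution -- which
// is exactly the property the filter's vTextureCoord lacks here.
function fullScreenQuad(width, height) {
  return {
    positions: [0, 0, width, 0, width, height, 0, height],
    uvs: [0, 0, 1, 0, 1, 1, 0, 1],
    indices: [0, 1, 2, 0, 2, 3],
  };
}
```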
Hi @ivanpopelyshev, thanks for your reply.
Let’s do this by parts, I don’t want to extend this issue longer than it needs to be.
The screenshots above were taken with the exact same screen with the exact same width and height; only the `resolution` and `autoDensity` properties changed. Assuming I was creating a 1920px x 1080px canvas with `resolution=1` and `autoDensity=false`, everything seems to be “on scale”. The vignette gets “distorted” as it’s mapping the corners of the screen. So far, so good. If I then changed to `resolution=2` and `autoDensity=true`, I would have a 3840px x 2160px canvas that was scaled down via CSS to 1920px x 1080px. Shouldn’t I have the same-looking vignette? What I have now is a perfect circle, not a distorted one. So it’s either `autoDensity` or `resolution` enabling such a difference, no?
I think I do; the effect above is just so I can illustrate that the UV coordinates I’m getting aren’t the ones I’m expecting. The effect I want to achieve is a mix of what’s on the stage and some post-processing on top, so I do need an input texture of what’s on the screen (`uniform sampler2D uSampler;`). I just covered it red so I’m not sharing a client’s confidential work 😃
For example, applying a filter to the stage: if the content gets drawn into a framebuffer and then used as a texture via `uniform sampler2D uSampler;`, shouldn’t I be able to properly see the UVs mapping the corners of the canvas element by doing something like `gl_FragColor = vec4(vTextureCoord, 1.0, 1.0);`?
I do appreciate you took the time to explain a little bit about conversion functions, but reading that article it’s not quite clear (to me) which varying to use and when to “convert” it. I would expect something like what we can do in ShaderToy, `vec2 uv = fragCoord / iResolution.xy;`, to map to the corners of my canvas element, and it seems like `vTextureCoord` only does this if `autoDensity=false` and `resolution=1` (first screenshot).
I’m happy to accept that I’m missing something, and that all the information is in that article, but maybe just take this thread as constructive criticism that the article might not be as clear as it could be.