
Using rect(0,0,0,0) in shader examples


Currently in the documentation, when a shader is used with the intent of creating a flat image of size canvas.width by canvas.height, it is suggested to use rect(0, 0, width, height) as a way to provide the vertex and fragment shaders with geometry. For example (src, live):

function draw ()
{
    // shader() sets the active shader with our shader
    shader(theShader);

    // rect gives us some geometry on the screen
    rect(0, 0, width, height);
}

I think this approach creates a false mental model. One might get the impression (at least I did) that if you change the width and height parameters to rect, then you can control the dimensions of the image generated by the shader. In actuality, the width and height are completely irrelevant. Using values of 0 yields an identical result.

function draw ()
{
    // shader() sets the active shader with our shader
    shader(theShader);

    // rect gives us some geometry on the screen
    rect(0,0,0,0);
}

Digging into the source code a bit, I think this is because the only thing that matters is the uv coordinates. (The width and height are used only for scaling before rendering, and are not part of the vertex data (aPosition) generated for the rect geometry and passed to the vertex shader.)

p5.RendererGL.prototype.rect = function(args) {
  ...
      for (let i = 0; i <= this.detailY; i++) {
        ...
        for (let j = 0; j <= this.detailX; j++) {
          ...
          const p = new p5.Vector(u, v, 0);
          this.vertices.push(p);                       // <- same value
          this.uvs.push(u, v);                         // <- same value
        }
      }
  ...
  // Only a single rectangle (of a given detail) is cached: a square with
  // opposite corners at (0,0) & (1,1).
  // Before rendering, this square is scaled & moved to the required location.
  ...
  try {
    this.uMVMatrix.translate([x, y, 0]);
    this.uMVMatrix.scale(width, height, 1);

    this.drawBuffers(gId);
  }
  ...
}

Using rect(0,0,0,0) in the examples can help avoid a false mental model with regard to the scaling of the image generated by the shader.

.----------------------------------------------------------

So, what would one actually need to do to scale the generated image? That is, to have the image respond to the width and height parameters.

One possible approach, consistent with the current p5 implementation, is to make the shaders aware of the transformations applied by the modelView matrix.

Vertex shader

For the vertex shader, the only change needed is to modify how gl_Position is computed, as shown below:

Current basic.vert:

attribute vec3 aPosition;
attribute vec2 aTexCoord;

varying vec2 vTexCoord;

void main ()
{
    vTexCoord = aTexCoord;
    
    vec4 positionVec4 = vec4(aPosition, 1.0);

    positionVec4.xy = positionVec4.xy * 2.0 - 1.0;
    
    gl_Position = positionVec4;
}

Modified basic.vert:

attribute vec3 aPosition;
attribute vec2 aTexCoord;

uniform mat4 uProjectionMatrix;
uniform mat4 uModelViewMatrix;

varying vec2 vTexCoord;

void main ()
{
    vTexCoord = aTexCoord;

    vec4 positionVec4 = vec4(aPosition, 1.0);

    gl_Position = uProjectionMatrix * uModelViewMatrix * positionVec4;
}

Fragment shader

For the fragment shader, nothing has to change if all the coordinates used are based on the uv coordinates provided by the vertex shader.

For example, the code in basic.frag works as is:

varying vec2 vTexCoord;

void main()
{
    vec2 coord = vTexCoord;

    gl_FragColor = vec4(coord.x, coord.y, (coord.x+coord.y), 1.0 );
}

For fragment shaders that use coordinates based on gl_FragCoord, a minor adjustment can be made to use uv coordinates instead.
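To illustrate the kind of adjustment meant here, a minimal sketch is shown below. The uResolution uniform is hypothetical, standing in for however the original shader obtained the canvas size; only the coordinate line changes:

precision mediump float;

varying vec2 vTexCoord;

void main ()
{
    // Before: coordinates derived from gl_FragCoord and a hypothetical
    // resolution uniform, e.g.
    //     vec2 coord = gl_FragCoord.xy / uResolution;
    // After: use the uv coordinates interpolated from the vertex shader.
    vec2 coord = vTexCoord;

    gl_FragColor = vec4(coord.x, coord.y, (coord.x+coord.y), 1.0 );
}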
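On the sketch side, here is a minimal, untested example of how the modified shader might be used. The file names scaled.vert and basic.frag are placeholders for the modified vertex shader above and the unchanged fragment shader:

let theShader;

function preload ()
{
    // placeholder file names: the modified vertex shader and the unchanged fragment shader
    theShader = loadShader('scaled.vert', 'basic.frag');
}

function setup ()
{
    createCanvas(400, 400, WEBGL);
    noStroke();
}

function draw ()
{
    // shader() sets the active shader with our shader
    shader(theShader);

    // With uProjectionMatrix and uModelViewMatrix applied in the vertex shader,
    // the width and height arguments now determine the size of the rendered image.
    rect(0, 0, width / 2, height / 2);
}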

.----------------------------------------------------------

Here is a live demo covering everything mentioned above.

@aferriss I’m tagging you here, since you did most of the work on these shader examples. What do you think?

Issue Analytics

  • State: open
  • Created: 2 years ago
  • Reactions: 1
  • Comments: 7 (6 by maintainers)

Top GitHub Comments

1 reaction
JetStarBlues commented, May 6, 2021

Hi, just wanted to check in and mention that I do plan on creating a pull request for this issue. I haven’t been able to find as much time as I would like to tackle this.

0 reactions
JetStarBlues commented, May 28, 2021

Thinking of including the following explanation in the updated .frag files. It’s a bit fuzzy because I don’t quite understand what happens behind the scenes. Would love some thoughts on the explanation.

/* p5 uses y-coordinates that decrease as you go "upwards".
    However, conventional? y-coordinates *increase* as you go "upwards".
    As a result, fragment shaders are typically written with the assumption
    that the origin (0, 0) for the uv coordinates lies at the bottom-left.
    Below, we "flip" uv.y so that it follows this convention of increasing upwards.
*/
uv.y = 1.0 - uv.y;
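For context, here is a rough sketch of how the explanation and the flip might sit inside a fragment shader along the lines of basic.frag; the surrounding code is illustrative, not the actual updated file:

precision mediump float;

varying vec2 vTexCoord;

void main ()
{
    vec2 uv = vTexCoord;

    /* p5 uses y-coordinates that decrease as you go "upwards".
        However, conventional? y-coordinates *increase* as you go "upwards".
        As a result, fragment shaders are typically written with the assumption
        that the origin (0, 0) for the uv coordinates lies at the bottom-left.
        Below, we "flip" uv.y so that it follows this convention of increasing upwards.
    */
    uv.y = 1.0 - uv.y;

    gl_FragColor = vec4(uv.x, uv.y, (uv.x+uv.y), 1.0 );
}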