
Blank Canvas on FaceDraw in React


Hey, I’m trying to replicate the FaceDraw demo in React, but I’m getting a blank canvas. The console shows this warning: “performance warning: READ-usage buffer was written, then fenced, but written again before being read back. This discarded the shadow copy that was created to accelerate readback.” After 256 occurrences it is followed by: “WebGL: too many errors, no more errors will be reported to the console for this context.”

I think there’s an issue with the buffers, since the FaceTracking demo (which doesn’t use buffers) works fine when I replicate it the same way.

I am using the facefilter npm package here.

PS: the code is the same as the original demo; I’ve just moved the FaceDraw logic into a helper and call it from a React component.

Here’s the component:

const AVTest = () => {
  const jeelizRef = useRef(null);
  useEffect(() => {
    if (jeelizRef.current) initJeeliz();
  }, [jeelizRef]);

  const initJeeliz = () => {
    let CVD = null;
    console.log("initJeeliz");
    JEELIZFACEFILTER.init({
      canvas: jeelizRef.current,
      NNC: NN_DEFAULT, // root of NN_DEFAULT.json file
      callbackReady: function (errCode, spec) {
        if (errCode) {
          console.log("AN ERROR HAPPENS. SORRY BRO :( . ERR =", errCode);
          return;
        }

        console.log("INFO: JEELIZFACEFILTER IS READY");
        CVD = JeelizCanvas2DHelper(spec);
        // CVD.Init_scene(spec);
      },

      // called at each render iteration (drawing loop):
      callbackTrack: function (detectState) {
        if (
          CVD.ISDETECTED &&
          detectState.detected < CVD.SETTINGS.detectionThreshold - CVD.SETTINGS.detectionHysteresis
        ) {
          // DETECTION LOST
          CVD.detect_callback(false);
          CVD.ISDETECTED = false;
        } else if (
          !CVD.ISDETECTED &&
          detectState.detected > CVD.SETTINGS.detectionThreshold + CVD.SETTINGS.detectionHysteresis
        ) {
          // FACE DETECTED
          CVD.detect_callback(true);
          CVD.ISDETECTED = true;
        }

        // render the video screen:
        CVD.GL.viewport(0, 0, CVD.CV.width, CVD.CV.height);
        CVD.GL.useProgram(CVD.SHADERVIDEO.program);
        CVD.GL.uniformMatrix2fv(CVD.SHADERVIDEO.videoTransformMat2, false, CVD.VIDEOTRANSFORMMAT2);
        CVD.GL.bindTexture(CVD.GL.TEXTURE_2D, CVD.VIDEOTEXTURE);
        CVD.GL.drawElements(CVD.GL.TRIANGLES, 3, CVD.GL.UNSIGNED_SHORT, 0);

        if (CVD.ISDETECTED) {
          const aspect = CVD.CV.width / CVD.CV.height;

          // move the cube in order to fit the head:
          const tanFOV = Math.tan((aspect * CVD.SETTINGS.cameraFOV * Math.PI) / 360); // tan(FOV/2), in radians
          const W = detectState.s; // relative width of the detection window (1-> whole width of the detection window)
          const D = 1 / (2 * W * tanFOV); // distance between the front face of the cube and the camera

          // coords in 2D of the center of the detection window in the viewport:
          const xv = detectState.x;
          const yv = detectState.y;

          // coords in 3D of the center of the cube (in the view coordinates system):
          const z = -D - 0.5; // minus because view coordinate system Z goes backward. -0.5 because z is the coord of the center of the cube (not the front face)
          const x = xv * D * tanFOV;
          const y = (yv * D * tanFOV) / aspect;

          // move and rotate the cube:
          CVD.set_mat4Position(
            CVD.MOVMATRIX,
            x,
            y + CVD.SETTINGS.pivotOffsetYZ[0],
            z + CVD.SETTINGS.pivotOffsetYZ[1]
          );
          CVD.set_mat4RotationXYZ(
            CVD.MOVMATRIX,
            detectState.rx + CVD.SETTINGS.rotationOffsetX,
            detectState.ry,
            detectState.rz
          );

          // render the canvas above:
          CVD.GL.clear(CVD.GL.DEPTH_BUFFER_BIT);
          CVD.GL.enable(CVD.GL.BLEND);
          CVD.GL.blendFunc(CVD.GL.SRC_ALPHA, CVD.GL.ONE_MINUS_SRC_ALPHA);
          CVD.GL.useProgram(CVD.SHADERCANVAS.program);
          CVD.GL.enableVertexAttribArray(CVD.SHADERCANVAS.position);
          CVD.GL.enableVertexAttribArray(CVD.SHADERCANVAS.uv);
          CVD.GL.uniformMatrix4fv(CVD.SHADERCANVAS.movMatrix, false, CVD.MOVMATRIX);
          if (CVD.PROJMATRIXNEEDSUPDATE) {
            CVD.update_projMatrix();
          }
          CVD.GL.bindTexture(CVD.GL.TEXTURE_2D, CVD.CANVASTEXTURE);
          if (CVD.CANVASTEXTURENEEDSUPDATE) {
            CVD.GL.texImage2D(
              CVD.GL.TEXTURE_2D,
              0,
              CVD.GL.RGBA,
              CVD.GL.RGBA,
              CVD.GL.UNSIGNED_BYTE,
              CVD.CANVAS2D
            );
            CVD.CANVASTEXTURENEEDSUPDATE = false;
          }
          CVD.GL.bindBuffer(CVD.GL.ARRAY_BUFFER, CVD.VBO_VERTEX);
          CVD.GL.vertexAttribPointer(CVD.SHADERCANVAS.position, 3, CVD.GL.FLOAT, false, 20, 0);
          CVD.GL.vertexAttribPointer(CVD.SHADERCANVAS.uv, 2, CVD.GL.FLOAT, false, 20, 12);
          CVD.GL.bindBuffer(CVD.GL.ELEMENT_ARRAY_BUFFER, CVD.VBO_FACES);
          CVD.GL.drawElements(CVD.GL.TRIANGLES, 6, CVD.GL.UNSIGNED_SHORT, 0);
          CVD.GL.disableVertexAttribArray(CVD.SHADERCANVAS.uv);
          CVD.GL.disableVertexAttribArray(CVD.SHADERCANVAS.position);
          CVD.GL.disable(CVD.GL.BLEND);
        }
      },
      isKeepRunningOnWinFocusLost: true,
    });
  };
  return (
    <div>
      <canvas id="jeelizCanvas" ref={jeelizRef} hidden={false} height={750} width={750} />
    </div>
  );
}
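For reference, the head-fitting math inside callbackTrack can be pulled out as a pure function and sanity-checked in isolation. This is only a sketch: fit_head is a hypothetical name, and the detectState fields (s, x, y) follow the comments in the code above.

```javascript
// Sketch of the head-fitting math from callbackTrack, as a pure function.
// detectState.s = relative width of the detection window, detectState.x/y =
// its center in the viewport; cameraFOVDeg mirrors SETTINGS.cameraFOV.
function fit_head(detectState, canvasWidth, canvasHeight, cameraFOVDeg) {
  const aspect = canvasWidth / canvasHeight;
  const tanFOV = Math.tan((aspect * cameraFOVDeg * Math.PI) / 360); // tan(FOV/2)
  const D = 1 / (2 * detectState.s * tanFOV); // camera-to-cube-front distance
  return {
    x: detectState.x * D * tanFOV,
    y: (detectState.y * D * tanFOV) / aspect,
    z: -D - 0.5, // view-space Z goes backward; -0.5 reaches the cube center
  };
}

// Example: a face filling half the width, centered in a 750x750 viewport:
const pos = fit_head({ s: 0.5, x: 0, y: 0 }, 750, 750, 40);
```

A centered detection should map to x = y = 0, with only the depth varying with the detection size — a quick way to check the canvas math without WebGL in the loop.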

And here’s the helper:

import frame from './frame.png';

const JeelizCanvas2DHelper = function (spec) {
  console.log("Helper Called");
  // SETTINGS of this demo:
  const SETTINGS = {
    strokeStyle: "red",
    rotationOffsetX: 0, // negative -> look upper. in radians
    cameraFOV: 40, // in degrees, 3D camera FOV
    pivotOffsetYZ: [0.2, 0.2], // XYZ of the distance between the center of the cube and the pivot
    detectionThreshold: 0.75, // sensitivity, between 0 and 1; lower -> more sensitive
    detectionHysteresis: 0.05,
    scale: [1, 1.24], // scale of the 2D canvas along horizontal and vertical 2D axis
    offsetYZ: [-0.1, -0.2], // offset of the 2D canvas along vertical and depth 3D axis
    canvasSizePx: 512, // resolution of the 2D canvas in pixels
  };

  // some globalz:
  let CV = null,
    CANVAS2D = null,
    CTX = null,
    GL = null,
    CANVASTEXTURE = null,
    CANVASTEXTURENEEDSUPDATE = false;
  let PROJMATRIX = null,
    PROJMATRIXNEEDSUPDATE = true;
  let VBO_VERTEX = null,
    VBO_FACES = null,
    SHADERCANVAS = null;
  let SHADERVIDEO = null,
    VIDEOTEXTURE = null,
    VIDEOTRANSFORMMAT2 = null;
  let MOVMATRIX = create_mat4Identity(),
    MOVMATRIXINV = create_mat4Identity();

  let ZPLANE = 0,
    YPLANE = 0;
  let ISDETECTED = false;

  // callback: launched if a face is detected or lost.
  function detect_callback(isDetected) {
    console.log('detect_callback');
    if (isDetected) {
      console.log("INFO in detect_callback(): DETECTED");
    } else {
      console.log("INFO in detect_callback(): LOST");
    }
  }

  //BEGIN MATRIX ALGEBRA FUNCTIONS
  function create_mat4Identity() {
    console.log('create_mat4Identity');
    return [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1];
  }

  // set the position part of a flattened transposed mat4:
  function set_mat4Position(m, x, y, z) {
    console.log('set_mat4Position');
    m[12] = x;
    m[13] = y;
    m[14] = z;
  }

  // set the rotation part of a flattened transposed mat4 - see https://en.wikipedia.org/wiki/Euler_angles
  function set_mat4RotationXYZ(m, rx, ry, rz) {
    console.log('set_mat4RotationXYZ');
    const c1 = Math.cos(rx),
      s1 = Math.sin(rx),
      c2 = Math.cos(ry),
      s2 = Math.sin(ry),
      c3 = Math.cos(rz),
      s3 = Math.sin(rz);
    // first line (not transposed)
    m[0] = c2 * c3;
    m[4] = -c2 * s3;
    m[8] = s2;

    // second line (not transposed)
    m[1] = c1 * s3 + c3 * s1 * s2;
    m[5] = c1 * c3 - s1 * s2 * s3;
    m[9] = -c2 * s1;

    // third line (not transposed)
    m[2] = s1 * s3 - c1 * c3 * s2;
    m[6] = c3 * s1 + c1 * s2 * s3;
    m[10] = c1 * c2;
  }

  // inverse a mat4 move matrix m and put result to mat4 matrix r
  function inverse_mat4MoveMatrix(m, r) {
    console.log('inverse_mat4MoveMatrix');
    // rotation part: the inverse = the transpose
    r[0] = m[0];
    r[1] = m[4];
    r[2] = m[8];

    r[4] = m[1];
    r[5] = m[5];
    r[6] = m[9];

    r[8] = m[2];
    r[9] = m[6];
    r[10] = m[10];

    // translation part: = -tR.T where T=[m[12], m[13], m[14]]
    r[12] = -(m[0] * m[12] + m[1] * m[13] + m[2] * m[14]);
    r[13] = -(m[4] * m[12] + m[5] * m[13] + m[6] * m[14]);
    r[14] = -(m[8] * m[12] + m[9] * m[13] + m[10] * m[14]);
  }

  function multiply_matVec4(m, v) {
    return [
      m[0] * v[0] + m[4] * v[1] + m[8] * v[2] + m[12] * v[3],
      m[1] * v[0] + m[5] * v[1] + m[9] * v[2] + m[13] * v[3],
      m[2] * v[0] + m[6] * v[1] + m[10] * v[2] + m[14] * v[3],
      m[3] * v[0] + m[7] * v[1] + m[11] * v[2] + m[15] * v[3],
    ];
  }

  function get_mat4Pos(m) {
    console.log('get_mat4Pos');
    return [m[12], m[13], m[14]];
  }
  //END MATRIX ALGEBRA FUNCTIONS

  //BEGIN WEBGL HELPERS
  // compile a shader:
  function compile_shader(source, glType, typeString) {
    console.log('compile_shader');
    const glShader = GL.createShader(glType);
    GL.shaderSource(glShader, source);
    GL.compileShader(glShader);
    if (!GL.getShaderParameter(glShader, GL.COMPILE_STATUS)) {
      alert("ERROR IN " + typeString + " SHADER: " + GL.getShaderInfoLog(glShader));
      console.log("Buggy shader source: \n", source);
      return null;
    }
    return glShader;
  }

  // helper function to build the shader program:
  function build_shaderProgram(shaderVertexSource, shaderFragmentSource, id) {
    console.log('build_shaderProgram');
    // compile both shader separately:
    const glShaderVertex = compile_shader(shaderVertexSource, GL.VERTEX_SHADER, "VERTEX " + id);
    const glShaderFragment = compile_shader(
      shaderFragmentSource,
      GL.FRAGMENT_SHADER,
      "FRAGMENT " + id
    );

    const glShaderProgram = GL.createProgram();
    GL.attachShader(glShaderProgram, glShaderVertex);
    GL.attachShader(glShaderProgram, glShaderFragment);

    // start the linking stage:
    GL.linkProgram(glShaderProgram);
    return glShaderProgram;
  }

  // helper function to create the projection matrix:
  function update_projMatrix() {
    console.log('update_projMatrix');
    const tan = Math.tan((0.5 * SETTINGS.cameraFOV * Math.PI) / 180),
      zMax = 100,
      zMin = 0.1,
      a = CV.width / CV.height;

    const A = -(zMax + zMin) / (zMax - zMin),
      B = (-2 * zMax * zMin) / (zMax - zMin);

    PROJMATRIX = [1.0 / tan, 0, 0, 0, 0, a / tan, 0, 0, 0, 0, A, -1, 0, 0, B, 0];

    GL.uniformMatrix4fv(SHADERCANVAS.projMatrix, false, PROJMATRIX);
    PROJMATRIXNEEDSUPDATE = false;
  }

  //END WEBGL HELPERS

  //build the 3D. called once when Jeeliz Face Filter is OK
  // function Init_scene(spec) {
    console.log('Init_scene');
    // affect some globalz:
    GL = spec.GL;
    CV = spec.canvasElement;
    VIDEOTEXTURE = spec.videoTexture;
    VIDEOTRANSFORMMAT2 = spec.videoTransformMat2;

    // create and size the 2D canvas and its drawing context:
    CANVAS2D = document.createElement("canvas");
    CANVAS2D.width = SETTINGS.canvasSizePx;
    CANVAS2D.height = Math.round((SETTINGS.canvasSizePx * SETTINGS.scale[1]) / SETTINGS.scale[0]);
    CTX = CANVAS2D.getContext("2d");

    const frameImage = new Image();
    frameImage.src = frame;
    frameImage.onload = function () {
      console.log('image loaded');
      CTX.drawImage(
        frameImage,
        0,
        0,
        frameImage.width,
        frameImage.height,
        0,
        0,
        CANVAS2D.width,
        CANVAS2D.height
      );
      update_canvasTexture();
    };
    console.log(frameImage);

    // create the WebGL texture with the canvas:
    CANVASTEXTURE = GL.createTexture();
    GL.bindTexture(GL.TEXTURE_2D, CANVASTEXTURE);
    GL.texImage2D(GL.TEXTURE_2D, 0, GL.RGBA, GL.RGBA, GL.UNSIGNED_BYTE, CANVAS2D);
    GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MAG_FILTER, GL.LINEAR);
    GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MIN_FILTER, GL.LINEAR);
    GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_WRAP_S, GL.CLAMP_TO_EDGE);
    GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_WRAP_T, GL.CLAMP_TO_EDGE);

    // create the face plane:
    const sx = SETTINGS.scale[0],
      sy = SETTINGS.scale[1]; // scale
    YPLANE = SETTINGS.offsetYZ[0] + SETTINGS.pivotOffsetYZ[0]; // offset
    ZPLANE = SETTINGS.offsetYZ[1] + SETTINGS.pivotOffsetYZ[1];
    VBO_VERTEX = GL.createBuffer();
    GL.bindBuffer(GL.ARRAY_BUFFER, VBO_VERTEX);
    GL.bufferData(
      GL.ARRAY_BUFFER,
      new Float32Array([
        // format of each vertex: x, y, z, u, v
        -sx, -sy + YPLANE, ZPLANE, 1, 1,
         sx, -sy + YPLANE, ZPLANE, 0, 1,
         sx,  sy + YPLANE, ZPLANE, 0, 0,
        -sx,  sy + YPLANE, ZPLANE, 1, 0,
      ]),
      GL.STATIC_DRAW
    );

    // FACES:
    VBO_FACES = GL.createBuffer();
    GL.bindBuffer(GL.ELEMENT_ARRAY_BUFFER, VBO_FACES);
    GL.bufferData(GL.ELEMENT_ARRAY_BUFFER, new Uint16Array([0, 1, 2, 0, 2, 3]), GL.STATIC_DRAW);

    // create the shaders:
    const copyCropVertexShaderSource =
      "attribute vec2 position;\n\
     uniform mat2 videoTransformMat2;\n\
     varying vec2 vUV;\n\
     void main(void){\n\
      gl_Position = vec4(position, 0., 1.);\n\
      vUV = vec2(0.5,0.5) + videoTransformMat2 * position;\n\
     }";

    const copyFragmentShaderSource =
      "precision lowp float;\n\
     uniform sampler2D samplerImage;\n\
     varying vec2 vUV;\n\
     \n\
     void main(void){\n\
      gl_FragColor = texture2D(samplerImage, vUV);\n\
     }";
    const shpVideo = build_shaderProgram(
      copyCropVertexShaderSource,
      copyFragmentShaderSource,
      "VIDEO"
    );
    SHADERVIDEO = {
      program: shpVideo,
      videoTransformMat2: GL.getUniformLocation(shpVideo, "videoTransformMat2"),
    };
    let uSampler = GL.getUniformLocation(shpVideo, "samplerImage");
    GL.useProgram(shpVideo);
    GL.uniform1i(uSampler, 0);

    const shpCanvas = build_shaderProgram(
      //basic 3D projection shader
      "attribute vec3 position;\n\
    attribute vec2 uv;\n\
    uniform mat4 projMatrix, movMatrix;\n\
    varying vec2 vUV;\n\
    void main(void){\n\
      gl_Position = projMatrix*movMatrix*vec4(position, 1.);\n\
      vUV = uv;\n\
    }",
      copyFragmentShaderSource,
      "CANVAS"
    );

    SHADERCANVAS = {
      program: shpCanvas,
      projMatrix: GL.getUniformLocation(shpCanvas, "projMatrix"),
      movMatrix: GL.getUniformLocation(shpCanvas, "movMatrix"),
      position: GL.getAttribLocation(shpCanvas, "position"),
      uv: GL.getAttribLocation(shpCanvas, "uv"),
    };
    uSampler = GL.getUniformLocation(shpCanvas, "samplerImage");
    GL.useProgram(shpCanvas);
    GL.uniform1i(uSampler, 0);
    GL.disableVertexAttribArray(SHADERCANVAS.position); // takes only the attribute index
    GL.disableVertexAttribArray(SHADERCANVAS.uv);
  // } //end init_scene()

  function update_canvasTexture() {
    console.log('update_canvasTexture');
    CANVASTEXTURENEEDSUPDATE = true;
  } //end update_canvasTexture()

  return {
    CV,
    CANVAS2D,
    CTX,
    GL,
    CANVASTEXTURE,
    CANVASTEXTURENEEDSUPDATE,
    PROJMATRIX,
    PROJMATRIXNEEDSUPDATE,
    VBO_VERTEX,
    VBO_FACES,
    SHADERCANVAS,
    SHADERVIDEO,
    VIDEOTEXTURE,
    VIDEOTRANSFORMMAT2,
    MOVMATRIX,
    MOVMATRIXINV,

    ZPLANE,
    YPLANE,
    ISDETECTED,
    // Init_scene,
    update_canvasTexture,
    detect_callback,
    get_mat4Pos,
    update_projMatrix,
    multiply_matVec4,
    inverse_mat4MoveMatrix,
    set_mat4RotationXYZ,
    set_mat4Position,
  };
}; //end JeelizCanvas2DHelper()

export default JeelizCanvas2DHelper;
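One detail worth noting about the helper’s return value: `let`-bound primitives are copied by value into the object literal, so a flag flipped later inside the closure (as `update_canvasTexture` does with `CANVASTEXTURENEEDSUPDATE`) is not visible on the returned object. A minimal sketch of the pitfall, with hypothetical names:

```javascript
// Minimal sketch of the copy-on-return pitfall with primitive flags.
// makeHelper mirrors the helper's shape: a closure-local boolean, a setter
// that mutates the closure variable, and a returned object that snapshots
// the flag's value at return time.
function makeHelper() {
  let NEEDSUPDATE = false;
  function request_update() {
    NEEDSUPDATE = true; // mutates the closure variable only
  }
  return {
    NEEDSUPDATE, // copied by value: stays false on the returned object
    request_update,
    get_flag: () => NEEDSUPDATE, // reads the live closure variable
  };
}

const helper = makeHelper();
helper.request_update();
// helper.NEEDSUPDATE is still the stale snapshot; helper.get_flag() is live.
```

Exposing accessor functions (or mutating properties on a shared object) avoids the stale snapshot; whether this contributes to the blank canvas here is an open question, but it is a common trap with this closure-plus-object-literal pattern.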

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 5 (3 by maintainers)

Top GitHub Comments

1 reaction
rohg007 commented, Mar 10, 2022

Hi @xavierjs, thanks for a quick response, it works like a charm now!

1 reaction
xavierjs commented, Mar 10, 2022

Hi @rohg007

There was an issue with Jeeliz FaceFilter: errors thrown in callbackTrack were silently dropped. I have fixed it and updated the npm package. I have also fixed a few small things in your demo and submitted a PR here: https://github.com/rohg007/JeelizFaceDrawReact/pull/1

Now it displays the camera video in the background and an image on the head.
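The root cause described above (exceptions inside callbackTrack being silently dropped) suggests a general defensive pattern while debugging this kind of render loop: wrap the per-frame callback so the first error surfaces instead of vanishing. A rough sketch, with `renderFrame` standing in for the real drawing code:

```javascript
// Defensive wrapper for a per-frame callback whose exceptions might be
// swallowed by the caller's render loop. Logs the first error and then
// goes quiet so the console is not flooded at 60 fps.
function make_safeTrackCallback(renderFrame) {
  let reported = false;
  return function (detectState) {
    try {
      renderFrame(detectState);
    } catch (e) {
      if (!reported) {
        console.error("callbackTrack threw:", e);
        reported = true;
      }
    }
  };
}

// usage sketch: callbackTrack: make_safeTrackCallback(myRenderFunction)
```

This does not replace the library fix, but it makes a silently failing drawing loop diagnosable from the console.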
