Implementation in Video (React Native)
See original GitHub issue
Implementation Issue
So I have been trying to implement gl-react over videos in React Native (Expo CLI). I followed this example.
Expected behavior
Video should play with gl-react layers.
Actual behavior
However, copying the files exactly as they are, I get an error (reported only as a screenshot, not reproduced here).
So I tried changing the code a bit (only index.js):
// @flow
import React, { useRef, useEffect } from "react";
import { Shaders, GLSL, Node } from "gl-react";
import { Video } from "expo-av";
import VideoPlayer from "expo-video-player";
import { Surface } from "gl-react-expo";
import raf from "raf";
import { Dimensions } from "react-native";
import videoMP4 from "./videoMP4.mp4";
export { videoMP4 };

export const VideoContext: React$Context<?HTMLVideoElement> = React.createContext();

const width = Dimensions.get("window").width;
const height = Dimensions.get("window").height;

// We implement a <VideoPlay> component that wraps the player like the web
// example wraps <video>, but provides an onFrame hook so we only re-render
// the GL node when the current time effectively changes.
export const VideoPlay = ({ onFrame, ...rest }: { onFrame: number => void }) => {
  const video = useRef();
  useEffect(() => {
    let handle;
    let lastTime;
    const loop = () => {
      handle = raf(loop);
      if (!video.current) return;
      // NB: this mirrors the web example; on native the player ref may not
      // expose a `currentTime` field the way an HTMLVideoElement does.
      const currentTime = video.current.currentTime;
      // Optimization: only call onFrame when the time actually changes
      if (currentTime !== lastTime) {
        lastTime = currentTime;
        onFrame(currentTime);
      }
    };
    handle = raf(loop);
    return () => raf.cancel(handle);
  }, [onFrame]);
  return (
    <VideoContext.Provider value={video}>
      <VideoPlayer
        {...rest}
        ref={video}
        videoProps={{
          shouldPlay: true,
          resizeMode: Video.RESIZE_MODE_CONTAIN,
          source: require("./videoMP4.mp4"),
        }}
        height={height}
        inFullscreen={true}
      />
    </VideoContext.Provider>
  );
};
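// --- Not part of the original issue: a hedged alternative sketch. ---
// `video.current.currentTime` above is carried over from the web example,
// but an expo-video-player / expo-av ref is not an HTMLVideoElement, so it
// may never expose `currentTime`. If all we need is a redraw trigger, one
// option (an assumption, not a confirmed fix) is to drive onFrame from
// expo-av's onPlaybackStatusUpdate callback, which reports positionMillis.
// Note this only reports playback time; it does not give gl-react access
// to the decoded video pixels.
export const VideoPlayStatusDriven = ({ onFrame, ...rest }: { onFrame: number => void }) => (
  <VideoPlayer
    {...rest}
    videoProps={{
      shouldPlay: true,
      resizeMode: Video.RESIZE_MODE_CONTAIN,
      source: require("./videoMP4.mp4"),
      // expo-av calls this on every playback status tick
      onPlaybackStatusUpdate: status => {
        if (status.isLoaded) onFrame(status.positionMillis / 1000);
      },
    }}
    height={height}
    inFullscreen={true}
  />
);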
// Our example simply splits the R, G, B channels of the video.
const shaders = Shaders.create({
  SplitColor: {
    frag: GLSL`
precision highp float;
varying vec2 uv;
uniform sampler2D children;
void main () {
  float y = uv.y * 3.0;
  vec4 c = texture2D(children, vec2(uv.x, mod(y, 1.0)));
  gl_FragColor = vec4(
    c.r * step(2.0, y) * step(y, 3.0),
    c.g * step(1.0, y) * step(y, 2.0),
    c.b * step(0.0, y) * step(y, 1.0),
    1.0);
}`,
  },
  // ^NB perf: in the fragment shader paradigm we want to avoid code branches
  // (if / for) and prefer built-in functions, just giving the GPU some
  // computation. step(a, b) is an alternative to if(): it returns 1.0 if
  // a <= b, 0.0 otherwise.
});

const SplitColor = ({ children }) => (
  <Node shader={shaders.SplitColor} uniforms={{ children }} />
);
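// --- Not part of the original example: for readability only. ---
// The three step() products above are a branch-free way of selecting one
// horizontal band per channel. Written with branches, the same selection
// would look like this (the step() form is preferred in fragment shaders):
const fragSplitColorWithBranches = GLSL`
precision highp float;
varying vec2 uv;
uniform sampler2D children;
void main () {
  float y = uv.y * 3.0;
  vec4 c = texture2D(children, vec2(uv.x, mod(y, 1.0)));
  float r = 0.0;
  float g = 0.0;
  float b = 0.0;
  if (y >= 2.0) r = c.r;      // top third keeps only red
  else if (y >= 1.0) g = c.g; // middle third keeps only green
  else b = c.b;               // bottom third keeps only blue
  gl_FragColor = vec4(r, g, b, 1.0);
}`;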
// We now use <VideoPlay> in our GL graph.
// The texture we give to <SplitColor> is a (redraw) => <VideoPlay> function.
// redraw is passed to the VideoPlay onFrame event, so the Node gets redrawn
// on each video frame.
const VideoShow = () => (
  <Surface style={{ height: height, width: width }} pixelRatio={1}>
    <SplitColor>
      {redraw => <VideoPlay onFrame={redraw} autoPlay loop />}
    </SplitColor>
  </Surface>
);
export default VideoShow;
Now I get a completely black screen as the layer over the video. The video definitely plays, since the audio comes through, but the GL layer over it renders as black only. This warning appears: Node#138(colorify#50), uniform children: child is not renderable. Got:, null
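As a sanity check (a sketch not taken from the thread, assuming gl-react-expo accepts a require()d image as a sampler2D uniform; "./frame.png" is a hypothetical placeholder asset), one can rule out the shader itself by feeding the SplitColor node a static image texture instead of the video child:
// Debugging sketch: bypass the video child and give the shader a plain
// image texture, to check that the Surface + SplitColor pipeline renders
// on native at all. "./frame.png" is a hypothetical placeholder asset.
const SplitColorImageTest = () => (
  <Surface style={{ height: height, width: width }} pixelRatio={1}>
    <Node
      shader={shaders.SplitColor}
      uniforms={{ children: require("./frame.png") }}
    />
  </Surface>
);
If this renders the three color bands, the shader side is fine, and the problem is that on native the function child ends up handing the Node null instead of a renderable texture, which is consistent with the "child is not renderable. Got:, null" warning.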
Issue Analytics
- Created 3 years ago
- Comments: 6 (1 by maintainers)
Top Results From Across the Web
Adding videos to React Native with react-native-video
Follow this guide to learn how to implement video into your React Native app using the most common package, react-native-video.
react-native-video - npm
A <Video> element for react-native. Latest version: 5.2.1, last published: 4 months ago. Start using react-native-video in your project by running ...
React Native Video - GitHub
Version 6.0.0 is introducing dozens of breaking changes, mostly through updated dependencies and significant refactoring. While the API remains compatible, the ...
Add Video to Your React Native App Using react-native-video
react-native-video is basically a package that allows a video component to be shown/used within React Native. It offers various functionalities, such as ...
React Native Video Library to Play Video in Android and iOS
React Native Video Player to Play Any Video in React Native App for Android and iOS. Use online Video URL or local video ...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I think the underlying GL Node simply doesn't support video surfaces, which I think are fundamentally different from images.
I ‘resolved’ it by showing/rendering the preview version of the video, which I have in GIF format, instead of trying to put the gl-react filter on the actual mp4 video. Then afterwards I have to use ffmpeg filters to manually ‘synchronize’ with the gl-react filter.
In any case, it would have been inefficient to try to render all the different filter options to show the user on the actual mp4 video.
Is the issue that we cannot access the frame in <Video> from Expo? Would using https://github.com/gre/react-native-view-shot to capture the video frame make sense, or would that be too slow? (A rough sketch of that idea follows.)
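A rough sketch of that last suggestion, assuming react-native-view-shot's captureRef API and that gl-react-expo can load a local file URI as a texture. videoViewRef, setFrameUri, renderVideo and the 100 ms interval are illustrative names and values introduced here, not from the thread, and the interval is almost certainly too coarse for smooth video:
// Rough sketch (not a confirmed solution): periodically snapshot the native
// video view with react-native-view-shot and feed the captured image URI to
// the GL node as its texture. `shaders`, `height` and `width` refer to the
// definitions from index.js above; the video itself is rendered by the
// caller through the `renderVideo` render prop so a ref can be attached.
import React, { useRef, useState, useEffect } from "react";
import { captureRef } from "react-native-view-shot";
import { Surface } from "gl-react-expo";
import { Node } from "gl-react";

const SnapshotFilteredVideo = ({ renderVideo }) => {
  const videoViewRef = useRef(); // ref attached to the view wrapping the video
  const [frameUri, setFrameUri] = useState(null);
  useEffect(() => {
    // Illustrative 100 ms polling interval; likely too slow for smooth video.
    const id = setInterval(async () => {
      try {
        const uri = await captureRef(videoViewRef, {
          format: "jpg",
          quality: 0.7,
          result: "tmpfile",
        });
        setFrameUri(uri);
      } catch (e) {
        // capture can fail while the view is not laid out yet; ignore and retry
      }
    }, 100);
    return () => clearInterval(id);
  }, []);
  return (
    <>
      {renderVideo(videoViewRef)}
      {frameUri && (
        <Surface style={{ height: height, width: width }} pixelRatio={1}>
          <Node
            shader={shaders.SplitColor}
            uniforms={{ children: { uri: frameUri } }}
          />
        </Surface>
      )}
    </>
  );
};
Whether per-frame capture like this is fast enough is exactly the open question raised above; it also re-encodes every captured frame to a file, so it is more of a diagnostic than a production approach.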