
Implementation in Video(React Native) #258

Open
ShashwatMDas opened this issue Jul 5, 2020 · 6 comments
Labels
Expo this is related to Expo video

Comments

@ShashwatMDas

ShashwatMDas commented Jul 5, 2020

Implementation Issue

So I have been trying to implement gl-react over videos in React Native (Expo CLI). I followed this example.

Expected behavior

Video should play with gl-react layers.

Actual behavior

However, I get this error (copying the files exactly as they are):

blob:https://web.whatsapp.com/ccfde4db-5bbe-467e-a749-14dbd05dfc7d

So I tried changing the code a bit (only index.js):

// @flow
import React, { useRef, useEffect } from "react";
import { Shaders, GLSL, Node } from "gl-react";
import { Video } from 'expo-av';
import VideoPlayer from 'expo-video-player'
import { Surface } from "gl-react-expo";
import raf from "raf";
import videoMP4 from "./videoMP4.mp4";
export { videoMP4 };
export const VideoContext: React$Context<?HTMLVideoElement> = React.createContext();
import {Dimensions} from 'react-native'


const width = Dimensions.get('window').width;
const height = Dimensions.get('window').height;

// We implement a component <VideoPlay> that is like <video>
// but provides an onFrame hook so we can efficiently render
// only when the frame effectively changes.
export const VideoPlay = ({ onFrame, ...rest }: { onFrame: number => void }) => {
  const video = useRef();

  useEffect(() => {
    let handle;
    let lastTime;

    const loop = () => {
      handle = raf(loop);
      if (!video.current) return;
      const currentTime = video.current.currentTime;
      // Optimization: only call onFrame when the time actually changes
      if (currentTime !== lastTime) {
        lastTime = currentTime;
        onFrame(currentTime);
      }
    };
    handle = raf(loop);

    return () => raf.cancel(handle);
  }, [onFrame]);

  return (
    <VideoContext.Provider value={video}>
      <VideoPlayer
        {...rest}
        ref={video}
        videoProps={{
          shouldPlay: true,
          resizeMode: Video.RESIZE_MODE_CONTAIN,
          source: require('./videoMP4.mp4'),
        }}
        height={height}
        inFullscreen={true}
      />
    </VideoContext.Provider>
  );
};

// Our example will simply split R G B channels of the video.
const shaders = Shaders.create({
  SplitColor: {
    frag: GLSL`
precision highp float;
varying vec2 uv;
uniform sampler2D children;
void main () {
  float y = uv.y * 3.0;
  vec4 c = texture2D(children, vec2(uv.x, mod(y, 1.0)));
  gl_FragColor = vec4(
    c.r * step(2.0, y) * step(y, 3.0),
    c.g * step(1.0, y) * step(y, 2.0),
    c.b * step(0.0, y) * step(y, 1.0),
    1.0);
}`
  }
  //^NB perf: in the fragment shader paradigm, we want to avoid code branches (if / for)
  // and prefer built-in functions, just giving the GPU some computation.
  // step(a, b) is an alternative to if(): it returns 1.0 if a <= b, 0.0 otherwise.
});
const SplitColor = ({ children }) => (
  <Node shader={shaders.SplitColor} uniforms={{ children }} />
);
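As a side note, the branch-free masking in the SplitColor shader can be emulated in plain JavaScript to check which channel survives in each horizontal third. This is a sketch for illustration only (the `step` and `splitColor` helper names are mine, not part of gl-react):

```javascript
// GLSL's step(edge, x) returns 0.0 if x < edge, else 1.0.
const step = (edge, x) => (x < edge ? 0 : 1);

// Emulates the SplitColor fragment shader for a single pixel:
// uvY in [0, 1] is the vertical texture coordinate, [r, g, b] the input color.
function splitColor(uvY, [r, g, b]) {
  const y = uvY * 3.0;
  return [
    r * step(2.0, y) * step(y, 3.0), // red survives only in the top band
    g * step(1.0, y) * step(y, 2.0), // green only in the middle band
    b * step(0.0, y) * step(y, 1.0), // blue only in the bottom band
  ];
}
```

For example, a white pixel at uvY = 0.9 comes out pure red, at 0.5 pure green, and at 0.1 pure blue, matching the three stacked bands the shader produces.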

// We now use <VideoPlay> in our GL graph.
// The texture we give to <SplitColor> is a (redraw) => <VideoPlay> function.
// redraw is passed to the VideoPlay onFrame event so the Node gets redrawn on each video frame.
const VideoShow = () => (
  <Surface style={{ height, width }} pixelRatio={1}>
    <SplitColor>
      {redraw => <VideoPlay onFrame={redraw} autoPlay loop />}
    </SplitColor>
  </Surface>
);

export default VideoShow;
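The "only call onFrame when currentTime changes" trick in the effect above can be pulled out into a plain function, independent of raf and React. A small sketch (`makeFrameNotifier` is a hypothetical name, not a gl-react API):

```javascript
// Wraps an onFrame callback so redundant ticks are dropped: raf fires ~60x/s
// but the video's currentTime only moves when a new frame is decoded.
function makeFrameNotifier(onFrame) {
  let lastTime;
  return (currentTime) => {
    if (currentTime !== lastTime) {
      lastTime = currentTime;
      onFrame(currentTime);
    }
  };
}
```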
 

Now I get a completely black screen as the layer over the video. The video definitely plays, since the audio comes through; it seems the GL layer over it just renders black. This warning comes up: "Node#138(colorify#50), uniform children: child is not renderable. Got:, null"

@magiccDev

I am also having a similar issue with a webcam feed in React Native.

@gre gre added the Expo this is related to Expo label Dec 27, 2020
@enigmablue

I'm trying this with react-native-video but it doesn't pick it up either (not on Expo).

child is not renderable. Got:', null

Any idea?

@enigmablue

enigmablue commented Jan 25, 2021

@gre I wonder if you could advise on the best way this could be fixed?

I can see there is gl-react-image (https://github.com/gre/gl-react-image). Is there a need for something similar for video in gl-react-native? And for the uniform to pick it up?

@bastienrobert

Related to #215

@wcandillon
Contributor

Is the issue that we cannot access the frame of <Video> from Expo? Would using https://github.com/gre/react-native-view-shot to capture the video frame make sense, or would that be too slow?

@enigmablue

I think the underlying GL Node simply doesn't support video surfaces, which I think are fundamentally different from images.

I 'resolved' it by showing/rendering the preview version of the video, which I have in GIF format, instead of trying to put the gl-react filter on the actual mp4 video. Afterwards I have to use ffmpeg filters to manually 'synchronize' with the gl-react filter.

In any case, it would have been inefficient to try to render all the different filter options shown to the user on the actual mp4 video.
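For reference, the ffmpeg side of such a workaround could look like the sketch below. This is my assumption, not something from the thread: it approximates the SplitColor look offline by cropping the video into horizontal thirds, zeroing two channels per band with lutrgb, and stacking the bands back together. The filenames and the `buildSplitColorCmd` helper are illustrative only:

```javascript
// Builds an ffmpeg argument list approximating the SplitColor shader.
function buildSplitColorCmd(input, output) {
  const filter = [
    "[0:v]split=3[top][mid][bot]",                  // three copies of the video
    "[top]crop=iw:ih/3:0:0,lutrgb=g=0:b=0[r]",      // top third: keep red only
    "[mid]crop=iw:ih/3:0:ih/3,lutrgb=r=0:b=0[g]",   // middle third: keep green
    "[bot]crop=iw:ih/3:0:2*ih/3,lutrgb=r=0:g=0[b]", // bottom third: keep blue
    "[r][g][b]vstack=inputs=3[out]",                // stack the bands vertically
  ].join(";");
  return ["ffmpeg", "-i", input, "-filter_complex", filter, "-map", "[out]", output];
}
```

Running the returned command with child_process would then render the filtered mp4 without any GL involvement.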
