
Multistreams with Green Screen Effect #310

Open — wants to merge 25 commits into base: master

Conversation

@jimver04 (Author)

Hi Vincent,

I have used the MultiStreams example to make an example for green screen effect.

Contribution:

examples/advanced-video-green-screen.html
src/components/networked-video-source-green-screen.js

Real Demo:
https://vrodos-multiplaying.iti.gr/advanced-video-green-screen.html

Tests:
Firefox (not yet tested with Chrome)

Methodology Explanation:

When you send stream: I use a canvas to process the camera stream and send the one without green pixels.
When you receive a stream (for all received streams): I use a custom shader to remove all green pixels.
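The sender-side canvas pass described above can be sketched as a pure function over the RGBA pixel data returned by `ctx.getImageData(...)`. This is an illustrative sketch, not the PR's actual code; the function name and the 0–255 threshold scale are assumptions.

```javascript
// Illustrative sketch (not the PR's actual code) of the sender-side pass:
// `data` is the RGBA Uint8ClampedArray from ctx.getImageData(...).data, and
// `threshold` plays the role of the PR's green threshold, on a 0..255 scale.
function keyOutGreen(data, threshold) {
  for (let i = 0; i < data.length; i += 4) {
    const r = data[i];
    const g = data[i + 1];
    // A pixel is "greenish" when green dominates red by more than the threshold.
    if (g - r > threshold) {
      // Make it fully transparent so it disappears from the sent frame.
      data[i] = 0;
      data[i + 1] = 0;
      data[i + 2] = 0;
      data[i + 3] = 0;
    }
  }
  return data;
}
```

The receive-side shader applies the same `g - r > threshold` test per fragment on the GPU instead of looping on the CPU.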

Video:
A video about a previous version here:
https://www.youtube.com/watch?v=WKKZUz_iKqY

More collaboration:
Let's set up a meeting for a demonstration. For the time being, I have dat.gui controls only for green pixels (g - r > threshold). On the todo list: support for other colors as well, plus YUV encoding rather than RGB, which is more useful in compositing.

Optimization:
The original camera stream is disabled to save bandwidth. I used 800x600 to keep the bandwidth around 1 Mbps for two people; I estimate about 0.5 Mbps is added per person.

Best,
Dimitrios

@vincentfretin (Member) left a comment

Quick review with some comments. I haven't looked at the example in detail yet; I'm going to run it now.

`sudo kill -9 3766`



@vincentfretin (Member)

On Linux, a process that runs in a terminal in foreground can be stopped with Ctrl+C.
Is this not working for you?

@jimver04 (Author)

The process might be started by a cron job or some other mechanism. We usually put an automatic restart in place in case the server reboots for some reason. So, in order to kill it, you must search for it.

@vincentfretin (Member)

Ok. Generally pm2 is used to deploy a Node.js app, to start it as a service. Last time I deployed easyrtc-server.js on a test server, I read https://www.digitalocean.com/community/tutorials/how-to-set-up-a-node-js-application-for-production-on-ubuntu-16-04 and used this:

$ sudo npm install -g pm2
$ pm2 start easyrtc-server.js
                          Runtime Edition

        PM2 is a Production Process Manager for Node.js applications
                     with a built-in Load Balancer.

                Start and Daemonize any application:
                $ pm2 start app.js

                Load Balance 4 instances of api.js:
                $ pm2 start api.js -i 4

                Monitor in production:
                $ pm2 monitor

                Make pm2 auto-boot at server restart:
                $ pm2 startup

                To go further checkout:
                http://pm2.io/


                        -------------

[PM2] Spawning PM2 daemon with pm2_home=/home/vincentfretin/.pm2
[PM2] PM2 Successfully daemonized
[PM2] Starting /home/vincentfretin/vr/easyrtc-server.js in fork_mode (1 instance)
[PM2] Done.
┌────────────────┬────┬──────┬────────┬───┬─────┬───────────┐
│ Name           │ id │ mode │ status │ ↺ │ cpu │ memory    │
├────────────────┼────┼──────┼────────┼───┼─────┼───────────┤
│ easyrtc-server │ 0  │ fork │ online │ 0 │ 0%  │ 29.6 MB   │
└────────────────┴────┴──────┴────────┴───┴─────┴───────────┘
 Use `pm2 show <id|name>` to get more details about an app


vincentfretin@server1:~/vr$ pm2 startup
[PM2] Init System found: systemd
[PM2] To setup the Startup Script, copy/paste the following command:
sudo env PATH=$PATH:/usr/bin /usr/lib/node_modules/pm2/bin/pm2 startup systemd -u vincentfretin --hp /home/vincentfretin

[PM2] Freeze a process list on reboot via:
$ pm2 save

[PM2] Remove init script via:
$ pm2 unstartup systemd

Show info about the app:

    pm2 show easyrtc-server

Show logs:

    tail -f /home/vincentfretin/.pm2/logs/easyrtc-server-out.log

Feel free to modify your doc to include that.

// Get port or default to 8080
const port = process.env.PORT || 8080;
// Get port or default to 5832
const port = process.env.PORT || 5832;
@vincentfretin (Member)

Please don't change the default port in the PR.

@vincentfretin (Member)

If you need to change the port, you can do this in bash in your terminal:

export PORT=5832
npm start

and you can even put that line in your ~/.bashrc, so next time you just run npm start and it will use the redefined port.

@jimver04 (Author)

Ok, thanks, I didn't know about that way!

if (data.event) {
// This will log the `message` when the entity emits the `event`.
el.addEventListener(data.event, function () {
});
@vincentfretin (Member)

dead code?

@jimver04 (Author)

Oops, it was left over from debugging. Yes, delete it.

this.video = video;
}
}
});
@vincentfretin (Member)

The idea I had was to make changes in the networked-video-source component, maybe adding a parameter or two to give the option of using a custom shader, instead of writing a specific networked-video-source-green-screen that duplicates lots of the code.

@jimver04 (Author)

My main concern was not to break the existing component, and to give you something to merge without conflict resolution. I can merge them if it's an issue.

@vincentfretin (Member)

It doesn't make much sense to me to have two almost identical components in the NAF library. It should be a properly registered shader, plus an option on the existing networked-video-source to use a custom shader. Of course the default options should leave the existing examples unchanged, so they would still use the default shader.
It depends on the level of contribution you are willing to take on. If you decide to keep it separate, then you shouldn't include it in src/components but keep it only for your example, in the examples/js directory.

@vincentfretin (Member)

Actually you shouldn't need to change networked-video-source at all. You should be able to register a new "green-screen" shader with registerShader, modify your shader to use the uniform map instead of uMap, and use on your entity networked-video-source="streamName: screen" material="shader:green-screen;GreenThresholdIn:0.02".

@jimver04 (Author)

I have gone with the last option; however:

How do I update the map with the video texture? I am completely stuck with the shader below; my map is always null. I don't have much experience with shaders, and there are only a few examples in A-Frame.

<script>
  const vertexShader1 = `
    varying vec2 vUv;

    void main() {
      vec4 worldPosition = modelViewMatrix * vec4( position, 1.0 );
      vec3 vWorldPosition = worldPosition.xyz;
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
    }
  `;

  const fragmentShader1 = `
    varying vec2 vUv;
    uniform sampler2D map;
    uniform float gthreshold;

    void main() {
      vec2 uv = vUv;
      vec4 tex1 = texture2D(map, uv * 1.0);
      if (tex1.g - tex1.r > gthreshold)
        gl_FragColor = vec4(0,0,0,0);
      else
        gl_FragColor = vec4(tex1.r,tex1.g,tex1.b,1.0);
    }
  `;

  AFRAME.registerShader('green-screen', {
    schema: {
      map: {type: 'map', is: 'uniform'},
      gthreshold: {type: 'float', value: 0.02}
    },

    uniforms: {
      // the texture value (once the texture source is loaded, update)
      map: {type: 't', value: null},
      gthreshold: {type: 'f', value: 0.02}
    },

    init: function (data) {
      this.material = new THREE.ShaderMaterial({
        uniforms: this.uniforms,
        vertexShader: this.vertexShader,
        fragmentShader: this.fragmentShader
      });
    },

    update: function (data) {
      AFRAME.utils.material.updateMap(this, data);
      this.uniforms.map.value = this.material.map;
    },

    vertexShader: vertexShader1,
    fragmentShader: fragmentShader1
  });
</script>

// This is the processed stream that we want to send
// We do not want the default camera stream (it takes bandwidth)
NAF.connection.adapter.enableCamera(false);
},5000);
@vincentfretin (Member)

Did you try setting video:false on networked-scene like I said in #296 (comment)?
As far as I understand, having video:true implies negotiating a video track with the other participants, and setting enableCamera(false) afterwards will just make the track inactive. Setting video:false will not add the default camera video track to the RTCPeerConnection at all.
I don't have two PCs to test that on my local machine. I usually test with both my PC and iPad, but on the iPad the page freezes.

@jimver04 (Author)

I will try that as well. I had totally forgotten about it, as it was a long time ago.

@jimver04 (Author)

Yes, you are right, it works with video:false.

<tr>
<td style="width:200px">
</td>
<td style="width:600px">
@vincentfretin (Member)

Why this empty cell that makes an indentation?

If you want to separate the basic examples from the advanced examples, I'm fine with it, but you should probably rewrite the h1 at the top to be just "Networked-Aframe", add an h2 "Examples", and here an h2 "Advanced Video Examples" so it's at the same level, I think. And use the same syntax as the Examples section instead of a table.

@jimver04 (Author)

I will keep the old style.

<br />
<h3>
<p>Want to help make better examples? That would be awesome! <a href="https://github.com/networked-aframe/networked-aframe/blob/master/CONTRIBUTING.md#join-the-community-on-slack" target="_blank">Contact us on Slack</a>.</p>
</h3>

@jimver04 (Author)

I think we should add a representative screenshot for each example; that makes it easier to choose one. I will fix it. I am preparing an example.

@jimver04 (Author)

I didn't have any decent screenshot. I will make a better scene in the future.

@vincentfretin (Member)

I like the dat.gui to tweak the value for the custom shader; that's great for a demo. This was working properly for me on Firefox and Chrome, but not Safari. I couldn't test the networked part of it, though.
Only GreenThresholdIn is really used; GreenThresholdOut is not used, AFAICT.

I don't know much about how A-Frame handles additional materials, but I think you can create the custom shader only once and reuse it for every networked-video-source component instance, with per-instance uniform params?

I think the way for this PR to be merged, and for the new example to be maintained in the long term, is to keep your example advanced-video-green-screen.html and have only one networked-video-source component with the necessary changes, instead of duplicating it.

I see you committed the file basic-video-mediapiped.html. I think this was your previous example that was not using the new API. You probably want to remove the file from the PR.

If possible, I think you should put an updated mediapipe example on Glitch that uses the new networked-aframe API plus this PR's changes to networked-video-source supporting the custom shader. The goal in the end is to have your mediapipe example running with minimal lines of code on a released version of networked-aframe, not a custom build.
And we could put a link to your Glitch in the examples.

vec2 uv = vUv;
vec4 tex1 = texture2D(uMap, uv * 1.0);
if (tex1.g - tex1.r > GreenThresholdIn)
gl_FragColor = vec4(0,0,0,0);
@vincentfretin (Member)

Did you try discard like I said in #291 (comment) ?

@jimver04 (Author)

Today it worked with
discard;

@vincentfretin (Member)

Look at registerShader https://aframe.io/docs/1.2.0/components/material.html#registershader
and look at the Programs count of the stats component
https://aframe.io/docs/1.2.0/components/stats.html
I think with your current code this Programs counter will keep increasing for each new participant (I can't confirm it right now), but if you properly reuse the shader, you will only have one, I think.
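The reuse idea can be sketched without the actual THREE.js/A-Frame APIs: the expensive compiled program is created once and shared by every material, while each video source keeps its own lightweight uniforms. A hypothetical sketch (names and structure are assumptions, not the library's API):

```javascript
// Hypothetical sketch of "one program, many uniform sets": the costly
// compilation happens once; every material shares the same program object
// but owns a separate uniforms object for its threshold.
let sharedProgram = null;

function compileProgram() {
  // Stand-in for the costly GLSL compilation step.
  return { id: "green-screen" };
}

function createMaterial(threshold) {
  if (!sharedProgram) sharedProgram = compileProgram();
  return {
    program: sharedProgram,               // shared: Programs count stays at 1
    uniforms: { gthreshold: threshold }   // per-instance state
  };
}
```

With this pattern, adding a participant adds a uniforms object but never a second compiled program, which is what keeps the stats Programs counter flat.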

@jimver04 (Author)

As regards the procedure of replacing pixels, it is as follows. Get my video stream and replace "green" pixels with flat RGBA = (0,1,0,1) pixels (flat green). The amount of "green" in the background is configured with GreenThresholdOut, because there are always shadows on the background; no real green screen is perfectly flat green. For this procedure I have used an HTML canvas and traversed it pixel by pixel.

When you receive all the "greenish" video streams from the other clients, you should be able to remove the green pixels. This could be done by simply saying: if RGBA == (0,1,0,1), then make it transparent (0,0,0,0). But for experimentation I have also added GreenThresholdIn, a threshold on the amount of green for the received streams (for all clients you receive a video stream from); it is mainly for debugging. For this part I have used the shader language.
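The two passes described above can be sketched as pure functions (illustrative names and an array-of-tuples pixel format chosen for clarity; this is not the PR's actual code, and values are in the 0..1 range):

```javascript
// Sender side: normalize "greenish" pixels (shadows included) to flat green,
// using thresholdOut. `px` is an array of [r, g, b, a] tuples in 0..1.
function flattenGreen(px, thresholdOut) {
  return px.map(([r, g, b, a]) =>
    g - r > thresholdOut ? [0, 1, 0, 1] : [r, g, b, a]);
}

// Receiver side: key out pixels close enough to flat green, using
// thresholdIn, turning them fully transparent. For flat green, g - r == 1.
function keyFlatGreen(px, thresholdIn) {
  return px.map(([r, g, b, a]) =>
    g - r > 1 - thresholdIn ? [0, 0, 0, 0] : [r, g, b, a]);
}
```

Splitting the work this way means the receiver only has to match one exact color, which is why a small GreenThresholdIn suffices there.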

I am planning to do something for white backgrounds as well, since many people have a white background. It would be lossy, however, as eyeballs and reflections are white. A custom color picker would also be useful; thank god there is dat.gui for that.

As regards Mediapipe for automatic selfie segmentation with AI, I will revisit it when there is an improvement from Google. The last version, 0.6, was very problematic: it worked only 50% of the time, and the error codes did not illuminate the problem.

@vincentfretin (Member)

Thank you for the explanation; it's clearer now what your loop does, replacing the greenish pixels with pure green pixels.
I completely skipped your loop when I read your code; I see now that GreenThresholdOut is used there.
You should put this explanation in comments just above your loop().

And as I understand it with your mediapipe version you did a similar thing, filling the detected background by pure green (canvasCtx.fillStyle = "#00FF00FF";). Cool.

Maybe I'm asking something obvious: is there a reason to compare the difference between red and green, instead of just checking whether green is above some threshold?

For your loop, wouldn't it be better to use setInterval instead of setTimeout? I don't know if it makes a difference for performance.
https://developer.mozilla.org/en-US/docs/Web/API/setInterval

@vincentfretin (Member)

I'll reply to myself here: setInterval would potentially execute the function again even if the previous call hadn't finished, I think.

@vincentfretin (Member)

For anyone interested in playing with the new mediapipe selfie segmentation:
https://twitter.com/TensorFlow/status/1488214410346872840?t=bEJ0d-1WnvU4lsLxhR-z5w
If you don't have a green background :)

@jimver04 (Author) commented Feb 4, 2022

As regards "Programs" in stats, you are right: it increases each time a new player enters. However, I do not know how to use shaders for video streams; there is no example anywhere.

@vincentfretin (Member)

I'll take a look at the shader code this weekend.

@jimver04 (Author) commented Feb 6, 2022

I have found how to make a green-screen shader for videos in A-Frame without needing a canvas and without dropping back to Three.js.

Look at two examples at my aframe 1.3.0 fork
Shader based: https://github.com/jimver04/aframe/tree/master/examples/test/shader-texture-video
Canvas based: https://github.com/jimver04/aframe/tree/master/examples/test/canvas-texture-video

I don't know which is faster. My GPU is an RTX 3080 and has no problem showing both at 60 fps. Is there any method to check? The stats show about the same.

[screenshot]

I will see to fixing NAF now.
Best,
D.

@jimver04 (Author) commented Feb 6, 2022

The video DOM example is not working properly in NAF, as it always shows the same video for all players. I guess I have to use the shader's embedded map.

/* global AFRAME */
/**
 * The green screen shader
 */
AFRAME.registerShader('green-screen-shader', {
  schema: {
    greenThreshold: {type: 'float', is: 'uniform', default: 0.08}
  },

  vertexShader: `
    varying vec2 vUv;

    void main() {
      vec4 worldPosition = modelViewMatrix * vec4( position, 1.0 );
      vec3 vWorldPosition = worldPosition.xyz;
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
    }
  `,

  fragmentShader: `
    varying vec2 vUv;
    uniform sampler2D map;
    uniform float greenThreshold;

    void main() {
      vec2 uv = vUv;
      vec4 tex1 = texture2D(map, uv * 1.0);
      if (tex1.g - tex1.r > greenThreshold)
        gl_FragColor = vec4(0,0,0,0);
      else
        gl_FragColor = vec4(tex1.r,tex1.g,tex1.b,1.0);
    }
  `
});

However, the map corresponds to the whole scene's texture rather than the player's video texture:

[screenshot]

@jimver04 (Author) commented Feb 6, 2022

Searching aframe.js, the only map declared for the flat shader (the shader used for videos) is src. I replaced map with src, but I get the same texture, i.e. the skysphere map, which is the first texture loaded.

…n screen effect.

- Global Threshold for all clients through a shader
- No multistreams, just adaptations on the main stream
- Changed init to update in networked-video-source.js in order to be able to change GreenThreshold with dat.gui
@jimver04 (Author) commented Feb 6, 2022

Finally, I have arrived at a simple solution. It is very robust in comparison to my other methods.

  • No secondary stream
  • No canvas for processing the personal stream before sending
  • Just apply a green-screen shader to all incoming video streams.
    Although this does not give each user the ability to configure their booth setup, it is much better as regards speed and accessibility.
    More advanced examples will follow in the future:
  1. Use a canvas per user to configure the personal stream locally before sending
  2. Check new versions of MediaPipe (fingers crossed for WebAssembly)

I think we cannot avoid Three.js, as A-Frame does not apply a shader registered with registerShader when no DOM element for the texture is defined. It applies it only globally in the scene, to the first texture it finds, e.g. to the skysphere in my case.

…n screen effect.

- Show more info for Master client: size of video stream panels
- 1st version for size of video stream panels
- Check why shader is applied only to one panel.
@vincentfretin (Member)

I haven't looked at registerShader with a custom shader using the map I talked about yet, but it's on my todo list.
I was busy with the other issues and doing the releases these last weeks. ;-)

What are you up to in the latest commits?
FYI, you're working on the master branch of your fork, so all the commits end up in this PR.
For future PRs you may want to create a branch instead.

@jimver04 (Author) commented Mar 12, 2022

Hi Vincent,

My crazy idea is to do remote virtual productions for shooting movies while the actors are at their homes. It is very risky, but I have managed to attract the interest of the local community of actors and musicians through a demo, as well as to persuade the project managers. Actors will participate through their smartphones from home, with a green screen background* and a page with low rendering needs. They only have to press 3 keys to be transferred to their acting position, so as to avoid 3D navigational stress. I will be in the same room as the director, with my RTX 3080 card rendering the full scene:

[Screenshot: Director high-end PC]

Then I stream the overall scene back with the "Screen sharing" feature that you recently added, so that the actors have feedback:

[Screenshot: Actor smartphone]

Most of the development is on the Director page. I have followed your advice to avoid using a canvas on the sender side and to use only shaders on the receiver side. I have changed the RGB algorithm to YUV space with double thresholds, which is more commonly used in chroma key removal (see https://www.shadertoy.com/view/MlVXWD). I have also added these thresholds on the Director page, together with the choice of chroma key color (it can also be blue or any other color). I have added dat.gui interfaces per actor, so that each actor can have a separate chroma key and different thresholds. I have also added parameters to change the width and height of each actor's plane, so that I can normalize the size of the actor in the scene.
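The YUV double-threshold test mentioned above can be sketched in plain JavaScript (BT.601 chroma coefficients; function names and the linear alpha ramp are illustrative assumptions, not the PR's actual shader):

```javascript
// Convert an [r, g, b] pixel (0..1) to its BT.601 chroma components (U, V).
// Comparing only chroma means brightness changes (shadows) matter less.
function rgbToUV([r, g, b]) {
  const u = -0.14713 * r - 0.28886 * g + 0.436 * b;
  const v = 0.615 * r - 0.51499 * g - 0.10001 * b;
  return [u, v];
}

// Euclidean distance between a pixel's chroma and the key color's chroma.
function chromaDistance(rgb, keyRgb) {
  const [u1, v1] = rgbToUV(rgb);
  const [u2, v2] = rgbToUV(keyRgb);
  return Math.hypot(u1 - u2, v1 - v2);
}

// Double threshold: fully transparent within `inner` of the key color,
// fully opaque beyond `outer`, with a linear alpha ramp in between
// for soft edges.
function keyAlpha(rgb, keyRgb, inner, outer) {
  const d = chromaDistance(rgb, keyRgb);
  if (d <= inner) return 0;
  if (d >= outer) return 1;
  return (d - inner) / (outer - inner);
}
```

Because `keyRgb` is a parameter, the same function keys blue or any other chroma color, which matches the per-actor chroma key controls described above.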

Following that advice, the code became much simpler. I am working only with shaders, and it works well. I am using a modified version of your adapter, and I have deleted mine.

Of course, the nice ship 3D model is excluded from all the commits, as I bought it from a well-known site.
The water is done with the A-water component (https://github.com/Dante83/a-water), which is open source.

[screenshot]

To summarize: ignore this pull request. I will prepare another branch, depending on what you want from all of the above.
Any comments? Ideas? My future plan is to add some movement for the actors' planes, to give them the ability to walk in the scene. It is not possible for them to touch their smartphones, so it has to be done on the Director side.

Best,
Dimitrios

*Web AI for automatic background removal (Mediapipe) is unstable and computationally expensive for smartphones, plus it leaves artifacts of the background. We will share with the actors some low-cost "creator studios" consisting of a green cloth sheet and a smartphone stand (~70 euros). Of course some DIY green-screen solutions can work too; I found that green pieces of paper can do the job if the lighting is medium to good.

@vincentfretin (Member)

Nice project! Thanks for keeping us posted about the things you're trying; this is interesting.
I'll try to tinker a bit with the custom shader on my side and let you know.
