Piping h264 video stream to stdout #17

Open
sl13 opened this issue Jan 14, 2024 · 3 comments

Comments


sl13 commented Jan 14, 2024

Hi,
First of all, thank you for the great project. I am currently working with a Raspberry Pi 2 and a Logitech C925e.
Local streaming with your project worked on the first try, but unfortunately that is not what I want to achieve.
I would like to send the camera's H.264 stream to YouTube via RTMP.
I already have this working with ffmpeg and the Raspberry Pi camera, so I think the easiest route would be to use ffmpeg for the Logitech camera as well.
For that, I would have to write the H.264 camera data to stdout and pipe it into ffmpeg.
Currently I am not sure how to implement this. Where can I find the H.264 video data, and how can I write it to stdout?
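As a rough illustration of the pipe-into-ffmpeg idea (not code from this project): spawn ffmpeg as a subprocess and feed it raw bytes over stdin. The `pipe_to_command` helper and the ffmpeg arguments below are my own sketch; the RTMP URL and stream key are placeholders, and the exact ffmpeg flags for a given camera would need verifying.

```python
import subprocess

def pipe_to_command(frames, cmd):
    """Feed raw frame bytes to cmd's stdin and return whatever it writes to stdout."""
    proc = subprocess.Popen(cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
    for frame in frames:
        proc.stdin.write(frame)  # raw H.264 bytes in the real use case
    proc.stdin.close()
    out = proc.stdout.read()
    proc.wait()
    return out

# Hypothetical ffmpeg invocation: read raw H.264 from stdin, copy the
# video stream unchanged, and remux to FLV for an RTMP endpoint.
FFMPEG_CMD = ["ffmpeg", "-f", "h264", "-i", "-",
              "-c:v", "copy", "-f", "flv",
              "rtmp://a.rtmp.youtube.com/live2/STREAM_KEY"]
```

With `-c:v copy` nothing is re-encoded, which matters on a Pi 2; the heavy lifting stays in the camera's hardware encoder.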

I would be grateful for any help.
Best regards,
Sven

soyersoyer (Owner) commented

I have thought about this as well.

GStreamer also has an MJPEG/H.264 demuxer (uvch264mjpgdemux); maybe that is more suitable for you.

https://gstreamer.freedesktop.org/documentation/uvch264/uvch264mjpgdemux.html

But with this code:

Initialize the camera and the H.264 parser as in fmp4streamer.py, create an MP4Writer instead of a Streamingserver, and write to sys.stdout (its binary buffer) instead of self.wfile:

```python
mp4_writer = MP4Writer(sys.stdout.buffer, config.width(), config.height(), config.rotation(), config.timescale(), h264parser.sps, h264parser.pps)
while True:
    nalus, frame_secs, frame_usecs = h264parser.read_frame()
    mp4_writer.add_frame(nalus, frame_secs, frame_usecs)
```
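One detail worth calling out, since it is easy to trip over: `sys.stdout` is a text stream in Python 3, so raw MP4/H.264 bytes have to go through `sys.stdout.buffer`. A minimal sketch of that point, using a hypothetical `write_frame` helper rather than anything from this project:

```python
import sys

def write_frame(data, out=None):
    """Write raw frame bytes to a binary stream (stdout's binary buffer by default)."""
    out = sys.stdout.buffer if out is None else out
    n = out.write(data)
    out.flush()  # flush per frame so the downstream pipe sees data immediately
    return n
```

Flushing after every frame keeps latency low when the consumer on the other end of the pipe (e.g. ffmpeg) is reading a live stream.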


sl13 commented Jan 17, 2024

Thank you very much for your help. Unfortunately, I can't get either of the two variants to work. I have run some tests on the GStreamer variant. The following two pipelines work without problems:

```
gst-launch-1.0 -v v4l2src device=/dev/video0 ! "video/x-raw, width=640,height=480,framerate=30/1" ! videoconvert ! autovideosink
gst-launch-1.0 -v v4l2src device=/dev/video0 ! "image/jpeg,parsed=true,width=640,height=480" ! jpegdec ! videoconvert ! autovideosink
```

But when I add the demuxer "uvch264mjpgdemux" to the pipeline, it doesn't work:

```
gst-launch-1.0 -v v4l2src device=/dev/video0 ! "image/jpeg,parsed=true,width=640,height=480" ! uvch264mjpgdemux ! "video/x-h264,width=640,height=480,framerate=30/1" ! avdec_h264 ! videoconvert ! autovideosink
```

The following error is reported:

```
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Got context from element 'autovideosink0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayX11\)\ gldisplayx11-0";
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = image/jpeg, parsed=(boolean)true, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, colorimetry=(string)2:4:5:1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = image/jpeg, parsed=(boolean)true, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, colorimetry=(string)2:4:5:1
/GstPipeline:pipeline0/GstUvcH264MjpgDemux:uvch264mjpgdemux0.GstPad:jpeg: caps = image/jpeg, parsed=(boolean)true, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, colorimetry=(string)2:4:5:1
/GstPipeline:pipeline0/GstUvcH264MjpgDemux:uvch264mjpgdemux0.GstPad:sink: caps = image/jpeg, parsed=(boolean)true, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, colorimetry=(string)2:4:5:1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = image/jpeg, parsed=(boolean)true, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, colorimetry=(string)2:4:5:1
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
../libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-linked (-1)
Execution ended after 0:00:00.315406966
Setting pipeline to NULL ...
Freeing pipeline ...
```

I know the problem has nothing to do with your software, but do you have any idea what the error could be?
