
Problems when operating two cameras at the same time #137

Open
nabetayam opened this issue Jan 19, 2022 · 17 comments
Assignees
Labels
type:question User question

Comments

@nabetayam

I connected two UVC cameras to a Raspberry Pi and started a separate ustreamer instance for each on different ports.
However, when I use two cameras of the same model at the same time, the stream of the camera started first is normal, but the stream of the camera started second is choppy or not displayed at all.

For example, suppose I have two cameras of the same model, camera A and camera B.
If I start camera A first on port 8080, it streams normally, but when I then start camera B on port 8081, its stream is choppy.
If I start camera B first on port 8081, it streams normally, but when I then start camera A on port 8080, its stream is choppy.

  • The stream is choppy in Firefox; in Chrome it is not displayed at all.

I checked with iptraf.
The amount of packets transferred for the camera started first is large, while the amount for the camera started second is abnormally small, so I suspect this imbalance in streaming data transfer is the cause.

  • The imbalance in packet transfer volume is the same for Firefox and Chrome.
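
The same comparison can also be made without iptraf, for example along these lines (a rough sketch; it assumes the streams leave the Pi on eth0):

$ # capture both streaming ports for a while (stop with Ctrl-C); assumes the LAN interface is eth0
$ sudo tcpdump -i eth0 -nn -w both.pcap 'tcp port 8080 or tcp port 8081'
$ # then compare per-connection packet/byte counts for the two ports
$ tshark -r both.pcap -q -z conv,tcp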

So far, I have confirmed the following.

  1. If one of the above cameras is paired with a camera of a different model, the camera started second does not become choppy.
  2. With two cameras of a different model than the above, the packet transfer is well balanced between the two cameras, and the stream of the camera started second does not become choppy.

The cameras that are having problems are:
https://www.watec.co.jp/English/e_USB.html
MODEL: WAT-01U2

In addition, the camera controller chip used internally by this camera is as follows.
https://www.nisshinbo-microdevices.co.jp/ja/products/usb-camera-controller/
USB2.0 Camera Controller LSI R5U8710

If you have any ideas for a solution, please let me know.

@mdevaev
Member

mdevaev commented Jan 20, 2022

Please see this: #99

@mdevaev mdevaev self-assigned this Jan 20, 2022
@mdevaev mdevaev added the type:question User question label Jan 20, 2022
@nabetayam
Author

Thanks for your reply.

I already knew about #99, but I asked because my situation is different.

#99 states
 It appears the issue may be related to the USB hub on the Pi4

The difference is that my environment does not use an external USB hub; the cameras are connected directly to the Raspberry Pi.

Also, with other camera models, even when two UVC cameras with different resolutions are connected at the same time, the packets are transferred in a well-balanced way between the two cameras and there is no problem with streaming.

If this software cannot be used because of this camera's specifications, then that can't be helped; I would just like to know whether that is the case.

But you are busy, so it is hard to get answers to my questions.
I will give up on what I was trying to do with this software.
Thank you.

@mdevaev
Member

mdevaev commented Jan 20, 2022

The Raspberry Pi already contains a USB hub internally. The USB ports you connect your cameras to are not direct chip interfaces; they are connected to a PCIe USB controller.

Let's start with debugging. Show the command-line options you use to launch ustreamer and the logs for each camera.

I'm not refusing to help you, I just don't fully understand the root of the problem.
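
For reference, you can also check which controller and hub each camera actually sits behind, e.g. (a quick sketch; the exact tree depends on the board and OS):

$ # prints the USB topology with drivers and negotiated speeds (480M = USB 2.0)
$ lsusb -t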

@nabetayam
Author

Thank you for your reply.

Now, let me give you some details about the current situation.

1. The following two cameras are connected to the Raspberry Pi (both are the same model; the listing below is for /dev/video0).

$ v4l2-ctl -d /dev/video0 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture

	[0]: 'MJPG' (Motion-JPEG, compressed)
		Size: Discrete 640x480
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 160x120
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 176x144
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 320x240
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1280x800
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1280x720
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1280x1024
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1920x1080
			Interval: Discrete 0.033s (30.000 fps)
	[1]: 'YUYV' (YUYV 4:2:2)
		Size: Discrete 640x480
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 160x120
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 176x144
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 320x240
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1280x800
			Interval: Discrete 0.100s (10.000 fps)
		Size: Discrete 1280x720
			Interval: Discrete 0.100s (10.000 fps)
		Size: Discrete 1280x1024
			Interval: Discrete 0.133s (7.500 fps)
		Size: Discrete 1920x1080
			Interval: Discrete 0.200s (5.000 fps)

2. Start the 1st camera with ustreamer as follows.

$ ./ustreamer --device=/dev/video0 -r 1920x1080 --device-timeout=8 --quality 100 --format=MJPEG --host=192.168.1.25 --port=8080 -f 30

The log is as follows.

-- INFO  [7857.478      main] -- Using internal blank placeholder
-- INFO  [7857.479      main] -- Listening HTTP on [192.168.1.25]:8080
-- INFO  [7857.479    stream] -- Using V4L2 device: /dev/video0
-- INFO  [7857.479    stream] -- Using desired FPS: 30
-- INFO  [7857.479      http] -- Starting HTTP eventloop ...
================================================================================
-- INFO  [7857.549    stream] -- Device fd=8 opened
-- INFO  [7857.549    stream] -- Using input channel: 0
-- INFO  [7857.551    stream] -- Using resolution: 1920x1080
-- INFO  [7857.551    stream] -- Using pixelformat: MJPEG
-- INFO  [7857.554    stream] -- Using HW FPS: 30
-- ERROR [7857.554    stream] -- Device does not support setting of HW encoding quality parameters
-- INFO  [7857.554    stream] -- Using IO method: MMAP
-- INFO  [7857.591    stream] -- Requested 5 device buffers, got 5
-- INFO  [7857.601    stream] -- Capturing started
-- INFO  [7857.602    stream] -- Switching to HW encoder: the input is (M)JPEG ...
-- INFO  [7857.602    stream] -- Using JPEG quality: encoder default
-- INFO  [7857.602    stream] -- Creating pool JPEG with 1 workers ...
-- INFO  [7857.602    stream] -- Capturing ...
-- INFO  [7865.927      http] -- HTTP: Registered client: [192.168.1.10]:53085, id=d7b192d9eb43e44a; clients now: 1

3. Next, start the 2nd camera as follows.

$ ./ustreamer --device=/dev/video2 -r 1920x1080 --device-timeout=8 --quality 100 --format=MJPEG --host=192.168.1.25 --port=8081 -f 30

The log is as follows.

-- INFO  [7861.701      main] -- Using internal blank placeholder
-- INFO  [7861.702      main] -- Listening HTTP on [192.168.1.25]:8081
-- INFO  [7861.702    stream] -- Using V4L2 device: /dev/video2
-- INFO  [7861.702    stream] -- Using desired FPS: 30
================================================================================
-- INFO  [7861.703      http] -- Starting HTTP eventloop ...
-- INFO  [7861.769    stream] -- Device fd=8 opened
-- INFO  [7861.769    stream] -- Using input channel: 0
-- INFO  [7861.771    stream] -- Using resolution: 1920x1080
-- INFO  [7861.771    stream] -- Using pixelformat: MJPEG
-- INFO  [7861.773    stream] -- Using HW FPS: 30
-- ERROR [7861.773    stream] -- Device does not support setting of HW encoding quality parameters
-- INFO  [7861.773    stream] -- Using IO method: MMAP
-- INFO  [7861.806    stream] -- Requested 5 device buffers, got 5
-- INFO  [7861.815    stream] -- Capturing started
-- INFO  [7861.815    stream] -- Switching to HW encoder: the input is (M)JPEG ...
-- INFO  [7861.815    stream] -- Using JPEG quality: encoder default
-- INFO  [7861.815    stream] -- Creating pool JPEG with 1 workers ...
-- INFO  [7861.815    stream] -- Capturing ...
-- INFO  [7868.123      http] -- HTTP: Registered client: [192.168.1.10]:53086, id=eabcdd7ff735682; clients now: 1
-- INFO  [7869.604      http] -- HTTP: Disconnected client: [192.168.1.10]:53086, id=eabcdd7ff735682, Resource temporarily unavailable (reading,eof); clients now: 0
-- INFO  [7869.618      http] -- HTTP: Registered client: [192.168.1.10]:53087, id=84ad1d4ef88d3342; clients now: 1

4. Check the streaming. Please watch this video.

https://drive.google.com/file/d/1ApOHYZ7aVXrzJr6z5fcphWzwykdBgCtA/view

This is the screen of another PC connected to the Raspberry Pi.
I am monitoring the IP address 192.168.1.25; port 8080 is on the left and port 8081 on the right.
The screen shows that port 8080 on the left responds in real time, while port 8081 on the right is slow to respond.

5. I attribute this to the difference in packet transfer during streaming. Please watch this video.

https://drive.google.com/file/d/1HLkqCRISXthlsA5CjyTcMve7YRvVMg9Z/view

This shows iptraf-ng running on the Raspberry Pi to monitor the traffic. There is a clear difference in the packet and byte count increments between ports 8080 and 8081.
For example, 8080 increases its byte count by around 260 KB per update, while 8081 increases by only around 150 KB.
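
To check whether the imbalance already exists at the capture level rather than on the network, both devices could also be read simultaneously without ustreamer, along these lines (a sketch using the same devices, resolution, and format as above; v4l2-ctl prints the measured frame rate while streaming):

$ # capture 300 frames from each camera at the same time and compare the reported fps
$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=MJPG --stream-mmap --stream-count=300 &
$ v4l2-ctl -d /dev/video2 --set-fmt-video=width=1920,height=1080,pixelformat=MJPG --stream-mmap --stream-count=300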

That is all for the current situation.
If you need logs taken during streaming, please tell me which option to use (performance? verbose? debug?).

@mdevaev
Member

mdevaev commented Jan 21, 2022

Interesting. Please attach JSON from the /state URLs during streaming. The verbose log will be useful too.
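
For example, something like this can save the state of each instance while both streams are running (host and ports as in the commands above):

$ # dump ustreamer's /state JSON for both instances
$ curl -o state-8080.json http://192.168.1.25:8080/state
$ curl -o state-8081.json http://192.168.1.25:8081/state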

@nabetayam
Author

Are these JSON files okay?
I have also attached logs taken with the verbose option.
8080 is operating normally; 8081 is streaming abnormally.
state(8080).json.txt
state(8081).json.txt
8081.log
8080.log

@mdevaev
Member

mdevaev commented Jan 21, 2022

Okay, the video capture is fine. It looks like a purely network problem. Just for the sake of experiment: can you output the camera images to two physically different computers? Will the problem be reproduced as in your video?
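
It might also help to compare the kernel's per-connection TCP statistics on the Pi while both streams are running, for example (a sketch; the filter syntax may need adjusting for your version of ss):

$ # shows congestion window, RTT and retransmission counters for the two streaming connections
$ ss -tin '( sport = :8080 or sport = :8081 )'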

@nabetayam
Author

I'm sorry, but I currently cannot prepare a wired LAN switch, so I connected the two PCs over wireless LAN.
This video shows the result. As you can see, it behaves in the same way.
The PC on the right shows the stream of the camera that was started first. The PC on the left shows the camera that was started second, but the image does not appear and the screen stays black, probably because very few packets are being transferred.

https://drive.google.com/file/d/1e77j97yJveuXG7xNMzyYT-6sp7Ulff0h/view?usp=sharing

@nabetayam
Author

I forgot to attach the logs.
Here they are: 8080 is the log of the camera that started first, and 8081 is the log of the camera that started second.

8080(Right-PC).log
8081(Left-PC).log

@mdevaev
Member

mdevaev commented Jan 21, 2022

@nabetayam Okay, I don't have any ideas yet, but I know who to ask.

@xornet-sl Could you take a look at this? Looks like a strange network problem from Pi side.

@nabetayam
Author

Thank you for your support.
I hope this problem can be resolved.

@mdevaev
Member

mdevaev commented Feb 14, 2022

@xornet-sl boop

@xornet-sl

Sorry. I will be here this coming weekend to look into this.

@aminsadeghi81

aminsadeghi81 commented Sep 12, 2023

Hi guys,
I ran into exactly the same problem.
Has anyone found a solution for this?
@nabetayam
@xornet-sl
@mdevaev

@AdemOruc55

Hi guys,
I have the same problem.
Did you find a solution?
@aminsadeghi81
@nabetayam
@xornet-sl
@mdevaev

@xornet-sl

Hi, I haven't had a chance to take a look at it yet, but I will definitely have time in the upcoming weeks.

@hazza64

hazza64 commented Feb 13, 2024

@xornet-sl Did you get a chance to look into this?
I have the same issue with a Radxa Zero, previously on Bullseye (I believe), and I just updated it to Armbian 23.11.1.
It still has the same issue: both cameras are detected and work, but video is only shown for one of them. The same setup works fine on a Raspberry Pi 4, although I think I'm using camera-streamer in that case.
This was via Mainsail/Crowsnest, but using uStreamer to stream the video. Unplugging one camera allows video to come from the other, so it's not a config issue and it seems like uStreamer is at fault on some level.
If it helps, the FPS readings when both cameras are connected show what would be "correct", but one of the streams shows the default "No Display" message (I forget the exact wording).
