Stream through wifi? #1
Hi @sleepyeye, I have tried to implement Wi-Fi streaming, but due to very high bandwidth requirements (and some additional issues), I decided not to add this feature — average Wi-Fi speeds were not enough to reliably transmit the data in real time. If you do not require live streaming, you can use the new Shareable/Internal Format export option in the latest version of Record3D, which exports pairs of JPEG images and depth maps. You could alter the Unity demo project to load those pairs instead of loading the color and depth data from the iOS device.
Hi @marek-simonik, thanks for responding. Unfortunately, I'm working on a project based on a live-streamed point cloud. I thought real-time, lossless Wi-Fi streaming would be possible because the 4th-generation iPad Pro supports Wi-Fi 6, which has bandwidth comparable to USB 3.1. Do you have any plans to add this, even if only as an experimental feature?
I am going to do new Wi-Fi streaming tests at the end of next week — feel free to drop me an email (support@record3d.app) on Friday (2020/10/02) to ask for the results.
Okay. Thanks a lot.
I am also interested in Wi-Fi streaming. Are there any updates on the progress? If bandwidth is still a problem, frame-skipping would be OK (I imagine 10 Hz would be enough). Alternatively or additionally, one could think about a (more) lossy compression. I looked in the code and it seems that for:
For a lossy depth compression, a quick search revealed the following options:
I would be happy to collaborate with you to get the required Mbps down so Wi-Fi streaming becomes a reality soon.
Hello @adrelino, apologies for not responding sooner and thank you for your reply. My intention with Record3D streaming was to get depth data as accurate as possible, so I opted for lossless depth map compression. That goal of depth accuracy was the reason why I abandoned Wi-Fi streaming, after tests showed that a large bandwidth would be required for the streaming to work without frame drops. However, there seems to be growing interest in Wi-Fi streaming, so I am going to start working on it in about two weeks' time. Please note that lossy compression will be necessary to achieve real-time Wi-Fi streaming without significant frame drops, so the Wi-Fi streaming feature will be suited only for entertainment purposes. To comment on your two remarks:
Hi. I'm interested in this. I can measure speeds of up to 830 Mbps on my domestic Wi-Fi. I've also got access to some 5G test equipment, and I would very much like to see how far we can push that for WAN access. What kind of bandwidth requirements are you seeing — both currently and with your planned lossy version?
In comparison, the other app I'm playing with is https://github.com/keijiro/Rcam2 by @keijiro. That packs the RGB, depth and a stencil into a single uncompressed 1080p NDI frame. I believe that uses about 100 Mbps, and the results are very satisfactory for the purposes I have in mind. On reflection, I think the key point is that people have different requirements. USB is a great fallback, but wireless should work if: a) you've got fantastic Wi-Fi, b) you don't need great depth accuracy, or c) you can lose frame rate/depth resolution, etc. It's only a problem when someone needs all of this at once and doesn't have the network bandwidth to support it — most people will be happy to sacrifice one or more dimensions to suit their network conditions.
Hi @andybak, I'll start working on the Wi-Fi streaming feature at the end of next week, so I can't guarantee a specific bandwidth that I will end up with. Your domestic Wi-Fi would for sure handle streaming of even the non-lossy stream (~38 Mbps for LiDAR @ 60 fps and ~85 Mbps for the selfie FaceID camera @ 30 fps). However, similar to what you wrote, my goal is to offer Wi-Fi streaming also to people with worse network conditions, so I plan to reduce the required bandwidth, hopefully to sub-10 Mbps.
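For a rough sense of where figures like these come from, here is a back-of-envelope sketch. The 256×192 resolution and 4-byte float depth are my assumptions for illustration, not numbers confirmed in this thread, and lossless compression then reduces the raw rate further:

```python
# Back-of-envelope bandwidth estimate for an uncompressed depth stream.
# The 256x192 resolution and 32-bit float depth are assumptions for
# illustration only; Record3D's actual stream formats may differ.

def raw_mbps(width, height, bytes_per_pixel, fps):
    """Raw bandwidth of one uncompressed image stream, in megabits per second."""
    return width * height * bytes_per_pixel * fps * 8 / 1e6

depth_mbps = raw_mbps(256, 192, 4, 60)  # hypothetical LiDAR depth map @ 60 fps
print(f"{depth_mbps:.1f} Mbps")  # ~94 Mbps before any compression
```

Even under these assumptions, the uncompressed depth stream alone is on the order of 100 Mbps, which makes it clear why lossless streaming struggles on typical Wi-Fi.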
Crikey. That's going to make even WAN streaming viable. You could potentially live-stream RGB and depth to a WebVR client like Mozilla Hubs. I'd love to get a proof of concept of that going...
Good luck. If you need a beta tester (especially with regards to Unity integration) let me know.
Thank you, I'll post an update once I make some progress with the Wi-Fi streaming feature.
Thanks for your hint, I was indeed not aware of these differences. But I got around the issue and succeeded in saving the complete depth image as 16-bit depth in RVL format: I first load and decompress the 32-bit .depth image with lzfse, then use OpenCV to convert from CV_32FC1 to CV_16FC1, and finally save the resulting bytes using RVL:

```cpp
void compressAndSave(std::string filename, cv::Mat depth) { // expects 16-bit depth
    RvlCodec codec;
    int nPixels = depth.total();
    // Worst-case output buffer; the RVL output is usually much smaller.
    std::vector<short> buf(nPixels);
    int nBytes = codec.CompressRVL((const unsigned short*) depth.data,
                                   (unsigned char*) buf.data(), nPixels);
    write_buffer_to_disk(filename, (char*) buf.data(), nBytes);
}
```

This is then used as follows:

```cpp
cv::Mat depth32 = readDepth(cv::utils::fs::join(rgbd, std::to_string(i) + ".depth"), m.w, m.h);
cv::Mat depth16;
depth32.convertTo(depth16, CV_16FC1);
compressAndSave(rvlFile, depth16);
```

Of course, I asked myself how much space can be saved and how much information is lost by this conversion from 32-bit to 16-bit float depth images, so I ran it on a sample sequence I recorded with the iPad Pro 2018 (3rd gen) TrueDepth camera and also looked at the raw values:

```
frame: 0
lzfse size: 249.825 kB   range [0.27925, 3.89213] m
RVL size:   205.06 kB    range [0.279297, 3.89258] m
diff mean:  0.179536 mm  max: 0.968933 mm
```

The saving in space (~250 vs. ~205 kB) is not too large, but note that 16 bit can store sub-centimeter depth values, since the maximum difference between the 32-bit and 16-bit image is below 1 millimeter (this was consistent for my whole sequence).
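The sub-millimeter figure matches float16's precision: with a 10-bit mantissa, representable values in the binade [2, 4) metres are spaced about 1.95 mm apart, so the worst-case rounding error is just under 1 mm. A quick numpy check over the depth range reported above:

```python
import numpy as np

# Worst-case rounding error when storing metric depth as float16.
# float16 has a 10-bit mantissa, so representable values in [2, 4) metres
# are spaced 2 * 2**-10 ~= 1.95 mm apart; the rounding error is half that.
depths = np.linspace(0.28, 3.9, 100_000, dtype=np.float32)  # range from the log above
err_mm = np.abs(depths - depths.astype(np.float16).astype(np.float32)) * 1000
print(err_mm.max())  # just under 1 mm
```

This agrees with the measured maximum difference of 0.968933 mm, and implies the error would grow to roughly 2 mm for depths beyond 4 m.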
Thank you for the analysis, I guess I rushed the assumption that 16-bit floats would be too imprecise:

```python
import numpy as np

num = 0.9765
np.float32(num)  # 0.9765
np.float16(num)  # 0.9766
```

Anyway, I am going to start working on the Wi-Fi streaming feature by the end of this week, so I will try to push the file size down as much as I can. I don't want to promise something I would not be able to deliver, but I hope to make the Wi-Fi streaming feature publicly available in about 3 weeks from now.
After quite some time, I have finally managed to implement Wi-Fi streaming — I am sorry for the delay. The Wi-Fi streaming feature is implemented via WebRTC (it works only on a local Wi-Fi network); that way, I could ensure that streaming is possible even on very low-bandwidth networks. RGBD data are encoded into lossy mp4 videos, so Wi-Fi streaming is intended mainly for entertainment purposes. The feature is described in more detail at https://record3d.app/features, and there are also two JS demos. Wi-Fi Streaming is part of a paid Extension Pack, but I am giving the Extension Pack to you for free — I have already sent a Promo Code via email. However, I could not find your email address, @adrelino — please send me an email at support@record3d.app and I will give you a Promo Code too as a thanks :).
I've got pretty good Wi-Fi. I'd like to stream with low compression rates and the depth packed into 24 or 32 bits of video. Is this possible? What's the best quality you've implemented over wireless?
I may add lossless (or near-lossless) Wi-Fi streaming as an additional Settings option in a future update. That Wi-Fi streaming option would be based on a completely different principle — likely by utilizing RVL, as proposed by @adrelino. I sent you an email.
Will Wi-Fi streaming ever be added to this C++/Python library? I tried to implement it in Python with the aiortc library, with no success. Just wondering if this will ever be added.
This library in particular does not implement Wi-Fi streaming. However, you can use the existing Wi-Fi streaming feature of the Record3D app, which streams RGBD video via WebRTC (the depth values are encoded into the Hue component of HSV, so there is lossy compression). Here you can see a minimal working example of how to connect to Record3D's WebRTC stream (comments in the JS code describe the specifics you need to follow for a successful connection): https://github.com/marek-simonik/record3d-simple-wifi-streaming-demo — since WebRTC is a standard, there should be WebRTC implementations in major languages with the same API, which should allow you to replicate the JS demo linked above.
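As a sketch of what a receiving client has to do with the decoded video frames, here is a minimal round-trip using a simple linear hue↔depth mapping. The linear mapping and the 3 m maximum are my assumptions for illustration — the exact mapping Record3D uses is defined in the JS demo linked above:

```python
import colorsys

MAX_DEPTH_M = 3.0  # assumed maximum depth used by the encoder (illustrative)

def depth_to_rgb(depth_m, max_depth=MAX_DEPTH_M):
    """Encode depth (metres) as a fully saturated HSV colour, returned as RGB."""
    return colorsys.hsv_to_rgb(depth_m / max_depth, 1.0, 1.0)

def rgb_to_depth(r, g, b, max_depth=MAX_DEPTH_M):
    """Recover depth (metres) from an RGB pixel with components in [0, 1]."""
    h, _s, _v = colorsys.rgb_to_hsv(r, g, b)
    return h * max_depth  # hue in [0, 1) maps linearly back to depth

r, g, b = depth_to_rgb(1.5)
print(rgb_to_depth(r, g, b))  # 1.5
```

In a real client you would apply the decode per pixel to each video frame, after the WebRTC stack has already undone the mp4 compression (which is where the lossiness comes from).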
Hi! Is it possible to stream depth >3 m? Would that be at the expense of depth accuracy, given the dynamic range of the mapping? I see the max depth was recently adjusted for recorded mp4s.
Hi, after enabling "Settings > Export options > RGBD mp4 video dynamic max. depth", the exported mp4 files will use the maximum depth value observed in the recording. However, when streaming via Wi-Fi, there is (of course) no way to tell in advance what the maximum depth value observed throughout the whole Wi-Fi streaming session will be. A possible solution would be to associate a fixed maximum depth value with the Wi-Fi stream.
Would a compromise be to have a "depth" option that switched between, say, 0–3 m, 0–6 m and 0–12 m? (I can't remember the max LiDAR depth, but you get my point.)
Another thought — use a logarithmic mapping, or some other non-linear mapping?
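For concreteness, a logarithmic mapping of the kind suggested could look like the sketch below. This is purely illustrative — the NEAR/FAR limits are assumed values and this is not what Record3D implements. It allocates more of the encoding range to near depths at the cost of far-range resolution:

```python
import math

NEAR, FAR = 0.25, 12.0  # assumed usable depth range in metres (illustrative)

def encode_log(depth_m):
    """Map depth in [NEAR, FAR] metres to a normalised code in [0, 1]."""
    return math.log(depth_m / NEAR) / math.log(FAR / NEAR)

def decode_log(code):
    """Inverse mapping: code in [0, 1] back to depth in metres."""
    return NEAR * (FAR / NEAR) ** code

# Equal code steps correspond to equal *relative* depth steps, so near
# depths get finer absolute resolution than far ones.
print(round(decode_log(encode_log(2.5)), 6))  # 2.5
```

The design trade-off is exactly the concern raised below: a fixed code-level error (e.g. from video compression noise) translates into a larger absolute depth error at far range.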
I'm not sure if a logarithmic/nonlinear mapping would work well; my main concern is that the WebRTC video compression can introduce significant noise (especially with low bandwidth) and that the nonlinearity would only amplify already existing noise (but I haven't tested this, so I might be wrong). However, if it would be enough for your purposes (@graycrawford and @andybak) to have the option to manually specify the maximum depth value used for depth encoding during Wi-Fi streaming, then I will implement such an option in the next Record3D update.
Hi @marek-simonik, manually specifying the maximum depth value would be perfect! As it doesn't need to be hyper-precise, only gesturally expressive, we'll take the noise in stride, likely just clamping the data from the far wall to isolate only the figures. Our space is at most 10 m, but audience occlusion means we're likely to be mostly measuring people 2–7 m away. Thank you for your rapid response! It's such a cool tool, and we haven't found a better one for our needs (something like Zig Sim has worked in the past, but it has very limited range by default).
Thanks for sharing the screenshots, @graycrawford :)! Please download the latest update of Record3D (version 1.8.3, released yesterday) — there is a new option in the Settings tab > Export options > Wi-Fi Streaming max depth. You can modify the value to adjust the maximum depth range used for depth encoding during Wi-Fi streaming.

A tip for using Wi-Fi streaming: increase the maximum depth value so that it's a bit higher than the maximum distance you need to measure (e.g. if you need to measure people 7 meters away, then set the max. depth to 8–10 meters). The depth is encoded into the Hue value of the HSV color space, which is circular. That means depth values too close to the camera (near 0) and values near the maximum depth value will be encoded into similar colors. Increasing the maximum depth will allow you to better discern between those two extremes and thus better filter the depth values.
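The circularity behind this tip can be shown numerically with a simple linear depth-to-hue encoder (a sketch under assumed values, not Record3D's exact encoder):

```python
import colorsys

def depth_to_rgb(depth_m, max_depth):
    """Encode depth linearly into the (circular) hue channel of HSV."""
    return colorsys.hsv_to_rgb(depth_m / max_depth, 1.0, 1.0)

max_depth = 7.0
near = depth_to_rgb(0.05, max_depth)  # almost pure red
far = depth_to_rgb(6.95, max_depth)   # also almost pure red!
print(near, far)

# Raising max_depth (e.g. to 10 m) keeps a 7 m reading well away from the
# hue wraparound point, so it is easy to distinguish from near-zero depths.
```

Both ends of the depth range land on nearly the same colour, which is exactly why a small margin above the true scene depth makes filtering easier.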
Incredible! On the USB streaming front — does the Wi-Fi max depth value apply to USB streaming too?
No, it does not apply. When streaming via USB, there is no need to encode the depth image into HSV, so there is no maximal depth value to be applied. Instead, the raw depth values are transmitted over USB.
Hi,
if I want to scan a large space such as a room, I have to connect the iPad to a laptop.
This is quite cumbersome, so I want to stream RGBD images over Wi-Fi.
Is there any way to stream RGBD images over Wi-Fi, or any chance of adding this feature in the future?