How can i stream images obtained from TextureView #1340
Is this the function in which I have to modify the code to send my own image?

private void decode() {
}
Hello, since you already have H264/H265 buffers using DJI, you can use the RTMP module directly. You can follow another user's implementation and adapt it. There is an issue that seems to work using H264:
Thanks, I will try this, but actually I only have bitmaps obtained from the TextureView using the following code:
Hello, using bitmaps is not a good solution, because that way you have to encode every frame again from raw pixels.
I recommend you read the DJI documentation, because I'm sure you can get H264/H265 buffers. I think this is the documentation you need: Try to find a way to get raw H264 buffers. Maybe this could help you:
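To see why re-encoding bitmaps is the worse option, here is a rough back-of-the-envelope comparison. The 640x640 @ 30 fps resolution is taken from the code later in this thread; the 2 Mbit/s H264 bitrate is an illustrative assumption, not a measured value:

```java
public class BitmapVsH264 {
    public static void main(String[] args) {
        int width = 640, height = 640, fps = 30;
        int bytesPerPixel = 4; // an ARGB_8888 bitmap from a TextureView

        // Raw bitmap data rate before any encoding
        long rawBytesPerSec = (long) width * height * bytesPerPixel * fps;
        System.out.println("Raw bitmaps: " + rawBytesPerSec / 1000 + " KB/s");

        // A typical H264 stream at this resolution (assumed ~2 Mbit/s)
        long h264BitsPerSec = 2_000_000;
        System.out.println("H264 stream: " + h264BitsPerSec / 8 / 1000 + " KB/s");
    }
}
```

The drone already delivers the compressed stream, so decoding it to bitmaps only to compress it again wastes CPU and battery for nothing.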
Yes, there is
Hello @pedroSG94, I have managed to get the raw buffer and I am using the RTMP client as you said, but I am facing a slight issue.

void onCreate() {
    // callbacks for the RTMP client
    ConnectCheckerRtmp connectCheckerRtmp = new ConnectCheckerRtmp() {
        ...
    };
    rtmpClient = new RtmpClient(connectCheckerRtmp);
    rtmpClient.setVideoResolution(640, 640);
    rtmpClient.setFps(30);
    rtmpClient.connect("rtmp://192.168.137.59/demo/d2");

    // This callback returns the raw H264-encoded buffer
    mReceivedVideoDataListener = new VideoFeeder.VideoDataListener() {
        @Override
        public void onReceive(byte[] videoBuffer /* H264-encoded buffer */, int size) {
            DJIVideoStreamDecoder.getInstance().parse(videoBuffer, size);
            // here I am sending it via the RTMP client
            rtmpClient.sendVideo(ByteBuffer.wrap(videoBuffer), new MediaCodec.BufferInfo());
        }
    };
}
Hello, that code is incomplete. You can check this example:
Also, remember to use setOnlyVideo(true) (at the same moment you use setVideoResolution) if you are not using audio.
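One concrete gap in the code above: passing an empty new MediaCodec.BufferInfo() leaves size, presentationTimeUs and flags at zero, so the library has no frame metadata to send. A minimal sketch of how those values could be derived per frame follows; the field names mirror android.media.MediaCodec.BufferInfo, BUFFER_FLAG_KEY_FRAME is the Android constant (value 1), and this is an illustration of the idea, not the library's exact API:

```java
public class BufferInfoSketch {
    // android.media.MediaCodec.BUFFER_FLAG_KEY_FRAME has this value
    static final int BUFFER_FLAG_KEY_FRAME = 1;

    final int fps = 30; // must match rtmpClient.setFps(30)

    /** Monotonically increasing timestamp in microseconds for frame N. */
    long presentationTimeUs(long frameIndex) {
        return frameIndex * 1_000_000L / fps;
    }

    /** Flags field: mark IDR frames as keyframes, everything else as 0. */
    int flags(boolean isKeyFrame) {
        return isKeyFrame ? BUFFER_FLAG_KEY_FRAME : 0;
    }

    public static void main(String[] args) {
        BufferInfoSketch s = new BufferInfoSketch();
        // In onReceive you would also set: info.offset = 0; info.size = size;
        System.out.println(s.presentationTimeUs(30)); // frame 30 at 30 fps -> 1000000
        System.out.println(s.flags(true));            // 1
    }
}
```

With offset, size, presentationTimeUs and flags filled in for every call, getSentVideoFrames() should start counting.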
Thanks, I will add it :)
I have the same issue. I was able to get the video buffer, and from the video buffer I was able to decode the SPS/PPS values. The main issue is that I am getting 0 from rtmpClient.getSentVideoFrames();
Hello, you are not creating BufferInfo correctly. You need to update its values properly. Check it in my example:
Yes, I saw the comment and I also tried this, but I was unable to get the keyframe; the NAL types I am getting are 7 and then 1.

I found a similar problem in #817, but no solution was given there either. Is there any other way to find the keyframe?
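For reference, the NAL unit type is the low 5 bits of the byte after each Annex-B start code: 7 = SPS, 8 = PPS, 5 = IDR slice (the keyframe), 1 = non-IDR slice. A small hypothetical scanner like the one below can list the types in a buffer, so you can verify whether type 5 ever appears in the DJI feed:

```java
import java.util.ArrayList;
import java.util.List;

public class NalScanner {
    /** Returns the NAL unit types found after each 00 00 01 / 00 00 00 01 start code. */
    public static List<Integer> nalTypes(byte[] buf) {
        List<Integer> types = new ArrayList<>();
        for (int i = 0; i + 3 < buf.length; i++) {
            if (buf[i] == 0 && buf[i + 1] == 0) {
                int skip = -1;
                if (buf[i + 2] == 1) {
                    skip = 3; // 3-byte start code 00 00 01
                } else if (buf[i + 2] == 0 && i + 4 < buf.length && buf[i + 3] == 1) {
                    skip = 4; // 4-byte start code 00 00 00 01
                }
                if (skip > 0) {
                    types.add(buf[i + skip] & 0x1F); // low 5 bits = nal_unit_type
                    i += skip;
                }
            }
        }
        return types;
    }

    public static boolean hasKeyFrame(byte[] buf) {
        return nalTypes(buf).contains(5); // 5 = IDR slice
    }

    public static void main(String[] args) {
        // Hypothetical buffer: SPS (0x67), PPS (0x68), IDR (0x65)
        byte[] sample = {0, 0, 0, 1, 0x67, 0, 0, 0, 1, 0x68, 0, 0, 1, 0x65};
        System.out.println(nalTypes(sample));    // [7, 8, 5]
        System.out.println(hasKeyFrame(sample)); // true
    }
}
```

If a long scan of the stream shows only types 7 and 1 (SPS and non-IDR slices, as reported above), the feed genuinely never carries an IDR frame and scanning differently won't help, which is what motivates the decode-and-re-encode answer below.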
Hello, if you are getting only NAL types 1 and 7, you will need to decode the frames and re-encode them to fix the problem, as you mentioned here: the code posted is the way to do it. Let me explain the code. It decodes the frames provided by DJI in the onReceive callback, using DJICodecManager as a decoder. DJICodecManager is then connected to my VideoEncoder class by rendering to the surface provided by glInterface, which contains my VideoEncoder surface, so the frames are re-encoded automatically.
Hello, remember to change the resolution (width and height) in the DJIExample class.
Hello, thanks for open-sourcing your implementation.
I know it might be a quite basic question, but I don't have much experience with Android.
My scenario is that I have developed an app for receiving DJI drone video on my phone, and I need to stream that video to my PC for further processing.
DJI also provides an RTMP streaming option, but it lags a lot. I have tested your app and it gives reasonable results for streaming video.
I can access the bitmap shown in the TextureView of the current app, but I am not sure how I can stream those frames using your RTMP API.
If possible, can you provide an example or point me to a resource explaining the streaming process, e.g. how it reads video from a file? Then I can change that implementation to read bitmaps from my TextureView.
Thanks