
How can I stream images obtained from a TextureView? #1340

Open
FawadAbbas12 opened this issue Nov 13, 2023 · 14 comments

@FawadAbbas12

Hello, thanks for open-sourcing your implementation.
I know it might be a quite basic question, but I don't have much experience with Android.
My scenario is that I have developed an app for receiving DJI drone video on my phone, and I need to stream that video to my PC for further processing.
DJI also provides an RTMP streaming option, but it lags a lot; I have tested your app and it gives reasonable results for streaming videos.
I can access the bitmap shown in the TextureView of the current app, but I am not sure how I can stream those frames using your RTMP API.
If possible, can you provide me an example or point me to a resource explaining the streaming process (e.g. how it reads video from a file)? Then I can change that implementation to read bitmaps from my TextureView instead.
Thanks

@FawadAbbas12
Author

Is this the function in which I have to modify the code to send my own images?

private void decode() {
.
.

while (running) {
  synchronized (sync) {
    .
    .
    int inIndex = codec.dequeueInputBuffer(10000);
    int sampleSize = 0;
    if (inIndex >= 0) {
      ByteBuffer input;
      if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
        input = codec.getInputBuffer(inIndex);
      } else {
        input = codec.getInputBuffers()[inIndex];
      }
      .
      .
      .
      if (sampleSize < 0) {
        if (!loopMode) {
          codec.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
        }
      } else {
        codec.queueInputBuffer(inIndex, 0, sampleSize, ts + sleepTime, 0);
        extractor.advance();
      }
    }
    int outIndex = codec.dequeueOutputBuffer(bufferInfo, 10000);
    if (outIndex >= 0) {
      if (!sleep(sleepTime)) return;
      ByteBuffer output;
      if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
        output = codec.getOutputBuffer(outIndex);
      } else {
        output = codec.getOutputBuffers()[outIndex];
      }
      boolean render = decodeOutput(output);
      codec.releaseOutputBuffer(outIndex, render && bufferInfo.size != 0);
      boolean finished = extractor.getSampleTime() < 0;
      .
      .
    }
  }
}

}

@pedroSG94
Owner

Hello,

Since you already have H264/H265 buffers using DJI, you can use the rtmp module directly. You can follow the code implementation of another user and adapt it. There is an issue that seems to work using H264:
#1311
Maybe you need to replace decodeSpsPpsFromByteArray because that method is for H265. You can try this one:
https://github.com/pedroSG94/RootEncoder/blob/master/encoder/src/main/java/com/pedro/encoder/video/VideoEncoder.java#L362

@FawadAbbas12
Author

Thanks, I will try this, but actually I only have the bitmap obtained from the TextureView using the following code:

Bitmap bitmap = mVideoSurface.getBitmap();

@pedroSG94
Owner

pedroSG94 commented Nov 13, 2023

Hello,

Using a bitmap is not a good solution, because:

  • First you need to convert it to H264 or H265 (which is not easy)
  • The conversion is slow (low fps)
  • The way you get the bitmap results in low fps

I recommend you read the DJI documentation, because I'm sure you can get H264/H265 buffers. I think this is the documentation you need:
https://developer.dji.com/api-reference-v5/android-api/Components/SDKManager/DJISDKManager.html

Try to find a way to get raw H264 buffers. Maybe this could help you:
https://developer.dji.com/doc/mobile-sdk-tutorial/en/tutorials/video-stream.html

@FawadAbbas12
Author

Yes, there is a VideoFeeder.VideoDataListener() class to receive raw H264 data, but apparently it is not working for the Mavic 2 Pro :(
That is why I thought of using the bitmap, but it is really slow to encode.
I will try to get raw H264 buffers.

@FawadAbbas12
Author

Hello @pedroSG94, I have managed to get the raw buffer and I am using the RTMP client as you said, but I am facing a slight issue:
the connection opens correctly, but no data is getting transmitted. Can you have a look at my code and tell me what I am doing wrong?
I have also verified that the said callback function from DJI is getting executed.

void onCreate(){
    // callbacks for the rtmp client
    ConnectCheckerRtmp connectCheckerRtmp = new ConnectCheckerRtmp() {
        .
        .
        .
    };

    rtmpClient = new RtmpClient(connectCheckerRtmp);
    rtmpClient.setVideoResolution(640, 640);
    rtmpClient.setFps(30);
    rtmpClient.connect("rtmp://192.168.137.59/demo/d2");

    // This callback receives the raw H264-encoded buffers
    mReceivedVideoDataListener = new VideoFeeder.VideoDataListener() {
        @Override
        public void onReceive(byte[] videoBuffer /* H264-encoded buffer */, int size) {
            DJIVideoStreamDecoder.getInstance().parse(videoBuffer, size);
            // here I am sending it via the rtmp client
            rtmpClient.sendVideo(ByteBuffer.wrap(videoBuffer), new MediaCodec.BufferInfo());
        }
    };

}

@pedroSG94
Owner

Hello,

That code is incomplete. You can check this example:
#1033 (comment)
If you don't have the isIdr boolean, you can manually check it like this:
https://github.com/pedroSG94/RootEncoder/blob/master/library/src/main/java/com/pedro/library/base/recording/BaseRecordController.java#L60
This is the method decodeSpsPpsFromBuffer:

private Pair<ByteBuffer, ByteBuffer> decodeSpsPpsFromBuffer(ByteBuffer outputBuffer, int length) {

Also, remember to use setOnlyVideo(true) (at the same moment you use setVideoResolution) if you are not using audio.
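For H264, the Annex-B parsing behind a decodeSpsPpsFromBuffer-style method can be sketched in plain Java. This is my own minimal helper for illustration, not the library's actual code; on Android you would wrap the returned arrays in ByteBuffers and pass them to setVideoInfo:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public final class AnnexB {
    // Payload offsets of each NAL unit (the byte after a 00 00 01 / 00 00 00 01 start code).
    static List<Integer> nalOffsets(byte[] d) {
        List<Integer> out = new ArrayList<>();
        for (int i = 0; i + 3 < d.length; ) {
            if (d[i] == 0 && d[i + 1] == 0 && d[i + 2] == 1) { out.add(i + 3); i += 3; }
            else if (d[i] == 0 && d[i + 1] == 0 && d[i + 2] == 0 && d[i + 3] == 1) { out.add(i + 4); i += 4; }
            else i++;
        }
        return out;
    }

    // First NAL unit of the given H264 type (7 = SPS, 8 = PPS, 5 = IDR), start code stripped.
    static byte[] firstNal(byte[] d, int type) {
        List<Integer> offs = nalOffsets(d);
        for (int i = 0; i < offs.size(); i++) {
            int start = offs.get(i);
            if ((d[start] & 0x1F) != type) continue;
            int end = d.length;
            if (i + 1 < offs.size()) {
                int next = offs.get(i + 1);
                // The next start code is 4 bytes long if it is preceded by an extra zero.
                end = next - ((next >= 4 && d[next - 4] == 0 && d[next - 3] == 0 && d[next - 2] == 0) ? 4 : 3);
            }
            return Arrays.copyOfRange(d, start, end);
        }
        return null; // no NAL of that type in this buffer
    }
}
```

For H265 the type field instead lives in bits 1-6 of the header, i.e. (d[start] >> 1) & 0x3F, and the parameter sets are VPS/SPS/PPS with types 32/33/34.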

@FawadAbbas12
Author

Thanks, I will add it :)

@ghost

ghost commented Nov 28, 2023

// Step 1: create the connection with the RTMP server (nginx).
// The connection code is in onCreate.
rtmpClient.connect("rtmp://192.168.137.59");

// Step 2: decode sps/pps from the video buffer, set vps to null,
// then set the video info.
Pair<ByteBuffer, ByteBuffer> buffers = decodeSpsPpsFromBuffer(ByteBuffer.wrap(videoBuffer), size);
if (buffers != null) {
    Log.i(_TAG, "manual sps/pps extraction success");
    sps = buffers.first;
    pps = buffers.second;
    vps = null;
    spsPpsSetted = true;
} else {
    spsPpsSetted = false;
}

try {
    rtmpClient.setVideoInfo(sps, pps, vps);
    Log.i(_TAG, "set video info");
} catch (Exception e) {
    Log.e(_TAG, e.getMessage());
}

// Step 3: send the video buffer, creating a MediaCodec.BufferInfo object.
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
try {
    rtmpClient.sendVideo(ByteBuffer.wrap(videoBuffer), info);
    long result = rtmpClient.getSentVideoFrames();
    Log.d(_TAG, "" + result);
} catch (Exception e) {
    Log.e(_TAG, e.getMessage());
}

I have the same issue. I was able to get the video buffer and decode the sps/pps values from it, but the main problem is that rtmpClient.getSentVideoFrames() always returns 0.

@pedroSG94
Owner

Hello,

You are not creating the BufferInfo correctly; you need to update its values properly. Check it in my example:
#1033 (comment)
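To illustrate what "update values properly" means: a freshly constructed BufferInfo has size 0, timestamp 0, and no key-frame flag, so sendVideo presumably has nothing useful to work with. A plain-Java sketch of the per-frame bookkeeping (my own illustration, not the library's code; on Android these four values would go into MediaCodec.BufferInfo via info.set(offset, size, presentationTimeUs, flags)):

```java
// Sketch of the per-frame metadata an RTMP sender needs. The field meanings
// mirror MediaCodec.BufferInfo; BUFFER_FLAG_KEY_FRAME is the constant 1.
final class FrameInfo {
    static final int BUFFER_FLAG_KEY_FRAME = 1;

    final int offset = 0;          // buffer starts at byte 0
    final int size;                // number of valid bytes in the buffer
    final long presentationTimeUs; // monotonic timestamp, starting near 0
    final int flags;               // key-frame flag for IDR frames

    FrameInfo(byte[] buf, int size, long startTimeUs, long nowUs) {
        this.size = size;
        this.presentationTimeUs = nowUs - startTimeUs;
        // Annex-B: the NAL header is the byte after the 4-byte start code;
        // H264 type 5 is an IDR slice. (This only checks the first NAL.)
        boolean idr = buf.length > 4 && (buf[4] & 0x1F) == 5;
        this.flags = idr ? BUFFER_FLAG_KEY_FRAME : 0;
    }
}
```

With something like this, the Step 3 above would fill the BufferInfo from the current buffer before calling sendVideo, instead of passing a fresh empty object.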

@ghost

ghost commented Nov 28, 2023

Yes, I saw the comment, and I also tried this, but I was unable to get the keyframe; the NAL types I am getting are 7 and then 1.

protected boolean isKeyFrame(ByteBuffer videoBuffer) {
    byte[] header = new byte[5];
    videoBuffer.duplicate().get(header, 0, header.length);
    if (videoMime.equals(CodecUtil.H264_MIME) && (header[4] & 0x1F) == RtpConstants.IDR) { // h264
        return true;
    } else { // h265
        return videoMime.equals(CodecUtil.H265_MIME)
                && (((header[4] >> 1) & 0x3f) == RtpConstants.IDR_W_DLP
                || ((header[4] >> 1) & 0x3f) == RtpConstants.IDR_N_LP);
    }
}

In my case this is always false.

@ghost

ghost commented Nov 28, 2023

I found a similar problem in #817, but no solution was given there either. Is there any other way to find the keyframe?
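One thing worth ruling out first: isKeyFrame above only inspects the single NAL header at byte 4, so an IDR that sits later in the same buffer (for example after an SPS/PPS pair) would be missed. A whole-buffer scan (plain Java, my own helper for illustration, not part of the library) looks like this:

```java
public final class IdrScan {
    // True if any H264 NAL unit in this Annex-B buffer is an IDR slice (type 5).
    // Unlike checking buf[4] alone, this handles 3- and 4-byte start codes and
    // IDR units that follow other NALs (e.g. SPS/PPS) in the same buffer.
    static boolean containsIdr(byte[] d) {
        for (int i = 0; i + 3 < d.length; i++) {
            if (d[i] != 0 || d[i + 1] != 0) continue;
            int p = (d[i + 2] == 1) ? i + 3
                  : (d[i + 2] == 0 && d[i + 3] == 1) ? i + 4 : -1;
            if (p > 0 && p < d.length && (d[p] & 0x1F) == 5) return true;
        }
        return false;
    }
}
```

If this still never returns true, the feed really contains no IDR frames (only types 1 and 7, as observed) and decoding and re-encoding, as discussed in #817, is likely needed.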

@pedroSG94
Owner

pedroSG94 commented Nov 28, 2023

Hello,

If you are getting only NAL types 1 and 7, you will need to decode the frames and re-encode them to fix the problem, as you mentioned here:
#817 (comment)

The code posted there is the way to do it. Let me explain it:

that code decodes the frames provided by DJI in the onReceive callback, using DJICodecManager as the decoder. DJICodecManager is then connected to my VideoEncoder class by rendering to the surface provided by glInterface, which contains my VideoEncoder's surface, so the frames are automatically re-encoded.
This way you can get frames from VideoEncoder through the getVideoData interface.

@pedroSG94
Owner

Hello,

Remember to change the resolution (width and height) in the DJIExample class.
Also check rtmpClient.setVideoResolution, because 1080x720 is a weird resolution; maybe you want to use 1280x720.
