Question

I'm working on video transcoding in Android, using the standard approach shown in these samples to extract and decode a video. I tested the same process on different devices with different video codecs, and found a problem with the decoder's input/output frame count.

Because of timecode issues like the ones in this question, I use a queue to record the sample times of the extracted video samples, and check the queue whenever I get a decoded frame from the decoder, like the following code (encoding-related code omitted for clarity):

Queue<Long> sample_time_queue = new LinkedList<Long>();

....

// in transcoding loop

if (!is_decode_input_done)
{
    int decode_input_index = decoder.dequeueInputBuffer(TIMEOUT_USEC);
    if (decode_input_index >= 0)
    {
        ByteBuffer decoder_input_buffer = decode_input_buffers[decode_input_index];
        int sample_size = extractor.readSampleData(decoder_input_buffer, 0);
        if (sample_size < 0)
        {
            decoder.queueInputBuffer(decode_input_index, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
            is_decode_input_done = true;
        }
        else
        {
            long sample_time = extractor.getSampleTime();
            decoder.queueInputBuffer(decode_input_index, 0, sample_size, sample_time, 0);

            sample_time_queue.offer(sample_time);
            extractor.advance();
        }
    }
    else
    {
        DumpLog(TAG, "Decoder dequeueInputBuffer timed out! Try again later");
    }
}

....

if (!is_decode_output_done)
{
    int decode_output_index = decoder.dequeueOutputBuffer(decode_buffer_info, TIMEOUT_USEC);
    switch (decode_output_index)
    {
        case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
        {
            ....
            break; 
        }
        case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
        {
            ....
            break; 
        }
        case MediaCodec.INFO_TRY_AGAIN_LATER:
        {
            DumpLog(TAG, "Decoder dequeueOutputBuffer timed out! Try again later");
            break;
        }
        default:
        {
            ByteBuffer decode_output_buffer = decode_output_buffers[decode_output_index];
            long ptime_us = decode_buffer_info.presentationTimeUs;
            boolean is_decode_EOS = ((decode_buffer_info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0);

            if (is_decode_EOS)
            {
                // Decoder gives an EOS output.
                is_decode_output_done = true;

                ....                                    
            }
            else
            {
                // The frame time may not be consistent for some videos.
                // As a workaround, we use a frame time queue to guard this.
                Long sample_time = sample_time_queue.poll();
                if (sample_time == null)
                {
                    // The decoder produced more frames than we fed it; keep its PTS.
                }
                else if (sample_time == ptime_us)
                {
                    // Very good, the decoder input/output time is consistent.
                }
                else
                {
                    // If the decoder input/output frame count is consistent,
                    // we can trust the sample time over the reported PTS.
                    ptime_us = sample_time;
                }

                // process this frame
                ....
            }

            decoder.releaseOutputBuffer(decode_output_index, false);
        }
    }
}

In some cases, the queue can "correct" the PTS when the decoder reports erroneous values (e.g. a lot of 0s). However, there are still some issues with the decoder's input/output frame count.

On an HTC One 801e device, I use the codec OMX.qcom.video.decoder.avc to decode the video (MIME type video/avc). The sample times and PTS values match well for all frames except the last one. For example, if the extractor feeds 100 frames and then EOS to the decoder, the first 99 decoded frames have exactly the same time values, but the last frame is missing: I just get the output EOS from the decoder. I tested different videos encoded by the built-in camera, by the ffmpeg muxer, and by a video processing application on Windows. In all of them the last frame disappears.

On some tablets with the OMX.MTK.VIDEO.DECODER.AVC codec, things become more confusing. Some videos have good PTS values from the decoder and a consistent input/output frame count (i.e. the queue is empty when decoding is done). Some videos have a consistent input/output frame count but bad PTS values in the decoder output (which I can still correct with the queue). For some videos, many frames go missing during decoding. For example, the extractor reads 210 frames from a 7-second video, but the decoder only outputs the last 180 frames. It is impossible to recover the PTS values using the same workaround.
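To at least quantify this kind of mismatch, the bookkeeping can be reduced to matching each output PTS against the queued sample times and flushing whatever is left at EOS. Below is a minimal, decoder-free sketch of that idea; the class name and structure are hypothetical (not Android API), and it only works when the decoder emits frames in feed order with a usable PTS, which, as described above, is not always the case:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Hypothetical helper: record the sample time of every frame fed to the
// decoder, match each decoded output PTS against the queue, and report
// whatever never came out as dropped.
class FrameDropTracker {
    private final Deque<Long> pending = new ArrayDeque<>();
    private final List<Long> dropped = new ArrayList<>();

    // Call right after queueInputBuffer() for a normal (non-EOS) sample.
    void onInput(long sampleTimeUs) {
        pending.offer(sampleTimeUs);
    }

    // Call for each decoded frame. Sample times the decoder skipped over
    // are counted as dropped.
    void onOutput(long ptsUs) {
        while (!pending.isEmpty() && pending.peek() != ptsUs) {
            dropped.add(pending.poll());   // fed in, never came out
        }
        if (!pending.isEmpty()) {
            pending.poll();                // matched frame
        }
    }

    // Call when the decoder signals BUFFER_FLAG_END_OF_STREAM.
    List<Long> onEos() {
        dropped.addAll(pending);           // tail frames that never came out
        pending.clear();
        return dropped;
    }
}
```

With the HTC behavior above, this would report exactly one dropped entry (the last sample time); with the MTK case it would report the leading 30.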

Is there any way to predict the input/output frame count for a MediaCodec decoder? Or, more precisely, to know which frame(s) were dropped by the decoder even though the extractor gave it video samples with correct sample times?

Solution

Same basic story as in the other question. Pre-4.3, there were no tests confirming that every frame fed to an encoder or decoder came out the other side. I recall that some devices would reliably drop the last frame in certain tests until the codecs were fixed in 4.3.

I didn't search for a workaround at the time, so I don't know if one exists. Delaying before sending EOS might help if it's causing something to shut down early.
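One way to try that delay is to hold back the EOS input buffer until the decoder has caught up with the frames already queued, or until a wait budget expires. The following is a sketch only, not a verified fix; the policy class and its threshold are hypothetical and the MediaCodec calls themselves are left to the surrounding loop:

```java
// Hypothetical helper (not Android API): gate EOS submission so the last
// real frame has a chance to drain before the codec is told to shut down.
class EosDelayPolicy {
    private int fed;      // frames queued via queueInputBuffer()
    private int drained;  // frames received from dequeueOutputBuffer()
    private final long deadlineNanos;

    EosDelayPolicy(long maxWaitNanos) {
        this.deadlineNanos = System.nanoTime() + maxWaitNanos;
    }

    void onInputQueued()   { fed++; }
    void onOutputDrained() { drained++; }

    // Allow EOS only once every fed frame has come back out, or after the
    // wait budget is spent (so a genuinely dropped frame cannot hang us).
    boolean mayQueueEos() {
        return drained >= fed || System.nanoTime() >= deadlineNanos;
    }
}
```

In the asker's loop, that would mean: when readSampleData() returns a negative size, keep draining output buffers and only call queueInputBuffer(..., BUFFER_FLAG_END_OF_STREAM) once mayQueueEos() returns true.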

I don't believe I ever saw a device drop large numbers of frames. This seems like an unusual case, as it would have been noticeable in any apps that exercised MediaCodec in similar ways even without careful testing.

Licensed under: CC-BY-SA with attribution