Question

I have an H.264 stream and want to decode it using MediaCodec on Android 4.1.2.

The stream is decodable using ffmpeg, but that is slow, so I want to use MediaCodec. The phone is a Samsung Galaxy S3.

On a button click, a new activity is started.

public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    test = new SurfaceView(getApplicationContext());
    test.getHolder().addCallback(this);
    setContentView(test);
}

In the callback

public void surfaceChanged(SurfaceHolder arg0, int arg1, int arg2, int arg3) {
    // TODO Auto-generated method stub
    //codedBuffer = new byte[1536000];
    if(wt == null){
        wt = new RenderThread(arg0.getSurface());
        wt.start();
    }

}

where RenderThread is

public void run() {
    formatIn = MediaFormat.createVideoFormat("video/avc", 480, 800);
    coder = MediaCodec.createDecoderByType("video/avc");
    if (formatIn != null)
        coder.configure(formatIn, mSurface, null, 0);
    coder.start();
    ByteBuffer[] inputBuffers = coder.getInputBuffers();
    ByteBuffer[] outputBuffers = coder.getOutputBuffers();
    mBufferInfo = new BufferInfo();
    while (!Thread.interrupted()) {

        while (waitForStream) {
            Thread.yield();
        }
        if (!EOS) {

            int inBufIndex = coder.dequeueInputBuffer(10000);
            if (inBufIndex != -1) {
                ByteBuffer buffer = inputBuffers[inBufIndex];
                buffer.put(receivebuffer, 0, size);
                coder.queueInputBuffer(inBufIndex, 0, size, 0, 0);
            }
            int outBufIndex = coder.dequeueOutputBuffer(mBufferInfo, 10000);
            switch (outBufIndex) {
            case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
                Log.d("DecodeActivity", "INFO_OUTPUT_BUFFERS_CHANGED");
                outputBuffers = coder.getOutputBuffers();
                break;
            case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                Log.d("DecodeActivity", "New format");
                break;
            case MediaCodec.INFO_TRY_AGAIN_LATER:
                Log.d("DecodeActivity", "dequeueOutputBuffer timed out!");
                break;
            default:
                // ByteBuffer buffer = outputBuffers[outBufIndex];
                // buffer.get(decodedBuffer);
                coder.releaseOutputBuffer(outBufIndex, true);
            }
        }
    }
}

decodedBuffer is a byte[1536000].

The updated error log is

02-03 05:53:20.065: I/OMXClient(12333): Using client-side OMX mux.
02-03 05:53:20.145: D/DecodeActivity(12333): dequeueOutputBuffer timed out!
02-03 05:53:20.170: D/DecodeActivity(12333): dequeueOutputBuffer timed out!
02-03 05:53:20.195: D/DecodeActivity(12333): dequeueOutputBuffer timed out!
02-03 05:53:20.215: D/DecodeActivity(12333): dequeueOutputBuffer timed out!
02-03 05:53:20.270: D/DecodeActivity(12333): dequeueOutputBuffer timed out!
02-03 05:53:20.290: D/DecodeActivity(12333): dequeueOutputBuffer timed out!
02-03 05:53:20.295: E/SpannableStringBuilder(12333): SPAN_EXCLUSIVE_EXCLUSIVE spans cannot have a zero length
02-03 05:53:20.295: E/SpannableStringBuilder(12333): SPAN_EXCLUSIVE_EXCLUSIVE spans cannot have a zero length
02-03 05:53:20.310: D/DecodeActivity(12333): dequeueOutputBuffer timed out!
02-03 05:53:20.330: D/DecodeActivity(12333): dequeueOutputBuffer timed out!
02-03 05:53:20.355: D/DecodeActivity(12333): dequeueOutputBuffer timed out!
02-03 05:53:20.375: D/DecodeActivity(12333): dequeueOutputBuffer timed out!
02-03 05:53:20.395: D/DecodeActivity(12333): dequeueOutputBuffer timed out!
02-03 05:53:20.410: D/DecodeActivity(12333): dequeueOutputBuffer timed out!
02-03 05:53:20.420: D/DecodeActivity(12333): dequeueOutputBuffer timed out!
02-03 05:53:20.425: D/DecodeActivity(12333): New format 
02-03 05:53:20.485: D/DecodeActivity(12333): dequeueOutputBuffer timed out!
02-03 05:53:20.605: W/dalvikvm(12333): threadid=15: thread exiting with uncaught exception (group=0x40de32a0)
02-03 05:53:20.605: E/AndroidRuntime(12333): FATAL EXCEPTION: Thread-658
02-03 05:53:20.605: E/AndroidRuntime(12333): java.nio.BufferOverflowException
02-03 05:53:20.605: E/AndroidRuntime(12333):    at java.nio.Buffer.checkPutBounds(Buffer.java:189)
02-03 05:53:20.605: E/AndroidRuntime(12333):    at java.nio.ReadWriteDirectByteBuffer.put(ReadWriteDirectByteBuffer.java:100)
02-03 05:53:20.605: E/AndroidRuntime(12333):    at java.nio.ByteBuffer.put(ByteBuffer.java:712)

Solution

Since it's a get() call, I assume you're failing here:

ByteBuffer buffer = outputBuffers[outBufIndex];
buffer.get(decodedBuffer);

If you look at the CTS EncodeDecodeTest, which works with raw YUV data, you'll see an extra bit:

ByteBuffer outputFrame = decoderOutputBuffers[decoderStatus];
outputFrame.position(info.offset);
outputFrame.limit(info.offset + info.size);

The MediaCodec code doesn't read or update the ByteBuffer state, so you have to do it explicitly. This applies to the input side too.
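To make the state issue concrete, here is a minimal plain-Java sketch (not from the original answer) of the output-side fix: the buffer's position and limit are set from the BufferInfo fields before calling get(), so exactly the valid bytes are copied rather than the buffer's full capacity. The 64-byte direct buffer below just simulates a codec output buffer.

```java
import java.nio.ByteBuffer;

public class ByteBufferStateDemo {
    // MediaCodec does not set position/limit on the output buffer for you.
    // Reading without setting them (or reading into a too-small array) throws
    // BufferOverflowException/BufferUnderflowException. Bound the read with
    // the offset and size that BufferInfo reports for the frame.
    public static byte[] readFrame(ByteBuffer outputBuffer, int offset, int size) {
        byte[] frame = new byte[size];
        outputBuffer.position(offset);       // start of valid data (BufferInfo.offset)
        outputBuffer.limit(offset + size);   // end of valid data (offset + BufferInfo.size)
        outputBuffer.get(frame);             // copies exactly 'size' bytes
        return frame;
    }

    public static void main(String[] args) {
        // Simulated codec output: large capacity, 5 valid bytes starting at offset 2.
        ByteBuffer buf = ByteBuffer.allocateDirect(64);
        buf.put(new byte[]{0, 0, 1, 2, 3, 4, 5});
        byte[] frame = readFrame(buf, 2, 5);
        System.out.println(java.util.Arrays.toString(frame)); // [1, 2, 3, 4, 5]
    }
}
```

On the input side the same idea applies in reverse: call clear() on the input buffer before put(), so the write starts at position 0 with the full capacity available.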

The buffer-to-buffer and buffer-to-surface portions of EncodeDecodeTest will work with API 16, and are useful to examine. The surface-to-surface tests require API 18.

One thing you'll find is that the test doesn't extract the buffer contents into a fixed-size array, because the decoded size of a frame can't be known statically. You need to know the color format of the output, and that varies from one device to another. You'll also find that checkFrame() gives up entirely on some devices because it doesn't understand the frame format.
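As a rough illustration (an assumption sketch, not part of the original answer): for the common YUV 4:2:0 output formats, the minimum frame size is width * height * 3 / 2 bytes, but vendor-specific tiled or padded formats can differ, which is exactly why a hard-coded array size is fragile.

```java
public class FrameSizeDemo {
    // Planar/semi-planar YUV 4:2:0: a full-resolution luma plane plus two
    // quarter-resolution chroma planes. Other color formats (e.g. vendor
    // tiled variants) need different, device-specific calculations.
    public static int yuv420Size(int width, int height) {
        return width * height * 3 / 2;
    }

    public static void main(String[] args) {
        // 480x800 (the stream in the question) in YUV 4:2:0:
        System.out.println(yuv420Size(480, 800)); // 576000
    }
}
```

In practice, query the MediaFormat returned after INFO_OUTPUT_FORMAT_CHANGED for the actual color format and dimensions instead of assuming a size up front.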

If you can decode to a Surface, your code can work like checkSurfaceFrame() instead, where you don't care about the buffer format because OpenGL ES does all the work. See also ExtractMpegFramesTest.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow