Question

I'm using MediaCodec to decode H.264 packets that were encoded with ffmpeg. When I decode with ffmpeg, the frames display fine. However, when I decode with the MediaCodec hardware decoder, I sometimes get black bars that show up in the middle of the frame. This only happens when the encoding bitrate is set high enough (say upwards of 4,000,000 bps) that any given AVPacket grows beyond roughly 95,000 bytes. It seems like MediaCodec (or the underlying decoder) is truncating the frames. Unfortunately, I need the quality, so the bitrate can't be turned down. I've verified that the frames aren't being truncated elsewhere in my pipeline, and I've tried setting MediaFormat.KEY_MAX_INPUT_SIZE to something higher, with no luck.
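For reference, the configuration step in question is just a key on the decoder's format. A minimal sketch, using the NDK's AMediaFormat counterpart of the Java MediaFormat call (the 1920x1080 resolution and the 1 MB cap are placeholder values, not my actual settings):

```cpp
#include <media/NdkMediaCodec.h>
#include <media/NdkMediaFormat.h>

// Hypothetical H.264 decoder setup with a raised max input size.
static AMediaCodec* createH264Decoder() {
    AMediaFormat* format = AMediaFormat_new();
    AMediaFormat_setString(format, AMEDIAFORMAT_KEY_MIME, "video/avc");
    AMediaFormat_setInt32(format, AMEDIAFORMAT_KEY_WIDTH, 1920);   // placeholder
    AMediaFormat_setInt32(format, AMEDIAFORMAT_KEY_HEIGHT, 1080);  // placeholder
    // NDK counterpart of MediaFormat.KEY_MAX_INPUT_SIZE in the Java API;
    // raising it alone did not fix the truncation described above.
    AMediaFormat_setInt32(format, AMEDIAFORMAT_KEY_MAX_INPUT_SIZE, 1024 * 1024);

    AMediaCodec* codec = AMediaCodec_createDecoderByType("video/avc");
    AMediaCodec_configure(codec, format, NULL /*surface*/, NULL /*crypto*/, 0 /*flags*/);
    AMediaCodec_start(codec);
    AMediaFormat_delete(format);
    return codec;
}
```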

Has anyone run into this issue, or does anyone know of a way I can work around it?

I've attached an image of random pixels that I rendered in OpenGL and then decoded on my Galaxy S4.

[attached image: decoded frame of random pixels showing the black-bar artifact]


Solution

I figured out what the issue was: the incoming socket's receive buffer was too small, so the OS was dropping RTP data before my client could read it, which truncated the larger frames. I had to increase that buffer in order to receive all of the packet data. Since I was using a Live555 RTSP client, I used its increaseReceiveBufferTo() function to do so, as sketched below.
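For anyone hitting the same thing: increaseReceiveBufferTo() is declared in Live555's GroupsockHelper.hh and takes the usage environment, a socket number, and the requested size in bytes. A minimal sketch of calling it on a subsession's RTP socket (the 2 MB request size is an assumption; tune it to your bitrate):

```cpp
#include "liveMedia.hh"
#include "GroupsockHelper.hh" // declares increaseReceiveBufferTo()

// Enlarge the OS-level UDP receive buffer for a subsession's RTP socket.
// With high-bitrate H.264, a single frame spans many RTP packets, and the
// default buffer can overflow and drop packets before the client drains
// it, truncating frames.
static void bumpReceiveBuffer(UsageEnvironment& env, MediaSubsession* subsession) {
    if (subsession->rtpSource() != NULL) {
        int socketNum = subsession->rtpSource()->RTPgs()->socketNum();
        unsigned actual = increaseReceiveBufferTo(env, socketNum, 2000000); // ~2 MB, assumed
        env << "RTP receive buffer is now " << actual << " bytes\n";
    }
}
```

Call it right after the subsession's SETUP succeeds (e.g., in a continueAfterSETUP-style callback, as in Live555's testRTSPClient example), before frames start arriving.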
