Question

I'm passing a SurfaceView surface from Java to JNI, where I obtain the native window from that surface. Stagefright decodes H.264 frames from an MP4 file. During decoding I call ANativeWindow::queueBuffer() to send decoded frames to be rendered. There are no errors during decoding or from queueBuffer(); all I get is a black screen.

I really feel like I'm not setting up the native window properly, so that when queueBuffer() is called nothing actually reaches the screen. However, I can render pixels to the native window directly via memcpy. Unfortunately, once I instantiate the OMXClient, a segfault occurs when trying to manually draw pixels, so it seems I must use queueBuffer().

My SurfaceView is set up in onCreate():

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    SurfaceView surfaceView = new SurfaceView(this);
    surfaceView.getHolder().addCallback(this);
    setContentView(surfaceView);
}

Once the surface is ready, I call my native init() function with it from surfaceChanged():

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    NativeLib.init(holder.getSurface(), width, height);
}

The native window is created in JNI and a decode thread is started:

nativeWindow = ANativeWindow_fromSurface(env, surface);
int ret = pthread_create(&decode_thread, NULL, &decode_frames, NULL);

My routine for decoding frames, à la vec.io's Stagefright decoding example:

void* decode_frames(void*) {
    mNativeWindow = nativeWindow;
    sp<MediaSource> mVideoSource = new AVFormatSource();
    OMXClient mClient;
    mClient.connect();

    sp<MediaSource> mVideoDecoder = OMXCodec::Create(mClient.interface(), mVideoSource->getFormat(),
            false, mVideoSource, NULL, 0, mNativeWindow);
    mVideoDecoder->start();

    status_t err = OK;
    while (err != ERROR_END_OF_STREAM) {
        MediaBuffer *mVideoBuffer;
        MediaSource::ReadOptions options;
        err = mVideoDecoder->read(&mVideoBuffer, &options);

        if (err == OK) {
            if (mVideoBuffer->range_length() > 0) {

                sp<MetaData> metaData = mVideoBuffer->meta_data();
                int64_t timeUs = 0;
                metaData->findInt64(kKeyTime, &timeUs);
                status_t err1 = native_window_set_buffers_timestamp(mNativeWindow.get(), timeUs * 1000);
                // This line results in a black frame
                status_t err2 = mNativeWindow->queueBuffer(mNativeWindow.get(),
                        mVideoBuffer->graphicBuffer().get(), -1);

                if (err2 == 0) {
                    metaData->setInt32(kKeyRendered, 1);
                }
            }
            mVideoBuffer->release();
        }
    }

    mVideoSource.clear();
    mVideoDecoder->stop();
    mVideoDecoder.clear();
    mClient.disconnect();
    return NULL;
}

EDIT: Taking Ganesh's advice, I interfaced with the AwesomeRenderer in order to change the color space. While doing so, it became apparent that the color format wasn't being set in Stagefright.

08-06 00:56:32.842: A/SoftwareRenderer(7326): frameworks/av/media/libstagefright/colorconversion/SoftwareRenderer.cpp:42 CHECK(meta->findInt32(kKeyColorFormat, &tmp)) failed.
08-06 00:56:32.842: A/libc(7326): Fatal signal 11 (SIGSEGV) at 0xdeadbaad (code=1), thread 7340 (hieu.alloclient)

Trying to set the color space explicitly (kKeyColorFormat to a YUV420P color format) leads to a dequeue problem, which probably makes sense because the color format I specified was arbitrary.

08-06 00:44:30.878: V/OMXCodec(6937): matchComponentName (null)
08-06 00:44:30.888: V/OMXCodec(6937): matching 'OMX.qcom.video.decoder.avc' quirks 0x000000a8
08-06 00:44:30.888: V/OMXCodec(6937): matchComponentName (null) 
08-06 00:44:30.888: V/OMXCodec(6937): matching 'OMX.google.h264.decoder' quirks 0x00000000
08-06 00:44:30.888: V/OMXCodec(6937): Attempting to allocate OMX node 'OMX.qcom.video.decoder.avc'
08-06 00:44:30.918: V/OMXCodec(6937): Successfully allocated OMX node 'OMX.qcom.video.decoder.avc'
08-06 00:44:30.918: V/OMXCodec(6937): configureCodec protected=0
08-06 00:44:30.918: I/OMXCodec(6937): [OMX.qcom.video.decoder.avc] AVC profile = 66 (Baseline), level = 13
08-06 00:44:30.918: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] setVideoOutputFormat width=320, height=240
08-06 00:44:30.918: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] portIndex: 0, index: 0, eCompressionFormat=7 eColorFormat=0
08-06 00:44:30.918: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] found a match.
08-06 00:44:30.938: I/QCOMXCodec(6937): Decoder should be in arbitrary mode
08-06 00:44:30.958: I/OMXCodec(6937): [OMX.qcom.video.decoder.avc] video dimensions are 320 x 240
08-06 00:44:30.958: I/OMXCodec(6937): [OMX.qcom.video.decoder.avc] Crop rect is 320 x 240 @ (0, 0)
08-06 00:44:30.958: D/infoJNI(6937): before started
08-06 00:44:30.968: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] allocating 2 buffers of size 2097088 on input port
08-06 00:44:30.968: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] allocated buffer 0x417037d8 on input port
08-06 00:44:30.968: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] allocated buffer 0x41703828 on input port
08-06 00:44:30.978: V/OMXCodec(6937): native_window_set_usage usage=0x40000000
08-06 00:44:30.978: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] allocating 22 buffers from a native window of size 147456 on output port
08-06 00:44:30.978: E/OMXCodec(6937): dequeueBuffer failed: Invalid argument (22)

Solution

I ended up solving this issue by using the Java low-level APIs instead. I set up a native read_frame function that parses video frames using FFmpeg. I call this function from a separate Java decoder thread, and each call returns a new frame of encoded data to be decoded by MediaCodec. Rendering this way was very straightforward: just pass MediaCodec the surface.

Alternatively, I could have used MediaExtractor, but FFmpeg had some other functionality that I needed.
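For reference, that MediaCodec path can be sketched roughly as below. This is a minimal sketch, not my exact code: `NativeLib.readFrame()` and `NativeLib.framePtsUs()` are hypothetical JNI wrappers around the FFmpeg-backed reader (one fills the buffer with a single encoded H.264 access unit and returns its size, or -1 at end of stream; the other returns that frame's presentation timestamp), and error handling is omitted.

```java
import java.nio.ByteBuffer;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

// Decode H.264 access units supplied by a native (FFmpeg-backed) reader and
// let MediaCodec render them straight to the given Surface.
void decodeToSurface(Surface surface, int width, int height) throws Exception {
    MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
    decoder.configure(format, surface, null, 0); // the Surface does the rendering
    decoder.start();

    ByteBuffer[] inputBuffers = decoder.getInputBuffers();
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    boolean sawEOS = false;

    while (!sawEOS) {
        int inIndex = decoder.dequeueInputBuffer(10000 /* us */);
        if (inIndex >= 0) {
            ByteBuffer input = inputBuffers[inIndex];
            input.clear();
            // Hypothetical JNI call: fills `input` with one encoded frame.
            int size = NativeLib.readFrame(input);
            if (size < 0) {
                decoder.queueInputBuffer(inIndex, 0, 0, 0,
                        MediaCodec.BUFFER_FLAG_END_OF_STREAM);
            } else {
                decoder.queueInputBuffer(inIndex, 0, size,
                        NativeLib.framePtsUs(), 0);
            }
        }

        int outIndex = decoder.dequeueOutputBuffer(info, 10000 /* us */);
        if (outIndex >= 0) {
            sawEOS = (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
            decoder.releaseOutputBuffer(outIndex, true); // true = render to the Surface
        }
    }

    decoder.stop();
    decoder.release();
}
```

Because the decoder is configured with the Surface, no manual color conversion or buffer queueing is needed; releaseOutputBuffer(index, true) pushes the frame to the display.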

OTHER TIPS

Just in case the problem has not been solved: I had the same problem, and found the cause purely by accident!

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    NativeLib.init(holder.getSurface(), width, height);
}

You have to allocate the frame buffer with dimensions divisible by 16, the macroblock size. Otherwise, the graphics buffer is not large enough for the decoder's output. An H.264 encoder internally works with a slightly larger frame size when the provided video's width or height is not macroblock-aligned. Just apply the following (the parentheses matter, so the integer division truncates before multiplying back):

width = 16 * ((width + 15) / 16);
height = 16 * ((height + 15) / 16);
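As a sanity check, here is that macroblock rounding as a tiny self-contained example (with explicit parentheses so the integer division happens before the multiplication):

```java
public class MacroblockAlign {
    // Round a dimension up to the next multiple of 16 (the H.264 macroblock size).
    // The inner division must truncate first, hence the explicit parentheses.
    public static int align16(int x) {
        return 16 * ((x + 15) / 16);
    }

    public static void main(String[] args) {
        System.out.println(align16(320)); // 320 (already aligned)
        System.out.println(align16(240)); // 240 (already aligned)
        System.out.println(align16(854)); // 864 (common 480p width, not aligned)
    }
}
```

Without the inner parentheses, `16 * (x + 15) / 16` evaluates left to right and simplifies to `x + 15`, which is not a multiple of 16.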

You need to call:

native_window_set_scaling_mode(mNativeWindow.get(), NATIVE_WINDOW_SCALING_MODE_SCALE_TO_WINDOW);

Licensed under: CC-BY-SA with attribution