Question

I came across this generic question on Adaptive Playback in Android KitKat on SO. However, I have some specific queries, listed below:

I am able to understand the implications for the codec and the related framework. However, there is one specific point that I am unable to understand completely.

In the case of adaptive playback, the codec employs metadata on its output port which essentially abstracts and packages an underlying buffer handle (in all probability a gralloc handle). If the resolution changes, I presume the codec updates the gralloc handle attributes directly and sends the buffer back to the OMX client without a port-settings change.
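For concreteness, the kind of payload I have in mind is a small descriptor like the sketch below (names are illustrative, loosely modeled on the VideoDecoderOutputMetaData struct in frameworks/native's HardwareAPI.h, not an exact platform definition):

    #include <system/window.h>   // provides native_handle_t / buffer_handle_t

    // Hypothetical metadata carried in each OMX output buffer when
    // metadata-in-buffers mode is enabled on the output port: only a small
    // descriptor wrapping the gralloc handle, not the pixel data itself.
    struct VideoOutputMetadata {          // illustrative name
        uint32_t        eType;            // metadata buffer type tag
        buffer_handle_t pHandle;          // gralloc handle of the actual decoded frame
    };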

My first question is: Is my understanding and assumption correct?

If so, how does SurfaceFlinger/SurfaceTexture know that the incoming payload is in metadata format, and how does it retrieve the width and height from the underlying gralloc implementation?

Can you please point me to the relevant sources that will help me understand this issue better?

Many thanks in advance.


Solution

sendFormatChange => kWhatOutputFormatChange => MEDIA_SET_VIDEO_SIZE => ... => native_window_set_buffers_dimensions

Just a clarification, using the QC (Qualcomm) example: the component recognizes the crop change (look for OMX_IndexConfigCommonOutputCrop in omx_vdec). ACodec then clears mSentFormat, which makes ACodec call sendFormatChange, which sends the kWhatOutputFormatChange event to NuPlayer, which in turn sends MEDIA_SET_VIDEO_SIZE to the media player. At the other end of the chain you get native_window_set_buffers_geometry, which forces Surface::setBuffersDimensions.
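For illustration, here is a minimal sketch of what that window-side call amounts to (assuming you hold the ANativeWindow* for the video surface; error handling omitted):

    #include <system/window.h>

    // Push the new decoded size down to the window (BufferQueue/SurfaceTexture).
    // Buffers dequeued from here on will have these dimensions, which is how
    // the consumer side picks up the new width and height.
    int applyNewVideoSize(ANativeWindow *window, int width, int height) {
        // The older combined call is
        // native_window_set_buffers_geometry(window, width, height, format).
        return native_window_set_buffers_dimensions(window, width, height);
    }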

When OMXCodec is used, a received OMX_IndexConfigCommonOutputCrop sets mOutputPortSettingsHaveChanged; the next call to OMXCodec::read will then return INFO_FORMAT_CHANGED, which in AwesomePlayer causes notifyVideoSize_l to be called, and that in turn sends MEDIA_SET_VIDEO_SIZE to the listener.
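A hedged client-side sketch of that older path (not AwesomePlayer's actual code; it assumes the decoder is wrapped as a MediaSource):

    #include <media/stagefright/MediaSource.h>
    #include <media/stagefright/MediaBuffer.h>
    #include <media/stagefright/MetaData.h>
    #include <media/stagefright/MediaErrors.h>

    using namespace android;

    void drainOneFrame(const sp<MediaSource> &decoder) {
        MediaBuffer *buffer = NULL;
        status_t err = decoder->read(&buffer);

        if (err == INFO_FORMAT_CHANGED) {
            // Output port settings (resolution/crop) changed: re-query the
            // format, much as notifyVideoSize_l does before posting
            // MEDIA_SET_VIDEO_SIZE, then resize the native window and read again.
            int32_t width = 0, height = 0;
            sp<MetaData> format = decoder->getFormat();
            format->findInt32(kKeyWidth, &width);
            format->findInt32(kKeyHeight, &height);
            return;
        }

        if (err == OK && buffer != NULL) {
            // ... render / queue the buffer, then release it ...
            buffer->release();
        }
    }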

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow