Question

Do the Android media frameworks provide the capability to develop encoding and decoding functions for an Android app?

I know that since Android 4.1 there is a high-level API called MediaCodec that provides video encoding and decoding capabilities, but I don't know whether it actually uses one of those media frameworks, such as Stagefright, to access the codecs.

What are the benefits of encoding or decoding directly with one of those frameworks rather than using the MediaCodec API?

Maybe I'm just confused, but I have to build a project involving video encoding on Android and need to choose the best option for it. Any help would be appreciated.

Was it helpful?

Solution

You should use the MediaCodec API rather than any lower-level API. MediaCodec will use the appropriate framework internally. Android vendors integrate their own codecs, including hardware/DSP-accelerated ones, into the framework, register them as the defaults for the MediaCodec API, and ensure they work with it. Code written against MediaCodec will therefore work on all devices regardless of which framework sits underneath; that is exactly what the API is for.
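As a rough illustration of going through the MediaCodec API, here is a minimal encoder-setup sketch. The MIME type `"video/avc"`, the resolution, and the bitrate values are arbitrary example choices, and the chosen color format must actually be supported by the device's codec:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;

import java.io.IOException;

public class EncoderSketch {
    public static MediaCodec createAvcEncoder() throws IOException {
        // Arbitrary example parameters -- tune for your use case.
        int width = 640, height = 480;

        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 1_000_000);   // 1 Mbit/s
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);   // one keyframe per second
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);

        // MediaCodec picks whichever codec the vendor registered for this
        // MIME type -- typically a hardware encoder if one is available.
        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        encoder.start();
        return encoder;
    }
}
```

After `start()`, you feed raw YUV frames into the encoder's input buffers and drain compressed frames from its output buffers; which underlying framework (Stagefright, a vendor OMX plugin, etc.) does the work is invisible at this level.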

Unless you specifically want to experiment with OpenCore or Stagefright, don't go there.

OTHER TIPS

What you are trying to do is normally done by vendors/integrators. The best solution, if you have access to the platform's source code (if not, use any freely available development board), would be to integrate your hardware encoder with libstagefright as an OMX component. That way the encoding of YUV frames and the encapsulation into any desired (supported) container format are handled for you. You can then use a high-level RTP API to packetize and manage the streaming (a quick web search for Android and RTP will turn up options).
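If you stay at the MediaCodec level instead of integrating at the OMX layer, the encapsulation step mentioned above can be done with MediaMuxer. A hedged sketch, assuming an encoder that has already been fed input and signaled end-of-stream; note that MediaMuxer requires Android 4.3 (API 18) and `getOutputBuffer(int)` requires API 21:

```java
import android.media.MediaCodec;
import android.media.MediaMuxer;

import java.io.IOException;
import java.nio.ByteBuffer;

public class MuxerSketch {
    // Drains encoded frames from an already-started encoder into an MP4 file.
    // Assumes the caller has queued input and signaled end-of-stream.
    public static void drainToMp4(MediaCodec encoder, String outputPath) throws IOException {
        MediaMuxer muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int trackIndex = -1;
        boolean muxerStarted = false;

        while (true) {
            int index = encoder.dequeueOutputBuffer(info, 10_000 /* us */);
            if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // The encoder reports its final output format (including
                // codec-specific data) once; create the muxer track from it.
                trackIndex = muxer.addTrack(encoder.getOutputFormat());
                muxer.start();
                muxerStarted = true;
            } else if (index >= 0) {
                ByteBuffer encoded = encoder.getOutputBuffer(index);
                if (muxerStarted && info.size > 0) {
                    muxer.writeSampleData(trackIndex, encoded, info);
                }
                encoder.releaseOutputBuffer(index, false);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    break;
                }
            }
        }
        muxer.stop();
        muxer.release();
    }
}
```

This is only a skeleton: production code would also handle `INFO_TRY_AGAIN_LATER` with a timeout and run the drain loop on its own thread.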

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow