Question

I am currently working on an app that reads camera images and processes them using OpenCV. I would like to store the original images as a "normal" video and to drop the processed images after processing (once the information is extracted). For this I basically need to encode the image stream to H.264. It will not be possible to use ffmpeg or other solutions which do it on the general CPU. It has to be done using the internal hardware acceleration. Because of this I actually have three questions:

  • Does somebody know how to achieve H.264 encoding using the internal hardware modules to encode videos?

  • Or does ffmpeg really support this on Android?

  • Or is there a way to access the image stream during the "usual" encoding process?

Thanks a lot!

Was it helpful?

Solution

You can run MediaRecorder to create the video using the H.264 hardware encoder (if available) and at the same time register your preview frame handler. I cannot guarantee that all preview frames will go in both directions, though.
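A minimal sketch of that setup, assuming the legacy `android.hardware.Camera` API of that era and a `SurfaceHolder` from your preview `SurfaceView` (the file path and frame handling are placeholders):

```java
import android.hardware.Camera;
import android.media.MediaRecorder;
import android.view.SurfaceHolder;
import java.io.IOException;

public class RecordAndProcess {
    // Start recording with the hardware H.264 encoder while still
    // receiving preview frames for OpenCV processing.
    public static MediaRecorder start(SurfaceHolder holder) throws IOException {
        Camera camera = Camera.open();
        camera.setPreviewCallback(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, Camera cam) {
                // NV21 frame: hand it to OpenCV here, then drop it
                // once the information is extracted.
            }
        });
        camera.setPreviewDisplay(holder);
        camera.startPreview();

        camera.unlock(); // MediaRecorder needs ownership of the camera
        MediaRecorder recorder = new MediaRecorder();
        recorder.setCamera(camera);
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        // Uses the hardware AVC encoder where the device provides one.
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setOutputFile("/sdcard/original.mp4"); // placeholder path
        recorder.prepare();
        recorder.start();
        return recorder;
    }
}
```

Note that, as said above, some devices stop delivering preview callbacks while MediaRecorder is active, so test this on your target hardware.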

Alternatively, you can compile ffmpeg with libstagefright, and it will use the hardware AVC encoder.

Alternatively, you can use stagefright directly from your app (via JNI).

Finally, you can follow the approach of libstreaming to find the optimal recording API for your platform (be it the MediaRecorder API, the MediaCodec buffer-to-buffer method which requires Android 4.1, or the MediaCodec surface-to-buffer method which requires Android 4.3).
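For completeness, here is a sketch of configuring the MediaCodec buffer-to-buffer path (Android 4.1+), where you feed each preview frame to the hardware AVC encoder yourself, so the same bytes are also available for OpenCV. The resolution, bitrate, and color format below are assumptions; real code must query the codec's supported color formats:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import java.io.IOException;

public class AvcEncoderSetup {
    // Create and start a hardware-backed H.264 encoder that accepts
    // raw YUV input buffers (buffer-to-buffer mode).
    public static MediaCodec create() throws IOException {
        // Assumed parameters: 640x480 @ 30 fps, 2 Mbit/s.
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        // Many encoders accept semi-planar YUV; check capabilities first.
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        encoder.start();
        // In onPreviewFrame(): convert NV21 to the encoder's color format,
        // queue it into an input buffer, then drain the output buffers and
        // write the H.264 units to disk (e.g. via MediaMuxer on 4.3+).
        return encoder;
    }
}
```

The buffer-to-buffer route gives you the frames before encoding, which directly answers the third question; the surface-to-buffer route is more efficient but hides the pixels from your code.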

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow