Question

Using JavaCV to consume a multicast stream, I want to render the video frames in a GLSurfaceView. The frames are grabbed using the FFmpegFrameGrabber class; I have successfully written the captured frames to the sdcard and to a non-GL Surface for visual debugging. I have looked all over for a solution or a clue, to no avail. Here is the section of code where help is needed:

 // get the frame
 opencv_core.IplImage img = capture.grab();
 if (img != null) {
   opencv_core.CvMat rgbaImg = opencv_core.CvMat.create(height, width, CV_8U, 4);
   Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
   // convert colorspace
   cvCvtColor(img, rgbaImg, CV_BGR2RGBA);
   bitmap.copyPixelsFromBuffer(rgbaImg.getByteBuffer());
   Rect rect = new Rect(x, y, width, height);
   Canvas c = surface.lockCanvas(rect);
   c.drawBitmap(bitmap, 0, 0, null);
   surface.unlockCanvasAndPost(c);
   if (bitmap != null) {
     bitmap.recycle();
   }
   if (rgbaImg != null) {
     rgbaImg.release();
   }
 }
Also, if there is a more optimal way to do any of the above, please let me know.

Edit: Since there's not much action on the first part of this question, would a "workaround" of rendering on the SurfaceTexture that is used to create the Surface be a possibility instead?

  SurfaceTexture surfaceTexture = new SurfaceTexture(textureId);
  surfaceTexture.setOnFrameAvailableListener(this);
  surface = new Surface(surfaceTexture);

Note: I am forced to stick with Android 4.2.2 for now.

Solution

If the video format is arbitrary (decoding with ffmpeg):

For your first method, you can speed it up by reusing a shared Bitmap and the other per-frame resources; eliminating the per-frame allocations will speed it up considerably.
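A sketch of that reuse, assuming the frame dimensions are fixed for the life of the stream (variable names follow the code in the question):

```java
// Allocate once, before the grab loop, when the frame size is known.
opencv_core.CvMat rgbaImg = opencv_core.CvMat.create(height, width, CV_8U, 4);
Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);

// Per frame: only convert and copy -- no allocation, no recycle.
opencv_core.IplImage img = capture.grab();
if (img != null) {
  cvCvtColor(img, rgbaImg, CV_BGR2RGBA);
  bitmap.copyPixelsFromBuffer(rgbaImg.getByteBuffer());
  Canvas c = surface.lockCanvas(null); // null dirty rect: redraw the whole surface
  c.drawBitmap(bitmap, 0, 0, null);
  surface.unlockCanvasAndPost(c);
}

// Release once, on teardown.
bitmap.recycle();
rgbaImg.release();
```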

As for rendering FFmpeg results to a GLSurfaceView, you should look here (I have used both JJmpeg and JavaCV):

https://code.google.com/p/jjmpeg/source/browse/#svn%2Fbranches%2Fffmpeg-0.10-android%2Fjjmpeg%2Fsrc%2Fau%2Fnotzed%2Fjjmpeg%2Fmediaplayer

Most of the gems are here: (GLESVideoRenderer.onDrawFrame method) https://code.google.com/p/jjmpeg/source/browse/branches/ffmpeg-0.10-android/jjmpeg/src/au/notzed/jjmpeg/mediaplayer/GLESVideoRenderer.java

The basic idea is to load each frame into a 2D texture and then draw it.
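The per-frame part of that idea boils down to something like the following, inside `GLSurfaceView.Renderer.onDrawFrame` (the `textureId` and `rgbaBuffer` names are assumptions, not from the linked source; the texture itself is created once in `onSurfaceCreated` with `glTexImage2D`):

```java
// Re-upload the current frame's RGBA pixels into an existing texture,
// then draw a textured quad covering the view.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0,
    width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, rgbaBuffer);
// ... bind the shader program, set vertex/texcoord attributes, then:
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
```

Using `glTexSubImage2D` rather than `glTexImage2D` per frame avoids reallocating texture storage on every draw.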

You can adapt the FFmpegFrameGrabber output into a Renderer for the GLSurfaceView; frame rates will vary between devices.

If you know the video format:

What you really should do, since you are already on Android 4.2.2, is use MediaCodec from the SDK and push the frames directly onto a Surface.
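A minimal sketch of that path, assuming an H.264 stream (the MIME type and the `nalUnit`/`presentationTimeUs` inputs are assumptions; with a multicast stream you would demux the access units yourself and feed them in). The decoder renders straight to the Surface, so no pixel copies happen in app code:

```java
// Configure the decoder to output directly to the Surface.
MediaCodec codec = MediaCodec.createDecoderByType("video/avc");
MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
codec.configure(format, surface, null, 0);
codec.start();

// Feed one demuxed access unit into an input buffer...
int inIndex = codec.dequeueInputBuffer(10000);
if (inIndex >= 0) {
  ByteBuffer in = codec.getInputBuffers()[inIndex]; // pre-API-21 style, fine on 4.2.2
  in.clear();
  in.put(nalUnit);
  codec.queueInputBuffer(inIndex, 0, nalUnit.length, presentationTimeUs, 0);
}

// ...and release decoded output with render == true to display it.
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int outIndex = codec.dequeueOutputBuffer(info, 10000);
if (outIndex >= 0) {
  codec.releaseOutputBuffer(outIndex, true); // true: render this frame to the Surface
}
```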

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow