Question

I am trying to render the output of a GLSurfaceView to a PNG on the SD card and am having some issues. I have spent a few days sorting through similar SO questions, but this is simply above my level of expertise. Can someone please help me sort through the logs below and see where I might be going wrong?

Many thanks.

Log:

12-16 12:09:18.831: E/AndroidRuntime(29864): FATAL EXCEPTION: GLThread 2712
12-16 12:09:18.831: E/AndroidRuntime(29864): java.nio.BufferUnderflowException
12-16 12:09:18.831: E/AndroidRuntime(29864):    at java.nio.Buffer.checkGetBounds(Buffer.java:177)
12-16 12:09:18.831: E/AndroidRuntime(29864):    at java.nio.DirectByteBuffer.get(DirectByteBuffer.java:66)
12-16 12:09:18.831: E/AndroidRuntime(29864):    at java.nio.IntToByteBufferAdapter.get(IntToByteBufferAdapter.java:105)
12-16 12:09:18.831: E/AndroidRuntime(29864):    at java.nio.IntBuffer.get(IntBuffer.java:234)
12-16 12:09:18.831: E/AndroidRuntime(29864):    at com.research.glgrade.GLLayer.grabPixels(GLLayer.java:865)
12-16 12:09:18.831: E/AndroidRuntime(29864):    at com.research.glgrade.GLLayer.saveScreenShot(GLLayer.java:810)
12-16 12:09:18.831: E/AndroidRuntime(29864):    at com.research.glgrade.GLLayer.onDrawFrame(GLLayer.java:794)
12-16 12:09:18.831: E/AndroidRuntime(29864):    at android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1527)
12-16 12:09:18.831: E/AndroidRuntime(29864):    at android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1240)
12-16 12:09:26.849: I/Choreographer(29864): Skipped 478 frames!  The application may be doing too much work on its main thread.

Here is my current code: From Main Activity :

GlobalVariables.setPrint("true");
mView.requestRender();

From Render Class

public void onDrawFrame(GL10 glUnused) {
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    final String vertexShader = getVertexShader();
    final String fragmentShader = getFragmentShader();

    final int vertexShaderHandle = ShaderHelper.compileShader(
            GLES20.GL_VERTEX_SHADER, vertexShader);
    final int fragmentShaderHandle = ShaderHelper.compileShader(
            GLES20.GL_FRAGMENT_SHADER, fragmentShader);

    mProgramHandle = ShaderHelper.createAndLinkProgram(vertexShaderHandle,
            fragmentShaderHandle, new String[] { "a_Position",
                    "a_TexCoordinate" });

    // Set our per-vertex lighting program.
    GLES20.glUseProgram(mProgramHandle);

    // Set program handles for cube drawing.
    mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgramHandle,
            "u_MVPMatrix");
    mTextureUniformHandle0 = GLES20.glGetUniformLocation(mProgramHandle,
            "u_Texture0");
    mTextureUniformHandle1 = GLES20.glGetUniformLocation(mProgramHandle,
            "u_Texture1");

    GLES20.glActiveTexture(GLES20.GL_TEXTURE4);
    // Bind the texture to this unit.
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDataHandle4);
    // Tell the texture uniform sampler to use this texture in the shader
    // by binding to texture unit 4.
    GLES20.glUniform1i(mTextureUniformHandle4, 4);

    // Draw some cubes.
    Matrix.setIdentityM(mModelMatrix, 0);
    Matrix.translateM(mModelMatrix, 0, mTransX, mTransY, mAngle * 0.05f);
    Matrix.rotateM(mModelMatrix, 0, 0.0f, 1.0f, 1.0f, 0.0f);
    drawCube();

    int width_surfacea = width_surface;
    int height_surfacea = height_surface;

    if (GlobalVariables.getPrint() != "false") {
        String mFrameCount = "1";
        saveScreenShot(0, 0, width_surfacea, height_surfacea, "/save/test.png");
        GlobalVariables.setPrint("false");
    }
}

public void saveScreenShot(int x, int y, int w, int h, String filename) {
    Bitmap bmp = grabPixels(x, y, w, h);

    try {
        String path = Environment.getExternalStorageDirectory() + "/" + filename;

        File file = new File(path);
        file.createNewFile();

        FileOutputStream fos = new FileOutputStream(file);
        bmp.compress(CompressFormat.PNG, 100, fos);
        fos.flush();
        fos.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}

public Bitmap grabPixels(int x, int y, int w, int h) {

    int screenshotSize = w * h;

    ByteBuffer bb = ByteBuffer.allocateDirect(screenshotSize * 3);
    bb.order(ByteOrder.nativeOrder());
    bb.position(0);
    GLES20.glReadPixels(0, 0, w, h, GLES20.GL_RGB, GLES20.GL_UNSIGNED_BYTE, bb);
    int pixelsBuffer[] = new int[screenshotSize];

    bb.asIntBuffer().get(pixelsBuffer);
    bb = null;

    Bitmap bitmap = Bitmap.createBitmap(w, h, Bitmap.Config.RGB_565);

    bitmap.setPixels(pixelsBuffer, screenshotSize-w, -w, 0, 0, w, h);

    pixelsBuffer = null;

    short sBuffer[] = new short[screenshotSize];
    ShortBuffer sb = ShortBuffer.wrap(sBuffer);
    bitmap.copyPixelsToBuffer(sb);

    //Making created bitmap (from OpenGL points) compatible with Android bitmap
    for (int i = 0; i < screenshotSize; ++i) {                  
        short v = sBuffer[i];
        sBuffer[i] = (short) (((v&0x1f) << 11) | (v&0x7e0) | ((v&0xf800) >> 11));
    }
    sb.rewind();
    bitmap.copyPixelsFromBuffer(sb);


    return bitmap;

}

Solution

You have an array length mismatch. This line of code:

 bb.asIntBuffer().get(pixelsBuffer);

causes the exception because bb.asIntBuffer() is backed by only w*h*3 bytes, i.e. (w*h*3)/4 ints, while pixelsBuffer asks for w*h ints (w*h*4 bytes). So get() throws BufferUnderflowException, exactly as it should according to the docs: http://developer.android.com/reference/java/nio/IntBuffer.html#get%28int[]%29
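You can reproduce the mismatch in isolation, outside of any GL context (a standalone sketch; the 4x4 size is just an example):

```java
import java.nio.BufferUnderflowException;
import java.nio.ByteBuffer;

public class UnderflowDemo {
    public static void main(String[] args) {
        int w = 4, h = 4;
        // The RGB readback buffer: w*h*3 bytes = 48 bytes = 12 ints
        ByteBuffer bb = ByteBuffer.allocateDirect(w * h * 3);
        // But the destination array wants w*h = 16 ints (64 bytes)
        int[] pixelsBuffer = new int[w * h];
        try {
            bb.asIntBuffer().get(pixelsBuffer);
        } catch (BufferUnderflowException e) {
            System.out.println("underflow: asked for " + pixelsBuffer.length
                    + " ints, buffer holds " + bb.asIntBuffer().remaining());
        }
    }
}
```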

Update:

Try this code (warning: I haven't tried to compile it):

public Bitmap grabPixels(int x, int y, int w, int h) {

    int screenshotSize = w * h;

    // Allocate w*h*3 bytes, since we retrieve only the R, G and B channels
    ByteBuffer byteBuffer = ByteBuffer.allocateDirect(screenshotSize * 3);
    byteBuffer.order(ByteOrder.nativeOrder());
    byteBuffer.position(0);
    // Read RGB values into our byte buffer
    GLES20.glReadPixels(x, y, w, h, GLES20.GL_RGB, GLES20.GL_UNSIGNED_BYTE, byteBuffer);

    // Copy out of the direct buffer. Note: a direct buffer has no backing
    // array, so byteBuffer.array() would throw UnsupportedOperationException.
    byte[] pixelBuffer = new byte[screenshotSize * 3];
    byteBuffer.get(pixelBuffer);

    // Now we need to convert the 24-bit RGB data to ARGB_8888 format.
    // This is the array we are going to use as backend for the bitmap.
    int[] finalPixels = new int[screenshotSize];
    int j = 0;
    for (int i = 0; i < pixelBuffer.length; i += 3) {
        // Mask with 0xFF so negative byte values don't sign-extend
        int red = pixelBuffer[i] & 0xFF;
        int green = pixelBuffer[i + 1] & 0xFF;
        int blue = pixelBuffer[i + 2] & 0xFF;

        // Pack as 0xAARRGGBB, which is what createBitmap(int[], ...) expects
        finalPixels[j++] = 0xFF000000 | (red << 16) | (green << 8) | blue;
    }

    // Create a bitmap of ARGB_8888
    return Bitmap.createBitmap(finalPixels, w, h, Bitmap.Config.ARGB_8888);
}

OTHER TIPS

Sample code for extracting frames from an MPEG file can be found on bigflake. The video frames are rendered with GL, and the saveFrame() function uses glReadPixels() and Bitmap to save it to disk as a PNG.

There are some fairly significant performance pitfalls that you need to be aware of; see comments in saveFrame() for some notes about how best to use NIO. You'll also want to make sure your EGLConfig matches your pixel read format (glReadPixels() may get very slow if the source and destination formats aren't the same).
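One of those NIO notes boils down to: allocate the direct buffer once and reuse it across captures, instead of calling allocateDirect() inside onDrawFrame() every frame. A minimal sketch of that idea (the class and method names here are illustrative, not taken from bigflake):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Caches one direct readback buffer and reuses it, reallocating only
// when a larger capture is requested.
public class PixelBufCache {
    private ByteBuffer mPixelBuf;
    private int mCapacity;

    public ByteBuffer acquire(int w, int h) {
        int needed = w * h * 4; // assuming RGBA readback
        if (mPixelBuf == null || mCapacity < needed) {
            mPixelBuf = ByteBuffer.allocateDirect(needed)
                    .order(ByteOrder.nativeOrder());
            mCapacity = needed;
        }
        mPixelBuf.rewind();
        return mPixelBuf;
    }
}
```

You would then pass the returned buffer straight to glReadPixels() each frame, rather than paying the allocation cost per capture.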

Update: there's now a slightly more general implementation in Grafika -- see EglSurfaceBase#saveFrame().

I think ByteBuffer bb = ByteBuffer.allocateDirect(screenshotSize * 3); is wrong. It should be ByteBuffer bb = ByteBuffer.allocateDirect(screenshotSize * 4);
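Unrelated to buffer sizes, the check GlobalVariables.getPrint() != "false" in onDrawFrame() compares string references, not contents; it only happens to work because of string interning. Use equals() for the content comparison. A minimal illustration:

```java
public class StringCompare {
    public static void main(String[] args) {
        String a = new String("false");
        // Reference comparison: true, because a is a distinct object
        System.out.println(a != "false");        // prints true
        // Content comparison: what the flag check actually intends
        System.out.println(!a.equals("false"));  // prints false
    }
}
```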

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow