Question

I am trying to save an OpenGL buffer (what's currently displayed in the view) to the device's photo library. The code snippet below works fine on the simulator, but it crashes on the actual device. I believe there could be a problem with the way I'm creating the UIImage captured from the screen.

  • This operation is initiated via an IBAction event handler method.
  • The function I use to save the image is UIImageWriteToSavedPhotosAlbum (I recently changed this to ALAssetsLibrary's writeImageToSavedPhotosAlbum).
  • I have ensured that my app is authorized to access the Photos library.
  • I also made sure that my CGImageRef is globally defined (at the top of the file) and my UIImage is a (nonatomic, retain) property.

Can somebody help me fix this issue? I'd like to have a valid UIImage reference that was generated from the glReadPixels data.

Below is the relevant code snippet (call to save to photo library):

-(void)TakeImageBufferSnapshot:(CGSize)dimensions
{
   NSLog(@"TakeSnapShot 1 : (%f, %f)", dimensions.width, dimensions.height);
   NSInteger size = dimensions.width * dimensions.height * 4;
   GLubyte *buffer = (GLubyte *) malloc(size);
   glReadPixels(0, 0, dimensions.width, dimensions.height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
   GLubyte *buffer2 = (GLubyte *) malloc(size);
   int height = (int)dimensions.height - 1;
   int width = (int)dimensions.width;

   for(int y = 0; y < dimensions.height; y++)
   {
       for(int x = 0; x < dimensions.width * 4; x++)
       {
           buffer2[(height - 1 - y) * width * 4 + x] = buffer[y * 4 * width + x];
       }
   }

   NSLog(@"TakeSnapShot 2");

   // make data provider with data.
   CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer2, size, NULL);

   if (buffer) free(buffer);
   if (buffer2) free(buffer2);

   // prep the ingredients
   int bitsPerComponent = 8;
   int bitsPerPixel = 32;
   int bytesPerRow = 4 * self.view.bounds.size.width;
   CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
   CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
   CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

   NSLog(@"TakeSnapShot 3");

   // make the cgimage
   g_savePhotoImageRef = CGImageCreate(dimensions.width, dimensions.height, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);

   NSLog(@"TakeSnapShot 4");

   // then make the uiimage from that
   self.savePhotoImage = [UIImage imageWithCGImage:g_savePhotoImageRef];

   CGColorSpaceRelease(colorSpaceRef);
   CGDataProviderRelease(provider);
}

-(void)SaveToPhotoAlbum
{
   ALAuthorizationStatus status = [ALAssetsLibrary authorizationStatus];
   NSLog(@"Authorization status: %d", status);

   if (status == ALAuthorizationStatusAuthorized)
   {
       [self TakeImageBufferSnapshot:self.view.bounds.size];

        // UPDATED - DO NOT proceed to save to the album below.
        // Instead, set the created image on a UIImageView IBOutlet.
        // On the simulator this shows the captured screen/buffer image (as expected) -
        // but on the device (iPad) it doesn't show anything and the app crashes.
       self.testImageView.image = self.savePhotoImage;
       return;

       NSLog(@"Saving to photo album...");
       UIImageWriteToSavedPhotosAlbum(self.savePhotoImage,
                                   self,
                                       @selector(photoAlbumImageSave:didFinishSavingWithError:contextInfo:),
                                   nil);
   }
   else
   {
       UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Access is denied"
                                                    message:@"Allow access to your Photos library to save this image."
                                                   delegate:nil
                                          cancelButtonTitle:@"Close"
                                          otherButtonTitles:nil, nil];
       [alert show];
   }
}

- (void)photoAlbumImageSave:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)context
{
   self.savePhotoImage = nil;
   CGImageRelease(g_savePhotoImageRef);

   if (error)
   {
       NSLog(@"Error saving photo to albums: %@", error.description);
   }
   else
   {
       NSLog(@"Saved to albums!");
   }
}

* Update * I think I've managed to narrow down my issue. I started doing trial and error, running the app (on the device) after commenting out lines of code to narrow things down. It looks like I may have a problem with the TakeImageBufferSnapshot function, which takes the screen buffer (using glReadPixels) and creates a CGImageRef. When I then try to create a UIImage out of this (using the [UIImage imageWithCGImage:] method), the app crashes. If I comment this line out there seems to be no issue (other than the fact that I don't have a UIImage reference).

I basically need a valid UIImage reference so that i can save it to the photo library (which seems to work just fine using test images).

Solution

First, I should point out that glReadPixels() may not behave the way you expect. If you try to use it to read from the screen after -presentRenderbuffer: has been called, the results are undefined. On iOS 6.0+, this returns a black image, for example. You need to either use glReadPixels() right before the content is presented to the screen (my recommendation) or enable retained backing for your OpenGL ES context (which has adverse performance consequences).
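
If you go the retained-backing route, the opt-in happens where the CAEAGLLayer is configured. A minimal sketch, assuming self is a UIView whose +layerClass returns [CAEAGLLayer class]:

// Minimal sketch: retained backing preserves the renderbuffer contents
// across -presentRenderbuffer:, at a performance cost.
CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
eaglLayer.drawableProperties = @{
    kEAGLDrawablePropertyRetainedBacking : @YES,
    kEAGLDrawablePropertyColorFormat     : kEAGLColorFormatRGBA8
};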

Second, there's no need for the two buffers. You can capture directly into one and use that to create your CGImageRef.

To your core issue, the problem is that you are deallocating your raw image byte buffer while your CGImageRef / UIImage is still relying on it. This pulls the rug out from underneath your UIImage and will lead to the image corruption / crashing you are seeing. To account for this, you need to put in place a callback function to be triggered on the deallocation of your CGDataProvider. This is how I do this within my GPUImage framework:

rawImagePixels = (GLubyte *)malloc(totalBytesForImage);
glReadPixels(0, 0, (int)currentFBOSize.width, (int)currentFBOSize.height, GL_RGBA, GL_UNSIGNED_BYTE, rawImagePixels);
dataProvider = CGDataProviderCreateWithData(NULL, rawImagePixels, totalBytesForImage, dataProviderReleaseCallback);
cgImageFromBytes = CGImageCreate((int)currentFBOSize.width, (int)currentFBOSize.height, 8, 32, 4 * (int)currentFBOSize.width, defaultRGBColorSpace, kCGBitmapByteOrderDefault | kCGImageAlphaLast, dataProvider, NULL, NO, kCGRenderingIntentDefault);
CGDataProviderRelease(dataProvider);

The callback function takes this form:

void dataProviderReleaseCallback (void *info, const void *data, size_t size)
{
    free((void *)data);
}

This function will be called only when the UIImage containing your CGImageRef (and, by extension, the CGDataProvider) is deallocated. Until that point, the buffer containing your image bytes stays alive.
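
If you'd rather not manage the callback yourself, one alternative (not from the original answer) is to hand ownership of the bytes to an NSData and build the provider from that. A minimal sketch under ARC, reusing the rawImagePixels and totalBytesForImage names from the snippet above:

// Alternative sketch: the NSData takes ownership of the malloc'd bytes
// (freeWhenDone:YES means it calls free() on deallocation), so no
// explicit release callback is needed.
NSData *pixelData = [NSData dataWithBytesNoCopy:rawImagePixels
                                         length:totalBytesForImage
                                   freeWhenDone:YES];
CGDataProviderRef dataProvider =
    CGDataProviderCreateWithCFData((__bridge CFDataRef)pixelData);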

You can examine how I do this within GPUImage, as a functional example. Take a look at the GPUImageFilter class for how I extract images from an OpenGL ES frame, including a faster method using texture caches instead of glReadPixels().
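
For reference, the texture-cache approach renders into a CVPixelBuffer-backed texture so the bytes can be read back without a glReadPixels copy. What follows is only a rough sketch of that idea (iOS 5+), not GPUImage's actual code; width, height, and the surrounding FBO setup are assumed to come from your own code:

#import <CoreVideo/CoreVideo.h>

// Create a texture cache tied to the current GL context.
CVOpenGLESTextureCacheRef textureCache = NULL;
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL,
                             [EAGLContext currentContext], NULL, &textureCache);

// Create an IOSurface-backed pixel buffer to render into.
CVPixelBufferRef renderTarget = NULL;
NSDictionary *attrs = @{ (id)kCVPixelBufferIOSurfacePropertiesKey : @{} };
CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_32BGRA,
                    (__bridge CFDictionaryRef)attrs, &renderTarget);

// Wrap the pixel buffer in a GL texture and attach it to the FBO.
CVOpenGLESTextureRef renderTexture = NULL;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache,
                                             renderTarget, NULL, GL_TEXTURE_2D,
                                             GL_RGBA, (GLsizei)width, (GLsizei)height,
                                             GL_BGRA, GL_UNSIGNED_BYTE, 0, &renderTexture);
glBindTexture(GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture));
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                       CVOpenGLESTextureGetName(renderTexture), 0);

// ... draw the scene ...

glFinish(); // make sure the GPU is done before touching the bytes
CVPixelBufferLockBaseAddress(renderTarget, 0);
GLubyte *rawPixels = (GLubyte *)CVPixelBufferGetBaseAddress(renderTarget);
// ... build the CGImage from rawPixels as above (note: BGRA byte order) ...
CVPixelBufferUnlockBaseAddress(renderTarget, 0);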

OTHER TIPS

Well, from my experience you cannot just grab the pixels that happen to be in the buffer right now; you need to re-establish the right context, draw, and grab them THEN, before finally releasing the context.

=> This is mainly true for the device, and for iOS 6 in particular.

EAGLContext *previousContext = [EAGLContext currentContext];
[EAGLContext setCurrentContext:self.context];

[self fillBuffer:sender];

// GRAB the pixels here

[EAGLContext setCurrentContext:previousContext];

Alternatively (that's how I do it), create a new framebuffer, fill THAT, and grab the pixels from THERE:

GLuint rttFramebuffer;
glGenFramebuffers(1, &rttFramebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, rttFramebuffer);

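// NOTE: fillBuffer: is assumed to attach a color renderbuffer or texture
// to this framebuffer before drawing; an FBO with no color attachment is
// incomplete and cannot be read from.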
[self fillBuffer:self.displayLink];

size_t size = viewportHeight * viewportWidth * 4;
GLubyte *pixels = (GLubyte *)malloc(size * sizeof(GLubyte));
glPixelStorei(GL_PACK_ALIGNMENT, 4);
glReadPixels(0, 0, viewportWidth, viewportHeight, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

// Restore the original framebuffer binding
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
glDeleteFramebuffers(1, &rttFramebuffer);

size_t bitsPerComponent = 8;
size_t bitsPerPixel = 32;
size_t bytesPerRow = viewportWidth * bitsPerPixel / bitsPerComponent;
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGBitmapInfo bitmapInfo = kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast;
CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, pixels, size, ImageProviderReleaseData);
CGImageRef cgImage = CGImageCreate(viewportWidth, viewportHeight, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpace, bitmapInfo, provider, NULL, true, kCGRenderingIntentDefault);
CGDataProviderRelease(provider);

UIImage *image = [UIImage imageWithCGImage:cgImage scale:self.contentScaleFactor orientation:UIImageOrientationDownMirrored];
CGImageRelease(cgImage);
CGColorSpaceRelease(colorSpace);
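
The ImageProviderReleaseData callback referenced above isn't shown in this answer; presumably it just frees the pixel buffer, mirroring the callback from the accepted answer:

// Presumed definition: free the pixel buffer once CoreGraphics is done with it.
void ImageProviderReleaseData(void *info, const void *data, size_t size)
{
    free((void *)data);
}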

Edit: removed call to presentBuffer

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow