Question

I'm running into memory issues while reading video frames from a video chosen from the iPhone library. At first I added the UIImage frames themselves to an array, but I suspected the array grew too large for memory after a while, so instead I now save each UIImage to the Documents folder and add only the image path to the array. However, I still get the same memory warnings, even though Instruments shows total allocations never exceeding 2.5 MB and finds no leaks. Can anyone think of something?

-(void)addFrame:(UIImage *)image
{
    NSString *imgPath = [NSString stringWithFormat:@"%@/Analysis%d-%d.png", docFolder, currentIndex, framesArray.count];       
    [UIImagePNGRepresentation(image) writeToFile:imgPath atomically:YES];
    [framesArray addObject:imgPath];    
    frameCount++;      
}

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [picker dismissModalViewControllerAnimated:YES];
    [framesArray removeAllObjects];    
    frameCount = 0;          

    // incoming video
    NSURL *videoURL = [info valueForKey:UIImagePickerControllerMediaURL];
    //NSLog(@"Video : %@", videoURL);

    // AVURLAsset to read input movie (i.e. mov recorded to local storage)
    NSDictionary *inputOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
    AVURLAsset *inputAsset = [[AVURLAsset alloc] initWithURL:videoURL options:inputOptions];     

    // Load the input asset tracks information
    [inputAsset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler: ^{        

        NSError *error = nil;
        nrFrames = CMTimeGetSeconds([inputAsset duration]) * 30;
        NSLog(@"Total frames = %d", nrFrames);

        // Check status of "tracks", make sure they were loaded    
        AVKeyValueStatus tracksStatus = [inputAsset statusOfValueForKey:@"tracks" error:&error];
        if (tracksStatus != AVKeyValueStatusLoaded) {
            // failed to load
            return;
        }

        /* Read video samples from input asset video track */
        AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:inputAsset error:&error];

        NSMutableDictionary *outputSettings = [NSMutableDictionary dictionary];
        [outputSettings setObject: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]  forKey: (NSString*)kCVPixelBufferPixelFormatTypeKey];
        AVAssetReaderTrackOutput *readerVideoTrackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:[[inputAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] outputSettings:outputSettings];


        // Assign the tracks to the reader and start to read
        [reader addOutput:readerVideoTrackOutput];
        if ([reader startReading] == NO) {
            // Handle error
            NSLog(@"Error reading");
        }

        NSAutoreleasePool *pool = [NSAutoreleasePool new];
        while (reader.status == AVAssetReaderStatusReading)
        {            
            if(!memoryProblem)
            {
                CMSampleBufferRef sampleBufferRef = [readerVideoTrackOutput copyNextSampleBuffer];
                if (sampleBufferRef) 
                {
                    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBufferRef);
                    /*Lock the image buffer*/
                    CVPixelBufferLockBaseAddress(imageBuffer,0); 
                    /*Get information about the image*/
                    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer); 
                    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
                    size_t width = CVPixelBufferGetWidth(imageBuffer); 
                    size_t height = CVPixelBufferGetHeight(imageBuffer); 

                    /*We unlock the  image buffer*/
                    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

                    /*Create a CGImageRef from the CVImageBufferRef*/
                    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
                    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
                    CGImageRef newImage = CGBitmapContextCreateImage(newContext); 

                    /*We release some components*/
                    CGContextRelease(newContext); 
                    CGColorSpaceRelease(colorSpace);

                    UIImage *image= [UIImage imageWithCGImage:newImage scale:[UIScreen mainScreen].scale orientation:UIImageOrientationRight];          
                    //[self addFrame:image];
                    [self performSelectorOnMainThread:@selector(addFrame:) withObject:image waitUntilDone:YES];

                    /*We release the CGImageRef*/
                    CGImageRelease(newImage);                    

                    CMSampleBufferInvalidate(sampleBufferRef);
                    CFRelease(sampleBufferRef);
                    sampleBufferRef = NULL;
                }
            }
            else 
            {                
                break;
            }            
        }
        [pool release];

        NSLog(@"Finished");        
    }];   
}

Solution

Try one thing:

Move the NSAutoreleasePool inside the while loop and drain it at the end of every iteration. The objects created on each pass (for example the autoreleased UIImage from imageWithCGImage: and the PNG data from UIImagePNGRepresentation) are only freed when the enclosing pool drains; with the pool outside the loop, they all accumulate in memory until the entire video has been read. Instruments can miss this because the memory is not leaked, just not yet released.

The loop would then look like this:

while (reader.status == AVAssetReaderStatusReading)
{            
    NSAutoreleasePool *pool = [NSAutoreleasePool new];

    .....

    [pool drain];
} 
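As a side note (not part of the original answer): on newer SDKs, and under ARC where NSAutoreleasePool cannot be used directly, the same per-iteration draining is written with an @autoreleasepool block. A sketch, with the frame-conversion body elided:

while (reader.status == AVAssetReaderStatusReading)
{
    @autoreleasepool {
        CMSampleBufferRef sampleBufferRef = [readerVideoTrackOutput copyNextSampleBuffer];
        if (sampleBufferRef)
        {
            // ... convert the buffer to a UIImage and save it, as above ...

            CMSampleBufferInvalidate(sampleBufferRef);
            CFRelease(sampleBufferRef);
        }
    } // autoreleased objects from this iteration are released here
}

The @autoreleasepool construct is also faster than allocating an NSAutoreleasePool object, so it is the preferred form even in manual-retain-release code.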
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow