Question

I am using AVFoundation to capture camera frames, and I would like to process and display them in a UIImageView, but I am having some trouble. I have the code:

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{ 
    //NSLog(@"Capturing\n");
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];     

    NSLog(@"Image: %f %f\n", image.size.height, image.size.width);
    [imageView setImage:image];
}

However, it won't display. The correct size shows up in the NSLog, and when I put:

[imageView setImage:[UIImage imageNamed:@"SomethingElse.png"]]; 

in viewDidLoad, an image is displayed correctly (so I know the UIImageView is connected correctly).

Is there any reason this shouldn't work? I am at a loss right now.

Cheers, Brett


Solution

Are you doing this on the main thread? The capture delegate is called on whatever dispatch queue you passed to setSampleBufferDelegate:queue:, and UIKit views must only be updated from the main thread, which is why the image never appears even though the size logs correctly. You can either use the main queue as the delegate queue (undesirable, because you're doing processing first) or dispatch just the view update back to the main queue:

dispatch_async(dispatch_get_main_queue(), ^{
    imageView.image = ...;
});
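For completeness, here is how that fits into the delegate method from the question. This is a minimal sketch assuming the same imageView outlet and imageFromSampleBuffer: helper shown above:

// Delegate routine: sample buffers arrive on the capture queue,
// so only the UIImageView update is dispatched to the main queue.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // Do the (potentially expensive) conversion off the main thread
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

    // UIKit is not thread-safe; touch the view on the main queue only
    dispatch_async(dispatch_get_main_queue(), ^{
        imageView.image = image;
    });
}

The alternative is to pass dispatch_get_main_queue() as the queue argument of -[AVCaptureVideoDataOutput setSampleBufferDelegate:queue:], but then the buffer conversion runs on the main thread as well, which can drop frames.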

OTHER TIPS

Is imageView set correctly? If imageView is actually nil, your call to [imageView setImage:image]; will silently do nothing.
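One quick way to rule that out is an assertion once the view has loaded, a small sketch assuming the outlet is named imageView as in the question:

// e.g. at the end of viewDidLoad:
NSAssert(imageView != nil, @"imageView outlet is not connected");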
