Question

I am using AVFoundation to retrieve images (from the iPhone 4 front camera) through the delegate callback. I have specified BGRA as the format with the following:

self.theOutput.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];

where self.theOutput is of type AVCaptureVideoDataOutput.

And I can preview the image correctly.
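
For reference, here is a minimal sketch of how individual pixel bytes can be read inside the standard AVCaptureVideoDataOutputSampleBufferDelegate callback, assuming the output was configured with kCVPixelFormatType_32BGRA as above. It samples the center pixel of each frame:

#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    size_t width       = CVPixelBufferGetWidth(pixelBuffer);
    size_t height      = CVPixelBufferGetHeight(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    uint8_t *base      = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);

    // In 32BGRA the byte order within each pixel is B, G, R, A,
    // and each row may be padded, hence bytesPerRow instead of width * 4.
    uint8_t *pixel = base + (height / 2) * bytesPerRow + (width / 2) * 4;
    NSLog(@"center pixel  B:%u  G:%u  R:%u", pixel[0], pixel[1], pixel[2]);

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}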

  1. I have been trying to find a simple way to inspect individual pixels' RGB values to understand the data better. Holding a blue card over the camera, for example, I expected something close to (0, 0, 255), but the numbers just fluctuate between 80 and 110. What gives?

  2. When I hold a white card over the camera, I would expect 255 for each of R, G, and B, yet I am getting close to 120 for all three. It seems everything on the white card is being discounted by about 50%. Any reason? Yet when I let direct light shine onto the camera, I do get 255 for each of the RGB elements.

I feel I am missing some elementary understanding here.


Solution

"I would expect 255 for each of RGB, yet I am getting close to 120 for all."

A camera will try to automatically adjust its exposure to get the best picture. In this case it is probably assuming that the average brightness of the image should land around the middle of the range (roughly 128), and it adjusts exposure to obtain that value. The camera can't tell whether the card you're holding in front of it is white, black, or middle gray, so it assumes middle gray. When you shine a light directly into the lens, the source is bright enough to saturate the sensor even after the exposure has been reduced, which is why those pixels clip to 255. Similarly, auto white balance likely pulls the channels of a frame dominated by your blue card toward neutral, which is why you see mid-range values in all three channels rather than a pure (0, 0, 255).
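
If you want stable, comparable readings, one option is to lock the camera's automatic adjustments once they have settled. Here is a minimal sketch, assuming device is the AVCaptureDevice backing your front-camera input:

NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    // Freeze exposure and white balance at their current values so that
    // pixel readings stop drifting as the scene changes.
    if ([device isExposureModeSupported:AVCaptureExposureModeLocked]) {
        device.exposureMode = AVCaptureExposureModeLocked;
    }
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
        device.whiteBalanceMode = AVCaptureWhiteBalanceModeLocked;
    }
    [device unlockForConfiguration];
} else {
    NSLog(@"Could not lock device for configuration: %@", error);
}

Note that the locked values still reflect whatever the auto algorithms had converged on; locking only prevents further adjustment between frames.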
