Question

Initial Context

I am developing a location-based augmented reality application and I need to get the field of view (FOV). I only update the value when the orientation changes, so I am looking for a method that can return this value when I call it.

The goal is to make a "degree ruler" that matches reality, like the following: Degree Ruler - AR App

I am already using AVCaptureSession to display the camera stream, and a path coupled with a CAShapeLayer to draw the ruler. This works pretty well, but now I have to use the field of view value to place my elements in the right spot (for example, to choose the right spacing between 160° and 170°).
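
For illustration, here is roughly the kind of mapping I have in mind for placing a tick mark once I know the FOV (a simplified sketch written in Swift just for brevity; tickBearing, deviceHeading and horizontalFOV are placeholder names, not my actual drawing code):

    import UIKit

    // Simplified sketch: map a compass bearing to an x position on screen,
    // assuming the camera points at `deviceHeading` and covers `horizontalFOV` degrees.
    func xPosition(forBearing tickBearing: Double,
                   deviceHeading: Double,
                   horizontalFOV: Double,
                   viewWidth: CGFloat) -> CGFloat? {
        // Signed angle between the tick and the screen center, normalized to -180...180
        var delta = (tickBearing - deviceHeading).truncatingRemainder(dividingBy: 360)
        if delta > 180 { delta -= 360 }
        if delta < -180 { delta += 360 }

        // Outside the camera's field of view: nothing to draw
        guard abs(delta) <= horizontalFOV / 2 else { return nil }

        // Simple linear mapping: -FOV/2 -> left edge, +FOV/2 -> right edge
        return viewWidth * CGFloat(delta / horizontalFOV + 0.5)
    }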

Currently, I am hardcoding these values based on this source: https://stackoverflow.com/a/3594424/3198096 (special thanks to @hotpaw2!), but I am not sure they are fully accurate, and they do not cover the iPhone 5, etc. I was unable to obtain values from an official source (Apple!), but there is a link showing values for all the iDevices I think I need (4, 4S, 5, 5S): AnandTech | Some thoughts about the iphone 5s camera improvements.

Note: After personal testing and some further research online, I am pretty sure these values are inaccurate! This also forces me to use an external library to check which iPhone model I am running on, so that I can initialize my FOV manually... and I have to verify my values for every supported device.

I would prefer a "code solution"!

After reading this post: iPhone: Real-time video color info, focal length, aperture?, I am trying to get EXIF data from AVCaptureStillImageOutput as suggested. I should then be able to read the focal length from the EXIF data and calculate the horizontal and vertical field of view with a formula. (Or maybe obtain the FOV directly, as shown here: http://www.brianklug.org/2011/11/a-quick-analysis-of-exif-data-from-apples-iphone-4s-camera-samples/ -- note: after a number of updates, it seems the field of view can no longer be read directly from EXIF!)


Current Status

Sources: http://iphonedevsdk.com/forum/iphone-sdk-development/112225-camera-app-working-well-on-3gs-but-not-on-4s.html and Modified EXIF data doesn't save properly

Here is the code I am using:

// Requires AVFoundation (capture) and ImageIO (EXIF reading) to be imported
AVCaptureDevice* camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (camera != nil)
{
    captureSession = [[AVCaptureSession alloc] init];
    
    AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:nil];
    
    [captureSession addInput:newVideoInput];
    
    captureLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
    captureLayer.frame = overlayCamera.bounds;
    [captureLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    previewLayerConnection=captureLayer.connection;
    [self setCameraOrientation:[[UIApplication sharedApplication] statusBarOrientation]];
    [overlayCamera.layer addSublayer:captureLayer];
    [captureSession startRunning];
    
    AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [captureSession addOutput:stillImageOutput];
    
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo] )
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }
    
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                         completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
     {
         NSData *imageNSData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
         
         // __bridge (not __bridge_retained): ARC keeps ownership of imageNSData
         CGImageSourceRef imgSource = CGImageSourceCreateWithData((__bridge CFDataRef)imageNSData, NULL);

         // __bridge_transfer hands the copied properties dictionary over to ARC
         NSDictionary *metadata = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(imgSource, 0, NULL);
         CFRelease(imgSource);

         NSMutableDictionary *metadataAsMutable = [metadata mutableCopy];
         
         NSMutableDictionary *EXIFDictionary = [[metadataAsMutable objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];

         if (!EXIFDictionary)
             EXIFDictionary = [NSMutableDictionary dictionary];

         [metadataAsMutable setObject:EXIFDictionary forKey:(NSString *)kCGImagePropertyExifDictionary];
         
         NSLog(@"%@",EXIFDictionary);
     }];
}

Here is the output:

{
    ApertureValue = "2.52606882168926";
    BrightnessValue = "0.5019629837352776";
    ColorSpace = 1;
    ComponentsConfiguration =     (
        1,
        2,
        3,
        0
    );
    ExifVersion =     (
        2,
        2,
        1
    );
    ExposureMode = 0;
    ExposureProgram = 2;
    ExposureTime = "0.008333333333333333";
    FNumber = "2.4";
    Flash = 16;
    FlashPixVersion =     (
        1,
        0
    );
    FocalLenIn35mmFilm = 40;
    FocalLength = "4.28";
    ISOSpeedRatings =     (
        50
    );
    LensMake = Apple;
    LensModel = "iPhone 4S back camera 4.28mm f/2.4";
    LensSpecification =     (
        "4.28",
        "4.28",
        "2.4",
        "2.4"
    );
    MeteringMode = 5;
    PixelXDimension = 1920;
    PixelYDimension = 1080;
    SceneCaptureType = 0;
    SceneType = 1;
    SensingMethod = 2;
    ShutterSpeedValue = "6.906947890818858";
    SubjectDistance = "69.999";
    UserComment = "[S.D.] kCGImagePropertyExifUserComment";
    WhiteBalance = 0;
}

I think I have everything I need to calculate the FOV. But are these the right values? After reading lots of different websites giving different focal-length values, I am a bit confused! Also, my PixelDimensions values seem to be wrong!

Based on http://en.wikipedia.org/wiki/Angle_of_view, this is the formula I plan to use:

FOV = (IN_DEGREES(   2*atan( (d) / (2  * f) )   ));
// d = sensor dimensions (mm)
// f = focal length (mm)
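
For reference, here is how I would apply that formula as a rough cross-check, using the 35mm-equivalent focal length from the EXIF above (FocalLenIn35mmFilm = 40) against the standard 36 mm × 24 mm full frame. I am not sure this check is valid, since the 1920×1080 video frame is probably a crop of the still sensor (Swift, just as a sketch):

    import Foundation

    // FOV = 2 * atan(d / (2 * f)), converted to degrees
    func fieldOfView(sensorDimension d: Double, focalLength f: Double) -> Double {
        return 2 * atan(d / (2 * f)) * 180 / .pi
    }

    // Rough check with the 35mm-equivalent focal length (FocalLenIn35mmFilm = 40)
    // against a full-frame "sensor" of 36 mm x 24 mm
    let horizontal = fieldOfView(sensorDimension: 36, focalLength: 40) // ≈ 48.5°
    let vertical   = fieldOfView(sensorDimension: 24, focalLength: 40) // ≈ 33.4°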

My Question

Do my method and my formula look right, and if so, which values should I pass to the function?


Clarifications

  • FOV is what I think I need to use; if you have any other suggestion for how the ruler could be made to match reality, I would accept that answer!
  • Zoom is disabled in the augmented reality view controller, so my field of view is fixed when the camera is initialized and cannot change until the user rotates the phone!


Solution

In iOS 7 and above you can do something along these lines:

float FOV = camera.activeFormat.videoFieldOfView;

where camera is your AVCaptureDevice. Depending on what preset you choose for the video session, this can change even on the same device. It's the horizontal field-of-view (in degrees), so you'll need to calculate the vertical field-of-view from the display dimensions.

Here's Apple's reference material.
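
For instance, a minimal sketch in current Swift syntax (the 16:9 ratio below is an assumption; use your format's real aspect ratio, and note that converting via tangents is more accurate than scaling the angle linearly):

    import AVFoundation

    // Read the horizontal FOV (degrees) reported for the camera's active format (iOS 7+)
    if let camera = AVCaptureDevice.default(for: .video) {
        let hFOVDegrees = Double(camera.activeFormat.videoFieldOfView)

        // Derive the vertical FOV from the frame's aspect ratio.
        // Angles of view are not proportional to frame dimensions, so convert via tangents.
        let hFOVRadians = hFOVDegrees * .pi / 180
        let vFOVRadians = 2 * atan(tan(hFOVRadians / 2) * 9.0 / 16.0)   // assumes 16:9 video
        let vFOVDegrees = vFOVRadians * 180 / .pi

        print("Horizontal FOV: \(hFOVDegrees)°, vertical FOV: \(vFOVDegrees)°")
    }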

OTHER TIPS

To answer your question:

Do my method and my formula look right...?

Maybe, but they also look too complex.

...and if yes, which values do I pass to the function?

I don't know, but if the goal is to calculate HFOV and VFOV, here is a code example that programmatically finds the horizontal viewing angle (the only viewing angle one can access in Swift at the moment) and then calculates the vertical viewing angle based on the 16:9 aspect ratio of the iPhone 6.

    // Find the back-facing video camera
    let devices = AVCaptureDevice.devices()
    var captureDevice : AVCaptureDevice?
    for device in devices {
        if (device.hasMediaType(AVMediaTypeVideo)) {
            if (device.position == AVCaptureDevicePosition.Back) {
                captureDevice = device as? AVCaptureDevice
            }
        }
    }
    if let retrievedDevice = captureDevice {
        // Horizontal FOV (degrees) reported for the device's active format
        let HFOV : Float = retrievedDevice.activeFormat.videoFieldOfView
        // Approximate vertical FOV by scaling to a 16:9 frame
        let VFOV : Float = (HFOV / 16.0) * 9.0
    }

Also remember to import AVFoundation if you want this to work!

Apple has also released a list with all camera specification details including FOV (Field of View).

https://developer.apple.com/library/ios/documentation/DeviceInformation/Reference/iOSDeviceCompatibility/Cameras/Cameras.html

The values match the values that can be retrieved using:

float FOV = camera.activeFormat.videoFieldOfView;

From the FOV equation, you need the focal length and the sensor dimensions. EXIF data has the focal length but not the sensor dimensions. I have found only one camera vendor (Canon) that provides sensor dimensions in the metadata -- and that is in their CRW raw files. So you will have to do a table lookup for sensor dimensions based on the camera or smartphone. The following Wikipedia link lists sensor dimensions for numerous cameras and some newer iPhones; the list is near the end of the article: http://en.wikipedia.org/wiki/Image_sensor_format
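
For example, a small lookup along these lines (a sketch only; the dimensions are the commonly quoted figures for those sensor formats from that Wikipedia table, so verify them for every device you support):

    import Foundation

    // Sensor widths (mm) for common optical formats, per the Wikipedia sensor-format table.
    // Verify these per device before relying on them.
    let sensorWidthMM: [String: Double] = [
        "1/3.2\"": 4.54,   // e.g. iPhone 4S-class sensors
        "1/3\"":   4.80    // e.g. iPhone 5/5s-class sensors
    ]

    func horizontalFOV(sensorWidth d: Double, focalLength f: Double) -> Double {
        return 2 * atan(d / (2 * f)) * 180 / .pi
    }

    // Example: a 1/3.2" sensor with the 4.28 mm focal length from the EXIF in the question
    if let width = sensorWidthMM["1/3.2\""] {
        print(horizontalFOV(sensorWidth: width, focalLength: 4.28))   // ≈ 56°
    }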

Another source for this type of information might be various photography blogs. Hope this helps.

If digital zoom is used, multiply the focal length by the zoom value. The Exif data contains the zoom factor.

Photography blogs and web sites such as http://www.kenrockwell.com have lots of information on camera sensor dimensions.

A correction: there actually are EXIF tags for the focal plane X resolution and focal plane Y resolution (in pixels per resolution unit) and the focal plane resolution unit. From those tags and the image dimensions, you can compute the sensor size. However, not all cameras provide those EXIF tags; the iPhone 4, for example, does not.
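
As a hedged sketch of that computation (the exif dictionary here is the EXIF dictionary read via ImageIO, as in the question; the focal-plane keys are simply missing on cameras that do not provide them):

    import Foundation
    import ImageIO

    // Estimate the sensor width (mm) from the EXIF focal-plane tags, then compute the FOV
    func horizontalFOV(fromExif exif: [String: Any]) -> Double? {
        guard
            let xResolution = exif[kCGImagePropertyExifFocalPlaneXResolution as String] as? Double,
            let unit = exif[kCGImagePropertyExifFocalPlaneResolutionUnit as String] as? Int,
            let pixelWidth = exif[kCGImagePropertyExifPixelXDimension as String] as? Double,
            var focalLength = exif[kCGImagePropertyExifFocalLength as String] as? Double
        else { return nil }

        // Focal-plane resolution unit: 2 = inches, 3 = centimeters (per the EXIF spec)
        let unitInMM: Double
        switch unit {
        case 2: unitInMM = 25.4
        case 3: unitInMM = 10.0
        default: return nil
        }

        // Sensor width = image width (px) / focal-plane resolution (px per unit) * mm per unit
        let sensorWidthMM = pixelWidth / xResolution * unitInMM

        // If digital zoom was used, the effective focal length is scaled by the zoom ratio
        if let zoom = exif[kCGImagePropertyExifDigitalZoomRatio as String] as? Double, zoom > 1 {
            focalLength *= zoom
        }

        return 2 * atan(sensorWidthMM / (2 * focalLength)) * 180 / .pi
    }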

The diameter of the field of view is the diameter of the circle of light you can see when you look into a microscope.

If about ten round cells in a row could fit across the diameter of the field of view, and if you know that diameter in millimeters, then you can divide the total diameter by the number of cells that fit to find the size of each cell:

total diameter ÷ number of cells that fit = size (diameter) of each cell

Example

If the diameter of the field of view is 1.5mm and ten cells would fit if they were arranged in a line, then each cell is:

1.5mm/10 = 0.15mm

0.15 millimeters

REMEMBER that every time you change the magnification, you change the diameter of the circle of light you can see. So each magnification will have its own diameter.

If you put a ruler under a microscope, the more you zoom in, the fewer the number of lines on the ruler you can see.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow