Problem

I wrote a custom image picker based on ALAssetsLibrary. Everything works fine except with VoiceOver: every photo is announced only as "Button", which I think is not good.

So I checked the built-in Photos app on iOS; VoiceOver speaks the following information for each photo:

  1. Whether it is a photo, video, screenshot, etc.
  2. Whether it is portrait or landscape.
  3. Its creation date.
  4. Whether it is sharp or blurry.
  5. Whether it is bright or dark.

I think I can get the first three from ALAsset's properties, which are (see the sketch after the list):

  1. ALAssetPropertyType
  2. ALAssetPropertyOrientation
  3. ALAssetPropertyDate
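
For example, something like this might turn those three properties into a spoken label (just a sketch; the helper name is mine, and orientation is judged from the pixel dimensions rather than ALAssetPropertyOrientation):

- (NSString *)accessibilityLabelForAsset:(ALAsset *)asset
{
    NSMutableArray *parts = [NSMutableArray array];

    // 1. Photo or video
    NSString *type = [asset valueForProperty:ALAssetPropertyType];
    [parts addObject:([type isEqualToString:ALAssetTypeVideo] ? @"Video" : @"Photo")];

    // 2. Portrait or landscape, judged from the default representation's pixel size
    CGSize size = [[asset defaultRepresentation] dimensions];
    [parts addObject:(size.height >= size.width ? @"Portrait" : @"Landscape")];

    // 3. Creation date, spoken in a localized format
    NSDate *date = [asset valueForProperty:ALAssetPropertyDate];
    if (date) {
        [parts addObject:[NSDateFormatter localizedStringFromDate:date
                                                         dateStyle:NSDateFormatterMediumStyle
                                                         timeStyle:NSDateFormatterShortStyle]];
    }

    return [parts componentsJoinedByString:@", "];
}

The result could then be assigned to each photo cell's accessibilityLabel so VoiceOver no longer just says "Button".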

But what about sharpness and brightness? Can I get them from the image metadata, or derive them myself?

Update:

In the photo's EXIF metadata:

  1. Brightness is available for photos taken directly with the camera, but photos saved from the web or captured from the screen always return a nil value.
  2. Sharpness is always nil in the EXIF. According to the documentation, the sharpness value is "the sharpness applied to the image", so I think it is meant for image-processing apps (such as Aperture).

But the Photos app always has sensible brightness and sharpness values for every kind of photo, so is it possible to do this ourselves?
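
For example, would something along these lines be a reasonable way to estimate brightness ourselves? (Just a sketch that averages the luminance of a downscaled copy; the function name is mine, and I have no idea whether Photos does anything like this.)

#import <UIKit/UIKit.h>
#include <stdlib.h>

// Rough average brightness in the range 0.0 (black) to 1.0 (white),
// computed by drawing the image into a small RGBA bitmap and averaging
// the luma of every pixel.
static CGFloat MYAverageBrightness(UIImage *image)
{
    const size_t side = 32; // downscale to 32x32 so the loop stays cheap
    unsigned char *pixels = calloc(side * side * 4, sizeof(unsigned char));
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pixels, side, side, 8, side * 4,
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    if (!context) {
        free(pixels);
        return 0.0;
    }

    CGContextDrawImage(context, CGRectMake(0, 0, side, side), image.CGImage);

    double total = 0.0;
    for (size_t i = 0; i < side * side; i++) {
        // Rec. 601 luma weights on the R, G, B bytes of each RGBA pixel
        total += (0.299 * pixels[i * 4] + 0.587 * pixels[i * 4 + 1] + 0.114 * pixels[i * 4 + 2]) / 255.0;
    }

    CGContextRelease(context);
    free(pixels);
    return (CGFloat)(total / (side * side));
}

A thumbnail would probably be enough as input, e.g. [UIImage imageWithCGImage:[asset aspectRatioThumbnail]], since only a rough bright/dark classification is needed. A blur estimate could presumably be derived in a similar way from differences between neighbouring pixels.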


Solution

You can get these values from the EXIF metadata.

All the keys are referenced in Apple's docs here and here.

Here is an example:

// The kCGImageProperty* keys come from the ImageIO framework
#import <ImageIO/ImageIO.h>

NSDictionary *allMetadata = [[asset defaultRepresentation] metadata];
NSDictionary *exif = [allMetadata objectForKey:(NSString *)kCGImagePropertyExifDictionary];

and then read sharpness and brightness:

NSNumber *sharpness = [exif objectForKey:(NSString*)kCGImagePropertyExifSharpness];
NSNumber *brightness = [exif objectForKey:(NSString*)kCGImagePropertyExifBrightnessValue];
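
Both values can still be nil (for screenshots or images saved from the web, as noted in the question), so check before building a spoken description. The mapping below is only illustrative: the 0.0 cut-off for brightness is an arbitrary placeholder, and EXIF sharpness describes the in-camera sharpening setting rather than measured blur.

NSMutableArray *parts = [NSMutableArray array];
if (brightness != nil) {
    // BrightnessValue is in APEX units; 0.0 as the bright/dark cut-off
    // is a placeholder, not a documented threshold
    [parts addObject:([brightness doubleValue] > 0.0 ? @"Bright" : @"Dark")];
}
if (sharpness != nil) {
    // Sharpness reflects the camera's sharpening setting, so treat it as a hint only
    [parts addObject:[NSString stringWithFormat:@"Sharpness %@", sharpness]];
}
NSString *extraInfo = [parts componentsJoinedByString:@", "];

The extraInfo string can then be appended to the label built from the ALAsset properties before assigning it to the cell's accessibilityLabel.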