Question

I have ~3200 photos on my iPhone taken over 2 years and would like to create an app that analyzes and plots when I took these pictures (e.g., I was photographing very actively for a week, and then stopped). Is there a way to do bulk processing of camera roll photos on iPhone (with user permission) to extract the photos' timestamps?

I think that if I back up my photos to my Mac with the Image Capture app, I could do this kind of metric analysis with a Mac app, but the import process would take hours and I'm looking for something simpler.


Solution

Take a look at ALAssetsLibrary

An instance of ALAssetsLibrary provides access to the videos and photos that are under the control of the Photos application.

Here is Apple's sample application PhotosByLocation.

Demonstrates how to use the AssetsLibrary APIs to provide a custom image picking UI. The user experience is centered around the idea of using the assets location and time metadata as a basis for certain features.
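Once you have pulled the creation date of each asset (the Assets Library exposes it per asset, e.g. via `ALAssetPropertyDate`), the plotting question reduces to binning timestamps per calendar day. A minimal Foundation-only sketch, assuming the dates have already been collected into an array:

```swift
import Foundation

// Count photos per calendar day, given creation dates (assumed fetched
// from each ALAsset via valueForProperty(ALAssetPropertyDate)).
func photosPerDay(_ dates: [Date]) -> [String: Int] {
    let formatter = DateFormatter()
    formatter.dateFormat = "yyyy-MM-dd"
    formatter.timeZone = TimeZone(identifier: "UTC")
    var counts: [String: Int] = [:]
    for date in dates {
        counts[formatter.string(from: date), default: 0] += 1
    }
    return counts
}

// Hypothetical timestamps (seconds since 1970): two on day one, one two days later.
let sample = [0.0, 3600, 86_400 * 2].map { Date(timeIntervalSince1970: $0) }
let counts = photosPerDay(sample)
print(counts["1970-01-01"] ?? 0)  // prints 2
print(counts["1970-01-03"] ?? 0)  // prints 1
```

The resulting day-to-count dictionary is exactly what a bar chart of "photos taken per day" needs.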

OTHER TIPS

Yes you can use the Assets Library for this.

This will give you access to the images on the user's device.
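To spot the burst-then-silence pattern the question describes, one simple metric is the longest gap between consecutive photos. A sketch, again assuming the creation dates have already been fetched through the Assets Library:

```swift
import Foundation

// Longest interval (in seconds) between consecutive photos,
// given their creation dates in any order. Returns 0 for fewer than 2 photos.
func longestGap(_ dates: [Date]) -> TimeInterval {
    let sorted = dates.sorted()
    guard sorted.count > 1 else { return 0 }
    var maxGap: TimeInterval = 0
    for i in 1..<sorted.count {
        maxGap = max(maxGap, sorted[i].timeIntervalSince(sorted[i - 1]))
    }
    return maxGap
}

// Hypothetical timestamps: two photos close together, then a long pause.
let dates = [0.0, 100, 90_000].map { Date(timeIntervalSince1970: $0) }
print(longestGap(dates))  // prints 89900.0
```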

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow