Question

I'm making a camera app in iOS. Everything is fine except for the memory usage. This is the flow of the "takeImageTask":

AVCaptureStillImageOutput -> NSData -> UIImage (which I crop and do other changes to) -> save to ALAssetsLibrary
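In code, that flow looks roughly like this (a sketch in pre-iOS-9 Swift; the `stillImageOutput` parameter and the `takeImage` name are illustrative, assumed to come from your capture-session setup):

```swift
import UIKit
import AVFoundation
import AssetsLibrary

// Sketch of the capture flow described above.
func takeImage(stillImageOutput: AVCaptureStillImageOutput) {
    let connection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo)
    stillImageOutput.captureStillImageAsynchronouslyFromConnection(connection) { buffer, error in
        guard error == nil else { return }
        // AVCaptureStillImageOutput -> NSData -> UIImage
        let data = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
        guard let image = UIImage(data: data) else { return }
        // ... crop and do other changes to `image` here ...
        // save to ALAssetsLibrary; the completion block hands back the asset URL
        ALAssetsLibrary().writeImageToSavedPhotosAlbum(image.CGImage,
            orientation: ALAssetOrientation(rawValue: image.imageOrientation.rawValue)!) { url, error in
            // `url` identifies the saved photo on disk
        }
    }
}
```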

I need to keep the UIImage in an array because I need to have them in a UICollectionView, and I have the possibility to show the images in a "big size" view.

This flow uses a lot of memory, so I thought a better solution would be to use the asset URL I get back from ALAssetsLibrary's writeImageToSavedPhotosAlbum and fetch the images from phone storage instead. My app also has the ability to fetch images from the photo album using UIImagePickerController, and I've noticed that this uses just a fraction of the memory compared to the "takeImageTask". I first made the app as an Android version, where I just stored all the URLs and used them to present the images. Does anyone have experience using asset URLs to present images in iOS?

Solution

I need to keep the UIImage in an array because I need to have them in a UICollectionView, and I have the possibility to show the images in a "big size" view.

This is flawed logic. At most, keep minimal-size thumbnail images in memory for the collection view, and even drop those once the count gets too big. The big images should be loaded from disk on demand, because showing one is relatively infrequent and decoding it is relatively expensive (using the asset URL is a good plan for this).
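That pattern might look roughly like the following (a sketch in pre-iOS-9 Swift, since ALAssetsLibrary was later superseded by the Photos framework; `PhotoStore`, `assetURLs`, and the cache are illustrative names, not from the question):

```swift
import UIKit
import AssetsLibrary

// Keep only asset URLs and small cached thumbnails in memory;
// load the full-resolution image from disk on demand.
class PhotoStore {
    let library = ALAssetsLibrary()
    var assetURLs: [NSURL] = []        // one entry per captured photo
    let thumbnailCache = NSCache()     // evicted automatically under memory pressure

    // Cheap image for a UICollectionView cell: uses the asset's
    // pre-built thumbnail instead of decoding the full photo.
    func thumbnailForURL(url: NSURL, completion: UIImage? -> Void) {
        if let cached = thumbnailCache.objectForKey(url) as? UIImage {
            completion(cached)
            return
        }
        library.assetForURL(url, resultBlock: { asset in
            let cgThumb = asset.thumbnail().takeUnretainedValue()
            let image = UIImage(CGImage: cgThumb)
            self.thumbnailCache.setObject(image, forKey: url)
            completion(image)
        }, failureBlock: { _ in completion(nil) })
    }

    // Full-screen-sized image, loaded only when the "big size" view appears.
    func fullImageForURL(url: NSURL, completion: UIImage? -> Void) {
        library.assetForURL(url, resultBlock: { asset in
            let rep = asset.defaultRepresentation()
            let cgImage = rep.fullScreenImage().takeUnretainedValue()
            completion(UIImage(CGImage: cgImage))
        }, failureBlock: { _ in completion(nil) })
    }
}
```

Exact CGImage bridging varies by Swift version. On iOS 8+ the Photos framework's PHCachingImageManager does this thumbnail caching and on-demand loading for you.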

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow