First off, thank you to everyone on this site...it's been INCREDIBLY helpful in getting into the grit of iOS programming.

My current issue:

I have an app that renders a very stylized version of a photo. It uses some CoreImage filters for some of it, but needs a bunch of CoreGraphics to get the heavy image processing done.

The proxy-size renders work out great, but when I render a full-resolution version of my image, it sometimes crashes because of high memory usage. The problem is that I need to have several full-resolution (3264x2448) buffers in memory while rendering. I don't know what else to free, or how, to get more memory. I've been very careful to match every CGImageRelease call everywhere I can.

And with ARC, how do I know if something's really been released and freed up? Setting an object to nil doesn't really do anything.

And I doubt I can stream this to disk in any way.

ANY suggestions would be extremely appreciated!

THANKS!


Solution

ARC doesn't make a difference in this context.

It just means you don't have to call release yourself.

Without ARC, under low-memory conditions, you may want to release properties you don't really need (meaning they can be re-created on demand):

- (void)didReceiveMemoryWarning
{
    [_myProperty release];
    _myProperty = nil;

    [super didReceiveMemoryWarning];
}

Under ARC, it's exactly the same, except you don't have to call release:

- (void)didReceiveMemoryWarning
{
    _myProperty = nil;

    [super didReceiveMemoryWarning];
}

Setting your property to nil will, under ARC, automatically release it.
So it really does something.

If that doesn't work for you, then you definitely have another problem.
Make sure you have no memory leaks and no retain cycles.

The latter is most likely the culprit...

Additional tips

So as has been suggested (but not explicitly stated) - this isn't an ARC problem.

You're going to need about 30 MB of memory to hold a single image of that resolution (3264x2448, assuming 32 bits per pixel). And while you don't say how many buffers of that size you need in memory, it sounds like it's at least three. That puts you right at the memory limit for many iOS devices: the original iPad and iPhone 3GS have only 256 MB total, and of that, your app may only have access to about a third (the memory available to your app is highly variable).

ARC isn't like garbage collection: it just inserts the retain and release calls at compile time. If you've structured your code correctly, your images will be released when they're no longer needed. I strongly suspect that if you turned ARC off (which you can do on a file-by-file basis, using the -fno-objc-arc compiler flag) you'd see the same results.

As someone has already posted, the way around this is to tile your image and work on a small sample at a time. If your blur algorithm can't cope with that, then the hard truth is that you're probably going to have to write one that does!

You should tile your image and only work on parts of it at a time. You can do this by creating your CIImage and then calling:

[myContext drawImage:myImage atPoint:P fromRect:tileBounds];

in a loop and changing P and tileBounds so that eventually it covers the entire output image area.
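A minimal sketch of that loop, assuming the CIContext's drawImage:atPoint:fromRect: API mentioned above. All names here (`context`, `fullImage`) and the 512-point tile size are illustrative assumptions, not from the original answer:

```objective-c
// Illustrative sketch: render a full-resolution CIImage in tiles so that
// only one tile's worth of pixels needs to be realized at a time.
// context, fullImage, and tileSize are hypothetical placeholders.
CGFloat tileSize = 512.0;
CGRect extent = [fullImage extent];

for (CGFloat y = 0; y < extent.size.height; y += tileSize) {
    for (CGFloat x = 0; x < extent.size.width; x += tileSize) {
        // Clamp the tile to the image bounds at the right and top edges.
        CGRect tileBounds = CGRectMake(x, y,
                                       MIN(tileSize, extent.size.width  - x),
                                       MIN(tileSize, extent.size.height - y));

        // Draw just this tile into the destination context.
        [context drawImage:fullImage
                   atPoint:tileBounds.origin
                  fromRect:tileBounds];
    }
}
```

The point P and tileBounds advance together, so after the loop finishes the tiles cover the entire output image area.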

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow