I want to resize a UIImage (height and width) and store it in an NSData object with the new dimensions. Example:

[UIImage imageWithCGImage:[image CGImage] scale:compression orientation:UIImageOrientationUp]

The resulting image does not have the new dimensions when converted to NSData. What is the problem?


Solution

The scale parameter does something other than resizing the image. To give a real-life example: scale is involved when you load retina images. In that case, the image's point dimensions are halved, while the amount of pixel data (and thus the actual pixel dimensions) is unchanged. This is just a clever way to provide compatibility between retina and non-retina devices.
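To illustrate, assuming you already have a `UIImage *image` (say, 200×200 pixels at scale 1.0), passing a larger scale only changes the reported point size, not the bitmap:

```objc
// Assumes an existing UIImage *image, e.g. 200x200 pixels at scale 1.0.
UIImage *scaled = [UIImage imageWithCGImage:[image CGImage]
                                      scale:2.0
                                orientation:UIImageOrientationUp];

// The point size is halved (e.g. 100x100)...
NSLog(@"points: %@", NSStringFromCGSize(scaled.size));

// ...but the underlying pixel data is untouched (still 200x200),
// so encoding it to NSData produces the same-sized bitmap.
NSLog(@"pixels: %zu x %zu",
      CGImageGetWidth(scaled.CGImage),
      CGImageGetHeight(scaled.CGImage));
```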

What you need is to create a smaller CGContext and redraw the image into it. There are plenty of examples on SO, for instance here: The simplest way to resize an UIImage?
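A minimal sketch of that approach, assuming `UIGraphicsBeginImageContextWithOptions` (iOS 4+) and a target size and JPEG quality you supply yourself:

```objc
// Sketch: redraw the image into a context of the target size, then encode it.
// newSize and the 0.8 JPEG quality are assumptions — pick what you need.
- (NSData *)jpegDataForImage:(UIImage *)image resizedTo:(CGSize)newSize {
    // Scale 1.0 so the resulting bitmap has exactly newSize pixels
    // (passing 0.0 would use the device's screen scale instead).
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 1.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Encode the redrawn image; the NSData now reflects the new dimensions.
    return UIImageJPEGRepresentation(resized, 0.8);
}
```

Use `UIImagePNGRepresentation(resized)` instead if you need lossless output.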
