Question

I want to resize a UIImage (height and width) and store it in an NSData with the new measurements. Example:

[UIImage imageWithCGImage:[image CGImage] scale:compression orientation:UIImageOrientationUp]

This does not change the dimensions when converted to NSData. What is the problem?


Solution

The scale parameter does something other than resizing the image. A real-life example: scale comes into play when you load retina (@2x) images. In that case the image's point dimensions are halved, while the underlying pixel data (and thus the actual pixel dimensions) is unchanged. This is just a clever way to provide compatibility between retina and non-retina devices.

What you need is to create a smaller graphics context and redraw the image into it. There are plenty of examples on Stack Overflow, for example here: The simplest way to resize an UIImage?
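A minimal sketch of that approach, assuming a helper named `ResizedImageData` (a hypothetical name) that redraws the image at a given target size and encodes the result:

```objectivec
#import <UIKit/UIKit.h>

// Redraw the image into a bitmap context of the desired pixel size,
// then encode the result to NSData. `targetSize` is the size you want.
static NSData *ResizedImageData(UIImage *image, CGSize targetSize) {
    // A scale factor of 1.0 keeps the bitmap at exactly targetSize pixels,
    // independent of the device's screen scale.
    UIGraphicsBeginImageContextWithOptions(targetSize, NO, 1.0);
    [image drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
    UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Encode as JPEG (0.8 quality); use UIImagePNGRepresentation
    // instead if you need lossless output.
    return UIImageJPEGRepresentation(resized, 0.8);
}
```

Unlike `imageWithCGImage:scale:orientation:`, this actually produces a bitmap with fewer pixels, so the NSData reflects the new dimensions.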

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow