Question

I want to resize a UIImage (width and height) and store it in an NSData with the new dimensions. Example:

[UIImage imageWithCGImage:[image CGImage] scale:compression orientation:UIImageOrientationUp]

The new dimensions are not preserved when the image is converted to NSData. What is the problem?


Solution

The scale parameter does something other than resizing the image. For a real-life example: scale comes into play when you load Retina images. In that case the image's point dimensions are halved, while the amount of pixel data (and thus the actual pixel dimensions) stays the same. This is simply a clever way to keep Retina and non-Retina devices compatible.

What you need to do is create a smaller CGContext and redraw the image into it. There are plenty of examples of this on SO, for example here: The simplest way to resize an UIImage?
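As a minimal sketch of that approach: the helper below (`resizedImageData` is an illustrative name, and the JPEG quality parameter is an assumption, not from the original post) draws the image into a bitmap context of the target size and then encodes the result, so the NSData actually reflects the new dimensions:

```objc
#import <UIKit/UIKit.h>

// Hypothetical helper: redraws `image` at `newSize` and returns JPEG data.
NSData *resizedImageData(UIImage *image, CGSize newSize, CGFloat jpegQuality) {
    // Create a bitmap context of the target size.
    // Scale 1.0 so that one point equals one pixel in the output.
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 1.0);

    // Redraw the original image into the smaller context.
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Encode the resized image; the NSData now has the new measures.
    return UIImageJPEGRepresentation(resized, jpegQuality);
}
```

Usage would look like `NSData *data = resizedImageData(original, CGSizeMake(200, 200), 0.8);`. On iOS 10+ the same idea is more idiomatically expressed with UIGraphicsImageRenderer, but the context-based version above works on older deployment targets as well.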

License: CC-BY-SA with attribution
Not affiliated with StackOverflow