In my app I am using the camera and photo library to get a UIImage.
This UIImage is scaled down to roughly a twentieth of its original size, and I then create an NSData object from the resized image.

_regularImage = [self resizeImage:_takenImage width:100 height:100];


-(UIImage *)resizeImage:(UIImage *)anImage width:(int)width height:(int)height
{
    CGImageRef imageRef = [anImage CGImage];

    CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(imageRef);

    // kCGImageAlphaNone is not supported for 32-bit RGB bitmap contexts,
    // so substitute a mode that skips the unused alpha byte.
    if (alphaInfo == kCGImageAlphaNone)
        alphaInfo = kCGImageAlphaNoneSkipLast;

    // 4 bytes per pixel (RGB plus the alpha/skip byte) at the source's bits per component.
    CGContextRef bitmap = CGBitmapContextCreate(NULL, width, height, CGImageGetBitsPerComponent(imageRef), 4 * width, CGImageGetColorSpace(imageRef), (CGBitmapInfo)alphaInfo);
    if (bitmap == NULL)
        return nil; // context creation failed, e.g. an unsupported pixel format

    // Draw the source image scaled to fill the new bounds.
    CGContextDrawImage(bitmap, CGRectMake(0, 0, width, height), imageRef);

    CGImageRef ref = CGBitmapContextCreateImage(bitmap);
    UIImage *result = [UIImage imageWithCGImage:ref];

    CGContextRelease(bitmap);
    CGImageRelease(ref);

    return result;
}

NSData *image1Data = UIImageJPEGRepresentation(_regularImage, 1); // compression quality 1.0 = maximum
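
One quick way to narrow this down is to check each intermediate value before encoding. A minimal sanity check (the NSLog messages are just illustrative):

if (_takenImage == nil)
    NSLog(@"picker returned no image");
if (_regularImage == nil)
    NSLog(@"resizeImage: returned nil - bitmap context creation may have failed");
if (image1Data == nil || [image1Data length] == 0)
    NSLog(@"UIImageJPEGRepresentation produced no data");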

I can't seem to figure out anything else that might be causing this.

Thank you

LittleRy


Solution

The issue here may be how you are creating your bitmap context or UIImage. Try debugging and checking whether _regularImage is nil or otherwise invalid. For scaling an image, I would suggest a third-party library called ANImageBitmapRep. It's a small set of classes that makes cropping, resizing, and rotating images on the iPhone easy. Scaling a UIImage looks like this:

ANImageBitmapRep *irep = [ANImageBitmapRep imageBitmapRepWithImage:myImage];
[irep setSize:BMPointMake(myWidth, myHeight)]; // scale the image to the new size
UIImage *theImage = [irep image];
[irep release]; // manual reference counting; omit under ARC
NSData *jpeg = UIImageJPEGRepresentation(theImage, 1);

With this sort of code I doubt that UIImageJPEGRepresentation would be the issue. ANImageBitmapRep handles the CGContextRef work internally, which makes your job much easier.
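
If you would rather not add a third-party dependency, UIKit can do the same scaling by drawing into an image context. A minimal sketch using UIGraphicsBeginImageContextWithOptions (the 100x100 target is just an example):

CGSize targetSize = CGSizeMake(100, 100); // example target size
UIGraphicsBeginImageContextWithOptions(targetSize, NO, 1.0); // opaque = NO, scale = 1.0
[myImage drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *jpeg = UIImageJPEGRepresentation(scaled, 1.0);

Drawing through UIKit also respects the image's imageOrientation, which the raw CGContext approach in the question ignores.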
