Question

I encountered a problem in my code when I read and downsize a song from my music library: memory grows by about 20 MB each time, until I get a memory warning and the app crashes. I can't seem to find what went wrong; I noticed that whenever I call the code below, my app's memory usage increases.

I would appreciate any comments on my code.

if ([self exportAudio:[AVAsset assetWithURL:songURL] toFilePath:savedPath])
{
    // [self performSelector:@selector(sendSongForUpload:) withObject:subStringPath afterDelay:1];
    [self sendRequest:2 andPath:subStringPath andSongDBItem:songVar];
}

The method below converts the song to a lower bit rate and saves it in the app's Documents folder:

- (BOOL)exportAudio:(AVAsset *)avAsset toFilePath:(NSString *)filePath
{
    CMTime assetTime = [avAsset duration];
    Float64 duration = CMTimeGetSeconds(assetTime);
    if (duration < 40.0) return NO;

    // get the first audio track
    NSArray *tracks = [avAsset tracksWithMediaType:AVMediaTypeAudio];
    if ([tracks count] == 0) return NO;


    NSError *readerError = nil;
    AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:avAsset  error:&readerError];
    AVAssetReaderOutput *readerOutput = [AVAssetReaderAudioMixOutput
                                         assetReaderAudioMixOutputWithAudioTracks:avAsset.tracks
                                         audioSettings: nil];

    if (! [reader canAddOutput: readerOutput])
    {
        NSLog (@"can't add reader output... die!");
        return NO;
    }
    else
    {
        [reader addOutput:readerOutput];
    }

    // writer
    NSError *writerError = nil;
    AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:filePath]
                                                      fileType:AVFileTypeAppleM4A
                                                         error:&writerError];
    NSLog(@"writer %@",writer);

    AudioChannelLayout channelLayout;
    memset(&channelLayout, 0, sizeof(AudioChannelLayout));
    channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;

    // use different values to affect the downsampling/compression
    //    NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    //                                    [NSNumber numberWithInt: kAudioFormatMPEG4AAC], AVFormatIDKey,
    //                                    [NSNumber numberWithFloat:16000.0], AVSampleRateKey,
    //                                    [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
    //                                    [NSNumber numberWithInt:128000], AVEncoderBitRateKey,
    //                                    [NSData dataWithBytes:&channelLayout length:sizeof(AudioChannelLayout)], AVChannelLayoutKey,
    //                                    nil];

    NSDictionary *outputSettings = @{AVFormatIDKey: @(kAudioFormatMPEG4AAC),
                                     AVEncoderBitRateKey: @(8000),
                                     AVNumberOfChannelsKey: @(1),
                                     AVSampleRateKey: @(8000)};

    AVAssetWriterInput *writerInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeAudio outputSettings:outputSettings];
    [writerInput setExpectsMediaDataInRealTime:NO];

    // Add input to writer
    NSParameterAssert(writerInput);
    NSAssert([writer canAddInput:writerInput], @"Cannot write to this type of audio input" );

    if ([writer canAddInput:writerInput])
    {
        [writer addInput:writerInput];
    }
    else
    {
        NSLog (@"can't add asset writer input... die!");
        return NO;
    }
    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];
    [reader startReading];

    __block UInt64 convertedByteCount = 0;
     __block BOOL returnValue;
    dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);

    [writerInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock:^{

        // NSLog(@"Asset Writer ready : %d", writerInput.readyForMoreMediaData);

        while (writerInput.readyForMoreMediaData)
        {
            CMSampleBufferRef nextBuffer = [readerOutput copyNextSampleBuffer];

            if (nextBuffer)
            {
                [writerInput appendSampleBuffer: nextBuffer];
                convertedByteCount += CMSampleBufferGetTotalSampleSize (nextBuffer);
                //NSNumber *convertedByteCountNumber = [NSNumber numberWithLong:convertedByteCount];
                //NSLog (@"writing");
            }
            else
            {
                [writerInput markAsFinished];

                [writer finishWritingWithCompletionHandler:^{

                    if (AVAssetWriterStatusCompleted == writer.status)
                    {
                        NSLog(@"Writer completed");
                        //[writer cancelWriting];
                        //[reader cancelReading];
                        returnValue = YES;

                        dispatch_async(mediaInputQueue, ^{
                            dispatch_async(dispatch_get_main_queue(), ^{
                                // add this to the main queue as the last item in my serial queue
                                // when I get to this point I know everything in my queue has been run

                            });

                        });

                    }
                    else if (AVAssetWriterStatusFailed == writer.status)
                    {
                        [writer cancelWriting];
                        [reader cancelReading];
                        NSLog(@"Writer failed");
                        return;
                    }
                    else
                    {
                        NSLog(@"Export Session Status: %d", writer.status);
                    }
                }];
                break;
            }
        }
    }];
    writer = nil;
    writerInput = nil;
    reader = nil;
    readerOutput=nil;

    return returnValue;
    //return YES;
}

Answer

https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVAssetReaderOutput_Class/Reference/Reference.html#//apple_ref/occ/instm/AVAssetReaderOutput/copyNextSampleBuffer

Core Foundation objects are not released by ARC, per "The Create Rule" in the Memory Management Programming Guide for Core Foundation. I have to release the CMSampleBufferRef obtained from -(CMSampleBufferRef)copyNextSampleBuffer myself, or I get a memory leak.

if (nextBuffer)
{
    [writerInput appendSampleBuffer: nextBuffer];
    convertedByteCount += CMSampleBufferGetTotalSampleSize (nextBuffer);
    //NSLog (@"writing");
    CFRelease(nextBuffer);
}
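
For reference, here is a minimal sketch of how the loop inside requestMediaDataWhenReadyOnQueue: might look once every buffer is released. It assumes the same reader, readerOutput, writer, and writerInput objects as in the question; the per-iteration @autoreleasepool is an added precaution to keep peak memory down, not part of the original answer.

[writerInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock:^{
    while (writerInput.readyForMoreMediaData)
    {
        @autoreleasepool
        {
            CMSampleBufferRef nextBuffer = [readerOutput copyNextSampleBuffer];
            if (nextBuffer)
            {
                [writerInput appendSampleBuffer:nextBuffer];
                // "copy" in copyNextSampleBuffer means this code owns the buffer
                // (the Create Rule), and ARC does not manage CF objects,
                // so it must be released explicitly.
                CFRelease(nextBuffer);
            }
            else
            {
                // No more samples: finish the writer and stop pulling data.
                [writerInput markAsFinished];
                [writer finishWritingWithCompletionHandler:^{
                    NSLog(@"Writer status: %ld", (long)writer.status);
                }];
                break;
            }
        }
    }
}];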