Question

I'm using an AVAssetWriter with a single video input to write video frames to a movie. The mechanics of the write loop aren't a problem, but memory management is.

With regard to the CMSampleBufferRefs I'm appending: a) I create a CVPixelBufferRef from some image data (raw bytes), so the CVPixelBufferRef owns the data. b) I then wrap this in a CMSampleBufferRef like so (error checking removed for brevity):

    + (CMSampleBufferRef)wrapImageBufferInCMSampleBuffer:(CVImageBufferRef)imageBuffer timingInfo:(CMSampleTimingInfo const *)timingInfo error:(NSError **)error {
        // Create a format description for the pixel buffer
        CMVideoFormatDescriptionRef formatDescription = NULL;
        OSStatus result = CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, imageBuffer, &formatDescription);

        // Finally, create a CMSampleBuffer wrapper around it all
        CMSampleBufferRef sampleBuffer = NULL;
        OSStatus sampleBufferResult = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, imageBuffer, YES, NULL, NULL, formatDescription, timingInfo, &sampleBuffer);

        if (formatDescription) {
            CFRelease(formatDescription);
        }

        return sampleBuffer;
    }

This gives me a buffer that can be passed to the AVAssetWriterInput. So far so good. The problem is that I see odd distortions if I simply release the buffers after appending them to the writer.

If I do this:

    [videoInput appendSampleBuffer:sampleBuffer];
    CFRelease(sampleBuffer);
    CFRelease(pixelBuffer);

Then it works and there are no memory leaks, but I occasionally see corrupted frames: some frames come out of sync, and there is apparent memory corruption around frames 391 and 394. To me it looks like the memory buffers are being freed before AVFoundation has finished encoding them.

If I remove the CFRelease(pixelBuffer), the problem goes away and the resulting movie is perfectly smooth, with no corruption at all. Of course, then I have a multi-gigabyte memory leak!
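In other words, the non-corrupting (but leaking) variant is simply:

    [videoInput appendSampleBuffer:sampleBuffer];
    CFRelease(sampleBuffer);
    // CFRelease(pixelBuffer);   // never released: no corruption, but the pixel buffer memory is leaked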

Has anyone else come across something like this?

By the way, it makes no difference if I use an AVAssetWriterInputPixelBufferAdaptor either; I get exactly the same result.
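For reference, the adaptor variant I tried looks roughly like this (a simplified sketch; it reuses the same createPixelBufferBufferOfColor:size: helper as the full snippet below, and passes nil attributes because I supply my own pixel buffers):

    AVAssetWriterInputPixelBufferAdaptor *adaptor =
        [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoInput
                                                                          sourcePixelBufferAttributes:nil];

    // ...then, inside the frame loop, instead of wrapping the pixel buffer in a CMSampleBuffer:
    CVPixelBufferRef pixelBuffer = [self createPixelBufferBufferOfColor:color size:size];
    BOOL recordingSuccess = [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:frameTime];
    CFRelease(pixelBuffer);   // releasing here still produces the same corrupted frames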

Here's a full code snippet that reproduces the problem:

    - (void)recordMovieUsingStandardAVFTo:(NSURL *)url colorSequence:(NSArray *)colorSequence frameDuration:(CMTime)frameDuration size:(NSSize)size {
        NSError *error = nil;
        AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeQuickTimeMovie error:&error];
        if ([self checkForError:error doing:@"creation of asset writer"] == NO) {
            return;
        }

        NSMutableDictionary *videoSettings = [NSMutableDictionary dictionary];
        [videoSettings setValue:[NSNumber numberWithLong:(long) size.width] forKey:AVVideoWidthKey];
        [videoSettings setValue:[NSNumber numberWithLong:(long) size.height] forKey:AVVideoHeightKey];
        [videoSettings setValue:AVVideoCodecH264 forKey:AVVideoCodecKey];

        AVAssetWriterInput *videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];

        [writer addInput:videoInput];
        [writer startWriting];
        [writer startSessionAtSourceTime:kCMTimeZero];

        dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
        dispatch_async(queue, ^{
            NSError *localError = nil;
            CMTime frameTime = kCMTimeZero;
            int frameCounter = 0;
            for (int i = 0; i < 4; i++) {
                @autoreleasepool {
                    for (NSColor *color in colorSequence) {
                        CMSampleTimingInfo timing = kCMTimingInfoInvalid;
                        timing.presentationTimeStamp = frameTime;
                        timing.duration = frameDuration;

                        while (videoInput.isReadyForMoreMediaData == NO) {
                            [NSThread sleepForTimeInterval:0.1];
                        }

                        CVPixelBufferRef pixelBuffer = [self createPixelBufferBufferOfColor:color size:size];
                        CMSampleBufferRef sampleBuffer = [OSGUtils wrapImageBufferInCMSampleBuffer:pixelBuffer timingInfo:&timing error:&localError];

                        BOOL recordingSuccess = [videoInput appendSampleBuffer:sampleBuffer];
                        if (recordingSuccess) {
                            frameTime = CMTimeAdd(frameTime, frameDuration);
                            frameCounter++;
                            if (frameCounter % 60 == 0) {
                                ApplicationLogInfo(@"Wrote frame at time %@", [NSString niceStringForCMTime:frameTime]);
                            }
                        } else {
                            ApplicationLogError(@"Can't write frame at time %@: %@", [NSString niceStringForCMTime:frameTime], writer.error);
                        }

                        CFRelease(sampleBuffer);
                        CFRelease(pixelBuffer);
                    }
                }
            }

            [videoInput markAsFinished];
            [writer endSessionAtSourceTime:frameTime];
            BOOL success = [writer finishWriting];
            if (!success) {
                ApplicationLogError(@"Failed to finish writing, %@, %ld", writer.error, (long)writer.status);
            } else {
                ApplicationLogInfo(@"Write complete");
            }
        });
    }
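
The createPixelBufferBufferOfColor:size: helper isn't shown above; it's roughly along these lines (a simplified sketch that fills a 32-bit BGRA pixel buffer with a solid colour, so the CVPixelBufferRef owns its backing memory as described in (a); the real helper may use a different pixel format):

    - (CVPixelBufferRef)createPixelBufferBufferOfColor:(NSColor *)color size:(NSSize)size {
        CVPixelBufferRef pixelBuffer = NULL;
        CVReturn result = CVPixelBufferCreate(kCFAllocatorDefault,
                                              (size_t)size.width, (size_t)size.height,
                                              kCVPixelFormatType_32BGRA,   // assumption: BGRA in the sketch
                                              NULL,
                                              &pixelBuffer);
        if (result != kCVReturnSuccess) {
            return NULL;
        }

        // Fill the buffer with the solid colour, one BGRA pixel at a time.
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);
        uint8_t *base = CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
        NSColor *rgb = [color colorUsingColorSpaceName:NSCalibratedRGBColorSpace];
        uint8_t b = (uint8_t)([rgb blueComponent]  * 255.0);
        uint8_t g = (uint8_t)([rgb greenComponent] * 255.0);
        uint8_t r = (uint8_t)([rgb redComponent]   * 255.0);
        for (size_t y = 0; y < (size_t)size.height; y++) {
            uint8_t *row = base + y * bytesPerRow;
            for (size_t x = 0; x < (size_t)size.width; x++) {
                row[x * 4 + 0] = b;
                row[x * 4 + 1] = g;
                row[x * 4 + 2] = r;
                row[x * 4 + 3] = 255;   // opaque alpha
            }
        }
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

        return pixelBuffer;   // caller owns the buffer and is expected to CFRelease it
    }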
