Question

I'm attempting to combine a number of video files into a single file with specific codec settings. I used to use AVAssetExportSession for this, but I now need more control over the codec settings than AVAssetExportSession offers.

Below I've posted the `createFinalVideo:` method that handles combining the video files.

My approach is to write to a single output file with an AVAssetWriter, starting a new session at the point where the next video should be appended. I know this won't work as-is, because AVAssetWriter apparently doesn't allow this behavior.

Previously, I had the AVAssetWriter defined outside the for loop, and attempted to add a new input for each video file (one per pass of the loop). However, it appears that AVAssetWriter doesn't allow adding new inputs after `[AVAssetWriter startWriting]` has been called.

My question is: what is the right way to accomplish this?

/**
 * Final video creation. Merges audio-only and video-only files.
 **/
-(void)createFinalVideo:(id)args
{
    ENSURE_SINGLE_ARG(args, NSDictionary);

    // presentation id
    NSString * presID = [args objectForKey:@"presID"];

    // array of video paths
    NSArray * videoPathsArray = [args objectForKey:@"videoPaths"];

    videoSuccessCallback = [args objectForKey:@"videoSuccess"];
    videoCancelCallback  = [args objectForKey:@"videoCancel"];
    videoErrorCallback   = [args objectForKey:@"videoError"];

    NSError * error = nil;

    NSFileManager * fileMgr = [NSFileManager defaultManager];
    NSString * bundleDirectory = [[NSBundle mainBundle] bundlePath];
    NSString * documentsDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];

    /*********************************************************************/
    /* BEGIN: merge all videos into a final MP4 */
    /*********************************************************************/

    // create the final video output file as MP4 file
    NSString * finalOutputFilePath = [NSString stringWithFormat:@"%@/%@/final_video.mp4", documentsDirectory, presID];
    NSURL * finalOutputFileUrl = [NSURL fileURLWithPath:finalOutputFilePath];

    // delete file if it exists
    if ([fileMgr fileExistsAtPath:finalOutputFilePath]) {
        [fileMgr removeItemAtPath:finalOutputFilePath error:nil];
    }

    int renderWidth = 640, renderHeight = 480;

    NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                                [NSNumber numberWithInt:renderWidth], AVVideoCleanApertureWidthKey,
                                                [NSNumber numberWithInt:renderHeight], AVVideoCleanApertureHeightKey,
                                                [NSNumber numberWithInt:10], AVVideoCleanApertureHorizontalOffsetKey,
                                                [NSNumber numberWithInt:10], AVVideoCleanApertureVerticalOffsetKey,
                                                nil];

    NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [NSNumber numberWithInt:1960000], AVVideoAverageBitRateKey,
                                   [NSNumber numberWithInt:24],AVVideoMaxKeyFrameIntervalKey,
                                   videoCleanApertureSettings, AVVideoCleanApertureKey,
                                   AVVideoProfileLevelH264Baseline30, AVVideoProfileLevelKey,
                                   nil];

    NSDictionary *videoCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              AVVideoCodecH264, AVVideoCodecKey,
                                              codecSettings,AVVideoCompressionPropertiesKey,
                                              [NSNumber numberWithInt:renderWidth], AVVideoWidthKey,
                                              [NSNumber numberWithInt:renderHeight], AVVideoHeightKey,
                                              AVVideoScalingModeResizeAspect, AVVideoScalingModeKey,
                                              nil];

    NSError *aerror = nil;

    // next start time for adding to the compositions
    CMTime nextStartTime = kCMTimeZero;

    // loop through the video paths and add videos to the composition
    for (NSString * path in videoPathsArray) {
        // wait for each video to finish writing before continuing
        dispatch_semaphore_t semaphore = dispatch_semaphore_create(0);

        // create video writer
        // the file type should match the .mp4 extension of the output path
        AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:finalOutputFileUrl fileType:AVFileTypeMPEG4 error:&error];
        NSParameterAssert(videoWriter);

        NSLog(@"at the top of the for loop");
        NSLog(@"%@", path);

        AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput
                                                assetWriterInputWithMediaType:AVMediaTypeVideo
                                                outputSettings:videoCompressionSettings];
        NSParameterAssert(videoWriterInput);
        NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
        // reading from a file, not a live capture source, so don't require real-time delivery
        videoWriterInput.expectsMediaDataInRealTime = NO;
        [videoWriter addInput:videoWriterInput];

        AVAssetWriterInput* audioWriterInput = [AVAssetWriterInput
                                                assetWriterInputWithMediaType:AVMediaTypeAudio
                                                outputSettings:nil];
        NSParameterAssert(audioWriterInput);
        NSParameterAssert([videoWriter canAddInput:audioWriterInput]);
        audioWriterInput.expectsMediaDataInRealTime = NO;
        [videoWriter addInput:audioWriterInput];

        [videoWriter startWriting];

        // video setup
        AVAsset *avAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:path] options:nil];
        AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:avAsset error:&aerror];
        AVAssetTrack *videoTrack = [[avAsset tracksWithMediaType:AVMediaTypeVideo]objectAtIndex:0];

        CMTime videoDuration = avAsset.duration;
        // Wait until the duration is actually available
        int durationAttempts = 5;
        while(CMTimeGetSeconds(videoDuration) == 0 && durationAttempts > 0) {
            durationAttempts--;
            [NSThread sleepForTimeInterval:0.3];
            videoDuration = avAsset.duration;
        }
        NSLog(@"[INFO] MODULE-VIDUTILS video duration in secs: %f", CMTimeGetSeconds(videoDuration));

        //videoWriterInput.transform = videoTrack.preferredTransform;
        NSDictionary *videoOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        AVAssetReaderTrackOutput *asset_reader_output = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:videoOptions];
        [reader addOutput:asset_reader_output];

        //audio setup
        AVAssetReader *audioReader = [AVAssetReader assetReaderWithAsset:avAsset error:nil];
        AVAssetTrack* audioTrack = [[avAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        AVAssetReaderOutput *readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];
        [audioReader addOutput:readerOutput];

        NSLog(@"startSessionAtSourceTime: %f", CMTimeGetSeconds(nextStartTime));
        [videoWriter startSessionAtSourceTime:nextStartTime];
        // set next start time
        nextStartTime = CMTimeAdd(nextStartTime, videoDuration);
        [reader startReading];

        dispatch_queue_t _processingQueue = dispatch_queue_create("AVAssetWriterQueue", DISPATCH_QUEUE_SERIAL);
        [videoWriterInput requestMediaDataWhenReadyOnQueue:_processingQueue usingBlock:^{
            while ([videoWriterInput isReadyForMoreMediaData]) {
                CMSampleBufferRef sampleBuffer;
                if ([reader status] == AVAssetReaderStatusReading &&
                    (sampleBuffer = [asset_reader_output copyNextSampleBuffer])) {

                    BOOL result = [videoWriterInput appendSampleBuffer:sampleBuffer];
                    CFRelease(sampleBuffer);

                    if (!result) {
                        [reader cancelReading];
                        NSLog(@"[INFO] MODULE-VIDUTILS createFinalVideo appendSampleBuffer failed: %@", videoWriter.error);
                        if (videoErrorCallback != nil) {
                            [self _fireEventToListener:@"videoError" withObject:nil listener:videoErrorCallback thisObject:nil];
                        }
                        // signal so the outer dispatch_semaphore_wait doesn't deadlock
                        dispatch_semaphore_signal(semaphore);
                        return;
                    }
                } else {
                    [videoWriterInput markAsFinished];

                    switch ([reader status]) {
                        case AVAssetReaderStatusReading:
                            // the reader has more for other tracks, even if this one is done
                            break;

                        case AVAssetReaderStatusCompleted:
                            [audioReader startReading];
                            [videoWriter startSessionAtSourceTime:nextStartTime];
                            NSLog(@"Request");
                            NSLog(@"Asset Writer ready :%d", audioWriterInput.readyForMoreMediaData);
                            while (audioWriterInput.readyForMoreMediaData) {
                                CMSampleBufferRef nextBuffer;
                                if ([audioReader status] == AVAssetReaderStatusReading && (nextBuffer = [readerOutput copyNextSampleBuffer])) {
                                    [audioWriterInput appendSampleBuffer:nextBuffer];
                                    // release the copied sample buffer to avoid leaking memory
                                    CFRelease(nextBuffer);
                                } else {
                                    [audioWriterInput markAsFinished];

                                    //dictionary to hold duration
                                    if ([audioReader status] == AVAssetReaderStatusCompleted) {
                                        NSLog (@"[INFO] MODULE-VIDUTILS createFinalVideo AVAssetReaderStatusCompleted");
                                        [videoWriter finishWritingWithCompletionHandler:^{
                                            switch([videoWriter status]) {
                                                case AVAssetWriterStatusCompleted:
                                                    NSLog (@"[INFO] MODULE-VIDUTILS createFinalVideo AVAssetWriterStatusCompleted");
                                                    dispatch_semaphore_signal(semaphore);
                                                    break;

                                                case AVAssetWriterStatusCancelled:
                                                    NSLog (@"[INFO] MODULE-VIDUTILS createFinalVideo AVAssetWriterStatusCancelled");
                                                    if (videoCancelCallback != nil) {
                                                        [self _fireEventToListener:@"videoCancel" withObject:nil listener:videoCancelCallback thisObject:nil];
                                                    }
                                                    // signal so the outer dispatch_semaphore_wait doesn't deadlock
                                                    dispatch_semaphore_signal(semaphore);
                                                    break;

                                                case AVAssetWriterStatusFailed:
                                                    NSLog (@"[INFO] MODULE-VIDUTILS createFinalVideo AVAssetWriterStatusFailed: %@", videoWriter.error);
                                                    if (videoErrorCallback != nil) {
                                                        [self _fireEventToListener:@"videoError" withObject:nil listener:videoErrorCallback thisObject:nil];
                                                    }
                                                    // signal so the outer dispatch_semaphore_wait doesn't deadlock
                                                    dispatch_semaphore_signal(semaphore);
                                                    break;
                                            }
                                        }];
                                        break;
                                    }
                                }
                            }
                            break;

                        case AVAssetReaderStatusFailed:
                            NSLog (@"[INFO] MODULE-VIDUTILS createFinalVideo AVAssetReaderStatusFailed: %@", reader.error);
                            if (videoErrorCallback != nil) {
                                [self _fireEventToListener:@"videoError" withObject:nil listener:videoErrorCallback thisObject:nil];
                            }
                            [videoWriter cancelWriting];
                            // signal so the outer dispatch_semaphore_wait doesn't deadlock
                            dispatch_semaphore_signal(semaphore);
                            return;
                    }
                    break;
                }
            }
        }];
        // wait for the writing to finish
        dispatch_semaphore_wait(semaphore, DISPATCH_TIME_FOREVER);
        NSLog(@"Write Ended");
    }
    NSLog(@"got here -- should have waited for all videos to complete first");

    // call success if we got here
    if (videoSuccessCallback != nil) {
        [self _fireEventToListener:@"videoSuccess" withObject:nil listener:videoSuccessCallback thisObject:nil];
    }
}

Solution

I found a replacement for AVAssetExportSession called SDAVAssetExportSession that lets you specify explicit compression settings instead of choosing from a fixed set of presets.
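For reference, a rough, untested sketch of how that could fit this question (based on the SDAVAssetExportSession README; `videoPathsArray` and `finalOutputFileUrl` are assumed to be the same variables as in the posted code): first stitch the clips into one AVMutableComposition, then hand the composition to SDAVAssetExportSession with explicit video and audio settings.

```objc
// Concatenate the individual clips into a single composition.
AVMutableComposition *composition = [AVMutableComposition composition];
CMTime cursor = kCMTimeZero;
for (NSString *path in videoPathsArray) {
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:path] options:nil];
    NSError *insertError = nil;
    [composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                         ofAsset:asset
                          atTime:cursor
                           error:&insertError];
    cursor = CMTimeAdd(cursor, asset.duration);
}

// Export the composition with explicit codec settings instead of a preset.
SDAVAssetExportSession *encoder = [[SDAVAssetExportSession alloc] initWithAsset:composition];
encoder.outputFileType = AVFileTypeMPEG4;
encoder.outputURL = finalOutputFileUrl;
encoder.videoSettings = @{
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: @640,
    AVVideoHeightKey: @480,
    AVVideoCompressionPropertiesKey: @{
        AVVideoAverageBitRateKey: @1960000,
        AVVideoProfileLevelKey: AVVideoProfileLevelH264Baseline30,
    },
};
encoder.audioSettings = @{
    AVFormatIDKey: @(kAudioFormatMPEG4AAC),
    AVNumberOfChannelsKey: @2,
    AVSampleRateKey: @44100,
    AVEncoderBitRateKey: @128000,
};
[encoder exportAsynchronouslyWithCompletionHandler:^{
    if (encoder.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Export completed");
    } else {
        NSLog(@"Export failed: %@", encoder.error);
    }
}];
```

This sidesteps the AVAssetWriter restrictions entirely: the composition handles the concatenation, and SDAVAssetExportSession drives its own reader/writer pair with the settings you give it.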

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow