Question

I am capturing a video using AVCaptureConnection in my iOS app. Afterwards, I add some images to the video as CALayers. Everything works fine, but I get a black frame at the very end of the resulting video after adding the images. No actual audio/video frame is affected by this. For the audio, I extract it, change its pitch, and then add it back using AVMutableComposition. Here is the code I am using. Please help me figure out what I am doing wrong, or whether I need to add something else.

cmp = [AVMutableComposition composition];

    AVMutableCompositionTrack *videoComposition = [cmp addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioComposition = [cmp addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *sourceVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetTrack *sourceAudioTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];


    [videoComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:sourceVideoTrack atTime:kCMTimeZero error:nil];
    [audioComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:sourceAudioTrack atTime:kCMTimeZero error:nil];

    animComp = [AVMutableVideoComposition videoComposition];
    animComp.renderSize = CGSizeMake(320, 320);
    animComp.frameDuration = CMTimeMake(1,30);
    animComp.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer  inLayer:parentLayer];

    // to gather the audio part of the video
    NSArray *tracksToDuck = [cmp tracksWithMediaType:AVMediaTypeAudio];
    NSMutableArray *trackMixArray = [NSMutableArray array];
    for (NSInteger i = 0; i < [tracksToDuck count]; i++) {
        AVMutableAudioMixInputParameters *trackMix = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:[tracksToDuck objectAtIndex:i]];
        [trackMix setVolume:5 atTime:kCMTimeZero];
        [trackMixArray addObject:trackMix];
    }
    audioMix = [AVMutableAudioMix audioMix];
    audioMix.inputParameters = trackMixArray;

    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [asset duration]);
    AVMutableVideoCompositionLayerInstruction *layerVideoInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoComposition];

    [layerVideoInstruction setOpacity:1.0 atTime:kCMTimeZero];
    instruction.layerInstructions = [NSArray arrayWithObject:layerVideoInstruction];
    animComp.instructions = [NSArray arrayWithObject:instruction];
    [self exportMovie:self];

This is my method for exporting the video

-(IBAction) exportMovie:(id)sender{

    //successCheck = NO;
    NSArray *docPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *tempPath = [docPaths objectAtIndex:0];
    //NSLog(@"Temp Path: %@",tempPath);

    NSString *fileName = [NSString stringWithFormat:@"%@/Final.MP4",tempPath];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if([fileManager fileExistsAtPath:fileName ]){
        NSError *ferror = nil ;
        [fileManager removeItemAtPath:fileName error:&ferror];
    }

    NSURL *exportURL = [NSURL fileURLWithPath:fileName];

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:cmp presetName:AVAssetExportPresetMediumQuality];
    exporter.outputURL = exportURL;
    exporter.videoComposition = animComp;
    //exporter.audioMix = audioMix;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;

    [exporter exportAsynchronouslyWithCompletionHandler:^(void){
        switch (exporter.status) {
            case AVAssetExportSessionStatusFailed:{
                NSLog(@"Fail");
                break;
            }
            case AVAssetExportSessionStatusCompleted:{
                NSLog(@"Success video");
                break;
            }
            default:
                break;
        }
    }];
    NSLog(@"outside");
}

Solution

AVAssetExportSession has a timeRange property for specifying the range to export. Try giving it a time range slightly shorter than the actual duration (a few nanoseconds less).
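
A minimal sketch of that idea in Swift (a hypothetical helper, not the asker's code: exporter and asset stand in for the export session and source asset from the question, and the exact amount to trim may need tuning):

import AVFoundation

// Hypothetical sketch: export everything except roughly the last frame
// (assuming 30 fps here), so the trailing black frame is left out.
func trimTrailingFrame(of exporter: AVAssetExportSession, asset: AVAsset) {
    let oneFrame = CMTime(value: 1, timescale: 30)
    let trimmedDuration = CMTimeSubtract(asset.duration, oneFrame)
    exporter.timeRange = CMTimeRange(start: .zero, duration: trimmedDuration)
}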

Other tips

You can get the true video duration from AVAssetTrack. The duration of an AVAsset is sometimes longer than that of its AVAssetTrack.

Check the durations like this:

print(asset.duration.seconds.description)
print(videoTrack.timeRange.duration.seconds.description)

So you can change this line:

[videoComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:sourceVideoTrack atTime:kCMTimeZero error:nil];

To this:

[videoComposition insertTimeRange:sourceVideoTrack.timeRange ofTrack:sourceVideoTrack atTime:kCMTimeZero error:nil];

For Swift 5:

try videoComposition.insertTimeRange(sourceVideoTrack.timeRange, of: sourceVideoTrack, at: CMTime.zero)

Then you will avoid the last black frame :)

Hope this helps someone still suffering.

Just wanted to write this for people with my specific issue.

I was taking a video and trying to speed it up or slow it down by taking an AVMutableComposition and scaling the time range of its audio and video tracks via scaleTimeRange.
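
For context, here is a minimal sketch of that kind of scaling (hypothetical: composition stands for the AVMutableComposition being modified):

import AVFoundation

// Hypothetical sketch: play the whole composition at 2x speed by compressing
// its full time range into half of the original duration.
func doubleSpeed(of composition: AVMutableComposition) {
    let fullRange = CMTimeRange(start: .zero, duration: composition.duration)
    let halfDuration = CMTimeMultiplyByRatio(composition.duration, multiplier: 1, divisor: 2)
    composition.scaleTimeRange(fullRange, toDuration: halfDuration)
}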

Scaling the time range to 2x speed or 3x speed sometimes caused the last few frames of the video to be black. Fortunately, @Khushboo's answer fixed my problem as well.

However, instead of decreasing the exporter's timeRange by a few nanoseconds, I just made it the same as the composition's duration, which ended up working perfectly:

exporter?.timeRange = CMTimeRangeMake(start: CMTime.zero, duration: composition.duration)

Hope this helps!

License: CC BY-SA with attribution