Question

Can someone point me in the right direction to help me debug why my AVMutableComposition for an AVPlayerItem on iOS 6.1 does not load the track list correctly?

What I am doing is creating an AVMutableCompositionTrack and adding time ranges to it from the same mp4 for playback. This has worked correctly for me with other files, but I have some files that have been encoded at 720p at 30 frames per second, and they will not create the track list correctly. I want to display the ranges in order, seamlessly.

Here is some of my code:

    - (AVPlayerItem *)loadVideoByAVComposition
    {
        AVMutableComposition *composition = [self loadCompositionTracks];
        if (composition == nil) { return nil; }

        // take an immutable snapshot of the composition for playback
        AVComposition *immutableSnapshotComposition = [composition copy];
        return [[AVPlayerItem alloc] initWithAsset:immutableSnapshotComposition];
    }

Load tracks:

    -(AVMutableComposition *)loadCompositionTracks
    {
        AVMutableComposition *composition = [[AVMutableComposition alloc] init];
        NSDictionary *optionsDictionary = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];

        AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

        // hard-coded test mp4
        NSString *videoFileName = @"avi.mp4";
        AVAsset *asset = [self getAVAssetByFileName:videoFileName optsDict:optionsDictionary];

        if (asset == nil) { return nil; }

        // add some track items
        [self addTrackList:compositionVideoTrack asset:asset];

        return composition;
    }

Our desktop app that creates these files exports the frame number and duration for each clip in the video file. So I start with those frame counts and convert them to seconds before adding each range to the track list.

    -(void)addTrackList:(AVMutableCompositionTrack *)track asset:(AVAsset *)asset
    {
        if (track == nil) { return; }

        CMTime insertionPointTime = kCMTimeZero;

        // add some test track insert points
        insertionPointTime = [self addTrackItems:track asset:asset start:100 duration:800 insertTime:insertionPointTime];
        insertionPointTime = [self addTrackItems:track asset:asset start:1500 duration:600 insertTime:insertionPointTime];
        insertionPointTime = [self addTrackItems:track asset:asset start:2500 duration:500 insertTime:insertionPointTime];
    }

And here is the method that actually adds time ranges to the AVMutableComposition:

    -(CMTime)addTrackItems:(AVMutableCompositionTrack *)track asset:(AVAsset *)asset start:(NSUInteger)start duration:(NSUInteger)duration insertTime:(CMTime)insertTime
    {
        if (track == nil) { return insertTime; }

        NSError *videoError = nil;

        // Cast before dividing: start / 30 on NSUInteger is integer
        // division and silently truncates the frame-accurate offset.
        Float64 secondsCalc = (Float64)start / 30.0;
        CMTime startTime = CMTimeMakeWithSeconds(secondsCalc, 600);

        secondsCalc = (Float64)duration / 30.0;
        CMTime cmDuration = CMTimeMakeWithSeconds(secondsCalc, 600);

        CMTimeRange timeRange = CMTimeRangeMake(startTime, cmDuration);

        // add video track
        AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        if (![track insertTimeRange:timeRange ofTrack:clipVideoTrack atTime:insertTime error:&videoError])
        {
            NSLog(@"Error while adding track to composition: %@", videoError);
        }

        // calculate the next insertion point
        return CMTimeAdd(insertTime, timeRange.duration);
    }

So this code works with some video files, but with others it may load just the first track and then fail to load the rest: I just get black frames, or the video stops. Basically it is not building the track list correctly. How can I catch any warnings or error codes coming back from the load to tell me whether the time ranges are valid, etc.?
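One way to surface problems like this is to inspect the composition's track segments before handing it to AVPlayerItem. The sketch below is illustrative only (the method name `debugComposition:` is made up), but `validateTrackSegments:error:`, `timeMapping`, and the `CMTIMERANGE_IS_VALID` macro are standard AVFoundation/CoreMedia APIs:

    // Sketch: sanity-check every track segment in a composition and log
    // its source -> target time mapping, so gaps or invalid ranges show up.
    - (BOOL)debugComposition:(AVMutableComposition *)composition
    {
        for (AVMutableCompositionTrack *track in composition.tracks)
        {
            NSError *segmentError = nil;
            if (![track validateTrackSegments:track.segments error:&segmentError])
            {
                NSLog(@"Invalid segments in track %d: %@", track.trackID, segmentError);
                return NO;
            }

            for (AVCompositionTrackSegment *segment in track.segments)
            {
                CMTimeRange target = segment.timeMapping.target;
                NSLog(@"Segment %@ -> %@",
                      (NSString *)CFBridgingRelease(CMTimeRangeCopyDescription(NULL, segment.timeMapping.source)),
                      (NSString *)CFBridgingRelease(CMTimeRangeCopyDescription(NULL, target)));
                if (!CMTIMERANGE_IS_VALID(target)) { return NO; }
            }
        }
        return YES;
    }

Also note that `insertTimeRange:ofTrack:atTime:error:` returns a BOOL; checking the return value rather than the NSError pointer is the reliable way to detect a failed insert.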

Are there any good resources out there besides the Apple documentation that explain the loading process? The Apple documentation is sparse, and I have already looked at some of the WWDC sessions. Any help would be greatly appreciated. Thanks!


Solution

It looks like the video exported by my desktop app was malformed h.264. The AVComposition object does not have the tolerance to play it correctly, even though it plays fine on the desktop. Not surprising, as the iPad wants h.264 video to adhere to certain parameters.
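For anyone hitting the same thing: before blaming the composition code, it can help to dump what the device thinks of the source track. This is a sketch; all of the calls below (`nominalFrameRate`, `formatDescriptions`, `CMFormatDescriptionGetMediaSubType`, etc.) are standard AVFoundation/CoreMedia APIs, and the logging is illustrative:

    // Sketch: log the basic codec parameters of the first video track,
    // so a malformed or unexpected h.264 stream stands out.
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    NSLog(@"playable=%d frameRate=%f naturalSize=%@",
          asset.isPlayable, videoTrack.nominalFrameRate,
          NSStringFromCGSize(videoTrack.naturalSize));

    for (id desc in videoTrack.formatDescriptions)
    {
        CMFormatDescriptionRef fmt = (__bridge CMFormatDescriptionRef)desc;
        FourCharCode codec = CMFormatDescriptionGetMediaSubType(fmt); // e.g. 'avc1'
        NSLog(@"codec=%c%c%c%c extensions=%@",
              (char)(codec >> 24), (char)(codec >> 16), (char)(codec >> 8), (char)codec,
              CMFormatDescriptionGetExtensions(fmt));
    }

If the codec is not `avc1`, or the format-description extensions show a profile/level the device does not support, the problem is in the encode rather than in the composition.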

If anyone knows of tools on iOS that can help debug video issues, that would be great!

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow