I'm trying to use QTKit to convert a list of images to a QuickTime movie. I've figured out how to do everything except get the frame rate to 29.97 fps. Through other forums and resources, the trick seems to be using something like this:

QTTime frameDuration = QTMakeTime(1001, 30000);

However, all my attempts using this method, or even QTMakeTime(1000, 29970), still produce a movie at 30 fps. That is the frame rate QuickTime Player reports when the movie is played.

Any ideas? Is there some other way to set the frame rate for the entire movie once it's created?

Here's some sample code:

NSDictionary *outputMovieAttribs = [NSDictionary dictionaryWithObjectsAndKeys:
                                       @"jpeg", QTAddImageCodecType,
                                       [NSNumber numberWithLong:codecHighQuality], QTAddImageCodecQuality,
                                       nil];
QTTime frameDuration = QTMakeTime(1001, 30000);
QTMovie *outputMovie = [[QTMovie alloc] initToWritableFile:@"/tmp/testing.mov" error:nil];
[outputMovie setAttribute:[NSNumber numberWithBool:YES] forKey:QTMovieEditableAttribute];
[outputMovie setAttribute:[NSNumber numberWithLong:30000] forKey:QTMovieTimeScaleAttribute];

if (!outputMovie) {
    printf("ERROR: Chunk: Could not create movie object:\n");
} else {
    int frameID = 0;
    while (frameID < [framePaths count]) {
        NSAutoreleasePool *readPool = [[NSAutoreleasePool alloc] init];
        NSData *currFrameData = [NSData dataWithContentsOfFile:[framePaths objectAtIndex:frameID]];
        NSImage *currFrame = [[NSImage alloc] initWithData:currFrameData];

        if (currFrame) {
            [outputMovie addImage:currFrame forDuration:frameDuration withAttributes:outputMovieAttribs];
            // Note: writing the file after every frame is expensive; a single
            // updateMovieFile after the loop would also do.
            [outputMovie updateMovieFile];
            NSString *newDuration = QTStringFromTime([outputMovie duration]);
            printf("new Duration: %s\n", [newDuration UTF8String]);
            [currFrame release]; // setting currFrame to nil alone leaks under manual reference counting
        } else {
            printf("ERROR: Could not add image to movie");
        }
        frameID++;
        [readPool drain];
    }
}

NSString *outputDuration = QTStringFromTime([outputMovie duration]);
printf("output Duration: %s\n", [outputDuration UTF8String]);

Solution

OK, thanks to your code I could solve the issue. Using the developer tool Atom Inspector, I could see that the resulting data structure looked completely different from the movies I normally work with. As I said, I have never created a movie from images the way you do, but it seems this is not the way to go if you want a proper movie afterwards. QuickTime recognizes the clip as "Photo-JPEG", not as a normal movie file. The reason seems to be that the added pictures are NOT added to a movie track but just placed somewhere in the movie, which can also be seen with Atom Inspector. So the time scale you set with QTMovieTimeScaleAttribute is never actually used!

To solve the issue I changed the code just a tiny bit.

NSDictionary *outputMovieAttribs = [NSDictionary dictionaryWithObjectsAndKeys:
                                       @"jpeg", QTAddImageCodecType,
                                       [NSNumber numberWithLong:codecHighQuality], QTAddImageCodecQuality,
                                       [NSNumber numberWithLong:2997], QTTrackTimeScaleAttribute,
                                       nil];

QTTime frameDuration = QTMakeTime(100, 2997); // 100 units per frame at 2997 units/sec = 29.97 fps
QTMovie *outputMovie = [[QTMovie alloc] initToWritableFile:@"/Users/flo/Desktop/testing.mov" error:nil];
[outputMovie setAttribute:[NSNumber numberWithBool:YES] forKey:QTMovieEditableAttribute];
[outputMovie setAttribute:[NSNumber numberWithLong:2997] forKey:QTMovieTimeScaleAttribute];

Everything else is unaltered. By the way, to print the timeValue and timeScale you could also do:

NSLog(@"new Duration timeScale : %ld timeValue : %lld \n", 
       [outputMovie duration].timeScale, [outputMovie duration].timeValue);

This way you can see better if your code does as desired.
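For instance, here is a minimal sketch (my own addition, not from the original answer; it assumes the outputMovie created above) that reads the time scales back from the movie and its tracks to confirm the attributes actually took effect:

NSNumber *movieScale = [outputMovie attributeForKey:QTMovieTimeScaleAttribute];
NSLog(@"movie time scale: %ld", [movieScale longValue]);

// Each QTTrack carries its own time scale; the video track should report 2997.
for (QTTrack *track in [outputMovie tracks]) {
    NSNumber *trackScale = [track attributeForKey:QTTrackTimeScaleAttribute];
    NSLog(@"track %@ time scale: %ld",
          [track attributeForKey:QTTrackMediaTypeAttribute], [trackScale longValue]);
}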

Hope that helps! Best regards

Other tips

I have never done what you're trying to do, but I can tell you how to get the desired frame rate, I guess. If you "ask" a movie for its current timing information, you always get a QTTime structure, which contains the timeScale and the timeValue. For a 29.97 fps video you would get a timeScale of 2997, for example (see below). This is the number of "units" per second.

So, if the playback position of the movie is currently at exactly 2 seconds, you would get a timeValue of 5994. The frameDuration is therefore 100, because 2997 / 100 = 29.97 fps.

QuickTime cannot handle float values, so you have to scale everything up to integer values by multiplication. By the way, you don't have to use 100: you could also use a frame duration of 1000 with a timeScale of 29970, or a frame duration of 200 with a timeScale of 5994. That's all I can tell you from reading the timing information of existing clips, but this is how QuickTime works internally. You wrote that this didn't work out for you, but you should look into it again!
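To make the equivalence concrete, here is a small sketch (my own illustration, not code from the question or answer) showing that these integer pairs all describe the same frame duration:

// Equivalent integer representations of a 29.97 fps frame duration.
QTTime a = QTMakeTime(100, 2997);    // 2997 / 100   = 29.97 fps
QTTime b = QTMakeTime(1000, 29970);  // 29970 / 1000 = 29.97 fps
QTTime c = QTMakeTime(200, 5994);    // 5994 / 200   = 29.97 fps
QTTime d = QTMakeTime(1001, 30000);  // exact NTSC: 30000 / 1001 ≈ 29.97003 fps

// Frames per second recovered from any of them:
double fps = (double)a.timeScale / (double)a.timeValue;
NSLog(@"fps: %.2f", fps); // prints "fps: 29.97"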

Best regards
