Question

I'm having a hard time understanding how to use AVAssetWriter to convert a 30 fps motion JPEG stream into a video file. The part I'm not getting is the [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime] method.

How do I calculate the withPresentationTime value if I want to output 30 fps MPEG-4 video?

The video source is a camera that streams 30fps motion JPEG in real time.

Appreciate any ideas.

Thanks


Solution

You will need to generate a CMTime structure using CMTimeMake and increment it by 1/30 of a second for each frame.

Here is a sketch:

CMTime time = CMTimeMake(0, 30); // CMTimeMake(value, timescale): 0/30 = 0 seconds

for (each image) {  // pseudocode: loop over the decoded JPEG frames
    [adaptor appendPixelBuffer:buffer withPresentationTime:time];
    time.value += 1; // one tick at timescale 30 = 1/30 of a second per frame
}

With the time set up as shown, the smallest time resolution is 1/30 of a second, since value / timescale = seconds (30 / 30 = 1 second). I am not certain whether H.264 has a specific requirement here. In my experience, AVFoundation uses a timescale of 1,000,000,000 (one billion) when capturing.
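If you would rather work at that finer timescale, one option is CMTimeMakeWithSeconds. A minimal sketch, assuming a hypothetical 0-based frameIndex counter that you increment yourself once per appended frame:

#import <CoreMedia/CoreMedia.h>

// frameIndex is a hypothetical counter, incremented once per appended frame.
int64_t frameIndex = 0;

// Same instant as CMTimeMake(frameIndex, 30), expressed at a nanosecond timescale.
CMTime presentationTime = CMTimeMakeWithSeconds(frameIndex / 30.0, 1000000000);
[adaptor appendPixelBuffer:buffer withPresentationTime:presentationTime];

Either way the frames land at the same instants in the movie; only the tick size differs.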

Update:

Just to review. From the CMTime struct:

CMTimeValue value;  /*! @field value The value of the CMTime. value/timescale = seconds. */
CMTimeScale timescale;  /*! @field timescale The timescale of the CMTime. value/timescale = seconds.  */

The timescale stays the same throughout the video. Let's say you have a current value of 10 with a timescale of 30: the current time in seconds is 10/30 = 0.33333 seconds. The time value for the 40th frame of your movie gives 40/30 = 1.33333 seconds, so the 40th frame should be presented 1.33333 seconds into the movie.
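You can sanity-check that arithmetic with CMTimeGetSeconds, which converts a CMTime back to seconds. A small illustrative snippet (not something the writing loop itself needs):

#import <Foundation/Foundation.h>
#import <CoreMedia/CoreMedia.h>

CMTime frame10 = CMTimeMake(10, 30);
CMTime frame40 = CMTimeMake(40, 30);
NSLog(@"%f %f", CMTimeGetSeconds(frame10), CMTimeGetSeconds(frame40)); // logs 0.333333 1.333333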

I am not sure whether this timescale is appropriate for an H.264 video; I am not familiar with the spec. I do know that when capturing video, the presentation timescale for video frames is 1,000,000,000. Technically it should not matter: the time is a rational number, and 1,000,000,000 / 1,000,000,000 = 1 second just as 30 / 30 = 1 second.
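As a quick illustration of that equivalence, CMTimeCompare reports two CMTimes as equal when they describe the same rational time, regardless of timescale. Another small sketch:

#import <Foundation/Foundation.h>
#import <CoreMedia/CoreMedia.h>

CMTime oneSecondCoarse = CMTimeMake(30, 30);                 // 30/30 = 1 second
CMTime oneSecondFine   = CMTimeMake(1000000000, 1000000000); // 1,000,000,000 / 1,000,000,000 = 1 second
NSLog(@"%d", (int)CMTimeCompare(oneSecondCoarse, oneSecondFine)); // logs 0, meaning the times are equal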

Licensed under: CC-BY-SA with attribution