Question

I have built some code to process video files on OS X, frame by frame. The following is an extract from the code which builds OK, opens the file, locates the video track (the only track) and starts reading CMSampleBuffers without problem. However, each CMSampleBufferRef I obtain returns NULL when I try to extract the pixel buffer frame. There's no indication in the iOS documentation as to why I should expect a NULL return value or how I could fix the issue. It happens with all the videos I've tested, regardless of capture source or codec.

Any help greatly appreciated.

NSString *assetInPath = @"/Users/Dave/Movies/movie.mp4";
NSURL *assetInUrl = [NSURL fileURLWithPath:assetInPath];
AVAsset *assetIn = [AVAsset assetWithURL:assetInUrl];

NSError *error;
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:assetIn error:&error];
AVAssetTrack *track = [assetIn.tracks objectAtIndex:0];
AVAssetReaderOutput *assetReaderOutput = [[AVAssetReaderTrackOutput alloc]
                                              initWithTrack:track
                                              outputSettings:nil];
[assetReader addOutput:assetReaderOutput];

// Start reading
[assetReader startReading];

CMSampleBufferRef sampleBuffer;
do {
       sampleBuffer = [assetReaderOutput copyNextSampleBuffer];

       /**
        ** At this point, sampleBuffer is non-null, has all appropriate attributes to indicate that
        ** it's a video frame, 320x240 or whatever and looks perfectly fine. But the next
        ** line always returns NULL without logging any obvious error message
        **/

       CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

       if( pixelBuffer != NULL ) {
           size_t width = CVPixelBufferGetWidth(pixelBuffer);
           size_t height = CVPixelBufferGetHeight(pixelBuffer);
           CVPixelBufferLockBaseAddress(pixelBuffer, 0);
           // ... other processing removed here for clarity ...
        }
} while( ... );

To be clear, I've stripped out all the error-checking code, but no problems were being indicated in that code, i.e. the AVAssetReader is reading, the CMSampleBufferRef looks fine, etc.


Solution 3

FWIW: Here is what the official docs say about the return value of CMSampleBufferGetImageBuffer:

"Result is a CVImageBuffer of media data. The result will be NULL if the CMSampleBuffer does not contain a CVImageBuffer, or if the CMSampleBuffer contains a CMBlockBuffer, or if there is some other error."

Also note that the caller does not own the image buffer returned by CMSampleBufferGetImageBuffer, and must retain it explicitly if the caller needs to maintain a reference to it.
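
As a minimal sketch of what that ownership handling could look like in practice (building on the loop from the question; the comments are illustrative, not from the original post):

CMSampleBufferRef sampleBuffer = [assetReaderOutput copyNextSampleBuffer]; // caller owns this (Create/Copy rule)
if (sampleBuffer != NULL) {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); // caller does NOT own this
    if (pixelBuffer != NULL) {
        CVPixelBufferRetain(pixelBuffer);   // only needed if you keep a reference beyond the sample buffer's lifetime
        // ... use the pixel buffer ...
        CVPixelBufferRelease(pixelBuffer);  // balance the retain
    }
    CFRelease(sampleBuffer);                // release the buffer returned by copyNextSampleBuffer
}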

Hopefully this info helps.

Other tips

You haven't specified any outputSettings when creating your AVAssetReaderTrackOutput. I've run into your issue when passing nil in order to receive the video track's original pixel format from copyNextSampleBuffer. In my app I wanted to ensure no conversion was happening when calling copyNextSampleBuffer, for the sake of performance; if that isn't a big concern for you, specify a pixel format in the output settings.

The following are Apple's recommended pixel formats, based on hardware capabilities:

kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
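
For example, a minimal sketch of creating the track output with an explicit pixel format (reusing the track and assetReader from the question and picking the video-range format above; adapt as needed):

NSDictionary *outputSettings = @{
    (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey :
        @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)
};
AVAssetReaderTrackOutput *assetReaderOutput =
    [[AVAssetReaderTrackOutput alloc] initWithTrack:track
                                     outputSettings:outputSettings];
[assetReader addOutput:assetReaderOutput];
// With a pixel format specified, CMSampleBufferGetImageBuffer should return a non-NULL CVPixelBuffer.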

Because you haven't supplied any outputSettings, you're forced to work with the raw data contained within the frame.

You have to get the block buffer from the sample buffer using CMSampleBufferGetDataBuffer(sampleBuffer). Once you have that, you need to get the actual location of the data in the block buffer using

size_t blockBufferLength;
char *blockBufferPointer;
CMBlockBufferGetDataPointer(blockBuffer, 0, NULL, &blockBufferLength, &blockBufferPointer);

Look at *blockBufferPointer and decode the bytes using the frame header information for your required codec.
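
Putting those pieces together, a rough sketch of the raw-data path (assuming the sampleBuffer from the question's loop; error handling kept minimal) might look like:

CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer); // not owned by the caller
if (blockBuffer != NULL) {
    size_t blockBufferLength = 0;
    char *blockBufferPointer = NULL;
    OSStatus status = CMBlockBufferGetDataPointer(blockBuffer, 0, NULL,
                                                  &blockBufferLength, &blockBufferPointer);
    if (status == kCMBlockBufferNoErr) {
        // blockBufferPointer now points at blockBufferLength bytes of raw, still-encoded
        // frame data; parse it according to your codec's bitstream format.
    }
}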
