Question

I'm trying to encode frames from the iPhone's camera into an H.264 video using ffmpeg's libav* libraries. I found how to convert a CMSampleBuffer to a UIImage in Apple's article, but how can I convert it to ffmpeg's AVPicture?
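For context, the frames arrive in the AVCaptureVideoDataOutput delegate callback, roughly like this (a minimal sketch; the session setup is omitted and the method body is just a placeholder):

// Delegate callback of AVCaptureVideoDataOutput: called once per camera
// frame, handing over the CMSampleBuffer I want to encode with libav*
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // sampleBuffer wraps one video frame; converting it to an
    // AVPicture is the part I'm missing
}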

Thanks.


Solution

Answering my own question:

#import <CoreMedia/CoreMedia.h>
#include <libavcodec/avcodec.h>

CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);

// Access the raw pixel data (only valid while the base address is locked)
int width = (int)CVPixelBufferGetWidth(pixelBuffer);
int height = (int)CVPixelBufferGetHeight(pixelBuffer);
unsigned char *rawPixelBase = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);

// Do something with the raw pixels here
// ...

// Fill in the AVFrame. avpicture_fill() only sets up data pointers and
// line sizes into rawPixelBase, it does not copy, so it must happen (and
// the frame must be consumed) before the base address is unlocked.
AVFrame *pFrame = avcodec_alloc_frame();
avpicture_fill((AVPicture *)pFrame, rawPixelBase, PIX_FMT_RGB32, width, height);

// Note: avpicture_fill() assumes tightly packed rows; if
// CVPixelBufferGetBytesPerRow(pixelBuffer) != width * 4, override
// pFrame->linesize[0] with that value after the fill.

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

Now pFrame is filled with the contents of the sample buffer, which uses the pixel format kCVPixelFormatType_32BGRA.
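The mapping works because, on the iPhone's little-endian hardware, PIX_FMT_RGB32 is laid out in memory as BGRA. It assumes the capture output was asked for BGRA frames in the first place; a minimal sketch of that configuration (videoOutput is a hypothetical name for your AVCaptureVideoDataOutput):

// Request 32BGRA frames from the capture output so the memory layout
// matches ffmpeg's PIX_FMT_RGB32 on little-endian iOS devices
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = [NSDictionary dictionaryWithObject:
        [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
    forKey:(id)kCVPixelBufferPixelFormatTypeKey];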

This solved my issue. Thanks.
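A caveat for readers on newer FFmpeg versions: avcodec_alloc_frame() and avpicture_fill() have since been deprecated and removed. A rough sketch of the modern equivalent, reusing the width, height, rawPixelBase, and pixelBuffer variables from above:

#include <libavutil/frame.h>
#include <libavutil/imgutils.h>

AVFrame *pFrame = av_frame_alloc();
pFrame->format = AV_PIX_FMT_BGRA;
pFrame->width  = width;
pFrame->height = height;

// Point the frame at the pixel buffer's memory without copying,
// then honor the buffer's actual row stride
av_image_fill_arrays(pFrame->data, pFrame->linesize, rawPixelBase,
                     AV_PIX_FMT_BGRA, width, height, 1);
pFrame->linesize[0] = (int)CVPixelBufferGetBytesPerRow(pixelBuffer);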

License: CC-BY-SA with attribution