Question

I have everything set up to record video with AVFoundation (on the Mac, not iOS). I can get a video preview via AVCaptureVideoPreviewLayer, and I can also capture still images. However, I cannot get the sample buffers from the video data output working, and I need them because I have to do more editing than the preview layer supports.

The code below is what I'm using at the moment. When it is called, the imageBuffer is created fine, but when I try to convert it to a CIImage, the CIImage remains nil. Any help would be appreciated.

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
   fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [[CIImage alloc] initWithCVImageBuffer:imageBuffer];
}

Solution

From the docs:

The imageBuffer parameter must be in one of the following formats:

  • kCVPixelFormatType_32ARGB
  • kCVPixelFormatType_422YpCbCr8
  • kCVPixelFormatType_32BGRA
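
To find out which format your buffers are actually arriving in, you can log the pixel format inside the delegate callback. This is just a diagnostic sketch; it assumes it runs inside your captureOutput:didOutputSampleBuffer:fromConnection: method, where sampleBuffer is available:

```objectivec
// Diagnostic: print the FourCC of the incoming pixel buffer, e.g.
// '2vuy' for kCVPixelFormatType_422YpCbCr8 or 'BGRA' for 32BGRA.
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
OSType format = CVPixelBufferGetPixelFormatType(imageBuffer);
NSLog(@"Pixel format: %c%c%c%c",
      (char)(format >> 24), (char)(format >> 16),
      (char)(format >> 8),  (char)format);
```

If the logged format is not one of the three listed above, that would explain the nil CIImage.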

You can try the route Image Buffer → IOSurface → CIImage instead; the surface-based CIImage initializer may perform an implicit conversion:

CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
// Returns NULL unless the pixel buffer is IOSurface-backed.
IOSurfaceRef surface = CVPixelBufferGetIOSurface(imageBuffer);
if (surface != NULL) {
    CIImage *ciImage = [[CIImage alloc] initWithIOSurface:surface];
}

If this doesn't work, you could reconfigure your output settings so that the sample buffers are provided in one of the supported pixel formats.
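
Reconfiguring the output settings would look something like the sketch below. It assumes you create an AVCaptureVideoDataOutput during session setup (the variable name videoDataOutput and the session variable are placeholders for whatever you use); requesting BGRA means every sample buffer arrives in a format the CIImage initializer accepts:

```objectivec
// Sketch: ask the video data output for 32BGRA sample buffers.
// `session` is assumed to be your configured AVCaptureSession.
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
videoDataOutput.videoSettings = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
};
if ([session canAddOutput:videoDataOutput]) {
    [session addOutput:videoDataOutput];
}
```

With that in place, the original initWithCVImageBuffer: call should succeed without the IOSurface detour.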

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow