Question

I am recording video in iOS using AVCaptureSession.

- (id)init
{
    if ((self = [super init]))
    {
        // Under MRC, balance the alloc with autorelease so the
        // retaining setter does not leak the session.
        [self setCaptureSession:[[[AVCaptureSession alloc] init] autorelease]];
    }
    return self;
}


- (void)addVideoPreviewLayer
{
    [self setPreviewLayer:[[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]] autorelease]];
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
}

How can I create an NSData of the recorded video while recording is still in progress?

Solution

Access the encoded frames? That is not possible with the iOS SDK alone. If you really need the encoded data, you can record a short segment to a file, read the encoded frames out of that file, start recording a new file, read more, and so on.

However, if you are trying to get the raw frames while also writing, that's fairly straightforward. Instead of capturing output to a file, add an AVCaptureVideoDataOutput and implement -captureOutput:didOutputSampleBuffer:fromConnection: on its AVCaptureVideoDataOutputSampleBufferDelegate (AVCaptureAudioDataOutputSampleBufferDelegate offers the same callback for audio). Just make sure you also route the buffers to something that encodes and writes them, otherwise you lose the "...simultaneously with recording" aspect.
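One possible shape of that setup is sketched below. This is not from the original answer: the `_writerInput` ivar stands in for an AVAssetWriterInput you would have configured elsewhere, and the delegate queue name is arbitrary.

```objc
// Add a data output to the existing capture session so every frame
// is delivered to the delegate callback below.
- (void)addVideoDataOutput
{
    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("video.capture", DISPATCH_QUEUE_SERIAL)];
    if ([[self captureSession] canAddOutput:output])
        [[self captureSession] addOutput:output];
}

// AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // 1. Keep the "recording" path alive: forward the buffer to an
    //    AVAssetWriterInput (hypothetical _writerInput) for encoding.
    if ([_writerInput isReadyForMoreMediaData])
        [_writerInput appendSampleBuffer:sampleBuffer];

    // 2. Process the same raw buffer however you need (convert to
    //    NSData, analyze pixels, stream it, etc.).
}
```

The key design point is that the delegate sees every raw frame, so the writing and the processing both hang off the same callback rather than competing outputs.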

What you get there isn't an NSData but a CMSampleBufferRef, which can be converted to NSData in various ways depending on whether the buffer holds audio or video.
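For the video case, one common conversion (a sketch, assuming a contiguous pixel format such as BGRA) is to copy the pixel buffer's bytes:

```objc
// Sketch: wrap a video sample buffer's pixel data in an NSData.
// For audio buffers you would instead read the CMBlockBuffer with
// CMBlockBufferCopyDataBytes.
NSData *NSDataFromVideoSampleBuffer(CMSampleBufferRef sampleBuffer)
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    void *base  = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t size = CVPixelBufferGetBytesPerRow(imageBuffer) *
                  CVPixelBufferGetHeight(imageBuffer);
    NSData *data = [NSData dataWithBytes:base length:size];

    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
    return data;
}
```

Note that this copies the raw, uncompressed pixels; it is not the H.264 data that ends up in the movie file.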

Licensed under: CC-BY-SA with attribution