Question

I am new to Objective-C and iOS development. I want to record video in code and, at run time, get each frame as raw data for some processing. How can I achieve this? Here is my code so far:

- (void)viewDidLoad
{
    [super viewDidLoad];

    [self setupCaptureSession];

}

The viewDidAppear method:

- (void)viewDidAppear:(BOOL)animated
{
    if (!_bpickeropen)
    {
        _bpickeropen = true;
        _picker = [[UIImagePickerController alloc] init];
        _picker.delegate = self;

        // Check that the camera source can actually deliver movies.
        NSArray *sourceTypes = [UIImagePickerController availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeCamera];
        if (![sourceTypes containsObject:(NSString *)kUTTypeMovie])
        {
            NSLog(@"device not supported");
            return;
        }

        _picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        _picker.mediaTypes = [NSArray arrayWithObjects:(NSString *)kUTTypeMovie, nil]; //,(NSString *) kUTTypeImage
        _picker.videoQuality = UIImagePickerControllerQualityTypeHigh;
        [self presentModalViewController:_picker animated:YES];
    }
}

// Delegate routine that is called when a sample buffer was written

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(sampleBuffer);

    CVPixelBufferLockBaseAddress(cameraFrame, 0);

    // CVPixelBufferGetBaseAddress returns void *, so cast it explicitly.
    GLubyte *rawImageBytes = (GLubyte *)CVPixelBufferGetBaseAddress(cameraFrame);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(cameraFrame);

    NSData *dataForRawBytes = [NSData dataWithBytes:rawImageBytes length:bytesPerRow * CVPixelBufferGetHeight(cameraFrame)];

    // Do whatever with your bytes

    NSLog(@"bytes per row %zd", bytesPerRow);

    [dataForRawBytes writeToFile:[self datafilepath] atomically:YES];

    NSLog(@"Sample Buffer Data is %@\n", dataForRawBytes);

    CVPixelBufferUnlockBaseAddress(cameraFrame, 0);
}

PROBLEMS: 1. Here I am getting the raw bytes only once. 2. After that, I want to store these raw bytes as a binary file in the app path.

Here I am setting the delegate of the output.

// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
    NSError *error = nil;

    // Create the session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPresetMedium;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice
                               defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input)
    {
        // Handle the error appropriately.
    }
    [session addInput:input];

    // Create a VideoDataOutput and add it to the session
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [session addOutput:output];

    // Configure your output on a private serial queue.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue); // not needed (and not allowed) under ARC with a 6.0+ deployment target

    // Specify the pixel format
    output.videoSettings =
    [NSDictionary dictionaryWithObject:
     [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    // If you wish to cap the frame rate to a known value, such as 15 fps, set
    // minFrameDuration.
    // output.minFrameDuration = CMTimeMake(1, 15);

    // Start the session running to start the flow of data
    [session startRunning];

    // Assign the session to an ivar so it is not deallocated.
    //[self setSession:session];
}

I appreciate any help. Thanks in advance.


Solution

You could look into the AVFoundation framework. It gives you access to the raw data generated by the camera.

This link is a good intro-level project on AVFoundation video camera usage.

In order to get individual frames from the video output, you could use the AVCaptureVideoDataOutput class from the AVFoundation framework.
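For example, the wiring looks roughly like this. This is a minimal sketch, not a definitive implementation: the queue label, the canAddOutput check, and the name videoOutput are my own, and session is assumed to be your existing AVCaptureSession:

// Ask for BGRA frames; that is the easiest format to treat as raw bytes.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = [NSDictionary dictionaryWithObject:
                             [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                        forKey:(id)kCVPixelBufferPixelFormatTypeKey];

// Deliver frames on a private serial queue so the main thread stays free.
dispatch_queue_t frameQueue = dispatch_queue_create("frameQueue", NULL);
[videoOutput setSampleBufferDelegate:self queue:frameQueue];

if ([session canAddOutput:videoOutput])
    [session addOutput:videoOutput];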

Hope this helps.

EDIT: You could look at the delegate methods of AVCaptureVideoDataOutputSampleBufferDelegate, in particular captureOutput:didOutputSampleBuffer:fromConnection:. This method is called every time a new frame is captured.
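Sketched out, the delegate method might look like the following. It addresses your second problem (storing the raw bytes as a binary file in the app path) by appending every frame to one file in the Documents directory. Note the file name frames.bin and the _fileHandle ivar are assumptions of mine, not part of any API; you would declare NSFileHandle *_fileHandle on your class:

#import <AVFoundation/AVFoundation.h>

// Called once per captured frame; appends the frame's raw BGRA bytes
// to a single binary file in the Documents directory.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    void *baseAddress  = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    size_t height      = CVPixelBufferGetHeight(pixelBuffer);

    // Copy the pixel data out before unlocking the buffer.
    NSData *frameData = [NSData dataWithBytes:baseAddress
                                       length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    // Open the output file once, then keep appending frames to it.
    // (_fileHandle is an assumed NSFileHandle ivar; "frames.bin" is an
    // arbitrary file name chosen for this sketch.)
    if (_fileHandle == nil)
    {
        NSString *documents = [NSSearchPathForDirectoriesInDomains(
            NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
        NSString *path = [documents stringByAppendingPathComponent:@"frames.bin"];
        [[NSFileManager defaultManager] createFileAtPath:path
                                                contents:nil
                                              attributes:nil];
        _fileHandle = [NSFileHandle fileHandleForWritingAtPath:path];
    }
    [_fileHandle writeData:frameData];
}

Because the delegate is called for every frame, this also covers your first problem: you get the bytes repeatedly, once per frame, rather than once.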

If you do not know how delegates work, this link is a good example of delegates.
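In short, a delegate is just an object that another object calls back through a protocol. A minimal sketch with invented names (FrameWriter and FrameWriterDelegate are illustrative, not a real API):

// A protocol declares what the delegate must implement...
@protocol FrameWriterDelegate <NSObject>
- (void)frameWriterDidFinish;
@end

// ...and the object doing the work holds a weak reference to its delegate.
@interface FrameWriter : NSObject
@property (nonatomic, weak) id<FrameWriterDelegate> delegate;
- (void)finish;
@end

@implementation FrameWriter
- (void)finish
{
    // Notify whoever signed up, exactly like AVFoundation notifies you.
    [self.delegate frameWriterDidFinish];
}
@end

AVCaptureVideoDataOutput works the same way: you set yourself as its sample buffer delegate, and it calls your captureOutput:didOutputSampleBuffer:fromConnection: for each frame.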

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow