Question

I have my app set up to record video from the camera using an AVCaptureSession; however, there is no audio with it. What do I need to do to record audio and have it included in the output file along with the video? Here is my code for recording the video:

AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session beginConfiguration];
session.sessionPreset = AVCaptureSessionPresetMedium;

CALayer *viewLayer = self.vImagePreview.layer;
NSLog(@"viewLayer = %@", viewLayer);

AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
captureVideoPreviewLayer.frame = self.vImagePreview.bounds;

[self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];

AVCaptureDevice *device = [self frontFacingCameraIfAvailable];

NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
    // Handle the error appropriately.
    NSLog(@"ERROR: trying to open camera: %@", error);
}

NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectoryPath = [paths objectAtIndex:0];

AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

NSString *archives = [documentsDirectoryPath stringByAppendingPathComponent:@"archives"];
NSString *outputpathofmovie = [[archives stringByAppendingPathComponent:@"Test"] stringByAppendingString:@".mp4"];
NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputpathofmovie];

[session addInput:input];
[session addOutput:movieFileOutput];
[session commitConfiguration];
[session startRunning];
[movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];

I added another input for the audio, but it won't work with the MPMoviePlayerController that is playing in the background. Does anyone have thoughts on how I could play one video and simultaneously record audio and video from the camera?
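
A hedged sketch of one possible approach to the playback-while-recording part (not taken from the answers below): configure the shared AVAudioSession for mixed playback and recording before starting the capture session. Whether this coexists cleanly with an MPMoviePlayerController is not verified here.

// Sketch only: request a category that allows playing and recording at the
// same time, mixing with other audio instead of interrupting it.
NSError *audioSessionError = nil;
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord
              withOptions:AVAudioSessionCategoryOptionMixWithOthers
                    error:&audioSessionError];
[audioSession setActive:YES error:&audioSessionError];

// On iOS 7 and later, the capture session can also be told not to replace
// this configuration with its own:
// session.automaticallyConfiguresApplicationAudioSession = NO;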

Solution

You have not included the audio device:

AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];
[session addInput:audioInput];

Add it between beginConfiguration and commitConfiguration, and it will work.
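
For reference, a minimal sketch of how that audio input might sit inside the question's configuration code; the canAddInput: check and the error parameter are safety additions here, not part of the original answer:

[session beginConfiguration];

// ... existing camera input, preview layer, and movie file output setup ...

// Add a microphone input alongside the camera input.
NSError *audioError = nil;
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&audioError];
if (audioInput && [session canAddInput:audioInput]) {
    [session addInput:audioInput];
} else {
    NSLog(@"ERROR: could not add audio input: %@", audioError);
}

[session commitConfiguration];
[session startRunning];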

OTHER TIPS

Add the code below between beginConfiguration() and commitConfiguration(). (This snippet uses the older Swift 2 API names; a Swift 5 version follows below.)

// Add audio device to the recording

let audioDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeAudio)
do {
    let audioInput = try AVCaptureDeviceInput(device: audioDevice)
    self.captureSession.addInput(audioInput)
} catch {
    print("Unable to add audio device to the recording.")
}

In Swift 5.x, you can use this:

do {
    guard let audioDevice = AVCaptureDevice.default(for: AVMediaType.audio) else {
        print("Default audio device is unavailable.")
        setupResult = .configurationFailed
        session.commitConfiguration()
        return
    }

    // Add audio input
    let audioInput = try AVCaptureDeviceInput(device: audioDevice)
    if session.canAddInput(audioInput) {
        session.addInput(audioInput)
    } else {
        print("Couldn't add audio device input to the session.")
        setupResult = .configurationFailed
        session.commitConfiguration()
        return
    }
} catch {
    print("Couldn't create Audio device input: \(error)")
    setupResult = .configurationFailed
    session.commitConfiguration()
    return
}
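
As with the earlier answers, this block is meant to run between session.beginConfiguration() and session.commitConfiguration(), before startRunning(); setupResult is assumed to be whatever flag the surrounding setup code uses to record a failed configuration.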