Question

I am recording video in my iOS app and sometimes (very unpredictably) it crashes with EXC_BAD_ACCESS KERN_INVALID_ADDRESS while recording. (Edit: the project uses ARC.)

Thread : Crashed: com.myapp.myapp
0  libobjc.A.dylib                0x3b1cc622 objc_msgSend + 1
1  com.myapp.myap                 0x00156faf -[Encoder encodeFrame:isVideo:] (Encoder.m:129)
2  com.myapp.myap                 0x001342ab -[CameraController captureOutput:didOutputSampleBuffer:fromConnection:] (CameraController.m:423)
3  AVFoundation                   0x2f918327 __74-[AVCaptureAudioDataOutput _AVCaptureAudioDataOutput_AudioDataBecameReady]_block_invoke + 282
4  libdispatch.dylib              0x3b6abd53 _dispatch_call_block_and_release + 10
5  libdispatch.dylib              0x3b6b0cbd _dispatch_queue_drain + 488
6  libdispatch.dylib              0x3b6adc6f _dispatch_queue_invoke + 42
7  libdispatch.dylib              0x3b6b15f1 _dispatch_root_queue_drain + 76
8  libdispatch.dylib              0x3b6b18dd _dispatch_worker_thread2 + 56
9  libsystem_pthread.dylib        0x3b7dcc17 _pthread_wqthread + 298

Declarations of my instance variables:

@interface CameraController () <AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate>
{
    AVCaptureSession* _session;
    AVCaptureVideoPreviewLayer* _preview;
    dispatch_queue_t _captureQueue;
    AVCaptureConnection* _audioConnection;
    AVCaptureConnection* _videoConnection;

    Encoder* _encoder;
    BOOL _isRecording;
    BOOL _isPaused;
    BOOL _discont;
    int _currentFile;
    CMTime _timeOffset;
    CMTime _lastVideo;
    CMTime _lastAudio;

    int _cx;
    int _cy;
    int _channels;
    Float64 _samplerate;
}
@end

And here is [Encoder encodeFrame:isVideo:] (frame 1 in the trace) in context:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    BOOL bVideo = YES;

    @synchronized(self)
    {
        if (!self.isCapturing || self.isPaused)
        {
            return;
        }
        if (connection != _videoConnection)
        {
            bVideo = NO;
        }
        if ((_encoder == nil) && !bVideo)
        {
            CMFormatDescriptionRef fmt = CMSampleBufferGetFormatDescription(sampleBuffer);
            [self setAudioFormat:fmt];
            NSString* filename = [NSString stringWithFormat:@"capture%d.mp4", _currentFile];
            NSString* path = [NSTemporaryDirectory() stringByAppendingPathComponent:filename];
            _encoder = [VideoEncoder encoderForPath:path Height:_cy width:_cx channels:_channels samples:_samplerate];
        }
        if (_discont)
        {
            if (bVideo)
            {
                return;
            }
            _discont = NO;
            // calc adjustment
            CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTime last = bVideo ? _lastVideo : _lastAudio;
            if (last.flags & kCMTimeFlags_Valid)
            {
                if (_timeOffset.flags & kCMTimeFlags_Valid)
                {
                    pts = CMTimeSubtract(pts, _timeOffset);
                }
                CMTime offset = CMTimeSubtract(pts, last);
                NSLog(@"Setting offset from %s", bVideo ? "video" : "audio");
                NSLog(@"Adding %f to %f (pts %f)", ((double)offset.value)/offset.timescale, ((double)_timeOffset.value)/_timeOffset.timescale, ((double)pts.value/pts.timescale));

                // this stops us having to set a scale for _timeOffset before we see the first video time
                if (_timeOffset.value == 0)
                {
                    _timeOffset = offset;
                }
                else
                {
                    _timeOffset = CMTimeAdd(_timeOffset, offset);
                }
            }
            _lastVideo.flags = 0;
            _lastAudio.flags = 0;
        }

        // retain so that we can release either this or modified one
        CFRetain(sampleBuffer);

        if (_timeOffset.value > 0)
        {
            CFRelease(sampleBuffer);
            sampleBuffer = [self adjustTime:sampleBuffer by:_timeOffset];
        }

        // record most recent time so we know the length of the pause
        CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        CMTime dur = CMSampleBufferGetDuration(sampleBuffer);
        if (dur.value > 0)
        {
            pts = CMTimeAdd(pts, dur);
        }
        if (bVideo)
        {
            _lastVideo = pts;
        }
        else
        {
            _lastAudio = pts;
        }
    }

    // pass frame to encoder
    [_encoder encodeFrame:sampleBuffer isVideo:bVideo]; // This is line 129
    CFRelease(sampleBuffer);
}

For the full code please refer to: http://www.gdcl.co.uk/2013/02/20/iPhone-Pause.html — I used this control for video recording. I know it is hard to help with this kind of issue, but where should I start debugging it? Thank you for any help.


Solution

In your method you have the following...

CFRetain(sampleBuffer);

if (_timeOffset.value > 0)
{
    CFRelease(sampleBuffer);
    sampleBuffer = [self adjustTime:sampleBuffer by:_timeOffset];
}

Then at the end you have another

CFRelease(sampleBuffer);

In the cases where _timeOffset.value is greater than 0, aren't you over-releasing? Or do you retain the buffer somewhere else? Should you be retaining it again within the if block?
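One way to keep the ownership balanced is to swap references explicitly: build the adjusted buffer first, retain it, and only then release the original, so exactly one retained reference reaches the final CFRelease. A minimal sketch, assuming adjustTime:by: returns a buffer the caller does not own:

```objc
// Take ownership of the incoming buffer once.
CFRetain(sampleBuffer);

if (_timeOffset.value > 0)
{
    // Create the time-adjusted buffer BEFORE releasing the original,
    // so we never call adjustTime:by: on a buffer we may have freed.
    CMSampleBufferRef adjusted = [self adjustTime:sampleBuffer by:_timeOffset];
    CFRetain(adjusted);          // take ownership of the replacement
    CFRelease(sampleBuffer);     // give up ownership of the original
    sampleBuffer = adjusted;
}

// ...later, a single CFRelease balances the single retained reference:
[_encoder encodeFrame:sampleBuffer isVideo:bVideo];
CFRelease(sampleBuffer);
```

If adjustTime:by: instead hands back a +1 buffer (for example, one created with CMSampleBufferCreateCopyWithNewTiming, which follows the Create Rule), drop the extra CFRetain on the replacement — the created reference already balances the final CFRelease.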

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow