Question

I'm trying to create a custom patch for Quartz Composer that functions just like the Video Input patch, but with a selectable capture device on an input port. It's a small patch and looks right to me, but when I connect a DV device (a Canopus ADVC-110) and select it, the ColorSpace is (null) and I get an exception. It works fine for the FaceTime HD camera, which is a video media type. I must be missing something, but I just can't see it.

The captureOutput delegate method fires repeatedly as if new frames are coming in, and the capture seems to start fine. What am I missing?
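One guard I've been considering (a sketch only; falling back to CGColorSpaceCreateDeviceRGB is my assumption, not something the Quartz Composer docs prescribe) is to substitute a colorspace when the muxed DV frame carries none, since passing a NULL colorSpace to the buffer provider seems to be what raises:

```objc
// Sketch: guard against a NULL colorspace from a muxed (DV) frame.
// `imageBuffer` is the locked CVImageBufferRef from execute: below;
// the device-RGB fallback is an assumption, not a documented requirement.
CGColorSpaceRef colorSpace = CVImageBufferGetColorSpace(imageBuffer);
BOOL ownsColorSpace = NO;
if (colorSpace == NULL) {
    colorSpace = CGColorSpaceCreateDeviceRGB();
    ownsColorSpace = YES;
}
// ... pass `colorSpace` to outputImageProviderFromBufferWithPixelFormat:... here ...
if (ownsColorSpace)
    CGColorSpaceRelease(colorSpace); // the provider holds its own reference
```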

#import <OpenGL/CGLMacro.h>
#import "CaptureWithDevice.h"

#define kQCPlugIn_Name              @"Capture With Device"
#define kQCPlugIn_Description       @"Serves as a replacement for the default Video Input patch, and differs in that it allows the input device to be specified by the user."

@implementation CaptureWithDevice
@dynamic inputDevice, outputImage;

+ (NSDictionary*) attributes
{
    return [NSDictionary dictionaryWithObjectsAndKeys:
            kQCPlugIn_Name, QCPlugInAttributeNameKey, 
            kQCPlugIn_Description, QCPlugInAttributeDescriptionKey,
            nil];
}
+ (NSDictionary*) attributesForPropertyPortWithKey:(NSString*)key
{       
    if([key isEqualToString:@"inputDevice"]) {
        NSArray *videoDevices= [QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeVideo];
        NSArray *muxedDevices= [QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeMuxed];

        NSMutableArray *mutableArrayOfDevice = [[NSMutableArray alloc] init ];
        [mutableArrayOfDevice addObjectsFromArray:videoDevices];
        [mutableArrayOfDevice addObjectsFromArray:muxedDevices];

        NSArray *devices = [NSArray arrayWithArray:mutableArrayOfDevice];
        [mutableArrayOfDevice release];

        NSMutableArray *deviceNames= [NSMutableArray array];

        NSUInteger i, ic = [devices count];
        for (i = 0; i < ic; i++) {
            [deviceNames addObject:[[devices objectAtIndex:i] description]];
            // be sure not to add CT to the list
        }

        return [NSDictionary dictionaryWithObjectsAndKeys:
                @"Device", QCPortAttributeNameKey,
                QCPortTypeIndex,QCPortAttributeTypeKey,
                [NSNumber numberWithInt:0], QCPortAttributeMinimumValueKey,
                deviceNames, QCPortAttributeMenuItemsKey,
                [NSNumber numberWithInt:ic-1], QCPortAttributeMaximumValueKey,
                nil];
    }
    if([key isEqualToString:@"outputImage"])
        return [NSDictionary dictionaryWithObjectsAndKeys:
                @"Video Image", QCPortAttributeNameKey,
                nil];
    return nil;
}
+ (QCPlugInExecutionMode) executionMode
{
    return kQCPlugInExecutionModeProvider;
}

+ (QCPlugInTimeMode) timeMode
{
    return kQCPlugInTimeModeIdle;
}

- (id) init
{
    if(self = [super init]) {
        [[NSNotificationCenter defaultCenter] addObserver:self 
                                                 selector:@selector(_devicesDidChange:) 
                                                     name:QTCaptureDeviceWasConnectedNotification 
                                                   object:nil];
        [[NSNotificationCenter defaultCenter] addObserver:self 
                                                 selector:@selector(_devicesDidChange:) 
                                                     name:QTCaptureDeviceWasDisconnectedNotification 
                                                   object:nil];
    }
    return self;
}

- (void) finalize
{
    [super finalize];
}

- (void) dealloc
{
    if (mCaptureSession) {
        [mCaptureSession stopRunning];
        [mCaptureSession release];
        [mCaptureDeviceInput release];
        [mCaptureDecompressedVideoOutput release];
    }
    CVBufferRelease(mCurrentImageBuffer); // CVBufferRelease is NULL-safe
    [[NSNotificationCenter defaultCenter] removeObserver:self];
    [super dealloc];
}

@end

@implementation CaptureWithDevice (Execution)

- (BOOL) startExecution:(id<QCPlugInContext>)context
{
    return YES;
}

- (void) enableExecution:(id<QCPlugInContext>)context
{
}
static void _BufferReleaseCallback(const void* address, void* info)
{
    CVPixelBufferUnlockBaseAddress(info, 0); 

    CVBufferRelease(info);
}
- (BOOL) execute:(id<QCPlugInContext>)context atTime:(NSTimeInterval)time withArguments:(NSDictionary*)arguments
{
    if (!mCaptureSession || [mCaptureSession isRunning]==NO || _currentDevice!=self.inputDevice){
        NSError *error = nil;
        BOOL success;

        NSArray *videoDevices= [QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeVideo];
        NSArray *muxedDevices= [QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeMuxed];

        NSMutableArray *mutableArrayOfDevice = [[NSMutableArray alloc] init ];
        [mutableArrayOfDevice addObjectsFromArray:videoDevices];
        [mutableArrayOfDevice addObjectsFromArray:muxedDevices];

        NSArray *devices = [NSArray arrayWithArray:mutableArrayOfDevice];
        [mutableArrayOfDevice release];


        NSUInteger d= self.inputDevice;
        if (!(d<[devices count])) {
            d= 0;
        }
        QTCaptureDevice *device = [devices objectAtIndex:d];
        success = [device open:&error];
        if (!success) {
            NSLog(@"Could not open device %@", device);
            self.outputImage = nil; 
            return YES;
        } 
        NSLog(@"Opened device successfully");

        [mCaptureSession release];
        mCaptureSession = [[QTCaptureSession alloc] init];

        [mCaptureDeviceInput release];
        mCaptureDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:device];

        // if the device is a muxed connection  make sure to get the right connection
        if ([muxedDevices containsObject:device]) {
            NSLog(@"Disabling audio connections");
            NSArray *ownedConnections = [mCaptureDeviceInput connections];
            for (QTCaptureConnection *connection in ownedConnections) {
                NSLog(@"MediaType: %@", [connection mediaType]);
                if ( [[connection mediaType] isEqualToString:QTMediaTypeSound]) {
                    [connection setEnabled:NO];
                    NSLog(@"disabling audio connection");

                }
            }
        }

        success = [mCaptureSession addInput:mCaptureDeviceInput error:&error];

        if (!success) {
            NSLog(@"Failed to add Input");
            self.outputImage = nil; 
            if (mCaptureSession) {
                [mCaptureSession release];
                mCaptureSession= nil;
            }
            if (mCaptureDeviceInput) {
                [mCaptureDeviceInput release];
                mCaptureDeviceInput= nil;

            }
            return YES;
        }

        NSLog(@"Adding output");

        [mCaptureDecompressedVideoOutput release];
        mCaptureDecompressedVideoOutput = [[QTCaptureDecompressedVideoOutput alloc] init];

        [mCaptureDecompressedVideoOutput setPixelBufferAttributes:
         [NSDictionary dictionaryWithObjectsAndKeys:
          [NSNumber numberWithBool:YES], kCVPixelBufferOpenGLCompatibilityKey,
          [NSNumber numberWithLong:k32ARGBPixelFormat], kCVPixelBufferPixelFormatTypeKey, nil]];

        [mCaptureDecompressedVideoOutput setDelegate:self];
        success = [mCaptureSession addOutput:mCaptureDecompressedVideoOutput error:&error];

        if (!success) {
            NSLog(@"Failed to add output");
            self.outputImage = nil; 
            if (mCaptureSession) {
                [mCaptureSession release];
                mCaptureSession= nil;
            }
            if (mCaptureDeviceInput) {
                [mCaptureDeviceInput release];
                mCaptureDeviceInput= nil;
            }
            if (mCaptureDecompressedVideoOutput) {
                [mCaptureDecompressedVideoOutput release];
                mCaptureDecompressedVideoOutput= nil;
            }
            return YES;
        }

        [mCaptureSession startRunning]; 
        _currentDevice= self.inputDevice;
    }

    CVImageBufferRef imageBuffer;
    @synchronized (self) {
        // Take the retain inside the same lock the delegate uses to swap buffers.
        imageBuffer = CVBufferRetain(mCurrentImageBuffer);
    }

    if (imageBuffer) {
        CVPixelBufferLockBaseAddress(imageBuffer, 0);
        NSLog(@"ColorSpace: %@", CVImageBufferGetColorSpace(imageBuffer));
        //NSLog(@"ColorSpace: %@ Width: %zu Height: %zu", CVImageBufferGetColorSpace(imageBuffer), CVPixelBufferGetWidth(imageBuffer), CVPixelBufferGetHeight(imageBuffer));
        id provider= [context outputImageProviderFromBufferWithPixelFormat:QCPlugInPixelFormatARGB8           
                                                                pixelsWide:CVPixelBufferGetWidth(imageBuffer)
                                                                pixelsHigh:CVPixelBufferGetHeight(imageBuffer)
                                                               baseAddress:CVPixelBufferGetBaseAddress(imageBuffer)
                                                               bytesPerRow:CVPixelBufferGetBytesPerRow(imageBuffer)
                                                           releaseCallback:_BufferReleaseCallback
                                                            releaseContext:imageBuffer
                                                                colorSpace:CVImageBufferGetColorSpace(imageBuffer)
                                                          shouldColorMatch:YES];
        if (provider == nil) {
            // Avoid leaking the locked, retained buffer when no provider is created.
            CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
            CVBufferRelease(imageBuffer);
            return NO;
        }
        self.outputImage = provider;
    } 
    else 
        self.outputImage = nil; 

    return YES; 
}

- (void) disableExecution:(id<QCPlugInContext>)context
{
}
- (void) stopExecution:(id<QCPlugInContext>)context
{
}

- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame 
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer 
       fromConnection:(QTCaptureConnection *)connection
{    
    NSLog(@"connection type: %@", [connection mediaType]);
    CVImageBufferRef imageBufferToRelease;
    CVBufferRetain(videoFrame);
    imageBufferToRelease = mCurrentImageBuffer;


    @synchronized (self) {
        mCurrentImageBuffer = videoFrame;
    }
    CVBufferRelease(imageBufferToRelease);
}
- (void)_devicesDidChange:(NSNotification *)aNotification
{
}
@end

Solution

I managed to get this patch to work with both Video and Muxed inputs by removing the kCVPixelBufferOpenGLCompatibilityKey entry from mCaptureDecompressedVideoOutput's pixel buffer attributes. While that allows the patch to work perfectly inside Quartz Composer, my intent is to run this patch in a composition used inside CamTwist, which appears not to need OpenGL support. Right now it just displays a black screen with either Video or Muxed inputs, where it was working with Video inputs before. So I'm going to convert my CVImageBufferRef to an OpenGL texture and see if I can get that to work with

outputImageProviderFromTextureWithPixelFormat:pixelsWide:pixelsHigh:name:flipped:releaseCallback:releaseContext:colorSpace:shouldColorMatch:
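For reference, this is the shape of the texture route I plan to try (an untested sketch; apart from the documented CoreVideo and QCPlugInContext calls, names such as _TextureReleaseCallback are my own placeholders):

```objc
// Sketch: wrap the captured CVImageBufferRef in an OpenGL texture and hand
// that to Quartz Composer. The cache would be created once, e.g. in
// startExecution:, using the plug-in context's CGL context.
CVOpenGLTextureCacheRef textureCache = NULL; // ivar, created once
CVOpenGLTextureCacheCreate(kCFAllocatorDefault, NULL,
                           [context CGLContextObj],
                           CGLGetPixelFormat([context CGLContextObj]),
                           NULL, &textureCache);

// Per frame, in execute::
CVOpenGLTextureRef texture = NULL;
if (CVOpenGLTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache,
                                               imageBuffer, NULL,
                                               &texture) == kCVReturnSuccess) {
    CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB(); // assumption while the DV buffer's colorspace is NULL
    self.outputImage =
        [context outputImageProviderFromTextureWithPixelFormat:QCPlugInPixelFormatARGB8
                                                    pixelsWide:CVPixelBufferGetWidth(imageBuffer)
                                                    pixelsHigh:CVPixelBufferGetHeight(imageBuffer)
                                                          name:CVOpenGLTextureGetName(texture)
                                                       flipped:CVOpenGLTextureIsFlipped(texture)
                                               releaseCallback:_TextureReleaseCallback // hypothetical; should CVOpenGLTextureRelease() the texture
                                                releaseContext:texture
                                                    colorSpace:cs
                                              shouldColorMatch:YES];
    CGColorSpaceRelease(cs);
}
```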
Licensed under: CC-BY-SA with attribution