I've got a callback in which I'm attempting to capture audio from a remote I/O unit in an audio graph. Inside the callback I pass an AudioBufferList to AudioUnitRender, and I need that buffer list to store non-interleaved data from two channels.

Here's a code snippet for context:

// AudioUnitRender call inside the render callback
OSStatus status;
status = AudioUnitRender(THIS.ioUnit,
                         ioActionFlags,
                         inTimeStamp,
                         0,                 // input bus
                         inNumberFrames,
                         &bufferList);

// want to initialize and configure bufferList to store non-interleaved data from both channels
...

// write the audio data to disk
OSStatus result;
if (*ioActionFlags == kAudioUnitRenderAction_PostRender) {
    result = ExtAudioFileWriteAsync(THIS.extAudioFileRef, inNumberFrames, &bufferList);
    if (result) printf("ExtAudioFileWriteAsync %ld\n", (long)result);
}
return noErr;

Does anyone know how to do this?

Thanks.


Solution

You should make a float** array and initialize it lazily in your render callback so that it has the same buffer size you are passed (reallocating it as necessary). From there you can simply copy to the channel you need and use that data in the other channel (I'm guessing you're making some type of effect that needs interaction between the channels).

Unfortunately, this will necessarily be a global variable, but given the restrictions on AudioUnits in iOS you'll probably just have to live with that.
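
Here is a minimal sketch of that lazily (re)allocated scratch buffer, assuming 32-bit float, non-interleaved samples; the names gScratch, gScratchCapacity, kNumChannels, and EnsureScratchCapacity are placeholders for illustration, not part of any Apple API:

#include <AudioToolbox/AudioToolbox.h>
#include <stdlib.h>
#include <string.h>

#define kNumChannels 2

static float  *gScratch[kNumChannels];   // one scratch buffer per channel
static UInt32  gScratchCapacity = 0;     // current capacity in frames

static void EnsureScratchCapacity(UInt32 inNumberFrames)
{
    if (inNumberFrames <= gScratchCapacity) return;
    for (int ch = 0; ch < kNumChannels; ch++) {
        // realloc(NULL, ...) behaves like malloc on the first call
        gScratch[ch] = (float *)realloc(gScratch[ch], inNumberFrames * sizeof(float));
    }
    gScratchCapacity = inNumberFrames;
}

// Inside the render callback, after AudioUnitRender succeeds:
//     EnsureScratchCapacity(inNumberFrames);
//     for (int ch = 0; ch < kNumChannels; ch++) {
//         memcpy(gScratch[ch], bufferList.mBuffers[ch].mData,
//                inNumberFrames * sizeof(float));
//     }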

Other tips

Your buffer list should be initialized with

myBufferList.mNumberBuffers = 2;

For a working example, check the MixerHost sample code from Apple.
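
For context, here is a hedged sketch of allocating an AudioBufferList with two non-interleaved buffers, assuming 32-bit float samples (match the byte size to your actual stream format); CreateStereoBufferList is an illustrative helper name, not an Apple API:

#include <AudioToolbox/AudioToolbox.h>
#include <stddef.h>
#include <stdlib.h>

// Allocate an AudioBufferList holding two mono (non-interleaved) buffers.
static AudioBufferList *CreateStereoBufferList(UInt32 inNumberFrames)
{
    UInt32 bytesPerChannel = inNumberFrames * sizeof(Float32);

    // AudioBufferList declares mBuffers[1], so allocate room for a second AudioBuffer.
    AudioBufferList *abl = (AudioBufferList *)malloc(
        offsetof(AudioBufferList, mBuffers) + 2 * sizeof(AudioBuffer));

    abl->mNumberBuffers = 2;                    // one buffer per channel
    for (UInt32 ch = 0; ch < 2; ch++) {
        abl->mBuffers[ch].mNumberChannels = 1;  // non-interleaved: each buffer is mono
        abl->mBuffers[ch].mDataByteSize   = bytesPerChannel;
        abl->mBuffers[ch].mData           = malloc(bytesPerChannel);
    }
    return abl;
}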
