Question

I am using the Audio Unit services to set up an output rendering callback so I can mix together synthesized audio. The code I have seems to work perfectly on the devices I have (iPod Touch, iPhone 3G, and iPad) but fails to work on the simulator.

On the simulator, the AudioUnitInitialize function fails and returns -10851 (kAudioUnitErr_InvalidPropertyValue, according to Apple's documentation).

Here is my initialisation code. Can anyone with more experience with this API see anything I'm doing incorrectly here?

#define kOutputBus 0
#define kInputBus  1 

... 

static OSStatus playbackCallback(void *inRefCon, 
                             AudioUnitRenderActionFlags* ioActionFlags, 
                             const AudioTimeStamp*       inTimeStamp, 
                             UInt32                      inBusNumber, 
                             UInt32                      inNumberFrames, 
                             AudioBufferList*            ioData) 
{
    // Mix audio here - but it never gets here on the simulator
    return noErr;
}

...



{
    OSStatus status;

    // Describe audio component
    AudioComponentDescription desc;
    desc.componentType         = kAudioUnitType_Output;
    desc.componentSubType      = kAudioUnitSubType_RemoteIO;
    desc.componentFlags        = 0;
    desc.componentFlagsMask    = 0;
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;

    // Get component
    AudioComponent inputComponent = AudioComponentFindNext(NULL, &desc);

    // Get audio units
    status = AudioComponentInstanceNew(inputComponent, &m_audio_unit);
    if(status != noErr) {
        NSLog(@"Failed to get audio component instance: %d", status);
    }

    // Enable IO for playback
    UInt32 flag = 1;
    status = AudioUnitSetProperty(m_audio_unit, 
                                  kAudioOutputUnitProperty_EnableIO, 
                                  kAudioUnitScope_Output, 
                                  kOutputBus,
                                  &flag, 
                                  sizeof(flag));
    if(status != noErr) {
        NSLog(@"Failed to enable audio i/o for playback: %d", status);
    }

    // Describe format
    AudioStreamBasicDescription audioFormat;
    audioFormat.mSampleRate       = 44100.00;
    audioFormat.mFormatID         = kAudioFormatLinearPCM;
    audioFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    audioFormat.mFramesPerPacket  = 1;      
    audioFormat.mChannelsPerFrame = 2;
    audioFormat.mBitsPerChannel   = 16;
    audioFormat.mBytesPerPacket   = 4;
    audioFormat.mBytesPerFrame    = 4;      

    // Apply format
    status = AudioUnitSetProperty(m_audio_unit, 
                                  kAudioUnitProperty_StreamFormat, 
                                  kAudioUnitScope_Input, 
                                  kOutputBus, 
                                  &audioFormat, 
                                  sizeof(audioFormat));
    if(status != noErr) {
        NSLog(@"Failed to set format descriptor: %d", status);
    }

    // Set output callback
    AURenderCallbackStruct callbackStruct;
    callbackStruct.inputProc       = playbackCallback;
    callbackStruct.inputProcRefCon = self;
    status = AudioUnitSetProperty(m_audio_unit, 
                                  kAudioUnitProperty_SetRenderCallback, 
                                  kAudioUnitScope_Global, 
                                  kOutputBus,
                                  &callbackStruct, 
                                  sizeof(callbackStruct));
    if(status != noErr) {
        NSLog(@"Failed to set output callback: %d", status);
    }

    // Initialize (This is where it fails on the simulator)
    status = AudioUnitInitialize(m_audio_unit);
    if(status != noErr) {
        NSLog(@"Failed to initialise audio unit: %d", status);
    }

}

My Xcode version is 3.2.2 (64-bit) and my Simulator version is 3.2 (though the same issue also occurs on 3.1.3, in both Debug and Release).

Thanks, I appreciate it!


Solution

Compiling for a device and for the simulator are totally different things. Most common operations behave the same way on both, for example loading a view, switching between views, playing sounds, and so on. However, when it comes to other things, like playing sound with OpenAL by loading 10 buffers and then switching between them, the simulator cannot handle it but the devices can.

The way I see it, as long as it works on the device, that's all that matters. Try not to pull your hair out just to make an application work on the simulator when it works fine on the device.

Hope that helps,

Pk

Other tips

Did you configure and enable an Audio Session prior to calling your RemoteIO initialization code?
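
If not, something along these lines, run before any of the RemoteIO setup, should take care of it. This is only a minimal sketch using the AudioSession C API; error checking is omitted, and the MediaPlayback category is an assumption (use whichever category fits your app):

#import <AudioToolbox/AudioToolbox.h>

static void setUpAudioSession(void)
{
    // Initialize the session once, before creating the RemoteIO unit.
    AudioSessionInitialize(NULL, NULL, NULL, NULL);

    // Playback-only category (an assumption): synthesized output, no recording.
    UInt32 category = kAudioSessionCategory_MediaPlayback;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                            sizeof(category), &category);

    // Activate the session so the hardware route and sample rate get configured.
    AudioSessionSetActive(true);
}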

When you are setting the stream properties to the input bus, you are using kOutputBus for your input scope. That's probably not good. Also, you probably don't need to apply the render callback to the global scope, as you only need it for output. Furthermore, I think that your definitions of kOutputBus and kInputBus are wrong... when I look at working iPhone Audio code, it uses 0 for the input bus and 1 for the output bus.
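
To illustrate the scope point, here is a sketch of the same callback setup attached to the input scope of the output element rather than the global scope. It reuses the question's playbackCallback, m_audio_unit, and bus constant (whichever element number you decide is correct for output would apply here too), and is a suggestion, not a confirmed fix:

// Attach the render callback to the input scope of the output element;
// the callback supplies the samples that feed *into* that element.
AURenderCallbackStruct callbackStruct;
callbackStruct.inputProc       = playbackCallback;
callbackStruct.inputProcRefCon = self;

OSStatus status = AudioUnitSetProperty(m_audio_unit,
                                       kAudioUnitProperty_SetRenderCallback,
                                       kAudioUnitScope_Input,
                                       kOutputBus,
                                       &callbackStruct,
                                       sizeof(callbackStruct));
if (status != noErr) {
    NSLog(@"Failed to set output callback: %d", (int)status);
}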

I can also think of a few minor things regarding the AudioStreamBasicDescription, though I don't think these will make much of a difference (a sketch with both changes applied follows the list):

  1. Add the kAudioFormatFlagsNativeEndian flag to your format flags.
  2. Explicitly set the mReserved field to 0.
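
For example, here is the question's format struct with both changes folded in (still 16-bit stereo interleaved PCM; only a sketch):

AudioStreamBasicDescription audioFormat = {0};   // zero-initialize everything, including mReserved
audioFormat.mSampleRate       = 44100.0;
audioFormat.mFormatID         = kAudioFormatLinearPCM;
audioFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger
                              | kAudioFormatFlagIsPacked
                              | kAudioFormatFlagsNativeEndian;  // point 1
audioFormat.mFramesPerPacket  = 1;
audioFormat.mChannelsPerFrame = 2;
audioFormat.mBitsPerChannel   = 16;
audioFormat.mBytesPerFrame    = 4;   // 2 channels * 2 bytes per sample
audioFormat.mBytesPerPacket   = 4;
audioFormat.mReserved         = 0;   // point 2: explicit, on top of the zero-init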