How can I obtain the native (hardware-supported) audio sampling rates in order to avoid internal sample rate conversion?

StackOverflow https://stackoverflow.com/questions/20889902

  •  23-09-2022

Question

Can anybody point me to documentation stating the native sampling rates on the different iPhone versions in order to avoid core-audio internal sampling rate conversion?

Edit: Otherwise, can you please point me to a source code example of how I can get those values programmatically?

Edit: This Apple document (page 26) refers to a canonical audio format, but it only mentions the sample type (PCM) and bit depth (16-bit). It doesn't mention any native sampling rates supported directly by the capture hardware. Those are the values I'm looking for.


Solution

What you need to do is find a way to detect the hardware sample rate, and use whatever you find in your subsequent code.

There is an audio session property which will give you this: CurrentHardwareSampleRate

// Requires AudioToolbox.framework: #import <AudioToolbox/AudioToolbox.h>
- (void) logSampleRate {
    Float64 sampleRate;
    UInt32 srSize = sizeof(sampleRate);
    // Ask the (old, C-based) Audio Session API for the current hardware rate.
    OSStatus error = AudioSessionGetProperty(
                         kAudioSessionProperty_CurrentHardwareSampleRate,
                         &srSize,
                         &sampleRate);
    if (error == noErr) {
        NSLog(@"CurrentHardwareSampleRate = %f", sampleRate);
    } else {
        NSLog(@"AudioSessionGetProperty failed: %d", (int)error);
    }
}
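
One caveat: the property read above will typically just return an error unless the old C audio session has been initialized first. A minimal sketch of the surrounding setup (the wrapper method name is made up for illustration):

#import <AudioToolbox/AudioToolbox.h>   // Audio Session Services (deprecated)

// Hypothetical wrapper: initialize and activate the session, then query it.
- (void) setUpAndLogSampleRate {
    AudioSessionInitialize(NULL, NULL, NULL, NULL);  // once per app is enough
    AudioSessionSetActive(true);
    [self logSampleRate];   // the method shown above
}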

The iPhone 4S and iPhone 5S report hardwareSampleRate = 44100.000000, but other devices may (will) differ...

edit

While answering the question and reading the latest docs, I see that CurrentHardwareSampleRate is deprecated as of iOS 6. And really, I should have known better, given my own advice.

So here is the thing:

1 - don't use this antiquated and deprecated C interface, use the AVAudioSession API

2 - don't use currentHardwareSampleRate, use sampleRate.

This raises the suspicion that Apple wants to distance us (even) further from the metal. But we can rest assured. While the docs for sampleRate offer only

The audio sample rate, in hertz, that is currently in effect. (read-only)

notably omitting the word hardware, we also have this method:

- (BOOL)setPreferredSampleRate:(double)sampleRate error:(NSError **)outError

where sampleRate is described adequately as

The hardware sample rate you want to use. The available range for hardware sample rate is device dependent. It typically ranges from 8,000 through 48,000 hertz.

It seems that the way we are supposed to use this is to set our preferred rate; the device will then settle on an actual rate, presumably as near to the preferred one as the hardware allows. So once you have set your preferred rate, you can check the value of the sampleRate property (as opposed to preferredSampleRate) to discover the actual sample rate that will be used. This is - we hope - the hardware sampling rate.

Here is the modern way...

// Requires AVFoundation.framework: #import <AVFoundation/AVFoundation.h>
- (void) logAudioFormatAV {
    AVAudioSession* session = [AVAudioSession sharedInstance];
    NSError* error = nil;
    double preferredSampleRate = 48000;
    BOOL success = [session setPreferredSampleRate:preferredSampleRate error:&error];
    if (!success) {
        NSLog(@"error setting preferred sample rate %@", error);
        return;
    }
    // The preferred rate is only a request; activate the session, then read
    // sampleRate to see the rate actually in effect.
    success = [session setActive:YES error:&error];
    if (success) {
        NSLog(@"session.sampleRate = %f", session.sampleRate);
    } else {
        NSLog(@"error activating session %@", error);
    }
}

I have tried this with various preferred sample rates on the 4S and 5S and always get back 44100. So this seems to be doing the right thing, and reporting back the actual hardware sample rate. But this deserves wider testing for greater certainty.
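
And to close the loop on the original question - avoiding internal sample rate conversion - the detected rate then needs to be fed back into whatever client format you hand to Core Audio. Here is a minimal sketch of building an AudioStreamBasicDescription at the session's rate (the 16-bit signed mono interleaved PCM layout is just an assumption; adjust to your needs):

#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>

// Build a client stream format at the rate the session reports, so Core
// Audio has no reason to resample. 16-bit signed mono PCM is an assumption.
static AudioStreamBasicDescription clientFormatAtHardwareRate(void) {
    AVAudioSession* session = [AVAudioSession sharedInstance];

    AudioStreamBasicDescription asbd = {0};
    asbd.mSampleRate       = session.sampleRate;   // the actual (hardware) rate
    asbd.mFormatID         = kAudioFormatLinearPCM;
    asbd.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    asbd.mChannelsPerFrame = 1;
    asbd.mBitsPerChannel   = 16;
    asbd.mBytesPerFrame    = asbd.mChannelsPerFrame * (asbd.mBitsPerChannel / 8);
    asbd.mFramesPerPacket  = 1;
    asbd.mBytesPerPacket   = asbd.mBytesPerFrame * asbd.mFramesPerPacket;
    return asbd;
}

Set that as the client-side format on your Remote I/O unit (or audio queue) and, provided sampleRate really does reflect the hardware, no rate conversion should be necessary.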

I'd like to also point you in the direction of Chris Adamson's excellent book Learning Core Audio, which touches on this subject in Chapter 10, Core Audio on iOS.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow