Question

Could someone explain to me how OpenAL fits into the scheme of sound on the iPhone?

There seem to be APIs at different levels for handling sound. The higher-level ones are easy enough to understand.

But my understanding gets murky towards the bottom. There are Core Audio, Audio Units, and OpenAL.

What is the connection between these? Is OpenAL the substratum, upon which rests Core Audio (which contains Audio Units as one of its lower-level objects)?

OpenAL doesn't seem to be documented by Xcode, yet I can run code that uses its functions.


Solution

This is what I have figured out:

The substratum is Core Audio. Specifically, Audio Units.

So Audio Units form the base layer, other low-level frameworks are built on top of them, and the whole caboodle is termed Core Audio.
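To make that concrete, here's a minimal sketch of what sitting on that base layer looks like: instantiating the RemoteIO output Audio Unit and giving it a render callback, using the plain C Audio Unit API. The `startRemoteIO` name and the silence-filling callback are just illustrative, and error handling is omitted.

```c
#include <AudioUnit/AudioUnit.h>
#include <string.h>

static OSStatus renderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData)
{
    // Core Audio *pulls*: it asks you for inNumberFrames of PCM here.
    // This sketch just writes silence into every output buffer.
    for (UInt32 i = 0; i < ioData->mNumberBuffers; i++)
        memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
    return noErr;
}

void startRemoteIO(void)
{
    // Find and instantiate the RemoteIO output unit -- the bottom of the stack.
    AudioComponentDescription desc = {
        .componentType = kAudioUnitType_Output,
        .componentSubType = kAudioUnitSubType_RemoteIO,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };
    AudioComponent comp = AudioComponentFindNext(NULL, &desc);
    AudioUnit unit;
    AudioComponentInstanceNew(comp, &unit);

    // Hook the render callback onto the output bus, then start the unit.
    AURenderCallbackStruct cb = { .inputProc = renderCallback, .inputProcRefCon = NULL };
    AudioUnitSetProperty(unit, kAudioUnitProperty_SetRenderCallback,
                         kAudioUnitScope_Input, 0, &cb, sizeof(cb));
    AudioUnitInitialize(unit);
    AudioOutputUnitStart(unit);
}
```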

OpenAL is a multiplatform API -- the creators are trying to mirror the portability of OpenGL. A few companies are sponsoring OpenAL, including Creative Labs and Apple!

So Apple has provided this API, basically as a thin wrapper over Core Audio. I am guessing this is to let developers port code over easily. Be warned that it is an incomplete implementation: if you want OpenAL to do something that Core Audio can do, it will do it; otherwise it won't.

Kind of counterintuitive -- just looking at the source, it looks as if OpenAL is lower level. Not so!
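For illustration, here's a minimal OpenAL playback sketch. These are the standard OpenAL 1.1 calls you'd make on any platform; on iOS they come from the OpenAL framework and Apple's implementation routes them through Core Audio. The one-second buffer of silence and the `playSilence` name are just placeholders.

```c
#include <OpenAL/al.h>
#include <OpenAL/alc.h>

void playSilence(void)
{
    // Open the default device and create a context (the Core Audio-backed one on iOS).
    ALCdevice  *device  = alcOpenDevice(NULL);
    ALCcontext *context = alcCreateContext(device, NULL);
    alcMakeContextCurrent(context);

    // OpenAL only takes uncompressed PCM; this is one second of 16-bit mono
    // silence (static storage is zero-initialized).
    static ALshort pcm[44100];

    ALuint buffer, source;
    alGenBuffers(1, &buffer);
    alBufferData(buffer, AL_FORMAT_MONO16, pcm, sizeof(pcm), 44100);

    // Attach the buffer to a source and play it.
    alGenSources(1, &source);
    alSourcei(source, AL_BUFFER, (ALint)buffer);
    alSourcePlay(source);
}
```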

OTHER TIPS

Core Audio covers a lot of things, such as reading and writing various file formats, converting between encodings, pulling frames out of streams, etc. Much of this functionality is collected as the "Audio Toolbox".

Core Audio also offers multiple APIs for processing streams of audio, for playback, capture, or both. The lowest-level one is Audio Units, which works with uncompressed (PCM) audio and has some nice stuff for applying effects, mixing, etc. Audio Queues, implemented atop Audio Units, are a lot easier because they work with compressed formats (not just PCM) and save you from some threading challenges. OpenAL is also implemented atop Audio Units; you still have to use PCM, but at least the threading isn't scary.

The difference is that, since OpenAL isn't from Apple, its programming conventions are totally different from Core Audio and the rest of iOS. Most obviously, it's a push API: if you want to stream with OpenAL, you poll your sources to see if they've exhausted their buffers and push in new ones. By contrast, Audio Queues and Audio Units are pull-based: you get a callback when new samples are needed for playback.
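As an illustration of that push model, here's a rough sketch of the OpenAL streaming loop: poll a source for processed buffers, unqueue them, refill them, and queue them again. Compare it with the Audio Unit render callback sketched earlier, where Core Audio pulls samples from you. The `fillNextChunk` decoder function is hypothetical, and the format and sample rate are just example values.

```c
#include <OpenAL/al.h>

// Hypothetical decoder stand-in: writes up to maxSamples of PCM into dst
// and returns how many samples it produced.
extern ALsizei fillNextChunk(ALshort *dst, ALsizei maxSamples);

void topUpSource(ALuint source)
{
    // Ask the source how many queued buffers it has finished playing.
    ALint processed = 0;
    alGetSourcei(source, AL_BUFFERS_PROCESSED, &processed);

    while (processed-- > 0) {
        // Take back a drained buffer...
        ALuint buffer;
        alSourceUnqueueBuffers(source, 1, &buffer);

        // ...refill it with fresh PCM and push it to the back of the queue.
        ALshort pcm[4096];
        ALsizei samples = fillNextChunk(pcm, 4096);
        alBufferData(buffer, AL_FORMAT_MONO16, pcm,
                     samples * (ALsizei)sizeof(ALshort), 44100);
        alSourceQueueBuffers(source, 1, &buffer);
    }
}
```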

At a higher level, as you've seen, there's nice stuff like Media Player and AV Foundation. These are a lot easier if you're just playing a file, but probably aren't going to give you deep enough access if you want to do effects, signal processing, etc.
