Question

I'm currently developing an app that plays an audio file (MP3, but I can switch to WAV to reduce decoding time) while recording audio at the same time.
For synchronization purposes, I want to estimate the exact time when audio started playing.

Using AudioQueue to control each buffer, I can estimate the time when the first buffer was drained. My questions are:

  1. What is the hardware delay between AudioQueue buffers being drained and them actually being played?
  2. Is there a lower-level API (specifically, AudioUnit) that offers lower hardware latency?
  3. Is it possible to place an upper limit on hardware latency using AudioQueue, with or without decoding the buffer? 5 ms is something I can work with; more than that will require a different approach.

Thanks!

Solution

The Audio Queue API runs on top of Audio Units, so using the RemoteIO Audio Unit with raw uncompressed audio will allow lower and more deterministic latency. The minimum RemoteIO buffer duration that can be set on some iOS devices (using the Audio Session API) is about 6 to 24 milliseconds, depending on application state. That may set a lower limit on both play and record latency, depending on which events you use as your latency measurement points.
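The buffer-duration floor translates directly into latency: duration = frames per buffer / sample rate. A minimal sketch of that arithmetic (plain Swift; the 256- and 1024-frame buffer sizes and 44.1 kHz rate are illustrative assumptions, chosen to land near the 6–24 ms range cited above, not values read from any device):

```swift
// Latency contributed by one I/O buffer, in milliseconds:
// frames per buffer divided by the hardware sample rate.
func bufferDurationMs(frames: Int, sampleRate: Double) -> Double {
    return Double(frames) / sampleRate * 1_000.0
}

let sampleRate = 44_100.0  // common iOS hardware rate (assumption)

// A small RemoteIO buffer of 256 frames ≈ 5.8 ms — near the ~6 ms floor.
let low = bufferDurationMs(frames: 256, sampleRate: sampleRate)

// A 1024-frame buffer ≈ 23.2 ms — near the ~24 ms upper figure.
let high = bufferDurationMs(frames: 1024, sampleRate: sampleRate)

print(low, high)
```

Note this is only the per-buffer contribution; actual end-to-end latency also includes hardware and driver delays, so treat the requested buffer duration as a lower bound, not a guarantee.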

Decoding compressed audio can add one to two orders of magnitude more latency, measured from the start of decoding.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow