Question

I'm looking at the getInputBufferSize(...) function in AudioHardwareALSA.cpp, and it returns a hardcoded value of 320. My question is: how is this value calculated?

I've done some preliminary calculations, but a few questions remain.


sample_rate = 8000 samples/sec
format = S16_LE = 2 bytes/sample
period_time = 10000 us (guessing)
period_size = period_time * bytes/sec = 0.01 * 8000 * 2 = 160 bytes
buffer_size = 2 * period_size = 320 bytes

I can't find the period_time in the code, so one question is: where is it defined, or is this just a rough estimate?

I'm also trying to add more sample rates, e.g. 16000 and 32000 (and possibly more later). How do I calculate the right minimum buffer size? Is the delay always 10 ms for all sample rates?

Any help is appreciated.


Solution

I believe Google implemented NB-AMR encode to start with; later they added support for AAC. In the case of NB-AMR, the frame size is 320 bytes. You may be aware that for NB-AMR:

sampling rate = 8000 samples/sec
frame duration = 20 ms
sample size = 2 bytes
channels = mono

So each frame contains 8000 samples/sec * 0.02 sec * 2 bytes/sample * 1 channel = 320 bytes.

For AAC, these parameters are different, and hence so is the frame size.

Licensed under: CC-BY-SA with attribution