Question

I'm using Android's MediaCodec class to read raw data from audio files. That works just fine.

The problem is that I don't know whether it's safe to assume that the output data will always be 16-bit.

I can tell, experimentally, that the output is 16-bit, but I don't know how to check that at runtime. The MediaCodec documentation doesn't appear to say. I thought MediaFormat's KEY_CHANNEL_MASK might tell me, but MediaCodec doesn't appear to set that key. It sets the sample rate and the MIME type, but nothing that states the bit depth explicitly.
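
For context, this is roughly how I'm inspecting the decoder's output format. The extractor/decoder setup is omitted, and `PcmFormatProbe`/`dumpOutputFormat` are just names I've given the helper:

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.util.Log;

final class PcmFormatProbe {
    private static final String TAG = "PcmFormatProbe";

    // Called after dequeueOutputBuffer() returns INFO_OUTPUT_FORMAT_CHANGED.
    static void dumpOutputFormat(MediaCodec codec) {
        MediaFormat format = codec.getOutputFormat();
        String mime = format.getString(MediaFormat.KEY_MIME);            // "audio/raw" for PCM output
        int sampleRate = format.getInteger(MediaFormat.KEY_SAMPLE_RATE); // e.g. 44100
        int channels = format.getInteger(MediaFormat.KEY_CHANNEL_COUNT); // e.g. 2

        // KEY_CHANNEL_MASK is absent in my tests; nothing here states the bit depth.
        boolean hasMask = format.containsKey(MediaFormat.KEY_CHANNEL_MASK);
        Log.d(TAG, "mime=" + mime + " rate=" + sampleRate
                + " channels=" + channels + " hasChannelMask=" + hasMask);
    }
}
```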

I suppose that, given the difference between the presentation times of consecutive output buffers and the sample rate, I should be able to calculate it, but that doesn't seem very satisfactory.
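
To illustrate that calculation, here is a minimal sketch; `guessBytesPerSample` is a hypothetical helper, with the buffer size and presentation timestamps assumed to come from MediaCodec.BufferInfo and the rate/channel count from the output format:

```java
// Sketch only: infer bytes per sample from one output buffer and the
// presentation-time gap to the next one.
static int guessBytesPerSample(int sizeBytes, long ptsUs, long nextPtsUs,
                               int sampleRate, int channelCount) {
    long durationUs = nextPtsUs - ptsUs;                // time this buffer covers
    long frames = durationUs * sampleRate / 1_000_000L; // sample frames in that span
    return (int) (sizeBytes / (frames * channelCount)); // bytes per single sample
}
```

For 16-bit stereo this comes out to 2, but timestamp rounding at buffer boundaries makes it fragile, which is exactly why I'd prefer an explicit format key.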

Is there a way to tell, or is it written somewhere that I don't have to?


Solution

Currently the output is always 16-bit in stock Android. If that changes in the future, we'll add an additional format key that specifies the sample format. Note that KEY_CHANNEL_MASK would only tell you which channels are included (e.g. left, right, center), not the sample format.
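
A defensive pattern in the meantime is to assume 16-bit unless the output format explicitly says otherwise. As a minimal sketch: MediaFormat.KEY_PCM_ENCODING (added in API level 24) is the kind of key described above, and reading code can degrade gracefully when it is absent:

```java
import android.media.AudioFormat;
import android.media.MediaFormat;
import android.os.Build;

final class PcmEncoding {
    // Returns the PCM encoding of a decoder's output format, falling back
    // to the 16-bit default when no explicit key is present.
    static int resolve(MediaFormat outputFormat) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.N
                && outputFormat.containsKey(MediaFormat.KEY_PCM_ENCODING)) {
            return outputFormat.getInteger(MediaFormat.KEY_PCM_ENCODING);
        }
        // No key set: stock Android decoders emit signed 16-bit PCM.
        return AudioFormat.ENCODING_PCM_16BIT;
    }
}
```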

OTHER TIPS

No, it does not. If you give it 24-bit PCM (WAV) files, it will give you 24-bit audio, with seemingly no way to detect this at runtime. I have asked a follow-up question here: MediaCodec and 24 bit PCM
