Question

I broke out a .WAV file into a byte array. The audio is 16-bit, 1 channel, with a sampling rate of 44,100 Hz.

Is there a way to calculate what one byte of data represents with respect to time?


Solution

One byte of 16-bit audio isn't meaningful on its own: each 16-bit sample spans two bytes.
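As a minimal sketch of that point, assuming the usual little-endian signed PCM layout for 16-bit WAV data (the byte values here are hypothetical):

```python
import struct

# Two consecutive raw bytes from a 16-bit little-endian PCM stream.
raw = bytes([0x34, 0x12])

# "<h" = little-endian signed 16-bit integer, the standard 16-bit WAV sample format.
(sample,) = struct.unpack("<h", raw)
print(sample)  # 0x1234 = 4660
```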

If you want to know how many milliseconds each sample represents:

1,000 ms / sample rate (Hz)

At 44.1 kHz that is 1,000 / 44,100 ≈ 0.0227 ms per sample.
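A short sketch of the arithmetic, assuming the mono, 16-bit, 44,100 Hz format from the question; `byte_offset_to_ms` is a hypothetical helper mapping a position in the raw byte array to a time:

```python
SAMPLE_RATE_HZ = 44_100
BYTES_PER_SAMPLE = 2  # 16-bit audio
CHANNELS = 1          # mono

ms_per_sample = 1_000 / SAMPLE_RATE_HZ
print(f"{ms_per_sample:.4f} ms per sample")  # ~0.0227 ms

def byte_offset_to_ms(offset: int) -> float:
    """Map a byte offset in the PCM data to a time in milliseconds."""
    frame = offset // (BYTES_PER_SAMPLE * CHANNELS)
    return frame * ms_per_sample

print(byte_offset_to_ms(88_200))  # 44,100 samples in -> 1000.0 ms
```

Note the integer division: a byte offset that lands mid-sample is rounded down to the start of its sample before converting to time.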
