Question

I am trying to extract amplitude information from a sound loaded from a URL using the Web Audio API, all at once rather than in real time, which will likely require an OfflineAudioContext. I'm expecting to end up with something like an array containing the sound's amplitude every t seconds over its full duration (so the array's length would be the duration divided by t). Unfortunately, documentation is sparse at this point, and I'm unsure how to proceed. How can I load the sound and extract the amplitude every t seconds?

Solution

This was done very quickly, so the math might be messed up. But hopefully it'll get you started...

var ac = new (window.AudioContext || window.webkitAudioContext)(),
  url = 'path/to/audio.mp3';

function fetchAudio( url, callback ) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', url, true);
  xhr.responseType = 'arraybuffer';
  xhr.onload = function() {
    callback(xhr.response);
  };
  xhr.send();
}

function decode( arrayBuffer, callback ) {
  ac.decodeAudioData(arrayBuffer, function( audioBuffer ) {
    callback(audioBuffer);
  });
}

// return an array of amplitudes for the supplied `audioBuffer`
//
// each item in the array will represent the RMS amplitude (in dB)
// for a chunk of audio `t` seconds long
function slice( audioBuffer, t ) {
  var channels = audioBuffer.numberOfChannels,
    sampleRate = audioBuffer.sampleRate,
    len = audioBuffer.length,
    samples = sampleRate * t,
    data = [],
    output = [],
    amplitude,
    values,
    i = 0,
    j, k;
  // grab each channel's sample data once, up front
  for ( k = 0; k < channels; ++k ) {
    data.push(audioBuffer.getChannelData(k));
  }
  // loop by chunks of `t` seconds
  for ( ; i < len; i += samples ) {
    values = [];
    // loop through each sample in the chunk
    for ( j = 0; j < samples && j + i < len; ++j ) {
      amplitude = 0;
      // average the sample across all channels
      for ( k = 0; k < channels; ++k ) {
        amplitude += data[k][i + j];
      }
      values.push(amplitude / channels);
    }
    output.push(dB(values));
  }
  return output;
}

// calculate the RMS amplitude (in dB) for an array of samples
function dB( buffer ) {
  var len = buffer.length,
    total = 0,
    rms,
    db,
    i;
  // sum the squares of the samples
  for ( i = 0; i < len; ++i ) {
    total += buffer[i] * buffer[i];
  }
  rms = Math.sqrt( total / len );
  db = 20 * ( Math.log(rms) / Math.LN10 ); // i.e. 20 * log10(rms)
  return db;
}


// fetch the audio, decode it, and log an array of average
// amplitudes for each 5-second chunk
fetchAudio(url, function( arrayBuffer ) {
  decode(arrayBuffer, function( audioBuffer ) {
    console.log(slice(audioBuffer, 5));
  });
});
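
As an aside, in current browsers the same fetch-and-decode plumbing can be written with promises. This is only a variant sketch, assuming fetch() and the promise-returning form of decodeAudioData are available:

// promise-based variant of fetchAudio()/decode()
function loadAndSlice( url, t ) {
  return fetch(url)
    .then(function( response ) { return response.arrayBuffer(); })
    .then(function( arrayBuffer ) { return ac.decodeAudioData(arrayBuffer); })
    .then(function( audioBuffer ) { return slice(audioBuffer, t); });
}

// same result as above: an array of dB values, one per 5-second chunk
loadAndSlice(url, 5).then(function( amplitudes ) {
  console.log(amplitudes);
});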

Basically, if you want to get data for an entire buffer faster than real time, you don't even need an OfflineAudioContext: you can just read the decoded samples, do some math, and work it out yourself.

This is pretty slow, though, especially for larger audio files, so you might want to run the number-crunching in a Web Worker.
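
A rough sketch of what that could look like follows. This is only an outline, not part of the original answer: `amplitude-worker.js` is a made-up file name, and the worker simply repeats the slice()/dB() math on the raw channel data it is sent.

// main thread: post the raw channel data and sample rate to a worker
function sliceInWorker( audioBuffer, t, callback ) {
  var worker = new Worker('amplitude-worker.js'),
    channels = [],
    k;
  for ( k = 0; k < audioBuffer.numberOfChannels; ++k ) {
    channels.push(audioBuffer.getChannelData(k));
  }
  worker.onmessage = function( e ) {
    callback(e.data); // array of dB values, one per chunk
  };
  worker.postMessage({ channels: channels, sampleRate: audioBuffer.sampleRate, t: t });
}

// amplitude-worker.js: the same per-chunk RMS/dB math, off the main thread
onmessage = function( e ) {
  var channels = e.data.channels,
    samples = e.data.sampleRate * e.data.t,
    len = channels[0].length,
    output = [],
    sum, count, mixed,
    i, j, k;
  for ( i = 0; i < len; i += samples ) {
    sum = 0;
    count = 0;
    for ( j = 0; j < samples && i + j < len; ++j ) {
      mixed = 0;
      // average the sample across all channels
      for ( k = 0; k < channels.length; ++k ) {
        mixed += channels[k][i + j];
      }
      mixed /= channels.length;
      sum += mixed * mixed;
      ++count;
    }
    output.push(20 * ( Math.log(Math.sqrt(sum / count)) / Math.LN10 ));
  }
  postMessage(output);
};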

It's possible that using an OfflineAudioContext would be faster; I'm really not sure. But even if you decide to go down that route, there's still a fair amount of manual work needed to get the amplitude of these arbitrary t-second chunks, since rendering only hands you back another AudioBuffer to crunch.
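
If you did want to try it, a minimal sketch might look like this, assuming the unprefixed OfflineAudioContext constructor is available; slice() is the function defined above, applied to the rendered buffer:

// sketch: render the decoded buffer offline, then run the same per-chunk math
function renderAndSlice( audioBuffer, t, callback ) {
  var oac = new OfflineAudioContext(
        audioBuffer.numberOfChannels,
        audioBuffer.length,
        audioBuffer.sampleRate
      ),
      source = oac.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(oac.destination);
  source.start(0);
  oac.oncomplete = function( e ) {
    // e.renderedBuffer is an AudioBuffer, so slice() works on it unchanged
    callback(slice(e.renderedBuffer, t));
  };
  oac.startRendering();
}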

Licensed under: CC-BY-SA with attribution