Question

Using the latest Node.js...

I've got binary data coming from MongoDB (a field within a document), which means I will be processing multiple binary payloads concurrently. The data is a media file (H.264) made up of slices (NAL units), and each slice is delimited.

Using a readable stream from fs: if I act on "data" events, is the order of the data chunks preserved? Can I be guaranteed to process the "data" in order? (I can see the origin in the path part of the "this" scope in each call.)
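
For illustration, the consumption pattern in question looks something like this minimal sketch ('payload.h264' is a hypothetical placeholder for one binary payload):

const fs = require('fs');

// 'payload.h264' is a hypothetical path standing in for one MongoDB binary field.
const readable = fs.createReadStream('payload.h264');

let offset = 0;
readable.on('data', function (chunk) {
  // If order is preserved, this offset tracks each chunk's byte position.
  console.log('chunk at byte offset', offset, 'length', chunk.length);
  offset += chunk.length;
});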

Solution

The order in which data is written to a stream is guaranteed to be the order in which it is read. When writing to a stream, each chunk is either written immediately or queued; the order never changes. This is the writeOrBuffer function from the Node.js source:

function writeOrBuffer(stream, state, chunk, encoding, cb) {
  chunk = decodeChunk(state, chunk, encoding);
  if (util.isBuffer(chunk))
    encoding = 'buffer';
  var len = state.objectMode ? 1 : chunk.length;

  state.length += len;

  var ret = state.length < state.highWaterMark;
  state.needDrain = !ret;

  // If a write is already in progress (or the stream is corked), the chunk
  // is appended to the buffer queue; otherwise it is written immediately.
  // Either way, chunks go out in FIFO order.
  if (state.writing || state.corked)
    state.buffer.push(new WriteReq(chunk, encoding, cb));
  else
    doWrite(stream, state, false, len, chunk, encoding, cb);

  return ret;
}

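To see this write-or-queue behavior in isolation, here is a minimal sketch (not from the Node.js source, just an illustration) using a toy Writable whose write callback completes asynchronously, so later writes are forced onto the queue:

const { Writable } = require('stream');

// A toy sink whose _write completes asynchronously, so any write issued
// while another is in flight must go through the buffer queue.
const sink = new Writable({
  write(chunk, encoding, callback) {
    setTimeout(function () {
      console.log('wrote:', chunk.toString());
      callback();
    }, 10);
  }
});

['a', 'b', 'c', 'd'].forEach(function (s) {
  sink.write(s); // writes of b, c, d are queued behind the in-flight write
});
sink.end();

The output is always a, b, c, d: anything written while a write is in flight is pushed onto the queue behind it, never ahead of it.
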
The same holds for how "data" events are fired, again from the Node.js source:

// if we want the data now, just emit it.
if (state.flowing && state.length === 0 && !state.sync) {
  stream.emit('data', chunk);
  stream.read(0);
}

The "data" event won't fire for a chunk unless there is no queued data ahead of it, which means you will get the chunks in the same order they were passed in.
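
In practical terms, this means the payload can be reassembled simply by appending chunks as they arrive; a minimal sketch, again with a hypothetical placeholder path:

const fs = require('fs');

// 'payload.h264' is a hypothetical placeholder for one binary payload.
const readable = fs.createReadStream('payload.h264');
const chunks = [];

readable.on('data', function (chunk) {
  chunks.push(chunk); // arrival order matches byte order in the source
});

readable.on('end', function () {
  // Because order is preserved, concatenation reconstructs the original
  // byte sequence, so NAL unit delimiters can be scanned sequentially.
  const payload = Buffer.concat(chunks);
  console.log('reassembled', payload.length, 'bytes');
});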

Licensed under: CC-BY-SA with attribution