Question

I've extracted the raw pixels from a 16-bit RGBA PNG and am trying to separate the colour and alpha channels into two separate TypedArrays.

For 8-bit RGBA this works:

var pixels = myRawPixels, // a Uint32Array
    len = pixels.length,
    colorData = new Uint8Array(len * 3),
    alphaData = new Uint8Array(len),
    pixel,
    i = 0,
    n = 0;

for(; i < len; i++) {
    pixel = pixels[i];

    colorData[n++] = (pixel >>> 0) & 0xff; // R
    colorData[n++] = (pixel >>> 8) & 0xff;  // G
    colorData[n++] = (pixel >>> 16) & 0xff;  // B

    alphaData[i] = (pixel >>> 24) & 0xff;
}
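For instance, splitting a single packed pixel (the sample value below is made up for illustration; little-endian channel order, with R in the low byte, is assumed):

```javascript
// Hypothetical sample: one 32-bit RGBA pixel with A=0x44, B=0x33,
// G=0x22, R=0x11, matching the shift order used above.
var pixels = Uint32Array.of(0x44332211),
    colorData = new Uint8Array(3),
    alphaData = new Uint8Array(1),
    pixel = pixels[0];

colorData[0] = (pixel >>> 0)  & 0xff; // R = 0x11
colorData[1] = (pixel >>> 8)  & 0xff; // G = 0x22
colorData[2] = (pixel >>> 16) & 0xff; // B = 0x33
alphaData[0] = (pixel >>> 24) & 0xff; // A = 0x44
```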

But due to the lack of 64-bit integers in JS, I'm struggling to work out what the equivalent would look like for a 16-bit image:

var pixels = myRawPixels, // Uint32Array
    len = pixels.length,
    imgData = new Uint16Array((len * 0.5) * 3),
    alphaData = new Uint16Array(len * 0.5),
    i = 0, n = 0, a = 0,
    pixel;

for(; i < len; i++) {
    pixel = pixels[i] | pixels[i++];

    imgData[n++] = pixel >>> 0;
    imgData[n++] = pixel >>> 16;
    imgData[n++] = pixel >>> 32;

    alphaData[a++] = pixel >>> 48;
}

The solution

pixel = pixels[i] | pixels[i++];

That doesn't join them into a 64-bit int: bitwise operators in JavaScript coerce their operands to 32-bit integers, so the result is still only 32 bits (and you would've wanted pixels[++i] anyway, since i++ reads the old index twice). Instead, access the 16-bit numbers separately on the two indices:

while(i < len) {
    pixel = pixels[i++];
    imgData[n++] = (pixel >>> 0)  & 0xffff; // R
    imgData[n++] = (pixel >>> 16) & 0xffff; // G
    pixel = pixels[i++];
    imgData[n++] = (pixel >>> 0)  & 0xffff; // B
    alphaData[a++] = (pixel >>> 16) & 0xffff; // A
}
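Put together, a self-contained version of the fix might look like this (the input values are hypothetical; each 16-bit pixel is assumed to span two consecutive 32-bit words, with R in the low half of the first word):

```javascript
// Two 16-bit RGBA pixels packed into a Uint32Array (assumed layout):
// word 0 holds R (low 16 bits) and G (high 16 bits), word 1 holds
// B (low) and A (high), and so on for each subsequent pixel.
var pixels = Uint32Array.of(
        0x22221111, 0x44443333,  // pixel 0: R=0x1111 G=0x2222 B=0x3333 A=0x4444
        0x66665555, 0x88887777), // pixel 1: R=0x5555 G=0x6666 B=0x7777 A=0x8888
    len = pixels.length,
    imgData = new Uint16Array((len / 2) * 3),
    alphaData = new Uint16Array(len / 2),
    i = 0, n = 0, a = 0,
    pixel;

while (i < len) {
    pixel = pixels[i++];
    imgData[n++] = (pixel >>> 0)  & 0xffff; // R
    imgData[n++] = (pixel >>> 16) & 0xffff; // G
    pixel = pixels[i++];
    imgData[n++] = (pixel >>> 0)  & 0xffff; // B
    alphaData[a++] = (pixel >>> 16) & 0xffff; // A
}
```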
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow