Question

Is there a way to stop the premultiplication of the alpha channel for canvas data, or a workaround?

I want to generate an image (in this case some random rgba values) and save the canvas as an image.

During the second step, I want to compare the original image with the generated image using the imageData, however this won't work due to the premultiplication of the alpha channel of my rgba pixels in the generated image.

The example

function drawImage(ctx) {
    var img = ctx.createImageData(canvas.width, canvas.height);

    // Fill every pixel with random RGBA values (0-255).
    for (var i = 0; i < img.data.length; i += 4) {
        img.data[i]     = Math.floor(Math.random() * 256); // red
        img.data[i + 1] = Math.floor(Math.random() * 256); // green
        img.data[i + 2] = Math.floor(Math.random() * 256); // blue
        img.data[i + 3] = Math.floor(Math.random() * 256); // alpha
    }

    ctx.putImageData(img, 0, 0);

    // the image data we just set
    console.log(img.data);
    // the image data read back from the canvas
    console.log(ctx.getImageData(0, 0, canvas.width, canvas.height).data);
}

In the console, you will find two console.log outputs: the first before the premultiplication, the second after it. The two outputs differ, with some values off by 3 or more. This only happens when partial transparency is involved (i.e., the alpha is set to anything other than 255).

Is there a way to get the same output back, or some kind of workaround for this problem?

Thank you in advance!


Solution

Bleh, this is an acknowledged issue as far as the canvas spec is concerned. It notes:

Due to the lossy nature of converting to and from premultiplied alpha color values, pixels that have just been set using putImageData() might be returned to an equivalent getImageData() as different values.

So this:

var can = document.createElement('canvas');
var ctx = can.getContext('2d');
can.width = 1;
can.height = 1;
var img = ctx.createImageData(1, 1);
img.data[0] = 40;
img.data[1] = 90;
img.data[2] = 200;
var ALPHAVALUE = 5;
img.data[3] = ALPHAVALUE;
console.log(img.data); 
ctx.putImageData(img, 0, 0);
console.log(ctx.getImageData(0, 0, 1, 1).data); 

outputs:

[40, 90, 200, 5]
[51, 102, 204, 5]

In all browsers.
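Those numbers line up with what you'd expect from rounding through 8-bit premultiplied storage. Here's a minimal sketch of that round trip (the `roundTrip` helper is illustrative, and it assumes the browser rounds to the nearest integer at each step; actual rounding behavior can vary by implementation):

```javascript
// Simulate a putImageData/getImageData round trip through 8-bit
// premultiplied storage.
function roundTrip(channel, alpha) {
  // putImageData: store the channel premultiplied by alpha, as an 8-bit int
  var premultiplied = Math.round(channel * alpha / 255);
  // getImageData: divide the alpha back out, again as an 8-bit int
  return Math.round(premultiplied * 255 / alpha);
}

console.log(roundTrip(40, 5));  // 51
console.log(roundTrip(90, 5));  // 102
console.log(roundTrip(200, 5)); // 204
```

At alpha 5, only 5 distinct premultiplied values exist per channel, so most of the 256 input values collapse onto a handful of outputs; the smaller the alpha, the worse the loss.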

So this is a lossy operation; there's no workaround unless the spec changes to offer a non-premultiplied option. This was discussed as far back as 2008 on the WHATWG mailing list, and they decided that a "round trip"/identity guarantee for put/get image data is not a promise the spec is willing to make.

If you need to "save" the image data, you can't round-trip it through putImageData/getImageData with the same fidelity. Workarounds such as drawing the full-alpha data to a temporary canvas and redrawing it to the main canvas with a smaller globalAlpha won't work, either.
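One practical approach, since the loss only happens on read-back: keep your own copy of the pixel buffer and compare against that, never against getImageData. A sketch (the helper names here are illustrative, not part of any API):

```javascript
// Keep the source-of-truth pixels in a plain typed array; use the canvas
// only for display. Comparisons run against the saved copy.
var saved = null;

function drawAndSave(ctx, img) {
  saved = new Uint8ClampedArray(img.data); // exact copy, no premultiplication
  ctx.putImageData(img, 0, 0);             // display copy may lose precision
}

function matchesSaved(data) {
  if (!saved || saved.length !== data.length) return false;
  for (var i = 0; i < data.length; i++) {
    if (saved[i] !== data[i]) return false;
  }
  return true;
}
```

This obviously doesn't help if the canvas itself is your only storage (e.g., the image was saved to a PNG and reloaded), but for in-page comparisons it sidesteps premultiplication entirely.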

So you're out of luck. Sorry.


To this day (May 12, 2014) this still gets discussed on the WHATWG list: http://lists.whatwg.org/htdig.cgi/whatwg-whatwg.org/2014-May/296792.html

OTHER TIPS

I found a way to read the accurate byte values of an image using WebGL 2. This is related to my question here. Take a look at the following code, which compares the result of getImageData with the expected output and the output of WebGL:

let image = new Image();

image.addEventListener('load', function () {
  let canvas = document.createElement('canvas');
  let ctx = canvas.getContext("2d");
  canvas.width = this.width;
  canvas.height = this.height;
  ctx.drawImage(this, 0, 0);
  let data = ctx.getImageData(0, 0, this.width, this.height).data;
  document.getElementById('imageData').innerHTML = data;
});

image.addEventListener('load', function () {
  let canvas = document.createElement('canvas');
  let gl = canvas.getContext("webgl2");
  gl.activeTexture(gl.TEXTURE0);
  let texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  // Attach the texture to a framebuffer so readPixels can read it back.
  const framebuffer = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, texture, 0);
  // Upload the image; WebGL does not premultiply alpha on upload by default
  // (UNPACK_PREMULTIPLY_ALPHA_WEBGL is false), so the bytes stay intact.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, this);
  gl.drawBuffers([gl.COLOR_ATTACHMENT0]);
  let data = new Uint8Array(this.width * this.height * 4);
  gl.readPixels(0, 0, this.width, this.height, gl.RGBA, gl.UNSIGNED_BYTE, data);
  document.getElementById('webGl').innerHTML = data;
});

image.src = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAADUlEQVR4nGPg2T3JDgADyAGYmiSbAQAAAABJRU5ErkJggg==";
<pre>Expected:  12,187,146,62<br>
ImageData: <span id="imageData"></span><br>
WebGL:     <span id="webGl"></span><br></pre>

Well, this is still a bummer...

I suppose if you're using it as a texture for WebGL, you could just pass it as a uniform byte array.
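That idea could be sketched roughly like this (the `program` and `u_pixels` names are hypothetical, standing in for a compiled shader program with such a uniform declared). Since shader uniforms take floats, the bytes are normalized to 0..1 first, which round-trips exactly for 8-bit values:

```javascript
// Convert raw RGBA bytes to normalized floats suitable for a vec4 array
// uniform. No premultiplication happens anywhere along this path.
function toUniformData(bytes) {
  var floats = new Float32Array(bytes.length);
  for (var i = 0; i < bytes.length; i++) {
    floats[i] = bytes[i] / 255; // normalize 0..255 to 0.0..1.0
  }
  return floats;
}

// Usage inside a WebGL app (hypothetical program/uniform names):
// var loc = gl.getUniformLocation(program, "u_pixels");
// gl.uniform4fv(loc, toUniformData(data));
```

Note that uniform storage is quite limited, so this only scales to small images; for anything larger, an un-premultiplied texture (as in the WebGL 2 snippet above) is the usual vehicle for per-pixel data.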

Licensed under: CC-BY-SA with attribution