Question

I am converting a BufferedImage to a byte[] array using the following code:

BufferedImage input = ImageIO.read(new File(path));
// Note: this cast assumes the image data is byte-backed (e.g. TYPE_3BYTE_BGR);
// it throws a ClassCastException for int-backed types such as TYPE_INT_RGB.
DataBufferByte bufferBytes = (DataBufferByte) input.getRaster().getDataBuffer();
byte[] bytes = bufferBytes.getData();

One thing that confuses me is how the channels are mapped to the actual byte elements. Assuming I have an ARGB image, for the first pixel (bytes[0] through bytes[3]), which element is the red channel, which is the green, which is the blue, and which is the alpha?

I did some test runs and inspected each element in the debugger, but my results were inconclusive (am I missing something?).

Thanks in advance.


The solution

Thanks to the comment on my original question, I found the answer: the result of BufferedImage.getType(), together with its JavaDoc, tells you how the colors are mapped in the byte[] array.

In my case, the image type was TYPE_3BYTE_BGR, which, according to the Java documentation, stores each pixel as interleaved Blue, Green, Red in three consecutive byte elements. (For images with an alpha channel, TYPE_4BYTE_ABGR is stored as Alpha, Blue, Green, Red.)
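The mapping can be verified without a file on disk. The sketch below is a minimal, self-contained example (the 1x1 image and the color value 0x112233 are made up for illustration): it creates a TYPE_3BYTE_BGR image, sets one pixel with known red/green/blue values, and prints the raw buffer bytes to show the B, G, R ordering.

```java
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;

public class BgrOrderDemo {
    public static void main(String[] args) {
        // A hypothetical 1x1 image so the example needs no file on disk.
        BufferedImage img = new BufferedImage(1, 1, BufferedImage.TYPE_3BYTE_BGR);

        // setRGB takes a packed sRGB int: red=0x11, green=0x22, blue=0x33.
        img.setRGB(0, 0, 0x112233);

        byte[] bytes = ((DataBufferByte) img.getRaster().getDataBuffer()).getData();

        // TYPE_3BYTE_BGR interleaves Blue, Green, Red per pixel, so:
        //   bytes[0] is blue (0x33), bytes[1] is green (0x22), bytes[2] is red (0x11)
        System.out.printf("bytes[0]=0x%02X bytes[1]=0x%02X bytes[2]=0x%02X%n",
                bytes[0], bytes[1], bytes[2]);
    }
}
```

The same check works for any byte-backed type: set a pixel with distinct channel values, then read the buffer to see which byte holds which channel.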

Hope this helps others :-)

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow