Question

I really don't understand what I'm doing wrong. I'm following Adobe's documentation for uploadFromByteArray(...) on an IndexBuffer3D object.

What I don't understand is which write method I'm supposed to use to write the integers of my vertex indices. I've tried writeFloat, writeUnsignedInt, writeInt, even writeShort, and they all fail. I've set my ByteArray instance to ba.endian = Endian.LITTLE_ENDIAN; still no go.

I've made sure to reset my ByteArray's position to 0 before uploading it to my index buffer, but nothing shows up!

If I upload a Vector.<uint> instead, it works! So I know the problem is not with the AGAL shader.
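For reference, the Vector.<uint> path that does work looks roughly like this (the _context variable and the index data are placeholders for illustration):

```actionscript
// Two triangles forming a quad -- sample data, not the asker's actual indices
var indices:Vector.<uint> = Vector.<uint>([0, 1, 2, 0, 2, 3]);

// createIndexBuffer takes the number of indices, not a byte count
var buffer:IndexBuffer3D = _context.createIndexBuffer(indices.length);

// uploadFromVector(data, startOffset, count) -- count is also in indices
buffer.uploadFromVector(indices, 0, indices.length);
```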

Any ideas?


Solution

Never mind, I just found what I was doing wrong:

WRONG:

_buffer.uploadFromByteArray(_dataBytes, 0, 0, _dataBytes.length >> 2);

I was dividing the byte length by 4 (a bitwise right shift by 2 is the same thing) because I thought each index was 4 bytes long. Nah-ah! Incorrect!

ByteArrays for IndexBuffer3D purposes should be written with writeShort(), since the buffer uses 16-bit numbers instead of 32-bit. Each index therefore occupies only 2 bytes, so the count is the byte length divided by 2.

CORRECT:

_buffer.uploadFromByteArray(_dataBytes, 0, 0, _dataBytes.length >> 1);
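Putting it together, a minimal sketch of the working ByteArray path (the _buffer and _context names and the index data are illustrative assumptions, not the asker's actual code):

```actionscript
import flash.utils.ByteArray;
import flash.utils.Endian;

var ba:ByteArray = new ByteArray();
ba.endian = Endian.LITTLE_ENDIAN; // Stage3D expects little-endian index data

// Indices are 16-bit, so write them with writeShort(), not writeInt()
for each (var i:uint in [0, 1, 2, 0, 2, 3])
    ba.writeShort(i);

ba.position = 0; // rewind before uploading

// uploadFromByteArray(data, byteArrayOffset, startOffset, count):
// count is in indices, i.e. byte length / 2, hence the shift by 1
_buffer.uploadFromByteArray(ba, 0, 0, ba.length >> 1);
```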

Hope that clears it up for other Stage3D users! :)

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow