Question

How can I convert from C function arguments to arguments expected by the underlying protocol?

I am trying to implement glTexImage2D at the protocol level.

glxproto.pdf describes the arguments like this:

TexImage2D
    2    56+n+p        rendering command length
    2    110           rendering command opcode
    1    BOOL          swapbytes
    1    BOOL          lsbfirst
    2                  unused
    4    CARD32        rowlength
    4    CARD32        skiprows
    4    CARD32        skippixels
    4    CARD32        alignment
    4    ENUM          target
    4    INT32         level
    4    INT32         components
    4    INT32         width
    4    INT32         height
    4    INT32         border
    4    ENUM          format
    4    ENUM          type
    n    LISTofBYTE    image
    p                  unused, p = pad(n)

The official OpenGL documentation (http://www.opengl.org/sdk/docs/man/xhtml/glTexImage2D.xml) describes it like this:

void glTexImage2D(GLenum target, GLint level,
                  GLint internalFormat, GLsizei width,
                  GLsizei height, GLint border,
                  GLenum format, GLenum type,
                  const GLvoid *data);

Parameters:
target
    Specifies the target texture.
    Must be GL_TEXTURE_2D, GL_PROXY_TEXTURE_2D,
    GL_TEXTURE_1D_ARRAY, GL_PROXY_TEXTURE_1D_ARRAY,
    GL_TEXTURE_RECTANGLE, GL_PROXY_TEXTURE_RECTANGLE,
    GL_TEXTURE_CUBE_MAP_POSITIVE_X,
    GL_TEXTURE_CUBE_MAP_NEGATIVE_X,
    GL_TEXTURE_CUBE_MAP_POSITIVE_Y,
    GL_TEXTURE_CUBE_MAP_NEGATIVE_Y,
    GL_TEXTURE_CUBE_MAP_POSITIVE_Z,
    GL_TEXTURE_CUBE_MAP_NEGATIVE_Z, or
    GL_PROXY_TEXTURE_CUBE_MAP.
level
    Specifies the level-of-detail number.
    Level 0 is the base image level.
    Level n is the nth mipmap reduction image.
    If target is GL_TEXTURE_RECTANGLE or
    GL_PROXY_TEXTURE_RECTANGLE, level must be 0.
internalFormat
    Specifies the number of color components in the texture.
    Must be one of base internal formats given in Table 1,
    one of the sized internal formats given in Table 2, or one
    of the compressed internal formats given in Table 3, below.
width
    Specifies the width of the texture image.
    All implementations support texture images that are at least 1024 texels
    wide.
height
    Specifies the height of the texture image, or the number of layers in a texture
    array, in the case of the GL_TEXTURE_1D_ARRAY and
    GL_PROXY_TEXTURE_1D_ARRAY targets.
    All implementations support 2D texture images that are at least 1024 texels
    high, and texture arrays that are at least 256 layers deep.
border
    This value must be 0.
format
    Specifies the format of the pixel data.
    The following symbolic values are accepted:
    GL_RED,
    GL_RG,
    GL_RGB,
    GL_BGR,
    GL_RGBA,
    GL_BGRA,
    GL_RED_INTEGER,
    GL_RG_INTEGER,
    GL_RGB_INTEGER,
    GL_BGR_INTEGER,
    GL_RGBA_INTEGER,
    GL_BGRA_INTEGER,
    GL_STENCIL_INDEX,
    GL_DEPTH_COMPONENT,
    GL_DEPTH_STENCIL.
type
    Specifies the data type of the pixel data.
    The following symbolic values are accepted:
    GL_UNSIGNED_BYTE,
    GL_BYTE,
    GL_UNSIGNED_SHORT,
    GL_SHORT,
    GL_UNSIGNED_INT,
    GL_INT,
    GL_FLOAT,
    GL_UNSIGNED_BYTE_3_3_2,
    GL_UNSIGNED_BYTE_2_3_3_REV,
    GL_UNSIGNED_SHORT_5_6_5,
    GL_UNSIGNED_SHORT_5_6_5_REV,
    GL_UNSIGNED_SHORT_4_4_4_4,
    GL_UNSIGNED_SHORT_4_4_4_4_REV,
    GL_UNSIGNED_SHORT_5_5_5_1,
    GL_UNSIGNED_SHORT_1_5_5_5_REV,
    GL_UNSIGNED_INT_8_8_8_8,
    GL_UNSIGNED_INT_8_8_8_8_REV,
    GL_UNSIGNED_INT_10_10_10_2, and
    GL_UNSIGNED_INT_2_10_10_10_REV.
data
    Specifies a pointer to the image data in memory.

The problem is that they are different!

data, border, width, height, and level are fine: they match and are explained.

But what are swapbytes, lsbfirst, skiprows, skippixels, alignment, rowlength, and components?

And where does internalFormat go?

I tried to find an existing implementation, but had no luck: I looked through the XCB sources, the Mesa sources, and Node.js-x11, and could not locate the relevant code.


Solution

Are you seriously trying to interface directly through the GLX protocol? That seems absurdly complicated. Do you have a reason for doing this?

Of course the command parameters and the protocol parameters are different: GLX defines a wire protocol, so in addition to the parameters you actually pass to the GL API command, it has to send client state, such as the pixel-store settings, to the server. To translate a GL command into GLX protocol, you need to implement at least a state machine for that client state.

Update:

Since the author of Node.js-x11 would prefer not to implement pixel-store state tracking, I have included a table with the default values of all the parameters you will need to implement this yourself.

pname                       Type      Initial Value   Valid Range
-------------------------------------------------------------------
GL_UNPACK_SWAP_BYTES        boolean   false           true or false
GL_UNPACK_LSB_FIRST         boolean   false           true or false
GL_UNPACK_ROW_LENGTH        integer   0               [0,oo)
GL_UNPACK_SKIP_ROWS         integer   0               [0,oo)
GL_UNPACK_SKIP_PIXELS       integer   0               [0,oo)
GL_UNPACK_ALIGNMENT         integer   4               1, 2, 4, or 8

You can either change the API binding to take these as extra parameters (very messy), or always assume they are at their default values. But you cannot implement glTexImage2D (...) over GLX without this state information.

There are not many client-side states you need to track to communicate with a GLX server for a basic application, but pixel store is definitely one of them if you want textures. Most other state is server-side; that includes anything you can save and restore using glPushAttrib (...) and glPopAttrib (...).

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow