Question

I'm having trouble getting an index buffer to display an imported .obj correctly. Coordinate system orientation has been taken into account (the .obj loader is hard-coded). The loader can print out the indices fine, so it correctly fills a DWORD array to be read in and set:

vector<vector<float>> index;
index = GetObjData(FilePath, VERTEXINDEXLIST);
for (size_t n = 0; n < index.size(); n++) {
    m = m + index[n].size();   // m = total index count
}

...

DWORD *IndexBuffer = new DWORD[m];

...

iBufferDescription.Usage                   =D3D11_USAGE_DEFAULT;
iBufferDescription.ByteWidth               =sizeof(DWORD)*m;
iBufferDescription.BindFlags               =D3D11_BIND_INDEX_BUFFER;
iBufferDescription.CPUAccessFlags          =0;
iBufferDescription.MiscFlags               =0;

D3D11_SUBRESOURCE_DATA iSRData;
iSRData.pSysMem=IndexBuffer;

Device->CreateBuffer(&iBufferDescription, &iSRData, &D3DIndexBuffer);
DeviceContext->IASetIndexBuffer(D3DIndexBuffer, DXGI_FORMAT_R16_UINT, 0);

Here's a Maya generated .obj:

# This file uses centimeters as units for non-parametric coordinates.

mtllib tbox.mtl
g default
v -0.500000 -0.500000 -0.000000
v 0.500000 -0.500000 -0.000000
v -0.500000 0.500000 0.000000
v 0.500000 0.500000 0.000000
vt 0.000000 0.000000
vt 1.000000 0.000000
vt 0.000000 1.000000
vt 1.000000 1.000000
vn 0.000000 -0.000000 1.000000
vn 0.000000 -0.000000 1.000000
vn 0.000000 -0.000000 1.000000
vn 0.000000 -0.000000 1.000000
s 1
g pPlane1
usemtl initialShadingGroup
f 1/1/1 2/2/2 3/3/3
f 3/3/3 2/2/2 4/4/4

A 2D square with 4 vertices. Passed in through the function, the DWORD IndexBuffer's contents read:

2
1
0
3
1
2

(I subtracted 1 from all indices to conform to DirectX's 0-based indexing.) I will also add that some other things were set too, such as the ID3D11RasterizerState:

D3D11_RASTERIZER_DESC DrawStyleState;
DrawStyleState.AntialiasedLineEnable=true;
DrawStyleState.CullMode=D3D11_CULL_NONE;
DrawStyleState.DepthBias=0;
DrawStyleState.FillMode=D3D11_FILL_SOLID;
DrawStyleState.DepthClipEnable=true;
DrawStyleState.MultisampleEnable=true;
DrawStyleState.FrontCounterClockwise=false;
DrawStyleState.ScissorEnable=false;

ID3D11RasterizerState *DS_State;
Device->CreateRasterizerState(&DrawStyleState, &DS_State);
DeviceContext->RSSetState(DS_State);

Finally the render function is pretty standard:

void Render(){
    float ColorBlue[] = {0.3f, 0.3f, 1.0f, 1.0f};
    DeviceContext->ClearRenderTargetView(RenderTargetView, ColorBlue);
    UINT stride = sizeof(VERTEX);
    UINT Offset = 0;
    DeviceContext->IASetVertexBuffers(0, 1, &D3DBuffer, &stride, &Offset);
    DeviceContext->IASetPrimitiveTopology(D3D_PRIMITIVE_TOPOLOGY_TRIANGLESTRIP);
    DeviceContext->DrawIndexed(IndSz, 0, 0);
    Swapchain->Present(0, 0);
}

IndSz is a global holding the index count, and it is correct: I added debug output for it, which reports:

4 vertices
6 index array size //element size = IndSz 
Index 0: 2
Index 1: 1
Index 2: 0
Index 3: 3
Index 4: 1
Index 5: 2

The above gets parsed into just one triangle:

|\
| \
|  \ 
|   \
------

I've concluded it must be some issue beyond anything I can conceive of. I've checked for culling issues, winding order, data-type mismatches, and memory-size problems, and it seems close to a rewrite now. Help!


Solution

DWORD will not change with the word size of your CPU, as odd as that may sound. On Windows, DWORD is always 32-bit regardless of the host CPU; it is effectively Microsoft's uint32_t. (UINT is likewise a fixed 32 bits on Windows, despite its vaguer name.) Either way, the appropriate data type to pair with DXGI_FORMAT_R16_UINT is WORD (16-bit); alternatively, keep the DWORD array and pass DXGI_FORMAT_R32_UINT to IASetIndexBuffer.

However, your actual issue here appears to be your use of D3D_PRIMITIVE_TOPOLOGY_TRIANGLESTRIP. The imported model you showed consists of two faces, but six indices interpreted as a triangle strip produce four triangles, since every index after the first two starts a new triangle.

You want D3D_PRIMITIVE_TOPOLOGY_TRIANGLELIST if these 6 indices are supposed to produce exactly 2 triangles.

Licensed under: CC-BY-SA with attribution