Question

Good evening,

Right now I'm trying to run the following code using Xcode, but it has been impossible so far. The code is from a simple tutorial I found online, and it is supposed to simply draw a triangle using OpenGL and VBOs. If I try the code in Visual Studio, I get the expected result with no problems at all. However, when I run it in Xcode, I only get a black screen.

To set up the project in Xcode, I installed GLEW and freeglut using MacPorts and then installed XQuartz 2.7.5. Then I created a new project in Xcode as a command line tool, and in the build settings I added -lGLEW and -lGLUT to the Other Linker Flags section. Additionally, I modified the Library Search Paths to include /opt/local/lib/ and /opt/X11/lib/, and I modified the User Header Search Paths to include /opt/local/include/ and /opt/X11/include/. Finally, in the Build Phases section, I added OpenGL.framework under Link Binary With Libraries.

What am I missing? If the code worked for me in Visual Studio, then I must have messed up while configuring Xcode.

Edit: If I change GL_TRIANGLES to GL_POINTS, for some reason it will sometimes draw a single point in the middle of the screen. If I add the code for the shaders, this single point actually has the same color as the one specified in the shaders.

Edit2: For those interested, the tutorials I'm following are on: http://ogldev.atspace.co.uk

#include <stdio.h>
#include <GL/glew.h>
#include <GL/freeglut.h>
#include "math_3d.h"

GLuint VBO;

static void RenderSceneCB()
{
    glClear(GL_COLOR_BUFFER_BIT);

    glEnableVertexAttribArray(0);
    glBindBuffer(GL_ARRAY_BUFFER, VBO);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0);

    glDrawArrays(GL_TRIANGLES, 0, 3);

    glDisableVertexAttribArray(0);

    glutSwapBuffers();
}


static void InitializeGlutCallbacks()
{
    glutDisplayFunc(RenderSceneCB);
}

static void CreateVertexBuffer()
{
    Vector3f Vertices[3];
    Vertices[0] = Vector3f(-1.0f, -1.0f, 0.0f);
    Vertices[1] = Vector3f(1.0f, -1.0f, 0.0f);
    Vertices[2] = Vector3f(0.0f, 1.0f, 0.0f);

    glGenBuffers(1, &VBO);
    glBindBuffer(GL_ARRAY_BUFFER, VBO);
    glBufferData(GL_ARRAY_BUFFER, sizeof(Vertices), Vertices, GL_STATIC_DRAW);
}


int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE|GLUT_RGBA);
    glutInitWindowSize(1024, 768);

    glutInitWindowPosition(100, 100);
    glutCreateWindow("Tutorial 03");

    InitializeGlutCallbacks();

    // Must be done after glut is initialized!
    GLenum res = glewInit();
    if (res != GLEW_OK) {
        fprintf(stderr, "Error: '%s'\n", glewGetErrorString(res));
        return 1;
    }

    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    CreateVertexBuffer();
    glutMainLoop();

    return 0;
}

Solution

Since you do not have a shader, OpenGL does not know how to interpret your vertex attribute 0: the fixed-function pipeline only understands the named attributes (position, color, etc.) and knows nothing about generic attributes. If you want to keep the code shader-free, you need to use the named attributes:

static void RenderSceneCB()
{
    glClear(GL_COLOR_BUFFER_BIT);

    glEnableClientState(GL_VERTEX_ARRAY); // or GL_NORMAL_ARRAY, etc.
    glBindBuffer(GL_ARRAY_BUFFER, VBO);
    glVertexPointer(3, GL_FLOAT, 0, 0); // or glNormalPointer(), etc.

    glDrawArrays(GL_TRIANGLES, 0, 3);

    glDisableClientState(GL_VERTEX_ARRAY);

    glutSwapBuffers();
}

Note that your code may actually work on some GPUs, because their drivers map the generic attributes to the same "slots" as the named attributes, so attribute zero is then interpreted as position. NVIDIA in particular is (erroneously) less strict about these issues; on my GTX 680 the provided code indeed displays a white triangle.

If you want to use shaders instead, you can use MiniShader; just put the following code after the call to CreateVertexBuffer():

minish::InitializeGLExts(); // initialize shading extensions
const char *vs =
    "layout(location = 0) in vec3 pos;\n"
    // the layout specifier binds this variable to vertex attribute 0
    "void main()\n"
    "{\n"
    "    gl_Position = gl_ModelViewProjectionMatrix * vec4(pos, 1.0);\n"
    "}\n";
const char *fs = "void main() { gl_FragColor=vec4(.0, 1.0, .0, 1.0); }"; // green
minish::CompileShader(vs, fs);

Note that writing the vertex shader just like that will likely lead to problems, because no version is specified: the layout qualifier only became core in GLSL 330, while gl_ModelViewProjectionMatrix is deprecated after GLSL 120. So we need to add something like this to the beginning of the vertex shader:

    "#version 140\n" // need at least 140 for GL_ARB_explicit_attrib_location

    "#extension GL_ARB_explicit_attrib_location : require\n"
    // required for layout(location)

    "#extension GL_ARB_compatibility : require\n"
    // still need gl_ModelViewProjectionMatrix
    // (which is otherwise deprecated after version 120)
    // also, GL_ARB_compatibility is not supported in version 330 (and above)

Alternatively, we can just use GLSL 330 (or greater), but then the modelview-projection matrix needs to be passed in through a uniform (which also needs to be properly set in the CPU code):

    "#version 330\n"
    "uniform mat4 MVP;\n"
    "layout(location = 0) in vec3 pos;\n"
    "void main()\n"
    "{\n"
    "    gl_Position = MVP * vec4(pos, 1.0);\n"
    "}\n";

Note that this answer is an extended duplicate of my answer to Triangle doesn't render.
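
On the CPU side, the MVP uniform from the version-330 variant still has to be located and filled. Here is a hedged sketch: shaderProgram is a hypothetical name for the linked program object, and since the actual GL calls need a current context they are shown as comments. The tutorial's triangle is already specified in clip space, so an identity matrix is a perfectly valid MVP.

```cpp
#include <cstddef>

// Build a column-major 4x4 identity matrix; the triangle's vertices are
// already in clip space, so identity is a valid MVP for this tutorial.
static void IdentityMatrix(float m[16])
{
    for (std::size_t i = 0; i < 16; ++i)
        m[i] = (i % 5 == 0) ? 1.0f : 0.0f;  // 1.0 at indices 0, 5, 10, 15
}

// GL side (requires a current context and a linked program `shaderProgram`):
//   float mvp[16];
//   IdentityMatrix(mvp);
//   glUseProgram(shaderProgram);
//   GLint loc = glGetUniformLocation(shaderProgram, "MVP");
//   if (loc != -1)
//       glUniformMatrix4fv(loc, 1, GL_FALSE, mvp);
```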

OTHER TIPS

You're trying to use generic vertex attributes with the legacy fixed pipeline. I don't think that can work. For the fixed pipeline, you'll also have to use the legacy calls for setting up and enabling the position array. Replace your call to glEnableVertexAttribArray() with this:

glEnableClientState(GL_VERTEX_ARRAY);

and your glVertexAttribPointer() call with this:

glVertexPointer(3, GL_FLOAT, 0, 0);

See also How to set up vertex attributes in OpenGL?: in a core profile context, you need to bind a VAO before setting attribute pointers.
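
As a minimal sketch of that VAO setup (assuming a 3.2+ core-profile context is already current, and reusing the question's global VBO; CreateVertexArray is a hypothetical helper name):

```cpp
#include <GL/glew.h>

extern GLuint VBO;  // the buffer created in CreateVertexBuffer()
static GLuint VAO;

// Call once after glewInit() and CreateVertexBuffer().
static void CreateVertexArray()
{
    glGenVertexArrays(1, &VAO);
    glBindVertexArray(VAO);  // attribute state set below is recorded into VAO

    glBindBuffer(GL_ARRAY_BUFFER, VBO);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0);
}
```

With the VAO created up front like this, RenderSceneCB() then only needs glBindVertexArray(VAO) followed by glDrawArrays().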

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow