Problem

Using Qt 4.7 and QGLWidget, I want QGLWidget::paintGL() to render a scene into a QGLFramebufferObject and then draw the texture generated by the QGLFramebufferObject onto the screen. For the second step I render a full-screen quad with an orthographic projection and use my own shader to map the texture onto it.

Rendering into the QGLFramebufferObject seems to work fine (at least I can call QGLFramebufferObject::toImage().save(filename) and get a correctly rendered image), but I can't get the rendered texture drawn onto the screen.
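
For context, the question never shows how the framebuffer object is created. A minimal setup might look like the following sketch, assuming the FBO matches the widget size and carries a depth attachment (glClear(GL_DEPTH_BUFFER_BIT) is called while it is bound, so it needs one):

    // Hypothetical setup, not taken from the question: a color texture plus
    // a combined depth/stencil buffer, sized to match the widget.
    _fbo = new QGLFramebufferObject(width(), height(),
                                    QGLFramebufferObject::CombinedDepthStencil);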

Here is the code I use to render into the framebuffer object:

    //Draw into framebufferobject
    _fbo->bind();
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
    _camera->applyModelviewMatrix();
    _scene->glDraw();
    glFlush();
    _fbo->release();
    _fbo->toImage().save("image.jpg");

As mentioned above, the image saved here is rendered correctly.

Here is the code I use to try to render the framebuffer object onto the screen, again with my own shader:

    //Draw framebufferobject to a full-screen quad on the screen
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0.0,1.0,0.0,1.0,-1.0,1.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glEnable(GL_TEXTURE_2D);
    _selectionShader->bind();
    glBindTexture(GL_TEXTURE_2D, _fbo->texture());
    _selectionShader->setUniformValue("renderedTexture", _fbo->texture());

    glBegin(GL_QUADS);
        glTexCoord2f(0.0,0.0);
        glVertex3f(0.0f,0.0f,0.0f);
        glTexCoord2f(1.0,0.0);
        glVertex3f(1.0f,0.0f,0.0f);
        glTexCoord2f(1.0,1.0);
        glVertex3f(1.0f,1.0f,0.0f);
        glTexCoord2f(0.0,1.0);
        glVertex3f(0.0f,1.0f,0.0f);
    glEnd();
    glDisable(GL_TEXTURE_2D);
    glFlush();

The vertex shader simply passes the vertex position through and also uses it as the texture coordinate:

    varying vec2 position;

    void main()
    {
        gl_Position = ftransform();
        position = gl_Vertex.xy;
    }

And the fragment shader draws the texture:

    varying vec2 position;
    uniform sampler2D renderedTexture;

    void main()
    {
        gl_FragColor = texture2D(renderedTexture, position);
    }

The projection itself is correct: when I swap in the following fragment shader instead, it draws the expected color gradient:

    varying vec2 position;
    uniform sampler2D renderedTexture;

    void main()
    {
        gl_FragColor = vec4(position.x, 0.0, position.y, 1.0);
    }

But with the other fragment shader, the one that should render the texture, I only get a blank screen (blank because of the glClear() at the beginning of the rendering). So the fragment shader seems to draw either black or nothing.

Am I missing anything? Am I passing the texture correctly to the shader? Do I have to do anything else to prepare the texture?

Solution

_selectionShader->setUniformValue("renderedTexture", _fbo->texture());

This is the (or at least a) wrong part. A sampler uniform in a shader is not set to the texture object itself, but to the index of the texture unit that the object is bound to (which you already did with glBindTexture(GL_TEXTURE_2D, _fbo->texture())). Since you seem to use GL_TEXTURE0 the whole time, you just have to set the uniform to texture unit 0:

_selectionShader->setUniformValue("renderedTexture", 0);

By the way, there's no need for glEnable(GL_TEXTURE_2D); it has no effect when a shader is in use. And why call glTexCoord2f at all if you're just using the vertex position as the texture coordinate anyway?
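
Dropping them, the quad reduces to just the four vertices, since the vertex shader derives the texture coordinates from gl_Vertex.xy:

    glBegin(GL_QUADS);
        glVertex3f(0.0f, 0.0f, 0.0f);
        glVertex3f(1.0f, 0.0f, 0.0f);
        glVertex3f(1.0f, 1.0f, 0.0f);
        glVertex3f(0.0f, 1.0f, 0.0f);
    glEnd();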

License: CC-BY-SA with attribution
Not affiliated with StackOverflow