Question

Say I've got a texture mapped to a grid screen-aligned mesh. It looks something like:

original image

The vertex positions are:

(-1, -1), (1, -1), (-1, 1), (1, 1)

The UVs:

(0, 0), (1, 0), (0, 1), (1, 1)

I warp the image by moving around the vertices and save the output by doing a glReadPixels().

The new warped vertex positions are:

(-1, -1), (0.8, -0.8), (-0.6, 0.6), (0.4, 0.4)

And the produced output is like:

warped image

Next time, I use the warped image I've just saved as the input. What I'm trying to do is reverse the warping effect I applied before by modifying the vertex coordinates. Initially I thought the coordinates that would unwarp the image must be something like:

x_unwarp = 2 * x_original - x_warped

But it's not working. The warping effect doesn't get undone. What I get is something like:

attempted unwarp

Any idea what I'm doing wrong, and how I should modify the vertex coordinates or perhaps the UVs? I'm sure I've got the math wrong.

Thanks!

UPDATE

It seems that my formula is wrong. I should've used matrices:

transform_matrix * x_original = x_warped

and then:

x_unwarped = inverse(transform_matrix) * x_original

As the transformation I'm doing is pure scaling, the transform matrix is like:

/      \
| S  0 |
| 0  S |
\      /

Where S is the scale factor. Therefore the inverse would be:

/         \
| 1/S  0  |
| 0   1/S |
\         /

Thus this would give the unwarping vertex positions as:

(-1, -1), (1.25, -1.25), (-1.67, 1.67), (2.5, 2.5)

Seems straighter, but still incorrect:

second attempt to unwarp
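
For reference, this is roughly how I compute those inverse-scale positions (a minimal numpy sketch of the arithmetic, not my actual GL code; the per-vertex scale is just the warped position divided by the original, component-wise):

import numpy as np

# Original screen-aligned quad and the warped positions I rendered with.
original = np.array([[-1.0, -1.0], [ 1.0, -1.0], [-1.0,  1.0], [ 1.0,  1.0]])
warped   = np.array([[-1.0, -1.0], [ 0.8, -0.8], [-0.6,  0.6], [ 0.4,  0.4]])

# The warp is pure scaling about the origin, so the per-vertex scale S is
# warped / original; applying 1/S to the original positions gives the
# candidate unwarping positions.
scale = warped / original
unwarp = original / scale
print(unwarp)   # (-1, -1), (1.25, -1.25), (-1.67, 1.67), (2.5, 2.5)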

Second update

I seem to have started it wrong from the beginning. There's been some random factor in all the renders. I'm going to fix that and try it again.

Update: UV warp

Done it using UV warping this time.

Warped using these UVs:

(0.0, 0.0), (0.9, 0.0), (0.0, 0.8), (0.7, 0.7)

This produced:

uv warped

And then tried to unwarp using the inverses:

(0.0, 0.0), (1/0.9, 0.0), (0.0, 1/0.8), (1/0.7, 1/0.7)

Which looks like:

unsuccessful uv unwarp

This didn't do it either. I'm starting to worry about the 1/0 cases that I'm simply ignoring.
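
For completeness, this is roughly how I build those reciprocal UVs (just a numpy sketch of the arithmetic; I leave the zero components at zero, which is exactly the 1/0 case I'm glossing over):

import numpy as np

# The UVs I warped with, one per quad corner.
warped_uvs = np.array([[0.0, 0.0], [0.9, 0.0], [0.0, 0.8], [0.7, 0.7]])

# Reciprocal of every non-zero component; zeros stay zero.
unwarp_uvs = np.divide(1.0, warped_uvs,
                       out=np.zeros_like(warped_uvs),
                       where=warped_uvs != 0.0)
print(unwarp_uvs)   # (0, 0), (1/0.9, 0), (0, 1/0.8), (1/0.7, 1/0.7)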

Completely clueless.


Solution

One thing you're running into is perspective-incorrect texture mapping (you can see in the first picture that the texture coordinates are interpolated differently for each of the two triangles the quad is made of). You can either use perspective correction on a single large quad (implementing the algorithm in a fragment shader), or subdivide your quad into smaller patches.
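
To illustrate the subdivision option, here is a rough sketch (Python/numpy just for the math; the function name and grid resolution are arbitrary) that builds an (n+1) x (n+1) grid whose positions and UVs are bilinearly interpolated from the four corners, so the per-triangle interpolation error becomes negligible:

import numpy as np

def subdivide_quad(corners, uvs, n=16):
    # Corners and UVs are given in the order bottom-left, bottom-right,
    # top-left, top-right, matching the question's vertex layout.
    c = np.asarray(corners, dtype=float)
    t = np.asarray(uvs, dtype=float)
    positions, texcoords = [], []
    for j in range(n + 1):
        v = j / n
        for i in range(n + 1):
            u = i / n
            w = np.array([(1 - u) * (1 - v), u * (1 - v), (1 - u) * v, u * v])
            positions.append(w @ c)   # bilinear blend of corner positions
            texcoords.append(w @ t)   # bilinear blend of corner UVs
    return np.array(positions), np.array(texcoords)

# Warped quad from the question, with the regular UVs.
pos, uv = subdivide_quad([(-1, -1), (0.8, -0.8), (-0.6, 0.6), (0.4, 0.4)],
                         [(0, 0), (1, 0), (0, 1), (1, 1)])

You would still need to generate the triangle indices for the grid and upload the two arrays as your vertex and texture-coordinate buffers.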

Now reversing the texture warping is straightforward if you think about it like this: the distortion coordinates of your first step become the UV coordinates for the second step. Of course this requires mapping the distorted vertex positions into a certain range. Recall the OpenGL transformation pipeline:

Modelview → Projection → Clipping → Normalized Device Coordinates. Screen space is the last step, with the viewport dimensions applied. NDC coordinates are in the range [-1, 1], but getting to the [0, 1] range is easy enough: (x + 1) / 2.

So what you do is run the vertex positions through the transformation pipeline (you can omit the clipping, but you must apply the perspective divide); this gives you the UV coordinates for the de-distortion pass.
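
A rough sketch of that idea (Python/numpy for the arithmetic only; it assumes the warped positions are already NDC values in [-1, 1], so the modelview/projection and perspective divide reduce to the identity here): draw the regular full-screen quad again, but take its UVs from the warped positions remapped with (x + 1) / 2:

import numpy as np

# Warped vertex positions from the first pass, already in NDC.
warped_ndc = np.array([[-1.0, -1.0], [0.8, -0.8], [-0.6, 0.6], [0.4, 0.4]])

# If a modelview/projection matrix were involved, apply it here first and
# divide by the clip-space w (the perspective divide) before the remap.

# Map NDC [-1, 1] to texture space [0, 1]; these become the UV coordinates
# of the de-distortion pass, rendered on the regular (-1,-1)..(1,1) quad.
dedistort_uvs = (warped_ndc + 1.0) / 2.0
print(dedistort_uvs)   # (0, 0), (0.9, 0.1), (0.2, 0.8), (0.7, 0.7)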

OTHER TIPS

If I'm understanding your problem correctly (which isn't certain, given that my answer feels way too simple :p), why not simply modify your texture by applying a texture matrix (i.e. transform the UVs with a translation, rotation, shearing, etc. matrix as needed) instead of moving the quad it is applied to?

This would make reversing the transformation a simple matter of inverting your original transformation matrix.
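
A minimal sketch of that idea (Python/numpy standing in for the texture matrix, and the scale/translation values are made up for illustration): express the UV warp as one 3x3 homogeneous matrix, then undo it with the matrix inverse:

import numpy as np

# Example UV transform: scale by (0.9, 0.8) and translate by (0.05, 0.0).
T = np.array([[0.9, 0.0, 0.05],
              [0.0, 0.8, 0.0 ],
              [0.0, 0.0, 1.0 ]])

# Quad UVs in homogeneous 2D (u, v, 1).
uvs = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 1.0],
                [0.0, 1.0, 1.0], [1.0, 1.0, 1.0]])

warped_uvs   = uvs @ T.T                           # apply the warp
restored_uvs = warped_uvs @ np.linalg.inv(T).T     # undo it exactly
assert np.allclose(restored_uvs, uvs)

In fixed-function OpenGL the same matrix could be loaded onto the texture matrix stack (glMatrixMode(GL_TEXTURE)); with shaders you would pass it as a uniform and multiply the UVs in the vertex shader.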

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow