Problem

In one of my earlier questions I asked about onTouchEvent handling in OpenGL. The idea was to create a simple app that drew a square wherever you pressed the screen. After a while I ended up with an app that did almost everything it should, except that the square didn't redraw at the new position; it just vanished from the screen.

I have been told that to solve this I need to cast a ray from my 2D touch point into OpenGL's 3D environment, but for the life of me I can't understand why; when I read up on raycasting it seemed to be used primarily for lighting effects. What exactly is raycasting? And why would I need the z-axis at all, when all I need to change in the square's position are its x and y coordinates, which can be found no matter what depth the touch registers at?


Solution

You need to cast a ray from the touch point on the screen into the world to know where to draw your square. While this involves casting a ray, it isn't ray casting in the rendering-technique sense. Unless you are using an orthographic projection, and even then to an extent, you still have to do the math that converts between screen coordinates and world coordinates in order to find the correct point for your primitive.
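
As a minimal sketch of what that conversion looks like on Android: unproject the touch point at the near and far planes, treat the two results as a ray, and intersect that ray with the plane your square lives on (assumed here to be z = 0). This assumes you have the current model-view and projection matrices and the viewport size available; the class and method names below are illustrative, and it relies only on the standard android.opengl.Matrix helpers.

```java
import android.opengl.Matrix;

/**
 * Converts a 2D touch point into a world-space position on the z = 0 plane
 * by unprojecting the point at the near and far planes and intersecting the
 * resulting ray with that plane.
 */
public final class TouchUnprojector {

    /**
     * @param touchX     touch x in window coordinates (pixels)
     * @param touchY     touch y in window coordinates (pixels, origin at the top)
     * @param viewportW  viewport width in pixels
     * @param viewportH  viewport height in pixels
     * @param modelView  current 4x4 model-view matrix (column-major)
     * @param projection current 4x4 projection matrix (column-major)
     * @return {worldX, worldY} where the touch ray crosses z = 0, or null if
     *         the matrices cannot be inverted or the ray is parallel to the plane
     */
    public static float[] touchToWorldOnZ0(float touchX, float touchY,
                                           int viewportW, int viewportH,
                                           float[] modelView, float[] projection) {
        // Combined projection * model-view, then its inverse for unprojection.
        float[] pm = new float[16];
        float[] inv = new float[16];
        Matrix.multiplyMM(pm, 0, projection, 0, modelView, 0);
        if (!Matrix.invertM(inv, 0, pm, 0)) {
            return null;
        }

        // Window coordinates -> normalized device coordinates in [-1, 1].
        // OpenGL's y axis points up, touch coordinates point down, so flip y.
        float ndcX = 2f * touchX / viewportW - 1f;
        float ndcY = 1f - 2f * touchY / viewportH;

        // Unproject the touch point on the near plane (z = -1) and far plane (z = +1).
        float[] near = unproject(inv, ndcX, ndcY, -1f);
        float[] far  = unproject(inv, ndcX, ndcY,  1f);
        if (near == null || far == null) {
            return null;
        }

        // Ray: P = near + t * (far - near). Solve for t where P.z == 0.
        float dz = far[2] - near[2];
        if (Math.abs(dz) < 1e-6f) {
            return null; // Ray is parallel to the z = 0 plane.
        }
        float t = -near[2] / dz;
        float worldX = near[0] + t * (far[0] - near[0]);
        float worldY = near[1] + t * (far[1] - near[1]);
        return new float[] { worldX, worldY };
    }

    /** Multiplies the inverse matrix by an NDC point and divides by w. */
    private static float[] unproject(float[] inv, float ndcX, float ndcY, float ndcZ) {
        float[] in  = { ndcX, ndcY, ndcZ, 1f };
        float[] out = new float[4];
        Matrix.multiplyMV(out, 0, inv, 0, in, 0);
        if (out[3] == 0f) {
            return null;
        }
        return new float[] { out[0] / out[3], out[1] / out[3], out[2] / out[3] };
    }
}
```

This is also why the z-axis matters even though you only want new x and y values: with a perspective projection, a single screen point corresponds to a whole line of world points at different depths, and you only get a unique x/y once you pick the depth (here, the z = 0 plane) where the ray should land.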
