Problem

I built a small test application that simulates moving around an object. I used the JPCT-AE library and the Rotation Vector Sensor on my mobile device.

My problem is that the rotation does not simulate moving around the object correctly: the rotation is reversed.

Here is a picture to make the problem clearer:

(Image: a diagram comparing the user's real movement around the object with how the application currently simulates it.)

In the picture, the upper part shows the user moving from point A to point B. The part below shows how the application simulates moving around the object; the "HOW IT IS" screen shows how it currently rotates the object.

The code looks like this:

public class HelloWorld extends Activity {

private GLSurfaceView mGLSurfaceView;
private SensorManager mSensorManager;
private MyRenderer mRenderer;
Object3D object = null;
private World world = null;
private Light sun = null;
Context context = this;
private FrameBuffer fb = null;
private RGBColor back = new RGBColor(175, 175, 175);

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    // Get an instance of the SensorManager
    mSensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);

    mRenderer = new MyRenderer(context);
    mGLSurfaceView = new GLSurfaceView(this);
    mGLSurfaceView.setRenderer(mRenderer);
    setContentView(mGLSurfaceView);
}

@Override
protected void onResume() {
    super.onResume();
    mRenderer.start();
    mGLSurfaceView.onResume();
}

@Override
protected void onPause() {
    super.onPause();
    mRenderer.stop();
    mGLSurfaceView.onPause();
}

class MyRenderer implements GLSurfaceView.Renderer, SensorEventListener {
    private Sensor mRotationVectorSensor;
    private final float[] mRotationMatrix = new float[16];
    Context context;

    public MyRenderer(Context context) {
        // find the rotation-vector sensor
        this.context = context;
        mRotationVectorSensor = mSensorManager
                .getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);

        // initialize the rotation matrix to identity
        // the diagonal of a 4x4 matrix in a flat 16-element array
        // sits at indices 0, 5, 10 and 15
        mRotationMatrix[0] = 1;
        mRotationMatrix[5] = 1;
        mRotationMatrix[10] = 1;
        mRotationMatrix[15] = 1;
    }

    public void start() {
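        // the third argument is the desired sampling period in
        // microseconds (10,000 us = 100 Hz)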
        mSensorManager.registerListener(this, mRotationVectorSensor, 10000);
    }

    public void stop() {
        // make sure to turn our sensor off when the activity is paused
        mSensorManager.unregisterListener(this);
    }

    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
            SensorManager.getRotationMatrixFromVector(mRotationMatrix,
                    event.values);
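            // remap the device coordinate system so the simulated
            // rotation matches the user's real movement direction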
            SensorManager.remapCoordinateSystem(mRotationMatrix,
                    SensorManager.AXIS_X, SensorManager.AXIS_MINUS_Y,
                    mRotationMatrix);
        }
    }

    public void onDrawFrame(GL10 gl) {

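        // copy the 4x4 sensor rotation matrix into a JPCT matrix
        // and apply it to the model before rendering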
        Matrix m = new Matrix();
        m.setDump(mRotationMatrix);

        object.setRotationMatrix(m);

        fb.clear(back);
        world.renderScene(fb);
        world.draw(fb);
        fb.display();
    }

    public void onSurfaceChanged(GL10 gl, int width, int height) {
        if (fb != null) {
            fb.dispose();
        }
        fb = new FrameBuffer(gl, width, height);
        world = new World();
        world.setAmbientLight(250, 250, 250);

        // set view-port
        gl.glViewport(0, 0, width, height);
        // set projection matrix
        float ratio = (float) width / height;
        gl.glMatrixMode(GL10.GL_PROJECTION);
        gl.glLoadIdentity();
        gl.glFrustumf(-ratio, ratio, -1, 1, 1, 10);
        try {
            object = loadModel("untitled.obj", "untitled.mtl", 0.1F);
        } catch (UnsupportedEncodingException e) {
            e.printStackTrace();
        }

        object.build();
        world.addObject(object);

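        // pull the camera back 50 units and aim it at the model's center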
        Camera cam = world.getCamera();
        cam.moveCamera(Camera.CAMERA_MOVEOUT, 50);
        cam.lookAt(object.getTransformedCenter());
    }

    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        // dither is enabled by default, we don't need it
        gl.glDisable(GL10.GL_DITHER);
        // clear screen in white
        gl.glClearColor(1, 1, 1, 1);
    }

    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }

    private Object3D loadModel(String filename, String mtlFileName,
            float scale) throws UnsupportedEncodingException {

        InputStream stream = null;
        InputStream mtlStream = null;
        try {

            stream = context.getAssets().open(filename);
            mtlStream = context.getAssets().open(mtlFileName);
        } catch (IOException e) {
            e.printStackTrace();
        }

        Object3D[] model = Loader.loadOBJ(stream, mtlStream, scale);

        return Object3D.mergeAll(model);

    }
    }
}

Does anyone here have experience with this? I want to know if there is a simple solution before I go full mathematical on it.

Thank you for your time! I really appreciate it!


Solution

OK, I found a solution on my own. I had to remap the supplied rotation matrix so it is expressed in a coordinate system suitable for my problem.

I edited the code in my question. That code now works for me.

Basically, I added this line of code in the onSensorChanged method:

SensorManager.remapCoordinateSystem(mRotationMatrix, SensorManager.AXIS_X,
        SensorManager.AXIS_MINUS_Y, mRotationMatrix);

I know it's not the best-performing solution, because I am using mRotationMatrix as both the input and the output parameter of that method (the Android docs advise against passing the same array for both, for performance reasons). But for now I am not bothered by the performance issue.
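If it ever does become a concern, a minimal sketch of the same call with a separate, preallocated output array could look like this (mRemappedMatrix is a name I made up for illustration):

private final float[] mRotationMatrix = new float[16];
private final float[] mRemappedMatrix = new float[16];

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
        SensorManager.getRotationMatrixFromVector(mRotationMatrix,
                event.values);
        // read from mRotationMatrix, write into the separate output array
        SensorManager.remapCoordinateSystem(mRotationMatrix,
                SensorManager.AXIS_X, SensorManager.AXIS_MINUS_Y,
                mRemappedMatrix);
    }
}

onDrawFrame would then pass mRemappedMatrix to m.setDump(...) instead of mRotationMatrix.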

I hope this helps someone in the future.

License: CC-BY-SA with attribution
Not affiliated with StackOverflow