Question

Hey everyone,

first time posting here, because I'm damn stuck...

The further away a mesh is from the origin at (0, 0, 0), the more it "jumps"/"flickers" when the camera rotates or moves. The effect is hard to describe: the mesh seems to jitter/shiver/tremble slightly, and the trembling gets bigger and bigger the further the mesh is from the origin.
For me it becomes noticeable at around 100000 units from the origin, e.g. at (0, 0, 100000). Neither the axis of the translation nor the type of mesh (a default mesh created via Mesh.Create... or a 3ds mesh imported with assimp.NET) has any influence on the effect. The position value of the mesh does not change while this happens; I checked that by logging the position.

Unless I'm missing something, this narrows it down to two possibilities:

  1. My camera code
  2. The DirectX-Device

As for the DirectX-Device, this is my device initialization code:

    private void InitializeDevice()
    {
        //Initialize D3D
        _d3dObj = new D3D9.Direct3D();

        //Set presentation parameters
        _presParams = new D3D9.PresentParameters();
        _presParams.Windowed = true;
        _presParams.SwapEffect = D3D9.SwapEffect.Discard;
        _presParams.AutoDepthStencilFormat = D3D9.Format.D16;
        _presParams.EnableAutoDepthStencil = true;
        _presParams.PresentationInterval = D3D9.PresentInterval.One;
        _presParams.BackBufferFormat = _d3dObj.Adapters.DefaultAdapter.CurrentDisplayMode.Format;
        _presParams.BackBufferHeight = _d3dObj.Adapters.DefaultAdapter.CurrentDisplayMode.Height;
        _presParams.BackBufferWidth = _d3dObj.Adapters.DefaultAdapter.CurrentDisplayMode.Width;

        //Set form width and height to the current backbuffer width and height
        this.Width = _presParams.BackBufferWidth;
        this.Height = _presParams.BackBufferHeight;

        //Checking device capabilities
        D3D9.Capabilities caps = _d3dObj.GetDeviceCaps(0, D3D9.DeviceType.Hardware);
        D3D9.CreateFlags devFlags = D3D9.CreateFlags.SoftwareVertexProcessing;
        D3D9.DeviceType devType = D3D9.DeviceType.Reference;

        //setting device flags according to device capabilities
        if ((caps.VertexShaderVersion >= new Version(2, 0)) && (caps.PixelShaderVersion >= new Version(2, 0)))
        {
            //if device supports vertexshader and pixelshader >= 2.0
            //then use the hardware device
            devType = D3D9.DeviceType.Hardware;

            if (caps.DeviceCaps.HasFlag(D3D9.DeviceCaps.HWTransformAndLight))
            {
                devFlags = D3D9.CreateFlags.HardwareVertexProcessing;
            }
            if (caps.DeviceCaps.HasFlag(D3D9.DeviceCaps.PureDevice))
            {
                devFlags |= D3D9.CreateFlags.PureDevice;
            }
        }

        //initialize the device
        _device = new D3D9.Device(_d3dObj, 0, devType, this.Handle, devFlags, _presParams);
        //set culling
        _device.SetRenderState(D3D9.RenderState.CullMode, D3D9.Cull.Counterclockwise);
        //set texturewrapping (needed for seamless spheremapping)
        _device.SetRenderState(D3D9.RenderState.Wrap0, D3D9.TextureWrapping.All);
        //set lighting
        _device.SetRenderState(D3D9.RenderState.Lighting, false);
        //enabling the z-buffer
        _device.SetRenderState(D3D9.RenderState.ZEnable, D3D9.ZBufferType.UseZBuffer);
        //and setting write-access explicitly to true...
        //I'm a little paranoid about this since I had to struggle for a few days with weirdly overlapping meshes
        _device.SetRenderState(D3D9.RenderState.ZWriteEnable, true);
    }

Am I missing a flag or render state? Is there anything that could cause such weird/distorted behaviour?

My camera class is based on Michael Silverman's C++ quaternion camera:

    //every variable prefixed with an underscore is
    //a private static variable initialized beforehand
    public static class Camera
    {
        //gets called every frame
        public static void Update()
        {
            if (_filter)
            {
                _filteredPos = Vector3.Lerp(_filteredPos, _pos, _filterAlpha);
                _filteredRot = Quaternion.Slerp(_filteredRot, _rot, _filterAlpha);
            }

            _device.SetTransform(D3D9.TransformState.Projection, Matrix.PerspectiveFovLH(_fov, _screenAspect, _nearClippingPlane, _farClippingPlane));
            _device.SetTransform(D3D9.TransformState.View, GetViewMatrix());
        }

        public static void Move(Vector3 delta)
        {
            _pos += delta;
        }

        public static void RotationYaw(float theta)
        {
            _rot = Quaternion.Multiply(Quaternion.RotationAxis(_up, -theta), _rot);
        }

        public static void RotationPitch(float theta)
        {
            _rot = Quaternion.Multiply(_rot, Quaternion.RotationAxis(_right, theta));
        }

        public static void SetTarget(Vector3 target, Vector3 up)
        {
            SetPositionAndTarget(_pos, target, up);
        }

        public static void SetPositionAndTarget(Vector3 position, Vector3 target, Vector3 upVec)
        {
            _pos = position;

            Vector3 up, right, lookAt = target - _pos;

            lookAt = Vector3.Normalize(lookAt);
            right = Vector3.Cross(upVec, lookAt);
            right = Vector3.Normalize(right);
            up = Vector3.Cross(lookAt, right);
            up = Vector3.Normalize(up);

            SetAxis(lookAt, up, right);
        }

        public static void SetAxis(Vector3 lookAt, Vector3 up, Vector3 right)
        {
            Matrix rot = Matrix.Identity;

            rot.M11 = right.X;
            rot.M12 = up.X;
            rot.M13 = lookAt.X;

            rot.M21 = right.Y;
            rot.M22 = up.Y;
            rot.M23 = lookAt.Y;

            rot.M31 = right.Z;
            rot.M32 = up.Z;
            rot.M33 = lookAt.Z;

            _rot = Quaternion.RotationMatrix(rot);
        }

        public static void ViewScene(BoundingSphere sphere)
        {
            SetPositionAndTarget(sphere.Center - new Vector3((sphere.Radius + 150) / (float)Math.Sin(_fov / 2), 0, 0), sphere.Center, new Vector3(0, 1, 0));
        }

        public static Vector3 GetLookAt()
        {
            Matrix rot = Matrix.RotationQuaternion(_rot);
            return new Vector3(rot.M13, rot.M23, rot.M33);
        }

        public static Vector3 GetRight()
        {
            Matrix rot = Matrix.RotationQuaternion(_rot);
            return new Vector3(rot.M11, rot.M21, rot.M31);
        }

        public static Vector3 GetUp()
        {
            Matrix rot = Matrix.RotationQuaternion(_rot);
            return new Vector3(rot.M12, rot.M22, rot.M32);
        }

        public static Matrix GetViewMatrix()
        {
            Matrix viewMatrix, translation = Matrix.Identity;
            Vector3 position;
            Quaternion rotation;

            if (_filter)
            {
                position = _filteredPos;
                rotation = _filteredRot;
            }
            else
            {
                position = _pos;
                rotation = _rot;
            }

            translation = Matrix.Translation(-position.X, -position.Y, -position.Z);
            viewMatrix = Matrix.Multiply(translation, Matrix.RotationQuaternion(rotation));

            return viewMatrix;
        }
    }

Do you spot anything in the camera code which could cause this behaviour?

I just can't imagine that DirectX can't handle distances greater than 100k. I'm supposed to render solar systems, and I'm using a scale of 1 unit = 1 km. At its maximum distance from the sun, the earth would therefore be rendered at (0, 0, 152100000), just as an example. That becomes impossible if these "jumps" keep occurring. I have also thought about scaling everything down so that a system never goes beyond 100k/-100k units from the origin, but I don't think that will work, because the "jittering" grows with the distance from the origin. Scaling everything down would, I think, scale down the jumping behaviour as well.


Solution

Just so this question doesn't stay unanswered (credits to @jcoder, see the comments on the question):

The weird behaviour of the meshes comes from the limited precision of the single-precision floating point values DirectX works with. The further your world extends from the origin, the less precision is left to represent positions accurately, so the transformed vertices snap between representable values and the mesh appears to tremble.
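
To get a feeling for the numbers involved, here is a small standalone check (not from the original post, plain .NET, no DirectX involved) that prints the spacing between adjacent single-precision values at the distances mentioned in the question:

    using System;

    static class FloatPrecisionDemo
    {
        //Spacing ("ULP") between x and the next representable float above it
        static float Ulp(float x)
        {
            int bits = BitConverter.ToInt32(BitConverter.GetBytes(x), 0);
            return BitConverter.ToSingle(BitConverter.GetBytes(bits + 1), 0) - x;
        }

        static void Main()
        {
            //Where the jitter starts to become visible: positions can only be
            //represented in steps of roughly 0.008 units
            Console.WriteLine(Ulp(100000f));                    //0.0078125

            //At the earth's orbital radius the steps are 16 units wide
            //(16 km at the question's scale of 1 unit = 1 km), so any offset
            //smaller than 8 km is rounded away completely
            Console.WriteLine(Ulp(152100000f));                 //16
            Console.WriteLine(152100000f + 5f == 152100000f);   //True
        }
    }

All world, view and projection matrices in D3D9 are built from exactly these 32-bit values, which is where the trembling comes from.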

There are two possibilities to solve this problem:

  1. Downscaling the whole world
    This may be problematic in a "galactic-scale" world, where you have really big position offsets as well as really small ones (e.g. the distance from a planet to its sun is huge, while the distance from a spaceship to the planet it orbits may be tiny).
  2. Dividing the world into smaller chunks
    This way you either have to express all positions relative to something else (see stackoverflow.com/questions/1930421 and the sketch below) or create multiple worlds and somehow move between them.
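
Here is a minimal sketch of the "relative to something else" idea from option 2, in the SlimDX style of the code above: absolute positions are kept in double precision on the CPU, and only a small camera-relative offset is ever handed to Direct3D. The Vector3d and RenderObject types are made up for illustration and are not part of the original code:

    //Sketch only: keep absolute positions in doubles, send small offsets to the GPU
    public struct Vector3d
    {
        public double X, Y, Z;
        public Vector3d(double x, double y, double z) { X = x; Y = y; Z = z; }

        //Difference of two absolute positions, narrowed to single precision.
        //The result stays small (and therefore precise) as long as the object
        //is reasonably close to the camera.
        public static Vector3 RelativeTo(Vector3d position, Vector3d camera)
        {
            return new Vector3(
                (float)(position.X - camera.X),
                (float)(position.Y - camera.Y),
                (float)(position.Z - camera.Z));
        }
    }

    public class RenderObject
    {
        public Vector3d WorldPosition;   //absolute position, e.g. (0, 0, 152100000)
        public D3D9.Mesh Mesh;

        public void Draw(D3D9.Device device, Vector3d cameraPosition, Matrix cameraRotation)
        {
            //The world matrix only contains the camera-relative offset,
            //so the values the GPU works with never get huge
            Vector3 relative = Vector3d.RelativeTo(WorldPosition, cameraPosition);
            device.SetTransform(D3D9.TransformState.World, Matrix.Translation(relative));

            //The camera effectively sits at the origin, so the view matrix
            //is the rotation part only (GetViewMatrix without the translation)
            device.SetTransform(D3D9.TransformState.View, cameraRotation);

            Mesh.DrawSubset(0);
        }
    }

In this setup the Camera class would store its position as a Vector3d too and supply Matrix.RotationQuaternion(_rot) as the rotation-only view matrix; whether that is worth the effort compared to periodically re-centering the world around the player depends on the game.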
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow