Question

The result of the following code differs depending on whether it is started with the debugger attached or without it. The difference only appears when optimization is switched on.

This is the result:

-> with optimization: 1000 2008 3016 1001 2009 3007 ...

-> without optimization (as expected): 1000 1008 1016 1001 1009 1017 ...

Code:

using System;
using System.Diagnostics;
using System.Runtime.CompilerServices;

namespace OptimizerTest
{   
    public class Test
    {
        int dummy;

        public void TestFunction(int stepWidth)
        // stepWidth must be a parameter
        {
            for (int step = 0; step < stepWidth; step++)
            {
                dummy = step + 1000;
                // addition with a constant (same value as in the inner loop below!)
                for (int x = 0; x < 20; x += stepWidth)
                {
                    int index = x + 1000 + step;
                    // the constant must be the same as above, and yet
                    // int index = x + step + 1000; works!
                    Console.Write("\n\r" + index);
                }
            }
        }

        [MethodImpl(MethodImplOptions.NoOptimization)]
        public void TestFunctionNoOptimization(int stepWidth)
        {
            for (int step = 0; step < stepWidth; step++)
            {
                dummy = step + 1000;
                for (int x = 0; x < 20; x += stepWidth)
                {
                    int index = x + 1000 + step;                        
                    Console.Write("\n\r" + index);
                }
            }
        }
    }

    class Program
    {
        /// <summary>
        /// Result differs between starting with F5 and with Ctrl+F5
        /// </summary>
        /// <param name="args"></param>
        static void Main(string[] args)
        {
            Test test = new Test();
            Console.Write("\n\r---------\n\roptimized result\n\r-------------" );
            test.TestFunction(8);
            Console.Write("\n\r---------\n\rnot optimized result\n\r-------------");
            test.TestFunctionNoOptimization(8);
            Console.Write("\n\r---------\n\rpress any key");
            Console.ReadKey();
        }
    }
}

The behavior of the error depends on the number of iterations of the inner loop (with x < 5 everything works fine). Very interestingly, the error does not occur when I use

   int index = x + step + 1000; 

instead of

   int index = x + 1000 + step; 

I am working with Visual Studio 2010 SP1 and have tried it with .NET Framework versions from 2.0 to 4.0.3. I always see the same result.

Does anybody know about this bug, or can anyone reproduce it?


Solution

Yes, this is definitely a jitter optimizer bug. The reason other SO users have trouble reproducing it is that only the x64 jitter appears to have this bug. To see it, set the project's Platform target to AnyCPU and, on VS2012 and up, untick the "Prefer 32-bit" option.
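
If it helps while reproducing, a quick runtime check can confirm both conditions, i.e. that the process really is 64-bit and whether a debugger is attached. This is just plain framework API, nothing specific to the bug; for example, at the top of Main:

    // Confirm the repro conditions at run time. IntPtr.Size works on all
    // framework versions; Environment.Is64BitProcess needs .NET 4.0+.
    Console.WriteLine("64-bit process: " + (IntPtr.Size == 8));
    Console.WriteLine("Debugger attached: " + Debugger.IsAttached);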

I haven't looked at the underlying reason closely, but the jitter appears to fumble when trying to eliminate the common step + 1000 sub-expression. Sub-expression elimination is one of the standard jitter optimizations, but here it incorrectly folds the expression code into the loop instead of keeping it out of the loop as written. For example, you'll see the bug disappear when you write:

  dummy = step + 999;
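
Roughly speaking, a correct elimination would be equivalent to hand-rewriting the inner loop like this (just a sketch of the intended transformation, using a hypothetical local named common; it is not the actual code the jitter emits):

    // What a *correct* common sub-expression elimination amounts to:
    // compute step + 1000 once per outer iteration and reuse it.
    int common = step + 1000;       // hoisted out of the inner loop
    dummy = common;
    for (int x = 0; x < 20; x += stepWidth)
    {
        int index = x + common;     // same values as the unoptimized code
        Console.Write("\n\r" + index);
    }
    // The buggy x64 code produces 1000, 2008, 3016, ... instead, as if the
    // shared value were added again on every pass through the inner loop.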

This bug is still present in the latest .NET 4.5.1 version (clrjit.dll, v4.0.30319.34003 on my machine) and is also present in the v2 jitter (mscorjit.dll, v2.0.50727.7905 on my machine).

The code is a bit too synthetic to recommend a solid workaround, and you already found one anyway, so you can keep motoring on your project. In general, I'd recommend eliminating the sub-expression yourself with:

  int index = x + dummy;  
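
Applied to the inner loop of TestFunction, where dummy already holds step + 1000, that looks for example like this:

    dummy = step + 1000;                    // unchanged
    for (int x = 0; x < 20; x += stepWidth)
    {
        int index = x + dummy;              // eliminated by hand, nothing
                                            // left for the jitter to fold
        Console.Write("\n\r" + index);
    }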

It should be reported to Microsoft; you can do so by posting a bug report at connect.microsoft.com. If you don't want to take the time, let me know and I'll take care of it.

Licensed under: CC-BY-SA with attribution