Question

I'm busy reading Jon Skeet's excellent book C# In Depth. He mentions in the section about boxing and unboxing that using boxed objects has a small overhead that could conceivably make a performance difference, on a big enough scale.

So I wrote my own benchmark tests, adding together all the numbers from 1 to 100,000,000 using a for loop. In one case I used Int32, in another int, and in a third I cast each value to object and back to int. I repeated all tests 10 times and took an average. Results (in seconds):

Int32 avg: 0.333

int avg: 0.326

object avg: 1.061

Not much difference between Int32 and int, but the boxing/unboxing took 3 times longer!

So please help me understand here: when you cast an int to an object, isn't it the same as casting it to Int32? DotNetPerls asserts that int is really just an alias for Int32 - but if that's the case, then why does int consistently perform faster than Int32, even if only marginally so?

EDIT: By popular request, here's the benchmark code:

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;

const int SIZE = 100000000, ITERATIONS = 10;
var intTimes = new List<double>();
var int32Times = new List<double>();
var objectTimes = new List<double>();

for (var n = 0; n < ITERATIONS; n++)
{
    Console.WriteLine("Iteration "+(n+1));
    Console.WriteLine("Testing using Int32");
    long result = 0;
    var sw = Stopwatch.StartNew();
    for (Int32 x = 0; x < SIZE; x++)
    {
        result += x;
    }
    sw.Stop();
    int32Times.Add(sw.Elapsed.TotalSeconds);
    Console.WriteLine("Result = {0} after {1:0.000} seconds", result, sw.Elapsed.TotalSeconds);

    Console.WriteLine("Testing using int");
    result = 0;
    sw = Stopwatch.StartNew();
    for (int x = 0; x < SIZE; x++)
    {
        result += x;
    }
    sw.Stop();
    Console.WriteLine("Result = {0} after {1:0.000} seconds", result, sw.Elapsed.TotalSeconds);
    intTimes.Add(sw.Elapsed.TotalSeconds);

    Console.WriteLine("Testing using object");
    result = 0;
    sw = Stopwatch.StartNew();
    for (int i = 0; i < SIZE; i++)
    {
        object o = i;
        result += (int) o;
    }
    sw.Stop();
    Console.WriteLine("Result = {0} after {1:0.000} seconds", result, sw.Elapsed.TotalSeconds);
    objectTimes.Add(sw.Elapsed.TotalSeconds);
}
Console.WriteLine("Summary:");
Console.WriteLine("Int32 avg: {0:0.000}", int32Times.Average());
Console.WriteLine("int avg: {0:0.000}", intTimes.Average());
Console.WriteLine("object avg: {0:0.000}", objectTimes.Average());

Solution

The C# keyword int is an alias for System.Int32, so they perform exactly the same.
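
To see the alias in practice, here is a minimal sketch (a standalone console program, not part of the question's benchmark) showing that both names refer to the same runtime type:

using System;

class AliasDemo
{
    static void Main()
    {
        // int is simply the C# keyword for System.Int32; both name the same type.
        Console.WriteLine(typeof(int) == typeof(Int32)); // True
        Console.WriteLine(typeof(int).FullName);         // System.Int32

        // These two declarations compile to identical IL.
        int a = 42;
        Int32 b = 42;
        Console.WriteLine(a + b);                         // 84
    }
}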

Of course there are always small fluctuations between measurements, because your machine is also doing other work.

Every time you cast an int to an object, a little piece of memory is reserved on the heap and the value of the int is copied into it. This takes quite some effort. Every time you cast it back, .NET first checks whether your object actually contains an int (since an object can hold anything) and then copies the value back into the int. This check also takes time.
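
The following small sketch (a standalone example, not from the question's benchmark) shows both halves of that cost: boxing copies the value into a new heap object, and unboxing performs a runtime type check before copying it back. Unboxing to the wrong value type fails that check:

using System;

class BoxingDemo
{
    static void Main()
    {
        int i = 42;

        // Boxing: a new object is allocated on the heap and the int's value is copied into it.
        object boxed = i;

        // Unboxing: the runtime first verifies the object really holds an int,
        // then copies the value back out into the local variable.
        int j = (int)boxed;
        Console.WriteLine(j); // 42

        // Unboxing to a different value type fails the runtime check.
        try
        {
            Console.WriteLine((long)boxed);
        }
        catch (InvalidCastException)
        {
            Console.WriteLine("A boxed int cannot be unboxed as a long.");
        }
    }
}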
