Should a .NET generic dictionary be initialised with a capacity equal to the number of items it will contain?

StackOverflow https://stackoverflow.com/questions/414109

Question

If I have, say, 100 items that'll be stored in a dictionary, should I initialise it thus?

var myDictionary = new Dictionary<Key, Value>(100);

My understanding is that the .NET dictionary internally resizes itself when it reaches a given loading, and that the loading threshold is defined as a ratio of the capacity.

That would suggest that, if 100 items were added to the above dictionary, it would resize itself before all of the items had been added. Resizing a dictionary is something I'd like to avoid, as it incurs a performance hit and wastes memory.

The probability of hash collisions is proportional to the load on a dictionary. Therefore, even if the dictionary does not resize itself (and uses all of its slots), performance must still degrade because of those collisions.

How should one best decide what capacity to initialise the dictionary to, assuming you know how many items will be inside the dictionary?


Solution

What you should initialize the dictionary capacity to depends on two factors: (1) the distribution of the GetHashCode function, and (2) how many items you have to insert.

Your hash function should either be randomly distributed, or it should be specially formulated for your set of inputs. Let's assume the first; if you are interested in the second, look up perfect hash functions.

If you have 100 items to insert into the dictionary, a randomly distributed hash function, and a capacity of 100, then when you insert the i-th item into the hash table there is a (i - 1) / 100 probability that it will collide with an existing item. If you want to lower this probability of collision, increase the capacity. Doubling the expected capacity halves the chance of collision.
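The arithmetic above can be checked directly: summing (i - 1) / capacity over all insertions gives the expected number of collisions, and doubling the capacity roughly halves it. A minimal sketch (the capacity values are illustrative, not a recommendation):

```csharp
using System;

class CollisionEstimate
{
    static void Main()
    {
        // Expected number of collisions when inserting 100 items into a
        // table with `capacity` slots, assuming a uniformly random hash:
        // the i-th insertion collides with probability (i - 1) / capacity.
        foreach (int capacity in new[] { 100, 200, 400 })
        {
            double expected = 0;
            for (int i = 1; i <= 100; i++)
                expected += (i - 1) / (double)capacity;

            Console.WriteLine($"capacity {capacity}: ~{expected:F1} expected collisions");
        }
    }
}
```

For 100 items and capacity 100 this works out to 4950 / 100 = 49.5 expected collisions, dropping by half each time the capacity doubles.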

Furthermore, if you know how frequently you are going to be accessing each item in the dictionary you may want to insert the items in order of decreasing frequency since the items that you insert first will be on average faster to access.
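That suggestion could be sketched as follows, assuming you already have per-key access counts available (the accessCounts parameter and Build helper below are hypothetical, and whether earlier or later insertions end up faster depends on how the runtime lays out its collision chains):

```csharp
using System.Collections.Generic;
using System.Linq;

class FrequencyOrderedInsert
{
    // Hypothetical helper: rebuild a dictionary by inserting the most
    // frequently accessed keys first, per the suggestion above.
    static Dictionary<string, int> Build(
        Dictionary<string, int> values,
        Dictionary<string, int> accessCounts)
    {
        var result = new Dictionary<string, int>(values.Count);
        foreach (var key in values.Keys.OrderByDescending(
                     k => accessCounts.TryGetValue(k, out var n) ? n : 0))
        {
            result.Add(key, values[key]);
        }
        return result;
    }
}
```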

OTHER TIPS

I did a quick test, probably not scientific, but with the size set it took 1.2207780 seconds to add one million items, and 1.5024960 seconds when I didn't give the Dictionary a size... the difference seems negligible to me.

Here is my test code, maybe someone can do a more rigorous test but I doubt it matters.

static void Main(string[] args)
{
    DateTime start1 = DateTime.Now;
    var dict1 = new Dictionary<string, string>(1000000);

    for (int i = 0; i < 1000000; i++)
        dict1.Add(i.ToString(), i.ToString());

    DateTime stop1 = DateTime.Now;

    DateTime start2 = DateTime.Now;
    var dict2 = new Dictionary<string, string>();

    for (int i = 0; i < 1000000; i++)
        dict2.Add(i.ToString(), i.ToString());

    DateTime stop2 = DateTime.Now;

    Console.WriteLine("Time with size initialized: " + stop1.Subtract(start1) +
                      "\nTime without size initialized: " + stop2.Subtract(start2));
    Console.ReadLine();
}
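One caveat with the test above is that DateTime.Now has coarse resolution and the first loop also pays the JIT-compilation cost. A Stopwatch-based variant of the same test, with a warm-up pass, gives more trustworthy numbers; this is a sketch of that idea, not a rigorous benchmark:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

class DictionaryCapacityBenchmark
{
    const int N = 1_000_000;

    // Time how long it takes to fill a dictionary produced by `factory`.
    static TimeSpan Time(Func<Dictionary<string, string>> factory)
    {
        var sw = Stopwatch.StartNew();
        var dict = factory();
        for (int i = 0; i < N; i++)
            dict.Add(i.ToString(), i.ToString());
        sw.Stop();
        return sw.Elapsed;
    }

    static void Main()
    {
        // Warm-up pass so the first real measurement isn't penalised
        // by JIT compilation.
        Time(() => new Dictionary<string, string>(N));

        Console.WriteLine("With capacity:    " + Time(() => new Dictionary<string, string>(N)));
        Console.WriteLine("Without capacity: " + Time(() => new Dictionary<string, string>()));
    }
}
```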

I think you're over-complicating matters. If you know how many items will be in your dictionary, then by all means specify that on construction. This will help the dictionary to allocate the necessary space in its internal data structures to avoid reallocating and reshuffling data.

Specifying the initial capacity to the Dictionary constructor increases performance because there will be fewer resizes of the internal structures that store the dictionary values during ADD operations.

Considering that you specify an initial capacity of k to the Dictionary constructor, then:

  1. The Dictionary will reserve the amount of memory necessary to store k elements;
  2. QUERY performance against the dictionary is not affected; it will be neither faster nor slower;
  3. ADD operations will not require more (potentially expensive) memory allocations, and will thus be faster.

From MSDN:

The capacity of a Dictionary<TKey, TValue> is the number of elements that can be added to the Dictionary<TKey, TValue> before resizing is necessary. As elements are added to a Dictionary<TKey, TValue>, the capacity is automatically increased as required by reallocating the internal array.

If the size of the collection can be estimated, specifying the initial capacity eliminates the need to perform a number of resizing operations while adding elements to the Dictionary<TKey, TValue>.
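On newer runtimes (.NET Core 2.1 and later) the same up-front allocation can also be done after construction with EnsureCapacity, which resizes once instead of several times during the Add loop and returns the capacity actually reserved; a small sketch:

```csharp
using System;
using System.Collections.Generic;

class EnsureCapacityExample
{
    static void Main()
    {
        var dict = new Dictionary<int, string>();

        // EnsureCapacity (available since .NET Core 2.1) performs a
        // single resize up front. The runtime may round the requested
        // capacity up, so it returns the capacity actually reserved.
        int actual = dict.EnsureCapacity(100);
        Console.WriteLine($"requested 100, reserved {actual}");

        // These Add calls now trigger no further internal resizes.
        for (int i = 0; i < 100; i++)
            dict.Add(i, i.ToString());
    }
}
```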

Yes. Contrary to a Hashtable, which resolves collisions by rehashing (probing to another slot), Dictionary uses chaining. So it's good to use the count. For a Hashtable you would probably want count * (1 / loadFactor).
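The distinction might be sketched like this; the load-factor value below is illustrative rather than a tuned recommendation:

```csharp
using System;
using System.Collections;
using System.Collections.Generic;

class CapacityChoice
{
    static void Main()
    {
        const int count = 100;

        // Dictionary<TKey, TValue> resolves collisions by chaining,
        // so the item count itself is a reasonable initial capacity.
        var dict = new Dictionary<int, string>(count);

        // The non-generic Hashtable uses probing, so it degrades as it
        // fills and should be kept sparser: size it as count / loadFactor.
        const float loadFactor = 0.72f; // illustrative value
        var table = new Hashtable((int)Math.Ceiling(count / loadFactor), loadFactor);

        Console.WriteLine($"Dictionary capacity: {count}, " +
                          $"Hashtable capacity: {(int)Math.Ceiling(count / loadFactor)}");
    }
}
```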

The initial size is just a suggestion. Most hash tables like their size to be a prime number or a power of two; .NET's Dictionary, for example, rounds the requested capacity up to a prime.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow