Question

I'm implementing a priority queue and want to iterate through the list to insert at the right spot. The documentation states that the C# List<T>.Item property (the indexer) is O(1): List<T>.Item Property

e.g.

int retrievedValue = myIntList[5];

How is this possible when Add is also O(1)? It's like eating the cookie and still having it. The lists in my head have O(n) for accessing an element.


Solution 2

List<T> is a list in representation only: as the documentation says, it represents a strongly typed list of objects that can be accessed by index. Its elements can be accessed by index directly, without element-by-element traversal, so its time complexity for accessing an element is O(1). It's implemented internally as a dynamic array, the kind that doubles its capacity when it fills up (thanks to the comments!).

You may be confusing it with LinkedList<T>, which is implemented as a linked list...
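The contrast can be sketched as follows (in Python for brevity; the `Node` class and `linked_list_get` helper are illustrative, not part of any library). An array-backed list computes an element's location directly from its index, while a linked list must follow pointers one node at a time:

```python
# Illustrative sketch: O(1) indexed access into an array-backed list
# versus O(n) traversal of a linked list.

class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def linked_list_get(head, index):
    """O(n): must follow `next` pointers one node at a time."""
    node = head
    for _ in range(index):
        node = node.next
    return node.value

# Array-backed list: the element's address is base + index * slot_size,
# so access is a single computed lookup, O(1).
backing = [10, 20, 30, 40, 50]
print(backing[3])  # 40, via direct O(1) indexing

# Linked list: reaching index 3 requires walking 3 links.
head = Node(10, Node(20, Node(30, Node(40, Node(50)))))
print(linked_list_get(head, 3))  # 40, after O(n) traversal
```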

OTHER TIPS

The standard List type is backed by an internal array with O(1) access performance.

List does not use a linked list implementation.

The List<T> is backed by an array.

The Add operation is O(1) amortized across all adds, meaning that most operations are O(1), but some are O(N). Whenever the backing array fills up it gets copied to a new array of double the size.

As an example of how it works, assume the backing array starts with 4 elements. The first 4 additions are O(1), but the 5th will have to copy the existing 4 elements and add the 5th, making it O(5). Adding elements 6, 7, and 8 are O(1), while adding element 9 will be O(9). Then elements 10-16 will also be O(1).

By the time you've added 16 elements to the array, you've performed 28 operations, so adding N elements takes roughly 2N operations in total. A single Add is therefore O(2) = O(1) amortized.
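The walkthrough above can be verified with a short simulation (a Python sketch; `count_add_operations` is an illustrative helper, not real List<T> internals) that counts element writes for a doubling dynamic array:

```python
def count_add_operations(n, initial_capacity=4):
    """Count element writes (appends + copies) for n appends to a
    dynamic array that doubles its capacity when full."""
    capacity = initial_capacity
    size = 0
    ops = 0
    for _ in range(n):
        if size == capacity:
            ops += size      # copy all existing elements to the new array
            capacity *= 2
        ops += 1             # write the new element
        size += 1
    return ops

print(count_add_operations(16))  # 28, matching the example above
```

For any n, the total stays under 2n (copies sum to less than n, plus the n appends), which is why a single Add is O(1) amortized.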

You might want to have a look at this to understand the options and pick the most suitable data structure.

Below is the summary table:

(image: complexity summary table)

You only get O(n) if you have to iterate across the list to retrieve the item.

In this example you are accessing an internal array by index, so no iteration is needed.

Despite the attractive O(1) complexity of List element access by index, I wonder if your needs might be better served by a SkipList, as described in An Extensive Examination of Data Structures Using C# 2.0 (alluded to here), since the ordering priority is not unique.

Dynamically extending an Array for n insertions (n unbounded) is in fact O(n)

If the array backing a list is doubled in size every time it fills, and n is allowed to grow without bound, then the array will be reallocated O(log n) times, and at each reallocation its size will be 2^k, k = 1, 2, 3, ...


List<T>.Item versus List<T>.Contains

Access to a List element is O(1) if the index is known, but to keep the correct order based on priority, you would need either some external ordering mechanism or a search such as .Contains to find the right spot, and that second method is O(n), not O(1).
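One common middle ground, sketched here in Python (the `linear_find` helper is illustrative): if the backing list is kept sorted, a binary search finds the insertion point in O(log n), although the insert itself still shifts elements and so remains O(n) overall:

```python
import bisect

# A sorted backing list of priorities.
priorities = [1, 3, 5, 8, 13]

# O(1): direct index into the backing array.
assert priorities[2] == 5

# O(n): a Contains-style linear scan.
def linear_find(items, value):
    for i, v in enumerate(items):
        if v == value:
            return i
    return -1

assert linear_find(priorities, 8) == 3

# O(log n): binary search for the insertion point (the insert itself
# still shifts elements, so the whole operation is O(n)).
pos = bisect.bisect_left(priorities, 6)
priorities.insert(pos, 6)
print(priorities)  # [1, 3, 5, 6, 8, 13]
```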

Skiplists have O(log n) Search complexity; this allows a PriorityQueue with O(log n) enqueue and O(1) dequeue

The idea is that a SkipList is a linked list with randomly inserted "fast forward" pointers to overcome a linked list's O(n) Search complexity. Every node has a height K and K next pointers. Without going into detail, heights are distributed and the next() function selects the appropriate pointer in such a way as to make the search O(log n). However, the node to remove from the queue is always at one end of the linked list.
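For comparison, a binary heap (Python's `heapq` here, standing in for the SkipList idea rather than implementing it) also avoids the O(n) sorted insert: enqueue is O(log n), and dequeue is O(log n) rather than the SkipList's O(1), since the minimum sits at the heap's root and removing it requires a sift-down:

```python
import heapq

# Heap-based priority queue: O(log n) enqueue, O(log n) dequeue.
queue = []
for priority, task in [(3, "low"), (1, "urgent"), (2, "normal")]:
    heapq.heappush(queue, (priority, task))   # O(log n) enqueue

# Items come out in priority order.
order = [heapq.heappop(queue)[1] for _ in range(len(queue))]
print(order)  # ['urgent', 'normal', 'low']
```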

Online behavior

Memory is finite, and practically speaking a priority queue is not going to grow forever. One could argue that the queue grows to a certain size and is then never reallocated. But then, does it shrink? If the list becomes fragmented, a shrink operation sounds even more costly than the grow operation. If shrinking is supported, then the cost will be paid every time the List grows too small, and the grow cost will subsequently be paid when it grows too large again. If it's not supported, then the memory associated with the largest queue size will remain allocated. This being a priority queue, it does in fact sound possible that a queue could grow as work is fed into it, then shrink (logically) to 0. There may be rare circumstances in which it grows very large if the consumer side lags.

List internally uses an array, so indexing is a direct operation. Adding at the end is also fast (unless a reallocation is required). Insert and Remove are O(n), though, because all elements of the array after the affected index need to be moved. In practice this isn't as bad as it sounds, because the move is implemented as a single block move of the right part of the array.
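The shifting can be sketched element-by-element (a Python illustration of the idea; the real implementation does the move as one block copy, and `insert_with_shift` is an illustrative helper):

```python
# Why Insert is O(n) on an array-backed list: every element after the
# insertion point must shift one slot to the right.
def insert_with_shift(arr, index, value):
    arr.append(None)                 # grow by one slot
    for i in range(len(arr) - 1, index, -1):
        arr[i] = arr[i - 1]          # shift the right part along
    arr[index] = value

data = [10, 20, 40, 50]
insert_with_shift(data, 2, 30)
print(data)  # [10, 20, 30, 40, 50]
```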

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow