Question

Apologies if this question feels like a solution-verification request, but it was asked in my graduate admission test and there's a lot riding on it:

What is the worst case time complexity of inserting $n$ elements into an empty linked list, if the linked list needs to be maintained in sorted order?

In my opinion, the answer should be $O(n^2)$ because for every insertion we have to find the right place for the element, and it is possible that every element has to be inserted at the last place, giving a time complexity of $1 + 2 + \cdots + (n-1) + n = O(n^2)$.

However, the solution that I have says that we can first sort the elements in $O(n \log n)$ and then, we can insert them one by one in $O(n)$, giving us an overall complexity of $O(n \log n)$.

From the given wording of the question, which solution is more apt? In my opinion, since the question mentions "linked list needs to be maintained in sorted order", I am inclined to say that we cannot sort the elements beforehand and then insert them in the sorted order.


Solution

The question only says that the target list needs to be maintained in sorted order. It doesn't say anything about any other data structure that you may choose to use. The proposed solution first does some preprocessing of the arguments to insert, then does the insertion proper. This is allowed by the problem statement.
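As a sketch of that approach (minimal Python, where the `Node` class and helper functions are assumptions of this illustration, not part of the original question):

```python
class Node:
    """Minimal singly linked list node (hypothetical helper for this sketch)."""
    def __init__(self, value, nxt=None):
        self.value = value
        self.next = nxt

def build_sorted_list(elements):
    """Sort the input in O(n log n), then append each element in O(1)."""
    head = tail = None
    for x in sorted(elements):      # O(n log n) preprocessing
        node = Node(x)
        if head is None:
            head = tail = node      # first node becomes the head
        else:
            tail.next = node        # append at the tail: O(1) per element
            tail = node
    return head

def to_list(head):
    """Traverse the list into a Python list (for inspection)."""
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out
```

The list is sorted after every individual insertion, so the "maintained in sorted order" requirement holds throughout, and the total cost is $O(n \log n) + O(n) = O(n \log n)$.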

A practical reason to do this, rather than insert the elements then sort, would be if the linked list object is shared with another thread that requires it to always be sorted. (In such a scenario, you'd need to ensure that inserting one element is atomic.) So this question isn't just making strange requirements for the sake of being strange. It's the sort of requirements that come up often in the real world of programming.

Another solution with the same complexity would be to insert the elements into the target list as they come, and maintain a parallel data structure mapping element values to node pointers in the target list. To insert each element, find the preceding element in the mapping, and insert the new element after this node. This assumes that the insertion process creates the list nodes as it goes (as opposed to filling existing blank nodes).
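A sketch of that idea follows. Python's standard library has no balanced search tree, so this illustration uses `bisect` over parallel arrays as the value-to-node mapping; note that `list.insert` makes each index update $O(n)$ here, whereas a balanced BST would give the $O(\log n)$ per-element cost the argument requires.

```python
import bisect

class Node:
    """Minimal singly linked list node (hypothetical helper for this sketch)."""
    def __init__(self, value, nxt=None):
        self.value = value
        self.next = nxt

def insert_all(elements):
    """Insert elements as they arrive, keeping an auxiliary index
    mapping element values to their nodes in the target list."""
    head = None
    keys, nodes = [], []                    # parallel arrays: sorted values -> nodes
    for x in elements:
        i = bisect.bisect_left(keys, x)     # position of x among inserted values
        node = Node(x)
        if i == 0:                          # no predecessor: new head
            node.next = head
            head = node
        else:
            pred = nodes[i - 1]             # node holding the predecessor value
            node.next = pred.next           # splice in right after the predecessor
            pred.next = node
        keys.insert(i, x)                   # keep the index in step with the list
        nodes.insert(i, node)
    return head

def to_list(head):
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out
```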

This question is more about reading comprehension than about algorithms. The way it's worded, it's a bit of a trick question. It's somewhat poorly worded because it relies on precise reading, but fails to state some key assumptions, such as the fact that obtaining the elements to insert costs $O(n)$, comparing two elements can be done in $O(1)$, and the input domain is effectively unbounded (exercise: come up with an $O(n)$ algorithm if the inputs are integers in the range $[1,42]$). But the given answer is correct.

You made the assumption that there's no way to use an auxiliary data structure. Nothing in the problem statement forbids using auxiliary data structures. A simple way to forbid auxiliary data structures would be to require $O(1)$ memory overhead.

Note that even under this assumption, your reasoning is wrong, or at least imprecise. If you happened to know that the elements are given in the correct order, you could maintain a pointer to the tail of the list and keep inserting there, which would take $O(n)$. The worst case is not when every element has to be inserted at the last position in the target list, but when it has to be inserted at the last position reached when traversing the list in some way. The worst case is indeed $\Theta(n^2)$, but to prove this, you have to prove that finding the insertion point in the list takes $\Theta(n)$ time, and this requires proving that the distance from any pointer you have into the list is bounded below by $\Omega(n)$. This is the case if you have a constant number $A$ of pointers (you implicitly assumed $A=1$, with a single pointer at the start of the list), so that you need to traverse at least $k/A$ nodes after $k$ insertions in the worst case.

OTHER TIPS

The best structure I know of is the Fibonacci heap: you can insert elements in $O(1)$ and extract the minimum in $O(\log n)$. This means that if you need a sorted order of all elements, it takes you $O(n \log n)$, while inserting new elements only costs you $O(1)$. I know of no other structure that can keep up with this.
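Python's standard library has no Fibonacci heap; `heapq` is a binary heap, so insertion is $O(\log n)$ rather than $O(1)$ amortized. Still, it illustrates the same workflow, namely cheap insertions with the $O(n \log n)$ cost paid only when the sorted order is actually needed:

```python
import heapq

def sorted_order(elements):
    """Push everything into a heap, then pop minima in order.
    With heapq each push is O(log n); a Fibonacci heap would
    make each push O(1) amortized, with the same extraction cost."""
    heap = []
    for x in elements:
        heapq.heappush(heap, x)     # cheap insertion
    # n extract-min operations: O(n log n) total
    return [heapq.heappop(heap) for _ in range(len(heap))]
```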

It really is a tricky question. First of all, the $O(n \log n)$ bound applies only to algorithms that sort by comparing elements (comparison sorts). There are also non-comparative algorithms, such as radix sort, whose complexity depends on the number of bits needed to store the numbers in memory. So if we assume that we can sort the numbers beforehand with any algorithm, then we can also assume that the numbers are naturals and the maximum element is $M < 10$; with radix sort you would get a worst case of $O(10n) = O(n)$. If we cannot make any assumptions, then you are right. If you are only allowed to use linked lists and nothing more (no indexing of any kind), then the complexity is $O(n^2)$ (as with bubble sort).
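For the bounded-domain case, a counting sort sketch (assuming the inputs are integers in a known fixed range, as in the $[1, 42]$ exercise mentioned above):

```python
def counting_sort(elements, lo=1, hi=42):
    """O(n + R) where R = hi - lo + 1 is the domain size;
    when the range is fixed (e.g. integers in [1, 42]), this is O(n)."""
    counts = [0] * (hi - lo + 1)
    for x in elements:
        counts[x - lo] += 1         # tally each value: O(n)
    out = []
    for v, c in enumerate(counts):  # emit values in order: O(R + n)
        out.extend([v + lo] * c)
    return out
```

The sorted output can then be appended to the linked list one node at a time via a tail pointer, keeping the whole process linear.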

Each insertion should be $O(n)$. Follow this algorithm:

1) If Linked list is empty then make the node as head and return it.

2) If the value of the node to be inserted is smaller than the value of the head node, then insert the node at the start and make it head.

3) In a loop, find the appropriate node after which the input node is to be inserted.

To find the appropriate node, start from the head and keep moving until you reach a node whose value is greater than that of the input node. The node just before that is the appropriate node.

4) Insert the node after the appropriate node found in step 3.
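The steps above can be sketched as follows (the minimal `Node` class is an assumption of this illustration):

```python
class Node:
    """Minimal singly linked list node (hypothetical helper for this sketch)."""
    def __init__(self, value, nxt=None):
        self.value = value
        self.next = nxt

def sorted_insert(head, node):
    """Insert `node` into the sorted list starting at `head`; returns the new head.
    Each call is O(n) in the worst case, so n calls cost O(n^2) overall."""
    # 1) Empty list: the new node becomes the head.
    if head is None:
        return node
    # 2) Smaller than the head: insert at the start and make it the head.
    if node.value < head.value:
        node.next = head
        return node
    # 3) Walk the list until the next node's value exceeds the new value.
    cur = head
    while cur.next is not None and cur.next.value <= node.value:
        cur = cur.next
    # 4) Splice the node in after the appropriate node found in step 3.
    node.next = cur.next
    cur.next = node
    return head
```

Note that while each individual insertion is $O(n)$, repeating it for $n$ elements gives the $O(n^2)$ worst case discussed in the accepted solution.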

Licensed under: CC-BY-SA with attribution
Not affiliated with cs.stackexchange