Question

Could it happen that, for certain pivot values, the complexity of quicksort is logarithmic?


Solution 2

No.

The very first partition, regardless of the pivot value you pick, will take O(n) time - you need to compare every other element to the pivot.

After this you recurse and do the same for both partitions - the optimal choice of pivots (when each pivot is the median of the range) ends up with a complexity of O(n log n). The worst choice of pivots (when each pivot is the largest or smallest value in the range) ends up with a complexity of O(n^2).
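The structure described above can be sketched with a toy, out-of-place quicksort in Python (not a tuned in-place Lomuto/Hoare implementation) - note that the partition step alone already compares every other element to the pivot, so each level of recursion costs O(n):

```python
def quicksort(arr):
    """Plain quicksort: partition around a pivot, then recurse on both sides.

    The partition (the three list comprehensions) compares every other
    element to the pivot, so even the very first call costs O(n).
    """
    if len(arr) <= 1:
        return list(arr)
    pivot = arr[len(arr) // 2]  # pivot choice affects recursion depth, not the O(n) partition
    smaller = [x for x in arr if x < pivot]
    equal = [x for x in arr if x == pivot]
    larger = [x for x in arr if x > pivot]
    return quicksort(smaller) + equal + quicksort(larger)

print(quicksort([3, 6, 1, 8, 2, 9, 4]))  # → [1, 2, 3, 4, 6, 8, 9]
```

With a median pivot both sides have half the elements, giving O(log n) levels of O(n) work each; with a min/max pivot one side has n-1 elements, giving O(n) levels and O(n^2) total.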

Other tips

No.

Sorting using comparison ops (and quicksort is such an algorithm) is proven to be an Ω(n log n) operation - thus it cannot be done better than O(n log n), and definitely not in O(log n).


Sorting is Omega(nlogn) using comparisons because each comparison has only 2 (or 3) possible results with which to distinguish inputs. Since there are n! possible permutations of the data, you need Omega(log(n!)) comparison ops, and log(n!) = Theta(nlog(n)) by Stirling's approximation - so sorting is Omega(nlog(n)).
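The counting argument above can be checked numerically - log2(n!) tracks n*log2(n) closely as n grows (using `math.lgamma` to compute log(n!) without overflow):

```python
import math

# Information-theoretic bound: a comparison tree that distinguishes all n!
# permutations needs height >= log2(n!), and log2(n!) ~ n*log2(n) (Stirling).
for n in (10, 100, 1000, 10000):
    log2_fact = math.lgamma(n + 1) / math.log(2)  # log2(n!) via the log-gamma function
    print(n, round(log2_fact), round(n * math.log2(n)))
```

The ratio log2(n!) / (n*log2(n)) approaches 1 from below, which is exactly the Omega(n log n) bound.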


Note that quicksort can achieve a Theta(n) best case by first checking whether the list is already sorted and aborting if it is, but no case can beat that linear scan, and you definitely cannot reach logarithmic complexity for any case.

No. In order for a sorting algorithm to be correct (i.e. for every possible input it always produces the correct output), a comparison sorting algorithm like Quicksort MUST make at least n*log(n) comparisons in the worst case. If it does not, then there is some input for which it cannot determine the correct output. This is most easily visualized with the decision tree model, where each leaf corresponds to a correct output of the algorithm.

From the famous CLRS textbook p193:

The length of the longest simple path from the root of a decision tree to any of its reachable leaves represents the worst-case number of comparisons that the corresponding sorting algorithm performs. Consequently, the worst-case number of comparisons for a given comparison sort algorithm equals the height of its decision tree. A lower bound on the heights of all decision trees in which each permutation appears as a reachable leaf is therefore a lower bound on the running time of any comparison sort algorithm.

The height of the decision tree is Omega(n*log(n)), since it must have at least n! reachable leaves.

No. The concept of asymptotic complexity is detached from special cases; that's the whole point.

No sorting algorithm can sort while performing fewer than Ω(n) operations, since the algorithm has to read the entire input. Here is a lower bound using the adversary argument:

If an algorithm doesn't read some part of the input, then we can change that part without affecting the execution of the algorithm. As a result we can change the correct answer, but the algorithm will not notice the change, since it never reads that part.

But quicksort does not achieve this lower bound. No comparison-based algorithm can have a worst-case (or even average-case) running time of O(n): all of them require Ω(n lg n) comparison operations. This can be shown using a decision-tree lower-bound argument.

The best case complexity of the QuickSort algorithm (which implies that the pivot is chosen optimally) is Ω(n lg n) as Dukeling has explained.

It is possible to modify algorithms to hardcode the answer for particular kinds of inputs (like already-sorted arrays) and thereby improve the best-case running time. How much depends on how fast we can check whether the input belongs to one of those special cases and how fast we can output the answer for it. We can check whether an array is already sorted, and output it, in linear time, so we can improve the best-case complexity of any sorting algorithm to O(n) by hardcoding this special case.
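The sorted-input shortcut described above can be sketched in Python (again a toy, out-of-place quicksort, with `is_sorted` and `quicksort_with_check` as illustrative names):

```python
def is_sorted(arr):
    # One linear pass: n-1 comparisons at most.
    return all(arr[i] <= arr[i + 1] for i in range(len(arr) - 1))

def quicksort_with_check(arr):
    # Hardcoding the already-sorted special case gives an O(n) best case
    # without changing the O(n log n) average or O(n^2) worst case.
    if is_sorted(arr):
        return list(arr)
    pivot = arr[len(arr) // 2]
    return (quicksort_with_check([x for x in arr if x < pivot])
            + [x for x in arr if x == pivot]
            + quicksort_with_check([x for x in arr if x > pivot]))
```

On an already-sorted array the check succeeds immediately and the function returns after a single linear scan; on any other input the extra check only adds O(n) per call and the usual quicksort bounds still apply.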

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow