Comparison sort

  1. Examples

  2. Performance limits and advantages of different sorting techniques

  3. Alternatives

  4. Number of comparisons required to sort a list

     Lower bound for the average number of comparisons 

  5. Notes

  6. References

A comparison sort is a type of sorting algorithm that only reads the list elements through a single abstract comparison operation (often a "less than or equal to" operator or a three-way comparison) that determines which of two elements should occur first in the final sorted list. The only requirement is that the operator forms a total preorder over the data, with:

  1. if a ≤ b and b ≤ c then a ≤ c (transitivity)
  2. for all a and b, a ≤ b or b ≤ a (connexity).

It is possible that both a ≤ b and b ≤ a; in this case either may come first in the sorted list. In a stable sort, the input order determines the sorted order in this case.
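
For illustration (a minimal Python sketch, not taken from the article), the following merge sort reads elements only through a single leq predicate. Any operator satisfying the two conditions above can be supplied, and taking the left element on ties keeps the sort stable:

    def merge_sort(items, leq):
        """Sort using only the comparison `leq`; ties keep the earlier (left) element first."""
        if len(items) <= 1:
            return list(items)
        mid = len(items) // 2
        left = merge_sort(items[:mid], leq)
        right = merge_sort(items[mid:], leq)
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if leq(left[i], right[j]):   # on a tie, the left element wins -> stability
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        return merged + left[i:] + right[j:]

    print(merge_sort([3, 1, 2, 1], lambda a, b: a <= b))   # -> [1, 1, 2, 3]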

A metaphor for thinking about comparison sorts is that someone has a set of unlabelled weights and a balance scale. Their goal is to line up the weights in order by their weight without any information except that obtained by placing two weights on the scale and seeing which one is heavier (or if they weigh the same).

Examples

Some of the most well-known comparison sorts include:

  • Quicksort
  • Heapsort
  • Shellsort
  • Merge sort
  • Introsort
  • Insertion sort
  • Selection sort
  • Bubble sort
  • Odd–even sort
  • Cocktail shaker sort
  • Cycle sort
  • Merge insertion (Ford–Johnson) sort
  • Smoothsort
  • Timsort

Performance limits and advantages of different sorting techniques

There are fundamental limits on the performance of comparison sorts. A comparison sort must have an average-case lower bound of Ω(n log n) comparison operations,[1] which is known as linearithmic time. This is a consequence of the limited information available through comparisons alone — or, to put it differently, of the vague algebraic structure of totally ordered sets. In this sense, mergesort, heapsort, and introsort are asymptotically optimal in terms of the number of comparisons they must perform, although this metric neglects other operations. Non-comparison sorts (such as the examples discussed below) can achieve O(n) performance by using operations other than comparisons, allowing them to sidestep this lower bound (assuming elements are constant-sized).

Comparison sorts may run faster on some lists; many adaptive sorts such as insertion sort run in O(n) time on an already-sorted or nearly-sorted list. The Ω(n log n) lower bound applies only to the case in which the input list can be in any possible order.
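
Both effects can be observed by counting comparisons directly. The sketch below (illustrative only; it relies on Python's built-in sort, Timsort, which is an adaptive comparison sort) wraps each element in a class whose < operator increments a counter: sorting a random list takes on the order of n log2 n comparisons, while an already-sorted list takes only about n:

    import math
    import random

    class Counted:
        """Wrap a value so that every comparison made by the sort is counted."""
        count = 0
        def __init__(self, value):
            self.value = value
        def __lt__(self, other):
            Counted.count += 1
            return self.value < other.value

    def comparisons_used(values):
        Counted.count = 0
        sorted(Counted(v) for v in values)   # Python's sort only calls __lt__
        return Counted.count

    n = 10_000
    print("random:", comparisons_used(random.sample(range(n), n)),
          " n log2 n =", round(n * math.log2(n)))
    print("sorted:", comparisons_used(range(n)), " n =", n)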

Real-world measures of sorting speed may need to take into account the ability of some algorithms to make optimal use of relatively fast cached computer memory, or the application may benefit from sorting methods in which sorted data begins to appear to the user quickly (so that the user's reading speed becomes the limiting factor), as opposed to methods in which no output is available until the whole list is sorted.

Despite these limitations, comparison sorts offer the notable practical advantage that control over the comparison function allows sorting of many different datatypes and fine control over how the list is sorted. For example, reversing the result of the comparison function allows the list to be sorted in reverse; and one can sort a list of tuples in lexicographic order by just creating a comparison function that compares each part in sequence:

    function tupleCompare((lefta, leftb, leftc), (righta, rightb, rightc))
        if lefta ≠ righta
            return compare(lefta, righta)
        else if leftb ≠ rightb
            return compare(leftb, rightb)
        else
            return compare(leftc, rightc)

Balanced ternary notation allows comparisons to be made in one step, with a result that is one of "less than", "greater than", or "equal to".
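
In Python, for example (an illustrative sketch, not part of the original article), such a three-way comparison function can be handed to the built-in sort through functools.cmp_to_key. The helper below generalizes the tupleCompare pseudocode to tuples of any length, and negating its result sorts in reverse:

    from functools import cmp_to_key

    def compare(a, b):
        """Three-way comparison: negative, zero or positive."""
        return (a > b) - (a < b)

    def tuple_compare(left, right):
        """Lexicographic order: compare each part in sequence, as in tupleCompare above."""
        for l, r in zip(left, right):
            if l != r:
                return compare(l, r)
        return 0

    data = [(1, "b", 2), (1, "a", 9), (0, "z", 5)]
    print(sorted(data, key=cmp_to_key(tuple_compare)))
    # Negating the comparison's result reverses the order:
    print(sorted(data, key=cmp_to_key(lambda a, b: -tuple_compare(a, b))))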

Comparison sorts generally adapt more easily to complex orders such as the order of floating-point numbers. Additionally, once a comparison function is written, any comparison sort can be used without modification; non-comparison sorts typically require specialized versions for each datatype.

This flexibility, together with the efficiency of the above comparison sorting algorithms on modern computers, has led to widespread preference for comparison sorts in most practical work.

Alternatives

Some sorting problems admit a strictly faster solution than the Ω(n log n) bound for comparison sorting; an example is integer sorting, where all keys are integers. When the keys form a small (compared to n) range, counting sort is an example algorithm that runs in linear time. Other integer sorting algorithms, such as radix sort, are not asymptotically faster than comparison sorting, but can be faster in practice.
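
For instance (a minimal sketch assuming non-negative integer keys no larger than a known bound k), counting sort tallies how many times each key occurs and rebuilds the list from those tallies in O(n + k) time, performing no comparisons between keys:

    def counting_sort(keys, k):
        """Sort non-negative integers in the range 0..k without comparing them."""
        counts = [0] * (k + 1)
        for key in keys:                     # tally each key: O(n)
            counts[key] += 1
        output = []
        for key, c in enumerate(counts):     # emit each key as often as it occurred: O(n + k)
            output.extend([key] * c)
        return output

    print(counting_sort([4, 1, 3, 1, 0, 4], k=4))   # -> [0, 1, 1, 3, 4, 4]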

The problem of sorting pairs of numbers by their sum is not subject to the Ω(n² log n) bound either (the square resulting from the pairing up); the best known algorithm still takes O(n² log n) time, but only O(n²) comparisons.

Number of comparisons required to sort a list

The following table compares the lower bound ⌈log2(n!)⌉ with the actual minimum number of comparisons (from OEIS A036604) required to sort a list of n items in the worst case:

     n   ⌈log2(n!)⌉   Minimum number of comparisons
     1        0        0
     2        1        1
     3        3        3
     4        5        5
     5        7        7
     6       10       10
     7       13       13
     8       16       16
     9       19       19
    10       22       22
    11       26       26
    12       29       30 [2][3]
    13       33       34 [4][5][6]
    14       37       38 [6]
    15       41       42 [7][8][9]
    16       45       45 or 46 [10]
    17       49       49 or 50
    18       53       53 or 54
    19       57       58 [9]
    20       62       62
    21       66       66
    22       70       71 [6]

Using Stirling's approximation, this lower bound is well approximated by n log2 n − n/ln 2:

            n   ⌈log2(n!)⌉   n log2 n − n/ln 2
           10           22                  19
          100          525                 521
        1 000        8 530               8 524
       10 000      118 459             118 451
      100 000    1 516 705           1 516 695
    1 000 000   18 488 885          18 488 874

The number of comparisons that a comparison sort algorithm requires increases in proportion to n log n, where n is the number of elements to sort. This bound is asymptotically tight.

Given a list of distinct numbers (we can assume this because this is a worst-case analysis), there are n! permutations, exactly one of which is the list in sorted order. The sort algorithm must gain enough information from the comparisons to identify the correct permutation. If the algorithm always completes after at most f(n) steps, it cannot distinguish more than 2^f(n) cases because the keys are distinct and each comparison has only two possible outcomes. Therefore,

    2^f(n) ≥ n!, or equivalently f(n) ≥ log2(n!).

By looking at the first n/2 factors of n! = n(n − 1)⋯1, each of which is at least n/2, we obtain

    log2(n!) ≥ log2((n/2)^(n/2)) = (n/2)(log2 n − 1) = Θ(n log n).

This provides the lower-bound part of the claim. A better bound can be given via Stirling's approximation.
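
Concretely (a standard expansion, spelled out here for completeness), taking base-2 logarithms of Stirling's formula gives

    \log_2(n!) \;=\; n \log_2 n \;-\; n \log_2 e \;+\; O(\log n) \;=\; n \log_2 n \;-\; \frac{n}{\ln 2} \;+\; O(\log n),

and since 1/ln 2 ≈ 1.4427, this is the approximation n log2 n − 1.4427n used in the second table above.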

An identical upper bound follows from the existence of algorithms that attain this bound in the worst case, such as heapsort and mergesort.

The above argument provides an absolute, rather than only asymptotic, lower bound on the number of comparisons, namely ⌈log2(n!)⌉ comparisons. This lower bound is fairly good (it can be approached within a linear tolerance by a simple merge sort), but it is known to be inexact. For example, ⌈log2(13!)⌉ = 33, but the minimal number of comparisons to sort 13 elements has been proved to be 34.
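
Both the bound and the n = 13 example can be checked with a few lines of Python (an illustration using only the standard library):

    import math

    def lower_bound(n):
        """⌈log2(n!)⌉, the information-theoretic lower bound on worst-case comparisons."""
        return math.ceil(math.log2(math.factorial(n)))

    for n in (10, 13, 16, 22):
        print(n, lower_bound(n))   # 10 -> 22, 13 -> 33, 16 -> 45, 22 -> 70
    # Sorting 13 elements is nevertheless known to need 34 comparisons in the worst case.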

Determining the exact number of comparisons needed to sort a given number of entries is a computationally hard problem even for small n, and no simple formula for the solution is known. For some of the few concrete values that have been computed, see OEIS A036604.

Lower bound for the average number of comparisons

A similar bound applies to the average number of comparisons. Assuming that

  • all keys are distinct, i.e. every comparison will give either a>b or a<b, and
  • the input is a random permutation, chosen uniformly from the set of all possible permutations of n elements,

it is impossible to determine which order the input is in with fewer than log2(n!) comparisons on average.

This can be most easily seen using concepts from information theory. The Shannon entropy of such a random permutation is log2(n!) bits. Since a comparison can give only two results, the maximum amount of information it provides is 1 bit. Therefore, after k comparisons the remaining entropy of the permutation, given the results of those comparisons, is at least log2(n!) − k bits on average. To perform the sort, complete information is needed, so the remaining entropy must be 0. It follows that k must be at least log2(n!).

This differs from the worst case argument given above in that it does not allow rounding up to the nearest integer. For example, for n = 3, the lower bound for the worst case is 3, the lower bound for the average case as shown above is approximately 2.58, while the highest lower bound for the average case is 8/3, approximately 2.67.
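
These figures for n = 3 can be reproduced by brute force (an illustrative sketch, not part of the original article): counting the comparisons a plain insertion sort makes on each of the 3! = 6 possible input orders gives an average of exactly 8/3 ≈ 2.67, above the information-theoretic log2(3!) ≈ 2.58:

    import itertools
    import math

    def insertion_sort_comparisons(items):
        """Number of comparisons a plain insertion sort performs on `items`."""
        a = list(items)
        comparisons = 0
        for i in range(1, len(a)):
            j = i
            while j > 0:
                comparisons += 1
                if a[j - 1] <= a[j]:          # already in order: stop scanning
                    break
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
        return comparisons

    counts = [insertion_sort_comparisons(p) for p in itertools.permutations(range(3))]
    print(sum(counts) / len(counts), 8 / 3, math.log2(math.factorial(3)))
    # 2.666..., 2.666..., 2.585...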

In the case that multiple items may have the same key, there is no obvious statistical interpretation for the term "average case", so an argument like the above cannot be applied without making specific assumptions about the distribution of keys.

Notes

1. ^Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, Clifford Stein, Introduction to Algorithms, 3rd edition, MIT Press, 2009, pp. 191–193.
2. ^Mark Wells, Applications of a language for computing in combinatorics, Information Processing 65 (Proceedings of the 1965 IFIP Congress), 497–498, 1966.
3. ^Mark Wells, Elements of Combinatorial Computing, Pergamon Press, Oxford, 1971.
4. ^Takumi Kasai, Shusaku Sawato, Shigeki Iwata, Thirty four comparisons are required to sort 13 items, LNCS 792, 260–269, 1994.
5. ^Marcin Peczarski, Sorting 13 elements requires 34 comparisons, LNCS 2461, 785–794, 2002.
6. ^Marcin Peczarski, New results in minimum-comparison sorting, Algorithmica 40 (2), 133–145, 2004.
7. ^Marcin Peczarski, Computer assisted research of posets, PhD thesis, University of Warsaw, 2006.
8. ^Marcin Peczarski, The Ford-Johnson algorithm is still unbeaten for less than 47 elements, Information Processing Letters 101 (3), 126–128, 2007. doi:10.1016/j.ipl.2006.09.001.
9. ^Weiyi Cheng, Xiaoguang Liu, Gang Wang, Jing Liu, The results of S(15) and S(19) to minimum-comparison sorting problem (in Chinese; original title: 最少比较排序问题中S(15)和S(19)的解决), Journal of Frontiers of Computer Science and Technology 1 (3), 305–313, October 2007. http://fcst.ceaj.org/EN/abstract/abstract47.shtml
10. ^Marcin Peczarski, Towards Optimal Sorting of 16 Elements, Acta Universitatis Sapientiae 4 (2), 215–224, 2011. arXiv:1108.0866.

References

  • Donald Knuth. The Art of Computer Programming, Volume 3: Sorting and Searching, Second Edition. Addison-Wesley, 1997. ISBN 0-201-89685-0. Section 5.3.1: Minimum-Comparison Sorting, pp. 180–197.
