[Java] Summary of Nine Sorting Algorithms (Complexity and Application Scenarios)


1. Bubble sort: regardless of the input order, a plain bubble sort always performs n(n-1)/2 comparisons, so its best, worst, and average time complexity are all O(n²). Only one temporary variable is needed to swap two array elements, so the space complexity is O(1).

    Optimization: use a flag to record whether a pass made any swap; if the array is already ordered, exit after a single pass, giving a best-case time complexity of O(n).
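A minimal Java sketch of the flagged bubble sort described above (class and method names are illustrative):

```java
import java.util.Arrays;

public class BubbleSortDemo {
    // Bubble sort with an "any swaps?" flag: if a full pass makes no swap,
    // the array is already ordered and we stop early (best case O(n)).
    static void bubbleSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            boolean swapped = false;
            for (int j = 0; j < a.length - 1 - i; j++) {
                if (a[j] > a[j + 1]) {
                    int tmp = a[j];          // one temporary variable: O(1) space
                    a[j] = a[j + 1];
                    a[j + 1] = tmp;
                    swapped = true;
                }
            }
            if (!swapped) break;             // no swaps in this pass: done
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 1, 4, 2, 8};
        bubbleSort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 4, 5, 8]
    }
}
```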

2. Selection sort is an improvement on bubble sort (it swaps at most once per pass), but it still performs n(n-1)/2 comparisons regardless of the input order, so its best, worst, and average time complexity are all O(n²). One temporary variable is needed for the swap, so the space complexity is O(1).
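A short Java sketch of selection sort as described (names are illustrative): each pass finds the minimum of the unsorted suffix and swaps it into place, so there is at most one swap per pass.

```java
import java.util.Arrays;

public class SelectionSortDemo {
    // Selection sort: find the minimum of a[i..n-1] and swap it to position i.
    // Always n(n-1)/2 comparisons; O(1) extra space.
    static void selectionSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++) {
                if (a[j] < a[min]) min = j;
            }
            int tmp = a[i];                  // at most one swap per pass
            a[i] = a[min];
            a[min] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] a = {3, 7, 1, 9, 2};
        selectionSort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 7, 9]
    }
}
```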

3. Insertion sort: if the sequence is already fully ordered, insertion sort needs only n-1 comparisons and no moves, so the time complexity is O(n); if the sequence is in reverse order, it needs O(n²) comparisons and O(n²) moves. The average and worst cases are therefore O(n²) and the best case is O(n). Only one auxiliary slot is needed during sorting, so the space complexity is O(1).
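A compact Java sketch of the insertion sort described above (names are illustrative): on sorted input the inner while loop never runs, which is where the O(n) best case comes from.

```java
import java.util.Arrays;

public class InsertionSortDemo {
    // Insertion sort: shift larger elements one slot right, then drop the key
    // into its place. Sorted input: n-1 comparisons, zero moves -> O(n).
    static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];                  // one auxiliary slot: O(1) space
            int j = i - 1;
            while (j >= 0 && a[j] > key) {
                a[j + 1] = a[j];             // shift right
                j--;
            }
            a[j + 1] = key;
        }
    }

    public static void main(String[] args) {
        int[] a = {4, 3, 2, 10, 12, 1};
        insertionSort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 4, 10, 12]
    }
}
```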

4. Quicksort: the best case is O(n log n) and the average is O(n log n); the worst case is an already ordered sequence (with a naive pivot choice), where the time complexity degrades to O(n²). The space complexity can be understood as quicksort's recursion depth, since recursion relies on the call stack: on average the recursion is log n deep, so the average space complexity is O(log n).
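A minimal Java sketch of quicksort (using Lomuto partitioning with the last element as pivot, which is one common textbook variant; names are illustrative). Note this pivot choice is exactly the one that degrades to O(n²) on already sorted input, as the text warns.

```java
import java.util.Arrays;

public class QuickSortDemo {
    // Quicksort, Lomuto partition: pivot = a[hi]; elements < pivot move left,
    // then the pivot is swapped into its final slot. Average depth: log n.
    static void quickSort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        int pivot = a[hi], i = lo;
        for (int j = lo; j < hi; j++) {
            if (a[j] < pivot) {
                int t = a[i]; a[i] = a[j]; a[j] = t;
                i++;
            }
        }
        int t = a[i]; a[i] = a[hi]; a[hi] = t;   // place pivot
        quickSort(a, lo, i - 1);
        quickSort(a, i + 1, hi);
    }

    public static void main(String[] args) {
        int[] a = {9, 4, 6, 2, 7};
        quickSort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a)); // [2, 4, 6, 7, 9]
    }
}
```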

5. Merge sort requires a temporary array temp[] to store the result of each merge, so the space complexity is O(n); the time complexity is O(n log n). The space complexity can be reduced from O(n) to O(1), but the time complexity then rises from O(n log n) to O(n²).
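A Java sketch of top-down merge sort with the single shared temp[] array the text mentions (names are illustrative):

```java
import java.util.Arrays;

public class MergeSortDemo {
    // Top-down merge sort sharing one temp array across all merges: O(n) space.
    static void mergeSort(int[] a, int[] temp, int lo, int hi) {
        if (lo >= hi) return;
        int mid = (lo + hi) / 2;
        mergeSort(a, temp, lo, mid);
        mergeSort(a, temp, mid + 1, hi);
        // Merge the two sorted halves a[lo..mid] and a[mid+1..hi] via temp.
        int i = lo, j = mid + 1, k = lo;
        while (i <= mid && j <= hi) temp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
        while (i <= mid) temp[k++] = a[i++];
        while (j <= hi)  temp[k++] = a[j++];
        System.arraycopy(temp, lo, a, lo, hi - lo + 1);
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 8, 1, 9, 3};
        mergeSort(a, new int[a.length], 0, a.length - 1);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 5, 8, 9]
    }
}
```

Taking `a[i] <= a[j]` (rather than `<`) keeps equal keys in their original order, which is what makes this merge stable.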

6. Heap sort's time complexity comes mainly from two phases: building the initial heap, and rebuilding the heap after each extraction of the maximum. Building the initial heap takes O(n); the rebuilds after each extraction take O(n log n) in total. So heap sort's best, worst, and average time complexity are all O(n log n). Heap sort sorts in place, with constant O(1) space complexity.
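A Java sketch of in-place heap sort showing both phases from the paragraph above (names are illustrative): the O(n) bottom-up heap construction, then the repeated swap-and-rebuild loop.

```java
import java.util.Arrays;

public class HeapSortDemo {
    // Restore the max-heap property for the subtree rooted at i, within a[0..n-1].
    static void siftDown(int[] a, int i, int n) {
        while (2 * i + 1 < n) {
            int child = 2 * i + 1;
            if (child + 1 < n && a[child + 1] > a[child]) child++; // larger child
            if (a[i] >= a[child]) break;
            int t = a[i]; a[i] = a[child]; a[child] = t;
            i = child;
        }
    }

    static void heapSort(int[] a) {
        // Phase 1: build the heap bottom-up in O(n).
        for (int i = a.length / 2 - 1; i >= 0; i--) siftDown(a, i, a.length);
        // Phase 2: swap the max to the end, shrink the heap, rebuild in O(log n).
        for (int end = a.length - 1; end > 0; end--) {
            int t = a[0]; a[0] = a[end]; a[end] = t;
            siftDown(a, 0, end);
        }
    }

    public static void main(String[] args) {
        int[] a = {4, 10, 3, 5, 1};
        heapSort(a);
        System.out.println(Arrays.toString(a)); // [1, 3, 4, 5, 10]
    }
}
```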

7. Shell sort's time complexity depends on the gap (increment) sequence, and for some increment sequences the complexity has not yet been proven; it is enough to remember the conclusions. The sequence {1, 2, 4, 8, ...} is not a good increment sequence: its worst-case time complexity is O(n²). Hibbard proposed another increment sequence, {1, 3, 7, ..., 2^k - 1}, whose worst-case time complexity is O(n^1.5). Sedgewick proposed several increment sequences with worst-case running time O(n^1.3), the best of which is {1, 5, 19, 41, 109, ...}. One temporary variable is needed for moving array elements, so the space complexity is O(1).
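A Java sketch of Shell sort (names are illustrative). For brevity this uses the simple n/2, n/4, ..., 1 gap sequence, a common textbook choice; per the discussion above, Hibbard's or Sedgewick's sequences give better worst-case bounds.

```java
import java.util.Arrays;

public class ShellSortDemo {
    // Shell sort: gapped insertion sort with shrinking gaps; the final
    // gap = 1 pass is a plain insertion sort on a nearly ordered array.
    static void shellSort(int[] a) {
        for (int gap = a.length / 2; gap > 0; gap /= 2) {
            for (int i = gap; i < a.length; i++) {
                int key = a[i];              // one temporary slot: O(1) space
                int j = i - gap;
                while (j >= 0 && a[j] > key) {
                    a[j + gap] = a[j];       // shift within the gap-subsequence
                    j -= gap;
                }
                a[j + gap] = key;
            }
        }
    }

    public static void main(String[] args) {
        int[] a = {23, 12, 1, 8, 34, 54, 2, 3};
        shellSort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 8, 12, 23, 34, 54]
    }
}
```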

8. Radix sort: for n records with radix r, one pass of distribution and collection takes O(n + r). If the keys have d digits, d passes must be performed, so the total time complexity is O(d(n + r)). The space complexity comes from the buckets used while distributing elements: O(r + n), i.e. O(n).
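A Java sketch of LSD radix sort for non-negative ints with radix r = 10 (names are illustrative): each digit pass is one distribute-and-collect round, matching the O(d(n + r)) analysis above.

```java
import java.util.Arrays;

public class RadixSortDemo {
    // LSD radix sort, base 10: one counting-sort pass per decimal digit.
    // Each pass distributes into r = 10 counts and collects back: O(n + r).
    static void radixSort(int[] a) {
        int max = 0;
        for (int v : a) max = Math.max(max, v);
        int[] out = new int[a.length];           // O(n) collection buffer
        for (int exp = 1; max / exp > 0; exp *= 10) {
            int[] count = new int[10];           // O(r) buckets
            for (int v : a) count[(v / exp) % 10]++;
            for (int i = 1; i < 10; i++) count[i] += count[i - 1];
            // Walk backwards so equal digits keep their order (stability).
            for (int i = a.length - 1; i >= 0; i--) {
                out[--count[(a[i] / exp) % 10]] = a[i];
            }
            System.arraycopy(out, 0, a, 0, a.length);
        }
    }

    public static void main(String[] args) {
        int[] a = {170, 45, 75, 90, 802, 24, 2, 66};
        radixSort(a);
        System.out.println(Arrays.toString(a)); // [2, 24, 45, 66, 75, 90, 170, 802]
    }
}
```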

9. Bucket sort: scatter the elements into buckets, sort each bucket, and concatenate the buckets in order. With n elements spread evenly over k buckets, the average time complexity is O(n + k); the worst case (all elements landing in one bucket) is O(n²), and the space complexity is O(n + k).
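A Java sketch of bucket sort for doubles in [0, 1) (names and the input range are illustrative assumptions): scatter by value into k buckets, sort each bucket, then concatenate.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class BucketSortDemo {
    // Bucket sort for values in [0, 1): bucket index = floor(v * k).
    // Each bucket is sorted individually, then the buckets are concatenated.
    static void bucketSort(double[] a) {
        int k = a.length;                        // one bucket per element
        List<List<Double>> buckets = new ArrayList<>();
        for (int i = 0; i < k; i++) buckets.add(new ArrayList<>());
        for (double v : a) buckets.get((int) (v * k)).add(v);   // scatter
        int idx = 0;
        for (List<Double> b : buckets) {
            Collections.sort(b);                 // sort within each bucket
            for (double v : b) a[idx++] = v;     // concatenate
        }
    }

    public static void main(String[] args) {
        double[] a = {0.42, 0.32, 0.23, 0.52, 0.25, 0.47};
        bucketSort(a);
        System.out.println(Arrays.toString(a)); // [0.23, 0.25, 0.32, 0.42, 0.47, 0.52]
    }
}
```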

Average time complexity:

O(n²) — the simple sorts: direct insertion, selection sort, bubble sort

O(n log n): quicksort, merge sort, heap sort

O(n) — linear: radix sort

Stability:

Mnemonic (translated from the Chinese original): a man named Shell quickly killed a group of people, directly selected a bulldozer, and piled them on the ground; since it was his first murder, his mood was very unstable.

That is: Shell sort, quicksort, selection sort, and heap sort are unstable.

Application scenarios:

  • If n is small (e.g. n ≤ 50), direct insertion sort or selection sort can be used.
  • If the input is already nearly ordered, direct insertion sort or bubble sort is best; for randomly ordered input, quicksort is a good choice. Insertion sort is very effective on partially ordered arrays, and on average it needs only about half as many comparisons as selection sort.
  • If n is large, use a method with O(n log n) time complexity: quicksort, heap sort, or merge sort. Merge sort can handle arrays of millions of elements or more, which insertion and selection sort cannot. Merge sort's main disadvantage is its auxiliary space, proportional to the size n of the array being sorted.
  • Quicksort is considered the best general-purpose comparison-based internal sort; when the keys to be sorted are randomly distributed, quicksort has the shortest average running time.
  • Heap sort needs less auxiliary space than quicksort and has no quicksort-style worst case. But both are unstable sorts.
  • If a stable sort is required, choose merge sort. Pairwise merging from single elements is not recommended on its own; it is usually combined with direct insertion sort: first use direct insertion sort to produce longer sorted runs, then merge those runs pairwise. Since direct insertion sort is stable, the improved merge sort remains stable.
  • Quicksort has the advantage of sorting in place (it needs only a small auxiliary stack), but pivot selection is an issue, and for small arrays quicksort is slower than insertion sort.
  • Heap sort's advantage is that it can treat the array being sorted as the heap itself, requiring no extra space. It is somewhat similar to selection sort but needs far fewer comparisons, so heap sort suits scenarios such as embedded systems or low-cost mobile devices with limited memory.

Origin: www.cnblogs.com/yaogungeduo/p/11245233.html