Sorting Algorithms & Analysis - When to Use Which Sort

For details on the individual sorting algorithms, see the sorting column on this blog.

History of Sorting Algorithms

Throughout the history of sorting algorithms, which ones have managed to reach a speed of O(n log n)? The list below surveys the common sorting algorithms and their complexities.

  • Bubble sort (Bubble Sort): Bubble sort is one of the simplest sorting algorithms. It repeatedly compares and swaps adjacent elements, gradually "bubbling" the largest (or smallest) element toward one end of the array. Its time complexity is O(n²), which is inefficient, but it is easy to understand and implement (see the sketch after this list).

  • Selection sort (Selection Sort): Selection sort is a simple and intuitive sorting algorithm. It builds the sorted sequence by repeatedly selecting the smallest (or largest) element of the unsorted part and placing it at the end of the sorted part. Its time complexity is also O(n²), but it performs fewer swaps than bubble sort.

  • Insertion sort (Insertion Sort): Insertion sort is a stable sorting algorithm. It builds a sorted sequence by inserting the elements of the unsorted part one by one into their proper positions within the sorted part. Its time complexity is O(n²), but it performs well on small or mostly ordered arrays.

  • Shell sort (Shell Sort): Shell sort is an improved version of insertion sort. It splits the array into multiple subsequences according to a gap, performs insertion sort on each subsequence, and finally performs an insertion sort on the entire array. Its time complexity lies between O(n) and O(n²), depending on the chosen gap sequence (see the sketch after this list).

  • Merge sort (Merge Sort): Merge sort is a divide-and-conquer algorithm. It recursively splits the array into two subarrays, sorts each, and then merges the two sorted subarrays into a single sorted array. Its time complexity is O(n log n), and it is a stable sorting algorithm.

  • Quick sort (Quick Sort): Quick sort is also a divide-and-conquer algorithm. It chooses a pivot element and partitions the array into two subarrays, one containing the elements smaller than the pivot and the other containing the elements greater than the pivot, then recursively sorts the two subarrays. Its average time complexity is O(n log n), but it can degrade to O(n²) in the worst case (see the sketch after this list).

  • Heap sort (Heap Sort): Heap sort uses the heap data structure. It builds a max-heap (or min-heap), swaps the top element with the last element, and re-heapifies the remaining elements, repeating this process until the entire array is sorted. Its time complexity is O(n log n), and it is an unstable sorting algorithm.

  • Counting sort (can this even be called sorting~) (Counting Sort): Counting sort is a non-comparison sorting algorithm. It sorts by counting occurrences and thereby determining the position of each element in the sorted sequence. Its time complexity is O(n + k), where k is the largest value in the array to be sorted. Counting sort is suitable when the range of element values is small (see the sketch after this list).

  • Bucket sort (Bucket Sort): Bucket sort is also a non-comparison sorting algorithm. It distributes the elements to be sorted into different buckets, sorts the elements within each bucket, and then concatenates the buckets in order. Its time complexity depends on the number of buckets and on the sorting algorithm used within each bucket.

  • Radix sort (Radix Sort): Radix sort is a non-comparison sorting algorithm. It sorts the elements digit by digit, from the least significant digit to the most significant. Each digit pass can use a stable sorting algorithm such as counting sort or bucket sort.
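
As a concrete reference for the descriptions above, here is a minimal bubble sort sketch in C++ (the function and variable names are illustrative, not from the original post):

```cpp
#include <utility>
#include <vector>

// Bubble sort: repeatedly swap adjacent out-of-order elements so the
// largest remaining value "bubbles" to the end. O(n^2) time overall.
void bubbleSort(std::vector<int>& a) {
    for (std::size_t i = 0; i + 1 < a.size(); ++i) {
        bool swapped = false;
        for (std::size_t j = 0; j + 1 < a.size() - i; ++j) {
            if (a[j] > a[j + 1]) {
                std::swap(a[j], a[j + 1]);
                swapped = true;
            }
        }
        if (!swapped) break;  // no swaps in a full pass: already sorted
    }
}
```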
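
A Shell sort sketch using the simple "halve the gap" sequence (the gap sequence is my own choice; other sequences give different complexities):

```cpp
#include <vector>

// Shell sort: gapped insertion sort with a shrinking gap. The final pass
// with gap = 1 is a plain insertion sort on an almost-ordered array.
void shellSort(std::vector<int>& a) {
    for (std::size_t gap = a.size() / 2; gap > 0; gap /= 2) {
        for (std::size_t i = gap; i < a.size(); ++i) {
            int key = a[i];
            std::size_t j = i;
            while (j >= gap && a[j - gap] > key) {  // shift within the gapped subsequence
                a[j] = a[j - gap];
                j -= gap;
            }
            a[j] = key;
        }
    }
}
```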
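
A quick sort sketch, using the last element as the pivot (the pivot choice is a simplification for illustration):

```cpp
#include <utility>
#include <vector>

// Quick sort: partition around a pivot, then recurse on the two halves.
// Average O(n log n); can degrade to O(n^2) with this naive pivot choice.
void quickSort(std::vector<int>& a, int lo, int hi) {
    if (lo >= hi) return;
    int pivot = a[hi];
    int i = lo;
    for (int j = lo; j < hi; ++j)
        if (a[j] < pivot) std::swap(a[i++], a[j]);
    std::swap(a[i], a[hi]);     // pivot lands at its final position i
    quickSort(a, lo, i - 1);
    quickSort(a, i + 1, hi);
}

// usage: quickSort(v, 0, static_cast<int>(v.size()) - 1);
```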
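
And a counting sort sketch for non-negative integers (assumes the values fall in a small range [0, k]):

```cpp
#include <algorithm>
#include <vector>

// Counting sort: count how many times each value occurs, then rebuild the
// array in order. O(n + k) time, where k is the maximum value.
void countingSort(std::vector<int>& a) {
    if (a.empty()) return;
    int k = *std::max_element(a.begin(), a.end());
    std::vector<int> count(k + 1, 0);
    for (int x : a) ++count[x];
    std::size_t idx = 0;
    for (int v = 0; v <= k; ++v)
        while (count[v]-- > 0) a[idx++] = v;
}
```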

Sorting Algorithm Analysis

Very fast sorts

We found that the very fast sorts, such as bucket sort and radix sort, have comparatively complicated code, so they are generally avoided unless you really need them.

Faster sorts

The faster sorts, such as merge sort and heap sort (quick sort is left off this list because it is too unstable!!!), also have fairly complicated code (not counting the priority-queue version of heap sort; if you do use a priority queue, accessing elements is inconvenient), so avoid them if you can.
Note: Sometimes a priority queue is convenient.
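
As a rough sketch of the priority-queue route mentioned above (assuming C++ and std::priority_queue; names are illustrative):

```cpp
#include <functional>
#include <queue>
#include <vector>

// "Heap sort" via a priority queue: push everything, then pop in order.
// std::priority_queue is a max-heap by default; std::greater<int> turns it
// into a min-heap so elements come out ascending. O(n log n) overall, but
// you cannot randomly access elements while they sit inside the queue.
std::vector<int> heapSortWithPQ(const std::vector<int>& a) {
    std::priority_queue<int, std::vector<int>, std::greater<int>> pq(a.begin(), a.end());
    std::vector<int> sorted;
    sorted.reserve(a.size());
    while (!pq.empty()) {
        sorted.push_back(pq.top());
        pq.pop();
    }
    return sorted;
}
```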

Medium sorts

The medium sorts, such as Shell sort and quick sort, are sometimes not fast enough, and their code still falls on the complex side.

Very slow sorts

The very slow sorts, such as bubble sort and selection sort, have short, easy-to-remember code, but they are really far too slow!!!

Results of the analysis

0. No special requirements

If there are no special requirements, heap sort via a priority queue is a good choice, and you can also simply use the sort function.
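
For reference, a minimal sketch of the sort-function route (assuming C++ and std::sort):

```cpp
#include <algorithm>
#include <vector>

int main() {
    std::vector<int> a = {5, 2, 9, 1, 7};
    // std::sort: O(n log n), not stable (use std::stable_sort if stability matters).
    std::sort(a.begin(), a.end());
    // For descending order, pass a comparator such as std::greater<int>().
    return 0;
}
```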

1. Requirements for speed

If speed is a requirement, the recommendation is again heap sort via a priority queue, or the sort function.

(Saying this is basically the same as saying nothing.)

2. Operations during sorting

If you need to perform extra operations during the sort, the slower sorting algorithms are recommended, because their code is easy to understand and modify.

3. Condition 1 & Condition 2

In this case, merge sort is the best choice!! It is an excellent sorting algorithm: it is stable and works well in most situations.
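
A minimal merge sort sketch in C++ (the function names and the temporary buffer are my own choices, not from the original post):

```cpp
#include <vector>

// Merge sort: recursively sort both halves, then merge them in order.
// O(n log n) time, stable (equal elements keep their relative order).
void mergeSort(std::vector<int>& a, std::vector<int>& tmp, int lo, int hi) {
    if (hi - lo <= 1) return;                // 0 or 1 element: already sorted
    int mid = lo + (hi - lo) / 2;
    mergeSort(a, tmp, lo, mid);
    mergeSort(a, tmp, mid, hi);
    int i = lo, j = mid, k = lo;
    while (i < mid && j < hi)
        tmp[k++] = (a[j] < a[i]) ? a[j++] : a[i++];  // take left on ties: stability
    while (i < mid) tmp[k++] = a[i++];
    while (j < hi)  tmp[k++] = a[j++];
    for (int t = lo; t < hi; ++t) a[t] = tmp[t];
}

void mergeSort(std::vector<int>& a) {
    std::vector<int> tmp(a.size());
    mergeSort(a, tmp, 0, static_cast<int>(a.size()));
}
```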

4. Operating on ordered data

Insertion sort is recommended here, because insertion sort naturally maintains an ordered array as it runs, which is both convenient and fast!
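
A minimal insertion sort sketch (C++, illustrative names):

```cpp
#include <vector>

// Insertion sort: grow a sorted prefix one element at a time by inserting
// each new element into its correct place. O(n^2) worst case, but fast on
// small or nearly ordered arrays, and the prefix a[0..i) stays sorted
// throughout, which is handy if you need to query it mid-sort.
void insertionSort(std::vector<int>& a) {
    for (std::size_t i = 1; i < a.size(); ++i) {
        int key = a[i];
        std::size_t j = i;
        while (j > 0 && a[j - 1] > key) {   // shift larger elements right
            a[j] = a[j - 1];
            --j;
        }
        a[j] = key;
    }
}
```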

5. Condition 1 & Condition 4

Insertion sort, optimized - a super algorithm that surpasses merge sort!!

For details, see my amazing blog post: The fastest insertion sort in history

Origin blog.csdn.net/DUXS11/article/details/132376602