https://www.cnblogs.com/onepixel/p/7674659.html this article is very good
https://www.bilibili.com/video/av685670?from=search&seid=1637373535603658338 this animation is excellent
https://www.icourse163.org/course/ZJU-93001 the Zhejiang University data structures MOOC (Ho Ming Chin) is very good
Bubble sort, selection sort, and insertion sort are the three slowest but most classic sorting algorithms
For large random inputs, quick sort is generally considered the fastest known sorting algorithm
Bubble sort
The simplest sorting algorithm, but also among the least efficient: elements are exchanged before their final positions are known, wasting many "exchange operations"
An already-sorted list is the best case: no exchanges occur during the traversal. If a pass detects that the list is already sorted, the algorithm can terminate early; this modified version is often called short bubble sort
Time complexity: average O(n^2), worst O(n^2), best O(n)
Space complexity: O(1)
Stable
# Test input array
alist = [27, 33, 28, 4, 2, 26, 13, 35, 8, 14]
# Bubble sort
for i in range(len(alist) - 1):
    for j in range(len(alist) - 1 - i):
        if alist[j] > alist[j + 1]:
            alist[j], alist[j + 1] = alist[j + 1], alist[j]
# Sorted result output
print('sorted:', alist)
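The early-termination variant described above (short bubble sort) can be sketched as follows; the function name `short_bubble_sort` is chosen here for illustration and is not from the source:

```python
def short_bubble_sort(alist):
    """Bubble sort that stops as soon as a full pass makes no swaps."""
    n = len(alist)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if alist[j] > alist[j + 1]:
                alist[j], alist[j + 1] = alist[j + 1], alist[j]
                swapped = True
        if not swapped:
            # No exchange in this pass: the list is sorted, terminate early.
            # This is what makes the best case O(n).
            break
    return alist

print('sorted:', short_bubble_sort([27, 33, 28, 4, 2, 26, 13, 35, 8, 14]))
```

On an already-sorted input the first pass makes no swaps, so the loop exits after a single traversal.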
Selection sort
Selection sort improves on bubble sort: each pass through the list makes at most one exchange
Time complexity: average O(n^2), worst O(n^2), best O(n^2)
Space complexity: O (1)
Unstable
# Test input array
alist = [27, 33, 28, 4, 2, 26, 13, 35, 8, 14]
# Selection sort
for i in range(len(alist) - 1):
    least = i
    for j in range(i + 1, len(alist)):
        if alist[j] < alist[least]:
            least = j
    if least != i:
        alist[least], alist[i] = alist[i], alist[least]
# Sorted result output
print('sorted:', alist)
Insertion sort
A sorted sub-list is always maintained at the lower positions of the list; each new item is "inserted" back into that sub-list, making the sorted sub-list one item larger
Time complexity: average O(n^2), worst O(n^2), best O(n)
Space complexity: O(1)
Stable
# Test input array
alist = [27, 33, 28, 4, 2, 26, 13, 35, 8, 14]
# Insertion sort
for i in range(1, len(alist)):
    j = i
    while j > 0 and alist[j - 1] > alist[j]:
        alist[j - 1], alist[j] = alist[j], alist[j - 1]
        j -= 1
# Sorted result output
print('sorted:', alist)
Shell sort
Invented by Shell in 1959, the first algorithm to break the O(n^2) barrier; it is an improvement on simple insertion sort, and unlike insertion sort it compares distant elements first, so it is also known as "diminishing increment sort"
It improves on insertion sort by exploiting two of its properties: insertion sort is highly efficient on nearly sorted data, approaching linear time; but it is generally inefficient because it can only move data one position at a time
Shell sort improves on insertion sort by dividing the elements to be compared into several regions, which lets an element take a big step toward its final position at once
The key concept is the gap: the gap keeps decreasing, and the final pass is a plain insertion sort, but by then the list is almost sorted
The gap is the core of Shell sort
A well-known gap sequence is Marcin Ciura's: gaps = [701, 301, 132, 57, 23, 10, 4, 1]; the example below instead obtains its gaps by repeatedly dividing the array length by 2
Time complexity: average O(n(logn)^2), worst O(n^2), best O(n)
Space complexity: O (1)
Unstable
# Test input array
alist = [27, 33, 28, 4, 2, 26, 13, 35, 8, 14]
# Shell sort
gap = len(alist) // 2
while gap > 0:
    # Do an insertion sort for each gap
    for i in range(gap, len(alist)):
        j = i
        while j >= gap and alist[j - gap] > alist[j]:
            alist[j - gap], alist[j] = alist[j], alist[j - gap]
            j -= gap
    gap = gap // 2
# Sorted result output
print('sorted:', alist)
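As a sketch, the same insertion-sort core can also be driven by Marcin Ciura's gap sequence mentioned above instead of repeated halving; the name `shell_sort_ciura` is illustrative, not from the source:

```python
def shell_sort_ciura(alist):
    """Shell sort driven by Marcin Ciura's gap sequence."""
    gaps = [701, 301, 132, 57, 23, 10, 4, 1]
    for gap in gaps:
        if gap >= len(alist):
            continue  # skip gaps that are too large for this list
        # Gapped insertion sort, identical to the halving version above
        for i in range(gap, len(alist)):
            j = i
            while j >= gap and alist[j - gap] > alist[j]:
                alist[j - gap], alist[j] = alist[j], alist[j - gap]
                j -= gap
    return alist

print('sorted:', shell_sort_ciura([27, 33, 28, 4, 2, 26, 13, 35, 8, 14]))
```

The sequence ends in 1, so the final pass is always a plain insertion sort, as the text describes.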
Merge sort
Uses the divide-and-conquer strategy to improve performance; it is a classic application of divide and conquer
Merge sort is a recursive algorithm that keeps splitting the list in half
Merging more than two sorted runs at a time is called multiway merge
Its disadvantage is that the merge step requires extra storage space
Time complexity: average O(nlogn), worst O(nlogn), best O(nlogn)
Space complexity: O(n)
Stable
# Test input array
alist = [27, 33, 28, 4, 2, 26, 13, 35, 8, 14]
# Merge sort
def merge_sort(ilist):
    def merge(left, right):
        result = []
        while left and right:
            result.append((left if left[0] <= right[0] else right).pop(0))
        return result + left + right
    if len(ilist) <= 1:
        return ilist
    mid = len(ilist) // 2
    return merge(merge_sort(ilist[:mid]), merge_sort(ilist[mid:]))
# Sorted result output
print('sorted:', merge_sort(alist))
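The multiway merge mentioned above generalizes the two-way merge used here. As a minimal sketch, the standard library's `heapq.merge` lazily merges any number of already-sorted runs (the example runs are made up for illustration):

```python
import heapq

# Three already-sorted runs, as might come out of a splitting phase
runs = [[2, 13, 27, 35], [4, 26, 28], [8, 14, 33]]

# heapq.merge yields the smallest head element across all runs each step
merged = list(heapq.merge(*runs))
print('merged:', merged)
```

Internally this keeps one candidate per run in a small heap, so merging k runs of n total elements costs O(n log k).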
Quick sort
Like merge sort, it uses divide and conquer, but it can do without the additional storage
It is an improvement on bubble sort
It is generally noticeably faster than other algorithms because its inner loop can be implemented very efficiently on most architectures
The simple version, like merge sort, needs extra storage space, but it can be rewritten as an in-place version that needs no extra space
Time complexity: average O(nlogn), worst O(n^2), best O(nlogn)
Space complexity: O (nlogn)
Unstable
# Test input array
alist = [27, 33, 28, 4, 2, 26, 13, 35, 8, 14]
# Quick sort
def quick_sort(ilist):
    if len(ilist) <= 1:
        return ilist
    # Use the last element as the pivot
    pivot = ilist.pop()
    # Elements greater than the pivot go into greater, the rest into lesser
    greater, lesser = [], []
    for element in ilist:
        if element > pivot:
            greater.append(element)
        else:
            lesser.append(element)
    return quick_sort(lesser) + [pivot] + quick_sort(greater)
# Sorted result output
print('sorted:', quick_sort(alist))
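The in-place version referred to above can be sketched with Lomuto partitioning, which like the simple version uses the last element as the pivot; the function name `quick_sort_inplace` is illustrative, not from the source:

```python
def quick_sort_inplace(alist, low=0, high=None):
    """In-place quick sort using Lomuto partitioning (last element as pivot)."""
    if high is None:
        high = len(alist) - 1
    if low < high:
        # Partition: move elements <= pivot to the left of the split point
        pivot = alist[high]
        split = low
        for k in range(low, high):
            if alist[k] <= pivot:
                alist[split], alist[k] = alist[k], alist[split]
                split += 1
        # Put the pivot into its final position
        alist[split], alist[high] = alist[high], alist[split]
        # Recurse on the two halves around the pivot
        quick_sort_inplace(alist, low, split - 1)
        quick_sort_inplace(alist, split + 1, high)
    return alist

print('sorted:', quick_sort_inplace([27, 33, 28, 4, 2, 26, 13, 35, 8, 14]))
```

Apart from the recursion stack, this version swaps elements within the original list and allocates no sublists.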
Heap sort
An improvement on selection sort
The algorithm uses the heap, a data structure designed around a nearly complete binary tree
Heaps are usually implemented with an array; a max-heap is used for ascending order and a min-heap for descending order
Time Complexity: The average O (nlogn) worst O (nlogn) preferably O (nlogn)
Space complexity: O (1)
Unstable
# Test input array
alist = [27, 33, 28, 4, 2, 26, 13, 35, 8, 14]
# Heap sort
def heapify(unsorted, index, heap_size):
    largest = index
    left_index = 2 * index + 1
    right_index = 2 * index + 2
    if left_index < heap_size and unsorted[left_index] > unsorted[largest]:
        largest = left_index
    if right_index < heap_size and unsorted[right_index] > unsorted[largest]:
        largest = right_index
    if largest != index:
        unsorted[largest], unsorted[index] = unsorted[index], unsorted[largest]
        heapify(unsorted, largest, heap_size)

def heap_sort(unsorted):
    n = len(unsorted)
    # Build a max-heap
    for i in range(n // 2 - 1, -1, -1):
        heapify(unsorted, i, n)
    # Repeatedly move the max to the end and re-heapify the remainder
    for i in range(n - 1, 0, -1):
        unsorted[0], unsorted[i] = unsorted[i], unsorted[0]
        heapify(unsorted, 0, i)
    return unsorted
# Sorted result output
print('sorted:', heap_sort(alist))