Sorting Algorithms in Python

  Comparison of sorting algorithms

  Method           Stability   Best          Worst         Average
  Bubble sort      stable      O(n)          O(n²)         O(n²)
  Selection sort   unstable    O(n²)         O(n²)         O(n²)
  Insertion sort   stable      O(n)          O(n²)         O(n²)
  Quicksort        unstable    O(n log n)    O(n²)         O(n log n)
  Merge sort       stable      O(n log n)    O(n log n)    O(n log n)
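  Stability means that elements with equal keys keep their original relative order after sorting. A small illustration (the example records are mine, not from the post) using Python's built-in sorted, which is stable:

```python
# Stability demo: records with equal keys keep their input order
# when sorted with a stable algorithm (Python's sorted is stable).
records = [("b", 2), ("a", 1), ("c", 2), ("d", 1)]
by_key = sorted(records, key=lambda r: r[1])
# ("a", 1) still precedes ("d", 1), and ("b", 2) precedes ("c", 2)
print(by_key)
```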

  Bubble Sort

  The core idea is to compare adjacent elements and swap them when they are out of order.

  The steps split into an outer loop and an inner loop:

  Each pass of the outer loop "sinks" the largest remaining element to the end of the array (for ascending order);

  Each step of the inner loop compares one pair of adjacent elements and swaps them if needed.

  Original array: [1,2,9,9,4,5,6,6,3,9,4]

  Inner loop step 1: compare element 1 with element 2; no swap (ascending order);

  Inner loop step 2: compare element 2 with element 9; no swap;

  Inner loop step 3: compare element 9 with element 9; no swap;

  Inner loop step 4: compare element 9 with element 4; swap, giving [1,2,9,4,9,5,6,6,3,9,4];

  Inner loop step 5: compare element 9 with element 5; swap, giving [1,2,9,4,5,9,6,6,3,9,4];

  Inner loop step 6: compare element 9 with element 6; swap, giving [1,2,9,4,5,6,9,6,3,9,4];

  ...

  Last inner loop step: compare element 9 with element 4; swap, giving [1,2,9,4,5,6,6,3,9,4,9];

  After the first pass of the outer loop, one element 9 has sunk to the end of the array: [1,2,9,4,5,6,6,3,9,4,9];

  Outer loop pass 2: the second 9 sinks to the end, giving [1,2,4,5,6,6,3,9,4,9,9];

  Outer loop pass 3: the third 9 sinks to the end, giving [1,2,4,5,6,3,6,4,9,9,9];

  Outer loop pass 4: an element 6 sinks into place, giving [1,2,4,5,3,6,4,6,9,9,9];

  ...

  Last outer loop pass: the final result is [1,2,3,4,4,5,6,6,9,9,9].

  Algorithm analysis

  Bubble sort is a stable sort.

  In the best case, when the array is already in ascending order, the time complexity is O(n);

  In the worst case, when the array is in reverse order, the time complexity is O(n²).

  def bubbleSort(nums):
      if len(nums) < 2:
          return nums
      # len(nums) - 1 because the inner comparison looks at index j + 1
      for i in range(len(nums) - 1):
          # the last i elements have already sunk to the end
          for j in range(len(nums) - i - 1):
              if nums[j] > nums[j+1]:
                  nums[j], nums[j+1] = nums[j+1], nums[j]
      return nums
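  Note that the code above always runs all n-1 passes, so as written its best case is still O(n²). The O(n) best case from the analysis requires an early exit when a pass makes no swaps; a sketch of that variant (the function name is mine):

```python
def bubble_sort_early_exit(nums):
    """Bubble sort with an early exit: stops as soon as a full
    pass makes no swap, giving O(n) on already-sorted input."""
    for i in range(len(nums) - 1):
        swapped = False
        for j in range(len(nums) - i - 1):
            if nums[j] > nums[j + 1]:
                nums[j], nums[j + 1] = nums[j + 1], nums[j]
                swapped = True
        if not swapped:  # no swap in this pass: array is sorted
            break
    return nums
```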

  Selection Sort

  The core idea is to take the smallest (or largest) of the remaining elements and move it to the front (or the end).

  Specific steps:

  Traverse all n elements and move the smallest one to the first position;

  Traverse the remaining n-1 elements and move the smallest of them to the second position;

  Repeat until the array is ordered.

  Algorithm analysis

  The time complexity of selection sort does not depend on the initial state of the input; it is always O(n²).

  def selectionSort(nums):
      if len(nums) < 2:
          return nums
      for i in range(len(nums) - 1):
          # find the index of the smallest remaining element
          min_index = i
          for j in range(i + 1, len(nums)):
              if nums[j] < nums[min_index]:
                  min_index = j
          nums[min_index], nums[i] = nums[i], nums[min_index]
      return nums
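  A note on stability: the swap in selectionSort can carry an element past other elements with an equal key, so this implementation is not stable. A minimal illustration (the key parameter and example values are mine, added for the demo):

```python
def selection_sort(nums, key=lambda x: x):
    # Same selection-sort logic as above, with a key for the demo.
    for i in range(len(nums) - 1):
        min_index = i
        for j in range(i + 1, len(nums)):
            if key(nums[j]) < key(nums[min_index]):
                min_index = j
        nums[min_index], nums[i] = nums[i], nums[min_index]
    return nums

pairs = [(2, "a"), (2, "b"), (1, "c")]
selection_sort(pairs, key=lambda p: p[0])
# The first swap moves (2, "a") past (2, "b"):
# their original relative order is lost.
print(pairs)
```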

  Insertion Sort

  The core idea is to keep a partially ordered prefix of the array and insert each new element into its correct position within it.

  Specific steps are as follows:

  Starting from the second element of the array, compare it with the element before it and swap if it is smaller;

  For the third element, the first two elements are already ordered, so find the right position among them and insert it;

  And so on, until all elements are in place.

  Algorithm analysis

  Insertion sort is a stable sort. In the best case, when the array is already in ascending order, the time complexity is O(n); in the worst case, when it is in reverse order, the time complexity is O(n²).

  def insertSort(nums):
      if len(nums) < 2:
          return nums
      for i in range(1, len(nums)):
          value = nums[i]
          # shift larger elements of the sorted prefix one step right
          j = i - 1
          while j >= 0 and nums[j] > value:
              nums[j+1] = nums[j]
              j -= 1
          nums[j+1] = value
      return nums

  if __name__ == "__main__":
      nums = [1,2,9,9,4,5,6,6,3,9,4]
      print(insertSort(nums))

  Output: [1, 2, 3, 4, 4, 5, 6, 6, 9, 9, 9]

  Quick Sort

  The core idea is divide and conquer. Specific steps are as follows:

  Choose an element of the array as the pivot; any element works, but the middle element is a common choice and makes the idea easy to follow;

  Compare every other element with the pivot: elements smaller than the pivot go to its left, elements larger go to its right;

  Treat the subarrays on each side of the pivot as new arrays and repeat the first two steps, until every subarray has at most one element.

  This divide-and-conquer approach shines on large data sets; on small ones the difference is minor, so implementations often switch to insertion sort once the subarray size drops below a threshold.
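  The "switch to insertion sort on small inputs" idea can be sketched as follows; the function names and the cutoff value 16 are arbitrary choices for illustration, not from the post:

```python
CUTOFF = 16  # arbitrary threshold for the demo

def insertion_sort(nums):
    # Plain insertion sort, used for small subarrays.
    for i in range(1, len(nums)):
        value = nums[i]
        j = i - 1
        while j >= 0 and nums[j] > value:
            nums[j + 1] = nums[j]
            j -= 1
        nums[j + 1] = value
    return nums

def hybrid_quick_sort(nums):
    # Fall back to insertion sort once the subarray is small.
    if len(nums) <= CUTOFF:
        return insertion_sort(nums)
    mid = nums[len(nums) // 2]
    rest = list(nums)
    rest.remove(mid)  # remove one instance of the pivot value
    left = [x for x in rest if x < mid]
    right = [x for x in rest if x >= mid]
    return hybrid_quick_sort(left) + [mid] + hybrid_quick_sort(right)
```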

  Algorithm analysis

  Quicksort is an unstable sort. Its expected time complexity is O(n log n); the worst-case time complexity is O(n²).

  There are many ways to implement quicksort; the easiest one to understand is partition plus recursion.

  def quickSort(nums):
      if len(nums) < 2:
          return nums
      # take the middle element as the pivot
      mid = nums[len(nums)//2]
      left, right = [], []
      nums.remove(mid)
      for num in nums:
          if num >= mid:
              right.append(num)
          else:
              left.append(num)
      return quickSort(left) + [mid] + quickSort(right)

  if __name__ == "__main__":
      nums = [1,2,9,9,4,5,6,6,3,9,4]
      print(quickSort(nums))

  Output: [1, 2, 3, 4, 4, 5, 6, 6, 9, 9, 9]
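  The version above allocates new lists on every call. Among the other implementations mentioned, a classic one sorts within the original list using an in-place partition; a sketch using the Lomuto scheme (one of several standard partition schemes, not the post's version):

```python
def partition(nums, lo, hi):
    # Lomuto partition: nums[hi] is the pivot; after the loop,
    # everything left of index i is smaller than the pivot.
    pivot = nums[hi]
    i = lo
    for j in range(lo, hi):
        if nums[j] < pivot:
            nums[i], nums[j] = nums[j], nums[i]
            i += 1
    nums[i], nums[hi] = nums[hi], nums[i]  # place pivot at index i
    return i

def quick_sort_inplace(nums, lo=0, hi=None):
    if hi is None:
        hi = len(nums) - 1
    if lo < hi:
        p = partition(nums, lo, hi)
        quick_sort_inplace(nums, lo, p - 1)   # sort left of pivot
        quick_sort_inplace(nums, p + 1, hi)   # sort right of pivot
    return nums
```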

  Merge Sort

  Merge sort also applies the idea of divide and conquer. The main steps are as follows:

  Treat the original sequence of n elements as n ordered sequences of length 1;

  Merge them pairwise, doubling the length of each ordered sequence;

  Repeat until a single ordered sequence of length n remains.

  The process can be seen in this example:

  The starting sequence [38, 27, 43, 3, 9, 82, 10] is split into seven sequences of length 1;

  Pairwise merging gives four ordered sequences: [27,38], [3,43], [9,82], [10];

  Merging pairwise again gives two ordered sequences: [3,27,38,43], [9,10,82];

  A final merge yields one ordered sequence of length 7: [3,9,10,27,38,43,82].

  Algorithm analysis

  Merge sort is a stable sort. Its best-case and worst-case time complexity are both O(n log n), so its average complexity is also O(n log n).

  def merge(left, right):
      res = []
      i, j = 0, 0
      # take the smaller head element from left or right each time
      while i < len(left) and j < len(right):
          if left[i] <= right[j]:
              res.append(left[i])
              i += 1
          else:
              res.append(right[j])
              j += 1
      # one side is exhausted; append the remainder of the other
      if i == len(left):
          res += right[j:]
      else:
          res += left[i:]
      return res

  def mergeSort(nums):
      if len(nums) <= 1:
          return nums
      mid = len(nums)//2
      left = mergeSort(nums[:mid])
      right = mergeSort(nums[mid:])
      return merge(left, right)

  A simpler version

  Operate on the two lists directly with .pop(0) instead of index pointers (note that list.pop(0) is O(n), not O(1) — only popping from the end of a list is O(1) — so this version trades efficiency for brevity);

  After the loop, at least one of left and right is empty and the other holds the remaining elements, so there is no need to check which one ran out;

  left and right are each ordered, so they can be concatenated directly onto the result.

  def merge(left, right):
      res = []
      while left and right:
          if left[0] <= right[0]:
              res.append(left.pop(0))
          else:
              res.append(right.pop(0))
      # one of left/right is empty, so this appends only the leftovers
      return res + left + right

  def mergeSort(nums):
      if len(nums) <= 1:
          return nums
      mid = len(nums)//2
      left = mergeSort(nums[:mid])
      right = mergeSort(nums[mid:])
      return merge(left, right)

  return merge(left, right)
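  As noted, list.pop(0) is O(n), so the simplified merge costs extra on long runs. If constant-time removal from the front is wanted, collections.deque offers popleft(); a self-contained sketch of that variant (the deque-based design is my suggestion, not from the post):

```python
from collections import deque

def merge(left, right):
    # left/right are deques; popleft() is O(1), unlike list.pop(0).
    res = []
    while left and right:
        if left[0] <= right[0]:
            res.append(left.popleft())
        else:
            res.append(right.popleft())
    # one side is empty, so this appends only the leftovers
    return res + list(left) + list(right)

def merge_sort(nums):
    if len(nums) <= 1:
        return list(nums)
    mid = len(nums) // 2
    return merge(deque(merge_sort(nums[:mid])),
                 deque(merge_sort(nums[mid:])))
```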

  2019.7.15: added insertion sort

  2019.7.16: added bubble sort and selection sort, and the comparison table

  2019.7.30: added merge sort


Origin blog.51cto.com/14503791/2432549