12. Merge sort

Merge Sort

First divide the array from the middle into two halves, sort each half separately, and then merge the two sorted halves in order, so that the entire array becomes sorted.

The core idea: divide and conquer.

Divide and conquer is a problem-solving idea; recursion is a programming technique for implementing it.

Recursive formula:

merge_sort(p…r) = merge(merge_sort(p…q), merge_sort(q+1…r))

Termination condition: when p >= r, there is no need to decompose further.

Pseudocode:

// Merge sort. A is the array, n is its size.
merge_sort(A, n) {
  merge_sort_c(A, 0, n-1)
}

// Recursive helper
merge_sort_c(A, p, r) {
  // Termination condition of the recursion
  if p >= r then return

  // Take the middle position q between p and r
  q = (p+r) / 2
  // Divide and recurse
  merge_sort_c(A, p, q)
  merge_sort_c(A, q+1, r)
  // Merge A[p...q] and A[q+1...r] into A[p...r]
  merge(A[p...r], A[p...q], A[q+1...r])
}

How the merge(A[p...r], A[p...q], A[q+1...r]) step works:

1. Allocate a temporary array tmp the same size as A[p...r].

2. Use two cursors i and j, pointing to the first element of A[p...q] and of A[q+1...r] respectively.

3. Compare the two elements A[i] and A[j]. If A[i] <= A[j], put A[i] into the temporary array tmp and move i forward by one; otherwise put A[j] into tmp and move j forward by one.

4. Repeat the above process until one of the two subarrays is exhausted, then append the remaining elements of the other subarray to tmp in order.

5. Finally, copy the data in the temporary array tmp back into the original array A[p...r] (see the sketch after this list).
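A minimal Go sketch of this merge step, written against the index-based layout of the pseudocode above (the function name mergeInPlace is illustrative and does not appear in the original code):

// mergeInPlace merges the two sorted halves A[p...q] and A[q+1...r]
// back into A[p...r] through a temporary slice, following steps 1-5.
func mergeInPlace(A []int, p, q, r int) {
	tmp := make([]int, 0, r-p+1) // step 1: temporary storage of the same size
	i, j := p, q+1               // step 2: cursors into the two halves
	for i <= q && j <= r {       // step 3: take the smaller element each round
		if A[i] <= A[j] {
			tmp = append(tmp, A[i])
			i++
		} else {
			tmp = append(tmp, A[j])
			j++
		}
	}
	// step 4: one half is exhausted; append whatever remains of the other
	tmp = append(tmp, A[i:q+1]...)
	tmp = append(tmp, A[j:r+1]...)
	// step 5: copy tmp back into A[p...r]
	copy(A[p:r+1], tmp)
}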

Stability: stable sort. Because the merge step takes the element from the left half when A[i] <= A[j], equal elements keep their original relative order.

Time complexity:

T(1) = C; when n = 1, only constant-level execution time is needed, so it is denoted C.

T(n) = 2*T(n/2) + n; n>1

T(n) = 2*T(n/2) + n
     = 2*(2*T(n/4) + n/2) + n = 4*T(n/4) + 2*n
     = 4*(2*T(n/8) + n/4) + 2*n = 8*T(n/8) + 3*n
     = 8*(2*T(n/16) + n/8) + 3*n = 16*T(n/16) + 4*n
     ......
     = 2^k * T(n/2^k) + k * n
     ......

That is, T(n) = 2^k * T(n/2^k) + k*n.

When T(n/2^k) = T(1), that is, when n/2^k = 1, we get k = log2(n). Substituting this value of k into the formula above gives T(n) = C*n + n*log2(n). Using big O notation, T(n) = O(n*log n).
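As a quick sanity check with a small input, take n = 8 and C = 1:

T(8) = 2*T(4) + 8
     = 2*(2*T(2) + 4) + 8 = 4*T(2) + 16
     = 4*(2*T(1) + 2) + 16 = 8*T(1) + 24
     = 8 + 24 = 32 = C*8 + 8*log2(8)

which agrees with T(n) = C*n + n*log2(n).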

The running time does not depend on the initial order of the array to be sorted, so the time complexity is very stable: whether in the best case, the worst case, or the average case, it is O(n*log n).

Space complexity: each merge needs a temporary array of size at most n, and that memory is released as soon as the merge finishes, so the extra space in use at any moment is at most O(n). This O(n) extra space is merge sort's main weakness: it is not an in-place sort.

Implementation (Go):

package main

import "fmt"

func main() {
	arr := []int{8, 9, 5, 7, 1, 2, 5, 7, 6, 3, 5, 4, 8, 1, 8, 5, 3, 5, 8, 4}
	tempArr := mergeSort(arr)
	fmt.Println(tempArr)
}

/**
Merge sort is an efficient sorting algorithm built on the merge operation.
It is a very typical application of the divide-and-conquer approach.
Divide-and-conquer idea; time complexity: O(n*log(n))
*/

func mergeSort(arr []int) []int {
	if len(arr) < 2 {
		return arr
	}
	i := len(arr) / 2
	left := mergeSort(arr[0:i])
	right := mergeSort(arr[i:])
	tempArr := merge(left, right)
	return tempArr
}

func merge(left, right []int) []int {
	tempArr := make([]int, 0)
	m, n := 0, 0 // current index positions in left and right
	l, r := len(left), len(right)
	for m < l && n < r {
		if left[m] > right[n] {
			tempArr = append(tempArr, right[n])
			n++
			continue
		}
		tempArr = append(tempArr, left[m])
		m++
	}
	tempArr = append(tempArr, right[n:]...) // safe even when n == len(right): slicing at the very end yields an empty slice in Go, not an out-of-range panic
	tempArr = append(tempArr, left[m:]...)  // likewise for left; at most one of these two appends adds elements
	return tempArr
}

 


Origin blog.csdn.net/Linzhongyilisha/article/details/100110826