Detailed explanation of array sorting

1. Overview of sorting algorithms

  Sorting algorithms can be divided into two categories:
(1) Non-linear-time comparison sorting algorithms: the relative order of elements is determined by comparisons. Since their time complexity cannot be better than O(nlogn), they are called non-linear-time comparison sorts. Such algorithms include bubble sort, insertion sort, selection sort, Shell sort, merge sort, quick sort, and heap sort.
(2) Linear-time non-comparison sorting algorithms: they do not determine the relative order of elements through comparisons, so they can break through the lower bound of comparison-based sorting and run in linear time; hence they are called linear-time non-comparison sorts. Such algorithms include bucket sort, counting sort, and radix sort. Although their time complexity is lower, they place strict requirements on the data to be sorted (a minimal counting sort sketch follows the complexity table below).
  The complexity analysis of the different algorithms is as follows:

  Algorithm        Best        Average     Worst       Space              Stable   In-place
  Bubble sort      O(n)        O(n^2)      O(n^2)      O(1)               Yes      Yes
  Insertion sort   O(n)        O(n^2)      O(n^2)      O(1)               Yes      Yes
  Selection sort   O(n^2)      O(n^2)      O(n^2)      O(1)               No       Yes
  Shell sort       O(n)        O(n^1.3)    O(n^2)      O(1)               No       Yes
  Merge sort       O(nlogn)    O(nlogn)    O(nlogn)    O(n)               Yes      No
  Quick sort       O(nlogn)    O(nlogn)    O(n^2)      O(logn) (stack)    No       Yes
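
  To make the non-comparison category concrete, here is a minimal counting sort sketch. It is not part of the original article, and it assumes the input contains only non-negative integers whose maximum value is not too large, which is exactly the kind of restriction on the data mentioned above.

/** A minimal counting sort sketch (not in the original): works only for small non-negative integers. */
public class CountingSortTest {
	public static void main(String[] args) {
		int[] arr = new int[]{1,3,2,5,3};
		countingSort(arr);
		for (int v : arr) System.out.print(v + " ");
		System.out.println();
	}
	public static void countingSort(int[] arr) {
		if (arr.length == 0) return;
		int max = arr[0];
		for (int v : arr) max = Math.max(max, v);   //find the largest key
		int[] count = new int[max + 1];             //one counter per possible key
		for (int v : arr) count[v]++;               //count occurrences of each key
		int k = 0;
		for (int v = 0; v <= max; v++) {            //write the keys back in ascending order
			for (int c = 0; c < count[v]; c++) {
				arr[k++] = v;
			}
		}
	}
}
  No two elements are ever compared with each other; the running time is O(n + max), which is linear when the key range is small.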

2. Sorting algorithm principles and implementations

2.1 Merge sort

  1. Principle
      Merge sort uses the divide-and-conquer idea. To sort an array, we first split it in the middle into two halves, sort the two halves separately, and then merge the two sorted halves, so that the whole array becomes ordered. The core is the merge() function.
  2. Implementation
/**
 * Merge sort:
 * Uses the divide-and-conquer idea. To sort an array, we first split it in the middle into
 * two halves, sort the two halves separately, and then merge the two sorted halves so that
 * the whole array is ordered.
 * @author mling
 */
public class MergeSortTest {
	public static void main(String[] args) {
		int[] arr = new int[]{1,3,2,5,3};
		mergeSort(arr,0,arr.length-1);
		print(arr);
	}
	/**Merge sort:
	 * @param arr the array
	 * @param l starting index of the range to sort (left)
	 * @param r ending index of the range to sort (right)
	 * */
	public static void mergeSort(int[] arr,int l,int r){
		if(l >= r){//recursion terminates on ranges of size 0 or 1
			return;
		}
		int mid = l + (r-l)/2;//middle of the range, written this way to avoid integer overflow
		mergeSort(arr,l,mid);//sort the first half
		mergeSort(arr,mid+1,r);//sort the second half
		merge(arr,l,r,mid);//merge the two sorted halves
	}
	/**
	 * Merge the first half and the second half in ascending order:
	 * l is the start, r is the end, mid is the boundary between the two halves
	 */
	private static void merge(int[] arr, int l, int r, int mid) {
		//i indexes the first half, j indexes the second half, k indexes the temporary array
		int i=l,j=mid+1,k=0;
		int[] tmp = new int[r-l+1];
		while(i<=mid && j<=r){//both halves still have elements
			if(arr[i] <= arr[j]){//take from the first half when its element is <= (keeps the sort stable)
				tmp[k++] = arr[i++];
			}else{
				tmp[k++] = arr[j++];
			}
		}
		//copy any remaining elements into the temporary array
		for(;i<=mid;i++) tmp[k++]=arr[i];
		for(;j<=r;j++) tmp[k++]=arr[j];
		//copy the temporary array back into the original array
		for(int x=0;x<r-l+1;x++){
			arr[l+x]=tmp[x];
		}
	}
	private static void print(int[] arr) {
		for(int i=0; i<arr.length; i++){
			System.out.print(arr[i] + " ");
		}
		System.out.println();
	}
}
  3. Algorithm analysis
    (1) Merge sort is a stable sorting algorithm;
    (2) its best, worst, and average time complexity are all O(nlogn);
    (3) its space complexity is O(n), so it is not an in-place sort. This is the Achilles' heel of merge sort.

2.2 Quick sort

  1. Principle
      Quick sort also uses the divide-and-conquer idea. To sort a set of data, we first pick any one element as the partition point (the pivot), then traverse the data, placing elements smaller than the pivot on its left and elements larger than the pivot on its right, with the pivot in the middle. The left and right parts are then sorted separately. The core is the partition() function.
  2. Implementation
/**
 * Quick sort
 * @author rmling
 */
public class QuickSortTest {
	public static void main(String[] args) {
		int[] arr = new int[]{1,3,2,5,3};
		quickSort(arr,0,arr.length-1);
		print(arr);
	}
	/**Quick sort, ascending order.
	 * @param arr the array   l: starting index of the range to sort (left)   r: ending index of the range to sort (right)
	 * */
	public static void quickSort(int[] arr,int l,int r){
		if(l >= r){//recursion terminates on ranges of size 0 or 1
			return;
		}
		int part = partition(arr,l,r);//get the partition point
		quickSort(arr,l,part-1);//sort the left partition
		quickSort(arr,part+1,r);//sort the right partition
	}
	/**Partition around arr[r] and return the final index of the pivot.*/
	private static int partition(int[] arr, int l, int r) {
		int i=l;int pivot=arr[r];
		for(int j=l;j<r;j++){
			if(arr[j] < pivot){//elements smaller than the pivot are moved to the front
				swap(arr,i,j);
				i++;
			}
		}
		swap(arr,i,r);//put the pivot between the two partitions
		return i;
	}
	private static void swap(int[] a,int x,int y){
		int temp=a[x];
		a[x]=a[y];
		a[y]=temp;
	}

	private static void print(int[] arr) {
		for(int i=0; i<arr.length; i++){
			System.out.print(arr[i] + " ");
		}
		System.out.println();
	}
}
  3. Algorithm analysis
    (1) Quick sort is an in-place sort: apart from the recursion stack (about O(logn) on average), it needs only O(1) extra space.
    (2) Quick sort is an unstable algorithm.
    (3) The best and average time complexity of quick sort is O(nlogn). In the extreme case, however, if the data in the array is already ordered, such as 1, 3, 5, 6, 8, and we always choose the last element as the pivot, then the two intervals produced by each partition are completely unbalanced. We then need about n partition operations to finish the whole sort, and each partition scans about n/2 elements on average, so the time complexity degrades to O(n^2). This is the worst-case time complexity.

Differences between merge sort and quick sort:
(1) Both merge sort and quick sort use the divide-and-conquer idea and are implemented with recursion, but the core of merge sort is the merge() function while the core of quick sort is the partition() function. Merge sort works from the bottom up: it solves the sub-problems first and then merges the results. Quick sort works the other way around, from the top down: it partitions first and then handles the sub-problems.
(2) Merge sort has a stable time complexity in every case, namely O(nlogn); the time complexity of quick sort is O(n^2) in the worst case but O(nlogn) on average. Moreover, the probability that quick sort degenerates to O(n^2) is very small, and we can avoid this situation by choosing the pivot sensibly, as shown in the sketch after this list.
(3) Merge sort is not an in-place sorting algorithm and has a relatively high space complexity of O(n); quick sort is an in-place sorting algorithm whose extra space, apart from the recursion stack, is O(1).
(4) Merge sort is stable, while quick sort is not.
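
For point (2), a common way to make the worst case very unlikely is to pick the pivot at random instead of always taking the last element. The following is a minimal sketch of that idea; the class name RandomizedQuickSort, the helper randomizedPartition, and the use of java.util.Random are illustrative additions, not part of the original code.

import java.util.Random;

public class RandomizedQuickSort {
	private static final Random RANDOM = new Random();

	public static void quickSort(int[] arr, int l, int r) {
		if (l >= r) return;              //recursion terminates on ranges of size 0 or 1
		int p = randomizedPartition(arr, l, r);
		quickSort(arr, l, p - 1);        //sort the left partition
		quickSort(arr, p + 1, r);        //sort the right partition
	}

	/** Swap a randomly chosen element into arr[r], then partition as before. */
	private static int randomizedPartition(int[] arr, int l, int r) {
		int k = l + RANDOM.nextInt(r - l + 1); //random index in [l, r]
		swap(arr, k, r);
		int pivot = arr[r];
		int i = l;
		for (int j = l; j < r; j++) {
			if (arr[j] < pivot) {
				swap(arr, i, j);
				i++;
			}
		}
		swap(arr, i, r);                 //put the pivot between the two partitions
		return i;
	}

	private static void swap(int[] a, int x, int y) {
		int t = a[x]; a[x] = a[y]; a[y] = t;
	}

	public static void main(String[] args) {
		int[] arr = {1, 3, 5, 6, 8};     //already ordered: the bad case for a fixed last-element pivot
		quickSort(arr, 0, arr.length - 1);
		for (int v : arr) System.out.print(v + " ");
		System.out.println();
	}
}
With a random pivot, even an already-sorted input such as 1, 3, 5, 6, 8 is split into roughly balanced partitions on average.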

2.3 Bubble sort

  1. Principle
      Bubble sort only operates on pairs of adjacent elements. Each bubbling pass compares adjacent elements to see whether they satisfy the required order and swaps them if they do not. One bubbling pass moves at least one element to its final position, so repeating the pass n times sorts all n elements.
  2. Algorithm description
    (1) Compare adjacent elements; if the first is greater than the second, swap them;
    (2) do the same for every pair of adjacent elements, from the first pair at the beginning to the last pair at the end, so that after this pass the last element is the largest;
    (3) repeat the above steps for all elements except the last one;
    (4) repeat steps 1~3 until the sorting is complete.
  3. Implementation
/**
 * Bubble sort
 * @author mling
 */
public class BubbleSortTest {
	public static void main(String[] args) {
		int[] arr = new int[]{1,3,2,5,3};
		bubbleSort(arr);
		print(arr);
	}
	/**Sort in ascending order. */
	public static void bubbleSort(int[] arr){
		int len = arr.length;
		for(int i=0; i<len; i++){
			boolean flag = false;//flag for exiting the bubbling loop early
			for(int j=0;j<len-i-1;++j){
				if(arr[j] > arr[j+1]){
					swap(arr,j,j+1);
					flag=true;//true means a swap happened in this pass
				}
			}
			if(!flag) break;//no swaps in this pass, so the array is already sorted: exit early
		}
	}
	private static void swap(int[] a,int x,int y){
		int temp=a[x];a[x]=a[y];a[y]=temp;
	}
	private static void print(int[] arr) {
		for(int i=0; i<arr.length; i++){	System.out.print(arr[i] + " ");}
		System.out.println();
	}
}
  4. Algorithm analysis
    (1) Bubbling only involves swapping adjacent elements and needs only a constant amount of temporary space, so its space complexity is O(1) and it is an in-place sorting algorithm.
    (2) Bubble sort is a stable sorting algorithm.
    (3) The best time complexity is O(n), the worst time complexity is O(n^2), and the average time complexity is O(n^2).

2.4 Insertion sort

  1. Principle
      Insertion sort divides the array into two intervals: a sorted interval and an unsorted interval. Initially the sorted interval contains only one element, the first element of the array. The core idea is to take an element from the unsorted interval, find a suitable position for it in the sorted interval, and insert it there, keeping the sorted interval ordered at all times. This is repeated until the unsorted interval is empty, and the algorithm ends.
  2. Algorithm description
    (1) Starting from the first element, that element can be considered sorted;
    (2) take the next element and scan the sorted sequence from back to front;
    (3) if the sorted element is greater than the new element, move it one position to the right;
    (4) repeat step 3 until a sorted element that is less than or equal to the new element is found;
    (5) insert the new element after that position;
    (6) repeat steps 2~5.
  3. Implementation
/**
 * Insertion sort
 * @author mling
 */
public class InsertSortTest {
	public static void main(String[] args) {
		int[] arr = new int[]{1,3,2,5,3};
		insertSort(arr);
		print(arr);
	}
	/**Sort in ascending order. */
	public static void insertSort(int[] arr){
		int len = arr.length;
		for(int i=1; i<len; i++){
			int current = arr[i];
			//find the position where current should be inserted
			int j=i-1;
			for(;j>=0;j--){
				if(arr[j] > current){
					arr[j+1]=arr[j];//shift larger elements one position to the right
				}else{
					break;
				}
			}
			arr[j+1] = current;
		}
	}
	private static void print(int[] arr) {
		for(int i=0; i<arr.length; i++){
			System.out.print(arr[i] + " ");
		}
		System.out.println();
	}
}
  4. Algorithm analysis
    (1) Insertion sort only needs a constant amount of extra storage, so its space complexity is O(1) and it is an in-place sorting algorithm.
    (2) Insertion sort is a stable sorting algorithm.
    (3) The best time complexity is O(n), the worst time complexity is O(n^2), and the average time complexity is O(n^2).

2.5 Selection sort

  1. Algorithm principle:
      The idea of selection sort is similar to that of insertion sort: the array is also divided into a sorted interval and an unsorted interval. But selection sort picks the smallest (or largest) element from the unsorted interval each time and places it at the end of the sorted interval. This is repeated until all elements are sorted.
  2. Algorithm description:
    (1) Initial state: the unsorted region is R[1...n] and the sorted region is empty;
    (2) at the start of the i-th pass (i = 1, 2, ..., n-1), the current sorted region and unsorted region are R[1...i-1] and R[i...n] respectively. This pass selects the record R[k] with the smallest key from the current unsorted region and exchanges it with the first record of the unsorted region, so that R[1...i] and R[i+1...n] become the new sorted region, one record longer, and the new unsorted region, one record shorter;
    (3) after n-1 passes, the array is ordered.
  3. Algorithm implementation:
/**
 * Selection sort
 * @author mling
 */
public class SelectSortTest {
	public static void main(String[] args) {
		int[] arr = new int[]{1,3,2,5,3};
		selectionSort(arr);
		print(arr);
	}
	/**Selection sort, ascending order.*/
	public static void selectionSort(int[] arr){
		int len = arr.length;
		int minIndex;
		for(int i=0;i<len;i++){
			minIndex=i;
			//find the index of the smallest element in the unsorted part
			for(int j=i+1;j<len;j++){
				if(arr[j]<arr[minIndex]){
					minIndex=j;
				}
			}
			swap(arr,i,minIndex);//move it to the end of the sorted part
		}
	}
	private static void swap(int[] a,int x,int y){
		int temp=a[x];a[x]=a[y];a[y]=temp;
	}
	private static void print(int[] arr) {
		for(int i=0; i<arr.length; i++){
			System.out.print(arr[i] + " ");
		}
		System.out.println();
	}
}
  4. Algorithm analysis:
    (1) The space complexity of selection sort is O(1), so it is an in-place sorting algorithm.
    (2) Selection sort is not a stable sorting algorithm, because each pass selects the minimum from the unsorted interval and swaps it with an earlier element, which can change the relative order of equal elements (see the sketch after this list). For example, when sorting 5, 8, 5, 2, the first pass swaps the leading 5 with 2 and moves it behind the other 5.
    (3) The best, worst, and average time complexity of selection sort are all O(n^2).
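
To make the instability concrete, here is a minimal sketch, not part of the original post, that applies the same selection-sort logic to records consisting of a key and a label; the Item class and the sample data are assumptions chosen purely for illustration.

/** A minimal sketch (not in the original) showing why selection sort is unstable. */
public class SelectionStabilityDemo {
	//Hypothetical record: a key plus a label so that equal keys can be told apart.
	static class Item {
		int key; String label;
		Item(int key, String label) { this.key = key; this.label = label; }
	}

	public static void main(String[] args) {
		//Two items share key 5; "5a" appears before "5b" in the input.
		Item[] items = {
			new Item(5, "5a"), new Item(8, "8"), new Item(5, "5b"), new Item(2, "2")
		};
		selectionSort(items);
		for (Item it : items) System.out.print(it.label + " ");
		System.out.println(); //prints: 2 5b 5a 8  -> the two 5s have swapped their relative order
	}

	static void selectionSort(Item[] arr) {
		for (int i = 0; i < arr.length; i++) {
			int minIndex = i;
			for (int j = i + 1; j < arr.length; j++) {
				if (arr[j].key < arr[minIndex].key) minIndex = j;
			}
			//the swap in the first pass moves "5a" behind "5b", breaking stability
			Item tmp = arr[i]; arr[i] = arr[minIndex]; arr[minIndex] = tmp;
		}
	}
}
The two records with key 5 end up in the opposite order from the input, which is exactly what stability forbids.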

2.6 Shell sort

  1. Algorithm principle:
      Shell sort (sometimes transliterated as Hill sort) is an improved version of simple insertion sort. The difference from insertion sort is that it compares and moves elements that are far apart first. Shell sort is also called diminishing increment sort. Its core lies in the choice of the gap (increment) sequence, which can be fixed in advance or defined dynamically (a sketch of the preset variant follows the implementation below).
  2. Algorithm description:
    The whole sequence to be sorted is first divided into several subsequences, each sorted with direct insertion sort. The specific steps are:
    (1) choose an increment sequence t1, t2, ..., tk, where ti > tj for i < j, and tk = 1;
    (2) sort the sequence k times, once for each increment;
    (3) in each pass, split the sequence according to the current increment ti into several subsequences of length m (m = length/ti) and perform direct insertion sort on each subsequence. Only when the increment is 1 is the whole sequence treated as a single table, whose length is the length of the entire sequence.
  3. Algorithm implementation:
/**
 * Shell sort
 * @author mling
 */
public class ShellSortTest {
	public static void main(String[] args) {
		int[] arr = new int[]{1,3,2,5,3};
		shellSort(arr);
		print(arr);
	}
	/**Shell sort, ascending order.*/
	public static void shellSort(int[] arr){
		int len = arr.length;
		int gap=1;//the key to Shell sort is the increment (gap); here we compute it dynamically
		while(gap<len/3){
			gap=gap*3+1;//Knuth's sequence: 1, 4, 13, 40, ...
		}
		int temp;
		for(;gap>0;gap=gap/3){//shrink the gap each round until it reaches 0
			for(int i=gap;i<len;i++){
				temp = arr[i];
				int j=i-gap;
				for(;j>=0 && arr[j]>temp;j-=gap){
					arr[j+gap]=arr[j];//gap-insertion sort: shift larger elements gap positions to the right
				}
				arr[j+gap]=temp;
			}
		}
	}
	private static void print(int[] arr) {
		for(int i=0; i<arr.length; i++){
			System.out.print(arr[i] + " ");
		}
		System.out.println();
	}
}
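
The implementation above defines the increment dynamically; as mentioned in the principle, the gap sequence can also be fixed in advance. Below is a minimal sketch of that variant, not part of the original; the gap values 5, 3, 1 are an arbitrary illustrative choice, with the only requirement being that the last gap is 1.

/** A minimal sketch (not in the original): Shell sort with a gap sequence fixed in advance. */
public class ShellSortPresetGaps {
	public static void main(String[] args) {
		int[] arr = new int[]{1,3,2,5,3};
		int[] gaps = new int[]{5, 3, 1};//preset increment sequence; the last gap must be 1
		shellSort(arr, gaps);
		for (int v : arr) System.out.print(v + " ");
		System.out.println();
	}
	public static void shellSort(int[] arr, int[] gaps) {
		for (int gap : gaps) {//one gap-insertion-sort pass per preset increment
			for (int i = gap; i < arr.length; i++) {
				int temp = arr[i];
				int j = i - gap;
				for (; j >= 0 && arr[j] > temp; j -= gap) {
					arr[j + gap] = arr[j];//shift larger elements gap positions to the right
				}
				arr[j + gap] = temp;
			}
		}
	}
}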
  4. Algorithm analysis:
    (1) The space complexity of Shell sort is O(1), so it is an in-place sorting algorithm.
    (2) Shell sort is not a stable sorting algorithm. It performs multiple insertion sorts: a single insertion sort is stable and does not change the relative order of equal elements, but during the different gap-based insertion sorts, equal elements may move within their own subsequences, so their relative order can be disturbed. Therefore Shell sort is unstable.
    (3) The best time complexity of Shell sort is O(n), the worst is O(n^2), and the average is about O(n^1.3).

