Summary of sorting algorithms [Java implementation]

Table of contents

1. Bubble sort

2. Selection sort

2.1 Simple Selection Sort

2.2 Heap sort

3. Insertion sort 

3.1 Direct insertion sort

3.2 Shell sort

4. Merge sort

5. Quick Sort

Summary


1. Bubble sort

Bubble sort is a simple and intuitive sorting algorithm. It repeatedly walks through the array to be sorted, comparing two adjacent elements at a time and swapping them if they are in the wrong order. The passes over the array are repeated until no swaps are needed, which means the array is sorted. The algorithm gets its name from the way smaller elements slowly "float" (bubble) to the top of the array through these swaps.

1.1 Algorithm steps

Compare adjacent elements. If the first is larger than the second, swap them.

Do the same for every pair of adjacent elements, from the first pair at the beginning to the last pair at the end. After this pass, the last element is the largest.

Repeat the above steps for all elements except the last one.

Keep repeating the above steps over fewer and fewer elements each time until there are no more pairs to compare.

1.2 Best case

The input data is already in ascending order.

1.3 Worst case

The input data is in reverse (descending) order.

1.4 Stability

A stable sorting algorithm is one where, when the original data contains multiple equal values, those values keep their relative order after sorting.
Why does stability matter?

For example, a company wants to compute the turnover of all its departments and sort the departments by turnover from smallest to largest. When two departments have the same turnover, they should keep their original relative order. Problems like this require a stable sorting algorithm.

1.5 Complexity

Time complexity: O(n^2)

Space complexity: O(1)

public class BubbleSort {
    public static void bubbleSort(int[] arr){
        if(arr == null || arr.length < 2){
            return;
        }
        for (int i = 0; i < arr.length - 1; i++) {
            // after each pass the largest remaining element has bubbled to the end,
            // so the inner loop can stop i positions earlier
            for (int j = 0; j < arr.length-1-i; j++){
                if(arr[j] > arr[j+1]){
                    int tmp = arr[j];
                    arr[j] = arr[j+1];
                    arr[j+1] = tmp;
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] arr = {4, 5, 3, 2, 1};
        bubbleSort(arr);
        for (int i  : arr) {
            System.out.print(i+" ");
        }
    }
}

2. Selection sort

Selection sort is a simple and intuitive sorting algorithm. Whatever data it is given, its time complexity is O(n²), so the smaller the data set, the better suited it is. Its only advantage may be that it uses no additional memory.

Each pass selects the smallest (or largest) element from the elements still to be sorted and places it at the start of the sequence, then keeps finding the smallest (or largest) of the remaining unsorted elements and appends it to the end of the sorted part, and so on, until all elements have been placed. Selection sort is an unstable sorting method.

2.1 Simple Selection Sort

Basic idea: each pass selects the smallest (or largest) of the elements still to be sorted and makes it the next element of the sorted part, until all elements have been used.

Algorithm implementation: a naive pass keeps comparing and exchanging so that the first position always holds the current minimum, but exchanging is a relatively expensive operation. Instead, we can keep a variable that records the index of the smaller element; when the inner loop ends, that variable holds the index of the smallest remaining element, and a single exchange then puts it in place.

For the unordered array {4, 6, 8, 5, 9}, we use min to record the index of the current minimum while i and j traverse the array. Initially min and i both point to the first element and j points to the next one; j then scans the remaining elements from left to right, and whenever an element smaller than arr[min] is found, min is updated to that index. When the scan finishes, arr[min] is exchanged with arr[i], i moves one position to the right, and the loop repeats until i reaches the last element, at which point the sort is complete. The process is shown in the figure below:

 

import java.util.Arrays;

// Simple selection sort
public class SimpleSelectSort {
    public void selectionSort(int[] arr){
        // One pass per position: after N-1 passes of an N-element array, the sequence is ordered
        for (int i = 0; i < arr.length - 1; i++){
            int min = i;   // index of the smallest value found so far
            // look for the i-th smallest value in the unsorted part
            for (int j = i + 1; j < arr.length; j++){
                if (arr[min] > arr[j]){
                    min = j;
                }
            }
            // if min changed, swap the i-th smallest value into position i
            if (min != i){
                int temp = arr[min];
                arr[min] = arr[i];
                arr[i] = temp;
            }
        }
    }

    public static void main(String[] args) {
        int[] arr = {3,4,5,2,1};
        System.out.println("Before sorting: " + Arrays.toString(arr));
        SimpleSelectSort simpleSelectSort = new SimpleSelectSort();
        simpleSelectSort.selectionSort(arr);
        System.out.println("After simple selection sort: " + Arrays.toString(arr));
    }
}

2.2 Heap sort

A heap is a complete binary tree with the following property: if every parent node's value is greater than its children's values, it is called a max-heap (big root heap) and can be used for sorting in ascending order; if every parent node's value is smaller than its children's values, it is called a min-heap (small root heap) and can be used for sorting in descending order. The array {4, 6, 8, 5, 9} above can be arranged into the following tree:

Building it into a max-heap (or a min-heap) is shown in the following figure:

 

Basic idea: build the sequence to be sorted into a max-heap, so that the maximum of the whole sequence sits at the top of the heap. Swap it with the last element, so the last position now holds the maximum; then rebuild the remaining n-1 elements into a heap, which brings the second largest of the n elements to the top. Repeating this process yields an ordered sequence.

Algorithm implementation:

Step 1: construct the initial heap by building the given unordered sequence into a max-heap.

(1) Initial unordered sequence

 

(2) Start from the last non-leaf node and adjust the nodes one by one, working from the bottom of the tree towards the top. (The last non-leaf node is at index length/2 - 1 = 5/2 - 1 = 1, i.e. the node holding 6 below.)

(3) Then move to the second non-leaf node, 4: since 9 is the largest of {4, 9, 8}, 4 and 9 are exchanged.

 

(4) This exchange disturbs the subtree {4, 5, 6}, so the adjustment continues: 6 is the largest of {4, 5, 6}, so 4 and 6 are exchanged.

In this way, the unordered sequence is built into a max-heap.

Step 2: exchange the top element of the heap with the last element of the array so that the last element is the largest; then re-adjust the heap structure, exchange the new top with the new last element to obtain the second largest element, and keep exchanging and adjusting until an ordered sequence is obtained.

(1) Exchange the top element 9 with the end element 4

  (2) Adjust the heap structure so that it continues to meet the heap definition

(3) Exchange the top element 8 with the end element 5 to get the second largest element 8

 

(4) Exchange the top element 5 with the end element 4

(5) Finally adjust the heap structure and exchange 5 and 6 to get an ordered sequence 

 

public class HeapSort {
    public static void heapSort(int []arr){
        if (arr == null || arr.length < 2){
            return ;
        }
        // If the input data arrived one element at a time, the heap could be
        // built top-down with heapInsert, which costs O(nlogn):
//        for (int i=0;i<arr.length;i++){
//            heapInsert(arr, i);
//        }
        // Building the heap bottom-up with heapify costs O(n):
        for (int i=arr.length-1;i>=0;i--){
            heapify(arr,i,arr.length);
        }
        int size = arr.length;
        swap(arr,0,--size);
        while(size>0){
            heapify(arr,0,size);
            swap(arr,0,--size);
        }

    }
    private static void heapInsert(int []arr,int index){
        while(arr[index]>arr[(index-1)/2]){
            swap(arr,index,(index-1)/2);
            index = (index-1)/2;
        }
    }
    // Sift-down: the value at index is repeatedly compared with its larger child;
    // if that child is larger than the value, the value sinks one level,
    // until it is no smaller than both of its children.
    private static void heapify(int []arr,int index,int size){
        int left = 2*index+1;
        while(left<size){ // the node still has at least a left child
            int largestIndex = left+1<size && arr[left]<arr[left+1]?left+1:left;
            largestIndex = arr[largestIndex]>arr[index]?largestIndex:index;
            if (largestIndex == index){
                break;
            }
            swap(arr,largestIndex,index);
            index = largestIndex;
            left = index*2+1;
        }
    }
    private static void swap(int []arr,int i,int j){
        int temp = arr[i];
        arr[i] = arr[j];
        arr[j] = temp;
    }

    public static void main(String[] args) {
        int []arr = {5,4,3,2,1};
        heapSort(arr);
        for (int i : arr) {
            System.out.print(i+" ");
        }
    }
}

3. Insertion sort 

3.1 Direct insertion sort

Basic idea: at each step, insert one of the remaining unsorted elements into the already-sorted sequence in front of it, until all elements have been inserted.

Algorithm implementation: direct insertion sort inserts each element of the unsorted part into the sorted part. While traversing the unsorted part, the current element is compared with the elements of the sorted part and inserted at the appropriate position, until every element of the unsorted part has been inserted. For the unordered sequence arr = {4, 6, 8, 5, 9}, we first treat the first element 4 as sorted, then traverse to the right: 6 is greater than 4, so it stays after 4; the traversal continues to 8, which is greater than 6, so it stays after 6; and so on until the ordered sequence {4, 5, 6, 8, 9} is obtained.

(1) We use a variable tmp to hold the element being inserted. Because the first element is already treated as sorted, tmp starts with the second element, i is the index of that element, and j is i-1, since j walks through the sorted part to compare against tmp. At the start i=1, tmp=6, j=0; since 6>4, 6 does not need to move. Then i moves right: i=2, tmp=arr[2]=8, j=i-1=1; 8>6>4, so again nothing needs to move.

(2) i moves right again: i=3, tmp=arr[3]=5, j=i-1=2. Since 5<8, 8 is copied into the slot where 5 was, and j keeps moving left through the sorted part.

(3) When j reaches 6, we find 6>tmp=5, so 6 is copied one slot to the right (position j+1), and the scan of the sorted part continues. At j=0 we find 4<5, so position j+1 is where 5 belongs, and tmp (5) is written into position j+1.

 

(4) The same steps continue. When i reaches the final element 9, it is larger than every element of the sorted part, so nothing needs to move. This yields the ordered sequence {4, 5, 6, 8, 9}.

 

public class Insertsort {
    public static void insert(int[] arr){
        // Start inserting from index 1: the single element at index 0 is treated as already sorted
        for (int i = 1; i < arr.length; i++){
            // the value to be inserted
            int tmp = arr[i];

            // compare from the right end of the sorted part, shifting larger values one slot to the right
            int j = i;
            while(j > 0 && tmp < arr[j-1]){
                arr[j] = arr[j-1];
                j--;
            }
            // if any larger value was shifted, write tmp into its final slot
            if (j != i){
                arr[j] = tmp;
            }
        }
    }

    public static void main(String[] args) {
        int[] arr = {3,5,4,2,1};
        insert(arr);
        for (int i : arr) {
            System.out.print(i + " ");
        }
    }
}

3.2 Shell sort

Basic idea: Shell sort groups the sequence by a certain increment (gap) of the indices and sorts each group with direct insertion sort; as the increment gradually decreases, each group contains more and more elements, and when the increment reaches 1 the whole sequence forms a single group and the algorithm terminates.

Initially, there is an unordered sequence of size 10.

(1) In the first pass we set gap1 = N / 2 = 5, so elements 5 apart form a group, giving 5 groups.

(2) Next, sort each group with direct insertion sort.

In the second pass we halve the previous gap: gap2 = gap1 / 2 = 2 (integer division), so elements 2 apart form a group, giving 2 groups.

(3) Sort each group with direct insertion sort.

(4) In the third pass the gap is halved again: gap3 = gap2 / 2 = 1, so elements 1 apart form a group, i.e. there is only one group.

(5) Sort each group with direct insertion sort. At this point, the sorting has finished.

Time complexity:

Best case: the performance of Shell sort depends heavily on the chosen gap sequence, and it is still not known how to choose the best one (several good sequences are known, but none has been proven optimal), so the best-case time complexity of the algorithm is not known precisely.

Worst case: it depends on the gap sequence. With the simple halving sequence used below the worst case can degrade to O(N^2); with better sequences the worst case stays close to the average case.

The best known gap sequence was proposed by Sedgewick: (1, 5, 19, 41, 109, ...).

That research also shows that "comparison is the dominant operation in Shell sort, not exchange." With such a gap sequence, Shell sort is faster than insertion sort and heap sort, and even faster than quick sort on small arrays, but on large amounts of data Shell sort is still slower than quick sort.

Space complexity:

As in direct insertion sort, only one temporary variable is needed to hold the value being inserted, so the space complexity is O(1).

Stability:

Equal elements may be moved past each other in Shell sort (they can end up in different groups), so Shell sort is an unstable algorithm.

public class ShellSort {  // Shell sort

    public static void shellSort(int arr[]) {
        int d = arr.length;   // the gap
        while (true){
            d = d / 2;        // halve the gap each round
            for (int x = 0; x < d; x++) {   // for each of the d groups
                for (int i = x + d; i < arr.length; i = i + d) {      // insertion sort within the group
                    int temp = arr[i];
                    int j;
                    for (j = i - d; j >= 0 && arr[j] > temp; j = j - d){
                        arr[j + d] = arr[j];
                    }
                    arr[j + d] = temp;
                }
            }
            if (d == 1) {     // gap == 1: this was the last round, stop
                break;
            }
        }
    }

    public static void main(String[] args) {
        int[] arr = {3,4,1,2,5};
        shellSort(arr);
        for (int i : arr) {
            System.out.print(i + " ");
        }
    }
}

4. Merge sort

Merge sort is an efficient sorting algorithm based on the merge operation and is a very typical application of the divide-and-conquer method. It merges already-ordered subsequences to obtain a completely ordered sequence; that is, it first makes each subsequence ordered, and then makes the whole sequence ordered by merging the segments.

Algorithm idea:

Merge sort has two basic operations: one is divide, the process of splitting the original array into two sub-arrays; the other is merge (conquer), which combines two sorted arrays into one larger sorted array.

  1. Keep splitting the list to be sorted into sublists until each sublist contains only one element; a sublist with a single element can be regarded as ordered.
  2. Merge the sublists pairwise; each merge produces a new, longer ordered list. Repeat this step until only one sublist remains, which is the sorted list.

Assuming we have an initial sequence of numbers {8, 4, 5, 7, 1, 3, 6, 2}, the entire merge sort process is shown in the figure below.

It can be seen that this structure is very similar to a complete binary tree. In this article merge sort is implemented recursively (it can also be implemented iteratively). The divide phase can be understood as the process of recursively splitting the sequence into subsequences, with a recursion depth of log2(n).

Looking at the conquer (merge) phase, we need to merge two ordered subsequences into one ordered sequence. For example, in the last merge in the figure above, the two already-ordered subsequences [4,5,7,8] and [1,2,3,6] are merged into the final sequence [1,2,3,4,5,6,7,8]; the implementation steps are shown below.

Merge sort is second only to quick sort in speed. Its time complexity is O(nlogn) and its space complexity is O(n), since it needs an auxiliary array of the same length as the original array. Merge sort is a stable sorting algorithm.

 

public class MergeSort {
    public static void mergeSort(int[] arr){
        if (arr == null || arr.length < 2){
            return;
        }
        sort(arr,0,arr.length-1);
    }
    private static void sort(int[] arr, int left, int right){
        if (left >= right){
            return;
        }
        int mid = left + ((right-left) >> 1);
        sort(arr, left, mid);
        sort(arr, mid+1, right);
        merge(arr, left, mid, right);
    }
    private static void merge(int[] arr, int start, int mid, int end){
        int[] temp = new int[end-start+1];
        int index = 0;
        int left = start;
        int right = mid+1;
        while (left <= mid && right <= end){
            temp[index++] = arr[left]>arr[right]?arr[right++]:arr[left++];
        }
        while (left <= mid){
            temp[index++] = arr[left++];
        }
        while (right <= end){
            temp[index++] = arr[right++];
        }
        for (int i = 0; i < temp.length; i++){
            arr[start+i] = temp[i];
        }
    }
    public static void main(String[] args) {
        int[] arr = {5, 4, 3, 2, 1};
        mergeSort(arr);
        for (int i : arr) {
            System.out.print(i+" ");
        }
    }
}

5. Quick Sort

Basic idea: one pass of partitioning splits the data to be sorted into two independent parts, with every element of one part smaller than every element of the other; the two parts are then sorted in the same way, recursively, until the whole data set becomes an ordered sequence.

The idea of the algorithm is simple. In the sequence to be sorted we first pick a number as the pivot (reference value). For convenience we usually pick the first number (in fact any number will do). Next we move every element smaller than the pivot to its left and every element larger than the pivot to its right; at that point the two partitions are relatively ordered. Then each partition is processed in the same way: pick a pivot for it and move elements again, until every partition contains only one number.

This is a typical divide-and-conquer idea. Let's walk through a concrete example to explain the steps of quick sort.

Take the sequence 47, 29, 71, 99, 78, 19, 24, 47 as an example. To distinguish the two 47s, the original article underlines the second one; here we mark it with an asterisk, so the sequence to be sorted is 47, 29, 71, 99, 78, 19, 24, 47*.

First we choose a pivot in the sequence, usually the middle element or the first or last element. Here we simply choose the first number, 47, as the pivot, then move numbers smaller than 47 to its left and numbers larger than 47 to its right; equal numbers are left where they are. In effect we are looking for a position k in the middle such that every value to the left of k is smaller than the value at k and every value to the right of k is larger.

Next we start moving elements. How? Bubble sort also moves elements, but in a very laborious way: to move the last element to the front it needs n-1 comparisons and n-1 exchanges, which is inefficient, when a single swap of the first and last elements would do. Can this idea be borrowed for sorting? This is why quick sort is often described as an improvement on bubble sort.

Quick sort proceeds as follows: first scan from the right end of the sequence towards the left with an index i (i--), find the first value smaller than the pivot, and exchange it with the pivot; then scan from the left towards the right with an index j (j++), find the first value larger than the pivot, and exchange it with the pivot; keep scanning until i and j meet. The final position of the pivot is k: every value to the left of k is smaller than the value at k, and every value to the right of k is larger.

So for the sequence 47, 29, 71, 99, 78, 19, 24, 47* above, the first exchange proceeds as shown in Figure 1.

After the exchange, j moves to index 6 and the scan with i continues, as shown in Figure 2.

 

At this point the exchanged sequence is 24, 29, 47, 99, 78, 19, 71, 47*. Next we continue operating on i and j, as shown in Figure 3, continuing the i-- and j++ scans.

 

After two more rounds of moving, comparing, and exchanging with i and j, we obtain the sequence 24, 29, 19, 47, 78, 99, 71, 47*. Continuing the i-- scan, we find that the element at i=4 is larger than 47, so no exchange happens, and at i=3 it meets j. No more moving or comparing is needed: k has been found, and its value is 3. We can check that in the current sequence every value to the left of k is smaller than 47 and every value to the right of k is larger (since relative positions of equal values should stay put, 47* also stays to the right of the pivot 47).

The value 47 has now landed where it belongs, and the first pass of sorting is complete. Next, the sequence is split into two parts around k, and the same sorting operation is performed on the left and right parts, after which the data is split into 4 parts; each part is then processed in the same way until every part contains only one value.

Next comes the second pass. The left part is now 24, 29, 19. We choose the first number, 24, as the pivot and perform the i-- and j++ scans. The first value i finds is 19, which is smaller than the pivot 24, so it is exchanged with the pivot, giving 19, 29, 24; then at j=1 we find that 29 is greater than 24, so it is exchanged with the pivot, giving 19, 24, 29, with i at 2 and j at 1; continuing i--, at i=1 it meets j, so k for this left part is 1 and its left and right sub-parts each contain a single element. The sorting of the left part in this second pass is finished, and all of its data is in order.

Now the right part: the sequence to be sorted is 78, 99, 71, 47*. We again choose the first value, 78, as the pivot, then scan with i and j. Scanning from the right, 47* is smaller than 78, so they are exchanged, giving 47*, 99, 71, 78; scanning from the left, 99 is larger than the pivot 78, so they are exchanged, giving 47*, 78, 71, 99; scanning from the right again, 71 is smaller than the pivot 78, so they are exchanged, giving 47*, 71, 78, 99. Now i's index within the whole array is 6 and j's is 5; continuing j++ would meet i, so this round of sorting is complete.

At this point, k for the right-hand sequence is 6, which is where the pivot ends up. The sequence is again split in two: 47*, 71 on the left and 99 on the right. The left part still needs sorting; although it has only two values, we follow the quick-sort procedure as before, choose 47* as the pivot, and scan with i from right to left. Nothing needs to move by the time i and j become equal, which completes this round of sorting.

So far all the sorting is finished, and the final sequence is 19, 24, 29, 47, 47*, 71, 78, 99. Quick sort finishes the whole job quite simply, doesn't it? In this particular run the two equal values kept their original order, but because quick sort moves elements back and forth across the array, the relative order of equal values can be changed in other cases (for example, an exchange could just as easily have moved 47* to the left of 47), so quick sort is not a stable algorithm.

/*
*   Dutch national flag problem
*   Problem 1:
*       Given an array arr and a number num, put all values <= num on the left of the array
*       and all values > num on the right.
*       Required: O(1) extra space, O(N) time.
*   Problem 2 (the Dutch national flag problem):
*       Given an array arr and a number num, put all values < num on the left, all values == num
*       in the middle, and all values > num on the right.
*       Required: O(1) extra space, O(N) time.
* */
public class QuickSort {
    public static void quickSort(int[] arr) {
        if (arr == null || arr.length < 2) {
            return;
        }
        quickSort(arr, 0, arr.length - 1);
    }

    // sort arr[L..R]
    public static void quickSort(int[] arr, int L, int R) {
        if (L < R) {
            swap(arr, L + (int) (Math.random() * (R - L + 1)), R);
            int[] p = partition(arr, L, R);
            quickSort(arr, L, p[0] - 1);
            quickSort(arr, p[1] + 1, R);
        }
    }

    // Partition arr[L...R].
    // By default arr[R] is the pivot p; the range is rearranged into  <p   ==p   >p
    // Returns the (left boundary, right boundary) of the ==p region as a length-2 array res: res[0], res[1]
    public static int[] partition1(int[] arr, int L, int R) {
        int less = L - 1; // right boundary of the "< pivot" region
        int more = R;   // left boundary of the "> pivot" region
        while (L < more) {
            // L is the current position, arr[R] is the pivot
            if (arr[L] < arr[R]) {    // current value < pivot
                swap(arr, ++less, L++);
            } else if (arr[L] > arr[R]) {    // current value > pivot
                swap(arr, --more, L);
            } else {
                L++;
            }
        }
        swap(arr, more, R);
        return new int[]{less + 1, more};
    }

    public static void swap(int[] arr, int i, int j){
        int tmp = arr[i];
        arr[i] = arr[j];
        arr[j] = tmp;
    }


    /*
    * Version 1.0: the pivot is the fixed element arr[R], so the worst-case time complexity is O(n^2)
    *
    * */
    public static void quickSort_1(int[] arr, int L, int R){
        if (L >= R){
            return;
        }
        int partition = arr[R];
        int less = L - 1;
        int more = R + 1;
        int left = L;
        while(left < more){
            if(arr[left] > partition){
                swap(arr, left, --more);
            }else if (arr[left] == partition){
                left++;
            }else{
                swap(arr, left++,++less);
            }
        }
        quickSort_1(arr, L, less);
        quickSort_1(arr, more, R);
    }

    /*
    * Version 2.0: the pivot is chosen at random, so the time complexity converges to O(nlogn) in expectation
    *
    * */
    public static void quickSort_v2(int[] arr, int L, int R){
        if (L < R){
            // swap a randomly chosen element into the rightmost position (the pivot)
            swap(arr, L+(int)(Math.random()*(R-L+1)),R);
            // partition the array into three parts: < pivot, == pivot, > pivot
            // return value: [0] = first index of the == region, [1] = last index of the == region
            int[] partition = partition(arr, L, R);
            // recursively sort the < region
            quickSort_v2(arr, L, partition[0]-1);
            // recursively sort the > region
            quickSort_v2(arr, partition[1]+1, R);
        }
    }
    // This method partitions the array into three parts: < pivot, == pivot, > pivot
    // It returns the two boundaries of the == region
    private static int[] partition(int[] arr, int L ,int R){
        int less = L-1;
        int more = R;
        while(L < more){
            if (arr[L] < arr[more]){
                swap(arr, L++, ++less);
            } else if (arr[L] > arr[more]){
                swap(arr, L, --more);
            } else {
                L++;
            }
        }
        swap(arr, more, R);
        return new int[]{less+1, more};
    }


    public static void main(String[] args) {
        int[] arr= {1,4,3,7,5,3};
        quickSort_v2(arr,0,arr.length-1);
        for (int i : arr) {
            System.out.print(i+" ");
        }
    }


}

Quick sort improves on bubble sort: bubble sort can only exchange two adjacent elements at a time, while quick sort makes jumping exchanges over large distances, so the total number of comparisons and exchanges is much smaller and the sort is much faster.

But the worst-case time complexity of quick sort is the same as bubble sort's, O(n²), which happens when essentially every comparison requires an exchange; this situation is uncommon. The average time complexity is O(nlogn), and in fact most of the time the sort runs faster than this average suggests. The algorithm is really a divide-and-conquer idea: split the problem into small parts, solve them separately, and then combine the results.

Quick sort sorts within the original array, so the array itself costs only constant extra space. However, each partition is followed by a recursive call, and the recursion consumes stack space: in general the space complexity is O(logn), and in the worst case, when each partition peels off only one element, it is O(n). So quick sort's space complexity is generally considered to be O(logn).

Quick sort is an unstable algorithm. After sorting, the relative position of elements with the same value may be changed.

Quicksort is basically considered to have the best average performance among all sorting algorithms of the same order of magnitude.

Summary:

Sorting algorithm   Time complexity   Space complexity   Stable
Selection sort      O(N^2)            O(1)               ×
Bubble sort         O(N^2)            O(1)               √
Insertion sort      O(N^2)            O(1)               √
Merge sort          O(N*logN)         O(N)               √
Quick sort          O(N*logN)         O(logN)            ×
Heap sort           O(N*logN)         O(1)               ×
Counting sort       O(N)
Radix sort          O(N)
  • In general, if you have to choose one sorting algorithm, choose quick sort (quick sort and heap sort have the same time complexity, but quick sort has the smallest constant factor); choose heap sort only when extra space is limited

  • Among comparison-based sorts, no algorithm can achieve a time complexity below O(N*logN)

  • Among comparison-based sorts with O(N*logN) time complexity and space complexity below O(N), no algorithm can be stable

  • Common pitfalls (very difficult, not required knowledge):

    • The extra space complexity of merge sort can be reduced to O(1) with the "merge sort internal cache method", but its stability is then lost

    • "In-place merge sort" can bring the extra space complexity down to O(1), but the time complexity then becomes O(N^2)

    • Quick sort can be made stable following the paper "01 stable sort", but its extra space complexity then becomes O(N)

    • There is a classic problem: place the odd numbers on the left side of the array and the even numbers on the right while keeping the original relative order of both. It can be done, but it is hard; see the paper "01 stable sort"!

      The partition step of classic quick sort is itself a 0/1 partition, the same kind of adjustment as this odd/even problem; the classic partition cannot do it stably, so without the paper's technique there is no obvious way to solve the problem

  • Sorting in engineering practice

    • Make full use of the respective advantages of O(N*logN) and O(N^2) sorts

      For example, in a classic quick sort, insertion sort can be used when the subarray being sorted is small and quick sort when it is large, taking advantage of each algorithm's strengths (see the sketch after this list)

    • Stability considerations

      Primitive types are sorted with quick sort, and non-primitive (object) types with merge sort (to guarantee stability)

 


Origin blog.csdn.net/weixin_44516623/article/details/128513836