Several commonly used sorting algorithms

Three sorting algorithms with O(n²) time complexity

    Insertion sort

    The main idea of insertion sort: each time a new element is taken, it is compared against the already sorted prefix and inserted into its correct position. The concrete implementation:

 
 
// Insertion sort: grow a sorted prefix a[0..i-1] and insert a[i] into it
// by swapping it backwards until it reaches its correct position.
void insertSort(int a[],int length)
{
	if(a==NULL||length<=0) return;
	for(int i=1;i<length;i++)
	{
		int j=i;
		// move a[j] left while it is smaller than its predecessor
		while(j-1>=0&&a[j-1]>a[j])
		{
			int tmp=a[j-1];
			a[j-1]=a[j];
			a[j]=tmp;
			j--;
		}
	}
}
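
    A minimal usage sketch (assuming the insertSort above is compiled in the same file; the sample values are arbitrary):

#include <stdio.h>

int main(void)
{
	int a[] = {5, 2, 9, 1, 7};
	int n = sizeof(a)/sizeof(a[0]);
	insertSort(a, n);            // sorts a in ascending order
	for(int i=0;i<n;i++)
		printf("%d ", a[i]);     // expected output: 1 2 5 7 9
	printf("\n");
	return 0;
}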
 
 

    In the worst case of insertion sort (sorting in ascending order here), the array starts in reverse order: the i-th element (i >= 1) has to be compared with, and moved past, all i elements before it. The total number of moves is 1 + 2 + 3 + ... + (n-1) = n(n-1)/2, so the worst-case time complexity is O(n²).

    The best case is an array that is already sorted: each element needs only one comparison, a single pass suffices, and the time complexity is O(n).

    By the same reasoning, the average case is also O(n²).

    Bubble Sort

    The main idea of bubble sort: repeatedly compare adjacent elements and swap them when they are out of order, so that smaller elements gradually bubble toward the front of the array and larger ones settle toward the back.

    The code is implemented as follows:

// Bubble sort (written so that after the i-th outer pass a[0..i] is sorted):
// compare adjacent pairs and swap them when they are out of order.
void bubSort(int a[],int length)
{
	if(a==NULL||length<=0) return;
	for(int i=1;i<length;i++)
	{
		for(int j=i;j>0;j--)
		{
			// let the smaller element bubble toward the front
			if(a[j]<a[j-1])
			{
				int tmp=a[j];
				a[j]=a[j-1];
				a[j-1]=tmp;
			}
		}
	}
}

    As the code shows, the two nested loops perform about n(n-1)/2 comparisons regardless of the input, so the time complexity of this bubble sort is O(n²).

    Selection sort

    The main idea is: find the smallest element in the array and put it in the first position, then find the next smallest and put it in the second position, and so on until the whole array is ordered.

    Code:

// Selection sort: for each position i, find the minimum of a[i..length-1]
// and swap it into position i.
void selectSort(int a[],int length)
{
	if(a==NULL||length<=0) return;
	for(int i=0;i<length;i++)
	{
		int min=a[i];   // smallest value seen so far
		int pos=i;      // and its position
		for(int j=i+1;j<length;j++)
		{
			if(a[j]<min)
			{
				min=a[j];
				pos=j;
			}
		}
		// swap the smallest element into position i
		int tmp=a[i];
		a[i]=min;
		a[pos]=tmp;
	}
}

    Selection sort's best, worst, and average time complexity are all O(n²).

Here is a comparison of the three simple sorting algorithms above (the bubble-sort figures refer to the implementation shown, which has no early-exit check):

    Algorithm         Best      Worst     Average
    Insertion sort    O(n)      O(n²)     O(n²)
    Bubble sort       O(n²)     O(n²)     O(n²)
    Selection sort    O(n²)     O(n²)     O(n²)

    Shell sort

        The main idea: Shell sort is an improvement on insertion sort. Insertion sort compares only adjacent elements, whereas Shell sort compares elements that are a fixed gap (increment) apart, effectively splitting the array into several interleaved subsequences and insertion-sorting each one. The gap is reduced step by step, and the final pass uses a gap of 1, which is an ordinary insertion sort on an array that is already nearly sorted.

        Code

// Insertion sort on the subsequence a[0], a[incr], a[2*incr], ...
// (an increment of 1 makes this an ordinary insertion sort)
void shellInsertSort(int a[],int length,int incr)
{
	for(int i=incr;i<length;i+=incr)
	{
		int j=i;
		// move a[j] back in steps of incr while it is smaller than
		// the element incr positions before it
		while(j-incr>=0&&a[j-incr]>a[j])
		{
			int tmp=a[j-incr];
			a[j-incr]=a[j];
			a[j]=tmp;
			j-=incr;
		}
	}
}

// Shell sort: start with a large gap, insertion-sort each of the
// interleaved subsequences, then halve the gap; finish with a gap of 1.
void shellSort(int a[],int length)
{
	if(a==NULL||length<=0) return;
	for(int i=length/2;i>2;i=i/2)
		for(int j=0;j<i;j++)
			shellInsertSort(a+j,length-j,i);  // subsequence starting at offset j
	shellInsertSort(a,length,1);              // final pass: ordinary insertion sort
}

    With the gap-halving sequence used here, the worst-case time complexity of Shell sort is O(n²), and the average time complexity is roughly O(n^1.5).

Quicksort

    The main idea of quicksort: using divide and conquer, pick a pivot element and partition the array so that the elements to the left of the pivot are smaller than it and the elements to the right are greater than or equal to it, then apply the same steps recursively to the two parts.

    Code:

public class QuickSort {

    public static void quickSort(int arr[],int _left,int _right){
        int left = _left;
        int right = _right;
        int temp = 0;
        if(left <= right){ // there is still something to sort in this range
            temp = arr[left]; // the first element of the range is the pivot
            while(left != right){ // scan alternately from both ends until left == right

                while(right > left && arr[right] >= temp)
                    right--; // scan from right to left for the first element smaller than the pivot
                arr[left] = arr[right]; // move it into the "hole" at arr[left]

                while(left < right && arr[left] <= temp)
                    left++; // scan from left to right for the first element larger than the pivot
                arr[right] = arr[left]; // move it into the "hole" at arr[right]

            }
            arr[right] = temp; // put the pivot into its final position
            quickSort(arr,_left,left-1); // recursively sort the elements left of the pivot
            quickSort(arr, right+1,_right); // recursively sort the elements right of the pivot
        }
    }
    public static void main(String[] args) {
        int array[] = {10,5,3,1,7,2,8};
        System.out.println("Before sorting: ");
        for(int element : array){
            System.out.print(element+" ");
        }
        
        quickSort(array,0,array.length-1);

        System.out.println("\nAfter sorting: ");
        for(int element : array){
            System.out.print(element+" ");
        }

    }

}

    The best-case and average time complexity of quicksort is O(n log n); the worst case (for example, an already sorted array with this leftmost-pivot choice) is O(n²).

Heap sort

    The main idea: build a max-heap from the array, then repeatedly swap the heap's root (the largest remaining element) with the last element of the heap, shrink the heap by one, and sift the new root down to restore the heap property.

    Heap sort was already implemented in an earlier section; here we only note that its best, worst, and average time complexity are all O(n log n).
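
    For reference, a minimal sketch of heap sort using sift-down (the siftDown/heapSort names are just for illustration and need not match the earlier section's implementation):

// Sift a[i] down within the heap a[0..n-1] until the max-heap property holds.
void siftDown(int a[],int n,int i)
{
	while(2*i+1<n)
	{
		int child=2*i+1;                       // left child
		if(child+1<n&&a[child+1]>a[child])
			child++;                           // pick the larger child
		if(a[i]>=a[child]) break;              // heap property already holds
		int tmp=a[i]; a[i]=a[child]; a[child]=tmp;
		i=child;
	}
}

void heapSort(int a[],int length)
{
	if(a==NULL||length<=0) return;
	// build a max-heap bottom-up
	for(int i=length/2-1;i>=0;i--)
		siftDown(a,length,i);
	// repeatedly move the largest element to the end and shrink the heap
	for(int i=length-1;i>0;i--)
	{
		int tmp=a[0]; a[0]=a[i]; a[i]=tmp;
		siftDown(a,i,0);
	}
}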

Merge sort

    The main idea: recursively sort the left and right halves of the array, then merge the two sorted halves back together with the help of a temporary array.

// Merge sort: recursively sort a[left..mid] and a[mid+1..right],
// then merge the two sorted halves back into a[] using tmp[] as scratch space.
void mergeSort(int a[],int tmp[],int left,int right)
{
	if(left>=right) return;
	int mid=(left+right)/2;
	mergeSort(a,tmp,left,mid);
	mergeSort(a,tmp,mid+1,right);
	// copy the current range into the scratch array
	for(int i=left;i<=right;i++)
	{
		tmp[i]=a[i];
	}
	int i1=left,j=mid+1;
	for(int i=left;i<=right;i++)
	{
		if(i1==mid+1){
			// the left half is exhausted; take from the right half
			a[i]=tmp[j++];
		}
		else if(j>right)
		{
			// the right half is exhausted; take from the left half
			a[i]=tmp[i1++];
		}
		else if(tmp[i1]<tmp[j])
		{
			a[i]=tmp[i1++];
		}
		else
		{
			a[i]=tmp[j++];
		}
	}
}
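
    A minimal usage sketch (assuming the mergeSort above; the scratch array tmp must be at least as long as the range being sorted):

#include <stdio.h>

int main(void)
{
	int a[] = {8, 3, 5, 1, 9, 2};
	int n = sizeof(a)/sizeof(a[0]);
	int tmp[6];                      // scratch buffer, same length as a
	mergeSort(a, tmp, 0, n-1);       // sorts a[0..n-1] in ascending order
	for(int i=0;i<n;i++)
		printf("%d ", a[i]);         // expected output: 1 2 3 5 8 9
	printf("\n");
	return 0;
}
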
    For merge sort, the recursion depth is log n regardless of the input, and each level does O(n) work merging, so the best, worst, and average time complexity are all O(n log n).


