Data structures: a summary of sorting algorithms

Disclaimer: this is an original post by the blogger, licensed under the CC 4.0 BY-SA agreement; when reproducing it, please include the original source link and this statement.
Original link: https://blog.csdn.net/Stars_min/article/details/79045433

Several common sorting algorithms

1 Bubble sort (BubbleSort)

The idea of bubble sort:

1. Compare adjacent elements of the sequence from left to right in turn, so that the larger of each pair always ends up on the right. (After the first pass, the last element of the sequence is the current maximum.)

2. Repeat step 1 on the remaining n-1 elements of the sequence.

3. For a sequence of length n, a total of n-1 passes of comparisons are needed.

Code implementation :

     for (i = 0; i < n - 1; i++)
     {
           for (j = 0; j < n - 1 - i; j++)   /* stop at n-1-i so a[j+1] stays in bounds */
           {
                  if (a[j] > a[j+1])
                  {
                         t = a[j];
                         a[j] = a[j+1];
                         a[j+1] = t;
                  }
           }
     }

Bubble sort is stable. The time complexity of the algorithm is O(n^2) and the space complexity is O(1).
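The same double loop can be sketched in Python; `bubble_sort` is an illustrative name, not one from the original article:

```python
def bubble_sort(a):
    """Bubble sort: after pass i, the largest i+1 elements are in their final places."""
    n = len(a)
    for i in range(n - 1):             # n-1 passes in total
        for j in range(n - 1 - i):     # compare adjacent pairs left to right
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]   # keep the larger one on the right
    return a
```

For example, `bubble_sort([5, 1, 4, 2, 8])` returns `[1, 2, 4, 5, 8]`.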

2 Direct selection sort (SelectionSort)    

The basic idea of direct selection sort is comparison and exchange:

1. From the sequence to be sorted, find the element with the smallest key;

2. If that element is not the first element of the unsorted part, exchange it with the first element;

3. From the remaining n-1 elements, again find the element with the smallest key, and repeat steps 1 and 2 until sorting is complete.

Selection sort is implemented with two loops.

The first loop: walk through each position of the sequence in turn.

The second loop: compare the element at the current position with the remaining elements in turn and, whenever a smaller element is found, exchange.

Direct selection sort is unstable. The time complexity of the algorithm is O(n^2) and the space complexity is O(1).

Code:

def select_sort(L):
    # walk through each position of the sequence in turn
    for x in range(0, len(L)):
        # the candidate minimum for this round is the element at the current position
        minimum = L[x]
        # compare it with the remaining elements in turn to find the smallest one
        for i in range(x + 1, len(L)):
            if L[i] < minimum:
                # exchange: the smaller value becomes the new candidate minimum
                temp = L[i]
                L[i] = minimum
                minimum = temp
        # write the true minimum found by the comparisons back to the current position
        L[x] = minimum
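A more conventional variant records the index of the minimum during the scan and performs a single swap per pass; this is a sketch for comparison, and `selection_sort` is an assumed name:

```python
def selection_sort(L):
    for x in range(len(L) - 1):
        min_idx = x                        # assume the current position holds the minimum
        for i in range(x + 1, len(L)):
            if L[i] < L[min_idx]:
                min_idx = i                # remember where the smaller element is
        if min_idx != x:
            L[x], L[min_idx] = L[min_idx], L[x]   # one swap per pass
    return L
```

Doing only one swap per pass avoids the repeated writes of the in-loop exchange version.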

 

3 Direct insertion sort (InsertionSort)

The core idea of direct insertion sort: compare each element of the array in turn with the already-sorted elements in front of it, and exchange while the selected element is smaller than a sorted element, until all elements have been compared.

Direct insertion sort is stable. The time complexity of the algorithm is O(n^2) and the space complexity is O(1).

Direct insertion sort can be implemented with two loops:

1. The first loop: traverse all the array elements to be compared.

2. The second loop: compare the element selected this round (selected) with the already-sorted elements (ordered). If ordered > selected, exchange the two.

Code:

def insert_sort(L):
    # loop through all elements of the array; the element at index 0 counts as
    # already sorted, so start from index 1
    for x in range(1, len(L)):
        # compare the element with the sorted prefix in front of it and
        # exchange while it is smaller
        # range(x-1, -1, -1): iterate from x-1 down to 0
        for i in range(x - 1, -1, -1):
            # exchange if the sorted element is larger than the selected one
            if L[i] > L[i + 1]:
                temp = L[i + 1]
                L[i + 1] = L[i]
                L[i] = temp
            else:
                break  # the prefix is already sorted, so no further exchanges are needed
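An equivalent sketch that shifts elements instead of swapping them, which is how direct insertion sort is often written; the names here are illustrative:

```python
def insertion_sort(L):
    for x in range(1, len(L)):
        selected = L[x]                  # the element to insert this round
        i = x - 1
        while i >= 0 and L[i] > selected:
            L[i + 1] = L[i]              # shift the larger sorted element one slot right
            i -= 1
        L[i + 1] = selected              # drop the selected element into the gap
    return L
```

Shifting halves the number of writes compared with pairwise swapping, but the result is identical.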



4 Heap sort (HeapSort)

Heap sort is a tree-based selection sort. During sorting, A[n] is treated as a complete binary tree stored in sequential order, and the relationship between parent and child nodes of the complete binary tree is used to select the largest element each round (when a max-heap is used).

Heap sort is unstable. The time complexity is O(nlogn) and the space complexity is O(1).

 

1. First build the sequence into a max-heap;

2. Remove the current heap top (the root of the max-heap) by exchanging it with the last element of the sequence;

3. Re-adjust the remaining n-1 elements after the exchange so that they again satisfy the max-heap property;

4. Repeat steps 2 and 3 until only one element remains in the heap.
Code: 
void HeapSort(RecordType r[], int length)
{
    int i, n;
    RecordType b;
    crt_heap(r, length);      /* build the initial max-heap */
    n = length;
    for (i = n; i >= 2; --i)
    {
        b = r[1];             /* exchange the heap top with the last record */
        r[1] = r[i];
        r[i] = b;
        sift(r, 1, i - 1);    /* re-adjust so that r[1..i-1] is a max-heap again */
    }
} /* HeapSort */
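The `crt_heap` and `sift` routines above come from the textbook's heap module and are not shown. A self-contained Python sketch of the same idea, with assumed names, builds the max-heap bottom-up and then repeatedly exchanges the top with the tail:

```python
def sift_down(a, root, end):
    """Restore the max-heap property for the subtree rooted at a[root], within a[0..end]."""
    while 2 * root + 1 <= end:
        child = 2 * root + 1
        if child + 1 <= end and a[child + 1] > a[child]:
            child += 1                       # pick the larger of the two children
        if a[root] >= a[child]:
            return                           # heap property already holds
        a[root], a[child] = a[child], a[root]
        root = child                         # continue sifting down the swapped element

def heap_sort(a):
    n = len(a)
    for root in range(n // 2 - 1, -1, -1):   # build the initial max-heap bottom-up
        sift_down(a, root, n - 1)
    for end in range(n - 1, 0, -1):          # exchange top with tail, shrink the heap
        a[0], a[end] = a[end], a[0]
        sift_down(a, 0, end - 1)
    return a
```

Note this version uses 0-based indexing (children of `i` are `2i+1` and `2i+2`), unlike the 1-based textbook fragment.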
 

5 Merge sort (MergeSort)

Merge sort uses the divide-and-conquer strategy; it really consists of two operations, splitting plus merging:

Decompose: split the sequence in half repeatedly.

Merge: merge the split subsequences back together pairwise, sorting as they are merged.

Merge sort is stable. The time complexity is O(nlog2n) in both the best and the worst case; the space complexity is O(n), since an auxiliary array is needed for merging.

 

The merge algorithm:

void Merge(RecordType r1[], int low, int mid, int high, RecordType r[])
{
    int i = low, j = mid + 1, k = low;
    while ((i <= mid) && (j <= high))
    {
        if (r1[i].key <= r1[j].key)   /* <= keeps equal keys in order: stable */
            r[k] = r1[i++];
        else
            r[k] = r1[j++];
        ++k;
    }
    while (i <= mid)          /* copy the rest of the left run: r[k..high] = r1[i..mid] */
        r[k++] = r1[i++];
    while (j <= high)         /* copy the rest of the right run: r[k..high] = r1[j..high] */
        r[k++] = r1[j++];
}
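Putting decomposition and merging together, here is a complete recursive sketch in Python; `merge_sort` is an illustrative name:

```python
def merge_sort(a):
    if len(a) <= 1:
        return a                      # a run of length 0 or 1 is already sorted
    mid = len(a) // 2
    left = merge_sort(a[:mid])        # decompose: split the sequence in half
    right = merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge the two sorted runs
        if left[i] <= right[j]:               # <= keeps equal keys in order: stable
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])           # copy whatever remains of either run
    merged.extend(right[j:])
    return merged
```

For example, `merge_sort([38, 27, 43, 3])` returns `[3, 27, 38, 43]`.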

 

6 Quick sort (QuickSort)

Quick sort is essentially an improved bubble sort.

The basic idea (the "dig the pit and fill it" partition method): each single pass greatly shrinks the range that still needs sorting. In bubble sort, one scan only guarantees that the maximum value reaches its correct position, so the sorted part of the sequence grows by just one element per pass. In quick sort, one scan guarantees that every number to the left of a chosen number (call it the pivot) is smaller than it, and every number to its right is larger. The two sides of the pivot are then processed the same way, recursively, until only single elements remain around each pivot.

The pseudocode is as follows:

1. i = L; j = R; dig out the pivot a[i] first, leaving a pit at position i.

2. Decrement j, searching from back to front for a number smaller than the pivot; when one is found, fill it into the previous pit a[i], leaving a new pit at j.

3. Increment i, searching from front to back for a number larger than the pivot; when one is found, fill it into the previous pit a[j], leaving a new pit at i.

4. Repeat steps 2 and 3 until i == j, then fill the pivot into a[i].
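The four steps above can be sketched directly in Python (names are illustrative); note the `>=`/`<=` comparisons, which keep the scans moving when keys equal the pivot:

```python
def quick_sort(a, left, right):
    if left >= right:
        return
    key = a[left]                     # step 1: the pivot leaves the first pit at i
    i, j = left, right
    while i < j:
        while i < j and a[j] >= key:  # step 2: scan from the back for a smaller number
            j -= 1
        a[i] = a[j]                   # fill the pit at i; the pit moves to j
        while i < j and a[i] <= key:  # step 3: scan from the front for a larger number
            i += 1
        a[j] = a[i]                   # fill the pit at j; the pit moves back to i
    a[i] = key                        # step 4: i == j, drop the pivot into the last pit
    quick_sort(a, left, i - 1)
    quick_sort(a, i + 1, right)
```

Usage: `lst = [3, 7, 1, 5, 2]; quick_sort(lst, 0, len(lst) - 1)` leaves `lst` as `[1, 2, 3, 5, 7]`.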

 

Quick sort is unstable. The ideal-case time complexity is O(nlog2n) and the worst case is O(n^2); the space complexity is O(log2n) on average for the recursion stack, O(n) in the worst case.

  

Code:

void quicksort(vector<int> &v, int left, int right){
        if(left < right){
                int key = v[left];        // the pivot leaves the first pit at left
                int low = left;
                int high = right;
                while(low < high){
                        while(low < high && v[high] >= key){  // >= so equal keys cannot loop forever
                                high--;
                        }
                        v[low] = v[high];                     // fill the pit at low; the pit moves to high
                        while(low < high && v[low] <= key){
                                low++;
                        }
                        v[high] = v[low];                     // fill the pit at high; the pit moves to low
                }
                v[low] = key;             // drop the pivot into the final pit
                quicksort(v, left, low - 1);
                quicksort(v, low + 1, right);
        }
}

 

7 Shell sort (ShellSort)

 

In direct insertion sort, each insertion grows the sorted part by only one element and gives no help to the next insertion. If elements that are farther apart (at a distance called the increment, or gap) are compared instead, an element can move across several positions in one step, so a single comparison may eliminate several exchanges.

The Shell sort algorithm: group the array to be sorted by a step size gap, then sort the elements within each group by direct insertion sort; halve gap and repeat the above; when gap = 1, a final direct insertion sort completes the job.
Shell sort is unstable, because elements in different groups can leapfrog equal elements. The worst-case time complexity is O(n^2) (the exact bound depends on the gap sequence) and the space complexity is O(1).

 

Overall, Shell sort is implemented with three loops:

1. The first loop: halve gap repeatedly to regroup the sequence, until gap = 1.

2. The second and third loops: the two loops of direct insertion sort described above, applied within each group.

Code:

def insert_shell(L):
    # initialize gap; half the sequence length is a common starting value
    gap = len(L) // 2
    # the first loop: keep regrouping the list with a shrinking gap
    while gap >= 1:
        # below: sort each group using the idea of direct insertion sort
        # range(gap, len(L)): start from index gap
        for x in range(gap, len(L)):
            # range(x-gap, -1, -gap): walk back from x-gap in steps of gap,
            # comparing elements that are gap apart
            for i in range(x - gap, -1, -gap):
                # if two elements within a group are out of order, exchange them
                if L[i] > L[i + gap]:
                    temp = L[i + gap]
                    L[i + gap] = L[i]
                    L[i] = temp
        # loop condition for the while: halve the gap
        gap //= 2
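The same gapped-insertion idea in a compact, self-contained form, using a while loop to walk back within each group; `shell_sort` is an assumed name for illustration:

```python
def shell_sort(L):
    gap = len(L) // 2
    while gap >= 1:
        for x in range(gap, len(L)):          # gapped direct insertion sort
            i = x
            while i - gap >= 0 and L[i - gap] > L[i]:
                L[i - gap], L[i] = L[i], L[i - gap]   # exchange within the group
                i -= gap
        gap //= 2                             # halve the gap each round
    return L
```

With gap halving the final round is gap = 1, i.e. an ordinary direct insertion sort over a nearly sorted list.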

 

 
