Sorting algorithm analysis

 Algorithm stability: a sorting algorithm is stable if any two equal elements have the same relative order after sorting as they had before sorting.
 Time complexity: the order of magnitude by which the program's execution time grows as the input size grows, i.e. the asymptotic time complexity.
                        The number of times the statements in an algorithm are executed is called the statement frequency or time frequency, denoted T(n).
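
    As a small, hypothetical illustration of stability (the example data is my own, not from the original): Python's built-in sorted() is guaranteed stable, so when records are sorted by key alone, the two records with equal keys keep their original relative order.

    # Hypothetical example: sort (key, label) pairs by key only.
    # sorted() is stable, so (5, 'a') stays ahead of (5, 'c').
    pairs = [(5, "a"), (8, "b"), (5, "c"), (2, "d")]
    print(sorted(pairs, key=lambda p: p[0]))
    # [(2, 'd'), (5, 'a'), (5, 'c'), (8, 'b')]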
   

    Sorting algorithms:

     1 Bubble sort: repeatedly traverse the sequence to be sorted, comparing adjacent elements two at a time and swapping them
             if they are in the wrong order; each full pass moves the largest remaining element to the end of the unsorted part.

    def bubble_sort(alist):
        for j in range(len(alist)-1, 0, -1):
            # j is the number of comparisons needed in this pass; it shrinks by one each pass
            for i in range(j):
                if alist[i] > alist[i+1]:
                    alist[i], alist[i+1] = alist[i+1], alist[i]

    Optimal time complexity: O(n) (reached when one pass finds no elements to swap and the sort ends early; this needs the flag shown below)
    Worst time complexity: O(n²)
    Stability: stable
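
    The code above always runs every pass, so the O(n) best case relies on detecting a swap-free pass. A minimal sketch of that common optimization (the function name and flag variable are my own):

    def bubble_sort_early_exit(alist):
        for j in range(len(alist)-1, 0, -1):
            swapped = False
            for i in range(j):
                if alist[i] > alist[i+1]:
                    alist[i], alist[i+1] = alist[i+1], alist[i]
                    swapped = True
            # a pass with no swaps means the list is already sorted
            if not swapped:
                return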


   2 Quick sort: pick a pivot element and partition the sequence so that all elements smaller than the pivot end up on its left and all others on its right, then recursively sort the two partitions.

    def quick_sort(alist, start, end):
        """Quick sort"""
        # Recursive exit condition
        if start >= end:
            return

        # Take the starting element as the pivot whose final position we are finding
        mid = alist[start]

        # low is the cursor moving from left to right over the left part of the sequence
        low = start

        # high is the cursor moving from right to left over the right part of the sequence
        high = end

        while low < high:
            # While low and high have not met and the element at high is not
            # smaller than the pivot, move high to the left
            while low < high and alist[high] >= mid:
                high -= 1
            # Put the element at high into the slot at low
            alist[low] = alist[high]

            # While low and high have not met and the element at low is
            # smaller than the pivot, move low to the right
            while low < high and alist[low] < mid:
                low += 1
            # Put the element at low into the slot at high
            alist[high] = alist[low]

        # After the loop exits, low and high coincide; that index is the
        # pivot's correct position, so put the pivot there
        alist[low] = mid

        # Quick sort the subsequence to the left of the pivot
        quick_sort(alist, start, low-1)

        # Quick sort the subsequence to the right of the pivot
        quick_sort(alist, low+1, end)

    Time complexity:
        Optimal time complexity: O(nlogn)
        Worst time complexity: O(n²)
        Stability: unstable
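
    A minimal usage sketch (the example data is my own, not from the original):

    data = [54, 26, 93, 17, 77, 31, 44, 55, 20]
    quick_sort(data, 0, len(data) - 1)
    print(data)   # [17, 20, 26, 31, 44, 54, 55, 77, 93]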
    
    3 Selection sort:
      Find the smallest (or largest) element in the unsorted sequence and store it at the start of the sorted sequence; then keep finding the smallest (or largest) of the remaining unsorted elements and appending it to the end of the sorted sequence, and so on until all elements are sorted.
      # Selecting the largest each time (for descending order) instead of the smallest gives the same O(n^2) complexity; n-1 selection passes are enough, since the last element is then already in place
    def selection_sort(alist):
        n = len(alist)
        # n-1 selection passes are needed
        for i in range(n-1):
            # record the position of the current minimum
            min_index = i
            # find the smallest element from position i+1 to the end
            for j in range(i+1, n):
                if alist[j] < alist[min_index]:
                    min_index = j
            # if the smallest element is not already in place, swap it in
            if min_index != i:
                alist[i], alist[min_index] = alist[min_index], alist[i]

    Time complexity:
        Optimal time complexity: O(n²)
        Worst time complexity: O(n²)
        Stability: unstable (when selecting the minimum for ascending order, the swap can move an element past an equal one)

        To elaborate with an example: take the sequence 5 8 5 2 9. In the first pass the first 5 is selected and swapped with 2, which destroys the relative order of the two 5s in the original sequence, so selection sort is not a stable sorting algorithm.
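
    A sketch that makes the reordering visible (the Item class and labels are my own; Item compares by value only, so the two 5s count as equal):

    class Item:
        def __init__(self, value, label):
            self.value, self.label = value, label
        def __lt__(self, other):
            # selection_sort only uses <, so comparing by value is enough
            return self.value < other.value
        def __repr__(self):
            return f"{self.value}{self.label}"

    data = [Item(5, "a"), Item(8, "b"), Item(5, "c"), Item(2, "d"), Item(9, "e")]
    selection_sort(data)
    print(data)   # [2d, 5c, 5a, 8b, 9e] -- 5a and 5c have swapped relative order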

    4 Insertion sort (English: Insertion Sort) is a simple and intuitive sorting algorithm. It works by building up an ordered sequence: for each unsorted element, scan from back to front through the already-sorted part, find the right position and insert it there. In this back-to-front scan, the sorted elements are shifted back one step at a time to make room for the element being inserted.

    def insert_sort(alist):
        # Insert forward starting from the second position, i.e. the element at index 1
        for i in range(1, len(alist)):
            # Compare backwards from the i-th element; while it is smaller than
            # the element before it, swap the two
            for j in range(i, 0, -1):
                if alist[j] < alist[j-1]:
                    alist[j], alist[j-1] = alist[j-1], alist[j]
                else:
                    # already in position: stop early (this break is what makes
                    # the best case O(n) on an already-sorted sequence)
                    break

    Time complexity:
        Optimal time complexity: O(n) (when the sequence is already in ascending order)
        Worst time complexity: O(n²)
        Stability: stable
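
    A minimal usage sketch (the example data is my own):

    data = [26, 54, 93, 17, 77]
    insert_sort(data)
    print(data)   # [17, 26, 54, 77, 93]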
 

Origin blog.csdn.net/weixin_42322206/article/details/100068219