Comparison of the stability, time complexity and space complexity of common sorting algorithms

1. Concept

1.1 Time complexity

In general, the number of times the basic operation of an algorithm is repeated is a function of the problem size n, denoted T(n). If there is an auxiliary function f(n) such that the limit of T(n)/f(n) as n approaches infinity is a nonzero constant, then f(n) is said to be of the same order of magnitude as T(n). This is written as T(n) = O(f(n)), and O(f(n)) is called the asymptotic time complexity of the algorithm, or simply the time complexity.

If the number of statements an algorithm executes is a constant, its time complexity is O(1). Different time frequencies can also give the same time complexity; for example, T(n) = n² + 3n + 4 and T(n) = 4n² + 2n + 1 have different frequencies, but both are O(n²). Arranged in increasing order of magnitude, the common time complexities are: constant order O(1), logarithmic order O(log₂n), linear order O(n), linear logarithmic order O(n·log₂n), square order O(n²), cubic order O(n³), ..., k-th power order O(nᵏ), and exponential order O(2ⁿ). As the problem size n grows, an algorithm with a higher-order time complexity takes longer to run and is therefore less efficient.
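As a rough illustration in Python (the function names are just examples), the sketch below counts how many times the basic operation of a single loop and of a nested double loop executes; the counts grow like n and n², which is exactly what O(n) and O(n²) describe.

def linear_count(n):
    # Single loop: the basic operation runs n times, so T(n) = n and the complexity is O(n).
    count = 0
    for _ in range(n):
        count += 1          # basic operation
    return count

def quadratic_count(n):
    # Nested loop: the basic operation runs n * n times, so T(n) = n² and the complexity is O(n²).
    count = 0
    for _ in range(n):
        for _ in range(n):
            count += 1      # basic operation
    return count

for n in (10, 100, 1000):
    print(n, linear_count(n), quadratic_count(n))   # e.g. n = 100 gives 100 and 10000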

 

1.2 Space complexity

Similar to time complexity, space complexity measures the amount of storage an algorithm requires while it runs, written S(n) = O(f(n)). It generally refers to the auxiliary storage used in addition to the memory normally occupied by the input itself.

When the space complexity of an algorithm is a constant, that is, it does not change with the size n of the data being processed, it is written O(1); when the space required is proportional to the base-2 logarithm of n, it is written O(log₂n).
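A minimal Python sketch of the distinction (the function names are illustrative only): reversing a list in place uses only a constant number of extra variables, so its auxiliary space is O(1), while building a reversed copy allocates a new list of size n, so its auxiliary space is O(n).

def reverse_in_place(a):
    # O(1) auxiliary space: only two index variables, no matter how long the list is.
    i, j = 0, len(a) - 1
    while i < j:
        a[i], a[j] = a[j], a[i]
        i += 1
        j -= 1
    return a

def reverse_into_copy(a):
    # O(n) auxiliary space: a new list as large as the input is allocated.
    return [a[k] for k in range(len(a) - 1, -1, -1)]

print(reverse_in_place([1, 2, 3, 4]))    # [4, 3, 2, 1], no extra list created
print(reverse_into_copy([1, 2, 3, 4]))   # [4, 3, 2, 1], but uses a second list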

 

1.3 Stability

Suppose the sequence of records to be sorted contains multiple records with equal keys. If the relative order of these records is preserved by sorting, that is, whenever ri = rj and ri appears before rj in the original sequence, ri still appears before rj in the sorted sequence, then the sorting algorithm is said to be stable; otherwise it is said to be unstable.
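A small Python check of the definition, using the built-in sorted(), which is a stable sort; the (key, label) record format is just an example. The two records that share the key 3 keep their original relative order after sorting.

records = [(3, 'a'), (1, 'b'), (3, 'c'), (2, 'd')]

# sorted() is stable, so (3, 'a') stays ahead of (3, 'c') when sorting by key only.
by_key = sorted(records, key=lambda r: r[0])
print(by_key)   # [(1, 'b'), (2, 'd'), (3, 'a'), (3, 'c')]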

 

2. Commonly used sorting algorithms

2.1 Stability of commonly used algorithms

Selection sort, quick sort, Shell sort, and heap sort are not stable sorting algorithms;

Bubble sort, insertion sort, merge sort, and radix sort are stable sorting algorithms.
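As a concrete illustration of why selection sort appears on the unstable list, the following Python sketch sorts the records (2, 'a'), (2, 'b'), (1, 'c') by key; the swap in the first pass moves (2, 'a') behind (2, 'b'), so the two records with equal keys end up in reversed order.

def selection_sort(a):
    # Plain selection sort by the record key a[k][0]; the swap can reorder equal keys.
    a = list(a)
    n = len(a)
    for i in range(n):
        m = i
        for j in range(i + 1, n):
            if a[j][0] < a[m][0]:
                m = j
        a[i], a[m] = a[m], a[i]   # this swap is what breaks stability
    return a

print(selection_sort([(2, 'a'), (2, 'b'), (1, 'c')]))
# [(1, 'c'), (2, 'b'), (2, 'a')]  -> the two records with key 2 have swapped order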

 

2.2 Comparison of the complexity and stability of common algorithms

Sorting algorithm    Average time      Worst time      Auxiliary space    Stability
Bubble sort          O(n²)             O(n²)           O(1)               Stable
Insertion sort       O(n²)             O(n²)           O(1)               Stable
Selection sort       O(n²)             O(n²)           O(1)               Unstable
Shell sort           about O(n^1.3)    O(n²)           O(1)               Unstable
Merge sort           O(n·log₂n)        O(n·log₂n)      O(n)               Stable
Quick sort           O(n·log₂n)        O(n²)           O(log₂n)           Unstable
Heap sort            O(n·log₂n)        O(n·log₂n)      O(1)               Unstable
Radix sort           O(d(n+r))         O(d(n+r))       O(n+r)             Stable

For radix sort, d is the number of digit positions and r is the radix; Shell sort's average time depends on the chosen gap sequence.