Time complexity and space complexity: a complete guide

The two terms mean roughly what they say:

  • Time complexity: how long the algorithm takes to execute; the faster, the better.
  • Space complexity: how much storage space the algorithm needs while it executes; the less, the better.

Notation
We generally use "big O notation" to express time complexity: T(n) = O(f(n)).
Here n is the data size that drives the complexity, and f(n) is a function describing how the number of operations grows with n.

Why time and space complexity matter

Evaluating the efficiency of an algorithm mainly depends on its time complexity and space complexity.

Some developers (especially on the client side) may rarely need to optimize for time or space complexity, but on the server side it matters a great deal: under heavy concurrency, even a small improvement in time or space complexity can bring a huge performance gain.

Time complexity

Common time complexity metrics are as follows:

  • Constant order O(1)
  • Linear order O(n)
  • Logarithmic order O(logN)
  • Linear logarithmic order O(nlogN)
  • Square order O(n²)
  • Cubic order O(n³)
  • K-th order O(n^k)
  • Exponential order O(2^n)
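
To get a feel for how these classes diverge, here is a small illustrative sketch (an addition to this article, not from the original) that prints rough operation counts for each class at n = 16:

```java
public class GrowthDemo {
    public static void main(String[] args) {
        int n = 16;

        // count how many times 1 can be doubled before reaching n: this is log2(n)
        int log2 = 0;
        for (int i = 1; i < n; i *= 2) {
            log2++;
        }

        System.out.println("O(1)     -> " + 1);          // 1
        System.out.println("O(logN)  -> " + log2);       // 4
        System.out.println("O(n)     -> " + n);          // 16
        System.out.println("O(nlogN) -> " + n * log2);   // 64
        System.out.println("O(n^2)   -> " + n * n);      // 256
        System.out.println("O(2^n)   -> " + (1 << n));   // 65536
    }
}
```

Even at n = 16 the exponential class is already 256 times larger than the quadratic one, which is why the classes near the bottom of the list are avoided whenever possible.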

1. Constant order O(1)

int a = 1;
int b = 2;
int c = 3;

Big O notation does not express the actual execution time of an algorithm; it expresses the trend of how the execution time grows.
Even if the code above ran to tens of thousands of lines, as long as the number of executions does not grow with n, its time complexity is still O(1).

2. Linear order O(n)

for(i = 1; i <= n; i++) {
    j = i;   // executes n times
}

Whatever value n takes, the body of the for loop runs exactly n times, so its time complexity is O(n).

3. Logarithmic order O(logN)

int i = 1;
while(i < n) {
    i = i * 2;
}

You can see that i is doubled on every pass, so the loop runs log₂n times in total, and the time complexity of this code is O(log₂n).
The base of the logarithm does not matter when studying the efficiency of a program; what matters is how the data size n affects it. Since changing the base only changes a constant factor, the complexity is written simply as O(logn).

Similarly, if two time complexities differ only by a constant factor, they can be regarded as the same order of time complexity.
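
As a quick sanity check (a sketch added here, not in the original post), counting the iterations of the loop above for n = 1024 confirms that it runs log₂n times:

```java
public class LogLoopCount {
    public static void main(String[] args) {
        int n = 1024;  // a power of two, so the count comes out exact
        int count = 0;
        int i = 1;
        while (i < n) {   // same loop as in the example above
            i = i * 2;
            count++;
        }
        System.out.println(count);  // prints 10, and 2^10 = 1024
    }
}
```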

4. Linear logarithmic order O(nlogN)

for(m = 1; m < n; m++) {
    i = 1;
    while(i < n) {
        i = i * 2;
    }
}

The linear logarithmic order O(nlogN) is easy to understand: if code with time complexity O(logN) is executed n times in a loop, the time complexity is n * O(logN), which is O(nlogN).

5. Square order O(n²)

for(x = 1; x <= n; x++) {
    for(i = 1; i <= n; i++) {
        j = i;   // executes n * n times
    }
}

If the O(n) code is nested and looped again, its time complexity is O(n²).

6. Cubic order O(n³) and k-th order O(n^k)

These can be understood by analogy with O(n²) above: O(n³) is equivalent to three nested loops over n, and the higher orders are similar.
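
For illustration (this snippet is an addition, not from the original), a minimal O(n³) example is just three nested loops over n, so the innermost statement runs n · n · n times:

```java
public class CubicDemo {
    public static void main(String[] args) {
        int n = 10;
        long ops = 0;
        for (int x = 1; x <= n; x++) {
            for (int y = 1; y <= n; y++) {
                for (int z = 1; z <= n; z++) {
                    ops++;  // runs n * n * n times in total
                }
            }
        }
        System.out.println(ops);  // prints 1000 for n = 10
    }
}
```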

Time complexity analysis
T(n) is usually used to represent the code execution time, n is the data size, and f(n) is the total number of statement executions.

In the O(n) example, the statements execute 2n times in total (the loop control and the loop body each run n times). If each execution takes time t, the total execution time is 2nt, so f(n) = 2n.
In the O(n²) example, the statements execute 2n² + n times in total; with execution time t per statement, the total is (2n² + n)t, so f(n) = 2n² + n.

The execution time T(n) of the code is proportional to the total number of statement executions f(n). People summarize this rule as the formula T(n) = O(f(n)). Big O time complexity does not indicate the actual execution time of the code; it indicates the trend of the execution time as the data size grows. Therefore it is also called asymptotic time complexity, or time complexity for short.

As n grows larger and larger, the low-order terms, constants, and coefficients in the formula no longer affect the growth trend, so they can be ignored and only the highest-order term is kept. The time complexities of the two examples are therefore written as T(n) = O(n) and T(n) = O(n²).
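
To see concretely why the low-order term and the coefficient can be dropped, this added sketch compares f(n) = 2n² + n against n². The ratio settles toward the constant 2, and big O ignores constant factors:

```java
public class DropLowOrder {
    public static void main(String[] args) {
        // f(n) = 2n^2 + n, taken from the O(n^2) example above
        for (int n : new int[]{10, 1_000, 100_000}) {
            double ratio = (2.0 * n * n + n) / ((double) n * n);
            System.out.println("n = " + n + "  f(n)/n^2 = " + ratio);
        }
    }
}
```

The n term contributes about 5% of the total at n = 10 but well under 0.001% at n = 100,000, so only the n² term determines the growth trend.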

Space complexity

1. Space O(1)
If the temporary space required by an algorithm does not change with the size of the input n, its space complexity is a constant, written O(1).

int i = 1;
int j = 2;
++i;
j++;
int m = i + j;

2. Space O(n)

int[] arr = new int[n];

The allocated space grows linearly with n, so the space complexity is O(n).

Stability

Suppose the sequence to be sorted contains multiple records with equal keys. If the relative order of these records is unchanged after sorting (that is, when r[i] = r[j] and r[i] comes before r[j] in the original sequence, r[i] still comes before r[j] in the sorted sequence), the sorting algorithm is said to be stable; otherwise it is unstable.

Note that whether a sorting algorithm is stable is determined by the specific implementation. An unstable algorithm can become stable under certain conditions, and a stable algorithm can likewise become unstable.

For example, in the bubble sort swap below, changing the condition to if (arr[j] >= arr[j + 1]) turns the sort from stable to unstable.

if (arr[j] > arr[j + 1]) {
    int tmp = arr[j];
    arr[j] = arr[j + 1];
    arr[j + 1] = tmp;
}
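Putting this together, here is a complete bubble sort (an added sketch for illustration). With the strict > comparison it is stable: equal neighbors are never swapped, so records with equal keys keep their original relative order.

```java
import java.util.Arrays;

public class StableBubbleSort {
    static void bubbleSort(int[] arr) {
        for (int i = 0; i < arr.length - 1; i++) {
            for (int j = 0; j < arr.length - 1 - i; j++) {
                // strict > keeps the sort stable; >= would swap equal keys
                if (arr[j] > arr[j + 1]) {
                    int tmp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = tmp;
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] arr = {5, 2, 5, 1, 3};
        bubbleSort(arr);
        System.out.println(Arrays.toString(arr));  // prints [1, 2, 3, 5, 5]
    }
}
```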

Origin blog.csdn.net/luo_boke/article/details/106902703