The beauty of data structures and algorithms (time complexity analysis)

Contents


1. What is complexity analysis?

2. Why do complexity analysis?

3. How to analyze time complexity

4. Commonly used complexity levels

5. How to master complexity analysis

6. Four concepts of complexity



Foreword:

Data structures and algorithms exist to solve the problems of "fast" and "economical": how to make code run faster, and how to make it use less storage space. Execution efficiency is therefore a very important metric for evaluating an algorithm.

Today's content revolves around time complexity and space complexity (there is little to say about space complexity: just look at the extra space the code allocates).

1. What is complexity analysis?


1. Data structures and algorithms solve the problem of "how to make computers handle problems faster and with less space".
2. Therefore, the performance of data structures and algorithms is evaluated along two dimensions: execution time and memory footprint.
3. The concepts of time complexity and space complexity describe these two performance aspects; together they are referred to as complexity.
4. Complexity describes the growth relationship between an algorithm's execution time (or memory usage) and the size of its input data.
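
For example, as a rough illustration: if every statement takes one unit of time and a piece of code executes 2n + 2 statements for an input of size n, its running time grows linearly with n; we keep only the order of growth and write it as O(n).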


2. Why do complexity analysis?


1. Compared with performance testing, complexity analysis is independent of the execution environment, low in cost, efficient, easy to perform, and highly instructive.
2. Mastering complexity analysis lets you write better-performing code, which helps reduce system development and maintenance costs.

Big O notation (look it up yourself if you're curious): https://baike.baidu.com/item/%E5%A4%A7O%E8%A1%A8%E7%A4%BA%E6%B3%95/1851162

What it does:

To put it bluntly, it gives you a rough idea of the running time or space an algorithm (a piece of code) requires; only the order of magnitude matters.

3. How to analyze time complexity

1) Single piece of code: look at the highest-frequency part

Look at the lines of code executed most often (e.g., the body of a while or for loop) [focus on the piece of code with the most loop iterations], as in the sketch below.
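
A minimal sketch (the method name sumAll is made up for illustration): the two assignments before the loop each run once, but the loop body runs n times, so the loop determines the complexity, O(n).

 void sumAll(int n) {
   int i = 0;            // runs once: O(1)
   int sum = 0;          // runs once: O(1)
   for (; i < n; ++i) {
     sum = sum + i;      // highest frequency: runs n times, so O(n) overall
   }
 }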

2) Multiple pieces of code: take the largest

A piece of code may contain several loops, single or nested; the overall complexity is determined by whichever piece is largest (for example, nested loops dominate a single loop). A sketch follows below.

The difference between this rule and rule 4 below is that here all loops depend on the same variable, whereas in rule 4 the loops depend on different variables, which is why the addition rule applies there.
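
A minimal sketch of the "take the largest" rule (the method name demo is hypothetical): two sequential pieces of code over the same variable n, one O(n) and one O(n^2); the larger term wins, so the whole method is O(n^2).

 void demo(int n) {
   int sum1 = 0;
   for (int i = 0; i < n; ++i) {     // O(n)
     sum1 = sum1 + i;
   }

   int sum2 = 0;
   for (int i = 0; i < n; ++i) {     // O(n^2): nested loops
     for (int j = 0; j < n; ++j) {
       sum2 = sum2 + i * j;
     }
   }
   // O(n) + O(n^2) -> take the largest: O(n^2)
 }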

3) Nested code: take the product (multiplication rule)

void text(int n, int m) {
  for (int i = 0; i < n; ++i) {
    for (int j = 0; j < m; ++j) {
      // code to be executed
    }
  }
}

The time complexity of this code is O(n*m): the outer loop runs n times, and for each outer iteration the inner loop runs m times.

4) Multiple independent scales: take the sum (addition rule)

void text(int m, int n) {
  int sum1 = 0;
  for (int i = 1; i < m; ++i) {   // O(m)
    sum1 = sum1 + i;
  }

  int sum2 = 0;
  for (int j = 1; j < n; ++j) {   // O(n)
    sum2 = sum2 + j;
  }
}

The method has two parameters controlling the iteration counts of two independent loops, so their complexities are added: O(m+n).

4. Commonly used complexity levels

The common complexity classes above can be divided into two categories: polynomial and non-polynomial.

 Polynomial magnitude:

O(1) (constant order), O(logn) (logarithmic order), O(n) (linear order), O(nlogn) (linear logarithmic order), O(n^2) (square order), O(n^3) (cubic order)

Non-polynomial magnitude:

O(2^n) (exponential order), O(n!) (factorial order)

It is worth noting:

Polynomial order: as the data size grows, the algorithm's execution time and space usage grow in proportion to a polynomial of the size.

Non-polynomial order: as the data size grows, the algorithm's execution time and space usage grow explosively; the performance of such algorithms is extremely poor.
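
Two small sketches to make the contrast concrete (both method names are made up): a loop whose counter doubles each round runs about log2(n) times, a polynomial magnitude; naive recursive Fibonacci spawns two calls per call, a non-polynomial O(2^n) that becomes unusable even for modest n.

 // Polynomial: O(logn) -- i doubles each round, so the loop runs about log2(n) times
 int logSteps(int n) {
   int steps = 0;
   for (int i = 1; i < n; i = i * 2) {
     ++steps;
   }
   return steps;
 }

 // Non-polynomial: O(2^n) -- each call spawns two more calls
 long fib(int n) {
   if (n <= 1) return n;
   return fib(n - 1) + fib(n - 2);
 }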

5. How to master complexity analysis

No secret here, honestly: practice more and it will become second nature.

6. Four concepts of complexity

1) Best case time complexity

The time complexity of the code executing in the best-case scenario.
 

2) Worst case time complexity

The worst-case time complexity of the code.

3) Average case time complexity

The weighted average of the code's execution count over all possible cases, weighted by the probability of each case.

4) Amortized case time complexity

When most executions of a piece of code have low complexity and only individual cases have high complexity, and the high- and low-complexity cases occur in a coherent timing sequence, the cost of the individual high-complexity cases can be amortized over the low-complexity ones. The amortized result is generally equal to the low-level complexity.

 // array is an array of length n;
 // array.length in the code equals n
 int[] array = new int[n];
 int count = 0;

 void insert(int val) {
    if (count == array.length) {
       // array is full: sum all elements, collapse them into
       // array[0], and reset count -- this branch costs O(n)
       int sum = 0;
       for (int i = 0; i < array.length; ++i) {
          sum = sum + array[i];
       }
       array[0] = sum;
       count = 1;
    }

    // normal path: a single assignment, O(1)
    array[count] = val;
    ++count;
 }

Example:

Each O(n) insert operation is followed by n-1 O(1) insert operations, so the cost of the expensive operation can be spread evenly over the subsequent n-1 cheap ones. Amortized in this way, the time complexity of the whole sequence of operations is O(1).

Having introduced these four concepts, you might ask why they are needed at all.

It is because the same piece of code can have complexities of different orders of magnitude under different inputs; these four concepts let us describe its time complexity more precisely.

What matters most is how to analyze each:

Best and worst cases are straightforward and need no further discussion.

Average case time complexity:

When the complexity of the code differs by orders of magnitude across different inputs, represent it by the weighted average of the execution counts over all possible cases, as in the classic example below.
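
The classic illustration is searching an unsorted array (a sketch; the method name find is made up): in the best case the target is at position 0, O(1); in the worst case it is absent, O(n). For the average, assume the target is equally likely to be at any of the n positions or absent, giving n + 1 cases each with probability 1/(n+1); the weighted average number of comparisons is (1 + 2 + ... + n + n) / (n + 1), which is still O(n).

 int find(int[] array, int val) {
   for (int i = 0; i < array.length; ++i) {
     if (array[i] == val) {
       return i;   // best case: found at i == 0, O(1)
     }
   }
   return -1;      // worst case: n comparisons, O(n)
 }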

Amortized case time complexity:

In a sequence of consecutive operations on a data structure, the time complexity is low in most cases and high only in a few, and the operations follow one another in a coherent timing relationship. In that situation we can analyze the whole group of operations together and see whether the cost of the occasional high-complexity operation can be amortized over the many low-complexity ones. Moreover, wherever amortized analysis applies, the amortized time complexity generally equals the best-case time complexity.

Source: blog.csdn.net/qq_54729417/article/details/122844630