Data Structures and Algorithms Study Notes 02 - Time and Space Complexity Analysis

We often say that our work never quite touches data structures and algorithms; in fact, we are actively or unconsciously filtering out the opportunities to use them.

1 What is complexity

  1. Data structures and algorithms solve the problem of "how to make a computer solve problems faster and with less space."
  2. Accordingly, we evaluate the performance of data structures and algorithms along two dimensions: execution time and space required.
  3. The concepts of time complexity and space complexity describe these two performance dimensions; together they are referred to as complexity.
  4. Complexity describes the growth relationship between an algorithm's execution time (or space usage) and the size of the data.

2 Why analyze complexity

Why do we need complexity analysis?

  1. Compared with performance testing, complexity analysis does not depend on the execution environment; it is low-cost, efficient, easy to perform, and offers strong guidance.
  2. Mastering complexity analysis enables us to write better-performing code and helps reduce system development and maintenance costs.

3 How to analyze

3.1 Big O complexity notation

3.1.1 Example

Code:

int cal(int n) {
  int sum = 0;
  int i = 1;
  int j = 1;
  for (; i <= n; ++i) {
    j = 1;
    for (; j <= n; ++j) {
      sum = sum + i * j;
    }
  }
  return sum;
}
  1. Assume every line of code takes the same amount of time to execute, unit_time;
  2. Lines 2, 3, and 4 each execute once, requiring 3 * unit_time in total;
  3. Lines 5 and 6 (the outer loop) each execute n times, requiring 2n * unit_time;
  4. Lines 7 and 8 (the inner loop) each execute n * n times, requiring 2n^2 * unit_time;
  5. Therefore, the total execution time of the code is T(n) = (2n^2 + 2n + 3) * unit_time.

It follows that the total execution time T(n) of a piece of code is proportional to the number of times each line of code executes.

3.1.2 Big O formula

The execution time of an algorithm is proportional to the total number of times its lines of code are executed, which we write as T(n) = O(f(n)).

  1. T(n) represents the total execution time of the algorithm;
  2. f(n) represents the total number of times each line of code executes;
  3. n usually indicates the size of the data;
  4. The O in the formula indicates that the execution time T(n) is proportional to f(n).

3.1.3 Time Complexity

Therefore, in the example in 3.1.1, T(n) = O(2n^2 + 2n + 3); this is Big O time complexity notation.

Big O time complexity does not represent the actual, concrete execution time of the code; rather, it represents the trend of the code's execution time as the data size grows. It is therefore also called asymptotic time complexity, or time complexity for short.

When n is large, say 10000 or 100000, the low-order terms, constants, and coefficients in the formula do not affect the growth trend and can be ignored. We only need to record the largest order of magnitude. Expressed in Big O notation, the time complexity of the code above can be written as T(n) = O(n^2); similarly, other code might be T(n) = O(n), and so on.

3.1.4 Complexity analysis rules

  1. Focus on the code that executes most often: for example, loop bodies;

  2. For multiple code segments, take the maximum: if a piece of code contains both a single loop and a double loop, its complexity is that of the double loop;

Note that even if a piece of code loops 10000 or 1000000 times, as long as the count is a known constant, its execution time is still constant-level and, relative to an unbounded n, can be ignored.
Abstract formula: if T1(n) = O(f(n)) and T2(n) = O(g(n)), then T(n) = T1(n) + T2(n) = max(O(f(n)), O(g(n))) = O(max(f(n), g(n))).

  3. For nested code, take the product: for example, recursion and nested loops;

If T1(n) = O(f(n)) and T2(n) = O(g(n)), then T(n) = T1(n) * T2(n) = O(f(n)) * O(g(n)) = O(f(n) * g(n)).
That is, assuming T1(n) = O(n) and T2(n) = O(n^2), then T1(n) * T2(n) = O(n^3).

The following sample code illustrates this rule: cal calls f(i) inside a loop. Treating the loop in cal as O(n) and each call to f as O(n), the product rule gives an overall time complexity of O(n^2).

int f(int n);

int cal(int n) {
  int ret = 0;
  int i = 1;
  for (; i < n; ++i) {
    ret = ret + f(i);
  }
  return ret;
}

int f(int n) {
  int sum = 0;
  int i = 1;
  for (; i < n; ++i) {
    sum = sum + i;
  }
  return sum;
}
  4. For multiple size parameters, add the complexities: for example, if the execution count is controlled by two independent parameters m and n driving two separate loops, the time complexity is the sum of the two, O(m + n).

3.1.5 Common complexity classes

3.1.5.1 Polynomial orders

As the data size grows, the execution time and space occupied by the algorithm grow in proportion to a polynomial.

  1. O(1) (constant order)

First, a point of definition: O(1) is simply the notation for constant-level time complexity; it does not mean that only one line of code is executed.
In general, as long as an algorithm contains no loop or recursive statements, its time complexity is O(1), even if it has thousands of lines of code.

  2. O(logn) (logarithmic order)
  3. O(n) (linear order)
  4. O(nlogn) (linearithmic order)
  5. O(n^2) (quadratic order)
  6. O(n^3) (cubic order)

3.1.5.2 Non-polynomial orders

As the data size grows, the execution time and space occupied by the algorithm jump dramatically; such algorithms perform poorly.

  1. O(2^n) (exponential order)
  2. O(n!) (factorial order)

3.2 Space complexity analysis

We spent a long time discussing Big O notation and time complexity analysis; once you understand that content, space complexity analysis is very simple to learn.

The full name of time complexity is asymptotic time complexity; it represents the growth relationship between an algorithm's execution time and the data size. By analogy, the full name of space complexity is asymptotic space complexity; it represents the growth relationship between an algorithm's storage usage and the data size.

3.2.1 Example analysis

  1. Code Example:
void print(int n) {
  int i = 0;
  int[] a = new int[n];
  for (; i < n; ++i) {
    a[i] = i * i;
  }

  for (i = n - 1; i >= 0; --i) {
    System.out.println(a[i]);
  }
}
  1. As in time complexity analysis, line 2 allocates space for the variable i, but this is constant order and unrelated to the data size n, so we can ignore it;
  2. Line 3 allocates an int array of size n; the rest of the code occupies no additional space, so the space complexity of the whole piece of code is O(n).
  3. The common space complexities are O(1), O(n), and O(n^2); logarithmic orders such as O(logn) and O(nlogn) rarely arise in space complexity;
  4. Moreover, space complexity analysis is much simpler than time complexity analysis, so mastering the content above is enough.

Summary

Complexity, also called asymptotic complexity, includes time complexity and space complexity. It is used to analyze the growth relationship between algorithm efficiency and data size; roughly speaking, the higher the order of complexity, the lower the algorithm's efficiency. There are not many common complexity classes; from low order to high order they are: O(1), O(logn), O(n), O(nlogn), O(n^2), etc. After you finish this column, you will find that the complexities of almost all data structures and algorithms fall within these.


Origin blog.csdn.net/qq_34246646/article/details/87651849