01. Data structures, time complexity and space complexity (basic data structures and algorithms)

Introductory materials and knowledge index


Table of contents


Common information

Algorithm complexity

https://www.bigocheatsheet.com

Dynamic demo sites

https://www.cs.usfca.edu/~galles/visualization/Algorithms.html

https://visualgo.net/zh

Knowledge structure

The 5 major features of an algorithm

Finiteness, definiteness, input, output, feasibility

time complexity:

The computational workload required to execute an algorithm.
In general, the running time of an algorithm is a function f(n) of the problem size n, so the time complexity of the algorithm is written as T(n) = O(f(n)).

As the problem size n grows, the growth rate of the algorithm's execution time is positively correlated with the growth rate of f(n); this is called the asymptotic time complexity (Asymptotic Time Complexity).

Calculation:

  • 1. Count the number of times the basic statement executes; for example, computing
    1+2+3+...+n:
<?php
$sum=0;
for($i=1;$i<=$n;$i++){
    $sum+=$i;
}
?>

The loop body executes n times, so the time complexity is O(n)

  • 2. Replace every additive constant with the constant 1; for example, O(3) is written as O(1)
<?php
function test($n){
    echo $n;
    echo $n;
    echo $n;
}
?>

Three constant-time statements, so O(3) is written as O(1)

  • 3. In the operation-count function, keep only the highest-order term:
    n^2+n+1 is written as O(n^2)
  • 4. If the highest-order term exists and its coefficient is not 1, remove that coefficient:
    2n^2+3n+1 is written as O(n^2)
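As a quick numeric check of rules 3 and 4 (my own illustration, not from the original post): for f(n) = 2n^2 + 3n + 1, the ratio f(n)/n^2 approaches the leading coefficient as n grows, which is why both the lower-order terms and the coefficient can be dropped in O(n^2).

```php
<?php
// For f(n) = 2n^2 + 3n + 1, f(n)/n^2 tends to the leading
// coefficient 2, so f(n) = O(n^2).
foreach ([10, 100, 1000, 10000] as $n) {
    $f = 2 * $n * $n + 3 * $n + 1;
    printf("n=%-6d f(n)/n^2 = %.4f\n", $n, $f / ($n * $n));
}
// prints ratios 2.3100, 2.0301, 2.0030, 2.0003
```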
Constant order: O(1) 
Linear order: O(n) 
Square (cubic) order: O(n^2), O(n^3)
<?php
$sum=0;
for($i=1;$i<=$n;$i++){
    for($j=1;$j<=$n;$j++){
        $sum+=$j;
    }
}
?>

A two-level nested loop is O(n^2); three levels is O(n^3).
Special square order: O(n^2/2 + n/2) -> O(n^2)

for($i=1;$i<=$n;$i++){
    for($j=1;$j<=$n;$j++){
        // runs n^2 times
    }
}

for($i=1;$i<=$n;$i++){
    // runs n times
}
echo $a+$b; // runs once

n^2+n+1 -> O(n^2)

Logarithmic order

$i=1;$n=100;
while($i<$n){
    $i=$i*2;
}

If the loop runs x times, then 2^x = n, so

x = \log_2 n

The base of the logarithm only contributes a constant factor, since \log_2 n = \log_2 10 \cdot \log_{10} n. Likewise, for 3^x = n:

x = \log_3 n = \log_3 10 \cdot \log_{10} n

Because different bases differ only by a constant factor, logarithmic complexity is written simply as O(\log n).
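The iteration count can be verified directly. This small helper (the name doublingSteps is mine, not from the original post) counts how many doublings the loop above performs before reaching n:

```php
<?php
// Counts iterations of the doubling loop: the smallest x with 2^x >= n.
function doublingSteps(int $n): int {
    $i = 1;
    $steps = 0;
    while ($i < $n) {
        $i *= 2;
        $steps++;
    }
    return $steps;
}

echo doublingSteps(100), "\n";   // 7, since 2^7 = 128 >= 100
echo doublingSteps(1024), "\n";  // 10, since 2^10 = 1024
```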

Worst case: the running time in the worst case; it is a guarantee. Unless otherwise stated, the time complexity given is the worst-case time complexity

Average case: expected run time
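A classic illustration of the difference is linear search; this sketch (the function name linearSearch is mine, not from the original post) returns early in the best case and scans everything in the worst case:

```php
<?php
// Linear search: best case O(1) (first element), worst case O(n).
// On average about n/2 comparisons are needed, which is still O(n).
function linearSearch(array $a, int $target): int {
    foreach ($a as $i => $v) {
        if ($v === $target) {
            return $i;  // found early: fewer comparisons
        }
    }
    return -1;          // worst case: all n elements examined
}

echo linearSearch([3, 1, 4, 1, 5], 5), "\n"; // 4 (last position, worst case)
echo linearSearch([3, 1, 4, 1, 5], 9), "\n"; // -1 (absent, also worst case)
```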

Space complexity: the memory space consumed by the algorithm, denoted as S(n)=O(f(n)). It comprises these 3 aspects:

  • The space occupied by the program code
  • The space occupied by the input data
  • The space occupied by auxiliary variables

The calculation and notation are similar to those for time complexity, and are generally expressed using the asymptotic order of the complexity
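To make the auxiliary-space component concrete, here is a hedged PHP sketch (the function names are mine, not from the original post): the first function only ever uses a fixed number of auxiliary variables, while the second stores n values:

```php
<?php
// S(n) = O(1): a fixed number of auxiliary variables, independent of n.
function sumConstantSpace(int $n): int {
    $sum = 0;
    for ($i = 1; $i <= $n; $i++) {
        $sum += $i;
    }
    return $sum;
}

// S(n) = O(n): the auxiliary array grows linearly with n.
function copyRange(int $n): array {
    $a = [];
    for ($i = 0; $i < $n; $i++) {
        $a[] = $i;
    }
    return $a;
}

echo sumConstantSpace(100), "\n"; // 5050
echo count(copyRange(5)), "\n";   // 5
```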

Time complexity analysis

1. Only focus on the code with the most loop executions

Big O notation only expresses a trend of growth.

We usually ignore the constants, lower-order terms, and coefficients in the formula, and only record the magnitude of the largest-order term. So when we analyze the time complexity of an algorithm or a piece of code, we only pay attention to the piece of code with the most loop executions.

Why can a constant C be omitted? Because it scales every increment uniformly without changing the growth trend:

Cn^2 
C(n+1)^2 - Cn^2 = C(2n+1) 
(n+1)^2 - n^2 = 2n+1

Each step of Cn^2 is exactly C times the corresponding step of n^2.
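The identity can be checked numerically; this sketch (mine, not from the original post, with an arbitrary C = 5) confirms that each step of Cn^2 is exactly C times the step of n^2:

```php
<?php
// With C = 5: C(n+1)^2 - Cn^2 === C * ((n+1)^2 - n^2) = C(2n+1),
// so a constant factor rescales increments without changing the trend.
$C = 5;
foreach ([1, 10, 100] as $n) {
    $step  = $C * ($n + 1) ** 2 - $C * $n ** 2;
    $plain = ($n + 1) ** 2 - $n ** 2;
    assert($step === $C * $plain);
    printf("n=%-4d C(2n+1) = %d = %d * %d\n", $n, $step, $C, $plain);
}
```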

2. The rule of addition: the total complexity is equal to the complexity of the code with the largest magnitude
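The addition rule can be sketched in PHP (the function name opsCount is mine, not from the original post): three sequential segments of different magnitudes, where only the largest one determines the order:

```php
<?php
// An O(1) statement, an O(n) loop and an O(n^2) nested loop in
// sequence give O(1) + O(n) + O(n^2) = O(n^2) overall.
function opsCount(int $n): int {
    $count = 1;                       // O(1)
    for ($i = 0; $i < $n; $i++) {     // O(n)
        $count++;
    }
    for ($i = 0; $i < $n; $i++) {     // O(n^2), the largest magnitude
        for ($j = 0; $j < $n; $j++) {
            $count++;
        }
    }
    return $count;                    // n^2 + n + 1 operations in total
}

echo opsCount(10), "\n"; // 111 = 100 + 10 + 1
```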

3. The rule of multiplication: the complexity of the nested code is equal to the product of the complexity of the code inside and outside the nest
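And a sketch of the multiplication rule (the function names inner and outer are mine, not from the original post): an O(n) function called inside an O(n) loop gives O(n) * O(n) = O(n^2):

```php
<?php
// inner() does O(n) work.
function inner(int $n): int {
    $sum = 0;
    for ($j = 0; $j < $n; $j++) {
        $sum++;
    }
    return $sum;
}

// outer() calls inner() n times: O(n) * O(n) = O(n^2).
function outer(int $n): int {
    $total = 0;
    for ($i = 0; $i < $n; $i++) {
        $total += inner($n);
    }
    return $total;
}

echo outer(10), "\n"; // 100 = 10 * 10 basic operations
```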

Space complexity analysis

The full name of time complexity is asymptotic time complexity, which represents the growth relationship between an algorithm's execution time and the scale of the data. By analogy, the full name of space complexity is asymptotic space complexity, which represents the growth relationship between an algorithm's storage space and the scale of the data.

void print(int n) {
  int i = 0;
  int[] a = new int[n];
  for (; i < n; ++i) {
    a[i] = i * i;
  }

  for (i = n - 1; i >= 0; --i) {
    System.out.println(a[i]);
  }
}

As with time complexity analysis, line 2 of the code allocates space for the variable i, but it is of constant order and unrelated to the data size n, so we can ignore it. Line 3 allocates an int array of size n. The rest of the code takes up no additional space, so the space complexity of the whole piece of code is O(n).

Picture taken from: https://time.geekbang.org/column/article/40036


Origin: blog.51cto.com/huangkui/2677732