Time Complexity and Space Complexity of Basic Data Structures

1. Algorithm efficiency

1.1 How to measure the quality of an algorithm

How do we measure the quality of an algorithm? Take the following Fibonacci implementation as an example:
long long Fib(int n)
{
    if (n < 3)
    {
        return 1;
    }
    else
    {
        return Fib(n - 1) + Fib(n - 2);
    }
}

The recursive implementation of the Fibonacci sequence is very concise, but is concise necessarily good? And how do we measure how good or bad it is? Next we introduce the standard of measurement.

1.2 Algorithm complexity

After an algorithm is compiled into an executable program, it consumes time resources and space (memory) resources when it runs. Therefore, the quality of an algorithm is generally measured along two dimensions, time and space, known as time complexity and space complexity.

Time complexity mainly measures how fast an algorithm runs, while space complexity mainly measures the extra space an algorithm needs while running.
(In the early days of computing, storage capacity was very small, so people cared a great deal about space complexity. With the rapid development of the computer industry, storage capacities have become very large, so we usually no longer pay special attention to the space complexity of an algorithm.)

2. Time complexity

2.1 The concept of time complexity

In computer science, the time complexity of an algorithm is a function (a function in the mathematical sense): the number of times the algorithm's basic operations execute is the time complexity of the algorithm.

That is, finding the mathematical expression relating a basic statement's execution count to the problem size N is how we calculate the time complexity of an algorithm.
Let's take an example.

// How many times does ++count execute in Func1?
void Func1(int N)
{
    int count = 0;
    for (int i = 0; i < N; ++i)
    {
        for (int j = 0; j < N; ++j)
        {
            ++count;
        }
    }

    for (int k = 0; k < 2 * N; ++k)
    {
        ++count;
    }

    int M = 10;
    while (M--)
    {
        ++count;
    }
    printf("%d\n", count);
}

Clearly, the nested for loops run N*N times, the second for loop runs 2*N times, and the while loop runs 10 times, so the basic operation count of Func1 can be expressed as F(N) = N² + 2*N + 10.
From this expression we can see that when N is very large, F(N) = N² + 2*N + 10 is very close to N². In practice, when we calculate time complexity, we do not actually need the exact number of executions, only the approximate order of magnitude, so we use big-O asymptotic notation.

2.2 Asymptotic representation of big O

Big O notation is a mathematical notation used to describe the asymptotic behavior of a function.
To derive the big-O order:
1. Replace all additive constants in the running-time function with the constant 1.
2. In the modified function, keep only the highest-order term.
3. If the highest-order term exists and its coefficient is not 1, remove that coefficient. The result is the big-O order.

Applying big-O asymptotic notation to the example above, the time complexity of Func1 is O(N²).

As shown above, big-O asymptotic notation removes the terms that have little influence on the result and expresses the execution count concisely. In addition, the time complexity of some algorithms has best, average, and worst cases:
Best case: minimum number of operations for any input of size N (lower bound)
Average case: expected number of operations for any input of size N
Worst case: maximum number of operations for any input of size N (upper bound)
For example, when searching for a value in an array of N elements:
Best case: found in 1 step
Worst case: found in N steps
Average case: found in N/2 steps
In practice we generally focus on the worst case of an algorithm, so the time complexity of searching for data in an array is O(N).

2.3 Examples of common time complexity calculations

In this part, we briefly give a few examples to help you get more comfortable with big-O asymptotic notation.

Example 1:

// Compute the time complexity of Func2
void Func2(int N)
{
    int count = 0;
    for (int k = 0; k < 100; ++k)
    {
        ++count;
    }
    printf("%d\n", count);
}

The for loop executes exactly 100 times regardless of N. By rule 1 of the big-O derivation, the time complexity is O(1).
Example 2

// Compute the time complexity of BubbleSort
void BubbleSort(int* a, int n)
{
    assert(a);
    for (size_t end = n; end > 0; --end)
    {
        int exchange = 0;
        for (size_t i = 1; i < end; ++i)
        {
            if (a[i - 1] > a[i])
            {
                Swap(&a[i - 1], &a[i]);
                exchange = 1;
            }
        }
        if (exchange == 0)
            break;
    }
}

Following the idea of bubble sort (readers unfamiliar with it can look it up; we will not go into details here), the worst-case number of comparisons is (N-1) + (N-2) + ... + 2 + 1. By the arithmetic-series sum formula this equals N*(N-1)/2, and by rules 2 and 3 of the big-O derivation, the time complexity is O(N²).
Example 3

// Compute the time complexity of BinarySearch
int BinarySearch(int* a, int n, int x)
{
    assert(a);
    int begin = 0;
    int end = n - 1;
    // [begin, end] is closed on both sides, hence the <= below
    while (begin <= end)
    {
        int mid = begin + ((end - begin) >> 1);
        if (a[mid] < x)
            begin = mid + 1;
        else if (a[mid] > x)
            end = mid - 1;
        else
            return mid;
    }
    return -1;
}

The code above implements binary search. Each iteration halves the search interval, so in the worst case:

N/2/2/2.../2 = 1

Suppose the search takes x halvings; then N = 2^x. Taking the logarithm of both sides gives x = log₂N.

We write this as O(logN). Note: in algorithm analysis, logN conventionally means the base-2 logarithm of N; in some places it is written lgN.

3. Space complexity

Space complexity is also a mathematical function: it measures the amount of storage space an algorithm temporarily occupies while running.
Space complexity is not the number of bytes the program occupies, since that is not very meaningful; instead it is calculated from the number of variables. Its calculation rules are basically the same as those for time complexity, and big-O asymptotic notation is used as well.
Note: the stack space a function needs at run time (parameters, local variables, some register information, etc.) is determined at compile time, so space complexity is mainly determined by the extra space the function explicitly requests at run time.
Below we give a few more examples to aid understanding.
Example 1

// Compute the space complexity of BubbleSort
void BubbleSort(int* a, int n)
{
    assert(a);
    for (size_t end = n; end > 0; --end)
    {
        int exchange = 0;
        for (size_t i = 1; i < end; ++i)
        {
            if (a[i - 1] > a[i])
            {
                Swap(&a[i - 1], &a[i]);
                exchange = 1;
            }
        }
        if (exchange == 0)
            break;
    }
}

Only a constant amount of extra space is used here (the variables end, exchange, and i), so the space complexity is O(1).
Example 2

// Compute the space complexity of the recursive factorial Fac
long long Fac(size_t N)
{
    if (N == 0)
        return 1;
    return Fac(N - 1) * N;
}

In Example 2, Fac calls itself recursively N times, opening N stack frames, and each stack frame uses a constant amount of space, so the space complexity is O(N).

4. Common complexity comparison

(The original article shows a comparison chart here.) Common complexities ordered from slowest-growing to fastest-growing: O(1) < O(logN) < O(N) < O(NlogN) < O(N²) < O(2^N) < O(N!).


Origin blog.csdn.net/weixin_69423932/article/details/127477698