[Data structure] Detailed explanation of time complexity and space complexity

Table of contents

1 Time complexity

1.1 The concept of time complexity

1.2 Big O asymptotic notation

1.3 Common time complexity calculation examples

2 Space complexity

2.1 The concept of space complexity

2.2 Common space complexity calculation examples


1 Time complexity

1.1 The concept of time complexity

A common misunderstanding about time complexity is that it describes how long an algorithm takes to run. In practice, however, the exact time an algorithm spends executing cannot be computed in advance, because it depends on environmental factors such as hardware configuration and network conditions. The only way to know the exact time is to actually run the program on a machine.

But measuring every algorithm this way would be very resource-intensive, so the concept of time complexity was created. While a program runs, we can approximate by assuming that each basic statement takes the same amount of time to execute; the total time an algorithm takes is then proportional to the number of statement executions. We therefore call the number of times the basic operations are executed the time complexity of the algorithm.

Try counting how many times the ++count statement in the following function is executed:

void Func1(int N)
{
    int count = 0;
    for (int i = 0; i < N; ++i)
    {
        for (int j = 0; j < N; ++j)
        {
            ++count;
        }
    }

    for (int k = 0; k < 2 * N; ++k)
    {
        ++count;
    }

    int M = 10;
    while (M--)
    {
        ++count;
    }
}

It is not difficult to conclude that the ++count statement here is executed a total of F(N) = N² + 2N + 10 times.

The number calculated above is the exact execution count of the program. In practical applications, though, we only want to describe the approximate speed of an algorithm, not a very precise execution count, so an approximate count is enough. This is where big O asymptotic notation comes in.


1.2 Big O asymptotic notation

In many books, time complexity is written as O(n) or O(n²). So what exactly is big O?

Introduction to Algorithms explains it this way: big O is used to denote an upper bound. When it is used as an upper bound on the worst-case running time of an algorithm, it bounds the running time for any input.

Take bubble sort as an example: its time complexity is O(n²), but not every run actually performs n² operations. The n² here represents the worst case; in practice, the sort may finish well before n² operations are reached.

The big O order of an algorithm can be derived with the following rules:

  1. Replace all additive constants in the running-time count with the constant 1.
  2. In the modified count function, keep only the highest-order term.
  3. If the highest-order term exists and its coefficient is not 1, remove that coefficient. The result is the big O order.

Applying these rules to the Func1 function from section 1.1, its big O asymptotic notation is O(N²).

These rules show that big O asymptotic notation discards the terms of the original function that have little influence on the result, making the expression more concise.

If you are comfortable with limits in mathematics, you will recognize that this is related to behavior at infinity. Take the execution-count function of Func1:

F(N) = N² + 2N + 10

As N approaches infinity, N² grows much faster than 2N, and the constant term has no effect on the result. So we can say that big O asymptotic notation measures an algorithm by the fastest-growing term of its count function.


1.3 Common time complexity calculation examples

Example 1:

void Func2(int N)
{
    int count = 0;
    for (int k = 0; k < 2 * N; ++k)
    {
        ++count;
    }
    int M = 10;
    while (M--)
    {
        ++count;
    }
    printf("%d\n", count);
}

It is easy to see that the basic operation in Example 1 is performed 2N + 10 times. Applying the big O derivation rules gives a time complexity of O(N).

Example 2:

void Func3(int N, int M)
{
    int count = 0;
    for (int k = 0; k < M; ++k)
    {
        ++count;
    }
    for (int k = 0; k < N; ++k)
    {
        ++count;
    }
    printf("%d\n", count);
}

The basic operation is performed M + N times. Because there are two independent unknowns here, the time complexity is O(N + M).

Example 3:

void Func4(int N)
{
    int count = 0;
    for (int k = 0; k < 100; ++k)
    {
        ++count;
    }
    printf("%d\n", count);
}

The basic operation is performed 100 times, a constant. Applying the big O derivation rules gives a time complexity of O(1).

Example 4:

// Compute the time complexity of binary search
int BinarySearch(int* a, int n, int x)
{
    assert(a);
    int begin = 0;
    int end = n - 1;
    // [begin, end]: both ends of the interval are inclusive, hence the <= comparison
    while (begin <= end)
    {
        int mid = begin + ((end - begin) >> 1);
        if (a[mid] < x)
            begin = mid + 1;
        else if (a[mid] > x)
            end = mid - 1;
        else
            return mid;
    }
    return -1;
}

Assume there are n elements in total and the worst case takes x iterations, i.e., the target is only located (or shown to be absent) after the search range has been halved all the way down. Since each iteration halves the search range, x halvings cover all n elements when:

2^x = n

Solving for x gives:

x = log₂n

The time complexity is therefore O(logN).

(Note: this differs slightly from standard mathematical convention. In algorithm analysis, logN means the base-2 logarithm of N; some texts also write it as lgN.)

Example 5:

// What is the time complexity of the recursive factorial Fac?
long long Fac(size_t N)
{
    if (0 == N)
        return 1;

    return Fac(N - 1) * N;
}

Here the recursion makes roughly N calls (Fac(N) down to Fac(0)), each doing a constant amount of work, so the time complexity is O(N).


2 Space complexity

2.1 The concept of space complexity

Space complexity is also a mathematical measure; it describes the amount of storage space an algorithm temporarily occupies while it runs.

There is a common misunderstanding here as well: space complexity is not the number of bytes the program occupies; counting bytes would be meaningless. Instead, space complexity counts the number of variables the algorithm uses. Like time complexity, space complexity is expressed with big O asymptotic notation.

Note: since the stack space a function needs at runtime (for parameters, local variables, and so on) is determined at compile time, space complexity is mainly determined by the additional space the function requests while running.

2.2 Common space complexity calculation examples

Example 1:

// Helper used by BubbleSort (not shown in the original): exchanges two ints
void Swap(int* p1, int* p2)
{
    int tmp = *p1;
    *p1 = *p2;
    *p2 = tmp;
}

void BubbleSort(int* a, int n)
{
    assert(a);
    for (size_t end = n; end > 0; --end)
    {
        int exchange = 0;
        for (size_t i = 1; i < end; ++i)
        {
            if (a[i - 1] > a[i])
            {
                Swap(&a[i - 1], &a[i]);
                exchange = 1;
            }
        }
        if (exchange == 0)
            break;
    }
}

The algorithm uses only a constant number of extra variables (end, i, exchange) regardless of n, so the space complexity is O(1).

Example 2:

long long* Fibonacci(size_t n)
{
    if (n == 0)
        return NULL;

    long long* fibArray = (long long*)malloc((n + 1) * sizeof(long long));
    fibArray[0] = 0;
    fibArray[1] = 1;
    for (size_t i = 2; i <= n; ++i)
    {
        fibArray[i] = fibArray[i - 1] + fibArray[i - 2];
    }
    return fibArray;
}

The algorithm dynamically allocates n + 1 extra long long slots, so the space complexity is O(N).

Example 3:

long long Fac(size_t N)
{
    if (N == 0)
        return 1;

    return Fac(N - 1) * N;
}

The recursion here goes roughly N calls deep, and each call opens a stack frame that uses a constant amount of space, so the space complexity is O(N).

(End of this article)

Origin blog.csdn.net/fbzhl/article/details/130159333