Still don't understand time complexity and space complexity? One post explains it all

Foreword:

        I believe that in your daily study and problem practice you will inevitably run into two terms: "time complexity" and "space complexity". Many students are at a loss when they hear them, or have only a vague understanding they cannot articulate. So today, let's go from the shallow to the deep and sort out what on earth "time complexity" and "space complexity" are!

1. Why do these concepts exist?

        In the early days of computing, technology was nowhere near as advanced as it is now, and storage in particular was quite limited. A poorly chosen algorithm could sink an entire project, so people back then needed a way to tell a good algorithm from a bad one. The method they devised analyzes an algorithm's efficiency along two axes: time complexity and space complexity. Time complexity mainly measures how fast an algorithm runs, while space complexity mainly measures the extra space an algorithm requires.

        Today, after years of progress in the industry, storage has become cheap and plentiful, so we no longer pay as much attention to the space complexity of an algorithm.

2. Time complexity

  2.1. The concept of time complexity

        Let's first look at its definition: in computer science, the time complexity of an algorithm is a function that quantitatively describes the running time of the algorithm.

        At this point some students may ask: when calculating time complexity, why can't we just hold a stopwatch and time how long the program takes to run? Well, if it's only your own machine, timing each algorithm by hand is not a bad idea, but keep in mind that everyone's hardware performs differently, some better and some worse, and a measurement is also subject to all sorts of external conditions. That is why we use a general, machine-independent method to analyze an algorithm's time complexity.

        The time an algorithm takes is proportional to the number of times its statements execute, so the number of times the algorithm's basic operations execute is what we call its time complexity.

  2.2. Big-O asymptotic notation

        Let's first look at a simple example:
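        (The original post showed this example as a screenshot, which did not survive; below is a minimal Java sketch consistent with the description that follows. The function name func0 and the variable names are assumptions.)

    // Minimal sketch of the first example; func0 and its variable names are assumed.
    void func0(int n) {
        int a = 0;
        for (int i = 0; i < n; i++) {
            a++;  // the basic operation: runs once per iteration, n times in total
        }
    }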

        In the function above there is only one loop statement, and the "a++" inside it executes n times in total (the for loop runs "a++" n - 0 = n times), so the basic-operation count of this function is F(n) = n.

        Now let's look at another example:
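        (Again, the original code was an image; the sketch below is assumed to match the func1 analyzed in the text.)

    // Sketch of func1 as described below; the variable names are assumed.
    void func1(int n) {
        int count = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                count++;      // executes n * n times
            }
        }
        for (int k = 0; k < 2 * n; k++) {
            count++;          // executes 2n times
        }
        int m = 10;
        while (m-- > 0) {
            count++;          // executes 10 times
        }
    }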

        In this function a second for loop is nested inside the first, so the statement "count++" executes n^2 (n squared) times there; the next for loop executes "count++" 2n times; and the final while loop executes "count++" 10 times. The basic-operation count of the whole function is therefore F(n) = n^2 + 2*n + 10.

        So is this F(n) the time complexity of the algorithm? Actually, no: when we calculate time complexity we do not need the precise execution count. Why is that?

        From the two examples above we can see that n is an unknown quantity: it might be a tiny value, or it might be very large. When n grows large enough, reaching 100, 1,000, or even 10,000, the last two terms of F(n) = n^2 + 2*n + 10 barely influence the overall result; at n = 1,000, for instance, n^2 contributes 1,000,000 while 2*n + 10 contributes only 2,010.

        This observation leads us to the rules of big-O asymptotic notation (a worked application follows the list):

1. Replace all additive constants in the run-time function with the constant 1.
2. In the modified run-time function, keep only the highest-order term.
3. If the highest-order term exists and its coefficient is not 1, remove that coefficient. The result is the big-O order.
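        Applying these rules to F(n) = n^2 + 2*n + 10: rule 1 replaces the additive constant 10 with 1, rule 2 keeps only the highest-order term n^2, and since its coefficient is already 1, rule 3 changes nothing.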

        So, after applying big-O asymptotic notation, the time complexity of func1 is O(n^2).

        Next, let's use one more example to see how to calculate the time complexity of recursion:
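        (The original recursive example was also a screenshot; here is a minimal Java sketch of the factorial function described below.)

    // Minimal sketch of the recursive factorial described below.
    long factorial(int N) {
        return N < 2 ? 1 : factorial(N - 1) * N;  // the call chain is N levels deep
    }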

        In the recursive factorial function above, which computes N!, there is no loop statement, but the statement "factorial(N - 1) * N" executes once on every recursive call, and the function calls itself N times in total, so the time complexity of this recursive function is O(N).

3. Space complexity

        Similar to time complexity, space complexity does not measure all of the space an algorithm consumes, but only the temporary extra space it occupies. The calculation rules are the same as for time complexity, using big-O asymptotic notation.

        Take bubble sort as an example:
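        (The original bubble sort code was shown as an image; the sketch below is consistent with the description that follows, including the single "boolean sorted = true" allocation. The parameter name is assumed.)

    // Sketch of the bubble sort described below; only `sorted`, `tmp`, and the loop
    // counters are allocated, so the extra space is constant.
    void bubbleSort(int[] arr) {
        for (int end = arr.length - 1; end > 0; end--) {
            boolean sorted = true;             // the one extra variable the text refers to
            for (int i = 0; i < end; i++) {
                if (arr[i] > arr[i + 1]) {     // swap adjacent out-of-order elements
                    int tmp = arr[i];
                    arr[i] = arr[i + 1];
                    arr[i + 1] = tmp;
                    sorted = false;
                }
            }
            if (sorted) break;                 // no swaps in a full pass: already sorted
        }
    }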

        In this bubble sort, memory is allocated only once, for "boolean sorted = true". The loops that follow allocate no new memory on each pass and only operate on the existing array, so only a constant amount of extra space is used, and the space complexity is O(1).

        So what is the space complexity of recursive functions? Here we also take factorial as an example:
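        (An illustration, not from the original post: the call stack while factorial(5) runs. Each frame stays alive until the call below it returns.)

    factorial(5)
      └─ factorial(4)
           └─ factorial(3)
                └─ factorial(2)
                     └─ factorial(1)   <- 5 frames live at once, i.e. O(N) extra space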

        In recursion, every call to the recursive function pushes a new frame that stores its data, and no frame is released until the deepest call finishes. With n frames alive at once, the space complexity of this recursive function is O(n).

4. Summary

        Time complexity and space complexity are not difficult topics. I believe that as long as you read carefully and practice enough problems, you will fully master them; practice makes perfect!

        Finally, here is a summary table of the time and space complexity of common sorting algorithms; I hope it helps:

Sort method      Time (average)   Time (worst)   Time (best)   Space complexity
Insertion sort   O(n^2)           O(n^2)         O(n)          O(1)
Shell sort       O(n^1.3)         O(n^2)         O(n)          O(1)
Bubble sort      O(n^2)           O(n^2)         O(n)          O(1)
Quick sort       O(n log n)       O(n^2)         O(n log n)    O(log n)
Selection sort   O(n^2)           O(n^2)         O(n^2)        O(1)
Heap sort        O(n log n)       O(n log n)     O(n log n)    O(1)
Merge sort       O(n log n)       O(n log n)     O(n log n)    O(n)
