Time complexity comparison

      We all know that the same problem can often be solved by more than one algorithm. Although the algorithm is not unique, some algorithms suit the problem better than others. Some of you may ask: what is the criterion for telling a good algorithm from a bad one? This should be judged from two aspects: "timeliness" and "storage".

         "Timeliness" here refers to time efficiency, that is, the execution time of the algorithm. Among different algorithms that solve the same problem, the shorter the execution time, the higher the efficiency; the longer the execution time, the lower the efficiency. "Storage" refers to the storage space required while the algorithm executes, mainly the memory occupied by the program at run time.
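To make "timeliness" concrete, here is a small sketch (my own illustration, not from the original post) that times two algorithms for the same problem, summing the integers 1 through n: a loop that runs in O(n), and the closed-form Gauss formula that runs in O(1). The function names and the choice of n are assumptions for the demo.

```python
import timeit

def sum_loop(n):
    # O(n): add each integer from 1 to n one by one
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_formula(n):
    # O(1): Gauss closed-form formula n(n+1)/2
    return n * (n + 1) // 2

n = 100_000
# Both algorithms solve the same problem and agree on the answer...
assert sum_loop(n) == sum_formula(n)

# ...but their execution times differ dramatically.
t_loop = timeit.timeit(lambda: sum_loop(n), number=100)
t_formula = timeit.timeit(lambda: sum_formula(n), number=100)
print(f"loop:    {t_loop:.4f}s")
print(f"formula: {t_formula:.4f}s")
```

Running this, the formula version finishes far faster than the loop, which is exactly what "shorter execution time means higher efficiency" looks like in practice.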

  Today, let's take a look at how time complexities compare. In our later algorithm learning, we will encounter various order-of-magnitude functions. Below I will list several common ones for you:

                                            [Figure: growth curves of the common order-of-magnitude functions]

   In the figure above, we can see that when n is very small it is hard to tell the functions apart, and hard to say which one dominates; but as n increases, the differences become very obvious, and which function is the "boss" is clear at a glance:

  O(1) < O(logn) < O(n) < O(nlogn) < O(n^2) < O(n^3) < O(2^n)
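The ordering above can be checked numerically. The sketch below (my own illustration; the function list mirrors the inequality in the text) evaluates each order-of-magnitude function at a few values of n. At small n the values are close together and even out of order, but by n = 30 they line up exactly as the inequality predicts.

```python
import math

# Growth-rate functions in the order given in the text.
funcs = [
    ("O(1)",     lambda n: 1),
    ("O(logn)",  lambda n: math.log2(n)),
    ("O(n)",     lambda n: n),
    ("O(nlogn)", lambda n: n * math.log2(n)),
    ("O(n^2)",   lambda n: n ** 2),
    ("O(n^3)",   lambda n: n ** 3),
    ("O(2^n)",   lambda n: 2 ** n),
]

for n in (2, 10, 30):
    row = "  ".join(f"{name}={f(n):,.0f}" for name, f in funcs)
    print(f"n={n:>2}: {row}")
```

Note that at n = 2 the values overlap (for example, 2^n gives 4 while n^3 gives 8, and log n equals 1 just like O(1)), which is why the figure is hard to read for small n; by n = 30, 2^n has already exploded past a billion while n^3 is only 27,000.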

Origin blog.csdn.net/yanghezheng/article/details/110834411