Time and space complexity in detail

Data structures often require us to design an algorithm for a specific scenario, and it is that algorithm that actually solves the problem. The core requirement of any algorithm is correctness: if an algorithm is not correct, then no matter how simple or fast it is, it is useless.

Another point that plays a key role is what we often call time and space complexity. The time complexity of an algorithm describes how its running time grows with the input size, and to a large extent it reflects how good the algorithm is. Mastering basic time complexity analysis is therefore essential. What we use in data structures is the upper bound, written O(); there are also lower bounds and other notations, which we will not cover here.
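
For reference, the standard formal definition of this upper bound (not spelled out in the original text) is: f(n) = O(g(n)) if there exist constants c > 0 and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀. In other words, past some point, f(n) never exceeds a constant multiple of g(n).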

A full analysis of an algorithm's cost could consider many things: communication time, instruction execution time, disk access time, and so on. But in data structures we only count the number of times the core statement is executed. We let n go to infinity, find the order of growth of that count, and that order is the complexity. The key step is to derive a polynomial in the single variable n.

Suppose an algorithm's input size is n and the number of executions required is 100n² + 0.02n + 10000000000000 (as n approaches infinity).

In fact, we only look at the leading term, 100n². Compared with n², the other terms are of lower order as n goes to infinity, so they can be ignored.

To analyze the complexity, just take the highest-order term and drop its constant factor.

So its time complexity is O(n²).
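
To see why the lower-order terms can be ignored, divide the whole count by n² (a quick check, not part of the original): (100n² + 0.02n + 10000000000000) / n² = 100 + 0.02/n + 10000000000000/n². The last two terms vanish as n grows, so the ratio settles at the constant 100. The count grows at the same rate as n², which is exactly what O(n²) says.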

 

100n, 0.000001n, 3n + 20000, 10000000000000n, 0.01n + 1000000000000000000000000000000

are all written as complexity O(n).

1, 3000, 199990, 10000000

are all written as complexity O(1), since they execute a constant number of times regardless of the input size.
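
As a small illustration in the same style as the loops below (my own sketch, not from the original):

x = n + 5;
sum = x * 2;

Each statement executes exactly once no matter how large n is, so the complexity is O(1).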

 

for (i = 1; i <= n; i++) {
    sum++;
}

The time complexity is O(n).

 

for (i = 1; i <= n; i++) {
    for (j = 1; j <= n; j++) {
        sum++;
    }
}

The time complexity is O(n²).

 

If the above two small programs are run one after the other, the total is still O(n²), because O(n) + O(n²) = O(n²): the n term is dominated by the n² term.
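
A sketch of the two pieces combined (my own assembly of the loops above):

for (i = 1; i <= n; i++) {
    sum++;
}
for (i = 1; i <= n; i++) {
    for (j = 1; j <= n; j++) {
        sum++;
    }
}

The total count of sum++ is n + n²; keeping only the highest-order term gives O(n²).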

 

Space complexity is analyzed in the same way, counting the extra memory an algorithm uses as a function of the input size n.

In many scenarios we find that an algorithm either needs more time or more space; most of the time we cannot have both.
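
A minimal sketch of this tradeoff in C (my own example; the function names and the assumption that every value lies in 0..n-1 are mine, not from the original). Checking an array for a duplicate value can be done in O(n²) time with O(1) extra space, or in O(n) time by spending O(n) extra space:

#include <stdlib.h>

/* O(n²) time, O(1) extra space: compare every pair. */
int has_dup_slow(const int a[], int n) {
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++)
            if (a[i] == a[j])
                return 1;
    return 0;
}

/* O(n) time, O(n) extra space: mark each value as seen.
   Assumes (my assumption) every value lies in 0..n-1. */
int has_dup_fast(const int a[], int n) {
    int *seen = calloc(n, sizeof(int));
    if (!seen) return -1;              /* allocation failed */
    for (int i = 0; i < n; i++) {
        if (seen[a[i]]) { free(seen); return 1; }
        seen[a[i]] = 1;
    }
    free(seen);
    return 0;
}

The second version trades memory for speed, which, as the next paragraph notes, is the more common direction in practice.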

Given how computers have developed, the vast majority of software has fairly high demands on response time. Take a match-3 game, the kind where you swap small animals: if the matching animation only plays two seconds after you make a swap, then even though the phone stays cool and a little less memory is used, at that speed users will uninstall the game without so much as a goodbye to the developers. So most of the time we sacrifice space for time. Trading time for space is used in the smaller set of situations with low memory and loose real-time requirements, or even when memory is extremely scarce; external sorting is one such case.

When comparing the pros and cons of two algorithms, many people like to plug in a concrete input and count exactly how many times the core statement executes, and only succeed in making themselves dizzy: we evaluate algorithmic complexity as the input size goes to infinity. That said, when the data size is known and fixed, we can count the actual operations and pick the best algorithm for that specific case; the best algorithm for 100 million records may well not be the best for 100. By the same token, the same algorithm performs differently on different systems; in a distributed system, for example, quicksort's performance is somewhat less satisfactory.

 

 
