[Data structure] Time and space complexity

We are about to begin the study of data structures. Let us first understand time complexity and space complexity, which are the measures used to judge whether an algorithm is good or bad.


How do we measure whether an algorithm is good or bad?

By looking at its efficiency.

Algorithm efficiency

Algorithm efficiency analysis comes in two kinds: time efficiency and space efficiency. Time efficiency is called time complexity, while space efficiency is called space complexity. Time complexity measures how fast an algorithm runs, while space complexity measures how much extra space an algorithm needs. In the early days of computing, computers had very little storage, so space complexity mattered a great deal. After the rapid development of the computer industry, however, storage capacity has become very large, so we usually no longer need to pay special attention to the space complexity of an algorithm.


Time complexity

Definition of time complexity: in computer science, the time complexity of an algorithm is a function that quantitatively describes the running time of that algorithm. The exact time an algorithm takes cannot be calculated theoretically; you only know it once you run the program on a machine. But do we really need to test every algorithm on a computer? We could, but it would be very troublesome, which is why time complexity analysis was introduced. The time an algorithm takes is proportional to the number of times its statements are executed, so the number of executions of the basic operations in an algorithm is taken as its time complexity.


Big O asymptotic notation

In fact, when we calculate time complexity, we do not have to work out the exact number of executions, only the approximate order of magnitude, so we use Big O asymptotic notation.

Number of basic operations performed by Func:
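The code for Func is not reproduced in this extract; a minimal C sketch whose basic-operation count matches the formula below (an assumption, not necessarily the post's original code) is:

// Counts basic operations: N * N from the nested loops,
// 2 * N from the second loop, and 10 from the while loop.
int Func(int N)
{
    int count = 0;
    for (int i = 0; i < N; ++i)
        for (int j = 0; j < N; ++j)
            ++count;              // executes N * N times
    for (int k = 0; k < 2 * N; ++k)
        ++count;                  // executes 2 * N times
    int M = 10;
    while (M--)
        ++count;                  // executes 10 times
    return count;
}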

F(N)=N^2+2N+10

But what happens when N becomes very large?

N=10  F(N) = 130

N=100  F(N) =10210

N=1000 F(N)=1002010 

As N grows, the N^2 term dominates the value of F(N), and the lower-order term and constant matter less and less. This gives us the following rules:

1. Replace all additive constants in the running time with the constant 1: F(N) = N^2 + 2N + 1
2. In the modified running-time function, keep only the highest-order term: F(N) = N^2
3. If the highest-order term exists and its coefficient is not 1, drop the coefficient. The result is the Big O order.

For example, if F(N) = 3N^2, the coefficient 3 is dropped, leaving F(N) = N^2,

so the Big O order is O(N^2).

In addition, the time complexity of some algorithms has best, average and worst cases:
Worst case: the maximum number of operations for any input of size N (an upper bound)
Average case: the expected number of operations for any input of size N
Best case: the minimum number of operations for any input of size N (a lower bound)

For example, searching for a value x in an array of length N:

Best case: found after 1 comparison
Worst case: found after N comparisons
Average case: found after about N/2 comparisons

In practice we generally focus on the worst case, so the time complexity of searching an array for a value is O(N).
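For instance, such a linear search in C might look like the following sketch (the array layout and names are illustrative):

// Returns the index of x in arr[0..n-1], or -1 if x is not present.
int LinearSearch(const int* arr, int n, int x)
{
    for (int i = 0; i < n; ++i)
    {
        if (arr[i] == x)
            return i;  // best case: found on the first comparison
    }
    return -1;         // worst case: all n elements were examined
}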


Common time complexity calculation examples 
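The code for the first example is not shown in this extract; a C sketch with the operation count given below (an assumption) is:

// Basic operations: 2N + 10
int Func1(int N)
{
    int count = 0;
    for (int k = 0; k < 2 * N; ++k)
        ++count;       // executes 2 * N times
    int M = 10;
    while (M--)
        ++count;       // executes 10 times
    return count;
}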

 

F(N) = 2N + 10

Keeping only the highest-order term and dropping its coefficient gives F(N) = N,

so the time complexity is O(N).
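The next example presumably runs one loop M times and another N times; a minimal C sketch under that assumption:

// Basic operations: M + N
int Func2(int N, int M)
{
    int count = 0;
    for (int k = 0; k < M; ++k)
        ++count;       // executes M times
    for (int k = 0; k < N; ++k)
        ++count;       // executes N times
    return count;
}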

 

Here both M and N are unknown quantities,

and F(N) = M + N,

so the time complexity is O(M + N).
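The third example presumably runs a loop a fixed number of times regardless of N; a sketch under that assumption:

// Basic operations: 100, independent of N
int Func3(int N)
{
    int count = 0;
    for (int k = 0; k < 100; ++k)
        ++count;       // always executes 100 times
    return count;
}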

F(N) = 100

Replacing the additive constant with 1 gives F(N) = 1,

so the time complexity is O(1).

Some algorithms, like the search discussed above, have a best case and a worst case; generally we only consider the worst case.

 

The time complexity of binary search is O(logN) 
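A minimal C sketch of binary search on a sorted array (illustrative, not necessarily the post's original code):

// Binary search in a sorted array; returns the index of x, or -1.
int BinarySearch(const int* arr, int n, int x)
{
    int left = 0;
    int right = n - 1;
    while (left <= right)
    {
        int mid = left + (right - left) / 2;  // avoids overflow of left + right
        if (arr[mid] < x)
            left = mid + 1;
        else if (arr[mid] > x)
            right = mid - 1;
        else
            return mid;
    }
    return -1;
}

Each iteration halves the remaining search interval, so at most about log2(N) iterations are needed, which is where O(logN) comes from.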

 

In general, the time complexity of a recursive algorithm is the number of recursive calls multiplied by the work done in each call.
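For example, a recursive factorial (a sketch) makes N recursive calls and does a constant amount of work in each, so its time complexity is O(N):

// N recursive calls, O(1) work per call -> O(N)
long long Factorial(int N)
{
    if (N <= 1)
        return 1;
    return N * Factorial(N - 1);
}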

 

To analyze this naive recursive Fibonacci, it helps to draw the call tree: every call spawns two further calls, so the number of calls roughly doubles at each level.
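A sketch of the naive recursive Fibonacci in question:

// Every call (for N >= 3) spawns two further calls, so the call tree
// roughly doubles at each level -- on the order of 2^N calls in total.
long long Fib(int N)
{
    if (N < 3)
        return 1;
    return Fib(N - 1) + Fib(N - 2);
}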

So Big O is O(2^N) 

Space complexity

Space complexity is a measure of the amount of storage space an algorithm temporarily occupies while it runs. It is not measured in bytes occupied by the program, because that is not very meaningful; instead, space complexity is counted in terms of the number of extra variables requested. Its calculation rules are basically the same as those for time complexity, and it also uses Big O asymptotic notation.
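The first example here is presumably an in-place routine; a bubble sort sketch, assuming that is the kind of code the original showed:

// In-place bubble sort: besides the input array, only a constant
// number of extra variables (i, j, tmp) are used.
void BubbleSort(int* arr, int n)
{
    for (int i = 0; i < n - 1; ++i)
    {
        for (int j = 0; j < n - 1 - i; ++j)
        {
            if (arr[j] > arr[j + 1])
            {
                int tmp = arr[j];
                arr[j] = arr[j + 1];
                arr[j + 1] = tmp;
            }
        }
    }
}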

 

The space complexity is O(1),

because no additional array is requested; only the original array and a constant number of extra variables are used.
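The next example presumably allocates an array whose size grows with N; a sketch, assuming an iterative Fibonacci that stores every value up to the n-th:

#include <stdlib.h>

// Returns a malloc'ed array holding fib[0] .. fib[n]; the caller frees it.
// An extra array of n + 1 elements is allocated.
long long* FibonacciTable(int n)
{
    long long* fib = (long long*)malloc((n + 1) * sizeof(long long));
    if (fib == NULL)
        return NULL;
    fib[0] = 0;
    if (n > 0)
        fib[1] = 1;
    for (int i = 2; i <= n; ++i)
        fib[i] = fib[i - 1] + fib[i - 2];
    return fib;
}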

The space complexity here is O(N),

because an additional array whose size grows with N is requested.

 

The recursive function is called N times, opening N stack frames, and each stack frame uses a constant amount of space, so the space complexity is O(N).


Origin blog.csdn.net/chaodddddd/article/details/134606623