Time complexity and space complexity of algorithms (Java)

1. Time frequency

Time frequency: the time an algorithm takes is proportional to the number of times its statements are executed. The more statement executions an algorithm performs, the more time it takes. The number of statement executions in an algorithm is called the statement frequency or time frequency.

For example, compare a statement placed inside a loop that runs n times with the same statement executed on its own. The standalone statement is executed only once, so its execution time is much shorter than that of the loop.
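The original post showed the two snippets as screenshots; here is a minimal Java sketch of the comparison (the variable total and the bound n are illustrative):

```java
public class TimeFrequency {
    public static void main(String[] args) {
        int n = 1000;
        int total = 0;

        // Snippet 1: the statement in the loop body runs n times,
        // so its time frequency is T(n) = n.
        for (int i = 0; i < n; i++) {
            total += i;
        }

        // Snippet 2: this statement runs exactly once,
        // so its time frequency is T(n) = 1.
        total += n;

        System.out.println(total);
    }
}
```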

2. Time complexity

(1) In general, the number of times an algorithm's basic operation statement is repeated is some function of the problem size n, denoted T(n). If there is an auxiliary function f(n) such that, as n approaches infinity, the limit of T(n)/f(n) is a nonzero constant, then f(n) is said to be of the same order of magnitude as T(n).
This is written T(n) = O(f(n)), and O(f(n)) is called the asymptotic time complexity of the algorithm, or simply its time complexity.

(2) Different T(n) can have the same time complexity. For example, T(n) = 5n² + 7n + 5 and T(n) = 9n² + 2n + 1 both have time complexity O(n²).

(3) To calculate the time complexity, take T(n) = 5n² + 7n + 5 above: drop the constant term, the lower-order terms, and the coefficient of the highest-order term, which leaves O(n²).
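As a worked check of rule (3) against the limit definition in (1), here is the calculation written out (my own illustration):

```latex
\lim_{n \to \infty} \frac{T(n)}{n^2}
  = \lim_{n \to \infty} \frac{5n^2 + 7n + 5}{n^2}
  = 5 \neq 0
\quad \Longrightarrow \quad T(n) = O(n^2)
```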

Common time complexity

1. Constant order O(1)
2. Logarithmic order O(log₂n)
3. Linear order O(n)
4. Linear logarithmic order O(n log₂n)
5. Square order O(n²)
6. Cubic order O(n³)
7. k-th order O(n^k)
8. Exponential order O(2^n)

Common time complexities, sorted from smallest to largest:

O(1) < O(log₂n) < O(n) < O(n log₂n) < O(n²) < O(n³) < O(n^k) < O(2^n)

The larger n is, the more the time grows and the lower the algorithm's execution efficiency.

Example:
Constant order O(1)
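The screenshot is missing here; a minimal Java sketch of the kind of loop-free snippet the example describes (variable names are illustrative):

```java
public class ConstantOrder {
    public static void main(String[] args) {
        // No loops: each statement runs exactly once,
        // so the time complexity is O(1) no matter how
        // many such statements there are.
        int i = 1;
        int j = 2;
        int sum = i + j;
        System.out.println(sum);
    }
}
```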

No matter how many statements this snippet executes, as long as it has no loop structure its time complexity is O(1). Even with tens of thousands of lines of such code, the time complexity is still O(1).

Logarithmic order O(log₂n)
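The screenshot is missing here; a minimal Java sketch of the doubling loop the text below describes (the bound n = 1024 is an arbitrary choice):

```java
public class LogarithmicOrder {
    public static void main(String[] args) {
        int n = 1024;
        int i = 1;
        // i doubles each iteration: 1, 2, 4, ..., so after x
        // iterations i = 2^x. The loop stops once i >= n,
        // i.e. after about log2(n) iterations -> O(log2 n).
        while (i < n) {
            i = i * 2;
        }
        System.out.println(i);
    }
}
```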

In the while loop, i is multiplied by 2 on each iteration, so i gets closer and closer to n. After looping x times, i becomes greater than or equal to n and the loop ends. Since 2 to the power x equals n, we have x = log₂n; that is, the loop ends after about log₂n iterations, so the time complexity of this code is O(log₂n). If instead i = i * 3, the corresponding time complexity is O(log₃n).

Linear order O(n)
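The screenshot is missing here; a minimal Java sketch of a single for loop of the kind described below (sum and n are illustrative):

```java
public class LinearOrder {
    public static void main(String[] args) {
        int n = 1000;
        int sum = 0;
        // The loop body runs n times, so the time consumed
        // grows linearly with n -> O(n).
        for (int i = 0; i < n; i++) {
            sum += i;
        }
        System.out.println(sum);
    }
}
```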

In this example, the code inside the for loop executes n times, so the time it consumes grows in step with n. The time complexity of this kind of code can therefore be written O(n).

Linear logarithmic order O(n log₂n)
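The screenshot is missing here; a minimal Java sketch of a doubling loop nested inside a loop that runs n times, matching the reasoning below (count is illustrative):

```java
public class LinearLogOrder {
    public static void main(String[] args) {
        int n = 1024;
        long count = 0;
        // The O(log2 n) doubling loop is itself repeated n times,
        // giving n * O(log2 n) = O(n log2 n) in total.
        for (int k = 0; k < n; k++) {
            int i = 1;
            while (i < n) {
                i = i * 2;
                count++;
            }
        }
        System.out.println(count);
    }
}
```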

Linear logarithmic order O(n log₂n): if a piece of code with time complexity O(log₂n) is itself looped n times, its time complexity is n * O(log₂n), which is O(n log₂n).

Square order O(n²)
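The screenshot is missing here; a minimal Java sketch of the double for loop described below (count and n are illustrative):

```java
public class SquareOrder {
    public static void main(String[] args) {
        int n = 100;
        int count = 0;
        // The inner statement runs n * n times -> O(n²).
        // With different bounds m and n it would run m * n
        // times -> O(m*n).
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                count++;
            }
        }
        System.out.println(count);
    }
}
```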

With a double for loop, the time complexity is O(n²). If the two loops have different bounds, say m and n rather than n for both, the time complexity becomes O(m*n).

3. Space complexity

Space complexity: defined as the storage space an algorithm consumes.
Space complexity is determined by the number of temporary working units the algorithm needs in order to solve a problem of size n. It increases as the problem size n increases: the larger n is, the more storage units are occupied.
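A minimal Java sketch contrasting O(1) and O(n) space (my own illustration, not from the original post; the temporary array is hypothetical):

```java
public class SpaceComplexity {
    public static void main(String[] args) {
        int n = 1000;

        // O(1) space: a fixed number of variables,
        // regardless of the problem size n.
        int sum = 0;
        for (int i = 0; i < n; i++) {
            sum += i;
        }

        // O(n) space: the temporary array grows with n,
        // so the algorithm occupies n extra storage units.
        int[] copy = new int[n];
        for (int i = 0; i < n; i++) {
            copy[i] = i;
        }

        System.out.println(sum + ", " + copy.length);
    }
}
```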

Origin blog.csdn.net/weixin_46457946/article/details/113048490