408 Postgraduate Entrance Examination Data Structure Review: Time Complexity and Space Complexity (with Unified Examination Questions)




1. Time complexity

The frequency of a statement is the number of times that statement is executed in the algorithm. The sum of the frequencies of all statements in the algorithm is denoted T(n); it is a function of the problem size n, and the time complexity is analyzed mainly by the order of magnitude of T(n). The frequency f(n) of the basic operation in the algorithm (the statement in the deepest loop) is of the same order of magnitude as T(n), so f(n) is usually used to analyze the time complexity of the algorithm. The time complexity of the algorithm is therefore written as T(n)=O(f(n)).

In this notation, O denotes the order of magnitude of T(n). Its strict mathematical definition is: if T(n) and f(n) are two functions defined on the set of positive integers, then T(n)=O(f(n)) means there exist constants C and n0 such that 0<=T(n)<=C*f(n) holds for all n>=n0.
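
For example, if T(n)=3n*n+2n+1, then taking C=6 and n0=1 gives 0<=3n*n+2n+1<=6n*n for all n>=1, so T(n)=O(n*n).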

The time complexity of the algorithm depends not only on the size n of the problem, but also on the nature of the input data. For example, in an array A[0...n-1], the algorithm for finding a given value k is roughly as follows:

int i=n-1;
while(i>=0&&(A[i]!=k))
    i--;        /* basic operation */
return i;       /* i is -1 here if k does not appear in A */

The frequency of the statement i-- (the basic operation) in this algorithm is related not only to the problem size n but also to the values of the elements of A and the value of k in the input instance:
① If no element of A equals k, the frequency of i-- is f(n)=n.
② If the last element of A equals k, the frequency of i-- is f(n)=0.

The worst-case time complexity is the time complexity of the algorithm in the worst case. The average time complexity is the expected running time of the algorithm under the assumption that all possible input instances occur with equal probability. The best-case time complexity is the time complexity of the algorithm in the best case. In general, the worst-case time complexity is the one considered, which guarantees that the running time of the algorithm will not exceed that bound.

When analyzing the time complexity of a program, there are the following two rules:
① Addition rule

T(n)=T1(n)+T2(n)=O(f(n))+O(g(n))=O(max(f(n),g(n)))

② Multiplication rule

T(n)=T1(n)*T2(n)=O(f(n))*O(g(n))=O(f(n)*g(n))
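
As a hypothetical illustration (the function name rules_demo and the loop bounds are assumptions for this sketch, not taken from the original), a fragment with a single loop followed by a nested loop combines both rules:

#include <stdio.h>

void rules_demo(int n) {
    int count = 0;
    /* First part: a single loop, T1(n)=O(n). */
    for (int i = 0; i < n; i++)
        count++;
    /* Second part: two nested loops; by the multiplication rule
       T2(n)=O(n)*O(n)=O(n*n). */
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            count++;
    /* The two parts run one after the other, so by the addition rule
       T(n)=O(n)+O(n*n)=O(max(n,n*n))=O(n*n). */
    printf("%d\n", count);
}

int main(void) {
    rules_demo(10);    /* prints 110 = 10 + 10*10 */
    return 0;
}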

Common asymptotic time complexities, in increasing order of growth, are:

O(1)<O(logn)<O(n)<O(nlogn)<O(n*n)<O(n*n*n)<O(2^n)<O(n!)<O(n^n)

2. Space complexity

The space complexity S(n) of an algorithm is defined as the storage space consumed by the algorithm; it is a function of the problem size n and is written S(n)=O(g(n)).

When a program executes, in addition to the storage space for its own instructions, constants, variables, and input data, it also needs working units for operating on the data and auxiliary space for the information required by the computation. If the space occupied by the input data depends only on the problem itself and has nothing to do with the algorithm, then only the auxiliary space beyond the input and the program itself needs to be analyzed.

An algorithm is said to work in place if the auxiliary space it requires is constant, i.e. O(1).
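
For illustration (a sketch not taken from the original post; the function names reverse_in_place and reverse_copy are made up), reversing an array in place needs only O(1) auxiliary space, while reversing it into a newly allocated array needs O(n):

#include <stdlib.h>

/* In-place reversal: only a few scalar variables, so auxiliary space is O(1). */
void reverse_in_place(int A[], int n) {
    for (int i = 0, j = n - 1; i < j; i++, j--) {
        int tmp = A[i];
        A[i] = A[j];
        A[j] = tmp;
    }
}

/* Reversal into a new array: the extra array B makes the auxiliary space O(n). */
int *reverse_copy(const int A[], int n) {
    int *B = (int *)malloc(n * sizeof(int));
    for (int i = 0; i < n; i++)
        B[i] = A[n - 1 - i];
    return B;    /* the caller is responsible for free(B) */
}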


3. Related exam questions

1 The time complexity of the following algorithm is (O(log2n))

void fun(int n){
    int i=1;
    while (i<=n)
        i=i*2;
}

Analysis: the basic operation is i=i*2. Let the number of executions be t; then 2^t<=n, that is, t<=log2n, so the time complexity is T(n)=O(log2n).

2 There is the following algorithm; its time complexity is (O(n^(1/3)))

void fun(int n){
    int i=0;
    while (i*i*i<=n)
        i++;
}

Analysis: the basic operation is i++. Let the number of executions be t; then t*t*t<=n, that is, t^3<=n, so t<=n^(1/3) and the time complexity is T(n)=O(n^(1/3)).

3 The program segment is as follows:

for(int i= n-1; i > 1; i--)
	for (int j = 1; j < i; j++) 
		if(A[j]>A[j+1])
			swap A[j] and A[j+1];


Where n is a positive integer, the frequency of the last statement (the swap) in the worst case is (O(n*n)).
Analysis: in the worst case the swap is executed in every iteration of the inner loop. The outer loop takes i=n-1, n-2, ..., 2, and for each i the inner loop runs i-1 times, so the frequency is 1+2+...+(n-2)=(n-1)(n-2)/2, which is O(n*n).

4 [2011 Unified Examination Questions] Let n be a non-negative integer describing the problem size; the time complexity of the following program fragment is (O(log2n))

x=2;
while(x<n/2)
	x=2*x;

Analysis: the basic operation is x=2*x, and each execution doubles x. Let the number of executions be t; before the t-th execution x=2^t, and the loop condition requires 2^t<n/2, so t<log2(n/2)=log2n-1, and therefore T(n)=O(log2n).

5 [2012 Exam Questions] The algorithm for finding the factorial of an integer n is as follows, and its time complexity is (O(n))

int fact(int n){
    if(n<=1) return 1;
    return n*fact(n-1);
}

Analysis: fact calls itself recursively until n<=1, so it is invoked n times in total and each call does a constant amount of work; the recurrence T(n)=T(n-1)+O(1) gives T(n)=O(n).

6 [2013 Unified Examination Questions] Given two ascending linked lists of lengths m and n, merging them into one descending linked list of length m+n has a worst-case time complexity of (O(max(m,n)))
Analysis: to merge the two ascending lists, compare the current elements of the two lists pairwise; each comparison determines the link position of one element (the smaller one is taken). Once one list is exhausted, the remaining elements of the other list can be linked in directly. The worst case is that the elements of the two lists are compared alternately until both are nearly exhausted; since m+n<=2max(m,n), the time complexity is O(max(m,n)).
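
A minimal C sketch of this merge (not part of the original exam question; the node type ListNode and the head-insertion approach are assumptions for illustration). Building the result by head insertion naturally yields the descending order:

#include <stddef.h>

typedef struct ListNode {
    int val;
    struct ListNode *next;
} ListNode;

/* Merge two ascending singly linked lists a and b into one descending list.
   Each step removes the smaller of the two head nodes and pushes it onto the
   front of the result, so the result ends up in descending order.
   Time: O(m+n), i.e. O(max(m,n)); auxiliary space: O(1). */
ListNode *merge_descending(ListNode *a, ListNode *b) {
    ListNode *result = NULL;
    while (a != NULL && b != NULL) {
        ListNode **smaller = (a->val <= b->val) ? &a : &b;
        ListNode *node = *smaller;
        *smaller = node->next;    /* advance the list the node came from */
        node->next = result;      /* head insertion into the result */
        result = node;
    }
    /* Push the remaining nodes of the non-empty list one by one; they are in
       ascending order, so head insertion keeps the result descending. */
    ListNode *rest = (a != NULL) ? a : b;
    while (rest != NULL) {
        ListNode *node = rest;
        rest = rest->next;
        node->next = result;
        result = node;
    }
    return result;
}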

7 [2014 Unified Examination Questions] The time complexity of the following program segment is (O(nlog2n))

count=0;
for(k=1;k<=n;k*=2)
	for(j=1;j<=n;j++)
		count++;

Analysis: the inner-loop condition j<=n does not involve the outer-loop variable, so the two loops are independent. Each inner iteration increases j by 1, so the inner loop executes n times. The outer-loop condition is k<=n and the update is k*=2, so after t iterations k=2^t, and 2^t<=n gives t<=log2n. The inner loop is therefore O(n) and the outer loop O(log2n); for nested loops the multiplication rule gives T(n)=O(n)*O(log2n)=O(nlog2n).

Origin blog.csdn.net/m0_46653805/article/details/123413567