Big O notation: time complexity is usually expressed in big-O notation, written T(n) = O(f(n)). This is read "T(n) is on the order of f(n)", meaning T(n) is bounded above by a constant multiple of f(n). If a problem has size n and an algorithm solves it in time T(n), then T(n) is called the time complexity of that algorithm. The limiting behavior of T(n) as the input size n grows is called the asymptotic time complexity.
Iterative procedure
- Problem:
int i = 1;
while (i <= n) {
    i = i * 2;
}
- Idea:
Suppose the loop body executes k times. Then \(2^k \le n\), so \(k \le \log_2 n\), and the time complexity is T(n) = O(log n).
Recursive procedure
Master method
- Divide-and-conquer master theorem: \(T(n) = aT(n/b) + f(n)\), where n is the problem size, a ≥ 1 and b > 1 are constants, and f(n) is an asymptotically positive function (the cost of the work done outside the recursive calls). To apply the master theorem, compare f(n) with \(n^{\log_b a}\) and consider three cases:
- If \(f(n) = O(n^{\log_b a - \epsilon})\) for some constant ε > 0 (i.e., f(n) grows polynomially slower than \(n^{\log_b a}\)), then \(T(n) = \Theta(n^{\log_b a})\) (the recursive part dominates).
- If \(f(n) = \Theta(n^{\log_b a})\) (i.e., f(n) and \(n^{\log_b a}\) are of the same order), then \(T(n) = \Theta(n^{\log_b a}\log n)\).
- If \(f(n) = \Omega(n^{\log_b a + \epsilon})\) for some constant ε > 0 (i.e., f(n) grows polynomially faster than \(n^{\log_b a}\)), and in addition \(af(n/b) \le cf(n)\) holds for some constant c < 1 and all sufficiently large n (the regularity condition), then \(T(n) = \Theta(f(n))\).
- Problem:
  \(T(n) = 3T(n/2) + n^2\)
- Idea:
  - To use the master method: a = 3, b = 2, \(f(n) = n^2\), which satisfies the conditions a ≥ 1 and b > 1.
  - Check which case applies: since \(\log_2 3 \approx 1.58 < 2\), and \(3(n/2)^2 = \frac{3}{4}n^2 \le cn^2\) holds with c = 3/4 < 1, the third case applies, so \(T(n) = \Theta(n^2)\).
Iteration (expansion) method
- Problem:
// Tower of Hanoi; assume move() has time complexity O(1)
void hanoi(int n, char x, char y, char z) {
    if (n == 1) {
        move(x, 1, z);
    } else {
        hanoi(n - 1, x, z, y);
        move(x, n, z);
        hanoi(n - 1, y, x, z);
    }
}
- Idea:
- First write the recurrence: T(n) = 2T(n-1) + O(1) (a problem of size n is decomposed into two subproblems of size n-1, plus one basic move() operation).
- The master method does not apply: a = 2, but b = 1, which violates the condition b > 1.
- Expand the recurrence by iteration. The problem size shrinks by 1 each step, so the expansion must terminate at the base case n == 1:
  \(T(n) = 2T(n-1) + 1\)
  \(T(n-1) = 2T(n-2) + 1\)
  Substituting the second into the first gives
  \(T(n) = 4T(n-2) + 2 + 1\)
  Continuing, or by mathematical induction,
  \(T(n) = 2^{n-1}T(1) + 2^{n-2} + \dots + 4 + 2 + 1\)
  ∵ the termination condition gives T(1) = 1,
  ∴ \(T(n) = 2^{n-1} + 2^{n-1} - 1 = 2^n - 1\), and the time complexity is O(\(2^n\)).