Big O time complexity

1. Time complexity

  In algorithm analysis, the total number of statement executions T(n) is a function of the problem size n. We analyze how T(n) changes as n grows and determine the order of magnitude of T(n). The time measure of an algorithm is written T(n) = O(f(n)). It means that as the problem size n increases, the growth rate of the algorithm's execution time is the same as the growth rate of f(n). This is called the asymptotic time complexity of the algorithm, or time complexity for short, where f(n) is some function of the problem size n.

    This notation, which uses an uppercase O to express time complexity, is called big O notation. For example: O(n), O(1), O(n²), O(log n), and so on. In general, the algorithm whose T(n) grows most slowly as n increases is the optimal algorithm.

2. How to derive the big O order

1. Replace every additive constant in the running-time count with the constant 1.

2. In the modified count, keep only the highest-order term.

3. If the highest-order term has a coefficient other than 1, drop the coefficient. What remains is the big O order.

Example 1: constant order, with time complexity O(1)

int sum = 0, n = 100;              /* executed once */
sum = (1 + n) * n / 2;             /* executed once */
printf("The sum is: %d", sum);     /* executed once */

  We can see that the number of operations is f(n) = 3. Applying the derivation rules above: the constant 3 is replaced by 1, and there is no higher-order term to keep, so the time complexity is O(1). In other words, whether such an algorithm executes 3 statements, 30, or even 300, as long as the count is a constant, its time complexity is O(1), not O(3), O(30), or O(300). This is what we call constant order.

Example 2: linear order, with time complexity O(n)

for (int i = 0; i < n; i++) {
    sum += i;
}

  We can see from the code above that its time complexity is O(n), because the loop body executes n times.

Example 3: square order, with time complexity O(n²)

for (int i = 0; i < n; i++) {
    for (int j = i; j < n; j++) {
        /* the time complexity of this nested loop is O(n²) */
    }
}

Analysis:

  When i = 0, the inner loop executes n times;

  When i = 1, the inner loop executes n − 1 times;

  ......

  When i = n − 1, it executes once.

  So the total number of executions is: n + (n − 1) + (n − 2) + ... + 1 = n(n + 1)/2 = n²/2 + n/2.

  Applying the derivation rules to this count: there is no additive constant term, so the first rule does not apply; keeping only the highest-order term leaves n²/2; dropping the coefficient 1/2 leaves n². So the time complexity of this code segment is O(n²).

Example 4: logarithmic order, with time complexity O(log n)

int count = 1;
while (count < n) {
    count *= 2;
}

  We can see that in the code above, each time count is multiplied by 2 it moves one step closer to n, and the loop exits once count is no longer less than n. So we need the number of doublings x for which 2^x ≥ n, which gives x = log₂ n. The time complexity of a loop like this is therefore O(log n), called logarithmic order.

3. Ranking time complexities by efficiency

   The time complexities we commonly encounter in data structures are: O(1), O(n), O(n²), O(log n), O(n log n), O(n³), O(2ⁿ).

  Ordered from least to most time consumed, these are:

  O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ)


Origin www.cnblogs.com/guanghe/p/11011534.html