Time complexity and space complexity

  Definition: the number of times a statement of an algorithm is executed is called its statement frequency or time frequency.

  Convention: when evaluating the efficiency of an algorithm, the main considerations are the worst-case time complexity and the average time complexity. Unless otherwise specified, the time complexity discussed below is the worst-case time complexity.

1. Time complexity

  In theory, the exact time an algorithm will take cannot be calculated in advance; it can only be known by running the program on a machine and measuring it. But we cannot, and do not need to, test every algorithm on a machine; we only need to know which algorithm takes more time and which takes less. The time an algorithm consumes is proportional to the number of times its statements are executed: the more often an algorithm's statements are executed, the more time it takes.

  In the time frequency T(n), n is called the problem size. As n changes, the time frequency T(n) changes with it, but it changes in a regular way, and this is why the concept of time complexity is introduced. In general, the number of times the basic operation of an algorithm is repeated is some function of the problem size n, written T(n). If there is an auxiliary function f(n) such that, as n approaches infinity, the limit of T(n)/f(n) is a non-zero constant, then f(n) is said to be a function of the same order as T(n). This is written T(n) = O(f(n)), and O(f(n)) is called the asymptotic time complexity of the algorithm, or simply its time complexity.
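
  As a small worked illustration of this definition (a sketch added here, not from the original text), the program below counts how many times the basic operation of a single loop executes; since T(n)/n tends to the non-zero constant 1, the loop is O(n).

/* Illustrative sketch: counting the basic operation of a single loop.
   The body runs exactly n times, so T(n) = n and T(n)/n -> 1 != 0,
   which by the definition above gives T(n) = O(n). */
#include <stdio.h>

int main(void) {
    int n = 1000;        /* problem size, chosen arbitrarily for the demo */
    long count = 0;      /* T(n): number of executions of the basic operation */
    for (int i = 0; i < n; i++) {
        count++;         /* the basic operation being counted */
    }
    printf("n = %d, T(n) = %ld\n", n, count);
    return 0;
}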

  Computing time complexity

  (1) If the execution time of an algorithm does not grow as the problem size n grows, then even if the algorithm contains thousands of statements, its execution time is just a (possibly large) constant. The time complexity of such an algorithm is O(1).

   In ascending order of magnitude, the common time complexities are: constant order O(1), logarithmic order O(log2 n), linear order O(n), linearithmic order O(n log2 n), quadratic order O(n^2), cubic order O(n^3), ..., k-th power order O(n^k), and exponential order O(2^n). As the problem size n grows, the time complexity increases and the efficiency of the algorithm decreases.
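
  As an aside (an illustrative sketch, not part of the original), the loop shapes below are the ones that typically produce the logarithmic and linearithmic orders in this list: a loop whose counter doubles each pass is O(log2 n), and nesting such a loop inside a plain O(n) loop gives O(n log2 n).

/* Illustrative sketches only: typical loop shapes for O(log2 n) and O(n log2 n). */
#include <stdio.h>

int main(void) {
    int n = 64;

    /* O(log2 n): the counter doubles each pass, so it takes about
       log2(n) iterations to reach n. */
    int steps = 0;
    for (int i = 1; i < n; i *= 2) {
        steps++;
    }
    printf("O(log n) loop: %d iterations for n = %d\n", steps, n);

    /* O(n log2 n): an O(log2 n) inner loop nested inside an O(n) outer loop. */
    long work = 0;
    for (int i = 0; i < n; i++) {
        for (int j = 1; j < n; j *= 2) {
            work++;
        }
    }
    printf("O(n log n) nest: %ld basic operations for n = %d\n", work, n);
    return 0;
}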

1 i = 100000;
2 while(i--) {
3   printf("hello");
4 }

  Answer:

  The loop in this fragment runs 100,000 times. Although that is a large number of iterations, it is a fixed constant that does not depend on any problem size, and the work on line 3 takes constant time on each pass, so the fragment has constant-order time complexity, O(1).
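
  For contrast, a small sketch (added for illustration, not from the original): if the loop bound is the problem size n supplied as input, rather than a fixed constant, the body executes n times and the fragment is linear order, O(n).

#include <stdio.h>

int main(void) {
    int n;
    if (scanf("%d", &n) != 1) return 1;  /* n is the problem size, read as input */
    int i = n;
    while (i--) {
        printf("hello\n");               /* executed n times -> linear order O(n) */
    }
    return 0;
}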

  (2) When there are several nested loops, the time complexity of the algorithm is determined by the frequency of the most deeply nested statement.

1 x=0; 
2 for(i=1;i<=n;i++) 
3     for(j=1;j<=i;j++)
4        for(k=1;k<=j;k++)
5           x++; 

  The work of this algorithm is done mainly on line 5, and each execution of that line takes constant time. Line 5 sits inside three nested loops: the outer loop runs n times, the middle loop runs 1 + 2 + ... + n = n(n+1)/2 times in total, and line 5 itself executes 1 + (1 + 2) + (1 + 2 + 3) + ... + (1 + 2 + ... + n) = n(n+1)(n+2)/6 times, a cubic polynomial in n. Dropping the lower-order terms and the constant factor, the time complexity of the algorithm is O(n^3).
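
  This count can be checked empirically. The short program below (an illustrative sketch, not from the original) runs the same triple loop for several values of n and compares the number of times the innermost statement executes against the closed form n(n+1)(n+2)/6.

#include <stdio.h>

int main(void) {
    for (long n = 1; n <= 10; n++) {
        long x = 0;                               /* counts executions of the innermost statement */
        for (long i = 1; i <= n; i++)
            for (long j = 1; j <= i; j++)
                for (long k = 1; k <= j; k++)
                    x++;
        long formula = n * (n + 1) * (n + 2) / 6; /* closed form of the count */
        printf("n = %2ld: counted = %4ld, n(n+1)(n+2)/6 = %4ld\n", n, x, formula);
    }
    return 0;
}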

  PS: a rough rule of thumb (for reference only): a single loop over n contributes a factor of n; nesting another loop inside multiplies in another factor of n (the exponent goes up by one); loops placed one after another add their costs, and the larger term determines the final result, as the sketch below illustrates.
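
  A brief sketch of this rule of thumb (illustrative only): two loops in sequence add their costs and the larger term wins, while two loops nested multiply them.

#include <stdio.h>

int main(void) {
    int n = 100;
    long sequential = 0, nested = 0;

    /* Two loops in sequence: n + n*n basic operations.
       The n*n term dominates, so the whole fragment is O(n^2). */
    for (int i = 0; i < n; i++)
        sequential++;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            sequential++;

    /* Two loops nested: n * n basic operations, i.e. O(n^2),
       because nesting multiplies the factors (the exponent goes up by one). */
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            nested++;

    printf("sequential: %ld (n + n^2), nested: %ld (n^2)\n", sequential, nested);
    return 0;
}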

2. Space complexity

  The space complexity of a program is the amount of memory the program requires over the course of its run. It has two parts:

  (1) The fixed part. The size of this part is independent of the number and values of the input/output data. It includes the instruction space (i.e., the code) and the data space occupied by constants and simple variables. This part is static.
  (2) The variable part. It mainly includes dynamically allocated space and the space required by the recursion stack. This part depends on the algorithm.
  If the storage space required by an algorithm is expressed as a function f(n), then S(n) = O(f(n)), where n is the problem size and S(n) denotes the space complexity.
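
  To make the two parts concrete, here is a small sketch (added for illustration, using a simple summation task as the assumed example): the iterative version needs only O(1) extra space, a fixed number of simple variables, while the recursive version needs O(n) space because each call adds a frame to the recursion stack.

#include <stdio.h>

/* O(1) extra space: only a fixed number of simple variables,
   regardless of n (fixed part only, no growing variable part). */
long sum_iterative(long n) {
    long s = 0;
    for (long i = 1; i <= n; i++)
        s += i;
    return s;
}

/* O(n) extra space: each recursive call adds a frame to the recursion
   stack, so the variable part grows linearly with n. */
long sum_recursive(long n) {
    if (n == 0)
        return 0;
    return n + sum_recursive(n - 1);
}

int main(void) {
    long n = 1000;
    printf("iterative: %ld, recursive: %ld\n", sum_iterative(n), sum_recursive(n));
    return 0;
}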

