Commonly used time complexities in algorithm analysis

In ascending order of magnitude, the common time complexities are:
constant order O(1), logarithmic order O(log2 n), linear order O(n),
linearithmic order O(n log2 n), quadratic order O(n^2), cubic order O(n^3), ...,
k-th power order O(n^k), and exponential order O(2^n). As the problem size n keeps growing, these time complexities grow in turn, and the efficiency of the algorithm decreases. A rough numerical comparison of these growth rates is sketched below.
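As an illustration of how these orders compare as n grows, the small Python sketch below simply evaluates each growth function at a few problem sizes. It is an illustrative addition (the chosen sizes and function list are assumptions, not from the original article):

```python
import math

# Growth functions for the orders listed above, in ascending order of magnitude.
growth_functions = [
    ("O(1)",        lambda n: 1),
    ("O(log2 n)",   lambda n: math.log2(n)),
    ("O(n)",        lambda n: n),
    ("O(n log2 n)", lambda n: n * math.log2(n)),
    ("O(n^2)",      lambda n: n ** 2),
    ("O(n^3)",      lambda n: n ** 3),
    ("O(2^n)",      lambda n: 2 ** n),
]

for n in (10, 20, 30):
    print(f"n = {n}")
    for name, f in growth_functions:
        # Approximate number of basic operations for each order at this n.
        print(f"  {name:<12} ~ {f(n):,.0f}")
```

Already at n = 30, O(2^n) is around a billion operations while O(n^2) is only 900, which is the point the ordering above is making.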

Time complexity is the order of magnitude of the number of times the basic operation is repeatedly executed: T(n) = O(f(n)).
The following six polynomial-time orders are the ones most commonly used when evaluating algorithms. Their relationship is:
    O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3)
The relationship among the exponential-time orders is:
    O(2^n) < O(n!) < O(n^n)
When n becomes very large, the time required by an exponential-time algorithm and by a polynomial-time algorithm differs enormously.
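To make that gap concrete, the short sketch below (an illustrative addition, not code from the original post) evaluates the exponential-type orders alongside n^3 at a few values of n; the ordering O(2^n) < O(n!) < O(n^n) is already visible at small n, and all three dwarf the polynomial term:

```python
import math

# Compare a polynomial order (n^3) with the three exponential-type orders.
for n in (5, 10, 20):
    print(f"n = {n:2d}: n^3 = {n**3:,}  2^n = {2**n:,}  "
          f"n! = {math.factorial(n):,}  n^n = {n**n:,}")
```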

The following are examples of some common time complexities.

Each entry lists the name (and complexity class, if it has one), the running time T(n), examples of running times, and example algorithms:

Constant time: O(1); 10; determining whether a binary number is odd or even.
Inverse Ackermann time: O(α(n)); amortized time per operation on a disjoint set.
Iterated logarithmic time: O(log* n); the Cole-Vishkin algorithm.
Log-logarithmic time: O(log log n); a single operation on a bounded priority queue [1].
Logarithmic time (DLOGTIME): O(log n); log n, log(n^2); binary search.
Polylogarithmic time: (log n)^O(1); (log n)^2.
Fractional power time: O(n^c), where 0 < c < 1; n^(1/2), n^(2/3); search operations in a k-d tree.
Linear time: O(n); n; searching an unsorted array of n elements.
Linear iterated logarithmic time: O(n log* n); Raimund Seidel's polygon triangulation algorithm.
Linearithmic time: O(n log n); n log n, log(n!); the fastest possible comparison sorts.
Quadratic time: O(n^2); n^2; bubble sort, insertion sort.
Cubic time: O(n^3); n^3; naive multiplication of two n x n matrices, calculating partial correlation.
Polynomial time (P): 2^(O(log n)) = n^(O(1)); n, n log n, n^10; Karmarkar's algorithm for linear programming, the AKS primality test.
Quasi-polynomial time (QP): 2^((log n)^O(1)); the best-known O(log^2 n)-approximation algorithm for the Steiner tree problem.
Sub-exponential time, first definition (SUBEXP): O(2^(n^ε)) for every ε > 0; O(2^((log n)^(log log n))); assuming complexity-theoretic conjectures, BPP is contained in SUBEXP [2].
Sub-exponential time, second definition: 2^(o(n)); 2^(n^(1/3)); the best-known algorithms for integer factorization and graph isomorphism.
Exponential time with linear exponent (E): 2^(O(n)); 1.1^n, 10^n; solving the traveling salesman problem with dynamic programming.
Factorial time: O(n!); n!; solving the traveling salesman problem by brute-force search.
Exponential time (EXPTIME): 2^(poly(n)); 2^n, 2^(n^2).
Double exponential time (2-EXPTIME): 2^(2^(poly(n))); 2^(2^n); deciding the truth of a given statement in Presburger arithmetic.
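As a concrete illustration of two entries from the list above (logarithmic-time binary search and quadratic-time bubble sort), here is a minimal Python sketch; it is an added example under the usual textbook definitions of these algorithms, not code from the original article:

```python
def binary_search(sorted_items, target):
    """O(log n): halve the search interval on every comparison."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # not found


def bubble_sort(items):
    """O(n^2): nested passes swapping adjacent out-of-order elements."""
    items = list(items)
    for i in range(len(items)):
        for j in range(len(items) - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items


if __name__ == "__main__":
    data = bubble_sort([5, 3, 8, 1, 9, 2])   # quadratic-time sort
    print(data)                              # [1, 2, 3, 5, 8, 9]
    print(binary_search(data, 8))            # logarithmic-time lookup -> index 4
```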



[Figure: standard time complexity growth chart]



 
---------------------
Author: hear the sound of rain hb
Source: CSDN
Original: https://blog.csdn.net/u010010664/article/details/78834695
Copyright: This is an original article by the blogger. If you reproduce it, please include a link to the original post!
