Algorithm Analysis and Design Study Notes 4: Algorithm Complexity Analysis

Today we'll talk about algorithmic complexity. Before we start, let me vent a little: these past few days I searched all over the Internet for articles on complexity analysis, and most of them strike me as scary at first sight for a beginner. So I'll explain this topic in the plainest language I can. Inevitably there will be places where the wording is not perfectly precise, so please bear with me. Without further ado, on to the lecture.

What is algorithm complexity?

  • Algorithm complexity measures the computer resources an algorithm requires.
  • It can be divided into the algorithm's time complexity T(n) and its space complexity S(n), where n is the problem size (input size).

Time complexity of the algorithm

Time complexity is a function of the number of basic operations an algorithm performs, denoted T(n). It can be divided into:

  • Worst-case time complexity
  • Best-case time complexity
  • Average-case time complexity
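To make the three cases concrete, here is a minimal sketch (my own example, not from the original notes) that counts comparisons in a linear search:

```python
def linear_search(arr, target):
    """Scan arr left to right; return (index, number of comparisons)."""
    comparisons = 0
    for i, x in enumerate(arr):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

arr = [7, 3, 9, 1, 5]
print(linear_search(arr, 7))   # best case: target is first -> 1 comparison
print(linear_search(arr, 5))   # worst case for a hit: target is last -> 5 comparisons
print(linear_search(arr, 42))  # miss: all n elements checked -> 5 comparisons
```

For a target at a random position the average is about n/2 comparisons; all three cases grow linearly in n, but the constants differ.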

Space complexity of the algorithm

Space complexity measures the storage space an algorithm occupies while running, denoted S(n). It can be divided into:

  • The storage needed to hold the program itself, i.e. the size of the program.
  • The storage consumed while the program executes, such as intermediate variables.
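The second bullet (storage consumed during execution) can be illustrated by two ways of summing 1..n; this is a sketch of my own, not from the original notes:

```python
def sum_iterative(n):
    # O(1) extra space: only the accumulator, regardless of n
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_recursive(n):
    # O(n) extra space: one stack frame per pending recursive call
    if n == 0:
        return 0
    return n + sum_recursive(n - 1)

print(sum_iterative(100), sum_recursive(100))  # both compute 5050
```

Both functions do O(n) work, but the recursive version's call stack grows with n while the loop uses constant extra space.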

The relationship between time and space complexity

An algorithm's time complexity and space complexity often influence each other: pursuing better time complexity tends to sacrifice space, and pursuing better space complexity tends to sacrifice time. You usually can't have both.
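A classic illustration of this trade-off (again a sketch of my own, not from the original notes) is computing Fibonacci numbers: spending O(n) extra memory on a cache cuts the running time from exponential to linear.

```python
from functools import lru_cache

def fib_slow(n):
    # No extra storage beyond the call stack, but O(2^n) time:
    # the same subproblems are recomputed over and over
    if n < 2:
        return n
    return fib_slow(n - 1) + fib_slow(n - 2)

@lru_cache(maxsize=None)
def fib_fast(n):
    # O(n) time, paid for with O(n) cached results
    if n < 2:
        return n
    return fib_fast(n - 1) + fib_fast(n - 2)

print(fib_slow(15), fib_fast(15))  # both compute 610
```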

Asymptotic time complexity of the algorithm

To simplify complexity analysis, we introduce three notations in the asymptotic sense: O (asymptotic upper bound), Ω (asymptotic lower bound), and Θ (asymptotic tight bound). (I'll follow up on the remaining two symbols later.)

Θ - Asymptotic tight bound

First, the asymptotic tight bound.
For a given function g(n),
Θ(g(n)) = { f(n) : there exist positive constants c₁, c₂, and n₀ such that 0 ≤ c₁g(n) ≤ f(n) ≤ c₂g(n) for all n ≥ n₀ }
If f(n) ∈ Θ(g(n)), we write f(n) = Θ(g(n)).
At the sight of this formula many readers probably want to back out, but it is actually simpler than it looks. In plain terms: there exist constants c₁ and c₂ such that for every n greater than some n₀,
c₁g(n) ≤ f(n) ≤ c₂g(n)
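The inequality can be sanity-checked numerically. Here f and g are hypothetical functions of my own choosing, with constants c₁ = 3, c₂ = 5, n₀ = 1:

```python
def f(n):
    # example cost function: f(n) = 3n^2 + 2n
    return 3 * n * n + 2 * n

def g(n):
    # candidate tight bound: g(n) = n^2
    return n * n

c1, c2, n0 = 3, 5, 1
# c1*g(n) <= f(n) <= c2*g(n) for every n >= n0, so f(n) = Θ(n^2)
print(all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 1000)))  # True
```

A finite check like this is not a proof, of course; it just makes the definition tangible.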

O - Asymptotic upper bound

For a given function g(n),
O(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n₀ }
We write f(n) ∈ O(g(n)), or simply f(n) = O(g(n)).

Ω - Asymptotic lower bound

For a given function g(n),
Ω(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n₀ }
We write f(n) ∈ Ω(g(n)), or simply f(n) = Ω(g(n)).
If you're still feeling a bit foggy after these definitions, a concrete example should help.
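Since the original example figure did not survive, here is a stand-in worked example of my own: selection sort always performs exactly n(n-1)/2 comparisons, and that count is Θ(n²).

```python
def selection_sort_comparisons(n):
    # Selection sort compares every pair of positions once:
    # n(n-1)/2 comparisons in the best, worst, and average case alike
    return n * (n - 1) // 2

# Sandwich the count between c1*n^2 and c2*n^2 with c1 = 1/4, c2 = 1/2, n0 = 2
ok = all(0.25 * n * n <= selection_sort_comparisons(n) <= 0.5 * n * n
         for n in range(2, 1000))
print(ok)  # True
```

Because the count is Θ(n²), it is automatically also O(n²) and Ω(n²); looser statements such as O(n³) or Ω(n) are true too, just less informative.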
(To be continuously updated.)


Source: www.cnblogs.com/AWSG-Shaodw/p/12423907.html