Asymptotic Notations

1. Definition

Asymptotic notation describes the growth of functions defined on the domain of natural numbers N = {0, 1, 2, ...}. We usually use asymptotic notation to describe the running time of an algorithm. Some basic asymptotic notations are introduced below.

2. O notation (big-O)

2.1 Definition

f(n) = O(g(n)) — there exist positive constants c and n0 such that f(n) <= c * g(n) for all n >= n0.

In words: there are positive constants c and n0 such that for any n >= n0, f(n) <= c * g(n).

We often use O notation to give an upper bound on a function to within a constant factor. Note that g(n) is only required to be an asymptotic upper bound, not an asymptotically tight bound (e.g. 3n^2 + 4n + 5 = O(n^2) holds, but 3n^2 + 4n + 5 = O(n^1000000) also holds).

2.2 Examples

2.2.1 Common examples

  • 3n^2 + 4n + 5 = O(n^2)
  • log10 n = O(log2 n) = O(log n) (by default, log is base 2)
  • sin n = O(1)
  • 10^10 = O(1)
  • Σ_{i=1}^{n} i^2 = O(n^3)
  • Σ_{i=1}^{n} i = O(n^2)
  • log(n!) = log(n) + ... + log(1) = O(n log n)
  • Σ_{i=1}^{n} 1/i = O(log n)
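As a quick numeric sanity check (a sketch, not a proof — the constants below are just one valid choice), two of the sums above can be compared against their claimed bounds:

```python
import math

def harmonic(n):
    """Partial sum of the harmonic series: 1/1 + 1/2 + ... + 1/n."""
    return sum(1.0 / i for i in range(1, n + 1))

def sum_of_squares(n):
    """Sum of i^2 for i = 1..n."""
    return sum(i * i for i in range(1, n + 1))

n = 10_000
# Each of the n terms i^2 is at most n^2, so the sum is at most n * n^2 = n^3.
print(sum_of_squares(n) <= n ** 3)        # True
# The harmonic sum stays below 2 * log2(n) for n >= 2 (c = 2 is one valid constant).
print(harmonic(n) <= 2 * math.log2(n))    # True
```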

2.2.2 Proof example

(1) f(n) = 3n^2 + 4n + 5, prove that f(n) = O(n^2).

Solution: When n >= 1, f(n) = 3n^2 + 4n + 5 <= 3n^2 + 4n^2 + 5n^2 = 12n^2. So with positive constants c = 12 and n0 = 1, f(n) <= c * n^2 for any n >= n0.
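The bound from this proof can be checked mechanically over a range of n (a sketch; c = 12 and n0 = 1 are the constants chosen above):

```python
def f(n):
    return 3 * n * n + 4 * n + 5

# c = 12, n0 = 1: f(n) <= c * n^2 for every n in the tested range.
assert all(f(n) <= 12 * n * n for n in range(1, 10_001))
print("f(n) <= 12 * n^2 holds for n = 1..10000")
```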

(2) f(n) = Σ_{i=1}^{n} 1/i, prove that f(n) = O(log2 n).

Note: Σ_{i=1}^{n} 1/i = 1/1 + 1/2 + 1/3 + ... + 1/n is the harmonic series, which is used in the proof.

Solution: When n >= 2 (for simplicity, assume n is a power of 2), round each denominator down to the nearest power of 2, which enlarges each term: f(n) = 1/1 + 1/2 + 1/3 + 1/4 + 1/5 + 1/6 + 1/7 + ... + 1/(n-1) + 1/n <= 1/1 + 1/2 + 1/2 + 1/4 + 1/4 + 1/4 + 1/4 + ... + 1/(n/2) + 1/n = log n + 1/n <= 2 log n. So with positive constants c = 2 and n0 = 2, f(n) <= c * log n for any n >= n0, i.e. f(n) = O(log n).
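The constants c = 2 and n0 = 2 from this proof can be verified numerically (a sketch, using base-2 logarithms per the document's convention):

```python
import math

def harmonic(n):
    """Partial sum of the harmonic series: 1/1 + 1/2 + ... + 1/n."""
    return sum(1.0 / i for i in range(1, n + 1))

# c = 2, n0 = 2: the harmonic sum never exceeds 2 * log2(n) in the tested range.
assert all(harmonic(n) <= 2 * math.log2(n) for n in range(2, 5001))
print("harmonic(n) <= 2 * log2(n) holds for n = 2..5000")
```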

2.2.3 Proof of counterexample

(1) Prove that f(n) = 2^(2n) is not O(2^n).

Solution: To disprove an O bound, one must show that for any positive constants c and n0, there exists n >= n0 such that f(n) > c * g(n). Here, for any positive constants c and n0, choose any n >= max(n0, log c + 1); then 2^n > c, so f(n) = 2^(2n) = 2^n * 2^n > c * 2^n.
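Following this argument, for any candidate constant c one can exhibit a concrete n where the bound fails (a sketch; the three values of c are arbitrary):

```python
import math

# 2^(2n) = 2^n * 2^n, so any n with 2^n > c breaks the inequality 2^(2n) <= c * 2^n.
for c in (1, 1000, 10 ** 9):
    n = math.ceil(math.log2(c)) + 1   # just past log2(c)
    assert 2 ** (2 * n) > c * 2 ** n
print("no constant c can make 2^(2n) <= c * 2^n hold for all large n")
```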

3. Ω notation (big-Omega)

3.1 Definition

f(n) = Ω(g(n)) — there exist positive constants c and n0 such that f(n) >= c * g(n) for all n >= n0.

In words: there are positive constants c and n0 such that for any n >= n0, f(n) >= c * g(n).

The Ω notation gives an asymptotic lower bound of a function.

3.2 Examples

3.2.1 Common examples

  • n^2 / 2 - 3n = Ω(n^2)
  • log(n!) = Ω(n log n)
  • Σ_{i=1}^{n} 1/i = Ω(log n)

3.2.2 Proof example

(1) f(n) = n^2 / 2 - 3n, prove that f(n) = Ω(n^2).

Solution: When n >= 12, f(n) = n^2 / 2 - 3n >= n^2 / 4 (since n^2 / 4 >= 3n exactly when n >= 12). So with positive constants c = 1/4 and n0 = 12, f(n) >= c * n^2 for any n >= n0.
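Checking the inequality n^2 / 2 - 3n >= n^2 / 4 over a range confirms the chosen constants (a sketch; n0 = 12 is exactly where the inequality starts to hold):

```python
def f(n):
    return n * n / 2 - 3 * n

# c = 1/4, n0 = 12: the bound holds from n = 12 onward...
assert all(f(n) >= n * n / 4 for n in range(12, 10_001))
# ...and fails just below the threshold: f(11) = 27.5 < 30.25.
assert f(11) < 11 * 11 / 4
print("f(n) >= n^2 / 4 holds for n = 12..10000")
```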

(2) f(n) = log(n!), prove that f(n) = Ω(n log n).

Solution: When n >= 4, f(n) = log(n) + log(n-1) + ... + log(1) >= log(n) + log(n-1) + ... + log(n/2) >= (n/2) * log(n/2) = (n/2) * (log n - 1) >= n log n / 4 (the last step uses log n >= 2, i.e. n >= 4). So with positive constants c = 1/4 and n0 = 4, f(n) >= c * n log n for any n >= n0.
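The bound log(n!) >= n log n / 4 can be checked numerically (a sketch; logs are base 2 as in the document):

```python
import math

def log2_factorial(n):
    """log2(n!) computed as a sum of logs, avoiding huge intermediate integers."""
    return sum(math.log2(i) for i in range(1, n + 1))

# c = 1/4, n0 = 4: the lower bound holds throughout the tested range.
assert all(log2_factorial(n) >= n * math.log2(n) / 4 for n in range(4, 2001))
print("log2(n!) >= n * log2(n) / 4 holds for n = 4..2000")
```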

(3) f(n) = Σ_{i=1}^{n} 1/i, prove that f(n) = Ω(log n).

Solution: When n >= 2 (for simplicity, assume n is a power of 2), round each denominator up to the next power of 2, which shrinks each term: f(n) = 1/1 + 1/2 + 1/3 + 1/4 + 1/5 + 1/6 + 1/7 + ... + 1/(n-1) + 1/n >= 1/2 + 1/4 + 1/4 + 1/8 + 1/8 + 1/8 + 1/8 + ... + 1/n + ... + 1/n. Each block of 2^k terms with common value 1/2^(k+1) sums to 1/2, and there are log n such blocks, so f(n) >= (log n) / 2. So with positive constants c = 1/2 and n0 = 2, f(n) >= c * log n for any n >= n0.
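A numeric check of the Ω(log n) claim for the harmonic sum (a sketch; c = 1/2 and n0 = 2):

```python
import math

def harmonic(n):
    """Partial sum of the harmonic series: 1/1 + 1/2 + ... + 1/n."""
    return sum(1.0 / i for i in range(1, n + 1))

# c = 1/2, n0 = 2: the harmonic sum is at least (log2 n) / 2 in the tested range.
assert all(harmonic(n) >= math.log2(n) / 2 for n in range(2, 5001))
print("harmonic(n) >= log2(n) / 2 holds for n = 2..5000")
```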

3.2.3 Proof of counterexample

(1) Prove that 100n^2 is not Ω(n^3).

Solution: To disprove an Ω bound, one must show that for any positive constants c and n0, there exists n >= n0 such that f(n) < c * g(n). Here, for any positive constants c and n0, choose n = n0 + ⌈100/c⌉; then n > 100/c, so 100n^2 < c * n^3.
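Concretely, for any candidate c the choice n = n0 + ⌈100/c⌉ yields a counterexample (a sketch over a few arbitrary values of c):

```python
import math

def f(n):
    return 100 * n * n

for c in (1.0, 0.5, 0.01):
    n0 = 1
    n = n0 + math.ceil(100 / c)   # guarantees n > 100 / c
    assert f(n) < c * n ** 3
print("100n^2 eventually falls below c * n^3 for every c")
```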

4. Theta notation (big-Theta)

4.1 Definition

f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

The definition of Θ(g(n)) requires every member f(n) ∈ Θ(g(n)) to be asymptotically non-negative, and g(n) is an asymptotically tight bound of f(n).
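Combining both directions, a Theta bound sandwiches the function between two constant multiples of g(n). For the running example 3n^2 + 4n + 5 (a sketch; c1 = 3 and c2 = 12 are one valid pair of constants):

```python
def f(n):
    return 3 * n * n + 4 * n + 5

# Theta(n^2): c1 * n^2 <= f(n) <= c2 * n^2 with c1 = 3, c2 = 12, for all n >= 1.
assert all(3 * n * n <= f(n) <= 12 * n * n for n in range(1, 10_001))
print("3 * n^2 <= f(n) <= 12 * n^2 holds for n = 1..10000")
```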

Origin: blog.csdn.net/NickHan_cs/article/details/108454972