Introduction to Algorithms, Part One, Chapter 3: Growth of Functions

When the input size is large enough that only the order of growth of the running time matters, we are studying the asymptotic efficiency of algorithms:
how the running time of an algorithm increases with the size of the input in the limit, as the input size grows without bound.
For all but small inputs, an asymptotically more efficient algorithm is usually the best choice.

Asymptotic notation applies to functions. It most often describes an algorithm's running time, but it can describe other aspects of an algorithm as well, such as the amount of space it uses.

Mathematical background

Summation notation:

  n
  Σ  a(k)
 k=i

Here i is the lower bound and n is the upper bound: the index k runs from i up to n, and all the terms a(k) are summed.

Θ-notation

For a given function g(n), Θ(g(n)) denotes the following set of functions:

Θ(g(n)) = {
  f(n) : there exist positive constants c1, c2, and n0 such that
         0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0
}

A function f(n) belongs to the set Θ(g(n)) if there exist positive constants c1 and c2 such that, for sufficiently large n, f(n) can be sandwiched between c1*g(n) and c2*g(n).
Although membership is written f(n) ∈ Θ(g(n)), we usually write f(n) = Θ(g(n)).
For all n at or to the right of n0, the value of f(n) lies on or between c1*g(n) and c2*g(n);
in other words, for all n >= n0, f(n) equals g(n) to within a constant factor.
We say that g(n) is an *** asymptotically tight bound *** for f(n).

From here on, we assume that every function used within Θ-notation is asymptotically nonnegative.

Proof 1

Use the formal definition to show that n^2/2 - 3n = Θ(n^2).
We must determine positive constants c1, c2, and n0 such that

c1*n^2 <= n^2/2 - 3n <= c2*n^2    for all n >= n0.

Dividing through by n^2 gives

c1 <= 1/2 - 3/n <= c2.

Any choice of c2 >= 1/2 makes the right-hand inequality hold for all n >= 1, and any choice of c1 <= 1/14 makes the left-hand inequality hold for all n >= 7.
Finally, choose c1 = 1/14, c2 = 1/2, and n0 = 7.
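
To see these constants at work, here is a minimal Python sanity check (a numeric spot check under the constants chosen above, not a substitute for the proof):

```python
# Spot-check n^2/2 - 3n = Θ(n^2) with c1 = 1/14, c2 = 1/2, n0 = 7.
c1, c2, n0 = 1/14, 1/2, 7

for n in range(n0, 1_000_000, 997):      # sample values of n >= n0
    f = n * n / 2 - 3 * n
    assert 0 <= c1 * n * n <= f <= c2 * n * n, n

print("sandwich holds on all sampled n >= 7")
```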

As another exercise, use proof by contradiction to show that 6n^3 ≠ Θ(n^2): if it were, there would be a constant c2 > 0 with 6n^3 <= c2*n^2 for all n >= n0, i.e. n <= c2/6 for all n >= n0, which cannot hold for arbitrarily large n.

Ο-notation

Ο is read "big-oh". We use Ο-notation to give an asymptotic upper bound on a function, to within a constant factor.

Ο(g(n)) = {
  f(n) : there exist positive constants c and n0 such that
         0 <= f(n) <= c*g(n) for all n >= n0
}

We write f(n) = Ο(g(n)) to indicate that f(n) is a member of the set Ο(g(n)).

Note that f(n) = Θ(g(n)) implies f(n) = Ο(g(n)): Θ-notation is a stronger notion than Ο-notation, so proving that a function is Θ(n^2) also proves that it is Ο(n^2).
In this book, writing f(n) = Ο(g(n)) claims only that some constant multiple of g(n) is an asymptotic upper bound on f(n), with no claim about how tight the bound is; the algorithms literature, however, does not always distinguish asymptotic upper bounds from asymptotically tight bounds.
Using Ο-notation, we can often describe the running time of an algorithm merely by inspecting its overall structure.
For example, the doubly nested loop structure of insertion sort immediately yields an Ο(n^2) upper bound on the worst-case running time.

When we say "the running time is Ο(n^2)", we mean that there is a function f(n) ∈ Ο(n^2) such that for any value of n, the running time on that input is bounded from above by f(n).
Equivalently, the worst-case running time is Ο(n^2).
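
To illustrate reading the bound off the loop structure, here is a standard insertion sort in Python (a sketch for illustration, not code from the book): the outer loop runs at most n times and the inner loop at most n times per outer pass, giving the Ο(n^2) upper bound.

```python
def insertion_sort(a):
    """Sorts the list a in place."""
    for j in range(1, len(a)):            # outer loop: n - 1 iterations
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:      # inner loop: at most j iterations
            a[i + 1] = a[i]               # shift larger elements right
            i -= 1
        a[i + 1] = key
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```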

Ω-notation

Ω-notation provides an asymptotic lower bound.
For a given function g(n), Ω(g(n)) denotes the following set of functions:

Ω(g(n)) = {
  f(n) : there exist positive constants c and n0 such that
         0 <= c*g(n) <= f(n) for all n >= n0
}

We write f(n) = Ω(g(n)) to indicate that f(n) is a member of this set.

Theorem

For any two functions f(n) and g(n), we have f(n) = Θ(g(n)) if and only if f(n) = Ο(g(n)) and f(n) = Ω(g(n)).

In practice, we usually apply this theorem to prove asymptotically tight bounds from asymptotic upper and lower bounds, rather than the other way around.

When we say that the running time of an algorithm is Ω(g(n)), we mean that for any input of size n, as long as n is large enough, the running time on that input is at least a constant multiple of g(n).
For example, the best-case running time of insertion sort is Ω(n), so the running time of insertion sort falls between Ω(n) and Ο(n^2).

Asymptotic notation in equations and inequalities

2n^2 + 3n + 1 = 2n^2 + Θ(n)

When asymptotic notation stands alone on the right-hand side, as in n = Ο(n^2), the equals sign means set membership: n ∈ Ο(n^2).

In general, when asymptotic notation appears in a formula, it stands for an anonymous function whose name we do not care about.
For example, 2n^2 + 3n + 1 = 2n^2 + Θ(n) means 2n^2 + 3n + 1 = 2n^2 + f(n),
where f(n) is some function in the set Θ(n).

*** No matter how the anonymous functions are chosen on the left of the equals sign, there is a way to choose the anonymous functions on the right of the equals sign to make the equation valid. ***
For example:

2n^2 + Θ(n) = Θ(n^2)

This means: for any function f(n) ∈ Θ(n), there exists some function g(n) ∈ Θ(n^2) such that 2n^2 + f(n) = g(n) for all n.
In other words, the right-hand side of an equation provides a coarser level of detail than the left-hand side.

2n^2 + 3n + 1 = 2n^2 + Θ(n) = Θ(n^2)

The first equation says that there is some function f(n) ∈ Θ(n) such that 2n^2 + 3n + 1 = 2n^2 + f(n) for all n.
The second equation says that for any function g(n) ∈ Θ(n), there is some function h(n) ∈ Θ(n^2) such that 2n^2 + g(n) = h(n) for all n.

ο-notation

ο-notation ("little-oh") denotes an upper bound that is not asymptotically tight.

ο(g(n)) = {
  f(n) : for any positive constant c > 0, there exists a constant n0 > 0 such that
         0 <= f(n) < c*g(n) for all n >= n0
}

The difference between ο and Ο: f(n) = Ο(g(n)) requires the bound 0 <= f(n) <= c*g(n) to hold for some constant c > 0, whereas f(n) = ο(g(n)) requires the bound 0 <= f(n) < c*g(n) to hold for every constant c > 0.
For example, 2n = ο(n^2), but 2n^2 ≠ ο(n^2).
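
Intuitively, f(n) = ο(g(n)) means f(n) becomes insignificant relative to g(n) as n grows: the ratio f(n)/g(n) tends to 0. A small Python sketch of the two examples above (the sampled values are my own choice):

```python
# 2n = ο(n^2): the ratio tends to 0.
# 2n^2 is Ο(n^2) but not ο(n^2): the ratio stays at 2.
for n in [10, 100, 1000, 10000]:
    print(n, (2 * n) / n**2, (2 * n**2) / n**2)
```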

ω-notation

ω-notation ("little-omega") denotes a lower bound that is not asymptotically tight.

ω(g(n)) = {
  f(n) : for any positive constant c > 0, there exists a constant n0 > 0 such that
         0 <= c*g(n) < f(n) for all n >= n0
}

For example, n^2 = ω(n). The relation f(n) = ω(g(n)) implies that

lim   f(n)
n→∞   ----  = ∞
      g(n)

if the limit exists.

Function comparison

Assume that f(n) and g(n) are asymptotically positive

Transitivity

f(n) = Θ(g(n)) and g(n) = Θ(h(n)) imply f(n) = Θ(h(n));
f(n) = Ο(g(n)) and g(n) = Ο(h(n)) imply f(n) = Ο(h(n));
f(n) = Ω(g(n)) and g(n) = Ω(h(n)) imply f(n) = Ω(h(n));
f(n) = ο(g(n)) and g(n) = ο(h(n)) imply f(n) = ο(h(n));
f(n) = ω(g(n)) and g(n) = ω(h(n)) imply f(n) = ω(h(n)).

Reflexivity

f(n) = Θ(f(n))
f(n) = Ο(f(n))
f(n) = Ω(f(n))

Symmetry

f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).

Transpose symmetry

f(n) = Ο(g(n)) if and only if g(n) = Ω(f(n));
f(n) = ο(g(n)) if and only if g(n) = ω(f(n)).

Because these properties hold for asymptotic notation, an analogy can be drawn between the asymptotic comparison of two functions f and g and the comparison of two real numbers a and b:

f(n) = Θ(g(n)) is like a = b;
f(n) = Ο(g(n)) is like a <= b;
f(n) = Ω(g(n)) is like a >= b;
f(n) = ο(g(n)) is like a < b;
f(n) = ω(g(n)) is like a > b.

If f(n) = ο(g(n)), we say f(n) is asymptotically smaller than g(n); if f(n) = ω(g(n)), f(n) is asymptotically larger than g(n).

Trichotomy

For any two real numbers a and b, exactly one of the following must hold: a < b, a = b, or a > b.
Although any two real numbers can be compared, not all functions are asymptotically comparable: it may happen that neither f(n) = Ο(g(n)) nor f(n) = Ω(g(n)) holds, as with n and n^(1 + sin n), whose exponent oscillates between 0 and 2.

Exercise 1

If f(n) and g(n) are both asymptotically non-negative functions,
prove that max(f(n), g(n)) = Θ(f(n) + g(n))

Recall the definition: f(n) = Θ(g(n)) means there exist positive constants c1, c2, and n0 such that c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.

Suppose there exist n1 and n2 such that f(n) >= 0 for all n > n1 and g(n) >= 0 for all n > n2, and let n0 = max(n1, n2). For n > n0:

f(n) <= max(f(n), g(n))
g(n) <= max(f(n), g(n))

Adding the two inequalities gives

(f(n) + g(n))/2 <= max(f(n), g(n)).

At the same time, since both functions are nonnegative,

max(f(n), g(n)) <= f(n) + g(n).

Combining these, for all n > n0:

0 <= (f(n) + g(n))/2 <= max(f(n), g(n)) <= f(n) + g(n),

so the definition is satisfied with c1 = 1/2 and c2 = 1.
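
As a numeric sanity check of these constants, a minimal Python sketch (the example functions f and g are my own choice, not part of the exercise):

```python
def f(n): return n * n       # example asymptotically nonnegative function
def g(n): return 10 * n      # another example function

for n in range(1, 10_000):
    lo = (f(n) + g(n)) / 2   # c1 = 1/2 times (f(n) + g(n))
    hi = f(n) + g(n)         # c2 = 1  times (f(n) + g(n))
    assert lo <= max(f(n), g(n)) <= hi

print("c1 = 1/2 and c2 = 1 work on all sampled n")
```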

Standard notation and common functions

Monotonicity

m <= n implies f(m) <= f(n): monotonically increasing
m <= n implies f(m) >= f(n): monotonically decreasing
m < n implies f(m) < f(n): strictly increasing
m < n implies f(m) > f(n): strictly decreasing

Floor and ceiling

⌊x⌋ (floor) denotes the greatest integer less than or equal to x, and ⌈x⌉ (ceiling) denotes the least integer greater than or equal to x. For any real x:

x - 1 < ⌊x⌋ <= x <= ⌈x⌉ < x + 1

Modular arithmetic

For any integer a and any positive integer n, a mod n is the remainder of dividing a by n:

a mod n = a - n * ⌊a/n⌋

It follows that 0 <= a mod n < n.
If a and b leave the same remainder when divided by n, we write a ≡ b (mod n).
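
Python's % operator agrees with this definition for nonnegative a and positive n; a quick check of a mod n = a - n * ⌊a/n⌋ (sample values are arbitrary):

```python
# In Python, // is floor division, so a - n * (a // n) == a % n.
for a, n in [(17, 5), (100, 7), (42, 6)]:
    assert a % n == a - n * (a // n)
    print(f"{a} mod {n} = {a % n}")
```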

Polynomials

Given a nonnegative integer d, a polynomial in n of degree d is a function p(n) of the form

        d
p(n) =  Σ  a(i) * n^i
       i=0

where the constants a(0), a(1), ..., a(d) are the coefficients of the polynomial and a(d) ≠ 0. For an asymptotically positive polynomial p(n) of degree d, we have p(n) = Θ(n^d).

If f(n) = Ο(n^k) for some constant k, we say that f(n) is polynomially bounded.

Exponentials

For all real a > 0, m, and n, we have the following identities:

a^0 = 1
a^1 = a
a^(-1) = 1/a
(a^m)^n = a^(mn)
a^m * a^n = a^(m+n)

For real constants a and b with a > 1,

lim   n^b
n→∞   ----  = 0
      a^n

from which we obtain n^b = ο(a^n): any exponential function with a base strictly greater than 1 grows faster than any polynomial function.

e denotes the base of the natural logarithm, e = 2.71828....
For all real x, e^x = 1 + x + x^2/2! + x^3/3! + ..., from which we get e^x >= 1 + x.

Logarithms

lg n = log_2 n (binary logarithm)
ln n = log_e n (natural logarithm)
lg^k n = (lg n)^k (exponentiation)

For all real a > 0, b > 0, c > 0, and n, where no logarithm base is 1:

a = b^(log_b a)
log_c(ab) = log_c a + log_c b
log_b a^n = n * log_b a
log_b a = log_c a / log_c b
log_b(1/a) = -log_b a
log_b a * log_a b = 1

Changing the base of a logarithm from one constant to another changes the value of the logarithm by only a constant factor.
When we do not care about such constant factors, as inside Ο-notation, we simply write lg n;
computer scientists find 2 the most natural base for logarithms because so many algorithms and data structures split a problem into two parts.

If f(n) = Ο(lg^k n) for some constant k, we say that f(n) is polylogarithmically bounded.

For any constant a > 0,

lg^b n = ο(n^a),

because any positive polynomial function grows faster than any polylogarithmic function.

Factorials

n! = n * (n-1) * ... * 2 * 1
lg(n!) = Θ(n lg n)

Functional iteration

We use the notation f^(i)(n) to denote the function f(n) iteratively applied i times to an initial value of n. For nonnegative integers i,

f^(i)(n) = n                 if i = 0
f^(i)(n) = f(f^(i-1)(n))     if i > 0

For example, if f(n) = 2n, then f^(i)(n) = 2^i * n.
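
A minimal Python sketch of functional iteration (the helper name iterate is mine):

```python
def iterate(f, i, n):
    """Returns f^(i)(n): f applied to n exactly i times."""
    for _ in range(i):
        n = f(n)
    return n

# With f(n) = 2n, f^(i)(n) = 2^i * n:
print(iterate(lambda n: 2 * n, 3, 5))  # 40 == 2**3 * 5
```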

Iterated logarithm

lg* n (read "log star of n") denotes the iterated logarithm.
Let lg^(i) n be defined as above, with f(n) = lg n. Then

lg* n = min{ i >= 0 : lg^(i) n <= 1 }

Simply put, lg* n is the smallest number of times the logarithm must be applied to bring the value down to at most 1.
The iterated logarithm is a very slowly growing function:

lg* 2 = 1
lg* 4 = 2
lg* 16 = 3
lg* 65536 = 4
lg* (2^65536) = 5

So we very rarely encounter an input size n for which lg* n > 5.
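
A direct Python transcription of the definition (the function name lg_star is mine):

```python
from math import log2

def lg_star(n):
    """Iterated logarithm: the smallest i >= 0 with lg^(i)(n) <= 1."""
    i = 0
    while n > 1:
        n = log2(n)
        i += 1
    return i

for n in [2, 4, 16, 65536]:
    print(n, lg_star(n))   # prints 1, 2, 3, 4
```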

Fibonacci numbers

Each Fibonacci number is the sum of the two preceding ones: F(0) = 0, F(1) = 1, and F(i) = F(i-1) + F(i-2) for i >= 2.
Fibonacci numbers are related to the golden ratio a = (1 + √5)/2 and its conjugate b = (1 - √5)/2, the two roots of the equation x^2 = x + 1:

F(i) = (a^i - b^i) / √5
F(i) = ⌊a^i/√5 + 1/2⌋

that is, the i-th Fibonacci number equals a^i/√5 rounded to the nearest integer.
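
A quick Python check that the rounding formula reproduces the recurrence for small i (floating-point arithmetic, so a spot check, not a proof):

```python
from math import floor, sqrt

a = (1 + sqrt(5)) / 2            # golden ratio
fib = [0, 1]
for _ in range(20):
    fib.append(fib[-1] + fib[-2])

for i, F in enumerate(fib):
    assert F == floor(a**i / sqrt(5) + 1/2)
print("closed form matches the recurrence")
```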

Exercise 2

Prove that a^(log_b c) = c^(log_b a).

By the change-of-base formula, log_b c = log_a c / log_a b, and 1/log_a b = log_b a, so log_b c = log_a c * log_b a. Therefore

a^(log_b c) = a^(log_a c * log_b a) = (a^(log_a c))^(log_b a) = c^(log_b a).
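
A numeric spot check of the identity in Python (the sample constants are arbitrary):

```python
from math import log

a, b, c = 3.0, 2.0, 7.0
lhs = a ** log(c, b)     # a^(log_b c)
rhs = c ** log(a, b)     # c^(log_b a)
print(lhs, rhs)          # equal up to floating-point rounding
assert abs(lhs - rhs) < 1e-9
```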

Source: blog.csdn.net/qq_29334605/article/details/112998068