Introduction to Algorithms Study Notes, Chapter 3: Growth of Functions

When the input size is large enough, we study the asymptotic efficiency of an algorithm: how its running time grows as the input size increases without bound, in the limit.

The following asymptotic notation is mainly used to describe the running time of the algorithm:
1. Θ notation
Given a function g(n), Θ(g(n)) denotes the following set of functions:

Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }

If there exist positive constants c1 and c2 such that, for sufficiently large n, f(n) can be sandwiched between c1·g(n) and c2·g(n), then f(n) belongs to the set Θ(g(n)).

For all n ≥ n0, f(n) equals g(n) to within constant factors; we call g(n) an asymptotically tight bound for f(n).
Every function used within Θ notation is assumed to be asymptotically nonnegative (that is, f(n) ≥ 0 whenever n is sufficiently large); the same assumption applies to the other asymptotic notations in this chapter.
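As a concrete sketch of the sandwich in the Θ definition, the check below uses the textbook example f(n) = n²/2 - 3n with one valid choice of witnesses (c1 = 1/14, c2 = 1/2, n0 = 7; the witnesses are a choice, not part of the definition):

```python
def f(n):
    # Example function claimed to be Theta(n^2): f(n) = n^2/2 - 3n.
    return n * n / 2 - 3 * n

def g(n):
    return n * n

# Chosen witnesses: c1 = 1/14, c2 = 1/2, n0 = 7.
c1, c2, n0 = 1 / 14, 1 / 2, 7

# Verify the sandwich 0 <= c1*g(n) <= f(n) <= c2*g(n) for a range of n >= n0.
for n in range(n0, 10_000):
    assert 0 <= c1 * g(n) <= f(n) <= c2 * g(n)
```

Any larger c2 or smaller positive c1 would also work; the definition only asks that some such constants exist.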
2. O notation
O notation describes an asymptotic upper bound:

O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }

f(n) = Θ(g(n)) implies f(n) = O(g(n)).

The Θ(n²) worst-case running time of insertion sort does not imply that insertion sort runs in Θ(n²) on every input. For example, when the input is already sorted, insertion sort runs in Θ(n) time.

The O(n²) worst-case running time bound for insertion sort, however, applies to its running time on every input.

Strictly speaking, it is an abuse of notation to say "the running time of insertion sort is O(n²)", because for a given n the actual running time varies with the particular input of size n. What we mean is that there is a function f(n) in O(n²) such that, for any n and any input of size n, the running time is bounded above by f(n); that is, the worst-case running time is O(n²).
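To make the gap between the Θ(n²) worst case and the Θ(n) best case concrete, here is a sketch of insertion sort instrumented with a step counter (the counter and function name are illustrative additions, not from the text):

```python
def insertion_sort(a):
    """Sort list a in place; return the number of inner-loop shifts performed."""
    steps = 0
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        # Shift elements larger than key one position to the right.
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
            steps += 1
        a[i + 1] = key
    return steps

n = 1000
best = insertion_sort(list(range(n)))          # already sorted: Theta(n) work
worst = insertion_sort(list(range(n, 0, -1)))  # reverse sorted: Theta(n^2) work
print(best)   # 0 shifts: the inner loop never runs
print(worst)  # n*(n-1)/2 = 499500 shifts
```

On the sorted input the inner loop never executes, so total work is linear; on the reverse-sorted input every element is shifted past all of its predecessors, giving the quadratic worst case.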
3. Ω notation
Ω notation provides an asymptotic lower bound:

Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }

For any two functions f(n) and g(n), f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

The best-case running time of insertion sort is Ω(n), which implies that the running time of insertion sort is Ω(n). So the running time of insertion sort falls between Ω(n) and O(n²).
4. o notation
The upper bound provided by O notation may or may not be asymptotically tight. For example, 2n² = O(n²) is asymptotically tight, but 2n = O(n²) is not. We use o notation to denote an upper bound that is not asymptotically tight:

o(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0 }

In o notation, f(n) becomes insignificant relative to g(n) as n approaches infinity:

lim(n→∞) f(n)/g(n) = 0
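A quick numerical sketch of this vanishing ratio for f(n) = 2n and g(n) = n² (the sample points are arbitrary choices):

```python
# f(n) = 2n is o(n^2): the ratio f(n)/g(n) = 2/n shrinks toward 0 as n grows.
def ratio(n):
    return (2 * n) / (n * n)

for n in (10, 100, 1000, 10_000, 100_000):
    print(n, ratio(n))  # ratio keeps shrinking toward 0
```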
5. ω notation
ω notation denotes a lower bound that is not asymptotically tight:

ω(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0 }

f(n) = ω(g(n)) implies:

lim(n→∞) f(n)/g(n) = ∞, if the limit exists.

Equivalently, f(n) ∈ ω(g(n)) if and only if g(n) ∈ o(f(n)).

Asymptotic comparisons satisfy transitivity (for all five notations), reflexivity (for Θ, O, Ω), symmetry (for Θ), and transpose symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n)), and f(n) = o(g(n)) if and only if g(n) = ω(f(n)).
The asymptotic comparison of two functions f and g is analogous to the comparison of two real numbers a and b:

f(n) = O(g(n)) is like a ≤ b
f(n) = Ω(g(n)) is like a ≥ b
f(n) = Θ(g(n)) is like a = b
f(n) = o(g(n)) is like a < b
f(n) = ω(g(n)) is like a > b

However, one property of real numbers does not carry over to asymptotic notation: trichotomy. For any two real numbers a and b, exactly one of a < b, a = b, or a > b must hold, but not every pair of functions is asymptotically comparable.
For two functions f(n) and g(n), it may be that neither f(n) = O(g(n)) nor f(n) = Ω(g(n)) holds; for example, the functions n and n^(1+sin n).
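A rough numerical illustration of why n and n^(1+sin n) are incomparable (the sampling range is an arbitrary choice): the exponent 1 + sin n keeps returning to values near 0 and near 2, so n^(1+sin n) dips below and climbs above any fixed power of n infinitely often.

```python
import math

# For f(n) = n ** (1 + sin(n)), the exponent 1 + sin(n) oscillates within [0, 2]
# forever, so f is neither O(n) nor Omega(n).
exponents = [1 + math.sin(n) for n in range(1, 10_000)]
print(min(exponents))  # near 0: f(n) dips toward n^0 = 1
print(max(exponents))  # near 2: f(n) climbs toward n^2
```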

If f(n) = o(g(n)), we say f(n) is asymptotically smaller than g(n); if f(n) = ω(g(n)), we say f(n) is asymptotically larger than g(n).

Any exponential function with a base greater than 1 grows faster than any polynomial function.
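This can be sketched numerically by comparing logarithms, which avoids overflow (the particular base 1.1 and exponent 10 are arbitrary choices): even a slowly growing exponential 1.1^n eventually overtakes the polynomial n^10.

```python
import math

# Compare ln(n^10) = 10*ln(n) against ln(1.1^n) = n*ln(1.1).
# The exponential wins once n*ln(1.1) exceeds 10*ln(n).
for n in (10, 100, 500, 1000):
    poly = 10 * math.log(n)
    expo = n * math.log(1.1)
    print(n, poly < expo)  # False for small n, True once the exponential takes over
```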

According to the change-of-base formula, logarithms with different bases differ only by a constant factor:

log_2 x / log_10 x = (log_c x / log_c 2) / (log_c x / log_c 10) = log_c 10 / log_c 2

where c is any constant base. Therefore, when we do not care about such constant factors, we often simply write lg n (as inside O notation).
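The constant-factor relationship is easy to check numerically (the sample values of x are arbitrary):

```python
import math

# log_2(x) / log_10(x) is independent of x: it always equals log(10)/log(2).
for x in (2.0, 10.0, 12345.678, 1e6):
    print(x, math.log2(x) / math.log10(x))  # same constant, about 3.3219, every time

print(math.log(10) / math.log(2))  # the constant itself
```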

For logarithms, for all real a > 0, b > 0, c > 0, and n (with logarithm bases not equal to 1):

a = b^(log_b a)
log_c(ab) = log_c a + log_c b
log_b a^n = n·log_b a
log_b a = log_c a / log_c b
log_b (1/a) = −log_b a
log_b a = 1 / (log_a b)
a^(log_b c) = c^(log_b a)

Any polynomial function with a positive exponent grows faster than any polylogarithmic function.
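A small numerical sketch of the polynomial-beats-polylog claim (the exponents 3 and 0.1 are arbitrary choices): even the tiny power n^0.1 eventually overtakes (lg n)^3.

```python
import math

# Compare (lg n)^3 with n^0.1 at a few sample points.
for n in (10, 10**6, 10**12, 10**100):
    polylog = math.log2(n) ** 3
    poly = n ** 0.1
    print(n, polylog < poly)  # False for moderate n, True for huge n
```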

Iterated logarithm: lg* n = min { i ≥ 0 : lg^(i) n ≤ 1 }, where lg^(i) n denotes the lg function applied i times in succession to n.
The iterated logarithm is a very slowly growing function.
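A sketch of how slowly it grows (the function name lg_star is an illustrative choice):

```python
import math

def lg_star(n):
    """Iterated logarithm: how many times lg must be applied before the result <= 1."""
    count = 0
    while n > 1:
        n = math.log2(n)
        count += 1
    return count

for n in (2, 4, 16, 65536):
    print(n, lg_star(n))  # 1, 2, 3, 4
print(lg_star(2 ** 65536))  # 5: even this astronomically large n gives a tiny value
```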

The Fibonacci numbers grow exponentially: F_i = (φ^i − ψ^i)/√5, where φ = (1 + √5)/2 is the golden ratio and ψ = (1 − √5)/2 is its conjugate.
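A small sketch of the exponential growth (the helper fib is an illustrative addition): the ratio of consecutive Fibonacci numbers converges to the golden ratio φ ≈ 1.618, so F_i grows roughly like φ^i.

```python
import math

def fib(k):
    """First k Fibonacci numbers, starting F_1 = F_2 = 1 (assumes k >= 2)."""
    a, b = 1, 1
    out = [a, b]
    while len(out) < k:
        a, b = b, a + b
        out.append(b)
    return out

phi = (1 + math.sqrt(5)) / 2  # golden ratio, about 1.618
f = fib(30)
print(f[-1] / f[-2])  # ratio of consecutive terms, already very close to phi
print(phi)
```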

Origin blog.csdn.net/tus00000/article/details/114796719