Understanding the time complexity of algorithms: O(1), O(n), O(log2n), O(n^2)

An algorithm's complexity is divided into time complexity and space complexity, and both are key indicators for judging whether code is good or bad:

  • Time complexity: the computational work required to execute the algorithm;
  • Space complexity: the memory space required to execute the algorithm.

An algorithm's complexity reflects how many computer resources it needs while running. The most important computer resources are time and space (i.e., memory), so complexity is divided into time complexity and space complexity.

1. Conceptual understanding

1.1 Basic execution count: T(n)

Because the runtime environment and input size vary, the absolute execution time of code cannot be calculated, but we can estimate the number of times its basic operations are executed.

In general, the number of times an algorithm's basic operations are repeated is a function of the problem size n. This function expresses relative running time and is written T(n).

1.2 Time complexity: O(n)

Because execution details are uncertain (all four scenarios listed below are possible), T(n) alone is not enough to analyze and compare the running time of code. This is where the concept of asymptotic time complexity comes in; its formal definition is as follows:

If there exists a function f(n) such that, as n approaches infinity, the limit of T(n)/f(n) is a nonzero constant, then f(n) is said to be a function of the same order as T(n). This is written T(n) = O(f(n)), and O(f(n)) is called the asymptotic time complexity of the algorithm, or time complexity for short.
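To make the definition concrete, consider a hypothetical operation count (the functions below are illustrative, not from the original post): if T(n) = 3n^2 + 2n + 1 and we choose f(n) = n^2, the ratio T(n)/f(n) approaches the nonzero constant 3 as n grows, so T(n) = O(n^2). A quick sketch in Python:

```python
def T(n):
    # hypothetical basic-operation count: 3n^2 + 2n + 1
    return 3 * n**2 + 2 * n + 1

def f(n):
    # candidate same-order function
    return n**2

# the ratio approaches the constant 3 as n grows, so T(n) = O(n^2)
for n in (10, 1_000, 100_000):
    print(n, T(n) / f(n))
```

The lower-order terms 2n and 1 become negligible, which is why Big O keeps only the fastest-growing term and drops constant factors.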

Because asymptotic time complexity is written with a capital "O", it is also known as Big O notation.

This gives us the familiar complexity classes O(1), O(n), O(logn), O(nlogn), and O(n^2). (The original post includes a chart plotting their growth curves.)

Note that O notation is used not only for time complexity but also for space complexity.

The function in the parentheses after O describes how an algorithm's time consumption (or space consumption) grows, where n denotes the size of the input data.

1.3 Space complexity: S(n)

Like time complexity, space complexity measures the storage space an algorithm needs while executing inside a computer, written S(n) = O(f(n)).
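A minimal sketch of the difference between O(1) and O(n) extra space, in Python (the function names are illustrative):

```python
def sum_in_place(nums):
    """O(1) extra space: only one accumulator, no matter how long nums is."""
    total = 0
    for x in nums:
        total += x
    return total

def doubled_copy(nums):
    """O(n) extra space: builds a new list the same size as the input."""
    return [2 * x for x in nums]

print(sum_in_place([1, 2, 3]))   # -> 6
print(doubled_copy([1, 2, 3]))   # -> [2, 4, 6]
```

Both functions take O(n) time, but only the second one's memory footprint grows with the input.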

As mentioned above, O(n) notation is used not only to express time complexity but also space complexity.

2. Scenario analysis

The following scenarios analyze time complexity. In order of growth: O(1) < O(log2n) < O(n) < O(n^2).

Scenario 1: T(n) = O(1)

This means the algorithm's running time is a constant, the lowest possible time or space complexity: the time/space consumed has nothing to do with the size of the input data. No matter how many times the input grows, the cost stays the same.

Hashing is a typical O(1) algorithm: no matter how large the data set, the target can be located after a single computation (ignoring collisions).
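A Python dict is hash-based, so lookup illustrates the point: the cost of one lookup does not grow with the size of the table (the key names below are made up for the example):

```python
# a Python dict is hash-based: lookup cost does not grow with table size
phone_book = {f"user{i}": i for i in range(100_000)}

# one hash computation locates the slot, whether there are
# 10 entries or 100,000 entries (collisions aside)
print(phone_book["user99999"])  # -> 99999
```

Building the dict is O(n), of course; it is the lookup itself that is O(1).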

Scenario 2: T(n) = O(log2n)

When the data grows n times larger, the cost grows only log n times (the base is 2 here; for example, when the data grows 256-fold, the cost grows only 8-fold). This is lower than linear time complexity.

Binary search is an O(log n) algorithm: each comparison rules out half of the remaining candidates, so searching 256 items takes at most 8 comparisons to find the target.
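A standard binary search over a sorted list makes the halving behavior visible (a straightforward sketch, not code from the original post):

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent.

    Each comparison halves the search range, so at most about
    log2(n) + 1 comparisons are needed: O(log n).
    """
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1        # target lies in the upper half
        else:
            high = mid - 1       # target lies in the lower half
    return -1

data = list(range(256))          # 256 sorted values
print(binary_search(data, 200))  # found within 8 halvings -> 200
```

The precondition is that the input is already sorted; on unsorted data the result is meaningless.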

Scenario 3: T(n) = O(n)

This means the algorithm is linear: when the amount of data grows several times, the cost grows by the same factor.

For example, a common for loop that finds the largest number in an array must scan all n elements once, so the number of operations is n and the complexity of the algorithm is O(n).
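That loop can be sketched as follows (a minimal illustration; the function name is ours):

```python
def find_max(nums):
    """Scan every element once to find the maximum: n comparisons, O(n)."""
    largest = nums[0]
    for x in nums[1:]:    # one pass over the remaining n - 1 elements
        if x > largest:
            largest = x
    return largest

print(find_max([3, 7, 1, 9, 4]))  # -> 9
```

Doubling the array length doubles the number of comparisons, which is exactly what linear complexity means.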

Scenario 4: T(n) = O(n^2)

This means that when the amount of data grows n times, the cost grows n^2 times; this is higher than linear time complexity.

Bubble sort, for example, is a typical O(n^2) algorithm: sorting n numbers requires scanning them roughly n × n times.
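The nested loops of bubble sort show where the n × n cost comes from (a textbook sketch, sorting a copy so the input is untouched):

```python
def bubble_sort(nums):
    """Classic bubble sort: nested loops give roughly n * n comparisons, O(n^2)."""
    a = list(nums)               # work on a copy, leave the input unchanged
    n = len(a)
    for i in range(n - 1):       # after pass i, the last i + 1 slots are sorted
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:  # swap adjacent out-of-order pairs
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

print(bubble_sort([5, 2, 4, 1, 3]))  # -> [1, 2, 3, 4, 5]
```

The outer loop runs n - 1 times and the inner loop up to n - 1 times per pass, hence on the order of n^2 comparisons in total.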

The programming world holds a great variety of algorithms. Beyond these four scenarios there are many other forms of time complexity; arranged by increasing order of magnitude, they rank as follows:

Constant order O(1) < logarithmic order O(log2n) < linear order O(n) < linearithmic order O(nlog2n) < quadratic order O(n^2) < cubic order O(n^3) < k-th power order O(n^k) < exponential order O(2^n) ......

3. Comparison of sorting algorithms

| Sorting algorithm | Average time | Worst case | Stability | Extra space | Remark |
| --- | --- | --- | --- | --- | --- |
| Bubble | O(n^2) | O(n^2) | Stable | O(1) | Better when n is small |
| Exchange | O(n^2) | O(n^2) | Unstable | O(1) | Better when n is small |
| Selection | O(n^2) | O(n^2) | Unstable | O(1) | Better when n is small |
| Insertion | O(n^2) | O(n^2) | Stable | O(1) | Best when the input is mostly sorted |
| Radix | O(logRB) | O(logRB) | Stable | O(n) | B is the number of digit values (0–9); R is the number of radix positions (ones, tens, hundreds) |
| Shell | O(nlogn) | O(n^s), 1 < s < 2 | Unstable | O(1) | s depends on the chosen gap sequence |
| Quick | O(nlogn) | O(n^2) | Unstable | O(logn) | Better when n is large |
| Merge | O(nlogn) | O(nlogn) | Stable | O(n) | Better when n is large |
| Heap | O(nlogn) | O(nlogn) | Unstable | O(1) | Better when n is large |

 

Drawing on some official references together with my own understanding, I have put together a fairly comprehensive post on time complexity, covering everything from concepts to underlying principles to a summary of the algorithms. I hope it helps.

 




Origin blog.csdn.net/weixin_44259720/article/details/104942598