Data Structures - Time Complexity

First, the algorithm: An algorithm is a description of the steps for solving a specific problem. In a computer it is expressed as a finite sequence of instructions, and each instruction represents one or more operations. An algorithm should satisfy the following requirements:

1. Correctness: The correctness of an algorithm means that the algorithm has no ambiguity in its input, output and processing, correctly reflects the requirements of the problem, and can produce the correct answer to the problem.

2. Readability: Another goal of algorithm design is to make the algorithm easy to read, understand and discuss.

3. Robustness: When the input data is invalid, the algorithm can still handle it appropriately instead of producing abnormal or inexplicable results.

4. High time efficiency and low storage requirements.

If a piece of code is executed n times, its operation count is f(n) = n.
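As a minimal illustration (added here, not part of the original text), the sketch below counts how many times a loop body executes; for an input of size n the count is exactly n, so f(n) = n:

#include <stdio.h>

int main(void)
{
    int n = 10;
    int operations = 0;             /* counts executions of the loop body */
    for (int i = 0; i < n; i++)
    {
        operations++;               /* one basic operation per iteration */
    }
    printf("n = %d, f(n) = %d\n", n, operations);   /* prints f(n) = 10 */
    return 0;
}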

Second, the time complexity of the algorithm
In algorithm analysis, the total number of statement executions T(n) is a function of the problem size n. We analyze how T(n) changes as n grows in order to determine the order of magnitude of T(n). The time complexity of the algorithm, that is, its time measure, is written as T(n) = O(f(n)). It means that as the problem size n increases, the growth rate of the algorithm's execution time is the same as the growth rate of f(n); this is called the asymptotic time complexity of the algorithm, or time complexity for short. Here f(n) is some function of the problem size n.

Method for deriving the big-O order
1. Replace all additive constants in the run-time function with the constant 1.
2. In the modified run-time function, keep only the highest-order term.
3. If the highest-order term exists and its coefficient is not 1, remove the coefficient. The result is the big-O order.
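For example, take T(n) = 3n^2 + 2n + 1. The additive constant contributes only a constant, the highest-order term 3n^2 is the only term kept, and removing its coefficient 3 leaves n^2, so T(n) = O(n^2).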

1. Constant order: If f(n) is a constant, the time complexity is written as O(1), as in the short example below.
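A small sketch (not from the original text): the statements below execute a fixed number of times no matter how large n is, so their complexity is O(1).

/* Constant order: the operation count is fixed regardless of n. */
long long n = 100000;
long long sum = (1 + n) * n / 2;    /* Gauss's formula: a handful of operations, whatever n is */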
2. Linear order: To analyze the complexity of an algorithm, the key is to analyze its loop structures. In the for loop below, the sequence of program steps executed in the loop body has time complexity O(1), and the whole loop has time complexity O(n), because the body is executed n times.
for (int i = 0; i < n; i++)
{
    /* program steps with time complexity O(1) */
}
3. Logarithmic order
int count = 1;
while (count < n)
{
    count = count * 2;
    /* program steps with time complexity O(1) */
}
Each time count is multiplied by 2, it gets one step closer to n. In other words, the loop exits once enough factors of 2 have been multiplied together to exceed n. From 2^x = n we get x = log2 n (the base-2 logarithm of n), so the time complexity of this loop is O(log n).
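Binary search is a well-known example of this logarithmic behaviour (added here as an illustration, not taken from the original text): each comparison halves the remaining search range, so at most about log2 n comparisons are needed. A minimal sketch:

/* Binary search in a sorted array: the range is halved on every
   iteration, so the loop runs at most about log2(n) times -> O(log n). */
int binary_search(const int *a, int n, int target)
{
    int low = 0, high = n - 1;
    while (low <= high)
    {
        int mid = low + (high - low) / 2;
        if (a[mid] == target)
            return mid;              /* found: return its index */
        else if (a[mid] < target)
            low = mid + 1;           /* discard the left half */
        else
            high = mid - 1;          /* discard the right half */
    }
    return -1;                       /* not found */
}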
4. Quadratic order
int i, j;
for (i = 0; i < m; i++)
{
    for (j = 0; j < n; j++)
    {
        /* program steps with time complexity O(1) */
    }
}
The outer loop simply repeats the inner loop, whose time complexity is O(n), another m times. So the time complexity of this code is O(m*n).
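A common variant (sketched here as an addition, not from the original text) starts the inner loop at i instead of 0. The body then runs n + (n-1) + ... + 1 = n(n+1)/2 = n^2/2 + n/2 times, which the big-O rules above still reduce to O(n^2):

int i, j;
for (i = 0; i < n; i++)
{
    for (j = i; j < n; j++)          /* inner loop starts at i, not 0 */
    {
        /* program steps with time complexity O(1) */
    }
}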

Common time complexities
Execution count function    Big-O order    Informal term
12                          O(1)           constant order
2n + 3                      O(n)           linear order
3n^2 + 2n + 1               O(n^2)         quadratic order
5 log2 n + 20               O(log n)       logarithmic order
2n + 3n log2 n + 19         O(n log n)     n log n order
6n^3 + 2n^2 + 3n + 4        O(n^3)         cubic order
2^n                         O(2^n)         exponential order
The time consumed by the common time complexities, ordered from least to most, is:
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(n!) < O(n^n)
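To make this ordering concrete, the short sketch below (an illustration added here, not part of the original) prints the value of each growth function for a few sizes of n; the gap between the orders widens quickly as n grows:

#include <stdio.h>
#include <math.h>

int main(void)
{
    int sizes[] = {1, 10, 20};
    for (int k = 0; k < 3; k++)
    {
        double n = sizes[k];
        /* compare log n, n, n log n, n^2 and 2^n for the same n */
        printf("n=%2.0f  log n=%5.2f  n=%4.0f  n log n=%7.2f  n^2=%5.0f  2^n=%8.0f\n",
               n, log2(n), n, n * log2(n), n * n, pow(2.0, n));
    }
    return 0;
}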

The worst-case running time is a guarantee: the running time will never be worse than it. In applications this is the most important requirement; usually, unless otherwise specified, the running time we refer to is the worst-case running time.
The average running time is the most meaningful of all the cases, because it is the expected running time.
* In general, unless stated otherwise, "time complexity" refers to the worst-case time complexity.
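Linear search (used here only as an illustration) shows the difference: if the target is the first element, one comparison suffices (best case); on average about n/2 comparisons are needed; and if the target is last or absent, all n elements are examined (worst case). Its complexity is therefore quoted as O(n):

/* Linear search: best case 1 comparison, average about n/2, worst case n.
   Unless stated otherwise, its complexity is quoted as the worst case, O(n). */
int linear_search(const int *a, int n, int target)
{
    for (int i = 0; i < n; i++)
    {
        if (a[i] == target)
            return i;                /* found after i + 1 comparisons */
    }
    return -1;                       /* worst case: all n elements examined */
}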

Third, the space complexity of the algorithm
The space complexity of an algorithm is measured by the storage space the algorithm requires. It is written as S(n) = O(f(n)), where n is the problem size and f(n) is a function of n describing the storage space occupied by the statements.
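As an illustration (not from the original text), reversing an array in place needs only a few temporary variables, i.e. O(1) extra space, whereas building a reversed copy needs an auxiliary array of length n, i.e. O(n) space:

#include <stdlib.h>

/* In-place reversal: only a few temporaries are used -> S(n) = O(1). */
void reverse_in_place(int *a, int n)
{
    for (int i = 0, j = n - 1; i < j; i++, j--)
    {
        int tmp = a[i];
        a[i] = a[j];
        a[j] = tmp;
    }
}

/* Reversal into a new buffer: an auxiliary array of length n is required,
   so the extra space grows with the input -> S(n) = O(n). */
int *reverse_copy(const int *a, int n)
{
    int *b = malloc(n * sizeof(int));   /* caller must free() the result */
    if (b != NULL)
        for (int i = 0; i < n; i++)
            b[i] = a[n - 1 - i];
    return b;
}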
