Game algorithms: the time complexity of an algorithm

As a rookie programmer, I was once asked by the interviewer during an interview: what is time complexity?
Me: Time complexity is roughly how long a method takes to run! (I didn't actually understand it, but at least I knew that much.)
Interviewer: How is it calculated?
Me: I've read about it, but never really understood it. (Awkward!)
Interviewer: So you haven't really studied it?

What could I say? Before the interview I had specifically looked up articles on the topic, but I really did not understand them!!!
After starting the job, I decided to study algorithm complexity properly and record the process of learning it here!!!

The same problem can have different solutions: some methods are fast, others are slow. The same is true of algorithms, and the quality of an algorithm affects the efficiency of both the algorithm itself and the whole program.

The quality of an algorithm is mainly measured by its time complexity and space complexity.
Time complexity:
The time an algorithm takes to run cannot, in theory, be calculated exactly, but we do know that the more statement executions an algorithm performs, the more time it takes. The number of times the statements of an algorithm are executed is called the statement frequency, written T(n). The time complexity of an algorithm refers to the amount of computation required to execute it.
T(n) is a function and n is its independent variable; in other words, T is a function of n.
n is called the size of the problem; as n changes, T(n) changes with it.
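
To make "statement frequency" concrete, here is a minimal Lua sketch (sumTo is a made-up helper used only for illustration, not something from the original article); the comments count how often each statement runs for an input of size n:

function sumTo(n)
	local total = 0        -- runs 1 time
	for i = 1, n do        -- the loop body runs n times
		total = total + i  -- runs n times
	end
	return total           -- runs 1 time
end

-- T(n) = 1 + n + 1 = n + 2, which (with the O notation introduced below) is O(n).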

The time complexity is usually written with the capital letter O. The O notation follows these rules:
1. Ignore constant terms
Expression: O(c) = O(1). Example: O(10) = O(1).
For example, T(n) = n + 50 is the running time of some computation; when n = 1024, the constant term accounts for only about 4.8% (50 / n) of the running time.
2. Ignore constant coefficients
Expression: O(cT) = cO(T) = O(T), where c is a constant. Example: O(10n) = O(n).
For example, T(n) = n^2 (1) and T(n) = 10n (2) are the running times of two algorithms; once n is greater than 10, (1) becomes larger than (2).
3. Ignore lower-order terms
Expression: O(T1) + O(T2) = max(O(T1), O(T2)). Example: O(n) + O(n^2) = O(n^2).
For example, T(n) = n^2 + n is the running time of some algorithm; when n is 1024, the lower-order term accounts for less than 0.1% of the running time.

Why? Consider this example:
Running time of an algorithm: T(n) = 3n^2 + 10n + 10
Its time complexity is:
O(T(n)) = O(3n^2 + 10n + 10) = O(3n^2) = O(n^2)

When n = 10:
3n^2 / T(n) = 73.2%
10n / T(n) = 24.4%
10 / T(n) = 2.4%

When n = 100:
3n^2 / T(n) = 96.7%
10n / T(n) = 3.2%
10 / T(n) < 0.1%

As you can see, the n^2 term accounts for almost the entire running time.
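
These percentages are easy to verify with a few lines of Lua (a quick sketch of my own; the function name termShares is made up for this example):

-- Print each term's share of T(n) = 3n^2 + 10n + 10 for a given n.
local function termShares(n)
	local quadratic = 3 * n * n
	local linear = 10 * n
	local constant = 10
	local total = quadratic + linear + constant
	print(string.format("n = %d: 3n^2 -> %.1f%%, 10n -> %.1f%%, 10 -> %.1f%%",
		n, 100 * quadratic / total, 100 * linear / total, 100 * constant / total))
end

termShares(10)   -- prints: n = 10: 3n^2 -> 73.2%, 10n -> 24.4%, 10 -> 2.4%
termShares(100)  -- prints: n = 100: 3n^2 -> 96.7%, 10n -> 3.2%, 10 -> 0.0%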

Ordered by growth rate, the common time complexities are:
Constant order: O(1) — getting the first element of a data set
Logarithmic order: O(log2 n) — splitting a data set in half, then splitting half of it in half again, and so on (log base 2 of n)
Logarithm: if x^y = n, then y is the logarithm of n to base x; taking a logarithm is the inverse of exponentiation
Linear order: O(n) — traversing a data set
Linearithmic order: O(n log2 n) — splitting a data set in half, splitting each half in half again, and so on, while traversing each half as you go
Quadratic order: O(n^2) — for each element of a data set, traversing another data set of the same size
Exponential order: O(2^n) — generating all possible subsets of a data set
Factorial order: O(n!) — generating all possible permutations of a data set
As the problem size n grows, the time complexities above grow as well, and the algorithm becomes less and less efficient.
Figure: growth curves of the common time complexities.
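
To tie a few of these classes to code, here is a small Lua sketch (the function names are my own and are used only for illustration): a constant-time lookup for O(1), a binary search for O(log2 n), and a linear scan for O(n).

-- O(1): fetching the first element does not depend on the list size.
local function firstElement(list)
	return list[1]
end

-- O(log2 n): binary search on a sorted list; each step halves the
-- range that still has to be searched.
local function binarySearch(sorted, target)
	local low, high = 1, #sorted
	while low <= high do
		local mid = math.floor((low + high) / 2)
		if sorted[mid] == target then
			return mid
		elseif sorted[mid] < target then
			low = mid + 1
		else
			high = mid - 1
		end
	end
	return nil
end

-- O(n): a linear scan visits every element once.
local function contains(list, target)
	for i = 1, #list do
		if list[i] == target then
			return true
		end
	end
	return false
end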
So how is time complexity actually analysed?
The time an algorithm spends = the sum, over every statement, of
(number of times the statement is executed (its frequency)) * (time needed to execute that statement once).
The time a single statement takes depends on factors such as code quality, machine performance and the instructions generated by the compiler, so we usually assume that each execution of a statement takes 1 unit of time.
The total time of an algorithm is then:
time = the sum of the frequencies of all its statements

For example: find the time complexity of bubble sort

function BubbleSort(list)
	-- Outer loop: after pass i, the i largest elements are in their final place.
	for i = 1, #list - 1 do
		-- Inner loop: compare neighbours in the part that is still unsorted.
		for j = 1, #list - i do
			if list[j] > list[j+1] then
				-- Swap adjacent elements that are out of order.
				list[j], list[j+1] = list[j+1], list[j]
			end
		end
	end
end
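
A quick usage check of the function above (a minimal sketch):

local numbers = {5, 3, 8, 1, 4}
BubbleSort(numbers)
print(table.concat(numbers, ", "))  -- 1, 3, 4, 5, 8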

The bubble sort algorithm contains two loops: an outer loop and an inner loop.
Assume the array we pass in has length n. The outer loop body then executes n-1 times (strictly speaking the for statement itself is evaluated n-1+1 times, because of the extra comparison needed to decide whether to keep looping).
The inner loop executes:
i = 1: n' = n - 1 times
i = 2: n' = n - 2 times
...
i = n-1: n' = n - (n-1) = 1 time
To make this concrete, suppose n = 5; the inner loop then runs
4
3
2
1
times. This is clearly an arithmetic series, and by the summation formula
(first term + last term) * number of terms / 2 = ((n-1) + 1) * (n-1) / 2 = (n^2 - n) / 2
so, from the above:
T(n) = (n^2 - n) / 2
Applying the rules of the O notation:
1. Ignore constant terms
2. Ignore constant coefficients
3. Ignore lower-order terms
we get O(T(n)) = O(n^2)
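
As a sanity check on this derivation (a small sketch of my own, not from the original post), we can count the comparisons the two loops actually perform and compare the result with (n^2 - n) / 2:

-- Count the comparisons performed by the two bubble-sort loops
-- for an input of size n.
local function countComparisons(n)
	local comparisons = 0
	for i = 1, n - 1 do
		for j = 1, n - i do
			comparisons = comparisons + 1
		end
	end
	return comparisons
end

print(countComparisons(5), (5 * 5 - 5) / 2)          -- 10    10
print(countComparisons(100), (100 * 100 - 100) / 2)  -- 4950  4950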

Another example: selection sort:

function SelectionSort(t)
	for i = 1, #t - 1 do
		-- Assume the element at position i is the smallest of the unsorted part.
		local pos = i
		-- Look through the rest of the unsorted part for a smaller element.
		for j = i + 1, #t do
			if t[j] < t[pos] then
				pos = j
			end
		end
		-- Move the smallest remaining element into position i.
		t[pos], t[i] = t[i], t[pos]
	end
end

The outer loop runs n-1 times, and the inner loop runs n-1, n-2, ..., 1 times; summing this arithmetic series again gives (n^2 - n) / 2, so the worst-case time complexity is O(n^2).
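
And a quick usage check (a minimal sketch):

local scores = {42, 7, 19, 88, 3}
SelectionSort(scores)
print(table.concat(scores, ", "))  -- 3, 7, 19, 42, 88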

Space complexity:
Space complexity measures the amount of memory an algorithm needs while it executes, written S(n) = O(f(n)).
The memory required during the execution of an algorithm consists of three parts:
1. the space occupied by the program (the algorithm itself)
2. the space occupied by the initial input data
3. any additional space the algorithm needs as it runs (illustrated in the sketch below)
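
As a small illustration of point 3 (a sketch of my own; the function names are made up for this example), reversing a list in place needs only O(1) extra space, while building a reversed copy needs O(n) extra space:

-- O(1) extra space: reverse the list in place; only a couple of
-- temporary variables are used, regardless of the list size.
local function reverseInPlace(t)
	local i, j = 1, #t
	while i < j do
		t[i], t[j] = t[j], t[i]
		i, j = i + 1, j - 1
	end
	return t
end

-- O(n) extra space: build a new reversed list the same size as the input.
local function reversedCopy(t)
	local result = {}
	for i = #t, 1, -1 do
		result[#result + 1] = t[i]
	end
	return result
end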

Source: blog.csdn.net/qq_18192161/article/details/89461127