Lecture on the time complexity of algorithms

Table of contents

1. Algorithm efficiency
   1.1 How to measure the quality of an algorithm
   1.2 Complexity of an algorithm
2. Time complexity
   2.1 The concept of time complexity
   2.2 Big O asymptotic notation
   2.3 Common time complexity calculation examples
3. Space complexity


1. Algorithm efficiency

1.1 How to measure the quality of an algorithm

long long Fib(int N)
{
	if (N < 3)
		return 1;

	return Fib(N - 1) + Fib(N - 2);
}

The recursive implementation of the Fibonacci sequence is very simple. But is simple code necessarily good code? How do we measure whether an algorithm is good or bad?

1.2 Complexity of an algorithm

After the algorithm is written into an executable program, it requires time resources and space (memory) resources to run. Therefore, the quality of an algorithm is generally measured from two dimensions: time and space, namely time complexity and space complexity.
Time complexity mainly measures how fast an algorithm runs, while space complexity mainly measures the extra space the algorithm needs while running. In the early days of computing, machines had very little storage, so space complexity mattered a great deal. As the computer industry developed and storage capacity grew enormously, we usually no longer need to pay special attention to the space complexity of an algorithm.
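To make the "time resource" point concrete, one rough (and machine-dependent) way to compare implementations is simply to run and time them. Below is a minimal, self-contained sketch that repeats the Fib above and uses clock() from the standard <time.h>; the choice N = 40 is arbitrary:

#include <stdio.h>
#include <time.h>

long long Fib(int N)
{
	if (N < 3)
		return 1;
	return Fib(N - 1) + Fib(N - 2);
}

int main(void)
{
	// Measure roughly how long the recursive Fib takes for N = 40.
	clock_t begin = clock();
	long long result = Fib(40);
	clock_t end = clock();
	printf("Fib(40) = %lld, took about %f seconds of CPU time\n",
		result, (double)(end - begin) / CLOCKS_PER_SEC);
	return 0;
}

On most machines the recursive version already takes noticeable time around N = 40, which is exactly the kind of observation the complexity analysis below explains.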

2. Time complexity

2.1 The concept of time complexity

Definition of time complexity: in computer science, the time complexity of an algorithm is a function that quantitatively describes its running time. The actual time an algorithm takes cannot be calculated purely in theory; you only know it once you run the program on a machine. But do we need to test every algorithm on a computer? We could, but that would be very troublesome, which is why time complexity analysis was introduced. The time an algorithm takes is proportional to the number of times its statements are executed, and the number of executions of the basic operations in the algorithm is its time complexity.

That is: finding the mathematical expression relating the execution count of a basic statement to the problem size N is how we calculate the time complexity of the algorithm.

Please calculate how many times the ++count statement in Func1 is executed in total.

void Func1(int N)
{
	int count = 0;
	for (int i = 0; i < N; ++i)
	{
		for (int j = 0; j < N; ++j)
		{
			++count;
		}
	}
	for (int k = 0; k < 2 * N; ++k)
	{
		++count;
	}
	int M = 10;
	while (M--)
	{
		++count;
	}
	printf("%d\n", count);
}

We can get the number of executions of Func1: F(N)=N^2+2*N+10

When N = 10, F(N) = 130;

When N = 100, F(N) = 10210;

When N = 1000, F(N) = 1002010.
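Since Func1 prints count itself, a small driver is enough to check the formula empirically (a sketch; it assumes the Func1 above and #include <stdio.h> for its printf):

int main(void)
{
	// Each call prints count, which should match F(N) = N^2 + 2*N + 10.
	Func1(10);    // expected output: 130
	Func1(100);   // expected output: 10210
	Func1(1000);  // expected output: 1002010
	return 0;
}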

In practice, when we calculate the time complexity, we do not actually need the exact number of executions, only the approximate order of growth, so we use the Big O asymptotic notation.

2.2 Big O asymptotic notation

Big O notation: a mathematical notation used to describe the asymptotic behavior of functions.

The method of deriving Big O:

1. Replace all additive constants in the running time with the constant 1.
2. In the resulting running-time function, keep only the highest-order term.
3. If the highest-order term exists and its coefficient is not 1, remove that coefficient. The result is the Big O order.

Deriving the Big O order is similar to taking limits in advanced mathematics: only the dominant term matters.

For example, applying these rules to Func1, the time complexity of Func1 is O(N^2).
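Step by step, using F(N) = N^2 + 2*N + 10 from above:

F(N) = N^2 + 2*N + 10
-> replace the additive constant 10 with 1 (rule 1): N^2 + 2*N + 1
-> keep only the highest-order term (rule 2): N^2
-> the coefficient of N^2 is already 1 (rule 3), so the Big O order is O(N^2)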

Keeping only the N^2 term:

When N = 10, N^2 = 100
When N = 100, N^2 = 10000
When N = 1000, N^2 = 1000000

When N is very large, the omitted lower-order terms contribute very little, so this rough estimate is close to the exact count, much like taking a limit in advanced mathematics.

Through the above, we can see that the Big O asymptotic notation removes the terms that have little impact on the result and expresses the number of executions concisely and clearly.

In addition, the time complexity of some algorithms has best, average and worst cases:

Worst case: the maximum number of operations (upper bound) for any input of size N
Average case: the expected number of operations for any input of size N
Best case: the minimum number of operations (lower bound) for any input of size N

For example, searching for a value x in an array of length N:

Best case: found after 1 comparison
Worst case: found after N comparisons
Average case: found after about N/2 comparisons

In practice, we generally focus on the worst case, so the time complexity of searching for a value in an arbitrary array is O(N).
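To make the three cases concrete, here is a minimal linear-search sketch (Find is a hypothetical name introduced here for illustration, not from the original text):

// Return the index of x in a[0..N-1], or -1 if x is not present.
int Find(const int* a, int N, int x)
{
	for (int i = 0; i < N; ++i)
	{
		if (a[i] == x)
			return i;	// best case: x == a[0], 1 comparison
	}
	return -1;			// worst case: N comparisons
}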

2.3 Common time complexity calculation examples

Example 1:

Calculate the time complexity of Func?

void Func(int N)
{
	int count = 0;
	for (int k = 0; k < 2 * N; ++k)
	{
		++count;
	}
	int M = 10;
	while (M--)
	{
		++count;
	}
	printf("%d\n", count);
}

The expression for the number of executions of Func is: F(N) = 2*N + 10

When N = 10, F(N) = 30

When N = 1000, F(N) = 2010

When N = 1000000, F(N) = 2000010

When N takes a very large value, the 10 in the expression can be omitted, and multiplying N by 2 does not change the order of magnitude, so the constant factor can be omitted as well. In this way, we get that the complexity of Func is linear in N, expressed as O(N).

Example 2:

Calculate the time complexity of Func3?

void Func3(int N, int M)
{
	int count = 0;
	for (int k = 0; k < M; ++k)
	{
		++count;
	}
	for (int k = 0; k < N; ++k)
	{
		++count;
	}
	printf("%d\n", count);
}

The expression for the number of executions of Func3 is: F(M, N) = M + N

Here we will discuss it on a case-by-case basis.

Case 1: When M >> N, N can be omitted, following the same limit-style reasoning as above.

Therefore, the time complexity of Func3 is O(M).

Case 2: When M << N, similarly the time complexity of Func3 is O(N).

Case 3: When M == N, the number of executions of Func3 can be written as F(N) = 2M or F(N) = 2N.

After dropping the constant factor, the time complexity of Func3 is O(M), which is the same as O(N).

Example 3:

Calculate the time complexity of Func4?

void Func4(int N)
{
	int count = 0;
	for (int k = 0; k < 100; ++k)
	{
		++count;
	}
	printf("%d\n", count);
}

The expression for the number of executions of Func4 is F(N) = 100.

According to rule 1 of the Big O derivation, we can easily get that the time complexity of Func4 is O(1).

Example 4:

Calculate the time complexity of strchr?

const char * strchr ( const char * str, int character );

The library function strchr searches for the first occurrence of a character in a string.

For this question, we again need to analyze by cases:

Best case: the basic operation is performed once, so the time complexity is O(1).

Average case: the basic operation is performed about N/2 times, which simplifies to O(N).

Worst case: the entire string must be traversed and the basic operation is performed N times, so the time complexity is O(N).
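For intuition, a simplified scan in the spirit of strchr might look like the following (a sketch only, not the actual library implementation; the real strchr also treats the terminating '\0' as a searchable character):

// Return a pointer to the first occurrence of c in str,
// or NULL if c does not appear before the terminating '\0'.
const char* MyStrchr(const char* str, int c)
{
	while (*str != '\0')
	{
		if (*str == (char)c)
			return str;	// best case: the first character matches
		++str;
	}
	return NULL;			// worst case: the whole string is scanned
}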

Example 5:

Calculate the time complexity of BubbleSort?

#include <assert.h>

// Swap is a small helper (not shown in the original) that exchanges two ints.
void Swap(int* p1, int* p2)
{
	int tmp = *p1;
	*p1 = *p2;
	*p2 = tmp;
}

void BubbleSort(int* a, int n)
{
	assert(a);
	for (size_t end = n; end > 0; --end)
	{
		int exchange = 0;
		for (size_t i = 1; i < end; ++i)
		{
			if (a[i - 1] > a[i])
			{
				Swap(&a[i - 1], &a[i]);
				exchange = 1;
			}
		}
		if (exchange == 0)
			break;
	}
}

This asks for the time complexity of bubble sort. In the previous examples we could work out the time complexity simply by reading the code, but here the answer is harder to read off at a glance.

Here it also depends on the situation:

Scenario 1: the given numbers happen to already be in order. This is the best case: only N-1 comparisons are needed, so the time complexity is O(N).

Scenario 2: the given numbers are out of order (the worst case).

It is not difficult to see that the numbers of inner-loop executions form an arithmetic sequence: (N-1) + (N-2) + ... + 1. By the arithmetic series sum, the number of executions is F(N) = N*(N-1)/2.

So the time complexity is O(N^2).

Example 6:

Calculate the time complexity of BinarySearch (binary search)?

int BinarySearch(int* a, int n, int x)
{
	assert(a);
	int begin = 0;
	int end = n - 1;
	while (begin <= end)
	{
		int mid = begin + ((end - begin) >> 1);
		if (a[mid] < x)
			begin = mid + 1;
		else if (a[mid] > x)
			end = mid - 1;
		else
			return mid;
	}
	return -1;
}

Suppose we have N numbers. In the best case, we find the target on the first comparison. In the worst case, the search range keeps halving, N -> N/2 -> N/4 -> ..., until only one number is left. If x halvings are needed, then 2^x = N, so x = logN. Therefore, the time complexity of binary search is O(logN).

PS: in algorithm analysis, logN denotes the base-2 logarithm of N.
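A small usage sketch of the BinarySearch above (the array must already be sorted; the values here are arbitrary):

#include <stdio.h>

int main(void)
{
	int a[] = { 1, 3, 5, 7, 9, 11 };
	int n = sizeof(a) / sizeof(a[0]);
	int pos = BinarySearch(a, n, 7);	// expected result: index 3
	printf("%d\n", pos);
	return 0;
}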

Example 7:

Calculate the time complexity of the factorial recursion?

long long Fac(size_t N)
{
	if (0 == N)
		return 1;
	return Fac(N - 1) * N;
}

The recursion goes from Fac(N) down to Fac(0), so the basic operation is performed about N times and the time complexity is O(N).

Summary: the time complexity of a recursive algorithm is the total work accumulated over all of its recursive calls.

Example 8:

Calculate the time complexity of the Fibonacci recursion?

long long Fib(size_t N)
{
	if (N < 3)
		return 1;
	return Fib(N - 1) + Fib(N - 2);
}

Each call to Fib makes two further recursive calls, so the recursion tree roughly doubles at every level and contains on the order of 2^N calls. So the time complexity is O(2^N).
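To see the exponential growth directly, one can count the calls with a global counter. This is just a small experiment; FibCounted and call_count are names introduced here:

#include <stdio.h>

static long long call_count = 0;

long long FibCounted(size_t N)
{
	++call_count;	// count every call, including the recursive ones
	if (N < 3)
		return 1;
	return FibCounted(N - 1) + FibCounted(N - 2);
}

int main(void)
{
	for (size_t N = 10; N <= 30; N += 10)
	{
		call_count = 0;
		FibCounted(N);
		printf("N = %zu, calls = %lld\n", N, call_count);
	}
	return 0;
}

The printed call counts grow roughly exponentially with N, which matches the O(2^N) estimate.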

3. Space complexity

Space complexity is also a mathematical function; it measures the amount of extra storage an algorithm temporarily occupies while running.

Space complexity is not the number of bytes the program occupies, because that is not very meaningful; instead, space complexity is counted by the number of variables created.

The calculation rules for space complexity are basically the same as for time complexity, and the Big O asymptotic notation is also used.

Note: the stack space a function needs while running (to store parameters, local variables, some register information, etc.) is determined at compile time, so the space complexity is mainly determined by the extra space the function explicitly requests at run time.

Example 1:

Calculate the space complexity of BubbleSort?

void BubbleSort(int* a, int n)
{
	assert(a);
	for (size_t end = n; end > 0; --end)
	{
		int exchange = 0;
		for (size_t i = 1; i < end; ++i)
		{
			if (a[i - 1] > a[i])
			{
				Swap(&a[i - 1], &a[i]);
				exchange = 1;
			}
		}
		if (exchange == 0)
			break;
	}
}

Example 2:

Calculate the space complexity of Fibonacci sequence?

#include <stdlib.h>	// for malloc

long long* Fibonacci(size_t n)
{
	if (n == 0)
		return NULL;
	long long* fibArray = (long long*)malloc((n + 1) * sizeof(long long));
	fibArray[0] = 0;
	fibArray[1] = 1;
	for (size_t i = 2; i <= n; ++i)
	{
		fibArray[i] = fibArray[i - 1] + fibArray[i - 2];
	}
	return fibArray;
}

Example 3:

Calculate the space complexity of the factorial recursion?

long long Fac(size_t N)
{
	if (N == 0)
		return 1;
	return Fac(N - 1) * N;
}

Answers and analysis:

1. Example 1 uses a constant amount of extra space, so the space complexity is O(1).
2. Example 2 dynamically allocates n + 1 long long elements, so the space complexity is O(N).
3. Example 3 recurses N times, opening N stack frames, and each stack frame uses a constant amount of space, so the space complexity is O(N).


Origin blog.csdn.net/qq_55119554/article/details/132677290