[Data structure] Algorithm efficiency, time complexity, space complexity, and big-O asymptotic notation

1. Algorithm efficiency

(1) Algorithm efficiency is divided into time efficiency (time complexity) and space efficiency (space complexity).
(2) Time complexity is usually the more important of the two.
Following the trend described by Moore's Law, memory keeps getting larger and cheaper, so apart from embedded devices, which still care a great deal about space complexity, most scenarios focus on time efficiency.

2. Time complexity

Concept

(1) The time an algorithm takes is proportional to the number of times its statements execute. The number of executions of the basic operations in the algorithm is its time complexity.
(2) Time complexity is therefore a function of the input size.
(3) Big-O asymptotic notation is adopted.
Note on big O: why we count executions instead of measuring time:
(1) The same program takes different amounts of time on different machines.
(2) Actually timing the program on a computer is troublesome and time-consuming.

3. Space complexity

Concept

(1) Space complexity is measured by the number of extra variables (extra space) the algorithm uses.
(2) Big-O asymptotic notation is adopted.

4. Big-O asymptotic notation

4.1 Concept

(1) Time complexity: when calculating time complexity, the exact number of executions is not required, only its order of magnitude; that is, big-O asymptotic notation is used.
(2) Space complexity: big-O asymptotic notation is used as well.

4.2 Notation

Big-O notation: O() is used to describe the asymptotic behavior of a function.

4.3 Derivation rules

1. Replace all additive constants in the running-time function with the constant 1. eg(1)
2. In the modified count function, keep only the highest-order term. eg(2)
3. If the highest-order term exists and its coefficient is not 1, remove the coefficient. The result is the big-O order. eg(2)
4. The time complexity of an algorithm is taken from the worst case. eg(2)
5. You cannot get it by simply counting the levels of loops! eg(3)


4.4 Typical examples

eg(1)(2)(3) are time-complexity examples; eg(4)(5)(6) are space-complexity examples.
eg:(1)

#include <stdio.h>

void Func1(int N)
{
	int count = 0;
	for (int k = 0; k < 10; ++k)
	{
		++count;
	}
	printf("%d\n", count);
}

f(N) = 1 + 1 + 1 + … + 1 (ten 1s)
By derivation rule 1, the time complexity is O(1), i.e. constant order.

eg: (2) Bubble sort

#include <assert.h>
#include <stddef.h>

// Helper assumed by the original code.
static void Swap(int* p, int* q)
{
	int tmp = *p;
	*p = *q;
	*q = tmp;
}

void BubbleSort(int* a, int n)
{
	assert(a);
	for (size_t end = n; end > 0; --end)
	{
		int exchange = 0;
		for (size_t i = 1; i < end; ++i)
		{
			if (a[i - 1] > a[i])
			{
				Swap(&a[i - 1], &a[i]);
				exchange = 1;
			}
		}
		if (exchange == 0)
			break;
	}
}

In the worst case:
f(n) = (n-1) + (n-2) + … + 1 (the first pass makes n-1 comparisons, the second pass n-2, …)
= n(n-1)/2
= n²/2 − n/2
By derivation rules 2, 3, and 4, the time complexity is O(n²).

eg: (3) Binary search

#include <assert.h>

int BinarySearch(int* a, int n, int x)
{
	assert(a);
	// Search the half-open interval [begin, end); starting with
	// end = n (not n - 1) ensures the last element is not skipped.
	int begin = 0;
	int end = n;
	while (begin < end)
	{
		int mid = begin + ((end - begin) >> 1);
		if (a[mid] < x)
			begin = mid + 1;
		else if (a[mid] > x)
			end = mid;
		else
			return mid;
	}
	return -1;
}

Binary search discards half of the remaining data each time, so after x halvings:
N/2/2/…/2 = 1, which means 2·2·…·2 = N
then 2^x = N
x = log₂N
The time complexity is O(logN).
(Note: logarithm bases are awkward to write in plain computer text, so log₂N is generally abbreviated as logN.)

eg(4) Bubble sort

void BubbleSort(int* a, int n)
{
	assert(a);
	for (size_t end = n; end > 0; --end)
	{
		int exchange = 0;
		for (size_t i = 1; i < end; ++i)
		{
			if (a[i - 1] > a[i])
			{
				Swap(&a[i - 1], &a[i]);
				exchange = 1;
			}
		}
		if (exchange == 0)
			break;
	}
}

It uses only a constant number of extra variables (end, exchange, i), so the space complexity is O(1).

eg(5) Fibonacci

#include <stddef.h>
#include <stdlib.h>

long long* Fibonacci(size_t n)
{
	if (n == 0)
		return NULL;
	// Dynamically allocate n + 1 slots
	long long* fibArray = (long long*)malloc((n + 1) * sizeof(long long));
	fibArray[0] = 0;
	fibArray[1] = 1;
	for (size_t i = 2; i <= n; ++i)
	{
		fibArray[i] = fibArray[i - 1] + fibArray[i - 2];
	}
	return fibArray;
}

It dynamically allocates N (more precisely, n + 1) slots, so the space complexity is O(N).

eg(6) Factorial

long long Factorial(size_t N)
{
	return N < 2 ? N : Factorial(N - 1) * N;
}

The function calls itself recursively N times, opening N stack frames; each stack frame uses a constant amount of space, so the space complexity is O(N).

Origin blog.csdn.net/m0_46630468/article/details/113484225