Data structures: time complexity and space complexity

Table of contents

1. What is space complexity and time complexity?

1.1 Algorithm efficiency

1.2 The concept of time complexity

1.3 The concept of space complexity

2. How to calculate the time complexity of common algorithms

2.1 Big O asymptotic notation

 Usage rules

3. How to calculate the space complexity of common algorithms

3.1 Big O asymptotic notation

3.2 Interview question - disappearing numbers

 3.3 Interview question - rotating array


1. What is space complexity and time complexity?

1.1 Algorithm efficiency

It is divided into two types. One is time efficiency, also known as time complexity, which mainly measures the running speed of the algorithm. The other is space efficiency, called space complexity, which measures the extra space required by the algorithm.

1.2 The concept of time complexity

Simply put, the number of executions of basic operations in an algorithm is the time complexity of the algorithm.

1.3 The concept of space complexity

Space complexity is a measure of the amount of storage space temporarily occupied by an algorithm during its operation. It is generally expressed using Big O asymptotic notation.


2. How to calculate the time complexity of common algorithms

2.1 Big O asymptotic notation

// Example 1: How many times is the basic operation of Func1 executed?
void Func1(int N)
{
	int count = 0;
	for (int i = 0; i < N; ++i)
	{
		for (int j = 0; j < N; ++j)
		{
			++count;
		}
	}
	for (int k = 0; k < 2 * N; ++k)
	{
		++count;
	}

	int M = 10;
	while (M--)
	{
		++count;
	}
	printf("%d\n", count);
}

We can see that the exact count is N*N (the inner loop runs N times for each of the N iterations of the outer loop, so N*N in total) + 2*N (the second loop simply runs 2*N times) + 10 (the while loop runs 10 times).

Writing this as a function of N:

Let F(N)=N*N+2*N+10

N = 10 F(N) = 130

N = 100 F(N) = 10210

N = 1000 F(N) = 1002010

Important points:

  • As N increases, the N^2 term in the expression has by far the greatest impact on the result.
  • Time complexity is an estimate: we look only at the term that has the greatest impact.
  • In Big O asymptotic notation, the estimated time complexity is O(N^2).

 Usage rules

  • Replace all additive constants in the run-count function with the constant 1.
  • In the modified run-count function, keep only the highest-order term.
  • If the highest-order term exists and its coefficient is not 1, drop that coefficient. What remains is the Big O order. (A worked application follows this list.)
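Applying these rules to Func1 above as a quick check: start from F(N) = N*N + 2*N + 10, drop the additive constant 10 and the lower-order term 2*N, and keep the highest-order term N*N; its coefficient is already 1, so nothing more needs to be removed. The result is O(N^2).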

// Example 2: Compute the time complexity of Func2
void Func2(int N)
{
	int count = 0;
	for (int k = 0; k < 2 * N; ++k)
	{
		++count;
	}
	int M = 10;
	while (M--)
	{
		++count;
	}
	printf("%d\n", count);
}

The exact count is 2*N + 10, and the result is O(N). The factor 2 is dropped because, as N grows without bound, a constant factor no longer changes the order of growth, and the additive 10 is likewise negligible.

// Example 3: Compute the time complexity of Func3
void Func3(int N, int M)
{
	int count = 0;
	for (int k = 0; k < M; ++k)
	{
		++count;
	}
	for (int k = 0; k < N; ++k)
	{
		++count;
	}
	printf("%d\n", count);
}

The result is O(M+N), since there are two independent unknowns. If we are told that M is much larger than N, the result simplifies to O(M).

If instead M is about the same size as N, the result can be written as O(N) or O(M).

// Example 4: Compute the time complexity of Func4
void Func4(int N)
{
	int count = 0;

	for (int k = 0; k < 100; ++k)
	{
		++count;
	}
	printf("%d\n", count);
}

The result is O(1): the loop runs a fixed number of times (100 here), and any constant number of operations counts as O(1).

// Example 5: Compute the time complexity of strchr
const char* strchr(const char* str, char character)
{
	while (*str != '\0')
	{
		if (*str == character)
			return str;

		++str;
	}

	return NULL;
}

This is equivalent to searching for a character in a character array. Assuming the string has length N, the number of comparisons needed depends on where the target character sits in the string (near the front, somewhere in the middle, or at the very end) or on whether it is present at all.

 

We will find that there are several cases:

  • Worst case: maximum number of runs for any input size (upper bound)
  • Average case: expected number of runs for any input size
  • Best case: minimum number of runs (lower bound) for any input size

For example, searching for a value x in an array of length N:

Best case: found after 1 comparison

Worst case: found after N comparisons

Average case: found after about N/2 comparisons

In practice we generally focus on the worst case, so the time complexity of searching for a value in an array is O(N).
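As a small illustration of the three cases, here is a hypothetical driver (not part of the original article) that uses the strchr defined above:

#include <stdio.h>

int main(void)
{
	const char* s = "abcdefghij";        // length N = 10
	printf("%s\n", strchr(s, 'a'));      // best case: found on the 1st comparison
	printf("%s\n", strchr(s, 'e'));      // middle of the string: about N/2 comparisons
	printf("%s\n", strchr(s, 'j'));      // worst case while present: found on the Nth comparison
	return 0;
}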

// Case 6: Compute the time complexity of BubbleSort
void BubbleSort(int* a, int n)
{
	assert(a);
	for (size_t end = n; end > 0; --end)
	{
		int exchange = 0;
		for (size_t i = 1; i < end; ++i)
		{
			if (a[i - 1] > a[i])
			{
				Swap(&a[i - 1], &a[i]);
				exchange = 1;
			}
		}
		if (exchange == 0)
			break;
	}
}
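BubbleSort above relies on assert (from <assert.h>) and on a Swap helper that the post does not show; the following is a minimal assumed sketch of that helper, swapping two ints through their addresses:

#include <assert.h>

// Assumed helper: exchange two int values through the pointers passed in.
void Swap(int* px, int* py)
{
	int tmp = *px;
	*px = *py;
	*py = tmp;
}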

This one is a bit more abstract: unlike the earlier examples, the count cannot simply be read off the code. You have to turn the abstract code into a concrete picture; since this is bubble sort, imagine what each pass actually does.

For example, take arr[5] = {0,1,2,3,4}. In the worst case, bubbling the element at position 0 all the way to position 4 takes 4 compare-and-swap steps (giving {1,2,3,4,0}); bubbling the next element into position 3 takes at most 3 steps, and so on. By the time only the last element is left, no more steps are needed.

Listing the number of steps for each pass gives an arithmetic sequence: (N-1) + (N-2) + ... + 2 + 1 = N*(N-1)/2. Keeping only the highest-order term and estimating with Big O, we get O(N^2).

// Case 7: Compute the time complexity of BinarySearch
int BinarySearch(int* a, int n, int x)
{
	assert(a);

	int begin = 0;
	int end = n;
	while (begin < end)
	{
		int mid = begin + ((end - begin) >> 1);
		if (a[mid] < x)
			begin = mid + 1;
		else if (a[mid] > x)
			end = mid;
		else
			return mid;
	}
	return -1;
}
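Here is a small usage sketch (a hypothetical driver, not from the original) that shows the conventions: end starts one past the last index, and -1 means the value is not present:

#include <stdio.h>

int main(void)
{
	int a[] = { 1, 3, 5, 7, 9, 11 };
	int n = sizeof(a) / sizeof(a[0]);
	printf("%d\n", BinarySearch(a, n, 7));   // prints 3, the index of 7
	printf("%d\n", BinarySearch(a, n, 4));   // prints -1, since 4 is not in the array
	return 0;
}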

This is a classic binary search case. If you don’t understand its core, you can refer to this article:

Detailed explanation of binary search

Here we can imagine the array as a strip of paper of length N. Each comparison folds the strip in half, throwing away the half that cannot contain the target; after folding X times the value is either found or the strip is exhausted. Unfolding the process gives the expression: after X halvings about N / 2^X elements remain, and the search ends when roughly one element is left, so 2^X = N, i.e. X = log₂N. The time complexity of binary search is therefore O(logN).

// Case 8: Compute the time complexity of the recursive factorial Factorial
long long Factorial(size_t N)
{
	return N < 2 ? N : Factorial(N - 1) * N;
}

 

Each call to Factorial does only a constant amount of work, and the recursion goes about N levels deep (Factorial(N) calls Factorial(N-1), and so on down to Factorial(1)), so the total is roughly T(N) = T(N-1) + 1, i.e. N constant-time calls: O(N).

// Case 9: Compute the time complexity of the recursive Fibonacci Fib
long long Fib(size_t N)
{
	if (N < 3)
		return 1;
	return Fib(N - 1) + Fib(N - 2);
}

The number of calls grows like a pyramid: 1 call spawns 2, those 2 spawn 4, and so on, roughly doubling at every level.

Summing the levels gives a geometric series: 2^0 + 2^1 + ... + 2^(N-2) ≈ 2^(N-1) - 1.

Time complexity: O(2^N)

3. How to calculate the space complexity of common algorithms

3.1 Big O asymptotic notation

// Case 1: Compute the space complexity of BubbleSort
void BubbleSort(int* a, int n)
{
	assert(a);
	for (size_t end = n; end > 0; --end)
	{
		int exchange = 0;
		for (size_t i = 1; i < end; ++i)
		{
			if (a[i - 1] > a[i])
			{
				Swap(&a[i - 1], &a[i]);
				exchange = 1;
			}
		}
		if (exchange == 0)
			break;
	}
}

 

We can see that only a constant number of variables are used (end, exchange and i, plus the parameters a and n), which means only a constant amount of extra space is opened, so the space complexity is O(1).

Even though the loop runs many times, each iteration reuses the same space for its variables; the variables are destroyed when they go out of scope, but no new space accumulates. Time accumulates, space does not.

// Case 2: Compute the space complexity of Fibonacc1
long long* Fibonacc1(size_t n)
{
	if (n == 0)
		return NULL;

	long long* fibArray = (long long*)malloc((n + 1) * sizeof(long long));
	fibArray[0] = 0;
	fibArray[1] = 1;
	for (int i = 2; i <= n; ++i)
	{
		fibArray[i] = fibArray[i - 1] + fibArray[i - 2];
	}
	return fibArray;
}
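Because Fibonacc1 returns heap memory, the caller has to free it; a minimal usage sketch (assumed, not part of the original):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
	size_t n = 10;
	long long* fib = Fibonacc1(n);               // array of n + 1 values: fib[0] .. fib[n]
	if (fib != NULL)
	{
		printf("Fib(%zu) = %lld\n", n, fib[n]);  // prints Fib(10) = 55
		free(fib);                               // release the O(N) buffer
	}
	return 0;
}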

 

The exact extra space is the n + 1 array elements plus a handful of variables, i.e. N plus a constant, so the space complexity is O(N).

The malloc call allocates one contiguous block large enough for n + 1 values of type long long.

// Case 3: Compute the space complexity of the recursive factorial Factorial
long long Factorial(size_t N)
{
	return N < 2 ? N : Factorial(N - 1) * N;
}

 

Each recursive call adds one stack frame. Although every frame is eventually destroyed, space complexity is measured by the maximum space in use at any one moment; the recursion goes N levels deep, so the space complexity is O(N).

  

// Case 4: Compute the space complexity of the recursive Fibonacci Fib
long long Fib(size_t N)
{
	if (N < 3)
		return 1;
	return Fib(N - 1) + Fib(N - 2);
}

We already know that the time complexity is O(2^N); now let's look at the space.

Let's reason about it in terms of stack frames first:

As a warm-up, consider an ordinary (non-recursive) call sequence: main calls Func1, a stack frame is opened for Func1 to hold its local variable a, and when Func1 returns that frame is destroyed. If main then calls Func2, Func2's frame can reuse the space Func1 just gave back, because both frames are similar (each holds one int, whatever its value); that is why the two local variables end up at the same address.

Note: the addresses of the two functions themselves are different; a function's address has nothing to do with its stack frame!

For Fib itself, Fib(N-1) and Fib(N-2) are not called at the same time. The recursion first goes down the Fib(N-1) branch, opening one frame per level, until Fib(2) is reached and returns. When Fib(2) returns, its frame is destroyed, which only means the space is handed back to the system. Later, when the recursion reaches Fib(3) and calls Fib(2) and Fib(1) again, Fib(2) does not open new space; it reuses the space that was just released, as if the system handed the same room key back out. Following the leftmost chain of calls, at most N frames exist at the same time; that is the maximum space ever opened at once. As the recursion unwinds, frames are destroyed and then reused by the calls that come after them.

To put it another way: imagine a hotel. When Fib(2) finishes and returns upward, that is like checking out of a room. The room itself is still there, and since Fib(1) is called right after Fib(2) finishes, the room Fib(1) moves into is exactly the room Fib(2) just left.

In essence, the maximum number of rooms ever open equals the deepest level the recursion reaches in one descent; everything else is reuse. Rooms keep being vacated and re-occupied until everyone has checked out, which is when the recursion ends.

Time accumulates and is never given back, but space can be reused. Therefore the space complexity is O(N).
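To make "the deepest level reached" concrete, here is a hedged sketch that adds a depth counter to the same recursion (FibDepth and the counters are additions for illustration, not part of the original code):

#include <stdio.h>

static size_t curDepth = 0;   // frames currently on the stack
static size_t maxDepth = 0;   // deepest point the recursion has reached so far

long long FibDepth(size_t N)
{
	++curDepth;
	if (curDepth > maxDepth)
		maxDepth = curDepth;

	long long ret = (N < 3) ? 1 : FibDepth(N - 1) + FibDepth(N - 2);

	--curDepth;
	return ret;
}

int main(void)
{
	FibDepth(10);
	// The total number of calls grows like 2^N, but maxDepth stays around N,
	// which is exactly why the space complexity is only O(N).
	printf("max depth = %zu\n", maxDepth);
	return 0;
}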

3.2 Interview question - disappearing numbers

Original problem link: LeetCode - Missing Number

Approach 1: sort + traverse. After sorting, walk the array: if the next number is not the previous number + 1, then previous + 1 is the missing number. But the sort itself (qsort, for instance) already costs O(N*logN), so this is not the fastest option.
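A hedged sketch of that first idea (the function and comparison names here are mine, not from the original; it uses the fact that the numbers are 0..N with exactly one missing, so after sorting nums[i] should equal i):

#include <stdlib.h>

// Hypothetical comparison function for qsort.
static int cmpInt(const void* a, const void* b)
{
	return *(const int*)a - *(const int*)b;
}

// Sort + traverse: O(N*logN) because of the sort, then an O(N) scan.
int missingNumberSort(int* nums, int numsSize)
{
	qsort(nums, numsSize, sizeof(int), cmpInt);
	for (int i = 0; i < numsSize; i++)
	{
		if (nums[i] != i)     // first position where the expected value is absent
			return i;
	}
	return numsSize;          // no gap found, so N itself is the missing number
}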

 

Approach 2: the sum formula. The numbers 0 through N add up to N*(N+1)/2; subtract every array element from that total and what remains is exactly the missing number. The time complexity is O(N):

int missingNumber(int* nums, int numsSize)
{
	int N = numsSize;
	int i = 0;
	int ret = N * (N + 1) / 2;
	for (i = 0; i < N; i++)
	{
		ret -= nums[i];
	}
	return ret;


}

 

Approach 3: XOR. Working bit by bit, equal bits give 0 and different bits give 1, so any number XOR-ed with itself is 0 and XOR-ing with 0 leaves a number unchanged. If we XOR together every element of the array and every number from 0 to N, all the paired numbers cancel and only the missing number remains.

The key point is that no sorting or particular order is required.

int missingNumber(int* nums, int numsSize)
{
	int i = 0;
	int j = 0;
	int x = 0;
	int N = numsSize;
	for (i = 0; i < N; i++)
	{
		x ^= nums[i];
	}
	for (j = 0; j <= N; j++)
	{
		x ^= j;
	}
	return x;


}
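A quick trace with nums = {0, 1, 3} (N = 3): XOR-ing the elements gives 0 ^ 1 ^ 3 = 2, and XOR-ing that with 0, 1, 2, 3 gives 2 ^ 0 ^ 1 ^ 2 ^ 3 = 2. Every number that appears in both groups cancels out, so only the missing 2 is left.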

 3.3 Interview question - rotating array

Original problem link: LeetCode - Rotate Array

Approach 1: rotate right one step at a time. Save the last element in tmp, shift every earlier element one position to the right, put tmp in the first slot, and repeat this k times.

As for its time and space complexity: one right rotation takes about N operations, so k right rotations take about k*N operations.

However, rotating an array of length N by N steps brings it back to its original order, so the effective rotation count is k % N. In the best case k % N == 0 and nothing needs to move, which is O(1); in the worst case N-1 rotations are needed, which is O(N^2). Following the usual rule, we take the worst case: O(N^2).

This method is therefore too slow to pass, so we move on. The code is attached for reference:

void rotate(int* nums, int sumsSize, int k)
{
	int N = sumsSize;
	if (k >= N)
	{
		k %= N;
	}
	while (k--)
	{
		int tmp = nums[N - 1];
		for (int end = N - 2; end >= 0; end--)
		{
			nums[end + 1] = nums[end];
		}
		nums[0] = tmp;
	}
}

 

// Needs <stdlib.h> for malloc/free and <string.h> for memcpy.
void rotate(int* nums, int sumsSize, int k)
{
	int N = sumsSize;
	int* tmp = (int*)malloc(sizeof(int) * N);      // allocate a new array the same size as the original
	k %= N;
	memcpy(tmp, nums + N - k, sizeof(int) * k);    // copy the last k elements of nums to the front of tmp
	memcpy(tmp + k, nums, sizeof(int) * (N - k));  // copy the first N-k elements of nums right after them
	memcpy(nums, tmp, sizeof(int) * N);            // copy all N elements of tmp back into nums
	free(tmp);                                     // release the extra space
	tmp = NULL;                                    // avoid a dangling pointer
}

 

Approach 2 (above) moves everything in one go: copy the last k elements of the original array to the front of the new array, then copy the first N-k elements after them, and finally copy the whole new array back into nums. It trades space for time (O(N) time, O(N) extra space), but the old problem remains: if the array is very large, the extra space may not be acceptable.
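For example, with nums = {1,2,3,4,5,6,7}, N = 7 and k = 3: tmp receives {5,6,7} followed by {1,2,3,4}, i.e. {5,6,7,1,2,3,4}, and that is copied back into nums.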

Approach 3, the most elegant of all: three reversals!

void Reverse(int* nums, int left, int right)
{
	while (left < right)
	{
		int tmp = nums[left];
		nums[left] = nums[right];
		nums[right] = tmp;
		left++;
		right--;

	}
}


void rotate(int* nums, int sumsSize, int k)
{
	int N = sumsSize;
	k %= N;

	Reverse(nums, 0, N - k - 1);
	Reverse(nums, N-k, N - 1);
	Reverse(nums, 0, N-1);

}
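A quick trace with nums = {1,2,3,4,5,6,7} and k = 3: reversing the first N-k = 4 elements gives {4,3,2,1,5,6,7}; reversing the last k = 3 elements gives {4,3,2,1,7,6,5}; reversing the whole array gives {5,6,7,1,2,3,4}, exactly the array rotated right by 3. Every element moves a constant number of times, so the time complexity is O(N) and the extra space is O(1).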

4. Conclusion

This concludes the explanation of complexity; I hope it helps everyone understand it. Also, because my course schedule at school is tight, I will pause updates to the C language column for a while and finish the data structures series first~


Source: blog.csdn.net/fax_player/article/details/132081696