The strongest data structures in history: the time complexity and space complexity of algorithms

1. Algorithmic efficiency

How to measure the quality of an algorithm?

Generally, it is measured by the algorithm's time complexity and space complexity.

Time complexity mainly measures how fast an algorithm runs, while space complexity mainly measures the extra space an algorithm needs while it runs. In the early days of computing, storage capacity was very limited, so space complexity received a great deal of attention. With the rapid development of the computer industry, storage capacity has grown enormously, so today we usually no longer pay special attention to the space complexity of an algorithm.

2. Time complexity

2.1 The concept of time complexity

Definition: In computer science, the time complexity of an algorithm is a function that quantitatively describes the algorithm's running time. The number of times the algorithm's basic operations are executed is taken as its time complexity.

In other words, finding the mathematical expression that relates the execution count of a certain basic statement to the problem size N is what it means to calculate the algorithm's time complexity.

Example:

Q: How many times is the statement ++count in Func1 executed?

void Func1(int N)
{
	int count = 0;
	for (int i = 0; i < N; ++i)
	{
		for (int j = 0; j < N; ++j)
		{
			++count;
		}
	}

	for (int k = 0; k < 2 * N; ++k)
	{
		++count;
	}
	int M = 10;
	while (M--)
	{
		++count;
	}
}

The number of executions is expressed as a function as follows:

F(N) = N^2 + 2 * N + 10

From a mathematical point of view, the contribution of the 2 * N + 10 part to the final result becomes smaller and smaller as N grows. When N approaches infinity it can even be ignored; that is, the value of the function is essentially determined by the N^2 term alone.

2.2 Asymptotic Notation for Big O

Big-O notation is a mathematical notation used to describe the asymptotic behavior of a function.

Rules for deriving the big-O order:

  1. If the total number of executions of the basic statements is a constant, replace it with the constant 1.
  2. In the resulting execution-count function, keep only the highest-order term.
  3. If the highest-order term exists and its coefficient is not 1, remove the coefficient. What remains is the big-O order. (Time complexity is measured in orders of magnitude.)
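
For instance, applying these rules to an execution count of F(N) = 3N^2 + 4N + 5 (one of the entries from the comparison table later in this article) gives the following worked derivation:

$$F(N) = 3N^2 + 4N + 5 \xrightarrow{\text{rule 2}} 3N^2 \xrightarrow{\text{rule 3}} N^2 \Rightarrow O(N^2)$$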

After using big-O asymptotic notation, the time complexity of Func1 is:

O(N^2)

From the above we can see that big-O asymptotic notation strips out the terms that have little influence on the result and expresses the execution count concisely and clearly.

In addition, some algorithms have best-case, average-case and worst-case time complexities:

Worst case: the maximum number of operations for any input of size N (upper bound)

Average case: the expected number of operations for any input of size N

Best case: the minimum number of operations for any input of size N (lower bound)

In practice we generally care about the worst case, so, for example, the time complexity of searching for a value in an array of length N is O(N).
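
To make the best/worst-case distinction concrete, here is a minimal linear-search sketch (an added illustration, not code from the original article):

int LinearSearch(const int* a, int n, int x)
{
	// Best case: x is at index 0, only one comparison is made -> O(1).
	// Worst case: x is absent or at the very end, all n elements are compared -> O(N).
	for (int i = 0; i < n; ++i)
	{
		if (a[i] == x)
			return i;
	}
	return -1;   // not found
}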

2.3 Calculation example of time complexity

2.3.1 Example 1

// What is the time complexity of Func2?
void Func2(int N)
{
	int count = 0;
	for (int k = 0; k < 2 * N; ++k)
	{
		++count;
	}
	int M = 10;
	while (M--)
	{
		++count;
	}
	printf("%d\n", count);
}

The basic operation is performed 2N + 10 times; applying the big-O derivation rules, the time complexity is O(N).

2.3.2 Example 2

// What is the time complexity of Func3?
void Func3(int N, int M)
{
	int count = 0;
	for (int k = 0; k < M; ++k)
	{
		++count;
	}
	for (int k = 0; k < N; ++k)
	{
		++count;
	}
	printf("%d\n", count);
}

The basic operation is performed M + N times. There are two unknowns, M and N, so the time complexity is O(M + N).

Conclusion: the time complexity does not necessarily involve only one unknown; there may be two. It depends on the parameters of the function. However, if the problem statement tells us that:

  1. M is much larger than N, then the time complexity becomes O(M);
  2. M and N are about the same size, then the time complexity can be written as either O(M) or O(N).

2.3.3 Example 3

// What is the time complexity of Func4?
void Func4(int N)
{
	int count = 0;
	for (int k = 0; k < 100; ++k)
	{
		++count;
	}
	printf("%d\n", count);
}

The basic operation is performed 100 times, a constant; applying the big-O derivation rules, the time complexity is O(1).

2.3.4 Example 4

// What is the time complexity of strchr?
const char* strchr(const char* str, int character);

Here is an approximate implementation of this function:

const char* strchr(const char* str, int character)
{
	while (*str)
	{
		if (*str == character)
			return str;
		else
			++str;
	}
	return NULL;
}

In the best case the basic operation runs once, and in the worst case it runs N times. Since time complexity generally refers to the worst case, the time complexity is O(N).
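
A small usage sketch of the two extremes (added here for illustration; it uses the standard library strchr):

#include <stdio.h>
#include <string.h>

int main(void)
{
	const char* s = "abcdef";
	// Best case: 'a' matches on the very first comparison.
	printf("%s\n", strchr(s, 'a'));      // prints "abcdef"
	// Worst case: 'z' is not in the string, so every character is examined.
	if (strchr(s, 'z') == NULL)
		printf("not found\n");
	return 0;
}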

2.3.5 Example 5

// What is the time complexity of BubbleSort?
void BubbleSort(int* a, int n)
{
	assert(a);
	for (size_t end = n; end > 0; --end)
	{
		int exchange = 0;
		for (size_t i = 1; i < end; ++i)
		{
			if (a[i - 1] > a[i])
			{
				Swap(&a[i - 1], &a[i]);
				exchange = 1;
			}
		}
		if (exchange == 0)
			break;
	}
}

Note: a doubly nested loop does not necessarily mean O(N^2); it also depends on what the function actually does.

Number of comparisons in the first pass: N - 1

Number of comparisons in the second pass: N - 2

Number of comparisons in the third pass: N - 3

···

Number of comparisons in pass N - 1: 1

Worst case: F(N) = N * (N - 1) / 2

Best case: F(N) = N - 1 (if no exchange occurs during the first pass, the data is already in order, so there is no need to keep sorting)

Complexity: O(N^2) (worst case)
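
For reference, the worst-case count above is just the sum of an arithmetic series (a standard identity, stated here explicitly):

$$F(N) = (N-1) + (N-2) + \cdots + 2 + 1 = \frac{N(N-1)}{2}$$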

2.3.6 Example 6

// What is the time complexity of BinarySearch?
int BinarySearch(int* a, int n, int x)
{
	assert(a);
	int begin = 0;
	int end = n;
	while (begin < end)
	{
		int mid = begin + ((end - begin) >> 1);
		// written this way (instead of (begin + end) / 2) to prevent begin + end from overflowing the maximum int value
		if (a[mid] < x)
			begin = mid + 1;
		else if (a[mid] > x)
			end = mid;
		else
			return mid;
	}
	return -1;
}

The above code uses binary search.


The code above works on a left-closed, right-open interval [begin, end). The difference between the two ways of writing binary search lies in how the boundaries are handled; whichever interval convention you choose, you must keep it consistent throughout. Below is the binary search code for a left-closed, right-closed interval [begin, end]:

int BinarySearch(int* a, int n, int x)
{
	assert(a);
	int begin = 0;
	int end = n - 1;
	while (begin <= end)
	{
		int mid = begin + ((end - begin) >> 1);
		if (a[mid] < x)
			begin = mid + 1;
		else if (a[mid] > x)
			end = mid - 1;
		else
			return mid;
	}
	return -1;
}

Best case: O(1)

Worst case:

Each comparison halves the remaining range. Suppose we have searched X times and only one element remains (and we still have not found the target). Then

1 (the one remaining element) * 2^X = N

X = log2(N)

Time complexity: O(log2 N) (this is often abbreviated as O(logN), and sometimes even as O(lgN), but that last abbreviation is imprecise and is not recommended)
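
To get a feel for how slowly log2 N grows, here is a tiny counting sketch (added for illustration; it is not part of the original analysis):

#include <stdio.h>

int main(void)
{
	// Each binary-search comparison halves the remaining range,
	// so counting the halvings gives the worst-case comparison count.
	int n = 1000000;   // an example array size of one million
	int steps = 0;
	while (n > 1)
	{
		n /= 2;
		++steps;
	}
	printf("%d\n", steps);   // prints 19, i.e. about 20 comparisons at most
	return 0;
}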

Conclusion: to analyze an algorithm accurately, look at the idea behind it, not just the number of nested loops.

2.3.7 Example 7

// What is the time complexity of the recursive factorial Fac?
long long Fac(size_t N)
{
	if (0 == N)
		return 1;

	return Fac(N - 1) * N;
}

Analysis shows that the basic operation is performed N times through the recursion (the total number of function calls is N + 1), so the time complexity is O(N).

Here is a variation of the above function:

long long Fac(size_t N)
{
	if (0 == N)
		return 1;
	for(size_t i = 0;i < N;i++)
	{
		printf("%d",i);//看这个语句的执行次数
	}
	return Fac(N - 1) * N;
}

Loop executions in the first call: N

Loop executions in the second call: N - 1

···

Loop executions in the Nth call: 1

Loop executions in the (N + 1)th call: 0 (the for loop does not run because N = 0)

F(N) = N * (N + 1) / 2

So the time complexity is O(N^2).

Note:

When calculating the time complexity of a recursive algorithm:

  1. If each function call does O(1) work, the complexity is determined by the number of recursive calls.
  2. If each function call does more than O(1) work, the complexity is the accumulated work over all recursive calls.

2.3.8 Example 8

// What is the time complexity of the recursive Fibonacci Fib?
long long Fib(size_t N)
{
	if (N < 3)
		return 1;

	return Fib(N - 1) + Fib(N - 2);
}


If we picture the recursion tree of Fib(N) as if it were completely full, level k of the tree contains 2^(k-1) calls, so the geometric series formula gives a total of (1 * (1 - 2^(N-1))) / (1 - 2) = 2^(N-1) - 1 calls. (In reality the tree is not completely full: the lower-right part of the tree is missing, because the arguments on that side are smaller and reach the base case sooner, but this does not change the order of magnitude.)

Analysis therefore shows that the basic operation recurses on the order of 2^N times (2^(N-1) is treated as the same order as 2^N), so the time complexity is O(2^N).
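
The exponential growth is easy to observe directly. The following instrumentation sketch (added here; the global counter is purely for illustration) counts how many times Fib is entered:

#include <stdio.h>

static long long g_calls = 0;   // illustrative counter, incremented on every call

long long Fib(size_t N)
{
	++g_calls;
	if (N < 3)
		return 1;
	return Fib(N - 1) + Fib(N - 2);
}

int main(void)
{
	Fib(30);
	printf("%lld\n", g_calls);   // prints 1664079: about 1.66 million calls just for N = 30
	return 0;
}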

3. Space complexity

Space complexity is also a mathematical function; it measures the amount of storage space an algorithm temporarily occupies while it runs.

Space complexity is not the number of bytes the program occupies, because that figure is not very meaningful. Instead, **space complexity counts the number of variables**.

The rules for calculating space complexity are basically the same as those for time complexity, and big-O asymptotic notation is used as well.

Note: the stack space a function needs in order to run (for parameters, local variables, some register information, etc.) is determined at compile time, so space complexity is mainly determined by the extra space the function explicitly requests at run time.

3.1 Example 1:

// What is the space complexity of BubbleSort?
void BubbleSort(int* a, int n)
{
	assert(a);
	for (size_t end = n; end > 0; --end)
	{
		int exchange = 0;
		for (size_t i = 1; i < end; ++i)
		{
			if (a[i - 1] > a[i])
			{
				Swap(&a[i - 1], &a[i]);
				exchange = 1;
			}
		}
		if (exchange == 0)
			break;
	}
}

Analysis: only three extra variables are created (end, exchange and i), i.e. a constant amount of extra space is used, so the space complexity is O(1).

3.2 Example 2:

// What is the space complexity of Fibonacci?
// Returns the first n terms of the Fibonacci sequence
long long* Fibonacci(size_t n)
{
	if (n == 0)
		return NULL;

	long long* fibArray = (long long*)malloc((n + 1) * sizeof(long long));
	fibArray[0] = 0;
	fibArray[1] = 1;
	for (int i = 2; i <= n; ++i)
	{
		fibArray[i] = fibArray[i - 1] + fibArray[i - 2];
	}
	return fibArray;
}

Analysis: an array of n + 1 long long elements is dynamically allocated on the heap, so the space complexity is O(N).
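
A brief usage sketch (added here for illustration, using the Fibonacci function defined above): the O(N) extra space is the heap array handed back to the caller, and the caller should eventually free it.

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
	size_t n = 10;
	long long* fib = Fibonacci(n);   // allocates n + 1 long long elements -> O(N) space
	if (fib != NULL)
	{
		for (size_t i = 0; i <= n; ++i)
			printf("%lld ", fib[i]);
		printf("\n");
		free(fib);                   // release the O(N) buffer
	}
	return 0;
}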

3.3 Example 3:

// What is the space complexity of the recursive factorial Fac?
long long Fac(size_t N)
{
	if (N == 0)
		return 1;

	return Fac(N - 1) * N;
}

Analysis: the recursion goes N levels deep, opening up N stack frames, and each frame uses a constant amount of space, so the space complexity is O(N).
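
One way to see the stack frames (a rough illustration sketch; the actual addresses and the direction of stack growth are implementation-dependent) is to print the address of the parameter at every recursion depth:

#include <stdio.h>

long long Fac(size_t N)
{
	// Each recursive call gets its own copy of N in its own stack frame,
	// so each level prints a different address.
	printf("N = %zu lives at %p\n", N, (void*)&N);
	if (N == 0)
		return 1;
	return Fac(N - 1) * N;
}

int main(void)
{
	Fac(3);   // prints four different addresses: one frame per call
	return 0;
}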

3.4 Example 4

// What is the space complexity of the recursive Fibonacci Fib?
long long Fib(size_t N)
{
	if (N < 3)
		return 1;

	return Fib(N - 1) + Fib(N - 2);
}

Note: Time is cumulative, and space can be reused.


Analysis: take Fib(3) as an example. Fib(3) first calls Fib(2); when Fib(2) returns, its stack frame is destroyed. Fib(3) then calls Fib(1), and the stack frame of Fib(1) occupies exactly the space that the frame of Fib(2) just used. Frames created at the same depth reuse the same space. In other words, at any single moment at most N - 1 stack frames exist at the same time, so the space complexity is O(N).

For example, consider two functions f1 and f2 that are called one after another:


After the stack frame of f1 is destroyed, the frame of f2 overwrites the space that f1's frame occupied. (Destroying a stack frame merely returns the right to use that space to the system.)
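
The reuse can also be observed with a tiny sketch (an added illustration; whether the two addresses really coincide is implementation-dependent and not guaranteed by the C standard):

#include <stdio.h>

void f1(void)
{
	int a = 1;
	printf("f1's local variable a: %p\n", (void*)&a);
}

void f2(void)
{
	int b = 2;
	printf("f2's local variable b: %p\n", (void*)&b);
}

int main(void)
{
	f1();   // f1's frame is created, then released when f1 returns
	f2();   // f2's frame is typically placed in the space f1 just gave back,
	        // so the two printed addresses are usually the same
	return 0;
}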

4. Common complexity comparison

The common complexities of algorithms are as follows:

| Example execution count F(N) | Big-O order | Name |
| --- | --- | --- |
| 5201314 | O(1) | constant order |
| 3n + 4 | O(n) | linear order |
| 3n^2 + 4n + 5 | O(n^2) | square order |
| 3log2(n) + 4 | O(log2 n) | logarithmic order |
| 2n + 3n*log2(n) + 4 | O(n*log2 n) | n*log2 n order |
| n^3 + 2n^2 + 4n + 6 | O(n^3) | cubic order |
| 2^n | O(2^n) | exponential order |

Complexity comparison:

O(n!) > O(2^n) > O(n^2) > O(n*log2 n) > O(n) > O(log2 n) > O(1)

5. OJ exercises on complexity

5.1 Disappearing numbers

(Problem: the array nums contains numsSize numbers taken from the range 0 to numsSize, with exactly one number missing; find the missing number.)

Solutions:

  1. Sort first (bubble sort is O(N^2), qsort is O(N*log2 N)) and then look for the gap (does not meet the requirement).

  2. Mapping (the subscript method: a value of k is placed at subscript k) (O(N)), but this method needs O(N) extra space.

    Code:

    // requires <stdlib.h> for malloc and free
    int missingNumber(int* nums, int numsSize)
    {
        // mark every subscript from 0 to numsSize as "not seen"
        int* ret = (int*)malloc(sizeof(int) * (numsSize + 1));
        int i = 0;
        for (i = 0; i < numsSize + 1; i++)
        {
            ret[i] = -1;
        }
        // place each value at the subscript equal to the value itself
        for (i = 0; i < numsSize; i++)
        {
            ret[nums[i]] = nums[i];
        }
        // the subscript that was never filled in is the missing number
        int missing = -1;
        for (i = 0; i < numsSize + 1; i++)
        {
            if (ret[i] == -1)
            {
                missing = i;
                break;
            }
        }
        free(ret);
        return missing;
    }

  3. XOR: XOR a variable with every number from 0 to N, then XOR it with every element of the given array; whatever is left over is the missing number. (O(N))

    int missingNumber(int* nums, int numsSize)
    {
        int value = 0;
        int i = 0;
        // XOR in every number from 0 to numsSize
        for (i = 0; i <= numsSize; i++)
        {
            value ^= i;
        }
        // XOR in every array element: matching pairs cancel,
        // leaving only the missing number in value
        for (i = 0; i < numsSize; i++)
        {
            value ^= nums[i];
        }
        return value;
    }

  4. Arithmetic series formula: subtract every element of the array from the sum of 0 to N; what remains is the missing number. (O(N))

    int missingNumber(int* nums, int numsSize)
    {
        int sum = 0;
        int i = 0;
        int n = numsSize;
        sum = (n + 1) * n / 2;   // formula for the sum of the arithmetic series 0 + 1 + ... + n
        for (i = 0; i < numsSize; i++)
        {
            sum -= nums[i];
        }
        return sum;
    }


5.2 Rotating arrays

(Problem: given an array nums of size numsSize, rotate the array to the right by k steps.)

Advanced:

  • Come up with as many solutions as you can; there are at least three different ways to approach this problem.
  • Can you solve it in place with an algorithm that uses only O(1) extra space?
  1. Rotate right k times, moving one element at a time.

    Time complexity:

    Best case: O(N)

    Worst case: O(N * K) (which can also be written as O(N^2))

    Space complexity: O(1)

  2. Open up an additional array: put the last k elements at the front of the new array and the first N - k elements after them, then copy the result back (a sketch of this approach follows the code at the end of this list).

    Time complexity: O(N) (the array elements are moved in N steps)

    Space complexity: O(N)

  3. Three reversals: first reverse the first N - k elements, then reverse the last k elements, and finally reverse the whole array.

    Time complexity: O(N) (a total of about 2N elements are reversed)

    Space complexity: O(1)

    Code:

    void reverse(int left, int right, int* nums)
    {
        while (left <= right)
        {
            int temp = nums[left];
            nums[left] = nums[right];
            nums[right] = temp;
            left++;
            right--;
        }
    }

    void rotate(int* nums, int numsSize, int k)
    {
        k %= numsSize;
        reverse(0, numsSize - k - 1, nums);        // reverse the first N - k elements
        reverse(numsSize - k, numsSize - 1, nums); // reverse the last k elements
        reverse(0, numsSize - 1, nums);            // reverse the whole array
    }
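
For completeness, here is a minimal sketch of approach 2 above (the extra-array method). The function name and the use of memcpy are my own choices for illustration; the problem only requires that nums end up rotated right by k:

#include <stdlib.h>
#include <string.h>

void rotateWithExtraArray(int* nums, int numsSize, int k)
{
	k %= numsSize;
	int* tmp = (int*)malloc(sizeof(int) * numsSize);
	// the last k elements go to the front of the new array
	memcpy(tmp, nums + numsSize - k, sizeof(int) * k);
	// the first numsSize - k elements follow them
	memcpy(tmp + k, nums, sizeof(int) * (numsSize - k));
	// copy the rotated result back: O(N) time, O(N) extra space
	memcpy(nums, tmp, sizeof(int) * numsSize);
	free(tmp);
}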
    
