Time complexity and space complexity of data structures

Hello everyone, I am deep fish~

Table of contents

1. Preface to data structure

1.1 What is a data structure

1.2 What is an algorithm

1.3 The Importance of Data Structures and Algorithms

1.4 How to learn data structure and algorithm well

2. Efficiency of the algorithm

3. Time complexity

3.1 The concept of time complexity

3.2 Asymptotic representation of big O

[Example 1]: Time complexity of a single loop: O(N)

[Example 2]: Time complexity of two independent loops: O(N+M)

[Example 3]: Time complexity of a constant loop: O(1)

[Example 4]: Time complexity of strchr: O(N)

[Example 5]: Time complexity of bubble sort: O(N^2)

[Example 6]: Time complexity of binary search: O(log2N)

[Example 7]: Time complexity of factorial recursion: O(N)

[Example 8]: Time complexity of Fibonacci recursion: O(2^N)

4. Space complexity

[Example 1]: Space complexity of bubble sort: O(1)

[Example 2]: Space complexity of iterative Fibonacci: O(N)

[Example 3]: Space complexity of factorial recursion: O(N)

[Expansion]: Space complexity of Fibonacci recursion: O(N)


1. Preface to data structure

1.1 What is a data structure

To implement projects, data must be stored in memory. A data structure is the way a computer stores and organizes data: a collection of data elements that have one or more specific relationships with each other. e.g.: arrays, linked lists, trees...

1.2 What is an algorithm

An algorithm is, simply put, a series of computational steps that transform input data into output results. Common algorithms include: sorting, searching, deduplication, recommendation algorithms...

1.3 The Importance of Data Structures and Algorithms

Campus-recruitment written tests contain many questions on data structures and algorithms, and they matter in day-to-day work as well. You can take a look at this link:

The importance of data structures and algorithms to a programmer

1.4 How to learn data structure and algorithm well

<1> Write more code

<2> Draw diagrams and think things through

2. Efficiency of the algorithm

The efficiency of an algorithm is judged on two counts. The first is time efficiency, i.e. time complexity; the second is space efficiency, i.e. space complexity. However, as the computer industry has developed, storage capacity has grown enormously, so nowadays we usually pay less attention to the space complexity of an algorithm.

3. Time complexity

3.1 The concept of time complexity

The time complexity of an algorithm is, mathematically, a function of the input size. It is not defined by the algorithm's running time, because the actual running time differs from environment to environment. e.g.: a machine from 10 years ago with a 2-core CPU and 2 GB of RAM runs the same program more slowly than today's machine with an 8-core CPU and 8 GB of RAM. Instead, the number of times the algorithm's basic operations are executed is its time complexity.

3.2 Asymptotic representation of big O

Calculate how many times Func1 performs its basic operation:

void Func1(int N)
{
    int count = 0;
    for (int i = 0; i < N; ++i)
    {
        for (int j = 0; j < N; ++j)
        {
            ++count;
        }
    }
    for (int k = 0; k < 2 * N; ++k)
    {
        ++count;
    }
    int M = 10;
    while (M--)
    {
        ++count;
    }
    printf("%d\n", count);
}

Number of basic operations performed by Func1: F(N)=N*N+2*N+10

When N = 10 F(N) = 130

When N = 100 F(N) = 10210

When N = 1000 F(N) = 1002010

The larger N is, the less the last two terms affect the result, so when calculating time complexity in practice we only need an approximate count. Here we use big-O asymptotic notation (an estimate), giving the time complexity O(N^2).

Big O asymptotic notation:

(1) Replace all additive constants in the running time with the constant 1

(2) In the resulting count function, keep only the highest-order term

(3) If the highest-order term exists and its coefficient is not 1, drop the coefficient

[Example 1]: Time complexity of a single loop: O(N)

The exact count is 2*N + 10; by rules (1) and (3) of big-O asymptotic notation this simplifies to O(N)

// What is the time complexity of Func2?
void Func2(int N)
{
    int count = 0;
    for (int k = 0; k < 2 * N; ++k)
    {
        ++count;
    }
    int M = 10;
    while (M--)
    {
        ++count;
    }
    printf("%d\n", count);
}

[Example 2]: Time complexity of two independent loops: O(N+M)

(If M >> N, the time complexity is O(M);

 if N >> M, the time complexity is O(N);

 if M and N are of the same order, either O(M) or O(N) is fine.)

In general, N is used for the unknown in time-complexity calculations, but M, K, etc. can also be used.

// What is the time complexity of Func3?
void Func3(int N, int M)
{
    int count = 0;
    for (int k = 0; k < M; ++k)
    {
        ++count;
    }
    for (int k = 0; k < N; ++k)
    {
        ++count;
    }
    printf("%d\n", count);
}

[Example 3]: Time complexity of constant loop: O(1)

The exact count is 100; by rule (1) of big-O asymptotic notation this simplifies to O(1)

(O(1) does not mean the algorithm runs once, but a constant number of times)

// What is the time complexity of Func4?
void Func4(int N)
{
    int count = 0;
    for (int k = 0; k < 100; ++k)
    {
        ++count;
    }
    printf("%d\n", count);
}

[Example 4]: Time complexity of strchr: O(N)

// What is the time complexity of strchr?
const char * strchr ( const char * str, int character );

The logic of strchr is essentially the following:

while (*str)
{
    if (*str == character)
        return str;
    else
        ++str;
}
return NULL;

Take the string "hello world" as an example:

Searching for 'h': 1 time. Best case: minimum number of runs over inputs of any size (lower bound).

Searching for 'w': about N/2 times. Average case: expected number of runs over inputs of any size (roughly (best + worst) / 2).

Searching for 'd': N times. Worst case: maximum number of runs over inputs of any size (upper bound).

When an algorithm's running time depends on the input, time complexity is stated pessimistically: we look at the worst case. So the time complexity of this example is O(N).

[Example 5]: Time complexity of bubble sort: O(N^2)

// What is the time complexity of BubbleSort?
void BubbleSort(int* a, int n)
{
    assert(a);
    for (size_t end = n; end > 0; --end)
    {
        int exchange = 0;
        for (size_t i = 1; i < end; ++i)
        {
            if (a[i-1] > a[i])
            {
                Swap(&a[i-1], &a[i]);
                exchange = 1;
            }
        }
        if (exchange == 0)
            break;
    }
}

The inner loop runs N-1, N-2, N-3, ..., 1 times; the exact total is N*(N-1)/2, so big-O asymptotic notation gives O(N^2)

To calculate time complexity, don't just count the layers of loops; look at the algorithm's underlying idea

[Example 6]: Time complexity of binary search: O(log2N)

// What is the time complexity of BinarySearch?
int BinarySearch(int* a, int n, int x)
{
    assert(a);
    int begin = 0;
    int end = n - 1;
    while (begin <= end)
    {
        int mid = begin + ((end - begin) >> 1);
        if (a[mid] < x)
            begin = mid + 1;
        else if (a[mid] > x)
            end = mid - 1;
        else
            return mid;
    }
    return -1;
}

Best case: O(1)

Worst case: O(log2N)

Why is it O(log2N)?

[Picture understanding]: Suppose the search takes X steps on an array of size N. Each unsuccessful comparison halves the remaining range, so in the worst case N keeps getting divided by 2 until only 1 element is left:

N/2/2/2/.../2 = 1

2^X = N

X = log2N

 It can be seen that the binary search algorithm is a very powerful algorithm

Search among N numbers      Approximate number of searches

1,000                       10

1,000,000                   20

1,000,000,000               30

But the premise of this algorithm is that the array is sorted

[Example 7]: Time complexity of factorial recursion: O(N)

Time complexity of a recursive algorithm: number of recursive calls × work done per call.

// What is the time complexity of the factorial recursion Factorial?
long long Factorial(size_t N)
{
    return N < 2 ? N : Factorial(N-1)*N;
}

Factorial(N) calls Factorial(N-1), which calls Factorial(N-2), ..., down to Factorial(1): N calls in total, each doing a constant amount of work, so the time complexity is O(N)

[Example 8]: Time complexity of Fibonacci recursion: O(2^N)

// What is the time complexity of the Fibonacci recursion Fibonacci?
long long Fibonacci(size_t N)
{
    return N < 2 ? N : Fibonacci(N-1)+Fibonacci(N-2);
}

[Picture understanding]: Each call spawns two smaller recursive calls, forming a binary tree of calls. The branches on the right finish earlier, so the total number of calls is the sum of a geometric series minus the calls missing from the lower-right part of the tree:

Number of calls = 2^0 + 2^1 + 2^2 + ... + 2^(N-1) − X

Each call does only a constant amount of work, so multiplying by it does not change the order.

The big-O asymptotic notation is therefore O(2^N)

It can be seen that the recursive version of the Fibonacci sequence is practically useless as an algorithm: it is far too slow

4. Space complexity

Space complexity is also a mathematical function: it measures the extra, temporary storage an algorithm occupies while running

It is not measured in how many bytes the program occupies (that is not very meaningful) but in the number of variables

The calculation rules are basically the same as for time complexity, and big-O asymptotic notation is used as well

[Note]: The stack space a function needs at runtime (parameters, local variables, bookkeeping information, etc.) is determined at compile time, so space complexity is mainly determined by the extra space the function requests at runtime

[Example 1]: Space complexity of bubble sort: O(1)

Bubble sort uses three variables: exchange, end, and i. That is a constant number, so by big-O asymptotic notation the space complexity is O(1)

// What is the space complexity of BubbleSort?
void BubbleSort(int* a, int n)
{
    assert(a);
    for (size_t end = n; end > 0; --end)
    {
        int exchange = 0;
        for (size_t i = 1; i < end; ++i)
        {
            if (a[i-1] > a[i])
            {
                Swap(&a[i-1], &a[i]);
                exchange = 1;
            }
        }
        if (exchange == 0)
            break;
    }
}

[Example 2]: Space complexity of iterative Fibonacci: O(N)

This function returns an array holding the first n terms of the Fibonacci sequence, not a single number. It dynamically allocates N+1 slots, which simplifies to a space complexity of O(N)

Its time complexity is O(N), far better than the recursive version's O(2^N)

// What is the space complexity of Fibonacci?
// Returns the first n terms of the Fibonacci sequence
long long* Fibonacci(size_t n)
{
    if (n == 0)
        return NULL;
    long long* fibArray =
        (long long*)malloc((n + 1) * sizeof(long long));
    fibArray[0] = 0;
    fibArray[1] = 1;
    for (size_t i = 2; i <= n; ++i)
    {
        fibArray[i] = fibArray[i - 1] + fibArray[i - 2];
    }
    return fibArray;
}

[Example 3]: Space complexity of factorial recursion: O(N)

// What is the space complexity of the factorial recursion Factorial?
long long Factorial(size_t N)
{
    return N < 2 ? N : Factorial(N-1)*N;
}

[Picture understanding]: The function recurses N levels deep, opening N stack frames, each using a constant amount of space, so the space complexity is O(N) (for recursion, just look at the depth of the recursion)

[Expansion]: The space complexity of the recursive Fibonacci sequence: O(N)

// What is the space complexity of the Fibonacci recursion Fibonacci?
long long Fibonacci(size_t N)
{
    return N < 2 ? N : Fibonacci(N-1)+Fibonacci(N-2);
}

[Picture understanding]: The calls first go down the leftmost branch: Fib(N), Fib(N-1), ..., Fib(1). As each call returns, its stack frame is destroyed and the space is reused by the next branch, so the recursion depth never exceeds N (not 2^N), and the space complexity is O(N)

Space can be reused, so it does not accumulate

Time, once spent, is gone forever, so it does accumulate

That's all for the time and space complexity of data structures. If you have any questions, feel free to ask in the comments or message me. If you think this post was well written, or you learned something from it, please like, favorite, and follow. Thank you very much!


Origin blog.csdn.net/qq_73017178/article/details/131690026