Time Complexity of Algorithms --- Data Structures

Table of contents

Foreword:

1. Time complexity

1.1 Understanding of time complexity

1.2 Scale and execution times of basic operations

1.3 Big O Asymptotic Notation

1.4 Calculate the number of basic operations

2. Common time complexity and its advantages and disadvantages comparison



  Learning data structures is an interesting thing. I hope readers can truly feel the relationships between data in my posts, and can keep a clear idea and a picture in mind when operating on data elements!


Foreword:

  How do we judge whether an algorithm is good or bad? In computing, we use time complexity. Time complexity is not an exact value; it roughly describes the time efficiency of an algorithm. The code you write at this stage runs fast because it is relatively simple, constant order O(1) or linear order O(N). But write a very bad algorithm, feed it a very large input (a lot of data), and the computer will cry when it runs it.

1. Time complexity

1.1 Understanding of time complexity

  Why do we need the concept of time complexity? Because arithmetic takes us time, and it likewise takes the computer time to execute the instructions we give it; how long it takes matters. Provided the answer is correct, we want the computer to run fast and deliver results without making us wait.

  Having seen why time complexity is necessary, how do we calculate the time complexity of an algorithm? Run the program on a computer and measure how long it takes? That method is clearly infeasible, as readers have probably realized: different computers have different hardware performance and software support, and even on the same computer you cannot guarantee that its state during one run exactly matches its state during another.

1.2 Scale and execution times of basic operations

   Assume the computer takes one unit of time to process each statement. We then count how many times the basic operation in the algorithm executes, and express that count in big O notation (a bit like taking a limit):

#include <stdio.h>

void Func(int N)
{
    int count = 0;
    for (int i = 0; i < N; i++)     // runs N times
    {
        for (int j = 0; j < N; j++) // for each i, the inner loop runs N times
        {
            count++;                // N * N in total
        }
    }
    for (int k = 0; k < 5 * N; k++) // 5 * N (the original "5N" is not valid C)
    {
        count++;                    // 5N times
    }
    int a = 10;
    while (a--)
    {
        count++;                    // 10 times
    }
    printf("%d", count);
}

  How many times is the basic operation on count executed?

  1. The first part, the two nested loops: executed N^2 times;
  2. The second part, the single for loop: executed 5N times;
  3. The third part, the while loop: executed 10 times.

 In total, F(N) = N^2 + 5N + 10 executions. As N (the problem size) increases, the execution count increases with it. Let's compare the number of basic operations for the problem sizes listed below:

  • N = 10, F(N) = 160
  • N = 100, F(N) = 10510
  • N = 1000, F(N) = 1005010

  As the problem size N grows, F(N) grows with it.

Now count the executions of the N^2 term alone:

N = 10, N^2 = 100

N = 100, N^2 = 10000

N = 1000, N^2 = 1000000

Compared with the 5N + 10 part, N^2 contributes by far the most to the execution count. We therefore keep only N^2, and the time complexity is O(N^2); this is the big O asymptotic notation. The part whose impact on running time grows fastest as the scale increases determines the algorithm's time complexity.

 Big O asymptotic notation: keep only the term in the execution-count expression that has the greatest impact on the result, that is, the highest-order term.

1.3 Big O Asymptotic Notation

  Let's analyze some code to understand time complexity in the big O sense and to summarize the rules of big O notation.

void Fun(int N)
{
    int count = 0;
    for (int k = 0; k < 2 * N; k++)
    {
        count++; // 2N times
    }
    int n = 10;
    while (n--)
    {
        count++; // 10 times
    }
}

 This code performs 2N + 10 executions; what is its time complexity? You might say O(2N), but is that really so? It should actually be O(N). Think of it this way: what drives the count is N. If N is 0, the 2N part is 0 and only the fixed 10 executions remain; as N grows from 0 toward 10000, the count keeps rising. The factor of 2 merely scales a quantity whose size is set by N.

  Big O asymptotic notation: for the highest-order term, any constant multiplying (or dividing) it can be dropped; what remains after dropping it is the big O expression.

void Fun(int N, int M)
{
    int count = 0;
    for (int i = 0; i < N; i++)
    {
        count++; // N times
    }
    for (int j = 0; j < M; j++)
    {
        count++; // M times
    }
}

  The time complexity here is O(N + M). The point I want to make is that although we habitually write N for the execution count, habit must not stop us from judging correctly. Here N and M are both unknowns, and both affect the number of basic operations, so both naturally belong inside the big O. Big O notation can carry two unknowns, or even more, depending on the situation.

 If the problem states that N is much greater than M, the time complexity becomes O(N). If N and M are about the same size, the time complexity can be written O(N) or O(M), because N + M is then roughly 2N or 2M, and the constant in front of the highest-order term does not matter.

void Fun(int N)
{
    int count = 0;
    for (int i = 0; i < 100; i++)
    {
        count++; // 100 times
    }
}

  Here the operation executes 100 times, and the time complexity is not O(N), because N is never used. Is it O(100), then? No: in big O notation every additive constant is written as 1. So the time complexity of this code is O(1), meaning it executes a constant number of times. O(1) does not mean the algorithm executes once; 100 executions are still written O(1). The point is that no matter how the scale changes, or how much, this algorithm finishes after a constant number of executions, which is the best an algorithm can do!

Big O asymptotic notation: replace all additive constants with the constant 1

Big O notation:
• The highest-order term has the most influence on the result; focus on the highest-order term.
• Drop the constant factor on the highest-order term.
• Replace all additive constants with the number 1.

  Example: F(N) = 2N^2 + N + 10. We can see directly that the time complexity is O(N^2), and I think readers have gotten the hang of it. Given the basic-operation count, expressing the time complexity in big O notation is easy; the important skill is calculating, or simply seeing, how many times the algorithm executes.

1.4 Calculate the number of basic operations

Three cases (best, worst, average):

#include <stddef.h> // for NULL

const char* search_ch(const char* str, char ch)
{
    // scan the character array from the head onward,
    // looking for the desired character
    while (*str != '\0')
    {
        if (*str == ch)
            return str;
        str++;
    }
    return NULL;
}

  Given a character array of length N, how many basic operations does it take to search for a character ch in it?

  • Best case: the first character is the one we want; one comparison, time complexity O(1).
  • Worst case: we traverse the whole array; the character may be at the very end or not there at all. Either way, the array of length N is scanned in full (about N comparisons), time complexity O(N).
  • Average case: the character sought and the contents of the array are random; on average we find it around the middle, after about N/2 comparisons, so the time complexity is still O(N).

  When an algorithm has these three cases, we look at the worst one: plan for the worst and stand on solid ground. Readers of xianxia novels will recall that the hero, however powerful, still considers the worst possible outcome; in other words, he is thorough.
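To make the three cases concrete, here is a minimal usage sketch; the main function and the test strings are my illustration, not part of the original post:

#include <stdio.h>

int main()
{
    const char* s = "hello";

    // Best case: 'h' is the first character, found after 1 comparison.
    const char* best = search_ch(s, 'h');

    // Worst case: 'z' is absent, so all 5 characters are scanned.
    const char* worst = search_ch(s, 'z');

    printf("best: %s\n", best ? best : "not found");
    printf("worst: %s\n", worst ? worst : "not found");
    return 0;
}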

The number of basic operations of bubble sort:

  Bubble sort arranges a set of numbers in ascending order. Speaking of bubbling: have you noticed how, when a kettle boils, bubbles rise from the bottom and grow from small to large? Abstractly, think of the small bubbles as small numbers and the big rising bubbles as big numbers. Since this is computer knowledge, the image is only to spark interest, not to teach physics (doge).

  Take the numbers 8 7 6 5 4 3 2 1; we want the ascending sequence 1 2 3 4 5 6 7 8. With just the idea in hand, we can analyze it and work out its time complexity without typing any code.

  Since this is a sequence of length N, bubbling the first number to the end takes N operations; that is the first pass. After the first pass, one number sits where it belongs, leaving only N - 1 numbers to sort. The second pass bubbles N - 1 times, the third N - 2 times, ..., and the last pass once. The total, N + (N - 1) + (N - 2) + ... + 2 + 1, is an arithmetic series: first term plus last term, times the number of terms, divided by 2, giving N(N + 1)/2 basic operations for bubble sort. The time complexity is O(N^2).
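Although the analysis needs no code, a minimal C sketch of the idea may help; the function name and the swap details are my own, not the blogger's:

void bubble_sort(int a[], int n)
{
    for (int pass = 0; pass < n - 1; pass++)   // n - 1 passes in total
    {
        for (int i = 0; i < n - 1 - pass; i++) // each pass bubbles one number into place
        {
            if (a[i] > a[i + 1])               // swap adjacent out-of-order pairs
            {
                int tmp = a[i];
                a[i] = a[i + 1];
                a[i + 1] = tmp;
            }
        }
    }
}

The comparison count here is (n - 1) + (n - 2) + ... + 1 = n(n - 1)/2, slightly tighter than the N(N + 1)/2 above, but still O(N^2).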

The number of basic operations of binary search:

  Binary search finds a desired number in a sorted array. Suppose we have an array whose elements are 1 2 3 4 5 6 7 8 9.

  Suppose we are looking for the number 9. Because the array is sorted and the middle element is smaller than the one we want, there is no need to search to the left of the middle index: the left index moves to mid + 1 (discarding half the data), and a new middle index is then computed from left and right:

  While left <= right, there are still elements to examine; once left exceeds right, the value cannot be found. The final probe happens when left == right; at that point, found or not, the sorted array of length N has been searched through. How many probes is that? Each probe halves the remaining elements, so for an array of length N we want the x with 2^x = N. Think of it this way: if the array has 8 elements, each binary-search probe discards half the data, and since 2 to the 3rd power is 8, three probes are enough.

  So the basic-operation count is x = log2(N), the logarithm of N with base 2, and the time complexity is O(logN).

  Here logN means the base-2 logarithm of N. Since the base is awkward to type as a subscript, we agree to write it this way; some also write it as lgN, which is a bit more imprecise, so we prefer logN.
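A minimal C sketch of the procedure described above; the function name and parameters are my own, assuming an ascending int array:

int binary_search(const int a[], int n, int target)
{
    int left = 0, right = n - 1;
    while (left <= right)                    // elements remain to examine
    {
        int mid = left + (right - left) / 2; // avoids overflow of left + right
        if (a[mid] < target)
            left = mid + 1;                  // discard the left half
        else if (a[mid] > target)
            right = mid - 1;                 // discard the right half
        else
            return mid;                      // found: return the index
    }
    return -1;                               // not found
}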

Exercise: find the time complexity of computing a factorial recursively:

long long Fac(size_t N)
{
    return N < 2 ? 1 : Fac(N - 1) * N; // 0! = 1! = 1 (the original returned N here, making Fac(0) wrongly 0)
}

  Here recursion computes the factorial of N. What is the time complexity? Trace the calls: Fac(N) calls Fac(N - 1), which calls Fac(N - 2), and so on down to Fac(1), so for any N the Fac() function is called about N times, and each call performs roughly three basic operations (compare, multiply, return). In other words, the work inside one Fac call is O(1), the function is called N times, about 3N basic operations in all, so the time complexity is O(N).

It is also easy to understand with a for loop; readers can draw the analogy:

#include <stdio.h>

int main()
{
    int count = 0;
    int N = 0;
    scanf("%d", &N);
    for (int i = 0; i < N; i++)     // loops N times
    {
        for (int j = 0; j < 3; j++) // three operations for each value of i
        {
            count++;                // 3N times in total
        }
    }
    return 0;
}

  Looping N times here corresponds to the N recursive calls, each involving three basic operations: 3N in total.

  Extended topic: calculating the time complexity of the Fibonacci sequence.
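The blogger leaves this as an extended topic. As a pointer, here is a minimal naive recursive sketch (my code, not the blogger's): each call spawns two more, so the call count roughly doubles at every level, giving about O(2^N).

long long Fib(size_t N)
{
    if (N < 3)
        return 1;                   // Fib(1) = Fib(2) = 1
    return Fib(N - 1) + Fib(N - 2); // two recursive calls per level: ~2^N calls in total
}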

2. Common time complexity and its advantages and disadvantages comparison

  Common time complexities are: O(N^2), O(N), O(logN), O(1)

 O(1) is the best time complexity; nothing beats it. O(logN) is also excellent: it is orders of magnitude better than O(N). Let's do the arithmetic:

  For example, suppose we search an array for an element that happens to sit at the very end. When N = 1000, an O(N) algorithm performs 1000 basic operations, while an O(logN) algorithm needs only about 10, because 2 to the 10th power is 1024.

  When N = 1000000 (a million), O(N) needs a million operations, while O(logN) needs only about 20, since we can simply treat 1024 * 1024 as 1000 * 1000.

  When N = 1000000000 (a billion), O(logN) needs only about 30 operations!
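Summarizing the comparison (the O(logN) counts are rounded from log2(N)):

N                O(N) operations    O(logN) operations
1,000            1,000              ~10
1,000,000        1,000,000          ~20
1,000,000,000    1,000,000,000      ~30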

  With that, we have finished the material on time complexity. The blogger may publish a separate article on space complexity and some properties of algorithms; there is not much to it. Space complexity is also written in big O notation and is likewise an estimate: it measures the extra space a program occupies. For example, creating one variable, int a;, counts as 1 unit.


Conclusion: I hope readers gain something from this read! Learn data structures well!

  If anything in this article is unclear, or you spot an error, please leave a comment below to tell the blogger~ Suggestions for improving the article are also very welcome. Many thanks! Last of all:

  ❤ Please like and follow; your likes are the driving force behind my updates. Let's improve together.

Origin blog.csdn.net/muwithxi/article/details/130313596