Time Complexity and Space Complexity Calculation

An algorithm is essentially a sequence of computational steps. The same problem can be solved with different algorithms, but the time and resources they consume can vary widely. So how do we compare the merits of different algorithms?

Algorithms are generally analyzed along two dimensions: time and space. The time dimension is how long the algorithm takes to run, commonly measured as time complexity. The space dimension is how much memory the algorithm occupies, commonly measured as space complexity.

Reference source: `time complexity and space complexity.md` in the pro648/tips repository on GitHub.

Therefore, analyzing an algorithm mainly means examining its time complexity and space complexity. In many cases the two cannot both be optimal: sometimes time is traded for space, and sometimes space is traded for time.

## 1. Time Complexity

Modern hardware is so powerful that even very expensive algorithms can be fast for small amounts of data. However, when the amount of data increases, the time overhead will increase significantly. Time complexity is a measure of how long an algorithm takes as the amount of data increases.

#### 1.1 Constant Time

A constant-time algorithm takes the same amount of time regardless of the amount of data.

Check out the method below:

```
func checkFirst(names: [String]) {
    if let first = names.first {
        print(first)
    } else {
        print("No Names")
    }
}
```

The time required for this function to execute is independent of the size of the names array. Whether the array has ten elements or ten thousand elements, the function only checks the first element of the array.

The following figure shows the relationship between data volume and time:

![ConstantTime](images/DSAConstantTime.png)

When the amount of data becomes larger, the time required by the algorithm remains the same.

For simplicity, time complexities are written in big O notation. The big O notation for constant time is `O(1)`.
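Array `first` is not the only constant-time operation. As a sketch (the `lookupAge` function and the sample data are illustrative, not from the original), a dictionary lookup by key is also constant time on average: the time does not grow with the number of stored entries.

```swift
// Looking up a value by key in a dictionary takes, on average,
// the same time whether it holds ten entries or ten thousand:
// another O(1) operation.
func lookupAge(of name: String, in ages: [String: Int]) -> Int? {
    return ages[name]
}

let ages = ["Ada": 36, "Alan": 41]
print(lookupAge(of: "Ada", in: ages) ?? -1)  // 36
```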

#### 1.2 Linear Time

The following code prints all elements in the array:

```
func printNames(names: [String]) {
    for name in names {
        print(name)
    }
}
```

As the array grows, the number of loop iterations grows proportionally; this is called linear time complexity.

![LinearTime](images/DSALinearTime.png)

Linear time complexity is the easiest to understand: as the amount of data increases, the time spent increases proportionally, as shown by the straight line in the figure above. The big O notation for linear time complexity is `O(n)`.

> If there are two loops plus six `O(1)` statements, is the big O notation `O(2n + 6)`? Time complexity only describes the shape of the performance curve, so adding a few loops or constant steps does not change it. Big O notation drops all constants: `O(2n + 6)` is equal to `O(n)`. Those constants cannot be ignored when optimizing absolute performance, though. An optimized GPU implementation might be a hundred times faster than a CPU one, yet the time complexity of both is still `O(n)`.
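The point above can be made concrete with a small sketch (the `operationCount` function is illustrative): two passes over the data plus two constant-time steps cost 2n + 2 operations, and big O reduces that to `O(n)`.

```swift
// Two passes over the data plus two constant-time steps:
// 2n + 2 operations in total, which big O reduces to O(n).
func operationCount(for names: [String]) -> Int {
    var count = 0
    count += 1                        // constant-time setup: O(1)
    for _ in names { count += 1 }     // first pass: n operations
    for _ in names { count += 1 }     // second pass: n operations
    count += 1                        // constant-time teardown: O(1)
    return count
}

print(operationCount(for: ["a", "b", "c"]))  // 8
```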

#### 1.3 Quadratic Time

Quadratic time is also called n squared: the time consumed by a quadratic-time algorithm grows with the square of the amount of data. Refer to the following code:

```
func printNamesRepeatedly(names: [String]) {
    for _ in names {
        for name in names {
            print(name)
        }
    }
}
```

If the array has 10 elements, the function above prints all 10 elements 10 times, 100 prints in total. If the array has 11 elements, it prints 11 elements 11 times, 121 prints in total. As the amount of data grows, a quadratic-time algorithm quickly gets out of hand, and the required time rises dramatically.

![QuadraticTime](images/DSAQuadraticTime.png)

The big O notation for square time is `O(n²)`.

> No matter how inefficient a linear-time algorithm is, once the amount of data is large enough it will always be faster than a quadratic-time algorithm, even an optimized one.
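A quick numeric sketch of that claim (the cost functions and the constant 100 are chosen arbitrarily for illustration): even a linear algorithm that does 100 operations per element eventually overtakes a quadratic one.

```swift
// A deliberately slow linear cost (100n) versus a quadratic cost (n²):
// the quadratic algorithm wins for small n but loses past the crossover.
func linearCost(_ n: Int) -> Int { 100 * n }
func quadraticCost(_ n: Int) -> Int { n * n }

for n in [10, 100, 1_000, 10_000] {
    print("n = \(n): linear = \(linearCost(n)), quadratic = \(quadraticCost(n))")
}
```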

#### 1.4 Logarithmic Time

With linear and quadratic time complexity, every element of the input is used at least once. Sometimes, though, only part of the input needs to be examined, which makes the algorithm faster. Given a sorted array of integers, what is the fastest way to find a specific value?

One approach is to loop through the array and compare elements one by one, which takes linear time. The most direct version is:

```
let numbers = [1, 3, 5, 46, 88, 97, 115, 353]

func naiveContains(_ value: Int, in array: [Int]) -> Bool {
    for element in array {
        if element == value {
            return true
        }
    }
    
    return false
}
```

To check whether the number 354 is in the array, the algorithm above loops through the entire array. Since the array is sorted, the search can be halved:

```
func halvedContains(_ value: Int, in array: [Int]) -> Bool {
    guard !array.isEmpty else {
        return false
    }
    
    let middleIndex = array.count / 2
    if value < array[middleIndex] {
        for index in 0...middleIndex {
            if array[index] == value {
                return true
            }
        }
    } else {
        for index in middleIndex..<array.count {
            if array[index] == value {
                return true
            }
        }
    }
    
    return false
}
```

The function above makes a small but effective optimization by comparing only half of the array's elements. It first looks at the middle element: if the target value is smaller, only the first half is searched; otherwise, only the second half.

Repeating this halving produces binary search, which runs in logarithmic time. The following graph shows the relationship between data volume and time for logarithmic time complexity:

![LogarithmTime](images/DSALogarithmicTime.png)

As the amount of data grows, the time required by a logarithmic-time algorithm increases only slowly. Halving 100 items leaves 50; halving 100,000 items leaves 50,000. The larger the amount of data, the more a single halving eliminates.

The halving technique is not complicated, but it is very effective. The big O notation for logarithmic time complexity is `O(log n)`.
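Repeating the halving until the search range is empty gives binary search. A minimal iterative sketch (the `binaryContains` name is illustrative, not from the original): each pass halves the remaining range, so at most about log₂ n + 1 comparisons are needed.

```swift
// Binary search over a sorted array: each pass halves the
// remaining search range, giving O(log n) time.
func binaryContains(_ value: Int, in array: [Int]) -> Bool {
    var low = 0
    var high = array.count - 1
    while low <= high {
        let middle = low + (high - low) / 2
        if array[middle] == value {
            return true
        } else if array[middle] < value {
            low = middle + 1      // discard the lower half
        } else {
            high = middle - 1     // discard the upper half
        }
    }
    return false
}

let numbers = [1, 3, 5, 46, 88, 97, 115, 353]
print(binaryContains(97, in: numbers))   // true
print(binaryContains(354, in: numbers))  // false
```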

#### 1.5 Quasilinear Time

Another common time complexity is quasilinear time. A quasilinear-time algorithm is less efficient than a linear-time algorithm but more efficient than a quadratic-time one. Swift's `sorted()` method on arrays is a quasilinear-time algorithm.

The big O notation for quasilinear time is `O(n log n)`, the product of linear and logarithmic time. As shown below:

![QuasilinearTime](images/DSAQuasilinearTime.png)

The quasilinear curve looks somewhat like the quadratic curve, but a quasilinear-time algorithm performs much better once the amount of data is large.

#### 1.6 Other Time Complexities

The five time complexities above are the ones most commonly encountered. Others used for harder problems, such as polynomial time, exponential time, and factorial time, are not covered here.

Time complexity is a high-level summary of performance: it ranks complexity classes, but it cannot state exactly how fast an algorithm runs. Two algorithms of the same complexity can differ greatly in speed, and when the amount of data is small, time complexity may not be an accurate measure at all.

A quadratic algorithm like insertion sort can be faster than a quasilinear algorithm like merge sort when the dataset is small. This is because insertion sort needs no additional memory to run, whereas merge sort must allocate memory for the intermediate arrays it creates. With small amounts of data, allocating memory can be more expensive than processing the data several times.
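To see why insertion sort needs no extra allocation, here is an illustrative in-place implementation (a sketch, not Swift's built-in sort): elements are shifted within the same array, so the extra space is `O(1)` even though the worst-case time is `O(n²)`.

```swift
// In-place insertion sort: O(n²) worst-case time, O(1) extra space.
// Each element is shifted left within the same array until ordered.
func insertionSort(_ array: inout [Int]) {
    guard array.count > 1 else { return }
    for current in 1..<array.count {
        var position = current
        while position > 0 && array[position] < array[position - 1] {
            array.swapAt(position, position - 1)
            position -= 1
        }
    }
}

var values = [5, 2, 9, 1]
insertionSort(&values)
print(values)  // [1, 2, 5, 9]
```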

## 2. Space Complexity

Space complexity measures the storage space an algorithm temporarily occupies while it runs. As the amount of data grows, a program may need more memory, and space complexity reflects that growth trend.

Please see the following code:

```
func printSorted(_ array: [Int]) {
    let sorted = array.sorted()
    for element in sorted {
        print(element)
    }
}
```

The method above creates a sorted copy of the array and prints its elements. To work out the space complexity, we analyze how much memory the function uses.

`array.sorted()` creates a new array of the same size, so the space complexity is `O(n)`. This method is fairly simple; for a more complex method, it may be necessary to reduce the number of objects created and the memory used.

The above method can be replaced with the following code:

```
func printSorted(_ array: [Int]) {
    // 1
    guard !array.isEmpty else {
        return
    }
    
    // 2
    var currentCount = 0
    var minValue = Int.min
    
    // 3
    for value in array {
        if value == minValue {
            print(value)
            currentCount += 1
        }
    }
    
    while currentCount < array.count {
        // 4
        var currentValue = array.max()!
        
        for value in array {
            if value < currentValue && value > minValue {
                currentValue = value
            }
        }
        
        // 5
        for value in array {
            if value == currentValue {
                print(value)
                currentCount += 1
            }
        }
        
        // 6
        minValue = currentValue
    }
}
```

The algorithm above avoids extra memory use by making multiple passes over the array, printing its values from smallest to largest. Step by step:

1. Check whether the array is empty. If it is, stop.
2. `currentCount` tracks how many values have been printed; `minValue` holds the last value printed.
3. The function first prints every value equal to `minValue`, updating `currentCount` as it goes.
4. Inside the `while` loop, it finds the smallest value greater than `minValue` and stores it in `currentValue`.
5. It prints every value equal to `currentValue` and updates `currentCount`.
6. `minValue` is set to `currentValue`, so the next iteration finds the next smallest value.

The algorithm above creates only a few variables to track the search, so its space complexity is `O(1)`. The trade-off is time: it scans the array repeatedly, exchanging time for space as described earlier.

# Summary

We have now introduced big O notation, the most common measure of algorithmic complexity.

Key points about time complexity and space complexity:

- Time complexity reflects the change in the time required by the algorithm when the amount of data becomes larger.
- You should know constant time, logarithmic time, linear time, quasilinear time, and quadratic time, and be able to rank them by cost.
- Space complexity reflects the change in the memory space occupied by the algorithm when the amount of data becomes larger.
- Use big O notation to represent space complexity and time complexity.
- Time complexity and space complexity describe how a program scales; they cannot give an algorithm's exact execution time.
- When the amount of data is small, time complexity and execution time may not correlate. For example, a quadratic algorithm like insertion sort may be faster than a quasilinear algorithm like merge sort.

References:

1. [Time and Space Complexity Analysis "Data Structure and Algorithm 2"](https://turingplanet.org/2020/02/03/%E3%80%90%E6%95%B0%E6%8D%AE%E7%BB%93%E6%9E%84%E5%92%8C%E7%AE%97%E6%B3%953%E3%80%91/)
2. [Time complexity](https://zh.wikipedia.org/wiki/%E6%97%B6%E9%97%B4%E5%A4%8D%E6%9D%82%E5%BA%A6)

Origin blog.csdn.net/Viviane_2022/article/details/130591716