1. Algorithm efficiency
Algorithm complexity
Once an algorithm is compiled into an executable program, running it consumes time resources and space (memory) resources. The quality of an algorithm is therefore generally measured along two dimensions, time and space, namely time complexity and space complexity.
Time complexity measures how fast an algorithm runs, while space complexity measures the extra space the algorithm requires. In the early days of computing, machines had very little storage, so space complexity mattered a great deal. With the rapid development of the computer industry, however, storage capacity has grown enormously, and the space complexity of an algorithm no longer demands the same special attention.
Today the time complexity of an algorithm usually matters more than its space complexity. When we use software, we want it to run quickly and without lag, and we rarely worry about the space it occupies: phones and computers iterate quickly, memory capacities keep growing, and low-memory devices are increasingly rare, so users seldom need to think about running out of memory.
Why did hardware grow so fast? The story traces back to Gordon Moore, one of the co-founders of Intel.
He proposed Moore's Law : When the price remains unchanged, the number of transistors that can be accommodated on an integrated circuit will double approximately every 18 months, and the performance will also double. In other words, computer performance per dollar will more than double every 18 months. This law reveals the speed of information technology progress.
2. Time complexity
1. The concept of time complexity
Definition of time complexity: in computer science, the time complexity of an algorithm is a function that quantitatively describes the algorithm's running time. The actual time an algorithm takes cannot be computed theoretically; you only know it by running the program on a machine. But must every algorithm be tested on a computer? It could be, but that would be very tedious, which is why the method of time complexity analysis was developed. The time an algorithm takes is proportional to the number of times its statements execute; the number of executions of the basic operations in the algorithm is its time complexity.
2. Big O asymptotic representation
Big O notation: a mathematical notation used to describe the asymptotic behavior of a function.
Rules for deriving big O notation:
1. Replace all additive constants in the run-time function with the constant 1.
2. In the modified function, keep only the highest-order term.
3. If the highest-order term has a coefficient other than 1, drop the coefficient.
3. Examples of common time complexity calculations
Example 1. Calculate the time complexity of Func1?
void Func1(int N)
{
    int count = 0;
    for (int i = 0; i < N; ++i)
    {
        for (int j = 0; j < N; ++j)
        {
            ++count;
        }
    }
    for (int k = 0; k < 2 * N; ++k)
    {
        ++count;
    }
    int M = 10;
    while (M--)
    {
        ++count;
    }
    printf("%d\n", count);
}
Function expression: F(N) = N² + 2 * N + 10
The time complexity of this code is O(N²) according to the asymptotic representation of big O.
Example 2. Calculate the time complexity of Func2?
void Func2(int N)
{
    int count = 0;
    for (int k = 0; k < 2 * N; ++k)
    {
        ++count;
    }
    int M = 10;
    while (M--)
    {
        ++count;
    }
    printf("%d\n", count);
}
Function expression: F(N) = 2 * N + 10
The time complexity of this code is O(N) according to the asymptotic representation of big O.
Example 3. Calculate the time complexity of Func3?
void Func3(int N, int M)
{
    int count = 0;
    for (int k = 0; k < M; ++k)
    {
        ++count;
    }
    for (int k = 0; k < N; ++k)
    {
        ++count;
    }
    printf("%d\n", count);
}
Function expression: F(N) = M + N
Because the complexity depends on two independent inputs, M and N, there are three situations:
1) If M is much larger than N, the time complexity is O(M).
2) If N is much larger than M, the time complexity is O(N).
3) If M and N are about the same size, the time complexity is O(M + N), equivalently O(M) or O(N).
So there is no single fixed answer; in general it is written O(M + N), and the concrete case must be analyzed from the relationship between M and N.
Example 4. Calculate the time complexity of Func4?
void Func4(int N)
{
    int count = 0;
    for (int k = 0; k < 100; ++k)
    {
        ++count;
    }
    printf("%d\n", count);
}
Function expression: F(N) = 100
100 is a constant, so by the asymptotic representation of big O the time complexity of this code is O(1), which denotes constant time: a fixed number of operations, regardless of N.
Example 5. Calculate the time complexity of strchr?
const char* strchr(const char* str, int character);
strchr scans str one character at a time looking for character, so the number of iterations depends on where (or whether) the character occurs. In the best case it is found immediately, O(1); in the worst case the entire string of length N is traversed. Time complexity conventionally takes the worst case, so it is O(N).
Example 6. Time complexity of calculating BubbleSort?
// Helper used below; swaps two ints through pointers
void Swap(int* p1, int* p2)
{
    int tmp = *p1;
    *p1 = *p2;
    *p2 = tmp;
}

void BubbleSort(int* a, int n)
{
    assert(a);
    for (size_t end = n; end > 0; --end)
    {
        int exchange = 0;
        for (size_t i = 1; i < end; ++i)
        {
            if (a[i - 1] > a[i])
            {
                Swap(&a[i - 1], &a[i]);
                exchange = 1;
            }
        }
        if (exchange == 0)
            break;
    }
}
Judging by the pattern of the previous examples one might guess O(N), but counting the basic operations shows that in the worst case the inner loop runs N - 1, N - 2, ..., 1 times, so F(N) = N * (N - 1) / 2.
Keeping only the highest-order term, the time complexity is O(N²). (In the best case, an already sorted array, the exchange flag stops the sort after a single pass, which is O(N); time complexity conventionally reports the worst case.)
Example 7. Calculate the time complexity of BinarySearch?
int BinarySearch(int* a, int n, int x)
{
    assert(a);
    int begin = 0;
    int end = n;  // search the half-open interval [begin, end)
    while (begin < end)
    {
        int mid = begin + ((end - begin) >> 1);
        if (a[mid] < x)
            begin = mid + 1;
        else if (a[mid] > x)
            end = mid;
        else
            return mid;
    }
    return -1;
}
Each pass through the loop halves the remaining interval: N → N/2 → N/4 → ... → 1. If the loop runs x times, then 2^x ≈ N, so x ≈ log₂N, and the time complexity is O(logN).
3. Space complexity
Space complexity is also an asymptotic measure, expressed in big O notation: it describes the amount of extra storage an algorithm temporarily occupies while running, as a function of the input size.
Note: the stack space a function needs at runtime (to store parameters, local variables, some register state, etc.) is determined at compile time, so space complexity is determined mainly by the extra space the function explicitly requests at runtime (for example, with malloc).
4. Common complexity comparison
The common complexities of general algorithms, ordered from slowest-growing to fastest-growing, are:
O(1) < O(logN) < O(N) < O(N*logN) < O(N²) < O(2^N) < O(N!)
5. Complexity OJ exercises
5.1 Missing number. OJ link: https://leetcode-cn.com/problems/missing-number-lcci/
One natural approach is sorting + traversal: sort the array, then walk through it; the first position where a value is not the previous value + 1 exposes the missing number. With a comparison sort this costs O(N*logN).
Idea 1: dynamically allocate extra space
If complexity is not a concern, we can simply allocate a helper array of N + 1 slots and drop every value into the slot whose index equals the value; the index of the slot that is never filled is the missing number. Time complexity O(N), space complexity O(N).
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <assert.h>

// Idea 1: dynamically allocate extra space
int Findnum(int* pc, int sz)
{
    assert(pc);
    int* arr2 = (int*)malloc((sz + 1) * sizeof(int));
    if (arr2 == NULL)
    {
        perror("Findnum::malloc");
        return 0;
    }
    // Fill every slot with -1 so the slot of the missing number stays unmatched
    memset(arr2, -1, (sz + 1) * sizeof(int));
    for (int i = 0; i < sz; i++)
    {
        arr2[pc[i]] = pc[i];  // drop each value into its own slot
    }
    int ret = -1;
    for (int j = 0; j < sz + 1; j++)
    {
        if (arr2[j] != j)     // this slot was never filled
        {
            ret = j;
            break;
        }
    }
    free(arr2);               // free before returning to avoid a memory leak
    return ret;
}
int main()
{
    int arr1[] = { 0, 3, 5, 6, 1, 2 };
    int sz = sizeof(arr1) / sizeof(arr1[0]);
    int num = Findnum(arr1, sz);
    printf("%d", num);
    return 0;
}
Idea 2: XOR
XOR every element of the array together with every integer in [0, N]. Since x ^ x = 0 and x ^ 0 = x, each number present in both groups cancels out, and the value left over is exactly the missing number.
Time complexity O(N)
// Idea 2: XOR
int Findnum(int* pc, int sz)
{
    assert(pc);
    int n = 0;
    for (int i = 0; i < sz; i++)
    {
        n ^= pc[i];
    }
    for (int j = 0; j < sz + 1; j++)
    {
        n ^= j;
    }
    return n;
}
int main()
{
    int arr1[] = { 0, 3, 5, 6, 1, 2 };
    int sz = sizeof(arr1) / sizeof(arr1[0]);
    int num = Findnum(arr1, sz);
    printf("%d", num);
    return 0;
}
Idea 3: summation
Sum the integers 0 through N (with a loop, or directly with the arithmetic-series formula N * (N + 1) / 2) and subtract the sum of the array's elements; the difference is the missing number.
Time complexity O(N)
// Idea 3: summation
int main()
{
    int arr1[] = { 0, 3, 5, 6, 1, 2 };
    int sum1 = 0;
    int sum2 = 0;
    int sz = sizeof(arr1) / sizeof(arr1[0]);
    for (int i = 0; i < sz; i++)
    {
        sum1 = sum1 + arr1[i];
    }
    for (int j = 0; j < sz + 1; j++)
    {
        sum2 = sum2 + j;
    }
    int n = sum2 - sum1;
    printf("%d", n);
    return 0;
}