Five Algorithms Every Programmer Must Master

As a programmer, mastering algorithms is essential: they are among the key tools for program development and problem solving. This article introduces five algorithms that every programmer will encounter and should master, in the hope that it helps with your study and work.

1. Sorting Algorithm - Quick Sort

Sorting is one of the most basic and frequently used operations, and quicksort is one of the most efficient sorting algorithms worth mastering. Quicksort is a divide-and-conquer algorithm: it selects a pivot element from the sequence to be sorted, partitions the sequence into two subsequences, one containing the elements smaller than the pivot and the other containing the elements greater than the pivot, and then recursively sorts the two subsequences. Quicksort's average time complexity is O(n log n), and it is one of the most widely used sorting algorithms today.

The following is a sample code for quick sort:

def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    else:
        pivot = arr[0]  # use the first element as the pivot
        less = [x for x in arr[1:] if x <= pivot]
        greater = [x for x in arr[1:] if x > pivot]
        # Recursively sort both partitions and concatenate around the pivot
        return quick_sort(less) + [pivot] + quick_sort(greater)

arr = [3, 7, 5, 1, 9, 2]
sorted_arr = quick_sort(arr)
print(sorted_arr)
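
The version above is easy to read, but it allocates new lists on every recursive call. For reference, here is a minimal sketch of an in-place variant using the Lomuto partition scheme (the name quick_sort_inplace is illustrative, not from the original post):

def quick_sort_inplace(arr, low=0, high=None):
    if high is None:
        high = len(arr) - 1
    if low < high:
        # Partition around the last element as the pivot
        pivot = arr[high]
        i = low - 1
        for j in range(low, high):
            if arr[j] <= pivot:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[high] = arr[high], arr[i + 1]
        p = i + 1
        # Recursively sort the two halves around the pivot position
        quick_sort_inplace(arr, low, p - 1)
        quick_sort_inplace(arr, p + 1, high)

arr = [3, 7, 5, 1, 9, 2]
quick_sort_inplace(arr)
print(arr)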

2. Search Algorithm - Binary Search

Binary search is an efficient search algorithm for sorted lists or arrays. Its basic idea is to halve the search range on each iteration until the target element is found or it is determined not to exist. The time complexity of binary search is O(log n).

The following is a sample code for binary search:

def binary_search(arr, target):
    low = 0
    high = len(arr) - 1
    
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    
    return -1

arr = [1, 2, 3, 5, 7, 9]
target = 5
index = binary_search(arr, target)
print("目标元素在索引", index)

3. Graph Algorithm - Depth First Search

Depth-first search (DFS) is an algorithm for traversing a graph or tree. Its basic idea is to start from a starting node, go as deep as possible along one path, and then backtrack to the previous node to explore the remaining paths. DFS is usually implemented with recursion or with an explicit stack (non-recursive).

The following is a sample code for depth-first search:

def dfs(graph, start, visited=None):
    if visited is None:
        visited = set()
    
    visited.add(start)
    print(start, end=" ")
    
    for neighbor in graph[start]:
        if neighbor not in visited:
            dfs(graph, neighbor, visited)

# The graph is represented as an adjacency list
graph = {
    'A': ['B', 'C'],
    'B': ['D', 'E'],
    'C': ['F'],
    'D': [],
    'E': ['F'],
    'F': []
}

dfs(graph, 'A')
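
As mentioned above, DFS can also be written without recursion by keeping an explicit stack. A minimal sketch, reusing the adjacency-list graph defined above (the name dfs_iterative is illustrative):

def dfs_iterative(graph, start):
    visited = set()
    stack = [start]
    while stack:
        node = stack.pop()
        if node in visited:
            continue
        visited.add(node)
        print(node, end=" ")
        # Push neighbors in reverse so they are popped in their original order
        for neighbor in reversed(graph[node]):
            if neighbor not in visited:
                stack.append(neighbor)

dfs_iterative(graph, 'A')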

4. Dynamic Programming - Knapsack Problem

Dynamic programming is an optimization technique for problems that exhibit overlapping subproblems and optimal substructure. The 0/1 knapsack problem is a classic dynamic-programming problem: given a knapsack capacity and a set of items, each with a weight and a value, choose a subset of items to put into the knapsack so that the total weight does not exceed the capacity and the total value is maximized.

Here is a sample code for the knapsack problem:

def knapsack(weights, values, capacity):
    n = len(weights)
    # dp[i][j]: maximum value achievable using the first i items with capacity j
    dp = [[0] * (capacity + 1) for _ in range(n + 1)]
    
    for i in range(1, n + 1):
        for j in range(1, capacity + 1):
            if weights[i - 1] <= j:
                # Either skip item i-1 or take it and use the remaining capacity
                dp[i][j] = max(dp[i - 1][j], values[i - 1] + dp[i - 1][j - weights[i - 1]])
            else:
                dp[i][j] = dp[i - 1][j]
    
    return dp[n][capacity]

weights = [2, 3, 4, 5]
values = [3, 4, 5, 6]
capacity = 8
max_value = knapsack(weights, values, capacity)
print("背包能装的最大价值为", max_value)

5. Combinatorial Optimization - Traveling Salesman Problem

The traveling salesman problem (TSP) is a classic combinatorial optimization problem: given a set of cities and the distances between them, find the shortest tour that visits every city exactly once and returns to the starting city. TSP is NP-hard, so no polynomial-time algorithm is known, but many heuristics can produce good approximate solutions.

Here is a sample code for the traveling salesman problem (using dynamic programming):

def tsp(distances, start):
    # Held-Karp bitmask dynamic programming.
    # dp[i][mask]: minimum cost of a path that starts at `start`,
    # visits exactly the set of cities in `mask`, and ends at city i.
    n = len(distances)
    full = (1 << n) - 1
    dp = [[float('inf')] * (1 << n) for _ in range(n)]
    dp[start][1 << start] = 0
    
    for mask in range(1, 1 << n):
        for i in range(n):
            if mask & (1 << i) == 0 or dp[i][mask] == float('inf'):
                continue
            for j in range(n):
                if mask & (1 << j) != 0:
                    continue
                new_mask = mask | (1 << j)
                dp[j][new_mask] = min(dp[j][new_mask], dp[i][mask] + distances[i][j])
    
    # Close the tour by returning to the starting city
    min_distance = float('inf')
    for i in range(n):
        if i != start:
            min_distance = min(min_distance, dp[i][full] + distances[i][start])
    
    return min_distance

distances = [
    [0, 2, 9, 10],
    [1, 0, 6, 4],
    [15, 7, 0, 8],
    [6, 3, 12, 0]
]
start = 0
min_distance = tsp(distances, start)
print("最短路径的总距离为", min_distance)

These are five important algorithms that every programmer should master. Learning them will improve the efficiency of your program development and problem solving and lay a solid foundation for your career. I hope this article is helpful!
