On Algorithms and Data Structures

What is this about?

Data structures and algorithms are ideas for improving the efficiency and performance of programs. When you face a new problem, knowing them gives you a sense of which direction to take to solve it.

The concept of an algorithm

An algorithm is the essence of how a computer processes information (put simply, it is the idea for solving a problem, realized as a program). A computer program is essentially an algorithm that tells the computer the exact steps to perform a specified task.

An algorithm is a method and idea for solving a problem, independent of any particular language (the language is not important; the idea is).

Five properties

  1. Input: zero or more inputs
  2. Output: at least one output
  3. Finiteness: the algorithm terminates after a finite number of steps, not an infinite number; and each step completes within an acceptable time
  4. Definiteness: each step of the algorithm has a definite meaning, with no ambiguity
  5. Feasibility: each step of the algorithm is feasible and can be completed in a finite number of executions (it is achievable)

Measuring the efficiency of an algorithm

Does execution time reflect the efficiency of an algorithm? What about the hardware? The storage used?

  • Time complexity reflects a trend

    O(n): n is the problem scale; we count how the number of basic operations grows with n
    For example, T(n) = K * n^3 + C is O(n^3): the constants K and C are dropped

    • Worst-case time complexity: the number of steps in the worst case
    • Best-case time complexity: the number of steps in the best case
    • Average time complexity: the average number of steps
    • The worst-case time complexity provides a guarantee, so it is the main concern
  • Computing time complexity
    1. A basic operation, i.e. only a constant term: the time complexity is O(1)
    2. Sequential structure: add the complexities
    3. Loop structure: multiply the complexities
    4. Branch structure (if): take the maximum of the branches
    5. Keep only the highest-order term; constants and other minor terms can be ignored
    6. Unless stated otherwise, "time complexity" usually means the worst case
  • Common big-O time complexities
    Example function (number of executions) | Big O | Informal term
    - | - | -
    12 | O(1) | constant order
    2n + 3 | O(n) | linear order
    3n^2 + 2n + 1 | O(n^2) | quadratic order
    5log2n + 20 | O(logn) | logarithmic order
    2n + 3nlog2n + 19 | O(nlogn) | nlogn order
    6n^3 + 2n^2 + 3n + 4 | O(n^3) | cubic order
    2^n | O(2^n) | exponential order

    Note: log2n is the base-2 logarithm
    Time consumed, in ascending order:
    O(1) < O(logn) < O(n) < O(nlogn) < O(n^2) < O(n^3) < O(2^n) < O(n!) < O(n^n)
  • Space complexity
    • A measure of the temporary storage space an algorithm occupies while running; it also reflects a trend
    • Computed similarly to time complexity
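As a rough sketch of the counting rules above (loops multiply, sequential parts add, constants are dropped), here is an illustrative function of my own that counts the basic operations of a nested double loop:

```python
def count_steps(n):
    """Count the basic operations of a nested double loop plus a constant part."""
    steps = 0
    for i in range(n):        # outer loop runs n times
        for j in range(n):    # inner loop runs n times per outer iteration
            steps += 1        # one basic operation -> n * n operations in total
    steps += 5                # a sequential, constant-time part: added, not multiplied
    return steps

# count_steps(n) == n*n + 5; keeping only the highest-order term gives O(n^2)
```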

Data structures

Data is an abstract concept; the basic types (int, float) are how programming languages classify it. Data elements are not independent of each other: there are specific relationships between them, and these relationships are the structure.
Algorithms + Data Structures = Programs

Abstract data type, ADT (Abstract Data Type)

An abstract data type bundles a data type together with the operations on that data type.
The most common data operations:

  1. Insert
  2. Delete
  3. Modify
  4. Query
  5. Sort
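A minimal Python sketch of the idea (the class and method names are my own, purely illustrative): an ADT bundles the stored data with exactly these operations.

```python
class SimpleTable:
    """A toy ADT: the data (a Python list) bundled with its operations."""
    def __init__(self):
        self._items = []

    def insert(self, value):
        self._items.append(value)

    def delete(self, value):
        self._items.remove(value)

    def modify(self, index, value):
        self._items[index] = value

    def query(self, value):
        return value in self._items

    def sort(self):
        self._items.sort()
```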

Linear tables comprise sequential lists and linked lists

Sequential list

Stored in order (contiguously)

  1. The basic layout of a sequential list
  2. A sequential list with externally stored elements
    A complete sequential list requires two parts of information: the header (the capacity and the current number of stored elements) and the data area

Implementation approaches

  • Integrated
    The header and the data area are stored together in one contiguous memory region
    Replacing the stored elements means replacing the whole structure
  • Separate
    The header and the data area are stored separately, connected by a link
    Replacing the stored elements only requires updating the data-area address in the header

Element storage expansion strategies (dynamic sequential list):

  • Expand by a fixed number of storage slots each time. Features: saves space, but expansion operations are frequent
  • Double the storage capacity each time. Features: fewer expansion operations, at the cost of some wasted space
  • Trading space for time, the doubling strategy is recommended
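The trade-off can be seen by counting resize operations (a sketch of my own; the growth strategies are passed in as functions):

```python
def resizes_needed(grow, capacity, target):
    """Count the resize operations needed to grow `capacity` up to `target`."""
    count = 0
    while capacity < target:
        capacity = grow(capacity)  # apply the chosen expansion strategy
        count += 1
    return count

# Growing from capacity 8 to 1024:
# fixed step of 8 -> 127 resizes; doubling -> 7 resizes (space for time)
```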

Linked list

0x11 ('data area', '0x34')  0x34 ('data area', 'memory address of the next node')
Each node stores its data together with the address of the next node.

  • Singly circular linked list
    The tail node's .next is redirected to the head node
    When operating on it, take care with the links between the head node and the tail node
  • Doubly linked list
    (predecessor link | data field | successor link) The tail node's successor points to None
  • Doubly circular linked list
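The node layout described above can be sketched in a few lines (a minimal sketch; the class and variable names are my own):

```python
class Node:
    """A singly linked list node: a data field plus a link to the next node."""
    def __init__(self, data):
        self.data = data
        self.next = None  # plays the role of the "memory address of the next node"

# Build a three-node chain: head -> 'b' -> 'c' -> None
head = Node('a')
head.next = Node('b')
head.next.next = Node('c')

# Traverse from the head, following next links until None (the tail)
values = []
node = head
while node is not None:
    values.append(node.data)
    node = node.next
```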

Stack

A stack is a container that allows insertion and removal at one end only, so it follows the last in, first out (LIFO) principle
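In Python a plain list already behaves as a stack (a quick illustration, not from the original):

```python
stack = []          # a Python list used as a stack
stack.append(1)     # push
stack.append(2)
stack.append(3)
top = stack.pop()   # pop removes the item pushed last (LIFO)
```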

Queue

A queue is a first in, first out (FIFO) linear structure
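A quick FIFO illustration using the standard library's `collections.deque` (my choice here; a plain list would also work but pops from the front in O(n)):

```python
from collections import deque

queue = deque()
queue.append('a')        # enqueue at the tail
queue.append('b')
first = queue.popleft()  # dequeue from the head: the item enqueued first (FIFO)
```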

Deque

A double-ended queue: elements can be inserted and removed at both the head and the tail
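`collections.deque` supports exactly these operations at both ends (illustration of my own):

```python
from collections import deque

d = deque([2])
d.appendleft(1)    # insert at the head
d.append(3)        # insert at the tail
front = d.popleft()  # remove from the head
back = d.pop()       # remove from the tail
```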

Sorting algorithms

Stability of sorting algorithms: a stable sorting algorithm keeps records with equal keys in their original relative order
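Python's built-in `sorted()` is guaranteed stable, which makes the property easy to observe (the sample records are my own):

```python
records = [('apple', 2), ('pear', 1), ('banana', 2), ('grape', 1)]
# Sorting by the count leaves equal keys in their original relative order:
# 'pear' stays before 'grape', and 'apple' stays before 'banana'
by_count = sorted(records, key=lambda r: r[1])
```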

Bubble Sort

O(n**2)

def bubble_sort(alist):
    # each pass bubbles the largest remaining element to the end
    for i in range(len(alist)-1, 0, -1):
        for j in range(i):
            if alist[j] > alist[j+1]:
                alist[j], alist[j+1] = alist[j+1], alist[j]

Selection Sort

O(n**2)

def selection_sort(alist):
    # swap whenever a smaller element is found,
    # so position i ends up holding the minimum of the remainder
    for i in range(len(alist)):
        for j in range(i+1, len(alist)):
            if alist[i] > alist[j]:
                alist[i], alist[j] = alist[j], alist[i]

Insertion Sort

def insert_sort(alist):
    for i in range(1, len(alist)):
        # shift alist[i] left until it sits in place within the sorted prefix
        j = i
        while j > 0 and alist[j] < alist[j-1]:
            alist[j], alist[j-1] = alist[j-1], alist[j]
            j -= 1

Shell sort

Implemented on the basis of insertion sort: elements a fixed gap apart are insertion-sorted, and the gap shrinks until it reaches 1

def shell_sort(alist):
    gap = len(alist) // 2
    while gap > 0:
        # gapped insertion sort for the current gap size
        for i in range(gap, len(alist)):
            j = i
            while j >= gap and alist[j] < alist[j-gap]:
                alist[j], alist[j-gap] = alist[j-gap], alist[j]
                j -= gap
        gap //= 2

Quick Sort

Average time complexity O(nlogn); worst case O(n**2)

def quick_sort(alist, first, last):
    if first >= last:
        return
    mid_value = alist[first]  # pivot
    low = first
    high = last
    while low < high:
        while low < high and alist[high] >= mid_value:
            high -= 1
        alist[low] = alist[high]
        while low < high and alist[low] < mid_value:
            low += 1
        alist[high] = alist[low]
    alist[low] = mid_value
    # recurse
    # quick sort the left part
    quick_sort(alist, first, low-1)
    # quick sort the right part
    quick_sort(alist, low+1, last)

Note the use of recursion

Merge sort

def merge_sort(alist):
    if len(alist) <= 1:
        return alist
    mid = len(alist) // 2
    left = merge_sort(alist[:mid])
    right = merge_sort(alist[mid:])
    # merge the two sorted halves into a new list
    left_pointer, right_pointer = 0, 0
    result = []
    while left_pointer < len(left) and right_pointer < len(right):
        if left[left_pointer] < right[right_pointer]:
            result.append(left[left_pointer])
            left_pointer += 1
        else:
            result.append(right[right_pointer])
            right_pointer += 1
    result += left[left_pointer:]
    result += right[right_pointer:]
    return result

Binary search

Non-recursive version

def binary_search(key, alist):
    # alist must be sorted; narrow [low, high] by halving it each step
    low = 0
    high = len(alist) - 1
    while low <= high:
        mid = (low + high) // 2
        if alist[mid] == key:
            return mid
        elif alist[mid] < key:
            low = mid + 1
        else:
            high = mid - 1
    raise ValueError('key not found')

Recursive version

def binary_search(key, alist):
    n = len(alist)
    if n > 0:
        mid = n // 2
        if alist[mid] == key:
            return True
        elif key < alist[mid]:
            return binary_search(key, alist[:mid])
        else:
            return binary_search(key, alist[mid+1:])
    return False

Comparison of common sorting algorithms

Algorithm | Average time | Worst case | Stable?
- | - | - | -
Bubble sort | O(n^2) | O(n^2) | stable
Selection sort | O(n^2) | O(n^2) | unstable
Insertion sort | O(n^2) | O(n^2) | stable
Shell sort | depends on the gap sequence | O(n^2) | unstable
Quick sort | O(nlogn) | O(n^2) | unstable
Merge sort | O(nlogn) | O(nlogn) | stable

This is my own summary from watching a video. If anything here is wrong, please let me know.


Origin www.cnblogs.com/200zhl/p/11078610.html