Data Structure Notes - Chapter 2: Algorithms

Part 1: Notes


 

2.1 Algorithms

  An algorithm is a method for solving a stated problem.

  An algorithm is a description of the steps for solving a specific problem; in a computer it is expressed as a finite sequence of instructions, and each instruction represents one or more operations.

    The instructions can be carried out by a person or by a computing device; they may be machine instructions, or they may be expressed in everyday language.

    To solve a particular problem, or class of problems, the instructions must be arranged as a definite sequence of operations. That sequence consists of a set of operations, each of which accomplishes a specific task; this is what an algorithm is.
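
  As a minimal illustration (the example and names below are my own, not from the text), here is an algorithm for finding the largest value in a list, written as a short, finite sequence of operations:

```python
def find_max(numbers):
    """Return the largest value in a non-empty list of numbers."""
    largest = numbers[0]          # step 1: assume the first element is the largest
    for value in numbers[1:]:     # step 2: examine every remaining element
        if value > largest:       # step 3: compare it with the current largest
            largest = value       # step 4: remember the bigger one
    return largest                # step 5: report the result

print(find_max([3, 7, 2, 9, 4]))  # prints 9
```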

 

  • The relationship between data structures and algorithms

   When studying data structures we also talk about algorithms, and discussing algorithms in turn helps us understand data structures better.

 

2.2 Characteristics of an Algorithm

  An algorithm has five characteristics: input, output, finiteness, definiteness, and feasibility.

  An algorithm may have zero or more inputs, but it must have at least one output. The output may take the form of printed text, or it may be one or more returned values.

  Finiteness: the algorithm terminates automatically after executing a finite number of steps and never falls into an infinite loop, and each step completes within an acceptable amount of time.

  Definiteness: every step of the algorithm has a well-defined meaning; no ambiguity can arise.

    Feasibility: every step of the algorithm must be feasible, that is, each step can be carried out by executing a finite number of basic operations. This means the algorithm can be turned into a program that runs on a machine and produces the correct result.

 

2.3 Requirements of Algorithm Design

  Algorithm design requirements: correctness, readability, robustness, high time efficiency, and low storage use.

  • Correctness: the correctness of an algorithm means that it should at least have input, output, and unambiguous processing, correctly reflect the requirements of the problem, and be able to produce the correct answer to the problem.

    Correctness can be divided into the following four levels:

    ① The program implementing the algorithm contains no syntax errors.

    ② For valid input data, the program produces output that satisfies the requirements.

    ③ For invalid input data, the program produces results that satisfy the specification.

    ④ Even for carefully chosen, deliberately difficult test data, the program produces output that meets the requirements.

    Proving that a complex algorithm is correct at every level is extremely costly, so under normal circumstances level ③ is taken as the standard of correctness.

  • Readability: another goal of algorithm design is that the algorithm be easy to read, understand, and communicate.

    High readability helps people understand an algorithm; an obscure algorithm tends to hide errors that are hard to detect, debug, and modify.

  • Robustness: when the input data is invalid, the algorithm can still handle it appropriately rather than producing an exception or inexplicable results (a small sketch follows this list).
  • Time efficiency refers to the execution time of the algorithm: a short execution time means high efficiency, a long execution time means low efficiency.
  • Storage requirement refers to the maximum amount of storage space the algorithm needs while it runs, mainly the memory or external disk space it occupies during execution.

   A well-designed algorithm should aim for both high time efficiency and low storage requirements.
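
   As a small, hedged illustration of robustness (the function below is my own example, not from the text), an algorithm can check for invalid input and handle it explicitly instead of crashing:

```python
def safe_divide(a, b):
    """Divide a by b, handling the invalid input b == 0 instead of crashing."""
    if b == 0:                  # invalid input: handle it explicitly
        return None             # signal "no result" rather than raising an exception
    return a / b

print(safe_divide(10, 2))   # 5.0
print(safe_divide(10, 0))   # None, not a ZeroDivisionError
```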

 

2.4 Methods for Measuring Algorithm Efficiency

  • Post-hoc statistical method: programs are first written for the competing algorithms; then, using well-designed test programs and data, the computer's timer is used to compare the running times of those programs and so determine which algorithm is more efficient.

   Drawbacks:

    ① A program must first be written for each algorithm, which costs a great deal of time and effort.

    ② The results depend on environmental factors such as the computer's hardware and software.

    ③ Designing good test data is difficult, and the running time depends heavily on the scale of the test data.

    For these reasons, the post-hoc statistical method is generally not adopted.

 

  • A-priori analysis and estimation method: before the program is written, the algorithm's efficiency is estimated analytically, based on statistical methods.

     The running time a program consumes depends on:

     ① the strategy and method the algorithm uses (the fundamental factor);

     ② the quality of the code produced by the compiler (software support);

     ③ the input size of the problem (how much input there is);

     ④ the speed at which the machine executes instructions (hardware performance).

    So, setting the hardware and software factors aside, the running time of a program depends on the quality of the algorithm and on the input size of the problem.

    The most reliable way to measure running time is to count how many times the basic operations that consume running time are executed; the running time is proportional to this count.

    Finally, when analyzing a program's running time, the most important thing is to view the program as an algorithm, that is, as a series of steps independent of any particular programming language.

   The count of basic operations is tied to the input size; in other words, the number of basic operations should be expressed as a function of the size of the input.
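
   A sketch of this idea (the functions are illustrative, not from the text): two algorithms that sum the integers 1 through n, with their basic-operation counts written as functions of n.

```python
def sum_loop(n):
    """Sum 1..n with a loop: the addition inside the loop runs n times,
    so the basic-operation count is roughly T(n) = n."""
    total = 0
    for i in range(1, n + 1):
        total += i               # basic operation, executed n times
    return total

def sum_formula(n):
    """Sum 1..n with Gauss's formula: a single arithmetic expression,
    so the basic-operation count is T(n) = 1 regardless of n."""
    return n * (n + 1) // 2      # basic operation, executed once

print(sum_loop(100), sum_formula(100))  # both print 5050
```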

 

2.5 Asymptotic Growth of Functions

  • Asymptotic growth of functions: given two functions f(n) and g(n), if there exists an integer N such that for all n > N, f(n) is always greater than g(n), then we say that f(n) grows asymptotically faster than g(n).

   When judging the efficiency of an algorithm, the constants and other minor terms of the function can often be ignored; what matters is the order of the dominant (highest-order) term.

   As n grows, one algorithm becomes better and better than another, or worse and worse than it. The time complexity of an algorithm is used to estimate its time efficiency.
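
   A small numeric sketch (the values are chosen only for illustration): f(n) = 2n² starts out smaller than g(n) = 100n, but for every n > 50 it is larger, so f(n) grows asymptotically faster than g(n).

```python
f = lambda n: 2 * n * n      # quadratic growth
g = lambda n: 100 * n        # linear growth

for n in (10, 50, 60, 1000):
    print(n, f(n), g(n), f(n) > g(n))
# n=10:   200     < 1000    -> False
# n=50:   5000    = 5000    -> False
# n=60:   7200    > 6000    -> True  (and True for every n > 50)
# n=1000: 2000000 > 100000  -> True
```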

 

  • Time complexity of an algorithm: in algorithm analysis, the total number of statement executions T(n) is a function of the problem size n; we then analyze how T(n) varies with n and determine the order of magnitude of T(n). The time complexity of the algorithm, that is, its time measure, is

      denoted T(n) = O(f(n)). It means that as the problem size n increases, the growth rate of the algorithm's execution time is the same as the growth rate of f(n); this is called the asymptotic time complexity of the algorithm, or simply its time complexity. Here f(n) is some function of the problem size n.

    The capital-O( ) notation used to express an algorithm's time complexity is called big-O notation.

      As n increases, the algorithm whose T(n) grows most slowly is the best algorithm.

 

  • Deriving the big-O order:

   ① Replace all additive constants in the running time with the constant 1.

   ② In the modified execution-count function, keep only the highest-order term.

   ③ If the highest-order term exists and its coefficient is not 1, remove the constant that multiplies it.

   The result is the big-O order. For example, if counting gives T(n) = 3n² + 2n + 1, keeping only the highest-order term leaves 3n², and dropping its coefficient gives O(n²).

 

   Constant order: no matter how large the constant is, we write O(1), never O(3), O(12), or any other number. Hence a simple sequential or branching structure (one that contains no loops) has time complexity O(1).

   Linear order: we need to determine how many times a particular statement (or block of statements) runs. When analyzing an algorithm's complexity, the key is to analyze how its loop structures operate.

   Deriving the big-O order is not hard to understand in itself; what is hard is carrying out the related calculations on sequences and series, which tests your mathematical knowledge and skill more than anything else. A few loop patterns and their orders are sketched below, followed by a table of common orders.
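
```python
def constant_order(n):
    # No loop: a fixed number of statements, so T(n) is a constant -> O(1)
    return n * (n + 1) // 2

def linear_order(n):
    # The loop body runs n times -> T(n) = n -> O(n)
    total = 0
    for i in range(n):
        total += i
    return total

def logarithmic_order(n):
    # i doubles on each pass, so the body runs about log2(n) times -> O(log n)
    count, i = 0, 1
    while i < n:
        i *= 2
        count += 1
    return count

def quadratic_order(n):
    # A loop nested inside a loop: the inner body runs n * n times -> O(n^2)
    count = 0
    for i in range(n):
        for j in range(n):
            count += 1
    return count
```

   These functions are my own hypothetical examples, written only to show how loop structure determines the order; their execution counts line up with the rows of the following table.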

   Execution count function     Order         Informal term
   12                           O(1)          Constant order
   2n + 3                       O(n)          Linear order
   3n² + 2n + 1                 O(n²)         Quadratic order
   5log₂n + 20                  O(log n)      Logarithmic order
   2n + 3nlog₂n + 19            O(n log n)    n log n order
   6n³ + 5n² + 2n + 12          O(n³)         Cubic order
   2ⁿ                           O(2ⁿ)         Exponential order

 

   The common time complexities, ordered from least to most time consumed, are:

   O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!) < O(nⁿ)

 

  • Worst case and average case

    The worst-case running time is the time complexity computed under the worst case. It is a guarantee, and such a guarantee is one of the most important requirements in applications. Unless stated otherwise, the running time we mention is the worst-case running time.

    The average running time is the most meaningful of all the cases, because it is the expected running time. In practice, however, the average running time is hard to obtain through analysis; it is usually estimated from experimental data gathered over a certain number of runs.
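
    As a hedged sketch of the distinction (my own example), consider a sequential search for a key in a list of n elements:

```python
def linear_search(items, key):
    """Return the index of key in items, or -1 if it is absent."""
    for i, value in enumerate(items):   # worst case: key is last or missing -> n comparisons, O(n)
        if value == key:                # average case (key equally likely at any position):
            return i                    # about n/2 comparisons, still O(n) but half the constant
    return -1

data = [4, 8, 15, 16, 23, 42]
print(linear_search(data, 4))    # best case: found on the 1st comparison
print(linear_search(data, 42))   # worst case: found on the 6th (last) comparison
print(linear_search(data, 99))   # worst case: 6 comparisons, then -1
```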

 

2.6 Space Complexity of Algorithms

  Sometimes extra space is spent in order to buy back computation time (trading space for time).

  The space complexity of an algorithm is obtained by computing the storage space the algorithm requires. It is written S(n) = O(f(n)), where n is the problem size and f(n) is a function of n describing the storage space occupied by the statements.

  In general, when a program runs on a machine, besides the storage needed for the program's own instructions, constants, variables, and input data, it also needs storage units for operating on the data. We only need to analyze the auxiliary storage the algorithm requires when it is implemented.

  If the auxiliary space an algorithm needs during execution is constant relative to the amount of input data, the algorithm is said to work in place, and its space complexity is O(1).
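
  A sketch of the difference (illustrative functions, not from the text): reversing a list in place uses a constant amount of auxiliary space, while building a reversed copy uses auxiliary space proportional to the input.

```python
def reverse_in_place(items):
    """Reverse items using only two index variables: auxiliary space O(1), i.e. 'works in place'."""
    left, right = 0, len(items) - 1
    while left < right:
        items[left], items[right] = items[right], items[left]
        left += 1
        right -= 1
    return items

def reverse_copy(items):
    """Build a new reversed list of the same length: auxiliary space O(n)."""
    result = []
    for value in items:
        result.insert(0, value)   # grows a second list as large as the input
    return result

print(reverse_in_place([1, 2, 3, 4]))  # [4, 3, 2, 1]
print(reverse_copy([1, 2, 3, 4]))      # [4, 3, 2, 1]
```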

 

  We use "time complexity" to refer to running-time requirements and "space complexity" to refer to space requirements. When "complexity" is used without a qualifier, it usually means time complexity.

 

Part 2: Summary


  • The relationship between data structures and algorithms: they are interdependent and inseparable.
  • Definition of an algorithm: a description of the steps for solving a specific problem, expressed in a computer as a finite sequence of instructions, each of which represents one or more operations.
  • Characteristics of an algorithm: input, output, finiteness, definiteness, and feasibility.
  • Algorithm design requirements: correctness, readability, robustness, high time efficiency, and low storage use.
  • The characteristics of an algorithm and the requirements of algorithm design are easy to confuse; compare them carefully when memorizing.
  • Methods for measuring algorithm efficiency: the post-hoc statistical method (unscientific and inaccurate) and the a-priori analysis and estimation method.
  • Asymptotic growth of functions: given two functions f(n) and g(n), if there exists an integer N such that for all n > N, f(n) is always greater than g(n), then we say that f(n) grows asymptotically faster than g(n).

     As n grows, one algorithm becomes better and better than another, or worse and worse than it. The time complexity of an algorithm is used to estimate its time efficiency.

  • Deriving the big-O order:

   ① Replace all additive constants in the running time with the constant 1.

   ② In the modified execution-count function, keep only the highest-order term.

   ③ If the highest-order term exists and its coefficient is not 1, remove the constant that multiplies it.

   The result is the big-O order.

  • The common time complexities, ordered from least to most time consumed, are:

   O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!) < O(nⁿ)

 


Source: www.cnblogs.com/zixuandiezhuzhu/p/11764427.html