Best, worst, average, and amortized time complexity

Follow the public account MageByte; setting a star and tapping "Looking" gives us the motivation to build a good learning culture. Reply "加群" (join group) in the background to enter the technical exchange group for more technical growth. This article is written by MageByte - Aoba.

Last time we talked about time complexity and space complexity, introduced several analysis techniques, and covered common complexities such as O(1), O(log n), O(n), and O(n log n). Today we continue with a finer-grained look at time complexity:

1. Best-case time complexity

2. Worst-case time complexity

3. Average-case time complexity

4. Amortized time complexity

Complexity Analysis

public int findGirl(int[] girlArray, int number) {
  int i = 0;
  int pos = -1;
  int n = girlArray.length;
  for (; i < n; ++i) {
    if (girlArray[i] == number) {
      pos = i;
      break;
    }
  }
  return pos;
}

The logic of the code is easy to follow: it finds the position at which number appears in an unsorted array, and returns -1 if it is not found. Think of the movie "Flirting Scholar": Tang Bohu uses this method to scan the array and find the heroine Qiuxiang. Since we have not learned any clever algorithms yet, the only way to tell which one is Qiuxiang is to go through the array from start to finish. The girlArray array holds the codes of Qiuxiang, Dongxiang, the maids, and so on, and we find Qiuxiang by comparing each code against the given number.


The time complexity of this code differs in different situations. To describe the complexity of the code under different circumstances, we introduce the ==best-case, worst-case, and average-case time complexity==. Let n be the length of girlArray.

  1. If Qiuxiang happens to be the first one, the time complexity is O(1).
  2. If Qiuxiang is the last one in the lineup, the time complexity is O(n).
  3. If Qiuxiang is neither the first nor the last, her position in the lineup is undetermined.
  4. If the Hua mansion cheats and Qiuxiang is not in the lineup at all, Tang Bohu still has to inspect the whole lineup before he knows, so the time complexity again becomes O(n). (The sketch below illustrates these cases.)
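A minimal usage sketch of findGirl (the arrays and the code numbers below are made up purely for illustration; assume 3 is Qiuxiang's code):

int[] lineup1 = {3, 7, 5, 9};   // best case: Qiuxiang's code sits at index 0
int[] lineup2 = {7, 5, 9, 3};   // worst case: Qiuxiang's code sits at the last index
int[] lineup3 = {7, 5, 9, 8};   // worst case: Qiuxiang's code is not there at all

findGirl(lineup1, 3);   // returns 0 after 1 comparison   -> O(1)
findGirl(lineup2, 3);   // returns 3 after n comparisons  -> O(n)
findGirl(lineup3, 3);   // returns -1 after n comparisons -> O(n)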

Best-case time complexity

The best-case time complexity is the running time of the code in the best case, that is, when Qiuxiang is found the fastest. Here girlArray represents the lineup of girls and the variable number is Qiuxiang's code. If the first girl is Qiuxiang, the time complexity is O(1).

Worst-case time complexity

The worst-case time complexity is the running time of the code in the worst case: a full scan of the array is required, which is O(n).

Average-case time complexity

In fact, the best and worst cases are extremes, and the probability of hitting them is not high. To represent the time complexity more accurately, we introduce another measure: the average-case time complexity.

Take the "find Qiuxiang" code above again. For the position at which number appears in the array, there are ==n + 1== cases:

The number is either at one of the positions 0 to n-1, or not in the array at all. That gives n cases inside the array plus 1 case outside it, n + 1 cases in total. The number of girls traversed differs in each case. We add up the number of girls traversed in every case and divide by the total number of cases (n + 1) to get the average number of traversals. Key point: average = (sum of elements traversed over all cases) / (number of all cases).

The average complexity:

$$\frac{(1+2+3+\dots+n) + n}{n+1} = \frac{n(n+3)}{2(n+1)}$$

The derivation process:

$$\because 1+2+3+\dots+n = n+(n-1)+(n-2)+\dots+1$$

$$\therefore 1+2+3+\dots+n = \frac{n(1+n)}{2}$$

$$\therefore (1+2+3+\dots+n) + n = \frac{n(3+n)}{2}$$

According to the Big O notation we learned earlier when studying time and space complexity, we omit the coefficients, lower-order terms, and constants, so the average-case time complexity is O(n).
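As a quick sanity check, here is a throwaway sketch that simply enumerates the n + 1 cases with equal weight (the class name and the value of n are arbitrary choices for illustration):

public class AverageCaseCheck {
  public static void main(String[] args) {
    int n = 1000;
    long total = 0;
    // cases 0 .. n-1: number sits at position i, so i + 1 elements are inspected
    for (int i = 0; i < n; ++i) {
      total += i + 1;
    }
    // case n: number is not in the array, so all n elements are inspected
    total += n;
    double average = (double) total / (n + 1);
    double formula = (double) n * (n + 3) / (2.0 * (n + 1));
    // both print roughly 501.0, matching n(n+3) / 2(n+1)
    System.out.println(average + " vs " + formula);
  }
}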

Expected time complexity

The average-case time complexity derived above does not take into account the probability of each case occurring. The n + 1 cases do not all occur with the same probability, so we now redo the analysis with the probability of each case factored in.

The number is either at one of the positions 0 to n-1 or not in the array at all, so each of these two events has probability $\frac{1}{2}$.

Given that the number is in the array, it is equally likely to be at any of the positions 0 to n-1, each with probability 1/n. By the multiplication rule of probability, the probability that number is at any particular position in 0 to n-1 is $\frac{1}{2n}$.
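In other words, for any particular index i in 0 to n-1:

$$P(\text{number is at index } i) = \frac{1}{2} \cdot \frac{1}{n} = \frac{1}{2n}$$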

Therefore, taking the probability of each case into account in the derivation above, we recalculate the average-case time complexity.

The probability-weighted average complexity:

$$1 \cdot \frac{1}{2n} + 2 \cdot \frac{1}{2n} + 3 \cdot \frac{1}{2n} + \dots + n \cdot \frac{1}{2n} + n \cdot \frac{1}{2} = \frac{3n+1}{4}$$
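Expanding the left-hand side step by step, using the sum formula derived above:

$$\sum_{i=1}^{n} i \cdot \frac{1}{2n} + n \cdot \frac{1}{2} = \frac{1}{2n} \cdot \frac{n(n+1)}{2} + \frac{n}{2} = \frac{n+1}{4} + \frac{2n}{4} = \frac{3n+1}{4}$$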

In probability theory this is a weighted average, also called the expected value, so the full name of this average time complexity is the weighted average time complexity, or expected time complexity.

With probabilities introduced, the average complexity becomes O($\frac{3n+1}{4}$); ignoring the constants and coefficients, the resulting weighted average time complexity is O(n). The derivation is finally finished, so you can relax.

Note:

In most cases, we do not need to distinguish between the best-case, worst-case, and average-case time complexity. Only when the same piece of code has time complexities that differ by orders of magnitude in different situations do we distinguish the three, in order to describe the code's time complexity more precisely.

Amortized time complexity

Finally, a hard bone to chew on. With the probability-based expected time complexity above understood, this one is much easier. Amortized time complexity sounds a bit like average time complexity.

Amortized time complexity is a more advanced concept. It is a special case, and its application scenarios are more limited.

The corresponding analysis method is called amortized analysis.

// array is an array of length n
// array.length in the code equals n
 int[] array = new int[n];
 int count = 0;

public void insert(int val) {
    if (count == array.length) {
       int sum = 0;
       for (int i = 0; i < array.length; ++i) {
          sum = sum + array[i];
       }
       array[0] = sum;
       count = 1;
    }

    array[count] = val;
    ++count;
 }

Code logic: insert data into the array. When the array is full (count == array.length), traverse the array to compute the sum, put the sum into the first position of the array, and then insert the new data. If the array still has free space, insert the data directly. A note on "full" here: the storage space can be read and written repeatedly, and if the user treats it as empty, it is empty. Whether "clearing" means rewriting everything to 0 or to some other value is a matter of definition; the user only cares about the new values to be stored.

Time complexity analysis of the code above:

  1. In the ideal case, there is free space and the data can be inserted directly at index count, so it is O(1).
  2. In the worst case, the array has no free space, so we first loop through the array to sum it and then insert. The time complexity is O(n). (The harness sketch below shows when each branch fires.)
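A minimal harness to watch when the two branches fire (the wrapper class name, the array size n = 4, and the inserted values are made up for illustration):

public class InsertDemo {
  static int n = 4;
  static int[] array = new int[n];
  static int count = 0;

  // same insert logic as above
  static void insert(int val) {
    if (count == array.length) {          // array full: the O(n) branch
      int sum = 0;
      for (int i = 0; i < array.length; ++i) {
        sum = sum + array[i];
      }
      array[0] = sum;
      count = 1;
    }
    array[count] = val;                   // free slot: the O(1) branch
    ++count;
  }

  public static void main(String[] args) {
    for (int val = 1; val <= 6; ++val) {
      insert(val);
    }
    // the 5th insert hit the O(n) branch; prints [10, 5, 6, 4]
    System.out.println(java.util.Arrays.toString(array));
  }
}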

The average time complexity

For an array of length n, the insertion can happen at n different positions, and in each of those n cases the complexity is O(1).

There is one more special case: inserting when the array has no free space, which costs O(n). That makes n + 1 cases in total, and each case occurs with probability $\frac{1}{n+1}$. So, by the weighted average method, the average time complexity is:

$$1 \cdot \frac{1}{n+1} + 1 \cdot \frac{1}{n+1} + \dots + 1 \cdot \frac{1}{n+1} + n \cdot \frac{1}{n+1} = \frac{2n}{n+1}$$

Since $\frac{2n}{n+1} < 2$, omitting the constants and coefficients gives an average time complexity of O(1).

In fact, we do not need to make it this complicated. Compare the insert method with findGirl:

  1. findGirl is O(1) only in the extreme case, whereas insert is O(1) in the common case and O(n) only when the array is full.
  2. For the insert() function, the O(1) insertions and the O(n) insertions occur with a very regular frequency and in a fixed order: in general, one O(n) insertion is followed by n-1 O(1) insertions, and the cycle repeats.

Amortized analysis

For an example like this, the average complexity analysis does not need to be so elaborate; there is no need to bring in probability theory.

As the example shows, the complexity is O(1) in the vast majority of cases and only rises to O(n) in the extreme case. Moreover, the complexities follow a regular pattern: typically one O(n) operation is followed by n O(1) operations. For this particular kind of scenario we use a simpler analysis method: amortized analysis.

The time complexity obtained through amortized analysis is called the amortized time complexity.

The general idea: every O(n) operation is followed by n O(1) operations, so we spread the cost of the expensive operation evenly over the cheap ones. The amortized time complexity obtained this way is O(1).
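A quick back-of-the-envelope version of that idea for the insert example above: one full cycle is one insertion that costs about n plus n-1 insertions that cost 1 each, so the cost per insertion in a cycle is

$$\frac{n + (n-1) \cdot 1}{n} = \frac{2n-1}{n} < 2$$

which is a constant, that is, O(1).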

Application scenarios: amortized time complexity and amortized analysis apply to rather special scenarios, where we perform a sequence of consecutive operations on a data structure, the time complexity is very low in most cases and high only for individual operations, and the operations have a coherent timing relationship.

In that case we analyze the whole group of operations together and spread the cost of the few high-complexity operations evenly over the many low-complexity ones. In general, the amortized time complexity is equal to the best-case time complexity.

Note: amortized time complexity is a special kind of average time complexity (used in special application scenarios); the key is to master the analysis method.

Amortized time complexity is a special kind of average time complexity, so we do not need to spend much effort distinguishing between them. What you should master most is the analysis method itself, amortized analysis. Whether the result of the analysis is called amortized or average is just a matter of wording; it does not matter.

A question to end with

Finally, here is a question for everyone. Using what you have just learned in this article, analyze the "best-case", "worst-case", and "amortized" time complexity of the code below.

// Global variables: an array named array of size 10, its length len, and the index i.
int array[] = new int[10];
int len = 10;
int i = 0;

// add an element to the array
void add(int element) {
   if (i >= len) { // the array has run out of space
     // allocate a new array twice the original size
     int new_array[] = new int[len*2];
     // copy the data in the original array into new_array one by one
     for (int j = 0; j < len; ++j) {
       new_array[j] = array[j];
     }
     // point array at new_array; array's size is now 2 * len
     array = new_array;
     len = 2 * len;
   }
   // put element at index i, then increment i
   array[i] = element;
   ++i;
}

The overall idea: add an element to the array; when there is not enough space, allocate a new array twice the original size and copy the data of the original array into the new array.

In fact, you can extend this to the resizing of HashMap: when the number of elements reaches the load factor of 0.75 times the capacity, HashMap has to grow to twice its original capacity and move the elements into the new array. What is the time complexity then?

Follow the public account MageByte and reply "add" in the background to get the answer to this question. You can also reply "加群" (join group) to join the technical exchange group and share your thoughts with us; we will reply as soon as possible.


Reference: "The Beauty of Data Structures and Algorithms" (数据结构与算法之美)
