In-depth analysis of the quicksort algorithm

The project I was on had been shelved, and reading documentation was putting me to sleep, so I turned to algorithms instead. Seeing quicksort for what felt like the first time (I had probably learned it before and forgotten), I found it quite interesting, so here is an in-depth analysis:

The first step of quicksort splits the array into two parts. Let's start by deriving formulas for the number of traversals (assuming here that traversal is the only time-consuming operation).

An ordinary double-loop sort traverses n * (n - 1) / 2 times. With one split first, the traversal count becomes n + m * (m - 1) / 2 + (n - m) * (n - m - 1) / 2, for 1 <= m <= n - 1.

To explain this formula: the leading n is the single partition pass over the whole array, the other two terms are the ordinary sort applied to each of the two segments, and m is the length of the first segment.

n * (n - 1) / 2 - (n + m * (m - 1) / 2 + (n - m) * (n - m - 1) / 2) = m * (n - m) - n
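Before timing anything, the algebra can be sanity-checked numerically. The two counting functions below mirror exactly what the measurement code further down performs (the function names are my own); the check confirms the closed form of the difference for a few lengths and split points:

```javascript
// Comparisons of a plain double-loop sort over n elements: n(n-1)/2.
function normalCount(n) {
  return (n * (n - 1)) / 2;
}

// One partition pass (n traversals) plus a double-loop sort of each segment,
// where m is the length of the first segment.
function splitCount(n, m) {
  return n + (m * (m - 1)) / 2 + ((n - m) * (n - m - 1)) / 2;
}

// Compare the two counts and confirm the closed form m * (n - m) - n.
for (const n of [10, 100, 1000]) {
  for (const m of [1, Math.floor(n / 2), n - 1]) {
    const saved = normalCount(n) - splitCount(n, m);
    console.log(`n=${n} m=${m} saved=${saved}`, saved === m * (n - m) - n);
  }
}
```

Note that at the extreme splits m = 1 and m = n - 1 the saving is -1 (the split is marginally worse), while anywhere near the middle it is strongly positive.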

This difference is positive for any reasonably balanced split: m * (n - m) is smallest at the extremes m = 1 and m = n - 1, where the difference is just -1, while at m = n / 2 it grows to roughly n * n / 4 - n. Rather than grind through a formal proof (and since we are ignoring all other time-consuming operations anyway), let's verify the end-to-end timing directly with code:

import now from 'performance-now';

// Generate Array to Sort
const originArray = [];
const max = 10;
for (let k = 0; k < max; k++) {
  originArray[k] = Math.floor(Math.random() * max) + 1;
}
console.log("Origin       Array", JSON.stringify(originArray));
console.log();

const len = originArray.length;
let normalSort = [], quickSort = [], normalTimes = 0, quickTimes = 0;

const t0 = now();
// "Normal" sort: a simple exchange sort over the whole array, descending
normalSort = Array.from(originArray);
for (let i = 0; i < len; i++) {
  for (let j = i + 1; j < len; j++) {
    normalTimes++;
    if (normalSort[i] < normalSort[j]) {
      let temp = normalSort[i];
      normalSort[i] = normalSort[j];
      normalSort[j] = temp;
    }
  }
}

const t1 = now();

// Quicksort's first step: one partition pass, then exchange-sort each segment
const half = Math.floor(len / 2);
const pivot = originArray[half];
let rightPart = [];
for (let i = 0; i < len; i++) {
  quickTimes++; // the partition pass itself traverses all n elements
  if (originArray[i] > pivot) {
    quickSort.push(originArray[i]); // larger elements form the front segment
  } else {
    rightPart.push(originArray[i]); // the pivot and smaller elements go behind
  }
}
const splitLen = quickSort.length; // m: length of the front segment
quickSort = quickSort.concat(rightPart);
// Exchange-sort the front segment (length m), descending
for (let i = 0; i < splitLen; i++) {
  for (let j = i + 1; j < splitLen; j++) {
    quickTimes++;
    if (quickSort[i] < quickSort[j]) {
      let temp = quickSort[i];
      quickSort[i] = quickSort[j];
      quickSort[j] = temp;
    }
  }
}
// Exchange-sort the back segment (length n - m), descending
for (let i = splitLen; i < len; i++) {
  for (let j = i + 1; j < len; j++) {
    quickTimes++;
    if (quickSort[i] < quickSort[j]) {
      let temp = quickSort[i];
      quickSort[i] = quickSort[j];
      quickSort[j] = temp;
    }
  }
}
const t2 = now();

console.log("Normal Sort Result", JSON.stringify(normalSort));
console.log("Quick Sort  Result", JSON.stringify(quickSort));
console.log();

console.log("NormalSort took " + (t1 - t0) + " milliseconds, loop count: " + normalTimes);
console.log("QuickSort  took " + (t2 - t1) + " milliseconds, loop count: " + quickTimes);

Below are the execution times for array lengths of 10, 100, ...:

[timing results figure from the original post]

As the results show, apart from being marginally slower at a length of 1000, the split version has an obvious speed advantage at every other size.
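The saving measured here comes from just one split. Full quicksort applies the same partition recursively to every segment, which is where the algorithm's real advantage lies. A minimal recursive sketch (my own illustration, sorting descending like the code above, not the measured code):

```javascript
// Minimal recursive quicksort, descending to match the code above.
function quickSortFull(arr) {
  if (arr.length <= 1) return arr; // a segment of 0 or 1 elements is sorted
  const pivot = arr[Math.floor(arr.length / 2)];
  const left = [];  // elements larger than the pivot
  const mid = [];   // elements equal to the pivot
  const right = []; // elements smaller than the pivot
  for (const x of arr) {
    if (x > pivot) left.push(x);
    else if (x < pivot) right.push(x);
    else mid.push(x);
  }
  // Recurse on each side; the pivot group needs no further sorting.
  return quickSortFull(left).concat(mid, quickSortFull(right));
}

console.log(quickSortFull([3, 1, 4, 1, 5, 9, 2, 6]));
```

Each level of recursion repeats the split's saving on ever-smaller segments, which is how the traversal count drops from the n * (n - 1) / 2 of the plain double loop toward n * log(n) on average.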

 


Origin: www.cnblogs.com/jerryqi/p/11468428.html