I recently looked at the quicksort implementation on w3c and wanted to record my understanding of it. As I understand it, quicksort picks a random value from the array as the pivot: values greater than the pivot go into one group, values smaller than it into another, and each group is then partitioned again in the same way, much like building a binary search tree. For example, the original array [5, 8, 9, 2, 3] is first split around 5 into [2, 3], 5, [8, 9]; then [2, 3] and [8, 9] are each partitioned again, until the leftmost element is the smallest and the rightmost is the largest. Here is my implementation:
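The grouping step described above can also be sketched on its own; `partition` here is a hypothetical helper name used only for illustration, not part of either implementation below:

```javascript
// One partition step: split an array into the values below,
// equal to, and above a chosen pivot value.
function partition(array, pivot) {
  var left = [], center = [], right = [];
  for (var i = 0; i < array.length; i++) {
    if (array[i] < pivot) left.push(array[i]);
    else if (array[i] > pivot) right.push(array[i]);
    else center.push(array[i]);
  }
  return [left, center, right];
}

// With pivot 5, [5, 8, 9, 2, 3] splits into [2, 3], [5], [8, 9].
console.log(partition([5, 8, 9, 2, 3], 5));
```

Repeating this step recursively on the left and right groups is exactly the sort described above.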
function group(array) {
  if (array.length <= 1) {
    return array;
  }
  // Pick a random pivot index. Math.random() takes no arguments, so the
  // result must be scaled: Math.floor(Math.random(array.length - 1)) is
  // always 0 and would silently pick array[0] every time.
  var x = Math.floor(Math.random() * array.length);
  var jizhi = array[x]; // jizhi = pivot value
  var leftArray = [];
  var rightArray = [];
  var centerArray = [];
  var result = [];
  for (var i = 0; i < array.length; i++) {
    if (array[i] > jizhi) {
      rightArray.push(array[i]);
    } else if (array[i] < jizhi) {
      leftArray.push(array[i]);
    } else {
      centerArray.push(array[i]);
    }
  }
  result = result.concat(group(leftArray));
  result = result.concat(centerArray);
  result = result.concat(group(rightArray));
  return result;
}
console.log(group([1, 4, 2, 8, 345, 123, 43, 32, 5643, 63, 123, 43, 2, 55, 1, 234, 92]));
This implementation feels wasteful on memory, since every recursive call allocates several new arrays. I then looked at the w3c answer:
function quickSort(array) {
  // Sorts array[prev .. numsize - 1] in place.
  function sort(prev, numsize) {
    var nonius = prev;      // left scan pointer
    var j = numsize - 1;    // right scan pointer
    var flag = array[prev]; // pivot: first element of the range
    if ((numsize - prev) > 1) {
      while (nonius < j) {
        // Scan from the right for a value smaller than the pivot.
        for (; nonius < j; j--) {
          if (array[j] < flag) {
            array[nonius++] = array[j]; // a[i] = a[j]; i += 1;
            break;
          }
        }
        // Scan from the left for a value larger than the pivot.
        for (; nonius < j; nonius++) {
          if (array[nonius] > flag) {
            array[j--] = array[nonius];
            break;
          }
        }
      }
      array[nonius] = flag; // drop the pivot into its final slot
      sort(0, nonius);      // note: recurses from index 0, not from prev
      sort(nonius + 1, numsize);
    }
  }
  sort(0, array.length);
  return array;
}
Its efficiency looked like it should be much higher, but when I ran a test I found, strangely, that my implementation seems to be faster. I'm a little confused here; could someone more experienced point out where the problem is? Thanks. Here is the comparison process.
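The original comparison is not reproduced here, but a minimal timing harness along these lines can be used to repeat the test; `timeSort` and its parameters are illustrative names, not code from either implementation:

```javascript
// Rough timing sketch: sort copies of the same random data and
// report elapsed milliseconds. "sortFn" is any function that takes
// an array and returns it sorted (e.g. group or quickSort above).
function timeSort(sortFn, runs) {
  var data = [];
  for (var i = 0; i < 10000; i++) {
    data.push(Math.floor(Math.random() * 100000));
  }
  var start = Date.now();
  for (var r = 0; r < runs; r++) {
    sortFn(data.slice()); // copy so every run sorts unsorted input
  }
  return Date.now() - start;
}
```

Sorting a fresh copy on every run matters: an in-place sort like `quickSort` would otherwise receive already-sorted input from the second run onward, which skews the comparison.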
Afterwards this looked to me like merge sort, but on closer inspection there is still a difference: merge sort always splits the array into two halves at the midpoint, while quicksort partitions around a randomly chosen pivot value.
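For contrast, here is a textbook merge sort sketch (generic code, not from the original post): note that the split point is always the midpoint and never depends on the data, unlike the pivot in quicksort.

```javascript
// Merge sort: split at the midpoint, sort each half, then merge.
function mergeSort(array) {
  if (array.length <= 1) return array;
  var mid = Math.floor(array.length / 2); // fixed split point, no pivot
  var left = mergeSort(array.slice(0, mid));
  var right = mergeSort(array.slice(mid));
  var merged = [];
  // Repeatedly take the smaller front element of the two halves.
  while (left.length && right.length) {
    merged.push(left[0] <= right[0] ? left.shift() : right.shift());
  }
  return merged.concat(left, right); // append whichever half remains
}

console.log(mergeSort([5, 8, 9, 2, 3])); // [2, 3, 5, 8, 9]
```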