A grouped-summation algorithm for report statistics

  In the backend daemon of a reporting system we often run into problems such as this one: summing data in groups according to a given granularity.

  First, the scenario: we have the data for the current interval, $Array = [[data, time], ...], where each element is a value paired with its timestamp, and a time-granularity list $TimeList = [time1, time2, time3, ...]; we want to sum the data in groups, one group per granularity interval.

Here is a fairly efficient algorithm:

     data:

    // $array1: the time-granularity list ($TimeList)
    $array1 = [
        10, 20, 30, 40, 50
    ];
    // $array2: the data, as [value, time] pairs ($Array)
    $array2 = [
        [1, 0],
        [3, 4],
        [5, 15],
        [7, 25],
        [9, 26],
        [15, 30],
        [17, 35],
        [12, 46],
        [3, 48],
        [8, 50],
    ];

algorithm function:

    /**
     * Sum the data grouped by the granularity intervals
     * @param array $Array    data as [value, time] pairs
     * @param array $TimeList granularity sequence
     * @return array one-dimensional array of sums
     */
    function groupSum($Array, $TimeList) {
        $LengthArray = count($Array);
        $LengthTimeList = count($TimeList);
        $res = [];
        // initialize the result array
        for ($i = 0; $i < $LengthTimeList; $i++) {
            $res[$i] = 0;
        }
        // group and sum
        $i = 0;
        $j = 0;
        for (; $i < $LengthTimeList; $i++) {
            for (; $j < $LengthArray; $j++) {
                // once a data point falls past the current boundary,
                // move on to the next granularity interval
                if ($Array[$j][1] > $TimeList[$i]) {
                    break;
                }
                // accumulate the value; adjust according to the requirement
                $res[$i] += $Array[$j][0];
            }
        }
        return $res;
    }
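
As a sanity check, here is a minimal usage sketch with the sample data above; the function name groupSum and the argument order ($array2 as the data, $array1 as the granularity list) follow the corrected signature above, and the expected result was worked out by hand.

    // assumes $array1, $array2 and groupSum() from above are defined
    $result = groupSum($array2, $array1);
    print_r($result);   // expected result: [4, 5, 31, 17, 23]
    // i.e. 1 + 3 for time <= 10, 5 for 10 < time <= 20, 7 + 9 + 15 for 20 < time <= 30,
    //      17 for 30 < time <= 40, and 12 + 3 + 8 for 40 < time <= 50

Note that $j is deliberately not reset in the outer loop, so each data point is visited exactly once and the grouping runs in roughly O(n + m) time; this relies on both $array2 (by its time column) and $array1 being sorted in ascending order, as they are in the sample data.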

Note: this article only represents my own thinking; criticism and corrections are welcome.

 
