I ran into a business scenario during development: executing a large batch of data in one go is very time-consuming. One strategy is to process the data in batches, handling a fixed amount each time, which improves processing efficiency. Below is the code I wrote, with comments inline. If you have any questions, better ideas, or suggestions, please leave them in the comments. Thank you!
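Before the unit test itself, the batching idea can be sketched as a small generic helper. This is only a minimal sketch using the plain JDK; the class name `BatchDemo`, the method name `partition`, and the batch size of 5 are my own choices, not from the original code.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchDemo {
    // Split a list into consecutive chunks of at most batchSize elements
    static <T> List<List<T>> partition(List<T> list, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < list.size(); i += batchSize) {
            int toIndex = Math.min(i + batchSize, list.size());
            // Copy the subList view so each batch is independent of the source
            batches.add(new ArrayList<>(list.subList(i, toIndex)));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> data = new ArrayList<>();
        for (int i = 1; i < 17; i++) {
            data.add(i);
        }
        // 16 elements in batches of 5 -> 4 batches: 5 + 5 + 5 + 1
        List<List<Integer>> batches = partition(data, 5);
        System.out.println(batches.size()); // 4
        System.out.println(batches.get(3)); // [16]
    }
}
```

Each batch can then be handed to whatever processing step (a database insert, an RPC call, etc.) inside a simple loop over `batches`.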
Here is the unit test I wrote; the code can be used directly.
@Test
public void testtheam() {
    // Add data to the list for the test below
    List<Integer> list = new ArrayList<>();
    for (int i = 1; i < 17; i++) {
        list.add(i);
    }
    // Length of the list
    int listSize = list.size();
    // Produce a new list for every 5 elements
    int toIndex = 5;
    for (int i = 0; i < listSize; i += 5) {
        // If fewer than 5 elements remain, put them all in the final list
        if (i + 5 > listSize) {
            toIndex = listSize - i;
        }
        // Produce the new sub-list
        List<Integer> subList = list.subList(i, i + toIndex);
        Map<String, List<Integer>> map = new HashMap<>();
        map.put("New sub-list: index from " + i + " to " + (i + toIndex), subList);
        JSONObject jsonObject = JSONObject.fromObject(map);
        // Convert the JSON object to a JSON string
        String returnJson = jsonObject.toString();
        System.out.println(returnJson);
    }
}
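One caveat worth knowing when using this pattern: per the JDK documentation, `List.subList` returns a view backed by the original list, not an independent copy, so changes through either one are visible in the other. The tiny demo below (class name `SubListView` is mine, just for illustration) shows the behavior and how to take a real copy when a batch needs to outlive the source list.

```java
import java.util.ArrayList;
import java.util.List;

public class SubListView {
    public static void main(String[] args) {
        List<Integer> list = new ArrayList<>(List.of(1, 2, 3, 4, 5));
        List<Integer> view = list.subList(0, 2);

        // A write through the view is visible in the original list
        view.set(0, 99);
        System.out.println(list.get(0)); // 99

        // To keep a batch independent of the source, copy the view
        List<Integer> copy = new ArrayList<>(view);
        list.set(0, 1);
        System.out.println(copy.get(0)); // still 99
    }
}
```

Also note that structurally modifying the original list (adding or removing elements) after taking a sub-list makes the view's behavior undefined, so copying is the safer choice if the source list may change.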
Here is the console output from running the test, which comes out quite tidy.