Spring Batch (5) - ItemWriter

ItemWriter


An ItemReader reads data one item at a time, in a loop; an ItemWriter writes data one chunk at a time, where a chunk is a batch of items.
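The read-one / write-a-chunk cycle can be illustrated with a dependency-free sketch (plain Java, no Spring Batch; the class and method names are made up for illustration):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Minimal sketch of chunk-oriented processing:
// the reader is called once per item, the writer once per chunk.
public class ChunkLoopSketch {

    public static List<List<String>> process(Iterator<String> reader, int chunkSize) {
        List<List<String>> writes = new ArrayList<>();
        List<String> chunk = new ArrayList<>();
        while (reader.hasNext()) {
            chunk.add(reader.next());               // read: one item at a time
            if (chunk.size() == chunkSize) {
                writes.add(new ArrayList<>(chunk)); // write: a whole chunk at once
                chunk.clear();
            }
        }
        if (!chunk.isEmpty()) {
            writes.add(chunk);                      // final partial chunk
        }
        return writes;
    }

    public static void main(String[] args) {
        List<String> items = List.of("a", "b", "c", "d", "e");
        System.out.println(process(items.iterator(), 2)); // [[a, b], [c, d], [e]]
    }
}
```

With `chunk(10)` as in the Step below, the writer would be invoked once per 10 items read.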

First, an example Job and ItemReader:

@Configuration
public class DbOutputDemoJobConfiguration {
 
    @Autowired
    public JobBuilderFactory jobBuilderFactory;
 
    @Autowired
    public StepBuilderFactory stepBuilderFactory;
 
    @Autowired
    @Qualifier("dbOutputDemoJobFlatFileReader")
    public ItemReader<Customer> dbOutputDemoJobFlatFileReader;
 
    @Autowired
    @Qualifier("dbOutputDemoJobFlatFileWriter")
    public ItemWriter<Customer> dbOutputDemoJobFlatFileWriter;
 
    @Bean
    public Step dbOutputDemoStep() {
        return stepBuilderFactory.get("dbOutputDemoStep")
                .<Customer,Customer>chunk(10)
                .reader(dbOutputDemoJobFlatFileReader)
                .writer(dbOutputDemoJobFlatFileWriter)
                .build();
    }
 
    @Bean
    public Job dbOutputDemoJob() {
        return jobBuilderFactory.get("dbOutputDemoJob")
                .start(dbOutputDemoStep())
                .build();
    }
}
 
@Configuration
public class DbOutputDemoJobReaderConfiguration {
 
    @Bean
    public FlatFileItemReader<Customer> dbOutputDemoJobFlatFileReader() {
        FlatFileItemReader<Customer> reader = new FlatFileItemReader<>();
        reader.setResource(new ClassPathResource("customerInit.csv"));
        DefaultLineMapper<Customer> customerLineMapper = new DefaultLineMapper<>();
 
        DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
        tokenizer.setNames(new String[] {"id","firstName", "lastName", "birthdate"});
 
        customerLineMapper.setLineTokenizer(tokenizer);
        customerLineMapper.setFieldSetMapper((fieldSet -> {
            return Customer.builder().id(fieldSet.readLong("id"))
                    .firstName(fieldSet.readString("firstName"))
                    .lastName(fieldSet.readString("lastName"))
                    .birthdate(fieldSet.readString("birthdate"))
                    .build();
        }));
        customerLineMapper.afterPropertiesSet();
        reader.setLineMapper(customerLineMapper);
        return reader;
    }
}
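The examples above and below assume a `Customer` domain class with a fluent builder (in this series it is typically generated with Lombok's `@Builder`); a hand-written equivalent might look like this (field types are assumptions inferred from the reader's `FieldSet` calls):

```java
// Plain-Java stand-in for the Lombok-generated Customer class.
public class Customer {

    private final long id;
    private final String firstName;
    private final String lastName;
    private final String birthdate;

    private Customer(Builder b) {
        this.id = b.id;
        this.firstName = b.firstName;
        this.lastName = b.lastName;
        this.birthdate = b.birthdate;
    }

    public static Builder builder() { return new Builder(); }

    public long getId() { return id; }
    public String getFirstName() { return firstName; }
    public String getLastName() { return lastName; }
    public String getBirthdate() { return birthdate; }

    public static class Builder {
        private long id;
        private String firstName;
        private String lastName;
        private String birthdate;

        public Builder id(long id) { this.id = id; return this; }
        public Builder firstName(String v) { this.firstName = v; return this; }
        public Builder lastName(String v) { this.lastName = v; return this; }
        public Builder birthdate(String v) { this.birthdate = v; return this; }
        public Customer build() { return new Customer(this); }
    }
}
```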




Writing data to the database


The input text data to be loaded (screenshot omitted) and the target database table (screenshot omitted):

JdbcBatchItemWriter

@Configuration
public class DbOutputDemoJobWriterConfiguration {
 
    @Autowired
    public DataSource dataSource;
 
    @Bean
    public JdbcBatchItemWriter<Customer> dbOutputDemoJobFlatFileWriter(){
        JdbcBatchItemWriter<Customer> itemWriter = new JdbcBatchItemWriter<>();
        // set the data source
        itemWriter.setDataSource(dataSource);
        // SQL statement to execute for each item
        itemWriter.setSql("insert into customer(id,firstName,lastName,birthdate) values " +
                "(:id,:firstName,:lastName,:birthdate)");
        // map the item's bean properties onto the named SQL parameters
        itemWriter.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>());
        return itemWriter;
    }
}
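`BeanPropertyItemSqlParameterSourceProvider` resolves each `:name` placeholder from the item's matching getter. The idea can be sketched in plain Java with reflection (illustrative only: the real writer binds values through a `PreparedStatement`, it never splices them into the SQL string; `NamedParamSketch` and `Person` are made-up names):

```java
import java.lang.reflect.Method;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class NamedParamSketch {

    // Replace each :prop placeholder with the value of the bean's getProp().
    public static String bind(String sql, Object bean) {
        try {
            Matcher m = Pattern.compile(":(\\w+)").matcher(sql);
            StringBuffer out = new StringBuffer();
            while (m.find()) {
                String prop = m.group(1);
                Method getter = bean.getClass().getMethod(
                        "get" + Character.toUpperCase(prop.charAt(0)) + prop.substring(1));
                m.appendReplacement(out, String.valueOf(getter.invoke(bean)));
            }
            m.appendTail(out);
            return out.toString();
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException(e);
        }
    }

    public static class Person {
        public long getId() { return 1L; }
        public String getFirstName() { return "John"; }
    }

    public static void main(String[] args) {
        System.out.println(bind("insert into customer(id,firstName) values (:id,:firstName)",
                new Person()));
        // insert into customer(id,firstName) values (1,John)
    }
}
```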

Execution result (screenshot omitted).




Writing data to a .data file


FlatFileItemWriter can write items of any target type T to a plain text file.

Here we read the data from customerInit.csv and write it to a customerInfo.data file.

FlatFileItemWriter

@Configuration
public class FlatFileDemoJobWriterConfiguration {
 
    @Bean
    public FlatFileItemWriter<Customer> flatFileDemoFlatFileWriter() throws Exception {
        FlatFileItemWriter<Customer> itemWriter = new FlatFileItemWriter<>();
        // output file path
        String path = File.createTempFile("customerInfo",".data").getAbsolutePath();
        System.out.println(">> file is created in: " + path);
        itemWriter.setResource(new FileSystemResource(path));
 
        // converts each Customer object to a line of text
        itemWriter.setLineAggregator(new MyCustomerLineAggregator());
        itemWriter.afterPropertiesSet();
 
        return itemWriter;
 
    }
}

public class MyCustomerLineAggregator implements LineAggregator<Customer> {
    // Jackson mapper used to serialize each Customer to a JSON string
    private ObjectMapper mapper = new ObjectMapper();
 
    @Override
    public String aggregate(Customer customer) {
 
        try {
            return mapper.writeValueAsString(customer);
        } catch (JsonProcessingException e) {
           throw new RuntimeException("Unable to serialize.",e);
        }
    }
}
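The aggregator turns each Customer into one line of output; with Jackson, that line is the customer serialized as JSON. A dependency-free sketch of the same effect (field names assumed, class name made up; Jackson does this generically via reflection):

```java
public class JsonLineSketch {

    // Build the JSON line by hand, mirroring what ObjectMapper.writeValueAsString
    // produces for a Customer bean.
    public static String aggregate(long id, String firstName, String lastName, String birthdate) {
        return String.format(
                "{\"id\":%d,\"firstName\":\"%s\",\"lastName\":\"%s\",\"birthdate\":\"%s\"}",
                id, firstName, lastName, birthdate);
    }

    public static void main(String[] args) {
        System.out.println(aggregate(1, "John", "Doe", "1990-01-01"));
        // {"id":1,"firstName":"John","lastName":"Doe","birthdate":"1990-01-01"}
    }
}
```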




Writing data to an XML file


To write data to an XML file, use StaxEventItemWriter; an XStreamMarshaller serializes the objects to XML.

StaxEventItemWriter

@Configuration
public class XMLFileDemoJobWriterConfiguration {
 
    @Bean
    public StaxEventItemWriter<Customer> xmlFileDemoXMLFileWriter() throws Exception {
		// marshals objects to XML
        XStreamMarshaller marshaller = new XStreamMarshaller();
        Map<String,Class> aliases = new HashMap<>();
        aliases.put("customer",Customer.class);
        marshaller.setAliases(aliases);
 
        StaxEventItemWriter<Customer> itemWriter = new StaxEventItemWriter<>();
		// set the root tag
        itemWriter.setRootTagName("customers");
        itemWriter.setMarshaller(marshaller);
 
		// output XML file path
        String path = File.createTempFile("customerInfo",".xml").getAbsolutePath();
        System.out.println(">> xml file is generated: " + path);
        itemWriter.setResource(new FileSystemResource(path));
        itemWriter.afterPropertiesSet();
 
        return itemWriter;
    }
}

Output (screenshot omitted).




Writing data to multiple files


To write data to multiple files, use CompositeItemWriter or ClassifierCompositeItemWriter.

The difference between them:

  • CompositeItemWriter writes the full data set to each of the target files;

  • ClassifierCompositeItemWriter routes each item to a specific file according to a classification rule.
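The contrast can be shown with a dependency-free sketch (plain Java lists stand in for the delegate writers; all names are made up for illustration):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

// A composite writer hands every item to ALL delegates;
// a classifier writer routes each item to exactly ONE delegate.
public class CompositeVsClassifierSketch {

    public static void compositeWrite(List<Integer> items, List<List<Integer>> delegates) {
        for (List<Integer> delegate : delegates) {
            delegate.addAll(items); // every delegate receives the full data set
        }
    }

    public static void classifierWrite(List<Integer> items,
                                       Function<Integer, List<Integer>> classifier) {
        for (Integer item : items) {
            classifier.apply(item).add(item); // each item goes to one chosen delegate
        }
    }

    public static void main(String[] args) {
        List<Integer> items = List.of(1, 2, 3, 4);

        List<Integer> a = new ArrayList<>(), b = new ArrayList<>();
        compositeWrite(items, List.of(a, b));
        System.out.println(a + " " + b); // [1, 2, 3, 4] [1, 2, 3, 4]

        List<Integer> even = new ArrayList<>(), odd = new ArrayList<>();
        classifierWrite(items, i -> i % 2 == 0 ? even : odd);
        System.out.println(even + " " + odd); // [2, 4] [1, 3]
    }
}
```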


In the following examples, the data is written to an XML file and a JSON file. Both CompositeItemWriter and ClassifierCompositeItemWriter delegate to the two file writers defined below:


   @Bean
    public StaxEventItemWriter<Customer> xmlFileWriter() throws Exception {
		// marshals objects to XML
        XStreamMarshaller marshaller = new XStreamMarshaller();
        Map<String,Class> aliases = new HashMap<>();
        aliases.put("customer",Customer.class);
        marshaller.setAliases(aliases);
 
        StaxEventItemWriter<Customer> itemWriter = new StaxEventItemWriter<>();
		// set the root tag
        itemWriter.setRootTagName("customers");
        itemWriter.setMarshaller(marshaller);
 
		// output file path
        String path = File.createTempFile("multiInfo",".xml").getAbsolutePath();
        System.out.println(">> xml file is created in: " + path);
        itemWriter.setResource(new FileSystemResource(path));
        itemWriter.afterPropertiesSet();
 
        return itemWriter;
    }
 
    @Bean
    public FlatFileItemWriter<Customer> jsonFileWriter() throws Exception {
        FlatFileItemWriter<Customer> itemWriter = new FlatFileItemWriter<>();
		// output file path
        String path = File.createTempFile("multiInfo",".json").getAbsolutePath();
        System.out.println(">> json file is created in: " + path);
        itemWriter.setResource(new FileSystemResource(path));
 
        itemWriter.setLineAggregator(new MyCustomerLineAggregator());
        itemWriter.afterPropertiesSet();
 
        return itemWriter;
 
    }

CompositeItemWriter


Use CompositeItemWriter to write the data to multiple files:

    @Bean
    public CompositeItemWriter<Customer> customerCompositeItemWriter() throws Exception {
        CompositeItemWriter<Customer> itemWriter = new CompositeItemWriter<>();
        // register the delegate writers; each one receives every item
        itemWriter.setDelegates(Arrays.asList(xmlFileWriter(), jsonFileWriter()));
        itemWriter.afterPropertiesSet();
        return itemWriter;
    }
Output (screenshots omitted).




ClassifierCompositeItemWriter


ClassifierCompositeItemWriter routes output data to different files according to a rule:

    @Bean
    public ClassifierCompositeItemWriter<Customer> customerCompositeItemWriter() throws Exception {
       
        ClassifierCompositeItemWriter<Customer> itemWriter = new ClassifierCompositeItemWriter<>();
        itemWriter.setClassifier(new MyCustomerClassifier(xmlFileWriter(),jsonFileWriter()));
        return itemWriter;
    }

The classifier defines the routing rule; here customers are classified by id. The rule body was lost in translation, so an even/odd split is assumed below for illustration:

    public class MyCustomerClassifier implements Classifier<Customer, ItemWriter<? super Customer>> {

        private ItemWriter<Customer> xmlWriter;
        private ItemWriter<Customer> jsonWriter;

        public MyCustomerClassifier(ItemWriter<Customer> xmlWriter, ItemWriter<Customer> jsonWriter) {
            this.xmlWriter = xmlWriter;
            this.jsonWriter = jsonWriter;
        }

        @Override
        public ItemWriter<? super Customer> classify(Customer customer) {
            // assumed rule: even ids go to the XML writer, odd ids to the JSON writer
            return customer.getId() % 2 == 0 ? xmlWriter : jsonWriter;
        }
    }

<br/>

Output:

![file](https://imgconvert.csdnimg.cn/aHR0cHM6Ly9ncmFwaC5iYWlkdS5jb20vcmVzb3VyY2UvMjIyMDMwYmUzN2U5OWM2MzAwNTgzMDE1ODMzMzMyMzIucG5n?x-oss-process=image/format,png)


<br/>

![file](https://imgconvert.csdnimg.cn/aHR0cHM6Ly9ncmFwaC5iYWlkdS5jb20vcmVzb3VyY2UvMjIyZjAwNWRiNjVmNjcwNmVkMWEwMDE1ODMzMzMzMTcucG5n?x-oss-process=image/format,png)


<br/>
<br/>
References:

https://blog.csdn.net/wuzhiwei549/article/details/88593942

https://blog.51cto.com/13501268/2298822


Origin blog.csdn.net/weixin_38004638/article/details/104765005