Kafka Notes: Producer API Usage (Custom Partitioner and Serializer)

Copyright notice: this is the blogger's original article and may not be reproduced without permission. https://blog.csdn.net/lyzx_in_csdn/article/details/82155695

1. Custom Partitioner

    1.1 Implement the org.apache.kafka.clients.producer.Partitioner interface

public int partition(String topic, Object key, byte[] keyBytes, Object value, byte[] valueBytes, Cluster cluster)
public void close()
public void configure(Map<String, ?> configs)

    partition() is the main partitioning method; its return value is the number of the partition the record is sent to.

    configure() acts as an initialization method and is called first.

    close() should release any resources acquired during initialization.

    1.2 Configure it in the producer properties

props.put(ProducerConfig.PARTITIONER_CLASS_CONFIG,MyDefinePatitioner.class);
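The partitioner can also be registered by its fully-qualified class name instead of the Class object; Kafka accepts either form for class-type configs. A sketch, using the package from the code below:

props.put(ProducerConfig.PARTITIONER_CLASS_CONFIG, "com.lyzx.kafka.MyDefinePatitioner");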

The code is as follows:

package com.lyzx.kafka;

import org.apache.kafka.common.Cluster;
import org.apache.kafka.common.PartitionInfo;
import java.util.List;
import java.util.Map;

public class MyDefinePatitioner implements org.apache.kafka.clients.producer.Partitioner{

    @Override
    public int partition(String topic, Object key, byte[] keyBytes, Object value, byte[] valueBytes, Cluster cluster){
        // Look up the partitions of the target topic and log what the producer handed us.
        List<PartitionInfo> partitionInfos = cluster.partitionsForTopic(topic);
        System.out.println("partitionInfos=" + partitionInfos);
        System.out.println("topic=" + topic + ", key=" + key + ", byte_key_len=" + keyBytes.length
                + ", value=" + value + ", byte_value_len=" + valueBytes.length);
        // Demo only: always route to partition 0.
        return 0;
    }

    @Override
    public void close(){
        // Release any resources acquired in configure(); nothing to clean up here.
        System.out.println("MyDefinePatitioner closed...");
    }

    @Override
    public void configure(Map<String, ?> map) {
        // Called first, with the producer configuration.
        System.out.println("MyDefinePatitioner configure called...");
    }
}
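The implementation above always returns 0, which is handy for watching the callbacks but not a realistic strategy. A more typical partition() body, shown here only as a sketch (it additionally needs java.util.Arrays), hashes the serialized key across the topic's partitions:

    @Override
    public int partition(String topic, Object key, byte[] keyBytes, Object value, byte[] valueBytes, Cluster cluster){
        List<PartitionInfo> partitions = cluster.partitionsForTopic(topic);
        int numPartitions = partitions.size();
        if(keyBytes == null){
            // No key: a real implementation might round-robin; fall back to partition 0 here.
            return 0;
        }
        // Mask the sign bit instead of Math.abs (which overflows on Integer.MIN_VALUE).
        return (Arrays.hashCode(keyBytes) & Integer.MAX_VALUE) % numPartitions;
    }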

The test case is as follows:

    @Test
    public void basicTest3(){
        Properties props = new Properties();
        // ips is the broker address list, e.g. "host1:9092,host2:9092"
        props.put("bootstrap.servers", ips);
        // Register the custom partitioner.
        props.put(ProducerConfig.PARTITIONER_CLASS_CONFIG, MyDefinePatitioner.class);
        props.put("buffer.memory", 33554432);
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        Producer<String, String> producer = new KafkaProducer<>(props);
        for(int i = 0; i < 100; i++){
            producer.send(new ProducerRecord<>("yh1", "key_" + i, "v_" + i));
        }
        producer.close();
    }
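To confirm which partition each record actually landed on, you can block on the Future returned by send() and inspect the RecordMetadata, for example just before producer.close() above. This is a small sketch, not part of the original post; it needs org.apache.kafka.clients.producer.RecordMetadata and java.util.concurrent.ExecutionException on the import list.

        // Sketch: verify the partition chosen by MyDefinePatitioner for a single record.
        try{
            RecordMetadata metadata = producer.send(new ProducerRecord<>("yh1", "key_0", "v_0")).get();
            System.out.println("partition=" + metadata.partition() + ", offset=" + metadata.offset());
        }catch(InterruptedException | ExecutionException e){
            e.printStackTrace();
        }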

2. Custom Serializer

2.1 Implement the org.apache.kafka.common.serialization.Serializer interface

public void configure(Map configs, boolean isKey)
public byte[] serialize(String topic, Object data)
public void close()

  configure() is called first and is used to initialize resources.

  serialize() converts the object into a byte array; topic is the destination topic and data is the object to send.

  close() releases the resources.

2.2 Configure it in the producer properties

props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, UserSerializer.class);

The implementation class is as follows:

package com.lyzx.kafka;

import org.codehaus.jackson.map.ObjectMapper;
import java.io.IOException;
import java.util.Map;


public class UserSerializer implements org.apache.kafka.common.serialization.Serializer{
    private ObjectMapper om = null;

    @Override
    public void configure(Map configs, boolean isKey){
        // Called first: create the Jackson ObjectMapper used for serialization.
        om = new ObjectMapper();
        System.out.println("UserSerializer configure called.......");
    }

    @Override
    public byte[] serialize(String topic, Object data){
        System.out.println("topic=" + topic + ", data=" + data);
        byte[] bytes = null;
        try{
            // Write the object as a JSON string, then encode it as UTF-8 bytes.
            bytes = om.writeValueAsString(data).getBytes("UTF-8");
        }catch(IOException e){
            e.printStackTrace();
        }
        return bytes;
    }

    @Override
    public void close(){
        System.out.println("UserSerializer close called.......");
    }
}
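On the consumer side a matching deserializer is needed to turn the JSON bytes back into a User. The original post does not show one; the class below is only a sketch, assuming the same Jackson 1.x ObjectMapper and the User class used in the test:

package com.lyzx.kafka;

import org.codehaus.jackson.map.ObjectMapper;
import java.io.IOException;
import java.util.Map;

// Sketch only: the consumer-side counterpart of UserSerializer (not part of the original post).
public class UserDeserializer implements org.apache.kafka.common.serialization.Deserializer<User>{
    private ObjectMapper om = null;

    @Override
    public void configure(Map<String, ?> configs, boolean isKey){
        om = new ObjectMapper();
    }

    @Override
    public User deserialize(String topic, byte[] data){
        if(data == null){
            return null;
        }
        try{
            // Decode the UTF-8 JSON bytes back into a User instance.
            return om.readValue(data, User.class);
        }catch(IOException e){
            throw new RuntimeException(e);
        }
    }

    @Override
    public void close(){
    }
}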

The test case is as follows:

    @Test
    public void basicTest4(){
        Properties props = new Properties();
        props.put("bootstrap.servers",ips);
        props.put("buffer.memory", 33554432);
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",UserSerializer.class);

        Producer<String,User> producer = new KafkaProducer<>(props);
        for(int i = 100; i < 200; i++){
            User user = new User("name_" + i, 100 + i);
            producer.send(new ProducerRecord<>("user","key_"+i,user));
        }
        producer.close();
    }
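The User class itself does not appear in the original post. A minimal POJO consistent with the constructor call new User("name_" + i, 100 + i) above, and serializable by Jackson, might look like this (field names are assumptions):

package com.lyzx.kafka;

// Hypothetical User class, reconstructed from the constructor call in the test above.
// Jackson needs the getters (and a no-arg constructor for deserialization).
public class User{
    private String name;
    private int age;

    public User(){
    }

    public User(String name, int age){
        this.name = name;
        this.age = age;
    }

    public String getName(){ return name; }
    public void setName(String name){ this.name = name; }
    public int getAge(){ return age; }
    public void setAge(int age){ this.age = age; }

    @Override
    public String toString(){
        return "User{name='" + name + "', age=" + age + "}";
    }
}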
