@KafkaListener not consuming messages - problem with deserialization

Priya Tanwar:

The producer publishes messages using the Spring Cloud Stream Kafka bindings:

@Component
public static class PageViewEventSource implements ApplicationRunner {

private final MessageChannel pageViewsOut;
private final Log log = LogFactory.getLog(getClass());

public PageViewEventSource(AnalyticsBinding binding) {
    this.pageViewsOut = binding.pageViewsOut();
}

@Override
public void run(ApplicationArguments args) throws Exception {
    List<String> names = Arrays.asList("priya", "dyser", "Ray", "Mark", "Oman", "Larry");
    List<String> pages = Arrays.asList("blog", "facebook", "instagram", "news", "youtube", "about");
    Runnable runnable = () -> {
        String rPage = pages.get(new Random().nextInt(pages.size()));
        String rName = names.get(new Random().nextInt(names.size()));
        PageViewEvent pageViewEvent = new PageViewEvent(rName, rPage, Math.random() > .5 ? 10 : 1000);


        Serializer<PageViewEvent> serializer = new JsonSerde<>(PageViewEvent.class).serializer();
        byte[] m = serializer.serialize(null, pageViewEvent);

        Message<byte[]> message =  MessageBuilder
                .withPayload(m).build();

        try {
            this.pageViewsOut.send(message);
            log.info("sent " + message);
        } catch (Exception e) {
            log.error(e);
        }
    };
    Executors.newScheduledThreadPool(1).scheduleAtFixedRate(runnable, 1, 1, TimeUnit.SECONDS);
}
}

The producer uses the following serde configuration:

spring.cloud.stream.kafka.streams.binder.configuration.default.key.serde=org.apache.kafka.common.serialization.Serdes$StringSerde
spring.cloud.stream.kafka.streams.binder.configuration.default.value.serde=org.apache.kafka.common.serialization.Serdes$BytesSerde
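The value serde above writes each PageViewEvent as UTF-8 JSON bytes. A stdlib-only sketch of that wire format (the field names userId/page/duration are taken from the exception payload later in the question; the real serializer is Jackson-based, so this is only an approximation):

```java
import java.nio.charset.StandardCharsets;

// Stdlib-only sketch of the bytes the JsonSerde-based producer puts on the topic.
// The real serializer is Jackson; this hand-built JSON mirrors its default
// field order (declaration order) for a simple POJO.
public class PageViewWireFormat {

    public static class PageViewEvent {
        public final String userId;
        public final String page;
        public final long duration;

        public PageViewEvent(String userId, String page, long duration) {
            this.userId = userId;
            this.page = page;
            this.duration = duration;
        }
    }

    // Build the JSON document by hand and encode it as UTF-8, which is what
    // ends up as the Kafka record value.
    public static byte[] toJsonBytes(PageViewEvent e) {
        String json = "{\"userId\":\"" + e.userId + "\",\"page\":\"" + e.page
                + "\",\"duration\":" + e.duration + "}";
        return json.getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] m = toJsonBytes(new PageViewEvent("priya", "blog", 10));
        System.out.println(new String(m, StandardCharsets.UTF_8));
    }
}
```

Because only JSON bytes travel on the wire, the consumer must supply its own deserialization; nothing in the record itself forces a POJO type.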

I am trying to consume these messages in a separate consumer application via Spring Kafka's @KafkaListener:

@Service
public class PriceEventConsumer {


private static final Logger LOG = LoggerFactory.getLogger(PriceEventConsumer.class);

@KafkaListener(topics = "test1" , groupId = "json", containerFactory = "kafkaListenerContainerFactory")
public void receive(Bytes data){
//public void receive(@Payload PageViewEvent data,@Headers MessageHeaders headers) {
    LOG.info("Message received");
    LOG.info("received data='{}'", data);

    }
}

Container factory configuration:

 @Bean
 public ConsumerFactory<String, Bytes> consumerFactory() {

  Map<String, Object> props = new HashMap<>();
  props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
  props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
  StringDeserializer.class);
   props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, 
  BytesDeserializer.class);
  props.put(ConsumerConfig.GROUP_ID_CONFIG, "json");
  props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

  return new DefaultKafkaConsumerFactory<>(props);

 }

 @Bean
 public ConcurrentKafkaListenerContainerFactory<String, Bytes> kafkaListenerContainerFactory() {

  ConcurrentKafkaListenerContainerFactory<String, Bytes> factory =
        new ConcurrentKafkaListenerContainerFactory<>();
  factory.setConsumerFactory(consumerFactory());
  return factory;
 }

With this configuration, the consumer does not pick up the messages (bytes). If I change the Kafka listener to accept String, I get the exception below:

   @KafkaListener(topics = "test1" , groupId = "json", containerFactory = "kafkaListenerContainerFactory")

 public void receive(String data){

    LOG.info("Message received");
    LOG.info("received data='{}'", data);

    }

Caused by:

org.springframework.messaging.converter.MessageConversionException: Cannot handle message; nested exception is org.springframework.messaging.converter.MessageConversionException: Cannot convert from [org.apache.kafka.common.utils.Bytes] to [java.lang.String] for GenericMessage [payload={"userId":"facebook","page":"about","duration":10}, headers={kafka_offset=4213, kafka_consumer=brave.kafka.clients.TracingConsumer@9a75f94, kafka_timestampType=CREATE_TIME, kafka_receivedMessageKey=null, kafka_receivedPartitionId=0, kafka_receivedTopic=test1, kafka_receivedTimestamp=1553007593670}], failedMessage=GenericMessage [payload={"userId":"facebook","page":"about","duration":10}, headers={kafka_offset=4213, kafka_consumer=brave.kafka.clients.
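The exception shows the payload is plain UTF-8 JSON, so one manual workaround (a sketch added here, not part of the original post) is to receive the raw record value as bytes (e.g. via ByteArrayDeserializer) and decode it yourself before any JSON parsing. The helper below is hypothetical and uses only the standard library:

```java
import java.nio.charset.StandardCharsets;

// Hypothetical helper: turn the raw Kafka record value into the JSON text it
// carries. In a real listener you would configure ByteArrayDeserializer and
// declare the @KafkaListener method with a byte[] parameter, then call this.
public class RawPayloadDecoder {

    public static String decode(byte[] value) {
        return new String(value, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] value = "{\"userId\":\"facebook\",\"page\":\"about\",\"duration\":10}"
                .getBytes(StandardCharsets.UTF_8);
        System.out.println(decode(value));
    }
}
```

This sidesteps the Bytes-to-String conversion entirely; the answer below shows the cleaner options (JsonDeserializer or a MessageConverter).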

Any pointers would be very helpful.

Update - POJO part

Listener part -

  @KafkaListener(topics = "test1" , groupId = "json", containerFactory = "kafkaListenerContainerFactory")

  public void receive(@Payload PageViewEvent data,@Headers MessageHeaders headers) {
    LOG.info("Message received");
    LOG.info("received data='{}'", data);

   }

Container factory configuration:

 @Bean
 public ConsumerFactory<String,PageViewEvent > priceEventConsumerFactory() {

    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "json");
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new JsonDeserializer<>(PageViewEvent.class));
}

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, PageViewEvent> priceEventsKafkaListenerContainerFactory() {
      ConcurrentKafkaListenerContainerFactory<String, PageViewEvent> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
     factory.setConsumerFactory(priceEventConsumerFactory());
     return factory;
     }

Producer -

  @Override
  public void run(ApplicationArguments args) throws Exception {
  List<String> names = Arrays.asList("priya", "dyser", "Ray", "Mark", "Oman", "Larry");
   List<String> pages = Arrays.asList("blog", "facebook", "instagram", "news", "youtube", "about");
   Runnable runnable = () -> {
    String rPage = pages.get(new Random().nextInt(pages.size()));
    String rName = names.get(new Random().nextInt(names.size()));
    PageViewEvent pageViewEvent = new PageViewEvent(rName, rPage, Math.random() > .5 ? 10 : 1000);

    Message<PageViewEvent> message =  MessageBuilder
            .withPayload(pageViewEvent).build();
    try {
        this.pageViewsOut.send(message);
        log.info("sent " + message);
    } catch (Exception e) {
        log.error(e);
    }
};
Executors.newScheduledThreadPool(1).scheduleAtFixedRate(runnable, 1, 1, TimeUnit.SECONDS);
}
Deadpool:

You can deserialize the Kafka record into a POJO; for versions < 2.2.x, use a MessageConverter.

Starting with version 2.2, you can explicitly configure the deserializer to use the supplied target type and ignore the type information in the headers by using one of the overloaded constructors that take a boolean:

@Bean
public ConsumerFactory<String,PageViewEvent > priceEventConsumerFactory() {

Map<String, Object> props = new HashMap<>();
props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
props.put(ConsumerConfig.GROUP_ID_CONFIG, "json");
props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new JsonDeserializer<>(PageViewEvent.class,false));

}

Or by using a MessageConverter:

 @Bean
 public ConcurrentKafkaListenerContainerFactory<String, Bytes> kafkaListenerContainerFactory() {

 ConcurrentKafkaListenerContainerFactory<String, Bytes> factory =
    new ConcurrentKafkaListenerContainerFactory<>();
 factory.setConsumerFactory(consumerFactory());
 factory.setMessageConverter(new StringJsonMessageConverter());
 return factory;
}


Origin http://43.154.161.224:23101/article/api/json?id=219598&siteId=1